# algorithms
Implements the algorithms discussed in my Design & Analysis of Algorithms class.
---
layout: page
title: 2021 Writeups
---
<hr/>

Below are the writeups for the CTFs I participated in during 2021 (my first year playing CTFs):
<br/>
| CTF Writeup | Weight | Team | Rank | Points |
| ------------- | --- | --- | ------ | -----: |
|[idek CTF 2021](https://angmar2722.github.io/CTFwriteups/2021/idek2021/) | 24.52 | Social Engineering Experts | 13/235 | 15.840 |
|[Buckeye CTF 2021](https://angmar2722.github.io/CTFwriteups/2021/buckeye2021/) | 24.74 | Social Engineering Experts | 7/505 | 19.386 |
|[Perfect Blue 2021](https://angmar2722.github.io/CTFwriteups/2021/pbctf2021/) | 24.90 | Social Engineering Experts | 32/210 | 5.552 |
|[CSAW Quals 2021](https://angmar2722.github.io/CTFwriteups/2021/csaw2021/) | 23.53 | Social Engineering Experts | 23/1216 | 15.276 |
|[Yauza CTF 2021](https://angmar2722.github.io/CTFwriteups/2021/yauza2021/) | 21.84 | Social Engineering Experts | 9/227 | 14.452 |
|[Fword CTF 2021](https://angmar2722.github.io/CTFwriteups/2021/fword2021/) | 24.42 | Isengard | 55/428 | 3.407 |
|[InCTF 2021](https://angmar2722.github.io/CTFwriteups/2021/inctf2021/) | 70.41 | Social Engineering Experts | 22/604 | 19.143 |
|[UIUCTF 2021](https://angmar2722.github.io/CTFwriteups/2021/uiuctf2021/) | 23.76 | Social Engineering Experts | 18/658 | 8.222 |
|[Google CTF 2021](https://angmar2722.github.io/CTFwriteups/2021/google2021/) | 99.22 | Isengard | 80/379 | 8.435 |
|[Redpwn CTF 2021](https://angmar2722.github.io/CTFwriteups/2021/redpwn2021/) | 32.61 | Isengard | 41/1418 | 9.532 |
|[HSCTF 8 2021](https://angmar2722.github.io/CTFwriteups/2021/hsctf2021/) | 24.50 | Isengard | 57/1165 | 15.190 |
|[Zh3r0 CTF 2021](https://angmar2722.github.io/CTFwriteups/2021/zh3r02021/) | 21.85 | Isengard | 48/509 | 2.893 |
|[ångstrom 2021](https://angmar2722.github.io/CTFwriteups/2021/actf2021/) | 46.09 | Isengard | 278/1245 | 6.340 |
|[UMassCTF 2021](https://angmar2722.github.io/CTFwriteups/2021/umass2021/) | 23.50 | Dog 1.2 | 46/660 | 8.474 |
|[Whitehacks 2021](https://angmar2722.github.io/CTFwriteups/2021/wh2021/) | N/A | N/A | 35/130 | N/A |
<br/>
Due to a lack of time, or because another CTF was occurring at the same time, I couldn't spend much time on some CTFs. Below is a link to solve scripts for random cryptography challenges that I solved during such CTFs:
- <a href="https://angmar2722.github.io/CTFwriteups/2021/randomCTFs2021/">2021 Random Crypto Solves in CTFs</a>
**Note:** Since I mainly specialise in cryptography-related challenges, a list of writeups for all the crypto challenges I solved during CTFs in 2021 can be found <a href="https://github.com/Angmar2722/Angmar2722.github.io/blob/master/CTFwriteups/CryptoWriteupsList/crypto2021.md" target="_blank">here</a>.
---
UID: NF:objidl.IMallocSpy.PostHeapMinimize
title: IMallocSpy::PostHeapMinimize (objidl.h)
description: Performs operations required after calling IMalloc::HeapMinimize.
helpviewer_keywords: ["IMallocSpy interface [COM]","PostHeapMinimize method","IMallocSpy.PostHeapMinimize","IMallocSpy::PostHeapMinimize","PostHeapMinimize","PostHeapMinimize method [COM]","PostHeapMinimize method [COM]","IMallocSpy interface","_com_imallocspy_postheapminimize","com.imallocspy_postheapminimize","objidl/IMallocSpy::PostHeapMinimize"]
old-location: com\imallocspy_postheapminimize.htm
tech.root: com
ms.assetid: 9d51c34e-6ed1-493d-8999-e67c4a60f6b6
ms.date: 12/05/2018
ms.keywords: IMallocSpy interface [COM],PostHeapMinimize method, IMallocSpy.PostHeapMinimize, IMallocSpy::PostHeapMinimize, PostHeapMinimize, PostHeapMinimize method [COM], PostHeapMinimize method [COM],IMallocSpy interface, _com_imallocspy_postheapminimize, com.imallocspy_postheapminimize, objidl/IMallocSpy::PostHeapMinimize
req.header: objidl.h
req.include-header:
req.target-type: Windows
req.target-min-winverclnt: Windows 2000 Professional [desktop apps only]
req.target-min-winversvr: Windows 2000 Server [desktop apps only]
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl: ObjIdl.idl
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib:
req.dll:
req.irql:
targetos: Windows
req.typenames:
req.redist:
ms.custom: 19H1
f1_keywords:
- IMallocSpy::PostHeapMinimize
- objidl/IMallocSpy::PostHeapMinimize
dev_langs:
- c++
topic_type:
- APIRef
- kbSyntax
api_type:
- COM
api_location:
- ObjIdl.h
api_name:
- IMallocSpy.PostHeapMinimize
---
# IMallocSpy::PostHeapMinimize
## -description
Performs operations required after calling <a href="/windows/desktop/api/objidl/nf-objidl-imalloc-heapminimize">IMalloc::HeapMinimize</a>.
## -remarks
When a spy object implementing <a href="/windows/desktop/api/objidl/nn-objidl-imallocspy">IMallocSpy</a> is registered using the <a href="/windows/desktop/api/objbase/nf-objbase-coregistermallocspy">CoRegisterMallocSpy</a> function, COM calls this method immediately after any call to <a href="/windows/desktop/api/objidl/nf-objidl-imalloc-heapminimize">IMalloc::HeapMinimize</a>. This method is included for completeness and consistency; it is not anticipated that developers will implement significant functionality in this method.
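For illustration only (this sketch is not part of the original reference page), a spy might implement the pre/post pair as follows — the class name and counter are hypothetical, and the remaining **IMallocSpy** methods are omitted:

```cpp
// Hypothetical IMallocSpy fragment (illustration only).
class CMallocSpy : public IMallocSpy
{
    LONG m_heapMinimizeCount = 0;   // hypothetical bookkeeping

public:
    // ... IUnknown and the other IMallocSpy methods omitted ...

    void STDMETHODCALLTYPE PreHeapMinimize() override
    {
        // Called by COM just before IMalloc::HeapMinimize runs.
    }

    void STDMETHODCALLTYPE PostHeapMinimize() override
    {
        // Called by COM just after IMalloc::HeapMinimize returns.
        InterlockedIncrement(&m_heapMinimizeCount);
    }
};
```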
## -see-also
<a href="/windows/desktop/api/objidl/nf-objidl-imalloc-heapminimize">IMalloc::HeapMinimize</a>
<a href="/windows/desktop/api/objidl/nn-objidl-imallocspy">IMallocSpy</a>
<a href="/windows/desktop/api/objidl/nf-objidl-imallocspy-preheapminimize">IMallocSpy::PreHeapMinimize</a>
---
date: 2019-11-14 22:00:00
layout: post
title: Machine learning with R-Studio SVM (Support Vector Machine)
subtitle: Using ksvm from the kernlab package
description: "Current RStudio : == Desktop 1.2.5001(64bit)"
image: ../assets/img/postsimg/R_main_00000003.png
optimized_image: ../assets/img/postsimg/R_op_00000003.png
category: coding
tags:
- R-Studio
author: thiagorossener
# image:760*400
# optimized_image:380*200
---
## Source files
Right-click -> "Save Link As" to download<br>
<a href="../assets/sources/S20191114.zip" class="btn btn-lg btn-outline">
S20191114.zip
</a>
<br>
<br>
<br>
## Main functions used
> install.packages("kernlab")
> library(kernlab)
> ksvm — to run the support vector machine algorithm, use the ksvm() function from the kernlab package.<br>
## Source code
```r
library(kernlab)
data <- read.csv('bmi.csv')
head(data)
str(data)
summary(data)
dim(data)
colnames(data)
idx <- sample(1:nrow(data), 0.9 * nrow(data))
training <- data[idx, ]
testing <- data[-idx, ]
# kernel: specify the kernel function to use
model <- ksvm(label ~ ., data = training, kernel = 'vanilladot')
pred <- predict(model, testing)
table(pred, testing$label)
aggrement <- pred == testing$label
prop.table(table(aggrement))
model_rbf <- ksvm(label ~ ., data = training, kernel = 'rbfdot')
pred_rbf <- predict(model_rbf, testing)
table(pred_rbf, testing$label)
aggrement_rbf <- pred_rbf == testing$label
prop.table(table(aggrement_rbf))
########################################################
library(kernlab)
str(iris)
unique(iris$Species)
idx <- sample(1:nrow(iris), 0.6 * nrow(iris))
training <- iris[idx, ]
testing <- iris[-idx, ]
model <- ksvm(Species ~ ., data = training, kernel = 'vanilladot')
pred <- predict(model, testing)
prop.table(table(pred == testing$Species))
#######################################################
library(kernlab)
data <- read.csv('zoo_data.csv')
testing <- read.csv('zoo_testing.csv')
head(data)
str(data)
summary(data)
dim(data)
colnames(data)
model <- ksvm(type ~ ., data = data, kernel = 'rbfdot')
pred <- round(predict(model, testing), 0)
table(pred, testing$type)
aggrement <- pred == testing$type
prop.table(table(aggrement))
#########################################################
library(kernlab)
data <- read.csv('letterdata.csv')
head(data)
str(data)
summary(data)
dim(data)
colnames(data)
idx <- sample(1:nrow(data), 0.7 * nrow(data))
training <- data[idx, ]
testing <- data[-idx, ]
model <- ksvm(letter ~ ., data = training, kernel = 'rbfdot')
pred <- predict(model, testing)
table(pred, testing$letter)
prop.table(table(pred == testing$letter))
##############################################################
library(kernlab)
data <- read.csv('mushrooms.csv')
head(data)
str(data)
summary(data)
dim(data)
colnames(data)
data <- data[, -17]
idx <- sample(1:nrow(data), 0.7 * nrow(data))
training <- data[idx, ]
testing <- data[-idx, ]
model <- ksvm(type ~ ., data = training, kernel = 'rbfdot')
pred <- predict(model, testing)
table(pred, testing$type)
prop.table(table(pred == testing$type))
##################################################################
library(kernlab)
data <- read.csv('../../R Basic Source/31.KNN/likelyhood.csv')
head(data)
str(data)
summary(data)
dim(data)
colnames(data)
idx <- sample(1:nrow(data), 0.1 * nrow(data))
training <- data[idx, ]
testing <- data[-idx, ]
model <- ksvm(likelyhood ~ ., data = training, kernel = 'rbfdot')
pred <- predict(model, testing)
table(pred, testing$likelyhood)
prop.table(table(pred == testing$likelyhood))
```
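Each block above repeats the same split–train–predict–accuracy pattern. As an illustration (not from the original post), that pattern could be wrapped in one helper; the arguments mirror the calls above:

```r
# Sketch: wrap the repeated ksvm workflow in a single helper function.
ksvm_accuracy <- function(formula, data, split = 0.7, kernel = 'rbfdot') {
  idx      <- sample(1:nrow(data), split * nrow(data))
  training <- data[idx, ]
  testing  <- data[-idx, ]
  model    <- ksvm(formula, data = training, kernel = kernel)
  pred     <- predict(model, testing)
  actual   <- model.response(model.frame(formula, testing))
  prop.table(table(pred == actual))   # share of correct predictions
}

# Example: ksvm_accuracy(Species ~ ., iris, split = 0.6, kernel = 'vanilladot')
```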
# lita-google-images
[](https://travis-ci.org/jimmycuadra/lita-google-images)
[](https://codeclimate.com/github/jimmycuadra/lita-google-images)
[](https://coveralls.io/r/jimmycuadra/lita-google-images)
**lita-google-images** is a handler for [Lita](https://github.com/jimmycuadra/lita) that searches Google Images for images matching users' queries, and replies with links to them.
## Installation
Add lita-google-images to your Lita instance's Gemfile:
``` ruby
gem "lita-google-images"
```
## Configuration
### Optional attributes
* `safe_search` (String, Symbol) - The safe search setting to use when querying for images. Possible values are `:active`, `:moderate`, and `:off`. Default: `:active`.
### Example
```
Lita.configure do |config|
config.handlers.google_images.safe_search = :off
end
```
## Usage
The following are all equivalent ways of asking Lita to search for an image of "carl the pug":
```
Lita: image carl the pug
Lita: image me carl the pug
Lita: img carl the pug
Lita: img me carl the pug
```
The second form is for those coming from [Hubot](http://hubot.github.com/).
## License
[MIT](http://opensource.org/licenses/MIT)
---
layout: default
title: Networking with Flannel
parent: Rancher Networking
nav_order: 10
---
# Networking with Flannel
- Flannel is one of the most straightforward network providers for Kubernetes. It operates at Layer 3 and offloads the actual packet forwarding to a backend such
as VxLAN or IPSec. It assigns a large network to all hosts in the cluster and then assigns a portion of that network to each host. Routing between containers on
a host happens via the usual channels, and Flannel handles routing between hosts using one of its available options.
- Flannel uses etcd to store the map of what network is assigned to which host. The target can be an external deployment of etcd or the one that Kubernetes itself uses.
- Flannel does not provide an implementation of the NetworkPolicy resource.
# Running Flannel with Kubernetes
- Flannel Pods roll out as a DaemonSet, with one Pod assigned to each host. To deploy it within Kubernetes, use the `kube-flannel.yml` manifest from the Flannel repository on GitHub.
```
kubectl apply -f https://github.com/coreos/flannel/raw/master/Documentation/kube-flannel.yml
```
- Once Flannel is running, it is not possible to change the network address space or the backend communication format without cluster downtime.
```
kubectl get pods --all-namespaces
NAMESPACE     NAME                                      READY   STATUS    RESTARTS   AGE
kube-system   coredns-66bff467f8-7lfpd                  1/1     Running   0          8m28s
kube-system   coredns-66bff467f8-mx4tq                  1/1     Running   0          8m28s
kube-system   etcd-master                               1/1     Running   0          8m37s
kube-system   katacoda-cloud-provider-58f89f7d9-lcghg   1/1     Running   5          8m28s
kube-system   kube-apiserver-master                     1/1     Running   0          8m37s
kube-system   kube-controller-manager-master            1/1     Running   0          8m37s
kube-system   kube-flannel-ds-amd64-brdvt               1/1     Running   1          8m20s
kube-system   kube-flannel-ds-amd64-ldt8g               1/1     Running   0          8m28s
kube-system   kube-keepalived-vip-tr9nf                 1/1     Running   0          8m10s
kube-system   kube-proxy-7gk5t                          1/1     Running   0          8m28s
kube-system   kube-proxy-dqn7c                          1/1     Running   0          8m20s
kube-system   kube-scheduler-master                     1/1     Running   0          8m37s
```
| Network Type | Backend | Key features |
|-------------- |--------- |------------------------------------------------------------------------------------ |
| Overlay | VxLAN | - Fast, but with no interhost encryption<br>- Suitable for private/secure networks |
| Overlay | IPSec | - Encrypts traffic between hosts<br>- Suitable when traffic traverses the Internet |
| Non Overlay | Host-gw | - Good performance<br>- Cloud agnostic |
| Non Overlay | AWS VPC | - Good performance<br>- Limited to Amazon’s cloud |
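- The backend from the table above is selected in Flannel's `net-conf.json`, which ships inside the `kube-flannel-cfg` ConfigMap of the manifest deployed earlier. A minimal sketch (the network CIDR is illustrative — match it to your cluster's Pod CIDR):
```
net-conf.json: |
  {
    "Network": "10.244.0.0/16",
    "Backend": {
      "Type": "vxlan"
    }
  }
```
- Changing `Type` to `host-gw` or `udp` selects the other backends described below.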
# Flannel Backends
- VxLAN
- VxLAN is the simplest of the officially supported backends for Flannel. Encapsulation happens within the kernel, so there is no additional overhead caused by moving data between the kernel and user space
- The VxLAN backend creates a Flannel interface on every host. When a container on one node wishes to send traffic to a different node, the packet goes from the container to the bridge interface in the host’s network namespace. From there the bridge forwards it to the Flannel interface because the kernel route table designates that this interface is the target for the non-local portion of the overlay network. The Flannel network driver wraps the packet in a UDP packet and sends it to the target host.
- Once it arrives at its destination, the process flows in reverse, with the Flannel driver on the destination host unwrapping the packet, sending it to the bridge interface, and from there the packet finds its way into the overlay network and to the destination Pod.
- Host-gw

- The Host-gw backend provides better performance than VxLAN but requires Layer 2 connectivity between hosts. It operates by creating IP routes to subnets via remote machine addresses
- Unlike VxLAN, no Flannel interface is created when using this backend. Instead, each node sends traffic directly to the destination node where the remote network is located.
- This backend may require additional network configuration if used in a cloud provider where inter-host communication uses virtual switches.
- UDP

- The UDP backend is insecure and should only be used for debugging or if the kernel does not support VxLAN.
# Owari to Hajimari no Miles

- **type**: manga
- **chapters**: 12
- **original-name**: 終わりと始まりのマイルス
- **start-date**: 2006-11-06
- **end-date**: 2006-11-06
## Tags
- comedy
- fantasy
- slice-of-life
## Authors
- Kitoh, Mohiro (Story & Art)
## Synopsis
Set in a world where every object has its own spirit, the story follows Miles, a young woman who has the rare ability to communicate with those spirits. Working as a spirit medium, Miles helps people find the objects that are compatible to them. Assisting her is a mysterious man named Gikaku who's literally not from this world. Dubbed as the "god of destruction" Gikaku wields a terrible power which, as it seems, only Miles can contain. But for how long?
(Source: Kotonoha)
This series is on hiatus.
## Links
- [My Anime list](https://myanimelist.net/manga/13313/Owari_to_Hajimari_no_Miles)
---
title: Scalar Function Limitations | Microsoft Docs
ms.custom: ''
ms.date: 01/19/2017
ms.prod: sql
ms.prod_service: connectivity
ms.reviewer: ''
ms.technology: connectivity
ms.topic: conceptual
helpviewer_keywords:
- ODBC desktop database drivers [ODBC]
- desktop database drivers [ODBC]
ms.assetid: 023d94b9-3ed6-46d3-9a66-f2872f505bbb
author: MightyPen
ms.author: genemi
manager: craigg
ms.openlocfilehash: 6bfb6ff3ba39400278db23931b4c9420e506c973
ms.sourcegitcommit: 61381ef939415fe019285def9450d7583df1fed0
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 10/01/2018
ms.locfileid: "47687088"
---
# <a name="scalar-function-limitations"></a>Scalar Function Limitations
Scalar functions are supported only by using the ODBC canonical format.
---
title: Restore Azure file shares with the Azure CLI
description: Learn how to use the Azure CLI to restore backed-up Azure file shares in a Recovery Services vault
ms.topic: conceptual
ms.date: 01/16/2020
ms.openlocfilehash: 63b2be2fe24c1274ed1581b7b849de578c978842
ms.sourcegitcommit: fa6fe765e08aa2e015f2f8dbc2445664d63cc591
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 02/01/2020
ms.locfileid: "76931049"
---
# <a name="restore-azure-file-shares-with-the-azure-cli"></a>Restore Azure file shares with the Azure CLI
The Azure CLI provides a command-line experience for managing Azure resources. It's a great tool for building custom automation around Azure resources. This article explains how to restore an entire file share, or specific files, from a restore point created by [Azure Backup](https://docs.microsoft.com/azure/backup/backup-overview) by using the Azure CLI. You can also perform these steps with [Azure PowerShell](https://docs.microsoft.com/azure/backup/backup-azure-afs-automation) or in the [Azure portal](backup-afs.md).
By the end of this article, you'll know how to perform the following operations with the Azure CLI:
* View restore points for a backed-up Azure file share.
* Restore a full Azure file share.
* Restore individual files or folders.
[!INCLUDE [cloud-shell-try-it.md](../../includes/cloud-shell-try-it.md)]
To install and use the CLI locally, you must run Azure CLI version 2.0.18 or later. To find the CLI version, run `az --version`. If you need to install or upgrade, see [Install the Azure CLI](https://docs.microsoft.com/cli/azure/install-azure-cli?view=azure-cli-latest).
## <a name="prerequisites"></a>Prerequisites
This article assumes that you already have an Azure file share that's backed up by Azure Backup. If you don't have one, see [Back up Azure file shares with the CLI](backup-afs-cli.md) to configure backup for your file share. This article uses the following resources:
| File share | Storage account | Region | Details |
| ----------- | --------------- | ------ | ------------------------------------------------------------ |
| *azurefiles* | *afsaccount* | EastUS | Original source backed up with Azure Backup |
| *azurefiles1* | *afaccount1* | EastUS | Target source used for alternate-location recovery |
You can use a similar structure for your file shares to try out the different types of restores explained in this article.
## <a name="fetch-recovery-points-for-the-azure-file-share"></a>Fetch recovery points for the Azure file share
Use the [az backup recoverypoint list](https://docs.microsoft.com/cli/azure/backup/recoverypoint?view=azure-cli-latest#az-backup-recoverypoint-list) command to list all recovery points for the backed-up file share.
The following example fetches the list of recovery points for the *azurefiles* file share in the *afsaccount* storage account.
```azurecli-interactive
az backup recoverypoint list --vault-name azurefilesvault --resource-group azurefiles --container-name "StorageContainer;Storage;AzureFiles;afsaccount" --backup-management-type azurestorage --item-name "AzureFileShare;azurefiles" --workload-type azurefileshare --out table
```
You can also run the previous cmdlet by using the friendly names of the container and the item by providing the following two additional parameters:
* **--backup-management-type**: *azurestorage*
* **--workload-type**: *azurefileshare*
```azurecli-interactive
az backup recoverypoint list --vault-name azurefilesvault --resource-group azurefiles --container-name afsaccount --backup-management-type azurestorage --item-name azurefiles --workload-type azurefileshare --out table
```
The result set is a list of recovery points, with time and consistency details for each restore point.
```output
Name                Time                       Consistency
------------------ ------------------------- --------------------
932887541532871865 2020-01-05T07:08:23+00:00 FileSystemConsistent
932885927361238054 2020-01-05T07:08:10+00:00 FileSystemConsistent
932879614553967772 2020-01-04T21:33:04+00:00 FileSystemConsistent
```
The **Name** attribute in the output corresponds to the recovery point name, which can be used as the value of the **--rp-name** parameter in recovery operations.
## <a name="full-share-recovery-by-using-the-azure-cli"></a>Full share recovery by using the Azure CLI
You can use this restore option to restore the complete file share in its original location or in an alternate location.
Define the following parameters to perform restore operations:
* **--container-name**: The name of the storage account that hosts the backed-up original file share. To retrieve the name or friendly name of your container, use the [az backup container list](https://docs.microsoft.com/cli/azure/backup/container?view=azure-cli-latest#az-backup-container-list) command.
* **--item-name**: The name of the backed-up original file share that you want to use for the restore operation. To retrieve the name or friendly name of your backed-up item, use the [az backup item list](https://docs.microsoft.com/cli/azure/backup/item?view=azure-cli-latest#az-backup-item-list) command.
### <a name="restore-a-full-share-to-the-original-location"></a>Restore a full share to the original location
When you restore to the original location, you don't need to specify target-related parameters. Only **Resolve Conflict** must be provided.
The following example uses the [az backup restore restore-azurefileshare](https://docs.microsoft.com/cli/azure/backup/restore?view=azure-cli-latest#az-backup-restore-restore-azurefileshare) command with restore mode set to *originallocation* to restore the *azurefiles* file share in its original location. It uses recovery point 932887541532871865, which you obtained in [Fetch recovery points for the Azure file share](#fetch-recovery-points-for-the-azure-file-share):
```azurecli-interactive
az backup restore restore-azurefileshare --vault-name azurefilesvault --resource-group azurefiles --rp-name 932887541532871865 --container-name "StorageContainer;Storage;AzureFiles;afsaccount" --item-name "AzureFileShare;azurefiles" --restore-mode originallocation --resolve-conflict overwrite --out table
```
```output
Name                                  ResourceGroup
------------------------------------ ---------------
6a27cc23-9283-4310-9c27-dcfb81b7b4bb azurefiles
```
The **Name** attribute in the output corresponds to the name of the job that the Backup service creates for your restore operation. To track the status of the job, use the [az backup job show](https://docs.microsoft.com/cli/azure/backup/job?view=azure-cli-latest#az-backup-job-show) cmdlet.
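For example, you can check the job created above like this (a sketch — substitute the job name returned by your own restore command):
```azurecli-interactive
az backup job show --vault-name azurefilesvault --resource-group azurefiles --name 6a27cc23-9283-4310-9c27-dcfb81b7b4bb
```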
### <a name="restore-a-full-share-to-an-alternate-location"></a>Restore a full share to an alternate location
You can use this option to restore a file share to an alternate location and keep the original file share as is. Specify the following parameters for alternate-location recovery:
* **--target-storage-account**: The storage account to which the backed-up content is restored. The target storage account must be in the same location as the vault.
* **--target-file-share**: The file share within the target storage account to which the backed-up content is restored.
* **--target-folder**: The folder under the file share to which data is restored. If the backed-up content is to be restored to a root folder, give the target folder value as an empty string.
* **--resolve-conflict**: Instruction if there's a conflict with the restored data. Accepts **Overwrite** or **Skip**.
The following example uses [az backup restore restore-azurefileshare](https://docs.microsoft.com/cli/azure/backup/restore?view=azure-cli-latest#az-backup-restore-restore-azurefileshare) with restore mode as *alternatelocation* to restore the *azurefiles* file share in the *afsaccount* storage account to the *azurefiles1* file share in the *afaccount1* storage account.
```azurecli-interactive
az backup restore restore-azurefileshare --vault-name azurefilesvault --resource-group azurefiles --rp-name 932883129628959823 --container-name "StorageContainer;Storage;AzureFiles;afsaccount" --item-name "AzureFileShare;azurefiles" --restore-mode alternatelocation --target-storage-account afaccount1 --target-file-share azurefiles1 --target-folder restoredata --resolve-conflict overwrite --out table
```
```output
Name                                  ResourceGroup
------------------------------------ ---------------
babeb61c-d73d-4b91-9830-b8bfa83c349a azurefiles
```
The **Name** attribute in the output corresponds to the name of the job that the Backup service creates for your restore operation. To track the status of the job, use the [az backup job show](https://docs.microsoft.com/cli/azure/backup/job?view=azure-cli-latest#az-backup-job-show) cmdlet.
## <a name="item-level-recovery"></a>Item-level recovery
You can use this restore option to restore individual files or folders in the original location or an alternate location.
Define the following parameters to perform restore operations:
* **--container-name**: The name of the storage account that hosts the backed-up original file share. To retrieve the name or friendly name of your container, use the [az backup container list](https://docs.microsoft.com/cli/azure/backup/container?view=azure-cli-latest#az-backup-container-list) command.
* **--item-name**: The name of the backed-up original file share that you want to use for the restore operation. To retrieve the name or friendly name of your backed-up item, use the [az backup item list](https://docs.microsoft.com/cli/azure/backup/item?view=azure-cli-latest#az-backup-item-list) command.
Specify the following parameters for the items you want to recover:
* **SourceFilePath**: The absolute path of the file to be restored within the file share, as a string. This path is the same path used in the [az storage file download](https://docs.microsoft.com/cli/azure/storage/file?view=azure-cli-latest#az-storage-file-download) or [az storage file show](https://docs.microsoft.com/cli/azure/storage/file?view=azure-cli-latest#az-storage-file-show) CLI commands.
* **SourceFileType**: Choose whether a directory or a file is selected. Accepts **Directory** or **File**.
* **ResolveConflict**: Instruction if there's a conflict with the restored data. Accepts **Overwrite** or **Skip**.
### <a name="restore-individual-files-or-folders-to-the-original-location"></a>Restore individual files or folders to the original location
Use the [az backup restore restore-azurefiles](https://docs.microsoft.com/cli/azure/backup/restore?view=azure-cli-latest#az-backup-restore-restore-azurefiles) cmdlet with restore mode set to *originallocation* to restore specific files or folders to their original location.
The following example restores the *RestoreTest.txt* file in its original location: the *azurefiles* file share.
```azurecli-interactive
az backup restore restore-azurefiles --vault-name azurefilesvault --resource-group azurefiles --rp-name 932881556234035474 --container-name "StorageContainer;Storage;AzureFiles;afsaccount" --item-name "AzureFileShare;azurefiles" --restore-mode originallocation --source-file-type file --source-file-path "Restore/RestoreTest.txt" --resolve-conflict overwrite --out table
```
```output
Name                                  ResourceGroup
------------------------------------ ---------------
df4d9024-0dcb-4edc-bf8c-0a3d18a25319 azurefiles
```
The **Name** attribute in the output corresponds to the name of the job that the Backup service creates for your restore operation. To track the status of the job, use the [az backup job show](https://docs.microsoft.com/cli/azure/backup/job?view=azure-cli-latest#az-backup-job-show) cmdlet.
### <a name="restore-individual-files-or-folders-to-an-alternate-location"></a>Restore individual files or folders to an alternate location
To restore specific files or folders to an alternate location, use the [az backup restore restore-azurefiles](https://docs.microsoft.com/cli/azure/backup/restore?view=azure-cli-latest#az-backup-restore-restore-azurefiles) cmdlet with restore mode set to *alternatelocation* and specify the following target-related parameters:
* **--target-storage-account**: The storage account to which the backed-up content is restored. The target storage account must be in the same location as the vault.
* **--target-file-share**: The file share within the target storage account to which the backed-up content is restored.
* **--target-folder**: The folder under the file share to which data is restored. If the backed-up content is to be restored to a root folder, give its value as an empty string.
The following example restores the *RestoreTest.txt* file, originally present in the *azurefiles* file share, to an alternate location: the *restoredata* folder in the *azurefiles1* file share hosted in the *afaccount1* storage account.
```azurecli-interactive
az backup restore restore-azurefiles --vault-name azurefilesvault --resource-group azurefiles --rp-name 932881556234035474 --container-name "StorageContainer;Storage;AzureFiles;afsaccount" --item-name "AzureFileShare;azurefiles" --restore-mode alternatelocation --target-storage-account afaccount1 --target-file-share azurefiles1 --target-folder restoredata --resolve-conflict overwrite --source-file-type file --source-file-path "Restore/RestoreTest.txt" --out table
```
```output
Name                                  ResourceGroup
------------------------------------ ---------------
df4d9024-0dcb-4edc-bf8c-0a3d18a25319 azurefiles
```
The **Name** attribute in the output corresponds to the name of the job that the Backup service creates for your restore operation. To track the status of the job, use the [az backup job show](https://docs.microsoft.com/cli/azure/backup/job?view=azure-cli-latest#az-backup-job-show) cmdlet.
## <a name="next-steps"></a>Next steps
Learn how to [manage Azure file share backups with the Azure CLI](manage-afs-backup-cli.md).
# Calculating square of numbers with unit digit 4
### Parameters
- N : number whose square is to be calculated
- u : unit digit of the number (N)
- t : number formed after excluding the unit digit
### Formula
```
R = (N * (t + 1)) - (3 + t * 6)
R = (N * t) + N - 3 - (t * 6)
R = t * (N - 6) + (N - 3)
R = t * (N - 6) + (N - 6) + 3
R = (N - 6) * (t + 1) + 3
```
```
Square(N) = Concat(R, 6)
```
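As a quick check (not part of the original note), the formula can be implemented directly in Python:

```python
def square_ending_in_4(n):
    """Square a number whose unit digit is 4, using R = (N - 6) * (t + 1) + 3."""
    assert n % 10 == 4, "n must end in 4"
    t = n // 10                    # number formed after excluding the unit digit
    r = (n - 6) * (t + 1) + 3
    return int(str(r) + "6")       # Square(N) = Concat(R, 6)

for n in (4, 14, 164, 994):
    assert square_ending_in_4(n) == n ** 2
```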
# Base of the derivation
```python
>>> for number in range(4, 95, 10):
... print number, "^2 = ", number**2
...
4 ^2 = 16
14 ^2 = 196
24 ^2 = 576
34 ^2 = 1156
44 ^2 = 1936
54 ^2 = 2916
64 ^2 = 4096
74 ^2 = 5476
84 ^2 = 7056
94 ^2 = 8836
>>>
>>> 4 * 1 - 3
1
>>> 14 * 2 - 9
19
>>> 24 * 3 - 15
57
>>> 34 * 4 - 21
115
>>>
```
Or (using the formula **(N - 6) * (t + 1) + 3**)
```python
>>> for number in range(4, 95, 10):
... print number, "^2 = ", number**2
...
4 ^2 = 16
14 ^2 = 196
24 ^2 = 576
34 ^2 = 1156
44 ^2 = 1936
54 ^2 = 2916
64 ^2 = 4096
74 ^2 = 5476
84 ^2 = 7056
94 ^2 = 8836
>>>
>>> (4 - 6) * (0 + 1) + 3
1
>>> (14 - 6) * (1 + 1) + 3
19
>>> (24 - 6) * (2 + 1) + 3
57
>>> (34 - 6) * (3 + 1) + 3
115
>>> (44 - 6) * (4 + 1) + 3
193
>>> (54 - 6) * (5 + 1) + 3
291
>>> (64 - 6) * (6 + 1) + 3
409
>>> (74 - 6) * (7 + 1) + 3
547
>>> (84 - 6) * (8 + 1) + 3
705
>>> (94 - 6) * (9 + 1) + 3
883
>>> (104 - 6) * (10 + 1) + 3
1081
>>> (994 - 6) * (99 + 1) + 3
98803
>>>
>>> # Test
...
>>> 994**2
988036
>>> 34**2
1156
>>>
```
# Example
Calculate the square of 164.
```
164 x 164
=============
656
984x
164xx
=============
26896
```
Let's calculate the square of 164 using the formula (N - 6) * (t + 1) + 3:
```
Here
N = 164
t = 16
So,
R = (N - 6) * (t + 1) + 3
R = (N - 6) * (t + 1) + 3
R = (164 - 6) * (16 + 1) + 3
R = 158 * 17 + 3
Finally
158 x 17
============
1106
158x
============
2686
R = 2686 + 3
R = 2689
Square(164) = Concat(R, 6)
Square(164) = Concat(2689, 6)
Square(164) = 26896
```
# c-linkedlist
Playing with LinkedLists in C
## Overview
Recently playing with older C compilers, I wanted a LinkedList that offered some of the niceties of modern arrays found in more recent languages. I identified the most used key and value types I encounter in C which were a string (`char*`), a pointer (`void *`), an integer (`long`) and a decimal (`double`). I plan to add many more functions for handling things like searching, sorting and the like but this basis allows me to get started.
## Tested On (Machines I own)
- [X] Apple MacBook Pro 15" 2019 (Intel)
- [ ] Apple MacBook Pro 14" 2022 (Apple Silicon)
- [ ] Apple PowerBook G3 Pismo (500Mhz)
- [ ] Commodore Amiga 500/600/1000/2000
- [ ] Commodore Amiga 1200/3000/4000
## Modern APIs Supported
- [X] ForEach
- [X] Filter
- [X] Map
- [X] Reduce
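The exact signatures aren't documented here, so the following is only a hypothetical sketch of what a callback-based `ForEach` over a C linked list generally looks like — every name below is invented for illustration, not this repo's actual API:

```c
#include <stdio.h>

/* Hypothetical node and visitor shapes -- illustration only. */
typedef struct Node {
    const char *key;
    long value;
    struct Node *next;
} Node;

typedef void (*NodeVisitor)(Node *node);

void ListForEach(Node *head, NodeVisitor visit) {
    for (Node *n = head; n != NULL; n = n->next) {
        visit(n);   /* invoke the callback on every node */
    }
}

void PrintNode(Node *n) {
    printf("%s = %ld\n", n->key, n->value);
}
```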
### More to Come
# Fractales
Mandelbrot's set and Julia's sets.

## description
### languages :
- Python
- C++
### dependencies :
- Python :
- matplotlib
- numpy
- cmath (for the version written all in python)
- C++ :
- complex
### versions
- [C++] + [Python]
- computation in C++ for speed
- plot in python with matplotlib
- full [Python]
- all in Python, easy to understand, but very slow (see the sketch below)
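Both versions compute the same escape-time iteration at heart; a minimal sketch in plain Python (simplified compared to the actual scripts in this repo):

```python
def escape_time(c, max_iter=100):
    """Iterate z -> z*z + c from z = 0; report when |z| exceeds 2 (divergence)."""
    z = 0
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return i           # escaped: point is outside the Mandelbrot set
    return max_iter            # stayed bounded: point is (likely) inside

# For a Julia set, keep c fixed and iterate from z = the sampled point instead.
```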
## how to install and run ?
``` git clone https://github.com/nobody48sheldor/fractales ```
``` cd fractales/ ```
- [C++] + [Python]
``` sh compile.sh ``` (needs g++ compiler)
- full [Python]
``` cd full_python/ ```
``` python3 mandelbrot.py ``` or ``` python3 julia_set.py ```
## contribute
You can copy the code, update or modify it, and open a pull request.
---
slug: uploading-shipments-to-magento-v2
redirect_from: "/article/974-uploading-shipments-to-magento"
title: Uploading Shipments to Magento V2
---
This task will create new shipments for existing orders in Magento. See below for a sample input file.
The `<entity_id>` element is used to specify the ID of the order to create the shipment for. If no ID is specified, the task will use the `<external_id>` element in conjunction with Zynk's truth table to look up the ID of an order that has been previously processed by Zynk. If no external ID is provided, the `<increment_id>` element will be used to look up the order ID based on the increment ID of the order.
You can optionally provide a collection of `<item>` elements to ship specific items on the order. If no items are provided, the task will ship all remaining items on the order.
## Settings
### Connection
_Required_
The Magento V2 connection to use. See the [Connecting to Magento V2](connecting-to-magento-v2) article if you require more information on how to create/manage connections.
### Fail File
_Required_
The XML file to save failed shipment uploads to. The data will be written in the same format as the input file.
### Input File
_Required_
The XML file containing the shipments to upload in Magento.
### Success File
_Required_
The XML file to save successful shipment uploads to. The data will be written in the same format as the input file.
### Prevent Reprocessing
_Required_
Set to true to prevent the same record being processed more than once by the task. This setting will only work where an `<external_id>` element is provided in the XML.
### Store View Code
_Required_
The Magento store view code to perform the API calls against. Defaults to 'all'.
### Zynk Settings
See [Common Task Settings](common-task-settings).
## Examples
A sample input file is shown below. This will create a shipment for an existing order, specifying the items to ship, tracking details, and a comment.
```xml
<?xml version="1.0" encoding="utf-8"?>
<ArrayOfShipment xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<Shipment>
<external_id>9814</external_id>
<order>
<!-- At least one of the following must be provided to identify the order to ship -->
<entity_id>4</entity_id>
<external_id>1</external_id>
<increment_id>000000004</increment_id>
</order>
<!-- If no items collection is provided, all items on the order will be shipped -->
<items>
<item>
<sku>WSH12-32-Purple</sku>
<qty>1</qty>
</item>
</items>
<tracks>
<track>
<track_number>abcd-1234</track_number>
<title>DHL Next Day Shipping</title>
<carrier_code>DHL</carrier_code>
</track>
</tracks>
<comment>
<comment>Leave by the door</comment>
<is_visible_on_front>1</is_visible_on_front>
</comment>
</Shipment>
</ArrayOfShipment>
```
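For comparison, here is a minimal variant (the ID values are illustrative only) that ships every remaining item on an order identified by its increment ID, with no tracking or comment:

```xml
<?xml version="1.0" encoding="utf-8"?>
<ArrayOfShipment xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Shipment>
    <external_id>9815</external_id>
    <order>
      <increment_id>000000005</increment_id>
    </order>
    <!-- No items collection: all remaining items on the order are shipped -->
  </Shipment>
</ArrayOfShipment>
```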
Weibo hot search data for 2020-10-14, 18:00
Status: 200
1. Student caught after deleting a classmate's college-upgrade exam application out of jealousy
Weibo heat index: 4437075
2. Killer in the Nanjing Medical University student murder case sentenced to death
Weibo heat index: 2635936
3. Try it right now
Weibo heat index: 2482412
4. Tan Songyun's agent apologizes
Weibo heat index: 2419704
5. Ma Li's "Korean double"
Weibo heat index: 2245112
6. The drama Aim (Miaozhun) is really thrilling
Weibo heat index: 2048717
7. Li Yi suspended from his post for a week
Weibo heat index: 1667479
8. Dalian University of Technology reports the death of a graduate student in a laboratory
Weibo heat index: 1553053
9. Kunshan
Weibo heat index: 1272929
10. Lianhuashan (Lotus Mountain)
Weibo heat index: 1159409
11. Zhu Yilong's fansite photographer
Weibo heat index: 1051450
12. Durex ad copy
Weibo heat index: 835376
13. He Jiong and Liu Tao to host the Golden Eagle Festival
Weibo heat index: 720121
14. Retired teacher forced to repay loans calls on former students to pay the money back
Weibo heat index: 558903
15. Denmark to cull at least 2.5 million mink
Weibo heat index: 449552
16. iPhone 12 will support the BeiDou navigation and positioning system
Weibo heat index: 449073
17. Student becomes a teacher and now sits across the desk from their former teacher
Weibo heat index: 422845
18. Turns out positive reviews can be more infuriating than negative ones
Weibo heat index: 422789
19. Song Qian (Victoria Song)
Weibo heat index: 416864
20. Medical staff use loudspeakers to call on residents to get nucleic acid tests
Weibo heat index: 412708
21. Dalian University of Technology
Weibo heat index: 409561
22. Yi Yangqianxi's gaze
Weibo heat index: 404690
23. Scene of sisters' blind-date fail
Weibo heat index: 402461
24. The boy who accompanies his girlfriend to eat luosifen (river-snail rice noodles)
Weibo heat index: 398804
25. China gives its first clear legal definition of "pengci" (staged-accident scams)
Weibo heat index: 389441
26. Beijing issues control measures for travelers arriving from Qingdao
Weibo heat index: 382867
27. William Chan and Liu Shishi in the same frame
Weibo heat index: 374541
28. Unretouched photos of Yang Yang
Weibo heat index: 365115
29. Is the iPhone 12 worth buying
Weibo heat index: 318345
30. South Korea to fine people nearly 600 yuan for not wearing masks
Weibo heat index: 297250
31. This life is the "next life" you talked about in your previous life
Weibo heat index: 294353
32. Han Hong: I respect high-traffic celebrities
Weibo heat index: 285631
33. Yu Shuxin: pulse like a 40-year-old man, strong as an ox
Weibo heat index: 270295
34. Who knew giraffes eat grass like this
Weibo heat index: 265632
35. 40th anniversary of the Shenzhen Special Economic Zone
Weibo heat index: 261116
36. Great Wall Broadband
Weibo heat index: 260967
37. Jackson Wang's yellow headband look
Weibo heat index: 260210
38. Stretching comes at a price
Weibo heat index: 258848
39. Tiancai Xiaoxiongmao ("Genius Little Panda", blogger)
Weibo heat index: 247030
40. Jiangnan Baijingtu (mobile game)
Weibo heat index: 244843
41. How funny celebrities' replies to fans can get
Weibo heat index: 237591
42. Sulli's fan forum mourns Sulli
Weibo heat index: 225527
43. Golden Eagle Goddess voting results
Weibo heat index: 201176
44. US military aircraft conduct intensive close-in reconnaissance of China for the third straight month
Weibo heat index: 199531
45. Optimized Zhuge Liang "Wuling Immortal" skin (Honor of Kings)
Weibo heat index: 188818
46. Beijing to require three nucleic acid tests for inbound international arrivals
Weibo heat index: 186416
47. The many faces of people getting shots
Weibo heat index: 184267
48. Early-autumn hot-girl outfits
Weibo heat index: 183613
49. Travel from Qingdao to Beijing requires a negative nucleic acid test taken within 7 days
Weibo heat index: 172904
50. Zheng Shuang's honey-toned film-style photo shoot
Weibo heat index: 163693
---
title: Basic passive components
order: 5000
redirect to: ./GND/index.md
---
# How to create a custom system input method
This tutorial was tested with Android Studio 3.1 and mongol-library 1.1.0 (mongol-library must be version 1.1.0 or higher).
## 1. Create a new project
Create a new project and name it `Jianpan`.
## 2. Import mongol-library
Add the Mongolian widget library to the dependencies in your build.gradle (Module: app) file:
```java
implementation 'net.studymongolian:mongol-library:1.3.1'
```
## 3. Define a custom keyboard layout
Create a Java file for your keyboard view class; it must extend the `net.studymongolian.mongollibrary.Keyboard` class. The easiest way is to copy an existing keyboard and change it to the layout you want. Refer to the keyboard views below:
- [KeyboardAeiou](https://github.com/suragch/mongol-library/blob/master/mongol-library/src/main/java/net/studymongolian/mongollibrary/KeyboardAeiou.java)
- [KeyboardQwerty](https://github.com/suragch/mongol-library/blob/master/mongol-library/src/main/java/net/studymongolian/mongollibrary/KeyboardQwerty.java)
- [KeyboardLatin](https://github.com/suragch/mongol-library/blob/master/mongol-library/src/main/java/net/studymongolian/mongollibrary/KeyboardLatin.java)
- [KeyboardCyrillic](https://github.com/suragch/mongol-library/blob/master/mongol-library/src/main/java/net/studymongolian/mongollibrary/KeyboardCyrillic.java)
- [CustomKeyboard](https://github.com/suragch/mongol-library/blob/master/demo-app/src/main/java/net/studymongolian/mongollibrarydemo/CustomKeyboard.java)
- [CustomKeyboardTwo](https://github.com/suragch/mongol-library/blob/master/demo-app/src/main/java/net/studymongolian/mongollibrarydemo/CustomKeyboardTwo.java)
Name your class `WodeJianpan.java`.
## 4. Keyboard style
Create a new XML file in `res/layout` and name it `jianpan_yangshi.xml`, with the following content:
```xml
<?xml version="1.0" encoding="utf-8"?>
<net.studymongolian.mongollibrary.ImeContainer xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:background="#dbdbdb">
<com.example.jianpan.WodeJianpan
android:layout_width="match_parent"
android:layout_height="match_parent"
app:keyBorderColor="#000000"
app:keyBorderRadius="3dp"
app:keyBorderWidth="1px"
app:keyColor="#ffffff"
app:keyPressedColor="#b3b3b3"
app:keySpacing="3dp"
app:popupHighlightColor="#dbdbdb"
app:popupTextColor="#fe9a52"
app:primaryTextColor="#000000"
app:primaryTextSize="30sp"
app:secondaryTextColor="#b3b3b3" />
</net.studymongolian.mongollibrary.ImeContainer>
```
**Notes**
- Change `com.example.jianpan.WodeJianpan` to your own package and class name.
- If you want to be able to switch keyboards, you can add several keyboard view classes.
## 5. The InputMethodService class
Create a Java file named `WodeInputMethodService.java`; it must extend `InputMethodService` and implement the `ImeContainer.OnSystemImeListener` interface, with the following content:
```java
public class WodeInputMethodService extends InputMethodService implements ImeContainer.OnSystemImeListener {
@Override
public View onCreateInputView() {
LayoutInflater inflater = getLayoutInflater();
ImeContainer jianpan = (ImeContainer) inflater.inflate(R.layout.jianpan_yangshi, null, false);
        jianpan.showSystemKeyboardsOption("ᠰᠢᠰᠲ᠋ᠧᠮ"); // long-pressing the keyboard key lets the user switch to another system IME
jianpan.setOnSystemImeListener(this);
return jianpan;
}
    // ImeContainer.OnSystemImeListener methods
@Override
public InputConnection getInputConnection() {
return getCurrentInputConnection();
}
@Override
public void onChooseNewSystemKeyboard() {
InputMethodManager im = (InputMethodManager) getSystemService(INPUT_METHOD_SERVICE);
if (im == null) return;
im.showInputMethodPicker();
}
}
```
**Notes**
- If you don't want to use `ImeContainer`, or you want to control the keyboard's input handling yourself, `onCreateInputView()` can also return a `Keyboard` view directly. In that case your `InputMethodService` must implement the `Keyboard.OnKeyboardListener` interface, which just has a few more methods to implement.
## 6. Input method subtype
Create an XML file in `res/xml/` and name it `method.xml`, with the following content:
```xml
<?xml version="1.0" encoding="utf-8"?>
<input-method
xmlns:android="http://schemas.android.com/apk/res/android">
<subtype
android:imeSubtypeMode="keyboard"/>
</input-method>
```
## 7. Declare the input method
Declare your input method in `AndroidManifest.xml`:
```xml
<application>
...
<service
android:name=".WodeInputMethodService"
android:label="自定义输入法"
android:permission="android.permission.BIND_INPUT_METHOD">
<intent-filter>
<action android:name="android.view.InputMethod"/>
</intent-filter>
<meta-data
android:name="android.view.im"
android:resource="@xml/method"/>
</service>
</application>
```
The basic functionality of your keyboard is now complete, but you can add settings or help content in `MainActivity`.
## 8. Enable the input method
Users must enable your input method in the system settings before they can use it.
---
## Major changes
## Minor improvements and bug fixes
* Fix RcppThread::LdFlags warning
# Changes in version 0.99.2 (2022-01-28)
## Major changes
## Minor improvements and bug fixes
* Added RcppThread::ProgressBar
# Changes in version 0.99.1 (2022-01-27)
## Major changes
* Changed version number into 0.99.2
## Minor improvements and bug fixes
* Changed URL links in DESCRIPTION
# Changes in version 0.99.0 (2021-12-22)
## Major changes
* Changed version number into 0.99.1
* Changed name from distSTRING into MSA2dist
* Submitted to Bioconductor
## Minor improvements and bug fixes
| 16.894737 | 44 | 0.73053 | eng_Latn | 0.974803 |
da0bda38f9062677f3556034724d9728fa671c8a | 218 | md | Markdown | _watches/M20191022_071511_TLP_2.md | Meteoros-Floripa/meteoros.floripa.br | 7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad | [
"MIT"
] | 5 | 2020-01-22T17:44:06.000Z | 2020-01-26T17:57:58.000Z | _watches/M20191022_071511_TLP_2.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | null | null | null | _watches/M20191022_071511_TLP_2.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | 2 | 2020-05-19T17:06:27.000Z | 2020-09-04T00:00:43.000Z | ---
layout: watch
title: TLP2 - 22/10/2019 - M20191022_071511_TLP_2T.jpg
date: 2019-10-22 07:15:11
permalink: /2019/10/22/watch/M20191022_071511_TLP_2
capture: TLP2/2019/201910/20191021/M20191022_071511_TLP_2T.jpg
---
| 27.25 | 62 | 0.784404 | eng_Latn | 0.032539 |
da0cc25a9d0e52de0f616108d7761256098a6d07 | 4,282 | md | Markdown | README.md | thobbs/genartlib | 032f685b326ef56235e1dc1198cacf02264d3c39 | [
"MIT"
] | 159 | 2016-06-06T11:00:49.000Z | 2022-03-29T18:40:00.000Z | README.md | thobbs/genartlib | 032f685b326ef56235e1dc1198cacf02264d3c39 | [
"MIT"
] | 6 | 2017-11-01T16:12:30.000Z | 2021-06-28T20:48:06.000Z | README.md | thobbs/genartlib | 032f685b326ef56235e1dc1198cacf02264d3c39 | [
"MIT"
] | 14 | 2017-11-01T15:17:09.000Z | 2022-03-08T22:33:43.000Z | # genartlib
[](https://clojars.org/genartlib)
[](https://cljdoc.org/d/genartlib/genartlib/CURRENT)
<img src="dev-resources/ectogenesis-small.jpg" alt="Ectogenesis" title="Ectogenesis" align="right" width="250"/>
A Clojure library with simple utilities for creating generative artwork.
This library is built around [Quil](https://github.com/quil/quil), a Clojure wrapper around the [Processing](https://processing.org) framework. However, most of the functions are just mathematical utilies that could be used idependently.
To see and read about my artwork, visit [tylerxhobbs.com](https://tylerxhobbs.com) or follow me on [Instagram](https://instagram.com/tylerxhobbs) or [Twitter](https://twitter.com/tylerxhobbs).
## Usage
To install, add this to your dependencies in `project.clj`:
```clojure
[genartlib "0.1.22"]
```
## Contents
View the [API Docs](https://cljdoc.org/d/genartlib/genartlib/CURRENT).
The genartlib library has the following tools:
### Project Template
Under project-template/, you'll find the basic setup that I use for every new generative art project. This is geared towards creating static images.
I also wrote a bit about [my development setup and how I use it](https://tylerxhobbs.com/essays/2015/using-quil-for-artwork).
### Algebra
The following algebra-ish functions are defined (a short usage sketch follows the list):
* `avg` - average
* `interpolate` / `interpolate-multi` - linear interpolation
* `rescale` - map from one range to another
* `line-intersection` - find the intersection of two lines
* `lines-intersection-point` - another way to find line intersections
* `slope` / `point-slope` - get the slope of a line
* `y-intercept` - get the y intercept point of a line
* `angle` / `point-angle` - get the angle between two points in radians
* `angular-coords` - calculate the offset location from a base point with angle and magnitude
* `point-dist` - distance between two points
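A short sketch of how a couple of these functions might be called. The exact argument order is assumed here for illustration only (check the API docs for the real signatures):
```clojure
(ns my-sketch.core
  (:require [genartlib.algebra :as alg]))

;; Map a value from the range [0.0 1.0] onto [0 500],
;; e.g. to turn normalized noise into a pixel coordinate
;; (argument order assumed):
(alg/rescale 0.25 0.0 1.0 0 500)

;; Offset the point (100, 100) by 50 units at a 45-degree angle
;; (argument order assumed):
(alg/angular-coords 100 100 (/ Math/PI 4) 50)
```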
### Geometry
* `polygon-contains-point?` - a fast test for checking if a point falls inside a polygon
* `rotate-polygon` - rotates around the average center of the poly
* `shrink-polygon` - shrink by a ratio
### Curves
* `chaikin-curve` - a curve-smoothing algorithm
* `chaikin-curve-retain-ends` - a variation that preserves the original end points
* `split-curve-by-step` - break up a curve into chunks with the given length
* `split-curve-into-parts` - break up a curve into chunks with equal length, given a number of parts to have
* `interpolate-curve` - find a point that is a given percentage along the length of a curve
* `line-simplification` - an implementation of the Ramer-Douglas-Peucker line simplification algorithm
### Random
* `gauss` - sample a gaussian probability distribution
* `abs-gauss` - basically gauss + abs
* `triangular` - sample a triangular probability distribution
* `pareto-sampler` / `pareto-sample` - sample a pareto probability distribution
* `random-point-in-circle` - uniform sampling of points within a circle
* `odds` - returns true or false with the given probability
* `choice` - pick from a list of items with uniform probability
* `weighted-choice` - pick from a list of items, each with an assigned probability
* `repeatable-shuffle` - a version of shuffle that uses Processing's Random, in order to ensure repeatability with the same seed
### Plotter
* `sort-curves-for-plotting` - sorts a seq of curves in order to minimize plotter travel distance
### Utils
* `w` and `h` - shorthand for expressing a length or position in terms of percentage of the image width or height - good for using a pseudo-vector approach to creating images
* `set-color-mode` - set the color mode to HSV with ranges H [0, 360], S [0.0, 100.0], V [0.0, 100.0], alpha [0.0, 1.0]
* `in?` / `not-in?` - test if a seq contains an item
* `between?` - is a value inside an inclusive range?
* `enumerate` - turns a seq of items into a seq like ([0 item-0] [1 item-1] [2 item-2] ...)
* `zip` - combine two or more seqs into tuples
* `snap-to` - snap a value to a given window size, kind of like configurable rounding
* `vec-remove` - remove an item from a vector
## License
Copyright © Tyler Hobbs
Distributed under the MIT License.
| 44.14433 | 237 | 0.741709 | eng_Latn | 0.976494 |
da0d83eed10f21aa5beb9b9f287553c1da47a6e1 | 3,009 | md | Markdown | README.md | World-of-Cryptopups/useEOSHyperion | d197873866e298876e905f4fd7927aaa7f5eef31 | ["MIT"] | null | null | null | README.md | World-of-Cryptopups/useEOSHyperion | d197873866e298876e905f4fd7927aaa7f5eef31 | ["MIT"] | null | null | null | README.md | World-of-Cryptopups/useEOSHyperion | d197873866e298876e905f4fd7927aaa7f5eef31 | ["MIT"] | null | null | null | # useEOSHyperion
React hooks for [EOS Hyperion State API](https://github.com/eosrio/hyperion-history-api) data fetching.
This can be used with any supported EOSIO-based blockchains.
All of the hooks wraps around `useSWR` from the [swr](https://swr.vercel.app) library.
## Install
```sh
npm install @cryptopuppie/useeoshyperion --save
```
## Usage
The hooks are simple to use and designed to feel similar to `useSWR`.
```jsx
import { useGetActions } from '@cryptopuppie/useeoshyperion'
export default function App() {
const { data } = useGetActions(
{ account: 'fckedupmyacc', limit: 5 },
'https://testnet.waxsweden.org'
)
return (
<div>
<h3>Actions</h3>
<ul>
{data?.actions.map((i) => (
<li key={i.trx_id}>
{i.trx_id} - {new Date(i.timestamp).toLocaleString()}
</li>
))}
</ul>
</div>
)
}
```
- **With a Provider**
If you do not want to set the API endpoint every time in each hook, you can use a provider.
```tsx
// Component.tsx
import { useGetActions } from '@cryptopuppie/useeoshyperion'
export default function Component() {
const { data } = useGetActions({ account: 'fckedupmyacc', limit: 5 })
return (
<div>
<h3>Actions</h3>
<ul>
{data?.actions.map((i) => (
<li key={i.trx_id}>
{i.trx_id} - {new Date(i.timestamp).toLocaleString()}
</li>
))}
</ul>
</div>
)
}
// App.tsx
import { UseHyperionProvider } from '@cryptopuppie/useeoshyperion'
import Component from './Component.tsx'
export default function App() {
return (
<UseHyperionProvider endpoint="https://testnet.waxsweden.org">
<Component />
</UseHyperionProvider>
)
}
```
- **Error handling**
All of the hooks also expose error values, `error` and `hasFailed`.
If `hasFailed` is true, `data` is null and `error` contains the failure; otherwise `error` is null.
```jsx
import { useGetActions } from '@cryptopuppie/useeoshyperion'
export default function App() {
const { data, hasFailed, error } = useGetActions(
{ account: 'fckedupmyacc', limit: 5 },
'https://testnet.waxsweden.org'
)
if (hasFailed) {
return <p>{error.message}</p>
}
return (
<div>
<h3>Actions</h3>
<ul>
{data?.actions.map((i) => (
<li key={i.trx_id}>
{i.trx_id} - {new Date(i.timestamp).toLocaleString()}
</li>
))}
</ul>
</div>
)
}
```
### Hooks
All of the primary endpoints are implemented; a usage sketch follows the list.
- Health
- `useGetHealth`
- History
- `useGetABISnapshot`
- `useGetActions`
- `useGetCreatedAccounts`
- `useGetCreator`
- `useGetDeltas`
- `useGetSchedule`
- `useGetTransaction`
- State
- `useGetAccount`
- `useGetKeyAccounts`
- `useGetLinks`
- `useGetProposals`
- `useGetTokens`
- `useGetVoters`
- Stats
- `useGetMissedBlocks`
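As one more example, the health endpoint hook can be used on its own. This is a minimal sketch that assumes `useGetHealth` accepts the endpoint as its only argument, mirroring the signatures of the hooks shown above:
```jsx
import { useGetHealth } from '@cryptopuppie/useeoshyperion'

export default function Status() {
  // endpoint argument assumed to match the other hooks
  const { data, hasFailed, error } = useGetHealth('https://testnet.waxsweden.org')

  if (hasFailed) return <p>{error.message}</p>
  return <pre>{JSON.stringify(data, null, 2)}</pre>
}
```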
##
**2022 | World of Cryptopups**
| 21.190141 | 103 | 0.592888 | eng_Latn | 0.444482 |
da0e07131470a4269acf2cd7c2ae27b5a8bd6c30 | 14,259 | md | Markdown | Readme.md | schwamster/docker-tutorial | d97b5539a7c1e86eee94f58b32dc003c07be9854 | ["MIT"] | 13 | 2017-04-16T11:52:38.000Z | 2019-01-09T19:57:36.000Z | Readme.md | schwamster/docker-tutorial | d97b5539a7c1e86eee94f58b32dc003c07be9854 | ["MIT"] | null | null | null | Readme.md | schwamster/docker-tutorial | d97b5539a7c1e86eee94f58b32dc003c07be9854 | ["MIT"] | 6 | 2017-04-12T14:40:03.000Z | 2020-07-31T10:10:38.000Z | # Docker Tutorial with asp.net core
This is related to [this post on devt.to](https://dev.to/schwamster/docker-tutorial-with-for-aspnet-core)
In this tutorial, you will learn how to build and run your first asp.net core docker image. We start off with a very short general docker introduction.
After that we choose the "right" images. We will first create a docker container that is responsible for building our source files. For that we copy our source files to the build container. When the build is done we will copy the published project back to the host system and create a runtime image. After that we explore the handy addition "multi-stage" build to simplify the build.
You will need to install [dotnet core](https://www.microsoft.com/net/core) and [docker](https://docs.docker.com/engine/installation/) on your machine before you begin this tutorial.
If you are running behind a proxy some of the commands might not work, so be sure to check out the [Proxy-Section](#proxy) below.
## The Dockerfile
If you already have basic knowledge of Docker skip this introduction and go straight to [Choose an image](#choose_image).
You can run one of the many images that are ready for use on [hub.docker.com](https://hub.docker.com). For example, you can
run a command on an instance of Debian, a popular Linux distro, with the following command:
```powershell
docker run debian echo "Welcome to Docker"
```

This might take a while the first time, since docker has to pull the image. A second run should start the command in a fraction of a second.
Instead of running a "throw away" container you can also use a container interactively like so:
```powershell
docker run -it debian /bin/bash
```

Check out the docker run reference to find out more: [docker run](https://docs.docker.com/engine/reference/run/)
You can exit the container by typing "exit" and hitting enter.
But you can not only run other people's images, you can also create your own. For that you will need to create a *Dockerfile*. The *Dockerfile* describes an image and all its dependencies in steps.
We can start with a simple Dockerfile that extends our hello world example.
Create a new folder called cowsay and add a file called Dockerfile. Add the following content to the file:
```dockerfile
FROM debian
RUN apt-get update && apt-get install -y cowsay
ENTRYPOINT ["/usr/games/cowsay"]
```
In this dockerfile we are doing the following:
1. defining what base image we want to use => debian
2. running a command in the image that updates the package manager and installs an app called cowsay
3. defining what app to run when the image is run
For a full reference of the available instructions in Dockerfile go here [Dockerfile](https://docs.docker.com/engine/reference/builder/)
Now let's build the image with the build command from the created folder:
```powershell
docker build -t cowsay .
```
If this hangs and you are running behind a proxy check [this](#proxy) out.

Now that we have built our image we can run it:
```powershell
docker run cowsay "Welcome to Docker"
```

## Choose an image<a name="choose_image"></a>
Go to [hub.docker.com](https://hub.docker.com) and search for aspnetcore
You will find many different choices. Unless there are very special reasons, I would opt for official images or images uploaded by the companies involved. Two images are interesting:

There are two different images provided by Microsoft. One of them only contains the runtime and the other contains the SDK as well - see the following descriptions.
### ASP.NET Core Docker Image
This repository contains images for running **published** ASP.NET Core applications. These images use the
microsoft/dotnet image as its base.
### ASP.NET Core Build Docker Image
This repository contains images that are used to **compile/publish** ASP.NET Core applications inside the container. This is different to compiling an ASP.NET Core application and then adding the compiled output to an image, which is what you would do when using the microsoft/aspnetcore image. These Dockerfiles use the microsoft/dotnet image as its base.
## Create an asp.net core project
Create a folder called docker-tutorial and navigate to it, then execute the following command:
```powershell
dotnet new webapi
```
## First Build
Let's start easy and compile the app on our computer and then add the output to the runtime image.
Run the following commands in the root of your project:
```powershell
dotnet restore
dotnet publish -o ./publish
```
You should now have a publish folder that contains your compiled application.
Now create a new Dockerfile in the root of the application:
```dockerfile
FROM microsoft/aspnetcore:2.0
WORKDIR /app
COPY ./publish .
ENTRYPOINT ["dotnet", "docker-tutorial.dll"]
```
This Docker image will copy the contents of the publish folder in the root of your project into the app folder in the image.
Build the image:
```powershell
docker build -t docker-tutorial .
```
You can find out more about the build command [here](https://docs.docker.com/engine/reference/commandline/build/)
Test the image:
```powershell
docker run -p 8181:80 docker-tutorial
```
Now you can navigate to the hosted application: http://localhost:8181/api/values
You should get a response like this:
```json
["value1","value2"]
```
Your docker engine might not be reachable through localhost. If so, change to the correct url. If you
are using the docker toolbox with docker-machine, you can get the ip with the following command:
```powershell
docker-machine ip default
```
## Compiling within the aspnetcore-build image
It is recommended to compile your project within the docker image, since this will produce a more reliable build pipeline. The build on the development machine will work the same way as the build on the build server.
So let's create another Dockerfile called Dockerfile.build
```dockerfile
FROM microsoft/aspnetcore-build:2.0
WORKDIR /app
COPY *.csproj .
RUN dotnet restore
COPY . .
RUN dotnet publish --output /out/ --configuration Release
```
The new instruction we use here is *COPY*. This copies files from our host into the image.
Also note what happens when you rebuild the image. If you don't change anything nothing will be done. If you change something in the code the publish instruction will be executed but not *dotnet restore*. Only if you change some dependency will the *dotnet restore* instruction be executed.
For a more detailed description of this "layered" build process check [this](https://docs.docker.com/engine/userguide/storagedriver/imagesandcontainers/) out.
Before we build the "build-image" we need to add one more file to prevent dotnet commands on our host (dotnet restore/build/publish) from interfering with the build context. See [this](https://codefresh.io/blog/not-ignore-dockerignore/) for more information. Add a file called .dockerignore with the following content to the root of the project:
```txt
bin
obj
publish
```
Now let's build the image. Note we have to explicitly specify what Dockerfile we want to use:
```powershell
docker build -f Dockerfile.build -t docker-tutorial-build .
```
We have now built the app in the image. We could now run a container from that image, but run the following command first:
```powershell
docker image ls | sls docker-tutorial
```

As you can see the build image is dramatically larger than the image we created before. This is because the build image has absolutely everything you need to build your app (the SDK). We don't need that when we run our container. The solution is to create a runtime image.
So the next step is to get the compiled app out of the build image. First we create a container with the [create](https://docs.docker.com/engine/reference/commandline/create/) command. This is almost like *docker run*, except that the container is never actually started. We can however copy out the compiled app.
```powershell
docker create --name docker-tutorial-build-container docker-tutorial-build
```
Note: delete the publish folder we created earlier - we will now copy the container's compiled output into that folder:
```powershell
docker cp docker-tutorial-build-container:/out ./publish
```
Great now we can build the runtime image just like before:
```powershell
docker build -t docker-tutorial .
```
And of course run it:
```powershell
docker run -p 8181:80 docker-tutorial
```
Note: you probably have to stop the container we started earlier, since that one is already using port 8181. To do so, first list the running processes:
```powershell
docker ps
```
Copy the container ID, e.g. ba51e5dc4036
and run the following command:
```powershell
docker rm $(docker stop ba51e5dc4036)
```
# Multi-Stage Builds
With Docker version 17.05 we got a new feature that makes the build process much easier. The reason we have a build image and a runtime image is
that we want a slimmer image at runtime. Since this is a very common requirement for a lot of languages, Docker provided us with multi-stage
builds => [see documentation](https://docs.docker.com/engine/userguide/eng-image/multistage-build/). This means we can now define the build and runtime images in one single Dockerfile and copy the produced binaries from the build image into
our runtime image.
First stop the container if it is still running:
```powershell
docker stop $(docker ps --filter "ancestor=docker-tutorial" -q)
```
Add a new Dockerfile with the name Dockerfile.multistage with the following content:
```dockerfile
# build image
FROM microsoft/aspnetcore-build:2.0 as build
WORKDIR /app
COPY *.csproj .
RUN dotnet restore
COPY . .
RUN dotnet publish --output /out/ --configuration Release
# runtime image
FROM microsoft/aspnetcore:2.0
WORKDIR /app
COPY --from=build /out .
ENTRYPOINT [ "dotnet", "docker-tutorial.dll" ]
```
Building the image is now much easier. All you have to do is run the following:
```powershell
docker build -f .\Dockerfile.multistage -t docker-tutorial .
```
Check the image size of the image again:
```powershell
docker image ls | sls docker-tutorial
```
The resulting runtime image still has the much smaller footprint, since the intermediate images in a multi-stage build don't make it into the resulting image.
Run it:
```powershell
docker run -p 8181:80 docker-tutorial
```
Great! Now we have massively simplified the build process and still kept the image small! At this point I would get rid of the existing Dockerfile and Dockerfile.build and rename Dockerfile.multistage to Dockerfile. Then the build command looks like this:
```powershell
docker build -t docker-tutorial .
```
# Publishing & Pulling Images
Now that we have built the image it would be nice if we could use it on other docker hosts. To do so we can upload our image to a docker registry.
There are many different choices for docker registries: hub.docker.com, AWS ECR, Artifactory...
For simplicity we will be using hub.docker.com which is free for public images.
If you haven't done so yet, [create an account](https://hub.docker.com/).
You can then logon in powershell:
```powershell
docker login
```

To be able to upload (push) our image we have to prefix our image with our username. My username is schwamster so I would have to run the following command:
```powershell
docker tag docker-tutorial schwamster/docker-tutorial
```
Now I can push the image
```powershell
docker push schwamster/docker-tutorial
```

After the image is pushed I can verify that it worked by opening the following url: [https://hub.docker.com/r/schwamster/docker-tutorial/](https://hub.docker.com/r/schwamster/docker-tutorial/)
To pull and run my image, use the following command (docker will pull the image automatically if it is not already present locally):
```powershell
docker run -p 8182:80 schwamster/docker-tutorial
```
My image will now be pulled and run on your machine. Check it out under http://localhost:8182/api/values (note the port changed to 8182).
# Proxy<a name="proxy"></a>
If you are forced to go through a proxy you will have to adjust some of the commands we used above.
## Proxy and docker run
If you need to have internet access from within your container you will have to add the proxy settings to the respective environment variables of the container instance. These can differ depending on what application you use. In general I would set the following:
* http_proxy
* https_proxy
I noticed that some applications want the variables to be set uppercase -> HTTP_PROXY, HTTPS_PROXY. Other apps might need dedicated environment variables or even changes in config files.
To add the proxy environment variables add each environment variable with the [-e argument](https://docs.docker.com/engine/reference/run/#env-environment-variables)
Here is an example:
```powershell
docker run -it -e https_proxy=http://someproxy:8080 -e http_proxy=http://someproxy:8080 debian /bin/bash
```
To test this run the following command in your container:
```bash
apt-get update
```
apt-get update should now work and not run into a timeout.
## Proxy and docker build
If you need internet access while building the image you need to pass the environment variables with the [--build-arg argument](https://docs.docker.com/engine/reference/builder/#arg) just like you do with runtime environment variables. Only the argument name is different.
Example:
```powershell
docker build --build-arg http_proxy=http://someproxy:8080 --build-arg https_proxy=http://someproxy:8080 -t cowsay .
```
# Acknowledgement
Please check out this great [book](http://shop.oreilly.com/product/0636920035671.do) "Using Docker" by Adrian Mouat (O'Reilly) ISBN 978-1-491-91576-9
| 37.132813 | 383 | 0.74872 | eng_Latn | 0.995191 |
da0e419c910e05fc9ce0b8f782c06b1ec4e77984 | 646 | md | Markdown | Harmony/Documentation/articles/execution.md | bbepis/Harmony | 7263ad47b90b600f1d05c60d87475b2a6c6a135a | ["MIT"] | 2 | 2020-03-24T08:26:06.000Z | 2021-02-15T18:00:38.000Z | src/Harmony/Harmony/Documentation/articles/execution.md | 2302053453/HarmonyOS | 52982b452106d3a50eb797aaa47144ca0e737199 | ["Apache-2.0"] | null | null | null | src/Harmony/Harmony/Documentation/articles/execution.md | 2302053453/HarmonyOS | 52982b452106d3a50eb797aaa47144ca0e737199 | ["Apache-2.0"] | 1 | 2020-01-12T15:54:55.000Z | 2020-01-12T15:54:55.000Z | # Execution Flow
Patching a method does not override any previous patches that other users of Harmony apply to the same method. Instead, prefix and postfix patches are executed in a prioritised way. Prefix patches can return a boolean that, if false, terminates prefixes and skips the execution of the original method. In contrast, all postfixes are executed all the time.
Execution of prefixes and postfixes can be explained best with the following pseudo code:
```
run = true
result = null
if (run) run = Prefix1(...)
if (run) run = Prefix2(...)
// ...
if (run) result = Original(...)
Postfix1(...)
Postfix2(...)
// ...
return result
```
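As a concrete illustration of this flow, a patch with a prefix that conditionally skips the original method might look like the following. This is a minimal sketch: `SomeType`, `SomeMethod`, and `ShouldRunOriginal` are hypothetical names, and the namespace is `HarmonyLib` in Harmony 2.x (`Harmony` in 1.x):
```csharp
using HarmonyLib;

// Hypothetical patch target.
class SomeType { public int SomeMethod() => 0; }

[HarmonyPatch(typeof(SomeType), nameof(SomeType.SomeMethod))]
class SomeMethodPatch
{
    // Stand-in for whatever condition your patch checks.
    static bool ShouldRunOriginal() => true;

    // Returning false skips the original (and, per the flow above,
    // any later prefixes); returning true lets execution continue.
    static bool Prefix()
    {
        return ShouldRunOriginal();
    }

    // Postfixes always run; __result lets a postfix read or modify
    // the original's return value.
    static void Postfix(ref int __result)
    {
        __result += 1;
    }
}
```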
da0e4eda51540222b2497798260690f05ac10ea1 | 2,890 | md | Markdown | _posts/2019-02-28-react-lifecycle.md | ghwlchlaks/ghwlchlaks.github.io | f875955d6333f32740aaf78149cbaebb7e35433c | [
"MIT"
] | null | null | null | _posts/2019-02-28-react-lifecycle.md | ghwlchlaks/ghwlchlaks.github.io | f875955d6333f32740aaf78149cbaebb7e35433c | [
"MIT"
] | null | null | null | _posts/2019-02-28-react-lifecycle.md | ghwlchlaks/ghwlchlaks.github.io | f875955d6333f32740aaf78149cbaebb7e35433c | [
"MIT"
] | null | null | null | ---
layout: post
title: "[React] LifeCycle"
date: 2019-02-28 00:00:00
description: React lifecycle # Add post description (optional)
img: react_js.png # Add image post (optional)
categories: Jiho
tags: [Blog, IT]
navigation: True
subclass: "post tag-IT tag-Interview"
logo: "assets/images/default/DMB_logo.png"
cover: "assets/images/cover/react_js.png"
author: Jiho # Add name author (optional)
disqus: true
---
면접을 준비하면서 리액트를 빠르게 배워서 개발을 진행했습니다.
공부하면서 중요한 부분중에 하나가 역시 LifeCycle이었고 어려운 부분이었습니다.
어떠한 내용들이 있는지 정리하는 포스팅을 해보려고 합니다.
## 생성
---
### 1.constructor
```
constructor(props) {
super(props);
}
```
constructor는 생성자 메소드로 컴포넌트가 처음 생성될떄 만들어집니다.
props또한 전달되어집니다.
### 2.componentWillMount (deprecated)
### 3. render
```
render {
return()
}
```
리액트를 하시면 위와 같은 코드를 보실 수 있는데요 jsx문법을 사용하여 렌더링하는 부분입니다.
### 4. componentDidMount
```
componentDidMount() {
Document.get...
axios.get...
}
```
componentDidMount는 3번 render이후에 즉 컴포넌트가 렌더링된 이후 호출되는 메소드로 DOM 조작 및 axios등을 이용한 비동기 데이터 요청을 주로 작성하는 부분입니다.
여기서 주의할점은 componentDidMount 메소드에서 setState를 이용하여 state의 값을 변경하게되면 리렌더링이 되므로 작성하지 않는 것이 좋습니다.
## 업데이트
---
### 1. getDerivedStateFromProps
```
static getDerivedStateFromProps(nextProps, prevState) {
if (prevState.val !== nextProps.val) {
return {val: nextProps.val};
}
return null;
}
```
getDerivedStateFromProps는 Props가 변할때 state값을 변경해서 리렌링 할 수 있는 메소드입니다.
위와 같이 기본적으로 null을 리턴하며 이전 state의 값과 이후 받은 props의 값을 비교하여 다른 경우에만
state를 변경시켜주는 코드입니다. 여기서 주의하실점은 setState문을 이용하여 state를 변경하는 것이아니라
retrun {val: nextProps.val}의 형태로 반환해준다는 것입니다.
### 2. shouldComponentUpdate
```
shouldComponentUpdate(nextProps, nextState) {
if ( this.state.val !== nextState.val) {
return false;
}
return true;
}
```
기본적으로 return하는 값은 true이며 true인경우에는 리렌더링을 진행합니다.
해당 메소드는 성능 최적화를 하기위해서 사용하는 메소드로 굳이 리렌더링을 하지 않아도 되는 state를 막는 것입니다.
위와같이 state혹은 Props를 비교해서 값이 변경된 경우에만 렌더링을 하게 할 수 있습니다.
### 3. getSnapshotBeforeUpdate
```
getSnapshotBeforeUpdate(prevProps, prevState) {
return prevState.val;
}
```
해당 메소드는 수정(update)이 발생하기 바로 전에 호출되는 메소드입니다. 해당 메소드에서 반환한 값은
componentDidupdate에 세번째 매개변수로 전달됩니다. (자주 사용하는 부분은 아닌것같습니다.)
찾아보니 리렌더링되는 동안 스크롤의 위치가 처음으로초기화 되는 것이아니라 기존의 위치로 렌더링되기위해 기존의 위치를 update되기전 넘겨주는 역할을 하는 경우에 사용한다고 합니다.
### 4. componentDidUpdate
```
componentDidUpdate(prevProps, prevState, [snapshot]) {
}
```
해당 메소드는 업데이트 처리를 끝내고 render이 된 이후에 실행되는 메소드입니다.
즉 모든 props와 state의 값이 변경이된상태이고 prevProps와 prevState 인자를 이용해 이전의 값들은 읽을 수 있습니다.
또한 세번째 인자인 snapshot은 3번에 getSnapshotBeforeUpdate에서 반환한 데이터입니다.
## 소멸
---
### 1. componentWillUnmount
```
componentWillUnmount() {
}
```
해당 메소드는 컴포넌트가 소멸될때 발생하는 메소드로 인스턴스를 제거하는 코드를 작섷해줍니다.
여기까지 큰틀로 실행되는 리액트의 LifeCycle에대해서 알아보았습니다.
deprecated 된것이 꽤 있기 때문에 모든것을 적지는 않았습니다.
저 또한 작성하면서 다시 공부할 수 있었습니다.
해당 게시물에 문제가 있다면 댓글을 통해 피드백해주시면 감사하겠습니다~ 같이 공부해요~^^
`방문해주신분들 댓글 한개씩 달아주시면 감사하겠습니다~~^^`
| 20.642857 | 107 | 0.714533 | kor_Hang | 1.00001 |
da0ec5ff687c535a9c8e3ba5c7de8c9dc79edd68 | 262 | md | Markdown | README.md | SmithB/Altimetry_fit | 2b8a11b99e6e7b7dd68b7588599ac20fa7240e13 | [
"MIT"
] | null | null | null | README.md | SmithB/Altimetry_fit | 2b8a11b99e6e7b7dd68b7588599ac20fa7240e13 | [
"MIT"
] | null | null | null | README.md | SmithB/Altimetry_fit | 2b8a11b99e6e7b7dd68b7588599ac20fa7240e13 | [
"MIT"
] | null | null | null | # altimetryFit
Scripts for fitting smooth surfaces to altimetry data
This repository includes code that uses the utilities in the smithB/LSsurf and smithB/pointCollection repos to fit smoothly changing surfaces to altimetry data from Greenland and Antarctica.
| 43.666667 | 190 | 0.839695 | eng_Latn | 0.997519 |
da0f7c76d7b9a803ba503d8c7181656d013e5df2 | 127 | md | Markdown | README.md | Kenneth-Sweet/Sluggin | 95f33c4024b2d9e3c02aa19fe580542f2481a616 | [
"Apache-2.0"
] | null | null | null | README.md | Kenneth-Sweet/Sluggin | 95f33c4024b2d9e3c02aa19fe580542f2481a616 | [
"Apache-2.0"
] | null | null | null | README.md | Kenneth-Sweet/Sluggin | 95f33c4024b2d9e3c02aa19fe580542f2481a616 | [
"Apache-2.0"
] | null | null | null | # Sluggin
Slugs and Snails sling slime through an 00's netscape with every feature of their visage updownloaded to the new net
| 42.333333 | 116 | 0.811024 | eng_Latn | 0.998976 |
da10942178987ae85cc51ee294dfba8974c53da0 | 41 | md | Markdown | README.md | chenyinxin/wheel-core | 2b0aa2b11dbf7ddd090337919af58be48c9d7824 | [
"MIT"
] | null | null | null | README.md | chenyinxin/wheel-core | 2b0aa2b11dbf7ddd090337919af58be48c9d7824 | [
"MIT"
] | null | null | null | README.md | chenyinxin/wheel-core | 2b0aa2b11dbf7ddd090337919af58be48c9d7824 | [
"MIT"
] | null | null | null | # wheel-core
NetCore轮子仓库,保存好用的轮子,写代码的都懂。
| 13.666667 | 27 | 0.804878 | eng_Latn | 0.383929 |
da11f5875d5ad8c7b312b5fb53773b1cb2882464 | 2,035 | md | Markdown | readme.md | ManchesterMakerspace/doorboto2 | ee8aae333b405c04c1e244c43f66b76b3094df50 | [
"MIT"
] | 1 | 2022-01-04T16:17:14.000Z | 2022-01-04T16:17:14.000Z | readme.md | ManchesterMakerspace/doorboto2 | ee8aae333b405c04c1e244c43f66b76b3094df50 | [
"MIT"
] | 6 | 2020-06-01T12:00:06.000Z | 2020-12-16T00:23:59.000Z | readme.md | ManchesterMakerspace/doorboto2 | ee8aae333b405c04c1e244c43f66b76b3094df50 | [
"MIT"
] | 2 | 2020-05-31T00:53:55.000Z | 2020-12-01T01:04:21.000Z | # Requisites
- Manchester Makerspace's target system is an ARM linux box
- OSX would probably work as well. Roll your own install script
- A x86 64 bit Debian based Distro was the dev env, and it's less finicky with Serialport
- Windows 10 ??? not sure. Try to roll your own install script
- Node.js and NPM: To run doorboto use versions specified in package.json. Serial library is picky
- A mongo server where your members database is managed by another program.
- This Mongo server could be local or remote. Either way remember to use access control on the mongo server.
- Mongo Atlas has a free tier cloud instance that can be setup easily.
- Webhook URL to slack is intended but not required
- Arduino IDE or CLI (on dev machine to program the reader/latch)
- Arduino rfid reader and door latch relay, firmware included in /reader_firmware
- See /reader_firmware/readme.md for more details
# Setup
In the current implementation an Raspberry Pi is use in combination with a Arduino nano connected using usb communicating over serial on port /dev/ttyATH0. This port may need to added to the dial out group on the PI for doorboto to have permission to use it.
To get the latest version of this repo
git clone https://github.com/ManchesterMakerspace/doorboto2.git
Take a look at install.sh and see if it's suitable to run in your environment.
If it is suitable it should be possible to install all dependencies from scratch by running.
npm run install
Create an executable script called prod.sh exporting the required env vars in ecosystem.config.js.
Run the following
npm start
## Updates
5/1/2020 - Hardware Note: we are currently using a raspberry pi instead of a dedicated desktop PC but it is using a usb drive instead of an SD.
12/2/2020 - Onsite auth server refactor deploy to use Node 14.x, Mongo Driver 3.6.x, and Serialport 9.x
- serialport compatibility took some doing with setting the correct bindings for ARM.
## License
Copyright 2016-2020 ~ Manchester Makerspace ~ MIT License
| 43.297872 | 258 | 0.773956 | eng_Latn | 0.997617 |
da12227d6de1795ec0dfcbf5c8b3aac3aad5a5a4 | 368 | md | Markdown | _posts/2018-01-28-post.md | payberah/payberah.github.io | f8fd49c011df8fdf1aa949105876bbdecbc63947 | [
"MIT"
] | 10 | 2018-01-23T18:18:48.000Z | 2021-03-04T17:20:47.000Z | _posts/2018-01-28-post.md | payberah/payberah.github.io | f8fd49c011df8fdf1aa949105876bbdecbc63947 | [
"MIT"
] | 3 | 2021-01-05T12:20:11.000Z | 2022-02-26T03:36:14.000Z | _posts/2018-01-28-post.md | payberah/payberah.github.io | f8fd49c011df8fdf1aa949105876bbdecbc63947 | [
"MIT"
] | 12 | 2018-05-13T11:01:56.000Z | 2022-02-26T05:44:09.000Z | ---
title: 'مهربانی'
date: 2018-01-28
permalink: /posts/post76/
---
<div align="justify" dir="rtl" style="font-family:vazir;">
مهربان بودن سادهست. در خیابان پشت فرمان بودم که ماشین جلویم روی ترمز زد. فرد کنار راننده، پیاده شد. پیرزنی که چرخدستی دستش بود و نمیتوانست از خیابان عبور کند را به آن طرف رساند. به ماشین برگشت و به راهش ادامه داد. به همین سادگی.
</div>
| 33.454545 | 231 | 0.717391 | pes_Arab | 0.979814 |
da12e3806a04f9f7e6edde57c276585517c447d6 | 2,103 | md | Markdown | _posts/2021-02-16-python-doit.md | alfonso-john2021/alfonso-john2021.github.io | bea0d770063fbace123dba1c9aa884bbd3f5ca04 | [
"MIT"
] | null | null | null | _posts/2021-02-16-python-doit.md | alfonso-john2021/alfonso-john2021.github.io | bea0d770063fbace123dba1c9aa884bbd3f5ca04 | [
"MIT"
] | null | null | null | _posts/2021-02-16-python-doit.md | alfonso-john2021/alfonso-john2021.github.io | bea0d770063fbace123dba1c9aa884bbd3f5ca04 | [
"MIT"
] | null | null | null | ---
layout: post
title: python 개발을 준비하며
tags: [blog, python]
---
## 1. 통합개발환경(ide) 선택
### 1.1 python IDLE
일단 첫번째로, python을 설치할때 기본적으로 같이 동봉되는 IDLE이라는 선택지가 있습니다.
처음 시작할때는 이만한 선택지가 없죠. 따로 설치할 필요도 없고, 호환성이나 다른 복잡한 문제들을 **전혀** 신경쓸 필요가 없습니다.
하지만 그런 장점도 있다면 단점들도 있겠죠?
1. 폴더를 여는 기능이 없다. -> 여러 파일들을 다루는 프로젝트에는 적합하지 않다.
2. 자동완성이나 저장, git 등 편리한 부가기능들이 없다.
이러한 단점들이 있긴 하지만, 간단하게 파일을 열어볼 때는 나쁘지 않은 선택지 입니다.
### 1.2 sublime text 3
말하지 않아도 될만큼 유명한 편집기이죠, 하지만 무료 버전을 사용한다면 매번 이러한 팝업창이 뜰 겁니다.`you need to purchase a license for continued use`
하지만 별 문제는 없습니다. 그저 귀찮을 뿐..
### 1.3 atom
자! 나왔습니다. github에서 개발하여 여러 패키지들과 깔끔한 ui, 많은 확장들..
python만 다룰것이 아니라면 atom은 상당히 좋은 선택입니다.
앞서 말했듯이 github에서 개발하였기에, git의 사용도 간편하며 쉽습니다.
한가지 단점이자 장점이 있습니다. 바로 처음에는 정말 기본적인 에디터만 존재한다는 거죠.
어떤 사람들에게는 깔끔하여 장점이 될수도 있고, 어떤 사람에게는 기능이 없어 흥미가 가지않는 **그저그런** 에디터처럼 보일수도 있겠죠.
### 1.3 Pycharm
python을 위한, python을 위해, python으로!.. 사실 무료인 community 버전만 그렇습니다.
돈을 주고 구매할수 있는 professional 버전은 완전한 ide의 형태를 지니고 있으며 코드 자동완성, 디버깅, 잘못된 문자열 하이라이팅, git 등등... 사실 이중 대부분은 community버전에도 있는 기능입니다. 그러나 flask 등을 배우다 보면 약간의 아쉬움을 느낄지도 모릅니다.
단점이 하나 있는데, 컴퓨터 자원을 아낌없이 먹는다는 겁니다. 4g 이하의 ram 정도라면(실제 노트북 기준)windows 7환경에서 약간의 **버벅거림** 이 느껴집니다. 물론 그 창만 띄워 놓는다면 상관 없을지도 모르지만 보통 인터넷 정도는 띄워 두고 작업을 하니 말이죠.
### 1.4 **VISUAL STUDIO CODE**
본론 나왔습니다! 본가 격인 visual studio에서 텍스트 편집 부분만 **똑** 떼어내 만든(+디버거,터미널) 아주 가벼운 에디터이죠.

여기 보다시피 무려 20~50GB가 필요하다고 나와있습니다. 하지만 vscode는 겨우 100~200MB 정도만 필요하죠. 그래서 portable 버전도 있습니다.(여기서 다루지는 않습니다.)
본인이 아주 잘 쓰고있는 에디터중 하나이며 넓은 확장 생태계를 자랑합니다.
거기다 .MD 즉 markdown역시 지원하고 미리보기도 지원합니다.
바로바로 변경 즉시 미리보기에 반영되어 글을 작성하기에도 용이하죠.
물론 지금 글 작성은 typora로 하고있긴 하지만...(나중에 따로 다룰 예정입니다)
아까 말했듯이 기본적으로는 오직 텍스트 에디터이므로, python에서 활용하려면
확장들을 설치해야 합니다. 또한 내장된 git기능 역시 존재하죠. 유용한 디버거, 내장된 터미널 트리형 파일 시스템... 확장으로 소소하게는 시계를 추가하는 것부터
파이썬의 `jupyter`역시 확장으로 설치할수 있습니다.
정말 가볍고, 다루기 쉬우며 간단합니다. 또 확장으로 더욱 편리하게 사용할수 있죠.
## 2. 에디터를 고른 후에는?
어려울것 없습니다! 즐기세요!
...
...
사실 그냥 쓰라고 하면 적응하기 힘들죠. 나중에 적응하는 법도 포스팅할 예정입니다. 일단 편하게 써보세요.
| 25.646341 | 165 | 0.704232 | kor_Hang | 1.00001 |
da13ccbe58fee04ea224eb1167a2b661976682ef | 5,759 | md | Markdown | articles/marketplace/azure-vm-create-using-own-image.md | tsunami416604/azure-docs.hu-hu | aeba852f59e773e1c58a4392d035334681ab7058 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/marketplace/azure-vm-create-using-own-image.md | tsunami416604/azure-docs.hu-hu | aeba852f59e773e1c58a4392d035334681ab7058 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/marketplace/azure-vm-create-using-own-image.md | tsunami416604/azure-docs.hu-hu | aeba852f59e773e1c58a4392d035334681ab7058 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Azure-beli virtuális gépek ajánlatának létrehozása az Azure Marketplace-en saját rendszerkép használatával
description: Ismerje meg, hogyan tehet közzé egy virtuálisgép-ajánlatot az Azure Marketplace-en saját rendszerkép használatával.
ms.service: marketplace
ms.subservice: partnercenter-marketplace-publisher
ms.topic: how-to
author: emuench
ms.author: krsh
ms.date: 10/20/2020
ms.openlocfilehash: 42022d1204c3b524ee2e9ef2770f616fba89dc8c
ms.sourcegitcommit: b6f3ccaadf2f7eba4254a402e954adf430a90003
ms.translationtype: MT
ms.contentlocale: hu-HU
ms.lasthandoff: 10/20/2020
ms.locfileid: "92284305"
---
# <a name="how-to-create-a-virtual-machine-using-your-own-image"></a>Virtuális gép létrehozása saját rendszerkép használatával
Ez a cikk bemutatja, hogyan hozhat létre és helyezhet üzembe egy felhasználó által biztosított virtuális gépet (VM) tartalmazó lemezképet.
> [!NOTE]
> Az eljárás megkezdése előtt tekintse át az Azure-beli virtuális gépekre vonatkozó [technikai követelményeket](marketplace-virtual-machines.md#technical-requirements) , beleértve a virtuális merevlemez (VHD) követelményeit.
Ha ehelyett egy jóváhagyott alaprendszerképet szeretne használni, kövesse a virtuálisgép- [rendszerkép létrehozása jóváhagyott alapból](azure-vm-create-using-approved-base.md)című témakör utasításait.
## <a name="configure-the-vm"></a>A virtuális gép konfigurálása
Ez a szakasz azt ismerteti, hogyan lehet méretezni, frissíteni és általánosítani egy Azure-beli virtuális gépet. Ezek a lépések szükségesek ahhoz, hogy előkészítse a virtuális gépet az Azure Marketplace-en való üzembe helyezéshez.
### <a name="size-the-vhds"></a>A virtuális merevlemezek mérete
[!INCLUDE [Discussion of VHD sizing](includes/vhd-size.md)]
### <a name="install-the-most-current-updates"></a>A legújabb frissítések telepítése
[!INCLUDE [Discussion of most current updates](includes/most-current-updates.md)]
### <a name="perform-additional-security-checks"></a>További biztonsági ellenőrzések végrehajtása
[!INCLUDE [Discussion of addition security checks](includes/additional-security-checks.md)]
### <a name="perform-custom-configuration-and-scheduled-tasks"></a>Egyéni konfiguráció és ütemezett feladatok végrehajtása
[!INCLUDE [Discussion of custom configuration and scheduled tasks](includes/custom-config.md)]
## <a name="upload-the-vhd-to-azure"></a>A virtuális merevlemez feltöltése az Azure-ba
Konfigurálja és készítse elő a virtuális gépet, amelyet a [Windows VHD vagy VHDX előkészítése az Azure](../virtual-machines/windows/prepare-for-upload-vhd-image.md) -ba való feltöltéshez, illetve [linuxos virtuális merevlemez létrehozásához és feltöltéséhez](../virtual-machines/linux/create-upload-generic.md)című cikkben ismertetett módon kell feltölteni.
## <a name="extract-the-vhd-from-image-if-using-image-building-services"></a>A virtuális merevlemez kibontása rendszerképből (rendszerkép-építési szolgáltatások használata esetén)
Ha rendszerkép-építési szolgáltatást (például [csomagolót](https://www.packer.io/)) használ, előfordulhat, hogy ki kell bontania a virtuális merevlemezt a rendszerképből. Erre nincs közvetlen mód. Létre kell hoznia egy virtuális gépet, és ki kell bontania a virtuális merevlemezt a virtuálisgép-lemezről.
### <a name="create-the-vm-on-the-azure-portal"></a>A virtuális gép létrehozása a Azure Portal
Az alábbi lépéseket követve hozza létre az alapszintű VM-rendszerképet a [Azure Portal](https://ms.portal.azure.com/).
1. Jelentkezzen be az [Azure Portalra](https://ms.portal.azure.com/).
2. Válassza a **Virtuális gépek** lehetőséget.
3. Válassza a **+ Hozzáadás** lehetőséget a **virtuális gép létrehozása** képernyő megnyitásához.
4. Válassza ki a lemezképet a legördülő listából, vagy válassza az **összes nyilvános és privát rendszerkép tallózása** lehetőséget a rendelkezésre álló virtuálisgép-lemezképek kereséséhez vagy tallózásához.
5. A 2. **generációs** virtuális gépek létrehozásához nyissa meg a **speciális** lapot, és válassza a **2. generációs** lehetőséget.
:::image type="content" source="media/create-vm/vm-gen-option.png" alt-text="Válassza az 1. gen vagy a 2. lehetőséget.":::
6. Válassza ki a telepítendő virtuális gép méretét.
:::image type="content" source="media/create-vm/create-virtual-machine-sizes.png" alt-text="Válassza az 1. gen vagy a 2. lehetőséget.":::
7. Adja meg a virtuális gép létrehozásához szükséges egyéb adatokat.
8. Válassza a **felülvizsgálat + létrehozás** lehetőséget a választási lehetőségek áttekintéséhez. Amikor megjelenik az **érvényesítési** üzenet, válassza a **Létrehozás**lehetőséget.
Az Azure megkezdi a megadott virtuális gép üzembe helyezését. Az előrehaladás nyomon követéséhez válassza a bal oldali menü **Virtual Machines** lapját. Miután létrehozta a virtuális gép változásait a **futtatásra**.
### <a name="connect-to-your-vm"></a>Csatlakozás a virtuális géphez
A Windows vagy [Linux](../virtual-machines/linux/ssh-from-windows.md#connect-to-your-vm) [rendszerű](../virtual-machines/windows/connect-logon.md) virtuális géphez való kapcsolódáshoz tekintse meg az alábbi dokumentációt.
[!INCLUDE [Discussion of addition security checks](includes/size-connect-generalize.md)]
## <a name="next-steps"></a>Következő lépések
- Javasolt következő lépés: [a virtuálisgép-rendszerkép tesztelése](azure-vm-image-test.md) , hogy az megfeleljen az Azure Marketplace közzétételi követelményeinek. Ez nem kötelező.
- Ha nem teszteli a virtuálisgép-rendszerképet, folytassa [a sas URI előállításával](azure-vm-get-sas-uri.md).
- Ha nehézségekbe ütközött az új Azure-alapú virtuális merevlemez létrehozása során, tekintse meg [Az Azure Marketplace-hez készült virtuális gépekkel kapcsolatos gyakori kérdéseket](azure-vm-create-faq.md).
| 66.965116 | 357 | 0.79684 | hun_Latn | 0.999976 |
da13f9aca3ec534ec3d71e282a5c0ab9272b47f1 | 77 | md | Markdown | README.md | hexagon-engine/hexagon-studio | e3b0b9cddea234fe7079a9788d2b7c216a7e231d | [
"MIT"
] | null | null | null | README.md | hexagon-engine/hexagon-studio | e3b0b9cddea234fe7079a9788d2b7c216a7e231d | [
"MIT"
] | 1 | 2020-12-31T16:20:12.000Z | 2020-12-31T16:20:12.000Z | README.md | hexagon-engine/hexagon-studio | e3b0b9cddea234fe7079a9788d2b7c216a7e231d | [
"MIT"
] | null | null | null | # hexagon-studio
IDE (Integrated Development Environment) for Hexagon Engine
| 25.666667 | 59 | 0.831169 | eng_Latn | 0.642872 |
da14f02674c2abb455e43088d4ec98db4dfd4f9e | 1,129 | md | Markdown | README.md | Tom-the-Bomb/Discord-Together-py | 7fde088db5ac32fe2dc0fee6c6d6d6044303b135 | [
"MIT"
] | null | null | null | README.md | Tom-the-Bomb/Discord-Together-py | 7fde088db5ac32fe2dc0fee6c6d6d6044303b135 | [
"MIT"
] | null | null | null | README.md | Tom-the-Bomb/Discord-Together-py | 7fde088db5ac32fe2dc0fee6c6d6d6044303b135 | [
"MIT"
] | null | null | null | # Discord-Together-py
---
### Installation
`pip install git+https://github.com/Chrovo/Discord-Together-py`
---
### Example and Description
This package is discord-together for python. Here is an example of what the code should look like with this package:
```python
import discord
from Discord_Together.discordtogether import DiscordTogether
from discord.ext import commands
client = commands.Bot(command_prefix = "!")
@client.command()
async def yt(ctx, vc:commands.VoiceChannelConverter):
youtube = DiscordTogether(token="token here")
invite_code = await youtube.activity(ctx, option="youtube",vc_id=vc.id)
await ctx.send(f"https://discord.com/invite/{invite_code}")
client.run("token here")
```
---
### Attributes
Token - This is your discord api token, place it there.
---
### Methods
`async activity`
- The above is a coroutine, this will return the invite code for whatever option you have chosen.
This method also takes in two arguments:
- option: The option you want(i.e. YouTube Together)
- vc_id: The voice channel id of where this activity will take place.
| 26.255814 | 117 | 0.721878 | eng_Latn | 0.968822 |
da14f9f6a57e158cc10919eedec35792cf8cf357 | 3,006 | md | Markdown | wiki/Release-3.4.md | DimChris0/lima-charlie | e3a6898e80d97b5966e9d81724727431e24126ba | [
"Apache-2.0"
] | 1 | 2019-10-03T21:59:55.000Z | 2019-10-03T21:59:55.000Z | wiki/Release-3.4.md | DimChris0/lima-charlie | e3a6898e80d97b5966e9d81724727431e24126ba | [
"Apache-2.0"
] | null | null | null | wiki/Release-3.4.md | DimChris0/lima-charlie | e3a6898e80d97b5966e9d81724727431e24126ba | [
"Apache-2.0"
] | 1 | 2018-03-05T04:50:17.000Z | 2018-03-05T04:50:17.000Z | # Release 3.4
## Changes
* Installers are now static and global. This enables us to sign every release (Mac and Windows).
* Installers no longer get patched when creating a new organization.
* To run the installer, you now require an Installation Key, available through the web UI.
* Provide this key with `-i InstallationKey` on Mac and Windows (Installs + Enrolls), or `-d InstallationKey` on Linux (installation step is still up to you, the `-d` will only allow it to enroll).
* The InstallationKey contains public encryption key as well as the URL of your LC backend so the sensor can enroll.
* Similarly to before, if someone gets a hold of your InstallationKey, you can "re-generate" the sensors which will render the old key invalid and provide new with a new valid one.
* This means you can now add the cert we use to sign to whatever security / whitelisting systems you have as a trusted cert.
* Discussion [here](https://github.com/refractionPOINT/limacharlie/wiki/Enrollment-Flow).
* Appliance (!)
* An LC appliance is now available for download.
* Supports easy single-node deployment.
* Supports easy multi-node clustered deployment (!).
* Now the main supported deployment method.
* Details [here](https://github.com/refractionPOINT/limacharlie/wiki/LC-Appliance).
* Beach Actor respawning.
* Python is not great at garbage collection due to its lack of compaction. This means heavy throughput of some Actors on Beach leads to memory bloat.
* Beach now will begin cleanly respawning Actors that report as "drainable" (opt-in) to reset their memory usage.
* This does not result in loss of service or data as long as you're running more than 1 instance per drainable Actor (highly recommended and standard).
* All this happens when a high memory waterline is reached of 80% of the Beach node's memory used.
* Uninstall-Clean
* A new command line parameter for Windows and Mac (`-c`) will do a clean uninstall. In addition to removing the executable binaries and service (`-r`), will also remove the identity file of the sensor. Warning: doing this will lose the identity of the sensor and a reinstall will not restore it.
* ONGOING_IDENT event.
* This new event is not sent back by default.
* Every time a CODE_IDENT is not generated because the piece of code has been seen in the past, a smaller ONGOING_IDENT event is generated that includes the hash of the code in question.
## Schema Update
```
use hcp_analytics;
CREATE TABLE hcp_whitelist(
oid uuid,
iid uuid,
bootstrap varchar,
created timestamp,
PRIMARY KEY( oid, iid )
) WITH compaction = { 'class' : 'SizeTieredCompactionStrategy' } AND gc_grace_seconds = 86400;
```
## Web UI Update
The `/cloud/limacharlie/app.py` web UI now expects the directory `/cloud/beach/hcp/utils` itself to be symlinked in `/cloud/limacharlie/` and not just its contents. This is to make future additional utils ported there automatically. You can use `ln -s .../cloud/beach/utils .../cloud/limacharlie/`. | 71.571429 | 298 | 0.759148 | eng_Latn | 0.996455 |
da16f65593056d8a5bde80e311bb5a859e224225 | 1,156 | md | Markdown | extension-framework/riot-app/riot/.github/PULL_REQUEST_TEMPLATE.md | bench-os/bench-os | 38ade08e097ca215f7465047dfa70503af11d612 | [
"MIT"
] | 1 | 2020-02-21T09:16:17.000Z | 2020-02-21T09:16:17.000Z | extension-framework/riot-app/riot/.github/PULL_REQUEST_TEMPLATE.md | bench-os/bench-os | 38ade08e097ca215f7465047dfa70503af11d612 | [
"MIT"
] | null | null | null | extension-framework/riot-app/riot/.github/PULL_REQUEST_TEMPLATE.md | bench-os/bench-os | 38ade08e097ca215f7465047dfa70503af11d612 | [
"MIT"
] | 1 | 2020-02-21T09:21:45.000Z | 2020-02-21T09:21:45.000Z | <!--
The RIOT community cares a lot about code quality.
Therefore, before describing what your contribution is about, we would like
you to make sure that your modifications are compliant with the RIOT
coding conventions, see https://github.com/RIOT-OS/RIOT/wiki/Coding-conventions.
-->
### Contribution description
<!--
Put here the description of your contribution:
- describe which part(s) of RIOT is (are) involved
- if it's a bug fix, describe the bug that it solves and how it is solved
- you can also give more information to reviewers about how to test your changes
-->
### Testing procedure
<!--
Details steps to test your contribution:
- which test/example to compile for which board and is there a 'test' command
- how to know that it was not working/available in master
- the expected success test output
-->
### Issues/PRs references
<!--
Examples: Fixes #1234. See also #5678. Depends on PR #9876.
Please use keywords (e.g., fixes, resolve) with the links to the issues you
resolved, this way they will be automatically closed when your pull request
is merged. See https://help.github.com/articles/closing-issues-using-keywords/.
-->
| 31.243243 | 80 | 0.754325 | eng_Latn | 0.999448 |
da17257a78e7cf7b84391d1e755bb33a7a81cac4 | 86,388 | md | Markdown | docs/business/econ-200/textbook.md | andre-ye/uni | 7e5870656404c11ead00a8f5d7de603845d51759 | [
"MIT"
] | null | null | null | docs/business/econ-200/textbook.md | andre-ye/uni | 7e5870656404c11ead00a8f5d7de603845d51759 | [
"MIT"
] | null | null | null | docs/business/econ-200/textbook.md | andre-ye/uni | 7e5870656404c11ead00a8f5d7de603845d51759 | [
"MIT"
] | null | null | null | ---
layout: default
title: Textbook Notes
parent: ECON 200
grand_parent: Business
nav_order: 2
---
# Textbook Notes
ECON 200
{: .fs-6 .fw-300 }
---
## Chapter 1: "Economics and Life"
### Navigate
- [Making an Impact with Small Loans](#making-an-impact-with-small-loans)
- [The Basic Insights of Economics](#the-basic-insights-of-economics)
* [Scarcity](#scarcity)
* [Opportunity Cost and Marginal Decision Making](#opportunity-cost-and-marginal-decision-making)
* [Marginal Decision Making](#marginal-decision-making)
* [Incentives](#incentives)
* [Efficiency](#efficiency)
- [An Economist's Problem-Solving Toolbox](#an-economists-problem-solving-toolbox)
* [Correlation and Causation](#correlation-and-causation)
* [Models](#models)
* [Positive and Normative Analysis](#positive-and-normative-analysis)
- [Conclusion](#conclusion)
### Making an Impact with Small Loans
- Bangladeshi economist Muhammad Yunus won the Nobel Peace Prize for the Grameen bank.
- Grameen serves some of the poorest people in the world; before Grameen, other banks had been unwilling to work in poor communities.
- Yunus realized that economic thinking holds the key to solving hard problems that truly matter.
### The Basic Insights of Economics
- Economics is the study of how people manage resources.
- Resources can be physical (e.g. cash), or intangible (time, ideas, experience).
- People compare choices available to them and behave in a way to best achieve their goals.
#### Scarcity
- Scarcity is a fact of life; it is *the condition of wanting more than we can get with available resources*.
- Some things, like knowledge, sunlight, and air, are not considered to be restricted by resources.
- However, most goods are considered to be scarce.
- What are the wants and constraints of those involved in a complex economic problem?
- Banks want to make profits by lending to people who will pay them back. They are constrained by limited funds. Banks prioritize making large loans to customers likely to pay them back.
- Villagers want to increase their incomes. They are constrained in their ability to borrow money.
#### Opportunity Cost and Marginal Decision Making
- Every decision in life involves weighting the trade-off between costs and benefits.
- We choose to do things only when we think benefits will be greater than the costs.
- **Opportunity cost**: the opportunity you must give up for something that you might have enjoyed otherwise.
- Value of what you had to give up to get something.
- The opportunity cost of life: can the money be equated with life via opportunity cost?
#### Marginal Decision Making
- Rational people make decisions at the margin.
- **Marginal decision-making**: rational people compare the additional benefits of a choice against the additional costs, without considering related benefits and costs of past choices.
- In practice, many people do not make decisions on the margin.
- **Sunk cost**: a cost that has already been incurred. Sunk costs should not have a bearing on your marginal decision about what to do next.
#### Incentives
- As trade-offs change, so do the choices people make.
- Central idea in economics: collective reaction to a changing trade-off:
- What happens when prices change?
- What happens when the government implements a new policy?
- What happens when a company introduces a new product?
- **Incentive**: something that causes people to behave in a certain way by changing their trade-offs.
- *Positive incentive* - makes people more likely to do something.
- *Negative incentive* - makes them less likely to do it.
- Economists make two assumptions:
- People respond to incentives.
- Nothing happens in a vacuum. A change will always elicit a response from others.
- *Collateral* - a possession (e.g. house or car) pledged by a borrower to a lender.
- *Group responsibility* - proposed by Yunus, concluded that borrowers have a strong incentive to repay their loans if one person's loan is at the stake of others paying off their loans.
#### Efficiency
- **Efficiency**: not only about maximizing productivity but about ensuring people get what they most want and need, given their current resources.
- Efficiency does not necessarily mean that the outcomes are fair or ethical.
- When an economy is efficient, there is no way to reorganize things to make anyone better off without someone else becoming worse off.
- **Resource**: anything that can be used to make something of value.
- When an economy is efficient, resources are used to create the greatest total economic value of society.
- **Under normal circumstances, individuals and firms will act to provide the things people want.**
- If a profit-making opportunity exists, someone will take advantage of it sooner rather than later.
- How can abnormal circumstances lead to not taking an opportunity to profit off an idea?
- *Innovation*: the idea is too new.
- *Market failure*: something prevents benefits of the opportunity to be captured, or additional costs are imposed.
- *Intervention*: the government intervenes in the economy and disrupts "normal" transactions.
- *Unprofitable idea*: your idea won't produce a profit.
- If you have an idea that has not been exploited yet, ask:
- Have you misjudged people's wants and constraints?
- Have you miscalculated people's tradeoffs?
- Have you misunderstood how people respond to incentives?
### An Economist's Problem-Solving Toolbox
- Economic analysis requires us to combine theory with observations.
- We need to distinguish the way things are and what they should be.
#### Correlation and Causation
- If two variables have a consistent relationship, there is a correlation between them.
- Causation means one variable causes the other.
- Correlation and causation can be confused in three ways.
- *Coincidence*: correlation may be the result of pure coincidence.
- *Omitted variables*: an underlying factor that links both variables has not been examined.
- *Reverse causation*: *A* caused *B*, rather than vice versa.
#### Models
- A **model** is a simplified representation of a complicated situation.
- By carefully simplifying a situation to essentials, we can get approximate useful answers.
- **Circular flow model**: shows how transactions in an economy work.
- *Households* supply land and labor to firms and invest capital in firms. They buy goods and services that firms produce.
- Land, labor, and capital are *factors of production*.
- *Firms* buy or rent land, labor, and capital supplied by households, and produce and sell goods and services.
- Households and firms are tightly interconnected through production and consumption.
- Two markets emerge: *market for goods and services*, *market for the factors of production.
- A model should do three things:
- Predict cause and effect. *A* causes *B* because of *X*.
- State its assumptions clearly.
- Describes the real world accurately.
#### Positive and Normative Analysis
- **Positive statement**: "what is"; makes a factual claim about how the world works.
- **Normative statement**: "what ought to be"; claims what should be done.
### Conclusion
Key concepts from Chapter 1:
- Scarcity
- Opportunity cost
- Incentives
- Efficiency
---
## Chapter 2: "Specialization and Exchange"
### Navigate
- [Production Possibilities](#production-possibilities)
* [Drawing the Production Possibilities Frontier](#drawing-the-production-possibilities-frontier)
* [Production Possibilities Frontiers When Opportunity Costs Differ](#production-possibilities-frontiers-when-opportunity-costs-differ)
* [Choosing Among Production Possibilities](#choosing-among-production-possibilities)
* [Shifting the Production Possibilities Frontier](#shifting-the-production-possibilities-frontier)
- [Absolute and Comparative Advantage](#absolute-and-comparative-advantage)
- [Why Trade?](#why-trade-)
* [Specialization](#specialization)
* [Gains From Trade](#gains-from-trade)
* [Comparative Advantage Over Time](#comparative-advantage-over-time)
- [Conclusion](#conclusion)
### Production Possibilities
- Good models help us understand complex situations by simplifying assumptions.
#### Drawing the Production Possibilities Frontier
- A **production possibilities frontier (PPF)** is a curve showing all possible combinations of outputs that can be produced using all available resources.
- PPF allows us to answer: *what are the wants and constraints of those involved*? and *what are the trade-offs*?
- Also allows us to represent *constraints* (points outside the PPF).
- In linear PPFs, opportunity cost is represented by the slope.
#### Production Possibilities Frontiers When Opportunity Costs Differ
- Linear PPFs assume that all workers can make the same amount of a good; in reality, some workers may be more efficient at producing a certain product.
- Curved PPFs allow for more complex relationships between opportunity costs.
- At each point of the curved PPF, the slope represents the opportunity cost of getting more of any axis.
#### Choosing Among Production Possibilities
- Choosing a production point inside the frontier means it is not using all of its resources; it is **inefficient**.
- Points that lie on the frontier are **efficient** because they have the most output from the available resources.
#### Shifting the Production Possibilities Frontier
- Two main actors drive changing U.S. production possibilities:
- Number of workers
- Changes in technology.
- With these two changes, a country can economically grow by expanding out its PPF.
- An increase in available resources **shifts the entire frontier outward**.
- An improvement in technology for a good **rotates the frontier outward**.
### Absolute and Comparative Advantage
- If there is no trade between countries, then a country can only consume goods it produces on its own.
- **Absolute advantage**: a producer can generate more output than others with a given amount of resources.
- **Comparative advantage**: a producer can make a good at a lower opportunity cost than other producers.
- Countries can have a comparative advantage without having an absolute advantage.
- The opportunity cost of producing one unit of the good on one axis is the inverse of the opportunity cost of producing one unit of the good on the other axis (see the sketch after this list).
- No producer has a comparative advantage at everything, and each producer has a comparative advantage at something.
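A minimal sketch of this arithmetic (the country names and output numbers are hypothetical, not from the text):

```python
# Hypothetical maximum output per worker per day.
output = {
    "CountryA": {"shirts": 3, "wheat": 9},
    "CountryB": {"shirts": 2, "wheat": 4},
}

for country, prod in output.items():
    oc_shirt = prod["wheat"] / prod["shirts"]   # wheat given up per shirt
    oc_wheat = prod["shirts"] / prod["wheat"]   # shirts given up per wheat (the inverse)
    print(f"{country}: 1 shirt costs {oc_shirt:.2f} wheat; 1 wheat costs {oc_wheat:.2f} shirts")

# CountryA gives up 3 wheat per shirt, CountryB only 2, so CountryB has the
# comparative advantage in shirts even though CountryA has an absolute
# advantage in both goods; CountryA has the comparative advantage in wheat.
```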
### Why Trade?
- Two countries that trade can consume more when they specialize in producing the good for which they have a comparative advantage and trade.
#### Specialization
- All of us are dependent on one another for things we need daily.
- **Specialization** - focusing on producing a good for which it has a comparative advantage.
- Total production increases from specialization.
#### Gains From Trade
- When countries specialize in producing the goods for which they have a comparative advantage, total production increases.
- Improvement in outcomes resulting from the specialized exchange of goods or services is called **gains from trade**.
- There is room for trade as long as two conditions are met:
1. Countries differ in opportunity costs to produce a good;
2. A favorable trading price is set.
- Countries value a good based on their opportunity costs.
- If a trade price falls between opportunity costs, two countries benefit.
- Not all citizens gain from international trade.
- In practice, international trade policy can be controversial, despite its advantages.
#### Comparative Advantage Over Time
- This simplified model of production possibilities helps us to understand why Americans buy certain products from other countries.
- To understand changes, we can apply models to shifts in comparative advantage *over time*, causing changes to economies and trade patterns.
- Because comparative advantage is relative, when a country gains a comparative advantage in one industry, it gives one up in another.
### Conclusion
- Specialization and trade can make everyone better off.
- People specialize to exploit comparative advantages.
---
## Chapter 3: "Markets"
### Navigate
- [Markets](#markets)
* [What is a Market?](#what-is-a-market-)
* [What is a Competitive Market?](#what-is-a-competitive-market)
- [Demand](#demand)
* [The Demand Curve](#the-demand-curve)
* [Determinants of Demand](#determinants-of-demand)
* [Shifts in the Demand Curve](#shifts-in-the-demand-curve)
- [Supply](#supply)
* [The Supply Curve](#the-supply-curve)
* [Determinants of Supply](#determinants-of-supply)
* [Shifts in the Supply Curve](#shifts-in-the-supply-curve)
- [Market Equilibrium](#market-equilibrium)
* [Reaching Equilibrium](#reaching-equilibrium)
* [Changes in Equilibrium](#changes-in-equilibrium)
### Markets
- An economy organized by the "invisible hand": *private individuals*, rather than a centralized planning authority, make decisions.
- Known as the *market economy*.
#### What is a Market?
- *Market* - buyers and sellers that trade a particular good or service.
#### What is a Competitive Market?
- Making simplifying assumptions allows us to focus on important ideas.
- Assumption: markets are competitive.
- *Competitive market* - fully informed, price-taking buyers and sellers easily trade a standardized good or service.
- *Price taker* - a buyer or seller who cannot affect the market price.
- *Standardized good* - a good or service for which all units have the same features and are interchangeable.
- *Transaction costs* - costs incurred by the buyer and seller agreeing to the sale of goods or services. Competitive markets assume no transaction costs.
- Few markets are perfectly competitive; however, the assumption of competitive markets leads to useful insights.
### Demand
- *Demand* - how much of something people are willing or able to buy under certain circumstances.
- Different people buy their cell phones at different prices; individual choices are summed up to form the *market demand*.
- The amount of a good that buyers in the market will purchase at a given price is the *quantity demanded*.
- *Law of Demand*: The lower the price goes, the higher the quantity demanded; the inverse relationship between price and quantity.
- *Ceteris paribus* - all other things being the same. When all else is held equal, the inverse relationship is true.
- *Ceteris paribus* is used to isolate the expected effect of a change in the economy.
#### The Demand Curve
- *Demand schedule* - shows the quantities of a particular good or service that customers are willing and able to purchase (demand) at different prices.
- *Demand curve* - visually displays the demand schedule.
#### Determinants of Demand
- The demand curve represents the relationship between price and quantity if everything else is held constant.
- If *ceteris paribus* is not true, nonprice factors can influence demand changes and shift the curve.
- Nonprice determinants of demand:
- Consumer preferences: personal likes or dislikes that make buyers more/less inclined to purchase a good.
- Price of related goods: substitutes (similar-purpose) and complements (goods consumed together).
- Income of customers: the amount of income people earn affects the demand for goods and services.
- *Normal goods* - an increase in income causes an increase in demand.
- *Inferior goods* - an increase in income causes a decrease in demand.
- Expectations of future prices: when prices are expected to drop in the future, demand decreases, and vice versa.
- Number of buyers in the market: an increase in the number of potential buyers increases demand, and vice versa.
#### Shifts in the Demand Curve
- When one of the nonprice determinants of demand changes, *the entire demand curve shifts* left or right.
- Horizontal (rather than vertical) shift: nonprice determinants affect the quantity demanded at each price.
- "increase in demand" or "decrease in demand". A *shift* of the entire demand curve.
- A change in price causes movement along the demand curve; this is an "increase in quantity demanded" or "decrease in quantity demanded".
### Supply
- *Supply* - describes how much of a good or service producers will offer.
- *Quantity supplied* - the amount of a good that producers will offer for sale at a specific price.
- Market supply can be found by adding individual decisions of each producer.
- Each producer has a different price point at which they decide it is worthwhile to supply cell phones.
- *Law of Supply* - ceteris paribus, quantity supplied increases as price increases.
- Supply varies with price because the decision to produce a good is about the trade-off between the benefit to the producer and opportunity cost.
#### The Supply Curve
- *Supply schedule* - a table that shows the quantities of a good or service that producers will supply at different prices.
- Supply curve shows a graph of information in the supply schedule.
#### Determinants of Supply
- Several nonprice factors determine the opportunity cost of production.
- When a nonprice determinant of supply changes, the entire supply curve shifts.
- Five major categories of nonprice determinants of supply:
- *Prices of related goods*. The price of related goods determines supply because it affects the opportunity cost of production.
- *Technology.* Improved technology enables firms to produce more efficiently.
- *Prices of inputs.* Prices of the inputs used to produce a good are important to its cost.
  - *Expectations.* Suppliers' expectations about future prices affect how much they are willing to supply today.
  - *Number of sellers.* The number of sellers is held fixed along a given supply curve; a change in the number of sellers shifts the curve.
#### Shifts in the Supply Curve
- Changes in price cause suppliers to move to a different point on the same supply curve.
- A change in nonprice determinant increases or decreases supply.
- Change in the nonprice determinant of supply: "Increase in supply"
- Movement along the supply curve: "Increase in quantity supplied"
### Market Equilibrium
- To find what happens in the market, we need to combine supply and demand.
- Supply and demand converge at the point where the demand curve intersects the supply curve, called the **market equilibrium**.
- Price at this point: **equilibrium price**
- Quantity at this point: **equilibrium quantity**.
- There is no sale without a purchase; when markets work well, the quantity supplied exactly equals the quantity demanded.
#### Reaching Equilibrium
- Sellers set prices by trial and error; the incentives buyers and sellers face naturally drive the market toward an equilibrium price and quantity (see the sketch after this list).
- **Excess supply/surplus**: when quantity supplied is higher than the quantity demanded.
- **Excess demand/shortage**: when the quantity demanded is higher than the quantity supplied.
- At any price above or below the equilibrium price, sellers face an incentive to raise or lower prices.
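To make the adjustment story concrete, here is a minimal sketch with made-up linear curves: excess demand pushes the price up, excess supply pushes it down, and the process settles at the equilibrium.

```python
# Hypothetical linear curves: Qd = 100 - 2P, Qs = -20 + 4P.
def qty_demanded(p): return 100 - 2 * p
def qty_supplied(p): return -20 + 4 * p

price = 5.0                      # start below equilibrium (a shortage)
for _ in range(1000):
    shortage = qty_demanded(price) - qty_supplied(price)
    if abs(shortage) < 1e-6:
        break
    price += 0.01 * shortage     # shortage raises price, surplus lowers it

print(f"equilibrium price ~ {price:.2f}, quantity ~ {qty_demanded(price):.2f}")
# Analytically: 100 - 2P = -20 + 4P  ->  P* = 20, Q* = 60.
```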
#### Changes in Equilibrium
- Some changes affect both supply and demand curves.
- *To determine the effect on market equilibrium of a change in a nonprice factor*:
1. Does the change affect demand? Does demand increase or decrease?
2. Does the change affect supply? If so, does supply increase or decrease?
3. How does the combination of changes in supply and demand affect equilibrium price and quantity?
- In the case where both demand and supply shift, we need to know if demand or supply increases *more*.
- When supply and demand shift together, it is possible to predict either the direction of the change in quantity or the direction of the change in price without knowing how much the curves shift.
- When supply and demand shift in the same direction, we can predict the direction in a change in quantity, but not in a price change.
- When supply and demand shift in opposite directions, the price change is predictable, but not the change in quantity.
| Supply Change | Demand Change | Price Change | Quantity Change |
| --- | --- | --- | --- |
| Down | Down | ? | Down |
| Down | Up | Up | ? |
| Up | Up | ? | Up |
| Up | Down | Down | ? |
- Remember: *what do buyers and sellers agree on*?
---
## Chapter 4: "Elasticity"
### Navigate
- [What is Elasticity?](#what-is-elasticity-)
- [Price Elasticity of Demand](#price-elasticity-of-demand)
* [Calculating price elasticity of demand](#calculating-price-elasticity-of-demand)
* [Determinants of price elasticity of demand](#determinants-of-price-elasticity-of-demand)
* [Using price elasticity of demand](#using-price-elasticity-of-demand)
- [Price Elasticity of Supply](#price-elasticity-of-supply)
* [Calculating price elasticity of supply](#calculating-price-elasticity-of-supply)
* [Determinants of price elasticity of supply](#determinants-of-price-elasticity-of-supply)
- [Other Elasticities](#other-elasticities)
* [Cross-price elasticity of demand](#cross-price-elasticity-of-demand)
* [Income elasticity of demand](#income-elasticity-of-demand)
### What is Elasticity?
- **Elasticity** - a measure of how much consumers and producers will respond to a change in market conditions.
- Elasticity allows economic decision-makers to anticipate how others will respond to market condition changes.
- There are several measures of elasticity:
- *Price elasticity of demand* and *price elasticity of supply*: how much quantity demanded and quantity supplied change when the price of good changes.
- *Cross-price elasticity of demand* - how much the demand curve shifts when the price of another good changes.
- *Income elasticity of demand* - how much the demand curve shifts when consumer incomes change.
### Price Elasticity of Demand
- **Price elasticity of demand**: the size of the change in the quantity demanded of a good or service when the price changes.
- Sensitivity to price changes is measured as:
- *more elastic* - a small change in price causes a large change in the quantity demanded.
- *more inelastic* - a small change in price causes a small change in the quantity demanded.
#### Calculating price elasticity of demand
$$\text{Price elasticity of demand} = \frac{\text{\% change in qty demanded}}{\text{\% change in price}} = \frac{\left(Q_2\:-\:Q_1\right)\left(\frac{P_2\:+\:P_1}{2}\right)}{\left(\frac{Q_2\:+\:Q_1}{2}\right)\left(P_2\:-\:P_1\right)}$$
$$\text{\% change in qty demanded} = \frac{Q_2 - Q_1}{\left(\frac{Q_2 + Q_1}{2}\right)}$$
$$\text{\% change in price} = \frac{P_2 - P_1}{\left(\frac{P_2 + P_1}{2}\right)}$$
In this context, percent change is computed from the midpoint (average) rather than an endpoint; with endpoints, the computed elasticity would depend on the direction of the change.
To interpret the elasticity:
- Elasticity describes the size of the change in the quantity demanded of a good when the price changes.
- -1.38 price elasticity of demand for iPhones means that a 1% decrease in the price of iPhones will lead to a 1.38% increase in the number of iPhones demanded. Or, a 1% increase in the price of iPhones will lead to a 1.38% decrease in the quantity of iPhones demanded.
- Price elasticity of demand will always be a negative number because price and quantity demanded move in opposite directions.
- Economists often drop the negative sign and express price elasticity of demand as a positive number.
- Measuring percent change in quantity rather than absolute change in quantity allows the elasticity of demand for a good to be the same regardless of the unit of measurement (see the sketch below).
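A minimal midpoint-method calculator following the formulas above (the price and quantity numbers are hypothetical):

```python
# Midpoint (arc) method: percent changes use averages, so the result
# does not depend on the direction of the price change.
def midpoint_pct_change(new, old):
    return (new - old) / ((new + old) / 2)

def price_elasticity_of_demand(q1, q2, p1, p2):
    return midpoint_pct_change(q2, q1) / midpoint_pct_change(p2, p1)

# Price rises from $4 to $6; quantity demanded falls from 120 to 80.
e = price_elasticity_of_demand(120, 80, 4, 6)
print(f"elasticity = {e:.2f}")   # -1.00: unit-elastic
# The sign is negative (law of demand); economists often report |e|.
```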
#### Determinants of price elasticity of demand
- Consumers are more sensitive to price changes for some goods and services than others.
- When consumers are very responsive to price changes, the demand for the good is more elastic; in the opposite scenario, demand for that good is more inelastic.
- Factors determining consumer responsiveness to price changes:
- **Availability of substitutes**. If close substitutes are available for a particular good, the demand for the good will be more elastic than if only distant substitutes are available.
- **Degree of necessity**. When a good is a basic necessity, people will still buy it if its price rises.
- **Cost relative to income**. If consumers spend a small % of their incomes on a good, their demand will be less elastic.
- **Adjustment time**. Goods often have more elastic demand over the long run.
- **Scope of the market**. Caveat for determinants - it depends on how you define the market for a good or service.
#### Using price elasticity of demand
- At the extreme, demand can be perfectly elastic or inelastic.
  - **Perfectly elastic demand** - quantity demanded drops to zero when the price increases by a minuscule amount; the demand curve is horizontal.
  - **Perfectly inelastic demand** - quantity demanded remains the same regardless of the price; the demand curve is vertical.
- Within the extremes, elasticity is divided into three categories:
- **Elastic demand** - absolute value of the price elasticity of demand is greater than 1.
- **Inelastic demand** - absolute value of the price elasticity of demand is less than 1.
- **Unit-elastic demand** - absolute value of the price elasticity of demand is equal to 1.
- Knowing whether demand is elastic or inelastic is extremely useful.
- **Total revenue** - the amount a firm receives from the sale of goods and services; $$\text{total revenue} = \text{quantity sold} \times \text{price paid per unit}$$. Tells us how much money sellers receive when they sell something.
- Increase in price affects total revenue:
- *Quantity effect* - decrease in total revenue because of fewer units sold.
- *Price effect* - increase in total revenue because of a higher price for each unit.
- When the quantity effect outweighs the price effect, the total revenue will drop.
- When demand is elastic, price increase causes total revenue to fall; when demand is inelastic, total revenue increases.
- **Elasticity varies along the curve**. Demand tends to be more elastic when the price is high and more inelastic when the price is low.
- For prices above the unit-elastic price, the demand is elastic.
- For prices below the unit-elastic price, the demand is inelastic.
- **Slope is not the same thing as elasticity.**
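A small sketch with a hypothetical linear demand curve (Q = 100 - 2P) showing that elasticity, unlike slope, varies along the curve, and how it lines up with total revenue:

```python
# Linear demand Q = 100 - 2P: the slope is constant, the elasticity is not.
def q_of(p): return 100 - 2 * p

for p1, p2 in [(5, 6), (25, 26), (45, 46)]:
    q1, q2 = q_of(p1), q_of(p2)
    e = ((q2 - q1) / ((q2 + q1) / 2)) / ((p2 - p1) / ((p2 + p1) / 2))
    print(f"P {p1}->{p2}: e = {e:+.3f}, total revenue {p1 * q1} -> {p2 * q2}")

# Low prices: |e| < 1 (inelastic), so raising the price raises revenue.
# High prices: |e| > 1 (elastic), so raising the price lowers revenue.
```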
### Price Elasticity of Supply
- How much will an increase in price increase production?
- **Price elasticity of supply** - the size of the change in quantity supplied of a good or service when the price changes. Measures producers' responsiveness to a change in price.
#### Calculating price elasticity of supply
$$\text{Price elasticity of supply} = \frac{\text{\% change in qty supplied}}{\text{\% change in price}} = \frac{\left(Q_2\:-\:Q_1\right)\left(\frac{P_2\:+\:P_1}{2}\right)}{\left(\frac{Q_2\:+\:Q_1}{2}\right)\left(P_2\:-\:P_1\right)}$$
$$\text{\% change in qty supplied} = \frac{Q_2 - Q_1}{\left(\frac{Q_2 + Q_1}{2}\right)}$$
$$\text{\% change in price} = \frac{P_2 - P_1}{\left(\frac{P_2 + P_1}{2}\right)}$$
- Price elasticity of supply can be categorized as *elastic, inelastic,* or *unit-elastic*, just as with price elasticity of demand.
- *Perfectly elastic supply* - quantity supplied could be anything at a given price.
- *Perfectly inelastic supply* - quantity supplied is the same regardless of price.
- Elasticity of demand is always negative; elasticity of supply is always positive.
#### Determinants of price elasticity of supply
- Whether supply is elastic or inelastic depends on the supplier's ability to change the quantity produced given price changes.
- Factors determining supplier responsiveness to price changes:
- **Availability of inputs**. The production of some goods can be changed easily just by adding extra inputs. The elasticity of supply depends on the elasticity of the supply of inputs to the production of the good.
- **Flexibility of the production process**. Producers may or may not be able to easily draw production capacity away from other goods when a good's price rises.
- **Adjustment time**. Supply is more elastic over long periods than in short periods.
### Other Elasticities
#### Cross-price elasticity of demand
- Price elasticities are affected by the availability of alternative options.
- **Cross-price elasticity of demand**: describes how much demand changes when the price of different goods changes.
$$\text{Cross-price elasticity of demand between A and B} = \frac{\text{\% change in quantity of A demanded}}{\text{\% change in price of B}}$$
- *The initial quantity demanded is on one demand curve and the final quantity demanded is on another demand curve* because nonprice determinants shift a demand curve.
- *When goods are substitutes*, the cross-price elasticity of demand is positive. The increase in the price of one will cause an increase in the quantity demanded of the other.
- *When two goods are complements*, the cross-price elasticity of demand is negative. The increase in the price of one will cause a decrease in the quantity demanded of the other.
- **Cross-price elasticity shows how closely two goods are complements or substitutes.**
#### Income elasticity of demand
- **Income elasticity of demand**: describes how much demand changes in response to a change in consumers' incomes.
- Change in income causes the demand curve to shift; measure change by observing the change in quantity demanded.
$$\text{Income elasticity of demand} = \frac{\text{\% change in quantity demanded}}{\text{\% change in income}}$$
- For *normal goods*, the income elasticity is positive. As income rises, demand increases.
- For *inferior goods*, the income elasticity is negative. As income rises, demand decreases.
- For *necessity goods*, the income elasticity will be positive and less than 1 (inelastic with respect to income).
- For *luxury goods*, the income elasticity will be positive and larger than 1 (elastic with respect to income).
---
## Chapter 5: "Efficiency"
### Navigate
- [Introduction](#introduction)
- [Willingness to Pay and Sell](#willingness-to-pay-and-sell)
* [Willingness to pay and the demand curve](#willingness-to-pay-and-the-demand-curve)
* [Willingness to sell and the supply curve](#willingness-to-sell-and-the-supply-curve)
- [Measuring Surplus](#measuring-surplus)
* [Consumer Surplus](#consumer-surplus)
* [Producer Surplus](#producer-surplus)
* [Total surplus](#total-surplus)
- [Using Surplus to Compare Alternatives](#using-surplus-to-compare-alternatives)
* [Market equilibrium and efficiency](#market-equilibrium-and-efficiency)
* [Changing the distribution of total surplus](#changing-the-distribution-of-total-surplus)
* [Deadweight loss](#deadweight-loss)
* [Missing markets](#missing-markets)
### Introduction
- How do we know that people are better off when they buy and sell things?
- *Surplus* - measures the benefit people receive when they buy something for less than they would have been willing to pay.
- Surplus shows how important equilibrium price and quantity are: maximize the total well-being of those involved.
- Maximizing total surplus - efficiency - is a powerful feature of a market system.
- It can be achieved without centralized coordination.
### Willingness to Pay and Sell
- Potential buyers want to pay as little as possible, and each buyer has a maximum price they're willing to pay.
- Maximum price: **willingness to pay**, or **reservation price**.
- **Willingness to sell** - minimum price a seller is willing to accept in exchange for a good or service.
#### Willingness to pay and the demand curve
- We can conduct an exercise in a large market and plot out the willingness of millions of people to pay; we get a smooth demand curve.
- Each buyer's willingness to pay is driven by different factors.
- Willingness to pay is the point where the benefit a person will get from the camera is equal to the benefit of spending the money on another alternative (the opportunity cost).
#### Willingness to sell and the supply curve
- The shape of the supply curve is driven by potential sellers' willingness to sell.
- Sellers' willingness to sell is determined by the trade-offs they face, and the opportunity cost of the sale.
- In a market where manufacturers are producing and selling products, the minimum price must be high enough to make it worth it to continue making products.
### Measuring Surplus
- **Surplus** - measuring who benefits from transactions and by how much.
- If you get something for less than you were willing to pay or sell for more than the minimum you were willing to sell for, this is a good thing.
- Surplus - the difference between the price at which a buyer or seller would be willing to trade and the actual price.
- Surplus is a better measure of the value that buyers and sellers get from the market than the price itself.
- *Consumer surplus* - the maximum extra amount you would pay over the current price to maintain the ability to buy something. Difference between willingness to pay and actual price.
- Consumer surplus better represents how much you value a good.
#### Consumer Surplus
- **Consumer surplus** - a net benefit that a consumer receives from purchasing a good or service.
$$\text{Consumer surplus} = \text{Willingness to pay} - \text{Actual price}$$
- Total consumer surplus can be found by summing all the individual consumer surpluses.
- Consumer surplus is the area underneath the demand curve and above the horizontal line of the equilibrium price.
- How does a change in the market price affect buyers?
- A decrease in price makes buyers better off
- An increase in price makes buyers worse off.
- Some people do not buy at all when prices rise - their surplus becomes 0.
- Measuring consumer surplus tells us how much better or worse off buyers are when the price changes.
#### Producer Surplus
- **Producer surplus** - net benefit a producer receives from a sale of a good or service.
$$\text{Producer surplus} = \text{Actual price} - \text{Willingness to sell}$$
- Total producer surplus can be found by summing individual producer surpluses.
- Sellers prefer prices to be higher:
- an increase in price makes them better off
- a decrease in price makes them worse off.
- Measuring producer surplus tells us how much better or worse off sellers are when the price changes.
#### Total surplus
- What will the actual market price be?
- Put demand and supply curves together, and locate the point where they intersect.
  - Consumer surplus: find the area under the demand curve and above the equilibrium price.
  - Producer surplus: find the area above the supply curve and below the equilibrium price.
- **Total surplus** - a measure of combined benefits everyone receives from participating in the exchange of goods and services.
- Also can be thought of as the value created by the existence of the market.
- The economy is not a fixed quantity of money, a zero-sum game.
- Surplus shows that both the buyer and the seller are winners since each gains surplus.
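Using the same hypothetical linear curves as the equilibrium sketch in Chapter 3 (Qd = 100 - 2P, Qs = -20 + 4P, so P* = 20 and Q* = 60), both surpluses are triangle areas:

```python
p_eq, q_eq = 20, 60        # equilibrium of the hypothetical curves
p_demand_choke = 50        # price at which quantity demanded hits zero
p_supply_choke = 5         # lowest price at which anything is supplied

consumer_surplus = 0.5 * q_eq * (p_demand_choke - p_eq)  # under demand, above price
producer_surplus = 0.5 * q_eq * (p_eq - p_supply_choke)  # above supply, below price
print(consumer_surplus, producer_surplus, consumer_surplus + producer_surplus)
# 900.0 450.0 1350.0 - the total surplus created by this market
```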
### Using Surplus to Compare Alternatives
- In a competitive market, buyers and sellers will find their way to the equilibrium price.
#### Market equilibrium and efficiency
- Surplus allows us to appreciate something important about market equilibrium: it is the point where transactions maximize total surplus.
- When the price deviates from the equilibrium price, the quantity decreases, and total surplus is lost.
- A higher or lower price causes fewer trades to take place because some people are no longer willing to buy or sell.
> The equilibrium in a perfectly competitive, well-functioning market maximizes total surplus.
- The market is efficient when it is at equilibrium: no exchange can make anyone better off without someone becoming worse off.
#### Changing the distribution of total surplus
- *Reassignment of surplus* from customers to producers for transactions that take place when the price is moved away from the equilibrium price:
- When the price was raised, sellers gained at the expense of buyers.
- When the price was lowered, buyers gained at the expense of sellers.
- Transfer of well-being reduced total surplus.
- A price below the market equilibrium will always reduce producer surplus.
#### Deadweight loss
- Intervention that moves the market away from equilibrium might benefit producers or consumers but comes with a decrease in surplus.
- Where does the surplus go? Disappears and becomes **deadweight loss**.
- Deadweight loss is the loss of total surplus that occurs when the quantity of a good is below the market equilibrium quantity.
- Two ways to calculate deadweight loss:
$$\text{Deadweight loss} = \text{Total surplus before intervention} - \text{Total surplus after intervention}$$
$$\text{Deadweight loss} = \frac{b \times h}{2} \text{ (calculated directly as the area of the deadweight-loss triangle)}$$
- Deadweight loss is the surplus that is lost to both producers and consumers because of fewer transactions taking place.
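A quick triangle-method calculation, continuing the same hypothetical curves: if an intervention cuts quantity from 60 to 40, the triangle's height is the gap at Q = 40 between buyers' willingness to pay and sellers' willingness to sell.

```python
# Inverse forms of the hypothetical curves Qd = 100 - 2P, Qs = -20 + 4P.
def demand_price(q): return (100 - q) / 2
def supply_price(q): return (q + 20) / 4

q_eq, q_new = 60, 40
height = demand_price(q_new) - supply_price(q_new)  # 30 - 15 = 15 price wedge
base = q_eq - q_new                                 # 20 units of lost trades
print("deadweight loss:", base * height / 2)        # 150.0
```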
#### Missing markets
- When people would like to make exchanges but cannot, we miss opportunities for mutual benefit.
- *Missing* market.
- Markets can be missing for several reasons.
- Public policy prevents the market from existing.
- A tax leads to fewer transactions.
- Lack of accurate information between buyers and sellers.
- We can increase total surplus by creating new markets and improving existing ones.
- Not just redistributing pieces of the pie; making the pie bigger.
---
## Chapter 6: "Government Intervention"
### Navigate
- [Why Intervene?](#why-intervene)
* [Changing the Distribution of Surplus](#changing-the-distribution-of-surplus)
* [Encouraging or Discouraging Consumption](#encouraging-or-discouraging-consumption)
* [Correcting Market Failures](#correcting-market-failures)
- [Price Controls](#price-controls)
* [Price Ceilings](#price-ceilings)
* [Price Floors](#price-floors)
- [Taxes and Subsidies](#taxes-and-subsidies)
* [Taxes](#taxes)
- [Subsidies](#subsidies)
- [Evaluating Government Interventions](#evaluating-government-interventions)
* [The Magnitude of the Effect of a Tax or Subsidy](#the-magnitude-of-the-effect-of-a-tax-or-subsidy)
* [Short-run Versus Long-run Impact](#short-run-versus-long-run-impact)
### Why Intervene?
- Markets gravitate towards equilibrium.
- At equilibrium, **there is no way to make some people better off without harming others**.
- Why intervene?
#### Changing the Distribution of Surplus
- Efficient markets maximize total surplus, but an efficient outcome may still be seen as unfair.
- Even if a job market is efficient, wages can drop.
- The government can respond by intervening in the labor market to impose a minimum wage, changing the distribution of surplus.
#### Encouraging or Discouraging Consumption
- Governments can use taxes to discourage people from consuming "bad" products.
- Governments can use subsidies to encourage people to consume "good" products.
#### Correcting Market Failures
- Markets do not always work efficiently.
- Situations in which the assumption of efficient, competitive markets fails to hold: **market failures**.
- Intervention can increase total surplus.
### Price Controls
- One policy tool: **price control**; regulate the maximum or minimum legal price for a good.
- Hold the price of a good up or down when the market shifts, preventing the market from reaching a new equilibrium.
#### Price Ceilings
- **Price ceiling** - a maximum legal price at which a good can be sold.
- Many countries have price ceilings on necessities.
- Assess the full effect of the price ceiling by looking at what happens to consumer and producer surplus.
- Changes in the economic well-being of market participants are **welfare effects**.
- When the price ceiling is below the equilibrium price, there is a shortage.
- **Nonbinding price ceiling** - if the ceiling is set above the equilibrium price in a market, it is nonbinding, and the equilibrium price and quantity will prevail.
- Price ceilings are usually binding when implemented, and shifts in the market can render ceilings nonbinding.
#### Price Floors
- **Price floor** - a minimum legal price at which a good can be sold.
- A price floor can guarantee suppliers a minimum income in the face of difficulties, keeping them in business.
- When the price floor is above the equilibrium price, there is excess supply.
- Producers win at the expense of consumers.
- **Nonbinding price floor** - price floors are not always binding. A price floor can become binding in response to a change in the market.
### Taxes and Subsidies
- Taxes are the main ways governments raise revenue to pay for public programs.
- Taxes and subsidies can be used to correct market failures.
#### Taxes
- When a good is taxed, either the buyer or seller must pay some extra amount to the government on top of the sale price.
- Taxes have two effects:
- Discourage production and consumption of the good being taxed.
- Raise government revenue.
- **Tax wedge** - the difference in price paid by buyers and price received by sellers.
- **Tax revenue** - tax wedge multiplied by the quantity of goods bought and sold.
- Tax causes deadweight loss and redistributes surplus.
- Under a tax, both producers and consumers lose surplus.
- Consumers pay more for the same quantity of good, and producers receive less for the same good.
- Unlike deadweight loss, though, not all surplus lost to a tax disappears: part of it becomes government revenue.
- If the tax is imposed on buyers rather than sellers, the outcome is the same.
- A tax causes some deadweight loss; the revenue the government collects is less than the combined surplus lost by buyers and sellers, and the difference is the deadweight loss.
- Weigh the goal against the loss of surplus in the market when evaluating a tax.
| Does the tax on | affect... ? | Answer |
| --- | --- | --- |
| sellers | supply | Yes, supply decreases. Sellers will behave as if the price they are receiving is actually [tax amount] lower. |
| sellers | demand | No, demand stays the same (quantity demanded does change, though). |
| sellers | market equilibrium | Equilibrium price rises and equilibrium quantity falls. |
| buyers | supply | No, supply stays the same. |
| buyers | demand | Yes, demand decreases. Buyers behave as if the price they pay is [tax amount] higher. |
| buyers | market equilibrium | Equilibrium price and quantity fall. |
- **Who bears the burden of a tax?**
- Relative tax burden carried by buyers and sellers: **tax incidence**.
- Often, tax incidence is not split equally.
| When demand is ... elastic than supply, | ... shoulder more of the tax burden. |
| --- | --- |
| more | sellers |
| less | buyers |
| just as | no one; equally shared |
- Tax changes the price of a good to both buyers and sellers.
- Relative responsiveness of supply and demand will determine the tax burden.
- **The market that is more price elastic will be able to adjust to price changes and will shoulder less of the tax burden.**
- The market outcome of a tax (new equilibrium quantity and price) is the same regardless of whether the tax is imposed on buyers or on sellers.
- Tax burden will be the same no matter which side of the market is taxed.
- There can be a difference between *economic incidence* and *statutory incidence*.
- *Economic incidence*: economic effect of a tax on buyers or sellers
- *Statutory incidence*: person legally responsible for paying the tax
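A sketch of a per-unit tax using the same hypothetical curves: find the quantity at which the buyers' price exceeds the sellers' price by exactly the tax, then read off the incidence, revenue, and deadweight loss.

```python
def demand_price(q): return (100 - q) / 2   # inverse demand: Qd = 100 - 2P
def supply_price(q): return (q + 20) / 4    # inverse supply: Qs = -20 + 4P

tax = 6.0
# Solve demand_price(q) - supply_price(q) = tax  ->  q = (180 - 4*tax) / 3.
q_tax = (180 - 4 * tax) / 3                    # 52.0 (was 60 at equilibrium)
p_buyers, p_sellers = demand_price(q_tax), supply_price(q_tax)  # 24.0 and 18.0

print("buyers' share:", p_buyers - 20)         # 4.0 of the $6 tax
print("sellers' share:", 20 - p_sellers)       # 2.0 - supply is more elastic here
print("revenue:", tax * q_tax)                 # 312.0
print("deadweight loss:", (60 - q_tax) * tax / 2)  # 24.0
```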
### Subsidies
- A **subsidy** is the reverse of a tax - the government pays an extra amount to producers or consumers of a good.
- Governments use subsidies to encourage production and consumption of a good or service; it can be an alternative to price controls without generating a shortage.
| Intervention Method | Qty Supplied | Qty Demanded | Government... |
| --- | --- | --- | --- |
| tax | decrease | decrease | collects revenue |
| subsidy | increase | increase | spends money |
$$\text{Government subsidy expenditure} = \text{Subsidy} \times Q_\text{post-subsidy}$$
| Does the subsidy to | affect... ? | Answer |
| --- | --- | --- |
| sellers | supply | Yes, supply increases. |
| sellers | demand | No, demand stays the same. |
| sellers | market equilibrium | Equilibrium price decreases and equilibrium quantity increases. |
- Subsidies cause deadweight loss and redistribute surplus.
- The subsidy lowers the cost to the producer, causing producers and consumers to exchange more goods than is efficient, leading to deadweight loss.
- The government's expenditure on the subsidy is greater than the resulting increase in total surplus; the expenditure is passed on to taxpayers in the form of taxes.
- **The side of the market that is more price inelastic receives more of the benefit.**
- The share of the benefit does not depend on who receives the subsidy.
### Evaluating Government Interventions
- We need to assess the effects of each intervention and unintended consequences.
- Key rules:
- Price controls have opposing impacts on the quantities supplied and demanded, causing a shortage or excess supply.
- Taxes and subsidies move the quantities supplied and demanded in the same direction.

#### The Magnitude of the Effect of a Tax or Subsidy
- It's important to know how much a policy will change equilibrium quantity and price.
- **The more elastic supply or demand, the greater the change in quantity.**
- To predict the size of the effect of a tax or subsidy, policy-makers need to know the price elasticity of supply and demand.
#### Short-run Versus Long-run Impact
- Price controls cause shortages or excess supply.
- Sometimes, the full effect of price controls becomes clear only in the long run.
- In the short run, demand and supply are not very elastic, so the price floor may result in only a small excess supply.
---
## Chapter 12: "The Costs of Production"
### Navigate
- [The Building Blocks of Business: Revenue, Costs, and Profits](#the-building-blocks-of-business--revenue--costs--and-profits)
* [Profit is revenue minus costs](#profit-is-revenue-minus-costs)
* [Fixed and variable costs](#fixed-and-variable-costs)
* [Explicit and implicit costs](#explicit-and-implicit-costs)
* [Economic and accounting profit](#economic-and-accounting-profit)
- [Production Functions](#production-functions)
* [Marginal Product](#marginal-product)
- [Cost Curves](#cost-curves)
* [Total, average, and marginal costs](#total--average--and-marginal-costs)
- [Production in the Short Run and the Long Run](#production-in-the-short-run-and-the-long-run)
* [Costs in the long run](#costs-in-the-long-run)
* [Economies and diseconomies of scale](#economies-and-diseconomies-of-scale)
### The Building Blocks of Business: Revenue, Costs, and Profits
- A firm's goal is to maximize profit.
#### Profit is revenue minus costs
- The amount a firm receives from the sale of goods and services is its *total revenue*.
- The amount a firm pays for its inputs to produce goods and services is its *total cost*.
- *One-time expenses* and *ongoing expenses*.
- Profit is $$\text{total revenue} - \text{total cost}$$.
- Revenue is $$\text{Quantity} \times \text{Price}$$.
#### Fixed and variable costs
- **Fixed costs** don't depend on the quantity of output produced.
- One-time or ongoing.
- **Variable costs** depend on the quantity of output produced.
- Raw materials, labor costs, etc.
- A firm's total cost is the sum of its fixed and variable costs.
#### Explicit and implicit costs
- True costs are *opportunity costs*.
- A firm's opportunity cost of operations has an explicit cost and implicit cost components.
- **Explicit costs**: require a firm to spend money. Rent, employee salaries, materials, etc.
- **Implicit costs**: costs of forgone opportunities.
#### Economic and accounting profit
- When a company reports its profits, it usually reports *accounting profit*.
- However, *economic profit* is a better measure of how well a business is doing.
$$\text{accounting profit} = \text{total revenue} - \text{explicit costs}$$
$$\text{economic profit} = \text{total revenue} - \text{explicit costs} - \text{implicit costs}$$
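A minimal illustration with hypothetical figures:

```python
total_revenue = 200_000
explicit_costs = 150_000    # rent, wages, materials
implicit_costs = 70_000     # e.g., the salary the owner forgoes elsewhere

accounting_profit = total_revenue - explicit_costs       # 50,000
economic_profit = accounting_profit - implicit_costs     # -20,000
print(accounting_profit, economic_profit)
# A positive accounting profit can coexist with a negative economic profit:
# the owner would be better off in the next-best alternative.
```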
### Production Functions
- Firms create values by bringing together different ingredients to create a good or service consumers want.
- Relationship between quantity of inputs and quantity of output: **production function**.
#### Marginal Product
- **Marginal product**: increase in the production that results from increasing the input.
- **Principle of diminishing marginal product**: holding other inputs constant, the marginal product of a particular input decreases as its quantity increases.
- **Average product**: number of goods produced by each worker on average.
### Cost Curves
#### Total, average, and marginal costs
- **Average fixed cost (AFC)**: fixed cost divided by the quantity of output.
- **Average variable cost (AVC)**: variable cost divided by the quantity of output.
- **Average total cost (ATC)**: total cost divided by the quantity of output.
- **Marginal cost**: variable cost of producing the next unit of output, or $$\frac{\Delta \text{ total cost}}{\Delta \text{ quantity}}$$.
- Marginal cost curve has the inverse shape of the marginal product curve.
- The marginal cost curve intersects the lowest point of the average total cost curve.
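A minimal sketch computing these measures from a hypothetical short-run cost schedule:

```python
fixed_cost = 100
variable_cost = [0, 30, 50, 80, 130, 210]   # hypothetical VC at q = 0..5

print(" q    AFC    AVC    ATC    MC")
for q in range(1, 6):
    total_cost = fixed_cost + variable_cost[q]
    mc = variable_cost[q] - variable_cost[q - 1]   # fixed cost cancels out
    print(f"{q:2d} {fixed_cost/q:6.1f} {variable_cost[q]/q:6.1f} "
          f"{total_cost/q:6.1f} {mc:5d}")

# MC falls and then rises (diminishing marginal product); ATC is U-shaped,
# and MC crosses ATC near ATC's minimum (q = 4 in this schedule).
```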
### Production in the Short Run and the Long Run
- Supply is more flexible over longer periods.
#### Costs in the long run
- A cost that is "fixed" in the short run may not be so in the long run.
- In the long run, all costs are considered variable.
#### Economies and diseconomies of scale
- ATC is U-shaped: at first, increasing marginal product and the spreading of fixed costs pull average total cost down, but eventually the principle of diminishing returns kicks in and average total cost rises.
- Economies of scale, diseconomies of scale, and constant returns to scale: describe the relationship between the quantity of output and average total cost.
- Increasing the quantity of output enables it to lower its average total cost; the firm is facing **economies of scale**.
- Bigger isn't always better; increasing scale can, at a certain point, lead to higher average total cost; facing **diseconomies of scale**.
- In between economies of scale and diseconomies of scale, there are various quantities of output a firm can operate in without experiencing higher or lower average cost: **constant returns to scale**.
- A firm's long-run ATC curve covers a greater range of output than its short-run ATC curve.
- Long-run ATC curve is made up of points from the firm's short-run ATC curves.
- If a firm cannot lower its average total cost by increasing or decreasing its scale, it is operating at an efficient scale.
---
## Chapter 13: "Perfect Competition"
### Navigate
- [A Competitive Market](#a-competitive-market)
* [Characteristics of a Competitive Market](#characteristics-of-a-competitive-market)
* [Revenues in a Perfectly Competitive Market](#revenues-in-a-perfectly-competitive-market)
- [Profits and Production Decisions](#profits-and-production-decisions)
* [Deciding How Much to Produce](#deciding-how-much-to-produce)
* [Deciding When to Operate](#deciding-when-to-operate)
- [Behind the Supply Curve](#behind-the-supply-curve)
* [Short-run Supply](#short-run-supply)
* [Long-run Supply](#long-run-supply)
* [Why the long-run market supply curve shouldn't slope upward but does](#why-the-long-run-market-supply-curve-shouldnt-slope-upward-but-does)
* [Responding to Shifts in Demand](#responding-to-shifts-in-demand)
### A Competitive Market
- *What are firms' wants and constraints?*
- Want: maximize profits.
- Constraints: will be covered in this chapter.
#### Characteristics of a Competitive Market
1. Buyers and sellers cannot affect prices (they are price-takers).
- Most sellers and buyers in most markets face some degree of competition.
- In a perfectly competitive market, buyers and sellers face so much competition that they cannot set their price at all.
- Opposite of a price taker: having market power.
2. Goods are standardized.
- Standardized goods are interchangeable.
- When goods are not standardized, producers can charge different prices.
- Standardized goods like crude oil and gold are referred to as commodities.
3. Buyers and sellers have full information.
- Goods in a perfectly competitive market are standardized.
- There are no information asymmetries.
4. There are no transaction costs.
5. Firms can freely enter and exit.
- New firms can be created and begin producing goods and services, as existing firms can close.
- Free entry keeps firms on their toes and drives innovation.
#### Revenues in a Perfectly Competitive Market
- In a perfectly competitive market, producers can sell as much as they want without affecting the market price.
- When firms make decisions about the quantity they produce, they do not think about whether their actions will change the market price or if they will find buyers.
$$\text{Total revenue} = P \times Q$$
$$\text{Average revenue} = \frac{\text{Total revenue}}{\text{Quantity sold}} = \frac{P \times Q}{Q} = P$$
$$\text{Marginal revenue} = \frac{\text{Change in total revenue}}{\text{Change in quantity sold}}$$
- Average revenue is equal to the price of the good for a firm selling one product.
- In a perfectly competitive market, average revenue and marginal revenue are equal.
### Profits and Production Decisions
- Quest for profits drives firms' behavior.
#### Deciding How Much to Produce
- The only choice a firm can make to affect profits is its quantity of output (in the text's running example, the number of roasted plantains).
- Profit depends not only on revenue but costs.
- When marginal revenue stays the same but marginal cost increases, a firm should stop production when the two are equal.
- As long as marginal revenue $$>$$ marginal cost, the production of an additional unit increases profits.
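A minimal sketch of the marginal rule for a price taker (the price and marginal costs are hypothetical):

```python
price = 12                                  # market price = marginal revenue
marginal_cost = [4, 6, 8, 10, 12, 15, 19]   # MC of units 1..7 (rising)

quantity = 0
for mc in marginal_cost:
    if mc > price:          # the next unit would reduce profit
        break
    quantity += 1

print("produce", quantity, "units")   # 5: the last unit where MC <= price
```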
#### Deciding When to Operate
- A firm can also decide whether to produce at all.
- A firm needs to pay fixed costs regardless of how much it produces.
- In a perfectly competitive market, the market price is the same thing as the firm's average revenue.
$$\text{Profit} = \left(\text{Average revenue} - \text{ATC}\right) \times Q = \left(\text{Price} - \text{ATC}\right) \times Q$$
- As long as the price is above average total cost, the firm makes positive profits.
- If the market price falls below the bottom of a firm's ATC curve, there is no level of output at which the firm can make a profit.
- In this case, it *wants* to exit the market.
**Short-run decisions**
- If a firm shuts down production, it will stop producing for a period of time until market conditions change.
- A firm is stuck with fixed costs. They are irrelevant in deciding whether to shut down production in the short run.
- If the market price is lower than ATC but higher than AVC, the firm should still produce (yields more revenue than the variable cost).
- Not about gaining profits anymore, but about losing the least amount.
- The profit-maximizing level of production is at the quantity at which the market price intersects the marginal cost curve.
- If the market price is below average variable cost, it is more advantageous to produce nothing.
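The short-run rule, sketched with hypothetical numbers:

```python
def short_run_decision(price, avc, atc):
    """Operate-or-shut-down rule at the profit-maximizing quantity."""
    if price >= atc:
        return "operate at a profit"
    if price >= avc:
        return "operate at a loss (revenue covers variable cost and part of fixed cost)"
    return "shut down (revenue cannot even cover variable cost)"

print(short_run_decision(price=10, avc=7, atc=12))  # operate at a loss
print(short_run_decision(price=6, avc=7, atc=12))   # shut down
```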
**Long-run decisions**
- In the long run, all costs become variable.
- Only in the long run can firms completely exit the market.
- Firms should consider whether average revenue is greater than average total cost.
- If the market price is less than the lowest point on the ATC curve, the firm should make a long-run decision to exit the market.
### Behind the Supply Curve
- The supply curve for the market reflects the sum of choices of many individual suppliers.
#### Short-run Supply
- In the short run, the number of firms in the market is fixed.
- Firms have the same cost structure.
#### Long-run Supply
- In the long run, firms can enter and exit the market.
- Firms exit the market if the price falls below the lowest point on the ATC curve in the long run. Firms enter the market if they can produce at a level of ATC below the market price.
**Effects of market entry on the long-run supply curve**
- The existence of economic profits signals that there is money to be made; firms will enter the market to take advantage of the opportunity.
- As the equilibrium market price falls, revenues and profits fall.
- If economic profit is positive, firms have an incentive to enter the market.
- When the price is so low that economic profits are reduced to zero ($$P = ATC$$), firms no longer have an incentive to enter the market.
**Effects of market exit on the long-run supply curve**
- When firms exit the market, the supply curve shifts to the left; the new market equilibrium quantity decreases, and price increases.
- In the long run, in a perfectly competitive market,
- Firms earn zero economic profits.
- Firms operate at an efficient scale.
- Supply is perfectly elastic.
**Firms earn zero economic profit.**
- A business might be earning accounting profit, but not economic profit.
**Firms operate at an efficient scale**
- A firm's optimal production is at the point where marginal revenue equals marginal cost.
- The marginal cost curve intersects the average total cost curve at its lowest point.
- In the long run, economic profits are zero - price is equal to the average total cost.
- In the long run, price equals marginal cost equals average total cost.
**Supply is perfectly elastic.**
- Price must be equal to the minimum of ATC.
- Anything that moves the price will result in firms entering and exiting the market.
#### Why the long-run market supply curve shouldn't slope upward but does
- Assumptions so far: the costs of production never change, and all firms face identical costs.
- Adding nuances: some firms are more efficient than others (different cost structures).
- Newer firms with higher costs will enter only markets with higher prices.
- The long-run supply curve slopes upward because the price has to rise for new firms to enter.
- Price equals the minimum of ATC for the least efficient firm in the market, not every firm in the market.
- The last firm to enter the market earns zero economic profit since ATC $$=$$ price.
- More efficient firms with a lower ATC will be able to earn a positive economic profit.
- Over time, the average total cost changes.
#### Responding to Shifts in Demand
- The long-run supply curve is not perfectly elastic in practice.
- When the demand curve shifts, the long-run supply curve remains horizontal in a perfectly competitive market (at least in theory); the short-run supply curve shifts as more firms enter the market until there is no more profit to be made.
- The result of the demand curve shifting is an increase in quantity traded without change in price.
---
## Chapter 14: "Monopoly"
### Navigate
- [Why Do Monopolies Exist?](#why-do-monopolies-exist-)
* [Barriers to Entry](#barriers-to-entry)
- [How Monopolies Work](#how-monopolies-work)
- [Monopolists and the Demand Curve](#monopolists-and-the-demand-curve)
- [Monopoly Revenue](#monopoly-revenue)
- [Problems with Monopoly and Public Policy Solutions](#problems-with-monopoly-and-public-policy-solutions)
- [The Welfare Costs of Monopoly](#the-welfare-costs-of-monopoly)
- [Public Policy Responses](#public-policy-responses)
- [Market Power and Price Discrimination](#market-power-and-price-discrimination)
* [Perfect Price Discrimination](#perfect-price-discrimination)
* [Price Discrimination in the Real World](#price-discrimination-in-the-real-world)
### Why Do Monopolies Exist?
- Most firms face some degree of competition.
- What happens when a firm faces no competition at all?
- **Monopoly** - a firm that faces no competition, the producer of a good or service with no close substitutes.
- **Perfect monopoly** - controls all ($$100\%$$) of the market in a product.
- **Monopoly power** - the power of a monopoly that can be exercised even when a firm controls less than all of the market of a product.
#### Barriers to Entry
- In a monopoly market, some barriers prevent firms other than the monopolist from entering the market.
- **Scarce resources**. Some key resource or input to the production process is limited.
- **Economies of scale**. In some industries, the fixed cost of infrastructure creates economies of scale that help a single firm produce the entire quantity of output demanded at a lower cost than other firms.
- *Natural monopoly* - this monopoly can be the "natural" outcome of competitive forces.
- **Government intervention**. Governments can create or sustain monopolies when they would have otherwise not existed. Can occur via state-owned firms, recognition of intellectual property rights.
- **Aggressive tactics**. Punishments, predatory pricing, buying up.
### How Monopolies Work
- A monopoly wants to maximize its profits.
- The monopoly is constrained by the market demand curve.
### Monopolists and the Demand Curve
- In a perfectly competitive market, the demand curve for the market slopes downward.
- In a perfectly competitive market, an individual firm faces a horizontal demand curve.
- When there is only one producer in the market, the monopolist faces the demand curve for the entire market.
### Monopoly Revenue
- Total revenue is price times quantity sold.
- Total revenue increases in sections of the demand curve where demand is price-elastic.
- Average revenue is the price.
- Marginal revenue is not equal to price in a monopoly market.
- A monopoly's choice to produce an additional unit drives down market price and marginal revenue.
- Producing an additional unit of output has both a quantity effect and a price effect.
- *Quantity effect*: increase in total revenue due to money brought in by the sale of more units.
- *Price effect*: decrease in total revenue due to lower price of an increase in quantity.
- Price effect works in the opposite direction of the quantity effect, decreasing revenue.
- In a monopoly market, marginal revenue is always less than price.
- Marginal revenue curve lies below the demand curve because marginal revenue is always less than the price.
- **Revenue-maximizing quantity: the point at which the MR curve crosses the $$x$$-axis.**
- Marginal and average revenue slope downward for the monopolist.
- Profit-optimizing quantity is when the marginal revenue intersects marginal cost.
- In a competitive market, marginal revenue $$=$$ price. In a monopoly market, $$\text{price} > \text{marginal revenue}$$; price is greater than marginal cost at the optimal production point.
- Monopolies can make positive economic profits in the long run.
- In a monopoly market, other firms cannot enter the market. The monopolist can maintain a price higher than ATC.
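A minimal sketch with a hypothetical linear demand curve (P = 100 - Q) and constant marginal cost, showing the restricted quantity and markup:

```python
# Inverse demand P = 100 - Q, so total revenue = (100 - Q) * Q
# and marginal revenue MR = 100 - 2Q: below price for any Q > 0.
mc = 20
q_monopoly = (100 - mc) / 2    # MR = MC  ->  100 - 2Q = 20  ->  Q = 40
p_monopoly = 100 - q_monopoly  # 60: price exceeds marginal cost
q_competitive = 100 - mc       # 80: where P = MC, the efficient benchmark

print(q_monopoly, p_monopoly, q_competitive)
# The monopolist restricts quantity (40 < 80) and raises price (60 > 20).
```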
### Problems with Monopoly and Public Policy Solutions
- Monopolies have welfare costs.
#### The Welfare Costs of Monopoly
- Monopoly's ability to keep quantity low and prices high hurts society and consumers.
- Monopoly is inefficient.
#### Public Policy Responses
- Policymakers have developed many policy responses to monopolies.
**Antitrust laws**
- Trusts - massive corporations that dominate entire industries.
- Breaking up corporations perceived to be engaging in anti-competitive behavior.
- Block mergers.
- Antitrust action could break up a natural monopoly.
**Public ownership**
- Natural monopolies pose a problem for policymakers.
- Monopolist can achieve lower costs of production than multiple competing producers would.
- A natural monopoly still chooses to produce at the profit-maximizing quantity, which is inefficient and causes deadweight loss.
- Governments can run natural monopolies as public agencies.
**Regulation**
- Governments can allow private monopolies to exist but cap prices.
- Firms have an incentive to avoid giving regulators useful information, though.
**Vertical splits**
- Split an industry "vertically" to introduce competition into different parts of it.
### Market Power and Price Discrimination
- **Price discrimination** - the ability to charge customers different prices for the same good.
- Firms cannot engage in price discrimination in a perfectly competitive market.
#### Perfect Price Discrimination
- With perfect price discrimination, the entire area under the demand curve (above marginal cost) becomes producer surplus.
- In fact, monopolies that price-discriminate are efficient - all possible mutually beneficial trades take place.
#### Price Discrimination in the Real World
- Problems with price discrimination:
- Defining categories of customers
- Products can be resold.
- Perfect price discrimination is essentially impossible.
---
## Chapter 15: "Monopolistic Competition and Oligopoly"
### Navigate
- [What Sort of Market?](#what-sort-of-market-)
* [Oligopoly and Monopolistic Competition](#oligopoly-and-monopolistic-competition)
- [Monopolistic Competition](#monopolistic-competition)
* [Monopolistic Competition in the Short Run](#monopolistic-competition-in-the-short-run)
* [Monopolistic Competition in the Long Run](#monopolistic-competition-in-the-long-run)
* [The Welfare Costs of Monopolistic Competition](#the-welfare-costs-of-monopolistic-competition)
* [Product Differentiation, Advertising, and Branding](#product-differentiation--advertising--and-branding)
- [Oligopoly](#oligopoly)
* [Oligopolies in Competition](#oligopolies-in-competition)
* [Duopoly](#duopoly)
* [Three or More Firms](#three-or-more-firms)
* [Compete or Collude](#compete-or-collude)
* [Oligopoly and Public Policy](#oligopoly-and-public-policy)
### What Sort of Market?
- Oligopoly and monopolistic competition market structures are common in the real world.
#### Oligopoly and Monopolistic Competition
- **Oligopoly** - a market with only a few firms.
- Products may or may not be standardized, but they are similar enough that the firms compete with each other.
- Interactions between firms and rivals impact success.
- In an oligopoly, it is important to keep your eye on competitors.
- Oligopolies feature some barriers to entry - not monopoly-level barriers, but enough to keep the number of firms small.
- **Monopolistic competition** - a market with many firms that sell goods & services that are similar but slightly different.
- In a monopoly, a product has no close substitutes.
- Monopolistic competition sits between perfect competition (all products are standardized) and monopoly.
- Products are slightly different; consumers may pay a little more for a preferred variety, but if the price difference is too large, they will choose a substitute.
- A firm can have a monopoly, in a limited sense.
- Oligopoly is about *number of firms*.
- Can exist when products are standardized.
- Monopolistic competition is about *variety of products*.
- Can exist with many small firms.
### Monopolistic Competition
- **Product differentiation**: firms must offer goods that are similar to competitors' products but more attractive in some way.
#### Monopolistic Competition in the Short Run
- Product differentiation allows firms in monopolistic competitive markets to behave like a monopolist in the short run.
- When monopolistically competitive firms behave like monopolists:
  1. Firms face a downward-sloping demand curve; a firm cannot adjust its price without changing the quantity consumers demand.
  2. Firms face a U-shaped ATC curve.
3. Profit-maximizing quantity is where $$MR = MC$$. Price is the corresponding point on the demand curve.
#### Monopolistic Competition in the Long Run
- Monopolistically competitive firms face a problem monopolists do not - other firms can enter the market.
- The availability of substitute goods is a determinant of demand. The demand curve for the original firm shifts leftwards.
- In the long run, monopolistically competitive market firms face the same profit situation as in a perfectly competitive market.
- Profits go to zero.
- Entry-exit continues to shift the demand curve left or right until ATC touches the demand curve at the point where $$MR = MC$$.
| Characteristic | Perfect Competition | Monopoly | Monopolistic Competition |
| --- | --- | --- | --- |
| How many firms? | Many firms | One firm | Many firms |
| Price taker or price maker? | Price taker | Price maker | Price maker |
| Marginal revenue? | $$MR = \text{Price}$$ | $$MR < \text{Price}$$ | $$MR < \text{Price}$$ |
| Profit-maximizing quantity occurs at | $$MR = MC$$ | $$MR = MC$$ | $$MR = MC$$ |
| Can earn economic profits in short run? | Yes | Yes | Yes |
| Can earn economic profits in the long run? | No | Yes | No |
| Quantity is efficient? | Yes | No | No |
- Monopolistically competitive firms operate at a smaller-than-efficient scale.
- Optimal production point in the long run is when the ATC curve touches the demand curve.
  - This will always be on the down-sloping section of the ATC curve.
- *Efficient scale* - when firms produce an ATC-minimizing quantity.
- Monopolistically competitive firms maximize profits by operating at a smaller scale; it has excess capacity.
- Firms need to respond to competitors entering the market with continual product differentiation.
#### The Welfare Costs of Monopolistic Competition
- Monopolistic competition is inefficient; firms maximize profits at a price higher than marginal cost.
- A government can set a single price for all firms and let natural forces take over.
#### Product Differentiation, Advertising, and Branding
- Product differentiation enables firms to keep making economic profits in the short run.
- Firms can persuade customers that their products cannot easily be substituted.
- Advertising can both be informative and inaccurate.
- Decreases customers' willingness to substitute between similar products.
- **Asymmetric information**: firms know more about the true quality of their products than consumers do.
- **Advertising**: The more expensive advertising is, the more consumers assume the firm is confident it has a good product.
- **Branding**.
### Oligopoly
- Firms in an oligopoly compete with a few identifiable competitor firms.
- Oligopolists make strategic decisions about price and quantity.
#### Oligopolies in Competition
- *Duopoly* - oligopoly with two firms.
#### Duopoly
- In a duopoly, the two firms could agree to act as joint monopolists.
- Competition between oligopolists drives price and profits down to below the monopoly level.
- Oligopolistic competition does not necessarily drive profits down to the efficient level, as perfect competition does.
- When $$\text{quantity effect} > \text{price effect}$$, an increase in output increases profit.
- Otherwise, the firm has no incentive to increase output.
#### Three or More Firms
- Smaller increases in total quantity in a market with three or more firms have a smaller downward effect on the market price.
- An oligopolist will continue to increase output up to the point where $$\text{quantity effect} = \text{price effect}$$ (see the sketch below).
- Oligopolist production decisions affect the profits of other firms.
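The sketch below (assumed linear demand, constant marginal cost, and identical firms; none of the numbers come from the text) shows the standard Cournot outcome in which each firm expands output until its quantity effect equals its price effect:

```python
# Invented Cournot example: inverse demand P = a - b*Q_total, constant marginal
# cost c, and n identical firms. Equating each firm's quantity and price effects
# yields the textbook Cournot output q_i = (a - c) / (b * (n + 1)).
a, b, c = 120, 1, 20

for n in (1, 2, 3, 10):
    q_i = (a - c) / (b * (n + 1))  # per-firm output
    Q = n * q_i                    # total market output
    P = a - b * Q                  # market price
    profit = (P - c) * q_i         # per-firm profit
    print(f"n={n:2d}: Q={Q:6.1f}  P={P:5.1f}  profit/firm={profit:7.1f}")
# As n grows, price falls toward marginal cost and per-firm profit shrinks.
```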
#### Compete or Collude
- **Collusion**: the act of working together to make decisions about price and quantity.
- The dominant strategy is to compete.
- No player has an incentive to break the equilibrium.
- **Cartel** - a number of firms that collude to make collective production decisions about quantities or prices.
- It is in the long-term interest to collude rather than compete.
- Cartels are usually illegal.
#### Oligopoly and Public Policy
- It is illegal for an oligopolist to offer to collude.
- In a monopoly, there is a deadweight loss.
- Monopoly and collusion deadweight loss are identical.
---
## Chapter 17: "International Trade"
### Navigate
- [A Review on Trading](#a-review-on-trading)
* [The Roots of Comparative Advantage](#the-roots-of-comparative-advantage)
* [Incomplete Specialization](#incomplete-specialization)
- [From Autarky to Free Trade](#from-autarky-to-free-trade)
* [Becoming a Net Importer](#becoming-a-net-importer)
* [Becoming a Net Exporter](#becoming-a-net-exporter)
* [Big Economy, Small Economy](#big-economy-small-economy)
- [Restrictions on Trade](#restrictions-on-trade)
* [Why restrict trade?](#why-restrict-trade)
* [Tariffs](#tariffs)
* [Quotas](#quotas)
- [Trade Agreements](#trade-agreements)
* [International labor and capital](#international-labor-and-capital)
* [The WTO and Trade Mediation](#the-wto-and-trade-mediation)
* [Labor and Environmental Standards](#labor-and-environmental-standards)
* [Embargoes: Trade as Foreign Policy](#embargoes-trade-as-foreign-policy)
### A Review on Trading
- *Comparative advantage* - ability to produce a good at a lower opportunity cost than others can.
- *Gains from trade* - increase in welfare in both countries that result from specialization and trade.
#### The Roots of Comparative Advantage
- Countries are described as a national entity, but trade is usually carried out by firms and individuals, not by governments.
- When everyone responds to the profit motives they face, they produce products with which they have a comparative advantage.
- What causes firms in one country to have a lower opportunity cost of production?
- *Natural resources and climate*. Geography can also affect the cost of transporting goods.
- *Factor endowment*. The relative abundance of different factors of production affects comparative advantage. Land, capital, labor.
- *Technology*.
#### Incomplete Specialization
- Why doesn't every country produce just one good?
- No national economy is a perfectly free market.
- Specialization is limited by trade agreements.
- Within each country, there are differences in the natural resources, climate, and relative factor endowment of different areas.
### From Autarky to Free Trade
- Free and unrestricted exchange between buyers and sellers maximizes surplus, just as free trade between countries does.
- *Imports*: goods and services produced in other countries and consumed domestically.
- *Exports*: goods and services produced domestically and consumed in other countries.
- *Autarky*: no trade, a self-contained economy.
#### Becoming a Net Importer
- *World price* - a useful simplification to describe a complex situation.
- If an economy once under autarky engages in free trade and $$\text{Autarky domestic price} > \text{World price}$$, the product is imported. The country becomes a net-importer of that product.
- Trade does not affect the supply and demand curves.
- New surplus is created by trade.
#### Becoming a Net Exporter
- If an economy once under autarky engages in free trade and $$\text{Autarky domestic price} < \text{World price}$$, the product is exported. The country becomes a net exporter of that product.
- Post-trade equilibrium is more efficient than pre-trade (autarky) equilibrium.
#### Big Economy, Small Economy
- Whether a country engaging in free trade impacts the world price depends on how big or small it is.
- Buyers and sellers are price takers if they are too small relative to the size of the market.
- We need to add nuance and consider the supply and demand in world markets.
- For instance, if the US joins the world market for computer software, both the supply and demand curves for the world market would shift rightwards.
### Restrictions on Trade
- Who wins and who loses from trade?
#### Why restrict trade?
- Trade is efficiency-enhancing: it always increases total surplus.
- Some trade restrictions are the result of global politics.
- However, often trade restrictions protect those who lose surplus because of free trade.
- Laws limiting trade are trade protection, part of *protectionism*.
- Policies that promote free trade: *trade liberalization*.
#### Tariffs
- A **tariff**: tax on imported goods. Causes deadweight loss and is inefficient.
- Used to protect domestic producers.
- Tariffs have the same effect on the U.S. steel market as an increase in world price; a higher price pushes domestic producers up the supply curve.
- Fewer items are imported, and domestic producers supply more.
- Domestic steel producers enjoy an increase in surplus.
- However, domestic steel consumers can also lose surplus.
#### Quotas
- Multifiber Arrangement (MFA): regulated clothing items, used a quota.
- **Import quota** - limit on how much of a particular good can be imported.
- The effect of a quota is similar to that of a tariff.
- Domestic quantity demanded decreases.
- Domestic quantity supplied increases.
- Quantity of imports falls.
- Domestic producers gain surplus from selling a higher quantity at a higher price, but domestic consumers lose surplus from buying a lower quantity at a higher price.
- Difference in who benefits from the difference between the value of a good in the domestic market and the value of the good in a world market:
  - *Tariff*: the domestic government collects tax revenue: $$\text{quantity of imports} \times \left| \text{domestic price} - \text{world price} \right|$$ (see the worked example after this list).
- *Quota*: value goes to whoever has the rights to import, or quota rent. The importing country/firm receives surplus. This results in profits for foreign agents.
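A quick worked example (prices and quantities invented for illustration) of how the same price gap becomes government revenue under a tariff but quota rent under a quota:

```python
# Invented numbers: who captures the price gap under a tariff vs. a quota.
world_price = 400       # price per unit on the world market
domestic_price = 500    # domestic price once the tariff or quota binds
imports = 2_000_000     # units imported per year

gap = domestic_price - world_price
tariff_revenue = imports * gap  # under a tariff: collected by the government
quota_rent = imports * gap      # under a quota: kept by import-rights holders

print(f"Tariff revenue: ${tariff_revenue:,}")  # $200,000,000 to the government
print(f"Quota rent:     ${quota_rent:,}")      # same sum, to license holders
```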
### Trade Agreements
- Why do trade agreements happen?
#### International labor and capital
- Although countries as a whole gain from liberalizing trade, some segments of the population lose out.
- Free trade equalizes supply and demand of the factors of production across countries.
- Factor prices (wages, for example) converge across countries.
- Owners of domestically *scarce* factors lose, because of increased competition.
- Owners of domestically *abundant* factors win, because of increased demand.
- Arguments over trade policy are debates over the distribution of benefits.
#### The WTO and Trade Mediation
- When a country doesn't like another country's trade restrictions, it can appeal to the World Trade Organization.
- "Most favored nation status" to incentivize countries to follow rules.
- Members of the WTO offer the same trade terms to all other members of the organization.
- Trade brings more benefits than costs.
#### Labor and Environmental Standards
- Every country has its own set of rules, which causes friction in trade.
- Many clothes sold in the United States are produced in ways that would be illegal under United States law.
**Import standards**.
- Some countries impose standards on imported goods.
- *Blanket standards*: address issues affecting customers. Imports on products that violate standards are restricted.
  - *Import standards*: placed on specific countries to address production issues in the country of origin.
- North American Agreement on Labor Cooperation (NAALC), part of NAFTA. Working towards a set of labor standards.
- Does not expect each country to maintain the same standards, but only that each country enforce its existing labor law.
**Pocketbook activism**.
- Individual consumers can make choices about what they will buy, even if there are no regulations for standards.
- Consumers can pay more for fair-trade goods, and producers can differentiate their products.
#### Embargoes: Trade as Foreign Policy
- Sometimes, trade restrictions are not implemented for economic reasons but foreign policy reasons.
- Restricting the ability to trade can be a punishment.
- **Embargo**: prohibition or restriction on trade to put political pressure.
---
## Chapter 18: "Externalities"
### Navigate
- [What Are Externalities?](#what-are-externalities)
* [External Costs and Benefits](#external-costs-and-benefits)
* [Negative Externalities](#negative-externalities)
* [Positive Externalities](#positive-externalities)
- [Private Solutions to Externalities](#private-solutions-to-externalities)
- [Public Solutions to Externalities](#public-solutions-to-externalities)
* [Taxes and Subsidies](#taxes-and-subsidies)
* [Quotas and Tradable Allowances](#quotas-and-tradable-allowances)
* [Targeting Externalities with Public Policy](#targeting-externalities-with-public-policy)
### What Are Externalities?
- Every time you make a decision, there is an underlying trade-off you consider.
- The price and quantity at which buyers and sellers trade their goods and services reflect their private costs and benefits.
- With many people making decisions, however, the costs add up.
- Examples: pollution caused by car use, the production of $$CO_2$$.
#### External Costs and Benefits
- **Private costs**: costs that fall directly on the economic decision-maker.
- **External costs**: uncompensated costs imposed on someone other than the person who caused them.
- **Social cost**: $$\text{private costs} + \text{external costs}$$.
- This also applies to *private benefits*, *external benefits*, and *social benefit*.
- **Externality**: an external cost or benefit.
- *Negative externality*: an external cost.
- *Positive externality*: an external benefit.
- Externalities are one of the most common causes of a market failure.
- We assume externalities are constant and predictable.
- **Network externality**: the effect the user of a good has on the value of that good for others. People help or harm others simply by participating in a group.
#### Negative Externalities
- **Production externality**: externality that occurs when a good or service is being produced.
- **Consumption externality**: externality that occurs when a good or service is being consumed.
**Negative production externality**
- We can quantify the cost imposed every time a good is produced and plot a social cost curve, such that the external cost is equal to the vertical distance between the social cost curve and the private supply curve.
- The deadweight loss represents the loss of economic surplus to society. Negative production externalities result in "too much" production of some goods.
**Negative consumption externality**
- A cost is imposed upon others when a good or service is being consumed.
- Overconsumption produces a deadweight loss and decreases overall economic surplus.
#### Positive Externalities
- A positive externality also pushes quantity away from the efficient equilibrium level, reducing the total surplus.
**Positive consumption externality**
- A third party benefits when a good or service is being consumed.
- A new social benefit curve can be formed by adding the external benefit to the demand curve.
- Underproducing a good or service will reduce economic surplus and generate a deadweight loss.
**Positive production externality**
- A third party benefits when a good or service is being produced.
- A positive production externality results in "too little" production.
### Private Solutions to Externalities
- Externalities reduce the total surplus by creating a deadweight loss for society.
- We can transform external costs and benefits into private costs and benefits.
- Individuals will pursue mutually beneficial trades. Someone will always gain something by pursuing it.
- Externalities reduce surplus: therefore, there must be mutually beneficial trades waiting to be exploited.
- Example: why don't those who suffer from pollution pay drivers to drive less? There is a surplus to be gained from decreasing the quantity of gas burned.
- **Coase theorem**: individuals can reach an efficient equilibrium through private trades, even in the presence of an externality.
- Assumptions: people can make enforceable agreements to pay each other, there are no transaction costs in coordinating and enforcing agreements.
  - However, these two assumptions rarely hold in practice.
- A private solution yields an efficient outcome, but the distribution of the surplus is different.
- Assumptions of "fairness" are different.
- Private solution: drivers have a "right" to pollute and are paid not to.
- Government intervention: citizens have a "right" to live free of pollution and need to be paid to accept pollution.
- Efficiency is about *maximizing* total surplus, but nothing about the fairness of the distribution of surplus.
### Public Solutions to Externalities
- People often turn to public policy for solutions to externalities.
- These are often addressed via taxes, subsidies, quotas, or tradable allowances.
- That a market works efficiently only means that it maximizes surplus; but increasing surplus for society does not mean anything for the distribution of the surplus.
#### Taxes and Subsidies
**Countering a negative externality with a tax**
- A tax meant to counter the effect of a negative externality: **Pigovian tax**.
- A Pigovian tax increases the effective price paid for a good to that of the social cost.
- Pigovian taxes must be set at the right level; if the estimate is too high or too low, the result will be inefficient.
- Pigovian taxes do not guarantee the government can help people bearing the external cost.
  - The tax still maximizes surplus in society by moving the car market to an efficient equilibrium. (A numeric sketch follows.)
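A minimal sketch of the idea, assuming linear demand and supply and a constant marginal external cost (all numbers invented):

```python
# Invented example: demand P = 100 - Q, private supply P = 20 + Q, and a
# constant external cost of 10 per unit. The social cost curve sits 10 above
# private supply, so a Pigovian tax equal to the external cost moves the
# market to the efficient quantity.
external_cost = 10

q_market = (100 - 20) / 2          # private equilibrium: 100 - Q = 20 + Q
q_social = (100 - 30) / 2          # efficient quantity: 100 - Q = 30 + Q
tax = external_cost                # Pigovian tax = marginal external cost
q_taxed = (100 - (20 + tax)) / 2   # equilibrium once the tax is in place

print(q_market, q_social, q_taxed) # 40.0 35.0 35.0: the tax restores efficiency
```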
**Capturing a positive externality with a subsidy**
- A subsidy can help consumers or producers capture the benefits of positive externalities.
- Using a subsidy to increase efficiency does not equal fairness, but it does maximize total surplus.
#### Quotas and Tradable Allowances
**Quotas**
- If we know the socially optimal quantity of something, we can set a quota, rather than imposing taxes.
- Limiting total consumption to the efficient quantity does not, by itself, make the market allocate that quantity efficiently.
- The invisible hand allocates resources to those with the greatest WTP; maximizing surplus depends not only on how much of a good is sold but also on who produces it.
- A tax allows the market to sort itself out, whereas a quota does not.
**Tradable Allowances**
- Different manufacturers have different abilities to reduce emissions; there is a missed opportunity for a mutually beneficial trade.
- **Tradable allowance**: a production or consumption quota that can be bought and sold.
- Set a quota, and allow firms to buy and sell their quota allowances.
- Tradable allowances result in efficient quantity and maximize surplus.
- A Pigovian tax results in revenue for the government, whereas a tradable allowance creates a market where quotas are sold between private parties (example below).
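A small numeric sketch, with invented abatement costs, of why letting firms trade allowances lowers the total cost of hitting the same emissions target:

```python
# Invented example: two firms must together cut 20 units of emissions
# (10 each under a plain quota), but their abatement costs differ.
cost_A = 5    # firm A abates at $5 per unit (cheap abater)
cost_B = 20   # firm B abates at $20 per unit (expensive abater)

no_trade = 10 * cost_A + 10 * cost_B  # each firm abates its own 10 units
with_trade = 20 * cost_A              # A abates all 20; B buys A's allowances
# Any allowance price between $5 and $20 makes both firms better off.

print(f"Cost without trade: ${no_trade}")               # $250
print(f"Cost with trade:    ${with_trade}")             # $100
print(f"Gains from trade:   ${no_trade - with_trade}")  # $150 to split
```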
#### Targeting Externalities with Public Policy
- Economists try to propose taxes based on the externality itself, but this is hard to do.
- Policies that target individual goods give consumers and producers an incentive to find clever ways around them.
---
| 57.097158 | 271 | 0.764956 | eng_Latn | 0.9987 |
da17b3568171bfe64b8a63a25ebe309f6b5d9d05 | 4,912 | md | Markdown | _posts/econometrics/2022-03-21-econometrics_2.md | 2joonh2/2joonh2.github.io | d4ed0bf34ff1acfeecdf82ddcb54ac566a83953b | [
"MIT"
] | null | null | null | _posts/econometrics/2022-03-21-econometrics_2.md | 2joonh2/2joonh2.github.io | d4ed0bf34ff1acfeecdf82ddcb54ac566a83953b | [
"MIT"
] | null | null | null | _posts/econometrics/2022-03-21-econometrics_2.md | 2joonh2/2joonh2.github.io | d4ed0bf34ff1acfeecdf82ddcb54ac566a83953b | [
"MIT"
] | null | null | null | ---
layout: single
title: "[Econometrics] 2. Binary Choice"
categories: Econometrics
toc: true
toc_sticky: true
---
Lecture notes for applied econometrics.

Master's or PhD? That is the binary choice.
# Binary Choice
## Limited Dependent Variables
The dependent variable is subject to a limitation:
it can only take restricted values (e.g., 0 or 1, or 1-4).
In this chapter, the dependent variable Y is a binary variable that takes only the values 0 and 1.
### Response Probability & Marginal Effect
$$
p(x)=P[Y=1\,|\,X]=E[Y\,|\,X=x] \;:\; Response\; Probability
$$
The derivative of the **response probability** with respect to X is called the **marginal effect**.
$$
\frac{\partial}{\partial x}p(x)=\frac{\partial}{\partial x}P[Y=1\,|\,X]=\frac{\partial}{\partial x}E[Y\,|\,X=x]\;:\;Marginal\;Effect
$$
The marginal effect asks: when X increases by one unit (step), by how much does the probability that Y equals 1 change? It is in the same spirit as the marginal concept in microeconomics.
*"Economic applications often focus on the marginal effect."*
In this model, the quantity we primarily want to examine is the marginal effect.
$$
Y=P(X)+e
$$
Here, e is heteroskedastic; that is, it cannot be treated as a classical error.
$$
Var(e|X)=P(X)(1-P(X))
$$
## Binary Choice Models
Let us look at the representative models that describe the relationship between Y and P(X) above.
### Linear Probability Model
It is literally just OLS.
Summarized in one line: *brute force but clean, and therefore brute force.*
Pros: easy to run and easy to interpret.
Cons: because the model is linear, it does not respect the limited-dependent-variable requirement that the outcome probability lie between 0 and 1. (A short estimation sketch follows the figure below.)
$$
marginal\; effect\; =\; \beta
$$

### Index Model
This model remedies the shortcoming of the linear model.
To keep the values from leaving the interval between 0 and 1, we apply a transformation by a **link function**, as below. This is also called a **single index model**.
Accordingly, unlike in the linear model, the marginal effect here is not the constant $\beta$ but the transformed value $\beta g(x'\beta)$.
Through this transformation it becomes possible to satisfy the requirement, which was the linear model's limitation, that the function's range lie between 0 and 1.
$$
\displaylines{P(x)=G(x'\beta)\newline
0\leq G(u)\leq 1\newline
then,\; marginal\; effect\; :\;\frac{\partial}{\partial x}P(x)=\beta g(x'\beta)}
$$
#### Probit Model
It uses the CDF of the normal distribution as the link function.
The model exploits the fact that a CDF always takes values between 0 and 1.
#### Logit Model
An index model that uses the logistic distribution function as the link function.
Because this link function has a simple form, it used to have an advantage over the probit model in terms of computational power when carrying out the transformation.
However, now that computing power is no longer the concern it once was, logit and probit are both used interchangeably as appropriate. (A fitting sketch follows the figure below.)

## Latent Variable Interpretation
This is a way of understanding index models such as logit and probit.
Define an unobserved variable $Y^*$, called the **latent** variable, as follows.
$$
Y^*=X'\beta+e \newline
e\; \sim G(e) \newline
Y=1\{Y^*>0\}=1\; (if\; Y^*>0,\;0\;otherwise)
$$
Let us walk slowly through how the interpretation proceeds.
First, the event Y=1 means that the latent variable $Y^*$ is greater than 0. This in turn means that $X'\beta +e>0$.
The response probability is then as follows.
$$
P(x)=P[e>-x'\beta]=1-G(-x'\beta)=G(x'\beta),\;since\; symmetry
$$
From the equation above we can confirm that $P(x)=G(x'\beta)$, i.e., the response probability follows the link function corresponding to the distribution of e.
Among the representative index models, in the probit case e follows the standard normal distribution, and in the logit case it follows the standard logistic distribution. (A simulation check follows.)
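A quick simulation check of this interpretation; the latent-variable parameters below are assumed purely for illustration:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 200_000
beta0, beta1, x0 = 0.3, 0.8, 1.0

# Latent model: Y* = x'beta + e with e ~ N(0, 1), and Y = 1{Y* > 0}.
e = rng.normal(size=n)
y = (beta0 + beta1 * x0 + e > 0).astype(int)

print(y.mean())                      # simulated P[Y = 1 | X = x0]
print(norm.cdf(beta0 + beta1 * x0))  # G(x'beta) with G the standard normal CDF
# The two numbers agree up to simulation noise, confirming P(x) = G(x'beta).
```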
## Likelihood
### Distribution of an individual observation
Viewing Y as a Bernoulli random variable, i.e., $P[Y=1]=p$ and $P[Y=0]=1-p$, the probability mass function of Y is as follows.
$$
\pi(y)=p^y(1-p)^{1-y},\quad y=0,1
$$
Under the index model, Y is conditionally Bernoulli, so it has the following probability mass function.
$$
\pi(Y|X)=G(X'\beta)^Y(1-G(X'\beta))^{1-Y}
$$
### Log-likelihood function and MLE
By taking the log of the expression above, products can be expressed simply as sums.
$$
\displaylines{l_n(\beta)=\Sigma\,log\,G(X'\beta)^Y(1-G(X'\beta))^{1-Y}\newline\newline
l_n^{probit}(\beta)=\Sigma\,log\,\Phi(X'\beta)^Y(1-\Phi(X'\beta))^{1-Y}\newline
l_n^{logit}(\beta)=\Sigma\,log\,\Lambda(X'\beta)^Y(1-\Lambda(X'\beta))^{1-Y}\newline}
$$
Given the expressions above, the MLE (maximum likelihood estimator) is defined as the value of $\beta$ that maximizes the log-likelihood function. (A hand-rolled example follows.)
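A minimal sketch of probit MLE done by hand: simulate data, write the log-likelihood, and maximize it numerically (the data-generating values are assumed):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 2_000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([0.5, 1.0])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)  # probit DGP

def neg_loglik(beta):
    p = norm.cdf(X @ beta)
    p = np.clip(p, 1e-12, 1 - 1e-12)  # guard the logs against 0 and 1
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

mle = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print(mle.x)  # should land close to the true (0.5, 1.0)
```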
## Marginal Effects
### Average Marginal Effect
We saw the marginal-effect expression for the index model earlier.
$$
marginal\; effect\; :\;\frac{\partial}{\partial x}P(x)=\beta g(x'\beta)
$$
Because a different marginal-effect value exists for each x, the **average marginal effect** is used as a single representative value.
$$
AME\;(average\; marginal\; effect)=E[\delta(x)]=\beta\, E[g(X'\beta)]
$$
Be clear about how this differs from the marginal effect evaluated at the mean. (A numeric comparison follows.)
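A short sketch computing the AME numerically and contrasting it with the marginal effect at the mean (the coefficients below are assumed, standing in for probit estimates):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n = 2_000
x = rng.normal(size=n)
beta = np.array([0.5, 1.0])       # assumed probit estimates (intercept, slope)
X = np.column_stack([np.ones(n), x])

ame = beta[1] * norm.pdf(X @ beta).mean()                 # average of beta*g(x'beta)
mem = beta[1] * norm.pdf(np.array([1, x.mean()]) @ beta)  # effect at the mean x

print(f"AME = {ame:.4f}, MEM = {mem:.4f}")  # close, but generally not identical
```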
### When X includes nonlinear transformations
$$
\displaylines{For\quad P[Y=1|X=x]=G(\beta_0+\beta_1x+...+\beta_px^p),\newline
\delta(x)=(\beta_1+...+p\beta_px^{p-1})\,g(\beta_0+\beta_1x+...+\beta_px^p)}
$$
## Application to the Boston HMDA Data
*Mortgages: when people apply for a home loan, is there discrimination by race in whether the loan is offered?*
The dependent variable will then be the binary variable *is the mortgage denied or accepted?*
## Remaining Threats to Internal, External Validity
#### Internal Validity
1. OVB
2. Wrong functional form
3. Errors-in-variables bias
4. Sample selection bias
5. Simultaneous causality bias
#### External Validity
There will be issues such as: can results obtained from 1990-91 data still be applied today?
| 17.295775 | 134 | 0.677728 | kor_Hang | 0.999872 |
da17f97980fd75c74320d65b275bd6b71fc5b183 | 337 | md | Markdown | README.md | hengyunabc/dubbo-arthas-demo | 63cc8e5c75aae04b37024da756607290048bd577 | [
"Apache-2.0"
] | 88 | 2018-11-30T15:44:15.000Z | 2021-12-25T15:57:32.000Z | README.md | hengyunabc/dubbo-arthas-demo | 63cc8e5c75aae04b37024da756607290048bd577 | [
"Apache-2.0"
] | 1 | 2018-12-16T12:32:55.000Z | 2018-12-16T12:32:55.000Z | README.md | hengyunabc/dubbo-arthas-demo | 63cc8e5c75aae04b37024da756607290048bd577 | [
"Apache-2.0"
] | 21 | 2018-12-01T10:07:07.000Z | 2021-02-26T02:43:12.000Z | # dubbo-arthas-demo
演示Arthas排查Dubbo问题的Demo。
* 文字稿:https://github.com/alibaba/arthas/issues/327
* PDF: [当DUBBO遇上Arthas-排查问题的实践.pdf](当DUBBO遇上Arthas-排查问题的实践.pdf)
* https://github.com/alibaba/arthas
* https://alibaba.github.io/arthas/
* http://start.dubbo.io/
## WeChat Official Account
Welcome to scan the WeChat official account QR code below and subscribe to "横云断岭的专栏" (Hengyun Duanling's column).

| 18.722222 | 63 | 0.747774 | yue_Hant | 0.464463 |
da17fdd75c3a7de0e301f9510b20b077067e3708 | 516 | md | Markdown | .github/ISSUE_TEMPLATE/bug_report.md | fakegit/coderedcms | 10dd10635bba9c2dcecede4b8e557b5a6ffd8b23 | [
"BSD-3-Clause"
] | 526 | 2018-07-31T20:14:17.000Z | 2022-03-23T08:08:29.000Z | .github/ISSUE_TEMPLATE/bug_report.md | fakegit/coderedcms | 10dd10635bba9c2dcecede4b8e557b5a6ffd8b23 | [
"BSD-3-Clause"
] | 325 | 2018-08-01T13:53:55.000Z | 2022-03-31T15:08:28.000Z | .github/ISSUE_TEMPLATE/bug_report.md | fakegit/coderedcms | 10dd10635bba9c2dcecede4b8e557b5a6ffd8b23 | [
"BSD-3-Clause"
] | 153 | 2018-08-02T07:42:40.000Z | 2022-03-24T23:54:59.000Z | ---
name: Bug Report
about: Report something that is broken, defective, or does not function correctly.
title: ""
labels: "Type: Bug"
assignees: ""
---
#### Describe the bug
A clear and concise description of what is broken.
#### Steps to reproduce
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
#### Expected behavior
A clear and concise description of what you expected to happen.
#### Additional context
Add any other context about the problem here.
| 20.64 | 82 | 0.70155 | eng_Latn | 0.996539 |
da185f3e6b33b0d56cc43d4162d21782ca32b462 | 5,913 | md | Markdown | _posts/2019-08-15-Download-astrology-psychology-and-the-four-elements-pdf.md | Bunki-booki/29 | 7d0fb40669bcc2bafd132f0991662dfa9e70545d | [
"MIT"
] | null | null | null | _posts/2019-08-15-Download-astrology-psychology-and-the-four-elements-pdf.md | Bunki-booki/29 | 7d0fb40669bcc2bafd132f0991662dfa9e70545d | [
"MIT"
] | null | null | null | _posts/2019-08-15-Download-astrology-psychology-and-the-four-elements-pdf.md | Bunki-booki/29 | 7d0fb40669bcc2bafd132f0991662dfa9e70545d | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Astrology psychology and the four elements pdf book
three centuries, a meadow bank grows. "It's always a problem, to clandestine leading from this space suggest additional rooms beyond. Ordinary readers can skip, the headaches, where is Amanda?" He knew he was no match for Early. She was in her entirety unusual. Junior would never again use it to store leftover soup. to this most momentous day? " fish was to be found in the fresh-water lagoon at Yinretlen, and the defiant jaw, or it's thrown away, but for a moment it felt like it" He pauses. His arms flailed for equilibrium, i, only of their physical discomfort. It was just what we'd wanted to hear. You want to see my ID?" 25. But, Nevada, breed there. " "Why do people follow leaders?" Pernak replied. If you'll trust me with it, Seraphim was a virgin. " She didn't humph, a cage. The Lovers of the Benou Udhreh (232) dcxlvi 192 which separates the sleeping chamber from the exterior tent, crater on the moon, and like her It is related that Ibn es Semmak (162) went in one day to Er Reshid and the Khalif, however, she had ripped the cards in thirds and had been the lady of the hour. By Thursday, "Arise and come down and show us the contract, either because she catches an appealing scent or because she have assured an explosion of respectable magnitude, the temperature was never pinched Junior's cheek between thumb and forefinger, wasn't a much better future than astrology psychology and the four elements pdf. the center of her vision of a astrology psychology and the four elements pdf future? He looked at the man he knew only as Otter? Tom snatched the revolver off the table, without leave or commandment. the Yenisej. Nevertheless, commanded by the Russian merchant captain, there aren't, the clergyman's curse-all this amounted to more than even a committed man could handle! Sir HUGH WILLOUOUGHBY's in 1553 was thus the first maritime consecutive successful missions against the Zorphs is entitled to promotion to Fleet Captain. 209. astrology psychology and the four elements pdf, among which were "Tajmur river" or "Taimur river" "Good morning," I said and showed him my ID. Had it been anyone else he would have looked more surprised, and so harmless. His instructor. Contrary to Micky's expectations, considering the unreliability of all machinery made by man, O youth. astrology psychology and the four elements pdf "But it's only a formality!" he interrupted me. When the nurse was gone, thou gladdenest the place and with thy presence enlightenest and embellishest it; but now fain would these kings hear somewhat of thy singing, Nolan still remembered the basic rule-never contradict these people or make fun of their super- once found her where she'd been hiding in the silver-black folds of its "I don't know," Dragonfly said, but the storm moved south soon after dawn. words: one who libeled or slandered, as astrology psychology and the four elements pdf waited to hit the trail, for Panglo. "It could not be applied in any way to the present circumstances. We need to make a list of what's available Suddenly Leilani was scared, he usually parties late. " They walked past the roaster tower, and then The bear is not difficult to kill, K. So I think we can rule that out however, p, to Astrology psychology and the four elements pdf in his sailor suit-and hello, and began to speak to them, realizing he must have slept for hours. "How are you going to find a record of the marriage?" "I'm brooding on it. Why?" 
The motel room was a flung palette of colors, and recklessly courageous in the pursuit of his goalsвbut socially inept enough to entertain In spite of the urgency of his desire. stop playing the quiet hero, as if his visit to Jacob were a weight that bowed him. Is By Allah, Paul looked down the Section 4. " "God forbid," exclaimed the old man, the pale, and shared the vision of other Barty's in other places, she said to him, 1738 and Archangel. Leave her screaming with no one really dead. _Zaritza_, whereas this was desperate, for all he saw was a mass of confusing colors, and astrology psychology and the four elements pdf only seats left vacant were those of the Deputy Mission Director, peace came, Noah hesitated! I fell on the pillows. I keep both doors. Fallows sat. Otter was slow to recover, might improbability. That left him, not even for a moment. His name for Edom was E-bomb. I doubted that it would understand me if I stand up. Her hands so pale, the flash and "You. several circumstances in fact indicate, identical twins, ii, "thanks, "Not another word. The rapid clearing of the sky-indicating a stiff wind at high altitudes, without moving his mouth, or you are left with no one to trust. Maybe one day I'll return to medicine. linen, summer fruits! digitoxin less than twelve hours ago and whose fate he had shared with Leilani upon returning home in the OF THE APPOINTED TERM, what she had stubbornly refused to learn from she didn't seem in danger of being permanently traumatized, at a height of rot, that nothing may assain, dressed this way. One animal Since the name of the person is the person, as though these figures were mummified Schelechov, "There is no power and no virtue save in God the Most High, because this girl is the right stuff, or loss. the dogs of an encampment and those of strangers? " the Federal Communications Handbook. the men who had astrology psychology and the four elements pdf among the Samoyeds returned home. 264). These are metagen expansions in an n- retreated to her bed with dinner and with the novel about evil pigmen from "What are you?" he said to her at last. "Weak as women's magic, "She just calls him Klonk because she claims that was the noise he made if you rapped him drumming from the physical demands of flight. In spite of the August heat, pinioned him! | 657 | 5,793 | 0.784712 | eng_Latn | 0.999939 |
da18c00a65380ff4344174d1d5a147cb5754a596 | 18 | md | Markdown | README.md | Weixin3416/xinwei.github.io | 87f7f4935b426726604df2fb856d0029b1bcca4f | [
"MIT"
] | null | null | null | README.md | Weixin3416/xinwei.github.io | 87f7f4935b426726604df2fb856d0029b1bcca4f | [
"MIT"
] | null | null | null | README.md | Weixin3416/xinwei.github.io | 87f7f4935b426726604df2fb856d0029b1bcca4f | [
"MIT"
] | null | null | null | # xinwei.github.io | 18 | 18 | 0.777778 | xho_Latn | 0.39319 |
da19351674d800053a78d661cc80764590a06220 | 485 | md | Markdown | screenshots.md | kapdap/mgu-srt | c6d190f17d48ca7c1c9a1341f0c12f7980157560 | [
"MIT"
] | 1 | 2020-11-15T17:53:28.000Z | 2020-11-15T17:53:28.000Z | screenshots.md | kapdap/mgu-srt | c6d190f17d48ca7c1c9a1341f0c12f7980157560 | [
"MIT"
] | null | null | null | screenshots.md | kapdap/mgu-srt | c6d190f17d48ca7c1c9a1341f0c12f7980157560 | [
"MIT"
] | null | null | null | ---
layout: page
title: Screenshots
---

*SRT overlay with game running at 480p windowed mode using [DXWnd](https://sourceforge.net/projects/dxwnd/){:target="_blank" rel="noopener"}.*

*SRT overlay with game running at 2k fullscreen using [Peixoto's Patches](https://www.vogons.org/viewtopic.php?f=24&t=53121){:target="_blank" rel="noopener"}.*
| 48.5 | 159 | 0.754639 | eng_Latn | 0.643736 |
da19a4352f93f00a0a19da50df28734bda91a2b7 | 193 | md | Markdown | Retirement Simulator/fixedDate.md | sroehling/FutureBudget | 7bf1e24be1a7013f47e9a5f08ab5c89bcd5da1d0 | [
"MIT"
] | null | null | null | Retirement Simulator/fixedDate.md | sroehling/FutureBudget | 7bf1e24be1a7013f47e9a5f08ab5c89bcd5da1d0 | [
"MIT"
] | null | null | null | Retirement Simulator/fixedDate.md | sroehling/FutureBudget | 7bf1e24be1a7013f47e9a5f08ab5c89bcd5da1d0 | [
"MIT"
] | null | null | null | ## Fixed Date
A fixed date is for the current input only. If this date is selected, changing
this date only impacts results for the current input.
### See Also
* [Milestone Date][milestone] | 24.125 | 79 | 0.746114 | eng_Latn | 0.999312 |
da1aacdfc927fe1f2f64bc05d574908a13e22950 | 2,930 | md | Markdown | README.md | Alkarex/AR.Drone | f8353f13a0a400b31c72eac11c24fdef5a760093 | [
"Unlicense"
] | 1 | 2018-05-04T05:44:07.000Z | 2018-05-04T05:44:07.000Z | README.md | Alkarex/AR.Drone | f8353f13a0a400b31c72eac11c24fdef5a760093 | [
"Unlicense"
] | null | null | null | README.md | Alkarex/AR.Drone | f8353f13a0a400b31c72eac11c24fdef5a760093 | [
"Unlicense"
] | null | null | null | ## Fork for gaze-control
The original project is located at [Ruslan-B / AR.Drone](https://github.com/Ruslan-B/AR.Drone).
This if a fork to perform some experiments (2013-09) with a [helicopter drone controlled by gaze-interaction](http://alexandre.alapetite.fr/research/gaze-drone/).
Results to appear in “The Use of Gaze to Control Drones”, ETRA’2014 conference.
--
## AR.Drone [](https://travis-ci.org/Ruslan-B/AR.Drone)
The AR.Drone 2.0 controlling library for C#/.NET and Mono, with video support.
Built over the original [AR.Drone SDK](https://projects.ardrone.org) 2.0.1 - using lastest drone firmware.
If case you are looking for Windows RT/Windows Phone support please check this project [ARDrone2Windows](https://github.com/ARDrone2Windows/SDK).
## Dependencies
[FFmpeg.AutoGen](https://github.com/Ruslan-B/FFmpeg.AutoGen) - .NET wrapper for FFmpeg.
## Status
This library is stable now. All major features are supported - video, configuration and control.
All experimental features moved to [dedicated](https://github.com/Ruslan-B/AR.Drone/tree/Experimental) branch. Please note that this branch under heavy development,
so please don't be suprised if you find some functionality missing or undocumented.
## Build
How to build from scratch:
- Clone this:
```bash
git clone git://github.com/Ruslan-B/AR.Drone.git
cd AR.Drone
git submodule update --init
```
- For the **video support** please review: **[Usage](https://github.com/Ruslan-B/FFmpeg.AutoGen#Usage)** section of the [FFmpeg.AutoGen](https://github.com/Ruslan-B/FFmpeg.AutoGen) project.
- Build AR.Drone solution with MonoDevelop, VS2010 or VS2012.
Please note: to open the solution in VS2010 you should have *Microsoft Visual Studio 2010 Service Pack 1* installed.
## Usage
The solution includes a WinForms application - AR.Drone.WinApp. It provides a minimalistic interface
for controlling and displaying video from the AR.Drone 2.0.
## License
Copyright 2013 Ruslan Balanukhin [email protected]
GNU Lesser General Public License (LGPL) version 3 or later.
http://www.gnu.org/licenses/lgpl.html
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
| 44.393939 | 189 | 0.771331 | yue_Hant | 0.624135 |
da1c405ab4037951a9e879ae992efb2e855b7ae0 | 21 | md | Markdown | README.md | welcome-dev/Simple-Django-Crud | 72428ab9f504d7150d9bbd925e5abe07339aeddd | [
"MIT"
] | 1 | 2019-08-22T16:50:12.000Z | 2019-08-22T16:50:12.000Z | README.md | welcome-dev/Simple-Django-Crud | 72428ab9f504d7150d9bbd925e5abe07339aeddd | [
"MIT"
] | 6 | 2021-03-18T23:23:28.000Z | 2021-09-08T01:10:38.000Z | README.md | welcome-dev/Simple-Django-Crud | 72428ab9f504d7150d9bbd925e5abe07339aeddd | [
"MIT"
] | null | null | null | # Simple-Django-Crud
| 10.5 | 20 | 0.761905 | spa_Latn | 0.124563 |
da1d8557efbf2f292c969d07d36831e84fbafd7a | 1,228 | md | Markdown | docs/error-messages/tool-errors/linker-tools-warning-lnk4001.md | B4V/cpp-docs.ru-ru | e820ac4bfffac205c605a9982ab55c2ef7cd0d41 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/error-messages/tool-errors/linker-tools-warning-lnk4001.md | B4V/cpp-docs.ru-ru | e820ac4bfffac205c605a9982ab55c2ef7cd0d41 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/error-messages/tool-errors/linker-tools-warning-lnk4001.md | B4V/cpp-docs.ru-ru | e820ac4bfffac205c605a9982ab55c2ef7cd0d41 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Linker tools warning LNK4001 | Microsoft Docs
ms.custom: ''
ms.date: 11/04/2016
ms.technology:
- cpp-diagnostics
ms.topic: error-reference
f1_keywords:
- LNK4001
dev_langs:
- C++
helpviewer_keywords:
- LNK4001
ms.assetid: 0a8b1c3a-64ce-4311-b7c0-065995059246
author: corob-msft
ms.author: corob
ms.workload:
- cplusplus
ms.openlocfilehash: f684e85233c4df777a53f03f07936137c425946e
ms.sourcegitcommit: 913c3bf23937b64b90ac05181fdff3df947d9f1c
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 09/18/2018
ms.locfileid: "46070423"
---
# <a name="linker-tools-warning-lnk4001"></a>Linker tools warning LNK4001

no object files specified; libraries used

The linker was passed one or more .lib files, but no .obj files.

Because the linker is not able to access information in a .lib file that it can access in an .obj file, this warning indicates that you may have to explicitly specify other linker options. For example, you may need to specify the [/MACHINE](../../build/reference/machine-specify-target-platform.md), [/OUT](../../build/reference/out-output-file-name.md), or [/ENTRY](../../build/reference/entry-entry-point-symbol.md) options.
da1dd7ba1e20e0044d63050ac12c8045f1dfa386 | 975 | md | Markdown | README.md | wintonpc/swish-lite | 91ea0d9c2046bc47e85b56fd8737303c88eee291 | [
"MIT"
] | 1 | 2021-06-05T22:24:42.000Z | 2021-06-05T22:24:42.000Z | README.md | wintonpc/swish-lite | 91ea0d9c2046bc47e85b56fd8737303c88eee291 | [
"MIT"
] | null | null | null | README.md | wintonpc/swish-lite | 91ea0d9c2046bc47e85b56fd8737303c88eee291 | [
"MIT"
] | 1 | 2021-08-13T14:11:03.000Z | 2021-08-13T14:11:03.000Z | # Swish Lite
Swish Lite is a set of Chez Scheme libraries based on the [Swish Concurrency
Engine](https://github.com/indigobio/swish).
[Documentation](https://indigobio.github.io/swish-lite/swish-lite.pdf)
# Build System Requirements
## Linux
- Chez Scheme 9.5.4 or 9.5.6
- graphviz, texlive, texlive-latex-recommended, and texlive-latex-extra packages for
building the documentation
## Mac
- Chez Scheme 9.5.4 or 9.5.6
- dot (can be installed through homebrew using `brew install graphviz`)
- pdflatex (can be installed through homebrew using `brew cask install mactex`)
- Latin Modern fonts from LaTeX (can be installed with Font Book from a location like
`/usr/local/texlive/2020/texmf-dist/fonts/opentype/public/lm`)
## Windows
- Chez Scheme 9.5.4 or 9.5.6
- Cygwin or MinGW/MSYS with bash, git, graphviz, grep, perl, texlive, GNU make, etc.
- Put scheme in PATH.
# Maintenance
Run `update-pdf` and force-push branch `gh-pages` when documentation changes.
| 29.545455 | 85 | 0.748718 | eng_Latn | 0.927909 |
da1e119c37218d4e662ad9144ff8e7ceb38aec8d | 202 | md | Markdown | CoreGraphics/DragDrop/README.md | domenicosolazzo/practice-swift | 0ee3618fee2b2701f27e0f50f995eddc892f030f | [
"MIT"
] | 174 | 2015-01-06T16:37:54.000Z | 2021-03-03T02:01:32.000Z | CoreGraphics/DragDrop/README.md | miguelius/practice-swift | 0ee3618fee2b2701f27e0f50f995eddc892f030f | [
"MIT"
] | 2 | 2015-07-24T05:49:32.000Z | 2015-11-08T18:05:59.000Z | CoreGraphics/DragDrop/README.md | miguelius/practice-swift | 0ee3618fee2b2701f27e0f50f995eddc892f030f | [
"MIT"
] | 40 | 2015-01-15T16:43:37.000Z | 2019-01-07T05:07:45.000Z | Sprite Kit: Drag & Drop Sprites
Source [Original Tutorial by Ray Wenderlich](http://goo.gl/gZAEjo)
Description:
Swift implementation of the tutorial 'How to Drag and Drop sprites' by Ray Wenderlich
| 22.444444 | 85 | 0.777228 | eng_Latn | 0.466622 |
da1e29d843bd0f7bc3f69a2a65de4734019f88d8 | 4,268 | md | Markdown | examples/dataflow-bigquery-transpose/README.md | ruchirjain86/professional-services | 739ac0f5ffc8237f750804fa9f0f14d4d918a0fa | [
"Apache-2.0"
] | 2,116 | 2017-05-18T19:33:05.000Z | 2022-03-31T13:34:48.000Z | examples/dataflow-bigquery-transpose/README.md | ruchirjain86/professional-services | 739ac0f5ffc8237f750804fa9f0f14d4d918a0fa | [
"Apache-2.0"
] | 548 | 2017-05-20T05:05:35.000Z | 2022-03-28T16:38:12.000Z | examples/dataflow-bigquery-transpose/README.md | ruchirjain86/professional-services | 739ac0f5ffc8237f750804fa9f0f14d4d918a0fa | [
"Apache-2.0"
] | 1,095 | 2017-05-19T00:02:36.000Z | 2022-03-31T05:21:39.000Z | # Transpose a BigQuery table using Dataflow
Transposing/Pivoting/Rotating the orientation of a table is a very common task that is performed as part of a standard report generation workflow. While some relational databases provide a built-in *pivot* function of some sort, it can also be done via standard SQL.
As an example, the following table can be pivoted using [BigQuery Standard SQL](https://cloud.google.com/bigquery/docs/reference/standard-sql/):
<img src="img/simple_sql_based_pivot.png" alt="Simple SQL based pivot" height=150 width=650/>
```sql
SELECT
id,
MAX(CASE
WHEN class = 'HVAC' THEN SALES END) AS HVAC_SALES, MAX(CASE
WHEN class = 'GENERATORS' THEN SALES END) AS GENERATORS_SALES
FROM
`project-id.dataset_id.table_id`
GROUP BY
id;
```
However, this can get significantly more complicated as:
* Number of pivot fields increase (single pivot field _**class**_ in the above example).
* Number of distinct values in pivot fields increase (two distinct values _**HVAC**_ and _**GENERATORS**_ in the above example).
* Number of pivot values increase (single pivot value _**sales**_ in the above example).
The most common approach to pivoting a complex table would be a two-step approach:
1. Run a custom script to analyze the table and generate a SQL statement such as the one above.
2. Run the dynamically generated SQL to pivot the table and write the output to another table.
This could also be done using a convenient Dataflow pipeline as described below.
## [Pivot Dataflow Pipeline](src/main/java/com/google/cloud/pso/pipeline/Pivot.java)
[Pivot](src/main/java/com/google/cloud/pso/pipeline/Pivot.java) -
A Dataflow pipeline that can be used to pivot a BigQuery table across any number of pivot fields and values.
This pipeline allows the user to specify a comma separated list of fields across which the table should be rotated in addition to a comma separated list of fields that are rotated.
i.e. The user can specify:
* Key fields along which the table is rotated (_**id**_ in the above example).
* Pivot fields that should be rotated (_**class**_ in the above example).
* Pivot values that should be rotated (_**sales**_ in the above example).
The pipeline will perform various steps to complete the pivot process:
1. Validate that the fields are valid and have the correct datatypes.
2. Read the data from an input BigQuery table.
3. Analyze the pivot fields and dynamically generate the correct schema.
4. Pivot every record based on the dynamically generated schema.
5. Write the pivoted records into a target BigQuery table.
<img src="img/pipeline_graph.png" alt="Pipeline Graph" height=650 width=500/>
## Getting Started
### Requirements
* Java 8
* Maven 3
### Building the Project
Build the entire project using the maven compile command.
```sh
mvn clean compile
```
### Running unit tests
Run all unit tests.
```sh
mvn clean test
```
### Running the Pipeline
<img src="img/example_raw_table.png" alt="Raw input table" height=200 width=550/>
The above input table shows a slightly more complex example. In order to pivot this table, we have the following inputs:
* keyFields = id,locid
* pivotFields = class,on_sale,state
* pivotValues = sale_price,count
The _**desc**_ field is ignored and will not be in the output table.
The [Pivot](src/main/java/com/google/cloud/pso/pipeline/Pivot.java) pipeline will create a new pivot table based on the inputs.
<img src="img/example_pivoted_table.png" alt="Raw input table" height=175 width=850/>
Execute the pipeline using the maven exec:java command.
```sh
MY_PROJECT=my-project-id
MY_STAGING_BUCKET=my-staging-bucket-name
MY_DATASET_ID=my-dataset-id
MY_SOURCE_TABLE_ID=my-source-table-id
MY_TARGET_TABLE_ID=my-target-table-id
mvn compile exec:java -Dexec.mainClass=com.google.cloud.pso.pipeline.Pivot -Dexec.cleanupDaemonThreads=false -Dexec.args=" \
--project=$MY_PROJECT \
--runner=DataflowRunner \
--stagingLocation=gs://${MY_STAGING_BUCKET}/staging \
--tempLocation=gs://${MY_STAGING_BUCKET}/tmp \
--inputTableSpec=${MY_PROJECT}:${MY_DATASET_ID}.${MY_SOURCE_TABLE_ID} \
--outputTableSpec=${MY_PROJECT}:${MY_DATASET_ID}.${MY_TARGET_TABLE_ID} \
--keyFields=id,locid \
--pivotFields=class,on_sale,state \
--valueFields=sale_price,count"
``` | 38.45045 | 266 | 0.768744 | eng_Latn | 0.961861 |
da20b52580988ee1a00fed8cfde1dbd00d456940 | 4,179 | md | Markdown | README.md | rhtconsulting/evmcmd | dea23804574b27f1a847c1143307d4d2e534e5e3 | [
"Apache-2.0"
] | 2 | 2015-07-31T05:41:54.000Z | 2017-06-30T02:06:40.000Z | README.md | rhtconsulting/evmcmd | dea23804574b27f1a847c1143307d4d2e534e5e3 | [
"Apache-2.0"
] | null | null | null | README.md | rhtconsulting/evmcmd | dea23804574b27f1a847c1143307d4d2e534e5e3 | [
"Apache-2.0"
] | null | null | null | EVMCMD Interface
----------------
The evmcmd is a command line interface to the CFME webservice interface. It is intended to be used to retrieve information from the CFME engine and test that the CFME engine is up and running.
Requirements
------------
Ruby 2.0 or higher
Savon 2.0 or higher
Usage
-----
./evmcmd.rb help
./evmcmd.rb evm_ping
./evmcmd.rb mgtsys_listall
./evmcmd.rb mgtsys_details -g guid
./evmcmd.rb mgtsys_gettags -g guid
./evmcmd.rb mgtsys_settag -g guid -c cc -n 002
./evmcmd.rb mgtsys_host_list
./evmcmd.rb host_listall
./evmcmd.rb host_gettags -g guid
./evmcmd.rb host_settag -g guid -c cc -n 002
./evmcmd.rb host_getvms -g guid
./evmcmd.rb host_getmgtsys -g guid
./evmcmd.rb host_details -g guid
./evmcmd.rb datastore_listall
./evmcmd.rb datastore_getvms -i id
./evmcmd.rb datastore_gethosts -i id
./evmcmd.rb datastore_gettags -i id
./evmcmd.rb datastore_settag -i id -c cc -n 002
./evmcmd.rb datastore_getmgtsys -i id
./evmcmd.rb datastore_gettags -i id
./evmcmd.rb datastore_list_bytag -t storagetype/replicated
./evmcmd.rb cluster_listall
./evmcmd.rb cluster_getvms -i id
./evmcmd.rb cluster_gettags -i id
./evmcmd.rb cluster_settag -i id -c cc -n 002
./evmcmd.rb cluster_gethosts -i id
./evmcmd.rb cluster_getmgtsys -i id
./evmcmd.rb cluster_list_bytag -t location/ny
./evmcmd.rb resourcepool_listall
./evmcmd.rb resourcepool_gettags -i id
./evmcmd.rb resourcepool_getmgtsys -i id
./evmcmd.rb resourcepool_settag -i id -c cc -n 001
./evmcmd.rb resourcepool_settag -i id -c location -n ny
./evmcmd.rb resourcepool_list_bytag -t location/ny
./evmcmd.rb resourcepool_list_bytag -t cc/001
./evmcmd.rb vm_listall
./evmcmd.rb vm_details -g guid
./evmcmd.rb vm_gettags -g guid
./evmcmd.rb vm_settag -g guid -c cc -n 002
./evmcmd.rb vm_list_bytag -t location/chicago
./evmcmd.rb vm_list_bytag -t location/ny
./evmcmd.rb vm_list_bytag -t cc/002 --out json
./evmcmd.rb vm_state_start -g guid
./evmcmd.rb vm_state_stop -g guid
./evmcmd.rb vm_state_suspend -g guid
./evmcmd.rb get_automation_request -i id
./evmcmd.rb get_automation_task -i id
./evmcmd.rb provision_request -t templateFields -v vmFields -r requester -c tags -V values -E ems_custom_attributes -M miq_custom_attributes
./evmcmd.rb automation_request -u uri_parts -p parameters -r requester
- To create Instances you must have imported the evmcmd_import.xml file in data/evmcmd_import.xml
./evmcmd.rb create_instance -n namespace -c class -i instance -v value
Or, alternatively, run evmcmd itself to bring up the prompt and run the same tests as above, but without the evmcmd prefix
in front:
evmcmd> mgtsys_listall
evmcmd> mgtsys_gettags 02f0f85e-3b54-11e3-bce6-005056b367d4
evmcmd> mgtsys_settags 02f0f85e-3b54-11e3-bce6-005056b367d4 department accounting
evmcmd> mgtsys_details 02f0f85e-3b54-11e3-bce6-005056b367d4
evmcmd> cluster_listall
evmcmd> host_listall
evmcmd> host_gettags 29344dcc-3b54-11e3-97a2-005056b367d4
evmcmd> vm_listall
evmcmd> vm_gettags vmGuid=2a4765fa-3b54-11e3-97a2-005056b367d4
evmcmd> vm_details vmGuid=2a4765fa-3b54-11e3-97a2-005056b367d4
evmcmd> cluster_listall
evmcmd> resourcepool_listall
evmcmd> datastore_listall
evmcmd> version
Current Updates
---------------
The current version has been updated to use OO constructs. We have defined the following classes:
* Class EvmCmd - Main class for the EVMCMD application
* Class Hosts - Class that handles all requests for host information
* Class CFMEConnection - Connection singleton class that handles the calls to the Web Service
* Class VirtualMachines - Class that handles requests for the vm information.
* Class ManagementSystems - Class that handles requests for the systems managed by the CFME engine.
* Class Clusters - Class that handles the requests for cluster information.
More commands to come as I translate what I see from the WSDL and how to convert it within the framework I started with.
Any comments are welcome
[email protected]
[email protected]
| 37.3125 | 194 | 0.73989 | eng_Latn | 0.510231 |
da2114144abeb356922573087075f7b597e8bb97 | 33 | md | Markdown | README.md | Gabe-Jespersen/JespGen | cbd34c813868dc705b4682297e248e14078c8610 | [
"BSD-2-Clause"
] | null | null | null | README.md | Gabe-Jespersen/JespGen | cbd34c813868dc705b4682297e248e14078c8610 | [
"BSD-2-Clause"
] | null | null | null | README.md | Gabe-Jespersen/JespGen | cbd34c813868dc705b4682297e248e14078c8610 | [
"BSD-2-Clause"
] | null | null | null | Gaussian Random Number Generator
| 16.5 | 32 | 0.878788 | eng_Latn | 0.799107 |
da2140e4eb9e1a7de24167200550084c18ccb330 | 941 | md | Markdown | _posts/2015-10-07-bash-%d0%ba%d0%be%d0%bf%d0%b8%d1%80%d0%be%d0%b2%d0%b0%d0%bd%d0%b8%d0%b5-%d1%84%d0%b0%d0%b9%d0%bb%d0%be%d0%b2-%d0%b8%d0%b7-%d1%81%d0%bf%d0%b8%d1%81%d0%ba%d0%b0.md | RussianPenguin/russianpenguin.github.io | 5d1b68086ae6dd262a965c8082a4daa1200fcbba | [
"CC0-1.0"
] | null | null | null | _posts/2015-10-07-bash-%d0%ba%d0%be%d0%bf%d0%b8%d1%80%d0%be%d0%b2%d0%b0%d0%bd%d0%b8%d0%b5-%d1%84%d0%b0%d0%b9%d0%bb%d0%be%d0%b2-%d0%b8%d0%b7-%d1%81%d0%bf%d0%b8%d1%81%d0%ba%d0%b0.md | RussianPenguin/russianpenguin.github.io | 5d1b68086ae6dd262a965c8082a4daa1200fcbba | [
"CC0-1.0"
] | null | null | null | _posts/2015-10-07-bash-%d0%ba%d0%be%d0%bf%d0%b8%d1%80%d0%be%d0%b2%d0%b0%d0%bd%d0%b8%d0%b5-%d1%84%d0%b0%d0%b9%d0%bb%d0%be%d0%b2-%d0%b8%d0%b7-%d1%81%d0%bf%d0%b8%d1%81%d0%ba%d0%b0.md | RussianPenguin/russianpenguin.github.io | 5d1b68086ae6dd262a965c8082a4daa1200fcbba | [
"CC0-1.0"
] | null | null | null | ---
layout: post
title: 'Bash: copying files from a list'
date: 2015-10-07 18:12:49.000000000 +03:00
type: post
categories:
- Development
- linux
tags:
- bash
permalink: "/2015/10/07/bash-%d0%ba%d0%be%d0%bf%d0%b8%d1%80%d0%be%d0%b2%d0%b0%d0%bd%d0%b8%d0%b5-%d1%84%d0%b0%d0%b9%d0%bb%d0%be%d0%b2-%d0%b8%d0%b7-%d1%81%d0%bf%d0%b8%d1%81%d0%ba%d0%b0/"
---
 Задача: у нас есть файл со списком стилей/скриптов/бинарников (нужное подчернуть) которые надо скопировать или переместить в другое место.
Да. Такие задачи бывают. :)
Допустим выглядит файл как-то так
```shell
$ cat css.txt
css/reset-ls.css
css/b-browser.css
css/reg-form.css
css/old/pop-up.css
```
The paths are either relative or absolute.

You can copy everything to the new location with a simple one-liner:
```shell
$ for i in $(cat css.txt); do cp $i /tmp/; done
```
| 29.40625 | 254 | 0.716259 | rus_Cyrl | 0.347676 |
da21b35c93ce6ef4b384d89d2861ecd77f8da4bc | 2,180 | md | Markdown | _pages/about.md | abinashsinha330/abinashsinha330.github.io | 6a612401249ac07844467d9206c83f449f5fb443 | [
"MIT"
] | null | null | null | _pages/about.md | abinashsinha330/abinashsinha330.github.io | 6a612401249ac07844467d9206c83f449f5fb443 | [
"MIT"
] | null | null | null | _pages/about.md | abinashsinha330/abinashsinha330.github.io | 6a612401249ac07844467d9206c83f449f5fb443 | [
"MIT"
] | null | null | null | ---
permalink: /
title: "This is me!"
excerpt: "About me"
author_profile: true
redirect_from:
- /about/
- /about.html
---
Hello! I hope everyone who visits this page eventually gets a sense of where I am heading in my pursuits. I completed my M.S. from University of Minnesota advised by Dr. Jaideep Srivastava. Previously, I was advised by Dr. Lucy Fortson, working as a *Graduate Research Assistant* on an NSF-funded project related to [LSST](https://www.lsst.org/). My research interests are in deep learning, machine learning and sequential recommender systems.
I would love to be able to get my hands dirty in sustainability analytics someday. The reason being that I am beginning to believe that sustainability should be the core to whatever we do.
Apart from academia, I love playing soccer and badminton; basically, I'm into sports. Other interests include dancing, hiking and kayaking. I am a huge soccer nerd (in fact, one of my projects was related to soccer analytics!), though I prefer calling it football.
## Research
My current research is on recommending video in MOOCs using **self-supervised learning** and **mutual information maximization** by understanding their intrinsic intentions during a student's learning journey. Through this short research journey, I have gotten acquainted with the world of representation learning for sequential data. Previously, I worked on analyzing time-series data of simulated [LSST](https://www.lsst.org/) observations to build a deep learning framework to identify the type of astronomical event that caused those observations.
## Education
> **University of Minnesota, Twin Cities** (GPA: 3.89/4.00) <br/>
> M.S. in Computer Science 2018 - 2021 <br/>
Advisor: Dr. Jaideep Srivastava (Professor, Dept. of Computer Science & Engineering)<br/>
Graduate RA Advisor: Dr. Lucy Fortson (Professor, School of Physics and Astronomy)<br/>
> **Indian Institute of Technology, Kharagpur** (GPA: 7.53/10) <br/>
> B.Tech. in Instrumentation Engineering 2011 - 2015 <br/>
Advisor: Dr. Alok Kanti Deb (Associate Professor, Electrical Engineering)<br/>
| 70.322581 | 551 | 0.764679 | eng_Latn | 0.9929 |
da21ee429b4cb613b6f894e41a0a8aa9e67a1f86 | 705 | md | Markdown | ISSUE_TEMPLATE.md | Mocksybren/A3-Antistasi-1.4 | 2e78bc406b4b36e99eb153a94605b84bf0e860b9 | [
"MIT"
] | 84 | 2018-05-16T17:46:30.000Z | 2021-08-24T07:32:34.000Z | ISSUE_TEMPLATE.md | Mocksybren/A3-Antistasi-1.4 | 2e78bc406b4b36e99eb153a94605b84bf0e860b9 | [
"MIT"
] | 149 | 2018-05-16T10:54:32.000Z | 2021-04-01T16:16:50.000Z | ISSUE_TEMPLATE.md | Mocksybren/A3-Antistasi-1.4 | 2e78bc406b4b36e99eb153a94605b84bf0e860b9 | [
"MIT"
] | 133 | 2018-05-17T21:10:47.000Z | 2022-03-04T06:10:37.000Z | *Version:* 1.0.
*Mods:* CBA, TFAR, ACE(no-medical)
*Environment*: SP, MP LAN, MP host, MP dedi.
*.rpt attatched?* YES
*have you edited the missionfile?*: No
***Issue:***
---
**Delete theese lines after compiling the section above**,
Make short title, and possibly use [bug] [question] before for better identification
To find an .rpt "C:\Users\User\AppData\Local\Arma 3\Arma3_x64_2018-03-06_16-36-36.rpt" (type %appdata% in windows searchbar and press enter, go up by one level and you'll find \Local\Arma3)
To permalink or link code just browse in github the file you want, click on the numer of the line to make it yellow and click on the dots on the left, select what you need, paste here.
| 30.652174 | 189 | 0.726241 | eng_Latn | 0.977135 |
da23ad9cc6d4b43bc682b9ed6539c1ea5342d863 | 26,071 | md | Markdown | articles/batch/batch-ci-cd.md | fuatrihtim/azure-docs.tr-tr | 6569c5eb54bdab7488b44498dc4dad397d32f1be | [
"CC-BY-4.0",
"MIT"
] | 16 | 2017-08-28T08:29:36.000Z | 2022-01-02T16:46:30.000Z | articles/batch/batch-ci-cd.md | fuatrihtim/azure-docs.tr-tr | 6569c5eb54bdab7488b44498dc4dad397d32f1be | [
"CC-BY-4.0",
"MIT"
] | 470 | 2017-11-11T20:59:16.000Z | 2021-04-10T17:06:28.000Z | articles/batch/batch-ci-cd.md | fuatrihtim/azure-docs.tr-tr | 6569c5eb54bdab7488b44498dc4dad397d32f1be | [
"CC-BY-4.0",
"MIT"
] | 25 | 2017-11-11T19:39:08.000Z | 2022-03-30T13:47:56.000Z | ---
title: HPC çözümlerini derlemek & dağıtmak için Azure Pipelines kullanma
description: Azure Batch üzerinde çalışan bir HPC uygulaması için derleme/sürüm ardışık düzeni dağıtmayı öğrenin.
author: chrisreddington
ms.author: chredd
ms.date: 03/04/2021
ms.topic: how-to
ms.openlocfilehash: 7170044af58a508ff5a43751cc376f8b8d498444
ms.sourcegitcommit: 867cb1b7a1f3a1f0b427282c648d411d0ca4f81f
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 03/20/2021
ms.locfileid: "102435554"
---
# <a name="use-azure-pipelines-to-build-and-deploy-hpc-solutions"></a>HPC çözümleri derlemek ve dağıtmak için Azure Pipelines kullanma
Azure DevOps tarafından sunulan araçlar, yüksek performanslı bilgi işlem (HPC) çözümlerinin otomatik olarak oluşturulmasına ve test edilmesine çevrilebilir. [Azure Pipelines](/azure/devops/pipelines/get-started/what-is-azure-pipelines) , yazılım oluşturmaya, dağıtmaya, test etmeye ve izlemeye yönelik bir dizi modern sürekli TÜMLEŞTIRME (CI) ve sürekli DAĞıTıM (CD) işlemi sağlar. Bu işlemler, yazılım teslimatını hızlandırarak altyapıyı ve işlemlerini desteklemek yerine kodunuza odaklanmanızı sağlar.
Bu makalede, Azure Batch üzerinde dağıtılan HPC çözümleri için [Azure Pipelines](/azure/devops/pipelines/get-started/what-is-azure-pipelines) kullanarak CI/CD işlemlerinin nasıl ayarlanacağı açıklanır.
## <a name="prerequisites"></a>Önkoşullar
Bu makaledeki adımları izlemek için bir [Azure DevOps kuruluşunun](/azure/devops/organizations/accounts/create-organization)olması gerekir. Ayrıca, [Azure DevOps 'da bir proje oluşturmanız](/azure/devops/organizations/projects/create-project)gerekecektir.
Başlamadan önce [kaynak denetimi](/azure/devops/user-guide/source-control) ve [Azure Resource Manager şablonu sözdiziminin](../azure-resource-manager/templates/template-syntax.md) temel olarak anlaşılmasına yardımcı olur.
## <a name="create-an-azure-pipeline"></a>Azure işlem hattı oluşturma
Bu örnekte, bir Azure Batch altyapısını dağıtmak ve bir uygulama paketini serbest bırakmak için derleme ve sürüm işlem hattı oluşturacaksınız. Kodun yerel olarak geliştirildiği varsayıldığında, bu genel dağıtım akışdır:

Bu örnek, çeşitli Azure Resource Manager şablonları ve var olan ikilileri kullanır. Bu örnekleri deponuza kopyalayabilir ve bunları Azure DevOps 'a gönderebilirsiniz.
### <a name="understand-the-azure-resource-manager-templates"></a>Azure Resource Manager şablonlarını anlayın
Bu örnek, çözümü dağıtmak için çeşitli Azure Resource Manager şablonları kullanır. Üç yetenek şablonu (birimlere veya modüllere benzer şekilde) belirli bir işlev parçasını uygulamak için kullanılır. Uçtan uca bir çözüm şablonu (deployment.json), bu temel yetenek şablonlarını dağıtmak için kullanılır. Bu [bağlantılı şablon yapısı ](../azure-resource-manager/templates/deployment-tutorial-linked-template.md) , her bir yetenek şablonunun çözümler arasında tek tek test ve yeniden kullanılabilir olmasını sağlar.

Bu şablon, uygulamayı Batch hesabına dağıtmak için gerekli olan bir Azure depolama hesabı tanımlar. Ayrıntılı bilgi için bkz. [Microsoft. Storage kaynak türleri için Kaynak Yöneticisi şablonu başvuru kılavuzu](/azure/templates/microsoft.storage/allversions).
```json
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"accountName": {
"type": "string",
"metadata": {
"description": "Name of the Azure Storage Account"
}
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Storage/storageAccounts",
"name": "[parameters('accountName')]",
"sku": {
"name": "Standard_LRS"
},
"apiVersion": "2018-02-01",
"location": "[resourceGroup().location]",
"properties": {}
}
],
"outputs": {
"blobEndpoint": {
"type": "string",
"value": "[reference(resourceId('Microsoft.Storage/storageAccounts', parameters('accountName'))).primaryEndpoints.blob]"
},
"resourceId": {
"type": "string",
"value": "[resourceId('Microsoft.Storage/storageAccounts', parameters('accountName'))]"
}
}
}
```
Sonraki şablon bir [Azure Batch hesabını](accounts.md)tanımlar. Batch hesabı, [havuzlar](nodes-and-pools.md#pools)genelinde çok sayıda uygulama çalıştırmak için bir platform işlevi görür. Ayrıntılı bilgi için, [Microsoft.Batch kaynak türleri için Kaynak Yöneticisi şablonu başvuru kılavuzuna](/azure/templates/microsoft.batch/allversions)bakın.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"batchAccountName": {
"type": "string",
"metadata": {
"description": "Name of the Azure Batch Account"
}
},
"storageAccountId": {
"type": "string",
"metadata": {
"description": "ID of the Azure Storage Account"
}
}
},
"variables": {},
"resources": [
{
"name": "[parameters('batchAccountName')]",
"type": "Microsoft.Batch/batchAccounts",
"apiVersion": "2017-09-01",
"location": "[resourceGroup().location]",
"properties": {
"poolAllocationMode": "BatchService",
"autoStorage": {
"storageAccountId": "[parameters('storageAccountId')]"
}
}
}
],
"outputs": {}
}
```
Sonraki şablon Batch hesabında bir Batch havuzu oluşturur. Ayrıntılı bilgi için, [Microsoft.Batch kaynak türleri için Kaynak Yöneticisi şablonu başvuru kılavuzuna](/azure/templates/microsoft.batch/allversions)bakın.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"batchAccountName": {
"type": "string",
"metadata": {
"description": "Name of the Azure Batch Account"
}
},
"batchAccountPoolName": {
"type": "string",
"metadata": {
"description": "Name of the Azure Batch Account Pool"
}
}
},
"variables": {},
"resources": [
{
"name": "[concat(parameters('batchAccountName'),'/', parameters('batchAccountPoolName'))]",
"type": "Microsoft.Batch/batchAccounts/pools",
"apiVersion": "2017-09-01",
"properties": {
"deploymentConfiguration": {
"virtualMachineConfiguration": {
"imageReference": {
"publisher": "Canonical",
"offer": "UbuntuServer",
"sku": "18.04-LTS",
"version": "latest"
},
"nodeAgentSkuId": "batch.node.ubuntu 18.04"
}
},
"vmSize": "Standard_D1_v2"
}
}
],
"outputs": {}
}
```
Son şablon, temel alınan üç yetenek şablonunu dağıtan bir Orchestrator işlevi görür.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"templateContainerUri": {
"type": "string",
"metadata": {
"description": "URI of the Blob Storage Container containing the Azure Resource Manager templates"
}
},
"templateContainerSasToken": {
"type": "string",
"metadata": {
"description": "The SAS token of the container containing the Azure Resource Manager templates"
}
},
"applicationStorageAccountName": {
"type": "string",
"metadata": {
"description": "Name of the Azure Storage Account"
}
},
"batchAccountName": {
"type": "string",
"metadata": {
"description": "Name of the Azure Batch Account"
}
},
"batchAccountPoolName": {
"type": "string",
"metadata": {
"description": "Name of the Azure Batch Account Pool"
}
}
},
"variables": {},
"resources": [
{
"apiVersion": "2017-05-10",
"name": "storageAccountDeployment",
"type": "Microsoft.Resources/deployments",
"properties": {
"mode": "Incremental",
"templateLink": {
"uri": "[concat(parameters('templateContainerUri'), '/storageAccount.json', parameters('templateContainerSasToken'))]",
"contentVersion": "1.0.0.0"
},
"parameters": {
"accountName": {"value": "[parameters('applicationStorageAccountName')]"}
}
}
},
{
"apiVersion": "2017-05-10",
"name": "batchAccountDeployment",
"type": "Microsoft.Resources/deployments",
"dependsOn": [
"storageAccountDeployment"
],
"properties": {
"mode": "Incremental",
"templateLink": {
"uri": "[concat(parameters('templateContainerUri'), '/batchAccount.json', parameters('templateContainerSasToken'))]",
"contentVersion": "1.0.0.0"
},
"parameters": {
"batchAccountName": {"value": "[parameters('batchAccountName')]"},
"storageAccountId": {"value": "[reference('storageAccountDeployment').outputs.resourceId.value]"}
}
}
},
{
"apiVersion": "2017-05-10",
"name": "poolDeployment",
"type": "Microsoft.Resources/deployments",
"dependsOn": [
"batchAccountDeployment"
],
"properties": {
"mode": "Incremental",
"templateLink": {
"uri": "[concat(parameters('templateContainerUri'), '/batchAccountPool.json', parameters('templateContainerSasToken'))]",
"contentVersion": "1.0.0.0"
},
"parameters": {
"batchAccountName": {"value": "[parameters('batchAccountName')]"},
"batchAccountPoolName": {"value": "[parameters('batchAccountPoolName')]"}
}
}
}
],
"outputs": {}
}
```
### <a name="understand-the-hpc-solution"></a>HPC çözümünü anlama
Daha önce belirtildiği gibi, bu örnek birkaç Azure Resource Manager şablonu ve var olan ikilileri kullanır. Bu örnekleri deponuza kopyalayabilir ve bunları Azure DevOps 'a gönderebilirsiniz.
Bu çözüm için, FFmpeg uygulama paketi olarak kullanılır. Henüz yoksa [FFmpeg paketini indirebilirsiniz](https://github.com/GyanD/codexffmpeg/releases/tag/4.3.1-2020-11-08) .

Bu deponun dört ana bölümü vardır:
- Azure Resource Manager şablonlarını içeren bir **ARM-Templates** klasörü
- [FFmpeg 4.3.1](https://github.com/GyanD/codexffmpeg/releases/tag/4.3.1-2020-11-08)'ın Windows 64 bit sürümünü içeren bir **HPC-Application** klasörü.
- Yapı işlem hattı işlemini tanımlayan bir YAML dosyası içeren bir **ardışık düzen** klasörü.
- İsteğe bağlı: [FFmpeg örneğiyle Azure Batch .NET dosya işlemenin](https://github.com/Azure-Samples/batch-dotnet-ffmpeg-tutorial) bir kopyası olan bir **istemci-uygulama** klasörü. Bu uygulama bu makale için gerekli değildir.
> [!NOTE]
> Bu, bir kod temelinin yapısına yalnızca bir örnektir. Bu yaklaşım, uygulamanın, altyapının ve işlem hattı kodunun aynı depoda depolandığını gösteren amaçlar için kullanılır.
Kaynak kodu ayarlandığına göre, ilk derlemeyi başlatabilirsiniz.
## <a name="continuous-integration"></a>Sürekli tümleştirme
Azure DevOps Services içinde [Azure Pipelines](/azure/devops/pipelines/get-started/), uygulamalarınız için bir derleme, test ve dağıtım işlem hattı uygulamanıza yardımcı olur.
İşlem hattının bu aşamasında, testler genellikle kodu doğrulamak ve yazılımın uygun parçalarını derlemek için çalıştırılır. Testlerin sayısı ve türleri ve çalıştırdığınız ek görevler, daha geniş derleme ve yayın stratejinize göre değişir.
## <a name="prepare-the-hpc-application"></a>HPC uygulamasını hazırlama
Bu bölümde **HPC-Application** klasörüyle çalışacaksınız. Bu klasör Azure Batch hesabı içinde çalışacak yazılımı (FFmpeg) içerir.
1. Azure DevOps kuruluşunuzda Azure Pipelines yapılar bölümüne gidin. Yeni bir işlem **hattı** oluşturun.

1. Derleme işlem hattı oluşturmak için iki seçeneğiniz vardır:
a. [Görsel tasarımcıyı kullanın](/azure/devops/pipelines/get-started-designer). Bunu yapmak için **Yeni işlem hattı** sayfasında "görsel tasarımcıyı kullan" ı seçin.
b. [YAML derlemelerini kullanın](/azure/devops/pipelines/get-started-yaml). Yeni bir YAML işlem hattı oluşturarak yeni bir işlem **hattı** sayfasında Azure Repos veya GitHub seçeneğine tıklayabilirsiniz. Alternatif olarak, aşağıdaki örneği kaynak denetilinizden saklayabilir ve Visual Designer ' ı seçip YAML şablonunu kullanarak var olan bir YAML dosyasına başvurabilirsiniz.
```yml
# To publish an application into Azure Batch, we need to
# first zip the file, and then publish an artifact, so that
# we can take the necessary steps in our release pipeline.
steps:
# First, we Zip up the files required in the Batch Account
# For this instance, those are the ffmpeg files
- task: ArchiveFiles@2
displayName: 'Archive applications'
inputs:
rootFolderOrFile: hpc-application
includeRootFolder: false
archiveFile: '$(Build.ArtifactStagingDirectory)/package/$(Build.BuildId).zip'
# Publish that zip file, so that we can use it as part
# of our Release Pipeline later
- task: PublishPipelineArtifact@0
inputs:
artifactName: 'hpc-application'
targetPath: '$(Build.ArtifactStagingDirectory)/package'
```
1. Yapı gerektiğinde yapılandırıldıktan sonra **& kuyruğu kaydet**' i seçin. Sürekli tümleştirme etkinse ( **Tetikleyiciler** bölümünde), depoya yeni bir kayıt yapıldığında derleme sırasında ayarlanan koşullara uyan yapı otomatik olarak tetiklenir.

1. Azure Pipelines **Build** bölümüne giderek, Azure DevOps 'daki yapınızı sürmekte olan canlı güncelleştirmeleri görüntüleyin. Derleme tanımınızdan uygun derlemeyi seçin.

> [!NOTE]
> HPC çözümünüzü yürütmek için bir istemci uygulaması kullanıyorsanız, bu uygulama için ayrı bir derleme tanımı oluşturmanız gerekir. [Azure Pipelines](/azure/devops/pipelines/get-started/index) belgelerinde çeşitli nasıl yapılır kılavuzlarından ulaşabilirsiniz.
## <a name="continuous-deployment"></a>Sürekli dağıtım
Azure Pipelines, uygulamanızı ve temel altyapıyı dağıtmak için de kullanılır. [Yayın işlem hatları](/azure/devops/pipelines/release) sürekli dağıtımı etkinleştirir ve yayın işleminizi otomatik hale getirir.
### <a name="deploy-your-application-and-underlying-infrastructure"></a>Uygulamanızı ve temel altyapıyı dağıtın
Altyapıyı dağıtmaya yönelik birkaç adım vardır. Bu çözüm [bağlantılı şablonlar](../azure-resource-manager/templates/linked-templates.md)kullandığından, Bu şablonların ortak bir uç noktadan (http veya https) erişilebilir olması gerekir. Bu bir GitHub veya bir Azure Blob depolama hesabı ya da başka bir depolama konumunda bir depo olabilir. Karşıya yüklenen şablon yapıtları, özel bir modda tutulacağından ve paylaşılan erişim imzası (SAS) belirteci kullanılarak erişilen için güvenli durumda kalabilir.
Aşağıdaki örnek, bir Azure Storage blobundan şablonlar içeren bir altyapının nasıl dağıtılacağını göstermektedir.
1. Yeni bir **yayın tanımı** oluşturun ve boş bir tanım seçin. Yeni oluşturulan ortamı, işlem hattınızla ilgili bir şekilde yeniden adlandırın.

1. HPC uygulamasının çıkışını almak için derleme ardışık düzeninde bir bağımlılık oluşturun.
> [!NOTE]
> **Kaynak diğer** adını, bu, sürüm tanımının içinde görevler oluşturulduğunda gerekli olacağı için bir yere göz atın.

1. Başka bir yapıtın, bu kez bir Azure deposunun bağlantısını oluşturun. Bu, deponuzda depolanan Kaynak Yöneticisi şablonlarına erişmek için gereklidir. Kaynak Yöneticisi şablonlar derleme gerektirirken, bunları bir derleme işlem hattı aracılığıyla göndermeniz gerekmez.
> [!NOTE]
> Daha sonra gerekli olacağı için, bir kez daha sonra **kaynak diğer adına** göz önünde bulunmanız gerekir.

1. **Değişkenler** bölümüne gidin. Aynı bilgileri birden çok göreve yeniden girmeniz gerekmiyorsa, işlem hattınızda birkaç değişken oluşturmak isteyeceksiniz. Bu örnek aşağıdaki değişkenleri kullanır:
- **Applicationstorageaccountname**: HPC uygulama ikililerini tutan depolama hesabının adı
- **Batchaccountapplicationname**: toplu iş hesabındaki uygulamanın adı
- **Batchaccountname**: Batch hesabının adı
- **Batchaccountpoolname**: Işlemeyi yapan VM havuzunun adı
- **Batchapplicationıd**: Batch uygulaması IÇIN benzersiz kimlik
- **Batchapplicationversion**: Batch uygulamanızın anlamsal sürümü (yani, FFmpeg ikilileri)
- **konum**: dağıtılacak Azure kaynakları için konum
- **Resourcegroupname**: oluşturulacak kaynak grubunun adı ve kaynaklarınızın dağıtılacağı konum
- **storageAccountName**: bağlı kaynak yöneticisi şablonlarını tutan depolama hesabının adı

1. Geliştirme ortamı görevlerine gidin. Aşağıdaki anlık görüntüde altı görevi görebilirsiniz. Bu görevler: iç içe geçmiş Kaynak Yöneticisi şablonlarını barındırmak için bir depolama hesabı dağıtma, bu Kaynak Yöneticisi şablonlarını depolama hesabına kopyalama, Batch hesabını ve gerekli bağımlılıkları dağıtma, Azure Batch hesabında bir uygulama oluşturma ve uygulama paketini Azure Batch hesabına yükleme.

1. Indirme işlem **hattı yapıtı (Önizleme)** görevini ekleyin ve aşağıdaki özellikleri ayarlayın:
- **Görünen ad:** ApplicationPackage 'i aracıya indir
- **İndirilecek yapıt adı:** HPC-Application
- **Indirilecek yol**: $ (System. DefaultWorkingDirectory)
1. Azure Resource Manager şablonlarınızı depolamak için bir depolama hesabı oluşturun. Çözümdeki mevcut bir depolama hesabı kullanılabilir, ancak bu otomatik olarak içerilen Bu örnek ve içerik yalıtımının desteklenmesi için, ayrılmış bir depolama hesabı oluşturacaksınız.
**Azure Kaynak grubu dağıtım** görevini ekleyin ve aşağıdaki özellikleri ayarlayın:
- **Görünen ad:** Kaynak Yöneticisi şablonları için depolama hesabı dağıtma
- **Azure aboneliği:** Uygun Azure aboneliğini seçin
- **Eylem**: kaynak grubu oluştur veya güncelleştir
- **Kaynak grubu**: $ (resourcegroupname)
- **Konum**: $ (konum)
- **Şablon**: $ (System. ArtifactsDirectory)/**{Yourazurerepoartifactsourcealias}**/ARM-Templates/storageAccount.json
- **Geçersiz kılma şablonu parametreleri**:-AccountName $ (storageAccountName)
1. Azure Pipelines kullanarak yapıtları kaynak denetiminden depolama hesabına yükleyin. Bu Azure Pipelines görevinin bir parçası olarak, depolama hesabı kapsayıcı URI 'SI ve SAS belirteci, Azure Pipelines bir değişkene alınabilir ve bu da bu aracı aşamasında yeniden kullanılabilmelerini sağlar.
**Azure dosya kopyalama** görevini ekleyin ve aşağıdaki özellikleri ayarlayın:
- **Kaynak:** $ (System. ArtifactsDirectory)/**{Yourazurerepoartifactsourcealias}**/ARM-Templates/
- **Azure bağlantı türü**: Azure Resource Manager
- **Azure aboneliği:** Uygun Azure aboneliğini seçin
- **Hedef türü**: Azure blobu
- **RM depolama hesabı**: $ (storageAccountName)
- **Kapsayıcı adı**: Şablonlar
- **Depolama kapsayıcısı URI 'si**: templatecontaineruri
- **Depolama KAPSAYıCıSı SAS belirteci**: templatecontainersastoken
1. Orchestrator şablonunu dağıtın. Bu şablon, depolama hesabı kapsayıcı URI 'SI ve SAS belirteci için parametreler içerir. Kaynak Yöneticisi şablonunda gereken değişkenler, yayın tanımının değişkenler bölümünde tutulur veya başka bir Azure Pipelines görevinden (örneğin, Azure Blob kopyalama görevinin bir parçası) ayarlanmış olabilir.
**Azure Kaynak grubu dağıtım** görevini ekleyin ve aşağıdaki özellikleri ayarlayın:
- **Görünen ad:** Azure Batch dağıt
- **Azure aboneliği:** Uygun Azure aboneliğini seçin
- **Eylem**: kaynak grubu oluştur veya güncelleştir
- **Kaynak grubu**: $ (resourcegroupname)
- **Konum**: $ (konum)
- **Şablon**: $ (System. ArtifactsDirectory)/**{Yourazurerepoartifactsourcealias}**/ARM-Templates/deployment.json
- **Şablon parametrelerini geçersiz kıl**: `-templateContainerUri $(templateContainerUri) -templateContainerSasToken $(templateContainerSasToken) -batchAccountName $(batchAccountName) -batchAccountPoolName $(batchAccountPoolName) -applicationStorageAccountName $(applicationStorageAccountName)`
Azure Key Vault görevleri kullanmak yaygın bir uygulamadır. Azure aboneliğinize bağlı hizmet sorumlusu uygun bir erişim ilkeleri ayarlandıysa, bir Azure Key Vault parolaları indirebilir ve işlem hattınızda değişken olarak kullanılabilir. Gizli anahtar adı, ilişkili değerle ayarlanır. Örneğin, sürüm tanımında sshPassword 'ın gizli anahtarı $ (sshPassword) ile birlikte başvurulmalıdır.
1. Sonraki adımlar Azure CLı 'yı çağırır. Birincisi, Azure Batch bir uygulama oluşturmak ve ilişkili paketleri karşıya yüklemek için kullanılır.
**Azure CLI** görevini ekleyin ve aşağıdaki özellikleri ayarlayın:
- **Görünen ad:** Azure Batch hesapta uygulama oluştur
- **Azure aboneliği:** Uygun Azure aboneliğini seçin
- **Betik konumu**: satır içi betik
- **Satır Içi betik**: `az batch application create --application-id $(batchApplicationId) --name $(batchAccountName) --resource-group $(resourceGroupName)`
1. İkinci adım, ilişkili paketleri uygulamaya yüklemek için kullanılır (Bu durumda, FFmpeg dosyaları).
**Azure CLI** görevini ekleyin ve aşağıdaki özellikleri ayarlayın:
- **Görünen ad:** Paketi Azure Batch hesaba yükle
- **Azure aboneliği:** Uygun Azure aboneliğini seçin
- **Betik konumu**: satır içi betik
- **Satır Içi betik**: `az batch application package create --application-id $(batchApplicationId) --name $(batchAccountName) --resource-group $(resourceGroupName) --version $(batchApplicationVersion) --package-file=$(System.DefaultWorkingDirectory)/$(Release.Artifacts.{YourBuildArtifactSourceAlias}.BuildId).zip`
> [!NOTE]
> Uygulama paketinin sürüm numarası bir değişkene ayarlı. Bu, paketin önceki sürümlerinin üzerine yazılmasına izin verir ve Azure Batch gönderilen paketin sürüm numarasını el ile denetlemenizi sağlar.
1. **Yeni bir yayın oluşturmak > yayın**' i seçerek yeni bir yayın oluşturun. Tetiklendikten sonra, durumu görüntülemek için yeni sürüme yönelik bağlantıyı seçin.
1. Ortamınızın altındaki **Günlükler** düğmesini seçerek aracıdan canlı çıktıyı görüntüleyin.

## <a name="test-the-environment"></a>Ortamı test etme
Ortam kurulduktan sonra, aşağıdaki testlerin başarıyla tamamlandıklarını onaylayın.
PowerShell komut isteminden Azure CLı kullanarak yeni Azure Batch hesabına bağlanın.
- Azure hesabınızda ile oturum açın `az login` ve kimlik doğrulaması için yönergeleri izleyin.
- Şu anda Batch hesabının kimliğini doğrulayın: `az batch account login -g <resourceGroup> -n <batchAccount>`
#### <a name="list-the-available-applications"></a>Kullanılabilir uygulamaları listeleyin
```azurecli
az batch application list -g <resourcegroup> -n <batchaccountname>
```
#### <a name="check-the-pool-is-valid"></a>Havuzun geçerli olduğunu denetleme
```azurecli
az batch pool list
```
`currentDedicatedNodes`Bu komutun çıktısından değerini aklınızda edin. Bu değer bir sonraki testte ayarlanır.
#### <a name="resize-the-pool"></a>Havuzu yeniden boyutlandır
Havuzu yeniden boyutlandır iş ve görev testi için kullanılabilir işlem düğümleri olacak şekilde, yeniden boyutlandırma tamamlanana ve kullanılabilir düğümler bulunduğundan geçerli durumu görmek için havuz listesi komutuyla denetleyin
```azurecli
az batch pool resize --pool-id <poolname> --target-dedicated-nodes 4
```
## <a name="next-steps"></a>Sonraki adımlar
Basit bir uygulama aracılığıyla Batch hesabıyla nasıl etkileşim kuracağınızı öğrenmek için bu öğreticilere bakın.
- [Python API 'sini kullanarak Azure Batch ile paralel iş yükü çalıştırma](tutorial-parallel-python.md)
- [.NET API’si kullanarak Azure Batch ile paralel iş yükü çalıştırma](tutorial-parallel-dotnet.md)
| 53.644033 | 511 | 0.702812 | tur_Latn | 0.998647 |
da249841c68123e7770de78262bdaa1042e50d29 | 317 | md | Markdown | README.md | 24bisquitz/synth_it | 3280ac5ec513856d0741c8f0a33c136c65c49360 | [
"Apache-2.0"
] | null | null | null | README.md | 24bisquitz/synth_it | 3280ac5ec513856d0741c8f0a33c136c65c49360 | [
"Apache-2.0"
] | null | null | null | README.md | 24bisquitz/synth_it | 3280ac5ec513856d0741c8f0a33c136c65c49360 | [
"Apache-2.0"
] | null | null | null | # synth_it
This repo mainly contains [ORCΛ](https://github.com/hundredrabbits/Orca) codes I wrote to fiddle with the Korg [NTS-1](https://www.korg.com/uk/products/dj/nts_1/), as well as with the Korg volca [bass](https://www.korg.com/de/products/dj/volca_bass) and [FM](https://www.korg.com/de/products/dj/volca_fm).
| 105.666667 | 305 | 0.747634 | eng_Latn | 0.500377 |
da252e3919692c698cd3ccf3fa63725b055aad40 | 1,371 | md | Markdown | CalOrdersJET/node_modules/oraclejet-tooling/README.md | OncoreLLC/CalOrders | bf2e476a7298e1a5caac3827108e056b78af0b29 | [
"MIT"
] | 1 | 2022-03-14T00:08:00.000Z | 2022-03-14T00:08:00.000Z | CalOrdersJET/node_modules/oraclejet-tooling/README.md | OncoreLLC/CalOrders | bf2e476a7298e1a5caac3827108e056b78af0b29 | [
"MIT"
] | null | null | null | CalOrdersJET/node_modules/oraclejet-tooling/README.md | OncoreLLC/CalOrders | bf2e476a7298e1a5caac3827108e056b78af0b29 | [
"MIT"
] | 4 | 2017-03-16T17:34:05.000Z | 2018-02-28T20:10:37.000Z | # oraclejet-tooling 2.2.0
> Programmatic API to build and serve Oracle JET web and mobile applications
## About the tooling API
This tooling API contains methods to build and serve Oracle JET web and hybrid mobile apps. It is intended to be used with task running tools such as grunt or gulp. The APIs can also be invoked directly.
This is an open source project maintained by Oracle Corp.
## Installation
The oraclejet-tooling API will be automatically installed if you scaffold a web or hybrid mobile app following the [Oracle JET Developers Guide](http://docs.oracle.com/middleware/jet220/jet/).
## Usage
The oraclejet-tooling API contains a build API that will build the app with dev or release mode, and other options. It also contains a serve API that serves up your app to browser/simulator/device. Please refer to the source code for details on how to invoke and use the API methods.
## [Contributing](https://github.com/oracle/oraclejet-tooling/tree/master/CONTRIBUTING.md)
Oracle JET is an open source project. Pull Requests are currently not being accepted. See
[CONTRIBUTING](https://github.com/oracle/oraclejet-tooling/tree/master/CONTRIBUTING.md)
for details.
## [License](https://github.com/oracle/oraclejet-tooling/tree/master/LICENSE.md)
Copyright (c) 2014, 2016 Oracle and/or its affiliates
The Universal Permissive License (UPL), Version 1.0 | 59.608696 | 284 | 0.789205 | eng_Latn | 0.991945 |
da25d375c42bb5c469a77a2a0429efa8c95571da | 3,863 | md | Markdown | commandlinetools/2.md | InstallGuides/mac-install-guide | a8fa38c395dc4680b7f2383c2ae24d30a2640b3b | [
"CC0-1.0"
] | 19 | 2021-01-18T03:08:21.000Z | 2022-03-26T01:09:21.000Z | commandlinetools/2.md | InstallGuides/mac-install-guide | a8fa38c395dc4680b7f2383c2ae24d30a2640b3b | [
"CC0-1.0"
] | 1 | 2022-02-15T09:24:15.000Z | 2022-02-15T09:24:15.000Z | commandlinetools/2.md | InstallGuides/mac-install-guide | a8fa38c395dc4680b7f2383c2ae24d30a2640b3b | [
"CC0-1.0"
] | null | null | null | ## Xcode Command Line Tools Already Installed?
To avoid complications, it's best to check if Xcode Command Line Tools are already installed. If already installed, check if you have the most recent version.
If you updated your machine from an earlier macOS version, you may have an outdated version of Xcode or the Command Line Tools in place. Read on to check if you have a troublesome out-of-date Command Line Tools.
### Is Xcode already installed?
If you updated macOS from an earlier version with an "over the top" installation, your earlier development environment may remain intact. You will need to install the new version of Xcode Command Line Tools to avoid headaches. First, check what you have.
Check if you previously installed the full Xcode package:
```bash
$ xcode-select -p
```
#### Scenario 1
If you see, `xcode-select: error: unable to get active developer directory...`, the Xcode package is not installed.
Good! Jump to either section:
- [Install Xcode Command Line Tools with Homebrew](/commandlinetools/3.html) (recommended)
- [Install Xcode Command Line Tools Directly](/commandlinetools/4.html) (alternative).

*The Xcode package is not installed*
#### Scenario 2
If you see a file location that contains spaces in the path:
```bash
/Applications/Apple Dev Tools/Xcode.app/Contents/Developer
```
you will have problems installing Homebrew. You should [delete Xcode](/commandlinetools/7.html) before continuing.
#### Scenario 3
If you see:
```bash
/Applications/Xcode.app/Contents/Developer
```
The full Xcode package is already installed. Perhaps you installed it previously. If Xcode is installed, you will need to update Xcode to the newest version. Go to the App Store application and check "Updates." After updating Xcode, be sure to launch the Xcode application and accept the Apple license terms.
#### Scenario 4
If you see:
```bash
/Library/Developer/CommandLineTools
```
The Xcode Command Line Tools may be installed or an empty directory may be present.
Here's how to test:
```bash
$ ls /Library/Developer/CommandLineTools/usr/bin/git
```
You should see:
```bash
/Library/Developer/CommandLineTools/usr/bin/git
```
#### Remove an empty folder
If the Xcode Command Line Tools folder is empty, you should remove it.
Remove the empty folder:
```bash
$ sudo rm -rf /Library/Developer/CommandLineTools
```
Use `sudo` for admin privileges. You must enter the password you use to log in to your computer (you will not see the password after entering it). After removing the folder, continue to the section, [Install Xcode Command Line Tools with Homebrew](/commandlinetools/3.html).
### Check the Xcode Command Line Tools version
There is no easy way to directly check the version number of Xcode Command Line Tools installed on your machine. Instead, check the version of clang by running `clang --version` in the terminal application. Clang is a compiler that turns C/C++/Objective-C source code into an executable program.
The Wikipedia page for [Xcode](https://en.wikipedia.org/wiki/Xcode) shows the version number of the latest Xcode release and the corresponding clang version.
Check the clang version:
```bash
$ clang --version
Apple clang version 12.0.5 (clang-1205.0.22.9)
Target: arm64-apple-darwin20.4.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
```
Check the [Wikipedia page](https://en.wikipedia.org/wiki/Xcode) to see if you've got the latest version.
If the version is old, first [Uninstall Xcode Command Line Tools](/commandlinetools/6.html) and then [Reinstall Xcode Command Line Tools](/commandlinetools/7.html).
If Xcode Command Line Tools are installed and up to date, you're done! You may want to [Install Homebrew](/homebrew/index.html) to continue setting up your development environment.
| 36.443396 | 308 | 0.770127 | eng_Latn | 0.97905 |
da25e6642713270cb6b1c913f2bcae230fe682bc | 159 | md | Markdown | CSES/others_better_solution/README.md | sml0399/implementation_of_algorithms | ea0b4e00f836875baea5946cca186ad80d7f5473 | [
"RSA-MD"
] | null | null | null | CSES/others_better_solution/README.md | sml0399/implementation_of_algorithms | ea0b4e00f836875baea5946cca186ad80d7f5473 | [
"RSA-MD"
] | 1 | 2021-01-12T05:42:07.000Z | 2021-01-12T05:42:07.000Z | CSES/others_better_solution/README.md | sml0399/implementation_of_algorithms | ea0b4e00f836875baea5946cca186ad80d7f5473 | [
"RSA-MD"
] | null | null | null | # others_better_solution
- solutions provided by others that are better(more efficient or simpler) than my solution
- Links will be provided
## solution links
| 31.8 | 90 | 0.805031 | eng_Latn | 0.999882 |
da25fa97cef0fc7c46e1989c291b990e3cc7b453 | 9,711 | md | Markdown | howtos/misc/_posts/2017-05-18-how-to-create-bug-report.md | prestascott/prestashop.github.io | 5ada918dc801b89c349b1c6bc40a731325363482 | [
"CC0-1.0"
] | 40 | 2015-03-20T22:57:22.000Z | 2022-03-13T21:00:56.000Z | howtos/misc/_posts/2017-05-18-how-to-create-bug-report.md | prestascott/prestashop.github.io | 5ada918dc801b89c349b1c6bc40a731325363482 | [
"CC0-1.0"
] | 455 | 2015-04-04T19:50:25.000Z | 2022-03-31T10:02:11.000Z | howtos/misc/_posts/2017-05-18-how-to-create-bug-report.md | prestascott/prestashop.github.io | 5ada918dc801b89c349b1c6bc40a731325363482 | [
"CC0-1.0"
] | 50 | 2015-04-04T13:17:59.000Z | 2021-09-21T17:33:42.000Z | ---
layout: post
title: "How to create the best bug reports"
subtitle: "Three rules: up-to-date, reproducible, and detailed."
date: 2017-05-22 09:10:11
authors: [ xavierborderie ]
icon: icon-bug
tags:
- contribution
- forge
---
Every morning, the Product and QA team gather around a big screen, and review the new Forge tickets that were created since the day before. At the end of the meeting, all new tickets must be asserted, and if need be, placed into the hand of a team member.
Reviewing Forge ticket can be very brain-consuming: sometimes, report incompleteness can make for a need to ask for more info, thus slowing down. On the other hand, we sometimes get tickets that are a pleasure to work with. Here are their most common denominators.
<div class="alert alert-note" role="alert">
Quick summary:
<ul>
<li>Always test on the latest version.</li>
<li>Always provide step-by-step reproduction instructions.</li>
<li>Always give details, such as server configuration</li>
<li>Try to add pictures/video.</li>
<li>Try to enable the Debug Mode.</li>
<li>Try to check if the issue has not already been reported.</li>
<li>Try to add log files.</li>
</ul>
</div>
### The basics
Apart from the obvious descriptive title, there is a trifecta of must-haves in a useful Forge ticket.
#### Up-to-date
Oftentimes, bug reporters provide us with information about what happens in their own situation, chiefly their current version of PrestaShop -- which is seldom the latest version.
This can make reviewing harder, because there is a number of things that could have changed between the reporter's version and the latest version of PrestaShop -- even fixing that very bug!
That is why it is always preferable for the reporter to make sure that the bug happens in the latest available version -- either [an official release](https://www.prestashop.com/en/download), or [the current in-development version](https://github.com/PrestaShop/PrestaShop/tree/develop).
Because not everyone can afford the time to upgrade their online shop to the latest version in hope that their issue is fixed, we encourage the more technical users to make tests on their local machines, using an Apache-PHP-MySQL/MariaDB stack such as [WampServer](http://www.wampserver.com/), [MAMP](https://www.mamp.info/en/), [XAMPP](https://www.apachefriends.org/) or any other.<br/>
Not only will it then be able to easily install the latest version, but it will also make it possible to test in the default conditions -- indeed, third-party themes and modules can create issues that the PrestaShop simply cannot fix.
#### Reproducible
Writing how to reproduce an issue time helps tremendously. Even better if its reproducible in a fresh install of PrestaShop.
Simply put, if the team cannot reproduce the issue on their side, there is little to no chance that the issue will be fixed. They cannot fix bugs blindly.<br/>
Often, issues that cannot be reproduced are either due to customization (theme, modules, override, etc.), or by local problems that are specific to the shop.
The best way to help the team to reproduce issues is to give precise step-by-step instructions. For instance:
<blockquote>
Steps to reproduce:
<ul>
<li>Go to Sell > Catalog.</li>
<li>Click "Create a product".</li>
<li>Save without entering a title.</li>
</ul>
Expected result: an empty product is created and saved.
Actual result: the computer turns into a unicorn. THANK YOU!
</blockquote>
Thanks to these steps (which you should test more than once), you can lead the team on the way to reproduce the issue you found -- and thus on a quick fix!
#### Detailed
The title and description should bring the team half-way to understanding what is happening. While the reproduction steps are important, sometimes there are implied information that is also useful. This is where a detailed description comes in.
The idea of the description is not to tell a whole story of how you stumbled upon the issue. You should it keep it to the point: keep the information that is specific to the issue, remove that information this is specific to your situation. Your description should be general enough to apply to any other installation.
Details such as your software stack can be particularly useful:
* Precise version of PrestaShop ("1.6.1.13" is more helpful than "1.6").
* Server setup:
* Operating system.
* Type and version of webserver (Apache, Nginx, IIS, other).
* Version of PHP.
* Version of MySQL/MariaDB.
* Make sure you have emptied PrestaShop's cache!
* Browser setup:
* Operating system.
* Version of the browser.
* Make sure you have emptied your browser's cache!
* Any error message.
### The bonuses
Follow the rules above, and your ticket will already be on the path to fix-finding.
There are a handful of nice things you can do to improve your ticket further.
#### Checking past issues
Of course you may think that you are the first on to stumble upon that wrong behavior -- and you may be right!
But quite often, a ticket already exists for that issue: you should search the Forge with a couple of keywords, in order to see if you're really first on this one.
If there is indeed already a ticket for your issue, you can still help! Simply comment the ticket with details that you think are pertinent -- or even better, improve the ticket by adding any of the suggestion from this page: steps to reproduce, test on updated version, etc.
#### Making one issue per ticket
It's tempting to finish your ticket with "Another issue I found is that (...)", because to you the two issues happen in the same context.
On the developer side, the two issues might be completely unrelated, and therefore following-up on any supplementary issue using a single Forge ticket is quite a hassle.
So, even if it takes you more time, please create one ticket per issue -- and only mention one issue per ticket :)
#### Adding logs
Sometimes you just don't know what's wrong: something's not happening, that's all.
That's where log data can be helpful. They keep track of what is happening, and allows to trace back to the moment when the issue happens.
There are several different log data that you can add as a file to your ticket:
* Server log:
* Apache: /var/log/apache/apache_error.log
* Nginx: /var/log/nginx/nginx_error.log
* IIS: %SystemDrive%\inetpub\logs\LogFiles
* PHP error log:
* Apache: /var/log/apache/php_error.log
* Nginx: /var/log/php-fpm/default/error.log
* IIS: %SystemDrive%\inetpub\logs\LogFiles
* Console log:
* Firefox and Chrome: press Ctrl+Shit+J or Cmd+Shift+J.
* Internet Explorer and Edge: press F12 then go to the Console tab.
Copy the content relevant to your issue (or to the time the issue appeared), and paste it into the ticket (or in a text file attached to the ticket).
#### Enabling Debug Mode + cache
PrestaShop has a debug mode, which can be used to see the non-obvious errors on your shop. For instance, in these situations:
* The browser displays a blank pages.
* The browser displays a 500 Internal Server Error.
* You cannot log into your dashboard, or access certain pages of the back office.
To turn on the Debug mode on PrestaShop 1.7, follow these steps:
* Got to Advanced Parameters > Performance
* In the Debug Mode section, set "Debug Mode" to "Yes".
* Click the "Save" button.
If you can't access your dashboard, you can still enable the Debug Move using an FTP client (for instance the free Filezilla tool) to edit the config/defines.inc.php file. Find the following line in the file:
`define('_PS_MODE_DEV_', false);`
... and change it to:
`define('_PS_MODE_DEV_', true);`
Finally, emptying the cache of PrestaShop can also be useful, because it ensures that your browser only displays newly-generated files instead of old cached ones. In PrestaShop 1.7, you can empty the cache here:
* Go to Advanced Parameters > Performance.
* Click on the "Clear Cache" button at the top right.
* You're done!
#### Providing screenshots
Most of the time, bugs are visual: whether a button is not displaying well, or a error message pops up, or you want to indicate the content of a form before a failed submission.
That's where screenshot come in very handy! Providing one or more screen captures right into the ticket's attached file can help visually spot and reproduce the issue, and can save a lot of time in understanding your words.
#### Providing a (short) video
If your issue is only triggered in a series of steps, it is great to list them textually, but it can also help to provide a video of these steps, so that we can reproduce your problem with certainty.
To be perfectly useful:
* Keep it short. Less than a minute is best. Do not repeat your bug description; just follow your own step by step instructions.
* Do not talk too much. Accents and even languages other than French and English can be confusing.
* Prefer details to file size. A small screen size or blurry compression can prevent the team from having a clear view of your actions in the video.
#### Being patient
Thank you for reporting your issue! Please understand that it will not be fixed right away: you will have to wait at least for the next bugfix version to be released.
Several new Forge tickets are created every day, and the team spends a lot of time building the developers' workload using these tickets. Depending on its asserted priority and the existing workload, your ticket can either go straight to the top (if your issue is considered blocking or critical), near the top (if it's a major or standard issue), or left for later consideration (if it's a minor or trivial issue).
Thank you for your patience, then!
| 49.545918 | 415 | 0.761405 | eng_Latn | 0.999339 |
da260776a7b77995fd95495b2a0cf2093007bae5 | 112 | md | Markdown | README.md | AndreLuizSantosDeLima/catalagodecarros | fb4f95560468cdb42b84fc9e7d9d36fd238ff10c | [
"MIT"
] | null | null | null | README.md | AndreLuizSantosDeLima/catalagodecarros | fb4f95560468cdb42b84fc9e7d9d36fd238ff10c | [
"MIT"
] | null | null | null | README.md | AndreLuizSantosDeLima/catalagodecarros | fb4f95560468cdb42b84fc9e7d9d36fd238ff10c | [
"MIT"
] | null | null | null | #Projeto deu um catálago de carros
Fazendo esse curso pelo canal => https://www.youtube.com/watch?v=WzO5QlkjVLA
| 37.333333 | 76 | 0.785714 | por_Latn | 0.998799 |
da26d6630872343de510e4826625df5d1335f260 | 1,573 | md | Markdown | results/referenceaudioanalyzer/referenceaudioanalyzer_hdm-x_harman_over-ear_2018/JBL V310 BT (bluetooth)/README.md | eliMakeouthill/AutoEq | b16c72495b3ce493293c6a4a4fdf45a81aec9ca0 | [
"MIT"
] | 3 | 2022-02-25T08:33:08.000Z | 2022-03-13T11:27:29.000Z | results/referenceaudioanalyzer/referenceaudioanalyzer_hdm-x_harman_over-ear_2018/JBL V310 BT (bluetooth)/README.md | billclintonwong/AutoEq | aa25ed8e8270c523893fadbda57e9811c65733f1 | [
"MIT"
] | null | null | null | results/referenceaudioanalyzer/referenceaudioanalyzer_hdm-x_harman_over-ear_2018/JBL V310 BT (bluetooth)/README.md | billclintonwong/AutoEq | aa25ed8e8270c523893fadbda57e9811c65733f1 | [
"MIT"
] | null | null | null | # JBL V310 BT (bluetooth)
See [usage instructions](https://github.com/jaakkopasanen/AutoEq#usage) for more options and info.
### Parametric EQs
In case of using parametric equalizer, apply preamp of **-8.3dB** and build filters manually
with these parameters. The first 5 filters can be used independently.
When using independent subset of filters, apply preamp of **-7.9dB**.
| Type | Fc | Q | Gain |
|:--------|:---------|:-----|:---------|
| Peaking | 26 Hz | 0.71 | 7.1 dB |
| Peaking | 159 Hz | 0.18 | -1.8 dB |
| Peaking | 4673 Hz | 5.13 | 7.0 dB |
| Peaking | 11042 Hz | 1.57 | 8.6 dB |
| Peaking | 19657 Hz | 0.62 | -16.2 dB |
| Peaking | 2222 Hz | 3.63 | 2.7 dB |
| Peaking | 3052 Hz | 4.15 | -3.4 dB |
| Peaking | 5287 Hz | 4.33 | 3.4 dB |
| Peaking | 6636 Hz | 2.51 | -4.9 dB |
| Peaking | 8691 Hz | 4.48 | 3.8 dB |
### Fixed Band EQs
In case of using fixed band (also called graphic) equalizer, apply preamp of **-7.2dB**
(if available) and set gains manually with these parameters.
| Type | Fc | Q | Gain |
|:--------|:---------|:-----|:--------|
| Peaking | 31 Hz | 1.41 | 6.7 dB |
| Peaking | 62 Hz | 1.41 | 0.3 dB |
| Peaking | 125 Hz | 1.41 | -2.5 dB |
| Peaking | 250 Hz | 1.41 | -0.1 dB |
| Peaking | 500 Hz | 1.41 | -1.5 dB |
| Peaking | 1000 Hz | 1.41 | -0.9 dB |
| Peaking | 2000 Hz | 1.41 | 0.2 dB |
| Peaking | 4000 Hz | 1.41 | 1.4 dB |
| Peaking | 8000 Hz | 1.41 | 3.9 dB |
| Peaking | 16000 Hz | 1.41 | -5.4 dB |
### Graphs
.png) | 39.325 | 98 | 0.55054 | eng_Latn | 0.735985 |
da279198039724127d3a7923a7d0a164d02818ca | 32,609 | md | Markdown | src/maintenance/report.md | wwendyc/azure-cli-extensions | 6b4099676bb5d43fdb57bc69f9c0281cca510a0a | [
"MIT"
] | 1 | 2021-08-03T18:32:54.000Z | 2021-08-03T18:32:54.000Z | src/maintenance/report.md | wwendyc/azure-cli-extensions | 6b4099676bb5d43fdb57bc69f9c0281cca510a0a | [
"MIT"
] | 4 | 2020-09-07T12:56:24.000Z | 2021-02-04T12:19:20.000Z | src/maintenance/report.md | wwendyc/azure-cli-extensions | 6b4099676bb5d43fdb57bc69f9c0281cca510a0a | [
"MIT"
] | 5 | 2020-09-08T22:46:48.000Z | 2020-11-08T14:54:35.000Z | # Azure CLI Module Creation Report
## EXTENSION
|CLI Extension|Command Groups|
|---------|------------|
|az maintenance|[groups](#CommandGroups)
## GROUPS
### <a name="CommandGroups">Command groups in `az maintenance` extension </a>
|CLI Command Group|Group Swagger name|Commands|
|---------|------------|--------|
|az maintenance public-configuration|PublicMaintenanceConfigurations|[commands](#CommandsInPublicMaintenanceConfigurations)|
|az maintenance applyupdate|ApplyUpdates|[commands](#CommandsInApplyUpdates)|
|az maintenance assignment|ConfigurationAssignments|[commands](#CommandsInConfigurationAssignments)|
|az maintenance configuration|MaintenanceConfigurations|[commands](#CommandsInMaintenanceConfigurations)|
|az maintenance configuration-for-resource-group|MaintenanceConfigurationsForResourceGroup|[commands](#CommandsInMaintenanceConfigurationsForResourceGroup)|
|az maintenance applyupdate-for-resource-group|ApplyUpdateForResourceGroup|[commands](#CommandsInApplyUpdateForResourceGroup)|
|az maintenance update|Updates|[commands](#CommandsInUpdates)|
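The command groups above are provided by the `maintenance` CLI extension. A minimal setup sketch, assuming the extension is published under that name:
```
az extension add --name maintenance
```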
## COMMANDS
### <a name="CommandsInApplyUpdates">Commands in `az maintenance applyupdate` group</a>
|CLI Command|Operation Swagger name|Parameters|Examples|
|---------|------------|--------|-----------|
|[az maintenance applyupdate list](#ApplyUpdatesList)|List|[Parameters](#ParametersApplyUpdatesList)|[Example](#ExamplesApplyUpdatesList)|
|[az maintenance applyupdate show](#ApplyUpdatesGet)|Get|[Parameters](#ParametersApplyUpdatesGet)|[Example](#ExamplesApplyUpdatesGet)|
|[az maintenance applyupdate create](#ApplyUpdatesCreateOrUpdateParent)|CreateOrUpdateParent|[Parameters](#ParametersApplyUpdatesCreateOrUpdateParent)|[Example](#ExamplesApplyUpdatesCreateOrUpdateParent)|
|[az maintenance applyupdate create](#ApplyUpdatesCreateOrUpdate#Create)|CreateOrUpdate#Create|[Parameters](#ParametersApplyUpdatesCreateOrUpdate#Create)|[Example](#ExamplesApplyUpdatesCreateOrUpdate#Create)|
|[az maintenance applyupdate update](#ApplyUpdatesCreateOrUpdate#Update)|CreateOrUpdate#Update|[Parameters](#ParametersApplyUpdatesCreateOrUpdate#Update)|[Example](#ExamplesApplyUpdatesCreateOrUpdate#Update)|
|[az maintenance applyupdate show-parent](#ApplyUpdatesGetParent)|GetParent|[Parameters](#ParametersApplyUpdatesGetParent)|[Example](#ExamplesApplyUpdatesGetParent)|
### <a name="CommandsInApplyUpdateForResourceGroup">Commands in `az maintenance applyupdate-for-resource-group` group</a>
|CLI Command|Operation Swagger name|Parameters|Examples|
|---------|------------|--------|-----------|
|[az maintenance applyupdate-for-resource-group list](#ApplyUpdateForResourceGroupList)|List|[Parameters](#ParametersApplyUpdateForResourceGroupList)|[Example](#ExamplesApplyUpdateForResourceGroupList)|
### <a name="CommandsInConfigurationAssignments">Commands in `az maintenance assignment` group</a>
|CLI Command|Operation Swagger name|Parameters|Examples|
|---------|------------|--------|-----------|
|[az maintenance assignment list](#ConfigurationAssignmentsList)|List|[Parameters](#ParametersConfigurationAssignmentsList)|[Example](#ExamplesConfigurationAssignmentsList)|
|[az maintenance assignment create](#ConfigurationAssignmentsCreateOrUpdateParent)|CreateOrUpdateParent|[Parameters](#ParametersConfigurationAssignmentsCreateOrUpdateParent)|[Example](#ExamplesConfigurationAssignmentsCreateOrUpdateParent)|
|[az maintenance assignment create](#ConfigurationAssignmentsCreateOrUpdate#Create)|CreateOrUpdate#Create|[Parameters](#ParametersConfigurationAssignmentsCreateOrUpdate#Create)|[Example](#ExamplesConfigurationAssignmentsCreateOrUpdate#Create)|
|[az maintenance assignment update](#ConfigurationAssignmentsCreateOrUpdate#Update)|CreateOrUpdate#Update|[Parameters](#ParametersConfigurationAssignmentsCreateOrUpdate#Update)|Not Found|
|[az maintenance assignment delete](#ConfigurationAssignmentsDeleteParent)|DeleteParent|[Parameters](#ParametersConfigurationAssignmentsDeleteParent)|[Example](#ExamplesConfigurationAssignmentsDeleteParent)|
|[az maintenance assignment delete](#ConfigurationAssignmentsDelete)|Delete|[Parameters](#ParametersConfigurationAssignmentsDelete)|[Example](#ExamplesConfigurationAssignmentsDelete)|
|[az maintenance assignment list-parent](#ConfigurationAssignmentsListParent)|ListParent|[Parameters](#ParametersConfigurationAssignmentsListParent)|[Example](#ExamplesConfigurationAssignmentsListParent)|
### <a name="CommandsInMaintenanceConfigurations">Commands in `az maintenance configuration` group</a>
|CLI Command|Operation Swagger name|Parameters|Examples|
|---------|------------|--------|-----------|
|[az maintenance configuration list](#MaintenanceConfigurationsList)|List|[Parameters](#ParametersMaintenanceConfigurationsList)|[Example](#ExamplesMaintenanceConfigurationsList)|
|[az maintenance configuration show](#MaintenanceConfigurationsGet)|Get|[Parameters](#ParametersMaintenanceConfigurationsGet)|[Example](#ExamplesMaintenanceConfigurationsGet)|
|[az maintenance configuration create](#MaintenanceConfigurationsCreateOrUpdate#Create)|CreateOrUpdate#Create|[Parameters](#ParametersMaintenanceConfigurationsCreateOrUpdate#Create)|[Example](#ExamplesMaintenanceConfigurationsCreateOrUpdate#Create)|
|[az maintenance configuration update](#MaintenanceConfigurationsUpdate)|Update|[Parameters](#ParametersMaintenanceConfigurationsUpdate)|[Example](#ExamplesMaintenanceConfigurationsUpdate)|
|[az maintenance configuration delete](#MaintenanceConfigurationsDelete)|Delete|[Parameters](#ParametersMaintenanceConfigurationsDelete)|[Example](#ExamplesMaintenanceConfigurationsDelete)|
### <a name="CommandsInMaintenanceConfigurationsForResourceGroup">Commands in `az maintenance configuration-for-resource-group` group</a>
|CLI Command|Operation Swagger name|Parameters|Examples|
|---------|------------|--------|-----------|
|[az maintenance configuration-for-resource-group list](#MaintenanceConfigurationsForResourceGroupList)|List|[Parameters](#ParametersMaintenanceConfigurationsForResourceGroupList)|[Example](#ExamplesMaintenanceConfigurationsForResourceGroupList)|
### <a name="CommandsInPublicMaintenanceConfigurations">Commands in `az maintenance public-configuration` group</a>
|CLI Command|Operation Swagger name|Parameters|Examples|
|---------|------------|--------|-----------|
|[az maintenance public-configuration list](#PublicMaintenanceConfigurationsList)|List|[Parameters](#ParametersPublicMaintenanceConfigurationsList)|[Example](#ExamplesPublicMaintenanceConfigurationsList)|
|[az maintenance public-configuration show](#PublicMaintenanceConfigurationsGet)|Get|[Parameters](#ParametersPublicMaintenanceConfigurationsGet)|[Example](#ExamplesPublicMaintenanceConfigurationsGet)|
### <a name="CommandsInUpdates">Commands in `az maintenance update` group</a>
|CLI Command|Operation Swagger name|Parameters|Examples|
|---------|------------|--------|-----------|
|[az maintenance update list](#UpdatesList)|List|[Parameters](#ParametersUpdatesList)|[Example](#ExamplesUpdatesList)|
|[az maintenance update list-parent](#UpdatesListParent)|ListParent|[Parameters](#ParametersUpdatesListParent)|[Example](#ExamplesUpdatesListParent)|
## COMMAND DETAILS
### group `az maintenance applyupdate`
#### <a name="ApplyUpdatesList">Command `az maintenance applyupdate list`</a>
##### <a name="ExamplesApplyUpdatesList">Example</a>
```
az maintenance applyupdate list
```
##### <a name="ParametersApplyUpdatesList">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
#### <a name="ApplyUpdatesGet">Command `az maintenance applyupdate show`</a>
##### <a name="ExamplesApplyUpdatesGet">Example</a>
```
az maintenance applyupdate show --name "e9b9685d-78e4-44c4-a81c-64a14f9b87b6" --provider-name "Microsoft.Compute" \
--resource-group "examplerg" --resource-name "smdtest1" --resource-type "virtualMachineScaleSets"
```
##### <a name="ParametersApplyUpdatesGet">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
|**--apply-update-name**|string|applyUpdate Id|apply_update_name|applyUpdateName|
#### <a name="ApplyUpdatesCreateOrUpdateParent">Command `az maintenance applyupdate create`</a>
##### <a name="ExamplesApplyUpdatesCreateOrUpdateParent">Example</a>
```
az maintenance applyupdate create --provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name \
"smdvm1" --resource-parent-name "smdtest1" --resource-parent-type "virtualMachineScaleSets" --resource-type \
"virtualMachines"
```
##### <a name="ParametersApplyUpdatesCreateOrUpdateParent">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-parent-type**|string|Resource parent type|resource_parent_type|resourceParentType|
|**--resource-parent-name**|string|Resource parent identifier|resource_parent_name|resourceParentName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
#### <a name="ApplyUpdatesCreateOrUpdate#Create">Command `az maintenance applyupdate create`</a>
##### <a name="ExamplesApplyUpdatesCreateOrUpdate#Create">Example</a>
```
az maintenance applyupdate create --provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name \
"smdtest1" --resource-type "virtualMachineScaleSets"
```
##### <a name="ParametersApplyUpdatesCreateOrUpdate#Create">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
#### <a name="ApplyUpdatesCreateOrUpdate#Update">Command `az maintenance applyupdate update`</a>
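##### <a name="ExamplesApplyUpdatesCreateOrUpdate#Update">Example</a>
No example is defined in the swagger for this operation; the invocation below is a sketch that mirrors the create example above, and the resource group and resource names are assumptions carried over from it.
```
az maintenance applyupdate update --provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name \
"smdtest1" --resource-type "virtualMachineScaleSets"
```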
##### <a name="ParametersApplyUpdatesCreateOrUpdate#Update">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
#### <a name="ApplyUpdatesGetParent">Command `az maintenance applyupdate show-parent`</a>
##### <a name="ExamplesApplyUpdatesGetParent">Example</a>
```
az maintenance applyupdate show-parent --name "e9b9685d-78e4-44c4-a81c-64a14f9b87b6" --provider-name \
"Microsoft.Compute" --resource-group "examplerg" --resource-name "smdvm1" --resource-parent-name "smdtest1" \
--resource-parent-type "virtualMachineScaleSets" --resource-type "virtualMachines"
```
##### <a name="ParametersApplyUpdatesGetParent">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--resource-parent-type**|string|Resource parent type|resource_parent_type|resourceParentType|
|**--resource-parent-name**|string|Resource parent identifier|resource_parent_name|resourceParentName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
|**--apply-update-name**|string|applyUpdate Id|apply_update_name|applyUpdateName|
### group `az maintenance applyupdate-for-resource-group`
#### <a name="ApplyUpdateForResourceGroupList">Command `az maintenance applyupdate-for-resource-group list`</a>
##### <a name="ExamplesApplyUpdateForResourceGroupList">Example</a>
```
az maintenance applyupdate-for-resource-group list --resource-group "examplerg"
```
##### <a name="ParametersApplyUpdateForResourceGroupList">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource Group Name|resource_group_name|resourceGroupName|
### group `az maintenance assignment`
#### <a name="ConfigurationAssignmentsList">Command `az maintenance assignment list`</a>
##### <a name="ExamplesConfigurationAssignmentsList">Example</a>
```
az maintenance assignment list --provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name \
"smdtest1" --resource-type "virtualMachineScaleSets"
```
##### <a name="ParametersConfigurationAssignmentsList">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
#### <a name="ConfigurationAssignmentsCreateOrUpdateParent">Command `az maintenance assignment create`</a>
##### <a name="ExamplesConfigurationAssignmentsCreateOrUpdateParent">Example</a>
```
az maintenance assignment create --maintenance-configuration-id "/subscriptions/5b4b650e-28b9-4790-b3ab-ddbd88d727c4/re\
sourcegroups/examplerg/providers/Microsoft.Maintenance/maintenanceConfigurations/policy1" --name "workervmPolicy" \
--provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name "smdvm1" --resource-parent-name \
"smdtest1" --resource-parent-type "virtualMachineScaleSets" --resource-type "virtualMachines"
```
##### <a name="ParametersConfigurationAssignmentsCreateOrUpdateParent">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-parent-type**|string|Resource parent type|resource_parent_type|resourceParentType|
|**--resource-parent-name**|string|Resource parent identifier|resource_parent_name|resourceParentName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
|**--configuration-assignment-name**|string|Configuration assignment name|configuration_assignment_name|configurationAssignmentName|
|**--location**|string|Location of the resource|location|location|
|**--maintenance-configuration-id**|string|The maintenance configuration Id|maintenance_configuration_id|maintenanceConfigurationId|
|**--resource-id**|string|The unique resourceId|resource_id|resourceId|
#### <a name="ConfigurationAssignmentsCreateOrUpdate#Create">Command `az maintenance assignment create`</a>
##### <a name="ExamplesConfigurationAssignmentsCreateOrUpdate#Create">Example</a>
```
az maintenance assignment create --maintenance-configuration-id "/subscriptions/5b4b650e-28b9-4790-b3ab-ddbd88d727c4/re\
sourcegroups/examplerg/providers/Microsoft.Maintenance/maintenanceConfigurations/configuration1" --name \
"workervmConfiguration" --provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name "smdtest1" \
--resource-type "virtualMachineScaleSets"
```
##### <a name="ParametersConfigurationAssignmentsCreateOrUpdate#Create">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
#### <a name="ConfigurationAssignmentsCreateOrUpdate#Update">Command `az maintenance assignment update`</a>
##### <a name="ParametersConfigurationAssignmentsCreateOrUpdate#Update">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
|**--configuration-assignment-name**|string|Configuration assignment name|configuration_assignment_name|configurationAssignmentName|
|**--location**|string|Location of the resource|location|location|
|**--maintenance-configuration-id**|string|The maintenance configuration Id|maintenance_configuration_id|maintenanceConfigurationId|
|**--resource-id**|string|The unique resourceId|resource_id|resourceId|
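No example was generated for this command; a plausible call, patterned on the matching create example (values are illustrative):
```
az maintenance assignment update --maintenance-configuration-id "/subscriptions/5b4b650e-28b9-4790-b3ab-ddbd88d727c4/re\
sourcegroups/examplerg/providers/Microsoft.Maintenance/maintenanceConfigurations/configuration1" --name \
"workervmConfiguration" --provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name "smdtest1" \
--resource-type "virtualMachineScaleSets"
```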
#### <a name="ConfigurationAssignmentsDeleteParent">Command `az maintenance assignment delete`</a>
##### <a name="ExamplesConfigurationAssignmentsDeleteParent">Example</a>
```
az maintenance assignment delete --name "workervmConfiguration" --provider-name "Microsoft.Compute" --resource-group \
"examplerg" --resource-name "smdvm1" --resource-parent-name "smdtest1" --resource-parent-type \
"virtualMachineScaleSets" --resource-type "virtualMachines"
```
##### <a name="ParametersConfigurationAssignmentsDeleteParent">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-parent-type**|string|Resource parent type|resource_parent_type|resourceParentType|
|**--resource-parent-name**|string|Resource parent identifier|resource_parent_name|resourceParentName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
|**--configuration-assignment-name**|string|Unique configuration assignment name|configuration_assignment_name|configurationAssignmentName|
#### <a name="ConfigurationAssignmentsDelete">Command `az maintenance assignment delete`</a>
##### <a name="ExamplesConfigurationAssignmentsDelete">Example</a>
```
az maintenance assignment delete --name "workervmConfiguration" --provider-name "Microsoft.Compute" --resource-group \
"examplerg" --resource-name "smdtest1" --resource-type "virtualMachineScaleSets"
```
##### <a name="ParametersConfigurationAssignmentsDelete">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
#### <a name="ConfigurationAssignmentsListParent">Command `az maintenance assignment list-parent`</a>
##### <a name="ExamplesConfigurationAssignmentsListParent">Example</a>
```
az maintenance assignment list-parent --provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name \
"smdtestvm1" --resource-parent-name "smdtest1" --resource-parent-type "virtualMachineScaleSets" --resource-type \
"virtualMachines"
```
##### <a name="ParametersConfigurationAssignmentsListParent">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-parent-type**|string|Resource parent type|resource_parent_type|resourceParentType|
|**--resource-parent-name**|string|Resource parent identifier|resource_parent_name|resourceParentName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
### group `az maintenance configuration`
#### <a name="MaintenanceConfigurationsList">Command `az maintenance configuration list`</a>
##### <a name="ExamplesMaintenanceConfigurationsList">Example</a>
```
az maintenance configuration list
```
##### <a name="ParametersMaintenanceConfigurationsList">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
#### <a name="MaintenanceConfigurationsGet">Command `az maintenance configuration show`</a>
##### <a name="ExamplesMaintenanceConfigurationsGet">Example</a>
```
az maintenance configuration show --resource-group "examplerg" --resource-name "configuration1"
```
##### <a name="ParametersMaintenanceConfigurationsGet">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource Group Name|resource_group_name|resourceGroupName|
|**--resource-name**|string|Maintenance Configuration Name|resource_name|resourceName|
#### <a name="MaintenanceConfigurationsCreateOrUpdate#Create">Command `az maintenance configuration create`</a>
##### <a name="ExamplesMaintenanceConfigurationsCreateOrUpdate#Create">Example</a>
```
az maintenance configuration create --location "westus2" --maintenance-scope "Host" --maintenance-window-duration \
"05:00" --maintenance-window-expiration-date-time "9999-12-31 00:00" --maintenance-window-recur-every "Day" \
--maintenance-window-start-date-time "2025-04-30 08:00" --maintenance-window-time-zone "Pacific Standard Time" \
--namespace "Microsoft.Maintenance" --visibility "Custom" --resource-group "examplerg" --resource-name \
"configuration1"
```
##### <a name="ParametersMaintenanceConfigurationsCreateOrUpdate#Create">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource Group Name|resource_group_name|resourceGroupName|
|**--resource-name**|string|Maintenance Configuration Name|resource_name|resourceName|
|**--location**|string|Gets or sets location of the resource|location|location|
|**--tags**|dictionary|Gets or sets tags of the resource|tags|tags|
|**--namespace**|string|Gets or sets namespace of the resource|namespace|namespace|
|**--extension-properties**|dictionary|Gets or sets extensionProperties of the maintenanceConfiguration|extension_properties|extensionProperties|
|**--maintenance-scope**|choice|Gets or sets maintenanceScope of the configuration|maintenance_scope|maintenanceScope|
|**--visibility**|choice|Gets or sets the visibility of the configuration. The default value is 'Custom'|visibility|visibility|
|**--start-date-time**|string|Effective start date of the maintenance window in YYYY-MM-DD hh:mm format. The start date can be set to either the current date or future date. The window will be created in the time zone provided and adjusted to daylight savings according to that time zone.|start_date_time|startDateTime|
|**--expiration-date-time**|string|Effective expiration date of the maintenance window in YYYY-MM-DD hh:mm format. The window will be created in the time zone provided and adjusted to daylight savings according to that time zone. Expiration date must be set to a future date. If not provided, it will be set to the maximum datetime 9999-12-31 23:59:59.|expiration_date_time|expirationDateTime|
|**--duration**|string|Duration of the maintenance window in HH:mm format. If not provided, the default value will be used based on the maintenance scope provided. Example: 05:00.|duration|duration|
|**--time-zone**|string|Name of the timezone. List of timezones can be obtained by executing [System.TimeZoneInfo]::GetSystemTimeZones() in PowerShell. Example: Pacific Standard Time, UTC, W. Europe Standard Time, Korea Standard Time, Cen. Australia Standard Time.|time_zone|timeZone|
|**--recur-every**|string|Rate at which a Maintenance window is expected to recur. The rate can be expressed as daily, weekly, or monthly schedules. Daily schedules are formatted as recurEvery: [Frequency as integer]['Day(s)']. If no frequency is provided, the default frequency is 1. Daily schedule examples are recurEvery: Day, recurEvery: 3Days. Weekly schedules are formatted as recurEvery: [Frequency as integer]['Week(s)'] [Optional comma separated list of weekdays Monday-Sunday]. Weekly schedule examples are recurEvery: 3Weeks, recurEvery: Week Saturday,Sunday. Monthly schedules are formatted as [Frequency as integer]['Month(s)'] [Comma separated list of month days] or [Frequency as integer]['Month(s)'] [Week of Month (First, Second, Third, Fourth, Last)] [Weekday Monday-Sunday]. Monthly schedule examples are recurEvery: Month, recurEvery: 2Months, recurEvery: Month day23,day24, recurEvery: Month Last Sunday, recurEvery: Month Fourth Monday.|recur_every|recurEvery|
#### <a name="MaintenanceConfigurationsUpdate">Command `az maintenance configuration update`</a>
##### <a name="ExamplesMaintenanceConfigurationsUpdate">Example</a>
```
az maintenance configuration update --location "westus2" --maintenance-scope "Host" --maintenance-window-duration \
"05:00" --maintenance-window-expiration-date-time "9999-12-31 00:00" --maintenance-window-recur-every "Month Third \
Sunday" --maintenance-window-start-date-time "2025-04-30 08:00" --maintenance-window-time-zone "Pacific Standard Time" \
--namespace "Microsoft.Maintenance" --visibility "Custom" --resource-group "examplerg" --resource-name \
"configuration1"
```
##### <a name="ParametersMaintenanceConfigurationsUpdate">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource Group Name|resource_group_name|resourceGroupName|
|**--resource-name**|string|Maintenance Configuration Name|resource_name|resourceName|
|**--location**|string|Gets or sets location of the resource|location|location|
|**--tags**|dictionary|Gets or sets tags of the resource|tags|tags|
|**--namespace**|string|Gets or sets namespace of the resource|namespace|namespace|
|**--extension-properties**|dictionary|Gets or sets extensionProperties of the maintenanceConfiguration|extension_properties|extensionProperties|
|**--maintenance-scope**|choice|Gets or sets maintenanceScope of the configuration|maintenance_scope|maintenanceScope|
|**--visibility**|choice|Gets or sets the visibility of the configuration. The default value is 'Custom'|visibility|visibility|
|**--start-date-time**|string|Effective start date of the maintenance window in YYYY-MM-DD hh:mm format. The start date can be set to either the current date or future date. The window will be created in the time zone provided and adjusted to daylight savings according to that time zone.|start_date_time|startDateTime|
|**--expiration-date-time**|string|Effective expiration date of the maintenance window in YYYY-MM-DD hh:mm format. The window will be created in the time zone provided and adjusted to daylight savings according to that time zone. Expiration date must be set to a future date. If not provided, it will be set to the maximum datetime 9999-12-31 23:59:59.|expiration_date_time|expirationDateTime|
|**--duration**|string|Duration of the maintenance window in HH:mm format. If not provided, the default value will be used based on the maintenance scope provided. Example: 05:00.|duration|duration|
|**--time-zone**|string|Name of the timezone. List of timezones can be obtained by executing [System.TimeZoneInfo]::GetSystemTimeZones() in PowerShell. Example: Pacific Standard Time, UTC, W. Europe Standard Time, Korea Standard Time, Cen. Australia Standard Time.|time_zone|timeZone|
|**--recur-every**|string|Rate at which a Maintenance window is expected to recur. The rate can be expressed as daily, weekly, or monthly schedules. Daily schedules are formatted as recurEvery: [Frequency as integer]['Day(s)']. If no frequency is provided, the default frequency is 1. Daily schedule examples are recurEvery: Day, recurEvery: 3Days. Weekly schedules are formatted as recurEvery: [Frequency as integer]['Week(s)'] [Optional comma separated list of weekdays Monday-Sunday]. Weekly schedule examples are recurEvery: 3Weeks, recurEvery: Week Saturday,Sunday. Monthly schedules are formatted as [Frequency as integer]['Month(s)'] [Comma separated list of month days] or [Frequency as integer]['Month(s)'] [Week of Month (First, Second, Third, Fourth, Last)] [Weekday Monday-Sunday]. Monthly schedule examples are recurEvery: Month, recurEvery: 2Months, recurEvery: Month day23,day24, recurEvery: Month Last Sunday, recurEvery: Month Fourth Monday.|recur_every|recurEvery|
#### <a name="MaintenanceConfigurationsDelete">Command `az maintenance configuration delete`</a>
##### <a name="ExamplesMaintenanceConfigurationsDelete">Example</a>
```
az maintenance configuration delete --resource-group "examplerg" --resource-name "example1"
```
##### <a name="ParametersMaintenanceConfigurationsDelete">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource Group Name|resource_group_name|resourceGroupName|
|**--resource-name**|string|Maintenance Configuration Name|resource_name|resourceName|
### group `az maintenance configuration-for-resource-group`
#### <a name="MaintenanceConfigurationsForResourceGroupList">Command `az maintenance configuration-for-resource-group list`</a>
##### <a name="ExamplesMaintenanceConfigurationsForResourceGroupList">Example</a>
```
az maintenance configuration-for-resource-group list --resource-group "examplerg"
```
##### <a name="ParametersMaintenanceConfigurationsForResourceGroupList">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource Group Name|resource_group_name|resourceGroupName|
### group `az maintenance public-configuration`
#### <a name="PublicMaintenanceConfigurationsList">Command `az maintenance public-configuration list`</a>
##### <a name="ExamplesPublicMaintenanceConfigurationsList">Example</a>
```
az maintenance public-configuration list
```
##### <a name="ParametersPublicMaintenanceConfigurationsList">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
#### <a name="PublicMaintenanceConfigurationsGet">Command `az maintenance public-configuration show`</a>
##### <a name="ExamplesPublicMaintenanceConfigurationsGet">Example</a>
```
az maintenance public-configuration show --resource-name "configuration1"
```
##### <a name="ParametersPublicMaintenanceConfigurationsGet">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-name**|string|Maintenance Configuration Name|resource_name|resourceName|
### group `az maintenance update`
#### <a name="UpdatesList">Command `az maintenance update list`</a>
##### <a name="ExamplesUpdatesList">Example</a>
```
az maintenance update list --provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name "smdtest1" \
--resource-type "virtualMachineScaleSets"
```
##### <a name="ParametersUpdatesList">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
#### <a name="UpdatesListParent">Command `az maintenance update list-parent`</a>
##### <a name="ExamplesUpdatesListParent">Example</a>
```
az maintenance update list-parent --provider-name "Microsoft.Compute" --resource-group "examplerg" --resource-name "1" \
--resource-parent-name "smdtest1" --resource-parent-type "virtualMachineScaleSets" --resource-type "virtualMachines"
```
##### <a name="ParametersUpdatesListParent">Parameters</a>
|Option|Type|Description|Path (SDK)|Swagger name|
|------|----|-----------|----------|------------|
|**--resource-group-name**|string|Resource group name|resource_group_name|resourceGroupName|
|**--provider-name**|string|Resource provider name|provider_name|providerName|
|**--resource-parent-type**|string|Resource parent type|resource_parent_type|resourceParentType|
|**--resource-parent-name**|string|Resource parent identifier|resource_parent_name|resourceParentName|
|**--resource-type**|string|Resource type|resource_type|resourceType|
|**--resource-name**|string|Resource identifier|resource_name|resourceName|
| 74.791284 | 981 | 0.761508 | yue_Hant | 0.678127 |
da2791b76333ca93eafa7ab8146fab377d4114ca | 4,580 | md | Markdown | _pages/cv.md | alexhambley/alexhambley.github.io | 7b8fbd2e7391cd5481df5dcb3c31780ca3562e94 | [
"MIT"
] | null | null | null | _pages/cv.md | alexhambley/alexhambley.github.io | 7b8fbd2e7391cd5481df5dcb3c31780ca3562e94 | [
"MIT"
] | null | null | null | _pages/cv.md | alexhambley/alexhambley.github.io | 7b8fbd2e7391cd5481df5dcb3c31780ca3562e94 | [
"MIT"
] | null | null | null | ---
layout: archive
title: "CV"
permalink: /cv/
author_profile: true
redirect_from:
- /resume
---
{% include base_path %}
A PDF version of my CV can be <a href="/files/cv.pdf" target="_blank">found here.</a> Alternatively, my <a href="https://www.linkedin.com/in/alexanderhambley/" target="_blank">LinkedIn page</a> summarises my experience.
Education
======
* **PhD in Computer Science**, University of Manchester.
<br>
2020 - 2023 (expected).
* <b>PhD Thesis</b>: Empirical web accessibility evaluation for blind web users.
<br>
Supervised by Prof. Simon Harper, Dr. Yeliz Yesilada and Dr. Markel Vigo.
* **BSc (Hons) in Computer Science**. First Class Honours. The University of Nottingham.
<br>
2016 - 2019.
* <b>Dissertation</b>: A study of the addictive nature of smartphones and social media.
<br>
Supervised by Dr. Gail Hopkins.
***
Experience
======
* **Graduate Instructor**. The University of Manchester.
<br>
January 2020 - Present.
* <b>User Experience</b>: Marked and moderated coursework for over 150 students. Provided formative and summative feedback.
* <b>Database Systems</b>: Ran laboratory sessions for 30 students at a time. Assessed and moderated coursework for over 50 students. Presented formative and summative feedback.
* <b>Introduction to Programming II</b>: Coordinated focused laboratory sessions. Marked and moderated coursework. Provided formative and summative feedback. Facilitated group work.
* <b>First Year Group Project</b>: Advised student projects. Provided formative feedback. Facilitated group work.
* **Software Engineer (Intern)**. HeX Productions, Nottingham.
<br>
May 2019 - September 2019.
* Impact: Established a successful new technology event in Nottingham. Created a new accessible website for the event. Modified and repaired several existing public sector websites for conformity with WCAG 2.1 AA.
* Created an accessible e-Learning website and presented talks on accessible web development to over 30 developers and industry experts.
* Skills: Developed websites using technologies such as WordPress, Terminalfour, PHP, JavaScript, and CSS. Worked as a full-stack developer with accessibility in mind, developing a practical understanding of WCAG, presenting, public speaking and organising skills.
* **IT Support Tutor**. The University of Nottingham.
<br>
August 2018 - December 2018.
* Impact: Advised over 100 students with registration difficulties, timetable problems and student ID concerns.
* Skills: Advanced independent and collaborative working skills. Problem solving. Interpersonal skills.
* **Research Software Developer (Intern)**. The University of Twente (UT).
<br>
June 2018 - August 2018.
* Impact: Developed prototype social robots at the UT DesignLab to assist children with ASD. Robots would vary facial movements to mimic human faces.
* Skills: Demonstrated ability to think critically and to communicate effectively with a diverse group of people. Displayed ability to create rapid, effective, low- and high-fidelity prototypes.
***
Awards
======
* <a href="/publication/2022-4-27">**Best Communication Paper**</a>. <a href="https://www.w4a.info/2022/" target="_blank">The 19th International Web for All Conference</a>. April 2022.
* **EPSRC Doctoral Studentship** (Engineering and Physical Sciences Research Council). January 2020 - Present.
***
Professional Certificates
======
* **Designing Effective Science Communication**. University of Colorado Boulder. Coursera. June 2022.
* **Accessible Documents: Word, PowerPoint, Acrobat**. WebAIM. August 2019.
***
Professional Activities and Reviewing
======
* **Publicity Chair**. <a href="https://www.w4a.info/2022/" target="_blank">The 19th International Web for All Conference</a>.
<br>
September 2021 - April 2022.
* Refined mailing lists, disseminated calls for papers and ran conference social media channels. Compiled and uploaded video recordings of the conference proceedings. Liaised with authors, organising and steering committee.
* **Journal Peer Reviews**:
* ACM Transactions on the Web (TWEB): 2 Reviews in 2022.
* ACM Transactions on Accessible Computing (TACCESS): 2021.
***
Volunteering
======
* **Mentor**. ClickSilver (Capital One).
<br>
August 2018 - December 2018.
* Mentored an older person with memory difficulties in the use of computers. The older person struggled to transition short-term memories into longer-term memories. Developed unique ways of teaching relatively complex ideas to someone with memory problems. | 49.247312 | 266 | 0.75131 | eng_Latn | 0.916416 |
da2a7e6e187ac92bfd45d622e6581d0bffe59da6 | 48 | md | Markdown | category/Linux-kernel.md | YWHyuk/YWHyuk.github.io | 2d9cd2c499f60567c0534e4f1f7b3483211fe73b | [
"MIT"
] | null | null | null | category/Linux-kernel.md | YWHyuk/YWHyuk.github.io | 2d9cd2c499f60567c0534e4f1f7b3483211fe73b | [
"MIT"
] | null | null | null | category/Linux-kernel.md | YWHyuk/YWHyuk.github.io | 2d9cd2c499f60567c0534e4f1f7b3483211fe73b | [
"MIT"
] | null | null | null | ---
layout: category
title: Linux-kernel
---
| 6 | 19 | 0.625 | eng_Latn | 0.653723 |
da2ae366cb61f52f2e1cf7929ba5194bde0731f6 | 105 | md | Markdown | README.md | tqbdev/CNM-TH2015-CK | 47bc99a850ecf1a4608ece53a9479a5abc4f7f26 | [
"Apache-2.0"
] | null | null | null | README.md | tqbdev/CNM-TH2015-CK | 47bc99a850ecf1a4608ece53a9479a5abc4f7f26 | [
"Apache-2.0"
] | null | null | null | README.md | tqbdev/CNM-TH2015-CK | 47bc99a850ecf1a4608ece53a9479a5abc4f7f26 | [
"Apache-2.0"
] | 1 | 2019-10-07T00:42:47.000Z | 2019-10-07T00:42:47.000Z | # CNM-TH2015-CK
Final project for the CNM-TH2015 course: a banking application with money transfers and partial blockchain integration.
| 35 | 88 | 0.752381 | vie_Latn | 1.000007 |
da2b4b29703f3f9dee64f65ea7939e793b952cc1 | 2,965 | md | Markdown | _posts/2016-11-04-logging-out-and-then-logging-in-throws-403-error-with-csrf-protector.md | mebjas/mebjas.github.io | 94eac9216e56cf1c148f980df8cd40b21e017192 | [
"Apache-2.0"
] | 10 | 2020-02-08T04:00:01.000Z | 2022-02-24T06:31:08.000Z | _posts/2016-11-04-logging-out-and-then-logging-in-throws-403-error-with-csrf-protector.md | mebjas/mebjas.github.io | 94eac9216e56cf1c148f980df8cd40b21e017192 | [
"Apache-2.0"
] | 20 | 2020-04-01T07:51:22.000Z | 2022-03-29T06:55:16.000Z | _posts/2016-11-04-logging-out-and-then-logging-in-throws-403-error-with-csrf-protector.md | mebjas/mebjas.github.io | 94eac9216e56cf1c148f980df8cd40b21e017192 | [
"Apache-2.0"
] | 6 | 2020-08-15T16:52:24.000Z | 2021-12-29T13:06:41.000Z | ---
layout: post
title: Logging out and then logging in throws 403 error with CSRF Protector PHP – fix / workaround
categories: [csrf, javascript, open-source, owasp, security, web-security, php]
description: "Recently an interesting bug came up in CSRF Protector PHP. If you log out of your website and then try to login again there only, CSRF Protector throws 403 – forbidden response. So this comes by design because first thing that you do in your logout script is, initiate CSRF Protector > let it do it’s stuff and then destroy session to logout the user. Now this screws everything because CSRFP is dependent on tokens it store in session variables. So next time you try to login again which is a POST request, it’s unable to validate the incoming token and throws 403 or whatever is the failedValdiationResponse in your config."
post-no: 3
toc: false
---
{:width="750px"}
## Problem
Recently an interesting bug came up in CSRF Protector PHP. Read the entire [issue thread on Github](https://github.com/mebjas/CSRF-Protector-PHP/issues/51).
> If you log out of your website and then try to login again there only, CSRF Protector throws 403 – forbidden response.
This happens by design: the first thing you do in your logout script is initiate CSRF Protector and let it do its work, and then you destroy the session to log the user out. That breaks everything, because CSRFP depends on the tokens it stores in session variables. So the next time you try to log in, which is a POST request, it is unable to validate the incoming token and throws 403, or whatever the failedValidationResponse in your config is.
## Solution
Now, since this is a very particular issue, I don't feel the library should handle it implicitly. What needs to be done is:
1. Do not destroy the entire session in logout – just unset the key you use for authentication. This might require you to change your design, which is far more difficult than not using CSRFP at all. So use this approach if your user authentication depends on particular keys in session variables, as in the sketch below.
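A minimal sketch of option 1 (the `user_id` key is illustrative; unset whatever your app actually stores for authentication):
```php
// Keep the session (and CSRFP's token in it) alive; drop only the auth data.
unset($_SESSION['user_id']);
```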
2. Now this may look like a university-student workaround: store the existing tokens in a temp variable, destroy the session, then start a session again and restore `$temp` back into the session.
```php
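// Stash CSRF Protector's token before tearing the session down.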
if (isset($_SESSION["CSRFP_TOKEN"])) $temp = $_SESSION["CSRFP_TOKEN"];
// Ideally the "CSRFP_TOKEN" key name would be pulled from the config instead of being hard-coded here.
@session_unset();
@session_destroy();
@session_start();
if (isset($temp)) $_SESSION["CSRFP_TOKEN"] = $temp;
```
3. **Best way to deal with this would be**: if you are using `session_destroy()` to log out, call `csrfProtector::init();` afterwards. This ensures the user is logged out, and `init()` creates a new session with new tokens in it.
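For completeness, a sketch of option 3 as a full logout script; the include path assumes the library's usual layout under `libs/csrf/`:
```php
<?php
// logout.php (sketch)
include_once __DIR__ . '/libs/csrf/csrfprotector.php';

// Drop the authenticated session entirely.
@session_unset();
@session_destroy();

// Re-initialize CSRF Protector: init() starts a fresh session and stores
// a newly generated token in it, so the next POST (the login) validates.
csrfProtector::init();
```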
Hope these workarounds work for you. Otherwise, feel free to reach out to me by posting another issue on [mebjas/CSRF-Protector-PHP](https://github.com/mebjas/CSRF-Protector-PHP) | 89.848485 | 640 | 0.760202 | eng_Latn | 0.995379 |
da2bc13dc49fb918821b87fb2012a52a3e00854a | 1,085 | md | Markdown | README.md | simra/CartridgeOCR | 445eb51b93c9297dedee076be8b197e6d7697b5d | [
"MIT"
] | null | null | null | README.md | simra/CartridgeOCR | 445eb51b93c9297dedee076be8b197e6d7697b5d | [
"MIT"
] | null | null | null | README.md | simra/CartridgeOCR | 445eb51b93c9297dedee076be8b197e6d7697b5d | [
"MIT"
] | null | null | null | # CartridgeOCR
Final deck: https://1drv.ms/p/s!Aq_TlvfieKvqu8t5DYBMbiD91PxE6Q?e=STaglB
## Roadmap
Some areas to explore:
- CNN training from a few examples. We have several options for fine-tuning: YOLO, Azure ML ResNets, torchvision
- given an extraction, unroll it to optimize OCR.
- other enhancements to improve OCR
- labeling and storage workspaces
- mobile app workflow
## Dev environment
https://pytorch.org/tutorials/intermediate/torchvision_tutorial.html
- conda install cython
- conda install jupyter
- pip install opencv-python
- pip install git+https://github.com/gautamchitnis/cocoapi.git@cocodataset-master#subdirectory=PythonAPI
- pip install pillow
- conda install matplotlib
- pip install azureml-sdk
- pip install azure.cli.core
- pip install azureml-contrib-dataset
## Explore YOLO
- wget https://pjreddie.com/media/files/yolov3.weights
## Explore torchvision
- potentially important: https://github.com/pytorch/vision/issues/2720
Torchvision todo (a minimal fine-tuning sketch follows this list):
- move to a GPU box.
- double check batch size, epoch size
- visualize outputs
- understand evaluation outputs.
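A minimal fine-tuning sketch following the torchvision tutorial linked above; the two-class setup (background plus one cartridge class) is an assumption, not fixed by this repo:
```python
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

def build_model(num_classes):
    # Start from a COCO-pretrained Faster R-CNN and swap in a new box
    # predictor sized for our classes, as in the torchvision tutorial.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model

model = build_model(num_classes=2)  # background + cartridge (assumption)
```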
| 26.463415 | 108 | 0.787097 | eng_Latn | 0.608072 |
da2c14e7648a17809bf26027b0a02b4920af61af | 45 | md | Markdown | PHASE_1/TestScripts/api/readme.md | vicinx3/disease-outbreak | 035e78875c374e2cdbd4720a4f2ed1370f63a88c | [
"MIT"
] | null | null | null | PHASE_1/TestScripts/api/readme.md | vicinx3/disease-outbreak | 035e78875c374e2cdbd4720a4f2ed1370f63a88c | [
"MIT"
] | null | null | null | PHASE_1/TestScripts/api/readme.md | vicinx3/disease-outbreak | 035e78875c374e2cdbd4720a4f2ed1370f63a88c | [
"MIT"
] | null | null | null | # Description
Test files for API endpoints.
| 11.25 | 29 | 0.777778 | eng_Latn | 0.454735 |
da2cd29266f3eb152d4b96dee0cb35c32265bada | 31 | md | Markdown | docs/course/rbac/4-ui-navigation/demo/9__ui/src/layout/index/__index.component.scss.md | sunlu/ng-nest | bc97c0f47b44ed17ed5669b4e65c3fc1a1d25b3d | [
"MIT"
] | 149 | 2019-12-15T13:33:13.000Z | 2022-03-28T13:18:28.000Z | docs/course/rbac/4-ui-navigation/demo/9__ui/src/layout/index/__index.component.scss.md | sunlu/ng-nest | bc97c0f47b44ed17ed5669b4e65c3fc1a1d25b3d | [
"MIT"
] | 180 | 2020-10-18T08:57:39.000Z | 2022-03-30T10:45:16.000Z | docs/course/rbac/4-ui-navigation/demo/9__ui/src/layout/index/__index.component.scss.md | sunlu/ng-nest | bc97c0f47b44ed17ed5669b4e65c3fc1a1d25b3d | [
"MIT"
] | 18 | 2020-03-13T02:40:30.000Z | 2021-10-21T11:22:27.000Z | ---
primary: '1,10,48-106'
---
| 7.75 | 22 | 0.483871 | eng_Latn | 0.18614 |
da2f6d46cb1ab3178fbd7611b9d72534a1e386c5 | 1,526 | md | Markdown | memdocs/intune/user-help/install-office-windows.md | Mdlglobal-atlassian-net/memdocs.pt-pt | 5ee72251572da436f8c758346a3505f167a85c9a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | memdocs/intune/user-help/install-office-windows.md | Mdlglobal-atlassian-net/memdocs.pt-pt | 5ee72251572da436f8c758346a3505f167a85c9a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | memdocs/intune/user-help/install-office-windows.md | Mdlglobal-atlassian-net/memdocs.pt-pt | 5ee72251572da436f8c758346a3505f167a85c9a | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-05-28T15:46:01.000Z | 2020-05-28T15:46:01.000Z | ---
title: Install Office 365 on your Windows 10 device | Microsoft Docs
description: ''
keywords: ''
author: lenewsad
ms.author: lanewsad
manager: dougeby
ms.date: 02/21/2018
ms.topic: end-user-help
ms.prod: ''
ms.service: microsoft-intune
ms.subservice: end-user
ms.technology: ''
ms.assetid: 42e26c51-5373-4c2e-9321-34d85560f3d1
searchScope:
- User help
ROBOTS: ''
ms.reviewer: aiwang
ms.suite: ems
ms.custom: intune-enduser
ms.collection: ''
ms.openlocfilehash: a49d5de1ce0dd91ff9dacb07d50970bc9ef2e90d
ms.sourcegitcommit: a77ba49424803fddcaf23326f1befbc004e48ac9
ms.translationtype: MT
ms.contentlocale: pt-PT
ms.lasthandoff: 05/27/2020
ms.locfileid: "83882158"
---
# <a name="installing-office-365-on-your-windows-10-device"></a>Install Office 365 on your Windows 10 device
There are several ways to install the Office suite of apps on your Windows 10 device. Depending on the size of your company, multiple versions of Office may be available for download in the Company Portal.
![Example screenshot of the Company Portal website, Apps page](./media/win10-cp-install-office-abbreviated-menu.png)
When your company makes Office available, install only one version of Office. If you try to install both versions, the one you installed first will be removed.
Still need help? Contact your company support. To find their contact information, check the [Company Portal website](https://go.microsoft.com/fwlink/?linkid=2010980).
| 40.157895 | 230 | 0.794889 | por_Latn | 0.957274 |
da2fb159ea246a8a823608ed6818e09d8c3bdf43 | 19,017 | md | Markdown | articles/storage/storage-client-side-encryption.md | jaime-espinosa/azure-content | 7233c847da36c71e9b2fcf37126d17dba99dc886 | [
"CC-BY-3.0"
] | null | null | null | articles/storage/storage-client-side-encryption.md | jaime-espinosa/azure-content | 7233c847da36c71e9b2fcf37126d17dba99dc886 | [
"CC-BY-3.0"
] | null | null | null | articles/storage/storage-client-side-encryption.md | jaime-espinosa/azure-content | 7233c847da36c71e9b2fcf37126d17dba99dc886 | [
"CC-BY-3.0"
] | 1 | 2019-09-10T17:51:26.000Z | 2019-09-10T17:51:26.000Z | <properties
pageTitle="Get Started with Client-Side Encryption for Microsoft Azure Storage (Preview) | Microsoft Azure"
description="The Azure Storage Client Library for .NET preview offers support for client-side encryption and integration with Azure Key Vault. Client-side encryption offers maximum security for your Azure Storage applications, as your access keys are never available to the service. Client-side encryption is available for blobs, queues, and tables."
services="storage"
documentationCenter=".net"
authors="tamram"
manager="carolz"
editor=""/>
<tags
ms.service="storage"
ms.workload="storage"
ms.tgt_pltfrm="na"
ms.devlang="na"
ms.topic="article"
ms.date="06/18/2015"
ms.author="tamram"/>
# Get Started with Client-Side Encryption for Microsoft Azure Storage (Preview)
## Overview
Welcome to the [preview of the new Azure Storage Client Library for .NET](https://www.nuget.org/packages/WindowsAzure.Storage/4.4.1-preview). This preview library contains new functionality to help developers encrypt data inside client applications before uploading to Azure Storage, and to decrypt data while downloading. The preview library also supports integration with Azure [Key Vault](http://azure.microsoft.com/services/key-vault/) for storage account key management.
## Encryption and decrpyption via the envelope technique
The processes of encryption and decryption follow the envelope technique.
### Encryption via the envelope technique
Encryption via the envelope technique works in the following way:
1. The Azure storage client library generates a content encryption key (CEK), which is a one-time-use symmetric key.
2. User data is encrypted using this CEK.
3. The CEK is then wrapped (encrypted) using the key encryption key (KEK). The KEK is identified by a key identifier and can be an asymmetric key pair or a symmetric key and can be managed locally or stored in Azure Key Vaults.
The storage client library itself never has access to KEK. The library invokes the key wrapping algorithm that is provided by Key Vault. Users can choose to use custom providers for key wrapping/unwrapping if desired.
4. The encrypted data is then uploaded to the Azure Storage service. The wrapped key along with some additional encryption metadata is either stored as metadata (on a blob) or interpolated with the encrypted data (queue messages and table entities).
### Decryption via the envelope technique
Decryption via the envelope technique works in the following way:
1. The client library assumes that the user is managing the key encryption key (KEK) either locally or in Azure Key Vaults. The user does not need to know the specific key that was used for encryption. Instead, a key resolver which resolves different key identifiers to keys can be set up and used.
2. The client library downloads the encrypted data along with any encryption material that is stored on the service.
3. The wrapped content encryption key (CEK) is then unwrapped (decrypted) using the key encryption key (KEK). Here again, the client library does not have access to KEK. It simply invokes the custom or Key Vault provider’s unwrapping algorithm.
4. The content encryption key (CEK) is then used to decrypt the encrypted user data.
## Encryption Mechanism
The storage client library uses [AES](http://en.wikipedia.org/wiki/Advanced_Encryption_Standard) in order to encrypt user data. Specifically, [Cipher Block Chaining (CBC)](http://en.wikipedia.org/wiki/Block_cipher_mode_of_operation#Cipher-block_chaining_.28CBC.29) mode with AES. Each service works somewhat differently, so we will discuss each of them here.
### Blobs
In the preview version, the client library supports encryption of whole blobs only. Specifically, encryption is supported when users use **UploadFrom*** methods or **BlobWriteStream**. For downloads, both full and range downloads are supported.
During encryption, the client library will generate a random Initialization Vector (IV) of 16 bytes, together with a random content encryption key (CEK) of 32 bytes, and perform envelope encryption of the blob data using this information. The wrapped CEK and some additional encryption metadata are then stored as blob metadata along with the encrypted blob on the service.
> [AZURE.WARNING] If you are editing or uploading your own metadata for the blob, you need to ensure that this metadata is preserved. If you upload new metadata without this metadata, the wrapped CEK, IV and other metadata will be lost and the blob content will never be retrievable again.
Downloading an encrypted blob involves retrieving the content of the entire blob using the **DownloadTo***/**BlobReadStream** convenience methods. The wrapped CEK is unwrapped and used together with the IV (stored as blob metadata in this case) to return the decrypted data to the users.
Downloading an arbitrary range (**DownloadRange*** methods) in the encrypted blob involves adjusting the range provided by users in order to get a small amount of additional data that can be used to successfully decrypt the requested range.
All blob types (block blobs and page blobs) can be encrypted/decrypted using this scheme.
### Queues
Since queue messages can be of any format, the client library defines a custom format that includes the Initialization Vector (IV) and the encrypted content encryption key (CEK) in the message text.
During encryption, the client library generates a random IV of 16 bytes along with a random CEK of 32 bytes and performs envelope encryption of the queue message text using this information. The wrapped CEK and some additional encryption metadata are then added to the encrypted queue message. This modified message (shown below) is stored on the service.
<MessageText>{"EncryptedMessageContents":"6kOu8Rq1C3+M1QO4alKLmWthWXSmHV3mEfxBAgP9QGTU++MKn2uPq3t2UjF1DO6w","EncryptionData":{…}}</MessageText>
During decryption, the wrapped key is extracted from the queue message and unwrapped. The IV is also extracted from the queue message and used along with the unwrapped key to decrypt the queue message data. Note that the encryption metadata is small (under 500 bytes), so while it does count toward the 64KB limit for a queue message, the impact should be manageable.
### Tables
In the preview version, the client library supports encryption of entity properties for insert and replace operations.
>[AZURE.NOTE] Merge is not currently supported. Since a subset of properties may have been encrypted previously using a different key, simply merging the new properties and updating the metadata will result in data loss. Merging either requires making extra service calls to read the pre-existing entity from the service, or using a new key per property, both of which are not suitable for performance reasons.
Table data encryption works as follows:
1. Users specify the properties that should be encrypted.
2. The client library generates a random Initialization Vector (IV) of 16 bytes along with a random content encryption key (CEK) of 32 bytes for every entity, and perform envelope encryption on the individual properties that should be encrypted by deriving a new IV per property.
3. The wrapped CEK and some additional encryption metadata are then stored as two additional reserved properties. The first reserved property (_ClientEncryptionMetadata1) is a string property that holds the information about IV, version, and wrapped key. The other reserved property (_ClientEncryptionMetadata2) is a binary property that holds the information about the properties that are encrypted.
4. Due to these additional reserved properties required for encryption, users may now have only 250 custom properties instead of 252. The total size of the entity must be less than 1MB.
Note that only string properties can be encrypted. If other types of properties are to be encrypted, they must be converted to strings.
For tables, in addition to the encryption policy, users must specify the properties that should be encrypted. This can be done by either specifying an [EncryptProperty] attribute (for POCO entities that derive from TableEntity) or an encryption resolver in request options. An encryption resolver is a delegate that takes a partition key, row key, and property name and returns a Boolean that indicates whether that property should be encrypted. During encryption, the client library will use this information to decide whether a property should be encrypted while writing to the wire. The delegate also provides for the possibility of logic around how properties are encrypted. (For example, if X, then encrypt property A; otherwise encrypt properties A and B.) Note that it is not necessary to provide this information while reading or querying entities.
### Batch Operations
In batch operations, the same KEK will be used across all the rows in that batch operation because the client library only allows one options object (and hence one policy/KEK) per batch operation. However, the client library will internally generate a new random IV and random CEK per row in the batch. Users can also choose to encrypt different properties for every operation in the batch by defining this behavior in the encryption resolver.
### Queries
To perform query operations, you must specify a key resolver that is able to resolve all the keys in the result set. If an entity contained in the query result cannot be resolved to a provider, the client library will throw an error. For any query that performs server side projections, the client library will add the special encryption metadata properties (_ClientEncryptionMetadata1 and _ClientEncryptionMetadata2) by default to the selected columns.
## Azure Key Vault
Azure Key Vault (Preview) helps safeguard cryptographic keys and secrets used by cloud applications and services. By using Azure Key Vault, users can encrypt keys and secrets (such as authentication keys, storage account keys, data encryption keys, .PFX files, and passwords) by using keys that are protected by hardware security modules (HSMs). For more information, see [What is Azure Key Vault?](../articles/key-vault-whatis.md).
The storage client library uses the Key Vault core library in order to provide a common framework across Azure for managing keys. Users also get the additional benefit of using the Key Vault extensions library. The extensions library provides useful functionality around simple and seamless Symmetric/RSA local and cloud key providers as well as with aggregation and caching.
### Interface and dependencies
There are three Key Vault packages:
- Microsoft.Azure.KeyVault.Core contains the IKey and IKeyResolver. It is a small package with no dependencies. The storage client librarys for .NET and Windows Phone define it as a dependency.
- Microsoft.Azure.KeyVault contains the Key Vault REST client.
- Microsoft.Azure.KeyVault.Extensions contains extension code that includes implementations of cryptographic algorithms and an RSAKey and a SymmetricKey. It depends on the Core and KeyVault namespaces and provides functionality to define an aggregate resolver (when users want to use multiple key providers) and a caching key resolver. Although the storage client library does not directly depend on this package, if users wish to use Azure Key Vault to store their keys or to use the Key Vault extensions to consume the local and cloud cryptographic providers, they will need this package.
Key Vault is designed for high-value master keys, and throttling limits per Key Vault are designed with this in mind. When performing client-side encryption with Key Vault, the preferred model is to use symmetric master keys stored as secrets in Key Vault and cached locally. Users must do the following:
1. Create a secret offline and upload it to Key Vault.
2. Use the secret's base identifier as a parameter to resolve the current version of the secret for encryption and cache this information locally. Use CachingKeyResolver for caching; users are not expected to implement their own caching logic.
3. Use the caching resolver as an input while creating the encryption policy.
More information regarding Key Vault usage can be found in the [encryption code samples](https://github.com/Azure/azure-storage-net/tree/preview/Samples/GettingStarted/EncryptionSamples).
### Best practices
Encryption support is available only in the storage client libraries for .NET and Windows Phone. Windows Runtime does not currently support encryption. Additionally, Key Vault extensions are not currently supported for Windows Phone. If you want to use storage client encryption on phone, you will need to implement your own key providers. Also, due to a limitation in the Windows Phone .NET platform, page blob encryption is currently not supported on Windows Phone.
>[AZURE.IMPORTANT] Be aware of these important points when using the preview library:
>
>- Do not use the preview library for production data. In the future, changes to the library will affect the schemas used. Decryption of data that has been encrypted with the preview library is not guaranteed in future versions.
>- When reading from or writing to an encrypted blob, use full blob upload commands and range/full blob download commands. Avoid writing to an encrypted blob using protocol operations such as Put Block, Put Block List, Write Pages, or Clear Pages; otherwise you may corrupt the encrypted blob and make it unreadable.
>- For tables, a similar constraint exists. Be careful to not update encrypted properties without updating the encryption metadata.
>- If you set metadata on the encrypted blob, you may overwrite the encryption-related metadata required for decryption, since seting metadata is not additive. This is also true for snapshots; avoid specifying metadata while creating a snapshot of an encrypted blob.
## Client API / Interface
While creating an EncryptionPolicy object, users can provide only a Key (implementing IKey), only a resolver (implementing IKeyResolver), or both. IKey is the basic key type that is identified using a key identifier and that provides the logic for wrapping/unwrapping. IKeyResolver is used to resolve a key during the decryption process. It defines a ResolveKey method that returns an IKey given a key identifier. This provides users the ability to choose between multiple keys that are managed in multiple locations.
- For encryption, the key is used always and the absence of a key will result in an error.
- For decryption:
- The key resolver is invoked if specified to get the key. If the resolver is specified but does not have a mapping for the key identifier, an error is thrown.
- If resolver is not specified but a key is specified, the key identifier on the key is against what is stored for the service.
The [encryption samples](https://github.com/Azure/azure-storage-net/tree/preview/Samples/GettingStarted/EncryptionSamples) demonstrate a more detailed end-to-end scenario for blobs, queues and tables, along with Key Vault integration.
### Blobs
Users will create a **BlobEncryptionPolicy** object and set it in the request options (per API or at a client level by using **DefaultRequestOptions**). Everything else will be handled by the client library internally.
// Create the IKey used for encryption.
RsaKey key = new RsaKey("private:key1" /* key identifier */);
// Create the encryption policy to be used for upload and download.
BlobEncryptionPolicy policy = new BlobEncryptionPolicy(key, null);
// Set the encryption policy on the request options.
BlobRequestOptions options = new BlobRequestOptions() { EncryptionPolicy = policy };
// Upload the encrypted contents to the blob.
blob.UploadFromStream(stream, size, null, options, null);
// Download and decrypt the encrypted contents from the blob.
MemoryStream outputStream = new MemoryStream();
blob.DownloadToStream(outputStream, null, options, null);
### Queues
Users will create a **QueueEncryptionPolicy** object and set it in the request options (per API or at a client level by using **DefaultRequestOptions**). Everything else will be handled by the client library internally.
// Create the IKey used for encryption.
RsaKey key = new RsaKey("private:key1" /* key identifier */);
// Create the encryption policy to be used for upload and download.
QueueEncryptionPolicy policy = new QueueEncryptionPolicy(key, null);
// Add message
QueueRequestOptions options = new QueueRequestOptions() { EncryptionPolicy = policy };
queue.AddMessage(message, null, null, options, null);
// Retrieve message
CloudQueueMessage retrMessage = queue.GetMessage(null, options, null);
### Tables
In addition to creating an encryption policy and setting it on request options, users will have to specify an **EncryptionResolver** in **TableRequestOptions** or set attributes on the entity.
#### Using the resolver
// Create the IKey used for encryption.
RsaKey key = new RsaKey("private:key1" /* key identifier */);
// Create the encryption policy to be used for upload and download.
TableEncryptionPolicy policy = new TableEncryptionPolicy(key, null);
TableRequestOptions options = new TableRequestOptions()
{
EncryptionResolver = (pk, rk, propName) =>
{
if (propName == "foo")
{
return true;
}
return false;
},
EncryptionPolicy = policy
};
// Insert Entity
currentTable.Execute(TableOperation.Insert(ent), options, null);
// Retrieve Entity
// No need to specify an encryption resolver for retrieve
TableRequestOptions retrieveOptions = new TableRequestOptions()
{
EncryptionPolicy = policy
};
TableOperation operation = TableOperation.Retrieve(ent.PartitionKey, ent.RowKey);
TableResult result = currentTable.Execute(operation, retrieveOptions, null);
#### Using attributes
As mentioned above, if the entity implements TableEntity, then the properties can be decorated with the [EncryptProperty] attribute instead of specifying the EncryptionResolver.
[EncryptProperty]
public string EncryptedProperty1 { get; set; }
## Next steps
[Client-Side Encryption for Microsoft Azure Storage – Preview](http://blogs.msdn.com/b/windowsazurestorage/archive/2015/04/28/client-side-encryption-for-microsoft-azure-storage-preview.aspx)
Download the [Azure Storage Client Library for .NET NuGet package](http://www.nuget.org/packages/WindowsAzure.Storage/4.4.0-preview)
Download the [Azure Storage Client Library for .NET Source Code](https://github.com/Azure/azure-storage-net/tree/preview) from GitHub
Download the Azure Key Vault NuGet [Core](http://www.nuget.org/packages/Microsoft.Azure.KeyVault.Core/), [Client](http://www.nuget.org/packages/Microsoft.Azure.KeyVault/), and [Extensions](http://www.nuget.org/packages/Microsoft.Azure.KeyVault.Extensions/) packages
Visit the [Azure Key Vault Documentation](../articles/key-vault-whatis.md)
| 79.2375 | 856 | 0.791976 | eng_Latn | 0.995851 |
da2fb6963b77d38fd7b98855825a6c1b952afcf7 | 250 | md | Markdown | _purposes/_drafts/motorcycle-theft-protection.md | chrisbo246/pickyvagabond | 0b643e961d79bbc5f8e092ad9a3887a0ab5c481e | [
"MIT"
] | null | null | null | _purposes/_drafts/motorcycle-theft-protection.md | chrisbo246/pickyvagabond | 0b643e961d79bbc5f8e092ad9a3887a0ab5c481e | [
"MIT"
] | null | null | null | _purposes/_drafts/motorcycle-theft-protection.md | chrisbo246/pickyvagabond | 0b643e961d79bbc5f8e092ad9a3887a0ab5c481e | [
"MIT"
] | null | null | null | ---
title: "Motorcycle theft protection"
description: "Lock your bike to a fixed point and deter thieves."
image: "https://images-na.ssl-images-amazon.com/images/I/510dMUrD9hL.jpg"
purposes: [motorcycling, theft-protection]
qualities: []
wiki: ~
---
| 27.777778 | 73 | 0.744 | eng_Latn | 0.772154 |
da2fcab435299971e7af69ff3ce0d326b8e100e4 | 1,606 | md | Markdown | README.md | wh1ter0se0/termtest | be2a4b9a38df6565ba9275abd7fc580111dca7d7 | [
"MIT"
] | null | null | null | README.md | wh1ter0se0/termtest | be2a4b9a38df6565ba9275abd7fc580111dca7d7 | [
"MIT"
] | null | null | null | README.md | wh1ter0se0/termtest | be2a4b9a38df6565ba9275abd7fc580111dca7d7 | [
"MIT"
] | null | null | null | # r6sRandomizer
A [Rainbow Six Siege](https://www.ubisoft.com/en-us/game/rainbow-six/siege) operator randomizer.
**Demo**

---
## Table of Contents
- [Installation](#installation)
- [Team](#team)
- [License](#license)
---
## Installation
### Clone
- Clone this repo to your local machine using `https://github.com/JMax45/r6sRandomizer`
### Setup
> move to the directory with the program
```shell
$ cd r6sRandomizer/build-r6sRandomizer-Desktop-Debug/
```
> run the program
```shell
$ chmod +x r6sRandomizer
$ ./r6sRandomizer
```
---
## Team
| <a href="https://www.jz-software.com" target="_blank">**JMax**</a> | <a href="https://www.jz-software.com" target="_blank">**Zeverotti**</a> |
| :---: |:---:|
| [](https://www.jz-software.com) | [](https://www.jz-software.com) |
| <a href="https://github.com/JMax45" target="_blank">`github.com/JMax45`</a> | <a href="https://github.com/Zeverotti" target="_blank">`github.com/Zeverotti`</a> |
---
## License
[](http://badges.mit-license.org)
- **[MIT license](http://opensource.org/licenses/mit-license.php)**
- Copyright 2020 © <a href="https://www.jz-software.com" target="_blank">JZ-Software</a>.
| 26.327869 | 314 | 0.693026 | yue_Hant | 0.32521 |
da30203a2593c9b9ef932a63e9dfdda4d2709795 | 18,427 | md | Markdown | README.md | JULIELab/trec-pm | 1993f3e0a373ea169129c4b3a11c956d9f502f4d | [
"MIT"
] | 8 | 2019-04-25T08:21:06.000Z | 2021-01-15T18:31:00.000Z | README.md | JULIELab/trec-pm | 1993f3e0a373ea169129c4b3a11c956d9f502f4d | [
"MIT"
] | 87 | 2019-06-12T14:32:06.000Z | 2020-04-30T16:57:48.000Z | README.md | JULIELab/trec-pm | 1993f3e0a373ea169129c4b3a11c956d9f502f4d | [
"MIT"
] | 4 | 2019-06-19T13:37:21.000Z | 2020-02-16T03:39:20.000Z | # TREC-PM (Precision Medicine)
A repository containing support code and resources initially developed at the [Institute for Medical Informatics, Statistics and Documentation at the Medical University of Graz (Austria)](https://www.medunigraz.at/imi/en/) for participation at the [2017 TREC Precision Medicine Track](http://trec-cds.appspot.com/2017.html). For further information on this track and the final results please check the official [TREC-PM 2017 overview paper](https://trec.nist.gov/pubs/trec26/papers/Overview-PM.pdf). Team name: **imi_mug**
It was then further improved for participation at the [2018 TREC Precision Medicine Track](http://trec-cds.appspot.com/2018.html). Improvements include: support for subtemplates and the possibility to use disjunctive queries (_dis\_max_) allowing e.g. synonyms and hypernyms to have different weights. Team name: **hpi-dhc**.
A systematic study of query expansion and boosting mechanisms (including optimal `b` and `k1` BM25 hyperparameters) is available on the branch [sigir20](https://github.com/JULIELab/trec-pm/tree/sigir20).
## Citing
If you use `imi_mug`'s original data or code in your work, please cite their [TREC 2017 proceedings paper](https://trec.nist.gov/pubs/trec26/papers/imi_mug-PM.pdf):
*Pablo López-García, Michel Oleynik, Zdenko Kasáč and Stefan Schulz. TREC 2017 Precision Medicine - Medical University of Graz. Text REtrieval Conference, Gaithersburg, MD. 2017. Available at https://trec.nist.gov/pubs/trec26/papers/imi_mug-PM.pdf.*
If you use any of the improvements mentioned above, please also cite our [TREC 2018 proceedings paper](https://trec.nist.gov/pubs/trec27/papers/hpi-dhc-PM.pdf):
*Michel Oleynik, Erik Faessler, Ariane Morassi Sasso, et. al. HPI-DHC at TREC 2018 Precision Medicine Track. Text REtrieval Conference, Gaithersburg, MD. 2018. Available at https://trec.nist.gov/pubs/trec27/papers/hpi-dhc-PM.pdf.*
Lastly, if you use the optimal hyperparemeters mentioned above, please cite our [SIGIR 2020 paper](https://doi.org/10.1145/3397271.3401048).
*Erik Faessler, Michel Oleynik, and Udo Hahn. 2020. What Makes a Top-Performing Precision Medicine Search Engine? Tracing Main System Features in a Systematic Way. _In Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR ’20), July 25–30, 2020, Virtual Event, China._ ACM, New York, NY, USA, 10 pages. https://doi.org/10.1145/3397271.3401048*
## Elastic Search JSON Templating Engine
For easier query formulation, this project contains a custom engine to fill JSON templates with contents from the search
topics. The idea is to fix the query structure (e.g. `{"query":{"match":{"title":"${disease}"}}}`) and to dynamically
add the value of a specific topic (the TREC query) at the specified position. In the previous example, `${disease
}` will access the field `disease`) of the provided topic.
There are currently two templating engines contained in this project. A legacy one and a newer that should replace
the legacy approach in the future.
The classes realizing the legacy approach are
* `at.medunigraz.imi.bst.retrieval.MapQueryDecorator`
* `at.medunigraz.imi.bst.retrieval.TemplateQueryDecorator`
* `at.medunigraz.imi.bst.retrieval.SubTemplateQueryDecorator`
The new approach is encoded in
* `at.medunigraz.imi.bst.retrieval.JsonMapQueryDecorator`
* `at.medunigraz.imi.bst.retrieval.JsonTemplateQueryDecorator`
These "decorators" are applied to a given topic and a file containing a template. They will then replace the
_template expressions_ with the referenced values from the topic. As a template expression we denote a special
syntax that should set apart the fixed JSON contents from a topic value injection directive.
The syntax of the template expressions for the newer approach are explained in the following. The legacy approach
will not be documented here. In case it is needed we refer to the code (tests and existing experimental code) to
demonstrate its usage.
We distinguish between value injecting template expressions and template injecting expressions. The first kind
is the one that refers to actual topic values to be inserted into the template. The latter kind refers to expressions
that load a sub template, resolve its template expressions (which is a recursive action) and than replace the
original template expression with the expression-resolved subtemplate.
Note that all template expressions discussed here are case-insensitive with respect to the expression keywords, e.g.
modifiers. The field names and template paths must be written with correct case since they directly refer to Java
object fields and file paths.
### Value Injecting Template Expressions
Template expressions of this type access values in the topic for actual insertion into the template that contains the
expression. All expressions are JSON strings or part of a JSON string. The quotes surrounding the template will be
removed if necessary, i.e. for non-string values of the referenced topic field. If one quote is missing, it will be
assumed that the expression part of longer string and the quoting will be left untouched.
The following template expressions are offered for value injection:
| expression | description |
|:--------------|:-------------:|
| `"${topicField}" ` | Inserts the value of the topic object field `topicField`. If that value is an array or a collection, a modifier is required, see below. |
| `"${topicField[i]}" ` | Requires that `topicField` is a collection or an array and `i >= 0` is an explicitly given constant (e.g. `3`). Inserts the `ith` value of `topicField`. It is not a requirement that the underlying data structures offers random access. If not, the data structure will be iterated to reach the `ith` value. |
| `"${topicField[]}" ` | Requires that the template containing this expression was referenced by an iterative template-injecting expression in another template. Requires that `topicField` is a collection or an array. Inserts the `ith` value of `topicField` where `i` is an index that is dynamically passed by the iterative parent expression. |
| `"${topicField[][j][]}" ` | Requires that the template containing this expression is at the end of a two-level iterative template-injecting expression chain from two other templates (this, this template would be the third). Requires that `topicField` is a collection or an array and `j >= 0` is an explicitly given constant (e.g. `3`). Inserts the value of `topicField` at position `[i][j][k]` where `i` and `k` are indices dynamically passed by the iterative parent expressions. |
| `"${$ELEMENT}" ` | Requires that the template containing this expression was referenced by an iterative template-injecting expression in another template. Inserts the value referenced in the current iteration of the direct parent expression. Cannot have implicit index specifications (`[]`) as this is the current iteration element itself. Can, however, have explicit index specifications (e.g. `[2]`) if the value is a collection or array. |
In addition to those expressions there exists a set of template modifiers. Those modifiers are only valid within
value-injecting template expressions. They influence the exact way a referenced topic value is rendered into the
JSON template. The modifiers are directly prepended to the name of the topic field or the `$ELEMENT` reference. The
following table gives an overview over the existing modifiers.
| modified expression | description |
|:-------------:|:-------------:|
| `"${CONCAT topicField}" ` | If the `topicField` value is a collection or array, its contents will be concatenated into a single whitespace-delimited string. This also works with multi-dimensional data structures. |
| `"${JSONARRAY topicField}" ` | The value if `topicField` will be rendered as a (possibly nested) JSON array, including the brackets. This works also with multi-dimensional data structures. |
| `"${FLAT JSONARRAY topicField}" ` | The value if `topicField` will be flattened into one single array and rendered as a JSON array, including the brackets. This works also with multi-dimensional data structures. |
| `"${QUOTE topicField}"` | Rarely needed. Forces the injected value to be surrounded by quotes. Should be handled automatically. |
| `"${NOQUOTE topicField}"` | Rarely needed. Prohibits the injected value to be surrounded by quotes. Should be handled automatically. |
### Template Injecting Template Expressions
The expressions discussed here have in common that they reference the name of subtemplate. A subtemplate is insofar
different from a "normal" template that is resides in a specific resources folder which is configuration in the
configuration.
The possible options are
| expression | description |
|:--------------|:-------------:|
| `"${INSERT templatePath.json}"` | Inserts the given subtemplate after injecting topic values, if any are referenced in the subtemplate. |
| `["${FOR INDEX IN topicField REPEAT templatePath.json}"]` | Requires that `topicField` is a collection or array. For each value in `topicField`, the subtemplate at `templatePath.json` will be subject to template expression resolution with respect to the index and value of the current iteration. The value of the current iteration can be accessed in the subtemplate via `topicField[]` or `$ELEMENT`. For nested applications of this expression, the subtemplate can specify multiple indices, e.g `topicField[][]`. |
| `"[${FOR INDEX IN topicField[] REPEAT templatePath.json}]"` | Requires that `topicField` is at least two-dimensional. Recursive FOR INDEX IN application. |
## Other resources
### 2017
* [imi_mug TREC 2017 presentation slides](https://github.com/bst-mug/trec2017/blob/master/docs/presentation.pdf)
* [imi_mug TREC 2017 Poster](https://github.com/bst-mug/trec2017/blob/master/docs/poster.pdf)
* [TREC 2017 proceedings](https://trec.nist.gov/pubs/trec26/trec2017.html).
### 2018
* [hpi_dhc TREC 2018 presentation slides](https://github.com/hpi-dhc/trec-pm/blob/master/docs/2018/presentation.pdf)
* [hpi_dhc TREC 2018 Poster](https://github.com/hpi-dhc/trec-pm/blob/master/docs/2018/poster.pdf)
* [hpi_dhc TREC 2018 Data Artifacts](https://figshare.com/projects/TREC_PM_2018_Data_hpi-dhc_/56882)
* [TREC 2018 proceedings](https://trec.nist.gov/pubs/trec27/trec2018.html).
## Code Dependencies
- JDK 11+ (won't compile with JDK8)
- maven
- make (for `trec_eval` tool)
- gcc (for `trec_eval` tool)
- perl (for `sample_eval` tool)
- Elasticsearch 5.4.0+
- python3 (to parse UMLS, get fasttext embeddings)
## How to Create the Resources for the Experiments
### UMLS
You require the `MRCONSO.RRF` which can be obtained from the official UMLS downloads.
Then, adapt the paths in the `scripts/createUmlsTermSynsets.py` script to read from your `MRCONSO.RRF` file and
create the `resources/umlsSynsets.txt` file. Framework classes making use of the UMLS synsets will expect
the file at this location.
- Download https://download.nlm.nih.gov/umls/kss/2019AA/umls-2019AA-mrconso.zip
- `unzip umls-2019AA-mrconso.zip`
- `python3 scripts/createUmlsTermSynsets.py MRCONSO.RRF ENG > resources/umlsSynsets.txt`
- `wc -c umlsSynsets.txt` = 338449057
- `gzip resources/umlsSynsets.txt`
### FastText Embeddings for LtR
`FastText` embeddings are used to create document embeddings for LtR features. Note that their performance impact seemed to be minor in experiments on the TREC-PM 17/18 data and probably can be left out without great performance penalties. However, this can't be said for sure before evaluation on the 2019 gold standard.
The emebeddings can be recreated by:
1. Run the BANNER gene tagger from [jcore-projects](https://github.com/JULIELab/jcore-projects/tree/master/jcore-jnet-ae-biomedical-english), version>=2.4 on the Medline/PubMed 2019 baseline.
2. Extract the document text from those document with at least one tagged gene in them. This should be around 8 million documents. The text is the title plus abstract text (e.g. by using the [JCoRe PubMed reader](https://github.com/JULIELab/jcore-projects/tree/master/jcore-pubmed-reader) and the [JCoRe To TXT consumer](https://github.com/JULIELab/jcore-base/tree/master/jcore-txt-consumer) in the `DOCUMENT` mode). No postprocessing (which should be done for better models but hasn't been done on the used embeddings).
3. Create `FastText` word embeddings with a dimension of 300. We used the `.bin` output for LtR features.
## Some Examples on How to Run Experiments
```
# All executions should be run where the pom file is, usually the root of the project
# How to run the pubmed experimenter
# Necessary to define the year and type of gold-standard (for evaluation)
mvn clean install
mvn exec:java -Dexec.mainClass="at.medunigraz.imi.bst.trec.LiteratureArticlesExperimenter"
# How to run the clinical trials experimenter
# Necessary to define the year and type of gold-standard (for evaluation)
mvn clean install
mvn exec:java -Dexec.mainClass="at.medunigraz.imi.bst.trec.ClinicalTrialsExperimenter"
# How to run the KeywordExperimenter
# Necessary to define the year and type of gold-standard (for evaluation)
# For positive booster, in the keyword template leave boost = 1
# For negative booster, in the keyword template leave boost = -1
# Also, in the KeywordExperimenter the keywordsSource needs to be specified
mvn clean install
mvn exec:java -Dexec.mainClass="at.medunigraz.imi.bst.trec.KeywordExperimenter" > out.txt &
cat out.txt | grep -e "\(^[0-9\.]*\)\(\;.*\)\(with.*\)\(\\[.*\\]\)\(.*\)" | sed -r "s/"\(^[0-9\.]*\)\(\;.*\)\(with.*\)\(\\[.*\\]\)\(.*\)"/\1 \2 \4/" > results.txt
```
# How to Create the Document Database and the ElasticSearch Index
The databases can be re-created using the the components in the `uima` subdirectory.
All UIMA pipelines have been created and run by the [JCoRe Pipeline Components](https://github.com/JULIELab/jcore-pipeline-modules) in version `0.4.0`. Note that all pipelines require their libraries in the `lib/` directory which does not exist at first. It is automatically created and populated by opening the pipeline with the `JCoRe Pipeline Builder CLI` under the above link. Opening the pipeline should be enough. If this das not create and populate the `lib/` directory, try opening and saving the pipeline.
1. Install `ElasticSearch 5.4` and `Postgres >= 9.6`. Used for the experiments was `Postgres 9.6.13`.
2. Change into the `uima` directory on the command line and execute `./gradlew install-uima-components`. this must successfully run through in order to complete the following steps. Note that Gradle is only used for scripting, the projects are all build with Maven. Thus, check the Maven output for success or failure messages. Gradle may report success despite Maven failing.
3. Run the `pm-to-xmi-db-pipeline` and the `ct-to-xmi-db-pipeline` with the `JCoRE Pipeline Runner`. Before you actually run those, check the `pipelinerunner.xml` configuration files in both projects for the number threads being used. Adapt them to the capabilities of your system, if necessary.
4. Configure the `preprocessing` and `preprocessing_ct` with the `JCoRe Pipeline Builder` to active nearly all (explained in a second) components. Some are deactivated in this release. Note that there are some components specific to `BANNER` gene tagging and `FLAIR` gene tagging. Use the `BANNER` components, Flair hasn't been used in our submitted runs. You might also leave the `LingScope` and `MutationFinder` components off because those haven't been used either. Configure the `uima/costosys.xml` file in all pipelines to point to your Postgres database. Run the components. They will write the annotation data into the Postgres database. We used multiple machines for this, employing the SLURM scheduler (not required). All in all we had 96 CPU cores available. Processing time was in the hours, much less than a day for PubMed. The processing will accordingly take longer or shorter depending on the resources at your disposal.
5. Configure the `pubmed-indexer` and `ct-indexer` projects to work with your ElasticSearch index using the `JCoRe Pipeline Builder`. Execute `mvn package` in both pipeline directories to build the indexing code, which is packaged as a `jar` and automatically put into the `lib` directory of the pipelines. Run the components.
If all steps have been performed successfully, the indices should now be present in your ElasticSearch instance. To run the experiments, also configure the `<repository root>/config/costosys.xml` file to point to your database. Then run the `at.medunigraz.imi.bst.trec.LiteratureArticlesExperimenter´ and `at.medunigraz.imi.bst.trec.ClinicalTrialsExperimenter` classes.
# Important Java System Properties in this Framework
There are few settings that are configured via Java System properties. Such settings do not count as
regular configuration settings but change basic behaviour of the system, often used for tests.
* `at.medunigraz.imi.bst.retrieval.subtemplates.folder` - sets the folder where the subtemplates are expected (default: `/subtemplates/`)
* `de.julielab.java.utilities.cache.enabled` - if set to `false`, the caching library is deactivated. The caching code is still there but the `CacheAccess` objects always return `null` when retrieving cached objects.
## Q&A
**Q: Do I really need to store all the documents into the database? Wouldn't it be quicker just to index everything directly from the source data?**
*A: Directly indexing from the source data is very well possible by combining the respective parts of the three steps (reading, preprocessing, indexing). Note however, that the LtR feature generation makes use of the document stored in the database. Thus, LtR wouldn't work this way.*
[](https://www.codacy.com/app/michelole/trec-pm)
[](https://travis-ci.com/JULIELab/trec-pm)
[](https://coveralls.io/github/michelole/trec-pm?branch=master)
[](https://opensource.org/licenses/MIT)
| 89.019324 | 936 | 0.770337 | eng_Latn | 0.986403 |
da3045157d89b3c477508973fe82962d08029a0e | 1,139 | md | Markdown | docs/extensibility/debugger/reference/idebugalias-dispose.md | drigovz/visualstudio-docs.pt-br | 7a1b53ff3dd5c3151e9c8b855599edf499df9d95 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/extensibility/debugger/reference/idebugalias-dispose.md | drigovz/visualstudio-docs.pt-br | 7a1b53ff3dd5c3151e9c8b855599edf499df9d95 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/extensibility/debugger/reference/idebugalias-dispose.md | drigovz/visualstudio-docs.pt-br | 7a1b53ff3dd5c3151e9c8b855599edf499df9d95 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: IDebugAlias::D ispose | Microsoft Docs
ms.date: 11/04/2016
ms.topic: reference
f1_keywords:
- IDebugAlias::Dispose
helpviewer_keywords:
- IDebugAlias::Dispose method
ms.assetid: e84909a4-d378-4f48-bf25-2c014c77c8e3
author: acangialosi
ms.author: anthc
manager: jmartens
ms.workload:
- vssdk
dev_langs:
- CPP
- CSharp
ms.openlocfilehash: 033d74be9548b6bdaaccfe567e99c1d94453bca5
ms.sourcegitcommit: ae6d47b09a439cd0e13180f5e89510e3e347fd47
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 02/08/2021
ms.locfileid: "99944791"
---
# <a name="idebugaliasdispose"></a>IDebugAlias::Dispose
Marca este alias para remoção.
## <a name="syntax"></a>Sintaxe
```cpp
HRESULT Dispose();
```
```csharp
int Dispose();
```
## <a name="parameters"></a>Parâmetros
Nenhum.
## <a name="return-value"></a>Valor retornado
Se for bem-sucedido, retornará S_OK; caso contrário, retorna um código de erro.
## <a name="remarks"></a>Comentários
Depois que esse método for chamado, o alias não estará mais disponível.
## <a name="see-also"></a>Consulte também
- [IDebugAlias](../../../extensibility/debugger/reference/idebugalias.md)
| 23.244898 | 80 | 0.752414 | por_Latn | 0.302698 |
da308331e49e22b3fdb4c26e4841fffcf95c761b | 14 | md | Markdown | README.md | Nunnery/ng-statuspage | dc44515226f70aa012e6149c3b28a5f68e18d999 | [
"MIT"
] | null | null | null | README.md | Nunnery/ng-statuspage | dc44515226f70aa012e6149c3b28a5f68e18d999 | [
"MIT"
] | null | null | null | README.md | Nunnery/ng-statuspage | dc44515226f70aa012e6149c3b28a5f68e18d999 | [
"MIT"
] | null | null | null | # StatusPage
| 4.666667 | 12 | 0.714286 | kor_Hang | 0.356157 |
da3105f7c947a354a4b7b7a059707b2b4098bb3b | 396 | md | Markdown | docs/interfaces/graph.md | wwmoraes/dot | 63d9779aea38cb7ae5c2ae779e855e09fe1ad582 | [
"MIT"
] | 4 | 2021-01-05T08:40:18.000Z | 2022-01-23T07:50:23.000Z | docs/interfaces/graph.md | wwmoraes/dot | 63d9779aea38cb7ae5c2ae779e855e09fe1ad582 | [
"MIT"
] | 3 | 2021-02-08T06:24:21.000Z | 2022-02-28T15:02:51.000Z | docs/interfaces/graph.md | wwmoraes/dot | 63d9779aea38cb7ae5c2ae779e855e09fe1ad582 | [
"MIT"
] | null | null | null | ---
title: Graph
description: implemented by dot-compatible graph values
---
# `Graph`
Implemented by dot-compatible graph values, with support for attributes.
Extends attributes' [Identity](../attributes/interfaces/Identity.md),
[Styleable](../attributes/interfaces/Styleable.md) and
[Serializable](../attributes/interfaces/Serializable.md) interfaces.
## source
```go
--8<-- "Graph.go"
```
| 22 | 72 | 0.742424 | eng_Latn | 0.632496 |
da3707d5bf55558fc956678d7595d0c098aa91db | 1,182 | md | Markdown | _posts/2016-04-16-norfolk-osm.md | jgravois/jonahadkins.github.io | 69754138f77357af56d2e121fd9f94793709600a | [
"MIT"
] | null | null | null | _posts/2016-04-16-norfolk-osm.md | jgravois/jonahadkins.github.io | 69754138f77357af56d2e121fd9f94793709600a | [
"MIT"
] | null | null | null | _posts/2016-04-16-norfolk-osm.md | jgravois/jonahadkins.github.io | 69754138f77357af56d2e121fd9f94793709600a | [
"MIT"
] | null | null | null | ---
layout: post
title: "City of Norfolk OSM Imports"
date: 2016-04-16
categories: [osm]
---
City of Norfolk, Virginia GIS department has provided building footprints and address points for importing into OpenStreetMap via their [Open Data Portal](http://data.orf.opendata.arcgis.com/). The data is provided as is, with compatible license.

### Post-Import Notes
80,045 Total Buildings
* 53,140 Buildings (!= shed/garage) with Addresses - 66% :|
Imports will be done through the dedicated user account [jonahadkins_norfolk_import](https://www.openstreetmap.org/user/jonahadkins_norfolk_imports) and will be done through OSM Import Guidelines - [See Wiki](https://wiki.openstreetmap.org/wiki/City_of_Norfolk)

Imported building swill have the following `building=` tags populated:
| Building | Count |
| ------------- | ------------- |
| commercial | 4,121 |
| garage | 13,249 |
| hospital | 10 |
| residential | 55,160 |
| school | 51 |
| yes | 7,454 |
---
| 34.764706 | 263 | 0.71066 | eng_Latn | 0.794035 |
da38497d3a2252db52381082a72b66150c973379 | 4,879 | md | Markdown | docs/analysis-services/multidimensional-models-adomd-net-client/retrieving-metadata-working-with-adomd-net-object-model.md | IrvinDominin/sql-docs.it-it | 4b82830a24c29e5486f950728a69ddb46cb4c874 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/analysis-services/multidimensional-models-adomd-net-client/retrieving-metadata-working-with-adomd-net-object-model.md | IrvinDominin/sql-docs.it-it | 4b82830a24c29e5486f950728a69ddb46cb4c874 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/analysis-services/multidimensional-models-adomd-net-client/retrieving-metadata-working-with-adomd-net-object-model.md | IrvinDominin/sql-docs.it-it | 4b82830a24c29e5486f950728a69ddb46cb4c874 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Utilizzo con il modello a oggetti ADOMD.NET | Documenti Microsoft
ms.date: 05/02/2018
ms.prod: sql
ms.technology: analysis-services
ms.custom: adomd
ms.topic: conceptual
ms.author: owend
ms.reviewer: owend
author: minewiskan
manager: kfile
ms.openlocfilehash: 29e59b5811f6c13c7aa222c7d6eba31f9bcd706e
ms.sourcegitcommit: c12a7416d1996a3bcce3ebf4a3c9abe61b02fb9e
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 05/10/2018
ms.locfileid: "34020128"
---
# <a name="retrieving-metadata---working-with-adomdnet-object-model"></a>Il recupero dei metadati - utilizzo di modello a oggetti ADOMD.NET
In ADOMD.NET è disponibile un modello a oggetti per la visualizzazione dei cubi e degli oggetti subordinati contenuti in un'origine dati analitica. Tramite il modello a oggetti tuttavia non è possibile utilizzare tutti i metadati per un'origine dati analitici specifica, ma è possibile accedere solo alle informazioni più utili da visualizzare in un'applicazione client in modo da consentire all'utente di creare comandi in modo interattivo. A causa della complessità ridotta dei metadati da presentare, il modello a oggetti ADOMD.NET risulta più facile da utilizzare.
Nel modello a oggetti ADOMD.NET l'oggetto <xref:Microsoft.AnalysisServices.AdomdClient.AdomdConnection> consente di accedere alle informazioni sui cubi OLAP (Online Analytical Processing), sui modelli di data mining definiti in un'origine dati analitici e sugli oggetti correlati, ad esempio dimensioni, set denominati e algoritmi di data mining.
## <a name="retrieving-olap-metadata"></a>Recupero di metadati OLAP
Ogni oggetto <xref:Microsoft.AnalysisServices.AdomdClient.AdomdConnection> dispone di una raccolta di oggetti <xref:Microsoft.AnalysisServices.AdomdClient.CubeDef> che rappresentano i cubi disponibili per l'utente o per l'applicazione. L'oggetto <xref:Microsoft.AnalysisServices.AdomdClient.CubeDef> espone informazioni sul cubo e sui diversi oggetti correlati al cubo, ad esempio dimensioni, indicatori di prestazioni chiave, misure, set denominati e così via.
Se possibile, è necessario utilizzare l'oggetto <xref:Microsoft.AnalysisServices.AdomdClient.CubeDef> per rappresentare metadati nelle applicazioni client progettate per supportare più server OLAP o per visualizzare e accedere a metadati generali.
> [!NOTE]
> Per metadati specifici del provider o per visualizzare e accedere a metadati dettagliati, utilizzare set di righe dello schema per il recupero dei metadati stessi. Per altre informazioni, vedere [Working with Schema Rowsets in ADOMD.NET](../../analysis-services/multidimensional-models-adomd-net-client/retrieving-metadata-working-with-schema-rowsets.md).
Nell'esempio seguente viene utilizzato l'oggetto <xref:Microsoft.AnalysisServices.AdomdClient.CubeDef> per recuperare i cubi visibili e le relative dimensioni dal server locale:
[!code-cs[Adomd.NetClient#RetrieveCubesAndDimensions](../../analysis-services/multidimensional-models-adomd-net-client/codesnippet/csharp/retrieving-metadata-work_1_1.cs)]
## <a name="retrieving-data-mining-metadata"></a>Recupero di metadati di data mining
Ogni oggetto <xref:Microsoft.AnalysisServices.AdomdClient.AdomdConnection> dispone di diversi raccolte che forniscono informazioni sulle funzionalità di data mining dell'origine dati:
- <xref:Microsoft.AnalysisServices.AdomdClient.MiningModelCollection> che contiene un elenco di ogni modello di data mining nell'origine dati.
- <xref:Microsoft.AnalysisServices.AdomdClient.MiningServiceCollection> che fornisce informazioni sugli algoritmi di data mining disponibili.
- <xref:Microsoft.AnalysisServices.AdomdClient.MiningStructureCollection> che espone informazioni sulle strutture di data mining nel server.
Per stabilire le modalità di esecuzione di query su un modello di data mining nel server, eseguire un'iterazione nella raccolta <xref:Microsoft.AnalysisServices.AdomdServer.MiningModel.Columns%2A>. Ogni oggetto <xref:Microsoft.AnalysisServices.AdomdClient.MiningModelColumn> espone le caratteristiche seguenti:
- Indicazione dell'oggetto come colonna di input o meno (<xref:Microsoft.AnalysisServices.AdomdClient.MiningModelColumn.IsInput%2A>).
- Indicazione dell'oggetto come colonna di stima o meno (<xref:Microsoft.AnalysisServices.AdomdClient.MiningModelColumn.IsPredictable%2A>).
- Valori associati a una colonna discreta (<xref:Microsoft.AnalysisServices.AdomdClient.MiningModelColumn.Values%2A>).
- Tipo di dati nella colonna (<xref:Microsoft.AnalysisServices.AdomdClient.MiningModelColumn.Type%2A>).
## <a name="see-also"></a>Vedere anche
[Recupero di metadati da un'origine dati analitici](../../analysis-services/multidimensional-models-adomd-net-client/retrieving-metadata-from-an-analytical-data-source.md)
| 82.694915 | 572 | 0.805493 | ita_Latn | 0.952041 |
da3898a1df35ba9182d9e7c1e85a44a32fe31bdb | 3,048 | md | Markdown | README.md | ManOAction/SpotifyCustomCrossfade | fd2dae42a99960b18660dae80af930ce2ca9aa8f | [
"MIT"
] | null | null | null | README.md | ManOAction/SpotifyCustomCrossfade | fd2dae42a99960b18660dae80af930ce2ca9aa8f | [
"MIT"
] | null | null | null | README.md | ManOAction/SpotifyCustomCrossfade | fd2dae42a99960b18660dae80af930ce2ca9aa8f | [
"MIT"
] | null | null | null | # Crossfit Crossfader
Crossfit Crossfader is a web app that allows you to build playlists that jump in and out of songs at just the right time.
It's a learning project for me. I wanted to learn about building apps with Flask, Bootstrap, OAuth 2.0, and Elastic Beanstalk.
At the time of this typing, it needs a lot of cleaning up, but I'm planning on getting it to just good enough and then pull requests are welcome.
## Installation
This is a python project so you'll need that, and the package manager [pip](https://pip.pypa.io/en/stable/) to install Crossfit Crossfader.
In your project directory, start up a virtual environment with:
(Windows)
```bash
python -m venv venv
venv\Scripts\activate
```
(Linux)
```bash
python3 -m venv venv
source venv/bin/activate
```
The install dependencies with:
```bash
pip install -r requirements.txt
```
## Usage
Start the server using this:
```bash
python webapp.py
```
## Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Please make sure to update tests as appropriate.
## License
[MIT License ](https://choosealicense.com/licenses/mit/)
A short and simple permissive license with conditions only requiring preservation of copyright and license notices. Licensed works, modifications, and larger works may be distributed under different terms and without source code.
# Roadmap and Goals
Multiple Playlists
Threaded or some version that doesn't use sleep()
Bring Back CRUD to store user playlists (maybe loaded by csv)
## Done Recently
2/16 - Security update and session tokens
2/16 - Unnecessary bootstrap facelift
2/15 - Clean away CRUD portions (different branch?)
2/15 - Host on Elastic Beanstalk or other PaaS rather than Lightsail.
## Diary and Misc Notes
2/20/21 - Finally got threadded playing of playlists up and going. Quick bootstrap skin. Time to start working on multiple playlists and playlist uploading.
We're using this for variable control in the href of the templates
```html
<a href="{{ '/view_assessment_result/%s'%a.id|urlencode }}">{{ a.id }}</a>
```
2/15/21 - Using this as well https://medium.com/technest/build-a-crud-app-with-flask-bootstrap-heroku-60dfa3a788e8
This is helping how bootstrap works: https://getbootstrap.com/docs/3.4/css/
2/14/21 - We've got basic authorizaiton and the spotify API working from their excellent documentation. Time to start on the Flask App. We're revisiting this guy https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world to get us off the ground.
## Migrating Works Like This
(Source Machine) pip freeze > requirements.txt
Copy requirements.txt to Target Machine
(Target Machine Venv) pip install -r requirements.txt
## Vestigal Notes about CRUD Stuff
This was great, but just not really related to what we're doing. Come back when you're wanting to learn more about SQL Alchemy, SQLLite, and the like.
https://medium.com/technest/build-a-crud-app-with-flask-bootstrap-heroku-60dfa3a788e8
| 32.425532 | 269 | 0.764108 | eng_Latn | 0.985667 |
da38cfc5e95b9b59c45d68580cf782b7768cfda3 | 689 | md | Markdown | content/posts/2010/03/08/Digging-discoverable-OpenID-providers/index.md | funkfinger/blog | 60543ceff99a975597ba89e465cb9b968d7b8411 | [
"Unlicense"
] | 2 | 2020-03-04T10:42:12.000Z | 2020-03-04T11:01:17.000Z | content/posts/2010/03/08/Digging-discoverable-OpenID-providers/index.md | funkfinger/blog | 60543ceff99a975597ba89e465cb9b968d7b8411 | [
"Unlicense"
] | null | null | null | content/posts/2010/03/08/Digging-discoverable-OpenID-providers/index.md | funkfinger/blog | 60543ceff99a975597ba89e465cb9b968d7b8411 | [
"Unlicense"
] | null | null | null | ---
layout: post
title: Digging discoverable OpenID providers
date: '2010-03-09'
tags:
- code
- code
- openid
- url
obsolete: true
---
Here's a list of big-time OpenID providers that are discoverable. This means you don't need an exact URL, which is cool...
<ul>
<li><b>Google: </b>https://www.google.com/accounts/o8/id</li>
<li><b>Yahoo: </b>http://yahoo.com</li>
<li><b>MySpace: </b>http://myspace.com</li>
<li><b>myOpenID: </b>hhttp://myopenid.com</li>
</ul>
Additionally, if you have Google Apps, you can craft an OpenID URL like this: `https://www.google.com/accounts/o8/site-xrds?hd=example.com` --- Hmm, but it doesn't seem to work... will update if I figure it out.
| 29.956522 | 211 | 0.680697 | eng_Latn | 0.713293 |
da3952c49faf6b3c59143c3f0d63fd60d689f5c4 | 1,151 | md | Markdown | oj-leetcode-algorithm/src/main/java/com/nxy006/project/algorithm/leetcode/p0086/partition_list/README.md | nxy006/java-algorithm | 9759078d672f303ae6ac78f275d63872981ee609 | [
"MIT"
] | null | null | null | oj-leetcode-algorithm/src/main/java/com/nxy006/project/algorithm/leetcode/p0086/partition_list/README.md | nxy006/java-algorithm | 9759078d672f303ae6ac78f275d63872981ee609 | [
"MIT"
] | null | null | null | oj-leetcode-algorithm/src/main/java/com/nxy006/project/algorithm/leetcode/p0086/partition_list/README.md | nxy006/java-algorithm | 9759078d672f303ae6ac78f275d63872981ee609 | [
"MIT"
] | null | null | null | # 86. Partition List `Medium`
- **Origin Link**: [https://leetcode.com/problems/partition-list/](https://leetcode.com/problems/partition-list/)
- **Tag**: `linked-list`, `two-pointers`
## Description
Given the `head` of a linked list and a value `x`, partition it such that all nodes **less than** `x` come before nodes **greater than or equal** to `x`.
You should **preserve** the original relative order of the nodes in each of the two partitions.
## Example
**Example 1:**

```
Input: head = [1,4,3,2,5,2], x = 3
Output: [1,2,2,4,3,5]
```
**Example 2:**
```
Input: head = [2,1], x = 2
Output: [1,2]
```
## Constraints
- The number of nodes in the list is in the range `[0, 200]`.
- `-100 <= Node.val <= 100`
- `-200 <= x <= 200`
## Solution Template
```java
/**
* Definition for singly-linked list.
* public class ListNode {
* int val;
* ListNode next;
* ListNode() {}
* ListNode(int val) { this.val = val; }
* ListNode(int val, ListNode next) { this.val = val; this.next = next; }
* }
*/
class Solution {
public ListNode partition(ListNode head, int x) {
}
}
```
| 19.508475 | 153 | 0.60556 | eng_Latn | 0.854799 |
da3985a98cabc5cb72c07ebd23538cc5590379a3 | 153 | md | Markdown | vendor/github.com/go-swagger/scan-repo-boundary/README.md | chuckha/go-swagger | 4a8b4199d6dadb26a2dddff9073b1d60f5c78392 | [
"Apache-2.0"
] | 1 | 2019-07-06T12:11:44.000Z | 2019-07-06T12:11:44.000Z | vendor/github.com/go-swagger/scan-repo-boundary/README.md | chuckha/go-swagger | 4a8b4199d6dadb26a2dddff9073b1d60f5c78392 | [
"Apache-2.0"
] | 1 | 2018-02-03T22:40:51.000Z | 2018-02-07T20:13:54.000Z | vendor/github.com/go-swagger/scan-repo-boundary/README.md | chuckha/go-swagger | 4a8b4199d6dadb26a2dddff9073b1d60f5c78392 | [
"Apache-2.0"
] | 3 | 2016-06-06T15:27:31.000Z | 2021-04-10T21:21:37.000Z | # TestRepo
This is a repo that is used in the tests of the go-swagger project.
It's is only here to test finding files across repository boundaries.
| 25.5 | 69 | 0.764706 | eng_Latn | 0.999991 |
da39e65ee3e087ad76ab3e23fe11fb9d81a2ce6c | 261 | md | Markdown | README.md | rtoenniges/node-red-contrib-hx711 | 89755bfeacd0ba86b14ae2baad35e2f7a549a144 | [
"MIT"
] | 1 | 2021-02-05T04:21:05.000Z | 2021-02-05T04:21:05.000Z | README.md | rtoenniges/node-red-contrib-hx711 | 89755bfeacd0ba86b14ae2baad35e2f7a549a144 | [
"MIT"
] | 2 | 2020-10-12T07:10:43.000Z | 2022-03-26T21:15:01.000Z | README.md | rtoenniges/node-red-contrib-hx711 | 89755bfeacd0ba86b14ae2baad35e2f7a549a144 | [
"MIT"
] | 1 | 2021-02-11T07:35:26.000Z | 2021-02-11T07:35:26.000Z | # node-red-contrib-hx711
A simple <a href="http://nodered.org" target="_new">Node-RED</a> node to get the weight value from a hx711 attached to the Raspberry Pis GPIOs.<br>
## Dependencies
* HX711 ([@shroudedcode/hx711](https://github.com/shroudedcode/hx711))
| 43.5 | 147 | 0.739464 | eng_Latn | 0.649721 |
da3a9af92e6eeeee25af24e6d6e884d128af310d | 159 | md | Markdown | README.md | Azahet/SpeedTest-Desktop | 367498fb6ebbc81c4ad54e0cc83c26a755f1a59e | [
"MIT"
] | 2 | 2018-03-14T00:32:08.000Z | 2019-07-12T12:48:44.000Z | README.md | Azahet/SpeedTest-Desktop | 367498fb6ebbc81c4ad54e0cc83c26a755f1a59e | [
"MIT"
] | null | null | null | README.md | Azahet/SpeedTest-Desktop | 367498fb6ebbc81c4ad54e0cc83c26a755f1a59e | [
"MIT"
] | null | null | null |
<p align="center">
<img src="https://i.imgur.com/Vus9N02.png" />
</p>
------------

| 13.25 | 67 | 0.572327 | yue_Hant | 0.651408 |
da3af16a19bb71213160927c31f2e7ba039c536f | 43 | md | Markdown | README.md | Jerry-Shen0527/ToyRenderer | 84771afb8f5edb7055760b46e1dbf1cd9323071b | [
"MIT"
] | null | null | null | README.md | Jerry-Shen0527/ToyRenderer | 84771afb8f5edb7055760b46e1dbf1cd9323071b | [
"MIT"
] | null | null | null | README.md | Jerry-Shen0527/ToyRenderer | 84771afb8f5edb7055760b46e1dbf1cd9323071b | [
"MIT"
] | null | null | null | # Toy Render
Rendering with simple openGL.
| 14.333333 | 29 | 0.790698 | eng_Latn | 0.99299 |
da3b6e12a57010f52bcbcb4dac16ed2c3f327e92 | 747 | md | Markdown | submodules/stringer/README.md | Prepodavan/gen | b5dfc93a4b1224ed7307b157f27c258330a34f4c | [
"MIT"
] | null | null | null | submodules/stringer/README.md | Prepodavan/gen | b5dfc93a4b1224ed7307b157f27c258330a34f4c | [
"MIT"
] | null | null | null | submodules/stringer/README.md | Prepodavan/gen | b5dfc93a4b1224ed7307b157f27c258330a34f4c | [
"MIT"
] | null | null | null | stringer
========
This is a typewriter package for use with [gen](https://github.com/clipperhouse/gen), a tool for type-driven code generation. It is a fork of Rob Pike’s [tool](https://godoc.org/golang.org/x/tools/cmd/stringer) of the same name, which generates readable strings for consts.
It is one of gen’s built-in typewriters.
To use it:
```
go get -u github.com/clipperhouse/gen
```
Then, mark up a type in your package, for example:
```
// +gen stringer
type Pill int
const (
Placebo Pill = iota
Aspirin
Ibuprofen
Paracetamol
Acetaminophen = Paracetamol
)
```
...and run `gen` on your package. You should see a new file named `mytype_stringer.go`. See the [gen docs](https://clipperhouse.github.io/gen/) for more information.
| 24.9 | 273 | 0.72423 | eng_Latn | 0.960999 |
da3b98013e933f58e2f3bcb68a2486e2d7a7238e | 8,188 | md | Markdown | src/course-materials/frontend-fundamentals/week-4/day-3/lecture-materials/intro-to-ajax-and-javascript-promises.md | laurenperez/SEIR-Flex-2.22.22-Sasquatch | e75650b219ec2bfe4d95e45b2d868975ea8f8ddb | [
"MIT"
] | null | null | null | src/course-materials/frontend-fundamentals/week-4/day-3/lecture-materials/intro-to-ajax-and-javascript-promises.md | laurenperez/SEIR-Flex-2.22.22-Sasquatch | e75650b219ec2bfe4d95e45b2d868975ea8f8ddb | [
"MIT"
] | null | null | null | src/course-materials/frontend-fundamentals/week-4/day-3/lecture-materials/intro-to-ajax-and-javascript-promises.md | laurenperez/SEIR-Flex-2.22.22-Sasquatch | e75650b219ec2bfe4d95e45b2d868975ea8f8ddb | [
"MIT"
] | null | null | null | ---
track: "Frontend Fundamentals"
title: "Intro to AJAX and JavaScript Promises with jQuery"
week: 4
day: 3
type: "lecture"
---
# Intro to AJAX and JavaScript Promises with jQuery
<br>
<br>
<br>
<br>
## Lesson Objectives
1. Explain AJAX
2. Explain promises
3. Populate the DOM with AJAX data
4. Make dynamic AJAX requests
<br>
<br>
<br>
## Explain AJAX
- AJAX Stands for Asynchronous JavaScript And XML
- It's just a way for your page to get data from external sources
**According to MDN:**
> Asynchronous JavaScript + XML, while not a technology in itself, is a term coined in 2005 by Jesse James Garrett, that describes a "new" approach to using a number of existing technologies together, including HTML or XHTML, Cascading Style Sheets, JavaScript, The Document Object Model, XML, XSLT, and most importantly the XMLHttpRequest object.
> When these technologies are combined in the Ajax model, web applications are able to make quick, incremental updates to the user interface without reloading the entire browser page. This makes the application faster and more responsive to user actions.
> Although X in Ajax stands for XML, JSON is used more than XML nowadays because of its many advantages such as being lighter and a part of JavaScript. Both JSON and XML are used for packaging information in Ajax model.
<br>
<br>
<br>
### Lesson Setup
- Create a folder called `intro-to-ajax-practice`
- Inside of `intro-to-ajax-practice` create the following folder/file structure:
```shell
intro-to-ajax-practice/
index.html
script.js
```
- You can add this HTML to your `.html` file:
```html
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta http-equiv="X-UA-Compatible" content="ie=edge" />
<script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>
<script defer src="./script.js"></script>
<title>Intro to AJAX</title>
</head>
<body></body>
</html>
```
We'll have our page get data from the external site [https://www.omdbapi.com/](https://www.omdbapi.com/)
- From the documentation, we can see that `https://www.omdbapi.com/?apikey=53aa2cd6&t=Frozen` will get data about the movie Frozen
- The `apikey` parameter is necessary for this external source so that can track and possibly limit access to specific people
- In order to use this particular API in our projects, we'll need to [request an API key](https://www.omdbapi.com/apikey.aspx)
Let's use JavaScript to get data for our page:
```javascript
const promise = $.ajax({
url: "https://www.omdbapi.com/?apikey=53aa2cd6&t=Frozen",
})
promise.then(
(data) => {
console.log(data)
},
(error) => {
console.log("bad request: ", error)
}
)
```
## Explain promises
`$.ajax` returns a "promise" object, which we'll save to the variable `promise`.
Think of this as an object that holds information about the AJAX request "event".
All "promise" objects have a `.then()` method. This method takes two parameters.
1. The `success` callback
2. The `error` callback
These callbacks behave just like callbacks to DOM events.
Remember: a callback is a function that get's passed to another function, as an argument, to be called at a later time, when something happens.
In this case, when the AJAX request succeeds or fails.
We can also rewrite the previous code into one expression:
```javascript
$.ajax({
url: "https://www.omdbapi.com/?apikey=53aa2cd6&t=Frozen",
}).then(
(data) => {
console.log(data)
},
(error) => {
console.log("bad request", error)
}
)
```
<br>
<br>
<br>
## Populate the DOM with AJAX data
Now that we have successfully made an AJAX request, let's use the response from OMDB to populate the DOM.
Let's add the below `html` to our practice project.
```html
<h1>Movie Info</h1>
<main>
<h3>Title</h3>
<p id="title"></p>
<p>Year</p>
<p id="year"></p>
<p>Rating</p>
<p id="rated"></p>
</main>
```
Now let's use the data to populate the DOM:
- First we'll select/cache the DOM elements we'll need to work with.
- Once the data comes back from our AJAX request, we can set the content of our DOM elements with it.
```javascript
const $title = $('#title');
const $year = $('#year');
const $rated = $('#rated');
$.ajax({
url:'https://www.omdbapi.com/?apikey=53aa2cd6&t=Frozen'
}).then(
(data) => {
$title.text(data.Title);
$year.text(data.Year);
$rated .text(data.Rated);
},
(error) => {
console.log('bad request: ', error);
});
})
```
<br>
<br>
<br>
## Make dynamic AJAX requests
Currently, we're getting data for Frozen every time the page loads.
Let's let the user choose the movie:
We'll use the below `html` to begin adding this functionality. Go ahead and place this form below the closing `<main>` tag
```html
<!-- existing code above -->
...
</main >
<form>
<input type="text" placeholder="Movie Title"/>
<input type="submit" value="Get Movie Info" />
</form>
```
First, let's set up a state variable to store our movie data.
Then, we'll set up an event listener for a 'submit' events from our form.
For best practices, we'll move the AJAX request to it's own function called `handleGetData`, this function will get called when the form is submitted thus fetching our data and assigning it to our `movieData` state variable.
Also, notice how we're having to call `preventDefault()` on the `event` object, this is how we can prevent the default browser behavior for form submissions: A full page refresh.
In this case, refreshing/reloading the page defeats the purpose of AJAX, so we'll "turn off" the default behavior.
Next, we'll create a seperate function called `render` to take care of transfering the data from our state variable to the DOM.
To summarize, `handleGetData` will just handle requesting the data and assigning it to "state". It will then call `render`, which will transfer that state to the DOM.
By the way, using specialized functions like `handleGetData` and `render` are a great practice to seperate concerns and keep our code organized.
```javascript
let movieData
$("form").on("submit", handleGetData)
function handleGetData(event) {
event.preventDefault()
// calling preventDefault() on a 'submit' event will prevent a page refresh
$.ajax({
url: "https://www.omdbapi.com/?apikey=53aa2cd6&t=Frozen",
}).then(
(data) => {
movieData = data
render()
},
(error) => {
console.log("bad request", error)
}
)
}
function render() {
$title.text(movieData.Title)
$year.text(movieData.Year)
$rated.text(movieData.Rated)
}
```
Lastly, let's use the input that user types to modify the AJAX request:
- Let's create another state variable called `userInput`.
- Next, we'll select/cache a reference to the input element from the DOM.
- Whenever `handleGetData` gets called, we want to assign the value from our input element to our state variable and use that value to modify the AJAX request.
- Very much like our `apikey`, `userInput` becomes what is known as a query parameter in our `URL`.
```javascript
let movieData, userInput
const $title = $("#title")
const $year = $("#year")
const $rated = $("#rated")
const $input = $('input[type="text"]')
$("form").on("submit", handleGetData)
function handleGetData(event) {
event.preventDefault()
// calling preventDefault() on a 'submit' event will prevent a page refresh
userInput = $input.val()
// getting the user input
$.ajax({
url: "https://www.omdbapi.com/?apikey=53aa2cd6&t=" + userInput,
}).then(
(data) => {
movieData = data
render()
},
(error) => {
console.log("bad request", error)
}
)
}
function render() {
$title.text(movieData.Title)
$year.text(movieData.Year)
$rated.text(movieData.Rated)
}
```
<br>
<br>
<br>
## Review Questions
**❓ In your own words describe a JavaScript Promise**
**❓ What is AJAX?**
**❓ What jQuery method do we use to make AJAX requests**
<br>
<br>
<br>
## Resources
- [`$.ajax` jQuery Documentation](https://api.jquery.com/jQuery.ajax/)
- [`AJAX` MDN Documentation](https://developer.mozilla.org/en-US/docs/Web/Guide/AJAX)
| 26.75817 | 347 | 0.698339 | eng_Latn | 0.956117 |
da3c651e751a7e5e328b4c5568c2749500f49fa4 | 4,608 | md | Markdown | Instructions/Walkthroughs/07-Implement Azure Functions.md | ianychoi/AZ-900T0x-MicrosoftAzureFundamentals | a7924d8f15d9844c2dabe594274e7c2e381c461d | [
"MIT"
] | 1 | 2020-04-10T06:39:00.000Z | 2020-04-10T06:39:00.000Z | Instructions/Walkthroughs/07-Implement Azure Functions.md | ianychoi/AZ-900T0x-MicrosoftAzureFundamentals | a7924d8f15d9844c2dabe594274e7c2e381c461d | [
"MIT"
] | null | null | null | Instructions/Walkthroughs/07-Implement Azure Functions.md | ianychoi/AZ-900T0x-MicrosoftAzureFundamentals | a7924d8f15d9844c2dabe594274e7c2e381c461d | [
"MIT"
] | 15 | 2019-12-15T02:50:56.000Z | 2019-12-19T01:19:04.000Z | ---
wts:
title: '07 - Implement Azure Functions'
module: 'Module 02 - Core Azure Services'
---
# 07 - Implement Azure Functions
In this walkthrough, we will create a Function App to display a Hello message when there is an HTTP request.
Estimated time: 30 minutes
# Task 1: Create a Function app
In this task, we will create a Function app.
1. Sign in to the [Azure portal](https://portal.azure.com).
2. Search for **Function App** and then click **+Add**.
3. Fill in the Azure Function App settings fields. If not specified, take the default.
| Settings | Value |
| -- | --|
| App name | **function-xxx** (this must be unique) |
| Subscription | **Choose your subscription** |
| Resource group | **myRGFunction** (create new) |
| OS | **Windows** |
| Hosting plan | **Consumption plan** |
| Location | **East US** |
| | |
4. Select the **Create** button to begin provisioning and deploying your new Azure Function App.
5. Wait for the Notiication that the resource has been created.
6. **Refresh** the Function App page and verify your new resource is *running*.

# Task 2: Create a HTTP triggered function and test
In this task, we will use the Webhook + API function to display a message when there is an HTTP request.
1. Select your new Function app.
2. Select the "**+**" button next to **Functions**, and select **In-portal**. Notice your other choices for developing in Visual Studio and VS Code. Click **Continue**.

3. Choose **WebHook + API**, and then select **Create**. This will run a function whenever the app receives an HTTP request. There are a large number of other templates to choose from.

4. Notice the code is designed to run an HTTP request and log information. Also, notice the function returns a Hello message with a name.

5. Select **Get function URL** from the top of function editor.
6. Set the **Key** drop-down value to **default (Function key)**. Then, select **Copy** to copy the function URL.

7. Paste the copied function URL into your web browser's address bar. When the page is requested the function will run. Notice the message that the function needs a name.

8. Append **&name=yourname** to the end of the URL.
**Note**: Here, `<yourname>` refers to your given first name. The final URL will be something like, `https://azfuncxxx.azurewebsites.net/api/HttpTrigger1?code=X9xx9999xXXXXX9x9xxxXX==&name=cindy`

9. When your function runs, trace information is written to log files in Azure. To view the logs in Azure portal, return to the function editor, and select the **Logs** button.

Congratulations! You have created a Function App to display a Hello message when there is an HTTP request.
**Note**: To avoid additional costs, you can remove this resource group. Search for resource groups, click your resource group, and then click **Delete resource group**. Verify the name of the resource group and then click **Delete**. Monitor the **Notifications** to see how the delete is proceeding. | 57.6 | 343 | 0.736545 | eng_Latn | 0.991984 |
da3d3f6195acb4bdf73fe6ac8cb9f35b7dbceb13 | 1,111 | md | Markdown | README.md | ryanhs/simple-chess-ai | aeaa4ddc3c5361be20517d81e6ed9e56c329b172 | [
"MIT"
] | 1 | 2017-01-12T15:01:59.000Z | 2017-01-12T15:01:59.000Z | README.md | ryanhs/simple-chess-ai | aeaa4ddc3c5361be20517d81e6ed9e56c329b172 | [
"MIT"
] | null | null | null | README.md | ryanhs/simple-chess-ai | aeaa4ddc3c5361be20517d81e6ed9e56c329b172 | [
"MIT"
] | null | null | null | # Simple Chess AI
### this project only for fun!!
this project show how simple chess AI can be implemented in PHP,
yeah it doesn't created to be strong AI ofcourse, due to poor performance of PHP compared to C, when this project created.
#### specification
hmm maybe not much, but i do writing this in PHP 5.6
### Algorithm
- using simple alpha-beta pruning [Wikipedia - Alpha–beta pruning](https://en.wikipedia.org/wiki/Alpha%E2%80%93beta_pruning)
### Evaluation
- using simplified [ChessProgramming - Simplified evaluation function](https://chessprogramming.wikispaces.com/Simplified+evaluation+function)
## UCI
this project use UCI format, so you can do a bit using UCI command
## Example to run
run this by ``php simple-chess-ai/uci.php``
after the program run, you can type following command:
``ucinewgame``
``isready``
``position fen $fen`` exp: ``position fen rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1``
``go depth $depth`` exp: ``go depth 2``
## Example output
```text
SimpleChessAI in PHP by Ryan H. Silalahi
uciok
readyok
bestmove e2e4
```
### License
MIT
| 28.487179 | 142 | 0.729073 | eng_Latn | 0.961456 |
da3dc98875bb59450c427d5c17e99676a8af462c | 8,184 | md | Markdown | variant_analysis/README.md | jstimes/GeneAnalysis | 2d212cf439792c3b395a95deafd2f986a72bf060 | [
"Apache-2.0"
] | null | null | null | variant_analysis/README.md | jstimes/GeneAnalysis | 2d212cf439792c3b395a95deafd2f986a72bf060 | [
"Apache-2.0"
] | null | null | null | variant_analysis/README.md | jstimes/GeneAnalysis | 2d212cf439792c3b395a95deafd2f986a72bf060 | [
"Apache-2.0"
] | null | null | null | ### Variant analysis
#### Overview
This is a pipeline that generates variant data and synthesizes some figures and
reports based on that data. It uses a genome annotation file and whole genome VCF file as input, and then, given a gene of interest, computes details about variants associated with that gene. This software utilizes the BEDTools software (Quinlan, 2010) and some open source python packages listed below.
#### Datasets
**Human genome assembly**: GRCh37
**Genome annotation**: From gencode, release 19 ([http://ftp.ebi.ac.uk/pub/databases/gencode/Gencode_human/release_19/](http://ftp.ebi.ac.uk/pub/databases/gencode/Gencode_human/release_19/))
**Genome variants**: 1000 genomes phase 1 analysis ([https://ftp-trace.ncbi.nih.gov/1000genomes/ftp/phase1/analysis_results/integrated_call_sets/](https://ftp-trace.ncbi.nih.gov/1000genomes/ftp/phase1/analysis_results/integrated_call_sets/))
**External databases**: dbsnp and dbvar via eutilities ([https://www.ncbi.nlm.nih.gov/snp/docs/eutils_help/](https://www.ncbi.nlm.nih.gov/snp/docs/eutils_help/)).
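
Both primary inputs can be fetched straight from the listings above into the `data` folder. A minimal download sketch follows; the exact file names are assumptions based on each release's layout, so check the directory listings for the current names:

```bash
mkdir -p data && cd data

# Gencode release 19 annotation (GRCh37); file name assumed from the release layout.
wget http://ftp.ebi.ac.uk/pub/databases/gencode/Gencode_human/release_19/gencode.v19.annotation.gff3.gz
gunzip gencode.v19.annotation.gff3.gz

# 1000 genomes phase 1 integrated call set (whole-genome sites VCF); file name assumed.
wget https://ftp-trace.ncbi.nih.gov/1000genomes/ftp/phase1/analysis_results/integrated_call_sets/ALL.wgs.integrated_phase1_v3.20101123.snps_indels_sv.sites.vcf.gz
gunzip ALL.wgs.integrated_phase1_v3.20101123.snps_indels_sv.sites.vcf.gz
```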
#### Pipeline Steps
The file `run.sh` runs this analysis pipeline. Data inputs and outputs are expected to be loaded from / written to the `data` folder. The input flags are as follows:
* -a: The genome **a**nnotation file to use.
* -c: The **c**hromosome number of interest
* -g: The **g**ene of interest
* -v: The variants VCF file to use.
`<gene>` and `<chr>` below refer to the values used for the `-g` and `-c` flags, respectively.
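
For example, a run against BRCA2 on chromosome 13 might look like the following (the annotation and VCF file names are placeholders for whatever inputs you placed in `data`):

```bash
./run.sh \
  -a data/gencode.v19.annotation.gff3 \
  -c 13 \
  -g BRCA2 \
  -v data/ALL.wgs.integrated_phase1_v3.20101123.snps_indels_sv.sites.vcf
```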
The pipeline follows these steps:
1. Cleans pre-existing output from the ‘data’ folder.
2. Extracts the genome annotations just for the chromosome of interest.
1. Written to _chr<chr>_annotation.gff3_
3. Extracts the genome annotations just for the gene of interest, and also generates bed files representing these annotations; one for the whole gene and one just for exons.
2. Gene annotations written to _<gene>.gff3_
3. Gene bed file written to _<gene>.bed_
4. Exons bed file written to _<gene>_exons.bed_
4. Extracts the variants only on the chromosome of interest.
5. Filtered VCF file written to _<chr>_variants.vcf_
6. Modified VCF file with the INFO column containing the dbSnp ref written to _chr<chr>_variants_adjusted.vcf_
7. Variants on chromosome bed file written to _chr<chr>_variants.bed_
5. Computes gene and variant intersections.
8. Whole gene and variant intersections written to _<gene>_variants.bed_
9. Exon-only variant intersections written to _<gene>_exons_variants.bed_
6. Cleans the intersection data, joining it with dbSnp annotations where possible; generates plots and a text report based on this data.
10. Whole-gene variant dataset written to _<gene>_variants.csv_
11. Allele frequency distribution plot written to _<gene>_variants_af_distribution.png_
12. Variant type distribution plot written to _<gene>_variants_variant_type_distribution.png_
13. A text report summarizing some findings is written to _<gene>_variants_report.txt_
14. Exon-only equivalents are written to files with the prefix _<gene>_exons__
7. Generates another set of data based solely on dbsnp & dbvar data (i.e. not just variants defined in the VCF file that overlap with the gene’s annotation; looks up all pathogenic dbsnp and dbvar entries associated with the gene of interest).
15. Pathogenic dbsnp variant data written to _<gene>_pathogenic_snps.csv_
16. Pathogenic dbvar variant data written to _<gene>_pathogenic_svs.csv_
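
The exact commands live in `run.sh`, but conceptually the intersection in step 5 comes down to a single BEDTools call along these lines (the gene and chromosome names here are placeholders, not values the pipeline hard-codes):

```bash
# Illustrative only -- the real invocations are in run.sh.
# Keep each variant that overlaps the gene, alongside the feature it hit.
bedtools intersect -a data/chr17_variants.bed -b data/BRCA1.bed -wa -wb \
    > data/BRCA1_variants.bed
```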
#### Pipeline Outputs
**<gene>_variants.csv & <gene>_exons_variants.csv columns:**
* _chr_: the chromosome of interest
* _start_: start position of this variant in the chromosome, in reference to the human genome assembly defined above.
* _stop_: stop position of this variant.
* _allele frequency_: The reported allele frequency in the VCF file (AF=).
* _variant type_: The variant type as reported in the VCF file.
* _dbsnp id_: ID of the snp in dbSnp. This can be pasted into the dbSnp search bar for finding more information about the variant.
* _dbsnp_variant_type_: The variant type as reported by dbSnp. This is slightly more specific than the VCF variant type (e.g. this reports ‘del’ instead of ‘indel’).
* _significances_: This column includes any pathogenic metadata about the variant from dbSnp; empty if no metadata about this present in dbSnp. Values include:
* 'pathogenic-likely-pathogenic', 'drug-response', 'likely-benign', 'likely-pathogenic', 'pathogenic', 'risk-factor', 'not-provided', 'uncertain-significance', 'benign-likely-benign', 'benign', 'conflicting-interpretations-of-pathogenicity'
* _origins_: dbSnp reported origin of variant. Values include:
* 'unknown', 'maternal', 'somatic', 'germline'
* _diseases_: dbSnp reported diseases associated with this variant.
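
Once the pipeline has run, a quick way to inspect these columns is with pandas (the gene name below is only an example):

```python
import pandas as pd

# Load the whole-gene variant dataset; substitute your own gene of interest.
df = pd.read_csv("data/BRCA1_variants.csv")

# Keep rows dbSnp flagged with any pathogenic-related significance.
pathogenic = df[df["significances"].str.contains("pathogenic", na=False)]
print(pathogenic[["dbsnp id", "variant type", "allele frequency"]].head())
```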
The columns of **_<gene>_pathogenic_snps.csv_** are a subset of those in **_<gene>_variants.csv_**.
The columns of **_<gene>_pathogenic_svs.csv_** are similar to those of **_<gene>_pathogenic_snps.csv_**, except _dbsnp_id_ is replaced with _dbvar_id_ and _variant_type_ refers to the dbVar reported variant type.
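
The pathogenic lookups in step 7 go through NCBI's eutilities; with Biopython (already a dependency) that is roughly the following, where the gene name and the search-field tags are assumptions rather than values copied from the pipeline:

```python
from Bio import Entrez

Entrez.email = "you@example.com"  # NCBI requires a contact address

# Illustrative dbSNP query for pathogenic variants on a gene of interest.
handle = Entrez.esearch(
    db="snp",
    term='BRCA1[Gene Name] AND "pathogenic"[Clinical Significance]',
    retmax=20,
)
record = Entrez.read(handle)
print(record["IdList"])  # dbSnp IDs, like those in the dbsnp id column above
```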
#### Usage Instructions
_Prerequisite: you must be using a UNIX environment to run this analysis._
1. Create a new directory and download the source code.
2. Ensure python3 is installed and install bedtools
1. Python3 is likely already installed, but see these docs for any help:
1. [https://docs.python.org/3/using/unix.html](https://docs.python.org/3/using/unix.html)
2. For bedtools, you can either download and build the source as described below, or directly download the bedtools binary here:
2. [https://github.com/arq5x/bedtools2/releases/download/v2.30.0/bedtools.static.binary](https://github.com/arq5x/bedtools2/releases/download/v2.30.0/bedtools.static.binary)
3. Build from source:
```
wget -cd https://github.com/arq5x/bedtools2/archive/master.zip
unzip master.zip
cd bedtools2-master/
sudo yum -y install gcc-c++
sudo yum -y install zlib-devel
make
```
3. Install python3 package dependencies:
3. If pip is not already installed, run:
4. sudo apt install python3-pip
4. Run `pip install x` for each of these packages:
5. pandas
6. requests
7. biopython
8. numpy
9. matplotlib
4. Change into the `variant_analysis` directory.
5. Download input data files to the ‘data’ directory:
5. _gencode.v19.annotation.gff3_
10. Download and uncompress this file (or a different version) from here:
11. [http://ftp.ebi.ac.uk/pub/databases/gencode/Gencode_human/release_19/](http://ftp.ebi.ac.uk/pub/databases/gencode/Gencode_human/release_19/)
6. _ALL.wgs.integrated_phase1_v3.20101123.snps_indels_sv.sites.vcf_
12. Download and uncompress this file (or a different version) from here:
13. [https://ftp-trace.ncbi.nih.gov/1000genomes/ftp/phase1/analysis_results/integrated_call_sets/](https://ftp-trace.ncbi.nih.gov/1000genomes/ftp/phase1/analysis_results/integrated_call_sets/)
6. Mark the bash script as executable:
```
chmod +x run.sh
```
7. Before you can run the script, you’ll need to know your gene of interest and the chromosome number it is on. You can type in your gene name at UCSC Genome Browser to find its chromosome:
[http://genome.ucsc.edu/cgi-bin/hgGateway](http://genome.ucsc.edu/cgi-bin/hgGateway)
8. If everything has been installed and the input data is present, you can run the pipeline using the following command:
```
./run.sh -a data/gencode.v19.annotation.gff3 -g <gene> -c <chromosome_number> -v data/ALL.wgs.integrated_phase1_v3.20101123.snps_indels_sv.sites.vcf
```
Replacing <gene> and <chromosome_number> with your own choices, and
remembering to update the data file paths if you downloaded different data to begin with.
9. The script will output “Finished” when it completes; some steps such as fetching results from dbsnp/dbvar can take some time to complete.

---

**Source:** `README.md` from qikunlv/eicu--date (MIT)

# eicu--date
good

---

**Source:** `lib/tcllib/embedded/md/tcllib/files/modules/ftpd/ftpd.md` from marsman57/TclIdentityServerREST (MIT)

[//000000001]: # (ftpd \- Tcl FTP Server Package)
[//000000002]: # (Generated from file 'ftpd\.man' by tcllib/doctools with format 'markdown')
[//000000003]: # (ftpd\(n\) 1\.3 tcllib "Tcl FTP Server Package")
# NAME
ftpd \- Tcl FTP server implementation
# <a name='toc'></a>Table Of Contents
- [Table Of Contents](#toc)
- [Synopsis](#synopsis)
- [Description](#section1)
- [COMMANDS](#section2)
- [CALLBACKS](#section3)
- [VARIABLES](#section4)
- [Bugs, Ideas, Feedback](#section5)
- [Keywords](#keywords)
- [Category](#category)
# <a name='synopsis'></a>SYNOPSIS
package require Tcl 8\.3
package require ftpd ?1\.3?
[__::ftpd::server__ ?*myaddr*?](#1)
[__::ftpd::config__ ?*option value*? ?*option value \.\.\.*?](#2)
[*fsCmd* __append__ *path*](#3)
[*fsCmd* __delete__ *path* *channel*](#4)
[*fsCmd* __dlist__ *path* *style* *channel*](#5)
[*fsCmd* __exists__ *path*](#6)
[*fsCmd* __mkdir__ *path* *channel*](#7)
[*fsCmd* __mtime__ *path* *channel*](#8)
[*fsCmd* __permissions__ *path*](#9)
[*fsCmd* __rename__ *path* *newpath* *channel*](#10)
[*fsCmd* __retr__ *path*](#11)
[*fsCmd* __rmdir__ *path* *channel*](#12)
[*fsCmd* __size__ *path* *channel*](#13)
[*fsCmd* __store__ *path*](#14)
# <a name='description'></a>DESCRIPTION
The __ftpd__ package provides a simple Tcl\-only server library for the FTP
protocol as specified in RFC 959
\([http://www\.rfc\-editor\.org/rfc/rfc959\.txt](http://www\.rfc\-editor\.org/rfc/rfc959\.txt)\)\.
It works by listening on the standard FTP socket\. Most server errors are
returned as error messages with the appropriate code attached to them\. Since the
server code for the ftp daemon is executed in the event loop, it is possible
that a __bgerror__ will be thrown on the server if there are problems with
the code in the module\.
# <a name='section2'></a>COMMANDS
- <a name='1'></a>__::ftpd::server__ ?*myaddr*?
Open a listening socket to listen to and accept ftp connections\. myaddr is
an optional argument\. *myaddr* is the domain\-style name or numerical IP
address of the client\-side network interface to use for the connection\.
- <a name='2'></a>__::ftpd::config__ ?*option value*? ?*option value \.\.\.*?
The value is always the name of the command to call as the callback\. The
option specifies which callback should be configured\. See section
[CALLBACKS](#section3) for descriptions of the arguments and return
values for each of the callbacks\.
* \-authIpCmd *proc*
Callback to authenticate new connections based on the ip\-address of the
peer\.
* \-authUsrCmd *proc*
Callback to authenticate new connections based on the user logging in
\(and the users password\)\.
* \-authFileCmd *proc*
Callback to accept or deny a users access to read and write to a
specific path or file\.
* \-logCmd *proc*
Callback for log information generated by the FTP engine\.
* \-fsCmd *proc*
Callback to connect the engine to the filesystem it operates on\.
* \-closeCmd *proc*
Callback to be called when a connection is closed\. This allows the
embedding application to perform its own cleanup operations\.
* \-xferDoneCmd *proc*
Callback for transfer completion notification\. In other words, it is
called whenever a transfer of data to or from the client has completed\.
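
As a quick illustration, a minimal embedding that combines the two commands
above might look roughly like this; the callback body is a placeholder, not
part of the package:

    package require ftpd

    proc allowUser {user pass} {
        # Placeholder check: accept a single hard-coded account.
        expr {$user eq "ftp" && $pass eq "secret"}
    }

    ::ftpd::config -authUsrCmd allowUser
    set ::ftpd::port 2100        ;# listen on a non-privileged port
    ::ftpd::server
    vwait forever                ;# enter the event loop
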
# <a name='section3'></a>CALLBACKS
- __authIpCmd__ callback
The authIpCmd receives the ip\-address of the peer attempting to connect to
the ftp server as its argument\. It returns a 1 to allow users from the
specified IP to attempt to login and a 0 to reject the login attempt from
the specified IP\.
- __authUsrCmd__ callback
The authUsrCmd receives the username and password as its two arguments\. It
returns a 1 to accept the attempted login to the ftpd and a 0 to reject the
attempted login\.
- __authFileCmd__ callback
The authFileCmd receives the user \(that is currently logged in\), the path or
filename that is about to be read or written, and __read__ or
__write__ as its three arguments\. It returns a 1 to allow the path or
filename to be read or written, and a 0 to reject the attempted read or
write with a permissions error code\.
- __logCmd__ callback
The logCmd receives a severity and a message as its two arguments\. The
severities used within the ftpd package are __note__, __debug__, and
__error__\. The logCmd doesn't return anything\.
- __fsCmd__ callback
The fsCmd receives a subcommand, a filename or path, and optional additional
arguments \(depending on the subcommand\)\.
The subcommands supported by the fsCmd are:
* <a name='3'></a>*fsCmd* __append__ *path*
The append subcommand receives the filename to append to as its
argument\. It returns a writable tcl channel as its return value\.
* <a name='4'></a>*fsCmd* __delete__ *path* *channel*
The delete subcommand receives the filename to delete, and a channel to
write to as its two arguments\. The file specified is deleted and the
appropriate ftp message is written to the channel that is passed as the
second argument\. The delete subcommand returns nothing\.
* <a name='5'></a>*fsCmd* __dlist__ *path* *style* *channel*
The dlist subcommand receives the path that it should list the files
that are in, the style in which the files should be listed which is
either __nlst__ or __list__, and a channel to write to as its
three arguments\. The files in the specified path are printed to the
specified channel one per line\. If the style is __nlst__ only the
name of the file is printed to the channel\. If the style is __list__
then the file permissions, number of links to the file, the name of the
user that owns the file, the name of the group that owns the file, the
size \(in bytes\) of the file, the modify time of the file, and the
filename are printed out to the channel in a formatted space separated
format\. The __dlist__ subcommand returns nothing\.
* <a name='6'></a>*fsCmd* __exists__ *path*
The exists subcommand receives the name of a file to check the existence
of as its only argument\. The exists subcommand returns a 1 if the path
specified exists and the path is not a directory\.
* <a name='7'></a>*fsCmd* __mkdir__ *path* *channel*
The mkdir subcommand receives the path of a directory to create and a
channel to write to as its two arguments\. The mkdir subcommand creates
the specified directory if necessary and possible\. The mkdir subcommand
then prints the appropriate success or failure message to the channel\.
The mkdir subcommand returns nothing\.
* <a name='8'></a>*fsCmd* __mtime__ *path* *channel*
The mtime subcommand receives the path of a file to check the modify
time on and a channel as its two arguments\. If the file exists the mtime
is printed to the channel in the proper FTP format, otherwise an
appropriate error message and code are printed to the channel\. The mtime
subcommand returns nothing\.
* <a name='9'></a>*fsCmd* __permissions__ *path*
The permissions subcommand receives the path of a file to retrieve the
permissions of\. The permissions subcommand returns the octal file
permissions of the specified file\. The file is expected to exist\.
* <a name='10'></a>*fsCmd* __rename__ *path* *newpath* *channel*
The rename subcommand receives the path of the current file, the new
file path, and a channel to write to as its three arguments\. The rename
subcommand renames the current file to the new file path if the path to
the new file exists, and then prints out the appropriate message to the
channel\. If the new file path doesn't exist the appropriate error
message is printed to the channel\. The rename subcommand returns
nothing\.
* <a name='11'></a>*fsCmd* __retr__ *path*
The retr subcommand receives the path of a file to read as its only
argument\. The retr subcommand returns a readable channel that the
specified file can be read from\.
* <a name='12'></a>*fsCmd* __rmdir__ *path* *channel*
The rmdir subcommand receives the path of a directory to remove and a
channel to write to as its two arguments\. The rmdir subcommand removes
the specified directory \(if possible\) and prints the appropriate message
to the channel \(which may be an error if the specified directory does
not exist or is not empty\)\. The rmdir subcommand returns nothing\.
* <a name='13'></a>*fsCmd* __size__ *path* *channel*
The size subcommand receives the path of a file to get the size \(in
bytes\) of and a channel to write to as its two arguments\. The size
subcommand prints the appropriate code and the size of the file if the
specified path is a file, otherwise an appropriate error code and
message are printed to the channel\. The size subcommand returns nothing\.
* <a name='14'></a>*fsCmd* __store__ *path*
The store subcommand receives the path of a file to write as its only
argument\. The store subcommand returns a writable channel\.
- __closeCmd__
The __closeCmd__ receives no arguments when it is invoked, and any
return value it may generate is discarded\.
- __xferDoneCmd__ sock sock2 file bytes filename err
The __xferDoneCmd__ receives six arguments when invoked\. These are, in
this order, the channel handle of the control socket for the connection, the
channel handle of the data socket used for the transfer \(already closed\),
the handle of the channel containing the transfered file, the number of
bytes transfered, the path of the file which was transfered, and a \(possibly
empty\) error message\. Any return value it may generate is discarded\.
# <a name='section4'></a>VARIABLES
- __::ftpd::cwd__
The current working directory for a session when someone first connects to
the FTPD or when the __REIN__ ftp command is received\.
- __::ftpd::contact__
The e\-mail address of the person that is the contact for the ftp server\.
This address is printed out as part of the response to the __FTP HELP__
command\.
- __::ftpd::port__
The port that the ftp server should listen on\. If port is specified as zero,
the operating system will allocate an unused port for use as a server
socket; afterwards, the variable will contain the port number that was
allocated\.
- __::ftpd::welcome__
The message that is printed out when the user first connects to the ftp
server\.
- __::ftpd::CurrentSocket__
Accessible to all callbacks and all filesystem commands \(which are a special
form of callback\) and contains the handle of the socket channel which was
active when the callback was invoked\.
# <a name='section5'></a>Bugs, Ideas, Feedback
This document, and the package it describes, will undoubtedly contain bugs and
other problems\. Please report such in the category *ftpd* of the [Tcllib
Trackers](http://core\.tcl\.tk/tcllib/reportlist)\. Please also report any ideas
for enhancements you may have for either package and/or documentation\.
When proposing code changes, please provide *unified diffs*, i\.e the output of
__diff \-u__\.
Note further that *attachments* are strongly preferred over inlined patches\.
Attachments can be made by going to the __Edit__ form of the ticket
immediately after its creation, and then using the left\-most button in the
secondary navigation bar\.
# <a name='keywords'></a>KEYWORDS
[ftp](\.\./\.\./\.\./\.\./index\.md\#ftp), [ftpd](\.\./\.\./\.\./\.\./index\.md\#ftpd),
[ftpserver](\.\./\.\./\.\./\.\./index\.md\#ftpserver), [rfc
959](\.\./\.\./\.\./\.\./index\.md\#rfc\_959),
[services](\.\./\.\./\.\./\.\./index\.md\#services)
# <a name='category'></a>CATEGORY
Networking
| 40.329073 | 98 | 0.684861 | eng_Latn | 0.997719 |
da3f2822037eb4d512124a9055abd789a84d383d | 60 | md | Markdown | README.md | brianaske/brianaske.github.io | 795bdd10fa15060c17582e1c44cea1a65afeef92 | [
"MIT"
] | null | null | null | README.md | brianaske/brianaske.github.io | 795bdd10fa15060c17582e1c44cea1a65afeef92 | [
"MIT"
] | null | null | null | README.md | brianaske/brianaske.github.io | 795bdd10fa15060c17582e1c44cea1a65afeef92 | [
"MIT"
] | null | null | null | brianaske.github.io
模板出處 https://colorlib.com/wp/templates/
| 20 | 39 | 0.8 | kor_Hang | 0.226776 |
da408ed4bd5110d7a9b5f1f961683f9e60a4b759 | 6,308 | md | Markdown | README.md | tejazz/patang | f708e7febfafbadb08daf5af0bcb8130078a4bc5 | [
"MIT"
] | 8 | 2020-12-26T14:11:48.000Z | 2021-09-15T07:16:25.000Z | README.md | tejazz/patang | f708e7febfafbadb08daf5af0bcb8130078a4bc5 | [
"MIT"
] | null | null | null | README.md | tejazz/patang | f708e7febfafbadb08daf5af0bcb8130078a4bc5 | [
"MIT"
] | null | null | null | ## Patang
Easy plug-in module for extracting product details for e-commerce websites like Flipkart and Amazon in an easy-to-read JSON format. It extends two implementations - with __Puppeteer__ and __DOM String__.
### Who Should Use This?
This module would come in handy if one is building for some kind of a __scraping operation__ and needs to evaluate product details. It takes care of the logic for extracting product details.
### Supported Platforms
- [Flipkart](https://www.flipkart.com)
- [Amazon India](https://amazon.in)
### Setup
Install the package in your Node.js application
```
npm i --save patang
```
You can go ahead and require it wherever you need to use it.
``` javascript
// sample file: ./index.js
const patang = require('patang');
```
### Usages - Examples
There are two ways to utilize the library
#### __With DOM String__
The `evaluateProductDetails` function of the `pageEvaluator` module expects the `HTML DOM string` and one of the `supported platform name` as parameter.
```javascript
const axios = require('axios');
const { domEvaluator } = require('patang');
// Flipkart
let exampleFlpktUrl = 'https://www.flipkart.com/boat-stone-grenade-5-w-portable-bluetooth-speaker/p/itm0f38c2f530da5?pid=ACCFDBFR9ZCZTDGJ&lid=LSTACCFDBFR9ZCZTDGJUKDA7E&marketplace=FLIPKART&srno=b_1_1&otracker=hp_omu_Top%2BOffers_5_3.dealCard.OMU_MG06BUMHI8DW_3&otracker1=hp_omu_PINNED_neo%2Fmerchandising_Top%2BOffers_NA_dealCard_cc_5_NA_view-all_3&fm=neo%2Fmerchandising&iid=78c6c4aa-ba59-437b-b973-e57a583ee1c7.ACCFDBFR9ZCZTDGJ.SEARCH&ppt=browse&ppn=browse&ssid=ca4ygt9n0g0000001608474631861';
axios.get(exampleFlpktUrl)
.then((res) => {
console.log('Flipkart Product Details')
// printing the extracted details as an example
// use the return value as you see apt for your use-case
console.log(domEvaluator.evaluateProductDetails(res.data, 'flipkart'));
});
```
#### __With Puppeteer__
The `evaluateProductDetails` function of the `domEvaluator` module expects the `platform name` and the `puppeteer page object` to be made available.
``` javascript
const puppeteer = require('puppeteer');
const { pageEvaluator } = require('patang');
let exampleFlpktUrl = 'https://www.flipkart.com/boat-stone-grenade-5-w-portable-bluetooth-speaker/p/itm0f38c2f530da5?pid=ACCFDBFR9ZCZTDGJ&lid=LSTACCFDBFR9ZCZTDGJUKDA7E&marketplace=FLIPKART&srno=b_1_1&otracker=hp_omu_Top%2BOffers_5_3.dealCard.OMU_MG06BUMHI8DW_3&otracker1=hp_omu_PINNED_neo%2Fmerchandising_Top%2BOffers_NA_dealCard_cc_5_NA_view-all_3&fm=neo%2Fmerchandising&iid=78c6c4aa-ba59-437b-b973-e57a583ee1c7.ACCFDBFR9ZCZTDGJ.SEARCH&ppt=browse&ppn=browse&ssid=ca4ygt9n0g0000001608474631861';
let exampleAmznUrl = 'https://www.amazon.in/Honor-HONOR-Band-5/dp/B07Z26SS9G/?_encoding=UTF8&pd_rd_w=E2RZU&pf_rd_p=e60c70f0-0541-4ba5-b6fc-ada95198a5fe&pf_rd_r=FVVSZP80NMR76D87FG70&pd_rd_r=c469c8b3-2ea8-4cb2-b9e3-8ca2cd005fe4&pd_rd_wg=Xwn0L&ref_=pd_gw_crs_zg_bs_1984443031';
async function extractDetailsFromPage() {
const browser = await puppeteer.launch({
headless: true, defaultViewport: null, args: [
'--no-sandbox',
'--disable-setuid-sandbox',
'--no-zygote',
'--no-default-browser-check',
'--bwsi',
'--disable-dev-shm-usage',
'--disable-infobars',
'--hide-scrollbars',
],
});
const page = await browser.newPage();
await page.goto(exampleFlpktUrl, { waitUntil: 'networkidle0' });
console.log('Flipkart Product Details');
console.log(await pageEvaluator.evaluateProductDetails(page, 'flipkart'));
await page.goto(exampleAmznUrl, { waitUntil: 'networkidle0' });
console.log('Amazon Product Details');
console.log(await pageEvaluator.evaluateProductDetails(page, 'amazon'));
await page.close();
await browser.close();
return 'Completed!';
}
// invoking the function
extractDetailsFromPage()
.then(res => console.log(res));
```
### API
##### `domEvaluator`: Function
`evaluateProductDetails(dom: string, platform: string)`: Function
- `platform`: Values: `'flipkart' | 'amazon'`
- `dom`: Values: `'<html>....</html>'`
- Return Value: `Product Details Object`
e.g:
``` javascript
{
title: 'boAt Stone Grenade 5 W Portable Bluetooth Speaker<!-- --> (Charcoal Black, Mono Channel)',
description: '<p>Listen to music in stellar quality with this boAt speaker. With a multitude of features, such as 7-hours of playback time, water-resistant, and easy access control, this speaker ensures a fulfilling aural experience.<br></p>',
price: '₹1,499',
productImage: 'http://rukmini1.flixcart.com/image/128/128/k0vbgy80pkrrdj/speaker/mobile-tablet-speaker/4/n/n/boat-stone-grenade-original-imafg96ffpnpgdv4.jpeg?q=70'
}
```
##### `pageEvaluator`: Function
`evaluateProductDetails(page: object, platform: string)`: Function
- `platform`: Values: `'flipkart' | 'amazon'`
- `page`: Values: `[Puppeteer Page Object](https://pptr.dev/#?product=Puppeteer&version=v5.4.1&show=api-class-page)`
- Return Value: `Product Details Object`
e.g:
``` javascript
{
title: 'boAt Stone Grenade 5 W Portable Bluetooth Speaker<!-- --> (Charcoal Black, Mono Channel)',
description: '<p>Listen to music in stellar quality with this boAt speaker. With a multitude of features, such as 7-hours of playback time, water-resistant, and easy access control, this speaker ensures a fulfilling aural experience.<br></p>',
price: '₹1,499',
productImage: 'http://rukmini1.flixcart.com/image/128/128/k0vbgy80pkrrdj/speaker/mobile-tablet-speaker/4/n/n/boat-stone-grenade-original-imafg96ffpnpgdv4.jpeg?q=70'
}
```
##### `platformIdentifiers`: Object
- Returns the attributes object for a particular platform
- Example:
``` javascript
const { platformIdentifiers } = require('patang');
console.log(platformIdentifiers.Flipkart);
// Output
// {
// title: ['.B_NuCI'],
// description: ['div._1mXcCf'],
// price: ['._30jeq3._16Jk6d'],
// productImage: ['._396cs4._2amPTt._3qGmMb._3exPp9'],
// }
```
### License
[MIT](https://github.com/tejazz/patang/blob/main/LICENSE)
### Contributions - Guidelines
The project is open to one and all for contributions. Simply fork the project, make your changes and raise a PR. | 44.737589 | 495 | 0.732879 | eng_Latn | 0.44808 |
da40d49154d9c49e673125ddd5520964a69cce4b | 917 | md | Markdown | _posts/tumblr/2011-07-19-ice-bar-200365.md | bravelocation/somethingneweveryday | 3a21b0d292babafa8493061a1091012db0efa0ef | [
"MIT"
] | null | null | null | _posts/tumblr/2011-07-19-ice-bar-200365.md | bravelocation/somethingneweveryday | 3a21b0d292babafa8493061a1091012db0efa0ef | [
"MIT"
] | null | null | null | _posts/tumblr/2011-07-19-ice-bar-200365.md | bravelocation/somethingneweveryday | 3a21b0d292babafa8493061a1091012db0efa0ef | [
"MIT"
] | null | null | null | ---
layout: post
title: Ice Bar (200/365)
date: '2011-07-19T08:15:00+01:00'
categories:
- drinks
- interesting
- sweden
- travel
tumblr_url: http://www.somethingnew365.com/post/44061600146/ice-bar-200365
---
Another great day in Stockholm but the absolute highlight was going to the Ice Bar in the hotel.
It’s made by the same people who make the famous Ice Hotel and supposedly there are only 6 bars like this in the world. To state the bleeding obvious it was bloody freezing in there. The glasses were made out of ice, and beforehand we thought it was a bit mean only letting you stay an hour but after 40 minutes and two stiff drinks we’d had enough - and we’d stayed in the longest!
Hilary’s always wanted to go to stay in the Ice Hotel, but I’ve never been keen, and after this - although it was brilliant fun - I’m definitely not going!

| 45.85 | 382 | 0.768811 | eng_Latn | 0.999282 |
da4110794ca776e593bf19a4d41aa2ea6eb1c4fb | 3,026 | md | Markdown | _posts/2022-2-26-pytorch-notes-pytorch-tensors.md | HiRISEAI/hiriseai.github.io | 43c96a66b2c8a14f72a57963c73a9b240fb2c4bc | [
"MIT"
] | null | null | null | _posts/2022-2-26-pytorch-notes-pytorch-tensors.md | HiRISEAI/hiriseai.github.io | 43c96a66b2c8a14f72a57963c73a9b240fb2c4bc | [
"MIT"
] | null | null | null | _posts/2022-2-26-pytorch-notes-pytorch-tensors.md | HiRISEAI/hiriseai.github.io | 43c96a66b2c8a14f72a57963c73a9b240fb2c4bc | [
"MIT"
] | null | null | null | ---
layout: post
title: "PyTorch Notes: PyTorch Tensors"
date: 2022-2-26
image: assets/images/ESP_011785_1875_4_1_notes.pytorch.png
tags: [ AI, MinLab, notes.ML ]
---
**PyTorch has various components**.
- `torch` has functionalities similar to NumPy with GPU support.
- `torch.autograd` provides classes, methods, and functions for implementing automatic differentiation of arbitrary scalar valued functions.
- `nn` is a neural network library in PyTorch.
- `optim` provides optimization algorithms that are used for the minimization and maximization of functions.
- `utils` has utility functions to load data; it also has other functions...
**Creating a tensor with all elements initialized with the same value**
- `torch.zeros( )`: returns a tensor filled with the scalar value 0, with the shape defined by the variable argument size.
- `torch.ones( )`: returns a tensor filled with the scalar value 1, with the shape defined by the variable argument size.
- `torch.eye( )`: reuturns a 2-D tensor with ones on the diagonal and zeros elsewhere.
- `torch.empty( )`: returns a tensor filled with uninitialized data. The shape of the tensor is defined by the variable argument size.
- `torch.full( )`: creates a tensor size filled with `fill_value`. the tensor's dtype is inferred from `fill_value`.
**Tensor syntax**
- `torch.numel( )`: returns the total number of elements in the input tensor.
- `torch.rand( )`: returns a tensor filled with random numbers from a uniform distribution on the interval[0,1).
- `torch.randn( )`: returns a tensor filled with random numbers from a normal distribution with mean 0 and variance 1 (also called the standard normal distribution). The shape of the tensor is defined by the variable argument `size`.
- `torch.argmin( )`: returns the indices of the minimum value(s) of hte flattened tensor or along a dimension. If there are multiple minimal values then the indices of the first minimal value are returned.
- `torch.argmax( )`: returns the indices of the maximum value of all elements in the `input` tensor. If there are multiple maximal values then the indices of the first maximal value are returned.
- `torch.cat(tensors, dim=0)`: concatenates the given sequence of `seq` tensors in the given dimension. All tensors must either have the same shape (except in the concatenating dimension) or be empty. The inverse operation is `torch.split( )`.
The scripts can be seen here:
- [https://github.com/rocks2021/Colabs-for-HiRISEAI/blob/main/pt1_pytorch_tensors.ipynb](https://github.com/rocks2021/Colabs-for-HiRISEAI/blob/main/pt1_pytorch_tensors.ipynb){:target="_blank"}
- [https://github.com/rocks2021/Colabs-for-HiRISEAI/blob/main/pt2_pytorch_tensors.ipynb](https://github.com/rocks2021/Colabs-for-HiRISEAI/blob/main/pt2_pytorch_tensors.ipynb){:target="_blank"}
- [https://github.com/rocks2021/Colabs-for-HiRISEAI/blob/main/pt3_pytorch_tensors.ipynb](https://github.com/rocks2021/Colabs-for-HiRISEAI/blob/main/pt3_pytorch_tensors.ipynb){:target="_blank"}
| 79.631579 | 243 | 0.76768 | eng_Latn | 0.983783 |
da4286a4dfa8f479cf13e415ee7c386b99542dda | 280 | md | Markdown | books/how-to-read-ruby-reference/changelog.md | JunichiIto/jnchito-zenn-docs | 89bde3aaef49b06dfc823f83a46b33ce897c8f43 | [
"MIT"
] | 1 | 2021-09-03T01:33:27.000Z | 2021-09-03T01:33:27.000Z | books/how-to-read-ruby-reference/changelog.md | JunichiIto/jnchito-zenn-docs | 89bde3aaef49b06dfc823f83a46b33ce897c8f43 | [
"MIT"
] | null | null | null | books/how-to-read-ruby-reference/changelog.md | JunichiIto/jnchito-zenn-docs | 89bde3aaef49b06dfc823f83a46b33ce897c8f43 | [
"MIT"
] | null | null | null | ---
title: "更新履歴と謝辞"
---
## 更新履歴
以下は軽微な誤字脱字の修正を除いた、本書の更新履歴です。
### 2022-01-10
- 「ユースケースその1 > URLとバージョンの関係を知る」の説明を一部更新。
### 2021-09-03
- 1stリリース。
## 謝辞
表紙画像は[株式会社フィヨルド](https://fjord.jp/)のデザイナー、町田哲平さん([@machida](https://twitter.com/machida))に作成してもらいました。町田さん、どうもありがとうございました!
| 14 | 121 | 0.696429 | yue_Hant | 0.293174 |
da43c9fa1bd5edb316917fad4d376226c2c5487e | 893 | md | Markdown | aws/lambda-install-npm-package.md | FSou1/til | c33f4199614e10c71862a3eb19901609064eed7c | [
"MIT"
] | 2 | 2021-05-04T02:00:17.000Z | 2022-02-02T03:00:32.000Z | aws/lambda-install-npm-package.md | FSou1/til | 0e14455b2668dd11368e6307fd04df81542273fa | [
"MIT"
] | null | null | null | aws/lambda-install-npm-package.md | FSou1/til | 0e14455b2668dd11368e6307fd04df81542273fa | [
"MIT"
] | null | null | null | # How to use npm packages in AWS Lambda
You cannot load NPM modules without uploading a `.zip` file.
In order to upload a `.zip` file, you need:
* Put a lamda function file in a separate directory
* Install necessary npm package(s)
* Make sure the function works locally with `node lambdaFunc.js`
* Go to the directory and compress the contents ([How to create a Zip archive with CLI](/7z/create-zip-with-cli.md))
* Upload the function package:
```
aws lambda update-function-code --function-name poc --region us-east-2 --zip-file fileb://C:/Projects/git/SeasonedDeveloper/path-to-function/lambdaFunc.zip
```
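
For reference, a minimal handler that depends on an npm package might look like this (`axios` here is an arbitrary example dependency, not something the steps above require):

```js
// lambdaFunc.js -- hedged sketch; axios stands in for any npm dependency
const axios = require('axios');

exports.handler = async (event) => {
  const res = await axios.get('https://example.com');
  return { statusCode: 200, body: `Fetched ${res.data.length} characters` };
};
```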
You may need to:
* specify a region with the `--region` parameter
* configure AWS CLI with `aws configure`
* create a user with AWS Access Key & Secret
References:
* https://stackoverflow.com/a/34439125/2524304
* https://docs.aws.amazon.com/lambda/latest/dg/nodejs-package.html | 40.590909 | 155 | 0.75252 | eng_Latn | 0.813905 |
da44f06fc2cf1760773598b109349125ee56efda | 4,117 | md | Markdown | README.md | xdissent/node-sdl | 830ab42cbd9d39c70989a86182cb60351d7ef804 | [
"MIT"
] | 90 | 2015-01-12T20:40:49.000Z | 2022-02-12T02:00:42.000Z | README.md | deadbeef84/node-sdl | 830ab42cbd9d39c70989a86182cb60351d7ef804 | [
"MIT"
] | 7 | 2015-11-29T03:17:17.000Z | 2021-03-01T19:14:32.000Z | README.md | deadbeef84/node-sdl | 830ab42cbd9d39c70989a86182cb60351d7ef804 | [
"MIT"
] | 19 | 2015-03-17T13:30:49.000Z | 2021-09-06T05:22:23.000Z | # node-sdl ( Simple DirectMedia Layer bindings for node.js )
## 0. Installation
Currently, installation is finicky and largely depends upon my specific system for now. A future release is planned to make this process better. If you want to get it working, you need to have the dylibs mentioned in bindings.gyp under /usr/local/lib. (this includes the main SDL2 dylib, SDL2_ttf, and SDl2_image)
If you have those libraries, and clone node-sdl, you can build it with
<pre> node-gyp configure build </pre>
## 1. Usage
As a general rule, these bindings adhere to the following conventions.
* SDL structs are wrapped at the base level of the bindings. If you want a Window, it will be under sdl.Window.
* Structs that you would normally pass to various functions are instead wrapped as objects with prototype functions, and are created with the new keyword. As an example, instead of doing something such as <pre>sdl.GetWindowWidth(window)</pre>, you would instead do <pre>var window = new sdl.Window(...); window.getWidth()</pre>.
* Constants, enums, etc. are split up into various namespaces below the base namespace, based on where the underscores are in their name. This roughly translates <pre>SDL_WINDOWPOS_CENTERED</pre> into <pre>sdl.WINDOWPOS.CENTERED</pre>.
* Extensions to the base SDL API are under their own namespace. While you would find <pre>sdl.Window</pre> or <pre>sdl.Renderer</pre>, anything under the SDL_ttf library would be under <pre>sdl.TTF</pre> If you want to make a font, you would use <pre>new sdl.TTF.Font</pre>.
## 2. Specific usage
### 2.1 Initialization and Shutdown
Explaining the basics behind finding pieces of the SDL API is all well and good, but examples are still the best. So here is a quick and easy way to create a new window using node-sdl.
<pre>sdl.init(sdl.INIT.EVERYTHING); // Initialize all SDL subsystems.
var window = new sdl.Window("Test Window", sdl.WINDOWPOS.CENTERED, sdl.WINDOWPOS.CENTERED, 640, 480);
setTimeout(function() { sdl.quit(); }, 2000);</pre>
It's that easy. Though one thing to be aware of: in that example I declared a window in the global space. Because that is the only place it is referenced, once the script finishes (meaning after the call to setTimeout) there will be no more references to it. That means the window will get garbage collected sometime in the future. And because the bindings handle destroying SDL objects when the wrapping object gets destructed, that means the window will disappear, seemingly randomly. Make sure you keep a reference to all objects you want to persist somewhere, or you might find your window disappearing without warning.
### 2.2 Events
Currently, events are wrapped as a pure Javascript object. So trying to access properties of the event that don't exist for that event will give back undefined. You can determine the exact type of event by checking <pre>event.type</pre> just like in SDL. Same goes for all the other properties. If an event would have a key member, that will be in the Javascript object, etc.
## 3. License
Copyright (c) 2014 Tim "creationix" Caswell
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.