# **Scroller for SwiftUI**
Scroller lets you animate individual views based on scroll position. Built with SwiftUI, this library supports iOS and macOS.
[](https://developer.apple.com/macOS)
[](https://developer.apple.com/iOS)
[](https://developer.apple.com/macOS)
[](https://www.instagram.com/dev.fabula)
[](https://developer.apple.com/documentation/swift_packages/package/)
[](https://opensource.org/licenses/MIT)
## Screenshot
|Example|Vertical|Horizontal|
|:---:|:---:|:---:|
|<img src="Markdown/Scroller.gif">|<img src="Markdown/ScrollerVertical.gif">|<img src="Markdown/ScrollerHorizontal.gif">|
## Example
[https://fabulaapp.page.link/222](https://fabulaapp.page.link/222)
[https://fabulaapp.page.link/223](https://fabulaapp.page.link/223)
## Usages
1. Scroller
```swift
Scroller(.vertical, value: $valueV) {
ForEach(0...5, id: \.self) { index in
GeometryReader { proxy in
ScrollerVContent(value: proxy.scrollerValue(.vertical))
}
}
} lastContent: {
Rectangle()
.fill(Color.blue)
.overlay(Text("LastView"))
.foregroundColor(Color.white)
}
```
2. Each view only needs to conform to the ScrollerContent protocol.
```swift
struct ScrollerVContent: ScrollerContent {
/// Bind each view's scroll-relative value. It is a value between 0 and 1.
var value: CGFloat = 0
var body: some View {
GeometryReader { proxy in
ScrollerInfoView(axes: .vertical, value: value, proxy: proxy)
.offset(y: proxy.size.height * value)
.padding(10)
Rectangle().fill(Color.blue)
.frame(width: proxy.size.width * value, height: 5)
.offset(y: proxy.size.height * value)
}
.background(Color.orange.opacity(1.0 - value))
}
}
```
## Swift Package Manager
The Swift Package Manager is a tool for automating the distribution of Swift code and is integrated into the Swift compiler. Once you have your Swift package set up, adding Scroller as a dependency is as easy as adding it to the `dependencies` value of your `Package.swift`.
```swift
dependencies: [
.package(url: "https://github.com/jasudev/Scroller.git", .branch("main"))
]
```
## Contact
instagram : [@dev.fabula](https://www.instagram.com/dev.fabula)
email : [[email protected]](mailto:[email protected])
## License
Scroller is available under the MIT license. See the [LICENSE](LICENSE) file for more info.
# Third Party Licenses
This document tracks third-party software distributed by this project:
- AngularJS [https://angularjs.org/]
- Zurb Foundation [https://github.com/zurb/foundation-sites/blob/develop/LICENSE]
- Font Awesome [https://fortawesome.github.io/Font-Awesome/license/]
- JSHashtable fork [https://github.com/enriquecastl/jshashtable]
- JSHashSet fork [https://github.com/enriquecastl/jshashset]
- ngSortable [https://github.com/a5hik/ng-sortable]
- Fuse.js [http://kiro.me/projects/fuse.html]
- Mass AutoComplete [http://hakib.github.io/MassAutocomplete/]
- Elastic Input [http://jacek-pulit.github.io/angular-elastic-input/]
- Angular Translate [https://github.com/angular-translate/angular-translate/blob/master/LICENSE]
- Angular File Saver [https://github.com/alferov/angular-file-saver]
- CouchDB [https://www.apache.org/licenses/LICENSE-2.0]
- Carneades [https://github.com/carneades/carneades-4/blob/master/LICENSE]
- Go programming Language [https://golang.org/LICENSE]
- Go Yaml parser [https://github.com/go-yaml/yaml/blob/v2/LICENSE] and [https://github.com/go-yaml/yaml/blob/v2/LICENSE.libyaml]
- SWI Prolog [http://www.swi-prolog.org/license.html]
# ionic-docker-microservices-spring
# SemViz
semviz.org source
# Week 4 Report
#### Auto-scaling team
### A. Overview
**1. cAdvisor**
cAdvisor (Container Advisor) provides container users an understanding of the resource usage and performance characteristics of their running containers. It is a running daemon that collects, aggregates, processes, and exports information about running containers.
Specifically, for each container it keeps resource isolation parameters, historical resource usage, histograms of complete historical resource usage and network statistics. This data is exported by container and machine-wide.
- Performance Metrics:
+ CPU: total usage, usage per core, usage breakdown (Hz)
+ Memory: total usage(Byte)
+ Network: Throughput-Tx bytes,Rx bytes (Bytes per second), Errors(Errors per second) - Tx bytes, Rx bytes
+ Filesystem (Storage): total usage (Byte)
- Frequency of data collection:
+ Real-time collection (one sample per second)
- Size of data for one monitored Docker container:
>size of a sample data unit × monitoring time (in time units) × data-collection frequency
+ Structure of a sample data unit:
Time|Sequence_number|fs_limit|Machine|memory_usage|container_name|cpu_cumulative_usage|memory_working_set|rx_bytes|tx_errors|tx_bytes|fs_device|rx_errors|fs_usage
--|--|--|--|--|--|--|--|--|--|--|--|--|--|
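Plugging illustrative numbers into that formula gives a rough feel for the storage volume (the values below are assumptions, not cAdvisor defaults):

```python
# Back-of-the-envelope sizing for one container's metrics.
# All three inputs are illustrative assumptions.
SAMPLE_SIZE_BYTES = 200         # size of one sample record
SAMPLES_PER_SECOND = 1          # real-time collection, one sample per second
MONITORING_SECONDS = 24 * 3600  # one day of monitoring

total_bytes = SAMPLE_SIZE_BYTES * SAMPLES_PER_SECOND * MONITORING_SECONDS
print(total_bytes / 1024 / 1024)  # roughly 16.5 MiB per container per day
```

With these assumptions a single container generates tens of megabytes per day, which is why a time-series store like InfluxDB is used instead of keeping everything in memory.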
**2. InfluxDB**
InfluxDB is a time series, metrics, and analytics database. cAdvisor only displays realtime information and doesn't store the metrics. We need to store the monitoring information which cAdvisor provides in order to display a time range other than realtime.
Feature:
- Time-Centric Functions
- Events
- Powerful Query Language
- Scalable Metrics
- Native HTTP API
- Built-in Explorer
Database structure:
+ By time series
+ Metrics data and events data oriented
+ A sample time series record:
Time|Sequence_number|field 1|field 2|field 3|....
---|---|---|---|---|---
+ Store billions of data points.
Aggregate record:
+ Merge multiple series together
+ Group by time range
+ Graph visualized
+ Powerful aggregate function: sum, mean, max, count, median...
+ SQL-like query language
```sh
Example:
select count(type) from events group by time(10m), type
into events.count_per_type.10m
```
Client Libraries
+ Support for interacting with InfluxDB over the HTTP protocol (read, write, insert, ...)
+ Support for many languages: JavaScript, Ruby, Rails, Python, PHP, Perl, .NET...
Get more [here](https://influxdb.com/)
**3. Grafana:**
Grafana is a leading open source application for visualizing large-scale measurement data. The Grafana Dashboard allows us to pull all the pieces together visually. This powerful Dashboard allows us to run queries against the InfluxDB and chart them accordingly in a very nice layout.
Features:
+ graph

+ singlestat

+ annotation

Data aggregate
+ Interacting with InfluxDB
+ Query template/ editor for InfluxDB
HTTP API
+ The Grafana backend exposes an HTTP API, the same API is used by the frontend to do everything from saving dashboards, creating users and updating data sources.
Get more [here](http://docs.grafana.org/)
### B. Installation of Docker Monitoring
#### 1. Install InfluxDB
- command:
```
$ docker run -d -p 8083:8083 -p 8086:8086 --expose 8090 --expose 8099 -e PRE_CREATE_DB=cadvisor --name influxsrv tutum/influxdb:0.8.8
```
+ `-p 8083:8083`: user interface; log in with username `admin`, password `admin`
+ `-p 8086:8086`: API port for interaction with other applications
+ `PRE_CREATE_DB=cadvisor`: create a database named `cadvisor`
+ `--name influxsrv`: name the container `influxsrv` so cAdvisor can link to it
#### 2. Install the cAdvisor container and link it to the InfluxDB container
- command:
```
docker run \
--volume=/:/rootfs:ro \
--volume=/var/run:/var/run:rw \
--volume=/sys:/sys:ro \
--volume=/var/lib/docker/:/var/lib/docker:ro \
--publish=8080:8080 \
--link=influxsrv:influxsrv \
--detach=true \
--name=cadvisor \
google/cadvisor:0.14.0 \
-storage_driver=influxdb \
-storage_driver_db=cadvisor \
-storage_driver_host=influxsrv:8086
```
+ `--publish=8080:8080` : user interface
+ `--link=influxsrv:influxsrv`: link to container influxsrv
+ `-storage_driver=influxdb`: set the storage driver as InfluxDB
+ Specify what InfluxDB instance to push data to:
* `-storage_driver_host=influxsrv:8086`: the *ip:port* of the database. Default is 'localhost:8086'
* `-storage_driver_db=cadvisor`: database name. Uses the db 'cadvisor' by default
- After installing successfully, access the url `http://localhost:8080`. You should now see cAdvisor gathering statistics on your Docker host and containers
#### 3. Install the Grafana Dashboard and link it to the InfluxDB container:
- command:
```
docker run -d -p 3000:3000 \
-e HTTP_USER=admin \
-e HTTP_PASS=admin \
-e INFLUXDB_HOST=localhost \
-e INFLUXDB_PORT=8086 \
-e INFLUXDB_NAME=cadvisor \
-e INFLUXDB_USER=root \
-e INFLUXDB_PASS=root \
--link=influxsrv:influxsrv \
grafana/grafana:2.0.2
```
- After installing successfully, access the url `http://localhost:3000` and configure it to link to InfluxDB:
+ Login: username – admin, password – admin
+ Click on the Grafana icon in the upper left hand corner of the GUI, then click: Data Sources → Add New and fill in the information as shown in the image:

- Configure monitoring statistics:
+ Click Dashboard → Home Menu → New → Add Panel → Graph

+ Click "no title" → edit, then write the query for the graph in the metric tab.

+ Example graphs:

#### References
- https://www.brianchristner.io/how-to-setup-docker-monitoring/
- You can use json file to create dashboards: https://github.com/vegasbrianc/docker-monitoring
---
title: "Cherry On the Cake"
subtitle: Electronic Direct Message
layout: default
modal-id: 8
date: 2014-07-15
img: portfolio/8.jpg
thumbnail: thumbnail/8.jpg
alt: Electronic Direct Message
project-date: 2018
client: None
category: Design
description: Clean and innovative design to send out to customers on MailChimp, GetResponse or any email marketing app.
---
---
title: "Day 36"
date: 2018-10-08
tags: []
draft: False
---
# (More) Types
### Polymorphism
Degrees of polymorphism are:
- Parametric Polymorphism - full polymorphism of any type: `a -> b -> a`
- Constrained Polymorphism - polymorphism constrained to a typeclass: `(Num a, Ord a) => a`
- Concrete type - not polymorphic: `[Char] -> Char`
It is beneficial to keep a function as polymorphic as possible, as it saves you from creating different functions for different types.
But as soon as you apply a function belonging to a typeclass you constrain the argument to the typeclass making it a constrained polymorphic function. This is often necessary as certain functions are only applicable to a certain typeclass and then even more functions are applicable to specific data types; so the more specified you make the type, the greater access to functions you have.
### Typeclass-constrained type variables
Certain functions are constrained to certain typeclasses, sub classes and types; analogous to a tree of compatibility. Functions from the stem can be applied to the branches but not the other way round.
e.g. Take the function:
```
divideLength = 6 / length [1, 2, 3]
```
This does not compile because `/` belongs to the `Fractional` typeclass and, being an infixr operator, applies to the expression on its right, while `length` returns an `Int`, which is not a `Fractional`, so the types are not compatible. You would either have to use functions of the same type...
**Or** you can convert the result of `length` to the more general `Num` typeclass using `fromIntegral`. Then, when the `/` operator is applied to it, the `Num` constraint can be specialised to `Fractional`.
```
Prelude> divideLength = 6 / fromIntegral ( length [1, 2, 3] )
Prelude> divideLength
2.0
```
Note: the answer is in decimal form as it is a `Fractional`.
### Type signatures for HOFs
Type signatures are a useful tool to read a function and determines its purpose, it specifies the application of the arguments regarding their respective types.
I found type signature fairly straight forward until I got to the following exercise:
Write the undefined function which will allow this type signature to compile?
```
munge :: (x -> y) -> (y -> (w, z)) -> x -> w
```
At first I got too bogged down with the amount of arrows and letters that were there. It seemed a bit mad.
Then I tried to dissect it logically from the type **signature** (key word).
The output type needs to be `w`. Working back from `w`: I need the first element of the tuple `(w, z)`, which I can get with `fst` (a parametrically polymorphic function); but that requires a `y` as input, and the first argument `x -> y` produces exactly that from an `x`, the remaining argument. Nesting these gives `w`!
```
munge :: (x -> y) -> (y -> (w, z)) -> x -> w
munge xy ywz x = fst (ywz (xy x))
```
Perhaps a bad explanation, but it's hard to explain.
## Final project
The goal of the project was to compare the best and worst cases for a sequence of operations on an AVL tree.
---
layout: post
title: "[Python 04: Modules and Packages] - Modules and Packages"
subtitle: "Modules and packages"
categories: language
tags: python
comments: true
---
## Modules
>A module is a file that groups functions, variables, or classes together.

### 1. Importing a module (import)
```
import module_name
```
The module's name is registered in the global namespace, so its functions can be called as module_name.function.
```
from module_name import function_name
```
This lets you use a function inside the module without prefixing the module name.
```
from module_name import *
```
The \* character means "everything": it imports every function in the module for use.
```
from module_name import function_name as alias
import module_name as alias
```
When the same module name already exists or could cause confusion, you can append `as` to change the name the module is used under.
### 2. if \_\_name\_\_ == \_\_main\_\_
Code that should run only when the file is executed as `python <filename>` is wrapped in this if statement.
\_\_name\_\_: a global variable of the module
\_\_main\_\_: the module run directly by the Python interpreter gets the name \_\_main\_\_
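A minimal sketch of this guard (the module name `greet` and its contents are just an illustration):

```python
# greet.py: runs its demo only when executed directly with `python greet.py`.

def greet(name):
    """Importable from other modules without side effects."""
    return "Hello, " + name + "!"

if __name__ == "__main__":
    # __name__ is "__main__" only for the module the interpreter started with;
    # when this file is imported, __name__ is "greet" and nothing is printed.
    print(greet("Python"))
```

Importing `greet` from another module defines the function but prints nothing, which is exactly the point of the guard.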
---
## Packages
>A package is a special folder that groups modules together.

If you put an \_\_init\_\_.py file in the folder you want to use as a package, that folder is treated as a package.
```
├── functions
│ ├── __init__.py
│ ├── game.py
│ └── shop.py
└── lol.py
```
\*, \_\_all\_\_
When importing the sub-packages and modules contained in a package with \*, all identifiers of each module are imported.
If a module wants to control which names are pulled in when it is imported this way, it can define \_\_all\_\_.
If there are items you want imported automatically when the package itself is imported, import them in the package's \_\_init\_\_.py file.
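To see the mechanism end to end, the sketch below builds the `functions` package from the tree above on the fly (the module contents are assumptions) and then imports it:

```python
import importlib
import os
import sys
import tempfile

# Create the package directory matching the tree above.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "functions")
os.makedirs(pkg)
with open(os.path.join(pkg, "game.py"), "w") as f:
    f.write("def play():\n    return 'playing'\n")
with open(os.path.join(pkg, "shop.py"), "w") as f:
    f.write("def buy():\n    return 'buying'\n")
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    # Names imported here become available on `import functions` itself;
    # __all__ controls what `from functions import *` brings in.
    f.write("from .game import play\nfrom .shop import buy\n__all__ = ['play', 'buy']\n")

sys.path.insert(0, root)
functions = importlib.import_module("functions")
print(functions.play(), functions.buy())  # playing buying
```

Because `__init__.py` re-exports `play` and `buy`, callers never need to know which submodule each function lives in.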
# Table: aws_glacier_vault
A vault is a way to group archives together in Amazon S3 Glacier.
## Examples
### Basic info
```sql
select
vault_name,
creation_date,
last_inventory_date,
number_of_archives,
size_in_bytes
from
aws_glacier_vault;
```
### List vaults that grant full access to the resource
```sql
select
title,
p as principal,
a as action,
s ->> 'Effect' as effect,
s -> 'Condition' as conditions
from
aws_glacier_vault,
jsonb_array_elements(policy_std -> 'Statement') as s,
jsonb_array_elements_text(s -> 'Principal' -> 'AWS') as p,
jsonb_array_elements_text(s -> 'Action') as a
where
s ->> 'Effect' = 'Allow'
and a in ('*', 'glacier:*');
```
### List vaults that grant anonymous access to the resource
```sql
select
title,
p as principal,
a as action,
s ->> 'Effect' as effect,
s -> 'Condition' as conditions
from
aws_glacier_vault,
jsonb_array_elements(policy_std -> 'Statement') as s,
jsonb_array_elements_text(s -> 'Principal' -> 'AWS') as p,
jsonb_array_elements_text(s -> 'Action') as a
where
p = '*'
and s ->> 'Effect' = 'Allow';
```
### Get the archival age in days before deletion for each vault
```sql
select
title,
a as action,
s ->> 'Effect' as effect,
s -> 'Condition' -> 'NumericLessThan' ->> 'glacier:archiveageindays' as archive_age_in_days
from
aws_glacier_vault,
jsonb_array_elements(vault_lock_policy_std -> 'Statement') as s,
jsonb_array_elements_text(s -> 'Action') as a;
```
### List vaults without owner tag key
```sql
select
vault_name,
tags
from
aws_glacier_vault
where
not tags :: JSONB ? 'owner';
```
---
layout: post
title: "OOP Lesson 8: Software Design Principles II"
tags: [POO, OOP, Programación Orientada a Objetos, Object Oriented Programming, Software Design Principles]
---
What is good software design? How can we measure it? What practices do you need to follow to get there? How can you make your architecture flexible, stable, and easy to understand?
These are the big questions; fortunately, the answers differ depending on the type of application you want to build. There are, however, many universal principles of software design that can help you answer these questions for your own project. Most of the design patterns you will come across are based on these principles.
### Principle 1: :package: Encapsulation
<hr>
Identify the aspects of your application that vary and separate them from those that stay the same.
<hr>
**The main goal of this principle is to minimize the effect caused by changes.**
Imagine that our program is a ship, and changes are mines lying underwater. If we hit one, the ship sinks.
Knowing this, we can divide the ship's hull into independent compartments that can be safely sealed, with the goal of limiting the damage to a single compartment.
If we now hit a mine, the ship **as a whole** will likely stay afloat thanks to this mechanism, which confines the damage to just one part of the ship.
In the same way, we can **isolate the parts** of a **program** that vary into **independent modules**, protecting the rest of the code from any adverse effects that module might cause. As a result, you spend less time getting the program working again when it fails because of one particular module, implementing and testing changes in a more isolated way without fearing that another part will break. The less time you spend making changes, or verifying that a change in one feature does not affect the rest of the program, the more time you have to implement new features.
#### Encapsulation: :scroll: at the method level
Let's say we are building an e-commerce site. Somewhere in our code there is a `getOrderTotal` method that calculates the total price of the order, including taxes, surcharges, shipping, and so on.
We can anticipate that the **tax-related code may change in the future**.
The tax amount depends on the country, the state, or perhaps the city where the customer resides, and the current formula could change over time due to new laws or regulations, something very common in Argentina, for example. As a result, we will need to change the `getOrderTotal` method quite often. But the method's name suggests that it does not care about *how* the tax is calculated.
```
method getOrderTotal() is
total = 0
foreach item in this.lineItems
total += item.price * item.quantity
if (this.country == "US")
total += total * 0.07 // US VAT
else if (this.country == "EU"):
total += total * 0.20 // European VAT
return total
```
BEFORE: the tax calculation is computed alongside the rest of the method's code.
We can extract the tax-calculation logic into a separate method, hiding it from the original method.
```
method getOrderTotal() is
total = 0
foreach item in order.lineItems
total += item.price * item.quantity
total += total * getTaxRate(this.country)
return total
method getTaxRate(country) is
if (country == "US")
return 0.07 // US VAT
else if (country == "EU")
return 0.20 // European VAT
else
return 0
```
AFTER: you can get the tax amount simply by calling the method designated to calculate it.
Tax-related changes stay isolated inside a single method.
Furthermore, if the tax-calculation logic becomes too complicated, it is now easier to move it to a separate class.
#### Encapsulation: :file_folder: at the class level
Over time, we may add more and more responsibilities to a method that used to do something simple. These additional behaviors often come with their own auxiliary fields and methods that eventually blur the primary responsibility of the class containing them. Extracting everything into a new class can make things much clearer and simpler.
<p align="center">
<img width="30%" src="https://user-images.githubusercontent.com/22304957/72668237-8346f580-3a03-11ea-8264-d063fd9f3621.png"/>
BEFORE: we calculated the taxes in the `Order` class
</p>
What we can do now is have objects of the `Order` class delegate all tax-related calculations to a special object dedicated to that particular task: `TaxCalculator`.
<p align="center">
<img width="50%" src="https://user-images.githubusercontent.com/22304957/72668236-8346f580-3a03-11ea-9fa0-2bc178bee4f7.png"/>
</p>
```
method getOrderTotal() is
  total = 0
  foreach item in this.lineItems
    subtotal = item.price * item.quantity
    total += subtotal + subtotal * taxCalculator.getTaxRate(country, state, item.product)
  return total
```
AFTER: the tax calculation is hidden from the `Order` class behind the `TaxCalculator` collaborator
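A runnable Python sketch of this delegation (the class shapes follow the pseudocode; the rates are illustrative, not real tax rules):

```python
class TaxCalculator:
    """All tax logic lives here, so Order never changes for tax reasons."""
    RATES = {"US": 0.07, "EU": 0.20}  # illustrative rates only

    def get_tax_rate(self, country, state=None, product=None):
        return self.RATES.get(country, 0.0)


class Order:
    def __init__(self, line_items, country, tax_calculator):
        self.line_items = line_items          # list of (price, quantity) pairs
        self.country = country
        self.tax_calculator = tax_calculator  # injected collaborator

    def get_order_total(self):
        total = 0.0
        for price, quantity in self.line_items:
            subtotal = price * quantity
            rate = self.tax_calculator.get_tax_rate(self.country)
            total += subtotal + subtotal * rate
        return total


order = Order([(10.0, 2)], "US", TaxCalculator())
print(order.get_order_total())  # roughly 21.4 with the illustrative 7% rate
```

Swapping in a different calculator (say, one loaded with updated legal rates) requires no change to `Order` at all.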
#### Principle 2: :clipboard: Program to an Interface, not an Implementation (a contract)
<hr>
Program to an interface, not an implementation. Depend on abstractions, not on concrete classes.
<hr>
We could say the design is flexible enough if you can easily extend it without breaking existing code. Let's make sure this holds by looking at another example with cats.
A `Cat` that can eat any kind of food is more flexible than one that can only eat fish, say. You can feed the first cat fish, because fish falls within "any food"; but you can also extend that cat's menu with any other food, for example kibble, whereas the second cat, which can only eat fish, cannot have its menu extended.
When we want to make two classes collaborate, we usually start by making one depend on the other. Hell! I often start by doing this myself. However, there is a more flexible way to set up a collaboration between objects:
1. Determine exactly what one object needs from the other: which methods will it call?
2. Describe these methods in a new **interface**.
3. Make the class that is the dependency implement this newly created interface, so that it fulfills the contract correctly.
4. Now make the second class depend on this interface, taking it as a collaborator. You could also make it work by wiring up the original objects directly, but the connection through `interfaces` is much more flexible.
<p align="center">
<img src="https://user-images.githubusercontent.com/22304957/72668510-7081f000-3a06-11ea-99e7-bf7b676e1672.png"/>
</p>
Look at the **before and after** of extracting the **interface**. The code on the **right** is **more flexible** than the code on the left, but it is also true that it is **somewhat more complicated to build and follow**.
After making this change, you probably won't feel any immediate benefit. On the contrary, the code has become more complicated than it was before. However, if you feel this could be a good point to add extra functionality, or that other people will use your code and want to extend its functionality more easily, then take this path via `interfaces`.
#### Example
Let's look at another example illustrating that working with objects through **interfaces** can be more beneficial than depending on **concrete classes**. Imagine we are creating a simulator of a software development company. We have different classes that represent various types of employees.
<p align="center">
<img src="https://user-images.githubusercontent.com/22304957/72668673-184bed80-3a08-11ea-9a99-e40c5b0e1bb7.png"/>
All the classes are tightly coupled to one another.
</p>
Initially, the `Company` class is tightly coupled to the concrete employee classes. However, despite the differences between the implementations, we can generalize the work-related methods and extract them into an interface common to all the employee classes.
After doing this, we can apply `polymorphism` inside the `Company` class, treating the various employee objects through the `Employee` interface.
<p align="center">
<img src="https://user-images.githubusercontent.com/22304957/72668751-c22b7a00-3a08-11ea-8f4f-914a2fc95aed.png"/>
Polymorphism helps us simplify the code, but the rest of the `Company` class still depends on the concrete `Employee` classes.
</p>
The `Company` class remains coupled to the `Employee` classes. This is bad because if we introduce new types of companies that work with other types of employees, we will need to rewrite most of the `Company` class instead of reusing the code.
To solve this problem, we can declare the method for obtaining employees as `abstract`. Each concrete `Company` class will implement this method differently, creating only the employees it needs.
<p align="center">
<img src="https://user-images.githubusercontent.com/22304957/72668827-880ea800-3a09-11ea-8405-8559b066fadc.png"/>
The primary method of the `Company` class is independent of the concrete employee classes. The `Employee` objects are created in the concrete `Company` subclasses.
</p>
After this change, the `Company` class becomes independent of the various `Employee` classes. Now you can extend this class and introduce new types of companies and employees while still reusing the base portion of the `Company` class. Extending the `Company` class does not break any existing code that already relies on it.
By the way, you have just seen the application of a design pattern in action! That was an example of the `Factory Method` pattern.
Don't worry: we will surely discuss it in more detail later, in one of the pattern write-ups we produce.
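A minimal sketch of that Factory Method structure in Python (the class names follow the diagrams above; the concrete `GameDevCompany` is a hypothetical example, not from the original):

```python
from abc import ABC, abstractmethod


class Employee(ABC):
    """The common interface extracted from the concrete employee classes."""

    @abstractmethod
    def do_work(self) -> str: ...


class Designer(Employee):
    def do_work(self) -> str:
        return "designing"


class Programmer(Employee):
    def do_work(self) -> str:
        return "coding"


class Company(ABC):
    """The primary method depends only on the Employee interface."""

    @abstractmethod
    def get_employees(self) -> list:
        """The 'factory method': each concrete company decides whom to create."""

    def create_software(self) -> list:
        # Works with any employees, created by any subclass.
        return [employee.do_work() for employee in self.get_employees()]


class GameDevCompany(Company):
    """A hypothetical concrete company that creates only the employees it needs."""

    def get_employees(self) -> list:
        return [Designer(), Programmer()]
```

A new kind of company only has to override `get_employees`; `create_software` is reused as-is.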
#### Principle 3: ✍️ Composition over Inheritance.
Inheritance is probably the most obvious and easiest way to reuse code between classes. Suppose you have two classes with the same code; you proceed as follows: create a common base class for those two classes, move the shared code there, then extend that base class, and you're done. Easy!
Unfortunately, inheritance comes with caveats that often become apparent only after your program already has tons of classes and changing anything has become quite difficult. Here is a list of those problems:
- A subclass cannot reduce the interface of its superclass. It has to implement all the abstract methods of the parent class even if it does not need them.
- When overriding methods, we need to make sure the new behavior is compatible with that of the base class. This is important because objects of the subclass may be passed to code that expects objects of the superclass, and we do not want that code to break.
- Inheritance breaks the encapsulation of the superclass, because the internal details of the parent class become available to the subclass. The opposite situation can also occur, in which a programmer makes a superclass aware of some details of its subclasses for the sake of easier further extension; in either case, it is not the right path.
- Subclasses are tightly coupled to their superclasses. Any change in a superclass may break the functionality of its subclasses.
- Trying to reuse code through inheritance can lead to creating parallel inheritance hierarchies.
Inheritance usually takes place in a single dimension.
But whenever there are two or more dimensions, you have to create many combinations of classes, bloating the class hierarchy to a ridiculous size.
**There is an alternative to inheritance called composition.**
Whereas inheritance represents the "is a" relationship between classes (a car is a transport), and always will, composition represents the "has a" relationship (a car has an engine).
I should mention that this principle also applies to aggregation, a more relaxed variant of composition in which one object can hold a reference to another but does not manage its lifecycle.
Here is an example: a car has a driver, but he or she can use another car, or simply walk, without using a car at all.
#### Example
Imagine we need to build a catalog app for a car manufacturer. The company makes cars and trucks; they can be electric or gas-powered; and all models come with either manual or automatic controls.
<p align="center">
<img src="https://user-images.githubusercontent.com/22304957/72669268-5a782d80-3a0e-11ea-96f8-dbac3364c855.png"/>
INHERITANCE: extending a class along several dimensions can lead to a combinatorial explosion of subclasses.
</p>
As we can see, each additional parameter multiplies the number of subclasses. There is a lot of duplicated code between subclasses, because a subclass cannot extend two classes at the same time, at least not in most of today's languages.
We can solve this problem with composition. Instead of a `Car` object implementing the behaviors itself, we can delegate them to other objects.
The added benefit is that you can replace behavior at run time. For example, you can replace the `Engine` object linked to a `Car` object simply by assigning a different `Engine` to that `Car`.
<p align="center">
<img src="https://user-images.githubusercontent.com/22304957/72669269-5ba95a80-3a0e-11ea-98ef-87a06ce0cffc.png"/>
COMPOSITION: different "dimensions" of functionality extracted into their own class hierarchies.
</p>
This class structure resembles the `Strategy` pattern, where we use different implementations to reach the same result.
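A minimal Python sketch of this composition-based structure (the engine classes are illustrative stand-ins for the hierarchies in the diagram):

```python
class CombustionEngine:
    def move(self) -> str:
        return "burning gas"


class ElectricEngine:
    def move(self) -> str:
        return "drawing power from the battery"


class Transport:
    """A transport HAS an engine; the behavior lives in a separate hierarchy."""

    def __init__(self, engine) -> None:
        self.engine = engine

    def deliver(self) -> str:
        return f"moving by {self.engine.move()}"


car = Transport(CombustionEngine())
car.engine = ElectricEngine()  # swap the behavior at run time, no subclassing needed
```

Swapping `ElectricEngine` in at run time is exactly what the inheritance-based hierarchy cannot do without creating yet another subclass.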
| 65.880734 | 611 | 0.78833 | spa_Latn | 0.996955 |
3e6afa08b9fa164f6ee1f00117827561f0c1579e | 2,042 | md | Markdown | CS320/CS320Assignment4-master/README.md | ko-takahashi/college | c333f1e1767f1206687f5e9b0fb3f0145b2d5d6a | [
"MIT"
] | null | null | null | CS320/CS320Assignment4-master/README.md | ko-takahashi/college | c333f1e1767f1206687f5e9b0fb3f0145b2d5d6a | [
"MIT"
] | null | null | null | CS320/CS320Assignment4-master/README.md | ko-takahashi/college | c333f1e1767f1206687f5e9b0fb3f0145b2d5d6a | [
"MIT"
] | null | null | null | Bryan Phan
[email protected]
This project was meant to familiarize us with Python programming and virtual machines.
The first part of the program (prog4_1.py) was written in Python and implements the Tokenize(str) and Parse(tokens) functions. The Tokenize function takes an input string and tokenizes it according to the same rules as in assignment #2; we do not have to store the tokenized string, only return true. The Parse function takes as input a list of tokens that was previously produced by the Tokenize function. If there are any mistakes, a ValueError is raised.
The second part of the program (prog4_2.py) was written in Python and implements a StackMachine class. This class has the public Execute(tokens) function and the currentLine property (initial value zero). The Execute(tokens) function accepts a list of tokens that has previously been tokenized and parsed correctly, and performs the operation that the tokens specify. The supported operations are push #, pop, add, sub, mul, div, mod, skip, save #, and get #. Whenever the Execute(tokens) function finishes executing the operation specified by the tokens, the currentLine property is incremented by 1. If, at any time, the program attempts to pop a value that doesn't exist or get a value that has not previously been saved, it raises an IndexError with the message "Invalid Memory Access".
The final part of the program (prog4_3.py) was written in Python and implements a driver program. It imports both prog4_1.py and prog4_2.py. The first command-line argument is read as a file, and the program tokenizes and parses all of the lines of that file. The lines are then stored in an indexable structure so that the tokens for any line can be fetched by line number. It then instantiates a StackMachine class and executes each operation one line at a time. When the StackMachine stops running naturally, the program prints "Program terminated correctly."
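A simplified sketch of the stack machine described above (an illustration, not the graded solution; details such as operand order and integer division are assumptions):

```python
class StackMachine:
    """Executes one tokenized operation at a time, tracking the current line."""

    def __init__(self):
        self.currentLine = 0   # incremented after every executed operation
        self.stack = []
        self.memory = {}       # storage for "save #" / "get #"

    def _pop(self):
        if not self.stack:
            raise IndexError("Invalid Memory Access")
        return self.stack.pop()

    def Execute(self, tokens):
        op = tokens[0].lower()
        if op == "push":
            self.stack.append(int(tokens[1]))
        elif op == "pop":
            self._pop()
        elif op in ("add", "sub", "mul", "div", "mod"):
            b, a = self._pop(), self._pop()
            self.stack.append({"add": a + b, "sub": a - b, "mul": a * b,
                               "div": a // b, "mod": a % b}[op])  # div assumed integer
        elif op == "save":
            self.memory[int(tokens[1])] = self._pop()
        elif op == "get":
            address = int(tokens[1])
            if address not in self.memory:
                raise IndexError("Invalid Memory Access")
            self.stack.append(self.memory[address])
        # "skip" performs no work; every completed operation advances the counter
        self.currentLine += 1
```

Driving it, as the driver program does, is then just a loop that feeds the machine the tokens for one line at a time.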
| 185.636364 | 848 | 0.801175 | eng_Latn | 0.999837 |
3e6b18e37d55e3d76452f0928c7c1072a785d229 | 1,261 | md | Markdown | docs/preprocessor/rename-search-namespace.md | Mdlglobal-atlassian-net/cpp-docs.it-it | c8edd4e9238d24b047d2b59a86e2a540f371bd93 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/preprocessor/rename-search-namespace.md | Mdlglobal-atlassian-net/cpp-docs.it-it | c8edd4e9238d24b047d2b59a86e2a540f371bd93 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/preprocessor/rename-search-namespace.md | Mdlglobal-atlassian-net/cpp-docs.it-it | c8edd4e9238d24b047d2b59a86e2a540f371bd93 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-05-28T15:54:57.000Z | 2020-05-28T15:54:57.000Z | ---
title: rename_search_namespace import attribute
ms.date: 08/29/2019
f1_keywords:
- rename_search_namespace
helpviewer_keywords:
- rename_search_namespace attribute
ms.assetid: 47c9d7fd-59dc-4c62-87a1-9011a0040167
ms.openlocfilehash: 42c6edb6aa34b441db8041dd2974728c138b2c82
ms.sourcegitcommit: 6e1c1822e7bcf3d2ef23eb8fac6465f88743facf
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 09/03/2019
ms.locfileid: "70216630"
---
# <a name="rename_search_namespace-import-attribute"></a>rename_search_namespace import attribute
**C++ Specific**
Has the same functionality as the [rename_namespace](../preprocessor/rename-namespace.md) attribute, but is used on type libraries in which you use the `#import` directive together with the [auto_search](../preprocessor/auto-search.md) attribute.
## <a name="syntax"></a>Syntax
> **#import** *type-library* **rename_search_namespace(**"*NewName*"**)**
### <a name="parameters"></a>Parameters
*NewName*\
The new name of the namespace.
## <a name="remarks"></a>Remarks
**END C++ Specific**
## <a name="see-also"></a>See also
[#import attributes](../preprocessor/hash-import-attributes-cpp.md)\
[#import directive](../preprocessor/hash-import-directive-cpp.md)
| 32.333333 | 244 | 0.766852 | ita_Latn | 0.522509 |
3e6bf86f692f891f92b3f4fefdd5a27a3a21535f | 69 | md | Markdown | articles/app-service-logic/app-service-logic-schema-2015-08-01.md | zhenjiao-ms/azure-docs | c0a229227c1651301b5cd978c3d248c2e22fbb66 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2022-03-22T15:03:27.000Z | 2022-03-22T15:03:27.000Z | articles/app-service-logic/app-service-logic-schema-2015-08-01.md | zhenjiao-ms/azure-docs | c0a229227c1651301b5cd978c3d248c2e22fbb66 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/app-service-logic/app-service-logic-schema-2015-08-01.md | zhenjiao-ms/azure-docs | c0a229227c1651301b5cd978c3d248c2e22fbb66 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2017-02-18T05:45:54.000Z | 2019-12-21T21:23:13.000Z | ---
redirect_url: /azure/logic-apps/logic-apps-schema-2015-08-01
---
| 17.25 | 60 | 0.710145 | swe_Latn | 0.131623 |
3e6c462169d2918f7afff2cca0589cea597026ff | 2,443 | md | Markdown | treebanks/de_pud/de_pud-feat-VerbForm.md | mjabrams/docs | eef96df1ce8f6752e9f80660c8255482b2a07c45 | [
"Apache-2.0"
] | null | null | null | treebanks/de_pud/de_pud-feat-VerbForm.md | mjabrams/docs | eef96df1ce8f6752e9f80660c8255482b2a07c45 | [
"Apache-2.0"
] | null | null | null | treebanks/de_pud/de_pud-feat-VerbForm.md | mjabrams/docs | eef96df1ce8f6752e9f80660c8255482b2a07c45 | [
"Apache-2.0"
] | null | null | null | ---
layout: base
title: 'Statistics of VerbForm in UD_German-PUD'
udver: '2'
---
## Treebank Statistics: UD_German-PUD: Features: `VerbForm`
This feature is universal.
It occurs with 2 different values: `Inf`, `Part`.
62 tokens (0%) have a non-empty value of `VerbForm`.
59 types (1%) occur at least once with a non-empty value of `VerbForm`.
1 lemmas (17%) occur at least once with a non-empty value of `VerbForm`.
The feature is used with 2 part-of-speech tags: <tt><a href="de_pud-pos-VERB.html">VERB</a></tt> (58; 0% instances), <tt><a href="de_pud-pos-AUX.html">AUX</a></tt> (4; 0% instances).
### `VERB`
58 <tt><a href="de_pud-pos-VERB.html">VERB</a></tt> tokens (3% of all `VERB` tokens) have a non-empty value of `VerbForm`.
The most frequent other feature values with which `VERB` and `VerbForm` co-occurred: <tt><a href="de_pud-feat-Mood.html">Mood</a></tt><tt>=EMPTY</tt> (58; 100%), <tt><a href="de_pud-feat-Number.html">Number</a></tt><tt>=EMPTY</tt> (58; 100%), <tt><a href="de_pud-feat-Person.html">Person</a></tt><tt>=EMPTY</tt> (58; 100%), <tt><a href="de_pud-feat-Tense.html">Tense</a></tt><tt>=EMPTY</tt> (39; 67%).
`VERB` tokens may have the following values of `VerbForm`:
* `Inf` (40; 69% of non-empty `VerbForm`): <em>abzuschreiben, aufrechtzuerhalten, begrenzen, behandeln, bekommen, beschäftigen, besuchen, bewegen, durchzuführen, einzunehmen</em>
* `Part` (18; 31% of non-empty `VerbForm`): <em>abgesehen, basierend, gefolgt, gesagt, Breaking, Geformt, abgehärtet, angeklagt, begründet, geeignet</em>
* `EMPTY` (1855): <em>sagte, ist, an, hat, haben, gibt, sagt, war, auf, begann</em>
### `AUX`
4 <tt><a href="de_pud-pos-AUX.html">AUX</a></tt> tokens (0% of all `AUX` tokens) have a non-empty value of `VerbForm`.
The most frequent other feature values with which `AUX` and `VerbForm` co-occurred: <tt><a href="de_pud-feat-Mood.html">Mood</a></tt><tt>=EMPTY</tt> (4; 100%), <tt><a href="de_pud-feat-Number.html">Number</a></tt><tt>=EMPTY</tt> (4; 100%), <tt><a href="de_pud-feat-Person.html">Person</a></tt><tt>=EMPTY</tt> (4; 100%), <tt><a href="de_pud-feat-Tense.html">Tense</a></tt><tt>=Past</tt> (3; 75%).
`AUX` tokens may have the following values of `VerbForm`:
* `Inf` (1; 25% of non-empty `VerbForm`): <em>aufzutreten</em>
* `Part` (3; 75% of non-empty `VerbForm`): <em>bekannt, genannt, geschrieben</em>
* `EMPTY` (946): <em>ist, wurde, war, werden, wird, wurden, sind, hatte, waren, hat</em>
| 59.585366 | 401 | 0.671715 | yue_Hant | 0.297752 |
3e6c82b7e9541f5d42d725bcfc765041f463eabb | 2,666 | md | Markdown | _posts/popculture/2018-05-17-popculture-oceans8.md | psmak3/psmak3.github.io | 17f0c3ef414e840d1008ee073056dd8ecda30a5c | [
"MIT"
] | 1 | 2020-08-06T22:31:34.000Z | 2020-08-06T22:31:34.000Z | _posts/popculture/2018-05-17-popculture-oceans8.md | psmak3/psmak3.github.io | 17f0c3ef414e840d1008ee073056dd8ecda30a5c | [
"MIT"
] | null | null | null | _posts/popculture/2018-05-17-popculture-oceans8.md | psmak3/psmak3.github.io | 17f0c3ef414e840d1008ee073056dd8ecda30a5c | [
"MIT"
] | 1 | 2016-05-28T01:59:29.000Z | 2016-05-28T01:59:29.000Z | ---
layout: post
title: "Lady Parts"
excerpt: "Pop Culture"
categories: popculture
comments: false
share: true
---

Over the past few years, the social left have been demanding more good parts for women in Hollywood. From the Bechdel test to just a bunch of SJW's making a lot of noise, there is a feeling from some that there needs to be not only more females in movies, but parts that do not restrict them to damsels in distress or background eye candy.
Sure, I'll go along with this. I'm an open-minded man of the 21st century, I'm willing to agree with this idea. Also my mom reads this, so I really have no choice but to say I agree.
So what has been the result of this push? Well, we have a female Luke Skywalker, a remake of Ghostbusters with women, and now this: an Ocean's 11 re-remake but with 8 women.
Is this what people had in mind for more women in movies? Just to rehash dusty scripts of days past and cast it anew with females? Seems very lame.
In a business that prides itself on progressiveness and creativity, there seems to be a lot of those things missing. I mean c'mon now....Ocean's 8? That is so lazy and inane that one cannot defend this idea at all.
And it is not that there is no reasonable talent in the cast. Kristen Wiig (UA grad by the way) is very good, Sandra Bullock has moments, and Cate Blanchett is maybe the best female actress going today. So why is the outlet for this female talent engulfed in a 'sloppy seconds' (or thirds in this case) script?
Now we can blame Hollywood for their inane choices for remakes, but we also should look at the women that are cooperating in this. There have been good and very funny movies as of late with primary female casts: Spy, The Heat, and Bridesmaids (which was sort of like a remake of The Hangover, but I digress). There are also very strong female parts in many good big films; Cate Blanchett in Thor 3, Gal Gadot in Wonder Woman, and of course the creme de la creme of strong female parts, Charlize Theron as Furiosa in Mad Max. These women in this new bullpoo Ocean's 8 should not agree to do that tripe.
It is not like there is any proof that females cannot carry a movie or are weak on screen. Quite the opposite. So why mire this talent in some rehashed garbage? And why does the talent let this happen?
Anyway, I guarantee I will not see this movie. I loved the Clooney Ocean's 11, it was hip, fresh, ultra-stylized, and just fun. Then the franchise got stale. I bet my mortgage that by simply infusing the cast with women will not freshen this all up.
| 62 | 599 | 0.768192 | eng_Latn | 0.999781 |
3e6c96dfd46b935dc0e932f41c83ae410c024813 | 10,209 | md | Markdown | _posts/2017-06-07-Du-fsharp-a-newcraft-2017.md | evilz/evilz.github.io | 2c2acae18422d2b91507b2e9dd84f26574c05964 | [
"MIT"
] | null | null | null | _posts/2017-06-07-Du-fsharp-a-newcraft-2017.md | evilz/evilz.github.io | 2c2acae18422d2b91507b2e9dd84f26574c05964 | [
"MIT"
] | null | null | null | _posts/2017-06-07-Du-fsharp-a-newcraft-2017.md | evilz/evilz.github.io | 2c2acae18422d2b91507b2e9dd84f26574c05964 | [
"MIT"
] | null | null | null | ---
layout: post
title: F# at NewCrafts 2017
date: 2017-06-07
author: evilz
comments: true
tags: [dotnet, Informatique]
image: https://farm5.staticflickr.com/4599/38677096174_8c651790b7_z.jpg
---
I was lucky enough to attend both days of [this year's NewCrafts conference](http://ncrafts.io).
Over the two days we got to see many very interesting talks,
but here I will focus on the F# sessions I was able to attend.
## Scott Wlaschin - Thirteen ways of looking at a Turtle
**@ScottWlaschin**

Everyone asks the same question on reading the session title: "A turtle? What...?"
In fact, through the image of this turtle, Scott presents a very simplistic API for drawing lines.
The "turtle" can move forward, turn by a given angle, and finally draw the line (or not) while it moves.
This four-method API may seem too simple to be really interesting, but we get to see it implemented using 13 different approaches.
They range from the classic object-oriented model, through the actor model, to event sourcing.
I strongly recommend taking a look, because even though the code is presented in F#, it remains applicable on other platforms and in other languages.
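As a rough sketch of that tiny API (in Python here; the talk itself uses F#, and the method bodies below are illustrative):

```python
import math


class Turtle:
    """The whole API: move forward, turn by an angle, raise or lower the pen."""

    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.angle = 0.0          # degrees, 0 = pointing along the x axis
        self.is_pen_down = True
        self.lines = []           # segments drawn so far

    def move(self, distance):
        new_x = self.x + distance * math.cos(math.radians(self.angle))
        new_y = self.y + distance * math.sin(math.radians(self.angle))
        if self.is_pen_down:
            self.lines.append(((self.x, self.y), (new_x, new_y)))
        self.x, self.y = new_x, new_y

    def turn(self, degrees):
        self.angle = (self.angle + degrees) % 360

    def pen_up(self):
        self.is_pen_down = False

    def pen_down(self):
        self.is_pen_down = True
```

The interest of the talk is precisely that even an API this small can be realized in thirteen structurally different ways.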
All the material is online on [his famous site](https://fsharpforfunandprofit.com/turtle/)
## Michel Grootjans and Thomas Coopman: Playing with projections
**@michelgrootjans @tcoopman**
J’ai participé à un atelier, où nous avons joué avec des projections.
Lorsque l’on fait de l’event sourcing, il va souvent falloir relire et rejouer les événements d’une certaine façon, que ce soit pour retourner dans un état particulier ou pour faire des analyses, il y a souvent besoin de faire des transformations sur des paquets d'événements.
L’objectif de cet atelier était donc de travailler sur ces transformations qui sont appelées des Projections.
Le domaine métier est relativement simple, mais possède déjà de nombreux types d'événements.
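At its core, a projection is just a fold over the event stream. A small Python sketch (the event names and shapes here are assumptions modeled on a quiz-style domain, not taken from the workshop material):

```python
from collections import Counter


def project(events, handlers, initial_state):
    """Fold a stream of events into a read model (a 'projection')."""
    state = initial_state
    for event in events:
        handler = handlers.get(event["type"])
        if handler is not None:          # events without a handler are ignored
            state = handler(state, event)
    return state


# Example projection: count new player registrations per month.
handlers = {
    "PlayerHasRegistered": lambda state, e: state + Counter({e["timestamp"][:7]: 1}),
}

events = [
    {"type": "PlayerHasRegistered", "timestamp": "2016-01-12T10:00:00"},
    {"type": "PlayerHasRegistered", "timestamp": "2016-01-20T11:30:00"},
    {"type": "QuizWasCreated", "timestamp": "2016-02-01T09:00:00"},
]
per_month = project(events, handlers, Counter())  # Counter({'2016-01': 2})
```

Each exercise in the workshop amounts to writing a different handler table over the same event stream.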

J’avoue ne pas avoir fait l’exercice directement en F# mais en C# en utilisant pas mal de linq. Donc facilement portable en F#.
D’ailleurs l’exercice est disponible dans de nombreux langages.
Je vous invite à regarder [ici](https://github.com/michelgrootjans/playing_with_projections)
---
## Evelina Gabasova - The F#orce Awakens
**@evelgab**
<amp-youtube data-videoid="R10rPhpLvr8" layout="responsive" width="480" height="270"></amp-youtube>
After an introduction worthy of the Star Wars saga, Evelina showed us how she carried out an analysis of the characters using the scripts of the seven films.
1) The first step was to parse the various film scripts to extract the character names. She then used the scenes to infer which characters are connected.
This did not come without pain, since each script has a different format and some characters do not even have any dialogue.
For this first step, regexes and active patterns were used.
```fsharp
// Active pattern to parse the contents of the script
let (|SceneTitle|Name|Word|) (text:string) =
let scenePattern = "[ 0-9]*(INT.|EXT.)[ A-Z0-9]"
let namePattern = "^[/A-Z0-9]+[-]*[/A-Z0-9 ]*[-]*[/A-Z0-9 ]+$"
if Regex.Match(text, scenePattern).Success then
SceneTitle text
elif Regex.Match(text, namePattern).Success then
Name text
else Word
```
2) Second step: analyze the data.
Evelina showed us her various analysis [Azure notebooks](https://notebooks.azure.com/).
These notebooks, heavily inspired by the ones long available in Python, let you mix text and executable code.
It is even possible to use the Fsharp.Data library to generate more or less complex charts.

3) Third step: use several public APIs providing additional information about Star Wars.
This is mainly the [Swapi](http://swapi.co/API) API.
It provides details about many elements of the saga and even has clients for several languages.
Evelina is the one who created the F# client.
To do this she used a very powerful F# feature: Type Providers.
A simple reference to the JsonProvider package created two constants containing sample JSON, one with the minimum fields and the other complete. The provider takes care of dynamically building a model.
She completed her demo by cross-referencing data retrieved from IMDB using the same technique but a different provider, 'HtmlProvider', which is able to fetch a table of data for our example.

For all the details, head over to [her blog](http://evelinag.com/blog/2016/01-25-social-network-force-awakens/)
What should we take away from all this?
Several points are worth keeping in mind:
- F# makes it possible to prototype data analysis very quickly and with little code.
Type Providers and notebooks are a great help too.
- Start by analyzing a known data set before launching into large-scale analyses.
Although a domain tied to Star Wars may seem geeky, it is above all a domain familiar to many people, which lets you quickly spot inconsistent results.
---
## Gien Verschatse - Playing nice together: how to use F# in a brownfield project
**@selketjah**

I had been looking forward to this session. Gien, a Belgian developer working on online casino games, gave us an experience report on introducing F# development into an existing C# codebase.
She walked through the various features that can or cannot be used from a C# (or even VB.NET) program.
I redid a few tests myself after this talk, and [my code is available here](https://github.com/evilz/Fsharp-from-csharp)
What I take away from this experience report above all is Gien's courage in embarking on the F# adventure, in the sense that it is hard (not to say impossible) to find F# contracts. It is therefore up to us to introduce this technology gradually when it seems appropriate.
---
## Krzysztof Cieslak - Hacking F# in JS ecosystem
**@k_cieslak**

In this session Krzysztof presents the latest trendy framework in the F# community, `Fable`.
Fable is an F#-to-JavaScript compiler that builds on Babel.
Fable takes your F# code, extracts an AST from it and hands it to Babel, which does the rest of the work to produce JS code for the browser or Node.js.
You might then ask: but why?
The short answer would be "because we can!" or "why not!"
But there are real advantages to using F# as the main language:
- A functional approach
- Everything is strongly typed: record types and union types
- All the advanced features: pattern matching, computation expressions
- Code reuse from the server side
- Pragmatic
- Community
He then showed us the tool through several demos in the browser, with some very simple JS but also a React demo.
Note also that Fable can generate source maps; it is therefore possible to debug the F# code from the Chrome DevTools window!
The latest version of Fable can be used directly from the dotnet CLI, so it is quick to bootstrap a project or start a server:
dotnet fable start
[Plus d’infos sur ce billet du blog](http://fable.io/blog/Introducing-1-0-beta.html)
De plus si comme moi vous vous demandez comment jouer avec du code d’une librairie externe et comment récupérer tous les types un peu comme le fait Typescript, les devs de la communauté ont pensé à tout et ont créé cet outil : [ts2fable](https://www.npmjs.com/package/ts2fable), qui permet de convertir un type definition de Typescript en Fsharp.
Vous trouverez plus d'informations et de démos sur [le site officiel](http://fable.io/)
---
## Mathias Brandewinder - Serverless F# with Azure Functions: fsibot goes nano-services
**@brandewinder**

Professor Calculus showed us very quickly how to set up a serverless (or almost serverless) system in F# thanks to Azure Functions.
What I mainly appreciate about the Azure Functions system is the simple, pleasant online interface. It is really easy to create a new function and define the trigger that will fire it.
Quite a few triggers are already available, and we saw the use of one trigger based on a clock and another based on message queues.
<p> </p>
Mathias's FsiBot uses both of these triggers. At regular intervals the application checks a channel for tweets containing F#, then compiles and executes them.

[The code is available here](https://github.com/mathias-brandewinder/fsibot-serverless)
[And a short explanatory post here](http://brandewinder.com/2017/04/01/azure-function-app-diagram/)
---
## To wrap up
Two days that turned out to be quite rich for the F# community. And even if a 45-minute talk does not allow going into detail, it has the advantage of introducing the technology and the language's possibilities to developers.
So I invite you to take a close look at this .NET language through the various links provided, as well as on http://fsharp.org and [awesome-fsharp](https://github.com/fsprojects/awesome-fsharp).
3e6d0c1ecd30d68960e5fec415ffd4902dc23ca1 | 487 | md | Markdown | README.md | AstroWa3l/Smart-Contracts | 22dc4865fa1d622adf6b6ec1ca6bfe191bbcc110 | [
"Apache-2.0"
] | null | null | null | README.md | AstroWa3l/Smart-Contracts | 22dc4865fa1d622adf6b6ec1ca6bfe191bbcc110 | [
"Apache-2.0"
] | null | null | null | README.md | AstroWa3l/Smart-Contracts | 22dc4865fa1d622adf6b6ec1ca6bfe191bbcc110 | [
"Apache-2.0"
] | null | null | null | # Smart-Contracts
##### I will be uploading various smart contracts to be used in future DApp projects..
##### In order to run the various haskell smart contracts you must go to IOHK's Marlowe playground here: https://alpha.marlowe.iohkdev.io/#/ Once you have arrived at the main page choose "New Project", then choose the "Marlowe" editor, then you can clear the contents in the Marlowe editor and paste in the smart contracts here and start the simulation to see if it all worked :)
| 81.166667 | 379 | 0.76386 | eng_Latn | 0.998393 |
3e6d63c2678c80008b83130ad1a160bf0738b658 | 674 | md | Markdown | README.md | officialraze/webproject_new | 617976a32e7d69d42e1b956525b44d636b8b88bf | [
"PHP-3.01",
"PHP-3.0"
] | null | null | null | README.md | officialraze/webproject_new | 617976a32e7d69d42e1b956525b44d636b8b88bf | [
"PHP-3.01",
"PHP-3.0"
] | 1 | 2019-02-11T15:41:15.000Z | 2019-02-11T15:41:15.000Z | README.md | officialraze/webproject_new | 617976a32e7d69d42e1b956525b44d636b8b88bf | [
"PHP-3.01",
"PHP-3.0"
] | null | null | null | # Webprojekt
Herzlich Willkommen bei meinem Webprojekt, das im Jahre 2018 begonnen wurde und im Verlaufe des Informatik-Unterrichts erweitert wurde.
## Was kann man auf der Webseite alles machen?
Man kann verschiedene Künstler sehen und diese haben alle eine Detail-Seite, auf der man einen Slider mit verschiedenen Bildern des Künstlers sehen kann. Ebenfalls kann man seine nächsten Tour-Daten betrachten.
Mit Hilfe des Band-Logins können die einzelnen Künstler jeweils ihre Beschreibung und Tour-Daten anpassen. Dort werden ebenfalls verschiedene Daten des Künstlers angezeigt.
Das Webprojekt wurde mit Hilfe von HTML, CSS, PHP und MySQL erstellt.
© by Melvin Lauber
| 51.846154 | 210 | 0.818991 | deu_Latn | 0.999877 |
3e6e0a735b0b41818f6d15446b3092775b1b7029 | 805 | md | Markdown | .github/PULL_REQUEST_TEMPLATE.md | nybles/beta-website | ac65a4e7cf91970771b4ed9da7b94153357aa4b9 | [
"MIT"
] | 2 | 2019-05-04T19:44:18.000Z | 2019-07-11T16:31:26.000Z | .github/PULL_REQUEST_TEMPLATE.md | nybles/beta-website | ac65a4e7cf91970771b4ed9da7b94153357aa4b9 | [
"MIT"
] | 8 | 2017-03-08T19:08:25.000Z | 2017-03-12T21:29:51.000Z | .github/PULL_REQUEST_TEMPLATE.md | nybles/beta-website | ac65a4e7cf91970771b4ed9da7b94153357aa4b9 | [
"MIT"
] | 6 | 2017-03-11T12:50:30.000Z | 2017-03-12T16:28:47.000Z | Thanks for making this contribution. :blush:
People like you maintain the spirit of community work. :clap:
However to make sure that things don't break, please make following checks.
* Read the [contribution guidelines](https://github.com/nybles/nybles.github.io/blob/master/CONTRIBUTING.md).
* Any UI changes should follow Material design principles. [reference](http://fezvrasta.github.io/bootstrap-material-design/)
* Make sure the site is working fine after your commits. [Running Jekyll locally](https://jekyllrb.com/docs/quickstart/)
**Remove above lines, after reading them**
Link of your fork to view changes: *Enter github.io link here*
Any [issue](https://github.com/nybles/nybles.github.io/issues) your want to refer to?
*# issue number *
Briefly describe what changes you made:
*
*
*
| 38.333333 | 125 | 0.76646 | eng_Latn | 0.962749 |
3e6ed3564323fec6c49bc7b11da0087d55476cf5 | 33 | md | Markdown | README.md | khos2ow/cluster-api-provider-cloudstack | 7cf1d89f63f9371de1e73835706e0f9667ac4607 | [
"Apache-2.0"
] | null | null | null | README.md | khos2ow/cluster-api-provider-cloudstack | 7cf1d89f63f9371de1e73835706e0f9667ac4607 | [
"Apache-2.0"
] | null | null | null | README.md | khos2ow/cluster-api-provider-cloudstack | 7cf1d89f63f9371de1e73835706e0f9667ac4607 | [
"Apache-2.0"
] | null | null | null | # cluster-api-provider-cloudstack | 33 | 33 | 0.848485 | nld_Latn | 0.303017 |
3e6f74c79b6467bc94525c93157268be99e4abe7 | 601 | md | Markdown | docs/README.md | AndonMitev/EWallet | 898cde38933d6f134734528b3e594eedf5fa50f3 | [
"Apache-2.0"
] | 322 | 2018-02-28T07:38:44.000Z | 2020-05-27T23:09:55.000Z | docs/README.md | AndonMitev/EWallet | 898cde38933d6f134734528b3e594eedf5fa50f3 | [
"Apache-2.0"
] | 643 | 2018-02-28T12:05:20.000Z | 2020-05-22T08:34:38.000Z | docs/README.md | AndonMitev/EWallet | 898cde38933d6f134734528b3e594eedf5fa50f3 | [
"Apache-2.0"
] | 63 | 2018-02-28T10:57:06.000Z | 2020-05-27T23:10:38.000Z | # Documentation
Below are the main areas of the documentation. Navigate through the links below to see more details.
- **[Design](design/):** Design and philosophy behind the eWallet Server.
- **[Guides](guides/):** Using the eWallet Server to its full potential.
- **[Setup](setup/):** Details on the different eWallet Server setup approaches.
- **[Running the tests](tests/):** Running the tests bundled with the eWallet Server.
- **[Demo](demo.md):** Sample setup that demonstrates how the eWallet Server can be used.
- **[FAQ](faq.md):** Top frequently asked questions about the eWallet Server.
| 54.636364 | 100 | 0.733777 | eng_Latn | 0.984863 |
3e6fdf377bf19bfbba845a2b451a57db7639535a | 205 | md | Markdown | development/coder/http/index.md | biangbiang/something | b98e1c4b53a7b2c2260f40307596ef590a4f5776 | [
"MIT"
] | 1 | 2017-08-23T19:48:02.000Z | 2017-08-23T19:48:02.000Z | development/coder/http/index.md | biangbiang/something | b98e1c4b53a7b2c2260f40307596ef590a4f5776 | [
"MIT"
] | null | null | null | development/coder/http/index.md | biangbiang/something | b98e1c4b53a7b2c2260f40307596ef590a4f5776 | [
"MIT"
] | 3 | 2017-02-17T06:41:34.000Z | 2021-08-28T03:00:42.000Z | HTTP protocol topics
============
### [A discussion of GET and POST security](discussion-of-get-and-post-security)
---
### [Four common ways to submit POST data](four-post-data-way)
---
### [The X-Frame-Options response header](x-frame-options-response-headers)
---
| 13.666667 | 60 | 0.609756 | yue_Hant | 0.3403 |
3e702a1519da4dbe9b27a7c4adaeddb8d13aade7 | 388 | md | Markdown | docs/Boundary.md | Syncsort/PreciselyAPIsSDK-Android | 752d575414ca06f04c1499c86a5cc569c5043437 | [
"Apache-2.0"
] | null | null | null | docs/Boundary.md | Syncsort/PreciselyAPIsSDK-Android | 752d575414ca06f04c1499c86a5cc569c5043437 | [
"Apache-2.0"
] | null | null | null | docs/Boundary.md | Syncsort/PreciselyAPIsSDK-Android | 752d575414ca06f04c1499c86a5cc569c5043437 | [
"Apache-2.0"
] | null | null | null |
# Boundary
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**boundaryId** | **String** | | [optional]
**boundaryType** | **String** | | [optional]
**boundaryRef** | **String** | | [optional]
**geometry** | [**DemographicsGeometry**](DemographicsGeometry.md) | | [optional]
**url** | **String** | | [optional]
| 21.555556 | 83 | 0.494845 | kor_Hang | 0.266832 |
3e704d73a7eab1dfe984570079441c7f6d8fc047 | 712 | md | Markdown | README.md | assimoes/angular-directives | bd8e78416abaa02c39170848a7419c3e7184ed4f | [
"MIT"
] | null | null | null | README.md | assimoes/angular-directives | bd8e78416abaa02c39170848a7419c3e7184ed4f | [
"MIT"
] | null | null | null | README.md | assimoes/angular-directives | bd8e78416abaa02c39170848a7419c3e7184ed4f | [
"MIT"
] | null | null | null | # angular-directives
## Example
Add the `asv-directives` module as a dependency to your main module definition
`index.html`
```
(function () {
'use strict';
var app = angular.module('app', ['asv-directives']);
})();
```
Add a reference to the `asv-directives.js` script file below your angular app js file reference.
```
<html>
<head>
...
<script src="path/to/yourAngularApp.js" type="text/javascript"></script>
<script src="path/to/asv-directives.js" type="text/javascript"></script>
...
</head>
<body>...</body>
</html>
```
Use it.
```
<file-uploader accepts="*.*" post-url="/controller/UploadAction" file-icon="file.ico" col-size="12" max-size="10000000"></file-uploader>
```
| 18.736842 | 136 | 0.650281 | eng_Latn | 0.381756 |
3e71270bc73945897aa2b2cd4caa8f86aed40f08 | 2,288 | md | Markdown | _posts/2015-12-31-latex.md | fuzzbin/devnull | 58a7e04f265caf425d6fc6af7d1e863bb7beb55a | [
"MIT"
] | null | null | null | _posts/2015-12-31-latex.md | fuzzbin/devnull | 58a7e04f265caf425d6fc6af7d1e863bb7beb55a | [
"MIT"
] | null | null | null | _posts/2015-12-31-latex.md | fuzzbin/devnull | 58a7e04f265caf425d6fc6af7d1e863bb7beb55a | [
"MIT"
] | null | null | null | ---
layout: post
title: LaTeX
date: 2015-12-31 00:00:00
author: Tom Jarle
---
[\\(LaTeX\\)](https://www.latex-project.org/) is a system for typesetting documents. The output is of high quality and very well suited for producing technical and scientific documents. Writing \\(LaTeX\\) documents works a little differently than in an ordinary word processor. The writing process is more like programming, and the \\(LaTeX\\) source must actually be compiled before the document can be viewed. This may sound cumbersome, and it used to be. Fortunately, several good tools have appeared in recent years that let you create beautiful \\(LaTeX\\) documents directly in the browser without installing any other software. One such tool is [Overleaf](https://www.overleaf.com).
Why use \\(LaTeX\\)? First and foremost because the typesetting is wonderfully good. Modern word processors admittedly offer equation tools that give very good results, but \\(LaTeX\\) is still a notch better. Another reason, for my own part, is that it is actually a very efficient way to write. The focus is on content, while \\(LaTeX\\) takes care of the formatting. That is comfortable. Besides, it is actually quite fun!
Where in a text document you might have written the Pythagorean theorem as a^2 + b^2 = c^2, or inserted the expression with an equation tool, you can very easily turn it into \\(a^2+b^2=c^2\\) by using \\(LaTeX\\). You can in fact place larger expressions such as \\(x_{1,2}=\frac{-b\pm\sqrt{b^2-4ac}}{2a}\\) directly in the text without breaking the document's formatting.
The expression below is written with slightly different markup, which places it on its own line.
$$x_{1,2}=\frac{-b\pm\sqrt{b^2-4ac}}{2a}$$
This can be useful if you want to show derivations that span several lines.
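For reference, here is roughly what the source markup behind the expressions above looks like: inline math is wrapped in \\( ... \\), while display math on its own line uses $$ ... $$.

```latex
% Inline math flows with the surrounding text:
The Pythagorean theorem, \(a^2 + b^2 = c^2\), sits inside the sentence.

% Display math is set on its own line:
$$x_{1,2} = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$$
```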
Another nice feature of [Overleaf](https://www.overleaf.com) is the support for collaborative writing. You can easily share and co-author documents. In addition, many templates are available, making it easier to get started. Although many associate \\(LaTeX\\) with scientific documents, the tooling supports a wide range of other functions. One example is [LilyPond](http://www.lilypond.org/text-input.html), a system for writing sheet music with the \\(LaTeX\\) framework.
3e7142e8abf0dfe9404136f23bbc7c352bbd6f8b | 2,109 | md | Markdown | content/posts/2020-02-13--RSA-Algorithm-With-Example.md | GeminiWind/personal-blog | f58e1686edadf0a53995829ed178775ad547b475 | [
"MIT"
] | null | null | null | content/posts/2020-02-13--RSA-Algorithm-With-Example.md | GeminiWind/personal-blog | f58e1686edadf0a53995829ed178775ad547b475 | [
"MIT"
] | 11 | 2021-03-01T20:47:18.000Z | 2022-02-26T17:39:54.000Z | content/posts/2020-02-13--RSA-Algorithm-With-Example.md | GeminiWind/personal-blog | f58e1686edadf0a53995829ed178775ad547b475 | [
"MIT"
] | null | null | null | ---
title: "Encryption: RSA Algorithm with example"
date: "2020-02-13T22:40:32.169Z"
template: "post"
draft: false
slug: "/explain-rsa-algorithm-with-example"
category: "AWS"
tags:
- "RSA"
- "Encryption"
description: "Explain RSA Algorithm in the shortest and easiest way along with example"
---
At university, I bet all of us studied how RSA works. At the time, however, we didn't truly understand the importance of cryptography or of the RSA algorithm. Therefore, in this article, we'll look back at RSA. What is RSA? How does it work? Where is it applied?
## Brief introduction
- RSA (Rivest–Shamir–Adleman) is one of the first public-key cryptosystems and is widely used for secure data transmission.
- The acronym RSA is made of the initial letters of the surnames of Ron Rivest, Adi Shamir, and Leonard Adleman, who first publicly described the algorithm in 1977.
- RSA is used in some of the cipher suites of Transport Layer Security (TLS), which in turn is used by HTTPS.
## Operation
1. Choose two different large random prime numbers *__p__* and *__q__*
2. Calculate *__n = p * q__*
 - *__n__* is the modulus for both the public and the private keys
3. Calculate the totient: *__Φ(n)=(p-1)(q-1)__*
4. Choose an integer *__e__* such that 1 < e < Φ(n) and e is coprime to Φ(n)
- e is released as the public key exponent
5. Compute d with the following formula: *__d = e^(-1) mod Φ(n)__*
- d is kept as the private key exponent
## Encrypting message
Alice gives her public key (n and e) to Bob and keeps her private key secret. Bob wants to send a message m to Alice.

The ciphertext c is computed as follows:
_**c = m^e mod n**_
## Decrypting message
Alice can recover m from c by using her private key exponent d as follows:
_**m = c^d mod n**_
## Example
1. Choose random large prime numbers p = 61 and q = 53
2. n = p * q = 61 * 53 = 3233
3. Φ(n) = (p-1)(q-1) = 60 * 52 = 3120
4. Find e: e > 1 and e coprime to 3120 => e = 17
5. Find d: d = e^(-1) mod 3120 = 2753

- Encryption (m = 88): c = m^17 mod 3233 = 88^17 mod 3233 = 1525
- Decryption: m = 1525^2753 mod 3233 = 88
| 35.745763 | 272 | 0.702703 | eng_Latn | 0.996744 |
3e71746ee1a772cfac04ac7913dd5d41116e91b6 | 26 | md | Markdown | README.md | aramisic/html-css-js | 697f19f56fc7a1f814be6984ef4ceca808bc2e42 | [
"MIT"
] | null | null | null | README.md | aramisic/html-css-js | 697f19f56fc7a1f814be6984ef4ceca808bc2e42 | [
"MIT"
] | null | null | null | README.md | aramisic/html-css-js | 697f19f56fc7a1f814be6984ef4ceca808bc2e42 | [
"MIT"
] | null | null | null | # html-css-js
Single Page
| 8.666667 | 13 | 0.730769 | kor_Hang | 0.937387 |
3e71ad6de98bad7f2046e7d9d9c87ad7cb7204f3 | 1,136 | md | Markdown | README.md | Ohems/watodo | d6e173468b33c3576696dfdd1416ea7d1c99e02a | [
"MIT"
] | null | null | null | README.md | Ohems/watodo | d6e173468b33c3576696dfdd1416ea7d1c99e02a | [
"MIT"
] | null | null | null | README.md | Ohems/watodo | d6e173468b33c3576696dfdd1416ea7d1c99e02a | [
"MIT"
] | null | null | null | #  [WIP]
Community event and responsibility management tool. Work in progress and currently unusable.
## Getting Started
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.
## Contributing
Please read [CONTRIBUTING.md](CONTRIBUTING.md) for details on our code of conduct, and the process for submitting pull requests to us.
**NOTE:** Project is currently in its earliest stages and actively developed all around without clear organization. Contributing at this stage isn't recommended.
## Versioning
We use [SemVer](http://semver.org/) for versioning. For the versions available, see the [tags on this repository](https://github.com/Ohems/mlc/releases).
## Authors
* [Ohems](https://github.com/Ohems) - *Initial work*
See also the list of [contributors](https://github.com/Ohems/mlc/graphs/contributors) who participated in this project.
## License
This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details
| 40.571429 | 200 | 0.771127 | eng_Latn | 0.994882 |
3e71f423447f75b10ec016a20f98fae2751ff9e8 | 4,957 | md | Markdown | src/pages/pages/2015-05-01---about/index.md | BKJang/Gatsby-Sample | 2c7b5948318dc5914d43511003291bd65d0ce2fd | [
"MIT"
] | 1 | 2019-05-22T14:43:21.000Z | 2019-05-22T14:43:21.000Z | src/pages/pages/2015-05-01---about/index.md | BKJang/Gatsby-Sample | 2c7b5948318dc5914d43511003291bd65d0ce2fd | [
"MIT"
] | null | null | null | src/pages/pages/2015-05-01---about/index.md | BKJang/Gatsby-Sample | 2c7b5948318dc5914d43511003291bd65d0ce2fd | [
"MIT"
] | null | null | null | ---
title: "About me"
layout: page
path: "/about"
---
### 장봉기 (BongKi Jang)
| | |
| :--------: | ---------------------------- |
| **phone** | 010-4502-7756 |
| **E-mail** | [email protected] |
| **Academic** | 경기대학교 경영정보학과 졸업(2012.03 ~ 2018.02) |
| | |
---
### Brief Introduction
새로운 기술을 배우는데 흥미를 가지고 있으며, 최근 수행한 업무에서도 처음 접한 기술들을 소화하며 성공적으로 서비스를 launching 하였습니다.<br/><br/>
최근, JavaScript, React와 같은 프론트엔드 기술과 Java의 효율적인 연동에 대해 관심을 가지고 있으며, 이외에 프로젝트의 분석 설계에도 관심을 갖고 있습니다.<br/><br/>
<div align="center">
`#JavaScript` `#React` `#Java` `#Analytical Design`
</div>
---
### Skills
> HTML/CSS/JavaScript(ES6+), React, Java.
---
### Work Experiences
#### Futuresoft Co.
| | |
| --------------: | ------------------------------------------------------------------------------------------------------- |
| **period** | 17.08 ~ Current |
| **position** | 웹 개발자 |
| **description** | 현재, Free-Wifi Service 개발에 참여중에 있으며, IT자산관리시스템, 사내업무관리시스템, 싱가폴 암호화폐거래소에 참여했었습니다.
| **projects** | ITAM(IT Asset Management System), ETMS(Enterprise Management System), Cryptocurrency Exchange Center, Free-Wifi Service |
| | |
---
<br/>
<br/>
### Project Experience
#### Free-Wifi Servcie
- JavaScript, jQuery, BootStrap3
| | |
| --------------: | ---------------------------------------------------- |
| **period** | 18.07 ~ Current |
| **description** | 사용자에게 무료 Wifi를 제공하는 대신, 광고를 제공하고 고객의 정보를 수집해 마케팅 정보로 활용하기 위한 서비스입니다. |
| **implementation** | 사용자의 정보를 가져오는 방식 중 하나로 소셜로그인을 도입했었는데 기존 소셜 로그인에서 제공하는 팝업 방식의 로그인이 CNA 브라우저에서는 동작하지 않았습니다. <br/> 이를 해결하기 위해 팝업 창으로 로그인 화면을 구현하는 방식이 아닌 해당 화면으로 redirect 시키는 방식으로 구현했었습니다.<br/> 현재는 수집한 데이터를 통계처리하고 Visualization 하는 방식에 대한 스터디를 진행하고 있습니다. |
| | |
<br/>
#### Cryptocurrency Exchange Center(KnockCoin)
- Site Link: (https://knockcoin.io/excenter/exchange)
- React, JavaScript, Redux
| | |
| --------------: | ---------------------------------------------------- |
| **period** | 18.03 ~ 18.07 |
| **description** | 비트코인, 이더리움과 같은 암호 화폐를 거래하고 입, 출금 서비스를 제공하는 암호화폐 거래소입니다. |
| **implementation** | 인증 절차가 타 사이트에 비해 복잡하다보니, 이용자도 쉽게 이용할 수 있도록 해야했습니다.<br/> 그래서 Captcha나 OTP 등의 인증 방식을 구현하는데 있어 로그인 및 인증 과정에서 최대한 직관적인 flow를 제공하는데 중점을 두고 고민했었습니다. <br/> 진행 초반에는 애플리케이션의 상태 관리를 하는데 있어 큰 어려움이 없었지만, 애플리케이션이 복잡도를 더해가면서부터 상태 관리에 대한 어려움이 있었고 그 과정에서 Redux를 사용할 필요성을 느꼈고 적용과 동시에 이러한 문제를 해결할 수 있었습니다.<br/> |
| | |
<br/>
<br/>
#### ETMS(Enterprise Management System)
- JavaScript, jQuery, AxisJ
| | |
| --------------: | ---------------------------------------------------- |
| **period** | 17.08 ~ 17.11 |
| **description** | 기존에 수기로 진행되던 지출결의, 휴가신청 등 사내업무의 Workflow를 자동화한 시스템입니다. |
| **implementation** | 입사 후, 처음 진행한 프로젝트로서 사내의 Workflow를 이해하고 이를 자동화하는 것에 중점을 두고 진행했던 프로젝트였습니다.<br/> WorkFlow의 시각화를 하는 과정에서 Camunda WorkFlow Engine를 사용하려 했었지만, 사내 WorkFlow의 복잡도가 Camunda Engine을 사용하기에 낮다고 판단하였고, AxisJ라이브러리를 이용하여 Flow의 시각화를 최대한 간소화하여 표현했었습니다. |
| | |
<br/>
<br/>
<br/>
#### ITAM(IT Asset Management System)
- JavaScript, jQuery, AxisJ
| | |
| --------------: | ---------------------------------------------------- |
| **period** | 17.11 ~ 18.02 |
| **description** | 기존 사내 Solution으로 보유하고 있던 시스템으로서, 기업의 IT자산을 직관적으로 파악하고, IT 자산 관리에 이용되는 비용을 정산하고 그에 따른 통계를 제공해주는 시스템입니다. |
| **implementation** | 기존에 있던 UI/UX를 renewal하고 Spring Security를 통해 권한관리를 좀 더 세부적으로 적용하는 프로젝트였습니다. <br/> Grid드나 Tree 같은 공통화된 컴포넌트를 수정할 일이 많았고, 그 과정에서 Tree의 Drag&Drop 등 UI/UX상 직관적으로 사용자가 접할 수 있는 환경이 필요했고, zTree와 AxisJ라는 jQuery라이브러리를 이용해서 처리했었습니다. |
| | |
<div align="center">
_감사합니다._
</div>
| 43.104348 | 369 | 0.408311 | kor_Hang | 1.000001 |
3e73a4cf57819cb6d1ac9ef249d1e6dfe8313920 | 4,006 | md | Markdown | docs/elements/zendesk/events.md | davewilks/2018devportal | 806bc7410ab484f604863c1032206ad9c8d6c2ea | [
"MIT",
"Unlicense"
] | 6 | 2016-06-30T13:37:39.000Z | 2020-03-26T22:53:33.000Z | docs/elements/zendesk/events.md | davewilks/2018devportal | 806bc7410ab484f604863c1032206ad9c8d6c2ea | [
"MIT",
"Unlicense"
] | 849 | 2016-04-25T22:39:24.000Z | 2019-09-28T15:37:53.000Z | docs/elements/zendesk/events.md | davewilks/2018devportal | 806bc7410ab484f604863c1032206ad9c8d6c2ea | [
"MIT",
"Unlicense"
] | 4 | 2017-02-03T20:21:45.000Z | 2019-03-22T22:42:58.000Z | ---
heading: Zendesk
seo: Events | Zendesk | Cloud Elements API Docs
title: Events
description: Enable Zendesk events for your application.
layout: sidebarelementdoc
breadcrumbs: /docs/elements.html
elementId: 41
elementKey: fake
parent: Back to Element Guides
order: 25
---
# Events
{% include polling_and_webhooks_defined.md %}
{% include callout.html content="<strong>On this page</strong></br><a href=#polling>Polling</a></br><a href=#webhooks>Webhooks</a>" type="info" %}
## Polling
In order to enable polling, add these extra configurations to your instance JSON:
```JSON
"event.notification.enabled": "true",
"event.notification.callback.url": "<INSERT_YOUR_APPS_CALLBACK_URL>",
"event.poller.configuration": "<SEE_BELOW>"
```
instance JSON with polling events enabled:
```json
{
"element": {
"key": "zendesk"
},
"providerData": {
"code": "Code on Return the URL"
},
"configuration": {
"oauth.api.key": "<INSERT_ZENDESK_UNIQUE_IDENTIFIER>",
"oauth.api.secret": "<INSERT_ZENDESK_CLIENT_SECRET>",
"oauth.callback.url": "https://www.my_cool_app.com",
"zendesk.subdomain": "<INSERT_ZENDESK_SUB_DOMAIN>",
"event.notification.enabled": "true",
"event.notification.callback.url": "<INSERT_YOUR_APPS_CALLBACK_URL>",
"event.poller.configuration": {
"users": {
"url": "/hubs/helpdesk/users",
"idField": "id",
"pageSize": 100,
"datesConfiguration": {
"updatedDateField": "updated_at",
"updatedDateFormat": "yyyy-MM-dd'T'HH:mm:ss'Z'",
"createdDateField": "created_at",
"createdDateFormat": "yyyy-MM-dd'T'HH:mm:ss'Z'"
}
}
}
},
"tags": [
"<INSERT_TAGS>"
],
"name": "<INSERT_INSTANCE_NAME>"
}
```
## Webhooks
After you create an instance with webhooks enabled, your app will receive event notifications from Zendesk based on our default settings.
Customization is an option based on your specific needs. See customization instructions below.
#### Webhook JSON
```json
{
"element": {
"key": "zendesk"
},
"providerData": {
"code": "Code on Return the URL"
},
"configuration": {
"oauth.api.key": "<INSERT_ZENDESK_UNIQUE_IDENTIFIER>",
"oauth.api.secret": "<INSERT_ZENDESK_CLIENT_SECRET>",
"oauth.callback.url": "https://www.my_cool_app.com",
"zendesk.subdomain": "<INSERT_ZENDESK_SUB_DOMAIN>",
"event.notification.enabled": "true",
"event.notification.callback.url": "<INSERT_YOUR_APPS_CALLBACK_URL>"
},
"tags": [
"<INSERT_TAGS>"
],
"name": "<INSERT_INSTANCE_NAME>"
}
```
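Once notifications are enabled, event payloads are POSTed to the callback URL you configured. As a rough illustration, a minimal receiver using only the Python standard library might look like the sketch below; the payload field names (`eventType`, `objectType`) are assumptions for the example, not the exact schema, so adjust them to the payloads you actually receive.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_event(payload: dict) -> str:
    """Summarize an incoming event notification.

    The field names below are illustrative assumptions, not an exact
    schema -- adjust them to the payloads you receive.
    """
    event_type = payload.get("eventType", "UNKNOWN")
    object_type = payload.get("objectType", "UNKNOWN")
    return f"{event_type}:{object_type}"

class CallbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length) if length else b"{}"
        print(handle_event(json.loads(body)))
        # Acknowledge quickly so the sender does not retry the delivery.
        self.send_response(200)
        self.end_headers()

# To run locally: HTTPServer(("", 8080), CallbackHandler).serve_forever()
```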
NOTE: To begin with, your app will be notified of all changes to tickets.
You have the option to limit that scope according to your needs.
1. Log in to your Zendesk account and click “Settings”

2. Scroll and find “Triggers” and click to select

3. Find the Cloud Elements Trigger and click “edit”

4. NOTE: the following steps are OPTIONAL. You can change the name of the trigger
5. You can change the conditions of the trigger

IMPORTANT: Please do not remove the target field; events will not function if it is removed.
Events rely on the target remaining the same and on the Message field conforming to a JSON-friendly format.

Feel free to add any of the Zendesk placeholders to your Message body; just remember to keep it JSON-friendly.

* Click on “View available placeholders”
* Add placeholders to “Message”, remembering to keep a JSON-friendly format

| 31.054264 | 146 | 0.707189 | yue_Hant | 0.540815 |
3e74497c8fe57340ab7b20a466050ea040216fa7 | 233 | md | Markdown | README.md | skotep/webdev-ng2 | d563a91c4ca0f239c75ce705698b7ad5f15a2c7c | [
"MIT"
] | null | null | null | README.md | skotep/webdev-ng2 | d563a91c4ca0f239c75ce705698b7ad5f15a2c7c | [
"MIT"
] | null | null | null | README.md | skotep/webdev-ng2 | d563a91c4ca0f239c75ce705698b7ad5f15a2c7c | [
"MIT"
] | null | null | null | # webdev-ng2
Example site build with Angular2 [https://ricebook-ng2.surge.sh/](https://ricebook-ng2.surge.sh/)
for Rice University COMP 431/531 Webdev class [https://www.clear.rice.edu/comp431](https://www.clear.rice.edu/comp431)
| 33.285714 | 118 | 0.751073 | yue_Hant | 0.606057 |
3e7657b3f6660131c886e7568278f37289bd7bd1 | 485 | md | Markdown | site/pods/tutorials/content/21owtxh3/content.md | jasonprogrammer/gerbil | 0824aa21683f751cec1639cd402a777f888986b3 | [
"MIT"
] | 25 | 2020-05-26T00:47:06.000Z | 2022-02-15T12:30:05.000Z | site/pods/tutorials/content/21owtxh3/content.md | jasonprogrammer/gerbil | 0824aa21683f751cec1639cd402a777f888986b3 | [
"MIT"
] | 3 | 2020-06-24T17:50:07.000Z | 2022-02-12T23:40:22.000Z | site/pods/tutorials/content/21owtxh3/content.md | jasonprogrammer/gerbil | 0824aa21683f751cec1639cd402a777f888986b3 | [
"MIT"
] | 2 | 2020-07-16T01:02:23.000Z | 2020-07-16T15:21:25.000Z | # Topic links
During a `gerbil build`, an HTML file is created that contains a list of links
for each pod (topic) you have created.
An example of this page is [here](https://github.com/jasonprogrammer/gerbil/blob/master/site/pods/index.html).
In the [home page content template](https://github.com/jasonprogrammer/gerbil/blob/master/web/templates/home-content.html),
you can insert the following mustache tag:
```xml
{{{topics}}}
```
This will insert the links into the template.
| 30.3125 | 123 | 0.758763 | eng_Latn | 0.969037 |
3e7673b3b9b7861828180d2b519fcecdd0967c75 | 59 | md | Markdown | README.md | DerekStrickland/consul-compose | bee5285a2bdcc4c7dc2bf13f880c5a27316521e6 | [
"MIT"
] | 1 | 2021-05-13T07:35:23.000Z | 2021-05-13T07:35:23.000Z | README.md | DerekStrickland/consul-compose | bee5285a2bdcc4c7dc2bf13f880c5a27316521e6 | [
"MIT"
] | null | null | null | README.md | DerekStrickland/consul-compose | bee5285a2bdcc4c7dc2bf13f880c5a27316521e6 | [
"MIT"
] | 1 | 2021-05-13T07:35:27.000Z | 2021-05-13T07:35:27.000Z | # consul-compose
Docker Compose file for a demo consul app
| 19.666667 | 41 | 0.79661 | ita_Latn | 0.508541 |
3e76900eaaf56062062864da8f090799255a2df8 | 4,378 | md | Markdown | DATA/daily-TypeScript/TypeScript-2021-11-05.md | LuckRain7/z-action-github-daily-trending | 88d12e1921b8d022d06021cd3d906af775f029c8 | [
"MIT"
] | 4 | 2021-02-18T00:25:52.000Z | 2022-01-05T02:24:22.000Z | DATA/daily-TypeScript/TypeScript-2021-11-05.md | LuckRain7/z-action-github-daily-trending | 88d12e1921b8d022d06021cd3d906af775f029c8 | [
"MIT"
] | null | null | null | DATA/daily-TypeScript/TypeScript-2021-11-05.md | LuckRain7/z-action-github-daily-trending | 88d12e1921b8d022d06021cd3d906af775f029c8 | [
"MIT"
] | 1 | 2021-06-30T07:23:38.000Z | 2021-06-30T07:23:38.000Z | # [GitHub] TypeScript 日趋势榜项目(2021-11-05)
## 1. microsoft/TypeScript
项目地址:[https://github.com/microsoft/TypeScript](https://github.com/microsoft/TypeScript)
stars:undefined | forks:undefined | undefined stars today
## 2. whyour/qinglong
项目地址:[https://github.com/whyour/qinglong](https://github.com/whyour/qinglong)
stars:undefined | forks:undefined | undefined stars today
## 3. n8n-io/n8n
项目地址:[https://github.com/n8n-io/n8n](https://github.com/n8n-io/n8n)
stars:undefined | forks:undefined | undefined stars today
## 4. openkraken/kraken
项目地址:[https://github.com/openkraken/kraken](https://github.com/openkraken/kraken)
stars:undefined | forks:undefined | undefined stars today
## 5. remix-run/react-router
项目地址:[https://github.com/remix-run/react-router](https://github.com/remix-run/react-router)
stars:undefined | forks:undefined | undefined stars today
## 6. nuxt/framework
项目地址:[https://github.com/nuxt/framework](https://github.com/nuxt/framework)
stars:undefined | forks:undefined | undefined stars today
## 7. home-assistant/frontend
项目地址:[https://github.com/home-assistant/frontend](https://github.com/home-assistant/frontend)
stars:undefined | forks:undefined | undefined stars today
## 8. streamich/react-use
项目地址:[https://github.com/streamich/react-use](https://github.com/streamich/react-use)
stars:undefined | forks:undefined | undefined stars today
## 9. cyrildiagne/ar-cutpaste
项目地址:[https://github.com/cyrildiagne/ar-cutpaste](https://github.com/cyrildiagne/ar-cutpaste)
stars:undefined | forks:undefined | undefined stars today
## 10. hashicorp/terraform-cdk
项目地址:[https://github.com/hashicorp/terraform-cdk](https://github.com/hashicorp/terraform-cdk)
stars:undefined | forks:undefined | undefined stars today
## 11. angular/angular-cli
项目地址:[https://github.com/angular/angular-cli](https://github.com/angular/angular-cli)
stars:undefined | forks:undefined | undefined stars today
## 12. graphql/graphql-js
项目地址:[https://github.com/graphql/graphql-js](https://github.com/graphql/graphql-js)
stars:undefined | forks:undefined | undefined stars today
## 13. facebook/docusaurus
项目地址:[https://github.com/facebook/docusaurus](https://github.com/facebook/docusaurus)
stars:undefined | forks:undefined | undefined stars today
## 14. Bismuth-Forge/bismuth
项目地址:[https://github.com/Bismuth-Forge/bismuth](https://github.com/Bismuth-Forge/bismuth)
stars:undefined | forks:undefined | undefined stars today
## 15. vuetifyjs/vuetify
项目地址:[https://github.com/vuetifyjs/vuetify](https://github.com/vuetifyjs/vuetify)
stars:undefined | forks:undefined | undefined stars today
## 16. antvis/X6
项目地址:[https://github.com/antvis/X6](https://github.com/antvis/X6)
stars:undefined | forks:undefined | undefined stars today
## 17. metaplex-foundation/metaplex
项目地址:[https://github.com/metaplex-foundation/metaplex](https://github.com/metaplex-foundation/metaplex)
stars:undefined | forks:undefined | undefined stars today
## 18. angular/angular
项目地址:[https://github.com/angular/angular](https://github.com/angular/angular)
stars:undefined | forks:undefined | undefined stars today
## 19. antfu/vitesse
项目地址:[https://github.com/antfu/vitesse](https://github.com/antfu/vitesse)
stars:undefined | forks:undefined | undefined stars today
## 20. graphql/graphiql
项目地址:[https://github.com/graphql/graphiql](https://github.com/graphql/graphiql)
stars:undefined | forks:undefined | undefined stars today
## 21. solana-labs/token-list
项目地址:[https://github.com/solana-labs/token-list](https://github.com/solana-labs/token-list)
stars:undefined | forks:undefined | undefined stars today
## 22. raycast/extensions
项目地址:[https://github.com/raycast/extensions](https://github.com/raycast/extensions)
stars:undefined | forks:undefined | undefined stars today
## 23. TuSimple/naive-ui
项目地址:[https://github.com/TuSimple/naive-ui](https://github.com/TuSimple/naive-ui)
stars:undefined | forks:undefined | undefined stars today
## 24. umijs/umi
项目地址:[https://github.com/umijs/umi](https://github.com/umijs/umi)
stars:undefined | forks:undefined | undefined stars today
## 25. facebook/jest
项目地址:[https://github.com/facebook/jest](https://github.com/facebook/jest)
stars:undefined | forks:undefined | undefined stars today
| 21.566502 | 103 | 0.74212 | yue_Hant | 0.373418 |
3e76bb7b4aaee8e22c15e827f1a37e196c8ba428 | 12,700 | markdown | Markdown | _posts/2021-09-30-Reversing a dotNET Dropper.markdown | seckid/Blog | f31f5fcb47b7b633504f79d2b2d77fe11b40705e | [
"MIT"
] | null | null | null | _posts/2021-09-30-Reversing a dotNET Dropper.markdown | seckid/Blog | f31f5fcb47b7b633504f79d2b2d77fe11b40705e | [
"MIT"
] | null | null | null | _posts/2021-09-30-Reversing a dotNET Dropper.markdown | seckid/Blog | f31f5fcb47b7b633504f79d2b2d77fe11b40705e | [
"MIT"
] | null | null | null | ---
layout: post
title: "Reversing .NET sample from Malshare With Anti Reversing and debugging capability - PART 2"
date: 2021-09-30 07:34:23 -0500
categories: Reversing PE32 Downloader DotNet
---
Reversing the sample downloaded by the .NET downloader

<!--more-->
## Introduction
This is the second part of this investigation following the reversing of a .NET downloader malware sample obtained from Malshare. A link to the initial analysis can be found here: [Part1 of the Saga](https://blog.securitykid.co.uk/Reversing-a-dotNET-Downloader/).
During that investigation we were able to extract and observe an executable downloaded from the Discord CDN network after execution of the first sample. This post is the analysis of that second executable. Unusually, this executable has the same name as the original, "pctool.exe", so I predict I may see similarities between the samples. Let's get to it!
## Disclaimer
As always…. just adding this here to make sure that I cover myself, really.

If you do attempt to conduct any of the analysis detailed in the post above, you are doing so at your own risk. I am very new to this area of security and so should not be considered an 'expert'. Please take all necessary precautions whilst analysing malware.
Also, The opinions in this blog are mine alone and not of my employer.
## TLDR
What I was able to find out about this sample from the analysis achieved thus far:
- .NET malware
- Dropper that drops and executes 9 files in a temporary location
- Uses basic obfuscation techniques to prevent it being identified
- Uses the anti-analysis techniques identified in the previous version to prevent analysis, which allows us to confirm that the samples are closely related
- Likely deployed using a builder as some of the functionality is not used inside the sample
## Sample
The sample reviewed in this post was obtained through the initial analysis of the downloader, detailed in the blog post found [here](https://blog.securitykid.co.uk/Reversing-a-dotNET-Downloader/)
| Identifier | Value |
| ----------- | ----------- |
| MD5 |1E5DB48934EF0508B896A5E06F36A655 |
| SHA1 |C1EC9AB65A2D7AAA9FFDF952292BEEDC39A06AE5 |
| SHA256 |389745CB2190986EAA84B6B7410FF6341A6AAB127B0763C294EA84E13C2D8E1A|
| File Size | 4280320 (bytes) |
| CPU| 32-bit |
## Analysis
The following section contains analysis conducted using various techniques to identify the malware and understand its objectives:
### Analysis of Binary
Straight away after opening the sample in PE Studio, I noted that it is again a .NET-compiled executable reliant on the library "mscoree.dll". However, the size of this sample is much larger than the previous binary, so my initial assumption is that this file is not a lightweight downloader but potentially a dropper. This, however, is an early prediction.
Reviewing the strings section of PE Studio provided some interesting details. In total, there were 73,289 string values, of which the largest was a string 31,026 bytes in size. This was significantly more than in the previous sample. On review, most of these strings were obfuscated, which supports my theory that this sample is a dropper. Below is a screenshot of some of the largest strings observed in this sample:

I was able to validate that my initial assumption was right once I opened the sample in dnSpy. The initial section of the sample looks very similar to the previous one. Below is a screenshot of the program stub which, put side by side, looks almost identical to the previous binary:

However, upon review it seems the anti-debugging techniques should work in this sample, whereas they didn't in the previous one. This is because the BOOLEAN values under the program section, e.g. "Program.antiVM", have been set to a value, whereas initially they were left null. Therefore I will need to apply the patches I documented previously to this sample in order to overcome the anti-debugging techniques. This is where I am happy I put the time into understanding them.

To understand the anti-debugging techniques, please review my previous post, which goes into detail about how I overcame them.
**Observation 1**
My first observation in comparing the two samples directed me to the new functions and variables under the program section of the new sample. The two other sections ("Anti" and "Runner") are very similar in comparison. Below I have highlighted the sections that appear different:

I started off by reviewing the variables that are new in this instance. The following provided some further insight:
| Variable | Description |
|----------|------------|
| public static string encryptType = "XORIAIZCNIWw"; | This string appears to be an encryption key. Based on the initial three characters, it might well be an XOR decryption key; however, this is unconfirmed as of yet. |
| File Names List | This list of strings includes: ("Chrome7", "PublicDwlBrowser1100", "setup", "udptest", "sfx_123_206", "oliver2109-c", "setup_2", "liy-game", "jhuuee"). This list is interesting as there are 9 names. There are also 9 "Path.GetTempPath()" strings under the "fileDropPaths" variable, so I assume that 9 files are dropped during the execution of this malware. Furthermore, there are 9 "fileRunTypes", which appear to be a method of applying persistence to the 9 files. |
| Lastly, there is a byte variable named "cvers" containing a Base64 string | Once decoded, this displays a registry path used for persistence: "SOFTWARE\Microsoft\Windows\CurrentVersion\Run" |
The last string is the first observation of a previous finding that is now obfuscated using encoding. Previously this was plain text in the sample; now it is a Base64-encoded string.

**Observation 2**
Once I had overcome and patched the known anti-analysis techniques, I reviewed the program section of the binary to understand the functions I was currently unaware of. The first "for" loop is where I started.

The for loop is built upon the 9 file name strings we observed earlier. For each of these names, the following variables are set:
- The filename eg "Chrome7"
- The file type i.e "exe"
- The file run type i.e "Run Always"
- The file drop path i.e "Path.GetTempPath()" which equates to the current users temporary folder i.e "c:\Users\User_name\Appdata\Local\Temp\"
The last part of the for loop is a call to a function called "GetResource", passing the filename variable. Presumably this is the process of extracting the files from the dropper:

The function GetResource takes the string value of the file name and runs the following:
```csharp
ResourceManager resourceManager = new ResourceManager("yxkbwrlbvrp", Assembly.GetExecutingAssembly());
return (byte[])resourceManager.GetObject(file);
```
According to MSDN the string "yxkbwrlbvrp" is the "ResourceManager.BaseName" property. This represents the root name of the resource files that the ResourceManager searches for. Using a tool called ILSpy, we can see these resources:

I extracted each of these resources to review and confirm whether they are encoded. The extraction subsequently confirmed that they are, as can be observed in "Chrome7":

Having confirmed my theory that these files are encrypted, I located a decryption routine that runs after the extraction of these files. Below is a screenshot of this function:

My initial review of this section has intrigued me. I have detailed several points here:
1. There's an initial if statement that checks the compressed flag, which in this example is set to "false", suggesting that the routine will be skipped. Analysis of this function confirms that it would (if set) use the DeflateStream class to decompress the data. I will need to dynamically analyse this component to confirm the behaviour, as I am unsure whether the check tests the boolean's actual value or merely whether it is non-null.
2. Second, there's an if / else if statement that checks for two different encryption strings, either "AWkCZdaodw" or "XORIAIZCNIWw"; we know from the analysis of strings that in our instance the value is "XORIAIZCNIWw". This branch calls the "EncryptOrDecryptXOR" function with the key "pobpusrcbpx"; the second value passed is the stream of data read from the files. This is the decoding function for the dropped file/files.
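For reference, a repeating-key XOR routine of the kind the function name suggests can be sketched as follows. This is a generic illustration, not the decompiled code, and the use of UTF-16 ("Unicode") key bytes is an assumption carried over from how .NET typically encodes strings:

```python
def xor_crypt(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR is symmetric: the same call encrypts and decrypts
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = "pobpusrcbpx".encode("utf-16-le")  # assumption: .NET "Unicode" key bytes
sample = b"MZ\x90\x00\x03"               # e.g. the start of a PE header
assert xor_crypt(xor_crypt(sample, key), key) == sample
```

Because XOR is its own inverse, running the routine twice with the same key returns the original bytes, which is why a single function can serve as both "Encrypt" and "Decrypt".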
To test this I removed the execution of these files from the program and saved the patched version to execute.
**Note**
If you don’t save the patched version in dnSpy, you will run the previously compiled binary (the malicious one) and run the risk of exposure. You must ensure that you are isolated from the internet and running in a virtualised, isolated environment before doing this!

I checked the "C:\Users\John\AppData\Local\Temp" directory for the files and was pleasantly surprised :) 9 executable files were created:

I checked for persistence, which was not set for any of these executables; a boolean value of "false" therefore does prevent that function from running. This also suggests that the files do not pass through the decompression routine, and so the only obfuscation is the XOR function. To test this I wrote a simple PowerShell script to decode a dropped file:
```
$key = 'pobpusrcbpx'
$keyBytes = [System.Text.Encoding]::Unicode.GetBytes($key)
$len_keyBytes = $keyBytes.Count
$file = [System.IO.File]::ReadAllBytes("E:\02-MALWARE\pctool\pctool\Chrome7")
$filelen = $file.Count
Write-Output "--------------------"
$XOR_array = New-Object Byte[] $filelen
for($i=0; $i -lt $filelen; $i++)
{
    # Repeat the key across the whole file; XOR is symmetric so this both encodes and decodes
    $XOR_array[$i] = $keyBytes[$i % $len_keyBytes] -bxor $file[$i]
}
$entirefile = $XOR_array -join ' '
Write-Output $entirefile
```
My script proved that this is the only encoding used, as shown in the screenshot below. The HxD window contains the file Chrome7 dropped through the execution of the patched malware, whereas the PowerShell command was run against the blob extracted from the "Chrome7" resource export, which was then converted to hexadecimal using CyberChef:

## Conclusion
So we were able to successfully confirm that this binary was indeed a dropper used to drop and execute 9 files. This time the malware author used a basic level of obfuscation (XOR) to encode/decode the binaries which reside inside the dropper. This was one of two different decoding procedures that are built into this sample, so presumably these executables are derived from a builder where the threat actor can specify which encoding to choose. This, however, is not confirmed.
At the time of writing, the purpose of these executables is unknown. Perhaps a PART3 for this saga!
Really enjoyed reviewing this sample. Hope you enjoyed reading through my notes :)
### Indicators of Compromise
Below is a list of indicators obtained from the sample:
| Indicator | Description|
|----------|------------|
|Chrome7.exe|File Dropped|
|jhuuee.exe|File Dropped|
|liy-game.exe|File Dropped|
|oliver2109-c.exe|File Dropped|
|PublicDwlBrowser1100.exe|File Dropped|
|setup.exe|File Dropped|
|setup_2.exe|File Dropped|
|sfx_123_206.exe|File Dropped|
|udptest.exe|File Dropped|
|C:\Users\John\AppData\Local\Temp| Drop directory, where John is the user who executed it (my name's not John ;))|
|yxkbwrlbvrp|Resource name where Binaries are encoded and stored|
|U09GVFdBUkVcTWljcm9zb<br>2Z0XFdpbmRvd3NcQ3VycmVudFZ<br>lcnNpb25cUnVu|Base64 encoded registry path|
|XORIAIZCNIWw| Encryption string used to determine the encryption process to execute|
|pobpusrcbpx|XOR key used to decode the binaries|
|AWkCZdaodw| Encryption string used to determine the encryption process to execute - this was not used in this sample|
|pctool.exe|Original File Name|
|1E5DB48934EF0508B896A5E06F36A655 |MD5 |
|C1EC9AB65A2D7AAA9FFDF952292BEEDC39A06AE5 |SHA1 |
|389745CB2190986EAA84B6B7410FF6341A6AAB127B0763C294EA84E13C2D8E1A|SHA256 |
|4280320 (bytes) |File Size |
|32-bit |CPU|
| 63.819095 | 505 | 0.76748 | eng_Latn | 0.998497 |
3e7713f9012b5ed0ad7acdaff33b6dcdb4dbe013 | 2,695 | md | Markdown | docs/NO.2_designPattern/structuralPatterns/NO.6-facade-pattern.md | ChenMJ068/JavaNotes | fa54a98169f295915888b736060f221edce35fcb | [
"Apache-2.0"
] | 1 | 2020-04-07T09:59:35.000Z | 2020-04-07T09:59:35.000Z | docs/NO.2_designPattern/structuralPatterns/NO.6-facade-pattern.md | Chenide/JavaNotes | 995e8dd30e83a94c73a4aeba7dfc41fe506cf1ae | [
"Apache-2.0"
] | 1 | 2022-03-31T21:05:05.000Z | 2022-03-31T21:05:05.000Z | docs/NO.2_designPattern/structuralPatterns/NO.6-facade-pattern.md | ChenMJ068/JavaNotes | fa54a98169f295915888b736060f221edce35fcb | [
"Apache-2.0"
] | null | null | null | ## 外观模式
## Facade Pattern

The Facade Pattern hides the complexity of a system and provides clients with a single interface through which the system can be accessed. This type of design pattern is a structural pattern: it adds an interface to an existing system to hide the system's complexity.

This pattern involves a single class that provides the simplified methods the client requests and delegates calls to the methods of the existing system classes.

#### The Facade pattern is a typical application of the "Law of Demeter", and it has the following main advantages:

- It reduces the coupling between the subsystem and the client, so that changes in the subsystem do not affect the client classes that call it.
- It shields clients from the subsystem components, reduces the number of objects the client has to deal with, and makes the subsystem easier to use.
- It reduces compile-time dependencies in large software systems and simplifies porting the system between platforms, because compiling one subsystem does not affect the other subsystems or the facade object.

#### The main disadvantages of the Facade pattern are:

- It cannot properly restrict how clients use the subsystem classes.
- Adding a new subsystem may require modifying the facade class or the client source code, which violates the "Open/Closed Principle".

#### Usage scenarios:

- Providing a module through which the outside world can access a complex module or subsystem.
- Subsystems that are relatively independent.
- Guarding against risks introduced by less experienced developers.

#### Notes:

In a layered architecture, the facade pattern can be used to define an entry point for each layer of the system.
假设去医院看病,挂号,看医生,检查,取药都需要自己去,如果对环境不熟悉,非常浪费时间,如果有个人带着你还不用排队就能节省很多时间了。

- 创建接口医院
```java
public interface Hospital {
void proxy();
}
```
- 创建接口的实现类,挂号,看医生,取药
```java
public class Registration implements Hospital {
@Override
public void proxy() {
System.out.println("先挂号");
}
}
```
```java
public class SeeDoctor implements Hospital {
@Override
public void proxy() {
System.out.println("检查病因");
}
}
```
```java
public class Pharmacy implements Hospital {
@Override
public void proxy() {
System.out.println("取药");
}
}
```
- 创建一个代理,可以带领你做很多事
```java
public class Hospitalproxy {
private Registration registration;
private SeeDoctor seeDoctor;
private Pharmacy pharmacy;
public Hospitalproxy() {
this.registration = new Registration();
this.seeDoctor = new SeeDoctor();
this.pharmacy = new Pharmacy();
}
public void setRegistration(){
registration.proxy();
}
public void setSeeDoctor(){
seeDoctor.proxy();
}
public void setPharmacy(){
pharmacy.proxy();
}
}
```
- Create a patient
```java
public class Patient {
public static void main(String[] args) {
HospitalProxy proxy = new HospitalProxy();
proxy.setRegistration();
proxy.setSeeDoctor();
proxy.setPharmacy();
}
}
```
- Execution result
```
先挂号
检查病因
取药
```
### 总结
- 在外观模式中,外部与一个子系统的通信必须通过一个统一的外观对象进行,为子系统中的一组接口提供一个一致的界面,外观模式定义了一
个高层接口,这个接口使得这一子系统更加容易使用。外观模式又称为门面模式,它是一种对象结构型模式。
- 外观模式包含两个角色:外观角色是在客户端直接调用的角色,在外观角色中可以知道相关的(一个或者多个)子系统的功能和责任,它将所有
从客户端发来的请求委派到相应的子系统去,传递给相应的子系统对象处理;在软件系统中可以同时有一个或者多个子系统角色,每一个子系统
可以不是一个单独的类,而是一个类的集合,它实现子系统的功能。
- 外观模式要求一个子系统的外部与其内部的通信通过一个统一的外观对象进行,外观类将客户端与子系统的内部复杂性分隔开,使得客户端只
需要与外观对象打交道,而不需要与子系统内部的很多对象打交道。
- 外观模式主要优点在于对客户屏蔽子系统组件,减少了客户处理的对象数目并使得子系统使用起来更加容易,它实现了子系统与客户之间的松
耦合关系,并降低了大型软件系统中的编译依赖性,简化了系统在不同平台之间的移植过程;其缺点在于不能很好地限制客户使用子系统类,而
且在不引入抽象外观类的情况下,增加新的子系统可能需要修改外观类或客户端的源代码,违背了“开闭原则”。
- 外观模式适用情况包括:要为一个复杂子系统提供一个简单接口;客户程序与多个子系统之间存在很大的依赖性;在层次化结构中,需要定义
系统中每一层的入口,使得层与层之间不直接产生联系。
源码地址:[设计模式源码](https://github.com/Chenide/JavaNotes) | 22.838983 | 70 | 0.730241 | yue_Hant | 0.253121 |
3e771ad0826f5e4d5229f6889c14ce527b4fdda4 | 1,412 | md | Markdown | 2020/07/23/2020-07-23 10:30.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | 3 | 2020-07-14T14:54:15.000Z | 2020-08-21T06:48:24.000Z | 2020/07/23/2020-07-23 10:30.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020/07/23/2020-07-23 10:30.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020年07月23日10时数据
Status: 200
1.杭州女子失踪案后续
微博热度:13752152
2.林有有舔许幻山冰淇淋
微博热度:2461058
3.2020高考成绩开始放榜
微博热度:2352061
4.梁正贤是海王吗
微博热度:1425561
5.吴镇宇念王一博粉丝祝福语
微博热度:1356695
6.夏天想白有多难
微博热度:1261373
7.高考查分
微博热度:1166050
8.中国驻美使馆收到炸弹和死亡威胁
微博热度:1058248
9.河南新增1例无症状感染者
微博热度:882795
10.美听证会通过TikTok禁令
微博热度:881976
11.罗晋否认出轨
微博热度:878649
12.思文被house淘汰
微博热度:873875
13.火箭少女护送Yamy
微博热度:872651
14.NBA
微博热度:869063
15.中方敦促美方立即撤销有关错误决定
微博热度:864986
16.美国24小时新增新冠8.3万人
微博热度:861917
17.杭州女子老公
微博热度:859802
18.杭州女子 奇门遁甲
微博热度:857133
19.顾佳穿搭
微博热度:853967
20.湖北分数线
微博热度:846959
21.毛晓彤谈三十而立
微博热度:843411
22.广州急寻剧毒蘑菇遗失物
微博热度:745341
23.BLACKPINK新单预告
微博热度:658198
24.警方成立杭州女子失踪专案组
微博热度:557158
25.江西分数线
微博热度:537321
26.蚂蚁集团员工持股约40%
微博热度:474910
27.三十而已
微博热度:409776
28.高考成绩公布前的半小时
微博热度:373897
29.深圳地铁5号线
微博热度:369268
30.服装行业全年蒸发4000亿
微博热度:321396
31.浙江丽水首次拍到四不像
微博热度:271978
32.清唱也能这么好听
微博热度:259881
33.梁爽不原谅大宝
微博热度:255541
34.西藏那曲6.6级地震
微博热度:242002
35.老人给公交司机扇风被回赠折扇
微博热度:241377
36.白宫一食堂工作人员新冠检测阳性
微博热度:237176
37.得知高考成绩时的你
微博热度:224458
38.万达女子311名密接无一感染
微博热度:217556
39.南京楼市新政
微博热度:214369
40.大连新增1例系某海产品加工企业员工
微博热度:209810
41.南京购房追溯两年内离婚记录
微博热度:205487
42.世界最大农作物太极图景观
微博热度:200104
43.闪耀暖暖
微博热度:198605
44.科学家发现操纵优化细胞衰老途径
微博热度:192883
45.安徽分数线
微博热度:188588
46.杭州公安
微博热度:180916
47.李健工作室辟谣
微博热度:170673
48.张起灵救吴邪
微博热度:168163
49.谢娜微信登不上去
微博热度:158270
50.医疗机构严禁利用名称误导患者
微博热度:157769
| 6.921569 | 20 | 0.781161 | yue_Hant | 0.333813 |
3e7753df7cbe7c50e33decf5ff86658d49aba94b | 2,901 | md | Markdown | data/issues/ZF-9909.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 40 | 2016-06-23T17:52:49.000Z | 2021-03-27T20:02:40.000Z | data/issues/ZF-9909.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 80 | 2016-06-24T13:39:11.000Z | 2019-08-08T06:37:19.000Z | data/issues/ZF-9909.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 52 | 2016-06-24T22:21:49.000Z | 2022-02-24T18:14:03.000Z | ---
layout: issue
title: "Statement containing both parentheses and AS clause"
id: ZF-9909
---
ZF-9909: Statement containing both parentheses and AS clause
------------------------------------------------------------
Issue Type: Bug Created: 2010-05-28T12:33:57.000+0000 Last Updated: 2012-11-20T21:37:40.000+0000 Status: Open Fix version(s):
Reporter: Shahriyar Imanov (shehriyari) Assignee: None Tags: - Zend\_Db\_Select
Related issues:
Attachments:
### Description
Example SELECT statement is like this:
SELECT CAST(column\_name AS UNSIGNED) FROM table1
Code I used was:
```php
$select = $db->select()
             ->from(array('table1'),
                    array('CAST(column_name AS UNSIGNED)')
             );
```
which did not work; the error returned was: Could not find "table1.CAST(column\_name" field. As we know, if there are parentheses inside the clause, it is considered to be a Zend\_Db\_Expr. But on the other hand, the existence of the "AS" clause messes things up, making it think we have a field-name alias here, which we don't - it's just the CAST function's syntax. Of course I overcame the problem by explicitly telling it we have a Zend\_Db\_Expr, with this code, which worked:
```php
$select = $db->select()
             ->from(array('table1'),
                    array('column_name' => new Zend_Db_Expr('CAST(column_name AS UNSIGNED)'))
             );
```
Question for developers of ZF is: Shouldn't you guys not touch any clauses, including AS clause, inside the statement which contains parentheses - the same way you do with Zend\_Db\_Expr objects? Statements with parentheses are considered to be Zend\_Db\_Expr objects, but at the same time they are not...
Shehi
### Comments
Posted by Artem Stepin (nemesis2010) on 2010-05-28T13:15:27.000+0000
If you add 'as column\_name' in the column definition, it should work also without Zend\_Db\_Expr
```php
$select = $db->select()->from(
    array('table1'),
    array('CAST(column_name AS UNSIGNED) as column_name')
);
```
Posted by Shahriyar Imanov (shehriyari) on 2010-05-28T16:43:05.000+0000
Dorogoy Artyom, then why on Earth would I use Zend\_Db at all?! Didn't you know that AS column\_name does not work in most RDBMS's, including MSSQL? The main reason for me in using Zend\_Db is because it abstracts Db management via Factory pattern, it keeps the code clean and it makes sure devs never write SQL/query, instead they model/design it.
Anyway, I believe your solution is very wrong and misleading when it comes to Zend Db.
Posted by Artem Stepin (nemesis2010) on 2010-05-29T01:53:07.000+0000
Ok, I didn't know that. I only tried to use different Zend\_Db\_Adapter's and those given me a similar output.
Posted by Shahriyar Imanov (shehriyari) on 2010-05-29T12:59:53.000+0000
Oh thats ok brother, sorry I bashed on you like this :) Guess I was tired of long day's work... Thanks for your feedback nevertheless!
| 34.129412 | 472 | 0.720786 | eng_Latn | 0.990915 |
3e77a197a5995e9e1003c007faf56b8e22a95558 | 651 | md | Markdown | _posts/2015-11-21-Little-People-Style-80335.md | eudanceyou/eudanceyou.github.io | 9d81bccab5dd52c95c99495c5800da809ea32ed7 | [
"MIT"
] | null | null | null | _posts/2015-11-21-Little-People-Style-80335.md | eudanceyou/eudanceyou.github.io | 9d81bccab5dd52c95c99495c5800da809ea32ed7 | [
"MIT"
] | null | null | null | _posts/2015-11-21-Little-People-Style-80335.md | eudanceyou/eudanceyou.github.io | 9d81bccab5dd52c95c99495c5800da809ea32ed7 | [
"MIT"
] | null | null | null | ---
layout: post
date: '2015-11-21'
title: "Little People Style 80335"
category: Little People
tags: [Little People]
---
### Little People Style 80335
Just **$189.99**
<table><tr><td>BRANDS</td><td>Little People</td></tr></table>
<a href="https://www.antebrands.com/en/little-people/62783-little-people-style-80335.html"><img src="//static.msromantic.com/145552/little-people-style-80335.jpg" alt="Little People Style 80335" style="width:100%;" /></a>
<!-- break -->
Buy it: [https://www.antebrands.com/en/little-people/62783-little-people-style-80335.html](https://www.antebrands.com/en/little-people/62783-little-people-style-80335.html)
| 40.6875 | 221 | 0.71275 | yue_Hant | 0.603039 |
3e783f838f72ed4a6ddcbc6f9b7a46dcd3ccf79b | 79 | md | Markdown | _pages/disclaimer.md | pejuang-onlien/blog | 8cfb00e357ba3be0785a4c5ba66010bbab315f7e | [
"MIT"
] | null | null | null | _pages/disclaimer.md | pejuang-onlien/blog | 8cfb00e357ba3be0785a4c5ba66010bbab315f7e | [
"MIT"
] | null | null | null | _pages/disclaimer.md | pejuang-onlien/blog | 8cfb00e357ba3be0785a4c5ba66010bbab315f7e | [
"MIT"
] | null | null | null | ---
layout: page
title: Disclaimer
permalink: /disclaimer
comments: true
---
| 8.777778 | 22 | 0.708861 | eng_Latn | 0.670679 |
3e7892f5a3dc22a83b99977b3ab1c38bf3186503 | 59 | md | Markdown | publisher-consumer/README.md | frauca/queues | 9c930c9aa03769f884e3aa03b31aabf4140a8cc0 | [
"Apache-2.0"
] | null | null | null | publisher-consumer/README.md | frauca/queues | 9c930c9aa03769f884e3aa03b31aabf4140a8cc0 | [
"Apache-2.0"
] | null | null | null | publisher-consumer/README.md | frauca/queues | 9c930c9aa03769f884e3aa03b31aabf4140a8cc0 | [
"Apache-2.0"
] | null | null | null | Simple project with one producer and one consumer listening | 59 | 59 | 0.864407 | eng_Latn | 0.999905 |
3e7a1764c8f81bf003f183e2caff5eb0d021894c | 13,770 | md | Markdown | articles/active-directory/active-directory-saas-namely-tutorial.md | OpenLocalizationTestOrg/azure-docs-pr15_de-AT | ca82887d8067662697adba993b87860bdbefea29 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | 1 | 2020-11-29T22:55:06.000Z | 2020-11-29T22:55:06.000Z | articles/active-directory/active-directory-saas-namely-tutorial.md | Allyn69/azure-docs-pr15_de-CH | 211ef2a7547f43e3b90b3c4e2cb49e88d7fe139f | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/active-directory/active-directory-saas-namely-tutorial.md | Allyn69/azure-docs-pr15_de-CH | 211ef2a7547f43e3b90b3c4e2cb49e88d7fe139f | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | 2 | 2019-07-03T20:05:49.000Z | 2020-11-29T22:55:15.000Z | <properties
pageTitle="Lernprogramm: Azure Active Directory Integration: | Microsoft Azure"
description="So konfigurieren Sie einmaliges Anmelden zwischen Azure Active Directory und zwar."
services="active-directory"
documentationCenter=""
authors="jeevansd"
manager="prasannas"
editor=""/>
<tags
ms.service="active-directory"
ms.workload="identity"
ms.tgt_pltfrm="na"
ms.devlang="na"
ms.topic="article"
ms.date="10/20/2016"
ms.author="jeedes"/>
# <a name="tutorial-azure-active-directory-integration-with-namely"></a>Lernprogramm: Azure Active Directory Integration:
Das Ziel dieses Lernprogramms ist nämlich Integration in Azure Active Directory (Azure AD) zeigen.
Integrieren nämlich in Azure AD bietet folgende Vorteile:
- Sie können steuern, in Azure AD, die zwar Zugriff hat
- Können die Benutzer automatisch angemeldet an: Abrufen (einmaliges Anmelden) mit ihren Azure AD-Konten
- Sie können Ihre Konten zentral - klassischen Azure-Portal verwalten
Wenn Sie weitere Informationen zur Integration von SaaS Anwendung in Azure AD wissen möchten, finden Sie unter [Zugriff und single Sign-on Azure Active Directory](active-directory-appssoaccess-whatis.md).
## <a name="prerequisites"></a>Erforderliche Komponenten
Konfigurieren von Azure AD Integration: Sie benötigen Folgendes:
- Ein Azure AD-Abonnement
- Nämlich einmalige Anmeldung aktivierte Abonnements
> [AZURE.NOTE] Um Testschritte in diesem Lernprogramm empfehlen nicht wir einer.
Um die Schritte in diesem Lernprogramm zu testen, sollten Sie diese Empfehlung befolgen:
- Verwenden Sie Ihre produktionsumgebung nur dies erforderlich ist.
- Wenn Sie eine Testversion Azure AD-Umgebung haben, können Sie einem einmonatigen Testversion [hier](https://azure.microsoft.com/pricing/free-trial/)abrufen.
## <a name="scenario-description"></a>Beschreibung des Szenarios
Das Ziel dieses Lernprogramms ist Azure AD einmaliges Anmelden in einer Umgebung testen können.
In diesem Lernprogramm beschriebenen Szenario besteht aus zwei wesentlichen Bausteine:
1. Nämlich hinzufügen aus der Galerie
2. Konfigurieren und Testen von Azure AD einmaliges Anmelden
## <a name="adding-namely-from-the-gallery"></a>Nämlich hinzufügen aus der Galerie
Konfigurieren die Integration: Azure AD müssen Sie die Liste der verwalteten SaaS-apps nämlich aus der Galerie hinzufügen.
**Um nämlich aus der Galerie hinzuzufügen, führen Sie die folgenden Schritte:**
1. Klicken Sie im **klassischen Azure-Portal**im linken Navigationsbereich auf **Active Directory**.
![Active Directory][1]
2. Wählen Sie aus der Liste **Verzeichnis** das Verzeichnis für das Verzeichnisintegration soll.
3. Ansicht Applications in der Verzeichnisansicht öffnen, klicken Sie im oberen Menü auf **Applications** .
![Applikationen][2]
4. Klicken Sie am unteren Rand der Seite **Hinzufügen** .
![Applikationen][3]
5. Klicken Sie im Dialogfeld **Was möchten Sie tun** auf **eine Anwendung aus dem Katalog hinzufügen**.
![Applikationen][4]
6. Geben Sie im Suchfeld **:**.

7. Im Ergebnisbereich **nämlich**wählen Sie aus und dann auf **vollständig** die Anwendung hinzufügen.

## <a name="configuring-and-testing-azure-ad-single-sign-on"></a>Konfigurieren und Testen von Azure AD einmaliges Anmelden
Dieser Abschnitt soll wie Sie konfigurieren und Testen der Azure AD einmaliges Anmelden mit nämlich basierend auf einen Testbenutzer "Britta Simon" bezeichnet.
Azure AD muss für einmaliges Anmelden funktioniert wissen, was der Benutzer Gegenstück in zu einem Benutzer in Azure AD. In anderen Worten muss eine Verknüpfung Beziehung zwischen Azure AD-Benutzer und die zugehörigen Benutzer in nämlich hergestellt werden.
Diese Beziehung wird eingerichtet, indem den Wert des **Benutzernamens** in Azure AD als **Benutzername** in nämlich.
Konfigurieren und Testen von Azure AD müssen einmaliges Anmelden, Sie den folgenden Bausteinen durchführen:
1. **[Konfigurieren von Azure AD Single Sign-On](#configuring-azure-ad-single-single-sign-on)** - der Benutzer dieses Feature verwenden können.
2. **[Benutzer erstellen eine Azure Anzeige testen](#creating-an-azure-ad-test-user)** : Azure AD einmaliges Anmelden mit Britta Simon testen.
4. **[Erstellen einer nämlich testen Benutzer](#creating-a-namely-test-user)** - eine Entsprechung von Britta Simon in nämlich, verknüpft ist der Azure AD Darstellung Ihres.
5. **[Zuweisen von Azure AD Benutzer testen](#assigning-the-azure-ad-test-user)** : Britta Simon Azure AD einmaliges Anmelden verwenden aktivieren.
5. **[Testen von Single Sign-On](#testing-single-sign-on)** - überprüfen, ob die Konfiguration funktioniert.
### <a name="configuring-azure-ad-single-sign-on"></a>Konfiguration von Azure AD einmaliges Anmelden
Ziel dieses Abschnitts ist Azure AD einmaliges Anmelden im klassischen Azure-Portal aktivieren und einmaliges Anmelden in Ihrem nämlich Anwendung konfigurieren.
**Konfigurieren von Azure AD folgendermaßen einmaliges Anmelden mit::**
1. Klicken Sie im klassischen Azure-Portal, auf die **nämlich** Application Integration auf **Konfigurieren einmaliges Anmelden** **Konfigurieren Sie einmaliges Anmelden** Dialogfeld öffnen.
![Einmaliges Anmelden konfigurieren][6]
2. Auf der Seite **Wie möchten Sie Benutzer nämlich anmelden** **Azure AD einmaliges Anmelden**wählen Sie und klicken Sie auf **Weiter**.

3. Auf der **App-Einstellungen konfigurieren** , gehen Sie folgendermaßen vor:.

ein. Geben Sie im Textfeld **Anmelde-URL** den URL zur Ihre Benutzer Ihre Anwendung zwar anmelden (z.B.: *https://fabrikam.Namely.com/*).
b. Klicken Sie auf **Weiter**.
4. Auf der Seite **Konfigurieren Sie einmaliges Anmelden an:** die folgenden Schritte:

ein. Klicken Sie auf **Zertifikat herunterladen**und speichern Sie die Datei auf Ihrem Computer.
b. Klicken Sie auf **Weiter**.
1. In einem anderen Browserfenster, melden Sie sich auf der Website als Administrator nämlich Unternehmens.
1. Klicken Sie in der oberen Symbolleiste auf **Unternehmen**.

1. Klicken Sie auf **die Registerkarte** .

1. Klicken Sie auf **SAML**.

1. Auf der Einstellungsseite **SAML** Schritte:

ein. Klicken Sie auf **SAML aktivieren**.
b. Kopieren Sie im klassischen Azure-Portal, auf der Seite **Konfigurieren Sie einmaliges Anmelden am nämlich** den **Einzelnen Sign-On Service URL** -Wert, und fügen Sie ihn in das Textfeld **Identität Anbieter DDO Url** .
c. Öffnen Sie heruntergeladene Zertifikat im Editor, kopieren Sie den Inhalt und fügen Sie ihn in das Textfeld **Identitätszertifikat Anbieter** .
d. Klicken Sie auf **Speichern**.
6. Wählen Sie im klassischen Azure-Portal die Konfiguration für einzelne Zeichen Bestätigung und klicken Sie dann auf **Weiter**.
![Azure AD einmaliges Anmelden][10]
7. Klicken Sie auf der Seite **Bestätigung für einzelne Zeichen** auf **abgeschlossen**.
![Azure AD einmaliges Anmelden][11]
### <a name="creating-an-azure-ad-test-user"></a>Erstellen einen Testbenutzer Azure AD
Dieser Abschnitt soll im klassischen Azure-Portal namens Britta Simon Testbenutzer erstellen.
![Azure AD-Benutzer erstellen][20]
**Um einen Testbenutzer in Azure AD zu erstellen, führen Sie die folgenden Schritte:**
1. Klicken Sie im **klassischen Azure-Portal**im linken Navigationsbereich auf **Active Directory**.

2. Wählen Sie aus der Liste **Verzeichnis** das Verzeichnis für das Verzeichnisintegration soll.
3. Klicken Sie zum Anzeigen der Benutzerliste im oberen Menü auf **Benutzer**.

4. Klicken Sie das Dialogfeld **Benutzer hinzufügen** der Symbolleiste an der Unterseite, **Benutzer hinzufügen**.

5. Auf der Seite **Erzählen zu diesem Benutzer** die folgenden Schritte:

ein. Wählen Sie als Typ der Benutzer neuen Benutzer in der Organisation.
b. Geben Sie im **Textfeld**von Benutzernamen **BrittaSimon**.
c. Klicken Sie auf **Weiter**.
6. Führen Sie auf das **Benutzerprofil** die folgenden Schritte aus:

ein. Geben Sie im Textfeld **Vorname** **Britta**.
b. Im Feld **Nachname** Typ **Simon**.
c. Geben Sie im Textfeld **Anzeigename** **Britta Simon**.
d. Wählen Sie in der Liste **Rolle** **Benutzer**.
e. Klicken Sie auf **Weiter**.
7. Klicken Sie auf der Seite **erhalten temporäres Kennwort** **Erstellen**.

8. Führen Sie die folgenden Schritte auf der Seite **Passwort zu erhalten** :

ein. Notieren Sie den Wert für das **Neue Kennwort**.
b. Klicken Sie auf **abgeschlossen**.
### <a name="creating-a-namely-test-user"></a>Erstellt ein Benutzer nämlich testen
Dieser Abschnitt soll Benutzer Britta Simon in nämlich erstellen.
**Gehen Sie zum Erstellen eines Benutzers in nämlich Britta Simon aufgerufen:**
1. Anmeldung Ihr nämlich Unternehmensstandort als Administrator.
1. Klicken Sie in der oberen Symbolleiste auf **Personen**.

1. Klicken Sie auf die Registerkarte **Verzeichnis** .

1. Klicken Sie auf **neue Person hinzufügen**.
1. Gehen Sie im Dialogfeld **Neue Person hinzufügen** :
ein. Geben Sie im Textfeld **Vorname** **Britta**.
b. Geben Sie im Textfeld **Nachname** **Simon**.
c. Geben Sie im Textfeld **E-Mail** Adresse Brittas klassischen Azure-Portal.
d. Klicken Sie auf **Speichern**.
### <a name="assigning-the-azure-ad-test-user"></a>Zuweisen von Azure AD-Testbenutzer
Dieser Abschnitt soll aktivieren Britta Simon Azure einmaliges Anmelden verwenden, nämlich den Zugang zu gewähren.
![Benutzer zuweisen][200]
**Zuweisen von Britta Simon, die folgenden Schritte:**
1. Der Azure-Verwaltungsportal Ansicht Applications in der Verzeichnisansicht öffnen, klicken Sie auf **Programme** im oberen Menü.
![Benutzer zuweisen][201]
2. In der Anwendungsliste auswählen **:**

1. Klicken Sie im Menü oben auf **Benutzer**.
![Benutzer zuweisen][203]
1. Wählen Sie in der Liste Benutzer **Britta Simon**.
2. Klicken Sie auf unten auf **zuweisen**.
![Benutzer zuweisen][205]
### <a name="testing-single-sign-on"></a>Testen von Single Sign-On
Dieser Abschnitt soll Azure AD einzelne Anmeldung Überprüfen der Konfiguration mithilfe der.
Beim Klicken auf die nämlich in der Kachel, Sie sollten erhalten automatisch angemeldet an Ihre nämlich Anwendung.
## <a name="additional-resources"></a>Zusätzliche Ressourcen
* [Liste der Lernprogramme zum SaaS-Apps in Azure Active Directory integrieren](active-directory-saas-tutorial-list.md)
* [Was ist Zugriff und single Sign-on Azure Active Directory?](active-directory-appssoaccess-whatis.md)
<!--Image references-->
[1]: ./media/active-directory-saas-namely-tutorial/tutorial_general_01.png
[2]: ./media/active-directory-saas-namely-tutorial/tutorial_general_02.png
[3]: ./media/active-directory-saas-namely-tutorial/tutorial_general_03.png
[4]: ./media/active-directory-saas-namely-tutorial/tutorial_general_04.png
[6]: ./media/active-directory-saas-namely-tutorial/tutorial_general_05.png
[10]: ./media/active-directory-saas-namely-tutorial/tutorial_general_06.png
[11]: ./media/active-directory-saas-namely-tutorial/tutorial_general_07.png
[20]: ./media/active-directory-saas-namely-tutorial/tutorial_general_100.png
[200]: ./media/active-directory-saas-namely-tutorial/tutorial_general_200.png
[201]: ./media/active-directory-saas-namely-tutorial/tutorial_general_201.png
[203]: ./media/active-directory-saas-namely-tutorial/tutorial_general_203.png
[204]: ./media/active-directory-saas-namely-tutorial/tutorial_general_204.png
[205]: ./media/active-directory-saas-namely-tutorial/tutorial_general_205.png
| 40.982143 | 257 | 0.764198 | deu_Latn | 0.973598 |
3e7a6c3aad7770b9675bd015de5b3a52a1f892f4 | 2,969 | md | Markdown | AlchemyInsights/troubleshoot-audio-issues-in-windows-10.md | isabella232/OfficeDocs-AlchemyInsights-pr.id-ID | a378cb115ca9ee2ef20ad097a08471f925d505a9 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2020-05-19T19:06:50.000Z | 2021-03-06T00:35:09.000Z | AlchemyInsights/troubleshoot-audio-issues-in-windows-10.md | MicrosoftDocs/OfficeDocs-AlchemyInsights-pr.id-ID | 95d1cef182a766160dba451d9c5027f04a6dbe06 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2020-06-02T23:33:41.000Z | 2022-02-09T07:00:30.000Z | AlchemyInsights/troubleshoot-audio-issues-in-windows-10.md | isabella232/OfficeDocs-AlchemyInsights-pr.id-ID | a378cb115ca9ee2ef20ad097a08471f925d505a9 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2020-06-02T23:33:21.000Z | 2021-10-09T10:42:11.000Z | ---
title: Memecahkan masalah audio di Windows 10
ms.author: pebaum
author: pebaum
manager: scotv
ms.audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.custom:
- "3476"
- "9001463"
ms.openlocfilehash: 81a7f77bd6565c52ec9d752934a872e59cc11e89b90a646d17c3549d72e8a69f
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: id-ID
ms.lasthandoff: 08/05/2021
ms.locfileid: "54039429"
---
# <a name="troubleshooting-audio-issues-in-windows-10"></a>Memecahkan masalah audio di Windows 10
**Menjalankan pemecah masalah audio**
1. Membuka pengaturan [Pemecahan masalah](ms-settings:troubleshoot).
2. Pilih **Memutar Audio** Jalankan pemecah > **masalah**.
**Mengatur perangkat default**
Jika menyambungkan ke perangkat audio menggunakan USB atau HDMI, Anda mungkin perlu mengatur perangkat tersebut sebagai default:
1. Buka **Mulai** > **Suara**, lalu **pilih Suara** atau Ubah **suara** sistem dari daftar hasil.
2. Di tab **Pemutaran,** pilih perangkat, pilih **Atur Default**, lalu pilih **OK**.
**Memeriksa kabel, volume, speaker, dan headphone**
1. Periksa koneksi speaker dan headphone untuk kabel yang melonggar, dan pastikan kabel tersebut tersambung ke colokan yang benar.
2. Periksa tingkat daya dan volume anda dan cobalah mengubah semua kontrol volume naik.
3. Beberapa speaker dan aplikasi memiliki kontrol volumenya sendiri; Anda mungkin harus memeriksa semuanya untuk memastikan nilai tersebut berada di tingkat yang tepat.
4. Coba sambungkan menggunakan port USB lain.
**Catatan**: Ingatlah bahwa speaker Anda mungkin tidak berfungsi ketika headphone dicolokkan.
**Periksa Manajer Perangkat**
Untuk memastikan driver telah diperbarui:
1. Pilih **Mulai**, **ketik Manajer Perangkat**, lalu pilih Manajer **Perangkat** dari daftar hasil.
2. Di **bawah Suara, video, dan pengontrol permainan**, pilih kartu suara Anda, buka, pilih tab **Driver,** lalu pilih **Perbarui Driver**.
**Catatan**: Windows menemukan driver baru, cari driver di situs web produsen perangkat dan ikuti instruksi mereka.
**Instal ulang driver**
Jika tidak dapat memperbarui melalui Manajer Perangkat atau menemukan driver baru di situs web produsen, cobalah langkah-langkah berikut:
1. Di Manajer Perangkat, klik kanan (atau tekan dan tahan) driver audio, lalu pilih Hapus **instalan**. Hidupkan ulang perangkat dan Windows Anda mencoba menginstal ulang driver.
2. Jika menginstal ulang driver tidak berhasil, coba gunakan driver audio umum yang dilengkapi dengan Windows. Di Manajer Perangkat, klik kanan (atau tekan dan tahan) driver audio Anda > Perbarui perangkat lunak **driver** Telusuri komputer saya untuk perangkat lunak driver Biarkan saya memilih dari daftar driver perangkat di komputer saya , pilih Perangkat Audio Definisi Tinggi , pilih Berikutnya , dan ikuti instruksi untuk > > menginstalnya.
| 44.313433 | 454 | 0.785113 | ind_Latn | 0.960696 |
3e7b3748931acc3a95fb9d32210b5f5dd5980590 | 984 | md | Markdown | docs/visual-basic/misc/division-by-zero-error.md | TomekLesniak/docs.pl-pl | 3373130e51ecb862641a40c5c38ef91af847fe04 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/misc/division-by-zero-error.md | TomekLesniak/docs.pl-pl | 3373130e51ecb862641a40c5c38ef91af847fe04 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/misc/division-by-zero-error.md | TomekLesniak/docs.pl-pl | 3373130e51ecb862641a40c5c38ef91af847fe04 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Dzielenie przez zero (błąd Visual Basic)
ms.date: 07/20/2015
f1_keywords:
- vbrID11
ms.assetid: 7dc22e29-8baa-4d82-a1a6-2de64ba9b25d
ms.openlocfilehash: 73dacb232b9749e36de0cc76e37eb60334c29e74
ms.sourcegitcommit: bf5c5850654187705bc94cc40ebfb62fe346ab02
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 09/23/2020
ms.locfileid: "91084401"
---
# <a name="division-by-zero-visual-basic-error"></a>Division by zero (Visual Basic error)
The value of an expression used as a divisor is zero.
## <a name="to-correct-this-error"></a>To correct this error
1. Check the spelling of variables in the expression. A misspelled variable can implicitly create a numeric variable that is initialized to zero.
2. Check previous operations on variables in the expression, especially those passed into the procedure as arguments from other procedures.
## <a name="see-also"></a>See also
- [Error types](../programming-guide/language-features/error-types.md)
| 36.444444 | 148 | 0.781504 | pol_Latn | 0.993258 |
3e7b69ec055282437ab8ab7045a5bd662663ed29 | 1,488 | md | Markdown | _posts/articles/2013-12-18-dinoii-plug-and-play.md | jaswalsaurabh/souliss.github.io | b029df39200ccebf00d94705e0925bc62b088062 | [
"MIT"
] | 6 | 2015-09-22T19:24:20.000Z | 2020-01-25T06:51:38.000Z | _posts/articles/2013-12-18-dinoii-plug-and-play.md | jaswalsaurabh/souliss.github.io | b029df39200ccebf00d94705e0925bc62b088062 | [
"MIT"
] | 17 | 2015-07-11T17:32:38.000Z | 2019-10-30T00:04:01.000Z | _posts/articles/2013-12-18-dinoii-plug-and-play.md | jaswalsaurabh/souliss.github.io | b029df39200ccebf00d94705e0925bc62b088062 | [
"MIT"
] | 30 | 2015-03-16T18:32:22.000Z | 2020-10-02T01:54:14.000Z | ---
layout: article
title: "Lets start, DINo v2 full plug&play"
categories: articles
author: plinio_seniore
excerpt: "First attempts for auto-configuration nodes."
tags: [arduino, kmp electronics, rs485, ethernet]
modified: 2013-12-18
image:
feature: 2013-12/DinoII-InputsB_EN.jpg
teaser: 2013-12/DinoII-InputsB_EN-teaser.jpg
ads: false
redirect_from: "2013/12/lets-start-dino-v2-full-plug.html"
---
After some time spent tuning the Souliss drivers for the Wiznet W5200 Ethernet controller equipped on the KMP Electronics DINo v2, we have full support for the new features included in the new Alpha 5, with a focus on plug&play.
Diving into the examples prepared for DINo, you will find there is no longer any need to set the Quick Configuration parameters and IP configuration; that's the new inSketch mode included in the latest Souliss release. Just load the example on your DINo boards and connect them to the network: they will join and be discovered by your Android smartphone automatically, no matter what your IP configuration is.
It isn't magic: using broadcast, the boards discover each other and auto-assign an address through a MAC-RAW (no IP) data exchange; then only the main (Gateway) board, the one that has to be connected to Android, gets the IP configuration from the app itself. These features are of course available on all Souliss nodes, regardless of the communication media used.
Everything is included in the new release A5.0.3, available in the [download](https://github.com/souliss/souliss/wiki/Downloads) area.
3e7cd1cff9cffab0d2185998f1aae38ae3023ae8 | 498 | md | Markdown | README.md | Sergey-Titkov/Integration-Tests-with-Maven3-Failsafe-plugin-Undestandig | a9d9d400233c81485e4bcd3a592aa267b19947bc | [
"MIT"
] | null | null | null | README.md | Sergey-Titkov/Integration-Tests-with-Maven3-Failsafe-plugin-Undestandig | a9d9d400233c81485e4bcd3a592aa267b19947bc | [
"MIT"
] | null | null | null | README.md | Sergey-Titkov/Integration-Tests-with-Maven3-Failsafe-plugin-Undestandig | a9d9d400233c81485e4bcd3a592aa267b19947bc | [
"MIT"
] | null | null | null | # Integration-Tests-with-Maven3-Failsafe-plugin-Undestandig
Figured out how to write integration tests using Maven and the Failsafe plugin.
Integration tests are run with:
```sh
mvn clean verify
```
But it is better to move them into a separate profile.
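A sketch of such a profile-based setup (the profile id `it` is arbitrary; the plugin coordinates and the `integration-test`/`verify` goals are the standard Failsafe configuration, which by default runs `*IT.java` test classes):

```xml
<profiles>
  <profile>
    <id>it</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-failsafe-plugin</artifactId>
          <executions>
            <execution>
              <goals>
                <!-- run the integration tests, then fail the build on errors -->
                <goal>integration-test</goal>
                <goal>verify</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```

With this in place, the integration tests run only when the profile is activated: `mvn clean verify -Pit`.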
There was an idea to show integration test results in SonarQube, but it cannot display them.
SonarQube can only show coverage for integration tests.
You can only "mix" the results of integration and unit tests.
| 31.125 | 105 | 0.827309 | rus_Cyrl | 0.954779 |
3e7d937a7584f723e993ecd75e00162e632fcae3 | 330 | md | Markdown | posts/intellij/README.md | rafasantos/rafael-santos-blog | ad40f2ff52fafa1c9e980a0df7a3583b4b8f9d60 | [
"MIT"
] | null | null | null | posts/intellij/README.md | rafasantos/rafael-santos-blog | ad40f2ff52fafa1c9e980a0df7a3583b4b8f9d60 | [
"MIT"
] | null | null | null | posts/intellij/README.md | rafasantos/rafael-santos-blog | ad40f2ff52fafa1c9e980a0df7a3583b4b8f9d60 | [
"MIT"
] | null | null | null | Intellij Mix
============
Recursively remove IntelliJ project and configuration files.
```bash
# -prune avoids descending into directories as they are deleted
find . -type d -name '.idea' -prune -exec rm -rf {} +
find . -type f -name '*.iml' -exec rm -f {} +
```
On macOS, remove IntelliJ's caches and local settings.
```bash
rm -rf ~/Library/Application\ Support/JetBrains
rm -rf ~/Library/Caches/JetBrains
```
| 22 | 64 | 0.663636 | eng_Latn | 0.350809 |
3e7e0c4fbe1d1bc6e5c562e9b6ed5db610cba507 | 338 | md | Markdown | README.md | jaehyungpark/YHack-2016 | dd818c53fc021ff70fa20d957442516c26d2e791 | [
"MIT"
] | null | null | null | README.md | jaehyungpark/YHack-2016 | dd818c53fc021ff70fa20d957442516c26d2e791 | [
"MIT"
] | null | null | null | README.md | jaehyungpark/YHack-2016 | dd818c53fc021ff70fa20d957442516c26d2e791 | [
"MIT"
] | null | null | null | # YHack-2016
YHack at Yale University.
Project name: Meteor
Created a distributed-computing simulator web application using a REST API, in PHP, JavaScript, HTML, and CSS.
The website is hosted on 1&1's cloud service and is accessible via 74.208.83.96.
(We bought a domain via Namecheap but had issues pointing it at the server's IP address.)
| 42.25 | 108 | 0.778107 | eng_Latn | 0.978384 |
3e7e84bdc41e35f5c0916d61f718afc8ab97d087 | 218 | md | Markdown | README.md | Gibstick/pin-archive-3 | 457738891792daba23e2d0d177095b0ea435a52d | [
"MIT"
] | 2 | 2021-12-31T05:23:01.000Z | 2022-01-22T14:26:37.000Z | README.md | Gibstick/pin-archive-3 | 457738891792daba23e2d0d177095b0ea435a52d | [
"MIT"
] | null | null | null | README.md | Gibstick/pin-archive-3 | 457738891792daba23e2d0d177095b0ea435a52d | [
"MIT"
] | null | null | null | # pin-archive-3
TODO
# Permissions and Scopes
Scopes are:
- bot
- application.commands
Permissions are `9280`:
- Read/View Channels
- Manage Messages
- Add Reactions
The bot requires the message content intent.
| 11.473684 | 44 | 0.747706 | eng_Latn | 0.889498 |
3e7f6eaea387e64514a03c9d35196e7029b9c947 | 8,654 | md | Markdown | key-scenarios.md | Malvoz/MapML-Proposal | 98dd1b9e7eb1622630c30ea604b6be5e8cc696c1 | [
"W3C-20150513"
] | 10 | 2019-12-23T00:36:30.000Z | 2021-12-01T06:18:13.000Z | key-scenarios.md | Malvoz/MapML-Proposal | 98dd1b9e7eb1622630c30ea604b6be5e8cc696c1 | [
"W3C-20150513"
] | 38 | 2020-02-12T22:08:14.000Z | 2021-12-31T12:30:53.000Z | key-scenarios.md | Malvoz/MapML-Proposal | 98dd1b9e7eb1622630c30ea604b6be5e8cc696c1 | [
"W3C-20150513"
] | 3 | 2020-06-15T22:18:33.000Z | 2020-11-03T12:02:49.000Z | <h1 id="key-scenarios">Key scenarios</h1>
Key scenarios of the [MapML Proposal](README.md).
This is a list of scenarios which we hope MapML will solve.
<h2>Contents</h2>
- [Tiled Coordinate Reference Systems](key-scenarios.md#tiled-coordinate-reference-systems)
- [Linking](key-scenarios.md#linking)
- [Including map layers in HTML by reference](#including-map-layers-in-html-by-reference)
- [Links from HTML to MapML](#links-from-html-to-mapml)
- [Links for layer alternate styles](#links-for-layer-alternate-styles)
- [Links for alternate layer coordinate systems (projections)](#links-for-alternate-layer-coordinate-systems-projections)
- [Links between map services](#links-between-map-services)
- [Links between locations](#links-between-locations)
<h2 id="tiled-coordinate-reference-systems">Tiled Coordinate Reference Systems</h2>
Perhaps the most important characteristic of Web maps that is essential for standardization is the definition of the [coordinate reference systems](https://en.wikipedia.org/wiki/Spatial_reference_system) and the scale of the display.

<h2 id="linking">Linking</h2>
<h3 id="including-map-layers-in-html-by-reference">Including map layers in HTML by reference</h3>
[Cartography](https://en.wikipedia.org/wiki/Cartography) is a challenging discipline, no matter the media in which it is performed. The Web medium demands an entirely new kind of cartography, one which makes maps potentially more valuable than ever, by rendering them dynamically pannable and scalable, among other characteristics. The new cartographers have a large and powerful set of server technologies, commonly known as [Geographic Information Systems](https://en.wikipedia.org/wiki/Geographic_information_system). Coupled with open data, and public server APIs, Web cartographers are able to easily publish their products.
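In practice, publishing such a product can be as simple as referencing the layer from HTML by URL. A sketch using the `<map>` and `<layer>` elements this proposal describes (the layer URL is hypothetical):

```html
<map projection="OSMTILE">
  <layer src="https://example.org/published-cartography/mapml" checked></layer>
</map>
```

The user agent fetches the MapML document at `src` and renders it as a pannable, scalable layer.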
<h3 id="links-from-html-to-mapml">Links from HTML to MapML</h3>
It’s one thing to have an element like `<layer>` to provide the client logic for a MapML document embedded in an HTML document, but what should happen if a simple `<a href="">` pointed to a MapML document? We’ve said that a MapML document should be parseable with the HTML parser, so what should the default behaviour of activating such a link be? Probably, it should be similar to the behaviour which occurs when you create a link to a media type that is natively supported by HTML, such as image/png or video/mp4: the browser synthesizes a simple HTML document with a client element such as `<img src="">` or `<video><source src=""></video>`, i.e. it should synthesize an HTML document with a `<map><layer src=""></layer></map>` element with default or generated metadata and parameters.
<h3 id="links-for-layer-alternate-styles">Links for layer alternate styles</h3>
```html
<mapml lang="en">
<head>
<meta charset="utf-8">
<title>States</title>
<link href="./?style=pophatch" rel="style" title="pophatch">
<link href="./?style=polygon" rel="style" title="polygon">
<link href="./?style=" rel="self style" title="population">
</head>
…
</mapml>
```
Results in a rendered map widget UI with links to mutually exclusive options:

Selecting the ‘pophatch’ link responds with a different MapML document:
```html
<mapml lang="en">
<head>
<meta charset="utf-8">
<title>States</title>
<link href="./?style=pophatch" rel="self style" title="pophatch">
<link href="./?style=polygon" rel="style" title="polygon">
<link href="./?style=" rel="style" title="population">
</head>
…
</mapml>
```
which replaces that layer’s browsing context with a different representation of the layer:

In the above code example, a MapML document provides links to alternate ‘styles’ of itself. The current map document is additionally and importantly tagged with the `self` link relation. This link-based facility is very powerful, because what constitutes a ‘style’ for the current map document is in the eye of the document author, and is not limited to a representation with an alternate stylesheet applied to the same data. What is linked to as a ‘style’ may be a radically different rendering of ‘this layer’, even different data, for example a satellite view vs. a topographic/street map rendering (which are obviously not simply different stylesheet renderings of the same data, as in the above example, but are in fact different data for the same extent).
<h3 id="links-for-alternate-layer-coordinate-systems-projections">Links for alternate layer coordinate systems (projections)</h3>
Geographic Information Systems allow us to provide the same data in a variety of coordinate systems. When the same layer data is available in different MapML coordinate systems, service providers can make it easy to select the right URL for a layer by encoding links to the current data represented in alternate coordinate systems, in the document metadata (in the head). The user agent can, based on the required projection designated by the `<map projection="">` attribute, automatically select (negotiate) from among the advertised links, as shown below. The coordinate system selection/negotiation process is transparent and requires no input from the user; if the URL used by the author in the `<layer src="">` value leads to a MapML document with a coordinate system that does not match its parent `<map projection="">` value, the user agent will select the correct advertised alternate to replace it.
```html
<mapml lang="en">
<head>
<meta charset="utf-8">
<title>States</title>
<meta name="projection" content="OSMTILE">
<link href="./CBMTILE" rel="alternate" projection="CBMTILE">
<link href="./APSTILE" rel="alternate" projection="APSTILE">
<link href="./WGS84" rel="alternate" projection="WGS84">
</head>
...
</mapml>
```
`alternate` link relations are used to provide alternate representations by format and other criteria. In this case, the `projection` attribute is used by the UA to distinguish the links within the link group identified by the `alternate` link relation.
<h3 id="links-between-map-services">Links between map services</h3>
This feature could enable federation of map services in a way similar to the ‘federation’ of the HTML Web today, in that in today’s Web, authors decide to link to others’ Web sites and so the federation of sites constituting the Web is enabled. The MapML-enabled Web should follow a similar path, in that not every Web map needs or can contain all the map information in the world. So, links should be used to allow map authors to link to others maps in a similar way. I believe this blends with or is the same as the [links between locations](#links-between-locations) use case, below.
In the `<feature>` markup [shown in the High-Level API explainer](high-level-api.md#bookmark2), a standard hyperlink is shown, wrapped around the outer polygon coordinates of a feature’s geometry:
```html
<polygon>
<a href="https://example.org/canada/mapml">
<coordinates class="f72 outer">59.5303058 74.7915903 59.7266984 74.7479857 … 76.1838491 <span class="noline">67.5 76.1894504 67.5 76.7412725 67.5 77.0079535</span> 67.2115912 76.9522523 … 74.727385 59.5303058 74.7915903</coordinates>
</a>
</polygon>
```
Links like this could have a default visual and accessible cue to indicate link-ness, like its text-wrapping `<a href>` counterpart, that would signal what the user should expect to happen if activated with a gesture. Useful actions could include: replacing the entire (root) browsing context with another map or HTML document, or loading the link target into the map layer browsing context from which it was activated.
Like `<span>` elements, `<a>` elements could appear within the `<coordinates>` element, allowing links from geometry parts.
<h3 id="links-between-locations">Links between locations</h3>
Links between locations could be marked up similarly to [links between services](#links-between-map-services), possibly with the addition of attributes indicating the location and zoom of the link destination. The current location of a map could be modified by activating links from one location to another. There might need to be a different visual / accessible cue for such links. By default, the map might animate in a “fly to” manner in response to link activation.
| 75.252174 | 909 | 0.761613 | eng_Latn | 0.993475 |
3e82a44bc818e884d99ec9b0e4ea200d297043bc | 25 | md | Markdown | README.md | oTkPoBeHuE/jsgame | c40a0a9d6e7278a96b13f77880bc8c742a520af3 | [
"MIT"
] | null | null | null | README.md | oTkPoBeHuE/jsgame | c40a0a9d6e7278a96b13f77880bc8c742a520af3 | [
"MIT"
] | null | null | null | README.md | oTkPoBeHuE/jsgame | c40a0a9d6e7278a96b13f77880bc8c742a520af3 | [
"MIT"
] | null | null | null | # jsgame
Javascript game
| 8.333333 | 15 | 0.8 | eng_Latn | 0.945835 |
3e84a55c9bdd59fed65bd27725e2e425d5d46e50 | 807 | md | Markdown | neuralstyle/README.md | awadalaa/DataSciencePractice | 604c4838fd51d59032978621fe7e1973320fc60b | [
"MIT"
] | null | null | null | neuralstyle/README.md | awadalaa/DataSciencePractice | 604c4838fd51d59032978621fe7e1973320fc60b | [
"MIT"
] | null | null | null | neuralstyle/README.md | awadalaa/DataSciencePractice | 604c4838fd51d59032978621fe7e1973320fc60b | [
"MIT"
] | null | null | null | # neural-style
Neural style in tensorflow! This is a tensorflow implementation based on the ideas from this [paper][paper]. Using a neural net we can differentiate style from content and combine two images.
## Examples

These were the input images used :


[Rain Princess - Leonid Afremov][rain]
## Prerequisite
Get Pre-trained VGG network:
```wget http://www.vlfeat.org/matconvnet/models/beta16/imagenet-vgg-verydeep-19.mat```
## Reference
[A Neural Algorithm of Artistic Style (Leon A. Gatys, et al.)][paper]
[paper]: http://arxiv.org/abs/1508.06576
[rain]: https://afremov.com/RAIN-PRINCESS-Palette-knife-Oil-Painting-on-Canvas-by-Leonid-Afremov-Size-30-x30.html | 28.821429 | 191 | 0.755886 | eng_Latn | 0.404783 |
3e84d50cbccde59db1dbd6d3c297565e60e9c2d5 | 36,975 | md | Markdown | Documentation/Building Stroika.md | SophistSolutions/Stroika | f4e5d84767903a054fba0a6b9c7c4bd1aaefd105 | [
"MIT"
] | 28 | 2015-09-22T21:43:32.000Z | 2022-02-28T01:35:01.000Z | Documentation/Building Stroika.md | SophistSolutions/Stroika | f4e5d84767903a054fba0a6b9c7c4bd1aaefd105 | [
"MIT"
] | 98 | 2015-01-22T03:21:27.000Z | 2022-03-02T01:47:00.000Z | Documentation/Building Stroika.md | SophistSolutions/Stroika | f4e5d84767903a054fba0a6b9c7c4bd1aaefd105 | [
"MIT"
] | 4 | 2019-02-21T16:45:25.000Z | 2022-02-18T13:40:04.000Z | # Building Stroika
---
## Common
Stroika is a C++ class library. The only fully supported build environment for Stroika is GNU Make. Once you have that set up, you can build through your favorite IDE.
This build process is cross-platform. It supports cross-compiling, and builds on Visual Studio.net (Windows) and Linux.
---
## Quick Start
```bash
wget https://github.com/SophistSolutions/Stroika/archive/v2.1-Release.tar.gz
tar xf v2.1-Release.tar.gz && mv Stroika-2.1-Release Stroika-Dev
```
or
```bash
git clone https://github.com/SophistSolutions/Stroika.git Stroika-Dev
```
followed by:
```bash
make --directory Stroika-Dev all run-tests
```
If you have a relatively standard POSIX-like C++ build environment, you may be done at this point. If you got errors, or want to know more, read on.
### Build with Docker
If you are missing any components and just want a ready-made environment with all the right build components installed, you can use the pre-built docker containers:
UNIX:
```bash
docker run -it sophistsolutionsinc/stroika-buildvm-ubuntu2004-small
cat Getting-Started-With-Stroika.md
```
Windows:
```bash
docker run -it sophistsolutionsinc/stroika-buildvm-windows-cygwin-vs2k19
cat Getting-Started-With-Stroika.md
```
### **_Note_**
It takes a while to build all of Stroika (10-20 minutes per configuration), so adding -j10 (or so) helps a lot.
---
## More Details on Getting Started
### Install required tools
This mostly consists of GNU make and perl (see details below, depending on your platform). After that, the Stroika build process will prompt you for anything else you need.
#### Required for ALL platforms
- c++ compiler supporting C++17 or later
- make (gnu make)
- patch
- perl
- pkg-config
- realpath
- sed
- tar
- tr
- wget
- 7za (if building with LZMA SDK – the default)
- unzip
#### For MacOS
- XCode 12 or later
- install from appstore,
- Then from command line
    - xcode-select --install
- Homebrew can be helpful (but use whatever package mgr you wish)
  - ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
- to install apps with brew, use "brew install APPNAME"
- brew install gnu-sed
- brew install p7zip (if building lzma)
#### For Windows
- Visual Studio.net 2017 (or later)
- Currently tested with Visual Studio.net 2017 and Visual Studio.net 2019 (see release notes for details)
- Cygwin
Including
- dos2unix
- unix2dos
#### For UNIX
- Compiler
- gcc 8 or later OR
- Stroika v2.1 is currently tested with gcc8, gcc9, and gcc10
- llvm (clang++) 6 or later
- Stroika v2.1 is currently tested with clang6, clang7, clang8, clang9, clang10
- automake (if building curl)
- libtool (gnu version) – (if building curl)
### Things to try (explore)
- `make help`
Not needed, but gives some idea of make options.
- `make check-prerequisite-tools`
Not needed, but tells you if you are missing anything critical.
- `make default-configurations`
Not needed, but it's a springboard for setting up the configuration you want.
Review ConfigurationFiles/Debug.xml or any of the other default configuration files
---
## The **configure** script
Like many non-trivial C/C++ libraries, you run configure to establish some build parameters, before invoking make. But unlike most such configuration systems, Stroika creates 'named' configurations, and facilitates building multiple such named configurations at once.
Each configuration is stored in a file named \${CONFIGNAME}.xml in the top-level `ConfigurationFiles/` directory.
- Examples of generating configurations
- `./configure Debug-x86 --config-tag Windows --config-tag x86 --arch x86 --apply-default-debug-flags`
- `./configure Debug --config-tag Unix --apply-default-debug-flags`
- `./configure clang++-6-release-libstdc++ --config-tag Unix --compiler-driver clang++-6.0 --apply-default-release-flags --stdlib libstdc++ --trace2file enable`
- `CXX=clang++ ./configure Debug-clang --config-tag Unix --apply-default-debug-flags`
- `./configure g++-valgrind-debug-SSLPurify --config-tag Unix --config-tag valgrind -valgrind enable --openssl use --openssl-extraargs purify --apply-default-debug-flags --sanitize none;`
### Configuration File Format
Simple XML format (@todo provide XSD).
The command-line used to generate the configuration is the first element of the XML file, so it's easy to regenerate the configuration with whatever slight variation you wish.
#### Example Configuration file
```xml
<!--This file autogenerated by the configure command: see Configure-Command-Line, modify it, and re-run-->
<Configuration>
<Configure-Command-Line>configure Debug-x86_64 --config-tag Windows --config-tag x86_64 --build-by-default Cygwin --arch x86_64 --apply-default-debug-flags</Configure-Command-Line>
<ProjectPlatformSubdir>VisualStudio.Net-2019</ProjectPlatformSubdir>
<BUILD_TOOLS_ROOT>C:/Program Files (x86)/Microsoft Visual Studio/2019/Community</BUILD_TOOLS_ROOT>
<TARGET_PLATFORMS>Windows</TARGET_PLATFORMS>
<ARCH>x86_64</ARCH>
<ConfigTags>Windows x86_64</ConfigTags>
<BuildByDefault>Cygwin</BuildByDefault>
<AS>C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/bin/HostX64/x64/ml64.exe</AS>
<CC>C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/bin/HostX64/x64/cl.exe</CC>
<CXX>C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/bin/HostX64/x64/cl.exe</CXX>
<MIDL>C:/Program Files (x86)/Windows Kits/10/bin/10.0.19041.0/x64/midl.exe</MIDL>
<RC>C:/Program Files (x86)/Windows Kits/10/bin/10.0.19041.0/x64/rc.exe</RC>
<ExtraMakeDefines>
</ExtraMakeDefines>
<PkgConfigLinkLineAppendages>
</PkgConfigLinkLineAppendages>
<ENABLE_ASSERTIONS>1</ENABLE_ASSERTIONS>
<qFeatureFlag_ActivePerl>use</qFeatureFlag_ActivePerl>
<qFeatureFlag_boost>use</qFeatureFlag_boost>
<qFeatureFlag_LibCurl>no</qFeatureFlag_LibCurl>
<qFeatureFlag_OpenSSL>use</qFeatureFlag_OpenSSL>
<qFeatureFlag_WinHTTP>use-system</qFeatureFlag_WinHTTP>
<qFeatureFlag_ATLMFC>use-system</qFeatureFlag_ATLMFC>
<qFeatureFlag_Xerces>use</qFeatureFlag_Xerces>
<qFeatureFlag_ZLib>use</qFeatureFlag_ZLib>
<qFeatureFlag_sqlite>use</qFeatureFlag_sqlite>
<qFeatureFlag_LZMA>use</qFeatureFlag_LZMA>
<qFeatureFlag_WIX>use</qFeatureFlag_WIX>
<CPPFLAGS>/I"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/ATLMFC/include" /I"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/include" /I"C:/Program Files (x86)/Windows Kits/NETFXSDK/4.8/include/um" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/ucrt" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/shared" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/um" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/winrt" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/cppwinrt" /I"C:/Sandbox/Stroika/DevRoot/Builds/Debug-x86_64/ThirdPartyComponents/include/" /I"C:/Sandbox/Stroika/DevRoot/Library/Sources/" /I"C:/Sandbox/Stroika/DevRoot/IntermediateFiles/Debug-x86_64/" /D_UNICODE /DUNICODE /D_WINDOWS /D_DEBUG /DqDebug=1 /DqHasFeature_LibCurl=0 /DqHasFeature_OpenSSL=1 /DqHasFeature_WinHTTP=1 /DqHasFeature_ATLMFC=1 /DqHasFeature_Xerces=1 /DqHasFeature_ZLib=1 /DqHasFeature_sqlite=1 /DqHasFeature_LZMA=1 /DqHasFeature_boost=1 /DqTraceToFile=1 /DqDefaultTracingOn=1</CPPFLAGS>
<CFLAGS>/I"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/ATLMFC/include" /I"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/include" /I"C:/Program Files (x86)/Windows Kits/NETFXSDK/4.8/include/um" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/ucrt" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/shared" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/um" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/winrt" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/cppwinrt" /I"C:/Sandbox/Stroika/DevRoot/Builds/Debug-x86_64/ThirdPartyComponents/include/" /I"C:/Sandbox/Stroika/DevRoot/Library/Sources/" /I"C:/Sandbox/Stroika/DevRoot/IntermediateFiles/Debug-x86_64/" /EHsc /nologo /GR /Gd /W4 /Zc:inline /FC /bigobj /RTCsu /GS /Oy- /Od /MTd /Z7 /D_UNICODE /DUNICODE /D_WINDOWS /D_DEBUG /DqDebug=1 /DqHasFeature_LibCurl=0 /DqHasFeature_OpenSSL=1 /DqHasFeature_WinHTTP=1 /DqHasFeature_ATLMFC=1 /DqHasFeature_Xerces=1 /DqHasFeature_ZLib=1 /DqHasFeature_sqlite=1 /DqHasFeature_LZMA=1 /DqHasFeature_boost=1 /DqTraceToFile=1 /DqDefaultTracingOn=1 -fsanitize=address</CFLAGS>
<CXXFLAGS>/std:c++latest /I"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/ATLMFC/include" /I"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/include" /I"C:/Program Files (x86)/Windows Kits/NETFXSDK/4.8/include/um" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/ucrt" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/shared" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/um" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/winrt" /I"C:/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/cppwinrt" /I"C:/Sandbox/Stroika/DevRoot/Builds/Debug-x86_64/ThirdPartyComponents/include/" /I"C:/Sandbox/Stroika/DevRoot/Library/Sources/" /I"C:/Sandbox/Stroika/DevRoot/IntermediateFiles/Debug-x86_64/" /EHsc /nologo /GR /Gd /W4 /Zc:inline /FC /bigobj /RTCsu /GS /Oy- /Od /MTd /Z7 /D_UNICODE /DUNICODE /D_WINDOWS /D_DEBUG /DqDebug=1 /DqHasFeature_LibCurl=0 /DqHasFeature_OpenSSL=1 /DqHasFeature_WinHTTP=1 /DqHasFeature_ATLMFC=1 /DqHasFeature_Xerces=1 /DqHasFeature_ZLib=1 /DqHasFeature_sqlite=1 /DqHasFeature_LZMA=1 /DqHasFeature_boost=1 /DqTraceToFile=1 /DqDefaultTracingOn=1 -fsanitize=address</CXXFLAGS>
<INCLUDES_PATH>/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/ATLMFC/include:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/include:/cygdrive/c/Program Files (x86)/Windows Kits/NETFXSDK/4.8/include/um:/cygdrive/c/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/ucrt:/cygdrive/c/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/shared:/cygdrive/c/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/um:/cygdrive/c/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/winrt:/cygdrive/c/Program Files (x86)/Windows Kits/10/include/10.0.19041.0/cppwinrt:/cygdrive/c/Sandbox/Stroika/DevRoot/Builds/Debug-x86_64/ThirdPartyComponents/include/:/cygdrive/c/Sandbox/Stroika/DevRoot/Library/Sources/:/cygdrive/c/Sandbox/Stroika/DevRoot/IntermediateFiles/Debug-x86_64/</INCLUDES_PATH>
<CrossCompiling>false</CrossCompiling>
<IncludeDebugSymbolsInLibraries>1</IncludeDebugSymbolsInLibraries>
<IncludeDebugSymbolsInExecutables>1</IncludeDebugSymbolsInExecutables>
<TOOLS_PATH_ADDITIONS>/cygdrive/c/Program Files (x86)/HTML Help Workshop:/cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v10.0A/bin/NETFX 4.8 Tools/x64/:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/CMake/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/Ninja:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/FSharp/Tools:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/TeamFoundation/Team Explorer:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/TestWindow:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/Extensions/Microsoft/IntelliCode/CLI:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/VC/Linux/bin/ConnectionManagerExe:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/VC/VCPackages:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/Tools/:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/Tools/devinit:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/MSBuild/Current/Bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/MSBuild/Current/bin/Roslyn:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Team Tools/Performance Tools:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/Team Tools/Performance Tools/x64:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/bin/HostX64/x64:/cygdrive/c/Program Files (x86)/Microsoft Visual 
Studio/Shared/Common/VSPerfCollectionTools/vs2019/:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/Shared/Common/VSPerfCollectionTools/vs2019/x64:/cygdrive/c/Program Files (x86)/Windows Kits/10/bin/10.0.19041.0/x64:/cygdrive/c/Program Files (x86)/Windows Kits/10/bin/x64:/cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319</TOOLS_PATH_ADDITIONS>
<LIBS_PATH>/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/ATLMFC/lib/x64:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/lib/x64:/cygdrive/c/Program Files (x86)/Windows Kits/NETFXSDK/4.8/lib/um/x64:/cygdrive/c/Program Files (x86)/Windows Kits/10/lib/10.0.19041.0/ucrt/x64:/cygdrive/c/Program Files (x86)/Windows Kits/10/lib/10.0.19041.0/um/x64:/cygdrive/c/Sandbox/Stroika/DevRoot/Builds/Debug-x86_64/ThirdPartyComponents/lib/</LIBS_PATH>
<LIB_DEPENDENCIES>xerces-c_3D.lib lzma.lib sqlite.lib zlib.lib urlmon.lib rpcrt4.lib kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib </LIB_DEPENDENCIES>
<EXTRA_PREFIX_LINKER_ARGS> /nologo /MACHINE:x64 /DEBUG</EXTRA_PREFIX_LINKER_ARGS>
<EXTRA_SUFFIX_LINKER_ARGS></EXTRA_SUFFIX_LINKER_ARGS>
<AR></AR>
<LIBTOOL>C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/bin/HostX64/x64/lib.exe</LIBTOOL>
<RANLIB></RANLIB>
<Linker>C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30037/bin/HostX64/x64/link.exe</Linker>
<STRIP></STRIP>
<RUN_PREFIX></RUN_PREFIX>
</Configuration>
```
### Configuration Basic Concepts
- Each configuration has a name (e.g. Debug, Release-clang-7, Debug-raspberry-pi, Release-Centos-6, etc)
- Each configuration can have multiple 'tags' - like Unix, Windows, x86, arm, etc - which can be used to build sets of configurations
- Each configuration defines CFLAGS, CXXFLAGS, and related variables used in a regular makefile. You define (on the command line) variables (like 'assertions') which are used to generate a number of other variables which appear in the configuration files.
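Concretely, configurations are created by running the configure script from the top-level Stroika directory. A sketch (the configuration names and tags here are arbitrary examples; each flag is documented in the command-line reference below):

```sh
# Run from the top-level Stroika directory (requires a Stroika source tree).
./configure Debug --config-tag Unix --apply-default-debug-flags --trace2file enable
./configure Release --config-tag Unix --apply-default-release-flags
```

Each run produces a configuration named after its first argument; the tags let you later build related configurations together with `make TAGS=...`.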
### Sample default build rules
These rules just provide context for the defined configuration variables; see the sample Makefiles for how to construct your own makefiles.
```make
$(ObjDir)%.o : %.cpp
	@#n.b. no CPPFLAGS here - 'configure --append-CPPFLAGS' adds directly to CFLAGS and CXXFLAGS
	@$(CXX) $(CXXFLAGS) -c $< -o $@

$(TARGETEXE): $(Objs) $(StroikaLibs)
	@$(Linker) $(StroikaLinkerPrefixArgs) -o $(TARGETEXE) $(Objs) $(StroikaLinkerSuffixArgs)
```
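Putting those rules together, a minimal per-application Makefile might look roughly like the sketch below. This is a sketch under assumptions: the location of the generated `Configuration.mk` (mentioned in the `--make-define` documentation below) under `IntermediateFiles/<CONFIGURATION>/` is assumed, and the exact variable spellings should be checked against the sample Makefiles shipped with Stroika.

```make
# Sketch of a user Makefile consuming a Stroika configuration.
# ASSUMPTION: Configuration.mk lives under IntermediateFiles/<CONFIGURATION>/.
CONFIGURATION ?= Debug
StroikaRoot   ?= ../Stroika/
include $(StroikaRoot)IntermediateFiles/$(CONFIGURATION)/Configuration.mk

ObjDir    := Objs/$(CONFIGURATION)/
Objs      := $(ObjDir)Main.o
TARGETEXE := $(ObjDir)MyApp

$(ObjDir)%.o : %.cpp
	@mkdir -p $(ObjDir)
	@$(CXX) $(CXXFLAGS) -c $< -o $@

$(TARGETEXE): $(Objs) $(StroikaLibs)
	@$(Linker) $(StroikaLinkerPrefixArgs) -o $(TARGETEXE) $(Objs) $(StroikaLinkerSuffixArgs)
```

The point of the pattern is that all compiler/linker specifics come from the included configuration file, so the same Makefile works across configurations by changing only `CONFIGURATION=`.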
### Configure Command-line reference
```text
Usage:
configure CONFIGURATION-NAME [OPTIONS]* where options can be:
--arch {ARCH} /* for high level architecture - first section of gcc -machine - e.g. x86, x86_64, arm - usually auto-detected */
[--config-tag {TAG-NAME}]* /* Add TAG-NAME to the list of tags associated with this configuration (for now limit one). Maybe repeated */
--build-by-default never|always|{uname} /* Mark configuration to be built by default always (DEFAULT), never, or only if host uname/uname -o equals ARGUMENT */
--platform {PLATFORM} /* Specifies the ProjectPlatformSubdir (Unix, VisualStudio.Net-2017, VisualStudio.Net-2019) - usually auto-detected */
--target-platforms {TARGET_PLATFORMS} /* Specifies the target-platforms- system compiling Stroika for - (set of ENUM where ENUM=(Windows, POSIX, Linux, MacOS)) - usually auto-detected */
--build-tools-root {BUILD_TOOLS_ROOT} /* Specifies the BUILD_TOOLS_ROOT - initially just for visual studio - usually auto-detected */
--assertions { enable|disable|default } /* Enables/disable assertion feature (setting qDebug) */
--block-allocation { enable|disable|default } /* Enables/disable block-allocation (a feature that improves performance, but messes up valgrind) */
--valgrind { enable|disable|default } /* Enables/disable valgrind-specific runtime code (so far only needed for clean helgrind use) */
--GLIBCXX_DEBUG { enable|disable|default } /* Enables/Disables GLIBCXX_DEBUG (G++-specific) */
--MACOSX_DEPLOYMENT_TARGET { #|default|none } /* set MACOSX_DEPLOYMENT_TARGET (mac os min version) (macosx target specific) */
--cppstd-version {FLAG} /* Sets the C++ language standard; can be c++17 or c++2a */
--stdlib {LIB} /* libc++ (clang lib), libstdc++ (gcc and often clang) */
--ActivePerl {use|no} /* Enables/disables use of ActivePerl (Windows Only) - JUST USED TO BUILD OPENSSL for Windows*/
--LibCurl {build-only|use|use-system|no} /* Enables/disables use of LibCurl for this configuration [default TBD]*/
--boost {build-only|use|use-system|no} /* Enables/disables use of boost for this configuration [default use] */
--OpenSSL {build-only|use|use-system|no} /* Enables/disables use of OpenSSL for this configuration [default use] */
--OpenSSL-ExtraArgs { purify? } /* Optionally configure extra OpenSSL features (see Stroika/OpenSSL makefile) */
--WinHTTP {use-system|no} /* Enables/disables use of WinHTTP for this configuration [default use-system on windows, and no otherwise] */
--ATLMFC {use-system|no} /* Enables/disables use of ATLMFC for this configuration [default use-system on windows, and no otherwise] */
--Xerces {build-only|use|use-system|no} /* Enables/disables use of Xerces for this configuration [default use] */
--sqlite {build-only|use|use-system|no} /* Enables/disables use of sqlite for this configuration [default use] */
--ZLib {build-only|use|use-system|no} /* Enables/disables use of ZLib for this configuration [default use] */
--WIX {use|use-system|no} /* Enables/disables use of WIX (Windows Only) - to build windows installers*/
--lzma {build-only|use|use-system|no} /* Enables/disables use of LZMA SDK for this configuration [default use] */
--no-third-party-components /* equivalent to --ActivePerl no --LibCurl no --boost no --OpenSSL no --WinHTTP no --ATLMFC no --Xerces no --sqlite no --ZLib no --WIX no --lzma no */
--trace2file { enable|disable|default } /* Enables/disable trace2file feature */
--static-link-gccruntime { enable|disable } /* Enables/disable gcc runtime static link (only applies if gcc family compiler) */
--make-define {ARG} /* Define makefile define for the given configuration: text of arg appears as line in Configuration.mk */
--compiler-driver {ARG} /* default is gcc */
--ar {ARG} /* default is undefined, but if compiler-driver is gcc or g++, this is gcc-ar */
--as {ARG} /* default is 'as' on unix, and retrieved from visual studio on visual studio */
--ranlib {ARG} /* default is undefined, but if compiler-driver is gcc or g++, this is gcc-ranlib */
--strip {ARG} /* sets program to do stripping; default is undefined, but for POSIX, defaults to strip */
--append-CFLAGS {ARG} /* Appends ARG to CFLAGS */
--remove-CFLAGS {ARG} /* Remove ARG from CFLAGS (including default added args; processed after all adds applied) */
--replace-all-CFLAGS {ARG} /* OVERRIDES DEFAULTS- and sets CFLAGS to just these values */
--append-CPPFLAGS {ARG} /* Appends ARG to CPPFLAGS; alias for append-CFLAGS AND append-CXXFLAGS, e.g. --append-CPPFLAGS -DA=B */
--remove-CPPFLAGS {ARG} /* Remove ARG from CPPFLAGS (including default added args; processed after all adds applied) */
--replace-all-CPPFLAGS {ARG} /* OVERRIDES DEFAULTS- and sets CPPFLAGS to just these values */
--append-CXXFLAGS {ARG} /* Appends ARG to CXXFLAGS */
--remove-CXXFLAGS {ARG} /* Remove ARG from CXXFLAGS (including default added args; processed after all adds applied) */
--replace-all-CXXFLAGS {ARG} /* OVERRIDES DEFAULTS- and sets CXXFLAGS to just these values */
--SharedSymbolVisibility {ARG} /* alias for append-CFLAGS AND append-CXXFLAGS with -fvisibility=XXX (defaults to hidden on gcc/clang/unix compilers) */
--append-extra-prefix-linker-args {ARG} /* Appends ARG to 'extra prefix linker args */
--append-extra-suffix-linker-args {ARG} /* Appends ARG to 'extra suffix linker args */
--append-extra-compiler-and-linker-args {ARG} /* Appends ARG to 'extra compiler' and 'extra linker' args */
--includes-path {ARG} /* Sets INCLUDES_PATH variable (: separated, since unix standard and allows spaces) */
--append-includes-path {ARG} /* Appends ARG to 'INCLUDES_PATH */
--libs-path {ARG} /* Sets LIBS_PATH variable (':' separated, since unix standard and allows spaces) */
--append-libs-path {ARG} /* Appends ARG to 'LIBS_PATH */
--lib-dependencies {ARG} /* Sets LIB_DEPENDENCIES variable (space separated) */
--append-lib-dependencies {ARG} /* Appends ARG to 'LIB_DEPENDENCIES */
--run-prefix {ARG} /* Sets variable RUN_PREFIX with stuff injected before run for built executables,
such as LD_PRELOAD=/usr/lib/arm-linux-gnueabihf/libasan.so.3 */
--append-run-prefix {ARG} /* Appends ARG to 'extra linker */
--pg {ARG} /* Turn on -pg option (profile for UNIX/gcc platform) on linker/compiler */
--lto { enable|disable } /* Turn on link time code gen on linker/compiler (for now only gcc/unix stack) */
--cross-compiling {true|false} /* Defaults generally to false, but set explicitly to control if certain tests will be run */
--apply-default-debug-flags /* */
--apply-default-release-flags /* */
--only-if-has-compiler /* Only generate this configuration if the compiler appears to exist (test run)*/
--debug-symbols {true|false} /* --debug-symbols-lib AND --debug-symbols-exe at the same time */
--debug-symbols-lib {true|false} /* defaults to true, but can be disabled if makes compile/link/etc too big/slow */
--debug-symbols-exe {true|false} /* defaults to true, but can be disabled if makes compile/link/etc too big/slow */
--malloc-guard {true|false} /* defaults to false (for now experimental and only works with GCC) */
--runtime-stack-check {true|false} /* gcc -fstack-protector-all */
--sanitize {none|thread|address|undefined|leak} /* if arg none, reset to none, else adds arg to sanitized feature (gcc/clang only) -
any arg you can pass to -fsanitize=XXXX */
/* see https://gcc.gnu.org/onlinedocs/gcc-6.1.0/gcc.pdf (search -fsanitize=; eg. --sanitize address,undefined */
--no-sanitize {thread|vptr|etc...} /* any from --sanitize or all */
Configure's behavior is also influenced by the following environment variables:
CC, CXX, PLATFORM, TARGET_PLATFORMS, ARCH, AS, AR, RANLIB, STRIP; these just simulate adding the obvious associated argument to configure
EXTRA_CONFIGURE_ARGS= a space separated list of arguments added to the beginning of the configure command
```
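So, for example, the following invocations should behave roughly the same. This is a sketch: it requires a Stroika source tree, and the exact mapping of CXX onto `--compiler-driver` is an assumption based on the "obvious associated argument" rule above.

```sh
# Environment-variable form vs. explicit-argument form (roughly equivalent):
CXX=clang++ ./configure Debug-clang --apply-default-debug-flags
./configure Debug-clang --compiler-driver clang++ --apply-default-debug-flags

# EXTRA_CONFIGURE_ARGS injects arguments at the start of every configure command:
EXTRA_CONFIGURE_ARGS='--trace2file enable' ./configure Debug-clang --apply-default-debug-flags
```

This is what makes it easy for outer build systems (bitbake, node-gyp, etc.) to steer configure without editing any scripts.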
### Amending a configuration
Configuration files should **not** be edited by hand. Instead, the current command line is stored as the first element of the configuration file: take that as a starting point, amend it as needed, and re-run configure.
### Environment variables that affect generation of configuration
- CC, CXX, PLATFORM, ARCH, AS, AR, RANLIB, STRIP
The reason this is so important is that it allows an external build system (like bitbake, node-gyp, etc.) to define parameters for a build and easily generate appropriate configurations.
### Printing configure variables from a script
#### Examples
- `./ScriptsLib/GetConfigurationParameter Debug-U-32 CFLAGS`

      /Od /I"C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.16.27023\ATLMFC\include" /I"C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.16.27023\include" /I"C:\Program Files (x86)\Windows Kits\NETFXSDK\4.6.1\include\um" /I"C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt" /I"C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\shared" /I"C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um" /I"C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\winrt" /I"C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\cppwinrt" /I"C:\Sandbox\Stroika\DevRoot\Builds\Debug-U-32\ThirdPartyComponents\include\\" /I"C:\Sandbox\Stroika\DevRoot\Library\Sources\\" /I"C:\Sandbox\Stroika\DevRoot\IntermediateFiles\Debug-U-32\\"

- `./ScriptsLib/GetConfigurationParameter Debug Linker`

      g++

- `./ScriptsLib/GetConfigurationParameter Release CXXFLAGS`

      -flto --std=c++17 -O3 -I/mnt/c/Sandbox/Stroika/DevRoot/Builds/Release/ThirdPartyComponents/include/ -I/mnt/c/Sandbox/Stroika/DevRoot/Library/Sources/ -I/mnt/c/Sandbox/Stroika/DevRoot/IntermediateFiles/Release/ -Wall -Wno-switch -Wno-sign-compare -Wno-unused-variable -Wno-unused-value -Wno-strict-aliasing -Wno-comment -Wno-unused-function -Wno-unused-but-set-variable -Wno-unused-local-typedefs -g
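`GetConfigurationParameter` is the supported way to read these values from a script, but since each configuration is stored as plain XML you can also pull a field out by hand in a pinch. The snippet below is self-contained: it fabricates a tiny configuration file whose layout is an assumption modeled on the sample configuration shown earlier, and is only an illustration of the idea.

```shell
# Extract one element's text from a (fabricated) configuration file.
cat > /tmp/DemoConfiguration.xml <<'EOF'
<Configuration>
  <Linker>/usr/bin/g++</Linker>
  <LIB_DEPENDENCIES>zlib.lib sqlite.lib</LIB_DEPENDENCIES>
</Configuration>
EOF

linker=$(sed -n 's|.*<Linker>\(.*\)</Linker>.*|\1|p' /tmp/DemoConfiguration.xml)
echo "$linker"    # prints: /usr/bin/g++
```

For real configurations, prefer the script: it insulates you from any future change in the file layout.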
---
## Static Linking vs. Dynamic Libraries
Stroika itself is only provided as a static library. This is because static libraries are much simpler, better for optimizing, and more flexible about configuration of the particular build options desired.
I cannot rule out ever providing a dynamic link option/feature, but I see exceedingly little point to it.
Of course, you can still use dynamic (shared library) linking however you wish for any components you build (you can build shared libraries with Stroika), and obviously for most operating systems, the libraries your apps must link to are shared libraries.
But this bias towards static linking is reflected in all the defaults and samples provided with Stroika.
---
## Third Party Components
Stroika builds on, and neatly integrates, functionality from several third-party components. These components are automatically downloaded, built, and integrated into Stroika (depending on configuration), or Stroika can be configured to use the system-installed version of these components.
- ActivePerl (special case, just a hack to be able to build openssl)
- boost
- curl
- openssl
- lzma SDK
- sqlite
- WIX
- xerces
- zlib
---
## Build Results
Intermediate files (objs etc) go into
- **IntermediateFiles/{CONFIGURATION-NAME}**.
Final build products (libraries and executables) go into
- **Builds/{CONFIGURATION-NAME}**.
---
## Build Process
On any platform, building Stroika, all its demo applications, and its regression tests is as simple as cd'ing to the top-level directory and typing `make`.
### Special Targets
- `make`
Make with no arguments runs 'make help'
- `make help`
Prints the names and details of the special targets
- `make all`
Builds the stroika library, tests, demos, etc.
- `make libraries`
Builds just the Stroika libraries
- `make samples`
Builds the Stroika sample applications
- `make run-tests`
- `make CONFIGURATION=zyx run-tests REMOTE='lewis@raspberrypi'`
- `make CONFIGURATION=abc run-tests VALGRIND=memcheck`
Builds Stroika and all the regression tests, and runs the regression tests. If REMOTE= is specified, the code is copied to the target machine with ssh, and the tests are run there (helpful when cross-compiling). VALGRIND= is used to run memcheck, leakcheck, or helgrind on the given configuration.
- `make project-files`
Builds project files which can be used for things like Visual Studio (not needed)
- `make check-prerequisite-tools`
Checks if the tools needed to build Stroika are installed and in your path. This is done automatically, and generally not needed explicitly.
- `make apply-configurations`
To generate all the directories and files dependent on the defined configurations. Note – this is generally not necessary, and called automatically.
### CONFIGURATION arguments to make
All the make targets (e.g. all, libraries etc) take an OPTIONAL parameter CONFIGURATION. If specified, only that configuration is built. If omitted (or empty) – ALL configurations are built.
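For example (a sketch; `Debug` is an arbitrary configuration name, and these commands assume a configured Stroika tree):

```sh
make libraries CONFIGURATION=Debug   # build only the 'Debug' configuration's libraries
make all                             # no CONFIGURATION given: build ALL configurations
```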
### TAGS arguments to make
This allows for building (or clobbering or whatever) a related family of configurations. For example
- `make TAGS=Windows`
runs all the Windows builds.
- `make TAGS=Unix all`
runs all the UNIX compiles
- `make TAGS="Windows x86" all`
This builds all the configurations with BOTH the Windows and x86 tag (so all the 32-bit x86 builds)
- `make TAGS="Unix arm" list-configurations`
This doesn't build anything, but just lists all the matching configurations.
- `make list-configurations`
- `make list-configuration-tags`
- `make list-configurations TAGS="Windows"`
- `make list-configurations TAGS="Windows x86_64"`
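Conceptually, a configuration matches only when it carries every requested tag, and plain `x86` does not match `x86_64` (which is why `TAGS="Windows x86"` selects only the 32-bit builds). That intersection rule can be illustrated with a tiny self-contained sketch; this shell function is an illustration of the semantics, not code from Stroika:

```shell
# Does a configuration (first arg: its tag list) carry ALL requested tags (second arg)?
matches_all_tags() {
  cfg_tags=$1
  for t in $2; do
    case " $cfg_tags " in
      *" $t "*) ;;            # this requested tag is present; keep checking
      *) return 1 ;;          # missing a requested tag: no match
    esac
  done
  return 0
}

matches_all_tags "Windows x86"    "Windows x86" && echo match      # prints: match
matches_all_tags "Windows x86_64" "Windows x86" || echo no-match   # prints: no-match
```

The space-padded `case` patterns are what keep `x86` from accidentally matching inside `x86_64`.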
### Other Make variables that can be helpful
- ECHO_BUILD_LINES=1 makes the output of make noisier, so you can run particular builds by hand, or debug the make process.
- MAKE_INDENT_LEVEL=N is useful when calling the Stroika make process as part of another larger make process, to get its output indented.
### Cross-Compiling
#### Building for Raspberry Pi
To cross-compile for Raspberry pi,
- install some suitable cross compiler (in this example arm-linux-gnueabihf-g++-9)
On Ubuntu: `sudo apt-get install g++-9-arm-linux-gnueabihf`
- configure raspberrypi-gcc-9 --apply-default-debug-flags --trace2file enable --compiler-driver arm-linux-gnueabihf-g++-9 --cross-compiling true
Set --cross-compiling true so that internal tests aren't run using the ARM-built executables.
Use --apply-default-release-flags instead of the debug variant for a smaller, faster executable.
Use --trace2file disable to disable the tracefile utility; when enabled, it writes a debug log to /tmp.
- make CONFIGURATION=raspberrypi-gcc-9 all
This builds the samples, libraries etc to Builds/raspberrypi-gcc-9)
- make run-tests REMOTE=pi@myRasberryPi
This uses ssh to run the tests remotely on the argument machine (I set up a hostname in /etc/hosts for the mapping).
When using SSH, it's also helpful to set up ssh keys to avoid re-entering passwords:
[http://sshkeychain.sourceforge.net/mirrors/SSH-with-Keys-HOWTO/SSH-with-Keys-HOWTO-4.html](http://sshkeychain.sourceforge.net/mirrors/SSH-with-Keys-HOWTO/SSH-with-Keys-HOWTO-4.html)
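Putting the whole flow together (a sketch; the package, compiler, configuration, and host names are the examples used above, and the commands assume a Stroika source tree):

```sh
# 1. install the cross compiler
sudo apt-get install g++-9-arm-linux-gnueabihf

# 2. create a cross-compiling configuration
./configure raspberrypi-gcc-9 --apply-default-debug-flags --trace2file enable \
    --compiler-driver arm-linux-gnueabihf-g++-9 --cross-compiling true

# 3. build, then run the tests on the target over ssh
make CONFIGURATION=raspberrypi-gcc-9 all
make CONFIGURATION=raspberrypi-gcc-9 run-tests REMOTE=pi@myRasberryPi
```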
---
## Integration with IDEs
### Using Visual Studio.net
Visual Studio.net project and solution files are available for the Stroika demos, top-level project files, and regression tests. Once you have built your configuration files (see above), you can use the project files to build, test, extend and develop Stroika.
### Using Visual Studio Code
Visual Studio Code works well with Stroika. Just open the workspace file Workspaces/VSCode/Stroika.code-workspace.
The workspace contains pre-built 'tasks' to build Stroika (run makefiles).
### Using QtCreator (on unix)
Run Library/Projects/QtCreator/CreateQtCreatorSymbolicLinks.sh to create project files at the top level of your Stroika directory. Then you can open that .creator file in QtCreator, and build and debug Stroika-based applications.
---
## But Why? Build / Configuration Design
### Alternatives
We seriously considered a number of build systems, including cmake, ant, perl scripts, qmake, etc. They all had substantial weaknesses, with ant possibly being the best alternative, except that it's heavily Java-oriented. Maybe for C++, cmake would have been the best?
But plain GNU make appears to be a nearly universally available alternative, very standard and simple, so that's what we're using.
But even with plain make, you need some sort of configure script to establish what compiler options will be defined. Again, there are lots of different alternatives here, but in the end I decided to build custom scripts which generate a very simple XML configuration declaration, and which drive the make process via an included 'config' makefile.
---
## Common Errors
- Tar failure
Errors about invalid parameters and/or bad blocks can usually be fixed by installing a copy of GNU tar. We've tested 1.27.
- cp: illegal option –
Install a copy of GNU cp
- Cannot find 'blah' in Cygwin
If you are trying to install required components in Cygwin, and cannot find them in the Cygwin setup GUI, try:
`cygcheck -p dos2unix`
- On raspberry pi
> /tmp/Test43: /lib/arm-linux-gnueabihf/libc.so.6: version `GLIBC_2.28' not found (required by /tmp/Test43)
- fix by
- `sudo vi /etc/apt/sources.list`
- temporarily add
```bash
#tmphack to load GLIBC_2 2.28
deb http://raspbian.raspberrypi.org/raspbian/ buster main contrib non-free rpi
```
- `sudo apt-get update`
- `apt-cache policy libc6`
- `sudo apt-get install libc6=2.28-5+rpi1`
- undo edit to /etc/apt/sources.list (or comment out addition)
- `sudo apt-get update`
NOTE - if you are dealing with later versions, some variation on this will likely work. Select the next Debian release past your current one for the /etc/apt/sources.list addition, and use apt-cache policy to find an available version.
- On MacOS
- Brew issues
```sh
ld: library not found for -lidn2
```
- `brew install libidn2`
- and maybe also add to .zshenv
```sh
export CPATH=/opt/homebrew/include
export LIBRARY_PATH=/opt/homebrew/lib
```
- Under Docker
```text
==7192==LeakSanitizer has encountered a fatal error.
==7192==HINT: For debugging, try setting environment variable LSAN_OPTIONS=verbosity=1:log_threads=1
==7192==HINT: LeakSanitizer does not work under ptrace (strace, gdb, etc)
```
OR
```text
warning: Error disabling address space randomization: Operation not permitted
warning: Could not trace the inferior process.
warning: ptrace: Operation not permitted
```
Run the docker container (the `docker run` or `docker exec` line) with:
`--security-opt seccomp=unconfined`
- See also
[../ScriptsLib/RunInDockerEnvironment](../ScriptsLib/RunInDockerEnvironment)
for more hints on developing flags with docker containers.
| 62.989779 | 2,305 | 0.711535 | eng_Latn | 0.652344 |
3e84fcfc042a93aef59b0131d7c0395da0b525e5 | 548 | md | Markdown | .github/ISSUE_TEMPLATE/feature_request.md | LaurenceGA/go-crev | 03d6e4a7dd7d0d9e9c8d89ba5bfbc3900683512c | [
"MIT"
] | 4 | 2020-03-12T10:47:57.000Z | 2021-07-31T05:40:25.000Z | .github/ISSUE_TEMPLATE/feature_request.md | LaurenceGA/go-crev | 03d6e4a7dd7d0d9e9c8d89ba5bfbc3900683512c | [
"MIT"
] | 36 | 2020-03-12T10:50:24.000Z | 2022-03-18T09:01:02.000Z | .github/ISSUE_TEMPLATE/feature_request.md | LaurenceGA/go-crev | 03d6e4a7dd7d0d9e9c8d89ba5bfbc3900683512c | [
"MIT"
] | null | null | null | ---
name: Feature request
about: Suggest a new feature this project should have
title: ''
labels: feature
assignees: ''
---
#### Problem to solve
_A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]_
#### Solution
_A clear and concise description of what you want to happen._
#### Alternative solutions considered
_A clear and concise description of any alternative solutions or features you've considered._
#### Additional context
_Add any other context or screenshots about the feature request here._ | 27.4 | 94 | 0.760949 | eng_Latn | 0.998775 |
3e863033c190d126012cfb1134bfbb5a7ed2f99f | 5,400 | md | Markdown | articles/defender-for-iot/how-to-view-information-per-zone.md | flexray/azure-docs.pl-pl | bfb8e5d5776d43b4623ce1c01dc44c8efc769c78 | [
"CC-BY-4.0",
"MIT"
] | 12 | 2017-08-28T07:45:55.000Z | 2022-03-07T21:35:48.000Z | articles/defender-for-iot/how-to-view-information-per-zone.md | flexray/azure-docs.pl-pl | bfb8e5d5776d43b4623ce1c01dc44c8efc769c78 | [
"CC-BY-4.0",
"MIT"
] | 441 | 2017-11-08T13:15:56.000Z | 2021-06-02T10:39:53.000Z | articles/defender-for-iot/how-to-view-information-per-zone.md | flexray/azure-docs.pl-pl | bfb8e5d5776d43b4623ce1c01dc44c8efc769c78 | [
"CC-BY-4.0",
"MIT"
] | 27 | 2017-11-13T13:38:31.000Z | 2022-02-17T11:57:33.000Z | ---
title: Informacje o urządzeniach w określonych strefach
description: Korzystanie z lokalnej konsoli zarządzania w celu uzyskania kompleksowych informacji o widoku dla określonej strefy
author: shhazam-ms
manager: rkarlin
ms.author: shhazam
ms.date: 03/21/2021
ms.topic: how-to
ms.service: azure
ms.openlocfilehash: 685d7b1df4389c356ee7b531d179919025d298d0
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 03/30/2021
ms.locfileid: "104776338"
---
# <a name="view-information-per-zone"></a>Wyświetl informacje na strefę
## <a name="view-a-device-map-for-a-zone"></a>Wyświetlanie mapy urządzeń dla strefy
Wyświetl mapę urządzenia dla wybranej strefy na czujniku. Ten widok przedstawia wszystkie elementy sieci powiązane z wybraną strefą, w tym czujniki, podłączone do nich urządzenia i inne informacje.
:::image type="content" source="media/how-to-work-with-asset-inventory-information/zone-map-screenshot.png" alt-text="Zrzut ekranu przedstawiający mapę strefy.":::
- W oknie **Zarządzanie lokacją** wybierz pozycję **Wyświetl mapę strefy** na pasku zawierającym nazwę strefy.
:::image type="content" source="media/how-to-work-with-asset-inventory-information/default-region-to-default-business-unit-v2.png" alt-text="Domyślny region do domyślnej jednostki biznesowej.":::
Zostanie wyświetlone okno **Mapowanie urządzeń** . Następujące narzędzia są dostępne do wyświetlania informacji dotyczących urządzeń i urządzeń z mapy. Aby uzyskać szczegółowe informacje o każdej z tych funkcji, zobacz *Podręcznik użytkownika platformy Defender for IoT*.
- **Widoki powiększenia mapy**: widok uproszczony, widok połączeń i widok szczegółowy. Widok wyświetlonej mapy różni się w zależności od poziomu powiększenia mapy. Można przełączać się między widokami mapy przez dostosowanie poziomów powiększenia.
:::image type="icon" source="media/how-to-work-with-asset-inventory-information/zoom-icon.png" border="false":::
- **Narzędzia do wyszukiwania map i układu**: narzędzia używane do wyświetlania różnych segmentów sieci, urządzeń, grup urządzeń lub warstw.
:::image type="content" source="media/how-to-work-with-asset-inventory-information/search-and-layout-tools.png" alt-text="Zrzut ekranu przedstawiający widok narzędzia do wyszukiwania i układu.":::
- **Etykiety i wskaźniki na urządzeniach:** Na przykład liczba urządzeń zgrupowanych w podsieci w sieci IT. W tym przykładzie jest to 8.
:::image type="content" source="media/how-to-work-with-asset-inventory-information/labels-and-indicators.png" alt-text="Zrzut ekranu etykiet i wskaźników.":::
- **Wyświetl właściwości urządzenia**: na przykład czujnik monitorujący urządzenie i podstawowe właściwości urządzeń. Kliknij prawym przyciskiem myszy urządzenie, aby wyświetlić jego właściwości.
:::image type="content" source="media/how-to-work-with-asset-inventory-information/asset-properties-v2.png" alt-text="Zrzut ekranu przedstawiający widok właściwości urządzenia.":::
- **Alert skojarzony z urządzeniem:** Kliknij prawym przyciskiem myszy urządzenie, aby wyświetlić powiązane alerty.
:::image type="content" source="media/how-to-work-with-asset-inventory-information/show-alerts.png" alt-text="Zrzut ekranu przedstawiający Widok wyświetlanie alertów.":::
## <a name="view-alerts-associated-with-a-zone"></a>Wyświetlanie alertów skojarzonych ze strefą
Aby wyświetlić alerty skojarzone z określoną strefą:
- Wybierz ikonę alertu w oknie **strefy** .
:::image type="content" source="media/how-to-work-with-asset-inventory-information/business-unit-view-v2.png" alt-text="Widok domyślnej jednostki biznesowej z przykładami.":::
Aby uzyskać więcej informacji, zobacz temat [Omówienie: Praca z alertami](how-to-work-with-alerts-on-premises-management-console.md).
### <a name="view-the-device-inventory-of-a-zone"></a>Wyświetlanie spisu urządzeń dla strefy
Aby wyświetlić spis urządzeń skojarzony z określoną strefą:
- W oknie **strefa** wybierz pozycję **Wyświetl spis urządzeń** .
:::image type="content" source="media/how-to-work-with-asset-inventory-information/default-business-unit.png" alt-text="Zostanie wyświetlony ekran spisu urządzeń.":::
Aby uzyskać więcej informacji, zobacz temat [badanie wszystkich wykrywania czujników przedsiębiorstwa w spisie urządzeń](how-to-investigate-all-enterprise-sensor-detections-in-a-device-inventory.md).
## <a name="view-additional-zone-information"></a>Wyświetl dodatkowe informacje o strefie
Dostępne są następujące dodatkowe informacje o strefie:
- **Szczegóły strefy**: Wyświetl liczbę urządzeń, alertów i czujników skojarzonych ze strefą.
- **Szczegóły czujnika**: Wyświetl nazwę, adres IP i wersję każdego czujnika przypisanego do strefy.
- **Stan łączności**: jeśli czujnik jest odłączony, Połącz się z czujnika. Zobacz [czujniki połączeń z lokalną konsolą zarządzania](how-to-activate-and-set-up-your-on-premises-management-console.md#connect-sensors-to-the-on-premises-management-console).
- **Postęp aktualizacji**: jeśli czujnik podłączony jest uaktualniany, zostaną wyświetlone Stany uaktualnienia. Podczas uaktualniania lokalna Konsola zarządzania nie odbiera informacji o urządzeniu z czujnika.
## <a name="next-steps"></a>Następne kroki
[Uzyskiwanie wglądu w globalne, regionalne i lokalne zagrożenia](how-to-gain-insight-into-global-regional-and-local-threats.md)
| 61.363636 | 271 | 0.793889 | pol_Latn | 0.999742 |
3e86a8160223ecab7481d23dbc22ebd93ee55c7c | 967 | md | Markdown | README.md | rupesh1310/ML-kits | 657ef68553bf333213d4916683052dcb482cae40 | [
"MIT"
] | null | null | null | README.md | rupesh1310/ML-kits | 657ef68553bf333213d4916683052dcb482cae40 | [
"MIT"
] | 8 | 2020-04-23T02:37:23.000Z | 2022-02-12T05:58:20.000Z | README.md | rupesh1310/Machine_Learning_kits | 657ef68553bf333213d4916683052dcb482cae40 | [
"MIT"
] | 1 | 2020-04-07T06:46:44.000Z | 2020-04-07T06:46:44.000Z | <h1 align="center">
<br>
<img src="https://library.kissclipart.com/20180925/yoe/kissclipart-javascript-clipart-javascript-library-jquery-0fb817622aa27de4.png" alt="Machine Learning With JavaScript" width="200"></a>
<br>
Machine Learning With JavaScript
<br>
</h1>
<h4 align="center">A repository to keep track of all my progess <a href="#" target="_blank">Machine Learning With JavaScript</a>.</h4>
<p align="center">
<a href="#how-to-use">How To Use</a> •
<a href="#build-with">Build With</a> •
<a href="#installation">Installation</a> •
<a href="#future-updates">Future Updates</a> •
</p>
<h1 align="center"> ️💚️ Contributors 💚 </h1>
<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore -->
[<img src="https://avatars1.githubusercontent.com/u/30566706?s=460&u=fa66403c14af5eafd23a330aee2b3864ed35c9c9&v=4" width="100px;"/><br /><sub><b>A.RUPESH</b></sub>](https://github.com/rupesh1310)<br />
| 35.814815 | 201 | 0.688728 | yue_Hant | 0.181074 |
3e86e40e2f7c776ac6e7e2973ad377815eea9ece | 618 | md | Markdown | domain/README.md | motown-io/motown | 783ccda7c28b273a529ddd47defe8673b1ea365b | [
"Apache-2.0"
] | 31 | 2015-01-09T14:25:16.000Z | 2020-08-19T03:21:10.000Z | domain/README.md | quann169/MotownBlueCurrent | d33e1c9bb6a5e5e0ad7711a6f47075477893aa6a | [
"Apache-2.0"
] | 32 | 2015-02-05T13:01:43.000Z | 2020-05-17T15:32:18.000Z | domain/README.md | quann169/MotownBlueCurrent | d33e1c9bb6a5e5e0ad7711a6f47075477893aa6a | [
"Apache-2.0"
] | 16 | 2015-01-06T14:23:41.000Z | 2020-09-11T18:33:12.000Z | # Wiki
Detailed information about the add-on can be found in the [wiki](https://github.com/motown-io/motown/wiki/Motown-Core).
The information below is just to provide a quick overview of the modules.
# Command authorization
Authorization on command class level. Can check if a user identity has access to a command class for a certain charging station.
# Command handling
The domain logic (command-handling) around the defined aggregates.
# Core API
Contains the basic blocks that are to be used in other modules.
# Utils
Classes that add functionality to Axon, but are not part of the Axon framework (yet). | 29.428571 | 128 | 0.779935 | eng_Latn | 0.997027 |
3e88b0211d5034885ddbc6cbb617c9360b7a038c | 5,216 | md | Markdown | content/10.optimization.md | slochower/nonequilibrium-barrier | d905240bcf8ffbda6967ea555b64f9d4ba595e63 | [
"CC-BY-4.0",
"CC0-1.0"
] | null | null | null | content/10.optimization.md | slochower/nonequilibrium-barrier | d905240bcf8ffbda6967ea555b64f9d4ba595e63 | [
"CC-BY-4.0",
"CC0-1.0"
] | null | null | null | content/10.optimization.md | slochower/nonequilibrium-barrier | d905240bcf8ffbda6967ea555b64f9d4ba595e63 | [
"CC-BY-4.0",
"CC0-1.0"
] | null | null | null | ## Optimization of the potential energy surfaces
It would be nice to be able to design -- or suggest how to design -- a molecular motor for specific properties: speed, force, torque, gearing, ability to work against a load, resistance to being forced backwards, or something else.
To that end, we set out to explore the relationship between the shape of the potential energy surfaces and these properties.
### Optimization of a single surface for maximal flux
{#fig:bound-presmooth width=10cm}
{#fig:bound width=10cm}
To start, let's begin with a fixed bound energy surface created by smoothing a sawtooth with seven spline points (Figure @fig:bound).
I couldn't find a way to spline across the periodic boundary, so the curve looks a little wonkier than expected.
My first attempt was to use a downhill simplex method ([Nelder-Mead](https://en.wikipedia.org/wiki/Nelder%E2%80%93Mead_method)) to optimize the apo surface for maximal flux.
The algorithm begins with an initial guess of a flat apo surface.
The results are not completely deterministic, even with with setting `np.random.seed(42)`, and I don't understand that.
After 100 loops of the same optimization, using the same initial conditions and random seed, the number of function evaluations in each optimization bounces between around 1400 and around 500!
The non-repeatability of the optimization is repeatable itself, however!
Upon running a further 100 loops on a different day, I observed the same bouncing between 1400 and 500 iterations.
I could look into this further, but I haven't.
It may have to do with the interpolation.
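For orientation, the downhill-simplex move set (reflect, expand, contract, shrink) can be sketched in a few dozen lines. This is a toy, self-contained Python version; the quadratic bowl is only a hypothetical stand-in for the negated flux objective, which in the real problem is computed from the energy surfaces (and optimized via `scipy.optimize.minimize` in practice):

```python
def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Toy downhill-simplex (Nelder-Mead) minimizer for small problems."""
    n = len(x0)
    # Initial simplex: x0 plus one vertex perturbed along each coordinate.
    simplex = [list(x0)] + [
        [x0[j] + (step if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # Centroid of every vertex except the worst.
        centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        reflected = [2.0 * centroid[i] - worst[i] for i in range(n)]
        if f(reflected) < f(best):
            # Reflection helped a lot: try going further (expansion).
            expanded = [3.0 * centroid[i] - 2.0 * worst[i] for i in range(n)]
            simplex[-1] = expanded if f(expanded) < f(reflected) else reflected
        elif f(reflected) < f(simplex[-2]):
            simplex[-1] = reflected
        else:
            # Reflection failed: contract toward the centroid, else shrink.
            contracted = [0.5 * (centroid[i] + worst[i]) for i in range(n)]
            if f(contracted) < f(worst):
                simplex[-1] = contracted
            else:
                simplex = [best] + [
                    [0.5 * (best[i] + v[i]) for i in range(n)] for v in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]

# Hypothetical stand-in for the negated-flux objective: a smooth 2-D bowl.
objective = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
xmin = nelder_mead(objective, [0.0, 0.0])
```

In the real problem the vector being optimized is the set of spline-point heights for the apo surface, and `f` would run the flux calculation.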
{#fig:nelder-mead width=10cm}
{#fig:nelder-mead-2 width=10cm}
{#fig:optimized width=10cm}
After 1403 iterations, Nelder-Mead optimization results in the surface shown in Figure @fig:optimized.
Each blue line is an iteration of the optimization.
Lighter colors correspond to earlier iterations.
The final surface is darker because many lines are overlaid.
I have not implemented bounds on the optimization because Nelder-Mead does not allow bounds, as far as I know.
I'll return to the idea of using bounds a little later.
The result of this optimization is that the flux starts near $0$ (although not exactly at zero, curiously) and drops to $-0.050 \,\text{cycle s}^{-1}$ quickly and stays there.
{#fig:optimized-flux width=10cm}
COBYLA and Powell's method result in better optimization than the downhill simplex method.
After 269 iterations, COBYLA approaches flux of more than $-250 \,\text{cycle s}^{-1}$, with the flux beginning to decrease after just a few iterations.
The iterations of COBYLA look like they are refining a single landscape, instead of jumping around.
Powell's method does well, too, although it occasionally jumps back to near zero flux after finding a high value.
Taken together, the fixed bound surface and the COBYLA-optimized apo surface are shown below.
{#fig:cobyla-optimized-surfaces width=10cm}
{#fig:cobyla-optimized-flux width=10cm}
Curiously, the flux is mostly zero across the
### Optimization of both surfaces for maximal flux
COBYLA in particular handles the bounds and produces highly optimized surfaces after just a few iterations.
## Optimization of a surface for maximum force
| 93.142857 | 607 | 0.80694 | eng_Latn | 0.987472 |
3e897ec941c5e7b3faf3d378dbfde518ac8a2644 | 442 | md | Markdown | README.md | restnfeel/restnfeel-ui | bba088d52ba0e892dd2d0f49b0329d2c59a71ca7 | [
"MIT"
] | null | null | null | README.md | restnfeel/restnfeel-ui | bba088d52ba0e892dd2d0f49b0329d2c59a71ca7 | [
"MIT"
] | null | null | null | README.md | restnfeel/restnfeel-ui | bba088d52ba0e892dd2d0f49b0329d2c59a71ca7 | [
"MIT"
] | null | null | null | ## restnfeel-ui
A UI built on top of Google Material UI.
## Feature
```
Only MIT-licensed open-source projects are used, so the UI can be built in-house.
The work targets Material-ui / react-multi-select / Airbnb react-dates.
This project was started so that self-built components can be reused for my personally run homepage and for ongoing UI work in the future.
https://material-ui.com/
https://github.com/Khan/react-multi-select
```
## Contributor
```
Seo Jeong Hwan <[email protected]>
```
## License
```
MIT
```
| 14.258065 | 69 | 0.665158 | kor_Hang | 1.000004 |
3e89fd45edbea416df87f152eeb6a040a563a183 | 1,371 | md | Markdown | README.md | MarcinKonowalczyk/ceas-processing | 00ae897c1ab39876cedbb88deb114450e9e53729 | [
"MIT"
] | null | null | null | README.md | MarcinKonowalczyk/ceas-processing | 00ae897c1ab39876cedbb88deb114450e9e53729 | [
"MIT"
] | null | null | null | README.md | MarcinKonowalczyk/ceas-processing | 00ae897c1ab39876cedbb88deb114450e9e53729 | [
"MIT"
] | null | null | null | # ceas-processing
Simple demonstration of concepts of Cavity Enhanced Spectroscopy. Written in Processing 3.
## Installation
- Download [Processing](https://processing.org)
- Run `.ceas/ceas.pde`
You can also compile the whole sketch into an executable with Processing: `File -> Export Application...`
## Controls
- __[escape]__ - quit
- __[backspace]__ - reset (stop all recording and despawn everything)
- __[spacebar]__ - spawn single burst of photons
- __[up/down]__ - increase/decrease spawn rate
- __[left/right]__ - increase/decrease speed of light
- __[T]__ - toggle continuous spawn
- __[S]__ - spawn a source of photons
- __[L]__ - show item labels
- __[R]__ - toggle recording from the detectors
## Analysis
The analysis scripts are written in Matlab, using data recorded from the detectors in the demonstration.
CRDS measures the lifetime of light inside of the cavity:
<center><img src="./analysis/figures/crds.png" width="50%"></center>
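The fit behind the CRDS plot is a single-exponential decay, I(t) = I0 * exp(-t/tau): the ring-down time tau is the negative reciprocal of the least-squares slope of ln(I) versus t. The repo's analysis scripts are Matlab; below is a hypothetical, stdlib-only Python sketch using noise-free synthetic data in place of a recorded detector trace:

```python
import math

def ring_down_lifetime(times, intensities):
    """Fit ln(I) = ln(I0) - t/tau by least squares; return tau = -1/slope."""
    logs = [math.log(i) for i in intensities]
    n = len(times)
    mean_t = sum(times) / n
    mean_l = sum(logs) / n
    slope = (sum((t - mean_t) * (l - mean_l) for t, l in zip(times, logs))
             / sum((t - mean_t) ** 2 for t in times))
    return -1.0 / slope

# Synthetic ring-down trace with a known lifetime (hypothetical numbers).
tau_true = 2.5e-6                       # 2.5 microseconds
times = [k * 1.0e-7 for k in range(1, 60)]
trace = [math.exp(-t / tau_true) for t in times]
tau_fit = ring_down_lifetime(times, trace)
```

With real detector data you would first clip the trace to the decaying region above the noise floor before taking logs.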
CEAS measures the changes in the steady-state of light in the cavity:
<center><img src="./analysis/figures/ceas.png" width="50%"></center>
## ToDo's
- More general collision detection?
- Curved mirrors
- Show the requirement for the cavity stability criterion
- Evanescent wave cavities (This would require coding in a separate behavior for the photons when they are in the evanescent wave, and will probably be non-trivial)
| 33.439024 | 164 | 0.754923 | eng_Latn | 0.954595 |
3e8be4a336f73826124756c6d8ef83eb6706fbde | 850 | md | Markdown | content/dogs/_index.md | agladstein/starter-academic | 97a363ee0dd377c8265c50b89e1ec743332846cd | [
"MIT"
] | null | null | null | content/dogs/_index.md | agladstein/starter-academic | 97a363ee0dd377c8265c50b89e1ec743332846cd | [
"MIT"
] | null | null | null | content/dogs/_index.md | agladstein/starter-academic | 97a363ee0dd377c8265c50b89e1ec743332846cd | [
"MIT"
] | 2 | 2020-09-15T17:12:50.000Z | 2020-09-17T15:50:25.000Z | ---
title: Meet Naomi and Teo
# View.
# 1 = List
# 2 = Compact
# 3 = Card
view: 2
# Optional header image (relative to `static/media/` folder).
header:
caption: ""
image: ""
---
I adopted Naomi as a 5-month-old puppy in 2010 from the Washington Animal Rescue League, in Washington DC.
Naomi is a perfect Rottweiler mix, who enjoys lying in the sun and pleasing her humans.
I adopted Teo as a 3-month-old puppy in 2014 from the Pima County Animal Shelter, in Tucson, AZ.
Teo is an unknown mix (Rhodesian Ridgeback?), who enjoys barking at squirrels and counter-surfing.
<table><tr>
<td> <img src="pics/naomi1.jpg" alt="Drawing" style="width: 530px;"/> </td>
<td> <img src="pics/teo4.jpg" alt="Drawing" style="width: 400px;"/> </td>
</tr></table>
See their [Instagram](https://www.instagram.com/teomi2020/) account for more pictures!
| 30.357143 | 105 | 0.7 | eng_Latn | 0.917864 |
3e8db17516ead831ed02579086df0b7966be05b4 | 802 | md | Markdown | content/post/2019-11-08-160206-74896502.md | tcgriffith/owaraisite | 31254bf7f3fbeece1bd68698f9d020069cc74fb1 | [
"MIT"
] | 5 | 2019-02-28T04:52:09.000Z | 2020-05-25T11:34:13.000Z | content/post/2019-11-08-160206-74896502.md | tcgriffith/owaraisite | 31254bf7f3fbeece1bd68698f9d020069cc74fb1 | [
"MIT"
] | 26 | 2017-10-24T16:10:44.000Z | 2019-10-14T04:30:35.000Z | content/post/2019-11-08-160206-74896502.md | tcgriffith/owaraisite | 31254bf7f3fbeece1bd68698f9d020069cc74fb1 | [
"MIT"
] | 8 | 2017-10-02T06:51:19.000Z | 2022-03-07T21:12:38.000Z | ---
title: 20160206 我们kamaitachi ★和牛
author:
- 鎌鼬字幕
- Notttti
zmz: 鎌鼬字幕
publishdate: '2016-02-06'
bangumi: 其他
date: '2019-11-08'
slug: 2019-11-08-160206-74896502
description: 其他•160206
weight: 8892.0
bangumis:
- 其他
tags:
- 综艺
- 和牛
- かまいたち
- 镰鼬
categories:
- 鎌鼬字幕
- Notttti
brief: "「俺達かまいたち」20160206 ★ゲスト: 和牛 ・天下第一讨论会「和恋人去高级餐厅约会,结账时却发现没带钱」 cut"
---

# Overview
「俺達かまいたち」2016-02-06 ★ Guest: 和牛 (Wagyū)
・"World's Best Discussion" segment: "You go to a fancy restaurant on a date with your partner, but when the bill comes you realize you didn't bring any money" (cut)
[Watch on Bilibili](https://www.bilibili.com/video/av74896502/)
<div class="resp-container"><iframe class="testiframe" src="//player.bilibili.com/player.html?aid=74896502" scrolling="no" allowfullscreen="true"> </iframe></div>
| 24.30303 | 168 | 0.720698 | yue_Hant | 0.222208 |
3e8e0115971dcec930abc3686d402da5ff014701 | 6,490 | md | Markdown | README.md | mykebrowne/data-512-a3 | bf5588df52af0f51c9ca9a330d60cef3b35ec3ac | [
"MIT"
] | null | null | null | README.md | mykebrowne/data-512-a3 | bf5588df52af0f51c9ca9a330d60cef3b35ec3ac | [
"MIT"
] | null | null | null | README.md | mykebrowne/data-512-a3 | bf5588df52af0f51c9ca9a330d60cef3b35ec3ac | [
"MIT"
] | null | null | null | # data-512-a3 <br>
# Final project plan <br>
## To what extent is the opioid crisis driven by over-prescription? <br>
### Background <br>
If you Google ‘opioid crisis’, you are likely to find many articles that highlight the increase in deaths associated with opioid usage in the US over the past two decades. According to the __[CDC](https://doi.org/10.15585/mmwr.mm655051e1)__, drug overdose deaths nearly tripled during the period 1999 to 2014 which some __[commentators](https://doi.org/10.15585/mmwr.mm6626a4)__ have ascribed to the concurrent increase in opioid prescription rate, which quadrupled from 1999 to 2010. __[One explanation](https://doi.org/10.1002/pds.1452)__ provided is that overprescription of opioid pain relievers is a determinant of fatal overdose and abuse. In particular, individuals who are prescribed opioids for periods of over 90 days are unlikely to discontinue using and may switch to obtaining them illicitly when they can no longer obtain them through __[legitimate means](https://doi.org/10.15585/mmwr.mm6626a4)__. These illicit sources are more likely to include heroin and fentanyl which have been implicated in the significant __[increase in opioid death rates](https://doi.org/10.15585/mmwr.mm655051e1)__ during 2014-2015. <br>
The response to this crisis has varied. Some states have enacted legislation designed to reduce opioid prescribing. For example, Florida prohibited dispensing by prescribers in 2011 and subsequently experienced a __[52% decline](https://www.ncbi.nlm.nih.gov/pubmed/24990489)__ in oxycodone overdose death rate. The __[CDC](https://doi.org/10.15585/mmwr.mm6626a4)__ advises local jurisdictions to target “high-prescribing areas for interventions such as...virtual physical therapy sessions with pain coping skills training...for chronic pain”. These approaches have, however, been __[criticized](https://www.theguardian.com/commentisfree/2017/nov/07/truth-us-opioid-crisis-too-easy-blame-doctors-not-prescriptions)__ on the grounds that they penalize doctors for prescribing opioids to people who need them and for ignoring the wider socio-economic factors that contribute to drug addiction. <br>
The aim of this research is to further this debate by revisiting data on opioid prescribing and opioid overdose deaths and overlaying economic data to explore whether the relationship between opioid prescribing and opioid overdose is as straightforward as some of the Google headlines would suggest. <br>
### Data sources <br>
| Concept | Operational definition | Source |
| --- | ---| --- |
| Opioid prescription rate (at a year and state level) during 2006 to 2016 | Total number of retail opioid prescriptions dispensed in a given year and state per resident population | https://www.cdc.gov/drugoverdose/maps/rxstate2006.html <br><br> https://www.cdc.gov/drugoverdose/maps/rxstate2007.html |
| Opioid overdose death rate (at a year and state level) | Total number of deaths due to:<br> ICD-10 underlying cause-of-death codes: X40-X44 (unintentional), X60-X64 (suicide), X85 (homicide), Y10-Y14 (undetermined intent). <br> <br> And <br><br> ICD-10 multiple-cause-of-death codes: T40.0 (Opium), T40.1 (Heroin), T40.2 (Other opioids), T40.3 (Methadone), T40.4 (Other synthetic narcotics), T40.6 (Other and unspecified narcotics).<br><br> In a given year and state per resident population. | Available at: http://wonder.cdc.gov. |
| Poverty rate (at a year and state level) during 2006 to 2016 | Population below poverty level in a given year and state per resident population | American Community Survey. Poverty Status In The Past 12 Months. 2005 to 2016. Available at: <br> <br> https://factfinder.census.gov/faces/nav/jsf/pages/searchresults.xhtml?refresh=t
### Questions to be investigated
#### 1. What is the relationship between opioid prescription rate and opioid overdose rate? <br>
If overprescription of opioids is a determinant of fatal overdose (as has been suggested by existing research), one would expect to see this reflected in the strength of the relationship between the opioid prescription rate and opioid overdose rate, both cross-sectionally (in other words, at a single point in time across geographies) and longitudinally (over time across a single geography).
#### 2. How does the relationship between opioid prescription rate and opioid overdose rate change given knowledge of the poverty rate?
If poverty somehow affects the relationship between opioid prescription and overdose rate, this would be manifested in a number of ways. First, the partial correlation between prescription and overdose rate given the poverty rate would be different from the overall correlation between the prescription and overdose rate. Equivalently, when overdose rate is regressed against prescription rate and poverty rate together, one would expect to see either a main effect of poverty rate or an interaction between prescription rate and poverty rate.
### Analytical approach
#### 1. Create flat file unique at year, state level <br>
| Column name | Type | Range |
| --- | --- | --- |
| Year | Categorical | {2006...2016} |
| State | Categorical | {AL...WY} |
| Prescription rate per population | Ratio | {0, 1} |
| Opioid overdose death rate per population | Ratio | {0, 1} |
| Poverty rate per population | Ratio | {0, 1} |
| Population | Integer | {0, ∞} |
#### 2. Visualize relationship between opioid prescription rates and overdose rates through scatterplots for: <br>
- All states, all years
- All states, split by year
- Each state - all years (for selected states)
#### 3. Calculate correlation coefficients <br>
Compute the correlation between prescription and overdose rate, and the partial correlation between prescription and overdose rate controlling for poverty rate. If the correlation between prescription and overdose rate is less strong when controlling for poverty rate, this would indicate that poverty has a role to play in influencing overdose rate.
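The first-order partial correlation in this step can be computed directly from the three pairwise Pearson correlations: r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz^2) * (1 - r_yz^2)). A small, self-contained sketch (toy numbers, not the CDC/ACS data described above):

```python
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def partial_corr(xs, ys, zs):
    """Correlation of x and y after controlling for z (first-order partial)."""
    rxy, rxz, ryz = pearson(xs, ys), pearson(xs, zs), pearson(ys, zs)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Toy illustration: prescription rate (x) and overdose rate (y) move together,
# while the control variable (z) is constructed to be uncorrelated with both.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 6.0, 8.0, 10.0]   # y = 2x, so r_xy = 1
z = [1.0, 0.0, 1.0, 0.0, 1.0]    # zero covariance with x and y
```

Here z is uncorrelated with both series, so controlling for it leaves the correlation unchanged; with the real poverty data, the partial correlation dropping below the raw correlation would be the signal of interest.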
#### 4. Perform Poisson regression with offset <br>
Model the relationship between overdose rate, prescription rate and poverty rate. More formally: <br> <br>
log(E[# overdoses]) = log(population) + β0 + β1·(prescription rate) + β2·(poverty rate) + β3·(prescription rate × poverty rate) <br> <br>
If either the main effect of poverty, β2, or the interaction effect, β3, is significant, this would also suggest that poverty influences overdose rate independently of prescription rate. <br>
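Step 4 can be made concrete with a tiny Fisher-scoring (IRLS) fit. The numbers below are toy values, not the real CDC/ACS data; in practice this would be a single call in R (`glm(y ~ x + offset(log(population)), family = poisson)`) or statsmodels. The sketch fits the simpler one-predictor model log E[y] = log(population) + β0 + β1·x, to show where the offset enters:

```python
import math

def poisson_offset_fit(x, y, pop, iters=50):
    """Fit log E[y_i] = log(pop_i) + b0 + b1*x_i by Fisher scoring (IRLS)."""
    n = len(x)
    # Initialize from a log-linear OLS fit of log(y/pop) on x (assumes y > 0).
    logs = [math.log(yi / pi) for yi, pi in zip(y, pop)]
    mx, ml = sum(x) / n, sum(logs) / n
    b1 = (sum((xi - mx) * (li - ml) for xi, li in zip(x, logs))
          / sum((xi - mx) ** 2 for xi in x))
    b0 = ml - b1 * mx
    for _ in range(iters):
        # The offset enters through mu_i = pop_i * exp(b0 + b1*x_i).
        mu = [pi * math.exp(b0 + b1 * xi) for pi, xi in zip(pop, x)]
        g0 = sum(yi - mi for yi, mi in zip(y, mu))                  # score for b0
        g1 = sum(xi * (yi - mi) for xi, yi, mi in zip(x, y, mu))    # score for b1
        h00 = sum(mu)
        h01 = sum(xi * mi for xi, mi in zip(x, mu))
        h11 = sum(xi * xi * mi for xi, mi in zip(x, mu))
        det = h00 * h11 - h01 * h01
        d0 = (g0 * h11 - g1 * h01) / det    # 2x2 solve by Cramer's rule
        d1 = (g1 * h00 - g0 * h01) / det
        b0, b1 = b0 + d0, b1 + d1
        if abs(d0) + abs(d1) < 1e-12:
            break
    return b0, b1

# Toy data generated from known coefficients (b0 = -6, b1 = 2).
xs = [i / 9.0 for i in range(10)]
pops = [10000.0] * 10
ys = [p * math.exp(-6.0 + 2.0 * xi) for p, xi in zip(pops, xs)]
b0_hat, b1_hat = poisson_offset_fit(xs, ys, pops)
```

The full model in the plan adds the poverty main effect and the interaction term as extra columns of the same score/information system.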
| 120.185185 | 1,137 | 0.77057 | eng_Latn | 0.993772 |
3e8e2cf0c5d2c784652fa2b76ca270709a229261 | 1,303 | md | Markdown | README.md | monohonplaya/S-FORCE | da5aae9d4e000d78afff02db407b320f58b1baa3 | [
"MIT"
] | null | null | null | README.md | monohonplaya/S-FORCE | da5aae9d4e000d78afff02db407b320f58b1baa3 | [
"MIT"
] | null | null | null | README.md | monohonplaya/S-FORCE | da5aae9d4e000d78afff02db407b320f58b1baa3 | [
"MIT"
] | null | null | null | # S-FORCE GAME
I will probably never finish this so I'll leave this here with some notes.
- To build, download Godot Mono 3.2.3 and import the .csproj file or scan the directory, and run. It may break with 3.2.4 and almost definitely will with 4.0 when those are out. It'd be nice to fix for 4.0 since it should improve graphics a lot.
- The bearies all have pick up and throw animations (you can make a drop anim by reversing the pick up animation) and I was originally planning to have stuff you could pick up and place/throw for puzzles/combat and that should be relatively easy to implement. The sitwiggle animation also ended up unused in gameplay.
- If you try to view Spikedog's animation player or make any changes there, the mesh will get messed up, but just go back to the mesh (spikedog > Armature004 > Skeleton > spikedog) and reset the rotation and scale to fix it.
- The 3D models are available in the glb directory as glb files. I could only get them to work by importing from within blender, not by just opening the file with it.
- The really bad save file format is because I worked on the project in 3.2.2 and I couldn't get Newtonsoft.Json to work in it, but it is supposedly fixed in 3.2.3.
- The code is generally messy idk glhf
# Game Download
https://monohonplaya.itch.io/s-force
| 93.071429 | 317 | 0.77053 | eng_Latn | 0.999731 |
3e8f6c89060bb9a9f9d249349a518334c0591525 | 508 | md | Markdown | sdk/python/docs/HelmTemplateResponse.md | alexeykaplin/karbon-platform-services | cf51423347f68b0821e3bf00bafac184d9e0fc8c | [
"MIT"
] | 11 | 2020-09-15T07:52:57.000Z | 2021-06-29T15:27:22.000Z | sdk/python/docs/HelmTemplateResponse.md | alexeykaplin/karbon-platform-services | cf51423347f68b0821e3bf00bafac184d9e0fc8c | [
"MIT"
] | 7 | 2020-11-13T19:04:26.000Z | 2022-03-10T11:16:24.000Z | sdk/python/docs/HelmTemplateResponse.md | alexeykaplin/karbon-platform-services | cf51423347f68b0821e3bf00bafac184d9e0fc8c | [
"MIT"
] | 23 | 2018-11-25T14:44:46.000Z | 2020-08-17T11:41:29.000Z | # HelmTemplateResponse
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**app_manifest** | **str** | |
**crds** | **str** | contains helm Custom Resource Definitions string | [optional]
**metadata** | **str** | |
**values** | **str** | contains values.yaml string | [optional]
[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
| 39.076923 | 161 | 0.576772 | yue_Hant | 0.241341 |
3e8ff0c717045a17792f8977a1058cff9fc17a5c | 894 | md | Markdown | vendor/github.com/m3db/stackadler32/Readme.md | xroa/nightingale | 66c93f472af82f1eab6a175327ecb1ab2989d499 | [
"Apache-2.0"
] | 44 | 2021-06-06T08:42:51.000Z | 2022-01-05T06:10:02.000Z | vendor/github.com/m3db/stackadler32/Readme.md | xroa/nightingale | 66c93f472af82f1eab6a175327ecb1ab2989d499 | [
"Apache-2.0"
] | 1 | 2022-03-23T11:08:03.000Z | 2022-03-23T11:08:03.000Z | vendor/github.com/m3db/stackadler32/Readme.md | xroa/nightingale | 66c93f472af82f1eab6a175327ecb1ab2989d499 | [
"Apache-2.0"
] | 1 | 2020-09-01T10:12:34.000Z | 2020-09-01T10:12:34.000Z | # stackadler32
## Note: This is a fork of [github.com/sent-hil/adler32](http://github.com/sent-hil/adler32) that provides digests that are allocated on the stack and can be incrementally written to. This is useful for places where you perform concurrent checksumming and there's no good place to cache a digest without needing to acquire it expensively (under lock, etc).
A port to Go of the adler32 checksum function described in https://www.ietf.org/rfc/rfc1950.txt.
## Example:
```go
adler32.Checksum([]byte("Hello World"))
```
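For reference, the checksum itself is tiny: two running sums modulo 65521 (the largest prime below 2^16), packed into one 32-bit word. A quick Python sketch of the RFC 1950 arithmetic, cross-checked against Python's stdlib `zlib.adler32`:

```python
import zlib

MOD_ADLER = 65521  # largest prime below 2**16, per RFC 1950

def adler32(data: bytes) -> int:
    a, b = 1, 0                     # s1 starts at 1, s2 at 0
    for byte in data:
        a = (a + byte) % MOD_ADLER  # s1: running sum of the bytes (+1)
        b = (b + a) % MOD_ADLER     # s2: running sum of the s1 values
    return (b << 16) | a

checksum = adler32(b"Hello World")
```

This is the same arithmetic any RFC 1950 implementation performs, whatever the language.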
## Tests
```bash
$ go test
PASS
ok github.com/sent-hil/adler32 2.429s
$ go test -bench=.
# This library is slightly faster than the one in standard library.
$ go test -bench=.
BenchmarkThis-4 10000 230169 ns/op
BenchmarkStdLib-4 10000 190834 ns/op
PASS
ok github.com/sent-hil/adler32 6.554s
```
| 31.928571 | 356 | 0.701342 | eng_Latn | 0.970653 |
3e9103237f2ae802b8eadca736929d04c6e16198 | 1,616 | md | Markdown | oneday_oneleetcode/leetcode/202010/20201018.md | wenjuGao/oneday_oneleetcode | 79c60a1cbc630f1a179d4f120ed42e0be3cd32f0 | [
"Unlicense"
] | 1 | 2020-07-06T13:56:30.000Z | 2020-07-06T13:56:30.000Z | oneday_oneleetcode/leetcode/202010/20201018.md | wenjuGao/oneday_oneleetcode | 79c60a1cbc630f1a179d4f120ed42e0be3cd32f0 | [
"Unlicense"
] | 2 | 2020-11-12T10:03:33.000Z | 2020-11-12T10:03:34.000Z | oneday_oneleetcode/leetcode/202010/20201018.md | wenjuGao/oneday_oneleetcode | 79c60a1cbc630f1a179d4f120ed42e0be3cd32f0 | [
"Unlicense"
] | null | null | null | ---
title: 删除链表的倒数第N个节点
tags:
- 删除链表的倒数第N个节点
sidebar: auto
---
### Remove the Nth Node From the End of a List
::: tip Difficulty
Medium
:::

## [Problem:](https://leetcode-cn.com/problems/remove-nth-node-from-end-of-list/)
Given a linked list, remove the nth node from the end of the list and return its head.
**Example:**
```
Given linked list: 1->2->3->4->5, and n = 2.
After removing the second node from the end, the linked list becomes 1->2->3->5.
```
**Note:**
The given n is guaranteed to be valid.
## First approach
**Approach:**
Compute the total length of the list; the Nth node from the end is the (len-n)th node.
Add a dummy node at the head of the list to make it easier to handle the case of deleting node 1.
```
0->1->2->3->4->5
```

```javascript
/**
* Definition for singly-linked list.
* function ListNode(val, next) {
* this.val = (val===undefined ? 0 : val)
* this.next = (next===undefined ? null : next)
* }
*/
/**
* @param {ListNode} head
* @param {number} n
* @return {ListNode}
*/
var removeNthFromEnd = function(head, n) {
let _result = new ListNode(0, head),
node = _result,
len = 0,
index = 0
while (head) {
len++
head = head.next
}
  // advance node to position len-n
for (let i = 1; i < len - n + 1; i++) {
node = node.next
}
  // skip the node at position len-n by linking to the one after it, completing the deletion
node.next = node.next ? node.next.next : null
return _result.next
}
```
### One-pass solution
Keep two pointers n steps apart; when the leading pointer reaches the end of the list, the trailing pointer is exactly at the nth node from the end.
```javascript
var removeNthFromEnd = function(head, n) {
let _result = new ListNode(0, head),
start = _result,
end = head,
index = 0
while (index < n) {
end = end.next
index++
}
while (end !== null) {
end = end.next
start = start.next
}
start.next = start.next.next
return _result.next
}
```
| 15.843137 | 76 | 0.602723 | yue_Hant | 0.366439 |
3e9180f279d6a005d103858434c19dd65c322db4 | 1,905 | md | Markdown | _posts/2021-04-26-17825.md | Seongil-Shin/Seongil-Shin.github.io | 505ad61cefa9767d10c92862c8cf5cd91f6f2912 | [
"MIT"
] | null | null | null | _posts/2021-04-26-17825.md | Seongil-Shin/Seongil-Shin.github.io | 505ad61cefa9767d10c92862c8cf5cd91f6f2912 | [
"MIT"
] | null | null | null | _posts/2021-04-26-17825.md | Seongil-Shin/Seongil-Shin.github.io | 505ad61cefa9767d10c92862c8cf5cd91f6f2912 | [
"MIT"
] | null | null | null | ---
title: 17825 Samsung sw test
author: 신성일
date: 2021-04-26 18:38:44 +0900
categories: [알고리즘, beackjoon]
tags: [알고리즘, algorithm]
---
# 17825 Samsung sw test
## Code
```cpp
#include <iostream>
#include <vector>
using namespace std;
typedef struct Map {
int score;
int next;
int blueNext;
} Map;
int dice[10];
Map map[33] = { {0, 1, -1},{2, 2, -1}, {4, 3, -1}, {6,4 , -1}, {8, 5, -1},
{10, 6, 21}, {12, 7, -1}, {14, 8, -1}, {16, 9, -1}, {18, 10, -1},
{20, 11, 24 }, { 22, 12, -1 }, { 24, 13, -1 }, { 26, 14, -1 }, { 28, 15, -1 },
{30, 16, 26}, {32, 17, -1}, {34, 18, -1}, {36, 19, -1}, {38, 20, -1},
{40, 32, -1}, {13, 22, -1}, {16, 23, -1}, {19, 29, -1}, {22, 25, -1},
{24, 29, -1}, {28, 27, -1}, {27, 28, -1}, {26, 29, -1}, {25, 30, -1},
{30, 31, -1}, {35, 20, -1}, {0, -1, -1} };
int getScore(int piece[4], int score, int deep);
int main() {
for (int i = 0; i < 10; i++)
cin >> dice[i];
int temp[4] = { 0,0,0,0 };
cout << getScore(temp, 0, 0);
}
int getScore(int piece[4], int score, int deep) {
if (deep == 10)
return score;
int maxScore = 0;
for (int i = 0; i < 4; i++) {
int current = piece[i];
if (current >= 32)
continue;
int next = map[current].blueNext != -1 ? map[current].blueNext : map[current].next;
for (int j = 0; j < dice[deep]; j++) {
current = next;
if (current == 32)
break;
next = map[current].next;
}
int temp[4];
for (int j = 0; j < 4; j++) {
if (i != j)
temp[j] = piece[j];
else
temp[j] = current;
}
if (current == 32) {
int a = getScore(temp, score, deep + 1);
maxScore = a > maxScore ? a : maxScore;
continue;
}
bool check = false;
for (int j = 0; j < 4; j++) {
if (piece[j] == current)
check = true;
}
if (!check) {
int a = getScore(temp, score + map[current].score, deep + 1);
maxScore = a > maxScore ? a : maxScore;
}
}
return maxScore;
}
```
| 21.647727 | 85 | 0.503412 | krc_Cyrl | 0.235842 |
3e91be7d16136c53f5f018974e63e43679f77970 | 4,230 | md | Markdown | doc/pluggable_types.md | kianmeng/epgsql | 154c3dcb05ff96da80bf5301aa3aa7b5ae41d1f6 | [
"BSD-3-Clause"
] | 336 | 2015-01-01T17:07:24.000Z | 2022-02-25T21:28:20.000Z | doc/pluggable_types.md | kianmeng/epgsql | 154c3dcb05ff96da80bf5301aa3aa7b5ae41d1f6 | [
"BSD-3-Clause"
] | 215 | 2015-01-21T13:42:42.000Z | 2022-03-30T10:22:28.000Z | doc/pluggable_types.md | kianmeng/epgsql | 154c3dcb05ff96da80bf5301aa3aa7b5ae41d1f6 | [
"BSD-3-Clause"
] | 156 | 2015-01-21T22:47:34.000Z | 2022-01-13T11:35:09.000Z | # Pluggable types
It's possible to make a custom datatype encoder/decoder as well as to change the encoding/decoding
of an existing supported datatype.
You can't have specific decoding rules for a specific column or for a specific query. A codec update
affects any occurrence of this datatype for this connection.
## Possible usecases
* Decode JSON inside epgsql
* Change datetime representation
* Add support for a standard datatype that isn't supported by epgsql yet
* Add support for contrib datatypes
* Add codecs for your own custom datatypes (e.g.
[implemented on C level](https://www.postgresql.org/docs/current/static/xtypes.html) or
created by [CREATE TYPE](https://www.postgresql.org/docs/current/static/sql-createtype.html))
## This can be done by the following steps
### Implement epgsql_codec behaviour callback module
See [epgsql_codec](src/epgsql_codec.erl)
This module should have the following functions exported:
```erlang
init(any(), epgsql_sock:pg_sock()) -> codec_state().
```
Will be called only once on connection setup or when `update_type_cache/2` is called.
Should initialize the codec's internal state (if needed). This state will be passed as the 1st
argument to other callbacks later.
```erlang
names() -> [epgsql:type_name()].
```
Will be called immediately after init. It should return the list of PostgreSQL type names
this codec is able to handle. Names should be the same as in the `typname` column of the `pg_type`
table.
```erlang
encode(Data :: any(), epgsql:type_name(), codec_state()) -> iodata().
```
Will be called when a parameter of a matching type is passed to `equery`, `bind`, etc.
The 2nd argument is the name of the matching type (useful when `names/0` returns more than one name).
It should convert the data to an iolist / binary in the PostgreSQL binary format representation.
The PostgreSQL binary format is usually not documented, so you will most likely end up checking the PostgreSQL
[source code](https://github.com/postgres/postgres/tree/master/src/backend/utils/adt).
*TIP*: it's usually more efficient to encode data as an iolist, because in that case it will be
written directly to the socket without any extra copying. So, you don't need to call
`iolist_to_binary/1` on your data before returning it from this function.
```erlang
decode(Data :: binary(), epgsql:type_name(), codec_state()) -> any()
```
If `equery` or `execute` returns a dataset that has columns of a matching type, this function
will be called for each "cell" of this type. It should parse the PostgreSQL binary format and
return the appropriate Erlang representation.
```erlang
decode_text(Data :: binary(), epgsql:type_name(), codec_state()) -> any().
```
Optional callback. Will be called (if defined) in the same situation as `decode/3`, but for
`squery` command results, where the data will be in PostgreSQL text, not binary, representation.
By default epgsql will just return it as is.
It would be nice to also define and export `in_data()`, `out_data()` and `data()` typespecs.
Example: if your codec's `names/0` returns `[my_type, my_other_type]` and following command was
executed:
```erlang
epgsql:equery(C, "SELECT $1::my_type, $1::my_type", [my_value])
```
Then `encode(my_value, my_type, codec_state())` will be called (only once). And, since we are selecting
2 values of type `my_type`, the callback `decode(binary(), my_type, codec_state())` will be
called 2 times.
### Load this codec into epgsql
It can be done by calling `epgsql:update_type_cache(Connection, [{CallbackModuleName, InitOptions}])` or
by providing `{codecs, [{CallbackModuleName, InitOptions}]}` connect option.
You may define new datatypes as well as redefine already supported ones.
## Tips
* When you just want to slightly change default decoding/encoding, it may be easier to emulate
inheritance by calling default codec's functions and then modifying what they return
* Again, try to return iolists from `encode/3` whenever possible
* You may pass options to `init/2`. It's the 2nd element of the tuple `{ModName, Opts}`.
* You may use some context information from connection (it's internal record
passed as 2nd argument to `init/2`). See [epgsql_sock.erl](src/epgsql_sock.erl) for API functions.
* Note that any error in the callback functions will cause a crash of the epgsql connection process!
| 44.526316 | 106 | 0.763357 | eng_Latn | 0.989316 |
3e92325f07d6a09f3ba9c5e6d74ca5bf985bc7e0 | 1,764 | md | Markdown | _docs/backup.md | jackalyst/tipbot | 809bef0d0d2b785da9f21c7f8212f25a403b9d55 | [
"MIT"
] | 2 | 2021-05-17T21:56:46.000Z | 2021-11-17T11:22:52.000Z | _docs/backup.md | jackalyst/tipbot | 809bef0d0d2b785da9f21c7f8212f25a403b9d55 | [
"MIT"
] | 10 | 2021-02-28T06:38:24.000Z | 2021-12-23T03:15:41.000Z | _docs/backup.md | jackalyst/tipbot | 809bef0d0d2b785da9f21c7f8212f25a403b9d55 | [
"MIT"
] | 6 | 2021-02-24T05:33:25.000Z | 2021-12-22T21:31:14.000Z | # Backup and Recovery
> Back up all of the user information and wallet files needed for rebuilding the bot upon failure.
Script used to back up the bot data for later restoration.
## Backup System
From `crontab`, run the `/_scripts/backup/backup.sh` script to copy the files over and encrypt them using the password in the configuration.
The encrypted files are sent to multiple off-server locations for safekeeping. If a failure happens, one of these backups is restored on a newly built server to house the tipbot.
### Setup
Edit the scripts `backup.js` and `backup.sh` located in the `_scripts/backup/` directory. Both of these files have configuration settings that need to be modified, as well as the typical `_config/config.json` file under the "backup" settings.
Ensure the directory that you are backing up to exists and set crontab to execute the scripts at some time daily.
`0 01 * * * /home/ubuntu/qrl-tipbot/_scripts/backup/backup.sh`
Executing the script will create both an unencrypted backup tar and an encrypted one. Transfer the encrypted file (`TipBot_Backup.tar.gz.enc`) off-site.
At minimum, send daily, weekly, and monthly files off-site.
## Recovery Procedure
Decrypt the files using `openssl` and the password you provided in the config file back on the tipbot.
```bash
# decrypt with password in file
openssl enc -pbkdf2 -d -base64 -out hey_TipBot_Backup.tar.gz -in TipBot_Backup.tar.gz.enc -pass file:$HOME/qrl-tips/_scripts/backup/qrl-tipbotBackup/secret_pass.txt
# or with password passed through stdin
echo -n "password_here" | openssl enc -pbkdf2 -d -base64 -out hey1_TipBot_Backup.tar.gz -in TipBot_Backup.tar.gz.enc -pass stdin
```
This will decrypt the tar file, allowing you to un-tar and reinstate the tipbot.
| 42 | 242 | 0.774943 | eng_Latn | 0.992617 |
3e935e8c3e2849e909f95fb0fb604ddb3b13dd9a | 56 | md | Markdown | README.md | 6304673J8/2D_Pong | 6d9c0ccb6569d321cf99dd6ebb8e76d0adb50960 | [
"MIT"
] | null | null | null | README.md | 6304673J8/2D_Pong | 6d9c0ccb6569d321cf99dd6ebb8e76d0adb50960 | [
"MIT"
] | null | null | null | README.md | 6304673J8/2D_Pong | 6d9c0ccb6569d321cf99dd6ebb8e76d0adb50960 | [
"MIT"
] | null | null | null | # 2D_Pong
Learning Unity by Developing Custom Pong Game
| 18.666667 | 45 | 0.821429 | eng_Latn | 0.820423 |
3e94e3eafa2b2f3c1f8416a1498048a0a58a1de1 | 16 | md | Markdown | docs/src/pages/customizations/theming.md | froyog/react-rainbow | fb59f5f242bb665ee701e984721a340a813bc307 | [
"MIT"
] | 2 | 2019-03-12T06:54:43.000Z | 2019-04-15T14:03:39.000Z | docs/src/pages/customizations/theming.md | froyog/react-rainbow | fb59f5f242bb665ee701e984721a340a813bc307 | [
"MIT"
] | null | null | null | docs/src/pages/customizations/theming.md | froyog/react-rainbow | fb59f5f242bb665ee701e984721a340a813bc307 | [
"MIT"
] | null | null | null | # Theming
Documentation in progress. | 5.333333 | 9 | 0.75 | eng_Latn | 0.784436 |
3e95a4ae5a4c5efc99049857da8565480fa0197f | 1,281 | md | Markdown | docs/framework/wcf/diagnostics/wmi/serviceappdomain.md | douglasbreda/docs.pt-br | f92e63014d8313d5e283db2e213380375cea9a77 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/wcf/diagnostics/wmi/serviceappdomain.md | douglasbreda/docs.pt-br | f92e63014d8313d5e283db2e213380375cea9a77 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/wcf/diagnostics/wmi/serviceappdomain.md | douglasbreda/docs.pt-br | f92e63014d8313d5e283db2e213380375cea9a77 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: ServiceAppDomain
ms.date: 03/30/2017
ms.assetid: f28e5186-a66d-46c1-abe9-b50e07f8cb4f
ms.openlocfilehash: aef0a0da9d107b92d9d3017968d5554205a6fd71
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 05/04/2018
ms.locfileid: "33485662"
---
# <a name="serviceappdomain"></a>ServiceAppDomain
A service is mapped to an application domain.
## <a name="syntax"></a>Syntax
```
class ServiceAppDomain
{
  Service ref;
  AppDomainInfo ref;
};
```
## <a name="methods"></a>Methods
The ServiceAppDomain class does not define any methods.
## <a name="properties"></a>Properties
The ServiceAppDomain class has the following properties:
### <a name="ref"></a>ref
Data type: Service
Qualifiers: Key
Access type: Read-only
The service of this application domain.
### <a name="ref"></a>ref
Data type: AppDomainInfo
Qualifiers: Key
Access type: Read-only
Contains properties of the application domain.
## <a name="requirements"></a>Requirements
|MOF|Declared in Servicemodel.mof.|
|---------|-----------------------------------|
|Namespace|Defined in root\ServiceModel|
| 24.634615 | 61 | 0.682279 | por_Latn | 0.838717 |
3e95ba0c7b2849f6eab2de7a967ccd843f40aee6 | 27 | md | Markdown | src/Material-Design-Lite-Widgets.package/MDLDialogWidget.class/README.md | jecisc/MaterialDesignLite | 6a5f7f6aaaf5465b300de12e60f223372664a4b0 | [
"MIT"
] | null | null | null | src/Material-Design-Lite-Widgets.package/MDLDialogWidget.class/README.md | jecisc/MaterialDesignLite | 6a5f7f6aaaf5465b300de12e60f223372664a4b0 | [
"MIT"
] | 1 | 2018-06-05T18:09:06.000Z | 2018-06-05T18:21:29.000Z | src/Material-Design-Lite-Widgets.package/MDLDialogWidget.class/README.md | jecisc/MaterialDesignLite | 6a5f7f6aaaf5465b300de12e60f223372664a4b0 | [
"MIT"
] | null | null | null | I represent a dialog widget | 27 | 27 | 0.851852 | eng_Latn | 0.986751 |
3e95ee26b00e92142046d10b9400ecbd78dd6f46 | 527 | md | Markdown | src/components/DatePicker/demo/disabledrange.md | aliqin/atui | 8060e06ce53fb5807ae553aa5ad5ba34433a5902 | [
"MIT"
] | 744 | 2016-11-25T08:53:20.000Z | 2022-03-18T08:13:46.000Z | src/components/DatePicker/demo/disabledrange.md | aliqin/atui | 8060e06ce53fb5807ae553aa5ad5ba34433a5902 | [
"MIT"
] | 46 | 2016-12-09T10:27:34.000Z | 2021-03-25T02:45:34.000Z | src/components/DatePicker/demo/disabledrange.md | aliqin/atui | 8060e06ce53fb5807ae553aa5ad5ba34433a5902 | [
"MIT"
] | 89 | 2016-12-30T08:30:03.000Z | 2020-09-16T07:32:14.000Z | ---
order: 3
title:
zh-CN: 禁用部分日期的时间范围选择器
en-US: Type
---
## zh-CN
`RangePicker`可以通过设置`disabledStart`,`disabledEnd` 来禁用部分日期
## en-US
````jsx
<range-picker @change="change" format="yyyyMMdd" :disabled-end="(date) => { return date.getMonth() < 3 }" :disabled-start="(date) => { return date.getMonth() >= 2 }"></range-picker>
````
````vue-script
new Vue({
el: 'body',
components: {
rangePicker: atui.DatePicker.RangePicker
},
methods:{
change(start,end) {
console.log(start,end)
}
}
})
````
| 16.46875 | 181 | 0.607211 | yue_Hant | 0.49714 |
3e9743ec5daa29b81fe837affded3081a9d03f48 | 12,332 | md | Markdown | _posts/papers/2019-03-21-Fast_accurate_faced_detection.md | seongkyun/seongkyun_old1.github.io | bc6b08aa4dde119581d9509d1e6d9b1206a0d97e | [
"MIT"
] | 4 | 2019-12-20T08:18:07.000Z | 2020-01-22T10:36:55.000Z | _posts/papers/2019-03-21-Fast_accurate_faced_detection.md | seongkyun/seongkyun_old1.github.io | bc6b08aa4dde119581d9509d1e6d9b1206a0d97e | [
"MIT"
] | null | null | null | _posts/papers/2019-03-21-Fast_accurate_faced_detection.md | seongkyun/seongkyun_old1.github.io | bc6b08aa4dde119581d9509d1e6d9b1206a0d97e | [
"MIT"
] | 6 | 2019-10-07T11:54:06.000Z | 2020-09-04T01:10:40.000Z | ---
layout: post
title: A Fast and Accurate System for Face Detection, Identification, and Verification (face detection sections only)
category: papers
tags: [Deep learning, Face detection]
comments: true
---
# A Fast and Accurate System for Face Detection, Identification, and Verification (face detection sections only)
Original paper: https://arxiv.org/pdf/1809.07586.pdf
Authors: Rajeev Ranjan, Ankan Bansal, Jingxiao Zheng, Hongyu Xu, Joshua Gleason, Boyu Lu, Anirudh Nanduri, Jun-Cheng Chen, Carlos D. Castillo, Rama Chellappa
## Abstract
- Thanks to large datasets and the growth in computing power, the performance of CNN-based object detection and recognition benchmarks has improved, and with it the performance of deep-learning-based face detection. CNNs can be used to find faces, localize landmarks, and perform pose estimation and face recognition.
- This paper verifies the performance of SOTA face detection on several benchmark datasets and covers some details of face identification.
- The paper also proposes a new face detector, the Deep Pyramid Single Shot Face Detector (DPSSD), which is fast and handles large scale variations well in face detection. The paper describes the design details of the modules involved in automatic face recognition, including face detection, landmark localization and alignment, and face identification/verification.
- The paper provides evaluation results of the proposed face detector on various datasets, including experimental results on the IARPA Janus Benchmarks A, B, and C (IJB-A, B, C) and the Janus Challenge Set 5 (CS5).
## 1. Introduction
- A great deal of research has been conducted on facial analytics, and it is being applied to law enforcement, active authentication on devices, face biometrics for payment, autonomous vehicles, and more. In addition, the emergence of various datasets has increased the applicability and performance of DCNNs.
- This paper proposes a new face detector that is much faster and produces good detection results for faces at various scales. It also applies an existing DCNN-based automatic face detection pipeline and shows good results against SOTA techniques.
## 2. A brief survey of existing literature
- This chapter gives a brief overview of existing face identification/verification pipeline modules that use different methods. It first covers recent face detection methods, then the second module, facial keypoint detection. Finally, it discusses recent work on feature learning and summarizes SOTA studies on face verification and identification.
### 2.1 Face Detection
- Face detection is the first step of the face recognition/verification pipeline. A face detection algorithm outputs the locations of all faces in a given input image, usually in the form of bounding boxes. A face detector must be robust to variations in pose, illumination, view-point, expression, scale, skin color, some occlusions, disguises, make-up, and so on. Most recent DCNN-based face detectors are inspired by general object detection approaches. CNN detectors fall into two sub-categories: region-based and sliding-window-based methods.
- __Region-based__ approaches first generate object proposals and use a CNN classifier to classify each proposal as face or non-face. The first step usually uses an off-the-shelf proposal generator such as Selective Search [26]. Recent detectors such as HyperFace [10] and All-in-One Face [18] use this approach. Instead of generating object proposals with a generic method, Faster R-CNN [5] uses a Region Proposal Network (RPN). Jiang and Learned-Miller used the Faster R-CNN network for face detection [27]. Similarly, [28] proposed a multi-task face detector based on the Faster R-CNN framework. Chen et al. [29] trained a multi-task RPN for face detection and facial keypoint localization. This reduced wasted (over-generated) face proposals and, as a result, improved the quality of the face proposals. The Single Stage Headless face detector [7] is also RPN-based.
- __Sliding-window-based__ approaches output detections at every location of a feature map at a given scale. Each detection consists of a face detection score and a bounding box. Since this approach has no separate proposal generation step (detection happens in one pass), it runs faster than region-based (two-step) methods. Works such as [9] and [30] perform multi-scale detection by generating image pyramids at multiple scales. Similarly, [31] uses a cascade architecture for multiple resolutions. The Single Shot Detector (SSD) [6] is also a multi-scale sliding-window-based object detector. However, instead of using an object pyramid for multi-scale processing, SSD utilizes the hierarchical nature of deep CNNs. Methods such as ScaleFace [32] and S3FD [33] use a similar approach for face detection.
- In addition to improved detection algorithms, the availability of large annotated datasets has rapidly improved face detection performance. FDDB [34] contains 2,845 images with a total of 5,171 faces. The similarly scaled MALF [35] dataset consists of 5,250 images with 11,931 faces. A larger dataset is WIDER Face [22], which contains more than 32,000 images with wide variations in expression, scale, pose, illumination, and so on. Most SOTA face detectors are trained on the WIDER Face dataset. This dataset contains many tiny faces, and some of the face detectors discussed above still struggle to find small faces in images (their detection results on small faces are poor). [36] shows why detecting these small faces matters.
- [37] presents an extensive survey of the numerous face detection methods developed before 2014. [12] discusses the importance of face association for face recognition in video. Association is the process of finding the correspondence between different faces across different video frames.
### 2.2 Facial Keypoints Detection and Head Orientation
- Facial keypoints include the corners of the eyes, the nose tip, ear lobes, mouth corners, and so on. They are needed for face alignment, which is important for face identification/verification [15]. Head pose is another important piece of information of interest. Comprehensive surveys of keypoint localization methods can be found in [38] and [39].
### 2.3 Face Identification and verification
### 2.4 Multi-task Learning for Facial Analysis
## 3. A state-of-the-art face verification and recognition pipeline
<center>
<figure>
<img src="/assets/post_img/papers/2019-03-21-Fast_accurate_faced_detection/fig2.jpg" alt="views">
<figcaption>Figure 2. The face recognition pipeline proposed in the paper. Faces are detected using the proposed DPSSD face detector (Section 3.1). The detections are passed through the All-in-One Face network (Section 3.2), which produces facial keypoints for each face. This information is used for face alignment to produce a canonical view. The aligned faces are passed through the proposed face representation network (Section 3.3), and the similarity between two faces is obtained using cosine similarity.</figcaption>
</figure>
</center>
- This section discusses the SOTA pipeline for face identification and verification built by the authors within the past 18 months. An overview of the pipeline is shown in Figure 2. First, the proposed DPSSD face detection is described in Subsection 3.1. Next, the paper's face alignment method, which uses an MTL approach, is briefly summarized. Finally, the approach to extracting identity features is described, which is then used for face identification and verification.
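The verification score in this pipeline is plain cosine similarity between two identity feature vectors (see the Figure 2 caption). As a minimal sketch (the short vectors below are illustrative; real face embeddings are much higher-dimensional):

```python
def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

# Identical embeddings score ~1.0; orthogonal embeddings score 0.0.
print(cosine_similarity([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # ≈ 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))            # 0.0
```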
### 3.1 Deep Pyramid Single Shot Face Detector
- The paper proposes a DCNN-based face detector called the Deep Pyramid Single Shot Face Detector (DPSSD), which enables fast face detection at various scales and is especially good at finding small faces. Since face detection is a special case of general object detection, many researchers fine-tune off-the-shelf object detectors for face detection [27]. However, to design an efficient face detector, it is important to recognize the differences between face detection and object detection. First, faces appear at smaller scales/sizes in images than generic objects do; typical object detectors are not designed to detect at the low resolutions required for face detection. Second, faces do not require the variety of aspect ratios that generic objects do: since the aspect ratio of a face is roughly fixed, the machinery that generic object detectors use to handle diverse aspect ratios is unnecessary. The face detector was therefore designed with these points in mind.
## 4. Experimental results
- This section presents the paper's face detection results on four datasets. It also presents experimental results on four challenging evaluation datasets for face identification and verification: IJB-A, B, C, and CS5. The proposed system achieves SOTA or near-SOTA results on most protocols. The sections below describe the evaluation datasets and protocols and compare the experimental results of the proposed system.
### 4.1 Face detection
- The paper evaluates the proposed DPSSD face detector on four face detection datasets: WIDER Face, UFDD, FDDB, and Pascal Faces. The proposed method achieves SOTA results on the Pascal Faces dataset and competitive results on WIDER Face, UFDD, and FDDB.
#### 4.1.1 WIDER Face Dataset Results
- The dataset contains 32,203 images with 393,703 face annotations, of which 40% are used for training, 10% for validation, and the remaining 50% for testing. The dataset includes many annotations such as occlusions, poses, event categories, and face bounding boxes. Each face exhibits large variations in scale, pose, and occlusion. Since faces can be as small as 4 pixels, face detection on this dataset is very challenging. The paper uses the training set to train the face detector and evaluates performance on the validation set. Figure 7 compares the performance of the proposed DPSSD with SOTA face detectors.
<center>
<figure>
<img src="/assets/post_img/papers/2019-03-21-Fast_accurate_faced_detection/fig7.jpg" alt="views">
<figcaption>Figure 7. Performance evaluation on the WIDER Face validation dataset, split into Easy, Medium, and Hard faces. The number to the right of each model name is the mean average precision (mAP).</figcaption>
</figure>
</center>
- The paper compares DPSSD with S3FD [33], SSH [7], HR [36], CMS-RCNN [96], ScaleFace [32], Multitask Cascade [82], LDCF+ [97], Faceness [98], Multiscale Cascade [22], Two-stage CNN [22], and ACF [99]. DPSSD achieves performance competitive with SOTA methods such as S3FD, SSH, and HR. It achieves mAPs of 0.925 and 0.908 on the easy and medium sets, respectively. On the hard set, its mAP of 0.857 is very close to the best-performing method, S3FD (0.859).
- The paper also compares the proposed method with a baseline face detector trained by fine-tuning SSD [100]. On the hard set, DPSSD improves mAP by about 44% over SSD. This result shows that redesigning the anchor pyramid with a fixed aspect ratio and adding upsampling layers helps significantly improve face detection performance.
#### 4.1.2 UFDD Dataset Results
<center>
<figure>
<img src="/assets/post_img/papers/2019-03-21-Fast_accurate_faced_detection/fig8.jpg" alt="views">
<figcaption>Figure 8. Performance evaluation results on the UFDD dataset. The number to the right of each method denotes its mAP.</figcaption>
</figure>
</center>
- UFDD is a recent face detection dataset that includes several realistic issues not present in existing datasets. UFDD contains images degraded by weather (rain, snow, haze), motion blur, and out-of-focus blur. In addition, UFDD contains distractors that make the dataset very challenging, such as animal faces and non-face objects. The dataset contains a total of 6,425 images with 10,897 face annotations. The paper compares the proposed method with S3FD [33], SSH [7], HR [36], and Faster-RCNN [27]. As on the WIDER Face dataset, it achieves a competitive result with an mAP of 0.706. Note that the proposed algorithm was not fine-tuned on the UFDD dataset.
#### 4.1.3 FDDB Dataset Results
<center>
<figure>
<img src="/assets/post_img/papers/2019-03-21-Fast_accurate_faced_detection/fig9.jpg" alt="views">
<figcaption>Figure 9. Performance evaluation results on the FDDB dataset. The number to the right of each method denotes its mAP.</figcaption>
</figure>
</center>
- The FDDB dataset is a dataset for unconstrained face detection. It consists of 2,845 images taken from Yahoo website news articles, containing a total of 5,171 faces. The images are manually localized to generate the ground truth. The dataset has two evaluation protocols, discrete and continuous, which essentially correspond to coarse and precise matching between detections and the ground truth, respectively. To evaluate the proposed model, the paper follows the discrete protocol using the Receiver Operating Characteristic (ROC) curves shown in Figure 9.
- The paper compares performance with other face detection methods: S3FD [33], HR [36], Faster-RCNN [27], All-In-One Face [8], LDCF+ [97], DP2MFD [9], Faceness [98], HyperFace [10], Head-hunter [101], DPM [101], and Joint Cascade [79]. As shown in the figure, the proposed method achieves performance competitive with the latest methods such as S3FD and HR, reaching an mAP of 0.969. Note that no special fine-tuning on the FDDB dataset and no bounding-box regression were used.
#### 4.1.4 Pascal Faces Dataset Results
<center>
<figure>
<img src="/assets/post_img/papers/2019-03-21-Fast_accurate_faced_detection/fig9.jpg" alt="views">
<figcaption>Figure 9. Performance evaluation results on the FDDB dataset. The number to the right of each method denotes its mAP.</figcaption>
</figure>
</center>
- The PASCAL Faces dataset was collected from the person layout dataset, a subset of the PASCAL VOC dataset. The dataset contains 1,335 faces in 851 images, with large variations in appearance and pose. Fig. 10 compares the performance of different face detection methods on this dataset. As shown in Figure 10, the proposed DPSSD method achieves the best result with an mAP of 96.11%. Table 5 shows results for the various datasets and for the verification and identification evaluation tasks.
#### 4.2 ~ 4.5 omitted (sections on face identification/verification)
## 5. Conclusion
- The paper provided an overview of current face recognition systems that use CNNs. It discussed the face recognition pipeline and SOTA techniques. The paper also proposed a face recognition system that uses an ensemble of two networks for feature representation and described its details. In the proposed model, face detection and keypoint localization in the pipeline are performed at once using a single CNN. The paper discussed the training and dataset details for the system and how they relate to face recognition. Experimental results of the proposed system were presented on four challenging datasets, IJB-A, B, C, and CS5, and the ensemble-based system achieved results close to SOTA.
- However, several issues remain to be solved. Research is needed for a theoretical understanding of DCNN-based face recognition systems. Various loss functions are used to train these networks, and a framework that unifies all of these loss functions in the same context should be developed. Domain adaptation and dataset bias are also issues for current face recognition systems. These systems are usually trained on one dataset and work well on similar test sets, but a network trained on one domain does not work well on another. The paper trained on a combination of different datasets, which made the trained model more robust. Training a CNN currently takes hours to days, so efficient architectures and CNN implementations are needed for faster training.
| 108.175439 | 735 | 0.777084 | kor_Hang | 1.00001 |
3e9875e9aecffdccba0a674a4050518fcde1b051 | 4,802 | md | Markdown | README.md | shift-left-test/meta-shift | effce9bea894f990703cc047157e3f30d53d9365 | [
"MIT"
] | 2 | 2022-01-19T02:39:43.000Z | 2022-02-07T01:58:17.000Z | README.md | shift-left-test/meta-shift | effce9bea894f990703cc047157e3f30d53d9365 | [
"MIT"
] | null | null | null | README.md | shift-left-test/meta-shift | effce9bea894f990703cc047157e3f30d53d9365 | [
"MIT"
] | null | null | null | # meta-shift
> Shift-left testing for the Yocto project
## About
[Shift-left testing](https://en.wikipedia.org/wiki/Shift-left_testing) is an approach to software testing in which testing is performed earlier and often in the software development lifecycle.
The benefits of the shift-left testing approach are:
* Improved software quality since defects are detected in earlier stages
* Cost effective as early detected defects are cheaper to fix
* Increased efficiency in the software development process
* Reduced time to market since the QA stage does not take much time
The **meta-shift** layer is a set of recipes and classes for the Bitbake build system, which allows developers to test or examine their software modules in the host build environment. By enabling meta-shift, developers are able to easily use various tasks for their recipes via the bitbake command, such as:
* Lines of code
* Cache hit ratio
* Static analysis
* Comments
* Cyclomatic complexity
* Duplication
* Unit testing
* Code coverage
* Mutation testing
### Features
The main purpose of meta-shift is to provide the shift-left testing tools to the Yocto build environment satisfying the following needs.
* Easy to configure
* Host-based testing
* Supports major build systems (cmake, qmake, autotools)
* Supports SDK
* Supports various Yocto releases and BSPs
* Jenkins integration
## Quick start
Please visit the [build-sample](https://github.com/shift-left-test/build-sample) repository for more information about how to configure and use the features provided by the meta-shift layer.
## Usage
This is a meta layer for the Yocto project. Please see the [meta layer](https://docs.yoctoproject.org/overview-manual/concepts.html#layers) documentation if you are not familiar with the concept.
### Dependencies
**Mandatory layers**
* meta-oe (meta-openembedded)
* meta-python (meta-openembedded)
**Optional layers**
* meta-qt5: To support QT5 based recipes
* meta-qt6: To support QT6 based recipes
* meta-clang: To use clang-tidy and the mutation testing
### Installation
Clone this repository and add the layer to your *bblayers.conf*
$ git clone -b dunfell https://github.com/shift-left-test/meta-shift.git
$ source oe-build-init-env
$ bitbake-layers add-layer ../meta-shift
### Supported tasks
List of tasks via the bitbake command
* do_checkcache
* do_checkcacheall
* do_checkcode
* do_checkcodeall
* do_checkrecipe
* do_checkrecipeall
* do_checktest
* do_checktestall
* do_coverage
* do_coverageall
* do_report
* do_reportall
* do_test
* do_testall
### Supported bitbake tools
List of bitbake tool commands
* devtool cache
* devtool show
* bitbake-layers inspect
* bitbake-layers status
* bitbake-layers test-layers
* bitbake-layers test-recipes
* recipetool check
* recipetool inspect
### Configuration
These options can be set by adding them to *conf/local.conf*.
* **SHIFT_CHECKCODE_EXCLUDES**: Exclude paths from static analysis
* **SHIFT_CHECKCODE_TOOLS**: Select which static analysis tools to use (cppcheck, cpplint, and clang-tidy)
* **SHIFT_CHECKRECIPE_SUPPRESS_RULES**: Exclude rules from bitbake script analysis (a list of rules can be found at https://github.com/priv-kweihmann/oelint-adv)
* **SHIFT_CHECKTEST_EXCLUDES**: Exclude paths from mutation testing
* **SHIFT_CHECKTEST_EXTENSIONS**: Extensions of source files to be mutated
* **SHIFT_CHECKTEST_GENERATOR**: Set the mutation generator (random, uniform, or weighted)
* **SHIFT_CHECKTEST_LIMIT**: Set the maximum number of mutants
* **SHIFT_CHECKTEST_SCOPE**: Indicate which source code to mutate (all or commit)
* **SHIFT_CHECKTEST_SEED**: Set the random seed for the mutation generator
* **SHIFT_CHECKTEST_VERBOSE**: Silence the test output while running the `do_checktest` task
* **SHIFT_COVERAGE_EXCLUDES**: Exclude paths from code coverage analysis
* **SHIFT_REPORT_DIR**: Set the path where report files are stored
* **SHIFT_TEST_SHUFFLE**: Randomize the order of tests
* **SHIFT_TEST_SUPPRESS_FAILURES**: Do not return a non-zero exit code when tests fail
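As an illustrative sketch (the values below are assumptions for demonstration, not recommended defaults), a `conf/local.conf` fragment setting a few of these options might look like:

```
# conf/local.conf (illustrative values only)
SHIFT_CHECKCODE_TOOLS = "cppcheck cpplint"
SHIFT_CHECKCODE_EXCLUDES = "test third-party"
SHIFT_CHECKTEST_SCOPE = "commit"
SHIFT_CHECKTEST_LIMIT = "100"
SHIFT_REPORT_DIR = "${TOPDIR}/report"
```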
### Jenkins integration
It is recommended to set up [The meta-shift plugin for Jenkins](https://github.com/shift-left-test/meta-shift-plugin) for your Jenkins instance.
## Development
To prepare the meta-shift development environment via Docker:
$ git clone https://github.com/shift-left-test/dockerfiles.git
$ cd dockerfiles
$ docker build -f yocto-dev/18.04/Dockerfile -t yocto-dev-18.04 .
$ docker run --rm -it yocto-dev-18.04
To run all tests:
$ pytest
## Contributing
This project is open to any patches. The patches can be submitted as Github pull request in https://github.com/shift-left-test/meta-shift or to the project mailing list.
## License
This project source code is available under MIT license. See [LICENSE](LICENSE).
| 32.445946 | 307 | 0.771554 | eng_Latn | 0.965818 |
3e98f90ab77c7e6c3589509565134f2f310fc6d2 | 1,103 | md | Markdown | node-bridge-arduino-serial/readme.md | diogobernardino/protopie-connect-bridge-apps | 528923e343785819f1692324f6e4efab3a30ed9d | [
"Apache-2.0"
] | 10 | 2020-09-08T05:17:43.000Z | 2022-03-21T08:41:57.000Z | node-bridge-arduino-serial/readme.md | ProtoPie/protopie-connect-bridge-apps | f5332a4180da5598a7efe7ff228a25ecf968361b | [
"Apache-2.0"
] | 4 | 2020-09-24T23:22:04.000Z | 2021-08-17T06:55:17.000Z | node-bridge-arduino-serial/readme.md | ProtoPie/protopie-connect-bridge-apps | f5332a4180da5598a7efe7ff228a25ecf968361b | [
"Apache-2.0"
] | 7 | 2021-03-29T13:22:31.000Z | 2022-03-16T14:13:14.000Z | # arduino-serial-node-bridge
> Bridge app for an Arduino and a desktop connected over USB serial. This example shows how to exchange messages between an Arduino and the desktop through USB serial. It is essentially an echo example: the ProtoPie example sends a 'ROTATE' message to the node bridge app via ProtoPie Connect, and the bridge app delivers the message to the Arduino. After receiving the message, the Arduino echoes it back. Finally, you can see the object in ProtoPie rotate.
# Arduino
## Setup
You need to set the baud rate to 9600 and change the name of the serial port in index.js. You can find the port name in the Arduino IDE.
```js
const PORT_NAME = '/dev/cu.SLAB_USBtoUART';
```
## Send message from serial to ProtoPie Connect
```c
if (Serial.available() > 0) {
// read the message
message = Serial.readString();
// echo what you got
Serial.println(message);
Serial.flush();
}
```
## Send message from ProtoPie Connect to serial
```js
// You've got a message from ProtoPie connect
socket.on('ppMessage', message => {
// Write a message to Arduino
port.write(message.messageId);
});
``` | 31.514286 | 441 | 0.736174 | eng_Latn | 0.979133 |
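The two snippets above form a simple echo loop. As a rough sketch (the `{ messageId }` shape and the newline-terminated echo are assumptions based on the snippets, not a spec), the framing logic can be factored into pure helpers and exercised without any hardware:

```javascript
// Convert a ProtoPie Connect message into the payload written to the Arduino.
function toSerialPayload(message) {
  return message.messageId; // e.g. 'ROTATE'
}

// Parse one echoed line coming back from the Arduino.
// Serial.println() terminates lines with '\r\n', so trim it off.
function fromSerialLine(line) {
  return { messageId: line.trim() };
}

// Round trip: what ProtoPie Connect sends is what comes back.
const sent = toSerialPayload({ messageId: 'ROTATE' });
const received = fromSerialLine(sent + '\r\n');
console.log(received.messageId); // → ROTATE
```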
3e992e0921c1505ea1195e0b52e46e5a04a8803a | 2,756 | md | Markdown | README.md | MatrixOrbital/Eve-BT81x-Flash | c5d66dab2c6da20fb08bf20ed3bcaaac40387029 | [
"MIT"
] | null | null | null | README.md | MatrixOrbital/Eve-BT81x-Flash | c5d66dab2c6da20fb08bf20ed3bcaaac40387029 | [
"MIT"
] | null | null | null | README.md | MatrixOrbital/Eve-BT81x-Flash | c5d66dab2c6da20fb08bf20ed3bcaaac40387029 | [
"MIT"
] | null | null | null | This code contains functions to perform the following operations:
- Read a file off SD card and write it to Eve connected flash.
- Parse Eve flash and store file offsets - flash addresses - of files.
- An example of displaying bitmaps (ASTC format) directly out of flash.
To be used in conjunction with the Eve Asset Builder software from Bridgetek.
Eve Asset Builder will take any number of files and pack them into a single file. This file will also contain
the "blob" file provided by Bridgetek which allows the Eve BT81x to use fast mode (QSPI) in its interactions
with the onboard flash chip.
In order for this code to function, the location of the file list and offsets (output.map) must be known.
In order to place this offset table at a known address (offset 4096), the following procedure is provided:
Eve Asset Builder provides no method of ordering files within the "blob" so some dance must be performed.
1) select your converted media files and run "Generate Flash".
2) rename output.map as aaoutput.map
3) Add the same files to "Generate Flash" as well as aaoutput.map and generate flash again.
-- This includes the old map file at the first file location in flash - offset 4096
-- The included map file does not include itself and so all offsets are wrong.
4) Delete aaoutput.map and rename output.map to aaoutput.map
5) Include the same files again including aaoutput.map and generate flash a third time.
-- Now, the file aaoutput.map will be found at 4096 and that file now includes itself with correct offsets.
- Designed for Matrix Orbital EVE2 SPI TFT Displays incorporating BT81x chips and QSPI flash
https://www.matrixorbital.com/ftdi-eve/eve-bt815
- This code makes use of the Matrix Orbital EVE2 Library found here:
https://github.com/MatrixOrbital/EVE2-Library
- While a copy of the library files (Eve2_81x.c and Eve2_81x.h) is included here, you may look for updated
files if you wish.
- Matrix Orbital EVE2 SPI TFT display information can be found at: https://www.matrixorbital.com/ftdi-eve
- An Arduino shield with a connector for Matrix Orbital EVE2 displays is used to interface the Arduino to Eve.
This shield includes:
- 20 contact FFC connector for Matrix Orbital EVE2 displays
- 3 pushbuttons for application control without requiring a touchscreen (useful for initial calibration)
- Audio amplifier and speaker for audio feedback
- SD card holder
- Additionally, the shield board is automatically level shifted for 5V Arduino and works with 3.3V Parallax Propeller ASC+
https://www.matrixorbital.com/accessories/interface-module/eve2-shield
Support Forums
http://www.lcdforums.com/forums/viewforum.php?f=45
| 53 | 126 | 0.760522 | eng_Latn | 0.997244 |
3e99312ee7ff933c5ae524710738713f2ea358f0 | 1,165 | md | Markdown | docs/Storage.md | TheRakeshPurohit/cube-composer | a891ffe5de79b072819da04718820d0452b9a201 | [
"MIT"
] | 1,993 | 2015-03-09T20:51:45.000Z | 2022-03-25T08:16:19.000Z | docs/Storage.md | TheRakeshPurohit/cube-composer | a891ffe5de79b072819da04718820d0452b9a201 | [
"MIT"
] | 47 | 2015-03-28T13:22:13.000Z | 2022-03-07T15:33:00.000Z | docs/Storage.md | TheRakeshPurohit/cube-composer | a891ffe5de79b072819da04718820d0452b9a201 | [
"MIT"
] | 107 | 2015-06-02T10:52:29.000Z | 2022-01-31T11:20:09.000Z | ## Module Storage
#### `STORAGE`
``` purescript
data STORAGE :: Effect
```
#### `SaveableGameState`
``` purescript
type SaveableGameState = { currentLevel :: LevelId, levelState :: StrMap (Array TransformerId) }
```
#### `toSaveable`
``` purescript
toSaveable :: GameState -> SaveableGameState
```
#### `fromSaveable`
``` purescript
fromSaveable :: SaveableGameState -> GameState
```
#### `unsafeLoadGameState`
``` purescript
unsafeLoadGameState :: forall a eff. (a -> Maybe a) -> (Maybe a) -> Eff (storage :: STORAGE | eff) (Maybe SaveableGameState)
```
Retrieve the current game state from local storage (FFI, needs 'Just' and 'Nothing' as input)
#### `loadGameState`
``` purescript
loadGameState :: forall eff. Eff (storage :: STORAGE | eff) (Maybe GameState)
```
Retrieve game state from local storage
#### `unsafeSaveGameState`
``` purescript
unsafeSaveGameState :: forall eff. SaveableGameState -> Eff (storage :: STORAGE | eff) Unit
```
Store a game state in local storage (unsafe)
#### `saveGameState`
``` purescript
saveGameState :: forall eff. GameState -> Eff (storage :: STORAGE | eff) Unit
```
Store a game state in local storage
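
A minimal usage sketch of the safe pair of functions (the `roundTrip` name is an assumption for illustration, and `GameState` comes from the game's own types module; the code targets the older `Eff`-style API shown in the signatures above):

``` purescript
-- Load the saved state if one exists, then write it straight back.
roundTrip :: forall eff. Eff (storage :: STORAGE | eff) Unit
roundTrip = do
  saved <- loadGameState                  -- :: Maybe GameState
  case saved of
    Just state -> saveGameState state     -- persist the loaded state again
    Nothing    -> pure unit               -- nothing stored yet
```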
| 19.416667 | 124 | 0.687554 | kor_Hang | 0.521821 |
3e99bfa465004c613a425712d12c9b136d31d45a | 725 | md | Markdown | infrastructure/aws/scripts/README.md | zzwzzw435/webapptest | a212d08debb5ab4098344a3c5b18f6c6c76de307 | [
"Apache-2.0"
] | null | null | null | infrastructure/aws/scripts/README.md | zzwzzw435/webapptest | a212d08debb5ab4098344a3c5b18f6c6c76de307 | [
"Apache-2.0"
] | null | null | null | infrastructure/aws/scripts/README.md | zzwzzw435/webapptest | a212d08debb5ab4098344a3c5b18f6c6c76de307 | [
"Apache-2.0"
] | null | null | null | # CSYE6225-fall2019
Programming Infrastructure Using AWS cli commandline interface
pre-requirement:
1. Build credentials and config files under ~/.aws director for aws cli configuration
required input variables:
* profile name
* zone
* VPC CIDR (or use default when nothing input)
* subnets CIDR (or use default when nothing input)
Steps:
1. Run csye6225-aws-networking-setup.sh for creating networking resources Using AWS cli commandline interface. Follow the console output instruction to provide proper run-time variables.
2. Run csye6225-aws-networking-teardown.sh for deleting networking resources Using AWS cli commandline interface. Follow the console output instruction to provide proper run-time variables | 42.647059 | 189 | 0.811034 | eng_Latn | 0.955008 |
3e9bc57f2023e942e10d19ca10275376c4018680 | 3,226 | md | Markdown | README.md | michalgolek/Lightwave-Plugin-ImageFilter_RenderTag | f288592f01ad281c09ad3d997ae3439e56be12f8 | [
"MIT"
] | null | null | null | README.md | michalgolek/Lightwave-Plugin-ImageFilter_RenderTag | f288592f01ad281c09ad3d997ae3439e56be12f8 | [
"MIT"
] | null | null | null | README.md | michalgolek/Lightwave-Plugin-ImageFilter_RenderTag | f288592f01ad281c09ad3d997ae3439e56be12f8 | [
"MIT"
] | null | null | null | # MiG_RenderTag
> Newtek Lightwave plugin for layout module
An image filter plugin for the Layout module that prints the rendering time, the author, and the resolution of the output image, together with any user-supplied text, onto the final render.

## Installation
Windows:
>1. Copy the plugin with the **MiG_RenderTag.p** extension to the Lightwave plugins folder.
>2. Run the Layout module and add the plugin by clicking the **Add plugin** button in the **Utilities** tab, selecting the file "MiG_RenderTag.p"
## Usage example
>1. Go to the **Effects-> Processing** tab and add an **Image Filter** named **MiG_RenderTag**.
>2. Open the filter settings, then fill in the text field located above the Preview \ Load \ Save buttons with any text, e.g. "Sample Render Tag".
>3. Close the dialog box by clicking the **Continue** button
>4. Render a single frame by pressing **F9**.
>5. Done!
In the rendered image you will see a black rectangle at the bottom with a white inscription "Sample Render Tag".
## Development setup
Describe how to install Lightwave 3D SDK and prepare the environment for building a plugin.
Description for the Visual Studio development environment:
### Build server.lib for Lightwave SDK:
>1. Download the official SDK for Lightwave 3D from the manufacturer's website:
https://www.lightwave3d.com/lightwave_sdk/
>2. Unzip the SDK package to the location where the plugin project is located in the **"sdk\lwsdk"** folder
>3. Open the solution file in visual studio located in the main lwsdk folder
>4. Switch to the active project called **source**
>5. Select the target platform (for Lightwave plugins in 32 bit version use win32 and for Lightwave in 64 bit version use win64) and configuration type (Release or Debug)
>6. Access the project settings in the **Librarian** tab
>7. Set *Output File* to **$(OutDir)\server.lib**
>8. Build a project.
>9. Done!
### Build MiG_RenderTag plugin:
>If the process of building server.lib was successful then follow these steps:
>
>1. After creating a copy of the repository, also download the simpleini submodule
>2. Go to the project folder and open the Visual Studio solution called MIG_Plugin
>3. Select the target platform (for Lightwave plugins in 32 bit version use win32 and for Lightwave in 64 bit version use win64) and configuration type (Release or Debug)
>4. Optionally, if you want to debug the plugin while the Layout module is running, go to the **Debugging** tab in the project settings and set **Command** to the full path of the Layout module executable.<br>
> In addition, set the **Command Arguments** parameter if you want to run the layout module with your own parameters
>5. Build a project.
>6. Done!
## Release History
* 0.6.1
* Initial Commit
## Meta
Distributed under the MIT license. See ``LICENSE`` for more information.
## Contributing
1. Fork it (<https://github.com/michalgolek/Lightwave-Plugin-ImageFilter_RenderTag/fork>)
2. Create your feature branch (`git checkout -b feature/fooBar`)
3. Commit your changes (`git commit -am 'Add some fooBar'`)
4. Push to the branch (`git push origin feature/fooBar`)
5. Create a new Pull Request
## Synopsis
This is an HTML5/canvas implementation of the game Snake. Enjoy.
## Motivation
I've always loved the Snake games on Nokia mobile phones. This is an attempt at a replica.
## Installation
Just download the .html file and open it with your favorite browser!
## Contributors
If you want, you can help me make a REAL replica of the Nokia Snake game (with all the cozy graphics and stuff). Just let me know you're interested in doing it.
## License
MIT
# Scheme manpages
To write portable Scheme code it is necessary to know the language
itself and the variations that exist in its implementations. The
manpage format makes it possible to include much more detail than is
usual for the Scheme reports.
This project aims to be a collection of manpages for the programming
language Scheme. The goal is to include all of R6RS and R7RS.
## Status
Just started (2020-04-18).
## Scope
All of R6RS and R7RS-small, but it can be extended to R7RS-large. The
documents listed under `STANDARDS` can include SRFIs, but features
that exist only as SRFIs are not in scope.
## How to use
The manpages should be packaged for distribution at some point, but
for now it is enough to clone the repository and set the `MANPATH`
environment variable. It could be something like this shell snippet,
if you're in the cloned repository:
```sh
export MANPATH=$PWD:$(manpath -g)
```
Then you should be able to use `man car`, etc. It is also possible to
write `man man3/car.3scheme`.
## How to contribute
Take a manpage from `templates/` and fill it in! Remove any sections
that you feel are not needed. Do as many as you like and submit a PR
through GitHub. You can open an issue if you want to show others that
you're working on something.
Please use [errata-corrected versions of
R6RS](https://weinholt.se/scheme/r6rs/) and R7RS as your references.
Not everything has a template yet and some do not have accurate
synopses. Another way to help is to develop tools that work with the
documents, e.g. checking their structure, searching for missed pages,
etc.
Sometimes it's appropriate to group together several procedures in the
same page. See `man3/cdr.3scheme` for an example of how to
link to another page.
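For orientation, a grouped page plus a stub that points at it conventionally look like the following. This is an illustrative skeleton only, not the actual content of `man3/car.3scheme`; the pages in `man3/` are the reference:

```roff
.\" man3/car.3scheme: grouped page documenting car and cdr
.TH car 3scheme "2020-04-18" "Scheme manpages"
.SH NAME
car, cdr \- return the first element, or the rest, of a pair
.SH SYNOPSIS
.B (car
.I pair
.B )
.br
.B (cdr
.I pair
.B )
.SH DESCRIPTION
Describe semantics, errors, and implementation variations here.

.\" man3/cdr.3scheme: stub that sources the grouped page
.so man3/car.3scheme
```

The `.so` request is the usual way such stub pages redirect to the page that actually carries the text.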
## A short defense of the manpage language
The manpages are written directly in roff format. The history of roff
goes back to the early 1960s and today most systems use GNU groff as
their roff typesetter.
The manpages use the *man* macro package. An alternative would have
been *mdoc*, which is more modern, but it is much more geared towards
documenting C functions. The *tbl* and *eqn* pre-processors can also
be used if you need to create a table or an equation.
GNU groff can output text for the terminal, HTML, PDF and some other
obscure formats. Pandoc can convert the manpages to many other formats.
Using another format as the definitive one would easily introduce
translation errors. The roff format is also easy to parse later if we
want to do some automatic maintenance. Keep it simple!
# Configure Proximus cloudplatform
### Create CloudChannel
In order to receive uplink messages and send downlink messages through Proximus, you need to create a CloudChannel on the [Proximus EnCo cloud platform](https://www.enco.io/).
1. Go to [EnCo DevPortal](http://devs.enco.io/dashboard/) and log in with your Enco account.
2. Navigate to the CloudChannels API. You should get an overview of your CloudChannels APIs.
3. Click on the button  to create a new CloudChannel. You should see something like below:

4. Here you can define where the data come from and where the data should go. For this demo we use LoRa as the input and HTTP as the output.
5. After dragging & dropping the required components into the input and output, you need to configure the LoRa inbound configuration, the CloudChannel definition, and the HTTP outbound configuration.
6. Configure the input:
* Click on 
* Select your device.
7. Configure CloudChannel definition:
* Click on Edit definition
    * Give the CloudChannel a relevant name
* Select TemperatureSensor
* Click Ok
8. Configure the output:
* Click on 
    * Give it a relevant name
* Endpoint: http://\<hostname\>:8287/lora/api/proximus/uplink
* HTTP method: POST
* Click on Ok
9. Click Save Changes.
10. Redo above steps to create CloudChannels for PressureSensor, BatteryLevelSensor and HumiditySensor.
---
layout: post
title: Becoming a Project Lead - Part 2
description: My 5-week transition from a Developer to Project Lead
date: 2020-11-01
---
This is the continuation from my last post in which I express my transition from a Developer to Project Lead. If you would like to read the first part, you can do so by clicking [here](https://blogs.roarcoder.dev/posts/becoming-a-leader/index.html).
I am still using the same tools (Jira and Google Calendar) for project management and organising meetings. However, I am also using additional tools now, which are:
### Zoom
Using this for online meetings.
While it's a very good tool, I have observed a drawback (not complaining): if you have a recurring meeting, you can't reuse the same meeting link without the participants having to input the password. However, if you send a new link to the participants, they can use it without the password.
Having said that, I am sure this functionality has been implemented for a reason related to security which I totally understand.
### Slack
We have created a separate channel for this. There is also a group I have created, so whenever any member of the team has to communicate specifics about the project, it is done in this group. The team and I use this because we like to keep the technical specifics among ourselves.
## Feeling
Approximately 5 weeks into the transition, I am still feeling uncomfortable because I need to live up to the expectations of my seniors and deliver. It's also amazing how fully engaged I am; the time passes like a Concorde. Monday to Friday goes by in the blink of an eye.
## Experience
**Vocabulary** - I am choosing my words really carefully, as I have come to learn that life and death are at the tip of your tongue. A person can shape himself and his mind just by the kind of words he uses.
**Listen More** - I have also learnt that listening more is the key to success, as you only say what you already know when talking; however, you gain new knowledge when listening to other people.
**Lead throughout the SDLC** - Taking the project from idea to actual delivery with the team, step by step, has been a learning curve. I have learnt that it's not just about knowing how to do different things and assigning tasks, but about making sure my team understands what is required, what is expected, and what the deliverables are at the end. Especially when different members have different levels of expertise, I need to make sure I am utilizing each one to their best abilities for different kinds of tasks, and that all tasks are completed at the highest possible quality through Quality Assurance.
**Teaching** - As all of my team members have varying levels of knowledge, there is a need to teach them what they don't know. I am developing the skill of teaching by explaining to them what to do and how (if they don't know). My existing knowledge and technical skills are being polished, and if I don't know something, I go research it, learn it, and then teach them.
**Time Management** - I have learnt that efficient management of time is crucial to leading a project. Therefore, my mindset has changed from being "busy" to being "productive", as the time is passing rapidly.
**Focus Solely** - What I mean by this is it's important to focus on one thing at a time. As I am not a person who can multitask, this skill has been gifted to me by the Almighty.
## Conclusion
Overall, I am loving the transition, as I love being comfortable with the uncomfortable. This is it for now. Stay tuned for more.
I would love to see what your experience and feelings were when you were in the transition.
---
layout: post
title: Essential Units for Building Screens
---
## ◾ The Vue Instance
A Vue instance is the fundamental unit that must be created in order to develop a screen with Vue.
### ***Definition and Properties of a Vue Instance***
#### # Constructor
A constructor is a technique of storing frequently used options and features in a particular object in advance, so that when a new object is created it comes with those built-in features and can easily extend them.
```js
new Vue({ // create a Vue instance
  el: '#app', // el property
  data: { // data property
    message: 'Hello Vue.js!'
  }
})
```
When you create an instance with new Vue(), Vue is called the constructor. The Vue constructor becomes accessible once the Vue library is loaded; the features needed for Vue development are predefined on it, and users can redefine those features for convenient use.
### ***Options and Properties of a Vue Instance***
These are the properties, such as data, el, template, methods, and created, that redefine how the instance is created.
1. template: defines the markup elements, such as HTML and CSS, to display on the screen
2. methods: defines methods related to controlling screen logic, such as handling mouse click events
3. created: a property in which you can define logic to run as soon as the Vue instance is created
### ***Scope of a Vue Instance***
1. The Vue library file is loaded
2. The instance object is created (including its option properties)
3. The instance is attached to a specific screen element
4. The instance contents are converted into screen elements
5. The user finally sees the converted screen elements
### ***Vue Instance Lifecycle***
1. beforeCreate: the first lifecycle stage executed after the instance is created
2. created: the stage right after the component is created, in which the data and methods properties become accessible
3. mounted: called after the instance is attached to the screen element specified by el
4. beforeUpdate: called when observed data changes, before the virtual DOM redraws the screen
5. updated: executed after the data has changed and the virtual DOM has redrawn the screen
6. beforeDestroy: called right before the Vue instance is destroyed

## ◾ Vue Components
A component is a specific area that makes up the screen.
### ***Registering Components***
1. Global components
```js
Vue.component('component-name', {
  // component content
});
```
2. Local components
```js
new Vue({
  components: {
    'component-name': componentContent
  }
});
```
### *Difference Between Local and Global Components*
The difference lies in the scope of the instance.
1. Global components do not need to be registered in the components property each time a new instance is created.
2. Local components must be registered in the components property of every newly created instance.
## ◾ Vue Component Communication
Because each component's scope is independent, a component cannot directly reference another component's values.
The application is not structured according to each developer's personal style; it follows one uniform data flow, which makes collaboration easier.
### *Parent-Child Component Relationships*
1. Vue follows a basic communication rule: data is passed only from parent to child.
2. From parent to child, a special property called props is passed down.
3. From child to parent, only events are passed up.
### *Passing Data from a Parent to a Child Component*
props is the special property used to pass data from a parent component down to a child component.
1. To use props, define them on the child component
```js
Vue.component(
  'child-component', {
    props: ['propsPropertyName']
  }
);
```
2. Add a v-bind attribute to the child-component tag registered in the parent component's HTML
```html
<child-component v-bind:propsPropertyName="parentDataProperty"></child-component>
```
### *Passing Data from a Child to a Parent Component*
The child emits an event (event emit) to signal the parent component.
1. Emitting an event
```js
this.$emit('eventName');
```
2. Receiving the event
```html
<child-component v-on:eventName="parentMethodName"></child-component>
```
### *Communication Between Components at the Same Level*
The event bus is a way for components to exchange data even when they do not have a parent-child relationship.
#### Event Bus Pattern
```js
// create one additional instance to act as the event bus
var eventBus = new Vue();

// the component that sends the event
methods: {
  methodName: function() {
    eventBus.$emit('eventName', data);
  }
}

// the component that receives the event
// (created is a lifecycle hook, so it belongs at the top level of the
// component options, not inside methods)
created: function() {
  eventBus.$on('eventName', function(data) {
    // ...
  });
}
```
The sending component implements .$emit(), and the receiving component implements .$on().
### *Drawbacks of the Event Bus*
The event bus is convenient because it can pass data directly between any components without the props property, but as the number of components grows it becomes hard to keep track of what was sent from where to where.
## ◾ Reference Links
1. [Vue instance lifecycle diagram (source)](https://wikidocs.net/17701)
## ◾ Notes
> Next chapter: [Essential Techniques for Developing Production Web Apps](https://wisdompark.github.io/Vue-Js3/)
# stack
App list:
* [Gotify](https://gotify.net/)
  - a simple server for sending and receiving messages
# Choose Validation Libraries
## Status
accepted
## Context
When constructing models, we need to validate the inputs to ensure quality, as well as produce error messages other developers, or users can understand, so they can fix the data they are presenting to us. Validation can also help us mitigate adversarial techniques, such as property pollution.
In addition to validation, immutability can help developers avoid unintended bugs that can be caused when multiple functions act upon a reference/pointer.
Many validation libraries exist, as well as formats for defining schemas. Not being satisfied with any validation libraries in the node community in 2014, I wrote [@polyn/blueprint](https://github.com/losandes/polyn-blueprint#polynblueprint).
JavaScript supports immutability to a degree, however true immutability requires careful, and verbose use of `Object.freeze`. Of the libraries I've reviewed that help mitigate this complexity, none did so without introducing significant complexities themselves. I built [@polyn/immutable](https://github.com/losandes/polyn-immutable#polynimmutable) to both solve the problem of immutability in JavaScript, _and_ further reduce the amount of code necessary to validate an object, at the same time.
I've continued to maintain these libraries over the years. I rewrote them as new libraries to use at Slack in 2019. They now support TypeScript. Blueprint schemas are designed to look like TypeScript where possible to reduce the cognitive overhead when being used in concert with TypeScript.
@polyn/blueprint & immutable support JSON Schema, which can be a better option when we intend to export our schemas to 3rd party developers for their own testing; however, JSON Schema is much more complex and limited when compared to native blueprint schemas.
In addition to the TypeScript style type definitions, blueprint supports functional validation. This allows it to go beyond type validation, and into comprehensive, and proprietary validation very easily. It supports object oriented type definitions, regular expressions, and can be used for ETL without defining multiple models (there is no requirement that the input data match the output schema, and blueprint has tools to help you map from one to the other).
When developers consume the output of blueprint or immutable, instead of the original input, the libraries help mitigate a common adversarial technique: property pollution.
An example blueprint, using immutable:
```JavaScript
const { optional, range, registerBlueprint } = require('@polyn/blueprint')
const { immutable } = require('@polyn/immutable')
const { randomBytes } = require('crypto')
registerBlueprint('Movie', {
title: 'string',
director: 'string',
year: range({ gte: 1895, lte: new Date().getFullYear() })
})
const User = immutable('User', {
id: optional('string').withDefault(`U${randomBytes(4).toString('hex')}`),
name: 'string',
age: range({ gt: 0, lte: 130 }),
role: /^admin|user$/,
favoriteMovies: 'Movie[]?'
})
```
Blueprint also exports a module called `is`, which can be used as part of an assertion strategy (e.g. `is.number(42)`, and `is.not.number('fourty-two')`)
I evaluated Walmart's [Joi](https://github.com/hapijs/joi) library while writing this ADR. I think it solves a subset of the problems that blueprint solves, and none of the problems immutable solves. Joi is more verbose to use, and to read, unless we use JSON Schema. The breaking changes introduced to Joi year after year indicates that it's less stable than blueprint and immutable.
## Decision
Use [@polyn/blueprint](https://github.com/losandes/polyn-blueprint#polynblueprint), and [@polyn/immutable](https://github.com/losandes/polyn-immutable#polynimmutable) for validation and immutability.
## Consequences
Using blueprint's proprietary schemas means that we tightly couple our models to these libraries. We can use JSON Schema instead to avoid that coupling; however, I did not identify evidence that supports that as the better option.
[< Chapter 4.2: Application View](04_2_Application_View.md)
### 4.3 Operational View
Since applications in the manufacturing sector have special requirements in terms of stability and automation, an integrated operational approach for the edge nodes is essential. This chapter focuses on the functions that are important to manage the edge node through its lifecycle.
As described in chapter 4.1, it is possible to host the edge node either in the plant data center or on a computing unit on the shop floor next to the production asset. From an operational point of view, both options have their own challenges.
<img width="854" alt="image" src="https://user-images.githubusercontent.com/3258579/124182717-18a93e80-da6c-11eb-9fd0-691b935a2106.png">
An edge node also goes through the characteristic device lifecycle phases described in [OMP IOTCON 2020]<sup>1</sup>. Each lifecycle phase has specific requirements for successful execution. These requirements are generally met with additional cloud services. The predominant supporting cloud services are the Edge Node Management & Onboarding Service, the Monitoring Service, the Security Service, and the Edge Node Twin.
**Management & Onboarding Service**: Generally, this cloud service has the task of fetching the current or setting the desired state of connected edge nodes.
In the _provisioning phase_, the service onboards new edge nodes. The first step is to create digital identities. The digital identity can originate from existing asset management systems. The service can also save and manage target configurations of edge nodes.
The Management & Onboarding Service provides an endpoint where edge nodes initiate a connection. For this, a basic setup of the edge node must be performed. Example actions are installing the operating system, provisioning the agents, and providing security credentials.
After the initial connection is complete, the edge node offers basic self-describing characteristics (state). A few examples are the firmware version and node settings. The node state is compared with the target configuration from the service. If deviations occur, a state update is sent down to the edge node. State updates can be security or policy updates, changed configurations, and application versions (containers). In our reference use case, a configuration would be the thresholds to identify anomalous energy consumption patterns.
State updates resulting in edge node downtime that interfere with production are not possible at any time. Therefore, careful consideration should be given to scheduling the updates at an appropriate time.
Customized parameters are required to adapt the installed application to the specific use case. These parameters are pushed to the edge node and applied to the application in the _configuration phase_. An example can be a configuration file that is loaded onto the storage (e.g., hard drive) and fetched by the containers on startup. This is the same when updates occur in the _operations phase_ (see Edge Node Twin).
The final phase is _retirement_. On _physical_ edge nodes, a retirement is initiated by a hardware failure or an upgrade cycle. The hardware is exchanged, and the existing identity with its state is transferred to the new device. To reduce downtime in manufacturing scenarios, this relocation must be done by the service as seamlessly (i.e., automatically) as possible. On virtual edge nodes, a failure results in starting a new edge node and in the transferal of the digital identity.
If the edge node can be fully retired, the digital identity is removed from the edge node management service to prevent further access, both in physical or virtual edge node cases.
The **Cloud Monitoring Service** collects all log and metric information from edge nodes. Information can originate from the host system as well as from the running container applications. Therefore, the edge runtime and the applications must support this mechanism.
Via the service, it is possible to create alerting mechanisms (e.g., on allocated memory) which are applied as stream analysis on the incoming data. In the _provisioning phase_, the edge node establishes a connection to the monitoring service. Afterwards, alerts, as well as logs and metrics, can be used throughout the following _configuration_ and _operations phase_. They are used to determine the system’s health and perform incident traceability.
To obtain meaningful results, the log messages must have a defined format. Common components are message origin, severity, content, UTC timestamp, and correlation id to ensure better traceability.
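As a sketch of what such a record could look like (the JSON key names below are an assumption for illustration, not a format prescribed by the white paper), the components above map naturally onto a structured log object:

```python
import json
import uuid
from datetime import datetime, timezone

def make_log_record(origin, severity, content, correlation_id=None):
    """Build a structured log record with the components listed above."""
    return {
        "origin": origin,          # message origin, e.g. container or host name
        "severity": severity,      # e.g. DEBUG / INFO / WARN / ERROR
        "content": content,        # human-readable message
        "timestamp": datetime.now(timezone.utc).isoformat(),   # UTC timestamp
        "correlationId": correlation_id or str(uuid.uuid4()),  # traceability
    }

record = make_log_record("energy-analyzer", "WARN",
                         "consumption above threshold")
print(json.dumps(record))
```

Emitting one such JSON object per line keeps the records trivially parseable by the Cloud Monitoring Service's stream analysis.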
The **Security Service** is responsible for managing the secure entities of the edge node and its applications. In the _provisioning phase_, the edge node connects to the cloud service for the first time, and they exchange their trust entities. In a scenario of large-scale installation of physical edge nodes, a default certificate could be provided, which is valid for the first connection. After the connection to the service, it gets exchanged by the security service. For this, the service needs access to the relevant certificate authorities (CAs).
A second task is policy enforcement. This is done by a comparison of a target state with the current device state. Examples are updating the host system and the application of security rules, like disabling ports.
In the _configuration phase_, the security service supports the edge application configuration by handling security-related tasks, e.g., installing certificates to connect to third-party services. Also, edge applications can manage their secrets in a secure manner through this service. In the _operations phase_, continuous monitoring of vulnerability databases is performed. Any findings can result in potential updates of the target states. In the _retirement phase_, the revocation of the security entities of an edge node takes place.
Another relevant service is the **Edge Node Twin**. Its job is to synchronize application parameters between the cloud service and the edge nodes. Therefore, its main use is in the _operations phase_. Possible configuration parameters are stored in an “edge node twin” representation in the service. Changes made on the twin’s parameters are synchronized via the Device Management Service. A possible example is the adoption of a threshold value inside an application.
The parameter is generally reflected as hierarchically organized key-value-pairs. As a prerequisite, the edge container runtime has to be able to receive these parameters and provide them to the container applications. The container applications themselves need to be able to interpret them and change dynamically based on the given values from the twin service.
In our case, the process specialist analyzes the visualized data in the cloud dashboard and then sets new values in the Edge Node Twin to reduce the energy consumption.
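As an illustration, the desired state held by the twin and the state reported by the node can be compared to find what needs to be synchronized. The structure and key names below are assumptions for the sketch, not a specific vendor schema:

```python
desired = {
    "energyMonitor": {
        "thresholdKw": 4.2,       # new value set by the process specialist
        "sampleIntervalS": 30,
    }
}
reported = {
    "energyMonitor": {
        "thresholdKw": 5.0,       # value currently applied on the edge node
        "sampleIntervalS": 30,
    }
}

def delta(desired, reported):
    """Return the keys whose desired value differs from the reported one."""
    out = {}
    for key, want in desired.items():
        have = reported.get(key)
        if isinstance(want, dict) and isinstance(have, dict):
            sub = delta(want, have)
            if sub:
                out[key] = sub
        elif want != have:
            out[key] = want
    return out

print(delta(desired, reported))  # {'energyMonitor': {'thresholdKw': 4.2}}
```

Only the computed delta needs to travel down to the edge container runtime, which then hands the changed parameters to the running application.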
[Chapter 5: Outlook >](05_Outlook.md)
---
layout: post
title: "New Year... ReFocused"
date: 2020-01-06 21:19:29 +0000
permalink: new_year_refocused
---
Happy New Year!! These last few weeks have sparked an energy inside of me that has me wanting to complete and finish things I've started. I'm in a space of cleaning house. I'm SUPER SUPER excited about my Sinatra project and how it's coming along. I am so excited about this project that I feel it can grow into something bigger.
I've already started the framework for my project. My plan is to get about 2 hours of coding each night between work and bed. I am super excited (and nervous) about the technical interview portion of the project. I want to be sure I'm better than before. I want to feel more secure in the questions I am answering. I felt I was a bit shaky before, but I feel this time around I can be more prepared. I've also started to look at some of the requirements for jobs in the software development world, trying to gauge the technical debt that I would need.
[gcr.io/istio-release/proxytproxy](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
-----
[gcr.io/istio-release/proxytproxy:1.1.0.snapshot.1](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20180922-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20180923-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20180924-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20180925-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20180926-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20180927-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20180928-05-43](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20180928-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20181001-03-48](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20181001-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20181002-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20181002-09-25](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:master-20181003-09-15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test14](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test15](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test22](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test30](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test33](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test34](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test35](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test36](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test38](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test40](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test44](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test45](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
[gcr.io/istio-release/proxytproxy:rkpagadala-test-latest-daily](https://hub.docker.com/r/anjia0532/istio-release.proxytproxy/tags/)
---
title: MT Enum
menuTitle: MT Enum
searchDescription: Lua MT Enum
searchTitle: Lua MT Enum
weight: 25
---
## MessageType Enum
- [NPCQuestSay](npcquestsay) -- {{% lua_type_number %}}
- [Say](say) -- {{% lua_type_number %}}
- [Tell](tell) -- {{% lua_type_number %}}
- [Group](group) -- {{% lua_type_number %}}
- [Guild](guild) -- {{% lua_type_number %}}
- [OOC](ooc) -- {{% lua_type_number %}}
- [Auction](auction) -- {{% lua_type_number %}}
- [Shout](shout) -- {{% lua_type_number %}}
- [Emote](emote) -- {{% lua_type_number %}}
- [Spells](spells) -- {{% lua_type_number %}}
- [YouHitOther](youhitother) -- {{% lua_type_number %}}
- [OtherHitsYou](otherhitsyou) -- {{% lua_type_number %}}
- [YouMissOther](youmissother) -- {{% lua_type_number %}}
- [OtherMissesYou](othermissesyou) -- {{% lua_type_number %}}
- [Broadcasts](broadcasts) -- {{% lua_type_number %}}
- [Skills](skills) -- {{% lua_type_number %}}
- [Disciplines](disciplines) -- {{% lua_type_number %}}
- [Unused1](unused1) -- {{% lua_type_number %}}
- [DefaultText](defaulttext) -- {{% lua_type_number %}}
- [Unused2](unused2) -- {{% lua_type_number %}}
- [MerchantOffer](merchantoffer) -- {{% lua_type_number %}}
- [MerchantBuySell](merchantbuysell) -- {{% lua_type_number %}}
- [YourDeath](yourdeath) -- {{% lua_type_number %}}
- [OtherDeath](otherdeath) -- {{% lua_type_number %}}
- [OtherHits](otherhits) -- {{% lua_type_number %}}
- [OtherMisses](othermisses) -- {{% lua_type_number %}}
- [Who](who) -- {{% lua_type_number %}}
- [YellForHelp](yellforhelp) -- {{% lua_type_number %}}
- [NonMelee](nonmelee) -- {{% lua_type_number %}}
- [WornOff](wornoff) -- {{% lua_type_number %}}
- [MoneySplit](moneysplit) -- {{% lua_type_number %}}
- [LootMessages](lootmessages) -- {{% lua_type_number %}}
- [DiceRoll](diceroll) -- {{% lua_type_number %}}
- [OtherSpells](otherspells) -- {{% lua_type_number %}}
- [SpellFailure](spellfailure) -- {{% lua_type_number %}}
- [Chat](chat) -- {{% lua_type_number %}}
- [Channel1](channel1) -- {{% lua_type_number %}}
- [Channel2](channel2) -- {{% lua_type_number %}}
- [Channel3](channel3) -- {{% lua_type_number %}}
- [Channel4](channel4) -- {{% lua_type_number %}}
- [Channel5](channel5) -- {{% lua_type_number %}}
- [Channel6](channel6) -- {{% lua_type_number %}}
- [Channel7](channel7) -- {{% lua_type_number %}}
- [Channel8](channel8) -- {{% lua_type_number %}}
- [Channel9](channel9) -- {{% lua_type_number %}}
- [Channel10](channel10) -- {{% lua_type_number %}}
- [CritMelee](critmelee) -- {{% lua_type_number %}}
- [SpellCrits](spellcrits) -- {{% lua_type_number %}}
- [TooFarAway](toofaraway) -- {{% lua_type_number %}}
- [NPCRampage](npcrampage) -- {{% lua_type_number %}}
- [NPCFlurry](npcflurry) -- {{% lua_type_number %}}
- [NPCEnrage](npcenrage) -- {{% lua_type_number %}}
- [SayEcho](sayecho) -- {{% lua_type_number %}}
- [TellEcho](tellecho) -- {{% lua_type_number %}}
- [GroupEcho](groupecho) -- {{% lua_type_number %}}
- [GuildEcho](guildecho) -- {{% lua_type_number %}}
- [OOCEcho](oocecho) -- {{% lua_type_number %}}
- [AuctionEcho](auctionecho) -- {{% lua_type_number %}}
- [ShoutEcho](shoutecho) -- {{% lua_type_number %}}
- [EmoteEcho](emoteecho) -- {{% lua_type_number %}}
- [Chat1Echo](chat1echo) -- {{% lua_type_number %}}
- [Chat2Echo](chat2echo) -- {{% lua_type_number %}}
- [Chat3Echo](chat3echo) -- {{% lua_type_number %}}
- [Chat4Echo](chat4echo) -- {{% lua_type_number %}}
- [Chat5Echo](chat5echo) -- {{% lua_type_number %}}
- [Chat6Echo](chat6echo) -- {{% lua_type_number %}}
- [Chat7Echo](chat7echo) -- {{% lua_type_number %}}
- [Chat8Echo](chat8echo) -- {{% lua_type_number %}}
- [Chat9Echo](chat9echo) -- {{% lua_type_number %}}
- [Chat10Echo](chat10echo) -- {{% lua_type_number %}}
- [DoTDamage](dotdamage) -- {{% lua_type_number %}}
- [ItemLink](itemlink) -- {{% lua_type_number %}}
- [RaidSay](raidsay) -- {{% lua_type_number %}}
- [MyPet](mypet) -- {{% lua_type_number %}}
- [DS](ds) -- {{% lua_type_number %}}
- [Leadership](leadership) -- {{% lua_type_number %}}
- [PetFlurry](petflurry) -- {{% lua_type_number %}}
- [PetCrit](petcrit) -- {{% lua_type_number %}}
- [FocusEffect](focuseffect) -- {{% lua_type_number %}}
- [Experience](experience) -- {{% lua_type_number %}}
- [System](system) -- {{% lua_type_number %}}
- [PetSpell](petspell) -- {{% lua_type_number %}}
- [PetResponse](petresponse) -- {{% lua_type_number %}}
- [ItemSpeech](itemspeech) -- {{% lua_type_number %}}
- [StrikeThrough](strikethrough) -- {{% lua_type_number %}}
- [Stun](stun) -- {{% lua_type_number %}}
# Humanizer
Humanizer is a very simple CAPTCHA method. It has a localized YAML file with questions and answers which is used to validate that the user is an actual human. Any model that includes ActiveModel::Validations should work. Our aim is to be database and mapper agnostic, so if it doesn't work for you, open an issue. Humanizer only works with Rails 3.
## Installation
1. `gem install humanizer`
2. `rails g humanizer`
## Advanced Installation
* Install all locales: `rails g humanizer --all-locales`
* Show available locales: `rails g humanizer --show-locales`
* Install selected locales: `rails g humanizer en fi de`
## Usage
1. In your model, include Humanizer and add the #require_human_on method, example:
class User < ActiveRecord::Base
include Humanizer
require_human_on :create
end
2. Ask the question in the form, example:
<%= f.label :humanizer_answer, @model.humanizer_question %>
<%= f.text_field :humanizer_answer %>
<%= f.hidden_field :humanizer_question_id %>
## Configuration
Default translations can be found in config/locales/
You might want to add/change question and answer pairs. This can be easily done by adding/modifying entries in locales file.
## Live sites
* [ArcticStartup.com](http://arcticstartup.com/) - signup form
## License
Humanizer is licensed under the MIT License, for more details see the LICENSE file.
## Question/Answer Translations
* English, Finnish and Portuguese translations by [Kisko Labs](http://kiskolabs.com/)
* German by [Sven Schwyn](http://github.com/svoop)
* Dutch by [Joren De Groof](http://github.com/joren)
* Brazilian Portuguese by [Britto](http://github.com/britto)
* Russian by [Shark](http://github.com/Serheo)
## Contributors
* [Florian Bertholin](http://github.com/Arkan)
* [seogrady](http://github.com/seogrady)
---
title: Command
tags:
  - 'type: behavioral'
  - 'level: object'
- GoF
- interface
- action
- callback
- 'function object'
show_excerpt: true
---
Encapsulates an action in an object, which makes it possible to store the action,
pass it around as a parameter, keep a history of executions, or undo it.
<!--more-->
<style>
.wrap {
padding-bottom: 25px;
}
</style>
## Purpose

Encapsulates an action in an object, which makes it possible to store the
action, pass it around as a parameter, keep a history of executions, or undo
it. Suppose an application has a presentation layer and a business-logic
layer. The presentation layer sends certain requests to the business-logic
layer. The Command pattern suggests performing such requests not directly,
but through commands that encapsulate the request. A command includes all the
details of the request: which object it is invoked on, which method is
called, and which arguments are passed. For example: on the "living room
lamp" object, call the "turn the light on" method with the arguments "warm"
and "50%".
<p align="center">
<img src="/assets/images/command/command-illustration.png" width="50%" />
</p>
## Description

The action is encapsulated in an object (the command) that carries the entire
context, all the information required to perform that action. Using the saved
context, it is also possible to undo the action by applying the inverse
transformation (although sometimes such a transformation does not exist, or
is too expensive). All commands share a common interface, which means they
can be stored in a single container and processed in a uniform way. Through
the command interface a command can be executed without passing any
additional information in the arguments; everything it needs is already
encapsulated in the command itself, which means a command can be executed
from anywhere.
## Implementation
<p align="center">
<img src="/assets/images/command/command-class-diagram.png"/>
</p>
<div class="grid grid--px-0">
<div class="cell cell--lg-3 cell--3"><b>Client</b></div>
<div class="cell cell--auto">Client</div>
<div class="cell cell--lg-12 wrap">Client code; creates and configures the command, setting the object on which the command is invoked, its parameters, and, through the command's type, the method of the object to be called</div>
<div class="cell cell--lg-3 cell--3"><b>Invoker</b></div>
<div class="cell cell--auto">ProgrammableRemoteControl</div>
<div class="cell cell--lg-12 wrap">A class that stores and invokes the command. Receives the command from outside via DI.</div>
<div class="cell cell--lg-3 cell--3"><b>Command</b></div>
<div class="cell cell--auto">Command</div>
<div class="cell cell--lg-12 wrap">The command interface; as a rule it takes no arguments, because all the necessary context is already captured inside the command's state when it is initialized</div>
<div class="cell cell--lg-3 cell--3"><b>ConcreteCommand</b></div>
<div class="cell cell--auto">TurnLightOnCommand</div>
<div class="cell cell--lg-12 wrap">A concrete command implementation that performs some useful action on the target object</div>
<div class="cell cell--lg-3 cell--3"><b>Receiver</b></div>
<div class="cell cell--auto">Light</div>
<div class="cell cell--lg-12 wrap">The target business-logic object, the receiver of the call made through the command.</div>
</div>
**A peculiarity**

A command usually has no return type; it modifies the application's state.
## Examples

Consider a programmable remote control where an action can be assigned to
each button. When the remote-control object is created, the commands that the
button presses will invoke are injected into it. Suppose that, with the first
button of a particular remote, we want to turn a particular lamp on in green
at 60% brightness. For this action, a command object is created with the
chosen parameter values and bound to that lamp. Pressing the button will now
invoke the command installed in the remote control.
<p align="center">
<img src="/assets/images/command/command-class-diagram-example.png"/>
</p>
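The remote-control example maps directly onto the class names in the diagram
above. Here is a minimal, runnable Java sketch of it (the setter-based wiring
of the command into the invoker is one possible choice, not the only one):

```java
public class CommandDemo {

    interface Command {
        void execute();
    }

    // Receiver: the business-logic object the command targets.
    static class Light {
        boolean on = false;
        String color = "";
        int brightness = 0;

        void turnOn(String color, int brightness) {
            this.on = true;
            this.color = color;
            this.brightness = brightness;
        }
    }

    // Concrete command: captures the receiver and the call parameters.
    static class TurnLightOnCommand implements Command {
        private final Light light;
        private final String color;
        private final int brightness;

        TurnLightOnCommand(Light light, String color, int brightness) {
            this.light = light;
            this.color = color;
            this.brightness = brightness;
        }

        @Override
        public void execute() {
            light.turnOn(color, brightness);
        }
    }

    // Invoker: stores a command and runs it on demand.
    static class ProgrammableRemoteControl {
        private Command buttonCommand;

        void setButtonCommand(Command command) {
            this.buttonCommand = command;
        }

        void pressButton() {
            buttonCommand.execute();
        }
    }

    public static void main(String[] args) {
        Light livingRoomLight = new Light();
        ProgrammableRemoteControl remote = new ProgrammableRemoteControl();
        remote.setButtonCommand(new TurnLightOnCommand(livingRoomLight, "green", 60));

        // Pressing the button invokes whatever command was installed.
        remote.pressButton();
        System.out.println(livingRoomLight.on + " " + livingRoomLight.color
                + " " + livingRoomLight.brightness); // true green 60
    }
}
```

Note that the invoker knows nothing about lamps: it only sees the `Command`
interface, so any other command can be installed on the same button.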
## Variants

* A macro command can be implemented that consists of a sequence of ordinary
commands and executes them when invoked.
* The command interface may or may not include a method for undoing the
command.
## How it differs

**[Adapter](/2021/01/24/adapter.html)** has a structure and class diagram
similar to Command. One could say that a command is, in a sense, an adapter
of the called object to the Command interface, which makes Command a very
specific case of Adapter. But the Adapter pattern is applied as needed, case
by case, to whatever interface is required, while for Command a new interface
is created. An Adapter can adapt an interface with several methods dictated
by business logic; Command implements purely command-style methods.

**[Strategy](/2021/09/28/strategy.html)** defines how to perform an action:
the way, or the algorithm, of doing it. Command defines which action to
perform. A command often does not contain the logic of the action itself; it
simply encapsulates the parameters and the invocation of a particular method
on a particular object. A strategy usually defines the substantive part of
the logic by which the action is carried out.

Strategy - Quicksort / Mergesort - how
Command - Open / Close - what

A strategy usually takes its parameters in the method call, while a command
encapsulates them inside itself. A command can be executed later; a strategy
runs at the moment of the call. There are usually only a few strategy objects
(since they are stateless), while there can be many command objects; a
command has state and is bound to the specific object whose method it will
call.
## References
[https://java-design-patterns.com/patterns/command/](https://java-design-patterns.com/patterns/command/)
[https://github.com/iluwatar/java-design-patterns/tree/master/command](https://github.com/iluwatar/java-design-patterns/tree/master/command)
[https://refactoring.guru/design-patterns/command](https://refactoring.guru/design-patterns/command)
[Command Pattern – Design Patterns (ep 7)](https://www.youtube.com/watch?v=9qA5kw8dcSU)
[StackOverflow: Difference Between Command and Adapter Patterns](https://stackoverflow.com/questions/28392556/difference-between-command-and-adapter-patterns)
[StackOverflow: Difference between Strategy pattern and Command pattern](https://stackoverflow.com/questions/4834979/difference-between-strategy-pattern-and-command-pattern)
[Baeldung - The Command Pattern in Java](https://www.baeldung.com/java-command-pattern)
| 43.346667 | 195 | 0.783605 | rus_Cyrl | 0.961755 |
Commands used:
1. Use Application Default Credentials (ADC):
`gcloud auth application-default login`
1. Create a new GCP Project:
`gcloud projects create <PROJECT_ID>`
1. List all GCP Projects:
`gcloud projects list`
1. Set active Project:
`gcloud config set project <PROJECT_ID>`
1. Use Service Account Credentials:
```
gcloud iam service-accounts create prod-svc
gcloud projects add-iam-policy-binding unifi-controller-kyle --member="serviceAccount:[email protected]" --role="roles/owner"
gcloud iam service-accounts keys create prod-svc-creds.json --iam-account=prod-svc@unifi-controller-kyle.iam.gserviceaccount.com
```
1. Set GCP Credentials:
`set GOOGLE_APPLICATION_CREDENTIALS=C:\Users\Kyle\AppData\Roaming\gcloud\application_default_credentials.json`
1. Set ssh username (Optional):
`set TF_VAR_username=kyle`
1. Run init
`terraform init`
1. Run Validate
`terraform validate`
1. Run Apply
`terraform apply`
# General information for BeagleBone development
## Bundle list
- Most recent official releases: https://beagleboard.org/latest-images
- A list of special bundles is available here: https://elinux.org/Beagleboard:BeagleBoneBlack_Debian#Debian_Stretch_LXQt_Snapshot_for_BeagleBoard-xM
# Arithmetic on Trees
## Alternate Notations for Arithmetic
There are other ways to write down math. The style which we are used to is known
as [infix notation][1]. Two better known alternatives are
[Reverse Polish notation or postfix][3], and [Polish notation or prefix][2].
The main difference is that infix notation places operators, like +, −,
×, ÷, between the arguments, and parentheses are used
for grouping. Postfix notation
(<acronym title="Reverse Polish Notation">RPN</acronym>) puts the operators
after the arguments, and prefix
(<acronym title="Polish Notation">PN</acronym>) puts the operators before the
arguments. One interesting difference is that these other two notations
don’t require the use of any parentheses. Below is a table of a few
examples written in all three notations.
| Infix | Postfix | Prefix |
| --- | --- | --- |
| `1 + 1` | `1 1 +` | `+ 1 1` |
| `1 + 2 × 3` | `1 2 3 × +` | `+ × 2 3 1` |
| `(1 + 2) × 3` | `1 2 + 3 ×` | `× + 1 2 3` |
| `√(3^2 + 4^2)` | `3 2 ^ 4 2 ^ + √` | `√ + ^ 3 2 ^ 4 2` |
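The postfix column above can be evaluated mechanically with a stack: numbers
are pushed, and each operator pops its arguments and pushes the result, which
is also why no parentheses are ever needed. A small Java sketch for the
commutative operators + and ×, written here with the ASCII `*`:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class PostfixEval {

    // Evaluates a space-separated postfix expression built from + and *.
    static double eval(String expr) {
        Deque<Double> stack = new ArrayDeque<>();
        for (String token : expr.trim().split("\\s+")) {
            if (token.equals("+")) {
                stack.push(stack.pop() + stack.pop());
            } else if (token.equals("*")) {
                stack.push(stack.pop() * stack.pop());
            } else {
                stack.push(Double.parseDouble(token)); // a number
            }
        }
        return stack.pop(); // result of the last operation
    }

    public static void main(String[] args) {
        System.out.println(eval("1 1 +"));     // 1 + 1       -> 2.0
        System.out.println(eval("1 2 3 * +")); // 1 + 2 * 3   -> 7.0
        System.out.println(eval("1 2 + 3 *")); // (1 + 2) * 3 -> 9.0
    }
}
```

Since + and * are commutative, the order in which the two arguments are
popped does not matter; the noncommutative − and ÷ would need the pops
swapped.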
There are programming advantages of using these alternate notations,
most deal with entering in expressions and parsing expressions.
Many <acronym title="Hewlett Packard">HP</acronym> calculators use
<acronym title="Reverse Polish Notation">RPN</acronym> input which
allows them to forgo buttons for `(`, `)`, and `=`. The end of a
postfix expression is signaled by entering an operation. This can be
seen in the postfix examples above in that each ends with an operation.
Parsing postfix and prefix expressions can be accomplished with simpler
programs than those required to parse infix expressions.
With the supremacy of <acronym title="Texas Instruments">TI</acronym>
calculators, which employ infix expression input, it is difficult to
test any benefit from students knowing these alternate notations due to
the cost of teaching either prefix or postfix. And there are benefits to
students who can think in these alternate notations. One is a better
understanding of order of operations. Neither prefix nor postfix
notation requires an ordering on operation evaluation. This is because
the ordering is already encoded in the notation. A student who can
consistently convert expressions from a textbook or assignment into one
of these alternate notations will be able to correctly determine the
order of evaluation in the exercise. Second, often the last operation
performed in the evaluation is important to identify. In exercises like
expanding logarithmic expressions, computing derivatives, or even
solving equations, the last operation performed needs to be identified
to determine which rule should be the first to employ. With postfix, the
last operation is the last symbol in the expression, and likewise with
prefix, it is the first symbol. Third, distinguishing between horizontal
and vertical transformations requires an ability akin to identifying
which operations happen before the function is applied and which happen
after. All these benefits amount to a deeper understanding of order of
operations for not just numeric expressions, but algebraic ones as well. For
those that would like a new phrase, perhaps we should call this deeper
concept the “order of evaluation”.
## Tree Notation for Arithmetic
I think that there might be a notation which is easier to read and
understand which provides all the benefits of prefix or postfix. This
notation may be correctly referred to as the abstract syntax tree of the
arithmetic expression. Learning this tree notation makes prefix and
postfix easier to understand, and vice versa. Below is an example of
using the tree notation to represent `1+1`.
Next is an example of the computations to show that
`3 / 4 − 2 / 3 = 1 / 12`.
Another learning example is the quadratic formula.
And the following are the steps in the computation of the quadratic
formula for the quadratic `\(x^{2} − 4 x + 3\)`.
One may notice the use of boxes for operators like − and ÷
versus the use of ovals for + and ×. This is due to the fact that
− and ÷ are noncommutative binary operators, whereas + and
× are commutative. So for − and ÷, it must be made
explicit which is the first argument and which is the second. But for
+ and ×, it does not matter which argument is first and which is
second. Also, note in the quadratic formula, the part of the tree
corresponding to `\(4 a c\)` is just an oval for × with three
leaves coming off it, one leaf for each 4, `\(a\)`, and `\(c\)`. This
might seem odd because × is not defined to take three arguments
but only two. The associative property is what allows us to multiply
together any number of arguments.
Already, we have used the properties of associativity and commutativity
to explain the features of this tree notation. The convenience of these
two properties in our notation pushes us to use + instead of − and
× instead of ÷.
[1]: https://en.wikipedia.org/wiki/Infix_notation "Infix notation"
[2]: https://en.wikipedia.org/wiki/Polish_notation "Polish notation"
[3]: https://en.wikipedia.org/wiki/Reverse_Polish_notation "Reverse Polish notation"
[4]: https://en.wikipedia.org/wiki/Lisp_%28programming_language%29#Syntax_and_semantics "LISP"
[5]: https://en.wikipedia.org/wiki/Smalltalk#Expressions "Smalltalk"
[6]: https://en.wikipedia.org/wiki/APL_%28programming_language%29 "APL"
# Web Services
## REST and SOAP

Two different approaches that allow data transmission in Web Services.

### Some concepts related to Web Services

* SOA (Service-Oriented Architecture)
* SOAP (Simple Object Access Protocol)
* REST (Representational State Transfer)

(using the Java programming language)

---
## SOA

When we talk about SOA, we are talking about a software ARCHITECTURE pattern
based on the principles of distributed computing, in which the
functionalities of the software must be made available in the form of
services.

---
#### A BRIEF NOTE ON DISTRIBUTED COMPUTING

Communication between client and server used to be restricted to an internal
network, with the server responsible for performing all of the processing.
Later, through middlewares such as CORBA, DCOM, and RMI, this processing came
to be split across several servers, with these middlewares responsible for
providing communication in distributed systems.

---
### More recently, client/server applications migrated to the Internet

This gave rise to Web Services, which emerged as an extension of the concepts
of remote method invocation. So, we can say that:
### Web Services

Web Services are distributed applications that communicate through messages.
In other words, a Web Service is an interface that describes a collection of
operations accessible over the network through messages. In this sense, the
transactions and business rules of an application are exposed through
protocols that other applications can access and understand. These
applications can be written in any programming language and can reside on
any operating system.

---
# UNDER CONSTRUCTION

-- To keep this post from getting too long, I will write part 2 starting
from "WEB SERVICES AND THEIR ARCHITECTURE".
## WEB SERVICES AND THEIR ARCHITECTURE

The Web Services architecture is based on three components:

* Service provider;
* Service consumer;
* Service registry.

Let's get to know each of these components a little better.
### SERVICE PROVIDER

The first component, the service provider, is responsible for creating and
describing the Web Service (in a standard format, understandable by anyone
who needs to use the service), as well as for publishing it so that it can
be used.

### SERVICE CONSUMER

This component, or role, is played by whoever uses a Web Service made
available by a service provider.

### SERVICE REGISTRY

This is a repository through which the service provider can publish its Web
Services and in which the service consumer can find them. In technical
terms, the service registry contains information such as the company's
details, the services it offers, and the technical description of its Web
Services.

---
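To make the provider, consumer, and registry roles concrete, here is a
minimal, runnable Java sketch. It is an illustration only: the names
(`TemaService`, `InMemoryRegistry`) and the in-memory map are hypothetical
stand-ins for a real registry such as UDDI.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SoaRolesSketch {

    // The service contract: what the provider exposes and the consumer calls.
    interface TemaService {
        List<String> getModulosTema(String temaNome);
    }

    // Service registry: maps a service name to a published implementation.
    static class InMemoryRegistry {
        private final Map<String, TemaService> services = new HashMap<>();

        // Provider side: publish a service under a name.
        void publish(String name, TemaService service) {
            services.put(name, service);
        }

        // Consumer side: discover a service by name (null if absent).
        TemaService lookup(String name) {
            return services.get(name);
        }
    }

    public static void main(String[] args) {
        InMemoryRegistry registry = new InMemoryRegistry();

        // Provider publishes an implementation of the contract.
        registry.publish("tema", tema ->
                "Webservices".equals(tema)
                        ? List.of("SOAP and REST", "SOAP in Java", "REST in Java")
                        : List.of());

        // Consumer discovers the service and invokes it.
        TemaService service = registry.lookup("tema");
        System.out.println(service.getModulosTema("Webservices"));
    }
}
```

In a real Web Services stack the same three roles appear, with WSDL
describing the contract and UDDI playing the registry part.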
## OTHER ELEMENTS OF THE WEB SERVICES ARCHITECTURE

As can be seen in Figure 1, besides the elements already presented, there
are still others that make up the Web Services architecture, such as WSDL
and SOAP, as well as XML and UDDI. Next, we will learn a little more about
the WSDL and UDDI technologies. SOAP will be covered later, along with REST,
in dedicated topics.

### WSDL

WSDL (Web Services Description Language) is an XML-based language whose
purpose is to describe, in an automated way, the services of a Web Service
through a document accessible to the clients that wish to use it. WSDL is
responsible for providing the information needed to use a Web Service, such
as the available operations and their signatures.

### UDDI

UDDI (Universal Description, Discovery and Integration) is responsible for
providing a mechanism for the discovery and publication of Web Services. In
this sense, UDDI contains categorized information about the functionalities
and services available in the Web Service, also allowing technical
information (usually defined with WSDL) to be associated with those services.

---
## SOAP AND REST

As mentioned earlier, in the context of distributed computing, technologies
such as RMI, DCOM, and CORBA were initially used for application integration.
These technologies were successful when applied in local, homogeneous
network environments. Later, in the heterogeneous environment of the
Internet, other solutions were applied through the construction of web
applications written in languages such as ASP, PHP, and Java (JSP). Such
applications, in terms of integration with other applications, made use of
XML.

Although XML is a standardized data transmission format, there was a lack of
standardization among companies in terms of development, use of protocols,
transactions, security, and so on. In response, the W3C developed a standard
whose main goal was to provide interoperability between applications. That
standard was named the "WS-* standards" and is made up of specifications for
creating Web Services based on the SOAP protocol.

REST, unlike SOAP, is not a specification, nor was it created by the W3C. In
general terms, it is an alternative way (a web architecture) to consume Web
Services, based on the use of the features offered by HTTP.

We will see more about SOAP and REST next.

---
## SOAP

SOAP is a protocol, based on XML definitions, used for exchanging
information in a distributed environment. This protocol encapsulates the
calls to (and returns from) Web Service methods, working mainly over the
HTTP protocol. With SOAP, it is possible to invoke remote applications using
RPC or message exchange, regardless of the operating system, platform, or
programming language of the applications involved.

### Communication in SOAP

Web Services that use the SOAP protocol can adopt two distinct communication
models:
#### RPC

In this model, it is possible to model method calls with parameters, as well
as to receive return values. Here, the body of the SOAP message contains the
name of the method to be executed and its parameters, while the response
message contains a return value or a fault.

#### Document

In this model, the body contains an XML fragment that is sent to the
requested service, in place of the set of values and parameters present in
RPC.
### Message format

A SOAP message is made up of three elements:

#### ENVELOPE

The main element (the root of the XML document), responsible for defining
the content of the message. It is a mandatory element.

#### HEADER

A generic mechanism that makes it possible to add features (additional
information) to the message. It is an optional element, but, when used, it
must be the first element inside the Envelope.

#### BODY

The body of the message. It contains the information to be transported.
Like the Envelope, it is a mandatory element.
```xml
<SOAP-ENV:Envelope>
  <SOAP-ENV:Header>
  </SOAP-ENV:Header>
  <SOAP-ENV:Body>
    <SOAP-ENV:Fault>
    </SOAP-ENV:Fault>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
```
### Example of a request and response using SOAP

For better understanding, below is a practical example of a Web Service
request and response using the SOAP protocol. In this example, the
"GetModulosTema" method is invoked. It receives the topic name as a
parameter, represented by the "TemaNome" variable, and returns the names of
the modules related to the given topic. The XML containing the request
envelope can be seen in the code below:
<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Header>
  </soap:Header>
  <soap:Body>
    <GetModulosTema>
      <TemaNome>Webservices</TemaNome>
    </GetModulosTema>
  </soap:Body>
</soap:Envelope>
The XML of the envelope containing the response of the invoked method is shown next.
<?xml version="1.0"?>
<soap:Envelope
    xmlns:soap="http://www.w3.org/2003/05/soap-envelope"
    soap:encodingStyle="http://www.w3.org/2003/05/soap-encoding">
<soap:Body>
<GetModulosTemaResponse>
<Modulos>
<Modulo>
<Nome>SOAP e REST</Nome>
</Modulo>
<Modulo>
<Nome>Utilização de SOAP XML em JAVA</Nome>
</Modulo>
<Modulo>
<Nome>Utilização de REST JSON em JAVA</Nome>
</Modulo>
</Modulos>
</GetModulosTemaResponse>
</soap:Body>
</soap:Envelope>
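To illustrate how a client might extract the module names from a response like the one above, here is a sketch added for this text (the element names GetModulosTemaResponse, Modulos, Modulo, and Nome come from the example payload; the parsing code itself is an assumption, not part of the original tutorial):

```python
import xml.etree.ElementTree as ET

# The response envelope from the example, embedded as a string for the sketch.
response_xml = """<soap:Envelope
    xmlns:soap="http://www.w3.org/2003/05/soap-envelope"
    soap:encodingStyle="http://www.w3.org/2003/05/soap-encoding">
  <soap:Body>
    <GetModulosTemaResponse>
      <Modulos>
        <Modulo><Nome>SOAP e REST</Nome></Modulo>
        <Modulo><Nome>Utilização de SOAP XML em JAVA</Nome></Modulo>
        <Modulo><Nome>Utilização de REST JSON em JAVA</Nome></Modulo>
      </Modulos>
    </GetModulosTemaResponse>
  </soap:Body>
</soap:Envelope>"""

root = ET.fromstring(response_xml)
# The payload elements are unqualified (no namespace), so a plain tag search works.
names = [nome.text for nome in root.iter("Nome")]
print(names)
```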
REST
REST was proposed by Roy Fielding, one of the main authors of the HTTP protocol, in 2000, with the premise of using the features already offered by HTTP. It is a simpler model than SOAP, and it is not a protocol but a web architecture, composed of the following elements:
Client (of the Web Service);
Provider (of the Web Service);
HTTP protocol.
Considering the elements above, the life cycle of consuming a REST Web Service starts with the client sending a request to a given provider. After processing the request, the provider responds to the client. HTTP is the protocol that defines the format of the messages sent and received, and it is also responsible for transporting those messages.
Structure of REST resources
In the REST architecture, each available service or resource corresponds to a specific and unique URI (Uniform Resource Identifier). Considering the example seen with the SOAP protocol, we can say that "GetModulosTema" is a method belonging to a resource, which we will call "Tema". The URI to consume this service would therefore be:
http://www.dominio.com.br/tema/GetModulosTema/{nome-do-tema}
Since "Tema" is the name of the resource, we can imagine other methods available on it. We could have, for example, a method to list all themes, a method to insert a new theme, and so on. Each of these services would have its own URI:
Listing all themes
http://www.dominio.com.br/tema
Inserting a theme
http://www.dominio.com.br/tema/CreateTema/{nome-do-tema}
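The URIs above follow a simple template: base address, resource name, optional action, optional parameter. As a sketch added for this text (the helper function and its names are illustrative, not part of the original), a client could build them like this:

```python
from urllib.parse import quote

# Base address taken from the examples above.
BASE = "http://www.dominio.com.br"

def tema_uri(action=None, tema=None):
    """Build one of the resource URIs shown above (illustrative helper)."""
    parts = [BASE, "tema"]
    if action:
        parts.append(action)
    if tema:
        # Percent-encode the theme name in case it contains spaces or accents.
        parts.append(quote(tema))
    return "/".join(parts)

print(tema_uri())                                # listing all themes
print(tema_uri("GetModulosTema", "Webservices")) # modules of one theme
```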
Since REST Web Services are based on the HTTP protocol, the structure of REST resources, as seen above, comes precisely from the HTTP methods and status codes. In practical terms, this means we should use the different HTTP methods according to the data-manipulation operations we want to perform on the resources.
For example: to retrieve data, as when we want to list all existing themes, we should use the HTTP GET method.
Table 1 below lists the HTTP methods and their roles in the REST architecture:
HTTP method | Description / What it is used for
GET         | Used to retrieve or list resources
POST        | Used to create a resource
PUT         | Used to update a resource
DELETE      | Used to delete a resource
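The mapping in Table 1 can be made concrete without sending any traffic. In the sketch below, added for this text, Python's standard urllib.request.Request object carries the chosen HTTP method (the operation names and the "/tema/1" path are hypothetical, used only for illustration):

```python
from urllib.request import Request

# Operation names (left) are illustrative; the HTTP methods follow Table 1.
OPERATIONS = {
    "list":   "GET",
    "create": "POST",
    "update": "PUT",
    "delete": "DELETE",
}

def build_request(operation, url):
    # Constructing a Request does not send it; it just records method and URL.
    return Request(url, method=OPERATIONS[operation])

req = build_request("update", "http://www.dominio.com.br/tema/1")
print(req.get_method())
```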
As already mentioned, REST uses the features of the HTTP protocol, so for service responses we have the HTTP status codes available.
For example: to check whether a resource was updated successfully, we verify that the HTTP status code equals 200. If an error occurred, we will instead get a code such as 400 or 404.
Example of a request and response using REST
A REST resource is consumed through a URI, so we could access it even through a web browser. When it comes to integration between applications, however, the most common approach is to implement a client, in a programming language, that accesses the resource, sending parameters when necessary and handling its response. In this context, we will reuse the SOAP example and retrieve the list of modules available for a given theme.
To consume the service, we will use the following URI:
http://www.dominio.com.br/tema/GetModulosTema/Webservices
A possible response for this request is shown below:
{
"Modulos": [{
"Nome": "SOAP e REST"
}, {
"Nome": "Utilização de SOAP XML em JAVA"
}, {
"Nome": "Utilização de REST JSON em JAVA"
}
]
}
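A client consuming this service would parse the JSON body into native data structures. The sketch below, added for this text, uses Python's standard json module on the response shown above (the "Modulos" and "Nome" keys come from the example; the client code itself is an assumption):

```python
import json

# The JSON body from the example response, embedded as a string for the sketch.
response_body = """{
  "Modulos": [
    {"Nome": "SOAP e REST"},
    {"Nome": "Utilização de SOAP XML em JAVA"},
    {"Nome": "Utilização de REST JSON em JAVA"}
  ]
}"""

data = json.loads(response_body)
# Extract just the module names, as a SOAP client would do with the XML payload.
names = [module["Nome"] for module in data["Modulos"]]
print(names)
```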
Important
As we have seen, the information returned by the consumed Web Service is in JSON format. Although it is not the only data type available for transporting information in REST (or rather, in HTTP messages), it is the most widely used in this architecture.
As a note, REST requests can use any of the content types (Content-Type) below:
application/xml
application/json
text/plain
text/xml
text/html
To keep this from getting too long, module 2 will cover web services, architectures, SOAP, and REST.