Dataset columns: hexsha (string, 40) · size (int64, 5–1.04M) · ext (6 classes) · lang (1 class) · max_stars_repo_path (string, 3–344) · max_stars_repo_name (string, 5–125) · max_stars_repo_head_hexsha (string, 40–78) · max_stars_repo_licenses (sequence, 1–11) · max_stars_count (int64, 1–368k, nullable) · max_stars_repo_stars_event_min_datetime / max_datetime (string, 24, nullable) · max_issues_repo_path (string, 3–344) · max_issues_repo_name (string, 5–125) · max_issues_repo_head_hexsha (string, 40–78) · max_issues_repo_licenses (sequence, 1–11) · max_issues_count (int64, 1–116k, nullable) · max_issues_repo_issues_event_min_datetime / max_datetime (string, 24, nullable) · max_forks_repo_path (string, 3–344) · max_forks_repo_name (string, 5–125) · max_forks_repo_head_hexsha (string, 40–78) · max_forks_repo_licenses (sequence, 1–11) · max_forks_count (int64, 1–105k, nullable) · max_forks_repo_forks_event_min_datetime / max_datetime (string, 24, nullable) · content (string, 5–1.04M) · avg_line_length (float64, 1.14–851k) · max_line_length (int64, 1–1.03M) · alphanum_fraction (float64, 0–1) · lid (191 classes) · lid_prob (float64, 0.01–1)
c00ec213a3138c563e3b85d52c36243e4b8dabb2 | 8,060 | md | Markdown | _posts/SIST/2018-11-07-oracle-정리-04.md | younggeun0/younggeun0.github.io | 29926870832211f3e489e4c608349650a2e5482a | [
"MIT"
] | null | null | null | _posts/SIST/2018-11-07-oracle-정리-04.md | younggeun0/younggeun0.github.io | 29926870832211f3e489e4c608349650a2e5482a | [
"MIT"
] | null | null | null | _posts/SIST/2018-11-07-oracle-정리-04.md | younggeun0/younggeun0.github.io | 29926870832211f3e489e4c608349650a2e5482a | [
"MIT"
] | 1 | 2022-03-10T00:16:36.000Z | 2022-03-10T00:16:36.000Z |
---
layout: post
title: Oracle DBMS Notes 04
tags: [OracleDBMS]
excerpt: "Oracle DBMS notes - FUNCTIONS"
date: 2018-11-07
feature: https://github.com/younggeun0/younggeun0.github.io/blob/master/_posts/img/oracle/oracleImageFeature.jpg?raw=true
comments: true
---
## Oracle DBMS Notes 04 - FUNCTIONS
---
### Functions
* Oracle provides the virtual table **DUAL** for practicing functions (see the example below)
  - It is a **pseudo table**
  - It builds a column from whatever expression you pass in and returns it as a query result
  - If a user creates their own table named DUAL, the virtual table can no longer be used
    + Remove the identically named table with DROP TABLE DUAL; and the virtual table is immediately usable again
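A minimal example of the kind of one-off query DUAL is used for (this example is illustrative and not part of the original notes):
~~~sql
-- DUAL has exactly one row, so it is handy for evaluating an expression once
SELECT LENGTH('ABCD') AS len,
       SYSDATE        AS today
  FROM DUAL;
-- LEN | TODAY
-- 4   | (current date)
~~~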
---
### String functions
~~~sql
-- LENGTH(string): returns the length of a string as a number
LENGTH('ABCD')
-- 4
-- UPPER(string): converts an English string to upper case
UPPER('AbcD')
-- ABCD
-- LOWER(string): converts an English string to lower case
LOWER('AbcD')
-- abcd
-- INITCAP(string): upper-cases only the first letter and lower-cases the rest
INITCAP('name')
-- Name
-- If the string contains spaces, the first letter after each space is also upper-cased
INITCAP('abcd ef ghi')
-- Abcd Ef Ghi
-- INSTR(string, search_string): returns the index of the searched-for substring
-- In Oracle, indexes start at 1
INSTR('AbcDef', 'D')
-- 4
-- If the substring is not found, 0 is returned (languages like Java return -1 instead)
-- SUBSTR(string, start_index, length): cuts a piece out of a string
-- i.e. it cuts a child string (substring) out of a parent string (superstring)
SUBSTR('ABCDEF', 2, 3)
-- BCD
-- If the length is omitted, it cuts from the start index to the end
SUBSTR('ABCDEF', 3)
-- CDEF
-- Three kinds of whitespace trimming: leading (left), trailing (right), and both ends
-- Whitespace in the middle can be removed with the REPLACE function!
-- TRIM(string): removes leading and trailing whitespace
TRIM(' ABCDE ')
-- ABCDE
-- LTRIM(string): removes leading whitespace
LTRIM(' ABCDE ')
-- ABCDE
-- RTRIM(string): removes trailing whitespace
RTRIM(' ABCDE ')
-- ABCDE
-- CONCAT(string, string_to_append): concatenates strings
-- Same functionality as the concatenation operator (||)
-- With many concatenations CONCAT gets messy, so || is used more often
CONCAT('ABC', 'DEF')
-- ABCDEF
-- String padding: LPAD pads on the left, RPAD pads on the right
-- In these functions Korean (Hangul) characters count as 2 bytes
-- LPAD(string, total_width, pad_string), RPAD(string, total_width, pad_string)
LPAD('ABCDE', 10, '#')
-- #####ABCDE
RPAD('ABCDE', 10, '$')
-- ABCDE$$$$$
-- Frequently used to build fixed-width, standardized strings
~~~
---
### Math functions
~~~sql
-- SIN(x), COS(x), TAN(x) return sin/cos/tan; rarely used
-- ABS(x): returns the absolute value
ABS(-5)
-- 5
-- ROUND(value, position): rounds at the given position
-- Positions are relative to the decimal point in ROUND(555.555): fractional digits are +, integer digits are -
-- For the fractional part, the position is the number of digits you want to keep after rounding
ROUND(5.555, 1)
-- 5.6
ROUND(5.555, 2)
-- 5.56
-- For the integer part, rounding happens at that digit
ROUND(555, -1)
-- 560
-- CEIL(x): always rounds up unless the fraction is 0
CEIL(10.1)
-- 11
-- FLOOR(x): rounds down
FLOOR(10.8)
-- 10
-- TRUNC(value, position): truncates at the given position
-- For the fractional part the position is the number of digits to keep; for the integer part it is the digit to truncate at
-- The fractional part is cut off after the given digit
-- The integer part is cut off at the given digit
TRUNC(555.555, 2)
-- 555.55
-- Truncate at the ones digit (e.g. for currency)
TRUNC(555.555, -1)
-- 550
-- Truncate the fractional part
TRUNC(555.555, 0)
-- 555
~~~

---
### NULL conversion functions
- NULL is the state of having no value at all (it is not 0)
- **NULL is inserted when a column is not listed in an INSERT statement**
  + CHAR/VARCHAR2: when the column is not listed, or the value is ''
  + NUMBER/DATE: when the column is not listed
- **NVL(value, value_to_show_when_null)**
  + The replacement value shown when the column is NULL should have the same type as the column
~~~sql
NVL(age, 0)
-- Example of supplying the wrong data type ('없음' means "none")
NVL(age, '없음')
-- Oracle applies implicit type conversion, so a string is fine as long as it is numeric
NVL(age, '0')
~~~
---
### Conversion functions
- **TO_CHAR - converts non-character values (NUMBER, DATE) to characters**
- **Number conversion patterns**
  + Prints , or . at the specified positions
  + 0 - pads with 0 where there is no data
  + 9 - prints only as many digits as the data actually has
- **Date conversion patterns**
  + Means "show the date converted into the format I want"
  + yyyy / yy - year / mm/mon/month - month / dd - day
  + am - AM/PM / hh - 12-hour clock / hh24 - 24-hour clock
  + mi - minutes
  + ss - seconds
  + day - day of week (e.g. Thursday) / dy - abbreviated day (e.g. Thu)
  + q - quarter (1-4)
  + Caution! A pattern that is too long raises an error!
    * If an error occurs, split the pattern and join the pieces with the concatenation operator
- TO_DATE - converts non-date values to dates
  + Converts a date-formatted string into a date
~~~sql
-- TO_CHAR(number, pattern)
TO_CHAR(2018, '0,000,000')
-- 0,002,018
TO_CHAR(2018, '9,999,999')
-- 2,018
SELECT TO_CHAR(2018.1025, '999999.999')
FROM DUAL;
-- 2018.103
-- From the employee table, query employee number, name, hire date and salary
-- The salary should be printed with commas, but only as many digits as the data has
SELECT empno, ename, hiredate, TO_CHAR(sal, '999,999') sal
FROM emp;
-- Error! TO_CHAR returns a string, and strings cannot be used in arithmetic
SELECT TO_CHAR(sal, '9,999')+100
FROM emp;
-- This, however, works
-- SELECT TO_CHAR(sal+100, '9,999')
-- TO_CHAR(date, pattern)
TO_CHAR(SYSDATE, 'y')
-- 8
TO_CHAR(SYSDATE, 'yy')
-- 18
TO_CHAR(SYSDATE, 'yyyy')
-- 2018
TO_CHAR(SYSDATE, 'yyyyy')
-- 20188
-- Build whatever pattern shape you need
-- If the value was not entered via SYSDATE, it carries no time-of-day information
SELECT TO_CHAR(SYSDATE, 'yyyy-mm-dd am hh(hh24):mi:ss day dy q')
FROM DUAL;
SELECT SYSDATE
FROM DUAL;
-- Each tool displays SYSDATE differently; use TO_CHAR if you want a consistent format
-- When a pattern contains literal text other than special characters, wrap it in double quotes
SELECT TO_CHAR(SYSDATE, 'yyyy"년" mm"월" dd"일"')
FROM DUAL;
-- Using a pattern that is too long raises an error!
SELECT TO_CHAR(SYSDATE, 'yyyy " 년 " mm " 월 " dd " 일 " hh24 " 시 " mi " 분 " ss " 초 "')
FROM DUAL;
-- In that case, split it and join the pieces with || or CONCAT
SELECT TO_CHAR(SYSDATE, 'yyyy " 년 " mm " 월 " dd " 일 "')
|| TO_CHAR(' hh24 " 시 " mi " 분 " ss " 초 "')
FROM DUAL;
~~~
~~~sql
TO_DATE(string, pattern)
-- The pattern is the same kind of pattern used with TO_CHAR
-- But the string is only converted if it is a date-formatted string
TO_DATE('2018-10-25','yyyy-mm-dd')
-- 2018-10-25
-- The character data is converted into date data
-- To insert a date other than the current date, just insert a date-formatted string
INSERT INTO class4_info(num, name, input_date)
VALUES(8, '양세찬', '2018-10-21');
-- Since the value was not supplied via SYSDATE, it has no time-of-day information
INSERT INTO class4_info(num, name, input_date)
VALUES(9, '양세형', TO_DATE('2018-10-22', 'yyyy-mm-dd'));
-- Even without TO_DATE, the year/month/day are recognized and stored correctly
-- TO_CHAR must receive a date or a number (function arguments are type-checked)
-- Below, '2018-10-25' is a string; passing a string as the argument raises an error
SELECT TO_CHAR('2018-10-25', 'mm')
FROM DUAL;
-- When using it as a function argument, TO_DATE is required!
SELECT TO_CHAR(TO_DATE('2018-10-25', 'yyyy-mm-dd'), 'mm')
FROM DUAL;
~~~
---
### Conditional (comparison) functions
* DECODE
  - Cannot be used in PL/SQL
  - Search values and result values come in pairs, and many pairs can be supplied
  - **If the column equals a search value, the paired result value is returned**
    + **If no search value matches, the very last value is returned**
  - Use it when the code to run for each comparison is short and simple!
~~~sql
DECODE(column, search1, result1, search2, result2, ..., default_when_no_match)
-- Given deptno values 10, 20 and 30:
-- a DECODE that returns SI for 10, SM for 20, QA for 30, and Solution otherwise
DECODE(deptno, 10, 'SI',
20, 'SM',
30, 'QA', 'Solution')
-- From the employee table, query employee number, name, salary and department name
-- The department name must be printed according to the department number:
-- 10 - '개발부' (Development), 20 - '유지보수부' (Maintenance), 30 - '품질보증부' (QA), otherwise '탁구부' (Table Tennis)
SELECT empno, ename, sal,
DECODE(deptno, 10, '개발부',
20, '유지보수부', 30, '품질보증부',
'탁구부') dname
FROM emp;
~~~
* CASE
  - Can be used in PL/SQL
  - Used in the SELECT column list
  - Used to run code depending on a comparison against the column value
  - **Use it when the code to run for each comparison is long or complex**
~~~sql
SELECT CASE column
WHEN search_value THEN result
WHEN search_value THEN result
WHEN ...
ELSE default_result_when_no_match
END alias
-- Unlike DECODE, no commas are used! Be careful
-- The same problem as the DECODE example, written with CASE
SELECT empno, ename, sal, deptno,
CASE deptno WHEN 10 THEN '개발부'
WHEN 20 THEN '유지보수부'
WHEN 30 THEN '품질보증부'
ELSE '탁구부'
END dname
FROM emp;
~~~
---
### Aggregate functions
* Functions used to show a set of rows collapsed together
  - A single row is returned
* **Using them together with a column that returns multiple rows raises an error**
* **Aggregate functions cannot be used in the WHERE clause**
* Used together with GROUP BY, they return per-group aggregates
* Grouping with GROUP BY makes per-group aggregation possible
  - COUNT(), SUM(), AVG(), MAX(), MIN(), etc.
~~~sql
SELECT COUNT(empno), ename
FROM emp;
-- Error! COUNT returns one row while ename returns 14 rows
-- Oracle cannot decide which value to put next to COUNT(empno), so an error occurs
-- COUNT counts the number of records
-- Records that are NULL are not included in COUNT!
COUNT(column)
-- To count every record, use
COUNT(*)
-- SUM: sums the column values
SUM(column)
-- AVG: averages the column values
AVG(column)
-- MAX: returns the largest column value
MAX(column)
-- MIN: returns the smallest column value
MIN(column)
-- Highest salary, lowest salary, and the difference between them
SELECT MAX(sal), MIN(sal), MAX(sal)-MIN(sal)
FROM emp;
-- **Aggregate functions cannot be used in the WHERE clause
-- A subquery has to be used instead
-- From the employee table, for employees earning more than the average salary,
-- query employee number, name, salary and hire date
SELECT empno, ename, sal, hiredate
FROM emp
WHERE sal > AVG(sal); -- Error!
-- Used with GROUP BY, aggregate functions return per-group aggregates
-- Department number, headcount, salary total, salary average, and highest/lowest salary per department
SELECT deptno, COUNT(empno), SUM(sal), TRUNC(AVG(sal),2), MAX(sal), MIN(sal)
FROM emp
GROUP BY deptno;
~~~
---
### Date functions
* **ADD_MONTHS**
  - Adds months to a date
* **MONTHS_BETWEEN**
  - The difference between two dates in months
~~~sql
ADD_MONTHS(date, months_to_add)
MONTHS_BETWEEN(later_date, earlier_date)
~~~
~~~sql
-- Using + on a date adds days
SELECT SYSDATE+5
FROM DUAL;
-- 5 months from now
SELECT ADD_MONTHS(SYSDATE,5)
FROM DUAL;
-- The difference in months between now (2018-10-25) and 2019-05-25
SELECT MONTHS_BETWEEN('2019-05-25', SYSDATE)
FROM DUAL;
~~~
---
[Homework solutions](https://github.com/younggeun0/SSangYoung/blob/master/dev/query/date181025/homework.sql)
| 19.563107 | 121 | 0.61464 | kor_Hang | 1.000009 |
c0123f07af3a9bd68b268eb79469579272055111 | 1,993 | md | Markdown | README.md | theiskaa/highlightable | 2a5a89e3e844fa9124e1c75c4526b0d647176991 | [
"MIT"
] | 7 | 2021-08-17T19:19:25.000Z | 2022-03-26T20:31:55.000Z | README.md | theiskaa/highlightable | 2a5a89e3e844fa9124e1c75c4526b0d647176991 | [
"MIT"
] | 6 | 2021-10-18T05:42:08.000Z | 2022-03-21T22:30:05.000Z | README.md | theiskaa/highlightable | 2a5a89e3e844fa9124e1c75c4526b0d647176991 | [
"MIT"
] | null | null | null |
<p align="center">
<br>
<img width="500" src="https://user-images.githubusercontent.com/59066341/129483451-4196e1bb-f094-4b3c-aefc-41d77aff8117.png" alt="Package Logo">
<br>
<br>
<a href="https://pub.dev/packages/field_suggestion">
<img src="https://img.shields.io/badge/Special%20Made%20for-FieldSuggestion-blue" alt="License: MIT"/>
</a>
<a href="https://github.com/theiskaa/highlightable-text/blob/main/LICENSE">
<img src="https://img.shields.io/badge/License-MIT-red.svg" alt="License: MIT"/>
</a>
<a href="https://github.com/theiskaa/highlightable-text/blob/main/CONTRIBUTING.md">
<img src="https://img.shields.io/badge/Contributions-Welcome-brightgreen" alt="CONTRIBUTING"/>
</a>
</p>
# Overview & Usage
First, the `actualText` property and the `highlightableWord` property are required.
You can customize `actualText` by providing `defaultStyle`. You can also customize the highlighted text style with the `highlightStyle` property.
The `highlightableWord` property can be a list of strings or a single string with spaces.
You can enable word detection to focus on exact word matches. See the "Custom usage" section and the sketch below for examples.
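For illustration, highlighting several words at once could look like the sketch below (the list form of `highlightableWord` follows the description above; double-check the package docs, since the custom example further down uses the `highlightable` parameter name):
```dart
HighlightText(
  'Dart and Flutter are fun',
  // Assumption: a list of words is accepted, as described in the overview.
  highlightableWord: ['dart', 'flutter'],
  defaultStyle: TextStyle(color: Colors.black),
  highlightStyle: TextStyle(color: Colors.blue, fontWeight: FontWeight.bold),
),
```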
### Very basic usage
```dart
HighlightText(
'Hello World',
highlightableWord: 'hello',
),
```
<img width="200" alt="s1" src="https://user-images.githubusercontent.com/59066341/129080679-bfb97d11-93c5-4258-b271-0e0918e3bc22.png">
### Custom usage
```dart
HighlightText(
"Hello, Flutter!",
highlightable: "Flu, He",
caseSensitive: true, // Turn on case-sensitive.
detectWords: true, // Turn on full-word highlighting only.
defaultStyle: TextStyle(
fontSize: 25,
color: Colors.black,
fontWeight: FontWeight.bold,
),
highlightStyle: TextStyle(
fontSize: 25,
letterSpacing: 2.5,
color: Colors.white,
backgroundColor: Colors.blue,
fontWeight: FontWeight.bold,
),
),
```
<img width="220" alt="stwo" src="https://user-images.githubusercontent.com/59066341/129483513-c379f0d6-d5ba-43e1-a2d7-0722aeb5dafa.png">
| 32.672131 | 145 | 0.724034 | eng_Latn | 0.289276 |
c0134cd244b03bfcadf875f179dce9c9e5742c58 | 10,455 | md | Markdown | _posts/2019-12-24-hottest-programming-languages-to-learn-in-2020.md | vamsikollipara/vamsikollipara.github.io | 64e65a20762820ae10a192137b2565e15345bc18 | [
"MIT"
] | null | null | null | _posts/2019-12-24-hottest-programming-languages-to-learn-in-2020.md | vamsikollipara/vamsikollipara.github.io | 64e65a20762820ae10a192137b2565e15345bc18 | [
"MIT"
] | null | null | null | _posts/2019-12-24-hottest-programming-languages-to-learn-in-2020.md | vamsikollipara/vamsikollipara.github.io | 64e65a20762820ae10a192137b2565e15345bc18 | [
"MIT"
] | null | null | null |
---
layout: post
title: "Hottest Programming Languages in 2020"
author: vk
categories: [Programming, Learning]
tags: [Java, Javascript, Python, Rust, Kotlin, Go, Dart, Typescript]
image: assets/images/post1/pro-lang.png
featured: true
---
The beginning of the 21st century saw a rise in demand for programming skills. Computer science and engineering became the most preferred subjects of study among students. But now, at the end of the second decade, software engineers are no longer the only ones who own this skill set. Every graduate, irrespective of their background, is expected to know at least one of the programming languages most used by the companies recruiting them. Competition has reached the point where schools have added new subjects such as C and Python to their curricula.
There is always a fresh set of people who, irrespective of their backgrounds, come forward to learn programming. Choosing the right language for their job and career can be daunting, with around 700 programming languages to choose from (https://en.wikipedia.org/wiki/List_of_programming_languages).
To make it easier, we have come up with a comprehensive list of the most in-demand programming languages for 2020. Our analysis considers data from leading open source platforms such as GitHub and Stack Overflow.
## Front Runners
Let us have a look at the top programming languages based on their user base.
Yes, JavaScript continued its supremacy as the top programming language of 2019. But 2020 may see a new champion, and you guessed it: that will be Python. In terms of community questions asked per month, Python overtook Java in 2019 and is now chasing JavaScript for the number one spot. The growing popularity of Node.js is not helping JavaScript either, because TypeScript is taking its share of UI development.

#### Python
Python is an interpreted, high-level, general-purpose programming language. It was created by Guido van Rossum and first released in 1991, and its design philosophy emphasizes code readability with its notable use of significant whitespace.
Since it is a general-purpose programming language, you can use it for developing both desktop and web applications. You can also use Python for developing complex scientific and numeric applications, and it is designed with features that facilitate data analysis and visualization.
The latest boom in data science is clearly reflected in Python's growing popularity, as it is the most preferred language for data analytics, machine learning, data visualization and deep learning. No wonder Pandas is the most used tag among Python packages on Stack Overflow, followed by Django, Matplotlib (another data science package) and Flask.
Python is used at major tech companies for some of their major products. Some of the notable names include
+ Google
+ Microsoft
+ Amazon
+ Netflix
+ Facebook
2020 is expected to see major growth in the field of data science. If you want to be ready for a tasty challenge in 2020, train yourself in Python to solve many daily-life problems.
#### JavaScript
JavaScript: the programming language of the Internet.
JavaScript, often abbreviated as JS, is a high-level, just-in-time compiled, object-oriented programming language.
Alongside HTML and CSS, JavaScript is one of the core technologies of the World Wide Web. JavaScript enables interactive web pages and is an essential part of web applications. Most websites use it, and major web browsers have a dedicated JavaScript engine to execute it.
JavaScript was developed by Brendan Eich at Netscape. Later it became a standard for web development; it is now standardized as ECMAScript and maintained by an open, community-driven process.
The birth of Node.js in 2009 gave JavaScript a big boom in server-side development.
Although there are similarities between JavaScript and Java, including language name, syntax, and respective standard libraries, the two languages are distinct and differ greatly in design.
It is no exaggeration that JavaScript is still the most widely used programming language in terms of sheer user base. There are over 1.6 billion websites in the world, and JavaScript is used on 95% of them; that is a staggering 1.52 billion websites running JavaScript. JavaScript in its various forms is used in web development, server-side scripting, IoT and more.
2020 will not be any different for JavaScript, but the rise of TypeScript (a superset of JavaScript) may slightly reduce its share of web development. If you are thinking of venturing into web development, JavaScript is a must.
#### Java
Java is the big brother of programming languages when it comes to popularity and user base. Since its inception in 1995 (24 years old, hence the big brother) it has been one of the most sought-after skill sets in the software industry. Java is also one of the most widely taught programming languages in schools and universities, which makes it popular in dev circles from a young age.
Java is a general-purpose programming language that is class-based, object-oriented, and designed to have as few implementation dependencies as possible.
Java still dominates lists of popular programming languages such as TIOBE and the Octoverse report, and it is not going anywhere. Given its wide range of applications, it is worth learning (in case you haven't).
John Cook has an interesting post with predictions about programming language life expectancy. According to it, Java is likely to outlive many newer languages like Go, C# and Python. Given that it is already 24 years old, that is quite a long run.
## People Champion
These programming languages are garnering a good response from the programming community. Adding these skills to a developer's arsenal will be handy in 2020.
#### RUST
Rust has been the "most loved programming language" in the Stack Overflow Developer Survey every year since 2016. When 83.5% of respondents on the world's biggest developer forum love it, there must be something good going on here.
Rust is a programming language developed by Mozilla. Rust is a multi-paradigm system programming language focused on safety, especially safe concurrency. Rust is syntactically like C++ but is designed to provide better memory safety while maintaining high performance.

*“Rust is one of the fastest growing languages on GitHub” – Octaverse report.*
#### Kotlin
Behold! The Java alternative is here, and it is expected to conquer a major share of the mobile application development space.
Kotlin is a cross-platform, statically typed, general-purpose programming language with type inference. Kotlin is designed to interoperate fully with Java, and the JVM version of its standard library depends on the Java Class Library, but type inference allows its syntax to be more concise. Kotlin mainly targets the JVM, but also compiles to JavaScript or native code (via LLVM). Language development costs are borne by JetBrains, while the Kotlin Foundation protects the Kotlin trademark.
Since Android Studio 3.0, Kotlin has been included as an option alongside the standard Java compiler. The Android Kotlin compiler lets the user choose between targeting Java 6 or Java 8 compatible bytecode.
Its ability to create both Android (Kotlin/JVM) and iOS (Kotlin/Native) applications makes it a hot skill to master in 2020.

#### Dart
Dart is the fastest growing programming language on GitHub (it has been observed to grow by more than 500% in year-over-year usage). Flutter is one of GitHub's most popular open source projects, and Dart is the language used to write Flutter apps.

Dart is a cross-platform programming language originally built for Flutter by Google. It can be used to build mobile, desktop, backend and web applications. Are you doing cross-platform apps in 2020? Dart is worth a shot.
#### Typescript
TypeScript is an object-oriented, strictly typed superset of JavaScript from Microsoft. Yes, it transpiles into JavaScript, which is why some argue it is not a programming language in its own right, but TypeScript is seeing steady growth in its user base thanks to modern web UI frameworks like Angular.
You can agree or not, but the fact is that TypeScript is eating into JavaScript's market. UI development in 2020 is likely to shift even further toward TypeScript; a small example of what the compiler adds is sketched below.
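As a small, illustrative sketch (not taken from any particular project), this is the kind of error TypeScript catches at compile time before the code ever ships as JavaScript:
```ts
// Compiled with `tsc`, the output is plain JavaScript.
function total(prices: number[]): number {
  return prices.reduce((sum, p) => sum + p, 0);
}

total([9.99, 4.5]);    // fine
// total(['9.99']);    // compile-time error: string[] is not assignable to number[]
```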


The growth in TypeScript and the drop in JavaScript are not a coincidence, as both operate mostly in the UI development space.
#### Go
Go is not only loved by people but was also one of the most wanted programming languages of 2019 (Stack Overflow Developer Survey 2019). It is mostly used for building large-scale and complex software.
Go, also known as Golang, is a statically typed, compiled programming language designed at Google by Robert Griesemer, Rob Pike, and Ken Thompson. Go is syntactically similar to C, but it offers the user friendliness of dynamically typed, interpreted languages like Python, along with memory safety, garbage collection, structural typing, and CSP-style concurrency.
There are two major implementations:
+ Google's self-hosting compiler toolchain targeting multiple operating systems, mobile devices, and WebAssembly.
+ gccgo, a GCC frontend.
A third-party transpiler, GopherJS, compiles Go to JavaScript for front-end web development.
#### References:
Stack overflow
+ *[`Stack Overflow Survey`](https://insights.stackoverflow.com/survey/2019#most-loved-dreaded-and-wanted)*
+ *[`Stack overflow Trends`](https://insights.stackoverflow.com/trends?tags=go%2Ckotlin%2Ctypescript%2Crust%2Cdart)*
GitHub
+ *[`GitHut 2.0`](https://madnight.github.io/githut/#/pull_requests/2019/3)*
+ *[`The State of the Octoverse`](https://octoverse.github.com/)*
Others
+ *[`Kotlin Lang`](https://play.kotlinlang.org/hands-on/Targeting%20iOS%20and%20Android%20with%20Kotlin%20Multiplatform/01_Introduction)*
+ *[`John Cook on programming language life expectancy`](https://www.johndcook.com/blog/2017/08/19/programming-language-life-expectancy/)*
| 81.046512 | 559 | 0.798757 | eng_Latn | 0.998698 |
c0143c74fbc090010988eb8b1f1c5c57b2e67395 | 5,045 | md | Markdown | _posts/2019-10-10-designing-in-the-open.md | scotentSD/scotentSD.github.io | 92f3a11f4667e94241e7fbfec785c8789d9b1a78 | [
"MIT"
] | null | null | null | _posts/2019-10-10-designing-in-the-open.md | scotentSD/scotentSD.github.io | 92f3a11f4667e94241e7fbfec785c8789d9b1a78 | [
"MIT"
] | null | null | null | _posts/2019-10-10-designing-in-the-open.md | scotentSD/scotentSD.github.io | 92f3a11f4667e94241e7fbfec785c8789d9b1a78 | [
"MIT"
] | 2 | 2019-12-10T09:44:48.000Z | 2020-03-27T16:18:59.000Z |
---
layout: post
title: Designing in the open
author: David
---
The [Scottish Enterprise service design team](https://scotentsd.github.io/) is committed to designing in the open
<!--more-->
## What does design in the open mean?

According to Brad Frost, [designing in the open](https://bradfrost.com/blog/post/designing-in-the-open/) means
>sharing your work and/or process publicly as you undertake a design project
It's not about making your code open source (although it could include that). It's about:
- Sharing bits of things as you build them
- Sharing versions of things as they evolve and change
- Sharing techniques and resources
- Sharing our experiences and stories
For us, designing in the open means sharing all the documents and artefacts and prototypes and anything else we generate publicly by default.
Unless there is a good reason not to - for example, if it contains personal or potentially sensitive information - we make it as widely available as we can as soon as possible. Often, that means our work is available while we are still working on it.
## Why design in the open?
Well, we're a public body, so why shouldn't the outputs of our work be available to the people who fund it?
But the process brings many benefits:
- Faster (potentially instantaneous) feedback
- Feedback from a wider group
- Generating buy-in from the community and stakeholders
- Demonstrating our commitment (and work in progress)
- Communicating all day, every day
- Reduced cost of communication within the team, even if some work remotely
But mostly, it's just about being open, honest, and transparent. And we reduce our communication overhead because we don't need to tell people about what we've done if it is right in front of them any time they want it.
## Some history
Towards the end of 2008, both @numbat70 (Martin Kerr) and @david-obrien (writing) worked for Scottish Enterprise's web team.
We were asked by our director (this was like our boss's boss's boss's boss) if we could completely redesign, rewrite and rebuild scottish-enterprise.com by the 1st of April.
So, no pressure then. 3 months to completely tear down and rebuild a 5,000 page website.
But we did it. We had no time to contract anyone else to do it, so we did it ourselves.
We did a lot of things we would call agile now, though neither one of us was particularly aware of the term in 2008.
We put together a small, multi-disciplinary team. Mostly content writers; I did front end design and some development. @numbat70 did most of the back end dev.
We used [Gerry McGovern](https://twitter.com/gerrymcgovern)'s [top tasks](http://www.customercarewords.com/services/customer-top-task-identification/) methodology to identify user needs.
Most importantly, we all worked in the same small room. We talked, every day, about our work. We helped each other out. We didn't wait for anyone else to catch up. We wrote and coded and designed and built together, but independently.
No dependencies.
In the end, we got there. We redesigned, rewrote, rebuilt and shipped on 1 April 2009. We got a 10,000 page site down to <500 pages by focusing on user needs.
The results? Ambiguous. Analytics data was still mostly based on server logs in those days. Google Analytics was still a twinkle in my old mate Sergey's eye.
Stakeholders hated it. And everyone complained about "their" content going missing (even though "their" content had not, so far as we could tell, been read by anyone in over 6 months.)
### Then on Friday night we went to the Arches
That's just for SEO.
In truth, we started to blog at http://digital.scotentblog.co.uk/. It's a WordPress blog, dead simple, far away from our internal processes and controls. It just made things easier.
Important people (cf. boss's boss's boss's boss) actually read it. Which, er, surprised us.
Astonished is probably a better word.
Which is all good, but wasn't making a difference.
### Then we moved to the junkyard

AKA our old Paisley office. A beautiful early 20th century building with lovely art deco tiling in the wally close and the shonkiest interior ever.
We had the first floor. Half of it was off limits because, well, there were holes and stuff. And if you opened a window it might fall out onto the street and decapitate a passing buddy.
But it was great. Apart from the trains.
We were properly agile.
We had small teams and no communications overhead.
We had proper, actual, for real kanban boards that we actually stood in front of and talked about our work. Every day.
There was loads of wall-space, and plenty of whiteboards
It felt easy, and it felt natural. It felt good.
We plastered that place with post-its and ideas. Some of them came to fruition, most didn't. Some of those that came to fruition worked, most didn't.
We tried stuff, we discarded the stuff that wasn't working, we improved the stuff that was.
And we got results.
| 47.149533 | 250 | 0.772052 | eng_Latn | 0.999652 |
c014594a7b8937db0094abf4c55baccfe86e4d2a | 1,917 | md | Markdown | CSP_Solver-master/README.md | magdaleenaaaa/TP-Sudoku | 55b2be19ab894c55346bd152ac902418e6af5b2a | [
"MIT"
] | 1 | 2021-01-04T19:07:11.000Z | 2021-01-04T19:07:11.000Z | CSP_Solver-master/README.md | magdaleenaaaa/TP-Sudoku | 55b2be19ab894c55346bd152ac902418e6af5b2a | [
"MIT"
] | null | null | null | CSP_Solver-master/README.md | magdaleenaaaa/TP-Sudoku | 55b2be19ab894c55346bd152ac902418e6af5b2a | [
"MIT"
] | 16 | 2020-01-14T14:09:42.000Z | 2020-02-04T11:57:26.000Z |
CSP Solver
Hello everyone,
To run the program you must have Python 3.4 or later installed. The files search.py, original.py and utils.py were taken from the original AIMA code (https://github.com/aimacode/aima-python) by Peter Norvig as the base solver for general CSPs.
- original.py, utils.py and search.py are not required to run main.py or test.py; they were added just to show the original solver's performance in docs/original_results.txt
- The sudokuCSP class builds the objects the general CSP solver needs (neighbors, variables, domains) to model a sudoku game as a constraint satisfaction problem
- The csp.py file contains slightly modified code from original.py (the csp.py file of the AIMA code); it does not use rarely needed functions from other files (such as utils.py), and the methods that were changed are tagged with the comment "@modifié"
- The csp.py file contains algorithms for solving CSPs, such as backtracking with the 3 different inference methods, MRV and a few other methods (see the sketch after this list)
- The class in gui.py contains the code that builds a graphical user interface for the game, letting the user choose a level (there are 6 different boards) and an inference method for backtracking
- The test.py file contains code to run tests for each inference algorithm on a sudoku board (it is possible to choose which board)
- The Main.py file contains a main method to run the GUI program
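For illustration, running the solver programmatically could look roughly like this sketch (the `sudokuCSP` constructor arguments and import paths are assumptions, not taken from this repository):
```python
# Illustrative sketch only -- check sudokuCSP.py and test.py for the real API.
from csp import backtracking_search, mrv, forward_checking  # AIMA-style helpers
from sudokuCSP import sudokuCSP                             # assumed import path

puzzle = "003020600900305001..."        # assumed 81-character board encoding, 0 = empty
board = sudokuCSP(puzzle)               # builds variables, domains and neighbors

solution = backtracking_search(
    board,
    select_unassigned_variable=mrv,     # minimum-remaining-values heuristic
    inference=forward_checking,         # one of the three inference methods
)
print(solution)
```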
Two files were added showing the difference between the original csp.py file from the AIMA code and the modified one (check modified_results.txt and original_results.txt).
Feel free to read the documentation for more information, or to send a message to Danush, Axel, Célia or myself, Laura.
Happy reading, everyone
| 71 | 275 | 0.798122 | fra_Latn | 0.995407 |
c01470dbe1687dfa0737107d79afb5eda64b07a7 | 813 | md | Markdown | docs/api/@remirror/core-extensions/core-extensions.historyextensionoptions.getdispatch.md | jankeromnes/remirror | 95306cee4c76ee9fd7271a0ab6069f0a0a6803d9 | [
"MIT"
] | 1 | 2021-05-22T06:22:01.000Z | 2021-05-22T06:22:01.000Z | docs/api/@remirror/core-extensions/core-extensions.historyextensionoptions.getdispatch.md | jankeromnes/remirror | 95306cee4c76ee9fd7271a0ab6069f0a0a6803d9 | [
"MIT"
] | null | null | null | docs/api/@remirror/core-extensions/core-extensions.historyextensionoptions.getdispatch.md | jankeromnes/remirror | 95306cee4c76ee9fd7271a0ab6069f0a0a6803d9 | [
"MIT"
] | null | null | null |
<!-- Do not edit this file. It is automatically generated by API Documenter. -->
[Home](./index.md) > [@remirror/core-extensions](./core-extensions.md) > [HistoryExtensionOptions](./core-extensions.historyextensionoptions.md) > [getDispatch](./core-extensions.historyextensionoptions.getdispatch.md)
## HistoryExtensionOptions.getDispatch() method
Provide a custom dispatch getter function for embedded editors
<b>Signature:</b>
```typescript
getDispatch?(): DispatchFunction;
```
<b>Returns:</b>
`DispatchFunction`
## Remarks
This is only needed when the extension is part of a child editor, e.g. `ImageCaptionEditor`<!-- -->. By passing in the `getDispatch` method history actions can be dispatched into the parent editor allowing them to propagate into the child editor.
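A short TypeScript sketch of what passing `getDispatch` can look like (the surrounding editor objects here are assumptions for illustration, not part of this API page):
```typescript
import { HistoryExtension } from '@remirror/core-extensions';

// Assumes `parentView` is the parent editor's ProseMirror view, created elsewhere.
const history = new HistoryExtension({
  // Forward undo/redo transactions to the parent editor.
  getDispatch: () => parentView.dispatch,
});
```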
| 36.954545 | 247 | 0.740467 | eng_Latn | 0.868851 |
c0155bd6702341b6d48e1c3190f85d42faa3276e | 352 | md | Markdown | vendor/tw-city-selector-master/docs/zipcode.md | mypetertw/ianphp-framework | c4265b3aeb68021b2816f5ab08eac3ec73d70bc9 | [
"MIT"
] | null | null | null | vendor/tw-city-selector-master/docs/zipcode.md | mypetertw/ianphp-framework | c4265b3aeb68021b2816f5ab08eac3ec73d70bc9 | [
"MIT"
] | null | null | null | vendor/tw-city-selector-master/docs/zipcode.md | mypetertw/ianphp-framework | c4265b3aeb68021b2816f5ab08eac3ec73d70bc9 | [
"MIT"
] | null | null | null |
# Postal Code Table (郵遞區號表)
<dl id="zipcode-list" class="zipcode-list">
<template v-for="(county, i) in counties">
<dt>{{ county }}</dt>
<dd>
<span v-for="(district, n) in districts[i][0]">
{{ district }}
{{ districts[i][1][n] }}
</span>
</dd>
</template>
</dl>
<script>
new Vue({
el: '#zipcode-list',
data: window.data
});
</script>
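For illustration, the `window.data` object this template appears to expect would look something like the sketch below (the shape is an assumption, and the county/district values are placeholders, not taken from the project):
```js
// Assumed shape: counties[i] pairs with districts[i] = [districtNames, zipCodes].
window.data = {
  counties: ['臺北市', '新北市'],
  districts: [
    [['中正區', '大同區'], ['100', '103']],
    [['板橋區', '三重區'], ['220', '241']],
  ],
};
```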
| 16.761905 | 48 | 0.534091 | eng_Latn | 0.362141 |
c015b4bae1552dbcad08d8d73b16177a58f46eac | 919 | md | Markdown | README.md | MHHukiewitz/bonfida-api-wrapper-python3 | c4db0c5ea1e294e0bb43bafa63c96c4d474b168a | [
"MIT"
] | 5 | 2021-03-08T07:21:02.000Z | 2021-12-12T01:09:39.000Z | README.md | MHHukiewitz/bonfida-api-wrapper-python3 | c4db0c5ea1e294e0bb43bafa63c96c4d474b168a | [
"MIT"
] | 2 | 2021-04-22T12:55:39.000Z | 2021-04-30T09:56:49.000Z | README.md | MHHukiewitz/bonfida-api-wrapper-python3 | c4db0c5ea1e294e0bb43bafa63c96c4d474b168a | [
"MIT"
] | 2 | 2021-04-27T15:39:17.000Z | 2021-04-30T22:24:08.000Z |
# Bonfida-Trader
## Warning
This is an UNOFFICIAL wrapper for Bonfida [HTTP API](https://docs.bonfida.com/) written in Python 3.7
The library can be used to fetch market data or create third-party clients
USE THIS WRAPPER AT YOUR OWN RISK; I WILL NOT BE RESPONSIBLE FOR ANY LOSSES
## Features
- Implementation of all [public](#) endpoints
## Donate
If useful, buy me a coffee?
- ETH: 0xA9D89A5CAf6480496ACC8F4096fE254F24329ef0 (brendanc.eth)
- SOL: AQBqATwRqbU8odBL3RCFovzLbHR13MuoF2v53QpmjEV3
## Installation
$ git clone https://github.com/LeeChunHao2000/bonfida-api-wrapper-python3
- This wrapper requires [requests](https://github.com/psf/requests)
## Quickstart
This is an introduction to getting started with the Bonfida client. First, make sure the library is installed.
And then:
from Bonfida.client import Client
client = Client('', '')
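A hedged sketch of fetching market data follows; the `get_pairs` method name is hypothetical and not confirmed against this wrapper's code, so check the `Client` class for the real method names:

    # Illustrative only: replace `get_pairs` with the actual Client method.
    pairs = client.get_pairs()
    print(pairs)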
### Version Logs
#### 2021-03-07
- Birth!
| 23.564103 | 117 | 0.746464 | eng_Latn | 0.719986 |
c0160f8a3359376269038840d89fb9961a5cb747 | 887 | md | Markdown | docs/csharp/misc/cs1057.md | AlejandraHM/docs.es-es | 5f5b056e12f9a0bcccbbbef5e183657d898b9324 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/csharp/misc/cs1057.md | AlejandraHM/docs.es-es | 5f5b056e12f9a0bcccbbbef5e183657d898b9324 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/csharp/misc/cs1057.md | AlejandraHM/docs.es-es | 5f5b056e12f9a0bcccbbbef5e183657d898b9324 | [
"CC-BY-4.0",
"MIT"
] | null | null | null |
---
title: Compiler Error CS1057
ms.date: 07/20/2015
f1_keywords:
- CS1057
helpviewer_keywords:
- CS1057
ms.assetid: 6f247cfd-6d26-43b8-98d9-0a6d7c115cad
ms.openlocfilehash: 3617d83b81894476dc7635b962d8c70462dc573f
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 05/04/2018
ms.locfileid: "33304014"
---
# <a name="compiler-error-cs1057"></a>Compiler Error CS1057
'member': static classes cannot contain protected members
This error is generated when a protected member is declared inside a static class.
## <a name="example"></a>Example
The following example generates CS1057:
```csharp
// CS1057.cs
using System;
static class Class1
{
protected static int x; // CS1057
public static void Main()
{
}
}
```
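One way to resolve the error is to drop the `protected` accessibility, since a static class can never be derived from (this corrected version is illustrative and not part of the original page):
```csharp
// CS1057 fixed: static classes cannot contain protected members,
// so use private (or internal) accessibility instead.
using System;

static class Class1
{
    private static int x;

    public static void Main()
    {
    }
}
```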
| 24.638889 | 88 | 0.720406 | spa_Latn | 0.663011 |
c0164d7a3154a66bd9113c36a453e5a6875fc171 | 38 | md | Markdown | README.md | Jonle/jonleServer | 12cf3a9ccffde8a29eb1ca3e9d8c038d165b08f9 | [
"Unlicense"
] | null | null | null | README.md | Jonle/jonleServer | 12cf3a9ccffde8a29eb1ca3e9d8c038d165b08f9 | [
"Unlicense"
] | null | null | null | README.md | Jonle/jonleServer | 12cf3a9ccffde8a29eb1ca3e9d8c038d165b08f9 | [
"Unlicense"
] | null | null | null |
# jonleServer
My blog server - Node.js
| 12.666667 | 23 | 0.736842 | tur_Latn | 0.249483 |
c0167e849b8a8d78d7887e19b03e7795f1c064c2 | 30 | md | Markdown | 0x00-python-hello_world/README.md | Trice254/alx-higher_level_programming | b49b7adaf2c3faa290b3652ad703914f8013c67c | [
"MIT"
] | null | null | null | 0x00-python-hello_world/README.md | Trice254/alx-higher_level_programming | b49b7adaf2c3faa290b3652ad703914f8013c67c | [
"MIT"
] | null | null | null | 0x00-python-hello_world/README.md | Trice254/alx-higher_level_programming | b49b7adaf2c3faa290b3652ad703914f8013c67c | [
"MIT"
] | 4 | 2018-03-25T16:21:13.000Z | 2021-03-28T08:27:18.000Z |
Feature-rich GTK+3 app demo written in go
c017b8f84fc21a0c86343031fa47df06ec7e06eb | 2,425 | md | Markdown | examples/cool_app/README.md | romychs/gotk3 | 116e09e25f5e2b197e29e6ebbb06d889101bd2a7 | [
"0BSD"
] | 32 | 2018-03-14T15:08:34.000Z | 2020-12-22T19:25:18.000Z | examples/cool_app/README.md | romychs/gotk3 | 116e09e25f5e2b197e29e6ebbb06d889101bd2a7 | [
"0BSD"
] | 1 | 2018-03-25T16:25:15.000Z | 2018-04-04T13:05:19.000Z | examples/cool_app/README.md | romychs/gotk3 | 116e09e25f5e2b197e29e6ebbb06d889101bd2a7 | [
"0BSD"
] | 4 | 2018-03-25T16:21:13.000Z | 2021-03-28T08:27:18.000Z | Feature-rich GTK+3 app demo written in go
=========================================
Aim of this application is to interconnect GLIB/GTK+ components and widgets together
with use of helpful code patterns, meet the real needs of developers.
It's obligatory to have GTK+ 3.12 and high installed, otherwise app will not compile
(GtkPopover, GtkHeaderBar require more recent GTK+3 installation).
Features of this demonstration:
1) All actions in application implemented via GLIB's GAction component working as entry point
for any activity in application. Advanced use of GAction utilize
"[action with states and parameters](https://developer.gnome.org/GAction/)"
with seamless integration of such actions to UI menus.
2) New widgets such as GtkHeaderBar, GtkPopover and others are used.
3) Good example of preference dialog with save/restore functionality.
4) Helpful code pattern are present: dialogs and message boxes demo
working with save/restore settings (via GSettings),
fullscreen mode on/off, actions with states or stateless and others.
Screenshots
-----------
Main form with about dialog:

Main form and modern popover menu:

Main form and classic main menu:

One of few dialogs demo:

Preference dialog with save/restore settings functionality:

Installation
------------
Almost no action needed, the main requirement is to have the GOTK3 library preliminary installed.
Still, to make a "preference dialog" function properly scripts `gs_install_schema.sh`/`gs_uninstall_schema.sh`
must be used, to install/compile [GLIB setting's schema](https://developer.gnome.org/GSettings/).
To run application, type in console `go run ./cool_app.go`.
Additional recommendations
-------------------------
- Use GNOME application `gtk3-icon-browser` to find themed icons available at your linux desktop installation.
Contact
-------
Please use [Github issue tracker](https://github.com/d2r2/gotk3/issues)
for filing bugs or feature requests.
| 41.810345 | 110 | 0.769072 | eng_Latn | 0.935475 |
c01a3973e369661d4a9d81458996edf296076fdd | 298 | md | Markdown | 2017/todo_2017.md | wiix/ccjava-overview | 42e525317417416f54801e585fdf1f6c42209fa2 | [
"Apache-2.0"
] | null | null | null | 2017/todo_2017.md | wiix/ccjava-overview | 42e525317417416f54801e585fdf1f6c42209fa2 | [
"Apache-2.0"
] | null | null | null | 2017/todo_2017.md | wiix/ccjava-overview | 42e525317417416f54801e585fdf1f6c42209fa2 | [
"Apache-2.0"
] | null | null | null |
# Goals I hope to accomplish in 2017
### Todos
- Master certificate-based user management and secure transport.
- A multi-threaded executor with a GUI based on JavaFX (or WPF): shows execution status in real time and can manage threads.
- An extensible permission framework built around the concept of "user groups": annotation-driven, able to handle data-level permissions and permission propagation.
- Master OAuth 2.0 and build a real application with it.
- Use Netty to implement a centralized multi-node network service, and build a multi-user group chat demo with it.
- Master one machine learning API.
- Master one face detection API.
- Complete a ticketing (work order) system.
- Build all front ends with RESTful APIs + Bootstrap + AngularJS[/Vue.js]
| 21.285714 | 52 | 0.738255 | yue_Hant | 0.577437 |
c01aa049091f2f72cf280c694e8bfe22680e4f7e | 95 | md | Markdown | general-todos.md | hboutemy/incubator-wayang | a91bd3615432cd8f3bf44bd2f4b99b5d8df3cdba | [
"Apache-2.0"
] | null | null | null | general-todos.md | hboutemy/incubator-wayang | a91bd3615432cd8f3bf44bd2f4b99b5d8df3cdba | [
"Apache-2.0"
] | null | null | null | general-todos.md | hboutemy/incubator-wayang | a91bd3615432cd8f3bf44bd2f4b99b5d8df3cdba | [
"Apache-2.0"
] | null | null | null |
# Here are all the To-Do's that are general
## TODO: Explain the structure of the scala folder
| 31.666667 | 50 | 0.747368 | eng_Latn | 0.999537 |
c01b08c939c5612ddda0bbad260e42c33092dd25 | 22,090 | md | Markdown | config.md | jmhbnz/humacs | 063873150d1977203a34d9ca4a743040e02c1c75 | [
"BSD-2-Clause"
] | 10 | 2020-07-29T20:30:16.000Z | 2022-03-08T00:43:47.000Z | config.md | jmhbnz/humacs | 063873150d1977203a34d9ca4a743040e02c1c75 | [
"BSD-2-Clause"
] | 39 | 2020-07-29T01:35:46.000Z | 2021-08-29T23:59:13.000Z | config.md | jmhbnz/humacs | 063873150d1977203a34d9ca4a743040e02c1c75 | [
"BSD-2-Clause"
] | 7 | 2020-07-29T01:34:15.000Z | 2022-02-22T13:40:32.000Z |
- [Humac Human Deets](#org19ea609)
- [Ergonomics](#orgfdecd2e)
- [Better Local Leaders](#org6ec54d3)
- [Use mouse scroll](#org6337bb1)
- [lispy vim](#org73d8410)
- [Consistency](#orgcbe28e3)
- [consistent paths](#org00d844c)
- [Appearance](#org7f80cdc)
- [Fonts](#org1556a9a)
- [Theme](#org018a0eb)
- [Indent](#orga40c35d)
- [LSP Behaviour](#orge6cf9b5)
- [Languages](#org1fc6b65)
- [Web](#org06cc1eb)
- [Go](#orgdd10467)
- [Vue](#org864ca0c)
- [Org](#org0586042)
- [Show properties when cycling through subtrees](#org8c25a2b)
- [ASCII colours on shell results](#orgb9be192)
- [Literate!](#org1d5d058)
- [SQL](#orgb686a95)
- [Go](#org09c422e)
- [Pairing](#org4bfcb1e)
- [Exporting](#orgd168ca6)
- [Sane Org Defaults](#org4911799)
- [Support Big Query](#org08572fb)
- [Snippets](#org07738c1)
- [org-mode](#org96ae34f)
- [Blog Property](#orgece52b0)
- [Dashboard](#org76a3931)
- [Banners](#org6ea8080)
- [user configs](#org15bf9e3)
- [init.el](#org501945e)
- [Patch for when using emacs 28+](#org9dd9a75)
- [Doom! block](#org73b79ac)
- [packages.el](#org6a2581d)
- [ii-packages](#org13860bd)
- [upstream](#orga8e224e)
[](spacemacs-config/banners/img/kubemacs.png)
<a id="org19ea609"></a>
# Humac Human Deets
On a sharing.io cluster, we should have these two env vars set…so we can personalize to the person who started the instance. Otherwise, they’re just a friend.
```elisp
(setq user-full-name (if (getenv "GIT_AUTHOR_NAME")
(getenv "GIT_AUTHOR_NAME")
"ii friend")
user-mail-address (if (getenv "GIT_COMMIT_EMAIL")
(getenv "GIT_COMMIT_EMAIL")
"ii*ii.ii"))
```
<a id="orgfdecd2e"></a>
# Ergonomics
<a id="org6ec54d3"></a>
## Better Local Leaders
I got used to using comma as the localleader key, from spacemacs, so i keep it.
```elisp
(setq doom-localleader-key ",")
```
<a id="org6337bb1"></a>
## Use mouse scroll
```elisp
(defun scroll-up-5-lines ()
"Scroll up 5 lines"
(interactive)
(scroll-up 5))
(defun scroll-down-5-lines ()
"Scroll down 5 lines"
(interactive)
(scroll-down 5))
(global-set-key (kbd "<mouse-4>") 'scroll-down-5-lines)
(global-set-key (kbd "<mouse-5>") 'scroll-up-5-lines)
```
<a id="org73d8410"></a>
## lispy vim
This sets up keybindings for manipuulating parenthesis with slurp and barf when in normal or visual mode.
```elisp
(map!
:map smartparens-mode-map
:nv ">" #'sp-forward-slurp-sexp
:nv "<" #'sp-forward-barf-sexp
:nv "}" #'sp-backward-barf-sexp
:nv "{" #'sp-backward-slurp-sexp)
```
<a id="orgcbe28e3"></a>
# Consistency
<a id="org00d844c"></a>
## consistent paths
If you are using a Mac, you can have problems running source blocks or some language support because the shell PATH isn't visible to Emacs. [exec-path-from-shell](https://github.com/purcell/exec-path-from-shell) is a solution for this.
```elisp
(when (memq window-system '(mac ns x)) (exec-path-from-shell-initialize))
```
<a id="org7f80cdc"></a>
# Appearance
<a id="org1556a9a"></a>
## Fonts
;; Doom exposes five (optional) variables for controlling fonts in Doom. Here ;; are the three important ones:
```elisp
;(setq doom-font (font-spec :family "Source Code Pro" :size 10)
; ;; )(font-spec :family "Source Code Pro" :size 8 :weight 'semi-light)
; doom-serif-font (font-spec :family "Source Code Pro" :size 10)
; doom-variable-pitch-font (font-spec :family "Source Code Pro" :size 10)
; doom-unicode-font (font-spec :family "Input Mono Narrow" :size 12)
; doom-big-font (font-spec :family "Source Code Pro" :size 10))
```
<a id="org018a0eb"></a>
## Theme
```elisp
(setq doom-theme 'doom-gruvbox)
```
<a id="orga40c35d"></a>
## Indent
```elisp
(setq standard-indent 2)
```
<a id="orge6cf9b5"></a>
## LSP Behaviour
This brings over the lsp behaviour of spacemacs, so working with code feels consistent across emacs..
```elisp
(use-package! lsp-ui
:config
(setq lsp-navigation 'both)
(setq lsp-ui-doc-enable t)
(setq lsp-ui-doc-position 'top)
(setq lsp-ui-doc-alignment 'frame)
(setq lsp-ui-doc-use-childframe t)
(setq lsp-ui-doc-use-webkit t)
(setq lsp-ui-doc-delay 0.2)
(setq lsp-ui-doc-include-signature nil)
(setq lsp-ui-sideline-show-symbol t)
(setq lsp-ui-remap-xref-keybindings t)
(setq lsp-ui-sideline-enable t)
(setq lsp-prefer-flymake nil)
(setq lsp-print-io t))
```
<a id="org1fc6b65"></a>
# Languages
<a id="org06cc1eb"></a>
## Web
auto-closing tags works different if you are in a terminal or gui. We want consistent behaviour when editing any sort of web doc. I also like it to create a closing tag when i’ve starteed my opening tag, which is auto-close-style 2
```elisp
(setq web-mode-enable-auto-closing t)
(setq-hook! web-mode web-mode-auto-close-style 2)
```
<a id="orgdd10467"></a>
## Go
Go is enabled, with LSP support in our [init.el](init.el). To get it working properly, though, you want to ensure you have all the go dependencies installed on your computer and your GOPATH set. It’s recommended you read the doom docs on golang, following all links to ensure your dependencies are up to date. [Go Docs](file:///Users/hh/humacs/doom-emacs/modules/lang/go/README.md)
I’ve had inconsistencies with having the GOPATH set on humacs boxes, so if we are in a humacs pod, explicitly set the GOPATH
```elisp
(when (and (getenv "HUMACS_PROFILE") (not (getenv "GOPATH")))
(setenv "GOPATH" (concat (getenv "HOME") "/go")))
```
<a id="org864ca0c"></a>
## Vue
Tried out vue-mode, but it was causing more problems than benefits and doesn’t seem to do much beyond what web-mode plus vue-lsp support would do. So, following [Gene Hack’s Blog Post](https://genehack.blog/2020/08/web-mode-eglot-vetur-vuejs-=-happy/), we’ll create our own mode, that just inherits all of web-mode and adds lsp. This requires for [vls](https://npmjs.com/vls) to be installed.
```elisp
(define-derived-mode ii-vue-mode web-mode "iiVue"
"A major mode derived from web-mode, for editing .vue files with LSP support.")
(add-to-list 'auto-mode-alist '("\\.vue\\'" . ii-vue-mode))
(add-hook 'ii-vue-mode-hook #'lsp!)
```
<a id="org0586042"></a>
# Org
Various settings specific to org-mode to satisfy our preferences
<a id="org8c25a2b"></a>
## Show properties when cycling through subtrees
This is an adjustment to the default hook, which hides drawers by default
```elisp
(setq org-cycle-hook
' (org-cycle-hide-archived-subtrees
org-cycle-show-empty-lines
org-optimize-window-after-visibility-change))
```
<a id="orgb9be192"></a>
## ASCII colours on shell results
```elisp
(defun ek/babel-ansi ()
(when-let ((beg (org-babel-where-is-src-block-result nil nil)))
(save-excursion
(goto-char beg)
(when (looking-at org-babel-result-regexp)
(let ((end (org-babel-result-end))
(ansi-color-context-region nil))
(ansi-color-apply-on-region beg end))))))
(add-hook 'org-babel-after-execute-hook 'ek/babel-ansi)
```
<a id="org1d5d058"></a>
# Literate!
<a id="orgb686a95"></a>
## SQL
```elisp
(setq org-babel-default-header-args:sql-mode
'((:results . "replace code")
(:product . "postgres")
(:wrap . "SRC example")))
```
<a id="org09c422e"></a>
## Go
```elisp
(setq org-babel-default-header-args:go
'((:results . "replace code")
(:wrap . "SRC example")))
```
<a id="org4bfcb1e"></a>
## Pairing
```elisp
(use-package! graphviz-dot-mode)
(use-package! sql)
(use-package! ii-utils)
(use-package! ii-pair)
(after! ii-pair
(osc52-set-cut-function)
)
;;(use-package! iterm)
;;(use-package! ob-tmate)
```
<a id="orgd168ca6"></a>
## Exporting
```elisp
(require 'ox-gfm)
```
<a id="org4911799"></a>
## Sane Org Defaults
In addition to the org defaults, we want to make sure our exports include results, but that we don't try to run all our tmate commands again.
```elisp
(setq org-babel-default-header-args
'((:session . "none")
(:results . "replace code")
(:comments . "org")
(:exports . "both")
(:eval . "never-export")
(:tangle . "no")))
(setq org-babel-default-header-args:shell
'((:results . "output code verbatim replace")
(:wrap . "example")))
```
<a id="org08572fb"></a>
## Support Big Query
```elisp
(defun ii-sql-comint-bq (product options &optional buf-name)
"Create a bq shell in a comint buffer."
;; We may have 'options' like database later
;; but for the most part, ensure bq command works externally first
(sql-comint product options buf-name)
)
(defun ii-sql-bq (&optional buffer)
"Run bq by Google as an inferior process."
(interactive "P")
(sql-product-interactive 'bq buffer)
)
(after! sql
(sql-add-product 'bq "Google Big Query"
:free-software nil
;; :font-lock 'bqm-font-lock-keywords ; possibly later?
;; :syntax-alist 'bqm-mode-syntax-table ; invalid
:prompt-regexp "^[[:alnum:]-]+> "
;; I don't think we have a continuation prompt
;; but org-babel-execute:sql-mode requires it
;; otherwise re-search-forward errors on nil
;; when it requires a string
:prompt-cont-regexp "3a83b8c2z93c89889a4c98r2z34"
;; :prompt-length 9 ; can't precalculate this
:sqli-program "bq"
:sqli-login nil ; probably just need to preauth
:sqli-options '("shell" "--quiet" "--format" "pretty")
:sqli-comint-func 'ii-sql-comint-bq
)
)
```
<a id="org07738c1"></a>
# Snippets
These are helpful text expanders made with yasnippet
<a id="org96ae34f"></a>
## org-mode
<a id="orgece52b0"></a>
### Blog Property
Creates a property drawer with all the necessary info for our blog.
```snippet
# -*- snippet -*-
# name: blog
# key: <blog
# --
** ${1:Enter Title}
:PROPERTIES:
:EXPORT_FILE_NAME: ${1:$(downcase(replace-regexp-in-string " " "-" yas-text))}
:EXPORT_DATE: `(format-time-string "%Y-%m-%d")`
:EXPORT_HUGO_MENU: :menu "main"
:EXPORT_HUGO_CUSTOM_FRONT_MATTER: :summary "${2:No Summary Provided}"
:END:
${3:"Enter Tags"$(unless yas-modified-p (progn (counsel-org-tag)(kill-whole-line)))}
```
<a id="org76a3931"></a>
# Dashboard
<a id="org6ea8080"></a>
## Banners
```elisp
(setq
;; user-banners-dir
;; doom-dashboard-banner-file "img/kubemacs.png"
doom-dashboard-banner-dir (concat humacs-spacemacs-directory (convert-standard-filename "/banners/"))
doom-dashboard-banner-file "img/kubemacs.png"
fancy-splash-image (concat doom-dashboard-banner-dir doom-dashboard-banner-file)
)
```
<a id="org15bf9e3"></a>
# user configs
Place your user config in doom-config/users/<your username>.org under the humacs directory; it is loaded automatically if it exists.
```elisp
(defun pair-or-user-name ()
"Getenv SHARINGIO_PAIR_NAME if exists, else USER"
(if (getenv "SHARINGIO_PAIR_USER")
(getenv "SHARINGIO_PAIR_USER")
(getenv "USER")))
(setq humacs-doom-user-config (expand-file-name (concat humacs-directory "doom-config/users/" (pair-or-user-name) ".org")))
(if (file-exists-p humacs-doom-user-config)
(progn
(org-babel-load-file humacs-doom-user-config)
)
)
;; once all personal vars are set, reload the theme
(doom/reload-theme)
;; for some reason this isn't loading
;; and doesn't exist an config.org time
;; (doom-dashboard/open) ;; our default screen
```
<a id="org501945e"></a>
# init.el
:header-args:emacs-lisp+ :tangle init.el :header-args:elisp+ :results silent :tangle init.el
<a id="org9dd9a75"></a>
## Patch for when using emacs 28+
```elisp
;; patch to [email protected]
;; https://www.reddit.com/r/emacs/comments/kqd9wi/changes_in_emacshead2828050_break_many_packages/
(defmacro define-obsolete-function-alias ( obsolete-name current-name
&optional when docstring)
"Set OBSOLETE-NAME's function definition to CURRENT-NAME and mark it obsolete.
\(define-obsolete-function-alias \\='old-fun \\='new-fun \"22.1\" \"old-fun's doc.\")
is equivalent to the following two lines of code:
\(defalias \\='old-fun \\='new-fun \"old-fun's doc.\")
\(make-obsolete \\='old-fun \\='new-fun \"22.1\")
WHEN should be a string indicating when the function was first
made obsolete, for example a date or a release number.
See the docstrings of `defalias' and `make-obsolete' for more details."
(declare (doc-string 4)
(advertised-calling-convention
;; New code should always provide the `when' argument
(obsolete-name current-name when &optional docstring) "23.1"))
`(progn
(defalias ,obsolete-name ,current-name ,docstring)
(make-obsolete ,obsolete-name ,current-name ,when)))
```
<a id="org73b79ac"></a>
## Doom! block
```elisp
(doom! :input
;;chinese
;;japanese
:os
(tty +osc)
:completion
company ; the ultimate code completion backend
helm ; the *other* search engine for love and life
;;ido ; the other *other* search engine...
;;ivy ; a search engine for love and life
:ui
deft ; notational velocity for Emacs
doom ; what makes DOOM look the way it does
doom-dashboard ; a nifty splash screen for Emacs
doom-quit ; DOOM quit-message prompts when you quit Emacs
; fill-column ; a `fill-column' indicator
hl-todo ; highlight TODO/FIXME/NOTE/DEPRECATED/HACK/REVIEW
;;hydra
;;indent-guides ; highlighted indent columns
;minimap ; show a map of the code on the side
modeline ; snazzy, Atom-inspired modeline, plus API
;;nav-flash ; blink cursor line after big motions
;;neotree ; a project drawer, like NERDTree for vim
ophints ; highlight the region an operation acts on
(popup +defaults) ; tame sudden yet inevitable temporary windows
;; pretty-code ; ligatures or substitute text with pretty symbols
;;tabs ; a tab bar for Emacs
treemacs ; a project drawer, like neotree but cooler
unicode ; extended unicode support for various languages
window-select ; visually switch windows
vc-gutter ; vcs diff in the fringe
vi-tilde-fringe ; fringe tildes to mark beyond EOB
workspaces ; tab emulation, persistence & separate workspaces
zen ; distraction-free coding or writing
:editor
(evil +everywhere) ; come to the dark side, we have cookies
file-templates ; auto-snippets for empty files
fold ; (nigh) universal code folding
(format +onsave) ; automated prettiness
;;god ; run Emacs commands without modifier keys
;;lispy ; vim for lisp, for people who don't like vim
multiple-cursors ; editing in many places at once
;;objed ; text object editing for the innocent
;;parinfer ; turn lisp into python, sort of
;;rotate-text ; cycle region at point between text candidates
snippets ; my elves. They type so I don't have to
word-wrap ; soft wrapping with language-aware indent
:emacs
dired ; making dired pretty [functional]
electric ; smarter, keyword-based electric-indent
ibuffer ; interactive buffer management
(undo +tree) ; persistent, smarter undo for your inevitable mistakes
vc ; version-control and Emacs, sitting in a tree
:term
eshell ; the elisp shell that works everywhere
;;shell ; simple shell REPL for Emacs
;;term ; basic terminal emulator for Emacs
;vterm ; the best terminal emulation in Emacs
:checkers
syntax ; tasing you for every semicolon you forget
;;spell ; tasing you for misspelling mispelling
;;grammar ; tasing grammar mistake every you make
:tools
;;ansible
debugger ; FIXME stepping through code, to help you add bugs
direnv
docker
editorconfig ; let someone else argue about tabs vs spaces
ein ; tame Jupyter notebooks with emacs
(eval +overlay) ; run code, run (also, repls)
;;gist ; interacting with github gists
lookup ; navigate your code and its documentation
(lsp +peek)
macos ; MacOS-specific commands
magit ; a git porcelain for Emacs
make ; run make tasks from Emacs
pass ; password manager for nerds
;; pdf ; pdf enhancements
;;prodigy ; FIXME managing external services & code builders
rgb ; creating color strings
;;taskrunner ; taskrunner for all your projects
terraform ; infrastructure as code
tmux ; an API for interacting with tmux
;;upload ; map local to remote projects via ssh/ftp
:lang
;;agda ; types of types of types of types...
;;cc ; C/C++/Obj-C madness
clojure ; java with a lisp
;;common-lisp ; if you've seen one lisp, you've seen them all
;;coq ; proofs-as-programs
;;crystal ; ruby at the speed of c
;;csharp ; unity, .NET, and mono shenanigans
;;data ; config/data formats
;;(dart +flutter) ; paint ui and not much else
;;elixir ; erlang done right
;;elm ; care for a cup of TEA?
emacs-lisp ; drown in parentheses
;;erlang ; an elegant language for a more civilized age
;;ess ; emacs speaks statistics
;;faust ; dsp, but you get to keep your soul
;;fsharp ; ML stands for Microsoft's Language
;;fstar ; (dependent) types and (monadic) effects and Z3
;;gdscript ; the language you waited for
(go +lsp) ; the hipster dialect
;;(haskell +dante) ; a language that's lazier than I am
;;hy ; readability of scheme w/ speed of python
;;idris ;
json ; At least it ain't XML
;;(java +meghanada) ; the poster child for carpal tunnel syndrome
javascript ; all(hope(abandon(ye(who(enter(here))))))
;;julia ; a better, faster MATLAB
;;kotlin ; a better, slicker Java(Script)
latex ; writing papers in Emacs has never been so fun
;;lean
;;factor
;;ledger ; an accounting system in Emacs
lua ; one-based indices? one-based indices
markdown ; writing docs for people to ignore
;;nim ; python + lisp at the speed of c
;;nix ; I hereby declare "nix geht mehr!"
;;ocaml ; an objective camel
       (org +present +pomodoro +pandoc +hugo) ; organize your plain life in plain text
;;php ; perl's insecure younger brother
;;plantuml ; diagrams for confusing people more
;;purescript ; javascript, but functional
python ; beautiful is better than ugly
;;qt ; the 'cutest' gui framework ever
racket ; a DSL for DSLs
;;raku ; the artist formerly known as perl6
;;rest ; Emacs as a REST client
;;rst ; ReST in peace
(ruby +rails) ; 1.step {|i| p "Ruby is #{i.even? ? 'love' : 'life'}"}
;;rust ; Fe2O3.unwrap().unwrap().unwrap().unwrap()
;;scala ; java, but good
;;scheme ; a fully conniving family of lisps
sh ; she sells {ba,z,fi}sh shells on the C xor
;;sml
;;solidity ; do you need a blockchain? No.
;;swift ; who asked for emoji variables?
;;terra ; Earth and Moon in alignment for performance.
web ; the tubes
yaml ; JSON, but readable
:email
;;(mu4e +gmail)
;;notmuch
;;(wanderlust +gmail)
:app
calendar
irc ; how neckbeards socialize
(rss +org) ; emacs as an RSS reader
;;twitter ; twitter client https://twitter.com/vnought
:config
;; literate ; don't use literate when manually tangling
(default +bindings +smartparens))
```
<a id="org6a2581d"></a>
# packages.el
:header-args:emacs-lisp+ :tangle packages.el :header-args:elisp+ :results silent :tangle packages.el
<a id="org13860bd"></a>
## ii-packages
```elisp
(package! ii-utils :recipe
(:host github
:branch "master"
:repo "ii/ii-utils"
:files ("*.el")))
(package! ii-pair :recipe
(:host github
:branch "main"
:repo "humacs/ii-pair"
:files ("*.el")))
```
<a id="orga8e224e"></a>
## upstream
```elisp
(package! sql)
(package! ob-sql-mode)
(package! ob-tmux)
(package! ox-gfm) ; org dispatch github flavoured markdown
(package! kubernetes)
(package! kubernetes-evil)
(package! exec-path-from-shell)
(package! tomatinho)
(package! graphviz-dot-mode)
(package! feature-mode)
(package! almost-mono-themes)
```
| 32.485294 | 410 | 0.606428 | eng_Latn | 0.786726 |
c01b4c23398e0f7280f9c3826c408ba2b073b5b4 | 1,985 | md | Markdown | readme.md | XenotriX1337/TRPL-Interpreter | 976962090b2c2d0ade9d02567b83bd643e4ab9d2 | [
"MIT"
] | null | null | null | readme.md | XenotriX1337/TRPL-Interpreter | 976962090b2c2d0ade9d02567b83bd643e4ab9d2 | [
"MIT"
] | null | null | null | readme.md | XenotriX1337/TRPL-Interpreter | 976962090b2c2d0ade9d02567b83bd643e4ab9d2 | [
"MIT"
] | null | null | null | # The Reactive Programming Language
> This is the interpreter for TRPL
> The specification for the language itself can be found [here](https://github.com/XenotriX1337/TRPL/blob/draft/specification.md)
The Reactive Programming Language or TRPL is a language developed as part of a school project.
What makes this language different from others is that when a term is assigned to a variable, the term is not evaluated before being assigned; instead, the variable stores the term itself.
So, for example if you have this code:
``` js
var x = 5
var y = x + 2
x = 10
print y
```
most languages would output `7` whereas in TRPL this produces the output `12`.
Now, you might be wondering if this can be useful or not and to be honest I have no clue.
So, you shouldn't think of TRPL as the main programming language for your next project but as an experiment to try to think differently.
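To give a feel for how this can work under the hood, here is a rough sketch in C++ (chosen only because the interpreter is built with CMake, Bison and flex; this is **not** the interpreter's actual code, and every name in it is made up for illustration). The idea: a variable is a mutable cell that stores an expression, other expressions reference the cell itself rather than a snapshot of its value, and evaluation only happens when the value is read.
```cpp
#include <iostream>
#include <memory>
// Illustrative only: a variable is a cell holding an expression (a term),
// and evaluation is deferred until the value is actually read.
struct Expr {
    virtual ~Expr() = default;
    virtual int eval() const = 0;
};
struct Literal : Expr {
    int value;
    explicit Literal(int v) : value(v) {}
    int eval() const override { return value; }
};
// Other expressions hold a pointer to the cell itself, so a later
// assignment to the cell is visible to everything that references it.
struct Cell : Expr {
    std::shared_ptr<Expr> term;
    int eval() const override { return term->eval(); }
};
struct Add : Expr {
    std::shared_ptr<Expr> lhs, rhs;
    Add(std::shared_ptr<Expr> l, std::shared_ptr<Expr> r) : lhs(l), rhs(r) {}
    int eval() const override { return lhs->eval() + rhs->eval(); }
};
int main() {
    auto x = std::make_shared<Cell>();
    x->term = std::make_shared<Literal>(5);                            // var x = 5
    auto y = std::make_shared<Cell>();
    y->term = std::make_shared<Add>(x, std::make_shared<Literal>(2));  // var y = x + 2
    x->term = std::make_shared<Literal>(10);                           // x = 10
    std::cout << y->eval() << std::endl;                               // print y -> 12, not 7
}
```
The real interpreter may represent terms very differently; the point is only that `y` holds a reference to `x`'s cell, so re-assigning `x` changes what `y` evaluates to.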
## Getting Started
To use the interpreter, you will need to build it yourself.
### Linux
1. Clone this repository
2. Install **GNU/Bison** and **flex**
3. Create a `build` directory
4. Inside your `build` directory run `cmake ..`
5. Once the cmake configuration is done, compile with `make`
### Windows
I have not yet tried to compile this on Windows, but since there is no platform-specific code, the cmake file should work.
> If you are a Windows user and have built this project, I would appreciate it a lot if you could help me write this guide.
## Contributing
All contributions and suggestions are welcome.
For suggestions and bugs, please file an issue.
For direct contributions, please file a pull request.
> If you'd like to contribute to the language itself, I invite you to do so on the [Specification](https://github.com/xenotrix1337/TRPL) repository.
## Author
Tibo Clausen
[email protected]
## License
This project is licensed under the MIT License. For more information see the [LICENSE](https://github.com/XenotriX1337/TRPL-Interpreter/blob/master/LICENSE) file.
| 32.016129 | 189 | 0.763224 | eng_Latn | 0.999081 |
c01b76fd151916ccb150abe31e7c26ad6f9e606f | 15,668 | md | Markdown | Teaching - 9ho - Daedam9 - Full Article/Daedam9 Translated by J_Lee_77777.md | TranslatingTathagata/Translations | 52dadbc55b9f8bf42a70110a1812944c0a062f14 | [
"MIT-advertising"
] | 2 | 2018-11-20T09:15:24.000Z | 2019-09-26T10:35:02.000Z | Teaching - 9ho - Daedam9 - Full Article/Daedam9 Translated by J_Lee_77777.md | TranslatingTathagata/Translations | 52dadbc55b9f8bf42a70110a1812944c0a062f14 | [
"MIT-advertising"
] | 3 | 2019-03-05T03:38:51.000Z | 2019-03-29T04:20:06.000Z | Teaching - 9ho - Daedam9 - Full Article/Daedam9 Translated by J_Lee_77777.md | TranslatingTathagata/Translations | 52dadbc55b9f8bf42a70110a1812944c0a062f14 | [
"MIT-advertising"
] | 1 | 2019-03-05T03:34:52.000Z | 2019-03-05T03:34:52.000Z | Sam Han Lee, Chairman, Natural Sciences Studies
VS
Ki Whan Kim, Moon Woo Lee, Hwoi Sim Jeong
Recently, both in this country and overseas, there has been a noticeable emergence of the application of Chi (*Tai-Chi*, meditation) in many popular training methods used to cleanse the mind, address social issues, and bring about social discourse?
Today, the word vitality is used quite often within the human society, and to achieve vitality, we see many people practicing Tai-Chi or meditation.
Chi is not visible to human eyes, and as a result, there are many questions surrounding it as it is difficult to understand even if one is to learn about it from a well versed person.
Despite that, many individuals do not realize how risky their actions may be when they follow others’ advice without verifying its truth.
First, let’s find out what Chi refers to.
Chi refers to energy and heightened feeling of ambience without seeing. Chi can be further divided into ‘vitality’ and ‘lethargy’ as two forms of heightened feeling of ambience or energy.
Vitality is the live energy, the energy that keeps oneself while lethargy defines the pure energy, the energy that does not keep oneself.
Additionally, vitality can be further categorized into two forms. The first form refers to the vitality contained in plant and animal foods, which is absorbed by the various physical organs as those foods are eaten and digested.
This type of vitality transforms into energy that is necessary for the body when one moves from one location to another as needed, and such ingested vitality becomes the source of the energy in performing life activities for the person.
As such, there do not appear any notable side-effects when the energy contained in some substance with a form is ingested by a new subject and converted into a source of energy for that subject, through the absorption of the original energy by other substances of the new subject.
However, the other kind of vitality, which is taken in through many training methods and is also the subject of the question, does not refer to energy that is absorbed by the physical organs; rather, it refers to energy that travels throughout the body and becomes the source of diseases and other issues.
The vitality that is ingested by eating objects such as grains, plants, ascidians, fish, and meat gets immediately absorbed as its energy enters the body.
However, if people take in the vitality that is in the air, rather than through the ingestion of objects, then the energy that enters the body will cause many diseases and problems.
The reason is that such vitality is a live energy, which keeps itself. Upon entering your body, instead of getting absorbed, this energy travels throughout the body, preventing normal activities by causing breakages, and at times, it overtakes sense of awareness by attaching itself to the conscious system.
Therefore, applying many of the trending methods, such as xx Tai-Chi, xx Meditation, Brain Respiration, xx Moral Technique, or xx Self Cultivation, in order to achieve vitality not only leads one into danger but can also bring grave results, killing off one’s spirit as consciousness may be lost.
Some people claim that they were healed of diseases after absorbing Chi. But this is a very dangerous thing. Although on rare occasions it is possible to heal diseases that cannot be healed by modern medicine, if someone feels comfort from absorbing Chi while fighting illness, the reality is that he feels that comfort because he has lost his awareness, which makes his spirit even more miserable.
Lethargy refers to the pure energy that does not hold oneself.
Lethargy does not contain much nutrients or energy. When we drink water, the energy contained in water enters our body. This is lethargy.
As this type of Chi is very small, it does not provide much help during the activities of people or cells. That’s how small the energy is.
Although both vitality and lethargy are in the air, the majority of the energy that people absorb during normal breathing is lethargy. This lethargy never damages people’s health. No matter how much is absorbed, lethargy does not cause harm.
However, the vitality that is in the air holds oneself, so it resists easily entering the human body. At the same time, if people absorb vitality without fearing it, great harm will come to the human world.
Through the media, you are most likely familiar with the issue associated with a form of Tai-Chi, Falun Gong. The practitioners are said to be inhaling the vitality in the air as they perform exercise-like poses.
People do not usually perform health exercises when recommended. Also, people do not easily follow instructions if they were asked to unite while doing health exercises.
Furthermore, when asked to propagate and share health exercises because they are good things, people do not do so. Finally, no one will protest against the government should the government prevent people from performing health exercises. No one will consider health exercises a subject of protest.
However, Falun Gong, a form of Tai-Chi, exercised its influence and pressed the Chinese government to legalize it. As a result, the Chinese government undertook an investigation into Falun Gong and ordered the protesting organization to disband.
Why is such an event occurring? As in Korea, a great many changes are taking place within Chinese society. What do these societal changes refer to?
It refers to this vitality exercise or the act of absorbing spirits into people’s bodies.
It is easy to see that there are exercises that connect spirits and people everywhere in the world. If a spirit gets to control people, anything, including mob action, can be made possible. Many things that are beyond our common sense can occur.
There are countless organizations that are active, fronting Chi in our society as well.
Recently, they have said that one can improve academic performance through brain respiration training. This is hard to understand through common sense.
A certain student who was in the middle of his class improved his academic performance to the top level, vying to be 1st or 2nd in the class, after starting brain respiration training. Very dangerously, people who lack a firm grasp of reality can easily be drawn into such claims.
How did that student come to excel in academics? When people run high temperatures, the brain activities become hindered. In such conditions, even little things cause one to become overly sensitive and deteriorate the ability to memorize.
In other words, this is a phenomenon that occurs as blood does not easily flow to the brain due to some disabilities in the blood vessels, either inside the chest or head.
This phenomenon occurs in the back of the head: The rear brain weakens and causes anemia in the brain, thus inducing a rise in the temperature in the area. This in turn causes brain activity to get impeded, leading to deterioration in the ability to accept reality, hence resulting in a drop in the academic performance.
In such a situation, by resolving the problems that impede the student’s heart activity or improving blood flow to the head by removing the problems that cause poor blood circulation, the student will recover normal brain activity. From that point, depending on the student’s efforts, his grades will eventually improve.
However, it is rare for a child, who suffers from short attention span, to improve his focus and suddenly raise academic performance in a short time. If the source of the problem or the process is not understood, such occurrence may be described as a miracle. However, when the source of the problem or the process is learned, there is, in fact, no miracle.
If something unusual occurs without any cause or process, we must investigate the circumstances. If such circumstances are not clearly investigated, it will be very difficult for people to understand the things of this world.
If a student experiences sudden changes in his academic performance, after practicing brain respiration for a few months, then his sense of awareness is being controlled by the energy that had entered him from the air.
There are many examples through which these occurrences can be explained.
In Tibet, there is a group of people, called Rimpoche.
The word Rimpoche refers to the reincarnation of a dead person. A certain Rimpoche is able to speak before reaching the age of two, claims to be someone from the past, and remembers and recognizes certain items or speech that were used in that past life. Such occurrences are not normal; they are out of the ordinary.
Other people need to learn a language over a long time after birth, and come to remember things only after seeing them. Then how was it possible for this child to speak and recognize things right after birth?
That is because a spirit of a dead person has entered the child and mistakes himself as being reincarnated, borrowing the child’s body, or pretends to be reincarnated, speaking and acting, using the child’s physical abilities.
As a simple example, a smart person with a college degree has died, but his or her spirit was unable to enter the samsaric cycle; after wandering about for a while, it enters another person’s body and controls that person’s consciousness. The person whose consciousness is being controlled is then subconsciously able to know the answers to test problems.
A person, having another being in his body through spiritual contact, would know what kind of problems will appear on tests as the content will be reflected in his consciousness, even prior to receiving the test paper.
This is not a case of performing well through hard work. This is not a case of seeing by a living consciousness. Rather it is a case of seeing through a spirit of a dead person.
If such occurrences take place after brain respiration exercises then this is an indication of another living energy, entering oneself. By having a clear understanding of such process, one can see that performing brain respiration exercises will make oneself very unfortunate.
However, parents still insist on sending kids who perform poorly in school to brain respiration exercises. In fact, it is perfectly OK for kids to do poorly at school.
Depending on the basis of the person, there are many who do well in life even though their performance at school is poor. Test scores do not determine one’s future.
It can be possible to improve a child’s performance at school through brain respiration exercises, but one must be cautious that by doing this, it can be possible to ruin the child’s future by having the child lose himself and become a strange person.
As such, it is very dangerous for people to accept the vitality that is in the air.
If, as they claim, vitality is good and necessary for people, and it is possible to take in the nutrients that are in food by inhaling the air, then we do not need to eat.
After we eat food, its waste exits our body in the form of urine and excretion, while the vitality that is in the food gets absorbed and used. But if we can pick and choose what vitality to inhale from the air, then there really is no need for us to be eating any food.
Scientifically speaking, does not even air have mass? Then one’s weight should increase upon stepping onto a scale after inhaling a lot of the vitality in the air, just as it would after eating food.
We must remember that it is dangerous to believe something as reality without any real proof.
How should one understand the identity of Chi? The answer lies in seeing the reality.
All energy is in objects and moving life forms. The world follows the law of the jungle. The strong preys on the weak. Even a holy man must eat grain or plants in order to live.
However, since plants are life forms as well, everyone eats a life form to live. It is because vitality exists in life forms. It is perfectly safe for people to eat the vitality that is in such life forms, but to take in the vitality in the air is very dangerous.
Eating well and sleeping well is a good thing. However, not eating well and not sleeping well indicates a problem.
Everyone must guard against taking in the vitality that’s in the air, but in general, the vitality does not easily come into contact with those that stay busy. Also, those who are hard working do not have interest in Tai-Chi or some strange activities.
However, people who have a lot of free time, are unable to focus, and have built up stress seek out-of-the-ordinary ways to resolve their problems. Hence, having an interest in strange things leads to doing strange things, and eventually strange things materialize around such a person.
The vitality in the air sets up traps for those people who come in contact with it, and it either enters the body of a trapped person or contacts those who lack caution or have a weak sense of awareness.
If something like this happens to someone, he may do certain unwanted things without hesitation. That occurs as he would have lost his awareness. These are very unfortunate occurrences for the individual.
The mass media is doing very irresponsible things in our society.
Without any knowledge of Tai-Chi or brain respiration exercises, the media invites those who practice them, giving them platforms, to drum up people’s interests.
This is nothing more than the mass media companies taking the lead in providing the sources that muddy and destroy the society itself.
A guest, who calls himself some enlightened person on TV, demonstrates Tai-Chi and claims that if one has reached a high consciousness level, one can easily levitate in the air.
What exactly are this high consciousness level and this high-level skill? What is the relationship between one’s good life and levitating in the air in strange clothes?
If it is true just because they speak as if they are great and truly excellent, then why is it that they are surfacing only recently in the dark society of our country, even though countless people have lived throughout the thousands and tens of thousands of years?
This occurrence can be simply explained as an event in the end of the world.
For some unexplained reason, the mass media in our society is leading the introduction and pushing of uninvestigated things.
Especially the TV programs often show them to stimulate people’s interests by fanning the falsehood that these things are in fact proven truths and helping people to do those things. Such happenings will bring about great misfortunes to our society in the future.
By observing those occurrences, one can conclude that, in reality, unproven things become the subject of people’s interest in a dark, politically confused society.
One must continuously be reminded that one can be greatly harmed by simply believing what others say in today’s world.
We must constantly keep our guard up against such things. If not, one must always assume that harm may come one’s way when least expected.
By ignoring this aphorism and accepting unknown vitality, one will lose one’s own life forever and will become the source of harm brought to many other people.
Everyone must observe and understand the things that exist in the world. If anyone pays attention to this conversation, verifies it, and learns, one will bless oneself with a life of health, a good life, and the fruits of a good life.
| 489.625 | 7,757 | 0.796209 | eng_Latn | 0.999984 |
c01b784f0d17fb8b87cb702bacb0b513d6605de9 | 267 | md | Markdown | README.md | OptimusKe/LoveLiveSimulator | af53c76588a1b91b61af6bde75f5215029f1c769 | [
"MIT"
] | 10 | 2015-01-11T14:12:16.000Z | 2019-03-22T12:11:54.000Z | README.md | OptimusKe/LoveLiveSimulator | af53c76588a1b91b61af6bde75f5215029f1c769 | [
"MIT"
] | null | null | null | README.md | OptimusKe/LoveLiveSimulator | af53c76588a1b91b61af6bde75f5215029f1c769 | [
"MIT"
] | null | null | null | LoveLiveSimulator
======
Simulate Love Live Mobile Game (http://lovelive.bushimo.jp/)

Third Party Libraries
======
- SDWebImage
- google-plus-ios-sdk
Author
======
OptimusKe ([email protected])
| 15.705882 | 64 | 0.715356 | kor_Hang | 0.201841 |
c0200a22b0178dce538c2f64a3deb43f1645fa20 | 3,054 | md | Markdown | CHANGELOG.md | AnthonyNahas/ngx-mailto | e6759a9596fd2ded44e538a683e0dabb080ae049 | [
"MIT"
] | 10 | 2020-11-30T07:36:17.000Z | 2022-03-02T19:39:26.000Z | CHANGELOG.md | AnthonyNahas/ngx-mailto | e6759a9596fd2ded44e538a683e0dabb080ae049 | [
"MIT"
] | 2 | 2020-11-27T08:40:19.000Z | 2020-12-02T08:48:55.000Z | CHANGELOG.md | AnthonyNahas/ngx-mailto | e6759a9596fd2ded44e538a683e0dabb080ae049 | [
"MIT"
] | null | null | null | ## [1.0.6](https://github.com/anthonynahas/ngx-mailto/compare/1.0.5...1.0.6) (2021-06-15)
### Bug Fixes
* **lib:** upgraded angular to v12 ([61d416f](https://github.com/anthonynahas/ngx-mailto/commit/61d416faf498378d00953ec0bb1736232980401f))
## [1.0.5](https://github.com/anthonynahas/ngx-mailto/compare/1.0.4...1.0.5) (2021-04-20)
### Bug Fixes
* **lib:** updated the dependencies ([8fa9f89](https://github.com/anthonynahas/ngx-mailto/commit/8fa9f896bd68469c63335f60b01398990faec5e7))
## [1.0.4](https://github.com/anthonynahas/ngx-mailto/compare/1.0.3...1.0.4) (2021-04-15)
### Bug Fixes
* **project:** updated the dependencies ([c4094b4](https://github.com/anthonynahas/ngx-mailto/commit/c4094b43ba948b5e1175f131aa6c76de16b9c614))
## [1.0.3](https://github.com/anthonynahas/ngx-mailto/compare/1.0.2...1.0.3) (2021-01-18)
## [1.0.2](https://github.com/anthonynahas/ngx-mailto/compare/1.0.1...1.0.2) (2021-01-18)
## [1.0.1](https://github.com/anthonynahas/ngx-mailto/compare/1.0.0...1.0.1) (2020-11-30)
# 1.0.0 (2020-11-26)
### Bug Fixes
* **demo:** improved the demo app ([43cfa4a](https://github.com/anthonynahas/ngx-mailto/commit/43cfa4af29da9b24c6789d18cfc072c3ce56610a))
* **lib:** added angular cli ghpages package for deploy ([1aee14d](https://github.com/anthonynahas/ngx-mailto/commit/1aee14d601771fd5b5cf63c5c5be568a58d3df4b))
* **lib:** added release-it package ([7179938](https://github.com/anthonynahas/ngx-mailto/commit/7179938ced32265115cbbf98a383f35571c6be2d))
* **lib:** added support of schematics ([f3b32f5](https://github.com/anthonynahas/ngx-mailto/commit/f3b32f523f8d3afbd1a4f3fd6514a9feda040f5b))
* **lib:** improved the compose function ([5f826d7](https://github.com/anthonynahas/ngx-mailto/commit/5f826d7949f68b2f8625a35ab52f99c72b1524b2))
* **lib:** improved the ngx-mailto.service.ts ([e692d91](https://github.com/anthonynahas/ngx-mailto/commit/e692d912d6d194591c10affd72258b44371a9b13))
* **lib:** refactor code ([51053a6](https://github.com/anthonynahas/ngx-mailto/commit/51053a6ffaec4d5ff2c964c9ff9e58c352dda468))
* **project:** minor ([e63a8de](https://github.com/anthonynahas/ngx-mailto/commit/e63a8dea20d5bdab802a48ed21c9a93fe307830e))
* **project:** minor ([4ee79b5](https://github.com/anthonynahas/ngx-mailto/commit/4ee79b51d57df71abc9678fa3197947d34052fad))
* **project:** minor refactoring ([171158f](https://github.com/anthonynahas/ngx-mailto/commit/171158f94ccee42653137fcdc0d8f636c0de324e))
### Features
* **demo:** added angular material ([ea0e634](https://github.com/anthonynahas/ngx-mailto/commit/ea0e634273b453d0f05a90ff27d08db0e3940bbb))
* **demo:** added ngx-markdown package ([8b8f132](https://github.com/anthonynahas/ngx-mailto/commit/8b8f132acddb5a50bc55762b4963743f0689918a))
* **project:** added angular universal support ([644588a](https://github.com/anthonynahas/ngx-mailto/commit/644588a623396bfbe2be900d7cc21f3a638e27d9))
* **project:** generated the `ngx-mailto` library ([ceb3308](https://github.com/anthonynahas/ngx-mailto/commit/ceb3308dffb1cc63fd7d4301e2c2670132b55b6e))
| 58.730769 | 159 | 0.764898 | yue_Hant | 0.334935 |
c0224a535b4ce3077b6a14631825d4dd58773159 | 95 | md | Markdown | README.md | stretchmaniac/pixel-physics | df51717eef9a2dda0af9df462aa8774d675b2398 | [
"MIT"
] | 1 | 2020-04-02T07:14:11.000Z | 2020-04-02T07:14:11.000Z | README.md | stretchmaniac/pixel-physics | df51717eef9a2dda0af9df462aa8774d675b2398 | [
"MIT"
] | null | null | null | README.md | stretchmaniac/pixel-physics | df51717eef9a2dda0af9df462aa8774d675b2398 | [
"MIT"
] | null | null | null | # pixel-physics
A sandbox for creating discrete physical systems, such as cellular automata.
| 31.666667 | 77 | 0.8 | eng_Latn | 0.963688 |
c02392609f6a53c80c2975d1e7e118a38c385b09 | 218 | md | Markdown | _watches/M20190927_025107_TLP_4.md | Meteoros-Floripa/meteoros.floripa.br | 7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad | [
"MIT"
] | 5 | 2020-05-19T17:04:49.000Z | 2021-03-30T03:09:14.000Z | _watches/M20190927_025107_TLP_4.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | null | null | null | _watches/M20190927_025107_TLP_4.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | 2 | 2020-05-19T17:06:27.000Z | 2020-09-04T00:00:43.000Z | ---
layout: watch
title: TLP4 - 27/09/2019 - M20190927_025107_TLP_4T.jpg
date: 2019-09-27 02:51:07
permalink: /2019/09/27/watch/M20190927_025107_TLP_4
capture: TLP4/2019/201909/20190926/M20190927_025107_TLP_4T.jpg
---
| 27.25 | 62 | 0.784404 | yue_Hant | 0.063673 |
c0239b54ba22e5445e59f5ee4becfa4af1f48c19 | 4,234 | md | Markdown | README.md | hmcts/ccd-case-disposer | 6f8aa633de476182c7c39cb2174b4da2c542b737 | [
"MIT"
] | 1 | 2021-09-23T10:04:02.000Z | 2021-09-23T10:04:02.000Z | README.md | hmcts/ccd-case-disposer | 6f8aa633de476182c7c39cb2174b4da2c542b737 | [
"MIT"
] | 3 | 2022-01-08T23:38:41.000Z | 2022-02-21T17:17:56.000Z | README.md | hmcts/ccd-case-disposer | 6f8aa633de476182c7c39cb2174b4da2c542b737 | [
"MIT"
] | 1 | 2022-03-03T13:52:08.000Z | 2022-03-03T13:52:08.000Z | # ccd-case-disposer
[](https://travis-ci.org/hmcts/ccd-case-disposer)
## Purpose
This micro-service disposes off case records after a certain period of inactivity after a significant business event
## Getting Started
### Prerequisites
- [JDK 11](https://java.com)
### Building the application
The project uses [Gradle](https://gradle.org) as a build tool. It already contains
`./gradlew` wrapper script, so there's no need to install gradle.
To build the project execute the following command:
```bash
./gradlew build
```
### Running the application
Create the image of the application by executing the following command:
```bash
./gradlew assemble
```
Create docker image:
```bash
docker-compose build
```
Run the distribution (created in `build/install/ccd-case-disposer` directory)
by executing the following command:
```bash
docker-compose up
```
This will start the API container and immediately attempt to delete qualifying case records from the data sources it's configured to connected to.
### Alternative script to run application
To skip all the setting up and building, just execute the following command:
```bash
./bin/run-in-docker.sh
```
For more information:
```bash
./bin/run-in-docker.sh -h
```
Script includes bare minimum environment variables necessary to start api instance. Whenever any variable is changed or any other script regarding docker image/container build, the suggested way to ensure all is cleaned up properly is by this command:
```bash
docker-compose rm
```
It clears stopped containers correctly. Might consider removing clutter of images too, especially the ones fiddled with:
```bash
docker images
docker image rm <image-id>
```
There is no need to remove postgres and java or similar core images.
## Developing
### Unit tests
To run all unit tests execute the following command:
```bash
./gradlew test
```
### Integration tests
To run all integration tests execute the following command:
```bash
./gradlew integration
```
### Functional tests
The functional tests require Elasticsearch, which is not enable by default on the local `ccd-docker` setup, thus it should be enabled along with logstash with this command:
```bash
./ccd enable elasticsearch logstash
```
The next step is to get both `ccd-definition-store-api` and `ccd-data-store-api` to use Elasticsearch and this is done by exporting the following environment variables:
```bash
export ES_ENABLED_DOCKER=true
export ELASTIC_SEARCH_ENABLED=$ES_ENABLED_DOCKER
export ELASTIC_SEARCH_FTA_ENABLED=$ES_ENABLED_DOCKER
```
Indices of the relevant case types are expected to be present in the Elasticsearch instance for the tests to work.
The easiest way to get the indices created is to run the `ccd-data-store-api` functional tests at least once prior to running these functional tests.
This is especially useful when testing locally.
When the above steps are completed, run the functional tests using the following command:
```bash
./gradlew functional
```
> Note: These are the tests run against an environment.
> Please see [ccd-docker/README.md](./ccd-docker/README.md) for local environment testing.
>
> If you would like to test against AAT dependencies then run `docker-compose up`.
> Also set the required environment variables that can be found by reviewing the contents of this project's
> [Jenkinsfile_CNP](./Jenkinsfile_CNP) script (particularly the `secrets` mappings, and the variables set by
> the `setBeftaEnvVariables` routine).
>
### Code quality checks
We use [checkstyle](http://checkstyle.sourceforge.net/) and [PMD](https://pmd.github.io/).
To run all checks execute the following command:
```bash
./gradlew clean checkstyleMain checkstyleTest checkstyleIntegrationTest pmdMain pmdTest pmdIntegrationTest
```
To run all checks alongside the unit tests execute the following command:
```bash
./gradlew checks
```
or to run all checks, all tests and generate a code coverage report execute the following command:
```bash
./gradlew check integration functional jacocoTestReport
```
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details
| 28.608108 | 251 | 0.770194 | eng_Latn | 0.994154 |
c023b5085f873f100fc80dbd20d4c6855c9637b6 | 806 | md | Markdown | mnemonic-examples/README.md | yzz127/mnemonic | d31eb241a94559a01bb6c68150309b4959d538ff | [
"Apache-2.0"
] | 61 | 2017-12-06T17:02:13.000Z | 2022-03-08T01:10:55.000Z | mnemonic-examples/README.md | yzz127/mnemonic | d31eb241a94559a01bb6c68150309b4959d538ff | [
"Apache-2.0"
] | 48 | 2017-12-18T17:36:44.000Z | 2022-03-24T06:15:08.000Z | mnemonic-examples/README.md | yzz127/mnemonic | d31eb241a94559a01bb6c68150309b4959d538ff | [
"Apache-2.0"
] | 29 | 2017-12-17T04:02:39.000Z | 2021-11-21T21:06:45.000Z | # Mnemonic Examples
The examples demonstrate how to use the [Mnemonic](http://mnemonic.apache.org/).
To run the default example [Main](src/main/java/org/apache/mnemonic/examples/Main.java):
```bash
$ # requires 'vmem' memory service to run, please refer to the code of test cases for details.
$ mvn exec:exec -Pexample -pl mnemonic-examples
```
To run a specific example by providing the example name under the [examples](src/main/java/org/apache/mnemonic/examples):
```bash
$ mvn exec:exec -Pexample -pl mnemonic-examples -Dexample.name=<the example name> [-Dexample.args="<arguments separated by space>"]
```
For how to run the examples in the [Docker](https://www.docker.com), please refer to the [Docker usage](http://mnemonic.apache.org/docs/docker.html) at the documentation of Mnemonic.
| 42.421053 | 182 | 0.746898 | eng_Latn | 0.974321 |
c024152e2b3f0a200ec4c82a743a338d023c86b1 | 4,104 | md | Markdown | README.md | Joy-mark-AG/react-hook-cart | 212788dad37adf6ca8f38c7df4e7809e68927a01 | [
"MIT"
] | 2 | 2021-01-18T00:28:22.000Z | 2021-01-20T01:12:52.000Z | README.md | Joy-mark-AG/react-hook-cart | 212788dad37adf6ca8f38c7df4e7809e68927a01 | [
"MIT"
] | 1 | 2021-01-17T21:49:12.000Z | 2021-01-17T21:49:12.000Z | README.md | Joy-mark-AG/react-hook-cart | 212788dad37adf6ca8f38c7df4e7809e68927a01 | [
"MIT"
] | 2 | 2021-01-17T21:06:14.000Z | 2021-05-18T18:07:19.000Z | # react-hook-cart
<div align="center">
[](https://www.typescriptlang.org/)
[](https://bundlephobia.com/[email protected])
</div>
🛒 This is a typescript, hook using shopping cart lib, persistent by default, that I'm hopeful will help a few people out.
<h2>📦 Installation</h2>
$ npm install react-hook-cart
<h2>📖 Example</h2>
Check out the <a href="https://codesandbox.io/s/react-hook-cart-example-gnxl1">Demo</a>.
<h2>🕹 API</h2>
#### 🔗 `CartProvider`
This is a Provider Component to wrapper around your entire app(or a section of it) in order to create context for the cart.
- `storage` can take other methods to store cart, default uses localStorage.
```tsx
import { CartProvider } from "react-hook-cart";
<CartProvider>
<App />
</CartProvider>
```
#### 🔗 `useCart()`
Function to expose cart functionality
```tsx
import { useCart } from "react-hook-cart";
const { items, isEmpty, totalCost, addItem, removeItem, clearCart } = useCart();
```
#### 🔗 `items`
`items` in an `Item` array
```tsx
import { useCart } from "react-hook-cart";
const { items } = useCart();
const ShowCart = () => {
return (
<div>
<ul>
{items.map((item) => (
<li>{item.id}</li>
))}
</ul>
</div>
);
};
```
#### 🔗 `addItem(Item, quantity)`
Adds the item to the items array
- `Item` is an object `{id: string, price: number}`, it can have additional properties `{id: string, price: number, name:"example"}`
- `quantity` is a number, but optional. Default value is 1
```tsx
const { addItem } = useCart();
return (
<button onClick={()=>addItem({id: "Br73s", price: 5}, 2)}>Add 2 bread for 5 USD each</button>
);
```
#### 🔗 `removeItem(id)`
Removes all of the items with that id.
- `id` is a string
```tsx
const { removeItem } = useCart();
return (
<button onClick={()=>removeItem("Br73s")}>Remove items</button>
);
```
#### 🔗 `updateItem(id, updates)`
`updateItem` updates the item with the updates object.
- `id` is a string
- `updates` is an object
```tsx
const { updateItem } = useCart();
return (
<button onClick={()=>updateItem("Br73s", { size: "Large" })}>Make it a large bread!</button>
);
```
#### 🔗 `updateItemQuantity(id, quantity)`
`updateItemQuantity` changes the quantity of an item to the exact quantity given.
- `id` is a string
- `quantity` is a number
```tsx
const { updateItemQuantity } = useCart();
return (
<button onClick={()=>updateItemQuantity("Br73s", 5)}>Set item amount to 5</button>
);
```
#### 🔗 `clearCart()`
`clearCart()` empties the cart, and resets the state.
```tsx
const { clearCart } = useCart();
return (
<button onClick={()=>clearCart()}>Empty the cart!</button>
);
```
#### 🔗 `isEmpty`
A quick and easy way to check if the cart is empty.
- `isEmpty` is a boolean.
```tsx
const { isEmpty } = useCart();
return (
<p>The cart is {isEmpty ? "empty" : "not empty"}</p>
);
```
#### 🔗 `getItem(id)`
Get item with that id.
- `id` is a string
```tsx
const { getItem } = useCart();
const item = getItem("Br73s")}>
```
#### 🔗 `inCart(id)`
Quickly check if an item is in the cart.
- `id` is a string
- returns a boolean
```tsx
const { inCart } = useCart();
const itemWasInCart = inCart("Br73s")}>
```
#### 🔗 `totalItems`
The total amount of items in the cart.
- `totalItems` is a number
```tsx
const { totalItems } = useCart();
return (
<p>There are {totalItems} in the cart</p>
);
```
#### 🔗 `totalUniqueItems`
The total amount of unique items in the cart.
- `totalUniqueItems` is a number
```tsx
const { totalUniqueItems } = useCart();
return (
<p>There are {totalUniqueItems} in the cart</p>
);
```
#### 🔗 `totalCost`
The total cost of all the items in the cart.
- `totalCost` is a number
```tsx
const { totalCost } = useCart();
return (
<p>The total cost of the cart is: {totalCost}</p>
);
```
| 17.389831 | 145 | 0.625487 | eng_Latn | 0.79637 |
c024d6398a73f628c66bd5a1a8fcce35d30c7d5c | 2,543 | md | Markdown | docs/framework/unmanaged-api/debugging/icordebugprocess6-decodeevent-method.md | TomekLesniak/docs.pl-pl | 3373130e51ecb862641a40c5c38ef91af847fe04 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/icordebugprocess6-decodeevent-method.md | TomekLesniak/docs.pl-pl | 3373130e51ecb862641a40c5c38ef91af847fe04 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/icordebugprocess6-decodeevent-method.md | TomekLesniak/docs.pl-pl | 3373130e51ecb862641a40c5c38ef91af847fe04 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Metoda ICorDebugProcess6::DecodeEvent
ms.date: 03/30/2017
ms.assetid: 1453bc0c-6e0d-4d5a-b176-22607f8a3e6c
ms.openlocfilehash: ed75b3c5657fed805f187285a576b81598be331c
ms.sourcegitcommit: d8020797a6657d0fbbdff362b80300815f682f94
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 11/24/2020
ms.locfileid: "95690281"
---
# <a name="icordebugprocess6decodeevent-method"></a>Metoda ICorDebugProcess6::DecodeEvent
Dekoduje zarządzane zdarzenia debugowania, które zostały hermetyzowane w ładunku specjalnie spreparowanych zdarzeń debugowania wyjątku natywnego.
## <a name="syntax"></a>Składnia
```cpp
HRESULT DecodeEvent(
[in, length_is(countBytes), size_is(countBytes)] const BYTE pRecord[],
[in] DWORD countBytes,
[in] CorDebugRecordFormat format,
[in] DWORD dwFlags,
[in] DWORD dwThreadId,
[out] ICorDebugDebugEvent **ppEvent
);
```
## <a name="parameters"></a>Parametry
`pRecord`
podczas Wskaźnik do tablicy bajtów z natywnego zdarzenia debugowania wyjątku, który zawiera informacje o zarządzanym zdarzeniu debugowania.
`countBytes`
podczas Liczba elementów w `pRecord` tablicy bajtów.
`format`
podczas Element członkowski wyliczenia [CorDebugRecordFormat](cordebugrecordformat-enumeration.md) , który określa format niezarządzanego zdarzenia debugowania.
`dwFlags`
podczas Pole bitowe, które zależy od architektury docelowej i określające dodatkowe informacje o zdarzeniu debugowania. W przypadku systemów Windows może być elementem członkowskim wyliczenia [CorDebugDecodeEventFlagsWindows](cordebugdecodeeventflagswindows-enumeration.md) .
`dwThreadId`
podczas Identyfikator systemu operacyjnego wątku, w którym został zgłoszony wyjątek.
`ppEvent`
określoną Wskaźnik do adresu obiektu [ICorDebugDebugEvent](icordebugdebugevent-interface.md) , który reprezentuje zdekodowane zdarzenie debugowania zarządzane.
## <a name="remarks"></a>Uwagi
> [!NOTE]
> Ta metoda jest dostępna tylko z .NET Native.
## <a name="requirements"></a>Wymagania
**Platformy:** Zobacz [wymagania systemowe](../../get-started/system-requirements.md).
**Nagłówek:** CorDebug. idl, CorDebug. h
**Biblioteka:** CorGuids. lib
**.NET Framework wersje:**[!INCLUDE[net_46_native](../../../../includes/net-46-native-md.md)]
## <a name="see-also"></a>Zobacz także
- [Interfejs ICorDebugProcess6](icordebugprocess6-interface.md)
- [Debugowanie — Interfejsy](debugging-interfaces.md)
| 37.397059 | 278 | 0.745576 | pol_Latn | 0.992308 |
c0255e97a97361c70c4e127740a4323157ffc1a9 | 11,945 | md | Markdown | node_modules/ng.seed/readme.md | MarieSumalee/chec-estore | f150cf80199332e076b49ddfdc306276aafeab56 | [
"BSD-3-Clause"
] | null | null | null | node_modules/ng.seed/readme.md | MarieSumalee/chec-estore | f150cf80199332e076b49ddfdc306276aafeab56 | [
"BSD-3-Clause"
] | null | null | null | node_modules/ng.seed/readme.md | MarieSumalee/chec-estore | f150cf80199332e076b49ddfdc306276aafeab56 | [
"BSD-3-Clause"
] | null | null | null | # ng.seed: create a modular [ng app](https://github.com/ng-/ng)
ng.seed aims to make it dead simple to create a modular application using the [ng framework](https://github.com/ng-/ng)
#####ng is right for you if:
- You want to use npm as a package manager to create, organize, & share ng modules
- You want access to an ecosystem of npm packages (anything with an “ng.” prefix)
- You want a modular application with a matching directory structure
- You want to have simple environmental settings & run on all your server's cores
## getting started
1. *At the command prompt, install [node](http://nodejs.org/api/) if you haven't yet. Install [manually](https://gist.github.com/isaacs/579814) or using:*
. <(curl https://raw.github.com/ng-/ng.seed/master/node.sh)
This will install and use the latest stable version of node. Specify a version with `nave use <version>`
2. *Goto the directory where you want your application, and install ng.seed*
npm install ng.seed #if git is not installed
npm install ng-/ng.seed #if git is installed
3. *Upon successful installation, you will be prompted to name your application*
What would you like to name your application? myProject
Answering `myProject` would create the following
- `myProject/package.json` this is your config file
- `myProject/node_modules/ng.seed` this loads your modules
- `myProject/node_modules/ng.seed/node_modules/ng` the ng framework
4. *Start your application*
node myProject <environment>
`<environment>` is available inside your application as `process.env.NODE_ENV` and `process.argv[3]`
In the browser, `http://localhost:1080` should now display **"Welcome to ng.seed!"**
5. *Continue application in background as a daemon*
ctrl-c
Input and output will now be redirected from the terminal to the log files specified in `package.json`
6. *Quit or restart your application*
node myProject stop|restart
7. *Install modules as dependencies*
npm install myProject --save <dependency>
All ng.seed dependencies should begin with an `ng.` prefix
8. *Build your application using this guide & then share it with others!*
npm publish myProject
## how it works
ng.seed uses your directory structure to organize and load your ng application. The easiest way to learn ng.seed is to explore one of the existing projects built with it listed at the end of this readme.
Starting in `myProject`, ng.seed recursively searches for folders named `node_modules`. Each of these folders becomes an ng module. If a module has a `node_modules` folder, then those dependencies are loaded and their names are automatically entered into the parent module's "requires" array.
Each module can contain any number of files and folders, however, folders named after an ng service (e.g., `animate`, `config`, `constant`, `controller`, `directive`, `factory`, `filter`, `provider`, `run`, `service`, `value`, `stack`, `parse`) will have their files registered into your ng application as that type.
ng.seed will use the filename without the extension as the name it registers with ng. For example, `myProject/node_modules/module1/factory/example.js` would register myProject as `ng.module('myProject', ['module1'])`, module1 as `ng.module('module1', [])`, and - finally - example.js as `ng.module('module1').factory('example', <module.exports>)`.
## define services
You define services much like you would in node.js or angular. To elaborate on the example above, `myProject/factory/example.js` could look like this
```javascript
module.exports = function(dependency1, dependency2)
{
// I am initialization code that runs only once
// the first time that this factory is injected
return {
// I am the object that is available to service that injected me
}
}
```
As with ng, the default is identical client & server behavior. To exert more fine-grained control over
```javascript
exports.client = function(dependency1, dependency2)
{
return //injecting "example" on client is completely different than on the server
}
exports.server = function(dependency3, dependency4)
{
return //injecting "example" on server is completely different than on the client
}
```
## dependencies
ng.seed has three type of dependencies:
1. angular modules - precompiled modules such as ngRoute, ngCookies, uiBootstrap
2. ng.seed modules - npm packages starting with `ng.` that will compile into angular modules
3. npm packages - regular npm packages that are `require()` in ng.seed modules
(1) To load regular angular modules set them in the `requires` property in your `package.json`. Although angular's core module `ng` is the only external module required - to get you started quickly - ng.seed loads both the bleeding-edge of `ng` from `http://code.angularjs.org/snapshot/angular.js` and `ngRoute` from `http://code.angularjs.org/snapshot/angular-route.js`. See the **config** section of this readme for an example of how to edit your `package.json` to stay on a particular version. Add or remove ngRoute, ngAnimate, ngCookies, ngSanitize, ngTouch and any other *precompiled* angular modules (i.e., not ng) - such as angular-ui's bootstrap or angular's firebase bindings - to this option.
(2) Use ng.seed modules by simply adding them to your application's `node_modules` folder and - if you plan to publish your application - add them as a dependencies to your `package.json`. Easily do both at the same time with the command `npm install ng.thirdParty --save <dependency>`. All ng.seed modules **must** have a name starting with `ng.` or the module will not load. ng.seed enforces this restriction in order to differentiate ng.seed modules from other npm dependencies. If you end up publishing your module, this `ng.` namespace will also help your module be discovered by other developers browsing npm's registry or github.
(3) Any npm package that does not start with `ng.` will not be loaded by ng.seed. This allows ng.seed modules to `require()` any package within the npm registry as a dependency. To ensure compatibility, regular npm packages will always be run on the server rather than the client.
## ng global
Just like `angular` is to angular, `ng` is ng.seed's one & only global variable. `ng` has exactly the same [api as angular](http://docs.angularjs.org/api/ng/function) with your favorite helper methods such as `ng.toJson`, `ng.fromJson`, `ng.isDefined`, etc. In addition, ng.seed has one extra property: `ng.config`. To learn more about `ng.config`, please see the **config** section of this readme.
## config
All config options are contained in your project's `package.json`, which is modified based on the `<environment>` set when ng.seed is loaded. The config options are available globally as `ng.config`.
If your `package.json` property does not have a property named `<environment>` then the whole property is loaded because ng.seed assumes that the option is constant accross all environments. If `<environment>` property does exist then that property is used for the option.
If the `<environment>` property exists and its value is another property, then ng.seed assumes you are referencing that property and loads that one as the option. For example:
Environment based properties are not set, so all environments are identical
```javascript
//Given the following package.json
{
"option1": {
"iam":"happy"
"ur": "sad"
}
}
//node myProject local
ng.config =
{
option1:
{
iam:"happy"
ur: "sad"
}
}
//node myProject live
ng.config =
{
option1:
{
iam:"happy"
ur: "sad"
}
}
//node myProject test
ng.config =
{
option1:
{
iam:"happy"
ur: "sad"
}
}
```
Some environmental config options are set
```javascript
//Given the following package.json
{
"option2": {
"local":"happy"
"live": "sad"
}
}
//node myProject local
ng.config =
{
option2: "happy"
}
//node myProject live
ng.config =
{
option2: "sad"
}
//node myProject test
ng.config =
{
//this option is most likely an error as ng.seed does not know
//which value to load when given the environment "test"
option2: {
"local":"happy"
"live": "sad"
}
}
```
All environmental configs are set (one with a reference)
```javascript
//Given the following package.json
{
"option3": {
"local":"happy"
"test":"live"
"live": "sad"
}
}
//node myProject local
ng.config =
{
option3:"happy"
}
//node myProject live
ng.config =
{
option3:"sad"
}
//node myProject test
ng.config =
{
//this one references the live environment
option3:"sad",
}
```
For another example, let's see how to use google's cdn for testing & production. In `package.json` change:
#### from
```javascript
"requires": {
"ng":"http://code.angularjs.org/snapshot/angular.js",
"ngRoute":"http://code.angularjs.org/snapshot/angular-route.js"
},
```
#### to
```javascript
"requires": {
"local": {
"ng": "../ng.cdn/1.2.6.js",
"ngRoute": "../ng.cdn/1.2.0-route.js"
},
"test": {
"ng": "//ajax.googleapis.com/ajax/libs/angularjs/1.2.12/angular.min.js",
"ngRoute": "//ajax.googleapis.com/ajax/libs/angularjs/1.2.12/angular-route.min.js"
},
"live":"test"
},
```
#### also the same
```javascript
"requires": {
"local": {
"ng": "../ng.cdn/1.2.6.js",
"ngRoute": "../ng.cdn/1.2.0-route.js"
},
"test": "live"
"live": {
"ng": "//ajax.googleapis.com/ajax/libs/angularjs/1.2.12/angular.min.js",
"ngRoute": "//ajax.googleapis.com/ajax/libs/angularjs/1.2.12/angular-route.min.js"
}
},
```
Other common configuration options include changing the path and/or prefix of your log files, or changing your application's default protocol/port from `http port 1080`.
## run as root
Don’t run node/ng.seed as root because of it could open potential security vulnerabilities. When not root, the only thing you won’t be able to do is listen on ports less than 1024. Instead, listen on a port > 1024 (e.g., the default is `1080` for `http`) and use ip-table to forward ports 80 & 443 to the ports on which your server is actually listening.
##views
In order to work, views require ngRoute to be loaded in `package.json`. ng.seed provide a shortcut to defining routes by parsing the filenames in the `view` folder. By placing an `.html` file in the `view` folder, ng.seed will know to add that file to $routeProvider as a template. The route given for that template will be the view's filename - with `$` replaced with `:` since `:` character is not allowed to be used in the filename of many Operating Systems. For example, `myProject/view` may contain a file named `i/am/$a/$route?/that/will/$be*/registered.html` which ng.seed will add as
```javascript
ng.module('myProject').config(function($routeProvider)
{
$routeProvider.when('i/am/:a/:route?/that/will/:be*/registered', {template:<html>})
})
```
Since view's have no way of registering a controller with $routeProvider directly, you will need to specify the controller within the view using angular's ngController directive.
## changelog
#### 0.0.0-rc2
- Environment based config added
- Require "ng." prefix for ng.seed modules
- Port api enables passing options to createServer
- Added automatic daemon functionality
- Added log file config to package.json
- Refactored code into separate files
#### 0.0.0-rc1
- Initial commit
## todos
- Feel free to email [email protected] with suggestions!
- Duplication detection before overwrite?
- Use fs.watch to do automatic reloads
- Use the cluster module to run multi-thread (w/o redundant stdin/stdout?)
## related projects
- [ng](https://github.com/ng-/ng): angular reimagined
- [ng.data](https://github.com/ng-/ng.data): simple getter/setter for data persistence
- ng.cql: realtime cassandra database syncing
- [ng.auth](https://github.com/ng-/ng.auth): example authentication using ng interceptors
- [ng.crud](https://github.com/ng-/ng.crud): example demonstrating a simple crud application using ng.seed
- [ng.style](https://github.com/ng-/ng.style): beautiful html using twitter bootstrap
| 39.816667 | 702 | 0.732859 | eng_Latn | 0.990719 |
c02614738a4fb85703f96dc7cabe9074e45d1d57 | 88 | md | Markdown | README.md | LeandroHuff/fetching | 1d7e641fb88da64e331d253087b1d8f5bb0dbec4 | [
"Unlicense"
] | null | null | null | README.md | LeandroHuff/fetching | 1d7e641fb88da64e331d253087b1d8f5bb0dbec4 | [
"Unlicense"
] | null | null | null | README.md | LeandroHuff/fetching | 1d7e641fb88da64e331d253087b1d8f5bb0dbec4 | [
"Unlicense"
] | null | null | null | # fetching
A simple python program to fetch a list of URL from their respective server.
| 29.333333 | 76 | 0.795455 | eng_Latn | 0.999449 |
c0277c4889a5602b4c15812e8cc2e16765d02da7 | 267 | md | Markdown | docs/contact.md | pefi-78/frag-deinen-provider.de | 47a40a87dff6e1ae5446aa3e9f4acb0832dac6e7 | [
"Apache-2.0"
] | null | null | null | docs/contact.md | pefi-78/frag-deinen-provider.de | 47a40a87dff6e1ae5446aa3e9f4acb0832dac6e7 | [
"Apache-2.0"
] | null | null | null | docs/contact.md | pefi-78/frag-deinen-provider.de | 47a40a87dff6e1ae5446aa3e9f4acb0832dac6e7 | [
"Apache-2.0"
] | null | null | null | ---
layout: page
title: Kontakt
permalink: /contact/
---
# Wie könnt ihr uns erreichen
<img align="center" src="../images/logo/logo_color_96.png">
Ihr könnt uns am besten über Discord erreichen.
## Discord [https://discord.gg/BsdSpW3](https://discord.gg/BsdSpW3)
| 20.538462 | 67 | 0.719101 | deu_Latn | 0.711665 |
c0279831f7639a6e5d175bb495244ad1c87623c4 | 975 | md | Markdown | README.md | mfeineis/elm-design-insights | a1ea588b635c99a9a22db9504b11952c33d641eb | [
"BSD-3-Clause"
] | null | null | null | README.md | mfeineis/elm-design-insights | a1ea588b635c99a9a22db9504b11952c33d641eb | [
"BSD-3-Clause"
] | 2 | 2021-05-06T22:32:35.000Z | 2021-08-31T18:08:35.000Z | README.md | mfeineis/elm-design-insights | a1ea588b635c99a9a22db9504b11952c33d641eb | [
"BSD-3-Clause"
] | null | null | null | Before you continue
===================
This is a small side-project that was born out of the interest in
the design process of the Elm language. It is not an officially
published project and doing so would very likely be against Elm's
creator Evan Czaplicki's wishes - see
[this post on reddit](https://www.reddit.com/r/elm/comments/71bp2o/comment/dn9rygg).
So please be kind and refrain from putting this tool into
circulation!
* * *
elm-design-insights
===================
This page shows all commits to Elm core package repositories
that may contain interesting thoughts and decisions regarding the
design process of the Elm language. Note that this is not a live
list but is sporadically generated and served statically.
Building locally
----------------
* `git clone https://github.com/mfeineis/elm-design-insights`
* `cd elm-design-insights`
* `npm run setup`
* visit [http://127.0.0.1:8081]()
Development is done with `elm-reactor`, just use `npm run dev`.
| 30.46875 | 84 | 0.728205 | eng_Latn | 0.998806 |
c0293cb5c1e53d102a1741c6b4c2009590f3202e | 9,998 | md | Markdown | docs/2014/analysis-services/trace-events/discover-events-data-columns.md | sql-aus-hh/sql-docs.de-de | edfac31211cedb5d13440802f131a1e48934748a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/2014/analysis-services/trace-events/discover-events-data-columns.md | sql-aus-hh/sql-docs.de-de | edfac31211cedb5d13440802f131a1e48934748a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/2014/analysis-services/trace-events/discover-events-data-columns.md | sql-aus-hh/sql-docs.de-de | edfac31211cedb5d13440802f131a1e48934748a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Datenspalten für Ermittlungsereignisse | Microsoft-Dokumentation
ms.custom: ''
ms.date: 06/13/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology:
- analysis-services
ms.topic: conceptual
helpviewer_keywords:
- Discover Events event category
ms.assetid: 10ec598e-5b51-4767-b4f7-42e261d96a40
author: minewiskan
ms.author: owend
manager: craigg
ms.openlocfilehash: 72e462ead8fc003102bbd5ec1e48ef29904d0fb9
ms.sourcegitcommit: 3da2edf82763852cff6772a1a282ace3034b4936
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 10/02/2018
ms.locfileid: "48106240"
---
# <a name="discover-events-data-columns"></a>Datenspalten für Ermittlungsereignisse
Die Kategorie "Ermittlungsereignisse" besitzt die folgenden Ereignisklassen:
- Discover Begin-Klasse
- Discover End-Klasse
In den folgenden Tabellen sind die Datenspalten für jede dieser Ereignisklassen aufgeführt.
## <a name="discover-begin-classdata-columns"></a>Discover Begin-Klasse – Datenspalten
|||||
|-|-|-|-|
|**Spaltenname**|**Spalten-ID**|**Spaltentyp**|**Spaltenbeschreibung**|
|EventClass|0|1|Die Ereignisklasse dient zur Kategorisierung von Ereignissen.|
|EventSubclass|1|1|Die Ereignisunterklasse enthält zusätzliche Informationen zu jeder Ereignisklasse. Im folgenden finden Sie gültige Wert: name/Paare:<br /><br /> 0: **DBSCHEMA_CATALOGS**<br />1: **DBSCHEMA_TABLES**<br />2: **DBSCHEMA_COLUMNS**<br />3: **DBSCHEMA_PROVIDER_TYPES**<br />4: **MDSCHEMA_CUBES**<br />5: **MDSCHEMA_DIMENSIONS**<br />6: **MDSCHEMA_HIERARCHIES**<br />7: **MDSCHEMA_LEVELS**<br />8: **MDSCHEMA_MEASURES**<br />9: **MDSCHEMA_PROPERTIES**<br />10: **MDSCHEMA_MEMBERS**<br />11: **MDSCHEMA_FUNCTIONS**<br />12: **MDSCHEMA_ACTIONS**<br />13: **MDSCHEMA_SETS**<br />14: **DISCOVER_INSTANCES**<br />15: **MDSCHEMA_KPIS**<br />16: **MDSCHEMA_MEASUREGROUPS**<br />17: **MDSCHEMA_COMMANDS**<br />18: **DMSCHEMA_MINING_SERVICES**<br />19: **DMSCHEMA_MINING_SERVICE_PARAMETERS**<br />20: **DMSCHEMA_MINING_FUNCTIONS**<br />21: **DMSCHEMA_MINING_MODEL_CONTENT**<br />22: **DMSCHEMA_MINING_MODEL_XML**<br />23: **DMSCHEMA_MINING_MODELS**<br />24: **DMSCHEMA_MINING_COLUMNS**<br />25: **DISCOVER_DATASOURCES**<br />26: **DISCOVER_PROPERTIES**<br />27: **DISCOVER_SCHEMA_ROWSETS**<br />28: **DISCOVER_ENUMERATORS**<br />29: **DISCOVER_KEYWORDS**<br />30: **DISCOVER_LITERALS**<br />31: **DISCOVER_XML_METADATA**<br />32: **DISCOVER_TRACES**<br />33: **DISCOVER_TRACE_DEFINITION_PROVIDERINFO**<br />34: **DISCOVER_TRACE_COLUMNS**<br />35: **DISCOVER_TRACE_EVENT_CATEGORIES**<br />36: **DMSCHEMA_MINING_STRUCTURES**<br />37: **DMSCHEMA_MINING_STRUCTURE_COLUMNS**<br />38: **DISCOVER_MASTER_KEY**<br />39: **MDSCHEMA_INPUT_DATASOURCES**<br />40: **DISCOVER_LOCATIONS**<br />41: **DISCOVER_PARTITION_DIMENSION_STAT**<br />42: **DISCOVER_PARTITION_STAT**<br />43: **DISCOVER_DIMENSION_STAT**<br />44: **MDSCHEMA_MEASUREGROUP_DIMENSIONS**<br />45: **DISCOVER_XEVENT_TRACE_DEFINITION**|
|CurrentTime|2|5|Der Zeitpunkt, zu dem das Ereignis begonnen hat (falls verfügbar). Für das Filtern lauten die erwarteten Formate "JJJJ-MM-TT" und "JJJJ-MM-TT HH:MM:SS".|
|StartTime|3|5|Der Zeitpunkt, zu dem das Ereignis begonnen hat (falls verfügbar). Für das Filtern lauten die erwarteten Formate "JJJJ-MM-TT" und "JJJJ-MM-TT HH:MM:SS".|
|ConnectionID|25|1|Enthält die mit dem Ermittlungsereignis verbundene eindeutige Verbindungs-ID.|
|DatabaseName|28|8|Name der Datenbank, in der die Anweisung des Benutzers ausgeführt wird.|
|NTUserName|32|8|Enthält den mit dem Ermittlungsereignis verbundenen Windows-Benutzernamen. Der Benutzername liegt im kanonischen Format vor. Beispiel: engineering.microsoft.com/software/user.|
|NTDomainName|33|8|Enthält die Windows-Domäne, die dem Ermittlungsereignis zugeordnet ist.|
|ClientProcessID|36|1|Enthält die Prozess-ID der Clientanwendung.|
|ApplicationName|37|8|Der Name der Clientanwendung, die die Verbindung mit dem Server hergestellt hat. Diese Spalte wird mit den Werten aufgefüllt, die von der Anwendung übergeben werden, und nicht mit dem angezeigten Namen des Programms.|
|SessionID|39|8|Enthält die mit dem Ermittlungsereignis verbundene Sitzungs-ID.|
|NTCanonicalUserName|40|8|Enthält den mit dem Ermittlungsereignis verbundenen Windows-Benutzernamen. Der Benutzername liegt im kanonischen Format vor. Beispiel: engineering.microsoft.com/software/user.|
|SPID|41|1|Enthält die SPID (Server Process ID), die die mit dem Ermittlungsereignis verbundene Benutzersitzung eindeutig kennzeichnet. Die SPID entspricht direkt dem von XMLA verwendeten Sitzungs-GUID.|
|TextData|42|9|Enthält die dem Ereignis zugeordneten Textdaten.|
|ServerName|43|8|Enthält den Namen der Instanz von [!INCLUDE[msCoName](../../includes/msconame-md.md)] [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] [!INCLUDE[ssASnoversion](../../includes/ssasnoversion-md.md)] , in der das Ermittlungsereignis aufgetreten ist.|
|RequestProperties|45|9|Enthält die Eigenschaften der mit dem Ermittlungsereignis verbundenen XMLA-Anforderung (XML for Analysis).|
## <a name="discover-end-classdata-columns"></a>Discover End-Klasse – Datenspalten
|||||
|-|-|-|-|
|**Spaltenname**|**Spalten-ID**|**Spaltentyp**|**Spaltenbeschreibung**|
|EventClass|0|1|Enthält die Ereignisklasse; wird verwendet, um Ereignisse zu kategorisieren.|
|EventSubclass|1|1|Die Ereignisunterklasse enthält zusätzliche Informationen zu jeder Ereignisklasse. Im folgenden finden Sie gültige Wert: name/Paare:<br /><br /> 0: **DBSCHEMA_CATALOGS**<br />1: **DBSCHEMA_TABLES**<br />2: **DBSCHEMA_COLUMNS**<br />3: **DBSCHEMA_PROVIDER_TYPES**<br />4: **MDSCHEMA_CUBES**<br />5: **MDSCHEMA_DIMENSIONS**<br />6: **MDSCHEMA_HIERARCHIES**<br />7: **MDSCHEMA_LEVELS**<br />8: **MDSCHEMA_MEASURES**<br />9: **MDSCHEMA_PROPERTIES**<br />10: **MDSCHEMA_MEMBERS**<br />11: **MDSCHEMA_FUNCTIONS**<br />12: **MDSCHEMA_ACTIONS**<br />13: **MDSCHEMA_SETS**<br />14: **DISCOVER_INSTANCES**<br />15: **MDSCHEMA_KPIS**<br />16: **MDSCHEMA_MEASUREGROUPS**<br />17: **MDSCHEMA_COMMANDS**<br />18: **DMSCHEMA_MINING_SERVICES**<br />19: **DMSCHEMA_MINING_SERVICE_PARAMETERS**<br />20: **DMSCHEMA_MINING_FUNCTIONS**<br />21: **DMSCHEMA_MINING_MODEL_CONTENT**<br />22: **DMSCHEMA_MINING_MODEL_XML**<br />23: **DMSCHEMA_MINING_MODELS**<br />24: **DMSCHEMA_MINING_COLUMNS**<br />25: **DISCOVER_DATASOURCES**<br />26: **DISCOVER_PROPERTIES**<br />27: **DISCOVER_SCHEMA_ROWSETS**<br />28: **DISCOVER_ENUMERATORS**<br />29: **DISCOVER_KEYWORDS**<br />30: **DISCOVER_LITERALS**<br />31: **DISCOVER_XML_METADATA**<br />32: **DISCOVER_TRACES**<br />33: **DISCOVER_TRACE_DEFINITION_PROVIDERINFO**<br />34: **DISCOVER_TRACE_COLUMNS**<br />35: **DISCOVER_TRACE_EVENT_CATEGORIES**<br />36: **DMSCHEMA_MINING_STRUCTURES**<br />37: **DMSCHEMA_MINING_STRUCTURE_COLUMNS**<br />38: **DISCOVER_MASTER_KEY**<br />39: **MDSCHEMA_INPUT_DATASOURCES**<br />40: **DISCOVER_LOCATIONS**<br />41: **DISCOVER_PARTITION_DIMENSION_STAT**<br />42: **DISCOVER_PARTITION_STAT**<br />43: **DISCOVER_DIMENSION_STAT**<br />44: **MDSCHEMA_MEASUREGROUP_DIMENSIONS**<br />45: **DISCOVER_XEVENT_TRACE_DEFINITION**|
|CurrentTime|2|5|Enthält die aktuelle Zeit des Ermittlungsereignisses (wenn verfügbar). Für das Filtern lauten die erwarteten Formate "JJJJ-MM-TT" und "JJJJ-MM-TT HH:MM:SS".|
|StartTime|3|5|Enthält die Zeit (falls verfügbar), zu der das Ermittlungsendereignis begonnen hat. Für das Filtern lauten die erwarteten Formate "JJJJ-MM-TT" und "JJJJ-MM-TT HH:MM:SS".|
|EndTime|4|5|Enthält die Uhrzeit, zu der das Ereignis beendet wurde. Diese Spalte wird für Startereignisklassen (z. B. SQL:BatchStarting oder SP:Starting) nicht aufgefüllt. Für das Filtern lauten die erwarteten Formate "JJJJ-MM-TT" und "JJJJ-MM-TT HH:MM:SS".|
|Duration|5|2|Enthält die ungefähre Zeit (in Millisekunden), die für das Ermittlungsereignis benötigt wurde.|
|CPUTime|6|2|Enthält die CPU-Zeit (in Millisekunden), die vom Ereignis verwendet wurde.|
|Schweregrad|22|1|Enthält den Schweregrad einer Ausnahme.|
|Success|23|1|Enthält den Erfolg oder Fehler des Ermittlungsereignisses. Die Werte sind:<br /><br /> 0 = Fehler<br /><br /> 1 = Erfolg|
|Fehler|24|1|Enthält die Fehlernummer aller mit dem Ermittlungsereignis verbundenen Fehler.|
|ConnectionID|25|1|Enthält die mit dem Ermittlungsereignis verbundene eindeutige Verbindungs-ID.|
|DatabaseName|28|8|Enthält den Namen der Datenbank, in der das Ermittlungsereignis aufgetreten ist.|
|NTUserName|32|8|Enthält den Windows-Benutzernamen, der dem Objektberechtigungsereignis zugeordnet ist.|
|NTDomainName|33|8|Enthält das mit dem Ermittlungsereignis verbundene Windows-Domänenkonto.|
|ClientProcessID|36|1|Enthält die Client-Prozess-ID der Anwendung, von der das Ereignis initiiert wurde.|
|ApplicationName|37|8|Enthält den Namen der Clientanwendung, die die Verbindung mit dem Server hergestellt hat. Diese Spalte wird mit den Werten aufgefüllt, die von der Anwendung übergeben werden, und nicht mit dem angezeigten Namen des Programms.|
|SessionID|39|8|Enthält die mit dem Ermittlungsereignis verbundene Sitzungs-ID.|
|NTCanonicalUserName|40|8|Enthält den Windows-Benutzernamen, der dem Objektberechtigungsereignis zugeordnet ist. Der Benutzername liegt im kanonischen Format vor. Beispiel: engineering.microsoft.com/software/user.|
|SPID|41|1|Enthält die SPID (Server Process ID), die die mit dem Ermittlungsendeereignis verbundene Benutzersitzung eindeutig kennzeichnet. Die SPID entspricht direkt dem von XMLA verwendeten Sitzungs-GUID.|
|TextData|42|9|Enthält die dem Ereignis zugeordneten Textdaten.|
|ServerName|43|8|Enthält den Namen der Instanz von [!INCLUDE[ssASnoversion](../../includes/ssasnoversion-md.md)] , in der das Ermittlungsereignis aufgetreten ist.|
|RequestProperties|45|9|Enthält die Eigenschaften in der XMLA-Anforderung.|
## <a name="see-also"></a>Siehe auch
[Ermittlungsereignisse – Ereigniskategorie](discover-events-event-category.md)
| 116.255814 | 1,793 | 0.763153 | deu_Latn | 0.804668 |
c029b23c310349b4b9efef311dca843b80e77ca6 | 1,681 | md | Markdown | docs/templates/report_switch_cabling.md | Cray-HPE/canu | 3a92ce1e9b63f35aa30b9135afaa734e61909407 | [
"MIT"
] | 6 | 2021-09-16T22:02:48.000Z | 2022-02-04T18:08:57.000Z | docs/templates/report_switch_cabling.md | Cray-HPE/canu | 3a92ce1e9b63f35aa30b9135afaa734e61909407 | [
"MIT"
] | 57 | 2021-09-17T17:15:59.000Z | 2022-03-31T20:56:21.000Z | docs/templates/report_switch_cabling.md | Cray-HPE/canu | 3a92ce1e9b63f35aa30b9135afaa734e61909407 | [
"MIT"
] | 4 | 2022-01-06T17:09:02.000Z | 2022-02-04T18:09:33.000Z | # Report Switch Cabling
```{eval-rst}
.. click:: canu.report.switch.cabling.cabling:cabling
:prog: canu report switch cabling
```
## Example
To check the cabling of a single switch run: `canu report switch cabling --ip 192.168.1.1 --username USERNAME --password PASSWORD`
```bash
$ canu report switch cabling --ip 192.168.1.1 --username USERNAME --password PASSWORD
Switch: test-switch-spine01 (192.168.1.1)
Aruba 8325
------------------------------------------------------------------------------------------------------------------------------------------
PORT NEIGHBOR NEIGHBOR PORT PORT DESCRIPTION DESCRIPTION
------------------------------------------------------------------------------------------------------------------------------------------
1/1/1 ==> 00:00:00:00:00:01 No LLDP data, check ARP vlan info. 192.168.1.20:vlan1, 192.168.2.12:vlan2
1/1/3 ==> ncn-test2 00:00:00:00:00:02 mgmt0 Linux ncn-test2
1/1/5 ==> ncn-test3 00:00:00:00:00:03 mgmt0 Linux ncn-test3
1/1/7 ==> 00:00:00:00:00:04 No LLDP data, check ARP vlan info. 192.168.1.10:vlan1, 192.168.2.9:vlan2
1/1/51 ==> test-spine02 1/1/51 Aruba JL635A GL.10.06.0010
1/1/52 ==> test-spine02 1/1/52 Aruba JL635A GL.10.06.0010
```

---
<a href="/readme.md">Back To Readme</a><br>
| 50.939394 | 138 | 0.430101 | yue_Hant | 0.330447 |
c02b04455195517aaf7c9057b687dc7067a70a62 | 180 | md | Markdown | README.md | frousterz/js-bootcamp | 603ff495e0b69c4a950f516493fc19e48f8735ad | [
"MIT"
] | null | null | null | README.md | frousterz/js-bootcamp | 603ff495e0b69c4a950f516493fc19e48f8735ad | [
"MIT"
] | null | null | null | README.md | frousterz/js-bootcamp | 603ff495e0b69c4a950f516493fc19e48f8735ad | [
"MIT"
] | null | null | null | # js-bootcamp
JavaScript Bootcamp Tutorial
### Details
* This repository is used to store all files related to a JavaScript Udemy Course.
### Version
* v1.0.0
### License
* MIT
| 15 | 82 | 0.722222 | eng_Latn | 0.968945 |
c02d04587189acf60691345e035f764cd15038d8 | 148 | md | Markdown | README.md | Harukiblue/DisasterSimulator | bf27294e4bc4e701979cef3ff52e4f2696d423d1 | [
"MIT"
] | null | null | null | README.md | Harukiblue/DisasterSimulator | bf27294e4bc4e701979cef3ff52e4f2696d423d1 | [
"MIT"
] | null | null | null | README.md | Harukiblue/DisasterSimulator | bf27294e4bc4e701979cef3ff52e4f2696d423d1 | [
"MIT"
] | null | null | null | # DisasterSimulator
A simulator for disasters\
The simulator runs various disaster scenarios every year and calculates the amount of damage caused.
| 37 | 100 | 0.837838 | eng_Latn | 0.997367 |
c02ddcdef1e35e931f1a9058bd9bc2b58d09a5c9 | 725 | md | Markdown | docs/sources/source-s3.md | davepgreene/propsd | 64bfc819f3b3bd91d0b32874e33dfdb206043434 | [
"MIT"
] | 21 | 2016-04-29T23:01:02.000Z | 2022-02-16T07:15:01.000Z | docs/sources/source-s3.md | davepgreene/propsd | 64bfc819f3b3bd91d0b32874e33dfdb206043434 | [
"MIT"
] | 272 | 2016-01-11T16:41:01.000Z | 2021-09-01T02:31:43.000Z | docs/sources/source-s3.md | davepgreene/propsd | 64bfc819f3b3bd91d0b32874e33dfdb206043434 | [
"MIT"
] | 11 | 2016-07-14T13:23:23.000Z | 2022-02-16T07:15:01.000Z | # Propsd S3 Plugin
## Purpose
The S3 Plugin manages retrieving data from S3. On a set interval, the plugin sends a request via the aws-sdk for a config object in S3, determines whether that object has been updated more recently than the previously retrieved one, and, if so, retrieves the new object, parses its `Body`, and emits an event for the storage layer to consume.
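The sketch below illustrates that polling flow. It is not propsd's actual implementation: the class name, the constructor options, and the use of `@aws-sdk/client-s3` (rather than the older `aws-sdk` package the description mentions) are assumptions made purely for illustration.

```typescript
// Hypothetical sketch of the poll-compare-parse-emit cycle described above.
import { EventEmitter } from "node:events";
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

class S3ConfigPoller extends EventEmitter {
  private etag?: string;            // ETag of the last object we parsed
  private timer?: NodeJS.Timeout;

  constructor(
    private readonly client: S3Client,
    private readonly bucket: string,
    private readonly key: string,
    private readonly intervalMs = 30_000,   // the "set interval" from the description
  ) {
    super();
  }

  start(): void {
    this.timer = setInterval(() => void this.poll(), this.intervalMs);
  }

  stop(): void {
    if (this.timer) clearInterval(this.timer);
  }

  private async poll(): Promise<void> {
    try {
      const resp = await this.client.send(
        new GetObjectCommand({ Bucket: this.bucket, Key: this.key }),
      );
      if (resp.ETag === this.etag) return;       // object unchanged since the last poll
      this.etag = resp.ETag;
      // transformToString() requires a reasonably recent v3 SDK release.
      const body = await resp.Body!.transformToString();
      this.emit("update", JSON.parse(body));     // the storage layer consumes this event
    } catch (err) {
      this.emit("error", err);
    }
  }
}
```

Comparing the returned `ETag` against the previous one is just one way to detect "updated more recently"; checking `LastModified` via a HEAD request before the full GET would work as well.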
## High level composition
The watcher component will be split into the following modules:
* `S3`
* `S3.Agent`
* `S3.Store`
* `S3.Parser`
## Classes
* [Sources.S3](s3/class-source-s3.md)
* [Sources.S3.Agent](s3/class-source-s3-agent.md)
* [Sources.S3.Store](s3/class-source-s3-store.md)
* [Sources.S3.Parser](s3/class-source-s3-parser.md) | 38.157895 | 344 | 0.735172 | eng_Latn | 0.874194 |
c02e344ef07a48633b7e25476eafc861cf523d8a | 95 | md | Markdown | Packs/Ransomware/ReleaseNotes/1_0_4.md | diCagri/content | c532c50b213e6dddb8ae6a378d6d09198e08fc9f | [
"MIT"
] | 799 | 2016-08-02T06:43:14.000Z | 2022-03-31T11:10:11.000Z | Packs/Ransomware/ReleaseNotes/1_0_4.md | diCagri/content | c532c50b213e6dddb8ae6a378d6d09198e08fc9f | [
"MIT"
] | 9,317 | 2016-08-07T19:00:51.000Z | 2022-03-31T21:56:04.000Z | Packs/Ransomware/ReleaseNotes/1_0_4.md | diCagri/content | c532c50b213e6dddb8ae6a378d6d09198e08fc9f | [
"MIT"
] | 1,297 | 2016-08-04T13:59:00.000Z | 2022-03-31T23:43:06.000Z |
#### Layouts Containers
##### Post Intrusion Ransomware
- Fixed Ransomware pack dependencies.
| 19 | 37 | 0.747368 | eng_Latn | 0.455949 |
c02facb65b36025a89ca5ced9574d5aa5eb69d55 | 113 | md | Markdown | _includes/about/zh.md | HooRang/hoorang.github.io | 2d40c25702e04d1225e43382083135c836953c27 | [
"Apache-2.0"
] | null | null | null | _includes/about/zh.md | HooRang/hoorang.github.io | 2d40c25702e04d1225e43382083135c836953c27 | [
"Apache-2.0"
] | null | null | null | _includes/about/zh.md | HooRang/hoorang.github.io | 2d40c25702e04d1225e43382083135c836953c27 | [
"Apache-2.0"
] | null | null | null | > 人生就是一场旅行,不在乎目的地,
> 在乎的是沿途的风景以及看风景的心情。
Hi, 我是何良,IT男一名。写过web前端,做过服务器,会点android。人比较懒,博客作为个人解决问题的知识库,勤快的时候就写写。
| 18.833333 | 68 | 0.787611 | zho_Hans | 0.838015 |
c030d576f0fe01f0846af3bf24e5b9a11e9b6c9f | 891 | md | Markdown | content/media/weixin/wx_肖磊看市/重要投资课程推送.md | leeleilei/52etf.net | 934d558bd4181a1e0b39a7a10e0d7076c53995fb | [
"CC-BY-4.0"
] | 10 | 2020-06-01T16:08:00.000Z | 2021-10-31T15:17:26.000Z | content/media/weixin/wx_肖磊看市/重要投资课程推送.md | leeleilei/52etf.net | 934d558bd4181a1e0b39a7a10e0d7076c53995fb | [
"CC-BY-4.0"
] | null | null | null | content/media/weixin/wx_肖磊看市/重要投资课程推送.md | leeleilei/52etf.net | 934d558bd4181a1e0b39a7a10e0d7076c53995fb | [
"CC-BY-4.0"
] | null | null | null |
---
title: 重要投资课程推送-肖磊看市
date: 2019-05-08
tags: ["肖磊看市", ]
display: false
---
##
重要投资课程推送
kanshi1314
在投资市场,获取有用的信息就像大海捞针,我每天用18个小时思考,用五分钟告诉你;这里是肖磊看市,市场纵然复杂多变,但并非不可预期。
这两天关于国际局势的分析,基本都发不出去。
<img class="rich_pages" data-copyright="0" data-ratio="0.42582197273456296" data-s="300,640" src="https://mmbiz.qpic.cn/mmbiz_png/rIYcHn0KrPTIS8y22zliaqwicticQ2AWXkXa4J8cU9J4l4zVibNnSZWJBrB8uGY2jGeicNib1YLpca8RPXXtwbSdNDGA/640?wx_fmt=png" data-type="png" data-w="1247" style="">
连续两次发送失败,如上图所示。
**中美贸易争端对整个投资市场的影响,我已经做了2小时直播,进行了全面的解读,目前购买收听只需要299,明天将恢复原价至899。可以购买后再慢慢听,永久有效。会员免费。**
<img class="rich_pages" data-ratio="1.7786666666666666" data-s="300,640" src="https://mmbiz.qpic.cn/mmbiz_jpg/rIYcHn0KrPTIS8y22zliaqwicticQ2AWXkXfeMItDNEPibmyzohicrYrYYQsx6RgZOreFnX579icibpr1tXrPSCtlibvXw/640?wx_fmt=jpeg" data-type="jpeg" data-w="750"/>
还可以点击左下角 “阅读原文” 加入
| 17.134615 | 278 | 0.774411 | yue_Hant | 0.216064 |
c031252b1c11f3dcb23eb3b6462e46621b35a8d7 | 5,394 | md | Markdown | examples/userprovided-mtls/README.md | osodevops/confluent-kubernetes-playground | 145270c41c42fd78d9d163810876ad495051b671 | [
"Apache-2.0"
] | 8 | 2021-06-15T07:58:28.000Z | 2022-03-30T00:00:11.000Z | examples/userprovided-mtls/README.md | osodevops/confluent-kubernetes-playground | 145270c41c42fd78d9d163810876ad495051b671 | [
"Apache-2.0"
] | 1 | 2021-06-10T16:58:11.000Z | 2021-06-10T16:58:11.000Z | examples/userprovided-mtls/README.md | osodevops/confluent-kubernetes-playground | 145270c41c42fd78d9d163810876ad495051b671 | [
"Apache-2.0"
] | 3 | 2021-10-30T18:43:06.000Z | 2022-03-07T17:58:23.000Z | # User Provider mTLS
In this example scenario, you'll deploy each Confluent Platform component with its own certificate to validate the architecture and deployment. The certificates generated in this example use the `sandbox` namespace. **NOTE** You will need to change this for your environment, which is why the generate_certificates.sh script is used.
1. Create one server certificate per Confluent component service. You'll use the same certificate authority for all. Update `zookeeper-server-domain.json` and `kafka-server-domain.json` with your namespace and generate certificates for each component.
```shell
cd examples/userprovided-mtls
./generate_certificates.sh
```
2. Deploy the CRDS using the standard way:
```shell
kubectl apply -k ../../kustomize/crds
```
3. Deploy the mTLS example which use Kustomize to pull in the base and example overlays using the following
```shell
kubectl apply -k .
```
4. Validate zookeeper is working using:
```shell
kubectl logs -f -n sandbox zookeeper-0
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:java.library.path=/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:java.io.tmpdir=/tmp
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:java.compiler=<NA>
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:os.name=Linux
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:os.arch=amd64
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:os.version=5.10.47-linuxkit
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:user.name=?
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:user.home=?
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:user.dir=/opt
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:os.memory.free=336MB
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:os.memory.max=4096MB
[INFO] 2021-08-17 14:40:54,836 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer logEnv - Server environment:os.memory.total=357MB
[INFO] 2021-08-17 14:40:54,838 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer setMinSessionTimeout - minSessionTimeout set to 6000
[INFO] 2021-08-17 14:40:54,838 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer setMaxSessionTimeout - maxSessionTimeout set to 60000
[INFO] 2021-08-17 14:40:54,839 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.ZooKeeperServer <init> - Created server with tickTime 3000 minSessionTimeout 6000 maxSessionTimeout 60000 datadir /mnt/data/txnlog/version-2 snapdir /mnt/data/data/version-2
[INFO] 2021-08-17 14:40:54,839 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.quorum.Learner followLeader - FOLLOWING - LEADER ELECTION TOOK - 13 MS
[WARN] 2021-08-17 14:40:54,841 [QuorumPeer[myid=0](plain=0.0.0.0:2181)(secure=0.0.0.0:2182)] org.apache.zookeeper.server.quorum.Learner connectToLeader - Unexpected exception, tries=0, remaining init limit=30000, connecting to zookeeper-1.zookeeper.sandbox.svc.cluster.local/172.17.0.6:2888
[INFO] 2021-08-17 14:49:42,057 [nioEventLoopGroup-7-1] org.apache.zookeeper.server.auth.X509AuthenticationProvider handleAuthentication - Authenticated Id 'CN=kafka,L=Earth,ST=Pangea,C=Universe' for Scheme 'x509'
```
5. Validate Kafka is working using:
```shell
kubectl logs -f -n sandbox kafka-0
[INFO] 2021-08-17 14:49:00,492 [LicenseBackgroundFetcher RUNNING] org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 6.1.2-ce
[INFO] 2021-08-17 14:49:00,493 [LicenseBackgroundFetcher RUNNING] org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId: 4c988093cc81349d
[INFO] 2021-08-17 14:49:00,493 [LicenseBackgroundFetcher RUNNING] org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs: 1629211740492
[INFO] 2021-08-17 14:49:00,493 [kafka-producer-network-thread | confluent-metrics-reporter] org.apache.kafka.clients.Metadata update - [Producer clientId=confluent-metrics-reporter] Cluster ID: xBPcfVfKSrCS15AmzC6BUQ
``` | 98.072727 | 335 | 0.770857 | eng_Latn | 0.263896 |
c032c23de2039d1b9d246345f7ce73700ac96085 | 6,479 | md | Markdown | articles/virtual-machines/extensions/agent-dependency-linux.md | changeworld/azure-docs.sv-se | 6234acf8ae0166219b27a9daa33f6f62a2ee45ab | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/extensions/agent-dependency-linux.md | changeworld/azure-docs.sv-se | 6234acf8ae0166219b27a9daa33f6f62a2ee45ab | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/extensions/agent-dependency-linux.md | changeworld/azure-docs.sv-se | 6234acf8ae0166219b27a9daa33f6f62a2ee45ab | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Azure Monitor Dependency virtual machine-tillägg för Linux
description: Distribuera Azure Monitor Dependency Agent på Linux virtuell dator med hjälp av en virtuell dator tillägg.
services: virtual-machines-linux
documentationcenter: ''
author: mgoedtel
manager: carmonm
editor: ''
tags: azure-resource-manager
ms.assetid: ''
ms.service: virtual-machines-linux
ms.topic: article
ms.tgt_pltfrm: vm-linux
ms.workload: infrastructure-services
ms.date: 03/29/2019
ms.author: magoedte
ms.openlocfilehash: 82f9c5a67cb056752cf8310be3b7c9f0bd2501e9
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 03/28/2020
ms.locfileid: "79254045"
---
# <a name="azure-monitor-dependency-virtual-machine-extension-for-linux"></a>Azure Monitor Dependency virtual machine-tillägg för Linux
Funktionen Azure Monitor för virtuella datorer kart får sina data från Microsoft Dependency Agent. Virtual Machine-tillägget för azure VM Dependency-agent för Linux publiceras och stöds av Microsoft. Tillägget installerar beroendeagenten på virtuella Azure-datorer. Det här dokumentet beskriver de plattformar, konfigurationer och distributionsalternativ som stöds för azure VM Dependency agent virtual machine-tillägget för Linux.
## <a name="prerequisites"></a>Krav
### <a name="operating-system"></a>Operativsystem
Azure VM Dependency agent-tillägget för Linux kan köras mot de operativsystem som stöds i avsnittet [Operativsystem som stöds](../../azure-monitor/insights/vminsights-enable-overview.md#supported-operating-systems) i distributionsartikeln för Azure Monitor för virtuella datorer.
## <a name="extension-schema"></a>Tilläggsschema
Följande JSON visar schemat för Azure VM Dependency agent-tillägget på en Azure Linux VM.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"vmName": {
"type": "string",
"metadata": {
"description": "The name of existing Linux Azure VM."
}
}
},
"variables": {
"vmExtensionsApiVersion": "2017-03-30"
},
"resources": [
{
"type": "Microsoft.Compute/virtualMachines/extensions",
"name": "[concat(parameters('vmName'),'/DAExtension')]",
"apiVersion": "[variables('vmExtensionsApiVersion')]",
"location": "[resourceGroup().location]",
"dependsOn": [
],
"properties": {
"publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
"type": "DependencyAgentLinux",
"typeHandlerVersion": "9.5",
"autoUpgradeMinorVersion": true
}
}
],
"outputs": {
}
}
```
### <a name="property-values"></a>Egenskapsvärden
| Namn | Värde/exempel |
| ---- | ---- |
| apiVersion | 2015-01-01 |
| utgivare | Microsoft.Azure.Monitoring.DependencyAgent |
| typ | BeroendeAgentLinux |
| typHandlerVersion | 9.5 |
## <a name="template-deployment"></a>Malldistribution
Du kan distribuera Azure VM-tillägg med Azure Resource Manager-mallar. Du kan använda JSON-schemat som beskrivs i föregående avsnitt i en Azure Resource Manager-mall för att köra azure VM Dependency agent-tillägget under en Azure Resource Manager-malldistribution.
JSON för en virtuell dator förlängning kan kapslas inuti den virtuella datorn resurs. Du kan också placera den på rot- eller toppnivå i en Resource Manager JSON-mall. Placeringen av JSON påverkar värdet för resursnamnet och resurstypen. Mer information finns i [Ange namn och typ för underordnade resurser](../../azure-resource-manager/templates/child-resource-name-type.md).
I följande exempel förutsätts att tillägget beroendeagent är kapslat i resursen för den virtuella datorn. När du kapslar tilläggsresursen placeras `"resources": []` JSON i objektet för den virtuella datorn.
```json
{
"type": "extensions",
"name": "DAExtension",
"apiVersion": "[variables('apiVersion')]",
"location": "[resourceGroup().location]",
"dependsOn": [
"[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]"
],
"properties": {
"publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
"type": "DependencyAgentLinux",
"typeHandlerVersion": "9.5",
"autoUpgradeMinorVersion": true
}
}
```
När du placerar tillägget JSON i roten av mallen innehåller resursnamnet en referens till den överordnade virtuella datorn. Typen återspeglar den kapslade konfigurationen.
```json
{
"type": "Microsoft.Compute/virtualMachines/extensions",
"name": "<parentVmResource>/DAExtension",
"apiVersion": "[variables('apiVersion')]",
"location": "[resourceGroup().location]",
"dependsOn": [
"[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]"
],
"properties": {
"publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
"type": "DependencyAgentLinux",
"typeHandlerVersion": "9.5",
"autoUpgradeMinorVersion": true
}
}
```
## <a name="azure-cli-deployment"></a>Azure CLI-distribution
Du kan använda Azure CLI för att distribuera VM-tillägget för beroendeagent till en befintlig virtuell dator.
```azurecli
az vm extension set \
--resource-group myResourceGroup \
--vm-name myVM \
--name DependencyAgentLinux \
--publisher Microsoft.Azure.Monitoring.DependencyAgent \
--version 9.5
```
## <a name="troubleshoot-and-support"></a>Felsöka och support
### <a name="troubleshoot"></a>Felsöka
Data om tillståndet för tilläggsdistributioner kan hämtas från Azure-portalen och med hjälp av Azure CLI. Om du vill se distributionstillståndet för tillägg för en viss virtuell dator kör du följande kommando med hjälp av Azure CLI:
```azurecli
az vm extension list --resource-group myResourceGroup --vm-name myVM -o table
```
Utdata för tilläggskörning loggas till följande fil:
```
/opt/microsoft/dependcency-agent/log/install.log
```
### <a name="support"></a>Support
Om du behöver mer hjälp när som helst i den här artikeln kontaktar du Azure-experterna på [MSDN Azure- och Stack Overflow-forumen](https://azure.microsoft.com/support/forums/). Du kan också arkivera en Azure-supportincident. Gå till [Azure-supportwebbplatsen](https://azure.microsoft.com/support/options/) och välj **Hämta support**. Information om hur du använder Azure Support finns i [vanliga frågor och svar om Microsoft Azure-support](https://azure.microsoft.com/support/faq/).
| 39.748466 | 482 | 0.729897 | swe_Latn | 0.861085 |
c0332c635486ff54e37d50ffc6207b76cefe95e6 | 2,844 | md | Markdown | contents/chapter02/_posts/21-02-08-02_01_02_Affine_set.md | simonseo/convex-optimization | 9b16dda99c092601d08b6baef2d1f1ba7df3786a | [
"MIT"
] | 67 | 2021-01-09T19:01:02.000Z | 2022-03-18T09:54:52.000Z | contents/chapter02/_posts/21-02-08-02_01_02_Affine_set.md | simonseo/convex-optimization | 9b16dda99c092601d08b6baef2d1f1ba7df3786a | [
"MIT"
] | 136 | 2021-01-28T14:06:06.000Z | 2022-02-19T14:31:33.000Z | contents/chapter02/_posts/21-02-08-02_01_02_Affine_set.md | simonseo/convex-optimization | 9b16dda99c092601d08b6baef2d1f1ba7df3786a | [
"MIT"
] | 16 | 2021-05-08T06:44:17.000Z | 2022-03-21T07:08:52.000Z | ---
layout: post
title: 02-01-02 Affine set
chapter: "02"
order: 3
owner: "Wontak Ryu"
---
Affine set은 점(point), 직선(line), 평면(plane), 초평면(hyperplane)과 같이 선형적 특성이 있으면서 경계가 없는 집합을 말한다. 어떤 집합이 affine set이라고 말할 수 있으려면 집합에 속한 임의의 두 점으로 직선을 만들어서 그 직선이 집합에 포함되는지를 보면 된다. 이쯤에서 다들 느끼겠지만 직선이 포함된다는 의미는 경계가 없다는 의미이므로 어떤 공간이 경계가 있다면 affine set이 될 수 없다는 것을 직관적으로 알 수 있을 것이다. 수학적으로 이 내용을 정의해보자.
## Affine set
집합 $$C \subseteq R^n$$에 속한 두 점 $$x_1$$, $$x_2 \in C$$을 지나는 직선을 만들었을 때 이 직선이 $$C$$에 포함되면 이 집합을 **affine set**이라고 한다.
>$$\theta x_1 + (1-\theta)x_2 \in C$$ with $$\theta \in R$$
이 식을 다르게 해석해 보면 set $$C$$에 속한 두 점을 linear combination하되 계수의 합을 1로 제한했다고 볼 수도 있다. (이 식에서 계수인 $$\theta$$와 $$(1-\theta)$$의 합은 1이다. ) 그리고, 그 결과가 $$C$$에 다시 포함되면 affine set이다.
## Affine combination
여러 점들을 linear combination할 때 계수의 합을 1로 제한하게 되면 이를 **affine combination**이라고 한다.
>$$\theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_k x_k \in C$$ with $$\theta_1 + \theta_2 + ... + \theta_k = 1$$
이제 affine set의 정의를 affine combination 개념을 이용해서 일반화해 볼 수 있다. 즉, 어떤 집합에 속하는 점들을 affine combination했을 때 그 결과가 다시 그 집합에 속하면 그 집합은 affine set이라고 말할 수 있다.
반대로 affine set에 속하는 점들을 affine combination하면 항상 set에 속하게 된다.
## Affine hull
$$C \subseteq \mathbb{R}^n$$에 포함된 점들의 모든 affine combination들의 집합을 $$C$$의 affine hull이라고 하며 **aff** $$C$$로 표기한다. Affine hull **aff** $$C$$은 항상 affine set이며, 집합 $$C$$를 포함하는 가장 작은 affine set이다.
> $$ \mathbb{aff} C = \{ \theta_1 x_1 + \dotsi + \theta_k x_k \phantom{1} \mid \phantom{1} x_1, \dotso, x_k \in C, \theta_1 + \dotsi + \theta_k = 1 \}$$
## Affine set과 subspace의 관계
Affine set $$C$$가 있을 때 $$x_0 \in C$$라면 set $$V = C - x_0$$는 subspace이다.
($$V$$가 subspace라는 증명은 아래에 있다.)
>$$V = C - x_0 = \{ x - x_0 \phantom{1} \mid \phantom{1} x \in C \}$$
따라서, **"Affine set $$C$$은 linear subspace $$V$$를 $$x_0$$만큼 translation한 것이다"** 라고 할 수 있으며, $$x_0$$는 집합 $$C$$에서 임의로 선택할 수 있다. 그리고, $$C$$의 차원은 $$V$$의 차원과 같다. ($$C, V \subseteq \mathbb{R}^n$$)
>$$C = V + x_0 = \{ v + x_0 \phantom{1} \mid\phantom{1} v \in V \}$$
#### [참고] $$V$$가 subspace임을 증명
$$V$$는 subspace라는 것을 증명하려면 sum과 scalar multiplication에 닫혀있다는 것을 보이면 된다.
먼저 $$v_1, v_2 \in V$$이고 $$\alpha, \beta \in R$$라고 하자. 만일 $$\alpha v_1 + \beta v_2 + x_0$$가 $$C$$에 속한다는 것을 확인한다면, $$V = C - x_0$$에 따라 $$\alpha v_1 + \beta v_2 \in V$$가 되므로 $$V$$가 sum과 scalar multiplication에 닫혀있다는 것을 알 수 있다.
$$v_1 + x_0 \in C$$이고 $$v_2 + x_0 \in C$$이므로 $$\alpha v_1 + \beta v_2 + x_0$$는 다음과 같이 전개될 수 있다.
이때, 전개 결과에서 계수들의 합이 $$\alpha + \beta + (1 - \alpha - \beta) = 1$$이므로, $$C$$에 속하는 세 점의 affine combination 형태임을 알 수 있다. 따라서, 전개 결과는 집합 $$C$$에 속하게 된다.
>$$\alpha v_1 + \beta v_2 + x_0 = \alpha (v_1 + x_0) + \beta (v_2 + x_0) + (1 - \alpha - \beta) x_0 \in C$$
정리하면 $$\alpha v_1 + \beta v_2 + x_0 \in C$$이기 때문에 $$\alpha v_1 + \beta v_2 \in V$$가 되어서 $$V$$는 sum과 scalar multiplication에 닫혀있는 subspace임을 알 수 있다.
| 49.894737 | 289 | 0.620253 | kor_Hang | 1.00001 |
c03360f01453267f9952b5f26babe6812c077814 | 711 | md | Markdown | _posts/Writing/2020년/2020-11-07-하나의 챕터를 마치고.md | guswns1659/guswns1659.github.io | c78f6b911b7da899788f38a929e166a060d795c0 | [
"MIT"
] | 1 | 2020-09-26T11:15:45.000Z | 2020-09-26T11:15:45.000Z | _posts/Writing/2020년/2020-11-07-하나의 챕터를 마치고.md | guswns1659/guswns1659.github.io | c78f6b911b7da899788f38a929e166a060d795c0 | [
"MIT"
] | 2 | 2020-05-27T00:31:14.000Z | 2021-05-20T15:29:50.000Z | _posts/Writing/2020년/2020-11-07-하나의 챕터를 마치고.md | guswns1659/guswns1659.github.io | c78f6b911b7da899788f38a929e166a060d795c0 | [
"MIT"
] | 1 | 2022-03-10T00:24:34.000Z | 2022-03-10T00:24:34.000Z | ---
title: "하나의 챕터를 마치고"
header:
overlay_image: /assets/writing.jpg
overlay_filter: 0.2
caption: "Photo credit: [**Unsplash**](https://unsplash.com)"
categories:
- Writing
---
글은 나를 표현하는 행위입니다.
컴퓨터 공학와 전혀 상관없는 전공을 졸업했다. 그리고 개발자로 커리어를 시작하기 위해 준비했던 시기가 막 끝났다. 이제는 개발자로서 성장하는 이야기가 담길 다른 챕터가 시작된다.
합격 소식을 주변 지인들에게 전했다. 다들 축하해주며 출근은 언제냐 물었다. 바로 다음주 월요일이라 전하니 대부분 왜이리 빨리 가냐며 아쉬워했다. 나도 더 쉬고 싶기도 했지만 합격에 큰 의미부여를 하고 싶지 않았다. 새로운 시작이라는 마음으로 다시 꾸준히 걸어가고 싶었다.
5일 정도 쉬면서 개발자를 준비하기 위해 달려온 시기를 정리했다. 그리고 하나의 책을 읽었다. "기술적인 언어를 새로운 인간적인 언어로 컴파일하여 개발자의 생활에 인간미와 온기를 불어 넣으려고 했다"던 책이다. 여러가지 생각을 던져주는 좋은 책이다.

| 35.55 | 152 | 0.724332 | kor_Hang | 1.00001 |
c033ec921f59ae28db407fdd317f9c78c882f79c | 1,495 | md | Markdown | vurdere-pull-request.md | navikt/pam-policy | 34c665f5535a44b15734a161fd4823d5f35483cd | [
"MIT"
] | 1 | 2019-08-08T05:58:51.000Z | 2019-08-08T05:58:51.000Z | vurdere-pull-request.md | navikt/pam-policy | 34c665f5535a44b15734a161fd4823d5f35483cd | [
"MIT"
] | 1 | 2018-09-19T12:34:22.000Z | 2019-05-28T15:16:43.000Z | vurdere-pull-request.md | navikt/pam-policy | 34c665f5535a44b15734a161fd4823d5f35483cd | [
"MIT"
] | 1 | 2020-09-15T13:40:31.000Z | 2020-09-15T13:40:31.000Z | # Vurdere pull request
1. Les gjennom akseptansekriteriene og se om de er oppfylt i koden
2. Sjekk kodekvalitet, lesbarhet og forståelse, og at ting forstås funksjonelt
3. Sjekk koden mot relevante kriterier under (og alt annet du kan komme på)
4. Kommenter mangler og ting som er bra
5. Godkjenn hvis alt ser bra ut
6. **Uansett:** Si fra på slack (i #team-aasmund-pr / #team-tuan-github) at pull requesten har kommentarer eller er godkjent
## Endepunkter
Sjekk at
- Endringer i APIer er bakoverkompatible
- Endepunkter er tilstrekkelig beskyttet
- Endepunkter ikke eksponerer for mye/sensitiv data til brukere som ikke skal ha tilgang til det
## Database
- Ved opprettelse av tabeller eller nye kolonner, se etter potensielle behov for indekser
- Se etter manglende parametrisering av SQL. (SQL-injection)
- Sjekk at evt. endringer i migreringsscript er bakoverkompatible
- Sjekk at migreringen enten skjer fullstendig eller ikke i det hele tatt. Pass på at migreringen er wrappet inn i en transaction. Delvis migreringer er vanskelig å komme seg ut av.
## Universell utforming
- Se etter semantisk korrekt bruk av h1, h2, h3? Korrekt rekkefølge h1,h2,h3 i stedet for h6,h7,h8.
- Alle input, links, images og andre synlige elementer bør ha alt tekst.
- Input felter må ha placeholder og label.
- Alle elementer nås med tastatur. F.eks.: lenker, knapper, radioknapper, checkboxer, nedtrekksmenyer og funksjoner kan styres ved tastatur. At man kan navigere seg vekk/videre vha tastatur.
| 48.225806 | 190 | 0.785953 | nob_Latn | 0.989005 |
c034c15a1678e97694b47e67c0afd2e4b390d8a5 | 424 | md | Markdown | README.md | finngaida/HackerNews | d3b22527d344f50ca2adb6dee83247e1b8b6fe72 | [
"MIT"
] | null | null | null | README.md | finngaida/HackerNews | d3b22527d344f50ca2adb6dee83247e1b8b6fe72 | [
"MIT"
] | null | null | null | README.md | finngaida/HackerNews | d3b22527d344f50ca2adb6dee83247e1b8b6fe72 | [
"MIT"
] | null | null | null | # HackerNews
---------
Obviously there are several Hacker News iOS Apps on the AppStore, but none of them suits my needs perfectly, so I decided to start my own project.
## Features
- List current top links in a table
- Load preview images for every entry lazily
- Peek into articles and add to Reading List from there
- Awesome reload control
## ToDos
- Implement caching
- Have the reload actually work
- Improve design
| 28.266667 | 146 | 0.757075 | eng_Latn | 0.997991 |
c035077745bbb70b224ab59e36c492fed7e59a89 | 28 | md | Markdown | README.md | mkosir/react-native-parallax-tilt | 3c08bb9cbbf564006a1efecfc3c8917b054fb57b | [
"MIT"
] | null | null | null | README.md | mkosir/react-native-parallax-tilt | 3c08bb9cbbf564006a1efecfc3c8917b054fb57b | [
"MIT"
] | null | null | null | README.md | mkosir/react-native-parallax-tilt | 3c08bb9cbbf564006a1efecfc3c8917b054fb57b | [
"MIT"
] | null | null | null | # react-native-parallax-tilt | 28 | 28 | 0.821429 | eng_Latn | 0.756359 |
c03540aa147fd51b07f82c2973ce69692eef71e2 | 521 | md | Markdown | README.md | hello-slide/token-manager | 9bcb6e637fbee6827f48ec040f13df378c268d1a | [
"MIT"
] | null | null | null | README.md | hello-slide/token-manager | 9bcb6e637fbee6827f48ec040f13df378c268d1a | [
"MIT"
] | null | null | null | README.md | hello-slide/token-manager | 9bcb6e637fbee6827f48ec040f13df378c268d1a | [
"MIT"
] | null | null | null | # Token Manager
[](https://github.com/hello-slide/token-manager/actions/workflows/docker-publish.yml)
[](https://github.com/hello-slide/token-manager/actions/workflows/go.yml)
## Overview
- PASETOを使用した復号可能なトークンを作成します。
```env
PUBLIC_KEY="*******"
```
## Test
```bash
go test ./test
```
## LICENSE
[MIT](./LICENSE)
| 22.652174 | 199 | 0.725528 | yue_Hant | 0.214209 |
c03631e6a112aae0516bd78b33c506551e221aa2 | 14,371 | md | Markdown | _posts/yummyKit/2019-01-07-6day.md | yummyHit/yummyHit.github.io | 54e5003459ee83ec4d7f1b74f72710d29f0486d1 | [
"MIT"
] | null | null | null | _posts/yummyKit/2019-01-07-6day.md | yummyHit/yummyHit.github.io | 54e5003459ee83ec4d7f1b74f72710d29f0486d1 | [
"MIT"
] | null | null | null | _posts/yummyKit/2019-01-07-6day.md | yummyHit/yummyHit.github.io | 54e5003459ee83ec4d7f1b74f72710d29f0486d1 | [
"MIT"
] | null | null | null | ---
title: "[yummyKit] 6 day"
category: "tooldev"
header:
teaser: /assets/images/yummyKit/6day/03.gif
last_modified_at: 2019-01-07T06:04:00+09:00
permalink: /tooldev/2019-01-07-06day
---
<p style="TEXT-ALIGN: center;"><strong><span style="COLOR: #ff0000">※주의사항※<br />연구 목적으로 작성된 것이며, 허가 받지 않은 공간에서는 테스트를 절대 금지합니다. 악의적인 목적으로 이용할 시 발생할 수 있는 법적 책임은 자신한테 있습니다. 이는 해당 글을 열람할 때 동의하였다는 것을 의미합니다.</span></strong></p>
<br />
<p style="TEXT-ALIGN: center;">흐어 ㅜㅜ 이 글은 분명 3일 전에 쓰고있던건뎀 갑자기 IE 브라우저가 다운되면서 저장도 안된 상태로 날아가부렀서욤 ㅜㅜㅜ 그래서 다시씀니다!! 후욱,, 후욱,,</p>
<p style="TEXT-ALIGN: center;">요즘 너무 나태해지다 보니 정신이 몽롱해지면서 머리가 몽총해진다는 것을 깨달았어욧!! 세상에나 마상에나 코딩이 안되는거 있죵 ㅜㅜ 그래서! 신년 목표 및 계획을 핑계삼아 오늘부터 열씸히 하려함니다!! 석사 가즈아~</p>
<br />
<p style="TEXT-ALIGN: center;"><img width="585" height="437" src="/assets/images/yummyKit/6day/01.png" /></p>
<br />
<p style="TEXT-ALIGN: center;">냠냠!! 역시 *nix 짱짱맨이져!! 꺄륵~ (모든 윈도우 유저에게 죄송함니다.. 흑 제가 윈도우를 좋아하진 않아서욥.. 호고곡!)</p>
<p style="TEXT-ALIGN: center;">참 요즘 깃헙에 재밌는거 올리고있어욤!! 씨언어로 만드는 객체지향 클래스!! 물론 객체지향이라함은 꽃과 같은 기능이 6개나 존재해서(encapsulation, inheritance, overloading, overriding, virtual/interface/abstract, access modifier) 전부 구현은 어렵지만욤!! 클래스 기능을 최대한 구현해보고, C1X(C11 이라고도 함미당)부터 생긴 _Generic 키워드를 통해 오버로딩도 구현할 슈 있어욤!! 이렇게 언어 만들어 가즈아~!</p>
<br />
## Live Packet Capture
<p style="TEXT-ALIGN: center;">자! 오늘 하려고 했던 거슨 바로바로 live packet capture 예욤!! 지난 포스팅에서는 pcap_open_offline() 함수를 통해서 패킷이 캡쳐된 파일을 분석하는 것이었다면, 이번엔 실제로 흘러가는 패킷을 분석하는 것이죵!!</p>
<p style="TEXT-ALIGN: center;">호에에 이것을 구현하믄 이제 sniffer 는 금방 구현하지 않을까욥!? 네 않을 수도 있숨미다.. 왜냐하믄 arp 의 개념을 딱! 머릿속에 박은 채로 victim 과 router 사이에 잘 위치해야하거든욤!! 우선 소스코드를 보여드리기 전에, 수행할 결과화면을 보여드릴게욤!!</p>
<br />
### sample image
<p style="TEXT-ALIGN: center;"><img width="779" height="2420" src="/assets/images/yummyKit/6day/02.png" /></p>
<br />
<p style="TEXT-ALIGN: center;">먼저 제 NIC 는 저기 보이는 NPF_{F044528C_..._558A} 하나만 있다고 해욤! 히히 가상머신이라 딱히 vmnet 이 없어서 1개밖에 안잡혀욧!</p>
<p style="TEXT-ALIGN: center;">그리고 이 윈도우의 IP는 172.20.10.5 번이었구욤, 라우터의 IP는 172.20.10.1 이에욤! 172.20.10.3 번이 간혹 보이는뎅, 얘는 제 가상머신 중 하나인 우분투 리눅스의 IP구욤!</p>
<p style="TEXT-ALIGN: center;">Source/Destination MAC Address를 보시면 딱 2가지만 번갈아가면서 나와욤! 하나는 윈도우의 MAC, 다른 하나는 라우터의 MAC 주소람니다!! 호엥 왜 외부 패킷으로 나가는데 라우터까지만 MAC 주소가 나오냐구욤!?</p>
<br />
### network flow
<p style="TEXT-ALIGN: center;"><strong><em><span style="color: #B827EE">인터넷 통신을 보았을 때, Host PC --> Router --> Router --> ... --> Data Center --> ... --> DNS Server </span><span style="color: #B827EE">--> .. --> Router --> Web Server 와 같은 순서로 패킷이 흐르는데, Host PC 는 7계층에서 시작하여서 Router 가 존재하는 3계층(뭐 Switch 의 경우 MultiLayer 까지 존재해서 Layer 2 ~ 7 까지 다양하지만, 라우터의 경우 L4 까지 존재하는 것으로 알고있어용!!)을 통해 물리계층을 지나 다시 Web Server 인 7계층까지 도달할것이에욤!!</span></em></strong></p>
<p style="TEXT-ALIGN: center;"><strong><em><span style="color: #B827EE">그럼 우리의 Host PC 는 MAC Address 를 담아서 맞물려있는 라우터 즉, 현재 연결되어 있는 네트워크의 라우터에게 패킷을 전송, 이 라우터는 라우팅 테이블을 통해 다음 라우터로 지나가므로 MAC Address 가 필요하지 않아서 헤더를 떼어내 버려요!! 그렇기 때문에 Host PC 의 MAC Address 와 라우터의 MAC Address 만 나오게 되는 것이죵!!</span></em></strong></p>
<br />
<p style="TEXT-ALIGN: center;"><strong><span style="COLOR: #ff0000">※ 참고로 위 작업은 개인 휴대폰의 핫스팟을 이용하였으며, 타인에게 피해가 가지 않도록 실습하였음을 말씀드립니다. 다시 한 번 말씀드리지만 모의실습을 진행할 시 자신만의 실습 환경에서만 진행하시길 부탁드리며, 그렇지 않을 시 모든 법적 책임은 자신에게 있다는 것을 명심하길 바랍니다.</span></strong></p>
<br />
<p style="TEXT-ALIGN: center;">해당 패킷 흐름은 Host 인 윈도우 10에서 N사(뇌이뻐!)의 포털사이트에 접속했을 때 잡힌 패킷이에욤! 나중에 패킷들 분석해보시면 중간중간 SSL 적용 안되어있던것도..읍읍</p>
<p style="TEXT-ALIGN: center;"><span style="color: #B827EE"><strong><em>위 네이버로 통하는 IP는 cmd 창이나 terminal 에서 "nslookup www.naver.com" 에서도 볼 수 있는 IP + "traceroute(윈도우의 경우 tracert) www.naver.com" 으로 확인할 수 있으니 "엇!! 이거 해킹아냡!! 막 네이버 아이피 알고 그러면 잡혀가는거 아니얍!!" 하실 필요는 없어용!! 각 포털 사이트 및 중견급 이상 기업의 사이트들은 웹서버에 접근하지 못하도록 중간에 패킷에 정보를 주지 않고 흘리거나, access denied 시켜버려욤!</em></strong></span></p>
<br />
<p style="TEXT-ALIGN: center;"><span style="color: #B827EE"><strong><em>아닛 어떻게 access denied 를 시킬 수 있는거짓!? 리눅스에서의 traceroute 명령은 초기에 UDP 프로토콜을 통해 next hop 으로 넘어가도록 되어있어욤. 그러다가 ICMP 프로토콜의 TTL 값이 끝(총 라우터 64번 거친 경우; 리눅스는 default TTL maximum 64, 윈도우는 default TTL maximum 128)나거나, time to live exceeded 되거나 등등의 오류/성공 코드를 통해 결과를 받아욤!</em></strong></span></p>
<p style="TEXT-ALIGN: center;"><span style="color: #B827EE"><strong><em>윈도우에서의 tracert 명령은 처음부터 ICMP 프로토콜을 이용해서 결과를 받구욤! 둘이 약간 다르죵!? ㅎㅎ</em></strong></span></p>
<br />
<p style="TEXT-ALIGN: center;"><span style="color: #B827EE"><strong><em>위 ICMP 프로토콜은 패킷 전송이 원활히 흘러가는가 알아보기 위한 ping 명령을 위할 때가 아니면 사용할 일이 적기에, 보통 막아버려욤! 특히 윈도우는 처음부터 막혀있어서 필요시 wf.msc 와 같은 방화벽 정책에서 열어주어야 함니다!</em></strong></span></p>
<br />
<p style="TEXT-ALIGN: center;">오잉 위 패킷캡쳐 구현을 말씀드리다가 막 네트워크까지 주저리주저리했네욤 :) 히히</p>
<p style="TEXT-ALIGN: center;">이제 소스코드를 보여드릴 때가 된 것 같군뇸!! 아마 이번 포스팅에서는 main 함수까진 못하구 사용되는 라이브러리 및 타 함수만 분석해드리고 빠잇! 할 것 같아욤!! 오늘도 아리따운 마크다운을 이용해서 가져와보기 전에! 다 가져가야만 속이 후련한 해바라기 식당의 아들을 만나보고 갈까욥!?</p>
<br />
<p style="TEXT-ALIGN: center;"><img width="600" height="354" src="/assets/images/yummyKit/6day/03.gif" /></p>
<br />
<p style="TEXT-ALIGN: center;"><strike>속이 후련했... 냐!!!! 내가 더 슬프게 해주꿰 ㅜㅜ 병진이형! 형은 나가이써 돼지고기 싫으면!!</strike></p>
<p style="TEXT-ALIGN: center;">정말정말 명작인 영화이자 연기 짱짱의 김래원쨩..</p>
<p style="TEXT-ALIGN: center;">그럼 이제 소스코드를 보러 갑씨다! 뾰로롱</p>
<br />
### source code
```cpp
#pragma comment(lib, "ws2_32.lib")
/*
This code is made by yummyHit using Microsoft Visual C++ 2010 Express.
You must have winpcap and add to library, include directory at your VC/bin directory.
+ Additional, install libnet and include it.
+ Additional, include <libnet-macros.h> in libnet-headers.h file.
Must be add linker: ws2_32.lib; wpcap.lib; Packet.lib
*/
#define HAVE_REMOTE
#include <stdio.h>
#include <stdlib.h>
#include <WinSock2.h>
#include <pcap.h>
#include <libnet.h>
#define PROMISCUOUS 1
#define NONPROMISCUOUS 0
void mac_print(int init, struct libnet_ethernet_hdr *eth) {
int i = 0;
if(init == 0) {
while(i < ETHER_ADDR_LEN) {
printf("%02x:", eth->ether_shost[i]); // print Source Mac Address
if((i+1) == (ETHER_ADDR_LEN-1)) // If next i value is last index, print last address value and quit loop.
printf("%02x", eth->ether_shost[++i]);
i++;
}
printf("\n");
}
else {
i = 0;
while(i < ETHER_ADDR_LEN) {
printf("%02x:", eth->ether_dhost[i]); // print Destination Mac Address
if((i+1) == (ETHER_ADDR_LEN-1)) // If next i value is last index, print last address value and quit loop.
printf("%02x", eth->ether_dhost[++i]);
i++;
}
printf("\n");
}
}
```
<br />
<p style="TEXT-ALIGN: center;">지난 포스팅과 include 하는 것은 거의 똑같져!? 다른거라면 libnet 라이브러리!</p>
### libnet library
<p style="TEXT-ALIGN: center;">지난 코드에선 ethernet frame, arp/ip packet, tcp/udp segment 구조체를 직접 만들어서 패킷 데이터를 할당시켰잖아욤!? 이젠 그럴필요 없슴니다! 묻지도 따지지도 말고 libnet 라이브러리만 추가하시면 되어욤!! 오픈소스로 있으니 직접 사이트에서 다운로드 받으셔두 되구욤!!</span></p>
<p style="TEXT-ALIGN: center;"><strong><em>(libnet web site: </em></strong><a href="http://packetfactory.openwall.net/projects/libnet"><strong><em>http://packetfactory.openwall.net/projects/libnet</em></strong></a><strong><em>)</em></strong></p>
<p style="TEXT-ALIGN: center;">혹은 리눅스라면 역시 패키지 다운로드를 통하면 됨니다!!</p>
<p style="TEXT-ALIGN: center;"><strong><em>데비안의 경우 libnet1-* / 레드햇의 경우 libnet* 을 통해서 설치하시면 되어욤!</em></strong></p>
<br />
<p style="TEXT-ALIGN: center;"><strong>#include <libnet.h></strong></p>
<p style="TEXT-ALIGN: center;">여기서 libnet-headers.h 가 필요한 것인데, 말 그대로 패킷 전송에 필요한 헤더를 모아둔 곳이에욤! 해당 파일에서 필요한 구조체는</p>
<br />
```cpp
/*
* Ethernet II header
* Static header size: 14 bytes
*/
struct libnet_ethernet_hdr
{
uint8_t ether_dhost[ETHER_ADDR_LEN];/* destination ethernet address */
uint8_t ether_shost[ETHER_ADDR_LEN];/* source ethernet address */
uint16_t ether_type; /* protocol */
};
/*
* ARP header
* Address Resolution Protocol
* Base header size: 8 bytes
*/
struct libnet_arp_hdr
{
uint16_t ar_hrd; /* format of hardware address */
#define ARPHRD_NETROM 0 /* from KA9Q: NET/ROM pseudo */
#define ARPHRD_ETHER 1 /* Ethernet 10Mbps */
#define ARPHRD_EETHER 2 /* Experimental Ethernet */
#define ARPHRD_AX25 3 /* AX.25 Level 2 */
#define ARPHRD_PRONET 4 /* PROnet token ring */
#define ARPHRD_CHAOS 5 /* Chaosnet */
#define ARPHRD_IEEE802 6 /* IEEE 802.2 Ethernet/TR/TB */
#define ARPHRD_ARCNET 7 /* ARCnet */
#define ARPHRD_APPLETLK 8 /* APPLEtalk */
#define ARPHRD_LANSTAR 9 /* Lanstar */
#define ARPHRD_DLCI 15 /* Frame Relay DLCI */
#define ARPHRD_ATM 19 /* ATM */
#define ARPHRD_METRICOM 23 /* Metricom STRIP (new IANA id) */
#define ARPHRD_IPSEC 31 /* IPsec tunnel */
uint16_t ar_pro; /* format of protocol address */
uint8_t ar_hln; /* length of hardware address */
uint8_t ar_pln; /* length of protocol addres */
uint16_t ar_op; /* operation type */
#define ARPOP_REQUEST 1 /* req to resolve address */
#define ARPOP_REPLY 2 /* resp to previous request */
#define ARPOP_REVREQUEST 3 /* req protocol address given hardware */
#define ARPOP_REVREPLY 4 /* resp giving protocol address */
#define ARPOP_INVREQUEST 8 /* req to identify peer */
#define ARPOP_INVREPLY 9 /* resp identifying peer */
u_char ar_sha[6]; // Sender hardware address
u_char ar_spa[4]; // Sender IP address
u_char ar_dha[6]; // Target hardware address
u_char ar_dpa[4]; // Target IP address
};
/*
* IPv4 header
* Internet Protocol, version 4
* Static header size: 20 bytes
*/
struct libnet_ipv4_hdr
{
#if (LIBNET_LIL_ENDIAN)
uint8_t ip_hl:4, /* header length */
ip_v:4; /* version */
#endif
#if (LIBNET_BIG_ENDIAN)
uint8_t ip_v:4, /* version */
ip_hl:4; /* header length */
#endif
uint8_t ip_tos; /* type of service */
#ifndef IPTOS_LOWDELAY
#define IPTOS_LOWDELAY 0x10
#endif
#ifndef IPTOS_THROUGHPUT
#define IPTOS_THROUGHPUT 0x08
#endif
#ifndef IPTOS_RELIABILITY
#define IPTOS_RELIABILITY 0x04
#endif
#ifndef IPTOS_LOWCOST
#define IPTOS_LOWCOST 0x02
#endif
uint16_t ip_len; /* total length */
uint16_t ip_id; /* identification */
uint16_t ip_off;
#ifndef IP_RF
#define IP_RF 0x8000 /* reserved fragment flag */
#endif
#ifndef IP_DF
#define IP_DF 0x4000 /* dont fragment flag */
#endif
#ifndef IP_MF
#define IP_MF 0x2000 /* more fragments flag */
#endif
#ifndef IP_OFFMASK
#define IP_OFFMASK 0x1fff /* mask for fragmenting bits */
#endif
uint8_t ip_ttl; /* time to live */
uint8_t ip_p; /* protocol */
uint16_t ip_sum; /* checksum */
struct in_addr ip_src, ip_dst; /* source and dest address */
};
/*
* TCP header
* Transmission Control Protocol
* Static header size: 20 bytes
*/
struct libnet_tcp_hdr
{
uint16_t th_sport; /* source port */
uint16_t th_dport; /* destination port */
uint32_t th_seq; /* sequence number */
uint32_t th_ack; /* acknowledgement number */
#if (LIBNET_LIL_ENDIAN)
uint8_t th_x2:4, /* (unused) */
th_off:4; /* data offset */
#endif
#if (LIBNET_BIG_ENDIAN)
uint8_t th_off:4, /* data offset */
th_x2:4; /* (unused) */
#endif
uint8_t th_flags; /* control flags */
#ifndef TH_FIN
#define TH_FIN 0x01 /* finished send data */
#endif
#ifndef TH_SYN
#define TH_SYN 0x02 /* synchronize sequence numbers */
#endif
#ifndef TH_RST
#define TH_RST 0x04 /* reset the connection */
#endif
#ifndef TH_PUSH
#define TH_PUSH 0x08 /* push data to the app layer */
#endif
#ifndef TH_ACK
#define TH_ACK 0x10 /* acknowledge */
#endif
#ifndef TH_URG
#define TH_URG 0x20 /* urgent! */
#endif
#ifndef TH_ECE
#define TH_ECE 0x40
#endif
#ifndef TH_CWR
#define TH_CWR 0x80
#endif
uint16_t th_win; /* window */
uint16_t th_sum; /* checksum */
uint16_t th_urp; /* urgent pointer */
};
/*
* UDP header
* User Data Protocol
* Static header size: 8 bytes
*/
struct libnet_udp_hdr
{
uint16_t uh_sport; /* source port */
uint16_t uh_dport; /* destination port */
uint16_t uh_ulen; /* length */
uint16_t uh_sum; /* checksum */
};
```
<br />
<br />
<p style="TEXT-ALIGN: center;">짜란! 요것들 임니다! 우리가 yummyKit 과 같은 프로그램을 만들기 위해 계속 사용할 Ethernet ! ARP ! IPv4 ! TCP ! UDP ! 대표적인 프로토콜들이죵!! 이 외에도 총 68개의 구조체가 있으니 68개의 프로토콜에 대한 헤더가 구조체로 만들어져 있는 파일임니다!!</p>
<p style="TEXT-ALIGN: center;">아아 역시 사람은 지식이 많아야함미다 ㅜㅜ 무지하니 이런것들이 있는 줄도 모르고 한땀한땀 만들면 물논 실력은 좋아질 수 있으나.. 시간을 생각했을 때 이런 정도는 시간허비가 커진다고 느껴져용 ㅜㅜ 라이브러리를 알았으면 기냥 패키지 뙇!! 다운로드 받아서 바로 include 뙇!!</p>
<br />
<p style="TEXT-ALIGN: center;">헤더 외에도 우리에게 필요했던 ETHER_ADDR_LEN, ARPOP_REQUEST, ARPOP_REPLY 와 같은 전처리가 되어있으니 이것도 사용하면 될 것 같네욤!! 코드 가독성아 좋아져라 얍!</p>
<p style="TEXT-ALIGN: center;">라이브러리만 설명하는데 포스팅 스크롤이 압박되어버렸네욥... 빠르게 다음 함수를 설명드리고 마칠게용!! 제성함미다... (((꾸벅</p>
<br />
```cpp
void mac_print(int init, struct libnet_ethernet_hdr *eth);
```
<br />
<p style="TEXT-ALIGN: center;">마크다운을 사용하시면 예쁘게 나옴니다 얏호!! (+____+) 그치만,, 이렇게라도 하지 않으면 회원쨩,, 개발 재미없다고 떠날것 같은걸!</p>
<br />
<p style="TEXT-ALIGN: center;"><strong><em><span style="color: #B827EE">온전히 함수명 그대~로 MAC Address 출력을 위함이에욤!! MAC Address 의 경우 unsigned char 형식으로 1바이트씩 6개가 이루어져 있어서 %02x 또는 %02X 로 출력해주어야 하거든요!! 또한 예쁘게 보기 위하여 MAC Address를 표기하는 AA:BB:CC:DD:EE:FF 방식을 위해 마지막 바이트에선 뒤에 ":" 세미콜론을 붙이지 않고 출력하는 것!!!</span></em></strong></p>
<p style="TEXT-ALIGN: center;"><span style="color: #B827EE"><strong><em>그리고 위 함수를 처음 들어갔을 때 init 이라는 정수형 변수를 통해 if 조건 분기를 타게되는데, 이것은 Source MAC Address 인지 Destination MAC Address 인지 구별을 위함이에욤!! libnet_ethernet_hdr 구조체를 이용했기 때문에 출발지와 목적지의 변수가 달라서 따로 써야하거든뇸!!</em></strong></span></p>
<br />
<p style="TEXT-ALIGN: center;">오늘 왠지 스크롤 압박이 되었네욥.. 히히 다음번엔 온전히 main 함수를 분석하여 실제로 첫 번째 사진과 같이 live packet capture 프로그램을 만들어보아욤!! 오늘은 월요일 아침 6시이니 월요일 좋다는 짤과 함께 다음 포스팅에서 뵈어욥 뿅!!</p>
<br />
<p style="TEXT-ALIGN: center;"><img width="330" height="330" src="/assets/images/yummyKit/6day/04.jpeg" /></p>
<br />
---
Check out the [yummyhit's website][yummy-kr] for more info on who am i. If you have questions, you can ask them on E-mail.
[yummy-kr]: http://yummyhit.kr
| 48.550676 | 475 | 0.658479 | kor_Hang | 0.999228 |
c036957c74c788fe947884872771a82f5475322e | 152 | md | Markdown | README.md | songmingpeng/deep-learning | e23c6770ccaddeeedd4c537762597b6eae8ba134 | [
"MIT"
] | 3 | 2018-10-28T02:40:01.000Z | 2018-10-28T02:58:55.000Z | README.md | tyewu/deep-learning | 3dc4636627dc77aba1aba6347304d3be65b2718c | [
"MIT"
] | null | null | null | README.md | tyewu/deep-learning | 3dc4636627dc77aba1aba6347304d3be65b2718c | [
"MIT"
] | 1 | 2018-10-26T03:31:28.000Z | 2018-10-26T03:31:28.000Z | # deep-learning
FT Wu and I decided to achieve something good with deep learning and NLP, and we believe we can do it because we trust each other. Do it!
| 50.666667 | 135 | 0.776316 | eng_Latn | 0.999749 |
c037d623f785c957fad2721a5c8a090d29cf69a0 | 309 | md | Markdown | README.md | davidson-santos/PI_ant | 0b3f4d23cf08ce5c3b6195fccb57dc77163d5afa | [
"MIT"
] | null | null | null | README.md | davidson-santos/PI_ant | 0b3f4d23cf08ce5c3b6195fccb57dc77163d5afa | [
"MIT"
] | null | null | null | README.md | davidson-santos/PI_ant | 0b3f4d23cf08ce5c3b6195fccb57dc77163d5afa | [
"MIT"
] | null | null | null | # PI_ant
Tradutor de expressões do Português para o Inglês. Projeto feito para a disciplina de POO do curso de informática do IFBA, campus Eunápolis.
# Grupo:
CAIO ALMEIDA RODRIGUES, DAVIDSON SANTOS DE OLIVEIRA,
GABRIELA CARRERA GOMES, MARIA EDUARDA SANTOS OLIVEIRA e
WANDERSON GUILHERME COSTA GONÇALVES.
| 38.625 | 141 | 0.805825 | yue_Hant | 0.556946 |
c0380338c9f896b4186b808229044273b8de1cda | 4,120 | md | Markdown | README.md | isabella232/esp-insights | 3d8b5d44dc6a94b9c35faf7a8e75d06a059c2364 | [
"Apache-2.0"
] | null | null | null | README.md | isabella232/esp-insights | 3d8b5d44dc6a94b9c35faf7a8e75d06a059c2364 | [
"Apache-2.0"
] | 1 | 2022-03-21T14:20:16.000Z | 2022-03-21T14:20:16.000Z | README.md | isabella232/esp-insights | 3d8b5d44dc6a94b9c35faf7a8e75d06a059c2364 | [
"Apache-2.0"
] | null | null | null | # ESP Insights (Beta)
ESP Insights is a remote diagnostics solution that allows users to remotely monitor the health of ESP devices in the field.
## Introduction
Developers normally prefer debugging issues by physically probing them using gdb or observing the logs. This surely helps debug issues, but there are often cases wherein issues are seen only in specific environments under specific conditions. Even things like casings and placement of the product can affect the behaviour. A few examples are
- Wi-Fi disconnections for a smart switch concealed in a wall.
- Smart speakers crashing during some specific usage pattern.
- Appliance frequently rebooting due to power supply issues.
Having remote diagnostics facility helps in identifying such issues faster. ESP Insights includes a firmware agent (the Insights agent) that captures some of the vital pieces of diagnostics information from the device during runtime and uploads them to the ESP Insights cloud. The cloud then processes this data for easy visualisation. Developers can log in to a web-based dashboard to look at the health and issues reported by their devices in the field. A sample screen is shown here.

Currently, developers can monitor the following information from the web-based dashboard:
- Error logs: Anything that is logged on console with calls with ESP_LOGE by any component in the firmware
- Warning logs: Anything that is logged on console with calls with ESP_LOGW by any component in the firmware
- Custom Events: Application specific custom events that the firmware wishes to track via calls to ESP_DIAG_EVENT
- Reset reason: The reason why the device was reset (power on, watchdog, brownout, etc.)
- Coredump summary: In case of a crash, the register contents as well as the stack backtrace of the offending thread (wherever possible)
- Metrics: Time-varying data like the free heap size, the Wi-Fi signal strength that is plotted over a period of time
- Variables: Variable values like the IP Address or state variables that report their current value
- Group analytics: Insights into how group of devices are performing
All of this information should help the developer understand better how their device is performing in the field.
You can find more details on the [Insights Features](FEATURES.md) page.
> ESP Insights currently works with the ESP Insights cloud and ESP RainMaker cloud. Support for other cloud services will be available in a subsequent release.
## Getting Started
The following code should get you started, and your application can start reporting ESP Insights data to the Insights cloud.
### Enabling Insights with HTTPS
For the Insights agent, HTTPS is configured as the default transport.
```c
#include <esp_insights.h>
#define ESP_INSIGHTS_AUTH_KEY "<Paste-Auth-Key-Here>"
{
esp_insights_config_t config = {
.log_type = ESP_DIAG_LOG_TYPE_ERROR,
.auth_key = ESP_INSIGHTS_AUTH_KEY,
};
esp_insights_init(&config);
/* Rest of the application initialization */
}
```
As you may have noticed, all you will need is the unique ESP_INSIGHTS_AUTH_KEY to be embedded in your firmware.
Here is how you can obtain the ESP Insights Auth Key:
* Sign up or Sign in on [ESP Insights Dashboard](https://dashboard.insights.espressif.com/)
* Visit [Manage Auth Keys](https://dashboard.insights.espressif.com/home/manage-auth-keys) and generate an Auth Key
* Copy the Auth Key to your firmware
### Enabling Insights with MQTT
Configure the default insights transport to MQTT (Component config → ESP Insights → Insights default transport → MQTT).
Alternatively, you can add `CONFIG_ESP_INSIGHTS_TRANSPORT_MQTT=y` to `sdkconfig.defaults`.
```c
#include <esp_insights.h>
{
esp_insights_config_t config = {
.log_type = ESP_DIAG_LOG_TYPE_ERROR,
};
esp_insights_init(&config);
/* Rest of the application initialization */
}
```
You will require the MQTT certificates, which you can obtain by performing [Claiming](examples/minimal_diagnostics#esp-insights-over-mqtt).
For more details please head over to [examples](examples).
| 49.638554 | 486 | 0.78568 | eng_Latn | 0.995278 |
c038c18358b6cf206b6c89165b941336e5b7156b | 248 | md | Markdown | After-Five/whiskey-n-soda.md | Greh/Bachelor-Chow | 769862e9898df0a7d9e151cf6355307069970f0d | [
"MIT"
] | 3 | 2015-03-25T01:18:35.000Z | 2015-08-09T15:07:39.000Z | After-Five/whiskey-n-soda.md | Greh/Bachelor-Chow | 769862e9898df0a7d9e151cf6355307069970f0d | [
"MIT"
] | 1 | 2018-07-01T21:47:01.000Z | 2018-08-08T22:58:01.000Z | After-Five/whiskey-n-soda.md | Greh/Bachelor-Chow | 769862e9898df0a7d9e151cf6355307069970f0d | [
"MIT"
] | null | null | null | Whiskey and Soda
==============
Ingredients
------------
* whiskey
* dash of lemon juice
* club soda
* extra flavoring
Extra flavoring options
------
* dash of orange blossom water
* blood orange ginger bitters (no lemon juice)
* chuncho bitters
| 15.5 | 46 | 0.665323 | eng_Latn | 0.821361 |
c03b3a78fe02f4102a1d584a50cd3a8fd5591f98 | 1,586 | md | Markdown | README.md | mirarus/bmvc-core | 0ce7a155d34229facedcaac8dd703c95d2af92a6 | [
"MIT"
] | null | null | null | README.md | mirarus/bmvc-core | 0ce7a155d34229facedcaac8dd703c95d2af92a6 | [
"MIT"
] | null | null | null | README.md | mirarus/bmvc-core | 0ce7a155d34229facedcaac8dd703c95d2af92a6 | [
"MIT"
] | 2 | 2021-07-28T09:18:05.000Z | 2021-08-03T20:54:59.000Z | # Bmvc-core
Mirarus BMVC Core (Basic MVC Core)
[](https://packagist.org/packages/mirarus/bmvc-core)
[](https://packagist.org/packages/mirarus/bmvc-core)
[](https://packagist.org/packages/mirarus/bmvc-core)
[](https://packagist.org/packages/mirarus/bmvc-core)
[](https://github.com/mirarus/bmvc-core/actions/workflows/php.yml)
Libraries: [BMVC Libs](https://github.com/mirarus/bmvc-libs)
## Installation
Install using composer:
```bash
composer require mirarus/bmvc-core
```
## Example
A minimal example application:
```php
<?php
require_once __DIR__ . '/vendor/autoload.php';
use BMVC\Core\{App, Route, Controller};
use BMVC\Libs\{MError, Benchmark};
class Main
{
function index() {
echo "[Main::index]";
}
}
Route::any('/', function () {
Controller::call('main@index');
MError::color("info")::print("Benchmark", "Memory Usage: " . Benchmark::memory());
});
App::Run([
'init' => [
//BMVC\Core\Model::class
]
]);
?>
```
## License
Licensed under the MIT license, see [LICENSE](LICENSE)
| 28.321429 | 193 | 0.722573 | kor_Hang | 0.186338 |
c03b71b491f7c7713547f460b719c4a6f3ee8b58 | 5,351 | md | Markdown | _posts/GritFX_weekly/2021-12-26-GritFX-VOL14.md | LLH07/llh07.github.io | 001532b97a57b4578b5833b67c293751d4eff034 | [
"MIT"
] | null | null | null | _posts/GritFX_weekly/2021-12-26-GritFX-VOL14.md | LLH07/llh07.github.io | 001532b97a57b4578b5833b67c293751d4eff034 | [
"MIT"
] | null | null | null | _posts/GritFX_weekly/2021-12-26-GritFX-VOL14.md | LLH07/llh07.github.io | 001532b97a57b4578b5833b67c293751d4eff034 | [
"MIT"
] | null | null | null | ---
layout: "post"
title: GritFX Vol. 14 (升降息與量化寬鬆、ON-RRP)
date: 2021-12-26 23:00:00 +0800
categories: [Macroeconomics, FX]
tags: [GritFX]
author:
name: Lung Hung Lin
link: https://www.linkedin.com/in/lung-hung-blair-lin-645a85194/
---
2021W51: 2021.12.20-26
- [升降息與量化寬鬆的差別](#升降息與量化寬鬆的差別)
- [附錄: 準備金市場](#附錄-準備金市場)
- [附錄: 為何 QE =\= 印錢?](#附錄-為何-qe--印錢)
- [近期 ON-RRP 增幅大,顯示資金過剩](#近期-on-rrp-增幅大顯示資金過剩)
- [周記: 禮拜一去三井住友面試](#周記-禮拜一去三井住友面試)
本周的 Grit Forex 想要統整我在網路上無意間看到的一本外匯書: John Jagerson 與 Wade Hansen 著的【外匯交易: 從入們到精通】的筆記。會想讀這本書是因為我習慣在看書前大概看一下作者的學經歷、擅長領域等,而作者之一的 Jagerson 曾任 Thinkorswim 的副總裁,我一開始把它跟 Thinkmarkets (我目前在用的外匯經紀商)搞混,因此可以說是誤打誤撞打開這本書 XD。以下幾點是我在書中學習到的新知以及相關延伸話題。
## 升降息與量化寬鬆的差別
中央銀行主要靠傳統的升息或降息或金融海嘯後新發明的量化寬鬆來干預貨幣供給量,那麼這兩者的差別是甚麼呢?
以降息為例,央行跟該國商業銀行簽定 **附買回協議 (Repurchase Agreement, Repo, RP, 正回購)**。_央行跟商業銀行暫時買進債券,並釋放貨幣到市場中,等到一段時間後 (通常時間極短,例如一天 (overnight),商業銀行必須跟央行買回債券,並支付貨幣給央行,而商業銀行支付的價格 (以買回債券)大於當初央行買進債券的價格。_
同樣的邏輯,**附賣回協議 (Reverse Repurchase Agreement, Reverse Repo, 逆回購)** 就是指 _商業銀行暫時跟央行買入債券,並付央行一筆錢,等到一段時間後,央行必須向商業銀行買回債券,並支付高於當初賣出的錢。_
注意到在整個過程中,央行並沒有真的買入或賣出金融資產 (債券等),實際上這只是一種借貸的過程。以降息來說,央行跟商業銀行買入債券,並付給銀行錢,意思就是**央行是債權人,商業銀行是債務人,債券是債務人的擔保品,確保商業銀行未來會還錢給債權人央行。央行藉由調整債務人應該還給債權人 (央行本人)的利率來操縱準備金市場的利率。**
至於量化寬鬆是央行真的在市場中購入債券 (收回債券),壓低市場利率。具體作法是**央行在準備金市場 (The Market for Bank Reserves)中右移供給,使準備金市場均衡價格 (每個國家名稱不同,如美國的均衡價格稱為 Fed Funds Rate、台灣稱為 Discount Rate)下跌,Fed Funds Rate 下跌後,銀行間借款的利率變低,借貸變得更容易,進而使活絡企業、民間經濟。**
### 附錄: 準備金市場
前文提到準備金市場這個重要概念,這裡就詳細介紹吧! 特別感謝政大金融系張興華老師的【貨幣理論與政策】一課,對央行有深入的講解,也讓我能夠產出下面的介紹,推推政大的好老師!
首先要知道的是準備金市場普通人是不能參與的,根據維基百科,必須要是存款機構 (depository institutions)或儲蓄互助社 (Credit Unions,如台灣的信用合作社,詳見 [此篇文章](https://www.learnenglishwithwill.com/credit-union-meaning-fuction-intro-chinese-translation/))才能參與準備金市場。
而要成為市場,就必須要有供給、需求、價格,及數量。**在準備金市場中,數量就是「準備金數量」、價格就是銀行間拆款利率。**如下圖所示,_需求線為負斜率_,因為當價格上漲 (拆款利率上升),代表 A 銀行借錢給 B 銀行可以獲得較高的利息,因此 A 銀行會想借出手中的準備金,導致手中的準備金數量下降 (所以需求線為銀行手中的超額準備金數量需求,Demand for excess reserves)。
然而,_供給線是垂直線_,因為市場中能有多少準備金流動,是由央行 (藉由 100% 控制貨幣基數)決定的,因此數量不隨價格而改變。
因此,最終均衡價格即為銀行間拆款利率 (在美國稱為 Federal Funds Rate)。

準備金市場 (資料來源: 政大金融系張興華老師上課投影片)
### 附錄: 為何 QE =\= 印錢?
這裡主要節錄 [Money and Macro 的這部影片](https://www.youtube.com/watch?v=ZbqtpKk6iC8),主要在探討為何 QE (大規模資產購買,Large-Scale Asset Purchases) 不等於印錢,以及**為何 QE 理論上可以達成在降低利率的同時不造成通膨加劇。**
```原因一 QE 是在準備金市場中執行```
QE 的目標資產是政府公債與企業債等,其作法是在準備金市場中買入國債,並釋出「資金」,但這裡的資金指的是【準備金】,也就是剛才上文提到的只有特定銀行機構才能參與的準備金市場的橫軸部分。在**央行釋出準備金後,垂直線的供給右移,使均衡價格 (拆款利率)下跌,代表銀行借款的利率降低,就有可能會給予民間消費者、企業 (會花真正的錢消費、投資,並造成通膨的人) 較低的利率,進而促進經濟,達到寬鬆市場的目標。**
因此,銀行是否真的願意用低利率幫助消費者是不確定的,央行只不過是用間接的方式 (讓銀行間借款的利率降低)鼓勵銀行這麼做。這也就說明,為何 QE 不會造成過大的通膨。
```原因二 在降低拆款利率的同時,債券利率下跌```
第二個 QE 不會造成通膨的原因是: 拆款利率下跌的同時,央行要也把債券買回來 (這樣資產負債表才能平衡,見下圖)。

量化寬鬆如何記錄在央行資產負債表
這代表 __央行必須真的到債券的二級市場買入債券,這會造成債券市價上漲,殖利率下跌。__
這也是為何有些人批評 QE 只會造成金融市場 (i.e., 非實體經濟) 欣欣向榮,但對實體經濟沒有幫助。
但事實上這樣的評論是不正確的,因為**殖利率的下跌會讓債券 issuer (也就是向別人借款的政府或企業)能夠少付利息,政府或企業便可以繼續積極籌錢購買商品或投資。**
總結來說,_QE 印的「錢」是準備金_,而銀行準備金增加給銀行誘因降低貸給消費者與企業的利率,但銀行是否真的會這麼做由他們決定。同時,QE 在釋放「錢」的同時,把流通在外的債券收回,導致債券價格上漲,殖利率下跌,但不會造成實體經濟的通貨膨脹。
## 近期 ON-RRP 增幅大,顯示資金過剩
上文說到很多理論面的知識,那麼實際情況如何? 下圖是今年二月以來聯準會 ON-RRP 的情況,可以看到從四月開始資金大幅流回聯準會,目前 ON-RRP 總量已達 1.5 兆美元。

資金大量流入ON-RRP(資料來源: [NY FED](https://www.newyorkfed.org/markets/data-hub))
這顯示美國的銀行手中握有太多準備金,不知道怎麼運用,只好將其存在 FED 的帳戶裡,甚至,歐洲交易員也選擇將資金放在 FED,賺取 ON-RRP 的利息 (見 [此篇報導](https://tw.stock.yahoo.com/news/%E5%9C%8B%E9%9A%9B%E9%87%91%E8%9E%8D-%E6%AD%90%E5%85%83%E5%8D%80%E7%8F%BE%E9%87%91%E9%81%8E%E5%89%A9-%E7%BE%8E%E5%85%83%E8%B5%B0%E5%BC%B7%E8%87%B3%E5%B0%91%E5%88%B0%E5%B9%B4%E5%BA%95-084308830.html))。
如果資金大量流入美元,會為美元指數帶來支撐,預期短期美元走多。
同時,過多資金流入 ON-RRP 代表在準備金市場上對準備金的需求減少 (因為手中有太多了,怎麼還會想要更多呢?),而造成供過於求的情況,因此價格 (拆款利率)下跌,目前聯邦資金利率為 0.08%,已相當接近 ON-RRP 的利率 0.05% 了。

資金流入 ON-RRP 可能導致 FED 提高 ON-RRP 利率
## 周記: 禮拜一去三井住友面試
有認真看周報的朋友應該知道,上禮拜我沒有發 Grit Forex…。因為上週/本周真的太多事情了,先是六日回高雄投公投,接著禮拜一去三井住友面試,禮拜五又有投資與資產組合的面試,真的太忙了,因此本周的 Grit Forex 加碼,一次補足兩周內容,字字充滿 hard core,我自己看應該也會想睡覺 XD
總之,本周很累很充實,雖然投資與資產組合報告做得很不好 (遇到雷組員真的氣死),但也很高興有機會 [上台用英語報告](https://www.youtube.com/watch?v=kZV8znkFHaI),雖然仍有許多不足以及要加強的地方,但我會繼續加油的。
然後,最高興的還是通過三井住友的面試,即將在企業金融部工讀!!! 期許自已努力學習,建立個人品牌與形象!!! 未來也會持續更新實習心得!!!

我們自己開發的策略與其他投資理論在 2022 年 DAX 指數市場的表現比較 ([更多資料](https://github.com/LLH07/InvestnPortfolio)) | 60.806818 | 319 | 0.812372 | yue_Hant | 0.899139 |
c03c9f3e32161e2d3e3d5443eb6e5d8c7cb82f43 | 33,889 | md | Markdown | articles/active-directory/fundamentals/scenario-azure-first-sap-identity-integration.md | changeworld/azure-docs.de-de | 26492264ace1ad4cfdf80e5234dfed9a106e8012 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-03-12T23:37:21.000Z | 2021-03-12T23:37:21.000Z | articles/active-directory/fundamentals/scenario-azure-first-sap-identity-integration.md | changeworld/azure-docs.de-de | 26492264ace1ad4cfdf80e5234dfed9a106e8012 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/active-directory/fundamentals/scenario-azure-first-sap-identity-integration.md | changeworld/azure-docs.de-de | 26492264ace1ad4cfdf80e5234dfed9a106e8012 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 'Szenario: Verwenden von Azure Active Directory zum Schützen des Zugriffs auf SAP-Plattformen und -Anwendungen'
description: Leitfaden für Architekten und IT-Administratoren zum Sichern des Zugriffs auf SAP-Plattformen und -Anwendungen
services: active-directory
author: xstof
manager: alberts
ms.service: active-directory
ms.workload: identity
ms.subservice: fundamentals
ms.topic: conceptual
ms.date: 08/26/2021
ms.author: christoc
ms.reviewer: ''
ms.custom: ''
ms.collection: ''
ms.openlocfilehash: 2c987a951cbaf3795757ab0b57e8dcea0eb47aee
ms.sourcegitcommit: 61f87d27e05547f3c22044c6aa42be8f23673256
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 11/09/2021
ms.locfileid: "132063962"
---
# <a name="scenario---using-azure-active-directory-to-secure-access-to-sap-platforms-and-applications"></a>Szenario: Verwenden von Azure Active Directory zum Schützen des Zugriffs auf SAP-Plattformen und -Anwendungen
Dieses Dokument enthält Ratschläge zum technischen Entwurf und zur Konfiguration von SAP-Plattformen und -Anwendungen, wenn Azure Active Directory als primärer Authentifizierungsdienst für Benutzer verwendet wird.
## <a name="terminology-used-in-this-guide"></a>Terminologie in dieser Anleitung
| Abkürzung | BESCHREIBUNG |
| --------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [BTP](https://www.sap.com/products/business-technology-platform.html) | SAP Business Technology Platform. Das gesamte Technologieangebot von SAP. Die meisten der hier beschriebenen SAP-Technologien sind Bestandteil von BTP. Die Produkte, die formell als SAP Cloud Platform bezeichnet werden, sind Bestandteil von SAP BTP. |
| [IAS](https://help.sap.com/viewer/6d6d63354d1242d185ab4830fc04feb1/Cloud/en-US) | SAP Cloud Identity Services – Identity Authentication Service. Der mehrinstanzenfähige cloudbasierte Identitätsanbieterdienst, der von SAP bereitgestellt wird. IAS unterstützt Benutzer bei der Authentifizierung bei ihren eigenen SAP-Dienstinstanzen. |
| [IDS](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/d6a8db70bdde459f92f2837349f95090.html) | SAP ID Service. Eine Instanz von IAS, die von SAP für die Authentifizierung von Kunden und Partnern bei von SAP bereitgestellten PaaS- und SaaS-Diensten verwendet wird. |
| [IPS](https://help.sap.com/viewer/f48e822d6d484fa5ade7dda78b64d9f5/Cloud/en-US/2d2685d469a54a56b886105a06ccdae6.html) | SAP Cloud Identity Services – Identity Provisioning Service. IPS unterstützt die Synchronisierung von Identitäten zwischen verschiedenen Speichern/Zielsystemen. |
| [XSUAA](https://blogs.sap.com/2019/01/07/uaa-xsuaa-platform-uaa-cfuaa-what-is-it-all-about/) | Extended Services for Cloud Foundry User Account and Authentication. XSUAA ist ein mehrinstanzenfähiger OAuth-Autorisierungsserver innerhalb von SAP BTP. |
| [CF](https://www.cloudfoundry.org/) | Cloud Foundry. Cloud Foundry ist die Umgebung, in der SAP sein Multi-Cloud-Angebot für BTP (AWS, Azure, GCP, Alibaba) erstellt hat. |
| [Fiori](https://www.sap.com/products/fiori/develop.html) | Die webbasierte Benutzeroberfläche von SAP (im Gegensatz zur desktopbasierten Benutzeroberfläche). |
## <a name="overview"></a>Überblick
Es gibt viele Dienste und Komponenten im SAP- und Microsoft-Technologiestapel, die in Szenarien für die Authentifizierung und Autorisierung von Benutzern eine Rolle spielen. Die wichtigsten Dienste sind im folgenden Diagramm aufgeführt.

Da es viele Variationen möglicher Szenarien gibt, die konfiguriert werden müssen, konzentrieren wir uns auf ein Szenario, in dem die Azure AD-Identität bevorzugt verwendet wird. Wir gehen von folgenden Annahmen aus:
- Sie möchten alle Ihre Identitäten zentral und nur über Azure AD steuern.
- Sie möchten den Wartungsaufwand so weit wie möglich reduzieren sowie Authentifizierung und App-Zugriff Microsoft- und SAP-übergreifend automatisieren.
- Die allgemeine Anleitung für Azure AD mit IAS gilt für Apps, die in BTP- und SAP SaaS-Apps bereitgestellt werden, die in IAS konfiguriert wurden. Darüber hinaus werden ggf. spezielle Empfehlungen für BTP (z. B. die Verwendung von Rollenzuordnungen über Azure AD Gruppen) und SAP SaaS-Apps (z. B. die Verwendung des Identitätsbereitstellungsdiensts für die rollenbasierte Autorisierung) bereitgestellt.
- Außerdem wird davon ausgegangen, dass Benutzer bereits in Azure AD und für alle SAP-Systeme bereitgestellt wurden, die voraussetzen, dass Benutzer bereitgestellt werden, damit sie funktionieren. Dies gilt unabhängig davon, wie die Bereitstellung durchgeführt wurde: manuell, von einer lokalen Active Directory-Instanz über Azure AD Connect oder über Personalverwaltungssysteme wie SAP SuccessFactors. In diesem Dokument wird SuccessFactors daher wie jede andere Anwendung angesehen, bei der sich (vorhandene) Benutzer anmelden. Die eigentliche Bereitstellung von Benutzern aus SuccessFactors in Azure AD wird nicht behandelt.
Basierend auf diesen Annahmen konzentrieren wir uns hauptsächlich auf die Produkte und Dienste, die im folgenden Diagramm dargestellt sind. Dabei handelt es sich um die verschiedenen Komponenten, die für die Authentifizierung und Autorisierung in einer cloudbasierten Umgebung besonders relevant sind.

## <a name="recommendations"></a>Empfehlungen
### <a name="summary"></a>Zusammenfassung
- [1. Verwendung der Verbundauthentifizierung in SAP Business Technology Platform- und SAP SaaS-Anwendungen über SAP Identity Authentication Service](#1---use-federated-authentication-in-sap-business-technology-platform-and-sap-saas-applications-through-sap-identity-authentication-service)
- [2. Verwendung von Azure AD für die Authentifizierung und von IAS/BTP für die Autorisierung](#2---use-azure-ad-for-authentication-and-iasbtp-for-authorization)
- [3. Verwendung von Azure AD-Gruppen für die Autorisierung über Rollensammlungen in IAS/BTP](#3---use-azure-ad-groups-for-authorization-through-role-collections-in-iasbtp)
- [4. Verwendung lediglich eines einzelnen BTP-Unterkontos für Anwendungen mit ähnlichen Identitätsanforderungen](#4---use-a-single-btp-subaccount-only-for-applications-that-have-similar-identity-requirements)
- [5. Verwendung des IAS-Mandanten der Produktionsumgebung für die Authentifizierung und Autorisierung aller Endbenutzer](#5---use-the-production-ias-tenant-for-all-end-user-authentication-and-authorization)
- [6. Definition eines Rolloverprozesses für SAML-Signaturzertifikate](#6---define-a-process-for-rollover-of-saml-signing-certificates)
### <a name="1---use-federated-authentication-in-sap-business-technology-platform-and-sap-saas-applications-through-sap-identity-authentication-service"></a>1\. Verwendung der Verbundauthentifizierung in SAP Business Technology Platform- und SAP SaaS-Anwendungen über SAP Identity Authentication Service
#### <a name="context"></a>Kontext
Ihre Anwendungen in BTP können Identitätsanbieter über [Vertrauensstellungskonfigurationen](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/cb1bc8f1bd5c482e891063960d7acd78.html) verwenden, um Benutzer über das SAML 2.0-Protokoll zwischen BTP/XSUAA und dem Identitätsanbieter zu authentifizieren. Beachten Sie, dass nur SAML 2.0 unterstützt wird, obwohl zwischen der Anwendung selbst und BTP/XSUAA das OpenID Connect-Protokoll verwendet wird (in diesem Kontext nicht relevant).
In BTP können Sie eine Vertrauensstellungskonfiguration für SAP ID Service (Standardeinstellung) einrichten. Wenn als autoritatives Benutzerverzeichnis jedoch Azure AD verwendet wird, können Sie einen **Verbund** einrichten, damit sich Benutzer mit ihren vorhandenen Azure AD-Konten anmelden können.
Zusätzlich zum Verbund können Sie optional auch die **Benutzerbereitstellung** so einrichten, dass Azure AD Benutzer vorab in BTP bereitgestellt werden. Es gibt dafür jedoch keine native Unterstützung (nur für Azure AD -> SAP Identity Authentication Service). Eine integrierte Lösung mit nativer Unterstützung wäre der BTP Identity Provisioning Service. Die Vorabbereitstellung von Benutzerkonten kann für Autorisierungszwecke nützlich sein (z. B. zum Hinzufügen von Benutzern zu Rollen). Abhängig von den jeweiligen Anforderungen können Sie dies jedoch auch mit Azure AD-Gruppen erreichen (siehe unten), sodass Sie u. U. überhaupt keine Benutzerbereitstellung benötigen.
Beim Einrichten der Verbundbeziehung gibt es mehrere Optionen:
- Sie können einen Verbund mit Azure AD direkt über BTP/XSUAA erstellen.
- Sie können einen Verbund mit IAS erstellen, der wiederum als Verbund mit Azure AD als Unternehmensidentitätsanbieter eingerichtet ist (auch als „SAML-Proxying“ bezeichnet).
Für SAP SaaS-Anwendungen wird IAS bereitgestellt und vorkonfiguriert, um das Onboarding von Endbenutzern zu vereinfachen. (Beispiele hierfür sind SuccessFactors, Marketing Cloud, Cloud4Customer, Sales Cloud und andere.) Dieses Szenario ist weniger komplex, da IAS direkt mit der Ziel-App verbunden und nicht über einen Proxy mit XSUAA verbunden ist. In jedem Fall gelten für diese Einrichtung die gleichen Regeln wie für Azure AD mit IAS im Allgemeinen.
#### <a name="what-are-we-recommending"></a>Unsere Empfehlung
Wenn Sie Azure AD als autoritatives Benutzerverzeichnis verwenden, wird empfohlen, in BTP eine Vertrauensstellungskonfiguration zu IAS einzurichten. IAS wiederum wird als Verbund mit Azure AD als Unternehmensidentitätsanbieter eingerichtet.

Für die Vertrauensstellungskonfiguration in BTP wird empfohlen, die Erstellung von Schattenbenutzern bei der Anmeldung zu aktivieren. Auf diese Weise erhalten Benutzer, die noch nicht in BTP erstellt wurden, automatisch ein Konto, wenn sie sich zum ersten Mal über IAS/Azure AD anmelden. Wenn diese Einstellung deaktiviert wird, dürfen sich nur vorab bereitgestellte Benutzer anmelden.
#### <a name="why-this-recommendation"></a>Gründe für diese Empfehlung
Bei Verwendung eines Verbunds können Sie die Vertrauensstellungskonfiguration auf der Ebene von BTP-Unterkonten definieren. In diesem Fall müssen Sie die Konfiguration für jedes andere verwendete Unterkonto wiederholen. Durch die Verwendung von IAS als zwischengelagerte Vertrauensstellungskonfiguration profitieren Sie von der zentralisierten unterkontoübergreifenden Konfiguration. Außerdem können Sie IAS-Features wie [risikobasierte Authentifizierung](https://help.sap.com/viewer/6d6d63354d1242d185ab4830fc04feb1/Cloud/en-US/bc52fbf3d59447bbb6aa22f80d8b6056.html) und zentrale [Anreicherung von Assertionsattributen](https://help.sap.com/viewer/6d6d63354d1242d185ab4830fc04feb1/Cloud/en-US/7124201682434efb946e1046fde06afe.html) verwenden. Um die Benutzerfreundlichkeit zu gewährleisten, sollten diese erweiterten Sicherheitsfeatures nur an einer einzigen Stelle erzwungen werden. Dies kann entweder IAS sein oder bei Beibehaltung von Azure AD als einzelner autoritativer Benutzerspeicher (wie in diesem Whitepaper vorausgesetzt) zentral über die [bedingte Zugriffsverwaltung](../conditional-access/overview.md) von Azure AD bewerkstelligt werden.
Beachten Sie, dass bei Verwendung von IAS jedes Unterkonto als „Anwendung“ gilt, obwohl im betreffenden Unterkonto eine oder mehrere Anwendungen bereitgestellt werden können. In IAS kann jede derartige Anwendung für den Verbund mit demselben Unternehmensidentitätsanbieter eingerichtet werden (in diesem Fall Azure AD).
#### <a name="summary-of-implementation"></a>Zusammenfassung der Implementierung
In Azure AD:
- [Konfigurieren Sie Azure AD die nahtlose einmalige Anmeldung](../hybrid/how-to-connect-sso.md) (nahtlose SSO), bei der Benutzer automatisch auf ihren Unternehmensgeräten angemeldet werden, die mit dem Unternehmensnetzwerk verbunden sind. Wenn diese Funktion aktiviert ist, müssen Benutzer zur Anmeldung bei Azure AD nicht ihr Kennwort und in der Regel nicht einmal ihren Benutzernamen eingeben.
In Azure AD und IAS:
- Befolgen Sie die entsprechende Dokumentation, um Azure AD im Verbundmodus (Proxy) mit IAS zu verbinden [(SAP-Dokumentation](https://developers.sap.com/tutorials/cp-ias-azure-ad.html), [Microsoft-Dokumentation](../saas-apps/sap-hana-cloud-platform-identity-authentication-tutorial.md)). Achten Sie auf die Einstellung `NameID` in Ihrer SSO-Konfiguration in Azure AD, da UPNs nicht unbedingt E-Mail-Adressen sind.
- Konfigurieren Sie die gebündelte Anwendung für die Verwendung von Azure AD. Rufen Sie dazu die Seite [Bedingte Authentifizierung](https://help.sap.com/viewer/6d6d63354d1242d185ab4830fc04feb1/Cloud/en-US/0143dce88a604533ab5ab17e639fec09.html) auf, und legen Sie den Standardauthentifizierungsidentitätsanbieter auf den Unternehmensidentitätsanbieter fest, der Ihr Azure AD-Verzeichnis darstellt.
In BTP:
- Richten Sie eine Vertrauensstellungskonfiguration zu IAS ein [(SAP-Dokumentation)](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/7c6aa87459764b179aeccadccd4f91f3.html#loio7c6aa87459764b179aeccadccd4f91f3), und vergewissern Sie sich, dass „[Available for User Logon](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/LATEST/en-US/affb201b1a36497996c2144c28683aed.html)“ (Für Benutzeranmeldung verfügbar) und „Create Shadow Users During Logon“ (Schattenbenutzer bei der Anmeldung erstellen) aktiviert sind.
- Deaktivieren Sie ggf. „Available for User Logon“ in der Standardkonfiguration für die SAP ID Service-Vertrauensstellung, sodass sich Benutzer immer über Azure AD authentifizieren und kein Bildschirm angezeigt wird, in dem sie den Identitätsanbieter auswählen können.
### <a name="2---use-azure-ad-for-authentication-and-iasbtp-for-authorization"></a>2\. Verwendung von Azure AD für die Authentifizierung und von IAS/BTP für die Autorisierung
#### <a name="context"></a>Kontext
Wenn BTP und IAS für die **Benutzerauthentifizierung** über einen Verbund mit Azure AD konfiguriert wurden, gibt es mehrere Konfigurationsoptionen für die **Autorisierung**:
- In Azure AD können Sie der Unternehmensanwendung, die Ihre SAP IAS-Instanz in Azure AD darstellt, Azure AD-Benutzer und -Gruppen zuweisen.
- In IAS können Sie die risikobasierte Authentifizierung verwenden, um Anmeldungen zuzulassen oder zu sperren und so den Zugriff auf die Anwendung in BTP zu verhindern.
- In BTP können Sie über Rollensammlungen definieren, welche Benutzer und Gruppen auf die Anwendung zugreifen und bestimmte Rollen erhalten können.
#### <a name="what-are-we-recommending"></a>Unsere Empfehlung
Es wird empfohlen, die Autorisierung nicht direkt in Azure AD selbst durchzuführen und in Azure AD „[Benutzerzuweisung erforderlich](../manage-apps/assign-user-or-group-access-portal.md)“ für die Unternehmensanwendung explizit zu deaktivieren. Beachten Sie, dass diese Einstellung für SAML-Anwendungen standardmäßig aktiviert ist, sodass Sie diese Einstellung explizit deaktivieren müssen.
#### <a name="why-this-recommendation"></a>Gründe für diese Empfehlung
Bei Erstellung des Anwendungsverbunds über IAS wird der Benutzer aus der Sicht von Azure AD im Rahmen des Anmeldeflows eigentlich bei IAS authentifiziert. Dies bedeutet, dass Azure AD keine Informationen darüber hat, bei welcher BTP-Anwendung sich der Benutzer letztendlich anmelden möchte. Dies bedeutet auch, dass die Autorisierung in Azure AD nur für eine sehr grobe Autorisierung verwendet werden kann, z. B. dass sich der Benutzer bei einer *beliebigen* Anwendung in BTP oder bei *keiner* anmelden kann. Dies betont auch die SAP-Strategie, Apps und Authentifizierungsmechanismen auf der Ebene von BTP-Unterkonten zu isolieren.
Obwohl dies ein berechtigter Grund für die Verwendung von „Benutzerzuweisung erforderlich“ sein kann, bedeutet dies, dass es jetzt möglicherweise zwei verschiedene Stellen gibt, an denen Autorisierungsinformationen verwaltet werden müssen: sowohl in Azure AD für die Unternehmensanwendung (wo sie für *alle* BTP-Anwendungen gelten) als auch in jedem BTP-Unterkonto. Dies kann zu Verwirrung und Fehlkonfigurationen führen, wenn Autorisierungseinstellungen an einer Stelle, aber nicht an der anderen Stelle aktualisiert werden. Wenn beispielsweise ein Benutzer in BTP zugelassen, aber in Azure AD nicht der Anwendung zugewiesen wurde, schlägt die Authentifizierung fehl.
#### <a name="summary-of-implementation"></a>Zusammenfassung der Implementierung
Deaktivieren Sie für die Azure AD-Unternehmensanwendung, die die Verbundbeziehung mit IAS darstellt, „[Benutzerzuweisung erforderlich](../manage-apps/assign-user-or-group-access-portal.md)“. Dies bedeutet auch, dass Sie die Zuweisung von Benutzern problemlos überspringen können ([siehe Beschreibung in der Microsoft-Dokumentation](../saas-apps/sap-hana-cloud-platform-identity-authentication-tutorial.md#assign-the-azure-ad-test-user)).
### <a name="3---use-azure-ad-groups-for-authorization-through-role-collections-in-iasbtp"></a>3\. Verwendung von Azure AD-Gruppen für die Autorisierung über Rollensammlungen in IAS/BTP
#### <a name="context"></a>Kontext
Wenn Sie die Autorisierung für Ihre BTP-Anwendungen konfigurieren möchten, gibt es mehrere Optionen:
- Sie können eine differenzierte Zugriffssteuerung in der Anwendung selbst abhängig vom jeweils angemeldeten Benutzer konfigurieren.
- Sie können den Zugriff über Rollen und Rollensammlungen in BTP basierend auf Benutzerzuweisungen oder Gruppenzuweisungen angeben.
Die endgültige Implementierung kann eine Kombination beider Strategien verwenden. Für die Zuweisung über Rollensammlungen kann dies benutzerspezifisch erfolgen, oder es können Gruppen des konfigurierten Identitätsanbieters verwendet werden.
#### <a name="what-are-we-recommending"></a>Unsere Empfehlung
Wenn Sie Azure AD als autoritative Quelle für eine differenzierte Autorisierung verwenden möchten, empfiehlt es sich, Azure AD-Gruppen zu verwenden und diese Rollensammlungen in BTP zuzuweisen. Um Benutzer Zugriff auf bestimmte Anwendungen zu gewähren, müssen die Benutzer lediglich zu den relevanten Azure AD-Gruppen hinzugefügt werden, ohne dass eine weitere Konfiguration in IAS/BTP erforderlich ist.
Bei dieser Konfiguration wird empfohlen, die Gruppen-ID (Objekt-ID) der Azure AD-Gruppe und nicht den Anzeigenamen („sAMAccountName“) als eindeutigen Bezeichner der Gruppe zu verwenden. Dies bedeutet, dass Sie die Gruppen-ID als „Gruppen“-Assertion im SAML-Token verwenden müssen, das von Azure AD ausgestellt wird. Darüber hinaus wird die Gruppen-ID für die Zuweisung zur Rollensammlung in BTP verwendet.

#### <a name="why-this-recommendation"></a>Gründe für diese Empfehlung
Wenn Sie *Benutzer* direkt Rollensammlungen in BTP zuweisen, werden Autorisierungsentscheidungen nicht in Azure AD zentralisiert. Dies bedeutet auch, dass der Benutzer bereits in IAS vorhanden sein muss, damit er einer Rollensammlung in BTP zugewiesen werden kann. Da wir einen Verbund anstelle der Benutzerbereitstellung empfehlen, bedeutet dies, dass das Schattenkonto des Benutzers zum gewünschten Zeitpunkt der Benutzerzuweisung u. U. noch nicht in IAS vorhanden ist. Wenn Sie Azure AD-Gruppen verwenden und diese Gruppen Rollensammlungen zuweisen, entfallen diese Probleme.
Das Zuweisen von Gruppen zu Rollensammlungen scheint der vorherigen Empfehlung zu widersprechen, Azure AD nicht für die *Autorisierung* zu verwenden. Aber sogar in diesem Fall fällt die Autorisierungsentscheidung weiterhin in BTP. Die Entscheidung basiert jetzt allerdings nur auf der Gruppenmitgliedschaft, die in Azure AD verwaltet wird.
Es wird empfohlen, die Gruppen-ID der Azure AD-Gruppe und nicht den Namen der Gruppe zu verwenden, da die Gruppen-ID global eindeutig und unveränderlich ist und später nie für eine andere Gruppe wiederverwendet werden kann. Demgegenüber kann die Verwendung des Gruppennamens zu Problemen führen, wenn der Name geändert wird. Außerdem besteht ein Sicherheitsrisiko darin, dass eine Gruppe gelöscht und eine andere Gruppe mit dem gleichen Namen erstellt wird, die aber Benutzer enthält, die keinen Zugriff auf die Anwendung haben sollten.
#### <a name="summary-of-implementation"></a>Zusammenfassung der Implementierung
In Azure AD:
- Erstellen Sie Gruppen, zu denen die Benutzer hinzugefügt werden können, die Zugriff auf Anwendungen in BTP benötigen. (Erstellen Sie z. B. eine Azure AD-Gruppe für jede Rollensammlung in BTP.)
- Konfigurieren Sie für die Azure AD-Unternehmensanwendung, die die Verbundbeziehung mit IAS darstellt, die SAML-Benutzerattribute und -Ansprüche, um [einen Gruppenanspruch für Sicherheitsgruppen hinzuzufügen](../hybrid/how-to-connect-fed-group-claims.md#add-group-claims-to-tokens-for-saml-applications-using-sso-configuration):
- Legen Sie das Quellattribut auf „Gruppen-ID“ und den Namen auf `Groups` fest (exakt wie hier mit dem Großbuchstaben „G“ geschrieben).
- Außerdem wird dringend empfohlen, die in den Ansprüchen zurückgegebenen Gruppen auf die Gruppen zu beschränken, die explizit zugewiesen wurden, um die Anspruchsnutzdaten klein zu halten und zu vermeiden, dass Azure AD die Anzahl von Gruppenansprüchen in SAML-Assertionen auf 150 beschränkt:
- Wählen Sie unter „Welche dem Benutzer zugeordneten Gruppen sollen im Anspruch zurückgegeben werden?“ die Option „Der Anwendung zugewiesene Gruppen“ aus. Weisen Sie sie dann die Gruppen, die Sie als Ansprüche einbeziehen möchten, der Unternehmensanwendung zu, indem Sie im Abschnitt „Benutzer und Gruppen“ die Option „Benutzer/Gruppe hinzufügen“ auswählen.

In IAS:
- Achten Sie in der Konfiguration des Unternehmensidentitätsanbieters unter den Optionen für den Identitätsverbund darauf, dass „[Use Identity Authentication user store](https://help.sap.com/viewer/6d6d63354d1242d185ab4830fc04feb1/LATEST/en-US/c029bbbaefbf4350af15115396ba14e2.html)“ (Identity Authentication-Benutzerspeicher verwenden) deaktiviert ist. Andernfalls gehen die Gruppeninformationen von Azure AD im SAML-Token für BTP verloren, sodass die Autorisierung fehlschlägt.
In BTP:
- Führen Sie für die Rollensammlungen, die von den Anwendungen im betreffenden Unterkonto verwendet werden, eine [Zuordnung der Rollensammlungen zu Benutzergruppen](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/51acfc82c0c54db59de0a528f343902c.html) durch, indem Sie eine Konfiguration für den IAS-Identitätsanbieter hinzufügen und den Namen auf die Gruppen-ID (Objekt-ID) der Azure AD-Gruppe festlegen.
### <a name="4---use-a-single-btp-subaccount-only-for-applications-that-have-similar-identity-requirements"></a>4\. Verwendung lediglich eines einzelnen BTP-Unterkontos für Anwendungen mit ähnlichen Identitätsanforderungen
#### <a name="context"></a>Kontext
In BTP kann jedes Unterkonto mehrere Anwendungen enthalten. Aus Sicht von IAS handelt es sich bei einer „gebündelten Anwendung“ jedoch um ein vollständiges BTP-Unterkonto, und nicht um die darin differenzierten Anwendungen. Dies bedeutet, dass alle Vertrauenseinstellungen, Authentifizierung und Zugriffskonfiguration sowie Branding- und Layoutoptionen in IAS für alle Anwendungen innerhalb dieses Unterkontos gelten. Ebenso gelten alle Vertrauensstellungskonfigurationen und Rollensammlungen in BTP auch für alle Anwendungen im betreffenden Unterkonto.
#### <a name="what-are-we-recommending"></a>Unsere Empfehlung
Es wird empfohlen, mehrere Anwendungen nur dann in einem einzelnen BTP-Unterkonto zu gruppieren, wenn sie ähnliche Anforderungen auf Identitätsebene haben (Benutzer, Gruppen, Identitätsanbieter, Rollen, Vertrauensstellungskonfiguration, Branding usw.).
#### <a name="why-this-recommendation"></a>Gründe für diese Empfehlung
Wenn Sie mehrere Anwendungen mit sehr unterschiedlichen Identitätsanforderungen in einem einzelnen Unterkonto in BTP gruppieren, kann eine Konfiguration entstehen, die unsicher ist oder leichter falsch konfiguriert werden kann. Wenn beispielsweise für eine freigegebene Ressource, z. B. einen Identitätsanbieter, für eine einzelne Anwendung in BTP eine Konfigurationsänderung vorgenommen wird, wirkt sich dies auf alle Anwendungen aus, die auf dieser freigegebenen Ressource basieren.
#### <a name="summary-of-implementation"></a>Zusammenfassung der Implementierung
Überlegen Sie sorgfältig, wie Sie mehrere Anwendungen über Unterkonten in BTP gruppieren möchten. Weitere Informationen finden Sie in der [Dokumentation zum SAP-Kontenmodell](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/8ed4a705efa0431b910056c0acdbf377.html).
### <a name="5---use-the-production-ias-tenant-for-all-end-user-authentication-and-authorization"></a>5\. Verwendung des IAS-Mandanten der Produktionsumgebung für die Authentifizierung und Autorisierung aller Endbenutzer
#### <a name="context"></a>Kontext
Wenn Sie IAS verwenden, verfügen Sie in der Regel über einen Mandanten für die Produktionsumgebung und einen Mandanten für die Entwicklungs-/Testumgebung. Für die verschiedenen Unterkonten oder Anwendungen in BTP können Sie auswählen, welcher Identitätsanbieter (IAS-Mandant) verwendet werden soll.
#### <a name="what-are-we-recommending"></a>Unsere Empfehlung
Es wird empfohlen, für alle Interaktionen mit Endbenutzern immer den IAS-Mandanten für die Produktionsumgebung zu verwenden, auch im Kontext einer Entwicklungs-/Testversion oder -umgebung der *Anwendung*, bei der sich die Endbenutzer anmelden müssen.
Es wird empfohlen, andere IAS-Mandanten nur zum Testen der identitätsbezogenen Konfiguration zu verwenden, die isoliert vom Produktionsmandanten erfolgen muss.
#### <a name="why-this-recommendation"></a>Gründe für diese Empfehlung
Da IAS die zentrale Komponente ist, die für den Verbund mit Azure AD eingerichtet wurde, gibt es nur eine einzige Stelle, an der die Verbund- und Identitätskonfiguration eingerichtet und verwaltet werden muss. Eine Duplizierung in andere IAS-Mandanten kann zu Fehlkonfigurationen oder Inkonsistenzen zwischen Umgebungen führen, wenn es um den Endbenutzerzugriff geht.
### <a name="6---define-a-process-for-rollover-of-saml-signing-certificates"></a>6\. Definition eines Rolloverprozesses für SAML-Signaturzertifikate
#### <a name="context"></a>Kontext
Beim Konfigurieren des Verbunds zwischen Azure AD und IAS sowie zwischen IAS und BTP werden SAML-Metadaten ausgetauscht, die X.509-Zertifikate enthalten. Diese Zertifikate werden für die Verschlüsselung und kryptografische Signaturen der SAML-Token verwendet, die zwischen beiden Parteien gesendet werden. Die Zertifikate verfügen über Ablaufdaten und müssen regelmäßig aktualisiert werden (auch in Notfallsituationen, wenn z. B. ein Zertifikat kompromittiert wurde).
Beachten Sie, dass die Standardgültigkeitsdauer des anfänglichen Azure AD-Zertifikats, das zum Signieren von SAML-Assertionen verwendet wird, 3 Jahre beträgt. (Beachten Sie weiterhin, dass das Zertifikat im Gegensatz zu OpenID Connect- und OAuth 2.0-Token, die von einem globalen Zertifikat in Azure AD signiert werden, speziell für die Unternehmensanwendung gilt.) Sie können [ein neues Zertifikat mit einem anderen Ablaufdatum generieren](../manage-apps/manage-certificates-for-federated-single-sign-on.md#customize-the-expiration-date-for-your-federation-certificate-and-roll-it-over-to-a-new-certificate) oder ein eigenes Zertifikat erstellen und importieren.
Wenn Zertifikate ablaufen, können sie nicht mehr verwendet werden, sodass neue Zertifikate konfiguriert werden müssen. Daher muss ein Prozess eingerichtet werden, um die Zertifikatkonfiguration in der vertrauenden Seite (die die Signaturen überprüfen muss) mit den tatsächlichen Zertifikaten, die zum Signieren der SAML-Token verwendet werden, auf dem aktuellen Stand zu halten.
In einigen Fällen kann die vertrauende Seite dies automatisch erledigen, indem die Konfiguration über einen Metadatenendpunkt bereitgestellt wird, der die aktuellen Metadateninformationen dynamisch zurückgibt. Dafür wird in der Regel eine öffentlich zugängliche URL verwendet, von der die vertrauende Seite die Metadaten in regelmäßigen Abständen abrufen und ihren internen Konfigurationsspeicher aktualisieren kann.
Allerdings lässt IAS nur die Einrichtung von Unternehmensidentitätsanbietern durch einen Import der XML-Metadatendatei zu. Die Bereitstellung eines Metadatenendpunkts für den dynamischen Abruf der Azure AD-Metadaten (z. B. `https://login.microsoftonline.com/my-azuread-tenant/federationmetadata/2007-06/federationmetadata.xml?appid=my-app-id`) wird nicht unterstützt. Ebenso lässt BTP keine Einrichtung einer neuen Vertrauensstellungskonfiguration über den IAS-Metadatenendpunkt (z. B. `https://my-ias-tenant.accounts.ondemand.com/saml2/metadata`) zu. Es ist ebenfalls ein einmaliger Upload einer XML-Metadatendatei erforderlich.
#### <a name="what-are-we-recommending"></a>Unsere Empfehlung
Stellen Sie beim Einrichten des Identitätsverbunds zwischen zwei Systemen (z. B. Azure AD und IAS sowie IAS und BTP) sicher, dass Sie das Ablaufdatum der verwendeten Zertifikate erfassen. Stellen Sie sicher, dass diese Zertifikate rechtzeitig ersetzt werden können und dass ein dokumentierter Prozess zum Aktualisieren der neuen Metadaten in allen vertrauenden Seiten vorhanden ist, die von diesen Zertifikaten abhängig sind.
Wie bereits erwähnt, wird die Einrichtung einer Vertrauensstellungskonfiguration in BTP für IAS empfohlen, die wiederum für den Verbund mit Azure AD als Unternehmensidentitätsanbieter eingerichtet ist. In diesem Fall sind die folgenden Zertifikate (die für die SAML-Signatur und Verschlüsselung verwendet werden) wichtig:
- Das Unterkontozertifikat in BTP: Wenn sich dieses Zertifikat ändert, muss die SAML 2.0-Konfiguration der Anwendung in IAS aktualisiert werden.
- Das Mandantenzertifikat in IAS: Wenn sich dieses Zertifikat ändert, müssen sowohl die SAML 2.0-Konfiguration der Unternehmensanwendung in Azure AD als auch die Vertrauensstellungskonfiguration in BTP aktualisiert werden.
- Das Unternehmensanwendungszertifikat in Azure AD: Wenn sich dieses Zertifikat ändert, muss die SAML 2.0-Konfiguration des Unternehmensidentitätsanbieters in IAS aktualisiert werden.

SAP stellt [hier](https://blogs.sap.com/2017/12/06/sap-cloud-platform-integration-automated-notification-of-keystore-entries-reaching-expiry/) und [hier](https://blogs.sap.com/2019/03/01/sap-cloud-platform-integration-automated-notification-for-client-certificates-reaching-expiry/) Beispielimplementierungen für Clientzertifikatbenachrichtigungen mit SAP Cloud Platform Integration bereit. Diese können mit Azure Integration Services oder PowerAutomate angepasst werden. Sie müssen jedoch für die Verwendung von Serverzertifikaten angepasst werden. Dafür ist eine benutzerdefinierte Implementierung erforderlich.
#### <a name="why-this-recommendation"></a>Gründe für diese Empfehlung
Wenn die Zertifikate ablaufen dürfen oder rechtzeitig ersetzt werden, aber die von den Zertifikaten abhängigen vertrauenden Seiten nicht mit den neuen Zertifikatinformationen aktualisiert werden, können sich Benutzer nicht mehr über den Verbund bei Anwendungen anmelden. Dies kann zu erheblichen Ausfallzeiten für alle Benutzer führen, während der Dienst durch Neukonfiguration der Metadaten wiederhergestellt wird.
#### <a name="summary-of-implementation"></a>Zusammenfassung der Implementierung
[Fügen Sie eine E-Mail-Benachrichtigungsadresse für den Zertifikatablauf](../manage-apps/manage-certificates-for-federated-single-sign-on.md#add-email-notification-addresses-for-certificate-expiration) in Azure AD hinzu, und legen Sie sie auf ein Gruppenpostfach fest, damit die Benachrichtigung nicht an eine einzelne Person gesendet wird (die beim Ablauf des Zertifikats möglicherweise sogar kein Konto mehr hat). Standardmäßig erhält nur der Benutzer, der die Unternehmensanwendung erstellt hat, eine Benachrichtigung.
Erwägen Sie eine Automatisierung, um den gesamten Rolloverprozess für Zertifikat auszuführen. Sie können z. B. regelmäßig überprüfen, ob Zertifikate ablaufen, und diese ersetzen, während alle vertrauenden Seiten mit den neuen Metadaten aktualisiert werden.
| 132.378906 | 1,151 | 0.792794 | deu_Latn | 0.996402 |
c03e1c5ff7e5645a4599b9b8c531733015f6bbdd | 1,536 | md | Markdown | README.md | preservica/Universal-Access-CSS | 1e20e68d3e0845edcd16710b18109c09c7c5004c | [
"MIT"
] | null | null | null | README.md | preservica/Universal-Access-CSS | 1e20e68d3e0845edcd16710b18109c09c7c5004c | [
"MIT"
] | null | null | null | README.md | preservica/Universal-Access-CSS | 1e20e68d3e0845edcd16710b18109c09c7c5004c | [
"MIT"
] | 1 | 2019-08-30T20:18:09.000Z | 2019-08-30T20:18:09.000Z | # universal-access-css
Examples of Cascading Style Sheets (.css) files that can be used to change the look and feel of the Preservica Universal Access WordPress site.
## Getting started
The Base Preservica theme folder contains the four main CSS files that control the look and feel of the Preservica default theme.
* style.css is the main stylesheet that is loaded by WordPress
* base.css
* layout.css
* skeleton.css
These CSS files can be used to understand how the stylesheets control the theme and to identify which element needs to be changed to effect a specific change in the theme.
## Contributing
Please read [CONTRIBUTING.md](CONTRIBUTING.md) for details on our code of conduct, and the process for submitting pull requests to us.
Contributed CSS files should contain comments explaining which aspects of the WordPress site are changed by deploying that specific CSS file.
## Deployment
Simply download the css file and upload it on your Preservica Universal Access site via the WordPress Dashboard (Appearance > Customize > CSS > Select file).
Once you have uploaded the css file, you will be able to preview the effect of the new stylesheet before publishing it.
## License
This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details
## Support
Support for this project is provided by its community of contributors. Preservica does not support any of the code included in this project. Users who choose to use any code included in this project do so at their own risk.
| 45.176471 | 213 | 0.791016 | eng_Latn | 0.999422 |
c03ee18ccaf68553ff12cf70fbc4aa7c20f31b13 | 1,407 | md | Markdown | glosalist/601_raw.md | fiasinstitute/glosa | 541c7b892226d21043f06f86322f7ec52ee294d1 | [
"MIT"
] | 3 | 2020-10-27T22:49:36.000Z | 2022-02-20T17:15:55.000Z | glosalist/601_raw.md | fiasinstitute/glosa | 541c7b892226d21043f06f86322f7ec52ee294d1 | [
"MIT"
] | null | null | null | glosalist/601_raw.md | fiasinstitute/glosa | 541c7b892226d21043f06f86322f7ec52ee294d1 | [
"MIT"
] | null | null | null | ---
authorName: [email protected]
canDelete: false
contentTrasformed: false
from: sydpidd@...
headers.inReplyToHeader: .nan
headers.messageIdInHeader: PDFhNC40MTc3MWYyZC4zMDgxMWMwNEBhb2wuY29tPg==
headers.referencesHeader: .nan
layout: email
msgId: 601
msgSnippet: dear marcel i am looking at esperanto at the moment and, as several contributers
to glosalist would like to use it, i think it could useful to glosa if
nextInTime: 602
nextInTopic: 0
numMessagesInTopic: 1
postDate: '1129300484'
prevInTime: 600
prevInTopic: 0
profile: sydpidd1926
replyTo: LIST
senderId: 1egLhSKJrz5ny4csrYCiqC8CUyOu0T9a2UZD-LjSkzBiFd2QhcoKdcMxpZe2aK06ICLbNOkv
spamInfo.isSpam: false
spamInfo.reason: '12'
systemMessage: false
title: 'Re: [glosalist] esperanto'
topicId: 601
userId: 137587403
---
dear marcel
i am looking at esperanto at the moment and, as several contributers to
glosalist would like to use it, i think it could useful to glosa if glosalist asked
for contributions in glosa and about glosa in english as before but in
addition about g in easp
so far i much prefer g even though i do a fair amount of grumbling. however,
i do not see any rivallry or competition between the two languages g seems to
be parallel with western europe and esp with eastern.
if the two languages are good ials, it should be easy to use both
comparisons to follow
syd
[Non-text portions of this message have been removed]
| 30.586957 | 92 | 0.803127 | eng_Latn | 0.979637 |
c03eed2fcd50a282021a4eec7ae980cfde171b5a | 361 | md | Markdown | pages.pt_BR/common/fast.md | derNiklaas/tldr | 36648e10311955754f99c02805b3f66d5000a5e5 | [
"CC-BY-4.0"
] | 38,585 | 2015-01-03T03:23:18.000Z | 2022-03-31T17:46:27.000Z | pages.pt_BR/common/fast.md | derNiklaas/tldr | 36648e10311955754f99c02805b3f66d5000a5e5 | [
"CC-BY-4.0"
] | 5,517 | 2015-01-05T08:36:28.000Z | 2022-03-31T19:03:46.000Z | pages.pt_BR/common/fast.md | derNiklaas/tldr | 36648e10311955754f99c02805b3f66d5000a5e5 | [
"CC-BY-4.0"
] | 4,407 | 2015-01-12T15:29:39.000Z | 2022-03-31T08:37:33.000Z | # fast
> Teste sua velocidade de download e upload utilizando fast.com.
> Mais informações: <https://github.com/sindresorhus/fast-cli>.
- Mede a velocidade de download atual:
`fast`
- Mede a velocidade de upload atual além da velocidade de download:
`fast --upload`
- Exibe os resultados em uma única linha para reduzir espaçamento:
`fast --single-line`
| 21.235294 | 67 | 0.747922 | por_Latn | 0.999668 |
c03f5a4c21d3a35a3e875258c78b5727bbe2714c | 46 | md | Markdown | README.md | PB-Tech/Koa2Template | 44b19c852f7a27e1f797cf76e257a501cdb23e3b | [
"MIT"
] | null | null | null | README.md | PB-Tech/Koa2Template | 44b19c852f7a27e1f797cf76e257a501cdb23e3b | [
"MIT"
] | null | null | null | README.md | PB-Tech/Koa2Template | 44b19c852f7a27e1f797cf76e257a501cdb23e3b | [
"MIT"
] | null | null | null | # Koa2Template
This is a Koa2 develop template | 23 | 31 | 0.826087 | eng_Latn | 0.998942 |
c03feb10689536e1b3c921d2bef8306082787812 | 640 | md | Markdown | README.md | deematariq/boxbox | 51a79223ba039a023741160c1c4f9ff9cfe3b515 | [
"MIT"
] | 31 | 2015-01-24T20:40:24.000Z | 2020-10-30T01:33:27.000Z | README.md | deematariq/boxbox | 51a79223ba039a023741160c1c4f9ff9cfe3b515 | [
"MIT"
] | 6 | 2015-01-03T17:28:20.000Z | 2017-03-30T17:23:19.000Z | README.md | deematariq/boxbox | 51a79223ba039a023741160c1c4f9ff9cfe3b515 | [
"MIT"
] | 17 | 2015-01-03T16:41:45.000Z | 2021-12-06T21:01:43.000Z | # boxbox
A framework that makes it easier to use the Box2d / Box2dweb physics engine in JavaScript.
## Learn about boxbox
http://incompl.github.com/boxbox
## box2dweb files are from
http://code.google.com/p/box2dweb/
## Demos, experiments, projects, etc.
* [Don't look at me](http://dontlookatme.maryrosecook.com/)
* [Platformer demo](http://incompl.github.io/boxbox/boxbox/demos/platformer/demo.html)
* [Chain demo](http://incompl.github.io/boxbox/boxbox/demos/chain/chain.html)
* [Box Fall](http://bama.ua.edu/~ardixon1/MAIN/Code/block_fall/play.html)
* Add your project or demo here!
## Created at Bocoup
http://bocoup.com
| 26.666667 | 90 | 0.734375 | eng_Latn | 0.353854 |
c040443951414c02fb8e4eb7d981a2c01d7c376c | 1,778 | md | Markdown | README.md | jpmetcalf/IdentityManager2 | 725556d99687626c3161ba6df2a6fa77be3b25d0 | [
"Apache-2.0"
] | 184 | 2018-03-29T00:18:41.000Z | 2022-02-22T07:25:46.000Z | README.md | ChaosEngine/IdentityManager2 | 5dd7ad9828f999efa6fcd3e2d660d831e34350de | [
"Apache-2.0"
] | 20 | 2018-04-05T12:49:21.000Z | 2021-05-07T07:37:09.000Z | README.md | ChaosEngine/IdentityManager2 | 5dd7ad9828f999efa6fcd3e2d660d831e34350de | [
"Apache-2.0"
] | 35 | 2018-04-23T04:17:24.000Z | 2022-02-20T05:35:22.000Z | # IdentityManager2
[](https://www.nuget.org/packages/IdentityManager2) [](https://gitter.im/IdentityManager/IdentityManager?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
IdentityManager2 is a tool for developers and/or administrators to manage the identity information for users of their applications in ASP.NET Core. This includes creating users, editing user information (passwords, email, claims, etc.) and deleting users. It provides a modern replacement for the ASP.NET WebSite Administration tool that used to be built into Visual Studio.
In theory, IdentityManager2 can work with any user store; it just requires an implementation of `IIdentityManagerService`. For an example of ASP.NET Core Identity usage, check out [IdentityManager2.AspNetIdentity](https://github.com/IdentityManager/IdentityManager2.AspNetIdentity).
IdentityManager2 is a development tool and is not designed to be used in production. For production identity management see [AdminUI](https://www.identityserver.com/products).
## Articles
- [Getting Started with IdentityManager2](https://www.scottbrady91.com/ASPNET-Identity/Getting-Started-with-IdentityManager2)
- [IdentityManager2 2020 Update](https://www.scottbrady91.com/ASPNET-Identity/IdentityManager2-2020-Update)
## Contributing
Currently, IdentityManager2 is a port of the original IdentityManager dev tool. If you're interested in helping to update the codebase, then check out the [issue tracker](https://github.com/IdentityManager/IdentityManager2/issues?q=label%3A%22help+wanted%22+is%3Aissue+is%3Aopen).
Developed and maintained by [Rock Solid Knowledge](https://www.identityserver.com).
| 84.666667 | 374 | 0.813273 | eng_Latn | 0.544921 |
c0441d6fd0432fdd12b71e7200745f3b3d409321 | 357 | md | Markdown | node_modules/decentraland-ecs/docs/decentraland-ecs.path2.length.md | nearnshaw/ECS-fish-in-circles | b37fd3feb9ee006be3645e7062a3ed46ca26f698 | [
"Apache-2.0"
] | null | null | null | node_modules/decentraland-ecs/docs/decentraland-ecs.path2.length.md | nearnshaw/ECS-fish-in-circles | b37fd3feb9ee006be3645e7062a3ed46ca26f698 | [
"Apache-2.0"
] | null | null | null | node_modules/decentraland-ecs/docs/decentraland-ecs.path2.length.md | nearnshaw/ECS-fish-in-circles | b37fd3feb9ee006be3645e7062a3ed46ca26f698 | [
"Apache-2.0"
] | null | null | null | [Home](./index) > [decentraland-ecs](./decentraland-ecs.md) > [Path2](./decentraland-ecs.path2.md) > [length](./decentraland-ecs.path2.length.md)
# Path2.length method
Gets the sum of the distances between each pair of sequential points in the path
**Signature:**
```javascript
length(): number;
```
**Returns:** `number`
the Path2 total length (float).
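A minimal usage sketch is shown below. It is illustrative only: the `Path2(x, y)` constructor and `addLineTo(x, y)` calls are assumptions based on the Babylon.js-style math API that this package mirrors, so check their own API pages for the exact signatures.

```typescript
import { Path2 } from 'decentraland-ecs'

// Build an L-shaped path: (0, 0) -> (3, 0) -> (3, 4).
// NOTE: the constructor and addLineTo signatures are assumed, not taken from this page.
const path = new Path2(0, 0)
path.addLineTo(3, 0)
path.addLineTo(3, 4)

// length() sums the distances between sequential points: 3 + 4 = 7.
const total: number = path.length()
```

Because the return value is a plain `number`, it can be fed directly into interpolation or speed calculations without further conversion.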
| 25.5 | 154 | 0.703081 | eng_Latn | 0.760523 |
c04514cacc0f40334ec8d4c8c7f15224862ce99d | 744 | md | Markdown | README.md | colinsteffen/ba-xamarin-android | 90e57cb9e758a12056c09ca72f1916963a78ca5b | [
"Apache-2.0"
] | null | null | null | README.md | colinsteffen/ba-xamarin-android | 90e57cb9e758a12056c09ca72f1916963a78ca5b | [
"Apache-2.0"
] | null | null | null | README.md | colinsteffen/ba-xamarin-android | 90e57cb9e758a12056c09ca72f1916963a78ca5b | [
"Apache-2.0"
] | null | null | null | # ba-xamarin-android
Projekte für die Bachelorarbeit mit Thema App-Entwicklung (Xamarin vs. native Android-Entwicklung)
Die Projekte sind in verschiedene Unterprojekte geordnet, die einen Vergleich zwischen nativer Android-Entwicklung, der Entwicklung mit Xamarin.Android und Xamarin.Forms bezüglich dem Umgang mit Datenbanken, Json Serialization und Deserialization und Rest-Kommunikation darstellen.
Die native Android Projekte sind zu Öffnen mit dem Entwicklungstool Android Studio. Die Xamarin-Projekte sind zu Öffnen mit dem Entwicklungstool Visual Studio. Sollte es beim Deployment der Xamarin-Projekte Probleme geben, müssen evtl. in der Datei Properties in dem untergeordneten Android Projekten die Android-Options umgestellt werden.
| 106.285714 | 339 | 0.84543 | deu_Latn | 0.994643 |
bdddc969b421b701f872f50c664ead52432830d5 | 241 | md | Markdown | Classes/10. Ranger.md | 10leej/tordath | 2ade1b19077776f83fed6ef23bde92528e8735b3 | [
"CC-BY-4.0"
] | 1 | 2020-09-06T20:16:32.000Z | 2020-09-06T20:16:32.000Z | Classes/10. Ranger.md | 10leej/tordath | 2ade1b19077776f83fed6ef23bde92528e8735b3 | [
"CC-BY-4.0"
] | 1 | 2020-10-08T03:53:52.000Z | 2020-10-25T01:02:39.000Z | Classes/10. Ranger.md | 10leej/tordath | 2ade1b19077776f83fed6ef23bde92528e8735b3 | [
"CC-BY-4.0"
] | 1 | 2020-10-24T23:43:17.000Z | 2020-10-24T23:43:17.000Z | # Ranger
Rangers are often found exploring the various lands hunting game, or just serving on expeditions. Often they've adapted their own forms of magics through self learning to adapt to their surrounds, or even the prey they seek to hunt. | 120.5 | 232 | 0.809129 | eng_Latn | 0.999989 |
bddea061156a49b17a2663becb0f54958702fbbc | 75 | md | Markdown | README.md | joao472762/pomodoro | 837ce4b946b3004bf1ca0237d954da5cc68f546a | [
"MIT"
] | null | null | null | README.md | joao472762/pomodoro | 837ce4b946b3004bf1ca0237d954da5cc68f546a | [
"MIT"
] | null | null | null | README.md | joao472762/pomodoro | 837ce4b946b3004bf1ca0237d954da5cc68f546a | [
"MIT"
] | null | null | null | # pomodoro
pomodoro é uma técinica de estudos para otimizar o seu serviço
| 25 | 63 | 0.8 | por_Latn | 0.999994 |
bddf9dc54fa11f198f2af207e56dbd1e999b2e0b | 3,715 | md | Markdown | _posts/2016-11-06-biology-jargons.md | jwang123/jwang123.github.io | 0a920fc186ecf8a20de5885b0b29d6b895629618 | [
"MIT"
] | null | null | null | _posts/2016-11-06-biology-jargons.md | jwang123/jwang123.github.io | 0a920fc186ecf8a20de5885b0b29d6b895629618 | [
"MIT"
] | null | null | null | _posts/2016-11-06-biology-jargons.md | jwang123/jwang123.github.io | 0a920fc186ecf8a20de5885b0b29d6b895629618 | [
"MIT"
] | null | null | null | ---
layout: post
title: Biology jargons
---
# Bio-Jargon
After reading my fair share of biology papers over the past three years, I've come to realize that, like most fields, biology is full of jargon. However, unlike quantitative fields such as math and physics, biology papers are usually written in a more accessible manner. Equipped with a thorough understanding of some common jargon, I believe most undergraduate science students can fully comprehend biological research papers. So after three years, I've decided to compile my own list and explanations so other students are not so easily dissuaded from pursuing a thorough understanding of any paper they encounter.
The list will be divided into multiple categories based on my own opinion of where they are most relevant. (Note: this is a work in progress, all explanations are subject to change)
### Genetics
- **GAL4**: a tool used to study gene expression
- a transcriptional activator from budding yeast
- part of the popular GAL4/UAS binary expression system: one transgenic construct drives the expression of the GAL4 and another construct contains its binding site positioned upstream of a responder gene
- depending on the **driver** (native promoter driving GAL4), the responder gene can be made to express in specific cells
- **GAL4 driver lines**: these are genetic varieties of a model organism where GAL4 is only expressed in some subset of the animal's tissues.
- some lines are highly specific, perhaps only in a few cells
- the presence of GAL4 is assumed to have little to no effect since most cells do not have UAS regions
- **reporter lines**: second part of the system, they are strains with special UAS region next to a desired gene. Since the gene is only expressed in cells that express GAL4, they are *reporting* which cell express GAL4
- **GAL80**: a GAL4 inhibitor that can be used to create GAL4 expression in cells that are in line A but not line B (by having GAL80 expressed in line B)
- **Pan-neuronal**: across most or all types of neurons
- **$\Delta F/F$**:
- $\Delta F$ indicates the difference between initial fluorescence intensity at the resting state and after stimulation.
- $\Delta F/F$ compares the change of the intensity to the original intensity before stimulation.
- refer to this page for the calculation: http://www.nature.com/nprot/journal/v6/n1/box/nprot.2010.169_BX1.html
- **Biocytin-filling**: to determine the shape of a neuron
- **fictive motor pattern**: neuronal activity in the absence of muscle feedback
- **Piezoelectric actuator**: related to the piezoelectric effect; this YouTube video was helpful to me in visualizing its mechanism: https://www.youtube.com/watch?v=fHp95e-CwWQ
- **ChR2 Chrimson**: red-shifted channelrhodopsins; these yellow/red light-sensitive channelrhodopsins allow controlling two populations of neurons independently with light pulses of different colors. ChR2 is a light-gated ion channel: it allows cations to enter the cell upon light illumination (most natural ChR are non-cation specific)
- **GRASP**: a method for mapping synapses using two fragments of a GFP protein. GFP is attached to a CD4 ligand, and if two neurons are close enough for a synapse to form, GFP is reconstituted and can be viewed under a fluorescent microscope
What do the letters and numbers of a transgenic strain mean?
- GMRgal4>UAS-XYZ
- *GMR* represents the promoter sequence driving the expression of GAL4. This corresponds to where the protein will be expressed in a multi-cellular organism
- *gal4>UAS* represents the expression system being used
- *XYZ* is the protein of interest that is ultimately being expressed
| 95.25641 | 613 | 0.775774 | eng_Latn | 0.999474 |
bddfff7ad8b8fa7a436ddc793a89d68cd7b96bc7 | 85 | md | Markdown | notes/09-our-query-in-react.md | Matthew-Nelson/pro-gatsby-2 | 8ef2c6f69284bca958a8ca58d252da99843b744e | [
"MIT"
] | null | null | null | notes/09-our-query-in-react.md | Matthew-Nelson/pro-gatsby-2 | 8ef2c6f69284bca958a8ca58d252da99843b744e | [
"MIT"
] | null | null | null | notes/09-our-query-in-react.md | Matthew-Nelson/pro-gatsby-2 | 8ef2c6f69284bca958a8ca58d252da99843b744e | [
"MIT"
] | null | null | null | We are now going to learn how to actually write our queries within a react component
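For reference, a minimal sketch of what that can look like with Gatsby's `useStaticQuery` hook (this example component and the `siteMetadata.title` field are illustrative assumptions, not part of these course notes):

```jsx
import React from "react"
import { useStaticQuery, graphql } from "gatsby"

// Example component: runs a static GraphQL query at build time
// and renders a field from the query result.
const Header = () => {
  const data = useStaticQuery(graphql`
    query {
      site {
        siteMetadata {
          title
        }
      }
    }
  `)

  return <h1>{data.site.siteMetadata.title}</h1>
}

export default Header
```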
| 42.5 | 84 | 0.811765 | eng_Latn | 1.000009 |
bde01f0b6e3d6a818927a9932c7cadcf06301e10 | 107 | md | Markdown | README.md | javeeddanyal/COVID-19_Forecasting | 82bd92540584b58e6fea102f0b6a890ce5142c27 | [
"MIT"
] | null | null | null | README.md | javeeddanyal/COVID-19_Forecasting | 82bd92540584b58e6fea102f0b6a890ce5142c27 | [
"MIT"
] | null | null | null | README.md | javeeddanyal/COVID-19_Forecasting | 82bd92540584b58e6fea102f0b6a890ce5142c27 | [
"MIT"
] | null | null | null | # COVID-19_Forecasting
Novel coronavirus (nCoV): prediction of worldwide cases, deaths, and recovered cases
| 35.666667 | 83 | 0.82243 | eng_Latn | 0.968613 |
bde1692a55b74fdf296ad4799684933bb307a9c2 | 277 | md | Markdown | Privacy notice.md | Aaron-Junker/aaron-junker.github.io | 7e2715d82e066281ab54e241be1e83b009d19d03 | [
"MIT"
] | 1 | 2021-07-04T11:40:05.000Z | 2021-07-04T11:40:05.000Z | Privacy notice.md | Aaron-Junker/aaron-junker.github.io | 7e2715d82e066281ab54e241be1e83b009d19d03 | [
"MIT"
] | null | null | null | Privacy notice.md | Aaron-Junker/aaron-junker.github.io | 7e2715d82e066281ab54e241be1e83b009d19d03 | [
"MIT"
] | null | null | null | ---
layout: page
title: Privacy notice
permalink: /privacy/
---
I don't collect any data from you.
Some pages use Amazon affiliate links. For the Amazon privacy statement, click on this link: [Link](https://www.amazon.de/gp/help/customer/display.html?nodeId=GX7NJQ4ZB8MHFRNJ)
| 27.7 | 174 | 0.761733 | eng_Latn | 0.754512 |
bde1ccce668e21aee092d3f8caae9ceecd87bbd2 | 9,648 | md | Markdown | _pages/advisory-committee-reports/passenger-vessels/pvaac-report-ch08.md | bruce-usab/usab-uswds | f1c4b7b36d1be2eeba3a94b89a8bbd3ccaf2980a | [
"CC0-1.0"
] | null | null | null | _pages/advisory-committee-reports/passenger-vessels/pvaac-report-ch08.md | bruce-usab/usab-uswds | f1c4b7b36d1be2eeba3a94b89a8bbd3ccaf2980a | [
"CC0-1.0"
] | 3 | 2020-06-12T20:54:59.000Z | 2020-07-13T19:24:52.000Z | _pages/advisory-committee-reports/passenger-vessels/pvaac-report-ch08.md | bruce-usab/usab-uswds | f1c4b7b36d1be2eeba3a94b89a8bbd3ccaf2980a | [
"CC0-1.0"
] | 2 | 2020-06-25T02:52:11.000Z | 2020-07-13T19:11:43.000Z | ---
title: Chapter 8 Vehicle Parking
layout: report
order-number: 8
collection-folder: pvaac
collection-title: Passenger Vessels Access Advisory Committee
---
Note: This chapter only applies to passenger vessels subject to subchapters K or H, except where sections are referenced by chapter 12 which addresses subchapters C and T vessels.
Comment: The committee discussed the parking process that occurs on a vehicle ferry. Parking on a typical ferry is a highly controlled activity versus the undirected parking which normally occurs in a city lot. Unlike landside parking lots, vehicles typically queue up in preparation to enter the ferry at a set time, with space generally provided on a first-come-first-serve basis. Vehicles entering the ferry are often directed by the crew to a particular lane and are required to fill the lane starting at the front of the ferry. Depending on the number of vehicles loaded and each vehicle's weight, vehicles may also be directed to certain areas (particularly heavy trucks) to reduce their weight impact on the trim and stability of the ferry. When ferry demand is high, crew members ensure the spacing between vehicles is at a minimum thereby maximizing the carrying capacity of the ferry. Because of these factors, individual parking spaces are not designated on a ferry as in a city lot. Although lane markings are generally provided to assist drivers and crew members in aligning the vehicles in rows, parking lanes are seldom further demarcated into individual parking boxes because vehicle lengths vary and unused space is not acceptable during times of high demand.
Comment: The committee agreed that providing accessible parking on a ferry required an effective parking management plan plus design and construction requirements. The design and construction requirements must ensure the vehicle deck has the space available so that accessible parking can be provided when the need arises. The management plan must ensure that the need is identified before the loading process begins and that vehicles are arranged on the deck so that access aisles and accessible routes are provided. The committee noted that the access aisles must adjoin an accessible route which connects to all other accessible elements and spaces on the ferry. This allows individuals with disabilities to depart their vehicles and travel to any part of the ferry that is required to be accessible, just as other passengers are permitted to walk to other parts of the ferry.
* * * * *
### SCOPING
#### 208 Parking Spaces on Passenger Vessels Which Carry Vehicles
**208.1 General.** Where public parking is provided on passenger vessels, accessible parking spaces shall be provided in accordance with 208.
> **EXCEPTION: **Where a passenger vessel does not have public toilet facilities and passengers are not required to leave their vehicles, this section does not apply.
Comment: The committee recognized that in small ferries, with short trip duration, the provision of accessible parking spaces might be difficult or impossible. The committee noted that on some small ferries, toilet facilities are not provided and passengers normally remain in their vehicles. For these reasons and because these small ferries usually have short crossings, the committee added an exception.
**208.2 Number Required.** Accessible parking spaces shall be provided in accordance with Table 208.2 and shall comply with 502.
**Table 208.2 Accessible Parking Spaces**
| Total Parking Capacity Provided For the Public on the Passenger Vessel | Minimum Required Number of Accessible Parking Spaces |
| --- | --- |
| 1 to 25 | 1 |
| 26 to 50 | 2 |
| 51 to 75 | 3 |
| 76 to 100 | 4 |
| 101 to 150 | 5 |
| 151 to 200 | 6 |
| 201 to 300 | 7 |
| 301 to 400 | 8 |
| 401 to 500 | 9 |
| 501 to 1000 | 2 percent of total |
| 1001 and over | 20, plus 1 for each 100 over 1000 |
> **208.2.1** Not Used.
Comment: Not on ferries.
> **208.2.2 Van Parking Spaces. **For every eight or fraction of eight accessible parking spaces required by 208.2, at least one shall be a van parking space complying with 502.
**208.3** Not Used.
Comment: As crew members direct drivers to the positions in which they will park, identification signs are not needed and may cause confusion. In trips where no passengers need accessible parking, deck space that is used as access aisles and associated accessible routes may be occupied by vehicles. In such cases, identification signs and markings typically found in a landside parking lot could cause confusion to drivers who believe they are illegally parking in areas "reserved" for persons with disabilities. Vessel operators shall arrange vehicles requiring accessible loading areas and related accessible paths of travel to other accessible areas of the vessel. The arrangement of the vehicles is to result in the required loading areas and aisle as required in the fixed parking requirements of this chapter for cars, vans and buses.
Comment: It was noted that on ferries that load from both ends, the orientation of the vehicle must be accounted for in configuring the adjacent door swing, lift operating area and maneuvering clearances on the deck within the accessible loading area.
**208.4 Location. **Accessible parking spaces shall be located on the shortest accessible route to an accessible elevator, or if no elevator is provided, to accessible public areas on the same deck.
> **EXCEPTION: **All van parking spaces shall be permitted to be grouped on one level of a parking structure.
Comment: Bus accommodation: For ferry vessels that carry over-the-road, transit and paratransit buses, if passengers are permitted to exit and enter the vehicle while onboard the vessel, an accessible path shall be provided from the vehicle to the other accessible facilities aboard the vessel.
**209** Not Used.
Comment: Passenger loading zones are not provided on passenger vessels. They may be on the pier that serves a vessel but not on the vessel itself, therefore, section 209 has been marked as "not used."
* * * * *
### TECHNICAL
#### 502 Parking Spaces
**502.1 General. **Bus, car and van parking spaces required to be accessible shall comply with 502.
**502.2 Accessible Parking Spaces. **Vehicle parking spaces shall be 156 inches (3965 mm) wide minimum and shall be 240 inches (6100 mm) in length minimum. Van parking spaces shall be 192 inches (4880 mm) wide minimum and shall be 240 inches (6100 mm) in length minimum. Over-the-road buses, transit and paratransit bus parking spaces shall provide an adjacent accessible loading area that is 96 inches (2440 mm) wide minimum and shall be 72 inches (1829 mm) long minimum (such as in the Access Board's vehicle guidelines).
Comment: Unlike land based parking lots, an accessible parking space on a ferry will be a rectangle which contains both the space for a vehicle and the space for an accessible aisle. Marking is principally provided to remind crew members where to position vehicles that need accessible parking. The 156 inch width of the vehicle parking space equates to 96 inches for the vehicle and 60 inches for an access aisle which is consistent with the car parking space dimensions in a land based parking lot. Likewise, the 192 inch width of the van parking space provides 96 inches for the vehicle and 96 inches for an access aisle. It is also non-directional, allowing for vehicles to drive on from opposite ends of the ferry.
**502.3 **Not Used.
Comment: As the accessible parking space contains both the space for a vehicle and the space for the access aisle, section 502.3 is marked as "not used", and section 502.3.1 through 502.3.3 have been deleted.
**502.4 Deck Surfaces.** Accessible parking spaces shall have surface slopes not steeper than 1:48 in all directions and shall comply with 302. Changes in level are not permitted.
> **EXCEPTION: **This section shall not apply to vehicle tie-downs which are flush with the deck.
Comment: Compliance to section 302 was added because the original reference to 302 was located in section 502.3, which now is marked as not used.
Comment: Some ferries are designed so that vehicles can be secured to the deck. Often the securement device is connected to the deck at a point which is somewhat recessed. As the typical tie-down spot would not comply with section 302 and would also constitute a change in level on the surface, an exception was added to allow tie-downs in accessible parking spaces which are flush with the deck.
**502.5 Vertical Clearance. **Van parking spaces and a vehicular route to van parking spaces, shall provide a vertical clearance of 98 inches (2490 mm) minimum. Transit and paratransit bus parking spaces, and a vehicular route to such parking spaces, shall provide a vertical clearance of 114 inches (2895 mm) minimum. Over-the-road bus parking spaces, and a vehicular route to such parking spaces, shall provide a vertical clearance in accordance with other national standards.
Comment: It was noted that, for drive-through ferries, the vertical clearance needs to be maintained for one lane width along the entire path of travel of the vehicle, during loading and unloading.
Comment: See 208.3.
**502.6 Marking.** The surface of the accessible parking spaces shall be marked to distinguish them from parking areas which do not contain accessible parking spaces.
Comment: As this marking requirement is provide primarily to assist crew members, the area within the parking space containing the access aisle is not required to be marked so as to distinguish it from the area where the vehicle will be located.
**503** Not Used.
Comment: See 209. | 94.588235 | 1,276 | 0.793636 | eng_Latn | 0.999684 |
bde24451f50e4b4a70c2d09700c1abd3e56e85f6 | 280 | md | Markdown | spreadsheets/Read Me.md | geoffreynyaga/ostrich-project | 157cd7a3c3d9014e31ef21ca21de43f04d039997 | [
"MIT"
] | 15 | 2017-11-08T10:03:26.000Z | 2021-12-21T07:02:44.000Z | spreadsheets/Read Me.md | geoffreynyaga/ostrich-project | 157cd7a3c3d9014e31ef21ca21de43f04d039997 | [
"MIT"
] | 9 | 2020-01-17T15:09:22.000Z | 2022-03-25T19:02:05.000Z | spreadsheets/Read Me.md | geoffreynyaga/ostrich-project | 157cd7a3c3d9014e31ef21ca21de43f04d039997 | [
"MIT"
] | null | null | null | ## NOTE:
1.all these are OFFICE EXCEL 2013 WORKBOOKS
2. ALL the excel workbooks presented here are linked together so it is advisable to
FIRST OPEN the initial sizing workbook and click yes when prompted to update links,
then you can proceed to open the workbook of your choice. | 46.666667 | 83 | 0.796429 | eng_Latn | 0.998797 |
bde26b1b4766d464ff9cb977641e40f9742446df | 4,593 | md | Markdown | docs/loops.md | hash-org/lang-arxiv | 28c05e6ea75493e34dba5d64e5a58fa22ec56aea | [
"MIT"
] | null | null | null | docs/loops.md | hash-org/lang-arxiv | 28c05e6ea75493e34dba5d64e5a58fa22ec56aea | [
"MIT"
] | null | null | null | docs/loops.md | hash-org/lang-arxiv | 28c05e6ea75493e34dba5d64e5a58fa22ec56aea | [
"MIT"
] | null | null | null | # Hash language loop constructs
Hash contains 3 distinct loop control constructs: `for`, `while` and `loop`. Each construct has
a distinct usage case, but they can often be used interchangebly without hastle and are merely
a style choice.
<br />
# General
Each construct supports the basic `break` and `continue` loop control flow statements. These statements
have the same properties as in many other languages like C, Rust, Python etc.
`break` - Using this control flow statements immediatelly terminates the loop and continues
to any statement after the loop (if any).
`continue` - Using this control flow statement will immediatelly skip the current iteration
of the loop body and move on to the next iteration (if any). Obviously, if no iterations
remain, `continue` behaves just like `break`.
<br />
# For construct
## Basics
For loops are special loop control statements that are designed to be used
with iterators.
For loops can be defined as:
```rust
for i in range(1, 10) { // range is a built in iterator
print(i);
}
```
Iterating over lists is also quite simple using the `iter` function to
convert the list into an iterator:
```rust
let nums: [u32] = [1,2,3,4,5,6,7,8,9,10];
// infix functional notation
for num in nums.iter() {
print(num);
}
// using the postfix functional notation
for num in iter(nums) {
print(nums);
}
```
The general syntax for a `for` loop is
```
"for" destructor_expression "in" iterator_expr "{" body_expression "}"
```
## iterators
Iterators ship with the standard library, but you can define your own iterators via the Hash generic typing system.
An iterator `I` of `T` it means to have an implementation `next<I, T>` in scope the current scope.
So, for the example above, the `range` function is essentially a `RangeIterator` of the `u8`, `u16`, `u32`, `...` types.
More details about generics are [here](https://github.com/feds01/hash/wiki/Types-&-Structs#generics).
<br />
# While construct
While loops are identical to 'while' constructs in other languages such as Java, C, JavaScript, etc.
The loop will check a given conditional expression ( must evaluate to a `bool`), and if it evaluates
to `true`, the loop body is executed, oterhwise the interpreter moves on. The loop body can also
use loop control flow statements like `break` or `continue` to prematurely stop looping for a
given condition.
While loops can be defined as:
```rust
let c: u32 = 0;
while c < 10 {
print(i);
}
```
The general syntax for a `while` loop is
```
"while" expression "{" body_expression "}"
```
**Note**: In `Hash`, you cannot write `do-while` loops, but if u want to write a loop that behaves
like a `do-while` statement, here is a good example using `loop`:
```rust
loop {
// do something here, to enable a condition check,
// and then use if statement, or match case to check
// if you need to break out of the loop.
if !condition {break}
}
```
<br />
# Loop consturct
The loop consturct is the simplest of the three. The basic syntax for a loop is as follows:
```rust
let c: u64 = 1;
loop {
print("I looped " + c + " times!");
c += 1;
}
```
You can also use conditional statements within the loop body (which is equivalent to a function body) like so:
```rust
let c: u64 = 1;
loop {
if c == 10 { break }
print("I looped " + c + " times!");
c += 1;
} // this will loop 10 times, and print all 10 times
```
```rust
let c: u64 = 1;
loop {
c += 1;
if c % 2 != 0 { continue };
print("I loop and I print when I get a " + c);
} // this will loop 10 times, and print only when c is even
```
## Miscellaneous
As mentioned at the start of the introduction, the `loop` control flow keyword
is the most universal control flow since to you can use `loop` to represent
both the `for` and `while` loops.
For example, the `for` loop can be expressed using `loop` as:
```rust
loop {
match next(i) {
Some(x) => body(x);
None => break;
}
}
```
And the `while` loop can be written using the `loop` directive
like so:
```rust
let c = 0;
loop {
let x = match x { // where 'x' is the condition for the while loop
true => {
if c < 5 {c+=1; true} else {false}
}
false => break;
}
}
// same as...
let c = 0;
while c < 5 {
c+=1;
}
```
Similarly, the `loop` keyword is equivalent of someone writing a `while` loop that has
a conditional expression that always evaluate to `true`; like so,
```rust
while true {
// do something
}
// is the same as...
loop {
// do something
}
```
| 21.767773 | 120 | 0.673416 | eng_Latn | 0.998533 |
bde2ff0d968d73706e08b557d4ffcb3c189be626 | 1,701 | md | Markdown | doc/bench_register_list.md | vibhatha/sciml-bench | 4c299462cb0017138cb77d5f9116862cfd1f653d | [
"MIT"
] | 12 | 2021-05-10T17:10:56.000Z | 2022-02-03T16:52:32.000Z | doc/bench_register_list.md | vibhatha/sciml-bench | 4c299462cb0017138cb77d5f9116862cfd1f653d | [
"MIT"
] | null | null | null | doc/bench_register_list.md | vibhatha/sciml-bench | 4c299462cb0017138cb77d5f9116862cfd1f653d | [
"MIT"
] | 6 | 2021-05-05T07:28:50.000Z | 2022-03-05T06:44:18.000Z | | **Benchmark Name** | Dataset Ready? | **Baseline Ready?** | **Integrated** | **Origin** |
| --- | --- | --- | --- | --- |
| cloud\_slstr | ✓ | ✓ | ✓ | SciML |
| em\_denoise | ✓ | ✓ | ✓ | SciML |
| dms\_x | ✓ | ✓ | ✓ | SciML |
| ediff/stemdl | ✓ | ✓ | ✓ | ORNL |
| optics\_damage | ✓ | ✓ | ✓ | SciML |
| cryo\_denoise | ✓ | ✓ | ✓ | SciML |
| cryo\_class | ✓ | ✗ | ✗ | SciML |
| cryo\_pick | ✓ | ✗ | ✗ | SciML |
| photoz | ✓ | ✓ | ✗ | SciML |
| psd\_class | ✓ | ✓ | ✗ | SciML |
| saxs\_shapes | ✓ | ✓ | ✗ | SciML |
| saxs\_esti | ✓ | ✓ | ✗ | SciML |
| tritium\_breed | ✓ | ✓ | ✗ | CCFE |
| vae\_stars | ✗ | ✗ | ✗ | ESA |
| uno-multitask | ✓ | ✓ | ✗ | ANL |
| lottery\_hypo | ✓ | ✗ | ✗ | SciML |
| tevolop | ✓ | ✓ | ✗ | Uni. of Virginia |
| quake\_noise | ✗ | ✗ | ✗ | Oxford |
| modis\_fmf | ✓ | ✓ | ✗ | Turing |
| trop\_cyclone | ✗ | ✗ | ✗ | Turing |
| spin\_hamiltonian | ✓ | ✗ | ✗ | SCD/DL |
| jarvis\_ff | ✓ | ✗ | ✗ | NIST |
| jarvis\_dft | ✓ | ✗ | ✗ | NIST |
| jarvis\_ffields | ✓ | ✓ | ✗ | NIST |
| sans\_models | ✓ | ✓ | ✗ | NIST |
| pah\_geom | ✓ | ✗ | ✗ | NIST |
| nmr2d\_spectra | ✓ | ✗ | ✗ | NIST |
| cccdb\_quantym | ✓ | ✗ | ✗ | NIST |
| nist\_absorb | ✓ | ✗ | ✗ | NIST |
| nist\_saturate | ✓ | ✗ | ✗ | NIST |
| vbno\_diffract | ✗ | ✗ | ✗ | NIST |
| tide\_predict | ✗ | ✗ | ✗ | IBM |
| mixed-precision-gio | ✗ | ✗ | ✗ | SCD/DL |
| halo-mass | ✓ | ✗ | ✗ | UCL/STFC |
| mlflimp-seg | ✓ | ✗ | ✗ | SciML |
| giab | ✓ | ✗ | ✗ | NIST |
| ligo\_waves | ✓ | ✓ | ✗ | Uni. of Bristol |
| storm\_align | ✓ | ✗ | ✗ | SciML |
| aber\_fun | ✓ | ✗ | ✗ | RFI |
| beam\_par\_opti | ✓ | ✗ | ✗ | EPAC |
| latent\_factor | ✓ | ✓ | ✗ | SciML |
| clean\_air | ✓ | ✗ | ✗ | Turing |
| stm\_nlp | ✓ | ✗ | ✗ | CEDA | | 37.8 | 91 | 0.423868 | yue_Hant | 0.500284 |
bde314186517861331a0398579020eaf2f1ac74d | 1,255 | md | Markdown | README.md | wbl/NISTP | 921f3a702dac73e423dd6a3705d38f5eed863f17 | [
"Apache-2.0",
"0BSD"
] | 4 | 2016-03-24T19:55:03.000Z | 2019-11-27T08:10:44.000Z | README.md | wbl/NISTP | 921f3a702dac73e423dd6a3705d38f5eed863f17 | [
"Apache-2.0",
"0BSD"
] | null | null | null | README.md | wbl/NISTP | 921f3a702dac73e423dd6a3705d38f5eed863f17 | [
"Apache-2.0",
"0BSD"
] | null | null | null | NISTP
=====
NISTP is an implementation of elliptic curve cryptography designed to be easy
to use. Similar to NaCl, with which I hope to integrate it sometime, it
combines authentication and encryption into a single function, making
cryptography easy to use. NISTP puts together P256, AES256GCM into the
crypto_box_p256aes256gcm function, and includes ECDSA support.
Building requires only a C compiler. To build ./configure.sh && cd bin
&& make. Libtomcrypt is required to build tests. PARI is required to
interpret some tests, and there is an awk script for summarizing
tests. There is currently no attempt to discover things. There are timing
leaks from the AES implementation.
Using NISTP
===
To learn how to use NISTP take a look at the examples in ./tests. The
API follows the principles of the NaCl API at http://nacl.cr.yp.to:
functions should combine multiple cryptographic operations and make all
decisions for the user, as well as require no initialization or cleanup.
Zeroization is not done, but all data is on the stack. The primatives used
in NISTP are secure and mostly standard, aside from the Schnorr signature which
is here used an equivelent form. NISTP is fast, but cannot compete with
Curve25519. It is much simpler then OpenSSL. | 46.481481 | 79 | 0.796813 | eng_Latn | 0.99885 |
bde34141cd0d6d41c2c247d07627f476cec655ac | 123 | md | Markdown | README.md | GeoRouv/UTF16-to-UTF8 | 78fc6b03957ee7c702c777c420c49537f73fdcee | [
"MIT"
] | 1 | 2021-05-27T16:42:15.000Z | 2021-05-27T16:42:15.000Z | README.md | GeoRouv/UTF16-to-UTF8 | 78fc6b03957ee7c702c777c420c49537f73fdcee | [
"MIT"
] | null | null | null | README.md | GeoRouv/UTF16-to-UTF8 | 78fc6b03957ee7c702c777c420c49537f73fdcee | [
"MIT"
] | null | null | null | # UTF16-to-UTF8
A simple converter
## Compile
$ gcc -o utf16to8 utf16to8.c
## Run
$ ./utf16to8 < SAMPLEutf16.txt
| 13.666667 | 34 | 0.650407 | kor_Hang | 0.205489 |
bde735905a57b8298b802e1a29e61bb3505467e6 | 56 | md | Markdown | src/deepproblog/examples/neurogta/data/label_data_schets_experiment/readme.md | vossenwout/gtadeepproblog | 65509b740518af422b96e84ef10716e0ac246e75 | [
"Apache-2.0"
] | null | null | null | src/deepproblog/examples/neurogta/data/label_data_schets_experiment/readme.md | vossenwout/gtadeepproblog | 65509b740518af422b96e84ef10716e0ac246e75 | [
"Apache-2.0"
] | null | null | null | src/deepproblog/examples/neurogta/data/label_data_schets_experiment/readme.md | vossenwout/gtadeepproblog | 65509b740518af422b96e84ef10716e0ac246e75 | [
"Apache-2.0"
] | null | null | null | hier komen de correspenderende keyinputs voor elke image | 56 | 56 | 0.875 | nld_Latn | 0.996142 |
bde8763d537c59200940952255e6c324c14609e2 | 611 | md | Markdown | README.md | NSWSESMembers/beaconnotify | d73acaee08a3d762005c46fcd65a8aee120f8bdb | [
"MIT"
] | null | null | null | README.md | NSWSESMembers/beaconnotify | d73acaee08a3d762005c46fcd65a8aee120f8bdb | [
"MIT"
] | 1 | 2015-03-25T23:56:39.000Z | 2015-03-26T00:28:47.000Z | README.md | sdunster/beaconnotify | d73acaee08a3d762005c46fcd65a8aee120f8bdb | [
"MIT"
] | null | null | null | # beaconnotify
Send messages to contact groups using beacon
## Installation
```bash
git clone https://github.com/sdunster/beaconnotify.git
cd beaconnotify && npm install -g
```
## Usage
Look up a contact group's ID using the Contact Group Register page.
Remove the `--sandbox` flag to send messages through production beacon
(instead of trainbeacon).
```bash
export BEACON_USERNAME='user'
export BEACON_PASSWORD='pass'
beaconnotify --sandbox contact_id 'my message'
```
or
```bash
BEACON_USERNAME='user' BEACON_PASSWORD='pass' beaconnotify --sandbox contact_id 'my message'
```
## License
[MIT](LICENSE)
| 20.366667 | 92 | 0.757774 | eng_Latn | 0.43256 |
bde98be3e2df126f09feddcf4df07a59c14d0615 | 1,037 | md | Markdown | README.md | martianpanda/android_device_samsung_j7elte | b5e2401cf81643e7d1a63415f24cec01fa092864 | [
"Apache-2.0"
] | null | null | null | README.md | martianpanda/android_device_samsung_j7elte | b5e2401cf81643e7d1a63415f24cec01fa092864 | [
"Apache-2.0"
] | null | null | null | README.md | martianpanda/android_device_samsung_j7elte | b5e2401cf81643e7d1a63415f24cec01fa092864 | [
"Apache-2.0"
] | null | null | null | Device configuration for the Samsung Galaxy J7 Exynos
Copyright (C) 2016 Apavayan Sinha <[email protected]>
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
------------------------------------------------------------------
Device Tree For Samsung Galaxy J7 LTE
=====================================
Basic | Spec Sheet
-------:|:-------------------------
CPU | Qcta 1.5 GHz Cortex-A53
CHIPSET | Samsung Exynos 7580
GPU | Mali-T720MP2
Memory | 1.5GB RAM
Android | 5.1.1
Storage | 16 GB
MicroSD | Up to 128GB
Battery | 3000 mAh
Display | 5.5"
Front Camera | 5 MP, LED flash
Rear Camera | 13 MP, 4128 x 3096 pixels, autofocus, LED flash

This branch is for building CyanogenMod 13.0 Firmware.
Model Supported : SM-J700F SM-J700M SM-J700H
| 28.027027 | 104 | 0.63163 | eng_Latn | 0.428183 |
bde9a5805622bf5aac1836b6fd05fb231f751560 | 1,344 | md | Markdown | README.md | sercanersoy/database-management-system | 7c0e7887aac73184531c506fd01291dee4f80bc9 | [
"MIT"
] | null | null | null | README.md | sercanersoy/database-management-system | 7c0e7887aac73184531c506fd01291dee4f80bc9 | [
"MIT"
] | null | null | null | README.md | sercanersoy/database-management-system | 7c0e7887aac73184531c506fd01291dee4f80bc9 | [
"MIT"
] | null | null | null | # Simple Database Management System
## Description
This project aims to create a simple database management system which will be able to execute basic DDL and DML operations.
## How to Run
In order to build the project, `cmake` must be installed. After installing cmake, run the following in the project's root directory to build the project:
```
cmake . && make
```
Execute the following command to use the program interactively:
```
./database_management_system
```
Also you can give the query commands to the program in an input file and print the outputs to an output file as follows:
```
./database_management_system input.txt output.txt
```
## How to Use
Available commands are listed below:
### DDL (Data Definition Language) Operations
Create Type:
```
create type <type-name> <number-of-fields> <field1-name> <field2-name> ...
```
Delete Type:
```
delete type <type-name>
```
List All Types:
```
list type
```
### DML (Data Manipulation Language) Operations
Create Record:
```
create record <type-name> <field1-value> <field2-value> ...
```
Delete Record:
```
delete record <type-name> <primary-key>
```
Update Record:
```
update record <type-name> <primary-key> <field2-value> <field3-value> ...
```
Search Record:
```
search record <type-name> <primary-key>
```
List All Records:
```
list record <type-name>
```
| 18.162162 | 153 | 0.71131 | eng_Latn | 0.961012 |
bded1e3d817503599e535762e7297c7a6c1a19b7 | 78 | md | Markdown | src/Controllers/README.md | cierrateam/billing | ffd158e414569b6b623e58b97a15388886424dd6 | [
"MIT"
] | null | null | null | src/Controllers/README.md | cierrateam/billing | ffd158e414569b6b623e58b97a15388886424dd6 | [
"MIT"
] | 1 | 2020-01-24T18:07:49.000Z | 2020-01-24T18:07:49.000Z | src/Controllers/README.md | cierrateam/billing | ffd158e414569b6b623e58b97a15388886424dd6 | [
"MIT"
] | null | null | null | # Billing Controllers
All the billing controllers are located in this folder
| 19.5 | 54 | 0.820513 | eng_Latn | 0.999924 |
bdeddb93605e72c42a72156b44b4eaf275800c65 | 1,752 | md | Markdown | Bab4-CFunction/3-Scope.md | michaelrk02/TeachingAssistant-KP2021 | b59f6ba502ec642a3d607e58a114a26deb7d4aff | [
"MIT"
] | 3 | 2021-08-30T15:00:24.000Z | 2021-11-14T00:48:13.000Z | Bab4-CFunction/3-Scope.md | ivandraaa/TeachingAssistant-KP2021 | 351dd2d3a6f28d0e4fc7f2f870ace7b48d645e74 | [
"MIT"
] | 8 | 2021-08-31T02:20:02.000Z | 2021-11-17T05:06:18.000Z | Bab4-CFunction/3-Scope.md | ivandraaa/TeachingAssistant-KP2021 | 351dd2d3a6f28d0e4fc7f2f870ace7b48d645e74 | [
"MIT"
] | 8 | 2021-08-29T14:05:48.000Z | 2021-12-16T16:48:46.000Z | [<< Materi Sebelumnya (Function Scope) <<](2-FungsiLibraryC.md)
# 4.3 - Scope Rules
What is scope? **Scope** indicates the visibility of a **variable**, i.e. whether it can be accessed by the code that needs it.
Consider the following example:
```c
if (pilihan == 2) {
    int kode; /* variable `kode` */
    printf("Masukkan kode: ");
    scanf("%d", &kode);
    /* `kode` can be used here */
    printf("Kode anda: %d\n", kode);
    if (kode == 111) {
        /* `kode` can also still be used here, and so on */
        printf("Kode anda lagi: %d\n", kode);
    }
}
/* but `kode` cannot be used here, because of the different "scope" */
printf("Kode yang anda masukkan: %d\n", kode /* will cause an error */);
```
Simply put, any variable defined inside a **block** (`{ ... }`) can only be used within the scope of that block itself and the blocks nested inside it. If we try to access that variable outside the block in which it was defined, an error will be produced.
## Global Scope
We can define a variable **outside** a function; such a variable has a kind of scope called **global scope**. Consider the following example:
```c
#include <stdio.h>
/* variable `kode` in the global scope */
int kode;
void tampilkan_kode() {
    /* use the variable `kode` from the global scope */
    printf("Kode anda adalah: %d\n", kode);
}
int main() {
    printf("Masukkan kode: ");
    scanf("%d", &kode); /* store into the variable `kode` in the global scope */
    tampilkan_kode();
    return 0;
}
/*
Output:
Masukkan kode: 555
Kode anda adalah: 555
*/
```
[Next Material (Recursion) >>](4-Rekursi.md)
| 28.258065 | 312 | 0.679795 | ind_Latn | 0.97126 |
bdee6decc858300396942e829bccbf357de49e5e | 1,094 | md | Markdown | _posts/2004-06-06-perpetually_expanding_horizons_sure.md | cwinters/cwinters.github.io | a1c68381f4d1c82cadca08aa784f53fe0b3fb3f2 | [
"MIT"
] | null | null | null | _posts/2004-06-06-perpetually_expanding_horizons_sure.md | cwinters/cwinters.github.io | a1c68381f4d1c82cadca08aa784f53fe0b3fb3f2 | [
"MIT"
] | 6 | 2020-02-24T22:24:42.000Z | 2022-02-26T01:45:04.000Z | _posts/2004-06-06-perpetually_expanding_horizons_sure.md | cwinters/cwinters.github.io | a1c68381f4d1c82cadca08aa784f53fe0b3fb3f2 | [
"MIT"
] | null | null | null | ---
tags: politics
layout: post
title: "'Perpetually expanding horizons?' Sure..."
---
<a href="http://www.washingtonpost.com/wp-dyn/articles/A19077-2004Jun5.html">An Optimist's Legacy</a> - I haven't written much about politics partly because I've been swamped with work. But also because there's just too much to write/think about. It's exhausting.
<p>I wasn't going to say anything about Reagan but then I saw this column by George Will describing his legacy:</p>
<blockquote>The feeling of foreboding -- the sense of shrunken possibilities -- that afflicted Americans 20 years ago has been banished by a new birth of the American belief in perpetually expanding horizons.</blockquote>
<p>Spoken like someone who doesn't know a soul who's living from paycheck to paycheck, or worried to the bone that they'll get hurt or sick while they're skating along without health insurance for "just a few months". That's not to say that horizons aren't expanding -- the horizons of the corporations (and the men who run them) sucking the life out of humanity are growing day by day.</p>
| 60.777778 | 390 | 0.764168 | eng_Latn | 0.999315 |
bdee73794893e8bea3c3e4f3b770e62dee12457a | 1,526 | md | Markdown | README.md | wisehackermonkey/dirdate | 25d20b7378f17540726636f7d0ae5752a4cfb5de | [
"MIT"
] | null | null | null | README.md | wisehackermonkey/dirdate | 25d20b7378f17540726636f7d0ae5752a4cfb5de | [
"MIT"
] | null | null | null | README.md | wisehackermonkey/dirdate | 25d20b7378f17540726636f7d0ae5752a4cfb5de | [
"MIT"
] | null | null | null | # Dirdate
Dirdate is a simple command-line tool for creating folders with the format YYYYMMDD_XXXXXX (see the example after the list below),
where
- Y = year
- M = month
- D = day
- X = project name
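For example, assuming the command is run on 2021-03-14 (a made-up date), it would create a folder like this:

```
$ dirdate my_project
# creates the folder
20210314_my_project/
```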
## Installation
install dirdate by:
> gem install dirdate
And then execute:
> dirdate PROJECT_NAME
OR
> dirdate
## Usage
There are two modes to this simple commandline tool.
1) with arguments
```
dirdate <name of folder>
```

[](https://asciinema.org/a/P0zv17KwTTv8Ur9PgMZqZUnB2)
2) without arguments
[](https://asciinema.org/a/XQvobG48vPNZgqN04nzoVGceK)
## Contributing
#### This project will most likely not be maintained.
Bug reports and pull requests are welcome on GitHub at https://github.com/wisehackermonkey/dirdate.
## License
The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
## Development
TODO: Remove default text
After checking out the repo, run `bin/setup` to install dependencies. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).
| 31.142857 | 324 | 0.752294 | eng_Latn | 0.940392 |
bdeee9011c0605f971c691af8a5cca18bc24c64f | 2,718 | md | Markdown | docs/relational-databases/replication/monitor/view-information-and-perform-tasks-for-a-publisher-replication-monitor.md | lxyhcx/sql-docs.zh-cn | e63de561000b0b4bebff037bfe96170d6b61c908 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/relational-databases/replication/monitor/view-information-and-perform-tasks-for-a-publisher-replication-monitor.md | lxyhcx/sql-docs.zh-cn | e63de561000b0b4bebff037bfe96170d6b61c908 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/relational-databases/replication/monitor/view-information-and-perform-tasks-for-a-publisher-replication-monitor.md | lxyhcx/sql-docs.zh-cn | e63de561000b0b4bebff037bfe96170d6b61c908 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 查看发布服务器的信息和执行其任务(复制监视器)| Microsoft Docs
ms.custom: ''
ms.date: 03/14/2017
ms.prod: sql
ms.prod_service: database-engine
ms.component: replication
ms.reviewer: ''
ms.suite: sql
ms.technology:
- replication
ms.tgt_pltfrm: ''
ms.topic: conceptual
helpviewer_keywords:
- Publishers [SQL Server replication], Replication Monitor tasks
- viewing Publisher information
- Publishers [SQL Server replication], viewing information
ms.assetid: 1e777e95-377a-4de3-b965-867464aadaaf
caps.latest.revision: 37
author: MashaMSFT
ms.author: mathoma
manager: craigg
ms.openlocfilehash: 57f47623fa4d75588eb5f882fa96da97a755131e
ms.sourcegitcommit: 1740f3090b168c0e809611a7aa6fd514075616bf
ms.translationtype: HT
ms.contentlocale: zh-CN
ms.lasthandoff: 05/03/2018
---
# <a name="view-information-and-perform-tasks-for-a-publisher-replication-monitor"></a>查看发布服务器的信息和执行其任务(复制监视器)
[!INCLUDE[appliesto-ss-xxxx-xxxx-xxx-md](../../../includes/appliesto-ss-xxxx-xxxx-xxx-md.md)]
复制监视器提供了下列选项卡,以显示有关选定发布服务器的信息:
- **发布**
此选项卡显示选定发布服务器上所有发布的相关信息。
- **订阅监视列表**
此选项卡用于显示所选发布服务器上所有可用发布的订阅的信息:有错误、出现警告或性能最差。 对于运行 [!INCLUDE[msCoName](../../../includes/msconame-md.md)] [!INCLUDE[ssVersion2005](../../../includes/ssversion2005-md.md)]以前版本的分发服务器,不显示此选项卡。
- **“代理”** 选项卡
此选项卡显示所有类型的复制所使用的代理和作业的详细信息。 使用该选项卡,还可以启动和停止每个代理和作业。
若要查看每个选项卡上各个选项的详细信息,请在右窗格中单击该选项卡,再单击菜单栏上的 **“帮助”** 。 有关启动复制监视器的信息,请参阅[启动复制监视器](../../../relational-databases/replication/monitor/start-the-replication-monitor.md)。
### <a name="to-view-information-and-perform-tasks-for-a-publisher"></a>查看发布服务器的信息和执行其任务
1. 在左窗格中,展开发布服务器组,然后单击一个发布服务器。
2. 若要查看所有发布的信息,请单击 **“发布”** 选项卡。
3. 若要查看有关订阅的信息,请单击 **“订阅监视列表”** 选项卡。还可以从此访问更详细的信息以及执行任务:
- 若要查看与订阅相关联的代理的详细信息,请右键单击该订阅,再单击 **“查看详细信息”**。
- 若要查看订阅的属性,请右键单击该订阅,然后单击 **“属性”**。
- 若要同步推送订阅,请右键单击该订阅,然后单击 **“开始同步”**。
- 若要重新初始化订阅,请右键单击该订阅,然后单击 **“重新初始化订阅”**。
4. 若要查看有关代理的信息,请单击 **“代理”** 选项卡。您还可以通过此选项卡访问更详细的信息并执行任务:
- 若要查看有关代理的详细信息(如信息性消息以及任何错误消息),请右键单击代理,然后单击 **“查看详细信息”**。
- 若要查看有关运行代理的作业的详细信息(如计划、作业步骤详细信息等等),请右键单击代理,然后单击 **“属性”**。
- 若要管理代理的配置文件,请右键单击代理,然后单击 **“代理配置文件”**。 有关详细信息,请参阅[处理复制代理配置文件](../../../relational-databases/replication/agents/work-with-replication-agent-profiles.md)。
- 若要启动未运行的代理,请右键单击代理,然后单击 **“启动代理”**。
- 若要停止运行中的代理,请右键单击代理,然后单击 **“停止代理”**。
## <a name="see-also"></a>另请参阅
[查看和修改分发服务器和发布服务器属性](../../../relational-databases/replication/view-and-modify-distributor-and-publisher-properties.md)
[监视复制](../../../relational-databases/replication/monitor/monitoring-replication-overview.md)
| 33.975 | 194 | 0.698308 | yue_Hant | 0.278378 |
bdef675103c460edb5725957dfac20f187ad2272 | 937 | md | Markdown | README.md | dandelano/Android-Final-Project | f7a70e3351dc6079f9aa7a631d395f2f4eebcc98 | [
"MIT"
] | null | null | null | README.md | dandelano/Android-Final-Project | f7a70e3351dc6079f9aa7a631d395f2f4eebcc98 | [
"MIT"
] | null | null | null | README.md | dandelano/Android-Final-Project | f7a70e3351dc6079f9aa7a631d395f2f4eebcc98 | [
"MIT"
] | null | null | null | # Android-Final-Project
This is my final project for my Android programming class. It is an app that allows you to
store the description of movies on your tablet. You can add new movies to your database by searching by the title of the movie and saving the result to the local storage for later viewing. The search function uses themoviedb.org api to retrieve descriptions, cover images, and home page urls if available.
### Version
1.0.0
# License
MIT
This project was intended as a single use app for education purposes only. Each project below
has its own license which can be located at the links below. This app is provided 'as is'.
### This project uses the following projects:
Thank you to the following projects for the contributions they made in the creation of this project.
Tmdb api
https://www.themoviedb.org/
themoviedbapi
https://github.com/holgerbrandl/themoviedbapi
Picasso
http://square.github.io/picasso/ | 32.310345 | 305 | 0.784418 | eng_Latn | 0.999164 |
bdefe8cee35efa41ec0e7e16bf79a42de0850414 | 604 | md | Markdown | _posts/JavaScript/2020-07-22-024-typeof-instanceof.md | europani/http-europani.github.io- | 123736445d3dbab963cf7dac08d6e987a0260c42 | [
"MIT"
] | null | null | null | _posts/JavaScript/2020-07-22-024-typeof-instanceof.md | europani/http-europani.github.io- | 123736445d3dbab963cf7dac08d6e987a0260c42 | [
"MIT"
] | null | null | null | _posts/JavaScript/2020-07-22-024-typeof-instanceof.md | europani/http-europani.github.io- | 123736445d3dbab963cf7dac08d6e987a0260c42 | [
"MIT"
] | null | null | null | ---
layout: post
title: typeof, instanceof 연산자
categories: Javascript
tags: [Javascript]
---
● typeof 객체 : 객체의 타입을 출력
```javascript
console.log(typeof 'string'); // string
console.log(typeof 1); // number
console.log(typeof []); // object
console.log(typeof {}); // object
console.log(typeof null); // object
console.log(typeof function() { }); // function
```
● instanceof 객체 : 객체의 인스턴스타입을 출력
```javascript
let Person = function(){
this.name = "Chris";
};
let inst = new Person();
inst instanceof Person; // true
inst instanceof Object; // true
typeof inst; // object
``` | 20.133333 | 48 | 0.647351 | eng_Latn | 0.398131 |
bdf151ed1fbc892d77c022cf3f9650e895dee8bc | 26 | md | Markdown | README.md | bigBug2333/7-2nodejs- | 4378e485a453cf1f1b6223f3aeef65d742121b54 | [
"MIT"
] | null | null | null | README.md | bigBug2333/7-2nodejs- | 4378e485a453cf1f1b6223f3aeef65d742121b54 | [
"MIT"
] | null | null | null | README.md | bigBug2333/7-2nodejs- | 4378e485a453cf1f1b6223f3aeef65d742121b54 | [
"MIT"
] | null | null | null | # 7-2nodejs-
7-2nodejs: Day 1
| 8.666667 | 12 | 0.730769 | zho_Hans | 0.457797 |
bdf1563432075986039da9d984929bd1e48077f1 | 69 | md | Markdown | README.md | mrhx/composer-solid-example | c85bda49da296ca2d2637e02606d156d6eb9e18e | [
"MIT"
] | null | null | null | README.md | mrhx/composer-solid-example | c85bda49da296ca2d2637e02606d156d6eb9e18e | [
"MIT"
] | null | null | null | README.md | mrhx/composer-solid-example | c85bda49da296ca2d2637e02606d156d6eb9e18e | [
"MIT"
] | null | null | null | # composer-solid-example
Example of composer-based SOLID application
| 23 | 43 | 0.84058 | eng_Latn | 0.99341 |
bdf2680b6dbd6cefefb339328beb83257557b8c8 | 32 | md | Markdown | README.md | MarcMeinhardt/2DTerrainGeneration | 396e035ee13f96da09eeb299cde6884ea3faabcd | [
"MIT"
] | 1 | 2022-03-12T22:28:02.000Z | 2022-03-12T22:28:02.000Z | README.md | MarcMeinhardt/2DTerrainGeneration | 396e035ee13f96da09eeb299cde6884ea3faabcd | [
"MIT"
] | null | null | null | README.md | MarcMeinhardt/2DTerrainGeneration | 396e035ee13f96da09eeb299cde6884ea3faabcd | [
"MIT"
] | null | null | null | # Generative Art
Generative Art
| 10.666667 | 16 | 0.8125 | deu_Latn | 0.672274 |
bdf2ddf4cb8aabcbfbbd3db3685b099b6470d2a7 | 143 | md | Markdown | content/posts/post123.md | SebDanielsson/tina-cloud-starter | 7af2d402eb0882788eb52e88ec47bcc4d237fcb2 | [
"Apache-2.0"
] | null | null | null | content/posts/post123.md | SebDanielsson/tina-cloud-starter | 7af2d402eb0882788eb52e88ec47bcc4d237fcb2 | [
"Apache-2.0"
] | null | null | null | content/posts/post123.md | SebDanielsson/tina-cloud-starter | 7af2d402eb0882788eb52e88ec47bcc4d237fcb2 | [
"Apache-2.0"
] | null | null | null | ---
title: Post123
author: content/authors/pedro.md
excerpt: HEJ TESTAR TINA CMS
---
HEJ TESTAR TINA CMSHEJ TESTAR TINA CMSHEJ TESTAR TINA CMS
| 20.428571 | 57 | 0.769231 | yue_Hant | 0.980648 |
bdf2fb256430177ab2e7c6a80dde1ab2bb54d986 | 1,019 | md | Markdown | en/common/kahlan.md | reinhart1010/nix | a1803c718ead3b79854b65396c8967bd5ec32874 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | en/common/kahlan.md | reinhart1010/nix | a1803c718ead3b79854b65396c8967bd5ec32874 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | en/common/kahlan.md | reinhart1010/nix | a1803c718ead3b79854b65396c8967bd5ec32874 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
layout: page
title: common/kahlan (English)
description: "A unit and Behaviour Driven Development test framework for PHP."
content_hash: bcba5553b0d8ba41ebff85702deb3764d4988e9e
---
# kahlan
A unit and Behaviour Driven Development test framework for PHP.
More information: <https://kahlan.github.io>.
- Run all specifications in the "spec" directory:
`kahlan`
- Run specifications using a specific configuration file:
`kahlan --config=`<span class="tldr-var badge badge-pill bg-dark-lm bg-white-dm text-white-lm text-dark-dm font-weight-bold">path/to/configuration_file</span>
- Run specifications and output using a reporter:
`kahlan --reporter=`<span class="tldr-var badge badge-pill bg-dark-lm bg-white-dm text-white-lm text-dark-dm font-weight-bold">dot|bar|json|tap|verbose</span>
- Run specifications with code coverage (detail can be between 0 and 4):
`kahlan --coverage=`<span class="tldr-var badge badge-pill bg-dark-lm bg-white-dm text-white-lm text-dark-dm font-weight-bold">detail_level</span>
| 37.740741 | 158 | 0.770363 | eng_Latn | 0.532025 |
bdf306017f4d169fb7697a9eb648840fe0fffd43 | 30 | md | Markdown | README.md | RahmanM/Sample | 2460775543b61a8e7e368ffd884ce25edbaa8d72 | [
"Apache-2.0"
] | null | null | null | README.md | RahmanM/Sample | 2460775543b61a8e7e368ffd884ce25edbaa8d72 | [
"Apache-2.0"
] | null | null | null | README.md | RahmanM/Sample | 2460775543b61a8e7e368ffd884ce25edbaa8d72 | [
"Apache-2.0"
] | null | null | null | # Sample
Sample test only repo | 15 | 21 | 0.8 | eng_Latn | 0.99974 |
bdf3378435fd3edafb65bf4ba22fca3b20ed1037 | 14,784 | md | Markdown | docs/《现代JavaScript》教程/40.错误处理/01.错误处理,try-catch.md | caidix/vuepress-interview-github | c2dcc2a730b255f8795a95325801cf83d96516d7 | [
"MIT"
] | null | null | null | docs/《现代JavaScript》教程/40.错误处理/01.错误处理,try-catch.md | caidix/vuepress-interview-github | c2dcc2a730b255f8795a95325801cf83d96516d7 | [
"MIT"
] | 5 | 2021-01-02T09:29:34.000Z | 2021-01-02T09:29:37.000Z | docs/《现代JavaScript》教程/40.错误处理/01.错误处理,try-catch.md | caidix/vuepress-interview-github | c2dcc2a730b255f8795a95325801cf83d96516d7 | [
"MIT"
] | null | null | null | ---
title: 错误处理,"try..catch"
date: 2020-12-13 14:02:41
permalink: /pages/245490/
categories:
- 《现代JavaScript》教程
- 错误处理
tags:
- 现代JavaScript
author:
name: CD_wOw
link: https://github.com/caidix
---
不管你多么精通编程,有时我们的脚本总还是会出现错误。可能是因为我们的编写出错,或是与预期不同的用户输入,或是错误的服务端响应以及其他数千种原因。
通常,如果发生错误,脚本就会“死亡”(立即停止),并在控制台将错误打印出来。
但是有一种语法结构 `try..catch`,它使我们可以“捕获(`catch`)”错误,因此脚本可以执行更合理的操作,而不是死掉。
## `“try…catch”` 语法
`try..catch` 结构由两部分组成:`try` 和 `catch`:
```js
try {
// 代码...
} catch (err) {
// 错误捕获
}
```
它按照以下步骤执行:
1. 首先,执行 `try {...}` 中的代码。
2. 如果这里没有错误,则忽略 `catch(err)`:执行到 `try` 的末尾并跳过 `catch` 继续执行。
3. 如果这里出现错误,则 `try` 执行停止,控制流转向 `catch(err)` 的开头。变量 `err`(我们可以使用任何名称)将包含一个 `error` 对象,该对象包含了所发生事件的详细信息。

- 所以,`try {…}` 块内的错误不会杀死脚本 — 我们有机会在 `catch` 中处理它。
让我们来看一些例子。
- 没有 error 的例子:显示 `alert` `(1)` 和 `(2)`:
```javascript
try {
alert('Start of try runs'); // (1) <--
// ...这里没有 error
alert('End of try runs'); // (2) <--
} catch(err) {
alert('Catch is ignored, because there are no errors'); // (3)
}
```
- 包含 error 的例子:显示 `(1)` 和 `(3)` 行的 `alert` 中的内容:
```javascript
try {
alert('Start of try runs'); // (1) <--
lalala; // Error,变量未定义!
alert('End of try (never reached)'); // (2)
} catch(err) {
alert(`Error has occurred!`); // (3) <--
}
```
> **`try..catch` 仅对运行时的 error 有效**
>
> 要使得 `try..catch` 能工作,代码必须是可执行的。换句话说,它必须是有效的 `JavaScript` 代码。
>
> 如果代码包含语法错误,那么 `try..catch` 将无法正常工作,例如含有不匹配的花括号:
>
> ```javascript
> try {
> {{{{{{{{{{{{
> } catch(e) {
> alert("The engine can't understand this code, it's invalid");
> }
> ```
>
> `JavaScript` 引擎首先会读取代码,然后运行它。在读取阶段发生的错误被称为“解析时间(`parse-time`)”错误,并且无法恢复(从该代码内部)。这是因为引擎无法理解该代码。
>
> 所以,`try..catch` 只能处理有效代码中出现的错误。这类错误被称为“运行时的错误(`runtime errors`)”,有时被称为“异常(`exceptions`)”。
> **`try..catch` 同步工作**
>
> 如果在“计划的(`scheduled`)”代码中发生异常,例如在 `setTimeout` 中,则 `try..catch` 不会捕获到异常:
>
> ```javascript
> try {
> setTimeout(function() {
> noSuchVariable; // 脚本将在这里停止运行
> }, 1000);
> } catch (e) {
> alert( "won't work" );
> }
> ```
>
> 因为 `try..catch` 包裹了计划要执行的函数,该函数本身要稍后才执行,这时引擎已经离开了 `try..catch` 结构。
>
> 为了捕获到计划的(`scheduled`)函数中的异常,那么 `try..catch` 必须在这个函数内:
>
> ```javascript
> setTimeout(function() {
> try {
> noSuchVariable; // try..catch 处理 error 了!
> } catch {
> alert( "error is caught here!" );
> }
> }, 1000);
> ```
## `Error` 对象
发生错误时,`JavaScript` 生成一个包含有关其详细信息的对象。然后将该对象作为参数传递给 `catch`:
```javascript
try {
// ...
} catch(err) { // <-- “error 对象”,也可以用其他参数名代替 err
// ...
}
```
对于所有内建的 `error,error` 对象具有两个主要属性:
- `name`
Error 名称。例如,对于一个未定义的变量,名称是 `"ReferenceError"`。
- `message`
关于 error 的详细文字描述。
还有其他非标准的属性在大多数环境中可用。其中被最广泛使用和支持的是:
- `stack`
当前的调用栈:用于调试目的的一个字符串,其中包含有关导致 `error` 的嵌套调用序列的信息。
例如:
```javascript
try {
lalala; // error, variable is not defined!
} catch(err) {
alert(err.name); // ReferenceError
alert(err.message); // lalala is not defined
alert(err.stack); // ReferenceError: lalala is not defined at (...call stack)
// 也可以将一个 error 作为整体显示出来as a whole
// Error 信息被转换为像 "name: message" 这样的字符串
alert(err); // ReferenceError: lalala is not defined
}
```
### 可选的 “`catch`” 绑定
> **A recent addition**
>
> `This is a recent addition to the language. Old browsers may need polyfills.`
>
> **最近添加的**
>
> 这是最近添加到语言中的特性,旧的浏览器可能需要使用 polyfill。
如果我们不需要 `error` 的详细信息,`catch` 也可以忽略它:
```javascript
try {
// ...
} catch { // <-- 没有 (err)
// ...
}
```
## 使用 “`try…catch`”
让我们一起探究一下真实场景中 `try..catch` 的用例。
正如我们所知道的,`JavaScript` 支持 [JSON.parse(str)](https://developer.mozilla.org/zh/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse) 方法来解析 `JSON` 编码的值。
通常,它被用来解析从网络,从服务器或是从其他来源接收到的数据。
我们收到数据后,然后像下面这样调用 `JSON.parse`:
```javascript
let json = '{"name":"John", "age": 30}'; // 来自服务器的数据
let user = JSON.parse(json); // 将文本表示转换成 JS 对象
// 现在 user 是一个解析自 json 字符串的有自己属性的对象
alert( user.name ); // John
alert( user.age ); // 30
```
你可以在 [JSON 方法,toJSON](https://zh.javascript.info/json) 一章中找到更多关于 JSON 的详细内容。
**如果 `json` 格式错误,`JSON.parse` 就会生成一个 error,因此脚本就会“死亡”。**
我们对此满意吗?当然不!
如果这样做,当拿到的数据出了问题,那么访问者永远都不会知道原因(除非他们打开开发者控制台)。代码执行失败却没有提示信息,这真的是很糟糕的用户体验。
让我们用 `try..catch` 来处理这个 `error`:
```javascript
let json = "{ bad json }";
try {
let user = JSON.parse(json); // <-- 当出现一个 error 时...
alert( user.name ); // 不工作
} catch (e) {
// ...执行会跳转到这里并继续执行
alert( "Our apologies, the data has errors, we'll try to request it one more time." );
alert( e.name );
alert( e.message );
}
```
在这儿,我们将 `catch` 块仅仅用于显示信息,但是我们可以做更多的事儿:发送一个新的网络请求,向访问者建议一个替代方案,将有关错误的信息发送给记录日志的设备,……。所有这些都比代码“死掉”好得多。
## 抛出我们自定义的 `error`
如果这个 `json` 在语法上是正确的,但是没有所必须的 `name` 属性该怎么办?
像这样:
```javascript
let json = '{ "age": 30 }'; // 不完整的数据
try {
let user = JSON.parse(json); // <-- 没有 error
alert( user.name ); // 没有 name!
} catch (e) {
alert( "doesn't execute" );
}
```
这里 `JSON.parse` 正常执行,但是缺少 `name` 属性对我们来说确实是个 `error`。
为了统一进行 `error` 处理,我们将使用 `throw` 操作符。
### “`Throw`” 操作符
`throw` 操作符会生成一个 error 对象。
语法如下:
```javascript
throw <error object>
```
技术上讲,我们可以将任何东西用作 error 对象。甚至可以是一个原始类型数据,例如数字或字符串,但最好使用对象,最好使用具有 `name` 和 `message` 属性的对象(某种程度上保持与内建 error 的兼容性)。
`JavaScript` 中有很多内建的标准 error 的构造器:`Error`,`SyntaxError`,`ReferenceError`,`TypeError` 等。我们也可以使用它们来创建 `error` 对象。
它们的语法是:
```javascript
let error = new Error(message);
// 或
let error = new SyntaxError(message);
let error = new ReferenceError(message);
// ...
```
对于内建的 `error`(不是对于其他任何对象,仅仅是对于 `error`),`name` 属性刚好就是构造器的名字。`message` 则来自于参数(`argument`)。
例如:
```javascript
let error = new Error("Things happen o_O");
alert(error.name); // Error
alert(error.message); // Things happen o_O
```
让我们来看看 `JSON.parse` 会生成什么样的 error:
```javascript
try {
JSON.parse("{ bad json o_O }");
} catch(e) {
alert(e.name); // SyntaxError
alert(e.message); // Unexpected token b in JSON at position 2
}
```
正如我们所看到的, 那是一个 `SyntaxError`。
在我们的示例中,缺少 `name` 属性就是一个 `error`,因为用户必须有一个 `name`。
所以,让我们抛出这个 `error`。
```javascript
let json = '{ "age": 30 }'; // 不完整的数据
try {
let user = JSON.parse(json); // <-- 没有 error
if (!user.name) {
throw new SyntaxError("Incomplete data: no name"); // (*)
}
alert( user.name );
} catch(e) {
alert( "JSON Error: " + e.message ); // JSON Error: Incomplete data: no name
}
```
在 `(*)` 标记的这一行,`throw` 操作符生成了包含着我们所给定的 `message` 的 `SyntaxError`,与 `JavaScript` 自己生成的方式相同。`try` 的执行立即停止,控制流转向 `catch` 块。
现在,`catch` 成为了所有 `error` 处理的唯一场所:对于 `JSON.parse` 和其他情况都适用。
## 再次抛出`(Rethrowing)`
在上面的例子中,我们使用 `try..catch` 来处理不正确的数据。但是在 `try {...}` 块中是否可能发生 **另一个预料之外的 error**?例如编程错误(未定义变量)或其他错误,而不仅仅是这种“不正确的数据”。
例如:
```javascript
let json = '{ "age": 30 }'; // 不完整的数据
try {
user = JSON.parse(json); // <-- 忘记在 user 前放置 "let"
// ...
} catch(err) {
alert("JSON Error: " + err); // JSON Error: ReferenceError: user is not defined
// (实际上并没有 JSON Error)
}
```
当然,一切皆有可能!程序员也会犯错。即使是被数百万人使用了几十年的开源项目中 — 也可能突然被发现了一个漏洞,并导致可怕的黑客入侵。
在我们的例子中,`try..catch` 旨在捕获“数据不正确”的 `error`。但是实际上,`catch` 会捕获到 **所有** 来自于 `try` 的 `error`。在这儿,它捕获到了一个预料之外的 `error`,但是仍然抛出的是同样的 `"JSON Error"` 信息。这是不正确的,并且也会使代码变得更难以调试。
为了避免此类问题,我们可以采用“重新抛出”技术。规则很简单:
**`catch` 应该只处理它知道的 `error`,并“抛出”所有其他 `error`。**
“再次抛出(`rethrowing`)”技术可以被更详细地解释为:
1. `Catch` 捕获所有 `error`。
2. 在 `catch(err) {...}` 块中,我们对 `error` 对象 `err` 进行分析。
3. 如果我们不知道如何处理它,那我们就 `throw err`。
通常,我们可以使用 `instanceof` 操作符判断错误类型:
```javascript
try {
user = { /*...*/ };
} catch(err) {
if (err instanceof ReferenceError) {
alert('ReferenceError'); // 访问一个未定义(undefined)的变量产生了 "ReferenceError"
}
}
```
我们还可以从 `err.name` 属性中获取错误的类名。所有原生的错误都有这个属性。另一种方式是读取 `err.constructor.name`。
在下面的代码中,我们使用“再次抛出”,以达到在 `catch` 中只处理 `SyntaxError` 的目的:
```javascript
let json = '{ "age": 30 }'; // 不完整的数据
try {
let user = JSON.parse(json);
if (!user.name) {
throw new SyntaxError("Incomplete data: no name");
}
blabla(); // 预料之外的 error
alert( user.name );
} catch(e) {
if (e instanceof SyntaxError) {
alert( "JSON Error: " + e.message );
} else {
throw e; // 再次抛出 (*)
}
}
```
如果 `(*)` 标记的这行 `catch` 块中的 `error` 从 `try..catch` 中“掉了出来”,那么它也可以被外部的 `try..catch` 结构(如果存在)捕获到,如果外部不存在这种结构,那么脚本就会被杀死。
所以,`catch` 块实际上只处理它知道该如何处理的 `error`,并“跳过”所有其他的 `error`。
下面这个示例演示了这种类型的 `error` 是如何被另外一级 `try..catch` 捕获的:
```javascript
function readData() {
let json = '{ "age": 30 }';
try {
// ...
blabla(); // error!
} catch (e) {
// ...
if (!(e instanceof SyntaxError)) {
throw e; // 再次抛出(不知道如何处理它)
}
}
}
try {
readData();
} catch (e) {
alert( "External catch got: " + e ); // 捕获了它!
}
```
上面这个例子中的 `readData` 只知道如何处理 `SyntaxError`,而外部的 `try..catch` 知道如何处理任意的 `error`。
## `try…catch…finally`
等一下,以上并不是所有内容。
`try..catch` 结构可能还有一个代码子句(`clause`):`finally`。
如果它存在,它在所有情况下都会被执行:
- `try` 之后,如果没有 `error`,
- `catch` 之后,如果没有 `error`。
该扩展语法如下所示:
```javascript
try {
... 尝试执行的代码 ...
} catch(e) {
... 处理 error ...
} finally {
... 总是会执行的代码 ...
}
```
试试运行这段代码:
```javascript
try {
alert( 'try' );
if (confirm('Make an error?')) BAD_CODE();
} catch (e) {
alert( 'catch' );
} finally {
alert( 'finally' );
}
```
这段代码有两种执行方式:
1. 如果你对于 “`Make an error`?” 的回答是 “Yes”,那么执行 `try -> catch -> finally`。
2. 如果你的回答是 “`No`”,那么执行 `try -> finally`。
`finally` 子句(`clause`)通常用在:当我们开始做某事的时候,希望无论出现什么情况都要完成完成某个任务。
例如,我们想要测量一个斐波那契数字函数 `fib(n)` 执行所需要花费的时间。通常,我们可以在运行它之前开始测量,并在运行完成时结束测量。但是,如果在该函数调用期间出现 `error` 该怎么办?特别是,下面这段 `fib(n)` 的实现代码在遇到负数或非整数数字时会返回一个 `error`。
无论如何,`finally` 子句都是一个结束测量的好地方。
在这儿,`finally` 能够保证在两种情况下都能正确地测量时间 — 成功执行 `fib` 以及 `fib` 中出现 error 时:
```javascript
let num = +prompt("Enter a positive integer number?", 35)
let diff, result;
function fib(n) {
if (n < 0 || Math.trunc(n) != n) {
throw new Error("Must not be negative, and also an integer.");
}
return n <= 1 ? n : fib(n - 1) + fib(n - 2);
}
let start = Date.now();
try {
result = fib(num);
} catch (e) {
result = 0;
} finally {
diff = Date.now() - start;
}
alert(result || "error occurred");
alert( `execution took ${diff}ms` );
```
你可以通过运行上面这段代码并在 `prompt` 弹窗中输入 `35` 来进行检查 — 代码运行正常,先执行 `try` 然后是 `finally`。如果你输入的是 `-1` — 将立即出现 `error`,执行将只花费 `0ms`。以上两种情况下的时间测量都正确地完成了。
换句话说,函数 `fib` 以 `return` 还是 `throw` 完成都无关紧要。在这两种情况下都会执行 `finally` 子句。
> **变量和 `try..catch..finally` 中的局部变量**
>
> 请注意,上面代码中的 `result` 和 `diff` 变量都是在 `try..catch` **之前** 声明的。
>
> 否则,如果我们使用 `let` 在 `try` 块中声明变量,那么该变量将只在 `try` 块中可见。
> **`finally` 和 `return`**
>
> `finally` 子句适用于 `try..catch` 的 **任何** 出口。这包括显式的 `return`。
>
> 在下面这个例子中,在 `try` 中有一个 `return`。在这种情况下,`finally` 会在控制转向外部代码前被执行。
>
> ```javascript
> function func() {
>
> try {
> return 1;
>
> } catch (e) {
> /* ... */
> } finally {
> alert( 'finally' );
> }
> }
>
> alert( func() ); // 先执行 finally 中的 alert,然后执行这个 alert
> ```
> **`try..finally`**
>
> 没有 `catch` 子句的 `try..finally` 结构也很有用。当我们不想在这儿处理 `error`(让它们 `fall through`),但是需要确保我们启动的处理需要被完成。
>
> ```javascript
> function func() {
> // 开始执行需要被完成的操作(比如测量)
> try {
> // ...
> } finally {
> // 完成前面我们需要完成的那件事儿,即使 try 中的执行失败了
> }
> }
> ```
>
> In the code above, an error inside `try` always falls out, because there's no `catch`. But `finally` works before the execution flow leaves the function.
## Global `catch`
> **Environment-specific**
>
> The information from this section is not a part of the core JavaScript.
Imagine we've got a fatal error outside of `try..catch`, and the script died. Like a programming error or something else terrible.
Is there a way to react to such occurrences? We may want to log the error, show something to the user (normally they don't see error messages), and so on.
There is none in the specification, but environments usually provide it, because it's really useful. For instance, Node.js has [`process.on("uncaughtException")`](https://nodejs.org/api/process.html#process_event_uncaughtexception) for that. And in the browser we can assign a function to the special [window.onerror](https://developer.mozilla.org/zh/docs/Web/API/GlobalEventHandlers/onerror) property, which will run in case of an uncaught error.
The syntax:
```javascript
window.onerror = function(message, url, line, col, error) {
// ...
};
```
- `message`
The error message.
- `url`
The URL of the script where the error happened.
- `line`, `col`
The line and column numbers where the error happened.
- `error`
The `Error` object.
For instance:
```markup
<script>
window.onerror = function(message, url, line, col, error) {
alert(`${message}\n At ${line}:${col} of ${url}`);
};
function readData() {
  badFunc(); // whoops, something went wrong!
}
readData();
</script>
```
The role of the global handler `window.onerror` is usually not to recover the script execution (that's probably impossible in case of programming errors), but to send the error message to developers.
There are also web services that provide error logging for such cases, like [https://errorception.com](https://errorception.com/) or [http://www.muscula.com](http://www.muscula.com/).
They work like this:
1. We register at the service and get a piece of JS (or a script URL) from them to insert on pages.
2. That JS script sets a custom `window.onerror` function.
3. When an error occurs, it sends a network request about it to the service.
4. We can log in to the service's web interface and see the errors.
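A minimal sketch of what such a reporting handler might look like (the endpoint URL and payload shape are made up for illustration; a real service ships its own script):
```javascript
window.onerror = function (message, url, line, col, error) {
  // hypothetical logging endpoint; a real service defines its own URL and format
  fetch('https://logs.example.com/report', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      message: message,
      url: url,
      line: line,
      col: col,
      stack: error && error.stack // non-standard, but well-supported
    })
  });
};
```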
## Summary
The `try..catch` construct allows us to handle runtime errors. It literally allows us to "try" running the code and "catch" errors that may occur in it.
The syntax is:
```javascript
try {
  // run this code
} catch(err) {
  // if an error happens, jump here
  // err is the error object
} finally {
  // execute in any case, after try/catch
}
```
There may be no `catch` section or no `finally`, so the shorter constructs `try..catch` and `try..finally` are also valid.
Error objects have the following properties:
- `message`: the human-readable error message.
- `name`: a string with the error name (the name of the error's constructor).
- `stack` (non-standard, but well-supported): the call stack at the moment the error was created.
If an error object is not needed, we can omit it by using `catch {` instead of `catch(err) {`.
We can also generate our own errors using the `throw` operator. Technically, the argument of `throw` can be anything, but it's usually an error object inheriting from the built-in `Error` class. More on extending errors in the next chapter.
Rethrowing is an important pattern of error handling: a `catch` block usually expects and knows how to handle a particular error type, so it should rethrow errors it doesn't know.
Even if we don't have `try..catch`, most environments allow us to set up a "global" error handler to catch errors that "fall out". In the browser, that's `window.onerror`.
## Tasks
### Use `finally` or just the code?
Compare the two code fragments below.
1. The first one uses `finally` to execute the code after `try..catch`:
```javascript
try {
work work
} catch (e) {
handle errors
} finally {
cleanup the working space
}
```
2. The second fragment puts the cleanup code right after `try..catch`:
```javascript
try {
work work
} catch (e) {
handle errors
}
cleanup the working space
```
We definitely need the cleanup after the work, no matter whether there was an error or not.
Is there an advantage here in using `finally`, or are both code fragments equal? If there is such an advantage, give an example of when it matters.
#### Solution
> The difference becomes obvious when we look at the code inside a function.
>
> The behavior is different if there's a "jump out" of `try..catch`.
>
> For instance, when there's a `return` inside `try..catch`. The `finally` clause works in case of **any** exit from `try..catch`, even via the `return` statement: right after `try..catch` is done, but before the calling code gets control.
>
> ```javascript
> function f() {
> try {
> alert('start');
> return "result";
> } catch (e) {
> /// ...
> } finally {
> alert('cleanup!');
> }
> }
>
> f(); // cleanup!
> ```
>
> ...or when there's a `throw`, like this:
>
> ```javascript
> function f() {
> try {
> alert('start');
> throw new Error("an error");
> } catch (e) {
> // ...
> if("can't handle the error") {
> throw e;
> }
>
> } finally {
> alert('cleanup!')
> }
> }
>
> f(); // cleanup!
> ```
>
> It's `finally` that guarantees the cleanup here. If we just put the code at the end of the function `f`, it wouldn't run in these situations.
bdf34dcc16a38ca3d5a0ee18394c595894f18e45 | 4369 | md | Markdown | README.md | scottrfrancis/dash-home | MIT

# Home Dashboard
A simple dashboard to show the status of various systems around the house and to facilitate easy control of some of the more complex ones.
The first project is to provide visibility into the pool equipment. I have a Pentair IntelliTouch controller with a wireless remote. Among other issues with that system, it is surprisingly hard to tell whether you turned the heater off or not: it takes about 20 button clicks and navigating a clunky menu system. This project seeks to make that information visible.
A second effort is to provide visibility and ad hoc control of the BHyve irrigation controller, which is also used in my house to control the pool filler. Most of the time a regular schedule is sufficient, but triggering a 5 or 10 minute fill on an ad hoc basis would be nice.
### Amplify Framework
This project was built using the [Amplify Framework for React](https://docs.amplify.aws/start/q/integration/react) and will interface, initially, with an AWS IoT Device Shadow for data.
*TODO*
* Push to CloudFront (for https hosting) and register DNS
* Refactor the Dashboard component into service and view components
* Implement desired state functionality -- right now this is read-only.
* Abstract the direct connect to the Shadow with AppSync and a control API Gateway
* Interface with the BHyve
* Create some 'macro' buttons for activities like draining the pool (start the pump, wait, open the drain valve, wait, close, wait, stop)

## Design Notes
This is a basic React app that uses react-bootstrap for layout and a couple of other components. Most of the work is done in one big component, `Dashboard`. This needs to be refactored.
The tricky part here is the magic to get the IoT Data. To supply that data, I previously created an [IoT Thing](https://github.com/scottrfrancis/Pentair-Thing) to read the Pentair Protocol and publish to a device Shadow.
To make things simple (quick and dirty), I'm connecting directly to the device shadow. This has the advantage of getting updates quickly and an easy interface to request changes to turn things on and off.
However, it won't readily extend to other control surfaces, like Alexa, where we will need an API.
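For reference, here is a rough sketch of what that direct shadow connection can look like with Amplify's PubSub category and the AWS IoT provider. This is an assumption-laden sketch, not the repo's actual code: the import paths follow the Amplify v3-era docs, and the `pool-controller` thing name is a placeholder.
```javascript
import Amplify, { PubSub } from 'aws-amplify'
import { AWSIoTProvider } from '@aws-amplify/pubsub/lib/Providers'
import awsiot from './aws-iot'

// register the IoT provider using the values from aws-iot.js
Amplify.addPluggable(new AWSIoTProvider({
  aws_pubsub_region: awsiot.aws_pubsub_region,
  aws_pubsub_endpoint: `wss://${awsiot.aws_iot_endpoint}/mqtt`
}))

// subscribe to reported-state updates on a (placeholder) thing's shadow
PubSub.subscribe('$aws/things/pool-controller/shadow/update/accepted').subscribe({
  next: (msg) => console.log('shadow update', msg.value),
  error: (err) => console.error('subscription error', err)
})
```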
The Amplify Framework uses [AWS Cognito](https://aws.amazon.com/cognito/) for user management, and I took the easy route of just wrapping my main component with the Amplify HOC. Easy and secure.
### Connecting IoT
Beyond bringing up the React app, and assuming you have a device shadow to read from, there are some steps needed to enable the two to talk.
Having created a Cognito user pool as part of the Amplify setup, you need to allow authenticated users to attach to IoT data. This is done by:
1. Finding the role used by the authenticated Cognito users in the IAM Console.
2. Creating a new policy (attachIoTPolicy) and attaching it to the role using:
```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "iot:AttachPolicy",
      "Resource": "*"
    }
  ]
}
```
3. Additionally attach the standard policy `AWSIoTDataAccess` to the cognito role.
The code here relies on having some configuration in the file `aws-iot.js`, which has this structure:
```
const awsiot = {
aws_pubsub_region: "<your region>",
aws_iot_endpoint: "<address from query below>",
policy_name: "<policy name created below>"
}
export default awsiot
```
There are some setup steps to complete that file:
1. Install the [AWS CLI tool](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-linux.html) and `jq` if you don't have them.
2. Get the `aws_iot_endpoint` with the command
```
aws iot describe-endpoint --endpoint-type iot:data-ats | jq '.endpointAddress'
```
3. In the AWS IoT Console, create a new policy (dashboard-policy) with this statement and copy the name (dashboard-policy) to the `policy_name` property.
```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iot:*",
      "Resource": "*"
    }
  ]
}
```
4. From the `aws-exports.js` file that Amplify creates, get the value for the `aws_cognito_identity_pool_id` property and attach the newly created policy to the identity pool with a command like this
```
aws iot attach-policy --policy-name dashboard-policy --target <aws_identity_pool_id from above>
```
bdf3f6cd33d3de5bef3ea1f83890a034f395937b | 9630 | md | Markdown | articles/azure-functions/functions-bindings-event-grid-output.md | sonquer/azure-docs.pl-pl | CC-BY-4.0, MIT

---
title: Azure Event Grid powiązanie danych wyjściowych dla Azure Functions
description: Dowiedz się, jak wysłać Zdarzenie Event Grid w Azure Functions.
author: craigshoemaker
ms.topic: reference
ms.date: 02/14/2020
ms.author: cshoe
ms.custom: fasttrack-edit
ms.openlocfilehash: e7a2611312ffc33703dd5cc9d0a2d7142ddb0532
ms.sourcegitcommit: f97f086936f2c53f439e12ccace066fca53e8dc3
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 02/15/2020
ms.locfileid: "77368950"
---
# <a name="azure-event-grid-output-binding-for-azure-functions"></a>Azure Event Grid output binding for Azure Functions
Use the Event Grid output binding to write events to a custom topic. You must have a valid [access key for the custom topic](../event-grid/security-authentication.md#custom-topic-publishing).
For information on setup and configuration details, see the [overview](./functions-bindings-event-grid.md).
> [!NOTE]
> The Event Grid output binding does not support shared access signatures (SAS tokens). You must use the topic's access key.
> [!IMPORTANT]
> The Event Grid output binding is only available for Functions 2.x and higher.
## <a name="example"></a>Example
# <a name="ctabcsharp"></a>[C#](#tab/csharp)
The following example shows a [C# function](functions-dotnet-class-library.md) that writes a message to an Event Grid custom topic, using the method return value as the output:
```csharp
[FunctionName("EventGridOutput")]
[return: EventGrid(TopicEndpointUri = "MyEventGridTopicUriSetting", TopicKeySetting = "MyEventGridTopicKeySetting")]
public static EventGridEvent Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
{
return new EventGridEvent("message-id", "subject-name", "event-data", "event-type", DateTime.UtcNow, "1.0");
}
```
The following example shows how to use the `IAsyncCollector` interface to send a batch of messages.
```csharp
[FunctionName("EventGridAsyncOutput")]
public static async Task Run(
[TimerTrigger("0 */5 * * * *")] TimerInfo myTimer,
[EventGrid(TopicEndpointUri = "MyEventGridTopicUriSetting", TopicKeySetting = "MyEventGridTopicKeySetting")]IAsyncCollector<EventGridEvent> outputEvents,
ILogger log)
{
for (var i = 0; i < 3; i++)
{
var myEvent = new EventGridEvent("message-id-" + i, "subject-name", "event-data", "event-type", DateTime.UtcNow, "1.0");
await outputEvents.AddAsync(myEvent);
}
}
```
# <a name="c-scripttabcsharp-script"></a>[C# Script](#tab/csharp-script)
The following example shows the Event Grid output binding in a *function.json* file.
```json
{
"type": "eventGrid",
"name": "outputEvent",
"topicEndpointUri": "MyEventGridTopicUriSetting",
"topicKeySetting": "MyEventGridTopicKeySetting",
"direction": "out"
}
```
Here's the C# script code that creates one event:
```cs
#r "Microsoft.Azure.EventGrid"
using System;
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Extensions.Logging;
public static void Run(TimerInfo myTimer, out EventGridEvent outputEvent, ILogger log)
{
outputEvent = new EventGridEvent("message-id", "subject-name", "event-data", "event-type", DateTime.UtcNow, "1.0");
}
```
Here's the C# script code that creates multiple events:
```cs
#r "Microsoft.Azure.EventGrid"
using System;
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Extensions.Logging;
public static void Run(TimerInfo myTimer, ICollector<EventGridEvent> outputEvent, ILogger log)
{
outputEvent.Add(new EventGridEvent("message-id-1", "subject-name", "event-data", "event-type", DateTime.UtcNow, "1.0"));
outputEvent.Add(new EventGridEvent("message-id-2", "subject-name", "event-data", "event-type", DateTime.UtcNow, "1.0"));
}
```
# <a name="javascripttabjavascript"></a>[JavaScript](#tab/javascript)
The following example shows the Event Grid output binding in a *function.json* file.
```json
{
"type": "eventGrid",
"name": "outputEvent",
"topicEndpointUri": "MyEventGridTopicUriSetting",
"topicKeySetting": "MyEventGridTopicKeySetting",
"direction": "out"
}
```
Here's the JavaScript code that creates a single event:
```javascript
module.exports = async function (context, myTimer) {
var timeStamp = new Date().toISOString();
context.bindings.outputEvent = {
id: 'message-id',
subject: 'subject-name',
dataVersion: '1.0',
eventType: 'event-type',
data: "event-data",
eventTime: timeStamp
};
context.done();
};
```
Here's the JavaScript code that creates multiple events:
```javascript
module.exports = function(context) {
var timeStamp = new Date().toISOString();
context.bindings.outputEvent = [];
context.bindings.outputEvent.push({
id: 'message-id-1',
subject: 'subject-name',
dataVersion: '1.0',
eventType: 'event-type',
data: "event-data",
eventTime: timeStamp
});
context.bindings.outputEvent.push({
id: 'message-id-2',
subject: 'subject-name',
dataVersion: '1.0',
eventType: 'event-type',
data: "event-data",
eventTime: timeStamp
});
context.done();
};
```
# <a name="pythontabpython"></a>[Python](#tab/python)
The Event Grid output binding is not available for Python.
# <a name="javatabjava"></a>[Java](#tab/java)
The Event Grid output binding is not available for Java.
---
## <a name="attributes-and-annotations"></a>Attributes and annotations
# <a name="ctabcsharp"></a>[C#](#tab/csharp)
For [C# class libraries](functions-dotnet-class-library.md), use the [EventGridAttribute](https://github.com/Azure/azure-functions-eventgrid-extension/blob/dev/src/EventGridExtension/OutputBinding/EventGridAttribute.cs) attribute.
The attribute's constructor takes the name of an app setting that contains the name of the custom topic, and the name of an app setting that contains the topic key. For more information about these settings, see [Output - configuration](#configuration). Here's an example of the `EventGrid` attribute:
```csharp
[FunctionName("EventGridOutput")]
[return: EventGrid(TopicEndpointUri = "MyEventGridTopicUriSetting", TopicKeySetting = "MyEventGridTopicKeySetting")]
public static string Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
{
...
}
```
For a complete example, see the [example](#example) section.
# <a name="c-scripttabcsharp-script"></a>[C# Script](#tab/csharp-script)
Attributes are not supported by C# Script.
# <a name="javascripttabjavascript"></a>[JavaScript](#tab/javascript)
Attributes are not supported by JavaScript.
# <a name="pythontabpython"></a>[Python](#tab/python)
The Event Grid output binding is not available for Python.
# <a name="javatabjava"></a>[Java](#tab/java)
The Event Grid output binding is not available for Java.
---
## <a name="configuration"></a>Configuration
The following table explains the binding configuration properties that you set in the *function.json* file and the `EventGrid` attribute.
|function.json property | Attribute property |Description|
|---------|---------|----------------------|
|**type** | n/a | Must be set to "eventGrid". |
|**direction** | n/a | Must be set to "out". This parameter is set automatically when you create the binding in the Azure portal. |
|**name** | n/a | The variable name used in function code that represents the event. |
|**topicEndpointUri** |**TopicEndpointUri** | The name of an app setting that contains the URI for the custom topic, such as `MyTopicEndpointUri`. |
|**topicKeySetting** |**TopicKeySetting** | The name of an app setting that contains an access key for the custom topic. |
[!INCLUDE [app settings to local.settings.json](../../includes/functions-app-settings-local.md)]
> [!IMPORTANT]
> Make sure that the value of the `TopicEndpointUri` configuration property is set to the name of an app setting that contains the URI of the custom topic. Don't specify the URI of the custom topic directly in this property.
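For local development, the two app settings referenced in these examples typically live in *local.settings.json*. The sketch below is illustrative only: the setting names match this article's examples, while the endpoint URI and key values are placeholders:
```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "MyEventGridTopicUriSetting": "https://<topic-name>.<region>-1.eventgrid.azure.net/api/events",
    "MyEventGridTopicKeySetting": "<topic-access-key>"
  }
}
```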
## <a name="usage"></a>Usage
# <a name="ctabcsharp"></a>[C#](#tab/csharp)
Send messages by using a method parameter such as `out EventGridEvent paramName`. To write multiple messages, you can use `ICollector<EventGridEvent>` or `IAsyncCollector<EventGridEvent>` in place of `out EventGridEvent`.
# <a name="c-scripttabcsharp-script"></a>[C# Script](#tab/csharp-script)
Send messages by using a method parameter such as `out EventGridEvent paramName`. In C# script, `paramName` is the value specified in the `name` property of *function.json*. To write multiple messages, you can use `ICollector<EventGridEvent>` or `IAsyncCollector<EventGridEvent>` in place of `out EventGridEvent`.
# <a name="javascripttabjavascript"></a>[JavaScript](#tab/javascript)
Access the output event by using `context.bindings.<name>`, where `<name>` is the value specified in the `name` property of *function.json*.
# <a name="pythontabpython"></a>[Python](#tab/python)
The Event Grid output binding is not available for Python.
# <a name="javatabjava"></a>[Java](#tab/java)
The Event Grid output binding is not available for Java.
---
## <a name="next-steps"></a>Next steps
* [Dispatch an Event Grid event](./functions-bindings-event-grid-trigger.md)
bdf457c15a4bfad86863d4641be8f4c6e5d6dcb4 | 864 | md | Markdown | docs/ide/reference/q-devenv-exe.md | hericlesme/visualstudio-docs.pt-br | CC-BY-4.0, MIT

---
title: -? (devenv.exe)
ms.date: 11/04/2016
ms.prod: visual-studio-dev15
ms.technology: vs-ide-general
ms.topic: reference
helpviewer_keywords:
- /? Devenv switch
ms.assetid: fd8fd6b2-1304-4d06-8118-6629666801fb
author: gewarren
ms.author: gewarren
manager: douge
ms.workload:
- multiple
ms.openlocfilehash: e33726d600cd4988fa3ad4a07aea9df10db70ea7
ms.sourcegitcommit: e13e61ddea6032a8282abe16131d9e136a927984
ms.translationtype: HT
ms.contentlocale: pt-BR
ms.lasthandoff: 04/26/2018
ms.locfileid: "31942572"
---
# <a name="-devenvexe"></a>/? (devenv.exe)
Displays a message box listing all `devenv` switches, with a brief description of each.
## <a name="syntax"></a>Syntax
```
devenv /?
```
## <a name="see-also"></a>See also
- [Devenv command-line switches](../../ide/reference/devenv-command-line-switches.md)
bdf49b4ce7a886de2766ea6d37c5678691b9879f | 1453 | md | Markdown | docs/csharp/language-reference/operators/less-than-operator.md | lucieva/docs.cs-cz | CC-BY-4.0, MIT

---
title: '< operator (C# reference)'
ms.date: 07/20/2015
f1_keywords:
- <_CSharpKeyword
helpviewer_keywords:
- less than operator (<) [C#]
- < operator [C#]
ms.assetid: 38cb91e6-79a6-48ec-9c1e-7b71fd8d2b41
ms.openlocfilehash: 382110985eaffd7ca4cf014d7991fc5ee87dc031
ms.sourcegitcommit: 2eceb05f1a5bb261291a1f6a91c5153727ac1c19
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 09/04/2018
ms.locfileid: "43530305"
---
# <a name="lt-operator-c-reference"></a>< operator (C# reference)
All numeric and enumeration types define a "less than" relational operator (`<`) that returns `true` if the first operand is less than the second, `false` otherwise.
## <a name="remarks"></a>Remarks
User-defined types can overload the `<` operator (see [operator](../../../csharp/language-reference/keywords/operator.md)). If `<` is overloaded, [`>`](../../../csharp/language-reference/operators/greater-than-operator.md) must also be overloaded.
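A minimal sketch of such an overload pair (the `Box` type below is made up for illustration; it is not part of the linked sample):
```csharp
using System;

public readonly struct Box
{
    public Box(double volume) => Volume = volume;
    public double Volume { get; }

    // Overloading '<' requires overloading '>' as well.
    public static bool operator <(Box left, Box right) => left.Volume < right.Volume;
    public static bool operator >(Box left, Box right) => left.Volume > right.Volume;
}

public static class Demo
{
    public static void Main()
    {
        Console.WriteLine(new Box(1.0) < new Box(2.0)); // True
    }
}
```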
## <a name="example"></a>Example
[!code-csharp[csRefOperators#24](../../../csharp/language-reference/operators/codesnippet/CSharp/less-than-operator_1.cs)]
## <a name="see-also"></a>See also
- [C# reference](../../../csharp/language-reference/index.md)
- [C# programming guide](../../../csharp/programming-guide/index.md)
- [C# operators](../../../csharp/language-reference/operators/index.md)
bdf540dd5578c213bfab676b2ec1b2e67c95bf0c | 2399 | md | Markdown | Lesson-11-Loops/README.md | hollowbit/learn_programming | Apache-2.0

[Previous: Lists/Arrays](../Lesson-10-Lists-Arrays/README.md)
# Loops
If you want to rerun certain code multiple times, or run it once for each item in a list, you can use a loop.
```javascript
enemies.forEach(function(enemy) {
enemy.x += Math.random() * 4 - 2
enemy.y += Math.random() * 4 - 2
})
```
If you want to spawn 5 enemies:
```javascript
enemiesSpawned = 0
while(enemiesSpawned < 5) {
enemies.push({ x: 10 * enemiesSpawned, y: 3 * enemiesSpawned})
enemiesSpawned++
}
```
## Using Loops and Lists in Algorithms
Lists and loops make it possible to write algorithms that take any amount of data and produce a result. For example, our average age algorithm could only calculate the average age of a few people; adding more people would require code changes. If we instead wrote a function that takes a list of ages to calculate with, our algorithm could find the average age whether we have 5 people or 1000. Here is what it could look like:
```javascript
function calculateAverageAge(ages) {
total = 0
ages.forEach(function(age) {
total += age
})
return total / ages.length // ages.length is how many items are in the list
}
// Let's use our algorithm
calculateAverageAge([5,98,24]) // 42.33
calculateAverageAge([43,645,5,24,65,35,8,12,22,56,23,54,6,2,39,84,93,15,18,53,74,73,84,65,23,64,46,66,32,98,28,39,24,46,76,43,31,81]) // 60.39
```
## Lists in Our Game
*Try adding some AI to the enemies. You can access them with the `enemies` list:* https://jsfiddle.net/khmt0oc1/1/
<details>
<summary>Example Answer</summary>
```javascript
enemies.forEach(function(enemy) {
movementX = 0
movementY = 0
if (enemy.x < x) {
movementX = 2
}
if (enemy.x > x) {
movementX = -2
}
if (enemy.y < y) {
movementY = 2
}
if (enemy.y > y) {
movementY = -2
}
enemy.x += movementX
enemy.y += movementY
})
```
</details>
# Take some time to make this game interesting! Try programming any features you like
Here is what my game ended up looking like: https://jsfiddle.net/khmt0oc1/3/
> NOTE: The game is already programmed to do a GAME OVER screen when your health is < 1.
bdf5cf3917d26b9a060211c5c0a970fee836d9fc | 1200 | md | Markdown | README.md | SkyzoxRobin/middleman-nft | MIT

# middleman-nft
Building the first secure p2p NFT exchange with near-zero fees
## Motivations of this project
I have recently seen numerous scams in the NFT space with people trying to bid with MEX for "bluechip" projects or acting as a middleman and rekting newcomers that just wanted to flip their NFTs. We are also seeing the emergence of several cashgrab projects on the Elrond Network with insane royalties (mainly 10%).
These basic facts result in a community that struggles to make a profit because of enormous project royalties and scammers hunting for newcomers to rekt.
That's why I am aiming to deliver a secure, smart-contract-based p2p service that makes trades between two accounts secure, easy, and free of enormous fees (only 2%). By doing so, I offer a secure way to trade NFTs for EGLD between two accounts and allow huge deals to happen without being charged insane fees.
At the same time, I am sharing the code, since I see almost nobody publishing open-source projects on Elrond. I will make sure to be 100% transparent in my future projects in order to help new devs contribute to the network.
Don't hesitate to reach out on Twitter: @yum0ee
bdf64035b1dda4ef2b9a0932e1baf19410055717 | 396 | md | Markdown | _posts/2017-04-23-browser-animation-in-2017-by-birtles.md | jser/realtime.jser.info | MIT

---
title: Browser animation in 2017 by birtles
author: azu
layout: post
itemUrl: 'http://slides.com/birtles/browser-animation-2017'
editJSONPath: 'https://github.com/jser/jser.info/edit/gh-pages/data/2017/04/index.json'
date: '2017-04-23T13:30:07Z'
tags:
- JavaScript
- CSS
- animation
- slide
---
CSS Animations/Web Animationsについてのスライド。
Web Animations APIやフレームアニメーションについて、またその実装状況にについて
| 24.75 | 87 | 0.760101 | yue_Hant | 0.281633 |
bdf731ed786b7e9d1168608aee53d83a2843f273 | 315 | md | Markdown | README.md | tritonmedia/sync | Apache-2.0

# sync
Sync ensures that a directory on a local machine exactly matches a remote S3 bucket.
## Usage
Create a `config.yaml` in the same directory as a binary from [GitHub Releases](https://github.com/tritonmedia/sync/releases); see `config.example.yaml` for an example.
Then run `sync`.
## License
Apache-2.0
bdf92f5b70196e5142ce8d05025dffa2ab2f2f38 | 84 | md | Markdown | README.md | fabiotindin/template-node-ts-mongodb | MIT

# template-node-ts-mongodb
Template using Node.js + Express + TypeScript + MongoDB