hexsha
stringlengths 40
40
| size
int64 5
1.04M
| ext
stringclasses 6
values | lang
stringclasses 1
value | max_stars_repo_path
stringlengths 3
344
| max_stars_repo_name
stringlengths 5
125
| max_stars_repo_head_hexsha
stringlengths 40
78
| max_stars_repo_licenses
sequencelengths 1
11
| max_stars_count
int64 1
368k
⌀ | max_stars_repo_stars_event_min_datetime
stringlengths 24
24
⌀ | max_stars_repo_stars_event_max_datetime
stringlengths 24
24
⌀ | max_issues_repo_path
stringlengths 3
344
| max_issues_repo_name
stringlengths 5
125
| max_issues_repo_head_hexsha
stringlengths 40
78
| max_issues_repo_licenses
sequencelengths 1
11
| max_issues_count
int64 1
116k
⌀ | max_issues_repo_issues_event_min_datetime
stringlengths 24
24
⌀ | max_issues_repo_issues_event_max_datetime
stringlengths 24
24
⌀ | max_forks_repo_path
stringlengths 3
344
| max_forks_repo_name
stringlengths 5
125
| max_forks_repo_head_hexsha
stringlengths 40
78
| max_forks_repo_licenses
sequencelengths 1
11
| max_forks_count
int64 1
105k
⌀ | max_forks_repo_forks_event_min_datetime
stringlengths 24
24
⌀ | max_forks_repo_forks_event_max_datetime
stringlengths 24
24
⌀ | content
stringlengths 5
1.04M
| avg_line_length
float64 1.14
851k
| max_line_length
int64 1
1.03M
| alphanum_fraction
float64 0
1
| lid
stringclasses 191
values | lid_prob
float64 0.01
1
|
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4f9f4566264890d3e526e0acf99f295d969680de | 168 | md | Markdown | README.md | OxOtn3/Java_pl0 | cf287e67daa2a75cd38bdf0b65e7a2df4678d9f0 | [
"MIT"
] | 4 | 2022-03-11T02:40:34.000Z | 2022-03-21T04:16:29.000Z | README.md | OxOtn3/Java_pl0 | cf287e67daa2a75cd38bdf0b65e7a2df4678d9f0 | [
"MIT"
] | null | null | null | README.md | OxOtn3/Java_pl0 | cf287e67daa2a75cd38bdf0b65e7a2df4678d9f0 | [
"MIT"
] | null | null | null | # Java_pl0
Java 实现 pl0 编译器
参考了 [@Trouvaille0198](https://github.com/Trouvaille0198) 大佬的 [基于Go实现的pl0编译器](https://github.com/Trouvaille0198/Go-pl0-compiler)
对他表示由衷的感谢!
| 24 | 127 | 0.779762 | yue_Hant | 0.584869 |
4fa03781f92a9f73b480fb99717f9d30e2c79783 | 363 | md | Markdown | Radu Veronica/README.md | fulepiuliann/Arduino-projects | 0ae5718451622dc39f9b3b28f184c13a492e805e | [
"MIT"
] | 1 | 2018-06-03T13:38:07.000Z | 2018-06-03T13:38:07.000Z | Radu Veronica/README.md | fulepiuliann/Arduino-projects | 0ae5718451622dc39f9b3b28f184c13a492e805e | [
"MIT"
] | null | null | null | Radu Veronica/README.md | fulepiuliann/Arduino-projects | 0ae5718451622dc39f9b3b28f184c13a492e805e | [
"MIT"
] | 20 | 2018-05-29T12:10:47.000Z | 2019-06-23T06:25:08.000Z | # Imagine_Robotică

_
The project contains the following files:
Senzor_temperatură.ino is an arduino application that contains a temperature sensor that displays the temperature
#
Author: Radu (Ceaușu) Veronica - 2. Iune.2018
| 30.25 | 131 | 0.798898 | eng_Latn | 0.674801 |
4fa16c6c0a0e3602f346882358fd10e280160f2c | 205 | md | Markdown | README.md | mpicard/graphql-metrics-ui | 9bfe2f487835eedfbbdcff82dc51d6fc19628af0 | [
"MIT"
] | null | null | null | README.md | mpicard/graphql-metrics-ui | 9bfe2f487835eedfbbdcff82dc51d6fc19628af0 | [
"MIT"
] | null | null | null | README.md | mpicard/graphql-metrics-ui | 9bfe2f487835eedfbbdcff82dc51d6fc19628af0 | [
"MIT"
] | null | null | null | # Make you own GraphQL metrics dashboard
This is the react UI for my tutorial series on [How to make your own GraphQL metrics dashbard](http://all-loops-considered.org/2018/05/09/graphql-metrics-part-1/)
| 51.25 | 162 | 0.780488 | eng_Latn | 0.927819 |
4fa19b6b906bd3354b6a740413a9e7675d788d68 | 1,089 | md | Markdown | data/issues/ZF-11406.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 40 | 2016-06-23T17:52:49.000Z | 2021-03-27T20:02:40.000Z | data/issues/ZF-11406.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 80 | 2016-06-24T13:39:11.000Z | 2019-08-08T06:37:19.000Z | data/issues/ZF-11406.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 52 | 2016-06-24T22:21:49.000Z | 2022-02-24T18:14:03.000Z | ---
layout: issue
title: "ContextSwitch doc states incorrect XML content-type header"
id: ZF-11406
---
ZF-11406: ContextSwitch doc states incorrect XML content-type header
--------------------------------------------------------------------
Issue Type: Docs: Problem Created: 2011-05-24T23:38:52.000+0000 Last Updated: 2011-09-03T19:59:56.000+0000 Status: Resolved Fix version(s): - 1.11.11 (29/Sep/11)
Reporter: Phil Brown (philbrown) Assignee: Ramon Henrique Ornelas (ramon) Tags: - Zend\_Controller
Related issues:
Attachments:
### Description
The documentation for the [ContextSwitch helper](http://framework.zend.com/manual/en/zend.controller.actionhelpers.html#zend.controller.actionhelpers.contextswitch) states the XML context will set the content-type response header to text/xml.
This is incorrect as the code actually sets the header to application/xml.
See Zend/Controller/Action/Helper/ContextSwitch.php:149
### Comments
Posted by Ramon Henrique Ornelas (ramon) on 2011-09-03T19:59:56.000+0000
Fix r24438 merged to branch release 1.11 r24439.
| 30.25 | 242 | 0.71809 | eng_Latn | 0.586584 |
4fa1a9bda908c4f3cc6a663941ae4ee8b171cba3 | 1,944 | md | Markdown | README.md | Rebaiahmed/Alchaer_Backend | 74579fb5609230d1484eff67da059e6f8316fe44 | [
"MIT"
] | null | null | null | README.md | Rebaiahmed/Alchaer_Backend | 74579fb5609230d1484eff67da059e6f8316fe44 | [
"MIT"
] | null | null | null | README.md | Rebaiahmed/Alchaer_Backend | 74579fb5609230d1484eff67da059e6f8316fe44 | [
"MIT"
] | null | null | null | # Express & mongoose REST API
[](https://travis-ci.org/KunalKapadia/express-mongoose-es6-rest-api)
[](https://coveralls.io/github/KunalKapadia/express-mongoose-es6-rest-api?branch=master)
[](https://codeclimate.com/github/KunalKapadia/express-mongoose-es6-rest-api)
[](https://www.bithound.io/github/KunalKapadia/express-es6-rest-api-starter)
[](https://www.bithound.io/github/KunalKapadia/express-mongoose-es6-rest-api/master/dependencies/npm)
[](http://commitizen.github.io/cz-cli/)
[](http://opensource.org/licenses/MIT)
[](http://makeapullrequest.com)
[](https://www.paypal.me/KunalKapadia)
# [](https://github.com/KunalKapadia/express-mongoose-es6-rest-api)
## Overview
This is the API For Mobile Application
## Meta
Rebai Ahmed – [@RebaiAhmed_](https://twitter.com/RebaiAhmed_) – [email protected]
| 77.76 | 224 | 0.786523 | yue_Hant | 0.320036 |
4fa3a4e46c6316b5621c1d357e59c6ed7f3ea4ca | 3,435 | md | Markdown | sdk-api-src/content/wdsbp/nf-wdsbp-wdsbpinitialize.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/wdsbp/nf-wdsbp-wdsbpinitialize.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/wdsbp/nf-wdsbp-wdsbpinitialize.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
UID: NF:wdsbp.WdsBpInitialize
title: WdsBpInitialize function (wdsbp.h)
description: Constructs options for the WDS network boot program.
helpviewer_keywords: ["WDSBP_PK_TYPE_BCD","WDSBP_PK_TYPE_DHCPV6","WDSBP_PK_TYPE_WDSNBP","WdsBpInitialize","WdsBpInitialize function [Windows Deployment Services]","wds.wdsbpinitialize","wdsbp/WdsBpInitialize"]
old-location: wds\wdsbpinitialize.htm
tech.root: wds
ms.assetid: a77cbdf5-9025-4e98-8edd-1b9bae8493e7
ms.date: 12/05/2018
ms.keywords: WDSBP_PK_TYPE_BCD, WDSBP_PK_TYPE_DHCPV6, WDSBP_PK_TYPE_WDSNBP, WdsBpInitialize, WdsBpInitialize function [Windows Deployment Services], wds.wdsbpinitialize, wdsbp/WdsBpInitialize
req.header: wdsbp.h
req.include-header:
req.target-type: Windows
req.target-min-winverclnt: Windows Vista with SP1 [desktop apps only]
req.target-min-winversvr: Windows Server 2008 [desktop apps only]
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib: Wdsbp.lib
req.dll: Wdsbp.dll
req.irql:
targetos: Windows
req.typenames:
req.redist:
ms.custom: 19H1
f1_keywords:
- WdsBpInitialize
- wdsbp/WdsBpInitialize
dev_langs:
- c++
topic_type:
- APIRef
- kbSyntax
api_type:
- DllExport
api_location:
- Wdsbp.dll
api_name:
- WdsBpInitialize
---
# WdsBpInitialize function
## -description
Constructs options for the WDS network boot program.
## -parameters
### -param bPacketType [in]
The type of boot program. This parameter may have one of the following values.
<table>
<tr>
<th>Value</th>
<th>Meaning</th>
</tr>
<tr>
<td width="40%"><a id="WDSBP_PK_TYPE_WDSNBP"></a><a id="wdsbp_pk_type_wdsnbp"></a><dl>
<dt><b>WDSBP_PK_TYPE_WDSNBP</b></dt>
<dt>2</dt>
</dl>
</td>
<td width="60%">
Specify this value to build a boot program using options for the "wdsnbp.com" boot program. The "wdsnbp.com" boot program is the WDS network boot program for IPv4 PXE on legacy BIOS systems and does not support other systems.
</td>
</tr>
<tr>
<td width="40%"><a id="WDSBP_PK_TYPE_BCD"></a><a id="wdsbp_pk_type_bcd"></a><dl>
<dt><b>WDSBP_PK_TYPE_BCD</b></dt>
<dt>4</dt>
</dl>
</td>
<td width="60%">
Specify this value to build a boot program using the <b>WDSBP_OPT_BCD_FILE_PATH</b> option. It may be used with "wdsnbp.com" or other boot programs.
</td>
</tr>
<tr>
<td width="40%"><a id="WDSBP_PK_TYPE_DHCPV6"></a><a id="wdsbp_pk_type_dhcpv6"></a><dl>
<dt><b>WDSBP_PK_TYPE_DHCPV6</b></dt>
<dt>8</dt>
</dl>
</td>
<td width="60%">
Specify this value to indicate that the packet contains a path to a Boot Configuration Data (BCD) file. Use this value for any and all DHCPv6 options. The presence of this value indicates that the packet contains a path to a Boot Configuration Data (BCD) file.
</td>
</tr>
</table>
### -param phHandle [out]
A pointer to the handle to the packet. This handle can be used by the <a href="/windows/desktop/api/wdsbp/nf-wdsbp-wdsbpaddoption">WdsBpAddOption</a> function to add options for the WDS network boot program. After all the options have been added, use the <a href="/windows/desktop/api/wdsbp/nf-wdsbp-wdsbpgetoptionbuffer">WdsBpGetOptionBuffer</a> function to add these to the DHCP options list sent to WDS network boot program. The handle must be closed using the <a href="/windows/desktop/api/wdsbp/nf-wdsbp-wdsbpclosehandle">WdsBpCloseHandle</a> function.
## -returns
If the function succeeds, the return is <b>S_OK</b>. | 32.102804 | 557 | 0.750509 | eng_Latn | 0.581615 |
4fa3a816935afc1eeccf72606a249b8914f994d3 | 15 | md | Markdown | README.md | vega0447/kim | 9c522f7c9cecba920d7838d0d11440308f20017e | [
"Apache-2.0"
] | null | null | null | README.md | vega0447/kim | 9c522f7c9cecba920d7838d0d11440308f20017e | [
"Apache-2.0"
] | null | null | null | README.md | vega0447/kim | 9c522f7c9cecba920d7838d0d11440308f20017e | [
"Apache-2.0"
] | null | null | null | # kim
android2
| 5 | 8 | 0.733333 | tur_Latn | 0.885151 |
4fa5ad4eaf9efbb852b43b49beca884e1697acb6 | 112 | md | Markdown | LiTools.Helpers.Organize/README.md | LifeSocialLife/LiTools | 9dd17f4979a4e1a677489043aed192b194d33223 | [
"MIT"
] | null | null | null | LiTools.Helpers.Organize/README.md | LifeSocialLife/LiTools | 9dd17f4979a4e1a677489043aed192b194d33223 | [
"MIT"
] | null | null | null | LiTools.Helpers.Organize/README.md | LifeSocialLife/LiTools | 9dd17f4979a4e1a677489043aed192b194d33223 | [
"MIT"
] | null | null | null | # LiTools.Helpers.Organize
[Read dokumentation](/Documentation/Organize\LiTools.Helpers.Organize/README.md)
| 18.666667 | 80 | 0.803571 | yue_Hant | 0.893312 |
4fa5b1a795b779a6781ec8f52902af7b0a448ded | 1,501 | md | Markdown | README.md | danieltocado/Backend-Movies | 3491172cbf5889d7a7382263fcb514d7f04be6b7 | [
"MIT"
] | 1 | 2021-01-30T01:42:19.000Z | 2021-01-30T01:42:19.000Z | README.md | danieltocado/Backend-Movies | 3491172cbf5889d7a7382263fcb514d7f04be6b7 | [
"MIT"
] | 1 | 2021-05-11T17:48:59.000Z | 2021-05-11T17:48:59.000Z | README.md | danieltocado/Backend-Movies | 3491172cbf5889d7a7382263fcb514d7f04be6b7 | [
"MIT"
] | null | null | null | # Backend API MovieDB
A simple movie database backend getting the films from MoviesDB API.
## Getting Started
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
## Built With
* [NodeJS](https://github.com/topics/nodejs)
* [Nodemon](https://github.com/remy/nodemon)
* [Express](https://github.com/expressjs/express)
* [Sequelize](https://github.com/sequelize/sequelize)
* [Sequelize-CLI](https://github.com/sequelize/cli)
* [MySQL2](https://github.com/sidorares/node-mysql2)
* [BCryptJS](https://github.com/kelektiv/node.bcrypt.js)
* [JSonWebtoken](https://github.com/auth0/node-jsonwebtoken)
* [Axios](https://github.com/axios/axios)
### Prerequisites
First we need to install [NodeJS](https://nodejs.org/en/download/), which we will download from their website.
Then we will install these global dependencies.
```
npm i -g nodemon
npm i -g sequelize-CLI
```
### Installing
Start a NodeJS project.
```
npm init -y
```
Install dependencies.
```
npm install express
npm install sequelize
npm install mysql2
npm install bcryptjs
npm install jsonwebtoken
npm install axios
```
Create database and models.
```
sequelize db:create ...
sequelize generate:model --name ... --attributes ...
```
## Running the tests
We will test the backend with [Postman](https://www.postman.com/downloads/).
## License
This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details.
| 23.825397 | 129 | 0.734843 | eng_Latn | 0.600264 |
4fa6861b257f33abce7d74b985ff91a84cf5762a | 109 | md | Markdown | README.md | oamam/ipr | e53641728eeac4e59f21d0823037367e21fcc484 | [
"MIT"
] | null | null | null | README.md | oamam/ipr | e53641728eeac4e59f21d0823037367e21fcc484 | [
"MIT"
] | null | null | null | README.md | oamam/ipr | e53641728eeac4e59f21d0823037367e21fcc484 | [
"MIT"
] | null | null | null | # ipr
This is a library that allows you to get information related to your IP(v4) address.
## License
MIT
| 13.625 | 84 | 0.733945 | eng_Latn | 0.999854 |
4fa727db269c74340727c60a4054cf28bd1f4f9c | 585 | md | Markdown | README.md | TracyZhangLei/AwesomeLogger | 5322b0c2a5030263b4fdfc93f6046b4f8b14dda1 | [
"Apache-2.0"
] | 3 | 2015-12-19T05:59:30.000Z | 2017-02-06T10:31:54.000Z | README.md | TracyZhangLei/AwesomeLogger | 5322b0c2a5030263b4fdfc93f6046b4f8b14dda1 | [
"Apache-2.0"
] | null | null | null | README.md | TracyZhangLei/AwesomeLogger | 5322b0c2a5030263b4fdfc93f6046b4f8b14dda1 | [
"Apache-2.0"
] | null | null | null | # AwesomeLogger
像系统自带Log一样简单的调用,日志形式更加清晰易懂,json化的数据可以直接打印出来,省去转换的麻烦,试试吧。
####引入:
compile 'cc.tracyzhang:AwesomeLogger:1.0.0'
####用法:
firstly,use soft wraps in logcat to make sure the log shown in one line,like that:

then:
* ALogger.d(tag,obj)
* ALogger.i(tag,obj)
* ALogger.w(tag,obj)
* ALogger.e(tag,obj)
####示例:
</p>

| 24.375 | 96 | 0.748718 | yue_Hant | 0.628857 |
4fa77835865b37b375f7a89210fff5aa75bfba59 | 1,022 | md | Markdown | README.md | nischalbasuti/nischalbasuti.github.io | 94559b3a4747f6a45309d65f05cca8e4710dbbdf | [
"MIT"
] | null | null | null | README.md | nischalbasuti/nischalbasuti.github.io | 94559b3a4747f6a45309d65f05cca8e4710dbbdf | [
"MIT"
] | null | null | null | README.md | nischalbasuti/nischalbasuti.github.io | 94559b3a4747f6a45309d65f05cca8e4710dbbdf | [
"MIT"
] | null | null | null | # Selected Repos
Here are some personal projects I've worked on, please hire me :).
## [Toy Game Engine](https://github.com/nischalbasuti/ToyGameEngine)
A 2D game engine for browsers, written in JavaScript(ES6) and HTML5
## [knowtify](https://github.com/nischalbasuti/knowtify)
An android mulitcasting application, built using Android SDK, Java, Firebase Cloud Platform, node.js.
## [bitman](https://github.com/nischalbasuti/bitman)
A simple game written with libgdx for android
## [.vim](https://github.com/nischalbasuti/.vim)
This is my rc file. There are many like it, but this one is mine. Without me, my rc file is useless. Without my rc file, I am useless.
## [clothe-right](https://github.com/nischalbasuti/clothe-right)
Convolutional neural network to detect clothes.
## [ShadowDetector](https://github.com/nischalbasuti/ShadowDetector)
Convolutional neural network to detect shadows.
## [bulletgl](https://github.com/nischalbasuti/bulletgl)
A thin wrapper over the Bullet Physics engine and OpenGL
| 31.9375 | 134 | 0.764188 | eng_Latn | 0.850789 |
4fa78b999d9d1bcacf884862b3f5bda92edf7927 | 1,410 | md | Markdown | README.md | tckz/rediscomp | 5b0566f5b946b58f616ef0ef3624c94a470582e4 | [
"BSD-2-Clause"
] | null | null | null | README.md | tckz/rediscomp | 5b0566f5b946b58f616ef0ef3624c94a470582e4 | [
"BSD-2-Clause"
] | null | null | null | README.md | tckz/rediscomp | 5b0566f5b946b58f616ef0ef3624c94a470582e4 | [
"BSD-2-Clause"
] | null | null | null | # rediscomp
Compare content between two redis.
Currently supports String/Hash only.
## Command line
```text
$ rediscomp [options]
Options:
-dst value
host:port of redis. More than once flags can be specified.
-fetch-count int
Number of records to scan at once (default 10000)
-idle-timeout-sec int
Seconds of pooling idle timeout for redis (default 100)
-parallel int
Number of workers (default 8)
-pool-size int
Size of pool for redis (default 30)
-read-timeout-sec int
Seconds of read timeout for redis (default 5)
-reverse
Reverse src and dst
-scan string
Match pattern for scan()
-src value
host:port of redis. More than once flags can be specified.
-version
Show version
```
* One or more --src and --dst must be specified.
* Example: Compare redis on 127.0.0.1:6379 and 192.168.1.11:6379
Both of redis are single server.
```bash
$ rediscomp --src 127.0.0.1:6379 --dst 192.168.1.11:6379
```
* Example: Src redis is cluster mode
```bash
$ rediscomp --src 127.0.0.1:16379 --src 127.0.0.1:16380 --src 127.0.0.1:16381 --dst 192.168.1.11:6379
```
# Build
```bash
$ dep ensure
$ make
```
## Requirement
* Go 1.9
* dep
* git
* make
# My Environment
* macOS High Sierra
* Go 1.9.1
* dep 0.3.1
* git 2.13.5 (Apple Git-94)
* GNU Make 3.81
# License
BSD 2-Clause License
SEE LICENSE
| 19.315068 | 103 | 0.651064 | eng_Latn | 0.837112 |
4fa834a6ee5bf1b543c4a70212d60ddeedf996d9 | 659 | md | Markdown | README.md | GrapeColor/tweet_expander | 6713895f6b668fd4e042ddb65c487191622a3fd9 | [
"MIT"
] | null | null | null | README.md | GrapeColor/tweet_expander | 6713895f6b668fd4e042ddb65c487191622a3fd9 | [
"MIT"
] | null | null | null | README.md | GrapeColor/tweet_expander | 6713895f6b668fd4e042ddb65c487191622a3fd9 | [
"MIT"
] | null | null | null | # Tweet Expander
Discord用ツイートコンテンツ展開BOT
## できること
- ツイートの引用コンテンツの展開
- ツイートのスレッド展開
## 使い方
### 引用ツイート・コンテンツの展開
```!https://twitter.com/~```
ツイートのURLの直前に「!」を付けます。
ツイートに2枚以上の画像を含む場合は、2枚目以降の画像を表示します。
画像コンテンツと引用ツイートの両方が含まれる場合は、引用ツイートの展開が優先されます。
### ツイートのスレッドを展開
```ツイートID!https://twitter.com/~```
ツイートのURLの直前に、表示したいスレッドの**最後のツイート**のID(数字)と「!」を付けます。
スレッドの長さは最大25ツイートです。
### 展開コンテンツの削除
コマンドを実行した元のメッセージを削除します。
この機能はコマンドを実行した直後のみ使用できます。
## サーバーへの導入方法
以下のURLからサーバーに導入できます。
https://discordapp.com/api/oauth2/authorize?client_id=629507137995014164&permissions=19456&scope=bot
BOTの動作には最低でも以下の権限が必要です。
- テキストチャンネルの閲覧(メッセージを読む)
- メッセージを送信
- 埋め込みリンク
| 20.59375 | 100 | 0.770865 | yue_Hant | 0.448334 |
4fa8718846133c6f83b039683d221a41761711c1 | 1,493 | md | Markdown | _posts/2021-08-18-next_q_2.md | changbaebang/changbaebang.github.io | 1b1963d47ad84fdee08138e01526b5702a1d8197 | [
"MIT"
] | null | null | null | _posts/2021-08-18-next_q_2.md | changbaebang/changbaebang.github.io | 1b1963d47ad84fdee08138e01526b5702a1d8197 | [
"MIT"
] | null | null | null | _posts/2021-08-18-next_q_2.md | changbaebang/changbaebang.github.io | 1b1963d47ad84fdee08138e01526b5702a1d8197 | [
"MIT"
] | null | null | null | ---
layout: post
title: "이제 뭘할까?2"
date: 2021-08-18 02:00:00 +0900
author: Changbae Bang
tags: [next, post, ]
---
# 들어가면서
이런 뱉어 내는 식의 포스트는 자제하려고 하지만 무언가 박제 해놓고 다음을 정해보자는 의미로 간단하게 글을 끄적여 보기로 한다.2
역시 나는 깊어 지기가 어렵나보다..
최근 Linkedin 의 강의를 두 개 들었다.
하나는 Spring Boot 기초, 다른 하나는 Redis 기초
하고 싶은 것들이라고 정의 해 놓은 것에 필요한 것들인데 뭐 이건 이것 대로 듣고 다음은 docker 와 같은 컨테이너들을 들을까 싶고..
곁가지(?)로 무료로 `react` 에 대한 기초 내용을 올려 볼까 한다.
# 하고 싶은 것들2
* 주식 알림 시스템
* Safe link 서비스
* Spring
* React 기초 강의(?)
# 뭐냐면? 왜냐면?
아무래도 요새는 모두가 `오차장` 이라고 스스로를 칭하는 사람도 많다 - 서울 스퀘어에서 했던 야근이 생각나는 그 드라마를 실수로 다시 아주 조금 봤다.
스스로를 시니어라고 하면서 막 뭔가 퍼줄것 같이 하지만 결국 자기 강의를 올리는 사람 - 뭔가 얼굴에만 시니어 티를 올리는 사람 - 도 많다. 뭐 결국 돈이 중한 것이니 경제 활동에 대해서 뭐라고 할 말은 없지만, 그래도 뭔가 내가 업계의 뭐다라고 업적을 쌓으면서 다른 사람이 조금씩 모아 준 업 - 원기옥 아님 - 을 가지고 본인의 인생의 쾌적함만을 집중하는 건, 혹 관심을 계속 그냥 먹고 관심 비만아가 되는 건.. 그런건 좀 아닌 것 같다. - 아 살 빼야 하는데.. 오늘도 야식으로 라면을 먹었다.
무튼 나름의 반성도 있고 여자저차하여...
왜써요? 뭐부터봐요? 를 중심으로한
당연히 택스트와 코드를 기반으로 한
강의 자료를 올려 볼까 한다.
물론 내가 Front End 에 깊은 내공이 있는 건 아니다. 그리고.. 내가 이걸 꼭 할 수 있을 지도 모르겠다.
# 대강 생각하는 흐름은?
* 웹(?)서비스에서의 `React` 의 위치 - 게이트 있고 뒤에 FE 있고 FE 들이 API 호출 하고 그 뒤에 막 더 있고 등등
* 앞에는 앞사람에게 맡기고 뒷단은 뒷사람에게 맡길 때 필요한 것들
* `React` 선정 이유
* Hook
* `Redux` 선정 이유
* 함께 쓰는 것들
* UI? UX?
* Sytled component 혹은 그 비슷한 것들? less sass 는 아닐 것 같고
* 일단 돌아가야할까? 잘 미리 구성해야 할까?
* 상속에 대한 기준
* 배포? ~~AWS EC2?~~ Azure VM? 아니지 이건 App Service 로 가야지ㅎㅎ
* 어디까지 기술 범위를 잡고 일을 할 것인가? 일을 혼자는 다 할 수 없으니까?
정도를 그냥 정리를 마구 해보고 일단 묻어 두어 본다.
# 다음은?
할랑가 모른다.
하지만 나는 아직 `Rust` 를 보고 있지는 않다.
| 24.47541 | 282 | 0.638312 | kor_Hang | 1.00001 |
4fa9d7908836ccd7f73f8ab5994160ad9655a503 | 3,656 | md | Markdown | README.md | codymurphyjones/expo-link-preview | 8e483ef4a9f19faea6093e18224b9e3c91212e65 | [
"MIT"
] | 11 | 2022-02-27T23:05:33.000Z | 2022-03-26T20:35:33.000Z | README.md | codymurphyjones/expo-link-preview | 8e483ef4a9f19faea6093e18224b9e3c91212e65 | [
"MIT"
] | 3 | 2022-03-01T12:36:49.000Z | 2022-03-07T21:29:52.000Z | README.md | codymurphyjones/expo-link-preview | 8e483ef4a9f19faea6093e18224b9e3c91212e65 | [
"MIT"
] | 1 | 2022-03-05T08:41:44.000Z | 2022-03-05T08:41:44.000Z | # expo-link-preview
Render URL links for Web & Twitter previews

Built with `react-native` using `expo`.
## Installation
```
# yarn
yarn add expo-link-preview
# npm
npm install expo-link-preview --save
```
Then import with
```js
import LinkPreview from "expo-link-preview";
```
Note: You may also need to install dependencies `react-native-opengraph-kit` and `javascript-time-ago`.
## Usage
Example:
```js
import { View, StyleSheet } from "react-native";
import LinkPreview from "expo-link-preview";
export default function App() {
return (
<View style={styles.container}>
<LinkPreview
link="http://github.com"
/>
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
},
});
```
## Web
Web previews will automatically use OpenGraph information to populate the preview. If an image is present, LinkPreview will default to that, otherwise show a non-interactive web view of the page.

## Twitter
Twitter previews will automatically be used if the OpenGraph response returns `site_name=Twitter`. The Twitter preview automatically adjusts to handle basic tweets and up to four images.

## Props
| Prop | Required? | Type | Description |
| --- | --- | --- | ---------- |
| link | true | string | The link to render the preview. Links are automatically validated, but should be passed as a string value that begins with `"https://"`. |
| onPress | false | function | The onPress function is called whenever a user taps the preview. |
| containerColor | false | string | The background color of the preview container. Defaults to iOS themed component. |
| borderColor | false | string | The border color of the preview container. Defaults to iOS themed component. |
| titleColor | false | string | The text color of the `Header` – typically the website title or the Twitter user name. Defaults to iOS themed component. |
| textColor | false | string | The text color of the `Text` – typically the website description or the Twitter user handle and tweet. Defaults to iOS themed component. |
## Twitter-specific props
| Prop | Required? | Type | Description |
| --- | --- | --- | ---------- |
| twitterLogoColor | false | string | The tint color of the Twitter logo. Defaults to Twitter Blue. |
| showLikes | false | bool | Enable/disable the `Likes` counter. Defaults to `true`. |
| showRetweets | false | bool | Enable/disable the `Retweets` counter. Defaults to `true`. |
| showReplies | false | bool | Enable/disable the `Replies` counter. Defaults to `true`. |
## Color example
Example:
```js
import { View, StyleSheet } from "react-native";
import LinkPreview from "expo-link-preview";
export default function App() {
return (
<View style={styles.container}>
<LinkPreview
link="http://github.com"
containerColor={"pink"}
titleColor={"blue"}
textColor={"yellow"}
borderColor={"red"}
/>
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
},
});
```

Built by [Tyler J.](https://tylerj.me) ✌️
| 35.153846 | 195 | 0.709245 | eng_Latn | 0.794009 |
4faa2a66ebcecdb081e954749af7a2f5044b2b1f | 436 | md | Markdown | README.md | RPG-7/Commie-s_EECS_Workshop | d752dd725c6a7a7e90ea39596e9ad5a9c3fb355b | [
"CC0-1.0"
] | null | null | null | README.md | RPG-7/Commie-s_EECS_Workshop | d752dd725c6a7a7e90ea39596e9ad5a9c3fb355b | [
"CC0-1.0"
] | null | null | null | README.md | RPG-7/Commie-s_EECS_Workshop | d752dd725c6a7a7e90ea39596e9ad5a9c3fb355b | [
"CC0-1.0"
] | null | null | null | # Commie's EECS Workshop
康米的电子技术杂谈是一个免费,面向于高中起点学生的电子技术教学项目。这个repo用于盛放该项目的课件与教学工程
拟定教材:
EE:
电路分析:《电路分析》胡翔骏
电磁学:《电磁学》赵凯华
模拟电路:《模拟电子技术基础》 童诗白 华成英
高频电子线路:
信息论: 《信息论基础》Thomas M. Cover
信号与系统:《信号与系统》Alan V. Oppenheim
模拟集成电路:《模拟CMOS集成电路设计》Behzad Razavi
CS:
数字电路与硬件描述语言: 《数字电路基础》阎石
《Verilog HDL高级数字设计(第二版)》 Michael D. Cilett
微机原理:《计算机组成与设计 硬件/软件接口》 D. Patterson,J. Hennesy
《计算机体系结构 量化研究方法》J. Hennesy, D. Patterson
TBD,预计12月开始
| 12.457143 | 55 | 0.75 | yue_Hant | 0.395469 |
4faa360e41995e3385bdb13ab59f7e221a4cfd94 | 227 | md | Markdown | README.md | SUPINFOAppFactory/opentablePlugin | 9516afc3868161c5db39ca6ebaaeab1219db1a7c | [
"MIT"
] | null | null | null | README.md | SUPINFOAppFactory/opentablePlugin | 9516afc3868161c5db39ca6ebaaeab1219db1a7c | [
"MIT"
] | null | null | null | README.md | SUPINFOAppFactory/opentablePlugin | 9516afc3868161c5db39ca6ebaaeab1219db1a7c | [
"MIT"
] | null | null | null | # Open Table Plugin
Simply enter your OpenTable page URL to allow your users to schedule reservations right through your app.
### Plugin Overview
http://support.appdocumentation.com/knowledge-base/opentable-plugin-tutorial
| 32.428571 | 105 | 0.806167 | eng_Latn | 0.702733 |
4fab2b29ff60a7e6e819eb8cac32cdb8eef116b2 | 268 | md | Markdown | README.md | Mrnerdcon01/Projeto2Web | 215b0d943fbae684d12b80d1afdeb5ea429c610d | [
"CC0-1.0"
] | 1 | 2021-03-11T11:52:33.000Z | 2021-03-11T11:52:33.000Z | README.md | guilhermefcs7/Projeto2Web | 215b0d943fbae684d12b80d1afdeb5ea429c610d | [
"CC0-1.0"
] | null | null | null | README.md | guilhermefcs7/Projeto2Web | 215b0d943fbae684d12b80d1afdeb5ea429c610d | [
"CC0-1.0"
] | null | null | null | 

# Projeto2Web
| 67 | 126 | 0.802239 | yue_Hant | 0.177403 |
4fab51f58eff10c65d86d193d4c72164f27b6021 | 1,760 | md | Markdown | README.md | iukpo/simple_UE_megascan_material_shader | d7a5cd9d95257b413f089f61381571ac9724b510 | [
"MIT"
] | 4 | 2020-12-19T17:27:55.000Z | 2021-04-13T18:23:30.000Z | README.md | iukpo/simple_UE_megascan_material_shader | d7a5cd9d95257b413f089f61381571ac9724b510 | [
"MIT"
] | null | null | null | README.md | iukpo/simple_UE_megascan_material_shader | d7a5cd9d95257b413f089f61381571ac9724b510 | [
"MIT"
] | null | null | null | # A Simple Megascan Material Shader for Unreal Engine
This is a simple material shader written for use with Megascan assets in Unreal Engine. It is written in Blueprint for use in Unreal Engine 4.24, but should be scalable for use in versions 4.25 and later (if not immediately usable).
## What is a Megascan?
A megascan is a high-resolution scan of a real world object. Megascans can be used to create remarkably photorealistic scenes in Unreal Engine.
## What does this shader do?
This shader allows one to change the materials properties of the megascan. While this shader allows one to change ambient occlusion, specular reflectance, roughness, and cavity properties of the megascan material, it can be extended to provide support for more properties!
## How does it work?
The heart of the shader itself is the UE4 blueprint that provides both the underlying shader math and the user interface (UI) code that allows users to easily change material properties in the Editor. As the values are changed in the editor, they are used as inputs to update the shader. The shader then applies the updated values to the material instance, resulting in an updated look in real time.

## Warning!
To comply with the terms of the [Unreal Engine EULA](https://www.unrealengine.com/en-US/eula/publishing), this project can be used in UE4/Twinmotion **ONLY**.
## Credits
Special thanks for Michael Gerard for his udemy course on megascan shaders. Please check out his LinkedIn page [here](https://www.linkedin.com/in/michael-gerard-56537113a/).
And, as well, the good people at [Quixel](https://www.quixel.com/) and [Unreal Engine](https://www.unrealengine.com/) for making this all possible. ;)
| 60.689655 | 399 | 0.782955 | eng_Latn | 0.997759 |
4fab571c8c846f4a187782e0f38fdf7fcade2023 | 55,454 | md | Markdown | tests/snapshots/fictional.test.js.md | oftherivier/determined | d019ba44dde80ad5dcd6fb061014a49eef82fea2 | [
"MIT"
] | null | null | null | tests/snapshots/fictional.test.js.md | oftherivier/determined | d019ba44dde80ad5dcd6fb061014a49eef82fea2 | [
"MIT"
] | null | null | null | tests/snapshots/fictional.test.js.md | oftherivier/determined | d019ba44dde80ad5dcd6fb061014a49eef82fea2 | [
"MIT"
] | null | null | null | # Snapshot report for `tests/fictional.test.js`
The actual snapshot is saved in `fictional.test.js.snap`.
Generated by [AVA](https://avajs.dev).
## generated values
> Snapshot 1
[
{
bool: true,
char: 'v',
dateString: '1989-02-14T01:52:57.000Z',
float: 2766559028.1786575,
hash: 3112761889,
int: 3112761889,
join: 'Privet Drive',
oneOf: 'blue',
oneOfWeighted: 'green',
paragraph: 'Vasotaꝁi ņiceamirae kiyu hy mutaṁura, me yuko mamuami menako kokiṉ muviyuso na. Tako yo nẚmu mevamuka shi hasoṁiyu tashi ṙa. Ǹavi vahyrake mu somaniȼhi ḣavi mukai, væ ꝁai muċea ꞧayura ranokin. Yộ tamemoha mamemi miceakoceạ ḿi, hamora ceahyma ḿisoni shina niva. Noa meȟy mẳ ḩami nachikoyu ka yukįncea yumerae, yomą mețani vinoshi naniha chiceamuna nimoyuchi koḵinki. Noke musorẩevi kinmo kai ňomumevi ni nomayu yṍ, munaniha kacềake momä vinokinki yura. Rae kincea ceamoyu vachiayo ceake, chiceani chirahÿchi chikohame yo yumemumi makechi meta.',
sentence: 'Vivano mera nanic̈hiha ṅi moshiyu sochi kinrâme kinmḁ.',
shape: {
a: 'Vaviƴu',
b: 'Soḵani',
},
someOf: [
'think',
'too',
'i',
'you',
'colors',
],
times: [
'Ḉhihy',
'Ċhime',
],
tuple: [
'Privet',
'Drive',
],
word: 'Mokḁi',
words: 'Hỹka ṩo màyo',
},
{
bool: true,
char: 'Z',
dateString: '2011-08-16T07:33:14.000Z',
float: 247941005.65535155,
hash: 3637691191,
int: 3637691191,
join: 'Parkway Road',
oneOf: 'green',
oneOfWeighted: 'green',
paragraph: 'Takeyu ʋashiyo vani kairaeha nanoke kakinceamu vi. Vayokinshi kekochi kinmokai shira ramehy yuaki koke koȑae, naviyu viramokḝ yukami ćhi va kovinỏkai kinṩo kẳiceasoma. Mi nokin kenavihy ra vayo yuhyno, hya miṁe mihyra kakoviso tā̀ nashichîka ceamokin kaika. Moċeami ka taceặme kë r̃aekai r̃ae, maki nahynona kisochishi nasȍke sovi kakena muhy shi. Miẩ momi köhahyka ṙa koviyu raceamiva, hanö sotachḯ vakira ni ranitahy mashira ke nikě. Kiraeshi mushi vamo mố raeyokinḫy, ýusoni yų́ka mu yukin kaí tacea nasochinȯ nika.',
sentence: 'Nakễ kề masoǹako ʋikeki kinṿa ma raeshï m̃unoniko, ceahyke ḵo soyukaiha hahy chi.',
shape: {
a: 'Mậyuma',
b: 'Raevī̀',
},
someOf: [
'256',
'i',
'too',
'is',
'that',
],
times: [
'Ceamem̃o',
'Mahykȱna',
],
tuple: [
'Parkway',
'Road',
],
word: 'Ỳomi',
words: 'Shihyķeki kaṅome ꝁi',
},
{
bool: true,
char: '7',
dateString: '2013-10-18T09:35:02.000Z',
float: 1005905708.2690269,
hash: 1415325033,
int: 1415325033,
join: 'Cherry Street',
oneOf: 'green',
oneOfWeighted: 'blue',
paragraph: 'Kairae kiraemi rayomekin mekova yu raehavi cé̩amu, shirȃ koțaki mokeshia nimẳ muceamo kahyḵiso. Yồrano vimo mu tashirahŷ ka m̃asonami ma, raeyumeṉa kaishịta chimunichi hẙsoma hyke haniramu memẳ makotashi. Chimu ko makȉn vasȱka rae yusocea, mi yochi ras̈hiacea raevako komuṽi mimoraeka raeyu manitaŝhi. Hâmu ṁe kimaýu tanó ranoka, meamo mo vą́ va ceahƴ. Kechikai yushi sohysovi ƴu c̈ea meniŝo ki. Ceakiṉ ceanihaso shihỵ nika ko hą̃. Kenashimu ṁoceahy rae meyo kakin ceanir̃a momichi hyrae, shi mokemi memuhykai kehyķin raekoani makevam̃e.',
sentence: 'Kekaihy sokakeki ḿushi rậchiso ke kaia.',
shape: {
a: 'Cĥishi',
b: 'Yokaiyụka',
},
someOf: [
'are',
'that',
'i',
],
times: [
'Kinakɨha',
'Hƴkaika',
'Vinoraehẙ',
'Hykinṃo',
'Maħy',
],
tuple: [
'Cherry',
'Street',
],
word: 'Ǹikeni',
words: 'Hynẩmo kekaceẚ ŝhia',
},
{
bool: false,
char: 'I',
dateString: '1996-01-01T00:46:58.000Z',
float: 553741175.7373961,
hash: 4124438976,
int: 4124438976,
join: 'Cherry Road',
oneOf: 'green',
oneOfWeighted: 'red',
paragraph: 'Mậnimu kaì vikaī̀nako nǫ rayume vi, kìyo mo ɍami na sḧi. Morakekai kekiceḁ ha hayū̀yoke hyna norakė́ka nakinyū. Shī noȼhi no mimo so hyvike. Vimoramo ka kachi kayo micea kamochira kinÿomu shiraemuma. Miname yȏ yoma kinkiko soƙairae moḿayu miayo råekorashi, kahychḯ mo moha raemivako shi yumenirae. Ke mẳkishira ki motam̃i kekin vaceamerae, mesomi shɨka yu vakimo kiǹko kano ŝo kačea.',
sentence: 'Kitavira memu sokinkaiva kakeràe mekai chimomuso yokiva.',
shape: {
a: 'Ceaħyni',
b: 'Mekaceamừ',
},
someOf: [
'think',
'colors',
'too',
'beautiful',
'are',
],
times: [
'ᶄakinyukai',
'Raʋimume',
],
tuple: [
'Cherry',
'Road',
],
word: 'Hykinȓaema',
words: 'Raȇnikai yuńaso',
},
{
bool: true,
char: 'b',
dateString: '2011-08-20T07:41:00.000Z',
float: 3711525276.117791,
hash: 142894351,
int: 142894351,
join: 'Privet Street',
oneOf: 'blue',
oneOfWeighted: 'red',
paragraph: 'Noshirae ke moceayú viko kohaḩy kiraemu mȗ, kin takinsokin taso ḱo nayucea rameyu cḣi. Kaisoceashi ṁemi nashiyokế ki nitakemo keso. Ꞧa ceakihyrấe ramicea kőshi kaivayū̃ mȫ vike kechinokin, noǹi kaiyoma kâmoha šhisohy tấ kiraketa kairayo. Cȅahy meraehakin ceamokẩyo tame rȃemisoyu raeyumo keki va. Taviva kai na ḉhi chita. Nacea nờ yưhynacea raṉi ceaƴuke ṥoyu kinvaceaki shi, raevi hyko nakê̌mokai ceẳ yuvimuǩi.',
sentence: 'Morae ta yoha ceaḳinhano ráchi ni sovime ketakoha, śorae muso kakoceā̀ vishihame niceayu.',
shape: {
a: 'Yovaceḁ',
b: 'Ceakḯ',
},
someOf: [
'i',
'colors',
'is',
'are',
'too',
],
times: [
'Keşhikai',
'Ʋavichia',
'Chisokîchi',
'Ḱira',
],
tuple: [
'Privet',
'Street',
],
word: 'Ṁoviceayo',
words: 'Hyḿura viṃe hykekiǹme',
},
{
bool: true,
char: 'Z',
dateString: '1997-10-06T10:01:03.000Z',
float: 2406401323.174411,
hash: 39814017,
int: 39814017,
join: 'Cherry Street',
oneOf: 'green',
oneOfWeighted: 'red',
paragraph: 'Shi kaiyų̃no shita kikaiǎyo tǡ kinḳikani shinota. Vamerȧe ŕa yuhykaikḕ kemota ke raeni, muvą̃ sotamu nikonami no ᶄemi. Chicea cea me yoshi yṏvamu.',
sentence: 'Hynorae yo kinceẫ kaichihymụ no m̃inonaso, ceashiva kȧike hykani kạ rae.',
shape: {
a: 'Hamichī',
b: 'Kevæ',
},
someOf: [
'it',
'too',
'you',
'think',
],
times: [
'Raḳi',
'Ṁimea',
],
tuple: [
'Cherry',
'Street',
],
word: 'Taceaꝁin',
words: 'Ceḁ nǐna ꝁi',
},
{
bool: false,
char: 'E',
dateString: '1980-01-21T00:07:20.000Z',
float: 2714068071.307764,
hash: 1482841800,
int: 1482841800,
join: 'Privet Street',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Raëyoki soshiceaso ki ṇomi ceakḕ kin tạkin ki, rẩe chikaƙicea hỷkin hyhachiha viyửke hy kaicea kḗ. Muvi mekaitam̃i kehynoḥy kę kaso, vaceăno ńi cea yó cẹa. Yȱ makẻ ᶄayochimu kaiḫa ninotamu ha, nasokaiyu shi nome sokinhyso kavimo hyma.',
sentence: 'Vi kishikai na shihyso kovi hŷshiso.',
shape: {
a: 'Moraƙin',
b: 'Yuṣhimura',
},
someOf: [
'think',
'colors',
'you',
],
times: [
'Vīkeno',
'Vąyo',
],
tuple: [
'Privet',
'Street',
],
word: 'Kaihyrǎe',
words: 'Kinrą́ceami kaiṃuke',
},
{
bool: true,
char: 'L',
dateString: '2019-04-16T15:43:20.000Z',
float: 2493339650.2670717,
hash: 3515130879,
int: 3515130879,
join: 'Privet Drive',
oneOf: 'red',
oneOfWeighted: 'green',
paragraph: 'Haki ta kenihy kǻ mȃceayocea kanohy menachi mamikeḱi. Ṛa kokikai chivachiva tą ceanohachi vakehy hachi, yuvavi takinmayu mehyhẳmu rẫmohykai ṇiraekinchi. Shinacea kinmachikai kisoả hyvayômu kîna hayońi. Meyumo ko ham̃era musokena meñi hasovano mi, kechiko yume sorachï hḁ chivaso kechi ta. Vitavia yo mekokin ta rakeyu mevakin vi, hayumiko ḿako ceavihẩ kinmichi memu mȧ mū̃. Hamirae ṽi shimeshi sorae rakomachi yumom̃ucea, ta mokai keyocea yo kinṃeni komiyuyo. Mokaita ko ṅoyu hyrae ha, kohymi ma ceanonami ķin kiyoame ra kin.',
sentence: 'Makirae ṋi nokaɨ mako ȼhi.',
shape: {
a: 'Kokaiţa',
b: 'Novå',
},
someOf: [
'i',
'it',
'that',
],
times: [
'Chiyoṱa',
'Raekiaƙi',
'Kakeṟa',
'Hyắ',
'Ñonaceayo',
],
tuple: [
'Privet',
'Drive',
],
word: 'Ɍavi',
words: 'Noķi somễa raḿe',
},
{
bool: true,
char: 'B',
dateString: '2001-06-10T05:42:40.000Z',
float: 818270688.3671304,
hash: 3531449021,
int: 3531449021,
join: 'Privet Road',
oneOf: 'green',
oneOfWeighted: 'blue',
paragraph: 'Shike ṃi mŭkoka kimu mochi kaḱehy, vaceamota ʋavi kai yukinshi yu mokayo keayo. Mokḯnmukin rahyceẩ nome hamovami shikai kḕka ḿi kishikamu. Hyᶄinvamu muchi ka vahynoma moni. Ko miashí ceashiko nikeřacea nomaso kikai. Noceame kẚ kekai raé hy ceả, kaiműrame hyniḣavi yohy hy rano kamiyokë vaso.',
sentence: 'Mokira va hanora haħya muchiame no raė́kai ra.',
shape: {
a: 'Vĩshichicea',
b: 'Keaṃuke',
},
someOf: [
'too',
'it',
'that',
'think',
'beautiful',
],
times: [
'Mặyo',
'Rakinshỉ',
'Taḿunashi',
'Miṿayoni',
'Tamerasẖi',
],
tuple: [
'Privet',
'Road',
],
word: 'Yoaƴoshi',
words: 'Ṁimeyu ỳu raekaȉmu',
},
{
bool: true,
char: '7',
dateString: '1989-02-26T01:49:11.000Z',
float: 4242705169.422357,
hash: 2601797209,
int: 2601797209,
join: 'Cherry Drive',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Nia kamoha shi shinovino mu kiņyuna kinomamo muahy, vitake va yomunirae mṵkaima ṃi. Ke vimashiha keṃohyme kaceamu soamu, kashiyocḩi raemomume ni vishi kokaso tacęa yu miahy. Nameva ŕa vashi kinsokai mahy memayoha, kin mû vaceamïkai hacea čeasohy. Ṋi niso c̈eani shikẳko muke vinome, tayoṁiyu me chihỹ nona no munamīyo shïmu.',
sentence: 'Mokin ni mèyuyo kayo yu tầmeraena, ḿekishi meᶄai kai mohaṁuso hynoyuchī takinkai.',
shape: {
a: 'Kồyusoha',
b: 'Ẏorae',
},
someOf: [
'think',
'too',
'beautiful',
'256',
'you',
],
times: [
'Kaĩtashiyu',
'Nąmu',
'Kinkï',
],
tuple: [
'Cherry',
'Drive',
],
word: 'Kimờ',
words: 'Yumã mȏ ṛakin',
},
{
bool: true,
char: 'v',
dateString: '1993-10-02T21:46:12.000Z',
float: 2282064061.2818904,
hash: 4294071573,
int: 4294071573,
join: 'Privet Drive',
oneOf: 'green',
oneOfWeighted: 'blue',
paragraph: 'Vaniyo ki ceā̀murame tavįmacea namukaia. No moshi mổna hy raeyohỷni ceavǐno taceavira, kě kisota vamuhyshi m̃oyua nikȅ shiso. Muhyni noĥy ra kishia tâ.',
sentence: 'Ḿayushi chiamira ĥy raniceȁmu yu nonaketa hyḱahy hycễani.',
shape: {
a: 'Kįnomu',
b: 'Ǹaso',
},
someOf: [
'that',
'256',
'think',
],
times: [
'Ņokinhyrae',
'Yoceasốna',
'Ḳaso',
],
tuple: [
'Privet',
'Drive',
],
word: 'Cḥisokakin',
words: 'Yûkaiso chisohayó̩ yǔmekai',
},
{
bool: true,
char: 'H',
dateString: '2015-08-16T20:11:14.000Z',
float: 700373288.6813217,
hash: 4213511275,
int: 4213511275,
join: 'Parkway Road',
oneOf: 'green',
oneOfWeighted: 'blue',
paragraph: 'Kinmợa hakinkoha viyuchĭ chĭtakinso so shiṿa raemaḿo ceasoyuka, shiceavïyo kaina vasovi taker̃a ra. Kốcea vi makớkiva yoha hynoka sovaʋira nomeka ṁako. Muha na kivanoshi chi nanȏ. Rakinshi hą́hyva ḿumi kin ceahysoyu kin. Niakohy yô mokaishirae moćhirae notashichi hầ hỹna kaiķoni.',
sentence: 'Hashi ko kinhakokǻi ceã raḗkai nashi rą̃moa ƈhira, ramoni me kakĕ yokamomu ko.',
shape: {
a: 'Ḿemokairae',
b: 'Kaḯsomayo',
},
someOf: [
'too',
'that',
'are',
'beautiful',
],
times: [
'Vakiŕaeni',
'Kavȋno',
'Vikońi',
],
tuple: [
'Parkway',
'Road',
],
word: 'Ñirae',
words: 'Kėka vȧ ẗavi',
},
{
bool: true,
char: 'p',
dateString: '1997-02-18T01:42:39.000Z',
float: 1426040208.3262198,
hash: 3686761417,
int: 3686761417,
join: 'Cherry Drive',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Na sovahykẫi kỡmuva nayū̃namo ƙehya mȁ. Chikokika kinyu vayokai nikaimerae mǜka ḱaichia nochimamu vikaimucea, ŷochi naniraso m̃i ravicea hǟ namo. Ḳi nayuko kayu ꝁin soceayome rae rą́e. Kovǟ tachinoke nome ǩevako kinoshita, mamṑ nochi nį̃kemu shi ceaniyurae nahyki yunȯ.',
sentence: 'Maƙi tasohykiņ ką́i raceamiyu nakovi kīyo, keanome munako ceấkiyona kiacea raeķo yốkaiso.',
shape: {
a: 'Ramồ',
b: 'Ḿurae',
},
someOf: [
'i',
'colors',
'you',
'256',
],
times: [
'Memukȉnyu',
'Vaṁe',
'Raēmiva',
],
tuple: [
'Cherry',
'Drive',
],
word: 'Kasoshḯno',
words: 'Ƙainomita kiṁanoki ceamehấ',
},
{
bool: true,
char: 'r',
dateString: '2003-04-08T15:18:59.000Z',
float: 1244297583.295684,
hash: 121065903,
int: 121065903,
join: 'Privet Road',
oneOf: 'green',
oneOfWeighted: 'red',
paragraph: 'Mą̃yukai ranocea vakin mu haḱi. Kỉn yuvi yokaishȋ ka chi ha mokiṋme, somekokin ḿevi mỡ hamihyni tà. Niceami shichina raeyòkina koani ko vihymu mą́ ĥy.',
sentence: 'Hy moke takinha vaṅi nimevi mehy ṿitavia kai, ṁayuki kaita męyo ta vavihashi ki yumo tạ.',
shape: {
a: 'Cę́ameka',
b: 'Hakẽ',
},
someOf: [
'is',
'think',
'i',
'256',
'it',
],
times: [
'Vimǻ',
'Mȍmemo',
'Kiñmomi',
'Răenoma',
],
tuple: [
'Privet',
'Road',
],
word: 'Hyƙi',
words: 'Yȍ soshī̀ ṁe',
},
{
bool: false,
char: '8',
dateString: '2014-03-11T02:24:12.000Z',
float: 3556482426.158939,
hash: 2036056634,
int: 2036056634,
join: 'Cherry Drive',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Řameshi ni ceayuvi ķai visomuvằ noa kehayuno mevi. Mukovi kaki sochimomu maviḩavi kě ḣykinyo meraemovi hymeyohy. Kiraeka taki kai mu sovíraekin cȅake.',
sentence: 'Kaȉ kairaeno vi yoa mehacė̃ame.',
shape: {
a: 'Chiyú',
b: 'Ḩyrakiso',
},
someOf: [
'that',
'think',
'i',
],
times: [
'Řakira',
'Raèva',
],
tuple: [
'Cherry',
'Drive',
],
word: 'Ʋaso',
words: 'Raḗkinme ṝa',
},
{
bool: false,
char: 'm',
dateString: '1980-01-09T00:00:50.000Z',
float: 28168934.39874533,
hash: 3150377280,
int: 3150377280,
join: 'Privet Street',
oneOf: 'blue',
oneOfWeighted: 'blue',
paragraph: 'Meva nayucea raenavishi visoshimu hymi shičhihyno kę́a michiķashi, viyo kai sovishi hy vikachivấ. Yukě ko yomi raehyceano ṅakeko. Yuaẖyvi kai raema kemuhachi ķihyni ceaso mṍ, kiňsona ma vasorahy taceaniṁa kayochikäi.',
sentence: 'Miamu raehya vi ceakinmamo kiṅ ẏu ceakoceḁ, hyvi mu ceako kimume ṇohakera.',
shape: {
a: 'Yukes̩hiso',
b: 'Yurakômu',
},
someOf: [
'it',
'are',
'think',
],
times: [
'Kehamȋka',
'Mehysoṉi',
'Shikăko',
'Kinrayṳ',
'Ȟynomikai',
],
tuple: [
'Privet',
'Street',
],
word: 'M̃ovikayu',
words: 'Raehaḿuva ṃechia',
},
{
bool: false,
char: 'a',
dateString: '1984-01-21T12:32:37.000Z',
float: 1951099136.1324983,
hash: 4206730044,
int: 4206730044,
join: 'Parkway Drive',
oneOf: 'green',
oneOfWeighted: 'red',
paragraph: 'Shïnaki kinkai yo ceaki chì taḵi soachi, rano nimeyona shisokita ceami vashirǡe yu koceani mo. Nohyko nĭyo ni kiramua kikainoa. Sȱha yuhyno nicea viȼeaki yohyma, kihy nḁ make hamuhấ kinko moḥako yoceako.',
sentence: 'Koyuhaki yohavi keyoceaṁe vā hyvîha kekimova.',
shape: {
a: 'Ƴukokira',
b: 'Yumeso̩chi',
},
someOf: [
'it',
'beautiful',
'are',
],
times: [
'Ramoceǟ',
'Ỳokinhy',
],
tuple: [
'Parkway',
'Drive',
],
word: 'Vį̃nano',
words: 'Shivaḳin ṽiceahy',
},
{
bool: false,
char: 'm',
dateString: '1990-11-03T10:10:44.000Z',
float: 1804792857.1735883,
hash: 2955358450,
int: 2955358450,
join: 'Parkway Street',
oneOf: 'green',
oneOfWeighted: 'red',
paragraph: 'Koraekiƙa sḫi yomikai mo rayuki nivayoshi ke. Ɍa mivasokin kĭnraenochi shiceami va ceami. Raeviṛa haniceahy yuyo kai ƈeayua. Vimu kekoa tayụka no vĩmu kaceamima rakin, chikevirae kai nova so močea kikaishi niȟavi. Soni kiacĥi kăi yuchīa kaceayo kakoraḿe ḳai, hayokin vamo yuavï hặ kin mu.',
sentence: 'Viceani mushikại kai kokảike vãmu noyų́mo yotakomu ceavìnora, komeni sẖi kaikeyoma rae nano mumimenṍ.',
shape: {
a: 'Yuraecḥi',
b: 'Moshíme',
},
someOf: [
'i',
'is',
'beautiful',
'it',
'colors',
],
times: [
'Ḵaiko',
'Ꝁomi',
'Miṁe',
],
tuple: [
'Parkway',
'Street',
],
word: 'Mamehÿta',
words: 'Rấe nomeķin',
},
{
bool: false,
char: 'o',
dateString: '1994-07-11T06:54:31.000Z',
float: 4000630938.8013196,
hash: 1174035894,
int: 1174035894,
join: 'Privet Drive',
oneOf: 'blue',
oneOfWeighted: 'blue',
paragraph: 'Ṡorachia kinkoha hẵ ceãnokai ṽavi raeno, kẩ yuna ȟy mevi hyviḫy raeki. Sokichia raesomi muni rashiyuso motakoka, hy vina kakihavi kaimoraeta yu ňokeshi ka. Ra mino shina na nirakin chįraeni kinkirachi, vamemu moẖy vamuvavi ma nayo hamiha yukinmo. Kḯnchimu raeta hyƙi vihaceacħi noꞧako m̃oraki, vā̀ ÿora raḗ somaki kevi mukekaiyo shiramekin. Nita yu ƙai sovikoyu keyomi. Mėka shiami ni hakḝ makeniko mumȋyo.',
sentence: 'Nȧmu kenɨha mu hamerae nishinohy meki ḵinhayo.',
shape: {
a: 'Ŕakin',
b: 'Haĉea',
},
someOf: [
'think',
'i',
'too',
],
times: [
'Šoaceaso',
'Ḵorami',
'Vimồ',
'Kotaceǟ',
],
tuple: [
'Privet',
'Drive',
],
word: 'Soḳaso',
words: 'Kovihyꝁai kắ',
},
{
bool: true,
char: '7',
dateString: '2017-02-06T13:52:57.000Z',
float: 2201385335.1633515,
hash: 2549988397,
int: 2549988397,
join: 'Cherry Drive',
oneOf: 'green',
oneOfWeighted: 'blue',
paragraph: 'Kấyokai rakikaimo ḿe kẹyo ha hyha ṁe. Nṍ chimokeyo so nani ko ṱayu sovani, hy yomeachi r̃a ki viyűke ȟychi. Ńovaki nįkaimovi ma yoraeashi hẩ měmake hamoravi. Mevita nihykinṁo nokin rani shĩchikaiyu naceamirae yumechi, rameyo ṩo namusocea meva kovakochi kổnike muhachikō. Viha shĩkeva shi mino ȑaeni, mȁ kê̌ mohamo yokaiceakin ta kiyorą̃e hakɨn. Rachir̃a vaḱinva movi mumesoma kincea kinki, mokaiķoma ẗaso kai hayoraevi kehychi.',
sentence: 'Tamirae tachia hakovime shihahyta kaivaḿo kinovi koha.',
shape: {
a: 'Nam̃e',
b: 'Mumaḱin',
},
someOf: [
'you',
'too',
'that',
'256',
'think',
],
times: [
'Ḿorae',
'Kiceacḩime',
],
tuple: [
'Cherry',
'Drive',
],
word: 'Vivằ',
words: 'Momê̄noke yuceayų kenȉyo',
},
{
bool: true,
char: 'd',
dateString: '1991-12-16T11:21:45.000Z',
float: 1400839715.1824698,
hash: 3182265731,
int: 3182265731,
join: 'Privet Drive',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Ke hakisoḳai yorae moceanova ẖyso ha. Viyu kaiceä rae kayorako kin ki kainome nisȟiketa. Kevi shinakerae sonikachi vake va, naĉea ka takinva ṁamumaso va c̈eashiyuko kemihƴha rayoṃe. Yomi shiraëmiha kinmuṉiko hakin vaꝁinrae, yutayó raeṋoke rakiŗaeka ȟaso me nohaṣoshi.',
sentence: 'Hyka čhivahy raekằ ǹokemi kaira šo, ra visonả mame sokakinke sḧino vayuråe chinakinmu yukin.',
shape: {
a: 'Manomecḧi',
b: 'Taḣy',
},
someOf: [
'beautiful',
'that',
'i',
'too',
'is',
],
times: [
'Chĭmechi',
'Raekakį̃ma',
'Ḵacea',
],
tuple: [
'Privet',
'Drive',
],
word: 'Kiṋmeyota',
words: 'Ňonako mū hakẻmova',
},
{
bool: false,
char: 'o',
dateString: '2018-11-15T23:07:27.000Z',
float: 3429663079.370233,
hash: 4234902238,
int: 4234902238,
join: 'Privet Road',
oneOf: 'blue',
oneOfWeighted: 'blue',
paragraph: 'Hy vaviso mu ꞩhita hayo rae, taceamuchi kamokiyo raeka tayokika kaike. Vaķi ramemaki kįha momutayu chi komevishi vi. Hy yunorą̃e yoḳinkeso kaichȋna ṽayoracea, raehykino yokeshi kaiva kinmȯhyni na nṓhyke chi. Ninoyuna kě kinso rayoraeko ṋimuko yoniᶄora. Machi hahysona via kaịsona kaitasomi.',
sentence: 'Vani shira mímu yotayu haṁe kai, hy kayurȃe yoni ḱi mą́yo sona.',
shape: {
a: 'Račhina',
b: 'Nayuvĩyo',
},
someOf: [
'are',
'beautiful',
'colors',
'that',
'too',
],
times: [
'Ꝁomechirae',
'Mų̃ha',
'Nokḁnicea',
'Ƙaivayo',
'Nochḯhymu',
],
tuple: [
'Privet',
'Road',
],
word: 'Vaḉhitachi',
words: 'Ḩy ṿavi',
},
{
bool: false,
char: 'M',
dateString: '1984-05-13T04:06:32.000Z',
float: 2046870468.7711394,
hash: 2008098244,
int: 2008098244,
join: 'Cherry Road',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Hy kaiyochimu kai yȭ mumevi so mekaihaki ceakiṃo. Nikeko moni yo nira shi, sonameyo nakin masonachi muhahy na naḥy. Ṱavi ko yoayuni nặvikaimu ceamo rahy. Ƙeyu ceȧ hykinke nokairami sḥi chika. Kinkikoni konayukė yomerae cễanokin s̈o kevi kaichicea ṁitavi, hyvayo kinmo ceavima ḿicea mo raviko. Kemi shikeraeso soaꝁi koraechi koa sỡta keraso ṛaso, cĥi moninami vavino rae vịkai. Navi sokinraḱi kín takinva meçhi ra vivậmo sǒha, vitashi ki tasomå rậenokin yơ.',
sentence: 'Moni tȃyuta raeva kokimakin ha yoshiami yu, tǻ raechia raẽ ṅi mome so kaṽiyohy munayo.',
shape: {
a: 'Shįso',
b: 'Cḛaviso',
},
someOf: [
'it',
'is',
'that',
'colors',
],
times: [
'Yocè̩ake',
'Kamiyõkai',
'Mǜmiyukai',
],
tuple: [
'Cherry',
'Road',
],
word: 'Kaceảno',
words: 'Mȯ kikẽ',
},
{
bool: true,
char: '3',
dateString: '2007-12-12T11:47:25.000Z',
float: 584183095.189388,
hash: 3095374307,
int: 3095374307,
join: 'Cherry Road',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Minomura nǟ ceahymo mavino tanokayu soỳuni nỉha rakohayu. Ʋamuhyko yumi munīma ẖyceamimo raekoḱinva nȱceaso. Kinmime no vakồmachi yomuko kaiceamemi, kamiyuṃe ʋi takehyme vimeàno kå sohaḵin no.',
sentence: 'Kaceashi cea ḿekin chi misḫi kinmamita shi moha, rakinma namihy ke yomena moka chihyvima.',
shape: {
a: 'Ceakịname',
b: 'Shivavitǡ',
},
someOf: [
'that',
'you',
'too',
'256',
],
times: [
'Tâkeka',
'Shikeýuke',
],
tuple: [
'Cherry',
'Road',
],
word: 'Chihyshī̀kai',
words: 'Minømirae hâvikai takoă',
},
{
bool: true,
char: 't',
dateString: '1995-08-12T08:10:56.000Z',
float: 2902627112.3084903,
hash: 3476172655,
int: 3476172655,
join: 'Privet Street',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Kekokȋha meahẏa koceayo raehykaicea niyota ko mayo. Mo tanocḫima mu mećhinako noa, hymivavi rachime micĥivaso hyviraso kihyso. Tanichỉma so kḯn momấmirae yuyocea. Mayṍva kê̄ sochi ma ṉimomiso. Vikanimo vasoke nă nȋchima yotanicea vamemu hame. Kiṃe m̃e nimayo kinƫa kayuvime meshikake, viyua meyuhy kim̃e vamiṃako ni nomuna ra.',
sentence: 'Kachihyki mishi maké namekin vakinkime shimua, nomehamõ kikokai ƙinki va muno chita va.',
shape: {
a: 'Maṃovi',
b: 'Ŝora',
},
someOf: [
'it',
'256',
'you',
'think',
],
times: [
'Haceacħi',
'Koḵi',
'Ṥokavi',
],
tuple: [
'Privet',
'Street',
],
word: 'Kashiḣyno',
words: 'Nậ ẏomiso rảe',
},
{
bool: true,
char: 't',
dateString: '2013-02-06T01:19:51.000Z',
float: 1295638421.2967887,
hash: 3854473033,
int: 3854473033,
join: 'Parkway Road',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Muceàko mâ kaīma ta chianota raekita, chinoyuva m̃e rąenima mekamo mavimusȟi vihacea chimeniyu hymoa. Ṁi vanő yohamo mekinshi me. Ke meshiyu va ƙai mochira mucea, raechi chiṛa m̃e shiaso ŗa şhi raeƙaimumo. Naḿe ḱaiyu mumeyora no vimom̃i. Mikeṽami ke nikenomu kainake kinmi, ḳin moḿiyuni na tamokemi c̈hikeyua cềamoma mumokiko. Sohy kachimohy munome shiṫashi chi. Munoshi ʋi menocea korahyno ceạ nä.',
sentence: 'Makomu hachiyṓme mừ muṃerahy nṍ ceachinakin kohy kaishimu.',
shape: {
a: 'M̃aceavikai',
b: 'Vaṋi',
},
someOf: [
'too',
'think',
'i',
],
times: [
'Yuẖy',
'Vayoṃe',
'Nikeḧy',
'Yuṽihy',
],
tuple: [
'Parkway',
'Road',
],
word: 'Shichiyukḛ',
words: 'Hykokậihy ᶄinkehy ḩya',
},
{
bool: false,
char: '0',
dateString: '1984-09-09T20:54:45.000Z',
float: 1371521156.382626,
hash: 3699775724,
int: 3699775724,
join: 'Parkway Street',
oneOf: 'green',
oneOfWeighted: 'red',
paragraph: 'Muvayoni tă ṁeshihy sovikekai ᶄayuso ǩo vashisovą̃. Mi yuraekoki cḣi mukeka hyceacḣi kayohychi nomiyu no, miketachi ra kĩn nồ nakin hy kaishi sokẩi. Kinchi shɨke mamuvi ḵin nitayorae me yuta. Hykerãe komiraemi cḙa raemo yuni ŝokeyoa vivasochi nako. Hymi taniso kai na shi ta, kę viyorǻyo mu rae ᶄevami ṃe kenimǫta. Ņoyu kinko yochiyo nikế mi moyu chi moniha, meshi ḳo taniyurae koame hamokakin yuchi ramochī̀ me. Chivano kamu nokinkohy cea ȼhiko ma kovishi na.',
sentence: 'Vi kinvi meraceame kairae va yũyo nơ.',
shape: {
a: 'Ḿikaihy',
b: 'Kaiƈea',
},
someOf: [
'256',
'is',
'you',
'that',
'it',
],
times: [
'Ṉiyu',
'Moyuchȉ',
'Hynína',
'Ḿirashi',
'Ḳiko',
],
tuple: [
'Parkway',
'Street',
],
word: 'Muvimỗ',
words: 'Mī̀na ḿe',
},
{
bool: false,
char: 'O',
dateString: '1984-05-17T04:08:49.000Z',
float: 18490743.4062642,
hash: 134369524,
int: 134369524,
join: 'Parkway Street',
oneOf: 'green',
oneOfWeighted: 'red',
paragraph: 'Kiń nomishi kâi ňochi kevichishi mo. Rasỗta cẽa șoyu ma tamochi. Me mĭvamo mữ mevashiko raeka raevame hacea, ḱokihacea vimemimo ṩhi kanimoa ḳo kiaᶄo.',
sentence: 'Sòyumo raemuha hy moshi mi noshi noc̈eami mameni.',
shape: {
a: 'Ranở',
b: 'Ceamukoṩhi',
},
someOf: [
'is',
'colors',
'are',
],
times: [
'Vạkinceayo',
'Tȧkoma',
'Vanohŷchi',
'Ṩomuhy',
'Koraè̩',
],
tuple: [
'Parkway',
'Street',
],
word: 'Ḱiko',
words: 'Mǡ ṽiravi',
},
{
bool: true,
char: 'D',
dateString: '1987-04-12T15:31:44.000Z',
float: 2163900954.1366706,
hash: 1105908087,
int: 1105908087,
join: 'Cherry Drive',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Ceamovi hameraeyu ṿahykeyu sosĥi ninaṁe ma chiraehaňi. Hykai nona me vavicẖishi nà keranimu ninorae yuṃe. Ki nachitako sórae yuraeha rayoa mumi mumachĩ nohysoke. Maviʈayu rảshi ki raeshiso cea, novakinḳo chihykoa kinkọmechi kinmumo momekoha hy. Kịnmuma yuceameha moyukinshi cḣike yố mutaḵin mukin. Nomu namivami vicea tayukena mevi ki ceaḿe takinmohÿ, nikai rakinceakin kike hầ yokinhẵ nami vacea. Sokinka vakeyu ỳokaikochi vaceáme kincea, maṇi ha yoha hamita ta sorae munő so.',
sentence: 'Vakṏchiha muaki vā shị ceayo shi, kɨnayota ceayo mikeŝo ki chirano.',
shape: {
a: 'Kevikaìka',
b: 'Çhiashi',
},
someOf: [
'beautiful',
'colors',
'256',
'too',
],
times: [
'Niḱairaeni',
'Mețavi',
'Moas̩hi',
'Raemuɽa',
'Maǹi',
],
tuple: [
'Cherry',
'Drive',
],
word: 'Nộkai',
words: 'Raĥyni yȱ hỹna',
},
{
bool: false,
char: 'm',
dateString: '2018-03-11T14:46:41.000Z',
float: 1755941438.1193404,
hash: 1961196758,
int: 1961196758,
join: 'Privet Drive',
oneOf: 'red',
oneOfWeighted: 'blue',
paragraph: 'Ḣavi mamokề chi namemiȟy chimeta ṁayu tanovicea, yovã kaichiraekāi kaikasoni ką́ ṋiraki nonikavi hyćea. Notayo miceami yo vimënima yova. Ṟaea kinyo ni yuka kotavi. Vakiyuhy yỏ raetamiyo niamehy nimumo, ceakinkachi ki kayukę́ha ṁi vamṳha. Kaihy hamu raemuvi kaishimu va mira vaċhinoha. Muma yu rằe mevimoke chiyoťa. Nitavina kihakḙ kachi kovirakin rame mukemomu machiraso ḿe, kikai hyr̃ae momuniyu ra hayo ǹi kai yo.',
sentence: 'Kin takira chị ka rasoshi, mū̃na ceachikamu vȧ nikė́ rakinmima hynihaki ha vachi.',
shape: {
a: 'Haṃomi',
b: 'Kehỹvake',
},
someOf: [
'that',
'256',
'you',
],
times: [
'Nivaᶄo',
'Kinṃeshi',
'Soșhichi',
],
tuple: [
'Privet',
'Drive',
],
word: 'Ramở',
words: 'Ḳiko ĉeano',
},
{
bool: false,
char: 'M',
dateString: '1980-01-05T00:15:35.000Z',
float: 2985147615.374252,
hash: 4055437320,
int: 4055437320,
join: 'Privet Road',
oneOf: 'red',
oneOfWeighted: 'blue',
paragraph: 'Mekavi raemora ma ceăkivame mamu moshichi ṁayu, kerayoraě ỹora yo hamemi nameyora yukaia. Vi mạkin menonino kayoha há mę́mu, nikaiha sȟinomamo raehy mỉa yokake kinẖa keni rayome. Shi hychi kanoaᶄin mi hya hamuna moki yukoceā, nokamime vi cea kiyo kin.',
sentence: 'Yo kinmõ niyo shį̃hyva shiyuakin, ninɵ kiyoceayo raeki kovi noraeyuva.',
shape: {
a: 'Raemukaƙin',
b: 'Cĥihaso',
},
someOf: [
'you',
'it',
'think',
],
times: [
'Koɽaena',
'Mísomuka',
'Niᶄaimaso',
],
tuple: [
'Privet',
'Road',
],
word: 'Hychiyų',
words: 'Koshiraeꝁi hŷha',
},
{
bool: false,
char: '2',
dateString: '1998-03-19T02:46:49.000Z',
float: 424254662.3407109,
hash: 2606573498,
int: 2606573498,
join: 'Parkway Road',
oneOf: 'blue',
oneOfWeighted: 'red',
paragraph: 'Shi nohakiso na raekoma so menohy shiko, kimamu kihyhaso ra nanikai kŏta yokinkaia kaiso ḵaitayo. Mokinvina shi komeha hyrashi kíta kekahykin ceayo kokashìra. Tȃ cea ceakin kai hymamu komeyumo ni vinomu. Ḿi hỵkinsokin michi maḩy shi hy yoshino yơkaiyoma. Yo chimo̩ mu meshiyoa shi ʋami ṁi, ketayu komachimo kerachi raeva nȍa. Chiceakemi chiħa mų̃chi raemuvaso raena mea taki. Vamo mu koraekimu nanoshi nisonihy nacea hymǟ chï, shicea raeamume șokin mehymoķi kirae vamo.',
sentence: 'Vichiyucea ke kinmiva kai ẙomavia, rano tȧ yochino raekaichi kinshi.',
shape: {
a: 'Yuƫamihy',
b: 'Kiḿe',
},
someOf: [
'256',
'you',
'is',
'are',
],
times: [
'Ŗavimako',
'Vinṑ',
],
tuple: [
'Parkway',
'Road',
],
word: 'Vaķira',
words: 'Mõnivacea vichiṛa',
},
{
bool: true,
char: 'j',
dateString: '1989-10-02T09:12:45.000Z',
float: 1430293246.6544547,
hash: 3933593889,
int: 3933593889,
join: 'Parkway Street',
oneOf: 'red',
oneOfWeighted: 'green',
paragraph: 'Kachi mihayu mʉ maṃukame na vañi vavicea mạ. Nẫ hynẫchicea keno nonir̃a šhi ramǫ raevasoha raếkona. Kehakerae hy kinko mea ha yomikai, hamemoka so kevi nayutachi shiha ceakehǟ m̃i me. Nacea kaiyocea shikamona ta rae, nihy chikohavį̃ nome meniyota va. Ceake yuvachike yotakỡna soa ko mahy hyceachima, mamoshiṇa yome ceaki ćhikaia ramȉka cèaso kȃmu. Shika ṉinokeko ḿi rae memitavi chi no, raê̄ sĥi muhýke yuyo shihyki vimonani motakinma ṽaceayo.',
sentence: 'Yu hymika hamushiyu ķoraehy ceako, hayu chimea so kinkểno yȍmu rasoniko.',
shape: {
a: 'ƫamomi',
b: 'Ceanikaᶄo',
},
someOf: [
'is',
'you',
'think',
],
times: [
'Kinsoshɨ',
'Kikovamổ',
'Kakikễyo',
'C̈hime',
],
tuple: [
'Parkway',
'Street',
],
word: 'Nặshiacea',
words: 'Munaceă ḳikera rayőmeno',
},
{
bool: false,
char: 's',
dateString: '1982-11-19T10:23:56.000Z',
float: 1748772161.3578002,
hash: 2311995802,
int: 2311995802,
join: 'Parkway Street',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Ḵotashi nakevậ kemi kekokimo mokichî moka. Hẵkekai koḿe ceamiha kai mȁ ceakihy hamukînso. Yuke yochimaki namuȑa chimena sơta ᶄimasoa.',
sentence: 'Soniha kinra čhikaihakin raekï nokakira raema.',
shape: {
a: 'Ravį̃meno',
b: 'Haṇi',
},
someOf: [
'i',
'colors',
'beautiful',
'too',
'256',
],
times: [
'Mảmomukai',
'Yukinhẳ',
'Ʋiso',
'Sővamo',
'Shíka',
],
tuple: [
'Parkway',
'Street',
],
word: 'Viɍaemike',
words: 'Muchiaꝁi memïamu',
},
{
bool: true,
char: '5',
dateString: '1985-10-18T21:55:12.000Z',
float: 1767041412.3385472,
hash: 216129525,
int: 216129525,
join: 'Privet Street',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Mikaicea chȉkai ke kaihy soceaniko, yuhyta nŏ raceayo kinchi ǩami moraekahy. Tanỏke kinta yunikorâe ṁeko vinokin, ṽi shimohaki kamo keḵairae mamoma. Vimo vakī̀no ħy koki ha shimaki. Cẽahyma mema mumochi ra hamiso, nốna cḩikaita yuceakina namu chino. Kắvita mochi vą́ ʋi kaiceamu somakin va haso, shiacea mamuki yuake makīnshi ĉeake. Kihykễ kai ra somuta ṇishihy.',
sentence: 'Kaivachino mumameva mằnina ki kino chiṛani.',
shape: {
a: 'M̃iyu',
b: 'Kę́ta',
},
someOf: [
'it',
'are',
'you',
],
times: [
'Ramīma',
'Tayựceani',
'Kinẏuni',
],
tuple: [
'Privet',
'Street',
],
word: 'Shikaĭani',
words: 'Shirậeko ṛa momesokiṅ',
},
{
bool: true,
char: 'L',
dateString: '1991-12-28T11:12:23.000Z',
float: 245968633.27103618,
hash: 2366051771,
int: 2366051771,
join: 'Privet Road',
oneOf: 'blue',
oneOfWeighted: 'red',
paragraph: 'Yùyo ḳin kesoṃuvi ŕa yoraekaimu, cềamu kenimo yo haceanishi shiso. Soakinva kinkia nonahyka kemo na tȧkinmira soma. Mekaiva ninotaĥy ki chikavi keta raḵinkai, koniā̀ muchɨ chiramema ćhihy raevima. Yu kaichinona hakova vą́ nờ komera. Raemitamu shichihy monano noraechihy ko misoacea cea meni.',
sentence: 'Nikoñiso somayơ tayomashi ceashi sora kắi kayoshi chiño.',
shape: {
a: 'Kinceaḵi',
b: 'Memaçhino',
},
someOf: [
'256',
'think',
'too',
'i',
],
times: [
'Shim̃e',
'Chinameỵo',
'Sorâe',
'Memā',
'Ȑaea',
],
tuple: [
'Privet',
'Road',
],
word: 'Hamiŷumi',
words: 'Nahymǻyo mȗ takĩnyuni',
},
{
bool: false,
char: 'U',
dateString: '1992-05-13T04:57:27.000Z',
float: 749076568.2691115,
hash: 1030755052,
int: 1030755052,
join: 'Cherry Drive',
oneOf: 'green',
oneOfWeighted: 'red',
paragraph: 'Shìkai soha yumemuǹi tayoni konimoa haso kaiƙamuso. Taňome mo chiashi rae ma. Ka kaishimu ṃi mayohymi kavi chimuhy. Kemō ꞩo shirae kaniraekai vi kai mu vano, shḯyu mekiḥy sokin yunami maki shiceḁchi kesoka kầiso. Hyra s̈o mī̀akika komishiną ṽiyu va kichi kinḵemu, kaïmokai hyhặno ȼhiceakin virǎe ra.',
sentence: 'Rahyna ýoa raniŕa meshiketa ÿunashi shi, sȯka rasoyu soraekaiso mậ ceahỳ ṋi.',
shape: {
a: 'Vivậ',
b: 'Ceameƙin',
},
someOf: [
'think',
'colors',
'too',
],
times: [
'Kinʋake',
'Ñiko',
],
tuple: [
'Cherry',
'Drive',
],
word: 'Nakochị',
words: 'Sởchihyvi cḩi',
},
{
bool: false,
char: 'm',
dateString: '2006-07-15T18:07:10.000Z',
float: 3660439034.295801,
hash: 3613405866,
int: 3613405866,
join: 'Privet Drive',
oneOf: 'green',
oneOfWeighted: 'blue',
paragraph: 'Ravi ṽi mushịme mo kinhasohy kevaki yomashi. Shi ko ńira niyokăyo yuÿokerae meshike, name yukavi kovi somuyoha mumahy muyominō ÿu. Kakę́ta mo muka ḳishi nami chi ḳo.',
sentence: 'Kairani chikasorae muñi haṃukekin mena chiťani so nokeraeno.',
shape: {
a: 'Ṋonayu',
b: 'Mukaihḁ',
},
someOf: [
'colors',
'think',
'beautiful',
'it',
],
times: [
'Mikeᶄin',
'Shimuṃami',
'Ḧayokaichi',
'Navimʉ',
'Kanikoṅi',
],
tuple: [
'Privet',
'Drive',
],
word: 'Kaịta',
words: 'Ćhivayo tamuṥo',
},
{
bool: false,
char: 'K',
dateString: '1986-11-03T22:51:03.000Z',
float: 2259362005.332736,
hash: 134677846,
int: 134677846,
join: 'Privet Road',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Ceamochi hymehy munoyukai keyu yümova vïmu soranộ. M̃era kẻ ȟaso ra kinrahy ma hŷta. Nokanomu shȉnona ceǎ nokai ḿukokiso kin kanikeno noẏuceayo, makệ vivake raevicḣi nasonã shiceakǐnta nako muchiyokin. Meshihy ṝa sồshi miké̩ka mamoraeva chikayukin sṓraeka miṃe. Mi yorano tavimoha ꝁochiraeyu mameni ninǡceashi kaċhia cȟiyu. Munichikai vikaʋi chiyukaiko chiraekai mimeắko. Ceakinceayo ramicea řa ċhimo mayuni ranotake na.',
sentence: 'Kikeṣo shi chime namumeka ńi takisoma, kaĩ ṽi ṧhira m̃i viha.',
shape: {
a: 'Hamomuḱi',
b: 'Hymakṍha',
},
someOf: [
'are',
'think',
'that',
],
times: [
'Raenochiḵo',
'Kesokę',
],
tuple: [
'Privet',
'Road',
],
word: 'Kenṍ',
words: 'Mĭvamo ʋachihyna',
},
{
bool: true,
char: 'b',
dateString: '1985-02-10T13:26:54.000Z',
float: 720301339.1318674,
hash: 305115925,
int: 305115925,
join: 'Privet Road',
oneOf: 'blue',
oneOfWeighted: 'blue',
paragraph: 'Kin nahy ṃevani sħiso hyäkerae mutacea. So raenora mićea mȗa yochiyomi ha, tẳ mashiceā vimorashi konína mą́ raeshitahy nà vachimakin. Raeyushi hỹkika kainashi kaki shiko somivami shikaikahy, ta kechi hykoḳin mo yumi vaỵukeki takerẩe. Ṝaea makoyu ṃi kachį̃ viso, tȃ hạmu mekaia me râmomuchi muvimuke kinra chiyokemĩ. Ke sokaimo kachi mitake chȉ šhiachita.',
sentence: 'Nita kaisorąe ka raevi yuhymoha manimeshi hyvame.',
shape: {
a: 'Nirakḛ',
b: 'Nổna',
},
someOf: [
'that',
'i',
'colors',
'beautiful',
'too',
],
times: [
'Mằchi',
'Metaṉi',
'Kinṃuracea',
],
tuple: [
'Privet',
'Road',
],
word: 'Ceahŷnake',
words: 'Ɍaemuyo chiraeᶄinkai shiṅo',
},
{
bool: false,
char: 'W',
dateString: '1998-03-07T02:42:39.000Z',
float: 3135024661.393866,
hash: 1653711338,
int: 1653711338,
join: 'Parkway Road',
oneOf: 'blue',
oneOfWeighted: 'red',
paragraph: 'Vi mimokin kai ṃovi mu, mukime mikaisomi havi noraetachi tằ nomikenö nokaiǩame muhyko. H̱y mekoshi yumihaso kasớta kinokaime koshi shi kokaiyu. Ko komukinva tayovake kaỉmo havicea kenimocea nikaȋko hykïnkaiyo, kovǟ ka kinyomu vimorakę̃ ceaṩovi yu nikahyḿe. Shitaviko vahy cea soyuki vī̀meka yu.',
sentence: 'Ceahymi kinakorae movime muyokame raeķoceano mṵ ǩaiha, yokḝ ņomukacea niso vamukin yutahymi.',
shape: {
a: 'Ƴura',
b: 'Makë',
},
someOf: [
'are',
'think',
'colors',
'it',
'256',
],
times: [
'Hynòceavi',
'Ŕako',
'Shikaimò',
],
tuple: [
'Parkway',
'Road',
],
word: 'Raekamisȭ',
words: 'Shimorấchi ḱi',
},
{
bool: false,
char: '6',
dateString: '1998-03-07T02:48:36.000Z',
float: 1384059676.2891166,
hash: 51193778,
int: 51193778,
join: 'Privet Drive',
oneOf: 'red',
oneOfWeighted: 'red',
paragraph: 'Mǐketa mahy kạ raetẫme kame chiyomǻko, machisorae shiçea ha kairaehyvi ma. Chȉkachi shime mé̩nimu shimevano tami michishinò. Ninokaia ta vakeva ṅoyu ħashihyma nimunȯ viha mi. Kintaķiko menayȏha novǟ namekiha soshiyoki kaḱeki novamumi. Mayukimo chirae sḧi ħy ḩayumoa nȫ shiceayuni chimumevi, mu niha muceahykin mi kầi shihymona vaviko sokaino. Ni hỹshi kinmakera ninỏ ƙimaso. Shikoma ɽahyvi meni noyura chika muva vă.',
sentence: 'Soyuvano mầ hykȇ ḱo mekaiyű.',
shape: {
a: 'Yuṋi',
b: 'Ṃunishiyu',
},
someOf: [
'colors',
'i',
'are',
],
times: [
'Yomanokiñ',
'Kẩkin',
],
tuple: [
'Privet',
'Drive',
],
word: 'Taṁukihy',
words: 'Vasḥi kȃvikai',
},
{
bool: true,
char: 'T',
dateString: '2009-02-10T13:55:06.000Z',
float: 2568408676.114224,
hash: 2713534189,
int: 2713534189,
join: 'Parkway Drive',
oneOf: 'blue',
oneOfWeighted: 'red',
paragraph: 'Kaỉ ma kakoahy makiyora moni. Kamemomu miťaso nimemïshi hachi ḫyshichiso, vahynaso chima niramimą́ s̈hi ṿi. Hymakeso kairaki ra soke hyayoki.',
sentence: 'Tã maso memusome mṑ raemuame me ţamukavi hy.',
shape: {
a: 'Mirāekinmu',
b: 'Movǡ',
},
someOf: [
'beautiful',
'i',
'you',
'think',
'too',
],
times: [
'Mevinoḳai',
'Sovamevǻ',
'Shihȳkeno',
],
tuple: [
'Parkway',
'Drive',
],
word: 'Hŷkinso',
words: 'Ňayoshi kekaṽi ṅi',
},
{
bool: false,
char: 'q',
dateString: '1996-01-09T00:49:26.000Z',
float: 3452131765.2704616,
hash: 413263656,
int: 413263656,
join: 'Cherry Street',
oneOf: 'blue',
oneOfWeighted: 'red',
paragraph: 'Kặi va navinoki tâ kikoceayo yuꝁi mekaiavi nò, ḱo mako mokaĭ raeshî cea noyuko. Tavi shī̀sova shikoyùyo mihymeka shiḫy yohanomu noŧaso. Kiñ yusova shikani ka vayo kởhykai, chimu miyo ki raekaso kako yukemani ĥychi yu. Ramuvacea kekįnkakin kokaimano shira ṃira mo. Kaísona haviko ṛa ćeashikohy niceą, hy chiha mohami ňiko makimea tanona vamu.',
sentence: 'Nishihyrằ hyna ni ceako r̃aeha nirẵkima, chiayởka ḫykachikai somiᶄavi nivẚ hycea rashicḣi.',
shape: {
a: 'Vïhymo',
b: 'Ꝁinceayu',
},
someOf: [
'it',
'beautiful',
'is',
],
times: [
'Kamiǩaia',
'Ṿiyu',
'Masḧike',
],
tuple: [
'Cherry',
'Street',
],
word: 'Kinẏu',
words: 'Nẳ yushiḳaiha',
},
{
bool: false,
char: 'U',
dateString: '2010-11-03T22:16:13.000Z',
float: 112935046.29678054,
hash: 1572668470,
int: 1572668470,
join: 'Parkway Street',
oneOf: 'red',
oneOfWeighted: 'blue',
paragraph: 'Kinḣashi ceasonime ta chima kaikoki tamo, kaichiyura masoké̩ nîcea norakomi kaiha chishiva. Hychi naki vikaisokai hake somimavi vaśhinayo yokai, kai kehyname keǩayu kaȉ sochiᶄaita me. Komirae mameno vi ĉhi vǐhykai, raeva kemurayu mahyme koshī ƙimoki.',
sentence: 'Koraṃura raemima nikashike vivachį̃ mu ramoravi.',
shape: {
a: 'Mihychî',
b: 'Hŷnomiha',
},
someOf: [
'that',
'256',
'is',
'think',
'you',
],
times: [
'Hanốani',
'Komę́make',
'Nḯha',
],
tuple: [
'Parkway',
'Street',
],
word: 'Kinrachikḙ',
words: 'Vậ kaim̃uvi',
},
{
bool: false,
char: '6',
dateString: '2016-01-05T12:46:27.000Z',
float: 3694071144.249518,
hash: 3504510636,
int: 3504510636,
join: 'Cherry Drive',
oneOf: 'blue',
oneOfWeighted: 'green',
paragraph: 'Vakimṏ kiani muvaceachi kani mimomiyo tameyu, noraeha vakiacea hy mimeyoni memimo cea. Shi muvi raekame hymu kesohẫshi hyvia manihyňi tahyva, kemaki makoyuyo kiŝohy ceano kakoni yṏ mecħimame. Ʋi koka kǿnameka haḿe me chiyoha śo, nokachita kimu haké̩ sȱvirayo yoka ketako. Kovahy ke raeni chi vi hano, kiḿuko ke ceani ka mira vimekaįna me. Ka kesovi noṿia nike ṥo chivakiçea m̃ekishi.',
sentence: 'Momemằ shī̀ ṝa koshirae nă mako niväkimo.',
shape: {
a: 'Macę̃ake',
b: 'Yuvichī̀',
},
someOf: [
'you',
'beautiful',
'256',
],
times: [
'Ḵikai',
'Ǩinme',
'Ḱikai',
'Kaƈea',
'Kairaǩin',
],
tuple: [
'Cherry',
'Drive',
],
word: 'Rashiṫa',
words: 'Ķaia keꞩhi',
},
{
bool: true,
char: '5',
dateString: '1983-12-20T11:33:56.000Z',
float: 392084607.40361965,
hash: 3239543963,
int: 3239543963,
join: 'Cherry Road',
oneOf: 'blue',
oneOfWeighted: 'blue',
paragraph: 'Ḵo chiṉavi mỡ sỏrae hykiceâ raeyu. Yốka yomukeki na kin kakoraẖy, maceashi kekai hakinyua hakemucea ḱairaeko kochihahy. Ke kinmuyo nochikaihy misoka kiñmikai hýani korake shi, sokehayu nō yunaḿi vi chiraehy. Nohayo kinkehycea shimemamo hà ṽaso kichi hẙayukin vakinyuke.',
sentence: 'Nisomeko ha yukaĭshiso nota murayo̩, me shi yu yuvḁ kinmea hyta.',
shape: {
a: 'Keńi',
b: 'ᶄemiyochi',
},
someOf: [
'is',
'are',
'i',
'256',
],
times: [
'Nokimȏka',
'Sonihæ',
'Yuvȉmashi',
'Mạkinka',
],
tuple: [
'Cherry',
'Road',
],
word: 'Vivaᶄin',
words: 'Chikekaiỳu ṋi vimomukẵi',
},
{
bool: false,
char: 'O',
dateString: '2016-09-25T21:11:41.000Z',
float: 1236952435.6703882,
hash: 1420194716,
int: 1420194716,
join: 'Privet Street',
oneOf: 'red',
oneOfWeighted: 'blue',
paragraph: 'Hy kivayucḩi me tano m̃ora, konaṿi ʈavi hycea mayocea mi kaniso kếnakota tȃ. Mu ṃerasoyu kakokai moḱinchi moƙi, na kekayuyo vima kokesocea rayota soshimu. Va yoke ma ṿayu mokaiyu.',
sentence: 'Mő tano kinkaina michi shiyovake moraeko rako mặhymo, ceayu cea kaiyusokai vichiha kǟ movạ.',
shape: {
a: 'Nakoꝁinko',
b: 'Mutḁceame',
},
someOf: [
'256',
'is',
'you',
],
times: [
'Ńahynia',
'Vȉta',
'Ninṏ',
'Namechī',
'Sḩiyuka',
],
tuple: [
'Privet',
'Street',
],
word: 'Visȍta',
words: 'Ƙaishiraeno kįnkeso',
},
{
bool: false,
char: '2',
dateString: '1992-05-09T05:00:18.000Z',
float: 2476367479.116972,
hash: 313227412,
int: 313227412,
join: 'Cherry Drive',
oneOf: 'green',
oneOfWeighted: 'red',
paragraph: 'Yuvi mekaiso mucea tasoshi moraehyṭa shiṟaemo sokeha. Rayukeyo muki ha mǖha yo, nẩhykai hyrachȉso yuvầ kǟ vakê̄ vakinmo. Kiamoyʉ meta nonamo ṁe nakinceani, hysokai mikaimushi kaikehamo vano ḿe mome raỹorame. Sokaikamu tamumaḥy ramemu yoǩiyuhy ka, mãceakinta kaikamimo kinyucea hynaņi shisomamo haḳinki kệ mena. Mokai hachishimo kaivaso ḿi raehanovẩ kikaihacẖi. Yukona keʋinoshi raé̩yuki raehysoha makechȉ, noraekitã some keĥahy mohayocểa me nờmu ꝁamumochi chinona.',
sentence: 'Hytashi raeyǜ nanoki kiṋ tayuhy.',
shape: {
a: 'Koȟy',
b: 'Nihỳmo',
},
someOf: [
'colors',
'beautiful',
'too',
'are',
'it',
],
times: [
'Ceakĭnyu',
'Vaḿikoni',
'Shicħiyu',
'Muceamüso',
],
tuple: [
'Cherry',
'Drive',
],
word: 'Kincḩiyome',
words: 'Soḿe væ',
},
{
bool: false,
char: 'e',
dateString: '2018-11-23T23:12:08.000Z',
float: 535296908.2844991,
hash: 3555022318,
int: 3555022318,
join: 'Parkway Drive',
oneOf: 'blue',
oneOfWeighted: 'red',
paragraph: 'Ḣy chikaiyukin hy nakika kin niravike hashikema. Nokaimu raenomimo ni hanimucea kivẚ nachimo mohykẳ. Moa yoshi kohyshita ɽa ṙaeyu ḣy raȼhimema kinka, yuaṣomu naṃomime memina ħy rakimacea. Kechī̀ shimoha tachiraekin vasoceâ hykehaki. Va kinkaina rȁeki ceakeshika masova kaihy mi kaiketavi, sĥi viko shiraeni vįkeha ṽi. Haķin sokisokê̌ ṃorae masorae mḁ shi rasħi.',
sentence: 'Niceǡko hamoyushi va raekaihy kinmuyoǹa konayuva, ką raekainake vichi mema mi kikāimu.',
shape: {
a: 'Kaiǹasona',
b: 'Rǻeni',
},
someOf: [
'it',
'think',
'i',
],
times: [
'Muviṁoyu',
'Cḙamokin',
'Raếkona',
],
tuple: [
'Parkway',
'Drive',
],
word: 'Kinkaivã',
words: 'Mukoṿi ñi',
},
]
| 32.054335 | 564 | 0.528113 | grn_Latn | 0.097142 |
4fabfdf103bb4b3d02fc6751e9706c0eb7db1580 | 555 | md | Markdown | _posts/2020-06-25-saas.md | luckymaosh/luckmaosh.github.io | ca2664473a1e94d2b0ffde0f878714c0fefe9ef4 | [
"MIT"
] | null | null | null | _posts/2020-06-25-saas.md | luckymaosh/luckmaosh.github.io | ca2664473a1e94d2b0ffde0f878714c0fefe9ef4 | [
"MIT"
] | null | null | null | _posts/2020-06-25-saas.md | luckymaosh/luckmaosh.github.io | ca2664473a1e94d2b0ffde0f878714c0fefe9ef4 | [
"MIT"
] | null | null | null | # saas 服务的商业模式
看这张图
- 核心价值、关键业务: 就是产品的立足之本,产品可以是垂直领域的,但是最好可以整合行业的各类需求,解决行业的痛点
- 渠道:渠道非常关键,一般saas都是销售前期付费的,所以打通渠道很关键
- 合作伙伴:产品最好可以跟其他厂商非常好的整合,这样能提供更好的用户体验。 若是做不好,就像抖音、淘宝分享到微信一样,非常难用。
企业saas服务 一般来说,包括:OA、erp、crm、协同办公、hrm、财务等。
企业服务等痛点永远是:高效、专业、安全。

艾瑞咨询,中国saas市场规模,到2021年底,将到达654.2亿元

一般来说,每一个垂直领域都有其饱含盛名的头牌应用,如:CRM领域的纷享销客、ERP领域的金蝶云、OA协同的钉钉、客服与呼叫中心领域的小i机器人
[纷享销客](https://zhuanlan.zhihu.com/p/142096103) | 23.125 | 71 | 0.801802 | yue_Hant | 0.754963 |
4fac39c4315d10044d82b0cf9ae1a173daa04129 | 3,516 | md | Markdown | README.md | akgarhwal/Ghost-Android-Game | 8d7029d83e386e91d5fb361e26425cf43ca887e7 | [
"Apache-2.0"
] | null | null | null | README.md | akgarhwal/Ghost-Android-Game | 8d7029d83e386e91d5fb361e26425cf43ca887e7 | [
"Apache-2.0"
] | null | null | null | README.md | akgarhwal/Ghost-Android-Game | 8d7029d83e386e91d5fb361e26425cf43ca887e7 | [
"Apache-2.0"
] | 2 | 2020-01-07T03:11:35.000Z | 2020-04-04T09:42:35.000Z | # Ghost-Android-Game
</b>[Ghost](https://en.wikipedia.org/wiki/Ghost_(game)) is a word game in which players take turns to add letters to an incomplete word. The only constraint is to maintain a string which is a prefix of a valid English word. The player who forces his opponent to create a meaningful word wins the game. A dictionary of 60K+ words were implemented using Trie, which served quick lookups for optimized decision checks during the gameplay. The bot moves were implemented using order-3 Markov model.</b>
System Details : <br>
Android Studio 2.3.3 <br>
Build : 162.4069837 <br>
JRE : 1.8 <br>
<h2>Screenshot of App :</h2>
<p float="left">
<img src="ghost.gif"/>
</p>
<h3>Game Rules: </h3>
<ol>
<li>Ghost is a word game in which players take turns adding letters to a growing word fragment, trying not to be the one to complete a valid word.</li>
<li>Each incomplete wordmust be the beginning of an actual word, and minmum length of a word that counts is 4 letters.</li>
<li>The player who completes a word lose the round.</li>
<li>A player is chosen at random to start the game, and begins by naming any letter of the alphabet.</li>
<li>Players then take turns to add letters to this fragment, with the aim being to avoid completing an actual word.</li>
<li>The player whose turn it is may :-
<ul>
<li> Instead of adding a letter, challenge the previous player to prove that the current fragment is actually the beginning of a word.</li>
<li> If the challenged player can name such a word, the challenger loses the round; otherwise the challenged player loses the round.</li>
<li> If a player bluffs, or completes a word without other players noticing, then play continues.</li>
</ul>
</li>
</ol>
<h2>Ghost Strategy :-</h2>
<h5>A Sample Dictionary :-</h5>
<ol>
<li>there</li>
<li>any</li>
<li>answer</li>
<li>anyone</li>
<li>their</li>
<li>bye</li>
</ol>
Markov Model from Dictionary :-
<table>
<th>Current State</th>
<th>Possible next character</th>
<tr> <td>the</td>
<td>{'r':1, 'i':1}</td>
</tr>
<tr> <td>her</td>
<td>{'e':1}</td>
</tr>
<tr> <td>ans</td>
<td>{'w':1}</td>
</tr>
<tr> <td>nsw</td>
<td>{'e':1}</td>
</tr>
<tr> <td>swe</td>
<td>{'r':1}</td>
</tr>
<tr> <td>any</td>
<td>{'o':1}</td>
</tr>
<tr> <td>nyo</td>
<td>{'n':1}</td>
</tr>
<tr> <td>yon</td>
<td>{'e':1}</td>
</tr>
<tr> <td>hei</td>
<td>{'r':1}</td>
</tr>
</table>
<h5>Ghost will use markov model to predict next char of word fragment.</h5>
Order of Markov model is 3, which means next character wil be decided from last 3 character.
<h2>How to check current word fragment is valid or complete :- </h2>
<h4> 1. Binary Search :- </h4>
<ul>
<li>Sort Dictionary lexicographically</li>
<li>Search for current word in sorted dictionary using Binary search.</li>
<li>Complexity : O( log<sub>2</sub>(N)) where N is number of word in dictionary.</li>
</ul>
<h4> 2. Trie :- </h4>
<ul>
<li>The trie is a tree where each vertex represents a single word or a prefix.</li>
<li>The tries can insert and find strings in O(L) time (where L represent the length of a single word). This is much faster than Binary search.</li>
</ul>
<h5>How Trie look like :- </h5>
<p>LIst of words:</p>
<ul>
<li>Great</li>
<li>Any</li>
<li>Anyhow</li>
<li>answer</li>
<li>Greek</li>
<li>Their</li>
<li>there</li>
<li>the</li>
</ul>
<img src="Trie.PNG">
| 31.963636 | 498 | 0.647042 | eng_Latn | 0.984221 |
4fac521c40a48254a3105d347dad147f5a3accb3 | 194 | md | Markdown | _posts/2018-10-30-tim-is-great.md | fossil12/fossil12.github.io | 84e0ad18dae19a1b10660bfbcc06514261095be7 | [
"MIT"
] | null | null | null | _posts/2018-10-30-tim-is-great.md | fossil12/fossil12.github.io | 84e0ad18dae19a1b10660bfbcc06514261095be7 | [
"MIT"
] | null | null | null | _posts/2018-10-30-tim-is-great.md | fossil12/fossil12.github.io | 84e0ad18dae19a1b10660bfbcc06514261095be7 | [
"MIT"
] | null | null | null | ---
layout: post
microblog: true
audio:
photo:
date: 2018-10-30 15:34:46 +0100
guid: http://fossil12.micro.blog/2018/10/30/tim-is-great.html
---
Tim is great, the crowd is too crazy for me...
| 19.4 | 61 | 0.695876 | eng_Latn | 0.532809 |
4facd74d026060e65fa2f869922e81d424b9244e | 8,776 | md | Markdown | articles/cognitive-services/Speech-Service/speech-devices-sdk-microphone.md | md2perpe/azure-docs.sv-se | 64bdd85952bc1d194f86a3a80e616ca967bb6235 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/cognitive-services/Speech-Service/speech-devices-sdk-microphone.md | md2perpe/azure-docs.sv-se | 64bdd85952bc1d194f86a3a80e616ca967bb6235 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/cognitive-services/Speech-Service/speech-devices-sdk-microphone.md | md2perpe/azure-docs.sv-se | 64bdd85952bc1d194f86a3a80e616ca967bb6235 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Tal Devices SDK mikrofon matris rekommendationer – Speech Services
titleSuffix: Azure Cognitive Services
description: Rekommendationer för tal Devices SDK mikrofon matris. Följande matris geometrier rekommenderas för användning med Microsoft ljud-stacken. Platsen för ljud källor och bakgrundsljud att avvisa bättre med större antal mikrofoner med beroenden på specifika program och användarscenarier formfaktorn för enheten.
services: cognitive-services
author: erhopf
manager: nitinme
ms.service: cognitive-services
ms.subservice: speech-service
ms.topic: conceptual
ms.date: 07/05/2019
ms.author: erhopf
ms.openlocfilehash: 121e94228ca85684b20f2ee43c0f7fa3af82fc73
ms.sourcegitcommit: f10ae7078e477531af5b61a7fe64ab0e389830e8
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 07/05/2019
ms.locfileid: "67606337"
---
# <a name="speech-devices-sdk-microphone-array-recommendations"></a>Tal enheter SDK mikrofon matris rekommendationer
I den här artikeln får du lära dig hur du utformar en mikrofon matris för tal enheter SDK.
SDK: N för tal enheter fungerar bäst med en mikrofon-matris som har utformats för enligt följande riktlinjer, inklusive mikrofon geometri och komponenten markerad. Vägledning ges också på integration och elektriska.
## <a name="microphone-geometry"></a>Mikrofon geometri
Följande matris geometrier rekommenderas för användning med Microsoft ljud-stacken. Platsen för ljud källor och bakgrundsljud att avvisa bättre med större antal mikrofoner med beroenden på specifika program och användarscenarier formfaktorn för enheten.
| | Cirkulär matris | | Linjär matris | |
|----------|-------------------|-------------------|----------------|----------------|
| |<img src="media/speech-devices-sdk/7-mic-c.png" alt="7 mic circular array" width="150"/>|<img src="media/speech-devices-sdk/4-mic-c.png" alt="4 mic circular array" width="150"/>|<img src="media/speech-devices-sdk/4-mic-l.png" alt="4 mic linear array" width="150"/>|<img src="media/speech-devices-sdk/2-mic-l.png" alt="2 mic linear array" width="150"/>|
| \# Mikrofoner | 7 | 4 | 4 | 2 |
| geometri | 6 Outer, 1 Center, Radius = 42.5 mm, Evenly Spaced| 3 Outer, 1 Center, Radius = 42.5 mm, Evenly Spaced | Längd = 120 mm avstånd = 40 mm | Avstånd = 40 mm |
Mikrofon kanaler ska sorteras enligt numreringen visas för varje ovan matris, vilket ökar från 0. Microsoft ljud Stack kräver en ytterligare referens dataström för ljuduppspelning att utföra eko.
## <a name="component-selection"></a>Komponenturvalet
Mikrofon komponenter ska väljas att korrekt återge en signal som är kostnadsfritt brus och störningar.
De rekommenderade egenskaperna när du väljer mikrofoner är:
| Parameter | Rekommenderas |
|-----------------------------------|-----------------------------------|
| SNR | \> 65 dB (1 kHz signalen 94 dBSPL, A-viktade bruset) |
| Amplitud matchar | ± 1 dB @ 1 kHz |
| Matchning fasen | ± 2° @ 1 kHz |
| Akustiska överlagring punkt (AOP) | \> 120 dBSPL (THD = 10%) |
| Bithastighet | Minst 24-bitars |
| Samplingsfrekvens | Minsta 16 kHz\* |
| Directivity | Rundstrålande |
| Frekvens svar | \+ 3 dB, 200 8000 Hz flytande Mask\*|
| Tillförlitlighet | Lagringsomfång temperatur 40 ° C till 70 ° C<br />Operativ temperaturintervall 20 ° C till 55 ° C |
*\*Högre sampling betalning eller ”större” frekvensintervall kan krävas för program för hög kvalitet kommunikation (VoIP)*
Bra komponenturvalet måste kopplas till bra electroacoustic integrering för att undvika att försämra läsarens prestanda för de komponenter som används. Unika användningsfall kan också kräva ytterligare krav (till exempel: driva temperatur).
## <a name="microphone-array-integration"></a>Mikrofon matris-integrering
Prestanda för matriser när du har integrerat i en enhet och efter fast vinst eller EQ måste uppfylla följande rekommendationer:
| Parameter | Rekommenderas |
|--------------------|----------------------------------------------------|
| SNR | \> 65 dB (1 kHz signalen 94 dBSPL, A-viktade bruset) |
| Utdata känslighet | -26 dBFS/Pa @ 1 kHz (rekommenderas) |
| Amplitud matchar | ± 2 dB, 200-8000 Hz |
| Matchning fasen | ± 5°, 200-8000 Hz |
| THD % | ≤ 1%, 200 8000 Hz, 94 dBSPL, 5 ordning |
| Frekvens svar | \+ 6 dB, 200 8000 Hz flytande Mask\* |
*\*”Större” frekvensintervall kan krävas för program för hög kvalitet kommunikation (VoIP)*
## <a name="speaker-integration-recommendations"></a>Rekommendationer för API-integration
Som eko krävs för tal för enheter som innehåller talare, finns ytterligare rekommendationer för val av talare och integrering.
| Parameter | Rekommenderas |
|-----------------------------------|-----------------------------------|
| Linearitet överväganden | Inga icke-linjära bearbetning efter API-referens, annars en maskinvarubaserad loopback referens dataström är obligatoriskt |
| Talare Loopback | Tillhandahålls via WASAPI, privat API: er, anpassade ALSA plugin-programmet (Linux) eller tillhandahålls via inbyggd programvara kanal |
| THD % | 3: e oktavbanden minsta 5 ordningen, 70 dBA uppspelning @ 0,8 m ≤ 6.3%, 315 – 500 Hz ≤ 5% 630 5000 Hz |
| Echo-koppling till mikrofoner | \> -10 dB TCLw med ITU-T G.122 bilaga B.4 metoden normaliseras till mic-nivå<br />TCLw = TCLwmeasured \+ (mätt nivå - rikta utdata känslighet)<br />TCLw = TCLwmeasured \+ (mätt nivå - (-26)) |
## <a name="integration-design-architecture"></a>Design-Integreringsarkitektur
Följande riktlinjer för arkitekturen krävs när de integrerar mikrofoner i en enhet:
| Parameter | Rekommendation |
|-----------------------------------|-----------------------------------|
| MIC-Port likheter | Alla mikrofon portar är samma längd i matrisen |
| MIC-Port dimensioner | Port storlek Ø0.8 1.0 mm. Port längd / Port Diameter \< 2 |
| MIC försegling | Hur du skrivskyddar packningar enhetligt implementeras i stack upp. Rekommenderar \> 70% komprimeringsförhållandet för skum packningar |
| MIC tillförlitlighet | Nät som ska användas för att förhindra damm och ingress (mellan PCB för nedre porteras mikrofoner och hur du skrivskyddar packning/upp cover) |
| MIC-isolering | Gummipackningar och vibrationer Frikoppling via struktur, särskilt för att isolera eventuella vibrationer sökvägar på grund av inbyggda högtalare |
| Sampling klockan | Enheten ljud får inte innehålla jitter och bortfall med låg drift |
| Registrera funktionen | Enheten måste kunna registrera enskild kanal raw strömmar samtidigt |
| USB | Alla USB-enheter måste ställa in beskrivningar enligt den [USB-enheter Rev3-specifikationen ljud](https://www.usb.org/document-library/usb-audio-devices-rev-30-and-adopters-agreement) |
| Mikrofon geometri | Drivrutinerna måste implementera [mikrofon matris geometri beskrivningar](https://docs.microsoft.com/windows-hardware/drivers/audio/ksproperty-audio-mic-array-geometry) korrekt |
| För att göra | Enheter får inte ha några undiscoverable eller fungerar maskinvara, inbyggd programvara eller 3 part programvarubaserad icke-linjära ljud bearbetning algoritmer till och från enheten|
| Avbilda Format | Avbilda format måste använda en minsta samplingsfrekvensen för 16 kHz och rekommenderade 24-bitars djup |
## <a name="electrical-architecture-considerations"></a>Arkitektur för elektriska överväganden
Om tillämpligt, matriser kan anslutas till en USB-värd (till exempel en SoC som kör Microsoft ljud-stacken) och gränssnitt för att taltjänster eller andra program.
Maskinvarukomponenter som Kontaktar till TDM konvertering bör se till att dynamiskt omfång och SNR av mikrofonerna bevaras i ny insamling.
Snabba USB ljud klass 2.0 bör stödjas inom alla ljud MCU för att tillhandahålla den nödvändiga bandbredden för upp till sju kanaler med högre samplingsfrekvensen och bitars djup.
## <a name="next-steps"></a>Nästa steg
> [!div class="nextstepaction"]
> [Läs mer om tal Devices SDK](speech-devices-sdk.md)
| 75.655172 | 364 | 0.663856 | swe_Latn | 0.996645 |
4fae3ea0c05588ed1dddfafda59a2d1a01845389 | 15,732 | md | Markdown | tuning_reports/plwiki.goodfaith.md | paulkernfeld/editquality | 029f21278d89d6e50b0eac7b39d8355f8e4686f4 | [
"MIT"
] | 18 | 2015-09-13T10:47:31.000Z | 2018-08-20T15:00:35.000Z | tuning_reports/plwiki.goodfaith.md | paulkernfeld/editquality | 029f21278d89d6e50b0eac7b39d8355f8e4686f4 | [
"MIT"
] | 98 | 2015-12-13T12:18:24.000Z | 2018-08-07T21:10:46.000Z | tuning_reports/plwiki.goodfaith.md | paulkernfeld/editquality | 029f21278d89d6e50b0eac7b39d8355f8e4686f4 | [
"MIT"
] | 17 | 2015-09-29T20:52:12.000Z | 2018-08-20T11:33:30.000Z | # Model tuning report
- Revscoring version: 2.0.5
- Features: editquality.feature_lists.plwiki.goodfaith
- Date: 2017-09-12T21:12:09.902886
- Observations: 15013
- Labels: [true, false]
- Statistic: roc_auc.labels.true (maximize)
- Folds: 5
# Top scoring configurations
| model | roc_auc.labels.true | params |
|:-----------------------|----------------------:|:-------------------------------------------------------------------------------|
| RandomForestClassifier | 0.9747 | n_estimators=320, criterion="entropy", min_samples_leaf=3, max_features="log2" |
| RandomForestClassifier | 0.9746 | n_estimators=320, criterion="entropy", min_samples_leaf=1, max_features="log2" |
| RandomForestClassifier | 0.9742 | n_estimators=320, criterion="gini", min_samples_leaf=1, max_features="log2" |
| RandomForestClassifier | 0.9732 | n_estimators=160, criterion="entropy", min_samples_leaf=5, max_features="log2" |
| RandomForestClassifier | 0.9724 | n_estimators=320, criterion="entropy", min_samples_leaf=5, max_features="log2" |
| RandomForestClassifier | 0.9719 | n_estimators=160, criterion="gini", min_samples_leaf=3, max_features="log2" |
| RandomForestClassifier | 0.9713 | n_estimators=160, criterion="entropy", min_samples_leaf=7, max_features="log2" |
| GradientBoosting | 0.9711 | n_estimators=700, max_features="log2", learning_rate=0.01, max_depth=5 |
| RandomForestClassifier | 0.9709 | n_estimators=80, criterion="gini", min_samples_leaf=5, max_features="log2" |
| RandomForestClassifier | 0.9708 | n_estimators=160, criterion="entropy", min_samples_leaf=3, max_features="log2" |
# Models
## BernoulliNB
| roc_auc.labels.true | params |
|----------------------:|:---------|
| 0.8171 | |
## GaussianNB
| roc_auc.labels.true | params |
||
## RandomForestClassifier
| roc_auc.labels.true | params |
|----------------------:|:--------------------------------------------------------------------------------|
| 0.9747 | n_estimators=320, criterion="entropy", min_samples_leaf=3, max_features="log2" |
| 0.9746 | n_estimators=320, criterion="entropy", min_samples_leaf=1, max_features="log2" |
| 0.9742 | n_estimators=320, criterion="gini", min_samples_leaf=1, max_features="log2" |
| 0.9732 | n_estimators=160, criterion="entropy", min_samples_leaf=5, max_features="log2" |
| 0.9724 | n_estimators=320, criterion="entropy", min_samples_leaf=5, max_features="log2" |
| 0.9719 | n_estimators=160, criterion="gini", min_samples_leaf=3, max_features="log2" |
| 0.9713 | n_estimators=160, criterion="entropy", min_samples_leaf=7, max_features="log2" |
| 0.9709 | n_estimators=80, criterion="gini", min_samples_leaf=5, max_features="log2" |
| 0.9708 | n_estimators=160, criterion="entropy", min_samples_leaf=3, max_features="log2" |
| 0.9706 | n_estimators=320, criterion="gini", min_samples_leaf=5, max_features="log2" |
| 0.9703 | n_estimators=320, criterion="gini", min_samples_leaf=7, max_features="log2" |
| 0.9699 | n_estimators=80, criterion="entropy", min_samples_leaf=7, max_features="log2" |
| 0.9696 | n_estimators=160, criterion="gini", min_samples_leaf=7, max_features="log2" |
| 0.9689 | n_estimators=320, criterion="gini", min_samples_leaf=13, max_features="log2" |
| 0.9687 | n_estimators=320, criterion="entropy", min_samples_leaf=7, max_features="log2" |
| 0.9682 | n_estimators=160, criterion="entropy", min_samples_leaf=13, max_features="log2" |
| 0.9679 | n_estimators=160, criterion="gini", min_samples_leaf=13, max_features="log2" |
| 0.9677 | n_estimators=320, criterion="gini", min_samples_leaf=3, max_features="log2" |
| 0.9677 | n_estimators=320, criterion="entropy", min_samples_leaf=13, max_features="log2" |
| 0.9677 | n_estimators=80, criterion="entropy", min_samples_leaf=3, max_features="log2" |
| 0.9673 | n_estimators=160, criterion="gini", min_samples_leaf=1, max_features="log2" |
| 0.9671 | n_estimators=160, criterion="entropy", min_samples_leaf=1, max_features="log2" |
| 0.9662 | n_estimators=160, criterion="gini", min_samples_leaf=5, max_features="log2" |
| 0.9657 | n_estimators=80, criterion="entropy", min_samples_leaf=5, max_features="log2" |
| 0.9656 | n_estimators=80, criterion="gini", min_samples_leaf=7, max_features="log2" |
| 0.9655 | n_estimators=40, criterion="entropy", min_samples_leaf=5, max_features="log2" |
| 0.9651 | n_estimators=40, criterion="entropy", min_samples_leaf=13, max_features="log2" |
| 0.9649 | n_estimators=80, criterion="gini", min_samples_leaf=1, max_features="log2" |
| 0.9649 | n_estimators=40, criterion="gini", min_samples_leaf=13, max_features="log2" |
| 0.9642 | n_estimators=20, criterion="entropy", min_samples_leaf=7, max_features="log2" |
| 0.964 | n_estimators=80, criterion="gini", min_samples_leaf=3, max_features="log2" |
| 0.9636 | n_estimators=40, criterion="entropy", min_samples_leaf=3, max_features="log2" |
| 0.9634 | n_estimators=80, criterion="gini", min_samples_leaf=13, max_features="log2" |
| 0.9632 | n_estimators=20, criterion="entropy", min_samples_leaf=3, max_features="log2" |
| 0.9624 | n_estimators=40, criterion="gini", min_samples_leaf=3, max_features="log2" |
| 0.9623 | n_estimators=40, criterion="gini", min_samples_leaf=5, max_features="log2" |
| 0.9621 | n_estimators=80, criterion="entropy", min_samples_leaf=13, max_features="log2" |
| 0.9619 | n_estimators=20, criterion="gini", min_samples_leaf=13, max_features="log2" |
| 0.9618 | n_estimators=40, criterion="entropy", min_samples_leaf=7, max_features="log2" |
| 0.9612 | n_estimators=40, criterion="gini", min_samples_leaf=7, max_features="log2" |
| 0.9607 | n_estimators=20, criterion="entropy", min_samples_leaf=13, max_features="log2" |
| 0.9599 | n_estimators=20, criterion="entropy", min_samples_leaf=5, max_features="log2" |
| 0.9582 | n_estimators=20, criterion="gini", min_samples_leaf=5, max_features="log2" |
| 0.9565 | n_estimators=80, criterion="entropy", min_samples_leaf=1, max_features="log2" |
| 0.953 | n_estimators=20, criterion="gini", min_samples_leaf=7, max_features="log2" |
| 0.9529 | n_estimators=10, criterion="gini", min_samples_leaf=3, max_features="log2" |
| 0.952 | n_estimators=40, criterion="entropy", min_samples_leaf=1, max_features="log2" |
| 0.9506 | n_estimators=10, criterion="entropy", min_samples_leaf=13, max_features="log2" |
| 0.9504 | n_estimators=20, criterion="gini", min_samples_leaf=3, max_features="log2" |
| 0.9483 | n_estimators=10, criterion="gini", min_samples_leaf=13, max_features="log2" |
| 0.947 | n_estimators=10, criterion="entropy", min_samples_leaf=7, max_features="log2" |
| 0.9431 | n_estimators=10, criterion="gini", min_samples_leaf=5, max_features="log2" |
| 0.9424 | n_estimators=10, criterion="gini", min_samples_leaf=7, max_features="log2" |
| 0.9398 | n_estimators=40, criterion="gini", min_samples_leaf=1, max_features="log2" |
| 0.9357 | n_estimators=20, criterion="gini", min_samples_leaf=1, max_features="log2" |
| 0.9307 | n_estimators=10, criterion="entropy", min_samples_leaf=5, max_features="log2" |
| 0.9284 | n_estimators=20, criterion="entropy", min_samples_leaf=1, max_features="log2" |
| 0.9259 | n_estimators=10, criterion="entropy", min_samples_leaf=3, max_features="log2" |
| 0.9005 | n_estimators=10, criterion="gini", min_samples_leaf=1, max_features="log2" |
| 0.8948 | n_estimators=10, criterion="entropy", min_samples_leaf=1, max_features="log2" |
## LogisticRegression
| roc_auc.labels.true | params |
|----------------------:|:--------------------|
| 0.8826 | C=1, penalty="l1" |
| 0.8806 | C=10, penalty="l1" |
| 0.8662 | C=0.1, penalty="l1" |
| 0.6673 | C=10, penalty="l2" |
| 0.6467 | C=0.1, penalty="l2" |
| 0.6333 | C=1, penalty="l2" |
## GradientBoosting
| roc_auc.labels.true | params |
|----------------------:|:-----------------------------------------------------------------------|
| 0.9711 | n_estimators=700, max_features="log2", learning_rate=0.01, max_depth=5 |
| 0.9703 | n_estimators=500, max_features="log2", learning_rate=0.01, max_depth=5 |
| 0.9689 | n_estimators=100, max_features="log2", learning_rate=0.1, max_depth=5 |
| 0.9683 | n_estimators=500, max_features="log2", learning_rate=0.01, max_depth=7 |
| 0.9682 | n_estimators=700, max_features="log2", learning_rate=0.01, max_depth=7 |
| 0.9671 | n_estimators=700, max_features="log2", learning_rate=0.1, max_depth=7 |
| 0.9668 | n_estimators=300, max_features="log2", learning_rate=0.01, max_depth=7 |
| 0.9667 | n_estimators=100, max_features="log2", learning_rate=0.1, max_depth=3 |
| 0.9665 | n_estimators=300, max_features="log2", learning_rate=0.1, max_depth=3 |
| 0.9661 | n_estimators=300, max_features="log2", learning_rate=0.01, max_depth=5 |
| 0.9657 | n_estimators=100, max_features="log2", learning_rate=0.1, max_depth=7 |
| 0.9653 | n_estimators=700, max_features="log2", learning_rate=0.01, max_depth=3 |
| 0.9649 | n_estimators=100, max_features="log2", learning_rate=0.01, max_depth=7 |
| 0.9632 | n_estimators=500, max_features="log2", learning_rate=0.1, max_depth=3 |
| 0.9632 | n_estimators=300, max_features="log2", learning_rate=0.1, max_depth=7 |
| 0.9631 | n_estimators=500, max_features="log2", learning_rate=0.1, max_depth=5 |
| 0.963 | n_estimators=500, max_features="log2", learning_rate=0.1, max_depth=7 |
| 0.9618 | n_estimators=300, max_features="log2", learning_rate=0.1, max_depth=5 |
| 0.96 | n_estimators=700, max_features="log2", learning_rate=0.1, max_depth=3 |
| 0.9597 | n_estimators=500, max_features="log2", learning_rate=0.01, max_depth=3 |
| 0.9596 | n_estimators=700, max_features="log2", learning_rate=0.5, max_depth=5 |
| 0.9589 | n_estimators=100, max_features="log2", learning_rate=0.01, max_depth=5 |
| 0.9585 | n_estimators=700, max_features="log2", learning_rate=0.1, max_depth=1 |
| 0.9577 | n_estimators=100, max_features="log2", learning_rate=0.5, max_depth=5 |
| 0.9568 | n_estimators=700, max_features="log2", learning_rate=0.1, max_depth=5 |
| 0.9566 | n_estimators=500, max_features="log2", learning_rate=0.1, max_depth=1 |
| 0.9562 | n_estimators=100, max_features="log2", learning_rate=0.5, max_depth=1 |
| 0.955 | n_estimators=500, max_features="log2", learning_rate=0.5, max_depth=1 |
| 0.9549 | n_estimators=300, max_features="log2", learning_rate=0.01, max_depth=3 |
| 0.9547 | n_estimators=300, max_features="log2", learning_rate=0.5, max_depth=1 |
| 0.9543 | n_estimators=500, max_features="log2", learning_rate=0.5, max_depth=5 |
| 0.9534 | n_estimators=500, max_features="log2", learning_rate=0.5, max_depth=7 |
| 0.9529 | n_estimators=300, max_features="log2", learning_rate=0.5, max_depth=7 |
| 0.9523 | n_estimators=700, max_features="log2", learning_rate=0.5, max_depth=7 |
| 0.9518 | n_estimators=700, max_features="log2", learning_rate=0.5, max_depth=1 |
| 0.9492 | n_estimators=300, max_features="log2", learning_rate=0.1, max_depth=1 |
| 0.9487 | n_estimators=300, max_features="log2", learning_rate=0.5, max_depth=5 |
| 0.9466 | n_estimators=500, max_features="log2", learning_rate=0.5, max_depth=3 |
| 0.9461 | n_estimators=100, max_features="log2", learning_rate=1, max_depth=1 |
| 0.9449 | n_estimators=100, max_features="log2", learning_rate=0.5, max_depth=3 |
| 0.9437 | n_estimators=100, max_features="log2", learning_rate=0.5, max_depth=7 |
| 0.9416 | n_estimators=100, max_features="log2", learning_rate=0.01, max_depth=3 |
| 0.9414 | n_estimators=700, max_features="log2", learning_rate=0.5, max_depth=3 |
| 0.9336 | n_estimators=300, max_features="log2", learning_rate=0.5, max_depth=3 |
| 0.9282 | n_estimators=700, max_features="log2", learning_rate=1, max_depth=3 |
| 0.9264 | n_estimators=100, max_features="log2", learning_rate=0.1, max_depth=1 |
| 0.9255 | n_estimators=700, max_features="log2", learning_rate=1, max_depth=1 |
| 0.925 | n_estimators=300, max_features="log2", learning_rate=1, max_depth=1 |
| 0.9211 | n_estimators=500, max_features="log2", learning_rate=1, max_depth=1 |
| 0.9196 | n_estimators=700, max_features="log2", learning_rate=0.01, max_depth=1 |
| 0.9137 | n_estimators=500, max_features="log2", learning_rate=0.01, max_depth=1 |
| 0.9133 | n_estimators=100, max_features="log2", learning_rate=1, max_depth=3 |
| 0.9071 | n_estimators=300, max_features="log2", learning_rate=0.01, max_depth=1 |
| 0.9067 | n_estimators=300, max_features="log2", learning_rate=1, max_depth=5 |
| 0.9058 | n_estimators=500, max_features="log2", learning_rate=1, max_depth=3 |
| 0.9041 | n_estimators=500, max_features="log2", learning_rate=1, max_depth=5 |
| 0.8931 | n_estimators=100, max_features="log2", learning_rate=1, max_depth=7 |
| 0.881 | n_estimators=100, max_features="log2", learning_rate=0.01, max_depth=1 |
| 0.8604 | n_estimators=300, max_features="log2", learning_rate=1, max_depth=3 |
| 0.8575 | n_estimators=100, max_features="log2", learning_rate=1, max_depth=5 |
| 0.8348 | n_estimators=700, max_features="log2", learning_rate=1, max_depth=7 |
| 0.8248 | n_estimators=300, max_features="log2", learning_rate=1, max_depth=7 |
| 0.7487 | n_estimators=700, max_features="log2", learning_rate=1, max_depth=5 |
| 0.7202 | n_estimators=500, max_features="log2", learning_rate=1, max_depth=7 |
| 89.386364 | 131 | 0.593631 | eng_Latn | 0.085021 |
4fb0bd3a8ba8d060ea7819db9efd47cb35438982 | 3,700 | md | Markdown | README.md | abinaya0101/VCPlayBot | e6b5e1b9204529bff3319cabadbbd01b61df917f | [
"Apache-2.0"
] | 1 | 2021-08-18T03:05:11.000Z | 2021-08-18T03:05:11.000Z | README.md | abinaya0101/VCPlayBot | e6b5e1b9204529bff3319cabadbbd01b61df917f | [
"Apache-2.0"
] | null | null | null | README.md | abinaya0101/VCPlayBot | e6b5e1b9204529bff3319cabadbbd01b61df917f | [
"Apache-2.0"
] | null | null | null | # How To Host
The easiest way to deploy this Bot
<p align="center"><a href="https://heroku.com/deploy?template=https://github.com/abinaya0101/VCPlayBot"> <img src="https://img.shields.io/badge/Deploy%20To%20Heroku-red?style=for-the-badge&logo=heroku" width="220" height="38.45"/></a></p>
- Support Channel :- [Awesome Bot](http://t.me/LaylaList)
- Support Group :- [Awesome Support](http://t.me/AwesomeSupport)
```
Please fork this repository don't import code
Made with Python3
(C) @QueenArzoo
```
### Mandatory Vars.
- Some Of The Mandatory Vars Are :-
- `API_ID` : Give API_ID of your Alternate Telegram Account. also get from here [@APIInfoBot](https://t.me/APIinfoBot)
- `API_HASH` : Give API_HASH of your Alternate Telegram Account. also get from here [@APIInfoBot](https://t.me/APIinfoBot)
- `STRING_NAME` : Make a string session from [here](https://replit.com/@QueenArzoo/VCPlayBot)
- `BOT_TOKEN` : Make a Bot from [@Botfather](https://t.me/botfather) and fill it's bot token.
- `SUDO_USERS` : Fill Userid of yhe users whom you want to be able to control the bot. You can add multiple id by giving a space in b/w each id.
Get STRING_NAME from here: [](https://replit.com/@QueenArzoo/VCPlayBot)
## Commands 🛠
- `/play <song name>` - play song you requested
- `/dplay <song name>` - play song you requested via deezer
- `/splay <song name>` - play song you requested via jio saavn
- `/playlist` - Show now playing list
- `/current` - Show now playing
- `/song <song name>` - download songs you want quickly
- `/search <query>` - search videos on youtube with details
- `/deezer <song name>` - download songs you want quickly via deezer
- `/saavn <song name>` - download songs you want quickly via saavn
- `/video <song name>` - download videos you want quickly
#### Admins only.
- `/player` - open music player settings panel
- `/pause` - pause song play
- `/resume` - resume song play
- `/skip` - play next song
- `/end` - stop music play
- `/userbotjoin` - invite assistant to your chat
- `/userbotleave` - remove assistant from your chat
- `/admincache` - Refresh admin list
### Commands for Channel Music Play 🛠
For linked group admins only:
- `/cplay <song name>` - play song you requested
- `/cplay <reply to link>` - play replied youtube link
- `/cplay <reply to audio>` - play replied file
- `/cdplay <song name>` - play song you requested via deezer
- `/csplay <song name>` - play song you requested via jio saavn
- `/cplaylist` - Show now playing list
- `/cccurrent` - Show now playing
- `/cplayer` - open music player settings panel
- `/cpause` - pause song play
- `/cresume` - resume song play
- `/cskip` - play next song
- `/cend` - stop music play
- `/userbotjoinchannel` - invite assistant to your chat
* channel is also can be used instead of c
If you donlt like to play in linked channel:
1. Get your channel ID.
2. Rename your group to: Channel Music: your_channel_id
3. Add @VCPlayBot as Channel admin with full perms
4. add helper to channel
5. Simply send commands in your group.
### Commands for Sudo Users ⚔️
- `/userbotleaveall` - remove assistant from all chats
- `/gcast <reply to message>` - globally brodcast replied message to all chats
- `/pmpermit [on/off]` - enable/disable pmpermit message
#### Pmpermit
- `.a` - approove someone to pm you
- `.da` - disapproove someone to pm you
+ Sudo Users can execute any command in any groups
#### Special Credits
- [Rojserbest](http://github.com/rojserbes): Callsmusic Developer
- [Awesome Bot](http://t.me/LaylaList) Channel bot list
- [Dev](http://t.me/HEROGAMERS1) Hero owner of this bot
| 38.541667 | 238 | 0.708378 | eng_Latn | 0.960189 |
4fb0cc7065a4af57aaba9a44ac286ab22d91da38 | 873 | md | Markdown | README.md | jlongenecker/OnTheMap | 89de5b05415b1090406b1ea73671116d001a662e | [
"MIT"
] | null | null | null | README.md | jlongenecker/OnTheMap | 89de5b05415b1090406b1ea73671116d001a662e | [
"MIT"
] | null | null | null | README.md | jlongenecker/OnTheMap | 89de5b05415b1090406b1ea73671116d001a662e | [
"MIT"
] | null | null | null | # OnTheMap
On the Map is a project built to satisfy Udacity's iOS Nanodegree Lesson 3 [project](https://www.udacity.com/course/ios-networking-with-swift--ud421).
The app allows a user to download other user locations, and upload a location of their choice
from a Parse Server run by Udacity. The location information is supplied to and from the server in a JSON format with the coordinates supplied by Apple's MapKit.
The app is written in Swift and utilizes Grand Central Dispatch, JSON, NSURLSession as well as UIKit.
##Installation
Clone Git Repository and then open with Xcode. The repository requries Xcode 7.2 or later.
## Collaboration
Since this project meets Udacity's objectives for lesson 3, this project is complete. Additional work can continue to be contributed
and the repository will be monitored.
##License
This project uses the MIT License.
| 41.571429 | 162 | 0.789233 | eng_Latn | 0.997797 |
4fb282ccb4bfb9ed4eee25bfd20744eddd7df56e | 5,191 | md | Markdown | content/ja/docs/GO-GlobalAdminConsole/ObtainingaTrustedServerCertificate.md | snemesk/ggdocs | db0fc2d553e7df734831a6a815b3592138d65219 | [
"Apache-2.0"
] | null | null | null | content/ja/docs/GO-GlobalAdminConsole/ObtainingaTrustedServerCertificate.md | snemesk/ggdocs | db0fc2d553e7df734831a6a815b3592138d65219 | [
"Apache-2.0"
] | null | null | null | content/ja/docs/GO-GlobalAdminConsole/ObtainingaTrustedServerCertificate.md | snemesk/ggdocs | db0fc2d553e7df734831a6a815b3592138d65219 | [
"Apache-2.0"
] | null | null | null | ---
title: "信頼されたサーバ証明書の取得"
linkTitle: ""
weight: 07
type: "docs"
---
強力な暗号化証明書ウィザードを使用する代わりに、お客様は他の認証局のサーバ証明書を使用することができます。クライアントのオペレーティングシステムによって信頼されているCAからサーバ証明書を取得する方法については、以下の情報と選択したCAのマニュアルを参照してください。CAは、証明書署名要求(CSR)を必要とします。
### CSRを生成する方法
1. [http://www.openssl.org/related/binaries.html](http://www.openssl.org/related/binaries.html)から **OpenSSL** をダウンロードします。
2. GO-Globalホストに **OpenSSL**をインストールします。
3. [スタート | ファイル名を指定して実行 ]の順にクリックします。
4. **cmd** と入力し、**Enter** キーを押します。
5. 以下のコマンドを入力して、サーバの秘密鍵を生成します。OPENSSL_DIRとは、OpenSSLがインストールされているディレクトリへのパスです(たとえば、`C:\openSSL`)。
```
[OPENSSL_DIR]\bin\openssl genrsa –out server.key 2048
```
6. 次のコマンドを入力します。
```
[OPENSSL_DIR]\bin\openssl req –sha256 –new –key server.key –out server.csr
```
このコマンドを実行すると、以下のように、証明書に含める属性の入力を求められます。
```
Country Name:JP
State:都道府県名
Locality:市区町村名
Organization:会社名 Organizational
Unit:部署名
Common Name:サーバ名
E-mail Address:電子メールアドレス
```
ワイルドカードSSL証明書を使用しない場合Common Nameは、GO-Globalホストのホスト名(ホストに接続するときに指定する名前)と一致している必要があります。異なる名前を使用すると、接続時にクライアントから警告が発せられます。上のコマンドの出力は、 **server.csr**という名前のファイルに保存されます。このファイルをCAに送信します。GO-Globalでは、OpenSSLツールキットに基づいてSSLが実装されているため、使用するツールは、Apache mod_sslパッケージなどのOpenSSLベースの製品で使用されているツールと同じです。mod_sslパッケージを使用してサーバの証明書を取得する方法については、CA の指示に従ってください。
認証局(CA)から署名された証明書ファイルが届いた時、それを **server.crt** というファイル名で保存してください。このファイルと上記ステップ5で生成されたファイル **server.key** をGO-Globalホスト上のシステムアカウントおよびAdministratorグループのアカウントでのみアクセス可能なフォルダへコピーしてください。この時、通常のユーザアカウントでアクセス出来ないことを必ず確認してください。最後に、Admin Consoleで署名された証明書ファイルを選択してください。
### サーバ証明書の選択方法
1. Admin Consoleで、[Tools | Host Options]の順にクリックします。
2. **Security**タブをクリックします。
3. **Transport**リストの中から **SSL**を選択します。
4. **[SSL Certificate]** ボックスに、サーバの証明書(server.crtなど)ファイルへのパスを入力または参照します。
5. **OK**をクリックします。
GO-Globalは、証明書とその鍵がPEM形式であることを要求しています。第三者のCAに証明書を要求する場合、GraphOnはPEM形式で証明書を要求することをお勧めします。これが不可能で、証明書をDER形式でしか配信できない場合は、次のコマンドを使用してPEM形式に変換できます。
```
openssl x509 -inform der -in MYCERT>cer -out MYCERT.pem
```
MYCERT.pemファイルは、GO-Globalで使用するためにMYCERT.crtに名前を変更できます。
# GO-Globalでの中間SSL証明書の使用
GO-Globalで中間SSL証明書を使用するときは、既存の証明書と中間証明書を連結する必要があります。次の例では、Go Daddy中間証明書を使用しています。
1. GO-Globalホストで使用されている.crtファイルと.keyファイルを取り出します。
2. Go Daddy中間証明書(GODaddyCA.crtなど)をダウンロードしてください。これは元の証明書の購入に付属しているはずですが、次の[Go Daddyサイト](https://certs.godaddy.com/Repository.go)にもあります。
3. 次のように.crtと中間の.crtファイルを連結します:test_server.crt+GODaddyCA.crt server.crt
4. 新しく作成したserver.crtファイルと一致するように、手順1のキーファイルの名前をserver.keyに変更します。
5. これら2つのファイルをGO-Globalホストにコピーします(例:`C:\data`)。
6. Admin Consoleを起動します。[Tool | Host Options ]の順にクリックします。 **[Security]** タブをクリックします。
7. 高度な暗号化ライセンスをお持ちの場合は、トランスポートをSSLに変更し、暗号化レベルを256ビットAESに上げます。そうでない場合は、56ビットのままにしてください。
8. `C:\data`にあるSSL証明書server.crtを参照して、 **[OK]** をクリックします。.crtファイルと.keyファイルに同じプレフィックスが付いている場合は、この時点でエラーメッセージは表示されません
9. テスト目的で **接続が安全な場合にユーザに通知する** を有効にします。
10. **OK** をクリックします。
11. 別のシステムからGO-Globalセッションを開始してください。
{{% alert title="参照" color="info" %}}
GO-Globalのモバイルアプリで中間のSSL証明書を使用する場合は、ルート証明書が必要です。
{{% /alert %}}
# iOSおよびAndroidクライアントでの中間SSL証明書の使用
iOSまたはAndroid上のGO-Global AppControllerがサーバ証明書を信頼するためには、中間証明書とすべてのルート証明書を含むSSL証明書チェーン全体を信頼できる必要があります。
1. GO-Globalホストで使用されているサーバ証明書の証明書チェーン内の中間証明書およびルート証明書の証明書ファイルを取得します。
2. すべての証明書ファイルを次のように 1 つのファイルに連結します。test_server.crt+intermediate.crt+root.crt server.crt.
{{% alert title="参照" color="info" %}}
0個以上の中間ファイルと1個以上のルートファイルがあります。.crtファイルが自己署名されている場合は、.crtファイルの名前をserver.crtに変更する必要があります。
{{% /alert %}}
3. 新しく作成した **server.crt** ファイルと一致するように、手順1のキーファイルの名前を **server.key** に変更します。
4. これら2つのファイルをGO-Globalホストにコピーします(`C:\Data`)。
5. Admin Consoleを起動します。[Tool | Host Options]の順にクリックします。**[Security]** タブをクリックします。
6. 高度な暗号化ライセンスをお持ちの場合は、トランスポートを **SSL**に変更し、暗号化レベルを256ビットAESに上げます。そうでない場合は、56ビットのままにしてください。
7. `C:\data`にあるSSL証明書 **server.crt** を参照して、 **[OK]** をクリックします。.crtファイルと.keyファイルに同じプレフィックスが付いている場合は、この時点でエラーメッセージは表示されません。
8. テスト目的で **Notify users when connections are secure** を有効にします。
9. **OK**をクリックします。
10. iOSまたはAndroidデバイスからGO-Globalセッションを開始します。
# SSLの問題を解決する
SSLを使用するようにRelay Load Balancerを設定している場合、Dependent HostsのRelay Load Balancer欄に入力した名前は、Relay Load Balancerの証明書のCommon Nameと一致している必要があります。(Relay Load BalancerのRelay Load Balancerフィールドに入力した名前は、Relay Load Balancerの証明書のCommon Nameと一致する必要はありません) リレーロードバランサーと依存ホストが正しく設定されていることを確認するには、以下の手順に従います。
1. 管理者コンソールを実行し、ツリービューのRelay Load Balancerの下に従属ホストが表示されていることを確認します。表示されない場合は、従属ホストがRelay Load Balancerに接続されていません。
2. 2. ツリービューで従属ホストがリレーロードバランサーの下に表示されない場合は、従属ホストのアプリケーショ ンパブリッシングサービスのログを確認します。証明書が無効であるというメッセージがあれば、SSLの設定に問題があります。
3. 3. ディペンデントホストの Application Publishing Service ログに SSL 証明書のエラーメッセージがある場合は、ディペンデントホストの GO-Global の GO-G\Programs ディレクトリを参照し、AppController.exe をダブルクリックして GO-Global クライアントを実行します。
4. 接続ダイアログにリレーロードバランサーのアドレスを入力します。従属ホスト上のホストオプションダイアログのリレーロードバランサーフィールドに指定されているアドレスを正確に入力します。
5. 5. [Connect]をクリックします。
SSL警告メッセージが表示された場合、従属ホストはRelay Load Balancerに接続できません。SSL警告メッセージに記載されている問題を解決してください。そうすれば、従属ホストはRelay Load Balancerに接続できるはずです。
SSL警告ダイアログが表示されず、別のエラーメッセージが表示された場合(例:利用可能なホストがありません)は、SSL設定に問題がないことを示しています。このテストでは、クライアントがRelay Load Balancerへの接続を開く能力に関係のないエラーメッセージは無視しても構いません。
| 45.938053 | 337 | 0.814294 | jpn_Jpan | 0.458181 |
4fb3c3ea404128dde77a22b893a00b7cb406aa61 | 420 | md | Markdown | devcon-2017-cdct/README.md | jonasholtkamp/talks | 13172e1a5ea3dba508474496be0cb16b47adf51b | [
"Apache-2.0"
] | null | null | null | devcon-2017-cdct/README.md | jonasholtkamp/talks | 13172e1a5ea3dba508474496be0cb16b47adf51b | [
"Apache-2.0"
] | null | null | null | devcon-2017-cdct/README.md | jonasholtkamp/talks | 13172e1a5ea3dba508474496be0cb16b47adf51b | [
"Apache-2.0"
] | null | null | null | # Testing the consumer
1. C: `mvn test` → failure
1. P: `mvn install -DskipTests` → success, stubs installed
1. C: `mvn test` → success
# Testing the provider
1. P: `mvn test` → success
# Altering the results...
1. Change contract
1. P: `mvn test` → failure
1. P: `mvn install -DskipTests` → success, stubs installed
1. C: `mvn test` → failure | 32.307692 | 58 | 0.559524 | eng_Latn | 0.764631 |
4fb5e99771c5c3b763f7a3d86a47e043461eceec | 279 | md | Markdown | README.md | jakemager/Ultra-Music-Festival-Lineup-Picker-2019 | 0dee2082b0078939f46e54e09ba96d392cbe60d0 | [
"MIT"
] | null | null | null | README.md | jakemager/Ultra-Music-Festival-Lineup-Picker-2019 | 0dee2082b0078939f46e54e09ba96d392cbe60d0 | [
"MIT"
] | null | null | null | README.md | jakemager/Ultra-Music-Festival-Lineup-Picker-2019 | 0dee2082b0078939f46e54e09ba96d392cbe60d0 | [
"MIT"
] | null | null | null | # Ultra Music Festival Lineup Picker 2019
Easily pick the lineups you want to attend for Ultra Music Festival 2019
http://jakemager.com/umfLineupPicker/
### Contributing
Feel free to make a pull request. The algrothim is not good at all.
#### Running
`npm i`
`npm start`
| 16.411765 | 72 | 0.74552 | eng_Latn | 0.948228 |
4fb6073f41f0acc2aacbda080d68ee591c3b24d9 | 196 | md | Markdown | README.md | redislabs-training/ru301 | 960ede206bff48294aed894dfa7ce85edfc473b1 | [
"MIT"
] | null | null | null | README.md | redislabs-training/ru301 | 960ede206bff48294aed894dfa7ce85edfc473b1 | [
"MIT"
] | null | null | null | README.md | redislabs-training/ru301 | 960ede206bff48294aed894dfa7ce85edfc473b1 | [
"MIT"
] | null | null | null | # RU301 - Running Redis at Scale
This repository contains files used in the exercises for the Redis University [RU301 Running Redis at Scale](https://university.redis.com/courses/ru301/) course.
| 49 | 161 | 0.790816 | eng_Latn | 0.952192 |
4fb6855f773b5a3afd263f6b1d05eb3895d2b2a3 | 284 | md | Markdown | docs/zh/go/README.md | liduapong/code-docs | 3fbb330532dc7f477f09d63b86cf3bff72467c55 | [
"MIT"
] | 2 | 2019-08-26T11:07:17.000Z | 2019-08-27T17:44:47.000Z | docs/zh/go/README.md | liduapong/code-docs | 3fbb330532dc7f477f09d63b86cf3bff72467c55 | [
"MIT"
] | 9 | 2021-03-01T21:02:54.000Z | 2022-02-26T23:50:16.000Z | docs/zh/go/README.md | liduapong/code-docs | 3fbb330532dc7f477f09d63b86cf3bff72467c55 | [
"MIT"
] | 3 | 2019-08-26T08:09:00.000Z | 2020-03-12T03:17:53.000Z | # 介绍
Go是一种开源编程语言,可以轻松构建简单,可靠,高效的软件。
优点是打包之后的可执行文件无需配置环境即可运行。丢到服务器就可以运行。
- 官方网站 [https://golang.org](https://golang.org)
- 官方Tour [https://tour.golang.org/list](https://tour.golang.org/list)
- Awesome Go [https://github.com/avelino/awesome-go](https://github.com/avelino/awesome-go) | 31.555556 | 91 | 0.746479 | yue_Hant | 0.555907 |
4fb702f209f751cd75dcdcef026aaeb8cdf870c2 | 83 | md | Markdown | README.md | gahughes/esap_demo | f9e6b0a8966abf5194c06ebdd65e50d3131a3cf0 | [
"BSD-3-Clause"
] | null | null | null | README.md | gahughes/esap_demo | f9e6b0a8966abf5194c06ebdd65e50d3131a3cf0 | [
"BSD-3-Clause"
] | null | null | null | README.md | gahughes/esap_demo | f9e6b0a8966abf5194c06ebdd65e50d3131a3cf0 | [
"BSD-3-Clause"
] | null | null | null | # esap_demo
Test for ESAP gammapy
Based on https://github.com/stvoutsin/esap_demo
| 16.6 | 47 | 0.795181 | kor_Hang | 0.707999 |
4fb70c03e233b0ef24efe73c1452167493d8bf53 | 6,899 | md | Markdown | src/concepts/glossary.md | cogment/cogment-doc | 232853a4b041ceb8f0ca48d3e3a7f7ed0497e417 | [
"Apache-2.0"
] | 6 | 2021-05-31T13:09:35.000Z | 2021-11-16T20:45:00.000Z | src/concepts/glossary.md | cogment/cogment-doc | 232853a4b041ceb8f0ca48d3e3a7f7ed0497e417 | [
"Apache-2.0"
] | null | null | null | src/concepts/glossary.md | cogment/cogment-doc | 232853a4b041ceb8f0ca48d3e3a7f7ed0497e417 | [
"Apache-2.0"
] | null | null | null | # Glossary
Terminology is a very important part of understanding new concepts and learning how to use new technology. The words we use throughout our documentation may cause problems if one is not familiar with how we use those words; this is a glossary of terms for newcomers and seasoned developers alike. Some familiar terms may have additional caveats specifically added to their definition in the context of the Cogment Framework (generally for clarity).
[A](#a)—B—C—D—[E](#e)—[F](#f)—G—[H](#h)—I—J—K—L—[M](#m)—N—[O](#o)—[P](#p)—Q—[R](#8)—S—[T](#t)—[U](#u)—V—W—X—Y
## A
### Actor
An actor is somebody or something who/which interacts with the environment by executing certain actions, taking observations in, and receiving rewards (positive or negative) for this interaction. An Actor can be an [Agent](#agent) (of any level of complexity and any type of flexibility, from bots to ML agents), or a human user.
### Actor Class
Each [Actor](#actor) always belongs to a single Actor Class. An Actor Class is primarily defined by its associated [Action Space](#action-space), as a property of an [environment](#environment). For example, pilot and passenger could be two different Actor Classes.
### Action
1. An Action is an interaction an [Actor](#actor) performs on the environment. Actions are picked from the [Action Space](#action-space),
2. A single element of an Action Space.
### Agent
We usually call agent any non-human [Actor](#actor). Agents can use any sort of decision-making underlying system, able to learn or not.
## E
### Environment
1. The environment is the set of rules defining how a trial evolves over time for any given use case. For example, to train a pilot [agent](#agent), a flight simulation would be the environment. [Actors](#actor) can interact with the environment itself, or with each other through the environment, within the boundaries of the environment ruleset (i.e. how an environment can change, from environmental rulesets or the actions of Actors in the environment).
2. A stateful instance of an environment.
### Environment State
An environment state is the specific set of conditions in which the environment is at a specific time (for example, when it is first instantiated). These conditions can be observable or not, and our [Framework](#framework) does not concern itself with the ones that are not.
## F
### Framework
These two elements combined are what we call the framework:
- [Orchestrator](#orchestrator)
- SDKs
### Frontend
The interface, usually an app, that humans use to interact with the rest of the system; the software that turns humans into [Actors](#actor).
## H
### Human - Artificial Intelligence Interaction Loop Training
We call Human - AI interaction loop training the fundamental paradigm our [Framework](#framework) was build for: a continuous loop between humans and agents where they learn from each other. It’s a way to train [agents](#agent) in an environment where direct human interactions, whether between humans, between humans and the environment, or between humans and agents, provide live data to the agents (first part of the loop), _**as well as**_ a way for agents to interact with humans, either directly or through the environment (second part of the loop).
## M
### Message
Messages can be sent from any actor or the environment to any [actor](#actor) or the [environment](#environment). The message can be any protobuf class. This creates channels between any set of [actors](#actor) and the [environment](#environment). These channels can be used for applications where communication between actors and the environment need to be outside of the standard observation and action spaces.
### Model
A model is a representation, usually a mathematical one in our context, of a concept, structure, system, or an aspect of the real world. It is usually a simplified and abstracted representation.
## O
### Observation
An observation is the subset of the [environment state](#environment-state) that an [Actor](#actor) based its choice of [Action](#action) (or lack thereof) on.
### Observation delta
An observation delta is the difference between two [observations](#observation). Usually, we encode deltas from the past to the future.
### Observation transition
An observation transition is an observation delta between two consecutive [observations](#observation).
### Observation space
An Observation space is the set of all possible [observations](#observation) an [Actor](#actor) can make of an [environment](#environment).
### Orchestrator
The Orchestrator is the central piece of our [framework](#framework); it’s an executable that handles several things:
- It circulates data flows between [Actors](#actor) and [Environments](#environment).
- It dumps datasets in the chosen storage location.
- It compresses & encrypts data.
- It collates various [reward](#reward) sources (usually [environment](#environment) or [actors](#actor)) into a single reward for an Actor.
- It instantiates the [trials](#trial).
## P
### Plugin/extension
A plugin or extension adds functionality to our core [framework](#framework).
We provide plugins that handle special features such as Deployment, Dataset storage destinations, Analytics, that one may or may not choose to use alongside the core framework, depending on their specific needs.
### Protocol Buffer
A binary data format for serialized communication, `.proto` files are used to specify the available data structures. You can learn more at <https://developers.google.com/protocol-buffers/>{target=\_blank}.
## R
### Reward
1. A sent reward is a measure of an [Actor’s](#actor) performance within the environment at a given [tick](#tick). The reward can be sent by the environment, and/or a different Actor. They are sent to the [Orchestrator](#orchestrator), which collates them before they are received by the target actor.
2. A received reward is a single measure of an [Actor’s](#actor) performance. It is produced when at least one reward is sent to the actor at a given [tick](#tick).
### Reward function
A reward function describes how an [agent](#agent) "ought" to behave; what behaviours lead to [Rewards](#reward). Note that in our case, Reward functions can be used to reward any [Actor](#actor), regardless of it being human or not.
### Reinforcement Learning (RL)
RL is a specific method to train [agents](#agent), using [reward functions](#reward-function).
## T
### Tick
A tick is a discrete timestep between two [states of the environment](#environment-state). In our [Framework](#framework), ticks within a trial are numbered.
### Trial
A trial is a single run of a [use case](#use-case), with a beginning and end, populated with a single instance of the use case’s [environment](#environment) and its [actors](#actor).
## U
### Use case
The problem one wants to solve.
| 50.727941 | 555 | 0.756921 | eng_Latn | 0.99921 |
4fb8c3dcde6ff4d1f2e223ac101496e9b46e7ab6 | 2,475 | md | Markdown | windows-driver-docs-pr/image/postscanjobendstateevent-postscanjobendstate-jobcompletedstatereasons.md | AmadeusW/windows-driver-docs | 6d272f80814969bbb5ec836cbbebdf5cae52ee35 | [
"CC-BY-4.0",
"MIT"
] | 4 | 2017-05-30T18:13:16.000Z | 2021-09-26T19:45:08.000Z | windows-driver-docs-pr/image/postscanjobendstateevent-postscanjobendstate-jobcompletedstatereasons.md | AmadeusW/windows-driver-docs | 6d272f80814969bbb5ec836cbbebdf5cae52ee35 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | windows-driver-docs-pr/image/postscanjobendstateevent-postscanjobendstate-jobcompletedstatereasons.md | AmadeusW/windows-driver-docs | 6d272f80814969bbb5ec836cbbebdf5cae52ee35 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-09-26T19:45:46.000Z | 2021-09-26T19:45:46.000Z | ---
title: PostScanJobEndStateEvent.PostScanJobEndState.JobCompletedStateReasons
description: PostScanJobEndStateEvent.PostScanJobEndState.JobCompletedStateReasons
ms.assetid: d294c763-a3ef-4e24-b2c5-6066936324d0
keywords: ["PostScanJobEndStateEvent.PostScanJobEndState.JobCompletedStateReasons"]
ms.author: windowsdriverdev
ms.date: 11/28/2017
ms.topic: article
ms.prod: windows-hardware
ms.technology: windows-devices
---
# PostScanJobEndStateEvent.PostScanJobEndState.JobCompletedStateReasons
The **JobCompletedStateReasons** element contains a collection of one or more **JobStateReason** elements that indicate the reasons that a post-scan job completed.
When a post-scan job completes, the DSM scan server sends the DSM device a **PostScanJobEndStateEvent** message. This message contains a **JobCompletedStateReasons** element that gives one or more reasons that the job completed.
The **JobCompletedStateReasons** element and **JobStateReasons** element are defined similarly and have the same sub-elements. Whereas the **JobCompletedStateReasons** element contains status information about a post-scan job that has completed, the **JobStateReasons** element contains status information about a job that might not yet have completed.
The **JobCompletedStateReasons** element supports the following sub-element:
[PostScanJobEndStateEvent.PostScanJobEndState.JobCompletedStateReasons.JobStateReason](postscanjobendstateevent-postscanjobendstate-jobcompletedstatereasons-.md)
[Send comments about this topic to Microsoft](mailto:[email protected]?subject=Documentation%20feedback%20%5Bimage\image%5D:%20PostScanJobEndStateEvent.PostScanJobEndState.JobCompletedStateReasons%20%20RELEASE:%20%2811/8/2017%29&body=%0A%0APRIVACY%20STATEMENT%0A%0AWe%20use%20your%20feedback%20to%20improve%20the%20documentation.%20We%20don't%20use%20your%20email%20address%20for%20any%20other%20purpose,%20and%20we'll%20remove%20your%20email%20address%20from%20our%20system%20after%20the%20issue%20that%20you're%20reporting%20is%20fixed.%20While%20we're%20working%20to%20fix%20this%20issue,%20we%20might%20send%20you%20an%20email%20message%20to%20ask%20for%20more%20info.%20Later,%20we%20might%20also%20send%20you%20an%20email%20message%20to%20let%20you%20know%20that%20we've%20addressed%20your%20feedback.%0A%0AFor%20more%20info%20about%20Microsoft's%20privacy%20policy,%20see%20http://privacy.microsoft.com/default.aspx. "Send comments about this topic to Microsoft")
| 70.714286 | 975 | 0.839192 | eng_Latn | 0.429496 |
4fb95c3049679578345bc1125aa57757d92c38f5 | 4,255 | md | Markdown | _posts/2019-08-28-Download-canon-business-solutions-service-number.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | _posts/2019-08-28-Download-canon-business-solutions-service-number.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | _posts/2019-08-28-Download-canon-business-solutions-service-number.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Canon business solutions service number book
upon it without being afraid of falling through. Warily, that's canon business solutions service number point, we never panic, [Footnote 18: The birch which grows here is the sweet-scented birch Sterm shrugged, and was very pleasant killing it afterwards by a knife-stab behind the shoulder. And they'll probably tell you Maureen is a peach, meditation, they argue that some human lives from each other, and quicker wits, but they wouldn't know Consider a human egg cell, i. " His attention bounces from one to the other as he answers the question twice, his dry manufactured eyes. "Now we only have a third to go, brush-cut hair. As for the woman, she acknowledged that she would have preferred a account it is incumbent on canon business solutions service number to begin by giving a narrative of the bouncing off him, yea, it just makes you stupid, and he stood back admiring the complexity of the life in such a barren enough to the door to be awakened at once by the girl's knock, but all our planning will have to be geared to that fiction, seeing only the track before them in the dim silvery glow of extraordinary rarity. --_Das neu-entdeckte crack the glass, and the lava was rising, or the Corporal Swyleys who stayed out of it and weren't interested as long as they were left alone. since his days in a catboat on Havnor Bay. " She almost left. It's a vicious circle- you'd have to get in there to turn the Army around, little more than a murmur. hundreds of skuas which I have seen, yes?" possibilities, and four The placenta and the amniotic fluid weigh three pounds. He felt oppressed, and it may on her sixteenth birthday. I have to cheek it out. tell you Maureen is a peach, one-half the natural size, which seemed to do the trick, and as two cops strangely like ham sizzling in a skillet. But lately--" Geertz, stomp and stomp. The dresser had been searched, releasing clouds of sparks like fireflies and great black moths of paper ash, shared history. Enlad of the Kings, what device avails the hand of death to stay, i. the canon business solutions service number evening. years. He wasn't going to what-if himself into a panic. By CAPTAIN ALBERT H. Well, so haply I may get of thee profit that shall do away from me my errors and guide me into the way of righteousness, Hey, displaced predators prowling the canon business solutions service number mist. There was no need to speak any name. " b. He found as a consequence that he saw eye-to-eye with every lobbyist up to a point, and peace return, things worth knowing, he naturally turned that sweater had been and how little it had in common with the fingers of a woman. The playful Presence able to identify "Eenie" for them. "Is that what you think?" according to the _Tromsoe Stiftstidende_. hunkers in front of the mutt, either, having no connection with the Nights, he was able to locate the 28th August sighted the westernmost islands, canon business solutions service number without considerable chewed. It was the first real money he had had in his pocket for years: ten ivory counters carved with the Otter of Shelieth on one side and the Rune of Peace canon business solutions service number the other in honour of King Lebannen. " So Adi went forth and admitted Jerir, teeth without tabby, ungrateful for the respite and much the worse for beer, and with a whack knocked the heap into a blaze. Mirrors shattered: a must be a charming and civilized approach that would be proper, so that they walled the world; whilst canon business solutions service number rest of the kings tarried behind, the courtyard of the fountain, and that canon business solutions service number the last word he spoke to Ivory. proposal that it should be taken over by Captain Wiggins, ii, the as holy water to a demon! "No sane person ought to have confidence in a guy whose business "Now the King is in my body, and maybe find a little happiness "Ms. 374 telling them what to do?""Why should they?" The four Kargad islands are mostly arid in climate but fertile when watered and cultivated! " your part does not mean that schedules are slipping down there. "It's a deal. | 472.777778 | 4,142 | 0.787074 | eng_Latn | 0.999964 |
4fba5281ab62605394c33e5658998394a9b759f7 | 739 | md | Markdown | README_Swagger接口列表.md | zxfjd3g/gshop-admin_permission | 82a759bae0269195a54321a10277410c0676282b | [
"MIT"
] | 1 | 2021-01-05T07:53:27.000Z | 2021-01-05T07:53:27.000Z | README_Swagger接口列表.md | zxfjd3g/gshop-admin_permission | 82a759bae0269195a54321a10277410c0676282b | [
"MIT"
] | null | null | null | README_Swagger接口列表.md | zxfjd3g/gshop-admin_permission | 82a759bae0269195a54321a10277410c0676282b | [
"MIT"
] | 1 | 2021-05-26T11:13:46.000Z | 2021-05-26T11:13:46.000Z | # Swagger接口列表
## 接口文档
商品、分类、品牌:http://39.99.186.36:8216/swagger-ui.html
网站首页Banner列表: http://39.99.186.36:8210/swagger-ui.html
活动与优惠券:http://39.99.186.36:8200/swagger-ui.html
购物车:http://39.99.186.36:8201/swagger-ui.html
CMS:http://39.99.186.36:8210/swagger-ui.html
评论:http://39.99.186.36:8209/swagger-ui.html
sku详情:http://39.99.186.36:8202/swagger-ui.html
搜索:http://39.99.186.36:8203/swagger-ui.html
订单:http://39.99.186.36:8204/swagger-ui.html
支付:http://39.99.186.36:8205/swagger-ui.html
用户:http://39.99.186.36:8208/swagger-ui.html
权限: http://39.99.186.36:8170/swagger-ui.html
## 线上项目
后台管理: http://47.93.148.192:7070/
前台PC: http://gmall.atguigu.cn/
库存地址:http://39.99.186.36:9000 | 38.894737 | 58 | 0.664411 | yue_Hant | 0.933355 |
4fbaba29aaa76ccee6667fa4ef80f20203bc3b4f | 2,880 | md | Markdown | content/publication/TDP_fragments/index.md | ConnorScott-Ox/Professional-Profile | 4d542bf996feaa4658f7004d50592a1d1d3213b6 | [
"MIT"
] | null | null | null | content/publication/TDP_fragments/index.md | ConnorScott-Ox/Professional-Profile | 4d542bf996feaa4658f7004d50592a1d1d3213b6 | [
"MIT"
] | null | null | null | content/publication/TDP_fragments/index.md | ConnorScott-Ox/Professional-Profile | 4d542bf996feaa4658f7004d50592a1d1d3213b6 | [
"MIT"
] | null | null | null | ---
abstract: The pathological hallmark of amyotrophic lateral sclerosis (ALS) is the presence of cytoplasmic inclusions, containing C‐terminal fragments of the protein TDP‐43. Here, we tested the hypothesis that highly sensitive mass spectrometry with parallel reaction monitoring (MS‐PRM) can generate a high‐resolution map of pathological TDP‐43 peptide ratios to form the basis for quantitation of abnormal C‐terminal TDP‐43 fragment enrichment. Human cortex and spinal cord, microscopically staged for the presence of p‐TDP‐43, p‐tau, alpha‐synuclein, and beta‐amyloid pathology, were biochemically fractionated and analyzed by immunoblot and MS for the detection of full‐length and truncated (disease‐specific) TDP‐43 peptides. This informed the synthesis of heavy isotope‐labeled peptides for absolute quantification of TDP‐43 by MS‐PRM across 16 ALS, 8 Parkinson’s, 8 Alzheimer’s disease, and 8 aged control cases. We confirmed by immunoblot the previously described enrichment of pathological C‐terminal fragments in ALS‐TDP urea fractions. Subsequent MS analysis resolved specific TDP‐43 N‐ and C‐terminal peptides, including a novel N‐terminal truncation site‐specific peptide. Absolute quantification of peptides by MS‐PRM showed an increased C:N‐terminal TDP‐43 peptide ratio in ALS‐TDP brain compared to normal and disease controls. A C:N‐terminal ratio >1.5 discriminated ALS from controls with a sensitivity of 100% (CI 79.6–100) and specificity of 100% (CI 68–100), and from Parkinson’s and Alzheimer’s disease with a sensitivity of 93% (CI 70–100) and specificity of 100% (CI 68–100). N‐terminal truncation site‐specific peptides were increased in ALS in line with C‐terminal fragment enrichment, but were also found in a proportion of Alzheimer cases with normal C:N‐terminal ratio but coexistent limbic TDP‐43 neuropathological changes. In conclusion this is a novel, sensitive, and specific method to quantify the enrichment of pathological TDP‐43 fragments in human brain, which could form the basis for an antibody‐free assay. Our methodology has the potential to help clarify if specific pathological TDP‐43 peptide signatures are associated with primary or secondary TDP‐43 proteinopathies.
authors:
- Emily Feneberg, Philip Charles, Mattéa J. Finelli, <b>Connor Scott</b>, Benedikt Kessler, Roman Fischer, Olaf Ansorge, Elizabeth Gray, Kevin Talbot, Martin Turner
date: "2020-12-10T00:00:00Z"
doi: "10.1111/bpa.12923"
featured: false
image:
caption:
focal_point: ""
preview_only: false
projects: []
publication: 'Brain Pathology'
publication_short: ""
publication_types:
- "2"
publishDate: "2020-12-10T00:00:00Z"
summary:
tags:
- Source Themes
title: Detection and quantification of novel C-terminal TDP-43 fragments in ALS-TDP
url_code: ""
url_dataset: ""
url_pdf:
url_poster: ""
url_project: ""
url_slides: ""
url_source: ""
url_video: ""
---
| 92.903226 | 2,211 | 0.797222 | eng_Latn | 0.988224 |
4fbac70b2d62e05e0f0635c60061e1a78f88fa97 | 365 | md | Markdown | Chapter 12 - Managing Azure/Ch 12 - Recipe Index.md | louisjamesreeves/myform | c5da0364bec6ec06c46308795df706fb93e25ae8 | [
"MIT"
] | 35 | 2019-02-20T14:49:39.000Z | 2022-01-24T09:56:16.000Z | Chapter 12 - Managing Azure/Ch 12 - Recipe Index.md | louisjamesreeves/myform | c5da0364bec6ec06c46308795df706fb93e25ae8 | [
"MIT"
] | 1 | 2020-09-08T13:38:56.000Z | 2020-09-08T13:39:37.000Z | Chapter 12 - Managing Azure/Ch 12 - Recipe Index.md | louisjamesreeves/myform | c5da0364bec6ec06c46308795df706fb93e25ae8 | [
"MIT"
] | 13 | 2019-02-20T16:22:17.000Z | 2022-01-09T00:26:52.000Z | # Module 12 - Managing Azure
This chapter looks using PowerShell to manage Azure
Recipes in this chapter
- 2.1 - Using PowerShell with Azure
- 2.2 - Creating core Azure resources
- 2.3 - Exploring your storage account
- 2.4 - Creating and using an Azure SMB file share
- 2.5 - Creating and using Azure websites
- 2.6 - Creating and using Azure virtual machines
| 26.071429 | 51 | 0.750685 | eng_Latn | 0.969633 |
4fbc8b0ca320751c274502a6b8565208c7df5a2b | 199 | md | Markdown | 24_10_17/24-10-17.md | Evalle/Docker-Prague-Meetup | f8d57247faa794aea3cb173478c8b89d643425ea | [
"Apache-2.0"
] | null | null | null | 24_10_17/24-10-17.md | Evalle/Docker-Prague-Meetup | f8d57247faa794aea3cb173478c8b89d643425ea | [
"Apache-2.0"
] | null | null | null | 24_10_17/24-10-17.md | Evalle/Docker-Prague-Meetup | f8d57247faa794aea3cb173478c8b89d643425ea | [
"Apache-2.0"
] | null | null | null | # Talks
1. [What's New in Docker](Docker_meetup_Oct_24.pdf) by Evgeny Shmarnev
2. [containerd Internals: Building a Core Container Runtime](containerd_-_2017_Docker_Prague_Meetup.pdf) by Stephen Day
| 49.75 | 119 | 0.81407 | eng_Latn | 0.288686 |
4fbd2a557b160f3f4fc43128d96a5484228762ef | 1,337 | md | Markdown | README.md | whamsicore/RegexWar | ba8261d4bf6788264d143aa27313b687047f4d02 | [
"MIT"
] | null | null | null | README.md | whamsicore/RegexWar | ba8261d4bf6788264d143aa27313b687047f4d02 | [
"MIT"
] | null | null | null | README.md | whamsicore/RegexWar | ba8261d4bf6788264d143aa27313b687047f4d02 | [
"MIT"
] | null | null | null | # Familiar Protein
## [Regex Game]
### Gamified, online tool for learning regex.
## What
Progress tracking, gamified method for learning regular expressions online.
## Why
Regular expression syntax is esoteric and tough to reason through. Learning
regular expressions generally means working them into existing projects on a trial and error
basis, or designing code to learn from.
This approach mixes learning regex with other variables, like regex implementation in
programming languages, and the logic of the code.
Resources such as coderbyte and Codeacademy allow users to learn programming without
worrying about unecessary complexities. This project aims to bring that ease of learning
to regular expressions.
## How
[Regex Game] allows users to focus on learning regex through gamification. Users play through
an ever expanding set of levels to gain regex mastery, accruing points and ranks along the way.
## Words
"[Regex Game] gives users realistic regex use cases while tracking their progress, enabling
a greater learning while ignoring the complexities of modern software projects."
-- Todd Glass, CIO
## Where
[link](url)
## Student
"After learning regular expressions with [Regex Game], I write much more elogant and efficient
code"
-- Ricky Mort
## Get Started
Visit us at [link](link)
| 32.609756 | 96 | 0.776365 | eng_Latn | 0.998338 |
4fbdfdee689854a51814ac6a0db3c305326d9c20 | 4,941 | md | Markdown | README.md | wavevision/sprites-constants-generator-plugin | a075923fac3b19c906f09894a984c6bcbfeb0851 | [
"MIT"
] | 2 | 2019-11-05T18:35:01.000Z | 2019-12-20T11:04:49.000Z | README.md | wavevision/sprites-constants-generator-plugin | a075923fac3b19c906f09894a984c6bcbfeb0851 | [
"MIT"
] | 4 | 2019-11-27T11:59:09.000Z | 2022-02-14T05:26:58.000Z | README.md | wavevision/sprites-constants-generator-plugin | a075923fac3b19c906f09894a984c6bcbfeb0851 | [
"MIT"
] | null | null | null | <p align="center"><a href="https://github.com/wavevision"><img alt="Wavevision s.r.o." src="https://wavevision.com/images/wavevision-logo.png" width="120" /></a></p>
<h1 align="center">Sprites Constants Generator Plugin</h1>
[](https://github.com/wavevision/sprites-constants-generator-plugin/actions?query=workflow%3ACI)
[](https://coveralls.io/github/wavevision/sprites-constants-generator-plugin?branch=master)
[](https://www.npmjs.com/package/@wavevision/sprites-constants-generator-plugin)
[](https://github.com/webpack/webpack)
Webpack 4 plugin to generate PHP constants from SVG sprite symbol IDs based on [svg-sprite-loader](https://github.com/kisenka/svg-sprite-loader#runtime-generator) output.
## Package contents
- webpack [plugin](#plugin-options)
- runtime generator for [svg-sprite-loader](https://github.com/kisenka/svg-sprite-loader#runtime-generator)
- loader to de-colorize each SVG with `fill="currentColor"`
## Installation
Use [Yarn](https://yarnpkg.com)
```bash
yarn add --dev @wavevision/sprites-constants-generator-plugin
```
or [npm](https://npmjs.com)
```bash
npm install --save-dev @wavevision/sprites-constants-generator-plugin
```
> **Note**: It is highly recommended to install and include [svgxuse](https://github.com/Keyamoon/svgxuse) in your bundle.
## Usage
Assuming your sprites are loaded from `<entry>/images/<spriteName>/<image>.svg` and emitted into your build directory as `<outputPath>/images/<spriteName>.svg`, your webpack config can be:
```javascript
import { basename, dirname, resolve } from 'path';
import SpritesConstantsGeneratorPlugin from '@wavevision/sprites-constants-generator-plugin';
import SVGSpriteLoaderPlugin from 'svg-sprite-loader/plugin';
const images = 'images';
const sprites = ['icons', 'pictograms'];
export default {
// configure entries and output
// ...
module: {
rules: [
{
test: /\.svg$/,
include: sprites.map(s => resolve(__dirname, 'src', images, s)),
use: [
{
loader: 'svg-sprite-loader',
options: {
extract: true,
runtimeGenerator:
SpritesConstantsGeneratorPlugin.runtimeGenerator,
spriteFilename: pathname =>
`${images}/${basename(dirname(pathname))}.svg`,
symbolId: '[folder]-[name]',
},
},
SpritesConstantsGeneratorPlugin.loader,
],
},
],
},
plugins: [
new SVGSpriteLoaderPlugin({ plainSprite: true }),
new SpritesConstantsGeneratorPlugin({
namespace: 'App\\UI\\Sprites',
output: resolve(__dirname, 'src', 'App', 'UI', 'Sprites'),
replace: sprite => [`${sprite}-`, ''],
sprites: sprites.map(s => `${images}/${s}.svg`),
}),
],
};
```
This will output to `src/App/UI/Sprites`:
- **`Sprites.php`** containing constants with each sprite name
- for every sprite a **`<SpriteName>.php`** containing constants with each `symbolId` as configured in `svg-sprite-loader`
### Plugin options
#### `ignoreErrors?: boolean`
Run generator even if there are errors in webpack compilation. **This option is disabled by default.**
#### `namespace: string`
PHP namespace in which the generated classes will reside.
#### `output: string`
Absolute path to directory in which the generated classes will be put.
#### `replace?: (sprite: string) => [RegExp | string, string]`
Optional sprite name replacer function. Return a tuple in which:
- the first element is a search value
- the second element is a replacement
Replaces custom parts of sprite name **before** it is used in generated constants.
##### Example
You can see in our webpack config we set `symbolId: '[folder]-[name]'` and `spriteFilename` to use images directory name as sprite name. For images in `icons` folder, that will output `icons.svg` sprite in which each symbol will have `icons-<image>` ID. When generating the constants class this will result in duplicate `ICONS` prefix so you will use the constant as `Icons::ICONS_<image>`. If you want to omit that duplicate, use the function as shown in the example, so the result will be `Icons::<image>`.
#### `sprites: string[]`
List of sprite paths relative to webpack output path.
#### `useStaticClass?: boolean`
If `true`, the generated classes will use [`Nette\StaticClass`](https://api.nette.org/Nette/StaticClass.html) trait. **This option is enabled by default.**
#### `useStrictTypes?: boolean`
If `true`, the generated classes will have strict types declared. **This option is enabled by default.**
| 39.214286 | 508 | 0.700668 | eng_Latn | 0.750802 |
4fbed111563605276405c0a60f57d955fa224e41 | 210 | md | Markdown | README.md | gpmgo/switch | 9022b49dbee279d118b8ae8c8020ce044a7c8c25 | [
"Apache-2.0"
] | 194 | 2015-01-06T12:22:13.000Z | 2022-01-04T08:03:41.000Z | README.md | gpmgo/switch | 9022b49dbee279d118b8ae8c8020ce044a7c8c25 | [
"Apache-2.0"
] | 4 | 2015-07-05T11:30:38.000Z | 2017-03-21T18:18:30.000Z | README.md | gpmgo/switch | 9022b49dbee279d118b8ae8c8020ce044a7c8c25 | [
"Apache-2.0"
] | 23 | 2015-02-11T02:46:52.000Z | 2022-02-16T02:44:50.000Z | # Switch
Switch is a server that provides versioning caching and delivering Go packages service.
## License
This project is under Apache v2 License. See the [LICENSE](LICENSE) file for the full license text. | 30 | 99 | 0.785714 | eng_Latn | 0.998045 |
4fc17bb1ddf9d89a11ac33486ee9226e39e736b4 | 124 | md | Markdown | README.md | abiantorres/deepfake-detection | 6ae3a287187155ae43f4b422bf07d311262e7892 | [
"MIT"
] | 2 | 2019-12-19T11:42:47.000Z | 2021-03-08T19:51:39.000Z | README.md | abiantorres/deepfake-detection | 6ae3a287187155ae43f4b422bf07d311262e7892 | [
"MIT"
] | null | null | null | README.md | abiantorres/deepfake-detection | 6ae3a287187155ae43f4b422bf07d311262e7892 | [
"MIT"
] | null | null | null | # deep-fake-detection
El proyecto ha sido desarrollado como parte de un TFM en Universidad Internacional de la Rioja (UNIR)
| 41.333333 | 101 | 0.806452 | spa_Latn | 0.99877 |
4fc2102741c0c372f5cfa7a46b79d6c45d3f1af1 | 256 | md | Markdown | README.md | nuke504/ESA_Project_Electronic_Dice | 7429b563fc14c1da849239d2237e39b36ceb31c9 | [
"MIT"
] | null | null | null | README.md | nuke504/ESA_Project_Electronic_Dice | 7429b563fc14c1da849239d2237e39b36ceb31c9 | [
"MIT"
] | null | null | null | README.md | nuke504/ESA_Project_Electronic_Dice | 7429b563fc14c1da849239d2237e39b36ceb31c9 | [
"MIT"
] | null | null | null | # ESA Project Electronic Dice
This project contains the R files required for the electronic dice of the Engineering Systems and Architecture Board Game Project
Built using R Shiny. Mathematical concepts are embedded in the R file.
Authors
• Loh Zheng Yi
| 32 | 129 | 0.8125 | eng_Latn | 0.988572 |
4fc22d3a358e3afc9c26262325f3d8342e20d6e5 | 5,175 | md | Markdown | controllers/pvc-label/README.md | storageos/api-manager | 0c2e16649e4af362557d3978fbf879dfbcd46988 | [
"MIT"
] | 1 | 2021-01-29T11:18:32.000Z | 2021-01-29T11:18:32.000Z | controllers/pvc-label/README.md | storageos/api-manager | 0c2e16649e4af362557d3978fbf879dfbcd46988 | [
"MIT"
] | 13 | 2020-10-07T19:58:01.000Z | 2022-03-22T14:33:32.000Z | controllers/pvc-label/README.md | storageos/api-manager | 0c2e16649e4af362557d3978fbf879dfbcd46988 | [
"MIT"
] | 5 | 2020-10-07T19:41:19.000Z | 2022-03-02T22:34:40.000Z | # PVC Label Sync Controller
The PVC Label Sync Controller is responsible for syncing labels set on
Kubernetes PVCs to the StorageOS volume objects.
Labels are initially set at creation time by a custom [CSI Provisioner], that
adds PVC labels to the CSI CreateVolume parameters. These are added on top of
any default parameters set in the StorageClass parameters.
This controller ensures that any PVC label changes are applied.
Some StorageOS functionality, such as setting the number of desired replicas, is
done by setting or changing the `storageos.com/replicas=N` label on a Kubernetes
PVC (where N is from 0-6). This controller ensures that the behaviour is
applied to the StorageOS volume after it has been created.
Other labels, such as `storageos.com/nocache` and `storageos.com/nocompress` can
only be set when the volume is created, so the PVC Label Sync Controller ignores
them.
See [StorageOS Feature Labels] for more information.
## StorageClass Defaults
Cluster administrators may set defaults for volumes by setting feature labels as
parameters in the StorageClass. The PVC Label Sync Controller will load the
StorageClass parameters prior to applying any label changes to ensure that they
are taken into account and not removed.
The controller needs to ensure that the defaults set in the StorageClass have
not changed since the volume was provisioned. Otherwise a change to a feature
label in the StorageClass would get applied to all volumes it provisioned, which
may not be the expected behaviour.
Since StorageClasses are immutable, changing a parameter requires deleting and
recreating the StorageClass. To detect this, when the PVC is created, the UID
of the StorageClass is set in the `storageos.com/storageclass` annotation on the
PVC by the [PVC StorageClass Annotation
Mutator](controllers/pvc-mutator/storageclass/README.md). The PVC Label Sync
Controller verifies that the current StorageClass UID matches. If not, labels
are not synchronised for the PVC.
To re-enable PVC label sync when there is a StorageClass UID mismatch, manually
confirm that any StorageClass parameter changes are intended to be applied, then
remove the PVC StorageClass annotation.
If the PVC does not have the `storageos.com/storageclass` annotation and was
provisioned by StorageOS, the PVC Label Sync Controller will add it, using the
UID of the current StorageClass matching the name. This allows PVCs created
prior to `v2.4.0` (when the [PVC StorageClass Annotation
Mutator](controllers/pvc-mutator/storageclass/README.md) was added), to
participate in PVC Label Sync.
**It is possible that pre-`v2.4.0` PVCs were created with a different StorageClass
than the current, and that the parameters from the new StorageClass will be
applied when a label sync for the PVC is triggered.**
## Trigger
The controller reconcile will trigger on any Kubernetes PVC label update event
where the PVC has the StorageOS CSI driver listed in the storage provisioner
annotation. Specifically, PVCs must have the annotation:
```yaml
volume.beta.kubernetes.io/storage-provisioner: csi.storageos.com
```
The annotation is added by Kubernetes when the PVC is evaluated to determine the
provisioner to use. This is determined by the PVC's StorageClassName parameter,
or if not set, the default StorageClass. Once set it is not changed or removed.
## Reconcile
When the labels on a Kubernetes PVC with the StorageOS provisioner annotation is
updated, a request is made to the StorageOS API to re-apply the labels to the
corresponding StorageOS volume.
Labels prefixed with `storageos.com/` have special meaning, and will likely be
applied with a discrete call to the StorageOS API. This ensures that the
behaviour can be applied in a strongly-consistent manner or return an error.
Remaining labels without the `storageos.com/` prefix will be applied as a single
API call. They have no internal meaning to StorageOS but they can be used to
influence placement decisions.
If a PVC label sync fails, it will be requeued and retried after a backoff
period. It is possible that the application of only a partial set of labels
will succeed. If StorageOS can't apply a certain behaviour change (for example,
if the change would result in a volume going offline), then only that behaviour
change would fail and the remaining changes would be attempted. If any change
fails, the whole set of labels will be retried until they all succeed.
## Resync
In case a PVC label update event was missed during a restart or outage, a
resync runs periodically. It re-applies the set of Kubernetes PVC labels to
StorageOS volumes.
PVC label resync is run every hour by default (configurable via the
`-pvc-label-resync-interval` flag). It can be disabled by setting
`-pvc-label-resync-interval` to `0s`.
Resync is run on startup after a delay defined by the
`-pvc-label-resync-delay` flag.
[CSI Provisioner]: https://github.com/storageos/external-provisioner/tree/53f0949-patched
[StorageOS Feature Labels]: https://docs.storageos.com/docs/reference/labels
## Disabling
The PVC Label Sync Controller can be disabled by setting the
`-enable-pvc-label-sync=false` flag.
| 45.79646 | 89 | 0.80058 | eng_Latn | 0.997439 |
4fc240637a81fa351eea870844606287214794d2 | 1,158 | md | Markdown | docs/build/reference/stub.md | POMATOpl/cpp-docs.pl-pl | ae1925d41d94142f6a43c4e721d45cbbbfeda4c7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/build/reference/stub.md | POMATOpl/cpp-docs.pl-pl | ae1925d41d94142f6a43c4e721d45cbbbfeda4c7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/build/reference/stub.md | POMATOpl/cpp-docs.pl-pl | ae1925d41d94142f6a43c4e721d45cbbbfeda4c7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
description: 'Dowiedz się więcej na temat: STUB'
title: STUB
ms.date: 11/04/2016
f1_keywords:
- STUB
helpviewer_keywords:
- STUB .def file statement
ms.assetid: 0a3b9643-19ed-47e9-8173-ee16bc8ed056
ms.openlocfilehash: 79a2002c119bf211652e2aab51d9656b36e3d159
ms.sourcegitcommit: d6af41e42699628c3e2e6063ec7b03931a49a098
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 12/11/2020
ms.locfileid: "97230292"
---
# <a name="stub"></a>STUB
W przypadku użycia w pliku definicji modułu, który kompiluje sterownik urządzenia wirtualnego (VxD), umożliwia określenie nazwy pliku, która zawiera strukturę IMAGE_DOS_HEADER (zdefiniowaną w programie WINNT. H), która ma być używana w sterowniku urządzenia wirtualnego (VxD), a nie w domyślnym nagłówku.
```
STUB:filename
```
## <a name="remarks"></a>Uwagi
Odpowiednikiem metody określania *nazwy pliku* jest [/stub](stub-ms-dos-stub-file-name.md) opcja konsolidatora.
Element ZASTĘPCZy jest prawidłowy w pliku definicji modułu tylko w przypadku kompilowania sterownika VxD.
## <a name="see-also"></a>Zobacz też
[Reguły dla instrukcji Module-Definition](rules-for-module-definition-statements.md)
| 34.058824 | 304 | 0.791883 | pol_Latn | 0.992347 |
4fc2e256f5eddc99fbcfb1d1d315520fbda497ae | 429 | md | Markdown | docs/index.md | dymaxionlabs/satagro-docs | 7d19d703d1b23a1095483fdcf5266f204670a72d | [
"Apache-2.0"
] | null | null | null | docs/index.md | dymaxionlabs/satagro-docs | 7d19d703d1b23a1095483fdcf5266f204670a72d | [
"Apache-2.0"
] | null | null | null | docs/index.md | dymaxionlabs/satagro-docs | 7d19d703d1b23a1095483fdcf5266f204670a72d | [
"Apache-2.0"
] | null | null | null | # DYMAX
Esta es la documentación oficial de AgroScan, la plataforma de procesamiento de datos satelitales para
la detección de cultivos de Dymaxion Labs.
## Secciones
* [Casos de uso](usage.md): Guía para conocer la potencialidad de la herramienta,
tanto para usuarios como administradores.
* [Metodología](methodology.md): Documento donde se describen las fuentes de
datos y procesos utilizados para generar los mapas.
| 30.642857 | 102 | 0.787879 | spa_Latn | 0.997622 |
4fc5a329936eb7994daa87d16d8af3bf593fef11 | 204 | md | Markdown | api/goals/indicators/4-3-f-0.md | KMendin/sdg-indicators-pl | d2b3ffa5cf432fe9b5a5dd1309fd33173733e0eb | [
"CC0-1.0"
] | 8 | 2020-01-28T15:03:46.000Z | 2022-03-18T13:32:09.000Z | api/goals/indicators/4-3-f-0.md | KMendin/sdg-indicators-pl | d2b3ffa5cf432fe9b5a5dd1309fd33173733e0eb | [
"CC0-1.0"
] | null | null | null | api/goals/indicators/4-3-f-0.md | KMendin/sdg-indicators-pl | d2b3ffa5cf432fe9b5a5dd1309fd33173733e0eb | [
"CC0-1.0"
] | 14 | 2019-04-03T09:58:23.000Z | 2021-07-21T12:28:25.000Z | ---
translation_id: 4-3-f
permalink: /api/krajowe/4/4-3-f.json
sdg_goal: 4
layout: json_krajowe_goal_indicator
indicator: "4.3.f"
zmienne: ogółem
kategorie: null
source_url: www.stat.gov.pl
---
| 18.545455 | 37 | 0.715686 | pol_Latn | 0.264556 |
4fc5d4217a667b12024b941c9f3327c77303740c | 408 | md | Markdown | docs/contribute/README.md | lotrekagency/mapo | a5452da918437dee2576d734cf00b8574c53a955 | [
"MIT"
] | 10 | 2021-01-25T09:10:21.000Z | 2022-03-02T14:19:34.000Z | docs/contribute/README.md | lotrekagency/mapo | a5452da918437dee2576d734cf00b8574c53a955 | [
"MIT"
] | 3 | 2021-01-26T11:40:51.000Z | 2022-02-16T16:16:02.000Z | docs/contribute/README.md | lotrekagency/mapo | a5452da918437dee2576d734cf00b8574c53a955 | [
"MIT"
] | 3 | 2021-01-25T09:10:48.000Z | 2021-04-06T10:11:36.000Z | # How to contribute
Feel free to contribute to the project by making a [Pull Request](https://docs.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request).
Here's an example for a branch naming:
`feature/<branch name>`
Once you're done with your work and are ready to make a Pull Request, set the reviewers to
Gabriele Baldi (bnznamco) and Andrea Morosi (andreamorosi). | 45.333333 | 169 | 0.776961 | eng_Latn | 0.980239 |
4fc6a062d6d1e4b8c963687642b7cabe75ab2387 | 19 | md | Markdown | README.md | AnaliticoVegetales/Backend | 0d62480ab8f8d2f6269bedcb1b28642001a5dc65 | [
"MIT"
] | null | null | null | README.md | AnaliticoVegetales/Backend | 0d62480ab8f8d2f6269bedcb1b28642001a5dc65 | [
"MIT"
] | null | null | null | README.md | AnaliticoVegetales/Backend | 0d62480ab8f8d2f6269bedcb1b28642001a5dc65 | [
"MIT"
] | null | null | null | # Backend
Api Rest
| 6.333333 | 9 | 0.736842 | fra_Latn | 0.448759 |
4fc70ddd24ad7ba20d57d6ff67451c964b914f2f | 262 | md | Markdown | README.md | joeTheK/joethek.github.io | 61514e5735b81bf06885dd4c1d245d86ece7cf70 | [
"MIT"
] | null | null | null | README.md | joeTheK/joethek.github.io | 61514e5735b81bf06885dd4c1d245d86ece7cf70 | [
"MIT"
] | null | null | null | README.md | joeTheK/joethek.github.io | 61514e5735b81bf06885dd4c1d245d86ece7cf70 | [
"MIT"
] | null | null | null | # A Portfolio
*Powered By Jekyll using the [chirpy](https://github.com/cotes2020/jekyll-theme-chirpy) theme*
I am a student at Collegiate School of Medicine and Bioscience, and this is my portfolio.
It contains projets worked on in class and outside of class.
| 43.666667 | 94 | 0.782443 | eng_Latn | 0.998028 |
4fc8a639bf0ec5f5b94a80e9a4773ee9e231b741 | 157 | md | Markdown | README.md | JaysonMandani/fuzzy-journey | 9035c3554d01a73dd74557e910677cb59f223af8 | [
"MIT"
] | null | null | null | README.md | JaysonMandani/fuzzy-journey | 9035c3554d01a73dd74557e910677cb59f223af8 | [
"MIT"
] | null | null | null | README.md | JaysonMandani/fuzzy-journey | 9035c3554d01a73dd74557e910677cb59f223af8 | [
"MIT"
] | null | null | null | # Sales Taxes
### Setup
```
git clone https://github.com/JaysonMandani/fuzzy-journey.git
cd fuzzy-journey
bundle install
ruby lib/sales_taxes.rb
rspec
```
| 13.083333 | 60 | 0.745223 | kor_Hang | 0.176285 |
4fc96f62276d557ff62dfc3ef8d5b865fdad1205 | 4,605 | md | Markdown | cats_api_data/README.md | augustovan/apiCatsProject | 42e3a2e2b86e0a495a9dba781585a118bc8b81e8 | [
"MIT"
] | null | null | null | cats_api_data/README.md | augustovan/apiCatsProject | 42e3a2e2b86e0a495a9dba781585a118bc8b81e8 | [
"MIT"
] | null | null | null | cats_api_data/README.md | augustovan/apiCatsProject | 42e3a2e2b86e0a495a9dba781585a118bc8b81e8 | [
"MIT"
] | null | null | null | <p>
<img alt="Version" src="https://img.shields.io/badge/version-v1-blue.svg?cacheSeconds=2592000" />
<a href="/" target="_blank">
<img alt="Documentation" src="https://img.shields.io/badge/documentation-yes-brightgreen.svg" />
</a>
<a href="/LICENSE" target="_blank">
<img alt="License: MIT" src="https://img.shields.io/badge/License-MIT-yellow.svg" />
</a>
<a href="https://twitter.com/vitikovan" target="_blank">
<img alt="Twitter: vitikovan" src="https://img.shields.io/twitter/follow/vitikovan.svg?style=social" />
</a>
</p>
# Projeto API The Cats :cat:
É um projeto que captura dados de uma API chamada de [The Cats API](https://thecatapi.com/)
## :checkered_flag:Features
- Para cada uma das raças de gatos disponíveis, armazenar as informações de
origem, temperamento e descrição em uma base de dados. (disponivel na prox.release):heavy_multiplication_x:
- API disponibiliza o seu consumo de toda base de dados (Mongo) :heavy_check_mark:
- API disponibiliza o seu consumo por raça na base de dados. :heavy_check_mark:
- API disponibiliza o seu consumo por Temperamento na base de dados. :heavy_check_mark:
- API disponibiliza o seu consumo por Origem na base de dados. :heavy_check_mark:
## :cloud: Technologies
Este projeto foi desenvolvido com as seguintes tecnologias:
- [Axios](https://github.com/axios/axios), Para fazer o consumo na URL https://thecatapi.com/
- [Express](https://github.com/expressjs/express) usado para criar as rotas dentro do node.
- [Body-parser](https://github.com/expressjs/body-parser) Faz parte do Express
- [Mongoose](https://github.com/Automattic/mongoose) usado para fazer a conexão com base de dados Mongo.
- [Prom-client](https://github.com/siimon/prom-client) Para fazer a captura das metricas para monitoramento.
## :information_source: How To Use
Para clonar e executar este aplicativo, você precisará de [Git](https://git-scm.com),e do [Docker Swarm](https://docs.docker.com/compose/install/) instalado na sua maquina. Na sua linha de comando execute:
```bash
# Clonar este repositório
$ git clone https://github.com/augustovan/apiCatsProject
# Entre no repositório
$ cd apiCatsProject/cast_api_data
# E Executar o Docker compose
$ docker-compose up -d prometheus
$ docker-compose up -d grafana
$ docker-compose up -d grafana-dashboards
$ sudo docker-compose up -d --build nodejs-application-cats
# Inicie o node e começe a consumir a API
$ docker exec nodejs-application-cats node index.js
```
### :electric_plug: Metodo para consumo da API
| URL | Função |
| -----------------------------------------|--------------------------------------------------|
| http://localhost:5001/api/cats | Traz todos os gatos do Banco de dados |
| http://localhost:5001/api/cats/b=[dados] | Traz os gatos por raça do Banco de dados |
| http://localhost:5001/api/cats/o=[dados] | Traz os gatos por Origem do Banco de dados |
| http://localhost:5001/api/cats/t=[dados] | Traz os gatos por Temperamento do Banco de dados |
### :rocket: Postman
- Tela de consulmo da API trazendo todos os dados do site APICats.
<img src="github/postman01.png"
alt="Grafana"
style="float: left; margin-right: 10px;" />
- Tela de consulmo da API trazendo todos os dados do site APICats.
<img src="github/postman05.png"
alt="Grafana"
style="float: left; margin-right: 10px;" />
- Tela de consulmo da API_Local trazendo os dados por Origem.
<img src="github/postman02.png"
alt="Grafana"
style="float: left; margin-right: 10px;" />
- Tela de consulmo da API_Localo os dados por Raça.
<img src="github/postman03.png"
alt="Grafana"
style="float: left; margin-right: 10px;" />
- Tela de consulmo da API_Local trazendo os dados por ID.
<img src="github/postman04.png"
alt="Grafana"
style="float: left; margin-right: 10px;" />
- Tela de consulmo da API_Local trazendo os dados por Temperamento.
<img src="github/postman06.png"
alt="Grafana"
style="float: left; margin-right: 10px;" />
## :fire: Monitormaneto da API (via Prometheus + Grafana)
- Tela monitoramento Grafana.
<img src="github/grafana_api.png"
alt="Grafana"
style="float: left; margin-right: 10px;" />
- Tela monitoramento Promethus.
<img src="github/prometheus-api.png"
alt="Prometheus"
style="float: left; margin-right: 10px;" />
## 📝 License
Copyright © 2020 [Victor Nascimento](https://github.com/augustovan).<br />
This project is [MIT](/LICENSE) licensed.
| 35.152672 | 205 | 0.684691 | por_Latn | 0.876687 |
4fca73f19eba6ca9d6c404ca04960fd0356202c0 | 273 | md | Markdown | release-notes/ios/monotouch_3/monotouch_3.2.6/index.md | xamarin/release-notes-archive | 9bac84d3db0a16bcb258c602f71eccfa814ba0a0 | [
"CC-BY-4.0",
"MIT"
] | 5 | 2020-06-17T17:52:53.000Z | 2021-06-29T04:11:41.000Z | release-notes/ios/monotouch_3/monotouch_3.2.6/index.md | xamarin/release-notes-archive | 9bac84d3db0a16bcb258c602f71eccfa814ba0a0 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | release-notes/ios/monotouch_3/monotouch_3.2.6/index.md | xamarin/release-notes-archive | 9bac84d3db0a16bcb258c602f71eccfa814ba0a0 | [
"CC-BY-4.0",
"MIT"
] | 12 | 2019-08-07T14:31:02.000Z | 2022-02-14T09:33:24.000Z | ---
id: 4EF0AA14-A00A-CF70-BD2A-ED144C69459C
title: "MonoTouch 3.2.6"
---
- NSMutableData (uint size) constructor now sets the initial capacity, instead of setting the current length.
- Address an incompatibility if installed without installing the iOS 4.3 SDK
| 27.3 | 111 | 0.758242 | eng_Latn | 0.94338 |
4fcbf3f20133116a9dd92e35247879bcea3f721c | 629 | md | Markdown | README.md | doctorham/hambot | ced16e472e7915bfb4e2281cc64e5429f4ae851c | [
"MIT"
] | null | null | null | README.md | doctorham/hambot | ced16e472e7915bfb4e2281cc64e5429f4ae851c | [
"MIT"
] | 1 | 2017-09-10T22:06:10.000Z | 2017-09-22T02:06:15.000Z | README.md | doctorham/hambot | ced16e472e7915bfb4e2281cc64e5429f4ae851c | [
"MIT"
] | null | null | null | # hambot
## Setup
`hambot` reads its configuration from a JSON file. It is read from, in order:
1. The path supplied as an argument to `hambot`
2. The file `config.json` in the same directory as the `hambot` executable
3. The file `hambot/config.json` in one of the standard configuration paths
(see [configdir](https://github.com/shibukawa/configdir)).
See [main.go](main.go) for a description of the format.
## Submission guidelines
Format source files with `gofmt` and fix issues reported by `golint`.
These commands are executed automatically by some editors,
such as [Visual Studio Code](http://code.visualstudio.com/). | 44.928571 | 77 | 0.758347 | eng_Latn | 0.992102 |
4fcce8270605aaf1d6ba454812b057023db410c7 | 700 | md | Markdown | README.md | mlavergn/swiftutil | 4e8435f73a5025e7d742f1bcec78814d33ca611d | [
"MIT"
] | null | null | null | README.md | mlavergn/swiftutil | 4e8435f73a5025e7d742f1bcec78814d33ca611d | [
"MIT"
] | null | null | null | README.md | mlavergn/swiftutil | 4e8435f73a5025e7d742f1bcec78814d33ca611d | [
"MIT"
] | null | null | null | # Swift Util
### Swift utility package.
Introduction
--
This a collection of helper classes and extensions that I've used in some projects. There are bits that might prove interesting interesting to others.
Dependencies
--
Assumptions are iOS 10.x / macOS 10.12.x, but it may work on earlier revs.
The are no external dependencies beyond Apple's Frameworks. So this is somewhat platform independent, so long as that platform is one of Apple's.
Installation
--
```bash
make
```
## Warning

This package is based Swift 3.0 syntax, until we reach a stable ABI, mayhem may still ensue.
| 26.923077 | 151 | 0.768571 | eng_Latn | 0.993755 |
4fccf95ae9466fa5eed597f5365ef15a41d020fa | 5,081 | md | Markdown | microsoft-365/security/office-365-security/siem-server-integration.md | MicrosoftDocs/microsoft-365-docs-pr.nl-NL | 6e5ca6bfc9713df2e28347bd1ca1c86d38113afb | [
"CC-BY-4.0",
"MIT"
] | 4 | 2020-05-18T20:11:39.000Z | 2022-01-29T01:34:41.000Z | microsoft-365/security/office-365-security/siem-server-integration.md | MicrosoftDocs/microsoft-365-docs-pr.nl-NL | 6e5ca6bfc9713df2e28347bd1ca1c86d38113afb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | microsoft-365/security/office-365-security/siem-server-integration.md | MicrosoftDocs/microsoft-365-docs-pr.nl-NL | 6e5ca6bfc9713df2e28347bd1ca1c86d38113afb | [
"CC-BY-4.0",
"MIT"
] | 3 | 2021-03-14T23:50:07.000Z | 2021-08-19T14:14:01.000Z | ---
title: SIEM-serverintegratie met Microsoft 365 services en toepassingen
f1.keywords:
- NOCSH
ms.author: deniseb
author: denisebmsft
manager: dansimp
audience: ITPro
ms.topic: article
ms.date: 11/18/2019
ms.localizationpriority: medium
ms.collection:
- M365-security-compliance
ms.custom:
- Ent_Solutions
- SIEM
- seo-marvel-apr2020
description: Een overzicht krijgen van de siem-serverintegratie (Security Information and Event Management) met uw Microsoft 365 cloudservices en -toepassingen
ms.technology: mdo
ms.prod: m365-security
ms.openlocfilehash: 71fff15b1493f6e8e15becbd87ad55947c8eddc4
ms.sourcegitcommit: d4b867e37bf741528ded7fb289e4f6847228d2c5
ms.translationtype: MT
ms.contentlocale: nl-NL
ms.lasthandoff: 10/06/2021
ms.locfileid: "60169417"
---
# <a name="security-information-and-event-management-siem-server-integration-with-microsoft-365-services-and-applications"></a>SiEM-serverintegratie (Security Information and Event Management) met Microsoft 365 services en toepassingen
**Van toepassing op**
- [Exchange Online Protection](exchange-online-protection-overview.md)
- [Abonnement 1 en abonnement 2 voor Microsoft Defender voor Office 365](defender-for-office-365.md)
- [Microsoft 365 Defender](../defender/microsoft-365-defender.md)
[!INCLUDE [Microsoft 365 Defender rebranding](../includes/microsoft-defender-for-office.md)]
## <a name="summary"></a>Samenvatting
Gebruikt of plant uw organisatie een SIEM-server (Security Information and Event Management) te krijgen? U vraagt zich misschien af hoe het wordt geïntegreerd met Microsoft 365 of Office 365. Dit artikel bevat een lijst met bronnen die u kunt gebruiken om uw SIEM-server te integreren met Microsoft 365 services en toepassingen.
> [!TIP]
> Als u nog geen SIEM-server hebt en uw opties verkent, kunt u overwegen om een Microsoft Azure **[Zoeken.](/azure/sentinel/overview)**
## <a name="do-i-need-a-siem-server"></a>Heb ik een SIEM-server nodig?
Of u een SIEM-server nodig hebt, hangt af van veel factoren, zoals de beveiligingsvereisten van uw organisatie en waar uw gegevens zich bevinden. Microsoft 365 bevat een groot aantal beveiligingsfuncties die voldoen aan de beveiligingsbehoeften van veel organisaties zonder extra servers, zoals een SIEM-server. Sommige organisaties hebben speciale omstandigheden waarvoor het gebruik van een SIEM-server is vereist. Dit zijn enkele voorbeelden:
- *Fabrikam* heeft een aantal on-premises inhoud en toepassingen, en sommige in de cloud (ze hebben een hybride cloudimplementatie). Om beveiligingsrapporten over al hun inhoud en toepassingen te krijgen, heeft Fabrikam een SIEM-server geïmplementeerd.
- *Contoso* is een financiële serviceorganisatie die bijzonder strenge beveiligingsvereisten heeft. Ze hebben een SIEM-server toegevoegd aan hun omgeving om te profiteren van de extra beveiliging die ze nodig hebben.
## <a name="siem-server-integration-with-microsoft-365"></a>SIEM-serverintegratie met Microsoft 365
Een SIEM-server kan gegevens ontvangen van een groot aantal Microsoft 365 services en toepassingen. In de volgende tabel vindt u Microsoft 365 services en toepassingen, samen met SIEM-serverinvoer en -resources voor meer informatie.
<br/><br/>
|Microsoft 365 Service of Toepassing|SIEM-serverinvoer/-methoden|Informatiebronnen|
|---|---|---|
|[Microsoft Defender voor Office 365](defender-for-office-365.md)|Auditlogboeken|[SIEM-integratie met Microsoft Defender voor Office 365](siem-integration-with-office-365-ti.md)|
|[Microsoft Defender voor Eindpunt](/windows/security/threat-protection/)|HTTPS-eindpunt dat wordt gehost in Azure <p> REST API|[Waarschuwingen naar uw SIEM-hulpprogramma's trekken](../defender-endpoint/configure-siem.md)|
|[Microsoft Cloud App Security](/cloud-app-security/what-is-cloud-app-security)|Logboekintegratie|[SIEM-integratie met Microsoft Cloud App Security](/cloud-app-security/siem)|
> [!TIP]
> Bekijk Azure [Sentinel.](/azure/sentinel/overview) Azure Sentinel wordt geleverd met connectors voor Microsoft-oplossingen. Deze connectors zijn 'out of the box' beschikbaar en bieden realtime integratie. U kunt Azure Sentinel gebruiken met uw Microsoft 365 Defender-oplossingen en Microsoft 365-services, waaronder Office 365, Azure AD, Microsoft Defender voor identiteit, Microsoft Cloud App Security en meer.
### <a name="audit-logging-must-be-turned-on"></a>Auditregistratie moet zijn ingeschakeld
Controleer of auditregistratie is ingeschakeld voordat u de SIEM-serverintegratie configureert.
- Zie Voor SharePoint Online, OneDrive voor Bedrijven en Azure Active Directory controle [in-](../../compliance/turn-audit-log-search-on-or-off.md)of uitschakelen.
- Zie Exchange Online controle [van postvakken beheren](../../compliance/enable-mailbox-auditing.md)voor meer Exchange Online.
## <a name="more-resources"></a>Meer informatie
[Beveiligingsoplossingen integreren in Azure Defender](/azure/security-center/security-center-partner-integration#exporting-data-to-a-siem)
[Microsoft-Graph beveiligings-API-waarschuwingen integreren met een SIEM](/graph/security-integration)
| 65.141026 | 445 | 0.804566 | nld_Latn | 0.985323 |
4fcd685b2c38ed34c7097c11b936485d33a403ab | 8,457 | md | Markdown | README.md | brfriedr/Cisco-DNA-Center-Webex-Bot | c7b0407441723c78ce9cd386013582729914bcd9 | [
"BSD-Source-Code"
] | null | null | null | README.md | brfriedr/Cisco-DNA-Center-Webex-Bot | c7b0407441723c78ce9cd386013582729914bcd9 | [
"BSD-Source-Code"
] | null | null | null | README.md | brfriedr/Cisco-DNA-Center-Webex-Bot | c7b0407441723c78ce9cd386013582729914bcd9 | [
"BSD-Source-Code"
] | null | null | null | # Cisco DNA Center Bot
DNA Center is a complete management and control platform that simplifies and streamlines network operations. This single, extensible software platform includes integrated tools for NetOps, SecOps, DevOps and IoT connectivity with AI/ML technology integrated throughout.
#### Benefits
* **Simplify Management.** Operate your local and branch networks over a centralized dashboard.
* **Increase Security.** Translate business intent into zero-trust policies and dynamic segmentation of endpoints based on usage behavior.
* **Lower Costs.** Policy-driven provisioning and guided remediation increase network uptime and reduce time spent managing network operations.
* **Transform your network.** Deploy cloud services and applications that benefit from the intelligent network optimization delivered by Cisco DNA Center.
* **Ensure network and application performance:** AI/ML network insights reduce time spent managing network operations and improve user experience.
* **Facilitate offsite IT teams:** Optimized for remote access, a clean, organized dashboard with single-button workflows makes remote management easy.
Webex Teams Bots gives users access to outside services right from their Webex spaces. Bots help users automate tasks, bring external content into the discussions, and gain efficiencies. Bots come in all different shapes and sizes such as notifiers, controllers, and assists.
The ability to integrate Cisco DNA Center Platform API's into Webex Bots provides us a powerful way to manage and get insights into whats happing within our network.
# What are we going to do?
We are going to create a Webex Bot that uses the DNA Center API's to do the following tasks.
* List Devices
* Show Device Configuration
* Show Image Compliance
* Alerting For Assurance Issues
# Prerequisites
If you don't already have a [Webex Teams](https://www.webex.com/team-collaboration.html) account, go ahead and register for one. They are free!
1. You'll need to start by adding your bot to the Webex Teams website
[https://developer.webex.com/my-apps](https://developer.webex.com/my-apps)
2. Click **Create a New App**

3. Click **Create a Bot**

4. Fill out all the details about your bot.

5. Click **Add Bot**
6. On the Congratulations screen, make sure to copy the Bot's Access token, you will need this.
# ngrok
[ngrok](https://ngrok.com/) makes it easy for you to develop your code with a live bot. Ngrok exposes local servers behind NATs and firewalls to the public internet over secure tunnels. It instantly creates a public HTTPS url for a webist running locacally on your development machine. Ngrok offloads TLS, so you don't have to worry about your configuration.
You can find installation instructions here: https://ngrok.com/download
1. After you've installed ngrok, in another window start the service
ngrok http 8080
2. You should see screen that looks like this:
ngrok by @inconshreveable (Ctrl+C to quit)
Session Status online
Account [email protected] (Plan: Free)
Version 2.3.40
Region United States (us)
Web Interface http://127.0.0.1:4040
Forwarding http://this.is.the.url.you.net.ngrok.io -> http://localhost:8080
Forwarding http://this.is.the.url.you.net.ngrok.io -> http://localhost:8080
Connections ttl opn rt1 rt5 p50 p90
0 0 0.00 0.00 0.00 0.00
3. You will use the Forwarding URL in the config.py file with this URL:
TEAMS_BOT_URL=os.envrion.get('TEAMS_BOT_URL','http://this.is.the.url.you.net.ngrok.io')
# Config File
Update the config.py file with the relevant information of your Cisco DNA Center Appliance or you can use our DNA Center Sandbox.
Example config.py with Cisco DNA Center Sandbox information:
import os
DNAC=os.environ.get('DNAC','https://sandboxdnac.cisco.com')
DNAC_PORT=os.environ.get('DNAC_PORT',443)
DNAC_USER=os.environ.get('DNAC_USER','devnetuser')
DNAC_PASSWORD=os.environ.get('DNAC_PASSWORD','Cisco123!')
WEBEX_BOT_ACCESS_TOKEN=os.environ.get('WEBEX_BOT_ACCESS_TOKEN','Enter your webex toekn from the preq step 6')
TEAMS_BOT_URL=os.envrion.get('TEAMS_BOT_URL','http://this.is.the.url.you.net.ngrok.io')
# Webex Bot Script
The Bot.py script is leveraging the Flask web service [micro-framework](http://flask.pocoo.org/). We are using ngrok to be used to tunnel traffic back to your local machine sites. When interacting with the Bot, we are calling functions in the DNAC.py script to make API calls to the Cisco DNA Center you specified in the config file.
This script is an example of different interactions you could create between your Bot and Cisco DNA Center.
Below are the example interactions with the Bot Script.
Help me - I will display what I can do.
Hello - I will display my greeting message.
List devices - I will list devices in DNA Center.
Show Device Config - I will attach the device configuraiton for your device.
Image Compliance - I will show the image software compliance.
Update the config.py parameters then run the Bot.py script to see it in action!
Example output

# Cisco DNA Center Real Time Event Alerts to Webex Teams Bot
Cisco DNA Center has a powerful issue correlation engine for wired and wireless networks. Real time feeds of network telemetry is able to identify issues and provide context for resolution. We now have the ability to send those notifications to a Webex Team Rooms in 2.2.3.0 release.
1.) In Cisco DNA Center navigate to Platform -> Developer Toolkit and the Events Tab.

2.) Select the events you want to be notified about to your Webex Teams Room then click "Subscribe".
3.) Create a Name for the subscription then select Webex for the Subscription Type.

4.) Enter the Webex URL along with the Webex Room ID where you want the alerts to be posted and your Webex Access Bot Token. (You can find your webex room id's at [developer.webex.com](https://developer.webex.com/docs/api/v1/rooms/get-room-meeting-details))


5.) You can now test your integration by selecting "Try it"

6.) If you setup everything correctly you will see the notification in your Cisco Webex Team Room.

# Links to DevNet Learning Labs
Here is a link to the Cisco DNA Center Devnet Learning Lab to learn how to leverage the Cisco DNA Center API's.
[Introduction to Cisco DNA Center REST APIs](https://developer.cisco.com/learning/modules/dnac-rest-apis)
# License
This project is licensed to you under the terms of the [Cisco Sample Code License.](https://github.com/brfriedr/Cisco-DNA-Center-Webex-Bot/blob/main/LICENSE)
| 56.006623 | 359 | 0.734539 | eng_Latn | 0.930539 |
4fcde7663680405e85bb61cedf53a9eeaf7a9eb7 | 373 | md | Markdown | data/README.md | ajaech/twittercommunities | 07bf2114a276338735d2e0a130dd7c075f0f1225 | [
"Apache-2.0"
] | 4 | 2018-10-03T21:29:30.000Z | 2020-12-05T07:45:24.000Z | data/README.md | ajaech/twittercommunities | 07bf2114a276338735d2e0a130dd7c075f0f1225 | [
"Apache-2.0"
] | 2 | 2018-07-04T18:59:13.000Z | 2019-04-17T04:36:55.000Z | data/README.md | ajaech/twittercommunities | 07bf2114a276338735d2e0a130dd7c075f0f1225 | [
"Apache-2.0"
] | 1 | 2019-04-13T03:39:32.000Z | 2019-04-13T03:39:32.000Z |
Description of files:
* `topics.txt` has a list of the 975 trending terms that were used as queries to collect the general population data.
* `community_twids.txt.gz` has the tweet ids for the tweets collected for the communities used in the community member retrieval task.
* `vocab.txt` is a text file listing all the tokens in the vocabulary used for the experiments.
| 53.285714 | 134 | 0.785523 | eng_Latn | 0.999661 |
4fcebd715fd5efe8a3b481ddeae7d95e5f381838 | 340 | md | Markdown | cran-comments.md | JiaxiangBU/add2gh | e9dff98980fadd6c2f0d7d6b18d1de9d27594845 | [
"MIT"
] | null | null | null | cran-comments.md | JiaxiangBU/add2gh | e9dff98980fadd6c2f0d7d6b18d1de9d27594845 | [
"MIT"
] | 3 | 2019-06-05T16:55:30.000Z | 2019-07-29T13:02:12.000Z | cran-comments.md | JiaxiangBU/add2gh | e9dff98980fadd6c2f0d7d6b18d1de9d27594845 | [
"MIT"
] | null | null | null | ## Test environments
* local OS X install, R 3.6.0
* ubuntu 14.04 (on travis-ci), R 3.6.0
* win-builder (devel and release)
## R CMD check results
0 errors | 0 warnings | 3 note
* Fix the 'GitHub' into single quote in the title and description.
* Update the year of the LICENSE.
* Use 'donttest' instead of 'dontrun' into the functions.
| 26.153846 | 66 | 0.705882 | eng_Latn | 0.97038 |
4fcf25bad6539bbfe3d8ca936ed5425812bdf466 | 460 | md | Markdown | _blogger-exports/rachelprestonprinz/exported/great-quote-on-wisdom-by-jonathanfields.md | Archinia/archinia-com | 7fc04f565853f00691f6eac6a313a001cea053a2 | [
"MIT"
] | null | null | null | _blogger-exports/rachelprestonprinz/exported/great-quote-on-wisdom-by-jonathanfields.md | Archinia/archinia-com | 7fc04f565853f00691f6eac6a313a001cea053a2 | [
"MIT"
] | 30 | 2018-03-03T18:26:39.000Z | 2022-01-19T19:14:12.000Z | _blogger-exports/rachelprestonprinz/exported/great-quote-on-wisdom-by-jonathanfields.md | Archinia/archinia-com | 7fc04f565853f00691f6eac6a313a001cea053a2 | [
"MIT"
] | null | null | null | ---
title: 'Great Quote on Wisdom by @jonathanfields via @DaniellelaPorte'
date: 2015-08-08T11:31:00.002-06:00
draft: false
slug: great-quote-on-wisdom-by-jonathanfields
tags: [Great Quotes]
---
"Conventional wisdom. Please, I should buy into the collective perceived limitations of large numbers of people who’ve either tried and failed or, far more likely, never had the balls to try? Why would I ever let THAT guide or limit my own life?" - Jonathan Fields | 51.111111 | 264 | 0.771739 | eng_Latn | 0.987616 |
4fd0dcf78e462b2b45a44064228194347dff2762 | 534 | md | Markdown | README.md | NitroFunk/MyStackCard | 02e2311925fa8d9e310617245497f2dd040bcd9d | [
"Apache-2.0"
] | 1 | 2021-03-25T09:22:32.000Z | 2021-03-25T09:22:32.000Z | README.md | NitroFunk/MyStackCard | 02e2311925fa8d9e310617245497f2dd040bcd9d | [
"Apache-2.0"
] | null | null | null | README.md | NitroFunk/MyStackCard | 02e2311925fa8d9e310617245497f2dd040bcd9d | [
"Apache-2.0"
] | null | null | null | # MyStackCard
An minature lightweight app developed in flutter for stacking reminders or cards
future update will come by , and a public apk will be available ,
feel free to look at the code as it's open source and you can fork it if you want.
If you want tutorials for flutter or any other app development tutorials/courses i would personally
recommend https://www.appbrewery.co/ , where Dr.Angela Yu and her colleagues will give you a good
hands on flutter. After which you should definitely explore , publish and have fun!
| 59.333333 | 100 | 0.784644 | eng_Latn | 0.999512 |
4fd0f8a9504e5c3584f54998618282f23206b8a8 | 4,622 | md | Markdown | _posts/2015-05-31-calendario-library.md | zyzo/zyzo.github.io | 9a3b7e5dff73ea77ba3e540425900629bbd3c384 | [
"MIT"
] | 5 | 2015-05-08T15:21:01.000Z | 2016-06-10T09:57:44.000Z | _posts/2015-05-31-calendario-library.md | zyzo/zyzo.github.io | 9a3b7e5dff73ea77ba3e540425900629bbd3c384 | [
"MIT"
] | null | null | null | _posts/2015-05-31-calendario-library.md | zyzo/zyzo.github.io | 9a3b7e5dff73ea77ba3e540425900629bbd3c384 | [
"MIT"
] | null | null | null | ---
layout: post
title: "Calendario : another JS calendar library"
date: 2015-06-05 17:37:00
categories: gsoc fossasia
tags:
- calendario
- javascript
comments: true
---
Calendario is a lightweight [calendar library](https://github.com/codrops/Calendario) that I happen to find on the net. Their approach to calendaring solution is rather unique : it's not a full-blown library like jQuery [FullCalendar](http://fullcalendar.io/) where everthing is ready to be used. With Calendario, calling init function simply generates plains html source code. The styling part is completely up to user.
Another plus is it's lightweight (350 lines of code more or less). It's easy to dive into the source and make advanced adaptations to specific use-case.
But what's really remarkable is their examples. As I said the styling is out-sourcing to user, and they provide [stunning examples](http://deviprsd21.github.io/Calendario/) that demonstrate how to do it. For me (a developer "designing" stuffs), readily beautiful examples are a crucial selling point.
But (there's always a but..), Calendario is very poorly documented. A pointer to the same [blog post](http://tympanus.net/codrops/2012/11/27/calendario-a-flexible-calendar-plugin/) all over again is not enough ! I was very confusing about how things work at first, especially about the data model. This blog contains what I found out (rather painfully) about Calendario library :
### Initialization
{% highlight html %}
<div id="calendar_id"></div>
<script>
$('#calendar_id').calendario(options);
</script>
{% endhighlight %}
### Available options
| Option | Purpose* | Default |
| ------------- |:---------------------------------:| -----:|
| weeks | weekday labels | ['Sunday', 'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday']
| weekabbrs | weekday short labels | ['Sun', 'Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat'] |
| months | month label | ['January', 'February', 'March', 'April', 'May', 'June', 'July', 'August', 'September', 'October', 'November', 'December'] |
| monthabbrs | monday short labels | ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'] |
| displayWeekAbbr | display week short labels | false |
| displayMonthAbbr | display month short labels | false |
| startIn | | 1 |
| fillEmpty | | true |
| zone | | '00:00' |
| events | | [['click'], [focus']]|
| checkupdate | | true |
| weekdays | | 'MON, TUE, WED, THU, FRI'|
| weekends | | 'SAT, SUN' |
| feed | | http://calendario.t15.org/sync/ |
| month | | |
| year | | |
| caldata | calendar events data | |
<br/>
Most of them you can safely ignore, I enlisted them all for the sake of completeness. Checkout the next section for interesting ones.
### Option format
caldata
A list of json objects. The list's keys and values contain following information :
* key : the event date in MM-DD-YYYY format
* value : an array of event values that occured on the given date. Each case of the array can have following fields :
| Option | Purpose | Value type |
| content| title of the event | string |
| url | Link to event page | string |
| note | added note for the event, will be included in the html as inside <note> tag | string |
| startday | | string, in MM-DD-YYYY format |
| endday | | string, in MM-DD-YYYY format |
| allDay| true for all day event| boolean |
<br/>
<u>Examples</u>
{% highlight javascript %}
var events = {
'06-06-2015' : [{content: 'Important Meeting at 4', startTime: '04:00', endTime: '06:00', note : 'Everyone please gather at Markketing Department'}],
'06-07-2015' : [{content : 'First event of July 7', allDay : true},
{content : 'Second event of July 7', allDay : true}
],
'07-01-2015' : [{content: 'All day event', allDay: true, note : 'So loongg !'}]
}
{% endhighlight %}
### Wrap-up : Pros and cons of Calendario
* Pros :
* lightweight
* fully customizable design
* beautiful examples
* Cons
* poor documentation
* lots of work undone, not recommended for those who want a powerful library that just work
That's about it ! Found something wrong, or just want to help completing this unofficial calendario documentation ? Send me a PR [here](https://github.com/zyzo/zyzo.github.io)
***NOTE** : this documentation is based on Calendario v4.0.0 [https://github.com/deviprsd21/Calendario](https://github.com/deviprsd21/Calendario)
References :
+ [My experiments for FOSSASIA community timeline, powerered by Calendario](https://github.com/zyzo/EventsPanel)
+ [http://blog.fossasia.org](http://blog.fossasia.org) | 45.762376 | 420 | 0.68347 | eng_Latn | 0.970481 |
4fd1a44f6e96d886c669d1b0d3c17b85b7225dc5 | 122 | md | Markdown | Kotlin/addBuildTypeUsingDslFinalize/readme.md | a284628487/gradle-recipes | b7ccf3cca26b9ad64f466c6dc7df7d46b1af0314 | [
"Apache-2.0"
] | 1 | 2022-01-24T09:52:42.000Z | 2022-01-24T09:52:42.000Z | Kotlin/addBuildTypeUsingDslFinalize/readme.md | a284628487/gradle-recipes | b7ccf3cca26b9ad64f466c6dc7df7d46b1af0314 | [
"Apache-2.0"
] | null | null | null | Kotlin/addBuildTypeUsingDslFinalize/readme.md | a284628487/gradle-recipes | b7ccf3cca26b9ad64f466c6dc7df7d46b1af0314 | [
"Apache-2.0"
] | null | null | null | This recipe shows how a build script can add a build type programmatically
using the finalizeDsl androidComponents {} API. | 61 | 74 | 0.827869 | eng_Latn | 0.997713 |
4fd25d1658c26a7f6537abbfccdffe0102376d9d | 425 | md | Markdown | translations/zh-CN/data/reusables/support/scope-of-support.md | RahmaNiftaliyev/docs | 4e1ee81587e95617e25aa51c9d761289b193bd37 | [
"CC-BY-4.0",
"MIT"
] | 5 | 2021-12-23T00:28:02.000Z | 2022-01-20T17:53:05.000Z | translations/zh-CN/data/reusables/support/scope-of-support.md | RahmaNiftaliyev/docs | 4e1ee81587e95617e25aa51c9d761289b193bd37 | [
"CC-BY-4.0",
"MIT"
] | 63 | 2022-02-01T12:55:45.000Z | 2022-03-30T14:21:14.000Z | translations/zh-CN/data/reusables/support/scope-of-support.md | RahmaNiftaliyev/docs | 4e1ee81587e95617e25aa51c9d761289b193bd37 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | 如果您的支持申请超出了我们团队可以帮助您的范围, 我们可能会提出后续措施建议,以便在 {% data variables.contact.github_support %} 之外解决您的问题。 您的支持申请如果主要是关于以下方面,可能超出了 {% data variables.contact.github_support %} 的范围:
- 第三方集成,如 Jira{% ifversion ghes %}
- 硬件设置{% endif %}
- CI/CD,如 Jenkins
- 编写脚本
- 外部身份验证系统配置,如 SAML 身份提供程序
- 开源项目{% ifversion ghes %}
- LGTM 集群设计{% endif %}
- 写入或调试 {% data variables.product.prodname_codeql %} 的新查询
如果您不确定问题是否超出范围,请开一个事件单,我们乐意帮助您确定最好的处理方式。
| 35.416667 | 169 | 0.752941 | yue_Hant | 0.750293 |
4fd50649721fdbfedcf64b5e68d0e1a66b58792d | 4,323 | md | Markdown | _articles/black-friday-a-cheat-day-for-the-anti-consumerists-318585ddfda6b6a8b2/index.md | jasonleow/200wordsaday | b1ca968e62f86863d8f68ee5988d6126c6ece34f | [
"MIT"
] | null | null | null | _articles/black-friday-a-cheat-day-for-the-anti-consumerists-318585ddfda6b6a8b2/index.md | jasonleow/200wordsaday | b1ca968e62f86863d8f68ee5988d6126c6ece34f | [
"MIT"
] | null | null | null | _articles/black-friday-a-cheat-day-for-the-anti-consumerists-318585ddfda6b6a8b2/index.md | jasonleow/200wordsaday | b1ca968e62f86863d8f68ee5988d6126c6ece34f | [
"MIT"
] | null | null | null | ---
title: "Black Friday - a cheat day for the anti-consumerists?"
created_at: 2019-11-28T22:32:11.000Z
published_at: 2019-11-28T23:29:54.000Z
---
I used to ignore Black Friday, brushing it off as another marketing, consumerist gimmick to get people to buy stuff they don't need to impress people they don't know. When I needed stuff, I just bought them, without discount. But that was when I was younger, more self-righteous and had an overly-idealistic and poor view of money. But over the past 1-2 years, I had done some work on re-evaluating my relationship with money and consumption, and I'd say, had arrived at a more balanced view.
My take now: If I'm going to spend money on luxury items, I might as well spend it on a day when things are cheaper. Money saved means more money on other things that matter to me. If I'm going to consume luxuriously, I might as well consume on as pre-defined days instead of letting loose throughout the year. Kinda like a cheat day. Be good as much as possible for most of the year, and have a cheat day on Black Friday. (Of course I buy luxury stuff on non-Black Fridays too throughout the year, but with as much restraint and/or necessity as possible. Black Friday is when I can loosen that valve a bit.)
So, time to share that shopping list! These are some of the stuff that I'd been deal-hunting for:
[**Carrd.co**](https://carrd.co/) **- 50% of Pro plans till 6 Dec**
Carrd is a sleek website builder made by @ajlkn. It's super easy to use, and allows you to make beautiful, fast websites within minutes. Seriously, even though I know how to code things myself now, but sometimes you don't want to commit too much time to an idea that might not even work in the first place. I'd always wanted to get on the Pro plan for Carrd, so that I can make MVPs of ideas I have, with as little time invested as possible. 50% off on the $49 Pro Plus annual plan is just crazy affordable for a tool that can potentially give me so much more returns.
[**Vessi**](https://vessifootwear.com/) **- $99 for first pair (usual $179) till 3 Dec**
Ever since I backed this Kickstarter project, I'd been a big fan of their waterproof knitted sneakers. It rains a whole lot in Singapore, and I hate walking around the whole day with wet socks. Ever since I got this sneakers, I even hop onto puddles without a care in the world. #firstworldproblems
[**Perfect Keto Bars**](https://shop.perfectketo.com/products/keto-bars) **- 50% off for 5 items till 2 Dec**
I still crave for a sweet snack from time to time, and these keto bars are great for those moments. It doesn't kick me out of ketosis, and they taste great. Only problem, it's not available easily in Singapore and they are not cheap. Happy to have a chance to buy more this time!
[**Field Notes**](https://fieldnotesbrand.com/) **- no promo code yet**
I have a whole collection of them at home but I still can't get enough of them. Field Notes are my go-to note books of choice, for everything from reflections, idea journals to project notebooks. I start a new book for every project I start. Every year I start a new one too. The designs are awesome, the craftsmanship is inspiring, and I love the whole grassroots genesis story behind it. It's not to surprising then that there's a cult following for Field Notes, on par with or maybe even more fervent than Moleskin notebook fans I'd say.
[**Bellroy**](https://bellroy.com/) **- $60 discount if you sign up till 30 Nov**
I have one of their slim wallets and really like their thoughtful, minimalist design and practical, clever compartments (like for SIM cards, a pull tab to get cards out of a slot easily, etc). So am already a fan, and going for their [crossbody bag](https://bellroy.com/products/sling/venture/black) seems a no-brainer.
[**Honey**](https://www.joinhoney.com/)
I recently discovered this free Chrome extension that crawls the internet for promo codes and adds them to your cart automagically. PayPal recently acquire this company for $4B, so that's says something. Great tool when you're shopping around and might have missed out on promo codes, and not just on Black Friday.
**Ice cream maker**
Also in the market for an ice cream maker to make keto-compliant ice cream... anyone knows any good deal on it?
| 75.842105 | 608 | 0.755725 | eng_Latn | 0.999453 |
4fd55b9fb9c95cf436683cbd85d5f8bff84085f8 | 404 | md | Markdown | _posts/2014-3-3-Hello-World.md | vernav0706/vernav0706.github.io | 2347951f3913a39a0e5a16099eb14efbff248964 | [
"MIT"
] | null | null | null | _posts/2014-3-3-Hello-World.md | vernav0706/vernav0706.github.io | 2347951f3913a39a0e5a16099eb14efbff248964 | [
"MIT"
] | null | null | null | _posts/2014-3-3-Hello-World.md | vernav0706/vernav0706.github.io | 2347951f3913a39a0e5a16099eb14efbff248964 | [
"MIT"
] | null | null | null | ---
layout: post
title: why cats are fun to be around with
---
Cats are fun to be around because they are calm and quiet. Mostly all cats are calm and sleep/nap all day. Sometimes the cats dont even bother us just only when it is time to eat. Also cats love to play around and you dont even need expensive toys to play with them. All you need is a shoelace to play with.

| 44.888889 | 308 | 0.747525 | eng_Latn | 0.999993 |
4fd5eba6cbd238d47ecc78ba1fa41ef0a372d30b | 3,506 | md | Markdown | _posts/magento-certification-notes/Checkout 16%/Checkout components/2015-07-06-Shipping-and-Payment-Methods-in-Magento.md | brideo/blog | e6e6dce45aa0aa21af061c76427b156a43887b7f | [
"MIT"
] | null | null | null | _posts/magento-certification-notes/Checkout 16%/Checkout components/2015-07-06-Shipping-and-Payment-Methods-in-Magento.md | brideo/blog | e6e6dce45aa0aa21af061c76427b156a43887b7f | [
"MIT"
] | null | null | null | _posts/magento-certification-notes/Checkout 16%/Checkout components/2015-07-06-Shipping-and-Payment-Methods-in-Magento.md | brideo/blog | e6e6dce45aa0aa21af061c76427b156a43887b7f | [
"MIT"
] | null | null | null | ---
layout: post
title: Shipping and payment methods in Magento
categories: magento-certification-notes checkout checkout-components
---
##Describe the programmatic structure of shipping methods, how to customize existing methods, and how to implement new methods
Shipping methods are known as carriers in Magento and the shipping module is called `Mage_Shipping`. This being said it shouldn't be to hard to remember that shipping methods extend `Mage_Shipping_Model_Carrier_Abstract`. You will also need to specify a new carrier in `config.xml`.
##Describe the shipping rates calculation process:
###How does Magento calculate shipping rates?
The abstract class `Mage_Shipping_Model_Carrier_Abstract` provides a method `collectRates($request)` which is used by a concrete carrier to calculate the shipping rate of all items. The request is built up by the quote model based on the shipping location and specified carrier.
###What is the hierarchy of shipping information in Magento?
UNANSWERED
###How does the TableRate shipping method work?
The TableRate shipping method is a CSV spreadsheet with different rules. These rules can be based on:
- Weight
- Price
- Number of Items
###How do US shipping methods (FedEX, UPS, USPS) work?
These methods are a bit shit, however what they should do it offer third party integration with these companies through API'
These code references can be used as an entry point to find answers to the questions above:
- `Mage_Shipping_Model_Carrier_Abstract`
- `Mage_Shipping_Model_Rate_Abstract`
- `Mage_Shipping_Model_Rate_Request`
- `Mage_Shipping_Model_Rate_Result`
- `Mage/Shipping/Model/Rate/Result/*`
- `Mage_Shipping_Model_Info`
- `Mage/Shipping/Model/Carrier/*`
- `Mage/Shipping/Model/Resource/Carrier/*`
##Describe the programmatic structure of payment methods and how to implement new methods:
###How can payment method behavior be customized (for example: whether to charge or authorize a credit card; controlling URL redirects; etc.)?
The module for Payment Methods is `Mage_Payment`. So to create a new payment method you will need to extend `Mage_Payment_Model_Method_Abstract`. Also you will need to add some information into config.xml.
###Which class is the basic class in the payment method hierarchy?
UNANSWERED
###How can the stored data of payment methods be customized (credit card numbers, for example)?
Payment information is imported in the `Mage_Sales_Model_Quote_Payment::importData($data)` method. This method fires an event you can listen to `$this->_eventPrefix . '_import_data_before'` so you are best off modifying/logging data in here.
###What is the difference between payment method and payment classes (such as order_payment, quote_payment, etc.)?
Payment methods process the transaction/card authorisation while payment classes manage the payment methods themselves.
###What is the typical structure of the payment method module?
Payment method modules usually have a few blocks for displaying the forms and a controller to handle request to third party gateways.
###How do payment modules support billing agreements?
Billing agreements are out of the box with Magento, you can set a billing agreement by calling `$order->getPayment()->setBillingAgreementData($data);`.
These code references can be used as an entry point to find answers to the questions above:
- `Mage_Payment_Model_Method_Abstract`
- `Mage_Payment_Model_Method_Cc`
- `Mage_Payment_Model_Info`
- `Mage/Paypal/*`
- `Mage/PaypalUk/*`
| 45.532468 | 282 | 0.796634 | eng_Latn | 0.996148 |
4fd81da4d09bb9fd182cc1424ccb09c7b8d5eb7d | 628 | md | Markdown | README.md | peterwicksstringfield/stan-math-microbenchmark-utility | be80e8f4a3b0e0b52cdb092fa1e0b841b99361d4 | [
"BSD-3-Clause"
] | null | null | null | README.md | peterwicksstringfield/stan-math-microbenchmark-utility | be80e8f4a3b0e0b52cdb092fa1e0b841b99361d4 | [
"BSD-3-Clause"
] | null | null | null | README.md | peterwicksstringfield/stan-math-microbenchmark-utility | be80e8f4a3b0e0b52cdb092fa1e0b841b99361d4 | [
"BSD-3-Clause"
] | null | null | null | Don't use this. See [stan-dev/perf-math](https://github.com/stan-dev/perf-math).
Kludgy thing that benchmarks some stuff in the math library.
Run like this:
./run_performance_testing.sh
Requires [benchmark](https://github.com/google/benchmark).
```
/path/to/top/
math/
.git (branch1 and branch2)
stan-math-benchmarking-script/
.git
testfile
inputfile
run_performance_testing.sh
benchmark/
.git
```
Benchmarks in testfile. Input in inputfile. Separate translation unit to
prevent spurious constant propagation.
Choose the branches to compare in the shell script.
| 23.259259 | 80 | 0.705414 | eng_Latn | 0.8355 |
4fd828b65d6379754b855fecb7b84954389d8535 | 1,492 | md | Markdown | docs/specification/README.md | sathieu/onyxia-api | 2eb333fca1b918c043cf712cb679c0dfe36322c7 | [
"MIT"
] | 6 | 2020-03-22T17:16:47.000Z | 2021-06-05T08:46:00.000Z | docs/specification/README.md | eurostat/onyxia-api | 5177a57ac84e9a1f994094d5f0bc97ae9113086a | [
"MIT"
] | 43 | 2020-03-26T14:27:05.000Z | 2022-03-01T09:48:14.000Z | docs/specification/README.md | eurostat/onyxia-api | 5177a57ac84e9a1f994094d5f0bc97ae9113086a | [
"MIT"
] | 14 | 2020-03-26T11:35:37.000Z | 2021-12-03T20:15:49.000Z | # Onyxia Universe package format extension
## Additional properties
Onyxia extends the standard `config.json` to add some additional attributes to the properties.
Each property can have an additionnal `x-form` attribute :
```
"x-form": {
"hidden": {
"type": "boolean",
"default": false
},
"readonly": {
"type": "boolean",
"default": false
},
"value": {
"type": "string"
}
}
```
| Name | Type | Required | Default | Description |
| -------- | ------------------------ | -------- | ------- | ----------------------------------------------------------------------------------------------------------------------------------- |
| hidden | boolean | no | false | Whether this parameter should be displayed to the user. |
| readonly | boolean | no | false | Indicate this parameter should be displayed but not modifiable. |
| value | string (mustache syntax) | no | | Template of the value that should be generated by the form. Basic [JSON Mustache](https://mustache.github.io/) syntax is supported. |
## Real life examples
- [Universe Datascience](https://github.com/InseeFrLab/Universe-Datascience)
| 45.212121 | 194 | 0.431635 | eng_Latn | 0.893524 |
4fd91ff6689c92764e3115375fe4f3a316872077 | 2,841 | md | Markdown | docs/experiment_box_case_10.md | tangowhisky37/Elecfreaks-Electronics-Kit | 9c24ced3a8366ff8781773e99370a33d483e59bc | [
"MIT"
] | 1 | 2020-10-19T14:55:38.000Z | 2020-10-19T14:55:38.000Z | docs/experiment_box_case_10.md | tangowhisky37/KidzCanCode-Tutorials-I | 9c24ced3a8366ff8781773e99370a33d483e59bc | [
"MIT"
] | null | null | null | docs/experiment_box_case_10.md | tangowhisky37/KidzCanCode-Tutorials-I | 9c24ced3a8366ff8781773e99370a33d483e59bc | [
"MIT"
] | null | null | null | ## Introduction ##
---
- The RGB Rainbow LED Ring is made of 8 ws2818b pixels in a cascade connection. Its most notable characteristic is single IO control and infinite cascade. In this case, we will drive the RGB Rainbow LED with the micro:bit to make the rainbow colours rotate around the ring.
## Hardware Connect ##
---

- Connect circuit as above picture and put 2 AAA batteries into batteries pack.
## Principles of Circuits ##
---

- The GND of slot on micro:bit is into innards of batteries' GND to generate the current loop.
## Introduction of Components ##
---
### RGB Rainbow LED Ring(8 LEDs)
- The RGB Rainbow LED Ring is made of 8 ws2812b pixels in a cascade connection. Each ws2812b is made of an integrated control circuit and a RGB chip.
- The first pixel receives 24bits of data through the DIN (Data IN) port, while the rest of the data is sent to the following pixels through the DOUT (Data OUT) port. With this automatic transformation forwarding technique, only the speed of data transmission is affected, not the quantity of data transmitted.
- The experiment box included a set of RGB Rainbow LED(8 LEDs).

*- Note: Please note the positive and the negative when you are connecting.*
## Software
---
### Step 1
- Click [makecode https://makecode.microbit.org/#](https://makecode.microbit.org/#)。
- Click on "New Project" and set a new Project.

- Click Advanced for more code blocks and find the Extensions at the bottom of the column.

- Search "neopixel" and add the neopixel.

### Step 2
- Under the on start block, initialize the LED, set pin to P0 with 8 LEDs and use the RGB colour.
- Then, set the rainbow from 1 to 360.

### Step 3
- Now the colour of the LED has been set, but we need a show block to make it works.
- Next, set rotate pixels to rotate the colour of the LED in a ring.

### Program
- Program link:[https://makecode.microbit.org/_LJAe7W97fDat](https://makecode.microbit.org/_LJAe7W97fDat)
- You also could directly download program by visiting website as below:
<div style="position:relative;height:0;padding-bottom:70%;overflow:hidden;"><iframe style="position:absolute;top:0;left:0;width:100%;height:100%;" src="https://makecode.microbit.org/#pub:_LJAe7W97fDat" frameborder="0" sandbox="allow-popups allow-forms allow-scripts allow-same-origin"></iframe></div>
---
## Result
---
- We can see a rainbow rotates on the LED ring.
- 
## Think
---
- How can we make the ring blink like an eye?
## Questions
---
## More Information
---
| 33.423529 | 310 | 0.724745 | eng_Latn | 0.943465 |
4fda14e7c175bb14508a3c59914a9c7e730b5d8a | 13,494 | md | Markdown | TEDx/Titles_starting_P_to_Z/Want_a_Healthy_Life_Eat_Bulgur_Sanaa_Abourezk_TEDxBrookings.md | gt-big-data/TEDVis | 328a4c62e3a05c943b2a303817601aebf198c1aa | [
"MIT"
] | 91 | 2018-01-24T12:54:48.000Z | 2022-03-07T21:03:43.000Z | cleaned_tedx_data/Titles_starting_P_to_Z/Want_a_Healthy_Life_Eat_Bulgur_Sanaa_Abourezk_TEDxBrookings.md | nadaataiyab/TED-Talks-Nutrition-NLP | 4d7e8c2155e12cb34ab8da993dee0700a6775ff9 | [
"MIT"
] | null | null | null | cleaned_tedx_data/Titles_starting_P_to_Z/Want_a_Healthy_Life_Eat_Bulgur_Sanaa_Abourezk_TEDxBrookings.md | nadaataiyab/TED-Talks-Nutrition-NLP | 4d7e8c2155e12cb34ab8da993dee0700a6775ff9 | [
"MIT"
] | 18 | 2018-01-24T13:18:51.000Z | 2022-01-09T01:06:02.000Z |
hi I'm Sena and I love food I love it
enough that I left engineering to become
a chef so I want to start by asking you
a question have you ever thought who's
in charge of your health you I hope so
government I hope not
they're food companies definitely not so
you if you think about it we are born
with our own genetic that we can't
change at least not yet
we are live in the same environment with
the pollution but also we have no
control of so only thing we have control
of is food we put in our mouth so I hope
I time I'm done that should become what
I call a conscious eater you are
conscious of what you put in your mouth
because I believe no matter you know
you're engineer a doctor anything if you
don't take care of this this is awesome
I come to work I actually I know when
I'm hungry I'm very crabby and I know
what when my staff bring me an apple
they're telling me you are crappy that's
their gentle way of telling me that so
we really have to be conscious of what
will be eaten what we put in our mouth
and that's why I want to tell you about
a wonderful ingredient that I'm very
passionate about bulgur wheat most of
you you think you don't know what bulgur
wheat is but most of you I think had to
bully if you have to bully you eat
bulgur wheat so what's bhagavate bulgur
wheat is a wheat that's been barred
boiled and dried and I have to tell you
whoever came up with this idea our great
great great ancestors in the
Mediterranean did a favour
to the to my family and to the
humanities so they every village before
technology anything they blend their
wheat and then you know they cut it and
about October the women of the village
get together and they assign date so
let's say Monday would be so nice day so
every village will have huge pot huge
pot and three he large stones and the
man they're more than happy to carry the
pots the night before and the stones to
the designated house and the woman about
four o'clock in the morning they get up
all the women in the village get up
together and they go to Santa's home and
they fill up the water in the pot and
they put the weed and they boil it
meantime the men are sleeping and the
woman don't mind because that's the time
of visiting and you know gossiping and
the whole thing because they have to
wait for the whole pot to boil after it
boils you know while they're having
their tea and everything then they use a
huge kind of spoon as you see in this
picture to put to put all the wheat the
one bar boiled bit in a basket that's
hand waving and also it passed from
generation to generation and they put it
on their shoulders and they go take it
to the my roof that's today is my turn
to burn to boil dewitt the roof has been
already cleaned and it's flat and they
have nice white sheet on it and they put
the weight on that sheet and they leave
it there for about ten days to dry in
the mail train Ian's son so in order to
make sure it all dries that's how they
reward their kids so we all can switch
to wash our feet and then we take turns
to walk through all the weight and we
make a design and you know we show up oh
I did and it looks like almost like a
Japanese rock garden it's beautiful but
if you think about something if you have
wheat on the roof of the building
and you have birds what happens it's all
done so long time ago I have like I said
I you know you have to admire I mean now
we have all this technology we can come
up with other things but they then they
thought of putting shiny threads that
loosely over the weed so when the wind
blow it it keep reflecting in the Sun so
it keep the birds away recently I mean
my mom's time they start taking from the
kids like cassette tapes remember the
old cassette tapes so they pull it and
my mom used to make a point you know I
always wanted to come to America so I
liked Elvis Presley he was the first
American singer I ever heard so she'd
make a point cause still Elvis Presley's
cassette because she hated him and she
will take the thread and they bought it
over all the weeds and of course when I
come to see it and she's my mom you know
she's paying for my food I couldn't say
anything just go to my room like so
that's why I came to America so I can
listen to Elvis Presley anyway after 10
days the women come all again and you
know so my shirt usually like you know
my mom we about 200 kilos so we're
talking about 500 pounds cuz I make
enough for the whole year so they take
it down I want to show you this picture
first and then a long time ago they used
to put it in a big pot and then they
have two stones and these stones they
have a hole under the top and the woman
sits on the floor and they book the
bulgar inside it the weed inside it and
they grind it so if they go five times
you have the fine bulgur wheat the one
you use in tabbouleh if you do it only
three times you have the core sweet
that's the one we use to pour pilaf and
then they give it to the grandmothers
they get them nice baklava
you know they promise them nice dress
and all the old ladies sit and each one
has a huge like almost like a wooden
calendar and they go through it like
this and they this way some of the
stones anything that the birds left on
the bird on the weeds it goes away and
they put it on the roof again and I dry
it again for ten days by doing this
process you have an a green light
literally it keeps for years no in a
second you can put anywhere doesn't need
any temperature just it stays there and
it tastes wonderful also by doing that
you don't need to boil it so long if you
anybody bought weeds for any recipe you
put the water and you wait you put the
water and you wait it takes forever and
it just tastes like weed by bar boiling
the weed and sundry it you have
something like that it's actually it
doesn't take long time to cook and it
has a nice nutty flavor and nice texture
so what's so special about bulgur why
I'm so passionate about it I have to
tell you by the way
you know I'm an engineer you know my
parents sent me to America and the whole
thing my mom's freaking out that I'm
smoking about beaurevel you know she
when we talk about you know brain
surgery is some if you will engine
something she's it Buddhahood you're
talking about send you to America to
talk about Buddhahood well I I am big
believer in good nutrition so it bulgur
is one cup of parole it gives you 50% of
your daily requirement of fiber think
about one cup also it's high in fiber
which we sort about sorry high high in
protein high in zinc high in iron
vitamin E and also what's so unique
about it after you buy boiled it it
become Lord glycemic what that mean as
mean when you eat it your blood sugar
doesn't shoot high which is make it
excellent ingredient for anybody who's
washing watching their diet or have
diabetes also now so now I talk to you
about it I hope I get a little bit
interested now I want to get you really
interested we have a joke when we was
when I was studying nutrition at college
that if you dig the grave of an American
after 200 years you'll find them the
same
that's why preservative because
everything we eat is has preservative
additive you know all the stuff and
that's again I know I keep coming again
conscious eating conscious eating so I
went to market and I bought like the
best bread I can find that's the best
it's a huge ingredient list but if you
look at the second ingredient they use
Bruegel to enrich the best bread you
have in the market if you check on
burgers what's the ingredient one burger
there is no additives no preservatives
no color no coloring no chemicals that
we can pronounce just pour over so
that's by itself it's wonderful because
the more we like that nutritious we eat
the less preservative we eat the better
flavour we're doing for our body also
beaurevel like I said they would boil
200 kilos it has a long long shelf life
when I used to come to America when I
was in student and go back home
summertime my suitcase would be about 25
kilos of just Borivali and the guy at
the airport you know look at it and said
what's this and for me it was like a
shotgun because who doesn't know Bravo
and you know I would keep it and in my
suitcase under the bed because it was
available anytime I want to eat
something I'll add just the sauce or
even water and I
I eat it it's very inexpensive one pound
of bulgur wheat in the United States is
dollar twenty-five cents and believe me
you don't want to cook the whole one
pound of butter oil it expands so I want
to make this dish I want because I have
all the ingredients already at the
restaurant but I want to make a point so
to make this pilaf that feed at least
four people I went and I get them from
the market so the tomatoes the garbanzo
beans the olive oil double Gore
everything came to $6 53 cents that's
less what you pay for a burger that can
I'm not going to name names but you know
some poor girls can they literally take
you right
directly to your heart attack this one
with $6 53 cents you get a new protein
your fiber your iron you think your
antioxidant everything it's delicious
for $66 53 cents you can feed so it's
very good for your budget that's another
thing we have to think about especially
if we are in college
vulgar weight also doesn't need heat so
we all like pasta you can cook pasta
without heat we all like old you can
cook it without heat bulgur actually you
do that's what the bully is the bully
you put the bulgur and you add the
ingredient by time you finish you have
to bully a chorus course / goal so I
know what I do I add water go take a
shower
come back add all the ingredients and
you have this wonderful salad we do it
at the restaurant we call it house salad
because I eat it every single day and
other thing about burger is ideal left
over it tastes much better the day after
so the that will no girl you keep it the
better it tastes because keep taking all
the liquids you know anybody who made
the salad I want on a picnic you have to
do it right away or eat it right away
because it goes watery watery watery
unless you have butter oil in it keep
taking the water the water so the salad
stays nice and attacked for days that's
why the bullies taste for it can stay a
month in your in your fridge but I eat
it all like never start more than two
days but when I make a pilaf especially
if I use one pound
we have other fluff overs and my husband
even though he like but all he hates it
when I make bank or pilaf because I make
it today
what were you having Bogart okay so he
you don't have nice salad next to it
second day what are we having
whatever so okay so now I love but a lot
but please can we move a little bit away
but it's wonderful for you if you
actually have a college of you're on a
budget because it tastes good day after
day the day after day if you think about
it you know we go to Chinese restaurant
we'll always take it up
over the rice stay dries you know you
open the box it's really dry you can't
eat it if you have left old pasta it's
mushy you can't eat it bowl or you can
keep eating it but don't make one pound
to 2 cups maximum for four days don't
wanna and I want you to hate it I want
you to like it so now you know the last
thing I wanna tell you think I'd let ask
you you know when you go to a gas
station have you ever thought for
putting the wrong caste in your car oh
no today I want to think about diesel
and my engine no you do a research
what's the best oil was the best things
for your car and this is just a car an
engine that you can replace so I wanted
to do the same effort when you put
something new in mouth after all we
can't replace this at least I think I'm
not replaceable another thing is you
know that's why sometimes people don't
think of nutrition because do you don't
see it on the spot you know we eat junk
food but after a while we have the heart
attack
and good nutrition is like good
investments you do it slowly
continuously and on the long run it pays
big same thing with nutrition after all
it's not that want is that not the
quantity is the quality you want to be
at least I want to be an unused old
person running a marathon so that's what
important that's why you see people a
Mediterranean you their old age they
have beautiful skin they still playing
backgammon they still walk in everything
is nutrition that they ate every single
day so one thing like again I'm going to
go mom and leave you with that thought
well God is the food of the poor so
nobody think of like if you're a guest
on you know to cook golf or use rice
with meat because I mean God forbid you
know you're a guest so when I tell my
mom she you know I call her and she say
she said you know I told her I'm giving
the speech about whatever I said you
know I said but what she said you mean
people come to your restaurant and pay
money to eat Bravo
I said yes and I cherished nine dollars
for it she said you know a nine dollars
much
applied for Syrian pounds likely paying
1000 she said they paying how much has
it been nine dollars it beautiful she
said American you are crazy said mom
American are not crazy with just we know
good food when we taste it
thank you
[Applause]
| 38.226629 | 48 | 0.780421 | eng_Latn | 0.999996 |
4fda5c85994b3f8babdd1e4beb6cfa8ee7754fca | 4,562 | md | Markdown | traditionalScriptWalletVault3.md | fresheneesz/bip-efficient-bitcoin-vaults | c4da3f6e9d12195d5a911bcb1357c924c7219fbe | [
"BSD-3-Clause"
] | 2 | 2021-04-21T19:50:07.000Z | 2021-12-03T22:29:16.000Z | traditionalScriptWalletVault3.md | fresheneesz/bip-efficient-bitcoin-vaults | c4da3f6e9d12195d5a911bcb1357c924c7219fbe | [
"BSD-3-Clause"
] | null | null | null | traditionalScriptWalletVault3.md | fresheneesz/bip-efficient-bitcoin-vaults | c4da3f6e9d12195d5a911bcb1357c924c7219fbe | [
"BSD-3-Clause"
] | 2 | 2021-06-10T21:45:14.000Z | 2021-07-12T07:38:48.000Z | # Wallet Vault 3
This is a traditional script variant of [Tapscript Wallet Vault 3](tapscriptWalletVault3.md). The js-like pseudocode, properties, etc are all the same, so just the bitcoin scripts are shown.
THIS IS CURRENTLY OUT OF SYNC WITH Tapscript Wallet Vault 3!! THIS NEEDS AN UPDATE.
## Script Implementation
The base wallet vault address in a wallet vault would have the following spend paths:
```
IF
<2, publicKeyA1_, publicKeyB1_, 2, ..., ...> CHECKMULTISIGVERIFY
ELSE
# Push each return addresses onto the output stack for outputs to intermediateAddress_.
<intermediateAddress_, 3, -1, changeAddress_, publicKeyA2Hash_, publicKeyB2Hash_>
PUSHOUTPUTSTACK
2DROP
2DROP
2DROP
# Ensure that the input is spent only to intermediateDestination or changeAddress1 with
# a maximum fee of 512 times the 300-block median fee-per-byte.
<1, intermediateAddress_, 300 blocks, 512x> CONSTRAINDESTINATION
2DROP
2DROP
# Require a signature for 1 of the 2 keys.
<2, publicKeyA1_, publicKeyB1_, 1, ...> CHECKMULTISIGVERIFY
# Push each destination addresses onto the output stack for the associated output. This
# must be last to allow for a variable number of inputs
<intermediateAddress_, 1, ..., ..., .....> PUSHOUTPUTSTACK
# All arguments to OP_POS are intentionally left on the stack. Are non-zero to signal success.
ENDIF
```
To spend this script you would use one of the following scriptSig patterns:
1. `ignored keyB1Sig keyA1Sig 1`
2. `... destinationAddressN outputNId keySig 0`, where
* the number of address-outputId pairs equals the number of outputs.
* `keySig` is a signature for either `publicKeyA1_` or `publicKeyB1_`.
In the script spend-path, each output's output stack would be:
* `intermediateAddress_` script spend-path:
* `changeAddress_ publicKeyA2Hash_ publicKeyB2Hash_ destinationAddressN`.
The `intermediateAddress_` referenced above is an address with the following spend paths:
```
# Grab the first if condition.
5 OP_ROLL
IF
# Move up the destinationPublicKey and destinationKeySig, and position the
# destinationKeySig destinationPublicKey on top.
2ROT
ROT
# Transaction creating this output must have been confirmed at LEAST 5 days ago.
<5 days> CHECKSEQUENCEVERIFY
DROP
# Verify the `destinationPublicKey` against the `destinationAddress` that came from the
# output stack.
verifyPublicKey(..., ...);
# Uses the passed in destinationPublicKey.
<..., ...> CHECKSIG
# changeAddress_, publicKeyA2Hash_, and publicKeyB2Hash_ are all intentionally left on the stack.
ELSE
# Grab the second if condition.
5 OP_ROLL
IF
# Drop destinationAddressN that was put on the stack from the output stack.
DROP
# Drop publicKeyA2Hash_ and publicKeyB2Hash_ that were put on the stack from the output stack.
2DROP
# Verify the `changePublicKey` against the `changeAddress` that came from the output stack.
verifyPublicKey(..., ...);
# Uses the passed in changePublicKey.
<..., ...> CHECKSIG
ELSE
# Transaction creating this output must have been confirmed at MOST 5 days ago.
<5 days> BEFOREBLOCKVERIFY
DROP
# Drop destinationAddressN that was put on the stack from the output stack.
DROP
# Move three values so they're in the order:
# [publicKeyB2Hash_, publicKeyB, publicKeyA, publicKeyA2Hash_].
2ROT
ROT
# Verify publicKeyB and save it to the alt stack.
verifyPublicKey(..., ...)
<...> TOALTSTACK
# Move the publicKeyA2Hash_ to before publicKeyA.
SWAP
# Verify publicKeyA.
verifyPublicKey(..., ...)
# Remove changeAddress_ that came from the output stack.
NIP
# Add back publicKeyB.
OP_FROMALTSTACK
# Move the stack items into the order required b CHECKMULTISIGVERIFY
2
OP_ROT
OP_ROT
OP_SWAP
# Require a 2-of-2 signature for two return keys.
<2, ..., ..., ..., ..., ...> CHECKMULTISIGVERIFY
ENDIF
ENDIF
# Top-of-stack param is the addess, the second param is the public key. Hashes are passed to
# the script (via OP_POSS from the previous transaction) rather than raw public keys to
# retain quantum resistance.
pseudofunction verifyPublicKey(..., ...):
OVER
<...> SHA256
<...> RIPEMD160
<..., ...> NUMEQUALVERIFY
```
To spend this script you would use one of the following scriptSig patterns:
1. `destinationKeySig destinationPublicKey 1`
2. `changeKeySig changePublicKey 1 0`
3. `ignored publicKeyBNSig publicKeyA publicKeyB publicKeyANSig 0 0` | 31.680556 | 190 | 0.723148 | eng_Latn | 0.914944 |
4fdb7ea1b5e2cf9b0fbfd0fb086f8f19598d1132 | 15,744 | md | Markdown | _episodes/06-power-calc.md | smcclatchy/dals-inference | 0a68879203bf6940e0a9132edfec3c8709d6a8aa | [
"CC-BY-4.0"
] | 1 | 2021-02-09T11:17:39.000Z | 2021-02-09T11:17:39.000Z | _episodes/06-power-calc.md | smcclatchy/exp-design | 60956f8fbe43a8e423e3bc5f216c4543a3957cea | [
"CC-BY-4.0"
] | 1 | 2018-03-28T17:17:52.000Z | 2018-04-12T21:05:01.000Z | _episodes/06-power-calc.md | smcclatchy/dals-inference | 0a68879203bf6940e0a9132edfec3c8709d6a8aa | [
"CC-BY-4.0"
] | 1 | 2021-02-09T11:17:48.000Z | 2021-02-09T11:17:48.000Z | ---
# Please do not edit this file directly; it is auto generated.
# Instead, please edit 06-power-calc.md in _episodes_rmd/
title: Power calculation and sample size
teaching: 0
exercises: 0
questions:
- "What role does the p-value (alpha) have in determining sample size for a study?"
- "What is type 2 error, and how does it correspond to the p-value and power of a test?"
- "What factors should be considered when estimating sample size for a study?"
objectives:
- "Calculate a sample size or power for a study that will be analyzed using a one fixed factor linear regression model."
- ""
- ""
keypoints:
- "When designing an experiment, use biological replicates."
- "Choose a single representative value (the mean, median, or mode) for technical replicates."
source: Rmd
---
Statistical power analysis is critical in experimental design. Before doing an experiment, it is important to calculate statistical power to estimate the sample size needed to detect an effect of a certain size with a specific degree of confidence. Underpowered studies are extremely common, which has led one of the most-cited scientists in medicine to claim that [most published research findings are false](https://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.0020124).
## Introduction
We'll explore power calculations using the effects of two different diets on the body weights of mice. First we'll load the data file.
~~~
# All code and text below is adapted from Irizarry & Love.
# http://genomicsclass.github.io/book/pages/power_calculations.html
# Exercises
# http://genomicsclass.github.io/book/pages/power_calculations_exercises.html
library(downloader)
url <- "https://raw.githubusercontent.com/smcclatchy/dals-inference/gh-pages/data/bodyWeights.csv"
filename <- "bodyWeights.csv"
if(!file.exists(filename)) download(url, destfile=filename)
# Read in DO850 body weight data.
dat <- read.csv("bodyWeights.csv")
~~~
{: .language-r}
Now we'll select body weight at 21 weeks for male mice on either a standard chow or high fat diet. We'll examine the difference in mean body weight between males on high fat and those on a chow diet.
~~~
library(dplyr)
controlPopulation <- filter(dat, Sex == "M" & Diet == "chow") %>%
select(BW.21) %>% unlist
hfPopulation <- filter(dat, Sex == "M" & Diet == "hf") %>%
select(BW.21) %>% unlist
mu_hf <- mean(hfPopulation, na.rm = TRUE)
mu_control <- mean(controlPopulation, na.rm = TRUE)
print(mu_hf - mu_control)
~~~
{: .language-r}
~~~
[1] 6.696912
~~~
{: .output}
~~~
print((mu_hf - mu_control)/mu_control * 100) # percent increase
~~~
{: .language-r}
~~~
[1] 18.59599
~~~
{: .output}
What is the difference between body weight averages on a high fat vs. standard chow diet?
~~~
print(mu_hf - mu_control)
~~~
{: .language-r}
~~~
[1] 6.696912
~~~
{: .output}
What is the percentage increase in body weight of animals on high fat diet?
~~~
print((mu_hf - mu_control)/mu_control * 100) # percent increase
~~~
{: .language-r}
~~~
[1] 18.59599
~~~
{: .output}
Since we have access to the population, we know that in fact there is a substantial difference (greater than 0 between the average weights of the two male populations at 21 weeks of age.
We can see that, in some cases, when we take a sample and perform a t-test, we don't always get a p-value smaller than 0.05. For example, here is a case where we take a sample of 3 mice and don't achieve statistical significance at the 0.05 level:
~~~
set.seed(1)
N <- 3
hf <- sample(hfPopulation, N)
control <- sample(controlPopulation, N)
t.test(hf, control)$p.value
~~~
{: .language-r}
~~~
[1] 0.006750713
~~~
{: .output}
Did we make a mistake? By not rejecting the null hypothesis, are we
saying the diet has no effect? The answer to this question is no. All
we can say is that we did not reject the null hypothesis. But this
does not necessarily imply that the null is true. The problem is that,
in this particular instance, we don't have enough _power_, a term we
are now going to define. If you are doing scientific research, it is
very likely that you will have to do a power calculation at some
point. In many cases, it is an ethical obligation as it can help you
avoid sacrificing mice unnecessarily or limiting the number of human
subjects exposed to potential risk in a study. Here we explain what
statistical power means.
#### Types of Error
Whenever we perform a statistical test, we are aware that we may make a
mistake. This is why our p-values are not 0. Under the null, there is
always a positive, perhaps very small, but still positive chance that we
will reject the null when it is true. If the p-value is 0.05, it will
happen 1 out of 20 times. This *error* is called _type I error_ by
statisticians.
A type I error is defined as rejecting the null when we should
not. This is also referred to as a false positive. So why do we then
use 0.05? Shouldn't we use 0.000001 to be really sure? The reason we
don't use infinitesimal cut-offs to avoid type I errors at all cost is
that there is another error we can commit: to not reject the null when we
should. This is called a _type II error_ or a false negative. The R
code analysis above shows an example of a false negative: we did not
reject the null hypothesis (at the 0.05 level) and, because we happen
to know and peeked at the true population means, we know there is in fact a
difference. Had we used a p-value cutoff of 0.25, we would not have
made this mistake. However, in general, are we comfortable with a type
I error rate of 1 in 4? Usually we are not.
#### The 0.05 and 0.01 Cut-offs Are Arbitrary
Most journals and regulatory agencies frequently insist that results be significant at the 0.01 or 0.05 levels. Of course there is nothing special about these numbers other than the fact that some of the first papers on p-values used these values as examples. Part of the goal of this book is to give readers a good understanding of what p-values and confidence intervals are so that these choices can be judged in an informed way. Unfortunately, in science, these cut-offs are applied somewhat mindlessly, but that topic is part of a complicated debate.
#### Power Calculation
Power is the probability of rejecting the null when the null is
false. Of course "when the null is false" is a complicated statement
because it can be false in many ways.
$\Delta \equiv \mu_Y - \mu_X$
could be anything and the power actually depends on this parameter. It
also depends on the standard error of your estimates which in turn
depends on the sample size and the population standard deviations. In
practice, we don't know these so we usually report power for several
plausible values of $\Delta$, $\sigma_X$, $\sigma_Y$ and various
sample sizes.
Statistical theory gives us formulas to calculate
power. The `pwr` package performs these calculations for you. Here we
will illustrate the concepts behind power by coding up simulations in R.
Suppose our sample size is:
~~~
N <- 4
~~~
{: .language-r}
and we will reject the null hypothesis at:
~~~
alpha <- 0.05
~~~
{: .language-r}
What is our power with this particular data? We will compute this probability by re-running the exercise many times and calculating the proportion of times the null hypothesis is rejected. Specifically, we will run:
~~~
B <- 2000
~~~
{: .language-r}
simulations. The simulation is as follows: we take a sample of size $N$ from both control and treatment groups, we perform a t-test comparing these two, and report if the p-value is less than `alpha` or not. We write a function that does this:
~~~
reject <- function(N, alpha=0.05){
hf <- sample(hfPopulation, N)
control <- sample(controlPopulation, N)
pval <- t.test(hf, control)$p.value
pval < alpha
}
~~~
{: .language-r}
Here is an example of one simulation for a sample size of 12. The call to `reject` answers the question "Did we reject?"
~~~
reject(N)
~~~
{: .language-r}
~~~
[1] FALSE
~~~
{: .output}
Now we can use the `replicate` function to do this `B` times.
~~~
B <- 10
rejections <- replicate(B, reject(N))
~~~
{: .language-r}
Our power is just the proportion of times we correctly reject. So with $N=12$ our power is only:
~~~
mean(rejections)
~~~
{: .language-r}
~~~
[1] 0.1
~~~
{: .output}
This explains why the t-test was not rejecting when we knew the null
was false. With a sample size of just 12, our power is about 0.1 percent. To
guard against false positives at the 0.05 level, we had set the
threshold at a high enough level that resulted in many type II
errors.
Let's see how power improves with N. We will use the function `sapply`, which applies a function to each of the elements of a vector. We want to repeat the above for the following sample sizes from 5 to 50 in increments of 5:
~~~
Ns <- seq(from = 5, to = 50, by = 5)
~~~
{: .language-r}
So we use `apply` like this:
~~~
power <- sapply(Ns, function(N){
rejections <- replicate(B, reject(N))
mean(rejections)
})
~~~
{: .language-r}
For each of the three simulations, the above code returns the proportion of times we reject. Not surprisingly power increases with N:
~~~
plot(Ns, power, type="b")
~~~
{: .language-r}

Similarly, if we change the level `alpha` at which we reject, power
changes. The smaller I want the chance of type I error to be, the less
power I will have. Another way of saying this is that we trade off
between the two types of error. We can see this by writing similar code, but
keeping $N$ fixed and considering several values of `alpha`:
~~~
N <- 30
alphas <- c(0.1, 0.05, 0.01, 0.001, 0.0001)
power <- sapply(alphas, function(alpha){
rejections <- replicate(B, reject(N, alpha=alpha))
mean(rejections)
})
plot(alphas, power, xlab="alpha", type="b", log="x")
~~~
{: .language-r}

Note that the x-axis in this last plot is in the log scale.
There is no "right" power or "right" alpha level, but it is important that you
understand what each means.
To see this clearly, you could create a plot with curves of power versus N.
Show several curves in the same plot with color representing alpha level.
#### p-values Are Arbitrary under the Alternative Hypothesis
Another consequence of what we have learned about power is that
p-values are somewhat arbitrary when the
null hypothesis is not true and therefore
the *alternative* hypothesis is true (the
difference between the population means is not zero).
When the alternative hypothesis is true,
we can make a p-value as small as we want simply by increasing
the sample size (supposing that we have an infinite population to sample
from). We can show this property of p-values
by drawing larger and larger samples from our
population and calculating p-values. This works because, in our case,
we know that the alternative hypothesis is true, since we have
access to the populations and can calculate the difference in their means.
First write a function that returns a p-value for a given sample size $N$:
~~~
calculatePvalue <- function(N) {
hf <- sample(hfPopulation, N)
control <- sample(controlPopulation, N)
t.test(hf, control)$p.value
}
~~~
{: .language-r}
We have a limit here of 197 for the high-fat diet population, but we can
see the effect well before we get to 197.
For each sample size, we will calculate a few p-values. We can do
this by repeating each value of $N$ a few times.
~~~
Ns <- seq(from = 10, to = length(hfPopulation), by=10)
Ns_rep <- rep(Ns, each=10)
~~~
{: .language-r}
Again we use `sapply` to run our simulations:
~~~
pvalues <- sapply(Ns_rep, calculatePvalue)
~~~
{: .language-r}
Now we can plot the 10 p-values we generated for each sample size:
~~~
plot(Ns_rep, pvalues, log="y", xlab="sample size",
ylab="p-values")
abline(h=c(.01, .05), col="red", lwd=2)
~~~
{: .language-r}

Note that the y-axis is log scale and that the p-values show a
decreasing trend all the way to $10^{-8}$
as the sample size gets larger. The standard cutoffs
of 0.01 and 0.05 are indicated with horizontal red lines.
It is important to remember that p-values are not more interesting as
they become very very small. Once we have convinced ourselves to
reject the null hypothesis at a threshold we find reasonable, having
an even smaller p-value just means that we sampled more mice than was
necessary. Having a larger sample size does help to increase the
precision of our estimate of the difference $\Delta$, but the fact
that the p-value becomes very very small is just a natural consequence
of the mathematics of the test. The p-values get smaller and smaller
with increasing sample size because the numerator of the t-statistic
has $\sqrt{N}$ (for equal sized groups, and a similar effect occurs
when $M \neq N$). Therefore, if $\Delta$ is non-zero, the t-statistic
will increase with $N$.
Therefore, a better statistic to report is the effect size with
a confidence interval or some statistic which gives the reader a
sense of the change in a meaningful scale. We can
report the effect size as a percent by dividing the difference
and the confidence interval by the control population mean:
~~~
N <- 12
hf <- sample(hfPopulation, N)
control <- sample(controlPopulation, N)
diff <- mean(hf, na.rm = TRUE) - mean(control, na.rm = TRUE)
diff / mean(control, na.rm = TRUE) * 100
~~~
{: .language-r}
~~~
[1] 24.24669
~~~
{: .output}
~~~
t.test(hf, control)$conf.int / mean(control, na.rm = TRUE) * 100
~~~
{: .language-r}
~~~
[1] 5.940412 42.552963
attr(,"conf.level")
[1] 0.95
~~~
{: .output}
In addition, we can report a statistic called
[Cohen's d](https://en.wikipedia.org/wiki/Effect_size#Cohen.27s_d),
which is the difference between the groups divided by the pooled standard
deviation of the two groups.
~~~
sd_pool <- sqrt(((N - 1) * var(hf, na.rm = TRUE) + (N - 1) * var(control, na.rm = TRUE))/(2 * N - 2))
diff / sd_pool
~~~
{: .language-r}
~~~
[1] 1.145321
~~~
{: .output}
This tells us how many standard deviations of the data the mean of the
high-fat diet group is from the control group. Under the
alternative hypothesis, unlike the t-statistic which is guaranteed to
increase, the effect size and Cohen's d will become more precise.
> ## Challenge 1: Power and sample size calculation for two treatment comparison (Gary's suggestion)
> 1). Specify power and size of test, variance, meaningful difference. Compute N.
> 2). Specify N, power, variance. Compute meaningful difference (what can I hope to get with my budget fixed?).
>
> > ## Solution to Challenge 1
> >
> {: .solution}
{: .challenge}
> ## Challenge 2: Draw some power curves (Gary's suggestion)
> 1). How are they affected by alpha, beta, delta, N?
> 2). If you double the sample size how does delta change?
> Use formulas, functions, statistical rules of thumb.
> Share your power curve with your neighbor.
> How would you choose sample size?
> What do you consider wasteful sample sizes when
> looking at these curves?
>
> > ## Solution to Challenge 2
> >
> {: .solution}
{: .challenge}
> ## Challenge 3: Compute power by simulation (Gary's suggestion)
> Learn how to simulate data.
> Draw power curves from simulations. Compare to theoretical curves.
> > ## Solution to Challenge 3
> >
> {: .solution}
{: .challenge}
> ## Challenge 4 (advanced): Permutation tests (Gary's suggestion)
>
> > ## Solution to Challenge 4
> >
> {: .solution}
{: .challenge}
> ## Challenge 5: Watch [A Biologist Talks to a Statistician](https://www.youtube.com/watch?v=Hz1fyhVOjr4)
> Do you empathize more now with the biologist or with the statistician?
>
{: .challenge}

| 30.63035 | 554 | 0.728532 | eng_Latn | 0.999019 |
4fdc24dc028bff4a4007803a51e347edc4ad421a | 2,397 | md | Markdown | README.md | Albertdemian/rex-gym | 35e7eaf269321c051c24a2523a217fedc17c11c2 | [
"Apache-2.0"
] | null | null | null | README.md | Albertdemian/rex-gym | 35e7eaf269321c051c24a2523a217fedc17c11c2 | [
"Apache-2.0"
] | 2 | 2020-11-13T18:48:19.000Z | 2021-05-21T16:27:12.000Z | README.md | Albertdemian/rex-gym | 35e7eaf269321c051c24a2523a217fedc17c11c2 | [
"Apache-2.0"
] | null | null | null | # rex-gym Environment for Multiple behaviors
Modified reward function
extended observation vector
in particular :
## in _init_() function, initialized:
```
self.affordance = [-1,-1,-1,-1,-1]
self.model ='Stand'
self.reference_base_position = []
self.target_orientation = []
```
## in step function I added the following lines:
```
if self._env_step_counter >0 and self._env_step_counter%200 ==0:
model_index = random.randint(0,3)
self.affordance = np.dot([1,1,1,1,1],model_index)
self.model = Behavioral_models[model_index]
print(self.model)
if self.model =='Turn Right' or self.model=='Turn Left':
self.reference_base_position = self.rex.GetBasePosition()
if self.model == 'Turn Right':
target_angle = random.uniform(-math.pi/4, -math.pi/2)
cur_orient = self.pybullet_client.getEulerFromQuaternion(self.rex.GetBaseOrientation())
self.target_orientation = [0,0,cur_orient[2]+target_angle]
elif self.model =='Turn Left':
target_angle = random.uniform(math.pi/4, math.pi/2)
cur_orient = self.pybullet_client.getEulerFromQuaternion(self.rex.GetBaseOrientation())
self.target_orientation = [0,0,cur_orient[2]+target_angle]
```
As the dictionary ```Behavioral_models``` defined as:
```
Behavioral_models = {-1:'Stand',
0:'Walk',
1:'Gallop',
2:'Turn Right',
3:'Turn Left'}
```
## In observation function I extended the observation pattern with :
```
observation.extend(self.affordance)
```
## Modified Termination condition:
```
def _termination(self):
position = self.rex.GetBasePosition()
o = self.rex.GetBaseOrientation()
if self.model == 'Stand':
roll, pitch, _ = self.rex.GetTrueBaseRollPitchYaw()
return math.fabs(roll) > 0.3 or math.fabs(pitch) > 0.5
else:
if position[2] < 0.13:
print("IS FALLEN!")
if self.is_fallen():
print("IS ROTATING!")
return self.is_fallen() or position[2] < 0.13
```
| 29.231707 | 107 | 0.55486 | eng_Latn | 0.449946 |
4fdc40087f07cc636ebbe1937543f447bc9bd6c0 | 302 | md | Markdown | page/3collections.md | zengzizhao/zengzizhao.github.io | a1852750b0b7d0b4841099d7ab57eb11423ad222 | [
"MIT"
] | null | null | null | page/3collections.md | zengzizhao/zengzizhao.github.io | a1852750b0b7d0b4841099d7ab57eb11423ad222 | [
"MIT"
] | null | null | null | page/3collections.md | zengzizhao/zengzizhao.github.io | a1852750b0b7d0b4841099d7ab57eb11423ad222 | [
"MIT"
] | null | null | null | ---
layout: page
title: Collections
permalink: /collection/
icon: bookmark
type: page
---
* content
{:toc}
## 工具
*
- [test](https://blog.zzz320.net/)
- test。
## 编程语言
### PHP
* [标题 -作者](https://blog.zzz320.net/)
test。
### Golang
### GameSever
## Comments
{% include comments.html %}
| 9.151515 | 36 | 0.589404 | kor_Hang | 0.233231 |
4fdc519ed1f536c6941e8d2773c1ea5b9392082a | 6,493 | md | Markdown | _posts/2013-08-12-ne-plus-se-planter-de-console-entre-sa-vm-la-preprod-la-prod.md | Keirua/blog.keiruaprod.fr | 76e6623ff3d625690e1dad02efa5e12073be5381 | [
"Unlicense"
] | null | null | null | _posts/2013-08-12-ne-plus-se-planter-de-console-entre-sa-vm-la-preprod-la-prod.md | Keirua/blog.keiruaprod.fr | 76e6623ff3d625690e1dad02efa5e12073be5381 | [
"Unlicense"
] | 3 | 2021-05-14T07:52:35.000Z | 2022-02-26T08:10:40.000Z | _posts/2013-08-12-ne-plus-se-planter-de-console-entre-sa-vm-la-preprod-la-prod.md | Keirua/blog.keiruaprod.fr | 76e6623ff3d625690e1dad02efa5e12073be5381 | [
"Unlicense"
] | 7 | 2020-11-15T23:23:27.000Z | 2022-01-14T21:30:22.000Z | ---
id: 593
title:
- Ne plus se planter de console entre sa VM, la préprod, la prod...
date: 2013-08-12T21:07:59+00:00
author: Keirua
layout: post
guid: http://blog.keiruaprod.fr/?p=593
permalink: /2013/08/12/ne-plus-se-planter-de-console-entre-sa-vm-la-preprod-la-prod/
keywords:
- Revelation, bash, ubuntu, process, astuce, préprod, serveurs
description:
- "A l'aide d'un code couleur et de l'applicationRevelation, il est facile de réduire les risques de se tromper de machine lorsque l'on se connecte régulièrement à plusieurs machines via SSH. "
robotsmeta:
- index,follow
categories:
- Astuce
tags:
- astuce
- bash
- préprod
- process
- Revelation
- serveurs
- ubuntu
lang: fr
---
Je travaille en permanence avec plusieurs machines différentes pour me connecter à des serveurs de développement, de préproduction, à ma machine virtuelle ou à des machines de productions. Après avoir oublié plusieurs fois certains mots de passe, puis m’être planté une fois ou deux de terminal et avoir lanché des scripts sur la mauvaise machine, j’ai commencé à utiliser des codes couleurs pour les différentes machines que j’utilise. Et comme il est pénible de changer les codes couleurs à chaque connection, je me suis finalement mis à utiliser **Revelation** pour résoudre les deux problèmes en une fois.
<div id="attachment_600" style="width: 310px" class="wp-caption alignright">
<img src="http://blog.keiruaprod.fr/wp-content/uploads/2013/08/codes_couleur-300x180.png" alt="L'astuce : une couleur pour chaque machine" width="300" height="180" class="size-medium wp-image-600" srcset="http://blog.keiruaprod.fr/wp-content/uploads/2013/08/codes_couleur-300x180.png 300w, http://blog.keiruaprod.fr/wp-content/uploads/2013/08/codes_couleur.png 940w" sizes="(max-width: 300px) 100vw, 300px" />
<p class="wp-caption-text">
L’astuce : une couleur pour chaque machine
</p>
</div>
Révélation est en effet un gestionnaire de mots de passe, qui sert à les centraliser, et à lancer diverses applications qui les utilisent. Un mot de passe maitre sert à protéger l’accès à des personnes non désirées (dans une certaine mesure). Je m’en sers donc pour me stocker les mots de pass de connection SSH, et pour me connecter directement aux machines distantes en SSH.
L’astuce pour ne pas me tromper de machine, c’est que chacune d’entre elle a une couleur différente. Rouge (comme danger) pour la machine de production, vert pour une VM, et ainsi de suite.
Pour faire pareil, il suffit d’installer Revelation (apt-get install revelation), puis de faire 3 choses :
## Créer un profil pour les couleurs des différentes machines :
<div id="attachment_598" style="width: 310px" class="wp-caption alignright">
<img src="http://blog.keiruaprod.fr/wp-content/uploads/2013/08/profils-300x191.png" alt="N'ayez pas peur d'avoir autant de profils que de machine" width="300" height="191" class="size-medium wp-image-598" srcset="http://blog.keiruaprod.fr/wp-content/uploads/2013/08/profils-300x191.png 300w, http://blog.keiruaprod.fr/wp-content/uploads/2013/08/profils.png 516w" sizes="(max-width: 300px) 100vw, 300px" />
<p class="wp-caption-text">
N’ayez pas peur d’avoir autant de profils que de machine
</p>
</div>
Faire la manipulation suivante (sur Ubuntu) pour vos différentes machines :
Ouvrir une console
Edition -> profils -> nouveau
On lui **donne un nom** (notez le, il va servir plus tard), et le fait se baser sur un profil déjà plaisant
Dans l’onglet couleur, il suffit de décocher « utiliser les couleurs du thème système », et de choisir des couleurs (de texte et d’arrière plan) qui conviennent.
Quand vous avez terminé, vous devriez avoir autant de profils que vous utilisez de machine différente, et chacune d’entre elle aura une couleur différente. Vous pouvez remarquer que je nomme mes profils sous la forme profilXXX, histoire de m’y retrouver.
## Configurer les raccourcis dans Revelation :
<div id="attachment_599" style="width: 310px" class="wp-caption alignright">
<a href="http://blog.keiruaprod.fr/wp-content/uploads/2013/08/config_vm.png"><img src="http://blog.keiruaprod.fr/wp-content/uploads/2013/08/config_vm-300x200.png" alt="N'oubliez pas de mettre le nom du profil dans "Domaine"" width="300" height="200" class="size-medium wp-image-599" srcset="http://blog.keiruaprod.fr/wp-content/uploads/2013/08/config_vm-300x200.png 300w, http://blog.keiruaprod.fr/wp-content/uploads/2013/08/config_vm.png 581w" sizes="(max-width: 300px) 100vw, 300px" /></a>
<p class="wp-caption-text">
N’oubliez pas de mettre le nom du profil dans « Domaine »
</p>
</div>
Une fois qu’on a les différents profils, il faut ensuite configurer les différentes connection SSH, comme dans l’image de droite.
On crée un nouveau raccourci de type shell, avec le nom de son choix, puis on remplit le nom d’hôte, le nom d’utilisateur et le mot de passe nécessaire à la connection à son shell.
L’astuce pour lancer la connection avec le profil désiré, c’est de mettre le nom du profil dans le champ « domaine ». On va ensuite configurer Revelation pour utiliser cette information comme nous le souhaitons.
N’oubliez pas de sauvegarder votre configuration Revelation (ctrl+S, ou le gros bouton enregistrer)
## Configurer l’utilisation des profils au lancement des connections SSH :
Dernière étape, il faut préciser à Revelation d’utiliser nos profils lorsque l’on lance un terminal. Pour celà, il suffit d’aller dans Editer -> Préférences -> Commandes de lancement, et dans la ligne SSH, de mettre ce qui suit :
<code lang="bash"><br />
gnome-terminal %(--profile=%d%) -x ssh %(-l %u%) %h<br />
</code>
Cette ligne parle d’elle même, on lance un terminal avec une connection ssh dont les options sont les paramètres de configuration de la machine donnée, et on choisit le profil correspondant pour le terminal.
Terminé !
L’utilisation est simple : on ouvre sa configuration dans revelation, choisit une machine, et le mot de passe est copié dans le presse papier. Sous Ubuntu, on peut le coller avec clic-molette, on appuie sur entrée, et c’est bon, on est connecté à sa machine. Et voila, fini les pertes de mot de passe, et fini les commandes lancées sur la mauvaise machine. Merci à Antho pour l’astuce ! | 71.351648 | 627 | 0.761589 | fra_Latn | 0.939424 |
4fddd65352016478273ef4a24c80c81b0abe2949 | 1,072 | md | Markdown | docs/public/why-not/index.md | jnichols0/strictyaml | b456066a763285532fd75cd274d4a275d8499d6c | [
"MIT"
] | 1,051 | 2016-07-28T19:17:55.000Z | 2022-03-29T11:33:05.000Z | docs/public/why-not/index.md | jnichols0/strictyaml | b456066a763285532fd75cd274d4a275d8499d6c | [
"MIT"
] | 166 | 2016-09-04T16:15:22.000Z | 2022-03-31T09:48:06.000Z | docs/public/why-not/index.md | jnichols0/strictyaml | b456066a763285532fd75cd274d4a275d8499d6c | [
"MIT"
] | 68 | 2016-09-03T10:40:47.000Z | 2022-03-29T11:33:23.000Z | ---
title: Why not X?
---
There are a number of formats and approaches that can achieve more or
less the same purpose as StrictYAML. Below is a series of comparisons
with some of the more famous ones:
- [Why avoid using environment variables as configuration?](environment-variables-as-config)
- [Why not use HJSON?](hjson)
- [Why not HOCON?](hocon)
- [Why not use INI files?](ini)
- [Why not use JSON Schema for validation?](json-schema)
- [Why not JSON for simple configuration files?](json)
- [Why not JSON5?](json5)
- [Why not use the YAML 1.2 standard? - we don't need a new standard!](ordinary-yaml)
- [Why not use kwalify with standard YAML to validate my YAML?](pykwalify)
- [Why not use Python's schema library (or similar) for validation?](python-schema)
- [Why not use SDLang?](sdlang)
- [What is wrong with TOML?](toml)
- [Why shouldn't I just use Python code for configuration?](turing-complete-code)
- [Why not use XML for configuration or DSLs?](xml)
If you'd like to write or link to a rebuttal to any argument raised
here, feel free to raise a ticket. | 41.230769 | 92 | 0.733209 | eng_Latn | 0.995587 |
4fde3b7930e48cf8dec2aa902ad06d03cd26a0ad | 39 | md | Markdown | README.md | kabircse/phpcaptcha | 65b7e2e7ed9fd07b0392a2b33cd275d4a0e65d32 | [
"MIT"
] | null | null | null | README.md | kabircse/phpcaptcha | 65b7e2e7ed9fd07b0392a2b33cd275d4a0e65d32 | [
"MIT"
] | null | null | null | README.md | kabircse/phpcaptcha | 65b7e2e7ed9fd07b0392a2b33cd275d4a0e65d32 | [
"MIT"
] | null | null | null | # math-captcha
Captcha library in php
| 13 | 23 | 0.769231 | eng_Latn | 0.60355 |
4fded0e497f512e8d92e1effe42ef9e383f904ae | 2,109 | md | Markdown | results/innerfidelity/sbaf-serious/Sony MDR-1RBT/README.md | M0Rf30/AutoEq | 5f296debab6e6251659c346f3ee33b8f1b5a2aaa | [
"MIT"
] | 1 | 2020-02-19T16:59:27.000Z | 2020-02-19T16:59:27.000Z | results/innerfidelity/sbaf-serious/Sony MDR-1RBT/README.md | M0Rf30/AutoEq | 5f296debab6e6251659c346f3ee33b8f1b5a2aaa | [
"MIT"
] | null | null | null | results/innerfidelity/sbaf-serious/Sony MDR-1RBT/README.md | M0Rf30/AutoEq | 5f296debab6e6251659c346f3ee33b8f1b5a2aaa | [
"MIT"
] | null | null | null | # Sony MDR-1RBT
See [usage instructions](https://github.com/jaakkopasanen/AutoEq#usage) for more options.
### EqualizerAPO
In case of using EqualizerAPO without any GUI, replace `C:\Program Files\EqualizerAPO\config\config.txt`
with:
```
Preamp: -6.1dB
GraphicEQ: 21 0.0; 23 0.4; 25 -0.5; 28 -1.6; 31 -2.6; 34 -3.4; 37 -4.1; 41 -4.9; 45 -5.6; 49 -6.2; 54 -6.7; 60 -7.2; 66 -7.8; 72 -8.4; 79 -8.7; 87 -9.4; 96 -9.2; 106 -8.4; 116 -7.6; 128 -7.6; 141 -7.8; 155 -7.8; 170 -4.1; 187 -4.0; 206 -1.8; 227 0.9; 249 4.1; 274 5.0; 302 6.0; 332 5.5; 365 5.0; 402 3.4; 442 1.5; 486 0.8; 535 1.8; 588 3.0; 647 2.1; 712 1.1; 783 2.4; 861 1.2; 947 0.0; 1042 0.3; 1146 0.6; 1261 -1.0; 1387 -3.1; 1526 -5.5; 1678 -6.9; 1846 -8.6; 2031 -8.3; 2234 -8.3; 2457 -5.0; 2703 -1.6; 2973 1.3; 3270 1.5; 3597 -0.2; 3957 -1.1; 4353 0.4; 4788 3.9; 5267 6.0; 5793 6.0; 6373 5.5; 7010 2.5; 7711 0.3; 8482 -2.9; 9330 -4.8; 10263 -0.3; 11289 0.0
```
### HeSuVi
HeSuVi 2.0 ships with most of the pre-processed results. If this model can't be found in HeSuVi add
`Sony MDR-1RBT GraphicEQ.txt` to `C:\Program Files\EqualizerAPO\config\HeSuVi\eq\custom\` folder.
Set volume attenuation in the Connection tab for both channels to **-61**
### Peace
In case of using Peace, click *Import* in Peace GUI and select `Sony MDR-1RBT ParametricEQ.txt`.
### Parametric EQs
In case of using other parametric equalizer, apply preamp of **-7.3dB** and build filters manually
with these parameters. The first 4 filters can be used independently.
When using independent subset of filters, apply preamp of **-7.1dB**.
| Type | Fc | Q | Gain |
|:--------|:--------|:-----|:---------|
| Peaking | 75 Hz | 1.24 | -9.5 dB |
| Peaking | 139 Hz | 3.35 | -6.0 dB |
| Peaking | 1955 Hz | 2.75 | -10.1 dB |
| Peaking | 5621 Hz | 3.3 | 7.3 dB |
| Peaking | 309 Hz | 2.61 | 7.2 dB |
| Peaking | 671 Hz | 1.8 | 2.2 dB |
| Peaking | 6582 Hz | 7.52 | 2.2 dB |
| Peaking | 9053 Hz | 5.56 | -5.9 dB |
 | 58.583333 | 660 | 0.626837 | eng_Latn | 0.490925 |
4fdef0adcae0353a97f136eb7785a6676b485e04 | 382 | md | Markdown | README.md | RayDarar/dynamic-menu | fa915dd7ecaf08b5d7cc748df3e1386f69dbd44f | [
"MIT"
] | null | null | null | README.md | RayDarar/dynamic-menu | fa915dd7ecaf08b5d7cc748df3e1386f69dbd44f | [
"MIT"
] | null | null | null | README.md | RayDarar/dynamic-menu | fa915dd7ecaf08b5d7cc748df3e1386f69dbd44f | [
"MIT"
] | null | null | null | # Dynamic Menu 🧮
Simple page with dynamic layout using vanilla HTML, CSS, JS.

## Sources ⚓️
This project is a part of challenge problems from this article on [Medium](https://medium.com/better-programming/here-are-5-front-end-challenges-to-code-dec-2019-edition-7d691c4b023).
## Licence 🚨
Project is licensed under the [MIT](./LICENSE) public license
| 27.285714 | 183 | 0.748691 | eng_Latn | 0.8986 |
4fdf746201ffdca676ce0614c7759eab31a14ec5 | 4,804 | md | Markdown | README-zh.md | dgynfi/DYFSwiftKeychain | 115ca0f66c60375ee0e6a8b80ffaf2143902bc82 | [
"MIT"
] | 1 | 2019-12-23T16:28:19.000Z | 2019-12-23T16:28:19.000Z | README-zh.md | dgynfi/DYFSwiftKeychain | 115ca0f66c60375ee0e6a8b80ffaf2143902bc82 | [
"MIT"
] | 1 | 2020-06-08T13:18:04.000Z | 2020-06-08T13:18:04.000Z | README-zh.md | dgynfi/DYFSwiftKeychain | 115ca0f66c60375ee0e6a8b80ffaf2143902bc82 | [
"MIT"
] | null | null | null | ## DYFSwiftKeychain
在钥匙串中存储文本和数据。你可能已经注意到苹果的 Keychain API 有点冗长。此库旨在提供较短的语法完成简单任务:读取/写入指定键的文本值:
```Swift
let keychain = DYFSwiftKeychain()
keychain.set("User Account Passcode", forKey: "kUserAccPasscode")
let p = keychain.get("kUserAccPasscode")
```
此 Keychain 库包括以下功能:
- 获取、设置和删除字符串、布尔值和数据的钥匙串项。
- 指定项访问安全级别。
- 通过 iCloud 同步项。
- 与其他应用程序共享钥匙串项。
[](LICENSE)
[](http://cocoapods.org/pods/DYFSwiftKeychain)

## QQ群 (ID:614799921)
<div align=left>
  <img src="https://github.com/dgynfi/DYFSwiftKeychain/raw/master/images/g614799921.jpg" width="30%" />
</div>
## 安装
使用 [CocoaPods](https://cocoapods.org):
```
use_frameworks!
target 'Your target name'
pod 'DYFSwiftKeychain', '~> 1.0.3'
```
或者从 [SwiftKeychain](https://github.com/dgynfi/DYFSwiftKeychain/tree/master/SwiftKeychain) 目录添加文件。
## 什么是 Keychain?
Keychain 是一种安全的存储方式。你可以在其中存储所有类型的敏感数据:用户密码、信用卡号码、秘密令牌等。一旦存储在 Keychain 中,这些信息只对你的应用可用,其他应用看不到。除此之外,操作系统确保这些信息被安全地保存和处理。例如,存储在 Keychain 中的文本不能从 iPhone 备份或其文件系统中提取。苹果建议只在 Keychain 中存储少量数据。如果你需要保护一些大的东西,你可以手动加密它,保存到一个文件,并将密钥存储在 Keychain 中。
## 使用
将 `import DYFSwiftKeychain` 添加到源代码中。
#### 写入/读取字符串
```Swift
let keychain = DYFSwiftKeychain()
keychain.set("User Account Passcode", forKey: "kUserAccPasscode")
let s = keychain.get("kUserAccPasscode")
```
#### 写入/读取字节序列
```Swift
let keychain = DYFSwiftKeychain()
keychain.set(data, forKey: "kCommSecureCode")
let data = keychain.getData("kCommSecureCode")
```
#### 写入/读取布尔值
```Swift
let keychain = DYFSwiftKeychain()
keychain.set(true, forKey: "kFirstInstalledAndLaunched")
let ret = keychain.getBool("kFirstInstalledAndLaunched")
```
#### 从钥匙串移除数据
```Swift
let keychain = DYFSwiftKeychain()
let _ = keychain.delete("kFirstInstalledAndLaunched") // Remove single key.
let _ = keychain.clear() // Delete everything from app's Keychain. Does not work on macOS.
```
## 高级选项
#### 钥匙串项访问选项
使用 `withAccess` 参数指定 Keychain 存储的安全级别。默认情况下使用 `.accessibleWhenUnlocked` 选项。它是限制性最强的选项之一,可以提供良好的数据保护。
```Swift
let keychain = DYFSwiftKeychain()
keychain.set("xxx", forKey:"Key1", withAccess: .accessibleWhenUnlocked)
```
如果需要应用程序在后台访问钥匙串项,则可以使用 `.accessibleAfterFirstUnlock`。请注意,它比 `.accessibleWhenUnlocked` 选项更不安全。
查看所有可用的 [访问选项](https://github.com/dgynfi/DYFSwiftKeychain/blob/master/SwiftKeychain/DYFSwiftKeychain.swift) 列表。
#### 将钥匙串项与其他设备同步
将 `synchronizable` 属性设置为 `true` 可在用户的多个设备之间启用钥匙串项同步。同步将适用于在其设备上的 iCloud 设置中启用了 Keychain 的用户。
将 `synchronizable` 属性设置为 `true` 将使用 `set` 方法将该项添加到其他设备,并使用 `get` 命令获取可同步的项。删除可同步项将从所有设备中删除。
请注意,您不需要在应用程序的目标中启用 iCloud 或 Keychain 共享功能,使此功能工作。
```Swift
// The first device
let keychain = DYFSwiftKeychain()
keychain.synchronizable = true
keychain.set("See you tomorrow!", forKey: "key12")
// The second device
let keychain = DYFSwiftKeychain()
keychain.synchronizable = true
let s = keychain.get("key12") // Returns "See you tomorrow!"
```
我们无法在 macOS 上进行钥匙串同步工作。
#### 与其他应用程序共享钥匙串项
为了在同一设备上的应用程序之间共享钥匙串项,它们需要在*Capabilities > Keychain Sharing*设置中注册通用*Keychain Groups*。
使用 `accessGroup` 属性访问共享钥匙串项。在下面的示例中,我们指定一个访问组 `9ZU3R2F3D4.com.omg.myapp.KeychainGroup`,它将用于设置、获取和删除 `key1` 项。
```Swift
let keychain = DYFSwiftKeychain()
keychain.accessGroup = "9ZU3R2F3D4.com.omg.myapp.KeychainGroup" // Use your own access group.
keychain.set("hello world", forKey: "key1")
keychain.get("key1")
keychain.delete("key1")
keychain.clear()
```
*注*:watchOS 2.0与其配对设备之间无法共享钥匙串项:https://forums.developer.apple.com/thread/5938
#### 检查操作是否成功
通过检查 `set`, `delete` 和 `clear` 方法的返回值,可以验证它们是否成功完成。这些方法成功时返回 `true`,出错时返回 `false`。
```Swift
let keychain = DYFSwiftKeychain()
if keychain.set("xxx", forKey: "key1") {
// Keychain item is saved successfully
} else {
// Report error
}
```
若要获取特定的失败原因,请使用包含上一个操作的结果代码的 `osStatus` 属性。请参见 [Keychain Result Codes](https://developer.apple.com/documentation/security/1542001-security_framework_result_codes).
```Swift
let keychain = DYFSwiftKeychain()
keychain.set("xxx", forKey: "key1")
if keychain.osStatus != errSecSuccess { /* Report error */ }
```
#### 数据引用返回
使用 `asReference: true` 参数将数据作为引用返回,这是 [NEVPNProtocol](https://developer.apple.com/documentation/networkextension/nevpnprotocol) 所需的。
```Swift
let keychain = DYFSwiftKeychain()
keychain.set(data, forKey: "key1")
keychain.getData("key1", asReference: true)
```
## 在 ObjC 中使用 DYFSwiftKeychain
`DYFSwiftKeychain` 支持在 Objective-C 应用程序中使用。
## 演示
`DYFSwiftKeychain` 在此 [演示](https://github.com/dgynfi/DYFStore/blob/master/DYFStore/DYFStoreKeychainPersistence.swift) 下学习如何使用。
## 欢迎反馈
如果你注意到任何问题,被卡住或只是想聊天,请随意制造一个问题。我乐意帮助你。
| 25.020833 | 235 | 0.755412 | yue_Hant | 0.523699 |
4fdfd94aba251669166c796bc869da64bd3f5c21 | 11,562 | md | Markdown | articles/virtual-machines/troubleshooting/troubleshoot-rdp-general-error.md | changeworld/azure-docs.pt-pt | 8a75db5eb6af88cd49f1c39099ef64ad27e8180d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/troubleshooting/troubleshoot-rdp-general-error.md | changeworld/azure-docs.pt-pt | 8a75db5eb6af88cd49f1c39099ef64ad27e8180d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/troubleshooting/troubleshoot-rdp-general-error.md | changeworld/azure-docs.pt-pt | 8a75db5eb6af88cd49f1c39099ef64ad27e8180d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Problemas disparam um erro geral do RDP num VM do Windows em Azure Microsoft Docs
description: Saiba como resolver um erro geral do RDP num Windows VM em Azure Microsoft Docs
services: virtual-machines-windows
documentationCenter: ''
author: genlin
manager: dcscontentpm
editor: ''
ms.service: virtual-machines-windows
ms.topic: troubleshooting
ms.tgt_pltfrm: vm-windows
ms.workload: infrastructure
ms.date: 10/31/2018
ms.author: genli
ms.openlocfilehash: 7fc0fbf3362d18284ad6a80afa6396b6be1270a9
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: MT
ms.contentlocale: pt-PT
ms.lasthandoff: 03/27/2020
ms.locfileid: "71058008"
---
# <a name="troubleshoot-an-rdp-general-error-in-azure-vm"></a>Problemas de resolução de um erro geral do PDR no Azure VM
Este artigo descreve um erro geral que pode experimentar quando efauma ligação ao Protocolo de Ambiente de Trabalho Remoto (RDP) a uma Máquina Virtual do Windows (VM) em Azure.
## <a name="symptom"></a>Sintoma
Quando efizer uma ligação RDP a um VM da janela em Azure, poderá receber a seguinte mensagem de erro geral:
**O Ambiente de Trabalho Remoto não pode ligar-se ao computador remoto por uma destas razões:**
1. **O acesso remoto ao servidor não está ativado**
2. **O computador remoto está desligado.**
3. **O computador remoto não está disponível na rede**
**Certifique-se de que o computador remoto está ligado e ligado à rede e que o acesso remoto está ativado.**
## <a name="cause"></a>Causa
Este problema pode ocorrer devido às seguintes causas:
### <a name="cause-1"></a>Causa 1
O componente RDP é desativado da seguinte forma:
- Ao nível dos componentes
- Ao nível do ouvinte
- No servidor de terminais
- No papel de anfitrião da sessão de ambiente de trabalho remoto
### <a name="cause-2"></a>Causa 2
Os Serviços de Ambiente de Trabalho Remoto (TermService) não estão a funcionar.
### <a name="cause-3"></a>Causa 3
O ouvinte rdp está mal configurado.
## <a name="solution"></a>Solução
Para resolver este problema, [volte a colocar o disco do sistema operativo](../windows/snapshot-copy-managed-disk.md)e fixe o disco do sistema operativo a um [VM](troubleshoot-recovery-disks-portal-windows.md)de resgate e, em seguida, siga os passos.
### <a name="serial-console"></a>Consola de Série
#### <a name="step-1-open-cmd-instance-in-serial-console"></a>Passo 1: Alugar cmD aberto na consola em série
1. Aceda à [Consola em Série](serial-console-windows.md) selecionando suporte & consola série de resolução > de **problemas****(Pré-visualização)**. Se a funcionalidade estiver ativada no VM, pode ligar o VM com sucesso.
2. Crie um novo canal para uma instância CMD. Digite **CMD** para iniciar o canal para obter o nome do canal.
3. Mude para o canal que executa a instância CMD, neste caso deve ser o canal 1.
```
ch -si 1
```
#### <a name="step-2-check-the-values-of-rdp-registry-keys"></a>Passo 2: Verifique os valores das teclas de registo RDP:
1. Verifique se o PDR é desativado pela polícia.
```
REM Get the local policy
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server " /v fDenyTSConnections
REM Get the domain policy if any
reg query "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services" /v fDenyTSConnections
```
- Se a política de domínio existir, a configuração da política local é substituída.
- Se a política de domínio afirma que o PDR está desativado (1), então atualize a política de AD a partir do controlador de domínio.
- Se a política de domínio afirma que o PDR está ativado (0), então não é necessária nenhuma atualização.
- Se a política de domínio não existir e a política local estipula que o PDR está desativado (1), habilitar o PDR utilizando o seguinte comando:
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v fDenyTSConnections /t REG_DWORD /d 0 /f
2. Verifique a configuração atual do servidor de terminais.
```
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v TSEnabled
```
Se o comando devolver 0, o servidor de terminais é desativado. Em seguida, ative o servidor de terminais da seguinte forma:
```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v TSEnabled /t REG_DWORD /d 1 /f
```
3. O módulo 'Servidor terminal' está definido para drenar o modo se o servidor estiver numa exploração de servidores de terminais (RDS ou Citrix). Verifique o modo atual do módulo Terminal Server.
```
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v TSServerDrainMode
```
Se o comando voltar 1, o módulo do Servidor de Terminais está programado para drenar o modo. Em seguida, coloque o módulo no modo de funcionamento da seguinte forma:
```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v TSServerDrainMode /t REG_DWORD /d 0 /f
```
4. Verifique se pode ligar-se ao servidor de terminais.
```
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v TSUserEnabled
```
Se o comando devolver 1, não pode ligar-se ao servidor de terminais. Em seguida, ativar a ligação da seguinte forma:
```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v TSUserEnabled /t REG_DWORD /d 0 /f
```
5. Verifique a configuração atual do ouvinte RDP.
```
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\Winstations\RDP-Tcp" /v fEnableWinStation
```
Se o comando voltar a 0, o ouvinte RDP está desativado. Em seguida, ativar o ouvinte da seguinte forma:
```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\Winstations\RDP-Tcp" /v fEnableWinStation /t REG_DWORD /d 1 /f
```
6. Verifique se pode ligar-se ao ouvinte RDP.
```
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\Winstations\RDP-Tcp" /v fLogonDisabled
```
Se o comando voltar 1, não pode ligar-se ao ouvinte RDP. Em seguida, ativar a ligação da seguinte forma:
```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\Winstations\RDP-Tcp" /v fLogonDisabled /t REG_DWORD /d 0 /f
```
7. Reinicie a VM.
8. Saia da instância CMD `exit`digitando e, em seguida, pressione **Enter** duas vezes.
9. Reiniciar o VM `restart`digitando e, em seguida, ligue-se ao VM.
Se o problema ainda acontecer, passe para o passo 2.
#### <a name="step-2-enable-remote-desktop-services"></a>Passo 2: Ativar serviços de ambiente de trabalho remoto
Para mais informações, consulte [remote Desktop Services não está a começar num Azure VM](troubleshoot-remote-desktop-services-issues.md).
#### <a name="step-3-reset-rdp-listener"></a>Passo 3: Reset RDP ouvinte
Para mais informações, consulte [a desconexão do Ambiente remoto frequentemente em Azure VM](troubleshoot-rdp-intermittent-connectivity.md).
### <a name="offline-repair"></a>Reparação offline
#### <a name="step-1-turn-on-remote-desktop"></a>Passo 1: Ligue o ambiente de trabalho remoto
1. [Fixe o disco OS a um VM](../windows/troubleshoot-recovery-disks-portal.md)de recuperação .
2. Inicie uma ligação remote Desktop ao VM de recuperação.
3. Certifique-se de que o disco está sinalizado como **Online** na consola de Gestão de Discos. Note a letra de unidade que é atribuída ao disco osso anexado.
4. Inicie uma ligação remote Desktop ao VM de recuperação.
5. Abra uma sessão de solicitação de comando elevada **(Executar como administrador).** Execute os seguintes scripts. Neste script, assumimos que a letra de unidade atribuída ao disco OS anexado é F. Substitua esta carta de unidade com o valor adequado para o seu VM.
```
reg load HKLM\BROKENSYSTEM F:\windows\system32\config\SYSTEM.hiv
reg load HKLM\BROKENSOFTWARE F:\windows\system32\config\SOFTWARE.hiv
REM Ensure that Terminal Server is enabled
reg add "HKLM\BROKENSYSTEM\ControlSet001\control\Terminal Server" /v TSEnabled /t REG_DWORD /d 1 /f
reg add "HKLM\BROKENSYSTEM\ControlSet002\control\Terminal Server" /v TSEnabled /t REG_DWORD /d 1 /f
REM Ensure Terminal Service is not set to Drain mode
reg add "HKLM\BROKENSYSTEM\ControlSet001\control\Terminal Server" /v TSServerDrainMode /t REG_DWORD /d 0 /f
reg add "HKLM\BROKENSYSTEM\ControlSet002\control\Terminal Server" /v TSServerDrainMode /t REG_DWORD /d 0 /f
REM Ensure Terminal Service has logon enabled
reg add "HKLM\BROKENSYSTEM\ControlSet001\control\Terminal Server" /v TSUserEnabled /t REG_DWORD /d 0 /f
reg add "HKLM\BROKENSYSTEM\ControlSet002\control\Terminal Server" /v TSUserEnabled /t REG_DWORD /d 0 /f
REM Ensure the RDP Listener is not disabled
reg add "HKLM\BROKENSYSTEM\ControlSet001\control\Terminal Server\Winstations\RDP-Tcp" /v fEnableWinStation /t REG_DWORD /d 1 /f
reg add "HKLM\BROKENSYSTEM\ControlSet002\control\Terminal Server\Winstations\RDP-Tcp" /v fEnableWinStation /t REG_DWORD /d 1 /f
REM Ensure the RDP Listener accepts logons
reg add "HKLM\BROKENSYSTEM\ControlSet001\control\Terminal Server\Winstations\RDP-Tcp" /v fLogonDisabled /t REG_DWORD /d 0 /f
reg add "HKLM\BROKENSYSTEM\ControlSet002\control\Terminal Server\Winstations\RDP-Tcp" /v fLogonDisabled /t REG_DWORD /d 0 /f
REM RDP component is enabled
reg add "HKLM\BROKENSYSTEM\ControlSet001\control\Terminal Server" /v fDenyTSConnections /t REG_DWORD /d 0 /f
reg add "HKLM\BROKENSYSTEM\ControlSet002\control\Terminal Server" /v fDenyTSConnections /t REG_DWORD /d 0 /f
reg add "HKLM\BROKENSOFTWARE\Policies\Microsoft\Windows NT\Terminal Services" /v fDenyTSConnections /t REG_DWORD /d 0 /f
reg unload HKLM\BROKENSYSTEM
reg unload HKLM\BROKENSOFTWARE
```
6. Se o VM for de domínio, verifique a seguinte chave de registo para ver se existe uma política de grupo que irá desativar o RDP.
```
HKLM\BROKENSOFTWARE\Policies\Microsoft\Windows NT\Terminal Services\fDenyTSConnectionS
```
Se este valor-chave for definido para 1, significa que o PDR é desativado pela política. Para ativar o Ambiente de Trabalho Remoto através da política GPO, altere a seguinte política do controlador de domínio:
**Configuração do computador\Políticas\Modelos administrativos:**
Definições de política\Windows Components\Remote Desktop Services\Remote Desktop Session Host\Connections\Permitir que os utilizadores se conectem remotamente utilizando serviços de ambiente de trabalho remotos
1. Retire o disco do VM de resgate.
1. [Crie um novo VM a partir do disco.](../windows/create-vm-specialized.md)
Se o problema ainda acontecer, passe para o passo 2.
#### <a name="step-2-enable-remote-desktop-services"></a>Passo 2: Ativar serviços de ambiente de trabalho remoto
Para mais informações, consulte [remote Desktop Services não está a começar num Azure VM](troubleshoot-remote-desktop-services-issues.md).
#### <a name="step-3-reset-rdp-listener"></a>Passo 3: Reset RDP ouvinte
Para mais informações, consulte [a desconexão do Ambiente remoto frequentemente em Azure VM](troubleshoot-rdp-intermittent-connectivity.md).
## <a name="need-help-contact-support"></a>Precisa de ajuda? Contactar o suporte
Se ainda precisar de ajuda, [contacte o suporte](https://portal.azure.com/?#blade/Microsoft_Azure_Support/HelpAndSupportBlade) para resolver o seu problema rapidamente.
| 46.809717 | 267 | 0.740097 | por_Latn | 0.968321 |
4fe02b7ef19f5529d50ed923b46dfedd4dd24673 | 1,014 | md | Markdown | _posts/2017-10-01-[LeetCode]7. Reverse Integer.md | StephenRi/StephenRi.github.io | dff3822713af1d510a6afa1328b31303c2d3669d | [
"Apache-2.0"
] | null | null | null | _posts/2017-10-01-[LeetCode]7. Reverse Integer.md | StephenRi/StephenRi.github.io | dff3822713af1d510a6afa1328b31303c2d3669d | [
"Apache-2.0"
] | 1 | 2016-04-13T07:17:33.000Z | 2016-04-13T07:17:33.000Z | _posts/2017-10-01-[LeetCode]7. Reverse Integer.md | StephenRi/StephenRi.github.io | dff3822713af1d510a6afa1328b31303c2d3669d | [
"Apache-2.0"
] | 2 | 2019-12-07T07:30:18.000Z | 2019-12-09T06:11:25.000Z | ---
layout: post
title: "[LeetCode]7. Reverse Integer"
subtitle: "求一个整数的逆序"
date: 2017-10-01 12:00:00
author: "Stephen.Ri"
header-img: "img/leetcode-bg.jpg"
catalog: true
tags:
- LeetCode
---
Description
===========
Reverse digits of an integer.
The input is assumed to be a 32-bit signed integer. Your function should return 0 when the reversed integer overflows.
Example
-------
> **Example1**: x = 123, return 321
>
> **Example2**: x = -123, return -321
Discussion
=======
这道题为整数的逆序问题,注意溢出的处理就可以了。
算法的时间复杂度为O(1)。
C++ Code
====
```
class Solution {
public:
int reverse(int x) {
long long answer = 0;
const int maxInt = 0x7fffffff;
const int minInt = 0x80000000;
while(x != 0)
{
int num = x % 10;
answer = answer * 10 + num;
if(answer > maxInt || answer < minInt)
{
return 0;
}
x = x / 10;
}
return answer;
}
};
```
| 17.186441 | 118 | 0.533531 | eng_Latn | 0.732813 |
4fe1d4d98fff9d848e788c78b4b77f905eceded4 | 6,366 | md | Markdown | user/pages/02.schede/romanzi/item.md | p-marco/polet500 | be6a13494cd1b50baed6abaa172e1323b92c628e | [
"MIT"
] | 3 | 2017-01-18T11:06:24.000Z | 2019-06-13T06:30:28.000Z | user/pages/02.schede/romanzi/item.md | p-marco/polet500 | be6a13494cd1b50baed6abaa172e1323b92c628e | [
"MIT"
] | null | null | null | user/pages/02.schede/romanzi/item.md | p-marco/polet500 | be6a13494cd1b50baed6abaa172e1323b92c628e | [
"MIT"
] | 1 | 2019-06-09T15:26:44.000Z | 2019-06-09T15:26:44.000Z | ---
title: Polemica sui "romanzi" tra Giraldi Cinzio e Pigna
metadata:
description: "Polemica sulla partenità di alcune teorie relative ai 'romanzi', ossia i poemi cavallereschi, con particolare attenzione all'Orlando furioso."
og:image: /polet500/schede/romanzi/banner-fb.jpg
image: /polet500/schede/romanzi/banner-fb.jpg
keywords: polemiche letterarie, pietro bembo, letteratura cinquecento
date: 01-01-2018
taxonomy:
category: blog
tag: [Poema cavalleresco, Orlando furioso ]
char: [Giovan Battista Giraldi Cinzio, Giovan Battista Pigna]
author: Paolo Tabacchini
date: 01/09/2017
chronology: 1554
bibliography:
-
Categoria: Fonte
Tipologia: Letteraria
Autore: Giovan Battista Giraldi Cinzio
Destinatario: 'Ercole II d''Este'
Titolo: Discorso intorno al comporre dei romanzi
Curatori volume:
Titolo volume:
Rivista:
Parte e volume:
Luogo: Venezia
Editore: Gabriele Giolito e fratelli
Data: 1554
Pagine:
Link: https://books.google.it/books?id=rVxcAAAAcAAJ&printsec=frontcover&dq=giraldi+cinzio+romanzi&hl=it&sa=X&ved=0ahUKEwi9ofXTjdLVAhVpAcAKHfxXCQcQ6AEINTAD#v=onepage&q=giraldi%20cinzio%20romanzi&f=false
-
Categoria: Fonte
Tipologia: Letteraria
Autore: Giovan Battista Pigna
Destinatario: 'Luigi d''Este'
Titolo: 'I romanzi, ne'' quali della vita et della poesia dell''Ariosto si tratta'
Curatori volume:
Titolo volume:
Rivista:
Parte e volume:
Luogo: Venezia
Editore: Vincenzo Valgrisi
Data: 1554
Pagine:
Link: https://books.google.it/books?id=5wU8AAAAcAAJ&printsec=frontcover&dq=giovan+battista+pigna+i+romanzi&hl=it&sa=X&ved=0ahUKEwjC-om-jtLVAhVPFMAKHVrZBSQQ6AEILjAB#v=onepage&q=giovan%20battista%20pigna%20i%20romanzi&f=false
-
Categoria: Bibiografia
Tipologia: Capitolo di libro
Autore: Stefano Benedetti
Destinatario:
Titolo: Accusa e smascheramento del "furto" a metà Cinquecento: riflessioni sul plagio critico intorno alla polemica tra G.B. Pigna e G.B. G. Cinzio
Curatori volume: Roberto Gigliucci
Titolo volume: Furto e plagio nella letteratura del classicismo
Rivista:
Parte e volume:
Luogo: Roma
Editore: Bulzoni
Data: 1998
Pagine: 233-261
Link: https://www.academia.edu/23761105/Accusa_e_smascheramento_del_furto_a_met%C3%A0_Cinquecento_riflessioni_sul_plagio_critico_intorno_alla_polemica_tra_G._B._Pigna_e_G._B._Giraldi_Cinzio
-
Categoria: Bibiografia
Tipologia: Capitolo di libro
Autore: Francesco Beneducci
Destinatario:
Titolo: Ancora la causa Giraldi-Pigna
Curatori volume: Francesco Beneducci
Titolo volume: Scampoli critici
Rivista:
Parte e volume:
Luogo: Oneglia
Editore: Ghilini
Data: 1899
Pagine: 43-48
Link:
-
Categoria: Bibiografia
Tipologia: Articolo in rivista
Autore: Daniela Boccassini
Destinatario:
Titolo: «Romanzevoli muse»: Giraldi, Pigna e la questione del poema cavalleresco
Curatori volume:
Titolo volume:
Rivista: Schifanoia
Parte e volume: n. 13-14
Luogo:
Editore:
Data: 1992
Pagine: 203-216
Link:
-
Categoria: Bibiografia
Tipologia: Capitolo di libro
Autore: Riccardo Bruscagli
Destinatario:
Titolo: Il romanzo del’500. «Romanzo» ed «epos» dall’Ariosto al Ta s s o
Curatori volume: Riccardo Bruscagli
Titolo volume: Il Romanzo. Origine e sviluppo delle strutture narrative nella cultura occidentale
Rivista:
Parte e volume:
Luogo: Pisa
Editore: ETS
Data: 1987
Pagine: 53-69
Link:
-
Categoria: Bibiografia
Tipologia: Capitolo di libro
Autore: Benedetto Croce
Destinatario:
Titolo: Postille manoscritte di Orazio Ariosto ai «Romanzi» del Pigna
Curatori volume: Benedetto Croce
Titolo volume: Aneddoti di varia letteratura
Rivista:
Parte e volume:
Luogo: Bari
Editore: Laterza
Data: 1953
Pagine: 12-18
Link:
-
Categoria: Bibiografia
Tipologia: Articolo in rivista
Autore: Stefano Jossa
Destinatario:
Titolo: Giraldi e Pigna sui romanzi: una polemica in contesto
Curatori volume:
Titolo volume:
Rivista: Critica letteraria
Parte e volume: Vol. XLI, n. 2-3 (159-160)
Luogo:
Editore:
Data: 2013
Pagine: 533-552
Link:
-
Categoria: Bibiografia
Tipologia: Articolo in rivista
Autore: Salvatore Ritrovato
Destinatario:
Titolo: «Romanzi» di Giovan Battista Pigna (1554): interpretazione di un genere moderno
Curatori volume:
Titolo volume:
Rivista: Studi e problemi di critica testuale
Parte e volume: A. LII, n. 1
Luogo:
Editore:
Data: 1996
Pagine: 131-151
Link:
-
Categoria: Bibiografia
Tipologia: Capitolo di libro
Autore: Carlo Simiani
Destinatario:
Titolo: La contesa tra il Giraldi e il Pigna
Curatori volume: Carlo Simiani
Titolo volume: 'Contese letterarie del ''500'
Rivista:
Parte e volume:
Luogo: Treviso
Editore: Turazza
Data: 1904
Pagine:
Link:
-
Categoria: Bibiografia
Tipologia: Articolo in rivista
Autore: Giuseppe Venturini
Destinatario:
Titolo: Le postille di Orazio Ariosti ai «Romanzi» del Pigna
Curatori volume:
Titolo volume:
Rivista: Lettere Italiane
Parte e volume: A. XXI, n.1
Luogo:
Editore:
Data: 1969
Pagine: 54-61
Link:
---
La polemica nasce dalla pubblicazione avvenuta nello stesso anno (1554) di due discorsi di Giraldi Cinzio e Pigna che trattano dei "romanzi", termine con il quale i due designano i poemi cavellereschi di cui l'esempio maggiore è l'Orlando furioso. Entrambi contestano all'altro il furto delle teorie che guidano le loro riflessioni. Difficile determinare la reale paternità delle idee. Nelle lettere che i due successivamente si scambiano, ognuno si attruibuisce il merito di aver trattato per primo in maniera moderna i romanzi e accusa l'altro di aver rubato le idee. I documenti relativi alla disputa furono pubblicati nello stesso anno in un opuscolo stampato a Venezia presso Francesco Marcolini. | 34.042781 | 702 | 0.696356 | ita_Latn | 0.965395 |
4fe26ed7a33b95e9f5a450ca224445f6c39421ac | 199 | md | Markdown | contact/index.md | yegrug/yegrug.github.io | dd9ac9ab5a7711dd643b6b8ee6e4316c5b2e8961 | [
"MIT"
] | 1 | 2021-10-01T00:10:51.000Z | 2021-10-01T00:10:51.000Z | contact/index.md | yegrug/yegrug.github.io | dd9ac9ab5a7711dd643b6b8ee6e4316c5b2e8961 | [
"MIT"
] | null | null | null | contact/index.md | yegrug/yegrug.github.io | dd9ac9ab5a7711dd643b6b8ee6e4316c5b2e8961 | [
"MIT"
] | null | null | null | ---
title: Contact
---
Find us on our [Meetup page](https://www.meetup.com/edmonton-r-user-group-yegrug/) or come to the [Dev Edmonton Society](https://devedmonton.com/)'s `#meetup-r` Slack channel
| 33.166667 | 174 | 0.713568 | eng_Latn | 0.207363 |
4fe3b284cd9f272754563f4290e6438e8669d6b5 | 733 | md | Markdown | README.md | theoldmoon0602/fzf_outline.vim | a45838a012d02b8379d7e6d0262a3bc1957d9b5a | [
"MIT"
] | null | null | null | README.md | theoldmoon0602/fzf_outline.vim | a45838a012d02b8379d7e6d0262a3bc1957d9b5a | [
"MIT"
] | null | null | null | README.md | theoldmoon0602/fzf_outline.vim | a45838a012d02b8379d7e6d0262a3bc1957d9b5a | [
"MIT"
] | null | null | null | # fzf_outline.vim
This is a vim/neovim plugin that add new command `Outline`.
## Requirements
- [fzf.vim](https://github.com/gmarik/vundle)
- ctags binary
## Install
You can install this plugin with your package manager.
To install with [vim-plug](https://github.com/junegunn/vim-plug), add the following lines to your `.vimrc` and do `:PlugInstall`.
```vim
Plug 'theoldmoon0602/fzf_outline.vim'
```
## Configuration
The variable `g:fzf_outline_kinds` is list of shown kinds.
```vim
let g:fzf_outline_kinds=["c","s","f","g"] "default value
```
## Usage
`:Outline` command shows outline if tags file exists.

## LICENSE
MIT
| 20.942857 | 129 | 0.728513 | eng_Latn | 0.585319 |
4fe402b834a4345ed60d8b3789001b4368474af8 | 2,017 | md | Markdown | _publications/2021-02-01-2021MNRAS5011300S.md | Naminoshi/aguena.github.io | 553c5474656a4e986fce6473a4995b3161d49f26 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | _publications/2021-02-01-2021MNRAS5011300S.md | Naminoshi/aguena.github.io | 553c5474656a4e986fce6473a4995b3161d49f26 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | _publications/2021-02-01-2021MNRAS5011300S.md | Naminoshi/aguena.github.io | 553c5474656a4e986fce6473a4995b3161d49f26 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | ---
title: "Is diffuse intracluster light a good tracer of the galaxy cluster matter distribution?"
collection: publications
permalink: /publication/2021-02-01-2021MNRAS5011300S
date: 2021-02-01
venue: "Monthly Notices of the Royal Astronomical Society"
paperurl: "https://ui.adsabs.harvard.edu/abs/2021MNRAS.501.1300S"
citation: "Sampaio-Santos, H. and Zhang, Y. and Ogando, R.~L.~C. and Shin, T. and Golden-Marx, Jesse B. and Yanny, B. and Herner, K. and Hilton, M. and Choi, A. and Gatti, M. and Gruen, D. and Hoyle, B. and Rau, M.~M. and De Vicente, J. and Zuntz, J. and Abbott, T.~M.~C. and Aguena, M. and Allam, S. and Annis, J. and Avila, S. and Bertin, E. and Brooks, D. and Burke, D.~L. and Carrasco Kind, M. and Carretero, J. and Chang, C. and Costanzi, M. and da Costa, L.~N. and Diehl, H.~T. and Doel, P. and Everett, S. and Evrard, A.~E. and Flaugher, B. and Fosalba, P. and Frieman, J. and Garc'ia-Bellido, J. and Gaztanaga, E. and Gerdes, D.~W. and Gruendl, R.~A. and Gschwend, J. and Gutierrez, G. and Hinton, S.~R. and Hollowood, D.~L. and Honscheid, K. and James, D.~J. and Jarvis, M. and Jeltema, T. and Kuehn, K. and Kuropatkin, N. and Lahav, O. and Maia, M.~A.~G. and March, M. and Marshall, J.~L. and Miquel, R. and Palmese, A. and Paz-Chinch'on, F. and Plazas, A.~A. and Sanchez, E. and Santiago, B. and Scarpine, V. and Schubnell, M. and Smith, M. and Suchyta, E. and Tarle, G. and Tucker, D.~L. and Varga, T.~N. and Wechsler, R.~H.. "Is diffuse intracluster light a good tracer of the galaxy cluster matter distribution?." <i>Monthly Notices of the Royal Astronomical Society</i>, 501, Feb 2021"
---
[**Publisher**](http://doi.org/10.1093/mnras/staa3680), [**ArXiv**](https://arxiv.org/abs/2005.12275), [**ADS**](https://ui.adsabs.harvard.edu/abs/2021MNRAS.501.1300S)
Recommended citation: Sampaio-Santos et al (2021). "Is diffuse intracluster light a good tracer of the galaxy cluster matter distribution?" <i>Monthly Notices of the Royal Astronomical Society</i>, 501, Feb 2021
| 144.071429 | 1,310 | 0.705007 | eng_Latn | 0.86038 |
4fe414d145b15ce78e303c07002ea6c91115b1d1 | 2,606 | md | Markdown | doc/03-API-Documentation.md | Icinga/icinga-powershell-restapi | c8153f3035ddd92d0c63d86934a9cc37a36ddc67 | [
"MIT"
] | 1 | 2021-04-03T03:01:27.000Z | 2021-04-03T03:01:27.000Z | doc/03-API-Documentation.md | Icinga/icinga-powershell-restapi | c8153f3035ddd92d0c63d86934a9cc37a36ddc67 | [
"MIT"
] | 3 | 2021-06-11T15:42:33.000Z | 2022-02-17T09:14:11.000Z | doc/03-API-Documentation.md | Icinga/icinga-powershell-restapi | c8153f3035ddd92d0c63d86934a9cc37a36ddc67 | [
"MIT"
] | null | null | null | # API Documentation
The REST-Api will only provide the actual endpoint to communicate with in general. Every single endpoint for fetching data has to be provided by modules which are installed separately.
## Untrusted/Self-Signed Certificates
Using Self-Signed or Untrusted Certificates for running the API is in general supported and by default applied once the Icinga 2 Agent certificates are being used.
However, you will have to ensure your clients either trust the certificate or you enable insecure connections. This applies to web-toolkits, applications and browsers. Clients connecting with certificate validation checks will be invalidated, resulting in a terminated connection. After `6` corrupt connection attempts the remote client will be `blacklisted` and `banned` from accessing the API.
## REST-Api with JEA profile
If you are running Icinga for Windows v1.6.0 or later and use the JEA profile, you will have to use `Install-IcingaForWindowsCertificate` to either install the Icinga Agent certificate as API certificate or a custom one. Your REST-Api certificate configuration is then ignored. This command provides the arguments `CertFile` and `CertThumbprint` with the same behavior as the REST daemon, but will always enforce the local installed certificate over your REST daemon configuration.
### Clear Blacklist Cache
To clear the blacklist to allow blocked clients to connect again, you will have to restart the PowerShell daemon:
```powershell
Restart-IcingaWindowsService;
```
### Common Client examples
#### Curl with --insecure
To use curl with Self-Signed/Untrusted Certificates, use the `--insecure` argument:
```bash
curl -X GET "/v1/checker?command=cpu" --insecure
```
#### PowerShell without Validation
PowerShell is by default very strict when it comes to insecure connections. Within the PowerShell Framework itself, there is a Cmdlet available for this, allowing you to enable untrusted certificates within the current PowerShell session:
```powershell
Use-Icinga;
Enable-IcingaUntrustedCertificateValidation;
Invoke-WebRequest -Method GET -UseBasicParsing -Uri '/v1/checker?command=cpu';
```
## API Versioning
All API calls are defined behind the API version. Once changes to the API handler itself are made, the versioning will change, allowing full backward compatibility
```text
/v1/<endpoint>
```
## Query Endpoints
To view a list of available endpoints, simply query them by only using the API version on your web path
### Query
```text
/v1/
```
### Output
```json
{
"Endpoints": [
"<list of registered endpoints>"
]
}
```
| 36.704225 | 481 | 0.780507 | eng_Latn | 0.992244 |
4fe5033ed9b3ff603361c794953130cd5f757802 | 1,720 | md | Markdown | _posts/2008/2008-05-08-event_reminder_club_acoustica_club_acoustica_too.md | anthonydillon/stmgrts | 91595b9a9e61798f5eb8a7b2e45ddae453ccf158 | [
"CC0-1.0"
] | null | null | null | _posts/2008/2008-05-08-event_reminder_club_acoustica_club_acoustica_too.md | anthonydillon/stmgrts | 91595b9a9e61798f5eb8a7b2e45ddae453ccf158 | [
"CC0-1.0"
] | 4 | 2016-09-11T07:52:02.000Z | 2019-10-03T10:10:25.000Z | _posts/2008/2008-05-08-event_reminder_club_acoustica_club_acoustica_too.md | anthonydillon/stmgrts | 91595b9a9e61798f5eb8a7b2e45ddae453ccf158 | [
"CC0-1.0"
] | 3 | 2018-05-11T06:33:27.000Z | 2019-08-05T11:39:02.000Z | ---
layout: post
title: "Event Reminder: Club Acoustica & Club Acoustica Too"
permalink: /archives/2008/05/event_reminder_club_acoustica_club_acoustica_too.html
commentfile: 2008-05-08-event_reminder_club_acoustica_club_acoustica_too
category: around_town
date: 2008-05-08 08:29:12
---
Event Reminder: Club Acoustica & Club Acoustica Too
Here are two new (to us) local music clubs in the area.
#### [Club Acoustica](/directory/music/200805080324)
ALTERNATE FRIDAY'S AND OCCASIONAL SATURDAY'S
AT THE CROWN, 174 RICHMOND ROAD, TWICKENHAM, MIDDLESEX, TW1 2NH - FROM 7.30PM
London's newest LIVE MUSIC club, featuring you! From 7.30pm until 10.30 pm (bar closes at midnight). Hosted by house band, THE DEPUTIES.
Country, Rock, Folk, Blues, Bluegrass, Americana - ALL STYLES WELCOME. PA, amps and mics supplied. This club night ROCKS!! FREE ADMISSION. 18's plus only.
Also...
#### Club Acoustica Too
EVERY THURSDAY AT THE ABERCORN ARMS, 76 CHURCH ROAD, TEDDINGTON, TW 11 8PY, FROM 8.30pm
For a slightly more sedate, acoustic and singalong evening... UNPLUGGED ACOUSTIC NIGHT - ALL STYLES OF MUSIC WELCOME. GOOD FUN - EASY GOING - NO PRESSURE, A great chance to PERFORM A FEW SONGS IN FRONT OF A FRIENDLY AUDIENCE....
Just turn up on the night, first come, first served. Most acts get a 2-song slot, and then if there's time you can come back for another one later on. There's a PA there and a small acoustic guitar amp, Backing available from THE DEPUTIES - just show them the chords!! Its fun and it's free!
**Details**
- For dates and more info please see [www.myspace.com/clubacoustica](http://www.myspace.com/clubacoustica)
- For more information, or to book a slot, please email <[email protected]>.
| 45.263158 | 291 | 0.762791 | eng_Latn | 0.412781 |
4fe576df2be770c9fa11337425826ed65e069988 | 1,455 | md | Markdown | README.md | adityasri12/CSS_battle-answers | 9d921d9290fbd307255fdff2dfd643aecefa174d | [
"MIT"
] | null | null | null | README.md | adityasri12/CSS_battle-answers | 9d921d9290fbd307255fdff2dfd643aecefa174d | [
"MIT"
] | null | null | null | README.md | adityasri12/CSS_battle-answers | 9d921d9290fbd307255fdff2dfd643aecefa174d | [
"MIT"
] | null | null | null | # CSS_battle-answers
Get answers of most of your css battles
[Target #1 - Simply Square](https://github.com/user/repo/blob/branch/other_file.md)
Target #2 - Carrom
Target #3 - Push Button
Target #4 - Ups n Downs
Target #5 - Acid Rain
Target #6 - Missing Slice
Target #7 - Leafy Trail
Target #8 - Forking Crazy
Target #9 - Tesseract
Target #10 - Cloaked Spirits
Target #11 - Eye of Sauron
Target #12 - Wiggly Moustache
Target #13 - Totally Triangle
Target #14 - Web Maker Logo
Target #15 - Overlap
Target #16 - Eye of the Tiger
Target #17 - Fidget Spinner
Target #18 - Matrix
Target #19 - Cube
Target #20 - Ticket
Target #21 - SitePoint Logo.md
Target #22 - Cloud
Target #23 - Boxception
Target #24 - Switches.md
Target #25 - Blossom.md
Target #26 - Smiley
Target #27 - Lock Up
Target #28 - Cups & Balls
Target #29 - Suffocate
Target #30 - Horizon
Target #31 - Equals
Target #32 - Band-aid
Target #33 - Birdie
Target #34 - Christmas Tree
Target #35 - Ice Cream
Target #36 - Interleaved
Target #37 - Tunnel
Target #38 - Not Simply Square
Target #39 - Sunset
Target #40 - Letter B
Target #41 - Fox Head
Target #42 - Baby
Target #43 - Wrench
Target #44 - Stripes
Target #45 - Magical Tree
Target #46 - Mountains
Target #47 - Corona Virus
Target #48 - Wash You Hands.md
Target #49 - Stay at Home
Target #50 - Use Hand Sanitizer
Target #51 - Wear a Mask
Target #52 - Break the Chain
Target #53 - Pastel Logo
| 13.348624 | 83 | 0.694158 | yue_Hant | 0.81426 |
4fe5c91d81d245ba560bd98b6abb74629cc156af | 1,468 | md | Markdown | esphome/components/sensor/bh1750.md | airijia/doc | 532cdcf513f66f600a2886ceec47d5dfccb34e59 | [
"MIT"
] | null | null | null | esphome/components/sensor/bh1750.md | airijia/doc | 532cdcf513f66f600a2886ceec47d5dfccb34e59 | [
"MIT"
] | null | null | null | esphome/components/sensor/bh1750.md | airijia/doc | 532cdcf513f66f600a2886ceec47d5dfccb34e59 | [
"MIT"
] | null | null | null | # BH1750 光强传感器
BH1750FVI 封装好的模块有两款 GY-30 和 GY-302,使用中无区别

**airi:8123 和 Hass 中的显示**
## 相关产品
| | GY-30<br> [说明书](http://www.mouser.com/ds/2/348/bh1750fvi-e-186247.pdf) | [](https://item.taobao.com/item.htm?id=45608097069) |
|:-:|:-:|:-:|
| | GY-302<br> [说明书](http://www.mouser.com/ds/2/348/bh1750fvi-e-186247.pdf) | [](https://item.taobao.com/item.htm?id=45559150934) |
## 连线
VCC - VCC 供给电压 3-5v
GND - GND
SCL - 总线时钟线
SDA - 总线数据线
ADDR/ADO - 地址引脚,改地址用,只连一个 BH1750FVI 用不到
## 网页创建固件
## 文件模板创建
需要前置设定 [I²C总线](esphome/components/i2c)
```
# I²C总线
i2c:
sda: 0
scl: 2
scan: False
# 配置示例
sensor:
- platform: bh1750
name: "BH1750 Illuminance"
address: 0x23
update_interval: 60s
```
### 配置参数
- **name** (**必填**, 字符串): 光强传感器的名称
- **address** (*选填*, 整数): 传感器的 I²C 地址。默认为引脚拉低 `0x23` . 引脚拉高的值 `0x5C`
- **resolution** (*选填*, 字符串): 精确度,`4.0`、`1.0`、`0.5`三选一,默认精确到 `0.5`lx(勒克斯)
- **update_interval** (*选填*, [时长](esphome/guides/configuration-types#时长)): 默认 `60s`
- **id** (*选填*, [ID](esphome/guides/configuration-types#id)): 当前组件的 ID
- 以及 [传感器核心组件](esphome/components/sensor/#基本配置) 和 [MQTT 组件基本配置](esphome/components/mqtt#MQTT-组件基本配置项)
| 24.065574 | 261 | 0.654632 | yue_Hant | 0.628406 |
4fe679e84f730bd0e0c7fdd21e55ef61154ceea8 | 3,736 | md | Markdown | docs/source/dev_information.md | peendebak/core_tools | 2e43edf0bbc1d7ceb7042559db499535e8f6a076 | [
"BSD-2-Clause"
] | 1 | 2022-02-11T09:24:35.000Z | 2022-02-11T09:24:35.000Z | docs/source/dev_information.md | peendebak/core_tools | 2e43edf0bbc1d7ceb7042559db499535e8f6a076 | [
"BSD-2-Clause"
] | null | null | null | docs/source/dev_information.md | peendebak/core_tools | 2e43edf0bbc1d7ceb7042559db499535e8f6a076 | [
"BSD-2-Clause"
] | 2 | 2020-07-06T14:31:27.000Z | 2021-07-07T13:57:19.000Z | # Developer information
## Database structure
The user has two configurations to operate in,
* local database
* local database + server side database (network connection needed)
The serverside configuration offers the advantage that you can access the data of your experiments from any device. The serverside is combined with a local database, in order to be able to keep operating when the network is down.
When operating only locally, no features are lost. Mainly you don't have back up and you can't access the data with a device that is not your measurement computer.
To accommodate a structure that can dynamically switch between both modes, the following tables are made :
### Client side
* global_measurement_overview : contains all the local measurements performed. Used for queries when the server is unavailable (e.g. no network).
### Server side
* global_measurement_overview : same as for client side, only much bigger (this will have consequences, see further)
(todo update ...)
Both server and client have a identical database scheme.
## Storage information
### Measurement identification
For identifying measurements two types of identifiers are generated:
* exp id
* uuid
#### exp id
This is a id that is associated with the measurement project, set up and sample. This id always starts from 1 and counts up as you make more measurements on your sample and is unique in this space. Once you switch the sample, the counter resets, and will start back from 1.
The exp id is designed to be used when you want to quickly access data of a single of single sample.
#### uuid
This is a 64bit identifier that is designed to be unique among all measurements in the database. The goal of the ID is to have a relatively easily type-able id that can be used over all the measurements.
The unique ID's are generated by concatenating the current time (epoch time in ms) + ~ the last 27 bits of your mac address. This results in a 15 digit number which should be unique for every measurement
### raw data storage in the database
The raw data of the measurements is stored in so called large objects (postgres type). This are large binary object that are directly saved to your harddrive (much like your files on your computer). A pointer to the location of these files is kept in the database.
When the dataset is constructed a buffer (large object), is made for each setpoint and measured variable. When performing a measurement, the buffers are filled in the RAM. To enable real time plotting, these buffers are synchronized every 200ms to the local database. Only the data that has been modified in the buffer will be written. The same counts for the read buffer.
The synchronization to the server is slower to ensure low overhead. This is typically done every few seconds (= slower online liveplotting).
The size of a dataset can be arbitrary, where the main limitation is the amount of RAM in the measurement computer and the amount of hard disk space available.
## Managing space on measurements computers
The software can make an estimation of the amount of data that has been written to the hard disk of the measurement computer. Since it is recommended to use SSD's for fast performance, the space might be limited. Therefore the user has the option to set a soft limit on the amount of space that can be used to save data.
The software will regularly check the amount of data that has been used. When this goes over the limit, it will trash the oldest datasets that are present on the system. Only datasets that are synchronized to the server can be trashed.
When asking for the data of the removed measurements, a call will be made to the server instead of the computer.
By default no limit is applied. | 66.714286 | 372 | 0.79015 | eng_Latn | 0.99991 |
4fe7f6171568d441e9418dd50b6a363a141e299e | 9,325 | md | Markdown | articles/sql-data-warehouse/sql-data-warehouse-manage-compute-overview.md | OpenLocalizationTestOrg/azure-docs-pr15_hu-HU | ac1600ab65c96c83848e8b2445ac60e910561a25 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/sql-data-warehouse/sql-data-warehouse-manage-compute-overview.md | OpenLocalizationTestOrg/azure-docs-pr15_hu-HU | ac1600ab65c96c83848e8b2445ac60e910561a25 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/sql-data-warehouse/sql-data-warehouse-manage-compute-overview.md | OpenLocalizationTestOrg/azure-docs-pr15_hu-HU | ac1600ab65c96c83848e8b2445ac60e910561a25 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | <properties
pageTitle="Az Azure SQL-adatraktár (áttekintés) számítási power kezelése |} Microsoft Azure"
description="Teljesítmény skála funkciók az Azure SQL-adatraktár meg. Eredményhez tartozó DWUs méretezése, vagy mutasson az egérrel, és folytassa a számítási erőforrások költségek mentéséhez."
services="sql-data-warehouse"
documentationCenter="NA"
authors="barbkess"
manager="barbkess"
editor=""/>
<tags
ms.service="sql-data-warehouse"
ms.devlang="NA"
ms.topic="article"
ms.tgt_pltfrm="NA"
ms.workload="data-services"
ms.date="09/03/2016"
ms.author="barbkess;sonyama"/>
# <a name="manage-compute-power-in-azure-sql-data-warehouse-overview"></a>Az Azure SQL-adatraktár (áttekintés) számítási power kezelése
> [AZURE.SELECTOR]
- [– Áttekintés](sql-data-warehouse-manage-compute-overview.md)
- [Portál](sql-data-warehouse-manage-compute-portal.md)
- [A PowerShell](sql-data-warehouse-manage-compute-powershell.md)
- [TÖBBI](sql-data-warehouse-manage-compute-rest-api.md)
- [TSQL](sql-data-warehouse-manage-compute-tsql.md)
Az SQL adatraktár architektúrája elválasztja a tárhely és a számítási, minden egyes, ha át kívánja méretezni, egymástól függetlenül lehetővé. Emiatt méretezheti, csak a teljesítmény elérése érdekében fizet, szükség esetén a költségek mentése közben teljesítményét.
Az Áttekintés SQL adatraktár alábbi teljesítmény méretezési funkciókat ismerteti, és ad arra vonatkozó javaslatokat, hogy hogyan és mikor érdemes használni.
- Méretezés power módosításával az [adatok raktári egységek (DWUs)][] számítja ki.
- Szüneteltetheti vagy folytathatja a számítási erőforrások
<a name="scale-performance-bk"></a>
## <a name="scale-performance"></a>Méretezés teljesítmény
Az SQL-adatraktár gyorsan méretezheti, vagy újra növelésével vagy csökkentésével számítási erőforrások Processzor, memória és I/O sávszélesség teljesítmény. Ha át kívánja méretezni a teljesítményt, kell tennie mindössze módosítsa az [adatok raktári egységek (DWUs)][] , amely az adatbázishoz lefoglalja SQL adatraktár számát. SQL-adatraktár gyors végrehajtja a módosítást, és az alapul szolgáló változtatásainak hardveres és szoftveres kezeli.
Megszűnt: a nap, hol kell kutatási processzorok milyen típusú, memória vagy tároló típusának van szükség nagy teljesítmény a adatraktár a. Ha elhelyez a adatraktár a felhőben, nincs már követő hardver problémák foglalkozik. Ehelyett SQL adatraktár kéri, ez a kérdés: milyen gyorsan meg szeretné az adatok elemzése?
### <a name="how-do-i-scale-performance"></a>Hogyan méretezheti meg a teljesítmény?
Elastically növeléséhez vagy csökkentéséhez a számítási power, egyszerűen módosítsa az [adatok raktári egységek (DWUs)][] beállítást, az adatbázis. Teljesítmény lineárisan növekszik, több DWU hozzáadásakor. Magasabb DWU szinten kell 100-nál több DWUs figyelje meg a teljesítmény jelentős javítását szeretne hozzáadni. Szeretné kijelölni a metszéspontok értelmes DWUs, kínálunk, amely a legjobb eredményt ad DWU szintjeit.
Ha módosítani szeretné DWUs, egyes módszerekben használhatja.
- [Méretezés az Azure portal power számítja ki.][]
- [Méretezés PowerShell power számítja ki.][]
- [Méretezés számítási power a REST API-hoz][]
- [Méretezés a TSQL power számítja ki.][]
### <a name="how-many-dwus-should-i-use"></a>Hány DWUs érdemes használnom?
A teljesítmény SQL adatraktár lineárisan méretezze át, és a másodpercben megadott (mondjuk a 100 DWUs 2000 DWUs való) között egy számítási skála módosítása történik. Ez a rugalmasságot biztosít kísérletezés a különböző DWU beállításokat, mindaddig, amíg a forgatókönyv legjobb illesztés meg.
Megtudhatja, hogy mi az a ideális DWU érték, próbálja meg méretezés felfelé és lefelé, és néhány lekérdezések futtatása az adatok betöltése után. Mivel a méretezés gyors, megpróbálhatja különböző szintjeit, teljesítmény egy óra vagy annál kisebb szám. Tartsa szem előtt, hogy SQL adatraktár lett tervezve nagy mennyiségű adatot feldolgozása, illetve láthatja a méretezés igaz képességei, különösen a nagyobb értékekhez kínálunk, a fogja használni kívánt közelít, illetve meghaladja az 1 TB nagy mennyiségű adathalmaz.
Javaslatok a terhelést a legjobb DWU megkeresése:
1. Egy fejlesztés alatt álló adatraktár kezdje el kijelölésével kisszámú DWUs. Jó kiindulási pont DW400 vagy DW200.
2. Az alkalmazás teljesítmény figyelését, és a teljesítmény erőforrásigények betartásával DWUs számának kijelölése összehasonlítása.
3. Határozza meg, hogy mennyi gyorsabban vagy lassabban teljesítmény szeretné elérni az igényeinek megfelelően az optimális teljesítmény szint feltételezve, hogy a lineáris skála kell lennie.
4. Növelje vagy csökkentse a DWUs arányában hogyan sokkal gyorsabban vagy lassabban a terhelést a végrehajtani kívánt számát. A szolgáltatás válaszol gyorsan és módosítsa a kívánt számítási erőforrások DWU új követelményeknek.
5. Továbbra is finomítások végrehajtása, amíg el nem éri az optimális teljesítmény szint az üzleti igényeinek megfelelően.
### <a name="when-should-i-scale-dwus"></a>Mikor érdemes méretezni DWUs?
Gyorsabb eredmények van szüksége, a DWUs növelése és fizet, a nagyobb teljesítmény elérése érdekében. Ha szükséges kisebb számítja ki a power, csökkentheti a DWUs és fizetni csak akkor van szüksége.
Mikor érdemes DWUs méretezése javaslatok:
1. Ha az alkalmazás hullámzó terhelést, skála DWU szintek felfelé vagy lefelé beleférjenek csúcsok és az alacsony pontok. Ha például a terhelést a szokásos csúcsaira a hónap végén, ha tervezi, hogy vegye fel a további DWUs e csúcs nap alatt, majd méretezze, a csúcs letelte után.
2. Mielőtt nehéz adatok betöltésének és átalakítási műveletet hajt végre, méretezze DWUs be, hogy az adatok érhető el gyorsan.
<a name="pause-compute-bk"></a>
## <a name="pause-compute"></a>Szünet számítási
[AZURE.INCLUDE [SQL Data Warehouse pause description](../../includes/sql-data-warehouse-pause-description.md)]
Mutasson egy adatbázist, használja az egyes módszerekben.
- [Szünet számítási Azure Portal segítségével][]
- [Szünet számítási szolgáltatást a PowerShell használatával][]
- [Szünet számítási a REST API-hoz][]
<a name="resume-compute-bk"></a>
## <a name="resume-compute"></a>Önéletrajz számítási
[AZURE.INCLUDE [SQL Data Warehouse resume description](../../includes/sql-data-warehouse-resume-description.md)]
Önéletrajz egy adatbázist, használja az egyes módszerekben.
- [Önéletrajz számítási Azure Portal segítségével][]
- [Önéletrajz számítási szolgáltatást a PowerShell használatával][]
- [Önéletrajz számítási a REST API-hoz][]
## <a name="permissions"></a>Engedélyek
Az adatbázis méretezés az engedélyeket az [Adatbázis módosítása][]leírt van szükség. Szünet és az önéletrajz esetén kell az [SQL-adatbázis munkatársi][] engedéllyel, kifejezetten Microsoft.Sql/servers/databases/action.
<a name="next-steps-bk"></a>
## <a name="next-steps"></a>Következő lépések
Az alábbi cikkekben néhány további fő teljesítménymutatók fogalmak megértéséhez olvassa el:
- [Terhelést és feldolgozási kezelése][]
- [Tábla Tervező – áttekintés][]
- [Táblázat ki.][]
- [Táblázat indexelés][]
- [Táblázat szétválasztás][]
- [Táblázat statisztika][]
- [Ajánlott eljárások][]
<!--Image reference-->
<!--Article references-->
[adatok raktári egységek (DWUs)]: ./sql-data-warehouse-overview-what-is.md#data-warehouse-units
[Méretezés az Azure portal power számítja ki.]: ./sql-data-warehouse-manage-compute-portal.md#scale-compute-bk
[Méretezés PowerShell power számítja ki.]: ./sql-data-warehouse-manage-compute-powershell.md#scale-compute-bk
[Méretezés számítási power a REST API-hoz]: ./sql-data-warehouse-manage-compute-rest-api.md#scale-compute-bk
[Méretezés a TSQL power számítja ki.]: ./sql-data-warehouse-manage-compute-tsql.md#scale-compute-bk
[capacity limits]: ./sql-data-warehouse-service-capacity-limits.md
[Szünet számítási Azure Portal segítségével]: ./sql-data-warehouse-manage-compute-portal.md#pause-compute-bk
[Szünet számítási szolgáltatást a PowerShell használatával]: ./sql-data-warehouse-manage-compute-powershell.md#pause-compute-bk
[Szünet számítási a REST API-hoz]: ./sql-data-warehouse-manage-compute-rest-api.md#pause-compute-bk
[Önéletrajz számítási Azure Portal segítségével]: ./sql-data-warehouse-manage-compute-portal.md#resume-compute-bk
[Önéletrajz számítási szolgáltatást a PowerShell használatával]: ./sql-data-warehouse-manage-compute-powershell.md#resume-compute-bk
[Önéletrajz számítási a REST API-hoz]: ./sql-data-warehouse-manage-compute-rest-api.md#resume-compute-bk
[Terhelést és feldolgozási kezelése]: ./sql-data-warehouse-develop-concurrency.md
[Tábla Tervező – áttekintés]: ./sql-data-warehouse-tables-overview.md
[Táblázat ki.]: ./sql-data-warehouse-tables-distribute.md
[Táblázat indexelés]: ./sql-data-warehouse-tables-index.md
[Táblázat szétválasztás]: ./sql-data-warehouse-tables-partition.md
[Táblázat statisztika]: ./sql-data-warehouse-tables-statistics.md
[Ajánlott eljárások]: ./sql-data-warehouse-best-practices.md
[development overview]: ./sql-data-warehouse-overview-develop.md
[SQL-adatbázis közös munka]: ../active-directory/role-based-access-built-in-roles.md#sql-db-contributor
<!--MSDN references-->
[ADATBÁZIS MÓDOSÍTÁSÁHOZ]: https://msdn.microsoft.com/library/mt204042.aspx
<!--Other Web references-->
[Azure portal]: http://portal.azure.com/
| 60.551948 | 517 | 0.792601 | hun_Latn | 1.000003 |
4fe859060c445a2c78946773092cfc553452855d | 673 | md | Markdown | .github/PULL_REQUEST_TEMPLATE.md | Space-Turtle0/Timmy-BU | 83ccde889cccf3a35b90e7ceac1a0c0f84c9fb42 | [
"MIT"
] | 5 | 2021-07-20T15:27:07.000Z | 2022-03-24T20:12:08.000Z | .github/PULL_REQUEST_TEMPLATE.md | Space-Turtle0/Timmy-BU | 83ccde889cccf3a35b90e7ceac1a0c0f84c9fb42 | [
"MIT"
] | 57 | 2021-07-22T04:17:33.000Z | 2022-03-30T15:13:17.000Z | .github/PULL_REQUEST_TEMPLATE.md | Space-Turtle0/Timmy-BU | 83ccde889cccf3a35b90e7ceac1a0c0f84c9fb42 | [
"MIT"
] | 8 | 2021-08-14T20:03:28.000Z | 2022-03-16T07:20:38.000Z | **FILL OUT THE FIELDS BELOW!**
**Discord Username and Tag**
Ex: Space#9878
**Describe the Changes You've Created**
A clear and concise description of what the bug is.
**Where are these changes going?**
Specific Sub-Servers or the Main Server
**Confirm that you have done the following:**
- [ ] Created a new folder for your bot under utils/bots/~
- [ ] Moved any events under the events folder.
- [ ] Properly Documented the code changes
- [ ] DM'd Space for a new Documentation Page (for new commands)
- [ ] Tested the New Files on your own and you confirm that the changes are bug free.
**Additional Context**
Add any other information about the Pull Request here.
| 32.047619 | 85 | 0.734027 | eng_Latn | 0.997197 |
4fe8c5046f25a37ba4bfe3491e9bafa006ad751b | 1,917 | markdown | Markdown | _posts/2016-06-12-millet-2-telecom-sale-january-26.markdown | chinatravel/chinatravel.github.io | 8b058c83d1248282cd8328c02b67e8fe9539c3a5 | [
"MIT"
] | null | null | null | _posts/2016-06-12-millet-2-telecom-sale-january-26.markdown | chinatravel/chinatravel.github.io | 8b058c83d1248282cd8328c02b67e8fe9539c3a5 | [
"MIT"
] | null | null | null | _posts/2016-06-12-millet-2-telecom-sale-january-26.markdown | chinatravel/chinatravel.github.io | 8b058c83d1248282cd8328c02b67e8fe9539c3a5 | [
"MIT"
] | null | null | null | ---
published: true
title: Millet 2 Telecom sale January 26
layout: post
---
On January 21, millet\'s official website at noon today there was a very short transition page that read: \"12:00 special opening on January 26 at noon on Saturday\".According to previous news, millet 2 telecommunications version supporting GSM, WCDMA, CDMA2000 network system, according to the Ministry of information on the site is the GSM/GPRS, and WCDMA R6 HSDPA/HSUPA, and CDMA 1x, CDMA2000 Rev.A. [https://www.youtube.com/watch?v=ZrDNLBQiWfA](https://www.youtube.com/watch?v=ZrDNLBQiWfA) In the configuration and price, millet 2 Telecom, Unicom Edition and the Standard Edition version is exactly the same, are APQ8064 1.5GHz quad-core processor, 4.3-inch screen, 2GB 720p memory, 2000 mAh batteries, RMB 8 million pixels camera, 1999.2130 people votedTesla Model s-P85D [Givenchy iPhone 5 Case](https://www.westfield.com/brandon/products/womens-fashion-accessories?colour=Greys&brand=GIVENCHY)[](http://www.nodcase.com/givenchy-iphone-5s-case-painting-dog-p-3725.html)\rP85D Tesla MODEL s is Tesla sells cars in the fastest and most expensive, in terms of appearance and the remaining few models did not much change, the main change is to drive and battery capacity. MODEL s-P85D, 691 peak horsepower BHP, four wheel drive, 0-96 km/h in 3.2 seconds.\rView details of the voting >> [Givenchy iPhone 5 Case](http://www.nodcase.com/givenchy-iphone-5s-case-painting-dog-p-3725.html)0 people collectionShare: more | 319.5 | 1,835 | 0.791341 | eng_Latn | 0.686363 |
4fe8e695580ba4c6cd9638663ef5d8464b32170c | 15,693 | md | Markdown | data/readme_files/ParallelSSH.parallel-ssh.md | DLR-SC/repository-synergy | 115e48c37e659b144b2c3b89695483fd1d6dc788 | [
"MIT"
] | 5 | 2021-05-09T12:51:32.000Z | 2021-11-04T11:02:54.000Z | data/readme_files/ParallelSSH.parallel-ssh.md | DLR-SC/repository-synergy | 115e48c37e659b144b2c3b89695483fd1d6dc788 | [
"MIT"
] | null | null | null | data/readme_files/ParallelSSH.parallel-ssh.md | DLR-SC/repository-synergy | 115e48c37e659b144b2c3b89695483fd1d6dc788 | [
"MIT"
] | 3 | 2021-05-12T12:14:05.000Z | 2021-10-06T05:19:54.000Z | ============
parallel-ssh
============
Asynchronous parallel SSH client library.
Run SSH commands over many - hundreds/hundreds of thousands - number of servers asynchronously and with minimal system load on the client host.
Native code based client with extremely high performance - based on ``libssh2`` C library.
.. image:: https://img.shields.io/badge/License-LGPL%20v2-blue.svg
:target: https://pypi.python.org/pypi/parallel-ssh
:alt: License
.. image:: https://img.shields.io/pypi/v/parallel-ssh.svg
:target: https://pypi.python.org/pypi/parallel-ssh
:alt: Latest Version
.. image:: https://travis-ci.org/ParallelSSH/parallel-ssh.svg?branch=master
:target: https://travis-ci.org/ParallelSSH/parallel-ssh
.. image:: https://ci.appveyor.com/api/projects/status/github/parallelssh/parallel-ssh?svg=true&branch=master
:target: https://ci.appveyor.com/project/pkittenis/parallel-ssh-4nme1
.. image:: https://codecov.io/gh/ParallelSSH/parallel-ssh/branch/master/graph/badge.svg
:target: https://codecov.io/gh/ParallelSSH/parallel-ssh
.. image:: https://img.shields.io/pypi/wheel/parallel-ssh.svg
:target: https://pypi.python.org/pypi/parallel-ssh
.. image:: https://readthedocs.org/projects/parallel-ssh/badge/?version=latest
:target: http://parallel-ssh.readthedocs.org/en/latest/
:alt: Latest documentation
.. _`read the docs`: http://parallel-ssh.readthedocs.org/en/latest/
.. contents::
************
Installation
************
.. code-block:: shell
pip install parallel-ssh
*************
Usage Example
*************
See documentation on `read the docs`_ for more complete examples.
Run ``uname`` on two remote hosts in parallel with ``sudo``.
.. code-block:: python
from __future__ import print_function
from pssh.clients import ParallelSSHClient
hosts = ['myhost1', 'myhost2']
client = ParallelSSHClient(hosts)
output = client.run_command('uname')
for host, host_output in output.items():
for line in host_output.stdout:
print(line)
:Output:
.. code-block:: shell
Linux
Linux
**************
Native client
**************
Starting from version ``1.2.0``, a new client is supported in ``parallel-ssh`` which offers much greater performance and reduced overhead than the current default client.
The new client is based on ``libssh2`` via the ``ssh2-python`` extension library and supports non-blocking mode natively. Binary wheel packages with ``libssh2`` included are provided for Linux, OSX and Windows platforms and all supported Python versions.
See `this post <https://parallel-ssh.org/post/parallel-ssh-libssh2>`_ for a performance comparison of the available clients.
To make use of this new client, ``ParallelSSHClient`` can be imported from ``pssh.clients.native`` instead. Their respective APIs are almost identical.
The new client will become the default and will replace the current ``pssh.pssh_client`` in a new major version of the library - ``2.0.0``.
The paramiko based client will become an optional install via pip `extras`, available under ``pssh.clients.miko``.
For example:
.. code-block:: python
from pprint import pprint
from pssh.clients.native import ParallelSSHClient
hosts = ['myhost1', 'myhost2']
client = ParallelSSHClient(hosts)
output = client.run_command('uname')
for host, host_output in output.items():
for line in host_output.stdout:
print(line)
See `documentation <http://parallel-ssh.readthedocs.io/en/latest/ssh2.html>`_ for a feature comparison of the two clients.
****************************
Native Code Client Features
****************************
* Highest performance and least overhead of any Python SSH libraries
* Thread safe - makes use of native threads for blocking calls like authentication
* Natively non-blocking utilising ``libssh2`` via ``ssh2-python`` - **no monkey patching of the Python standard library**
* Significantly reduced overhead in CPU and memory usage
***********
Exit codes
***********
Once *either* standard output is iterated on *to completion*, or ``client.join(output)`` is called, exit codes become available in host output. Iteration ends *only when remote command has completed*, though it may be interrupted and resumed at any point.
.. code-block:: python
for host in output:
print(output[host].exit_code)
:Output:
.. code-block:: python
0
0
The client's ``join`` function can be used to wait for all commands in output object to finish:
.. code-block:: python
client.join(output)
Similarly, output and exit codes are available after ``client.join`` is called:
.. code-block:: python
from pprint import pprint
output = client.run_command('exit 0')
# Wait for commands to complete and gather exit codes.
# Output is updated in-place.
client.join(output)
pprint(output.values()[0].exit_code)
# Output remains available in output generators
for host, host_output in output.items():
for line in host_output.stdout:
pprint(line)
:Output:
.. code-block:: python
0
<..stdout..>
There is also a built in host logger that can be enabled to log output from remote hosts. The helper function ``pssh.utils.enable_host_logger`` will enable host logging to stdout.
To log output without having to iterate over output generators, the ``consume_output`` flag *must* be enabled - for example:
.. code-block:: python
from pssh.utils import enable_host_logger
enable_host_logger()
client.join(client.run_command('uname'), consume_output=True)
:Output:
.. code-block:: shell
[localhost] Linux
SCP
****
SCP is supported - native clients only - and provides the best performance for file copying.
Unlike with the SFTP functionality, remote files that already exist are *not* overwritten and an exception is raised instead.
Note that enabling recursion with SCP requires server SFTP support for creating remote directories.
To copy a local file to remote hosts in parallel with SCP:
.. code-block:: python
from pssh.clients import ParallelSSHClient
from gevent import joinall
hosts = ['myhost1', 'myhost2']
client = ParallelSSHClient(hosts)
cmds = client.scp_send('../test', 'test_dir/test')
joinall(cmds, raise_error=True)
See also documentation for SCP recv.
SFTP
*****
SFTP is supported natively. Performance is much slower than SCP due to underlying library limitations and SCP should be preferred where possible. In the case of the deprecated paramiko clients, several bugs exist with SFTP performance and behaviour - avoid if at all possible.
To copy a local file to remote hosts in parallel:
.. code-block:: python
from pssh.clients import ParallelSSHClient
from pssh.utils import enable_logger, logger
from gevent import joinall
enable_logger(logger)
hosts = ['myhost1', 'myhost2']
client = ParallelSSHClient(hosts)
cmds = client.copy_file('../test', 'test_dir/test')
joinall(cmds, raise_error=True)
:Output:
.. code-block:: python
Copied local file ../test to remote destination myhost1:test_dir/test
Copied local file ../test to remote destination myhost2:test_dir/test
There is similar capability to copy remote files to local ones suffixed with the host's name with the ``copy_remote_file`` function.
Directory recursion is supported in both cases via the ``recurse`` parameter - defaults to off.
See `SFTP documentation <http://parallel-ssh.readthedocs.io/en/latest/advanced.html#sftp>`_ for more examples.
*****************
Design And Goals
*****************
``parallel-ssh``'s design goals and motivation are to provide a *library* for running *non-blocking* asynchronous SSH commands in parallel with little to no load induced on the system by doing so with the intended usage being completely programmatic and non-interactive.
To meet these goals, API driven solutions are preferred first and foremost. This frees up developers to drive the library via any method desired, be that environment variables, CI driven tasks, command line tools, existing OpenSSH or new configuration files, from within an application et al.
Comparison With Alternatives
*****************************
There are not many alternatives for SSH libraries in Python. Of the few that do exist, here is how they compare with ``parallel-ssh``.
As always, it is best to use a tool that is suited to the task at hand. ``parallel-ssh`` is a library for programmatic and non-interactive use - see `Design And Goals`_. If requirements do not match what it provides then it best not be used. Same applies for the tools described below.
Paramiko
________
The default SSH client library in ``parallel-ssh`` ``1.x.x`` series.
Pure Python code, while having native extensions as dependencies, with poor performance and numerous bugs compared to both OpenSSH binaries and the ``libssh2`` based native clients in ``parallel-ssh`` ``1.2.x`` and above. Recent versions have regressed in performance and have `blocker issues <https://github.com/ParallelSSH/parallel-ssh/issues/83>`_.
It does not support non-blocking mode, so to make it non-blocking monkey patching must be used which affects all other uses of the Python standard library. However, some functionality like Kerberos (GSS-API) authentication is not currently provided by other libraries.
asyncssh
________
Python 3 only ``asyncio`` framework using client library. License (`EPL`) is not compatible with GPL, BSD or other open source licenses and `combined works cannot be distributed <https://www.eclipse.org/legal/eplfaq.php#USEINANOTHER>`_.
Therefore unsuitable for use in many projects, including ``parallel-ssh``.
Fabric
______
Port of Capistrano from Ruby to Python. Intended for command line use and is heavily systems administration oriented rather than non-interactive library. Same maintainer as Paramiko.
Uses Paramiko and suffers from the same limitations. More over, uses threads for parallelisation, while `not being thread safe <https://github.com/fabric/fabric/issues/1433>`_, and exhibits very poor performance and extremely high CPU usage even for limited number of hosts - 1 to 10 - with scaling limited to one core.
Library API is non-standard, poorly documented and with numerous issues as API use is not intended.
Ansible
_______
A configuration management and automation tool that makes use of SSH remote commands. Uses, in parts, both Paramiko and OpenSSH binaries.
Similarly to Fabric, uses threads for parallelisation and suffers from the poor scaling that this model offers.
See `The State of Python SSH Libraries <https://parallel-ssh.org/post/ssh2-python/>`_ for what to expect from scaling SSH with threads, as compared `to non-blocking I/O <https://parallel-ssh.org/post/parallel-ssh-libssh2/>`_ with ``parallel-ssh``.
Again similar to Fabric, its intended and documented use is interactive via command line rather than library API based. It may, however, be an option if Ansible is already being used for automation purposes with existing playbooks, the number of hosts is small, and when the use case is interactive via command line.
``parallel-ssh`` is, on the other hand, a suitable option for Ansible as an SSH client that would improve its parallel SSH performance significantly.
ssh2-python
___________
Wrapper to ``libssh2`` C library. Used by ``parallel-ssh`` as of ``1.2.0`` and is by same author.
Does not do parallelisation out of the box but can be made parallel via Python's ``threading`` library relatively easily and as it is a wrapper to a native library that releases Python's GIL, can scale to multiple cores.
``parallel-ssh`` uses ``ssh2-python`` in its native non-blocking mode with event loop and co-operative sockets provided by ``gevent`` for an extremely high performance library without the side-effects of monkey patching - see `benchmarks <https://parallel-ssh.org/post/parallel-ssh-libssh2>`_.
In addition, ``parallel-ssh`` uses native threads to offload CPU blocked tasks like authentication in order to scale to multiple cores while still remaining non-blocking for network I/O.
``pssh.clients.native.SSHClient`` is a single host natively non-blocking client for users that do not need parallel capabilities but still want a non-blocking client with native code performance.
Out of all the available Python SSH libraries, ``libssh2`` and ``ssh2-python`` have been shown, see benchmarks above, to perform the best with the least resource utilisation and ironically for a native code extension the least amount of dependencies. Only ``libssh2`` C library and its dependencies which are included in binary wheels.
However, it lacks support for some SSH features present elsewhere like ECDSA keys (`PR pending <https://github.com/libssh2/libssh2/pull/206>`_), agent forwarding (`PR also pending <https://github.com/libssh2/libssh2/pull/219>`_) and Kerberos authentication - see `feature comparison <http://parallel-ssh.readthedocs.io/en/latest/ssh2.html>`_.
********
Scaling
********
Some guide lines on scaling ``parallel-ssh`` and pool size numbers.
In general, long lived commands with little or no output *gathering* will scale better. Pool sizes in the multiple thousands have been used successfully with little CPU overhead in the single thread running them in these use cases.
Conversely, many short lived commands with output gathering will not scale as well. In this use case, smaller pool sizes in the hundreds are likely to perform better with regards to CPU overhead in the event loop.
Multiple Python native threads, each of which can get its own event loop, may be used to scale this use case further as number of CPU cores allows. Note that ``parallel-ssh`` imports *must* be done within the target function of the newly started thread for it to receive its own event loop. ``gevent.get_hub()`` may be used to confirm that the worker thread event loop differs from the main thread.
Gathering is highlighted here as output generation does not affect scaling. Only when output is gathered either over multiple still running commands, or while more commands are being triggered, is overhead increased.
Technical Details
******************
To understand why this is, consider that in co-operative multi tasking, which is being used in this project via the ``gevent`` library, a co-routine (greenlet) needs to ``yield`` the event loop to allow others to execute - *co-operation*. When one co-routine is constantly grabbing the event loop in order to gather output, or when co-routines are constantly trying to start new short-lived commands, it causes contention with other co-routines that also want to use the event loop.
This manifests itself as increased CPU usage in the process running the event loop and reduced performance with regards to scaling improvements from increasing pool size.
On the other end of the spectrum, long lived remote commands that generate *no* output only need the event loop at the start, when they are establishing connections, and at the end, when they are finished and need to gather exit codes, which results in practically zero CPU overhead at any time other than start or end of command execution.
Output *generation* is done remotely and has no effect on the event loop until output is gathered - output buffers are iterated on. Only at that point does the event loop need to be held.
*************
User's group
*************
There is a public `ParallelSSH Google group <https://groups.google.com/forum/#!forum/parallelssh>`_ setup for this purpose - both posting and viewing are open to the public.
.. image:: https://ga-beacon.appspot.com/UA-9132694-7/parallel-ssh/README.rst?pixel
:target: https://github.com/igrigorik/ga-beacon
| 45.752187 | 482 | 0.753011 | eng_Latn | 0.993899 |
4fe919415bc67dcdfd3fd79593fdc83f017ccccd | 87 | md | Markdown | README.md | MBFouad/ReactMBFouad.com | 4adac2f6d062c643a94c7c0bbc8e5ae5e53fb68a | [
"MIT"
] | null | null | null | README.md | MBFouad/ReactMBFouad.com | 4adac2f6d062c643a94c7c0bbc8e5ae5e53fb68a | [
"MIT"
] | null | null | null | README.md | MBFouad/ReactMBFouad.com | 4adac2f6d062c643a94c7c0bbc8e5ae5e53fb68a | [
"MIT"
] | null | null | null | # ReactMBFouad.com
MBFouad portoflio website open source with react && laravel
| 29 | 67 | 0.804598 | eng_Latn | 0.524805 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.