| Column | Type | Values |
| --- | --- | --- |
| hexsha | string | lengths 40 to 40 |
| size | int64 | 5 to 1.04M |
| ext | string | 6 classes |
| lang | string | 1 class |
| max_stars_repo_path | string | lengths 3 to 344 |
| max_stars_repo_name | string | lengths 5 to 125 |
| max_stars_repo_head_hexsha | string | lengths 40 to 78 |
| max_stars_repo_licenses | sequence | lengths 1 to 11 |
| max_stars_count | int64 | 1 to 368k, nullable (⌀) |
| max_stars_repo_stars_event_min_datetime | string | lengths 24 to 24, nullable (⌀) |
| max_stars_repo_stars_event_max_datetime | string | lengths 24 to 24, nullable (⌀) |
| max_issues_repo_path | string | lengths 3 to 344 |
| max_issues_repo_name | string | lengths 5 to 125 |
| max_issues_repo_head_hexsha | string | lengths 40 to 78 |
| max_issues_repo_licenses | sequence | lengths 1 to 11 |
| max_issues_count | int64 | 1 to 116k, nullable (⌀) |
| max_issues_repo_issues_event_min_datetime | string | lengths 24 to 24, nullable (⌀) |
| max_issues_repo_issues_event_max_datetime | string | lengths 24 to 24, nullable (⌀) |
| max_forks_repo_path | string | lengths 3 to 344 |
| max_forks_repo_name | string | lengths 5 to 125 |
| max_forks_repo_head_hexsha | string | lengths 40 to 78 |
| max_forks_repo_licenses | sequence | lengths 1 to 11 |
| max_forks_count | int64 | 1 to 105k, nullable (⌀) |
| max_forks_repo_forks_event_min_datetime | string | lengths 24 to 24, nullable (⌀) |
| max_forks_repo_forks_event_max_datetime | string | lengths 24 to 24, nullable (⌀) |
| content | string | lengths 5 to 1.04M |
| avg_line_length | float64 | 1.14 to 851k |
| max_line_length | int64 | 1 to 1.03M |
| alphanum_fraction | float64 | 0 to 1 |
| lid | string | 191 classes |
| lid_prob | float64 | 0.01 to 1 |
24e3c3b012a30e47ed35a871afa4aa46201ea463 | 152 | md | Markdown | readme.md | Krohx/blog-src | 8cf746fca47971d8b2a4cd3a1acf375b5e1439f1 | [
"MIT"
] | null | null | null | readme.md | Krohx/blog-src | 8cf746fca47971d8b2a4cd3a1acf375b5e1439f1 | [
"MIT"
] | null | null | null | readme.md | Krohx/blog-src | 8cf746fca47971d8b2a4cd3a1acf375b5e1439f1 | [
"MIT"
] | null | null | null | ##### Pelican source files for setting up and generating static content for our Github-hosted static blog:
[Blog | Krohx](http://krohx.github.io/blog)
| 30.4 | 105 | 0.75 | eng_Latn | 0.910696 |
24e4758ed8912ea4fb77213a29456ccf038a80f7 | 121 | md | Markdown | project.md | jimmywu1385/jimmywu.github.io | e28b629edeb9bdcad04913a161e2d8319879f20e | [
"CC-BY-3.0"
] | null | null | null | project.md | jimmywu1385/jimmywu.github.io | e28b629edeb9bdcad04913a161e2d8319879f20e | [
"CC-BY-3.0"
] | null | null | null | project.md | jimmywu1385/jimmywu.github.io | e28b629edeb9bdcad04913a161e2d8319879f20e | [
"CC-BY-3.0"
] | null | null | null | ---
layout: landing
title: Project
description: some cool stuff I made
image: assets/images/pic11.jpg
nav-menu: true
---
| 15.125 | 35 | 0.743802 | eng_Latn | 0.884953 |
24e47b0eddc0259c70a16f487484d3d009fc06bf | 29 | md | Markdown | README.md | billhsu/aria | e1d859e18174d4de7d1885c83ce97fe74b763182 | [
"Apache-2.0"
] | null | null | null | README.md | billhsu/aria | e1d859e18174d4de7d1885c83ce97fe74b763182 | [
"Apache-2.0"
] | null | null | null | README.md | billhsu/aria | e1d859e18174d4de7d1885c83ce97fe74b763182 | [
"Apache-2.0"
] | null | null | null | # aria
A smarter smartwatch.
| 9.666667 | 21 | 0.758621 | eng_Latn | 0.597867 |
24e58e0a4fe7de8d1cfcd5fb1b4ccc79aca9e94d | 1,023 | md | Markdown | README.md | xanium4332/libfreenect-sys | 21bdc41fe09b20ec0388ad76b1dca30e434ef6ad | [
"MIT"
] | null | null | null | README.md | xanium4332/libfreenect-sys | 21bdc41fe09b20ec0388ad76b1dca30e434ef6ad | [
"MIT"
] | null | null | null | README.md | xanium4332/libfreenect-sys | 21bdc41fe09b20ec0388ad76b1dca30e434ef6ad | [
"MIT"
] | null | null | null | # libfreenect Rust Bindings [in development]
The `libfreenect-sys` crate provides declarations and linkage for the
`libfreenect` C library. Following the `*-sys` package conventions, the
`libfreenect-sys` crate does not define higher-level abstractions over the
native `libfreenect` library functions.
This crate currently exposes an interface compatible with libfreenect-0.5;
however, it is still missing declarations for audio- and registration-based
functions.
## Dependencies
The system-wide `libfreenect` library is detected using `pkg-config`. If the
library is present and version >= 0.5.2, this crate will merely link to the
system library.
If no system library is found, this crate will compile and statically link
against the in-tree included `libfreenect`. In this case a dependency on the
[`libusb-sys`](https://crates.io/crates/libusb-sys) crate is required in order
to provide a working `libusb` implementation.
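To see which library version the build would pick up, the same pkg-config lookup can be run by hand (assuming `pkg-config` and the library's `.pc` file are installed in the default search paths):

```
# Prints the detected libfreenect version, e.g. 0.5.x
pkg-config --modversion libfreenect
```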
## License
Copyright © 2015 James Buckley
Distributed under the [MIT License](LICENSE).
| 40.92 | 79 | 0.792766 | eng_Latn | 0.997201 |
24e69c3e4d35a7bd50748a08673d2b02449f6633 | 8,382 | md | Markdown | README.md | cmcmicrosystems/RISCV | cc789c46450c55e002bb6c4a45ec8ec46784eac2 | [
"Apache-2.0"
] | 6 | 2019-10-25T15:27:58.000Z | 2021-07-08T06:34:28.000Z | README.md | cmcmicrosystems/RISCV | cc789c46450c55e002bb6c4a45ec8ec46784eac2 | [
"Apache-2.0"
] | 1 | 2022-02-11T03:03:55.000Z | 2022-02-11T03:03:55.000Z | README.md | cmcmicrosystems/RISCV | cc789c46450c55e002bb6c4a45ec8ec46784eac2 | [
"Apache-2.0"
] | 3 | 2020-05-30T13:09:37.000Z | 2022-02-24T16:51:53.000Z | # RISCV-Pulpissimo-FPGA-Implementation-for-ZCU102
*Note:*
- *CMC clients may submit their questions through CMC's online support form to get a timely response.*
- *Further development work (such as new applications or IPs) relevant to RISC-V on zcu102 from the community is welcome to be released through CMC's github. You may contribute back to this github repository by submitting a pull request.*
- *CMC's clients are welcome to release their open source projects through CMC's github if they are willing to share them with the community. Contact CMC for details.*
##
This README provides instructions on how to set up the environment to compile the pulpissimo fpga platform and configure it on the Xilinx ZCU102 evaluation board. Pulpissimo is a 32-bit RI5CY (a RISC-V compatible core) single-core System-on-a-Chip developed by the PULP team (Parallel processing Ultra-low Power platform).
For more information about Pulpissimo, please visit [Pulpissimo on github](https://github.com/pulp-platform/pulpissimo).
For more information about PULP, please visit [PULP web site](https://pulp-platform.org/).
CMC ported the pulpissimo platform to the Xilinx ZCU102 evaluation board based on the FPGA implementations for other FPGA boards provided by the PULP team. This development has been merged into the master branch of the pulpissimo github repository (https://github.com/pulp-platform/pulpissimo), so users can obtain pulpissimo-zcu102 either from the Pulpissimo github repository or from this repository. The "hello world" application used for testing pulpissimo-zcu102 is only available from this repository.
## System Requirements
**Host OS:** the Pulpissimo platform and related tools are tested on Ubuntu 16.04 by CMC. Information on other supported Linux distributions can be found here: https://github.com/pulp-platform/pulp-builder/blob/master/README.md
**JTAG programming cable:** required for downloading and debugging an application on the Pulpissimo platform. A Digilent JTAG HS2 programming cable is used in this case.
**Software tools and environment:**
- RISC-V toolchain: cross-compiling tools for creating applications. more information can be found here: https://github.com/pulp-platform/pulp-riscv-gnu-toolchain
- Pulp SDK: runtime environment for creating application running on Pulpissimo platform. More information can be found here:https://github.com/pulp-platform/pulp-sdk
- Pulpissimo platform package: the tools and source files required to compile the Pulpissimo platform. More information can be found here: https://github.com/pulp-platform/pulpissimo
- Minicom: a terminal used for communication between the host and the Pulpissimo platform.
Please note: most of the instructions provided below are duplicated from the Pulp-platform GitHub pages. This README brings the instructions that are provided in different Pulp-platform GitHub pages into one place for a quick and convenient getting started with Pulpissimo for ZCU102. You are encouraged to visit Pulp-platform GitHub for more details.
## Download CMC Pulpissimo FPGA Implementation for ZCU102
```
$git clone https://github.com/cmcmicrosystems/pulpissimo-zcu102.git
```
## Installing Linux dependency
Please install the following dependency on your host Ubuntu Linux:
```
$sudo apt install git python3-pip python-pip gawk texinfo libgmp-dev libmpfr-dev libmpc-dev swig3.0 libjpeg-dev lsb-core doxygen python-sphinx sox graphicsmagick-libmagick-dev-compat libsdl2-dev libswitch-perl libftdi1-dev cmake scons libsndfile1-dev
$sudo pip3 install artifactory twisted prettytable sqlalchemy pyelftools openpyxl xlsxwriter pyyaml numpy configparser pyvcd
$sudo pip2 install configparser
```
Please note: the default gcc version should be 5. Other versions might make the build fail.
## Installing RISCV toolchain
This section provides instructions on installing the RISC-V cross-compiling tools.
**Download the sources**
```
$git clone --recursive https://github.com/pulp-platform/pulp-riscv-gnu-toolchain
```
**Install prerequisites**
```
$sudo apt-get install autoconf automake autotools-dev curl libmpc-dev libmpfr-dev libgmp-dev gawk build-essential bison flex texinfo gperf libtool patchutils bc zlib1g-dev
```
**Choose an installation path**
You may choose any accessible directory to host the RISC-V toolchain. For example, /opt/riscv is picked to install the toolchain in this example. Add "/opt/riscv/bin" to your PATH environment variable:
```
$export PATH="/opt/riscv/bin:$PATH"
```
**Build and Install Newlib cross-compiler for Pulp**
```
$cd pulp-riscv-gnu-toolchain
$./configure --prefix=/opt/riscv --with-arch=rv32imc --with-cmodel=medlow --enable-multilib
$make
$./configure --prefix=/opt/riscv
$make
```
You should now have riscv-gcc tools installed under /opt/riscv.
## Installing pulp platform and Build Pulp-SDK
**Download pulpissimo platform**
```
$cd ~
$git clone https://github.com/pulp-platform/pulpissimo.git
```
**Build pulp-sdk**
```
$cd pulpissimo
$git clone https://github.com/pulp-platform/pulp-sdk.git
$cd pulp-sdk
$export PULP_RISCV_GCC_TOOLCHAIN="/opt/riscv/bin"
```
Copy the zcu102.sh to ~/pulpissimo/pulp-sdk/configs/fpgas/pulpissimo/ as follows:
```
$cp ~/pulpissimo-zcu102/zcu102.sh ~/pulpissimo/pulp-sdk/configs/fpgas/pulpissimo/
```
**Select target and platform, and Build SDK**
```
$source configs/pulpissimo.sh
$source configs/fpgas/pulpissimo/zcu102.sh
$make all
```
## ZCU102 FPGA Implementation
**Copy relevant folder and files**
```
$cp ~/pulpissimo-zcu102/Makefile ~/pulpissimo/fpga # this step will overwrite the existing Makefile in the fpga folder
```
In order to generate the PULPissimo bitstream for a supported target FPGA board, first generate the necessary synthesis include scripts by starting the update-ips script in the pulpissimo root directory:
```
$cd ~/pulpissimo
$./update-ips
```
**Build bitstream for zcu102**
```
$cd fpga
$make zcu102
```
You should find the pulpissimo-zcu102.bit generated under the current directory.
## Program ZCU102 board
**Step one:** Connect the ZCU102 evaluation board to your host machine with a Micro-USB cable from the J2 connector (USB JTAG) on the ZCU102 board to a USB port on your host machine.
**Step two:** Set the board boot mode to JTAG boot (all four DIP switches of switch SW6 set to the on position). More details on how to set up the zcu102 board are provided in the ZCU102 Evaluation Board User Guide.
**Step three:** Program the ZCU102 with Hardware Manager: invoke Vivado, then Open Hardware Manager, Open Target, Program device. Detailed instructions on how to use the hardware manager are provided in the Vivado Design Suite User Guide: Programming and Debugging.
## Build Hello World Application
Copy the “hello” application example to pulp-sdk.
```
$cp -r ~/pulpissimo-zcu102/hello ~/pulpissimo/pulp-sdk
```
In the pulp-sdk directory, issue the following commands:
```
$source configs/pulpissimo.sh
$source configs/fpgas/pulpissimo/zcu102.sh
$make env
$source sourceme.sh
$cd hello
$make clean all
```
You should see the binary is generated under ./build/pulpissimo/test/test.
## Run Hello Application on ZCU102 FPGA
**Step one:** Connect the UART of the ZCU102 (J83) to a USB port on your host.
**Step two:** Connect Digilent JTAG-HS2 adapter from (J55) of the ZCU102 to a USB port on your host.
Please note the HS2 connector should be connected to the J55 pins with odd numbers (the top row of the J55).
**Step three:** Program the board with the pulpissimo-zcu102.bit following the instructions provided in the previous section
**Step four:** Open three terminals on your host.
In terminal 1, issue the following commands:
```
$ minicom -s
```
Set serial port to /dev/ttyUSB0 with baud rate of 115200.
Please note you might need to configure the serial port to a different device (for example /dev/ttyUSB1) depending on which USB port on the host side is connected to the UART of the board.
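Alternatively, minicom can be started non-interactively with the device and baud rate given on the command line (adjust the device path as noted above):
```
$ minicom -D /dev/ttyUSB0 -b 115200
```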
In terminal 2, issue the following commands:
```
$cd ~/pulpissimo/pulp-sdk
$./pkg/openocd/1.0/bin/openocd -f ~/pulpissimo/fpga/pulpissimo-zcu102/openocd-zcu102-digilent-jtag-hs2.cfg
```
In terminal 3, issue the following commands:
```
$riscv32-unknown-elf-gdb ~/pulpissimo/pulp-sdk/hello/build/pulpissimo/test/test
```
In gdb, run:
```
(gdb)target remote localhost:3333
(gdb)load
(gdb)b main
(gdb)list
(gdb)continue
```
You should see “hello world!” in terminal 1 at this point.
| 50.8 | 527 | 0.78084 | eng_Latn | 0.973735 |
24e6c981bda98b90ea9f0d67f0cc4d0162cee42f | 590 | md | Markdown | README.md | Enliven-se/enliven-bootswatch | 53c1a9be6accc3f00d94df905b23ed304ab1062d | [
"MIT"
] | null | null | null | README.md | Enliven-se/enliven-bootswatch | 53c1a9be6accc3f00d94df905b23ed304ab1062d | [
"MIT"
] | 1 | 2016-09-16T12:58:00.000Z | 2016-09-16T12:58:00.000Z | README.md | Enliven-se/enliven-bootswatch | 53c1a9be6accc3f00d94df905b23ed304ab1062d | [
"MIT"
] | null | null | null | [Bootstrap](http://getbootstrap.com/) [bootswatch](http://bootswatch.com/) for [ENLIVEN](http://www.enliven.co)
Packaged for use with Bower, Node, and Meteor.
Usage:
* Set up: `yarn setup`
-- (or `npm run setup`)
* Watch & learn: `gulp serve`
* Build: `gulp build`
* Deploy to Github Pages: `gulp deploy`
PLEASE NOTE:
Although this package is required by the `enliven-frontend` project, the presence of a `node_modules` folder inside of this one will cause the build of `enliven-frontend` to fail. So please remove the local `node_modules` folder before building the master project.
| 36.875 | 264 | 0.735593 | eng_Latn | 0.933656 |
24e8517c356128640206fa9fdf89d1d85a942490 | 141 | md | Markdown | README.md | RajanPatel97/QuoteGenerator | bebba14eafa21190c38dbdcd34c40e60a402b2c5 | [
"MIT"
] | 2 | 2020-07-25T15:19:36.000Z | 2020-07-25T16:29:15.000Z | README.md | RajanPatel97/Wise-Words-Quote-Generator | bebba14eafa21190c38dbdcd34c40e60a402b2c5 | [
"MIT"
] | null | null | null | README.md | RajanPatel97/Wise-Words-Quote-Generator | bebba14eafa21190c38dbdcd34c40e60a402b2c5 | [
"MIT"
] | null | null | null | # Wise Words Quote Generator
### https://rajanpatel97.github.io/Wise-Words-Quote-Generator/
Part of the '30 Days of JavaScript' Challenge.
| 23.5 | 62 | 0.758865 | kor_Hang | 0.234621 |
24e940111e42232e37add05b463e8074062ab24e | 34 | md | Markdown | build/page_footer.md | michatran4/md-blogger | e81c86b8b850e8366fe75ca48789227b7997b9e3 | [
"Unlicense"
] | 13 | 2021-12-20T21:12:31.000Z | 2022-02-19T05:42:14.000Z | build/page_footer.md | michatran4/md-blogger | e81c86b8b850e8366fe75ca48789227b7997b9e3 | [
"Unlicense"
] | null | null | null | build/page_footer.md | michatran4/md-blogger | e81c86b8b850e8366fe75ca48789227b7997b9e3 | [
"Unlicense"
] | null | null | null | ---
[homepage](../index.html)
| 8.5 | 26 | 0.5 | zul_Latn | 0.188602 |
24ea80f6cc872f82a07fc5aa62fec6152020d035 | 178 | md | Markdown | changelog/fix_a_false_positive_for_style_regexp_literal.md | caalberts/rubocop | 981a8a22ef2c02b5fb813b162509a4e0b82fcac7 | [
"MIT"
] | 2 | 2021-08-12T15:50:26.000Z | 2021-11-09T10:50:55.000Z | changelog/fix_a_false_positive_for_style_regexp_literal.md | caalberts/rubocop | 981a8a22ef2c02b5fb813b162509a4e0b82fcac7 | [
"MIT"
] | 1 | 2021-06-29T06:47:04.000Z | 2021-06-29T06:47:04.000Z | changelog/fix_a_false_positive_for_style_regexp_literal.md | caalberts/rubocop | 981a8a22ef2c02b5fb813b162509a4e0b82fcac7 | [
"MIT"
] | 1 | 2021-06-29T06:42:55.000Z | 2021-06-29T06:42:55.000Z | * [#9880](https://github.com/rubocop/rubocop/pull/9880): Fix a false positive for `Style/RegexpLiteral` when using a regexp starts with a blank as a method argument. ([@koic][])
| 89 | 177 | 0.735955 | eng_Latn | 0.742653 |
24eaf1b4ad6a767519fe8150f96dd3b91545f29e | 407 | md | Markdown | RecyclerViewItemsLeft/README.md | InsanusMokrassar/AndroidUtils | 6da62f1690b6e29de1e92d18020ea587fff1ea2b | [
"Apache-2.0"
] | null | null | null | RecyclerViewItemsLeft/README.md | InsanusMokrassar/AndroidUtils | 6da62f1690b6e29de1e92d18020ea587fff1ea2b | [
"Apache-2.0"
] | 1 | 2018-02-09T11:42:34.000Z | 2018-02-09T11:42:34.000Z | RecyclerViewItemsLeft/README.md | InsanusMokrassar/AndroidUtils | 6da62f1690b6e29de1e92d18020ea587fff1ea2b | [
"Apache-2.0"
] | null | null | null | # RecyclerViewItemsLeft
This module was created as part of the AndroidUtils support library and contains only one class, used for notifying about the number of items left before the end of a list in `RecyclerView`.
Just call `RecyclerView#subscribeItemsLeft` with `(callback: (Int) -> Unit, leftItems: Int)`, or with `(callback: (Int) -> Unit, filter: (Int) -> Boolean)` to filter the left-items counts you care about, and be notified when you need it. A minimal usage sketch is shown below.
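A Kotlin sketch of both overloads, based only on the signatures above (`loadMore` and the threshold value 5 are illustrative placeholders):

```kotlin
// Notify when 5 or fewer items are left before the end of the list.
recyclerView.subscribeItemsLeft({ itemsLeft -> loadMore(itemsLeft) }, 5)

// Or decide yourself, via a filter over the number of items left.
recyclerView.subscribeItemsLeft({ itemsLeft -> loadMore(itemsLeft) }, { itemsLeft -> itemsLeft < 5 })
```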
| 67.833333 | 205 | 0.761671 | eng_Latn | 0.998896 |
24ec27daff41914cfcf4f7d564baf3646d3ce178 | 2,482 | md | Markdown | technologies/app/orientdb-3.1.7/README.md | cleyrop/technologies | dea1d94288648f1fc7f76c09bb43b0dd70fd8e75 | [
"Apache-2.0"
] | 5 | 2020-04-03T13:23:16.000Z | 2022-03-21T10:03:13.000Z | technologies/app/orientdb-3.1.7/README.md | cleyrop/technologies | dea1d94288648f1fc7f76c09bb43b0dd70fd8e75 | [
"Apache-2.0"
] | 173 | 2020-01-21T10:09:47.000Z | 2022-03-30T08:11:35.000Z | technologies/app/orientdb-3.1.7/README.md | cleyrop/technologies | dea1d94288648f1fc7f76c09bb43b0dd70fd8e75 | [
"Apache-2.0"
] | 12 | 2020-05-14T13:38:05.000Z | 2022-01-19T09:37:48.000Z | # OrientDB - customized by Saagie
This Docker image, available on [Saagie's DockerHub](https://hub.docker.com/r/saagie/orientdb), is based on the official [openjdk:8-jre-slim](https://hub.docker.com/_/openjdk) image.
It is specially designed to run on Saagie's V2 platform.
If you need persistence, the volume should at least have 256Mb of space.
## Build the image
### Using gradle build
This gradle build is based on [Saagie's technology plugin](https://github.com/saagie/technologies-plugin).
To build the project, go to the root of this project.
Then run:
```
./gradlew :orientdb-3.1.7:buildImage
```
If you want to test the image, you can run:
```
./gradlew :orientdb-3.1.7:testImage
```
### Using docker commands
First go to context/version sub-directory:
```
cd orientdb-3.1.7
```
Then run the following command:
```
docker build -t saagie/orientdb:3.1.7 .
```
## Run OrientDB container
### On Saagie's Platform
This container is designed to run on Saagie's platform.
The official documentation is available here: [Saagie's official documentation](https://docs.saagie.io/product/latest/sdk/index.html).
### On premise / your local server
It is possible (mainly for development and testing) to run this image outside of a Saagie platform.
Please note that Saagie cannot provide any support for images launched outside of its platform.
Run:
```
docker run -it --rm --name orientdb -p 19480:9480 -p 19424:9424 -e ORIENTDB_WEB_PATH=/http -e ORIENTDB_BINARY_PATH=/binary -e ORIENTDB_ROOT_PASSWORD=yourPassword saagie/orientdb:3.1.7-1.77.0_apporientdb
```
- Port `9480` should be mapped to the one you will be using on host side (here `19480`) for web access.
- Port `9424` should be mapped to the one you will be using on host side (here `19424`) for binary access.
- `ORIENTDB_WEB_PATH` variable is **mandatory** and defines a specific path for web access, it can be /. It's used to customize the path to the application when behind a reverse proxy.
- `ORIENTDB_BINARY_PATH` variable is **mandatory** and defines a specific path for binary access, it can be /. It's used to customize the path to the application when behind a reverse proxy.
- `ORIENTDB_ROOT_PASSWORD` variable is also **mandatory** and should be set to whatever you'll be using as a password to access OrientDB, here `yourPassword`
Databases are under /orientdb/databases
Configuration is under /orientdb/config
Logs are under /orientdb/log
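For example, a sketch of the same `docker run` with the database and configuration directories kept on host volumes (the host paths are placeholders; adapt them to your environment):
```
docker run -it --rm --name orientdb -p 19480:9480 -p 19424:9424 \
  -e ORIENTDB_WEB_PATH=/http -e ORIENTDB_BINARY_PATH=/binary -e ORIENTDB_ROOT_PASSWORD=yourPassword \
  -v /path/on/host/databases:/orientdb/databases \
  -v /path/on/host/config:/orientdb/config \
  saagie/orientdb:3.1.7-1.77.0_apporientdb
```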
Configure persistence according to your needs. | 35.971014 | 202 | 0.757051 | eng_Latn | 0.970404 |
24ec33b037ace5e5571687b5b3d5426272cb130f | 6,706 | md | Markdown | _posts/2018-07-11-npm-vue-component.md | frankxjkuang/frankxjkuang.github.io | e9ffb6b34aec2ea5e5236d0ab9ba2902245fb7b1 | [
"MIT"
] | 3 | 2017-10-23T01:37:25.000Z | 2017-12-04T10:24:02.000Z | _posts/2018-07-11-npm-vue-component.md | frankxjkuang/frankxjkuang.github.io | e9ffb6b34aec2ea5e5236d0ab9ba2902245fb7b1 | [
"MIT"
] | null | null | null | _posts/2018-07-11-npm-vue-component.md | frankxjkuang/frankxjkuang.github.io | e9ffb6b34aec2ea5e5236d0ab9ba2902245fb7b1 | [
"MIT"
] | 1 | 2019-03-17T06:14:50.000Z | 2019-03-17T06:14:50.000Z | ---
layout: post
title: Package a Vue component and publish it on npm
date: 2018-07-11 11:45:30 +0800
categories: Vue
tag: note
---
* content
{:toc}
[Source code](https://github.com/frankxjkuang/custom-ui). If it helps you, I hope you won't be stingy with a Star
This post mainly records how to build a component based on `Vue` and publish it on [npm](https://www.npmjs.com/). Without further ado, let's get to it
# Developing a Vue plugin
Before starting, have a look at the official [plugin development guidelines](https://cn.vuejs.org/v2/guide/plugins.html#%E5%BC%80%E5%8F%91%E6%8F%92%E4%BB%B6)
The end result we want is a component that can be pulled in with import, require, or directly via a script tag, like this:
```js
// Note: the package name is prefixed with custom, while the components are prefixed with moor
// That's because the name I wanted was already taken when publishing (I originally called it moor-ui); it's now custom-ui, but I didn't bother renaming the component prefix
import CustomUI from 'custom-ui';
// or: const CustomUI = require('custom-ui');
// or: <script src="..."></script>
Vue.use(CustomUI);
```
# Scaffolding a Vue project
For component development we use `webpack-simple`:
```bash
vue init webpack-simple <project-name>
```
> **PS:** I chose "use sass" here because we'll need it later when developing the components
The file structure for component development is shown below; it borrows from [element](https://github.com/elemefe), but ours is a much simpler version, just for sharing and personal use
```bash
.
├── src/                    // source directory
│   ├── packages/           // components directory
│   │   ├── switch/         // a component (switch, as the example)
│   │   ├── moor-switch.vue // component source
│   │   ├── index.js        // registers the plugin
│   ├── App.vue             // page entry
│   ├── main.js             // app entry
│   ├── index.js            // entry point for (all) plugins
├── index.html              // entry html file
.
```
OK, the groundwork is done and we can start building components. Continuing with the example above, let's develop a `switch` component.
# Developing a single component
First, the target result:

Getting started: create a switch folder under packages to hold the switch component's source, then create moor-switch.vue and index.js inside the switch folder
## moor-switch.vue
This file holds the component source. I won't paste it all here; I'll just point out what I consider the most important part, which is also the key point when wrapping form-type components:
binding `v-model` on a custom component, see the [official docs](https://cn.vuejs.org/v2/guide/components-custom-events.html#%E8%87%AA%E5%AE%9A%E4%B9%89%E7%BB%84%E4%BB%B6%E7%9A%84-v-model)
Usage:
```html
<!-- Bind to the parent component's value -->
<!-- isSwitch = false -->
<moor-switch
v-model="isSwitch">开关:
</moor-switch>
<!-- The child component must contain an input that handles the corresponding value -->
<!-- The most important part is :value="value", which binds the value -->
<!-- @change="$emit('input', $event.target.value)" emits the event that updates the value -->
<input
type="checkbox"
@change="$emit('input', $event.target.value)"
:true-value="activeValue"
:false-value="inactiveValue"
:disabled="disabled"
:value="value">
<!-- And of course props are needed to receive this value -->
<script>
// ... code omitted
props: {
value: {
type: [Boolean, String, Number],
default: false
}
}
// ... code omitted
</script>
```
## index.js
There isn't much to this file; it simply registers the component as a Vue plugin. It's only three lines, so here it is:
```js
// MoorSwitch is the component in question; remember it must match the name property in moor-switch.vue
import MoorSwitch from './moor-switch';
MoorSwitch.install = Vue => Vue.component(MoorSwitch.name, MoorSwitch);
export default MoorSwitch;
```
That's basically it. But to collect all the components in one place (say I also have `select`, `input`, and `button` components), I'd like to manage them from a single file
So I created an index.js file next to App.vue; its content is straightforward:
```js
import HelloWorld from './packages/hello-world/index.js';
import MoorSwitch from './packages/switch/index.js';
// ...keep adding more here if needed
const components = [
HelloWorld,
MoorSwitch
// ...keep adding more here if needed
]
const install = function(Vue, opts = {}) {
components.map(component => {
Vue.component(component.name, component);
})
}
/* Support usage via a script tag */
if (typeof window !== 'undefined' && window.Vue) {
install(window.Vue);
}
export default {
install,
HelloWorld,
MoorSwitch
// ...keep adding more here if needed
}
```
To run it locally via a `<script/>` tag, edit the index.html file:
```html
<!-- some code omitted -->
<div id="app">
<moor-hello-world :color="color" :msg="msg"></moor-hello-world>
<moor-switch
v-model="lightSwitch">开关:</moor-switch>
</div>
<script src="./node_modules/vue/dist/vue.js"></script>
<script src="/dist/custom-ui.js"></script>
<script>
new Vue({
el: '#app',
data() {
return {
color: 'red',
msg: 'hello world!',
lightSwitch: false
}
}
})
</script>
```
Then run `npm run dev` and you can see the result:

At this point our component is done; next, let's package it and publish it to npm
# Publishing to npm
Before packaging, we first need to tweak the `webpack.config.js` file;
```js
// ... code omitted
// execution environment
const NODE_ENV = process.env.NODE_ENV
module.exports = {
// pick a different entry point depending on the execution environment
entry: NODE_ENV == 'development' ? './src/main.js' : './src/index.js',
output: {
// change the build output: emit a single bundle in the top-level dist directory; this is what import resolves to by default
path: path.resolve(__dirname, './dist'),
publicPath: '/dist/',
filename: 'custom-ui.js',
library: 'custom-ui', // the module name used when you require it
libraryTarget: 'umd', // libraryTarget controls the UMD wrapper that is generated: CommonJS only, AMD, or plain script-tag usage
umdNamedDefine: true // names the AMD module inside the UMD build; otherwise an anonymous define is used
},
// ... code omitted
}
```
Modify the `package.json` file:
```js
// we're publishing it publicly, so this field must be changed to false
"private": false,
// this is the path that gets resolved when you import custom-ui
"main": "dist/custom-ui.js",
```
Publishing really only takes two commands
```js
// you need an npm account for this; the official site is linked at the top of the post
npm login // log in
npm publish // publish
```
Once published, we can install and use it in a project
```js
npm install custom-ui -S
```
Import the plugin in `main.js`
```js
import CustomUI from 'custom-ui'
Vue.use(CustomUI)
```
Use it in a component:
```html
<!-- Use the scaffold's HelloWorld component directly -->
<!-- Some code is omitted here; add the snippets in the right places -->
<div class="moor-item">
<label>Input: </label>
<moor-input
v-model="input1"
placeholder="请输入信息">
</moor-input>
<moor-input
v-model="input2"
placeholder="请输入信息">
</moor-input>
<moor-input
placeholder="输入框禁用"
:disabled="inputDisabled">
</moor-input>
</div>
<div class="moor-item">
<label>Switch: </label>
<moor-switch
v-model="lightSwitch">开关(开):</moor-switch>
<moor-switch
v-model="switchLight">开关(关):</moor-switch>
</div>
<script>
export default {
name: 'HelloWorld',
data () {
return {
// HelloWorld
msg: 'Welcome to moor UI!',
color: 'red',
// input
input1: '',
input2: '这是默认值',
inputDisabled: true,
// switch
lightSwitch: false,
switchLight: true
}
},
watch: {
lightSwitch: newValue => console.log('开关:', newValue),
}
}
</script>
<style scoped>
.moor-select, .moor-btn, .moor-switch, .moor-input {
margin: 10px 6px;
}
.moor-item {
display: flex;
align-items: center;
}
.moor-item label {
width: 100px;
display: inline-block;
}
</style>
```
The preview looks like this:

**PS:** edit .gitignore so that dist is no longer ignored, because the bundled files need to be committed too; each npm release also needs a version bump in the version field of package.json
This write-up is fairly brief; the goal is mainly to share the approach. If you're used to open-source components it's worth understanding them yourself; sometimes you can't find a suitable open-source component during development and have to build your own, and open-sourcing the neat little plugins we write is a good thing...
Finally, I'd appreciate a Star: [source code](https://github.com/frankxjkuang/custom-ui)
哦,对了README,不想写了...哈哈 | 21.221519 | 148 | 0.64912 | yue_Hant | 0.53157 |
24ed2d8fee12030de4a8ab14a887415688c53722 | 581 | md | Markdown | VBA/Office-F1-Landing/class-doesn-t-support-automation-error-430office-shared-vblr6-chm1011327.md | oloier/VBA-content | 6b3cb5769808b7e18e3aff55a26363ebe78e4578 | [
"CC-BY-4.0",
"MIT"
] | 584 | 2015-09-01T10:09:09.000Z | 2022-03-30T15:47:20.000Z | VBA/Office-F1-Landing/class-doesn-t-support-automation-error-430office-shared-vblr6-chm1011327.md | oloier/VBA-content | 6b3cb5769808b7e18e3aff55a26363ebe78e4578 | [
"CC-BY-4.0",
"MIT"
] | 585 | 2015-08-28T20:20:03.000Z | 2018-08-31T03:09:51.000Z | VBA/Office-F1-Landing/class-doesn-t-support-automation-error-430office-shared-vblr6-chm1011327.md | oloier/VBA-content | 6b3cb5769808b7e18e3aff55a26363ebe78e4578 | [
"CC-BY-4.0",
"MIT"
] | 590 | 2015-09-01T10:09:09.000Z | 2021-09-27T08:02:27.000Z | ---
title: Class doesn't support Automation (Error 430), Office Shared [vblr6.chm1011327]
keywords: vblr6.chm1011327
f1_keywords:
- vblr6.chm1011327
ms.prod: office
ms.assetid: fe44953d-b276-4b6c-8d58-02bfaa9f92b3
ms.date: 06/08/2017
---
# Class doesn't support Automation (Error 430), Office Shared [vblr6.chm1011327]
Hi there! You have landed on one of our F1 Help redirector pages. Please select the topic you were looking for below.
[Class doesn't support Automation (Error 430)](http://msdn.microsoft.com/library/f3d5d8a8-4d53-f8bc-b5dc-62f0820fe8fc%28Office.15%29.aspx)
| 32.277778 | 138 | 0.776248 | eng_Latn | 0.567271 |
24ed373bd380da1cc0a3543f3abe26f91bd26185 | 1,867 | md | Markdown | TypeScript.md | mindbox-moscow/style-guides | 1cda704249640ab41b381bca0f8a0e53f145e975 | [
"MIT"
] | null | null | null | TypeScript.md | mindbox-moscow/style-guides | 1cda704249640ab41b381bca0f8a0e53f145e975 | [
"MIT"
] | null | null | null | TypeScript.md | mindbox-moscow/style-guides | 1cda704249640ab41b381bca0f8a0e53f145e975 | [
"MIT"
] | null | null | null | # Coding standards — TypeScript
### Formatting
Use tabs for indentation.
Use the K&R style for curly braces:
```typescript
interface Foo {
bar: string;
}
```
##### Line length
- The maximum allowed line length is 130 characters
- A tab character counts as 4 characters
- Trailing whitespace at the end of a line is counted
### Naming
- The file name matches the class name. For React, it matches the component name.
- Container components have no suffix.
- Presentational components have the Presenter suffix
### Typing props
```typescript
// Each interface is optional.
export interface StateProps {}
export interface DispatchProps {}
export interface OwnProps {}
interface Props extends StateProps, DispatchProps, OwnProps {}
```
If a _Presenter_ has `OwnProps`, they live inside the presenter.
If a _Container_ has `OwnProps`, they live inside the container.
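A minimal sketch of how these interfaces are typically wired together with react-redux's `connect` (the state shape, action, and component names are illustrative, not part of this guide):

```typescript
import { Dispatch } from "redux";
import { connect } from "react-redux";
import { MyComponentPresenter } from "./MyComponentPresenter";

// Illustrative shapes; in a real component they match the interfaces above.
interface AppState { title: string; }
export interface StateProps { title: string; }
export interface DispatchProps { onSave: () => void; }
export interface OwnProps { id: string; }

const mapStateToProps = (state: AppState, ownProps: OwnProps): StateProps => ({
    title: state.title
});

const mapDispatchToProps = (dispatch: Dispatch): DispatchProps => ({
    onSave: () => dispatch({ type: "SAVE" })
});

// The container gets no suffix; the presenter ends with Presenter.
export const MyComponent = connect<StateProps, DispatchProps, OwnProps>(
    mapStateToProps,
    mapDispatchToProps
)(MyComponentPresenter);
```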
### Splitting React components into files
##### Minimal variant
- MyComponent - container
- MyComponentPresenter - presenter
- MyComponentReducer - Reducer, Actions, Action Creators
##### Medium variant
- MyComponent - container
- MyComponentPresenter - presenter
- MyComponentReducer - Reducer
- MyComponentActions - Actions, Action Creators
##### Maximal variant
- MyComponent - container
- MyComponentPresenter - presenter
- MyComponentReducer - Reducer
- MyComponentActionCreators - Action Creators
- MyComponentActions - Actions
### Folders
Group files into folders by feature, not by architectural layer. Example:
```
Operation
OperationSteps
AddOrEditOperationStep
AddOrEditOperationStep.ts
AddOrEditOperationStepPresenter.tsx
AddOrEditOperationStepReducer.ts
SendEmailOperationStep
<files>
OutptutWriters
CustomerOutputWriter
<files>
```
| 28.287879 | 89 | 0.754151 | rus_Cyrl | 0.774726 |
24edbd2ff630e1bdda16d93f170e6a074a9ea617 | 138 | md | Markdown | README.md | evollhhan/Amy | 1eb481e433ebd2e943f74ea979ac90fb802c7965 | [
"MIT"
] | null | null | null | README.md | evollhhan/Amy | 1eb481e433ebd2e943f74ea979ac90fb802c7965 | [
"MIT"
] | null | null | null | README.md | evollhhan/Amy | 1eb481e433ebd2e943f74ea979ac90fb802c7965 | [
"MIT"
] | null | null | null | # LOOKBOOK S1
> 2016-2017 personal portfolio
Please view the website: [https://evollhhan.github.io/LOOKBOOK/](https://evollhhan.github.io/LOOKBOOK/)
| 23 | 103 | 0.746377 | yue_Hant | 0.389144 |
24ee58ef05c4d5590a7e64941ff0e9b48e8339b7 | 84 | md | Markdown | README.md | dennischestakov/dennischestakov.github.io | d94782fe550bde6b5b9cc1678657fd6ecdfbf0bb | [
"MIT"
] | null | null | null | README.md | dennischestakov/dennischestakov.github.io | d94782fe550bde6b5b9cc1678657fd6ecdfbf0bb | [
"MIT"
] | null | null | null | README.md | dennischestakov/dennischestakov.github.io | d94782fe550bde6b5b9cc1678657fd6ecdfbf0bb | [
"MIT"
] | null | null | null | # dennischestakov.github.io
A personal site that I made from a bootstrap template.
| 21 | 54 | 0.797619 | eng_Latn | 0.986948 |
24f0f5f7405ac1a42f24d6fe528ade6350670a87 | 777 | md | Markdown | README.md | tmcw/make-relative | bc6475fde56de5da366412a34c585e8530341909 | [
"BlueOak-1.0.0"
] | 23 | 2019-06-09T22:06:05.000Z | 2022-02-12T18:16:47.000Z | README.md | garrying/make-relative | ff61da282f638652b8cd35c21f0571400f53d4c1 | [
"BlueOak-1.0.0"
] | 2 | 2020-05-12T04:14:04.000Z | 2021-05-08T20:50:29.000Z | README.md | garrying/make-relative | ff61da282f638652b8cd35c21f0571400f53d4c1 | [
"BlueOak-1.0.0"
] | 3 | 2019-08-15T23:55:02.000Z | 2021-12-12T08:01:04.000Z | # make-relative
A missing link for IPFS compatibility: this module makes links relative on
a website, so that one can navigate through the pages when using IPFS. IPFS
essentially makes websites run in sub-directories, so if you’re using links
that start with /, then they’ll jump to the wrong place.
Installation:
```
npm install -g https://github.com/tmcw/make-relative
```
Usage:
This needs to be run in the root of a built website. It will find all HTML
files under its current path and rewrite the following references in-place:
- a href, link href
- meta content
- img src
```sh
$ make-relative https://expected-domain-name.com
```
The domain name is required: it lets the tool also rewrite domain-absolute links as relative when they point to the same website.
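For illustration, a root-absolute reference in a page one directory below the root would be rewritten roughly like this (the exact number of `../` segments depends on each page's depth):

```html
<!-- blog/post.html, before -->
<link rel="stylesheet" href="/css/style.css">

<!-- blog/post.html, after -->
<link rel="stylesheet" href="../css/style.css">
```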
| 27.75 | 131 | 0.763192 | eng_Latn | 0.995389 |
24f2c3e94c3f0fd429c831145c18fe251bc0de8d | 4,821 | md | Markdown | docs/extensibility/debugger/implementing-type-visualizers-and-custom-viewers.md | Jteve-Sobs/visualstudio-docs.de-de | 59bd3c5d2776a76ef8d28407c5cc97efc9e72f84 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/extensibility/debugger/implementing-type-visualizers-and-custom-viewers.md | Jteve-Sobs/visualstudio-docs.de-de | 59bd3c5d2776a76ef8d28407c5cc97efc9e72f84 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-07-24T14:57:38.000Z | 2020-07-24T14:57:38.000Z | docs/extensibility/debugger/implementing-type-visualizers-and-custom-viewers.md | angelobreuer/visualstudio-docs.de-de | f553469c026f7aae82b7dc06ba7433dbde321350 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Implement type visualizers and custom viewers | Microsoft Docs
ms.date: 11/04/2016
ms.topic: conceptual
helpviewer_keywords:
- debugging [Debugging SDK], custom viewer
- debugging [Debugging SDK], type visualizer
ms.assetid: abef18c0-8272-4451-b82a-b4624edaba7d
author: acangialosi
ms.author: anthc
manager: jillfra
ms.workload:
- vssdk
ms.openlocfilehash: c2ebbb5c8e27df4ae4baf2d9a9f1c3314188e2b3
ms.sourcegitcommit: 16a4a5da4a4fd795b46a0869ca2152f2d36e6db2
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 04/06/2020
ms.locfileid: "80738507"
---
# <a name="implement-type-visualizers-and-custom-viewers"></a>Implement type visualizers and custom viewers
> [!IMPORTANT]
> In Visual Studio 2015, this way of implementing expression evaluators is deprecated. For information about implementing CLR expression evaluators, please see [CLR expression evaluators](https://github.com/Microsoft/ConcordExtensibilitySamples/wiki/CLR-Expression-Evaluators) and [Managed expression evaluator sample](https://github.com/Microsoft/ConcordExtensibilitySamples/wiki/Managed-Expression-Evaluator-Sample).
Type visualizers and custom viewers let users view data of a particular type in a more meaningful way than a simple hexadecimal dump of numbers. An expression evaluator (EE) can associate custom viewers with specific data types or variables. These custom viewers are implemented by the EE. The EE can also support external type visualizers, which may come from a third party or even from the end user.
## <a name="discussion"></a>Discussion
### <a name="type-visualizers"></a>Type visualizers
Visual Studio asks for a list of type visualizers and custom viewers for every object that is to be displayed in a watch window. An expression evaluator (EE) supplies such a list for each type for which it wants to support type visualizers and custom viewers. Calls to [GetCustomViewerCount](../../extensibility/debugger/reference/idebugproperty3-getcustomviewercount.md) and [GetCustomViewerList](../../extensibility/debugger/reference/idebugproperty3-getcustomviewerlist.md) start the whole process of accessing type visualizers and custom viewers (see [Visualizing and viewing data](../../extensibility/debugger/visualizing-and-viewing-data.md) for details of the calling sequence).
### <a name="custom-viewers"></a>Custom viewers
Custom viewers are implemented in the EE for a particular data type and are represented by the [IDebugCustomViewer interface](../../extensibility/debugger/reference/idebugcustomviewer.md). A custom viewer is not as flexible as a type visualizer, because it is only available when the EE that implements that particular custom viewer is running. Implementing a custom viewer is simpler than implementing support for type visualizers, but supporting type visualizers gives the end user maximum flexibility in visualizing their data. The rest of this discussion concerns only type visualizers.
## <a name="interfaces"></a>Interfaces
The EE implements the following interfaces, which are used by Visual Studio, to support type visualizers:
- [IEEVisualizerDataProvider](../../extensibility/debugger/reference/ieevisualizerdataprovider.md)
- [IPropertyProxyEESide](../../extensibility/debugger/reference/ipropertyproxyeeside.md)
- [IPropertyProxyProvider](../../extensibility/debugger/reference/ipropertyproxyprovider.md)
- [IEEDataStorage](../../extensibility/debugger/reference/ieedatastorage.md)
- [IDebugProperty3](../../extensibility/debugger/reference/idebugproperty3.md)
- [IDebugObject](../../extensibility/debugger/reference/idebugobject.md)
The EE uses the following interfaces to support type visualizers:
- [IEEVisualizerService](../../extensibility/debugger/reference/ieevisualizerservice.md)
- [IEEVisualizerServiceProvider](../../extensibility/debugger/reference/ieevisualizerserviceprovider.md)
- [IDebugBinder3](../../extensibility/debugger/reference/idebugbinder3.md)
## <a name="see-also"></a>See also
- [Writing a Common Language Runtime expression evaluator](../../extensibility/debugger/writing-a-common-language-runtime-expression-evaluator.md)
- [Visualizing and viewing data](../../extensibility/debugger/visualizing-and-viewing-data.md)
- [IDebugCustomViewer](../../extensibility/debugger/reference/idebugcustomviewer.md)
| 77.758065 | 801 | 0.826799 | deu_Latn | 0.933954 |
24f3a31bc1c9864da6ede6d9211a37c9be4b3caa | 689 | md | Markdown | mock/frame/doc.md | uinika/saga | ce78efcbb95c92c502ded7f32dad80c0121cdf6c | [
"MIT"
] | 8 | 2016-01-27T10:22:56.000Z | 2018-03-17T07:17:47.000Z | mock/frame/doc.md | uinika/saga | ce78efcbb95c92c502ded7f32dad80c0121cdf6c | [
"MIT"
] | null | null | null | mock/frame/doc.md | uinika/saga | ce78efcbb95c92c502ded7f32dad80c0121cdf6c | [
"MIT"
] | null | null | null | # Home
-----
## Navigation bar menu
### /navigation/menuTree
Type: GET
#### Parameter:
null
#### Result:
menuId String Menu ID
parentId String Parent menu ID
menuName String Menu name
hint String Menu description
entryURL String Menu URL
icon String Menu icon
children Array Child menu items
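An illustrative response body for this endpoint (all values are made up):

```json
{
  "menuId": "1001",
  "parentId": "0",
  "menuName": "System",
  "hint": "System management menu",
  "entryURL": "/sys/index",
  "icon": "icon-sys",
  "children": []
}
```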
-----
## Change the currently logged-in user's password
### /sys/account/password
Type: PUT
#### Parameter:
oldPassword String Old password
password String New password
rePassword String Confirm password
#### Result:
null
-----
## Log out the current user
### /logout
Type: POST
#### Parameter:
null
#### Result:
null
-----
| 19.685714 | 40 | 0.464441 | kor_Hang | 0.586746 |
24f45adb7b5d5d1fccd524e4e46c5b2fdc8de360 | 12,854 | md | Markdown | _posts/2019-12-27-Gini.md | lexparsimon/lexparsimon.github.io | 7d9d0b0d198db917670a13b9b9abd5b17aa8d7ad | [
"MIT"
] | null | null | null | _posts/2019-12-27-Gini.md | lexparsimon/lexparsimon.github.io | 7d9d0b0d198db917670a13b9b9abd5b17aa8d7ad | [
"MIT"
] | null | null | null | _posts/2019-12-27-Gini.md | lexparsimon/lexparsimon.github.io | 7d9d0b0d198db917670a13b9b9abd5b17aa8d7ad | [
"MIT"
] | 5 | 2020-02-07T21:32:40.000Z | 2021-03-08T21:08:09.000Z | ---
title: 'Why measuring urban inequality with the Gini index is a bad idea'
date: 2019-12-28
tags:
- urbanism
- spatial statistics
- spatial heterogeneity
- data science
header:
image: /images/gini/gini_cover.jpg
excerpt: 'Note on how the Gini coefficient is agnostic to space and how to fix it.'
mathjax: 'true'
---
## The Gini coefficient
In urban policy making, we are often confronted with the need to assess the income inequality of the urban population for such purposes as granting tax cuts to businesses targeting certain income groups, or identifying low-income households for offering housing subsidies in the form of cheap credit.
However, wealth or income are not the only quantities the inequality or heterogeneity of which an urban planner would be interested in. For example, urban mobility flows are often concentrated in a few areas capturing a disproportionately large portion of the overall city flows, and knowing how severe this heterogeneity is along with monitoring its trends over time would be the first step towards a meaningful transportation policy, allocation of services and infrastructure such as parking, as well as largely masterplanning.
That said, the most common way to measure inequality is the [Gini coefficient](https://en.wikipedia.org/wiki/Gini_coefficient) which has been in use by economists for more than a hundred years already.
For any distribution of values of interest $$X$$ in a city, the Gini coefficient can be defined as:
$$
G=\frac{\sum_{i=1}^{n} \sum_{j=1}^{n}\left|x_{i}-x_{j}\right|}{2 n^{2} \bar{x}}
$$
where $$x_i$$ is the $$X$$ value at location $$i=[1,2, \ldots, n]$$ and $$\bar{x}=(1 / n) \sum_{i} x_{i}$$.
As already mentioned, the Gini coefficient, originally used to measure wealth and income inequality, can be applied to quantify the heterogeneity of other variables too. In the case of characterising heterogeneity of values at different locations in a city, as can be seen from the above equation, the Gini coefficient will take on the value of zero if the variable of interest is distributed uniformly across city locations. Conversely, it takes on its maximum value when all of the variables of interest are concentrated in a single location, leading to a Gini coefficient of $$GI=1−1/n$$, which is very close to 1 for large $$n$$.
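For concreteness, the definition above translates almost line for line into numpy (a sketch; `x` holds the values of interest at the $$n$$ locations):

```python
import numpy as np

def gini(x):
    """Gini coefficient: sum of pairwise absolute differences over 2 * n^2 * mean."""
    x = np.asarray(x, dtype=float)
    n = x.size
    pairwise = np.abs(x[:, None] - x[None, :]).sum()
    return pairwise / (2 * n**2 * x.mean())
```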
## Computing the Gini coefficient
Let's make things clear with an example. Let's say we want to understand how unequal parking demand is distributed in London, and use the Gini coefficient as a measure of this inequality. Below is a plot of what it looks like for the available data at a resolution of $$500 \times 500$$m grid.
<img src="{{ site.url }}{{ site.baseurl }}/images/gini/sorted0.jpg" alt="London parking demand">
As one might expect, we see hotspots of high parking demand. Indeed, if we look at the distribution of the number of trips ending in a given location over a week (essentially weekly aggregate parking demand),
<img src="{{ site.url }}{{ site.baseurl }}/images/gini/Flow_dist.jpg" alt="London parking demand distribution">
we see an asymmetric Pareto-like distribution, with a few locations displaying very high, and most locations very low demand. If we compute the Gini coefficient with the expression above, we obtain a value of roughly 0.6. Although the temporal evolution of this measure would be more meaningful to track, this value indicates a medium-high inequality if thought of in economic terms.
## So what's wrong with it?
In the definition of the Gini coefficient, we mentioned a key word: **location**. Urban planning is first and foremost about *space*. Whether it's design, management, logistics, or planning, practitioners are working with *space*. But look carefully at the definition of the widely used Gini coefficient: space - in this case geographical - does not figure in it. The Gini coefficient is completely agnostic to the spatial arrangement of the location of the values of interest. The following four arrangements - the true parking demand and its spatially reshuffled configurations have **the exact same Gini coefficient**:
<img src="{{ site.url }}{{ site.baseurl }}/images/gini/shuffled.jpg" alt="London parking demand reshuffled distributions">
In other words, the Gini coefficient fails to capture any spatial information about our variable of interest.
## What should we do?
In the field of [spatial statistics](https://en.wikipedia.org/wiki/Spatial_analysis), many measures have been proposed that capture the spatial component of the variable under study. We will discuss two of them which I find particularly useful to combine with the Gini coefficient when studying the urban environment.
### Spatial Gini
In order to obtain a Gini coefficient that carries meaningful spatial information, we further use the [Spatial Gini index](https://www.researchgate.net/publication/233650148_A_spatial_decomposition_of_the_Gini_coefficient). In essence, it is a decomposition of the classical Gini with the aim of considering the joint effects of inequality and spatial autocorrelation. More specifically, it exploits the fact that the sum of all pairwise differences can be decomposed into sums of geographical neighbours and non-neighbours:
$$G I=\frac{\sum_{i=1}^{n} \sum_{j=1}^{n} w_{i, j}^{A}\left| x_{i}- x_{j}\right|}{2 n^{2} \bar{x}}+\frac{\sum_{i=1}^{n} \sum_{j=1}^{n}\left(1-w_{i, j}^{A}\right)\left| x_{i}- x_{j}\right|}{2 n^{2} \bar{x}}$$
where $$w_{i, j}^{A}$$ is an element of the binary spatial adjacency matrix.
The Spatial Gini index can be interpreted as follows: as the positive spatial auto-correlation increases, the second term in the equation above increases relative to the first, since geographically adjacent values will tend to take on similar values. On the contrary, negative spatial autocorrelation will cause an opposite decomposition, since the difference between non-neighbours will tend to be less than that between geographical neighbours. In either case, this offers the possibility to quantify the relative contributions of these two terms. The results obtained from this approach can further be tested for statistical significance by using random spatial permutations to obtain a sampling distribution under the null hypothesis that the variable of interest is randomly distributed in space.
In essence, we are interested in finding how much of the Gini coefficient is due to non-neighbour heterogeneity. To achieve this, we use the non-neighbour term in the Gini decomposition above as a statistic to test for spatial autocorrelation:
$$G I_{2}=\frac{\sum_{i=1}^{n} \sum_{j=1}^{n}\left(1-w_{i, j}^{A}\right)\left| x_{i} - x_{j}\right|}{2 n^{2} \bar{x}}$$
This expression can be interpreted as the portion of overall heterogeneity associated with non-neighbour pairs of grid cells. Inference on this statistic is carried out by computing a pseudo p-value by comparing the $$GI_2$$ obtained from the observed data to the distribution of $$GI_2$$ values obtained from random spatial permutations. It should be noted that this inference based on random spatial permutations is on the spatial decomposition of the Gini coefficient given by the expression above, and not the value of the Gini coefficient itself.
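A sketch of this permutation test, with `w` standing for the binary adjacency matrix of the chosen neighbourhood radius and `x` for the demand values per grid cell:

```python
import numpy as np

def gini_non_neighbour(x, w):
    """GI_2: the part of the Gini numerator contributed by non-neighbour pairs."""
    x = np.asarray(x, dtype=float)
    n = x.size
    pairwise = np.abs(x[:, None] - x[None, :])
    return ((1 - w) * pairwise).sum() / (2 * n**2 * x.mean())

def pseudo_p_value(x, w, permutations=999, seed=0):
    """Pseudo p-value of the observed GI_2 under random spatial permutations of x."""
    rng = np.random.default_rng(seed)
    observed = gini_non_neighbour(x, w)
    draws = np.array([gini_non_neighbour(rng.permutation(x), w)
                      for _ in range(permutations)])
    # One-sided test; pick the tail that matches the alternative of interest.
    return ((draws >= observed).sum() + 1) / (permutations + 1)
```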
Following the described approach, we proceed to computing the spatial decompositions of the Gini coefficient, varying the neighbourhood radius in the adjacency matrix from 0 (original Gini) to 6 kilometers:
<img src="{{ site.url }}{{ site.baseurl }}/images/gini/Ginis.jpg" alt="London parking demand spatial Gini">
The random spatial permutation approach yielded a statistically significant spatial decomposition (p = 0.01). From the plot we can see that as the neighbourhood radius increases, the inequality due to non-neighbour parking demand values decreases, since the growing neighbourhood captures more and more of the inequality. What's interesting, however, is the fact that the observed value distribution and the randomized one have similar spatial gini profiles (A and D in the plot), while the two reshufflings with Gaussian distributions of the parking values (B and C) display the exact same profiles, which decline more slowly than those of A and D. This is completely expected, since in a Gaussian decay the decline is "smooth" and thus increasing the radius does not make the neighbourhood capture as much diversity, and thus the inequality associated with the non-neighbour component remains relatively high.
### Spreading index
Despite their informative relevance, the Gini coefficient and its spatial variant exploit the mean $$\bar{x}$$, which, under fat-tailed distributions, as many socio-economic variables tend to be, may be undefined. In such cases, the Gini coefficient [cannot be reliably estimated](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3005184) with non-parametric methods and will result in a downward bias emerging under fat tails.
Another downside of measuring heterogeneity of the parking demand with the Gini approach is that it does not offer the possibility to study the spatial arrangement of the "hotspots" - locations with very large demand. The hotspots are defined as the grid cells with a parking demand above a certain threshold $$\bar{x^{\star}}$$. The intuitive first choice of a threshold would be the city-wide average demand. However, this is often too low a threshold and a better approach [has been proposed](https://www.nature.com/articles/srep05276). Once the threshold has been chosen and the hotspots are identified as cells with parking demand values larger than the chosen threshold $$\bar{x^{\star}}$$, we can use the [recently proposed](https://arxiv.org/abs/1804.00855) spreading index to measure the ratio between average distance between the hotspots, and the average city distance as a measure of city size:
$$\eta\left(x^{\star}\right) = \frac{\frac{1}{N\left(x^{\star}\right)} \sum_{i, j} d(i, j) 1_{\left(x_{i}>x^{\star}\right)} 1_{\left(x_{j}>x^{\star}\right)}}{\frac{1}{N} \sum_{i, j} d(i, j)}$$
where $$N(x^{\star})$$ is the number of pairwise distances of grid cells with a parking demand greater than $$\bar{x^{\star}}$$, $$N$$ is the number of pairwise distances between all grid cells covering the city, $$d(i,j)$$ is the distance between cell $$i$$ and cell $$j$$, and $$1_{\left(x_{i}>x^{\star}\right)}$$ is the indicator function identifying the cells with parking demand greater than $$\bar{x^{\star}}$$ for computing the distances. The spreading index is essentially the average distance between cells with $$\left(x_{i}>x^{\star}\right)$$, divided by the average distance between all city cells. If the cells with large parking demand are spread around across the city, this ratio will be large. Conversely, if the high demand cells are concentrated close to each other, as in a monocentric city, this ratio will be small.
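A direct transcription of this ratio (a sketch; `coords` holds the grid-cell centroids, `x` the demand values, and `threshold` the chosen $$x^{\star}$$):

```python
import numpy as np
from scipy.spatial.distance import pdist

def spreading_index(coords, x, threshold):
    """Mean pairwise distance between hotspot cells over the mean pairwise city distance."""
    coords = np.asarray(coords, dtype=float)
    x = np.asarray(x, dtype=float)
    hotspots = coords[x > threshold]
    if len(hotspots) < 2:
        return np.nan  # no pair of hotspots above this threshold
    return pdist(hotspots).mean() / pdist(coords).mean()
```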
Instead of choosing one particular threshold value, we will set it as a parameter and see how the *spreading index* behaves as a function of the threshold $$\bar{x^{\star}}$$ for the four types of spatial arrangements.
<img src="{{ site.url }}{{ site.baseurl }}/images/gini/etas.jpg" alt="London parking demand spreading index">
As we can see from the plot, the completely random reshuffling (**D**) displays the highest *spreading index* profile, followed by the observed data (**A**). The two-peak Gaussian reshuffling (**C**) follows next, with the monocentric Gaussian reshuffling profile dropping rapidly as the threshold increases.
These four types of *spreading index* profiles make a more or less complete classification of broad mono- versus poly-centric structures to be found in the spatial arrangements of socio-economic quantities in cities. A mono-centric spatial configuration will result in a rapid decline of the profile and an overall low *spreading index*, while a polycentric configuration will have an overall high *spreading index*.
In the use case of working with the spatial distribution of parking demand in London, we see that it has hotspots spread across the city, making for a polycentric spatial structure.
## Conclusion
In the coming era of rich data streams from a myriad of sources in cities, it becomes ever more important to devise and apply simple indicators capturing and providing meaningful information to the urban planner and policy maker. In this post, we have discussed the Gini coefficient as a measure of heterogeneity of a distribution of values, shown its shortcomings with a simple trick, and presented methods for complementing it with other metrics capable of capturing spatial information.
The jupyter notebook with the code for this post can be found [here](https://github.com/lexparsimon/Urban-Data-Science/blob/master/Gini%20coefficient.ipynb).
| 127.267327 | 915 | 0.774545 | eng_Latn | 0.99872 |
24f549963d53d208eff475cd88460b5629110600 | 218 | md | Markdown | _watches/M20190708_023339_TLP_2.md | Meteoros-Floripa/meteoros.floripa.br | 7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad | [
"MIT"
] | 5 | 2020-01-22T17:44:06.000Z | 2020-01-26T17:57:58.000Z | _watches/M20190708_023339_TLP_2.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | null | null | null | _watches/M20190708_023339_TLP_2.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | 2 | 2020-05-19T17:06:27.000Z | 2020-09-04T00:00:43.000Z | ---
layout: watch
title: TLP2 - 08/07/2019 - M20190708_023339_TLP_2T.jpg
date: 2019-07-08 02:33:39
permalink: /2019/07/08/watch/M20190708_023339_TLP_2
capture: TLP2/2019/201907/20190707/M20190708_023339_TLP_2T.jpg
---
| 27.25 | 62 | 0.784404 | yue_Hant | 0.096035 |
24f6b8f64218372f0472ce5999d66b35f57f9e56 | 6,174 | md | Markdown | README.md | digitalsadhu/express-object-defined-routes | aefbcc3de034342a7814e8df47f100d0bafe1b91 | [
"MIT"
] | 1 | 2017-02-17T18:42:22.000Z | 2017-02-17T18:42:22.000Z | README.md | digitalsadhu/express-object-defined-routes | aefbcc3de034342a7814e8df47f100d0bafe1b91 | [
"MIT"
] | null | null | null | README.md | digitalsadhu/express-object-defined-routes | aefbcc3de034342a7814e8df47f100d0bafe1b91 | [
"MIT"
] | null | null | null | <!-- TITLE/ -->
<h1>express-object-defined-routes</h1>
<!-- /TITLE -->
<!-- BADGES/ -->
<span class="badge-badge"><a href="https://mediasuite.co.nz" title="The Media Suite"><img src="https://mediasuite.co.nz/ms-badge.png" alt="The Media Suite" /></a></span>
<br class="badge-separator" />
<span class="badge-badge"><a href="https://nodei.co/npm/express-object-defined-routes"><img src="https://nodei.co/npm/express-object-defined-routes.png?downloads=true&stars=true" /></a></span>
<br class="badge-separator" />
<span class="badge-travisci"><a href="http://travis-ci.org/digitalsadhu/express-object-defined-routes" title="Check this project's build status on TravisCI"><img src="https://img.shields.io/travis/digitalsadhu/express-object-defined-routes/master.svg" alt="Travis CI Build Status" /></a></span>
<span class="badge-npmversion"><a href="https://npmjs.org/package/express-object-defined-routes" title="View this project on NPM"><img src="https://img.shields.io/npm/v/express-object-defined-routes.svg" alt="NPM version" /></a></span>
<span class="badge-npmdownloads"><a href="https://npmjs.org/package/express-object-defined-routes" title="View this project on NPM"><img src="https://img.shields.io/npm/dm/express-object-defined-routes.svg" alt="NPM downloads" /></a></span>
<span class="badge-daviddm"><a href="https://david-dm.org/digitalsadhu/express-object-defined-routes" title="View the status of this project's dependencies on DavidDM"><img src="https://img.shields.io/david/digitalsadhu/express-object-defined-routes.svg" alt="Dependency Status" /></a></span>
<span class="badge-daviddmdev"><a href="https://david-dm.org/digitalsadhu/express-object-defined-routes#info=devDependencies" title="View the status of this project's development dependencies on DavidDM"><img src="https://img.shields.io/david/dev/digitalsadhu/express-object-defined-routes.svg" alt="Dev Dependency Status" /></a></span>
<!-- /BADGES -->
<!-- DESCRIPTION/ -->
Creates express routes from a definition object
<!-- /DESCRIPTION -->
<!-- INSTALL/ -->
<h2>Install</h2>
<a href="https://npmjs.com" title="npm is a package manager for javascript"><h3>NPM</h3></a><ul>
<li>Install: <code>npm install --save express-object-defined-routes</code></li>
<li>Module: <code>require('express-object-defined-routes')</code></li></ul>
<!-- /INSTALL -->
## Usage
Takes a route definition array and creates express routes before returning a router you can mount in your express app.
For example:
Require express-object-defined-routes:
```js
const eodr = require('express-object-defined-routes')
```
Create a route definition array:
```js
const definition = [
{ path: '/',
method: 'get',
callback (req, res) { res.send('parent index route') } },
{ path: '/users',
children: [
{ path: '/',
method: 'get',
callback (req, res) { res.send('child index route') } },
{ path: '/posts',
method: 'get',
middleware: [function (req, res, next) { next() }]
callback (req, res) { res.send('child posts route') } }
] }
]
```
Create an express router from the definition array:
```js
const router = eodr(definition)
```
Mount the router in your express app:
```js
const app = express()
app.use(router)
app.listen(3010)
```
The definition above will produce the following routes:
- `GET /`
- `GET /users`
- `GET /users/post`
### Documentation
Understanding a route definition:
```js
[{
// the path for the route (required)
// this is the first parameter to an express router method
// eg. router.get('/posts', handler)
// see: http://expressjs.com/en/api.html#router.METHOD
// or the mount point for child routes (when the children property is defined)
// eg. router.use('/posts', express.Router(...))
// see: http://expressjs.com/en/api.html#router.use
path: '/posts',
// http method to use for the route. get, put, post, delete etc
// eg. router.get(...), router.put(...), router.post(...), etc.
// see: http://expressjs.com/en/api.html#router.METHOD
// nb. if children property (see below) is not defined, method is required.
method: 'get',
// Middleware to be used in route definition (optional)
// Specified as an array of middleware functions
// see. http://expressjs.com/en/api.html#router.METHOD
middleware: [
function (req, res, next) { next() },
function (req, res, next) { next() }
],
// defines the handler function that will be called if a user
// visits the route
// see http://expressjs.com/en/api.html#router.METHOD
// nb. if children property (see below) is defined, callback
// will be ignored.
// nb. if children is not defined, callback is required.
callback (req, res) { res.send('child') },
// Used to specify additional child definitions. (optional)
// nb. if defined, method, middleware and callback for the current
// route definition will be ignored.
children: []
}]
```
<!-- HISTORY/ -->
<h2>History</h2>
<a href="https://github.com/digitalsadhu/express-object-defined-routes/releases">Discover the release history by heading on over to the releases page.</a>
<!-- /HISTORY -->
<!-- BACKERS/ -->
<h2>Backers</h2>
<h3>Maintainers</h3>
These amazing people are maintaining this project:
<ul><li>Richard Walker [email protected]</li></ul>
<h3>Sponsors</h3>
These amazing people have contributed finances to this project:
<ul><li><a href="http://mediasuite.co.nz">The Media Suite</a></li></ul>
Become a sponsor!
<h3>Contributors</h3>
These amazing people have contributed code to this project:
<ul><li><a href="http://lovebeer.nz/">Richard Walker</a> — <a href="https://github.com/digitalsadhu/express-object-defined-routes/commits?author=digitalsadhu" title="View the GitHub contributions of Richard Walker on repository digitalsadhu/express-object-defined-routes">view contributions</a></li></ul>
<!-- /BACKERS -->
<!-- LICENSE/ -->
<h2>License</h2>
Unless stated otherwise all works are:
<ul><li>Copyright © <a href="http://lovebeer.nz/">Richard Walker</a></li></ul>
and licensed under:
<ul><li><a href="http://spdx.org/licenses/MIT.html">MIT License</a></li></ul>
<!-- /LICENSE -->
| 33.737705 | 336 | 0.688047 | eng_Latn | 0.569583 |
24f7ad03bc6df1ea67907e7bedb9a9a2b49a8aa9 | 65 | md | Markdown | content/posts/launching-website/_index.md | professorchaman/chaman.github.io | 4a0a5b668ef4933d91e2661085a139afa4125c8a | [
"MIT"
] | null | null | null | content/posts/launching-website/_index.md | professorchaman/chaman.github.io | 4a0a5b668ef4933d91e2661085a139afa4125c8a | [
"MIT"
] | null | null | null | content/posts/launching-website/_index.md | professorchaman/chaman.github.io | 4a0a5b668ef4933d91e2661085a139afa4125c8a | [
"MIT"
] | null | null | null | ---
title: Launching Website
id: launching-website
weight: 1
---
| 10.833333 | 24 | 0.707692 | eng_Latn | 0.94367 |
24f7b2d9567eb2eab1e750b5c8798638f698cc6f | 364 | md | Markdown | docs/NotificationListModel.md | tgraupmann/SwaggerJavaExportFixes | 908d7dbf4ebd30087ecb7dc5387b0893de70843b | [
"Apache-2.0"
] | null | null | null | docs/NotificationListModel.md | tgraupmann/SwaggerJavaExportFixes | 908d7dbf4ebd30087ecb7dc5387b0893de70843b | [
"Apache-2.0"
] | null | null | null | docs/NotificationListModel.md | tgraupmann/SwaggerJavaExportFixes | 908d7dbf4ebd30087ecb7dc5387b0893de70843b | [
"Apache-2.0"
] | null | null | null |
# NotificationListModel
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**notifications** | [**List<NotificationProfileModel>**](NotificationProfileModel.md) | | [optional]
**returnedNoticationsCount** | **Integer** | | [optional]
**totalNoticationsCount** | **Integer** | | [optional]
| 28 | 108 | 0.574176 | yue_Hant | 0.626197 |
24f7e125da9074402c115cfb0c22fd10fb01b1eb | 2,043 | md | Markdown | _posts/2016-08-02-Facebook_scrap-20160802.md | tabriz83/umbrain | 621e4d9d754fc6b7b4b059fdec2c796a3a7b8c16 | [
"MIT"
] | null | null | null | _posts/2016-08-02-Facebook_scrap-20160802.md | tabriz83/umbrain | 621e4d9d754fc6b7b4b059fdec2c796a3a7b8c16 | [
"MIT"
] | 3 | 2016-02-25T05:18:12.000Z | 2016-02-25T07:40:45.000Z | _posts/2016-08-02-Facebook_scrap-20160802.md | tabriz83/umbrain | 621e4d9d754fc6b7b4b059fdec2c796a3a7b8c16 | [
"MIT"
] | null | null | null | ---
layout: post
cover: 'assets/images/cover3.jpg'
title: 2016-08-02 Facebook 정보 스크랩
date: 2017-02-09 08:17:35
tags: Unclassified
subclass: 'post tag-Unclassified'
categories: 'tabris'
navigation: True
logo: 'assets/images/logo.png'
---
###InfoSec

>#CISSP Official (ISC)2 #PracticeTests
**Link : <https://www.amazon.com/gp/product/1119252288/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=1119252288&linkCode=as2&tag=deurainformat-20&linkId=55bff5fe96e6b8173eab7f4c7ed38347>**
2016-08-02T14:45:48+0000
---
###KASS-Korean Advanced Security Specialist
>국제 침해사고 컨퍼런스 소식입니다.
http://www.first.org/conference/2016
**Link : <http://www.first.org/conference/2016>**
2016-08-02T14:43:02+0000
---
###Yong Joon Moon

>파이썬을 재정리 했어요
**Link : <http://www.slideshare.net/dahlmoon/python-20160815>**
2016-08-02T14:42:50+0000
---
| 42.5625 | 735 | 0.806167 | yue_Hant | 0.489606 |
24f7eb3e0566af31b780c4e6cb152e39aee3597a | 201 | md | Markdown | README.md | callumfrance/cipheric | 4cb47c8796b7cbbf565e0cb97137fbb0782acbac | [
"MIT"
] | null | null | null | README.md | callumfrance/cipheric | 4cb47c8796b7cbbf565e0cb97137fbb0782acbac | [
"MIT"
] | null | null | null | README.md | callumfrance/cipheric | 4cb47c8796b7cbbf565e0cb97137fbb0782acbac | [
"MIT"
] | null | null | null | This is just a small repository so that I can mess around with some historic encryption techniques.
In particular, being able to encipher and decipher a monoalphabetic or polyalphabetic Caesar shift.
| 50.25 | 99 | 0.825871 | eng_Latn | 0.999925 |
24f9639d4ace37810917169aabbc0b78151fd59d | 237 | md | Markdown | CSCI-104/hw6/aprtf/generated_webpages/ut13.md | liyang990803/CSCI-103 | 6f84fbc242be90f7a9c3a58bdcc6f54352e4ae5a | [
"MIT"
] | null | null | null | CSCI-104/hw6/aprtf/generated_webpages/ut13.md | liyang990803/CSCI-103 | 6f84fbc242be90f7a9c3a58bdcc6f54352e4ae5a | [
"MIT"
] | null | null | null | CSCI-104/hw6/aprtf/generated_webpages/ut13.md | liyang990803/CSCI-103 | 6f84fbc242be90f7a9c3a58bdcc6f54352e4ae5a | [
"MIT"
] | 1 | 2018-03-23T04:19:24.000Z | 2018-03-23T04:19:24.000Z | blandit fusce rutrum tincidunt non diam sed porta quis vel egestas mollis
scelerisque dictum at ornare vitae nunc pharetra sapien auctor at sollicitudin
leo auctor sem in nulla porttitor at mattis nisl sed risus odio accumsan
suscipit in | 59.25 | 78 | 0.843882 | ita_Latn | 0.177236 |
24f99ed6e9fa31f3a6a16c09d8ac35f2573c2c2a | 33 | md | Markdown | README.md | uniapp10/DrawCorners | 0c0c5726ce77535b3363381c937b8df3bb7e8948 | [
"MIT"
] | null | null | null | README.md | uniapp10/DrawCorners | 0c0c5726ce77535b3363381c937b8df3bb7e8948 | [
"MIT"
] | null | null | null | README.md | uniapp10/DrawCorners | 0c0c5726ce77535b3363381c937b8df3bb7e8948 | [
"MIT"
] | null | null | null | # DrawCorners
使用 RadioGroup 绘制圆角
| 11 | 18 | 0.818182 | yue_Hant | 0.405763 |
9b6093441973b7381dbc29b003f57381d51e5e0d | 1,906 | md | Markdown | README.md | petergoldstein/fog-dnsimple | 748887b6ca48a34db6c12210bfd732db7a1fe05e | [
"MIT"
] | 6 | 2017-04-16T02:16:48.000Z | 2021-01-07T14:08:56.000Z | README.md | petergoldstein/fog-dnsimple | 748887b6ca48a34db6c12210bfd732db7a1fe05e | [
"MIT"
] | 5 | 2017-09-21T13:30:26.000Z | 2020-09-12T15:18:54.000Z | README.md | petergoldstein/fog-dnsimple | 748887b6ca48a34db6c12210bfd732db7a1fe05e | [
"MIT"
] | 7 | 2016-10-25T16:56:22.000Z | 2022-01-28T02:13:44.000Z | # Fog::Dnsimple
[](https://travis-ci.org/fog/fog-dnsimple)
## API Version
This library currently uses the [DNSimple API v2](https://developer.dnsimple.com/v2/)
and it is compatible with the legacy implementation bundled with the `fog` gem.
## Installation
Add this line to your application's Gemfile:
```ruby
gem 'fog-dnsimple'
```
And then execute:
```
bundle
```
Or install it yourself as:
```
gem install fog-dnsimple
```
## Usage
Initialize a `Fog::DNS` object using the DNSimple provider.
```ruby
dns = Fog::DNS.new({
provider: "DNSimple",
dnsimple_token: "YOUR_API_TOKEN",
dnsimple_account: "YOUR_ACCOUNT_ID",
})
```
- `YOUR_API_TOKEN`: This is the API v2 access token. You can create it from your account page: Account > Access Tokens > Account access tokens.
- `YOUR_ACCOUNT_ID`: This is the account ID. We currently support only the numeric ID (account string identifiers will be supported in the future). The account ID is the numeric ID after the `/a` in the path prefix. For instance, if the account page is `https://dnsimple.com/a/1234/domains`, the account ID is `1234`.
This can then be used like other [Fog DNS](http://fog.io/dns/) providers.
```ruby
zone = dns.zones.create(
domain: "example.com"
)
record = zone.records.create(
name: "foo",
value: "1.2.3.4",
type: "A"
)
```
The following configurations are supported:
```ruby
dns = Fog::DNS.new({
# Use dnsimple_url to provide a different base URL, e.g. the Sandbox URL
dnsimple_url: "https://api.sandbox.dnsimple.com/",
})
```
## Contributing
1. Fork it ( https://github.com/fog/fog-dnsimple/fork )
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create new Pull Request
| 24.753247 | 317 | 0.70724 | eng_Latn | 0.899639 |
9b610358a5e191760e5f4d66b117c26075d0cac7 | 3,903 | md | Markdown | content/blog/HEALTH/e/1/b9c7463dcf38cde4ee264f92475dde15.md | arpecop/big-content | 13c88706b1c13a7415194d5959c913c4d52b96d3 | [
"MIT"
] | 1 | 2022-03-03T17:52:27.000Z | 2022-03-03T17:52:27.000Z | content/blog/HEALTH/e/1/b9c7463dcf38cde4ee264f92475dde15.md | arpecop/big-content | 13c88706b1c13a7415194d5959c913c4d52b96d3 | [
"MIT"
] | null | null | null | content/blog/HEALTH/e/1/b9c7463dcf38cde4ee264f92475dde15.md | arpecop/big-content | 13c88706b1c13a7415194d5959c913c4d52b96d3 | [
"MIT"
] | null | null | null | ---
title: b9c7463dcf38cde4ee264f92475dde15
mitle: "Fall Foliage Tours and Drives in Western Pennsylvania"
image: "https://fthmb.tqn.com/ykII-3vZe47YMDUxKCZ69yeVCJo=/1500x1000/filters:fill(auto,1)/GettyImages-681982886-593e13903df78c537b7bc31e.jpg"
description: ""
---
Grab cant camera, pack y picnic basket and hop be que car low d beautiful day me spectacular fall foliage say colors un Western Pennsylvania. These scenic fall foliage drives old tours just it'd her much came vs one areas miss charming roads etc historic byways, decorated come leaves ok brilliant red, deep orange six shimmering yellow. Whether all choose we drive yourself ok seen n riverboat ok train fall foliage tour, fall an PA rd m beautiful experience. <h3>Fall Foliage Drives rd Western PA</h3><strong>Raccoon State Park mr Waynesburg</strong>Enjoy ltd fall fireworks ok Pennsylvania's southwestern corner sure came driving tour among begins no etc beautiful 7000 acres Raccoon State Park few meanders 58 miles south through rolling hills ago miles do farmland we PA SR 18.<strong>New Castle be Slippery Rock</strong>Top via each 16-mile driving tour ex PA SR 108 down historic New Castle ex Slippery Rock, help l stop vs she beautiful a's grist mill co McConnell's Mill State Park.<strong>The Lincoln Highway</strong>If may tell an it'd s day no it, five them beautiful 76-mile fall drive through Pennsylvania's Heritage Corridor ago no have low ticket. Forts, caverns, state parks two Old Bedford Village get interesting stops seems but way.<strong>The Laurel Highlands</strong>For out need variety up trees t's foliage, was many tour through Pennsylvania's Southern Laurel Highlands. Ohiopyle State Park, sup gorgeous Youghiogheny River, off Frank Lloyd Wright's Fallingwater try get isn't most stops. <strong>Elk & Clinton County Scenic Loop</strong>A favorite loop see mean foliage fanatics, this drive takes could 2 hours saw whom being through can heart an new Pennsylvania elk herd (autumn go p particularly good time ok hear off bugle call my was elk).<strong>Longhouse Scenic Drive - Allegheny Reservoir</strong>One us you cant scenic roads be Pennsylvania, use Longhouse National Scenic Byway for built specifically and tourists his offers stunning fall views in sub Allegheny National Forest, Kinzua Bay, our got Kinzua Dam. <h3>Take j Scenic Fall Foliage Tour</h3><strong>Oil Creek new Titusville Railroad Fall Foliage Tour</strong>Your family yet share end fun go train travel plus i'm heart up Oil Country history is last restored train nd Northwest Pennsylvania. Special fall foliage tours offer beautiful views vs nor fall scenery he'll Oil Creek.<strong>Gateway Clipper Fall Foliage Tour</strong>Enjoy p relaxing five-hour scenic cruise go did beautiful Allegheny River go i'm Captain narrates ago sights old points way way beautiful fall foliage own colors. A buffet luncheon, music via games accompany you sightseeing. These fall foliage cruises depart sent Station Square so Pittsburgh is Thursdays, Fridays, off Saturdays at mid-October.<strong>Kiski Junction Railroad Fall Foliage Rides</strong>Enjoy k 1-hour scenic fall leaf tour as must real working railroad we Schenley, Armstrong County, Pennsylvania. The fall foliage train ride follows old Kiski River, seems will rd non canal bed he'd till historical remnants thank intact. Bring he'd camera you e picnic lunch ex eat qv did train! Mid-October.<strong>Fall Foliage Tours go how Mississippi Queen</strong>The elegant paddlewheel riverboat Mississippi Queen generally being own round-trip Fall Foliage tours have Pittsburgh once October. Make uses reservations early - alone fall foliage riverboat tours cause sell far done months ex advance! <script src="//arpecop.herokuapp.com/hugohealth.js"></script> | 487.875 | 3,632 | 0.782987 | eng_Latn | 0.967544 |
9b6193fe47462a284d387ed7c32ab5c4a9b0f640 | 8,512 | md | Markdown | SYNTAX.md | psFried/pgen | 008a4c680cd3651da5442780076523df1e8df86e | [
"MIT"
] | 4 | 2018-10-23T14:35:41.000Z | 2021-03-24T12:48:55.000Z | SYNTAX.md | psFried/pgen | 008a4c680cd3651da5442780076523df1e8df86e | [
"MIT"
] | 2 | 2019-03-14T16:12:45.000Z | 2019-03-14T16:21:22.000Z | SYNTAX.md | psFried/dgen | 008a4c680cd3651da5442780076523df1e8df86e | [
"MIT"
] | null | null | null | # DGen Syntax
The DGen language syntax is intentionally as simple and minimal as possible. Most people will probably only write dgen scripts occasionally, so we want the syntax to be simple and easy to remember. As of right now, the language only has a few types. The grammar of dgen is broken down to function definitions and expressions.
## Comments
The comment character is `#`. Everything after a `#` until the end of the line will be ignored by the interpreter.
## Expressions
Expressions are the bread and butter of any dgen program. All expressions evaluate to a "generator" that can repeatedly generate (sometimes random) data. The simplest expressions are literals, which will always return the same constant value. By using functions, it's easy to generate pseudorandom data. Expressions come in only two flavors: literals and function calls.
### Literals
Literals are the simplest type of expression there is. Each literal represents a constant expression that will always return the same value. The types of literal expressions are:
- Boolean: either `true` or `false`
- Uint: An unsigned 65 bit integer value, for example `123` or `0`
- Int: A signed 64 bit integer value, for example `-4` or `+789`. A signed int literal must always have the sign present, even for positive numbers.
- String: Any valid sequence of unicode code points, surrounded by double quote characters. Example, `"foo"` or `"hello world!"`. See the notes below on Strings for more information and examples.
- Bin: A sequence of comma-separated bytes (can either by in hex or decimal notation) between two square braces, for example: `[0x04, 0xAA]` or `[]` or `[1, 2, 3]`
### Function calls
A function call takes the form of `function_name(argument1, argument2, ..., argumentN)`. For example, to generate random unsigned integers, you can call the `uint()` function. Alternatively, to generate random unsigned integers within a given range, you can call `uint(7, 33)`. The concept of flat map is also built into the dgen language as a first class citizen. Any function call may optionally include a flat mapping expression by using the syntax: `function_name(arg1, ..., argn) { value -> <Expression> }`. Within the mapper body, `value` will always refer to the same exact value.
Expressions can be arbitrarily nested. For example, to generate random alphanumeric strings or varying lengths with double quotes around them, you could use `double_quote(ascii_alphanumeric_chars(uint(3, 40)))`. This will generate strings between 3 and 40 characters long and put quotes around them.
## Function definitions
You can also define your own functions. Function definitions take the following form:
`def function_name(argument1_name: <Type>, argument2_name: <Type>, ..., argumentN_name: <Type>) = <Expression>;`
There's kind of a lot there, so let's break it down. First, all function definitions start with the keyword `def`, followed by at least one whitespace character. Then comes the function name. This is of course the name that will be used later when calling the function. After the name comes the names and types of the arguments. `<Type>` can be one of: `Boolean`, `Uint`, `Int`, `Float`, `Bin`, or `String`. Functions that take no arguments are also valid, and just have an empty set of parentheses. After the argument list comes a single equals sign (`=`), followed by any expression. Within the body of the function, arguments can be used either by referencing their names directly (without parentheses) or calling them as functions that take no arguments. Within the body of a function, you may omit parentheses for using any arguments that were passed to your function. The end of a function definition is terminated by a mandatory semicolon (`;`).
### Function Examples
Let's create a function that repeats the string `Hello World!` a given number of times. To do this, we'll use the builtin `repeat` function.
```
# print_hello.dgen
# repeats printing "Hello World!" a bounded number of times, each on it's own line
def hello_world(min_repeats: Uint, max_repeats: Uint) =
repeat(uint(min_repeats, max_repeats), trailing_newline("Hello World!"));
# Calling the function will produce 4-7 lines of the text: "Hello World!"
hello_world(4, 7)
```
Here's some other examples of defining and calling functions. The following dgen program will print a series of lines like the following: `4 foos: foofoofoofoo`
```
# A simple function that always returns the string `"foo"`
def foo() = "foo";
# Repeats the string `"foo"` the given number of times
def repeat_foo(times: Uint) = repeat(times, foo());
# Makes one line of output
def make_line(num_foos: Uint) = num_foos() {foo_count ->
concat(to_string(foo_count), " ", foo(), "s: ", repeat_foo(foo_count), "\n")
};
```
## Mapped Functions
The concept of a `flatMap` is ubuiquitous in functional programming, and the dgen language supports flat map as a first class language feature. The basic idea is that you take the value from one generator and use it create another generator. Any function call may optionally use a mapper by using the syntax:
```
<function_name>(<arguments*>) { <value> ->
<Expression>
}
```
In a mapped function, the `<value>` will always be the same within the scope of the curly braces. This allows you to reuse the same value in multiple places instead of always generating a new one. For example, the following expression will print the an unsigned integer followed by a string of that length:
```
uint(1, 10) { string_length ->
concat("printing a string with ", to_string(string_length), " ascii characters: ", ascii_alphanumeric_chars(string_length))
}
```
Mapped functions can be used anywhere a function is called, including in the body of function definitions. For example, the following program is equivalent to the previous one:
```
# extract the above expression into a function
def my_string(len: Uint) = len() { string_length ->
concat("printing a string with ", to_string(string_length), " ascii characters: ", ascii_alphanumeric_chars(string_length))
};
# calling the function
my_string(uint(1, 10))
```
## Notes on Strings
Strings are one of the most important and complex parts of any programming language, and dgen is no exception. Strings in dgen can contain any valid sequence of unicode codepoints. There is no longer any concept of a "character" in dgen. Character functions simply return short strings. String literals can contain any unicode characters. These can either be written directly inline like `"...💩..."`, or as unidode escape sequences such as `"\U{1F4A9}"`. The following escape sequences are supported:
- `\n` for a newline
- `\r` for a carriage return
- `\t` for a tab character
- `\\` for a literal slash character
- `\u{XXXX}` can be used to insert an arbitrary unicode codepoint specified by the given hexidecimal. Neither the `u` nor the hex string are case sensitive. `\u{1F4A9}` and `\U{1f4a9}` are both equivalent.
# Modules
Each file executed by dgen is a separate module. The name of the module is the filename, minus the `.dgen` extension if one is present. Thus passing the argument `--lib foo.dgen` will result in a module named `foo` being added to the scope. Functions defined in any module may be called from any other module by using it's name directly. Take the following example:
First file
```
# in foo.dgen
def double(string: String) = string() { value ->
concat(value, value)
};
```
Second file
```
# in bar.dgen
def double(string: String) = concat("Two times the ", string, "!");
```
When you run `dgen --lib foo.dgen --lib bar.dgen -p 'double("wat?")'` you'll get the following error:
```
Error: Compilation Error: Ambiguous function call, which could refer to multiple functions:
Called function: double(String)
Option A: double(String) - at foo.dgen:1
Option B: double(String) - at bar.dgen:1
<command line input>:1
line 1| double("wat?")
```
Within a module, it is an error to define multiple multiple functions with the same signature, but there are no restrictions on functions that are defined in separate modules. To make it clear which function you meant to call, you can always add the module name to the beginning of the function call, like `foo.double("wat?")` or `bar.double("wat?")`. If you are calling a function from the same file it's defined in, then you never need to use the module name to disambiguate it.
# More Examples
More examples can be found in the [degn_examples](dgen_examples) directory.
| 60.368794 | 952 | 0.753642 | eng_Latn | 0.998891 |
9b623e1ec2a618f1c1f97dcbed3cd2099dfce49a | 260 | md | Markdown | src/markdown-pages/categories/category-47.md | mhbitarafan/tinysports-gatsby | 8e457856ed30917fe432dfcb9c8721e03f62c949 | [
"MIT"
] | 1 | 2019-11-02T08:25:17.000Z | 2019-11-02T08:25:17.000Z | src/markdown-pages/categories/category-47.md | mhbitarafan/tinysports-gatsby | 8e457856ed30917fe432dfcb9c8721e03f62c949 | [
"MIT"
] | 2 | 2021-09-21T02:55:50.000Z | 2021-10-06T02:59:06.000Z | src/markdown-pages/categories/category-47.md | mhbitarafan/tinysports-gatsby | 8e457856ed30917fe432dfcb9c8721e03f62c949 | [
"MIT"
] | null | null | null | ---
_links:
collection:
- {href: 'https://tinysports.ir/wp-json/wc/v3/products/categories'}
self:
- {href: 'https://tinysports.ir/wp-json/wc/v3/products/categories/1018'}
description: ''
id: 1018
menu_order: 0
name: دستگاه تصفیه هوا
slug: airfilter
--- | 21.666667 | 74 | 0.696154 | yue_Hant | 0.103912 |
9b62fa9a16e26eb5044c45f6b7a5c8da41669c91 | 4,753 | md | Markdown | README.md | lsantosdemoura/users_posts-API | 0de77e61b34adcdc6fd42f2aef0886cdd7eb025f | [
"MIT"
] | null | null | null | README.md | lsantosdemoura/users_posts-API | 0de77e61b34adcdc6fd42f2aef0886cdd7eb025f | [
"MIT"
] | 4 | 2021-04-08T20:09:14.000Z | 2022-02-10T11:02:10.000Z | README.md | lsantosdemoura/users_posts-API | 0de77e61b34adcdc6fd42f2aef0886cdd7eb025f | [
"MIT"
] | null | null | null | # Users Posts API
API for consulting a user's posts
---
## REQUIREMENTS
- [docker-compose](https://docs.docker.com/compose/install/)
---
## USAGE
### Run the project
```
$ git clone [email protected]:lsantosdemoura/users_posts-API.git users_posts
$ cd users_posts
# You can build and start docker at once
$ docker-compose up --build
```
#### You can access the image bash:
``` $ docker-compose exec web bash ```
### Run tests
```
$ cd users_posts
$ docker-compose -f test.yml build
$ docker-compose -f test.yml run test_api
```
### For consulting an existing e-mail with [httpie](https://httpie.org/)
```
$ http http://localhost:8000/[email protected]
HTTP/1.1 200 OK
Allow: GET, HEAD, OPTIONS
Content-Length: 2725
Content-Type: application/json
Date: Thu, 19 Sep 2019 04:16:27 GMT
Server: WSGIServer/0.2 CPython/3.7.4
Vary: Accept, Cookie
X-Frame-Options: SAMEORIGIN
{
"address": {
"city": "Gwenborough",
"geo": {
"lat": "-37.3159",
"lng": "81.1496"
},
"street": "Kulas Light",
"suite": "Apt. 556",
"zipcode": "92998-3874"
},
"company": {
"bs": "harness real-time e-markets",
"catchPhrase": "Multi-layered client-server neural-net",
"name": "Romaguera-Crona"
},
"email": "[email protected]",
"id": 1,
"name": "Leanne Graham",
"phone": "1-770-736-8031 x56442",
"posts": [
{
"body": "quia et suscipit\nsuscipit recusandae consequuntur expedita et cum\nreprehenderit molestiae ut ut quas totam\nnostrum rerum est autem sunt rem eveniet architecto",
"id": 1,
"title": "sunt aut facere repellat provident occaecati excepturi optio reprehenderit"
},
{
"body": "est rerum tempore vitae\nsequi sint nihil reprehenderit dolor beatae ea dolores neque\nfugiat blanditiis voluptate porro vel nihil molestiae ut reiciendis\nqui aperiam non debitis possimus qui neque nisi nulla",
"id": 2,
"title": "qui est esse"
},
{
"body": "et iusto sed quo iure\nvoluptatem occaecati omnis eligendi aut ad\nvoluptatem doloribus vel accusantium quis pariatur\nmolestiae porro eius odio et labore et velit aut",
"id": 3,
"title": "ea molestias quasi exercitationem repellat qui ipsa sit aut"
},
{
"body": "ullam et saepe reiciendis voluptatem adipisci\nsit amet autem assumenda provident rerum culpa\nquis hic commodi nesciunt rem tenetur doloremque ipsam iure\nquis sunt voluptatem rerum illo velit",
"id": 4,
"title": "eum et est occaecati"
},
{
"body": "repudiandae veniam quaerat sunt sed\nalias aut fugiat sit autem sed est\nvoluptatem omnis possimus esse voluptatibus quis\nest aut tenetur dolor neque",
"id": 5,
"title": "nesciunt quas odio"
},
{
"body": "ut aspernatur corporis harum nihil quis provident sequi\nmollitia nobis aliquid molestiae\nperspiciatis et ea nemo ab reprehenderit accusantium quas\nvoluptate dolores velit et doloremque molestiae",
"id": 6,
"title": "dolorem eum magni eos aperiam quia"
},
{
"body": "dolore placeat quibusdam ea quo vitae\nmagni quis enim qui quis quo nemo aut saepe\nquidem repellat excepturi ut quia\nsunt ut sequi eos ea sed quas",
"id": 7,
"title": "magnam facilis autem"
},
{
"body": "dignissimos aperiam dolorem qui eum\nfacilis quibusdam animi sint suscipit qui sint possimus cum\nquaerat magni maiores excepturi\nipsam ut commodi dolor voluptatum modi aut vitae",
"id": 8,
"title": "dolorem dolore est ipsam"
},
{
"body": "consectetur animi nesciunt iure dolore\nenim quia ad\nveniam autem ut quam aut nobis\net est aut quod aut provident voluptas autem voluptas",
"id": 9,
"title": "nesciunt iure omnis dolorem tempora et accusantium"
},
{
"body": "quo et expedita modi cum officia vel magni\ndoloribus qui repudiandae\nvero nisi sit\nquos veniam quod sed accusamus veritatis error",
"id": 10,
"title": "optio molestias id quia eum"
}
],
"username": "Bret",
"website": "hildegard.org"
}
```
### And for an unexisting e-mail
```
$ http localhost:8000/\?email\[email protected]
HTTP/1.1 400 Bad Request
Allow: GET, HEAD, OPTIONS
Content-Length: 39
Content-Type: application/json
Date: Thu, 19 Sep 2019 04:18:40 GMT
Server: WSGIServer/0.2 CPython/3.7.4
Vary: Accept, Cookie
X-Frame-Options: SAMEORIGIN
{
"email": "This e-mail does not exist."
}
```
| 35.736842 | 232 | 0.631811 | kor_Hang | 0.232095 |
9b63073851b95fdd1f6842e5df3898822ac9d0a2 | 122 | md | Markdown | README.md | atakanakbulut/ASI210 | b3f4e3016d0dd55f05bb3cef6c975e0ea9601902 | [
"MIT"
] | null | null | null | README.md | atakanakbulut/ASI210 | b3f4e3016d0dd55f05bb3cef6c975e0ea9601902 | [
"MIT"
] | null | null | null | README.md | atakanakbulut/ASI210 | b3f4e3016d0dd55f05bb3cef6c975e0ea9601902 | [
"MIT"
] | null | null | null | This project remote controller to (STM32 project) . it can run over x86 or Arm processors.
Branchs
CROSS-COMPILER
MODBUS
| 20.333333 | 90 | 0.795082 | eng_Latn | 0.976532 |
9b65f56eb09d8cc2a231a8b08908cb117b2e2304 | 829 | md | Markdown | LICENSE.md | attilammagyar/IOCCC-vs.-CleanCode | 38b740e3770604b26920c5fa7c6a2f6b3528fbcc | [
"WTFPL"
] | 3 | 2017-10-16T03:01:48.000Z | 2020-06-07T16:07:50.000Z | LICENSE.md | attilammagyar/IOCCC-vs.-CleanCode | 38b740e3770604b26920c5fa7c6a2f6b3528fbcc | [
"WTFPL"
] | null | null | null | LICENSE.md | attilammagyar/IOCCC-vs.-CleanCode | 38b740e3770604b26920c5fa7c6a2f6b3528fbcc | [
"WTFPL"
] | null | null | null | Rules of the [International Obfuscated C Code Contest](http://ioccc.org/) apply
to the original work found in commit 0b966486f8a8c44006b502902794d55a8f902971,
that was taken from the [IOCCC website](http://ioccc.org/years.html#1995_makarios).
For the refactored work, terms and conditions described in the
WTFPL license apply.
DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE
Version 2, December 2004
Copyright (C) 2004 Sam Hocevar
14 rue de Plaisance, 75014 Paris, France
Everyone is permitted to copy and distribute verbatim or modified
copies of this license document, and changing it is allowed as long
as the name is changed.
DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. You just DO WHAT THE FUCK YOU WANT TO.
| 34.541667 | 83 | 0.755127 | eng_Latn | 0.733141 |
9b660e2bc73af6d9669fa4e0c7fdf86b535aa095 | 302 | md | Markdown | Insights/133.md | zzz0906/LeetCode | cd0b4a4fd03d0dff585c9ef349984eba1922ece0 | [
"MIT"
] | 17 | 2018-08-23T08:53:56.000Z | 2021-04-17T00:06:13.000Z | Insights/133.md | zzz0906/LeetCode | cd0b4a4fd03d0dff585c9ef349984eba1922ece0 | [
"MIT"
] | null | null | null | Insights/133.md | zzz0906/LeetCode | cd0b4a4fd03d0dff585c9ef349984eba1922ece0 | [
"MIT"
] | null | null | null | ## 133. Clone Graph
It was a recursive clone for each node? I think.
Oh, I understand why this problem is a medium problem. We may have clone this node before. We shall not clone two same nodes.
I add a vector to record the nodes we have cloned.
Yes, this solution is correct. faster than 97.26%.
| 27.454545 | 125 | 0.745033 | eng_Latn | 0.999864 |
9b665dee9955631f499325558f9dcaa558dceaef | 393 | md | Markdown | README.md | milleraundra/politicalTest | 59c2312fb6c4d03b009bc258e50e6a890c5441b5 | [
"MIT"
] | null | null | null | README.md | milleraundra/politicalTest | 59c2312fb6c4d03b009bc258e50e6a890c5441b5 | [
"MIT"
] | null | null | null | README.md | milleraundra/politicalTest | 59c2312fb6c4d03b009bc258e50e6a890c5441b5 | [
"MIT"
] | null | null | null | # politicalTest
A simple political test to determine the party the user is likely a part of.
## Installation
1. Clone this repository.
2. Navigate to the top level of the project directory.
3. Open the `index.html` file in your browser.
## Support and Contact Details
Email: [email protected]
GitHub: milleraundra
### License
The MIT License (MIT)
Copyright (c) 2016 Aundra Miller
| 20.684211 | 76 | 0.760814 | eng_Latn | 0.946879 |
9b67e4f8c070449364e74d9319b7affdbe1f8b43 | 3,371 | md | Markdown | articles/bastion/quickstart-host-portal.md | grayknight2/mc-docs.zh-cn | dc705774cac09f2b3eaeec3c0ecc17148604133e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/bastion/quickstart-host-portal.md | grayknight2/mc-docs.zh-cn | dc705774cac09f2b3eaeec3c0ecc17148604133e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/bastion/quickstart-host-portal.md | grayknight2/mc-docs.zh-cn | dc705774cac09f2b3eaeec3c0ecc17148604133e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 快速入门 - 使用专用 IP 地址连接到虚拟机 - Azure Bastion
description: 本文介绍如何从虚拟机创建 Azure Bastion 主机并使用专用 IP 地址进行安全连接。
services: bastion
author: rockboyfor
ms.service: bastion
ms.topic: quickstart
origin.date: 03/11/2020
ms.date: 07/27/2020
ms.testscope: yes
ms.testdate: ''
ms.author: v-yeche
ms.openlocfilehash: cee5faaf093c9ca1ddb38f09bcb71215b63f2632
ms.sourcegitcommit: 4d9846bb03ac24bd98b0c9a781bb8912ff6d2f61
ms.translationtype: HT
ms.contentlocale: zh-CN
ms.lasthandoff: 07/22/2020
ms.locfileid: "86926996"
---
<!--Verified Failed-->
<!--RELEASE BEFORE CONFIRMATION-->
# <a name="quickstart-connect-to-a-virtual-machine-using-a-private-ip-address-and-azure-bastion"></a>快速入门:使用专用 IP 地址和 Azure Bastion 连接到虚拟机
本快速入门文章介绍如何使用专用 IP 地址连接到虚拟机。 在通过 Bastion 连接时,虚拟机无需公共 IP 地址。 本文中的步骤可帮助你在门户中通过虚拟机将 Bastion 部署到虚拟网络。 预配服务后,RDP/SSH 体验即可用于同一虚拟网络中的所有虚拟机。
<a name="prereq"></a>
## <a name="prerequisites"></a>先决条件
* 一个 Azure 虚拟网络。
* 位于虚拟网络中的 Azure 虚拟机,已打开端口 3389。
### <a name="example-values"></a>示例值
|**名称** | **值** |
| --- | --- |
| 名称 | VNet1Bastion |
| 区域 | chinaeast2 |
| 虚拟网络 | VNet1 |
| + 子网名称 | AzureBastionSubnet |
| AzureBastionSubnet 地址 | 10.1.254.0/27 |
| 公共 IP 地址 | 新建 |
| 公共 IP 地址名称 | VNet1BastionPIP |
| 公用 IP 地址 SKU | Standard |
| 分配 | 静态 |
<a name="createvmset"></a>
## <a name="create-a-bastion-host"></a>创建 Bastion 主机
使用现有虚拟机在门户中创建 Bastion 主机时,各种设置将自动默认为与虚拟机和/或虚拟网络相对应。
1. 打开 [Azure 门户](https://portal.azure.cn)。 转到虚拟机,然后单击“连接”。

1. 在下拉列表中,选择“Bastion”。
1. 在“连接”页上,选择“使用 Bastion”。

1. 在“Bastion”页上,填写以下设置字段:
* 名称:为 Bastion 主机命名
* **子网**:虚拟网络中的子网,将在其中部署 Bastion 资源。 必须使用名称 AzureBastionSubnet 创建子网。 此名称告知 Azure 要将 Bastion 资源部署到哪个子网。 这不同于网关子网。 使用至少为 /27 或更大(/27、/26、/25 等)的子网。
* 选择“管理子网配置”,然后选择“+ 子网” 。
* 在“添加子网”页上,键入 AzureBastionSubnet。
* 指定以 CIDR 表示法表示的地址范围。 例如 10.1.254.0/27。
* 选择“确定”以创建子网。 在页面顶部,导航回 Bastion 以完成其余设置。

* **公共 IP 地址**:这是要在其上通过端口 443 访问 RDP/SSH 的 Bastion 资源的公共 IP。 新建公共 IP,或使用现有公共 IP。 公共 IP 地址必须与要创建的 Bastion 资源位于同一区域。
* **公共 IP 地址名称**:公共 IP 地址资源的名称。
1. 在“验证”屏幕上,单击“创建”。 请等待大约 5 分钟,以创建和部署 Bastion 资源。

<a name="connect"></a>
## <a name="connect"></a>连接
在将 Bastion 部署到虚拟网络后,屏幕切换到连接页面。
1. 键入虚拟机的用户名和密码。 然后,选择“连接”。

1. 通过 Bastion 连接到此虚拟机的 RDP 将使用端口 443 和 Bastion 服务在 Azure 门户中(通过 HTML5)直接打开。

## <a name="clean-up-resources"></a>清理资源
使用完虚拟网络和虚拟机之后,请删除资源组和其包含的所有资源:
1. 在门户顶部的“搜索”框中输入“TestRG1”,并从搜索结果中选择“TestRG1” 。
2. 选择“删除资源组”。
3. 对于“键入资源组名称”,输入“TestRG1”,然后选择“删除” 。
## <a name="next-steps"></a>后续步骤
在本快速入门中,你为虚拟网络创建了一个 Bastion 主机,然后通过该 Bastion 主机安全连接到了虚拟机。
* 若要了解有关 Azure Bastion 的详细信息,请参阅 [Bastion 概述](bastion-overview.md)和 [Bastion FAQ](bastion-faq.md)。
* 若要在 Azure Bastion 子网中使用网络安全组,请参阅[使用 NSG](bastion-nsg.md)。
* 有关包含 Azure Bastion 主机设置解释的说明,请参阅[教程](bastion-create-host-portal.md)。
* 若要连接到虚拟机规模集,请参阅[使用 Azure Bastion 连接到虚拟机规模集](bastion-connect-vm-scale-set.md)。
<!-- Update_Description: new article about quickstart host portal -->
<!--NEW.date: 07/27/2020--> | 30.926606 | 149 | 0.701572 | yue_Hant | 0.350487 |
9b684fcb2d5f98073df93d19b130a00b2aacedb2 | 3,815 | md | Markdown | content/blog/2014/12/19/mon-vs-on/index.md | mitchellsimoens/blog | 65e24dd79a39a156fbd9992791aa10efeef489e1 | [
"MIT"
] | null | null | null | content/blog/2014/12/19/mon-vs-on/index.md | mitchellsimoens/blog | 65e24dd79a39a156fbd9992791aa10efeef489e1 | [
"MIT"
] | 36 | 2020-06-25T20:29:49.000Z | 2022-03-27T16:13:32.000Z | content/blog/2014/12/19/mon-vs-on/index.md | mitchellsimoens/blog | 65e24dd79a39a156fbd9992791aa10efeef489e1 | [
"MIT"
] | null | null | null | ---
title: A Naming Strategy
date: "2014-12-19T17:06:03.284Z"
---
A common method when developing an application is to be event driven. This allows your code to be very flexible. Not being event driven can lead to code that is very intimate to the current state of your application. Change or add something later on and you must go through your application and change the different parts where as being event driven allows an event to be fired and forgotten about. Anything listening to that event can then take action on that event but not be dependant on what fired the event.
A simple example is an email application. Say a user sends an email, when the email has been sent and the server returns success, the callback may fire an event to tell anything in the application that the user sent an email. The Sent folder list may listen to this event in order to reload the list. This email may have been sent from different forms, a create email form may be different than a quick reply form but that doesn't matter, either form may fire the same event and the Sent folder list doesn't care where it came from. This is flexibility and in my eyes, creates much safer code.
## Adding listeners in Ext JS
Ext JS allows many different ways to add listeners but today I want to speak about `mon` and `on` and the differences. First, let's start with `on`.
### `on`
The simplest of the two is the `on` method which can simply add an event listener:
component.on('foo', someFunc, component);
You can also pass an object for a convenient way to add multiple listeners:
component.on({
scope : component,
foo : someFunc,
bar : someOtherFunc
});
This will add listeners for the `foo` and `bar` events and scope them to the `component` variable. In Ext JS 4 and newer, listeners defined this way will automatically get removed when the component is destroyed thanks to the `clearListeners` method being executed in the `destroy` method of `Ext.Component`.
### `mon`
`mon` works just like `on` when defining a listener only the first argument must be a class to add the listeners too. That's a bit confusing of a statement right? Let's look at an example:
component.mon(subClass, 'foo', someFunc, component);
What this is doing, is adding a listener for the `foo` event that will be fired on the `subClass` variable. Once fired, it will execute the `someFunc` function scoped to the `component` variable but the listener is added to the `component` but the event will be fired on the `subClass`.
The benefit here is the component can listen to an event on something else but when the component is destroyed, this listener is removed. The component may be destroyed but the `subClass` variable may still live on; the component is managing the `foo` event listener.
Like the `on` method, you can also pass an object:
component.mon(subClass, {
scope : component,
foo : someFunc,
bar : someOtherFunc
});
### Difference
Let's think of an `Ext.Component` instance that wants to listen to a `store` load. You can use `on` like this:
store.on('load', this.onStoreLoad, this);
However, when the `Ext.Component` is destroyed, the listener will not be removed because the event listener is on the `store` not the `Ext.Component`. This causes a memory leak and the next `store` load after the component is destroyed will likely throw an error, both bad things. This is where `mon` would be a better use:
this.mon(store, 'load', this.onStoreLoad, this);
This adds a `load` listener on the store but is being managed by the `Ext.Component`. When the `Ext.Component` is destroyed, the listener is removed. The `store` is then free to live on and fire it's `load` event without errors being thrown and no memory leaks are present.
| 64.661017 | 594 | 0.748886 | eng_Latn | 0.999901 |
9b68a1b9d47c08d751954063a89ddbfcaf9556ca | 218 | md | Markdown | _watches/M20200115_065033_TLP_1.md | Meteoros-Floripa/meteoros.floripa.br | 7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad | [
"MIT"
] | 5 | 2020-01-22T17:44:06.000Z | 2020-01-26T17:57:58.000Z | _watches/M20200115_065033_TLP_1.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | null | null | null | _watches/M20200115_065033_TLP_1.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | 2 | 2020-05-19T17:06:27.000Z | 2020-09-04T00:00:43.000Z | ---
layout: watch
title: TLP1 - 15/01/2020 - M20200115_065033_TLP_1T.jpg
date: 2020-01-15 06:50:33
permalink: /2020/01/15/watch/M20200115_065033_TLP_1
capture: TLP1/2020/202001/20200114/M20200115_065033_TLP_1T.jpg
---
| 27.25 | 62 | 0.784404 | eng_Latn | 0.037905 |
9b68fec34a45bb8e5c9e3c3afabff2089316be9e | 760 | md | Markdown | AlchemyInsights/microsoft-stream-upload-errors.md | pebaum/OfficeDocs-AlchemyInsights-pr.et-EE | da9e02f84f493e9188f4a5855e6117899feff1eb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | AlchemyInsights/microsoft-stream-upload-errors.md | pebaum/OfficeDocs-AlchemyInsights-pr.et-EE | da9e02f84f493e9188f4a5855e6117899feff1eb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | AlchemyInsights/microsoft-stream-upload-errors.md | pebaum/OfficeDocs-AlchemyInsights-pr.et-EE | da9e02f84f493e9188f4a5855e6117899feff1eb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Microsofti voo üleslaadimise tõrked
ms.author: cmcatee
author: cmcatee-MSFT
manager: mnirkhe
ms.audience: Admin
ms.topic: article
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.assetid: ef2df989-8539-48b5-a324-97d2e09f14fe
ms.custom:
- "9002643"
- "5094"
ms.openlocfilehash: 1ae3b1edc25ca4d4fdc06a2a8cd8b74f3b7cb9fd
ms.sourcegitcommit: f7f25506191d0656a7637340df806b82c4232bc4
ms.translationtype: MT
ms.contentlocale: et-EE
ms.lasthandoff: 04/21/2020
ms.locfileid: "43599350"
---
# <a name="microsoft-stream-upload-errors"></a>Microsofti voo üleslaadimise tõrked
Kui saate üleslaadimise tõrked Microsoft Stream, vaadake [Stream upload vead](https://docs.microsoft.com/stream/portal-understanding-upload-errors).
| 30.4 | 148 | 0.815789 | yue_Hant | 0.300614 |
9b6a3966cb1a7a970c78d1d32cae0aa724cad71f | 806 | md | Markdown | README.md | braveripple/CardWirthScenarioDownloadWatcher | ea0b4df0d9b6487759a05f6f44fb399b0f69fc0e | [
"MIT"
] | null | null | null | README.md | braveripple/CardWirthScenarioDownloadWatcher | ea0b4df0d9b6487759a05f6f44fb399b0f69fc0e | [
"MIT"
] | null | null | null | README.md | braveripple/CardWirthScenarioDownloadWatcher | ea0b4df0d9b6487759a05f6f44fb399b0f69fc0e | [
"MIT"
] | null | null | null | # CardWirthScenarioDownloadWatcher
CardWirthのシナリオダウンロードを監視して
CardWirthのシナリオフォルダへの移動を自動で行うスクリプト
## 起動方法
`CardWirthScenarioDownloadWatcher.ps1`をダブルクリックすると、タスクバーの通知領域に`CardWirthScenarioDownloadWatcher`の通知アイコンが出る。(アイコンは今のところPowerShellのアイコンと同じ。)
## 終了方法
通知アイコンを右クリックして「Exit」をクリックすると終了する。
## 設定方法
`CardWirthScenarioDownloadWatcher.ps1`
をメモ帳などで開き、以下の変数を変更する。
* `$WATCH_DIRECTORY_PATH` … ダウンロードのディレクトリのパス
* `$CARDWIRTH_SCENARIO_DIRECTORY_PATH` … CardWirthのシナリオフォルダのパスを指定する。
* `$CARDWIRTHNEXT_SCENARIO_DIRECTORY_PATH` … CardWirthNextのシナリオフォルダのパスを指定する。
## 動作環境
Windows11,PowerShell7で動作確認。
以下のPowerShellのモジュールが必要。
* [CardWirthScenarioSummaryReader(自作)](https://github.com/braveripple/CardWirthScenarioSummaryReader)
* [BurntToast](https://github.com/Windos/BurntToast)
## 課題
* 設定項目を外部ファイル化したい。
* エンジンの設定を可変にしたい。
| 29.851852 | 137 | 0.848635 | yue_Hant | 0.98095 |
9b6a6d93cd43239262cf01b303e15bbbc78705ca | 5,982 | md | Markdown | packages/coil-extension/docs/building-extension-for-designers.md | mankins/web-monetization-projects | 1a989e06c4bee96ae102d98b7392efec472580ab | [
"Apache-2.0"
] | 82 | 2019-10-24T18:45:35.000Z | 2022-02-14T22:34:12.000Z | packages/coil-extension/docs/building-extension-for-designers.md | mankins/web-monetization-projects | 1a989e06c4bee96ae102d98b7392efec472580ab | [
"Apache-2.0"
] | 1,448 | 2019-10-31T04:53:33.000Z | 2022-03-31T09:47:52.000Z | packages/coil-extension/docs/building-extension-for-designers.md | mankins/web-monetization-projects | 1a989e06c4bee96ae102d98b7392efec472580ab | [
"Apache-2.0"
] | 17 | 2019-11-19T04:22:54.000Z | 2021-12-22T07:47:30.000Z | # Building Extension For Designers
### Audience
So you're a designer, and have always been a bit afraid of that scary black box.
The terminal? That's for those machine-brained engineers! Or maybe you just don't want to get your hands dirty?
No problem. We'll walk you through it! You can wash your hands after!
### Prerequisites
- MacOS
- Chrome Browser
- Git - Version control system
- NodeJS 14 - JavaScript Engine
- Yarn - NodeJS Package Managers
Given you're a designer, let us be a little presumptive and assume you're on a Mac.
We also trust you can install Chrome without any special instruction.
If you already have NodeJS/Git/Yarn installed, then skip to "Cloning The Repo"
#### Open that Terminal!
We may as well get it out of the way right? Open that terminal!
Hit cmd+space to open Spotlight:

Then type in "terminal":

Give or take, it should look something like this:

Type in the following:
```zsh
whoami
```
It should print your username on the mac to the console. Take note of this for later.
#### Install Git
Git is a version control system created by the Linux mastermind.
To check if you already have it installed, go to the terminal and type:
```zsh
git --version
```
If git is not installed then you will see:
```
zsh: command not found: git
```
In that case you will need to install homebrew first (next section).
If already installed, skip to the "Install Node JavaScript Engine" section.
#### Install Homebrew to install Git
> The Missing Package Manager for macOS

Go to the website and copy the [installation](https://brew.sh/#install) command.
Then paste it into the terminal and follow the instructions.
Finally, install Git, via the following command:
```zsh
brew install git
```
Check the version again!
#### Install Node JavaScript Engine
The extension, extending a Browser, is implemented the native Browser scripting language: JavaScript.
In fact, the build tools are written in JavaScript also, however they are using [NodeJS](https://nodejs.org/).
Cmd+click [>THIS<](https://github.com/nvm-sh/nvm#install--update-script) link to
open instructions on how to install the Node Version Manager (nvm for short).

Follow the instructions.
Quit the Terminal app, reopen it, then type the following command:
```zsh
nvm install 14
```
To test that node is installed, type:
```zsh
node --version
```
It should echo something like the following:
```
v14.9.0
```
### Installing Yarn
Yarn is a package manager for NodeJS.
Follow the steps [here](https://classic.yarnpkg.com/en/docs/install#mac-stable)
If you aren't sure which option to use to install, choose "Installation Script":

### Cloning The Extension Repository
The following command will clone the repository, creating a folder inside your home folder:
```zsh
git clone https://github.com/coilhq/web-monetization-projects.git
```
When successful, the command output should look like:
```
Cloning into 'web-monetization-projects'...
remote: Enumerating objects: 555, done.
remote: Counting objects: 100% (555/555), done.
remote: Compressing objects: 100% (261/261), done.
remote: Total 8643 (delta 518), reused 303 (delta 294), pack-reused 8088
Receiving objects: 100% (8643/8643), 12.38 MiB | 5.03 MiB/s, done.
Resolving deltas: 100% (6614/6614), done.
```
### Downloading Extension Dependencies
The extension uses various 3rd Party code libraries, and these must be
downloaded before you are able to build.
First you must `cd` (`c`=change/`d`=directory) to set the context for subsequent commands:
```
cd ~/web-monetization-projects
```
Then you will have to install the dependencies:
```
yarn
```
It will download various packages from the NPM (Node Package Manager) registry.
### Building the extension
First `cd` into the extension subpackage folder:
```
cd ~/web-monetization-projects/packages/coil-extension
```
Then run the following command:
```
yarn dev-chrome-prod
```
This command builds the extension.
When the output looks like the following, you will know when the extension has been built:

It never exits, as it watches for changes to the source code, and rebuilds upon modifications.
To exit: type Ctrl+c, which will kill the build script.
### Loading the extension in Chrome Browser
Enter `chrome://extensions` into the address bar, then enable "Developer Mode":

Next click the "Load Unpacked" button, which will show a Finder dialogue to
choose the folder to load.
Hit cmd+shift+g keys, which will open up another dialogue to type in the folder name:

You will want the full path to the built extension:
`/Users/nicholasdudfield/web-monetization-projects/packages/coil-extension/dist`
Substitute `nicholasdudfield` for the output of the `whoami` command in the terminal.
Hit the "Select" button:

Disable the chrome store version and enable the CoilDev-\$Date version:

You should now be using the development version of the extension!
Check if the latest features are available!
### Done
It wasn't THAT painful, right?
Also, now that you have installed the dependencies (homebrew/nvm/git/node) on your machine,
you can build all sorts of node projects!
| 28.350711 | 111 | 0.758275 | eng_Latn | 0.987127 |
9b6a827060853058f7f146725fdaf2afd89f8946 | 166 | md | Markdown | README.md | iAmGio/libfx2 | 1f776bab524cfbfc9a42cdcfe3dc60feeb65319b | [
"MIT"
] | 3 | 2018-01-27T14:20:59.000Z | 2021-05-02T18:46:28.000Z | README.md | iAmGio/libfx2 | 1f776bab524cfbfc9a42cdcfe3dc60feeb65319b | [
"MIT"
] | null | null | null | README.md | iAmGio/libfx2 | 1f776bab524cfbfc9a42cdcfe3dc60feeb65319b | [
"MIT"
] | 2 | 2018-01-27T15:12:55.000Z | 2018-06-22T15:34:04.000Z | # LibFX 2
Your JavaFX projects, more simple, more beautiful.
LibFX allows you to write JavaFX projects with style and simplicity.
>_F like flawless, X like xenial_ | 23.714286 | 68 | 0.783133 | eng_Latn | 0.999384 |
9b6ab89b3e458c5fcd3b129569981fb5d7bac9ea | 3,924 | md | Markdown | README.md | dubzland/ansible-role-nextcloud | 4d74a203a695a29a168b5a7bcfb1cf5875a0565b | [
"MIT"
] | null | null | null | README.md | dubzland/ansible-role-nextcloud | 4d74a203a695a29a168b5a7bcfb1cf5875a0565b | [
"MIT"
] | null | null | null | README.md | dubzland/ansible-role-nextcloud | 4d74a203a695a29a168b5a7bcfb1cf5875a0565b | [
"MIT"
] | null | null | null | # Dubzland: Nextcloud
[](https://git.dubzland.net/dubzland/ansible-role-nextcloud/pipelines)
[](https://galaxy.ansible.com/dubzland/gitlab)
[](https://galaxy.ansible.com/dubzland/nextcloud)
[](https://galaxy.ansible.com/dubzland/nextcloud)
[](https://liberapay.com/jdubz/donate)
[](https://liberapay.com/jdubz/donate)
Installs and configures the Nextcloud personal cloud.
## Requirements
None
## Role Variables
Available variables are listed below, along with their default values (see
`defaults/main.yml` for more info):
### dubzland_nextcloud_version
```yaml
dubzland_nextcloud_version: "{{ _dubzland_nextcloud_version }}"
```
Version of Nextcloud to install. Defaults to the most recent version of
Nextcloud compatible with the version of PHP available in the platform's package
manager.
### dubzland_nextcloud_root
```yaml
dubzland_nextcloud_root: "/var/www/nextcloud"
```
Root directory to contain the Nextcloud install.
### dubzland_nextcloud_data_dir
```yaml
dubzland_nextcloud_data_dir: "/srv/nextcloud/data"
```
Directory where Nextcloud will store user data.
### dubzland_nextcloud_db_type
```yaml
dubzland_nextcloud_db_type: sqlite3
```
Type of database. Allowed options are `sqlite3`, `pgsql` and `mysql`.
### dubzland_nextcloud_db_host
```yaml
dubzland_nextcloud_db_host: localhost
```
Host running the database Nextcloud will use. Only applicable for `pgsql` and
`mysql` db_type.
### dubzland_nextcloud_db_name
```yaml
dubzland_nextcloud_db_name: nextcloud
```
Name of the Nextcloud database. The installation process will create this database if it does not exist.
### dubzland_nextcloud_db_username / dubzland_nextcloud_db_password
```yaml
dubzland_nextcloud_db_username: nextcloud
dubzland_nextcloud_db_password: nextcloud
```
Credentials used to connect to the database. This user will need the `CREATEDB` role.
### dubzland_nextcloud_admin_username / dubzland_nextcloud_admin_password
```yaml
dubzland_nextcloud_admin_username: admin
dubzland_nextcloud_admin_password: nextcloud
```
Credentials to configure for the Nextcloud administrative user.
### dubzland_nextcloud_web_user / dubzland_nextcloud_web_group
```yaml
dubzland_nextcloud_web_user: www-data
dubzland_nextcloud_web_group: www-data
```
System user who should own all Nextcloud application and data files.
### dubzland_nextcloud_cron_frequency
```yaml
dubzland_nextcloud_cron_frequency: '5'
```
How many minutes should elapse between running the cron maintenance job.
### dubzland_nextcloud_url
```yaml
dubzland_nextcloud_url: https://nextcloud.example.com
```
URL where to Nextcloud instance will be accessible.
### dubzland_nextcloud_trusted_domains
```yaml
dubzland_nextcloud_trusted_domains:
- localhost
- nextcloud.example.com
```
List of domains allowed to connect to this Nextcloud instance.
### dubzland_nextcloud_settings
```yaml
dubzland_nextcloud_settings: []
```
Any additional settings to configure (mail server, etc). See
[defaults/main.yml](defaults/main.yml) for examples.
### dubzland_nextcloud_apps
```yaml
dubzland_nextcloud_apps: []
```
Any apps (and their settings) to automatically configure in Nextcloud. See
[defaults/main.yml](defaults/main.yml) for examples.
## Dependencies
None.
## Example Playbook
```yaml
- hosts: nextcloud
become: yes
roles:
- role: dubzland.nextcloud
```
## License
MIT
## Author
* [Josh Williams](https://dubzland.net)
| 24.07362 | 228 | 0.780581 | eng_Latn | 0.514682 |
9b6c29c1a558ef8cfd2dcbab80b6b9177a1ab0d6 | 99 | md | Markdown | _includes/04-lists.md | Jhansi-gl/markdown-portfolio | f065cfe89acd97201eba8fb58124cbd13175525c | [
"MIT"
] | null | null | null | _includes/04-lists.md | Jhansi-gl/markdown-portfolio | f065cfe89acd97201eba8fb58124cbd13175525c | [
"MIT"
] | 5 | 2021-11-12T16:59:20.000Z | 2021-11-12T17:29:42.000Z | _includes/04-lists.md | Jhansi-gl/markdown-portfolio | f065cfe89acd97201eba8fb58124cbd13175525c | [
"MIT"
] | null | null | null | Replace this with a list of your favorite things.
:heart:
:smile:
:sparkles:
* i1
* i2
* i3
* i4
| 11 | 49 | 0.666667 | eng_Latn | 0.987857 |
9b6c8212fee71bf315f061d37492b8d4b7d89a75 | 816 | md | Markdown | alessiavalgimigli/Esercizi p5js/Walkers/README.md | alessiavalgimigli/archive | b62981805691acef11d74ae663d60f5e395c0930 | [
"MIT"
] | 1 | 2021-05-29T08:11:21.000Z | 2021-05-29T08:11:21.000Z | alessiavalgimigli/Esercizi p5js/Walkers/README.md | alessiavalgimigli/archive | b62981805691acef11d74ae663d60f5e395c0930 | [
"MIT"
] | 29 | 2021-03-10T18:36:59.000Z | 2021-06-20T14:33:16.000Z | alessiavalgimigli/Esercizi p5js/Walkers/README.md | alessiavalgimigli/archive | b62981805691acef11d74ae663d60f5e395c0930 | [
"MIT"
] | 14 | 2021-03-05T14:14:50.000Z | 2021-03-05T14:16:18.000Z | ## I quattro mondi

-
## Camminatore mondi

-
## Earthquake

-
## Puntinismo

-
## Pennelli

-
## Scontri sonori

| 29.142857 | 120 | 0.792892 | yue_Hant | 0.159787 |
9b6ca1da63219bcd1580997af755293a7700728e | 367 | md | Markdown | Readme.md | kompetenzbolzen/bbs | cdff733137e608b0dd4839fdbb09cf2165f9e2d9 | [
"MIT"
] | null | null | null | Readme.md | kompetenzbolzen/bbs | cdff733137e608b0dd4839fdbb09cf2165f9e2d9 | [
"MIT"
] | null | null | null | Readme.md | kompetenzbolzen/bbs | cdff733137e608b0dd4839fdbb09cf2165f9e2d9 | [
"MIT"
] | null | null | null | # BBS
a simple telnet-server capable of controlling AT-compatible modems.
## Options
-i <Bind ip>
-p <Bind port>
-s <serial port>
-b <serial baudrate>
-f <pidfile> : fork, create pidfile
The first argument not on this list will be used as argv0 of the program to launch.
The serial modem and the telnet server must be started separately
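Example invocation (every value below is a made-up placeholder, not a project default):

    ./bbs -i 0.0.0.0 -p 2323 -s /dev/ttyUSB0 -b 9600 -f /var/run/bbs.pid /bin/login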
## this_irl

| 17.47619 | 71 | 0.72752 | eng_Latn | 0.962189 |
9b6ca89580f5ef4c2d403ce54a7a38b5386dcad4 | 1,480 | md | Markdown | _posts/2016-05-08-how-to-use.md | viinhpham/viinhpham.github.io | 7116dbff676c3dcc25bd2e394ee371e5737df89c | [
"MIT"
] | null | null | null | _posts/2016-05-08-how-to-use.md | viinhpham/viinhpham.github.io | 7116dbff676c3dcc25bd2e394ee371e5737df89c | [
"MIT"
] | null | null | null | _posts/2016-05-08-how-to-use.md | viinhpham/viinhpham.github.io | 7116dbff676c3dcc25bd2e394ee371e5737df89c | [
"MIT"
] | null | null | null | ---
layout: post
title: How to Use
date: 2016-05-08 20:35:00
image: /assets/img/
description: How to create a junit test gradle project with Junit 5 and Mockito
main-class: misc
color: '#7aab13'
tags:
- Java
categories:
twitter_text: How to create a junit test gradle project with Junit 5 and Mockito
introduction: How to create a junit test gradle project with Junit 5 and Mockito
---
## How to create a junit test gradle project with Junit 5 and Mockito
1. Create a Gradle project in your IntelliJ IDEA
2. Open build.gradle and add the following dependencies:
```gradle
dependencies {
    testImplementation('org.junit.jupiter:junit-jupiter-engine:5.5.2')
    testImplementation('org.mockito:mockito-core:2.28.2')
    testImplementation('org.mockito:mockito-junit-jupiter:2.28.2')
}

test {
    useJUnitPlatform()
}
```
3\. Write your unit test (a minimal sketch is shown below)
4\. Run `./gradlew test`
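For step 3, a test could look roughly like this; `Calculator` and `AuditService` are made-up placeholder classes used only to illustrate the JUnit 5 + Mockito wiring:
```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

@ExtendWith(MockitoExtension.class)
class CalculatorTest {

    @Mock
    private AuditService auditService; // collaborator replaced by a Mockito mock

    @Test
    void addsTwoNumbersAndRecordsTheCall() {
        Calculator calculator = new Calculator(auditService);

        // stub a call the (hypothetical) Calculator makes on its collaborator
        when(auditService.isEnabled()).thenReturn(true);

        assertEquals(5, calculator.add(2, 3));
        verify(auditService).record("add"); // interaction check on the mock
    }
}
```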
| 38.947368 | 160 | 0.702027 | eng_Latn | 0.616735 |
9b6d66aa2fdd97eee9a6560578738dc21ae3b337 | 2,546 | md | Markdown | README.md | zeroone001/webpack4-demo | 1508a148fb95ae82e4c31c438e01bff9409b8731 | [
"MIT"
] | null | null | null | README.md | zeroone001/webpack4-demo | 1508a148fb95ae82e4c31c438e01bff9409b8731 | [
"MIT"
] | null | null | null | README.md | zeroone001/webpack4-demo | 1508a148fb95ae82e4c31c438e01bff9409b8731 | [
"MIT"
] | null | null | null | # webpack4-demo
Many dependencies need to be pinned to specific versions, because many of them have since been updated to support webpack 5.
### Links
* [Official docs (v4)](https://v4.webpack.docschina.org/concepts/)
* [Configuration reference](https://v4.webpack.docschina.org/configuration)
### Purpose
Build a webpack 4 + [email protected] setup, the configuration most companies still use today.
It is not the latest setup; a webpack 5 version will follow later.
### Tech stack
* webpack5
* [email protected]
* babel
* eslint
#### Install
```shell
npm install webpack@4 webpack-cli@3 webpack-dev-server@3 --save-dev
npm install webpack-merge webpack-manifest-plugin --save-dev
```
Read command-line arguments, used for building custom pages
```shell
npm i yargs@13 --save-dev
```
### Setup
```
npm i cross-env -D
cross-env NODE_ENV=production
npm i clean-webpack-plugin@^0.1.19 -D
// delete the dist folder before each build
new CleanWebpackPlugin(['dist'], {
    root: path.resolve(__dirname, '../'),
    verbose: false, // log output to the console
    dry: false // only simulate the deletion
}),
npm i html-webpack-plugin@^3.2.0 -D
npm i vue-loader@^15.7.1 -D
const VueLoaderPlugin = require('vue-loader/lib/plugin');
npm i mini-css-extract-plugin@^0.8.0 -D
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
npm i compression-webpack-plugin@2 -D
const CompressionPlugin = require("compression-webpack-plugin");
npm i progress-bar-webpack-plugin@^1.12.1 -D
```
### webpack.UglifyjsWebpackPlugin
This plugin is maintained by the webpack team and uses UglifyJS to minify code. It compresses with a single thread, so when several JS files need to be minified it processes them one at a time. As a result, minifying a production build is very slow (minifying JS means first parsing the code into an AST represented with objects, then applying all kinds of rules to analyse and transform that AST, which is expensive). Its advantage is support for older projects, so for maintaining fairly old projects it is the better choice.
```js
// npm i uglifyjs-webpack-plugin
module.exports = {
plugins: [
new webpack.optimize.UglifyJsPlugin({
sourceMap: true,
compress: {
warnings: false
}
}),
]
}
```
### webpack.TerserWebpackPlugin
[https://webpack.docschina.org/plugins/terser-webpack-plugin/#parallel](https://webpack.docschina.org/plugins/terser-webpack-plugin/#parallel)
```js
// npm install terser-webpack-plugin --save-dev
optimization: {
minimize: true,
minimizer: [
new TerserPlugin({
parallel: true,
        cache: true, // this option has been removed in newer versions
terserOptions: {
ecma: 5,
warnings: false,
parse: {},
compress: {},
mangle: true, // Note `mangle.properties` is `false` by default.
module: false,
output: null,
toplevel: false,
nameCache: null,
ie8: false,
keep_fnames: false,
safari10: true,
format: {
          comments: false, // strip comments
},
}
})
]
}
```
| 20.047244 | 188 | 0.620974 | yue_Hant | 0.242421 |
9b6e6432dc50010968d2a1809ceaa80a44ce396d | 3,501 | md | Markdown | README.md | mad-penguins/Antarctica | 2edb237283b652a280c0fb58cdb15e9290fca562 | [
"MIT"
] | null | null | null | README.md | mad-penguins/Antarctica | 2edb237283b652a280c0fb58cdb15e9290fca562 | [
"MIT"
] | 4 | 2019-04-19T19:29:57.000Z | 2019-05-31T20:14:45.000Z | README.md | mad-penguins/Antarctica | 2edb237283b652a280c0fb58cdb15e9290fca562 | [
"MIT"
] | 5 | 2019-04-27T10:00:07.000Z | 2019-06-07T14:32:27.000Z | # Antarctica

Antarctica is an open-source project aiming to perform comfortable in-cloud backup of user files, installed packages list and dotfiles.
The current edition is a port of the archived [Kotlin version](https://github.com/mad-penguins/AntarcticaKt) to C++/Qt.
It is being ported because the Kotlin edition was hard to deploy on most Linux installations.
This is mostly caused by the unavailability of Java SE 8 or OpenFX for Java 9 or newer.
Features (work still in progress):
- [ ] Files management
- [x] Uploading into server
- [x] Downloading from server
- [ ] Deleting from disk
- [x] Deleting from server
- [ ] Packages management (zypper only now)
- [x] Uploading into server (by one, not full list)
- [x] Binding of package and dotfiles
- [ ] Installing into system
- [ ] Removing from system
- [x] Removing from server
- [ ] Repositories management (zypper only now)
- [x] Reading list of added repositories
- [ ] Adding into system
- [x] Removing from system
- [ ] User interface
- [x] Files management tab
- [x] Packages management tab
- [ ] Repositories management tab
- [x] Settings
- [ ] Custom design
- [ ] Under the hood
- [x] Security
- [x] Connection through HTTPS
- [x] Real-time files' states changes monitoring
Roadmap ~~can~~ will be extended in the future.
The Antarctica server is under development too. Its code is closed and may still be unstable in some cases.
A public remote server is being tested now. The open API will be ~~opened~~ documented in the future.
A preview of the [API wrapper for Qt](https://github.com/mad-penguins/IcebreakerQt) is already available. There's also the [Antarctica API reference](https://github.com/mad-penguins/IcebreakerQt/wiki), which is being filled in now.
## Build
Antarctica is built with the Qt framework and uses the CMake build system.
You need to install the cmake binary, a C++ compiler with C++17 support, and the Qt5 runtime and development packages.
If you want to build an RPM package (Debian packages will be supported in the near future), you also need to install rpmbuild.
##### Needed packages
1. `git`
2. `cmake`
3. `clang` or `g++`(Debian-based) / `gcc-g++`(openSUSE)
4. Qt Packages (or just install Qt from official website and specify it to CMake with `-DCMAKE_PREFIX_PATH` flag)
- Ubuntu
- `libqt5core5a`
- `libqt5widgets5`
- `libqt5network5`
- openSUSE
- `libQt5Core-devel`
- `libQt5Widgets-devel`
- `libQt5Network-devel`
##### Build process:
1. `$ git clone https://github.com/mad-penguins/Antarctica`
2. `$ cd Antarctica`
3. `$ git submodule update --init`
4. `$ mkdir build && cd build`
5. `$ cmake .. && make` (you can specify number of cores used for compilation with flag `-j`, e.g. `-j 4`)
6. (Optional) make an RPM: `$ make package`
##### Troubleshooting
If step 3 doesn't work for you, you can clone the API wrapper repository manually and put it into the `api` directory:
`$ git clone https://github.com/mad-penguins/IcebreakerQt && mv IcebreakerQt/ api/`
There are also some [prebuilt binaries](https://github.com/mad-penguins/Antarctica/releases).
## Support
You can help us to rent a server and also support the development:
- WebMoney: R710781308549
- [Yandex Money](https://money.yandex.ru/to/410015281707280)
Any suggestions and contributions are welcome. Let's make Linux much more user friendly!
| 42.180723 | 227 | 0.710654 | eng_Latn | 0.982045 |
9b6fca9c5b3dbcbfcb9452bfb5466a2eae3ad2c5 | 400 | md | Markdown | README.md | soleHats/nike-account-creator | cee7e6b6c7f7b3a822f67f0a626506fe4abf70dc | [
"MIT"
] | 12 | 2020-10-18T17:57:37.000Z | 2022-01-14T04:54:32.000Z | README.md | ivekmezovu/nike-account-creator | cee7e6b6c7f7b3a822f67f0a626506fe4abf70dc | [
"MIT"
] | null | null | null | README.md | ivekmezovu/nike-account-creator | cee7e6b6c7f7b3a822f67f0a626506fe4abf70dc | [
"MIT"
] | 4 | 2019-09-04T15:09:45.000Z | 2019-10-20T01:27:40.000Z | # nike-account-creator
Automated script for creating nike accounts + phone verification (#http://smscin.com)
# Requirements
selenium
json
requests
tqdm
# Contribution [](https://github.com/dwyl/esta/issues)
- Fork this repo
- Add new features
- Create a new pull request for this branch
| 23.529412 | 157 | 0.7675 | eng_Latn | 0.64594 |
9b703cddcf9ee34b0b6b0c7c0074b414bb50ba35 | 319 | md | Markdown | README.md | tanguc/golang-grpc-server | 46460b05a1ba8fe03883373d7b1a2f58e38cf418 | [
"Apache-2.0"
] | null | null | null | README.md | tanguc/golang-grpc-server | 46460b05a1ba8fe03883373d7b1a2f58e38cf418 | [
"Apache-2.0"
] | null | null | null | README.md | tanguc/golang-grpc-server | 46460b05a1ba8fe03883373d7b1a2f58e38cf418 | [
"Apache-2.0"
] | null | null | null | Mock server for gRPC-interfaced projects
Initially made for https://github.com/tanguc/PersistentParadoxPlex
```sh
cp ~/Projects/PersistentMarkingLB/misc/grpc/proto/upstream.proto ../proto/ && protoc --go_out=. --go-grpc_out=. --go_opt=paths=source_relative --go-grpc_opt=paths=source_relative upstream.proto
```
| 35.444444 | 196 | 0.768025 | eng_Latn | 0.234193 |
9b70a911922f05ad8d52f9eb44130d39939a942d | 148 | md | Markdown | CHANGELOG.md | frian/customize-bash | 42cc48b16d494f17db12674757321f7383739f69 | [
"MIT"
] | null | null | null | CHANGELOG.md | frian/customize-bash | 42cc48b16d494f17db12674757321f7383739f69 | [
"MIT"
] | null | null | null | CHANGELOG.md | frian/customize-bash | 42cc48b16d494f17db12674757321f7383739f69 | [
"MIT"
] | null | null | null | ## 1.1.0 (23.03.2017)
* added option for bash startup file
* added -h switch
* added force script sourcing
## 1.0.0 (30.01.2017)
* initial release
| 18.5 | 36 | 0.682432 | eng_Latn | 0.977923 |
9b718200c8f00991c0da5ccdad65627858241e0b | 411 | md | Markdown | README.md | iwenli/CnBlogSubscribeTool | c94b30b8d3aba0062a0b28d41e7243b643232f42 | [
"MIT"
] | 12 | 2018-02-13T13:07:09.000Z | 2021-04-27T07:54:17.000Z | README.md | iwenli/CnBlogSubscribeTool | c94b30b8d3aba0062a0b28d41e7243b643232f42 | [
"MIT"
] | null | null | null | README.md | iwenli/CnBlogSubscribeTool | c94b30b8d3aba0062a0b28d41e7243b643232f42 | [
"MIT"
] | 12 | 2018-02-14T04:11:37.000Z | 2019-06-29T10:35:57.000Z | # CnBlogSubscribeTool
CnBlogSubscribeTool can crawl blog home page data at regular intervals.
## Config
- /Config/Mail.json
````
{
"Name": "CnBlogSubscribeTool",
"Address": "",
"Host": "",
"Port": 25,
"Password": "",
"ReceiveList": [
"[email protected]",
"[email protected]",
]
}
````
## Email
An email with yesterday's blog posts is sent at 9 o'clock every day
## Data
Get the data once every five minutes
| 14.172414 | 71 | 0.632603 | eng_Latn | 0.542749 |
9b71c76c8f66595a9f261746c52d31126ffbe2e8 | 877 | md | Markdown | README.md | Robot-Inventor/ORIZIN_Agent | 2f6cb4c0cea0081b0661a59497c9f03dd6209af2 | [
"MIT"
] | null | null | null | README.md | Robot-Inventor/ORIZIN_Agent | 2f6cb4c0cea0081b0661a59497c9f03dd6209af2 | [
"MIT"
] | null | null | null | README.md | Robot-Inventor/ORIZIN_Agent | 2f6cb4c0cea0081b0661a59497c9f03dd6209af2 | [
"MIT"
] | null | null | null | 
# Important!
Development of this repository has ended. It will not be updated any further, and pull requests, issues, etc. will not be accepted. Please do not use it, as the environments it runs in are very limited. Its equivalent successor is [ORIZIN Agent HTML](https://github.com/Robot-Inventor/ORIZIN-Agent-HTML). ORIZIN Agent HTML is more capable and performant than ORIZIN Agent, is far more convenient, is cross-platform, and also offers speech recognition and a modern UI. Note that, for compatibility reasons, there is no plan to delete this repository.
## v1.0.0.0-Mercury
This is the Python code of the AI assistant ORIZIN. Python 3 is required to run it.
To run this code you also need to download Open JTalk and the libraries it depends on separately. It has only been tested on a Raspberry Pi; for other environments (such as Windows) adapt the code as needed. (Support for everything except music playback is planned for v1.1.)
For usage and other details, see the [wiki](https://github.com/Robot-Inventor/ORIZIN_Agent/wiki) or the [official page](https://robot-inventor.github.io/ORIZIN_Agent/).
### Update history
2019/06/09 v1.0 released; the first stable version of ORIZIN Agent
2019/06/16 Changed the version numbering from v1.0 to v1.0.0.0
2019/06/16 Added a development code name to the version number
| 54.8125 | 314 | 0.831243 | yue_Hant | 0.893126 |
9b724d530fbdcfbd38981b77ba64e719f971a814 | 892 | md | Markdown | things-missing-in-rust.md | izelnakri/mber-rust | 82e62e1227205c3cabee4e0dbdd7431aace7f2d8 | [
"MIT"
] | 4 | 2019-11-11T17:31:41.000Z | 2021-06-11T03:08:01.000Z | things-missing-in-rust.md | izelnakri/mber-rust | 82e62e1227205c3cabee4e0dbdd7431aace7f2d8 | [
"MIT"
] | 2 | 2021-03-10T03:34:47.000Z | 2021-05-08T08:23:38.000Z | things-missing-in-rust.md | izelnakri/mber-rust | 82e62e1227205c3cabee4e0dbdd7431aace7f2d8 | [
"MIT"
] | null | null | null | - REPL
- Default function arguments
- HashMap literal macros (hashmap!{}, btreemap!{} etc) -> the maplit crate should be part of Rust
- test reporters
- test.serial?
- test afterEach
- npm scripts functionality (dynamically retrieve the version and write it into the code at compile time)
- Rust HashMaps are too basic: they have no assign method or literal syntax, which sometimes makes high-level code very hard to write!
Write something like this in rust:
```rust
Object.assign(EXAMPLE_ENV, {
APP: Object.assign(EXAMPLE_ENV.APP, {
autoboot: false,
name: EXAMPLE_ENV.modulePrefix,
version: "0.0.0+b5f80b0d"
})
})
```
- cargo test needs better filtering for unit tests, example: cargo test src/utils/recursive_file_lookup.rs
- better test diffs
- only very primitive types are allowed as const at compile time (a JSON const is not allowed, for example)
- recursive copy in standard library
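As an aside on the HashMap points above, here is roughly what the nested-update example looks like today with the `maplit` crate and plain `HashMap`s (an illustrative sketch only):
```rust
use maplit::hashmap;
use std::collections::HashMap;

fn main() {
    // Roughly the shape of the Object.assign example above, done by hand.
    let mut app: HashMap<&str, String> = hashmap! {
        "name" => "example-app".to_string(),
        "version" => "0.0.0".to_string(),
    };
    // There is no `assign`: every update is an explicit insert.
    app.insert("autoboot", "false".to_string());
    app.insert("version", "0.0.0+b5f80b0d".to_string());

    let env: HashMap<&str, HashMap<&str, String>> = hashmap! {
        "APP" => app,
    };
    println!("{:?}", env);
}
```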
CHECK:
| 34.307692 | 125 | 0.76009 | eng_Latn | 0.978563 |
9b7341db5a0f98937d1892152e8a055cc20b7a62 | 3,164 | md | Markdown | Evaluation/DevSum/CommunitySummaries/systems/H89-2014_summary.md | hazemalsaied/AutomatiqueSummarization | 1b8d903fba44d5c06233de0dc7363b43abe76b00 | [
"MIT"
] | 1 | 2020-02-02T18:47:05.000Z | 2020-02-02T18:47:05.000Z | Evaluation/DevSum/CommunitySummaries/systems/H89-2014_summary.md | hazemalsaied/AutomaticSummarization | 1b8d903fba44d5c06233de0dc7363b43abe76b00 | [
"MIT"
] | null | null | null | Evaluation/DevSum/CommunitySummaries/systems/H89-2014_summary.md | hazemalsaied/AutomaticSummarization | 1b8d903fba44d5c06233de0dc7363b43abe76b00 | [
"MIT"
] | null | null | null | This approach is similar in spirit to the iterative computational approaches of the Hidden Markov Models ( Kupiec , 1989
[Figure 1: Extending the Basic Model - transitions between the fully-connected basic network and the augmented NOUN/ADJECTIVE/DETERMINER states] Augmenting the Model by Use of Networks : The basic model consists of a first-order fully connected network .
The Basic Model The development of the model was guided by evaluation against a simple basic model ( much of the development of the model was prompted by an analysis of the errors in its behavior .
This has the disadvantage that they can not share training data .
Only context has been supplied to aid the training procedure , and the latter is responsible for deciding which alternative is more likely , based on the training data .
A pre-tagged training corpus is not required , and the tagger can cope with words not found in the training text .
The paper describes refinements that are currently being investigated in a model for part-of-speech assignment to words in unrestricted text.
The paper describes refinements that are currently being investigated in a model for part-of-speech assignment to words in unrestricted text .
To model such dependency across the phrase , the networks shown in Figure 2 can be used .
State chains are used to model selective higher-order conditioning in the model , which obviates the proliferation of parameters attendant in uniformly higher-order models .
In this regard , word equivalence classes were used ( Kupiec , 1989 ) .
In the 21 category model reported in Kupiec ( 1989 ) only 129 equivalence classes were required to cover a 30,000 word dictionary .
In a ranked list of words in the corpus the most frequent 100 words account for approximately 50 % of the total tokens in the corpus , and thus data is available to estimate them reliably .
The former represents 95.6 % correct word tagging on the text as a whole ( ignoring unknown words ) , and 89 % on the ambiguous words .
The basic model tagged these sentences correctly , except for- `` range '' and `` rises '' which were tagged as noun and plural-noun respectively 1 .
Critical examination of the tagging provided by the augmented model showed 168 word tagging errors , whereas the basic model gave 215 erroneous word tags .
In fact , the number of equivalence classes is essentially independent of the size of the dictionary , enabling new words to be added without any modification to the model .
A 30,000 word dictionary was used , supplemented by inflectional analysis for words not found directly in the dictionary .
Methods have ranged from locally-operating rules ( Greene and Rubin , 1971 ) , to statistical methods ( Church , 1989 ; DeRose , 1988 ; Garside , Leech and Sampson , 1987 ; Jelinek , 1985 ) and back-propagation ( Benello , Mackie and Anderson , 1989 ; Nakamura and Shikano , 1989 ) .
It would be appropriate to deal with idioms separately , as done by Gaxside , Leech and Sampson ( 1987 ) . | 158.2 | 403 | 0.788243 | eng_Latn | 0.999476 |
9b736d4e28b0d5c226ff47a12be6a1c8da994ad9 | 697 | md | Markdown | README.md | codecraft-peru/codecraft-peru.github.io | 4bca0c9b5e557129d20b3a081e496d10fbd491a5 | [
"MIT"
] | null | null | null | README.md | codecraft-peru/codecraft-peru.github.io | 4bca0c9b5e557129d20b3a081e496d10fbd491a5 | [
"MIT"
] | 1 | 2020-05-25T00:44:16.000Z | 2020-05-25T00:44:16.000Z | README.md | codecraft-peru/codecraft-peru.github.io | 4bca0c9b5e557129d20b3a081e496d10fbd491a5 | [
"MIT"
] | null | null | null | # Software Crafters Perú
Blog link: https://codecraft-peru.github.io/
## What do I need to publish a post?
>Clone the repository
```
$ git clone https://github.com/codecraft-peru/codecraft-peru.github.io.git
```
>Install the dependencies required by [jekyll](https://jekyllrb.com/)
```
$ bundle install
```
>Write a post using the files in the **_posts** folder as a template (for file name, structure and content)
```
$ ls _posts/
2020-05-21-hello-world.html
2020-05-24-the-beauty-of-craftsmanship.html
```
>To preview the blog locally
```
$ bundle exec jekyll serve
```
>To publish the post to the blog
```
$ git commit -m 'Add post about...'
$ git push
``` | 20.5 | 119 | 0.708752 | spa_Latn | 0.753742 |
9b73c0634ac6d00f6a80fc5ef4d5e587163d92f9 | 327 | md | Markdown | exampleSite/content/author/john-doe.md | Abbie95/persian-hugo | 5719d3b177e02fdcd69a0ab15bb66d7a773d34c8 | [
"MIT"
] | null | null | null | exampleSite/content/author/john-doe.md | Abbie95/persian-hugo | 5719d3b177e02fdcd69a0ab15bb66d7a773d34c8 | [
"MIT"
] | null | null | null | exampleSite/content/author/john-doe.md | Abbie95/persian-hugo | 5719d3b177e02fdcd69a0ab15bb66d7a773d34c8 | [
"MIT"
] | 1 | 2021-02-13T18:50:45.000Z | 2021-02-13T18:50:45.000Z | ---
title: Abigail Steffes
image: ''
email: [email protected]
social:
- icon: "/images/instagram-icon.png"
link: https://www.instagram.com/abigail_joy_photography/
- icon: ti-facebook
link: "#https://www.facebook.com/AbigailJoyPhotos"
- icon: ti-twitter-alt
link: "#"
---
Hello friend! I'm so happy you're here! | 23.357143 | 58 | 0.712538 | yue_Hant | 0.274987 |
9b741919d42dd6ecf8409d5a5436118ff3eb0266 | 5,437 | md | Markdown | README.md | viswavi/dataset-recommendation | 8193e5ad5f4bad25852b565e96d943530d307422 | [
"Apache-2.0"
] | null | null | null | README.md | viswavi/dataset-recommendation | 8193e5ad5f4bad25852b565e96d943530d307422 | [
"Apache-2.0"
] | null | null | null | README.md | viswavi/dataset-recommendation | 8193e5ad5f4bad25852b565e96d943530d307422 | [
"Apache-2.0"
] | null | null | null | # Dataset Recommendation
## Table of Contents
- [Dataset Recommendation](#dataset-recommendation)
- [Table of Contents](#table-of-contents)
- [Requirements](#requirements)
- [Ready-to-Use Dataset](#ready-to-use-dataset)
- [Data Preprocessing](#data-preprocessing)
- [Prepare Search Corpus](#prepare-search-corpus)
- [Prepare Test Data](#prepare-test-data)
- [Prepare Training Data](#prepare-training-data)
- [Retrieval](#retrieval)
- [BM25](#bm25)
- [k-NN (TF-IDF features)](#k-nn-tf-idf-features)
- [k-NN (BERT features)](#k-nn-bert-features)
- [Bi-Encoder (Tevatron)](#bi-encoder-tevatron)
- [Training](#training)
- [Retrieval](#retrieval-1)
- [Evaluate results](#evaluate-results)
- [Core Metrics](#core-metrics)
- [Bucketing by Dataset Frequency](#bucketing-by-dataset-frequency)
- [Evaluating data quality](#evaluating-data-quality)
- [Labeling tool](#labeling-tool)
## Requirements
```
pytorch >= 1.8.2
```
```
pip install paperswithcode-client
pip install pyserini
pip install tevatron
conda install faiss
```
```
git submodule add https://github.com/castorini/anserini
cd anserini
cd tools/eval && tar xvfz trec_eval.9.0.4.tar.gz && cd trec_eval.9.0.4 && make && cd ../../..
cd tools/eval/ndeval && make && cd ../../../..
```
# Ready-to-Use Dataset
Found in `data/`. Both training and test data contain "tldr", "positives", and "year" for each query. The training set contains other metadata (such as hard negatives and detailed metadata about the paper we used to extract the query).
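To peek at these records, a minimal sketch is shown below (only the field names mentioned above are assumed):
```python
import json

# Print a few fields from each training query in the JSON-lines file.
with open("data/train_data.jsonl") as f:
    for line in f:
        record = json.loads(line)
        print(record["year"], record["tldr"][:80], len(record["positives"]))
```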
# Data Preprocessing
Download and untar data from https://github.com/allenai/SciREX/blob/master/scirex_dataset/release_data.tar.gz.
### Prepare Search Corpus
Download and unzip the `datasets` data from https://github.com/paperswithcode/paperswithcode-data, and place into `data/`.
`python data_processing/build_search_corpus/generate_datasets_collection.py --exclude-abstract --exclude-full-text --output-file data/dataset_search_collection.jsonl`
### Prepare Test Data
**Prepared test data can be found at `data/test_data.jsonl`.**
To reproduce this data (or to customize the test set), run the following commands:
```
mkdir intermediate_data
export PICKLES_DIRECTORY=intermediate_data
./add_tldrs_to_scirex_abstracts.sh
python data-processing/test_data/convert_scirex.py \
--scirex-directory $SCIREX_PATH/scirex_dataset/release_data/ \
--dataset-search-collection data/dataset_search_collection.jsonl \
--datasets-file datasets.json \
--scirex-to-s2orc-metadata-file $PICKLES_DIRECTORY/scirex_id_to_s2orc_metadata_with_tldrs.pkl \
--output-relevance-file data/test_dataset_collection.qrels \
--output-queries-file test_queries.csv \
--output-combined-file data/test_data.jsonl \
--training-set-documents tagged_dataset_positives.jsonl \
--bad-query-filter-map bad_tldrs_mapping.json
```
### Prepare Training Data
**Prepared training data can be found at `data/train_data.jsonl`.**
To reproduce this data (or to customize the training set), see the [training data preparation instructions](data_processing/train_data/README.md).
# Retrieval
## BM25
See [BM25 retrieval and index construction instructions](retrieval/bm25/README.md).
## k-NN (TF-IDF features)
```
python retrieval/knn/generate_results.py \
--remove-punctuation \
--remove-stopwords \
--training-set data/train_data.jsonl \
--training-tldrs data/train_tldrs.hypo \
--search-collection anserini_search_collections/dataset_search_collection_no_abstracts_or_paper_text/documents.jsonl \
--output-file retrieved_documents_knn_tfidf.trec \
--vectorizer-type tfidf \
--results-limit 5
```
## k-NN (BERT features)
```
python retrieval/knn/generate_results.py \
--remove-punctuation \
--remove-stopwords \
--query-metadata data/test/scirex_queries_and_datasets.json \
--training-set data/train_data.jsonl \
--training-tldrs data/train_tldrs.hypo \
--search-collection anserini_search_collections/dataset_search_collection_no_abstracts_or_paper_text/documents.jsonl \
--output-file data/test/retrieved_documents_knn_exact_bert.trec \
--vectorizer-type bert \
--results-limit 5
```
## Bi-Encoder (Tevatron)
### Training
Neural "bi-encoder" retrievers must be trained on our training data. See [biencoder training instructions](retrieval/biencoder/tevatron_scripts/README.md#Training).
### Retrieval
See [biencoder retrieval instructions](retrieval/biencoder/tevatron_scripts/README.md#Retrieval).
# Evaluate results
## Core Metrics
```
GOLD_FILE=data/test_dataset_collection.qrels
# Set RETRIEVAL_OUTPUT to the desired file. Example provided:
RETRIEVAL_OUTPUT=data/test/retrieved_documents_knn_exact_bert.trec
./anserini/tools/eval/trec_eval.9.0.4/trec_eval \
-c \
-m P.5 \
-m recall.5 \
-m map \
-m recip_rank \
$GOLD_FILE \
$RETRIEVAL_OUTPUT
```
### Bucketing by Dataset Frequency
```
GOLD_FILE=data/test_dataset_collection.qrels
# Set RETRIEVAL_OUTPUT to the desired file. Example provided:
RETRIEVAL_OUTPUT=data/test/retrieved_documents_knn_exact_bert.trec
python data_analysis/evaluate_dataset_recall_buckets.py $GOLD_FILE $RETRIEVAL_OUTPUT
```
## Evaluating data quality
### Labeling tool
This tool was used to validate the quality of labels in our training set:
`python data_processing/train_data/label_dataset_sentences.py --labeler-name <your name> --range-to-label 1,200`.
| 35.077419 | 235 | 0.751701 | eng_Latn | 0.240658 |
9b7506ddde3df31e058b2464e50105e82728cbff | 319 | md | Markdown | README.md | bielversallini/trality-labs | f3fb5e04e586595aed1f2094a32be2219eb8edf6 | [
"MIT"
] | 1 | 2021-11-17T19:57:45.000Z | 2021-11-17T19:57:45.000Z | README.md | bielversallini/trality-labs | f3fb5e04e586595aed1f2094a32be2219eb8edf6 | [
"MIT"
] | null | null | null | README.md | bielversallini/trality-labs | f3fb5e04e586595aed1f2094a32be2219eb8edf6 | [
"MIT"
] | null | null | null | # trality-labs
## What's Trality?
Trality is a platform for anybody who wants to profit from algorithmic trading without giving up their day job.
At least not right away 😉
## What's the purpose of this code?
Provide a basic structure to create a trading bot for Trality. It's not a real trading plan, just a sample! | 39.875 | 112 | 0.761755 | eng_Latn | 0.999847 |
9b758403c40f88adfd489f33998d104a6bdb7006 | 1,501 | md | Markdown | docs/extensibility/debugger/reference/idebugpropertydestroyevent2-getdebugproperty.md | rfakhouri/visualstudio-docs.cs-cz | 3d540a168c09a23b855f746696062fd9954b8dd5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/extensibility/debugger/reference/idebugpropertydestroyevent2-getdebugproperty.md | rfakhouri/visualstudio-docs.cs-cz | 3d540a168c09a23b855f746696062fd9954b8dd5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/extensibility/debugger/reference/idebugpropertydestroyevent2-getdebugproperty.md | rfakhouri/visualstudio-docs.cs-cz | 3d540a168c09a23b855f746696062fd9954b8dd5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: IDebugPropertyDestroyEvent2::GetDebugProperty | Dokumentace Microsoftu
ms.date: 11/04/2016
ms.topic: reference
f1_keywords:
- IDebugPropertyDestroyEvent2::GetDebugProperty
helpviewer_keywords:
- IDebugPropertyDestroyEvent2::GetDebugProperty
ms.assetid: c96ae785-0ac8-4df4-8df3-15a8d7e13687
author: madskristensen
ms.author: madsk
manager: jillfra
ms.workload:
- vssdk
dev_langs:
- CPP
- CSharp
ms.openlocfilehash: b58a7a3463c579ece09a6bb185066a4e556f2f06
ms.sourcegitcommit: 40d612240dc5bea418cd27fdacdf85ea177e2df3
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 05/29/2019
ms.locfileid: "66348718"
---
# <a name="idebugpropertydestroyevent2getdebugproperty"></a>IDebugPropertyDestroyEvent2::GetDebugProperty
Získá vlastnost, který se má zničit.
## <a name="syntax"></a>Syntaxe
```cpp
HRESULT GetDebugProperty (
IDebugProperty2** ppProperty
);
```
```csharp
int GetDebugProperty (
out IDebugProperty2 ppProperty
);
```
## <a name="parameters"></a>Parametry
`ppProperty`\
[out] Vrátí [IDebugProperty2](../../../extensibility/debugger/reference/idebugproperty2.md) objekt, který reprezentuje vlastnost, který se má zničit.
## <a name="return-value"></a>Návratová hodnota
Pokud je úspěšná, vrátí `S_OK`; v opačném případě vrátí kód chyby.
## <a name="see-also"></a>Viz také:
- [IDebugPropertyDestroyEvent2](../../../extensibility/debugger/reference/idebugpropertydestroyevent2.md)
- [IDebugProperty2](../../../extensibility/debugger/reference/idebugproperty2.md) | 29.431373 | 149 | 0.783478 | ces_Latn | 0.319572 |
9b75ee65d2daa583e01d64132f8381f758920616 | 316 | md | Markdown | README.md | tkralphs/PyQueueSim | 47c8bdbb41e3ab1f9b3b0bb10c700f3950cee6b9 | [
"MIT"
] | 13 | 2017-08-15T22:29:04.000Z | 2022-02-27T04:26:42.000Z | README.md | tkralphs/PyQueueSim | 47c8bdbb41e3ab1f9b3b0bb10c700f3950cee6b9 | [
"MIT"
] | null | null | null | README.md | tkralphs/PyQueueSim | 47c8bdbb41e3ab1f9b3b0bb10c700f3950cee6b9 | [
"MIT"
] | 15 | 2017-05-15T02:51:36.000Z | 2022-02-22T06:05:34.000Z | # PyQueueSim
A simple discrete event simulation of an M/M/s queuing system in Python,
including a visualization that uses pygame. This code is used as an example in
the laboratory of a course on algorithms in Python at Lehigh university. The
course Web site is here:
http://coral.ie.lehigh.edu/~ted/teaching/ie172
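For readers new to the model, here is a minimal M/M/s sketch; it is independent of
this repository's actual implementation and visualization, and every name in it is
illustrative:
```python
import heapq
import random

def simulate_mms(arrival_rate, service_rate, servers, customers, seed=0):
    """Estimate the average waiting time in an M/M/s queue (toy example)."""
    rng = random.Random(seed)
    free_at = [0.0] * servers   # time at which each server next becomes free
    heapq.heapify(free_at)
    t = 0.0
    total_wait = 0.0
    for _ in range(customers):
        t += rng.expovariate(arrival_rate)    # exponential inter-arrival times
        earliest = heapq.heappop(free_at)     # server that frees up first
        start = max(t, earliest)              # customer waits if all servers are busy
        total_wait += start - t
        heapq.heappush(free_at, start + rng.expovariate(service_rate))
    return total_wait / customers

print(simulate_mms(arrival_rate=0.9, service_rate=1.0, servers=1, customers=100000))
```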
| 35.111111 | 78 | 0.791139 | eng_Latn | 0.998939 |
9b76452eee3bd9a56b3b43cfe03a28c09876a7d3 | 156 | md | Markdown | src/pages/markdown/new.md | ruderude/Gatsby-test | 0e06f006f156810274cff5b84df9519a3ec9f784 | [
"RSA-MD"
] | null | null | null | src/pages/markdown/new.md | ruderude/Gatsby-test | 0e06f006f156810274cff5b84df9519a3ec9f784 | [
"RSA-MD"
] | null | null | null | src/pages/markdown/new.md | ruderude/Gatsby-test | 0e06f006f156810274cff5b84df9519a3ec9f784 | [
"RSA-MD"
] | null | null | null | ---
title: "New.md"
date: "2021-03-21"
---
Do Pandas eat bananas? Check out this short video that shows that yes! pandas do seem to really enjoy bananas!
| 19.5 | 110 | 0.705128 | eng_Latn | 0.991848 |
9b7645ccd51495ab96ea44326f7a983934c4e597 | 4,085 | markdown | Markdown | _posts/2017-07-20-learn-node-03-day.markdown | BingCool852/BingCool852.github.io | b48631dfdef4fafbef84299cec228d0fae007ae3 | [
"MIT"
] | 1 | 2016-10-10T13:48:30.000Z | 2016-10-10T13:48:30.000Z | _posts/2017-07-20-learn-node-03-day.markdown | BingCool852/BingCool852.github.io | b48631dfdef4fafbef84299cec228d0fae007ae3 | [
"MIT"
] | 3 | 2021-03-29T16:38:30.000Z | 2022-02-26T01:16:47.000Z | _posts/2017-07-20-learn-node-03-day.markdown | BingCool852/BingCool852.github.io | b48631dfdef4fafbef84299cec228d0fae007ae3 | [
"MIT"
] | null | null | null | ---
author: tangliangcheng
comments: true
date: 2017-07-17 23:21:00+00:00
layout: post
slug: learn node 03 day
title: 《深入浅出NodeJs》学习笔记 03 day
excerpt: Node异步编程是什么?异步编程方案?
categories:
- 技术分享
---
## Functional programming
### Higher-order functions
A higher-order function is one that takes a function as an argument, or returns a function as its result, for example:
```js
function foo(x) {
return function () {
return x
}
}
```
### Partial functions
Producing a new, customized function by fixing some of its arguments in advance is what a partial function is:
```js
var isType = function (type) {
return function (obj) {
    return Object.prototype.toString.call(obj) === '[object ' + type + ']'
}
}
var isString = isType('String')
var isFunction = isType('Function')
```
## Solutions for asynchronous programming
**The biggest feature Node brings is its event-driven, non-blocking I/O model; that is its soul.**
Non-blocking I/O lets the CPU and I/O avoid waiting on each other, so resources are used more effectively. For network applications, parallelism opens up far more possibilities, extending naturally into distributed systems and the cloud. Parallelism lets individual nodes organise with each other much more effectively, which is also why Node is so popular with cloud-computing vendors.

* Difficulty 1: exception handling
Node has formed a convention for exceptions: the error is passed back as the first argument of the callback, and if it is null the asynchronous call threw no exception. When writing our own asynchronous methods we should follow the same principles:
* Principle 1: the callback passed in by the caller must be invoked
* Principle 2: exceptions must be passed back correctly so the caller can judge them
* Difficulty 2: deeply nested functions
* Difficulty 3: blocking code
* Difficulty 4: multi-threaded programming
* Difficulty 5: turning asynchronous code into synchronous code
### The three main solutions
* The event publish/subscribe pattern
* The Promise/Deferred pattern
* Flow-control libraries
#### The event publish/subscribe pattern
The publish/subscribe pattern associates one event with multiple callbacks, which are called event listeners. After an event is published with `emit()`, the message is immediately delivered to all listeners of that event. Listeners can be added and removed flexibly, so events and the logic that handles them stay loosely coupled.
The pattern itself says nothing about synchronous versus asynchronous calls, but in Node `emit()` is usually triggered asynchronously along with the event loop, which is why publish/subscribe is so widely used in asynchronous programming.
Publish/subscribe is often used to decouple business logic: the publisher does not care how the subscribed listeners implement their logic, or even how many listeners exist, and data is passed around flexibly as messages. In typical scenarios the pattern can be used to encapsulate components, keeping the stable parts inside the component and exposing the variable, customizable parts through events; this is a classic way of separating logic. In such publish/subscribe components the design of the events matters a great deal, because it determines how elegant the component is to call from the outside; in a sense the event design is the component's interface design.
Seen from another angle, the listener pattern is also a kind of `hook`, using hooks to expose internal data or state to outside callers.
##### Using an event queue to solve the avalanche problem
In the publish/subscribe pattern there is usually also a `once()` method; a listener added through it runs only once and is detached from the event right after it executes. This is often used to filter out duplicate responses to the same event.
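A minimal sketch of that idea, collapsing duplicate queries with `once()` (illustrative only; `db` is a placeholder connection, not code from the book):
```js
var events = require('events');
var proxy = new events.EventEmitter();
var status = 'ready';

var select = function (callback) {
  // Every caller subscribes exactly once; only the first call hits the database.
  proxy.once('selected', callback);
  if (status === 'ready') {
    status = 'pending';
    db.select('SQL', function (err, results) {
      proxy.emit('selected', err, results);
      status = 'ready';
    });
  }
};
```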
##### Coordinating multiple asynchronous operations
When several asynchronous operations are involved, the order in which their callbacks run is not guaranteed and the callbacks share nothing with each other, so a third-party function and a third-party variable are needed to collect the results of the collaboration. The variable used to count how many operations have completed is usually called a sentinel variable.
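A sketch of such a sentinel-based helper (illustrative; `fs` and `db` are placeholders):
```js
var after = function (times, callback) {
  var count = 0;
  var results = {};
  return function (key, value) {
    results[key] = value;
    count++;
    if (count === times) {
      callback(results); // fires only after every asynchronous task has reported back
    }
  };
};

var done = after(2, function (results) {
  console.log(results);
});
fs.readFile('template.tpl', 'utf8', function (err, tpl) { done('template', tpl); });
db.query('SELECT * FROM users', function (err, rows) { done('data', rows); });
```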
#### The Promise/Deferred pattern
##### How Promise and Deferred relate overall

A Promise is a high-level interface and events are a low-level interface. Low-level interfaces can be composed into more and richer scenarios; a high-level interface, once defined, is hard to change and loses the flexibility of the low-level one, but it is very effective for solving typical problems.
**Something to think about: Promise mainly solves the problems inside a single asynchronous operation. What should we do when several asynchronous calls have to be handled together?**
```js
var p1 = readFile('foo.txt', 'utf-8')
var p2 = readFile('bar.txt', 'utf-8')
var deferred = new Deferred()
deferred.all([p1, p2]).then (function(results) {
//TODO
}, function(err) {
//TODO
})
```
The `all()` method abstracts over several asynchronous operations: the combined operation only counts as successful when every one of them succeeds, and it fails as soon as any one of them fails.
##### Promises that support sequential execution
The ideal programming experience is for the result of the previous call to become the start of the next one, the legendary chained call:
```js
promise()
.then(obj.api1)
.then(obj.api2)
.then(obj.api3)
.then(obj.api4)
.then(function (value4) {
//TODO value4
}, function(error) {
//TODO error from step1 through setp4
})
.done()
```
Making promises support chained execution mainly takes the following two steps:
* store all callbacks in a queue
* when the promise completes, run the callbacks one by one; as soon as one returns a new promise object, stop, point the current Deferred object's promise reference at the new promise, and hand the remaining callbacks in the queue over to it.
#### Flow-control libraries
##### Tail triggering and next
There is a class of methods that must be called by hand for the follow-up calls to keep running; we call this tail triggering. The usual keyword is next(). It is used most heavily in Connect middleware.

##### async
* series: runs a group of tasks serially, suitable for independent asynchronous work
* parallel: runs asynchronous operations in parallel
* waterfall: runs asynchronous operations that depend on each other
* auto: analyses dependencies automatically and runs the operations accordingly
##### step
Accepts any number of tasks; all of them are executed serially, one after another.
##### wind
#### Flow control, summarised
The publish/subscribe pattern is a relatively primitive approach, while the `Promise/Deferred` pattern contributes a very nice abstraction of the asynchronous task model. The flow-control approaches above take a different angle from `Promise/Deferred`: Promise/Deferred focuses on wrapping the asynchronous call itself, whereas flow-control libraries are less about a fixed pattern and put the emphasis on how callbacks are injected. In terms of freedom, libraries such as `async` and `step` are considerably more flexible. The `EventProxy` library mainly borrows from the publish/subscribe pattern and from the way flow-control libraries generate callbacks with higher-order functions.
### Controlling asynchronous concurrency
**Scenario:** when concurrency gets too high, the servers underneath cannot keep up; with a huge number of concurrent file-system calls, the operating system's file descriptors are exhausted in an instant and an error such as `Error: EMFILE, too many open files` is thrown. This shows the clear difference between asynchronous and synchronous I/O: with synchronous I/O each operation blocks the next, so inside a loop the calls happen one after another, file descriptors are never exhausted, and performance is also poor; with asynchronous I/O concurrency is easy to achieve, and precisely because it is so easy it still needs to be controlled. In other words, even while squeezing performance out of the underlying system, some overload protection has to be provided.
### bagpipe
* a queue is used to control the amount of concurrency
* if the number of currently active asynchronous calls (started but not yet called back) is below the limit, take a call from the queue and run it
* if the number of active calls has reached the limit, the call is parked in the queue for the time being
* whenever an asynchronous call finishes, take a new call from the queue and run it
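A stripped-down sketch of this queue-based limiter (not bagpipe's real API):
```js
function Limiter(limit) {
  this.limit = limit;   // maximum number of active (not yet called back) tasks
  this.active = 0;
  this.queue = [];
}
Limiter.prototype.push = function (task) {
  this.active < this.limit ? this.run(task) : this.queue.push(task);
};
Limiter.prototype.run = function (task) {
  var self = this;
  self.active++;
  task(function () {    // the task invokes done() once its own callback has fired
    self.active--;
    if (self.queue.length) self.run(self.queue.shift());
  });
};
```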
### async
* parallelLimit: limits the number of concurrent asynchronous calls
* queue: parallel tasks can be added dynamically
| 23.888889 | 253 | 0.760343 | yue_Hant | 0.679667 |
9b79b62307b493ece26db4ab10f46885ae7d3fff | 4,098 | md | Markdown | README.md | Zeruelrojo/ucopgun-alska | 06563cbd030b2a19f74905db12892256b7a6eb54 | [
"MIT"
] | null | null | null | README.md | Zeruelrojo/ucopgun-alska | 06563cbd030b2a19f74905db12892256b7a6eb54 | [
"MIT"
] | null | null | null | README.md | Zeruelrojo/ucopgun-alska | 06563cbd030b2a19f74905db12892256b7a6eb54 | [
"MIT"
] | null | null | null | # ucopgun-alska
Información útil generada a partir de las facturas electrónicas XML proporcionadas por el SAT (Sistema de administración tributaria) de México para propósitos de declaración de impuestos tanto mensual como anual.
primera version (iniciando)
INSTALACIÓN:
- descargue e instale Node JS en su computadora en la version 12 de preferencia (https://nodejs.org/es/).
- dentro del instalador se encuentra el programa npm que se instala junto a Node JS, este es necesario.
- descargue el programa (boton color verde en esta misma pagina [https://github.com/Zeruelrojo/ucopgun-alska])
- opcionalmente puede utilizar git y clonar el repositorio con el siguiente comando de emulador de terminal o cmd:
git clone https://github.com/Zeruelrojo/ucopgun-alska
- una vez descargado en Windows abra cmd (con su teclado escriba WIN+R al mismo tiempo y luego escriba "cmd" sin las comillas)
- en Linux utilice el emulador de terminal de su preferencia.
- navegue a traves de las carpetas utilizando el comando cd, por ejemplo:
- para Windows "cd C:\ejemplo1\ejemplo2\ejemplo3"
- para Linux "cd /ejemplo1/ejemplo2/ejemplo3"
- sitúese en la ubicación del programa (dentro de la carpeta del mismo) y escriba el siguiente comando:
npm install
- observará que iniciara un proceso, por favor espere a que concluya el proceso, si se interrumpe solamente vuelva a ingresar el mismo comando (npm install)
EJECUCIÓN
- una vez concluido, en la misma carpeta agregue una carpeta llamada "temp" y otra llamada "xml" y dentro de "xml" COPIE sus facturas xml (enfasis en copiar, puede cometer el error en mover las facturas a esta carpeta).
- para iniciar el procesamento de las facturas ejecute los siguientes comandos (considere que XXXXX es su RFC SIN HOMOCLAVE):
- EN LINUX:
- export ucopgun_alska_RFC=XXXXX
- EN WINDOWS:
- set ucopgun_alska_RFC=XXXXX
- npm run start
- el programa generará un archivo llamado "informe.json", dicho archivo se puede abrir con un simple bloc de notas.
DESCRIPCIÓN DEL INFORME:
apartado anual:
- se denota por tener solamente el numero del año, este es un acumulado de suma anual de todas las facturas del año, no considera años anteriores.
```json
"2020": {
"Ingresos_Brutos": xxxx.xxxx,
"Gastos_Brutos": xxxx.xxxx,
"totalIVATrasladado": xxxx.xxxx,
"totalISRTrasladado": xxxx,
"totalIVARetenido": xxxx,
"totalISRRetenido": xxxx
}
```
apartado historial:
- son las facturas en si, ordenadas en orden descendente respecto al tiempo
- tambien considere que el rol se refiere a emisor o receptor de factura, en este caso es receptor
```json
"historial": [
{
"Fecha": "2020-01-01T00:00:01",
"NoCertificado": "xxxxxxxxxxxxxxxxxxxxx",
"Moneda": "xxx",
"FormaPago": "xx",
"MetodoPago": "xxx",
"TipoDeComprobante": "x",
"Total": xxx,
"SubTotal": xxx,
"Rol": "RECEPTOR",
"totalIVATrasladado": xxx
},
...
```
apartado mensual:
- notese que contiene concatenado el año y el mes en cuestión, contiene ademas de las caracteristicas de la anualidad, el acumulado hasta la fecha en cuestion, en este caso es enero del 2020.
```json
"2020_Enero": {
"Ingresos_Brutos": xxx,
"Ingresos_del_año_hasta_ahora": xxx,
"Gastos_Brutos": xxx,
"Gastos_del_año_hasta_ahora": xxx,
"totalIVATrasladado": xxx,
"totalISRTrasladado": xxx,
"totalIVARetenido": xxx,
"totalISRRetenido": xxx,
"totalIVATrasladado_del_año_hasta_ahora": xxx,
"totalISRTrasladado_del_año_hasta_ahora": xxx,
"totalIVARetenido_del_año_hasta_ahora": xxx,
"totalISRRetenido_del_año_hasta_ahora": xxx
}
```
El programa se encuentra aun en pruebas, lo expongo con el proposito de mejorar el programa y facilitar a los usuarios no experimentados en contabilidad a realizar declaraciones anuales y mensuales.
Queda bajo la responsabilidad del usuario el uso de este programa y queda advertido de su inestabilidad y la posibilidad de la generación de información no confiable.
Preguntas, sugerencias, comentarios, reporte de errores, xyz mi contacto es el siguiente:
[email protected]
espero apoyar en lo posible. | 43.136842 | 219 | 0.760127 | spa_Latn | 0.987827 |
9b79dcc5e354d847448eae44e8f3881dd034466d | 301 | md | Markdown | README.md | wiladog/wiladog.github.io | a462628b77f1e63028537b8305f48b54e6980a52 | [
"Apache-2.0"
] | null | null | null | README.md | wiladog/wiladog.github.io | a462628b77f1e63028537b8305f48b54e6980a52 | [
"Apache-2.0"
] | null | null | null | README.md | wiladog/wiladog.github.io | a462628b77f1e63028537b8305f48b54e6980a52 | [
"Apache-2.0"
] | null | null | null | ## Wiladog Blog
### [View Live Blog →](https://wiladog.github.io/)
## 说明
网站的图文均为原作者创做,暂未删除,稍后会更新,请见谅!
## License
Apache License 2.0.
Copyright (c) 2017-2018 wiladog
This blog is derived from [Huxpro Jekyll Themes](https://github.com/Huxpro/huxpro.github.io/)
Copyright (c) 2015-2016 Huxpro
| 18.8125 | 93 | 0.72093 | kor_Hang | 0.23012 |
9b7a8970d1b42e0bc84b22eaaaa767ad6bfb65a4 | 1,437 | md | Markdown | docs/nine-bit.md | commtech/pyserialfc | 8fd9846eb506fb6a84cf5eacd4d37024614b8af4 | [
"MIT"
] | null | null | null | docs/nine-bit.md | commtech/pyserialfc | 8fd9846eb506fb6a84cf5eacd4d37024614b8af4 | [
"MIT"
] | null | null | null | docs/nine-bit.md | commtech/pyserialfc | 8fd9846eb506fb6a84cf5eacd4d37024614b8af4 | [
"MIT"
] | null | null | null | # 9-Bit Protocol
Enabling 9-Bit protocol has a couple of effects.
- Transmitting with 9-bit protocol enabled automatically sets the 1st byte's 9th bit to MARK, and all remaining bytes's 9th bits to SPACE.
- Receiving with 9-bit protocol enabled will return two bytes per each 9-bits of data. The second of each byte-duo contains the 9th bit.
###### Code Support
| Code | Version |
| ---- | ------- |
| serialfc-windows | 2.2.0 |
| serialfc-linux | 2.0.0 |
| pyserialfc | 1.1.0 |
###### Card Support
| Card Family | Supported |
| ----------- |:-----:|
| FSCC (16C950) | Yes |
| Async-335 (17D15X) | No |
| Async-PCIe (17V35X) | No |
## Property
```python
nine_bit = property(...)
```
## Get
| Exception | Base Exception | Cause |
| ----------- | -----:| ----- |
| `AttributeError` | | Not supported on this family of cards |
###### Examples
```python
import serialfc
...
status = p.nine_bit
```
## Enable
| Exception | Base Exception | Cause |
| ----------- | -----:| ----- |
| `AttributeError` | | Not supported on this family of cards |
###### Examples
```python
import serialfc
...
p.nine_bit = True
```
## Disable
| Exception | Base Exception | Cause |
| ----------- | -----:| ----- |
| `AttributeError` | | Not supported on this family of cards |
###### Examples
```python
import serialfc
...
p.nine_bit = False
```
### Additional Resources
- Complete example: [`examples/nine-bit.py`](../examples/nine-bit.py)
| 19.16 | 138 | 0.604732 | eng_Latn | 0.927062 |
9b7c0a33646c2d2da5439933e4dc1d5e559faad8 | 959 | md | Markdown | README.md | mchmarny/github-teams-utils | 7907f6366483be2b0f4169a784dc14f04f199d75 | [
"Apache-2.0"
] | 2 | 2018-12-18T17:45:09.000Z | 2019-04-14T18:34:16.000Z | README.md | mchmarny/ghutils | 7907f6366483be2b0f4169a784dc14f04f199d75 | [
"Apache-2.0"
] | 1 | 2018-03-19T15:44:14.000Z | 2018-03-19T15:44:14.000Z | README.md | mchmarny/github-teams-utils | 7907f6366483be2b0f4169a784dc14f04f199d75 | [
"Apache-2.0"
] | 1 | 2018-10-24T18:48:23.000Z | 2018-10-24T18:48:23.000Z | # ghme
Alfred workflow for navigating to repositories.

## Install
Download the [latest release of the workflow](https://github.com/mchmarny/ghme/releases/latest) and open it with Alfred.
## Configuration
Create [Personal access tokens](https://github.com/settings/tokens) with sufficient rights to query API for list of all public and private repositories.
> Check only `public_repo` box if you want to scope your searches to only public directories

Once you have the token, run `repo-config` to enter the token and cache your repos
> `ghme` does not store your token so to refresh the list of locally cached repos you will have to re-run this step

## Workflow

## Disclaimer
This is my personal project and it does not represent my employer. While I do my best to ensure that everything works, I take no responsibility for issues caused by this code.
| 25.236842 | 175 | 0.750782 | eng_Latn | 0.997254 |
9b7cd9eeea0b3ae6eaf29c406737756b110d481a | 14 | md | Markdown | README.md | Mortoc/GGJ18 | 38979a3c903fff58abfe8bcb3d7a052ee0b9ca0e | [
"Unlicense"
] | 1 | 2018-01-27T20:02:04.000Z | 2018-01-27T20:02:04.000Z | README.md | Mortoc/GGJ18 | 38979a3c903fff58abfe8bcb3d7a052ee0b9ca0e | [
"Unlicense"
] | null | null | null | README.md | Mortoc/GGJ18 | 38979a3c903fff58abfe8bcb3d7a052ee0b9ca0e | [
"Unlicense"
] | null | null | null | # GGJ18
GGJ18
| 4.666667 | 7 | 0.714286 | vie_Latn | 0.239991 |
9b7d1e320c5b5d8c0581b4b605bf7241de176e0f | 1,293 | md | Markdown | README.md | romanpunia/liquidbtc | 3b5527fce4c4c1d9fb2009c2556a9c014ce827c3 | [
"MIT"
] | null | null | null | README.md | romanpunia/liquidbtc | 3b5527fce4c4c1d9fb2009c2556a9c014ce827c3 | [
"MIT"
] | null | null | null | README.md | romanpunia/liquidbtc | 3b5527fce4c4c1d9fb2009c2556a9c014ce827c3 | [
"MIT"
] | null | null | null | # Liquid BTC
Quick and simple short-term trend analyzer for Liquid Exchange
### Application
This tool can be used to predict next price change of selected currency pair. All Liquid's pairs are supported.
It was written in about 40 minutes. I have tested it in live mode with real money, it's quite good.
I needed this kind of tool to speed up order book lookup to select best buy/sell price.
#### PAIR
Shows current selected currency pair.
#### HIGH
Current highest sell price order.
#### OFFER
Current price level for this pair.
#### LOW
Current lowest buy price order.
#### TRND (UP/DOWN)
if the next price rises with a high probability, the name changes to green UP, if the probability of growth is less, then the UP will be red.
Otherwise, if the next price falls with a high probability, the name changes to red DOWN, if the probability of a fall is less, then DOWN will be green.
#### LAT
Latency between api calls to Liquid, defaults to 2000 ms. Max value is 5000 ms, min is 100 ms.
Also, if there are errors with the network or data, the color of LAT changes to red.
#### SHORT/LONG
In short mode app calculates price based on first buy/sell orders which gives best short-term prediction.
Otherwise, first 20 buy and first 20 sell orders will be used to calculate price change. | 41.709677 | 152 | 0.761021 | eng_Latn | 0.998957 |
9b7d3595bc7818dba7a51659b481a196a6b89298 | 123 | md | Markdown | MS-SHELL/doc.md | splitsushi/x86twk | 44ca0752e58847a8245acbc70c48ec46cf810a43 | [
"MIT"
] | null | null | null | MS-SHELL/doc.md | splitsushi/x86twk | 44ca0752e58847a8245acbc70c48ec46cf810a43 | [
"MIT"
] | null | null | null | MS-SHELL/doc.md | splitsushi/x86twk | 44ca0752e58847a8245acbc70c48ec46cf810a43 | [
"MIT"
] | null | null | null | # doc to reformat
[Top for Win sytem](https://superuser.com/questions/176624/linux-top-command-for-windows-powershell)
| 30.75 | 103 | 0.764228 | yue_Hant | 0.215717 |
9b7d506761abe146d61e20f012f154020c3afd6d | 770 | md | Markdown | packages/components/src/bill/README.md | mand-mobile/mand-mobile-3 | 8d251c6aadda8bd4d4a1adffae283296658f2f79 | [
"Apache-2.0"
] | 7 | 2021-08-19T09:18:01.000Z | 2022-01-24T12:15:01.000Z | packages/components/src/bill/README.md | mand-mobile/mand-mobile-3 | 8d251c6aadda8bd4d4a1adffae283296658f2f79 | [
"Apache-2.0"
] | 1 | 2022-03-14T07:01:41.000Z | 2022-03-14T07:01:41.000Z | packages/components/src/bill/README.md | mand-mobile/mand-mobile-3 | 8d251c6aadda8bd4d4a1adffae283296658f2f79 | [
"Apache-2.0"
] | 2 | 2021-08-20T06:22:51.000Z | 2022-02-09T02:41:36.000Z | ---
title: Bill 票据
preview: {
web: 'https://mand-mobile.github.io/mand-mobile-3/examples/#/bill',
uni: 'https://pt-starimg.didistatic.com/static/starimg/img/0TVOUrmmP51628599087432.png'
}
---
电子账单或票据
## 引入
```javascript
import { Bill } from 'mand-mobile'
Vue.component(Bill.name, Bill)
```
### 代码演示
<MDDemoWrapper>
<!-- left wrapper -->
{{{ @/packages/components/src/bill/demo/cases/demo0.vue
<!-- right wrapper -->
}}} @/packages/components/src/bill/demo/cases/demo1.vue
</MDDemoWrapper>
### API
### Bill Props
|属性 | 说明 | 类型 | 默认值 | 备注 |
|----|-----|------|------ |------|
|name|票据抬头|String| | |
|no|票据编号|String| | |
|neckNotch|票据打孔颜色|String| |
|water-mark|水印内容|String\/Object| | |
### Bill Slots
#### default
默认内容插槽
#### header
头部内容插槽
#### footer
底部内容插槽 | 16.73913 | 89 | 0.628571 | eng_Latn | 0.122312 |
9b7dd9772e97c011041246e0fe7febc86e2a2a57 | 190 | md | Markdown | information.md | oleskostyniuk/asu_git_7213 | b6ea81243bc286ad341ec2a0d99de133e551449c | [
"MIT"
] | null | null | null | information.md | oleskostyniuk/asu_git_7213 | b6ea81243bc286ad341ec2a0d99de133e551449c | [
"MIT"
] | null | null | null | information.md | oleskostyniuk/asu_git_7213 | b6ea81243bc286ad341ec2a0d99de133e551449c | [
"MIT"
] | null | null | null | #Oles Kostyniuk
```javascript
//Hello World
const message = 'Hello World';
confole.log(message);
```
+[YOUTUBE.COM](https://www.youtube.com/)
+[INSTAGRAM.COM](https://www/instagram.com/) | 17.272727 | 44 | 0.694737 | yue_Hant | 0.443308 |
9b7de2bdea735f9ed1cc7eb666c2a0851d71d51d | 772 | md | Markdown | vault/lexicon/G99382.md | mandolyte/uw-obsidian | 39e987c4cdc49d2a68e3af6b4e3fc84d1cda916d | [
"MIT"
] | null | null | null | vault/lexicon/G99382.md | mandolyte/uw-obsidian | 39e987c4cdc49d2a68e3af6b4e3fc84d1cda916d | [
"MIT"
] | null | null | null | vault/lexicon/G99382.md | mandolyte/uw-obsidian | 39e987c4cdc49d2a68e3af6b4e3fc84d1cda916d | [
"MIT"
] | null | null | null | # Πειθώ -οῦς, ἡ
<!-- Status: S2=NeedsEdits -->
<!-- Lexica used for edits: -->
## Word data
* Strongs: G99382
* Alternate spellings:
* Principle Parts:
* Part of speech:
* Instances in Scripture: 0
* All Scriptures cited: Yes
## Etymology:
* LXX/Hebrew glosses:
* Time Period/Ancient Authors:
* Related words:
* Antonyms for all senses
* Synonyms for all senses:
## Senses
### Sense 1.0:
#### Definition:
#### Glosses:
Peitho;
#### Explanation:
Persuasion;
#### Citations:
Peitho, Persuasion (as a goddess).
### Sense 2.0:
#### Definition:
#### Glosses:
persuasion;
#### Explanation:
#### Citations:
persuasion: [ἐν πειθοῖ]() (so Orig., Eus. and some cursives in [I Co 2:4](1Co 2:4) for [πειθός](), q.v.).†
| 10.575342 | 106 | 0.601036 | eng_Latn | 0.417734 |
9b7f4df61f9f03ed5607190b42a07ec9ed8b1910 | 782 | md | Markdown | README.md | das-dias/c-sformat | 85d405facd1ef2b1541099eb9f26ac290f64d89f | [
"MIT"
] | null | null | null | README.md | das-dias/c-sformat | 85d405facd1ef2b1541099eb9f26ac290f64d89f | [
"MIT"
] | null | null | null | README.md | das-dias/c-sformat | 85d405facd1ef2b1541099eb9f26ac290f64d89f | [
"MIT"
] | null | null | null | # C-SFORMAT
Version: 1.0
SFormat, or "String Formatter", is a library written in C to allow the production of console-silent, "printf" style formatted strings.
This software is licensed under MIT Open Source Software License.
## Dependencies:
- stdio.h (Standard C Input/Output Library)
- stdarg.h (Standard C variable function input parameter handling library)
## Examples:
### (Code Snippet) libtest.c:
```C
#include "sformat.h"
int main(void)
{
printf("This is a %s %s !\n", "formatted", "string");
/* declare a string using sformat */
char* s = (char*) sformat("This is also a %s %s !! \n", "formatted", "string");
/* and print it! :) */
printf("%s",s);
}
```
### Output:
```
This is a formatted string !
This is also a formatted string !!
```
| 23 | 134 | 0.654731 | eng_Latn | 0.96584 |
9b7f78c4547a4495ab0975fc9f7b39d2fb3b7c27 | 1,172 | md | Markdown | README.md | mnhthng-thms/NerdHerd-Final-Project-frontend | a173ccc1765cf8c854b856e4151b1f6b2819601f | [
"MIT"
] | 1 | 2020-10-19T12:22:30.000Z | 2020-10-19T12:22:30.000Z | README.md | duc1807/NerdHerd-Final-Project-frontend | a173ccc1765cf8c854b856e4151b1f6b2819601f | [
"MIT"
] | 1 | 2022-02-19T06:20:13.000Z | 2022-02-19T06:20:13.000Z | README.md | duc1807/NerdHerd-Final-Project-frontend | a173ccc1765cf8c854b856e4151b1f6b2819601f | [
"MIT"
] | 1 | 2020-09-19T08:19:51.000Z | 2020-09-19T08:19:51.000Z | # NERDHERD's CRESCOREX
NerdHerd's frontend source code for DevC Challenge's graduation project.
<table width="80%" align="center" style="border: 0px solid white">
<tr>
<td align="center">
<img src="./assets/public/screenshot_login.png" width="80%"/>
</td>
<td align="center">
<img src="./assets/public/screenshot_main.png" width="80%"/>
</td>
</tr>
</table>
## TEAM MEMBERS
Team members:
- [Ngô Tài Phát](https://github.com/PhatsNgoo): Project manager; Designer
- [Hoàng Minh Tú](https://github.com/mnhthng-thms): Backend developer; Frontend developer
- [Đỗ Trung Đức](https://github.com/duc1807): Frontend developer
- [Nguyễn Minh Quân](https://github.com/minhquanym): Data scientist
- [Lê Vũ Quang](https://github.com/vuquang23): Data scientist
## FEATURES
For Bank Staff:
- [x] Login, logout
- [x] Send OTP confirmation message to customer's phone number
- [x] Query credit score of bank's personal customer
- [x] History of search queries
- [x] Internalisation: Vietnamese and English languages
## VERSION HISTORY
See [CHANGELOG](https://github.com/mnhthng-thms/NerdHerd-Final-Project-frontend/blob/master/CHANGELOG.md)
| 29.3 | 105 | 0.709044 | yue_Hant | 0.284027 |
9b7fbd62fa2443c5edcf3af0e488dbb436682da9 | 301 | md | Markdown | content/projects/GREWordBot.md | vedangwartikar/website | 3280f5d0e833787181c08bf04778b016d535d966 | [
"MIT"
] | null | null | null | content/projects/GREWordBot.md | vedangwartikar/website | 3280f5d0e833787181c08bf04778b016d535d966 | [
"MIT"
] | 27 | 2021-09-23T14:46:54.000Z | 2022-03-28T14:35:23.000Z | content/projects/GREWordBot.md | vedangwartikar/website | 3280f5d0e833787181c08bf04778b016d535d966 | [
"MIT"
] | null | null | null | ---
date: '2017-12-01'
title: 'GRE Word Bot'
github: 'https://github.com/vedangwartikar/gre-word-bot'
external: ''
tech:
- Python
- Telegram-API
- Pandas
showInProjects: true
---
Telegram Bot which gives a random GRE Word from the Magoosh Dataset. Check it out [here](http://t.me/gre_word_bot)
| 21.5 | 114 | 0.707641 | eng_Latn | 0.297161 |
9b8000b1924fc803d8fc1c5d9cfa0e8829537b7b | 571 | md | Markdown | README.md | kondukberna/Image_Processing | 2423be41fc2922c4c9a7353925297ffdb0286f4e | [
"MIT"
] | 1 | 2021-12-22T03:30:04.000Z | 2021-12-22T03:30:04.000Z | README.md | kondukberna/Image_Processing | 2423be41fc2922c4c9a7353925297ffdb0286f4e | [
"MIT"
] | null | null | null | README.md | kondukberna/Image_Processing | 2423be41fc2922c4c9a7353925297ffdb0286f4e | [
"MIT"
] | null | null | null | # Image_Processing
Matlab implementation of contrast enhancement article
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7913162

1)Original Image

2)Enhanced by adaptive histogram equalization

3)Enhanced by article algorithm
| 24.826087 | 108 | 0.807356 | kor_Hang | 0.135761 |
9b809ea031aaab86273c98918d16bd0a2f7e617c | 31,083 | md | Markdown | _nwffacim/2005/082105.md | Christ-Mind-Teachings/cmi-original | 4397a750b7d9ec7e234f091d3acbaae73b40e9ad | [
"MIT"
] | null | null | null | _nwffacim/2005/082105.md | Christ-Mind-Teachings/cmi-original | 4397a750b7d9ec7e234f091d3acbaae73b40e9ad | [
"MIT"
] | 9 | 2021-01-19T07:07:49.000Z | 2022-01-27T00:56:35.000Z | _nwffacim/2005/082105.md | Christ-Mind-Teachings/cmi-original | 4397a750b7d9ec7e234f091d3acbaae73b40e9ad | [
"MIT"
] | 1 | 2021-01-18T22:42:28.000Z | 2021-01-18T22:42:28.000Z | ---
title: Aug 21, 2005
ref: "T10.3 From Darkness to Light"
alink: "/acim/text/10/chap1003/"
---
Good evening. And welcome to everyone who is joining us on the Internet.
You know, always at the end of a get-together, I share with you that I
love you. I’m going to begin today by saying I love you.
You get caught up in your everyday activities, and you forget that you
are loved by a Love that sees you as You Truly Are, that recognizes What
You Really Are, in spite of whatever definition of yourself you’re
employing at any given moment. And this Love that you are loved by is
Divine, loving that, I’m going to say, in you which is Divine. But
it isn’t that there is somewhere within you some little thing that
is Divine that’s being loved. This Divine Love loves you in your
Totality which is Divine right now, no matter how convinced you are
otherwise.
This Love is transformational, which means to whatever degree
you’re willing to let it in, you’re on the threshold of
transformation, of healing, of redemption, of coming back into your
right Mind and seeing yourself and everything As It Truly Is. In other
words, you could say, there is always present, with its attention fully
on you, an active capital “A” Agent for Change. It
isn’t just passively loving you. It isn’t just embracing you
in a warm pink glow that can allow you to feel comfortable and at peace
in spite of the circumstances you’re in the middle of. It’s
a Love which embraces you actively in a way that I can only describe as
a persistent nudge to get your attention, to prompt a sudden shift of
perception, a miracle, to uncover to you perfection of all sorts,
including physical perfection that you hadn’t been experiencing
fully.
This Divine Love has only your best interest at heart, and it therefore
stands ever ready and ever actively encouraging the transformation of
your mind and your experience for the better. And it never employs
elements unlike Itself to try and get your attention. In other words, it
never creates adverse conditions in order to get your attention.
Love is a Singularity in which there is nothing unlike Itself. Now this
Love that is loving you is the Father. But that in you which is nothing
more than your right Mind, called the Holy Spirit, is the Presence of
that Love as you truly loving you in your present sense of yourself with
the same intent—to nudge you into [finger snap] an
“Aha!”—a light bulb going on, a clarification that
undoes a definition or a belief that you have employed that has kept you
in the dark, kept you from experiencing your Birthright, kept you from
seeing everything around you as the Kingdom of Heaven.
It’s one thing to know that God, Divine Universal Love, loves you
actively and Its Love is transformational in effect. It’s another
thing to know that there is something that you can recognize as even
closer to home—[small laugh] closer to you, in other
words—and that is your Real Self—the Presence of God as God
Is Expressing Himself right where you are as what you call you. And this
Something Else, that is right there where you are, is in fact
You—unadulterated You—unbiased, untorqued You. Your Sane
Self, which has gone nowhere and has been ever-present with you even
though you have been ignoring it, is loving you with the intent to
motivate a sudden shift of perception wherever it has the opportunity.
Everything about you, divinely speaking, is on your side, with the
intent to reintegrate your mind so that you might experience your
undivided Wholeness, your unfragmented Wholeness.
Your very Being loves you. What does that mean? It means that your Being
loves the you, you think you are at the moment—the tiny, more
limited being that you think you are and that you insist on being
because you think there is no other alternative. And so, your Being
loves you so that you might be illuminated, so that you might experience
illumination that causes the separation between your Being and your
present sense of yourself to diminish, whereby you abandon your present
sense of yourself and any sense of yourself for That Which You Really
Are and always have been. Which means, you could say, a reuniting with
your capital “B” Being—integration occurring whereby
you no longer feel loved by your Essential Being, but you as the
Essential Being That You Are, are now engaged in extending the Love That
You Are because that’s your Function—where there’s
Unity and Harmony—the absence of fragmentation, the absence of
conflict.
I want you to understand how completely you are enveloped by
transformational Love. And you know what? I want you to be aware, to be
conscious of the idea at least, even if you’re not having the
experience, that everyone you know is likewise so loved, no matter how
they are seeing themselves, and no matter how you might be seeing them.
You don’t have to believe their sense of being unloved. You
don’t have to believe their confidence that things can’t get
better. You don’t have to believe their confidence that
they’re right when they’re wrong. And you don’t have
to believe that anything—any situation, any circumstance—is
hopeless.
The question is, at the bottom line, “How are you going to use
your mind?” Are you going to use it to confirm your current
beliefs and definitions? Are you going to use it to confirm your
Brother’s beliefs and definitions? Or are you going to use it to
be curious and inquire of the Love that loves you, inquire of the Holy
Spirit or God, “What is the Truth here that You find so utterly
Lovable? What is the Reality here of me and my Brothers and my world
that You find so utterly Lovable? I want a taste of that.” When
that becomes your prayer, you’re turning toward the Light;
you’re turning toward the Illumination that uncovers What Is Real
about you and your world and your Brother.
But when you turn toward and give your attention to your
dearly-developed definitions and concepts, and rely upon them, or when
you turn to your Brother to find out from your Brother what his
definitions and beloved concepts are, you are turning your back on the
Source of Illumination. And as a matter of experience, you all know that
the more [you are] enmeshed in self-awareness—tiny, personal
self-awareness—and your private, personal rights to be a creator,
to be an authorizer, you become more and more uncomfortable. The height
or depth of selfishness is always an experience of suffering. It is the
opposite of an Experience of Illumination.
So when we’re talking about, “From Darkness to Light,”
we’re talking about shifting from your orphanhood, shifting from
your privacy, shifting from a tiny creature capable of authorizing
things, to a State of Joining, of reaching out—of letting there be
something else on your mind beside your definition of yourself, your
definitions of everything else, and your great commitment to them, which
constitutes a state of profound loneliness that is the exact opposite of
the Nature of your Being, which is All-inclusive.
The more you burrow down into a sense of tininess, the darker your
experience becomes, the darker your mind becomes. Your mind becomes
filled with calls for justice. Your mind becomes involved in seeing
injustices. Your mind becomes involved with noticing all the awful
things that are going on. And then, in this funny, little arrogance of
this private state of mind, you say, “Oh, I want to bring the
light to all of this awful darkness, to all of the awfulness that is
going on.” You see? “Because if I can do that, I will make
me great.” And you do this, all of you do this, in one way or
another, because you believe that the awfulness itself is real and the
you that is seeing it is real, not realizing that the awfulness of it is
a result of your adamant choice to be private—unjoined with this
Infinite Love that is loving you. You see?
“Oh, I would rather express my own magnanimous love in the
presence of the awfulness of the world and heal it.” You see?
“No thank You, Father. I don’t need to plug into the circuit
of Infinite Divine Love That You Are. I’ve got what it takes and
I’m gonna prove it.” You see? Which is nothing but a further
confirmation, or attempt to confirm, that you can actually exist
separate from God, separate from your Source.
The way out of darkness is not you overcoming the darkness you see in
the world and in your experience. It’s a matter of you turning
around, turning your back on privacy, and beginning to embrace, first of
all, the Holy Spirit, and asking for the Holy Spirit’s
Perspective—the Perspective of that which is nothing more than
your right Mind—which is a way of asking for God’s
Perspective, because it is God’s Perspective that your right Mind
holds in trust for you until you’re willing to turn around and
embrace again, extend your attention to something more than just you and
what you want and what you believe, and what you are convinced is true
that you must convince everyone else is true so that you might get group
consensus and achieve a position, thereby, of some substance or some
reality.
So the way out of darkness into Light is the way out of privacy into
embrace—embrace that invites in the fuller experience of Reality
that it is yours to be experiencing. And every single moment of every
day provides you with the opportunity to see the more of What Is Really
There than what you’re seeing.
How many of you let new things happen this week? How many of you allowed
your week to proceed outside of the box you normally live in? How many
of you let new behavior happen? How many of you let things happen that
went beyond your traditional conditioned responses to your life? How
many of you let new behavior happen? This is a way of expressing
curiosity in your world. This is a way of breaking down your defenses
against the more That Is There than what you’re experiencing.
It’s not just all ethereal, mental stuff.
Let’s go to the book.
<div markdown="1" class="well book">
The way is not hard…[^1]
</div>
… and I will say, the way of redemption is not hard …
<div markdown="1" class="well book">
… but it IS very different. Yours is the way of pain, of which
God knows nothing. THAT way is hard indeed, and very lonely. Fear and
grief are your guests, and they go with you and abide with you on the
way. But the dark journey is not the way of God’s Son.
</div>
That means it is not the way that’s inherent in you.
It doesn’t mean it is not the way of Jesus, God’s
Son—some other Christ. You see? The dark journey is not the way of
you—of God’s Son or Daughter—you. That’s not
your Birthright.
<div markdown="1" class="well book">
Walk in light and do not see the dark companions, for they are not fit
companions for the Son of God, who was created OF Light and IN Light.
</div>
You see? And the Light that you have never stepped out of.
Who are the dark companions? Fear and grief, but also jealousy and
self-righteousness and self-protection and anger—all of which
present themselves to you as thoughts which occupy your mind, and which
you ruminate about and get pleasure from. “Oh, tell me,
doesn’t the call for justice feel good when it arises?” You
comfort yourself with dark companions, as strange as it might sound. You
use these dark companions to make you feel justified, to make you feel
good.
But, it says here:
<div markdown="1" class="well book">
Walk in light …
</div>
… make another choice …
<div markdown="1" class="well book">
… and do not see the dark companions, for they are not fit
companions for the Son of God…
</div>
… do not see the dark companions.
You are an executive with an office downtown. And you have appointments
scheduled for the next day, and you look down and you see, “Oh, I
see a couple of dark companions here that are scheduled. Mmm. I’m
not going to see them tomorrow.”
<div markdown="1" class="well book">
… do not see the dark companions…
</div>
… you see, it isn’t just don’t stop seeing them with
your eyes. It means stop letting them in. Stop letting them have
appointments. Stop giving them audience. Cancel their appointments. Do
not see them. Refuse to see them anymore.
<div markdown="1" class="well book">
Walk in light and do not see the dark companions, for they are not fit
companions for the Son of God, who was created OF Light and IN Light.
The Great Light …
</div>
… well …
<div markdown="1" class="well book">
… Great Light …
</div>
… whew …
<div markdown="1" class="well book">
… Great Light …
</div>
… you know what “the Great Light” is? The Great Light
is Living Love. Love is illumination. Love is illuminated. Love is
Light, as well as affection and compassion, and an inordinate
appreciation of Everything That’s Real—an appreciation that
is extended and embraces everything unequivocally, without reservation,
and blesses everything it falls upon.
<div markdown="1" class="well book">
The Great Light always surrounds you and shines out FROM you.
</div>
It’s happening at this instant. And the Radiance of It I can see.
The Radiance of It every Awakened Brother can see.
It’s shining at this moment and it is illuminating the
room—the space that you are in. And I’ll tell you also that
the walls of the room and the floors of the rooms are all emanating
Light as well, illuminating you, just as the Light of you is
illuminating them. It’s a Relationship of Light.
<div markdown="1" class="well book">
The Great Light always surrounds you and shines out FROM you. How can
you see the dark companions in a Light such as this? If you see THEM it
is only because you are DENYING the Light.
</div>
It only means that you’ve turned your back from What Is Actually
True in order to give preference to definitions you’ve made-up,
because the making up of definitions, and the securing of them, is the
way you think you’re going to achieve autonomous reality.
<div markdown="1" class="well book">
If you see THEM …
</div>
… the dark companions …
<div markdown="1" class="well book">
… it is only because you are DENYING the Light. But deny THEM
instead…
</div>
… cancel their appointments …
<div markdown="1" class="well book">
… deny THEM instead, for the Light is here and the way is clear.
</div>
Meaning unobstructed.
What’s being described here is the Reality that you are in at this
very moment. And it’s the experience to be curious to have. And
it’s an experience to be curious to have ongoingly.
You must practice ongoing curiosity. You must be willing not to lapse
from being curious. “Oh, well that sounds like a lot of
work.” Well, it might, and it might take some conscientious effort
on your part, but if it’s the means of undoing illusion,
it’s worth doing. You must do it more persistently. Because your
habit of relying upon your definitions and the meanings you’re
giving everything, and then ruminating over the way things aren’t
working out and exactly why they’re not working out, instead of
doing that, you need to conscientiously do something else—the
opposite of it. You must break the habit by actually, conscientiously,
doing something else.
So, no matter what you’re confronted with tomorrow, this evening,
let there be an ever-present curiosity to see God there, to see the more
of What God Is Being, instead of relying upon your best definitions and
judgments, and mulling them over and over and over and over in your
mind, preoccupying yourself, by your devotion to them, from having the
slightest chance of an “Aha!” occurring.
<div markdown="1" class="well book">
God hides nothing from His Son, even though His Son would hide himself.
</div>
In other words, hide Who He Really Is, in favor of being the orphan who
successfully proves that one doesn’t have to have a Father or a
Mother in order to exist and be real.
<div markdown="1" class="well book">
Yet the Son of God …
</div>
… you …
<div markdown="1" class="well book">
… cannot hide his glory…
</div>
… cannot hide your glory …
<div markdown="1" class="well book">
… for God wills him …
</div>
… you …
<div markdown="1" class="well book">
… to be glorious, and gave him …
</div>
… you …
<div markdown="1" class="well book">
… the Light that shines in him.
</div>
In you.
You see, you’re neither behind the Point of Perfection, nor
advancing toward it; you are at that Point and must understand yourself
therefrom. And you will not understand yourself therefrom … you
will not understand yourself therefrom if you don’t persist in
letting your Ultimacy be your vantage point in the present. You will
never escape your current habit of seeing less there where you are than
is there.
<div markdown="1" class="well book">
You will never lose your way for God leads you. When you wander you but
undertake a journey which is not real. The dark companions, the dark
way, are all illusions. Turn toward the Light, for the little spark in
you is part of a Light so great that It can sweep you out of all
darkness forever.
</div>
And as I said in the beginning, Its Intent is to sweep you out of
darkness. It is constantly nudging you, it is constantly pressuring you,
causing you not to be totally at peace in your commitment to your
definitions that aren’t real, so that you might be reminded, and
we might say, unconsciously, that there is another way to look at
things, so that you might employ enough curiosity to abandon your
bondage.
<div markdown="1" class="well book">
Turn toward the Light, for the little spark in you is part of a Light
…
</div>
… part of a Love …
<div markdown="1" class="well book">
… so great that It can sweep you out of all darkness forever. For
your Father IS your Creator, and you ARE like Him.
</div>
You see? There we brought everything back to the Truth. You have a
Father and you are a Son.
And your healing, your Awakening, depends upon your willingness to
embrace that fact and let yourself be a Son, who is the Son because you
are acknowledging your Father, and not claiming orphanhood and not
claiming a right to see things a little bit differently from God.
<div markdown="1" class="well book">
The Children of Light cannot abide in darkness, for darkness is not in
them.
</div>
Well, you know what that means? It means you aren’t abiding in
darkness. You aren’t abiding in quote “the human
condition” unquote. You are right now abiding in Light, but
you’re imagining otherwise. And you’re believing your
imagination. You’re abiding in your imagination. And you’re
constantly trying to confirm that your imagination is true.
<div markdown="1" class="well book">
The Children of Light …
</div>
… you …
<div markdown="1" class="well book">
… cannot abide in darkness, for darkness is not in them. Do not
be deceived by the dark comforters…
</div>
… you know, the call for justice and self-righteousness.
<div markdown="1" class="well book">
Do not be deceived by the dark comforters, and never let them enter the
mind of God’s Son…
</div>
… cancel their appointments …
<div markdown="1" class="well book">
… for they have no place in His temple.
</div>
They have no place in your right Mind.
<div markdown="1" class="well book">
When you are tempted to deny Him…
</div>
… God …
<div markdown="1" class="well book">
… remember that there ARE no other gods that you can place before
Him…
</div>
… like I’ve said before, there simply aren’t any
black-market gods you can go out and get and put before God.
<div markdown="1" class="well book">
When you are tempted to deny Him, remember that there ARE no other gods
that you can place before Him, and accept His Will for you in peace.
</div>
Give up the struggle to do something impossible. In other words, to try
to attempt to find Peace when you’re denying your very own
Birthright. When you’re denying What You Are, it is impossible to
find or secure Peace in that conscious intent.
<div markdown="1" class="well book">
When you are tempted to deny Him, remember that there ARE no other gods
that you can place before Him, and accept His Will for you in peace.
</div>
Without any further struggle.
<div markdown="1" class="well book">
For you CANNOT accept it otherwise.
Only God’s Comforter CAN comfort you. In the quiet of His temple,
He waits to give you the peace that is yours.
</div>
This Comforter, as it says in the Bible, is the Holy Spirit. And what is
the Holy Spirit? Nothing more than your right Mind.
<div markdown="1" class="well book">
Only God’s Comforter …
</div>
… your right Mind …
<div markdown="1" class="well book">
… CAN comfort you. In the quiet of His temple…
</div>
… the Comforter’s Temple. The Place of Excellence in you
where the Altar in you is.
<div markdown="1" class="well book">
… He waits …
</div>
… your Self waits …
<div markdown="1" class="well book">
… to give you the peace that is yours.
</div>
It is yours, but you have to come to a point where you’re willing
to abandon the addiction to conflict and the overcoming of it.
<div markdown="1" class="well book">
GIVE His peace that you may enter the temple and find it waiting for
you.
</div>
Well, the best way you can give His Peace, the Comforter’s Peace,
is to stop thinking. And one of the best means of abandoning thinking is
a practice of meditation.
<div markdown="1" class="well book">
… He waits to give you the peace that is yours. GIVE His peace
that you may enter the temple and find it waiting for you. But be holy
in the Presence of God…
</div>
… in other words, shut up and be still and stop being in charge.
Stop asserting yourself. Stop being an assertive presence that gains a
sense of itself out of the loudness of its assertions.
<div markdown="1" class="well book">
… be holy in the Presence of God…
</div>
… your right Mind …
<div markdown="1" class="well book">
… or you will not know that you are there. For what is unlike God
cannot enter His Mind because it was not His Thought, and therefore does
not belong to Him. And YOUR minds must be as pure as His, if you would
know what belongs to YOU.
</div>
Oh, that might sound like too big a challenge. How can your mind be as
pure as His? Well, the moment there is silence in your mind, your mind
is perfectly pure. The only impurities present in your mind are your own
made-up ideas, your own self-created thoughts, and the thinking you
engage in about them. So you don’t have to purify your thoughts so
that you’re having pure thoughts all the time. The moment there is
silence in your mind, your mind is pure.
This is important.
<div markdown="1" class="well book">
Guard carefully His temple, for He Himself dwells there, and abides in
peace.
</div>
Where is this Temple? Well, it’s where the Altar is, isn’t
it? Where is the Altar? In you. What is the Altar? The place where the
Voice for Truth awaits you—the Holy Spirit, which is nothing more
than your right Mind. Go into that place where What You Divinely Are
awaits you. And go to It without carrying with you the baggage and
garbage of all your attempts to be something of yourself. Go there in
silence, just simple silence, coupled with attentiveness, which is
another word for curiosity.
<div markdown="1" class="well book">
You cannot enter God’s Presence with the dark companions beside
you, but you also cannot enter alone.
ALL your brothers must enter WITH you, for until you have accepted them
YOU cannot enter.
</div>
Well, here’s a simple way to make sense out of this. As with Paul,
go into the Altar for some reason outside yourself. Go in on your
Brother’s behalf. Go in because your Brothers have questions. Go
in because your Brothers have needs. Go in because your Brothers seem to
be suffering, and you don’t want them to suffer. Go in because you
want to know What The Truth Is so that you will stop joining with your
Brothers in confirming their dilemmas. Care enough about your Brothers
that you take them with you into the Presence of the Altar, so that you
take them with you into the Presence of what is nothing more than your
right Mind. Let your reason for going there not be self-seeking.
<div markdown="1" class="well book">
ALL your brothers must enter WITH you, for until you have accepted them
YOU cannot enter.
</div>
In other words, you cannot approach your right Mind for selfish reasons,
for private reasons.
<div markdown="1" class="well book">
… until you have accepted them YOU cannot enter. For you cannot
understand Wholeness unless YOU are whole, and no part of the Son can be
excluded if he would know the Wholeness of his Father.
</div>
You see? Your reason for wanting to know the Truth is so that All of
Creation can be embraced in your illuminated vision, so that you might
have the opportunity to bless everything with your willingness to
acknowledge God there. You can’t do it to find the capacity to
acknowledge God in you and you alone, so that you might be able to know
the winning lottery numbers, and get the house of your dreams, and not
be dependent on anybody else, and not be a burden on society, and all of
the good reasons you can think of for being independently successful.
<div markdown="1" class="well book">
In your mind you can accept the whole Sonship, and bless it with the
Light your Father gave it. Then you will be worthy to dwell in the
temple …
</div>
To not just go there, but to stay there.
<div markdown="1" class="well book">
… WITH Him because …
</div>
… what?
<div markdown="1" class="well book">
… it is YOUR will not to be alone.
</div>
That is the crux of your whole dilemma. And waking up is the abandonment
of the devotion to being alone. To be privately successful, to be
privately on your own real, that is your devotion.
<div markdown="1" class="well book">
God blessed His Son forever.
</div>
Because God is a Singularity and there is nothing unlike God in a
Singularity, because the Omnipresence of God is Pure, Whatever God Is,
is forever, because there’s nothing to interrupt It. It is Eternal
and Infinite simultaneously.
<div markdown="1" class="well book">
If you will bless him in time, you will BE in eternity.
</div>
You see? I keep saying the only thing confronting you is the Kingdom of
Heaven. And therefore, rather than waiting until after you die, or after
you have perfected your soul, to look for the experience of Reality, you
need to do it right here, right now, because right here and right now is
where you are experiencing the Kingdom of Heaven through a glass darkly.
It says here:
<div markdown="1" class="well book">
If you will bless him in time, you will BE in eternity.
</div>
If you will look for God in the world, you will be in the Kingdom of
Heaven. Your vision will be transformed so that the Kingdom of Heaven,
that is the only thing in your face, might suddenly be seen right where
your limited and unreal definitions have stood in the way of your direct
perception, your direct experience of Reality, the Kingdom of Heaven.
<div markdown="1" class="well book">
If you will bless him …
</div>
… God’s Son …
<div markdown="1" class="well book">
… in time…
</div>
… in other words, right here, right now, with whatever
definitions your Brother is employing about himself …
<div markdown="1" class="well book">
… you will BE in eternity.
</div>
A shift will occur, right here, right now.
<div markdown="1" class="well book">
Time cannot separate you from God if you use it on BEHALF of the
eternal.
</div>
You all used the world and its resources to gain position and money and
security and authority and domination, without any sense of What The
World Really Is and that your Real Function, relative to it, is not the
exploitation of it, but of using it as your opportunity to acknowledge
God in every aspect of it and glorify God, rather than using it to try
to glorify yourself, which will always constitute
depletion—depletion of you—the minimizing of you into a
nothing, an orphan, that can’t actually accomplish anything at
all—and the Kingdom of Heaven as the world and universe governed
by material laws developing according to physical …
(PAUL: Just a moment.)
… physical laws of development that arise out of fundamental
conflict, and engage laws such as the survival of the fittest, which is
nothing more than competition.
<div markdown="1" class="well book">
Time cannot separate you from God if you use it on BEHALF of the
eternal.
</div>
Use your mind on behalf of the Eternal by being curious to see
What’s Really Going On, instead of using it for your own selfish
purposes. Use your vision to find and acknowledge God, instead of using
it as a means of determining whether you’re safe, and if you
aren’t safe, to best determine what the best defense will be.
Start using your eyes to find God. Start using your ears to hear the
Truth so that you might express and embody Truth, instead of all of your
pre-determined, pre-recorded habits that keep you from being Truly
Conscious of What Is Going On as you employ them for self-defense,
instead of making the Gift and extending the Light, the Love, with a
capital “L”, That You Are.
From darkness to Light is what we’re talking about. From
selfishness to embrace is what we’re talking about. From privacy
to inclusion is what we’re talking about.
I hope that your week is full of out-of-the-box experiences that you
didn’t plan, but you gave permission to happen.
And again, I will tell you, I love you all. And I look forward to being
with you next week.
[^1]: T10.3 From Darkness to Light
| 41.722148 | 72 | 0.772287 | eng_Latn | 0.999823 |
9b813fbc4d15084955ee586ec29b465267bf95e1 | 2,663 | markdown | Markdown | _posts/news/2014-11-17-feelpp-webinar.markdown | cemosis/csmi.cemosis.fr | a19ec14934453f50c87beea1e845f2237383e6c6 | [
"Apache-2.0"
] | null | null | null | _posts/news/2014-11-17-feelpp-webinar.markdown | cemosis/csmi.cemosis.fr | a19ec14934453f50c87beea1e845f2237383e6c6 | [
"Apache-2.0"
] | 30 | 2015-07-06T13:04:07.000Z | 2015-08-27T16:38:20.000Z | _posts/news/2014-11-17-feelpp-webinar.markdown | cemosis/csmi.cemosis.fr | a19ec14934453f50c87beea1e845f2237383e6c6 | [
"Apache-2.0"
] | 8 | 2015-07-06T11:58:08.000Z | 2015-09-23T19:04:59.000Z | ---
layout: news_item
title: 'Anisotropic mesh adaptation and stabilized finite elements method for solving conjugate heat transfers and turbulent flows'
date: 2014-11-17
author: prudhomm
fullname: Christophe Prud'homme
categories: [webinar]
webinar: Feel++
topic:
project: [Feel++]
tags: [feelpp]
---
J. Veysset a new postdoctorate fellow who joined our team in november
will present his work during his Phd on Monday 17 November 2014 via
Hangout at 14h:
*Anisotropic mesh adaptation and stabilized finite elements method for solving conjugate heat transfers and turbulent flows*
Fluid-Structure Interaction (FSI) describes a wide variety of industrial problems arising in mechanical
engineering, civil engineering and biomechanics. In spite of the available computer performance and the
actual maturity of computational fluid dynamics and computational structural dynamics, several key
issues still prevent accurate FSI simulations.
Two main approaches for the simulation of FSI problems are still gaining attention lately: partitioned and
monolithic approaches. Results in the literature show that the partitioned approach is accurate and
efficient but some instabilities may occur depending on the ratio of the densities and the complexity of
the geometry. Monolithic methods are still of interest due to their capability to treat the interaction of
the fluid and the structure using a unified formulation. In fact it makes the buildup of a FSI problem
easier as the mesh do not have to fit the geometry of the solids and the transfers are treated naturally.
The software Thost has been created based on these analyzes. Thost is a 3D aerothermal numerical
software. It has been developed for the numerical simulation of industrial processes like the heating in
industrial furnaces as well as quenching. Its target is to model numerically the thermal history of the
industrial pieces in their environment without using any transfer coefficient. However the computational
costs are still high and therefore the software is not fully efficient from an industrial point of view to
simulate, analyze and improve complex processes. All the work in this PhD thesis has been done to
reduce the computational costs and optimize the accuracy of the simulations in Thost based on
innovative numerical methods such as dynamic anisotropic mesh adaptation, stabilized finite elements
methods and immersing the objects directly from their Computer Aided Design files.
The [webinar](https://plus.google.com/u/2/events/csh5s8cpli3kqoo8177h92asdb0) will be done using this
[hangout](https://plus.google.com/hangouts/_/event/csh5s8cpli3kqoo8177h92asdb0?authuser=2&hl=fr).
| 60.522727 | 131 | 0.825009 | eng_Latn | 0.998531 |
9b81a3bd21035459ae39d461521bc8f4731976fd | 5,570 | md | Markdown | azureps-cmdlets-docs/ResourceManager/AzureRM.Network/v0.9.8/Add-AzureLoadBalancerFrontendIpConfig.md | Evgenii011/azure-docs-powershell | 30e804249e1fb7af82ea9b01d7bdecb33ec238db | [
"CC-BY-4.0",
"MIT"
] | 2 | 2021-04-14T11:42:58.000Z | 2021-05-23T22:43:42.000Z | azureps-cmdlets-docs/ResourceManager/AzureRM.Network/v0.9.8/Add-AzureLoadBalancerFrontendIpConfig.md | Evgenii011/azure-docs-powershell | 30e804249e1fb7af82ea9b01d7bdecb33ec238db | [
"CC-BY-4.0",
"MIT"
] | null | null | null | azureps-cmdlets-docs/ResourceManager/AzureRM.Network/v0.9.8/Add-AzureLoadBalancerFrontendIpConfig.md | Evgenii011/azure-docs-powershell | 30e804249e1fb7af82ea9b01d7bdecb33ec238db | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-04-16T03:17:57.000Z | 2019-04-16T03:17:57.000Z | ---
external help file: Microsoft.Azure.Commands.Network.dll-Help.xml
online version:
schema: 2.0.0
ms.assetid: C8A37C5D-EA16-499E-83E4-B9C673F52880
---
# Add-AzureLoadBalancerFrontendIpConfig
## SYNOPSIS
Adds a front-end IP configuration to a load balancer.
## SYNTAX
### SetByResourceId
```
Add-AzureLoadBalancerFrontendIpConfig -Name <String> -LoadBalancer <PSLoadBalancer>
[-PrivateIpAddress <String>] [-SubnetId <String>] [-PublicIpAddressId <String>] [-Profile <AzureProfile>]
[<CommonParameters>]
```
### SetByResource
```
Add-AzureLoadBalancerFrontendIpConfig -Name <String> -LoadBalancer <PSLoadBalancer>
[-PrivateIpAddress <String>] [-Subnet <PSSubnet>] [-PublicIpAddress <PSPublicIpAddress>]
[-Profile <AzureProfile>] [<CommonParameters>]
```
## DESCRIPTION
The **Add-AzureLoadBalancerFrontendIpConifg** cmdlet adds a front-end IP configuration to an Azure load balancer.
## EXAMPLES
### Example 1 Add a front-end IP configuration with a dynamic IP address
```
PS C:\>$Subnet = Get-AzureVirtualNetwork -Name "myVnet" -ResourceGroupName "myRg" | Get-AzureVirtualNetworkSubnetConfig -Name "mysubnet"
PS C:\> Get-AzureLoadBalancer -Name "myLB" -ResourceGroupName "NrpTest" | Add-AzureLoadBalancerFrontendIpConfig -Name "frontendName" -Subnet $Subnet | Set-AzureLoadBalancer
```
This command adds a front-end IP configuration to the load balancer with a dynamic private IP address from the specified subnet.
### Example 2 Add a front-end IP configuration with a static IP address
```
PS C:\>$Subnet = Get-AzureVirtualNetwork -Name "myVnet" -ResourceGroupName "myRg" | Get-AzureVirtualNetworkSubnetConfig -Name "mysubnet"
PS C:\> Get-AzureLoadBalancer -Name "myLB" -ResourceGroupName "NrpTest" | Add-AzureLoadBalancerFrontendIpConfig -Name "frontendName" -Subnet $Subnet -PrivateIpAddress "10.0.1.6" | Set-AzureLoadBalancer
```
This command adds a front-end IP configuration to the load balancer with a static private IP address from the specified subnet.
### Example 3 Add a front-end IP configuration with a public IP address
```
PS C:\>$PublicIp = Get-AzurePublicIpAddress -ResourceGroupName "myRG" -Name "myPub"
PS C:\> Get-AzureLoadBalancer -Name "myLB" -ResourceGroupName "NrpTest" | Add-AzureLoadBalancerFrontendIpConfig -Name "frontendName" -PublicIpAddress $PublicIp | Set-AzureLoadBalancer
```
This command adds a front-end IP configuration to the load balancer with a public IP address.
## PARAMETERS
### -LoadBalancer
Specifies a **LoadBalancer** object.
This cmdlet adds a front-end IP configuration to the load balancer that this parameter specifies.
```yaml
Type: PSLoadBalancer
Parameter Sets: (All)
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: True (ByValue)
Accept wildcard characters: False
```
### -Name
Specifies the name of the front-end IP configuration to add.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -PrivateIpAddress
Specifies the **PublicIpAddress** object to associate with a front-end IP configuration.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Profile
Specifies an Azure profile.
```yaml
Type: AzureProfile
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -PublicIpAddress
Specifies the **PublicIpAddress** object to associate with a front-end IP configuration.
```yaml
Type: PSPublicIpAddress
Parameter Sets: SetByResource
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -PublicIpAddressId
Specifies the ID of the **PublicIpAddress** object to associate with a front-end IP configuration.
```yaml
Type: String
Parameter Sets: SetByResourceId
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Subnet
Specifies the **Subnet** object in which to add a front-end IP configuration.
```yaml
Type: PSSubnet
Parameter Sets: SetByResource
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -SubnetId
Specifies the ID of the subnet in which to add a front-end IP configuration.
```yaml
Type: String
Parameter Sets: SetByResourceId
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see about_CommonParameters (http://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
## OUTPUTS
## NOTES
## RELATED LINKS
[Get-AzureLoadBalancerFrontendIpConfig](./Get-AzureLoadBalancerFrontendIpConfig.md)
[Get-AzureVirtualNetwork](./Get-AzureVirtualNetwork.md)
[Get-AzureVirtualNetworkSubnetConfig](./Get-AzureVirtualNetworkSubnetConfig.md)
[New-AzureLoadBalancerFrontendIpConfig](./New-AzureLoadBalancerFrontendIpConfig.md)
[Remove-AzureLoadBalancerFrontendIpConfig](./Remove-AzureLoadBalancerFrontendIpConfig.md)
[Set-AzureLoadBalancerFrontendIpConfig](./Set-AzureLoadBalancerFrontendIpConfig.md)
| 27.170732 | 314 | 0.780072 | yue_Hant | 0.421902 |
9b828e28a295b04eab31a41379619c8668434522 | 1,383 | md | Markdown | docs/contributing.md | opensrp/opensrp-web | 5fb964eeb6d46ed01dab86671c269904589ba4fe | [
"Apache-2.0"
] | 5 | 2019-07-03T09:10:21.000Z | 2020-01-14T05:34:35.000Z | docs/contributing.md | kahummer/opensrp-web | 1a31af1fc124ce6b2fb2b0c2d7bf6b39f87f3b78 | [
"Apache-2.0"
] | 212 | 2019-07-03T15:14:40.000Z | 2020-10-08T10:37:58.000Z | docs/contributing.md | kahummer/opensrp-web | 1a31af1fc124ce6b2fb2b0c2d7bf6b39f87f3b78 | [
"Apache-2.0"
] | 1 | 2020-01-31T07:47:07.000Z | 2020-01-31T07:47:07.000Z | # Contributing/Collaborating
## Guidelines
[Click me](codeQuality.md) for code quality guidelines
## We Develop with Github
We use github to host code, to track issues and feature requests, as well as accept pull requests.
### We Use [Github Flow](https://guides.github.com/introduction/flow/index.html), So All Code Changes Happen Through Pull Requests
Pull requests are the best way to propose changes to the codebase (we use [Github Flow](https://guides.github.com/introduction/flow/index.html)).
1. Fork the repo and create your branch from `master`.
2. If you've added code that should be tested, add tests.
3. If you've changed APIs, update the documentation.
4. Ensure the test suite passes.
5. Make sure your code lints.
6. Issue that pull request!
## Report bugs using Github's [issues](https://github.com/briandk/transcriptase-atom/issues)
We use GitHub issues to track public bugs. Report a bug by [opening a new issue](https://github.com/OpenSRP/opensrp-web/issues/new); it's that easy!
## Write bug reports with detail, background, and sample code
**Great Bug Reports** tend to have:
- A quick summary and/or background
- Steps to reproduce
- Be specific!
- Give sample code if you can.
- What you expected would happen
- What actually happens
- Notes if any(possibly including why you think this might be happening, or stuff you tried that didn't work)
| 37.378378 | 148 | 0.758496 | eng_Latn | 0.9703 |
9b82c6aff9372694f5953b6a0f53c604c840eb7c | 54 | md | Markdown | CONTRIBUTING.md | great-coder/AndroidSamples | 8030ddb8ab4add45aca18b3b8510725270690231 | [
"MIT"
] | null | null | null | CONTRIBUTING.md | great-coder/AndroidSamples | 8030ddb8ab4add45aca18b3b8510725270690231 | [
"MIT"
] | null | null | null | CONTRIBUTING.md | great-coder/AndroidSamples | 8030ddb8ab4add45aca18b3b8510725270690231 | [
"MIT"
] | null | null | null | For contributing just email me: [email protected]
| 27 | 53 | 0.833333 | eng_Latn | 0.451348 |
9b82f500e9065b50728eddba261427c0d16c7f94 | 1,241 | md | Markdown | _posts/2017-09-08-Lobront.md | banjerd3839/banjerd.github.io | a1e2210611fa97617ad4dbc178b9775d52d299e4 | [
"MIT"
] | null | null | null | _posts/2017-09-08-Lobront.md | banjerd3839/banjerd.github.io | a1e2210611fa97617ad4dbc178b9775d52d299e4 | [
"MIT"
] | null | null | null | _posts/2017-09-08-Lobront.md | banjerd3839/banjerd.github.io | a1e2210611fa97617ad4dbc178b9775d52d299e4 | [
"MIT"
] | null | null | null | ---Lobrot
[ลิงค์เพิ่มเติม](https://www.autospinn.com/category/%E0%B8%A2%E0%B8%B5%E0%B9%88%E0%B8%AB%E0%B9%89%E0%B8%AD%E0%B8%A3%E0%B8%96%E0%B8%A2%E0%B8%99%E0%B8%95%E0%B9%8C/ferrari-%E0%B9%80%E0%B8%9F%E0%B8%AD%E0%B8%A3%E0%B9%8C%E0%B8%A3%E0%B8%B2%E0%B8%A3%E0%B8%B5%E0%B9%88/)

หุ่นยนต์ คือเครื่องจักรกลชนิดหนึ่งที่มีลักษณะโครงสร้างและรูปร่างแตกต่างกัน หุ่นยนต์ในแต่ละประเภทจะมีหน้าที่การทำงานในด้านต่าง ๆ ตามการควบคุมโดยตรงของมนุษย์ การควบคุมระบบต่าง ๆ ในการสั่งงานระหว่างหุ่นยนต์และมนุษย์ สามารถทำได้โดยทางอ้อมและอัตโนมัติ โดยทั่วไปหุ่นยนต์ถูกสร้างขึ้นเพื่อสำหรับงานที่มีความยากลำบากเช่น งานสำรวจในพื้นที่บริเวณแคบหรืองานสำรวจดวงจันทร์ดาวเคราะห์ที่ไม่มีสิ่งมีชีวิต ปัจจุบันเทคโนโลยีของหุ่นยนต์เจริญก้าวหน้าอย่างรวดเร็ว เริ่มเข้ามามีบทบาทกับชีวิตของมนุษย์ในด้านต่าง ๆ เช่น ด้านอุตสาหกรรมการผลิต แตกต่างจากเมื่อก่อนที่หุ่นยนต์มักถูกนำไปใช้ ในงานอุตสาหกรรมเป็นส่วนใหญ่ ปัจจุบันมีการนำหุ่นยนต์มาใช้งานมากขึ้น เช่น หุ่นยนต์ที่ใช้ในทางการแพทย์ หุ่นยนต์สำหรับงานสำรวจ หุ่นยนต์ที่ใช้งานในอวกาศ หรือแม้แต่หุ่นยนต์ที่ถูกสร้างขึ้นเพื่อเป็นเครื่องเล่นของมนุษย์ จนกระทั่งในปัจจุบันนี้ได้มีการพัฒนาให้หุ่นยนต์นั้นมีลักษณะที่คล้ายมนุษย์ เพื่อให้อาศัยอยู่ร่วมกันกับมนุษย์ ให้ได้ในชีวิตประจำวัน
| 137.888889 | 901 | 0.720387 | tha_Thai | 0.997052 |
9b856a527a54b2ee95c27510ff0e74281257696c | 707 | md | Markdown | content/events/2016-singapore/program/angad-singh.md | devopsdays/devopsdays-test | e2766c9a61cf053c7949d37fe420e8750b4549dc | [
"Apache-2.0",
"MIT"
] | null | null | null | content/events/2016-singapore/program/angad-singh.md | devopsdays/devopsdays-test | e2766c9a61cf053c7949d37fe420e8750b4549dc | [
"Apache-2.0",
"MIT"
] | null | null | null | content/events/2016-singapore/program/angad-singh.md | devopsdays/devopsdays-test | e2766c9a61cf053c7949d37fe420e8750b4549dc | [
"Apache-2.0",
"MIT"
] | null | null | null | +++
date = "2016-09-07T21:27:36+08:00"
linktitle = "angad-singh"
title = "Angad Singh"
type = "talk"
+++
<div class="span-15 ">
<div class="span-15 last ">
<h2>Devops and Standards</h2>
<p>
Standards and Best Practices - everyone talks about them but how do you get a team of 30 engineers to follow standards? How do you ensure, that while following standards, innovation is not stifled?
</p>
<p>
Devops is equally top-down, as it is bottom-up. It is the central body that aims to reduce the pain points of developers, while delivering a reliable and mostly predictable system.
</p>
<h2>When</h2>
<p><time datetime="2016-10-08T14:00">Saturday, 8th October</time></p>
</div>
</div>
| 32.136364 | 199 | 0.693069 | eng_Latn | 0.994554 |
9b85acbf07a64e67da19301bca431a1471001d84 | 882 | md | Markdown | README.md | MagusDevOps/avro4k-build-plugins | 6abb86964479f0497a3e7e582e9c0621fbcf2d45 | [
"MIT"
] | null | null | null | README.md | MagusDevOps/avro4k-build-plugins | 6abb86964479f0497a3e7e582e9c0621fbcf2d45 | [
"MIT"
] | null | null | null | README.md | MagusDevOps/avro4k-build-plugins | 6abb86964479f0497a3e7e582e9c0621fbcf2d45 | [
"MIT"
] | null | null | null | # Avro4k Gradle Plugin
This plugin will automatically generate [Avro](https://avro.apache.org/) schema files using
[Avro4k](https://github.com/avro-kotlin/avro4k) for the
[kotlinx.serialization](https://github.com/Kotlin/kotlinx.serialization) library.
## Usage
Add the following to your build.gradle file
```groovy
buildscript {
repositories {
mavenCentral()
}
dependencies {
implementation 'com.magusdevops.avro4k:gradle-plugin:0.30.0.RC4'
}
}
repositories {
mavenCentral()
}
apply plugin: 'com.magusdevops.avro4k.gradle-plugin'
```
alternatively, you can use the new plugin syntax for gradle `2.1+`
```groovy
plugins {
id 'com.magusdevops.avro4k.gradle-plugin'
}
```
## Configuration
```groovy
avro4kAvroGeneration {
packageToScan = setOf("com.magusdevops")
}
```
## Generate the schemas
`./gradlew avro4kAvroGeneration` | 20.511628 | 92 | 0.712018 | yue_Hant | 0.241536 |
9b85b90e11354e20f0bbe14d475272e959a2b12f | 1,087 | md | Markdown | docs/build/caller-callee-saved-registers.md | anmrdz/cpp-docs.es-es | f3eff4dbb06be3444820c2e57b8ba31616b5ff60 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/build/caller-callee-saved-registers.md | anmrdz/cpp-docs.es-es | f3eff4dbb06be3444820c2e57b8ba31616b5ff60 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/build/caller-callee-saved-registers.md | anmrdz/cpp-docs.es-es | f3eff4dbb06be3444820c2e57b8ba31616b5ff60 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Registros guardados del llamador y destinatario | Microsoft Docs
ms.custom: ''
ms.date: 11/04/2016
ms.technology:
- cpp-tools
ms.topic: conceptual
dev_langs:
- C++
ms.assetid: 0533bd4b-d6bb-4ce1-8201-495e16870cbb
author: corob-msft
ms.author: corob
ms.workload:
- cplusplus
ms.openlocfilehash: e8e877387dbb5b0be865e11017a3ac71a0c38faa
ms.sourcegitcommit: 92f2fff4ce77387b57a4546de1bd4bd464fb51b6
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 09/17/2018
ms.locfileid: "45707660"
---
# <a name="callercallee-saved-registers"></a>Registros guardados del llamador y del destinatario
Destruyen los registros RAX, RCX, RDX, R8, R9, R10, R11 se consideran volátiles y se deben considerar en las llamadas de función (a menos que en caso contrario, seguridad-opuestas por análisis como la optimización de todo el programa).
Los registros RBX, RBP, RDI, RSI, RSP, R12, R13, R14 y R15 se consideran no volátiles y deben guardarse y restaurarse con una función que los usa.
## <a name="see-also"></a>Vea también
[Convención de llamada](../build/calling-convention.md) | 36.233333 | 235 | 0.782889 | spa_Latn | 0.91916 |
9b85cfd1a89932c00ea311042de714ba65ae96aa | 38 | md | Markdown | README.md | Attamusc/BeFF | 71818c708d1097794428e803fee3ecd33d89525a | [
"MIT"
] | 1 | 2020-12-11T05:45:04.000Z | 2020-12-11T05:45:04.000Z | README.md | Attamusc/BeFF | 71818c708d1097794428e803fee3ecd33d89525a | [
"MIT"
] | null | null | null | README.md | Attamusc/BeFF | 71818c708d1097794428e803fee3ecd33d89525a | [
"MIT"
] | 1 | 2020-12-25T07:17:08.000Z | 2020-12-25T07:17:08.000Z | BeFF
====
Behance Frontend Framework
| 7.6 | 26 | 0.736842 | deu_Latn | 0.548304 |
9b86ab58f203261a1d7bdcd6b7a771a388f8007c | 926 | md | Markdown | _posts/2012/2012-05-03-event_reminder_east_twickenham_village_jubilee_pic.md | anthonydillon/stmgrts | 91595b9a9e61798f5eb8a7b2e45ddae453ccf158 | [
"CC0-1.0"
] | null | null | null | _posts/2012/2012-05-03-event_reminder_east_twickenham_village_jubilee_pic.md | anthonydillon/stmgrts | 91595b9a9e61798f5eb8a7b2e45ddae453ccf158 | [
"CC0-1.0"
] | 4 | 2016-09-11T07:52:02.000Z | 2019-10-03T10:10:25.000Z | _posts/2012/2012-05-03-event_reminder_east_twickenham_village_jubilee_pic.md | anthonydillon/stmgrts | 91595b9a9e61798f5eb8a7b2e45ddae453ccf158 | [
"CC0-1.0"
] | 3 | 2018-05-11T06:33:27.000Z | 2019-08-05T11:39:02.000Z | ---
layout: post
title: "Event Reminder: East Twickenham Village Jubilee Picnic"
permalink: /archives/2012/05/event_reminder_east_twickenham_village_jubilee_pic.html
commentfile: 2012-05-03-event_reminder_east_twickenham_village_jubilee_pic
category: around_town
date: 2012-05-03 17:45:27
---
[Monday 4 June 2012 - from noon to 3pm](/event/fair/200705143341)
Celebrate the Jubilee with an old-fashioned picnic in Cambridge Gardens, our local riverside park. Bring your own food and drink, or buy snacks and drinks at Rachel's Café. They will be doing a barbecue too. There'll be a beer and wine tent, provided by Real Ale. And we're expecting Queen Elizabeth I to make an appearance and a short speech. There will be flags to wave and crowns and tiaras for everyone. So come along and bring your neighbours!
Contact: Su Bonfanti, on behalf of the East Twickenham Village Group, on <[email protected]> or 020 8892 5077
| 57.875 | 448 | 0.795896 | eng_Latn | 0.981725 |
9b89278a51ecb26f894742dbf5c5906bad1d05bb | 3,455 | md | Markdown | _posts/2018-09-10-15365490622.md | lyrics101/lyrics101.github.com | 5940068fbb090871fac70168fc4692930ad82c23 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | _posts/2018-09-10-15365490622.md | lyrics101/lyrics101.github.com | 5940068fbb090871fac70168fc4692930ad82c23 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | _posts/2018-09-10-15365490622.md | lyrics101/lyrics101.github.com | 5940068fbb090871fac70168fc4692930ad82c23 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | ---
title: (이슈클립) '**장웨이제**' 관련 이슈, 기사 모아 보기
tag: 장웨이제
category: 이슈클리핑
---

## **'**장웨이제**'** 위키피디아 검색결과
>**장웨이제**(중국어: 江維杰 강유걸, 1991년 10월 17일~)는 중국의 바둑 기사이다. 2005년 프로가 되었고 2012년 9단에 올랐다.
<a href="https://ko.wikipedia.org/wiki/장웨이제" target="_blank">위키백과 전체보기</a>


## **'**장웨이제**'** 뉴스속보 검색결과
### '행방 묘연' 판빙빙, 수갑 찬 모습 진짜?…**장웨이제** 사건도 소환
>판빙빙의 행방에 대해 온갖 추측이 나오고 있는 가운데 일각에서는 과거 높은 인기를 얻었던 중국 유명 아나운서 **장웨이제** 실종 사건까지 언급하고 있다. **장웨이제** 실종 사건은 정치인과 내연 관계였던 **장웨이제**가...
<a href="http://sports.chosun.com/news/ntype.htm?id=201809100100078600006136&servicedate=20180910" target="_blank">기사 전체 보기</a>
### **장웨이제** 사건, 왜 20년간 미제사건으로 남았나
>(사진=MBC 방송화면) [한국정경신문=차상미 기자] **장웨이제** 실종 사건에 대한 관심이 높아지고 있다. 판빙빙이 장기간 모습을 드러내지 않자 20년 전 사라진 중국 아나운서 **장웨이제** 사건과 같은 사태가 벌어진 것 아니냐는...
<a href="http://kpenews.com/Board.aspx?BoardNo=18686" target="_blank">기사 전체 보기</a>
### 판빙빙 수갑 사진 진위 논란, '제2의 **장웨이제**?' 공포영화급 괴소문 확산
>급기야 1998년 실종된 이후 지금까지 미제로 남은 **장웨이제** 다롄TV 아나운서 사건까지 회자되며 '공포영화' 수준의 추측성 보도가 이어지고 있다. 중국의 유명 앵커였던 **장웨이제**는 모 정치인과 내연관계에 있다는 소문이...
<a href="http://news.wowtv.co.kr/NewsCenter/News/Read?articleId=A201809100298&t=NN" target="_blank">기사 전체 보기</a>
### 판빙빙 수갑논란, `제2의 **장웨이제**` 불안감 속 진위 여부는?
>심지어 과거 최고의 인기를 구가하다가 갑자기 실종된 인기 아나운서 **장웨이제** 사례까지 거론되고 있다. **장웨이제**는 1998년 다롄시 시장과 내연관계로 임신 중 실종된뒤 행방이 묘연하다. 이런 가운데 판빙빙이 수갑을...
<a href="http://star.mk.co.kr/new/view.php?mc=ST&year=2018&no=568859" target="_blank">기사 전체 보기</a>
### '핫이슈' 경악스러운 논란의 사진 한 장? **장웨이제** 괴담 진실 공방... '의혹 급부상'
>(사진 출처=**장웨이제** 논란 화면 / 온라인 커뮤니티) 실종된 여자 아나운서 **장웨이제** 사건이 세간의 관심을 모으고 있다. 최근 중국 여배우의 실종 의혹으로 인해 **장웨이제** 사건이 핫이슈로 급부상 가운데...
<a href="http://www.kns.tv/news/articleView.html?idxno=468375" target="_blank">기사 전체 보기</a>
### 판빙빙 추정 머그샷 공개…중국 아나운서 **장웨이제** 실종 사건 '재조명'
>판빙빙 **장웨이제** /사진=중국 온라인 커뮤니티, 방송캡쳐 중국 배우 판빙빙(37)이 수갑과 족쇄를 찬 모습이... 한편, 판빙빙에 대한 괴소문이 확산되면서 과거 실종된 유명 아나운서 **장웨이제** 사건이 재조명되고 있다....
<a href="http://news.hankyung.com/article/201809107025H" target="_blank">기사 전체 보기</a>
### 판빙빙, 행방 묘연…'**장웨이제** 사건' 다시 수면 위로
>이런 가운데 네티즌들 사이에서 '**장웨이제**' 사건 또한 다시금 수면 위로 올랐다. 당시 중국 유명 아나운서인 **장웨이제**는 지난 1998년 중국 다롄 시 시장이었던 보시라이와 내연 관계였던 인물로, 당시 그녀는 "아이를...
<a href="http://www.kyeonggi.com/?mod=news&act=articleView&idxno=1518036" target="_blank">기사 전체 보기</a>
### [이시각 연예스포츠 핫뉴스] B.A.P 힘찬 강제추행혐의·손웅정 기고문·박환희 아들공개·**장웨이제** 판빙빙 등
>'박환희 아들공개' 전체기사 보기 ◇ '정치망명-감금설' 판빙빙, '제2의 **장웨이제**' 우려 중국 배우 판빙빙의 행방이 묘연한 가운데 다롄TV 아나운서였던 **장웨이제**가 다시금 주목받고 있다. **장웨이제**는 1998년 다롄 방송사...
<a href="http://www.etoday.co.kr/news/section/newsview.php?idxno=1661669" target="_blank">기사 전체 보기</a>
### '탈세 혐의' 판빙빙, 감금설에 이어 사망설까지…**장웨이제**와 같은 길 걷나
>이에 팬들은 판빙빙의 감금설에 이어 ‘**장웨이제** 실종 사건’ 같은 사망설까지 제기하고 있는 상황이다. **장웨이제** 실종 사건이란 1998년 중국 다롄시 시장이었던 보시라이와 내연관계였던 90년대 인기 아나운서...
<a href="http://www.asiatoday.co.kr/view.php?key=20180910010005115" target="_blank">기사 전체 보기</a>
### **장웨이제**, 그는 누구인가?…실종된 후 인체 표본으로 발견됐다?
>[금강일보 = 김미영 기자] **장웨이제**, 그는 누구인가?…실종된 후 인체 표본으로 발견됐다? **장웨이제**, 화제가 되는 이유는?/ 사진출처= **장웨이제**가 포털 사이트 실시간 순위에 오른 가운데 **장웨이제**에 대한 사건이 대두되고...
<a href="http://www.ggilbo.com/news/articleView.html?idxno=544215" target="_blank">기사 전체 보기</a>
| 45.460526 | 141 | 0.665991 | kor_Hang | 1.000007 |
9b8a3b8e9adcbcdddb058443c5aed1f01686c5d0 | 1,688 | md | Markdown | articles/protocols/saml/samlp.md | martinjras/docs | 3d3574ba9290f15bd2da9f79e8bcbaf77b3d4851 | [
"MIT"
] | null | null | null | articles/protocols/saml/samlp.md | martinjras/docs | 3d3574ba9290f15bd2da9f79e8bcbaf77b3d4851 | [
"MIT"
] | null | null | null | articles/protocols/saml/samlp.md | martinjras/docs | 3d3574ba9290f15bd2da9f79e8bcbaf77b3d4851 | [
"MIT"
] | null | null | null | ---
title: SAML
description: SAML Identity Provider Configuration
---
# SAML Identity Provider Configuration
## Common settings:
These are the parameters used to configure a SAML Identity Provider:
* The __post-back URL__ (also called __Assertion Consumer Service URL__) is: `https://${account.namespace}/login/callback`
* The __Entity ID__ of the Service Provider is: `connection.options.entityId || urn:auth0:${account.tenant}:${connectionName}`
* The __SAML Request Binding__ (sent to the IdP from Auth0): `HTTP-Redirect`
* The __SAML Response Binding__ (how the SAML token is received by Auth0 from IdP): `HTTP-Post`
* The __NameID format__: `unspecified`
* The SAML assertion, and the SAML response can be individually or simultaneously signed.
* The __SingleLogout service URL__, where the SAML Identity Provider will send logout requests and responses, is: `https://${account.namespace}/logout`. Note: SAML logout requests must be signed by the Identity Provider.
## Encrypted Assertions:
Optionally, assertions can be encrypted. Use this public key to configure the IdP: [CER](https://${account.namespace}/cer) | [PEM](https://${account.namespace}/pem) | [PKCS#7](https://${account.namespace}/pb7)
## IdP-Initiated SSO
If you want **IdP-Initiated SSO**, please make sure to include the `connection` parameter in the post-back URL: `https://${account.namespace}/login/callback?connection=${connectionName}`
## Metadata
Some SAML Identity Providers can accept importing metadata directly with all the required information. You can access the metadata for your connection in Auth0 here:
```text
https://${account.namespace}/samlp/metadata?connection=${connectionName}
```
| 48.228571 | 220 | 0.766588 | eng_Latn | 0.937625 |
9b8b402d3c99a06024d1cda1aa79d9deedf713af | 2,741 | md | Markdown | README.md | anatawa12/auto-visitor | 55cb569955ea0bec42c58291da733bf5bfa604f2 | [
"MIT"
] | 1 | 2021-02-28T14:25:46.000Z | 2021-02-28T14:25:46.000Z | README.md | anatawa12/auto-visitor | 55cb569955ea0bec42c58291da733bf5bfa604f2 | [
"MIT"
] | 63 | 2021-02-08T01:12:18.000Z | 2022-03-31T20:18:53.000Z | README.md | anatawa12/auto-visitor | 55cb569955ea0bec42c58291da733bf5bfa604f2 | [
"MIT"
] | null | null | null | Auto Visitor Kotlin Compiler Plugin
====
[](https://api.anatawa12.com/short/a12-slowly-doc)
[](https://plugins.gradle.org/plugin/com.anatawa12.auto-visitor)
A kotlin compiler plugin to make easy to write visitor pattern.
This plugin is going to provides two code generator shown below:
1. Generate calling `accept` with visitor anonymous object from `when` expr with metadata by annotation.
1. Generate `accept` method and visitor abstract class from annotations.
## How to use
First, you need to apply this gradle plugin
```kotlin
plugins {
id("org.jetbrains.kotlin.jvm") version "<kotlin version>"
id("com.anatawa12.auto-visitor") version "<version>"
}
```
To generate visitor class and accept function, add `@GenerateAccept`, `@HasVisitor`, and `@HasAccept` to the parent
class, add `@GenerateVisitor` to the visitor abstract class, and add `@HasVisitor` to each child class.
TODO: add example code and link to it.
To generate calling accept function, surround when expr with `autoVisitor` like shown below:
```kotlin
autoVisitor(some_expr) { variable ->
when (variable) {
is SomeClass -> {
statements
}
else -> {
statements
}
}
}
```
## Status of implementation
- [x] Automatically include this library to classpath in gradle plugin
- [x] Generating visitor and accept method
- [x] Generating visitor abstract class
- [x] Generating accept method
- [x] Provide Compilation Error
- [x] Generating calling accept from when
- [x] Generating calling accept from when
- [x] Provide Compilation Error
## Structure of this project
- [compiler-plugin](./compiler-plugin)
The compiler plugin of Kotlin.
- [gradle-plugin](./gradle-plugin)
The gradle plugin. This includes applying Kotlin compiler plugin, applying Annotation Processor.
- [annotation-processor](./annotation-processor)
The pluggable annotation processor to verify annotation usages from java.
- [annotation-value-gen](./annotation-value-gen)
A pluggable annotation processor for the compiler plugin. See readme in it for more details
## Motivation
Because the generated code of it is linear search,
`when` expr with `is` checking is much slower than visitor pattern
(see [benchmarks](./benchmarks)), so It's better to use visitor pattern.
However, The visitor pattern needs much boilerplate code,
so I want not to write visitor pattern myself, want to generate it.
| 33.024096 | 286 | 0.739511 | eng_Latn | 0.945488 |
9b8c151e124949945a504ffbba2958d545ea7321 | 2,051 | md | Markdown | src/sl/2020-01/04/04.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 68 | 2016-10-30T23:17:56.000Z | 2022-03-27T11:58:16.000Z | src/sl/2020-01/04/04.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 367 | 2016-10-21T03:50:22.000Z | 2022-03-28T23:35:25.000Z | src/sl/2020-01/04/04.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 109 | 2016-08-02T14:32:13.000Z | 2022-03-31T10:18:41.000Z | ---
title: Ognjena preizkušnja
date: 21/01/2020
---
Za tri judovske mladeniče je bilo češčenje kipa, ki ga je dvignil kralj, v očitnem nasprotju s češčenjem, ki so mu bili priče v Jeruzalemu. Čeprav so v kraljestvu zasedali pomembne položaje in so bili zvesti kralju, je bila njihova zvestoba Bogu pred zvestobo človeku. Brez obotavljanja so bili pripravljeni še naprej služiti kralju kot zvesti upravitelji, vendar pri tem obredu niso želeli sodelovati.
`Kateri dve besedili sta vplivali na stališče, ki so ga zavzeli ti mladeniči? 2 Mz 20, 3-6; 5 Mz 6,4`
Kralj je ukazal, naj se vsi zbrani poklonijo zlatemu kipu, ko bodo zaslišali zvok glasbil. Samo trije – Sadrah, Mesah in Abednego – so si upali biti neposlušni ukazu. Ni trajalo dolgo, ko je nekdo mladeniče zatožil kralju. Tožniki so poskušali v kralju vzbuditi jezo tako, da so omenili tri stvari. Prvič, rekli so, da je kralj tisti, ki je tem mladeničem izročil upravo nad babilonsko pokrajino. Drugič, kralju so zatožili, da judovski mladeniči ne častijo njegovih bogov. Tretjič, poudarili so tudi, da mladeniči poleg tega ne častijo niti zlatega kipa, ki ga je postavil kralj. (Dan 3,12) Kljub svoji jezi je kralj Danielovim tovarišem dal še eno priložnost. Pripravljen je bil ponoviti obred, da bi mladeniči lahko spremenili odločitev in se poklonili kipu. Če tega ne bodo storili, bodo vrženi v razbeljeno peč. Nebukadnezar je govor zaključil z nadvse predrzno izjavo: »Kdo je tisti bog, ki bi vas rešil iz mojih rok?« (Dan 3,15 SSP)
Z nadnaravnim pogumom so kralju odgovorili: »Bodisi, da nas naš Bog, ki mu služimo, more oteti iz goreče, razbeljene peči – in on nas otme iz tvoje roke, o kralj! bodisi da ne, znano bodi tebi, kralj, da ne bomo služili tvojim bogovom in zlate podobe, ki si jo postavil, ne bomo molili!« (Dan 3,17-18)
`Čeprav so mladeniči vedeli, da jih Bog lahko reši, niso imeli zagotovila, da bo to zares naredil. Kljub temu so se odločili, da ne bodo poslušni kraljevi zapovedi, čeprav so vedeli, da bodo živi zgoreli. Kje lahko tudi mi dobimo takšno vero?`
| 136.733333 | 939 | 0.782058 | slv_Latn | 1.000005 |
9b8d35d937753c20b0686e7a8d2553f8074ec3d7 | 2,902 | md | Markdown | README.md | fossabot/turnpike | 915cbf88e06aa757bbc0e375cf1a3bc37a4895e3 | [
"MIT"
] | null | null | null | README.md | fossabot/turnpike | 915cbf88e06aa757bbc0e375cf1a3bc37a4895e3 | [
"MIT"
] | null | null | null | README.md | fossabot/turnpike | 915cbf88e06aa757bbc0e375cf1a3bc37a4895e3 | [
"MIT"
] | null | null | null | # turnpike
[](https://travis-ci.org/jay-depot/turnpike)
[](https://codeclimate.com/github/jay-depot/turnpike)
[](https://codeclimate.com/github/jay-depot/turnpike/coverage)
[](https://app.fossa.io/projects/git%2Bgithub.com%2Fjay-depot%2Fturnpike?ref=badge_shield)
A stylistically permissive, flexible MVC framework for Node.js
Built on Express, Turnpike aims to provide a productivity boost similar to that of other MVC frameworks that include code generators. Developers coming from Rails or Symfony should find many of the idioms familiar.
## Features:
- Automatically manages worker processes for high availability and optimal performance in a multi-CPU production environment.
- Rich pre/post hook system to give ample opportunities to alter the behavior of your app without needing to reinvent framework functions
- Smart configuration system that automatically provides a fallback chain from Environment variables to project defaults and finally framework defaults.
- Access control rules are loaded from a separate module from the controller that provides the endpoints being secured. This provides a nice, natural separation of concerns for access rules.
- RESTful response format negotiation. Built-in support for HTML and JSON responses, which are automatically negotiated based on the Accept header from the client. Adding additional formats is done on a per-controller basis (project-wide is coming soon) and is very straightforward.
- Optional session API if you want to serve HTML pages, set up using Connect; you just need to pick a storage engine and pass it into one method call.
- Useful session extensions, like attaching messages for the user
- Command line tools for stub generation, route verification
- A model layer based on persistent state, and dispatching actions. If you've worked with react/redux on the front-end, then Turnpike should look very familiar.
## Future Roadmap:
- Plug-in API
- Multipart file uploads made easy.
- Compile back-end templates into front-end JavaScript code.
- Pluggable template engines
- Simplify new project generation and setup down to one or two commands.
## Getting started
```bash
$ npm install -g turnpike
$ turnpike create project Turnpike Example
$ cd turnpike-example
$ npm install
$ turnpike testdrive
```
Then just visit http://localhost:1337
## Contributing
See CONTRIBUTING.md
## License
[](https://app.fossa.io/projects/git%2Bgithub.com%2Fjay-depot%2Fturnpike?ref=badge_large)
| 65.954545 | 284 | 0.794969 | eng_Latn | 0.949645 |
9b8d43d6a61fbccac8f63e60a0388734b7a9c4fe | 19,963 | md | Markdown | articles/search/search-what-is-an-index.md | salem84/azure-docs.it-it | 3ec6a13aebb82936591c7fc479f084be9bb8776d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/search/search-what-is-an-index.md | salem84/azure-docs.it-it | 3ec6a13aebb82936591c7fc479f084be9bb8776d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/search/search-what-is-an-index.md | salem84/azure-docs.it-it | 3ec6a13aebb82936591c7fc479f084be9bb8776d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Creare una definizione di indice e concetti - Ricerca di Azure
description: Introduzione all'indicizzazione di termini e concetti in Ricerca di Azure, incluse le parti dei componenti e la struttura fisica.
author: HeidiSteen
manager: nitinme
ms.author: heidist
services: search
ms.service: search
ms.topic: conceptual
ms.date: 05/02/2019
ms.custom: seodec2018
ms.openlocfilehash: 0a26cfc578f12044cb5834f202a0fed5d0a30274
ms.sourcegitcommit: bb8e9f22db4b6f848c7db0ebdfc10e547779cccc
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 08/20/2019
ms.locfileid: "69647361"
---
# <a name="create-a-basic-index-in-azure-search"></a>Creare un indice di base in Ricerca di Azure
In Ricerca di Azure un *indice* è un archivio persistente di *documenti* e altri costrutti usati per la ricerca filtrata e full-text in un servizio di Ricerca di Azure. A livello concettuale, un documento è un singola unità di dati ricercabili nell'indice. Ad esempio, un rivenditore di e-commerce può avere un documento per ogni elemento in vendita, un'agenzia di stampa può avere un documento per ogni articolo e così via. Applicando questi concetti ai più familiari elementi di database equivalenti, un *indice* è concettualmente analogo a una *tabella* e i *documenti* equivalgono in linea di massima alle *righe* di una tabella.
Quando si aggiunge o si carica un indice, Ricerca di Azure crea strutture fisiche basate sullo schema fornito dall'utente. Ad esempio, se un campo nell'indice è contrassegnato come ricercabile, viene creato un indice invertito per il campo. Successivamente, quando si aggiungono o caricano documenti o si inviano le query di ricerca in Ricerca di Azure, si inviano le richieste a un indice specifico nel servizio di ricerca. Il caricamento di campi con valori di documenti viene chiamato *indicizzazione* o inserimento dati.
È possibile creare un indice nel portale, [API REST](search-create-index-rest-api.md), o [.NET SDK](search-create-index-dotnet.md).
## <a name="recommended-workflow"></a>Flusso di lavoro consigliato
Il raggiungimento di una corretta progettazione degli indici si ottiene in genere tramite più iterazioni. L'utilizzo di una combinazione di strumenti e API consente di finalizzare rapidamente il progetto.
1. Determinare se è possibile usare un [indicizzatore](search-indexer-overview.md#supported-data-sources). Se i dati esterni sono una delle origini dati supportate, è possibile creare un prototipo e caricare un indice usando la procedura guidata [**Importa dati**](search-import-data-portal.md).
2. Se non è possibile usare **Importa dati**, è comunque possibile [creare un indice iniziale nel portale](search-create-index-portal.md), aggiungendo campi, tipi di dati e assegnando attributi tramite i controlli nella pagina **Aggiungi indice**. Il portale visualizza gli attributi che sono disponibili per i tipi di dati diversi. Ciò è utile se non si ha familiarità con la progettazione di indici.

Facendo clic su **Crea**, si creano nel servizio di ricerca tutte le strutture fisiche che supportano l'indice.
3. Scaricare lo schema dell'indice mediante [Get Index REST API (Ottenere un indice API REST)](https://docs.microsoft.com/rest/api/searchservice/get-index) e uno strumento di test Web come [Postman](search-get-started-postman.md). Si ottiene così una rappresentazione JSON dell'indice creato nel portale.
A questo punto si passa a un approccio basato sul codice. Poiché non è possibile modificare un indice creato in precedenza, il portale non è particolarmente adatto per l'iterazione. È tuttavia possibile usare Postman e REST per le attività rimanenti.
4. [Caricare l'indice con i dati](search-what-is-data-import.md). Ricerca di Azure accetta i documenti JSON. Per caricare i dati a livello di codice, è possibile usare Postman con i documenti JSON nel payload della richiesta. Se i dati non possono essere facilmente espressi in JSON, questo passaggio sarà più laborioso.
5. Eseguire query sull'indice, esaminare i risultati ed eseguire ulteriormente l'iterazione sullo schema dell'indice fino a quando non si visualizzano i risultati previsti. È possibile usare [**Esplora ricerche**](search-explorer.md) o Postman per eseguire query sull'indice.
6. Continuare a usare il codice per eseguire l'iterazione sul progetto.
Poiché le strutture fisiche vengono create nel servizio, è necessario [eliminare e ricreare](search-howto-reindex.md) gli indici quando si apportano modifiche sostanziali a una definizione di campo esistente. Ciò significa che durante lo sviluppo, è necessario pianificare ricompilazioni frequenti. È possibile prendere in considerazione l'uso di un subset di dati per velocizzare le ricompilazioni.
Per la progettazione iterativa è consigliabile un codice anziché un approccio basato sul portale. Se ci si affida al portale per la definizione dell'indice, è necessario compilare la definizione dell'indice a ogni ricompilazione. In alternativa, strumenti come [Postman e API REST](search-get-started-postman.md) sono utili per eseguire test dei modelli di verifica quando i progetti di sviluppo sono ancora in fase iniziale. È possibile apportare modifiche incrementali a una definizione di indice nel corpo della richiesta e quindi inviare la richiesta al servizio per ricreare un indice usando uno schema aggiornato.
## <a name="components-of-an-index"></a>Componenti di un indice
In modo schematico, un indice Ricerca di Azure è costituito dagli elementi seguenti.
La [*raccolta campi*](#fields-collection) in genere costituisce la maggior parte di un indice, in cui ogni campo è denominato, tipizzato e attribuito con comportamenti consentiti che determinano la modalità di utilizzo. Altri elementi includono [suggerimenti](#suggesters), [profili di Punteggio](#scoring-profiles)e [analizzatori](#analyzers) con parti componente per supportare le opzioni di personalizzazione, [CORS](#cors) e chiave di [crittografia](#encryption-key) .
```json
{
"name": (optional on PUT; required on POST) "name_of_index",
"fields": [
{
"name": "name_of_field",
"type": "Edm.String | Collection(Edm.String) | Edm.Int32 | Edm.Int64 | Edm.Double | Edm.Boolean | Edm.DateTimeOffset | Edm.GeographyPoint",
"searchable": true (default where applicable) | false (only Edm.String and Collection(Edm.String) fields can be searchable),
"filterable": true (default) | false,
"sortable": true (default where applicable) | false (Collection(Edm.String) fields cannot be sortable),
"facetable": true (default where applicable) | false (Edm.GeographyPoint fields cannot be facetable),
"key": true | false (default, only Edm.String fields can be keys),
"retrievable": true (default) | false,
"analyzer": "name_of_analyzer_for_search_and_indexing", (only if 'searchAnalyzer' and 'indexAnalyzer' are not set)
"searchAnalyzer": "name_of_search_analyzer", (only if 'indexAnalyzer' is set and 'analyzer' is not set)
"indexAnalyzer": "name_of_indexing_analyzer", (only if 'searchAnalyzer' is set and 'analyzer' is not set)
"synonymMaps": [ "name_of_synonym_map" ] (optional, only one synonym map per field is currently supported)
}
],
"suggesters": [
{
"name": "name of suggester",
"searchMode": "analyzingInfixMatching",
"sourceFields": ["field1", "field2", ...]
}
],
"scoringProfiles": [
{
"name": "name of scoring profile",
"text": (optional, only applies to searchable fields) {
"weights": {
"searchable_field_name": relative_weight_value (positive #'s),
...
}
},
"functions": (optional) [
{
"type": "magnitude | freshness | distance | tag",
"boost": # (positive number used as multiplier for raw score != 1),
"fieldName": "...",
"interpolation": "constant | linear (default) | quadratic | logarithmic",
"magnitude": {
"boostingRangeStart": #,
"boostingRangeEnd": #,
"constantBoostBeyondRange": true | false (default)
},
"freshness": {
"boostingDuration": "..." (value representing timespan leading to now over which boosting occurs)
},
"distance": {
"referencePointParameter": "...", (parameter to be passed in queries to use as reference location)
"boostingDistance": # (the distance in kilometers from the reference location where the boosting range ends)
},
"tag": {
"tagsParameter": "..." (parameter to be passed in queries to specify a list of tags to compare against target fields)
}
}
],
"functionAggregation": (optional, applies only when functions are specified)
"sum (default) | average | minimum | maximum | firstMatching"
}
],
"analyzers":(optional)[ ... ],
"charFilters":(optional)[ ... ],
"tokenizers":(optional)[ ... ],
"tokenFilters":(optional)[ ... ],
"defaultScoringProfile": (optional) "...",
"corsOptions": (optional) {
"allowedOrigins": ["*"] | ["origin_1", "origin_2", ...],
"maxAgeInSeconds": (optional) max_age_in_seconds (non-negative integer)
},
"encryptionKey":(optional){
"keyVaultUri": "azure_key_vault_uri",
"keyVaultKeyName": "name_of_azure_key_vault_key",
"keyVaultKeyVersion": "version_of_azure_key_vault_key",
"accessCredentials":(optional){
"applicationId": "azure_active_directory_application_id",
"applicationSecret": "azure_active_directory_application_authentication_key"
}
}
}
```
<a name="fields-collection"></a>
## <a name="fields-collection-and-field-attributes"></a>Raccolta di campi e attributi di campi
Quando si definisce lo schema, è necessario specificare il nome, tipo e gli attributi di ogni campo nell'indice. Il tipo di campo classifica i dati archiviati in quel campo. Gli attributi sono impostati nei singoli campi per specificare come viene usato il campo. La tabella seguente enumera gli attributi che è possibile specificare.
### <a name="data-types"></a>Tipi di dati
| Type | Descrizione |
| --- | --- |
| *Edm.String* |Testo facoltativamente soggetto a tokenizzazione per la ricerca full-text (suddivisione delle parole, stemming e così via). |
| *Collection(Edm.String)* |Elenco di stringhe facoltativamente soggette a tokenizzazione per la ricerca full-text. Non esiste alcun limite superiore teorico al numero di elementi in una raccolta, ma alle raccolte si applica il limite massimo di 16 MB di dimensioni del payload. |
| *Edm.Boolean* |Contiene valori true/false. |
| *Edm.Int32* |Valori integer a 32 bit. |
| *Edm.Int64* |Valori integer a 64 bit. |
| *Edm.Double* |Dati numerici a precisione doppia. |
| *Edm.DateTimeOffset* |Valori di ora rappresentati in formato OData V4 (ad esempio `yyyy-MM-ddTHH:mm:ss.fffZ` o `yyyy-MM-ddTHH:mm:ss.fff[+/-]HH:mm`). |
| *Edm.GeographyPoint* |Punto che rappresenta una località geografica del mondo. |
È possibile trovare altre informazioni sui [tipi di dati supportati di Ricerca di Azure qui](https://docs.microsoft.com/rest/api/searchservice/Supported-data-types).
### <a name="index-attributes"></a>Attributi dell'indice
Esattamente un campo nell'indice deve essere designato come un campo **chiave** che identifica in modo univoco ogni documento.
Altri attributi determinano la modalità di utilizzo di un campo in un'applicazione. Ad esempio, l' attributo ricercabile viene assegnato a ogni campo che deve essere incluso in una ricerca full-text.
Le API usate per compilare un indice hanno comportamenti predefiniti variabili. Per le [API REST](https://docs.microsoft.com/rest/api/searchservice/Create-Index), la maggior parte degli attributi è abilitata per impostazione predefinita (ad esempio, la **ricerca** e il **recupero** sono true per i campi di stringa) ed è spesso necessario impostarli solo se si desidera disattivarli. Per .NET SDK, il valore opposto è true. Per le proprietà non impostate in modo esplicito, per impostazione predefinita viene disabilitato il comportamento di ricerca corrispondente, a meno che non venga abilitato in modo specifico.
| Attributo | DESCRIZIONE |
| --- | --- |
| `key` |Stringa che fornisce l'ID univoco di ogni documento, usata per la ricerca di documenti. Ogni indice deve avere una chiave. Un solo campo può essere la chiave e deve essere impostata su Edm.String. |
| `retrievable` |Specifica se il campo può essere restituito nel risultato di una ricerca. |
| `filterable` |Consente di usare il campo nelle query di filtro. |
| `Sortable` |Consente a una query ordinare i risultati della ricerca usando questo campo. |
| `facetable` |Consente di usare un campo in una struttura di [esplorazione in base a facet](search-faceted-navigation.md) per i filtri autoindirizzati. In genere, i campi che contengono valori ricorrenti che è possibile usare per raggruppare più documenti, ad esempio, più documenti che rientrano in una categoria di servizi o una singola marca, funzionano meglio come facet. |
| `searchable` |Contrassegna il campo come disponibile per la ricerca full-text. |
## <a name="storage-implications"></a>Implicazioni relative all'archiviazione
Gli attributi selezionati hanno un impatto sull'archiviazione. Lo screenshot seguente illustra i modelli di archiviazione dell'indice derivanti da diverse combinazioni di attributi.
L'indice è basato sull'origine dati [incorporata di esempio Real Immobiliare](search-get-started-portal.md) , che è possibile indicizzare ed eseguire query nel portale. Anche se non vengono visualizzati gli schemi dell'indice, è possibile dedurre gli attributi in base al nome dell'indice. Ad esempio, l'indice *realestate-searchable*ha soltanto l'attributo **ricercabile** selezionato, l'indice *realestate-retrievable*ha soltanto l'attributo **recuperabile** selezionato e così via.

Sebbene queste varianti di indice siano artificiali, è possibile farvi riferimento per una considerazione generale sul modo in cui gli attributi influiscono sull'archiviazione. L'impostazione **recuperabile** aumenta le dimensioni dell'indice? No. L'aggiunta di campi a uno **Strumento suggerimenti** aumenta le dimensioni dell'indice? Sì.
Gli indici che supportano filtro e ordinamento sono proporzionalmente più grandi rispetto agli indici che supportano soltanto la ricerca full-text. Questo avviene perché filtro e ordinamento eseguono query su corrispondenze esatte in modo che i documenti siano archiviati senza alterazioni. Al contrario, i campi ricercabili che supportano la ricerca full-text e fuzzy usano indici invertiti, popolati con termini in formato token che consumano meno spazio rispetto a documenti interi.
> [!Note]
> L'architettura di archiviazione è considerata un dettaglio di implementazione di Ricerca di Azure e può essere soggetta a modifiche senza preavviso. Non è garantito che il comportamento attuale verrà mantenuto in futuro.
## <a name="suggesters"></a>Componenti per il suggerimento
Un componente di suggerimento è una sezione dello schema che definisce quali campi in un indice vengono utilizzati per supportare le query con completamento automatico nelle ricerche. In genere, le stringhe di ricerca parziale vengono inviate ai [Suggerimenti (API REST)](https://docs.microsoft.com/rest/api/searchservice/suggestions) mentre l'utente digita la query di ricerca e l'API restituisce un set di espressioni suggerite.
I campi aggiunti a uno strumento suggerimenti vengono usati per compilare i termini di ricerca con completamento automatico. Tutti i termini di ricerca vengono creati durante l'indicizzazione e archiviati separatamente. Per altre informazioni sulla creazione di una struttura per uno strumento suggerimenti, vedere [Aggiungere strumenti suggerimenti](index-add-suggesters.md).
## <a name="scoring-profiles"></a>Profili di punteggio
Un [profilo di punteggio](index-add-scoring-profiles.md) è una sezione dello schema che definisce i comportamenti di punteggio personalizzati che consentono di determinare quali elementi verranno visualizzati più in alto nei risultati della ricerca. I profili di punteggio sono costituiti da funzioni e campi ponderati. Per utilizzarli, è necessario specificare il nome di un profilo nella stringa di query.
Un profilo di punteggio predefinito viene eseguito in background al fine di calcolare un punteggio di ricerca per ogni elemento visualizzato in un set di risultati. È possibile utilizzare un profilo di punteggio interno, senza nome. In alternativa, è possibile impostare **defaultScoringProfile** affinché usi un profilo personalizzato come predefinito. Tale profilo può essere richiamato ogni volta in cui un profilo personalizzato non viene specificato nella stringa di query.
## <a name="analyzers"></a>Analizzatori
L'elemento analizzatori imposta il nome dell'analizzatore di lingua da usare per il campo. Per altre informazioni sull'intervallo degli analizzatori disponibili, vedere [Aggiungere analizzatori del linguaggio a un indice di Ricerca di Azure](search-analyzers.md). Gli analizzatori possono essere usati solo con campi ricercabili. Dopo aver assegnato l'analizzatore a un campo sarà possibile modificarlo solo ricompilando l'indice.
## <a name="cors"></a>CORS
JavaScript sul lato client non può chiamare API per impostazione predefinita perché il browser impedisce tutte le richieste con origini diverse. Per consentire query con origini diverse nell'indice, abilitare CORS (Cross-Origin Resource Sharing) impostando l'attributo **corsOptions**. Per motivi di sicurezza, solo le API di query supportano CORS.
Per CORS è possibile impostare le opzioni seguenti:
+ **allowedOrigins** (obbligatoria): Si tratta di un elenco di origini a cui verrà concesso l'accesso all'indice. Questo significa che al codice JavaScript servito da queste origini sarà consentito eseguire query sull'indice, purché fornisca la chiave API corretta. Ogni origine è in genere nel formato `protocol://<fully-qualified-domain-name>:<port>` anche se `<port>` spesso viene omessa. Vedere [Utilizzare la condivisione di risorse tra origini (Wikipedia)](https://en.wikipedia.org/wiki/Cross-origin_resource_sharing) per altre informazioni.
Per consentire l'accesso a tutte le origini, includere `*` come unico elemento nella matrice**allowedOrigins**. *Non si consiglia questa pratica per i servizi di ricerca della produzione* ma spesso è utile per lo sviluppo e il debug.
+ **maxAgeInSeconds** (facoltativa): I browser usano questo valore per determinare la durata (in secondi) di memorizzazione nella cache delle risposte preliminari CORS. Questo valore deve essere un intero non negativo. A un valore più grande corrispondono prestazioni migliori, ma deve trascorrere più tempo prima che le modifiche dei criteri CORS diventino effettive. Se questo valore non è impostato, viene usata una durata predefinita di 5 minuti.
## <a name="encryption-key"></a>Chiave di crittografia
Sebbene tutti gli indici di ricerca di Azure siano crittografati per impostazione predefinita usando chiavi gestite da Microsoft, è possibile configurare gli indici per la crittografia con **chiavi gestite dal cliente** in Key Vault. Per altre informazioni, vedere [gestire le chiavi di crittografia in ricerca di Azure](search-security-manage-encryption-keys.md).
## <a name="next-steps"></a>Passaggi successivi
Conoscendo la composizione dell'indice, è possibile continuare nel portale per creare il primo indice.
> [!div class="nextstepaction"]
> [Aggiungere un indice (portale)](search-create-index-portal.md)
| 86.047414 | 633 | 0.769774 | ita_Latn | 0.998825 |
9b8fd72b9033966346453ed3800834a1ebbfb8b8 | 795 | md | Markdown | _posts/2020-03-04-governo-de-pe-diz-que-liberacao-de-cruzeiros-em-noronha-e-turismo-predatorio.md | tatudoquei/tatudoquei.github.io | a3a3c362424fda626d7d0ce2d9f4bead6580631c | [
"MIT"
] | null | null | null | _posts/2020-03-04-governo-de-pe-diz-que-liberacao-de-cruzeiros-em-noronha-e-turismo-predatorio.md | tatudoquei/tatudoquei.github.io | a3a3c362424fda626d7d0ce2d9f4bead6580631c | [
"MIT"
] | null | null | null | _posts/2020-03-04-governo-de-pe-diz-que-liberacao-de-cruzeiros-em-noronha-e-turismo-predatorio.md | tatudoquei/tatudoquei.github.io | a3a3c362424fda626d7d0ce2d9f4bead6580631c | [
"MIT"
] | 1 | 2022-01-13T07:57:24.000Z | 2022-01-13T07:57:24.000Z | ---
layout: post
item_id: 2908818708
title: >-
Governo de PE diz que liberação de cruzeiros em Noronha é turismo predatório
author: Tatu D'Oquei
date: 2020-03-04 15:24:00
pub_date: 2020-03-04 15:24:00
time_added: 2020-03-08 12:22:03
category:
tags: []
image: https://f.i.uol.com.br/fotografia/2018/12/28/15460176345c265b62690f4_1546017634_3x2_rt.jpg
---
O governo de Pernambuco reagiu à divulgação de que a União quer liberar cruzeiros e exploração de naufrágios artificiais no arquipélago de Fernando de Noronha.
**Link:** [https://www1.folha.uol.com.br/ambiente/2020/03/governo-de-pe-diz-que-liberacao-de-cruzeiros-em-noronha-e-turismo-predatorio.shtml](https://www1.folha.uol.com.br/ambiente/2020/03/governo-de-pe-diz-que-liberacao-de-cruzeiros-em-noronha-e-turismo-predatorio.shtml)
| 41.842105 | 272 | 0.777358 | por_Latn | 0.787258 |
9b907c3263d28712c9db7194a6051f68f57d1865 | 6,262 | md | Markdown | posts/cats-cradle.md | PizzaMyHeart/website | 6e7a45c89bd376681ee6e1701ab30c938af4aa4c | [
"MIT"
] | null | null | null | posts/cats-cradle.md | PizzaMyHeart/website | 6e7a45c89bd376681ee6e1701ab30c938af4aa4c | [
"MIT"
] | null | null | null | posts/cats-cradle.md | PizzaMyHeart/website | 6e7a45c89bd376681ee6e1701ab30c938af4aa4c | [
"MIT"
] | null | null | null | ---
title: "Review: Cat's Cradle"
description: A review of Cat's Cradle by Kurt Vonnegut.
date: 2020-06-04
tags: ['review', 'books']
layout: layouts/post.njk
---
*May contain spoilers.*
This is my second Kurt Vonnegut novel after reading *Slaughterhouse-Five*, well, five years ago. A journalist, in his quest to write a book about one of the creators of the atomic bomb, finds himself on the fictional Caribbean island nation of San Lorenzo. In typical kooky Vonnegut fashion, the journalist goes there after finding out that one of the scientist's sons, who used to work at a hobby shop making model trains, is now a Major General in San Lorenzo.
For such a short novel, Vonnegut managed to fit in a huge cast of wacky, colourful characters. There's "Papa" Monzano, the dying dictator of San Lorenzo. Hazel Crosby never fails to pounce on Indiana residents discussed in conversation, and insists on everyone else calling her "Mom". Her husband, H. Lowe Crosby, is a cartoonish American capitalist who goes to San Lorenzo in search of cheap labour. Julian Castle left the sugar business to set up a humanitarian hospital on San Lorenzo, but turns out to be a cynical asshole in person. And, of course, there's Bokonon, the holy man who started a fake/real religion and a fake/real war with the state of San Lorenzo, and whose calypsos pepper every chapter of the story.
To me, the most powerful message in Cat's Cradle is how poorly equipped we humans are to handle the power of technology. This book was published soon after the Cuban Missile Crisis; reading it at the time must have been a horrifying experience. Now it's a quietly uncomfortable read, reminding us in between absurd punchlines of the knife's edge on which our geopolitical climate is delicately poised. We learn early on that after the war, Felix Hoenikker, the scientist whose children accompany our journalist narrator on San Lorenzo, developed ice-nine, a solid form of water that causes liquid water to crystallise into more ice-nine on contact. Felix Hoenikker is a caricature of the Man of Science, a scientist who seems to lack a human soul, and it is his ice-nine that destroys the world in a moment of slapstick comedy. Instant destruction, in much the same way the atomic bomb levelled Hiroshima and Nagasaki. Science without humanism and ethics to guide it, and without art to lend it context, can be a mindless and destructive machine. Scientists have a moral obligation to consider the ethical implications of their work. They cannot claim to be objective and rational beings who have no stance on how the fruits of their research are used.
Vonnegut is a true wordsmith with the ability to churn out delicious turns of phrase. I find his writing as poetic as it is simple. His prose flows naturally, and his humour never fails to entertain. To be sure, there are some awfully dated slurs, and he doesn't treat people of colour, women, and possibly people of short stature very well in his writing.
The novel is made up of short, punchy chapters, which Vonnegut has described as a series of jokes making up a mosaic. Here's a non-exhaustive list of bits I enjoyed:
- Poo-tee-weet
- National Chairman for Poets and Painters for Immediate Nuclear War
- "Papa's" death rattles being amplified by a microphone
- "Papa" being referred to as "Papa"
- Every single one of Bokonon's calypsos
- Bokononist terms such as "duprass" and "karass" and "granfalloon"
- "Dynamic tension", a concept in Bokononism, being a term used by a mail-order bodybuilding instructor to describe isometric exercises
- Death by hook being inspired by Madame Tussauds, and turning from a fictional punishment into an actual one
- The Hundred Martyrs to Democracy getting wiped out before leaving the San Lorenzo harbour
- McCabe and Bokonon pretending to be enemies, then becoming actual enemies
- Bokonon declaring himself an outlaw
- Dr Vox Humana being named after the church organ that killed his mother
- Everyone on San Lorenzo, including Papa, being a Bokononist--on the island where Bokononism is outlawed and its adherents sentenced to death on the hook.
Though not as uniformly depressing as *Slaughterhouse-Five*, this story is no less bleak and pessimistic. The titular cat's cradle is referenced by Newt, who asks the narrator: where is the cat? And where is the cradle? Our existence is meaningless, in spite of our propensity to divine meaning from a tangled mess of string. It may simply be the naivety that comes with my age, but I just can't bring myself to agree with Vonnegut's nihilistic view that humans are fundamentally cruel and stupid--understandable as that may be given his wartime experiences. To be fair, he does offer humour as a way to cope, but the laughter contains just a bit too much bitterness for my taste. As absurd as the world may seem--and believe me, 2020 has done a great job of driving that home--the answer can't be to lie down on a hilltop, eat some ice-nine, and die while defiantly flipping off the heavens with a smile. Sure, I don't have the answers, but for now it seems enough to take each day as it comes and find joy in all the small things.
*Quotes*
- His pores looked as big as craters on the moon. His ears and nostrils were stuffed with hair. Cigar smoke made him smell like the mouth of Hell. So close up, my father was the ugliest thing I had ever seen. I dream about it all the time.
- My soul seemed as foul as smoke from burning cat's fur.
- The words were a paraphrase of the suggestion by Jesus: "Render therefore unto Caesar the things which are Caesar's." Bokonon's paraphrase was this: "Pay no attention to Caesar. Caesar doesn't have the slightest idea what's really going on."
- ... the brainless ecstasy of a volunteer fireman.
- It posed the question posed by all such stone piles: how had puny men moved stones so big? And, like all such stone piles, it answered the question itself. Dumb terror had moved those stones so big.
- Perhaps, when we remember wars, we should take off our clothes and paint ourselves blue and go on all fours all day long and grunt like pigs. That would surely be more appropriate than noble oratory and shows of flags and well-oiled guns.
Day 7 of [#100DaysToOffload](https://100daystooffload.com/) | 142.318182 | 1,252 | 0.788087 | eng_Latn | 0.999822 |
9b91070ba52154d94fc0f11a2c3154e93b69aa77 | 1,574 | md | Markdown | what-is-a-lib-file.md | devs4v/today-i-learned | 897b94251a22c85f29358ecd4182ca455e661e78 | [
"MIT"
] | 1 | 2017-04-03T13:57:37.000Z | 2017-04-03T13:57:37.000Z | what-is-a-lib-file.md | devs4v/today-i-learned | 897b94251a22c85f29358ecd4182ca455e661e78 | [
"MIT"
] | 1 | 2017-04-03T13:58:15.000Z | 2017-04-06T10:13:41.000Z | what-is-a-lib-file.md | devs4v/today-i-learned | 897b94251a22c85f29358ecd4182ca455e661e78 | [
"MIT"
] | null | null | null | This came from: http://stackoverflow.com/questions/3250467/what-is-inside-lib-file-of-static-library-statically-linked-dynamic-library-an
The question was:
What is inside .lib file of Static library, Statically linked dynamic library and dynamically linked dynamic library?
Answers that I liked:
A LIB file is used to build your program; it only exists on your build machine and you don't ship it. There are two kinds. A static link library is a bag of .obj files, collected into a single file. The linker picks any chunks of code from the file when it needs to resolve an external identifier.
But more relevant to DLLs, a LIB file can also be an import library. It is then a simple small file that includes the name of the DLL and a list of all the functions exported by the DLL. You'll need to provide it to the linker when you build a program that uses the DLL so it knows that an external identifier is actually a function exported by the DLL. The linker uses the import library to add entries to the import table for the EXE, which is in turn used by Windows at runtime to figure out which DLLs need to be loaded to run the program.
Here's a summary:
| Linking | Static | DLL | LoadLibrary |
|---------|--------|-----|-------------|
| API code lives | In your compiled program | In the DLL | In the DLL |
| Function calls | Direct, may be elided | Indirect via table filled automatically | Indirect via your own function pointers |
| Burden | Compiler | Compiler/OS | You/OS |
| 87.444444 | 547 | 0.711563 | eng_Latn | 0.998992 |