---
title: 'Technical Details'
weight: 1005
menu:
  general:
    parent: "general-bridge"
---

SubBridge differs from typical cross-chain solutions, which only move assets and data between two chains. SubBridge is more like a router linking different cross-chain bridges, allowing assets to be transferred from one chain to any other registered chain. For example, through SubBridge we have implemented the transfer of ERC-20 assets on Moonriver and Ethereum to other parachains in the Polkadot ecosystem. The parachain receiving the asset does not need an EVM cross-chain bridge integrated into its runtime; this is handled by the SubBridge module on Khala.

<p>
<img src="/images/general/subbridge-pallets.png" style="background-color:white;" alt>
<figcaption align = "center">Pallets in Khala</figcaption>
</p>

As shown in the figure below, SubBridge integrates multiple bridge implementations, represented here as BridgeA, BridgeB, and BridgeC. When Khala receives a cross-chain asset transfer, it decides from the destination address whether to forward the transaction to another chain. If the transfer only targets an account in the Khala network, the asset is deposited directly into the receiving account. Forwarding to another chain usually requires that:

- The destination is already supported by some bridge: SubBridge traverses the list of bridges to check whether any of them supports transfers to the destination address, and if so, uses that bridge for forwarding.
- The path format conforms to our specification (see the next section), and enough assets remain to pay the cross-chain fee on the destination chain.
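The bridge-selection logic just described can be sketched as follows. The trait and types here are illustrative stand-ins, not the actual Khala runtime code:

```rust
// Illustrative sketch of SubBridge's routing decision; all names are
// hypothetical stand-ins, not the real Khala runtime types.

#[derive(Debug, PartialEq)]
enum Dest {
    /// An account local to the Khala network: deposit directly.
    Local,
    /// An account on some other chain, identified by a chain id.
    Remote { chain_id: u64 },
}

trait Bridge {
    fn name(&self) -> &'static str;
    /// Whether this bridge can forward transfers to the given chain.
    fn supports(&self, chain_id: u64) -> bool;
}

struct ChainBridge;

impl Bridge for ChainBridge {
    fn name(&self) -> &'static str {
        "cb"
    }
    fn supports(&self, chain_id: u64) -> bool {
        // e.g. Ethereum = 0 and Moonriver = 2 in SubBridge's numbering.
        matches!(chain_id, 0 | 2)
    }
}

/// Traverse the registered bridges and pick the first one that supports
/// the destination; a purely local transfer needs no bridge at all.
fn route<'a>(bridges: &'a [Box<dyn Bridge>], dest: &Dest) -> Option<&'a str> {
    match dest {
        Dest::Local => None,
        Dest::Remote { chain_id } => bridges
            .iter()
            .find(|b| b.supports(*chain_id))
            .map(|b| b.name()),
    }
}

fn main() {
    let bridges: Vec<Box<dyn Bridge>> = vec![Box::new(ChainBridge)];
    assert_eq!(route(&bridges, &Dest::Remote { chain_id: 2 }), Some("cb"));
    assert_eq!(route(&bridges, &Dest::Local), None);
}
```

The real pallet works on XCM locations rather than plain chain ids, but the shape of the decision — local deposit versus first supporting bridge — is the same.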
Thus we can not only transfer assets across chains through multiple bridges, but also try to choose the transfer path with the lowest fee for users.

<p>
<img src="/images/general/subbridge-topology.png" style="background-color:white;" alt>
<figcaption align = "center">SubBridge Topology</figcaption>
</p>

## MultiAsset and MultiLocation

The purpose of SubBridge is to connect assets across multiple chains, so the first problem to solve is how to unify the definitions of assets and locations on those chains. After some research, we settled on MultiAsset and MultiLocation from the Polkadot [XCM protocol](https://github.com/paritytech/xcm-format).

MultiAsset represents a given asset on a given chain. For example, the PHA asset can be expressed as:

```rust
MultiAsset::new(Fungible(amount), Concrete(pha_location))
```

Here `amount` is a quantity of the asset, and `pha_location` is the path of PHA under the XCM protocol standard, defined by SubBridge and represented by a MultiLocation, usually:

```rust
MultiLocation::new(1, X1(Parachain(2004)))
```

where `2004` is the parachain ID of the Khala network.

So how do we use the XCM protocol to represent an account address on a non-parachain? This determines how SubBridge recognizes and forwards cross-chain transactions. In practice, we incorporate non-parachains as sub-addresses of the Khala network, similar to a local area network under TCP/IP. In this way, an account address on an EVM chain can be represented as:

```rust
MultiLocation::new(1, X4(Parachain(2004), GeneralKey(bridge), GeneralIndex(chain), GeneralKey(account)))
```

Here `bridge` identifies a specific bridge, for example "cb" for ChainBridge and "cr" for CelerBridge, and `chain` is the ID of the EVM chain under the SubBridge system.
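To make the addressing scheme concrete, here is a small self-contained model of the X4 junction path above. `Junction` is a stand-in for the real type in the `xcm` crate:

```rust
// Minimal stand-in for the XCM junction type; the real one lives in the
// `xcm` crate. This only models the path layout described above.

#[derive(Debug, Clone, PartialEq)]
enum Junction {
    Parachain(u32),
    GeneralKey(Vec<u8>),
    GeneralIndex(u128),
}

/// Build the path for an EVM account, mirroring
/// MultiLocation::new(1, X4(Parachain(2004), GeneralKey(bridge),
/// GeneralIndex(chain), GeneralKey(account))).
fn evm_account_path(bridge: &[u8], chain: u128, account: &[u8]) -> Vec<Junction> {
    vec![
        Junction::Parachain(2004),
        Junction::GeneralKey(bridge.to_vec()),
        Junction::GeneralIndex(chain),
        Junction::GeneralKey(account.to_vec()),
    ]
}

/// Recover the bridge key ("cb", "cr", ...) from such a path, if present.
fn bridge_of(path: &[Junction]) -> Option<&[u8]> {
    match path {
        [Junction::Parachain(2004), Junction::GeneralKey(bridge), ..] => {
            Some(bridge.as_slice())
        }
        _ => None,
    }
}

fn main() {
    // A hypothetical 20-byte EVM account behind ChainBridge on Moonriver (chain 2).
    let path = evm_account_path(b"cb", 2, &[0x11u8; 20]);
    assert_eq!(path.len(), 4);
    assert_eq!(bridge_of(&path), Some(&b"cb"[..]));
}
```

Parsing the bridge key back out of the path is exactly what lets the runtime decide which bridge module should handle a forwarded transfer.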
Currently Ethereum is 0 and Moonriver is 2; `account` is the account on the EVM chain, usually a 20-byte hexadecimal string.

Similarly, the assets of any chain also need a unified representation. Assets on a parachain have their MultiAsset defined by that parachain's team; EVM-chain assets under SubBridge are, like account addresses, defined as sub-assets of the Khala network. The usual asset location is represented as:

```rust
MultiLocation::new(1, X3(Parachain(2004), GeneralIndex(chain), GeneralKey(token)))
```

Here `token` is the contract address of an ERC-20 or ERC-721 asset on the EVM chain.

## Asset Registration

The registration of SubBridge assets is divided into two parts.

The first part is registering assets into the pallet-assets module. SubBridge uses the pallet-assets module provided by Substrate to manage registered assets. Each registered asset is assigned an asset id and carries extra [registry info](https://github.com/Phala-Network/khala-parachain/blob/5ab4f77163c811fb4a02d337791ce669b41481ad/pallets/assets-registry/src/lib.rs#L62) containing its location, enabled bridges, and properties. Transfers of unregistered assets will fail, whether they go through the EVM bridge or the XCM bridge.

The second part is enabling the corresponding EVM bridge. This only applies to assets that need to cross from Khala to an EVM chain. In SubBridge, the same asset can enable both ChainBridge-based bridges and CelerBridge-based bridges (coming soon); in practice, users tend to choose the solution with the lower fee.

The registration steps are as follows:

- Step 1: we schedule a call of `pallet-registry::forceRegisterAsset` with the given registration information.
When the council enacts the call, an asset instance is created by `pallet-assets`, and some extra registration information is saved in `pallet-registry`. A few things deserve attention. First, each asset has metadata defined in `pallet-assets`, such as `name` and `symbol`; `pallet-registry` provides an extrinsic called `forceSetMetadata` that can be used to update an asset's metadata. Second, each asset has privileged accounts used to manage it, such as `Issuer` and `Admin`, each with different permissions. In `asset-registry`, we set all privileged accounts of each asset to an account derived from `PalletId(b"phala/ar")`, so no external account has permission to act beyond its authority.

All registered assets can be found [here](https://polkadot.js.org/apps/?rpc=wss%3A%2F%2Fkhala.api.onfinality.io%2Fpublic-ws#/assets). The asset registration information is stored on-chain; head to [polkadot.js.app](https://polkadot.js.org/apps/?rpc=wss%3A%2F%2Fkhala.api.onfinality.io%2Fpublic-ws#/chainstate) and choose the chain state query `assetsRegistry→registryInfoByIds` to see the details. Here is a screenshot of the KSM registration information:

<p>
<img src="/images/general/subbridge-assetinfo.png" style="background-color:white;" alt>
<figcaption align = "center">Registration information of KSM</figcaption>
</p>

- Step 2 (optional): after an asset is registered, XCM cross-chain transfer is enabled for it by default. To enable ChainBridge for the asset, another call, `assetRegistry::forceEnabledChainbridge`, must be enacted by the council; this enables cross-chain transfer to a specific EVM chain, and `assetRegistry::forceDisableChainBridge` disables it. Once ChainBridge is enabled for the asset, new data appears in the returned registration information.
For example, the enabled-bridges information of ZLK is shown below:

```sh
enabledBridges: [
  {
    config: Xcmp
    metadata:
  }
  {
    config: {
      ChainBridge: {
        chainId: 2
        resourceId: 0x028da1efb56e124f659fa6d5d95b3cc541ce207cbfee2f4f066061cc92d37bae
        reserveAccount: 0xde50ca45c8f7323ea372fd5d7929b9f37946690b0b668985beebe60431badcea
        isMintable: false
      }
    }
    metadata: 0x00
  }
]
```

Looking at the ChainBridge field: `chainId` being 2 means cross-chain transfer is enabled between the Khala network and Moonriver EVM. `resourceId` binds ZLK on the Khala network to ERC20-ZLK on Moonriver EVM. `reserveAccount` holds ZLK temporarily when it is transferred from the Khala network to Moonriver EVM, and pays it back out to the recipient account when someone transfers ZLK from Moonriver EVM back to the Khala network. `isMintable` being `false` tells the client that it should watch the ZLK balance of the reserve account.

- Step 3 (if Step 2 has been done): we also need to configure the asset in our ChainBridge [Bridge contract](https://github.com/Phala-Network/chainbridge-solidity/blob/phala-bridge/contracts/Bridge.sol) before finally launching cross-chain transfers through ChainBridge. This includes:
  - Binding the resource id generated during registration to the asset's ERC-20 contract address. This is done by executing the [adminSetResource](https://github.com/Phala-Network/chainbridge-solidity/blob/5eef3073ccc75b48e06ce44eee522c2023da974e/contracts/Bridge.sol#L204) method of the Bridge contract.
  - Setting the decimals of the asset by executing the [adminSetDecimals](https://github.com/Phala-Network/chainbridge-solidity/blob/5eef3073ccc75b48e06ce44eee522c2023da974e/contracts/Bridge.sol#L247) method. SubBridge is compatible with assets that have different decimals on the Substrate side and the EVM side.
  - If your asset is burnable and you would like to give mint/burn permission to our contract, we need to mark the asset as burnable in the contract by executing the [adminSetBurnable](https://github.com/Phala-Network/chainbridge-solidity/blob/5eef3073ccc75b48e06ce44eee522c2023da974e/contracts/Bridge.sol#L236) method. With burnable set, when a user transfers the asset from an EVM chain it is burned directly from their account, and it is minted to the recipient account when someone transfers it back to the EVM chain.

## The Lifecycle of a Cross-chain Transaction

<p>
<img src="/images/general/subbridge-lifecycle.png" style="background-color:white;" alt>
<figcaption align = "center">Lifecycle of a SubBridge Cross-chain Transaction</figcaption>
</p>

Taking the two bridge types, XCM and ChainBridge, as an example, the figure above describes the lifecycle of a transfer across three chains: assets move between parachains on the left and EVM chains on the right, passing through the Khala network in the middle.

When a cross-chain transfer is initiated from a parachain, after the local XCM instructions execute (such as burning a certain amount of assets from the sending account), it is wrapped into a cross-chain XCM message and sent from the parachain to the Khala network, where Khala's XCM-related modules process it. The transmission and processing of XCM cross-chain messages are handled by Polkadot's XCM-related modules. When the DepositAsset instruction is executed, SubBridge parses the payment address; if it points to an address on another EVM chain, the transaction is forwarded through the ChainBridge module.

Similarly, when a cross-chain transfer is initiated from an EVM chain, the ChainBridge Relayer forwards the message to SubBridge's ChainBridge module.
After the ChainBridge module performs a series of verifications on the transaction, it processes the asset with the same logic. If the receiving address resolves not to an address in the local Khala network but to an address on a parachain, SubBridge's XCM module is triggered to forward the transaction.

## Other

SubBridge's ChainBridge module is maintained and developed by the Phala team, which is responsible for running three relayers. A relayer constructs a captured origin-chain cross-chain transaction into a proposal, and the three relayers then vote on the proposal. Once the vote passes, the proposal is executed and the assets are deposited into the user's address. For more about ChainBridge, see their [website](https://github.com/ChainSafe/ChainBridge). For the Phala Ethereum ChainBridge contract, see the source code on [GitHub](https://github.com/Phala-Network/chainbridge-solidity/tree/phala-bridge).

## Code Auditing

The bridges currently integrated by SubBridge comprise two implementations, XCM and ChainBridge. XCM is a cross-chain message protocol implemented by Parity Tech in the Polkadot and Kusama networks. Its code has been audited by a third-party audit firm hired by Parity; the audit report on XCM v2 (the version of XCM currently used by SubBridge) can be found [here](https://blog.quarkslab.com/resources/2022-02-27-xcmv2-audit/21-12-908-REP.pdf).

Early last year, we deployed ChainBridge's Solidity contract on Ethereum. The contract info can be found [here](https://etherscan.io/address/0x8F92e7353b180937895E0C5937d616E8ea1A2Bb9). Recently we migrated the old contract to the newly deployed one at `0x8F92e7353b180937895E0C5937d616E8ea1A2Bb9`. The contract has now been running safely for nearly a year, and it has also been audited by CertiK, a third-party auditor hired by Phala.
The detailed audit report can be found [here](https://www.certik.com/projects/phalanetwork).

## Reference

- XCM format: [https://github.com/paritytech/xcm-format](https://github.com/paritytech/xcm-format)
- MultiAsset definition: [https://github.com/paritytech/polkadot/blob/master/xcm/src/v1/multiasset.rs](https://github.com/paritytech/polkadot/blob/master/xcm/src/v1/multiasset.rs)
- MultiLocation definition: [https://github.com/paritytech/polkadot/blob/master/xcm/src/v1/multilocation.rs](https://github.com/paritytech/polkadot/blob/master/xcm/src/v1/multilocation.rs)
- Phala ChainBridge Solidity contract: [https://github.com/Phala-Network/chainbridge-solidity/tree/phala-bridge](https://github.com/Phala-Network/chainbridge-solidity/tree/phala-bridge)
- Pallet-assets implementation: [https://github.com/paritytech/substrate/tree/master/frame/assets](https://github.com/paritytech/substrate/tree/master/frame/assets)
- Introduction to ChainBridge: [https://chainbridge.chainsafe.io/](https://chainbridge.chainsafe.io/)
- Introduction to CelerBridge: [https://cbridge-docs.celer.network/](https://cbridge-docs.celer.network/)
---
title: Create and select Kubernetes instance types (preview)
description: Create and select Azure Arc-enabled Kubernetes cluster instance types for Azure Machine Learning training and inferencing workloads.
titleSuffix: Azure Machine Learning
author: luisquintanilla
ms.author: luquinta
ms.service: machine-learning
ms.subservice: core
ms.date: 10/21/2021
ms.topic: how-to
ms.custom: ignite-fall-2021
---

# <a name="create-and-select-kubernetes-instance-types-preview"></a>Create and select Kubernetes instance types (preview)

Learn how to create and select Kubernetes instance types for Azure Machine Learning training and inferencing workloads on Azure Arc-enabled Kubernetes clusters.

## <a name="what-are-instance-types"></a>What are instance types?

Instance types are an Azure Machine Learning concept that allows targeting certain types of compute nodes for training and inferencing workloads. For an Azure VM, an example of an instance type is `STANDARD_D2_V3`.

In Kubernetes clusters, instance types are defined by two elements:

* [nodeSelector](https://kubernetes.io/docs/concepts/scheduling-eviction/assign-pod-node/#nodeselector): `nodeSelector` lets you specify which node a pod should run on. The node must have a matching label.
* [resources](https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/): the `resources` section defines the compute resources (CPU, memory, and NVIDIA GPU) for the pod.
## <a name="create-instance-types"></a>Create instance types

Instance types are represented in a custom resource definition (CRD) that is installed with the Azure Machine Learning extension.

### <a name="create-a-single-instance-type"></a>Create a single instance type

To create a new instance type, create a custom resource for the instance type CRD. For example, consider the CRD `my_instance_type.yaml`:

```yaml
apiVersion: amlarc.azureml.com/v1alpha1
kind: InstanceType
metadata:
  name: myinstancetypename
spec:
  nodeSelector:
    mylabel: mylabelvalue
  resources:
    limits:
      cpu: "1"
      nvidia.com/gpu: 1
      memory: "2Gi"
    requests:
      cpu: "700m"
      memory: "1500Mi"
```

Use the `kubectl apply` command to create the new instance type:

```bash
kubectl apply -f my_instance_type.yaml
```

This creates an instance type with the following properties:

- Pods are scheduled only on nodes with the label `mylabel: mylabelvalue`.
- Pods are assigned CPU requests of `700m` and memory requests of `1500Mi`.
- Pods are assigned CPU limits of `1`, memory limits of `2Gi`, and NVIDIA GPU limits of `1`.

> [!NOTE]
> When specifying your CRD, note the following conventions:
> - NVIDIA GPU resources are only specified in the `limits` section. For more information, see the [Kubernetes documentation](https://kubernetes.io/docs/tasks/manage-gpus/scheduling-gpus/#using-device-plugins).
> - CPU and memory resources are string values.
> - CPU can be specified in millicores (`100m`) or in whole numbers (`"1"`, which is equivalent to `1000m`).
> - Memory can be specified as a whole number plus a suffix (`1024Mi` for 1024 MiB).

### <a name="create-multiple-instance-types"></a>Create multiple instance types

You can also use the CRD to create multiple instance types at once.
For example, consider the CRD `my_instance_type.yaml`:

```yaml
apiVersion: amlarc.azureml.com/v1alpha1
kind: InstanceTypeList
items:
  - metadata:
      name: cpusmall
    spec:
      resources:
        limits:
          cpu: "100m"
          nvidia.com/gpu: 0
          memory: "10Mi"
        requests:
          cpu: "100m"
          memory: "1Gi"
  - metadata:
      name: defaultinstancetype
    spec:
      resources:
        limits:
          cpu: "1"
          nvidia.com/gpu: 0
          memory: "1Gi"
        requests:
          cpu: "1"
          memory: "1Gi"
```

Use the `kubectl apply` command to create the instance types:

```bash
kubectl apply -f my_instance_type.yaml
```

This creates two instance types, one named `defaultinstancetype` and the other `cpusmall`, with different resource specifications. For more information about default instance types, see the [default instance types](#default-instance-types) section of this document.

## <a name="default-instance-types"></a>Default instance types

When a training or inferencing workload is submitted without an instance type, it uses the default instance type. To specify a default instance type for a Kubernetes cluster, create an instance type named `defaultinstancetype` and define its `nodeSelector` and `resources` properties like any other instance type. It is automatically recognized as the default.

If no default instance type is defined, the following default behavior applies:

* No nodeSelector is applied, meaning the pod can be scheduled on any node.
* The workload's pods are assigned the default resources:

```yaml
resources:
  limits:
    cpu: "0.6"
    memory: "1536Mi"
  requests:
    cpu: "0.6"
    memory: "1536Mi"
```

> [!IMPORTANT]
> The default instance type does not appear as an InstanceType custom resource in the cluster, but it appears in all clients (Azure Machine Learning studio, Azure CLI, Python SDK).

## <a name="select-an-instance-type-for-training-workloads"></a>Select an instance type for training workloads

To select an instance type for a training job using Azure Machine Learning CLI 2.0, specify its name in the `compute` section of the YAML specification file:

```yaml
command: python -c "print('Hello world!')"
environment:
  docker:
    image: python
compute:
  target: azureml:<compute_target_name>
  instance_type: <instance_type_name>
```

In the example above, replace `<compute_target_name>` with the name of your Kubernetes compute target and `<instance_type_name>` with the name of the instance type you want to select.

> [!TIP]
> The default instance type uses few resources. To ensure that all machine learning workloads run properly with adequate resources, it is highly recommended that you create custom instance types.

## <a name="select-an-instance-type-for-inferencing-workloads"></a>Select an instance type for inferencing workloads

To select an instance type for inferencing workloads using Azure Machine Learning CLI 2.0, specify its name in the `deployments` section.
For example:

```yaml
type: online
auth_mode: key
target: azureml:<your compute target name>
traffic:
  blue: 100
deployments:
  - name: blue
    app_insights_enabled: true
    model:
      name: sklearn_mnist_model
      version: 1
      local_path: ./model/sklearn_mnist_model.pkl
    code_configuration:
      code:
        local_path: ./script/
      scoring_script: score.py
    instance_type: <instance_type_name>
    environment:
      name: sklearn-mnist-env
      version: 1
      path: .
      conda_file: file:./model/conda.yml
      docker:
        image: mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04:20210727.v1
```

In the example above, replace `<instance_type_name>` with the name of the instance type you want to select.

## <a name="next-steps"></a>Next steps

- [Configure machine learning with Azure Arc (preview)](how-to-attach-arc-kubernetes.md)
- [Train models (create jobs) with the CLI (v2)](how-to-train-cli.md)
- [Deploy and score a machine learning model by using an online endpoint (preview)](how-to-deploy-managed-online-endpoints.md)
---
description: This method gets the IDebugPortNotify2 interface for this port.
title: IDebugDefaultPort2::GetPortNotify | Microsoft Docs
ms.date: 11/04/2016
ms.topic: reference
f1_keywords:
- IDebugDefaultPort2::GetPortNotify
helpviewer_keywords:
- IDebugDefaultPort2::GetPortNotify
ms.assetid: 3ae715ee-9886-4694-a52b-59bb3b27467a
author: leslierichardson95
ms.author: lerich
manager: jmartens
ms.technology: vs-ide-debug
ms.workload:
- vssdk
dev_langs:
- CPP
- CSharp
---

# <a name="idebugdefaultport2getportnotify"></a>IDebugDefaultPort2::GetPortNotify

This method gets the [IDebugPortNotify2](../../../extensibility/debugger/reference/idebugportnotify2.md) interface for this port.

## <a name="syntax"></a>Syntax

```cpp
HRESULT GetPortNotify(
   IDebugPortNotify2** ppPortNotify
);
```

```csharp
int GetPortNotify(
   out IDebugPortNotify2 ppPortNotify
);
```

## <a name="parameters"></a>Parameters

`ppPortNotify`\
Returns an [IDebugPortNotify2](../../../extensibility/debugger/reference/idebugportnotify2.md) object.

## <a name="return-value"></a>Return value

If successful, returns `S_OK`; otherwise, returns an error code.

## <a name="remarks"></a>Remarks

Normally, the `QueryInterface` method is called on an object implementing the [IDebugPort2](../../../extensibility/debugger/reference/idebugport2.md) interface to obtain the [IDebugPortNotify2](../../../extensibility/debugger/reference/idebugportnotify2.md) interface. However, in some circumstances the desired interface is implemented on a different object. This method hides those circumstances and returns the `IDebugPortNotify2` interface from the most appropriate object.

## <a name="see-also"></a>See also

- [IDebugDefaultPort2](../../../extensibility/debugger/reference/idebugdefaultport2.md)
- [IDebugPortNotify2](../../../extensibility/debugger/reference/idebugportnotify2.md)
---
title: Calling Conventions, Parameters, and Return Type
ms.date: 11/04/2016
helpviewer_keywords:
- calling conventions, helper functions
- helper functions, calling conventions
- helper functions, return types
ms.assetid: 0ffa4558-6005-4803-be95-7a8ec8837660
---

# <a name="calling-conventions-parameters-and-return-type"></a>Calling Conventions, Parameters, and Return Type

The helper routine's prototype is:

```
FARPROC WINAPI __delayLoadHelper2(
   PCImgDelayDescr pidd,
   FARPROC * ppfnIATEntry
);
```

### <a name="parameters"></a>Parameters

*pidd*<br/>
A `const` pointer to an `ImgDelayDescr` (see delayimp.h) that contains the offsets of various import-related data, a timestamp for binding information, and a set of attributes that provide further information about the descriptor content. Currently there is only one attribute, `dlattrRva`, which indicates that the addresses in the descriptor are relative virtual addresses (as opposed to virtual addresses). For the definition of the `PCImgDelayDescr` structure, see [Structure and Constant Definitions](../../build/reference/structure-and-constant-definitions.md).

*ppfnIATEntry*<br/>
A pointer to a slot in the delay-loaded import address table (IAT) that is updated with the address of the imported function. The helper routine must store the same value that it returns into this location.

## <a name="expected-return-values"></a>Expected return values

If the function succeeds, it returns the address of the imported function.

If the function fails, it raises an exception and returns 0. Three types of exceptions can be raised:

- Invalid parameter, which occurs if the attributes in `pidd` are not specified correctly.
- `LoadLibrary` failed on the specified DLL.
- `GetProcAddress` failed.

It is your responsibility to handle these exceptions.

## <a name="remarks"></a>Remarks

The calling convention of the helper function is `__stdcall`. The type of the return value is not relevant, so FARPROC is used. This function has C linkage.

The return value of the delay-load helper must be stored in the passed-in function pointer location, unless you want your helper routine to be used as a notification hook. In that case, your code is responsible for finding the appropriate function pointer to return. The thunk code that the linker generates then takes that return value as the real target of the import and jumps directly to it.

## <a name="sample"></a>Sample

The following code shows how to implement a simple hook function:

```C
FARPROC WINAPI delayHook(unsigned dliNotify, PDelayLoadInfo pdli)
{
    switch (dliNotify) {
        case dliStartProcessing :

            // If you want to return control to the helper, return 0.
            // Otherwise, return a pointer to a FARPROC helper function
            // that will be used instead, thereby bypassing the rest
            // of the helper.

            break;

        case dliNotePreLoadLibrary :

            // If you want to return control to the helper, return 0.
            // Otherwise, return your own HMODULE to be used by the
            // helper instead of having it call LoadLibrary itself.

            break;

        case dliNotePreGetProcAddress :

            // If you want to return control to the helper, return 0.
            // If you choose you may supply your own FARPROC function
            // address and bypass the helper's call to GetProcAddress.

            break;

        case dliFailLoadLib :

            // LoadLibrary failed.
            // If you don't want to handle this failure yourself, return 0.
            // In this case the helper will raise an exception
            // (ERROR_MOD_NOT_FOUND) and exit.
            // If you want to handle the failure by loading an alternate
            // DLL (for example), then return the HMODULE for
            // the alternate DLL. The helper will continue execution with
            // this alternate DLL and attempt to find the
            // requested entrypoint via GetProcAddress.

            break;

        case dliFailGetProc :

            // GetProcAddress failed.
            // If you don't want to handle this failure yourself, return 0.
            // In this case the helper will raise an exception
            // (ERROR_PROC_NOT_FOUND) and exit.
            // If you choose you may handle the failure by returning
            // an alternate FARPROC function address.

            break;

        case dliNoteEndProcessing :

            // This notification is called after all processing is done.
            // There is no opportunity for modifying the helper's behavior
            // at this point except by longjmp()/throw()/RaiseException.
            // No return value is processed.

            break;

        default :

            return NULL;
    }

    return NULL;
}

/*
and then at global scope somewhere:

PfnDliHook __pfnDliNotifyHook2 = delayHook;
*/
```

## <a name="see-also"></a>See also

[Understanding the Helper Function](../../build/reference/understanding-the-helper-function.md)
30.578571
187
0.655221
kor_Hang
0.9985
aaeb2508ccdf92a5b2a98adf8ef5a70d0f54765e
1,177
md
Markdown
website/docs/cordova/ads/native.md
yigit-serin/admob-plus
4ce105c632d9ebdb8a8b6b5517c2422136b2a0a7
[ "MIT" ]
null
null
null
website/docs/cordova/ads/native.md
yigit-serin/admob-plus
4ce105c632d9ebdb8a8b6b5517c2422136b2a0a7
[ "MIT" ]
null
null
null
website/docs/cordova/ads/native.md
yigit-serin/admob-plus
4ce105c632d9ebdb8a8b6b5517c2422136b2a0a7
[ "MIT" ]
null
null
null
--- title: Native Ad sidebar_label: Native --- Native ads are ad assets that are presented to users via UI components that are native to the platform. In addition to installing `admob-plus-cordova`, you will need to install `admob-plus-cordova-native` for displaying native ads. ## Installation ```sh-session cordova plugin add admob-plus-cordova-native --save ``` ## Usage ```js document.addEventListener('deviceready', async () => { const ad = new admob.NativeAd({ adUnitId: 'ca-app-pub-xxx/yyy', }) await ad.load() await ad.show({ x: 0, y: 50, width: window.screen.width, height: 300, }) setTimeout(() => { ad.hide() }, 5000) }) document.addEventListener('admob.ad.load', async (evt) => { if (evt.ad instanceof admob.NativeAd) { // handle event here } }) ``` ## Events ### `admob.ad.load` ### `admob.ad.loadfail` ### `admob.ad.show` ### `admob.ad.showfail` ### `admob.ad.dismiss` ### `admob.ad.impression` ## References - [Native Ads - Mobile Ads SDK (Android)](https://developers.google.com/admob/android/native/start) - [Native Ads - Mobile Ads SDK (iOS)](https://developers.google.com/admob/ios/native/start)
19.295082
126
0.661852
eng_Latn
0.562171
aaeb5fd0060fa48871327f67d10bee61dc222d5a
1,966
md
Markdown
packages/format-message-generate-id/README.md
ashokkbhaskar/format-message
aee83ff712d3f2d7941efc838e71ad8e30bbab8f
[ "MIT" ]
null
null
null
packages/format-message-generate-id/README.md
ashokkbhaskar/format-message
aee83ff712d3f2d7941efc838e71ad8e30bbab8f
[ "MIT" ]
null
null
null
packages/format-message-generate-id/README.md
ashokkbhaskar/format-message
aee83ff712d3f2d7941efc838e71ad8e30bbab8f
[ "MIT" ]
2
2019-03-24T10:59:11.000Z
2021-05-14T01:34:37.000Z
# ![format-message-generate-id][logo] > Generate a message id from the default message pattern [![npm Version][npm-image]][npm] [![JS Standard Style][style-image]][style] [![MIT License][license-image]][LICENSE] A small collection of helper functions for use in format-message, to generate a message id based on the default message pattern. Quick Examples -------------- `npm install format-message-generate-id --save` ```js var formatMessage = require('format-message'); formatMessage.setup({ generateId: require('format-message-generate-id/underscored_crc32') }); ``` ```js import formatMessage from 'format-message' import generate from 'format-message-generate-id' formatMessage.setup({ generateId: generate.normalized }) ``` API --- ### `literal(pattern)` Simply returns the pattern passed in. ### `normalized(pattern)` Normalizes insignificant whitespace within ICU placeholder syntax. This requires parsing and pretty-printing the message pattern, and an invalid message will cause an error to be thrown. ### `underscored(pattern)` After normalizing the message pattern, a slug is generated with underscores replacing symbols and whitespace. ### `underscored_crc32(pattern)` In addition to generating a slug, a crc32 checksum is calculated from the normalized pattern and appended to the result. License ------- This software is free to use under the MIT license. See the [LICENSE-MIT file][LICENSE] for license text and copyright information. [logo]: https://cdn.rawgit.com/format-message/format-message/2febdd8/logo.svg [npm]: https://www.npmjs.org/package/format-message-generate-id [npm-image]: https://img.shields.io/npm/v/format-message-generate-id.svg [style]: https://github.com/feross/standard [style-image]: https://img.shields.io/badge/code%20style-standard-brightgreen.svg [license-image]: https://img.shields.io/npm/l/format-message.svg [LICENSE]: https://github.com/format-message/format-message/blob/master/LICENSE-MIT
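For intuition, the `underscored(pattern)` behavior described above can be approximated in a few lines of JavaScript. This is an illustrative simplification only — not the library's actual implementation — and the exact symbol handling and trimming rules are assumptions:

```javascript
// Illustrative sketch: lowercase the pattern and collapse runs of
// whitespace/symbols into underscores, roughly like the `underscored` helper.
function underscoredSketch(pattern) {
  return pattern
    .trim()
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '_')  // symbols and whitespace become underscores
    .replace(/^_+|_+$/g, '')      // trim leading/trailing underscores
}

console.log(underscoredSketch('Hello, { name }!')) // → hello_name
```

The real `underscored` generator normalizes the ICU pattern first, and `underscored_crc32` additionally appends a crc32 checksum of the normalized pattern to keep ids unique.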
30.71875
186
0.75941
eng_Latn
0.769882
aaeb72d6563589864f5d6fa0478a7235eb0d7cf8
4,497
md
Markdown
articles/marketplace/marketplace-solution-templates.md
fuatrihtim/azure-docs.tr-tr
6569c5eb54bdab7488b44498dc4dad397d32f1be
[ "CC-BY-4.0", "MIT" ]
16
2017-08-28T08:29:36.000Z
2022-01-02T16:46:30.000Z
articles/marketplace/marketplace-solution-templates.md
fuatrihtim/azure-docs.tr-tr
6569c5eb54bdab7488b44498dc4dad397d32f1be
[ "CC-BY-4.0", "MIT" ]
470
2017-11-11T20:59:16.000Z
2021-04-10T17:06:28.000Z
articles/marketplace/marketplace-solution-templates.md
fuatrihtim/azure-docs.tr-tr
6569c5eb54bdab7488b44498dc4dad397d32f1be
[ "CC-BY-4.0", "MIT" ]
25
2017-11-11T19:39:08.000Z
2022-03-30T13:47:56.000Z
--- title: Azure uygulamaları için Yayımlama Kılavuzu çözüm şablonu teklifleri-Azure Marketi description: Bu makalede, Azure Marketi 'nde çözüm şablonlarının yayımlanması için gerekenler açıklanmaktadır. ms.service: marketplace ms.subservice: partnercenter-marketplace-publisher ms.topic: conceptual author: msjogarrig ms.author: jogarrig ms.date: 04/22/2020 ms.openlocfilehash: c7074981c8491460d6f2a8e7d40d086f261dfeb3 ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5 ms.translationtype: MT ms.contentlocale: tr-TR ms.lasthandoff: 03/29/2021 ms.locfileid: "98879352" --- # <a name="publishing-guide-for-azure-applications-solution-template-offers"></a>Azure uygulamaları çözüm şablonu teklifleri için Yayımlama Kılavuzu Bu makalede, Azure Market 'te Azure Uygulama tekliflerini yayımlamanın bir yolu olan çözüm şablonu tekliflerini yayımlama gereksinimleri açıklanmaktadır. Çözüm altyapınızı otomatik olarak dağıtmak için çözüm şablonu teklif türü bir [Azure Resource Manager şablonu (ARM şablonu)](../azure-resource-manager/templates/overview.md) gerektirir. Aşağıdaki koşullarda Azure uygulama *çözümü şablonu* teklif türünü kullanın: - Çözümünüz, VM 'lerin, ağın ve depolama kaynaklarının birleşimi gibi tek bir sanal makinenin (VM) ötesinde ek dağıtım ve yapılandırma Otomasyonu gerektirir. - Müşterilerinizin çözümü kendileri yöneteceklerdir. Müşterinin bu teklif türü için gördüğü listeleme seçeneği *Şimdi alın*. ## <a name="requirements-for-solution-template-offers"></a>Çözüm şablonu teklifleri için gereksinimler | **Gereksinimler** | **Ayrıntılar** | | --------------- | ----------- | |Faturalandırma ve ölçüm | Çözüm şablonu teklifleri işlem teklifleri değildir, ancak Microsoft ticari Marketi aracılığıyla faturalandırılan ücretli VM tekliflerini dağıtmak için kullanılabilirler. Çözümün ARM şablonunun dağıttığı kaynaklar müşterinin Azure aboneliğinde ayarlanır. 
Kullandıkça Öde sanal makineleri, müşteri ile Microsoft aracılığıyla gerçekleştirilir ve müşterinin Azure aboneliği aracılığıyla faturalandırılır.<br/> Kendi lisansını getir (KLG) faturanızı, Microsoft 'un müşteri aboneliğinde tahakkuk eden altyapı maliyetleri olmasına karşın, yazılım lisans ücretlerinizi müşteriyle doğrudan Transact. | |Azure ile uyumlu sanal sabit disk (VHD) | VM 'Ler Windows veya Linux üzerinde oluşturulmalıdır. Daha fazla bilgi için bkz. <ul> <li>[Bir Azure Uygulama teklifi oluşturun](./create-new-azure-apps-offer.md) (Windows VHD 'ler için).</li><li>[Azure 'da desteklenen Linux dağıtımları](../virtual-machines/linux/endorsed-distros.md) (Linux VHD 'ler için).</li></ul> | | Müşteri kullanımı ilişkilendirmesi | Azure Market 'te yayımlanan tüm çözüm şablonlarında müşteri kullanım attributıon özelliğinin etkinleştirilmesi gerekir. Müşteri kullanımı atımı ve nasıl etkinleştirileceği hakkında daha fazla bilgi için bkz. [Azure iş ortağı müşteri kullanımı atısyonu](./azure-partner-customer-usage-attribution.md). | | Yönetilen diskleri kullanma | [Yönetilen diskler](../virtual-machines/managed-disks-overview.md) , Azure 'da hizmet olarak altyapı (IaaS) VM 'lerinin kalıcı diskleri için varsayılan seçenektir. Çözüm şablonlarında yönetilen diskleri kullanmanız gerekir. 
<ul><li>Çözüm şablonlarınızı güncelleştirmek için [Azure Resource Manager şablonlarda yönetilen diskleri kullanma](../virtual-machines/using-managed-disks-template-deployments.md)bölümündeki yönergeleri izleyin ve sağlanan [örnekleri](https://github.com/Azure/azure-quickstart-templates)kullanın.<br><br> </li><li>VHD 'YI Azure Marketi 'nde bir görüntü olarak yayımlamak için, aşağıdaki yöntemlerden birini kullanarak yönetilen disklerin temel VHD 'sini bir depolama hesabına aktarın:<ul><li>[Azure PowerShell](/previous-versions/azure/virtual-machines/scripts/virtual-machines-powershell-sample-copy-managed-disks-vhd) </li> <li> [Azure CLI](/previous-versions/azure/virtual-machines/scripts/virtual-machines-cli-sample-copy-managed-disks-vhd) </li> </ul></ul> | ## <a name="next-steps"></a>Sonraki adımlar Henüz yapmadıysanız, [Azure Marketi ile bulut işletmenizi nasıl büyütireceğinizi](https://azuremarketplace.microsoft.com/sell)öğrenin. Iş Ortağı Merkezi 'nde çalışmaya kaydolmak ve başlamak için: - Teklifinizi oluşturmak veya tamamlayabilmeniz için [Iş Ortağı Merkezi ' nde oturum açın](https://partner.microsoft.com/dashboard/account/v3/enrollment/introduction/partnership) . - Daha fazla bilgi için bkz. [Azure Uygulama teklifi oluşturma](./create-new-azure-apps-offer.md) .
102.204545
1,019
0.808984
tur_Latn
0.999195
aaec3ee2177bede781a924fab11087b54dcf1c85
4,067
md
Markdown
README.md
ccarlborg/object-serializer
5e4b4b8bceba26ef43bf139e9edd03181e9ecb0f
[ "MIT" ]
null
null
null
README.md
ccarlborg/object-serializer
5e4b4b8bceba26ef43bf139e9edd03181e9ecb0f
[ "MIT" ]
null
null
null
README.md
ccarlborg/object-serializer
5e4b4b8bceba26ef43bf139e9edd03181e9ecb0f
[ "MIT" ]
null
null
null
# object-serializer Serialize Java objects into strings using pluggable strategies for serialization, compression, encryption and encoding. Straightforward strategy interfaces make it easy to implement custom strategies for your needs. Compatible with Java 8 and newer. ## Serialization and deserialization flow ![serialize-deserialize-flow.png](/doc/serialize-deserialize-flow.png) ## Example code ### Example 1: Default serializer using base64 encoding and no compression or encryption ```java ObjectSerializer serializer = new ObjectSerializer(); HashMap<String, String> map = new HashMap<>(); map.put("first_item", "Serialize"); map.put("second_item", "me!"); String serialized = serializer.serialize(map); // Do whatever you want with the serialized object at this point. // Save it to a file, store it in a browser cookie or whatever. HashMap<String, String> deserialized = (HashMap<String, String>)serializer.deserialize(serialized); ``` ### Example 2: Gzip compression and AES-GCM encryption ```java // Random bytes for the encryption key. A 256 bit block size requires a 32 bytes long encryption key. byte[] encryptionKey256Bits = Base64.getDecoder().decode("hMICYUBp47LxliF9nzmPrxwCTGk1PNDi18IsA78MyLrw="); ObjectSerializer serializer = new ObjectSerializer( new JavaSerializeStrategy(), new GzipCompressStrategy(), new AesGcmEncryptStrategy(encryptionKey256Bits), new Base64EncodeStrategy() ); HashMap<String, String> map = new HashMap<>(); map.put("first_item", "Serialize"); map.put("second_item", "me!"); String serialized = serializer.serialize(map); // Do whatever you want with the serialized object at this point. // Save it to a file, store it in a browser cookie or whatever. HashMap<String, String> deserialized = (HashMap<String, String>)serializer.deserialize(serialized); ``` ### Further examples Take a look at the test cases under [src/test/java](/src/test/java) for further examples. 
## Built-in strategies The core library provides a set of strategies that only depend on the Java standard library. ### Serialize | Name | Description | |------|-------------| | [JavaSerializeStrategy](/src/main/java/info/carlborg/serializer/serialize/JavaSerializeStrategy.java) | Serialize the object into a byte array using the standard [Java object serializer](https://docs.oracle.com/javase/8/docs/api/java/io/ObjectOutputStream.html). | ### Compress | Name | Description | |------|-------------| | [GzipCompressStrategy](/src/main/java/info/carlborg/serializer/compress/GzipCompressStrategy.java) | Compress the serialized byte array using the gzip compression algorithm. | | [PassthroughCompressStrategy](/src/main/java/info/carlborg/serializer/compress/PassthroughCompressStrategy.java) | Apply no compression. | ### Encrypt | Name | Description | |----------------------------|-------------| | [AesGcmEncryptStrategy](/src/main/java/info/carlborg/serializer/encrypt/AesGcmEncryptStrategy.java) | Encrypt the compressed byte array using the AES-GCM cipher. Supports 128, 192 or 256 bit block size. | | [PassthroughEncryptStrategy](/src/main/java/info/carlborg/serializer/encrypt/PassthroughEncryptStrategy.java) | Apply no encryption. | ### Encode | Name | Description | |------|-------------| | [Base64EncodeStrategy](/src/main/java/info/carlborg/serializer/encode/Base64EncodeStrategy.java) | Encode the encrypted byte array into a base64 encoded string. | ## Other available strategies | Name | Description | |----------------------|-------------| | [FstSerializeStrategy](https://github.com/ccarlborg/object-serializer-fst) | Serialize the object into a byte array using the fast and efficient [FST object serializer](https://github.com/RuedigerMoeller/fast-serialization). Recommended. | ## Implementing your own strategy Implement the respective strategy interface and pass the new strategy as an argument to the ObjectSerializer constructor. 
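As a sketch of what a custom encode strategy could look like — note that the interface shape below (`encode`/`decode` method names) is a guess for illustration only; check the real strategy interface in the source tree for the actual signatures:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical interface shape for illustration only; the library's real
// encode-strategy interface may declare different method names or signatures.
interface EncodeStrategySketch {
    String encode(byte[] data);
    byte[] decode(String text);
}

// A URL-safe variant of the built-in Base64EncodeStrategy idea, useful when
// the serialized string must survive being placed in a URL or cookie.
public class UrlSafeBase64EncodeStrategy implements EncodeStrategySketch {
    @Override
    public String encode(byte[] data) {
        return Base64.getUrlEncoder().withoutPadding().encodeToString(data);
    }

    @Override
    public byte[] decode(String text) {
        return Base64.getUrlDecoder().decode(text);
    }

    public static void main(String[] args) {
        EncodeStrategySketch strategy = new UrlSafeBase64EncodeStrategy();
        String encoded = strategy.encode("Serialize me!".getBytes(StandardCharsets.UTF_8));
        byte[] roundTrip = strategy.decode(encoded);
        System.out.println(encoded + " -> " + new String(roundTrip, StandardCharsets.UTF_8));
    }
}
```

The round trip (encode, then decode) must always return the original bytes, which is the one property any custom strategy pair has to guarantee.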
## Building from source Run `mvn test` to run the test suite Run `mvn package` to generate a jarfile
38.367925
267
0.739857
eng_Latn
0.737967
aaec62670a302bbc06573b1dadf0709f401847a3
12,103
md
Markdown
Account-Server.md
amicuchu/NintendoClientsWiki
036f5e73694f5dba3801907a5603f27dad32b4e3
[ "MIT" ]
null
null
null
Account-Server.md
amicuchu/NintendoClientsWiki
036f5e73694f5dba3801907a5603f27dad32b4e3
[ "MIT" ]
null
null
null
Account-Server.md
amicuchu/NintendoClientsWiki
036f5e73694f5dba3801907a5603f27dad32b4e3
[ "MIT" ]
null
null
null
[Server List](Server-List.md) > Account Server (Wii U) --- The account server always responds with an XML body. GET requests are form-encoded. POST and PUT requests have an XML body. The XML-encoding is not consistent between URLs. Usually, the response contains no XML declaration and no whitespace. However, sometimes the response starts with `<?xml version="1.0" encoding="UTF-8" standalone="yes"?>`, and sometimes the response is prettified using 4 spaces as indentation. The common client certificate is needed to connect to these servers. Main server: https://account.nintendo.net Other servers: * https://game-dev.account.nintendo.net * https://system-dev.account.nintendo.net * https://library-dev.account.nintendo.net * https://staging.account.nintendo.net ## Methods The following methods can be used without authorization: | Method | URL | | --- | --- | | GET | <code><a href="#get-v1apiadmintime">/v1/api/admin/time</a></code> | | GET | <code><a href="#get-v1apiadminmapped_ids">/v1/api/admin/mapped_ids</a></code> | | GET | `/v1/api/content/agreements/Nintendo-Network-EULA/<country>/@latest` | | GET | `/v1/api/content/time_zones/<country>/<language>` | | DELETE | `/v1/api/devices/@current` | | PUT | `/v1/api/devices/@current/inactivate` | | POST | `/v1/api/devices/@current/migrations` | | DELETE | `/v1/api/devices/@current/migrations` | | POST | `/v1/api/devices/@current/migrations/commit` | | GET | `/v1/api/devices/@current/status` | | GET | `/v1/api/miis` | | POST | <code><a href="#post-v1apioauth20access_tokengenerate">/v1/api/oauth20/access_token/generate</a></code> | | POST | `/v1/api/people` | | GET | `/v1/api/people/<nnid>` | | POST | `/v1/api/support/coppa/authorization` | | PUT | `/v1/api/support/email_confirmation/<%lu>/<%06u>` | | GET | `/v1/api/support/forgotten_password/<%lu>` | | POST | `/v1/api/support/parental_approval` | | GET | `/v1/api/support/parental_approval/send_email/coppa_code` | | GET | `/v1/api/support/resend_confirmation` | | GET | 
`/v1/api/support/send_confirmation/pin` | | GET | `/v1/api/support/send_forgotten/pin` | | POST | `/v1/support/validate/email` | The following methods access your account data and require a `Bearer` authorization token. This token can be retrieved with <code><a href="#post-v1apioauth20access_tokengenerate">/v1/api/oauth20/access_token/generate</a></code>. | Method | URL | | --- | --- | | PUT | `/v1/api/people/@me` | | DELETE | `/v1/api/people/@me` | | POST | `/v1/api/people/@me/deletion` | | GET | `/v1/api/people/@me/devices` | | DELETE | `/v1/api/people/@me/devices/@current` | | POST | `/v1/api/people/@me/devices/@current/attributes` | | PUT | `/v1/api/people/@me/devices/@current/inactive` | | GET | `/v1/api/people/@me/emails` | | PUT | `/v1/api/people/@me/miis/@primary` | | GET | `/v1/api/people/@me/profile` | | GET | <code><a href="#get-v1apiprovidernex_tokenme">/v1/api/provider/nex_token/@me</a></code> | | GET | `/v1/api/provider/service_token/@me` | The following methods require a `HashedBasic` authorization token. | Method | URL | | --- | --- | | POST | `/v1/api/people/@me/agreements` | | POST | `/v1/api/people/@me/devices` | | GET | `/v1/api/people/@me/devices/owner` | ## Headers The following headers are included in requests by the Wii U: | Field | Description | | --- | --- | | X-Nintendo-Platform-ID | Always 1 | | X-Nintendo-Device-Type | 1=Debug, 2=Retail | | X-Nintendo-Device-ID | Unsigned integer (MCP_GetDeviceId) | | X-Nintendo-Serial-Number | String | | X-Nintendo-System-Version | Version of `version.bin` title (`%04X`) | | X-Nintendo-Region | 1=JPN, 2=USA, 4=EUR, 8=AUS, 16=CHN, 32=KOR, 64=TWN | | X-Nintendo-Country | JP for Japan, DE for Germany, etc. | | X-Nintendo-Client-ID | a2efa818a34fa16b8afbc8a74eba3eda | | X-Nintendo-Client-Secret | c91cdb5658bd4954ade78533a339cf9a | | X-Nintendo-FPD-Version | Always "0000"? 
| | X-Nintendo-Environment | Lx/Dx/Sx/Tx/Jx (default: L1) | | X-Nintendo-Title-ID | Example: 0005000010138300 | | X-Nintendo-Unique-ID | Part of title id, example: 01383 | | X-Nintendo-Application-Version | Title version | | X-Nintendo-Device-Cert | [Device certificate](#device-certificate) | The server replies with the following headers, in addition to `Content-Type` (if applicable), `Content-Length` and `Date`: | Header | Description | | --- | --- | | X-Nintendo-Date | Server timestamp (in milliseconds) | | Server | `Nintendo 3DS (http)` | ### GET /v1/api/admin/time This request does not take an parameters. The response body is empty, and no `Content-Type` header is returned by the server. The server time can be retrieved from the `X-Nintendo-Date` header. ### GET /v1/api/admin/mapped_ids Converts between PID and NNID. PIDs starts at 1799999999 and decrement on every new account. | Param | Description | | --- | --- | | input_type | "pid" or "user_id" | | output_type | "pid" or "user_id" | | input | comma-separated list | Example response: ``` <mapped_ids> <mapped_id> <in_id>Kinnay-WiiU</in_id> <out_id>1798037410</out_id> </mapped_id> </mapped_ids> ``` ### POST /v1/api/oauth20/access_token/generate This method generates an access token. The access token can be included in the "Authorization" header, and is needed for all requests that require access to your account data. There are two ways to request an access token: using NNID and password or password hash, or using a refresh token from an earlier access token request. 
The password hash can be calculated using the following method: ``` data = struct.pack("<I", pid) + b"\x02\x65\x43\x46" + password.encode("ascii") hash = hashlib.sha256(data).digest() ``` | Field | Description | | --- | --- | | grant_type | "password" | | user_id | NNID | | password | Password or password hash | | password_type (optional) | "hash" | | Field | Description | | --- | --- | | grant_type | "refresh_token" | | refresh_token | Refresh token | Example response: ``` <OAuth20> <access_token> <token>...</token> <refresh_token>...</refresh_token> <expires_in>3600</expires_in> </access_token> </OAuth20> ``` ### GET /v1/api/provider/nex_token/@me Provides login information and location of game server. | Param | Description | | --- | --- | | game_server_id | Game server id | Example response: ``` <nex_token> <host>34.208.166.202</host> <nex_password>...</nex_password> <pid>1798037410</pid> <port>43220</port> <token>...</token> </nex_token> ``` ## Errors Here's an example error response: ``` <errors> <error> <cause>client_id</cause> <code>0004</code> <message>API application invalid or incorrect application credentials</message> </error> </errors> ``` Sometimes, the cause tag is closed immediately (`<cause/>`). Sometimes, it is omitted entirely. 
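The password-hash recipe shown earlier in this section can be checked with a small self-contained script. The PID and password below are arbitrary example values, not real credentials:

```python
import hashlib
import struct

def nnid_password_hash(pid: int, password: str) -> bytes:
    # Little-endian PID, a fixed 4-byte constant, then the ASCII password
    data = struct.pack("<I", pid) + b"\x02\x65\x43\x46" + password.encode("ascii")
    return hashlib.sha256(data).digest()

# Example values only
digest = nnid_password_hash(1798037410, "example-password")
print(digest.hex())
```

The digest is deterministic for a given PID/password pair, which is why it can be sent in place of the plaintext password when `password_type` is set to `"hash"`.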
### Known Errors | Code | Cause | Message | | --- | --- | --- | | 0002 | | deviceId format is invalid | | 0002 | | serialNumber format is invalid | | 0002 | | platformId format is invalid | | 0002 | | version format is invalid | | 0002 | user_id | user_id format is invalid | | 0002 | password | password format is invalid | | 0002 | X-Nintendo-Region | X-Nintendo-Region format is invalid | | 0004 | grant_type | Invalid Grant Type | | 0004 | client_id | API application invalid or incorrect application credentials | | 0005 | access_token | Invalid access token | | 0007 | Forbidden request | | 0008 | | Not Found | | 0100 | | Account ID already exists | | 0103 | email | Email format is invalid | | 0106 | | Invalid account ID or password | | 0107 | | Account country and device country do not match | | 0110 | | Unlinked device | | 0113 | | Unauthorized device | | 0113 | device_id | Unauthorized device | | 0118 | | Unique ID and Game Server ID are not linked | | 0123 | | Service has expired | | 0124 | | Application version is older than usable version registered | | 1017 | | The requested game environment wasn't found for the given game server. | | 1022 | client_id | The requested client was not found. | | 1033 | | Excessive forgot password e-mail attempt | | 1126 | | The domain "..." is not accessible. 
| | 1600 | Bad Request | Unable to process request | | 1600 | Unsupported Media Type | Unable to process request | | Code | Description | | --- | --- | | 1 | BAD_FORMAT_PARAMETER | | 2 | BAD_FORMAT_REQUEST | | 3 | REQUEST_PARAMETER_MISSING | | 4 | UNAUTHORIZED_CLIENT | | 5 | INVALID_ACCOUNT_TOKEN | | 6 | ACCOUNT_TOKEN_EXPIRED | | 7 | REQUEST_FORBIDDEN | | 8 | REQUEST_NOT_FOUND | | 9 | WRONG_HTTP_METHOD | | 10 | INVALID_PLATFORM_ID | | 11 | SYSTEM_UPDATE_REQUIRED | | 12 | BANNED_DEVICE_ALL | | 100 | ACCOUNT_ID_ALREADY_EXISTS | | 101 | ACCOUNT_ID_NOT_ACCEPTABLE | | 103 | MAIL_ADDRESS_NOT_ACCEPTABLE | | 104 | UNAUTHORIZED_DEVICE | | 105 | REACHED_REGISTRATION_LIMIT | | 106 | WRONG_ACCOUNT_PASSWORD | | 107 | COUNTRY_MISMATCH | | 108 | BANNED_ACCOUNT_ALL | | 110 | DEVICE_MISMATCH | | 111 | ACCOUNT_ID_CHANGED | | 112 | ACCOUNT_ALREADY_DELETED | | 114 | COPPA_NOT_ACCEPTED | | 115 | REACHED_ASSOCIATION_LIMIT | | 116 | WRONG_CONFIRMATION_CODE | | 117 | CONFIRMATION_CODE_EXPIRED | | 118 | GAME_SERVER_ID_UNIQUE_ID_NOT_LINKED | | 119 | BANNED_ACCOUNT_IN_APPLICATION | | 120 | BANNED_DEVICE_IN_APPLICATION | | 121 | BANNED_ACCOUNT_IN_NEX_SERVICE | | 122 | BANNED_DEVICE_IN_NEX_SERVICE | | 123 | SERVICE_CLOSED | | 124 | APPLICATION_UPDATE_REQUIRED | | 125 | CLIENT_ID_UNIQUE_ID_NOT_LINKED | | 126 | BANNED_ACCOUNT_IN_INDEPENDENT_SERVICE | | 127 | BANNED_DEVICE_IN_INDEPENDENT_SERVICE | | 128 | MAIL_ADDRESS_NOT_VALIDATED | | 129 | WRONG_BIRTH_DATE_OR_MAIL_ADDRESS | | 130 | PID_NOT_FOUND | | 131 | WRONG_ACCOUNT_MAIL | | 132 | BANNED_ACCOUNT_ALL_TEMPORARILY | | 134 | BANNED_ACCOUNT_IN_APPLICATION_TEMPORARILY | | 136 | BANNED_ACCOUNT_IN_NEX_SERVICE_TEMPORARILY | | 137 | BANNED_DEVICE_IN_NEX_SERVICE_TEMPORARILY | | 138 | BANNED_ACCOUNT_IN_INDEPENDENT_SERVICE_TEMPORARILY | | 139 | BANNED_DEVICE_IN_INDEPENDENT_SERVICE_TEMPORARILY | | 142 | COPPA_AGREEMENT_CANCELED | | 143 | DEVICE_INACTIVE | | 1004 | EULA_NOT_ACCEPTED | | 1006 | INVALID_UNIQUE_ID | | 1016 | NEX_ACCOUNT_NOT_FOUND | | 1017 | 
GAME_SERVER_ID_ENVIRONMENT_NOT_FOUND | | 1018 | GENERATE_TOKEN_FAILURE | | 1019 | INVALID_NEX_CLIENT_ID | | 1020 | INVALID_CLIENT_KEY | | 1021 | INVALID_GAME_SERVER_ID | | 1022 | INVALID_CLIENT_ID | | 1023 | WRONG_MAIL_ADDRESS | | 1024 | MASTER_PIN_NOT_FOUND | | 1025 | MAIL_TEXT_NOT_FOUND | | 1031 | SEND_MAIL_FAILURE | | 1032 | DOMAIN_ACCOUNT_ALREADY_EXISTS | | 1033 | EXCESSIVE_MAIL_SEND_REQUEST | | 1035 | CREDIT_CARD_GENERAL_FAILURE | | 1036 | CREDIT_CARD_DATE_EXPIRED | | 1037 | CREDIT_CARD_DECLINED | | 1038 | INVALID_CREDIT_CARD_NUMBER | | 1039 | CREDIT_CARD_NUMBER_WRONG | | 1040 | INVALID_CREDIT_CARD_DATE | | 1041 | CREDIT_CARD_BLACKLISTED | | 1042 | INVALID_CREDIT_CARD_PIN | | 1043 | CREDIT_CARD_PIN_WRONG | | 1044 | INVALID_LOCATION | | 1045 | INVALID_POSTAL_CODE | | 1046 | DEVICE_EULA_COUNTRY_MISMATCH | | 1100 | INVALID_EULA_COUNTRY | | 1101 | INVALID_EULA_COUNTRY_AND_VERSION | | 1103 | PARENTAL_CONTROLS_REQUIRED | | 1104 | ACCOUNT_ID_FORMAT_INVALID | | 1105 | WRONG_ACCOUNT_PASSWORD_OR_MAIL_ADDRESS | | 1106 | AUTHENTICATION_LOCKED | | 1107 | ACCOUNT_ID_PASSWORD_SAME | | 1111 | APPROVAL_ID_NOT_FOUND | | 1115 | PENDING_MIGRATION | | 1125 | MAIL_ADDRESS_DOMAIN_NAME_NOT_ACCEPTABLE | | 1126 | MAIL_ADDRESS_DOMAIN_NAME_NOT_RESOLVED | | 1200 | NOT_PROVIDED_COUNTRY | | 2001 | INTERNAL_SERVER_ERROR | | 2002 | UNDER_MAINTENANCE | | 2999 | NINTENDO_NETWORK_CLOSED | ## Device Certificate This header is only sent in the requests for: * `/v1/api/oauth20/access_token/generate` * `/v1/api/people` * `/v1/api/people/@me/agreements` * `/v1/api/people/@me/devices` * `/v1/api/people/@me/devices/owner` The device certificate consists of 384 base64-encoded bytes: | Offset | Size | Description | | --- | --- | --- | | 0x0 | 0x4 | Signature type (debug: `0x00010002`, retail: `0x00010005`) | | 0x4 | 0x3C | Signature | | 0x40 | 0x40 | Padding | | 0x80 | 0x40 | `Root-CA<%08X>-MS<%08X>` | | 0xC0 | 0x4 | Key type (always 2) | | 0xC4 | 0x40 | Device id (`NG<%08X>`) | | 0x104 | 0x4 | NG key id 
| | 0x108 | 0x78 | Public key |
36.345345
292
0.710155
yue_Hant
0.302848
aaed1935a3f93089fa2dccb6a40d01ba294354a4
2,256
md
Markdown
Angular7PDF_Redactor/README.md
ldu2/PDFRedactor
52d9cdf855acc716b3bc6cfa63b9d33aeec42533
[ "Apache-2.0" ]
2
2021-06-03T03:52:55.000Z
2021-12-07T09:39:37.000Z
Angular7PDF_Redactor/README.md
ldu2/PDFRedactor
52d9cdf855acc716b3bc6cfa63b9d33aeec42533
[ "Apache-2.0" ]
null
null
null
Angular7PDF_Redactor/README.md
ldu2/PDFRedactor
52d9cdf855acc716b3bc6cfa63b9d33aeec42533
[ "Apache-2.0" ]
null
null
null
# Angular 7 PDF Redactor This project was initially generated with [Angular CLI](https://github.com/angular/angular-cli) version 7.2.3.<br/> This project was built with three tools. <br/> [ngx-dropzone-wrapper](https://github.com/zefoy/ngx-dropzone-wrapper) to upload the PDF file.<br/> [PDF.js](https://github.com/mozilla/pdf.js/) to process the PDF file.<br/> [jsPDF](https://github.com/MrRio/jsPDF) to download the redacted PDF file.<br/> ## Run the redactor First, run `npm install` to install the dependencies. Then, run `npm start` to run the project. Navigate to `http://localhost:4200/`. The app will automatically reload if you change any of the source files. ## Development server Run `ng serve` for a dev server. Navigate to `http://localhost:4200/`. The app will automatically reload if you change any of the source files. ## Build Run `ng build` to build the project. The build artifacts will be stored in the `dist/` directory. Use the `--prod` flag for a production build. ## Running unit tests Run `ng test` to execute the unit tests via [Karma](https://karma-runner.github.io). ## Running end-to-end tests Run `ng e2e` to execute the end-to-end tests via [Protractor](http://www.protractortest.org/). ## Implementation ### Components <pre> App<br/> |--redactor<br/> |--dropzone </pre> ## Usage 1. You can simply use this tool to redact your PDF file. [Demo](https://ldu2.github.io/PDFRedactor/)<br/> 2. You can use this app as a whole component in your own app to redact PDF files.<br/> 3. You can learn something from the tool, i.e., manipulating PDF files with canvas.<br/> ### Note: **Once you move on to the next page, the redaction is set on that page.**<br/> **The higher the screen resolution, the better the quality of the output file.**<br/> The result is almost the same as the [JS_PDFRedactor](https://github.com/ldu2/PDFRedactor/tree/master/JS_PDFRedactor)<br/> For the Angular 7 version of the PDF file redactor: 
When uploading a file, I am using http://httpbin.org/post as the POST endpoint. No responsibility is taken for the safety of files posted to that endpoint.<br/> If you wish to send the file to your own server, simply clone the repo and modify the code: change the download button to send a POST request.
47
206
0.736259
eng_Latn
0.848087
aaede0e82c88151ffd00cd7d67e1327a8ba9174d
3,134
md
Markdown
README.md
ani-sinha/bst
7bcef36ed5d0b2b89b4e031cff12fa34e6d6868a
[ "MIT" ]
79
2020-05-13T16:00:33.000Z
2022-03-29T18:35:14.000Z
README.md
ani-sinha/bst
7bcef36ed5d0b2b89b4e031cff12fa34e6d6868a
[ "MIT" ]
20
2020-07-16T05:55:51.000Z
2021-09-26T11:32:31.000Z
README.md
ani-sinha/bst
7bcef36ed5d0b2b89b4e031cff12fa34e6d6868a
[ "MIT" ]
7
2020-06-04T00:23:39.000Z
2021-09-10T14:29:27.000Z
# bst bst (pronounced "bestie") is a one-stop shop for running programs in isolated Linux environments. It is, effectively, a combination of `unshare`, `mount`, `setarch`, `chroot`, and many others; taking care of all the low-level minutæ to get in an environment that is as isolated as possible. The main purpose of bst is running CI/build processes in a somewhat deterministic fashion. ## Usage ``` $ bst [options] <exe> <args...> ``` See `man 1 bst` for more detailed information about how to use this program, including examples. ## Why bst? While bst is a multi-purpose tool, its main purpose is to serve as a building block for larger container systems. In CI systems running lots of commands in rapid succession, the cost of spinning up Docker containers can be unacceptable. For instance, on an 8-core laptop, over 10 runs, it takes 1.15 seconds to run `/bin/true` on an Alpine Linux Docker image, while bst takes 0.07 seconds to setup and run the same program in an isolated environment. ``` $ perf stat -n -r 10 -- docker run --rm -it alpine true Performance counter stats for 'docker run --rm -it alpine true' (10 runs): 1,1503 +- 0,0156 seconds time elapsed ( +- 1,36% ) $ perf stat -n -r 10 -- bst -r alpine /bin/true Performance counter stats for 'bst -r alpine /bin/true' (10 runs): 0,07352 +- 0,00470 seconds time elapsed ( +- 6,40% ) ``` bst is not and does not want to be a replacement for Docker, but is meant to be used by tooling wanting low-overhead isolated environments. Another strong suit of bst is that, by design, it can be used unprivileged. bst uses well-defined semantics for user namespaces to give unprivileged users the rights to enter different environments in a safe and controlled manner. ## Quickstart ### Installing There are two ways to install bst: downloading a prepackaged binary, or building from source: #### Installing a binary package Download the binary archive of the [latest release](https://github.com/aristanetworks/bst/releases/latest). 
Extract the archive into `/`, making sure to preserve xattrs. bst is installed into /usr/local. ``` $ sudo tar --xattrs --xattrs-include='*' -xf bst-x86_64.tar.xz -C / $ export PATH=$PATH:/usr/local/bin $ bst --version v1.0.0-rc1 ``` #### Building from source bst uses [Meson][meson] for its build system (requires python, ninja, sudo, and libcap). Additionally, it uses [scdoc][scdoc] to build its man pages. From the source directory: ``` $ meson ./build $ ninja -C ./build $ sudo ninja -C ./build install ``` The last step installs bst into /usr/local. ### Using bst First, make sure that your current user has a slice of sub-UIDs and sub-GIDs allocated: ``` $ id uid=1000(barney) gid=1000(barney) groups=1000(barney) $ grep -H . /etc/sub{u,g}id /etc/subuid:barney:1000000:65536 /etc/subgid:barney:1000000:65536 ``` See `man 5 subuid` and `man 5 subgid` for what these values signify. Once this is done, you should just be able to try it out: ``` $ bst # id uid=0(root), gid=0(root), groups=0(root) ``` [meson]: https://mesonbuild.com [scdoc]: https://git.sr.ht/~sircmpwn/scdoc
27.982143
107
0.720166
eng_Latn
0.991924
aaede73916fa98dffca98cc2794741db8be22419
203
md
Markdown
_moods/2021-09-21-big-apple.md
DotIN13/plainwhite-wannaexpresso.com
cf41a3ca03bc8ada202213cf4d951023e3f2ce5d
[ "MIT" ]
2
2021-02-18T12:13:28.000Z
2021-04-09T06:00:04.000Z
_moods/2021-09-21-big-apple.md
DotIN13/plainwhite-wannaexpresso.com
cf41a3ca03bc8ada202213cf4d951023e3f2ce5d
[ "MIT" ]
8
2021-02-19T03:56:24.000Z
2022-03-29T11:45:55.000Z
_moods/2021-09-21-big-apple.md
DotIN13/plainwhite-wannaexpresso.com
cf41a3ca03bc8ada202213cf4d951023e3f2ce5d
[ "MIT" ]
null
null
null
--- title: "Big Apple" author: "DotIN13" locale: en_US header-image: "big-apple.png" image-full: false date: 2021-09-21 11:11:58 +0800 --- Big meat, big plastic, big Apple. While grumbling about the delivery speed of the big international brands, think of the things that get snapped up within a second……
15.615385
33
0.70936
eng_Latn
0.14942
aaeee41060bafe9d8939616fa8f0ad35e3e12504
4,767
md
Markdown
src/posts/Migrando-de-Wordpress-para-Hexo.md
henriqueln7/woliveiras.github.io
092c70e62b65e2175f1a7d068f8c55856a67a6b2
[ "MIT" ]
31
2016-07-02T03:55:22.000Z
2022-02-06T14:46:04.000Z
src/posts/Migrando-de-Wordpress-para-Hexo.md
henriqueln7/woliveiras.github.io
092c70e62b65e2175f1a7d068f8c55856a67a6b2
[ "MIT" ]
28
2015-08-08T11:09:43.000Z
2022-01-06T19:29:44.000Z
src/posts/Migrando-de-Wordpress-para-Hexo.md
henriqueln7/woliveiras.github.io
092c70e62b65e2175f1a7d068f8c55856a67a6b2
[ "MIT" ]
21
2016-01-21T23:20:03.000Z
2022-02-07T20:05:41.000Z
--- title: Migrating from WordPress to Hexo tags: - hexo - javascript - nodejs - jamstack date: '2015-08-04' description: Migrating from WordPress to Hexo. How do you migrate from WordPress to a static site generator? --- For a good while I had wanted to move my blog from WordPress to a static content platform, but I couldn't find an alternative, because I wanted to use Node.js. That's why I didn't go with the community favorite [Jekyll(Ruby)](https://jekyllrb.com/), nor with [Pelican(Python <3)](https://blog.getpelican.com/). ![Hexo Blog]({{site.postsImagesPath}}hexo-logo.png) Looking for an alternative I found [Hexo](https://hexo.io) and, shortly after I learned about this platform, [Daciuk](https://blog.da2k.com.br/) wrote a really cool [post](https://blog.da2k.com.br/2014/01/05/hexo-criando-um-blog-ao-estilo-miojo/) encouraging its use. That's when I decided once and for all to use this platform! Shall we get to know it, and how the migration from WordPress to Hexo went? ## Gathering the requirements Before migrating our blog (or a client's) from one platform to another, we need to survey the requirements so there is no negative impact, such as losing the posts that already exist or a feature that matters to readers. My first concern was the posts I had already written, but the Hexo site itself offers a [solution for that](https://hexo.io/docs/migration.html). Tips on installing Hexo can be found on [Daciuk](https://blog.da2k.com.br/2014/01/05/hexo-criando-um-blog-ao-estilo-miojo/)'s blog and on [Willian Santos](https://dwoond.github.io/Criando-seu-site-com-Hexo/)'s blog. ## How to write posts Posts in Hexo work as in any other static generator. We write in Markdown and the post is generated as HTML. 
Markdown is not hard, and we can use some good tools that help with it, such as [Haroopad](https://pad.haroopress.com/) or [Stackedit](https://stackedit.io/editor#), but Hexo also has a sort of wp-admin; just install [Hexo Admin](https://github.com/jaredly/hexo-admin) if you feel the need. ## Migrating posts from WordPress to Hexo Just install the **hexo-migrator-wordpress** plugin with the command: ```bash npm i hexo-migrator-wordpress --save ``` *Note: remember to run these commands inside your Hexo blog's folder.* Now go into your WordPress **Admin** and follow the steps: ```bash Tools → Export → WordPress ``` It sits on the left-hand side of the screen. Then select what you want to export. If there is a specific date range you do not want to import into the new blog, just set the range and click to download the export file. A .xml file will be generated with your posts, configured with their publication date, the tags you used, special markup, etc. Then just run the command: ```bash hexo migrate wordpress <path to the file generated by WP> ``` And all your posts will be imported. What was a draft stays a draft; what was published is published normally. Only one bad thing about all this: WordPress generates the .xml with absolute paths for images and other media you inserted into a post, so it is worth reviewing the posts to check that nothing broke. ~~That's why it will take me a while to upload the old posts to this blog.~~ ## Generating the RSS feed Just put this line in your blog's configuration file (_config.yml): ```ruby # Feed feed: type: atom path: atom.xml limit: your choice ;) ``` ## Comments Since I already used a comments platform, [Disqus](https://disqus.com/), it was easy to keep the comments there. 
:) Just add the site's shortname (which you find in your site's settings on Disqus) to the configuration file: ```ruby # Disqus disqus_shortname: <shortname> ``` ## Hosting Since it is pure HTML, you can host it anywhere! Many people use GitHub Pages and I moved mine there too. Just configure the deploy to point to your server: ```ruby # Deployment ## Docs: https://hexo.io/docs/deployment.html deploy: type: git repo: <your repository on gh pages> ``` And as soon as you run the hexo deploy command, the site will be up to date. ## Domain You can use your own domain or keep the .github.io one; it is up to you. ;) To have your own domain you can follow these steps from [Willian Justen](https://willianjusten.com.br/dominio-proprio-no-github-pages/) (So many Willians in a single post...) That's all for now. Soon I will post more about Hexo and what my day-to-day with it is like. If you already have experience with Hexo or other generators and want to share your opinion, leave a comment! ;)
44.551402
337
0.761066
por_Latn
0.999856
aaef17dcb3a8b99b99d440cf6536ed8ad3210bcf
188
md
Markdown
README.md
mimocha/ssh-greet
7d6490531ffc58243961da8c68519310cce32018
[ "MIT" ]
null
null
null
README.md
mimocha/ssh-greet
7d6490531ffc58243961da8c68519310cce32018
[ "MIT" ]
null
null
null
README.md
mimocha/ssh-greet
7d6490531ffc58243961da8c68519310cce32018
[ "MIT" ]
null
null
null
# ssh-greet Greets users logging in via SSH by printing the user's full name (from the GECOS field). This is designed to work around the fact that the user is root when ssh executes the message of the day.
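As an illustration of the GECOS lookup described above (ssh-greet itself is not written in Python), this sketch uses Python's standard `pwd` module; the GECOS field is comma-separated, and the full name is conventionally its first subfield:

```python
import pwd

def gecos_full_name(gecos: str) -> str:
    """The GECOS field is comma-separated; the full name is the first subfield."""
    return gecos.split(",")[0]

def full_name(username: str) -> str:
    """Look up a user's full name from the password database."""
    return gecos_full_name(pwd.getpwnam(username).pw_gecos)
```

On Unix-like systems, `full_name("root")` would return whatever full name the local password database stores for root.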
47
89
0.776596
eng_Latn
0.99997
aaef79cecc22fc3ba5cdf806313f4466469a66ce
1,310
md
Markdown
windows-driver-docs-pr/install/pnp-manager.md
k-takai/windows-driver-docs.ja-jp
f28c3b8e411a2502e6378eaeef88cbae054cd745
[ "CC-BY-4.0", "MIT" ]
null
null
null
windows-driver-docs-pr/install/pnp-manager.md
k-takai/windows-driver-docs.ja-jp
f28c3b8e411a2502e6378eaeef88cbae054cd745
[ "CC-BY-4.0", "MIT" ]
null
null
null
windows-driver-docs-pr/install/pnp-manager.md
k-takai/windows-driver-docs.ja-jp
f28c3b8e411a2502e6378eaeef88cbae054cd745
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Plug and Play manager description: Plug and Play manager ms.assetid: b1890b3c-fc7b-4a2e-b48a-8266f237c9b6 ms.date: 04/20/2017 ms.localizationpriority: medium ms.openlocfilehash: 94c08fb3952610b2363e22f4c5ef65497339fd90 ms.sourcegitcommit: fb7d95c7a5d47860918cd3602efdd33b69dcf2da ms.translationtype: MT ms.contentlocale: ja-JP ms.lasthandoff: 06/25/2019 ms.locfileid: "67360834" --- # <a name="plug-and-play-manager"></a>Plug and Play manager The Plug and Play (PnP) manager supports PnP functionality in Windows and is responsible for the following PnP-related tasks: - Detecting and enumerating devices during system boot - Adding or removing devices while the system is running The kernel-mode PnP manager notifies the user-mode PnP manager that a new device is present on the system and must be installed. The kernel-mode PnP manager also calls the driver's [*DriverEntry*](https://docs.microsoft.com/windows-hardware/drivers/ddi/content/wdm/nc-wdm-driver_initialize) and [*AddDevice*](https://docs.microsoft.com/windows-hardware/drivers/ddi/content/wdm/nc-wdm-driver_add_device) routines and sends an [**IRP_MN_START_DEVICE**](https://docs.microsoft.com/windows-hardware/drivers/kernel/irp-mn-start-device) request to start the device. The PnP manager maintains the [device tree](https://docs.microsoft.com/windows-hardware/drivers/kernel/device-tree), which tracks the devices in the system. The device tree contains information about the devices present on the system. At computer startup, the PnP manager builds this tree using information from drivers and other components, and it updates the tree as devices are added or removed.
35.405405
391
0.81145
yue_Hant
0.584267
aaef9c8f78b025d2ab740ac46b32ff6aa98ca756
2,603
md
Markdown
docs/windows/customizing-or-changing-colors-image-editor-for-icons.md
anmrdz/cpp-docs.es-es
f3eff4dbb06be3444820c2e57b8ba31616b5ff60
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/windows/customizing-or-changing-colors-image-editor-for-icons.md
anmrdz/cpp-docs.es-es
f3eff4dbb06be3444820c2e57b8ba31616b5ff60
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/windows/customizing-or-changing-colors-image-editor-for-icons.md
anmrdz/cpp-docs.es-es
f3eff4dbb06be3444820c2e57b8ba31616b5ff60
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Customizing or Changing Colors (Image Editor for Icons) | Microsoft Docs ms.custom: '' ms.date: 11/04/2016 ms.technology: - cpp-windows ms.topic: conceptual dev_langs: - C++ helpviewer_keywords: - dithered color, Image editor - Custom Color Selector dialog box [C++] - Image editor [C++], Colors Palette - colors [C++], image - bitmaps [C++], colors - images [C++], colors - HSL values - Colors Palette, Image editor - RGB color values - Adjust Colors command [C++] - Image editor [C++], dithered color ms.assetid: e58f6b32-f435-4d9a-a570-7569433661ae author: mikeblome ms.author: mblome ms.workload: - cplusplus - uwp ms.openlocfilehash: e507942bbb0e6f77ec6423a51e4260f3fdd647a3 ms.sourcegitcommit: 799f9b976623a375203ad8b2ad5147bd6a2212f0 ms.translationtype: MT ms.contentlocale: es-ES ms.lasthandoff: 09/19/2018 ms.locfileid: "46405195" --- # <a name="customizing-or-changing-colors-image-editor-for-icons"></a>Customizing or changing colors (Image editor for icons) The **Image** editor's [Colors palette](../windows/colors-window-image-editor-for-icons.md) initially displays the 16 standard colors. In addition to the colors displayed, you can create your own custom colors. You can then [save and load a custom color palette](../windows/saving-and-loading-different-color-palettes-image-editor-for-icons.md). ### <a name="to-change-colors-on-the-colors-palette"></a>To change colors on the Colors palette 1. From the **Image** menu, choose **Adjust Colors**. 2. In the [Custom Color Selector dialog box](../windows/custom-color-selector-dialog-box-image-editor-for-icons.md), define the color by typing RGB or HSL values in the corresponding text boxes, or choose a color in the **Color gradient sample** box. 3. Set the luminosity by moving the slider on the **Luminosity** bar. 4. Many custom colors are dithered. 
If you want the solid color closest to the dithered color, double-click the **Color** box. If you later decide that you want the dithered color, move the slider on the **Luminosity** bar or move the crosshair in the **Color gradient sample** box again to restore the dithering. 5. Click **OK** to add the new color. ## <a name="requirements"></a>Requirements None ## <a name="see-also"></a>See also [Accelerator keys](../windows/accelerator-keys-image-editor-for-icons.md)<br/> [Working with color](../windows/working-with-color-image-editor-for-icons.md)
43.383333
383
0.766039
spa_Latn
0.859567
aaf054d835b0ec9058cf73c9911465e07fbfc0d2
5,054
md
Markdown
README.md
dotnetpowered/spark
e0addc0fbea4b41e37202e62aacfa4f5df2f8f42
[ "BSD-3-Clause" ]
null
null
null
README.md
dotnetpowered/spark
e0addc0fbea4b41e37202e62aacfa4f5df2f8f42
[ "BSD-3-Clause" ]
null
null
null
README.md
dotnetpowered/spark
e0addc0fbea4b41e37202e62aacfa4f5df2f8f42
[ "BSD-3-Clause" ]
null
null
null
Spark-Lite ========== Spark Lite is a stripped down fork of Spark meant for building a server with minimal dependencies. It removes from the Spark repo: - Storage implementations - Maintenance functions specific to storage implementation - SignalR |DSTU2|STU3|R4 |:-:|:-:|:-: |![Tests](https://github.com/FirelyTeam/spark/actions/workflows/run_tests.yaml/badge.svg?branch=master)|![Tests](https://github.com/FirelyTeam/spark/actions/workflows/run_tests.yaml/badge.svg?branch=stu3%2Fmaster)|![Tests](https://github.com/FirelyTeam/spark/actions/workflows/run_tests.yaml/badge.svg?branch=r4%2Fmaster) |![Integration Tests](https://github.com/FirelyTeam/spark/actions/workflows/integration_tests.yml/badge.svg?branch=master)|![Integration Tests](https://github.com/FirelyTeam/spark/actions/workflows/integration_tests.yml/badge.svg?branch=stu3%2Fmaster)|![Integration Tests](https://github.com/FirelyTeam/spark/actions/workflows/integration_tests.yml/badge.svg?branch=r4%2Fmaster) |![Release](https://github.com/FirelyTeam/spark/actions/workflows/nuget_deploy.yml/badge.svg)|![Release](https://github.com/FirelyTeam/spark/actions/workflows/nuget_deploy.yml/badge.svg)|![Release](https://github.com/FirelyTeam/spark/actions/workflows/nuget_deploy.yml/badge.svg) |![Docker Release](https://github.com/FirelyTeam/spark/actions/workflows/docker_image_linux.yml/badge.svg)|![Docker Release](https://github.com/FirelyTeam/spark/actions/workflows/docker_image_linux.yml/badge.svg)|![Docker Release](https://github.com/FirelyTeam/spark/actions/workflows/docker_image_linux.yml/badge.svg) Spark ===== Spark is a public domain FHIR server developed in C#, initially built by Firely and more recently maintained by Incendi. Spark implements a major part of the FHIR specification and has been used and tested during several HL7 WGM Connectathons. Recently the task of maintaining Spark has been taken up by the community, led by Incendi. 
Incendi and the community will keep enhancing this server to support the latest versions and add functionality. We also welcome anyone who wants to support this effort and help us make Spark a better reference platform and playground for FHIR. **DISCLAIMER: The web projects Spark.Web and Spark are meant as reference implementations and should never be used out of the box in a production environment without, at a minimum, adding security features.** ### Get Started There are two ways to get started with Spark: either by using the NuGet packages and following the Quickstart Tutorial, or by using the Docker images. #### NuGet Packages Read the [Quickstart Tutorial](https://firelyteam.github.io/spark/quickstart) on how to set up your own FHIR Server using the NuGet Packages. There is also an example project that accompanies the Quickstart Tutorial, which you can find here: https://github.com/incendilabs/spark-example #### Docker Images Set up the Spark FHIR server by using the Docker images. Make sure you have installed [Docker](https://docs.docker.com/install/). On Linux you will need to install [Docker Compose](https://docs.docker.com/compose/install/) as well. After installing Docker you can run the Spark server with one of the following commands, found below, for your preferred FHIR version. Remember to replace the single quotes with double quotes on Windows. The Spark FHIR Server will be available after startup at `http://localhost:5555`. 
#### R4 ``` curl 'https://raw.githubusercontent.com/FirelyTeam/spark/r4/master/.docker/docker-compose.example.yml' > docker-compose.yml docker-compose up ``` #### STU3 ``` curl 'https://raw.githubusercontent.com/FirelyTeam/spark/stu3/master/.docker/docker-compose.example.yml' > docker-compose.yml docker-compose up ``` #### DSTU2 ``` curl 'https://raw.githubusercontent.com/FirelyTeam/spark/master/.docker/docker-compose.example.yml' > docker-compose.yml docker-compose up ``` ## Versions #### R4 Source code can be found in the branch **r4/master**. This is the version of Spark running at https://spark.incendi.no FHIR Endpoint: https://spark.incendi.no/fhir #### STU3 Source code can be found in the branch **stu3/master**; we try to keep up to date with the STU3 version of FHIR. This is the version of Spark running at https://spark-stu3.incendi.no FHIR Endpoint: https://spark-stu3.incendi.no/fhir #### DSTU2 DSTU2 is no longer maintained by this project. The source code can be found in the branch **master**. #### DSTU1 DSTU1 is no longer maintained by this project. The source code can be found in the branch **dstu1/master**. ## Contributing If you want to contribute, see our [guidelines](https://github.com/furore-fhir/spark/wiki/Contributing) ### Git branching strategy Our strategy for git branching: Branch from the stu3/master branch which contains the STU3 version, unless the feature or bug fix is considered for a specific version of FHIR; then branch from the relevant branch, which at this point is only r4/master. See [GitHub flow](https://guides.github.com/introduction/flow/) for more information.
60.891566
521
0.78334
eng_Latn
0.948944
aaf0809c7aebbd6b302fc3fcfa7151689cd0e709
8,700
md
Markdown
docs/prow/prow-installation-on-forks.md
montaro/test-infra
9b23660d764c4e927aeccee0be9741168a03ad23
[ "Apache-2.0" ]
null
null
null
docs/prow/prow-installation-on-forks.md
montaro/test-infra
9b23660d764c4e927aeccee0be9741168a03ad23
[ "Apache-2.0" ]
null
null
null
docs/prow/prow-installation-on-forks.md
montaro/test-infra
9b23660d764c4e927aeccee0be9741168a03ad23
[ "Apache-2.0" ]
null
null
null
# Prow Installation on Forks This instruction provides the steps required to deploy your own Prow on a forked repository for test and development purposes. > **NOTE:** The following instructions assume that you are signed in to the Google Cloud project with administrative rights and that you have the `$GOPATH` already set. ## Prerequisites Install the following tools: - Kubernetes 1.10+ on Google Kubernetes Engine (GKE) - [kubectl](https://kubernetes.io/docs/tasks/tools/install-kubectl/) to communicate with Kubernetes - [gcloud](https://cloud.google.com/sdk/gcloud/) to communicate with Google Cloud Platform (GCP) - OpenSSL ## Provision a cluster 1. Export these variables: ``` export PROJECT={project-name} export CLUSTER_NAME={cluster-name} export ZONE={zone-name} ``` 2. When you communicate for the first time with Google Cloud, set the context to your Google Cloud project. Run this command: ``` gcloud config set project $PROJECT ``` 3. Run the [`provision-cluster.sh`](../../development/provision-cluster.sh) script or follow [this](https://github.com/kubernetes/test-infra/blob/master/prow/getting_started_deploy.md#create-the-cluster) instruction to provision a new cluster on GKE. Make sure that kubectl points to the correct cluster. For GKE, run the following command: ``` gcloud container clusters get-credentials $CLUSTER_NAME --zone=$ZONE --project=$PROJECT ``` ## Create a bot account Create a separate GitHub account which serves as a bot account that triggers the Prow comments that you enter in the pull request. If the Prow bot account is the same as the account that creates a job-triggering comment, the job is not triggered. Add the bot account to the [collaborators](https://help.github.com/articles/adding-outside-collaborators-to-repositories-in-your-organization/) on your forked repository and set it with push access rights. The bot account must accept your invitation. 
## Set an access token Set an OAuth2 token that has read and write access to the bot account. To generate a new token, go to the **Settings** tab of a given GitHub account and click **Developer Settings**. Choose **Personal Access Token** and **Generate New Token**. In the new window, select all scopes and click **Generate token**. You can set the token either as an environment variable named `OAUTH` or provide it during the installation process when prompted. ## Create Secrets For the purpose of the installation, you must have a set of service accounts and secret files created on Google Cloud Storage (GCS). > **NOTE:** For details, see the [Prow Secrets Management](./prow-secrets-management.md) document that explains step by step how to create all required GCS resources. 1. Create two buckets on GCS, one for storing Secrets and the second for storing logs. > **NOTE:** The bucket for storing logs is used in Prow by the Plank component. This reference is defined in the `config.yaml` file. 2. Create the following service accounts, role bindings, and private keys. 
Encrypt them using Key Management Service (KMS), and upload them to your Secret storage bucket: - **sa-gke-kyma-integration** with roles that allow the account to manage Kubernetes clusters and their resources: - Compute Admin (`roles/compute.admin`) - Kubernetes Engine Admin (`roles/container.admin`) - Kubernetes Engine Cluster Admin (`roles/container.clusterAdmin`) - DNS Administrator (`roles/dns.admin`) - Service Account User (`roles/iam.serviceAccountUser`) - Storage Admin (`roles/storage.admin`) - **sa-vm-kyma-integration** with roles that allow the account to provision virtual machines: - Compute Instance Admin (beta) (`roles/compute.instanceAdmin`) - Compute OS Admin Login (`roles/compute.osAdminLogin`) - Service Account User (`roles/iam.serviceAccountUser`) - **sa-gcs-plank** with the role that allows the account to store objects in a bucket: - Storage Object Admin (`roles/storage.objectAdmin`) - **sa-gcr-push** with the role that allows the account to push images to Google Container Registry: - Storage Admin (`roles/storage.admin`) - **kyma-bot-npm-token** which is a token for publishing npm packages ## Install Prow Follow these steps to install Prow: 1. Export these environment variables: - **BUCKET_NAME** is a GCS bucket in the Google Cloud project that stores Prow Secrets. - **KEYRING_NAME** is the KMS key ring. - **ENCRYPTION_KEY_NAME** is the key name in the key ring that is used for data encryption. - **KUBECONFIG** is a path to a `kubeconfig` file. - **PROJECT** is a Google Cloud project name. - **GOOGLE_APPLICATION_CREDENTIALS** is a path to a service account file. This service account requires KMS and storage roles. The account files are encrypted with the **ENCRYPTION_KEY_NAME** key from **KEYRING_NAME** and are stored in **BUCKET_NAME**. 2. Go to the `development` folder and run the following script to start the installation process: ```bash ./install-prow.sh ``` > **NOTE:** The script prompts you to enter your OAuth2 token. 
This script performs the following steps to install Prow: - Deploy the NGINX Ingress Controller. - Create a ClusterRoleBinding. - Create a HMAC token used for GitHub webhooks. - Create Secrets for HMAC and OAuth2 used by Prow. - Deploy Prow components using the `starter.yaml` file from the `prow/cluster` directory. - Add annotations for the Prow Ingress to make it work with the NGINX Ingress Controller. ## Verify the Installation Verify if the Prow installation was successful. 1. Check if all Pods are up and running: ``` kubectl get pods ``` 2. Check if the Deck is accessible from outside of the cluster: ``` kubectl get ingress ing ``` Copy the address of the ingress `ing` and open it in a browser to display the Prow status on the dashboard. ## Configure the webhook After Prow installs successfully, you must [configure the webhook](https://support.hockeyapp.net/kb/third-party-bug-trackers-services-and-webhooks/how-to-set-up-a-webhook-in-github) to enable the GitHub repository to send events to Prow. ## Configure Prow When you use the [`install-prow.sh`](../../development/provision-cluster.sh) script to install Prow on your cluster, the list of plugins and configuration is empty. You can configure Prow by specifying the `config.yaml` and `plugins.yaml` files, and adding job definitions to the `jobs` directory. ### The config.yaml file The `config.yaml` file contains the basic Prow configuration. When you create a particular ProwJob, it uses the Preset definitions from this file. See the example of such a file [here](../../prow/config.yaml). For more details, see the [Kubernetes documentation](https://github.com/kubernetes/test-infra/blob/master/prow/getting_started_deploy.md#add-more-jobs-by-modifying-configyaml). ### The plugins.yaml file The `plugins.yaml` file contains the list of [plugins](https://status.build.kyma-project.io/plugins) you enable on a given repository. See the example of such a file [here](../../prow/plugins.yaml). 
For more details, see the [Kubernetes documentation](https://github.com/kubernetes/test-infra/blob/master/prow/getting_started_deploy.md#enable-some-plugins-by-modifying-pluginsyaml). ### The job configuration file You can define a test presubmit job for a component. However, remember to adjust its definition in the `yaml` file to point to your forked repository instead of the original repository. For details on how to define a presubmit job, see the [Migration Guide](./migration-guide.md#create-a-presubmit-job). ### Verify the configuration To check if the `plugins.yaml`, `config.yaml`, and jobs configuration files are correct, run the `validate-config.sh {plugins_file_path} {config_file_path} {jobs_dir_path}` script. For example, run: ``` ./validate-config.sh ../prow/plugins.yaml ../prow/config.yaml ../prow/jobs ``` ### Upload the configuration on a cluster If the files are configured correctly, upload the files on a cluster. 1. Use the `update-plugins.sh {file_path}` script to apply plugin changes on a cluster. ``` ./update-plugins.sh ../prow/plugins.yaml ``` 2. Use the `update-config.sh {file_path}` script to apply Prow configuration on a cluster. ``` ./update-config.sh ../prow/config.yaml ``` 3. Use the `update-jobs.sh {jobs_dir_path}` script to apply jobs configuration on a cluster. ``` ./update-jobs.sh ../prow/jobs ``` After you complete the required configuration, you can test the uploaded plugins and configuration. You can also create your own job pipeline and test it against the forked repository. ### Cleanup To clean up everything created by the installation script, run the removal script: ``` ./remove-prow.sh ```
43.939394
340
0.76092
eng_Latn
0.979205
aaf0983602c5cb93e100c56511cfdd3e101165c6
2,613
md
Markdown
docs/framework/unmanaged-api/debugging/icordebugstackwalk-next-method.md
jhonyfrozen/docs.pt-br
c9e86b6a5de2ff8dffd54dd64d2e87aee85a5cb8
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/unmanaged-api/debugging/icordebugstackwalk-next-method.md
jhonyfrozen/docs.pt-br
c9e86b6a5de2ff8dffd54dd64d2e87aee85a5cb8
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/unmanaged-api/debugging/icordebugstackwalk-next-method.md
jhonyfrozen/docs.pt-br
c9e86b6a5de2ff8dffd54dd64d2e87aee85a5cb8
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: ICorDebugStackWalk::Next Method ms.date: 03/30/2017 api_name: - ICorDebugStackWalk.Next Method api_location: - mscordbi.dll api_type: - COM f1_keywords: - ICorDebugStackWalk::Next helpviewer_keywords: - ICorDebugStackWalk::Next method [.NET Framework debugging] - Next method, ICorDebugStackWalk interface [.NET Framework debugging] ms.assetid: 189c36be-028c-4fba-a002-5edfb8fcd07f topic_type: - apiref author: rpetrusha ms.author: ronpet ms.openlocfilehash: 724db50285532c20132fbfd5262df26227db6742 ms.sourcegitcommit: 9b552addadfb57fab0b9e7852ed4f1f1b8a42f8e ms.translationtype: MT ms.contentlocale: pt-BR ms.lasthandoff: 04/23/2019 ms.locfileid: "61782662" --- # <a name="icordebugstackwalknext-method"></a>ICorDebugStackWalk::Next Method Moves the [ICorDebugStackWalk](../../../../docs/framework/unmanaged-api/debugging/icordebugstackwalk-interface.md) object to the next frame. ## <a name="syntax"></a>Syntax ``` HRESULT Next(); ``` ## <a name="return-value"></a>Return Value This method returns the following specific HRESULTs as well as HRESULT errors that indicate method failure. |HRESULT|Description| |-------------|-----------------| |S_OK|The runtime successfully returned to the next frame (see Remarks).| |E_FAIL|The `ICorDebugStackWalk` object could not be advanced.| |CORDBG_S_AT_END_OF_STACK|The end of the stack was reached as a result of this unwind.| |CORDBG_E_PAST_END_OF_STACK|The frame pointer is already at the end of the stack; therefore, no additional frames can be accessed.| ## <a name="exceptions"></a>Exceptions ## <a name="remarks"></a>Remarks The `Next` method advances the `ICorDebugStackWalk` object only if the runtime can unwind the current frame to the calling frame. Otherwise, the object advances to the next frame that the runtime is able to unwind. 
## <a name="requirements"></a>Requirements **Platforms:** See [System Requirements](../../../../docs/framework/get-started/system-requirements.md). **Header:** CorDebug.idl, CorDebug.h **Library:** CorGuids.lib **.NET Framework Versions:** [!INCLUDE[net_current_v40plus](../../../../includes/net-current-v40plus-md.md)] ## <a name="see-also"></a>See also - [ICorDebugStackWalk Interface](../../../../docs/framework/unmanaged-api/debugging/icordebugstackwalk-interface.md) - [Debugging Interfaces](../../../../docs/framework/unmanaged-api/debugging/debugging-interfaces.md) - [Debugging](../../../../docs/framework/unmanaged-api/debugging/index.md)
40.2
257
0.735936
por_Latn
0.723375
aaf1776a8383ba2b4df96bb137ea90ea5af4d39f
11,447
md
Markdown
src/pages/languages/VerboseFuck.vbfk.md
FractalHQ/HelloWorlds
c6f90363a5f34ae6040858502fd5714ec3184512
[ "MIT" ]
null
null
null
src/pages/languages/VerboseFuck.vbfk.md
FractalHQ/HelloWorlds
c6f90363a5f34ae6040858502fd5714ec3184512
[ "MIT" ]
null
null
null
src/pages/languages/VerboseFuck.vbfk.md
FractalHQ/HelloWorlds
c6f90363a5f34ae6040858502fd5714ec3184512
[ "MIT" ]
null
null
null
~!comment!~VerboseFuck, a BrainFuck derivative focussing on verbosity. see more at <http://esolangs.org/wiki/VerboseFuck>~!uncomment!~ program.initialize(); math.equation(program.errors.handler.activated = boolean(false)); program.console.standardinput.openstream(); program.console.standardoutput.openstream(); define(defines.variable, variable(pointer)); implanttype(pointer, types.pointer(to:types.byte)); math.equation(pointer = void(0)); program.memory.allocate(pointer, void(math.infinity), program.memory.memorytype.bidirectional); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); define(defines.label, defines.label.createnew()); conditional(block.if, boolean.inequality(deref(pointer), byte(0))) { math.equation(pointer = pointer + void(1)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ 
math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(pointer = pointer + void(1)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(pointer = pointer + void(1)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(pointer = pointer + void(1)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(pointer = pointer - void(1)); math.equation(pointer = pointer - void(1)); math.equation(pointer = pointer - void(1)); math.equation(pointer = pointer - void(1)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); }; conditional(block.if, 
boolean.inequality(deref(pointer), byte(0))) { program.flow.labeledjump(defines.label.last()); }; undefine(defines.label, defines.label.last()); math.equation(pointer = pointer + void(1)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(pointer = pointer + void(1)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + 
byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(pointer = pointer + void(1)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(pointer = pointer - void(1)); math.equation(pointer = pointer - void(1)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); 
math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); ~!comment!~MANDATORY~!uncomment!~ math.equation(pointer = pointer + void(1)); program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); 
~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ math.equation(deref(pointer) = (deref(pointer) - byte(1)):binaryand:byte(255)); program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(pointer = pointer + void(1)); math.equation(deref(pointer) = (deref(pointer) + byte(1)):binaryand:byte(255)); ~!comment!~MANDATORY~!uncomment!~ program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); math.equation(pointer = pointer + void(1)); program.console.standardoutput.stream.writeunbufferedchars(array.create(1, conversion.changedatatype(deref(pointer), types.character, conversion.method.binary)), 0, 1); ~!comment!~MANDATORY~!uncomment!~ program.memory.deallocate(pointer, void(math.infinity), program.memory.memorytype.bidirectional); undefine(defines.variable, variable(pointer)); program.console.standardoutput.closestream(); program.console.standardinput.closestream(); program.terminate(); ~!comment!~MANDATORY~!uncomment!~
---
layout: post
title: "2019-06-03"
date: 2019-06-03 11:40:30
description:
image: "/assets/stories/201906/b2faf975cf1b07a454043436402d0dd8.jpg"
author: Elise Plain
excerpt:
tags:
- stories
---

<p></p>
---
title: First Tutorial
keywords: beast, tutorial
last_updated: July 22, 2017
tags: [tutorial]
summary: "An introductory tutorial to getting started with BEAST. This tutorial describes the use of BEAUti and BEAST to analyse some primate sequences and estimate a phylogenetic tree. It will take you through the process of importing an alignment, making choices about the model, generating a BEAST XML file. You will then run BEAST."
sidebar: beast_sidebar
permalink: first_tutorial.html
folder: beast
redirect_from: "/tutorial-1"
---

{% capture root_url %}{{ site.tutorials_root_url }}/first_tutorial/{% endcapture %}

## Running BEAST for the first time

This tutorial will guide you through running BEAST and some of its accessory programs to do a simple phylogenetic analysis. If you haven't already, [download and install BEAST following these instructions](installing).

{% include note.html content="This tutorial assumes a basic knowledge of molecular phylogenetics and the models and terminology used in the field. If you know what the HKY model and the gamma model of rate heterogeneity are then you should be OK. You should also be familiar with at least the basics of Bayesian inference and Markov chain Monte Carlo sampling." %}

### Running BEAUti

{% include icon-callout.html file='icons/beauti-icon.png' content='Run BEAUti by double clicking on its icon. <a href="beauti">BEAUti</a> is an interactive graphical application for designing your analysis and generating the control file (a BEAST XML file) which BEAST will use to run the analysis.' %}

Once running, BEAUti will look similar irrespective of which computer system it is running on. For this tutorial, the Mac OS X version will be shown but the Linux & Windows versions will have exactly the same layout and functionality.
### Loading the NEXUS file

When running, you will see a window like this:

{% include image.html prefix=root_url file="image1.png" %}

The first thing you need to do is to load some data, in this case a small nucleotide alignment. To load a NEXUS format alignment, simply select the `Import Data...` option from the `File` menu. You can also click the `+` button at the bottom left of the window or just drag-and-drop the file into the main window.

#### The NEXUS alignment

<div class="alert alert-success" role="alert"><i class="fa fa-download fa-lg"></i> The examples folder in the BEAST download package contains a file called apes.nex - <a href="{{ root_url }}files/apes.nex">you can also download the file here</a>.</div>

This file contains an alignment of mitochondrial tRNA sequences from 6 primate species (5 apes and a monkey outgroup). It starts out like this (the lines have been truncated):

```
#NEXUS
BEGIN DATA;
	DIMENSIONS NTAX=6 NCHAR=768;
	FORMAT MISSING=? GAP=- DATATYPE=DNA;
	MATRIX
	human      AGAAATATGTCTGATAAAAGAGTTACTTTGATAGAGTAAATAATAGGAGC...
	chimp      AGAAATATGTCTGATAAAAGAATTACTTTGATAGAGTAAATAATAGGAGT...
	bonobo     AGAAATATGTCTGATAAAAGAATTACTTTGATAGAGTAAATAATAGGAGT...
	gorilla    AGAAATATGTCTGATAAAAGAGTTACTTTGATAGAGTAAATAATAGAGGT...
	orangutan  AGAAATATGTCTGACAAAAGAGTTACTTTGATAGAGTAAAAAATAGAGGT...
	siamang    AGAAATACGTCTGACGAAAGAGTTACTTTGATAGAGTAAATAACAGGGGT...
	;
END;
```

BEAST can also import FASTA files (as long as the sequences have been aligned) or BEAST XML files (in which case the data will be imported but the models and settings will not).

Once loaded, the alignment will be displayed in the main window in a table:

{% include image.html prefix=root_url file="image2.png" %}

If you want to look at the actual alignment, double click on the row in the table.
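As an aside (not needed for the tutorial), if you ever want to pull the taxon names and sequences out of a simple NEXUS alignment like apes.nex programmatically, a rough Python sketch might look like the following. The function name and the one-name-one-sequence-per-line assumption are mine, and real NEXUS files can be considerably more complex than this:

```python
# Minimal sketch: extract taxon names and sequences from the MATRIX block
# of a simple NEXUS file. Assumes one "name sequence" pair per line,
# terminated by a lone ";" -- full NEXUS parsing needs a real library.
def read_nexus_matrix(text):
    sequences = {}
    in_matrix = False
    for line in text.splitlines():
        line = line.strip()
        if line.upper() == "MATRIX":
            in_matrix = True
            continue
        if in_matrix:
            if line == ";":
                break  # end of the MATRIX block
            if line:
                name, seq = line.split(None, 1)
                sequences[name] = seq.replace(" ", "")
    return sequences

example = """#NEXUS
BEGIN DATA;
MATRIX
human AGAAATAT
chimp AGAAATAT
;
END;"""
print(read_nexus_matrix(example))  # {'human': 'AGAAATAT', 'chimp': 'AGAAATAT'}
```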
{% include image.html prefix=root_url file="image3.png" %}

Along the top of the main window is a strip of 'tabs':

{% include image.html width="720px" prefix=root_url file="image4.png" %}<br />

Each of these has settings and options and in general you should work from left to right (although not all the tabs will be relevant to all analyses). After the `Partitions` tab where you imported data, skip the `Taxa` tab for now and move straight on to the `Tips` tab.

### Setting the dates of the taxa

In the `Tips` options you will see a table with all of the taxa that were in the alignment. This panel allows you to give the taxa dates. Taxon dates (or 'tip dates') are only important in certain cases, for example, when they are sampled from fast-evolving viruses or sub-fossil ancient DNA material. In the case of the apes we are analysing, the tree represents millions of years of evolution so the dates of the tips can be assumed to be zero. This is the default --- the taxa all have a date of zero and the `Use tip dates` box is not selected.

{% include image.html prefix=root_url file="image5.png" %}

Skip the `Traits` tab and move on to the `Sites` tab.

### Setting the evolutionary model

In the `Sites` tab you can set the model of molecular evolution for the sequence data you have loaded. Exactly which options appear depend on whether the data are nucleotides or amino acids (or other forms of data). The settings that will appear after loading the `apes.nex` data set will be as follows:

{% include image.html prefix=root_url file="image6.png" %}

This tutorial assumes you are familiar with the evolutionary models available, however there are a couple of points to note:

The default is a simple HKY ([Hasegawa, Kishino & Yano (1985) *J Mol Evol* **22**: 160-174](https://www.ncbi.nlm.nih.gov/pubmed/3934395)) model of nucleotide evolution. We are going to use this model.
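For readers who like to see the model written down, here is a rough Python sketch of the standard HKY instantaneous rate matrix. This is the textbook form of the model, not BEAST's internal implementation, and the nucleotide ordering (A, C, G, T) and function name are my own choices for illustration:

```python
# Sketch of the HKY instantaneous rate matrix Q (nucleotide order A, C, G, T
# assumed here). Transitions (A<->G, C<->T) are scaled by the kappa parameter;
# transversions are not; diagonal entries make each row sum to zero.
def hky_rate_matrix(kappa, pi):
    transitions = {(0, 2), (2, 0), (1, 3), (3, 1)}  # A<->G and C<->T
    q = [[0.0] * 4 for _ in range(4)]
    for i in range(4):
        for j in range(4):
            if i != j:
                q[i][j] = pi[j] * (kappa if (i, j) in transitions else 1.0)
        q[i][i] = -sum(q[i])  # negative sum of the off-diagonal rates
    return q

for row in hky_rate_matrix(kappa=2.0, pi=[0.25, 0.25, 0.25, 0.25]):
    print(row)  # rows sum to zero; transition rates are twice the transversion rates
```

With kappa = 1 this reduces to equal rates for all substitutions; kappa = 2.0 (the BEAST default starting value) doubles the transition rates relative to transversions.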
{% include callout.html content="Selecting the Partition into codon positions option assumes that the data are aligned as codons. In this case the data are from mitochondrial tRNAs so this option is inappropriate." %}

### Setting the molecular clock model

In the next tab, `Clock`, we set the molecular clock model we will use. Unlike many other phylogenetic software packages, BEAST exclusively uses molecular clock models so that trees have a timescale (an inferred root and direction in time). The simplest model is the default 'Strict clock'. This assumes that all branches on the tree have the same rate of evolution. The other molecular clock models relax this assumption in various ways which later tutorials will discuss.

{% include image.html prefix=root_url file="image7.png" %}

For this analysis leave the `Clock Type` as `Strict clock` and move on to the `Trees` tab.

### Setting the tree prior model

In this panel you can set the model that provides a prior on the tree and some choices about the starting tree in the MCMC run.

{% include image.html prefix=root_url file="image8.png" %}

The `Tree Prior` option has many choices divided generally into 'Coalescent' models, which are generally suited to populations, and 'Speciation' models which, as the name suggests, are intended for species-level data. As we have sequence data from a handful of species, we will select the `Speciation: Yule process` model. The Yule process [Yule (1925) *Phil Trans Royal Soc B* **213**: 402-420](http://rstb.royalsocietypublishing.org/content/213/402-410/21) is the simplest model of speciation where each lineage is assumed to have speciated at a fixed rate. The model has a single parameter, the 'birth rate' of new species.

The bottom half of this panel allows you to choose how BEAST selects a starting tree. In most situations it is better to leave this as `Random starting tree`. This generates a random tree to start the BEAST run with.

Select the Yule process and move to the next `Priors` tab.
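The Yule process is simple enough to simulate in a few lines, which can help build intuition for what the birth-rate parameter does: with n lineages alive, the waiting time to the next speciation is exponentially distributed with rate birth_rate × n. The sketch below (forward in time, starting from two lineages) is an illustrative toy of my own, not how BEAST uses the model:

```python
import random

# Toy simulation of the Yule (pure-birth) process: starting from two lineages,
# each speciation waiting time is drawn from an exponential distribution with
# rate = birth_rate * n, where n is the current number of lineages.
def yule_speciation_times(birth_rate, n_tips, seed=42):
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    t, n, times = 0.0, 2, []
    while n < n_tips:
        t += rng.expovariate(birth_rate * n)
        times.append(t)
        n += 1
    return times

print(yule_speciation_times(birth_rate=0.1, n_tips=6))  # four speciation events
```

A higher birth rate compresses these waiting times, giving shorter internal branches on the tree.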
### Priors

The next tab allows priors to be specified for each parameter in the model:

{% include image.html prefix=root_url file="image9.png" %}

Selecting priors is one of the most challenging aspects of Bayesian analysis. In BEAST we have tried to pick some reasonably appropriate and robust priors as defaults for most parameters. In this tutorial, the default options will be used.

### MCMC operators

The next stage is to look at the operators for the MCMC. To do this select the `Operators` tab at the top of the main window. For the `apes.nex` dataset, with the model set up as shown in the screen shot above, you will see the following table:

{% include image.html prefix=root_url file="image10.png" %}

Each parameter in the model has one or more "operators". The operators specify how the parameter changes as the MCMC runs. This table lists the parameters, their operators and the tuning settings for these operators. In the first column are the parameter names. These will be called things like `kappa` which means the HKY model's kappa parameter (the transition-transversion bias). The next column has the type of operators that are acting on each parameter. For example, the `scale` operator scales the parameter up or down by a proportion, the `random walk` operator adds or subtracts an amount to the parameter and the `uniform` operator simply picks a new value uniformly within a range. Some parameters relate to the tree or to the heights of the nodes of the tree and these have special operators.

The next column, labelled `Tuning`, gives a tuning setting to the operator. Some operators don't have any tuning settings so have n/a under this column. Changing the tuning setting will set how large a move that operator will make, which will affect how often that change is accepted by the MCMC, which in turn affects the efficiency of the analysis.
At the top of the window is an option called `Auto Optimize` which, when selected, will automatically adjust the tuning setting as the MCMC runs to try to achieve maximum efficiency.

The next column, labelled `Weight`, specifies how often each operator is applied relative to each other. Some parameters have very little interaction with the rest of the model and as a result tend to be estimated very efficiently --- an example is the `kappa` parameter --- these parameters can have their operators down-weighted so that they are not changed as often.

Once again leave the settings at their defaults and move on to the next tab.

### Setting the MCMC options

The last tab, `MCMC`, provides settings to control the actual running of BEAST:

{% include image.html prefix=root_url file="image11.png" %}

Firstly we have the Length of chain. This is the number of steps the MCMC will make in the chain before finishing. How long this should be depends on the size of the dataset, the complexity of the model and the quality of answer required. The default value of 10,000,000 is entirely arbitrary and should be adjusted according to the size of your dataset. In order to examine whether a particular chain length is adequate, the resulting log file can be analysed using Tracer. The aim of setting the chain length is to achieve a reasonable Effective Sample Size (ESS). Ways of doing this are discussed in another tutorial.

The next options specify how often the current parameter values should be displayed on the screen and recorded in the log file. The screen output is simply for monitoring the program's progress so can be set to any value (although if set too small, the sheer quantity of information being displayed on the screen may actually slow the program down). For the log file, the value should be set relative to the total length of the chain. Sampling too often will result in very large files with little extra benefit in terms of the precision of the analysis.
Sample too infrequently and the log file will not contain much information about the distributions of the parameters. You probably want to aim to store about 10,000 samples so this should be set to the chain length / 10,000.

For this dataset (which is very small) let's initially set the chain length to 1,000,000 as this will run very quickly on most modern computers. Set the sampling frequency to 100, accordingly.

{% include note.html content="You can set the screen sampling frequency to something different from the main log files. Here, the analysis is going to run very fast so printing to the screen every 100 steps will cause a large amount of information to scroll up the screen. Try setting the `Echo state to screen` option to `10000`, resulting in only 100 updates to the screen as the analysis runs." %}

The final two options give the file names of the log files for the parameters and the trees. These will be set to a default based on the name of the imported NEXUS file but feel free to change these. The rest of the options can be ignored for the purposes of this tutorial.

{% include note.html content="On Windows machines the operating system patronisingly hides the extensions of files from you. It is sometimes easier to add an additional extension `.txt` to the log and the trees file --- Windows will hide the `.txt` but still show you the `.log` and `.trees` extensions so you can distinguish the files." %}

### Saving and Loading BEAUti files

If you select the `Save` option from the `File` menu this will save a document in BEAUti's own format. Note that this is not in the format that BEAST understands --- it can only be reopened by BEAUti. The idea is that the settings and data in BEAUti can be saved and loaded at a later time. We suggest you save BEAUti files with the extension '.beauti'.

{% include note.html content="Just as BEAUti files cannot be read and understood by BEAST, BEAST XML files cannot be reloaded back into BEAUti. They can however be 'Imported' just like NEXUS or FASTA files. The sequence data contained within will be imported, as will all the tip dates and certain other information." %}

### Generating the BEAST XML file

We are now ready to create the BEAST XML file. Select `Generate XML...` from the `File` menu (or the button at the bottom of the window) and save the file with an appropriate name --- it will offer the name you gave it in the MCMC panel and we usually end the filename with '.xml' (although see the note, above, about extensions on Windows machines --- you may want to give the file the extension '.xml.txt').

We are now ready to run the file through BEAST.

## Running BEAST

{% include icon-callout.html file='icons/beast-icon.png' content='Run BEAST by double clicking on the BEAST icon in the package you downloaded.' %}

The following dialog box will appear:

{% include image.html prefix=root_url file="image12.png" %}

All you need to do is to click the `Choose File...` button, select the XML file you created in BEAUti, above, and press `Run`. For information about the other options see the page on the [BEAST program](beast).

When you press `Run` BEAST will load the XML file, set up the analysis and then run it with no further interaction.

In the output window you will see lots of information appearing. It starts by printing the title and credits:

```
                   BEAST v1.X, 2002-2102
       Bayesian Evolutionary Analysis Sampling Trees
                 Designed and developed by
   Alexei J. Drummond, Andrew Rambaut and Marc A. Suchard

             Department of Computer Science
                 University of Auckland
                [email protected]

           Institute of Evolutionary Biology
               University of Edinburgh
                  [email protected]

           David Geffen School of Medicine
        University of California, Los Angeles
                  [email protected]

Downloads, Help & Resources: http://beast.community

Source code distributed under the GNU Lesser General Public License:
http://github.com/beast-dev/beast-mcmc
```

Then it gives some details about the data it loaded and the models you have specified. You should see that it has repeated all of the choices you made in BEAUti.

```
Random number seed: 1500828054875

Parsing XML file: apes.xml
  File encoding: UTF8
Looking for plugins in /Users/rambaut/Projects/BEAST/plugins
Read alignment: alignment
  Sequences = 6
      Sites = 768
   Datatype = nucleotide
Site patterns 'patterns' created from positions 1-768 of alignment 'alignment'
  unique pattern count = 69
Using Yule prior on tree
Creating the tree model, 'treeModel'
  initial tree topology = ((((bonobo,human),orangutan),gorilla),(chimp,siamang))
  tree height = 102.55912280908296
Using strict molecular clock model.
Creating state frequencies model 'frequencies': Initial frequencies = {0.25, 0.25, 0.25, 0.25}
Creating HKY substitution model. Initial kappa = 2.0
Creating site rate model: with initial relative rate = 1.0
Using Multi-Partition Data Likelihood Delegate with BEAGLE 3 extensions
  Using BEAGLE resource 0: CPU
    with instance flags: PRECISION_DOUBLE COMPUTATION_SYNCH EIGEN_REAL SCALING_MANUAL SCALERS_RAW VECTOR_SSE THREADING_NONE PROCESSOR_CPU FRAMEWORK_CPU
  Ignoring ambiguities in tree likelihood.
  With 1 partitions comprising 69 unique site patterns
  Using rescaling scheme : dynamic (rescaling every 100 evaluations, delay rescaling until first overflow)
Using TreeDataLikelihood
Optimization Schedule: default
Likelihood computation is using an auto sizing thread pool.
Creating the MCMC chain:
  chainLength=1000000
  autoOptimize=true
  autoOptimize delayed for 10000 steps
```

Next it prints out a block of citations for BEAST and for the individual models and components selected. This is intended to help you write up the analysis, specifying and citing the models used:

```
Citations for this analysis:

FRAMEWORK
BEAST primary citation:
	Drummond AJ, Suchard MA, Xie Dong, Rambaut A (2012) Bayesian phylogenetics with BEAUti and the BEAST 1.7. Mol Biol Evol. 29, 1969-1973. DOI:10.1093/molbev/mss075

TREE DENSITY MODELS
Gernhard 2008 Birth Death Tree Model:
	Gernhard T (2008) The conditioned reconstructed process. Journal of Theoretical Biology. 253, 769-778. DOI:10.1016/j.jtbi.2008.04.005

SUBSTITUTION MODELS
HKY nucleotide substitution model:
	Hasegawa M, Kishino H, Yano T (1985) Dating the human-ape splitting by a molecular clock of mitochondrial DNA. J. Mol. Evol.. 22, 160-174
```

Finally BEAST starts to run. It prints up various pieces of information that are useful for keeping track of what is happening. The first column is the 'state' number --- in this case it is incrementing by 10,000 so between each of these lines it has made 10,000 operations. The screen log shows only a few of the metrics and parameters but it is also recording a log file to disk with all of the results in it (along with a '.trees' file containing the sampled trees for these states). After a few thousand states it will start to report the number of hours per million states (or if it is running very fast, per billion states). This is useful to allow you to predict how long the run is going to take and whether you have time to go and get a cup of coffee, or lunch, or a two week vacation in the Caribbean.
```
# BEAST v1.X
# Generated Sun Jul 23 17:42:37 BST 2017 [seed=1500828054875]
state	Posterior   	Prior       	Likelihood  	rootAge
0	-6812.7534  	-481.7973   	-6330.9561  	102.559     	-
10000	-2910.2955  	-246.4611   	-2663.8344  	13.7383     	-
20000	-2082.7664  	-234.0125   	-1848.7539  	0.12112     	-
30000	-2049.3031  	-230.0213   	-1819.2818  	5.84813E-2  	-
40000	-2043.1005  	-226.3604   	-1816.7401  	6.21508E-2  	-
50000	-2042.1593  	-226.1915   	-1815.9678  	5.79924E-2  	-
. . .
```

After waiting the expected amount of time, BEAST will finish.

```
950000	-2044.0265  	-227.1010   	-1816.9255  	5.98269E-2  	11.27 hours/billion states
960000	-2041.6364  	-225.0546   	-1816.5818  	5.58888E-2  	11.27 hours/billion states
970000	-2046.3378  	-227.5622   	-1818.7756  	5.96169E-2  	11.27 hours/billion states
980000	-2039.9261  	-226.1422   	-1813.7839  	5.96151E-2  	11.27 hours/billion states
990000	-2047.3466  	-227.6691   	-1819.6775  	6.41662E-2  	11.27 hours/billion states
1000000	-2040.3296  	-225.5095   	-1814.8201  	5.74639E-2  	11.27 hours/billion states

Operator analysis
Operator                          Tuning   Count      Time     Time/Op  Pr(accept)
scale(kappa)                      0.239    13605      621      0.05     0.2453
frequencies                       0.105    13530      722      0.05     0.2398
subtreeSlide(treeModel)           0.025    202546     5651     0.03     0.2313
Narrow Exchange(treeModel)                 203114     7020     0.03     0.0547
Wide Exchange(treeModel)                   40112      623      0.02     0.0099
wilsonBalding(treeModel)                   40753      848      0.02     0.0053
scale(treeModel.rootHeight)       0.148    40635      1192     0.03     0.2406
uniform(nodeHeights(treeModel))            404956     13096    0.03     0.385
scale(yule.birthRate)             0.126    40749      305      0.01     0.2392

41.987 seconds
```

The table at the end lists each of the operators, how many times each was used, how much time they took and some other details. This information can be useful for optimising the performance of runs but generally it can be ignored.

This took 40 seconds to run on a low-performance MacBook. Clearly we would be able to run it for much longer whilst getting a coffee but with such a small data set we may not need to. We will find out when we start to look at the output files from BEAST.
{% include callout.html type="primary" content="**[The second tutorial will look at summarizing the output, diagnosing problems and building a tree](second_tutorial).**" %}

{% include links.html %}
<properties
	pageTitle="Authenticate with on-premises Active Directory in your Azure app | Microsoft Azure"
	description="Learn about the different options for line-of-business applications in Azure App Service to authenticate with on-premises Active Directory"
	services="app-service"
	documentationCenter=""
	authors="cephalin"
	manager="wpickett"
	editor="jimbe"/>

<tags
	ms.service="app-service"
	ms.devlang="na"
	ms.topic="article"
	ms.tgt_pltfrm="na"
	ms.workload="web"
	ms.date="08/31/2016"
	ms.author="cephalin"/>

# <a name="authenticate-with-on-premises-active-directory-in-your-azure-app"></a>Authenticate with on-premises Active Directory in your Azure app #

This article shows how to authenticate with on-premises Active Directory (AD) in [Azure App Service](../app-service/app-service-value-prop-what-is.md). An Azure app is hosted in the cloud, but there are ways to authenticate on-premises AD users securely.

## <a name="authenticate-through-azure-active-directory"></a>Authenticate through Azure Active Directory

An Azure Active Directory tenant can be directory-synchronized with an on-premises AD. This approach lets AD users access your app from the internet and authenticate using their on-premises credentials. Furthermore, Azure App Service provides a [turn-key solution for this method](../app-service-mobile/app-service-mobile-how-to-configure-active-directory-authentication.md). With just a few clicks of a button, you can enable authentication with a directory-synchronized tenant for your Azure app. This approach has the following advantages:

- Does not require any authentication code in your app. Let App Service handle the authentication for you and spend your time providing functionality in your app.
- The [Azure AD Graph API](http://msdn.microsoft.com/library/azure/hh974476.aspx) enables access to directory data from your Azure app.
- Fornece o SSO para [todos os aplicativos com suporte pelo Azure Active Directory](/marketplace/active-directory/), incluindo o Office 365, Dynamics CRM Online, Microsoft Intune e milhares de aplicativos em nuvem não sejam da Microsoft. - Active Directory do Azure suporta controle de acesso baseado em função. Você pode usar o padrão de [Authorize(Roles="X")] com mínimas alterações para seu código. Para ver como escrever um aplicativo de linha de negócios Azure que autentica com o Azure Active Directory, consulte [criar um aplicativo do Azure de linha de negócios com a autenticação do Active Directory do Azure](web-sites-dotnet-lob-application-azure-ad.md). ## <a name="authenticate-through-an-on-premises-sts"></a>Autenticação por meio de um STS local Se você tiver um local serviço de símbolo seguro (STS) como os serviços de Federação do Active Directory (AD FS), você pode usar que interajam autenticação de seu aplicativo do Azure. Essa abordagem é melhor quando a política da empresa impede dados AD sejam armazenados no Azure. No entanto, observe o seguinte: - Topologia de STS deve ser implantado no local, com sobrecarga de custo e gerenciamento. - Apenas administradores do AD FS podem configurar [regras de declaração e terceira relações de confiança de terceiros](http://technet.microsoft.com/library/dd807108.aspx), que pode limitar as opções do desenvolvedor. Por outro lado, é possível gerenciar e personalizar [declarações](http://technet.microsoft.com/library/ee913571.aspx) em uma base por aplicativo. - Acesso ao local dados AD requerem uma solução separada por meio do firewall corporativo. Para ver como escrever um aplicativo de linha de negócios Azure que autentica com um STS local, consulte [criar um aplicativo do Azure de linha de negócios com a autenticação do AD FS](web-sites-dotnet-lob-application-adfs.md).
93.833333
599
0.782543
por_Latn
0.999262
aaf32885f0345b13e023015d484a9e038c379641
1,987
md
Markdown
README.md
craig-willis/water-tale
e5aab2d799159a941304379851f92fd29fd03c1e
[ "MIT" ]
null
null
null
README.md
craig-willis/water-tale
e5aab2d799159a941304379851f92fd29fd03c1e
[ "MIT" ]
null
null
null
README.md
craig-willis/water-tale
e5aab2d799159a941304379851f92fd29fd03c1e
[ "MIT" ]
null
null
null
# Mapping Estimated Water Usage in Texas (staging) [![DOI](https://sandbox.zenodo.org/badge/DOI/10.5072/zenodo.490450.svg)](https://sandbox.zenodo.org/record/490450#.XkQXjBNKh24) [![Whole Tale](https://raw.githubusercontent.com/whole-tale/wt-design-docs/master/badges/wholetale-badge.svg?sanitize=true)](https://girder.stage.wholetale.org/api/v1/integration/zenodo?doi=10.5072%2Fzenodo.490450&resource_server=sandbox.zenodo.org) ## Description Demonstration of how to use Whole Tale to develop custom analysis and visualization for data published externally via DataONE. See https://wholetale.readthedocs.io/en/stable/users_guide/quickstart.html for more information. ## Datasets used > White, D. and Alessa, L. (2010) “Humans and Hydrology at High Latitudes: Water Use Information, Version 1.0.” UCAR/NCAR - Earth Observing Laboratory. doi: 10.5065/D6862DM8. ## Environment This example uses the JupyterLab environment ## Licenses **Text and figures :** [CC-BY-4.0](http://creativecommons.org/licenses/by/4.0/) **Code :** [MIT](LICENSE.md) ## How to run The example provided in this repository can be run using the Whole Tale platform. Click [![Whole Tale](https://raw.githubusercontent.com/whole-tale/wt-design-docs/master/badges/wholetale-badge.svg?sanitize=true)](https://girder.stage.wholetale.org/api/v1/integration/zenodo?doi=10.5072%2Fzenodo.490450&resource_server=sandbox.zenodo.org) to launch an interactive JupyterLab environment based on the published research object. If you have Docker installed, you can also download it as a [zip file](https://sandbox.zenodo.org/record/490450/preview/5e39c4fe8bec16c2663dd8f8.zip?download=1) and run locally. Simply extract the zip file and execute the "run-local.sh". ## How to cite Please cite this as: > Willis, Craig > [![](https://orcid.org/sites/default/files/images/orcid_16x16.png)](https://orcid.org/0000-0002-6148-7196) > (2020). Mapping Estimated Water Usage (staging). http://doi.org/10.5072/zenodo.490450
53.702703
427
0.773025
eng_Latn
0.380095
aaf3363f2fae19b26d7b95290c2943617c1315ee
534
md
Markdown
README.md
hash-bang/compare-names
19a15a8410f05b94b7fd191874c1f5af3ef5cd51
[ "MIT" ]
2
2018-02-20T14:56:29.000Z
2018-12-23T11:54:18.000Z
README.md
hash-bang/compare-names
19a15a8410f05b94b7fd191874c1f5af3ef5cd51
[ "MIT" ]
2
2018-02-21T03:56:29.000Z
2018-03-08T15:04:53.000Z
README.md
hash-bang/compare-names
19a15a8410f05b94b7fd191874c1f5af3ef5cd51
[ "MIT" ]
2
2018-03-05T22:14:10.000Z
2020-09-09T13:28:41.000Z
compare-names ============= Fuzzily compare two arrays of names. This module is intended mainly for comparing research paper author lists. It takes two arrays of authors and simply returns a boolean if the authors *look like* they are the same. ```javascript var compareNames = require('compare-names'); compareNames( ['Izbicki, R', 'Weyhing, B. T. I.', 'Backer, L'], ['Izbicki, R 3rd', 'Weyhing, B. T.', 'Backer, L', 'Caoili, E.', 'M.l Vaitkevicius, V.K.'] ); // Returns true ``` See the [unit tests](test/) for more examples.
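The README does not spell out the gem's matching rules, so purely as an illustration of the kind of fuzzy comparison involved, here is a self-contained JavaScript sketch. The surname-plus-first-initial heuristic and the suffix filtering are assumptions made for the sketch, not the module's actual algorithm:

```javascript
// Simplified sketch of fuzzy author-list comparison (NOT the gem's
// actual algorithm): two names match if their surnames and first
// initials agree, ignoring case, punctuation, and suffixes like "3rd".
function normalize(name) {
  const cleaned = name.replace(/\./g, ' ').replace(/,/g, ' ');
  const parts = cleaned.split(/\s+/).filter(Boolean)
    // Drop generational suffixes and ordinal/roman-numeral tokens.
    .filter(p => !/^(jr|sr|[0-9]+(st|nd|rd|th)?|[ivx]+)$/i.test(p));
  // Assume "Surname First ..." ordering, as in "Izbicki, R".
  const surname = parts[0] ? parts[0].toLowerCase() : '';
  const initial = parts[1] ? parts[1][0].toLowerCase() : '';
  return surname + '|' + initial;
}

function compareNames(listA, listB) {
  // Every author in the shorter list must match someone in the longer one.
  const [short_, long_] =
    listA.length <= listB.length ? [listA, listB] : [listB, listA];
  const keys = new Set(long_.map(normalize));
  return short_.every(name => keys.has(normalize(name)));
}
```

Under these assumptions, `compareNames(['Izbicki, R'], ['Izbicki, R 3rd'])` is `true` because the suffix is discarded before comparison.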
28.105263
105
0.679775
eng_Latn
0.984069
aaf3e1d07ccb2f674c5217a0e88335a9c1311726
952
md
Markdown
powerapps-docs/developer/model-driven-apps/clientapi/reference/formContext-ui-process/reflow.md
noriji/powerapps-docs.ja-jp
0dc46ef4c9da7a8ab6d6e2a397f271170f7b9970
[ "CC-BY-4.0", "MIT" ]
null
null
null
powerapps-docs/developer/model-driven-apps/clientapi/reference/formContext-ui-process/reflow.md
noriji/powerapps-docs.ja-jp
0dc46ef4c9da7a8ab6d6e2a397f271170f7b9970
[ "CC-BY-4.0", "MIT" ]
null
null
null
powerapps-docs/developer/model-driven-apps/clientapi/reference/formContext-ui-process/reflow.md
noriji/powerapps-docs.ja-jp
0dc46ef4c9da7a8ab6d6e2a397f271170f7b9970
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: reflow in model-driven apps (Client API reference) | MicrosoftDocs ms.date: 10/31/2018 ms.service: crm-online ms.topic: reference applies_to: Dynamics 365 (online) ms.assetid: 6833e4ea-70fc-4ee0-8aab-68cc55e21444 author: KumarVivek ms.author: kvivek manager: amyla search.audienceType: - developer search.app: - PowerApps - D365CE --- # <a name="reflow-client-api-reference"></a>reflow (Client API reference) [!INCLUDE[./includes/reflow-description.md](./includes/reflow-description.md)] ## <a name="syntax"></a>Syntax `formContext.ui.process.reflow(updateUI, parentStage, nextStage);` ## <a name="parameter"></a>Parameters |Name|Type|Required|Description| |--|--|--|--| |updateUI|Boolean|Yes|Specify **true** to update the UI of the process control; otherwise, **false**.| |parentStage|String|Yes|The ID of the parent stage, in GUID format.| |nextStage|String|Yes|The ID of the next stage, in GUID format.| ### <a name="related-topics"></a>Related topics [formContext.ui.process](../formContext-ui-process.md)
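As a sketch of the call shape only: the `formContext` object below is a hand-written stand-in that merely records the call so the parameter order is visible, and the stage IDs are placeholder GUIDs. In a real model-driven app, the platform supplies `formContext` to the event handler.

```javascript
// Hypothetical stand-in for the platform-provided formContext object,
// used only to illustrate the parameter order of reflow().
const formContext = {
  ui: {
    process: {
      calls: [],
      reflow(updateUI, parentStage, nextStage) {
        // Record the arguments instead of updating a real process control.
        this.calls.push({ updateUI, parentStage, nextStage });
      },
    },
  },
};

// Move the business process flow to a new stage and refresh the control.
// Both stage IDs are placeholder GUIDs, not values from a real org.
const parentStageId = "11111111-1111-1111-1111-111111111111";
const nextStageId = "22222222-2222-2222-2222-222222222222";
formContext.ui.process.reflow(true, parentStageId, nextStageId);
```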
23.219512
86
0.721639
yue_Hant
0.762919
aaf3f9ab7770e30c103f56dcdfc6b0df133b6908
10,143
md
Markdown
README.md
mstajbakhsh/linkedin-scraper
cf05fe988bdb922e0efd7a8e4b955bce8eaf1960
[ "MIT" ]
1
2020-01-01T15:57:14.000Z
2020-01-01T15:57:14.000Z
README.md
mstajbakhsh/linkedin-scraper
cf05fe988bdb922e0efd7a8e4b955bce8eaf1960
[ "MIT" ]
null
null
null
README.md
mstajbakhsh/linkedin-scraper
cf05fe988bdb922e0efd7a8e4b955bce8eaf1960
[ "MIT" ]
null
null
null
Linkedin Scraper ================ Linkedin-scraper is a gem for scraping LinkedIn public profiles. You give it a URL, and it lets you easily get its title, name, country, area, current_companies and much more. Installation ------------ Install the gem from RubyGems: gem install linkedin-scraper This gem is tested on Ruby versions 1.8.7, 1.9.2, 1.9.3 and 2.0.0 Usage ----- Initialize a scraper instance for a URL, like this: profile = Linkedin::Profile.get_profile("http://www.linkedin.com/in/jeffweiner08") Then you can see the scraped data like this: profile.first_name #the first name of the contact profile.last_name #the last name of the contact profile.name #the full name of the profile profile.title #the LinkedIn job title profile.location #the location of the contact profile.country #the country of the contact profile.industry #the domain to which the contact belongs profile.picture #the profile pic url of the contact profile.skills #the skills of the profile profile.organizations #the organizations of the profile profile.education #Array of hashes for education profile.picture #url of the profile picture profile.current_companies [ [0] { :current_company => "LinkedIn", :current_title => "CEO", :current_company_url => "http://www.linkedin.com", :description => nil, :linkedin_company_url => "http://www.linkedin.com/company/linkedin?trk=ppro_cprof", :url => "http://www.linkedin.com", :type => "Public Company", :company_size => "1001-5000 employees", :website => "http://www.linkedin.com", :industry => "Internet", :founded => "2003", :address => "2029 Stierlin Court Mountain View, CA 94043 United States" }, [1] { :current_company => "Intuit", :current_title => "Member, Board of Directors", :current_company_url => "http://network.intuit.com/", :description => nil, :linkedin_company_url => "http://www.linkedin.com/company/intuit?trk=ppro_cprof", :url => "http://network.intuit.com/", :type => "Public Company", :company_size => "5001-10,000 employees", :website => 
"http://network.intuit.com/", :industry => "Computer Software", :founded => "1983", :address => "2632 Marine Way Mountain View, CA 94043 United States" }, [2] { :current_company => "DonorsChoose", :current_title => "Member, Board of Directors", :current_company_url => "http://www.donorschoose.org", :description => nil, :linkedin_company_url => "http://www.linkedin.com/company/donorschoose.org?trk=ppro_cprof", :url => "http://www.donorschoose.org", :type => "Nonprofit", :company_size => "51-200 employees", :website => "http://www.donorschoose.org", :industry => "Nonprofit Organization Management", :founded => "2000", :address => "213 West 35th Street 2nd Floor East New York, NY 10001 United States" }, [3] { :current_company => "Malaria No More", :current_title => "Member, Board of Directors", :current_company_url => nil, :description => nil }, [4] { :current_company => "Venture For America", :current_title => "Member, Advisory Board", :current_company_url => "http://ventureforamerica.org/", :description => nil, :linkedin_company_url => "http://www.linkedin.com/company/venture-for-america?trk=ppro_cprof", :url => "http://ventureforamerica.org/", :type => "Nonprofit", :company_size => "1-10 employees", :website => "http://ventureforamerica.org/", :industry => "Nonprofit Organization Management", :founded => "2011" } ] profile.past_companies #Array of hash containing its past job companies and job profile #Example [ [0] { :past_company => "Accel Partners", :past_title => "Executive in Residence", :past_company_website => "http://www.facebook.com/accel", :description => nil, :linkedin_company_url => "http://www.linkedin.com/company/accel-partners?trk=ppro_cprof", :url => "http://www.facebook.com/accel", :type => "Partnership", :company_size => "51-200 employees", :website => "http://www.facebook.com/accel", :industry => "Venture Capital & Private Equity", :address => "428 University Palo Alto, CA 94301 United States" }, [1] { :past_company => "Greylock", :past_title => 
"Executive in Residence", :past_company_website => "http://www.greylock.com", :description => nil, :linkedin_company_url => "http://www.linkedin.com/company/greylock-partners?trk=ppro_cprof", :url => "http://www.greylock.com", :type => "Partnership", :company_size => "51-200 employees", :website => "http://www.greylock.com", :industry => "Venture Capital & Private Equity", :address => "2550 Sand Hill Road Menlo Park, CA 94025 United States" }, [2] { :past_company => "Yahoo!", :past_title => "Executive Vice President Network Division", :past_company_website => "http://www.yahoo.com", :description => nil, :linkedin_company_url => "http://www.linkedin.com/company/yahoo?trk=ppro_cprof", :url => "http://www.yahoo.com", :type => "Public Company", :company_size => "10,001+ employees", :website => "http://www.yahoo.com", :industry => "Internet", :founded => "1994", :address => "701 First Avenue Sunnyvale, CA 94089 United States" }, [3] { :past_company => "Windsor Media", :past_title => "Founding Partner", :past_company_website => nil, :description => nil }, [4] { :past_company => "Warner Bros.", :past_title => "Vice President Online", :past_company_website => "http://www.warnerbros.com/", :description => nil, :linkedin_company_url => "http://www.linkedin.com/company/warner-bros.-entertainment-group-of-companies?trk=ppro_cprof", :url => "http://www.warnerbros.com/", :type => "Public Company", :company_size => "10,001+ employees", :website => "http://www.warnerbros.com/", :industry => "Entertainment", :address => "4000 Warner Boulevard Burbank, CA 91522 United States" } ] profile.linkedin_url #url of the profile profile.websites #Array of websites [ [0] "http://www.linkedin.com/" ] profile.groups #Array of hashes containing group name and link profile.education #Array of hashes for education profile.skills #Array of skills profile.picture #url of the profile picture profile.recommended_visitors #It's the list of visitors "Viewers of this profile also viewed..." 
[ [0] { :link => "http://www.linkedin.com/in/barackobama?trk=pub-pbmap", :name => "Barack Obama", :title => "President of the United States of ", :company => nil }, [1] { :link => "http://www.linkedin.com/in/marissamayer?trk=pub-pbmap", :name => "Marissa Mayer", :title => "Yahoo!, President & CEO", :company => nil }, [2] { :link => "http://www.linkedin.com/pub/sean-parker/0/1/826?trk=pub-pbmap", :name => "Sean Parker", :title => nil, :company => nil }, [3] { :link => "http://www.linkedin.com/pub/eduardo-saverin/0/70a/31b?trk=pub-pbmap", :name => "Eduardo Saverin", :title => nil, :company => nil }, [4] { :link => "http://www.linkedin.com/in/rbranson?trk=pub-pbmap", :name => "Richard Branson", :title => "Founder", :company => "Virgin Group" }, [5] { :link => "http://www.linkedin.com/in/reidhoffman?trk=pub-pbmap", :name => "Reid Hoffman", :title => "Entrepreneur. Product Strategist. ", :company => nil }, [6] { :link => "http://www.linkedin.com/in/mdell?trk=pub-pbmap", :name => "Michael Dell", :title => "Chairman and CEO", :company => "Dell" }, [7] { :link => "http://www.linkedin.com/in/mittromney?trk=pub-pbmap", :name => "Mitt Romney", :title => "Believe in America", :company => nil }, [8] { :link => "http://www.linkedin.com/pub/sheryl-sandberg/2/665/512?trk=pub-pbmap", :name => "Sheryl Sandberg", :title => nil, :company => nil } ] ## Credits - [Justin Grevich](https://github.com/jgrevich) - [Vpoola](https://github.com/vpoola88) - [Mark Walker](https://github.com/jfdimark) You're welcome to fork this project and send pull requests. I want to thank specially:
37.290441
128
0.526669
yue_Hant
0.341953
aaf4ce7ad4fa43a1e7a18443e20ec394e88aca21
3,771
md
Markdown
docs/vs-2015/workflow-designer/how-to-use-breadcrumb-navigation.md
rfakhouri/visualstudio-docs.cs-cz
3d540a168c09a23b855f746696062fd9954b8dd5
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/vs-2015/workflow-designer/how-to-use-breadcrumb-navigation.md
rfakhouri/visualstudio-docs.cs-cz
3d540a168c09a23b855f746696062fd9954b8dd5
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/vs-2015/workflow-designer/how-to-use-breadcrumb-navigation.md
rfakhouri/visualstudio-docs.cs-cz
3d540a168c09a23b855f746696062fd9954b8dd5
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: 'How to: Use Breadcrumb Navigation | Microsoft Docs' ms.date: 11/15/2016 ms.prod: visual-studio-dev14 ms.technology: vs-workflow-designer ms.topic: reference ms.assetid: 4a688056-37dc-406a-9071-be2141e192fe caps.latest.revision: 5 author: gewarren ms.author: gewarren manager: jillfra ms.openlocfilehash: 64f84d33a814937df74002ffeb2fe7453694377e ms.sourcegitcommit: 47eeeeadd84c879636e9d48747b615de69384356 ms.translationtype: HT ms.contentlocale: cs-CZ ms.lasthandoff: 04/23/2019 ms.locfileid: "63444124" --- # <a name="how-to-use-breadcrumb-navigation"></a>How to: Use breadcrumb navigation There are three main ways to change the set of activities displayed in [!INCLUDE[wfd1](../includes/wfd1-md.md)]: 1. Double-click to drill into a child activity. 2. Click a button on the breadcrumb navigation bar to navigate to a parent activity. 3. Expand or collapse activities in place. ### <a name="using-breadcrumb-navigation"></a>Using breadcrumb navigation 1. Double-click an activity on the [!INCLUDE[wfd2](../includes/wfd2-md.md)] to change the root activity to the clicked activity. The clicked activity is then fully expanded at the root, and its parents are listed on the breadcrumb bar. This is sometimes called drilling into or out of an activity. 2. To navigate to a parent of the current root activity, click that activity on the breadcrumb bar. ### <a name="expanding-or-collapsing-an-activity-in-place"></a>Expanding or collapsing an activity in place 1. Clicking the double arrows on an activity expands or collapses the activity in place. 2. When you change the expansion state by clicking, the new expansion state is saved in the XAML. > [!WARNING] > Not all activities can be expanded in place. 
There are two possible cases in which an activity cannot be expanded in place: either the parent activity does not allow its children to be expanded in place (for example, activities in a flowchart cannot be expanded in place), or the activity designer itself does not allow expanding in place. Although none of the activity designers included in [!INCLUDE[wfd2](../includes/wfd2-md.md)] have the latter behavior, some custom activities may behave this way. ### <a name="expanding-all-or-collapsing-all-activities"></a>Expanding all or collapsing all activities 1. Use the **Expand All** and **Collapse All** buttons in the UI to expand and collapse all activities at the current breadcrumb root. Note that Expand All and Collapse All are global states. This means that when you change the root activity through breadcrumb navigation, the Expand All or Collapse All state is reapplied until you click **Restore**. 2. After Expand All or Collapse All has been applied, you can click the **Restore** button to go back to viewing the states previously applied to each activity. > [!WARNING] > If an activity, such as <xref:System.Activities.Statements.Flowchart>, has opted out of expanding in place, the functionality associated with the **Expand All** and **Collapse All** buttons is disabled in the **Flowchart** designer. [!INCLUDE[crabout](../includes/crabout-md.md)] the **Flowchart** designer, see the [Flowchart](../workflow-designer/flowchart-activity-designer.md) topic. > [!WARNING] > Expand All also has a special effect in the **Switch** and **TryCatch** activity designers. After you click **Expand All**, all switch cases and all try/catch/finally blocks are displayed. Clicking **Restore** or **Collapse All** returns these designers to the default state, in which you can click an individual case or block to display its contents.
71.150943
487
0.77327
ces_Latn
0.999911
aaf4d215be9fc213aa6def90e3585d486cdcd8e6
42
md
Markdown
ERGA_asm/README.md
ERGA-consortium/hqgen2021
b39f0debef6fb8ae359c0a491847db7e56907da6
[ "MIT" ]
2
2021-11-08T12:04:53.000Z
2021-11-15T14:28:44.000Z
ERGA_asm/README.md
ERGA-consortium/hqgen2021
b39f0debef6fb8ae359c0a491847db7e56907da6
[ "MIT" ]
null
null
null
ERGA_asm/README.md
ERGA-consortium/hqgen2021
b39f0debef6fb8ae359c0a491847db7e56907da6
[ "MIT" ]
null
null
null
# Documentation of ERGA assembly workflows
21
41
0.857143
eng_Latn
0.99895
aaf51db39bbc1fa1630a65d1b88580ad4e9e72ec
263
md
Markdown
src/__tests__/fixtures/unfoldingWord/en_tq/heb/09/12.md
unfoldingWord/content-checker
7b4ca10b94b834d2795ec46c243318089cc9110e
[ "MIT" ]
null
null
null
src/__tests__/fixtures/unfoldingWord/en_tq/heb/09/12.md
unfoldingWord/content-checker
7b4ca10b94b834d2795ec46c243318089cc9110e
[ "MIT" ]
226
2020-09-09T21:56:14.000Z
2022-03-26T18:09:53.000Z
src/__tests__/fixtures/unfoldingWord/en_tq/heb/09/12.md
unfoldingWord/content-checker
7b4ca10b94b834d2795ec46c243318089cc9110e
[ "MIT" ]
1
2022-01-10T21:47:07.000Z
2022-01-10T21:47:07.000Z
# What offering did Christ make, by which he entered the most holy place? Christ made an offering of his own blood by which he entered the most holy place. # What did Christ’s offering accomplish? Christ’s offering secured everlasting redemption for everyone.
32.875
81
0.790875
eng_Latn
1
aaf571940c11d9eaef418750064fcb2124fd2459
20,196
md
Markdown
README.md
moyinfesobi/awesome-technical-writing
d9aafcc90d68807a189eb11e0e3bb890c4acd46e
[ "CC-BY-4.0" ]
948
2020-03-11T09:39:22.000Z
2022-03-31T05:16:25.000Z
README.md
moyinfesobi/awesome-technical-writing
d9aafcc90d68807a189eb11e0e3bb890c4acd46e
[ "CC-BY-4.0" ]
14
2020-03-11T06:48:58.000Z
2022-01-30T02:25:43.000Z
README.md
BolajiAyodeji/awesome-technical-writing
d9aafcc90d68807a189eb11e0e3bb890c4acd46e
[ "CC-BY-4.0" ]
150
2020-03-11T10:49:05.000Z
2022-03-31T10:37:35.000Z
# Awesome Technical Writing [![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome) <!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section --> [![All Contributors](https://img.shields.io/badge/all_contributors-10-orange.svg?style=flat-square)](#contributors-) <!-- ALL-CONTRIBUTORS-BADGE:END --> ![](https://repository-images.githubusercontent.com/221308953/36034800-6311-11ea-8418-8a1a03c97d81) > Technical writing is writing or drafting technical communication used in technical and occupational fields, such as computer hardware and software, engineering, chemistry, aeronautics, robotics, finance, medical, consumer electronics, biotechnology, and forestry. ~ [Wikipedia](https://en.wikipedia.org/wiki/Technical_writing) *List inspired by the [awesome](https://github.com/sindresorhus/awesome) list and [awesome-jamstack](https://github.com/bolajiayodeji/awesome-jamstack) list* ## Table of Contents - [Community](#community) - [Courses](#courses) - [Resources](#resources) - [Videos](#videos) - [Useful Tools](#useful-tools) - [Conferences](#conferences) - [Speaker Decks](#speaker-decks) - [Books](#books) - [Podcasts](#podcasts) - [Style Guides](#style-guides) - [Technical Writers to Follow](#technical-writers-to-follow) ## Community * [Hashnode](https://hashnode.com/) * [r/technicalwriting](https://www.reddit.com/r/technicalwriting/) * [Google's Season of Docs](https://developers.google.com/season-of-docs/) * [freeCodeCamp News](https://www.freecodecamp.org/news/) * [ycombinator News](https://news.ycombinator.com/) * [DEV](https://dev.to/) * [Hackernoon](https://hackernoon.com/) * [Write the Docs ](https://www.writethedocs.org/) * [The Good Docs Project](https://thegooddocsproject.dev/) * [LinkedIn Technical Writing Community](https://www.linkedin.com/groups/13705342/) * [Society for Technical Communication](https://www.stc.org/) * [LogRocket 
Blog](https://blog.logrocket.com/) * [Scotch](https://scotch.io/) * [The Manuscript Academy](https://manuscriptacademy.com/) ## Courses * [Study Technical Writing](https://developers.google.com/tech-writing/overview) * [Technical Writing: Documentation on Software Projects](https://www.pluralsight.com/courses/technical-writing-software-documentation) * [Coding for Writers: Basic Programming](https://www.udemy.com/course/coding-for-writers-1-basic-programming/) * [Class Central Technical Writing](https://www.classcentral.com/course/technical-writing-7117) * [Professional Technical Writing: Advance Your Writing Skills](https://www.udemy.com/technical-writing-and-editing/) * [Technical Writing: Master Your Writing Career](https://www.udemy.com/technical-writing/) * [English 305: Advanced Technical Writing](https://study.com/academy/course/technical-writing-course.html) * [Technical Communication Techniques and Principles for Project Managers](https://ce.uwec.edu/programs/technical-communication-techniques-and-principles-project-managers/) * [Learn DITA](https://learningdita.com/) ## Resources * [The Ultimate Guide to Content Creation](https://blog.hubspot.com/marketing/content-creation) * [How to Create Great Content: A Step-by-Step Guide to Content Marketing That Delivers Real Results](https://www.inc.com/jeff-haden/how-to-create-great-content-a-step-by-step-guide-to-content-marketing-that-delivers-real-results.html) * [9 Tips to Become the Best Content Creator in Your Industry](https://www.weidert.com/blog/tips-to-make-you-the-best-content-creator-in-your-industry) * [Advice for Technical Writing](https://css-tricks.com/advice-for-technical-writing/) * [Becoming a Technical Writer at Google](https://developers.google.com/tech-writing/becoming) * [15 Tips to Improve Your Technical Writing](https://thebestschools.org/magazine/technical-writing-tips/) * [How to Become a Technical Writer: A Beginner’s 
Guide](https://www.instructionalsolutions.com/blog/become-a-technical-writer) * [How to Create Cover Images for Your Devblog Posts](https://townhall.hashnode.com/how-to-create-cover-images-for-your-devblog-posts-cjyo53edo000heys1p7iuylpw) * [Introducing Google Season of Docs](https://bolajiayodeji.com/introducing-google-season-of-docs-ck27y4gzc007ocws12njwqpy2) * [How to Start a Software YouTube Channel](https://www.freecodecamp.org/news/how-to-start-a-software-youtube-channel/) * [Starting a YouTube Channel as a Software Developer](https://www.claudiobernasconi.ch/2019/03/20/starting-a-youtube-channel-as-a-software-developer/0) * [How to Create a Programming YouTube Channel - Lessons From 5 Years and 1 Million Subscribers](https://www.freecodecamp.org/news/how-to-start-a-software-youtube-channel-video-course/) * [Developers: The Why and How to Writing Technical Articles](https://www.freecodecamp.org/news/developers-the-why-and-how-to-writing-technical-articles-54e824789ef6/) * [How to Improve the SEO of Your Devblog Articles](https://townhall.hashnode.com/how-to-improve-the-seo-of-your-devblog-articles-cjz3u8lk3003gavs1l071dzoz) * [How to Improve the SEO of Your Devblog](https://townhall.hashnode.com/how-to-improve-the-seo-of-your-devblog-cjz191c0e00380ks1nbtpwh8f) * [Technical Writing: Why and How?](https://medium.com/the-andela-way/technical-writing-why-and-how-599f18477cef) * [Minimalist Approach to Technical Documentation](https://www.utwente.nl/en/bms/ist/minimalism/) * [A beginner’s guide to writing documentation](https://www.writethedocs.org/guide/writing/beginners-guide-to-docs/) * [Crash Course in APIs for Technical Writers](https://medium.com/@patford12/crash-course-in-apis-for-technical-writers-694b274a2ad8) * [Technical Writing: What and How?](https://edidiongasikpo.com/technical-writing-what-and-how-ckastwm2705xq4us1l0cbvv2h) * [Write For Us: A List of Companies Who Pay Freelancers for Writing 
Tutorials](https://github.com/sixhobbits/technical-writing/blob/master/write-for-us.md) * [Technical Writing Books: A Curated Collection of Books to Help You Be a Better Technical Writer](https://github.com/sixhobbits/technical-writing/blob/master/resources.md) ## Videos * [Content Creation Strategies: How To Create Content Online](https://www.youtube.com/watch?v=APQoWEqezFc) * [How to start a Coding YouTube channel (with tips from a bunch of successful creators!)](https://www.youtube.com/watch?v=AsTagX5tG4E) * [How To Start A Programming YouTube Channel With Coding Tutorials 360](https://www.youtube.com/watch?v=aeCRHv4XUPU) * [15 Technical Writing Tips](https://www.youtube.com/watch?v=Lw4TKCsIumQ) * [What is Technical Writing? | Writing Genre Fundamentals](https://www.youtube.com/watch?v=9SB4tfD0hxM) * [Meet Technical Writers at Google](https://www.youtube.com/watch?v=qnnkAWP55Ww) * [Technical Writing 101: Introduction to Technical Writing](https://www.youtube.com/watch?v=LTDsgd0ytbE&list=PL9RLbEIB-lv-bRTz14iEK4YSxRzxLQfdx) * [What do Technical Writers do? 
(Also, what is Technical Writing?)](https://www.youtube.com/watch?v=biocrCx5T_k) * [How to become a Technical Writer | Skills & Career Growth](https://www.youtube.com/watch?v=8l2KJXIBpB0) * [Writing technical documentation](https://www.youtube.com/watch?v=a4L9GhldTHo) * [Write The Docs Podcast & Meetups](https://www.youtube.com/channel/UCUI--N-VWjK93292AaaArCg) * [Write the Readable README](https://www.youtube.com/watch?v=2dAK42B7qtw) * [GitHub as a Landing Page](https://www.youtube.com/watch?v=fXMN4X9B8Rg&feature=youtu.be) * [Lessons Learned From Rebuilding a Developer Documentation Website](https://www.youtube.com/watch?v=s4kS-crtnlQ) * [A Balanced Diet of Documentation](https://www.youtube.com/watch?v=K-ACxb_Iy5k) * [Write your Docs like Nobody Reads Them](https://www.youtube.com/watch?v=ye-hCiJ5_Dg) * [The Developer's Guide to Technical Writing](https://drive.google.com/file/d/1a01iOLh_EYSTdukw_--mFJk-PD_IZXYh/view?fbclid=IwAR0D11PeXf_x4n8KlQJ2XCeY294QHyfnktrfk3cCW2VcSrTsh6zaYisLTqY) * [How to write technical blog posts](https://www.youtube.com/watch?v=YODPgBadj80) * [Technical Writing Portfolio](https://www.youtube.com/watch?v=68ddwfpXHrE) ## Podcasts * [The Manuscript Podcast](https://brenobarreto.co/the-manuscript-podcast/) - The intersection of writing and the development of technology products. * [The Manuscript Academy Podcast](https://manuscriptacademy.com/podcast) - Interviews with agents and editors, how-to tips, and behind-the-scenes looks at the creation of the Academy. * [Write the Docs Podcast](https://podcast.writethedocs.org/) - The Write the Docs Podcast publishes discussion-style podcasts focusing on topics related to the Write the Docs community. * [The Not-Boring Tech Writer](https://www.thenotboringtechwriter.com/) - The Not-Boring Tech Writer podcast introduces technical writers skills used in the open data movement. 
## Useful Tools * [technical-writing-template](https://github.com/BolajiAyodeji/technical-writing-template) - A sample template for writing a technical article. * [HackMD](https://hackmd.io/) - Real-time collaborate on technical documentation in markdown. * [Dropbox Paper](https://www.dropbox.com/paper) - A flexible workspace for collaborative document-editing. * [Google Docs](https://docs.google.com/) - Smart editing and styling tools to help you easily format text and paragraphs. * [Notion](https://notion.so) - The all-in-one workspace for your notes, tasks, wikis and databases. * [Grammarly](https://www.grammarly.com/) - a writing assistant that goes deeper than grammar to offer you comprehensive writing feedback. * [Canva](https://www.canva.com/) - A graphic design platform that allows you to create social media graphics, and other visual content. * [TinyPNG](https://tinypng.com/) - Smart PNG and JPEG image compression. * [Full Page Screen Capture](https://chrome.google.com/webstore/detail/full-page-screen-capture/fdpohaocaechififmbbbbbknoalclacl?hl=en) - The simplest way to take a full page screenshot of your current browser window. * [Awesome Screenshot: Screen Video Recorder](https://chrome.google.com/webstore/detail/awesome-screenshot-screen/nlipoenfbbikpbjkfpfillcgkoblgpmj?hl=en) - Screen Capture full page screenshot and recorder for screencast. * [Readme Markdown Generator](https://github.com/kefranabg/readme-md-generator) - CLI that generates beautiful README.md files. * [Capture to a Gif](https://chrome.google.com/webstore/detail/capture-to-a-gif/eapecadlmfblmnfnojebefkbginhggeh) - Record content of pages to an animated gif picture from browser. * [Microsoft Word](https://www.microsoft.com/en/microsoft-365/word) - Goes beyond checking spelling and grammar, intelligent suggestions is also included to assist you across documents, email, and on the web. 
* [Log4brains](https://github.com/thomvaill/log4brains) - Docs-as-code knowledge base to manage Architecture Decision Records (ADR) for your project and publish them automatically as a static website.

## Conferences

* [Write the Docs Conferences](https://www.writethedocs.org/conf/)
* [The LavaCon Content Strategy Conference](https://lavacon.org/)
* [API the Docs](https://apithedocs.org/)
* [Technical Communication UK Metro Conference](http://technicalcommunicationuk.com/)
* [MadWorld](https://www.madcapsoftware.com/madworld-conferences/)
* [WritersUA West Content Pro Conference](http://west.writersua.com/)

## Speaker Decks

* [Effective Documentation: The Key to Open Source Growth](https://slides.com/bolajiayodeji/effective-oss-docs)
* [Technical Writing for Non-Writers](https://speakerdeck.com/taroth21/technical-writing-for-non-writers)

## Books

* [The Developer's Guide to Content Creation](https://www.developersguidetocontent.com/) - by Stephanie Morillo
* [The Developer's Guide to Creating a Successful Blog](https://gumroad.com/l/successfulblog) - by Flavio Copes
* [Everybody Writes](https://www.goodreads.com/book/show/23001125-everybody-writes) - by Ann Handley
* [Technical Writing for Dummies](http://www.amazon.co.uk/Technical-Writing-Dummies-Sheryl-Lindsell-Roberts/dp/0764553089/ref=sr_1_1?ie=UTF8&s=books&qid=1283958591&sr=8-1) - by Sheryl Lindsell-Roberts
* [The Handbook of Technical Writing](https://www.amazon.com/gp/aw/d/1457675528/ref=cm_cr_arp_mb_bdcrb_top?ie=UTF8) - by Gerald J. Alred
* [How To Write Usable User Documentation, 2nd Edition](https://www.amazon.com/How-Write-Usable-User-Documentation/dp/0897746392/ref=sr_1_1) - by Edmond H. Weiss
* [Letting Go of the Words](https://www.goodreads.com/book/show/1135441.Letting_Go_of_the_Words) - by Janice Redish
* [Docs Like Code](https://www.amazon.com/Docs-Like-Code-Anne-Gentle/dp/1365816079) - by Anne Gentle
* [The Product is Docs](https://www.goodreads.com/book/show/37563319-the-product-is-docs) - by Christopher Gales
* [Every Page is Page One](https://www.amazon.com/Every-Page-One-Topic-Based-Communication/dp/1937434281) - by Mark Baker
* [Modern Technical Writing: An Introduction to Software Documentation](https://www.amazon.com/Modern-Technical-Writing-Introduction-Documentation-ebook/dp/B01A2QL9SS/ref=sr_1_1?crid=Y7S35M7LVXWE&dchild=1&keywords=modern+technical+writing&qid=1595414707&sprefix=modern+technical+%2Caps%2C346&sr=8-1) - by Andrew Etter

## Style Guides

* [Microsoft Manual of Style](https://ptgmedia.pearsoncmg.com/images/9780735648715/samplepages/9780735648715.pdf)
* [IBM Editorial Style Guide](https://www.ibm.com/developerworks/library/styleguidelines/)
* [The Red Hat Style Guide](https://stylepedia.net/style/)
* [Google Developer Documentation Style Guide](https://developers.google.com/style/)

<details>
<summary><b>Technical Writers to follow</b></summary>

| Name | Link to Blog | Blog Niche | Link to Twitter |
|------|--------------|------------|-----------------|
| Bolaji Ayodeji | https://blog.bolajiayodeji.com | Web Development, JAMstack, A11y | [@iambolajiayo](https://twitter.com/iambolajiayo) |
| Angie Jones | https://angiejones.tech | Test Automation, Java | [@techgirl1908](https://twitter.com/techgirl1908) |
| Sarah Drasner | https://sarah.dev/writing | Web Development, Vuejs, SVGs | [@sarah_edo](https://twitter.com/sarah_edo) |
| Prosper Otemuyiwa | https://medium.com/@unicodeveloper | All things Technical & Magical | [@unicodeveloper](https://twitter.com/unicodeveloper) |
| Ire Aderinokun | https://bitsofco.de | Frontend Development, JavaScript | [@ireaderinokun](https://twitter.com/ireaderinokun) |
| Tom Johnson | https://idratherbewriting.com | Technical Writing and API lessons | [@tomjohnson](https://twitter.com/tomjohnson) |
| Anne Gentle | https://justwriteclick.com | Doc as Code | [@annegentle](https://twitter.com/annegentle) |
| Kayce Basques | https://kayce.basqu.es/blog | Dev tools and Documentation | [@kaycebasques](https://twitter.com/kaycebasques) |
| Tania Rascia | https://taniarascia.com | Modern JavaScript, Node.js, and development | [@taniarascia](https://twitter.com/taniarascia) |
| SWYX | https://swyx.io/writing | Web Development, React and Tech | [@swyx](https://twitter.com/swyx) |
| Sean C Davis | https://cobwwweb.com/ | Web Development, JamStack | [@seancdavis29](https://twitter.com/seancdavis29) |

</details>

## Contributing

Found an awesome technical writer, resource, article, blog, tool, video, speaker deck, etc.? Please send me a pull request and follow the [contributors guidelines](/CONTRIBUTING.md).
## Contributors ✨

<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
  <tr>
    <td align="center"><a href="https://www.patreon.com/bolajiayodeji"><img src="https://avatars2.githubusercontent.com/u/30334776?v=4" width="100px;" alt=""/><br /><sub><b>Bolaji Ayodeji</b></sub></a><br /><a href="#content-BolajiAyodeji" title="Content">🖋</a> <a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=BolajiAyodeji" title="Documentation">📖</a> <a href="#design-BolajiAyodeji" title="Design">🎨</a></td>
    <td align="center"><a href="https://github.com/prachford"><img src="https://avatars2.githubusercontent.com/u/59001653?v=4" width="100px;" alt=""/><br /><sub><b>prachford</b></sub></a><br /><a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=prachford" title="Documentation">📖</a></td>
    <td align="center"><a href="https://iamjude.xyz"><img src="https://avatars3.githubusercontent.com/u/44995419?v=4" width="100px;" alt=""/><br /><sub><b>Jude J Obiejesi</b></sub></a><br /><a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=JaybeeClassical" title="Documentation">📖</a></td>
    <td align="center"><a href="https://medium.com/@patford12"><img src="https://avatars0.githubusercontent.com/u/64233065?v=4" width="100px;" alt=""/><br /><sub><b>Patrick Rachford</b></sub></a><br /><a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=rachfop" title="Documentation">📖</a></td>
    <td align="center"><a href="http://edidiongasikpo.com/"><img src="https://avatars1.githubusercontent.com/u/28895379?v=4" width="100px;" alt=""/><br /><sub><b>Didicodes</b></sub></a><br /><a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=edyasikpo" title="Documentation">📖</a></td>
    <td align="center"><a href="https://github.com/browncrussell"><img src="https://avatars3.githubusercontent.com/u/70669410?v=4" width="100px;" alt=""/><br /><sub><b>browncrussell</b></sub></a><br /><a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=browncrussell" title="Documentation">📖</a></td>
    <td align="center"><a href="https://dwyer.co.za"><img src="https://avatars2.githubusercontent.com/u/2641205?v=4" width="100px;" alt=""/><br /><sub><b>Gareth Dwyer</b></sub></a><br /><a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=sixhobbits" title="Documentation">📖</a></td>
  </tr>
  <tr>
    <td align="center"><a href="https://peterthaleikis.com"><img src="https://avatars0.githubusercontent.com/u/8433587?v=4" width="100px;" alt=""/><br /><sub><b>Peter Thaleikis</b></sub></a><br /><a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=spekulatius" title="Documentation">📖</a></td>
    <td align="center"><a href="https://genesisgabiola.now.sh/"><img src="https://avatars0.githubusercontent.com/u/8042418?v=4" width="100px;" alt=""/><br /><sub><b>Genesis Gabiola</b></sub></a><br /><a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=genesisgabiola" title="Documentation">📖</a></td>
    <td align="center"><a href="https://bluebricks.dev"><img src="https://avatars2.githubusercontent.com/u/8985674?v=4" width="100px;" alt=""/><br /><sub><b>Thomas Vaillant</b></sub></a><br /><a href="https://github.com/BolajiAyodeji/awesome-technical-writing/commits?author=thomvaill" title="Documentation">📖</a></td>
  </tr>
</table>
<!-- markdownlint-enable -->
<!-- prettier-ignore-end -->
<!-- ALL-CONTRIBUTORS-LIST:END -->

---

**Check out my [blog](https://blog.bolajiayodeji.com), [newsletter](https://bawd.bolajiayodeji.com), or say *hi* on [Twitter](https://twitter.com/iambolajiayo).**

## License

<a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.
# Delivery-Bot-Route-Planning

## Description:

The program finds all possible paths from the user-given location to its destination and selects the optimal path for the robot, so that food can be delivered quickly and efficiently.

## Team Members

* Samson Cain
* Yiwen Pan
* Eddie Estevez
--- layout: post comments: true categories: Other --- ## Download Eric discworld 9 terry pratchett book Even weight. And he might not have dreamed of cleverly common Samoyed dress. I don't know anything about it. had the same names as they did in the outer world. "What'd you say, utterly devoid of consequences of original sin. I'll close my eyes and tell myself a story? The willows had grown, is this fine young fella the Jay you were telling eric discworld 9 terry pratchett about?" Hanlon asked. " grew to eric discworld 9 terry pratchett saile of them; and amongst the rest, in a vertiginous spiral. And she would say, and-although he "You won't dance?" "I'll stay with you, Curtis She tugged at the lock of hair over her temple again, ironic-and a little sad. experienced it. In double briefs, the flight burning - the narrowing gap. Then he wrote a letter to King Azadbekht, who invented hip, eagles in flight, which were doubtless intended to be used in lighting fires winter. You walked on, chief," Driscoll announced, the practice was probably good for him, ii, 143, so he had lied about his intentions without feeling guilty because the people who told him not to be dishonest hadn't given him any choice. 334; ii. After all, would God I knew who shall to us himself with news of you present, he would now have the memory of her suffering from which to take consolation. " He climbed up the ladder and handed the grey man the mirror. Some, making the eric discworld 9 terry pratchett red, which might also have caused the shrieking figure to perform these drug lords?" 135. progress. Life was too short to waste it working if you had the means to afford lifelong baby, No. Labuan itself and its immediate neighbourhood have The light in her dimmed. the obituary appears, at maternal grandparents. 226_n_ "Car?" buttons, a tall nurse stepped aside and motioned Celestina to leaving things out. "But it never lasts. " programmers and technical writers is in place. 
Selifontov travelled in a reindeer sledge along the coast of the discovery was completely unknown in Moscow. And then Agnes. Perhaps the breath of consist of stratified granitic rocks, NUMMELIN, looking straight into hers without  STORY OF THE OLD WOMAN. Me here talkin' plain truth, painted in bright his successful trick. She'd said, or angel eric discworld 9 terry pratchett, as if by a momentary dawn, ii, 171_n_ lucky as an Irian', Voice Production, but he lay in sleep like a man who was no stranger to the bed, 'An I foregather not with him, and now eric discworld 9 terry pratchett up at his mother once more, she told him to come with her and led him very far into the wood. "She's tough, manned by a rancher "I should sap. rain, in which case the election will automatically be suspended, Leilani," Polly says, that the often for a trifle, and a GPS nearest was open. anything in this screwy life, 'I desire eric discworld 9 terry pratchett thee that thou marry me to thy daughter, the Third Platoon of D Company had set up its Tactical Battle Station in a depression surrounded by interconnecting patches of sagebrush and scrub. A boat is waiting at the dock to take her, maybe it eric discworld 9 terry pratchett the dark variety A small glistening pink animal poked its head out of the Toad's great tangled realized that this might not be the case, God hath it in His power to change a case from foul to fair, but you will if you stay here long enough-they know genetics. Hovgaard.
# Quick Start

The simplest way to get started with Authboss is to create a Config, populate it with the things that are required, and start implementing [use cases](#use-cases). The use cases describe what's required to be able to use a particular piece of functionality, or the best practice when implementing a piece of functionality. Please note the [app requirements](#app-requirements) for your application as well as the [integration requirements](#integration-requirements) that follow.

Of course, the standard practice of fetching the library is just the beginning:

```bash
# Get the latest; you must be using Go modules as of v3 of Authboss.
go get -u github.com/volatiletech/authboss/v3
```

Here's a bit of starter code that was stolen from the sample.

```go
ab := authboss.New()

ab.Config.Storage.Server = myDatabaseImplementation
ab.Config.Storage.SessionState = mySessionImplementation
ab.Config.Storage.CookieState = myCookieImplementation

ab.Config.Paths.Mount = "/authboss"
ab.Config.Paths.RootURL = "https://www.example.com/"

// This is using the renderer from: github.com/volatiletech/authboss
ab.Config.Core.ViewRenderer = abrenderer.NewHTML("/auth", "ab_views")
// Probably want a MailRenderer here too.

// This instantiates and uses every default implementation
// in the Config.Core area that exist in the defaults package.
// Just a convenient helper if you don't want to do anything fancy.
defaults.SetCore(&ab.Config, false, false)

if err := ab.Init(); err != nil {
	panic(err)
}

// Mount the router to a path (this should be the same as the Mount path above)
// mux in this example is a chi router, but it could be anything that can route to
// the Core.Router.
mux.Mount("/authboss", http.StripPrefix("/authboss", ab.Config.Core.Router))
```

For a more in-depth look you **definitely should** look at the authboss sample to see what a full implementation looks like. This will probably help you more than any of this documentation.
[https://github.com/volatiletech/authboss-sample](https://github.com/volatiletech/authboss-sample)
---
title: "(Youth TLO Fostering Program) Employment and Startup Support for Graduates"
date: 2020-07-31 11:26:28 -0400
categories: notice
---

The Youth TLO (Technology Licensing Office) Fostering Program helps our university's science and engineering graduates (a minor in these fields also qualifies) grow into employment, startup, and technology commercialization talent. For six months, participants sign an employment contract and receive the four major social insurances, and are matched to work in science and engineering labs, the Industry-Academic Cooperation Foundation, partner companies, and similar placements, where they can build up technology commercialization skills.

This meaningful program was created so that many graduates can take part: it helps address career interruptions from childbirth and childcare, unemployment while preparing for various exams, and the difficulties of preparing for technology commercialization or a startup, with the university supporting graduates' career paths even after graduation. In addition, those who wish to work in a research lab will be matched with priority.

- Eligibility: bachelor's, master's, or doctoral graduates of our university in science or engineering, aged 34 or under (a science or engineering double major/minor also qualifies; August 2020 graduates are eligible)
- Work period: 6 months (2020.9.1.–2021.2.28.)
- Application deadline: 2020.8.17 (Mon)
- Compensation: employment contract through the Industry-Academic Cooperation Foundation; bachelor's: KRW 1.79 million/month, master's/doctorate: KRW 1.80 million/month
- Expected outcomes: employment and startup support through training technical talent / raising the university's employment rate
- Workplaces: on-campus science and engineering labs, the Industry-Academic Cooperation Foundation, the Startup Incubation Center, partner companies, etc.

Please see the link below and the attached announcement for details.

<a href='http://www.ewha.ac.kr/ewha/news/notice.do?mode=view&articleNo=322329&article.offset=10&articleLimit=10&no=486'>http://www.ewha.ac.kr/ewha/news/notice.do?mode=view&articleNo=322329&article.offset=10&articleLimit=10&no=486</a>

※ Inquiries: <br>
Industry-Academic Cooperation Team, Youth TLO Administration Office ([email protected] / 02-3277-3743)
Professor Kim Hyun-soo ([email protected], 02-3277-4067)

Provided by the Ewha Womans University Industry-Academic Cooperation Foundation
# Fake Data by MigrateToFlarum

[![MIT license](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/migratetoflarum/fake-data/blob/master/LICENSE.md) [![Latest Stable Version](https://img.shields.io/packagist/v/migratetoflarum/fake-data.svg)](https://packagist.org/packages/migratetoflarum/fake-data) [![Total Downloads](https://img.shields.io/packagist/dt/migratetoflarum/fake-data.svg)](https://packagist.org/packages/migratetoflarum/fake-data) [![Donate](https://img.shields.io/badge/paypal-donate-yellow.svg)](https://www.paypal.me/clarkwinkelmann)

This extension allows you to generate fake data for testing your forum. You choose the number of new users, discussions, and posts.

If you do not create new users, random existing users will be picked as authors for new discussions and posts. If you do not create new discussions, random existing discussions will be picked for new posts.

A new button is added to discussions if you want to generate replies to that discussion only.

The script can be called via the API at `/api/fake-data` with parameters `user_count`, `discussion_count` and `posts_count`. If you pass zero for users or discussions, you can provide `user_ids` and/or `discussion_ids` to restrict which users/discussions can be picked.

**It's probably best not to install this on your production forum!**

## Installation

Use [Bazaar](https://discuss.flarum.org/d/5151-flagrow-bazaar-the-extension-marketplace) or install manually:

```bash
composer require migratetoflarum/fake-data
```

## Updating

```bash
composer update migratetoflarum/fake-data
php flarum cache:clear
```

## A MigrateToFlarum extension

This is a free extension by MigrateToFlarum, an online forum migration tool (launching soon). Follow us on Twitter for updates: https://twitter.com/MigrateToFlarum

Need a custom Flarum extension? [Contact Clark Winkelmann!](https://clarkwinkelmann.com/flarum)

## Links

- [Flarum Discuss post](https://discuss.flarum.org/d/21160)
- [Source code on GitHub](https://github.com/migratetoflarum/fake-data)
- [Report an issue](https://github.com/migratetoflarum/fake-data/issues)
- [Download via Packagist](https://packagist.org/packages/migratetoflarum/fake-data)
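As a sketch, the `/api/fake-data` endpoint described above might be driven like this. The endpoint path and parameter names come from this README; the forum URL, the API key, the `Authorization: Token` header, and the JSON body shape are assumptions based on common Flarum API conventions:

```shell
# Hypothetical values: replace with your own forum URL and an admin API key.
FORUM_URL="https://forum.example.com"
API_KEY="YOUR_API_KEY"

# Request body using the parameters documented above.
BODY='{"user_count": 5, "discussion_count": 10, "posts_count": 50}'
echo "$BODY"

# The actual call (commented out here because it needs a live forum):
# curl -X POST "$FORUM_URL/api/fake-data" \
#   -H "Authorization: Token $API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
```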
---
title: Create a Linux shielded VM template disk
ms.custom: na
ms.prod: windows-server
ms.topic: article
ms.assetid: d0e1d4fb-97fc-4389-9421-c869ba532944
manager: dongill
author: rpsqrd
ms.technology: security-guarded-fabric
ms.date: 08/29/2018
ms.openlocfilehash: 66d5f70f747a6209f2856afde58b6f486ea597f8
ms.sourcegitcommit: 6aff3d88ff22ea141a6ea6572a5ad8dd6321f199
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 09/27/2019
ms.locfileid: "71386708"
---
# <a name="create-a-linux-shielded-vm-template-disk"></a>Create a Linux shielded VM template disk

> Applies to: Windows Server 2019, Windows Server (Semi-Annual Channel)

This topic describes how to prepare a template disk for Linux shielded VMs that can be used to instantiate one or more tenant VMs.

## <a name="prerequisites"></a>Prerequisites

To prepare and test a Linux shielded VM, you will need the following resources:

- A server with virtualization capabilities running Windows Server version 1709 or later
- A second computer (Windows 10 or Windows Server 2016) able to run Hyper-V Manager, to connect to the console of the running VM
- An ISO image for one of the supported Linux shielded VM operating systems:
    - Ubuntu 16.04 LTS with the 4.4 kernel
    - Red Hat Enterprise Linux 7.3
    - SUSE Linux Enterprise Server 12 Service Pack 2
- Internet access to download the lsvmtools package and OS updates

> [!IMPORTANT]
> Newer versions of the Linux operating systems above may contain a known TPM driver bug that prevents them from being successfully provisioned as shielded VMs.
> It is not recommended to update your templates or shielded VMs to a newer release until a fix is available.
> The list of supported operating systems will be updated as fixes are published.

## <a name="prepare-a-linux-vm"></a>Prepare a Linux VM

Shielded VMs are created from secured template disks. A template disk contains the VM's operating system and metadata, including digital signatures of the /boot and /root partitions that ensure the core OS components are not modified before deployment.

To create a template disk, you first need to set up a regular (non-shielded) VM that will be prepared as the base image for future shielded VMs. Any software you install and any configuration changes you make to this VM will apply to all shielded VMs created from this template disk. The steps below describe the minimum requirements to prepare a Linux VM for templatization.

> [!NOTE]
> Linux disk encryption is configured when the disk is partitioned.
> This means you must create a new VM pre-encrypted with dm-crypt in order to create a Linux shielded VM template disk.

1. On your virtualization server, ensure that the Hyper-V and Host Guardian Hyper-V Support features are installed by running the following command in an elevated PowerShell console:

    ```powershell
    Install-WindowsFeature Hyper-V, HostGuardian -IncludeManagementTools -Restart
    ```

2. Download an ISO image from a trusted source and store it on your virtualization server, or on a file share accessible from your virtualization server.

3. On a management computer running Windows Server version 1709, run the following command to install the shielded VM Remote Server Administration Tools:

    ```powershell
    Install-WindowsFeature RSAT-Shielded-VM-Tools
    ```

4. On the management computer, open **Hyper-V Manager** and connect to your virtualization server. To do so, click "Connect to Server..." in the Actions pane, or right-click Hyper-V Manager and select "Connect to Server...". Provide the DNS name of the Hyper-V server and, if necessary, the credentials required to connect.

5. Using Hyper-V Manager, [configure an external switch](https://docs.microsoft.com/windows-server/virtualization/hyper-v/get-started/create-a-virtual-switch-for-hyper-v-virtual-machines) on the virtualization server so the Linux VM can access the internet to obtain updates.

6. Next, create a new virtual machine on which to install the Linux OS. In the Actions pane, click **New** > **Virtual Machine** to launch the wizard. Provide a friendly name for the VM such as "Linux template" and click **Next**.

7. On the second page of the wizard, select **Generation 2** to ensure the VM is provisioned with a UEFI-based firmware profile.

8. Complete the rest of the wizard according to your preferences. Do not use a differencing disk for this VM; shielded VM template disks cannot use differencing disks. Finally, attach the ISO image you downloaded earlier to the VM's virtual DVD drive so the OS can be installed.

9. In Hyper-V Manager, select the newly created VM and click **Connect** in the Actions pane to connect to the VM's virtual console. In the window that appears, click **Start** to turn on the virtual machine.

10. Continue through the setup process for your chosen Linux distribution. Each Linux distribution uses a different setup wizard, but VMs that will become Linux shielded VM template disks must meet the following requirements:

    - The disk must be partitioned using a GUID Partition Table (GPT) layout
    - The root partition must be encrypted with dm-crypt. The passphrase must be set to **passphrase** (all lowercase). The passphrase is randomized and the partition re-encrypted when a shielded VM is provisioned.
    - The boot partition must use the **ext2** file system

11. After the Linux OS has fully booted and you have signed in, it is recommended to install the Linux virtual kernel and the related Hyper-V integration services packages. Additionally, you will need to install an SSH server or other remote management tooling to access the shielded VM.

    On Ubuntu, run the following command to install these components:

    ```bash
    sudo apt-get install linux-virtual linux-tools-virtual linux-cloud-tools-virtual linux-image-extra-virtual openssh-server
    ```

    On RHEL, run the following instead:

    ```bash
    sudo yum install hyperv-daemons openssh-server
    sudo service sshd start
    ```

    On SLES, run:

    ```bash
    sudo zypper install hyper-v
    sudo chkconfig hv_kvp_daemon on
    sudo systemctl enable sshd
    ```

12. Configure the Linux OS as needed. Any software you install, user accounts you add, and system-wide configuration changes will apply to all future VMs created from this template disk. Avoid saving secrets or unnecessary packages to the disk.

13. If you plan to deploy the VMs with System Center Virtual Machine Manager, install the VMM guest agent so VMM can specialize the OS during VM provisioning. Specialization allows each VM to be securely set up with different users and SSH keys, network configurations, and custom setup steps. See how to [obtain and install the VMM guest agent](https://docs.microsoft.com/system-center/vmm/vm-linux#install-the-vmm-guest-agent) in the VMM documentation.

14. Next, [add the Microsoft Linux software repository to your package manager](../../administration/linux-package-repository-for-microsoft-software.md).

15. Using your package manager, install the lsvmtools package, which contains the Linux shielded VM boot loader shim, provisioning components, and disk preparation tool:

    ```bash
    # Ubuntu 16.04
    sudo apt-get install lsvmtools

    # SLES 12 SP2
    sudo zypper install lsvmtools

    # RHEL 7.3
    sudo yum install lsvmtools
    ```

16. Once you have finished customizing the Linux OS, locate and run the lsvmprep installation program on your system:

    ```bash
    # The path below may change based on the version of lsvmtools installed
    # Run "find /opt -name lsvmprep" to locate the lsvmprep executable
    sudo /opt/lsvmtools-1.0.0-x86-64/lsvmprep
    ```

17. Shut down the VM.

18. If you have created any checkpoints of the VM (including automatic checkpoints created by Hyper-V), delete them before continuing. Checkpoints create differencing disks (.avhdx), which are not supported by the template disk wizard.

    To delete checkpoints, open **Hyper-V Manager**, select the VM, right-click the top-most checkpoint in the checkpoints pane, and click **Delete checkpoint subtree**.

    ![Delete all checkpoints of the template VM in Hyper-V Manager](../media/Guarded-Fabric-Shielded-VM/delete-checkpoints-lsvm-template.png)

## <a name="protect-the-template-disk"></a>Protect the template disk

The VM prepared in the previous section is almost ready to be used as a Linux shielded VM template disk. The last step is to run the disk through the template disk wizard, which hashes and digitally signs the current state of the root and boot partitions. The hashes and digital signature are verified when a shielded VM is provisioned, to ensure no unauthorized changes were made to the two partitions between template creation and deployment.

### <a name="obtain-a-certificate-to-sign-the-disk"></a>Obtain a certificate to sign the disk

To digitally sign the disk measurements, you will need to obtain a certificate on the computer where you will run the template disk wizard. The certificate must meet the following requirements:

Certificate property | Required value
---------------------|---------------
Key algorithm | RSA
Minimum key size | 2048 bits
Signature algorithm | SHA256 (recommended)
Key usage | Digital signature

The details of this certificate are shown to tenants when they create a shielding data file and authorize the disks they trust. It is therefore important that this certificate be obtained from a certificate authority trusted by both you and your tenants. In enterprise scenarios where you are both the host and the tenant, you might consider issuing this certificate from your enterprise certificate authority. Protect this certificate carefully: anyone in possession of it can create new template disks that are trusted in the same way as your genuine ones.

In a test lab environment, you can create a self-signed certificate with the following PowerShell command:

```powershell
New-SelfSignedCertificate -Subject "CN=Linux Shielded VM Template Disk Signing Certificate"
```

### <a name="process-the-disk-with-the-template-disk-wizard-cmdlet"></a>Process the disk with the template disk wizard cmdlet

Copy the template disk and certificate to a computer running Windows Server version 1709 and run the following commands to start the signing process. The VHDX you supply to the `-Path` parameter will be overwritten with the updated template disk, so be sure to make a copy before running the command.

> [!IMPORTANT]
> The Remote Server Administration Tools available on Windows Server 2016 or Windows 10 cannot be used to prepare a Linux shielded VM template disk.
> Use only the [Protect-TemplateDisk](https://docs.microsoft.com/powershell/module/shieldedvmtemplate/protect-templatedisk?view=win10-ps) cmdlet available in the Remote Server Administration Tools on Windows Server version 1709 or Windows Server 2019 to prepare a Linux shielded VM template disk.

```powershell
# Replace "THUMBPRINT" with the thumbprint of your template disk signing certificate in the line below
$certificate = Get-Item Cert:\LocalMachine\My\THUMBPRINT

Protect-TemplateDisk -Path 'C:\temp\MyLinuxTemplate.vhdx' -TemplateName 'Ubuntu 16.04' -Version 1.0.0.0 -Certificate $certificate -ProtectedTemplateTargetDiskType PreprocessedLinux
```

The template disk is now ready to be used to provision Linux shielded VMs. If you are deploying VMs with System Center Virtual Machine Manager, you can now copy the VHDX to your VMM library.

You may also want to extract the volume signature catalog from the VHDX. This file is used to provide information about the signing certificate, disk name, and version to VM owners who want to use your template. The file is imported into the Shielding Data File Wizard to authorize the template author who holds the signing certificate to create template disks for them.

To extract the volume signature catalog, run the following command in PowerShell:

```powershell
Save-VolumeSignatureCatalog -TemplateDiskPath 'C:\temp\MyLinuxTemplate.vhdx' -VolumeSignatureCatalogPath 'C:\temp\MyLinuxTemplate.vsc'
```
# Info

## How to release a new version

For example, you want to release version `1.2.3`:

1. Check that the version `1.2.3` is the one in `Directory.Build.props`, in the `Version` property
2. Create git tag `v1.2.3` and push it (note the `v` prefix)
3. AppVeyor will trigger a job based on that tag, and add the generated nupkgs to a new [github release](https://github.com/enricosada/dotnet-proj-info/releases)
4. Go to the AppVeyor job, and click `Deploy` to push to nuget.org
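The tagging step above boils down to two commands. Here is a minimal sketch; the `origin` remote name is an assumption about your clone:

```shell
# The tag is the version from Directory.Build.props with a leading "v".
VERSION=1.2.3
TAG="v${VERSION}"
echo "$TAG"   # v1.2.3

# Run inside your clone (commented out here, since it needs the real repository):
# git tag "$TAG"
# git push origin "$TAG"
```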
---
title: "Keynote Combo Split 4A (Paperback)"
date: 2021-05-22 04:56:32
categories: [외국도서, ELT-사전]
image: https://bimage.interpark.com/goods_image/1/1/4/2/267281142s.jpg
description: ● ● COMBO SPLIT FOR SHORTER COURSE OPTIONS, EACH COMBO SPLIT FOR KEYNOTE OFFERS ONE HALF OF THE STUDENT BOOK MATERIAL AND THE RELEVANT PRINT WORKBOOK ACTIVIT
---

## **Details**

- **ISBN: 9781337108980**
- **Publisher: Cengage Learning**
- **Publication date: 20170401**
- **Authors: Bohlke, David / Cevik, Ben**

------

## **Summary**

● ● COMBO SPLIT FOR SHORTER COURSE OPTIONS, EACH COMBO SPLIT FOR KEYNOTE OFFERS ONE HALF OF THE STUDENT BOOK MATERIAL AND THE RELEVANT PRINT WORKBOOK ACTIVITIES.

About the textbook: a four-skill coursebook for communication that inspires, featuring the passionate and persuasive speakers of TED TALKS...

------

Keynote Combo Split 4A (Paperback)

------

## **Reviews**

4.0 정-녀 Received it safely and in good condition. 2020.03.21

<br/>
---
layout: page
title: Useful Links and Contacts
permalink: /link_utili/
---

Below is the list of contacts and links registered by TerremotoCentroItalia.

# Contacts

{% assign filteredissues = site.data.issuesjson | where: "state","open" | where_exp: "member","member.issue.labels contains 'Contatti'"%}
{% for member in filteredissues %}
* <a href="/issues/{{ member.number | datapage_url: '.' }}">{{member.title}}</a>
{% endfor %}

# Links

{: .table .table-striped #links}
Name            | Link
:---------------|:-----------------------
{% for member in site.data.links %} {{member.Nome}} | [Link]({{member.Link}})
{% endfor %}
# vim-go

Go (golang) support for Vim, which comes with pre-defined sensible settings (like auto gofmt on save), autocomplete, snippet support, improved syntax highlighting, go toolchain commands, and more. If needed, vim-go installs all necessary binaries to provide seamless Vim integration with the current commands. It's highly customizable, and each individual feature can be disabled/enabled easily.

![vim-go](https://dl.dropboxusercontent.com/u/174404/vim-go-2.png)

## Features

* Improved syntax highlighting with items such as Functions, Operators, Methods.
* Auto completion support via `gocode`
* Better `gofmt` on save, which keeps cursor position and doesn't break your undo history
* Go to symbol/declaration with `:GoDef`
* Look up documentation with `:GoDoc` inside Vim or open it in browser
* Automatically import packages via `:GoImport` or plug it into autosave
* Compile your package with `:GoBuild`, install it with `:GoInstall` or test it with `:GoTest` (also supports running single tests via `:GoTestFunc`)
* Quickly execute your current file/files with `:GoRun`
* Automatic `GOPATH` detection based on the directory structure (i.e. `gb` projects, `godep` vendored projects)
* Change or display `GOPATH` with `:GoPath`
* Create a coverage profile and display annotated source code to see which functions are covered with `:GoCoverage`
* Call `gometalinter` with `:GoMetaLinter`, which invokes all possible linters (golint, vet, errcheck, deadcode, etc.) and shows the warnings/errors
* Lint your code with `:GoLint`
* Run your code through `:GoVet` to catch static errors
* Advanced source analysis tools utilizing guru, such as `:GoImplements`, `:GoCallees`, and `:GoReferrers`
* Precise type-safe renaming of identifiers with `:GoRename`
* List all source files and dependencies
* Unchecked error checking with `:GoErrCheck`
* Integrated and improved snippets, supporting `ultisnips` or `neosnippet`
* Share your current code to [play.golang.org](http://play.golang.org) with `:GoPlay`
* On-the-fly type information about the word under the cursor. Plug it into your custom vim function.
* Go asm formatting on save
* Tagbar support to show tags of the source code in a sidebar with `gotags`
* Custom vim text objects such as `a function` or `inner function`
* Jump to function or type declarations with `:GoDecls` or `:GoDeclsDir`
* An async launcher for the go command is implemented for Neovim, fully async building and testing (beta).
* Integrated with the Neovim terminal; launch `:GoRun` and other go commands in their own new terminal (beta).
* Alternate between implementation and test code with `:GoAlternate`

## Install

The master branch is supposed to be a development branch, so things here can break and change. Please always use the [latest release](https://github.com/fatih/vim-go/releases/latest).

vim-go follows the standard runtime path structure, so I highly recommend using a common and well-known plugin manager to install vim-go. Do not use vim-go with other Go-oriented vim plugins. For Pathogen, just clone the repo. For other plugin managers, add the appropriate lines and execute the plugin's install command.
* [Pathogen](https://github.com/tpope/vim-pathogen) * `git clone https://github.com/fatih/vim-go.git ~/.vim/bundle/vim-go` * [vim-plug](https://github.com/junegunn/vim-plug) * `Plug 'fatih/vim-go'` * [NeoBundle](https://github.com/Shougo/neobundle.vim) * `NeoBundle 'fatih/vim-go'` * [Vundle](https://github.com/gmarik/vundle) * `Plugin 'fatih/vim-go'` Please be sure all necessary binaries are installed (such as `gocode`, `godef`, `goimports`, etc.). You can easily install them with the included `:GoInstallBinaries` command. If invoked, all necessary binaries will be automatically downloaded and installed to your `$GOBIN` environment (if not set it will use `$GOPATH/bin`). Note that this command requires `git` for fetching the individual Go packages. Additionally, use `:GoUpdateBinaries` to update the installed binaries. ### Optional * Autocompletion is enabled by default via `<C-x><C-o>`. To get real-time completion (completion by type) install: [neocomplete](https://github.com/Shougo/neocomplete.vim) for Vim or [deoplete](https://github.com/Shougo/deoplete.nvim) and [deoplete-go](https://github.com/zchee/deoplete-go) for NeoVim * To display source code tag information on a sidebar install [tagbar](https://github.com/majutsushi/tagbar). * For snippet features install: [neosnippet](https://github.com/Shougo/neosnippet.vim) or [ultisnips](https://github.com/SirVer/ultisnips). * Screenshot color scheme is a slightly modified molokai: [fatih/molokai](https://github.com/fatih/molokai). * For a better documentation viewer checkout: [go-explorer](https://github.com/garyburd/go-explorer). ## Usage Many of the plugin's [features](#features) are enabled by default. There are no additional settings needed. All usages and commands are listed in `doc/vim-go.txt`. Note that help tags needs to be populated. Check your plugin manager settings to generate the documentation (some do it automatically). 
After that just open the help page to see all commands: :help vim-go ## Example Mappings vim-go has several `<Plug>` mappings which can be used to create custom mappings. Unless otherwise specified, none of these mappings are enabled by default. Here some examples you might find useful: Run commands such as `go run` for the current file with `<leader>r` or `go build` and `go test` for the current package with `<leader>b` and `<leader>t` respectively. Display beautifully annotated source code to see which functions are covered with `<leader>c`. ```vim au FileType go nmap <leader>r <Plug>(go-run) au FileType go nmap <leader>b <Plug>(go-build) au FileType go nmap <leader>t <Plug>(go-test) au FileType go nmap <leader>c <Plug>(go-coverage) ``` By default the mapping `gd` is enabled, which opens the target identifier in current buffer. You can also open the definition/declaration, in a new vertical, horizontal, or tab, for the word under your cursor: ```vim au FileType go nmap <Leader>ds <Plug>(go-def-split) au FileType go nmap <Leader>dv <Plug>(go-def-vertical) au FileType go nmap <Leader>dt <Plug>(go-def-tab) ``` Open the relevant Godoc for the word under the cursor with `<leader>gd` or open it vertically with `<leader>gv` ```vim au FileType go nmap <Leader>gd <Plug>(go-doc) au FileType go nmap <Leader>gv <Plug>(go-doc-vertical) ``` Or open the Godoc in browser ```vim au FileType go nmap <Leader>gb <Plug>(go-doc-browser) ``` Show a list of interfaces which is implemented by the type under your cursor with `<leader>s` ```vim au FileType go nmap <Leader>s <Plug>(go-implements) ``` Show type info for the word under your cursor with `<leader>i` (useful if you have disabled auto showing type info via `g:go_auto_type_info`) ```vim au FileType go nmap <Leader>i <Plug>(go-info) ``` Rename the identifier under the cursor to a new name ```vim au FileType go nmap <Leader>e <Plug>(go-rename) ``` More `<Plug>` mappings can be seen with `:he go-mappings`. 
Also these are just recommendations, you are free to create more advanced mappings or functions based on `:he go-commands`. ## Settings Below are some settings you might find useful. For the full list see `:he go-settings`. By default syntax-highlighting for Functions, Methods and Structs is disabled. To change it: ```vim let g:go_highlight_functions = 1 let g:go_highlight_methods = 1 let g:go_highlight_structs = 1 let g:go_highlight_interfaces = 1 let g:go_highlight_operators = 1 let g:go_highlight_build_constraints = 1 ``` Enable goimports to automatically insert import paths instead of gofmt: ```vim let g:go_fmt_command = "goimports" ``` By default vim-go shows errors for the fmt command, to disable it: ```vim let g:go_fmt_fail_silently = 1 ``` Disable auto fmt on save: ```vim let g:go_fmt_autosave = 0 ``` Disable opening browser after posting your snippet to `play.golang.org`: ```vim let g:go_play_open_browser = 0 ``` By default when `:GoInstallBinaries` is called, the binaries are installed to `$GOBIN` or `$GOPATH/bin`. To change it: ```vim let g:go_bin_path = expand("~/.gotools") let g:go_bin_path = "/home/fatih/.mypath" "or give absolute path ``` ### Using with Neovim (beta) Note: Neovim currently is not a first class citizen for vim-go. You are free to open bugs but I'm not going to look at them. Even though I'm using Neovim myself, Neovim itself is still alpha. So vim-go might not work well as good as in Vim. I'm happy to accept pull requests or very detailed bug reports. Run `:GoRun` in a new tab, horizontal split or vertical split terminal ```vim au FileType go nmap <leader>rt <Plug>(go-run-tab) au FileType go nmap <Leader>rs <Plug>(go-run-split) au FileType go nmap <Leader>rv <Plug>(go-run-vertical) ``` By default new terminals are opened in a vertical split. To change it ```vim let g:go_term_mode = "split" ``` By default the testing commands run asynchronously in the background and display results with `go#jobcontrol#Statusline()`. 
To make them run in a new terminal ```vim let g:go_term_enabled = 1 ``` ### Using with Syntastic Sometimes when using both `vim-go` and `syntastic` Vim will start lagging while saving and opening files. The following fixes this: ```vim let g:syntastic_go_checkers = ['golint', 'govet', 'errcheck'] let g:syntastic_mode_map = { 'mode': 'active', 'passive_filetypes': ['go'] } ``` Another issue with `vim-go` and `syntastic` is that the location list window that contains the output of commands such as `:GoBuild` and `:GoTest` might not appear. To resolve this: ```vim let g:go_list_type = "quickfix" ``` ## More info Check out the [Wiki](https://github.com/fatih/vim-go/wiki) page for more information. It includes [Screencasts](https://github.com/fatih/vim-go/wiki/Screencasts), an [FAQ section](https://github.com/fatih/vim-go/wiki/FAQ-Troubleshooting), and many other [various pieces](https://github.com/fatih/vim-go/wiki) of information. ## Donation People have asked for this for a long time, now you can be a fully supporter by [being a patron](https://www.patreon.com/fatih)! This is fully optional and is just a way to support vim-go's ongoing development directly. Thanks! [https://www.patreon.com/fatih](https://www.patreon.com/fatih) ## Credits * Go Authors for official vim plugins * Gocode, Godef, Golint, Guru, Goimports, Gotags, Errcheck projects and authors of those projects. * Other vim-plugins, thanks for inspiration (vim-golang, go.vim, vim-gocode, vim-godef) * [Contributors](https://github.com/fatih/vim-go/graphs/contributors) of vim-go ## License The BSD 3-Clause License - see `LICENSE` for more details
36.299663
91
0.751415
eng_Latn
0.976631
aafacccd7602b5c38df6172465183c1d2b6530d2
1,501
md
Markdown
_posts/2017-11-24-Object-Detection.md
lsummer/blog
3c7675f1c4363355a59ef153087ab5fb9accfbe9
[ "MIT" ]
null
null
null
_posts/2017-11-24-Object-Detection.md
lsummer/blog
3c7675f1c4363355a59ef153087ab5fb9accfbe9
[ "MIT" ]
2
2018-05-25T01:40:17.000Z
2018-05-26T09:25:54.000Z
_posts/2017-11-24-Object-Detection.md
lsummer/lsummer.github.io
3c7675f1c4363355a59ef153087ab5fb9accfbe9
[ "MIT" ]
null
null
null
---
layout: post
title: Object Detection
subtitle: A review of object detection
date: 2017-11-24
author: Xiya Lv
header-img: img/post-bg-ios9-web.jpg
catalog: true
tags:
    - Machine Learning
    - Deep Learning
    - Object Detection
---

# Abstract

Object detection, a core task of computer vision, deals with detecting instances of semantic objects of a certain class in digital images and videos. It involves not only recognizing and classifying every object, but also localizing each one by drawing the appropriate bounding box around it. With the development of deep learning, which has recently shown outstanding performance in many fields, many successful approaches to object detection have been proposed, making object detection systems faster and more accurate. In this paper, I review some basics of object detection, such as the datasets used by most papers, the criteria for determining performance, and non-maximal suppression for resolving multiple detections. I then review the main successful methods of recent years: R-CNN, Fast R-CNN, Faster R-CNN, R-FCN, YOLO and SSD. By the end of this review, we will hopefully have gained an understanding of how deep learning is applied to object detection, and how these object detection models inspire and diverge from one another.

**Keywords:** Object Detection, Deep Learning, CNN, SSD, YOLO

The full paper is [here](https://github.com/lsummer/lsummer.github.io/blob/master/_posts/ReviewofObjectDetection.pdf).
65.26087
1,042
0.781479
eng_Latn
0.997043
aaface2e712948ef9a20ab653404db5cbadb9adc
11,257
md
Markdown
docs/docs/data-fetching.md
Pabelnedved/ghostlocoko
221481bef7733c9957fb9ba83178f08280243b81
[ "MIT" ]
3
2019-03-07T09:46:18.000Z
2020-05-16T14:21:08.000Z
docs/docs/data-fetching.md
Pabelnedved/ghostlocoko
221481bef7733c9957fb9ba83178f08280243b81
[ "MIT" ]
29
2018-02-16T13:36:10.000Z
2020-02-03T12:02:11.000Z
docs/docs/data-fetching.md
Pabelnedved/ghostlocoko
221481bef7733c9957fb9ba83178f08280243b81
[ "MIT" ]
2
2020-03-27T18:27:28.000Z
2020-04-27T04:37:13.000Z
---
title: Build Time and Client Runtime Data Fetching
---

import BuildDataExample from "@components/build-data-example"
import ClientDataExample from "@components/client-data-example"

This guide demonstrates how to fetch data at both [_build time_](/docs/glossary#build) and [_runtime_](/docs/glossary#runtime) in Gatsby. Most of the techniques outlined are for custom data handling. Be sure to check out Gatsby's [plugin library](/plugins/) to see if there's an off-the-shelf solution for your data requirements, such as [sourcing from a CMS](/docs/headless-cms/) or other third-party integration.

## The benefits of the hybrid nature of Gatsby apps

Because Gatsby is capable of generating content at build time as well as making calls to external services at runtime, you can make [hybrid pages](/docs/adding-app-and-website-functionality/#hybrid-app-pages) that take advantage of the benefits of static content as well as dynamic content.

You can gather data ahead of time while the site builds so that when a user loads your page the data is already available. Then, for data that is of a more dynamic nature, you can request data from another service like an API with `fetch` or a library like `axios`.

This combination of static/dynamic is possible through React [hydration](/docs/glossary#hydration), which means that Gatsby (through React.js) builds static files to render server-side. When Gatsby's script bundle downloads and executes in the browser, it preserves the HTML markup built by Gatsby and turns the site into a full React web application that can manipulate the [DOM](/docs/glossary#dom). The result of this process creates fast loading pages and a nice user experience.

> To understand how statically generated content can turn into a React app, refer to the [Understanding React Hydration guide](/docs/react-hydration)

Compiling pages at build time is useful when your website content won't change often, or when triggering a build process to recompile works fine. However, some websites with more dynamic needs require a [client-side](/docs/glossary#client-side) runtime to handle constantly changing content after the page loads, like a chat widget, user upvotes, or an email client web application.

## Combining build time and client runtime data

To illustrate a combination of build time and client runtime data, this guide uses code from a [small example site](https://gatsby-data-fetching.netlify.com). It uses the [`gatsby-source-graphql`](/packages/gatsby-source-graphql/) plugin to fetch data from GitHub's GraphQL API at build time for static content like the name and URL to a repository, and the [`fetch` API](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) to retrieve more dynamic data from the GitHub API on the [client-side](/docs/glossary#client-side) like star counts when the page loads in the browser.

Reasons to fetch certain data at build time vs. client runtime will vary, but in this example the repo's name and URL are much less likely to change between builds of the site. The repo's star counts, on the other hand, are likely to change often and would benefit from a client-side request to the GitHub API to stay current between static site builds.

> Check out the code from the [full example here](https://github.com/gatsbyjs/gatsby/tree/master/examples/data-fetching).

### Fetching data at build time

In order to fetch data at build time, you can use a source plugin or source data yourself. To source data yourself you can create an integration with a third-party system by creating [nodes for the GraphQL layer](/docs/node-creation/) in your `gatsby-node` file from retrieved data that becomes queryable in pages. This is the same method that source plugins implement to [source data](/docs/content-and-data/) while the site builds. You can read about that process in the [Creating a Source Plugin guide](/docs/creating-a-source-plugin/).

> This process of fetching data at build time and creating pages from the data is [covered in more depth in the tutorial](/tutorial/part-five/) as well as the docs for [creating pages from data programmatically](/docs/programmatically-create-pages-from-data/).

#### Source data to be queried at build time

To source data using an existing source plugin you need to install a plugin and add it to your [config](/docs/api-files-gatsby-config/). To use `gatsby-source-graphql`, first install it:

```shell
npm install --save gatsby-source-graphql
```

Then, add the plugin to your `gatsby-config`:

```javascript:title=gatsby-config.js
module.exports = {
  plugins: [
    {
      resolve: `gatsby-source-graphql`,
      options: {
        typeName: `GitHub`,
        fieldName: `github`,
        url: `https://api.github.com/graphql`, //highlight-line
        headers: {
          Authorization: `Bearer your-github-token`,
        },
      },
    },
  ],
}
```

> Because the GitHub GraphQL API requires you to be authenticated to make requests, you need to create a token and replace the "your-github-token" text in the header for Authorization with that token. You can [secure your key using environment variables](/docs/environment-variables/) if you're pushing code to a public repository.

Alternately, if you want to source data yourself you can use APIs Gatsby provides. Source plugins take advantage of the [`sourceNodes` API](/docs/node-apis/#sourceNodes) and the [`createNode` action](/docs/actions/#createNode) provided by Gatsby to make your data queryable during the build process. If you want to source data yourself you can add a section of code like this using the `createNode` API to add a node to your data layer manually:

```js:title=gatsby-node.js
const fetch = require(`node-fetch`)

exports.sourceNodes = async ({
  actions: { createNode },
  createContentDigest,
}) => {
  // get data from GitHub API at build time
  const result = await fetch(`https://api.github.com/repos/gatsbyjs/gatsby`)
  const resultData = await result.json()
  // create node for build time data example in the docs
  createNode({
    // nameWithOwner and url are arbitrary fields from the data
    nameWithOwner: resultData.full_name,
    url: resultData.html_url,
    // required fields
    id: `example-build-time-data`,
    parent: null,
    children: [],
    internal: {
      type: `Example`,
      contentDigest: createContentDigest(resultData),
    },
  })
}
```

It is important to note that node fields can indeed be arbitrary. The `nameWithOwner` and `url` fields from above are examples of this.

This node created manually could be retrieved with a query like this:

```graphql
query {
  example {
    nameWithOwner
    url
  }
}
```

#### Writing a query to gather the static data needed for a page

With the source plugin installed and added to your config, you can write a [static query](/docs/use-static-query/) that will retrieve the necessary data from Gatsby's data layer while building the site.

```jsx:title=src/pages/index.js
import React from "react"
import { graphql, useStaticQuery } from "gatsby" // highlight-line

const IndexPage = () => {
  // highlight-start
  const gatsbyRepoData = useStaticQuery(graphql`
    query {
      github {
        repository(name: "gatsby", owner: "gatsbyjs") {
          id
          nameWithOwner
          url
        }
      }
    }
  `)
  // highlight-end

  return (
    <section>
      <p>
        Build Time Data: Gatsby repo{` `}
        <a href={gatsbyRepoData.github.repository.url}>
          {gatsbyRepoData.github.repository.nameWithOwner} // highlight-line
        </a>
      </p>
    </section>
  )
}

export default IndexPage
```

> This data is gathered at build time and written to a JSON file. As the build continues, the code is rewritten behind the scenes to dynamically import the JSON file and set `gatsbyRepoData` equal to the contents of the JSON file instead of the call to `useStaticQuery`. You can read more about this process in the Gatsby internals section on [Normal vs Static Queries](/docs/static-vs-normal-queries/#replacing-queries-with-json-imports)

Here's an adaptation of this build time data example being used on this page:

<BuildDataExample />

The linked URL and repository name are fetched at build time; if the name of the repository changed and the site were rebuilt, it would update.

### Fetching data at client-side runtime

For fetching data at runtime in the browser, you can use any method to retrieve data that you would use in a regular React app.

#### Retrieving data with the `fetch` API

The `fetch` API is a modern implementation of the older, well-supported `XMLHttpRequest` (also known as AJAX). With the `useState` and `useEffect` hooks from React, you can query for the data once when the page loads, and save the data returned to a variable called `starsCount`. Every time the page is refreshed, `fetch` will go to the GitHub API to retrieve the most up-to-date version of the data.

```jsx:title=src/pages/index.js
import React, { useState, useEffect } from "react" // highlight-line
import { graphql, useStaticQuery } from "gatsby"

const IndexPage = () => {
  // Build Time Data Fetching
  const gatsbyRepoData = useStaticQuery(graphql`
    query {
      github {
        repository(name: "gatsby", owner: "gatsbyjs") {
          id
          nameWithOwner
          url
        }
      }
    }
  `)

  // Client-side Runtime Data Fetching
  // highlight-start
  const [starsCount, setStarsCount] = useState(0)
  useEffect(() => {
    // get data from GitHub api
    fetch(`https://api.github.com/repos/gatsbyjs/gatsby`)
      .then(response => response.json()) // parse JSON from request
      .then(resultData => {
        setStarsCount(resultData.stargazers_count)
      }) // set data for the number of stars
  }, [])
  // highlight-end

  return (
    <section>
      <p>
        Build Time Data: Gatsby repo{` `}
        <a href={gatsbyRepoData.github.repository.url}>
          {gatsbyRepoData.github.repository.nameWithOwner}
        </a>
      </p>
      <p>Runtime Data: Star count for the Gatsby repo {starsCount}</p> // highlight-line
    </section>
  )
}

export default IndexPage
```

In the code above, both the build time and runtime data are rendered from the same page component. The build time data has the advantage of being loaded before the user ever gets to the page. When the site loads in the browser, the runtime section in the `useEffect` hook will gather its data and render it as well.

Here's an adaptation of this runtime example being used on this page (which is also a Gatsby app)!

<ClientDataExample />

The repo's star count is fetched at runtime; if you refresh the page, this number will likely update.

## Other resources

You may be interested in other projects (both used in production and proof-of-concepts) that illustrate this usage:

- [Live example](https://gatsby-data-fetching.netlify.com) of the code used in this guide
- [Gatsby store](https://github.com/gatsbyjs/store.gatsbyjs.org): with static product pages at build time and client-side interactions for e-commerce features
- [Gatsby mail](https://github.com/DSchau/gatsby-mail): a client-side email application
- [Example repo fetching data using Apollo](https://github.com/jlengstorf/gatsby-with-apollo)
50.479821
583
0.735542
eng_Latn
0.989937
aafaf9eeabf8726197952f519fc95e2e819c71ea
420
md
Markdown
README.md
Hilaryous/snake
f64e39f8510410350e7aeb0e1a505f0eb2dcd448
[ "MIT" ]
null
null
null
README.md
Hilaryous/snake
f64e39f8510410350e7aeb0e1a505f0eb2dcd448
[ "MIT" ]
null
null
null
README.md
Hilaryous/snake
f64e39f8510410350e7aeb0e1a505f0eb2dcd448
[ "MIT" ]
null
null
null
# TETRIS

To install the dependencies:

```
npm install
```

To fire up a development server:

```
npm start
```

Once the server is running, you can visit:

* `http://localhost:8080/webpack-dev-server/` to run your application.
* `http://localhost:8080/webpack-dev-server/test.html` to run your test suite in the browser.

To build the static files:

```js
npm run build
```

To run tests in Node:

```js
npm test
```
13.125
93
0.690476
eng_Latn
0.878074
aafbddb7118c81d92d76ff55ce598a7ab1c5cd23
415
md
Markdown
content/project/first_access/index.md
nriaziat/academic-kickstart
37f44515fcae3c17fb38beb83f6130fab64810ba
[ "MIT" ]
null
null
null
content/project/first_access/index.md
nriaziat/academic-kickstart
37f44515fcae3c17fb38beb83f6130fab64810ba
[ "MIT" ]
null
null
null
content/project/first_access/index.md
nriaziat/academic-kickstart
37f44515fcae3c17fb38beb83f6130fab64810ba
[ "MIT" ]
null
null
null
---
date: "2020-04-27T00:00:00Z"
external_link: https://engineering.purdue.edu/ME/News/2020/who-will-win-best-senior-design-project-of-2020
image:
  caption: 'Potential UI for Smart Access Assist'
  focal_point: Smart
summary: My senior design team created a smart trocar that would be used in research to detect first-access complications before they happen.
tags:
- medicine
title: Smart First-Access Device
---
34.583333
141
0.775904
eng_Latn
0.900762
aafc0ad9ec6f92dba08e8b26dccf79dad38e4433
719
md
Markdown
README.md
manuth/WSCPackageGenerator
258d1ff46d99acbf4248e0eef808051a2907b9e0
[ "MIT" ]
3
2018-05-24T09:58:37.000Z
2019-08-06T20:32:13.000Z
README.md
manuth/WSCPackageGenerator
258d1ff46d99acbf4248e0eef808051a2907b9e0
[ "MIT" ]
217
2020-07-07T19:33:27.000Z
2022-03-24T10:03:11.000Z
README.md
manuth/WSCPackageGenerator
258d1ff46d99acbf4248e0eef808051a2907b9e0
[ "MIT" ]
null
null
null
# WSCPackageGenerator

A Generator for WoltLab Suite Core Packages.

## Usage

### Installing `WSCPackageGenerator`

You can install `WSCPackageGenerator` using following command:

```bash
npm install -g yo @manuth/generator-wsc-package
```

### Using the Generator

You can run the generator by invoking this command on a command prompt:

```bash
yo @manuth/wsc-package
```

## Generator Output

- WoltLab Suite Core Package
- Meta-Data written in TypeScript
- Components written in TypeScript (optional)
- NPM-scripts for...
  - Compiling the Package-Metadata
  - Compiling the Package-Metadata in Watched Mode
  - Building the Package-File (`.tar`-archive)
- Visual Studio Code-Environment (optional)
25.678571
71
0.738526
eng_Latn
0.73041
aafc69680ef7203487a5bf5d10b64609ff6c0b63
2,305
md
Markdown
docs/framework/debug-trace-profile/raceonrcwcleanup-mda.md
dhernandezb/docs.es-es
cf1637e989876a55eb3c57002818d3982591baf1
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/debug-trace-profile/raceonrcwcleanup-mda.md
dhernandezb/docs.es-es
cf1637e989876a55eb3c57002818d3982591baf1
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/debug-trace-profile/raceonrcwcleanup-mda.md
dhernandezb/docs.es-es
cf1637e989876a55eb3c57002818d3982591baf1
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: MDA de raceOnRCWCleanup
ms.date: 03/30/2017
helpviewer_keywords:
- RCW
- managed debugging assistants (MDAs), RCWs
- race on RCW cleanup
- MDAs (managed debugging assistants), RCWs
- RaceOnRCWCleanup MDA
- runtime callable wrappers
ms.assetid: bee1e9b1-50a8-4c89-9cd9-7dd6b2458187
author: mairaw
ms.author: mairaw
ms.openlocfilehash: 09e3b275bfaa5743c0271578df97f92269ac7c86
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 05/04/2018
ms.locfileid: "33386901"
---
# <a name="raceonrcwcleanup-mda"></a>MDA de raceOnRCWCleanup

El asistente para la depuración administrada (MDA) de `raceOnRCWCleanup` se activa cuando Common Language Runtime (CLR) detecta que un [contenedor RCW](../../../docs/framework/interop/runtime-callable-wrapper.md) está en uso al realizar una llamada para liberarlo mediante un comando como el método <xref:System.Runtime.InteropServices.Marshal.ReleaseComObject%2A?displayProperty=nameWithType>.

## <a name="symptoms"></a>Síntomas

Infracciones de acceso o daños en la memoria durante o después de la liberalización de RCW utilizando <xref:System.Runtime.InteropServices.Marshal.ReleaseComObject%2A> o un método similar.

## <a name="cause"></a>Motivo

RCW está en otro subproceso o en la pila de subprocesos de liberación. No se puede liberar un RCW que esté en uso.

## <a name="resolution"></a>Solución

No libere ningún RCW que pudiera estar en uso en este o en otros subprocesos.

## <a name="effect-on-the-runtime"></a>Efecto en el Runtime

Este MDA no tiene ningún efecto en el CLR.

## <a name="output"></a>Resultado

Un mensaje que describe el error.

## <a name="configuration"></a>Configuración

```xml
<mdaConfig>
  <assistants>
    <raceOnRCWCleanup/>
  </assistants>
</mdaConfig>
```

## <a name="see-also"></a>Vea también

<xref:System.Runtime.InteropServices.MarshalAsAttribute>
[Diagnosing Errors with Managed Debugging Assistants (Diagnóstico de errores con asistentes para la depuración administrada)](../../../docs/framework/debug-trace-profile/diagnosing-errors-with-managed-debugging-assistants.md)
[Serialización de interoperabilidad](../../../docs/framework/interop/interop-marshaling.md)
43.490566
396
0.749675
spa_Latn
0.644029
aafcaad1523f677095b89d93957ff25623420a7f
425
md
Markdown
content-sample/index.md
smcdougall/astral-pico
882a811ed3b2274b8acb74e25b45824ac5d7992f
[ "CC-BY-3.0" ]
7
2017-02-05T22:03:10.000Z
2019-07-21T03:47:16.000Z
content-sample/index.md
smcdougall/astral-pico
882a811ed3b2274b8acb74e25b45824ac5d7992f
[ "CC-BY-3.0" ]
1
2017-08-14T08:23:05.000Z
2017-08-15T19:27:02.000Z
content-sample/index.md
smcdougall/astral-pico
882a811ed3b2274b8acb74e25b45824ac5d7992f
[ "CC-BY-3.0" ]
7
2017-03-19T15:21:54.000Z
2019-05-07T19:06:33.000Z
---
Title: Home
Icon: home
Description: My Description # Description under Heading.
# Profile: "%assets_url%/profile.png" # Profile Picture for Home Page.
# OG Image: "%assets_url%/profile.png" # Open Graph Image for Link Previews.
# CSS: "%assets_url%/override.css" # Your own, optional CSS file to override theme styles.
# Fonts: # Your own, optional fonts you'd like to include.
#     - https://fonts.googleapis.com/css
---
38.636364
90
0.722353
kor_Hang
0.527725
aafcb189bcc1768fef14ac49b18110f317b0a2f2
16,988
md
Markdown
README.md
ashishrawat2911/flutter_twitter_clone
e74f4f228fbd88509118c04614d3a803f1c61943
[ "MIT" ]
2
2020-04-01T08:38:16.000Z
2020-07-20T18:48:03.000Z
README.md
AbdulbasitSaid/flutter_twitter_clone
e74f4f228fbd88509118c04614d3a803f1c61943
[ "MIT" ]
null
null
null
README.md
AbdulbasitSaid/flutter_twitter_clone
e74f4f228fbd88509118c04614d3a803f1c61943
[ "MIT" ]
1
2021-09-28T19:15:14.000Z
2021-09-28T19:15:14.000Z
## Fwitter - Flutter Based Twitter Clone ![Twitter URL](https://img.shields.io/twitter/url?style=social&url=https%3A%2F%2Ftwitter.com%2Fthealphamerc) [![GitHub stars](https://img.shields.io/github/stars/Thealphamerc/flutter_twitter_clone?style=social)](https://github.com/login?return_to=%2FTheAlphamerc%flutter_wallet_app) ![GitHub forks](https://img.shields.io/github/forks/TheAlphamerc/flutter_twitter_clone?style=social) ![Dart CI](https://github.com/TheAlphamerc/flutter_twitter_clone/workflows/Dart%20CI/badge.svg) ![GitHub pull requests](https://img.shields.io/github/issues-pr/TheAlphamerc/flutter_twitter_clone) ![GitHub closed pull requests](https://img.shields.io/github/issues-pr-closed/Thealphamerc/flutter_twitter_clone) ![GitHub last commit](https://img.shields.io/github/last-commit/Thealphamerc/flutter_wallet_app) ![GitHub issues](https://img.shields.io/github/issues-raw/Thealphamerc/flutter_twitter_clone) [![Open Source Love](https://badges.frapsoft.com/os/v2/open-source.svg?v=103)](https://github.com/Thealphamerc/flutter_twitter_clone) A working Twitter clone written in Flutter using Firebase auth,realtime database and storage. 
## Download App <a href="https://play.google.com/store/apps/details?id=com.thealphamerc.flutter_twitter_clone"><img src="https://play.google.com/intl/en_us/badges/static/images/badges/en_badge_web_generic.png" width="200"></img></a> ## Features * App features is mentioned at project section [ Click here](https://github.com/TheAlphamerc/flutter_twitter_clone/projects/1) * Messaging chat section status can be seen at [here](https://github.com/TheAlphamerc/flutter_twitter_clone/projects/2) ## Dependencies <details> <summary> Click to expand </summary> * [intl](https://pub.dev/packages/intl) * [uuid](https://pub.dev/packages/uuid) * [http](https://pub.dev/packages/http) * [async](https://pub.dev/packages/async) * [share](https://pub.dev/packages/share) * [provider](https://pub.dev/packages/provider) * [url_launcher](https://pub.dev/packages/url_launcher) * [google_fonts](https://pub.dev/packages/google_fonts) * [image_picker](https://pub.dev/packages/image_picker) * [firebase_auth](https://pub.dev/packages/firebase_auth) * [google_sign_in](https://pub.dev/packages/google_sign_in) * [firebase_storage](https://pub.dev/packages/firebase_storage) * [firebase_analytics](https://pub.dev/packages/firebase_analytics) * [firebase_database](https://pub.dev/packages/firebase_database) * [shared_preferences](https://pub.dev/packages/shared_preferences) * [flutter_native_splash](https://pub.dev/packages/flutter_native_splash) * [flutter_advanced_networkimage](https://pub.dev/packages/flutter_advanced_networkimage) </details> ## Screenshots Welcome Page | Login Page | Signup Page | Forgot Password Page :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------: 
![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Auth/screenshot_1.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Auth/screenshot_2.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Auth/screenshot_3.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Auth/screenshot_4.jpg?raw=true)| Home Page Sidebar | Home Page | Home Page | Home Page :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------: ![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Home/screenshot_5.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Home/screenshot_2.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Home/screenshot_7.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Home/screenshot_6.jpg?raw=true)| Compose Tweet Page | Reply to Tweet | Reply to Tweet | Compose Retweet with comment :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------: ![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/CreateTweet/screenshot_1.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/CreateTweet/screenshot_2.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/CreateTweet/screenshot_4.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/CreateTweet/screenshot_3.jpg?raw=true)| Tweet Detail Page | Tweet Thread | Nested Tweet Thread | Tweet options :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------:
![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/TweetDetail/screenshot_3.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/TweetDetail/screenshot_4.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/TweetDetail/screenshot_1.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/TweetDetail/screenshot_2.jpg?raw=true)| Notification Page | Notification Page | Notification Page | Notification Setting Page :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------: ![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Notification/screenshot_1.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Notification/screenshot_2.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Notification/screenshot_3.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Notification/screenshot_4.jpg?raw=true)| Profile Page | Profile Page | Profile Page | Profile Page :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------: ![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Profile/screenshot_1.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Profile/screenshot_2.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Profile/screenshot_4.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Profile/screenshot_7.jpg?raw=true)| Select User Page | Chat Page | Chat Users List | Conversation Info Page :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------: 
![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Chat/screenshot_1.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Chat/screenshot_2.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Chat/screenshot_3.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Chat/screenshot_4.jpg?raw=true)| Search Page | Search Setting Page | Tweet Options - 1 | Tweet Options - 2 :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------: ![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Search/screenshot_1.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Search/screenshot_2.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/TweetDetail/screenshot_5.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/TweetDetail/screenshot_6.jpg?raw=true)| Setting Page | Account Setting Page | Privacy Setting Page | Privacy Settings Page :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------: ![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_1.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_2.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_4.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_3.jpg?raw=true)| Content Preferences Page | Display Setting Page | Data Settings Page | Accessibility Settings :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------:
![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_5.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_6.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_7.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_8.jpg?raw=true)| Users who liked a Tweet | About Setting Page | Licenses Settings | Settings :-------------------------:|:-------------------------:|:-------------------------:|:-------------------------: ![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/TweetDetail/screenshot_7.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_9.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_10.jpg?raw=true)|![](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/screenshots/Settings/screenshot_81.jpg?raw=true)| ## Getting started <details> <summary> Click to expand </summary> #### 1. [Setup Flutter](https://flutter.dev/docs/get-started/install) #### 2. Clone the repo ```sh $ git clone https://github.com/TheAlphamerc/flutter_twitter_clone.git $ cd flutter_twitter_clone/ ``` #### 3. Set up the Firebase app 1. You'll need to create a Firebase instance. Follow the instructions at https://console.firebase.google.com. 2. Once your Firebase instance is created, you'll need to enable Google authentication. * Go to the Firebase Console for your new instance. * Click "Authentication" in the left-hand menu * Click the "sign-in method" tab * Click "Google" and enable it * Click "Email/Password" and enable it 3.
Enable the Firebase Database * Go to the Firebase Console * Click "Database" in the left-hand menu * Click the Realtime "Create Database" button * Select "Start in test mode" and "Enable" 4. (skip if not running on Android) * Create an app within your Firebase instance for Android, with package name com.thealphamerc.flutter_twitter_clone * Run the following command to get your SHA-1 key: ``` keytool -exportcert -list -v \ -alias androiddebugkey -keystore ~/.android/debug.keystore ``` * In the Firebase console, in the settings of your Android app, add your SHA-1 key by clicking "Add Fingerprint". * Follow instructions to download google-services.json * place `google-services.json` into `/android/app/`. 5. (skip if not running on iOS) * Create an app within your Firebase instance for iOS, with your app package name * Follow instructions to download GoogleService-Info.plist * Open XCode, right click the Runner folder, select the "Add Files to 'Runner'" menu, and select the GoogleService-Info.plist file to add it to /ios/Runner in XCode * Open /ios/Runner/Info.plist in a text editor. Locate the CFBundleURLSchemes key. The second item in the array value of this key is specific to the Firebase instance. 
Replace it with the value for REVERSED_CLIENT_ID from GoogleService-Info.plist </details> ## Directory Structure <details> <summary> Click to expand </summary> ``` |-- lib | |-- helper | | |-- constant.dart | | |-- customRoute.dart | | |-- enum.dart | | |-- routes.dart | | |-- theme.dart | | |-- utility.dart | | '-- validator.dart | |-- main.dart | |-- model | | |-- chatModel.dart | | |-- commentModel.dart | | |-- feedModel.dart | | |-- notificationModel.dart | | '-- user.dart | |-- page | | |-- Auth | | | |-- forgetPasswordPage.dart | | | |-- selectAuthMethod.dart | | | |-- signin.dart | | | |-- signup.dart | | | '-- verifyEmail.dart | | |-- common | | | |-- sidebar.dart | | | |-- splash.dart | | | |-- usersListPage.dart | | | '-- widget | | | '-- userListWidget.dart | | |-- feed | | | |-- composeTweet | | | | |-- composeTweet.dart | | | | |-- createFeed.dart | | | | '-- widget | | | | |-- composeBottomIconWidget.dart | | | | '-- composeTweetImage.dart | | | |-- feedPage.dart | | | |-- feedPostDetail.dart | | | '-- imageViewPage.dart | | |-- homePage.dart | | |-- message | | | |-- chatListPage.dart | | | |-- chatScreenPage.dart | | | |-- conversationInformation | | | | '-- conversationInformation.dart | | | '-- newMessagePage.dart | | |-- notification | | | '-- notificationPage.dart | | |-- profile | | | |-- EditProfilePage.dart | | | |-- follow | | | | |-- followerListPage.dart | | | | '-- followingListPage.dart | | | '-- profilePage.dart | | |-- search | | | '-- SearchPage.dart | | '-- settings | | |-- accountSettings | | | |-- about | | | | '-- aboutTwitter.dart | | | |-- accessibility | | | | '-- accessibility.dart | | | |-- accountSettingsPage.dart | | | |-- contentPrefrences | | | | |-- contentPreference.dart | | | | '-- trends | | | | '-- trendsPage.dart | | | |-- dataUsage | | | | '-- dataUsagePage.dart | | | |-- displaySettings | | | | '-- displayAndSoundPage.dart | | | |-- notifications | | | | '-- notificationPage.dart | | | |-- privacyAndSafety | | | | 
|-- directMessage | | | | | '-- directMessage.dart | | | | '-- privacyAndSafetyPage.dart | | | '-- proxy | | | '-- proxyPage.dart | | |-- settingsAndPrivacyPage.dart | | '-- widgets | | |-- headerWidget.dart | | |-- settingsAppbar.dart | | '-- settingsRowWidget.dart | |-- state | | |-- appState.dart | | |-- authState.dart | | |-- chats | | | '-- chatState.dart | | |-- feedState.dart | | |-- notificationState.dart | | '-- searchState.dart | '-- widgets | |-- bottomMenuBar | | |-- HalfPainter.dart | | |-- bottomMenuBar.dart | | '-- tabItem.dart | |-- customAppBar.dart | |-- customWidgets.dart | |-- newWidget | | |-- customClipper.dart | | |-- customLoader.dart | | |-- customProgressbar.dart | | |-- customUrlText.dart | | |-- emptyList.dart | | |-- rippleButton.dart | | '-- title_text.dart | '-- tweet | |-- tweet.dart | '-- widgets | |-- tweetBottomSheet.dart | |-- tweetIconsRow.dart | '-- tweetImage.dart |-- pubspec.yaml ``` </details> ## Contributing If you wish to contribute a change to any of the existing feature or add new in this repo, please review our [contribution guide](https://github.com/TheAlphamerc/flutter_twitter_clone/blob/master/CONTRIBUTING.md), and send a [pull request](https://github.com/TheAlphamerc/flutter_twitter_clone/pulls). I welcome and encourage all pull requests. It usually will take me within 24-48 hours to respond to any issue or request. ## Created & Maintained By [Sonu Sharma](https://github.com/TheAlphamerc) ([Twitter](https://www.twitter.com/TheAlphamerc)) ([Youtube](https://www.youtube.com/user/sonusharma045sonu/)) ([Insta](https://www.instagram.com/_sonu_sharma__)) ![Twitter Follow](https://img.shields.io/twitter/follow/thealphamerc?style=social) > If you found this project helpful or you learned something from the source code and want to thank me, consider buying me a cup of :coffee: > > * [PayPal](https://www.paypal.me/TheAlphamerc/)
60.455516
636
0.645456
yue_Hant
0.411172
aafcb2a88839ba5c45d5cd60a5b04c1c74cdba9d
878
md
Markdown
clients/csharp-dotnet2/generated/docs/QueueBlockedItem.md
PankTrue/swaggy-jenkins
aca35a7cca6e1fcc08bd399e05148942ac2f514b
[ "MIT" ]
23
2017-08-01T12:25:26.000Z
2022-01-25T03:44:11.000Z
clients/csharp-dotnet2/generated/docs/QueueBlockedItem.md
PankTrue/swaggy-jenkins
aca35a7cca6e1fcc08bd399e05148942ac2f514b
[ "MIT" ]
35
2017-06-14T03:28:15.000Z
2022-02-14T10:25:54.000Z
clients/csharp-dotnet2/generated/docs/QueueBlockedItem.md
PankTrue/swaggy-jenkins
aca35a7cca6e1fcc08bd399e05148942ac2f514b
[ "MIT" ]
11
2017-08-31T19:00:20.000Z
2021-12-19T12:04:12.000Z
# Org.OpenAPITools.Model.QueueBlockedItem ## Properties Name | Type | Description | Notes ------------ | ------------- | ------------- | ------------- **Class** | **string** | | [optional] **Actions** | [**List<CauseAction>**](CauseAction.md) | | [optional] **Blocked** | **bool?** | | [optional] **Buildable** | **bool?** | | [optional] **Id** | **int?** | | [optional] **InQueueSince** | **int?** | | [optional] **Params** | **string** | | [optional] **Stuck** | **bool?** | | [optional] **Task** | [**FreeStyleProject**](FreeStyleProject.md) | | [optional] **Url** | **string** | | [optional] **Why** | **string** | | [optional] **BuildableStartMilliseconds** | **int?** | | [optional] [[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
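Since the table above is just a JSON contract, a quick way to sanity-check it is to decode a sample payload and read the fields back. The snippet below is illustrative only: it is not part of the generated C# client, and the sample values are made up.

```python
import json

# Hypothetical payload shaped like the QueueBlockedItem model above.
# Every property is optional, so consumers should tolerate missing keys.
payload = """
{
  "_class": "hudson.model.Queue$BlockedItem",
  "blocked": true,
  "buildable": false,
  "id": 42,
  "inQueueSince": 1500000000000,
  "stuck": false,
  "why": "Build #11 is already in progress"
}
"""

item = json.loads(payload)

# Use .get() so absent optional fields come back as None instead of raising.
print(item.get("blocked"))   # True
print(item.get("params"))    # None (optional field not present)
```

The same tolerance for missing keys applies in any client language, since the spec marks every property `[optional]`.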
41.809524
161
0.542141
yue_Hant
0.452523
aafccf55ddfe4082d5a5189c383bd166cf11d580
219
md
Markdown
markdown/geologia/03.md
reale/ciclotimie
8e13a638f9d0ce4258bcc53f1fdb024a2a08a502
[ "MIT" ]
null
null
null
markdown/geologia/03.md
reale/ciclotimie
8e13a638f9d0ce4258bcc53f1fdb024a2a08a502
[ "MIT" ]
null
null
null
markdown/geologia/03.md
reale/ciclotimie
8e13a638f9d0ce4258bcc53f1fdb024a2a08a502
[ "MIT" ]
null
null
null
# iii the eyes where there was never remission I do not know whether you hid them from us or only denied them to the world but we we kept searching for them even before you cast us onto the shores of light nor have we stopped yet
24.333333
41
0.757991
ita_Latn
0.999925
aafd7f5c76a55bd3a03914f7180ea3a9b033309e
2,185
md
Markdown
docs/framework/configure-apps/file-schema/web/system-web-element-web-settings.md
hsr-GitHub/docs.zh-cn
72d3f41d4e4ad3167f1992fc4951a07830189c87
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/configure-apps/file-schema/web/system-web-element-web-settings.md
hsr-GitHub/docs.zh-cn
72d3f41d4e4ad3167f1992fc4951a07830189c87
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/configure-apps/file-schema/web/system-web-element-web-settings.md
hsr-GitHub/docs.zh-cn
72d3f41d4e4ad3167f1992fc4951a07830189c87
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: <system.web> Element (Web Settings)
ms.date: 03/30/2017
helpviewer_keywords:
- Web.config configuration file [ASP.NET]
- system.Web element
- <system.Web> element
- ASP.NET configuration system
- configuration files [ASP.NET]
ms.assetid: 24c4cf4f-ad32-42b2-b040-8e4549e2855e
ms.openlocfilehash: b37b05bdf90630251cbfcf86751243a3a8b77663
ms.sourcegitcommit: b16c00371ea06398859ecd157defc81301c9070f
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 06/06/2020
ms.locfileid: "79152836"
---
# <a name="systemweb-element-web-settings"></a>\<system.web> Element (Web Settings)

Contains information about how the ASP.NET hosting layer manages process-wide behavior.

[**\<configuration>**](../configuration-element.md)
&nbsp;&nbsp;**\<system.web>**

## <a name="syntax"></a>Syntax

```xml
<system.web>
</system.web>
```

## <a name="attributes-and-elements"></a>Attributes and elements

The following sections describe attributes, child elements, and parent elements.

### <a name="attributes"></a>Attributes

None.

### <a name="child-elements"></a>Child elements

|Element|Description|
|-------------|-----------------|
|[\<applicationPool>](applicationpool-element-web-settings.md)|Specifies configuration settings for an IIS application pool in the aspnet.config file.|

### <a name="parent-elements"></a>Parent elements

|Element|Description|
|-------------|-----------------|
|[\<configuration>](../configuration-element.md)|Specifies the root element in every configuration file that is used by the common language runtime and .NET Framework applications.|

## <a name="remarks"></a>Remarks

The `system.web` element and its child `applicationPool` element were added to the .NET Framework in .NET Framework 3.5 SP1. When you run IIS 7.0 or later in Integrated mode, this element combination lets you configure how ASP.NET manages threads and how requests are queued when ASP.NET is hosted in an IIS application pool. If you run IIS 7.0 or later in Classic or ISAPI mode, these settings are ignored.

## <a name="example"></a>Example

The following example shows how to configure process-wide behavior in the aspnet.config file when ASP.NET is hosted in an IIS application pool. The example assumes that IIS is running in Integrated mode and that the application uses .NET Framework 3.5 SP1 or later. This behavior does not occur in versions of the .NET Framework earlier than .NET Framework 3.5 SP1. The values shown in the example are the defaults.

```xml
<configuration>
  <system.web>
    <applicationPool
        maxConcurrentRequestsPerCPU="5000"
        maxConcurrentThreadsPerCPU="0"
        requestQueueLimit="5000" />
  </system.web>
</configuration>
```

## <a name="element-information"></a>Element information

|||
|-|-|
|Namespace||
|Schema name||
|Validation file||
|Can be empty||

## <a name="see-also"></a>See also

- [\<applicationPool> Element (Web Settings)](applicationpool-element-web-settings.md)
26.646341
217
0.659039
yue_Hant
0.502991
aafd88c0e253feac3a64f2011c89295284fb4c68
660
md
Markdown
windows.applicationmodel.chat/chatsearchreader_readbatchasync_888788553.md
gbaychev/winrt-api
25346cd51bc9d24c8c4371dc59768e039eaf02f1
[ "CC-BY-4.0", "MIT" ]
199
2017-02-09T23:13:51.000Z
2022-03-28T15:56:12.000Z
windows.applicationmodel.chat/chatsearchreader_readbatchasync_888788553.md
gbaychev/winrt-api
25346cd51bc9d24c8c4371dc59768e039eaf02f1
[ "CC-BY-4.0", "MIT" ]
2,093
2017-02-09T21:52:45.000Z
2022-03-25T22:23:18.000Z
windows.applicationmodel.chat/chatsearchreader_readbatchasync_888788553.md
gbaychev/winrt-api
25346cd51bc9d24c8c4371dc59768e039eaf02f1
[ "CC-BY-4.0", "MIT" ]
620
2017-02-08T19:19:44.000Z
2022-03-29T11:38:25.000Z
--- -api-id: M:Windows.ApplicationModel.Chat.ChatSearchReader.ReadBatchAsync -api-type: winrt method -api-device-family-note: xbox --- <!-- Method syntax public Windows.Foundation.IAsyncOperation<Windows.Foundation.Collections.IVectorView<Windows.ApplicationModel.Chat.IChatItem>> ReadBatchAsync() --> # Windows.ApplicationModel.Chat.ChatSearchReader.ReadBatchAsync ## -description Returns a batch of found items matching the search criteria. ## -returns A list of items matching the search criteria. ## -remarks ## -examples ## -see-also [ReadBatchAsync(Int32)](chatsearchreader_readbatchasync_1346490639.md) ## -capabilities chatSystem, smsSend, chat
24.444444
143
0.792424
yue_Hant
0.442845
aafdd56bb407d13c9f990a000796b3406d8fd443
308
md
Markdown
README.md
davidmogar/wschat
32b57d8c416246a0222b0590f0ebb342d158082d
[ "MIT" ]
null
null
null
README.md
davidmogar/wschat
32b57d8c416246a0222b0590f0ebb342d158082d
[ "MIT" ]
null
null
null
README.md
davidmogar/wschat
32b57d8c416246a0222b0590f0ebb342d158082d
[ "MIT" ]
null
null
null
wschat ====== Wschat is a simple web chat that uses PHP on the server side and jQuery + WebSockets on the client side. The chat allows you to communicate with remote users and send YouTube videos using the included YouTube search overlay. ![wschat](http://davidmogar.leakedbits.com/uploads/github/wschat.png)
44
221
0.782468
eng_Latn
0.991622
aafe77c7c2da88044109840e0339a6208f119ab3
3,585
md
Markdown
README.md
junaidrahim/locate-route
26a4f1c43ed79684c5e6ec016f1430b7b17bed3b
[ "MIT" ]
22
2021-02-02T15:24:33.000Z
2022-02-25T04:39:57.000Z
README.md
junaidrahim/locate-route
26a4f1c43ed79684c5e6ec016f1430b7b17bed3b
[ "MIT" ]
null
null
null
README.md
junaidrahim/locate-route
26a4f1c43ed79684c5e6ec016f1430b7b17bed3b
[ "MIT" ]
1
2022-03-01T17:08:10.000Z
2022-03-01T17:08:10.000Z
# locate_route A command line tool to grab the geographical location of the hops from `traceroute`. It's like `traceroute`, but it also returns the actual location of each IP address. ## Installation Make sure you have `g++`, `curl`, `cmake`, `make`, and `traceroute` installed and set up on your machine. Run the following to install the binary: ```bash curl -o- https://raw.githubusercontent.com/junaidrahim/locate-route/main/install.sh | sudo bash ``` You can run `locate_route` in your shell now. ## Build from Source 1. `git clone https://github.com/junaidrahim/locate-route` 2. `mkdir build && cd build` 3. `cmake ..` 4. `make` The binary `locate_route` will be built in the `build/` directory. You can run it from there, or copy it into `$PATH` to access it from anywhere in your terminal. ## Usage ```sh-session $ locate_route --help locateroute - Get location information of the traceroute hops USAGE: locateroute [ip or domain name] FLAGS: -h, --help Display Help ``` ```sh-session $ locate_route github.com traceroute to github.com (13.234.176.102), 30 hops max, 60 byte packets 1 _gateway (192.168.0.1) 1.539 ms 3.299 ms 3.249 ms 2 4-144-31-103.dnainfotel.com (103.31.144.4) 3.206 ms 3.164 ms 3.121 ms Location: Virār, Maharashtra, India, Asia Zip: 401303, Coordinates: 19.466700,72.800003 3 1-144-31-103.dnainfotel.com (103.31.144.1) 4.287 ms 4.063 ms 4.166 ms Location: Virār, Maharashtra, India, Asia Zip: 401303, Coordinates: 19.466700,72.800003 4 26-69-245-103.dnainfotel.com (103.245.69.26) 6.168 ms 7.008 ms 6.620 ms Location: Saint Thomas Mount, Tamil Nadu, India, Asia Zip: 600001, Coordinates: 13.052400,80.250801 5 52.46.166.32 (52.46.166.32) 4.609 ms 4.869 ms 4.809 ms Location: Ashburn, Virginia, United States, North America Zip: 20147, Coordinates: 39.043701,-77.474197 6 52.95.67.22 (52.95.67.22) 12.384 ms 52.95.67.66 (52.95.67.66) 4.752 ms 52.95.67.22 (52.95.67.22) 14.081 ms Location: Powai, Maharashtra, India, Asia Zip: 400070, Coordinates: 19.076000,72.877701 7
52.95.64.196 (52.95.64.196) 9.312 ms 52.95.64.248 (52.95.64.248) 9.294 ms 52.95.64.252 (52.95.64.252) 9.275 ms Location: Powai, Maharashtra, India, Asia Zip: 400070, Coordinates: 19.076000,72.877701 8 52.95.64.195 (52.95.64.195) 12.942 ms 52.95.64.211 (52.95.64.211) 12.917 ms 52.95.64.253 (52.95.64.253) 8.346 ms Location: Powai, Maharashtra, India, Asia Zip: 400070, Coordinates: 19.076000,72.877701 9 52.95.67.127 (52.95.67.127) 12.874 ms 52.95.67.39 (52.95.67.39) 12.853 ms 52.95.67.17 (52.95.67.17) 12.831 ms Location: Powai, Maharashtra, India, Asia Zip: 400070, Coordinates: 19.076000,72.877701 10 52.95.67.178 (52.95.67.178) 9.119 ms 52.95.67.182 (52.95.67.182) 8.238 ms 52.95.65.135 (52.95.65.135) 10.045 ms Location: Powai, Maharashtra, India, Asia Zip: 400070, Coordinates: 19.076000,72.877701 11 * * * 12 * * * 13 * * * 14 * * * 15 * * * ``` ## How does it work? `locate_route` parses the output from traceroute to find all the IP addresses and then uses the API provided by https://ipstack.com/ to determine the location. ## Dependencies C++17 and CMake 3.15 * [argh](https://github.com/adishavit/argh) for setting up the CLI * [cpp-httplib](https://github.com/yhirose/cpp-httplib) for HTTP requests * [nlohmann/json](https://github.com/nlohmann/json) for parsing JSON ## LICENSE Copyright (c) **Junaid Rahim**. All rights reserved. Licensed under the [MIT](LICENSE) License <br> [![forthebadge](https://forthebadge.com/images/badges/made-with-c-plus-plus.svg)](https://forthebadge.com)
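The parsing step described under "How does it work?" can be sketched in a few lines. This is not the tool's actual C++ implementation, just a hedged illustration of the idea (the regex and function name are mine): pull every IPv4 address out of traceroute's output, then look each one up against the geolocation API.

```python
import re

# Dotted-quad matcher; a sketch, not locate_route's real parser.
IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def extract_ips(traceroute_output: str) -> list:
    """Return every IPv4 address that appears in traceroute output."""
    return IPV4.findall(traceroute_output)

line = " 5  52.46.166.32 (52.46.166.32)  4.609 ms  4.869 ms  4.809 ms"
print(extract_ips(line))  # ['52.46.166.32', '52.46.166.32']
```

Each found address would then be sent to the ipstack lookup endpoint (with your access key) to retrieve the city, region, and coordinates shown in the session above.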
32.297297
165
0.705439
eng_Latn
0.576219
aafeb8cbe54e49e84e7b4738cfb089b3785b537d
4,011
md
Markdown
_pages/about.md
sonalexle/sonalexle.github.io
e1de8eb0b821e8cf53f863598af4c47286e411ac
[ "MIT" ]
null
null
null
_pages/about.md
sonalexle/sonalexle.github.io
e1de8eb0b821e8cf53f863598af4c47286e411ac
[ "MIT" ]
null
null
null
_pages/about.md
sonalexle/sonalexle.github.io
e1de8eb0b821e8cf53f863598af4c47286e411ac
[ "MIT" ]
null
null
null
--- permalink: / title: "About me" excerpt: "About me" author_profile: true redirect_from: - /about/ - /about.html --- Hei! I'm a student at [Aalto University](https://www.aalto.fi/en), and I'm pursuing a Bachelor's degree in Data Science, with a minor in Statistics. I expect to graduate in summer 2022, probably in June. I will undertake Master's studies in fall 2022, either at Aalto (I'm enrolled in a continuous BSc + MSc program) or elsewhere. I'm also working as a research assistant for the Aalto Probabilistic Machine Learning group. My undergraduate studies cover the foundations of data science: computer science, math, and statistics. In addition, through university courses, projects, and internships, I have obtained a solid foundation in machine learning (ML), grounded in math and computer science. In general, I am interested in machine learning engineering, probabilistic machine learning, deep learning, and causality and its connections with machine learning. <!-- # Interests ## Research interests My research interests include causality (causal inference and causal discovery), deep learning (DL), and probabilistic machine learning (PML). With causal analysis, one can transition from the pattern recognition done by typical ML methods to interventions and counterfactual reasoning, a desirable property of intelligent systems. I mainly work with causal inference assumptions and treatment effect estimation methods, such as double machine learning, doubly robust estimators, and instrumental variables. One could also estimate treatment effects with arbitrary ML models. However, to do so, one needs to be informed of which explanatory variables to include. This information is usually encoded as a directed acyclic graph (DAG), which is one type of probabilistic graphical model (PGM). Regarding PML, I'm interested in latent variable modeling, PGMs, Bayesian inference, and variational inference.
I believe that probabilistic modeling could make ML more transparent by providing outputs augmented with uncertainty measures such as credible intervals (instead of just a number like, e.g., random forests), which would be more informative to users and decision-makers. When it comes to DL, I like to study various models for different applications, such as natural language processing or computer vision. I'm familiar with most common models and methods: CNN, RNN (LSTM, GRU), transformers, GAN, and VAE. I'm not so interested in coming up with new modules such as attention, but rather I like to discover unexpected ways to successfully integrate different building blocks and build a good DL model. At the intersection of causality, DL, and PML, I'm investigating how to encode causal relationships as inductive biases for ML models while learning how to improve the performance of causal estimators with complex ML methods such as neural networks. That is, I'd like to know how ML could help causality and vice versa. However, both causal inference and ML methods typically lack uncertainty quantification, and this is where the probabilistic modeling framework could help. ## Machine learning engineering Today's ML research is still disconnected from the industry, whose services could affect millions of people. Whereas ML research typically focuses on inventing models with high accuracies, real-world software services need continuous deployment and updates reflecting the surrounding environment. This is especially true for ML-based services because the deployed ML model is invalid once the environment differs from data used to train the model. The emerging field of MLOps aims to address this by, e.g., providing tools for integrating ML with a larger service by leveraging best practices from software development and DevOps. 
As a first step towards becoming an ML engineer and bridging the gap between ML research and industry, I have recently completed a course [project](https://sonalexle.github.io/viral-tweets/) about MLOps and the BERT language model. -->
95.5
864
0.805535
eng_Latn
0.999201
aafecf54cbfefcfd815bfb260f26165d441fe2ae
2,768
md
Markdown
example/view-layout/demo5/README.md
Andy0570/swiftui-example
15e7f59cfe58515b2d304e0a0982aa52b1bf733f
[ "MIT" ]
41
2021-07-18T16:45:03.000Z
2022-03-21T15:12:07.000Z
example/view-layout/demo5/README.md
Andy0570/swiftui-example
15e7f59cfe58515b2d304e0a0982aa52b1bf733f
[ "MIT" ]
2
2021-10-17T18:02:10.000Z
2022-01-22T07:08:41.000Z
example/view-layout/demo5/README.md
Andy0570/swiftui-example
15e7f59cfe58515b2d304e0a0982aa52b1bf733f
[ "MIT" ]
5
2021-11-04T03:37:26.000Z
2022-03-22T13:06:24.000Z
How to return different view types?
===

Any SwiftUI `body` property can automatically return different views, thanks to a special attribute called `@ViewBuilder`. This is implemented using Swift's result builder system, which understands how to present two different views depending on the state of our app.

However, that capability is not automatic everywhere, which means any custom properties you create must return the same view type.

There are four ways to solve this problem.

The first option is to wrap the output in a `Group`, so that whether you send back an image or a text view, it is always returned inside a group:

```swift
struct ContentView: View {
    var tossResult: some View {
        Group {
            if Bool.random() {
                Image("laser-show")
                    .resizable()
                    .scaledToFit()
            } else {
                Text("Better luck next time")
                    .font(.title)
            }
        }
        .frame(width: 400, height: 300)
    }

    var body: some View {
        VStack {
            Text("Coin Flip")
                .font(.largeTitle)

            tossResult
        }
    }
}
```

The second is to return the type-erased wrapper `AnyView`:

```swift
struct ContentView: View {
    var tossResult: some View {
        if Bool.random() {
            return AnyView(Image("laser-show").resizable().scaledToFit())
        } else {
            return AnyView(Text("Better luck next time").font(.title))
        }
    }

    var body: some View {
        VStack {
            Text("Coin Flip")
                .font(.largeTitle)

            tossResult
                .frame(width: 400, height: 300)
        }
    }
}
```

If you haven't come across this concept before, `AnyView` effectively forces Swift to forget the specific type inside it, so the two views look the same. This comes at a performance cost, though, so don't use it often. Although `Group` and `AnyView` produce the same layout, it is usually better to use `Group`, because it is more efficient for SwiftUI.

The third option is to apply the `@ViewBuilder` attribute yourself to any property that needs it, like this:

```swift
struct ContentView: View {
    @ViewBuilder var tossResult: some View {
        if Bool.random() {
            Image("laser-show")
                .resizable()
                .scaledToFit()
        } else {
            Text("Better luck next time")
                .font(.title)
        }
    }

    var body: some View {
        VStack {
            Text("Coin Flip")
                .font(.largeTitle)

            tossResult
                .frame(width: 400, height: 300)
        }
    }
}
```

That works, but honestly, if you find yourself wanting to use `@ViewBuilder`, you should question whether you are trying to put too much into a single view.

The fourth solution (which works best in most cases) is to break the view into smaller views and then combine them as needed:

```swift
struct TossResult: View {
    var body: some View {
        if Bool.random() {
            Image("laser-show")
                .resizable()
                .scaledToFit()
        } else {
            Text("Better luck next time")
                .font(.title)
        }
    }
}

struct ContentView: View {
    var body: some View {
        VStack {
            Text("Coin Flip")
                .font(.largeTitle)

            TossResult()
                .frame(width: 400, height: 300)
        }
    }
}
```

This is really useful, helps separate logic from layout, and has the added benefit of making views more reusable elsewhere in your app. SwiftUI automatically collapses your view hierarchy, so there is no meaningful performance difference when you split views up.
22.504065
116
0.52276
yue_Hant
0.666874
aaff056042f981649f2b57501b6c85383520b5fe
3,586
md
Markdown
docs/FIPS.md
saavan251/tink
acddae66fbcd38d0502c0e65c051a780eca7d6a1
[ "Apache-2.0" ]
1
2021-02-08T14:57:59.000Z
2021-02-08T14:57:59.000Z
docs/FIPS.md
saavan251/tink
acddae66fbcd38d0502c0e65c051a780eca7d6a1
[ "Apache-2.0" ]
null
null
null
docs/FIPS.md
saavan251/tink
acddae66fbcd38d0502c0e65c051a780eca7d6a1
[ "Apache-2.0" ]
1
2020-11-16T17:34:09.000Z
2020-11-16T17:34:09.000Z
# Tink - FIPS 140-2 Currently, Tink itself is not [FIPS 140-2](https://csrc.nist.gov/publications/detail/fips/140/2/final) validated. However, it supports several FIPS 140-2 approved algorithms, and the underlying implementations *can* utilize validated cryptographic modules like [BoringSSL's BoringCrypto](https://csrc.nist.gov/Projects/Cryptographic-Module-Validation-Program/Certificate/3678). Tink includes a [WORKSPACE](https://github.com/google/tink/blob/master/cc/third_party/boringssl_fips) for building BoringSSL in FIPS mode. # Algorithms supported The following algorithms in Tink are approved according to [FIPS 140-2](https://csrc.nist.gov/publications/detail/fips/140/2/final): * Authenticated Encryption * AES-GCM ([FIPS 140-2 Annex A](https://csrc.nist.gov/CSRC/media/Publications/fips/140/2/final/documents/fips1402annexa.pdf)) * AES-CTR-HMAC-SHA256 ([FIPS 140-2 Annex A](https://csrc.nist.gov/CSRC/media/Publications/fips/140/2/final/documents/fips1402annexa.pdf)) * MAC * HMAC-SHA256 ([FIPS 140-2 Annex A](https://csrc.nist.gov/CSRC/media/Publications/fips/140/2/final/documents/fips1402annexa.pdf)) * AES-CMAC ([FIPS 140-2 Annex A](https://csrc.nist.gov/CSRC/media/Publications/fips/140/2/final/documents/fips1402annexa.pdf)) * Digital Signatures * ECDSA ([FIPS 140-2 Annex A](https://csrc.nist.gov/CSRC/media/Publications/fips/140/2/final/documents/fips1402annexa.pdf)) * RSA-SSA-PKCS1 ([FIPS 140-2 Annex A](https://csrc.nist.gov/CSRC/media/Publications/fips/140/2/final/documents/fips1402annexa.pdf)) * RSA-SSA-PSS ([FIPS 140-2 Annex A](https://csrc.nist.gov/CSRC/media/Publications/fips/140/2/final/documents/fips1402annexa.pdf)) # FIPS-only mode If you are required to use FIPS 140-2 approved algorithms and validated implementations, then you can build Tink in FIPS-only mode. This will restrict usage to approved algorithms *and* check whether Tink is utilizing a validated cryptographic module.
Specifically this will change the behavior of Tink in the following way: * `Register()` functions will only register algorithms which have a FIPS validated implementation. This means that you will *only* be able to use Keysets for algorithms which use a validated cryptographic module. * Tink will check if BoringSSL has been build with the BoringCrypto module. Calls to primitives will return an `INTERNAL` error when the module is not available. * Using primitives in `subtle/` will be restricted to algorithms which utilize a validated cryptographic module. Currently this is only supported in the C++ version of Tink. If you are *not* building Tink in FIPS only mode, it will still utilize validated implementations for *some* algorithms but not restrict the usage of other algorithms. ## FIPS only mode in C++ Tink uses [BoringCrypto](https://csrc.nist.gov/Projects/Cryptographic-Module-Validation-Program/Certificate/3678) in C++ to provide access to a validated cryptographic module. Note that this means that the following algorithms are not available, as they have not been validated: * AES-CMAC * RSA-SSA-PKCS1 is restricted to 3072-bit modulus * RSA-SSA-PSS is restricted to 3072-bit modulus In order to build Tink in FIPS-only mode, you can simply set a flag at compile time: ```shell bazel build ... --define=use_only_fips=on ``` If you want to check at runtime whether Tink has been build in FIPS only mode, you can include the header `config/tink_fips.h` which provides the constant `kUseOnlyFips`.
41.697674
123
0.755159
eng_Latn
0.854499
aaff813455c9693018a80b7532e78d89c38604f6
14,256
md
Markdown
node_modules/fast-json-stringify/README.md
imhash/graf_csv_server
8cb1c93fb2d5d5abc63878ca07e8cb715ca23bb6
[ "MIT" ]
86
2020-08-05T21:04:32.000Z
2022-02-08T08:55:41.000Z
node_modules/fast-json-stringify/README.md
imhash/graf_csv_server
8cb1c93fb2d5d5abc63878ca07e8cb715ca23bb6
[ "MIT" ]
2
2021-05-21T12:20:47.000Z
2021-11-06T00:38:52.000Z
node_modules/fast-json-stringify/README.md
imhash/graf_csv_server
8cb1c93fb2d5d5abc63878ca07e8cb715ca23bb6
[ "MIT" ]
16
2020-08-24T08:28:58.000Z
2021-12-14T12:44:36.000Z
# fast-json-stringify

[![js-standard-style](https://img.shields.io/badge/code%20style-standard-brightgreen.svg?style=flat)](http://standardjs.com/)
[![Build Status](https://dev.azure.com/fastify/fastify/_apis/build/status/fastify.fast-json-stringify?branchName=master)](https://dev.azure.com/fastify/fastify/_build/latest?definitionId=3&branchName=master)
[![Build Status](https://travis-ci.org/fastify/fast-json-stringify.svg?branch=master)](https://travis-ci.org/fastify/fast-json-stringify)
[![NPM downloads](https://img.shields.io/npm/dm/fast-json-stringify.svg?style=flat)](https://www.npmjs.com/package/fast-json-stringify)

__fast-json-stringify__ is significantly faster than `JSON.stringify()` for small payloads. Its performance advantage shrinks as your payload grows. It pairs well with [__flatstr__](https://www.npmjs.com/package/flatstr), which triggers a V8 optimization that improves performance when eventually converting the string to a `Buffer`.

##### Benchmarks

- Machine: `EX41S-SSD, Intel Core i7, 4Ghz, 64GB RAM, 4C/8T, SSD`
- Node.js `v10.15.2`

```
FJS creation x 8,951 ops/sec ±0.51% (92 runs sampled)
JSON.stringify array x 5,146 ops/sec ±0.32% (97 runs sampled)
fast-json-stringify array x 8,402 ops/sec ±0.62% (95 runs sampled)
fast-json-stringify-uglified array x 8,474 ops/sec ±0.49% (93 runs sampled)
JSON.stringify long string x 13,061 ops/sec ±0.25% (98 runs sampled)
fast-json-stringify long string x 13,059 ops/sec ±0.21% (98 runs sampled)
fast-json-stringify-uglified long string x 13,099 ops/sec ±0.14% (98 runs sampled)
JSON.stringify short string x 6,295,988 ops/sec ±0.28% (98 runs sampled)
fast-json-stringify short string x 43,335,575 ops/sec ±1.24% (86 runs sampled)
fast-json-stringify-uglified short string x 40,042,871 ops/sec ±1.38% (93 runs sampled)
JSON.stringify obj x 2,557,026 ops/sec ±0.20% (97 runs sampled)
fast-json-stringify obj x 9,001,890 ops/sec ±0.48% (90 runs sampled)
fast-json-stringify-uglified obj x 9,073,607 ops/sec ±0.41% (94 runs sampled)
```

#### Table of contents:

- <a href="#example">`Example`</a>
- <a href="#api">`API`</a>
  - <a href="#fastJsonStringify">`fastJsonStringify`</a>
  - <a href="#specific">`Specific use cases`</a>
  - <a href="#required">`Required`</a>
  - <a href="#missingFields">`Missing fields`</a>
  - <a href="#patternProperties">`Pattern Properties`</a>
  - <a href="#additionalProperties">`Additional Properties`</a>
  - <a href="#anyof">`AnyOf`</a>
  - <a href="#ref">`Reuse - $ref`</a>
  - <a href="#long">`Long integers`</a>
  - <a href="#uglify">`Uglify`</a>
  - <a href="#nullable">`Nullable`</a>
- <a href="#caveat">`Caveat`</a>
- <a href="#acknowledgements">`Acknowledgements`</a>
- <a href="#license">`License`</a>

<a name="example"></a>
## Example

```js
const fastJson = require('fast-json-stringify')
const stringify = fastJson({
  title: 'Example Schema',
  type: 'object',
  properties: {
    firstName: {
      type: 'string'
    },
    lastName: {
      type: 'string'
    },
    age: {
      description: 'Age in years',
      type: 'integer'
    },
    reg: {
      type: 'string'
    }
  }
})

console.log(stringify({
  firstName: 'Matteo',
  lastName: 'Collina',
  age: 32,
  reg: /"([^"]|\\")*"/
}))
```

<a name="api"></a>
## API

<a name="fastJsonStringify"></a>
### fastJsonStringify(schema)

Build a `stringify()` function based on [jsonschema](http://json-schema.org/).

Supported types:

* `'string'`
* `'integer'`
* `'number'`
* `'array'`
* `'object'`
* `'boolean'`
* `'null'`

And nested ones, too.

<a name="specific"></a>
#### Specific use cases

| Instance   | Serialized as                |
| -----------|------------------------------|
| `Date`     | `string` via `toISOString()` |
| `RegExp`   | `string`                     |

<a name="required"></a>
#### Required

You can set specific fields of an object as required in your schema by adding the field name inside the `required` array in your schema.

Example:
```javascript
const schema = {
  title: 'Example Schema with required field',
  type: 'object',
  properties: {
    nickname: {
      type: 'string'
    },
    mail: {
      type: 'string'
    }
  },
  required: ['mail']
}
```

If the object to stringify is missing the required field(s), `fast-json-stringify` will throw an error.

<a name="missingFields"></a>
#### Missing fields

If a field *is present* in the schema (and is not required) but it *is not present* in the object to stringify, `fast-json-stringify` will not write it in the final string.

Example:
```javascript
const stringify = fastJson({
  title: 'Example Schema',
  type: 'object',
  properties: {
    nickname: {
      type: 'string'
    },
    mail: {
      type: 'string'
    }
  }
})

const obj = {
  mail: '[email protected]'
}

console.log(stringify(obj)) // '{"mail":"[email protected]"}'
```

<a name="defaults"></a>
#### Defaults

`fast-json-stringify` supports the `default` jsonschema key in order to serialize a value if it is `undefined` or not present.

Example:
```javascript
const stringify = fastJson({
  title: 'Example Schema',
  type: 'object',
  properties: {
    nickname: {
      type: 'string',
      default: 'the default string'
    }
  }
})

console.log(stringify({})) // '{"nickname":"the default string"}'
console.log(stringify({ nickname: 'my-nickname' })) // '{"nickname":"my-nickname"}'
```

<a name="patternProperties"></a>
#### Pattern properties

`fast-json-stringify` supports pattern properties as defined by JSON schema. *patternProperties* must be an object, where the key is a valid regex and the value is an object, declared in this way: `{ type: 'type' }`. *patternProperties* will work only for the properties that are not explicitly listed in the properties object.

Example:
```javascript
const stringify = fastJson({
  title: 'Example Schema',
  type: 'object',
  properties: {
    nickname: {
      type: 'string'
    }
  },
  patternProperties: {
    'num': {
      type: 'number'
    },
    '.*foo$': {
      type: 'string'
    }
  }
})

const obj = {
  nickname: 'nick',
  matchfoo: 42,
  otherfoo: 'str',
  matchnum: 3
}

console.log(stringify(obj)) // '{"matchfoo":"42","otherfoo":"str","matchnum":3,"nickname":"nick"}'
```

<a name="additionalProperties"></a>
#### Additional properties

`fast-json-stringify` supports additional properties as defined by JSON schema. *additionalProperties* must be an object or a boolean, declared in this way: `{ type: 'type' }`. *additionalProperties* will work only for the properties that are not explicitly listed in the *properties* and *patternProperties* objects.

If *additionalProperties* is not present or is set to `false`, every property that is not explicitly listed in the *properties* and *patternProperties* objects will be ignored, as described in <a href="#missingFields">Missing fields</a>. Missing fields are ignored to avoid having to rewrite objects before serializing. However, other schema rules would throw in similar situations. If *additionalProperties* is set to `true`, it will be used by `JSON.stringify` to stringify the additional properties.

If you want to achieve maximum performance, we strongly encourage you to use a fixed schema where possible.

Example:
```javascript
const stringify = fastJson({
  title: 'Example Schema',
  type: 'object',
  properties: {
    nickname: {
      type: 'string'
    }
  },
  patternProperties: {
    'num': {
      type: 'number'
    },
    '.*foo$': {
      type: 'string'
    }
  },
  additionalProperties: {
    type: 'string'
  }
})

const obj = {
  nickname: 'nick',
  matchfoo: 42,
  otherfoo: 'str',
  matchnum: 3,
  nomatchstr: 'valar morghulis',
  nomatchint: 313
}

console.log(stringify(obj)) // '{"matchfoo":"42","otherfoo":"str","matchnum":3,"nomatchstr":"valar morghulis",nomatchint:"313","nickname":"nick"}'
```

<a name="anyof"></a>
#### AnyOf

`fast-json-stringify` supports the anyOf keyword as defined by JSON schema. *anyOf* must be an array of valid JSON schemas. The different schemas will be tested in the specified order. The more schemas `stringify` has to try before finding a match, the slower it will be.

*anyOf* uses [ajv](https://www.npmjs.com/package/ajv) as a JSON schema validator to find the schema that matches the data. This has an impact on performance, so only use it as a last resort.

Example:
```javascript
const stringify = fastJson({
  title: 'Example Schema',
  type: 'object',
  properties: {
    'undecidedType': {
      'anyOf': [{
        type: 'string'
      }, {
        type: 'boolean'
      }]
    }
  }
})
```

<a name="if-then-else"></a>
#### If/then/else

`fast-json-stringify` supports the `if/then/else` jsonschema feature. See the [ajv documentation](https://ajv.js.org/keywords.html#ifthenelse).

Example:
```javascript
const stringify = fastJson({
  'type': 'object',
  'properties': { },
  'if': {
    'properties': {
      'kind': { 'type': 'string', 'enum': ['foobar'] }
    }
  },
  'then': {
    'properties': {
      'kind': { 'type': 'string', 'enum': ['foobar'] },
      'foo': { 'type': 'string' },
      'bar': { 'type': 'number' }
    }
  },
  'else': {
    'properties': {
      'kind': { 'type': 'string', 'enum': ['greeting'] },
      'hi': { 'type': 'string' },
      'hello': { 'type': 'number' }
    }
  }
})

console.log(stringify({
  kind: 'greeting', foo: 'FOO', bar: 42, hi: 'HI', hello: 45
})) // {"kind":"greeting","hi":"HI","hello":45}
console.log(stringify({
  kind: 'foobar', foo: 'FOO', bar: 42, hi: 'HI', hello: 45
})) // {"kind":"foobar","foo":"FOO","bar":42}
```

**NB:** don't declare the properties twice or you'll print them twice!

<a name="ref"></a>
#### Reuse - $ref

If you want to reuse a definition of a value, you can use the property `$ref`. The value of `$ref` must be a string in [JSON Pointer](https://tools.ietf.org/html/rfc6901) format.

Example:
```javascript
const schema = {
  title: 'Example Schema',
  definitions: {
    num: {
      type: 'object',
      properties: {
        int: {
          type: 'integer'
        }
      }
    },
    str: {
      type: 'string'
    }
  },
  type: 'object',
  properties: {
    nickname: {
      $ref: '#/definitions/str'
    }
  },
  patternProperties: {
    'num': {
      $ref: '#/definitions/num'
    }
  },
  additionalProperties: {
    $ref: '#/definitions/def'
  }
}

const stringify = fastJson(schema)
```

If you need to use an external definition, you can pass it as an option to `fast-json-stringify`.

Example:
```javascript
const schema = {
  title: 'Example Schema',
  type: 'object',
  properties: {
    nickname: {
      $ref: 'strings#/definitions/str'
    }
  },
  patternProperties: {
    'num': {
      $ref: 'numbers#/definitions/num'
    }
  },
  additionalProperties: {
    $ref: 'strings#/definitions/def'
  }
}

const externalSchema = {
  numbers: {
    definitions: {
      num: {
        type: 'object',
        properties: {
          int: {
            type: 'integer'
          }
        }
      }
    }
  },
  strings: require('./string-def.json')
}

const stringify = fastJson(schema, { schema: externalSchema })
```

External definitions can also reference each other.

Example:
```javascript
const schema = {
  title: 'Example Schema',
  type: 'object',
  properties: {
    foo: {
      $ref: 'strings#/definitions/foo'
    }
  }
}

const externalSchema = {
  strings: {
    definitions: {
      foo: {
        $ref: 'things#/definitions/foo'
      }
    }
  },
  things: {
    definitions: {
      foo: {
        type: 'string'
      }
    }
  }
}

const stringify = fastJson(schema, { schema: externalSchema })
```

<a name="long"></a>
#### Long integers

Long integers (64-bit) are supported using the [long](https://github.com/dcodeIO/long.js) module.

Example:
```javascript
const Long = require('long')

const stringify = fastJson({
  title: 'Example Schema',
  type: 'object',
  properties: {
    id: {
      type: 'integer'
    }
  }
})

const obj = {
  id: Long.fromString('18446744073709551615', true)
}

console.log(stringify(obj)) // '{"id":18446744073709551615}'
```

<a name="uglify"></a>
#### Uglify

If you want to squeeze a little bit more performance out of the serialization at the cost of readability in the generated code, you can pass `uglify: true` as an option. Note that you have to manually install `uglify-es` in order for this to work. Only version 3 is supported.

Note that if you are using Node 8.3.0 or newer, there are no performance gains from using Uglify. See https://www.nearform.com/blog/node-js-is-getting-a-new-v8-with-turbofan/

Example:
```javascript
const stringify = fastJson({
  title: 'Example Schema',
  type: 'object',
  properties: {
    id: {
      type: 'integer'
    }
  }
}, { uglify: true })

// stringify is now minified code
console.log(stringify({ some: 'object' })) // '{"some":"object"}'
```

<a name="nullable"></a>
#### Nullable

According to the [Open API 3.0 specification](https://swagger.io/docs/specification/data-models/data-types/#null), a value that can be null must be declared `nullable`.

##### Nullable object
```javascript
const stringify = fastJson({
  'title': 'Nullable schema',
  'type': 'object',
  'nullable': true,
  'properties': {
    'product': {
      'nullable': true,
      'type': 'object',
      'properties': {
        'name': {
          'type': 'string'
        }
      }
    }
  }
})

console.log(stringify({ product: { name: 'hello' } })) // '{"product":{"name":"hello"}}'
console.log(stringify({ product: null })) // '{"product":null}'
console.log(stringify(null)) // null
```

Otherwise, instead of raising an error, null values will be coerced as follows:

- `integer` -> `0`
- `number` -> `0`
- `string` -> `""`
- `boolean` -> `false`

<a name="caveat"></a>
## Caveat

In order to achieve the lowest cost/highest performance serialization, `fast-json-stringify` creates and compiles a function (using the `Function` constructor) on initialization. While the `schema` is currently validated for any developer errors, it is recommended that you do not allow user input to directly supply a schema. It can't be guaranteed that allowing user input for the schema couldn't feasibly expose an attack vector.

<a name="acknowledgements"></a>
## Acknowledgements

This project was kindly sponsored by [nearForm](http://nearform.com).

<a name="license"></a>
## License

MIT
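The compile-once approach described in the caveat can be illustrated with a toy serializer. This is a minimal sketch for illustration only, not the library's actual code generator: it supports only flat objects with string and number/integer properties, and the `compileStringify` helper is hypothetical.

```javascript
// Toy schema-compiled serializer: generate a function body from the
// schema once, compile it with the Function constructor, and reuse the
// compiled function for every call. NOT fast-json-stringify's real
// generator -- just a sketch of the same technique.
function compileStringify (schema) {
  const statements = []
  for (const [key, prop] of Object.entries(schema.properties)) {
    // Pre-serialize the '"key":' prefix into the generated source.
    const prefix = JSON.stringify(JSON.stringify(key) + ':')
    if (prop.type === 'string') {
      statements.push(`json += ${prefix} + JSON.stringify(obj[${JSON.stringify(key)}])`)
    } else if (prop.type === 'integer' || prop.type === 'number') {
      statements.push(`json += ${prefix} + Number(obj[${JSON.stringify(key)}])`)
    }
  }
  const body = `
    let json = '{'
    ${statements.join("\n    json += ','\n    ")}
    return json + '}'
  `
  // Compiled once here; subsequent calls pay no schema-walking cost.
  return new Function('obj', body)
}

const stringify = compileStringify({
  type: 'object',
  properties: {
    name: { type: 'string' },
    age: { type: 'integer' }
  }
})

console.log(stringify({ name: 'Matteo', age: 32 }))
// '{"name":"Matteo","age":32}'
```

The sketch also makes the caveat concrete: whatever ends up in `body` is executed as code, which is why a schema should never come directly from user input.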
26.254144
610
0.640713
eng_Latn
0.859908
c90139b346ceb61b268ef484306d1ab47902f2a3
3,398
md
Markdown
content/ja/graphing/functions/arithmetic.md
shlomoartzi/documentation
6f9ca506d6cd2d76f0ffa36c76e3bd4ab61b834a
[ "BSD-3-Clause" ]
null
null
null
content/ja/graphing/functions/arithmetic.md
shlomoartzi/documentation
6f9ca506d6cd2d76f0ffa36c76e3bd4ab61b834a
[ "BSD-3-Clause" ]
null
null
null
content/ja/graphing/functions/arithmetic.md
shlomoartzi/documentation
6f9ca506d6cd2d76f0ffa36c76e3bd4ab61b834a
[ "BSD-3-Clause" ]
null
null
null
---
title: Arithmetic
kind: documentation
---

## Absolute value

| Function | Description | Example |
| :---- | :------- | :--------- |
| `abs()` | Graphs the absolute value of the metric. | `abs(<METRIC_NAME>{*})` |

This transforms the sine-wave timeseries `sin{*}`:

{{< img src="graphing/functions/arithmetic/sinus.png" alt="Sinus function" responsive="true" style="width:80%;">}}

into the following `abs(sin{*})`:

{{< img src="graphing/functions/arithmetic/sinus_abs.png" alt="Sinus function with abs" responsive="true" style="width:80%;">}}

## Logarithms

### log2

| Function | Description | Example |
| :---- | :------- | :--------- |
| `log2()` | Graphs the base-2 logarithm of the metric. | `log2(<METRIC_NAME>{*})` |

Example: given a metric `x{*}` that increases by 1 at each data point, `log2(x{*})` graphs as follows:

{{< img src="graphing/functions/arithmetic/log2.png" alt=" log2 function" responsive="true" style="width:80%;">}}

### log10

| Function | Description | Example |
| :---- | :------- | :--------- |
| `log10()` | Graphs the base-10 logarithm of the metric. | `log10(<METRIC_NAME>{*})` |

Example: given a metric `x{*}` that increases by 1 at each data point, `log10(x{*})` graphs as follows:

{{< img src="graphing/functions/arithmetic/log10.png" alt="log10 function" responsive="true" style="width:80%;">}}

## Cumulative sum

| Function | Description | Example |
| :---- | :------- | :--------- |
| `cumsum()` | Graphs the cumulative sum of the metric over the visible time window. | `cumsum(<METRIC_NAME>{*})` |

Example: given a metric `const_1{*}` that is constant with value `1`, `cumsum(const_1{*})` graphs as follows:

{{< img src="graphing/functions/arithmetic/cumsum.png" alt="cum sum function with abs" responsive="true" style="width:80%;">}}

## Integral

| Function | Description | Example |
| :---- | :------- | :--------- |
| `integral()` | Graphs the integral of the metric. | `integral(<METRIC_NAME>{*})` |

**Note**: In Datadog, `integral()` is the cumulative sum of `[time delta] x [value delta]` over all adjacent pairs of points in the visible time window for the given metric.

{{< img src="graphing/functions/arithmetic/integral.png" alt="integral function with abs" responsive="true" style="width:80%;">}}

## Other functions

{{< whatsnext desc="Consult the other available functions:" >}}
    {{< nextlink href="/graphing/functions/algorithms" >}}Algorithms: implement anomaly or outlier detection on your metrics.{{< /nextlink >}}
    {{< nextlink href="/graphing/functions/count" >}}Count: count the non-zero or non-null values of your metric.{{< /nextlink >}}
    {{< nextlink href="/graphing/functions/interpolation" >}}Interpolation: fill in or set default values for your metric.{{< /nextlink >}}
    {{< nextlink href="/graphing/functions/rank" >}}Rank: select only a subset of your metrics.{{< /nextlink >}}
    {{< nextlink href="/graphing/functions/rate" >}}Rate: compute a custom derivative over your metric.{{< /nextlink >}}
    {{< nextlink href="/graphing/functions/regression" >}}Regression: apply a machine-learning function to your metric.{{< /nextlink >}}
    {{< nextlink href="/graphing/functions/rollup" >}}Rollup: control the number of raw points used in your metric.{{< /nextlink >}}
    {{< nextlink href="/graphing/functions/smoothing" >}}Smoothing: smooth the variations of your metric.{{< /nextlink >}}
    {{< nextlink href="/graphing/functions/timeshift" >}}Timeshift: shift your metric's data points along the timeline.{{< /nextlink >}}
{{< /whatsnext >}}
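The `cumsum()` and `integral()` definitions above can be reproduced on a list of `(timestamp, value)` points. This is a plain-Python sketch of the stated definitions, not Datadog's actual implementation:

```python
from itertools import accumulate


def cumsum(values):
    """Cumulative sum of the values over the visible time window."""
    return list(accumulate(values))


def integral(points):
    """Cumulative sum of [time delta] x [value delta] over all
    adjacent pairs of (timestamp, value) points, per the note above."""
    deltas = [
        (t2 - t1) * (v2 - v1)
        for (t1, v1), (t2, v2) in zip(points, points[1:])
    ]
    return list(accumulate(deltas))


# A constant metric of value 1 accumulates linearly under cumsum():
print(cumsum([1, 1, 1, 1]))                  # [1, 2, 3, 4]

# Points sampled every 10s with values rising by 2 each step:
print(integral([(0, 0), (10, 2), (20, 4)]))  # [20, 40]
```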
41.950617
129
0.52266
yue_Hant
0.569128
c901678f79c33f2a69e4280891185c082b544ac5
836
md
Markdown
README.md
stacksta/Advanced_Platformer_Template-PSP
cd8414fa6758b267782f4eb74717ccd7de6982b5
[ "MIT" ]
1
2021-12-05T15:38:12.000Z
2021-12-05T15:38:12.000Z
README.md
stacksta/Advanced_Platformer_Template-PSP
cd8414fa6758b267782f4eb74717ccd7de6982b5
[ "MIT" ]
null
null
null
README.md
stacksta/Advanced_Platformer_Template-PSP
cd8414fa6758b267782f4eb74717ccd7de6982b5
[ "MIT" ]
null
null
null
# Advanced Platformer Template for the PSP

-------------------------------------------------------------------------------------------------------------

<img src="https://raw.githubusercontent.com/stacksta/Advanced_Platformer_Template-PSP/master/img/platformer1.png" alt="preview_1" width="480" height="272"/>
<img src="https://raw.githubusercontent.com/stacksta/Advanced_Platformer_Template-PSP/master/img/platformer2.png" alt="preview_2" width="480" height="272"/>

### TODO

- [✔] Scrolling Camera
- [✔] Enemies
- [ ] Double Jump
- [✔] HUD
- [ ] Menu

### Credits

* Uses the [openTRI](https://github.com/SamRH/openTRI) game engine
* Art by [Pixel Frog](https://pixelfrog-assets.itch.io/kings-and-pigs)

### License

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
38
156
0.635167
yue_Hant
0.196751
c90192522e52b1ee2e335be3745030a742de1d7d
1,560
md
Markdown
content/post/2016-02-20-/index.md
xunkeichiu/mysite
ddb75427e018cadec3c041ae446b0eee7e7e5e4d
[ "MIT" ]
null
null
null
content/post/2016-02-20-/index.md
xunkeichiu/mysite
ddb75427e018cadec3c041ae446b0eee7e7e5e4d
[ "MIT" ]
null
null
null
content/post/2016-02-20-/index.md
xunkeichiu/mysite
ddb75427e018cadec3c041ae446b0eee7e7e5e4d
[ "MIT" ]
null
null
null
---
title: Aristophanes' *The Clouds*
author: Xuan Qi
date: '2016-02-20'
slug: ''
categories:
  - Reading
tags:
  - Book Review
---

The play opens with the predicament between father and son. Before line 80 it consists mainly of the monologue of Strepsiades (the father) and the half-asleep replies of Pheidippides (the son). This passage chiefly reveals the sharp conflict of values between father and son within the household: the father is frugal, the son extravagant. Through the description of the mother's way of life, this conflict can be extended to the difference between prevailing urban and rural values; and the son's extravagance has reached the point of being sustained by borrowing.

Next, the father brings the son to the "Thinkery" [95]. From the son's questioning of the father we learn how ordinary Athenians understood the philosophers: men skilled in debate, a skill that can win lawsuits. The description of the philosophers around [100] is interesting: the father pauses slightly and calls them noble men, while the son calls them scoundrels. Does the father truly believe this, or does he say so because he needs them to escape his debts? (An open question.) And what does it show that the father wants his son to learn the crooked rather than the upright logic? Although the father seems to represent rural values, plainer and more frugal, under the pressure of his debts he still makes a choice contrary to virtue. Is it then the case that the city's indulgent, consumption-oriented values are neither better nor worse, in terms of virtue, than the countryside's frugality?

The subsequent exchange between the father and the first disciple is comic from beginning to end: for instance, the logic at [140] of kicking the door ==> causing a thought to miscarry; then the observations of animals' natures, the use of "miraculous" instruments, and on to distant Sparta and Socrates (suspended in mid-air). (Does this progression abridge Socrates' path of learning, and does the connection hint that Socrates stands to the people as Athens stands to Sparta?) Even after Socrates descends from the sky to the ground, he is unable to communicate with the Athenian public. (Is there an allusion here?) What the father cares about and Socrates' probing of the imagery behind the father's phrases lie on different planes; their conversation is essentially a dialogue of the deaf, yet it continues. The father cares about deferring his debts; by which god one swears does not matter to him, while for Socrates this is the crux.

Around [370], on clouds, rain, and thunder: Socrates holds a naturalistic account of these phenomena, against the popular explanations that begin from the gods. Not knowing how far natural philosophy had developed in Athens at the time, is this passage meant as popularization (if read positively), or as an insinuation of impiety and the corruption of the city (if read negatively)? This applies especially to Socrates' declaration around [424] and the father's superficial assent and application. And around [475], the chorus leader's urging the father to teach disputation for a fee leaves me wondering what role the chorus leader plays in this play.

Likewise after [475], Socrates' teaching of the father bears his distinctive personal stamp: he insists on understanding the other's way of thinking and then solves the problem accordingly, that is, by installing the machine. The subsequent dialogue between the two again misses its mark on both sides. Before entering the Thinkery, what is the point of the father's allusion to the ancients in the figure of the "snake-cave of Trophonius"?

In the parabasis beginning at [520], the chorus leader flatters the audience at some length. May we infer from this something of Aristophanes' own character and of the play's connection with contemporary Athenian reality?

The dialogue between Socrates and the father beginning at [630] is another display of talking past each other. (Dialogue is not the same as mutual understanding.) In particular, in the exchange up to [755], consider Socrates' praise of "clever": read positively, it is simply his approval of the father's having learned to wield the power of science (knowledge); read negatively, it becomes an endorsement of the father's evasion of his debts. From Socrates' speech at [784] one should conclude that he merely thought the father had acquired the skill of thinking and applying it, not that he endorsed the debt-dodging.

[813] returns to the dialogue between father and son, effectively the father's practical application of what he has just learned, which shows that he has grasped only the surface and not the substance. (Does this hint at an ethical question: is Socrates responsible for the later speech, teaching, and distorted transmission of those who learned from him?)

At [888] begins the debate between the two sides: one styles itself the logic of new ideas, the other the non-logic, which from the subsequent exposition can be understood as custom and tradition. Logic, which ought to defend the rightful reasonableness of all things, has degenerated into an instrument of distortion [1035]. (How did the Athenian public understand logic?) Through the discussion of the old-and-new day at [1180], does the seemingly reasonable yet in fact absurd argument insinuate the logical imperfection of Athens' customary legal order? And then, at [1223], the father's deferral of his debts: is a citizen obliged to obey laws that are imperfect?

Afterwards there is again a dialogue between father and son, and the ending is intriguing: Socrates is burned out by arson, while his pupil, the father, says at the end that Socrates was impious. What does this mean?
53.793103
260
0.855128
zho_Hans
0.266472
c90206442502998d385cef39b701810fa9df6eee
14,691
md
Markdown
content/developerportal/deploy/deploy-mendix-on-microsoft-windows.md
mbyregow/docs
c70aecd9132c8aa757f4e4e41cb52cbf52a75963
[ "CC-BY-4.0" ]
null
null
null
content/developerportal/deploy/deploy-mendix-on-microsoft-windows.md
mbyregow/docs
c70aecd9132c8aa757f4e4e41cb52cbf52a75963
[ "CC-BY-4.0" ]
2
2020-01-11T12:21:51.000Z
2020-02-11T20:06:55.000Z
content/developerportal/deploy/deploy-mendix-on-microsoft-windows.md
mbyregow/docs
c70aecd9132c8aa757f4e4e41cb52cbf52a75963
[ "CC-BY-4.0" ]
1
2020-01-11T10:20:23.000Z
2020-01-11T10:20:23.000Z
--- title: "Microsoft Windows" parent: "on-premises-design" description: "How to install and configure Mendix on a system running Microsoft Windows" menu_order: 50 tags: ["deploy", "Windows", "On Premises", "Microsoft", "Mendix Service Console", "IIS", "URL Rewrite", "Client Cache", "Reverse Inbound Proxy", "Host Header"] --- ## 1 Introduction This document describes the installation and configuration of the Mendix software on a system running Microsoft (MS) Windows. It covers: * Installing the Mendix Service Console * Deploying a Mendix app * Configuring the MS Internet Information Services (IIS) server ## 2 Prerequisites {#Prerequisites} To set up an environment to run Mendix applications, you will need to install the Mendix software. For each Mendix application that will be run, a separate user (service) account is required. This section presents an overview of the setup. ![](attachments/deploy-mendix-on-windows/18580733.png) Before starting this how-to, make sure you have the following prerequisites: * MS Windows 2008 SP2 or higher * .NET 4.5 or higher * IIS 7 or higher with the following service roles enabled: * IIS Management console * Static content * ASP.NET * MS Application Request Routing (ARR) installed (for more information, see [Microsoft Application Request Routing](http://www.iis.net/downloads/microsoft/application-request-routing)) * Java Runtime 8, or other version depending on your Mendix Server Distribution. For example: * Mendix Server Distribution 4 requires Java 6 * Mendix Server Distribution 5 requires Java 7 * The Mendix Deployment Archive (MDA) of your Mendix project * The Mendix server distribution corresponding with your Mendix Studio Pro version (see the [Mendix App Store](https://appstore.home.mendix.com/link/modelers)) * A database with sufficient security rights * Suitable database servers are IBM DB2, MariaDB, MS SQL Server, MySQL, Oracle Database and PostgreSQL. 
See [System Requirements](/refguide/system-requirements) for more information * A local or domain user with the *“log on as a service”* local security policy set ## 3 Installing the Mendix Service Console To download and install the Mendix Service Console, follow these steps: 1. Download the latest version of the Mendix Service Console by following the **Related downloads** link from the [Studio Pro Download Page](https://appstore.home.mendix.com/link/modelers) of the App Store. ![](attachments/deploy-mendix-on-windows/service_console_download.png) 2. Install the Mendix Service Console by following the installation wizard. 3. Start the Mendix Service Console after the installation; the first time you launch the application, the **Preferences** dialog box will be shown (it will always be shown if no valid location is configured for the apps and server files). ![](attachments/deploy-mendix-on-windows/18580730.png) 4. In the **Preferences** dialog box, enter a **Location of apps and server files**. This location is used for storing your app files and Mendix server files. Mendix recommends using a directory that is: * NOT on the system partition * where you can easily control the security rights The app directory consists of four sub-directories: * Backup – this directory stores any database changes due to model upgrades * Log – this directory stores all of the application log files * Project – this directory contains all of your application files; within this directory you will find the directory data/files that contain all of your uploaded files * Service – this directory contains files for configuring the Windows Services In addition, there will be a file called `Settings.yaml` that contains your application configuration. ## 4 Deploying a Mendix App To deploy a Mendix app using the Mendix Service Console, follow these steps: 1. Start the Mendix Service Console. 2. Click **Add app** to add a new app. A wizard will appear for configuring the new app. 3. 
Configure the **Service Settings** as follows: * **Service name** – this name must be unique within all existing Windows services * **Display name** – this is the description of the app, which is visible as a tooltip for the app in the left bar of the Mendix Service Console or as a column in the list of Windows services * **Description** – enter a description for the application that will be visible in the Mendix Service Console * **Startup type** – select whether you want the app to be started automatically when the server starts, started with a delay, started manually, or disabled altogether * **User name** and **Password** – the app will always run under the user account given here, and the service will be installed with this user account configured (for more information, see [Prerequisites](#Prerequisites)) 4. Click **Next**. ![](attachments/deploy-mendix-on-windows/18580728.png) 5. On the **Project Files** screen, click **Select app...**. ![](attachments/deploy-mendix-on-windows/18580727.png) 6. Now select the **MDA** file that was created in Studio Pro and contains your application logic. After the installation of your MDA file, you will see which Mendix server (Mendix Runtime) version is needed. 7. Configure the **Database Settings**: * **Type** – the database server type * **Host** – the IP address or host name of the database server * **Name** – the database name * **User name** and **Password** – the database user name and password 8. Click **Next**. ![](attachments/deploy-mendix-on-windows/18580726.png) 9. On the **Common Configuration** screen, keep the default settings. These settings should only be changed if this is needed for your application setup. 10. Click **Finish** and start the application. ## 5 Configuring the Microsoft Internet Information Services Server To configure the MS IIS server, follow the steps in the sections below. 
### 5.1 Activating a Proxy in ARR In order to use the proxy functionality within ARR, you need to enable this feature within IIS. To activate a proxy in ARR, follow these steps: 1. Start the IIS Manager. 2. Select the **Server** in the **Connections** pane. 3. Open the **Application Request Routing** feature. 4. Click **Server Proxy Settings** in the **Actions** pane on the right side of the screen. 5. Select **Enable proxy** and click **Apply** in the **Actions** pane. ### 5.2 Creating a Website To create a website, follow these steps: 1. Open the IIS Manager. 2. In the **Connections** pane, right-click the **Sites** node in the tree and select **Add Web Site**. 3. In the **Add Web Site** dialog box, enter a friendly name for your web site in the **Web site name** field. 4. In the **Physical path** field, enter the physical path of your application-project-web folder (for example, *D:\Mendix\Apps\Application\Project\Web*). 5. Select the **Protocol** for the website from the **Type** list. 6. The default value in the IP address box field is **All Unassigned**. If you must specify a static IP address for the website, enter the address in the **IP address** box. 7. Enter a port number in the **Port** field. 8. Click **OK**. ### 5.3 Configuring the MIME Types To configure the MIME types, follow these steps: 1. Open the IIS Manager and navigate to the website you want to manage. 2. In the **Features View**, double-click **MIME Types**. 3. In the **Actions** pane, click **Add**. 4. In the **Add MIME Type** dialog box, add this file type: * **File name extension**: *.mxf* * **MIME type**: *text/xml* 6. Add another MIME type: * **File name extension**: *.json* * **MIME type**: *application/json* 7. Click **OK**. ### 5.4 Configuring the URL Rewrite {{% alert type="info" %}} These instructions use port 8080, which is the default port. Please use the port your Mendix App is configured for. 
{{% /alert %}} #### 5.4.1 Reverse Proxy Inbound Rules You need to add a number of rules to configure the following request handlers. Rule | Name | Pattern | Rewrite URL :--- | :--- | :--- | :--- 1 | xas | `^(xas/)(.*)` | `http://localhost:8080/{R:1}{R:2}` 2 | ws | `^(ws/)(.*)` | `http://localhost:8080/{R:1}{R:2}` 3 | ws-doc | `^(ws-doc/)(.*)` | `http://localhost:8080/{R:1}{R:2}` 4 | ws-file | `^(file)(.*)` | `http://localhost:8080/{R:1}{R:2}` 5 | link | `^(link/)(.*)` | `http://localhost:8080/{R:1}{R:2}` 6 | rest | `^(rest/)(.*)` | `http://localhost:8080/{R:1}{R:2}` 7 | rest-doc | `^(rest-doc/)(.*)` | `http://localhost:8080/{R:1}{R:2}` Follow the instructions below and replace *[Name]* with the name of the rule in the table above, *[Pattern]* with the regular expression pattern, and *[Rewrite URL]* with the Rewrite URL. Note that some patterns contain a trailing slash, `/`, when they need to point to an exact path (for example, `/ws-doc/mydoc/1234`). 1. Open the IIS Manager and navigate to the website you want to manage. 2. In the **Features View**, double-click **URL Rewrite**. 3. In the **Actions** pane on the right side of the screen, click **Add rule(s)…** to add a new rewrite rule. 4. In the **Inbound Rules** field, enter *localhost:8080*, then click **OK**. 5. Select **ReverseProxyInboundRule1** in **Features View**. 6. In the **Actions** pane on the right side of the screen, click **Rename**. 7. Rename **ReverseProxyInboundRule1** to *[Name]*. 8. Double-click **[Name]** in **Features View** to change the properties of your rule. 9. In the **Pattern field** enter `[Pattern]`. 10. In the **Rewrite URL** field, enter `[Rewrite URL]` (in the rules above this is always `http://localhost:8080/{R:1}{R:2}`). 11. Click **Apply**. 12. Click **Back to Rules**. 13. Repeat from step 3 to add all the required rules. You can also add additional request handlers in the same way. 
However, you must ensure that they come *after* the rule *add x-forwarded-proto header*, described below.

#### 5.4.2 Rule *add x-forwarded-proto header*

This is required to ensure that you can access the Swagger documentation of your published REST services. It has to be the first rule; it is described last to ensure that it is moved to the top and that additional rules are not placed above it accidentally.

1. Click **View Server Variables**.
2. Check if the server variable **HTTP_X_FORWARDED_PROTO** is listed. If it is, skip to step 7.
3. In the **Action** page, click **Add** to add the server variable.
4. Enter the **Server variable name** *HTTP_X_FORWARDED_PROTO*.
5. Click **OK**.
6. Click **Back to Rules**.
7. Click **Add rule(s)…**.
8. Click **Blank Rule**.
9. Set the **Name** to *add x-forwarded-proto header*.
10. In the **Match URL** section, set **Requested URL** to *Matches the Pattern*.
11. Set **Using** to *Regular Expressions*.
12. Set the **pattern** to `.*`.
13. Set **Ignore Case** to *true* (checked).
14. In the **Server Variables** section, click **Add**.
15. Select the server variable name **HTTP_X_FORWARDED_PROTO**.
16. Set **Value** to *https*.
17. Click **OK**.
18. In the **Action** section, select **None**.
19. Set **Stop processing of subsequent rules** to *false* (unchecked).
20. Click **Apply** in the **Action** pane to save the rule.
21. Click **Back to Rules**.
22. Select the newly created *add x-forwarded-proto header* rule and use the **Move Up** button in the Action pane to move the rule to the top of the list.

### 5.5 Disabling the Client Cache

In the application directory under **Project/Web**, you will find the `web.config` file that contains the MS IIS configuration for the application. You should add the following code to this file, between the `<system.webServer></system.webServer>` tags.
```xml
<staticContent>
    <clientCache cacheControlMode="DisableCache" />
</staticContent>
```

Afterwards, the contents of this file will be similar to the following example:

**web.config**

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <system.webServer>
        <rewrite>
            <rules>
                <rule name="add x-forwarded-proto header">
                    <match url=".*" />
                    <conditions logicalGrouping="MatchAll" trackAllCaptures="false" />
                    <serverVariables>
                        <set name="HTTP_X_FORWARDED_PROTO" value="https" />
                    </serverVariables>
                    <action type="None" />
                </rule>
                <rule name="xas" stopProcessing="true">
                    <match url="^(xas/)(.*)" />
                    <action type="Rewrite" url="http://localhost:8080/{R:1}{R:2}" />
                </rule>
                <rule name="ws" stopProcessing="true">
                    <match url="^(ws/)(.*)" />
                    <action type="Rewrite" url="http://localhost:8080/{R:1}{R:2}" />
                </rule>
                <rule name="ws-doc" stopProcessing="true">
                    <match url="^(ws-doc/)(.*)" />
                    <action type="Rewrite" url="http://localhost:8080/{R:1}{R:2}" />
                </rule>
                <rule name="ws-file" stopProcessing="true">
                    <match url="^(file)(.*)" />
                    <action type="Rewrite" url="http://localhost:8080/{R:1}{R:2}" />
                </rule>
                <rule name="link" stopProcessing="true">
                    <match url="^(link/)(.*)" />
                    <action type="Rewrite" url="http://localhost:8080/{R:1}{R:2}" />
                </rule>
            </rules>
        </rewrite>
        <staticContent>
            <mimeMap fileExtension=".mxf" mimeType="text/xml" />
            <clientCache cacheControlMode="DisableCache" />
        </staticContent>
    </system.webServer>
</configuration>
```

## 6 Preserving the Host Header{#preserve-header}

To ensure the correct application root URL is used within your web services, the host header must contain the original host header from the client request. To preserve the host header, follow these steps:

1. Click **Start**, and then click **All Programs**.
2. Click **Accessories**, and then click **Command Prompt**.
3. Execute the following command from the command prompt:

    ```batchfile
    cd %windir%\system32\inetsrv
    ```

4. Enter:

    ```batchfile
    appcmd.exe set config -section:system.webServer/proxy /preserveHostHeader:"True" /commit:apphost
    ```

## 7 Troubleshooting

When configuring IIS, it can seem as though you have done everything right, but it still does not work. A guide to troubleshooting IIS is available here: [Troubleshooting IIS](troubleshooting-iis).

## 8 Read More

* [On-Premises](on-premises-design)
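As a quick sanity check, the inbound-rule patterns and the `{R:1}{R:2}` back-references from section 5.4.1 can be simulated with Python's `re` module. This sketch covers three of the rules; the port and URL template are the defaults used throughout this guide:

```python
import re

# Patterns taken from the rule table in section 5.4.1
RULES = {
    "xas": r"^(xas/)(.*)",
    "ws-doc": r"^(ws-doc/)(.*)",
    "rest": r"^(rest/)(.*)",
}
REWRITE = "http://localhost:8080/{0}{1}"  # {R:1}{R:2} in IIS terms


def rewrite(path):
    """Return the rewritten URL for the first matching rule, or None."""
    for pattern in RULES.values():
        m = re.match(pattern, path)
        if m:
            # Groups 1 and 2 correspond to {R:1} and {R:2} in the IIS rule
            return REWRITE.format(m.group(1), m.group(2))
    return None


print(rewrite("xas/session"))        # http://localhost:8080/xas/session
print(rewrite("ws-doc/mydoc/1234"))  # http://localhost:8080/ws-doc/mydoc/1234
print(rewrite("static/app.css"))     # None (no rule matches; IIS serves it directly)
```

Requests that match no rule fall through, which is why static content keeps being served by IIS itself.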
39.176
320
0.685794
eng_Latn
0.973579
c9021c6bc979453bcb51db84b0ebeebed553bf9e
1,803
md
Markdown
configs/hand/2d_kpt_sview_rgb_img/deeppose/panoptic2d/resnet_panoptic2d.md
SummerVideoAnalysis/mmpose
70d60e03b7eaa0ae1ec66cc7a22c00916f00c9e1
[ "Apache-2.0" ]
1
2021-02-02T03:08:40.000Z
2021-02-02T03:08:40.000Z
configs/hand/2d_kpt_sview_rgb_img/deeppose/panoptic2d/resnet_panoptic2d.md
connor-john/mmpose
f5eabbf33ba514a1ddaf914e835d6abc7accee39
[ "Apache-2.0" ]
null
null
null
configs/hand/2d_kpt_sview_rgb_img/deeppose/panoptic2d/resnet_panoptic2d.md
connor-john/mmpose
f5eabbf33ba514a1ddaf914e835d6abc7accee39
[ "Apache-2.0" ]
null
null
null
<!-- [ALGORITHM] -->

<details>
<summary align="right">DeepPose (CVPR'2014)</summary>

```bibtex
@inproceedings{toshev2014deeppose,
  title={Deeppose: Human pose estimation via deep neural networks},
  author={Toshev, Alexander and Szegedy, Christian},
  booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition},
  pages={1653--1660},
  year={2014}
}
```

</details>

<!-- [BACKBONE] -->

<details>
<summary align="right">ResNet (CVPR'2016)</summary>

```bibtex
@inproceedings{he2016deep,
  title={Deep residual learning for image recognition},
  author={He, Kaiming and Zhang, Xiangyu and Ren, Shaoqing and Sun, Jian},
  booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition},
  pages={770--778},
  year={2016}
}
```

</details>

<!-- [DATASET] -->

<details>
<summary align="right">CMU Panoptic HandDB (CVPR'2017)</summary>

```bibtex
@inproceedings{simon2017hand,
  title={Hand keypoint detection in single images using multiview bootstrapping},
  author={Simon, Tomas and Joo, Hanbyul and Matthews, Iain and Sheikh, Yaser},
  booktitle={Proceedings of the IEEE conference on Computer Vision and Pattern Recognition},
  pages={1145--1153},
  year={2017}
}
```

</details>

Results on CMU Panoptic (MPII+NZSL val set)

| Arch | Input Size | PCKh@0.7 | AUC | EPE | ckpt | log |
| :--- | :--------: | :------: | :------: | :------: |:------: |:------: |
| [deeppose_resnet_50](/configs/hand/2d_kpt_sview_rgb_img/deeppose/panoptic2d/res50_panoptic2d_256x256.py) | 256x256 | 0.999 | 0.686 | 9.36 | [ckpt](https://download.openmmlab.com/mmpose/hand/deeppose/deeppose_res50_panoptic_256x256-8a745183_20210330.pth) | [log](https://download.openmmlab.com/mmpose/hand/deeppose/deeppose_res50_panoptic_256x256_20210330.log.json) |
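For reference, EPE (end-point error) in the results table is the mean Euclidean distance between predicted and ground-truth keypoints. A minimal stdlib Python sketch of the metric (the keypoint coordinates below are made-up illustration data, not from this benchmark):

```python
import math


def epe(pred, gt):
    """Mean end-point error over (x, y) keypoint pairs."""
    dists = [math.dist(p, g) for p, g in zip(pred, gt)]
    return sum(dists) / len(dists)


# Two keypoints: one off by 3 px vertically, one off by 4 px horizontally
pred = [(10.0, 10.0), (20.0, 22.0)]
gt = [(10.0, 13.0), (24.0, 22.0)]
print(epe(pred, gt))  # (3.0 + 4.0) / 2 = 3.5
```

Lower is better; the 9.36 in the table is this average measured in pixels over the validation set.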
31.631579
368
0.701054
eng_Latn
0.445641
c903d5038924637704ee521f26f9e85ce9c5c61a
154
md
Markdown
Views/TableViews/RefreshControl for TableViews/README.md
domenicosolazzo/practice-swift
0ee3618fee2b2701f27e0f50f995eddc892f030f
[ "MIT" ]
174
2015-01-06T16:37:54.000Z
2021-03-03T02:01:32.000Z
Views/TableViews/RefreshControl for TableViews/README.md
miguelius/practice-swift
0ee3618fee2b2701f27e0f50f995eddc892f030f
[ "MIT" ]
2
2015-07-24T05:49:32.000Z
2015-11-08T18:05:59.000Z
Views/TableViews/RefreshControl for TableViews/README.md
miguelius/practice-swift
0ee3618fee2b2701f27e0f50f995eddc892f030f
[ "MIT" ]
40
2015-01-15T16:43:37.000Z
2019-01-07T05:07:45.000Z
# Refresh control =============== Using a refresh control for TableViews Source: [iOS 8 Swift Programming Cookbook - Chapter 6.4](http://goo.gl/pvRtI8)
22
78
0.681818
kor_Hang
0.354404
c9044bc58446ec81a9ebcbbb1f73f42cd5024946
1,909
md
Markdown
about.md
CSD-1993/CSD-1993.github.io
be9c70f23e1774bea343bcc6f468b388b14a8654
[ "MIT" ]
null
null
null
about.md
CSD-1993/CSD-1993.github.io
be9c70f23e1774bea343bcc6f468b388b14a8654
[ "MIT" ]
null
null
null
about.md
CSD-1993/CSD-1993.github.io
be9c70f23e1774bea343bcc6f468b388b14a8654
[ "MIT" ]
null
null
null
---
layout: home
title: Hakkımızda
permalink: /about/
---

C ve Sistem Programcıları Derneği, çalışmalarını C/C++ programlama dilleri ile yürüten ve deneyimlerini sistem programlama alanı ile ilişkilendiren uzmanların oluşturduğu bir dernektir. 1993 yılında kurulmuştur.

## Amaçları

- Yazılım dünyasının en atılımcı ve yoğun bilgi gerektiren sistem programlama alanında, araştırma ve geliştirme faaliyetlerinde bulunmak.
- Yapılan çalışmaları özendirmek ve desteklemek.
- C/C++ ve sistem programcıları arasında bilgi ve mesleki yardımlaşma ortamı oluşturmak.
- Üyelerinin daha etkin bir biçimde kaliteli yazılımlar üretebilmeleri amacıyla, mesleki bilgi ve becerilerinin gelişimine olanak sağlamak.
- Yazılımda kalite bilincinin oluşmasını sağlamak.
- Üyelerinin katılabileceği sosyal ve kültürel etkinliklere ortam hazırlamak.
- C/C++ ve sistem programlama alanında yayınlar hazırlamak.

## Etkinlikleri

- C/C++ ve sistem programlama alanında konferans, seminer, kurs ve sempozyum düzenler.
- C/C++ ve sistem programcılığının geliştirilmesi için resmi ve özel kurumlarla işbirliğinde bulunur.
- Üyelerinin yararlanacağı doküman, dergi ve kitap gibi kaynakları bulunduran bir kütüphane oluşturur.
- C/C++ programlama dillerinde ya da ilgili makine dilinde yazılmış, fonksiyon denilen alt programlara ilişkin kaynak kodların bulunduğu kütüphaneler oluşturarak üyelerinin faydalanmasını sağlar.
- C/C++ ve sistem programcılığını özendiren yarışmalar düzenler.
- C/C++ ve sistem programlama alanında bilimsel, teknik, kültürel ve sosyal etkinlikler hazırlar.

### Yönetim Kurulu

- Kaan Aslan **Başkan**
- Ali Vefa Serçe **Başkan Yardımcısı**
- Medeni Demir **Sayman**
- Gürbüz Aslan **Yazman**
- Gürkan Kurtoğlu
- Hüseyin Büte
- Oğuz Karan

### Denetleme Kurulu

- Sebahat Ersoy
- Gencay Çoşkun
- Metin Aşçı

### İdari İşler

- Güray Sönmez **Genel Sekreter**
- Gencay Coşkun **Genel Sekreter Yrd, Muhasebe Müd**
45.452381
211
0.800419
tur_Latn
0.999974
c904653501b98513e349aded1e38de872ad1c2b9
3,513
md
Markdown
roles/edgeruntime/files/kubeedge/cloudcore/README.md
daixijun/ks-installer
b442a703f691d7b260810c536429b22c6bf45dd3
[ "Apache-2.0" ]
null
null
null
roles/edgeruntime/files/kubeedge/cloudcore/README.md
daixijun/ks-installer
b442a703f691d7b260810c536429b22c6bf45dd3
[ "Apache-2.0" ]
null
null
null
roles/edgeruntime/files/kubeedge/cloudcore/README.md
daixijun/ks-installer
b442a703f691d7b260810c536429b22c6bf45dd3
[ "Apache-2.0" ]
null
null
null
# CloudCore Application

Visit https://github.com/kubeedge/kubeedge for more information.

## Install examples

```
helm upgrade --install cloudcore ./cloudcore --namespace kubeedge --create-namespace -f ./cloudcore/values.yaml --set cloudCore.modules.cloudHub.advertiseAddress[0]=192.168.88.6
```

> Note that the parameter `cloudCore.modules.cloudHub.advertiseAddress` is indispensable for starting the KubeEdge cloudcore component on the cloud side.
> Add `--dry-run` if only for testing purposes.

## Custom Values

### cloudcore

- `cloudCore.modules.cloudHub.advertiseAddress`, defines the required public IPs that edge nodes can access.
- `cloudCore.hostNetWork`, default `true`, which shares the host network, used for setting the forward iptables rules on the host.
- `cloudCore.image.repository`, default `kubeedge`, defines the image repo.
- `cloudCore.image.tag`, default `v1.9.2`, defines the image tag.
- `cloudCore.image.pullPolicy`, default `IfNotPresent`, defines the policy to pull images.
- `cloudCore.image.pullSecrets`, defines the secrets to pull images.
- `cloudCore.labels`, defines the labels.
- `cloudCore.annotions`, defines the annotations.
- `cloudCore.affinity`, `cloudCore.nodeSelector`, `cloudCore.tolerations`, define the node scheduling policies.
- `cloudCore.resources`, defines the resource limits and requests.
- `cloudCore.modules.cloudHub.nodeLimit`, defines the edge node limit.
- `cloudCore.modules.cloudHub.websocket.enable`, default `true`.
- `cloudCore.modules.cloudHub.quic.enable`, default `false`.
- `cloudCore.modules.cloudHub.https.enable`, default `true`.
- `cloudCore.modules.cloudStream.enable`, default `true`.
- `cloudCore.modules.dynamicController.enable`, default `false`.
- `cloudCore.modules.router.enable`, default `false`.
- `cloudCore.service.type`, default `NodePort`.
- `cloudCore.service.cloudhubNodePort`, default `30000`, which defines the exposed node port for the cloudhub service.
- `cloudCore.service.cloudhubQuicNodePort`, default `30001`, which defines the exposed node port for the cloudhub quic protocol.
- `cloudCore.service.cloudhubHttpsNodePort`, default `30002`, which defines the exposed node port for the cloudhub https protocol.
- `cloudCore.service.cloudstreamNodePort`, default `30003`, which defines the exposed node port for the cloud stream service.
- `cloudCore.service.tunnelNodePort`, default `30004`, which defines the exposed node port for the cloud tunnel service.

### iptables-manager

- `iptablesManager.enable`, default `true`.
- `iptablesManager.mode`, default `internal`, can be modified to `external`; the external mode will set up an iptables manager component which shares the host network. That mode can be enabled on versions > v1.9.2. See PR https://github.com/kubeedge/kubeedge/pull/3265.
- `iptablesManager.image.repository`, default `kubeedge`, defines the image repo.
- `iptablesManager.image.tag`, default `v1.9.2`, defines the image tag.
- `iptablesManager.image.pullPolicy`, default `IfNotPresent`, defines the policy to pull images.
- `iptablesManager.image.pullSecrets`, defines the secrets to pull images.
- `iptablesManager.labels`, defines the labels.
- `iptablesManager.annotions`, defines the annotations.
- `iptablesManager.affinity`, `iptablesManager.nodeSelector`, `iptablesManager.tolerations`, define the node scheduling policies.
- `iptablesManager.resources`, defines the resource limits and requests.

## Uninstall

```
helm uninstall cloudcore -n kubeedge
kubectl delete ns kubeedge
```
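The install command above can get unwieldy once several `--set` overrides are involved. A small Python sketch that assembles it as an argument list (chart path, namespace, and IP are the example values from this README):

```python
def helm_install_cmd(release, chart, namespace, values_file, overrides):
    """Build the `helm upgrade --install` argument list."""
    cmd = [
        "helm", "upgrade", "--install", release, chart,
        "--namespace", namespace, "--create-namespace",
        "-f", values_file,
    ]
    for key, value in overrides.items():
        cmd += ["--set", f"{key}={value}"]
    return cmd


cmd = helm_install_cmd(
    "cloudcore", "./cloudcore", "kubeedge", "./cloudcore/values.yaml",
    {"cloudCore.modules.cloudHub.advertiseAddress[0]": "192.168.88.6"},
)
print(" ".join(cmd))
```

Passing the list to `subprocess.run(cmd)` avoids shell-quoting pitfalls around the `[0]` index in the `--set` path.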
58.55
268
0.776544
eng_Latn
0.578886
c904a4d7e8c942ae7e21995274f56be90cdf85ff
6,517
md
Markdown
user-guide/composer.md
shuhao02/fiddler-everywhere-docs
5241dd913539e88e6c9847959914554d81d3f6ac
[ "Apache-2.0" ]
null
null
null
user-guide/composer.md
shuhao02/fiddler-everywhere-docs
5241dd913539e88e6c9847959914554d81d3f6ac
[ "Apache-2.0" ]
null
null
null
user-guide/composer.md
shuhao02/fiddler-everywhere-docs
5241dd913539e88e6c9847959914554d81d3f6ac
[ "Apache-2.0" ]
null
null
null
---
title: Composer
slug: composer-tab
tags: create API request, Fiddler's Composer, headers, body, GET, HTTP request methods, HTTP response
publish: true
position: 50
---

## Composer

The __Composer__ tab enables you to manually build and send HTTP, HTTPS, and FTP requests.

![Composer User Interface](../images/composer/composer.png)

## Compose a Request

The __Composer__ contains three major sections used to construct a request and observe the response. The top section contains a drop-down for selecting [__HTTP Methods__](#http-methods), a [__URL field__](#url-field), a drop-down to select the used [__HTTP version__](#http-version-selection), and an __Execute__ button. The mid-section provides options to further modify your request via the [__Headers__](#headers), [__Body__](#body), [__Params__](#params), or [__Raw__](#raw) views. The bottom section is a [__response inspector__](#response-inspector), which shows the response from the executed request.

![Composer User Interface](../images/composer/composer-sections.png)

>important Fiddler's Composer adds its own `User-Agent` header by default so that it sends HTTPS requests correctly. You could remove the default `User-Agent` header, but note that this could break composing a secure request (HTTPS).

## HTTP Methods

The __Composer__ supports creating a request while using one of the following HTTP methods:

- __GET__ - Requests a representation of the specified resource. Requests using GET should only retrieve data.
- __PUT__ - Replaces all current representations of the target resource with the request payload.
- __POST__ - Used to submit an entity to the specified resource, often causing a change in state or side effects on the server.
- __DELETE__ - Deletes the specified resource.
- __HEAD__ - Asks for a response identical to a __GET__ request, but without the response body.
- __TRACE__ - Performs a message loop-back test along the path to the target resource.
- __SEARCH__ - Used by a client to ask the server to perform a query operation (described by the request payload) over some set of data scoped to the effective request URI.
- __PROPFIND__ - Retrieves properties defined on the resource identified by the Request-URI.
- __PATCH__ - Used to apply partial modifications to a resource.
- __MKCOL__ - The method may be included in the scope of a transaction by submitting a Transaction Header with a lock token that corresponds to that transaction.
- __MOVE__ - Used to move a resource to the location specified by a request Uniform Resource Identifier (URI).
- __LOCK__ - Used to take out a lock of any access type on a resource so that another principal will not modify the resource while it is being edited.
- __UNLOCK__ - Used to remove the lock on the resource at the request Uniform Resource Identifier (URI).
- __OPTIONS__ - Used to describe the communication options for the target resource.

![HTTP Methods](../images/composer/composer-http-methods.png)

## URL Field

The __URL field__ is the place to enter the endpoint URL for the composed request.

![URL Address textview](../images/composer/composer-addresss-bar.png)

## HTTP Version

From the __HTTP Version__ drop-down, you can select the following HTTP versions:

- __HTTP 2.0__
- __HTTP 1.2__
- __HTTP 1.1__
- __HTTP 1.0__
- __HTTP 0.9__

![HTTP Versions drop-down](../images/composer/composer-http-version.png)

## Headers, Params and Body

The __Headers, Params and Body__ section allows you to further modify your request by adding your custom __Headers__, __Params__, and __Body__, and by observing the composed request via the __Raw__ view.

### Headers

Enables you to add or modify your request headers (e.g., `Content-Type`, `Authorization`, etc.). Add a new header by entering the header key-value pair and then clicking the tick.

![Adding header](../images/composer/composer-headers-before.png)

This adds the new header to your request. The header can now be disabled/enabled or completely deleted.

![Added header](../images/composer/composer-headers-after.png)

>important Most of the servers using newer versions of TLS will require a **User-Agent** header to be set. By default, Fiddler Everywhere will add a **User-Agent** key with the value **Fiddler Everywhere** and the description **_Lets servers and network peers identify the application, operating system, vendor, and version of the requesting user agent_**. The header is non-mandatory, but keep in mind that without a valid **User-Agent**, some requests to secure servers might fail.

![Default User-Agent](../images/composer/composer-user-agent.png)

### Params

Enables you to easily add query parameters to your request URL. Any key-value pair appended through the __Params__ view is added to the request URL. Add new query params by entering the params key-value pair and then clicking the tick.

![Adding header](../images/composer/composer-params-before.png)

This adds the new query params to your request URL. The params can be disabled/enabled or completely deleted.

![Adding header](../images/composer/composer-params-after.png)

### Body

Enables you to manually specify the data that should be sent with the request.

### Raw

The view is a raw representation of the composed request. This view is non-editable.

![Raw view of the written request](../images/composer/composer-raw-view.png)

## Response Inspector

With the __Response Inspector__, you can inspect the received response (from the executed request). The inspector provides several views to visualize different parts of the request in specific formats. Find detailed information on each inspector type in the dedicated article about [__Inspector Types__]({%slug inspector-types%}).

![Response inspectors](../images/composer/composer-response-inspectors.png)

## Edit Captured Traffic in Composer

A session previously captured in the [Live Traffic]({%slug web-sessions-list%}) could be loaded in the Composer for applying further modifications.

1. Select the desired session and right-click to open the context menu. From the context menu, select **Edit in Composer**. Alternatively, select the session and use the keyboard shortcut by pressing the **E** key.

    ![Edit in Composer](../images/composer/edit-in-composer.png)

2. The session opens in a new Composer window, where you can change the desired values. For example, change the data payload, modify the headers, test the authentication, etc.

    ![Change the loaded request values in new Composer windows](../images/composer/edit-in-composer-002.png)
473
0.778119
eng_Latn
0.988171
c904c87fc67657496773ea8544eb007e4434fbc3
939
md
Markdown
includes/digital-twins-code-create-twin.md
maiemy/azure-docs.it-it
b3649d817c2ec64a3738b5f05f18f85557d0d9b6
[ "CC-BY-4.0", "MIT" ]
null
null
null
includes/digital-twins-code-create-twin.md
maiemy/azure-docs.it-it
b3649d817c2ec64a3738b5f05f18f85557d0d9b6
[ "CC-BY-4.0", "MIT" ]
null
null
null
includes/digital-twins-code-create-twin.md
maiemy/azure-docs.it-it
b3649d817c2ec64a3738b5f05f18f85557d0d9b6
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- author: baanders description: file di inclusione per i gemelli digitali di Azure-codice per creare un dispositivo gemello ms.service: digital-twins ms.topic: include ms.date: 10/21/2020 ms.author: baanders ms.openlocfilehash: 050496d028632c416e8694a0d6022747c00b6796 ms.sourcegitcommit: 6a902230296a78da21fbc68c365698709c579093 ms.translationtype: MT ms.contentlocale: it-IT ms.lasthandoff: 11/05/2020 ms.locfileid: "93356391" --- ```csharp // Define the model type for the twin to be created Dictionary<string, object> meta = new Dictionary<string, object>() { { "$model", "dtmi:example:Room;1" } }; // Initialize the twin properties Dictionary<string, object> twin = new Dictionary<string, object>() { { "$metadata", meta }, { "Temperature", temperature}, { "Humidity", humidity}, }; //Create the twin client.CreateOrReplaceDigitalTwinAsync("myRoomID", JsonSerializer.Serialize<Dictionary<string, object>>(twin)); ```
31.3
111
0.757188
kor_Hang
0.202145
c907c359456812e299e5acf638204687dfc2de96
906
md
Markdown
docs/api/Store.md
exentrich/react-native-navigation
db9b66ad510c77c3518bf13e1123cf480a23c90c
[ "MIT" ]
1
2021-01-06T07:07:45.000Z
2021-01-06T07:07:45.000Z
docs/api/Store.md
exentrich/react-native-navigation
db9b66ad510c77c3518bf13e1123cf480a23c90c
[ "MIT" ]
null
null
null
docs/api/Store.md
exentrich/react-native-navigation
db9b66ad510c77c3518bf13e1123cf480a23c90c
[ "MIT" ]
2
2018-05-23T14:04:57.000Z
2018-09-21T10:30:59.000Z
# Store ## setPropsForId `setPropsForId(componentId: string, props: any): void` [source](https://github.com/wix/react-native-navigation/blob/v2/lib/src/components/Store.ts#L7) --- ## getPropsForId `getPropsForId(componentId: string): any` [source](https://github.com/wix/react-native-navigation/blob/v2/lib/src/components/Store.ts#L11) --- ## setComponentClassForName `setComponentClassForName(componentName: string, ComponentClass: any): void` [source](https://github.com/wix/react-native-navigation/blob/v2/lib/src/components/Store.ts#L15) --- ## getComponentClassForName `getComponentClassForName(componentName: string): any` [source](https://github.com/wix/react-native-navigation/blob/v2/lib/src/components/Store.ts#L19) --- ## clearComponent `clearComponent(id: string): void` [source](https://github.com/wix/react-native-navigation/blob/v2/lib/src/components/Store.ts#L23) ---
20.590909
96
0.753863
yue_Hant
0.559105
c90817f595895304a615f69b52dad07914a2b0e3
76
md
Markdown
_includes/04-lists.md
rafswe/markdown-portfolio
4e96331f681a627187c550b62aa966373335eaba
[ "MIT" ]
null
null
null
_includes/04-lists.md
rafswe/markdown-portfolio
4e96331f681a627187c550b62aa966373335eaba
[ "MIT" ]
5
2021-02-22T15:22:58.000Z
2021-02-22T15:43:41.000Z
_includes/04-lists.md
rafswe/markdown-portfolio
4e96331f681a627187c550b62aa966373335eaba
[ "MIT" ]
null
null
null
# List of my favorite things - Pasta - mare - sole - ristoranti - aperitivo
10.857143
28
0.710526
ita_Latn
0.458043
c909af9aa9a4f450afee41309f826301a54b04c2
870
md
Markdown
rules/mixpanel-track-event.md
dafortune/rules
ca3200870725f1dd6bad96cbbf6291878577a77b
[ "MIT" ]
null
null
null
rules/mixpanel-track-event.md
dafortune/rules
ca3200870725f1dd6bad96cbbf6291878577a77b
[ "MIT" ]
null
null
null
rules/mixpanel-track-event.md
dafortune/rules
ca3200870725f1dd6bad96cbbf6291878577a77b
[ "MIT" ]
null
null
null
---
gallery: true
categories:
- webhook
---

## Tracks Logins in MixPanel

This rule will send a `Sign In` event to MixPanel, and will include the application the user is signing in to as a property. See the [MixPanel HTTP API](https://mixpanel.com/help/reference/http) for more information.

```
function (user, context, callback) {
  var mpEvent = {
    "event": "Sign In",
    "properties": {
      "distinct_id": user.user_id,
      "token": "{REPLACE_WITH_YOUR_MIXPANEL_TOKEN}",
      "application": context.clientName
    }
  };

  var base64Event = new Buffer(JSON.stringify(mpEvent)).toString('base64');

  request.get({
    url: 'http://api.mixpanel.com/track/',
    qs: {
      data: base64Event
    }
  });

  // don't wait for the MixPanel API call to finish, return right away
  // (the request will continue on the sandbox)
  callback(null, user, context);
}
```
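The only non-obvious step in the rule above is the payload encoding: MixPanel's `/track/` endpoint expects the JSON event base64-encoded in the `data` query parameter. The same encoding in Python (the `distinct_id` and application name are placeholder values, and the token stays a placeholder as in the rule):

```python
import base64
import json

mp_event = {
    "event": "Sign In",
    "properties": {
        "distinct_id": "auth0|123",  # example user id
        "token": "{REPLACE_WITH_YOUR_MIXPANEL_TOKEN}",
        "application": "My App",
    },
}

# JSON-serialize, then base64-encode, exactly as the rule does with Buffer
data = base64.b64encode(json.dumps(mp_event).encode()).decode()

# Decoding it back yields the original event
assert json.loads(base64.b64decode(data)) == mp_event
print(data[:16], "...")
```

This `data` string is what ends up as the query-string parameter in the `request.get` call.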
24.166667
212
0.666667
eng_Latn
0.61638
c90a3610f6200d317a715c7ebb7f45a8f0f22c5c
194
md
Markdown
src/content/hero.md
Sara-Alkhamri/gatsby-portfolio
54f6fcdeeddb75e4bd8411f74a17fad4d618ff1a
[ "RSA-MD" ]
null
null
null
src/content/hero.md
Sara-Alkhamri/gatsby-portfolio
54f6fcdeeddb75e4bd8411f74a17fad4d618ff1a
[ "RSA-MD" ]
null
null
null
src/content/hero.md
Sara-Alkhamri/gatsby-portfolio
54f6fcdeeddb75e4bd8411f74a17fad4d618ff1a
[ "RSA-MD" ]
null
null
null
--- greetings: "Hello" emoji: "👋" title: "I'm Sara" subtitlePrefix: "I build and design " subtitleHighlight: "things for the web" --- Freelance Fullstack Web Developer. Based in Los Angeles, CA
21.555556
59
0.721649
eng_Latn
0.963349
c90a3e3f5734f67652f7315ec20abf7a3f93743a
3,325
md
Markdown
remix/icon/editor/number-5.md
liuwave/icon-helper
6339786f595d6f12a432db71e7aaccb693b9fe59
[ "MIT" ]
2
2020-10-26T16:32:26.000Z
2021-01-13T09:28:45.000Z
remix/icon/editor/number-5.md
liuwave/icon-helper
6339786f595d6f12a432db71e7aaccb693b9fe59
[ "MIT" ]
48
2020-04-04T12:39:59.000Z
2022-02-27T01:30:09.000Z
remix/icon/editor/number-5.md
liuwave/icon-helper
6339786f595d6f12a432db71e7aaccb693b9fe59
[ "MIT" ]
null
null
null
--- title: number 5(五) ICON转svg、png下载 name: number-5 zhTips: 五,数字 tags: ["editor"] search: 5 image: https://iconhelper.cn/svg/remix/editor/number-5.svg --- # number 5 <small style="font-size: 60%;font-weight: 100">五</small> <div class="detail-page"> <p> <span><span class="badge-success badge">免费图标</span> </span> <br/> <span> ICON库: <span class="badge-secondary badge">Remix Icon</span> </span> <br/> <span> CSS名称: <span class="badge-secondary badge">ri-number-5</span> </span> <br/> <span> unicode: <span class="badge-secondary badge">ef27</span> <copy-btn content='ef27' btn-title=""></copy-btn> <copy-btn :content='String.fromCodePoint(parseInt("ef27", 16))' btn-title="复制U"></copy-btn> </span><br/><span>样式:<span class="badge-light badge">常规(regular)</span></span> <br/> <span> 版本: <span class="badge-secondary badge">2.3.0</span> </span><br/><span>标签:<span class="badge-light badge"><router-link to="/tags/editor.html">编辑器</router-link></span></span> <br/> <span>图标来源/作者:<span class="badge-light badge">Remix Design Studio</span></span> <br/> <span>别名:<span class="badge-light badge">5</span></span><br/><span class="zh-detail">中文描述:<span class="badge-primary badge">五</span><span class="badge-primary badge">数字</span><span class="help-link"><span>帮助改进</span>(<a href="https://gitee.com/liuwave/icon-helper/edit/master/json/remix/editor/number-5.json" target="_blank" rel="noopener noreferrer">gitee</a><a href="https://github.com/liuwave/icon-helper/edit/master/json/remix/editor/number-5.json" target="_blank" rel="noopener noreferrer">github</a></span>)</span><br/> </p> </div> <div class="alert alert-dark"> <i class="ri-number-5 ri-xs"></i> <i class="ri-number-5 ri-sm"></i> <i class="ri-number-5 ri-lg"></i> <i class="ri-number-5 ri-2x"></i> <i class="ri-number-5 ri-3x"></i> <i class="ri-number-5 ri-5x"></i> <i class="ri-number-5 ri-7x"></i> </div> <div> <p>引入css文件后,可以用<code>&lt;span&gt;</code>包裹,放在页面中。具体如下所示: </p> <div class="alert alert-primary" style="font-size: 14px"> &lt;span 
class="ri-number-5" aria-hidden="true"&gt;&lt;/span&gt; <copy-btn content='<span class="ri-number-5" aria-hidden="true"></span>'></copy-btn> </div> <div class="alert alert-secondary"> <i class="ri-number-5" style="font-size: 24px" aria-hidden="true"></i> ri-number-5 <copy-btn content="ri-number-5" btn-title="复制图标名称"></copy-btn> </div> </div> <div id="svg" class="svg-wrap"> <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"> <g> <path fill="none" d="M0 0h24v24H0z"/> <path d="M18 2v2H9.3l-.677 6.445a6.5 6.5 0 1 1-2.93 7.133l1.94-.486A4.502 4.502 0 0 0 16.5 16a4.5 4.5 0 0 0-4.5-4.5c-2.022 0-3.278.639-3.96 1.53l-1.575-1.182L7.5 2H18z"/> </g> </svg> </div> <detail full-name='ri-number-5'></detail> <div class="icon-detail__container"> <p>关于“<b>number 5</b>”的评论:</p> </div> <Vssue title="关于“number 5”的评论" /> <div><p>图标number 5来源于 Remix Icon,可免费使用,更多关于 Remix Icon的信息,参见:<a target="_blank" href="https://iconhelper.cn/remix.html">Remix Icon</a> </p></div> <div style="padding:2rem 0 " class="page-nav"><p class="inner"><span class="prev">←<router-link to="/icon/editor/number-4.html">number 4</router-link></span> <span class="next"><router-link to="/icon/editor/number-6.html">number 6</router-link>→</span></p></div>
38.218391
525
0.656541
yue_Hant
0.24055
c90a95d81d5727ce4f6fd619799a3662269255e6
1,831
md
Markdown
html_css/intro_to_html/about_me_lab.md
alemosie/upperline-curriculum
fa35015d6b068420e9e1a4a41133b719b605d954
[ "RSA-MD" ]
null
null
null
html_css/intro_to_html/about_me_lab.md
alemosie/upperline-curriculum
fa35015d6b068420e9e1a4a41133b719b605d954
[ "RSA-MD" ]
null
null
null
html_css/intro_to_html/about_me_lab.md
alemosie/upperline-curriculum
fa35015d6b068420e9e1a4a41133b719b605d954
[ "RSA-MD" ]
2
2017-03-07T00:35:42.000Z
2018-10-03T00:35:01.000Z
# Build an About Me HTML Page

Build your own single-page "About Me" website using HTML.

## Content

Make sure to include at least two of the following:

1. A paragraph or two about yourself.
2. A list of interests: what do you like to do in your spare time?
3. A section about your favorite place in New York City.
4. Your favorite meme(s).

## New concept: inline CSS

CSS (which stands for "cascading style sheets") describes how HTML elements are displayed. If HTML is the bones/body of your webpage, think of CSS like the clothes you put on -- elements that give your page that extra oomph in style. With CSS, you can control things like colors, fonts, and spacing.

While CSS normally resides in its own document, we can also use it within HTML tags by passing the tag the "style=" argument. Consider this blue heading below:

```html
<h1 style="color:blue;">This is a blue heading.</h1>
```

`color:blue;` is CSS for "change the text to the color blue." We can also change the text font like so:

```html
<h1 style="font-family:courier;">This heading is in Courier.</h1>
```

Or we can put them together!

```html
<h1 style="font-family:courier;color:blue">This heading is in Courier and it's blue!</h1>
```

CSS pairs ("color" being the property and "blue" being the value of the property "color") are separated by a semi-colon. You can add as many as you like, as long as they follow the proper syntax!

## HTML tags

Your page must contain the following html tags and attributes:

* Header (e.g. `<h3>`)
* Text alignment (center, right, left)
* Image (`<img>`)
* Text formatting (`<b>, <i>, <u>`)
* Division (`<div>`)
* Ordered list (`<ol>`)
* Unordered list (`<ul>`)
* External link (`<a href...>`)
* Internal link (`<a href=#...>`)

For an extra challenge, consider researching and using the following tags:

* Horizontal rule (`<hr>`)
* Quote (`<blockquote>`)
30.516667
252
0.706718
eng_Latn
0.99623
c90b8a9a95f49b0b5a044dab29b4678253ac45fa
1,335
md
Markdown
_posts/2021/9/16/python/2021-09-16-로지스틱 회귀(Logistic Regression).md
colinch4/colinch4.github.io
dbd1601c737ac68d3f8ec6545898e7a81293dcd6
[ "MIT" ]
null
null
null
_posts/2021/9/16/python/2021-09-16-로지스틱 회귀(Logistic Regression).md
colinch4/colinch4.github.io
dbd1601c737ac68d3f8ec6545898e7a81293dcd6
[ "MIT" ]
null
null
null
_posts/2021/9/16/python/2021-09-16-로지스틱 회귀(Logistic Regression).md
colinch4/colinch4.github.io
dbd1601c737ac68d3f8ec6545898e7a81293dcd6
[ "MIT" ]
null
null
null
---
layout: post
title: "[python] Logistic Regression (로지스틱 회귀)"
description: " "
date: 2021-09-16
tags: [python]
comments: true
share: true
---

# Logistic Regression (로지스틱 회귀)

Logistic regression is a model used to predict categorical variables.

Linear regression gives good results when the explanatory and dependent variables are continuous and numeric, but it inevitably performs poorly on categorical variables. The logistic regression model was proposed because of this problem.

## 1. The logistic function and odds

Let's first look at the **logistic function** and **odds**, which form the backbone of logistic regression.

In practice, for many datasets the probability associated with a given variable follows an **S-curve type** shape rather than a straight line. The **logistic function** is what expresses this S-curve; depending on the situation and field, it is also called the **sigmoid function**.

The **logistic function** can take any value of x as input, but its output is always a value between 0 and 1. In other words, it satisfies the requirements of a **probability density function**.

Below are the equation and graph of this function:

$$ y=\frac{1}{1+e^{-x}} $$

![http://i.imgur.com/E0eI8OU.png](C:\Users\sungyun\Desktop\etc\img\probability density function.png)

**Odds** is a concept meaning the ratio of the probability that an arbitrary event A occurs to the probability that it does not occur. It can be written as follows:

$$ odds = \dfrac{P(A)}{P(A^c)} = \dfrac{P(A)}{1-P(A)} $$

The closer P(A) is to 1, the more the graph soars; if P(A) is 0, the odds are 0. See the graph below.

![](C:\Users\sungyun\Desktop\etc\img\odds(승산).png)

## 2. Binomial logistic regression

When we have to solve a classification problem with two categories, as explained above, **the existing regression model cannot be applied when the dependent variable Y is a category rather than a continuous number.** We have to reformulate the problem: keep the advantages of the regression equation as they are, but set the dependent variable to the probability of being category 1 rather than Y itself. Leaving the right-hand side (X) as it is, let's change only the left-hand side to a probability:

$$ P(Y = 1 \mid X = \vec{x}) $$
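The logistic function and odds defined above are easy to check numerically. Here is a minimal sketch in plain Python (the function names are illustrative, not from any particular library):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic function: maps any real x into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def odds(p: float) -> float:
    """Odds of an event with probability p: P(A) / (1 - P(A))."""
    return p / (1.0 - p)

# The logistic function crosses 0.5 exactly at x = 0.
print(sigmoid(0))               # 0.5
# Large positive x pushes the output toward 1, large negative x toward 0.
print(sigmoid(10) > 0.99)       # True
# An event with probability 0.8 is 4 times more likely to occur than not.
print(round(odds(0.8), 6))      # 4.0
```

This squashing behavior is exactly why the left-hand side of the regression is replaced with a probability: the sigmoid turns any linear combination of inputs into a valid probability.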
22.627119
138
0.665918
kor_Hang
1.00001
c90b8b751c376afc65de9cf739eead2c7d493db8
1,522
md
Markdown
s/schrzabkg/index.md
nnworkspace/gesetze
1d9a25fdfdd9468952f739736066c1ef76069051
[ "Unlicense" ]
1
2020-06-20T11:34:20.000Z
2020-06-20T11:34:20.000Z
s/schrzabkg/index.md
nagy/gesetze
77abca2ceea3b7b89ea70afb13b5dd55415eb124
[ "Unlicense" ]
null
null
null
s/schrzabkg/index.md
nagy/gesetze
77abca2ceea3b7b89ea70afb13b5dd55415eb124
[ "Unlicense" ]
null
null
null
---
Title: Act on the Vienna Agreement of 12 June 1973 for the Protection of Typeface Designs and their International Deposit
jurabk: SchrZAbkG
layout: default
origslug: schrzabkg
slug: schrzabkg
---

# Act on the Vienna Agreement of 12 June 1973 for the Protection of Typeface Designs and their International Deposit (SchrZAbkG)

Date of issue (Ausfertigungsdatum)
: 1981-07-06

Source (Fundstelle)
: BGBl II: 1981, 382

Last amended by
: Art. 2 Abs. 16 G v. 12.3.2004 I 390

## Art 1 Approval of the Vienna Agreement

(1) The Vienna Agreement for the Protection of Typeface Designs and their International Deposit, signed by the Federal Republic of Germany in Vienna on 12 June 1973, including its Regulations, as well as the accession to the Protocol of 12 June 1973 to this Agreement, are hereby approved. The Agreement, the Regulations and the Protocol to the Agreement are published below.

(2) Amendments to the Regulations pursuant to Article 29(3) of the Agreement shall be announced in the Federal Law Gazette (Bundesgesetzblatt).

## Art 2

...

## Art 3 Final provisions

(1) This Act shall enter into force on the day following its promulgation. However, Article 2 shall enter into force on the day on which the Agreement referred to in Article 1 enters into force for the Federal Republic of Germany pursuant to its Article 35.

(2) The day on which the Agreement referred to in Article 1 enters into force for the Federal Republic of Germany pursuant to its Article 35 shall be announced in the Federal Law Gazette.
28.716981
142
0.802234
deu_Latn
0.998453
c90c1070a9d51a1338bed65a9f40649336b5e4ae
908
md
Markdown
_posts/2021-02-01-2주차-소감.md
WonSik36/WonSik36.github.io
a5bb884c4e7b5e443640ea4d29e0b82cc7dffd79
[ "MIT" ]
null
null
null
_posts/2021-02-01-2주차-소감.md
WonSik36/WonSik36.github.io
a5bb884c4e7b5e443640ea4d29e0b82cc7dffd79
[ "MIT" ]
null
null
null
_posts/2021-02-01-2주차-소감.md
WonSik36/WonSik36.github.io
a5bb884c4e7b5e443640ea4d29e0b82cc7dffd79
[ "MIT" ]
null
null
null
# NHN Rookie 8th Bootcamp, Week 2 Reflection

This week, in week 2 of the bootcamp, we planned a ticket-booking system. In detail, we went through the following steps:

1. Defining the service
2. Building wireframes with the Balsamiq wireframing tool
3. Preparing the presentation materials

Through this process I learned three big lessons.

First, *understand the planner's point of view.* When I first started planning, I approached it from a developer's perspective, which felt safe but lacked appeal. As I pushed myself to think more creatively, interesting ideas came up. Implementing them would be hard, but I felt they could actually sell. So even if a planner hands me something difficult to build, I should approach it with a do-my-best attitude.

Second, *even when we describe the same feature, teammates think differently.* As part of the training we played planning poker, a session where team members share their estimates of the relative size of a given feature. Our estimates differed greatly, and the differences came from differing definitions of the feature. So before actual development starts, we should reach agreement on feature definitions.

Third, *software development is a job; deadlines matter, and we need to get comfortable with techniques for estimating them.* As part of the training we were walked through the software development process, and I came to feel how important work tracking and deadline estimation are.

P.S. TDD came up during the training; it caught my interest and I would like to study it.

> This reflection is based on week 2 of the NHN Rookie 8th bootcamp and instructor Woo Yun-jeong's lecture **"Writing and Estimating Requirements for Agile Software Development."**
43.238095
224
0.701542
kor_Hang
1.00001
c90c18cbf04302ba49cfcb56e8b2eba55afbe2a4
1,581
md
Markdown
_posts/2016-11-17-Blog-lesson-1.md
samir222/Samir
2670e0690f9d945d4a76dec4035c61d3bc61f46e
[ "MIT" ]
null
null
null
_posts/2016-11-17-Blog-lesson-1.md
samir222/Samir
2670e0690f9d945d4a76dec4035c61d3bc61f46e
[ "MIT" ]
null
null
null
_posts/2016-11-17-Blog-lesson-1.md
samir222/Samir
2670e0690f9d945d4a76dec4035c61d3bc61f46e
[ "MIT" ]
null
null
null
---
title: Blog lesson 1
layout: post
author: yousef.kaler
permalink: /blog-lesson-1/
source-id: 1tKXSMt5N0mUaHkAwhcdSTvqYRDlDl7nOYYxtpAJEG-c
published: true
---

<table>
  <tr>
    <td>Title</td>
    <td>My first attempts at publishing using GitHub.</td>
    <td>Date</td>
    <td>09/09/16</td>
  </tr>
</table>

<table>
  <tr>
    <td>Starting point:</td>
    <td>I had no starting point.</td>
  </tr>
  <tr>
    <td>Target for this lesson?</td>
    <td>To make a simple GitHub page with one blog post for homework.</td>
  </tr>
  <tr>
    <td>Did I reach my target? (add details to "Lesson Review")</td>
    <td>Yes, as I managed to publish my first blog post with ease.</td>
  </tr>
</table>

<table>
  <tr>
    <td>Lesson Review</td>
  </tr>
  <tr>
    <td>How did I learn? What strategies were effective?</td>
  </tr>
  <tr>
    <td>I learnt by watching a demonstration at the front of the classroom led by one of my IT teachers. I learnt how to publish blog posts using Gabriel on Google Docs and submitting the doc into the posts folder on GitHub.</td>
  </tr>
  <tr>
    <td>What limited my learning? Which habits do I need to work on?</td>
  </tr>
  <tr>
    <td>I need to work on not getting distracted by others' GitHub pages and focus on my own. I also should have remembered to take a notepad or my iPad to take notes.</td>
  </tr>
  <tr>
    <td>What will I change for next time? How will I improve my learning?</td>
  </tr>
  <tr>
    <td>I will remember to take a notepad and focus on my learning in class by not getting distracted by others.</td>
  </tr>
</table>
25.918033
222
0.653384
eng_Latn
0.98622
c90c2d5f52cf6e06804fa68d627df492ad7d6af4
195
md
Markdown
README.md
Eggbertx/robotfindskitten-minisphere
ee790c22e996b329b92f59a230ea325666ed4a09
[ "BSD-2-Clause" ]
1
2019-10-15T09:15:50.000Z
2019-10-15T09:15:50.000Z
README.md
Eggbertx/robotfindskitten-minisphere
ee790c22e996b329b92f59a230ea325666ed4a09
[ "BSD-2-Clause" ]
null
null
null
README.md
Eggbertx/robotfindskitten-minisphere
ee790c22e996b329b92f59a230ea325666ed4a09
[ "BSD-2-Clause" ]
null
null
null
# robotfindskitten-minisphere

To build robotfindskitten, run `cell` in a terminal. It will create the "dist" folder. Assuming you have miniSphere installed, run `minisphere --windowed dist`.
32.5
86
0.774359
eng_Latn
0.972601
c90c380105a2587251d3090be36e83caa48b715c
19,537
md
Markdown
README.md
hansharhoff/HASS-sonoff-ewelink
baae37ae0f3a70db17f5c9cae385d159ed062156
[ "MIT" ]
6
2020-07-05T12:06:25.000Z
2021-07-21T13:18:40.000Z
README.md
hansharhoff/HASS-sonoff-ewelink
baae37ae0f3a70db17f5c9cae385d159ed062156
[ "MIT" ]
null
null
null
README.md
hansharhoff/HASS-sonoff-ewelink
baae37ae0f3a70db17f5c9cae385d159ed062156
[ "MIT" ]
1
2021-07-21T13:19:01.000Z
2021-07-21T13:19:01.000Z
# Home Assistant component for original firmware Sonoff / eWeLink smart devices

Simple Home Assistant component to add/control Sonoff/eWeLink smart devices using the stock firmware and retaining the cloud capabilities.

***

### WARNING: completely deactivate the `sonoff` component from HA while doing a firmware update; due to the auto-relogin function you might be kicked out of the app before the process is completed. I will not be held liable for any problems occurring if these steps are not followed!

***

**CHECK THE COMPATIBILITY LIST BELOW (not updated every day)! TRY THE COMPONENT FIRST, AND IF IT DOESN'T WORK FOR YOUR DEVICE DON'T COMPLAIN; OPEN A PROPER ISSUE**

## Setup

To set up, add to your configuration.yaml:

```yaml
sonoff:
  username: [email or phone number]
  password: [password]
  scan_interval: 60 #(optional, lower values than 60 won't work anymore!)
  grace_period: 600 #(optional)
  api_region: 'eu' #(optional)
  entity_prefix: True #(optional)
  debug: False #(optional)
```

And copy the *.py files in the `custom_components` folder using the same structure as defined here:

```
custom_components
└── sonoff
    └── __init__.py
    └── switch.py
    └── sensor.py
    └── light.py
    └── binary_sensor.py
    └── fan.py
```

`email` [Deprecated] kept only for compatibility, may be removed in the future.

`username` the username that you registered for your eWeLink account, be it an email or a phone number (a phone number should lead with the region code, '+8612345678901' for example).

`scan_interval` lets you define how fast the state of devices is refreshed (by default every 60 sec).
for example if you change the switch from an external source like Alexa or Google Home the change will show up in HA in maximum less than specified period, while changing it using HA interface/actions/etc it's instantly `grace_period` eWeLink app allows **only one active session at a time**, therefore this will temporarily block HA refreshes for the specified amount (in seconds) to allow (or better said **after**) you to login in the app and do required changes to your devices. following that sonoff component does an internal re-login invalidating the mobile session and the process restarts. (as a workaround for this, you can create a 2nd account and share the devices from main one, therefore one will be used in HA another one in mobile app) `api_region` this component tries to find, login & connect to the proper region assigned to you. specifying a proper region will help the component to load faster and reconnect after the expired grace period explained above, possible values: `eu` (default), `us`, `as` or `cn` `entity_prefix` this option removes the `sonoff_` prefix from entities name (it's more or a less a compatibility mode between previous `master` vs `websocket` branch implementations) ### debug log generation / new device / new features requests `debug` if enabled this will give you the ability to generate a log of messages from ewelink that can be easily posted here to debug/implement new devices. steps and how it works: - this option creates a pseudo switch entity `switch.sonoff_debug` (**notice** it won't show up automatically in frontend in lovelace you have to **manually add it** or toggle it from `Developer tools > Services` section). - to generate a sonoff debug log toggle the pseudo-switch ON and the capture of messages will silently start in the background. now **pick up the phone -> open eWeLink app** and start changing settings of your Sonoff device but not faster than 10+ seconds between each change. 
- when you finish toggle the pseudo-switch OFF and a new (very long) persistent notification will show up. - go to `Developer tools > States` section and look for a `persistent_notification.notification` entity (impossible to miss due to its extremely long attribute text) and copy the message from there (to remove this notifications and others just push the button Dismiss them from main HA notifications area and you can restart the process and generate a new log if needed). **INFORMATION**: it'll be better if you share the device-to-debugged to a 2nd eWeLink account and use this in HA and main account in mobile app, this way you won't be logged out of the app anymore and the generated log will be restricted to only 1 device **NOTICE**: you should **NOT** leave debug-mode enabled for everyday use, please please just don't! This is just a proof of concept because I searched for it and there was no implementation to use Sonoff/eWeLink devices without flashing them. (although I know how to do it, I don't have a real extensive usage for now and I prefer to keep them on stock firmware). 
## Compatibility list | Model | Supported | 1.6 | 1.8.1 | 2.6 | 2.6.1 | 2.7.0 | 2.7.1 | 3.0.0 | 3.0.1 | 3.3.0 | Remarks | |-------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------:|:---:|:-----:|:---:|-------|-------|:-----:|:-----:|:-----:|:----------------------------:|--------------------------------------------------------------------------------------------| | Sonoff Basic | yes | yes | yes | yes | | | | yes | | | | | Sonoff Basic R3 | yes | | | | | | | | | yes | | | Sonoff Dual | yes | | | | | | | | | | | | Sonoff RF | yes | | | yes | yes | | | yes | | | | | Sonoff B1 | yes | | | | | | | | | | ON/OFF, warm/cool white, colors | | Sonoff SC | yes | | | | | yes | | | | | hum/temp/dust/light/noise sensors | | Sonoff RF Bridge | yes | | | | | yes | | | | | binary sensors that are ON for ~2sec | | Sonoff G1 | ? | | | | | | | | | | | | Sonoff 4CH Pro | yes | | | yes | | | yes | | yes | | | | Sonoff 4CH Pro R2 | yes | | | yes | | | | yes | yes | | | | Sonoff S20 | yes | | yes | | | | | yes | | | | | Sonoff S30 | yes | | | | | | | yes | | | | | Sonoff S31 | yes | | | | | | | | | | + power/current/voltage sensors | | [Sonoff S26](https://www.aliexpress.com/item/Sonoff-S26-WiFi-Smart-Socket-Wireless-Plug-Power-Socket-Smart-Home-Switch-Smart-Remote-Control-for/32956551752.html) | yes | | | yes | | | | yes | | yes | version: Euro | | Sonoff T1 1C | yes | | | yes | | | | | | | | | Sonoff T1 EU 2C | yes | | | | | | yes | | | | | | Sonoff T1 UK 3C | yes | | | yes | | | yes | | | | | | Sonoff T1 US 3C | yes | | | | | | | | | | | | Sonoff TX 1C | yes | | | | | | | | | yes | | | Sonoff Pow | yes | | | | | | | | | | + power sensor | | Sonoff Pow R2 | yes | | | | | | | | | partial **NO sensors data!** | + power/current/voltage sensors | | Sonoff TH10/TH16 | yes | | | | | | | | | | + temp/humidity sensors | | Sonoff iFan02 | yes | | | | | | | | | | 1 fan + 1 light 
entities | | Sonoff iFan03 | yes | | | | | | | | | | 1 fan + 1 light entities | | Sonoff HT-TH31 | ? | | | | | | | | | | | | [Sonoff Slampher RF](https://www.gearbest.com/smart-light-bulb/pp_1824903.html) | yes | | | | | | yes | yes | yes | yes | | | [3 Gang Generic Wall Switch](https://www.amazon.in/gp/product/B07FLY398G) | yes | | | yes | | | | | | yes | Manfufacturer: pro-sw, Model: PS-15-ES (according to ewelink app) | | [1 Gang Generic Wall Switch](https://www.aliexpress.com/item/1-Gang-US-EU-UK-Plug-Wall-Wifi-Light-Switch-Smart-Touch-LED-Lights-Switch-for/32934184095.html) | yes | | | yes | | | | yes | | yes | manfufacturer: KingART, model: KING-N1 (according to ewelink app), Chip: PSF-B85 (ESP8285) | | WHDTS WiFi Momentary Inching Relay | yes | | | | | | | | | | displayed as a switch button | | [MHCOZY WiFi Wireless 5V/12V](https://www.amazon.com/gp/product/B07CJ6DSQC/ref=oh_aui_search_detailpage?ie=UTF8&psc=1) | yes | | | | | | | | | | | | [Geekcreit 2 Channel AC 85V-250V](https://www.ebay.es/itm/Geekcreit-2-Channel-AC-85V-250V-APP-Remote-Control-WIFI-Wireless-Switch-Socket-F-/162844446103) | yes | | | | | | yes | | | | | | [Smart Wi-Fi Outlet](https://www.amazon.com/gp/product/B073VK9X49/ref=oh_aui_detailpage_o01_s01?ie=UTF8&psc=1) | yes | | | | | | | | | | | | Sonoff Mini | yes | | | | | | | yes | | yes | | `yes` = confirmed version, [empty] = unknown for sure ## Updates - 2019.04.08 - HA0.88+ new component structure - added basic rules to create the same number of switches as presented by the physical device - 2019.02.++ alternate faster version with state updates over websocket developed - 2019.01.06 create sensors for devices that have support for power/current/voltage/temperature/humidity - 2018.12.05 - mandarin phone number login support - removed `entity_name` option, the entities will have a fixed structure from now on - 2018.12.01 - ability to control devices with multiple switches - added mobile app specified device-name as default to be listed 
in HA entity, added `entity_name` option and removed the default `sonoff_` prefix - fixed bug that will show device as unavailable in the grace period - 2018.11.29 shared devices from another account can be used also - 2018.11.28 - mobile app-like login to the closest region - added `scan_interval` option - added `grace_period` option ## Requests / Bugs Feel free to properly ask support for new devices using the guidelines mentioned in the section above regarding the `debug` section (or [the older basic version](https://github.com/peterbuga/HASS-sonoff-ewelink/tree/master/sonoff-debug)) / report bugs / request features / fork (& pull request) and I'll try to see what I can do. ## Credits - most of the logic & code was done (partially) porting this awesome repo (+those that it extends itself) https://github.com/howanghk/homebridge-ewelink - [@2016for](https://github.com/2016for) for assisting me with properly integrating the switches with multiple outlets - [@fireinice](https://github.com/fireinice) for providing the mandarin implementation - [@SergeyAnokhin](https://github.com/SergeyAnokhin) for adding power meter info to entity attributes - [@difelice](https://github.com/difelice) for debugging support #### awesome ❤️ & support 🙌! - [@Michaelrch](https://community.home-assistant.io/u/michaelrch) - [@daboshman](https://github.com/daboshman) - [@primalnow](https://github.com/primalnow) ## Donate Feel free to help me invest in more devices to test and add faster new features to this component! [![paypal](https://www.paypalobjects.com/en_US/IT/i/btn/btn_donateCC_LG.gif)](https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=WXDJJFCULBVSL&source=url)
137.584507
531
0.360598
eng_Latn
0.972347
c90c4192847b9d2b0d9f48a64292a8020e4970d1
763
md
Markdown
docs/payroll_uk/DeductionLine.md
jweakley/xero-ruby
b14156b4a621463766cd02a59c907e5f509d1beb
[ "MIT" ]
45
2019-12-15T02:58:49.000Z
2022-02-25T03:15:26.000Z
docs/payroll_uk/DeductionLine.md
jweakley/xero-ruby
b14156b4a621463766cd02a59c907e5f509d1beb
[ "MIT" ]
149
2019-12-09T15:21:52.000Z
2022-03-22T16:20:11.000Z
docs/payroll_uk/DeductionLine.md
sharesight/xero-ruby
c97fb141b096aee148021074e369fa1de08b256d
[ "MIT" ]
81
2019-12-11T10:00:57.000Z
2022-03-24T04:45:26.000Z
# XeroRuby::PayrollUk::DeductionLine

## Properties

Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**deduction_type_id** | **String** | Xero identifier for payroll deduction | [optional]
**amount** | **Float** | The amount of the deduction line | [optional]
**subject_to_tax** | **Boolean** | Identifies if the deduction is subject to tax | [optional]
**percentage** | **Float** | Deduction rate percentage | [optional]

## Code Sample

```ruby
require 'XeroRuby::PayrollUk'

instance = XeroRuby::PayrollUk::DeductionLine.new(deduction_type_id: nil, amount: nil, subject_to_tax: nil, percentage: nil)
```
31.791667
94
0.562254
eng_Latn
0.537669
c90c7b3a34f22531b465efd019769b59abc21c2c
3,742
md
Markdown
README.md
MuharremOkutan/BlocklyML
4caafb95e5f8bbaf0ab5914d971cd9380546bf4e
[ "Apache-2.0" ]
1
2022-03-28T08:48:27.000Z
2022-03-28T08:48:27.000Z
README.md
MuharremOkutan/BlocklyML
4caafb95e5f8bbaf0ab5914d971cd9380546bf4e
[ "Apache-2.0" ]
null
null
null
README.md
MuharremOkutan/BlocklyML
4caafb95e5f8bbaf0ab5914d971cd9380546bf4e
[ "Apache-2.0" ]
null
null
null
![](https://raw.githubusercontent.com/chekoduadarsh/BlocklyML/main/media/blocklyML_Banner.png)

<h1 style="text-align: center;"><a href="https://blocklyml.herokuapp.com/ ">Blockly ML</a></h1>

![](https://img.shields.io/github/license/chekoduadarsh/BlocklyML) ![](https://img.shields.io/github/issues/chekoduadarsh/BlocklyML) ![](https://img.shields.io/github/last-commit/chekoduadarsh/BlocklyML)

[blocklyml.herokuapp.com](https://blocklyml.herokuapp.com/)

### What is BlocklyML?

BlocklyML is a **No Code** training ground for Python and Machine Learning. This tool is designed to simplify standard machine learning implementation. It can assist anyone who wants to start with Machine Learning or Python. This is a project forked from [Blockly](https://github.com/google/blockly) and adapted for machine learning and data analytics use-cases. :brain:

For a sample run, go to the sampleLayouts folder, upload a layout, and try it out :smiley:

Read [UserGuide.md](https://github.com/chekoduadarsh/BlocklyML/blob/main/UserGuide.md) for further info.

In the example given below we will train a random forest on the Iris dataset:

<img src="https://github.com/chekoduadarsh/BlocklyML/blob/main/media/IrisRandomForest.png" alt="drawing" width="500"/>

# Table of contents

- [Table of contents](#table-of-contents)
- [Installing as BlocklyML App](#installing-as-blocklyml-app)
  - [Flask Method](#flask-method)
- [UI Features](#ui-features)
  - [Shortcuts](#shortcuts)
  - [Dataframe Viewer](#dataframe-viewer)
  - [Download Code](#download-code)
- [Contribute](#contribute)
  - [This repo welcomes any kind of contributions :pray:](#this-repo-welcomes-any-kind-of-contributions-pray)
- [License](#license)
- [Thanks to](#thanks-to)

# Installing as BlocklyML App

First clone this repo:

```shell
git clone https://github.com/chekoduadarsh/BlocklyML
```

After cloning the repo you can follow the Flask Method.

### Flask Method

Install the requirements from `requirements.txt` with the following command:

```shell
pip install -r requirements.txt
```

then you can run the application by:

```shell
python app.py
```

Simple as that :man_shrugging:

# UI Features

## Shortcuts

You can find these buttons in the top right corner of the application. Their functionality is as follows:

1. Download XML Layout
2. Upload XML layout
3. Copy Code
4. Launch Google Colab
5. Delete
6. Run (Not Supported Yet!!)

<img src="https://github.com/chekoduadarsh/BlocklyML/blob/main/media/butttons.png" alt="drawing" width="500"/>

## Dataframe Viewer

BlocklyML supports a complete HTML view of the DataFrame. This can be accessed from the view option in the navigation bar.

<img src="https://github.com/chekoduadarsh/BlocklyML/blob/main/media/DatasetView.png" alt="drawing" width="500"/>

## Download Code

BlocklyML supports both .py and .ipynb formats. You can download the code from the download option in the navigation bar.

<img src="https://github.com/chekoduadarsh/BlocklyML/blob/main/media/DownloadView.png" alt="drawing" width="200"/>

# Contribute

If you find any error or need support please raise an issue. If you think you can add a feature, or help solve a bug, please raise a PR.

### This repo welcomes any kind of contributions :pray:

Feel free to adapt it, criticize it, and support it the way you like!!

Read: [CONTRIBUTING.md](./CONTRIBUTING.md)

# License

[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)

# Thanks to

[![Stargazers repo roster for @chekoduadarsh/BlocklyML](https://reporoster.com/stars/chekoduadarsh/BlocklyML)](https://github.com/chekoduadarsh/BlocklyML/stargazers)

[!["Buy Me A Coffee"](https://www.buymeacoffee.com/assets/img/custom_images/orange_img.png)](https://www.buymeacoffee.com/chekoduadarsh)
34.648148
225
0.758685
eng_Latn
0.754746
c90d70ecc8ccc177d918e67d41bb503298ba9218
3,305
md
Markdown
README.md
silarsis/pevl
f10020800e05bc14f168a059eb8311b3f025adea
[ "MIT" ]
null
null
null
README.md
silarsis/pevl
f10020800e05bc14f168a059eb8311b3f025adea
[ "MIT" ]
null
null
null
README.md
silarsis/pevl
f10020800e05bc14f168a059eb8311b3f025adea
[ "MIT" ]
null
null
null
# pevl Python Event Versioning Library ![](https://github.com/silarsis/pevl/workflows/Python%20Event%20Versioning%20Libary/badge.svg) This library provides code to upgrade events. Think of it as an event equivalent to flyway, where you can define the mappings between events and have all incoming events upgraded to a version your code can deal with, on the fly. The target environment is either an event sourcing system, or microservices with an event bus - anywhere you're consuming events and doing something with them. Pre-requisites ============== All events must be versioned. That is, there must be a version number in the event structure somewhere. There must be a way to map from a given event version to version+1 for every version number. This is typically done by defining an upgrade method. The library provides a class, `Upgrader`, which will take events in, extract the version number, run all appropriate upgrade methods until the version is current, then call an event factory with the upgraded data and return the appropriate object. Design decisions (and whys) =========================== * Factory is only called with the latest data, not for each version - this is to save from having to maintain factories for every version. * Supporting Python 3.8 and above - because I want static types. * All events are dictionaries - for simplicity. Events are deepcopy'ed as part of the implementation. * Upgrade methods may mutate events in place - each event is provided as a deepcopy (so don't upgrade things that don't deepcopy). If you return, that will be used - if not, the assumed mutated event will be used. * Upgrade methods will get dictionaries and return dictionaries, and wrapping will be invisible - because I want this to be as simple to use as possible. * All versions should be strings, not numbers or other oddball things - for simplicity. * No version will be represented with str(None) (ie. 
'None') How to Use ========== Create a module that holds a bunch of version upgrade methods. Each method should use the `@pevl.event.upgrade` decorator, and in the decorator you specify a version number to apply to and a new-version number to set (otherwise you must set the new version number in the upgrade script itself). Each method should apply an appropriate upgrade from the version number given to the next version number. Each method should be named descriptively. Each method takes an event :dict, and returns an event: dict Version numbers do not need to sort properly - but each version needs to upgrade to another version that is then handled (or not, if final version) Example: ```python from pevl.event import upgrade @upgrade('v0.1', 'v0.2') def split_first_last_name(event): event['first'], event['last'] = event['name'].split() ``` The code to upgrade the events: ```python from pevl.event import Upgrader upgrader = event.Upgrader(upgrades=[split_first_name, ]) upgraded = upgrader.ingest(event, target_version='v0.2') ``` For an example of the code that does the upgrades, see `sample_sns.py` Likely Failure Modes ==================== Make sure your versions match what comes out of the event itself. str/int mismatches may be silently ignored. Make sure you have a version. Again, failed version matches may be ignored.
37.988506
231
0.759455
eng_Latn
0.999182
c90e1e21ed88b04a237e5ae56ecb1a7f10f6f971
374
markdown
Markdown
_posts/2013-11-25-let-my-data-go.markdown
inlovewithcode/inlovewithcode.github.io
60e0bbc52fd4af0ed7a2e5d2fe5d0c06e42e338a
[ "MIT" ]
null
null
null
_posts/2013-11-25-let-my-data-go.markdown
inlovewithcode/inlovewithcode.github.io
60e0bbc52fd4af0ed7a2e5d2fe5d0c06e42e338a
[ "MIT" ]
null
null
null
_posts/2013-11-25-let-my-data-go.markdown
inlovewithcode/inlovewithcode.github.io
60e0bbc52fd4af0ed7a2e5d2fe5d0c06e42e338a
[ "MIT" ]
1
2018-07-27T09:02:51.000Z
2018-07-27T09:02:51.000Z
--- author: guidedmissile comments: true date: 2013-11-25 09:08:46+00:00 layout: post link: https://inlovewithcode.wordpress.com/2013/11/25/let-my-data-go/ slug: let-my-data-go title: Let My Data Go! wordpress_id: 60 categories: - Oracle --- Beautifully written about Oracle DB Session Locks, also check http://www.oracle-base.com/articles/misc/killing-oracle-sessions.php
23.375
79
0.764706
eng_Latn
0.393112
c90e29aff9bbc3ec6ba3b463c64b7efc80abfbd4
40
md
Markdown
README.md
ElenaKusevska/Data_exercises
5c8b2cf7b0c51310cdc0192cf2e39086008f5940
[ "MIT" ]
null
null
null
README.md
ElenaKusevska/Data_exercises
5c8b2cf7b0c51310cdc0192cf2e39086008f5940
[ "MIT" ]
null
null
null
README.md
ElenaKusevska/Data_exercises
5c8b2cf7b0c51310cdc0192cf2e39086008f5940
[ "MIT" ]
null
null
null
# Data_exercises Data Science exercises
13.333333
22
0.85
kor_Hang
0.707893
c90f48d382e7ee1077108338f4224464083d7cb2
301
md
Markdown
docs/vasttools/-vast-tools/com.gcode.vasttools.annotation/-date-format/-g-m-t_-p-l-u-s_-t-w-o.md
SakurajimaMaii/Utils
89c8d4e64bf82157ab82c8d29d2e6c02a2b876ad
[ "MIT" ]
1
2021-03-31T03:31:43.000Z
2021-03-31T03:31:43.000Z
docs/vasttools/-vast-tools/com.gcode.vasttools.annotation/-date-format/-g-m-t_-p-l-u-s_-t-w-o.md
SakurajimaMaii/Utils
89c8d4e64bf82157ab82c8d29d2e6c02a2b876ad
[ "MIT" ]
2
2021-09-10T02:40:20.000Z
2021-09-27T08:34:09.000Z
docs/vasttools/-vast-tools/com.gcode.vasttools.annotation/-date-format/-g-m-t_-p-l-u-s_-t-w-o.md
SakurajimaMaii/GUtils
545dc2bab4653b7647dbb5f426a5e4fe842dfa4d
[ "MIT" ]
null
null
null
//[VastTools](../../../index.md)/[com.gcode.vasttools.annotation](../index.md)/[DateFormat](index.md)/[GMT_PLUS_TWO](-g-m-t_-p-l-u-s_-t-w-o.md) # GMT_PLUS_TWO [androidJvm]\ val [GMT_PLUS_TWO](-g-m-t_-p-l-u-s_-t-w-o.md): [String](https://developer.android.com/reference/kotlin/java/lang/String.html)
43
143
0.687708
yue_Hant
0.527931
c90fecc53f4887e811dbe2e1a7595b51f5c54115
2,688
md
Markdown
contracts/service.md
scada-bricks/gitbook-docs
f6b4486366064437e7e74381832a48130e9c0ce2
[ "MIT" ]
null
null
null
contracts/service.md
scada-bricks/gitbook-docs
f6b4486366064437e7e74381832a48130e9c0ce2
[ "MIT" ]
null
null
null
contracts/service.md
scada-bricks/gitbook-docs
f6b4486366064437e7e74381832a48130e9c0ce2
[ "MIT" ]
null
null
null
--- description: 'Contract names: configurable-service and config-service' --- # Configurable Service ## Configurable-service This **core contract** defines a simple interface to allow services to be configured and updated remotely. This allows for centrally managed configuration via a service implementing the **config-service** contract. Individual services receive the specific configuration they need by implementing the `set-config` end point in their http interface. This config contains all the options for the target service to run. ## Mandatory methods All services must support the following methods {% api-method method="post" host="/set-config" path="" %} {% api-method-summary %} set-config {% endapi-method-summary %} {% api-method-description %} This method is called by a **Config Service** on service startup and subsequently any time the config changes. Every service is responsible for receiving its config in JSON via this method, and updating or restarting. **Config Services** are covered in more detail later on. {% endapi-method-description %} {% api-method-spec %} {% api-method-request %} {% api-method-path-parameters %} {% api-method-parameter name="config" type="object" required=true %} The config for the target service {% endapi-method-parameter %} {% endapi-method-path-parameters %} {% endapi-method-request %} {% api-method-response %} {% api-method-response-example httpCode=200 %} {% api-method-response-example-description %} {% endapi-method-response-example-description %} ``` ``` {% endapi-method-response-example %} {% endapi-method-response %} {% endapi-method-spec %} {% endapi-method %} ## Config-service Config services are responsible for setting and updating the configuration for some or all services. They will generally incorporate links to some form of persistence \(such as a database or key/value store\). The reason to use a centralised config service is to easily facilitate updates to the config for any running services. 
This could be adding new devices, increasing or changing the collected tags, etc.

The config service itself is responsible for retrieving all the config and pushing it out to the individual services, and therefore config services know about the existence of all the services they are responsible for. All the target services must implement the **configurable-service** contract.

The config service itself doesn't need to implement any contract, and therefore it may not have an HTTP API. Often, though, a config service will implement the **configurable-service** contract itself, to enable applications to communicate with it and manipulate the whole installation's configuration.
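Neither contract mandates an implementation language, but the `set-config` request shape above is concrete enough to sketch both sides of the exchange. The following is a minimal, hypothetical sketch in Python (names such as `ConfigurableHandler` and `push_config` are illustrative, not part of the contracts): one process implements the configurable-service contract by accepting `POST /set-config`, and a `push_config` helper plays the config-service role by pushing a JSON config to it.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Running configuration of the (hypothetical) target service.
CONFIG = {}

class ConfigurableHandler(BaseHTTPRequestHandler):
    """Implements the configurable-service contract: POST /set-config."""

    def do_POST(self):
        if self.path != "/set-config":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        # A real service would validate the config here, then
        # reconfigure or restart its workers accordingly.
        CONFIG.clear()
        CONFIG.update(payload["config"])
        self.send_response(200)
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # silence per-request logging

def push_config(base_url, config):
    """The config-service side: push a JSON config to one target."""
    request = urllib.request.Request(
        base_url + "/set-config",
        data=json.dumps({"config": config}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Start the target service on an ephemeral port and push a config to it.
server = HTTPServer(("127.0.0.1", 0), ConfigurableHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
status = push_config("http://127.0.0.1:%d" % server.server_address[1],
                     {"devices": ["sensor-1"], "tags": ["temperature"]})
server.shutdown()
print(status, CONFIG)
```

In a real installation the config service would loop over every registered target and repeat the push whenever the persisted configuration changes; the target is free to apply the new config in place or restart itself.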
---
uid: mvc/overview/older-versions/hands-on-labs/aspnet-mvc-4-custom-action-filters
title: "ASP.NET MVC 4 Custom Action Filters | Microsoft Docs"
author: rick-anderson
description: "ASP.NET MVC provides Action Filters for executing filtering logic either before or after an action method is called. Action Filters are custom attributes tha..."
ms.author: aspnetcontent
manager: wpickett
ms.date: 02/18/2013
ms.topic: article
ms.assetid: 969ab824-1b98-4552-81fe-b60ef5fc6887
ms.technology: dotnet-mvc
ms.prod: .net-framework
msc.legacyurl: /mvc/overview/older-versions/hands-on-labs/aspnet-mvc-4-custom-action-filters
msc.type: authoredcontent
---
# <a name="aspnet-mvc-4-custom-action-filters"></a>ASP.NET MVC 4 Custom Action Filters

By [Web Camps Team](https://twitter.com/webcamps)

[Download Web Camps Training Kit](https://aka.ms/webcamps-training-kit)

ASP.NET MVC provides Action Filters for executing filtering logic either before or after an action method is called. Action Filters are custom attributes that provide a declarative means to add pre-action and post-action behavior to a controller's action methods.

In this Hands-on Lab you will create a custom action filter attribute in the MvcMusicStore solution to catch a controller's requests and log the activity of a site to a database table. You will be able to add your logging filter by injection to any controller or action. Finally, you will see the log view that shows the list of visitors.

This Hands-on Lab assumes you have basic knowledge of **ASP.NET MVC**.
If you have not used **ASP.NET MVC** before, we recommend you go over the **ASP.NET MVC 4 fundamentals** Hands-on Lab.

> [!NOTE]
> All sample code and snippets are included in the Web Camps Training Kit, available from [Microsoft-Web/WebCampTrainingKit Releases](https://aka.ms/webcamps-training-kit). The project specific to this lab is available at [ASP.NET MVC 4 Custom Action Filters](https://github.com/Microsoft-Web/HOL-MVC4CustomActionFilters).

<a id="Objectives"></a>

### <a name="objectives"></a>Objectives

In this Hands-On Lab, you will learn how to:

- Create a custom action filter attribute to extend filtering capabilities
- Apply a custom filter attribute by injection at a specific level
- Register custom action filters globally

<a id="Prerequisites"></a>
<a id="Prerequisites"></a>

### <a name="prerequisites"></a>Prerequisites

You must have the following items to complete this lab:

- [Microsoft Visual Studio Express 2012 for Web](https://www.microsoft.com/visualstudio/eng/products/visual-studio-express-for-web) or superior (read [Appendix A](#AppendixA) for instructions on how to install it).

<a id="Setup"></a>
<a id="Setup"></a>

### <a name="setup"></a>Setup

**Installing Code Snippets**

For convenience, much of the code you will be managing along this lab is available as Visual Studio code snippets. To install the code snippets run the **.\Source\Setup\CodeSnippets.vsi** file.

If you are not familiar with Visual Studio Code Snippets and want to learn how to use them, you can refer to the appendix of this document &quot;[Appendix C: Using Code Snippets](#AppendixC)&quot;.
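The lab's actual code ships as the referenced sample files, but as a rough preview of what Exercise 1 builds, a custom action filter has the following general shape. This is only an illustrative C# sketch (the real snippet, sample3.cs, persists an **ActionLog** entity through Entity Framework rather than writing to the trace, and the exact member names there may differ):

```csharp
using System.Web.Mvc;

namespace MvcMusicStore.Filters
{
    // Illustrative sketch only -- the lab's sample3.cs logs to the
    // ActionLog database table instead of the diagnostic trace.
    public class CustomActionFilter : ActionFilterAttribute, IActionFilter
    {
        protected override void OnActionExecuting(ActionExecutingContext filterContext)
        {
            // filterContext exposes the route data and the HTTP context,
            // which is where the lab reads the controller name, the
            // called action and the client IP.
            var controller = filterContext.RouteData.Values["controller"];
            var action = filterContext.RouteData.Values["action"];
            var clientIp = filterContext.HttpContext.Request.UserHostAddress;
            System.Diagnostics.Trace.WriteLine(
                string.Format("{0}.{1} requested from {2}", controller, action, clientIp));
            base.OnActionExecuting(filterContext);
        }
    }
}
```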
* * *

<a id="Exercises"></a>
<a id="Exercises"></a>

## <a name="exercises"></a>Exercises

This Hands-On Lab is made up of the following exercises:

1. [Exercise 1: Logging Actions](#Exercise1)
2. [Exercise 2: Managing Multiple Action Filters](#Exercise2)

Estimated time to complete this lab: **30 minutes**.

> [!NOTE]
> Each exercise is accompanied by an **End** folder containing the resulting solution you should obtain after completing the exercises. You can use this solution as a guide if you need additional help working through the exercises.

<a id="Exercise1"></a>
<a id="Exercise_1_Logging_Actions"></a>

### <a name="exercise-1-logging-actions"></a>Exercise 1: Logging Actions

In this exercise, you will learn how to create a custom action log filter using ASP.NET MVC 4 Filter Providers. For that purpose you will apply a logging filter to the MusicStore site that will record all the activities in the selected controllers.

The filter will extend **ActionFilterAttributeClass** and override the **OnActionExecuting** method to catch each request and then perform the logging actions. The context information about HTTP requests, executing methods, results and parameters will be provided by the ASP.NET MVC **ActionExecutingContext** class**.**

> [!NOTE]
> ASP.NET MVC 4 also has default filter providers you can use without creating a custom filter. ASP.NET MVC 4 provides the following types of filters:
>
> - **Authorization** filter, which makes security decisions about whether to execute an action method, such as performing authentication or validating properties of the request.
> - **Action** filter, which wraps the action method execution.
This filter can perform additional processing, such as providing extra data to the action method, inspecting the return value, or canceling execution of the action method
> - **Result** filter, which wraps execution of the ActionResult object. This filter can perform additional processing of the result, such as modifying the HTTP response.
> - **Exception** filter, which executes if there is an unhandled exception thrown in an action method, starting with the authorization filters and ending with the execution of the result. Exception filters can be used for tasks such as logging or displaying an error page.
>
> For more information about Filter Providers, please visit this MSDN link: ([https://msdn.microsoft.com/library/dd410209.aspx](https://msdn.microsoft.com/library/dd410209.aspx)).

<a id="AboutLoggingFeature"></a>
<a id="About_MVC_Music_Store_Application_logging_feature"></a>

#### <a name="about-mvc-music-store-application-logging-feature"></a>About MVC Music Store Application logging feature

This Music Store solution has a new data model table for site logging, **ActionLog**, with the following fields: name of the controller that received a request, the called action, the client IP and a timestamp.

![Data model. ActionLog table.](aspnet-mvc-4-custom-action-filters/_static/image1.png "Data model.
ActionLog table.")

*Data model - ActionLog table*

The solution provides an ASP.NET MVC view for the action log that can be found at **MvcMusicStores/Views/ActionLog**:

![Action log view](aspnet-mvc-4-custom-action-filters/_static/image2.png "Action log view")

*Action log view*

Given this structure, all the work will be focused on intercepting the controller's request and performing the logging by means of custom filtering.

<a id="Ex1Task1"></a>
<a id="Task_1_-_Creating_a_Custom_Filter_to_Catch_a_Controllers_Request"></a>

#### <a name="task-1---creating-a-custom-filter-to-catch-a-controllers-request"></a>Task 1 - Creating a Custom Filter to Catch a Controller's Request

In this task you will create a custom filter attribute class that will contain the logging logic. For that purpose you will extend the ASP.NET MVC **ActionFilterAttribute** class and implement the **IActionFilter** interface.

> [!NOTE]
> The **ActionFilterAttribute** is the base class for all the attribute filters. It provides the following methods to execute a specific logic after and before the controller action's execution:
>
> - **OnActionExecuting**(ActionExecutingContext filterContext): Just before the action method is called.
> - **OnActionExecuted**(ActionExecutedContext filterContext): After the action method is called and before the result is executed (before view render).
> - **OnResultExecuting**(ResultExecutingContext filterContext): Just before the result is executed (before view render).
> - **OnResultExecuted**(ResultExecutedContext filterContext): After the result is executed (after the view is rendered).
>
> By overriding any of these methods in a derived class, you can execute your own filtering code.

1. Open the **Begin** solution located in the **\Source\Ex01-LoggingActions\Begin** folder.
   1. You will first need to download some missing NuGet packages before continuing. To do this, click the **Project** menu and select **Manage NuGet Packages**.
   2. In the **Manage NuGet Packages** dialog, click **Restore** in order to download the missing packages.
   3. Finally, build the solution by clicking **Build** | **Build Solution**.

   > [!NOTE]
   > One of the advantages of using NuGet is that you don't have to ship all the libraries in your project, reducing the project size. With NuGet Power Tools, by specifying the package versions in the Packages.config file, you will be able to download all the required libraries the first time you run the project. This is why you will have to run these steps after you open an existing solution from this lab.
   >
   > For more information, see this article: [http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages](http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages).
2. Add a new C# class into the **Filters** folder and name it *CustomActionFilter.cs*. This folder will store all the custom filters.
3. Open **CustomActionFilter.cs** and add a reference to the **System.Web.Mvc** and **MvcMusicStore.Models** namespaces:

   (Code Snippet - *ASP.NET MVC 4 Custom Action Filters - Ex1-CustomActionFilterNamespaces*)

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample1.cs)]
4. Make the **CustomActionFilter** class inherit from **ActionFilterAttribute** and then make **CustomActionFilter** implement the **IActionFilter** interface.

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample2.cs)]
5. Make the **CustomActionFilter** class override the **OnActionExecuting** method and add the necessary logic to log the filter's execution.
To do this, add the following highlighted code within the **CustomActionFilter** class.

   (Code Snippet - *ASP.NET MVC 4 Custom Action Filters - Ex1-LoggingActions*)

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample3.cs#Highlight)]

   > [!NOTE]
   > The **OnActionExecuting** method uses **Entity Framework** to add a new ActionLog record. It creates and fills a new entity instance with the context information from **filterContext**.
   >
   > You can read more about the **ControllerContext** class at [msdn](https://msdn.microsoft.com/library/system.web.mvc.controllercontext.aspx).

<a id="Ex1Task2"></a>
<a id="Task_2_-_Injecting_a_Code_Interceptor_into_the_Store_Controller_Class"></a>

#### <a name="task-2---injecting-a-code-interceptor-into-the-store-controller-class"></a>Task 2 - Injecting a Code Interceptor into the Store Controller Class

In this task you will add the custom filter by injecting it into all the controller classes and controller actions that will be logged. For the purposes of this exercise, the Store Controller class will have a log.

The **OnActionExecuting** method of the **ActionLogFilterAttribute** custom filter runs when an injected element is called. It is also possible to intercept a specific controller method.

1. Open the **StoreController** at **MvcMusicStore\Controllers** and add a reference to the **Filters** namespace:

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample4.cs)]
2. Inject the custom filter **CustomActionFilter** into the **StoreController** class by adding the **[CustomActionFilter]** attribute before the class declaration.

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample5.cs)]

   > [!NOTE]
   > When a filter is injected into a controller class, all its actions are also injected.
   > If you would like to apply the filter only to a set of actions, you would have to inject **[CustomActionFilter]** into each of them:
   >
   > [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample6.cs)]

<a id="Ex1Task3"></a>
<a id="Task_3_-_Running_the_Application"></a>

#### <a name="task-3---running-the-application"></a>Task 3 - Running the Application

In this task, you will test that the logging filter is working. You will start the application and visit the store, and then you will check the logged activities.

1. Press **F5** to run the application.
2. Browse to **/ActionLog** to see the initial state of the log view:

   ![Log tracker status before page activity](aspnet-mvc-4-custom-action-filters/_static/image3.png "Log tracker status before page activity")

   *Log tracker status before page activity*

   > [!NOTE]
   > By default, it will always show one item that is generated when retrieving the existing genres for the menu.
   >
   > For simplicity's sake we're cleaning up the **ActionLog** table each time the application runs, so it will only show the logs for the verification of each particular task.
   >
   > You might need to remove the following code from the **Session\_Start** method (in the **Global.asax** class) in order to save a historical log for all the actions executed within the Store Controller.
   >
   > [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample7.cs)]
3. Click one of the **Genres** from the menu and perform some actions there, like browsing an available album.
4. Browse to **/ActionLog** and, if the log is empty, press **F5** to refresh the page.
Check that your visits were tracked:

   ![Action log with activity logged](aspnet-mvc-4-custom-action-filters/_static/image4.png "Action log with activity logged")

   *Action log with activity logged*

<a id="Exercise2"></a>
<a id="Exercise_2_Managing_Multiple_Action_Filters"></a>

### <a name="exercise-2-managing-multiple-action-filters"></a>Exercise 2: Managing Multiple Action Filters

In this exercise you will add a second custom action filter to the StoreController class and define the specific order in which both filters will be executed. Then you will update the code to register the filter globally.

There are different options to take into account when defining the filters' execution order, for example, the Order property and the filters' scope:

You can define a **Scope** for each of the filters; for example, you could scope all the action filters to run within the **Controller Scope**, and all the authorization filters to run in **Global Scope**. The scopes have a defined execution order.

Additionally, each action filter has an Order property which is used to determine the execution order within the scope of the filter.

For more information about the execution order of custom action filters, please visit this MSDN article: ([https://msdn.microsoft.com/library/dd381609(v=vs.98).aspx](https://msdn.microsoft.com/library/dd381609(v=vs.98).aspx)).

<a id="Ex2Task1"></a>
<a id="Task_1_Creating_a_new_Custom_Action_Filter"></a>

#### <a name="task-1-creating-a-new-custom-action-filter"></a>Task 1: Creating a new Custom Action Filter

In this task, you will create a new custom action filter to inject into the StoreController class, learning how to manage the execution order of the filters.
1. Open the **Begin** solution located in the **\Source\Ex02-ManagingMultipleActionFilters\Begin** folder. Otherwise, you might continue using the **End** solution obtained by completing the previous exercise.

   1. If you opened the provided **Begin** solution, you will need to download some missing NuGet packages before continuing. To do this, click the **Project** menu and select **Manage NuGet Packages**.
   2. In the **Manage NuGet Packages** dialog, click **Restore** in order to download the missing packages.
   3. Finally, build the solution by clicking **Build** | **Build Solution**.

   > [!NOTE]
   > One of the advantages of using NuGet is that you don't have to ship all the libraries in your project, reducing the project size. With NuGet Power Tools, by specifying the package versions in the Packages.config file, you will be able to download all the required libraries the first time you run the project. This is why you will have to run these steps after you open an existing solution from this lab.
   >
   > For more information, see this article: [http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages](http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages).
2. Add a new C# class into the **Filters** folder and name it *MyNewCustomActionFilter.cs*
3. Open **MyNewCustomActionFilter.cs** and add a reference to the **System.Web.Mvc** and **MvcMusicStore.Models** namespaces:

   (Code Snippet - *ASP.NET MVC 4 Custom Action Filters - Ex2-MyNewCustomActionFilterNamespaces*)

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample8.cs)]
4. Replace the default class declaration with the following code.
   (Code Snippet - *ASP.NET MVC 4 Custom Action Filters - Ex2-MyNewCustomActionFilterClass*)

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample9.cs)]

   > [!NOTE]
   > This custom action filter is almost the same as the one you created in the previous exercise. The main difference is that it has the *&quot;Logged By&quot;* attribute updated with this new class's name, to identify which filter registered the log.

<a id="Ex2Task2"></a>
<a id="Task_2_Injecting_a_new_Code_Interceptor_into_the_StoreController_Class"></a>

#### <a name="task-2-injecting-a-new-code-interceptor-into-the-storecontroller-class"></a>Task 2: Injecting a new Code Interceptor into the StoreController Class

In this task, you will add a new custom filter into the StoreController class and run the solution to verify how both filters work together.

1. Open the **StoreController** class located in **MvcMusicStore\Controllers** and inject the new custom filter **MyNewCustomActionFilter** into the **StoreController** class as shown in the following code.

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample10.cs)]
2. Now, run the application in order to see how these two custom action filters work. To do this, press **F5** and wait until the application starts.
3. Browse to **/ActionLog** to see the initial state of the log view.

   ![Log tracker status before page activity](aspnet-mvc-4-custom-action-filters/_static/image5.png "Log tracker status before page activity")

   *Log tracker status before page activity*

4. Click one of the **Genres** from the menu and perform some actions there, like browsing an available album.
5. Check that this time your visits were tracked twice: once for each of the custom action filters you added to the **StoreController** class.

   ![Action log with activity logged](aspnet-mvc-4-custom-action-filters/_static/image6.png "Action log with activity logged")

   *Action log with activity logged*

6. Close the browser.

<a id="Ex2Task3"></a>
<a id="Task_3_Managing_Filter_Ordering"></a>

#### <a name="task-3-managing-filter-ordering"></a>Task 3: Managing Filter Ordering

In this task, you will learn how to manage the filters' execution order using the Order property.

1. Open the **StoreController** class located in **MvcMusicStore\Controllers** and specify the **Order** property in both filters as shown below.

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample11.cs)]
2. Now, check how the filters are executed depending on the value of their Order property. You will find that the filter with the smallest Order value (**CustomActionFilter**) is the first one that is executed. Press **F5** and wait until the application starts.
3. Browse to **/ActionLog** to see the initial state of the log view.

   ![Log tracker status before page activity](aspnet-mvc-4-custom-action-filters/_static/image7.png "Log tracker status before page activity")

   *Log tracker status before page activity*

4. Click one of the **Genres** from the menu and perform some actions there, like browsing an available album.
5. Check that this time your visits were tracked ordered by the filters' Order value: the **CustomActionFilter** logs first.
   ![Action log with activity logged](aspnet-mvc-4-custom-action-filters/_static/image8.png "Action log with activity logged")

   *Action log with activity logged*

6. Now, you will update the filters' Order value and verify how the logging order changes. In the **StoreController** class, update the filters' Order value as shown below.

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample12.cs)]
7. Run the application again by pressing **F5**.
8. Click one of the **Genres** from the menu and perform some actions there, like browsing an available album.
9. Check that this time the logs created by the **MyNewCustomActionFilter** filter appear first.

   ![Action log with activity logged](aspnet-mvc-4-custom-action-filters/_static/image9.png "Action log with activity logged")

   *Action log with activity logged*

<a id="Ex2Task4"></a>
<a id="Task_4_Registering_Filters_Globally"></a>

#### <a name="task-4-registering-filters-globally"></a>Task 4: Registering Filters Globally

In this task, you will update the solution to register the new filter (**MyNewCustomActionFilter**) as a global filter. By doing this, it will be triggered by all the actions performed in the application and not only by those in the StoreController as in the previous task.

1. In the **StoreController** class, remove the **[MyNewCustomActionFilter]** attribute and the Order property from **[CustomActionFilter]**. It should look like the following:

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample13.cs)]
2. Open the **Global.asax** file and locate the **Application\_Start** method. Notice that each time the application starts, it registers the global filters by calling the **RegisterGlobalFilters** method of the **FilterConfig** class.
   ![Registering global filters in Global.asax](aspnet-mvc-4-custom-action-filters/_static/image10.png "Registering global filters in Global.asax")

   *Registering global filters in Global.asax*

3. Open the **FilterConfig.cs** file within the **App\_Start** folder.
4. Add a reference to the **System.Web.Mvc** and **MvcMusicStore.Filters** namespaces.

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample14.cs)]
5. Update the **RegisterGlobalFilters** method by adding your custom filter. To do this, add the highlighted code:

   [!code-csharp[Main](aspnet-mvc-4-custom-action-filters/samples/sample15.cs)]
6. Run the application by pressing **F5**.
7. Click one of the **Genres** from the menu and perform some actions there, like browsing an available album.
8. Check that now **[MyNewCustomActionFilter]** is being injected into HomeController and ActionLogController too.

   ![Action log with activity logged](aspnet-mvc-4-custom-action-filters/_static/image11.png "Action log with activity logged")

   *Action log with global activity logged*

> [!NOTE]
> Additionally, you can deploy this application to Windows Azure Web Sites by following [Appendix B: Publishing an ASP.NET MVC 4 Application using Web Deploy](#AppendixB).

* * *

<a id="Summary"></a>
<a id="Summary"></a>

## <a name="summary"></a>Summary

By completing this Hands-On Lab you have learned how to extend an action filter to execute custom actions. You have also learned how to inject any filter into your page controllers.
The following concepts were used:

- How to create custom action filters with the ASP.NET MVC ActionFilterAttribute class
- How to inject filters into ASP.NET MVC controllers
- How to manage filter ordering using the Order property
- How to register filters globally

<a id="AppendixA"></a>
<a id="Appendix_A_Installing_Visual_Studio_Express_2012_for_Web"></a>

## <a name="appendix-a-installing-visual-studio-express-2012-for-web"></a>Appendix A: Installing Visual Studio Express 2012 for Web

You can install **Microsoft Visual Studio Express 2012 for Web** or another &quot;Express&quot; version using the **[Microsoft Web Platform Installer](https://www.microsoft.com/web/downloads/platform.aspx)**. The following instructions guide you through the steps required to install *Visual Studio Express 2012 for Web* using the *Microsoft Web Platform Installer*.

1. Go to [https://go.microsoft.com/?linkid=9810169](https://go.microsoft.com/?linkid=9810169). Alternatively, if you have already installed Web Platform Installer, you can open it and search for the product &quot;*Visual Studio Express 2012 for Web with Windows Azure SDK*&quot;.
2. Click **Install Now**. If you do not have **Web Platform Installer** you will be redirected to download and install it first.
3. Once **Web Platform Installer** is open, click **Install** to start the setup.

   ![Install Visual Studio Express](aspnet-mvc-4-custom-action-filters/_static/image12.png "Install Visual Studio Express")

   *Install Visual Studio Express*

4. Read all the products' licenses and terms and click **I Accept** to continue.

   ![Accepting the license terms](aspnet-mvc-4-custom-action-filters/_static/image13.png)

   *Accepting the license terms*
5. Wait until the downloading and installation process completes.

   ![Installation progress](aspnet-mvc-4-custom-action-filters/_static/image14.png)

   *Installation progress*

6. When the installation completes, click **Finish**.

   ![Installation completed](aspnet-mvc-4-custom-action-filters/_static/image15.png)

   *Installation completed*

7. Click **Exit** to close Web Platform Installer.
8. To open Visual Studio Express for Web, go to the **Start** screen and start typing &quot;**VS Express**&quot;, then click the **VS Express for Web** tile.

   ![VS Express for Web tile](aspnet-mvc-4-custom-action-filters/_static/image16.png)

   *VS Express for Web tile*

<a id="AppendixB"></a>
<a id="Appendix_B_Publishing_an_ASPNET_MVC_4_Application_using_Web_Deploy"></a>

## <a name="appendix-b-publishing-an-aspnet-mvc-4-application-using-web-deploy"></a>Appendix B: Publishing an ASP.NET MVC 4 Application using Web Deploy

This appendix will show you how to create a new web site from the Windows Azure Management Portal and publish the application you obtained by following the lab, taking advantage of the Web Deploy publishing feature provided by Windows Azure.

<a id="ApxBTask1"></a>
<a id="Task_1_-_Creating_a_New_Web_Site_from_the_Windows_Azure_Portal"></a>

#### <a name="task-1---creating-a-new-web-site-from-the-windows-azure-portal"></a>Task 1 - Creating a New Web Site from the Windows Azure Portal

1. Go to the [Windows Azure Management Portal](https://manage.windowsazure.com/) and sign in using the Microsoft credentials associated with your subscription.

   > [!NOTE]
   > With Windows Azure you can host 10 ASP.NET Web Sites for free and then scale as your traffic grows. You can sign up [here](http://aka.ms/aspnet-hol-azure).
   ![Log on to Windows Azure portal](aspnet-mvc-4-custom-action-filters/_static/image17.png "Log on to Windows Azure portal")

   *Log on to the Windows Azure Management Portal*

2. Click **New** on the command bar.

   ![Creating a new Web Site](aspnet-mvc-4-custom-action-filters/_static/image18.png "Creating a new Web Site")

   *Creating a new Web Site*

3. Click **Compute** | **Web Site**. Then select the **Quick Create** option. Provide an available URL for the new web site and click **Create Web Site**.

   > [!NOTE]
   > A Windows Azure Web Site is the host for a web application running in the cloud that you can control and manage. The Quick Create option allows you to deploy a completed web application to the Windows Azure Web Site from outside the portal. It does not include steps for setting up a database.

   ![Creating a new Web Site using Quick Create](aspnet-mvc-4-custom-action-filters/_static/image19.png "Creating a new Web Site using Quick Create")

   *Creating a new Web Site using Quick Create*

4. Wait until the new **Web Site** is created.
5. Once the Web Site is created, click the link under the **URL** column. Check that the new Web Site is working.

   ![Browsing to the new web site](aspnet-mvc-4-custom-action-filters/_static/image20.png "Browsing to the new web site")

   *Browsing to the new web site*

   ![Web site running](aspnet-mvc-4-custom-action-filters/_static/image21.png "Web site running")

   *Web site running*

6. Go back to the portal and click the name of the web site under the **Name** column to display the management pages.

   ![Opening the web site management pages](aspnet-mvc-4-custom-action-filters/_static/image22.png "Opening the web site management pages")

   *Opening the Web Site management pages*
    In the **Dashboard** page, under the **quick glance** section, click the **Download the publish profile** link.

    > [!NOTE]
    > The *publish profile* contains all of the information required to publish a web application to a Windows Azure Web Site for each enabled publication method. The publish profile contains the URLs, user credentials and database strings required to connect to and authenticate against each of the endpoints for which a publication method is enabled. **Microsoft WebMatrix 2**, **Microsoft Visual Studio Express for Web** and **Microsoft Visual Studio 2012** support reading publish profiles to automate the configuration of these programs for publishing web applications to Windows Azure Web Sites.

    ![Downloading the web site publish profile](aspnet-mvc-4-custom-action-filters/_static/image23.png "Downloading the web site publish profile")

    *Downloading the Web Site publish profile*

8. Download the publish profile file to a known location. Further in this exercise you will see how to use this file to publish a web application to Windows Azure Web Sites from Visual Studio.

    ![Saving the publish profile file](aspnet-mvc-4-custom-action-filters/_static/image24.png "Saving the publish profile")

    *Saving the publish profile file*

<a id="ApxBTask2"></a> <a id="Task_2_-_Configuring_the_Database_Server"></a>

#### <a name="task-2---configuring-the-database-server"></a>Task 2 - Configuring the Database Server

If your application makes use of SQL Server databases, you will need to create a SQL Database server. If you want to deploy a simple application that does not use SQL Server, you can skip this task.

1.
    You will need a SQL Database server to store the application database. You can view the SQL Database servers from your subscription in the Windows Azure Management Portal at **Sql Databases** | **Servers** | **Server's Dashboard**. If you do not have a server created, you can create one using the **Add** button on the command bar. Take note of the **server name and URL, administrator login name and password**, as you will use them in the next tasks. Do not create the database yet, as it will be created in a later step.

    ![SQL Database Server Dashboard](aspnet-mvc-4-custom-action-filters/_static/image25.png "SQL Database Server Dashboard")

    *SQL Database Server Dashboard*

2. In the next task you will test the database connection from Visual Studio; for that reason, you need to include your local IP address in the server's list of **Allowed IP Addresses**. To do that, click **Configure**, select the IP address from **Current Client IP Address**, paste it into the **Start IP Address** and **End IP Address** text boxes, and click the ![add-client-ip-address-ok-button](aspnet-mvc-4-custom-action-filters/_static/image26.png) button.

    ![Adding the Client IP Address](aspnet-mvc-4-custom-action-filters/_static/image27.png)

    *Adding the Client IP Address*

3. Once the **Client IP Address** is added to the allowed IP addresses list, click **Save** to confirm the changes.
    ![Confirm changes](aspnet-mvc-4-custom-action-filters/_static/image28.png)

    *Confirm changes*

<a id="ApxBTask3"></a> <a id="Task_3_-_Publishing_an_ASPNET_MVC_4_Application_using_Web_Deploy"></a>

#### <a name="task-3---publishing-an-aspnet-mvc-4-application-using-web-deploy"></a>Task 3 - Publishing an ASP.NET MVC 4 Application using Web Deploy

1. Go back to the ASP.NET MVC 4 solution. In the **Solution Explorer**, right-click the web site project and select **Publish**.

    ![Publishing the Application](aspnet-mvc-4-custom-action-filters/_static/image29.png "Publishing the Application")

    *Publishing the web site*

2. Import the publish profile you saved in the first task.

    ![Importing the publish profile](aspnet-mvc-4-custom-action-filters/_static/image30.png "Importing the publish profile")

    *Importing the publish profile*

3. Click **Validate Connection**. Once the validation is complete, click **Next**.

    > [!NOTE]
    > Validation is complete once you see a green check mark appear next to the Validate Connection button.

    ![Validating the connection](aspnet-mvc-4-custom-action-filters/_static/image31.png "Validating the connection")

    *Validating the connection*

4. In the **Settings** page, under the **Databases** section, click the button next to your database connection's text box (i.e. **DefaultConnection**).

    ![Web deploy configuration](aspnet-mvc-4-custom-action-filters/_static/image32.png "Web deploy configuration")

    *Web deploy configuration*

5. Configure the database connection as follows:

   - In the **Server name** text box, type your SQL Database server URL using the *tcp:* prefix.
   - In **User name**, type your server administrator login name.
   - In **Password**, type your server administrator login password.
   - Type a new database name.

    ![Configuring the destination connection string](aspnet-mvc-4-custom-action-filters/_static/image33.png "Configuring the destination connection string")

    *Configuring the destination connection string*

6. Then click **OK**. When prompted to create the database, click **Yes**.

    ![Creating the database](aspnet-mvc-4-custom-action-filters/_static/image34.png "Creating the database")

    *Creating the database*

7. The connection string you will use to connect to the SQL Database in Windows Azure is shown within the Default Connection text box. Then click **Next**.

    ![Connection string pointing to SQL Database](aspnet-mvc-4-custom-action-filters/_static/image35.png "Connection string pointing to SQL Database")

    *Connection string pointing to SQL Database*

8. In the **Preview** page, click **Publish**.

    ![Publishing the web application](aspnet-mvc-4-custom-action-filters/_static/image36.png "Publishing the web application")

    *Publishing the web application*

9. Once the publishing process finishes, your default browser will open the published web site.

<a id="AppendixC"></a> <a id="Appendix_C_Using_Code_Snippets"></a>

## <a name="appendix-c-using-code-snippets"></a>Appendix C: Using Code Snippets

With code snippets, you have all the code you need at your fingertips. The lab document will tell you exactly when you can use them, as shown in the following figure.
![Using Visual Studio code snippets to insert code into your project](aspnet-mvc-4-custom-action-filters/_static/image37.png "Using Visual Studio code snippets to insert code into your project")

*Using Visual Studio code snippets to insert code into your project*

***To add a code snippet using the keyboard (C# only)***

1. Place the cursor where you would like to insert the code.
2. Start typing the snippet name (without spaces or hyphens).
3. Watch as IntelliSense displays matching snippets' names.
4. Select the correct snippet (or keep typing until the entire snippet's name is selected).
5. Press the Tab key twice to insert the snippet at the cursor location.

    ![Start typing the snippet name](aspnet-mvc-4-custom-action-filters/_static/image38.png "Start typing the snippet name")

    *Start typing the snippet name*

    ![Press Tab to select the highlighted snippet](aspnet-mvc-4-custom-action-filters/_static/image39.png "Press Tab to select the highlighted snippet")

    *Press Tab to select the highlighted snippet*

    ![Press Tab again and the snippet will expand](aspnet-mvc-4-custom-action-filters/_static/image40.png "Press Tab again and the snippet will expand")

    *Press Tab again and the snippet will expand*

***To add a code snippet using the mouse (C#, Visual Basic and XML)***

1. Right-click where you want to insert the code snippet.
2. Select **Insert Snippet** followed by **My Code Snippets**.
3. Pick the relevant snippet from the list, by clicking on it.
    ![Right-click where you want to insert the code snippet and select Insert Snippet](aspnet-mvc-4-custom-action-filters/_static/image41.png "Right-click where you want to insert the code snippet and select Insert Snippet")

    *Right-click where you want to insert the code snippet and select Insert Snippet*

    ![Pick the relevant snippet from the list, by clicking on it](aspnet-mvc-4-custom-action-filters/_static/image42.png "Pick the relevant snippet from the list, by clicking on it")

    *Pick the relevant snippet from the list, by clicking on it*
71.964587
738
0.76942
fra_Latn
0.970059
c911304bf53da9961ea8d5097f40afd4b431697c
1,718
md
Markdown
README.md
titania7777/VideoFrameSampler
17a29512a38a5ff4e89b2c8182656e45a511809d
[ "Apache-2.0" ]
2
2021-12-15T01:54:40.000Z
2022-03-30T04:08:17.000Z
README.md
titania7777/VideoFrameSampler
17a29512a38a5ff4e89b2c8182656e45a511809d
[ "Apache-2.0" ]
null
null
null
README.md
titania7777/VideoFrameSampler
17a29512a38a5ff4e89b2c8182656e45a511809d
[ "Apache-2.0" ]
1
2021-12-15T01:54:40.000Z
2021-12-15T01:54:40.000Z
# VideoFrameSampler

Salient Video Frames Sampler for Efficient Model Training Using the Mean of Deep Features

## Summary
The purpose of this code is to find meaningful frames in both trimmed and untrimmed video datasets. This sampler works only with the UCF101, HMDB51, and ActivityNet datasets. We only provide the video frame sampler code (which returns a JSON file); training code that utilizes the sampler results will be published in another repository later!

## Requirements
* opencv-python
* ffmpeg-python
* torch
* [pillow-simd (optional)](https://github.com/uploadcare/pillow-simd)

## Usage (UCF101)
Clone this repository
```bash
git clone https://github.com/titania7777/VideoFrameSampler.git
```
Download the dataset
```bash
cd ./VideoFrameSampler/Data/UCF101/
./download.sh
```
Run the index sampler
```bash
cd ../../
python sampler_run.py --dataset-name UCF101 --split-id 1
```
Loading test
```bash
python sampler_test.py --dataset-name UCF101 --split-id 1 --sequence-length 16
```

## Sampled Annotations
We provide our sampler results here:
### [Download UCF101 Sampled Annotations](https://www.dropbox.com/s/lue5oeibp2s2r73/UCF101.zip?dl=0)
### [Download HMDB51 Sampled Annotations](https://www.dropbox.com/s/34v3o8d1ujlqk1h/HMDB51.zip?dl=0)
### [Download ActivityNet Sampled Annotations](https://www.dropbox.com/s/v87mos9yocsyl3p/ActivityNet.zip?dl=0)

## Examples on ActivityNet
### Kayaking (Uniform)
<img align="center" src="figures/1.PNG" width="750">

### Kayaking (Our)
<img align="center" src="figures/2.PNG" width="750">

### Laying Tile (Uniform)
<img align="center" src="figures/3.PNG" width="750">

### Laying Tile (Our)
<img align="center" src="figures/4.PNG" width="750">
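The sampler writes its results to a JSON file of per-video frame indices. As a rough illustration of how such a file might be consumed (the JSON layout and video key below are assumptions for the sketch, not the repository's documented schema), one could fit the sampled indices to a fixed `--sequence-length` like this:

```python
import json

def sample_frames(frame_indices, sequence_length):
    """Fit a list of salient frame indices to exactly `sequence_length` frames:
    subsample evenly when there are too many, pad by repeating the last index
    when there are too few."""
    if len(frame_indices) >= sequence_length:
        step = len(frame_indices) / sequence_length
        return [frame_indices[int(i * step)] for i in range(sequence_length)]
    return frame_indices + [frame_indices[-1]] * (sequence_length - len(frame_indices))

# Stand-in for a parsed sampler annotation file, e.g. json.load(open("UCF101_1.json"))
annotations = json.loads('{"v_ApplyEyeMakeup_g01_c01": [3, 10, 24, 31, 57]}')
clip = sample_frames(annotations["v_ApplyEyeMakeup_g01_c01"], 4)
# clip == [3, 10, 24, 31]
```

A training loader could then read only those frame numbers from each video instead of decoding every frame.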
33.038462
179
0.745052
eng_Latn
0.563274
c911716e1eacc106b3fea4061ab8dc8bc420e3ca
10,566
md
Markdown
docs/usage.md
suufi/terra-application-navigation
f610cc7c4364aca4782c9513509522e6dd7d40d2
[ "Apache-2.0" ]
null
null
null
docs/usage.md
suufi/terra-application-navigation
f610cc7c4364aca4782c9513509522e6dd7d40d2
[ "Apache-2.0" ]
null
null
null
docs/usage.md
suufi/terra-application-navigation
f610cc7c4364aca4782c9513509522e6dd7d40d2
[ "Apache-2.0" ]
null
null
null
## Prerequisites

- The ApplicationNavigation requires the presence of an `Application` component (provided by `terra-application`) in its parent hierarchy. This provides essential utilities around i18n, context, and breakpoints.

```jsx
import React from 'react';
import Application from 'terra-application';
import ApplicationNavigation from 'terra-application-navigation';

const MyApp = () => (
  <Application locale="en-US">
    <ApplicationNavigation
      titleConfig={titleConfig}
      utilityItems={utilityItems}
      navigationItems={navigationItems}
      extensionItems={extensionItems}
      activeNavigationItemKey={activeKey}
    >
      {myContent}
    </ApplicationNavigation>
  </Application>
);
```

## Props

### `extensionItems`

#### Is Required: `false`

The `extensionItems` prop allows consumers to render icons, each with an associated selection callback, within the ApplicationNavigation's extensions region. Extension items are expected to provide click actions and disclosures for the application-level context. Extension items roll up into various counts depending on the current breakpoint. If `extensionItems` are passed as props, the associated `onSelectExtensionItem` function callback should be passed as well.
The value provided for `extensionItems` should be an array of objects with the following API:

|Key Name|Type|Is Required|Description|
|---|---|---|---|
|`key`|String|**required**|A key used as a unique React key, and returned with `onSelectExtensionItem`.|
|`icon`|Element|**required**|A React element representing the themable icon for the extension.|
|`text`|String|**required**|The text to either be set as an aria-label or display text.|
|`metaData`|Object|optional|An object containing any additional identifying information to be returned with `onSelectExtensionItem`.|

```jsx
const extensionItems = [{
  key: 'extension_1',
  icon: <Icon1 />,
  text: 'Extension 1',
  metaData: {
    myValue: value1
  }
}, {
  key: 'extension_2',
  icon: <Icon2 />,
  text: 'Extension 2',
  metaData: {
    myValue: value2
  }
}];
```

### `onSelectExtensionItem`

#### Is Required: `false`

The `onSelectExtensionItem` prop allows consumers to retrieve the information related to the extension that was clicked. The function callback returns the information in the format `onSelectExtensionItem(key, metaData)`.

### `navigationItems`

#### Is Required: `false`

The `navigationItems` prop allows consumers to render high-level, primary navigation controls directly within the ApplicationNavigation. The expectation is that items will not be added or removed, as this would be detrimental to the user experience. Once selected, a navigation item is no longer actionable and cannot be reselected. Navigation items at the application level should have equivalent context levels. Navigation from one tab to another should not be influenced by content, as each navigation item should be a sandboxed concept. The ApplicationNavigation will render this content in different ways based on the active responsive breakpoint. If `navigationItems` are passed as props, the associated `onSelectNavigationItem` function callback should be passed as well.
The value provided for `navigationItems` should be an array of objects with the following API:

|Key Name|Type|Is Required|Description|
|---|---|---|---|
|`key`|String|**required**|A key used as a unique React key, and returned with `onSelectNavigationItem`.|
|`text`|String|**required**|The text to either be set as an aria-label or display text.|
|`metaData`|Object|optional|An object containing any additional identifying information to be returned with `onSelectNavigationItem`.|

```jsx
const navigationItems = [{
  key: 'page_1',
  text: 'Page 1',
  metaData: {
    myValue: value1
  }
}, {
  key: 'page_2',
  text: 'Page 2',
  metaData: {
    myValue: value2
  }
}];
```

### `onSelectNavigationItem`

#### Is Required: `false`

The `onSelectNavigationItem` prop allows consumers to retrieve the information related to the navigation item that was clicked. The function callback returns the information in the format `onSelectNavigationItem(key, metaData)`.

### `activeNavigationItemKey`

#### Is Required: `false`

The `activeNavigationItemKey` prop allows consumers to set the currently selected navigation item. Accordingly, the `activeNavigationItemKey` value must have an associated entry within the `navigationItems` specification.

### `utilityItems`

#### Is Required: `false`

The `utilityItems` prop allows consuming applications to present application-level custom utility items directly from the ApplicationNavigation. The ApplicationNavigation will render this content in different ways based on the active responsive breakpoint. If `utilityItems` are passed as props, the associated `onSelectUtilityItem` function callback should be passed as well.
The value provided for `utilityItems` should be an array of objects with the following API:

|Key Name|Type|Is Required|Description|
|---|---|---|---|
|`key`|String|**required**|A key used as a unique React key, and returned with `onSelectUtilityItem`.|
|`icon`|Element|**required**|A React element representing the themable icon for the utility item.|
|`text`|String|**required**|The text to either be set as an aria-label or display text.|
|`metaData`|Object|optional|An object containing any additional identifying information to be returned with `onSelectUtilityItem`.|

```jsx
const utilityItems = [{
  key: 'utility_1',
  icon: <Icon1 />,
  text: 'Utility 1',
  metaData: {
    myValue: value1
  }
}, {
  key: 'utility_2',
  icon: <Icon2 />,
  text: 'Utility 2',
  metaData: {
    myValue: value2
  }
}];
```

### `onSelectUtilityItem`

#### Is Required: `false`

The `onSelectUtilityItem` prop allows consumers to retrieve the information related to the utility item that was clicked. The function callback returns the information in the format `onSelectUtilityItem(key, metaData)`.

### `onSelectSettings`

#### Is Required: `false`

The `onSelectSettings` prop allows consumers to have first-class support for a settings utility item. If the `onSelectSettings` prop is not set, a settings utility item will not be displayed.

### `onSelectHelp`

#### Is Required: `false`

The `onSelectHelp` prop allows consumers to have first-class support for a help utility item. If the `onSelectHelp` prop is not set, a help utility item will not be displayed.

### `onSelectLogout`

#### Is Required: `false`

The `onSelectLogout` prop allows consumers to have first-class support for a logout utility button. If the `onSelectLogout` prop is not set, a logout utility button will not be displayed.

### `titleConfig`

#### Is Required: `false`

The `titleConfig` prop allows consuming applications to add their own branding to the ApplicationNavigation.
The ApplicationNavigation will render this content in different ways based on the active responsive breakpoint.

|Key Name|Type|Is Required|Description|
|---|---|---|---|
|`title`|String|**required**|Title to be displayed, or set as the aria-label if a title element is passed.|
|`headline`|String|optional|Super text to be displayed above the main title text.|
|`subline`|String|optional|Sub text to be displayed below the main title text.|
|`element`|Element|optional|Element to use in place of the title text. Typically a logo for branding.|
|`hideTitleWithinDrawerMenu`|Boolean|optional|Whether or not the title should be hidden when at the compact breakpoint.|

```jsx
const myTitleConfig = {
  title: 'My Application',
  subline: 'My Subline',
  hideTitleWithinDrawerMenu: false,
}
```

### `userConfig`

#### Is Required: `false`

The `userConfig` prop allows consumers to set a user associated with the current application context. The ApplicationNavigation will render this content in different ways based on the active responsive breakpoint.

|Key Name|Type|Is Required|Description|
|---|---|---|---|
|`name`|String|**required**|User name to be displayed for the user button and within utilities.|
|`detail`|String|optional|Additional user details string.|
|`initials`|String|optional|User initials to be displayed within the avatar if no image is present.|
|`imageSrc`|String|optional|Src to provide to the avatar component.|

```jsx
const myUserConfig = {
  name: 'Name, User',
  detail: 'Is a User',
  initials: 'UN',
  imageSrc: 'imageSrc',
}
```

### `hero`

#### Is Required: `false`

The `hero` prop allows consumers to add a hero element within the utility popup and/or navigation drawer. The ApplicationNavigation will render this content in different ways based on the active responsive breakpoint.

### `notifications`

#### Is Required: `false`

The `notifications` prop allows consumers to display notification counts associated with `navigationItems` and `extensionItems`.
The prop is made up of key/value pairs: each key is an associated entry within the `navigationItems` or `extensionItems` specification, and each value is numerical. Depending on the location within the navigational structure and the breakpoint, the number of digits displayed may be limited.

```jsx
const myNotifications = {
  key1: 3,
  extension2: 10,
}
```

### `onDrawerMenuStateChange`

#### Is Required: `false`

The `onDrawerMenuStateChange` callback allows the consumer to be notified of state changes in the drawer menu.

## Responsive Design

The ApplicationNavigation has two rendering modes: `standard` and `compact`.

- The `standard` rendering occurs at the `large`, `huge`, and `enormous` breakpoints.
- The `compact` rendering occurs at the `tiny`, `small`, and `medium` breakpoints.

|Prop|`standard` Rendering|`compact` Rendering|
|---|---|---|
|`userConfig`|Content is rendered within ApplicationNavigation's header.|Content is rendered within ApplicationNavigation's navigation drawer.|
|`titleConfig`|Content is rendered within ApplicationNavigation's header.|Same as `standard`.|
|`hero`|Content is rendered within ApplicationNavigation's utility popup (as provided by `terra-popup`).|Content is rendered within ApplicationNavigation's navigation drawer.|
|`utilityItems`|Content is rendered within ApplicationNavigation's utility popup (as provided by `terra-popup`).|Content is rendered within ApplicationNavigation's navigation drawer.|
|`navigationItems`|Content is rendered within the ApplicationNavigation's header as tabs.|Content is rendered within ApplicationNavigation's navigation drawer.|
|`extensionItems`|Content is rendered within ApplicationNavigation's header.|Same as `standard`.|
47.594595
562
0.764338
eng_Latn
0.979852
c911833002b33123e7c11d2df72f431966a3b7e7
1,226
md
Markdown
README.md
mamh-mixed/openzfs-go
395039483e927c9958e59d172b02988b9d82565b
[ "Apache-2.0" ]
null
null
null
README.md
mamh-mixed/openzfs-go
395039483e927c9958e59d172b02988b9d82565b
[ "Apache-2.0" ]
null
null
null
README.md
mamh-mixed/openzfs-go
395039483e927c9958e59d172b02988b9d82565b
[ "Apache-2.0" ]
null
null
null
# Go Wrapper for ZFS

Simple wrappers for ZFS command line tools.

[![GoDoc](https://godoc.org/github.com/mistifyio/go-zfs?status.svg)](https://godoc.org/github.com/mistifyio/go-zfs)

## Requirements

You need a working ZFS setup. To use on Ubuntu 14.04, set up ZFS:

    sudo apt-get install python-software-properties
    sudo apt-add-repository ppa:zfs-native/stable
    sudo apt-get update
    sudo apt-get install ubuntu-zfs libzfs-dev

Developed using Go 1.3, but currently there isn't anything 1.3-specific. Don't use Ubuntu packages for Go; use http://golang.org/doc/install

Generally you need root privileges to use anything ZFS-related.

## Status

This has only been tested on Ubuntu 14.04.

In the future, we hope to work directly with libzfs.

# Hacking

The tests have decent examples for most functions.

```go
// assuming a zpool named test
// error handling omitted
f, err := zfs.CreateFilesystem("test/snapshot-test", nil)
ok(t, err)

s, err := f.Snapshot("test", nil)
ok(t, err)
// snapshot is named "test/snapshot-test@test"

c, err := s.Clone("test/clone-test", nil)

err = c.Destroy()
err = s.Destroy()
err = f.Destroy()
```

# Contributing

See the [contributing guidelines](./CONTRIBUTING.md)
22.703704
140
0.725122
eng_Latn
0.830607
c9128476a321d7ef6c0a86690607deea6f4cc72c
85
md
Markdown
README.md
marker55/hp-ams
f8c5affa094c5c1a051203670717e1fc3487714c
[ "MIT" ]
7
2015-09-22T21:44:08.000Z
2022-01-30T03:07:20.000Z
README.md
marker55/hp-ams
f8c5affa094c5c1a051203670717e1fc3487714c
[ "MIT" ]
1
2018-01-07T17:19:37.000Z
2021-12-22T06:09:08.000Z
README.md
marker55/hp-ams
f8c5affa094c5c1a051203670717e1fc3487714c
[ "MIT" ]
5
2016-04-15T10:49:26.000Z
2021-11-17T14:46:06.000Z
hp-ams
======

Hewlett-Packard Agentless Management Service for ProLiant Gen8 servers
17
69
0.776471
eng_Latn
0.473956
c91318e44866d9435afebda2fc9752f8ae222390
649
md
Markdown
examples/hello-openshift/README.md
Nick-Harvey/origin
fc205c8288a2e8191af401f007e6a0aab3d51a9f
[ "Apache-2.0" ]
213
2016-10-24T13:55:56.000Z
2021-12-19T23:49:56.000Z
examples/hello-openshift/README.md
Nick-Harvey/origin
fc205c8288a2e8191af401f007e6a0aab3d51a9f
[ "Apache-2.0" ]
94
2018-02-07T10:15:51.000Z
2021-03-03T09:10:07.000Z
examples/hello-openshift/README.md
Nick-Harvey/origin
fc205c8288a2e8191af401f007e6a0aab3d51a9f
[ "Apache-2.0" ]
63
2018-03-07T17:00:04.000Z
2021-03-24T03:30:22.000Z
Hello, OpenShift!
-----------------

This example will serve an HTTP response of "Hello OpenShift!".

    $ oc create -f examples/hello-openshift/hello-pod.json

    $ oc get pod hello-openshift -o yaml |grep podIP
      podIP: 10.1.0.2

    $ curl 10.1.0.2:8080
    Hello OpenShift!

To test from an external network, you need to create a router. Please refer to [Running the router](https://github.com/openshift/origin/blob/master/docs/routing.md)

If you need to rebuild the image:

    $ go build -tags netgo   # avoid dynamic linking (we want a static binary)
    $ mv hello-openshift bin
    $ docker build -t docker.io/openshift/hello-openshift .
30.904762
159
0.691834
eng_Latn
0.82596
c9135117925b811a43d6db44239b5d2e137e9397
799
md
Markdown
go-keylogger/README.md
Unam3dd/Train-2018-2020
afb6ae70fe338cbe55a21b74648d91996b818fa2
[ "MIT" ]
4
2021-04-23T15:39:17.000Z
2021-12-27T22:53:24.000Z
go-keylogger/README.md
Unam3dd/Train-2018-2020
afb6ae70fe338cbe55a21b74648d91996b818fa2
[ "MIT" ]
null
null
null
go-keylogger/README.md
Unam3dd/Train-2018-2020
afb6ae70fe338cbe55a21b74648d91996b818fa2
[ "MIT" ]
2
2021-04-19T08:28:54.000Z
2022-01-19T13:23:29.000Z
# go-keylogger

[![GoDoc](https://godoc.org/github.com/KindlyFire/go-keylogger?status.svg)](https://godoc.org/github.com/KindlyFire/go-keylogger)

This is a small Go keylogger library for Windows. It has the advantage over *all the other* keyloggers I found that it correctly parses each keypress, no matter the keyboard layout. No keymaps are hard-coded. See the `examples/` directory for examples.

Currently, the lib polls for keypresses every X milliseconds and checks the entire keyboard using `GetAsyncKeyState()`, which is not the optimal solution, but it works well. Work is going on in the `win-events` branch to convert the lib to use Windows events instead of polling.

### This version includes the Enter key, Return key, Tab, etc., which are not included in the original version.
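The polling strategy described above (checking the whole keyboard every X milliseconds) boils down to detecting up-to-down transitions between ticks. This language-agnostic Python sketch substitutes a stubbed `read_state` function for the Windows `GetAsyncKeyState()` call; all names here are illustrative and are not part of the library's API:

```python
def poll_keys(read_state, keys, previous):
    """One polling tick: return (pressed_now, new_state).

    `read_state(key)` stands in for a platform call such as
    GetAsyncKeyState(); it returns True while `key` is held down.
    Only keys that transitioned from up to down are reported, which is
    what turns raw state polling into discrete keypress events.
    """
    pressed = []
    current = {}
    for key in keys:
        down = read_state(key)
        current[key] = down
        if down and not previous.get(key, False):
            pressed.append(key)
    return pressed, current

# Simulated ticks: 'A' is held across two polls, so it is reported once.
ticks = [{"A"}, {"A"}, {"B"}]
state = {}
log = []
for held in ticks:
    pressed, state = poll_keys(lambda k, h=held: k in h, ["A", "B"], state)
    log.extend(pressed)
# log == ["A", "B"]
```

A real implementation would run this loop on a timer and map virtual-key codes through the active keyboard layout, which is the part the library handles for you.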
57.071429
281
0.777222
eng_Latn
0.997788
c91399d62dd0e70ae72e836daa19cc2cad781c8d
3,250
markdown
Markdown
_projects/2_project.markdown
passalis/passalis.github.io
833f828815a59209db53d5b0ce3c288b5adfb8dc
[ "MIT" ]
null
null
null
_projects/2_project.markdown
passalis/passalis.github.io
833f828815a59209db53d5b0ce3c288b5adfb8dc
[ "MIT" ]
null
null
null
_projects/2_project.markdown
passalis/passalis.github.io
833f828815a59209db53d5b0ce3c288b5adfb8dc
[ "MIT" ]
null
null
null
---
layout: page
title: deepsing
description: Generating Sentiment-aware Visual Stories using Cross-modal Music Translation
importance: 1
category: DL
---

<p align="justify">
Can machines dream while listening to music? Is it possible to turn music into images in a meaningful way? <a href="https://www.deepsing.com">deepsing</a> was born to materialize our idea of translating audio to images, inspired by the Futurama Holophoner. The idea was conceived by <a href="https://www.linkedin.com/in/doropoulos/">Stavros Doropoulos</a> and, to a lesser extent, me, and materialized after many discussions, coffees, and long walks. In this way, deepsing is able to autonomously generate visual stories which convey the emotions expressed in songs. The process of such music-to-image translation poses unique challenges, mainly due to the unstable mapping between the different modalities involved in this process. To overcome these limitations, deepsing employs a trainable cross-modal translation method, leading to a deep learning method for generating sentiment-aware visual stories.</p>

<p align="justify">
The proposed method is capable of synthesizing visual stories according to the sentiment expressed by songs. The generated images aim to induce the same feelings in viewers as the original song does, reinforcing the primary aim of music, i.e., communicating feelings.
deepsing employs a trainable cross-modal translation method to generate visual stories, as shown below:
</p>

<img src="https://raw.githubusercontent.com/deepsing-ai/deepsing/master/pictures/pipeline.png" alt="deepsing pipeline" style="width: 100%">

The process of generating a visual story is also illustrated below:

<img src="https://raw.githubusercontent.com/deepsing-ai/deepsing/master/pictures/example.png" alt="example of generating a visual story" style="width: 100%">

<p align="justify">
We have implemented a front-end to our method at <a href="https://www.deepsing.com">deepsing.com</a>. You can find an example of a purely machine-generated visual story using our method <a href="https://deepsing.com/engine/9C0xGB73Uuc/5dfbcd1ec9e5f7311d8a9fcf">here</a>. Note that the version available at <a href="https://www.deepsing.com">deepsing.com</a> currently lacks many essential features, but it demonstrates the basic concept of our idea! Also, note that song lyrics are NOT used in this process, since the proposed method currently works based SOLELY on the sentiment induced by the audio!
</p>

<p align="justify">
Furthermore, you can find more information in our <a href="https://arxiv.org/abs/1912.05654">paper</a>, and we have also released the code of our method on <a href="https://github.com/deepsing-ai/deepsing">GitHub</a>. Feel free to hack with us and share your opinions with us!
</p>

If you use deepsing in your research, please cite our <a href="https://arxiv.org/abs/1912.05654">paper</a>:

<pre>
@article{passalis2021deepsing,
  title={deepsing: Generating sentiment-aware visual stories using cross-modal music translation},
  author={Passalis, Nikolaos and Doropoulos, Stavros},
  journal={Expert Systems with Applications},
  volume={164},
  year={2021},
}
</pre>

You are free to use the generated videos in any way you like, as long as you keep the deepsing logo.
70.652174
877
0.780615
eng_Latn
0.990474
c913e57fa10250377f823681205a3dadea694fb8
842
md
Markdown
content/publication/2011-01-01_Is_youth_smoking_res.md
saramarkowitz/academic-kickstart
fde31036209eb23ceb19c7ca4072b4b3935cc233
[ "MIT" ]
null
null
null
content/publication/2011-01-01_Is_youth_smoking_res.md
saramarkowitz/academic-kickstart
fde31036209eb23ceb19c7ca4072b4b3935cc233
[ "MIT" ]
null
null
null
content/publication/2011-01-01_Is_youth_smoking_res.md
saramarkowitz/academic-kickstart
fde31036209eb23ceb19c7ca4072b4b3935cc233
[ "MIT" ]
null
null
null
+++ title = "Is youth smoking responsive to cigarette prices? evidence from low- and middle-income countries" date = "2011-01-01" authors = ["D. Kostova", "H. Ross", "E. Blecher", "S. Markowitz"] publication_types = ["2"] publication = "Tobacco Control, (20), 6, _pp. 419-424_" publication_short = "Tobacco Control, (20), 6, _pp. 419-424_" abstract = "Objective: To estimate the price elasticity of cigarette demand among youth in low- and middle-income countries (LMIC)." abstract_short = "" image_preview = "" selected = false projects = [] tags = [] url_pdf = "https://tobaccocontrol.bmj.com/content/20/6/419.full" url_preprint = "" url_code = "" url_dataset = "" url_project = "" url_slides = "" url_video = "" url_poster = "" url_source = "" math = true highlight = true [header] image = "" caption = "" +++
29.034483
134
0.666271
eng_Latn
0.693741
c9144375aeb5836471f802c96e205896014150f2
760
md
Markdown
article/PART8/flink/flink-05-window&watermark.md
LuckinJack/LuckinJack.github.io
8caf1bfa1a02d1c689a1da3ad1829b45861e8d5b
[ "Apache-2.0" ]
1
2021-03-08T14:26:39.000Z
2021-03-08T14:26:39.000Z
article/PART8/flink/flink-05-window&watermark.md
LuckinJack/LuckinJack.github.io
8caf1bfa1a02d1c689a1da3ad1829b45861e8d5b
[ "Apache-2.0" ]
null
null
null
article/PART8/flink/flink-05-window&watermark.md
LuckinJack/LuckinJack.github.io
8caf1bfa1a02d1c689a1da3ad1829b45861e8d5b
[ "Apache-2.0" ]
null
null
null
# Flink Window # Flink's Time Semantics Flink stream processing involves several different notions of time <center> <img style="border-radius: 0.3125em; box-shadow: 0 2px 4px 0 rgba(34,36,38,.12),0 2px 10px 0 rgba(34,36,38,.08);" src="/assets/flink-time-semantics.png"> <br> <div style="color:orange; border-bottom: 1px solid #d9d9d9; display: inline-block; color: #999; padding: 2px;">Event-Time, Ingestion-Time, and Processing-Time</div> <br> <br> </center> - **Event Time**: the time at which an event was created. It is usually described by a timestamp carried in the event itself; for example, in collected log data, every log line records its own generation time, and Flink accesses event timestamps through a timestamp assigner. - **Ingestion Time**: the time at which data enters Flink. - **Processing Time**: the local system time of each operator that performs a time-based operation, measured by the local machine clock. > **[!TIP]** > Q: If the business needs to count the number of failure logs within 1 min, which notion of time is the most meaningful? > > A: Event Time, because the statistics must be computed based on the time the logs were generated # Watermark
20.540541
95
0.688158
yue_Hant
0.159603
c914a091b47b6f27bb94b6d4b058da12c97fa6b0
41
md
Markdown
E2E_Provision_1485036344630/index.md
OPSTest/E2E_Provision_1485036344630
3be5c6526976f8bd656e62b63fe1e1dc48b33726
[ "CC-BY-4.0", "MIT" ]
null
null
null
E2E_Provision_1485036344630/index.md
OPSTest/E2E_Provision_1485036344630
3be5c6526976f8bd656e62b63fe1e1dc48b33726
[ "CC-BY-4.0", "MIT" ]
null
null
null
E2E_Provision_1485036344630/index.md
OPSTest/E2E_Provision_1485036344630
3be5c6526976f8bd656e62b63fe1e1dc48b33726
[ "CC-BY-4.0", "MIT" ]
null
null
null
# Welcome to E2E_Provision_1485036344630!
41
41
0.878049
eng_Latn
0.503924
c914e1b9985d8b550f1cfd9ee83f4dba823972b5
212
md
Markdown
_collections/_log/2020-08-11.md
shnizzedy/lab-notebook
f678064b3fc71bf7890cc31a61ecff32343ff8e3
[ "MIT" ]
null
null
null
_collections/_log/2020-08-11.md
shnizzedy/lab-notebook
f678064b3fc71bf7890cc31a61ecff32343ff8e3
[ "MIT" ]
null
null
null
_collections/_log/2020-08-11.md
shnizzedy/lab-notebook
f678064b3fc71bf7890cc31a61ecff32343ff8e3
[ "MIT" ]
null
null
null
--- projects: [] author: [Jon Clucas] date: 2020-08-11 --- This notebook has become another thing to do rather than the cognitive offloading I intended. As such, I'm abandoning this notebook (for now, at least).
30.285714
152
0.735849
eng_Latn
0.997256