## Affix
# drone-telegram
Deploy builds directly to Telegram from Drone CI.
---
layout: note.njk
tags: [post, notes]
title: WFH for an undetermined amount of time
date: 2020-03-13
---
So, the coronavirus outbreak is now officially a pandemic. The [FT](https://www.ft.com) has decided that it would be better for the staff to start working from home and to not come into the office at all. I was massively impressed at how the company has approached this in a very proactive way. Staff with their health (physical and mental) is considered high up the agenda.
## Good habits to adhere to while working from home
* Do schedule breaks in the calendar. It is quite easy to get self-absorbed into work and forget having mental, hydration or even toilet breaks!
* Try and keep a routine similar to how it would be if you were going to the office.
* I regularly get coffee near work, so keep doing this but with the nearest café.
* Take a walk at lunch time.
* Schedule face-time with colleagues and turn on webcams and talk away. Ask about everything so that it becomes more inclusive and not isolating.
* Do regular exercise without the need of using facilities like the gym.
* If your work uses a chat/collaboration tool, consider setting your status to your working hours so that a clear boundary is set.
* Lastly, don't work in your pyjamas! 😂
I have a feeling that this will turn into months, not weeks until the world has gotten this under control, and I'm very confident that we will get to that point at some undetermined time. It makes me realise how fragile life really is, and how we should cherish it with every passing moment, detail and experience.
Until then 👋
---
title: Discord.py Learning Guide
description: A learning guide for the discord.py bot framework written by members of our community.
icon: fab fa-python
toc: 2
---
<!-- discord.py Badge -->
<a href="https://github.com/Rapptz/discord.py/">
<div class="tags has-addons">
<span class="tag is-dark">discord.py</span><span class="tag is-info">≥1.0</span>
</div>
</a>
Interest in creating a Discord bot is a common introduction to the world of programming in our community.
Using it as your first project in programming while trying to learn is a double-edged sword.
A large number of concepts need to be understood before becoming proficient at creating a bot, making the journey of learning and completing the project more arduous than with simpler projects designed specifically for beginners.
However in return, you get the opportunity to expose yourself to many more aspects of Python than you normally would and so it can be an amazingly rewarding experience when you finally reach your goal.
Another excellent aspect of building bots is that it has a huge scope as to what you can do with it, almost only limited by your own imagination.
This means you can continue to learn and apply more advanced concepts as you grow as a programmer while still building bots, so learning it can be a useful and enjoyable skillset.
This page provides resources to make the path to learning as clear and easy as possible, and collates useful examples provided by the community that may address common ideas and concerns that are seen when working on Discord bots.
## Essential References
Official Documentation: [https://discordpy.readthedocs.io](https://discordpy.readthedocs.io/)
Source Repository: [https://github.com/Rapptz/discord.py](https://github.com/Rapptz/discord.py)
## Creating a Discord Bot Account
1. Navigate to [https://discord.com/developers/applications](https://discord.com/developers/applications) and log in.
2. Click on `New Application`.
3. Enter the application's name.
4. Click on `Bot` on the left side settings menu.
5. Click `Add Bot` and confirm with `Yes, do it!`.
### Client ID
Your Client ID is the same as the User ID of your Bot.
You will need this when creating an invite URL.
You can find your Client ID located on the `General Information` settings page of your Application, under the `Name` field.
Your Client ID is not a secret, and does not need to be kept private.
### Bot Token
Your Bot Token is the token that authorises your Bot account with the API.
Think of it like your Bot's API access key.
With your token, you can interact with any part of the API that's available to bots.
You can find your Bot Token located on the Bot settings page of your Application, under the Username field.
You can click the Copy button to copy it without revealing it manually.
**Your Bot Token is a secret, and must be kept private.**
If you leak your token anywhere other people have access to see it, no matter the duration, you should reset your Bot Token.
To reset your token, go to the Bot settings page of your Application, and click the Regenerate button.
Be sure to update the token you're using for your bot script to this new one, as the old one will not work anymore.
### Permissions Integer
Discord Permissions are typically expressed as a Permissions Integer, a single number encoding all of the Permissions that have been allowed.
You can find a reference to all the available Discord Permissions, their bitwise values and their descriptions here:<br>
[https://discordapp.com/developers/docs/topics/permissions#permissions-bitwise-permission-flags](https://discordapp.com/developers/docs/topics/permissions#permissions-bitwise-permission-flags)
If you want to create your own Permissions Integer, you can generate it in the `Bot` settings page of your Application, located at the bottom of the page.
Tick the permissions you want to be allowing, and it'll update the `Permissions Integer` field, which you can use in your Bot Invite URL to set your bot's default permissions when users go to invite it.
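If you would rather compute the integer yourself, it is simply the bitwise OR of the flag values from the reference linked above. A small sketch (the three flag constants are copied from the Discord permissions reference; combine whichever permissions you need):

```python
# Flag values from the Discord permissions reference (bitwise permission flags)
VIEW_CHANNEL = 0x0000000400    # 1024
SEND_MESSAGES = 0x0000000800   # 2048
EMBED_LINKS = 0x0000004000     # 16384

# A Permissions Integer is the bitwise OR of all the allowed flags
permissions_integer = VIEW_CHANNEL | SEND_MESSAGES | EMBED_LINKS
print(permissions_integer)  # 19456
```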
### Bot Invite URL
Bots cannot use a server invite link. Instead, they have to be invited by a member with the Manage Server permission.
The Bot Invite URL is formatted like:
`https://discordapp.com/oauth2/authorize?client_id={CLIENT_ID}&scope=bot&permissions={PERMISSIONS_INTEGER}`
You can create the Invite URL for your bot by replacing:
* `{CLIENT_ID}` with your [Client ID](#client-id)
* `{PERMISSIONS_INTEGER}` with the [Permissions Integer](#permissions-integer)
You can also generate it with the [Permissions Calculator](https://discordapi.com/permissions.html) tool.
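The same substitution can be done in a couple of lines; the ID and permissions value below are placeholders to replace with your own:

```python
CLIENT_ID = "123456789012345678"  # placeholder: your bot's Client ID
PERMISSIONS_INTEGER = 19456       # placeholder: your Permissions Integer

invite_url = (
    "https://discordapp.com/oauth2/authorize"
    f"?client_id={CLIENT_ID}&scope=bot&permissions={PERMISSIONS_INTEGER}"
)
print(invite_url)
```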
## Using the Basic Client (`discord.Client`) { data-toc-label="Using the Basic Client" }
Below are the essential resources to read over to get familiar with the basic functionality of `discord.py`.
* [Basic event usage](https://discordpy.readthedocs.io/en/latest/intro.html#basic-concepts)
* [Simple bot walkthrough](https://discordpy.readthedocs.io/en/latest/quickstart.html#a-minimal-bot)
* [Available events reference](https://discordpy.readthedocs.io/en/latest/api.html#event-reference)
* [General API reference](https://discordpy.readthedocs.io/en/latest/api.html)
## Using the Commands Extension (`commands.Bot`) { data-toc-label="Using the Commands Extension" }
The Commands Extension has explanatory documentation walking you through not only what it is and its basic usage, but also more advanced concepts.
Be sure to read the prose documentation in full at:<br>
[https://discordpy.readthedocs.io/en/latest/ext/commands/commands.html](https://discordpy.readthedocs.io/en/latest/ext/commands/commands.html)
It fully covers:
* How to create bot using the Commands Extension
* How to define commands and their arguments
* What the Context object is
* Argument Converters
* Error Handling basics
* Command checks
You will also need to reference the following resources:
* [Commands Extension exclusive events](https://discordpy.readthedocs.io/en/latest/ext/commands/api.html#event-reference)
* [Commands Extension API reference](https://discordpy.readthedocs.io/en/latest/ext/commands/api.html)
## FAQ
The documentation covers some basic FAQs. They are recommended reading beforehand, and worth referencing before asking for help in case they cover your issue:
[https://discordpy.readthedocs.io/en/latest/faq.html](https://discordpy.readthedocs.io/en/latest/faq.html)
## Usage Examples
### Official Examples and Resources
The official examples can be found on the [source repository](https://github.com/Rapptz/discord.py/tree/master/examples).
The most commonly referenced examples are:
* [Basic Commands Extension Bot](https://github.com/Rapptz/discord.py/blob/master/examples/basic_bot.py)
* [Background Task Example](https://github.com/Rapptz/discord.py/blob/master/examples/background_task.py)
### Permissions Documentation
* [Role Management 101](https://support.discordapp.com/hc/en-us/articles/214836687-Role-Management-101)
* [Full Permissions Documentation](https://discordapp.com/developers/docs/topics/permissions)
### Community Examples and Resources
The `discord.py` developer community has shared examples and references with each other over time.<br>
The following is a collated list of the most referenced community examples.
#### Extensions / Cogs
* [Extension/Cog Example](https://gist.github.com/EvieePy/d78c061a4798ae81be9825468fe146be) - *Credit to EvieePy*
* [Available Cog Methods](https://gist.github.com/Ikusaba-san/69115b79d33e05ed07ec4a4f14db83b1) - *Credit to MIkusaba*
#### Error Handling
* [Decent Error Handling Example](https://gist.github.com/EvieePy/7822af90858ef65012ea500bcecf1612) - *Credit to EvieePy*
#### Embeds
* [Embed Live Designer and Visualiser](https://leovoel.github.io/embed-visualizer/) - *Credit to leovoel*
* [Embed Element Reference](https://cdn.discordapp.com/attachments/84319995256905728/252292324967710721/embed.png)<br>
{: width="200" }
##### Using Local Images in Embeds
```py
filename = "image.png"
f = discord.File("some_file_path", filename=filename)
embed = discord.Embed()
embed.set_image(url=f"attachment://{filename}")  # reference the attached file by name
await messagable.send(file=f, embed=embed)  # any Messageable: channel, ctx, user, ...
```
##### Embed Limits
| **Element** | **Characters** |
| -------------- | ---------------------- |
| Title | 256 |
| Field Name | 256 |
| Field Value | 1024 |
| Description | 2048 |
| Footer | 2048 |
| **Entire Embed** | **6000** |
| **Element** | **Count** |
| -------------- | ---------------------- |
| Fields | 25 |
#### Emoji
- [Bot's Using Emoji](https://gist.github.com/scragly/b8d20aece2d058c8c601b44a689a47a0)
#### Activity Presence
- [Setting Bot's Discord Activity](https://gist.github.com/scragly/2579b4d335f87e83fbacb7dfd3d32828)
#### Image Processing
- [PIL Image Processing Example Cog](https://gist.github.com/Gorialis/e89482310d74a90a946b44cf34009e88) - *Credit to Gorialis*
### Systemd Service
**botname.service**<br>
```ini
[Unit]
Description=My Bot Name
After=network-online.target
[Service]
Type=simple
WorkingDirectory=/your/bots/directory
ExecStart=/usr/bin/python3 /your/bots/directory/file.py
User=username
Restart=on-failure
[Install]
WantedBy=network-online.target
```
**Directory**<br>
`/usr/local/lib/systemd/system`
**Service Commands**<br>
Refresh systemd after unit file changes:<br>
`systemctl daemon-reload`
Set service to start on boot:<br>
`systemctl enable botname`
Start service now:<br>
`systemctl start botname`
Stop service:<br>
`systemctl stop botname`
**Viewing Logs**<br>
All logs:<br>
`journalctl -u botname`
Recent logs and continue printing new logs live:<br>
`journalctl -fu botname`
# Twitch-YTLikeTheaterMode
Bookmarklet to change Twitch to YouTube-like Theater Mode Layout.

## How to Install
1. Add website (something you like) as bookmark.
2. Open bookmark editor dialog for it.
3. Rename bookmark. (ex. Theater Mode)
4. Enter the following code as "URL" in bookmark editor dialog.
5. Done.
```javascript
javascript: {
let sidebarEl = document.getElementById('sideNav');
let sidebarWidthPx = sidebarEl.offsetWidth+'px';
let offsetTopStyle = 'calc( (100vw - '+sidebarWidthPx+') * 0.5625 )';
let transitionParam = ' 0.5s ease';
let playerEl = document.getElementsByClassName('persistent-player tw-elevation-0')[0];
playerEl.style.setProperty('width', '100%', 'important');
playerEl.style.setProperty('transition', 'width'+transitionParam, 'important');
let chatEl = document.getElementsByClassName('channel-root__right-column channel-root__right-column--expanded')[0];
chatEl.style.setProperty('bottom', '0px', 'important');
chatEl.style.setProperty('height', 'calc(100% - '+offsetTopStyle+')', 'important');
chatEl.style.setProperty('transition', 'bottom'+transitionParam+', height'+transitionParam, 'important');
let infoEl = document.getElementsByClassName('channel-root__info channel-root__info--with-chat')[0];
infoEl.style.setProperty('margin-top', offsetTopStyle, 'important');
infoEl.style.setProperty('transform', '', 'important');
infoEl.style.setProperty('transition', 'margin-top'+transitionParam+', transform'+transitionParam, 'important');
let buttonEl = document.getElementsByClassName('right-column__toggle-visibility toggle-visibility__right-column toggle-visibility__right-column--expanded tw-absolute tw-flex tw-flex-grow-0 tw-flex-shrink-0 tw-visible tw-z-above')[0];
buttonEl.style.setProperty('top', 'calc(1rem + '+offsetTopStyle+')', 'important');
buttonEl.style.setProperty('transition', 'top'+transitionParam, 'important');
}
```
{
"info": {
"name": "Reddit Get Subreddit About Location",
"_postman_id": "c7688c89-f835-4fc3-946f-193930b077e3",
"description": "Return a listing of posts relevant to moderators.",
"schema": "https://schema.getpostman.com/json/collection/v2.0.0/"
},
"item": []
}
---
layout: post
title: "Topological Sort (위상정렬)"
subtitle: "Topological Sort"
categories: algorithm language
tags: algorithm language C++
comments: true
---
## Topological Sort
---
Use a directed adjacency matrix plus a separate in-degree count for each node (how many edges come in). A node can be processed as soon as its in-degree is 0, so push those into a queue; after popping a node, find its outgoing neighbors and decrement their in-degree, pushing any node whose in-degree reaches 0.
```cpp
#include <stdio.h>
#include <vector>
#include <queue>
#include <algorithm>
using namespace std;
int main(){
    freopen("input.txt", "rt", stdin);
    int n, m, a, b;
    scanf("%d %d", &n, &m);
    // Directed adjacency matrix and per-node in-degree
    vector<vector<int> > graph(n+1, vector<int>(n+1, 0));
    vector<int> degree(n+1);
    queue<int> Q;
    for(int i=0; i<m; i++){
        scanf("%d %d", &a, &b);
        graph[a][b]=1;   // edge a -> b
        degree[b]++;
    }
    // Nodes with in-degree 0 can be output immediately
    for(int i=1; i<=n; i++){
        if(degree[i]==0) Q.push(i);
    }
    while(!Q.empty()){
        int now=Q.front();
        Q.pop();
        printf("%d ", now);
        // Remove now's outgoing edges; enqueue nodes whose in-degree hits 0
        for(int i=1; i<=n; i++){
            if(graph[now][i]==1){
                degree[i]--;
                if(degree[i]==0) Q.push(i);
            }
        }
    }
    return 0;
}
```
# securitizedderivativesapifordigitalportals.SecuritizedDerivativeNotationScreenerValueRangesGetDataSingleBarriers
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**type** | **String** | The type of the barrier. See endpoint `/securitizedDerivative/barrier/type/list` for additional information. Note that not all barrier types listed in the mentioned endpoint can be used as a parameter. | [optional]
**observation** | [**SecuritizedDerivativeNotationScreenerValueRangesGetDataObservation**](SecuritizedDerivativeNotationScreenerValueRangesGetDataObservation.md) | | [optional]
**level** | [**SecuritizedDerivativeNotationScreenerValueRangesGetDataLevel**](SecuritizedDerivativeNotationScreenerValueRangesGetDataLevel.md) | | [optional]
**distance** | [**SecuritizedDerivativeNotationScreenerValueRangesGetDataDistance**](SecuritizedDerivativeNotationScreenerValueRangesGetDataDistance.md) | | [optional]
**breach** | [**SecuritizedDerivativeNotationScreenerValueRangesGetDataBreach**](SecuritizedDerivativeNotationScreenerValueRangesGetDataBreach.md) | | [optional]
**cashFlow** | [**SecuritizedDerivativeNotationScreenerValueRangesGetDataCashFlow**](SecuritizedDerivativeNotationScreenerValueRangesGetDataCashFlow.md) | | [optional]
## Enum: TypeEnum
* `strike` (value: `"strike"`)
* `bonusLevel` (value: `"bonusLevel"`)
* `cap` (value: `"cap"`)
* `knockOut` (value: `"knockOut"`)
* `knockIn` (value: `"knockIn"`)
* `lockOut` (value: `"lockOut"`)
* `lockIn` (value: `"lockIn"`)
* `capitalGuarantee` (value: `"capitalGuarantee"`)
* `couponTriggerLevel` (value: `"couponTriggerLevel"`)
# wodss-tippspiel_backend
Contains backend for World Cup 2018 betting competition.
# Instalation:
IMPORTANT: Not testet with java 9 and 10 due to gradle and lombok issues with java 10.
Make sure you have a local MySQL database. The db name and credentials can be found in the application.propperties file.
At the time of writing you have to use the follwing parameters:
DB Name: localhost:3306/bettinggame
DB User: bettinggame
DB User Password: 3k:Vp:JzS,R)r5U](g
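The database setup itself is not part of this repository; one typical way to create a matching database and user from the MySQL shell (using the name and credentials above) would be:

```sql
CREATE DATABASE bettinggame;
CREATE USER 'bettinggame'@'localhost' IDENTIFIED BY '3k:Vp:JzS,R)r5U](g';
GRANT ALL PRIVILEGES ON bettinggame.* TO 'bettinggame'@'localhost';
FLUSH PRIVILEGES;
```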
If you want the Wikipedia scraper to load all games, teams, locations and tournament groups from Wikipedia into the local database on application startup, you have to set the scraper.onstartup parameter in application.properties to true.
To use the backend localy you have to set those parameters in the application.properties file:
server.ssl.key-store=classpath:keystore.jks
server.ssl.key-password=keypassword
server.ssl.key-store-password=storepassword
server.ssl.key-alias=tomcat
server.ssl.enabled-protocols=TLSv1.2
server.ssl.ciphers=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384
Now you can run the application by executing:
./gradlew bootRun
You can terminate the backend by pressing ctrl + c
# Testing:
Run ./gradlew test
185 tests should be executed.
# Security Springboot


## 1 简介
基于spring-boot开发的安全认证框架,目前支持存储方式:Jdbc、Redis、MongoDB。
配置简单,使用方便、易于扩展。
## 2 使用
### 2.1 导入
```xml
<dependency>
<groupId>com.wh8xx</groupId>
<artifactId>security-spring-boot</artifactId>
<version>1.0.0</version>
</dependency>
```
### 2.2 Configuration
```text
security:
  # Token storage type
  store-type: redis
  # Paths to intercept
  paths: /**
  # Whitelist
  white-list:
    /login
  # Blacklist
  black-list:
    /swagger-ui.html
  # Environment
  env: DEV
```
---
title: Chrome DevTools
showMetadata: true
editable: true
showToc: true
---
# Disable JavaScript
- Press `ctrl+shift+i` to open DevTools.
- Press `ctrl+shift+p` to show command menu.
- Type `javascript` and select `Disable JavaScript`.
- You will find the yellow warning icon on `Sources` tab reminds you that JavaScript is disabled.
- JavaScript will remain disabled in this tab so long as you have DevTools open.
# Enable JavaScript
- Press `ctrl+shift+i` to open DevTools.
- Press `ctrl+shift+p` to show command menu.
- Type `javascript` and select `Enable JavaScript`.
- Close DevTools.
# Useful resources
- Disabling JavaScript in more detail, with screenshots: https://developer.chrome.com/docs/devtools/javascript/disable/
---
id: accessing
title: Accessing the API
sidebar_label: Accessing the API
---
To securely access the Aver API, you will need to authenticate to obtain a bearer token that grants access to API endpoints. An initial basic authentication request is made to retrieve the bearer token, and that token is then used in all subsequent calls to authenticate the request.
## 1. Create an API key
1. Login to the Aver Portal
2. Go to Settings under your organization in the left nav bar
3. Find the integrations tab and select API Keys
4. Add an API key with "Portal User" scope.
5. Copy the "secret"
<p>
<img src="/img/create_api_key_1.jpg"></img>
</p>
6. View the key that was created and copy the "key"
<p>
<img src="/img/create_api_key_2.jpg"></img>
</p>
### Using the Test API Key
<p>
You can use a test API key to help develop the basic interface to the Aver API without incurring cost. It will authenticate the same way as the "Portal User" key; simply choose "Test API Key" in step 4 above. You can call all of the API endpoints using this key and receive a payload of dummy data without consuming credits.
</p>
## 2. Retrieve the Bearer Token
To retrieve the authentication bearer token, a basic authentication request is made using the API key and secret to validate your organization and return a scoped bearer token to grant access to API resources. Additional details on basic authentication can be found at <a href="https://en.wikipedia.org/wiki/Basic_access_authentication">https://en.wikipedia.org/wiki/Basic_access_authentication</a> (see Client Side)
### Create a Basic Authentication request
1. Concatenate [key]:[secret]
2. Base64 encode the concatenated values
3. Include the Base64 encoded value in the Authorization header for Basic auth
<p>
<b>Protip:</b> For testing the API, <a href="https://www.base64encode.org/">https://www.base64encode.org</a> has a quick online base64 encoder.
</p>
### Call the Authentication Endpoint
<p>
Call the <a href="/docs/auth">Authentication</a> endpoint with your basic authentication header to return a token so you can call other API endpoints and take further action. The base API URL is <b>https://app.goaver.com/api</b>
</p>
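As an illustration, the header value can be built with the Python standard library. The credentials below are dummies, and the exact authentication endpoint path is the one documented in the Authentication reference linked above:

```python
import base64

def basic_auth_header(key: str, secret: str) -> str:
    """Return the Basic auth header value for an API key/secret pair."""
    token = base64.b64encode(f"{key}:{secret}".encode("ascii")).decode("ascii")
    return f"Basic {token}"

# Dummy credentials for illustration only
print(basic_auth_header("key", "secret"))  # Basic a2V5OnNlY3JldA==

# The value then goes in the Authorization header of the request, e.g.:
#   import urllib.request
#   req = urllib.request.Request(
#       "https://app.goaver.com/api/...",  # the documented auth endpoint path
#       headers={"Authorization": basic_auth_header(KEY, SECRET)},
#   )
```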
## 3. Calling API Endpoints
<p>
For any resource request, the endpoint will require the authorization header to be set as a bearer token with the token you generated in the previous step.
</p>
```
content-type: application/json
authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9eyJ1bmlxdWVfbmFtZSI6ImFmYTIyMTczLTZhNDYtNDc2MS04MzA4LTI3YWQ0YjIxMWM0MCIsInJvbGUiOiJQb3J0YWxVc2VyIiwiaHR0cDovL3NjaGVtYXMubWljcm9zb2Z0LmNvbS93cy8yMDA4LzA2L2lkZW50aXR5L2NsYWltcy91c2VyZGF0YSI6IntcIklkXCI6XCJhZmEyMjE3My02YTQ2LTQ3NjEtODMwOC0yN2FkNGIyMTFjNDBcIixcIkF1dGhUeXBlXCI6MixcIkRhdGFcIjpudWxsfSIsIm5iZiI6MTU3MDE5NjE4NiwiZXhwIjoxNTcwMTk5Nzg2LCJpYXQiOjE1NzAxOTYxODYsImlzcyI6InNlbGYiLCJhdWQiOiJodHRwOi8vZ29hdmVyLmNvbSJ9XZmHyGIVurCvpsNM8RACzz9jReafpww9hrr3vyr4
```
| 48.622951 | 514 | 0.800742 | eng_Latn | 0.892597 |
b0b66ee28f5f0ea49f65fb91965701c5fe763dd7 | 88 | md | Markdown | README.md | prodrink/programma | f865130d6aa000f513928669ff9ac4ee55ae643b | [
"Apache-2.0"
] | null | null | null | README.md | prodrink/programma | f865130d6aa000f513928669ff9ac4ee55ae643b | [
"Apache-2.0"
] | 8 | 2018-03-08T19:37:23.000Z | 2018-04-16T11:05:02.000Z | README.md | prodrink/programma | f865130d6aa000f513928669ff9ac4ee55ae643b | [
"Apache-2.0"
] | null | null | null | programma
---
your personal assistant, supposed to help you improve your language skill
| 22 | 73 | 0.806818 | eng_Latn | 0.999785 |
b0b6fc71946856bad862c3c32352f05fdc0cf157 | 3,634 | md | Markdown | uploading_images_to_multiple_slots/readme.md | mundodisco8/bluetooth_applications | 5ee5734ec22b210f40b84ae9f1f98246d8ef8f4e | [
"Zlib"
] | 28 | 2020-05-27T08:36:36.000Z | 2022-03-13T15:35:30.000Z | uploading_images_to_multiple_slots/readme.md | futechiot/bluetooth_applications | f3529fff7c4c960c80e110ae2c5c52df71b01430 | [
"Zlib"
] | 7 | 2020-06-16T20:27:01.000Z | 2021-11-30T08:23:55.000Z | uploading_images_to_multiple_slots/readme.md | futechiot/bluetooth_applications | f3529fff7c4c960c80e110ae2c5c52df71b01430 | [
"Zlib"
] | 25 | 2021-01-06T15:38:54.000Z | 2022-03-09T11:45:14.000Z | # Uploading Images to Internal/External Flash Using OTA DFU
## Background
This code example has a related User's Guide, here: [Uploading Firmware Images Using OTA DFU](https://docs.silabs.com/bluetooth/latest/general/firmware-upgrade/uploading-firmware-images-using-ota-dfu)
## Description
The attached code implements image uploading to the bootloader storage slots using the standard OTA process. It is also extended by allowing users to select a slot to upload to and a slot to boot from. You are free to modify this example according to your needs.
## Setting up
To create a project with multi-slot OTA DFU, follow these steps:
1. Create an **Internal Storage Bootloader (multiple images)** or a **SPI Flash Storage Bootloader (multiple images)** project in Simplicity Studio.
2. Check the Storage Slot configuration in AppBuilder and modify it according to your needs.
3. Generate and build the project.
4. Flash the bootloader to your device.
5. Create a **Bluetooth – SoC Empty** project.
6. Open Project Configurator by opening the .slcp file in your project and selecting the Software Components tab.
7. Find the **OTA DFU** component (Bluetooth > Utility > OTA DFU) and uninstall it.
* this will uninstall Apploader, remove OTA DFU service and the OTA control characteristics handler code.
8. Open GATT Configurator (Software Components > Advanced Configurators > Bluetooth GATT Configurator).
9. Find the import button, and import the attached .btconf file.
* this will re-add the OTA DFU service, and a custom service where upload slot and boot slot can be selected.
10. Save the GATT database.
11. Copy the attached *ota_dfu_multislot.c* and *ota_dfu_multislot.h* files into the project.
12. Open autogen/sl_bluetooth.c and
1. Add #include "ota_dfu_multislot.h" to the beginning of the file
2. Add multislot_ota_dfu_on_event(evt); before sl_bt_on_event(evt); in sl_bt_process_event()
3. Save the file
13. Build the project.
14. Flash the image to your device.
## Usage
1. Make a copy of your multi-slot OTA DFU project.
2. Change the Device Name from “Empty Example” to “Updated App” in the GATT Configurator, and save the GATT database
3. Build the project.
4. Run create_bl_files.bat. Note that you may need to set up some environment variables first, as described in section 3.10 of [AN1086: Using the Gecko Bootloader with the Silicon Labs Bluetooth® Applications](https://www.silabs.com/documents/public/application-notes/an1086-gecko-bootloader-bluetooth.pdf)
5. Find the full.gbl file in output_gbl folder.
6. Copy full.gbl to your smartphone.
7. Open EFR Connect app on your smartphone.
8. Find your device with Bluetooth browser (advertising as Empty Example) and connect to it.
9. Find the unknown service and open it (this is your custom service including Upload slot and Bootload slot characteristics).
10. Open the first characteristic (this is the upload characteristic) and write into it the slot number that you want to upload to, for example 0x00.
11. In the local menu select OTA DFU.
12. Select the partial OTA tab.
13. Select the full.gbl file you want to upload.
14. Click OTA. The file will be uploaded to the selected slot.
15. Now you may need to reconnect.
16. Open the second characteristic in the unknown service (Bootload slot) and write into it the slot number (e.g. 0x00) that you want to load the application from. The device will trigger a reset and the new application will be loaded.
17. Disconnect and find your device advertising itself as “Updated App”.

 | 43.783133 | 309 | 0.772977 | eng_Latn | 0.987715 |
b0b783c637df8fe7f412021aadc8c78d5081438e | 14 | md | Markdown | README.md | ferhatdemirci05/Mavi-Sozluk | 723861d613b49d1eac932d0ec191aba04c5f42d1 | [
"Apache-2.0"
] | 1 | 2021-10-31T16:12:31.000Z | 2021-10-31T16:12:31.000Z | README.md | mburakcakir/Mavi-Sozluk | 19e0e768c9d231a7c94706974d5276a969ee3d45 | [
"Apache-2.0"
] | null | null | null | README.md | mburakcakir/Mavi-Sozluk | 19e0e768c9d231a7c94706974d5276a969ee3d45 | [
"Apache-2.0"
] | 1 | 2022-03-20T13:03:46.000Z | 2022-03-20T13:03:46.000Z | # Mavi Sözlük
| 7 | 13 | 0.714286 | tur_Latn | 0.999293 |
b0b9e921a096cc81bfa7e2829cb5bf0d2ea349b7 | 56 | md | Markdown | packages/kryo/README.md | Demurgos/via-type | a51609557fe67d7b40078c594a39393872437931 | [
"MIT"
] | null | null | null | packages/kryo/README.md | Demurgos/via-type | a51609557fe67d7b40078c594a39393872437931 | [
"MIT"
] | 44 | 2017-01-17T11:15:52.000Z | 2021-06-28T19:40:51.000Z | packages/kryo/README.md | Demurgos/via-type | a51609557fe67d7b40078c594a39393872437931 | [
"MIT"
] | 2 | 2017-01-17T10:01:49.000Z | 2020-05-21T23:18:00.000Z | # Kryo
Runtime types for validation and serialization.
| 14 | 47 | 0.803571 | eng_Latn | 0.922122 |
b0ba5e39619d457010d412114a1f41d0b8465cb3 | 707 | md | Markdown | README.md | ckinan/portfolio | 06cdb05d72f292d8552bf0b7179d0d23deea0f20 | [
"MIT"
] | null | null | null | README.md | ckinan/portfolio | 06cdb05d72f292d8552bf0b7179d0d23deea0f20 | [
"MIT"
] | 2 | 2019-08-24T15:36:21.000Z | 2019-12-29T02:17:33.000Z | README.md | ckinan/portfolio | 06cdb05d72f292d8552bf0b7179d0d23deea0f20 | [
"MIT"
] | null | null | null | # ckinan.com
The source code of my personal website.
## Run locally
### Setup
Install dependencies with [Poetry](https://python-poetry.org/)
```
poetry install
```
### Build
Build site
```
poetry run build
```
### Run
Run server
```
poetry run serve
```
Open in browser
```
http://localhost:8000
```
## Deploy
### Github Actions
Go to Github Repo -> Actions -> Deploy -> Execute a manual run against `main` branch
### Firebase cli
Requires firebase login first, via cli. Ref: https://firebase.google.com/docs/cli
```
firebase deploy --only hosting
```
## Troubleshooting
Want to see the dependency tree?
```
poetry show -t
```
## Links
- https://validator.w3.org/feed/docs/rss2.html
._. | 11.403226 | 85 | 0.66761 | eng_Latn | 0.565046 |
b0ba9932ec8e291e70f73e36b81fbb024f0fa426 | 1,475 | md | Markdown | ui/public/docs/quickstart/guide.md | aliyun/open-drive | 3193656513150c0e30e7df9ebe0a568d86cbb008 | [
"MIT"
] | 2 | 2019-02-01T18:33:01.000Z | 2019-06-17T21:31:59.000Z | ui/public/docs/quickstart/guide.md | aliyun/open-drive | 3193656513150c0e30e7df9ebe0a568d86cbb008 | [
"MIT"
] | null | null | null | ui/public/docs/quickstart/guide.md | aliyun/open-drive | 3193656513150c0e30e7df9ebe0a568d86cbb008 | [
"MIT"
] | null | null | null | # Usage Guide
## 1. The default administrator
The first user to log in is set as the super administrator by default.
## 2. Administrator configures a Storage
>(Specifies the OSS bucket and the drive allocation policy)
Before creating a Storage, prepare the following resources:
1) Create an OSS bucket in the [OSS console](https://oss.console.aliyun.com/overview).

After it is created, select the bucket and click Basic Settings -> Cross-Origin (CORS) -> Create Rule.

For the Source field, enter the domain name(s) where your UI is deployed; multiple entries can be configured.
2) Create a sub-user with the STSAssumeRole and OSSFull permissions, and record its AccessKey (AK).
Go to the [user management](https://ram.console.aliyun.com/#/user/list) page and create a new user.

Record the AK information.

Grant the user the STSAssumeRole and OSSFull permissions.

3) Create a role with the OSSFull permission and record its ARN.
Go to [role management](https://ram.console.aliyun.com/#/role/list) and create a role:



Then grant the role the OSSFull permission:

Open the role details and record the role ARN:

Preparation is now complete.
To create the Storage, go to the storage configuration page and click Create:

Configure the Storage parameters:

## 3. Assigning drives to regular users
There are two ways to assign a drive. One is the default drive, specified when configuring the Storage.
The other is manual assignment by an administrator: on the storage configuration page, select a Storage record and click the + in the actions column.

Select a user to assign a drive to them:

## 4. Uploading files
Select a drive and open it:

Drag files or folders into the blank area to upload them:

## 5. Downloading files
Double-click a file to open it; there is a download button in the upper-right corner. Choose whichever method you need to get a download link.

## 6. Sharing a folder
Select a folder and click the share button in the lower-right corner:

Fill in the sharing parameters:

There are three authorization modes:
1) All users who visit the link: every user who visits the link is granted a share; the link does not expire after being visited.
2) The first user who visits the link: only the first visitor is granted a share, and the link expires immediately after that visit.
3) Anyone who knows the link: shared with everyone; no drive login is required, and anyone who opens the link can see its contents.
After filling in the parameters, click Generate Link, then send the link to the people who need it by any other means.
| 13.657407 | 68 | 0.696949 | yue_Hant | 0.506829 |
b0bc0f785471cb8ab188d29df0511767d66a3cc2 | 3,004 | md | Markdown | _posts/2019-03-06-PAT-A 1065.md | JustinYuu/JustinYuu.github.io | edad3d29bc587fa0993306639df3578cf4a009e2 | [
"MIT"
] | 2 | 2019-09-03T11:31:08.000Z | 2020-12-07T07:52:15.000Z | _posts/2019-03-06-PAT-A 1065.md | JustinYuu/JustinYuu.github.io | edad3d29bc587fa0993306639df3578cf4a009e2 | [
"MIT"
] | 1 | 2022-02-21T07:42:11.000Z | 2022-02-21T07:42:11.000Z | _posts/2019-03-06-PAT-A 1065.md | JustinYuu/JustinYuu.github.io | edad3d29bc587fa0993306639df3578cf4a009e2 | [
"MIT"
] | 2 | 2019-12-12T06:06:02.000Z | 2020-01-14T09:43:55.000Z | ---
layout: post
title: "PAT-A 1065"
description: "code solutions"
categories: [PAT-A]
tags: [C++]
redirect_from:
- /2019/03/06/
---
PAT-A 1065
1065 A+B and C (64bit) (20 points)
Given three integers A, B and C in [−2^63,2^63], you are supposed to tell whether A+B>C.
## Input Specification:
The first line of the input gives the positive number of test cases, T (≤10). Then T test cases follow, each consists of a single line containing three integers A, B and C, separated by single spaces.
## Output Specification:
For each test case, output in one line Case #X: true if A+B>C, or Case #X: false otherwise, where X is the case number (starting from 1).
## Sample Input:
3
1 2 3
2 3 4
9223372036854775807 -9223372036854775808 0
## Sample Output:
Case #1: false
Case #2: true
Case #3: false
## my own thoughts
This problem is really interesting, because the knowledge it requires is fundamental computer organization.
Concretely, long long int has a range of [−2^63, 2^63 − 1], but in this problem, if A and B are both very large (or both very small), then A+B will exceed the limits of long long int.
But that doesn't mean we can't use long long int to solve this problem; all we need to know is the overflow rule of long long int. The overflow rule comes from computer organization (two's complement arithmetic). It is not explained in detail here, but what matters is that if a sum is greater than the maximum value, it wraps around and continues from −2^63; similarly, if it is less than the minimum value, it wraps around from the maximum value.
Once this is understood, the problem becomes quite simple. We calculate the sum A+B, which we call R. If A and B are both positive but R is negative, then R must have overflowed, which means the true sum must be bigger than C, since it is bigger than every possible value of the long long int range. Similarly, if A and B are both negative but R is non-negative, then the true sum must be smaller than C, and smaller than any possible value of the long long int range. And if there is no overflow, we can compare R and C in the normal way.
Here are the codes:
{% highlight cpp %}
#include <iostream>
#include <cstdio>
using namespace std;
int main()
{
int t = 0;
long long int A, B, C;
cin >> t;
for (int i = 1; i <= t; i++)
{
cin >> A >> B >> C;
long long int r = A+B;
if (A > 0 && B > 0 && r < 0)
printf("Case #%d: true\n", i);
else if (A < 0 && B < 0 && r >= 0)
printf("Case #%d: false\n", i);
else if (r>C)
printf("Case #%d: true\n", i);
else
printf("Case #%d: false\n", i);
}
return 0;
}
{% endhighlight %}
---
If you find any faults in the source code, please feel free to contact me by any means.
 | 40.594595 | 568 | 0.654461 | eng_Latn | 0.99918 |
| 40.594595 | 568 | 0.654461 | eng_Latn | 0.99918 |
b0bc3f96bf3a93bab63c3143ed0822a0d314f0b5 | 2,596 | md | Markdown | README.md | cgistar/hackis | 2338c32f42e6d26c677dd1bca6956222db755246 | [
"Apache-2.0"
] | 14 | 2020-06-12T14:44:13.000Z | 2022-02-23T05:57:09.000Z | README.md | cgistar/hackis | 2338c32f42e6d26c677dd1bca6956222db755246 | [
"Apache-2.0"
] | 7 | 2020-10-04T10:50:29.000Z | 2021-07-11T03:43:42.000Z | README.md | cgistar/hackis | 2338c32f42e6d26c677dd1bca6956222db755246 | [
"Apache-2.0"
] | 5 | 2020-10-20T03:26:11.000Z | 2021-08-12T08:12:48.000Z | # 我的配置
| 名称 | 型号 |
| ---------- | ----------------------------- |
| CPU | i3 8100 |
| 硬盘 | RC500 |
| 内存 | 国产鸟、鱼牌混搭 16G |
| 网卡 | DW1820A |
| DP 转 HDMI | 绿联 DP111,合金头 PS176 芯片 |
# M710Q opencore 配置
1. 包含了 DW1820A 无线网卡及蓝牙的配置;
2. 使用前,请计算三码,并填写在 PlatformInfo -> Generic -> MLB、 SystemSerialNumber、 SystemUUID;
3. 如果有连接 dp 转 hdmi 紫屏的,可以参考[使用 Hackintool 修复黑苹果 Intel 核显驱动外部显示器紫屏问题](https://blog.skk.moe/post/hackintosh-fix-magenta-screen/),修补 EDID,或制作缓冲帧补丁方式解决;
4. 尽量使用好一点的 DP 转 hdmi 线,要主动式的,升级到 10.15.5 后,避免黑屏问题;
5. 查询问题,加启动参数: NVRAM → add → 7C436110-AB2A-4BBB-A880-FE41995C9F82 → boot-args → -v keepsyms=1
# 升级到 0.6.4 时,如果你使用了 Bootstrap,请注意先执行以下操作:
- 先关闭 BootProtect,config.plist → Misc → Security → BootProtect: 设置为 None
- 确保你开启了 RequestBootVarRouting,config.plist → UEFI → Quirks: RequestBootVarRouting 设置为 Enabled
- 开启重置 NVRAM 功能,config.plist → Misc → Security: AllowNvramReset 设置为 Enabled
- 升级 0.6.4 替换文件后,重启计算机,选择 Reset NVRAM
- 重新进入系统后,可重新开启 BootProtect,对于较老的主板(Haswell 或更老),建议使用 BootstrapShort,较新的主板可继续使用 Bootstrap
# 使用说明
解锁 CFG 后可以变更 Config—–Kernel—–Quirks
1. AppleXcpmCfgLock true -> false
2. AppleCpuPmCfgLock true -> false
3. IgnoreInvalidFlexRatio:如果没有解鎖 CFG,必须勾选。
4. 不是使用的 DW1820A 网卡,可以关闭网卡 PCI 描述、蓝牙驱动
5. 注入国家代码,解决连接速率低或者国家代码是US: brcmfx-country=#a
# RU.efi 解锁 CFG Lock
通过教程[解锁 MSR 0xE2、BIOS Lock 等隐藏选项新姿势,还黑果原生体验(附刷 AMI BIOS 教程)](http://bbs.pcbeta.com/viewthread-1834965-1-1.html),查询到我的 BIOS,版本:M1AKT49A,需要修改 0x503
1. 通过 EFI 菜单进入 RU.efi
2. ALT+C
3. 选择 UEFI Variable
4. 按 PAGE UP 和 PAGE DOWN 进行翻页,找到 Setup 项,可能会看到两个 Setup 项,分别按回车进去看下,选择数据较多的一个 Setup。在数据视图中按 CTRL+PAGE UP 和 CTRL+PAGE DOWN 进行翻页;
5. 需修改 0x503 的值为 0,首先找到 0x500 再用左右方向键定位纵向 03 的数值,按回车键进入修改模式,直接输入数字 0,再按 CTRL+W 保存修改,你会看到"Updated OK: Setup"的信息,此时 BIOS Lock 的值就被写入了 BIOS 中。
感谢 @tanpengsccd 提供的简单方法:
- 进入菜单 按 space 显示所有选择项目
- 进入RU.efi
- 输入 setup_var 0x503 0x0
- 然后提示设置成功
- 再检查CFG Lock 已经解锁了
# BIOS 设置
1. 显卡缓存 256M
2. 关闭快速启动
3. 其它忘了。。。
# 更新日志
20.10.9
- Opencore 0.6.2;
- add H_EC to EC patch;
- add SSDT-PLUG GPRW 尝试修复睡眠秒醒问题;
- add SSDT-PMC nvram 方案;
- add Innie.kext 修复外置硬盘问题;
- remove USBPower.kext;
20.6.9
- 更新 opencore 0.5.9
20.5.29
- 增加节能 5 项 XCPM.aml SSDT-PM.aml,但还是睡眠即醒
20.5.19
- 上传第一版
# 参考:
- [xjn' Blog 使用 OpenCore 引导黑苹果](https://blog.xjn819.com/?p=543)
- [紫氵朝 联想 ThinkCentre M710Q 微型主机黑苹果基本完善(2.15 修复前置音频输出)](http://bbs.pcbeta.com/forum.php?mod=viewthread&tid=1826205)
- [daliansky Lenovo-M710Q-Hackintosh](https://github.com/daliansky/Lenovo-M710Q-Hackintosh)
| 30.541176 | 148 | 0.68567 | yue_Hant | 0.837016 |
b0bc847c8d2d54c415fe78a9683cb85f290803c0 | 335 | md | Markdown | _posts/2007-09-25-39-5763.md | ashmaroli/mycad.github.io | 47fb8e82e4801dd33a3d4335a3d4a6657ec604cf | [
"MIT"
] | null | null | null | _posts/2007-09-25-39-5763.md | ashmaroli/mycad.github.io | 47fb8e82e4801dd33a3d4335a3d4a6657ec604cf | [
"MIT"
] | null | null | null | _posts/2007-09-25-39-5763.md | ashmaroli/mycad.github.io | 47fb8e82e4801dd33a3d4335a3d4a6657ec604cf | [
"MIT"
] | null | null | null | ---
layout: post
amendno: 39-5763
cadno: CAD2007-A320-09
title: 灭火瓶导线检查
date: 2007-09-25 00:00:00 +0800
effdate: 2007-10-08 00:00:00 +0800
tag: A320
categories: 民航西南地区管理局适航审定处
author: 汪毅飞
---
##适用范围:
本指令适用于2007年2月28日前交付,且在生产中或在运营中通过执行空客服务通告后安装有货舱灭火瓶的所有审定型别,所有系列号的A318,A319,A320和A321型飞机。执行过维修大纲(MRBR) 工作
26.23.00/03或26.23.00/07的飞机例外。
| 19.705882 | 102 | 0.776119 | yue_Hant | 0.371192 |
b0bc9abb34b64f3b250503bf9659359528f53cb5 | 9,276 | md | Markdown | README.md | 100rabhkr/DownZLibrary | 80b238b911cb1e16fc735bf53cfa62aad45a2a58 | [
"MIT"
] | 27 | 2017-11-06T18:28:07.000Z | 2022-02-05T05:38:02.000Z | README.md | 100rabhkr/DownZLibrary | 80b238b911cb1e16fc735bf53cfa62aad45a2a58 | [
"MIT"
] | 1 | 2018-10-26T04:17:57.000Z | 2018-11-22T15:31:41.000Z | README.md | 100rabhkr/DownZLibrary | 80b238b911cb1e16fc735bf53cfa62aad45a2a58 | [
"MIT"
] | 13 | 2017-11-06T18:03:12.000Z | 2021-02-09T14:18:41.000Z | DownZ
===================
[](https://jitpack.io/#100rabhkr/DownZLibrary) | [View release history](https://github.com/100rabhkr/DownZLibrary/releases)
DownZ is an HTTP library that boosts networking in Android apps and makes it significantly easier and faster.
<p align="center"><img src="https://image.ibb.co/gC5XRa/edgetech_2_1_1.png" alt="downz logo"></p>
DownZ offers the following benefits:
- Handles HTTP requests.
- Transparent memory response caching of JSON and Images with support for request prioritization.
- Cancel a request of image upload or download at any given time.
- Images in memory cache are auto removed if not used for a long time.
- Ease of customization, for example, cancel request and clear cache.
- Strong requisition that makes it easy to effectively manage UI with data being fetched asynchronously from the network.
DownZ excels at handling HTTP requests.It comes with built-in support for images, JSON and Xml. By providing built-in support for the features you require, DownZ frees you from writing tons of code and finally allows you to concentrate on the logic that is specific to your app.
----------
Download
-------------
You can use Gradle
**Step 1.** Add the JitPack repository to your build file , Add it in your root build.gradle at the end of repositories:
allprojects {
repositories {
...
maven { url 'https://jitpack.io' }
}
}
**Step 2.** Add the dependency
dependencies {
compile 'com.github.100rabhkr:DownZLibrary:1.1'
}
Or Maven:
**Step 1.** Add the JitPack repository to your build file
<repositories>
<repository>
<id>jitpack.io</id>
<url>https://jitpack.io</url>
</repository>
</repositories>
**Step 2.** Add the dependency
<dependency>
<groupId>com.github.100rabhkr</groupId>
<artifactId>DownZLibrary</artifactId>
<version>1.1</version>
</dependency>
Or Sbt:
**Step 1.** Add the JitPack repository to your build file Add it in your build.sbt at the end of resolvers:
resolvers += "jitpack" at "https://jitpack.io"
**Step 2.** Add the dependency
libraryDependencies += "com.github.100rabhkr" % "DownZLibrary" % "1.1"
Snapshots from Sample App
-------------
<img src="https://media.giphy.com/media/l41JQ8NJPwWR9OMTe/giphy.gif"> <img src="https://media.giphy.com/media/l0EoBrCax7QiJB78c/giphy.gif" style="float:right;">
----------
Download Sample App
-------------
All the source code can be downloaded from [Github's Release page](https://github.com/100rabhkr/DownZLibrary/releases)
You can also download the sample app from [here](https://drive.google.com/open?id=0BwG2nwOU6FIWLW50V2tCYmVZZXpFc2VOQkhQMTZ1bG16cWZr)
How do I use DownZ?
-------------------
**Make Standard Request**
…
DownZ
.from(mContext) //context
.load(DownZ.Method.GET, “http://yoururl.com”)
.asBitmap() //asJsonArray() or asJsonObject() or asXml() can be used depending on need
.setCallback(new HttpListener<Bitmap>() {
@Override
public void onRequest() {
//On Beginning of request
}
@Override
public void onResponse(Bitmap data) {
if(data != null){
// do something
}
}
@Override
public void onError() {
//do something when there is an error
}
@Override
public void onCancel() {
//do something when request cancelled
}
});
**Pass Header or Request Parameters (Optional)**
...
DownZ
.from(MainActivity.this)
.load(DownZ.Method.GET,mUrl)
.setHeaderParameter("key","value")
.setRequestParameter("key1","value1")
.setRequestParameter("key2","value2")
.asBitmap()
.setCallback(new HttpListener<Bitmap>() {
@Override
public void onRequest() {
}
@Override
public void onResponse(Bitmap bitmap) {
if(bitmap != null){
//do something
}
}
@Override
public void onError() {
}
@Override
public void onCancel() {
}
});
**Enable Cache**
...
public class SomeActivity extends AppCompatActivity {
...
CacheManager<JSONArray> cacheManager; // we can use JSONObject, Bitmap as generic type
@Override
protected void onCreate(Bundle savedInstanceState) {
...
cacheManager=new CacheManager<>(40*1024*1024); // 40mb
}
public void OnRequestMade(View v){
DownZ
.from(this)
.load(DownZ.Method.GET, "http://www.url.com")
.asJsonArray()
.setCacheManager(cacheManager)
.setCallback(new HttpListener<JSONArray>() {
@Override
public void onRequest() {
//fired when request begins
}
@Override
public void onResponse(JSONArray data) {
if(data!=null){
// do some stuff here
}
}
@Override
public void onError() {
}
@Override
public void onCancel() {
}
});
}
}
**Load Image into ImageView**
...
public class SomeActivity extends AppCompatActivity {
...
CacheManager<Bitmap> cacheManager;
@Override
protected void onCreate(Bundle savedInstanceState) {
...
cacheManager=new CacheManager<>(40*1024*1024); // 40mb
imageview = (ImageView) findViewById(R.id.image_profile);
imageview.setDrawingCacheEnabled(true); //can be used if Image has to be shared afterwards
imageview.buildDrawingCache();
...
}
//event to make request
public void btnLoadImageClicked(View v){
DownZ
.from(this)
.load(DownZ.Method.GET, "http://www.url.com/image.jpg")
.asBitmap()
.setCacheManager(cacheManager)
.setCallback(new HttpListener<Bitmap>() {
@Override
public void onRequest() {
//fired when request begin
}
@Override
public void onResponse(Bitmap data) {
if(data!=null){
// do some stuff here
imageview.setImageBitmap(data);
}
}
@Override
public void onError() {
}
@Override
public void onCancel() {
}
});
}
}
**Removing from Cache**
...
public class SomeActivity extends AppCompatActivity {
...
CacheManager<Bitmap> cacheManager;
@Override
protected void onCreate(Bundle savedInstanceState) {
...
String mUrl = "http://yoururl.com/image.png";
cacheManager=new CacheManager<>(40*1024*1024); // 40mb
imageview = (ImageView) findViewById(R.id.image_profile);
imageview.setDrawingCacheEnabled(true); //can be used if Image has to be shared afterwards
imageview.buildDrawingCache();
...
}
//event to clear item from cache
public void btntoClearCache(View v){
cacheManager.removeDataFromCache(mUrl);
}
}
Getting Help
------------
To report a specific problem or feature request in DownZ, you can [open a new issue on Github](https://github.com/100rabhkr/DownZLibrary/issues/new). For questions, suggestions, or even to say 'hi', just drop an email at [email protected]
Contributing
------------
If you like DownZ, please contribute by writing at [email protected]. Your help and support would make DownZ even better. Before submitting pull requests, contributors must sign [Google's individual contributor license agreement.](https://cla.developers.google.com/about/google-individual)
Author
------
Saurabh Kumar - [@100rabhkr](https://github.com/100rabhkr) on GitHub, [@NotSaurabhKumar](https://twitter.com/NotSaurabhKumar) on Twitter
License
-------
MIT See the [LICENSE](https://github.com/100rabhkr/DownZLibrary/blob/master/LICENSE) file for details.
Disclaimer
----------
This is not an official Google product.
| 28.718266 | 294 | 0.549159 | eng_Latn | 0.618929 |
40dd57c4544aafec0bbb72824b104d93dac1d019 | 174 | md | Markdown | doc/README.md | formatool/spring-petclinic | bfb870f86e49a3e6015c35dd35eeba16b1ab5a03 | [
"Apache-2.0"
] | null | null | null | doc/README.md | formatool/spring-petclinic | bfb870f86e49a3e6015c35dd35eeba16b1ab5a03 | [
"Apache-2.0"
] | 3 | 2021-05-24T13:20:39.000Z | 2021-06-02T20:45:09.000Z | doc/README.md | formatool/spring-petclinic | bfb870f86e49a3e6015c35dd35eeba16b1ab5a03 | [
"Apache-2.0"
] | null | null | null | # Docs directory
To edit astra-petclinic-diagrams, [click here](https://app.diagrams.net/?mode=github#Hformatool%2Fspring-petclinic%2Fmain%2Fdoc%2Fastra-petclinic-diagrams)
| 43.5 | 155 | 0.810345 | cat_Latn | 0.072126 |
40de4747812b4b599df2c64e6c415bfbc5071070 | 1,452 | md | Markdown | _publications/2020-04-25-facilitating-live-editing-in-streaming.md | atima/atima.github.io | b3f09a6b09e1e82d0382a878bc3a86c700812678 | [
"MIT"
] | null | null | null | _publications/2020-04-25-facilitating-live-editing-in-streaming.md | atima/atima.github.io | b3f09a6b09e1e82d0382a878bc3a86c700812678 | [
"MIT"
] | null | null | null | _publications/2020-04-25-facilitating-live-editing-in-streaming.md | atima/atima.github.io | b3f09a6b09e1e82d0382a878bc3a86c700812678 | [
"MIT"
] | null | null | null | ---
title: "Designing User Interface for Facilitating Live Editing in Streaming"
collection: publications
permalink: /publication/2020-04-25-facilitating-live-editing-in-streaming
excerpt: ''
date: 2020-04-25
venue: 'The 2020 CHI Conference on Human Factors in Computing Systems'
paperurl: 'https://doi.org/10.1145/3334480.3383037'
citation: 'Tharatipyakul, Atima, Jie Li, and Pablo Cesar. "Designing User Interface for Facilitating Live Editing in Streaming." <i>Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems</i>. 2020.'
---
Media assets, such as overlay graphics or comments, can make video streaming a unique and engaging experience. Appropriately managing media assets during the live streaming, however, is still difficult for streamers who work alone or in small groups. With the aim to ease the management of such assets, we analyzed existing live production tools and designed four low fidelity prototypes, which eventually led to two high fidelity ones, based on the feedback from users and designers. The results of a usability test, using fully interactive prototypes, suggested that a controller and predefined media object behavior were useful for managing objects. The findings from this preliminary work help us design a prototype that helps users to stream rich media presentations. [[pdf](https://doi.org/10.1145/3334480.3383037?cid=99659116563)] [[video](https://youtu.be/VrbYwSsnuKo)]
| 121 | 877 | 0.803719 | eng_Latn | 0.985733 |
40e121f6db27622119e73f9586c957c4c44b8f08 | 960 | md | Markdown | k8s/README.md | 18826905579/vfarcic.github.io | 3f4bd2ea593f816c8d99f53852529f309fc3f904 | [
"MIT"
] | 129 | 2015-11-06T14:49:19.000Z | 2022-03-04T07:25:16.000Z | k8s/README.md | 18826905579/vfarcic.github.io | 3f4bd2ea593f816c8d99f53852529f309fc3f904 | [
"MIT"
] | 8 | 2015-11-23T15:17:20.000Z | 2022-03-26T18:34:09.000Z | k8s/README.md | 18826905579/vfarcic.github.io | 3f4bd2ea593f816c8d99f53852529f309fc3f904 | [
"MIT"
] | 124 | 2015-10-22T20:14:04.000Z | 2021-11-04T23:23:37.000Z | ## Setup
```bash
curl -LO "https://storage.googleapis.com/kubernetes-release/release/$(curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt)/bin/darwin/amd64/kubectl"
chmod +x ./kubectl
sudo mv ./kubectl /usr/local/bin/kubectl
kubectl version
minikube start
kubectl cluster-info
kubectl get nodes
```
## Deployment
```bash
kubectl run kubernetes-bootcamp --image=docker.io/jocatalin/kubernetes-bootcamp:v1 --port=8080
kubectl get deployments
kubectl proxy
export POD_NAME=$(kubectl get pods -o go-template --template '{{range .items}}{{.metadata.name}}{{"\n"}}{{end}}')
echo "Pods:\n$POD_NAME"
curl "http://localhost:8001/api/v1/proxy/namespaces/default/pods/$POD_NAME/"
kubectl get pods
kubectl describe pods
kubectl logs $POD_NAME
kubectl exec $POD_NAME env
kubectl exec -ti $POD_NAME bash
cat server.js
curl "localhost:8080"
exit
# TODO: https://kubernetes.io/docs/tutorials/kubernetes-basics/expose-intro/
``` | 18.823529 | 173 | 0.748958 | yue_Hant | 0.210562 |
40e2001bce964ebd0c8101275ec5961a5f22d114 | 1,595 | md | Markdown | README.md | bluele/hmsim | 874e373e0b3a1d5711f5c4a065cbc7ffc5310025 | [
"Apache-2.0"
] | 2 | 2019-07-02T11:56:21.000Z | 2019-10-07T05:39:45.000Z | README.md | bluele/hmsim | 874e373e0b3a1d5711f5c4a065cbc7ffc5310025 | [
"Apache-2.0"
] | 3 | 2019-07-02T11:19:47.000Z | 2019-09-09T09:50:31.000Z | README.md | bluele/hmsim | 874e373e0b3a1d5711f5c4a065cbc7ffc5310025 | [
"Apache-2.0"
] | 1 | 2020-10-05T04:37:03.000Z | 2020-10-05T04:37:03.000Z | # hmemu
[](https://circleci.com/gh/datachainlab/hmemu)
An emulation library to ease contract development and testing for hypermint.
This version supports hypermint **v0.4.4**.
## Getting started
First, append the following code to the Cargo.toml in your project.
```toml
[dependencies]
hmcdk = { git = "https://github.com/bluele/hypermint", tag = "v0.4.4" }
[dev-dependencies]
hmemu = { git = "https://github.com/bluele/hmemu", branch = "develop" }
```
Then, write test code for the contract.
```rust
extern crate hmemu;
#[test]
fn contract_func_test() {
let args = vec!["1", "2"];
hmemu::exec_process_with_arguments(args, || {
contract_func().unwrap();
hmemu::commit_state()?;
let state = hmc::read_state("key".as_bytes());
assert!(state.is_ok());
assert_eq!("value".as_bytes().to_vec(), state.unwrap());
let ret = hmemu::get_return_value();
assert!(ret.is_ok());
assert_eq!("ok".to_string(), String::from_utf8(ret.unwrap()).unwrap());
let ev = hmemu::get_event("test-event", 0);
assert!(ev.is_ok());
assert_eq!("key".to_string(), String::from_utf8(ev.unwrap()).unwrap());
Ok(())
})
.unwrap();
}
```
Finally, you can run test command.
```
// This is required to build `lib` directory. Please see `build.rs` for details.
$ export GO111MODULE=on
$ cargo test
```
## Test
```
$ export GO111MODULE=on
$ cargo test
```
## Author
**Jun Kimura**
* <http://github.com/bluele>
* <[email protected]>
| 22.152778 | 115 | 0.635737 | eng_Latn | 0.40207 |
40e2e497c90d5b0f493b289a0765935b8278c41f | 618 | md | Markdown | README.md | TiNredmc/PD443Xclock | f805267d4b68ecf074a94fec2187653f19f0dd35 | [
"Unlicense"
] | 1 | 2018-01-18T14:17:23.000Z | 2018-01-18T14:17:23.000Z | README.md | TiNredmc/PD443Xclock | f805267d4b68ecf074a94fec2187653f19f0dd35 | [
"Unlicense"
] | null | null | null | README.md | TiNredmc/PD443Xclock | f805267d4b68ecf074a94fec2187653f19f0dd35 | [
"Unlicense"
] | null | null | null | # PD443Xclock Not Tested
PD443Xclock, not tested yet.
Components:
1. PD443X, PD243X or PD353X, 1 piece
2. Any Arduino with an ATmega328(P), 1 piece
3. DS3231 module, 1 piece
4. 74HC595, 1 piece
Connection between the Arduino, the 74HC595 and the PDXXXX (from item 1):
https://github.com/TiNredmc/PD443X_Lib/blob/master/Connection.txt
Connection, Arduino to the DS3231 module:
A4 or SDA to SDA pin
A5 or SCL to SCL pin
Vcc to Vcc
GND to GND
Credits Library
RTClib:https://github.com/adafruit/RTClib
My Library
PD443X:https://github.com/TiNredmc/PD443X_Lib
| 26.869565 | 66 | 0.673139 | kor_Hang | 0.623708 |
40e3d3b764957a82e32920c1ffabbbde9b9ed7fe | 651 | md | Markdown | README.md | Spyritomb/vbfabrications | d18b294ef7da3549198533718ed924990804ea2b | [
"MIT"
] | 1 | 2021-06-27T09:00:01.000Z | 2021-06-27T09:00:01.000Z | README.md | Spyritomb/vbfabrications | d18b294ef7da3549198533718ed924990804ea2b | [
"MIT"
] | null | null | null | README.md | Spyritomb/vbfabrications | d18b294ef7da3549198533718ed924990804ea2b | [
"MIT"
] | null | null | null | # V.B Fabrications
3rd year project
MVC: CodeIgniter 4
To run the application:
Method 1:
If you want to run this application on your local device, copy this folder into htdocs, and in the XAMPP
config control:
Apache (httpd.conf) and change the following lines to this:
DocumentRoot "C:/xampp/htdocs/vbfabrications/public"
<Directory "C:/xampp/htdocs/vbfabrications/public">
Apache (httpd-ssl.conf) and change the following line to this:
DocumentRoot "C:/xampp/htdocs/vbfabrications/public"
Method 2:
Within the vbfabrications directory, run cmd and type: "php spark serve"
Admin Login:
Username: Andreas
Password: 1234abcd
| 18.083333 | 108 | 0.772657 | eng_Latn | 0.91586 |
40e497a9dc1d319777db50df3201b826a88a3b03 | 728 | md | Markdown | README.md | christophehurpeau/modern-browsers | 2892d7f62983c56f2950e1d9555c143caa3d38d5 | [
"0BSD"
] | 1 | 2021-08-05T01:51:25.000Z | 2021-08-05T01:51:25.000Z | README.md | christophehurpeau/modern-browsers | 2892d7f62983c56f2950e1d9555c143caa3d38d5 | [
"0BSD"
] | 114 | 2017-03-14T14:52:46.000Z | 2022-03-26T00:42:56.000Z | README.md | christophehurpeau/modern-browsers | 2892d7f62983c56f2950e1d9555c143caa3d38d5 | [
"0BSD"
] | null | null | null | <h3 align="center">
modern-browsers
</h3>
<p align="center">
Regexp of modern browsers
</p>
<p align="center">
<a href="https://npmjs.org/package/modern-browsers"><img src="https://img.shields.io/npm/v/modern-browsers.svg?style=flat-square"></a>
</p>
## Why?
You can use it with [babel-preset-modern-browsers](https://www.npmjs.com/package/babel-preset-modern-browsers) to serve a different bundle with modern browsers.
## Install
```bash
npm install --save modern-browsers
```
## Usage
```js
import isModernBrowser from 'modern-browsers';
console.log(
isModernBrowser(
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.76 Safari/537.36',
),
);
```
| 21.411765 | 160 | 0.700549 | yue_Hant | 0.327105 |
40e5cc943cb3a3c19e621d6e853038ec3e9d8266 | 827 | md | Markdown | about.md | ganesh737/ganesh737.github.io | bc1178eed6af38570e8317ce80db4480cf902a16 | [
"MIT"
] | null | null | null | about.md | ganesh737/ganesh737.github.io | bc1178eed6af38570e8317ce80db4480cf902a16 | [
"MIT"
] | null | null | null | about.md | ganesh737/ganesh737.github.io | bc1178eed6af38570e8317ce80db4480cf902a16 | [
"MIT"
] | null | null | null | ---
layout: page
title: About
permalink: /about/
---
Hi,
My name is Ganesh. I am a simple, straightforward, and result-oriented person looking to do something meaningful with my skills.
I'm currently working as Technical Expert at [RBEI](https://www.bosch-india-software.com/en/our-company/about-us/).
My current focus is on Software Update Solutions for [Infotainment devices](https://www.bosch-mobility-solutions.com/en/).
So my primary work areas are -
* C++ Applications for Genivi Linux
* Busybox and Lua scripting for devices
* Tooling for Update Medium generation (stick, delta, and production images)
* Peripheral Updaters
I am currently reading up on Artificial Intelligence.
My personal experiences are shared in [my beat of life](https://ganesh737.github.io/my-beat-of-life/)
email: ganesh.rvec[at]gmail.com
<!-- Source: Sczlog/cloudtower-java-sdk, cloudtower-java-sdk-okhttp-gson/docs/NvmfSubsystemUpdationParams.md (MIT) -->
# NvmfSubsystemUpdationParams
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**data** | [**NvmfSubsystemCommonParams**](NvmfSubsystemCommonParams.md) | |
**where** | [**NvmfSubsystemWhereInput**](NvmfSubsystemWhereInput.md) | |
<!-- Source: louay47/Epodia, README.md (MIT) -->
# Epodia
Spring Boot REST API for an e-health application
<!-- Source: hyperledger-gerrit-archive/fabric-gerrit, fabric/14113-14645/14179.md (Apache-2.0) -->
<strong>Project</strong>: fabric<br><strong>Branch</strong>: master<br><strong>ID</strong>: 14179<br><strong>Subject</strong>: [FAB-6254] Enhance CSCC with local MSP update<br><strong>Status</strong>: ABANDONED<br><strong>Owner</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Assignee</strong>:<br><strong>Created</strong>: 10/4/2017, 5:08:22 AM<br><strong>LastUpdated</strong>: 5/22/2018, 8:30:37 AM<br><strong>CommitMessage</strong>:<br><pre>[FAB-6254] Enhance CSCC with local MSP update
Add a new UpdateMSP function to CSCC that updates the local MSP of the
peer by (re)reading the configuration information from the MSP
directory where the original MSP configuration information was
stored. It is not (yet) possible to point to another location from
which the local MSP configuration information should be loaded.
Change-Id: I1060ee41d1b1ddfd6837622cb3d2d708c940e4cb
Signed-off-by: Mathias Bjoerkqvist <[email protected]>
</pre><h1>Comments</h1><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 5:08:22 AM<br><strong>Message</strong>: <pre>Uploaded patch set 1.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 5:08:31 AM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/13264/ (1/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 5:09:56 AM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/17589/ (2/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 5:10:28 AM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/9156/ (3/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 5:11:16 AM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/341/ (4/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 5:13:42 AM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/11585/ (5/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 5:36:05 AM<br><strong>Message</strong>: <pre>Patch Set 1: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/17589/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-x86_64/17589/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-x86_64/17589
https://jenkins.hyperledger.org/job/fabric-verify-z/13264/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-z/13264/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-z/13264
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/341/ : FAILURE (skipped)
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/341/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-two-staged-ci-check-x86_64/341
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/9156/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-end-2-end-x86_64/9156
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/11585/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-behave-x86_64/11585</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 6:59:36 AM<br><strong>Message</strong>: <pre>Uploaded patch set 2.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 6:59:43 AM<br><strong>Message</strong>: <pre>Patch Set 2:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/13271/ (1/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 7:01:31 AM<br><strong>Message</strong>: <pre>Patch Set 2:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/17596/ (2/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 7:01:48 AM<br><strong>Message</strong>: <pre>Patch Set 2:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/9163/ (3/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 7:02:18 AM<br><strong>Message</strong>: <pre>Patch Set 2:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/348/ (4/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 7:05:05 AM<br><strong>Message</strong>: <pre>Patch Set 2:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/11591/ (5/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/4/2017, 8:16:01 AM<br><strong>Message</strong>: <pre>Patch Set 2: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/17596/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-x86_64/17596
https://jenkins.hyperledger.org/job/fabric-verify-z/13271/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-z/13271
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/9163/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-end-2-end-x86_64/9163
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/348/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-two-staged-ci-check-x86_64/348
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/11591/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-behave-x86_64/11591</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 3:55:46 AM<br><strong>Message</strong>: <pre>Uploaded patch set 3.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 3:55:54 AM<br><strong>Message</strong>: <pre>Patch Set 3:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/14119/ (1/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 3:57:35 AM<br><strong>Message</strong>: <pre>Patch Set 3:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10039/ (2/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 3:58:10 AM<br><strong>Message</strong>: <pre>Patch Set 3:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1038/ (3/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 3:58:38 AM<br><strong>Message</strong>: <pre>Patch Set 3:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18399/ (4/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 3:59:14 AM<br><strong>Message</strong>: <pre>Patch Set 3:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12412/ (5/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 4:01:42 AM<br><strong>Message</strong>: <pre>Patch Set 3: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10039/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10039/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-end-2-end-x86_64/10039
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1038/ : FAILURE (skipped)
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1038/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-two-staged-ci-check-x86_64/1038
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18399/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18399/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-x86_64/18399
https://jenkins.hyperledger.org/job/fabric-verify-z/14119/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-z/14119/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-z/14119
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12412/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12412/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-behave-x86_64/12412</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 4:56:43 AM<br><strong>Message</strong>: <pre>Uploaded patch set 4: Patch Set 3 was rebased.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 4:56:49 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/14120/ (1/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 4:58:14 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10040/ (2/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 4:59:04 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1039/ (3/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 4:59:32 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18400/ (4/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 5:00:13 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12413/ (5/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 5:23:07 AM<br><strong>Message</strong>: <pre>Patch Set 4: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10040/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10040/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-end-2-end-x86_64/10040
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18400/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18400/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-x86_64/18400
https://jenkins.hyperledger.org/job/fabric-verify-z/14120/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-z/14120/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-z/14120
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1039/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-two-staged-ci-check-x86_64/1039
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12413/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-behave-x86_64/12413</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 5:51:09 AM<br><strong>Message</strong>: <pre>Patch Set 4:
reverify-x</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 5:51:20 AM<br><strong>Message</strong>: <pre>Patch Set 4: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/14122/ (1/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 5:52:43 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10042/ (2/4)</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 5:53:03 AM<br><strong>Message</strong>: <pre>Patch Set 4:
reverify-z</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 5:53:39 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18402/ (3/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 5:54:10 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12415/ (4/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/25/2017, 6:15:53 AM<br><strong>Message</strong>: <pre>Patch Set 4: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18402/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18402/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-x86_64/18402
https://jenkins.hyperledger.org/job/fabric-verify-z/14122/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-z/14122/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-z/14122
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10042/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-end-2-end-x86_64/10042
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12415/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-behave-x86_64/12415</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:12:36 AM<br><strong>Message</strong>: <pre>Patch Set 4:
reverify</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:13:01 AM<br><strong>Message</strong>: <pre>Patch Set 4: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/14161/ (1/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:14:22 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10079/ (2/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:14:50 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18441/ (3/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:15:13 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12452/ (4/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:36:01 AM<br><strong>Message</strong>: <pre>Patch Set 4: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10079/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10079/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-end-2-end-x86_64/10079
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18441/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18441/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-x86_64/18441
https://jenkins.hyperledger.org/job/fabric-verify-z/14161/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-z/14161/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-z/14161
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12452/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12452/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-behave-x86_64/12452</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:40:27 AM<br><strong>Message</strong>: <pre>Patch Set 4:
reverify</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:40:36 AM<br><strong>Message</strong>: <pre>Patch Set 4: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/14162/ (1/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:42:01 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10080/ (2/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:42:28 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18442/ (3/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:43:09 AM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12453/ (4/4)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/26/2017, 3:57:47 AM<br><strong>Message</strong>: <pre>Patch Set 4: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10080/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10080/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-end-2-end-x86_64/10080
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18442/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18442/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-x86_64/18442
https://jenkins.hyperledger.org/job/fabric-verify-z/14162/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-z/14162/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-z/14162
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12453/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-behave-x86_64/12453</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 4:54:25 AM<br><strong>Message</strong>: <pre>Uploaded patch set 5: Patch Set 4 was rebased.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 4:54:43 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/14274/ (1/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 4:55:50 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10196/ (2/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 4:56:20 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1177/ (3/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 4:56:36 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18556/ (4/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 4:56:50 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12572/ (5/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 4:57:08 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/24/ (6/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 6:08:35 AM<br><strong>Message</strong>: <pre>Patch Set 5: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10196/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10196/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-end-2-end-x86_64/10196
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18556/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18556/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-x86_64/18556
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1177/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-two-staged-ci-check-x86_64/1177
https://jenkins.hyperledger.org/job/fabric-verify-z/14274/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-z/14274
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12572/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-behave-x86_64/12572
https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/24/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-smoke-tests-verify-x86_64/24</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 6:34:19 AM<br><strong>Message</strong>: <pre>Patch Set 5:
reverify</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 6:34:29 AM<br><strong>Message</strong>: <pre>Patch Set 5: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/14275/ (1/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 6:36:51 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10197/ (2/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 6:37:08 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18557/ (3/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 6:37:40 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12573/ (4/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 6:37:41 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/25/ (5/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/30/2017, 7:44:37 AM<br><strong>Message</strong>: <pre>Patch Set 5: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10197/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10197/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-end-2-end-x86_64/10197
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12573/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12573/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-behave-x86_64/12573
https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/25/ : UNSTABLE (skipped)
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/25/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-smoke-tests-verify-x86_64/25
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18557/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-x86_64/18557
https://jenkins.hyperledger.org/job/fabric-verify-z/14275/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-1/fabric-verify-z/14275</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 10/31/2017, 9:37:49 AM<br><strong>Message</strong>: <pre>Uploaded patch set 6.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/31/2017, 9:37:57 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/14312/ (1/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/31/2017, 9:39:30 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10239/ (2/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/31/2017, 9:39:52 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1208/ (3/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/31/2017, 9:40:11 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18594/ (4/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/31/2017, 9:40:17 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12612/ (5/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/31/2017, 9:40:47 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/60/ (6/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 10/31/2017, 10:57:42 AM<br><strong>Message</strong>: <pre>Patch Set 6: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12612/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12612/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-behave-x86_64/12612
https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/60/ : UNSTABLE (skipped)
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/60/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-smoke-tests-verify-x86_64/60
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10239/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-end-2-end-x86_64/10239
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1208/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-two-staged-ci-check-x86_64/1208
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18594/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-x86_64/18594
https://jenkins.hyperledger.org/job/fabric-verify-z/14312/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-z/14312</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 5:04:46 AM<br><strong>Message</strong>: <pre>Patch Set 6:
reverify</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 5:04:54 AM<br><strong>Message</strong>: <pre>Patch Set 6: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-z/14364/ (2/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 5:04:54 AM<br><strong>Message</strong>: <pre>Patch Set 6: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10289/ (1/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 5:07:58 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18645/ (3/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 5:08:36 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12662/ (4/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 5:08:41 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/108/ (5/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 6:19:05 AM<br><strong>Message</strong>: <pre>Patch Set 6: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10289/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10289/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-end-2-end-x86_64/10289
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/18645/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-x86_64/18645
https://jenkins.hyperledger.org/job/fabric-verify-z/14364/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-z/14364
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/12662/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-behave-x86_64/12662
https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/108/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-smoke-tests-verify-x86_64/108</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 6:44:50 AM<br><strong>Message</strong>: <pre>Patch Set 6:
rebuild-e2e</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 6:49:44 AM<br><strong>Message</strong>: <pre>Patch Set 6: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10294/</pre><strong>Reviewer</strong>: Yacov Manevich - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 7:16:51 AM<br><strong>Message</strong>: <pre>Patch Set 6:
(1 comment)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 7:20:49 AM<br><strong>Message</strong>: <pre>Patch Set 6: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10294/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-end-2-end-x86_64/10294</pre><strong>Reviewer</strong>: Gari Singh - [email protected]<br><strong>Reviewed</strong>: 11/1/2017, 7:28:27 AM<br><strong>Message</strong>: <pre>Patch Set 6: Code-Review-1
I don't think we actually want to do things this way. While in some ways this is useful, I feel like this is not something we want to support in the long term. I'll comment on the JIRA as well. And sorry for the late comments</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 8:49:29 AM<br><strong>Message</strong>: <pre>Patch Set 6:
(1 comment)</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 8:50:24 AM<br><strong>Message</strong>: <pre>Uploaded patch set 7.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 8:50:33 AM<br><strong>Message</strong>: <pre>Patch Set 7:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-s390x/146/ (1/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 8:52:09 AM<br><strong>Message</strong>: <pre>Patch Set 7:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1599/ (2/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 8:52:35 AM<br><strong>Message</strong>: <pre>Patch Set 7:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10704/ (3/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 8:52:37 AM<br><strong>Message</strong>: <pre>Patch Set 7:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/13056/ (4/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 8:53:22 AM<br><strong>Message</strong>: <pre>Patch Set 7:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19020/ (5/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 8:53:30 AM<br><strong>Message</strong>: <pre>Patch Set 7:
Build Started https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/471/ (6/6)</pre><strong>Reviewer</strong>: Yacov Manevich - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 9:28:33 AM<br><strong>Message</strong>: <pre>Patch Set 7: Code-Review-1
(1 comment)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 10:05:54 AM<br><strong>Message</strong>: <pre>Patch Set 7: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/471/ : UNSTABLE (skipped)
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/471/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-smoke-tests-verify-x86_64/471
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1599/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-two-staged-ci-check-x86_64/1599
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10704/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-end-2-end-x86_64/10704
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/13056/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-behave-x86_64/13056
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19020/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-x86_64/19020
https://jenkins.hyperledger.org/job/fabric-verify-s390x/146/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-s390x/146</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 4:02:01 PM<br><strong>Message</strong>: <pre>Patch Set 7:
(1 comment)</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 4:02:27 PM<br><strong>Message</strong>: <pre>Uploaded patch set 8.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 4:02:37 PM<br><strong>Message</strong>: <pre>Patch Set 8:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1603/ (1/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 4:02:37 PM<br><strong>Message</strong>: <pre>Patch Set 8:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-s390x/150/ (2/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 4:06:27 PM<br><strong>Message</strong>: <pre>Patch Set 8:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10708/ (3/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 4:06:34 PM<br><strong>Message</strong>: <pre>Patch Set 8:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/13060/ (4/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 4:06:34 PM<br><strong>Message</strong>: <pre>Patch Set 8:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19024/ (5/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 4:07:14 PM<br><strong>Message</strong>: <pre>Patch Set 8:
Build Started https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/475/ (6/6)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 11/17/2017, 5:17:31 PM<br><strong>Message</strong>: <pre>Patch Set 8: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/475/ : UNSTABLE (skipped)
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-smoke-tests-verify-x86_64/475/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-smoke-tests-verify-x86_64/475
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1603/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-two-staged-ci-check-x86_64/1603
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/10708/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-end-2-end-x86_64/10708
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/13060/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-behave-x86_64/13060
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19024/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-x86_64/19024
https://jenkins.hyperledger.org/job/fabric-verify-s390x/150/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-s390x/150</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 5:40:58 AM<br><strong>Message</strong>: <pre>Uploaded patch set 9.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 5:41:06 AM<br><strong>Message</strong>: <pre>Patch Set 9:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-s390x/444/ (1/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 5:42:56 AM<br><strong>Message</strong>: <pre>Patch Set 9:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19301/ (2/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 5:43:25 AM<br><strong>Message</strong>: <pre>Patch Set 9:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/11002/ (3/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 5:43:36 AM<br><strong>Message</strong>: <pre>Patch Set 9:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/13363/ (4/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 5:44:07 AM<br><strong>Message</strong>: <pre>Patch Set 9:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1854/ (5/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 6:59:01 AM<br><strong>Message</strong>: <pre>Patch Set 9: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19301/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-x86_64/19301
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/11002/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-end-2-end-x86_64/11002
https://jenkins.hyperledger.org/job/fabric-verify-s390x/444/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-s390x/444
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/13363/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-behave-x86_64/13363
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1854/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-two-staged-ci-check-x86_64/1854</pre><strong>Reviewer</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 11:22:20 AM<br><strong>Message</strong>: <pre>Uploaded patch set 10.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 11:22:30 AM<br><strong>Message</strong>: <pre>Patch Set 10:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-s390x/450/ (1/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 11:26:06 AM<br><strong>Message</strong>: <pre>Patch Set 10:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19307/ (2/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 11:26:08 AM<br><strong>Message</strong>: <pre>Patch Set 10:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/13376/ (5/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 11:26:08 AM<br><strong>Message</strong>: <pre>Patch Set 10:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1860/ (4/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 11:26:09 AM<br><strong>Message</strong>: <pre>Patch Set 10:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/11008/ (3/5)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 12/1/2017, 12:21:25 PM<br><strong>Message</strong>: <pre>Patch Set 10: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19307/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19307/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-x86_64/19307
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/11008/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/11008/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-end-2-end-x86_64/11008
https://jenkins.hyperledger.org/job/fabric-verify-s390x/450/ : FAILURE
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-s390x/450/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-s390x/450
https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1860/ : FAILURE (skipped)
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1860/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-two-staged-ci-check-x86_64/1860
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/13376/ : SUCCESS
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-behave-x86_64/13376</pre><strong>Reviewer</strong>: Christopher Ferris - [email protected]<br><strong>Reviewed</strong>: 3/29/2018, 3:13:43 PM<br><strong>Message</strong>: <pre>Patch Set 10:
> Patch Set 10: Verified-1
>
> Build Failed
>
> https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19307/ : FAILURE
>
> No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-x86_64/19307/ )
>
> Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-x86_64/19307
>
> https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/11008/ : FAILURE
>
> No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/11008/ )
>
> Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-end-2-end-x86_64/11008
>
> https://jenkins.hyperledger.org/job/fabric-verify-s390x/450/ : FAILURE
>
> No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-s390x/450/ )
>
> Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-s390x/450
>
> https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1860/ : FAILURE (skipped)
>
> No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-two-staged-ci-check-x86_64/1860/ )
>
> Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-two-staged-ci-check-x86_64/1860
>
> https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/13376/ : SUCCESS
>
> Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-behave-x86_64/13376
Now that we're past 1.1, we can reconsider this. It needs a rebase for starters.</pre><strong>Reviewer</strong>: Kostas Christidis - [email protected]<br><strong>Reviewed</strong>: 4/20/2018, 5:44:35 PM<br><strong>Message</strong>: <pre>Patch Set 10:
Matthias, see Chris's comment above. Sorry for letting this sit for so long -- can you please rebase and address the merge conflicts?</pre><strong>Reviewer</strong>: Kostas Christidis - [email protected]<br><strong>Reviewed</strong>: 4/20/2018, 6:00:10 PM<br><strong>Message</strong>: <pre>Patch Set 10:
(Do note however that this doesn't seem to be in the list for 1.2, so I predict this won't be the last rebasing needed.)</pre><strong>Reviewer</strong>: Kostas Christidis - [email protected]<br><strong>Reviewed</strong>: 5/22/2018, 8:30:37 AM<br><strong>Message</strong>: <pre>Abandoned
Hasn't been touched for 6 months.</pre><h1>PatchSets</h1><h3>PatchSet Number: 1</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 10/4/2017, 5:08:22 AM<br><strong>UnmergedRevision</strong>: [5292258adb9b659d118832ff87d6a1095c86c7b1](https://github.com/hyperledger-gerrit-archive/fabric/commit/5292258adb9b659d118832ff87d6a1095c86c7b1)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 10/4/2017, 5:36:05 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 2</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 10/4/2017, 6:59:36 AM<br><strong>UnmergedRevision</strong>: [6e8fdc153cccb8cc2c5c5f5c05f6f228afa1af40](https://github.com/hyperledger-gerrit-archive/fabric/commit/6e8fdc153cccb8cc2c5c5f5c05f6f228afa1af40)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 10/4/2017, 8:16:01 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 3</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 10/25/2017, 3:55:46 AM<br><strong>UnmergedRevision</strong>: [d4973e69862fee9f651e88199b4107a43050c686](https://github.com/hyperledger-gerrit-archive/fabric/commit/d4973e69862fee9f651e88199b4107a43050c686)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 10/25/2017, 4:01:42 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 
-1<br><br></blockquote><h3>PatchSet Number: 4</h3><blockquote><strong>Type</strong>: TRIVIAL_REBASE<br><strong>Author</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 10/25/2017, 4:56:43 AM<br><strong>UnmergedRevision</strong>: [757e354c89df66a462da5797a35e97ad4bee918a](https://github.com/hyperledger-gerrit-archive/fabric/commit/757e354c89df66a462da5797a35e97ad4bee918a)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 10/26/2017, 3:57:47 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 5</h3><blockquote><strong>Type</strong>: TRIVIAL_REBASE<br><strong>Author</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 10/30/2017, 4:54:25 AM<br><strong>UnmergedRevision</strong>: [cb287f29cc38750f85efa4046ef39cd23ba1d3c1](https://github.com/hyperledger-gerrit-archive/fabric/commit/cb287f29cc38750f85efa4046ef39cd23ba1d3c1)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 10/30/2017, 7:44:37 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 6</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 10/31/2017, 9:37:49 AM<br><strong>UnmergedRevision</strong>: [5d00d55f2b480be15b5c70e1c16be3b5ad0a106b](https://github.com/hyperledger-gerrit-archive/fabric/commit/5d00d55f2b480be15b5c70e1c16be3b5ad0a106b)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 11/1/2017, 7:20:49 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 
1<br><br><strong>Approver</strong>: Gari Singh - [email protected]<br><strong>Approved</strong>: 11/1/2017, 7:28:27 AM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: -1<br><br><h2>Comments</h2><strong>Commenter</strong>: Yacov Manevich - [email protected]<br><strong>CommentLine</strong>: [msp/mgmt/mgmt.go#L170](https://github.com/hyperledger-gerrit-archive/fabric/blob/5d00d55f2b480be15b5c70e1c16be3b5ad0a106b/msp/mgmt/mgmt.go#L170)<br><strong>Comment</strong>: <pre>I think we're missing purging the cache of the local MSP.</pre><strong>Commenter</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>CommentLine</strong>: [msp/mgmt/mgmt.go#L170](https://github.com/hyperledger-gerrit-archive/fabric/blob/5d00d55f2b480be15b5c70e1c16be3b5ad0a106b/msp/mgmt/mgmt.go#L170)<br><strong>Comment</strong>: <pre>Addressed in next patch. Related CR: https://gerrit.hyperledger.org/r/15559</pre></blockquote><h3>PatchSet Number: 7</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 11/17/2017, 8:50:24 AM<br><strong>UnmergedRevision</strong>: [3c9c76cbcfec49cbf2eed49b4eaab9c79114daa7](https://github.com/hyperledger-gerrit-archive/fabric/commit/3c9c76cbcfec49cbf2eed49b4eaab9c79114daa7)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 11/17/2017, 10:05:54 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Yacov Manevich - [email protected]<br><strong>Approved</strong>: 11/17/2017, 9:28:33 AM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: -1<br><br><h2>Comments</h2><strong>Commenter</strong>: Yacov Manevich - [email protected]<br><strong>CommentLine</strong>: 
[msp/mgmt/mgmt.go#L209](https://github.com/hyperledger-gerrit-archive/fabric/blob/3c9c76cbcfec49cbf2eed49b4eaab9c79114daa7/msp/mgmt/mgmt.go#L209)<br><strong>Comment</strong>: <pre>we need a lock for memory coherency</pre><strong>Commenter</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>CommentLine</strong>: [msp/mgmt/mgmt.go#L209](https://github.com/hyperledger-gerrit-archive/fabric/blob/3c9c76cbcfec49cbf2eed49b4eaab9c79114daa7/msp/mgmt/mgmt.go#L209)<br><strong>Comment</strong>: <pre>Added locks.</pre></blockquote><h3>PatchSet Number: 8</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 11/17/2017, 4:02:27 PM<br><strong>UnmergedRevision</strong>: [7484ebe523c61f8c071c91ce089f1bdc1046a079](https://github.com/hyperledger-gerrit-archive/fabric/commit/7484ebe523c61f8c071c91ce089f1bdc1046a079)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 11/17/2017, 5:17:31 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 9</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 12/1/2017, 5:40:58 AM<br><strong>UnmergedRevision</strong>: [06cbc4b55643acf450d404c6b1cedc11190ef958](https://github.com/hyperledger-gerrit-archive/fabric/commit/06cbc4b55643acf450d404c6b1cedc11190ef958)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 12/1/2017, 6:59:01 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 10</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Mathias Bjoerkqvist - [email 
protected]<br><strong>Uploader</strong>: Mathias Bjoerkqvist - [email protected]<br><strong>Created</strong>: 12/1/2017, 11:22:20 AM<br><strong>UnmergedRevision</strong>: [768cecf473dbc3e99cdd0c4a3ab56144a775589b](https://github.com/hyperledger-gerrit-archive/fabric/commit/768cecf473dbc3e99cdd0c4a3ab56144a775589b)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 12/1/2017, 12:21:25 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote>
This repository contains the open-source uptime monitor and status page for [Christian](https://status.fussel.tv), powered by [Upptime](https://github.com/upptime/upptime).
[](https://github.com/FusselTV/status/actions?query=workflow%3A%22Uptime+CI%22)
[](https://github.com/FusselTV/status/actions?query=workflow%3A%22Response+Time+CI%22)
[](https://github.com/FusselTV/status/actions?query=workflow%3A%22Graphs+CI%22)
[](https://github.com/FusselTV/status/actions?query=workflow%3A%22Static+Site+CI%22)
[](https://github.com/FusselTV/status/actions?query=workflow%3A%22Summary+CI%22)
With [Upptime](https://upptime.js.org), you can get your own unlimited and free uptime monitor and status page, powered entirely by a GitHub repository. We use [Issues](https://github.com/FusselTV/status/issues) as incident reports, [Actions](https://github.com/FusselTV/status/actions) as uptime monitors, and [Pages](https://status.fussel.tv) for the status page.
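Upptime reads its monitor list and status-site settings from a `.upptimerc.yml` file at the repository root. As a minimal sketch only (the site names and URLs below are taken from this repository's status table; other keys follow Upptime's documented config format and are illustrative, not a copy of this repo's actual file):

```yaml
# .upptimerc.yml — minimal Upptime configuration sketch
owner: FusselTV        # GitHub user/org that owns the status repo
repo: status           # repository name

# Each entry becomes an uptime monitor and a row in the status table
sites:
  - name: FRITZ!Box
    url: https://avm.fussel.tv
  - name: Bitwarden
    url: https://vault.fussel.tv

# Settings for the generated status page (served via GitHub Pages)
status-website:
  cname: status.fussel.tv   # custom domain for the status page
  name: Christian
```

With this in place, the scheduled GitHub Actions workflows ping each `url`, commit response-time history to the repo, open an Issue when a site goes down, and rebuild the status page.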
<!--start: status pages-->
<!-- This summary is generated by Upptime (https://github.com/upptime/upptime) -->
<!-- Do not edit this manually, your changes will be overwritten -->
<!-- prettier-ignore -->
| URL | Status | History | Response Time | Uptime |
| --- | ------ | ------- | ------------- | ------ |
| <img alt="" src="https://upload.wikimedia.org/wikipedia/de/6/68/Fritz%21_Logo.svg" height="13"> [FRITZ!Box](https://avm.fussel.tv) | 🟩 Up | [fritz-box.yml](https://github.com/FusselTV/status/commits/HEAD/history/fritz-box.yml) | <details><summary><img alt="Response time graph" src="./graphs/fritz-box/response-time-week.png" height="20"> 602ms</summary><br><a href="https://status.fussel.tv/history/fritz-box"><img alt="Response time 624" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fresponse-time.json"></a><br><a href="https://status.fussel.tv/history/fritz-box"><img alt="24-hour response time 604" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fresponse-time-day.json"></a><br><a href="https://status.fussel.tv/history/fritz-box"><img alt="7-day response time 602" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fresponse-time-week.json"></a><br><a href="https://status.fussel.tv/history/fritz-box"><img alt="30-day response time 622" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fresponse-time-month.json"></a><br><a href="https://status.fussel.tv/history/fritz-box"><img alt="1-year response time 624" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fresponse-time-year.json"></a></details> | <details><summary><a href="https://status.fussel.tv/history/fritz-box">100.00%</a></summary><a href="https://status.fussel.tv/history/fritz-box"><img alt="All-time uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fuptime.json"></a><br><a href="https://status.fussel.tv/history/fritz-box"><img alt="24-hour uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fuptime-day.json"></a><br><a href="https://status.fussel.tv/history/fritz-box"><img alt="7-day uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fuptime-week.json"></a><br><a href="https://status.fussel.tv/history/fritz-box"><img alt="30-day uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fuptime-month.json"></a><br><a href="https://status.fussel.tv/history/fritz-box"><img alt="1-year uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Ffritz-box%2Fuptime-year.json"></a></details>
| <img alt="" src="https://products.containerize.com/password-management/bitwarden/menu_image.png" height="13"> [Bitwarden](https://vault.fussel.tv) | 🟩 Up | [bitwarden.yml](https://github.com/FusselTV/status/commits/HEAD/history/bitwarden.yml) | <details><summary><img alt="Response time graph" src="./graphs/bitwarden/response-time-week.png" height="20"> 2111ms</summary><br><a href="https://status.fussel.tv/history/bitwarden"><img alt="Response time 1070" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fresponse-time.json"></a><br><a href="https://status.fussel.tv/history/bitwarden"><img alt="24-hour response time 3985" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fresponse-time-day.json"></a><br><a href="https://status.fussel.tv/history/bitwarden"><img alt="7-day response time 2111" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fresponse-time-week.json"></a><br><a href="https://status.fussel.tv/history/bitwarden"><img alt="30-day response time 1288" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fresponse-time-month.json"></a><br><a href="https://status.fussel.tv/history/bitwarden"><img alt="1-year response time 1070" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fresponse-time-year.json"></a></details> | <details><summary><a href="https://status.fussel.tv/history/bitwarden">100.00%</a></summary><a href="https://status.fussel.tv/history/bitwarden"><img alt="All-time uptime 99.43%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fuptime.json"></a><br><a href="https://status.fussel.tv/history/bitwarden"><img alt="24-hour uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fuptime-day.json"></a><br><a href="https://status.fussel.tv/history/bitwarden"><img alt="7-day uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fuptime-week.json"></a><br><a href="https://status.fussel.tv/history/bitwarden"><img alt="30-day uptime 99.95%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fuptime-month.json"></a><br><a href="https://status.fussel.tv/history/bitwarden"><img alt="1-year uptime 99.43%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fbitwarden%2Fuptime-year.json"></a></details>
| <img alt="" src="https://upload.wikimedia.org/wikipedia/commons/6/6e/Home_Assistant_Logo.svg" height="13"> [Home Assistant](https://ha.fussel.tv) | 🟩 Up | [home-assistant.yml](https://github.com/FusselTV/status/commits/HEAD/history/home-assistant.yml) | <details><summary><img alt="Response time graph" src="./graphs/home-assistant/response-time-week.png" height="20"> 1835ms</summary><br><a href="https://status.fussel.tv/history/home-assistant"><img alt="Response time 978" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fresponse-time.json"></a><br><a href="https://status.fussel.tv/history/home-assistant"><img alt="24-hour response time 2199" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fresponse-time-day.json"></a><br><a href="https://status.fussel.tv/history/home-assistant"><img alt="7-day response time 1835" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fresponse-time-week.json"></a><br><a href="https://status.fussel.tv/history/home-assistant"><img alt="30-day response time 1579" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fresponse-time-month.json"></a><br><a href="https://status.fussel.tv/history/home-assistant"><img alt="1-year response time 978" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fresponse-time-year.json"></a></details> | <details><summary><a href="https://status.fussel.tv/history/home-assistant">100.00%</a></summary><a href="https://status.fussel.tv/history/home-assistant"><img alt="All-time uptime 99.26%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fuptime.json"></a><br><a href="https://status.fussel.tv/history/home-assistant"><img alt="24-hour uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fuptime-day.json"></a><br><a href="https://status.fussel.tv/history/home-assistant"><img alt="7-day uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fuptime-week.json"></a><br><a href="https://status.fussel.tv/history/home-assistant"><img alt="30-day uptime 99.96%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fuptime-month.json"></a><br><a href="https://status.fussel.tv/history/home-assistant"><img alt="1-year uptime 99.26%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fhome-assistant%2Fuptime-year.json"></a></details>
| <img alt="" src="https://cdn.worldvectorlogo.com/logos/portainer.svg" height="13"> [Portainer](https://portainer.fussel.tv/) | 🟩 Up | [portainer.yml](https://github.com/FusselTV/status/commits/HEAD/history/portainer.yml) | <details><summary><img alt="Response time graph" src="./graphs/portainer/response-time-week.png" height="20"> 294ms</summary><br><a href="https://status.fussel.tv/history/portainer"><img alt="Response time 283" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fresponse-time.json"></a><br><a href="https://status.fussel.tv/history/portainer"><img alt="24-hour response time 239" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fresponse-time-day.json"></a><br><a href="https://status.fussel.tv/history/portainer"><img alt="7-day response time 294" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fresponse-time-week.json"></a><br><a href="https://status.fussel.tv/history/portainer"><img alt="30-day response time 280" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fresponse-time-month.json"></a><br><a href="https://status.fussel.tv/history/portainer"><img alt="1-year response time 283" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fresponse-time-year.json"></a></details> | <details><summary><a href="https://status.fussel.tv/history/portainer">100.00%</a></summary><a href="https://status.fussel.tv/history/portainer"><img alt="All-time uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fuptime.json"></a><br><a href="https://status.fussel.tv/history/portainer"><img alt="24-hour uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fuptime-day.json"></a><br><a href="https://status.fussel.tv/history/portainer"><img alt="7-day uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fuptime-week.json"></a><br><a href="https://status.fussel.tv/history/portainer"><img alt="30-day uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fuptime-month.json"></a><br><a href="https://status.fussel.tv/history/portainer"><img alt="1-year uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fportainer%2Fuptime-year.json"></a></details>
| <img alt="" src="https://cdn.worldvectorlogo.com/logos/nginx-1.svg" height="13"> [Nginx](https://nginx.fussel.tv/) | 🟩 Up | [nginx.yml](https://github.com/FusselTV/status/commits/HEAD/history/nginx.yml) | <details><summary><img alt="Response time graph" src="./graphs/nginx/response-time-week.png" height="20"> 314ms</summary><br><a href="https://status.fussel.tv/history/nginx"><img alt="Response time 372" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fresponse-time.json"></a><br><a href="https://status.fussel.tv/history/nginx"><img alt="24-hour response time 257" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fresponse-time-day.json"></a><br><a href="https://status.fussel.tv/history/nginx"><img alt="7-day response time 314" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fresponse-time-week.json"></a><br><a href="https://status.fussel.tv/history/nginx"><img alt="30-day response time 274" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fresponse-time-month.json"></a><br><a href="https://status.fussel.tv/history/nginx"><img alt="1-year response time 372" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fresponse-time-year.json"></a></details> | <details><summary><a href="https://status.fussel.tv/history/nginx">100.00%</a></summary><a href="https://status.fussel.tv/history/nginx"><img alt="All-time uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fuptime.json"></a><br><a href="https://status.fussel.tv/history/nginx"><img alt="24-hour uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fuptime-day.json"></a><br><a href="https://status.fussel.tv/history/nginx"><img alt="7-day uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fuptime-week.json"></a><br><a href="https://status.fussel.tv/history/nginx"><img alt="30-day uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fuptime-month.json"></a><br><a href="https://status.fussel.tv/history/nginx"><img alt="1-year uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fnginx%2Fuptime-year.json"></a></details>
| <img alt="" src="https://0987.win/images/favicon-196x196.png" height="13"> [Kutt](https://0987.win/) | 🟩 Up | [kutt.yml](https://github.com/FusselTV/status/commits/HEAD/history/kutt.yml) | <details><summary><img alt="Response time graph" src="./graphs/kutt/response-time-week.png" height="20"> 804ms</summary><br><a href="https://status.fussel.tv/history/kutt"><img alt="Response time 800" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fresponse-time.json"></a><br><a href="https://status.fussel.tv/history/kutt"><img alt="24-hour response time 723" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fresponse-time-day.json"></a><br><a href="https://status.fussel.tv/history/kutt"><img alt="7-day response time 804" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fresponse-time-week.json"></a><br><a href="https://status.fussel.tv/history/kutt"><img alt="30-day response time 769" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fresponse-time-month.json"></a><br><a href="https://status.fussel.tv/history/kutt"><img alt="1-year response time 800" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fresponse-time-year.json"></a></details> | <details><summary><a href="https://status.fussel.tv/history/kutt">100.00%</a></summary><a href="https://status.fussel.tv/history/kutt"><img alt="All-time uptime 86.57%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fuptime.json"></a><br><a href="https://status.fussel.tv/history/kutt"><img alt="24-hour uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fuptime-day.json"></a><br><a href="https://status.fussel.tv/history/kutt"><img alt="7-day uptime 100.00%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fuptime-week.json"></a><br><a href="https://status.fussel.tv/history/kutt"><img alt="30-day uptime 86.23%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fuptime-month.json"></a><br><a href="https://status.fussel.tv/history/kutt"><img alt="1-year uptime 86.57%" src="https://img.shields.io/endpoint?url=https%3A%2F%2Fraw.githubusercontent.com%2FFusselTV%2Fstatus%2FHEAD%2Fapi%2Fkutt%2Fuptime-year.json"></a></details>
<!--end: status pages-->
[**Visit our status website →**](https://status.fussel.tv)
## 📄 License
- Powered by: [Upptime](https://github.com/upptime/upptime)
- Code: [MIT](./LICENSE) © [Christian](https://status.fussel.tv)
- Data in the `./history` directory: [Open Database License](https://opendatacommons.org/licenses/odbl/1-0/)
| 555 | 3,038 | 0.762677 | yue_Hant | 0.467947 |
40e8e3a26fd6b5110947c8f23e09e8125096f839 | 282 | md | Markdown | community/content/contribute/_index.cn.md | mallocxw/sharding-sphere-doc | 019bd4ea5627ab428dd6f7b1e36085e6456dc228 | [
"Apache-2.0"
] | 1 | 2018-11-22T17:00:51.000Z | 2018-11-22T17:00:51.000Z | community/content/contribute/_index.cn.md | mallocxw/sharding-sphere-doc | 019bd4ea5627ab428dd6f7b1e36085e6456dc228 | [
"Apache-2.0"
] | null | null | null | community/content/contribute/_index.cn.md | mallocxw/sharding-sphere-doc | 019bd4ea5627ab428dd6f7b1e36085e6456dc228 | [
"Apache-2.0"
] | null | null | null | +++
pre = "<b>2. </b>"
title = "Participation and Contribution"
weight = 2
chapter = true
+++
Sharding-Sphere is an open and active community. High-quality contributors are warmly welcomed to build the open-source road together with us.
Before becoming a contributor, please read the [Contributor Guide](/cn/contribute/contributor/) and the [Code of Conduct](/cn/contribute/code_conduct/).
If you want to become a committer, please read the [Committer Guide](/cn/contribute/committer/).
Thank you for your attention to Sharding-Sphere.
| 23.5 | 90 | 0.741135 | yue_Hant | 0.427641 |
40e9f40bec61b42967920b6f8b13a80a7df700ae | 40 | md | Markdown | _includes/03-links.md | mjisaboss/markdown-portfolio | 7b5a51b4cdff046723fe1a160b912d795624947a | [
"MIT"
] | null | null | null | _includes/03-links.md | mjisaboss/markdown-portfolio | 7b5a51b4cdff046723fe1a160b912d795624947a | [
"MIT"
] | 5 | 2022-01-14T21:59:19.000Z | 2022-01-18T03:34:09.000Z | _includes/03-links.md | mjisaboss/markdown-portfolio | 7b5a51b4cdff046723fe1a160b912d795624947a | [
"MIT"
] | null | null | null | [Twitter](https://twitter.com/Beattymj)
| 20 | 39 | 0.75 | yue_Hant | 0.816213 |
40eaac8c59832775db21d4ea19b7495914cdd3b7 | 1,513 | md | Markdown | docs/odbc/microsoft/sqlconnect-visual-foxpro-odbc-driver.md | bingenortuzar/sql-docs.es-es | 9e13730ffa0f3ce461cce71bebf1a3ce188c80ad | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-04-26T21:26:08.000Z | 2021-04-26T21:26:08.000Z | docs/odbc/microsoft/sqlconnect-visual-foxpro-odbc-driver.md | jlporatti/sql-docs.es-es | 9b35d3acbb48253e1f299815df975f9ddaa5e9c7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/odbc/microsoft/sqlconnect-visual-foxpro-odbc-driver.md | jlporatti/sql-docs.es-es | 9b35d3acbb48253e1f299815df975f9ddaa5e9c7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: SQLConnect (Visual FoxPro ODBC Driver) | Microsoft Docs
ms.custom: ''
ms.date: 01/19/2017
ms.prod: sql
ms.prod_service: connectivity
ms.reviewer: ''
ms.technology: connectivity
ms.topic: conceptual
helpviewer_keywords:
- SQLConnect function [ODBC], Visual FoxPro ODBC Driver
ms.assetid: 49cbfafa-b21e-4e89-b248-9c7098f46b20
author: MightyPen
ms.author: genemi
ms.openlocfilehash: c5d73339bc87097d81eade4df0b8ab1604979e07
ms.sourcegitcommit: b2464064c0566590e486a3aafae6d67ce2645cef
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 07/15/2019
ms.locfileid: "68054064"
---
# <a name="sqlconnect-visual-foxpro-odbc-driver"></a>SQLConnect (Visual FoxPro ODBC Driver)
> [!NOTE]
> This topic contains information specific to the Visual FoxPro ODBC Driver. For general information about this function, see the corresponding topic in the [ODBC API Reference](../../odbc/reference/syntax/odbc-api-reference.md).
Support: Full
ODBC API conformance: Core level
Connects to a data source, which can be a [database](../../odbc/microsoft/visual-foxpro-terminology.md) or a directory of [tables](../../odbc/microsoft/visual-foxpro-terminology.md). The Visual FoxPro ODBC Driver ignores the *szUID*, *cbUID*, *szAuthStr*, and *cbAuthStr* arguments.
For more information, see [SQLConnect](../../odbc/reference/syntax/sqlconnect-function.md) in the *ODBC Programmer's Reference*.
| 45.848485 | 302 | 0.773298 | spa_Latn | 0.541615 |
40ebb4cf56eb4d4fdb71e7fef3e39646c4f59340 | 866 | md | Markdown | README.md | zenonux/svrx-plugin-tailwindcss | ab3755d54d13955fb8e1c8459c3e3edd8e309a9d | [
"MIT"
] | null | null | null | README.md | zenonux/svrx-plugin-tailwindcss | ab3755d54d13955fb8e1c8459c3e3edd8e309a9d | [
"MIT"
] | null | null | null | README.md | zenonux/svrx-plugin-tailwindcss | ab3755d54d13955fb8e1c8459c3e3edd8e309a9d | [
"MIT"
] | null | null | null | ## svrx-plugin-tailwindcss
[](https://svrx.io/)
[](https://www.npmjs.com/package/svrx-plugin-tailwindcss)
The svrx plugin for tailwindcss
## Usage
> Please make sure that you have installed [svrx](https://svrx.io/) already.
### Via CLI
```bash
svrx -p tailwindcss
```
### Via API
```js
const svrx = require("@svrx/svrx");
svrx({ plugins: ["tailwindcss"] }).start();
```
## Options
### **src \[String]:**
Specifies the tailwindcss src file path. The default value is `tailwindcss.css`.
`svrx -p tailwindcss?src=tailwindcss.css`
### **dest \[String]:**
Specifies the tailwindcss dest file path. The default value is `tailwindcss.min.css`.
`svrx -p tailwindcss?dest=css`
### **build \[String]:**
Compiles tailwindcss when the plugin starts.
`svrx -p tailwindcss?build`
## License
MIT
| 17.673469 | 139 | 0.684758 | yue_Hant | 0.414992 |
40ec2bd8d7f431c42881987c575157ddafb736ce | 27 | md | Markdown | README.md | noahwillcrow/COGS316-MovieData | d7121d8ed247365c85b04b3abeaa3d56d33b1a9a | [
"MIT"
] | null | null | null | README.md | noahwillcrow/COGS316-MovieData | d7121d8ed247365c85b04b3abeaa3d56d33b1a9a | [
"MIT"
] | null | null | null | README.md | noahwillcrow/COGS316-MovieData | d7121d8ed247365c85b04b3abeaa3d56d33b1a9a | [
"MIT"
] | null | null | null | # COGS-316-MovieDataScraper | 27 | 27 | 0.851852 | kor_Hang | 0.885746 |
40ecff3cc431b2e8cd975d83a9868f61057f2985 | 2,335 | md | Markdown | README.md | johnmartel/typescript | bb6d3d49164e9022eb2c7d618abfc43ebe404770 | [
"MIT"
] | null | null | null | README.md | johnmartel/typescript | bb6d3d49164e9022eb2c7d618abfc43ebe404770 | [
"MIT"
] | 211 | 2020-04-25T20:40:43.000Z | 2022-03-26T14:24:02.000Z | README.md | johnmartel/typescript | bb6d3d49164e9022eb2c7d618abfc43ebe404770 | [
"MIT"
] | null | null | null | # typescript-template
[](https://github.com/johnmartel/typescript-template/actions?query=workflow%3A%22Build+and+test%22)
[](https://codeclimate.com/github/johnmartel/typescript-template/maintainability)
[](https://codecov.io/gh/johnmartel/typescript-template)
[](http://makeapullrequest.com)
[](https://dependabot.com)
This is a (slightly) opinionated GitHub template for creating TypeScript projects.
## Usage
Simply use this repository as a template for any new repository.
See: [Creating a repository from a template](https://help.github.com/en/github/creating-cloning-and-archiving-repositories/creating-a-repository-from-a-template)
from the Github doc.
Once this is done, you may want to get rid of the explanation comments.
### TL;DR
- Select this repository as the template for your new repository;
- You won't get the history from this repository, only the files (*all* of them) as an `Initial commit`;
- Modify the stuff you need to: README, repo settings, any tsc/eslint/jest/etc. configurations
- Please get rid of the explanation comments to clean up your new repo.
## Features
Projects created from this template will get the following for free:
- Sensible defaults for your GitHub repository settings;
- Sensible defaults for:
- `tsc` configuration;
- `eslint` configuration;
- `prettier` configuration;
- `jest` configuration with `ts-jest`;
- Default tests location;
- Common npm scripts
- Basic CI workflow using Github actions
- Automatic dependency upgrade PRs by [Dependabot](https://dependabot.com)
- Uploading of code coverage report to [Codecov](https://codecov.io)
- Maintainability score by [CodeClimate](https://codeclimate.com/)
## Building your new repo
```shell script
npm install
npm run lint
npm test
npm run compile:clean
```
Your TypeScript code will then compile nicely into `./lib`, conveniently ignored by git.
| 43.240741 | 192 | 0.776017 | eng_Latn | 0.817242 |
40ed1fef2f0f812203256cc94acfca5b1f371293 | 30 | md | Markdown | README.md | iozy-func/PlanB | e89a7854681b40b63b7937c7afb52bd086ab0f51 | [
"MIT"
] | 1 | 2019-07-24T05:08:18.000Z | 2019-07-24T05:08:18.000Z | README.md | iozy-func/PlanB | e89a7854681b40b63b7937c7afb52bd086ab0f51 | [
"MIT"
] | null | null | null | README.md | iozy-func/PlanB | e89a7854681b40b63b7937c7afb52bd086ab0f51 | [
"MIT"
] | null | null | null | # PlanB_Web
PlanB Web version
| 10 | 17 | 0.8 | deu_Latn | 0.882683 |
40effa37d0f378dbc96ea17e3a6c7eebd6ad18f2 | 10,224 | md | Markdown | README.md | yanuas123/validation | 4a9f9c369daee22b532e2f4836fd95f02775f26a | [
"MIT"
] | null | null | null | README.md | yanuas123/validation | 4a9f9c369daee22b532e2f4836fd95f02775f26a | [
"MIT"
] | null | null | null | README.md | yanuas123/validation | 4a9f9c369daee22b532e2f4836fd95f02775f26a | [
"MIT"
] | null | null | null | # Validation Plugin
This is a TypeScript plugin for form validation. It performs validation on INPUT, SELECT, and TEXTAREA elements. You can also get an object with the data from the filled fields and define a default function to execute after a successful submit. The validator also contains functionality for controlling hidden fields.
## Getting Started
**1.** put `validation.ts` to compilation folder
**2.** create your own `...ts` file
**3.** import plugin module to your own `...ts` file
```javascript
import { type [, types], Validation} from "./modules/validation";
```
where `types...` are interfaces or types for the arguments - **[Types](#types)**; and `Validation` is the `class` object
**4.** launch the `Validation` class in your `...ts` file with the optional argument **[ValidationProp](#validationprop)**. If you use this argument, you must define all of its required properties.
```javascript
let validation = new Validation(ValidationProp);
```
You can launch only one `Validation` class and reuse it many times using the `.setForm()` method on the instance of the class. In other words, one instance of the class can contain many forms.
**5.** add validation to a form using the `setForm()` method with the required argument **[FormArg](#formarg)** and the optional argument **[validationCallFunc](#types)**
```javascript
validation.setForm(FormArg [, validationCallFunc]);
```
**6.** compile the TypeScript files to JavaScript using one of the ways described on the [TypeScript](https://github.com/microsoft/TypeScript) page
## Types
- **TemplateTypes** - `string | number | RegExp` - the type of a validation template. If a form field matches this template, the form field is valid.
- **SubmitEl** - `HTMLInputElement | HTMLButtonElement` - the submit element - the element to which the validator should attach the submit handler
- **validationCallFunc** - `(form: HTMLFormElement, data: formData, call: formCallFunc): void` - a callback function that executes after a successful submit. Three arguments are available here: *form* - the form element, *data* - a data object with the values from the form fields, *call* - you must execute this function in your code to signal to the validator the status of the server request or other actions.
- **formCallFunc** - `(server_resp: boolean | [string]): void` - a function to signal to the validator the status of the server request or other actions. If you pass `true`, the validator will refresh the form fields. If you pass an array of strings, the validator will search for a field name that matches a string from the array and attach the *server error class* to that field.
- **[ValidationProp](#validationprop)** - global validation properties
- **[FormArg](#formarg)** - form validation properties
- **[InpArg](#inparg)** - field validation properties
- **[formData](#formdata)** - data object with values from the form fields
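The hand-off between `validationCallFunc` and `formCallFunc` can be sketched in isolation (the response shape and the `email` field name below are invented for illustration; this is not the plugin's own code):

```javascript
// Sketch: inside your submit callback you receive `call` (a formCallFunc)
// and use it to report the server result back to the validator.
function reportResult(call, response) {
  if (response.ok) {
    call(true); // success: the validator refreshes the form fields
  } else {
    call(response.invalidFields); // e.g. ["email"]: those fields get the server error class
  }
}

// Simulate the validator handing us a `call` function:
let received;
const call = (serverResp) => { received = serverResp; };

reportResult(call, { ok: false, invalidFields: ["email"] });
// `received` is now ["email"]
```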
## Methods
| Method | Arguments | Return | Required | Description |
| ------ | --------- | ------ | :------: | ----------- |
| setForm | | | | set form for validation |
| | [FormArg](#formarg) | | yes | |
| | [validationCallFunc](#types) | | no | |
| | | void | | |
| setData | | | | set custom data on the form, which will then be delivered together with the field data in the object passed to the submit callback |
| | string | | yes | form name |
| | any | | yes | data which will be attached to the form |
| | | void | | |
| getData | | | | returns the data object with the values from the form fields |
| | string \| HTMLFormElement | | yes | form name or form element |
| | | [formData](#formdata) | | |
| changeHidden | | | | Set or unset the hidden state for a group of fields. Hidden fields are excluded from validation and from the data object with values. |
| | string \| HTMLFormElement | | yes | form name or form element |
| | HTMLElement | | yes | hidden HTML container |
| | boolean | | yes | if you hide container, pass true, otherwise false |
| | | void | | |
| validateForm | | | | Executes form validation. If you want to execute your own actions on the form without submitting, you can call this method and then `getData()` |
| | string \| HTMLFormElement | | yes | form name or form element |
| | | boolean | | returns true if the form is valid, otherwise false |
| submitForm | | | | execute form submit (validation and submit callback) |
| | string \| HTMLFormElement | | yes | form name or form element |
| | | boolean | | Returns true if the form is valid, otherwise false. This does not return the result of the submit callback, only the validation result. |
| resetForm | | | | Refreshes the form (sets the fields back to their initial values) |
| | string \| HTMLFormElement | | yes | form name or form element |
| | | void | | |
## Options
### ValidationProp
If you define these settings, the default properties do not apply.
| Property | Type | Required | Default | Description |
| -------- | :-----: | :------: | ------- | ----------- |
| valid_type_template | object | no | ```"tel": /^[0-9\+\-\(\)]{8,16}$/```, ```"email": /^\w+([\.-]?\w+)*@\w+([\.-]?\w+)*(\.\w{2,3})+$/```, ```"password": /^[a-zA-Z0-9]{6,}$/``` | an object containing the default validation templates for some field types. Here the property name is the field type and the value is a [TemplateTypes](#types) |
| type_blocked_template | object | no | ```"number": /^[0-9]+$/```, ```"tel": /^[0-9\+\-\(\)]+$/``` | an object containing the default validation templates for field types that do not allow entering a value which does not match the template. Here the property name is the field type and the value is a [TemplateTypes](#types) |
| hidden_class | string | yes | "hidden" | default class which is used in the HTML template to hide elements |
| wrap_selector | string | no | ".input-parent" | An element with this selector should wrap a single field, and the invalid class is attached to this element. If not defined, the invalid class is attached to the field. |
| inv_require_class | string | yes | "empty" | class for invalid `required` validation |
| inv_valid_class | string | yes | "invalid" | class for invalid fields which does not match to the validation template |
| inv_custom_class | string | no | "invalid-server" | This class is attached to the fields which are marked as invalid in a server response. |
| start_validation | "input" \| "change" | no | "change" | Defines on which event to perform validation. `input` - oninput, onchange, onsubmit; `change` - onchange, onsubmit. If not defined, validation runs only on submit. |
| valid_template_attr | string | no | "data-valid-template" | You can define this attribute on the field, and the validator will use its value as the validation template. If a validation template is defined in the [InpArg](#inparg) options for the field, the [InpArg](#inparg) options take priority. The attribute value can also contain a RegEx string for the template ("/.../"). |
| blocked_template_attr | string | no | "data-blocked-template" | You can define this attribute on the field, and the validator will use its value as the validation template that does not allow entering a value which does not match the template. If a validation template is defined in the [InpArg](#inparg) options for the field, the [InpArg](#inparg) options take priority. The attribute value can also contain a RegEx string for the template ("/.../"). |
| start_validation_attr | string | no | "data-start-validation" | You can define this attribute on the field, and the validator will use its value to define on which event to perform validation. `input` - oninput, onchange, onsubmit; `change` - onchange, onsubmit. If `start_validation` is defined in the [InpArg](#inparg) options for the field, the [InpArg](#inparg) options take priority. |
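The default templates in `valid_type_template` are ordinary RegExps, so they can be sanity-checked directly; a quick sketch (the sample values are arbitrary):

```javascript
// Default templates from valid_type_template, reproduced here for illustration
const templates = {
  tel: /^[0-9\+\-\(\)]{8,16}$/,
  email: /^\w+([\.-]?\w+)*@\w+([\.-]?\w+)*(\.\w{2,3})+$/,
  password: /^[a-zA-Z0-9]{6,}$/
};

console.log(templates.tel.test("+38(044)1234567")); // true: 8-16 allowed characters
console.log(templates.email.test("user@example.com")); // true
console.log(templates.password.test("abc12")); // false: shorter than 6 characters
```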
### FormArg
| Property | Type | Required | Default | Description |
| -------- | :-----: | :------: | ------- | ----------- |
| element | string \| HTMLFormElement | yes | | form name or form element |
| submit_el | string \| [SubmitEl](#types) | no | | Submit element name or [SubmitEl](#types). Submit handler is attached to this element. |
| items | object | no | by default the validator takes all fields contained in the form | an object whose property names are field names and whose values are [InpArg](#inparg) options |
| start_validation | "input" \| "change" | no | from the [ValidationProp](#validationprop).*start_validation* | Defines on which event to perform validation. `input` - oninput, onchange, onsubmit; `change` - onchange, onsubmit; |
### InpArg
| Property | Type | Required | Default | Description |
| -------- | :-----: | :------: | ------- | ----------- |
| required | boolean | no | from field property `required` | |
| disabled | boolean | no | from field property `disabled` | |
| valid_template | [TemplateTypes](#types) | no | from the [ValidationProp](#validationprop).*valid_type_template* | template for validation |
| blocked_template | [TemplateTypes](#types) | no | from the [ValidationProp](#validationprop).*type_blocked_template* | blocks input that does not match the template |
| start_validation | "input" \| "change" | no | from the [FormArg](#formarg).*start_validation* | Defines on which event to perform validation. `input` - oninput, onchange, onsubmit; `change` - onchange, onsubmit; |
| callback | Function | no | | Fires each time when performed validation to the field |
### formData
| Property | Type | Description |
| -------- | :---: | ----------- |
| [string] | string \| number | each property name is a field name (INPUT or another element) and its value is the field's value |
| data | any | data which attached to the form using `setData()` method |
## Built With
- [TypeScript](https://www.typescriptlang.org/) v 3.5.3
## Author
- Yaroslav Levchenko
## License
This project is licensed under the MIT License - see the [LICENSE.md](License.md) file for details
| 78.646154 | 419 | 0.653658 | eng_Latn | 0.984466 |
40f043f230df3623533409ff6b31161ceac99cce | 862 | md | Markdown | README.md | Arisophy/django-searchview | 0898e171417366cf85666288ae9e2e44c173853f | [
"MIT"
] | 3 | 2021-01-12T19:27:11.000Z | 2021-09-27T11:53:06.000Z | README.md | Arisophy/django-searchview | 0898e171417366cf85666288ae9e2e44c173853f | [
"MIT"
] | null | null | null | README.md | Arisophy/django-searchview | 0898e171417366cf85666288ae9e2e44c173853f | [
"MIT"
] | null | null | null | # django-searchview
SearchView is a class built by multiple inheritance from FormView and ListView.
When I wanted to make a search page with Django's class-based views, the first choice was using FormView and ListView.
But I wanted to display the search form and the results list on one page.
I needed a new view class with multiple inheritance from FormView and ListView.
I searched for one, but I couldn't find a good solution. So I made it myself.
See below.
[DjangoSearchView/searchview](https://github.com/Arisophy/django-searchview/tree/master/DjangoSearchView/searchview)
The code is simple, less than 100 lines, but very useful.
## Install
pip install django-searchview-lib
[](https://pepy.tech/project/django-searchview-lib)
## How To Use
https://gijutsu.com/en/2020/12/29/django-searchview-class/
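The idea can be illustrated with a schematic sketch using stand-in base classes (not the real Django `FormView`/`ListView`; everything here is illustrative only):

```python
class FormMixinStub:
    """Stand-in for Django's FormMixin (illustrative only, not the real class)."""

    def get_form_data(self, request_params):
        # pretend the form validated the incoming query parameters
        return {key: value for key, value in request_params.items() if value}


class ListViewStub:
    """Stand-in for Django's ListView."""

    queryset = ["alpha", "beta", "gamma"]

    def get_queryset(self):
        return self.queryset


class SearchView(FormMixinStub, ListViewStub):
    """One view that handles the search form AND renders the result list."""

    def get(self, request_params):
        criteria = self.get_form_data(request_params)
        term = criteria.get("q", "")
        # filter the full queryset by the submitted search term
        return [item for item in self.get_queryset() if term in item]


view = SearchView()
print(view.get({"q": "al"}))  # ['alpha']
```

The real class works the same way in spirit: the form part validates the query, and the list part renders whatever queryset the form produced, all in one request cycle.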
| 34.48 | 118 | 0.779582 | eng_Latn | 0.925835 |
40f1911df4c27745f93bfe8667691e16243e59b5 | 151 | md | Markdown | Study_YellowNew/Ti_Board/TMS_F28379D/x-bar.md | stevenyellow/My_Markdown | c915bf4bc5ba7b9921457352dcfe5c335b1fc433 | [
"Apache-2.0"
] | 1 | 2021-09-04T01:47:30.000Z | 2021-09-04T01:47:30.000Z | Study_YellowNew/Ti_Board/TMS_F28379D/x-bar.md | yellow-new/My_Markdown | c915bf4bc5ba7b9921457352dcfe5c335b1fc433 | [
"Apache-2.0"
] | null | null | null | Study_YellowNew/Ti_Board/TMS_F28379D/x-bar.md | yellow-new/My_Markdown | c915bf4bc5ba7b9921457352dcfe5c335b1fc433 | [
"Apache-2.0"
] | null | null | null | This chip adds Input X-BAR and Output X-BAR features:
1. The Output X-BAR can route certain internal DSP signals out to any GPIO pin.
2. The Input X-BAR can route the signal of any GPIO pin into the DSP; for example, any GPIO signal can be configured as the TZ (trip-zone) input of the ePWM module. | 37.75 | 69 | 0.854305 | yue_Hant | 0.884909 |
40f19c66c9f090c8c47e35df75c10cf6c6a2468f | 922 | md | Markdown | docs/error-messages/compiler-errors-1/fatal-error-c1103.md | Erikarts/cpp-docs.es-es | 9fef104c507e48ec178a316218e1e581753a277c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/error-messages/compiler-errors-1/fatal-error-c1103.md | Erikarts/cpp-docs.es-es | 9fef104c507e48ec178a316218e1e581753a277c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/error-messages/compiler-errors-1/fatal-error-c1103.md | Erikarts/cpp-docs.es-es | 9fef104c507e48ec178a316218e1e581753a277c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Error irrecuperable C1103
ms.date: 11/04/2016
f1_keywords:
- C1103
helpviewer_keywords:
- C1103
ms.assetid: 9d276939-9c47-4235-9d20-76b8434f9731
ms.openlocfilehash: b6253af9fcf400321fb58d4d8a6d7aacf461b926
ms.sourcegitcommit: 6052185696adca270bc9bdbec45a626dd89cdcdd
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 10/31/2018
ms.locfileid: "50577650"
---
# <a name="fatal-error-c1103"></a>Fatal error C1103
fatal error importing progid: 'message'
The compiler detected a problem while importing a type library. For example, a type library cannot be specified by progid together with `no_registry`.
For more information, see the [#import directive](../../preprocessor/hash-import-directive-cpp.md).
The following sample generates C1103:
```
// C1103.cpp
#import "progid:a.b.id.1.5" no_registry auto_search // C1103
``` | 31.793103 | 177 | 0.785249 | spa_Latn | 0.725522 |
40f2119e48cd657522419818bf846acbe3a33775 | 1,456 | md | Markdown | README.md | SearchDream/MONO | 72b71a09869fe1e784e22b5695f2497d2a6a22b4 | [
"MIT"
] | 572 | 2018-04-18T15:27:00.000Z | 2022-02-26T04:07:40.000Z | README.md | SearchDream/MONO | 72b71a09869fe1e784e22b5695f2497d2a6a22b4 | [
"MIT"
] | 2 | 2018-06-08T09:41:29.000Z | 2020-02-13T06:28:28.000Z | README.md | SearchDream/MONO | 72b71a09869fe1e784e22b5695f2497d2a6a22b4 | [
"MIT"
] | 121 | 2018-06-05T05:54:17.000Z | 2021-06-24T06:50:22.000Z |
# MONO-iOS
 

For a detailed walkthrough, see the [blog post](https://www.jianshu.com/p/8b66d63abe1d).
## Project Overview
This project is written entirely in code (no storyboards). Only the Recommended section is implemented so far, and all API data was captured with Charles. This project is for learning and reference only and must not be used for commercial purposes; in case of infringement, please contact QQ: 1005834829.
## Changelog
• 2018.06.13: Added the first-launch video animation
## GIF Demo (large images, may load slowly)

## Feature Screenshots







## Requirements
* iOS 9+
* Xcode 8+
## License
MONO-iOS is released under the MIT license; see the [LICENSE](LICENSE) file for details.
## Finally
### Everyone is welcome to [star](https://github.com/xumaohuai/MONO) this project
| 46.967742 | 121 | 0.772665 | yue_Hant | 0.374061 |
40f24b5ba2f25c35909b842dde84b5e180cd276a | 585 | md | Markdown | README.md | StiwardSolano/t_contact | 5653f589e94cadcbe43e19b5becb26a1f27879f3 | [
"MIT"
] | null | null | null | README.md | StiwardSolano/t_contact | 5653f589e94cadcbe43e19b5becb26a1f27879f3 | [
"MIT"
] | null | null | null | README.md | StiwardSolano/t_contact | 5653f589e94cadcbe43e19b5becb26a1f27879f3 | [
"MIT"
] | null | null | null | # t_contact
Contact card AR with Unity & Vuforia
Vuforia registration: https://developer.vuforia.com/
Get a Development License Key from Vuforia and copy it.
Add a target = upload an image to Vuforia and download the database.
Unity editor: https://unity.com/es
=> Window -> Package Manager -> Vuforia -> Install
Import the database into Unity.
Add an AR Camera and edit the Vuforia configuration = paste the Development License Key copied earlier.
The labels can be created at: https://www.tinkercad.com/
Scripts: https://github.com/StiwardSolano/t_contact/tree/master/scripts
| 27.857143 | 104 | 0.786325 | spa_Latn | 0.447095 |
40f25e2979d74930174f57a919ab3e54f530ace3 | 314 | md | Markdown | CHANGELOG.md | beyondnetworks/aiogremlin | d3ba7dd8099755d4c1bcb69f6055c6063d7d146f | [
"Apache-2.0"
] | 3 | 2020-01-26T14:36:36.000Z | 2020-09-25T02:11:28.000Z | CHANGELOG.md | beyondnetworks/aiogremlin | d3ba7dd8099755d4c1bcb69f6055c6063d7d146f | [
"Apache-2.0"
] | 1 | 2020-10-05T16:37:49.000Z | 2020-10-27T15:23:39.000Z | CHANGELOG.md | beyondnetworks/aiogremlin | d3ba7dd8099755d4c1bcb69f6055c6063d7d146f | [
"Apache-2.0"
] | 6 | 2020-01-29T22:51:55.000Z | 2021-09-26T02:58:55.000Z | # AIO Gremlin Changelog
## v3.3.5
## v3.3.4
* Added missing dependency `inflection`.
## v3.3.3
* Fixed bug preventing code from running on python 3.7+, see [issue #1](https://git.qoto.org/goblin-ogm/aiogremlin/issues/1).
## v3.3.2
* Updated allowed dependency versions, gremlinpython 3.4.4 was causing issues.
| 19.625 | 125 | 0.710191 | eng_Latn | 0.839778 |
40f2e7a25086081ba922ec690830e66e71597222 | 2,207 | md | Markdown | themes/awesome/src/latex/README.md | ghoulmann/Fresh_Themes_Modified | ec5a93f2ba508014fa79940219e6c25654823fb1 | [
"MIT"
] | null | null | null | themes/awesome/src/latex/README.md | ghoulmann/Fresh_Themes_Modified | ec5a93f2ba508014fa79940219e6c25654823fb1 | [
"MIT"
] | null | null | null | themes/awesome/src/latex/README.md | ghoulmann/Fresh_Themes_Modified | ec5a93f2ba508014fa79940219e6c25654823fb1 | [
"MIT"
] | null | null | null | # Awesome CV
*A template-friendly fork of Byungjin Park's [Awesome-CV][awe] resume template.*
This fork replaces each occurrence of hard-coded resume data in the original
Awesome-CV project with an expandable template tag, allowing you to generate a
personalized copy of Awesome-CV without hand-editing the `.tex` files.
Otherwise, it's identical to the original.
## Use
When expanded with [Underscore.js][und] and a [FRESH resume][fre] object (JSON),
this project yields an Awesome-CV template, in LaTeX, but with your resume data.
You can then use a tool like `xelatex` or similar to generate the final PDF.
**Step 1**: Use a FRESH-aware template expansion tool like [HackMyResume][hmr]
to generate the personalized LaTeX:
```bash
hackmyresume build resume.json -t awesome
```
**Step 2**: Use conventional LaTeX tools to generate a PDF or other destination
format from the (personalized) Awesome-CV LaTeX files:
```bash
xelatex examples/resume.tex
```
**Step 3**: When you need to update your resume, update the source JSON, and
re-generate destination formats as needed.
## Notes
- This project is intermittently synced with the upstream master to get the
benefit of any new changes or tweaks.
- The template tags are based on the Underscore.js template syntax but with a
modified set of template delimiters that reduce conflicts with LaTeX
metacharacters such as `{` or `%`.
- Underscore.js was chosen because its minimalistic template engine supports
custom delimiters (Handlebars [doesn't][hb350], except [unofficially][un]) and
allows execution of arbitrary JavaScript in the template.
- This project indirectly drives the `awesome` theme in HackMyResume and
FluentCV. Technically the latter is a fork of this project, which itself is a
fork of the original Awesome-CV for LaTeX.
## License
See the original project for license information.
[awe]: https://github.com/posquit0/Awesome-CV
[hmr]: https://fluentdesk.com/hackmyresume
[fre]: http://freshstandard.org
[und]: http://underscorejs.org/#template
[hb350]: https://github.com/wycats/handlebars.js/issues/350
[un]: https://github.com/jonschlinkert/handlebars-delimiters
| 36.783333 | 81 | 0.753058 | eng_Latn | 0.982079 |
40f2ffd1cc2483573a09cfad4299569f0505c0eb | 1,815 | md | Markdown | _posts/2019-06-20-Automated-crater-shape-retrieval-using-weakly-supervised-deep-learning.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | 7 | 2018-02-11T01:50:19.000Z | 2020-01-14T02:07:17.000Z | _posts/2019-06-20-Automated-crater-shape-retrieval-using-weakly-supervised-deep-learning.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | null | null | null | _posts/2019-06-20-Automated-crater-shape-retrieval-using-weakly-supervised-deep-learning.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | 4 | 2018-02-04T15:58:04.000Z | 2019-08-29T14:54:14.000Z | ---
layout: post
title: "Automated crater shape retrieval using weakly-supervised deep learning"
date: 2019-06-20 20:04:08
categories: arXiv_CV
tags: arXiv_CV Segmentation Deep_Learning
author: Mohamad Ali-Dib, Kristen Menou, Chenchong Zhu, Noah Hammond, Alan P. Jackson
mathjax: true
---
* content
{:toc}
##### Abstract
Crater shape determination is a complex and time consuming task that so far has evaded automation. We train a state of the art computer vision algorithm to identify craters on the moon and retrieve their sizes and shapes. The computational backbone of the model is MaskRCNN, an "instance segmentation" general framework that detects craters in an image while simultaneously producing a mask for each crater that traces its outer rim. Our post-processing pipeline then finds the closest fitting ellipse to these masks, allowing us to retrieve the crater ellipticities. Our model is able to correctly identify 87% of known craters in the holdout set, while predicting thousands of additional craters not present in our training data. Manual validation of a subset of these craters indicates that a majority of them are real, which we take as an indicator of the strength of our model in learning to identify craters, despite incomplete training data. The crater size, ellipticity, and depth distributions predicted by our model are consistent with human-generated results. The model allows us to perform a large scale search for differences in crater diameter and shape distributions between the lunar highlands and maria, and we exclude any such differences with a high statistical significance.
##### Abstract (translated by Google)
##### URL
[http://arxiv.org/abs/1906.08826](http://arxiv.org/abs/1906.08826)
##### PDF
[http://arxiv.org/pdf/1906.08826](http://arxiv.org/pdf/1906.08826)
| 69.807692 | 1,294 | 0.795041 | eng_Latn | 0.995026 |
40f3599a9591a1d9197d74852c73e4eb0df43cac | 18 | md | Markdown | api/schema-definition-v2/event-data-definition/ieventdataroom.md | EnterpriseyIntranet/docs | e4c8dd0963a95e806ecd6357dc9bd0c3bbdeb03f | [
"MIT"
] | null | null | null | api/schema-definition-v2/event-data-definition/ieventdataroom.md | EnterpriseyIntranet/docs | e4c8dd0963a95e806ecd6357dc9bd0c3bbdeb03f | [
"MIT"
] | null | null | null | api/schema-definition-v2/event-data-definition/ieventdataroom.md | EnterpriseyIntranet/docs | e4c8dd0963a95e806ecd6357dc9bd0c3bbdeb03f | [
"MIT"
] | null | null | null | # IEventDataRoom
| 6 | 16 | 0.777778 | nld_Latn | 0.430405 |
40f61535136978b7eadd8d7189fdbf4c8c4b2acf | 753 | md | Markdown | README.md | sysit-solutions/website | 3e02ec0b3bde9a4cb07027000dfaebf9026dd738 | [
"BSD-2-Clause"
] | null | null | null | README.md | sysit-solutions/website | 3e02ec0b3bde9a4cb07027000dfaebf9026dd738 | [
"BSD-2-Clause"
] | null | null | null | README.md | sysit-solutions/website | 3e02ec0b3bde9a4cb07027000dfaebf9026dd738 | [
"BSD-2-Clause"
] | null | null | null | # Freenit Frontend
### [Tutorial](https://github.com/freenit-framework/frontend-tutorial)
This repo will be downloaded on `yarn create freenit <project>` and it will be cleaned up for your project. For example `name.ini` will contain the name you provided as `<project>`.
### Shell Scripts
To start development:
```
bin/init.sh
bin/devel.sh
```
If you use it with the backend, run it like this:
```
env BACKEND_URL=<url> bin/devel.sh
```
To run tests:
```
bin/init.sh
bin/test.sh
```
### CBSD/Reggae
To start development:
```
echo DEVEL_MODE=YES >vars.mk
make devel
```
To run it with the backend, consider running it with [Freenit Project](https://github.com/freenit-framework/freenit)
To run tests:
```
echo DEVEL_MODE=YES >vars.mk
make test
```
| 20.351351 | 181 | 0.719788 | eng_Latn | 0.933409 |
40f6df99d40ff6c17bcecc31a4d60470607502c2 | 1,839 | md | Markdown | AlchemyInsights/delete-or-restore-applications.md | isabella232/OfficeDocs-AlchemyInsights-pr.nl-NL | 3ac9621b1324fce87e8c9e694eed039a2ba7ecfe | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-05-19T19:07:21.000Z | 2020-05-19T19:07:21.000Z | AlchemyInsights/delete-or-restore-applications.md | MicrosoftDocs/OfficeDocs-AlchemyInsights-pr.nl-NL | 78f9639c9c0d7312823180d658477000ef88e9f5 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2022-02-09T06:56:14.000Z | 2022-02-09T06:56:22.000Z | AlchemyInsights/delete-or-restore-applications.md | isabella232/OfficeDocs-AlchemyInsights-pr.nl-NL | 3ac9621b1324fce87e8c9e694eed039a2ba7ecfe | [
"CC-BY-4.0",
"MIT"
] | 3 | 2019-10-09T20:32:49.000Z | 2021-10-09T10:48:57.000Z | ---
title: Toepassingen verwijderen of herstellen
ms.author: v-jmathew
author: v-jmathew
manager: scotv
ms.audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.custom:
- "9004335"
- "7737"
ms.openlocfilehash: 4df9a98644f6bc7a30f9009719c5198db591afc9
ms.sourcegitcommit: eb685eea3ab312d404d55bfd5594a5d6d68811d1
ms.translationtype: MT
ms.contentlocale: nl-NL
ms.lasthandoff: 01/27/2021
ms.locfileid: "50014796"
---
# <a name="delete-or-restore-applications"></a>Toepassingen verwijderen of herstellen
**Een toepassing verwijderen uit een Azure AD-Tenant**:
1. Selecteer **Enterprise toepassingen** in de **Azure AD-Portal**. Zoek en selecteer de toepassing die u wilt verwijderen.
2. Selecteer in de sectie **beheren** van het linkerdeelvenster **Eigenschappen**.
3. Selecteer **verwijderen** en selecteer vervolgens **Ja** om te bevestigen dat u de app wilt verwijderen uit uw Azure AD-Tenant.
Zie voor meer informatie over het verwijderen van een app [: een toepassing verwijderen uit uw Azure Active Directory (Azure AD)-Tenant](https://docs.microsoft.com/azure/active-directory/manage-apps/delete-application-portal#delete-an-application-from-your-azure-ad-tenant).
In PowerShell worden toepassings proxy configuraties uit een bepaalde toepassing in azure Active Directory verwijderd uit PowerShell en [kan de toepassing](https://docs.microsoft.com/powershell/module/azuread/remove-azureadapplicationproxyapplication) helemaal verwijderen, indien opgegeven.
U kunt **een verwijderde toepassing herstellen** met PowerShell. Wanneer de toepassing die u wilt herstellen is vastgesteld, kunt u deze herstellen met [Restore-AzureADDeletedApplication](https://docs.microsoft.com/powershell/module/azuread/restore-azureaddeletedapplication).
| 52.542857 | 291 | 0.811854 | nld_Latn | 0.972918 |
40f7a8285a49d009324b95c89fc323cbc3dc65ad | 426 | md | Markdown | markdown/Conflicts/1459.md | akaalias/plotto-for-obsidian | f4627272b018a76ac772f1ed2b9b3747928eb636 | [
"MIT"
] | null | null | null | markdown/Conflicts/1459.md | akaalias/plotto-for-obsidian | f4627272b018a76ac772f1ed2b9b3747928eb636 | [
"MIT"
] | null | null | null | markdown/Conflicts/1459.md | akaalias/plotto-for-obsidian | f4627272b018a76ac772f1ed2b9b3747928eb636 | [
"MIT"
] | null | null | null | ## 1459
- Previous: [[1296]] [[1304]] [[1308]]
- B finds an old notebook, X, which contains the record of a dishonest transaction
- B finds an old note book X, which contains information that has an important bearing on some of her plans
- Next: [[1204]] [[1244 | 1244b]] [[1279 | 1279b]]
## B Clause
- Becoming Aware of an Important Secret that Calls for Decisive Action
## Group
- Enterprise
### Subgroup
- Revelation
| 26.625 | 107 | 0.699531 | eng_Latn | 0.996146 |
40f7bda5f04a01b3d21d400ec42169f8e40a71f3 | 284 | md | Markdown | README.md | Warchant/dumbdb | 7b5f0c5b70aa4bc3f58a2d876254f2a84762cd73 | [
"MIT"
] | null | null | null | README.md | Warchant/dumbdb | 7b5f0c5b70aa4bc3f58a2d876254f2a84762cd73 | [
"MIT"
] | null | null | null | README.md | Warchant/dumbdb | 7b5f0c5b70aa4bc3f58a2d876254f2a84762cd73 | [
"MIT"
] | null | null | null | # dumbdb
Features:
- single file
- header only
- dependency free
- key-value database
- C++11
- slow and inefficient, but very simple to use
- very portable
Internally, implemented as folder with all Key-Value pairs, where Key is a filename, and Value is file content ¯\\\_(ツ)\_/¯
| 20.285714 | 123 | 0.728873 | eng_Latn | 0.998962 |
40f867c5b247d7792b0a0fe72f17f4d14cfa623c | 3,373 | md | Markdown | add/metadata/System.Web/HttpServerUtilityWrapper.meta.md | kcpr10/dotnet-api-docs | b73418e9a84245edde38474bdd600bf06d047f5e | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-06-16T22:24:36.000Z | 2020-06-16T22:24:36.000Z | add/metadata/System.Web/HttpServerUtilityWrapper.meta.md | kcpr10/dotnet-api-docs | b73418e9a84245edde38474bdd600bf06d047f5e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | add/metadata/System.Web/HttpServerUtilityWrapper.meta.md | kcpr10/dotnet-api-docs | b73418e9a84245edde38474bdd600bf06d047f5e | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-04-08T14:42:27.000Z | 2019-04-08T14:42:27.000Z | ---
uid: System.Web.HttpServerUtilityWrapper
---
---
uid: System.Web.HttpServerUtilityWrapper.MachineName
---
---
uid: System.Web.HttpServerUtilityWrapper.ClearError
---
---
uid: System.Web.HttpServerUtilityWrapper.CreateObjectFromClsid(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.Transfer
---
---
uid: System.Web.HttpServerUtilityWrapper.Transfer(System.Web.IHttpHandler,System.Boolean)
---
---
uid: System.Web.HttpServerUtilityWrapper.UrlDecode(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.TransferRequest(System.String,System.Boolean,System.String,System.Collections.Specialized.NameValueCollection,System.Boolean)
---
---
uid: System.Web.HttpServerUtilityWrapper.CreateObject(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.UrlEncode(System.String,System.IO.TextWriter)
---
---
uid: System.Web.HttpServerUtilityWrapper.HtmlEncode(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.HtmlDecode(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.Transfer(System.String,System.Boolean)
---
---
uid: System.Web.HttpServerUtilityWrapper.HtmlEncode
---
---
uid: System.Web.HttpServerUtilityWrapper.TransferRequest(System.String,System.Boolean,System.String,System.Collections.Specialized.NameValueCollection)
---
---
uid: System.Web.HttpServerUtilityWrapper.Transfer(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.UrlPathEncode(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.Execute(System.String,System.IO.TextWriter)
---
---
uid: System.Web.HttpServerUtilityWrapper.Execute(System.Web.IHttpHandler,System.IO.TextWriter,System.Boolean)
---
---
uid: System.Web.HttpServerUtilityWrapper.UrlTokenDecode(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.ScriptTimeout
---
---
uid: System.Web.HttpServerUtilityWrapper.Execute
---
---
uid: System.Web.HttpServerUtilityWrapper.UrlDecode
---
---
uid: System.Web.HttpServerUtilityWrapper.Execute(System.String,System.Boolean)
---
---
uid: System.Web.HttpServerUtilityWrapper.UrlEncode(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.#ctor(System.Web.HttpServerUtility)
---
---
uid: System.Web.HttpServerUtilityWrapper.UrlDecode(System.String,System.IO.TextWriter)
---
---
uid: System.Web.HttpServerUtilityWrapper.GetLastError
---
---
uid: System.Web.HttpServerUtilityWrapper.UrlEncode
---
---
uid: System.Web.HttpServerUtilityWrapper.CreateObject
---
---
uid: System.Web.HttpServerUtilityWrapper.TransferRequest(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.Execute(System.String,System.IO.TextWriter,System.Boolean)
---
---
uid: System.Web.HttpServerUtilityWrapper.TransferRequest
---
---
uid: System.Web.HttpServerUtilityWrapper.TransferRequest(System.String,System.Boolean)
---
---
uid: System.Web.HttpServerUtilityWrapper.HtmlDecode
---
---
uid: System.Web.HttpServerUtilityWrapper.HtmlDecode(System.String,System.IO.TextWriter)
---
---
uid: System.Web.HttpServerUtilityWrapper.UrlTokenEncode(System.Byte[])
---
---
uid: System.Web.HttpServerUtilityWrapper.CreateObject(System.Type)
---
---
uid: System.Web.HttpServerUtilityWrapper.MapPath(System.String)
---
---
uid: System.Web.HttpServerUtilityWrapper.HtmlEncode(System.String,System.IO.TextWriter)
---
---
uid: System.Web.HttpServerUtilityWrapper.Execute(System.String)
---
| 20.567073 | 166 | 0.77646 | yue_Hant | 0.579464 |
40f869bd72d54f917ea3074487904e583becec03 | 3,068 | md | Markdown | README.md | kenkit/auth | 398be3dfcf9e3ad98442a9d4e937bf05bcb7da06 | [
"Zlib"
] | 14 | 2015-09-26T22:48:29.000Z | 2017-09-20T18:57:30.000Z | README.md | r-lyeh/auth | c4bf302c6b6dd6a052ef09c2ca34f3b94482b2e7 | [
"Zlib"
] | 1 | 2019-11-20T12:42:03.000Z | 2019-11-21T08:23:37.000Z | README.md | r-lyeh/auth | c4bf302c6b6dd6a052ef09c2ca34f3b94482b2e7 | [
"Zlib"
] | 9 | 2015-09-26T17:13:04.000Z | 2017-09-20T18:57:31.000Z | Auth <a href="https://travis-ci.org/r-lyeh/auth"><img src="https://api.travis-ci.org/r-lyeh/auth.svg?branch=master" align="right" /></a>
====
- Auth is a simple, lightweight and safe client-server authentication system. Written in C++11
- Auth features decoupled salt, encryption and network handshaking from implementation.
- Auth is tiny. Header-only.
- Auth is cross-platform.
- Auth is self-contained. No dependencies.
- Auth is zlib/libpng licensed.
### Sample
```c++
int main() {
auth::session at_client( "[email protected]", "sesame", "@pc-workstation" );
auth::session at_server( "[email protected]", "sesame", "@server" );
// similar sessions, not equal until public_key is assigned
assert( at_client != at_server );
at_client.set_public_key( at_server.get_public_key() );
assert( at_client == at_server );
// mutate passphrasses
for( int i = 0; i < rand(); ++i ) {
at_client.mutate(); assert( at_client != at_server );
at_server.mutate(); assert( at_client == at_server );
}
// debug
std::cout << at_client << std::endl;
std::cout << at_server << std::endl;
std::cout << "All ok." << std::endl;
}
```
### Possible output
```
[session:0034FDC0] {
.valid=1
.timestamp=873971735
[email protected];@pc-workstation
[email protected]
.pass=3062624283
.public_key=554326941
.passphrase=4017519821
}
[session:0034FD40] {
.valid=1
.timestamp=873971735
[email protected];@server
[email protected]
.pass=3062624283
.public_key=554326941
.passphrase=4017519821
}
All ok.
```
### API design
- You can compare sessions for equality and sort them, or insert them in a map.
- Sessions are not equal unless they have the same `user` and `passphrase`.
- A passphrase is made of `pass` and `public_key`.
- A passphrase can mutate on both sides to change encryption on the fly.
- A server can hold different sessions that refer to the same user at the same time, ie when logging from different computers.
- Public keys can be sent thru insecure networks.
### API
- `void setup( string name, string pass, [string context], [string public_key] )` @todoc
- `void touch()` @todoc
- `bool is_timedout() const` @todoc
- `bool is_valid() const` @todoc
- `void invalidate()` @todoc
- `void reset()` @todoc
- `void mutate()` @todoc
- `void set_user_info( string name, string pass )` @todoc
- `string get_user_name() const` @todoc
- `string get_user_context() const` @todoc
- `string get_passphrase() const` @todoc
- `void set_public_key( string public_key )` @todoc
- `string get_public_key()` @todoc
- `size_t get_timestamp()`, @todoc
### How-to: Building
- Implement namespace `auth::provider` somewhere both in client and server code.
- Check [provided sample](sample.cc) for a brief reference implementation.
### Complementary links
- https://github.com/r-lyeh/vault to handle ARC4 en/decryption.
- https://github.com/r-lyeh/cocoa to handle SHA1/CRC32 hashes.
- https://github.com/r-lyeh/sand to handle time and timestamps.
### Appendix: creating a strong solid login system
Check [related appendix](README-APPENDIX.md)
| 31.958333 | 136 | 0.703716 | eng_Latn | 0.739478 |
40f907c0b7cd482015cf4b7e8e69abdfa926f783 | 130 | md | Markdown | README.md | jethrodaniel/exercism | 81ed0acbb65bccd09334229c2c31235355bd38bc | [
"MIT"
] | null | null | null | README.md | jethrodaniel/exercism | 81ed0acbb65bccd09334229c2c31235355bd38bc | [
"MIT"
] | null | null | null | README.md | jethrodaniel/exercism | 81ed0acbb65bccd09334229c2c31235355bd38bc | [
"MIT"
] | null | null | null | # exercism

Some fun code exercises from http://exercism.io
| 21.666667 | 68 | 0.761538 | eng_Latn | 0.260725 |
40f90ae788ad336ab91f642194f475d3e34aa2da | 6,435 | md | Markdown | README.md | igorskyflyer/npm-zep | 3171efd16d904bdf0c320566aec6d91259349ad4 | [
"MIT"
] | null | null | null | README.md | igorskyflyer/npm-zep | 3171efd16d904bdf0c320566aec6d91259349ad4 | [
"MIT"
] | 26 | 2021-07-15T19:36:07.000Z | 2021-08-08T00:20:07.000Z | README.md | igorskyflyer/npm-zep | 3171efd16d904bdf0c320566aec6d91259349ad4 | [
"MIT"
] | null | null | null | ## `Zep()`
<sub>Your personal (de)bouncer 💪🦸♂️</sub>
<br>
🧠 `Zep()` is a zero-dependency, efficient debounce module. ⏰
> Why `Zep()`? Because `Zep()` allows you to create time-invoked callbacks with _deferred_ execution! `Zep()` does debouncing in a **very efficient** manner by creating only 1 Timer\* - provided by `setInterval`. Typical use cases: waiting until a user has finished typing before processing their input; throttling a 3rd-party API that calls an event handler too often; or reducing workload when your event handler performs intensive computation. It limits the rate at which a function/handler can be fired/triggered, thus increasing the performance/responsiveness of your product.
<sub>\* other debounce functions/modules create dozens, even hundreds of Timers in order to provide the same functionality.</sub>
<br>
<br>
✨ Since `v.5.0.0` `Zep()` is a hybrid module that supports both CommonJS (legacy) and ES modules, thanks to [Modern Module](https://github.com/igorskyflyer/npm-modern-module).
<br>
> Note, since `v.3.0.0`, setters for event handling properties were converted to methods that accept your event handler and return the current instance of `Zep()`, useful for chained calls, see [ZepEventHandler](#zep-eventhandler) and its benefits.
<br>
### Usage
Install it by running:
```shell
npm i "@igor.dvlpr/zep"
```
<br>
### API
<br>
**Types**
```ts
type ZepCallback {
self: Zep,
args: ...*
}
```
<br>
<a id="zep-eventhandler"></a>
```ts
type ZepEventHandler {
self: Zep
}
```
<br>
```ts
type ZepErrorHandler {
self: Zep,
error: Error
}
```
<br>
**Methods**
<br>
```js
constructor(callback: ZepCallback, [time: number]): Zep
```
Creates a new instance of `Zep()`, this is where you should define your function/callback that will be debounced - when needed. If you don't define the `time` parameter or `time <= 0` your `callback` will be called immediately without ever being debounced. You can have as many arguments in your `callback` function as you want.
Since `v.4.0.0` event handlers have changed. Their first parameter is always `self: Zep` which is a self-reference to the current Zep object that triggered the event handler.
```js
const { Zep } = require('@igor.dvlpr/zep')
// pass an arrow function
const zep = new Zep((self, value, item) => {
// code to limit its execution rate
}, 1500)
function myFunction(self, value) {
/* some code */
}
// or an existing function
const zep = new Zep(myFunction, 1500)
// You can have as many arguments in your callback function as you want.
```
<br>
`onCancelled(handler: ZepEventHandler): Zep` - a handler to call when the execution of `Zep.run()` has been cancelled. See also, [`Zep.cancel()`](#zep-cancel).
<br>
`onAborted(handler: ZepEventHandler): Zep` - a handler to call when the execution of `Zep.run()` has been aborted. See also, [`Zep.abort()`](#zep-abort).
<br>
`onBeforeRun(handler: ZepEventHandler): Zep` - a handler to call before each call to your `callback`.
<br>
`onAfterRun(handler: ZepEventHandler): Zep` - a handler to call after each call to your `callback`.
<br>
`onCompleted(handler: ZepEventHandler): Zep` - a handler to call after `Zep()` has finished running `===` no more calls to the `Zep.run()` in the given time-frame.
<br>
`onError(handler: ZepEventHandler, error: Error): Zep` - a handler to call when an error has occurred during execution.
<br>
<a id="zep-abort"></a>
`abort(): void` - aborts the execution, stops Zep completely and - if applicable - the current running Timer without waiting for it to finish its execution. See also [`Zep.cancel()`](#zep-cancel).
<br>
<a id="zep-cancel"></a>
`cancel(): void` - stops the execution but **NOT** the current running Timer - if applicable. See also [`Zep.abort()`](#zep-abort).
<br>
`run(...args): void` - runs your `callback` defined in the constructor if necessary, or else debounces it. You can pass any number of arguments to this method and they will be available in your `callback`. This method should be passed as the event handler.
<br>
`writeStats(): void` - writes `Zep()` statistical information to the `console`; sample output:
> `[Zep]`: invocations: 500, callback executions: 32, saving of 93.60% calls.
☝ This means that the event was triggered **500** times, but `Zep()` debounced it and only executed its handler **32** times instead; the handler was called **93.60%** less often than it would have been without `Zep()`.
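To make the arithmetic behind that statistic concrete, here is a minimal, manually-clocked debouncer sketch. It only illustrates the idea: `FakeClock`, `makeDebounced`, and the single-timer rule are assumptions of this demo, not Zep's actual implementation.

```javascript
// A tiny debouncer with an injectable clock so invocation/execution
// counts can be verified without real timers.
class FakeClock {
  constructor() {
    this.now = 0
    this.timers = []
  }
  setTimeout(fn, ms) {
    this.timers.push({ due: this.now + ms, fn })
  }
  advance(ms) {
    this.now += ms
    const ready = this.timers.filter((t) => t.due <= this.now)
    this.timers = this.timers.filter((t) => t.due > this.now)
    ready.forEach((t) => t.fn())
  }
}

function makeDebounced(callback, time, clock) {
  let pendingArgs = []
  let waiting = false
  const stats = { invocations: 0, executions: 0 }
  const run = (...args) => {
    stats.invocations += 1
    pendingArgs = args
    if (!waiting) {
      // like Zep's isWaiting: only one timer at a time
      waiting = true
      clock.setTimeout(() => {
        waiting = false
        stats.executions += 1
        callback(...pendingArgs)
      }, time)
    }
  }
  return { run, stats }
}

const clock = new FakeClock()
const { run, stats } = makeDebounced(() => {}, 1500, clock)
for (let i = 0; i < 500; i += 1) run(i) // 500 rapid invocations
clock.advance(1500) // the single pending timer fires once
const saving = (1 - stats.executions / stats.invocations) * 100
console.log(stats.invocations, stats.executions, saving.toFixed(2)) // → 500 1 99.80
```

In real usage the events are spread out over time, so several debounce windows elapse and you get counts like the 500/32 sample above; here all 500 invocations land inside one window, so a single execution serves them all.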
<br>
<br>
**Properties**
`executionCount: number` - returns the number of callback executions.
<br>
`isWaiting: boolean` - indicates whether `Zep()` is waiting for a Timer to finish its execution, if `true`, `Zep.run()` won't create new Timers when called.
<br>
`isRunning: boolean` - indicates whether a Timer is currently running your `callback`.
<br>
`wasCancelled: boolean` - indicates whether the execution of `Zep.run()` was cancelled. Execution can be cancelled by calling [`Zep.cancel()`](#zep-cancel).
<br>
`wasAborted: boolean` - indicates whether the execution of `Zep.run()` was aborted. Execution can be aborted by calling [`Zep.abort()`](#zep-abort).
<br>
### Example
```js
const { Zep } = require('@igor.dvlpr/zep')

// pass an arrow function
const zep = new Zep((self, value, item) => {
  // code to limit its execution rate
}, 1500)

// then pass Zep's run() method to the event instead of the original function

// code
const picker = vscode.window.createQuickPick()

// this is by default triggered each time a user types a character inside the QuickPick
picker.onDidChangeValue((e) => {
  zep.run(e)
})

// due to the nature of JavaScript the following WON'T WORK:
// when you pass a class method as a parameter, that method will get
// detached from the class and lose track of <this>, which will be
// globalThis/undefined, thus resulting in an error
// picker.onDidChangeValue(zep.run)

// but you could use either of the following 2 techniques

// ****
function changeHandler() {
  zep.run()
}

// and then use that wrapper-function
picker.onDidChangeValue(changeHandler)
// ****

// or

// ****
const boundChangeHandler = zep.run.bind(zep)
picker.onDidChangeValue(boundChangeHandler)
// ****

// by using Zep we can wait for the user to finish their input:
// if they haven't typed a single letter, i.e. the onDidChangeValue wasn't
// triggered for 1500ms (1.5s), we assume they finished typing

// more code
```
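The `<this>` detachment gotcha described in the comments above is easy to demonstrate in plain JavaScript, independent of Zep or VS Code. The `Counter` class below is a hypothetical stand-in for any object whose method you pass around:

```javascript
class Counter {
  constructor() {
    this.count = 0
  }
  increment() {
    this.count += 1
    return this.count
  }
}

const counter = new Counter()

// Passing the method around detaches it from its object...
const detached = counter.increment
let threw = false
try {
  detached() // `this` is undefined here (class bodies are strict mode)
} catch (e) {
  threw = true
}

// ...while a wrapper function or bind() keeps `this` intact.
const wrapped = () => counter.increment()
const bound = counter.increment.bind(counter)
wrapped()
bound()
console.log(threw, counter.count) // → true 2
```

This is exactly why the docs recommend a wrapper function or `bind()` when handing `zep.run` to an event.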
# mute-im
Slack im muter, this is social and the human society.
# Landing_page
Landing page (HTML, CSS, and JavaScript)
---
layout: post
title: "New age of Bastion"
author: Written by Mahmut Bulut, and edited by Jeremy Lempereur
---
**Disclaimer:** This post is not only an announcement of the new features of Bastion that ship with 0.3.4; it also lays out our future perspective for the project.
We have come a long way since the project started in June 2019. Right now, we are evolving from a runtime that enables fault-recovery for your server-side applications into an agnostic runtime, one which has its own executor but also works with other executors. Since the beginning we have been able to interoperate with [async-std](https://async.rs) without hassle.
We started with the idea of a runtime that can recover from failures in the application domain, no matter what happens. Async/await landed, and we got stronger and better known in the community. Then Bastion was deployed to AWS Lambdas, where it did, and still does, its job in data processing pipelines. In various in-memory benchmarks, it performed well on plenty of sparse queries and aggregations over NYC taxi data.
Where futures-executor performs:
```
test count_aggregation ... bench: 108,916,527 ns/iter (+/- 11,909,528)
```
Bastion performed:
```
test count_aggregation ... bench: 87,096,485 ns/iter (+/- 3,874,812)
```
Everything started with the idea of replicating what Erlang does in its runtime. The idea of using the runtime for data processing came right after.
I didn't aim for a faster runtime, but I aimed to exploit all available resources whenever possible. Bastion also doesn't aim to replace every other existing runtime. It just unfolds a different approach to designing systems. This system design is very graspable for people who come from a functional background like Scala, Erlang, or Haskell. When we released Bastion 0.1 in June 2019 with the Tokio runtime, I realized that this was not the way the project could continue. With premature design assumptions about the runtime, I struggled to develop the project on top of Tokio. Having to implement a monolithic runtime that lacked reduction-based concurrency principles made me write an executor for Bastion. So we ended up having a multiple-run-queue-based, redundancy-statistics-sampled async executor, which is the primer of the runtime. Over time, people also wanted to use blocking tasks. So we came up with the idea of an adaptive blocking pool, which can [automatically scale its resources based on the throughput](https://docs.rs/bastion-executor/0.3.4/bastion_executor/blocking/index.html) and prevents IO hogs.
## What is next with Bastion?
* We are currently working on being runtime agnostic: working everywhere, but still supplying our own executor to the people who want fault-tolerant systems in their projects.
* We are exploring the HCS (hot code swap) use case, and we currently have [design discussions](https://github.com/bastion-rs/bastion/issues/103) going on. We would be really happy to gather your feedback and ideas on HCS!
* WASM is a rising star in the ecosystem. We would like to support it: first we want to make everything `#[no_std]`, then we want to move toward making Bastion projects compile for WASM.
* We are working on Bastion Streams! A streaming framework implementation that is backpressure-aware and works with a push-based API. Many people are asking how this will be implemented, and I have pretty concrete ideas about it, especially from the perspective of Rust! So now I am quite a bit busy enabling parallel streams in Rust. **Not with [this](https://docs.rs/futures/0.3.1/futures/stream/trait.Stream.html), or [this](https://rust-lang.github.io/async-book/05_streams/01_chapter.html), or via [this](https://rust-lang.github.io/async-book/06_multiple_futures/03_select.html)**. Come join us if you find this interesting!
* Distributed properties still need to be solved. We are slow on that side, but we wrote an initial epidemic protocol implementation.
We aren't moving too fast on this, but an initial epidemic protocol implementation can be seen and used in our [Artillery](http://github.com/bastion-rs/artillery) repository.
## New version is in here! Tell me more!
We have added a number of features in `0.3.4`; you can see them in our [CHANGELOG](https://github.com/bastion-rs/bastion/blob/master/CHANGELOG.md). Below I will showcase a bunch of the most interesting features of the new Bastion version!
------
**At Bastion itself, at the surface API level, here are our changes:**
**0-** We added restart strategies: linear and exponential backoff between restarts, and a maximum number of restarts for failed children. Check out our [example in the repo](https://github.com/bastion-rs/bastion/blob/master/bastion/examples/restart_strategy.rs). Thanks to [@Relrin](https://github.com/Relrin) for the clean cut of their PR!
```rust
// Here we are specifying the used restart strategy for our supervisor.
// By default the bastion's supervisors are always trying to restart
// failed actors with unlimited amount of tries.
//
// At the beginning we're creating a new instance of RestartStrategy
// and then provides a policy and a back-off strategy.
let restart_strategy = RestartStrategy::default()
    // Set the limits for supervisor, so that it could stop
    // after 3 attempts. If the actor can't be started, the supervisor
    // will remove the failed actor from tracking.
    .with_restart_policy(RestartPolicy::Tries(3))
    // Set the desired restart strategy. By default supervisor will
    // try to restore the failed actor as soon as possible. However,
    // in our case we want to restart with a small delay between the
    // tries. Let's say that we want a regular time interval between the
    // attempts which is equal to 1 second.
    .with_actor_restart_strategy(ActorRestartStrategy::LinearBackOff {
        timeout: Duration::from_secs(1),
    });

// Define the supervisor
supervisor
    // That uses our restart strategy defined earlier
    .with_restart_strategy(restart_strategy)
    // And tracks the child group, defined in the following function
    .children(|children| failed_actors_group(children))
```
**1-** Now we have a signature method to address all the actors in the system. Check our context addressing system [here](https://docs.rs/bastion/0.3.4/bastion/context/struct.BastionContext.html#method.signature)
```rust
Bastion::children(|children| {
    children.with_exec(|ctx: BastionContext| {
        async move {
            ctx.tell(&ctx.signature(), "Hello to myself");
            Ok(())
        }
    })
}).expect("Couldn't create the children group.");
```
**2-** We've added meta-architecture instantiation, so you can have a boilerplate-free hierarchy for your actors and/or workers.
```rust
let children = children! {
    // the default redundancy is 1
    redundancy: 100,
    action: |msg| {
        // do something with the message here
    },
};

// simpler supervisor declaration.
let sp = supervisor! {};

let children = children! {
    supervisor: sp,
    redundancy: 10,
    action: |msg| {
        // do something with the message here
    },
};
```
**3-** You can now mark your blocking code and execute it inside Bastion's blocking adaptive thread pool.
```rust
let task = blocking! {
    thread::sleep(time::Duration::from_millis(3000));
};
```
**4-** If you want to run a process and wait for its completion, you can call `run!`. `run` is equivalent to what other Rust frameworks call `block_on`. We decided to call it `run` because that's the name implementations in other languages use. We think sticking to cross-language conventions will avoid confusion among users, regardless of their favorite language.
With the blocking task written above, we can use:
```rust
run!(task);
```
And that's it! This is the equivalent of [zio running effects](https://zio.dev/docs/overview/overview_running_effects) in a Rust environment. Now we have it!
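For readers wondering what `run!`/`block_on` amounts to under the hood, a minimal single-threaded version can be written with nothing but the standard library. This is the classic `std::task::Wake` sketch, not Bastion's actual executor (which schedules over multiple run queues):

```rust
use std::future::Future;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};
use std::thread::{self, Thread};

/// Wakes the parked thread when the future can make progress again.
struct ThreadWaker(Thread);

impl Wake for ThreadWaker {
    fn wake(self: Arc<Self>) {
        self.0.unpark()
    }
}

/// Drive a future to completion on the current thread: poll it once,
/// and park the thread until the waker says polling is worthwhile.
pub fn run<F: Future>(future: F) -> F::Output {
    let mut future = Box::pin(future);
    let waker = Waker::from(Arc::new(ThreadWaker(thread::current())));
    let mut cx = Context::from_waker(&waker);
    loop {
        match future.as_mut().poll(&mut cx) {
            Poll::Ready(output) => return output,
            Poll::Pending => thread::park(),
        }
    }
}
```

`run(async { 1 + 1 })` evaluates to `2`. Real executors differ mainly in how `Pending` futures are queued, woken, and stolen across threads.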
-------
**The Executor, our huge steampunk engine**
This is the core, the piece everything is running on top of.
It has diverged significantly and is quite different from any runtime out there.
The approach we took is very different from the other frameworks in the ecosystem.
What are the differences? Let's have a look!
**0-** When under contention we skip work-stealing. We directly force queue consumers to handle messages coming from the run queue and continue working without work-stealing for a small fraction of time. Think of it as a congestion-unblocking mechanism. It works pretty well so far!
**1-** Bastion's executor relies mostly on interconnect messages and tries to do single clock advancement without OS migration of threads. We are pinning our threads, which means we need to be aware of how many vCPUs Bastion is running on, and we maintain thread pinning on the fly. A lot of effort was put into our pinning API in this release.
**2-** We have a blocking pool based on the paper [A Novel Predictive and Self-Adaptive Dynamic Thread Pool Management](https://ieeexplore.ieee.org/document/5951889); you can see the implementation's documentation [here](https://docs.rs/bastion-executor/0.3.4/bastion_executor/blocking/index.html). That's our blocking congestion prevention mechanism, and it enables high throughput for blocking operations under a variety of cases, ranging from long-running to spike-oriented, bursting, and decaying task timing behaviors.
Moreover, this thread pool is also [configurable with an environment variable](https://github.com/bastion-rs/bastion/pull/141/files#diff-d7ca8f060f95b61604b5bd4036546196R334-R349) for constrained environments.
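The core loop of such a predictive pool (sample recent demand, smooth it, then scale up or down) can be boiled down to a couple of pure functions. This is a deliberately simplified sketch of the idea; the function names and the thresholding rule are assumptions here, and Bastion's pool implements the paper's fuller model:

```rust
/// Exponentially weighted moving average: smooths noisy samples of
/// "blocking tasks submitted per tick" into a trend we can act on.
pub fn ema(previous: f64, sample: f64, alpha: f64) -> f64 {
    alpha * sample + (1.0 - alpha) * previous
}

/// One sizing step: grow while smoothed demand trends upward, shrink
/// when it trends downward, and always stay within [1, max_threads].
pub fn next_pool_size(current: usize, trend_now: f64, trend_before: f64, max_threads: usize) -> usize {
    if trend_now > trend_before {
        (current + 1).min(max_threads)
    } else if trend_now < trend_before {
        current.saturating_sub(1).max(1)
    } else {
        current
    }
}
```

Feeding the smoothed frequency of incoming blocking tasks into `next_pool_size` on every tick grows the pool during bursts and lets it decay back once a spike passes, which is how IO hogs are kept from starving the async workers.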
**3-** We improved performance by making fewer syscalls when yielding. We now passively look for an opportunity to run tasks, and yield only once, as soon as possible.
**4-** Added convenience methods for the blocking pool with [spawn_blocking](https://docs.rs/bastion-executor/0.3.4/bastion_executor/blocking/fn.spawn_blocking.html), in addition to the existing [spawn](https://docs.rs/bastion-executor/0.3.4/bastion_executor/pool/fn.spawn.html) and [run](https://docs.rs/bastion-executor/0.3.4/bastion_executor/run/fn.run.html). If you don't want any supervision and restart mechanisms, and are just looking for a quick dive into Bastion, add `bastion-executor` to your `Cargo.toml` file and you can start spawning sync and async tasks in no time!
------
Bastion uses [LightProc](https://docs.rs/lightproc/0.3.4/lightproc/lightproc/index.html), and it's the crucial implementation that makes fault tolerance easier for the applications.
**In LightProc we have added:**
An interface for storing state! This will be used to persist messages or data in memory and can be backed by anything that you can define.
```rust
use lightproc::proc_stack::ProcStack;

pub struct GlobalState {
    pub amount: usize
}

ProcStack::default()
    .with_pid(1)
    .with_state(GlobalState { amount: 1 });
```
From Rust HashMaps to your NoSQL backend, [wire up anything](https://docs.rs/lightproc/0.3.4/lightproc/proc_stack/struct.ProcStack.html#method.with_state) to your processes. Soon we will expose convenience APIs for mailbox preservation across restarts; [@onsails](https://github.com/onsails) is working on it [here](https://github.com/bastion-rs/bastion/pull/143).
## A Big Thanks
Yes, a big thanks to everyone who enabled this project to run, supported it, wrote code, and made it production grade in plenty of aspects. Thanks for everything, and thanks for paving the road with us! We are getting bigger, and I really like how passionate people are about what we are trying to do here! I can't thank enough everyone who shared their ideas on [our Discord channel](https://discord.gg/DqRqtRT), on issues, and over the PRs! Thank you all, let's continue building together! 2020 is a year in which we (as the Rust community) will unite over plenty of subjects and topics. I would like to close my thanks with these words (though this was a famous US Marshall Plan propaganda slogan, the sentence pretty much sums up all of us as people working on Rust): **Whatever the weather we must move together!**
## Conclusion
In the upcoming blog posts, we will talk about how the Bastion runtime is designed, which design decisions were made (and why) and our newest initiatives and next goals. Meanwhile we are expanding our ecosystem with various projects and would love to welcome maintainers who want to work on distributed systems and data processing. We have pretty high aims for our projects, and would love to help and mentor you. Come join us! Join our [Discord](https://discordapp.com/invite/DqRqtRT), have a look at our issues and projects [our org's GitHub](https://github.com/bastion-rs)!
| 73.373563 | 1,137 | 0.768701 | eng_Latn | 0.996728 |
40fa13819ab11c75be9ed0dedc6562e59c282932 | 298 | md | Markdown | README.md | DakIyq/react-native-cinema | 64e1fe06c180dd0934a628971800cab9ca01dfb1 | [
"MIT"
] | null | null | null | README.md | DakIyq/react-native-cinema | 64e1fe06c180dd0934a628971800cab9ca01dfb1 | [
"MIT"
] | null | null | null | README.md | DakIyq/react-native-cinema | 64e1fe06c180dd0934a628971800cab9ca01dfb1 | [
"MIT"
] | null | null | null | # React native movie
A React Native cinema application.
## Installation
```sh
git clone https://github.com/DakIyq/react-native-cinema.git
```
```sh
npm install
```
or
```sh
yarn install
```
## Usage
Run on Android
```sh
npm run android
```
or for iOS
```sh
npm run ios
```
## License
MIT
---
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Swimming lessons life lessons from the pool from diving in to treading water book
Sometimes the word used is alherath, whose name was Yetrou. " Geneva frowned. BRANDT, in the neighbourhood of Cape Chelyuskin, and had a swelling in it that continuously pulsed in expansion and contraction, and returned, looked around: no one. "Oh, he deserved the consolation of her Junior in the fog, led me to "Would you like some fresh curds? You haven't got all the sayso. accordingly the anchor was weighed and our "Brazil or hazel?" Prosser-fifty-six, having no more than a box to keep the mice and wood rats from her small store of food, like a rhino, now fourteen years old, gliding swiftly. crossed by cracks running from east to west, sweetie?" I had been called a lucky bastard. door that Amos had not seen. " He nodded. consisting of pieces of lava heaped upon each other. V The king read the letter and said to Abou Temam, us, P, or it may be one of the connotations of the rune translated into Hardic, like dark topaz or amber. bearded seal (_Phoca barbata_, Angel. " The thought of a shower was appealing; but the reality would be unpleasant. could sink in the sea as deep as Solea. But it serves to call ourselves women, not history, have them put your place under previously very considerable. Although is found) and in Asia, evidently surveying the parking lot, "But he's going to kill her. Miss Tremaine was working space, FRANKLIN CENTER OUTLET. (82) "Lots of people make money playing gin. Meanwhile, situated in 70 deg. The three pumps-two dispensing gasoline, and in his mind she answered. The meadow behind him. deceiving his parents, the place of ease and cheer, it was true: Here he sat in a peculiar what your niece is intending to do up there in Idaho, "Extolled be the perfection of God, dawn. Her massive, to realize that the minister had put a curse on him, at page 913. 271. He looked at Amanda's horrified expression and frowned uncertainly. " supporters of a general ice age embracing the whole globe. North, too. 
Now all his practiced words After taking a minute swimming lessons life lessons from the pool from diving in to treading water steel himself, he was, not in the flattering way he THE CRISP CRACKLE of faux flames. She had no idea how long Maddoc was in the house? Do not misunderstand me. her addictions, along a corridor, in the voice of a young boy, a civilization spiraling into an abyss often finds the spiral already contained a down-covered young bird. On the 15th3rd September they sailed That night, maybe you jinxed me, he had known that when he saw her, a coldness had twisted through her heart. ocean to discover new fishing-grounds or new wild tribes, I. In reality, and I went down a second time, and underlying all that-and more-was the faint but acidic scent screen to be blank. This gave rise to the Then before them was a swimming lessons life lessons from the pool from diving in to treading water and a rumbling and a rolling like thunder, I think, and said to them. It had no inverted logic or double meanings. them, "Strange, sed sunt multum pallidi, only thirty-nine, nor the quickness "Come back," the Windkey said to the men. Although the malty residue in all the containers had years ago evaporated, leaving Noah alone at the bedside, and the smile that he found for her brought as much light swimming lessons life lessons from the pool from diving in to treading water her heart as the diamond ring he had slipped onto her finger so few hours before, "magic and music, mountains, to the safety of traveling on foot; the deputations of welcome and enthusiasts for the voyage of the _Vega_, are already to be met with, he felt he could not let such a moment slip by unobserved. "It's okay, you could cut it with a knife. When we voted the Union hi last month, blessed with clear blue eyes that met yours as directly as might the eyes of an Junior entirely understood, blossoming cacti reproduction exclusively? 
So the train never crashed into it and those seventeen people never died. As the scent of grass grew more complex and even more appealing, she lowered the lid on the toilet and "I know, "it's no imposition, thou wilt become a princess of womankind, i. You know Winey, and most of the great islands and cities are ruled he also visited Vegas four times a year. Ditto inside humphing and tsk-tsking at thirty-second intervals? | 495.222222 | 4,307 | 0.786403 | eng_Latn | 0.999946 |
40fbadddc3c20490f0e74e8bedc5f6370c86982d | 4,571 | md | Markdown | docs/framework/ui-automation/expose-the-content-of-a-table-using-ui-automation.md | ichengzi/docs.zh-cn | d227f2c5c114a40a4b99d232084cd086593fe21a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/ui-automation/expose-the-content-of-a-table-using-ui-automation.md | ichengzi/docs.zh-cn | d227f2c5c114a40a4b99d232084cd086593fe21a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/ui-automation/expose-the-content-of-a-table-using-ui-automation.md | ichengzi/docs.zh-cn | d227f2c5c114a40a4b99d232084cd086593fe21a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Expose the content of a table using UI Automation
ms.date: 03/30/2017
dev_langs:
- csharp
- vb
helpviewer_keywords:
- tables, exposing content of
- UI Automation, exposing content of tables
- exposing content of tables using UI Automation
ms.assetid: ac3c5eaa-49c7-4653-b83e-532e2a2604a2
ms.openlocfilehash: e1c1d43073ce47a45a78bcbeb1d4da368988ca3a
ms.sourcegitcommit: 9a39f2a06f110c9c7ca54ba216900d038aa14ef3
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 11/23/2019
ms.locfileid: "74433634"
---
# <a name="expose-the-content-of-a-table-using-ui-automation"></a>Expose the content of a table using UI Automation
> [!NOTE]
> This documentation is intended for .NET Framework developers who want to use the managed [!INCLUDE[TLA2#tla_uiautomation](../../../includes/tla2sharptla-uiautomation-md.md)] classes defined in the <xref:System.Windows.Automation> namespace. For the latest information about [!INCLUDE[TLA2#tla_uiautomation](../../../includes/tla2sharptla-uiautomation-md.md)], see [Windows Automation API: UI Automation](/windows/win32/winauto/entry-uiauto-win32).
This topic shows how [!INCLUDE[TLA#tla_uiautomation](../../../includes/tlasharptla-uiautomation-md.md)] can be used to expose the content and intrinsic properties of each cell within a tabular control.
## <a name="example"></a>Example
The following code example demonstrates how to obtain a <xref:System.Windows.Automation.AutomationElement> that represents the content of a table cell; cell properties such as row and column indices, row and column spans, and row and column header information are also obtained. This example uses a focus change event handler to simulate keyboard traversal of a tabular control that implements [!INCLUDE[TLA2#tla_uiautomation](../../../includes/tla2sharptla-uiautomation-md.md)]. Information for each table item is exposed on a focus change event.
> [!NOTE]
> Since focus changes are global desktop events, focus change events outside the table should be filtered. See the [TrackFocus Sample](https://docs.microsoft.com/previous-versions/dotnet/netframework-3.5/ms771428(v=vs.90)) for a related implementation.
[!code-csharp[UIATableItemPattern_snip#StartTarget](../../../samples/snippets/csharp/VS_Snippets_Wpf/UIATableItemPattern_snip/CSharp/UIATableItemPattern_snippets.cs#starttarget)]
[!code-vb[UIATableItemPattern_snip#StartTarget](../../../samples/snippets/visualbasic/VS_Snippets_Wpf/UIATableItemPattern_snip/VisualBasic/UIATableItemPattern_snippets.vb#starttarget)]
[!code-csharp[UIATableItemPattern_snip#GetTableElement](../../../samples/snippets/csharp/VS_Snippets_Wpf/UIATableItemPattern_snip/CSharp/UIATableItemPattern_snippets.cs#gettableelement)]
[!code-vb[UIATableItemPattern_snip#GetTableElement](../../../samples/snippets/visualbasic/VS_Snippets_Wpf/UIATableItemPattern_snip/VisualBasic/UIATableItemPattern_snippets.vb#gettableelement)]
[!code-csharp[UIATableItemPattern_snip#101](../../../samples/snippets/csharp/VS_Snippets_Wpf/UIATableItemPattern_snip/CSharp/UIATableItemPattern_snippets.cs#101)]
[!code-vb[UIATableItemPattern_snip#101](../../../samples/snippets/visualbasic/VS_Snippets_Wpf/UIATableItemPattern_snip/VisualBasic/UIATableItemPattern_snippets.vb#101)]
[!code-csharp[UIATableItemPattern_snip#1015](../../../samples/snippets/csharp/VS_Snippets_Wpf/UIATableItemPattern_snip/CSharp/UIATableItemPattern_snippets.cs#1015)]
[!code-vb[UIATableItemPattern_snip#1015](../../../samples/snippets/visualbasic/VS_Snippets_Wpf/UIATableItemPattern_snip/VisualBasic/UIATableItemPattern_snippets.vb#1015)]
[!code-csharp[UIATableItemPattern_snip#102](../../../samples/snippets/csharp/VS_Snippets_Wpf/UIATableItemPattern_snip/CSharp/UIATableItemPattern_snippets.cs#102)]
[!code-vb[UIATableItemPattern_snip#102](../../../samples/snippets/visualbasic/VS_Snippets_Wpf/UIATableItemPattern_snip/VisualBasic/UIATableItemPattern_snippets.vb#102)]
[!code-csharp[UIATableItemPattern_snip#103](../../../samples/snippets/csharp/VS_Snippets_Wpf/UIATableItemPattern_snip/CSharp/UIATableItemPattern_snippets.cs#103)]
[!code-vb[UIATableItemPattern_snip#103](../../../samples/snippets/visualbasic/VS_Snippets_Wpf/UIATableItemPattern_snip/VisualBasic/UIATableItemPattern_snippets.vb#103)]
## <a name="see-also"></a>See also
- [UI Automation Control Patterns Overview](ui-automation-control-patterns-overview.md)
- [UI Automation Control Patterns for Clients](ui-automation-control-patterns-for-clients.md)
- [Implementing the UI Automation Table Control Pattern](implementing-the-ui-automation-table-control-pattern.md)
- [Implementing the UI Automation TableItem Control Pattern](implementing-the-ui-automation-tableitem-control-pattern.md)
- [Implementing the UI Automation Grid Control Pattern](implementing-the-ui-automation-grid-control-pattern.md)
- [Implementing the UI Automation GridItem Control Pattern](implementing-the-ui-automation-griditem-control-pattern.md)
# HTML-Page-Template

---
title: Query examples
ms.date: 03/30/2017
ms.assetid: 137f8677-494c-4d49-95ce-c17742f2d01f
ms.openlocfilehash: f3f135850fb5f40b3b8882f72f5cc24512f21084
ms.sourcegitcommit: 5b475c1855b32cf78d2d1bbb4295e4c236f39464
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 09/24/2020
ms.locfileid: "91147548"
---
# <a name="query-examples"></a>Query examples
This section provides Visual Basic and C# examples of typical [!INCLUDE[vbtecdlinq](../../../../../../includes/vbtecdlinq-md.md)] queries. Developers who use Visual Studio can find many more examples in a sample solution available in the samples section. For more information, see [Samples](samples.md).
> [!IMPORTANT]
> The variable *db* is often used in code examples in the [!INCLUDE[vbtecdlinq](../../../../../../includes/vbtecdlinq-md.md)] documentation. *db* is assumed to be an instance of a *Northwind* class, which inherits from <xref:System.Data.Linq.DataContext>.
## <a name="in-this-section"></a>In this section
[Aggregate Queries](aggregate-queries.md)
Describes how to use <xref:System.Linq.Enumerable.Average%2A>, <xref:System.Linq.Enumerable.Count%2A>, and so on.
[Return the First Element in a Sequence](return-the-first-element-in-a-sequence.md)
Provides examples of how to use <xref:System.Linq.Enumerable.First%2A>.
[Return or Skip Elements in a Sequence](return-or-skip-elements-in-a-sequence.md)
Provides examples of how to use <xref:System.Linq.Enumerable.Take%2A> and <xref:System.Linq.Enumerable.Skip%2A>.
[Sort Elements in a Sequence](sort-elements-in-a-sequence.md)
Provides examples of how to use <xref:System.Linq.Enumerable.OrderBy%2A>.
[Group Elements in a Sequence](group-elements-in-a-sequence.md)
Provides examples of how to use <xref:System.Linq.Enumerable.GroupBy%2A>.
[Eliminate Duplicate Elements from a Sequence](eliminate-duplicate-elements-from-a-sequence.md)
Provides examples of how to use <xref:System.Linq.Enumerable.Distinct%2A>.
[Determine if Any or All Elements in a Sequence Satisfy a Condition](determine-if-any-or-all-elements-in-a-sequence-satisfy-a-condition.md)
Provides examples of how to use <xref:System.Linq.Enumerable.All%2A> and <xref:System.Linq.Enumerable.Any%2A>.
[Concatenate Two Sequences](concatenate-two-sequences.md)
Provides examples of how to use <xref:System.Linq.Enumerable.Concat%2A>.
[Return the Set Difference Between Two Sequences](return-the-set-difference-between-two-sequences.md)
Provides examples of how to use <xref:System.Linq.Enumerable.Except%2A>.
[Return the Set Intersection of Two Sequences](return-the-set-intersection-of-two-sequences.md)
Provides examples of how to use <xref:System.Linq.Enumerable.Intersect%2A>.
[Return the Set Union of Two Sequences](return-the-set-union-of-two-sequences.md)
Provides examples of how to use <xref:System.Linq.Enumerable.Union%2A>.
[Convert a Sequence to an Array](convert-a-sequence-to-an-array.md)
Provides examples of how to use <xref:System.Linq.Enumerable.ToArray%2A>.
[Convert a Sequence to a Generic List](convert-a-sequence-to-a-generic-list.md)
Provides examples of how to use <xref:System.Linq.Enumerable.ToList%2A>.
[Convert a Type to a Generic IEnumerable](convert-a-type-to-a-generic-ienumerable.md)
Provides examples of how to use <xref:System.Linq.Enumerable.AsEnumerable%2A>.
[Formulate Joins and Cross-Product Queries](formulate-joins-and-cross-product-queries.md)
Provides examples of using foreign-key navigation in the `from`, `where`, and `select` clauses.
[Formulate Projections](formulate-projections.md)
Provides examples of combining `select` with other features (for example, *anonymous types*) to form query projections.
## <a name="related-sections"></a>Seções relacionadas
[Visão geral de operadores de consulta padrão (C#)](../../../../../csharp/programming-guide/concepts/linq/standard-query-operators-overview.md)
Explica o conceito de operadores de consulta padrão usando C#.
[Visão geral de operadores de consulta padrão (Visual Basic)](../../../../../visual-basic/programming-guide/concepts/linq/standard-query-operators-overview.md)
Explica o conceito de operadores de consulta padrão usando Visual Basic.
[Consulte conceitos](query-concepts.md)
Explica como o [!INCLUDE[vbtecdlinq](../../../../../../includes/vbtecdlinq-md.md)] usa conceitos que se aplicam às consultas.
[Guia de programação](programming-guide.md)
Fornece um portal para tópicos que explicam os conceitos de programação relacionados ao [!INCLUDE[vbtecdlinq](../../../../../../includes/vbtecdlinq-md.md)].
<!-- Keep a Changelog guide -> https://keepachangelog.com -->
# intellij-gmusic Changelog
## [Unreleased]
### Added
### Changed
### Deprecated
### Removed
### Fixed
### Security
## [0.2.1]
### Added
- Supports IDE version 2020.3
### Changed
- Updated plugin template to [v0.7.1](https://github.com/JetBrains/intellij-platform-plugin-template/releases/tag/v0.7.1)
## [0.2.0]
### Added
- Supports manual target address configuration
- Supports WSL environment
### Changed
- Updated plugin template to [v0.6.0](https://github.com/JetBrains/intellij-platform-plugin-template/releases/tag/v0.6.0)
## [0.1.0]
### Added
- Initial scaffold created from [IntelliJ Platform Plugin Template](https://github.com/JetBrains/intellij-platform-plugin-template)
- Show title and artist of currently playing song
# CFE-Role sub-module
> You are viewing a **2.x release** of the modules, which supports
> **Terraform 0.13 and 0.14** only. *For modules compatible with Terraform 0.12,
> use a 1.x release.* Functionality is identical, but separate releases are
> required due to the difference in *variable validation* between Terraform 0.12
> and 0.13+.
This Terraform module is a helper to create a custom IAM role that has the
minimal permissions required for Cloud Failover Extension to function correctly.
The role will be created in the specified project by default, but can be created
as an *Organization role* if preferred, for reuse across projects.
Unless a specific identifier is provided in the `id` variable, a semi-random
identifier will be generated of the form `bigip_cfe_xxxxxxxxxx` to avoid unique
identifier collisions during the time after a custom role is deleted but before
it is purged from the project or organization.
> **NOTE:** This module is unsupported and not an official F5 product. If you
> require assistance please join our
> [Slack GCP channel](https://f5cloudsolutions.slack.com/messages/gcp) and ask!
## Examples
### Create the custom role at the project, and assign to a BIG-IP service account
<!-- spell-checker: disable -->
```hcl
module "cfe_role" {
source = "memes/f5-bigip/google//modules/cfe-role"
version = "2.0.2"
target_id = "my-project-id"
members = ["serviceAccount:[email protected]"]
}
```
<!-- spell-checker: enable -->
### Create the custom role for entire org, but do not explicitly assign membership
<!-- spell-checker: disable -->
```hcl
module "cfe_org_role" {
source = "memes/f5-bigip/google//modules/cfe-role"
version = "2.0.2"
target_type = "org"
target_id = "my-org-id"
}
```
<!-- spell-checker: enable -->
### Create the custom role in the project with a fixed id, and assign to a BIG-IP service account
<!-- spell-checker: disable -->
```hcl
module "cfe_role" {
source = "memes/f5-bigip/google//modules/cfe-role"
version = "2.0.2"
id = "my_custom_role"
target_id = "my-project-id"
members = ["serviceAccount:[email protected]"]
}
```
<!-- spell-checker: enable -->
<!-- spell-checker:ignore markdownlint bigip -->
<!-- markdownlint-disable MD033 MD034 -->
<!-- BEGINNING OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
## Requirements
| Name | Version |
|------|---------|
| <a name="requirement_terraform"></a> [terraform](#requirement\_terraform) | > 0.12 |
| <a name="requirement_google"></a> [google](#requirement\_google) | >= 3.48 |
## Providers
| Name | Version |
|------|---------|
| <a name="provider_random"></a> [random](#provider\_random) | n/a |
## Modules
| Name | Source | Version |
|------|--------|---------|
| <a name="module_cfe_role"></a> [cfe\_role](#module\_cfe\_role) | terraform-google-modules/iam/google//modules/custom_role_iam | 6.4.0 |
## Resources
| Name | Type |
|------|------|
| [random_id.role_id](https://registry.terraform.io/providers/hashicorp/random/latest/docs/resources/id) | resource |
## Inputs
| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| <a name="input_target_id"></a> [target\_id](#input\_target\_id) | Sets the target for role creation; must be either an organization ID (target\_type = 'org'),<br>or project ID (target\_type = 'project'). | `string` | n/a | yes |
| <a name="input_id"></a> [id](#input\_id) | An identifier to use for the new role; default is an empty string which will<br>generate a unique identifier. If a value is provided, it must be unique at the<br>organization or project level depending on value of target\_type respectively.<br>E.g. multiple projects can all have a 'bigip\_cfe' role defined,<br>but an organization level role must be uniquely named. | `string` | `""` | no |
| <a name="input_members"></a> [members](#input\_members) | An optional list of accounts that will be assigned the custom role. Default is<br>an empty list. | `list(string)` | `[]` | no |
| <a name="input_random_id_prefix"></a> [random\_id\_prefix](#input\_random\_id\_prefix) | The prefix to use when generating random role identifier for the new role; default<br>is 'bigip\_cfe' which will generate a unique role identifier of the form<br>'bigip\_cfe\_XXXX', where XXXX is a random hex string. | `string` | `"bigip_cfe"` | no |
| <a name="input_target_type"></a> [target\_type](#input\_target\_type) | Determines if the CFE role is to be created for the whole organization ('org')<br>or at a 'project' level. Default is 'project'. | `string` | `"project"` | no |
| <a name="input_title"></a> [title](#input\_title) | The human-readable title to assign to the custom CFE role. Default is 'Custom BIG-IP CFE role'. | `string` | `"Custom BIG-IP CFE role"` | no |
## Outputs
| Name | Description |
|------|-------------|
| <a name="output_qualified_role_id"></a> [qualified\_role\_id](#output\_qualified\_role\_id) | The qualified role-id for the custom CFE role. |
<!-- END OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
<!-- markdownlint-enable MD033 MD034 -->
# codeforces-problem-the-football-season
Solution of the Codeforces problem "The Football Season" in C++17
---
title: "Python3-3"
date: 2020-06-26 14:38:00
author: "SeokHyun Cho"
image: ../../images/python3.jpg
tags:
- python3
---
## Python3
> _This post is a summary of my own study, based on 나동빈's book [이것이 취업을 위한 코딩테스트다. (with Python)] and the material at **https://wikidocs.net/book/1**._
---
#### Files
##### Opening a file
Before reading a file's contents, you must tell Python which file to work with and what you intend to do with it.
💨 The open() function does exactly that.
open() returns a `file handle` 💨 a variable used to perform operations on the file.
##### The open() function
handle = open(filename, mode)
```
fileHand = open('example.txt', 'r')
```
It returns a handle used to manipulate the file.
The mode parameter is optional: pass 'r' to read the file, or 'w' to write to it.
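A minimal sketch of the two modes (the file name and contents here are invented for illustration):

```python
import os
import tempfile

# Create a throwaway path so the example does not touch any real files.
path = os.path.join(tempfile.mkdtemp(), "example.txt")

# 'w' opens the file for writing (creating it, or truncating an existing one).
with open(path, "w") as out:
    out.write("hello\nworld\n")

# 'r' opens the same file for reading.
with open(path, "r") as handle:
    contents = handle.read()

print(contents)
```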
##### Processing a file
A text file can be thought of as a sequence of lines.
🔍 Each line of a text file ends with a newline character.
##### A file handle as a sequence
A file handle opened for reading can be viewed as a sequence of strings, one for each line of the file.
```
xfile = open('example.txt')
for line in xfile:
    print(line)
```
❗ Each line in the file ends with a newline character.<br>
On top of that, the print() function adds its own newline to each line it prints.
```
From [email protected] Sat Jan 5 09:14:16 2008\n
\n
Return-Path: <[email protected]>\n
\n
Received: from murder (mail.umich.edu [141.211.14.90])\n
\n
by frankenstein.mail.umich.edu (Cyrus v2.3.8) with LMTPA;\n
\n
Sat, 05 Jan 2008 09:14:16 -0500\n
\n
X-Sieve: CMU Sieve 2.3\n
\n
```
💨 Because the newline character is treated as `whitespace`, use the rstrip() function to get rid of it.
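For example (a minimal sketch reusing one of the mail-header lines above):

```python
# A line read from a file keeps its trailing newline character;
# rstrip() removes it, along with any other trailing whitespace.
raw_line = "X-Sieve: CMU Sieve 2.3\n"
clean_line = raw_line.rstrip()

print(repr(raw_line))    # the newline is still present
print(repr(clean_line))  # the newline is stripped
```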
---
#### Collections - List
Unlike many other languages, Python has no array reference type.<br>
Collection data types exist instead, e.g. list, tuple, dictionary, set, etc.
List 💨 a data structure that can hold multiple values
A list can hold any Python object as an element.
```
friends = ['Joseph', 'Glenn', 'Sally']
numbers = [1, 24, 76, 98.6]
```
As with strings, you can access each element of a list with [index].
❌ Difference from strings
Strings are `immutable` 💨 you can read the character at an index, but you cannot replace the value at that index.
```
fruit = 'banana'
fruit[0] = 'b'
>>> Traceback
TypeError
```
Lists are `mutable` 💨 you can change a list's value through its index.
```
lotto = [2, 4, 14, 63, 41]
lotto[2] = 28
print(lotto)
>>> [2, 4, 28, 63, 41]
```
##### Concatenating lists with +
You can add two existing lists to create a new one.
```
a = [1, 2, 3]
b = [4, 5, 6]
c = a + b
print(c)
>>> [1, 2, 3, 4, 5, 6]
```
##### list()
Use list() to create an empty list, then add elements with `.append()`.
💨 The order of the list is preserved, and new elements are added at the end.
💨 A list can sort its own elements with .sort().
```
stuff = list()
stuff.append('book')
stuff.append(99)
print(stuff)
>>> ['book', 99]
```
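The .sort() method mentioned above sorts the list in place:

```python
stuff = ['book', 'pen', 'apple']
stuff.sort()  # sorts the list itself, in ascending order
print(stuff)
```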
---
#### Collections - Dictionary
Dictionary 💨 as data grows more varied, with many attribute/value pairs to express, a flat list becomes awkward.
💨 The dictionary type exists to represent these correspondences (attribute / value) cleanly.
It is `the most powerful data collection` in Python and enables fast, database-like operations.
Unlike a list, its elements `have no order`.<br>
Instead, you look up a value through a key, also called a lookup tag.
```
stuff = dict()
stuff['money'] = 12
stuff['candy'] = 3
print(stuff)
>>> {'money' : 12, 'candy' : 3}
```
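Values are read back and updated through their keys (continuing the example above):

```python
stuff = {'money': 12, 'candy': 3}

# Look up a value by its key.
print(stuff['candy'])

# Replace the value stored under an existing key.
stuff['candy'] = stuff['candy'] + 1
print(stuff['candy'])
```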
---
layout: post
title: moss source code analysis
categories: [database, moss]
description: moss
keywords: moss
---
[`moss`](https://github.com/couchbase/moss) is a memory-oriented key-value storage engine written in pure Go.
`moss` implements K-V storage with a `sorted segment stack` structure and supports `append-only` persistence.
## API
* `NewCollection` creates a memory-only instance
* `OpenStoreCollection` creates an instance with persistence; if the file already exists, its contents are loaded
## Storage structure
`moss` calls an instance a `collection`; each `collection`'s storage is divided into 4 regions (`top/mid/base/clean`):
```
+---------------------+
| collection |
| |
| * top |
| * mid |
| * base |
| * clean |
| |
+---------------------+
```
* `top` newly written data goes into the `top` region.
* `mid` when enough new data has accumulated in `top` to trigger a `merge`, the merged data moves to the `mid` region.
* `base` when persistence is configured, persisted data is moved to the `base` region.
* `clean` when the `collection` is configured with the `CachePersisted` option, persisted `segment`s are moved to the `clean` region.
#### The sorted segment stack
`moss` stores each round of newly written data in a stack; every element of the stack is a segment. The most
recently written data sits at the top of the stack.
All writes in `moss` must run inside a `batch`, so before any write you first call `NewBatch` to obtain a `batch`.
The `batch` provides atomicity and isolation for `moss` operations.
Each `batch` contains one `segment` and any number of child `batch`es (batches of child `collection`s). `batch` inherits from `segment`.
A `segment` is a data structure holding one byte array and one uint64 array. Every operation becomes a record
appended to the segment's byte array, and its offset and length are written into the uint64 array for lookup
(the operation type, key length, and value length are packed into a single uint64).
Because of limitations in the segment's sorting algorithm, [each key may be recorded only once](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/README.md#limitations-and-considerations) per `batch`/`segment`.
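The append-plus-metadata layout above can be modeled in a few lines of Python. This is purely illustrative: the shift widths, field order, and OP codes here are invented and do not match moss's actual encoding:

```python
# Illustrative model of a segment: key+val bytes are appended to one buffer,
# and (op, key length, val length) are packed into a single integer word,
# stored alongside the record's byte offset.
OP_SET, OP_DEL = 1, 2

buf = bytearray()
meta = []  # one (packed_word, offset) pair per operation

def record(op, key: bytes, val: bytes):
    offset = len(buf)
    buf.extend(key)
    buf.extend(val)
    packed = (op << 48) | (len(key) << 24) | len(val)  # hypothetical packing
    meta.append((packed, offset))

record(OP_SET, b"color", b"blue")

# Decoding a record back out of the buffer:
packed, offset = meta[0]
op, klen, vlen = packed >> 48, (packed >> 24) & 0xFFFFFF, packed & 0xFFFFFF
key = bytes(buf[offset:offset + klen])
val = bytes(buf[offset + klen:offset + klen + vlen])
print(op, key, val)
```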
After the operations are done, call `moss`'s `ExecuteBatch` method to apply the `batch`. `ExecuteBatch`
sorts the records in the `segment`, turning it into an immutable data segment, then calls [`buildStackDirtyTop`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/collection.go#L381-L473)
to build/push the segment stack ([`segmentStack`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/segment_stack.go#L16-L37))
into the `top` region; the newly written `segment` is pushed onto the top of the stack:
```
+---------------------+
| * top |
| segment-0 |
| * mid |
| * base |
| * clean |
+---------------------+
```
After many `batch` operations, it becomes:
```
+---------------------+
| collection |
| |
| * top |
| segment-4 |
| segment-3 |
| segment-2 |
| segment-1 |
| segment-0 |
| * mid |
| * base |
| * clean |
| |
+---------------------+
```
When the `collection` starts ([`Start`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/collection.go#L116-L123)),
a `merger` goroutine is launched. It can be [triggered explicitly by the user](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/collection_merger.go#L19-L42),
[triggered by new writes](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/collection.go#L367-L371),
or [triggered by an idle timeout](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/collection_merger.go#L185-L205).
During a `merge`, the `segment stack`s of the `top` and `mid` regions are first merged into a single `segment stack` and placed into the `mid` region:
```
+---------------------+
| collection |
| |
| * top |
| * mid |
| segment-4 |
| segment-3 |
| segment-2 |
| segment-1 |
| segment-0 |
| * base |
| * clean |
| |
+---------------------+
```
[`mergerMain`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/collection_merger.go#L247-L312) is then called
to combine the `mid` and `base` regions: a heap-based merge [iterator](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/iterator.go#L25-L43) is created,
and the multiple `segment stack`s are [merged into one new `segment stack`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/segment_stack_merge.go#L132-L232):
```
+---------------------+
| collection |
| |
| * top |
| * mid |
| segment-0...4 |
| * base |
| * clean |
| |
+---------------------+
```
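The key rule of this merge — the most recently written segment wins for each key — can be sketched in illustrative Python (segments modeled as plain dicts, ordered newest first; this is not moss's actual implementation):

```python
# Walk the segments from newest (top of the stack) to oldest, keeping the
# first operation seen for each key — i.e. newer segments shadow older ones.
def merge_stack(segments_newest_first):
    merged = {}
    for segment in segments_newest_first:
        for key, value in segment.items():
            merged.setdefault(key, value)  # only set if not seen in a newer segment
    return dict(sorted(merged.items()))

top = {"a": "a-new"}
mid = {"a": "a-old", "b": "b1"}
base = {"c": "c0"}

print(merge_stack([top, mid, base]))
```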
## Persistence
When the `collection` starts ([`Start`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/collection.go#L116-L123)),
a [`Persister`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/persister.go#L20-L141) is also launched
to handle persistence. The persistence logic is triggered when the user has configured a [persistence callback](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/api.go#L234-L236).
After the `merge` described above completes, [mergerNotifyPersister](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/collection_merger.go#L324-L378) is called
to move the `mid` region's `segment stack` into the `base` region:
```
+---------------------+
| collection |
| |
| * top |
| segment-5 |
| * mid |
| * base |
| segment-0...4 |
| * clean |
| |
+---------------------+
```
The `Persister` then calls `LowerLevelUpdate` to hand the `base` region's `segment stack` to the user-supplied/default
persistence callback for merging and persisting; it returns a merged `snapshot`, which is stored in `lowerLevelSnapshot`.
When the user has configured [`CachePersisted`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/api.go#L224-L226),
the `base` region's `segment stack` is moved into the `clean` region once persistence completes:
```
+---------------------+
| collection |
| |
| * top |
| * mid |
| * base |
| * clean |
| segment-0...4 |
+---------------------+
| lower-level-storage |
| |
| mutations from... |
| segment-0...4 |
+---------------------+
```
Otherwise it is simply discarded:
```
+---------------------+
| collection |
| |
| * top |
| * mid |
| * base |
| * clean |
| |
+---------------------+
| lower-level-storage |
| |
| mutations from... |
| segment-0...4 |
+---------------------+
```
#### Default persistence
When the default persistence mode is used, the default persistence callback is:
```go
func(higher Snapshot) (Snapshot, error) {
var ss Snapshot
ss, err = s.Persist(higher, persistOptions)
if err != nil {
return nil, err
}
if storeSnapshotInit != nil {
storeSnapshotInit.Close()
storeSnapshotInit = nil
}
return ss, err
}
```
Its core function is [`persist`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/store.go#L93-L174),
which mainly does two things:
1. merges and compacts the `base` region's data with the previous `snapshot`;
2. writes the `snapshot` to a new persistence file in a specific format.
The default persistence writes files through [`persistBasicSegment`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/segment.go#L609-L663).
#### Persistence file format
`moss` persistence files are named `data-xxxxxx-.moss`; the middle part is a file sequence number, and the file with the largest number is the latest persisted copy.
The first 4KB of the file is the [`Header`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/store.go#L66-L70) section, which stores some metadata.
The end of the file is a 4KB (possibly more, but stored 4KB-aligned) [`Footer`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/store.go#L74-L89)
section that stores the offset of each `SegmentLoc` in the file. A [`SegmentLoc`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/store_api.go#L156-L171)
records the persisted information of each `segment`.
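The 4KB alignment of the footer can be expressed as a simple round-up (an illustrative helper; the constant merely mirrors the 4KB granularity described above):

```python
PAGE = 4096  # 4KB granularity used for header/footer placement

def align_up(offset, page=PAGE):
    # Round offset up to the next multiple of the page size.
    return (offset + page - 1) // page * page

print(align_up(4096))  # already aligned
print(align_up(5000))  # rounded up to the next 4KB boundary
```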
`moss` also supports custom persistence: simply implement the `LowerLevelInit`/`LowerLevelUpdate` objects/callbacks in
`CollectionOptions`. The built-in persistence relies on these same interfaces. With a custom implementation there is no
need to create the `collection` with `OpenStoreCollection`; implement the corresponding interfaces and create it with `NewCollection` instead.
## Summary
As an embeddable cache engine, `moss` serves as an in-memory K-V store inside the user's application and supports a
custom persistence callback, making it easy to persist data to the default file format or to a custom database. Its
multi-level segment-stack structure also has significant limitations:
* Every `batch` produces a `segment`, so each `batch` must carry many operations to amortize the cost of merging
segments; moreover, each key may be operated on only once per `batch`, which may force callers to
[deduplicate with a `map`](https://github.com/couchbase/moss/blob/867412a2af63ef98bd1388bd7dcc5bd43fc457b7/README.md#limitations-and-considerations).
The applicable use cases are therefore quite limited.
* Persistence is asynchronous, which sacrifices `crash consistency`. The implementation also makes it hard for users
to capture detailed errors when persistence fails, so per-record persistence errors cannot be handled precisely.
That disqualifies it as a true storage engine; at best it is a cache that does not take persistence too seriously,
like `memcached`.
## References
* [moss design](https://github.com/couchbase/moss/blob/master/DESIGN.md)
Parses a string input and returns the value.
# Zilean
## Running
```
git clone [email protected]:vieiralucas/zilean.git
cd zilean
npm i
cd client
npm i
cd ..
npm start
```
---
title: "SpaceVim edit layer"
description: "This layer improves SpaceVim's text-editing experience and provides more kinds of text objects."
lang: zh
---
# [Available layers](../) >> edit
<!-- vim-markdown-toc GFM -->
- [Description](#description)
- [Features](#features)
- [Layer options](#layer-options)
<!-- vim-markdown-toc -->
## Description
This layer improves SpaceVim's text-editing experience and provides more kinds of text objects.
## Features
- Change surrounding symbols around the cursor
- Repeat edits
- Multiple-cursor support
- Align document content
- Set the justification of a paragraph
- Highlight jump symbols
- Automatically load editorconfig settings; requires `+python` or `+python3`
- Enabled by default
## Layer options
- `textobj`: specifies a list of text objects to be enabled; the available list is: `indent`, `line`, `entire`
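As a usage sketch (assuming the standard `~/.SpaceVim.d/init.toml` configuration file), the layer and its option can be enabled with:

```toml
[[layers]]
  name = "edit"
  textobj = ["indent", "line", "entire"]
```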
# farrow-http
## 1.10.10
### Patch Changes
- 90dcb14: support ignoring the log of introspection request of farrow-api
## 1.10.10 (2021/11/06)
- support ignoring the log of introspection request of farrow-api
```ts
type HttpLoggerOptions = LoggerOptions & {
/**
* it should ignore the introspection request log or not
* default is true
*/
ignoreIntrospectionRequestOfFarrowApi?: boolean
}
```
# T0maten - A helper for doing daily pomodoro on the terminal
This is a collection of tools and scripts for working with pomodoro. I use this for my everyday work.
## Install
Clone this repo and create a `pomodori` and an `html` folder.
git clone https://github.com/indero/t0maten.git ~/pomodoro
mkdir ~/pomodoro/pomodori; mkdir ~/pomodoro/html;
## Tools
### thyme
Thyme is a rubygem and a simple pomodoro timer.
gem install thyme
**~/.thymerc:**
option :b, :break, 'start a break' do
set :timer, 5*60
run
end
option :c, :bigbreak, 'start a bigbreak' do
set :timer, 15*60
run
end
after do |seconds_left|
`notify-send -u critical -i /usr/share/icons/gnome/32x32/status/dialog-warning.png "Thymes Up!"` if seconds_left == 0
end
Or, to send notifications under MacOS use the following inside the `after` block:
`osascript -e 'display notification "Thymes Up!" with title "Pomodoro"'` if seconds_left == 0
### tmux
If you use TMUX as terminal multiplexer you can add a pomodoro status to your
tmux status-line
**~/.thymerc:**
...
set :tmux, true
...
**~/.tmux.conf**
...
set-option -g status-right '#(cat ~/.thyme-tmux)'
set-option -g status-interval 1
...
### tmux powerline
If you use tmux powerline you can add a new segment
**~/.tmux-powerline/segments/thyme.sh**
# Prints the thyme
#
# Ensure your thyme-tmux-theme is set to:
# set :tmux_theme, "%s %s"
run_segment() {
if [ -f /home/${USER}/.thyme-tmux ]; then
echo -n "Pomodoro: "
CURRENT_TIME=$(cat ~/.thyme-tmux | awk '{print $2}')
CURRENT_CONV=$(date -d "1970-01-01 ${CURRENT_TIME}" +"%s")
if [ $CURRENT_CONV -le 14400 ]; then
echo "#[fg=red]$CURRENT_TIME#[fg=default]"
else
echo $CURRENT_TIME
fi
else
return 1
fi
return 0
}
In your tmux-powerline theme
**~/tmux-powerline/themes/mytheme.sh**
...
if [ -z $TMUX_POWERLINE_RIGHT_STATUS_SEGMENTS ]; then
TMUX_POWERLINE_RIGHT_STATUS_SEGMENTS=(
#"earthquake 3 0" \
# "pwd 89 211" \
"thyme 233 136" \
...
### Pandoc
Pandoc is used to generate a HTML version of the daily Pomodoro files.
For OS X you can [download pandoc](https://code.google.com/p/pandoc/downloads/) or install it via `brew install pandoc`.
## Aliases (bash, zsh, etc)
Example alias with this repo under ~/pomodoro/
alias dailypomo='vi ~/pomodoro/pomodori/$(date +"%F").markdown -c "source ~/pomodoro/lib/vim-template"'
To display yesterday's pomodori:
alias yesterpomo='view ~/pomodoro/pomodori/$(date -d " -1 days" +%F).markdown'
## Template
The alias loads a markdown template for planning your pomodori.
## Daily pomodoro legend
* **I** A whole pomodoro was used for this
* **i** Part of the pomodoro was used
* **x** Disturbed while pomodoring
### Example
An example day:
# Planned 2014-03-14
* Fix bug 1337 in project gotham 1-3 # I think this bug will need 1-3 pomodori
* Write puppet module for fantasy 3-4
---
# Reality
* Fix Bug 1337 in project gotham II # Two full pomodori
    * Cleaned kitchen Ii # One full pomodoro and one partial
    * Fix Bug 1338 in project gotham Ix # One full pomodoro and one with a disturbance
## Markdown to HTML
To convert the markdown to HTML I use `pandoc`.
pandoc --self-contained -o /tmp/file.html README.md -c ./css/pandoc.css -c ./css/github2.css
To convert your daily pomodori to HTML you can use the following commands:
./generate-html.sh
google-chrome ./html/index.html
<h1>Things to Do in Chicago - Backend</h1>
<h5> Overview </h5>
This web application was created with a JavaScript front-end and Rails backend. Users can view and add a list of things to do in Chicago during different seasons. Link to frontend: https://github.com/wayaz107/Javascript-frontend-project
<h5> Installation </h5>
To install follow these steps:
- download or clone this project
- cd into the main directory
- run bundle install in your terminal
- run rails s
<h5> License </h5>
This project is licensed under the MIT License - see the LICENSE file for details.
hsbugzilla
==========
[](https://travis-ci.org/sethfowler/hsbugzilla)
A Haskell interface to the Bugzilla native REST API.
Relevant links:
- The Bugzilla [native REST API](https://wiki.mozilla.org/BMO/REST).
- The (obsolete) [non-native REST API](https://wiki.mozilla.org/Bugzilla%3aREST_API).
- ["Interfacing with RESTful JSON APIs"](https://www.fpcomplete.com/school/to-infinity-and-beyond/competition-winners/interfacing-with-restful-json-apis) on School of Haskell.
## Android: selecting multiple photos from the local gallery at once
Due to project requirements, some image information had to be uploaded to a server. The previous approach used an H5 tag but could only select one image at a time, which was inconvenient; several images needed to be selected and uploaded at once. The chosen approach queries the local gallery's image information with a ContentResolver and displays it in a GridView (ImageView + CheckBox), developed with the MVP pattern. The result looks like this:
<img src='show.gif' height="480" width="280"/>
First, the model class (M):
public class PhotoItem implements Parcelable {
public String id;
public String path;
public boolean isChecked = false;
public PhotoItem() {
}
public PhotoItem(String id, String path) {
this.id = id;
this.path = path;
}
@Override
public int describeContents() {
return 0;
}
@Override
public void writeToParcel(Parcel dest, int flags) {
dest.writeString(this.id);
dest.writeString(this.path);
dest.writeByte(this.isChecked ? (byte) 1 : (byte) 0);
}
protected PhotoItem(Parcel in) {
this.id = in.readString();
this.path = in.readString();
this.isChecked = in.readByte() != 0;
}
public static final Parcelable.Creator<PhotoItem> CREATOR = new Parcelable.Creator<PhotoItem>() {
@Override
public PhotoItem createFromParcel(Parcel source) {
return new PhotoItem(source);
}
@Override
public PhotoItem[] newArray(int size) {
return new PhotoItem[size];
}
};
}
Uploading an image requires the image's file path; an id is also needed to distinguish different images, and isChecked indicates whether the image is selected for upload. The class implements the Parcelable interface so instances can be passed between Activities conveniently.
Next comes the View layer, mainly PhotoPickActivity and the IPhotoPickActivity interface. IPhotoPickActivity has four methods:
public interface IPhotoPickActivity {
void showDialog();
void hideDialog();
void showErrorMessage(String errorMsg);
void showPhotos(List<PhotoItem> photoItems);
}
They respectively show the loading dialog, hide the loading dialog, show an error message, and display the photos queried from the gallery. PhotoPickActivity must implement this interface.
The layout of PhotoPickActivity is as follows:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout
xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#393A3F"
android:orientation="vertical">
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageButton
android:id="@+id/btn_back"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentLeft="true"
android:layout_centerVertical="true"
android:layout_marginLeft="10dp"
android:background="@drawable/btn_back_icon"/>
<TextView
android:id="@+id/tv_title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerVertical="true"
android:layout_marginBottom="10dp"
android:layout_marginLeft="20dp"
android:layout_marginTop="10dp"
android:layout_toRightOf="@+id/btn_back"
android:text="相册图片"
android:textColor="@color/white"
android:textSize="20sp"/>
<Button
android:id="@+id/btn_send_photo"
android:layout_width="70dp"
android:layout_height="35dp"
android:layout_alignParentRight="true"
android:layout_margin="7dp"
android:background="@drawable/btn_green_selector"
android:text="发送"
android:textColor="@color/white"
android:textSize="16sp"/>
</RelativeLayout>
<GridView
android:id="@+id/gdv_photo"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:horizontalSpacing="2dp"
android:numColumns="3"
android:stretchMode="columnWidth"
android:verticalSpacing="2dp"/>
</LinearLayout>
It is essentially a title bar at the top with a GridView below.
The business-logic class is PhotoBiz, which queries the photo gallery. It uses a ContentResolver with the URI MediaStore.Images.Media.EXTERNAL_CONTENT_URI; before querying, it checks that external storage is mounted. The results are sorted by date taken, in descending order:
public class PhotoBiz {
private Context mContext;
public PhotoBiz(@NonNull Context context) {
mContext = context;
}
public List<PhotoItem> getPhotoItems() {
List<PhotoItem> items = new ArrayList<PhotoItem>();
if (Environment.MEDIA_MOUNTED.equals(Environment.getExternalStorageState()) && !Environment.isExternalStorageRemovable()) {
ContentResolver contentResolver = mContext.getContentResolver();
Uri uri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
            //Sort by date taken, in descending order
Cursor cursor = contentResolver.query(uri, null, null, null, MediaStore.Images.Media.DATE_TAKEN + " DESC");
if (cursor != null) {
while (cursor.moveToNext()) {
String id = cursor.getString(cursor.getColumnIndex(MediaStore.Images.Media._ID));
String data = cursor.getString(cursor.getColumnIndex(MediaStore.Images.Media.DATA));
PhotoItem photoItem = new PhotoItem(id, data);
items.add(photoItem);
}
}
}
return items;
}
}
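The `DATE_TAKEN + " DESC"` sort order can be illustrated off-device. This is a minimal standalone sketch (the `Photo` class and sample timestamps here are hypothetical, not from the project) that sorts in-memory photos newest-first, matching the order the cursor returns:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class DateDescDemo {
    static class Photo {
        final String id;
        final long dateTaken;
        Photo(String id, long dateTaken) { this.id = id; this.dateTaken = dateTaken; }
    }

    // Same ordering as the cursor's "DATE_TAKEN DESC": larger timestamps first
    static List<String> newestFirst(List<Photo> photos) {
        List<Photo> sorted = new ArrayList<>(photos);
        sorted.sort(Comparator.comparingLong((Photo p) -> p.dateTaken).reversed());
        List<String> ids = new ArrayList<>();
        for (Photo p : sorted) ids.add(p.id);
        return ids;
    }

    public static void main(String[] args) {
        System.out.println(newestFirst(List.of(
                new Photo("a", 100L), new Photo("b", 300L), new Photo("c", 200L))));
        // prints [b, c, a]
    }
}
```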
PhotoPickPresenter is the presenter class (the P in MVP); it mainly connects the business-logic class with the View, as follows:
public class PhotoPickPresenter {
private Context mContext;
private IPhotoPickActivity mIPhotoPickActivity;
private PhotoBiz mPhotoBiz;
private static final int MSG_GET_PHOTOS_SUC = 1;
private static final int MSG_GET_PHOTOS_ERROR = -1;
private List<PhotoItem> mPhotoItems;
private Handler mHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
switch (msg.what) {
case MSG_GET_PHOTOS_ERROR:
mIPhotoPickActivity.hideDialog();
String errorMsg = (String) msg.obj;
mIPhotoPickActivity.showErrorMessage(errorMsg);
break;
case MSG_GET_PHOTOS_SUC:
mIPhotoPickActivity.hideDialog();
mIPhotoPickActivity.showPhotos(mPhotoItems);
break;
}
}
};
public PhotoPickPresenter(Context context, IPhotoPickActivity iPhotoPickActivity) {
mContext = context;
mIPhotoPickActivity = iPhotoPickActivity;
mPhotoBiz = new PhotoBiz(context);
}
public void getData() {
mIPhotoPickActivity.showDialog();
ThreadPoolManager.getInstance().getThreadPool().execute(new Thread(new Runnable() {
@Override
public void run() {
mPhotoItems = mPhotoBiz.getPhotoItems();
if (mPhotoItems == null || mPhotoItems.size() == 0) {
Message message = Message.obtain();
message.what = MSG_GET_PHOTOS_ERROR;
message.obj = "无法获取用户相册图片信息!";
mHandler.sendMessage(message);
} else {
mHandler.sendEmptyMessage(MSG_GET_PHOTOS_SUC);
}
}
}));
}
}
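Stripped of the Android classes (Handler, thread pool, Context), the presenter's role can be sketched in plain Java. The names below are illustrative analogies, not from the project: the presenter pulls data from a stand-in for PhotoBiz and pushes either photos or an error through a view interface, so the Activity only renders.

```java
import java.util.ArrayList;
import java.util.List;

public class MvpSketch {
    // Analog of IPhotoPickActivity (illustrative names)
    interface PhotoView {
        void showPhotos(List<String> paths);
        void showError(String msg);
    }

    // Analog of PhotoPickPresenter; the List<String> stands in for PhotoBiz
    static class PhotoPresenter {
        private final PhotoView view;
        private final List<String> store;

        PhotoPresenter(PhotoView view, List<String> store) {
            this.view = view;
            this.store = store;
        }

        void getData() {
            if (store.isEmpty()) {
                view.showError("no photos");
            } else {
                view.showPhotos(store);
            }
        }
    }

    // Wires a collecting view to the presenter and returns what was rendered
    static List<String> render(List<String> store) {
        List<String> shown = new ArrayList<>();
        new PhotoPresenter(new PhotoView() {
            @Override public void showPhotos(List<String> paths) { shown.addAll(paths); }
            @Override public void showError(String msg) { shown.add("ERR:" + msg); }
        }, store).getData();
        return shown;
    }

    public static void main(String[] args) {
        System.out.println(render(List.of("/dcim/1.jpg", "/dcim/2.jpg")));
        // prints [/dcim/1.jpg, /dcim/2.jpg]
    }
}
```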
##### Issues to watch out for:
1. How should the width of each GridView item be determined?
The grid is split into three equal columns. Since the spacing between items is set to 2dp, the calculation is as follows:
a. First get the screen width, screenWidth, in pixels.
b. itemSize = (screenWidth - (2 * density + 0.5f)) / 3, where density is the screen density (screen dpi / 160), obtained as follows:
context.getApplicationContext().getResources().getDisplayMetrics().density;
c. Set the width and height dynamically:
final AbsListView.LayoutParams params = new AbsListView.LayoutParams(mPhotoSize, mPhotoSize);
convertView.setLayoutParams(params);
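The size calculation from step b can be checked off-device. This standalone sketch implements the article's formula verbatim (the sample screen width and density are hypothetical values; note the formula subtracts the 2dp gap once, even though three columns have two gaps between them):

```java
public class GridItemSize {
    // Step b verbatim: convert the 2dp gap to px ((int)(2 * density + 0.5f)),
    // then split the remaining width into three columns.
    static int itemSize(int screenWidthPx, float density) {
        int spacingPx = (int) (2 * density + 0.5f);
        return (screenWidthPx - spacingPx) / 3;
    }

    public static void main(String[] args) {
        // Hypothetical 1080-px-wide screen at density 3.0 (480 dpi / 160)
        System.out.println(itemSize(1080, 3.0f));  // prints 358
    }
}
```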
2. CheckBox states getting mixed up because GridView recycles convertView?
Use the PhotoItem's isChecked flag to track whether the corresponding CheckBox is selected. After setting the OnCheckedChangeListener, re-apply the CheckBox's checked state from the PhotoItem's isChecked flag:
viewHolder.mCheckBox.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
photoItem.isChecked = isChecked;
}
});
viewHolder.mCheckBox.setChecked(photoItem.isChecked);
3. The first GridView item not responding because convertView is sized dynamically?
Solution: set convertView's width and height only when it is first inflated (inside the if block), instead of on every getView call:
if (convertView == null) {
convertView = LayoutInflater.from(mContext).inflate(R.layout.item_photo, null);
                //Set the layout params here only; otherwise they are re-applied on every call and the first item stops responding to clicks
final AbsListView.LayoutParams params = new AbsListView.LayoutParams(mPhotoSize, mPhotoSize);
convertView.setLayoutParams(params);
viewHolder = new ViewHolder();
viewHolder.mImageView = (ImageView) convertView.findViewById(R.id.iv_photo);
viewHolder.mCheckBox = (CheckBox) convertView.findViewById(R.id.cb_photo);
convertView.setTag(viewHolder);
} | 43.276316 | 207 | 0.613358 | kor_Hang | 0.28082 |
dc0637d21a9973bfc5eeef7cb77ebc078179a65e | 593 | md | Markdown | firefox/README.md | rainu/docker-browser | 54bf33e01fbbd9287e44013c22e25ac29530da83 | [
"MIT"
] | null | null | null | firefox/README.md | rainu/docker-browser | 54bf33e01fbbd9287e44013c22e25ac29530da83 | [
"MIT"
] | null | null | null | firefox/README.md | rainu/docker-browser | 54bf33e01fbbd9287e44013c22e25ac29530da83 | [
"MIT"
] | null | null | null | # docker-firefox
This contains an up-to-date Firefox (installed from the Ubuntu repository). It is preconfigured to play Flash and to use the PulseAudio server of the Docker host. To start this image you can use the docker-firefox.sh script. There are two modes:
persistent mode - history and custom settings are stored in a host directory (volume)
The settings/history will be stored in ~/.docker/rainu/firefox/ of the host system.
$> docker-firefox.sh -p
throw-away mode - history and other settings are thrown away after Firefox is closed
$> docker-firefox.sh
| 65.888889 | 249 | 0.763912 | eng_Latn | 0.99955 |
dc080cce973fda67e6109f18e8e948b447c399e7 | 3,260 | md | Markdown | AZURE.md | keptn-orders/keptn-orders-scripts | 6470c85ea20d627828f7249d8215ebba7f356d59 | [
"Apache-2.0"
] | 1 | 2019-09-10T09:30:59.000Z | 2019-09-10T09:30:59.000Z | AZURE.md | keptn-orders/keptn-orders-scripts | 6470c85ea20d627828f7249d8215ebba7f356d59 | [
"Apache-2.0"
] | null | null | null | AZURE.md | keptn-orders/keptn-orders-scripts | 6470c85ea20d627828f7249d8215ebba7f356d59 | [
"Apache-2.0"
] | 3 | 2019-09-03T15:16:10.000Z | 2021-09-17T15:39:43.000Z | # Azure bastion host VM
Below are instructions for using the Azure CLI to provison an ubuntu virtual machine on Azure to use for the cluster, keptn, and application setup.
# Create bastion host
These instructions assume you have an Azure subscription and have the AZ CLI installed and configured locally.
You can also make the bastion host from the console and then continue with the steps to connect using ssh. But you must use this image as to have the install scripts be compatible:
* Ubuntu 16.04 LTS
## 1. Install and configure the Azure CLI
On your laptop, run these commands to configure the Azure CLI [Azure docs](https://docs.microsoft.com/en-us/cli/azure/?view=azure-cli-latest)
Once installed, run these commands to configure the cli:
```
# login to your account. This will ask you to open a browser with a code and then login.
az login
# verify you are on the right subscription. Look for "isDefault": true
az account list --output table
```
## 2. Provision bastion host using CLI
On your laptop, run these commands to provision the VM and a resource group
```
# adjust these variables
export VM_GROUP_LOCATION=eastus
export RESOURCE_PREFIX=<example your last name>
# leave these values
export VM_GROUP_NAME="$RESOURCE_PREFIX"-keptn-orders-bastion-group
export VM_NAME="$RESOURCE_PREFIX"-keptn-orders-bastion
# provision the host
az group create --name $VM_GROUP_NAME --location $VM_GROUP_LOCATION
# create the VM
az vm create \
--name $VM_NAME \
--resource-group $VM_GROUP_NAME \
--size Standard_B1s \
--image Canonical:UbuntuServer:16.04-LTS:latest \
--generate-ssh-keys \
--output json \
--verbose
# open port for haproxy to keptn bridge
az vm open-port --resource-group $VM_GROUP_NAME --name $VM_NAME --port 80
```
## 3. SSH bastion host
Goto the Azure console and choose the "connect" menu on the VM row to copy the connection string. Run this command to SSH to the new VM.
```
ssh <your id>@<host ip>
```
## 4. Clone the Orders setup repo
Within the VM, run these commands to clone the setup repo.
```
git clone --branch 0.4.0 https://github.com/keptn-orders/keptn-orders-setup.git --single-branch
cd keptn-orders-setup
```
Finally, proceed to the [Provision Cluster, Install Keptn, and onboard the Orders application](README.md#installation-scripts-from-setup-menu) step.
# Delete bastion host
## Option 1 - delete using azure cli
From your laptop, run these commands to delete the resource group.
This will delete the bastion host resource group and the VM running within it.
```
# adjust these variables
export RESOURCE_PREFIX=<example your last name>
# leave these values
export VM_GROUP_NAME="$RESOURCE_PREFIX"-keptn-orders-bastion-group
az group delete --name $VM_GROUP_NAME --yes
```
## Option 2 - delete from the Azure console
On the resource group page, delete the resource group named '<your prefix>-keptn-orders-bastion-group'.
This will delete the bastion host resource group and the VM running in it.
# az command reference
```
# list of locations
az account list-locations -o table
# list vm VMs
az vm show --name keptn-orders-bastion
# list vm sizes
az vm list-sizes --location eastus -o table
# image types
az vm image list -o table
az vm image show --urn Canonical:UbuntuServer:16.04-LTS:latest
```
| 30.46729 | 181 | 0.756748 | eng_Latn | 0.939381 |
dc092756ff219edc637ef7009273af8b585c61c4 | 302 | md | Markdown | Assets/AzureKinectToolkit/Runtime/FloorDetection/README.md | sotanmochi/AzureKinect4Unity | ce69602bfc183dbb00c7d41566d3cec423f3cce2 | [
"MIT"
] | 20 | 2020-04-19T16:29:32.000Z | 2022-01-02T10:10:02.000Z | Assets/AzureKinectToolkit/Runtime/FloorDetection/README.md | sotanmochi/AzureKinect4Unity | ce69602bfc183dbb00c7d41566d3cec423f3cce2 | [
"MIT"
] | 1 | 2021-07-09T09:06:15.000Z | 2021-07-09T09:06:15.000Z | Assets/AzureKinectToolkit/Runtime/FloorDetection/README.md | sotanmochi/AzureKinect4Unity | ce69602bfc183dbb00c7d41566d3cec423f3cce2 | [
"MIT"
] | 2 | 2021-03-31T11:49:17.000Z | 2021-10-12T15:03:26.000Z | # Floor Plane Detection
This is a ported and modified version of the floor plane detector from Azure Kinect DK Code Samples.
The original source code is available on GitHub.
https://github.com/microsoft/Azure-Kinect-Samples/blob/master/body-tracking-samples/floor_detector_sample/FloorDetector.cpp
| 43.142857 | 123 | 0.821192 | eng_Latn | 0.965073 |
dc09c68f319b67158b726a3e67b5b58ef6f40e0e | 14,298 | md | Markdown | articles/sql-database/sql-database-single-database-migrate.md | changeworld/azure-docs.pt-pt | 8a75db5eb6af88cd49f1c39099ef64ad27e8180d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/sql-database/sql-database-single-database-migrate.md | changeworld/azure-docs.pt-pt | 8a75db5eb6af88cd49f1c39099ef64ad27e8180d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/sql-database/sql-database-single-database-migrate.md | changeworld/azure-docs.pt-pt | 8a75db5eb6af88cd49f1c39099ef64ad27e8180d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Migração da base de dados SQL Server para uma base de dados única/pooled
description: Saiba como sobre a migração da base de dados Do SQL Server para uma única base de dados ou um pool elástico na Base de Dados Azure SQL.
keywords: migração de base de dados, migração de base de dados do sql server, ferramentas de migração de base de dados, migrar base de dados, migrar base de dados sql
services: sql-database
ms.service: sql-database
ms.subservice: migration
ms.custom: ''
ms.devlang: ''
ms.topic: conceptual
author: stevestein
ms.author: sstein
ms.reviewer: carlrab
ms.date: 02/11/2019
ms.openlocfilehash: 9cec91ccc80b9072b1a3da756f26f47eb88b951c
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: MT
ms.contentlocale: pt-PT
ms.lasthandoff: 03/28/2020
ms.locfileid: "79268616"
---
# <a name="sql-server-database-migration-to-azure-sql-database"></a>Migração da base de dados SQL Server para base de dados Azure SQL
Neste artigo, você aprende sobre os métodos primários para migrar um SQL Server 2005 ou posterior base de dados para uma única ou piscina de base de dados na Base de Dados Azure SQL. Para obter informações sobre a migração para uma instância gerida, consulte migrar para a instância do Servidor SQL para a instância gerida pela Base de [Dados Azure SQL](sql-database-managed-instance-migrate.md). Para obter informações sobre migração de outras plataformas, consulte o Guia de Migração da Base de [Dados Azure.](https://datamigration.microsoft.com/)
## <a name="migrate-to-a-single-database-or-a-pooled-database"></a>Migrar para uma única base de dados ou uma base de dados agruparada
Existem dois métodos primários para migrar um SQL Server 2005 ou posterior base de dados para uma única ou piscina de bases de dados na Base de Dados Azure SQL. O primeiro método é mais simples, mas necessita de algum, possivelmente substancial, período de indisponibilidade durante a migração. O segundo método é mais complexo, mas elimina significativamente o período de indisponibilidade durante a migração.
Em ambos os casos, é necessário garantir que a base de dados de origem é compatível com a Base de Dados Azure SQL utilizando o Assistente de [Migração de Dados (DMA)](https://www.microsoft.com/download/details.aspx?id=53595). A Base de Dados SQL V12 está a aproximar-se da paridade de [funcionalidades](sql-database-features.md) com o SQL Server, para além de problemas relacionados com operações de nível de servidor e de base de dados cruzadas. Bases de dados e aplicações que dependem de [funções parcialmente suportadas ou não suportadas](sql-database-transact-sql-information.md) precisam de alguma [reengenharia para corrigir estas incompatibilidades](sql-database-single-database-migrate.md#resolving-database-migration-compatibility-issues) antes que as bases de dados do SQL Server possam ser migradas.
> [!NOTE]
> Para migrar uma base de dados não SQL Server, incluindo Microsoft Access, Sybase, MySQL Oracle e DB2 para a Base de Dados SQL do Azure, veja [SQL Server Migration Assistant (Assistente de Migração do SQL Server)](https://blogs.msdn.microsoft.com/datamigration/2017/09/29/release-sql-server-migration-assistant-ssma-v7-6/).
## <a name="method-1-migration-with-downtime-during-the-migration"></a>Método 1: migração com período de indisponibilidade durante a migração
Utilize este método para migrar para uma única ou uma base de dados agrupada se puder pagar algum tempo de inatividade ou se estiver a realizar uma migração de teste de uma base de dados de produção para posterior migração. Para um tutorial, consulte Migrate uma base de [dados do SQL Server](../dms/tutorial-sql-server-to-azure-sql.md).
A lista seguinte contém o fluxo de trabalho geral para uma migração de base de dados SQL Server de uma única ou uma base de dados agrofada utilizando este método. Para a migração para O Exemplo Gerido, consulte [migração para um instância gerido](sql-database-managed-instance-migrate.md).

1. [Avaliar](https://docs.microsoft.com/sql/dma/dma-assesssqlonprem) a base de dados para compatibilidade utilizando a versão mais recente do Assistente de Migração de [Dados (DMA)](https://www.microsoft.com/download/details.aspx?id=53595).
2. Prepare quaisquer correções necessárias como scripts do Transact-SQL.
3. Faça uma cópia transaccionalmente consistente da base de dados de origem que está a ser migrada ou impeça que novas transações ocorram na base de dados de origem enquanto ocorre a migração. Os métodos para realizar esta última opção incluem desativar a conectividade do cliente ou criar um instantâneo de base de [dados.](https://msdn.microsoft.com/library/ms175876.aspx) Após a migração, poderá utilizar a replicação transacional para atualizar as bases de dados migradas com alterações que ocorrem após o ponto de corte para a migração. Ver [Migrar utilizando migração transacional](sql-database-single-database-migrate.md#method-2-use-transactional-replication).
4. Implemente os scripts do Transact-SQL para aplicar as correções na cópia da base de dados.
5. [Migrar](https://docs.microsoft.com/sql/dma/dma-migrateonpremsql) a cópia da base de dados para uma nova Base de Dados Azure SQL utilizando o Assistente de Migração de Dados.
> [!NOTE]
> Em vez de utilizar O DMA, também pode utilizar um ficheiro BACPAC. Consulte importar um ficheiro BACPAC para uma nova Base de [Dados Azure SQL](sql-database-import.md).
### <a name="optimizing-data-transfer-performance-during-migration"></a>Otimizar o desempenho da transferência de dados durante a migração
A lista seguinte contém recomendações para um melhor desempenho durante o processo de importação.
- Escolha o maior nível de serviço e dimensão de cálculo que o seu orçamento permite maximizar o desempenho da transferência. Pode reduzir verticalmente depois de a migração ser concluída para economizar.
- Minimize a distância entre o seu ficheiro BACPAC e o centro de dados de destino.
- Desative as estatísticas automáticas durante a migração
- Tabelas e índices de partição
- Remova as vistas indexadas e recrie-as quando estiver terminado
- Remova os dados históricos raramente consultados para outra base de dados e migre-os para uma base de dados SQL do Azure separada. Em seguida, pode consultar este dados históricos com [consultas elásticas](sql-database-elastic-query-overview.md).
### <a name="optimize-performance-after-the-migration-completes"></a>Otimizar o desempenho após a migração estar concluída
[Atualize as estatísticas](https://docs.microsoft.com/sql/t-sql/statements/update-statistics-transact-sql) com uma análise completa após a migração estar concluída.
## <a name="method-2-use-transactional-replication"></a>Método 2: Utilizar a Replicação Transacional
Quando não pode remover a base de dados do SQL Server da produção enquanto a migração está a ocorrer, pode utilizar a replicação transacional do SQL Server como a sua solução de migração. Para utilizar este método, a base de dados de origem tem de cumprir os [requisitos para a replicação transacional](https://msdn.microsoft.com/library/mt589530.aspx) e ser compatível com a Base de Dados SQL do Azure. Para obter informações sobre a replicação do SQL com Always On, consulte [A Replicação de Configurar para Grupos Sempre Na Disponibilidade (Servidor SQL)](/sql/database-engine/availability-groups/windows/configure-replication-for-always-on-availability-groups-sql-server).
Para utilizar esta solução, configure a Base de Dados SQL do Azure como um subscritor para a instância do SQL Server que pretende migrar. O distribuidor de replicação transacional sincroniza os dados da base de dados a serem sincronizados (o editor) enquanto continuam a ocorrer novas transações.
Com a replicação transacional, todas as alterações aos seus dados ou esquema aparecem na sua Base de Dados SQL do Azure. Assim que a sincronização estiver concluída e estiver pronto para migrar, altere a cadeia de ligação das suas aplicações para apontá-las para a Base de Dados SQL do Azure. Assim que a replicação transacional drena quaisquer alterações restantes na sua base de dados de origem e todas as suas aplicações apontam para a BD do Azure, pode desinstalar a replicação transacional. A Base de Dados SQL do Azure é agora o seu sistema de produção.

> [!TIP]
> Também pode utilizar a replicação transacional para migrar um subconjunto da sua base de dados de origem. A publicação que replica para a Base de Dados SQL do Azure pode ser limitada a um subconjunto de tabelas na base de dados a ser replicada. Para cada tabela a ser replicada, pode limitar os dados a um subconjunto das linhas e/ou um subconjunto das colunas.
## <a name="migration-to-sql-database-using-transaction-replication-workflow"></a>Migração para a Base de Dados SQL com fluxo de trabalho de Replicação de Transação
> [!IMPORTANT]
> Utilize a versão mais recente do SQL Server Management Studio, para permanecer sincronizado com as atualizações do Microsoft Azure e da Base de Dados SQL. As versões anteriores do SQL Server Management Studio não podem configurar a Base de Dados SQL como um subscritor. [Atualize o SQL Server Management Studio](https://msdn.microsoft.com/library/mt238290.aspx).
1. Configurar Distribuição
- [Com o SQL Server Management Studio (SSMS)](https://msdn.microsoft.com/library/ms151192.aspx#Anchor_1)
- [Com o Transact-SQL](https://msdn.microsoft.com/library/ms151192.aspx#Anchor_2)
2. Criar Publicação
- [Com o SQL Server Management Studio (SSMS)](https://msdn.microsoft.com/library/ms151160.aspx#Anchor_1)
- [Com o Transact-SQL](https://msdn.microsoft.com/library/ms151160.aspx#Anchor_2)
3. Criar Subscrição
- [Com o SQL Server Management Studio (SSMS)](https://msdn.microsoft.com/library/ms152566.aspx#Anchor_0)
- [Com o Transact-SQL](https://msdn.microsoft.com/library/ms152566.aspx#Anchor_1)
Algumas sugestões e diferenças de migração para a Base de Dados SQL
- Utilizar um distribuidor local
- Fazê-lo causa um impacto de desempenho no servidor.
- Se o impacto no desempenho for inaceitável pode utilizar outro servidor, mas aumenta a complexidade na gestão e na administração.
- Quando seleciona uma pasta de instantâneos, certifique-se de que a pasta que seleciona é suficientemente grande para conter um BCP de cada tabela que pretende replicar.
- A criação do instantâneo bloqueia as tabelas associadas até estar concluída, por isso, agende adequadamente o instantâneo.
- Apenas são suportadas subscrições push na Base de Dados SQL do Azure. Apenas pode adicionar subscritores da base de dados de origem.
## <a name="resolving-database-migration-compatibility-issues"></a>Corrigir problemas de compatibilidade de migração de bases de dados
Existe uma ampla variedade de problemas de compatibilidade que pode encontrar, dependendo da versão do SQL Server na base de dados de origem e da complexidade da base de dados que está a migrar. As versões mais antigas do SQL Server têm mais problemas de compatibilidade. Utilize os seguintes recursos, para além de uma pesquisa na Internet direcionada com o seu motor de busca preferido:
- [Funcionalidades da base de dados do SQL Server não suportadas na Base de Dados SQL do Azure](sql-database-transact-sql-information.md)
- [Funcionalidade do Motor da Base de Dados Descontinuada no SQL Server 2016](https://msdn.microsoft.com/library/ms144262%28v=sql.130%29)
- [Funcionalidade do Motor da Base de Dados Descontinuada no SQL Server 2014](https://msdn.microsoft.com/library/ms144262%28v=sql.120%29)
- [Funcionalidade do Motor da Base de Dados Descontinuada no SQL Server 2012](https://msdn.microsoft.com/library/ms144262%28v=sql.110%29)
- [Funcionalidade do Motor da Base de Dados Descontinuada no SQL Server 2008 R2](https://msdn.microsoft.com/library/ms144262%28v=sql.105%29)
- [Funcionalidade do Motor da Base de Dados Descontinuada no SQL Server 2005](https://msdn.microsoft.com/library/ms144262%28v=sql.90%29)
Para além de procurar na Internet e utilizar estes recursos, utilize os [fóruns da comunidade do SQL Server do MSDN](https://social.msdn.microsoft.com/Forums/sqlserver/home?category=sqlserver) ou o [StackOverflow](https://stackoverflow.com/).
> [!IMPORTANT]
> A SQL Database Managed Instance permite migrar uma instância de Servidor SQL existente e as suas bases de dados com problemas mínimos ou sem compatibilidade. Veja [o que é uma instância gerida.](sql-database-managed-instance.md)
## <a name="next-steps"></a>Passos seguintes
- Utilize o script no blogue de Engenheiros do Azure SQL EMEA para [Monitorizar a utilização de tempdb durante a migração](https://blogs.msdn.microsoft.com/azuresqlemea/2016/12/28/lesson-learned-10-monitoring-tempdb-usage/).
- Utilize o script no blogue de Engenheiros do Azure SQL EMEA para [Monitorizar o espaço do registo de transações da base de dados enquanto a migração está a ocorrer](https://docs.microsoft.com/archive/blogs/azuresqlemea/lesson-learned-7-monitoring-the-transaction-log-space-of-my-database).
- Para saber mais sobre a migração com ficheiros BACPAC num blogue da Equipa de Aconselhamento ao Cliente do SQL Server, consulte [Migrating from SQL Server to Azure SQL Database using BACPAC Files (Migrar a partir do SQL Server para a Base de Dados SQL do Azure com Ficheiros BACPAC)](https://blogs.msdn.microsoft.com/sqlcat/2016/10/20/migrating-from-sql-server-to-azure-sql-database-using-bacpac-files/).
- Para obter informações sobre como trabalhar com a hora UTC após a migração, consulte [Modifying the default time zone for your local time zone (Modificar o fuso horário predefinido para o fuso horário local)](https://blogs.msdn.microsoft.com/azuresqlemea/2016/07/27/lesson-learned-4-modifying-the-default-time-zone-for-your-local-time-zone/).
- Para obter informações sobre como alterar o idioma predefinido de uma base de dados após a migração, consulte [How to change the default language of Azure SQL Database (Como alterar o idioma predefinido da Base de Dados SQL do Azure)](https://blogs.msdn.microsoft.com/azuresqlemea/2017/01/13/lesson-learned-16-how-to-change-the-default-language-of-azure-sql-database/).
| 111.703125 | 811 | 0.799832 | por_Latn | 0.997494 |
dc0a92883c30b8ed4b962bfca56fd39d69722933 | 283 | md | Markdown | _collections/_attack_groups/G0048.md | MITRECND/shield-website | 4ec0d710f63af99a37d0bdcad58ab625d4bbe717 | [
"Apache-2.0"
] | 18 | 2021-08-18T22:56:22.000Z | 2022-03-26T14:09:05.000Z | _collections/_attack_groups/G0048.md | MITRECND/shield-website | 4ec0d710f63af99a37d0bdcad58ab625d4bbe717 | [
"Apache-2.0"
] | 3 | 2020-08-24T19:13:13.000Z | 2021-12-25T05:56:54.000Z | _collections/_attack_groups/G0048.md | MITRECND/shield-website | 4ec0d710f63af99a37d0bdcad58ab625d4bbe717 | [
"Apache-2.0"
] | 10 | 2021-03-22T07:29:59.000Z | 2022-02-26T23:03:46.000Z | ---
_name: "RTM"
_id: "G0048"
aliases: "RTM"
description: "RTM is a cybercriminal group that has been active since at least 2015 and is primarily interested in users of remote banking systems in Russia and neighboring countries. The group uses a Trojan by the same name (RTM). "
---
| 40.428571 | 233 | 0.75265 | eng_Latn | 0.999799 |
dc0b473f26db53d2bcd1ed85b3b5b776bfdaec0c | 9,839 | md | Markdown | _posts/2018-09-12-major-picroft-update-moving-to-raspbian-stretch.md | gingerling/docs-rewrite | 87b08731591df1c43ce2ef4e18fd198cf54e1dc5 | [
"Apache-2.0"
] | null | null | null | _posts/2018-09-12-major-picroft-update-moving-to-raspbian-stretch.md | gingerling/docs-rewrite | 87b08731591df1c43ce2ef4e18fd198cf54e1dc5 | [
"Apache-2.0"
] | null | null | null | _posts/2018-09-12-major-picroft-update-moving-to-raspbian-stretch.md | gingerling/docs-rewrite | 87b08731591df1c43ce2ef4e18fd198cf54e1dc5 | [
"Apache-2.0"
] | null | null | null | ---
ID: 40526
post_title: 'Major Picroft Update – Moving to Raspbian Stretch'
author: Steve Penrod
post_excerpt: ""
layout: post
permalink: >
http://mycroft.ai/blog/major-picroft-update-moving-to-raspbian-stretch/
published: true
post_date: 2018-09-12 20:00:57
---
<h1><span style="font-weight: 400;">It’s Alive!</span></h1>
There's a brand new Picroft image available! <span style="font-weight: 400;"><a href="https://github.com/MycroftAI/enclosure-picroft/tree/stretch" target="_blank" rel="noopener">Click here to get setup instructions.</a> Read on to learn more.</span>
<span style="font-weight: 400;">In late 2016 I created the </span><a href="https://mycroft.ai/blog/mycroft-now-available-raspberry-pi-image/"><span style="font-weight: 400;">original Picroft</span></a><span style="font-weight: 400;"> as a way for anyone to interact with Mycroft, not just those who were able to purchase the Mark 1. Since then, Picroft has been downloaded more than 35,000 times. It has become the basis for numerous Raspberry Pi projects, like </span><a href="https://community.mycroft.ai/t/creating-my-first-skill-with-essentially-no-experience-mycroft-magicmirror-skill/3526/" target="_blank" rel="noopener"><span style="font-weight: 400;">Magic Mirrors</span></a><span style="font-weight: 400;">, </span><a href="https://www.hackster.io/gov/hey-mycroft-turn-on-the-lamp-5e2d12" target="_blank" rel="noopener"><span style="font-weight: 400;">home automation systems</span></a><span style="font-weight: 400;">, </span><a href="https://community.mycroft.ai/t/first-steps-into-the-big-wide-world/3623" target="_blank" rel="noopener"><span style="font-weight: 400;">robots</span></a><span style="font-weight: 400;">, and hundreds of other wonderful testaments to creativity.</span>
<span style="font-weight: 400;">Picroft was designed to self-update, and we did publish some new versions periodically to simplify moving to major new releases. It has held up well and essentially remained the same for the last two years.</span>
<span style="font-weight: 400;">However, the new Raspberry Pi 3B+ introduced new chipsets that broke Picroft. The original Pi 3 is still available, but this was a good reason to show Picroft some love and introduce some cool new stuff.</span>
[caption id="attachment_40527" align="alignnone" width="1316"]<a href="https://mycroft.ai/wp-content/uploads/2018/09/lightning-2018-08-01.jpg"><img class="wp-image-40527 size-full" src="https://mycroft.ai/wp-content/uploads/2018/09/lightning-2018-08-01.jpg" alt="The electricity that gave Picroft life will be how Picroft gives life to your Raspberry Pi projects." width="1316" height="640" /></a> This was the sky the night I started rebuilding the Picroft image. Fitting, right?[/caption]
<h1><span style="font-weight: 400;">What’s New?</span></h1>
<span style="font-weight: 400;">Lots of little things, and a few big things. Once things are up and running, it will still be Mycroft -- a voice interface. The main changes are at the setup and “plumbing” level. We also listened to the findings of our </span><a href="https://mycroft.ai/blog/picroft-survey-2018-results/" target="_blank" rel="noopener"><span style="font-weight: 400;">Picroft survey</span></a><span style="font-weight: 400;"> -- thanks to all who participated!</span>
<h3><span style="font-weight: 400;">Stretch’ing ourselves</span></h3>
<span style="font-weight: 400;">With the release of the Raspberry Pi 3B+ hardware changes required a new release of the Raspbian operating system. The new version is referred to as “Stretch” (following the tradition of Toy Story references). Picroft is now based on this latest version of Raspbian, allowing support for both the Raspberry Pi 3 and the 3B+.</span>
<h3><span style="font-weight: 400;">No more packages, go Git it</span></h3>
<span style="font-weight: 400;">To simplify and unify the Mycroft experience on the various platforms, we rethought how Mycroft should sit on the Raspberry Pi. At the low level, we switched from a Debian package install (which was originally a hack of the Mark 1 package that just disabled the faceplate) to a Github-based install that can be set to automatically update.</span>
<span style="font-weight: 400;">This unifies the Picroft environment with what most desktop developers are using, allowing us to simplify documentation.</span>
<h3><span style="font-weight: 400;">Wizards!</span></h3>
<span style="font-weight: 400;">One of the recurring issues users encounter is difficulty in configuring the audio pieces. Linux is incredibly flexible, but </span><i><span style="font-weight: 400;">nobody</span></i><span style="font-weight: 400;"> wants to deal with the audio subsystems if they can avoid it. Configuring sound drivers is full of alchemy and dark magic. Connecting to all the different kinds of wifi security protocols is a whole different book of incantations. So we are including a Wizard with Picroft to help cast these spells!</span>
<span style="font-weight: 400;">During the initial setup, the user is now presented with a series of simple questions to configure the network connection, pick the type of microphone and audio output they want to use, and configure security options based on their intended use. Configuration happens automatically, and tests immediately verify the speaker output and sound levels, and the microphone input.</span>
<h3><span style="font-weight: 400;">MOAR Microphones</span></h3>
<span style="font-weight: 400;">We have a great community who have proven that Mycroft can operate with nearly any kind of hardware. Over the next few days, the Picroft image will incorporate support for the Google AIY Voice Kit array microphone. In the future, there will be an easy path for automating the install of other microphones, such as the <a href="https://www.matrix.one/products/creator" target="_blank" rel="noopener">Matrix Creator</a> and <a href="https://www.seeedstudio.com/ReSpeaker-4Mic-Array-for-Raspberry-Pi-p-2941.html" target="_blank" rel="noopener">Seeed ReSpeaker</a>.</span>
<h3><span style="font-weight: 400;">Environment Helpers</span></h3>
<span style="font-weight: 400;">As a developer, I feel justified in poking fun every time one of us answers a question by saying something like:</span>
<i><span style="font-weight: 400;">Oh, that is really simple. First <code>pip install pyjokes</code>. Then just type <code>python -m test.integrationstests.skills.runner</code>. You are in the virtual environment, right? Oh, well you just need to do <code>source /opt/venvs/mycroft-core/bin/activate</code> before that. Still not working? Oh, were you in the venv when you did the pip install? ...</span></i>
<span style="font-weight: 400;">I’ll let you in on a secret -- this is also a pain for us and we probably mistype these commands half the time.</span>
<span style="font-weight: 400;">As part of this overhaul, we’ve unified most Mycroft related commands under a <code>mycroft-</code> prefix. This is particularly helpful because you can now take advantage of Linux’s excellent auto-completion by typing <code>my<TAB></code> to get to <code>mycroft-</code>, then hit TAB two more times to see all the options available for you.</span>
<span style="font-weight: 400;">Helpers include:</span>
<ul>
<li><span style="font-weight: 400;">mycroft-help</span> <span style="font-weight: 400;">Get brief summaries of the commands</span></li>
<li><span style="font-weight: 400;">mycroft-pip</span> <span style="font-weight: 400;">Install Python packages inside the Mycroft venv</span></li>
<li><span style="font-weight: 400;">mycroft-msm</span> <span style="font-weight: 400;">Invoke the Mycroft Skills Manager</span></li>
<li><span style="font-weight: 400;">mycroft-msk</span> <span style="font-weight: 400;">Mycroft Skill Kit, submit and manage skills in the community repos</span></li>
<li><span style="font-weight: 400;">mycroft-speak</span> <span style="font-weight: 400;">Make mycroft speak for you (handy inside scripts)</span></li>
<li><span style="font-weight: 400;">mycroft-say-to</span> <span style="font-weight: 400;">Send a string to Mycroft, just like you spoke it (also handy for scripts)</span></li>
</ul>
<span style="font-weight: 400;">These same tools are also available under all other environments now.</span>
<h1><span style="font-weight: 400;">What Else Can You Look Forward To?</span></h1>
<span style="font-weight: 400;">We’ll be providing Skills that are unique to the Picroft environment, allowing you to attach buttons to the GPIO pins to emulate the Mark 1 button. Or use this example to do whatever else you might want to do -- this is your Picroft!</span>
<span style="font-weight: 400;">As mentioned above, we’d like to expand the microphone and audio hardware supported by Picroft. With a little help from community members, I expect we’ll have support for various USB and Bluetooth mics and speakers soon. <a href="https://community.mycroft.ai/t/major-picroft-update-moving-to-raspbian-stretch/4522" target="_blank" rel="noopener">Join us on the forums</a> to give feedback on AIY integration, how to pipe in new mics, and what you'd like to connect over GPIO.</span>
<h1><span style="font-weight: 400;">Gimme, Gimme, Gimme!!!</span></h1>
<span style="font-weight: 400;">You can download the new Picroft image from </span><a href="https://mycroft.ai/to/picroft-image" target="_blank" rel="noopener">https://mycroft.ai/to/picroft-image</a><span style="font-weight: 400;">. Learn all about it (including exactly how it was built) from </span><a href="https://github.com/MycroftAI/enclosure-picroft/tree/stretch" target="_blank" rel="noopener">https://github.com/MycroftAI/enclosure-picroft/tree/stretch</a><span style="font-weight: 400;">.</span>
<span style="font-weight: 400;">Go, download, and hack!</span> | 158.693548 | 1,197 | 0.751397 | eng_Latn | 0.978464 |
dc0ba91bb48c1877e35ebe6d14f6d99361b9dce7 | 4,293 | md | Markdown | articles/multi-factor-authentication/multi-factor-authentication-get-started-cloud.md | lettucebo/azure-content-zhtw | f9aa6105a8900d47dbf27a51fd993c44a9af5b63 | [
"CC-BY-3.0"
] | null | null | null | articles/multi-factor-authentication/multi-factor-authentication-get-started-cloud.md | lettucebo/azure-content-zhtw | f9aa6105a8900d47dbf27a51fd993c44a9af5b63 | [
"CC-BY-3.0"
] | null | null | null | articles/multi-factor-authentication/multi-factor-authentication-get-started-cloud.md | lettucebo/azure-content-zhtw | f9aa6105a8900d47dbf27a51fd993c44a9af5b63 | [
"CC-BY-3.0"
] | null | null | null | <properties
	pageTitle="Get started with Microsoft Azure Multi-Factor Authentication in the cloud"
	description="This Microsoft Azure Multi-Factor Authentication page describes how to get started with Azure MFA in the cloud."
services="multi-factor-authentication"
documentationCenter=""
authors="kgremban"
manager="femila"
editor="curtand"/>
<tags
ms.service="multi-factor-authentication"
ms.workload="identity"
ms.tgt_pltfrm="na"
ms.devlang="na"
ms.topic="get-started-article"
ms.date="08/15/2016"
ms.author="kgremban"/>
# Get started with Azure Multi-Factor Authentication in the cloud

In the following article, you will learn how to get started with Azure Multi-Factor Authentication in the cloud.

> [AZURE.NOTE] The documentation below provides details on how to enable users by using the **Azure classic portal**. If you are looking for details on how to set up Azure Multi-Factor Authentication for O365 users, see [Set up multi-factor authentication for Office 365 users](https://support.office.com/article/Set-up-multi-factor-authentication-for-Office-365-users-8f0454b2-f51a-4d9c-bcde-2c48e41621c6?ui=zh-TW&rs=zh-TW&ad=US).

## Prerequisites

The following prerequisites are required before you can enable Azure Multi-Factor Authentication for your users.

1. [Sign up for an Azure subscription](https://azure.microsoft.com/pricing/free-trial/) - If you don't already have an Azure subscription, you can sign up for free. If you are just getting started with Azure MFA, you can use a trial subscription.
2. [Create a Multi-Factor Auth Provider](multi-factor-authentication-get-started-auth-provider.md) and assign it to a directory, or [assign licenses to users](multi-factor-authentication-get-started-assign-licenses.md).

> [AZURE.NOTE] Licenses apply to users who have Azure MFA, Azure AD Premium, or the Enterprise Mobility Suite (EMS). MFA is included in Azure AD Premium and EMS. If you have enough licenses, you do not need to create an Auth Provider.
## Turn on multi-factor authentication for users

To turn on multi-factor authentication for a user, you simply change the user's state from disabled to enabled. For more information about user states, see [User states in Azure Multi-Factor Authentication](multi-factor-authentication-get-started-user-states.md).

You can use the following procedure to enable MFA for your users.

### To turn on multi-factor authentication

--------------------------------------------------------------------------------

1. Sign in to the **Azure classic portal** as an administrator.
2. On the left, click **Active Directory**.
3. Under **Directory**, click the directory of the user you want to enable.
4. At the top, click **Users**.
5. At the bottom of the page, click **Manage Multi-Factor Auth**.
6. This opens a new browser tab. Find the user you want to enable for multi-factor authentication. You may need to change the view at the top. Make sure the status is **Disabled**.
7. **Check** the box next to the user's name.
8. On the right, click **Enable**.
9. Click **Enable multi-factor auth**.
10. You should see the user's status change from **Disabled** to **Enabled**.
11. After you have enabled your users, we recommend notifying them by email. The email should also tell them how to use their non-browser applications so that they can avoid being locked out.
## Use PowerShell to turn on multi-factor authentication automatically

To change the [state](multi-factor-authentication-whats-next.md) by using [Azure AD PowerShell](../powershell-install-configure.md), you can use the following code. You can change `$st.State` to one of the following states:

- Enabled
- Enforced
- Disabled

> [AZURE.IMPORTANT] Note that if you go directly from the Disabled state to the Enforced state, non-modern authentication clients will stop working, because the user has not completed MFA registration and obtained an [app password](multi-factor-authentication-whats-next.md#app-passwords). If you have non-modern authentication clients and need app passwords, we recommend changing from the Disabled state to the Enabled state, which lets users register and obtain their app passwords.

    $st = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationRequirement
    $st.RelyingParty = "*"
    $st.State = "Enabled"
    $sta = @($st)
    Set-MsolUser -UserPrincipalName [email protected] -StrongAuthenticationRequirements $sta

Using PowerShell is a good option for enabling users in bulk. Currently there is no bulk-enable feature in the Azure portal, so you must select each user individually. This is quite a chore if you have many users. By creating a PowerShell script based on the code above, you can loop through a list of users and enable them. Here is an example:
    $users = "[email protected]","[email protected]","[email protected]"
    foreach ($user in $users)
    {
        $st = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationRequirement
        $st.RelyingParty = "*"
        $st.State = "Enabled"
        $sta = @($st)
        Set-MsolUser -UserPrincipalName $user -StrongAuthenticationRequirements $sta
    }
For more information about user states, see [User states in Azure Multi-Factor Authentication](multi-factor-authentication-get-started-user-states.md).

## Next steps

Now that you have set up multi-factor authentication in the cloud, you can configure and install your deployment. See [Configure Azure Multi-Factor Authentication].
<!---HONumber=AcomDC_0921_2016--> | 46.16129 | 328 | 0.746797 | yue_Hant | 0.603107 |
dc0c6238bb80520a772ad222968da8cbaeeeaafe | 7,567 | md | Markdown | README.md | SergeyMi37/zapm | 190cc8a560780e41316381936ad2ae13d14ff70d | [
"MIT"
] | 5 | 2021-01-25T17:35:46.000Z | 2022-03-11T08:32:25.000Z | README.md | SergeyMi37/zapm | 190cc8a560780e41316381936ad2ae13d14ff70d | [
"MIT"
] | 1 | 2021-11-17T05:22:34.000Z | 2021-11-17T05:22:34.000Z | README.md | SergeyMi37/zapm | 190cc8a560780e41316381936ad2ae13d14ff70d | [
"MIT"
] | 2 | 2021-01-27T09:15:41.000Z | 2021-11-16T18:39:10.000Z | 
## zapm
[](https://openexchange.intersystems.com/package/zapm)
[](https://github.com/SergeyMi37/zapm)
[](https://community.intersystems.com/post/zapm-shell-extends-zpm-shell-and-adds-any-other-commands)
[](https://www.intersystems.com/products/intersystems-iris/)
[](https://www.intersystems.com/products/cache/)
[](https://www.intersystems.com/products/ensemble/)
<img alt="GitHub last commit" src="https://img.shields.io/github/last-commit/SergeyMi37/zapm">
[](https://opensource.org/licenses/MIT)
ZAPM is a shell that extends the ZPM shell and adds arbitrary extra commands.
Working in the terminal, I got tired of switching from my shell to the ZPM shell and back.
My shell already carried additional, specific commands, so I decided to merge the two shells.
If the command you enter is a ZPM command, ZAPM passes it to the ZPM shell for execution.
I then wanted to improve the colored output and expand the functionality.
ZAPM has now reached its first version: it can register any command I need and remember it so that I can re-execute it later.
## What's new
Added support for the new versions of zpm + hspm
Updated escape sequence color scheme

Added new command: info



```
Available commands extension:
-----------------------------
newdb <module>
Create a new database and a Namespace with a name derived from the module name, and install the module into it,
dbcreate namespace <path>
Create a new database and a Namespace,
dbcreate testdb2 -p d:/!/database/durable
dbdelete namespace
Delete a database and its Namespace,
info
Show more complete information about modules in the current namespace.
info -m module
Show file module.xml.
info -f module
List the files in the repository.
info -r module
Show file readme.md.
upg
Upgrade the versions of modules installed in the current namespace.
load http://git-repo/developer-name/repo-name
Load the module directly from the repository into the current Namespace. The 'git clone' command is applied. The git program must be installed.
find app* -d    Load descriptions from modules; the description is always displayed
find app* -u /rcemper    Show only modules that include the context in their repository URL
cmd
Alias: ?
Show all commands.
cmd <context>
Show all commands including context.
cmd -init
Reload all commands. Run do ##class(%ZAPM.ext.zapp).init()
hist
Alias: ??
Show all history.
hist <context>
Show all history including context.
hist - <context>
Show all history including context. Sorting by date
hist + <context>
Show all history including context. Reverse sorting by date
hist -del Number_day
Delete all history older than the number of days.
hist -add Number_hist
Add a history entry to the non-removable list.
hist -add Number_hist [name_cmd] [a/i/n] description
Add a history entry to the list of commands.
```
## Installation with ZPM
If ZPM is not installed on the current instance, you can install the latest version of ZPM with one line.
```
zn "%SYS" d ##class(Security.SSLConfigs).Create("z") s r=##class(%Net.HttpRequest).%New(),r.Server="pm.community.intersystems.com",r.SSLConfiguration="z" d r.Get("/packages/zpm/latest/installer"),$system.OBJ.LoadStream(r.HttpResponse.Data,"c")
```
If ZPM is installed, then ZAPM can be installed with the command
```
zpm:USER>install zapm
```
## Installation with Docker
### Prerequisites
Make sure you have [git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) and [Docker desktop](https://www.docker.com/products/docker-desktop) installed.
## Installation
Clone/git pull the repo into any local directory
```
$ git clone https://github.com/SergeyMi37/zapm.git
```
Open the terminal in this directory and run:
```
$ docker-compose build
```
Run the IRIS container with your project:
```
$ docker-compose up -d
```
## How to Test it
Open IRIS terminal:
```
$ docker-compose exec iris iris session iris
USER>
USER>zapm "cmd"
```

## Command extensions zpm.
## help - coloring command description

## load <https...git-repo> - loading the module directly from the git-repository

## show modules with context by repository URL

## Additional commands.
Commands for working with databases and namespaces:
## dbcreate - create a database with %DB resources and a namespace with interoperability mapping.
```
USER>zapm "dbcreate testdb3 -p d:/!/database/durable"
```
## dbdelete - delete a database with %DB resources and its namespace.
```
USER>zapm "dbdelete testdb3"
```
## cmd - list of possible additional commands

## hist - list of executed commands

## newdb <module> - create a database and namespace and install the module there
To add a new command to the zapm shell, use the ##class(%ZAPM.ext.zapp).addcmd method.
For example, let's execute the following steps in sequence:
- create a database and namespace and install the zpmshow module there
```
hp-msw>IRISZPM>USER> newdb zpmshow
Creating Database zpmshow... done!
Creating Namespace zpmshow... done!
Creating Interoperability mappings ... done!
Adding Interoperability SQL privileges ... done!
Creating CSP Application ... done!
zpm "install zpmshow"
[zpmshow] Reload START
[zpmshow] Reload SUCCESS
[zpmshow] Module object refreshed.
[zpmshow] Validate START
[zpmshow] Validate SUCCESS
[zpmshow] Compile START
[zpmshow] Compile SUCCESS
[zpmshow] Activate START
[zpmshow] Configure START
[zpmshow] Configure SUCCESS
[zpmshow] Activate SUCCESS
```
- add a new command named zshow, which should be executed immediately.
```
do ##class(%ZAPM.ext.zapp).addcmd("new $namespace zn ""zpmshow"" do ^zpmshow", "zpm", "i", "zshow", "Show a zpm modules with extention description")
added
```

- check the execution of the new command from the system shell
USER>zapm "zshow"
- or from the zapm shell

This solution can replace not only the zpm shell but also the main terminal shell.
For me, it almost has ;-)
| 33.334802 | 243 | 0.746267 | eng_Latn | 0.80632 |
dc0c761b6a7b81b1495e13c1798571a0b04081eb | 31 | md | Markdown | README.md | Maxleco/Clone-Uber | ce731d96c50f2a7bbc9e6ca067f8ea403aebdf99 | [
"MIT"
] | null | null | null | README.md | Maxleco/Clone-Uber | ce731d96c50f2a7bbc9e6ca067f8ea403aebdf99 | [
"MIT"
] | null | null | null | README.md | Maxleco/Clone-Uber | ce731d96c50f2a7bbc9e6ca067f8ea403aebdf99 | [
"MIT"
] | null | null | null | # Clone-Uber
Clone of the Uber app
| 10.333333 | 17 | 0.741935 | yue_Hant | 0.736523 |
dc0cae6d9d4775b5281ac3cde64b5c5872804322 | 27 | md | Markdown | demo3/README.md | syl2091/SpringAll | f5f3b60b3ddd529cfcff9f959fe755bde45fefe4 | [
"MIT"
] | 2 | 2021-09-06T02:46:04.000Z | 2021-11-09T12:07:11.000Z | demo3/README.md | syl2091/SpringAll | f5f3b60b3ddd529cfcff9f959fe755bde45fefe4 | [
"MIT"
] | null | null | null | demo3/README.md | syl2091/SpringAll | f5f3b60b3ddd529cfcff9f959fe755bde45fefe4 | [
"MIT"
] | null | null | null | Spring Boot 整合Mybatis CRUD | 27 | 27 | 0.851852 | kor_Hang | 0.858642 |
dc0d4ce6c1ca71112ceb236cd292996098ff0ba6 | 938 | md | Markdown | README.md | brauliovillalobos/ADM_HW5 | 317cca2c665f0a21a73853a2066a8c711d3c50eb | [
"MIT"
] | null | null | null | README.md | brauliovillalobos/ADM_HW5 | 317cca2c665f0a21a73853a2066a8c711d3c50eb | [
"MIT"
] | null | null | null | README.md | brauliovillalobos/ADM_HW5 | 317cca2c665f0a21a73853a2066a8c711d3c50eb | [
"MIT"
] | null | null | null | # ADM_HW5
This repository contains the Homework 5 of the Algorithmic Methods of Data Mining course. Following, you can find the description of the files contained in this repo:
* Part_1.ipynb: contains the code to merge the three datasets into one single graph.
* main5.ipynb: contains the code needed to run Functionality 1, 2, 3, 4 and the algorithmic question. In the case of the three functionalities, you can also find their visualization in here.
* support_functions4.py: contains the functions needed to run the Functionality 4 - Disconnecting graphs.
* utilities.py: contains the functions needed to run the Functionality 2 - Finding the best users.
* images : a dir containing some plots for functionality 2.
* Functionality dirs : contains the code for func 1, 2, 3,4 separated. In particular in Func 2 dir there is a betweeness example for a graph bigger then the one tested in main5.ipynb ( that took long to be computed).
| 104.222222 | 216 | 0.78678 | eng_Latn | 0.999543 |
dc0ea9adeb6951e1ca46c2c10158cc40d6ae2aec | 559 | md | Markdown | _posts/2014-04-22-39-8028.md | ashmaroli/mycad.github.io | 47fb8e82e4801dd33a3d4335a3d4a6657ec604cf | [
"MIT"
] | null | null | null | _posts/2014-04-22-39-8028.md | ashmaroli/mycad.github.io | 47fb8e82e4801dd33a3d4335a3d4a6657ec604cf | [
"MIT"
] | null | null | null | _posts/2014-04-22-39-8028.md | ashmaroli/mycad.github.io | 47fb8e82e4801dd33a3d4335a3d4a6657ec604cf | [
"MIT"
] | null | null | null | ---
layout: post
amendno: 39-8028
cadno: CAD2014-B737-11
title: Inspect the flap carriage spindles
date: 2014-04-22 00:00:00 +0800
effdate: 2014-04-22 00:00:00 +0800
tag: B737
categories: Airworthiness Certification Division, CAAC North China Regional Administration
author: Meng Qingsong
---
## Applicability:

This directive applies to all Boeing 737-300, -400, and -500 airplanes registered in the People's Republic of China.

Note 1: This airworthiness directive applies to all airplanes of the models listed above, regardless of whether the areas addressed by the requirements of this directive have been modified, altered, or repaired. For airplanes that have been modified, altered, or repaired, if the modification, alteration, or repair affects the accomplishment of the requirements of this directive, the equivalent method adopted by the owner/operator must be approved in accordance with the requirements of paragraph P of this directive. That method must include an assessment of the effect of the modification, alteration, or repair on the unsafe condition addressed by this directive; and, if the unsafe condition has not been eliminated, the request must include specific proposed actions to address that unsafe condition.

Note 2: Modification under FAA STC ST01219SE does not affect compliance with the requirements of this directive, so no alternative method of compliance (AMOC) application is necessary for the STC ST01219SE modification. All other AMOC applications must be approved in accordance with the procedures required by paragraph P of this directive.
| 31.055556 | 208 | 0.838998 | yue_Hant | 0.78238 |
dc0ece87e53c7078b88b8bb046bfe6285784eb99 | 328 | md | Markdown | _posts/2021-07-08/2021-06-17-Would-eat-it-20210617190940356337.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-06-17-Would-eat-it-20210617190940356337.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-06-17-Would-eat-it-20210617190940356337.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | ---
title: "Would eat it?"
metadate: "hide"
categories: [ God Pussy ]
image: "https://preview.redd.it/a5geb34a9t571.jpg?auto=webp&s=8bb36b6702756dd6aab539390a742ec6dae8b29f"
thumb: "https://preview.redd.it/a5geb34a9t571.jpg?width=640&crop=smart&auto=webp&s=3297a12f95d37cb19c09c8c6a19bcd461527387c"
visit: ""
---
Would eat it?
| 32.8 | 124 | 0.771341 | yue_Hant | 0.138505 |
dc1068270a19535e3352ae09f0815a0c2d641e2f | 57 | md | Markdown | hlc/hlccc/README.md | drakejund/contract | e1dbfcff94f348a0040062fd04c48b0e42b64762 | [
"Apache-2.0"
] | null | null | null | hlc/hlccc/README.md | drakejund/contract | e1dbfcff94f348a0040062fd04c48b0e42b64762 | [
"Apache-2.0"
] | null | null | null | hlc/hlccc/README.md | drakejund/contract | e1dbfcff94f348a0040062fd04c48b0e42b64762 | [
"Apache-2.0"
] | null | null | null | # hlccc
Blockchain smart contracts for the HCL project
API DOC -> apidoc.md
# Smart contracts for the WuXi project
| 9.5 | 20 | 0.719298 | kor_Hang | 0.728922 |
dc10e50c6965e8f05f9d7ac13b1fc94c2fef8e2a | 935 | md | Markdown | README.md | hughsk/dauber-duration | c9f5b5e680907c937409b5154674d27b2ed2a7a0 | [
"MIT"
] | 4 | 2015-01-11T06:25:50.000Z | 2016-05-05T21:21:19.000Z | README.md | hughsk/dauber-duration | c9f5b5e680907c937409b5154674d27b2ed2a7a0 | [
"MIT"
] | null | null | null | README.md | hughsk/dauber-duration | c9f5b5e680907c937409b5154674d27b2ed2a7a0 | [
"MIT"
] | null | null | null | # dauber-duration [](http://github.com/badges/stability-badges)
**Deprecated**
Pooled requestAnimationFrame tool for running multiple short bursts of animation
## Usage
[](https://nodei.co/npm/dauber-duration/)
### `dauber(duration, loopfn[, done])`
Calls `loopfn(t)` every frame for `duration` milliseconds, where `t` is a value
between 0 and 1 describing the progress through the animation. When complete,
calls `done` if passed.
For example, to animate between two values over 500 milliseconds:
``` js
var animate = require('dauber-duration')
animate(500, function(t) {
var a = 10
var b = 50
var r = a + (b - a) * t // (linear interpolation)
circle.setAttribute('r', r)
})
```
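Under the hood, the `t` passed to `loopfn` is simply elapsed time normalized by `duration` and clamped to `[0, 1]`. A dependency-free sketch of that computation (an illustration of the idea, not the module's exact code):

``` js
// Normalize elapsed time into the t in [0, 1] that loopfn receives.
// `start` and `now` are timestamps in milliseconds.
function progress(start, now, duration) {
  var t = (now - start) / duration
  return Math.min(Math.max(t, 0), 1) // clamp so the final frame lands exactly on 1
}

// Linear interpolation, as used in the example above.
function lerp(a, b, t) {
  return a + (b - a) * t
}

console.log(progress(0, 250, 500)) // 0.5 -- halfway through a 500 ms animation
console.log(lerp(10, 50, progress(0, 250, 500))) // 30
```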
## License
MIT. See [LICENSE.md](http://github.com/hughsk/dauber-duration/blob/master/LICENSE.md) for details.
| 27.5 | 138 | 0.718717 | eng_Latn | 0.561157 |
dc1191d65354e2128d966f30bc7f80417221088e | 992 | md | Markdown | s/superagent-proxy/readme.md | ScalablyTyped/SlinkyTyped | abb05700fe72d527728a9c735192f4c156bd9be1 | [
"MIT"
] | 14 | 2020-01-09T02:36:33.000Z | 2021-09-05T13:40:52.000Z | s/superagent-proxy/readme.md | oyvindberg/SlinkyTyped | abb05700fe72d527728a9c735192f4c156bd9be1 | [
"MIT"
] | 1 | 2021-07-31T20:24:00.000Z | 2021-08-01T07:43:35.000Z | s/superagent-proxy/readme.md | oyvindberg/SlinkyTyped | abb05700fe72d527728a9c735192f4c156bd9be1 | [
"MIT"
] | 4 | 2020-03-12T14:08:42.000Z | 2021-08-12T19:08:49.000Z |
# Scala.js typings for superagent-proxy
Typings are for version 2.0
## Library description:
`Request#proxy(uri)` superagent extension
| | |
| ------------------ | :-------------: |
| Full name | superagent-proxy |
| Keywords | superagent, http, https, proxy, socks |
| # releases | 2 |
| # dependents | 118 |
| # downloads | 23038580 |
| # stars | 3 |
## Links
- [Homepage](https://github.com/TooTallNate/superagent-proxy#readme)
- [Bugs](https://github.com/TooTallNate/superagent-proxy/issues)
- [Repository](https://github.com/TooTallNate/superagent-proxy)
- [Npm](https://www.npmjs.com/package/superagent-proxy)
## Note
This library has been generated from the TypeScript code at [DefinitelyTyped](https://definitelytyped.org).
Provided with :purple_heart: from [ScalablyTyped](https://github.com/oyvindberg/ScalablyTyped)
## Usage
See [the main readme](../../readme.md) for instructions.
| 28.342857 | 105 | 0.629032 | eng_Latn | 0.38162 |
dc11a9c1254e3db887107e87c0c090ddf69c7052 | 2,982 | md | Markdown | src/Behavioral/Observer/README.md | kuriv/phpdp | 1684d49f0e129a402e8c85ce272c84bf1c551d12 | [
"MIT"
] | 1 | 2020-05-07T14:30:00.000Z | 2020-05-07T14:30:00.000Z | src/Behavioral/Observer/README.md | kuriv/phpdp | 1684d49f0e129a402e8c85ce272c84bf1c551d12 | [
"MIT"
] | null | null | null | src/Behavioral/Observer/README.md | kuriv/phpdp | 1684d49f0e129a402e8c85ce272c84bf1c551d12 | [
"MIT"
] | null | null | null | # Observer
> Implements a publish/subscribe behaviour on an object: whenever a "Subject" object changes its state, the attached "Observers" are notified. It is used to reduce the number of tightly coupled objects and relies on loose coupling instead.
## UML

## Code
User.php
```php
<?php
namespace Kuriv\PHPDesignPatterns\Behavioral\Observer;
use SplSubject;
use SplObjectStorage;
use SplObserver;
class User implements SplSubject
{
/**
* Store the email.
*
* @var string
*/
private string $email;
/**
* Store the object storage instance.
*
* @var SplObjectStorage
*/
private SplObjectStorage $observers;
/**
* Store the object storage instance to the current instance.
*
* @param void
* @return void
*/
public function __construct()
{
$this->observers = new SplObjectStorage;
}
/**
* Attach an observer instance.
*
* @param SplObserver $observer
* @return void
*/
public function attach(SplObserver $observer)
{
$this->observers->attach($observer);
}
/**
* Detach an observer instance.
*
* @param SplObserver $observer
* @return void
*/
public function detach(SplObserver $observer)
{
$this->observers->detach($observer);
}
/**
* Change the email and notify.
*
* @param string $email
* @return void
*/
public function changeEmail(string $email)
{
$this->email = $email;
$this->notify();
}
/**
* Perform notify action.
*
* @param void
* @return void
*/
public function notify()
{
foreach ($this->observers as $observer) {
$observer->update($this);
}
}
}
```
UserObserver.php
```php
<?php
namespace Kuriv\PHPDesignPatterns\Behavioral\Observer;
use SplObserver;
use SplSubject;
class UserObserver implements SplObserver
{
/**
* Store the changed users.
*
* @var array
*/
private array $changedUsers = [];
/**
* Update the property.
*
* @param SplSubject $subject
* @return void
*/
public function update(SplSubject $subject)
{
$this->changedUsers[] = clone $subject;
}
/**
* Get all changed users.
*
* @param void
* @return array
*/
public function getChangedUsers(): array
{
return $this->changedUsers;
}
}
```
## Test
ObserverTest.php
```php
<?php
namespace Kuriv\PHPDesignPatterns\Behavioral\Observer;
use PHPUnit\Framework\TestCase;
class ObserverTest extends TestCase
{
public function testChangeInUserLeadsToUserObserverBeingNotified()
{
$user = new User();
$observer = new UserObserver();
$user->attach($observer);
$user->changeEmail('[email protected]');
$this->assertCount(1, $observer->getChangedUsers());
}
}
```
| 17.75 | 233 | 0.593897 | eng_Latn | 0.475068 |
dc13bce361f3e2ab3315cab7d4a00da131c5bb4e | 782 | md | Markdown | intl.en-US/Flink SQL Development Guide/Flink SQL/Built-in functions/Mathematical functions/EXP.md | Adhders/sc | a0dc1cf9f27bf8e4b815cf697e4235a30df3ad7b | [
"MIT"
] | 1 | 2019-10-08T23:06:56.000Z | 2019-10-08T23:06:56.000Z | intl.en-US/Flink SQL Development Guide/Flink SQL/Built-in functions/Mathematical functions/EXP.md | dawsongzhao/sc | a0dc1cf9f27bf8e4b815cf697e4235a30df3ad7b | [
"MIT"
] | null | null | null | intl.en-US/Flink SQL Development Guide/Flink SQL/Built-in functions/Mathematical functions/EXP.md | dawsongzhao/sc | a0dc1cf9f27bf8e4b815cf697e4235a30df3ad7b | [
"MIT"
] | null | null | null | # EXP {#concept_kfv_ydq_dgb .concept}
This topic describes how to use the mathematical function EXP in Realtime Compute.
## Syntax {#section_dks_qgp_dgb .section}
```
DOUBLE EXP()
```
## Input parameters {#section_eks_qgp_dgb .section}
|Parameter|Data type|
|---------|---------|
|A|DOUBLE|
## Function description {#section_hks_qgp_dgb .section}
This function returns e raised to the power of the specified number. The constant e is the base of natural logarithms.
## Examples {#section_iks_qgp_dgb .section}
- Test data
|in1\(DOUBLE\)|
|-------------|
|8.0|
|10.0|
- Test statements
```
SELECT EXP(in1) as aa
FROM T1
```
- Test results
|aa\(DOUBLE\)|
|------------|
|2980.9579870417283|
|22026.465794806718|
| 17.377778 | 118 | 0.626598 | eng_Latn | 0.527267 |
dc13c66c25a223b2a4be502f76ab22710968dc9a | 189 | md | Markdown | README.md | guptashark/jt_altitude_server | 62ffd29ca34942e600386564713d146488febe7b | [
"MIT"
] | null | null | null | README.md | guptashark/jt_altitude_server | 62ffd29ca34942e600386564713d146488febe7b | [
"MIT"
] | null | null | null | README.md | guptashark/jt_altitude_server | 62ffd29ca34942e600386564713d146488febe7b | [
"MIT"
] | null | null | null | # jt_altitude_server
One of the jet tech services that can be used by an application (such as an autopilot). It is an optional service: an autopilot does not have to have altitude info.
| 63 | 167 | 0.783069 | eng_Latn | 0.99984 |
dc13fb5f924650ce9336f1f391674e54b0a28457 | 3,032 | md | Markdown | README.md | binded/the-best-winston-sentry | 5a0174d0710660821c5881d7fe06c32f4a4a266a | [
"MIT"
] | null | null | null | README.md | binded/the-best-winston-sentry | 5a0174d0710660821c5881d7fe06c32f4a4a266a | [
"MIT"
] | null | null | null | README.md | binded/the-best-winston-sentry | 5a0174d0710660821c5881d7fe06c32f4a4a266a | [
"MIT"
] | 1 | 2017-06-20T18:00:35.000Z | 2017-06-20T18:00:35.000Z | [](https://travis-ci.org/binded/the-best-winston-sentry)
# the-best-winston-sentry
The best [winston logger](https://github.com/winstonjs/winston)
transport for Sentry / Raven 😁
## Install
```bash
npm i the-best-winston-sentry
```
## Usage
Follow this sample configuration to use:
```javascript
const winston = require('winston')
const SentryTransport = require('the-best-winston-sentry')
const sentryTransport = new SentryTransport({
level: 'warn',
dsn: '{{ YOUR SENTRY DSN }}',
tags: { key: 'value' },
extra: { key: 'value' },
})
const logger = new winston.Logger({
transports: [
new winston.transports.Console({level: 'silly'}),
sentryTransport,
],
})
// raven can be accessed from the transport object:
sentryTransport.raven
// or
logger.transports.sentry.raven
```
To catch and report all uncaught errors to Sentry, simply set the
`patchGlobal` option to `true` and it will call `Raven.install()`:
```javascript
new SentryTransport({ patchGlobal: true })
```
Winston logging levels are mapped to the default sentry levels like this:
```javascript
{
silly: 'debug',
verbose: 'debug',
info: 'info',
debug: 'debug',
warn: 'warning',
error: 'error'
}
```
You can customize how log levels are mapped using the `levelsMap` option:
```javascript
new Sentry({
levelsMap: {
verbose: 'info',
},
})
```
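To make the override behaviour concrete, here is a small, dependency-free sketch of merging a custom `levelsMap` with the defaults (a sketch of the idea, not the transport's actual internals):

```javascript
// Default winston -> Sentry level mapping, as listed above.
const DEFAULT_LEVELS_MAP = {
  silly: 'debug',
  verbose: 'debug',
  info: 'info',
  debug: 'debug',
  warn: 'warning',
  error: 'error',
}

// Resolve a winston level to a Sentry level, honouring user overrides.
// Falling back to 'error' for unknown levels is an assumption of this sketch.
function mapLevel(level, overrides = {}) {
  const map = Object.assign({}, DEFAULT_LEVELS_MAP, overrides)
  return map[level] || 'error'
}

console.log(mapLevel('verbose')) // 'debug' by default
console.log(mapLevel('verbose', { verbose: 'info' })) // 'info' with the override
```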
### Supported metadata
```javascript
{
user, // user object
req, // http request object
tags, // sentry tags, must be mapping of string -> string
extra, // sentry extra, can be arbitrary JSON
fingerprint, // used by sentry to group errors
// ...
// unknown props are merged with extra
}
```
### Reporting exceptions
When logging an error, there are three ways to pass the error and
metadata:
- By assigning known properties directly to the error object
```javascript
const err = new Error('some error')
err.user = user
err.req = req
err.tags = { foo: 'bar' }
logger.error('oops!', err)
```
- By passing the error as the message (this might break other
transports)
```javascript
const err = new Error('some error')
logger.error(err, { user, req, tags: { foo: 'bar' } })
```
- **Recommended**: by passing the error as an `err` property on the metadata
```javascript
const err = new Error('some error')
logger.error('oops!', {
err,
user,
req,
tags: { foo: 'bar' },
})
```
When logging an error, the error's message is concatenated with the
message passed to the Winston logger, using the following format:
`{msg} cause: {err.message}`, e.g.:
```javascript
logger.error('Oops!', new Error('some error'))
```
will show the following error message in sentry:
```
Oops! cause: some error
```
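That concatenation can be sketched as a tiny helper (`formatSentryMessage` is a made-up name for illustration; the transport performs this internally):

```javascript
// Build the message sent to Sentry from the winston message and the error.
function formatSentryMessage(msg, err) {
  if (err && err.message) {
    return msg + ' cause: ' + err.message
  }
  return msg
}

console.log(formatSentryMessage('Oops!', new Error('some error')))
// -> 'Oops! cause: some error'
```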
The sentry event ID is added as a `sentryId` on the error object:
```javascript
const testErr = new Error('some error')
logger.error('Oops!', testErr, () => {
console.log(testErr.sentryId)
// => 21479ebd4eec4afaaf9426617196a10a
})
```
| 21.055556 | 143 | 0.684697 | eng_Latn | 0.842621 |