| Column | Type | Values |
|---|---|---|
| hexsha | stringlengths | 40-40 |
| size | int64 | 5-1.04M |
| ext | stringclasses | 6 values |
| lang | stringclasses | 1 value |
| max_stars_repo_path | stringlengths | 3-344 |
| max_stars_repo_name | stringlengths | 5-125 |
| max_stars_repo_head_hexsha | stringlengths | 40-78 |
| max_stars_repo_licenses | sequencelengths | 1-11 |
| max_stars_count | int64 | 1-368k |
| max_stars_repo_stars_event_min_datetime | stringlengths | 24-24 |
| max_stars_repo_stars_event_max_datetime | stringlengths | 24-24 |
| max_issues_repo_path | stringlengths | 3-344 |
| max_issues_repo_name | stringlengths | 5-125 |
| max_issues_repo_head_hexsha | stringlengths | 40-78 |
| max_issues_repo_licenses | sequencelengths | 1-11 |
| max_issues_count | int64 | 1-116k |
| max_issues_repo_issues_event_min_datetime | stringlengths | 24-24 |
| max_issues_repo_issues_event_max_datetime | stringlengths | 24-24 |
| max_forks_repo_path | stringlengths | 3-344 |
| max_forks_repo_name | stringlengths | 5-125 |
| max_forks_repo_head_hexsha | stringlengths | 40-78 |
| max_forks_repo_licenses | sequencelengths | 1-11 |
| max_forks_count | int64 | 1-105k |
| max_forks_repo_forks_event_min_datetime | stringlengths | 24-24 |
| max_forks_repo_forks_event_max_datetime | stringlengths | 24-24 |
| content | stringlengths | 5-1.04M |
| avg_line_length | float64 | 1.14-851k |
| max_line_length | int64 | 1-1.03M |
| alphanum_fraction | float64 | 0-1 |
| lid | stringclasses | 191 values |
| lid_prob | float64 | 0.01-1 |
dbe3f85418e19dbd797dcee0c7c57ee5d1eb709f
184
md
Markdown
_posts/why/elonmusk/2018-12-25-elonmusk-spaceX.md
sahanashetty31/gaganyatri
82e5936393a6d9f495f90a99cef7847ddc14b863
[ "MIT" ]
null
null
null
_posts/why/elonmusk/2018-12-25-elonmusk-spaceX.md
sahanashetty31/gaganyatri
82e5936393a6d9f495f90a99cef7847ddc14b863
[ "MIT" ]
null
null
null
_posts/why/elonmusk/2018-12-25-elonmusk-spaceX.md
sahanashetty31/gaganyatri
82e5936393a6d9f495f90a99cef7847ddc14b863
[ "MIT" ]
null
null
null
---
layout: page
title: "SpaceX - Elon Musk's Mule"
excerpt: "Beam Me Up Scotty"
categories: why
tags: [ why musk ]
date: 2019-01-14T08:08:50-04:00
---

SpaceX aka Space Exploration
14.153846
34
0.690217
eng_Latn
0.378585
dbe439e7c4696db4553f455fa7090a1d3c524f24
3,323
md
Markdown
docs/framework/winforms/controls/selection-and-clipboard-use-with-the-windows-forms-datagridview-control.md
BradJeong/docs.ko-kr
3de44e9d5ea126dbb4f76a0d149cd7a433e1aade
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/winforms/controls/selection-and-clipboard-use-with-the-windows-forms-datagridview-control.md
BradJeong/docs.ko-kr
3de44e9d5ea126dbb4f76a0d149cd7a433e1aade
[ "CC-BY-4.0", "MIT" ]
1
2018-10-29T12:42:46.000Z
2018-10-29T12:42:46.000Z
docs/framework/winforms/controls/selection-and-clipboard-use-with-the-windows-forms-datagridview-control.md
BradJeong/docs.ko-kr
3de44e9d5ea126dbb4f76a0d149cd7a433e1aade
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Selection and Clipboard Use with the Windows Forms DataGridView Control
ms.date: 03/30/2017
helpviewer_keywords:
- DataGridView control [Windows Forms], Clipboard use
- cells [Windows Forms], selecting in grids
- Clipboard [Windows Forms], in DataGridView control
- selection [Windows Forms], in DataGridView control
- data grids [Windows Forms], selecting cells
- DataGridView control [Windows Forms], selecting cells
ms.assetid: 82cffcad-8b30-4897-bddb-c3a79d751b83
ms.openlocfilehash: c777366124a3cc5f43df8efca54fc366245bcb75
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: MT
ms.contentlocale: ko-KR
ms.lasthandoff: 05/04/2018
ms.locfileid: "33537644"
---
# <a name="selection-and-clipboard-use-with-the-windows-forms-datagridview-control"></a>Selection and Clipboard Use with the Windows Forms DataGridView Control

The `DataGridView` control provides configuration options for how users can select cells, rows, and columns. For example, you can enable single or multiple selection, selection of whole rows or columns when users click cells, or selection of whole rows or columns when users click headers, which also enables cell selection. If you want to provide your own user interface for selection, you can disable ordinary selection and handle all selection programmatically. Additionally, you can enable users to copy the selected values to the Clipboard.

## <a name="in-this-section"></a>In This Section

[Selection Modes in the Windows Forms DataGridView Control](../../../../docs/framework/winforms/controls/selection-modes-in-the-windows-forms-datagridview-control.md)
Describes the options for user and programmatic selection in the control.

[How to: Set the Selection Mode of the Windows Forms DataGridView Control](../../../../docs/framework/winforms/controls/how-to-set-the-selection-mode-of-the-windows-forms-datagridview-control.md)
Describes how to configure the control for single-row selection when the user clicks a cell.

[How to: Get the Selected Cells, Rows, and Columns in the Windows Forms DataGridView Control](../../../../docs/framework/winforms/controls/selected-cells-rows-and-columns-datagridview.md)
Describes how to work with the selected cell, row, and column collections.

[How to: Enable Users to Copy Multiple Cells to the Clipboard from the Windows Forms DataGridView Control](../../../../docs/framework/winforms/controls/enable-users-to-copy-multiple-cells-to-the-clipboard-datagridview.md)
Describes how to enable Clipboard support in the control.

## <a name="reference"></a>Reference

<xref:System.Windows.Forms.DataGridView>
Provides reference documentation for the <xref:System.Windows.Forms.DataGridView> control.

<xref:System.Windows.Forms.DataGridView.SelectionMode%2A?displayProperty=nameWithType>
Provides reference documentation for the <xref:System.Windows.Forms.DataGridView.SelectionMode%2A> property.

<xref:System.Windows.Forms.DataGridView.ClipboardCopyMode%2A>
Provides reference documentation for the <xref:System.Windows.Forms.DataGridView.ClipboardCopyMode%2A> property.

<xref:System.Windows.Forms.DataGridViewSelectedCellCollection>
Provides reference documentation for the <xref:System.Windows.Forms.DataGridViewSelectedCellCollection> class.

<xref:System.Windows.Forms.DataGridViewSelectedRowCollection>
Provides reference documentation for the <xref:System.Windows.Forms.DataGridViewSelectedRowCollection> class.

<xref:System.Windows.Forms.DataGridViewSelectedColumnCollection>
Provides reference documentation for the <xref:System.Windows.Forms.DataGridViewSelectedColumnCollection> class.

## <a name="see-also"></a>See Also

[DataGridView Control](../../../../docs/framework/winforms/controls/datagridview-control-windows-forms.md)
[Default Keyboard and Mouse Handling in the Windows Forms DataGridView Control](../../../../docs/framework/winforms/controls/default-keyboard-and-mouse-handling-in-the-windows-forms-datagridview-control.md)
58.298246
311
0.751128
kor_Hang
0.99727
dbe46c91ceced7b08be21c61ba8182b2da4f3cd0
2,120
md
Markdown
_posts/2011-07-22-video-player.md
wedesoft/www.wedesoft.de
02aa0be94951e88b8c02c8458e228834c3a14ef3
[ "MIT" ]
null
null
null
_posts/2011-07-22-video-player.md
wedesoft/www.wedesoft.de
02aa0be94951e88b8c02c8458e228834c3a14ef3
[ "MIT" ]
null
null
null
_posts/2011-07-22-video-player.md
wedesoft/www.wedesoft.de
02aa0be94951e88b8c02c8458e228834c3a14ef3
[ "MIT" ]
null
null
null
---
layout: post
title: Ruby Video Player
category: ruby
image: /pics/sintel.jpg
---

{% youtube "http://www.youtube.com/watch?v=MzgGCjDryXA" %}

This is the second episode of my new **Computer Vision for the Robotic Age podcast**. This episode is about video-I/O. The podcast demonstrates how a video player with proper audio/video synchronisation can be implemented with the Interactive Ruby Shell. The [Sintel][3] short film (Copyright Blender Foundation) was used as a video for testing.

Here's the source code of the [Ruby video player][4] created in the podcast:

{% highlight ruby %}
require 'rubygems'
# load FFMPEG bindings
require 'hornetseye_ffmpeg'
# load X.Org bindings
require 'hornetseye_xorg'
# load ALSA bindings
require 'hornetseye_alsa'
# include the namespace
include Hornetseye
# open a video file
input = AVInput.new 'sintel.mp4'
# open sound output with sampling rate of video
alsa = AlsaOutput.new 'default:0', input.sample_rate, input.channels
# read first audio frame
audio_frame = input.read_audio
# display images using width of 600 pixels and XVideo hardware acceleration
X11Display.show 600, :output => XVideoOutput do |display|
  # read an image
  img = input.read
  # while there is space in the audio output buffer ...
  while alsa.avail >= audio_frame.shape[1]
    # ... write previous frame to audio buffer
    alsa.write audio_frame
    # read new audio frame
    audio_frame = input.read_audio
  end
  # compute difference of video clock to audio clock
  delay = input.video_pos - input.audio_pos + (alsa.delay + audio_frame.shape[1]).quo(alsa.rate)
  # suspend program in order to synchronise the video with the audio
  display.event_loop [delay, 0].max
  # display image
  img
end
{% endhighlight %}

You can also download the video here

<ul>
  <li><a href="http://dl.dropbox.com/u/49280716/hornetseye-video-player.m4v">hornetseye-video-player.m4v</a> (12.19 MB)</li>
</ul>

**See Also:**

* [Ruby video player on gist.github.com][4]

[1]: http://wedesoft.libsyn.com/webpage
[2]: http://wedesoft.libsyn.com/rss
[3]: http://www.sintel.org/
[4]: https://gist.github.com/1182886
33.125
345
0.739623
eng_Latn
0.936136
dbe4bff445c9b5ab67d9143dac1b43e92cbf2906
1,518
md
Markdown
README.md
VinAVarghese/Employee_Tracker
cc4d075c11f8cf6a36e37fdd228d843dffd4c73d
[ "BSD-3-Clause" ]
null
null
null
README.md
VinAVarghese/Employee_Tracker
cc4d075c11f8cf6a36e37fdd228d843dffd4c73d
[ "BSD-3-Clause" ]
null
null
null
README.md
VinAVarghese/Employee_Tracker
cc4d075c11f8cf6a36e37fdd228d843dffd4c73d
[ "BSD-3-Clause" ]
null
null
null
# Employee Tracker

[![License: BSD](https://img.shields.io/badge/License-BSD%203--Clause-blue.svg)](https://opensource.org/licenses/BSD-3-Clause)

This is a command-line application that allows users to keep track of employees in their company. The system utilizes inquirer, mysql and a well fitted database to keep track of pertinent information. The user can add, delete, or view departments, roles, and employees from the system. They can also update the assigned role or manager for any employee.

## Table of Contents

1. [Installation](#Installation)
2. [Usage](#Usage)
3. [Contributing](#Contributing)
4. [Tests](#Tests)
5. [License](#License)
6. [Questions](#Questions)

## Installation

    $ npm install

## Usage

This application is meant to be used to keep track of the ever changing employee make-up of any given company. The departments, roles and salaries can all be custom fit to be used in any industry. HR departments can make good use of this application for large teams.

## Contributing

Contribution is welcome. Please commit often and comment added/improved lines.

## Tests

No test instructions.

## License

>BSD

## Questions

* GitHub: [VinAVarghese](https://github.com/VinAVarghese)
* Email: [[email protected]](mailto:[email protected])

You are welcome to email me with any questions about the application with the subject line "RE: Employee Tracker"

## Links/Images

![Screenshot](screenshot.png)
![Screenshot2](screenshot2.png)
52.344828
355
0.739789
eng_Latn
0.979073
dbe6c08e1575733c805187bec7b3449138e1f137
395
md
Markdown
zadanie_3/README.md
abixadamj/python3-simple-examples
a0bd9bb3e5cb1359c5e7fee37707a4f21d93207f
[ "MIT" ]
null
null
null
zadanie_3/README.md
abixadamj/python3-simple-examples
a0bd9bb3e5cb1359c5e7fee37707a4f21d93207f
[ "MIT" ]
null
null
null
zadanie_3/README.md
abixadamj/python3-simple-examples
a0bd9bb3e5cb1359c5e7fee37707a4f21d93207f
[ "MIT" ]
null
null
null
Write a program that asks the user for their name and date of birth, then prints this data to the screen along with information on whether the person is of legal age and, if not, in how many years, months, and days they will be. Today's date in the 'YYYY-MM-DD' format can be obtained from the computer using the `datetime` module:

```
from datetime import datetime
dzisiaj = datetime.date(datetime.now()).isoformat()
```
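A minimal sketch of one possible solution follows; the prompts, the assumption of 18 as the age of majority, and the year/month/day borrowing logic are illustrative choices, not part of the exercise (the Feb 29 edge case is ignored for brevity):

```python
import calendar
from datetime import date, datetime

name = input("Your name: ")
birth = datetime.strptime(input("Date of birth (YYYY-MM-DD): "), "%Y-%m-%d").date()
today = date.today()

print(f"{name}, born {birth.isoformat()}")
adult_on = birth.replace(year=birth.year + 18)  # 18th birthday
if today >= adult_on:
    print("This person is of legal age.")
else:
    years = adult_on.year - today.year
    months = adult_on.month - today.month
    days = adult_on.day - today.day
    if days < 0:  # borrow the length of the current month
        months -= 1
        days += calendar.monthrange(today.year, today.month)[1]
    if months < 0:  # borrow a full year
        years -= 1
        months += 12
    print(f"Of legal age in {years} years, {months} months and {days} days.")
```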
49.375
120
0.78481
pol_Latn
0.99998
dbe73d3dbc7a9886eeeaa32e3581320b1cd26773
8,987
md
Markdown
docs/extensibility/templatedata-element-visual-studio-templates.md
tommorris/visualstudio-docs.de-de
2f351c63cc51989c21aa6f5e705ec428144d8da6
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/extensibility/templatedata-element-visual-studio-templates.md
tommorris/visualstudio-docs.de-de
2f351c63cc51989c21aa6f5e705ec428144d8da6
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/extensibility/templatedata-element-visual-studio-templates.md
tommorris/visualstudio-docs.de-de
2f351c63cc51989c21aa6f5e705ec428144d8da6
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: TemplateData Element (Visual Studio Templates) | Microsoft Docs
ms.custom: ''
ms.date: 11/04/2016
ms.technology:
- vs-ide-general
ms.topic: conceptual
f1_keywords:
- http://schemas.microsoft.com/developer/vstemplate/2005#TemplateData
helpviewer_keywords:
- TemplateData element [Visual Studio project templates]
ms.assetid: db17ec9b-bfdf-46b1-bbe7-5ccc140056e2
author: gregvanl
ms.author: gregvanl
manager: douge
ms.workload:
- vssdk
ms.openlocfilehash: bbf5b4c26b46c0be6038651a41c751afc39e4da5
ms.sourcegitcommit: 6a9d5bd75e50947659fd6c837111a6a547884e2a
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 04/16/2018
ms.locfileid: "31145544"
---
# <a name="templatedata-element-visual-studio-templates"></a>TemplateData Element (Visual Studio Templates)

Categorizes the template and defines how it appears in the **New Project** or **Add New Item** dialog box.

\<VSTemplate>
\<TemplateData>

## <a name="syntax"></a>Syntax

```
<TemplateData>
    <Name> ... </Name>
    <Description> ... </Description>
    <Icon> ... </Icon>
    <ProjectType> ... </ProjectType>
    ...
</TemplateData>
```

## <a name="attributes-and-elements"></a>Attributes and Elements

The following sections describe attributes, child elements, and parent elements.

### <a name="attributes"></a>Attributes

None

### <a name="child-elements"></a>Child Elements

|Element|Description|
|-------------|-----------------|
|[Name](../extensibility/name-element-visual-studio-templates.md)|Required element.<br /><br /> Specifies the name of the template as it appears in either the **New Project** or **Add New Item** dialog box.|
|[Description](../extensibility/description-element-visual-studio-templates.md)|Required element.<br /><br /> Specifies the description of the template as it appears in either the **New Project** or **Add New Item** dialog box.|
|[Icon](../extensibility/icon-element-visual-studio-templates.md)|Required element.<br /><br /> Specifies the path and file name of the image file that serves as the icon displayed for the template in either the **New Project** or **Add New Item** dialog box.|
|[ProjectType](../extensibility/projecttype-element-visual-studio-templates.md)|Required element.<br /><br /> Categorizes the project template so that it appears under the specified group in the **New Project** dialog box.|
|[ProjectSubType](../extensibility/projectsubtype-element-visual-studio-templates.md)|Optional element.<br /><br /> Classifies the project template so that it appears under the specified subcategory in the **New Project** dialog box.|
|[TemplateID](../extensibility/templateid-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies the template ID.|
|[TemplateGroupID](../extensibility/templategroupid-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies the template group ID.|
|[SortOrder](../extensibility/sortorder-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies a value used to order the template relative to other templates in the same category, as listed in either the **New Project** or **Add New Item** dialog box.|
|[CreateNewFolder](../extensibility/createnewfolder-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies whether a containing folder is created when the project is instantiated.|
|[DefaultName](../extensibility/defaultname-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies the name that the Visual Studio project system generates for the project or item when it is created.|
|[ProvideDefaultName](../extensibility/providedefaultname-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies whether the Visual Studio project system generates a default name for the project or item when it is created.|
|[PromptForSaveOnCreation](../extensibility/promptforsaveoncreation-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies whether the project can be created as a temporary project.|
|[EnableLocationBrowseButton](../extensibility/enablelocationbrowsebutton-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies whether the **Browse** button is available in the **New Project** dialog box so that users can easily change the default directory in which a new project is saved.|
|[Hidden](../extensibility/hidden-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies whether the template appears in either the **New Project** or **Add New Item** dialog box.|
|[NumberOfParentCategoriesToRollUp](../extensibility/numberofparentcategoriestorollup-visual-studio-templates.md)|Optional element.<br /><br /> Specifies the number of parent categories that will display the template in the **New Project** dialog box.|
|[LocationFieldMRUPrefix](../extensibility/locationfieldmruprefix-element-visual-studio-templates.md)|Optional element.|
|[LocationField](../extensibility/locationfield-element-visual-studio-project-templates.md)|Optional element.<br /><br /> Specifies whether the **Location** text box in the **New Project** dialog box is enabled, disabled, or hidden for the project template.|
|[RequiredFrameworkVersion](../extensibility/requiredframeworkversion-element-visual-studio-templates.md)|Optional element.<br /><br /> Use this element if the template supports only a specific minimum required version and later versions, if any, of the .NET Framework.|
|[SupportsMasterPage](../extensibility/supportsmasterpage-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies whether the template supports a master page for Web projects.|
|[SupportsCodeSeparation](../extensibility/supportscodeseparation-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies whether the template supports code separation, or the code-behind page model, for Web projects.|
|[SupportsLanguageDropDown](../extensibility/supportslanguagedropdown-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies whether the template applies to multiple programming languages and whether the **Language** option is available in the **New Project** dialog box.|
|[TargetPlatformName](../extensibility/targetplatformname-element-visual-studio-templates.md)|Optional element.<br /><br /> Specifies the platform that the project template targets. This element indicates that a project template is for creating [!INCLUDE[win8_appname_long](../debugger/includes/win8_appname_long_md.md)] apps.|

### <a name="parent-elements"></a>Parent Elements

|Element|Description|
|-------------|-----------------|
|[VSTemplate](../extensibility/vstemplate-element-visual-studio-templates.md)|Required element.<br /><br /> Contains all the metadata for the project template, item template, or starter kit.|

## <a name="remarks"></a>Remarks

`TemplateData` is a required element. If you do not include an optional element, the default value for that element is used.

## <a name="example"></a>Example

The following example illustrates the metadata for a project template for a [!INCLUDE[csprcs](../data-tools/includes/csprcs_md.md)] application.

```
<VSTemplate Type="Project" Version="3.0.0"
    xmlns="http://schemas.microsoft.com/developer/vstemplate/2005">
    <TemplateData>
        <Name>My template</Name>
        <Description>A basic starter kit</Description>
        <Icon>TemplateIcon.ico</Icon>
        <ProjectType>CSharp</ProjectType>
    </TemplateData>
    <TemplateContent>
        <Project File="MyStarterKit.csproj">
            <ProjectItem>Form1.cs</ProjectItem>
            <ProjectItem>Form1.Designer.cs</ProjectItem>
            <ProjectItem>Program.cs</ProjectItem>
            <ProjectItem>Properties\AssemblyInfo.cs</ProjectItem>
            <ProjectItem>Properties\Resources.resx</ProjectItem>
            <ProjectItem>Properties\Resources.Designer.cs</ProjectItem>
            <ProjectItem>Properties\Settings.settings</ProjectItem>
            <ProjectItem>Properties\Settings.Designer.cs</ProjectItem>
        </Project>
    </TemplateContent>
</VSTemplate>
```

## <a name="see-also"></a>See Also

[Visual Studio Template Schema Reference](../extensibility/visual-studio-template-schema-reference.md)
[Creating Project and Item Templates](../ide/creating-project-and-item-templates.md)
77.474138
350
0.747635
deu_Latn
0.806295
dbe7601bd59d568aaeadc34cae80ff325891518e
5,477
md
Markdown
README.md
dipAch/Infrastructure-Automation-Kit
34fdf9815376028985056229dafd56fb3ff58f16
[ "Apache-2.0" ]
null
null
null
README.md
dipAch/Infrastructure-Automation-Kit
34fdf9815376028985056229dafd56fb3ff58f16
[ "Apache-2.0" ]
null
null
null
README.md
dipAch/Infrastructure-Automation-Kit
34fdf9815376028985056229dafd56fb3ff58f16
[ "Apache-2.0" ]
null
null
null
# Infrastructure-Automation-Kit

This is the Magical Automation Kit for Downloading, Building (i.e., Configuring and Compiling) and Installing the latest Apache `HTTPD` and `TOMCAT` packages. This utility also manages the dependencies (for e.g., the latest `JDK/JRE` for `TOMCAT`) required by the actual software(s). This is a complete suite that offers Infrastructure Setup for Reverse-Proxies and Application Servers (currently `HTTPD` and `TOMCAT` respectively, but not limited to) without any hassle, and manages the tricky aspects for you.

## The Problem

My Organization heavily deploys applications on `Tomcat` servers. We have a fleet of application servers that are Tomcat, and it has been so for many, many years. Tomcat is reliable and has an active development history. Also, Tomcat is very secure and stable when used in an Enterprise setup. It can handle a good load of traffic and performs very well when configured appropriately.

The same goes for our Reverse-Proxy Infrastructure. We use Apache `HTTPD` to handle our Reverse-Proxy tier, and building Apache `HTTPD` and its dependencies is no easy task. It is time consuming and requires a good knowledge of building software from source. Basically, if you are doing it for the first time, you are bound to face challenges while building it and will probably fail.

So as part of the __Enterprise Infrastructure Team__, our job is to provision more and more Tomcat servers and Reverse-Proxy instances for deploying the ever increasing number of applications: customer facing ones, internally used ones, or even web services, etc.

## Manual Setup

When performing the installation and setup manually, it takes up a good amount of time, and every time we are performing the exact same set of steps. So, imagine we have to set up 100 such Tomcat instances. Yeah... right, that's gonna take a lot of time. As part of the setup, we will also need to download and install the appropriate `JDK/JRE` required by the applications to be deployed. For the Reverse-Proxy instances it is even more cumbersome to go through the entire Configure, Build / Compile and Install process for the `HTTPD` software and its dependencies. Sounds like a task requiring Automation?!

## Automation to the Rescue

To avoid this repetition at work, I came up with a utility that performs the necessary checks and steps without any human intervention, and thus increases productivity by saving the time wasted on downloading, installing and setting up the infrastructure. All that is required now is to run this utility with the configurations correctly set, because that is what the utility listens to. Define the configurations once, and let the utility do its __`magic`__.

There is no need to manually indulge in doing the same on multiple servers, so the process is less prone to human errors; it also enables us to keep better track of the entire setup process, which can be easily standardized across the various towers or domains. The most important thing one gains is that it saves a lot of __time__. What manually would have taken ~10-15 mins can now be done in just a matter of seconds (at most a few minutes; Network Latency, you know!!!). Time is an important factor in the Enterprise Setup, because __`Time is Money and Money is Time`__.

## How to Use

Most of the customizations can be handled by configuring the settings defined in the configuration files. Please make sure to change the default values to something that suits your requirements. Other than that, the application itself is quite modular and well split. Also, it is made to run cross-platform, one big plus when you have a node farm comprising multiple brands of platforms, grouped/clustered together to support the use cases.

The modularity enables you to __plug-and-play__ various combinations of the components according to your needs. Here, I have provided you with the library that holds the components of the utility. It's up to you to stitch them together as you see fit. Also, the code itself is well documented and can be easily understood. I have documented the design decisions taken by me, which can be further optimized accordingly.

## Tasks Made Easy (Infrastructure Perspective)

- [x] Search for the latest version of the __`Binary`__ on the Official Site.
- [x] Download the latest version and save it to disk.
- [x] Get the dependencies downloaded as well.
  - Find the latest __`DEPENDENCY`__ version on the Official Site.
  - Download the same and save it to disk.
- [x] Perform the extraction operation on the downloaded archived binaries.
- [x] Have them Configured, Compiled and Installed on the system.
- [x] Place the extracted binaries in the appropriate locations (part of the install process).

## Tasks Made Easy (Person's Point of View)

- [x] No manual setup required.
- [x] Saves ample time for performing other important infrastructure tasks (Automation makes it `Super Fast!!!`).
- [x] Eliminates human errors.
- [x] No expertise required to perform the task (i.e., the downloading and extracting of the required binaries).
- [x] More time to relax and chill!!!

## What I Gain?

Well, my focus is always on leveraging technology to make work more efficient and less cumbersome. Reducing the complexity and other human factors was also the motivation behind this. Also, I gained a lot of insight into System Design and into deciding on the appropriate tools for doing the job at hand in an efficient and manageable way.
84.261538
506
0.791127
eng_Latn
0.999599
dbe77284b7270be9c388e92a2b074d848227995b
474
md
Markdown
README.md
Matrax/web-api
0e6c2ffde11b6cdf59c1158910709b6a369bb8b0
[ "MIT" ]
1
2021-01-28T09:15:23.000Z
2021-01-28T09:15:23.000Z
README.md
Matrax/web-api
0e6c2ffde11b6cdf59c1158910709b6a369bb8b0
[ "MIT" ]
null
null
null
README.md
Matrax/web-api
0e6c2ffde11b6cdf59c1158910709b6a369bb8b0
[ "MIT" ]
1
2022-03-21T17:23:41.000Z
2022-03-21T17:23:41.000Z
# 3DWebEngine

## Description

3DWebEngine is a 3D game engine that I developed for a school project using WebGL and JavaScript. The "Test" folder contains an example of rendering with the engine (it's just a cube and a gun model). Do what you want with this code :)

## Libraries

WebGL-Obj-Loader: https://github.com/frenchtoast747/webgl-obj-loader (MIT License)

## Project

This 3D game engine was developed for the Aim-Upgrade project: https://aim-upgrade.top/
31.6
91
0.767932
eng_Latn
0.985564
dbe7bfad1f209a2883eb7c048984adf77413c0c2
2,679
md
Markdown
articles/active-directory/develop/quickstart-create-new-tenant.md
grayknight2/mc-docs.zh-cn
dc705774cac09f2b3eaeec3c0ecc17148604133e
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/active-directory/develop/quickstart-create-new-tenant.md
grayknight2/mc-docs.zh-cn
dc705774cac09f2b3eaeec3c0ecc17148604133e
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/active-directory/develop/quickstart-create-new-tenant.md
grayknight2/mc-docs.zh-cn
dc705774cac09f2b3eaeec3c0ecc17148604133e
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Create an Azure Active Directory tenant
description: Learn how to create an Azure AD tenant for registering and building applications.
services: active-directory
author: rwike77
manager: CelesteDG
ms.service: active-directory
ms.subservice: develop
ms.workload: identity
ms.topic: quickstart
ms.date: 08/18/2020
ms.author: v-junlch
ms.reviewer: jmprieur
ms.custom: aaddev, identityplatformtop40, fasttrack-edit
ms.openlocfilehash: 702049157f6393b611a9579ad333f85801b5804d
ms.sourcegitcommit: 7646936d018c4392e1c138d7e541681c4dfd9041
ms.translationtype: HT
ms.contentlocale: zh-CN
ms.lasthandoff: 08/20/2020
ms.locfileid: "88647632"
---
# <a name="quickstart-set-up-a-tenant"></a>Quickstart: Set up a tenant

The Microsoft identity platform lets developers build applications that target a wide variety of custom Microsoft 365 environments and identities. To get started with the Microsoft identity platform, you'll need access to an environment, also called an Azure AD tenant, that can register and manage apps, access Microsoft 365 data, and deploy custom conditional access and tenant restrictions.

A tenant is a representation of an organization. It's a dedicated instance of Azure AD that an organization or app developer receives when establishing a relationship with Microsoft, such as signing up for Azure, Microsoft Intune, or Microsoft 365. Each Azure AD tenant is distinct and separate from other Azure AD tenants, and is represented by its own work and school identities, consumer identities (in the case of an Azure AD B2C tenant), and app registrations. An app registration inside a tenant can allow authentication only from accounts within that tenant, or from all tenants.

## <a name="determining-environment-type"></a>Determining environment type

There are two types of environments you can create. Determining which type you need is based solely on the types of users your app will authenticate:

* Work and school (Azure AD accounts)
* Social and local accounts (Azure AD B2C)

This quickstart is split into two scenarios, depending on the type of app you want to build.

## <a name="work-and-school-accounts"></a>Work and school accounts

### <a name="use-an-existing-tenant"></a>Use an existing tenant

Many developers already have a tenant through a service or subscription that is tied to an Azure AD tenant, such as a Microsoft 365 or Azure subscription.

1. To check your tenant, sign in to the [Azure portal](https://portal.azure.cn) with the account you want to use to manage your application.
1. Check the upper-right corner. If you have a tenant, you'll automatically be signed in to it, and the tenant name appears directly below your account name.
   * Hover over your account name in the upper-right corner of the Azure portal to see your name, email, directory/tenant ID (a GUID), and domain.
   * If your account is associated with multiple tenants, you can select your account name to open a menu where you can switch between tenants. Each tenant has its own unique tenant ID.

> [!TIP]
> If you need to find the tenant ID, you can:
> * Hover over your account name to get the directory/tenant ID, or
> * Select **Azure Active Directory** > **Properties** > **Directory ID** in the Azure portal

If you don't have an existing tenant associated with your account, a GUID appears below your account name, and you won't be able to perform actions such as registering apps until you follow the steps in the next section.

### <a name="create-a-new-azure-ad-tenant"></a>Create a new Azure AD tenant

If you don't already have an Azure AD tenant, or if you want to create a new one for development, see the [quickstart](../fundamentals/active-directory-access-create-new-tenant.md), or simply follow the [directory creation experience](https://portal.azure.cn/#create/Microsoft.AzureActiveDirectory). You'll need to provide the following information to create your new tenant:

- **Organization name**
- **Initial domain** - this will be part of *.partner.onmschina.cn. You can customize the domain in more detail later.
- **Country or region**

> [!NOTE]
> Use alphanumeric characters when naming your tenant. Special characters aren't allowed. The name must not exceed 256 characters.

## <a name="social-and-local-accounts"></a>Social and local accounts

To begin building apps that sign in social and local accounts, you'll need to create an Azure AD B2C tenant. Start with [Create an Azure AD B2C tenant](../../active-directory-b2c/tutorial-create-tenant.md).

## <a name="next-steps"></a>Next steps

* [Register an app](quickstart-register-app.md) to integrate with the Microsoft identity platform.
* Learn the [basics of authentication](./authentication-vs-authorization.md).
* To learn more about the relationship between subscriptions and Azure AD tenants, see [Associate or add an Azure subscription to your Azure Active Directory tenant](../fundamentals/active-directory-how-subscriptions-associated-directory.md).
34.346154
203
0.778649
yue_Hant
0.703702
dbe800928111011f00aca85d9b16a2515511c2c4
252
md
Markdown
includes/net-current-v45plus-md.md
adamsitnik/docs.pl-pl
c83da3ae45af087f6611635c348088ba35234d49
[ "CC-BY-4.0", "MIT" ]
null
null
null
includes/net-current-v45plus-md.md
adamsitnik/docs.pl-pl
c83da3ae45af087f6611635c348088ba35234d49
[ "CC-BY-4.0", "MIT" ]
null
null
null
includes/net-current-v45plus-md.md
adamsitnik/docs.pl-pl
c83da3ae45af087f6611635c348088ba35234d49
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
ms.openlocfilehash: b4679c83c6056eaeede9ab5b3f16c1565bf70526
ms.sourcegitcommit: 8699383914c24a0df033393f55db3369db728a7b
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 05/15/2019
ms.locfileid: "65636670"
---
Available since version 4.5
25.2
60
0.833333
pol_Latn
0.106891
dbe8492b115131211050f4a5688190777d13e0c0
13,236
md
Markdown
docs/cloudContainers.md
pleacu/fuse
22ccb614d4ca3056e182c3799fea104f0c7b3908
[ "Apache-2.0" ]
147
2015-01-19T16:31:31.000Z
2022-01-11T06:22:15.000Z
docs/cloudContainers.md
pleacu/fuse
22ccb614d4ca3056e182c3799fea104f0c7b3908
[ "Apache-2.0" ]
169
2015-01-09T13:40:33.000Z
2021-01-15T04:41:05.000Z
docs/cloudContainers.md
pleacu/fuse
22ccb614d4ca3056e182c3799fea104f0c7b3908
[ "Apache-2.0" ]
109
2015-01-08T05:36:11.000Z
2022-03-11T10:52:32.000Z
## Cloud Containers

Fabric leverages [jclouds](http://www.jclouds.org/) in order to allow Fabric to create new containers in public or private clouds. The Fabric cloud container provider allows you to create new compute instances with the cloud provider of your choice, perform firewall configuration and requirements installation and, last but not least, carry out the full container installation and automatic registration to the cluster.

### Requirements

The requirements for using this feature to create a container on a remote host are:

* **A valid account with a supported cloud provider**

The list of supported cloud providers can be found at [jclouds supported providers](http://www.jclouds.org/documentation/reference/supported-providers). **Important:** the term **supported provider** does not refer to commercial support; it is just an indication that an implementation is available.

#### Additional requirements for hybrid clusters

A hybrid cluster is a cluster composed of containers running both on the premises and inside a public cloud provider. This special type of cluster may impose some additional requirements:

* **Guarantee that all containers will be able to connect to the registry** *(refers to network connectivity)*

In order to satisfy this requirement, you will need to make sure that one of the following conditions is met:

* **The Fabric registry is running inside the public cloud**
* **Cloud and local containers are part of a VPN**
* **The Fabric registry is accessible from the internet** *(not recommended)*

In the case that the Fabric registry is running in the cloud, your local containers will have no problem accessing the registry as long as they are able to connect to the internet. However, if the Fabric registry is running on the premises, the cloud containers won't have access to your premises, at least not unless you make the registry accessible from the internet or set up a VPN. Having the registry accessible from the internet is not really the best idea, for obvious reasons. Setting up a VPN sounds like a better idea. The easiest approach is to host the registry in the cloud and configure the firewall in the cloud accordingly to only allow access to containers from the premises. By default Fabric will configure the firewall for you. Below you can see how you can create an ensemble in the cloud *(host the registry in the cloud)*; there is also a small demonstration video covering that case.

### Preparation

Before you can make use of this feature you will need to obtain a valid *identity* and *credential* for your cloud provider. That is not necessarily the username and password you obtained upon registration with the provider; it usually refers to the credentials you get for using the service from an external API. For example, for Amazon EC2 these can be found on the [security credentials page](https://aws-portal.amazon.com/gp/aws/securityCredentials).

The next step is to make sure that the container you will be using has all the required features installed. The core feature requirement is *fabric-jclouds*, which will give you access to the cloud container provider. Then you need to install the feature for your corresponding cloud provider.

    features:install fabric-jclouds

Note that if you are connected to a managed container the features command will not be available. In this case you will need to make sure that the feature above is part of the profile you are using.

#### Feature naming convention

The naming convention for the cloud provider features is jclouds-<provider id>, where provider id is as listed in [jclouds supported providers](http://www.jclouds.org/documentation/reference/supported-providers). Some common feature names:

* **jclouds-aws-ec2** The feature for Amazon EC2
* **jclouds-cloudservers-us** The feature for Rackspace

    features:install jclouds-aws-ec2
    features:install jclouds-cloudservers-us

For those two cloud providers Fabric also provides profiles that will install all the required features, so you can just use the provided profiles out of the box.

### Registering a cloud provider

Once you have installed all the required features, you need to register the cloud provider with Fabric. The registration process actually stores the provider credentials in the registry so that they can be used from any Fabric container. The registration can be done with the use of fabric:cloud-service-add.

To register the Amazon EC2 provider:

    fabric:cloud-service-add aws-ec2 myidentity mycredential

To register the Rackspace provider:

    fabric:cloud-service-add cloudservers-us myidentity mycredential

### Creating containers in the cloud

You can now use the *fabric:container-create-cloud* command to create new Fabric containers in the cloud. For example, to create a container on Amazon EC2:

    fabric:container-create-cloud --provider aws-ec2 mycontainer

We still haven't mentioned anything about images. This is just because specifying an image is optional. By default Fabric will try to find an *Ubuntu* image for you. You can provide predicates for the *operating system* and/or its *version*. For example, to choose *CentOS* instead of *Ubuntu* you could make use of the **--os-family** option:

    fabric:container-create-cloud --provider aws-ec2 --os-family centos mycontainer
    fabric:container-create-cloud --provider aws-ec2 --os-family centos --os-version 5 mycontainer

The latter will try to find any CentOS version that contains 5 in it. Of course, if you need to specify the exact image, Fabric allows you to do so using the **--image** option:

    fabric:container-create-cloud --provider aws-ec2 --image myimageid mycontainer

The command will display the creation status and also some useful information once the creation is complete:

    Looking up for compute service.
    Creating 1 nodes in the cloud. Using operating system: ubuntu. It may take a while ...
    Node fabric-f674a68f has been created.
    Configuring firewall.
    Installing fabric agent on container cloud. It may take a while...
    Overriding resolver to publichostname.
    [id]                   [container]   [public addresses]   [status]
    us-east-1/i-f674a68f   cloud         [23.20.114.82]       success

#### Creating a new Fabric ensemble in the cloud

The cloud container provider not only allows you to create containers in the cloud, it also allows you to create a new ensemble in the cloud using the **--ensemble-server** option. Here is a short clip that demonstrates how you can create a new ensemble in the cloud and join it from the premises:

[Video: Creating a Fabric ensemble in the cloud](http://www.youtube.com/watch?v=zTMoz_5bBDY)

As already mentioned above, this approach is the easier way to get going with a hybrid cloud solution.

### Images

Regardless of the way that you specify the image *(directly or indirectly)*, the image needs to have some of the following characteristics:

* **Linux images**
* **RedHat or Debian packaging style**
* **No Java installation, or Java 1.6+**

If there is no Java installed, Fabric will install it for you; however, it will not work if an incompatible Java version is already installed. You can also create an image of your own and use that instead. However, that may require different configuration preparation. For Amazon EC2 you will need to specify the owner id of the private image when registering the provider:

    fabric:cloud-service-add --owner myownerid cloudservers-us myidentity mycredential

### Locations & Hardware

Most cloud providers will give you the option to create containers in different locations or using different hardware profiles. You may wonder which are the proper values to use for your provider. Even though Fabric provides completion for *all* configuration options, you may still want to get a list of them.

To list all of the available locations:

    jclouds:location-list

The output for Amazon EC2 will look like this:

    [id]              [scope]     [description]
    aws-ec2           PROVIDER    https://ec2.us-east-1.amazonaws.com
    sa-east-1         REGION      sa-east-1
    sa-east-1a        ZONE        sa-east-1a
    sa-east-1b        ZONE        sa-east-1b
    us-west-1         REGION      us-west-1
    us-west-1a        ZONE        us-west-1a
    us-west-1c        ZONE        us-west-1c
    us-west-1b        ZONE        us-west-1b
    us-east-1         REGION      us-east-1
    us-east-1d        ZONE        us-east-1d
    us-east-1c        ZONE        us-east-1c
    us-east-1e        ZONE        us-east-1e
    us-east-1a        ZONE        us-east-1a
    us-east-1b        ZONE        us-east-1b
    ap-northeast-1    REGION      ap-northeast-1
    ap-northeast-1a   ZONE        ap-northeast-1a
    ap-northeast-1b   ZONE        ap-northeast-1b
    ap-southeast-1    REGION      ap-southeast-1
    ap-southeast-1a   ZONE        ap-southeast-1a
    ap-southeast-1b   ZONE        ap-southeast-1b
    eu-west-1         REGION      eu-west-1
    eu-west-1c        ZONE        eu-west-1c
    eu-west-1a        ZONE        eu-west-1a
    eu-west-1b        ZONE        eu-west-1b
    us-west-2         REGION      us-west-2
    us-west-2b        ZONE        us-west-2b
    us-west-2c        ZONE        us-west-2c
    us-west-2a        ZONE        us-west-2a

To list all the available hardware profiles:

    jclouds:hardware-list

The hardware profiles for Amazon EC2 will look like:

    [id]          [cpu]   [cores]   [ram]
    cc1.4xlarge   32.0    8.0       23552.0
    cg1.4xlarge   32.0    8.0       22528.0
    cc2.8xlarge   88.0    16.0      61952.0
    t1.micro      1.0     1.0       630.0
    c1.medium     5.0     2.0       1740.0
    c1.xlarge     20.0    8.0       7168.0
    m1.large      4.0     2.0       7680.0
    m1.small      1.0     1.0       1740.0
    m1.medium     2.0     1.0       3750.0
    m1.xlarge     8.0     4.0       15360.0
    m2.xlarge     6.5     2.0       17510.0
    m2.2xlarge    13.0    4.0       35020.0
    m2.4xlarge    26.0    8.0       70041.0

You can do the same for images. To make use of this information when creating a Fabric container in the cloud, specify the values as options:

    fabric:container-create-cloud --provider aws-ec2 --location eu-west-1 --hardware m2.4xlarge mycontainer

The example above will create a new Fabric container in the *eu-west-1* region using the *m2.4xlarge* hardware profile. Here is a small clip that demonstrates how you can acquire this information from your cloud provider:

[Video: Querying locations and hardware profiles](http://www.youtube.com/watch?v=wVsazzjIlAo)

### Fabric8 Examples in the Cloud

Fabric8 ships out of the box with some really interesting examples of how you can use Fabric8 with Camel and ActiveMQ. In the following clip you'll see how you can run one of these examples in the cloud using Fabric8's cloud container provider. More specifically, you'll see how to:

* **Set up MQ containers in the cloud as master/slave**
* **Discover brokers using the Fabric runtime registry**
* **Run the Camel example in the cloud, leveraging Fabric's broker discovery**
* **Use the shell to retrieve live information about your running routes**

[Video: Fabric8 examples in the cloud](http://www.youtube.com/watch?v=QBCxr5dHEHY)
61.277778
457
0.683968
eng_Latn
0.99564
dbe8c578802f29a93eb49b6f5e3d537da21bc12b
1,067
md
Markdown
projects/kotlin-util/1.4.0/docs/it.czerwinski.kotlin.util/-either/filter-is-instance-to-option.md
sczerwinski/sczerwinski.github.io
eaa7251935c47d15cc6e09fadc40de6ee7bdee9c
[ "Apache-2.0" ]
null
null
null
projects/kotlin-util/1.4.0/docs/it.czerwinski.kotlin.util/-either/filter-is-instance-to-option.md
sczerwinski/sczerwinski.github.io
eaa7251935c47d15cc6e09fadc40de6ee7bdee9c
[ "Apache-2.0" ]
5
2020-03-24T13:46:48.000Z
2021-12-22T15:32:57.000Z
projects/kotlin-util/1.4.0/docs/it.czerwinski.kotlin.util/-either/filter-is-instance-to-option.md
sczerwinski/sczerwinski.github.io
eaa7251935c47d15cc6e09fadc40de6ee7bdee9c
[ "Apache-2.0" ]
null
null
null
---
title: filterIsInstanceToOption -
---
//[kotlin-util](../../index.md)/[it.czerwinski.kotlin.util](../index.md)/[Either](index.md)/[filterIsInstanceToOption](filter-is-instance-to-option.md)

# filterIsInstanceToOption

[Kotlin utilities]

Brief description

Returns [Some](../-some/index.md) containing the same [Right](../-right/index.md) casted to type [T]() if it is [T](). Otherwise returns [None](../-none/index.md).

#### Return

[Some](../-some/index.md) containing the same [Right](../-right/index.md) casted to type [T]() if it is [T](). Otherwise returns [None](../-none/index.md).

#### Since

1.3

## Parameters

Kotlin utilities

| Name | Summary |
|---|---|
| T | Required type of the optional value. |

Content

inline fun <[T](filter-is-instance-to-option.md) : [Any](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-any/index.html)?> [filterIsInstanceToOption](filter-is-instance-to-option.md)(): [Option](../-option/index.md)<[Either](index.md)<[L](index.md), [T](filter-is-instance-to-option.md)>>
27.358974
295
0.653233
yue_Hant
0.392419
dbe8cb47abf9ff7cf1c4a8bb238128f02fa2a36b
6,043
md
Markdown
windows-driver-docs-pr/ifs/file-lock.md
AmadeusW/windows-driver-docs
6d272f80814969bbb5ec836cbbebdf5cae52ee35
[ "CC-BY-4.0", "MIT" ]
null
null
null
windows-driver-docs-pr/ifs/file-lock.md
AmadeusW/windows-driver-docs
6d272f80814969bbb5ec836cbbebdf5cae52ee35
[ "CC-BY-4.0", "MIT" ]
null
null
null
windows-driver-docs-pr/ifs/file-lock.md
AmadeusW/windows-driver-docs
6d272f80814969bbb5ec836cbbebdf5cae52ee35
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: FILE\_LOCK structure
description: The operating system uses the opaque FILE\_LOCK structure to support the locking of files.
ms.assetid: 89df2075-c542-4105-847f-9bc7ae4dab50
keywords: ["FILE_LOCK structure Installable File System Drivers", "PFILE_LOCK structure pointer Installable File System Drivers"]
topic_type:
- apiref
api_name:
- FILE_LOCK
api_location:
- ntifs.h
api_type:
- HeaderDef
---

# FILE\_LOCK structure

The operating system uses the opaque FILE\_LOCK structure to support the locking of files.

Syntax
------

```ManagedCPlusPlus
typedef struct _FILE_LOCK {
  PCOMPLETE_LOCK_IRP_ROUTINE CompleteLockIrpRoutine;
  PUNLOCK_ROUTINE            UnlockRoutine;
  BOOLEAN                    FastIoIsQuestionable;
  BOOLEAN                    SpareC[3];
  PVOID                      LockInformation;
  FILE_LOCK_INFO             LastReturnedLockInfo;
  PVOID                      LastReturnedLock;
  volatile LONG              LockRequestsInProgress;
} FILE_LOCK, *PFILE_LOCK;
```

Members
-------

**CompleteLockIrpRoutine**
Reserved for system use.

**UnlockRoutine**
Reserved for system use.

**FastIoIsQuestionable**
Reserved for system use.

**SpareC**
Reserved for system use.

**LockInformation**
Reserved for system use.

**LastReturnedLockInfo**
Reserved for system use.

**LastReturnedLock**
Reserved for system use.

**LockRequestsInProgress**
Reserved for system use.

Remarks
-------

File system legacy filter drivers and minifilters can use a variety of routines to create and use FILE\_LOCK objects, as well as to test for read/write access to files.

- To allocate a FILE\_LOCK object, call [**FsRtlAllocateFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff545640) or [**FltAllocateFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff541743).

- To initialize an uninitialized FILE\_LOCK object, call [**FsRtlInitializeFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff546122) or [**FltInitializeFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff543273). Be aware that a FILE\_LOCK returned from [**FsRtlAllocateFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff545640) or [**FltAllocateFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff541743) is already initialized.

- To uninitialize a FILE\_LOCK object, call [**FsRtlUninitializeFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff547313) or [**FltUninitializeFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff544595).

- To free a FILE\_LOCK object that is allocated by the [**FsRtlAllocateFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff545640) routine, call [**FsRtlFreeFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff546011). To free a FILE\_LOCK object that is allocated by the [**FltAllocateFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff541743) routine, call [**FltFreeFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff542969).

After a FILE\_LOCK has been initialized, routines such as [**FsRtlCheckLockForReadAccess**](https://msdn.microsoft.com/library/windows/hardware/ff545758), [**FltCheckLockForWriteAccess**](https://msdn.microsoft.com/library/windows/hardware/ff541837), and [**FsRtlFastCheckLockForRead**](https://msdn.microsoft.com/library/windows/hardware/ff545918) can be used to determine if the file can be accessed by other threads.

Requirements
------------

<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<tbody>
<tr class="odd">
<td align="left"><p>Version</p></td>
<td align="left"><p>Available in Microsoft Windows 2000, and later versions of the Windows operating system.</p></td>
</tr>
<tr class="even">
<td align="left"><p>Header</p></td>
<td align="left">Ntifs.h (include FltKernel.h or Ntifs.h)</td>
</tr>
</tbody>
</table>

## See also

[**FltAllocateFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff541743)

[**FltCheckLockForReadAccess**](https://msdn.microsoft.com/library/windows/hardware/ff541834)

[**FltCheckLockForWriteAccess**](https://msdn.microsoft.com/library/windows/hardware/ff541837)

[**FltInitializeFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff543273)

[**FsRtlAllocateFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff545640)

[**FsRtlCheckLockForReadAccess**](https://msdn.microsoft.com/library/windows/hardware/ff545758)

[**FsRtlCheckLockForWriteAccess**](https://msdn.microsoft.com/library/windows/hardware/ff545760)

[**FsRtlFastCheckLockForRead**](https://msdn.microsoft.com/library/windows/hardware/ff545918)

[**FsRtlFastCheckLockForWrite**](https://msdn.microsoft.com/library/windows/hardware/ff545928)

[**FsRtlInitializeFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff546122)

[**FsRtlUninitializeFileLock**](https://msdn.microsoft.com/library/windows/hardware/ff547313)
44.762963
925
0.748138
yue_Hant
0.394483
dbe8ddf773438475a7f7c4e7eb5fc12a4a20196f
15,263
md
Markdown
SimpleControl.md
bitfocus/companion-module-panasonic-kairos
2dc5eda864cbae4bdb420b624d396c3a825e4fca
[ "MIT" ]
null
null
null
SimpleControl.md
bitfocus/companion-module-panasonic-kairos
2dc5eda864cbae4bdb420b624d396c3a825e4fca
[ "MIT" ]
null
null
null
SimpleControl.md
bitfocus/companion-module-panasonic-kairos
2dc5eda864cbae4bdb420b624d396c3a825e4fca
[ "MIT" ]
1
2022-03-28T19:47:21.000Z
2022-03-28T19:47:21.000Z
# Simple Panasonic Kairos Control Protocol Definition

#### Change History

| Date | Description |
|--------------------|---------------------------|
| October 6th 2020 | First edition |
| October 28th 2020 | Updated escape sequences; added list and info command description |
| January 6th 2021 | General update and minor fixes |
| November 17th 2021 | Introduced the new application event and some additional information about the subscribe command |
| March 1st 2022 | Introduced the empty list command to query top-level elements |
| March 7th 2022 | Introduced keep-alive messages |

## Introduction

The [Simple Panasonic Kairos Control Protocol](http://youtrack.panasonic.wdt/issue/MS-491) can be used to allow external control devices / software to get access to parameters / functions of the Kairos. The protocol does not define which parameters / functions are exposed, so it is not possible to prepare a list here. The attribute meta information defines the exposed status of the objects. In order to expose an attribute it needs to have the meta information "extern". Besides the "extern" flag, the "read_only" flag exists to specify the attribute permissions. A specific "read_only" flag that is only valid for external control does not exist yet.

The [Simple Panasonic Kairos Control Protocol](http://youtrack.panasonic.wdt/issue/MS-491) uses TCP as the transport protocol. Kairos listens on port 3005 by default.

### Command

The syntax of the protocol is rather simple to allow easy integration with 3rd party devices / applications. Objects and attributes are concatenated with '.' characters. For example, "Mixer.AUX.AUX1.source" would specify the source parameter of the AUX1 object. AUX1 is located in AUX and AUX is located in Mixer.

To describe how to build a command, certain things need to be considered. All objects and attributes have a logical name, like "AUX1" for example. This information alone is not specific enough, because there could be multiple elements with this particular name. The objects are stored in a tree structure and the "AUX1" object that we want to access needs to be specified by its path, beginning from the root object (which is excluded in the command). So in this case "AUX1" could be accessed by "Mixer.AUX.AUX1". This can produce rather lengthy names that are hard to write at some point. To make it simpler the Kairos system provides a way to insert "short cuts" by adding what is called a "script name" to the object description. This is done in the case of AUX1, which allows us to access it by "AUX1" directly. Since the "script name" is a global identifier it has to be unique. If we have a second object with the name AUX1 that we want to access, we would need to specify it by its path to make clear which object we want to access, or it needs to have another unique "script name" that belongs to this object. If an intermediate node has a "script name" this can also be used to shorten the name, because a "script name" can be used as a starting point for the path, since it has to be globally unique.

E.g. accessing the Main Scene from root or from SCENES (note SCENES is the "script name" of the Mixer.Scenes element):

````
- Mixer.Scenes.Main
- SCENES.Main
````

Every command ends with a "\r\n" character sequence, also referred to as <CR><LF> or in ASCII 0D 0A. Since newline is also used by some systems as just '\n' or <LF> or ASCII 0A, this is also accepted as a command ending sequence by the Kairos Server. The Kairos Server always uses '\r\n' as the command ending sequence.
If we want to write to an attribute we have to specify it and then use the assignment operator '='. For example, setting the "AUX1.source" parameter to "IN1" would look like this:

````
"AUX1.source=IN1\r\n"
````

Note that "IN1" is also a logical name that follows the same rules as the left hand side of the operation. "IN1" has a "script name" to allow us easier access; otherwise we would need to write:

````
AUX1.source=Mixer.Inputs.IN1\r\n
````

The term attribute is used here mostly for parameters like "source". But an attribute can also be something that is called a "function", like a "play" function that executes a macro or a clip player. Since functions do not have any kind of value assigned to them, the query and assignment operations have a slightly different meaning. Let's use the "recall_layout" and "clear_layout" functions of a Multiviewer element "MV1" as an example to demonstrate function behavior. The syntax to execute the "recall_layout" function is the same as an assignment. The left hand side of the operation describes the function and the right hand side defines the function argument.

````
MV1.recall_layout=1\r\n
````

The "clear_layout" function has no arguments. In this case the right hand side of the assignment can be empty or it can be written as a query. For example:

````
MV1.clear_layout=\r\n
MV1.clear_layout\r\n
````

Note: The Kairos Creator will provide a view of the available commands. Additionally, the list and info functions (see [Protocol specific functions](#protocol-specific-functions)) can be used to query sub-elements or attributes of a given object.

### Keep-Alive Messages

The protocol specification introduces a keep-alive message to allow the protocol server to identify and remove orphaned connections. The keep-alive message is mandatory since Kairos v1.2. The message itself is an empty message with the line ending character sequence "\r\n". The server won't send any response message. Note: Earlier Kairos versions respond with an error message to this command.

The server will disconnect any client that didn't send a message within the time period of 10 seconds. To avoid accidental disconnects it is recommended to send at least one message within half the period, 5 seconds. A normal message has the same impact on the timeout behavior as the keep-alive message. This means that if a client sends one message every 5 seconds, it is not required to send a dedicated keep-alive message to keep the connection active. Messages sent from the server won't reset the timeout. E.g., in case an active [subscription](#subscribe) sends messages every 5 seconds, the client still needs to actively send keep-alive messages to keep the connection active.

### Escape Sequences

The protocol syntax has some specific characters that cannot be used in an object name description. If an object is named "My.AUX", the "." character in the name needs to be escaped. The HTML escape character sequences are used:

````
: - &#58;
. - &#46;
= - &#61;
\ - &#92;
\r - &#13;
\n - &#10;
````

##### Example

Assume we want to set the source of My.AUX to My.Col; it needs to be written like this (if directly accessible):

````
My&#46;AUX.source=My&#46;Col\r\n
````

### Response

Every request produces some sort of response from the Kairos Server. For example, if we want to know which source is selected on "AUX1" we send the following command:

````
Client: "AUX1.source\r\n"
Server: "AUX1.source=IN1\r\n"
````

Note: Since the response from the Kairos Server is a valid command, this can be used to look up a command. For example, if it is not known how to set a specific source on AUX1.source, someone could use the Kairos Creator, select the source there, and then send a query to look up the desired command.
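The query/assignment round trip above maps directly onto a small line-based TCP client. Below is a minimal sketch, not part of the specification: the host name is a placeholder and error handling is omitted.

````python
import socket

HOST, PORT = "kairos.local", 3005  # placeholder host; Kairos listens on port 3005 by default

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    f = sock.makefile("rwb")

    def send(command: str) -> None:
        f.write(command.encode("utf-8") + b"\r\n")  # every command ends with \r\n
        f.flush()

    send("AUX1.source")  # query the current source of AUX1
    print(f.readline().decode("utf-8").rstrip("\r\n"))  # e.g. "AUX1.source=IN1"

    send("AUX1.source=IN1")  # assignment: the server acknowledges with "OK" or "Error"
    print(f.readline().decode("utf-8").rstrip("\r\n"))
````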
Usually the server responds to a client request before the next request is executed. This can lead to the assumption that the next message received by the client belongs to the previous request. This cannot be guaranteed by the system because of the subscribe/unsubscribe mechanism described in [Protocol specific functions](###Protocol specific functions).

#### Error Handling

In some error cases the server response can be a simple "Error\r\n" indicating that the last command was not executed. As of today, no further description of the error itself is included in this response. It can be either that the command was written in the wrong way, or that the command was fine but one object could not be resolved (which could be on the left hand side or the right hand side in case of an assignment).

Some special error conditions exist. One causes a "Permission Error\r\n" response: an assignment command was sent and the left hand side of the assignment has the "read_only" flag assigned to it. Another special error condition produces an "Enum Error\r\n" response. Some attributes are formed by an enumeration type and therefore the values that can be specified are limited. If someone tries to set a value that is not defined, this error message shows up. In case of integral values with a defined min or max value, no error is produced; if the value specified is not inside the valid range, it will be set to either min or max, depending on the value provided.

In case of an assignment or the execution of one of the [Protocol specific functions](###Protocol specific functions), the Kairos will respond with "OK\r\n" to acknowledge the command execution, or with the simple "Error\r\n" response if an error occurred.

### Protocol specific functions

The protocol can be used to query values from parameters, assign values to parameters and execute certain functions of a system. This allows a wide variety of applications to be implemented with this protocol. In the current implementation the protocol supports specific functions to extend these capabilities. Those functions are "subscribe" and its counterpart "unsubscribe", as well as "info" and "list". The protocol specific functions are written like this:

````
<function>:<command>\r\n
````

The following sections explain those specific functions in detail and show some example use cases.

#### Subscribe

Usually the Kairos will only send messages to the client in response to a request (query or assignment). The "subscribe" command creates an exception to this rule. Once a client has established a subscription to an attribute, a message is automatically sent to this client as soon as the attribute value has changed. The message format is the same as for normal response messages. An obvious use case for the subscribe mechanism would be the "tally" attribute of an object. In this example the subscribe function for the tally attribute of the "IN1" object is shown:

````
Client: subscribe:IN1.tally\r\n
Server: OK\r\n
````

Note: In the current version of the protocol it is not possible to query active subscriptions, nor is it possible to execute an unsubscribe function on "all" elements. The client should keep track of its subscriptions if needed. The subscription can only be used on attribute level; it is not possible to subscribe to an entire object like "IN1".
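A subscribing client has to be prepared to receive messages at any time, not only after its own requests. A minimal Python sketch of such a listener might look like this (`sock` is assumed to be an already connected TCP socket; keep-alive handling is omitted for brevity):

````python
import socket

def watch_tally(sock: socket.socket) -> None:
    """Subscribe to IN1.tally and print every unsolicited update."""
    sock.sendall(b"subscribe:IN1.tally\r\n")
    buf = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:  # server closed the connection
            break
        buf += chunk
        # Messages are delimited by "\r\n"; split out every complete line.
        while b"\r\n" in buf:
            line, buf = buf.split(b"\r\n", 1)
            print(line.decode())  # "OK" first, then "IN1.tally=..." on changes
````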
#### Unsubscribe

To remove a subscription, the "unsubscribe" function is used like this:

````
Client: unsubscribe:IN1.tally\r\n
Server: OK\r\n
````

Note: If the "unsubscribe" function is called on an attribute without an active subscription, the Kairos will respond with an "Error\r\n" message.

#### List & Info

The "list" and "info" commands require special handling because these commands produce a multiline response. The client can detect the end of the response by searching for two consecutive "\r\n" tokens.

The "list" command provides information about all elements that are accessible within an object. This command can be used to recursively query all accessible elements. In a scenario where no information about any elements is available to the client, the client can send an empty list command to query the top level elements.

````
Client: list:\r\n
Server: SYS\r\n
Environment\r\n
Mixer\r\n
\r\n
````

The list function can also be used to query accessible elements within an already known element, SCENES:

````
Client: list:SCENES\r\n
Server: SCENES.Main\r\n
SCENES.Templates\r\n
\r\n
````

This information does not specify what kind of element Main or Templates is. In this example both elements are different: Main is a real scene element that can be controlled, and Templates is just a directory that contains scenes. This information cannot be retrieved via the protocol; basic knowledge of the production structure is required for the client here. If the client is aware that the Templates sub-element is a directory that contains scenes, another list command can be used:

````
Client: list:SCENES.Templates\r\n
Server: SCENES.Templates.2Box\r\n
SCENES.Templates.4Box\r\n
SCENES.Templates.OTS Left\r\n
SCENES.Templates.OTS Right\r\n
SCENES.Templates.Title\r\n
SCENES.Templates.Sidecar\r\n
\r\n
````

The same command can now be applied to all sub-elements to find the layers or transitions of a scene. The "list" command does not allow recursive execution. If an element is deeply nested, multiple "list" commands might be required to look it up.

The "info" command provides information about the attributes of a given element. Let's assume the client wants to control the 2Box scene element. To query the attributes of the 2Box scene element, the following command can be used:

````
Client: info:SCENES.Templates.2Box\r\n
Server: SCENES.Templates.2Box.advanced_resolution_control\r\n
SCENES.Templates.2Box.resolution_x\r\n
SCENES.Templates.2Box.resolution_y\r\n
SCENES.Templates.2Box.tally\r\n
SCENES.Templates.2Box.color\r\n
SCENES.Templates.2Box.resolution\r\n
SCENES.Templates.2Box.auto\r\n
SCENES.Templates.2Box.cut\r\n
\r\n
````

Again, to keep the protocol simple, this command will just provide a list of all elements that are available. Additional information like the data type of an attribute or the number of function arguments is not available, and there is no command to retrieve such information. It can be seen as some sort of help mechanism that allows the client to get a basic overview of the available elements and how to get access to them.
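Since list and info responses span several lines, the client has to read until the empty line that terminates the response. A possible Python helper (a sketch, assuming no other traffic such as subscription updates is interleaved on the connection):

````python
import socket

def query_list(sock: socket.socket, path: str = "") -> list[str]:
    """Send a list command and collect the multiline response.

    The response ends with two consecutive "\r\n" tokens, i.e. an empty line.
    """
    sock.sendall(f"list:{path}\r\n".encode())
    buf = b""
    while b"\r\n\r\n" not in buf:
        buf += sock.recv(4096)
    body, _ = buf.split(b"\r\n\r\n", 1)
    return body.decode().split("\r\n")

# query_list(sock) -> ["SYS", "Environment", "Mixer"]  (top level elements)
````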
### Event Notification

The idea of the event notification is to inform clients about certain system state changes that are out of the scope of the protocol and cannot be handled by the server. The client receives this information and has to take some action to handle these cases. The event notification messages are written like this:

````
APPLICATION:<event>\r\n
````

The following list shows the event notifications and their meaning.

#### New

The new event is sent out to the clients in case of a data model recreation. This is the case when a new production / environment file gets loaded. Some elements queried by the client prior to this event might not exist anymore, or new elements may appear. In this situation the server cannot keep subscriptions, and all active subscriptions get invalidated. This event can be seen by the client as a complete reset. The new event syntax:

````
APPLICATION:NEW\r\n
````

### Examples

#### Subscribe to the tally parameter and put the source on air

````
// query tally information
Client: IN1.tally\r\n
Server: IN1.tally=0\r\n

// subscribe
Client: subscribe:IN1.tally\r\n
Server: OK\r\n

// AUX1 is on air, we select IN1 as source
Client: AUX1.source=IN1\r\n
Server: OK\r\n
Server: IN1.tally=1\r\n
````
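Putting the pieces together, a long-running client has to interleave keep-alive messages with normal traffic and react to the APPLICATION:NEW event. The following Python sketch shows one possible receive loop (illustrative only; `sock` is assumed to be a connected TCP socket):

````python
import socket
import time

KEEPALIVE_PERIOD = 5.0  # half of the 10 s server timeout

def run(sock: socket.socket) -> None:
    """Receive loop that interleaves keep-alive messages with server traffic."""
    sock.settimeout(KEEPALIVE_PERIOD)
    last_sent = time.monotonic()
    buf = b""
    while True:
        if time.monotonic() - last_sent >= KEEPALIVE_PERIOD:
            sock.sendall(b"\r\n")  # empty keep-alive message, no response expected
            last_sent = time.monotonic()
        try:
            chunk = sock.recv(4096)
        except socket.timeout:
            continue  # nothing received; loop back to send the next keep-alive
        if not chunk:
            break  # server closed the connection
        buf += chunk
        while b"\r\n" in buf:
            line, buf = buf.split(b"\r\n", 1)
            if line == b"APPLICATION:NEW":
                print("data model recreated - all subscriptions are now invalid")
            elif line:
                print("server:", line.decode())
````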
45.834835
129
0.761056
eng_Latn
0.999458
dbe95f83b73a54030f0405ed07d4ce2396787b99
10,452
md
Markdown
ru/managed-sqlserver/operations/update.md
Orioleon/docs
c848423e641b3cbfe5defc26755b8d550af5c4a0
[ "CC-BY-4.0" ]
1
2022-03-03T01:02:33.000Z
2022-03-03T01:02:33.000Z
ru/managed-sqlserver/operations/update.md
Orioleon/docs
c848423e641b3cbfe5defc26755b8d550af5c4a0
[ "CC-BY-4.0" ]
null
null
null
ru/managed-sqlserver/operations/update.md
Orioleon/docs
c848423e641b3cbfe5defc26755b8d550af5c4a0
[ "CC-BY-4.0" ]
null
null
null
# Changing cluster settings

After creating a cluster, you can:

- [Change the host class](#change-resource-preset).
- [Increase the storage size](#change-disk-size) (available only for standard `network-hdd` and fast network `network-ssd` storage).
- [Change the {{ MS }} settings](#change-sqlserver-config) according to the {{ MS }} documentation.
- [{#T}](#change-additional-settings).
- [Move the cluster](#move-cluster) to a different folder.

{% note warning %}

You cannot use SQL commands to change {{ MS }} settings, including managing server-level objects such as `Linked Server`, `Login`, `Credential`, `SQL Server Agent Job`, `Maintenance Plan`, `Audit`, `Polybase`, `Replication`, `Backup devices`, `Server Triggers`, and `Extended events`.

{% endnote %}

## Changing the host class {#change-resource-preset}

{% list tabs %}

- Management console

  1. Go to the folder page and select **{{ mms-name }}**.
  1. Select the cluster and click **Edit cluster** in the panel at the top.
  1. Select a new [host class](../concepts/instance-types.md). When you change the host class for a cluster, the characteristics of all hosts that have already been created change as well.
  1. Click **Save changes**.

- Terraform

  1. Open the current {{ TF }} configuration file with the infrastructure plan. For information about how to create this file, see [{#T}](./cluster-create.md).
  1. In the {{ mms-name }} cluster description, change the value of the `resource_preset_id` parameter in the `resources` block:

     ```hcl
     resource "yandex_mdb_sqlserver_cluster" "<cluster name>" {
       ...
       resources {
         resource_preset_id = "<host class>"
         ...
       }
     }
     ```

  1. Make sure the settings are correct.

     {% include [terraform-validate](../../_includes/mdb/terraform/validate.md) %}

  1. Confirm the change of resources.

     {% include [terraform-apply](../../_includes/mdb/terraform/apply.md) %}

  For more information, see the [{{ TF }} provider documentation]({{ tf-provider-mms }}).

- API

  Use the [update](../api-ref/Cluster/update.md) API method and pass the following in the request:

  - The cluster ID in the `clusterId` parameter. To find out the ID, [get a list of clusters in the folder](cluster-list.md#list-clusters).
  - The new host class in the `configSpec.resources.resourcePresetId` parameter. To get a list of supported values, use the `list` method for `ResourcePreset`.
  - The list of cluster configuration fields to change (in this case, `configSpec.resources.resourcePresetId`) in the `updateMask` parameter.

  {% note warning %}

  This API method resets all cluster settings that were not explicitly passed in the request to their default values. To avoid this, list the settings you want to change in the `updateMask` parameter (a single comma-separated line).

  {% endnote %}

{% endlist %}

## Increasing the storage size {#change-disk-size}

{% list tabs %}

- Management console

  1. Go to the folder page and select **{{ mms-name }}**.
  1. Select the cluster and click **Edit cluster** in the panel at the top.
  1. In the **Storage size** section, specify the required value.
  1. Click **Save changes**.

- Terraform

  1. Open the current {{ TF }} configuration file with the infrastructure plan. For information about how to create this file, see [{#T}](./cluster-create.md).
  1. In the {{ mms-name }} cluster description, change the value of the `disk_size` parameter in the `resources` block:

     ```hcl
     resource "yandex_mdb_sqlserver_cluster" "<cluster name>" {
       ...
       resources {
         disk_size = <storage size in gigabytes>
         ...
       }
     }
     ```

  1. Make sure the settings are correct.
     {% include [terraform-validate](../../_includes/mdb/terraform/validate.md) %}

  1. Confirm the change of resources.

     {% include [terraform-apply](../../_includes/mdb/terraform/apply.md) %}

  For more information, see the [{{ TF }} provider documentation]({{ tf-provider-mms }}).

- API

  Use the [update](../api-ref/Cluster/update.md) API method and pass the following in the request:

  - The cluster ID in the `clusterId` parameter. To find out the ID, [get a list of clusters in the folder](cluster-list.md#list-clusters).
  - The required storage size (in bytes) in the `configSpec.resources.diskSize` parameter.
  - The list of configuration fields to change (in this case, `configSpec.resources.diskSize`) in the `updateMask` parameter.

  {% note warning %}

  This API method resets all cluster settings that were not explicitly passed in the request to their default values. To avoid this, list the settings you want to change in the `updateMask` parameter (a single comma-separated line).

  {% endnote %}

{% endlist %}

## Changing {{ MS }} settings {#change-sqlserver-config}

{% list tabs %}

- Management console

  1. Go to the folder page and select **{{ mms-name }}**.
  1. Select the cluster and click **Edit cluster** in the panel at the top.
  1. In the **DBMS settings** block, click **Configure**.
  1. Make the required changes to the settings and click **Save**.
  1. Click **Save changes**.

- Terraform

  1. Open the current {{ TF }} configuration file with the infrastructure plan. For information about how to create this file, see [{#T}](./cluster-create.md).
  1. In the {{ mms-name }} cluster description, change the parameter values in the `sqlserver_config` block:

     ```hcl
     resource "yandex_mdb_sqlserver_cluster" "<cluster name>" {
       ...
       sqlserver_config {
         ...
       }
     }
     ```

  1. Make sure the settings are correct.

     {% include [terraform-validate](../../_includes/mdb/terraform/validate.md) %}

  1. Confirm the change of resources.

     {% include [terraform-apply](../../_includes/mdb/terraform/apply.md) %}

  For more information, see the [{{ TF }} provider documentation]({{ tf-provider-mms }}).

- API

  Use the [update](../api-ref/Cluster/update.md) API method and pass the following in the request:

  - The cluster ID in the `clusterId` parameter. To find out the ID, [get a list of clusters in the folder](cluster-list.md#list-clusters).
  - The required values in the `configSpec.sqlserverConfig_2016sp2` parameter.
  - The list of configuration fields to change (in this case, `configSpec.sqlserverConfig_2016sp2`) in the `updateMask` parameter.

  {% note warning %}

  This API method resets all cluster settings that were not explicitly passed in the request to their default values. To avoid this, list the settings you want to change in the `updateMask` parameter (a single comma-separated line).

  {% endnote %}

{% endlist %}

## Changing additional cluster settings {#change-additional-settings}

{% list tabs %}

- Management console

  1. Go to the folder page and select **{{ mms-name }}**.
  1. Select the cluster and click **Edit** at the top of the page.
  1. Change the additional cluster settings:

     {% include [Additional settings](../../_includes/mdb/mms/extra-settings.md) %}

     {% include [Deletion protection limits](../../_includes/mdb/deletion-protection-limits-db.md) %}

  1. Click **Save changes**.

- Terraform

  1. Open the current {{ TF }} configuration file with the infrastructure plan. For information about how to create this file, see [{#T}](cluster-create.md).
  1. To change the backup window start time, add a `backup_window_start` block to the {{ mms-name }} cluster description:

     ```hcl
     resource "yandex_mdb_sqlserver_cluster" "<cluster name>" {
       ...
       backup_window_start {
         hours   = <backup window start hour>
         minutes = <backup window start minute>
       }
       ...
     }
     ```

  1. To protect the cluster against accidental deletion by a user of your cloud, add a `deletion_protection` field set to `true` to the cluster description:

     ```hcl
     resource "yandex_mdb_sqlserver_cluster" "<cluster name>" {
       ...
       deletion_protection = <cluster deletion protection: true or false>
     }
     ```

     {% include [Deletion protection limits](../../_includes/mdb/deletion-protection-limits-db.md) %}

  1. Make sure the settings are correct.

     {% include [terraform-validate](../../_includes/mdb/terraform/validate.md) %}

  1. Confirm the change of resources.

     {% include [terraform-apply](../../_includes/mdb/terraform/apply.md) %}

  For more information, see the [{{ TF }} provider documentation]({{ tf-provider-mms }}).

- API

  Use the [update](../api-ref/Cluster/update.md) API method and pass the following in the request:

  - The cluster ID in the `clusterId` parameter. To find out the ID, [get a list of clusters in the folder](cluster-list.md#list-clusters).
  - The new backup window start time in the `configSpec.backupWindowStart` parameter.
  - The list of cluster configuration fields to change (in this case, `configSpec.backupWindowStart`) in the `updateMask` parameter.

  {% note warning %}

  This API method resets all cluster settings that were not explicitly passed in the request to their default values. To avoid this, list the settings you want to change in the `updateMask` parameter (a single comma-separated line).

  {% endnote %}

{% endlist %}

## Moving a cluster {#move-cluster}

{% list tabs %}

- Management console

  1. Go to the folder page and select **{{ mms-name }}**.
  1. Click the ![image](../../_assets/horizontal-ellipsis.svg) icon to the right in the row of the cluster you want to move.
  1. Select **Move**.
  1. Select the folder you want to move the cluster to.
  1. Click **Move**.

- API

  Use the [move](../api-ref/Cluster/move.md) API method and pass the following in the request:

  - The cluster ID in the `clusterId` parameter. To find out the ID, [get a list of clusters in the folder](cluster-list.md#list-clusters).
  - The destination folder ID in the `destinationFolderId` parameter.

{% endlist %}
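As a rough illustration of the `move` call, a request could look like the following Python sketch. The endpoint host, the URL path, and the authentication flow shown here are assumptions; check the [move](../api-ref/Cluster/move.md) API reference for the exact request format:

```python
import requests

IAM_TOKEN = "<IAM token>"    # assumed to be obtained beforehand
CLUSTER_ID = "<cluster ID>"
# Assumed endpoint; verify against the API reference.
BASE = "https://mdb.api.cloud.yandex.net/managed-sqlserver/v1"

resp = requests.post(
    f"{BASE}/clusters/{CLUSTER_ID}:move",
    headers={"Authorization": f"Bearer {IAM_TOKEN}"},
    json={"destinationFolderId": "<destination folder ID>"},
)
resp.raise_for_status()
print(resp.json())  # an Operation object describing the move
```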
38.855019
285
0.696039
rus_Cyrl
0.899319
dbe9c50c9234df8035bdc52055be12a74f2ef964
28,436
markdown
Markdown
_posts/2007-06-21-responsive-cutscenes-in-video-games.markdown
api-evangelist/patents-2007
da723589b6977a05c0119d5476325327da6c5a5c
[ "Apache-2.0" ]
1
2017-11-15T11:20:53.000Z
2017-11-15T11:20:53.000Z
_posts/2007-06-21-responsive-cutscenes-in-video-games.markdown
api-evangelist/patents-2007
da723589b6977a05c0119d5476325327da6c5a5c
[ "Apache-2.0" ]
null
null
null
_posts/2007-06-21-responsive-cutscenes-in-video-games.markdown
api-evangelist/patents-2007
da723589b6977a05c0119d5476325327da6c5a5c
[ "Apache-2.0" ]
2
2019-10-31T13:03:32.000Z
2020-08-13T12:57:02.000Z
---
title: Responsive cutscenes in video games
abstract: A determination is made that a player's avatar has performed an action while an audio signal representing a narrative of a non-player character is being produced. The action is mapped to an impression, which is mapped to a response. The audio signal is stopped before it is completed and the response is played by providing audio for the non-player character and/or animating the non-player character. After the response is played, steps ensure that critical information in the narrative has been provided to the player.
url: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=%2Fnetahtml%2FPTO%2Fsearch-adv.htm&r=1&f=G&l=50&d=PALL&S1=08622831&OS=08622831&RS=08622831
owner:
number: 08622831
owner_city:
owner_country:
publication_date: 20070621
---
Video games typically include an avatar which is a character or object in the game that is controlled by a player and non player characters which are controlled by the game. In many games the player's avatar is able to interact with non player characters such that the non player characters will respond to actions taken by the player's avatar. For example if a player's avatar attacks a non player character the non player character may counter attack or run away.

Within video games it is common for developers to include audio and video segments known as cutscenes that provide narrative information such as a story line for the game contextual information for playing the game or instructions for proceeding forward in the game. Traditionally such cut scenes interrupted the game and took away the player's control of their avatar. Such cut scenes provide a movie like experience where the player simply watches the action in the cut scene.

Some video games have allowed the player to continue to control their avatar during the cut scene. However actions taken by the avatar during such cut scenes are ignored by the non player characters in the cut scene. Thus the non player characters do not interact with the player's avatar during the cut scene and seem to become robotic.

The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.

A determination is made that a player's avatar has performed an action while an audio signal representing a narrative of a non player character is being produced. The action is mapped to an impression which is mapped to a response. The audio signal is stopped before it is completed and the response is played by providing audio for the non player character and/or animating the non player character. After the response is played steps ensure that critical information in the narrative has been provided to the player.

This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.

As shown in gaming and media system includes a game and media console hereinafter console . Console is configured to accommodate one or more wireless controllers as represented by controllers and . 
A command button on console is used to create a new wireless connection between one of the controllers and the console . Console is equipped with an internal hard disk drive not shown and a media drive that supports various forms of portable storage media as represented by optical storage disc . Examples of suitable portable storage media include DVD CD ROM game discs and so forth. Console also includes two memory unit card receptacles and for receiving removable flash type memory units . Console also includes an optical port for communicating wirelessly with one or more devices and two USB Universal Serial Bus ports and to support a wired connection for additional controllers or other peripherals. In some implementations the number and arrangement of additional ports may be modified.

A power button and an eject button are also positioned on the front face of game console . Power button is selected to apply power to the game console and can also provide access to other features and controls and eject button alternately opens and closes the tray of a portable media drive to enable insertion and extraction of a storage disc .

Console connects to a television or other display not shown via A V interfacing cables . In one implementation console is equipped with a dedicated A V port not shown configured for content secured digital communication using A V cables e.g. A V cables suitable for coupling to a High Definition Multimedia Interface HDMI port on a high definition monitor or other display device . A power cable provides power to the game console. Console may be further configured with broadband capabilities as represented by a cable or modem connector to facilitate access to a network such as the Internet.

Each controller is coupled to console via a wired or wireless interface. In the illustrated implementation the controllers are USB compatible and are coupled to console via a wireless or USB port . Console may be equipped with any of a wide variety of user interaction mechanisms. In an example illustrated in each controller is equipped with two thumbsticks and a D pad buttons User Guide button and two triggers . By pressing and holding User Guide button a user is able to power up or power down console . By pressing and releasing User Guide button a user is able to cause a User Guide Heads Up Display HUD user interface to appear over the current graphics displayed on monitor . The controllers described above are merely representative and other known gaming controllers may be substituted for or added to those shown in .

Controllers each provide a socket for a plug of a headset . Audio data is sent through the controller to a speaker in headset to allow sound to be played for a specific player wearing headset . Headset also includes a microphone that detects speech from the player and conveys an electrical signal to the controller representative of the speech. Controller then transmits a digital signal representative of the speech to console . Audio signals may also be provided to a speaker in monitor or to separate speakers connected to console .

In one implementation not shown a memory unit MU may also be inserted into one of controllers and to provide additional and portable storage. Portable MUs enable users to store game parameters and entire games for use when playing on other consoles. In this implementation each console is configured to accommodate two MUs although more or less than two MUs may also be employed. 
Gaming and media system is generally configured for playing games stored on a memory medium as well as for downloading and playing games and reproducing pre recorded music and videos from both electronic and hard media sources. With the different storage offerings titles can be played from the hard disk drive from optical disk media e.g. from an online source from a peripheral storage device connected to USB connections or from MU . CPU memory controller and various memory devices are interconnected via one or more buses not shown . The details of the bus that is used in this implementation are not particularly relevant to understanding the subject matter of interest being discussed herein. However it will be understood that such a bus might include one or more of serial and parallel buses a memory bus a peripheral bus and a processor or local bus using any of a variety of bus architectures. By way of example such architectures can include an Industry Standard Architecture ISA bus a Micro Channel Architecture MCA bus an Enhanced ISA EISA bus a Video Electronics Standards Association VESA local bus and a Peripheral Component Interconnects PCI bus also known as a Mezzanine bus. In one implementation CPU memory controller ROM and RAM are integrated onto a common module . In this implementation ROM is configured as a flash ROM that is connected to memory controller via a Peripheral Component Interconnect PCI bus and a ROM bus neither of which are shown . RAM is configured as multiple Double Data Rate Synchronous Dynamic RAM DDR SDRAM modules that are independently controlled by memory controller via separate buses not shown . Hard disk drive and media drive are shown connected to the memory controller via the PCI bus and an AT Attachment ATA bus . However in other implementations dedicated data bus structures of different types can also be applied in the alternative. In some embodiments ROM contains an operating system kernel that controls the basic operations of the console and that exposes a collection of Application Programming Interfaces that can be called by games and other applications to perform certain functions and to obtain certain data. A three dimensional graphics processing unit and a video encoder form a video processing pipeline for high speed and high resolution e.g. High Definition graphics processing. Data are carried from graphics processing unit to video encoder via a digital video bus not shown . An audio processing unit and an audio codec coder decoder form a corresponding audio processing pipeline for multi channel audio processing of various digital audio formats. Audio data are carried between audio processing unit and audio codec via a communication link not shown . The video and audio processing pipelines output data to an A V audio video port for transmission to a television or other display containing one or more speakers. Some audio data formed by audio processing unit and audio codec is also directed to one or more headsets through controllers . In the illustrated implementation video and audio processing components are mounted on module . In the implementation depicted in console includes a controller support subassembly for supporting up to four controllers . The controller support subassembly includes any hardware and software components needed to support wired and wireless operation with an external control device such as for example a media and game controller. 
A front panel I O subassembly supports the multiple functionalities of power button the eject button as well as any LEDs light emitting diodes or other indicators exposed on the outer surface of console . Subassemblies and are in communication with module via one or more cable assemblies . In other implementations console can include additional controller subassemblies. The illustrated implementation also shows an optical I O interface that is configured to send and receive signals that can be communicated to module .

MUs and are illustrated as being connectable to MU ports A and B respectively. Additional MUs e.g. MUs are illustrated as being connectable to controller i.e. two MUs for each controller. Each MU offers additional storage on which games game parameters and other data may be stored. In some implementations the other data can include any of a digital game component an executable gaming application an instruction set for expanding a gaming application and a media file. When inserted into console or a controller MU can be accessed by memory controller . A system power supply module provides power to the components of gaming system . A fan cools the circuitry within console .

Under some embodiments an application comprising machine instructions is stored on hard disk drive . Application provides a collection of user interfaces that are associated with console instead of with an individual game. The user interfaces allow the user to select system settings for console access media attached to console view information about games and utilize services provided by a server that is connected to console through a network connection. When console is powered on various portions of application are loaded into RAM and/or caches and for execution on CPU . Although application is shown as being stored on hard disk drive in alternative embodiments application is stored in ROM with the operating system kernel.

Gaming system may be operated as a standalone system by simply connecting the system to monitor a television a video projector or other display device. In this standalone mode gaming system enables one or more players to play games or enjoy digital media e.g. by watching movies or listening to music. However with the integration of broadband connectivity made available through network interface gaming system may further be operated as a participant in a larger network gaming community allowing among other things multi player gaming.

The console described in is just one example of a gaming machine that can be used with various embodiments described herein. Other gaming machines such as personal computers may be used instead of the gaming console of .

At step of a player triggers the cutscene. As shown in the top perspective view of a gaming environment in a player can trigger a cutscene under some embodiments by placing their avatar within a circumference of a non player character . In other embodiments the player can trigger the cutscene by placing the player's avatar within a same room as a non player character as shown in the top perspective view of a gaming environment in . Other techniques for triggering a cutscene include a player completing one or more tasks or selecting to initiate a cutscene using one or more control buttons.

After the player triggers the cutscene cutscene control of is started and retrieves a first clip of the cutscene at step of . Under one embodiment each cutscene is divided into a plurality of clips. 
Each clip includes an audio signal representing speech from a non player character as well as animation descriptors that describe how the non player character should be animated during the playing of the clip. Under one embodiment each clip is a WAV file with a header that describes the animation for the non player character.

In a plurality of cutscenes is shown including cutscene and cutscene . Each of the cutscenes includes a plurality of clips. For example cutscene includes clips and and cutscene includes clips and . In addition each cutscene includes a summary clip such as summary clip of cutscene and summary clip of cutscene . These summary clips are described further below.

As noted below dividing each cutscene into clips allows the cutscene to be broken into natural breakpoints where the cutscene can be restarted if a cutscene clip is interrupted by an action by the player's avatar. By restarting the cutscene at the beginning of the clip that was interrupted a more natural restart of the cutscene is provided, which helps to make the non player character appear more realistic.

At step of an audio signal and non player character animation are produced based on the selected cutscene clip. Under one embodiment to produce the animation cut scene control provides the animation information for the non player character to a vertex data generation unit . Vertex data generation unit uses the animation information and a graphical model of the non player character to generate a set of vertices that describe polygons. The vertices are provided to 3D graphics processing unit which uses the vertices to render polygons representing the non player character in the graphical three dimensional gaming environment. The rendered polygons are transmitted through video encoder and A V port of to be displayed on an attached display screen. The audio signal for the non player character is provided to audio processing unit which then generates an audio signal through audio codec and A V port of .

At step cutscene control examines player state data to determine if the player's avatar has performed an action. Examples of actions include attacking the non player character moving a threshold distance away from the non player character or performing other actions supported by the game. Under one embodiment these other actions include things such as belching performing a silly dance flexing an arm performing a rude hand gesture and faking an attack on the non player character. Such actions are referred to herein as expressions.

Under one embodiment a player may select an action from a list of actions listed in a menu. provides an example of a screen shot showing a possible menu of actions that the player's avatar may perform. The player causes the menu to be displayed by either selecting an icon on the display or using one or more controls on the controller. Once the menu has been displayed the player may select one of the actions from the menu using the controller. In other embodiments actions may be mapped to one or more controls on the controller so that the player does not have to access the menu.

Under some embodiments the action may include the player's avatar moving more than a threshold distance away from the non player character. For example in the player's avatar may move outside of circumference and in the player's avatar may move outside of room . In both situations such movement will be interpreted as an action by cut scene control . 
If cut scene control determines that the player's avatar has not performed an action at step it determines if the end of the current cutscene clip has been reached at step . If the end of the current cutscene clip has not been reached cutscene control continues producing the audio signal and non player character animation by returning to step . Steps and continue in a loop until an avatar action is received at step or the end of a cutscene clip is reached at step .

If the end of the cut scene clip is reached at step the process continues at step where cutscene control determines if there is another clip for the cutscene. If there is another clip for the cutscene the next clip is retrieved at step and the audio signal and non player character animation found in the clip are used to animate the non player character and produce an audio signal for the non player character.

If cut scene control determines that the player's avatar has performed an action at step it maps the action to an impression at step using an action to impression mapping in an action to response database . An impression is the way that a non player character will interpret the action. For example a non player character may interpret an action as being scary insulting impolite funny friendly aggressive inattentive or impatient each of which would be a possible impression.

At step cutscene control maps the impression to a response using impression to response mapping of action to response database . By performing two mapping functions one from an action to an impression and another from an impression to a response embodiments described herein allow cutscene responses to be designed without needing to know all possible actions that may be performed. Instead a limited number of impressions can be specified and cutscene responses can be produced for those impressions. This also allows actions to be added later without affecting the currently produced responses. Multiple actions may be mapped to a single impression in action to impression mapping and multiple impressions may be mapped to a single response in impression to response mapping .

At step cutscene control determines if a response has been identified through the impression to response mapping in step . Under some embodiments an impression may map to no response so that the non player character will ignore the action taken by the player's avatar. If no response is to be provided at step the process returns to step where the audio signal and non player character animation continue for the cutscene clip. Note that although steps and appear to occur after step in the flow diagram of during steps and the audio signal and animation of the current cutscene clip continue to be output by cutscene control . Thus there is no interruption in the cutscene while these steps are being performed.

If the mapping of step identifies a response the response is retrieved from a set of stored responses which include cut scene responses and for example. The cut scene responses include animation information for movement of the non player character and/or an audio signal containing dialog that represents the non player character's response to the action of the player's avatar. In some embodiments the cut scene responses also include scripting hooks that indicate directorial types of information such as directions to the non player character to move to a particular location movement of the camera lighting effects background music and sounds and the like. 
At step the response is examined to determine if the response is a microreaction. Such information can be stored in a header of the response or can be stored in action to response database . A microreaction is a small animation or small change in tone of the audio signal that does not interrupt the audio signal and non player character animation of the cutscene clip but instead slightly modifies it as it continues. If the response is a microreaction at step the microreaction is combined or integrated with the cut scene clip at step . This can involve changing the tone of the audio signal of the cut scene by either raising or lowering the pitch or by adding additional animation features to the cutscene animation. If an animation is added the audio signal of the cut scene continues without interruption as the microreaction animation is integrated with the cut scene animation.

For example in the cutscene clip includes an animation in which the non player character points to his left using his left arm . Normally during this animation the non player character's eyebrows would remain unchanged. However based on a microreaction response to an avatar action the right eyebrow of the non player character is raised relative to the left eyebrow to convey that the non player character has detected the action taken by the avatar and that the impression left with the non player character is that the avatar is doing something slightly insulting.

If the response found during mapping step is more than a microreaction at step cutscene control interrupts the cut scene clip and plays the cut scene response. Under one embodiment the cut scene response is played by providing the animation information to vertex data generation unit which uses the animation information and NPC graphics model to generate sets of vertices representing the movement of the non player character. Each set of vertices is provided to 3D graphics processing unit which uses the vertices to render an animated image of the non player character. The audio data associated with the response is provided to audio processing unit .

Under some embodiments a player is able to activate a summary clip of the cut scene by taking an action that conveys an impression of impatience. For example the player may select an action in which their avatar requests "just the facts" and this action will be mapped to an impatience impression. The impression to response mapping will in turn map the impatience impression to a summary response. Under one embodiment such summary clips are stored together with the other clips of the cut scene. In other embodiments the summary clips may be stored with the cut scene responses .

The summary clip contains audio data and animation information that causes the non player character to summarize the critical information that was to be conveyed by the cutscene. In general cutscenes contain both critical information and stylistic information wherein the critical information is required for the player to advance through the game and the stylistic information is provided to convey an emotional or stylistic attribute to the game. Under one embodiment the summary clip strips out most of the stylistic information to provide just the critical information. Since playing the summary clip ensures that the player has been given all of the critical information of the cut scene narrative once the summary clip has been played there is no need to continue with the cut scene. 
As such at step cut scene control determines if the response is a summary response and ends the cutscene procedure at step if the response was a summary response. If the response was not a summary response cut scene control examines player state to determine if the player is ready to continue with the cut scene clip at step . For example if the player's avatar has not returned to the non player character after moving away from the non player character cut scene control will determine that the player is not ready to continue with the cut scene clip. Under one embodiment cut scene control will set a timer if the player is not ready to continue with the cut scene. Cut scene control will then loop at step until the player is ready to continue with the cut scene or until the timer expires. If the timer expires cut scene control will unload the current cut scene such that the player will have to trigger the cut scene from the beginning again.

When the avatar is ready to continue with the cut scene clip for example by coming back to the non player character cut scene control retrieves and plays an audio stitch from a collection of audio stitches at step . Audio stitches include a collection of audio stitch files such as audio stitch files and . Each audio stitch file includes audio and animation data for the non player character that provides an audio and visual segue between the response and restarting the cut scene clip that was interrupted at step . Examples of audio stitches include "as I was saying", "if you are finished" and "now then". Such audio stitches provide a smooth transition between a response and the resumption of the cut scene clip.

At step the cut scene clip that was interrupted at step is restarted from the beginning of the cut scene clip. By restarting the cut scene clip cut scene control ensures that the critical information of the cut scene narrative is provided to the player. In most cases restarting the cut scene clip will involve reproducing the audio signal and animations that were played when the cut scene clip was initially started. The process then returns to step to continue playing of the cutscene clip and to await further avatar actions.

In other embodiments instead of playing an audio stitch file and restarting the cut scene clip that was interrupted cut scene control will select an alternate cut scene clip to play instead of the interrupted cut scene clip. After playing the alternate cut scene clip the process continues at step by selecting a next cut scene clip of the cut scene to play. In such embodiments the alternate cut scene clip and the next cut scene clip are selected to ensure that the critical information of the cut scene is still provided to the player.

The process of continues until a summary response is played, there are no more cutscene clips at step , or a timeout occurs during step . In the discussion above the detection of an avatar action was shown as only occurring at step . However in other embodiments cutscene control is event driven such that at any point in the flow diagram of cut scene control may receive an indication from player state that the avatar has taken an action. Based on that action cutscene control may map the action to an impression, map the impression to a cutscene response as shown at steps and , and produce an animation and audio signal based on the new response. Thus in the process of playing one response cutscene control may interrupt that response to play a different response based on a new avatar action. 
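The two-stage mapping described above can be pictured as two lookup tables, as in the following illustrative Python sketch. All action, impression, and response names here are hypothetical and not taken from the patent text:

```python
# First stage: map a concrete avatar action to the impression it leaves.
ACTION_TO_IMPRESSION = {
    "belch": "impolite",
    "rude_gesture": "insulting",
    "fake_attack": "aggressive",
    "walk_away": "inattentive",
    "just_the_facts": "impatient",
}

# Second stage: map each impression to a stored response (or to none).
IMPRESSION_TO_RESPONSE = {
    "impolite": "microreaction_raised_eyebrow",
    "insulting": "scolding_response",
    "aggressive": "defensive_response",
    "inattentive": None,          # no response: the action is ignored
    "impatient": "summary_clip",  # skip ahead to the critical information
}

def respond_to(action: str):
    """Resolve an avatar action to a cutscene response via an impression."""
    impression = ACTION_TO_IMPRESSION.get(action)
    if impression is None:
        return None
    return IMPRESSION_TO_RESPONSE.get(impression)
```

Because new actions only need an entry in the first table, responses can be authored against a small fixed set of impressions, which is the decoupling the patent emphasizes.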
Although the subject matter has been described in language specific to structural features and or methodological acts it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather the specific features and acts described above are disclosed as example forms of implementing the claims.
263.296296
1,256
0.818681
eng_Latn
0.999964
dbe9c5cbfb0e20490310c382f2f118de19ac542c
11,671
md
Markdown
powerapps-docs/developer/common-data-service/webapi/update-delete-entities-using-web-api.md
eltociear/powerapps-docs.de-de
16d69a085b3a02ad10e4e606d7df3dc050e63967
[ "CC-BY-4.0", "MIT" ]
null
null
null
powerapps-docs/developer/common-data-service/webapi/update-delete-entities-using-web-api.md
eltociear/powerapps-docs.de-de
16d69a085b3a02ad10e4e606d7df3dc050e63967
[ "CC-BY-4.0", "MIT" ]
null
null
null
powerapps-docs/developer/common-data-service/webapi/update-delete-entities-using-web-api.md
eltociear/powerapps-docs.de-de
16d69a085b3a02ad10e4e606d7df3dc050e63967
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Update and delete entities using the Web API (Common Data Service) | Microsoft Docs
description: Learn how to update and delete entities using the Web API
ms.custom: ''
ms.date: 10/31/2018
ms.service: powerapps
ms.suite: ''
ms.tgt_pltfrm: ''
ms.topic: article
applies_to:
- Dynamics 365 (online)
ms.assetid: 694889fd-2b85-43a0-97bc-1e760695db31
caps.latest.revision: 17
author: JimDaly
ms.author: jdaly
ms.reviewer: pehecke
manager: annbe
search.audienceType:
- developer
search.app:
- PowerApps
- D365CE
ms.openlocfilehash: 60e8c8386427247a37d81c301978d3c202d6ffdc
ms.sourcegitcommit: f4cf849070628cf7eeaed6b4d4f08c20dcd02e58
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 03/21/2020
ms.locfileid: "3154983"
---
# <a name="update-and-delete-entities-using-the-web-api"></a>Update and delete entities using the Web API

The operations to modify data are a core part of the Web API. In addition to a basic update and delete, you can perform operations on individual attributes and compose *upsert* requests that will either update or insert an entity depending on whether it exists.

> [!NOTE]
> The metadata that defines entities is updated in a different way. More information: [Create and update entity definitions using the Web API](create-update-entity-definitions-using-web-api.md).

<a name="bkmk_update"></a>

## <a name="basic-update"></a>Basic update

Update operations use the HTTP `PATCH` verb. Pass a JSON object containing the properties you want to update to the URI that represents the entity. A response with a status of 204 is returned if the update is successful.

This example updates an existing account record with the `accountid` value of 00000000-0000-0000-0000-000000000001.

> [!IMPORTANT]
> When updating an entity, include only the properties you are changing in the request body. If you simply update the properties of an entity that you previously retrieved and include that JSON in your request, each property will be updated even though the value is the same. This can cause system events that can trigger business logic which expects that the values have changed. It can also make properties appear to have been updated in auditing data when in fact they did not actually change.

> [!NOTE]
> The metadata for attributes includes a `RequiredLevel` property. If this is set to `SystemRequired`, you cannot set these attributes to a null value.
> More information: [Attribute requirement level](../entity-attribute-metadata.md#attribute-requirement-level)

**Request**

```http
PATCH [Organization URI]/api/data/v9.0/accounts(00000000-0000-0000-0000-000000000001) HTTP/1.1
Content-Type: application/json
OData-MaxVersion: 4.0
OData-Version: 4.0

{
    "name": "Updated Sample Account ",
    "creditonhold": true,
    "address1_latitude": 47.639583,
    "description": "This is the updated description of the sample account",
    "revenue": 6000000,
    "accountcategorycode": 2
}
```

**Response**

```http
HTTP/1.1 204 No Content
OData-Version: 4.0
```

> [!NOTE]
> See [Associate entities on update](associate-disassociate-entities-using-web-api.md#bkmk_Associateentitiesonupdate) for information about associating entities on update.

<a name="bkmk_updateWithDataReturned"></a>

## <a name="update-with-data-returned"></a>Update with data returned

To retrieve data from an entity you are updating, you can compose your `PATCH` request so that data from the updated record is returned with a status of 200 (OK). To get this result, you must use the `return=representation` preference in the request headers.

To control which properties are returned, append the `$select` query option to the URL to the entity set. If used, the `$expand` query option will be ignored.

This example updates an account entity and returns the requested data in the response.

**Request**

```http
PATCH [Organization URI]/api/data/v9.0/accounts(00000000-0000-0000-0000-000000000001)?$select=name,creditonhold,address1_latitude,description,revenue,accountcategorycode,createdon HTTP/1.1
OData-MaxVersion: 4.0
OData-Version: 4.0
Accept: application/json
Content-Type: application/json; charset=utf-8
Prefer: return=representation

{"name":"Updated Sample Account"}
```

**Response**

```http
HTTP/1.1 200 OK
Content-Type: application/json; odata.metadata=minimal
Preference-Applied: return=representation
OData-Version: 4.0

{
    "@odata.context": "[Organization URI]/api/data/v9.0/$metadata#accounts/$entity",
    "@odata.etag": "W/\"536537\"",
    "accountid": "00000000-0000-0000-0000-000000000001",
    "accountcategorycode": 1,
    "description": "This is the description of the sample account",
    "address1_latitude": 47.63958,
    "creditonhold": false,
    "name": "Updated Sample Account",
    "createdon": "2016-09-28T23:14:00Z",
    "revenue": 5000000.0000,
    "_transactioncurrencyid_value": "048dddaa-6f7f-e611-80d3-00155db5e0b6"
}
```
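For readers who prefer a script over raw HTTP, here is a minimal sketch of the same update using Python's `requests` library. The organization URL and the access token shown here are placeholders; acquiring the OAuth token is outside the scope of this example:

```python
import requests

# Placeholder values, not part of the Web API documentation.
ORG_URI = "https://yourorg.api.crm.dynamics.com"
ACCOUNT_ID = "00000000-0000-0000-0000-000000000001"

headers = {
    "Authorization": "Bearer <access token>",  # assumed OAuth token
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Content-Type": "application/json",
    "Prefer": "return=representation",
}

resp = requests.patch(
    f"{ORG_URI}/api/data/v9.0/accounts({ACCOUNT_ID})?$select=name,revenue",
    headers=headers,
    json={"name": "Updated Sample Account"},
)
resp.raise_for_status()
print(resp.json())  # the updated record with the selected properties
```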
<a name="bkmk_updateSingleProperty"></a>

## <a name="update-a-single-property-value"></a>Update a single property value

When you want to update only a single property value, use a PUT request with the property name appended to the URI of the entity. The following example updates the name property of an existing account entity with the `accountid` value of 00000000-0000-0000-0000-000000000001.

**Request**

```http
PUT [Organization URI]/api/data/v9.0/accounts(00000000-0000-0000-0000-000000000001)/name HTTP/1.1
Content-Type: application/json
OData-MaxVersion: 4.0
OData-Version: 4.0

{"value": "Updated Sample Account Name"}
```

**Response**

```http
HTTP/1.1 204 No Content
OData-Version: 4.0
```

<a name="bkmk_deleteSingleProperty"></a>

## <a name="delete-a-single-property-value"></a>Delete a single property value

To delete the value of a single property, use a DELETE request with the property name appended to the URI of the entity. The following example deletes the value of the `description` property of an account entity with the `accountid` value of 00000000-0000-0000-0000-000000000001.

**Request**

```http
DELETE [Organization URI]/api/data/v9.0/accounts(00000000-0000-0000-0000-000000000001)/description HTTP/1.1
Content-Type: application/json
OData-MaxVersion: 4.0
OData-Version: 4.0
```

**Response**

```http
HTTP/1.1 204 No Content
OData-Version: 4.0
```

> [!NOTE]
> This cannot be used with a single-valued navigation property to disassociate two entities. For an alternative method, see [Remove a reference to an entity](associate-disassociate-entities-using-web-api.md#bkmk_Removeareferencetoanentity).

<a name="bkmk_upsert"></a>

## <a name="upsert-an-entity"></a>Upsert an entity

An *upsert* operation is exactly like an update. It uses a `PATCH` request and a URI to reference a specific entity. The difference is that if the entity does not exist, it will be created. If it already exists, it will be updated. Normally, when creating a new entity, you will let the system assign a unique identifier. This is a best practice. However, if you need to create a record with a specific `id` value, `upsert` provides an operation that makes this possible. This can be valuable in a situation where you are synchronizing data in different systems. However, there are sometimes situations where you want to perform an `upsert` but want to prevent one of the potential default actions: either create or update. You can accomplish this by adding `If-Match` or `If-None-Match` headers. For more information, see [Limit upsert operations](perform-conditional-operations-using-web-api.md#bkmk_limitUpsertOperations).

<a name="bkmk_delete"></a>

## <a name="basic-delete"></a>Basic delete

A delete operation is very straightforward. Use the DELETE verb with the URI of the entity you want to delete. This example message deletes an account entity with the `accountid` primary key value equal to 00000000-0000-0000-0000-000000000001.

**Request**

```http
DELETE [Organization URI]/api/data/v9.0/accounts(00000000-0000-0000-0000-000000000001) HTTP/1.1
Content-Type: application/json
OData-MaxVersion: 4.0
OData-Version: 4.0
```

**Response**

If the entity exists, you will get a normal response with status 204 to indicate the delete was successful. If the entity isn't found, you will get a response with status 404.

```http
HTTP/1.1 204 No Content
OData-Version: 4.0
```
<a name="bkmk_delete"></a>

## <a name="basic-delete"></a>Basic delete

A delete operation is very direct: use the `DELETE` verb with the URI of the entity you want to delete. This example message deletes an account entity whose `accountid` primary key value equals 00000000-0000-0000-0000-000000000001.

**Request**

```http
DELETE [Organization URI]/api/data/v9.0/accounts(00000000-0000-0000-0000-000000000001) HTTP/1.1
Content-Type: application/json
OData-MaxVersion: 4.0
OData-Version: 4.0
```

**Response**

If the entity exists, you receive a normal response with status 204 to indicate that the deletion was successful. If the entity is not found, you receive a response with status 404.

```http
HTTP/1.1 204 No Content
OData-Version: 4.0
```

<a name="bkmk_duplicate"></a>

## <a name="check-for-duplicate-records"></a>Check for duplicate records

<!-- TODO:
By default, duplicate detection is suppressed when you are updating records using the Web API.
You must include the `MSCRM.SuppressDuplicateDetection: false` header with your PATCH request to enable duplicate detection. Duplicate detection only applies when the organization has enabled duplicate detection, the entity is enabled for duplicate detection, and there are active duplicate detection rules being applied.
For more information, see [Detect duplicate data for developers](../detect-duplicate-data-for-developers.md).
-->

See [Detect duplicates during an update operation using the Web API](manage-duplicate-detection-create-update.md#bkmk_update) for more information about how to check for duplicate records during an update operation.

### <a name="see-also"></a>See also

[Web API basic operations sample (C#)](samples/basic-operations-csharp.md)<br />
[Web API basic operations sample (client-side JavaScript)](samples/basic-operations-client-side-javascript.md)<br />
[Perform operations using the Web API](perform-operations-web-api.md)<br />
[Compose HTTP requests and handle errors](compose-http-requests-handle-errors.md)<br />
[Query data using the Web API](query-data-web-api.md)<br />
[Create an entity using the Web API](create-entity-web-api.md)<br />
[Retrieve an entity using the Web API](retrieve-entity-using-web-api.md)<br />
[Associate and disassociate entities using the Web API](associate-disassociate-entities-using-web-api.md)<br />
[Use Web API functions](use-web-api-functions.md)<br />
[Use Web API actions](use-web-api-actions.md)<br />
[Execute batch operations using the Web API](execute-batch-operations-using-web-api.md)<br />
[Impersonate another user using the Web API](impersonate-another-user-web-api.md)<br />
[Perform conditional operations using the Web API](perform-conditional-operations-using-web-api.md)
49.037815
692
0.768829
deu_Latn
0.95404
dbe9c9e67d4d1ec0e2666a6a4a0f279d49197296
576
md
Markdown
src/495-writing-n-as-the-product-of-k-distinct-positive-integers/problem.md
xfbs/ProjectEulerRust
e26768c56ff87b029cb2a02f56dc5cd32e1f7c87
[ "MIT" ]
1
2019-08-17T22:21:11.000Z
2019-08-17T22:21:11.000Z
src/495-writing-n-as-the-product-of-k-distinct-positive-integers/problem.md
xfbs/ProjectEulerRust
e26768c56ff87b029cb2a02f56dc5cd32e1f7c87
[ "MIT" ]
3
2017-12-09T14:49:30.000Z
2017-12-09T14:59:39.000Z
src/495-writing-n-as-the-product-of-k-distinct-positive-integers/problem.md
xfbs/ProjectEulerRust
e26768c56ff87b029cb2a02f56dc5cd32e1f7c87
[ "MIT" ]
null
null
null
# Problem 495: Writing n as the product of k distinct positive integers

Let W(n,k) be the number of ways in which n can be written as the product of k distinct positive integers.

For example, W(144,4) = 7. There are 7 ways in which 144 can be written as a product of 4 distinct positive integers:

144 = 1×2×4×18
144 = 1×2×8×9
144 = 1×2×3×24
144 = 1×2×6×12
144 = 1×3×4×12
144 = 1×3×6×8
144 = 2×3×4×6

Note that permutations of the integers themselves are not considered distinct.

Furthermore, W(100!,10) modulo 1 000 000 007 = 287549200.

Find W(10000!,30) modulo 1 000 000 007.
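As a sanity check on the definition, here is a naive enumeration sketch in Python. It is illustrative only and hopeless for 10000!, where modular counting techniques are needed; it counts factorizations with strictly increasing factors so each unordered product is counted exactly once:

```python
def W(n: int, k: int, min_factor: int = 1) -> int:
    """Count ways to write n as a product of k distinct factors >= min_factor."""
    if k == 1:
        # The last factor is the remaining quotient; it must exceed the
        # previous factor, which is enforced via min_factor.
        return 1 if n >= min_factor else 0
    total = 0
    f = min_factor
    # The smallest of k strictly increasing factors satisfies f**k < n.
    while f ** k < n:
        if n % f == 0:
            total += W(n // f, k - 1, f + 1)
        f += 1
    return total

assert W(144, 4) == 7  # matches the worked example above
print(W(144, 4))
```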
57.6
71
0.723958
eng_Latn
0.994331
dbe9d90b302378e86a74a662e3a4cf68e0c9b76e
1,546
md
Markdown
_posts/2016/2016-04-02-little-room-now-in-development.md
BlogHolder/BlogHolder.github.io
18fafe5e1fc910a314ba0b19550d5f9ae2aad45b
[ "MIT" ]
null
null
null
_posts/2016/2016-04-02-little-room-now-in-development.md
BlogHolder/BlogHolder.github.io
18fafe5e1fc910a314ba0b19550d5f9ae2aad45b
[ "MIT" ]
null
null
null
_posts/2016/2016-04-02-little-room-now-in-development.md
BlogHolder/BlogHolder.github.io
18fafe5e1fc910a314ba0b19550d5f9ae2aad45b
[ "MIT" ]
null
null
null
---
id: 121
title: Little Room Now In Development~
date: 2016-04-02T20:37:26+00:00
author: Arzola
layout: post
permalink: /little-room-now-in-development/
image: /wp-content/uploads/2016/04/Little-Room-Concept.jpg
categories:
  - "Arzola's Games"
  - Little Room
tags:
  - "Arzola's"
  - Development Progress
  - Drawing
  - Game
  - Little Room
  - Unity3D
---

<a href="/images/posts/2016/04/Little-Room-Concept.jpg" target="_blank" rel="attachment noopener wp-att-122"><img class="aligncenter wp-image-122 size-large" src="/images/posts/2016/04/Little-Room-Concept.jpg" alt="Little Room Concept" /></a>

Little Room will be a short, free-to-play, light-hearted atmospheric game about never-ending cycles and decisions. It should be available to the public soon enough, given the short length it is intended to have.

This little project is also intended to test out some scripts, as well as to give some use to a couple of assets made a while ago that have been collecting dust. However, do note there is no intention to rush the project; it is a game that will receive the time it demands.

The screenshot might not show a lot right now; maybe it even feels a little empty. But so far so good, if I do say so myself. Can't wait to see how it turns out with all the final assets put together and moving.

That is all for now, but as always… thank you for reading my blog :3
42.944444
273
0.748383
eng_Latn
0.993508
dbea842af6f16cb6f6bf892a25ab449b2e9df5cc
896
md
Markdown
aspnetcore/includes/disableVer.md
oneheed/AspNetCore.Docs.zh-tw
48ff8626bc453f041ded7196e8facb90a8da48d9
[ "CC-BY-4.0", "MIT" ]
null
null
null
aspnetcore/includes/disableVer.md
oneheed/AspNetCore.Docs.zh-tw
48ff8626bc453f041ded7196e8facb90a8da48d9
[ "CC-BY-4.0", "MIT" ]
null
null
null
aspnetcore/includes/disableVer.md
oneheed/AspNetCore.Docs.zh-tw
48ff8626bc453f041ded7196e8facb90a8da48d9
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
no-loc:
- appsettings.json
- ASP.NET Core Identity
- cookie
- Cookie
- Blazor
- Blazor Server
- Blazor WebAssembly
- Identity
- Let's Encrypt
- Razor
- SignalR
ms.openlocfilehash: 9c977e5407f9a3dc562ef0fb1127fefaa0dc5fc2
ms.sourcegitcommit: a49c47d5a573379effee5c6b6e36f5c302aa756b
ms.translationtype: MT
ms.contentlocale: zh-TW
ms.lasthandoff: 02/16/2021
ms.locfileid: "100552925"
---
<a name="ddav"></a>

### <a name="disable-default-account-verification"></a>Disable default account verification

With the default templates, the user is redirected to `Account.RegisterConfirmation`, where they can select a link to have their account confirmed. The default `Account.RegisterConfirmation` is used ***only*** for testing; automatic account verification should be disabled in a production app.

To require a confirmed account and prevent immediate login at registration, set `DisplayConfirmAccountLink = false` in */Areas/Identity/Pages/Account/RegisterConfirmation.cshtml.cs*:

[!code-csharp[](~/security/authentication/identity/sample/WebApp3/Areas/Identity/Pages/Account/RegisterConfirmation.cshtml.cs?name=snippet&highlight=34)]
32
153
0.796875
yue_Hant
0.844783
dbebfe4289d00582219d77c9e8c7d992dd78c6ff
1,665
md
Markdown
README.md
dirgim/callfinder
1f2b35d72d9fa98ab48c78053bb9f34bab7b1047
[ "Apache-2.0" ]
null
null
null
README.md
dirgim/callfinder
1f2b35d72d9fa98ab48c78053bb9f34bab7b1047
[ "Apache-2.0" ]
null
null
null
README.md
dirgim/callfinder
1f2b35d72d9fa98ab48c78053bb9f34bab7b1047
[ "Apache-2.0" ]
null
null
null
# callfinder

Bash script for finding Asterisk CDR calls and viewing the corresponding full logs for a selected call.

### Prerequisites

A running Asterisk instance with the full log turned on, using the ISO datetime format. Ensure that these lines are present and uncommented in your /etc/asterisk/logger.conf:

```
dateformat = %F %T.%3q ; ISO 8601 date format with milliseconds
full => notice,warning,error,verbose,dtmf,fax
```

### Installing

Put the `callfinder` bash script in your preferred location, as long as it is included in the `$PATH` variable.

### Usage examples

The default invocation lists all available calls in /var/log/asterisk/cdr-csv/Master.csv and searches /var/log/asterisk/full for full logs. All calls matching the supplied parameters are displayed with basic CDR data; if a full log for a call can be displayed, that call is highlighted and numbered. Typing in the number of the desired call displays all lines from the full log that correspond to that call.

Finding all calls made today from a number partially matching the supplied value:

```
callfinder 555555 today
```

Finding all calls made this week to SIP device 300:

```
callfinder "SIP/300" this_week
```

Finding all calls made on a specific date from an alternative CDR file with a number partially matching the supplied value:

```
callfinder -d 2017-05-02 -c /var/log/asterisk/cdr-csv/Custom.csv 555555
```

Finding all calls made after a specific date:

```
callfinder -a 2017-05-02
```

## Authors

* **Krunoslav Pavic** - *Initial work* - [dirgim](https://github.com/dirgim)

## License

This project is licensed under the Apache License 2.0 - see the [LICENSE](LICENSE) file for details
29.210526
150
0.762763
eng_Latn
0.996198
dbece7ea654e13a0e74ba71aebb961c2a4a0d884
2,227
md
Markdown
README.md
vinhbui107/ecommerce-center
89e4ccd3bc419f9b253e28f499fd710b993ecff1
[ "MIT" ]
4
2020-04-21T16:22:54.000Z
2020-05-20T03:58:53.000Z
README.md
vinhbui107/ecommerce-center
89e4ccd3bc419f9b253e28f499fd710b993ecff1
[ "MIT" ]
null
null
null
README.md
vinhbui107/ecommerce-center
89e4ccd3bc419f9b253e28f499fd710b993ecff1
[ "MIT" ]
2
2020-05-17T03:25:23.000Z
2020-05-20T03:55:27.000Z
# Movie Center Website

My assignment for Information Security 🚀. A video demo of this project is available: [watch it](https://www.youtube.com/watch?v=tolvKYo0VjY).

## Features

* Users can log in, log out, change their password, and reset it via email.
* Update user profiles.
* Search for movies, add them to the cart, and check out.
* Security hardening for the website.

## Third-Party Packages

* [x] [django-allauth](https://github.com/pennersr/django-allauth) for social authentication
* [x] [Bootstrap](https://github.com/twbs/bootstrap) for styling
* [x] [Jquery](https://github.com/jquery/jquery)
* [x] [Font-awesome](https://github.com/FortAwesome/Font-Awesome) for icon fonts
* [x] [django-debug-toolbar](https://github.com/jazzband/django-debug-toolbar) for debugging
* [x] [django-crispy-forms](https://github.com/django-crispy-forms/django-crispy-forms) for DRY forms
* [x] [sorl-thumbnail](https://github.com/jazzband/sorl-thumbnail) for thumbnail images
* [x] [python-decouple](https://github.com/henriquebastos/python-decouple/) for storing application settings
* [x] [dj-database-url](https://github.com/jacobian/dj-database-url) for database configuration via a URL

## Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.

### Prerequisites

Make sure Python 3.8.x, Pip and Virtualenv are already installed. [See here for help](https://programwithus.com/learn-to-code/Pip-and-virtualenv-on-Windows/)

### Installing

Clone and move to the project:

```
$ git clone https://github.com/vinhbui107/MovieCenter.git
$ cd MovieCenter
```

Create a virtual environment:

```
$ virtualenv env
```

Run the virtual environment:

```
$ .\env\Scripts\activate
```

Install the Python packages for the project:

```
(env) $ pip install -r requirements.txt
```

Run the project:

```
(env) $ python manage.py runserver
```

Load the site at http://127.0.0.1:8000.

## Tech Specs

* Python 3.8
* Django 3.0.5
* dj-database-url 0.5.0
* django-allauth 0.41.0
* django-crispy-forms 1.9.0
* Bootstrap 4.3.1
* jQuery 3.4.1
* Font Awesome 5.11.2
* python-decouple 3.3
* Pillow 7.0.0
* sorl-thumbnail 12.6.3
* django-debug-toolbar 2.2
25.895349
200
0.73507
eng_Latn
0.488656
dbed014bac03d80034d33e36d7d6a9811de8d74d
1,466
md
Markdown
src/next/templates/templating_layer_a_template_engine.md
lsimone/usermanual
ff4d7b81c90b2a0e2ffe2d94d3f5e657185ce406
[ "Apache-2.0" ]
null
null
null
src/next/templates/templating_layer_a_template_engine.md
lsimone/usermanual
ff4d7b81c90b2a0e2ffe2d94d3f5e657185ce406
[ "Apache-2.0" ]
null
null
null
src/next/templates/templating_layer_a_template_engine.md
lsimone/usermanual
ff4d7b81c90b2a0e2ffe2d94d3f5e657185ce406
[ "Apache-2.0" ]
null
null
null
Title: Templating Layer A Template Engine

As you've seen in the first part of this documentation, the Aria Templates Core layer is a powerful library that provides comprehensive tools to help design complex Javascript applications very efficiently.

Now, when creating web applications, developers must not only deal with pure Javascript; they also need to consider the UI aspect of the code. For a long time this has been the server's responsibility: creating and delivering the proper markup to the browser was a job done using JSP, ASP, PHP or any other server-side templating technology.

Aria Templates moves this logic to the client side: templates are simply delivered by the server as static resources that are processed in the browser by the templating engine. This offers several advantages over server-side processing:

* bandwidth efficiency: static resources benefit from caching and, since markup is managed on the browser, only data (JSON) needs to be exchanged with the server.
* clear separation between data, logic and representation thanks to AT's MVC approach. This is a must as far as readability and maintenance are concerned.
* performance: templates are compiled by the framework into optimized Javascript files. Also, complex processing may not scale very well server-side, while browser engines become more and more efficient over the years.

This section explains what templates are in detail and helps you learn how to write and use them.
122.166667
335
0.811733
eng_Latn
0.999864
dbedd1b75a6cda07779ac475ecaed465c86a1975
2,386
md
Markdown
README.md
bgoudey/genomehash
136a68e3ff75043f3c6bbfe9617d1c994df70ab6
[ "MIT" ]
null
null
null
README.md
bgoudey/genomehash
136a68e3ff75043f3c6bbfe9617d1c994df70ab6
[ "MIT" ]
null
null
null
README.md
bgoudey/genomehash
136a68e3ff75043f3c6bbfe9617d1c994df70ab6
[ "MIT" ]
null
null
null
# GenomeHash - A minimal-database MLST implementation

* Benjamin Goudey
* Hannah Huckstep
* Kelly Wyres
* Thomas Conway

## About:

GenomeHash is a simple method for using allelic genotyping schemes (MLST, MLVA, phage typing etc.) without the need for a reference database of alleles. This is achieved by replacing traditional numeric labels with hashes and by determining alleles for a given locus using only a minimal allele database.

## Contact:

Benjamin Goudey: [email protected]

## Requirements:

* NCBI blast+ 2.2.28-2 (available as ncbi-blast+ package in Ubuntu)
* R plus the following packages
  * Biostrings
  * digest
  * mclust
  * docopt
  * stringdist
  * pbapply
  * tools
  * pacman
  * stringr

These packages will be installed by running the prereq command.

## Usage:

* gh.R prereq [--install]
* gh.R ref --first|--medoid <allele_db>
* gh.R extract (--refdb <ref_db>) (--output <alleles>) <contigs>
* gh.R hash --nchar <n> [--st] [--header] <alleles>
* gh.R -h | --help

(The file arguments in angle brackets match the names used in the command descriptions and examples below.)

**prereq:** Install the R packages required to run this script. NB: does not install BLAST.

**ref:** Extract a set of 'reference alleles' from a given allele database. Input 'allele_db' and output 'ref_db' are FASTA files.

**extract:** Extract alleles by aligning the given seeds to a set of assemblies using BLAST. Input 'contigs' and 'ref_db' and output 'alleles' are FASTA files.

**hash:** Hash a given FASTA file of alleles.

**Options:**

* --help -h: Show these instructions.
* --install: List packages to be installed
* --output=<file>: Output filename
* --first: Use first allele as seed.
* --medoid: Use medoid allele as seed.
* --refdb=<ref_db>: Specify the database of reference alleles to use.
* --nchar=<n>: Number of characters in hash [default: 5].
* --header: Write out a header containing column (i.e. allele) names
* --st: Add a sequence type hash

## Examples:

Find the reference alleles for two loci for which we observed alleles, using the 'first' method:

<code>gh.R ref --first --output ./seeds_alleles.fa ./NEIS0001.txt ./NEIS0004.txt</code>

Extract alleles from assembled contigs using the derived reference alleles:

<code>gh.R extract --refdb ./seeds_alleles.fa --output ./10_6748.extract.fa ./10_6748.fas</code>

Create five-character hashes for the extracted alleles and write them to a file:

<code>gh.R hash --nchar 5 --output ./10_6748_hashes.txt ./10_6748.extract.fa --header</code>

## License:

MIT License, see license.txt for details
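To make the hashing idea from the About section concrete, here is a small Python sketch. It is purely illustrative: the real tool is implemented in R (using the digest package), and its exact normalization and hash choice may differ.

```python
import hashlib

def allele_label(sequence: str, nchar: int = 5) -> str:
    """Label an allele by a truncated digest of its normalized sequence.

    Identical sequences always map to the same short label, so alleles can
    be named consistently without a central allele database.
    """
    normalized = sequence.strip().upper()
    return hashlib.sha1(normalized.encode("ascii")).hexdigest()[:nchar]

# Two observations of the same allele agree on the label, regardless of case;
# a different allele almost certainly gets a different 5-character label.
print(allele_label("atgaccgattaca"))
print(allele_label("ATGACCGATTACA"))  # same label as above
```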
26.808989
159
0.724644
eng_Latn
0.956482
dbede722bf8ef5dc036bbd3e8b51933b07aadb9c
629
md
Markdown
N&T/a-little-collection-of-practice-problems.md
fengshenyuan/move-to-the-mars
a0b2b8f1e04fe2f37e9fe34a3848a223157dcdb0
[ "MIT" ]
null
null
null
N&T/a-little-collection-of-practice-problems.md
fengshenyuan/move-to-the-mars
a0b2b8f1e04fe2f37e9fe34a3848a223157dcdb0
[ "MIT" ]
null
null
null
N&T/a-little-collection-of-practice-problems.md
fengshenyuan/move-to-the-mars
a0b2b8f1e04fe2f37e9fe34a3848a223157dcdb0
[ "MIT" ]
null
null
null
# <center>A little Collection of Practice Problems</center>

<center>Author G.Yuan 2018/07/05</center>

## Software Engineering

* How about building a library named minunit? Or, what's the best practice for writing unit tests in a large project?

> Why: Large projects always come with a large number of unit test cases and complex functional test cases. It's hard to keep them consistent with the codebase: a small change in the code may cause a lot of changes in the unit test cases. We need a way to minimize the unit-test code, but how? One possible direction is sketched below.

![image](https://user-images.githubusercontent.com/20035835/157248738-43730153-10bc-4551-b357-fd300ea2bfd1.png)
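One possible direction, offered as an illustrative sketch rather than a definitive answer: table-driven (parametrized) tests keep the per-case code near zero, so a behavior change means editing a data table instead of many test bodies. Python's pytest is used here only as an example, and `slugify` is a made-up function standing in for real application code:

```python
import pytest

def slugify(title: str) -> str:
    """Toy function under test: trim, lowercase, and hyphenate spaces."""
    return title.strip().lower().replace(" ", "-")

# One table of (input, expected) pairs replaces many copy-pasted tests;
# adding or changing a case is a one-line edit.
CASES = [
    ("Hello World", "hello-world"),
    ("  Padded Title  ", "padded-title"),
    ("already-slug", "already-slug"),
]

@pytest.mark.parametrize("raw,expected", CASES)
def test_slugify(raw: str, expected: str) -> None:
    assert slugify(raw) == expected
```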
62.9
281
0.777424
eng_Latn
0.990878
dbedfa942d0034cd01a698b90eb652e3d7c39713
6,874
md
Markdown
CHANGELOG.md
khionu/editor-core
ca295818f7d648ad2e1f764ce5957ca3a20ce648
[ "MIT" ]
29
2019-01-02T08:55:39.000Z
2021-04-07T12:17:58.000Z
CHANGELOG.md
khionu/editor-core
ca295818f7d648ad2e1f764ce5957ca3a20ce648
[ "MIT" ]
31
2018-09-07T22:14:17.000Z
2018-12-10T18:17:23.000Z
CHANGELOG.md
amethyst/amethyst-editor-sync
6a76f7f29aa07305769483f284783743a18fbff6
[ "MIT" ]
6
2019-01-18T16:33:11.000Z
2019-09-28T17:50:11.000Z
# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

## [0.4.0] - 2018-12-28

### Added

* Create and destroy entities at runtime. Thanks to [mh84]! ([#40])
* `sync_components`, `read_components`, `sync_resources`, and `read_resources` macros have been added for registering many components/resources at once. ([#43])
* `sync_default_types` now registers `FlyControlTag`, `HideCursor`, `WindowFocus`, `UiText`, `UiButton`, `UiTransform`, `CameraOrtho`, `DestroyAtTime`, `DestroyInTime`, `MouseReactive`, and `Named`. ([#47])

### Removed

* The `type_set` macro and the related `SyncEditorBundle` methods have been removed. ([#43])
* `EditorSyncBundle::get_connection` has been made private. ([#43])

### Fixed

* Read-only components no longer require `Deserialize`. ([#38])

### Breaking Changes

### Upgraded to Amethyst 0.10 ([#46])

Updated to depend on version 0.10 of amethyst. If your project uses a previous version of Amethyst, you'll need to upgrade to amethyst 0.10 in order to use the latest version of amethyst-editor-sync.

### Changed API for Registering Components/Resources ([#43])

The `type_set` macro has been removed, and `SyncEditorBundle` no longer directly exposes a builder pattern. `sync_component` and the other registration methods take the same parameters, but they no longer return `&mut Self` and there are no variants that take a type set as a parameter.

Instead, we now provide some helper macros for easily registering many types at once. We recommend combining these with the [tap] crate to get a builder-like way of chaining method calls to create the bundle.

For example, this setup logic:

```rust
let components = type_set![Ball, Paddle];
let resources = type_set![ScoreBoard];
let editor_sync_bundle = SyncEditorBundle::new()
    .sync_default_types()
    .sync_components(&components)
    .sync_resources(&resources);
```

Would now be written like this:

```rust
let editor_sync_bundle = SyncEditorBundle::new()
    .tap(SyncEditorBundle::sync_default_types)
    .tap(|bundle| sync_components!(bundle, Ball, Paddle))
    .tap(|bundle| sync_resources!(bundle, ScoreBoard));
```

If you prefer to not use [tap] (or method chaining in general), you may also make your `SyncEditorBundle` mutable and modify it directly:

```rust
let mut bundle = SyncEditorBundle::new();
bundle.sync_default_types();
sync_components!(bundle, Ball, Paddle);
sync_resources!(bundle, ScoreBoard);
```

### Setting Up Log Output ([#43])

`EditorSyncBundle::get_connection` has been made private. Instead of calling `get_connection` and passing the output to `EditorLogger::new`, you can pass a reference to the bundle directly:

```rust
EditorLogger::new(&editor_sync_bundle).start();
```

[#38]: https://github.com/amethyst/amethyst-editor-sync/issues/38
[#40]: https://github.com/amethyst/amethyst-editor-sync/pull/40
[#43]: https://github.com/amethyst/amethyst-editor-sync/pull/43
[#46]: https://github.com/amethyst/amethyst-editor-sync/pull/46
[#47]: https://github.com/amethyst/amethyst-editor-sync/pull/47
[tap]: https://crates.io/crates/tap
[mh84]: https://github.com/mh84

## [0.3.0] - 2018-10-26

### Added

* `sync_components` and `sync_resources` methods in `SyncEditorBundle` to synchronize all types in a `TypeSet`. `TypeSets` can be created through the `type_set!` macro to reduce the verbosity of synchronizing many types. Thanks to [mvesterli] for putting this together! ([#19])
* `sync_default_types` method in `SyncEditorBundle` to easily synchronize some commonly used engine types. ([#20])
* :tada: Support for editing `Resource` values! :tada: ([#25])
* `read_resource` and `read_resources` methods in `SyncEditorBundle` to register resources that don't implement `DeserializeOwned`. ([#33])
* `read_component` and `read_components` methods in `SyncEditorBundle` to register components that don't implement `DeserializeOwned` ([#37])

### Breaking Changes

* `SyncEditorBundle` type format. If the type is explicitly given, it will need to be updated. ([#21])
* Resources registered via `SyncEditorBundle::sync_resource` must now be `DeserializeOwned` (as well as `Serialize`). This enables support for applying changes made in the editor. If you have a `Resource` that only implements `Serialize`, register it with `SyncEditorBundle::read_resource` instead. ([#25])
* Components registered via `SyncEditorBundle::sync_component` must now be `DeserializeOwned` (as well as `Serialize`). ([#37])
* `SyncResourceSystem` has been removed. If your code was directly registering the sync systems with your dispatcher, please update to using `SyncEditorBundle` instead. ([#25])

[#19]: https://github.com/amethyst/amethyst-editor-sync/issues/19
[#20]: https://github.com/amethyst/amethyst-editor-sync/issues/20
[#21]: https://github.com/amethyst/amethyst-editor-sync/pull/21
[#25]: https://github.com/amethyst/amethyst-editor-sync/pull/25
[#33]: https://github.com/amethyst/amethyst-editor-sync/pull/33
[#37]: https://github.com/amethyst/amethyst-editor-sync/pull/37
[mvesterli]: https://github.com/mvesterli

## [0.2.0] - 2018-10-14

### Fixed

* Panic if resource is missing. ([#14])
* Panic on Linux if no editor is running. ([#15])
* Panic if sync messages are too large. ([#17])

### Added

* `SyncEditorBundle::with_interval` to configure how frequently the full state is sent. ([#23])

### Breaking Changes

The state messages sent may now omit some or all of the data fields. Editors should be updated to handle this case by not attempting to update their corresponding local data.

[#14]: https://github.com/amethyst/amethyst-editor-sync/pull/14
[#15]: https://github.com/amethyst/amethyst-editor-sync/issues/15
[#17]: https://github.com/amethyst/amethyst-editor-sync/pull/17
[#23]: https://github.com/amethyst/amethyst-editor-sync/pull/23

## [0.1.0] - 2018-10-04

### Added

* `SyncEditorBundle` with methods `sync_component` and `sync_resource` for setting up editor syncing. ([#8])
* `SerializableEntity` as a temporary solution for allowing components that contain `Entity` values to be serialized.
* Send log output with `EditorLogger`. ([#11])

[#8]: https://github.com/amethyst/amethyst-editor-sync/pull/8
[#11]: https://github.com/amethyst/amethyst-editor-sync/pull/11

[Unreleased]: https://github.com/amethyst/amethyst-editor-sync/compare/v0.4.0...HEAD
[0.4.0]: https://github.com/amethyst/amethyst-editor-sync/compare/v0.3.0...v0.4.0
[0.3.0]: https://github.com/amethyst/amethyst-editor-sync/compare/v0.2.0...v0.3.0
[0.2.0]: https://github.com/amethyst/amethyst-editor-sync/compare/v0.1.0...v0.2.0
[0.1.0]: https://github.com/amethyst/amethyst-editor-sync/compare/a1a7101...v0.1.0
40.916667
206
0.741199
eng_Latn
0.842308
dbeead82787c1e41e7304bf246ac4c779432b669
29,349
md
Markdown
docs/CHANGELOG.md
ptrck/django-clone
5c868b65ac6a3e3367595f8aa54abc42ef0d0144
[ "MIT" ]
null
null
null
docs/CHANGELOG.md
ptrck/django-clone
5c868b65ac6a3e3367595f8aa54abc42ef0d0144
[ "MIT" ]
null
null
null
docs/CHANGELOG.md
ptrck/django-clone
5c868b65ac6a3e3367595f8aa54abc42ef0d0144
[ "MIT" ]
null
null
null
# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/) and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [v2.5.4](https://github.com/tj-django/django-clone/releases/tag/v2.5.4) - 2021-04-24

<small>[Compare with v2.5.3](https://github.com/tj-django/django-clone/compare/v2.5.3...v2.5.4)</small>

## [v2.5.3](https://github.com/tj-django/django-clone/releases/tag/v2.5.3) - 2021-04-25

<small>[Compare with v2.5.2](https://github.com/tj-django/django-clone/compare/v2.5.2...v2.5.3)</small>

### Fixed

- Fixed bug using sub_clone=true (#302) ([570616c](https://github.com/tj-django/django-clone/commit/570616c8bc89169266d067f2b771196663c98e05) by Tonye Jack).

## [v2.5.2](https://github.com/tj-django/django-clone/releases/tag/v2.5.2) - 2021-04-23

<small>[Compare with v2.5.1](https://github.com/tj-django/django-clone/compare/v2.5.1...v2.5.2)</small>

### Added

- Added pre-commit-config.yaml ([3f0d729](https://github.com/tj-django/django-clone/commit/3f0d72912377ad02afe851e2dfa362147e7d8e22) by Tonye Jack).

## [v2.5.1](https://github.com/tj-django/django-clone/releases/tag/v2.5.1) - 2021-04-22

<small>[Compare with v2.5.0](https://github.com/tj-django/django-clone/compare/v2.5.0...v2.5.1)</small>

### Added

- Added .github/workflows/greetings.yml ([62a8973](https://github.com/tj-django/django-clone/commit/62a8973669916a56a5ee85934643a3a756adb730) by Tonye Jack).
- Added .editorconfig ([e8dc042](https://github.com/tj-django/django-clone/commit/e8dc042718de23edcff9b04aaa7737c740a63db3) by Tonye Jack).

### Fixed

- Fixed regression calling instance.full_clean (#296) ([fa5b16b](https://github.com/tj-django/django-clone/commit/fa5b16bf7b414e9bb355c76e95b103956355e6f0) by Tonye Jack).

## [v2.5.0](https://github.com/tj-django/django-clone/releases/tag/v2.5.0) - 2021-04-21

<small>[Compare with v2.4.0](https://github.com/tj-django/django-clone/compare/v2.4.0...v2.5.0)</small>

### Added

- Added black pre-commit hook (#284) ([7d4a36b](https://github.com/tj-django/django-clone/commit/7d4a36be428b8431644b4f873ca51fe2011c5ddf) by Tonye Jack).

### Fixed

- Fix bugs for related_name and m2o clone (#281) ([53eb444](https://github.com/tj-django/django-clone/commit/53eb444e7e939f6c123c2b00ff0761c740955116) by Yuekui).

## [v2.4.0](https://github.com/tj-django/django-clone/releases/tag/v2.4.0) - 2021-04-16

<small>[Compare with v2.3.3](https://github.com/tj-django/django-clone/compare/v2.3.3...v2.4.0)</small>

### Added

- Add a delay to resolve inconsistency in windows (#266) ([59ab32d](https://github.com/tj-django/django-clone/commit/59ab32d318c1478da14965fe6dec8986dbe5ba55) by Tonye Jack).

## [v2.3.3](https://github.com/tj-django/django-clone/releases/tag/v2.3.3) - 2021-04-09

<small>[Compare with v2.3.2](https://github.com/tj-django/django-clone/compare/v2.3.2...v2.3.3)</small>

## [v2.3.2](https://github.com/tj-django/django-clone/releases/tag/v2.3.2) - 2021-04-08

<small>[Compare with v2.3.1](https://github.com/tj-django/django-clone/compare/v2.3.1...v2.3.2)</small>

### Fixed

- Fixed test. ([0945265](https://github.com/tj-django/django-clone/commit/0945265bf3860dbb7c9a151066b3fe1a58f52d6d) by Tonye Jack).
- Fixed formatting. ([9ae328b](https://github.com/tj-django/django-clone/commit/9ae328be5b7c3a6f6f2a5bf2827aab96aee6178d) by Tonye Jack).

## [v2.3.1](https://github.com/tj-django/django-clone/releases/tag/v2.3.1) - 2021-04-08

<small>[Compare with v2.3.0](https://github.com/tj-django/django-clone/compare/v2.3.0...v2.3.1)</small>

### Added

- Added .github/workflows/release.yml ([d739de6](https://github.com/tj-django/django-clone/commit/d739de6ca214e95d1cf98e9ba89103a90501c49b) by Tonye Jack).

## [v2.3.0](https://github.com/tj-django/django-clone/releases/tag/v2.3.0) - 2021-04-08

<small>[Compare with v2.2.1](https://github.com/tj-django/django-clone/compare/v2.2.1...v2.3.0)</small>

### Fixed

- Fix bug with o2o fields (#255) ([cdd3d0a](https://github.com/tj-django/django-clone/commit/cdd3d0a0ceee3a6ed098a8319f190eae02ea6633) by Tonye Jack).
- Fixed bug with cloning o2o fields (#252) ([342eca8](https://github.com/tj-django/django-clone/commit/342eca8a2bfad5b90432c733c7ff777cef8f50bd) by Tonye Jack).
- Fix code style issues with black ([197d0a4](https://github.com/tj-django/django-clone/commit/197d0a4b9ecd62c5032b8aad4a2990e8c6ff18a2) by Lint Action).

## [v2.2.1](https://github.com/tj-django/django-clone/releases/tag/v2.2.1) - 2021-04-07

<small>[Compare with v2.2.0](https://github.com/tj-django/django-clone/compare/v2.2.0...v2.2.1)</small>

### Fixed

- Fixed typo with filename. ([30b9a92](https://github.com/tj-django/django-clone/commit/30b9a928206f918dfb2bea1c6cb7b102ac61fd86) by Tonye Jack).

## [v2.2.0](https://github.com/tj-django/django-clone/releases/tag/v2.2.0) - 2021-04-07

<small>[Compare with v2.1.1](https://github.com/tj-django/django-clone/compare/v2.1.1...v2.2.0)</small>

### Added

- Added migration. ([c938488](https://github.com/tj-django/django-clone/commit/c938488abc8e8a7476e0da86a3db1128b15255d9) by Tonye Jack).
- Added .github/workflows/sync-release-version.yml ([6bdce7c](https://github.com/tj-django/django-clone/commit/6bdce7cc5a834040a84441880231c71a576f5e86) by Tonye Jack).
- Added .github/auto-approve.yml ([47538b5](https://github.com/tj-django/django-clone/commit/47538b54bdd89b277539451c346a26ed57608323) by Tonye Jack).
- Added .github/workflows/auto-approve.yml ([8f4bba0](https://github.com/tj-django/django-clone/commit/8f4bba0ee8b50008544baa204719b461ce2d823b) by Tonye Jack).

### Fixed

- Fix code style issues with black ([b295af9](https://github.com/tj-django/django-clone/commit/b295af98c5208a45dba0680604d7ccf8b5e0649a) by Lint Action).
- Fixed bug using pre_save. ([bc1ec5f](https://github.com/tj-django/django-clone/commit/bc1ec5f98b468cf8adac4956218f4d36ccde04e2) by Tonye Jack).
- Fixed saving date/datetime fields. ([13b5959](https://github.com/tj-django/django-clone/commit/13b5959e4f89e6e0e46fe284f693ed1aff7022ca) by Tonye Jack).

### Removed

- Remove whitespace. ([e60b52a](https://github.com/tj-django/django-clone/commit/e60b52a5ebc4eb7e0040092e8f62b35989b1c6ca) by Tonye Jack).

## [v2.1.1](https://github.com/tj-django/django-clone/releases/tag/v2.1.1) - 2021-03-25

<small>[Compare with v2.1.0](https://github.com/tj-django/django-clone/compare/v2.1.0...v2.1.1)</small>

## [v2.1.0](https://github.com/tj-django/django-clone/releases/tag/v2.1.0) - 2021-03-25

<small>[Compare with v2.0.2](https://github.com/tj-django/django-clone/compare/v2.0.2...v2.1.0)</small>

### Fixed

- Fix code style issues with black ([08f6a3c](https://github.com/tj-django/django-clone/commit/08f6a3c8e3e733392eabec6693ad6b3a6969394f) by Lint Action).
- Fixed lint errors. ([8286951](https://github.com/tj-django/django-clone/commit/82869511299afbca5e44bd79bafb9b9b154d05f7) by Tonye Jack).

## [v2.0.2](https://github.com/tj-django/django-clone/releases/tag/v2.0.2) - 2021-03-24

<small>[Compare with v2.0.1](https://github.com/tj-django/django-clone/compare/v2.0.1...v2.0.2)</small>

### Fixed

- Fix code style issues with black ([1a450ef](https://github.com/tj-django/django-clone/commit/1a450ef545aa04d17de1db74720d4e110e476a98) by Lint Action).
- Fixed test ([34a77e6](https://github.com/tj-django/django-clone/commit/34a77e68133c1f417947f408818decafd074ada6) by Tonye Jack).
- Fixed lint errors. ([5f4b4ea](https://github.com/tj-django/django-clone/commit/5f4b4eae1c313565cad43efea6d38c2c41e8671b) by Tonye Jack).
- Fixed bug with cloning slugs. ([bdb6029](https://github.com/tj-django/django-clone/commit/bdb60295f84359ed9f43fea31af2f0c076c5d623) by Tonye Jack).
- Fixed short description. ([64ce104](https://github.com/tj-django/django-clone/commit/64ce104f03c5718a1e5b95e256b5d50700ff8803) by Tonye Jack).
- Fixed migrations. ([4ede994](https://github.com/tj-django/django-clone/commit/4ede9947c2a6528598766a7b3cec04fb105d03b6) by Tonye Jack).
- Fixed type hints. ([afae2d7](https://github.com/tj-django/django-clone/commit/afae2d7f9dd46f3394015a142f9a85d4d45e0ea8) by Tonye Jack).
- Fixed clean. ([3ead462](https://github.com/tj-django/django-clone/commit/3ead462bc16c79594609539de73aa89a51019f5a) by Tonye Jack).
- Fixed imports. ([5e04253](https://github.com/tj-django/django-clone/commit/5e042538055ed9860cd2bd9d78cd7683dc4cd376) by Tonye Jack).

## [v2.0.1](https://github.com/tj-django/django-clone/releases/tag/v2.0.1) - 2021-03-24

<small>[Compare with v2.0.0](https://github.com/tj-django/django-clone/compare/v2.0.0...v2.0.1)</small>

### Fixed

- Fix code style issues with black ([408303a](https://github.com/tj-django/django-clone/commit/408303a2b2ce38d0cb8b76e5b5beac7030c90d30) by Lint Action).
- Fixed typo. ([d20c3d5](https://github.com/tj-django/django-clone/commit/d20c3d51d8ac4d1830b0006108d8cd2e614b9e43) by Tonye Jack).

### Removed

- Remove old name ([f82dab7](https://github.com/tj-django/django-clone/commit/f82dab7b1585288c68e188b5c79bccf0551ae5fe) by Tonye Jack).

## [v2.0.0](https://github.com/tj-django/django-clone/releases/tag/v2.0.0) - 2021-03-20

<small>[Compare with v1.1.10](https://github.com/tj-django/django-clone/compare/v1.1.10...v2.0.0)</small>

### Added

- Added spacing. ([af86ae6](https://github.com/tj-django/django-clone/commit/af86ae6b570cb2c1101d59814e7dadb3b897eb9e) by Tonye Jack).
- Added docs. ([a35d976](https://github.com/tj-django/django-clone/commit/a35d976f0fdeead88736bd9c2395434ed5aa2c3a) by Tonye Jack).

### Fixed

- Fix code style issues with black ([fb1377f](https://github.com/tj-django/django-clone/commit/fb1377f704b1bce4f69286e484eac0b6515b6f63) by Lint Action).
- Fixed lint errors. ([43f9887](https://github.com/tj-django/django-clone/commit/43f9887e61c13607bdd80212f0c37fd1919671f4) by Tonye Jack).
- Fixed token. ([237b2f1](https://github.com/tj-django/django-clone/commit/237b2f173cc69a80125f1c67712d82d43b17ff19) by Tonye Jack).

### Removed

- Removed unused code ([18943b6](https://github.com/tj-django/django-clone/commit/18943b69227340e7dddc2e1416c058db48d5b34f) by Tonye Jack).
- Remove parallel ([8929f85](https://github.com/tj-django/django-clone/commit/8929f855f7e77ae8ff7223ec5d4f0adf21db0b47) by Tonye Jack).

## [v1.1.10](https://github.com/tj-django/django-clone/releases/tag/v1.1.10) - 2020-12-05

<small>[Compare with v1.1.9](https://github.com/tj-django/django-clone/compare/v1.1.9...v1.1.10)</small>

### Fixed

- Fix code style issues with black ([49d8378](https://github.com/tj-django/django-clone/commit/49d837822689766f40c2ee0c0449f8a94907604b) by Lint Action).

## [v1.1.9](https://github.com/tj-django/django-clone/releases/tag/v1.1.9) - 2020-11-29

<small>[Compare with v1.1.8](https://github.com/tj-django/django-clone/compare/v1.1.8...v1.1.9)</small>

### Fixed

- Fixed lint errors. ([c93a953](https://github.com/tj-django/django-clone/commit/c93a953001137b73b555c6df9bc652f7ad2f2169) by Tonye Jack).

## [v1.1.8](https://github.com/tj-django/django-clone/releases/tag/v1.1.8) - 2020-11-07

<small>[Compare with v1.1.6](https://github.com/tj-django/django-clone/compare/v1.1.6...v1.1.8)</small>

### Added

- Add support for running workflow on forks ([a40a22e](https://github.com/tj-django/django-clone/commit/a40a22e0966916537109bbd4262edffe64052c73) by Tonye Jack).
- Add support for related manytomany fields ([0c88207](https://github.com/tj-django/django-clone/commit/0c88207cbb858c844bbee48f702cefa61e09403e) by Yuekui Li).

### Changed

- Change lint action back to v1 ([c53774c](https://github.com/tj-django/django-clone/commit/c53774c73a55a795a2c73ef2e5844c003b087b08) by Yuekui).

### Fixed

- Fixed lint errors. ([b01bb07](https://github.com/tj-django/django-clone/commit/b01bb07aab96238330dd4b133991aa14b1b68d1b) by Tonye Jack).
- Fix code style issues with black ([400e8f2](https://github.com/tj-django/django-clone/commit/400e8f213d87ad0a648a1412ad1b8d06d64944eb) by Lint Action).
- Fix typo ([29757bc](https://github.com/tj-django/django-clone/commit/29757bc9de045b20dac2751260c9dd4b6e90d1c4) by Yuekui).

### Removed

- Removed unused code. ([1a2308f](https://github.com/tj-django/django-clone/commit/1a2308feb527478842d4439d8f42cd57ebf42771) by Tonye Jack).

## [v1.1.6](https://github.com/tj-django/django-clone/releases/tag/v1.1.6) - 2020-09-07

<small>[Compare with v1.1.5](https://github.com/tj-django/django-clone/compare/v1.1.5...v1.1.6)</small>

## [v1.1.5](https://github.com/tj-django/django-clone/releases/tag/v1.1.5) - 2020-09-07

<small>[Compare with v1.1.4](https://github.com/tj-django/django-clone/compare/v1.1.4...v1.1.5)</small>

### Fixed

- Fixed issues with sdist. ([ca59c86](https://github.com/tj-django/django-clone/commit/ca59c866ce2fb09822c3a6aa723bf1bf2dc18d9a) by Tonye Jack).

## [v1.1.4](https://github.com/tj-django/django-clone/releases/tag/v1.1.4) - 2020-09-07

<small>[Compare with v1.1.3](https://github.com/tj-django/django-clone/compare/v1.1.3...v1.1.4)</small>

## [v1.1.3](https://github.com/tj-django/django-clone/releases/tag/v1.1.3) - 2020-09-07

<small>[Compare with v1.1.2](https://github.com/tj-django/django-clone/compare/v1.1.2...v1.1.3)</small>

### Fixed

- Fixed manifest.in ([df2e40b](https://github.com/tj-django/django-clone/commit/df2e40b137cab9c8968d738fa2428e06bc5dbf44) by Tonye Jack).

## [v1.1.2](https://github.com/tj-django/django-clone/releases/tag/v1.1.2) - 2020-09-07

<small>[Compare with v1.1.1](https://github.com/tj-django/django-clone/compare/v1.1.1...v1.1.2)</small>

### Added

- Added missing changes. ([37de0a6](https://github.com/tj-django/django-clone/commit/37de0a6824933d1891f4b97d563ba9e17064690f) by Tonye Jack).

## [v1.1.1](https://github.com/tj-django/django-clone/releases/tag/v1.1.1) - 2020-09-07

<small>[Compare with v1.1.0](https://github.com/tj-django/django-clone/compare/v1.1.0...v1.1.1)</small>

### Added

- Added missing changes. ([64ec246](https://github.com/tj-django/django-clone/commit/64ec24661615d8d143081d2ad43ad4d9c97b1e1a) by Tonye Jack).

### Fixed

- Fixed makefile. ([9815226](https://github.com/tj-django/django-clone/commit/9815226ab63c363dfc09e450b93b917ced9828ff) by Tonye Jack).

## [v1.1.0](https://github.com/tj-django/django-clone/releases/tag/v1.1.0) - 2020-09-07

<small>[Compare with v1.0.0](https://github.com/tj-django/django-clone/compare/v1.0.0...v1.1.0)</small>

## [v1.0.0](https://github.com/tj-django/django-clone/releases/tag/v1.0.0) - 2020-09-05

<small>[Compare with v0.2.0](https://github.com/tj-django/django-clone/compare/v0.2.0...v1.0.0)</small>

### Added

- Add migrations ([1d9c68e](https://github.com/tj-django/django-clone/commit/1d9c68e84793f02415139ae7bac1f0c18da82a71) by Yuekui Li).

### Fixed

- Fix code style issues with black ([6249abe](https://github.com/tj-django/django-clone/commit/6249abe494cec23448d6fff09b7aa0eb2cf7978a) by Lint Action).
- Fixed test. ([240cc7c](https://github.com/tj-django/django-clone/commit/240cc7cb171d2d89c500e34c0511ab15a25d71f3) by Tonye Jack).
- Fixed github action. ([80bdd3e](https://github.com/tj-django/django-clone/commit/80bdd3ec6221b683957f4abf321130e599be174e) by Tonye Jack).
- Fixed flake8 error. ([7845ede](https://github.com/tj-django/django-clone/commit/7845ede0cbbf4a0d762f481dd184939096f8be17) by Tonye Jack).
- Fix invalid enum field value for unique_together fields ([1f83bf8](https://github.com/tj-django/django-clone/commit/1f83bf8472b47739c3a879a9b40199630178336a) by Yuekui Li).

## [v0.2.0](https://github.com/tj-django/django-clone/releases/tag/v0.2.0) - 2020-06-04

<small>[Compare with v0.1.6](https://github.com/tj-django/django-clone/compare/v0.1.6...v0.2.0)</small>

### Removed

- Removed unused code. ([6f3f515](https://github.com/tj-django/django-clone/commit/6f3f5158d8b21ab24d8743cc4b33823aa347aa44) by Tonye Jack).

## [v0.1.6](https://github.com/tj-django/django-clone/releases/tag/v0.1.6) - 2020-06-04

<small>[Compare with v0.1.5](https://github.com/tj-django/django-clone/compare/v0.1.5...v0.1.6)</small>

### Added

- Add renovate.json ([99d54b5](https://github.com/tj-django/django-clone/commit/99d54b5e5ff9eb3a759edf0d3610f1d587f69429) by Renovate Bot).

### Fixed

- Fixed unused imports. ([14e40e2](https://github.com/tj-django/django-clone/commit/14e40e2d42a9f016a9520f88e52f03ac40cd8768) by Tonye Jack).
- Fixed bug with index. ([dcd6256](https://github.com/tj-django/django-clone/commit/dcd625685b3b921d09526901a7458ac56389dda1) by Tonye Jack).
- Fixed lint errors. ([888d88d](https://github.com/tj-django/django-clone/commit/888d88d5605fb8ba7ebf19fd91d4cd1eca318e13) by Tonye Jack).
- Fixed test. ([ac7af5f](https://github.com/tj-django/django-clone/commit/ac7af5f973a54f2f4c59df403edf26dc38d51bd5) by Tonye Jack).

### Removed

- Remove the information for updating installed_apps ([4a4aa8c](https://github.com/tj-django/django-clone/commit/4a4aa8c0ef3a6943b8cfe43fae96c522d6126111) by Tonye Jack).

## [v0.1.5](https://github.com/tj-django/django-clone/releases/tag/v0.1.5) - 2020-05-12

<small>[Compare with v0.1.4](https://github.com/tj-django/django-clone/compare/v0.1.4...v0.1.5)</small>

## [v0.1.4](https://github.com/tj-django/django-clone/releases/tag/v0.1.4) - 2020-05-12

<small>[Compare with v0.1.3](https://github.com/tj-django/django-clone/compare/v0.1.3...v0.1.4)</small>

### Fixed

- Fixed typo. ([1613037](https://github.com/tj-django/django-clone/commit/16130379a65be64a5cc87ba716d300a3ac34a196) by Tonye Jack).
- Fixed formatting ([db2bfb2](https://github.com/tj-django/django-clone/commit/db2bfb27196b6187f2e07e3db85ac3129c70615f) by Tonye Jack).
- Fixed test errors ([f773001](https://github.com/tj-django/django-clone/commit/f773001fc8dc0f3811f74bd630b8c8e54e622992) by Tonye Jack).
- Fixed lint errors ([1a0900a](https://github.com/tj-django/django-clone/commit/1a0900ada94751908d671405c1c89e9c316926c8) by Tonye Jack).
- Fixed version. ([969b562](https://github.com/tj-django/django-clone/commit/969b562276ec908961c84296043ec9ba65a7f86e) by Tonye Jack).
- Fixed test. ([f66e745](https://github.com/tj-django/django-clone/commit/f66e74556c8924189f417a6195272b7267d92afb) by Tonye Jack).
- Fixed ordering. ([07d5f64](https://github.com/tj-django/django-clone/commit/07d5f64eb65fcf6962c238ebb26a2580770f4aa0) by Tonye Jack).

### Removed

- Remove unused line. ([7d6339a](https://github.com/tj-django/django-clone/commit/7d6339a8eba8e8d661b3004274a9fdfccf7afd55) by Tonye Jack).

## [v0.1.3](https://github.com/tj-django/django-clone/releases/tag/v0.1.3) - 2020-04-16

<small>[Compare with v0.1.2](https://github.com/tj-django/django-clone/compare/v0.1.2...v0.1.3)</small>

### Fixed

- Fixed typo ([e2f9643](https://github.com/tj-django/django-clone/commit/e2f9643ffd165adbe98c343293936b879061056d) by Tonye Jack).
- Fixed example documentation. ([b6ceae9](https://github.com/tj-django/django-clone/commit/b6ceae92bae52e7efa49789ffcd1cb7b749a8546) by Tonye Jack).
- Fixed lint errors. ([a72040d](https://github.com/tj-django/django-clone/commit/a72040d601a19e341d39381510c949d2f2316251) by Tonye Jack).
- Fix lint errors ([9dab0f7](https://github.com/tj-django/django-clone/commit/9dab0f7c5d7c6105d1552d3130050aa0889f94df) by Tonye Jack).
- Fixed invalid config. ([d33354b](https://github.com/tj-django/django-clone/commit/d33354ba8f83c5c694f52512db0575ce12d6fa19) by Tonye Jack).

### Removed

- Removed trailing whitespace. ([6933020](https://github.com/tj-django/django-clone/commit/6933020271094b054a9753e64a75d1064bb2fdb7) by Tonye Jack).
- Removed unused line ([0307889](https://github.com/tj-django/django-clone/commit/0307889c5550b098fa8cf553c91d0a5b3756b1ff) by Tonye Jack).

## [v0.1.2](https://github.com/tj-django/django-clone/releases/tag/v0.1.2) - 2019-12-03

<small>[Compare with v0.1.1](https://github.com/tj-django/django-clone/compare/v0.1.1...v0.1.2)</small>

### Fixed

- Fixed test. ([5b65a84](https://github.com/tj-django/django-clone/commit/5b65a842380d30ec84e4e6d3db882702385bfec8) by Tonye Jack).
- Fixed flake8 errors. ([ac0c27b](https://github.com/tj-django/django-clone/commit/ac0c27b2a93c10c29b3a32990752d3c544bf9619) by Tonye Jack).

### Removed

- Removed unused line. ([4e1d5d1](https://github.com/tj-django/django-clone/commit/4e1d5d144acccfb8a15a8551ce3902338387467b) by Tonye Jack).
- Remove tabs. ([f8c1a46](https://github.com/tj-django/django-clone/commit/f8c1a46e77c5c1ec42f4fc84af6872d2861a8a69) by Tonye Jack).
- Remove unused code. ([ecfa4b6](https://github.com/tj-django/django-clone/commit/ecfa4b623ed4456cff431095f4b62ec41badb829) by Tonye Jack).

## [v0.1.1](https://github.com/tj-django/django-clone/releases/tag/v0.1.1) - 2019-12-02

<small>[Compare with v0.1.0](https://github.com/tj-django/django-clone/compare/v0.1.0...v0.1.1)</small>

### Added

- Added pypi badge. ([e0a09e4](https://github.com/tj-django/django-clone/commit/e0a09e4dd084fd7ffe2c95a66aaa1a5b09762d78) by Tonye Jack).
- Added spec for cloning in parallel. ([bede5de](https://github.com/tj-django/django-clone/commit/bede5de00cd40a610c65d9d8ae19fa270cf8fc41) by Tonye Jack).

### Fixed

- Fixed indentation. ([75214a5](https://github.com/tj-django/django-clone/commit/75214a5b786a5322b8aa2af59fc95f6d0d10a78c) by Tonye Jack).
- Fixed flake8 error. ([b28fb8c](https://github.com/tj-django/django-clone/commit/b28fb8c5f67f1ef9b790fa7f0ac3e9b106544d6f) by Tonye Jack).
- Fixed example. ([1ad41d3](https://github.com/tj-django/django-clone/commit/1ad41d3f2866136a53d4fc0f91ded97ffe5f7f89) by Tonye Jack).
- Fixed test. ([ec39547](https://github.com/tj-django/django-clone/commit/ec395478f1ecfce4a1d1a0afbc8f0299d5f4c6dc) by Tonye Jack).
- Fixed lint errors and added support for running autopep8. ([4537a46](https://github.com/tj-django/django-clone/commit/4537a460de92d0ef34c875d37439f79c26a69d36) by Tonye Jack).
- Fix bug with cloning unique fields and added support for bulk_clone. ([70e76a8](https://github.com/tj-django/django-clone/commit/70e76a81e6497ef52a2d525bc4a5e6a27d5b0e75) by Tonye Jack).
- Fix typo clonemodeladmin -> clonemodeladmin ([b8a29c2](https://github.com/tj-django/django-clone/commit/b8a29c240aba7f7736ef297a69f54eb0a399d1ad) by SebastianKapunkt).

### Removed

- Remove .extend. ([3e31e86](https://github.com/tj-django/django-clone/commit/3e31e86ab49d834cc2aad89760fa2a25eff687df) by Tonye Jack).

## [v0.1.0](https://github.com/tj-django/django-clone/releases/tag/v0.1.0) - 2019-11-23

<small>[Compare with v0.0.11](https://github.com/tj-django/django-clone/compare/v0.0.11...v0.1.0)</small>

## [v0.0.11](https://github.com/tj-django/django-clone/releases/tag/v0.0.11) - 2019-11-23

<small>[Compare with v0.0.10](https://github.com/tj-django/django-clone/compare/v0.0.10...v0.0.11)</small>

### Fixed

- Fixed lint errors. ([5b9ceae](https://github.com/tj-django/django-clone/commit/5b9ceae369da2322ee6cdbed785b0fab74275aad) by Tonye Jack).
- Fixed indentation. ([18e382d](https://github.com/tj-django/django-clone/commit/18e382d9b23872a725c97ab44333e4a354d2c123) by Tonye Jack).
- Fixed duplicate context ([be24df3](https://github.com/tj-django/django-clone/commit/be24df310abd82f4c7d716b18da82c376246578f) by Tonye Jack).
- Fixed yaml lint errors. ([ec949cb](https://github.com/tj-django/django-clone/commit/ec949cb75bcab1783e417e19ea3c04efa86b9658) by Tonye Jack).
- Fixed flake8 errors. ([133d404](https://github.com/tj-django/django-clone/commit/133d404acf49bf134f382bd98e06395fde7abe1f) by Tonye Jack).

### Removed

- Removed unused file. ([bc869a7](https://github.com/tj-django/django-clone/commit/bc869a78428ef27f8af0f2abdb11e56331faa4c3) by Tonye Jack).
- Removed node_modules. ([ae75573](https://github.com/tj-django/django-clone/commit/ae75573519bc6223ccdbe905ed49f4c3f050da18) by Tonye Jack).
- Removed empty config ([b779051](https://github.com/tj-django/django-clone/commit/b77905110a26ba96d22967cdb761e552dd0d8c6a) by Tonye Jack).

## [v0.0.10](https://github.com/tj-django/django-clone/releases/tag/v0.0.10) - 2019-11-12

<small>[Compare with v0.0.9](https://github.com/tj-django/django-clone/compare/v0.0.9...v0.0.10)</small>

### Fixed

- Fixed manage.py. ([eaa73f6](https://github.com/tj-django/django-clone/commit/eaa73f6687b4bf586cd4c6468819a615aea76d37) by Tonye Jack).
- Fixed pyenv. ([66aabd5](https://github.com/tj-django/django-clone/commit/66aabd5536d6b924d56b4a04180c460fdc399449) by Tonye Jack).
- Fixed tox. ([82afdde](https://github.com/tj-django/django-clone/commit/82afdde7929d85b55bec6e4ad8784e268564bbee) by Tonye Jack).
- Fixed python27. ([20b3c55](https://github.com/tj-django/django-clone/commit/20b3c55ecaa376d8069134427b39ce857c218931) by Tonye Jack).

### Removed

- Removed bumpversion==0.5.3. ([60a18e5](https://github.com/tj-django/django-clone/commit/60a18e57e7e4b98335cc7c612c25e77f8a5b4564) by Tonye Jack).

## [v0.0.9](https://github.com/tj-django/django-clone/releases/tag/v0.0.9) - 2019-11-10

<small>[Compare with v0.0.8](https://github.com/tj-django/django-clone/compare/v0.0.8...v0.0.9)</small>

## [v0.0.8](https://github.com/tj-django/django-clone/releases/tag/v0.0.8) - 2019-11-10

<small>[Compare with v0.0.7](https://github.com/tj-django/django-clone/compare/v0.0.7...v0.0.8)</small>

## [v0.0.7](https://github.com/tj-django/django-clone/releases/tag/v0.0.7) - 2019-11-10

<small>[Compare with v0.0.6](https://github.com/tj-django/django-clone/compare/v0.0.6...v0.0.7)</small>

## [v0.0.6](https://github.com/tj-django/django-clone/releases/tag/v0.0.6) - 2019-11-10

<small>[Compare with v0.0.5](https://github.com/tj-django/django-clone/compare/v0.0.5...v0.0.6)</small>

### Added

- Added twine to deploy requirements ([59eba81](https://github.com/tj-django/django-clone/commit/59eba81e23cfb07687ae2c1b01ce82573597f61a) by Tonye Jack).

## [v0.0.5](https://github.com/tj-django/django-clone/releases/tag/v0.0.5) - 2019-11-10

<small>[Compare with v0.0.3](https://github.com/tj-django/django-clone/compare/v0.0.3...v0.0.5)</small>

### Added

- Added the changelog.md ([3e6aeb2](https://github.com/tj-django/django-clone/commit/3e6aeb2307fe5a29b9e68fd1acac1ae5e3cfe974) by Tonye Jack).

## [v0.0.3](https://github.com/tj-django/django-clone/releases/tag/v0.0.3) - 2019-11-10

<small>[Compare with v0.0.1](https://github.com/tj-django/django-clone/compare/v0.0.1...v0.0.3)</small>

### Added

- Add class variable use_unique_duplicate_suffix ([fd207c4](https://github.com/tj-django/django-clone/commit/fd207c41398a469ebca01bbe7ad7c0284a54d218) by Andres Portillo).
- Added new line at the end of the file. ([b5b68fc](https://github.com/tj-django/django-clone/commit/b5b68fc139e5c708c5f370fc4b3256d31d1daefc) by Tonye Jack).
- Added new line. ([4327dab](https://github.com/tj-django/django-clone/commit/4327dab28b06a0efec071eb4822bc916945399dc) by Tonye Jack).
- Added create_copy_of_instance utility. ([ab41b02](https://github.com/tj-django/django-clone/commit/ab41b02f1b52cf8f4ca6b7452085f66e69f25bb7) by Tonye Jack).
- Add django-admin clonablemodeladmin ([147c56f](https://github.com/tj-django/django-clone/commit/147c56fa59f770278050eeecf0868a611a15e060) by Sebastian Kindt).

### Changed

- Change comparison to check for any truthy value ([6d11bd1](https://github.com/tj-django/django-clone/commit/6d11bd1ae43d395ec08e6764cdf7cff597b79f54) by Andres Portillo).
- Changed docker image. ([9644e73](https://github.com/tj-django/django-clone/commit/9644e73e4682904df818e678484971edb5118990) by Tonye Jack).

### Fixed

- Fix admin url regex ([e806e97](https://github.com/tj-django/django-clone/commit/e806e97bcbaec9f3030db97f6afce4ff9700619e) by Tonye Jack).
- Fixed test. ([d7cc014](https://github.com/tj-django/django-clone/commit/d7cc014c7b7547863865bdb98b1e8c7d2b32d75d) by Tonye Jack).
- Fix clone of one to many and many to one ([78a506b](https://github.com/tj-django/django-clone/commit/78a506b6caff3a7715f35a617f96c56bf2087028) by Sebastian Kindt).
- Fixed install command. ([1ea325c](https://github.com/tj-django/django-clone/commit/1ea325c2981a22f62b2c52011a3c77d82c2c77c5) by Tonye Jack).
- Fixed virtualenv. ([eab8da9](https://github.com/tj-django/django-clone/commit/eab8da905bd110e7d8bffe4b1bb680c45caa8d75) by Tonye Jack).

### Removed

- Removed extra spaces. ([28f327f](https://github.com/tj-django/django-clone/commit/28f327fbdf129e5477dc200f8af822a10a4867a0) by Tonye Jack).
- Remove redundant one to one clone. ([d1a56bd](https://github.com/tj-django/django-clone/commit/d1a56bdc1c7f000b5729b7c13324077e1b2a7267) by Tonye Jack).
- Removed unused code. ([6216032](https://github.com/tj-django/django-clone/commit/6216032d67534f0ab35da8efb574a6c1aebcf838) by Tonye Jack).

## [v0.0.1](https://github.com/tj-django/django-clone/releases/tag/v0.0.1) - 2019-04-06

<small>[Compare with first commit](https://github.com/tj-django/django-clone/compare/7b01ca1df72d4ac2a691257807cf9eab426332ae...v0.0.1)</small>

### Added

- Added pypi deployment setup. ([96816e6](https://github.com/tj-django/django-clone/commit/96816e66ca00d6146860cb96e64047d8f5add6d8) by Tonye Jack).
- Added circleci config. ([af22fc8](https://github.com/tj-django/django-clone/commit/af22fc8c00fdb17f12d83140c35a50cd8c678ba9) by Tonye Jack).
- Added base class setup. ([7b01ca1](https://github.com/tj-django/django-clone/commit/7b01ca1df72d4ac2a691257807cf9eab426332ae) by Tonye Jack).

### Fixed

- Fixed typo. ([1cfe037](https://github.com/tj-django/django-clone/commit/1cfe037e50c8fca6c31a31ecf4b36b7c88c39fbe) by Tonye Jack).
60.513402
188
0.763944
yue_Hant
0.286035
dbefcfe23b05a4e3bb48c9a11fd77423852d87e9
2,982
md
Markdown
bing-docs/bing-entity-search/how-to/rank-results.md
nkgami/bing-docs
51cc5573df9b95242f4a2fc69cb55146f2718395
[ "CC-BY-4.0", "MIT" ]
3
2020-12-04T06:26:52.000Z
2021-11-04T20:17:24.000Z
bing-docs/bing-entity-search/how-to/rank-results.md
nkgami/bing-docs
51cc5573df9b95242f4a2fc69cb55146f2718395
[ "CC-BY-4.0", "MIT" ]
75
2020-06-15T19:45:25.000Z
2022-03-31T02:29:57.000Z
bing-docs/bing-entity-search/how-to/rank-results.md
nkgami/bing-docs
51cc5573df9b95242f4a2fc69cb55146f2718395
[ "CC-BY-4.0", "MIT" ]
14
2020-10-13T23:01:06.000Z
2022-02-26T08:02:53.000Z
---
title: Using ranking to display answers - Bing Entity Search
titleSuffix: Bing Search Services
description: Learn how to use ranking to display the answers that the Bing Entity Search API returns.
services: bing-search-services
author: swhite-msft
manager: ehansen
ms.service: bing-search-services
ms.subservice: bing-entity-search
ms.topic: conceptual
ms.date: 07/15/2020
ms.author: scottwhi
---

# Use ranking to display entity search results

Each entity search response includes a [RankingResponse](../reference/response-objects.md#rankingresponse) answer that tells you how to display the search results. For details about using the ranking response, see [Web Search API](../../bing-web-search/rank-results.md).

## Ranking response examples

The following shows an example **RankingResponse** object for entities.

```json
{
  "_type": "SearchResponse",
  "queryContext": {
    "originalQuery": "Jimi Hendrix"
  },
  "entities": { ... },
  "rankingResponse": {
    "sidebar": {
      "items": [
        {
          "answerType": "Entities",
          "resultIndex": 0,
          "value": {
            "id": "https://www.bingapis.com/api/v7/#Entities.0"
          }
        },
        {
          "answerType": "Entities",
          "resultIndex": 1,
          "value": {
            "id": "https://www.bingapis.com/api/v7/#Entities.1"
          }
        }
      ]
    }
  }
}
```

Based on this ranking response, you'd display both entities in the sidebar.

And this example shows what the ranking response looks like for local business entities.

```json
{
  "_type": "SearchResponse",
  "queryContext": {
    "originalQuery": "hilton near me",
    "askUserForLocation": true
  },
  "places": { ... },
  "rankingResponse": {
    "mainline": {
      "items": [
        {
          "answerType": "Places"
        },
        {
          "answerType": "Places",
          "resultIndex": 0,
          "value": {
            "id": "https://www.bingapis.com/api/v7/#Places.0"
          }
        },
        {
          "answerType": "Places",
          "resultIndex": 1,
          "value": {
            "id": "https://www.bingapis.com/api/v7/#Places.1"
          }
        },
        {
          "answerType": "Places",
          "resultIndex": 2,
          "value": {
            "id": "https://www.bingapis.com/api/v7/#Places.2"
          }
        },
        {
          "answerType": "Places",
          "resultIndex": 3,
          "value": {
            "id": "https://www.bingapis.com/api/v7/#Places.3"
          }
        },
        {
          "answerType": "Places",
          "resultIndex": 4,
          "value": {
            "id": "https://www.bingapis.com/api/v7/#Places.4"
          }
        }
      ]
    }
  }
}
```

## Next steps

- Learn about [sending entity search requests](search-for-entities.md)
- Learn about the [quickstarts](../quickstarts/quickstarts.md) and [samples](../samples.md) that are available to help you get up and running fast.
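To tie the examples above together, here is a small, hedged Python sketch of consuming a ranking response. It is based only on the JSON shapes shown in this article, not on an official SDK, and assumes the usual answer shape where each answer object carries its results in a `value` array:

```python
def ordered_answers(response: dict, section: str = "mainline") -> list:
    """Return the results for one ranking section, in display order.

    A ranking item either names a whole answer type (no resultIndex) or
    points at a single result within that answer via resultIndex.
    """
    ranking = response.get("rankingResponse", {}).get(section, {})
    ordered = []
    for item in ranking.get("items", []):
        answer_key = item["answerType"].lower()  # "Places" -> "places"
        answer = response.get(answer_key, {})
        index = item.get("resultIndex")
        if index is None:
            ordered.append(answer)               # display the whole answer
        else:
            ordered.append(answer["value"][index])
    return ordered
```

With the local-business example above, the first mainline item selects the whole `places` answer (no `resultIndex`), and the remaining items select the five individual place results in order.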
25.058824
270
0.556338
eng_Latn
0.685481
dbeff02d599744ca33a2b6bdc4f19c3f2d3da640
2,260
md
Markdown
packages/query-params/README.md
Lindeneg/cl-react-hooks
46f4eef7bba77aacc38f2a5fdabf83ef796c5f54
[ "MIT" ]
2
2022-03-09T08:18:36.000Z
2022-03-10T18:51:15.000Z
packages/query-params/README.md
Lindeneg/cl-react-hooks
46f4eef7bba77aacc38f2a5fdabf83ef796c5f54
[ "MIT" ]
null
null
null
packages/query-params/README.md
Lindeneg/cl-react-hooks
46f4eef7bba77aacc38f2a5fdabf83ef796c5f54
[ "MIT" ]
null
null
null
### @lindeneg/query-params ![typescript](https://badgen.net/badge/icon/typescript?icon=typescript&label) ![bundle-size](https://badgen.net/bundlephobia/min/@lindeneg/query-params) ![license](https://badgen.net/npm/license/@lindeneg/query-params) [Sandbox](https://codesandbox.io/s/lindeneg-query-params-rnmi9) --- React hook using [URLSearchParams](https://developer.mozilla.org/en-US/docs/Web/API/URLSearchParams) ### Installation `yarn add @lindeneg/query-params` --- #### Arguments | Name | Required | Ref | Type | Description | | ------ | -------- | --- | ------------------------ | -------------------------------------------------- | | params | Y | T | `Record<string, string>` | object with param keys and optional default values | #### Return Object with properties: | Name | Type | Description | | ------------- | ----------------------------------------- | -------------------------------- | | values | `T` | object with current param values | | onParamChange | `(type: keyof T, value: string) => void;` | change param value | #### Usage ```tsx function SomeComponent() { const { values, onParamChange } = useQueryParams({ id: '', context: '', // optional default value sort: 'desc', }); } ``` This will result in `window.location.search` having the value `?sort=desc`. If `sort` also was an empty string, then `window.location.search` would be empty. Update param as follows: ```ts onParamChange('context', 'products'); ``` Now `window.location.search` will be `?context=products&sort=desc` To remove it, use an empty string: ```ts onParamChange('context', ''); ``` Now `window.location.search` will be `?sort=desc` If default values are specified, an empty string will not remove the key but rather use the default value instead. So, if `sort` is changed ```ts onParamChange('sort', 'asc'); ``` Then `window.location.search` will of course be `?sort=asc` However, if the key is removed ```ts onParamChange('sort', ''); ``` The default value will be used and `window.location.search` will be `?sort=desc`
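As one more sketch, here is the hook wired to a form control (the default import is an assumption; adjust to the package's actual export):

```tsx
import useQueryParams from '@lindeneg/query-params';

function SortControl() {
  const { values, onParamChange } = useQueryParams({ sort: 'desc' });
  return (
    <select
      value={values.sort}
      onChange={(e) => onParamChange('sort', e.target.value)}
    >
      <option value="desc">Descending</option>
      <option value="asc">Ascending</option>
      {/* an empty string falls back to the default, i.e. ?sort=desc */}
      <option value="">Reset</option>
    </select>
  );
}
```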
28.25
218
0.572566
eng_Latn
0.598206
dbf0040601f5b48a9ccb872d13939711a5a4f016
756
md
Markdown
BUILD.md
liuseniubi/advancements_tracker
f27835cb0a27df90fe95c2eb3d003eab3be988d3
[ "MIT" ]
null
null
null
BUILD.md
liuseniubi/advancements_tracker
f27835cb0a27df90fe95c2eb3d003eab3be988d3
[ "MIT" ]
7
2021-03-31T19:11:51.000Z
2021-04-27T23:42:05.000Z
BUILD.md
liuseniubi/advancements_tracker
f27835cb0a27df90fe95c2eb3d003eab3be988d3
[ "MIT" ]
2
2021-04-05T07:42:17.000Z
2021-06-18T22:00:56.000Z
# Development Environment

The mod was built in a [Visual Studio Code][visual_studio_code] environment, but you could technically use any other Java IDE as well.

# Forge Docs

Please take a look at the [Forge Docs][forge_docs] to see a full overview of the Forge API.

# Gradle

The build system is based on [Gradle][gradle].

# Testing

For testing you basically only need the following gradlew commands.

## Test with Client

`gradlew.bat runClient`

## Test with Server

`gradlew.bat runServer`

Note: Make sure to read and accept the eula.txt inside the run directory, otherwise the server will not start.

[forge_docs]: https://mcforge.readthedocs.io/en/latest/
[gradle]: https://docs.gradle.org/
[visual_studio_code]: https://code.visualstudio.com/
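As a sketch of that last step on a Unix-like shell (the eula.txt file is typically generated by the first `runServer` attempt; read the EULA before flipping the flag):

```shell
$ cat run/eula.txt                                # read the EULA first
$ sed -i 's/eula=false/eula=true/' run/eula.txt   # then accept it
```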
25.2
132
0.763228
eng_Latn
0.96711
dbf04a90ffe924d7826374bb2cd5cb67addddf85
6,093
md
Markdown
_posts/2019-01-26-Download-amazon-it-grammatica-inglese-con-esercizi-di.md
Bunki-booki/29
7d0fb40669bcc2bafd132f0991662dfa9e70545d
[ "MIT" ]
null
null
null
_posts/2019-01-26-Download-amazon-it-grammatica-inglese-con-esercizi-di.md
Bunki-booki/29
7d0fb40669bcc2bafd132f0991662dfa9e70545d
[ "MIT" ]
null
null
null
_posts/2019-01-26-Download-amazon-it-grammatica-inglese-con-esercizi-di.md
Bunki-booki/29
7d0fb40669bcc2bafd132f0991662dfa9e70545d
[ "MIT" ]
null
null
null
--- layout: post comments: true categories: Other --- ## Download Amazon it grammatica inglese con esercizi di book After the applause died away, this is, p? He pushed in a long metal flap at the side of the trunk, the nearness of those searching for him doesn't matter, a single hour, with a door at the farther end. Will you still try?" cannot here take any further notice of them. The kitchens that serviced the restaurant from the level above also serviced the staff cafeteria in the Government Center, "O flight of the transfinite. He tried to think of a compliment that wouldn't be completely insincere? "I'll go in, they say so will the Archmage be one returned from death, and I thought that he had gone "It's Amos!" cried Hidalga. Or maybe he wanted to be hit, was as free of criminals as it was untroubled by lumbering brontosaurs, "and I could not be back for lunch. Darlene would be all right, paws cool, religious Tom had acted with the best intentions-but also with the intelligence and the good judgment that God had given him and that he had spent a lifetime honing. It is even said that here The past three years had given Wally much to celebrate, but my heart of them doth not complain. He has such amazon it grammatica inglese con esercizi di incredible innocence. " wonderfully unpredictable world it is, everything; she had listened; she had been still, back!"-and warded it off as if it were Hanna. Because strength ultimately gets to control the wealth and to impose ideas. Her wide-open hazel neighbourhood of the _Vega's_ winter haven. called out, and he would be toward Geneva's. Although the trucker looks vastly amused, mother and sister and two sons; he would leave Mote with could never be subjected to pain, without meaning, alone in a long coach car. "Who?" "Don't you know anything about spaceships?" McKillian shouted. We used W. Or by rast. " which have been made during recent decades to our knowledge of amazon it grammatica inglese con esercizi di "You couldnвt afford one. amazon it grammatica inglese con esercizi di her hairpins! At eight o'clock in the evening, a neck made to burst restraining informant referred to a tradition handed down from former warlike The door was falling shut. So, 'When the night cometh and my master entereth [the harem] and the folk go away, Amsterdam! It is river. The sky, doesn't it?" I shrugged. Kurremkarmerruk shook his head! He suspected the blame lay with his exceptional sensitivity to violence, will have an influence on the development of the organism, to be our own dogs. "So I guess I'll have to. Frankly speaking, she sees only what anyone can seeвwhich strikes her as plenty strange enough. And she had a talent for There are some who say that the school had its beginnings far differently. I am a prisoner Now the Persian had a mameluke, two-fold menu, The Issue at Hand, Tom produced another quarter from a pants pocket, and the village witch was punished for them. But she's got ten years on you, whenas repentance advantaged us not, a fully evolved butterfly? " When the king heard these tidings of Aamir, a child of three would be too young to learn the use of a blind man's cane, she wouldn't be able to move any faster than the Slut Al the lime. He remained perfectly motionless for a minute, zigging and renowned Russian navigator, though walk with you. direction of the gate, BREACH OF WARRANTY OR BREACH OF CONTRACT EXCEPT THOSE the rapidity with which the people thereby make themselves distraction, it's none of your business anymore. 
Now Ishac had returned to his house upon an occasion that presented itself to him; and when he entered the vestibule, but a quiet promise, and McKillian listened over his shoulder as Weinstein briefed them on the situation as he saw it, a needed purpose. Eleven saints had been given twelve shares of "You're what?" The dogвs tail wags, applauded the bed, D? RAMBRENT. (After a Photograph. _ From above. A seal caught in a net among the ice The higher animal forms which, if you have any need of that, thick-necked toad. "What's your name?" bathroom, like maps of imaginary realms, never saved a life, it slipped into the tight amazon it grammatica inglese con esercizi di of his curled forefinger. "Yeah. departed from thence, as amazon it grammatica inglese con esercizi di all their clothes, pineapple cheesecake, as though she were on a pew? He specialized in postwar Germany-locals and zones, King Akambar! vessel, tool-using culture had not yet emerged, which was reached on 3, but Celestina worked as a waitress to pay for her studio apartment and other needs. When the police operator answered, being anyone or anything other than himselfвrequires a constant misshapen digit that was connected by a thick web of tissue to a gnarled and stubby middle finger, and this shall bring thee great worship in the eyes of all the Jinn. " fifteen-hundred energy units. " voyage he passed a very good harbour in 72 deg. He felt violated. Car theft. " In the face of his kindness, "That's enough for of life and life's purpose was superior to any other, rain falling less peace and well-being than one is inclined beforehand to suppose. 1583, often without trace of a nest. " me amazon it grammatica inglese con esercizi di either stupid or disposed to lie. And he was beginning to doubt if the demolition squad suiting up to go outside farther back in the Hexagon would be able to do much good since the external approaches to the module would almost certainly be amazon it grammatica inglese con esercizi di just as effectively; he knew how the minds that designed things like this worked Junior stalked her, and she was wearing tan slacks with an orange silk blouse covering firm. But a finder can always find work, watching. Alarmed, he kept the shipwrights busy. 93; Then he gave the cup to the Khalif, while others brought fresh logs and worked the bellows sleeves, and if we do that. They were very marked friendliness, and Otter knew he was wrong. "She?" her hairpins. That's gratifying," Junior said sincerely. The Waterfowl and the Tortoise cxlviii 3.
677
5,975
0.790743
eng_Latn
0.999888
dbf0823e34f074a19357dfa4e3b3805cbc51fa45
2,484
md
Markdown
docs/ado/guide/data/determining-what-is-supported.md
Sticcia/sql-docs.it-it
31c0db26a4a5b25b7c9f60d4ef0a9c59890f721e
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/ado/guide/data/determining-what-is-supported.md
Sticcia/sql-docs.it-it
31c0db26a4a5b25b7c9f60d4ef0a9c59890f721e
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/ado/guide/data/determining-what-is-supported.md
Sticcia/sql-docs.it-it
31c0db26a4a5b25b7c9f60d4ef0a9c59890f721e
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Determining What Is Supported | Microsoft Docs
ms.prod: sql
ms.prod_service: connectivity
ms.technology: connectivity
ms.custom: ''
ms.date: 01/19/2017
ms.reviewer: ''
ms.topic: conceptual
helpviewer_keywords:
- editing data [ADO], Supports method
- Supports method [ADO]
ms.assetid: 65090cba-6d46-4775-8d61-f6838e7752a6
author: MightyPen
ms.author: genemi
manager: craigg
ms.openlocfilehash: 3a9b8bcf01f348679fc16230c021166d4d9dc786
ms.sourcegitcommit: f7fced330b64d6616aeb8766747295807c92dd41
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 04/23/2019
ms.locfileid: "62472220"
---
# <a name="determining-what-is-supported"></a>Determining What Is Supported

The **Supports** method is used to determine whether a given **Recordset** object supports a particular type of functionality. It has the following syntax:

```
boolean = recordset.Supports(CursorOptions )
```

## <a name="remarks"></a>Remarks

The **Supports** method returns a Boolean value that indicates whether the provider supports all of the features identified by the CursorOptions argument. You can use the **Supports** method to determine what kinds of features a **Recordset** object supports. If the **Recordset** object supports the features whose corresponding constants are in *CursorOptions*, the **Supports** method returns **True**. Otherwise, it returns **False**.

Using the **Supports** method, you can check the ability of the **Recordset** object to add new records, use bookmarks, use the **Find** method, use scrolling, use the **Index** property, and perform batch updates. For a complete list of constants and their meanings, see [CursorOptionEnum](../../../ado/reference/ado-api/cursoroptionenum.md).

Although the **Supports** method may return **True** for a given feature, this does not guarantee that the provider can make the feature available under all circumstances. The **Supports** method simply reports whether the provider supports the specified feature, assuming certain conditions are met. For example, the **Supports** method might indicate that a **Recordset** object supports updates even though the cursor is based on a multiple-table join, some columns of which are not updatable.
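To make this concrete, here is a small Visual Basic sketch. The `Supports` method and the `adBookmark` constant are from the documented API; the `rs` Recordset and the surrounding logic are made up for illustration:

```vb
' Illustrative only: check for bookmark support before relying on it.
If rs.Supports(adBookmark) Then
    Dim mark As Variant
    mark = rs.Bookmark      ' remember the current row
    rs.MoveLast             ' move elsewhere in the Recordset
    rs.Bookmark = mark      ' jump back to the remembered row
End If
```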
65.368421
583
0.787037
ita_Latn
0.996814
dbf0b863e64736d2ea4374f97eff7c175ab5aa9f
3,873
md
Markdown
spec/spec-format-catalog.md
uptonking/note4yaoo
c7ee878ffdbc4ccf0496cbcb3eda4644e47029f4
[ "MIT" ]
6
2020-10-08T04:56:02.000Z
2022-02-17T18:59:27.000Z
spec/spec-format-catalog.md
uptonking/note4yaoo
c7ee878ffdbc4ccf0496cbcb3eda4644e47029f4
[ "MIT" ]
null
null
null
spec/spec-format-catalog.md
uptonking/note4yaoo
c7ee878ffdbc4ccf0496cbcb3eda4644e47029f4
[ "MIT" ]
1
2020-10-08T04:56:04.000Z
2020-10-08T04:56:04.000Z
--- title: spec-format-catalog tags: [catalog, format, spec] created: '2019-12-02T10:25:27.789Z' modified: '2020-10-15T13:42:23.746Z' --- # spec-format-catalog > specifications about file formats # guide # popular-file-formats - csf (Component Story Format) - title, component, parameters, decrators - [Storybook for everyone: CSF vs MDX](https://dev.to/lauracarballo/storybook-for-everyone-csf-vs-mdx-88b) - Both CSF and MDX provide a great way of building component libraries. - I recently ran a twitter poll on prefered method when writing stories and nearly 70% of the people (80 votes aprox.) voted on using CSF, which is understandable as it is still the standard and recommended way. - But, MDX is still a very convenient way of writing stories in those cases where CSF seems a bit of a barrier for non technical users or our component needs precise and well structured documentation. - [What are your thoughts on the MDX format for @storybookjs](https://twitter.com/lcarb14/status/1379913918445801473) - I'm currently preferring MDX as it provides both Stories and Docs with a single file. # JSON-stat - docs - https://json-stat.org/ - https://github.com/jsonstat/toolkit - https://json-stat.org/samples/oecd.json - Until the introduction of JSON-stat, the main statistical standards for data and metadata exchange were XML-based: they were usually complicated and verbose. - JSON-stat is a simple lightweight JSON dissemination format best suited for data visualization, mobile apps or open data initiatives, that has been designed for all kinds of disseminators(传播者). - JSON-stat also proposes an HTML microdata schema to enrich HTML tables and put the JSON-stat vocabulary in the browser. - The JSON-stat format is a simple lightweight JSON format for data dissemination. - It is based in a cube model that arises from the evidence that the **most common form of data dissemination is the tabular form**. - HTML microdata allows machine-readable data to be embedded in HTML documents in the form of nested groups of name-value pairs. - JSON-stat proposes a vocabulary in microdata format to enrich HTML tables. - If you have to process JSON-stat responses, you are not on your own. - There are solutions available in several programming languages: JavaScript, Java, R, Python, Julia, PHP. - And many end user tools to browse, filter, validate and convert JSON-stat. # graphics - SVG(SCALABLE VECTOR GRAPHICS) - SVG is a markup language for describing two-dimensional graphics applications and images, and a set of related graphics script interfaces. - [SVG Working Group Charter](https://www.w3.org/Graphics/SVG/2014/new-charter) - [SVG Roadmap](https://www.w3.org/Graphics/SVG/WG/wiki/Roadmap) # ref - [Understand how structured data works](https://developers.google.com/search/docs/guides/intro-structured-data) - Structured data is a standardized format for providing information about a page and classifying the page content - Most Search structured data uses schema.org vocabulary, but you should rely on the documentation on developers.google.com as definitive for Google Search behavior - [JSON-LD](http://json-ld.org/) - A JavaScript notation embedded in a `<script>` tag in the page head or body. - Google can read JSON-LD data when it is dynamically injected into the page's contents - [RDFa](https://rdfa.info/) - An HTML5 extension that supports linked data by introducing HTML tag attributes that correspond to the user-visible content that you want to describe for search engines. 
- RDFa is commonly used in both the head and body sections of the HTML page. - [Microdata](https://www.w3.org/TR/microdata/) - An open-community HTML specification used to nest structured data within HTML content. - It is typically used in the page body, but can be used in the head.
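A minimal sketch of reading a JSON-stat response with the toolkit linked above (assuming its promise-based API; verify against the toolkit docs):

```js
import JSONstat from 'jsonstat-toolkit';

// Fetch the sample dataset and flatten the cube into rows.
JSONstat('https://json-stat.org/samples/oecd.json').then((j) => {
  const ds = j.Dataset(0);      // first dataset in the response
  console.log(ds.label);        // dataset label
  console.table(ds.toTable());  // cube flattened into tabular form
});
```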
65.644068
214
0.765815
eng_Latn
0.991462
dbf0e37e038bdb812143b0f98d914ff4b53b3dee
356
md
Markdown
README.md
mitsuaki1229/TypeScriptWithCats
937193557cdc67f43e638fb2d4d5efb2a2ff9515
[ "MIT" ]
null
null
null
README.md
mitsuaki1229/TypeScriptWithCats
937193557cdc67f43e638fb2d4d5efb2a2ff9515
[ "MIT" ]
null
null
null
README.md
mitsuaki1229/TypeScriptWithCats
937193557cdc67f43e638fb2d4d5efb2a2ff9515
[ "MIT" ]
null
null
null
TypeScriptWithCats ==== Feel in touch with cats. ## Requirement * npm ## Install ```shell $ npm install --save-dev typescript jest ts-jest @types/jest $ tsc -v Version 4.1.3 ``` ## Usage ```shell $ npm test ``` ## Licence This software is released under the MIT License, see LICENSE.md. ## Author [mitsuaki1229](https://github.com/mitsuaki1229)
11.483871
64
0.682584
eng_Latn
0.602464
dbf0e824a46f7113282c2431585fbb26b7c601aa
7,352
md
Markdown
docs/schemas/incident.md
Linked-Pro/platform
79b1ecefd4f0f8ea338ecb072bd571c461394c67
[ "Apache-2.0" ]
96
2015-06-19T16:08:15.000Z
2019-11-28T07:56:05.000Z
docs/schemas/incident.md
Linked-Pro/platform
79b1ecefd4f0f8ea338ecb072bd571c461394c67
[ "Apache-2.0" ]
49
2015-07-02T02:34:45.000Z
2019-10-21T04:50:50.000Z
docs/schemas/incident.md
Linked-Pro/platform
79b1ecefd4f0f8ea338ecb072bd571c461394c67
[ "Apache-2.0" ]
18
2020-03-10T23:46:38.000Z
2022-02-05T10:07:22.000Z
# Incident Schema ```txt https://platform.codeclimate.com/schemas/incident ``` Incidents are a normalized, de-duplicated event. It can be thought of as a problem or an issue within your service that needs to be addressed and resolved. | Abstract | Extensible | Status | Identifiable | Custom Properties | Additional Properties | Access Restrictions | Defined In | | :------------------ | ---------- | -------------- | ------------ | :---------------- | --------------------- | ------------------- | -------------------------------------------------------------------------------------- | | Can be instantiated | No | Unknown status | No | Forbidden | Allowed | none | [Incident.schema.json](../../spec/schemas/Incident.schema.json "open original schema") | ## Incident Type `object` ([Incident](incident.md)) # Incident Properties | Property | Type | Required | Nullable | Defined by | | :---------------------- | ------------- | -------- | -------------- | :-------------------------------------------------------------------------------------------------------------------------- | | [type](#type) | `string` | Optional | cannot be null | [Incident](incident-properties-type.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/type") | | [id](#id) | `string` | Required | cannot be null | [Incident](incident-properties-id.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/id") | | [self](#self) | `string` | Required | cannot be null | [Incident](incident-properties-self.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/self") | | [title](#title) | `string` | Required | cannot be null | [Incident](incident-properties-title.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/title") | | [htmlUrl](#htmlUrl) | `string` | Optional | cannot be null | [Incident](incident-properties-htmlurl.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/htmlUrl") | | [number](#number) | `integer` | Optional | cannot be null | [Incident](incident-properties-number.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/number") | | [status](#status) | Not specified | Required | cannot be null | [Incident](incident-properties-status.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/status") | | [createdAt](#createdAt) | `string` | Required | cannot be null | [Incident](incident-properties-createdat.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/createdAt") | | [updatedAt](#updatedAt) | `string` | Optional | cannot be null | [Incident](incident-properties-updatedat.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/updatedAt") | | [deletedAt](#deletedAt) | `string` | Optional | cannot be null | [Incident](incident-properties-deletedat.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/deletedAt") | ## type `type` - is optional - Type: `string` - cannot be null - defined in: [Incident](incident-properties-type.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/type") ### type Type `string` ### type Constraints **constant**: the value of this property must be equal to: ```json "Incident" ``` ## id The unique ID of this incident from the incident response platform. `id` - is required - Type: `string` - cannot be null - defined in: [Incident](incident-properties-id.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/id") ### id Type `string` ## self The canonical URI for this record. 
`self` - is required - Type: `string` - cannot be null - defined in: [Incident](incident-properties-self.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/self") ### self Type `string` ### self Constraints **URI**: the string must be a URI, according to [RFC 3986](https://tools.ietf.org/html/rfc3986 "check the specification") ## title The human-readable title of this incident. `title` - is required - Type: `string` - cannot be null - defined in: [Incident](incident-properties-title.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/title") ### title Type `string` ## htmlUrl The URL for a human to view this incident. `htmlUrl` - is optional - Type: `string` - cannot be null - defined in: [Incident](incident-properties-htmlurl.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/htmlUrl") ### htmlUrl Type `string` ### htmlUrl Constraints **URI**: the string must be a URI, according to [RFC 3986](https://tools.ietf.org/html/rfc3986 "check the specification") ## number The number identifying this incident. `number` - is optional - Type: `integer` - cannot be null - defined in: [Incident](incident-properties-number.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/number") ### number Type `integer` ## status The status of this incident `status` - is required - Type: unknown - cannot be null - defined in: [Incident](incident-properties-status.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/status") ### status Type unknown ### status Constraints **enum**: the value of this property must be equal to one of the following values: | Value | Explanation | | :--------------- | ----------- | | `"triggered"` | | | `"acknowledged"` | | | `"resolved"` | | ## createdAt The time this incident began. `createdAt` - is required - Type: `string` - cannot be null - defined in: [Incident](incident-properties-createdat.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/createdAt") ### createdAt Type `string` ### createdAt Constraints **date time**: the string must be a date time string, according to [RFC 3339, section 5.6](https://tools.ietf.org/html/rfc3339 "check the specification") ## updatedAt The time this incident was last updated. `updatedAt` - is optional - Type: `string` - cannot be null - defined in: [Incident](incident-properties-updatedat.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/updatedAt") ### updatedAt Type `string` ### updatedAt Constraints **date time**: the string must be a date time string, according to [RFC 3339, section 5.6](https://tools.ietf.org/html/rfc3339 "check the specification") ## deletedAt The time this incident was deleted. `deletedAt` - is optional - Type: `string` - cannot be null - defined in: [Incident](incident-properties-deletedat.md "https&#x3A;//platform.codeclimate.com/schemas/incident#/properties/deletedAt") ### deletedAt Type `string` ### deletedAt Constraints **date time**: the string must be a date time string, according to [RFC 3339, section 5.6](https://tools.ietf.org/html/rfc3339 "check the specification")
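For orientation, a minimal document satisfying the required properties might look like the following (all values are illustrative):

```json
{
  "type": "Incident",
  "id": "abc123",
  "self": "https://platform.codeclimate.com/incidents/abc123",
  "title": "Checkout latency above threshold",
  "status": "triggered",
  "createdAt": "2020-03-10T23:46:38Z"
}
```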
31.965217
223
0.623368
eng_Latn
0.565503
dbf0e89af172ca37b8e441c159c1f7f2228c3009
27,403
md
Markdown
docs/_posts/2021-12-01-blog-2.md
CaroPattle/data_stories
025942ad72061ae7ecc0d2b9150ea1c12a0d32a9
[ "MIT" ]
null
null
null
docs/_posts/2021-12-01-blog-2.md
CaroPattle/data_stories
025942ad72061ae7ecc0d2b9150ea1c12a0d32a9
[ "MIT" ]
null
null
null
docs/_posts/2021-12-01-blog-2.md
CaroPattle/data_stories
025942ad72061ae7ecc0d2b9150ea1c12a0d32a9
[ "MIT" ]
null
null
null
--- title: Bicycle Volume & Speed - Analysis & Visualisation date: 2021-12-01 00:00:00 description: This post covers how we used the Bicycle Volume & Speed dataframe to create analysis and visualisations. featured_image: '/images/Sensor_Site_Map.png' --- ### The dataset In the previous blog, we shared how we imported and readied our dataset in a Python coding environment. This blog picks up where the previous post left off, covering the various analyses and visualisations created using the open dataset. ### Python Libraries These are the core libraries we'll import for our visualisations: {% highlight py%} import pandas as pd import numpy as np import plotly.express as px import geopandas as gpd import plotly.graph_objects as go {% endhighlight %} ### Bicycle Sensor Map For the first visualisation, we'll map bicycle volume by year and create a slider or animation to show change. We'll also create HTML files for each analysis, which can be embedded as interactive visualisations on a website like this one. <p>&nbsp;</p> {% highlight py%} colours = ['#9ca3ff','#838cfc','#636efa','#3d4bfc'] fig = px.scatter_mapbox(df_merge, lat="STRT_LAT", lon="STRT_LONG", hover_name="SITE_DESC", hover_data={'STRT_LAT':False, "STRT_LONG":False, "Total_cyclists":True, "Average_speed":True, "Average_speed":':.2f'}, color = "Total_cyclists", size = "Total_cyclists", size_max=15, color_continuous_scale = colours, opacity = 0.6, range_color = [0, df_merge.Total_cyclists.max()], animation_frame = df_merge.year_M, zoom=10.5, labels={"year_M":"Month","Average_speed": "Average Speed(km/hr)", "Total_cyclists": "Total Cyclists"}) fig.update_layout(mapbox_style="carto-positron") fig.update_coloraxes(colorbar_title_text=" ", colorbar_title_side= "right", colorbar_bgcolor="rgba(255,255,255,255)", colorbar_thicknessmode= "pixels", colorbar_thickness=33, colorbar_lenmode="pixels", colorbar_tickfont_family="Arial", colorbar_tickfont_size= 14, colorbar_ticklabelposition= 'inside', colorbar_ticks="inside", colorbar_tickcolor='rgba(255,255,255,255)', colorbar_len=520, colorbar_xpad= 0, colorbar_ypad= 0, colorbar_tickfont_color='rgba(255,255,255,255)', colorbar_x = 1, colorbar_yanchor="top", colorbar_y = 1) fig.update_layout(hoverlabel=dict(bgcolor="white", font_size=14, font_family="Arial", bordercolor="rgba(255,255,255,255)")) fig.update_traces(hoverlabel_font_color="Black") fig.update_layout(margin={"r":0,"t":0,"l":0,"b":0}) fig.update_layout(updatemenus=[dict(type='buttons', showactive=False, y=-0.10, x=-0, xanchor='left', yanchor='bottom') ]) fig.layout.sliders[0].pad.t=6 fig.update_layout(sliders=[dict(font_color ="#a0a1a3", font_family = 'Arial', font_size = 14, currentvalue_visible = False, borderwidth=0, bgcolor='#b0b7e8', tickcolor = '#b8b8b8', activebgcolor = '#949feb', transition_duration = 300, transition_easing = "linear" )]) for x in fig.layout.sliders[0].steps: x.label = format_year_month([x.label])[0] fig.write_html(f'{visuals}Sensor_Map.html') fig.show() {% endhighlight %} <p>&nbsp;</p> <iframe src="/data_stories/visuals/bicycle_flows/Sensor_Map.html" class="video-wrap full-width" height="600px" style="border:none;"></iframe> ### Sensor Traffic To visualise traffic flows at sensor locations, we'll plot the bicycle volume recorded by each individual sensor as an interactive horizontal bar chart. 
<p>&nbsp;</p> {% highlight py %} df_sensor = df_merge.groupby(['LOCATION_DESC'])["Total_cyclists"]\ .agg(sum)\ .sort_index(ascending=False)\ .sort_values()\ .reset_index() df_sensor['Cyclist_percentage'] = df_sensor.apply(lambda x: \ 100* x.Total_cyclists/df_sensor.Total_cyclists.sum(),axis=1) fig = go.Figure() fig.add_trace(go.Bar(x=df_sensor.Total_cyclists, y=df_sensor.LOCATION_DESC, orientation='h', customdata =df_sensor.Cyclist_percentage, hovertemplate =( '<b>%{y}</b><br>' 'Total cyclists: <b>%{x:,}</b><br>'+ 'Network share: <b>%{customdata:.1f}%<extra></extra></b>'))) fig.update_yaxes(title= None, showgrid = True, showticklabels = True, showline=True) fig.update_xaxes( color = '#b8b8b8', showgrid = True, showticklabels = True, showline=True) fig.for_each_annotation(lambda a: a.update(text=a.text.split("=")[-1])) fig.update_layout(height = 1000) fig.update_layout(yaxis = dict(tickfont = dict(family='Arial',size=16),gridcolor='White'), #color = '#a0a1a3' xaxis = dict(tickfont = dict(family='Arial',size=16, color = '#b8b8b8'),gridcolor='White'), hoverlabel=dict(font=dict(family='Arial', size=14))) fig.update_layout(hoverlabel=dict(bgcolor="white", font_size=14, font_family="Arial", font_color='Black', bordercolor="rgba(255,255,255,255)"), plot_bgcolor='#f2f2f2') fig.update_traces(width=0.9) fig.update_layout(barmode='stack', bargap=0.20, height=1400, margin=dict(pad=10)) fig.update_yaxes(showticklabels=True, visible=True) fig.layout.xaxis.side='top' fig.show() fig.write_html(f'{visuals}Sensor_by_volume.html') {% endhighlight %} <p>&nbsp;</p> <iframe src="/data_stories/visuals/bicycle_flows/Sensor_by_volume.html" class="video-wrap full-width" height="400px" style="border:none;"></iframe> <br> ### Bicycle Volume Next, let's look at some basic summary statistics for bicycle volume by year. <p>&nbsp;</p> {% highlight py %} dfm.groupby('year')[['VEHICLE','SPEED']].agg({'VEHICLE':['count','sum','mean','median']}).round(1) {% endhighlight %} | Year | Monthly sensor counts | Number of cyclists | Average monthly cyclists | Median monthly cyclists | |--------------------------|------------------------:|-------------------:|-------------------------:|------------------------:| | 2018 | 882 | 11,350,902 | 12,869.5 | 10,365 | | 2019 | 963 | 13,064,949 | 13,566.9 | 10,338 | | 2020 | 1,016| 11,803,715 | 11,617.8 | 9,267.5 | | 2021 (excl. Nov, Dec) | 908 | 9,625,145 | 10,600.4 | 8,933.5 | | **Overall** | **3769** | **45,844,711** | **12,163.6** | **9,813** | <br> ### Covid19 public health restrictions In order to analyse how public health restrictions might have affected bicycle traffic, we've collated a separate xlsx file with the following columns. 
<p>&nbsp;</p> {% highlight py %} restrictions = pd.read_excel(os.path.abspath('../../data/DPC/Restriction_timeline.xlsx'), parse_dates = True) restrictions {% endhighlight %} <p>&nbsp;</p> | | Start_Date | End_Date | Restriction | Lockdown_Number | Restriction_Level | |---:|-----------:|-----------:|------------:|----------------:|------------------:| | 0 | 2020-03-23 | 2020-03-25 | Partial | NaN | Stage 1 | | 1 | 2020-03-26 | 2020-03-30 | Partial | NaN | Stage 2 | | 2 | 2020-03-31 | 2020-05-12 | Full | 1.0 | Stage 3 | | 3 | 2020-05-13 | 2020-05-31 | Partial | NaN | NaN | | 4 | 2020-06-01 | 2020-06-21 | Partial | NaN | NaN | | 5 | 2020-06-22 | 2020-07-29 | Partial | NaN | NaN | | 6 | 2020-07-01 | 2020-07-08 | Full | NaN | Stage 3 | | 7 | 2020-07-09 | 2020-08-01 | Full | 2.0 | Stage 3 | | 8 | 2020-08-02 | 2020-10-18 | Full | 2.0 | Stage 4 | | 9 | 2020-10-19 | 2020-10-27 | Full | 2.0 | Stage 4 | | 10 | 2020-10-28 | 2021-02-12 | Partial | NaN | NaN | | 11 | 2021-02-13 | 2021-02-17 | Full | 3.0 | Stage 4 | | 12 | 2021-02-18 | 2021-05-27 | Partial | NaN | NaN | | 13 | 2021-05-28 | 2021-06-10 | Full | 4.0 | Stage 4 | | 14 | 2021-06-11 | 2021-07-15 | Partial | NaN | NaN | | 15 | 2021-07-16 | 2021-07-27 | Full | 5.0 | Stage 4 | | 16 | 2021-07-28 | 2021-08-04 | Partial | NaN | NaN | | 17 | 2021-08-05 | 2021-09-27 | Full | 6.0 | Stage 4 | | 18 | 2021-09-28 | 2021-10-21 | Full | 6.0 | NaN | | 19 | 2021-10-22 | 2021-10-31 | Partial | NaN | NaN | <p>&nbsp;</p> For this analysis, full restrictions correspond to Stage 3 and Stage 4 restrictions (4 reasons to leave the home), while partial restrictions correspond with Stage 1 and Stage 2 restrictions. We've recorded 01/06/2020 - 08/07/2020 as a full lockdown, which although limited to 36 suburbs and public housing complexes, encompassed a significant portion of Melbourne. Linking the restriction date ranges with the dataframe dates is made easier by creating an in-memory SQL database using the code below, which allows greater flexibility in joining based on dates within date ranges. <p>&nbsp;</p> {% highlight py %} import sqlite3 #Make the db in memory conn = sqlite3.connect(':memory:') #write the tables df_merge_W.reset_index().to_sql('weeks', conn, index=False) restrictions.to_sql('restrictions', conn, index=False) qry = ''' select weeks.*, restrictions.* from weeks join restrictions on year_W between "Start_Date" and "End_Date" ''' df_restrictions = pd.read_sql_query(qry, conn) df_restrictions['year_W'] = pd.to_datetime(df_restrictions['year_W']) df_restrictions = df_restrictions.set_index(['SITE_XN_ROUTE', 'LOC_LEG','year_W']) df_restrictions = pd.concat([df_restrictions, df_merge_W.loc[~df_merge_W.index.isin(df_restrictions.index)]]) df_restrictions = df_restrictions[~df_restrictions.index.duplicated(keep='first')] {% endhighlight %} <p>&nbsp;</p> Finally, we'll create a table which summarises cyclist volume and speed by restriction level. {% highlight py %} df_restrictions.groupby (['Restriction']).agg({'Total_cyclists':['mean'],'Average_speed':'mean'}).round(1) {% endhighlight %} <p>&nbsp;</p> | Restriction | Total_cyclists | Average_speed | |------------:|----------------|---------------| | None | 3161.4 | 21.2 | | Partial | 2360.5 | 20.4 | | Full | 2386.5 | 19.5 | ### Distribution of bicycle volume by restriction level Here we'll use a plotly express boxplot to look at how the distribution of bicycle volume may differ according to public health restriction levels. 
<p>&nbsp;</p> {% highlight py %} fig = px.box(df_restrictions, y ="Total_cyclists", color="Restriction", notched=True, category_orders={'Restriction':['None','Partial','Full']}, labels={"Total_cyclists": "Total Cyclists"}) fig.update_layout(hoverlabel=dict(bgcolor="white", font_size=14, font_color = 'Black', font_family="Arial", bordercolor="rgba(255,255,255,255)"), plot_bgcolor='#f2f2f2',) fig.update_yaxes(tickfont = dict(family='Arial',size=16, color = "#a0a1a3"), gridcolor='White') fig.update_layout(legend=dict( yanchor="top", xanchor="left", font=dict( family="Arial", size=14, ))) fig.show() fig.write_html(f'{visuals}Restriction_volume.html') {% endhighlight %} <p>&nbsp;</p> <iframe src="/data_stories/visuals/bicycle_flows/Restriction_volume.html" allowfullscreen class="video-wrap full-width" height="600px" style="border:none;"></iframe> ### Bicycle speed by year The code below uses a plotly express line graph to chart annual bicycle speed. A unified hoverlabel allows the user to compare datapoints while hiding unnecessary labelling. <p>&nbsp;</p> {% highlight py %} df_volume = dfm.groupby(['year','month'])['SPEED'].agg(['mean','std']).reset_index() df_volume = df_volume.rename(columns={"mean":"Average speed (km/h)"}).round({"Average speed (km/h)":2}) df_volume['year_month'] = df_volume.apply(lambda x: f'{int(x["year"])}-{x["month"]}',axis=1) colours = ['#9ca3ff','#838cfc','#636efa','#3d4bfc'] fig = px.line(df_volume.loc[df_volume.year_month!="2021-Nov"], x='month', y='Average speed (km/h)', color='year', markers = False, color_discrete_sequence=colours, render_mode="SVG", hover_data={'month':False}) fig.update_xaxes(ticklabelposition="inside top", title=None, showgrid = False, showline=True, linewidth=2, linecolor='#b8b8b8') fig.update_yaxes(title= None, showgrid = True, showticklabels = True, showline=True, linewidth=2, linecolor='#b8b8b8', gridcolor='rgba(0,0,0,0)') fig.update_traces(line=dict(width=4),hovertemplate=None) fig.update_layout( yaxis = dict(tickfont = dict(size=14))) annotations = [] y_terminal_manually_adjusted = [21.08, 20.85, 20.44, 20.335235284775884] x_terminals = [[v for v in zip(d.x,d.y) if str(v[1])!="nan"][-1][0] for d in fig.data] for x, y, year, colour in zip(x_terminals, y_terminal_manually_adjusted, df_volume.year.unique(), colours): # labeling the right_side of the plot annotations.append(dict(x=x, y=y, xanchor='left', yanchor='middle', align="right", text=str(year), font=dict(family='Arial', size=16, color=colour), showarrow=False)) fig.update_layout(annotations=annotations) fig.update_traces(hovertemplate='<b>%{y}</b> km') fig.update_layout(yaxis_range=[17.5,22], hovermode="x unified", showlegend = False, yaxis = dict(tickfont = dict(family='Arial',size=14, color = '#b8b8b8')), xaxis = dict(tickfont = dict(family='Arial',size=14, color = '#b8b8b8')), hoverlabel=dict(font=dict(family='Arial', size=14)), plot_bgcolor='rgba(0,0,0,0)') layout = go.Layout(margin=go.layout.Margin( l=0, #left margin r=0, #right margin b=0, #bottom margin t=0, #top margin ) ) fig.show() fig.write_html(os.path.abspath(f'{visuals}speed_chart.html')) {% endhighlight %} <iframe src="/data_stories/visuals/bicycle_flows/speed_chart.html" allowfullscreen class="video-wrap full-width" height="500px" style="border:none;"></iframe> ### Bicycle direction by year The following code makes use of polar charts to create compasses with volume bars. 
<p>&nbsp;</p> {% highlight py %} from pandas.api.types import CategoricalDtype df_direction = df_merge.groupby(['Year', 'BEARING_DESC'])['Total_cyclists'].sum().reset_index() cardinal_dict = { 'NORTH BOUND' :'N', 'NORTH EAST BOUND' :'NE', 'EAST BOUND' :'E', 'SOUTH EAST BOUND' :'SE', 'SOUTH BOUND' :'S', 'SOUTH WEST BOUND' :'SW', 'WEST BOUND' :'W', 'NORTH WEST BOUND' :'NW' } df_direction['BEARING_DESC'] = df_direction['BEARING_DESC'].apply(lambda x: cardinal_dict[x]) cardinal_dict = {k:v for k,v in zip(["N", "NE", "E", "SE", "S", "SW", "W", "NW"], [0, 45, 90, 135, 180, 225, 270, 315])} df_direction['BEARING_DESC'] = df_direction['BEARING_DESC'].apply(lambda x: cardinal_dict[x]) range_cyclists=[0,df_direction['Total_cyclists'].max()] figs={} years=df_direction['Year'].unique() n_plots = len(years) for year in years: figs[year] = px.bar_polar(df_direction.query(f"Year == '{year}'").sort_values('BEARING_DESC'), r="Total_cyclists", range_r =range_cyclists, theta = "BEARING_DESC", labels = {'BEARING_DESC':'Direction'}, title=year) for subfig in figs: figs[subfig] figs[subfig].update_layout( title_font = dict(family = 'Arial', size = 16), polar = dict( radialaxis = dict(gridcolor='White'), angularaxis = dict( thetaunit = "degrees", dtick = 45, rotation=90, direction = "clockwise", tickmode="array", tickvals=[0, 45, 90, 135, 180, 225, 270, 315], ticktext=["N", "NE", "E", "SE", "S", "SW", "W", "NW"], gridcolor='White' ), bgcolor='#f2f2f2', )) figs[subfig].show() pio.write_image(figs[subfig],f'{visuals}Sensor_Direction_{subfig}.svg', scale = 5) {% endhighlight %} <p>&nbsp;</p> {% include post-components/gallery.html columns = 1 full_width = false images = "/data_stories/visuals/bicycle_flows/Sensor_Direction_2018.svg,/data_stories/visuals/bicycle_flows/Sensor_Direction_2019.svg,/data_stories/visuals/bicycle_flows/Sensor_Direction_2020.svg,/data_stories/visuals/bicycle_flows/Sensor_Direction_2021.svg" %} <br> <p>&nbsp;</p> ### Traffic comparison by sensor This visualisation of the relative share of traffic flow at sensor locations pre- and during the first two years of the pandemic extends previous code, using a plotly graph object bar chart. <p>&nbsp;</p> {% highlight py %} df_merge['year'] = df_merge.year_M.apply(lambda x: int(x.split('-')[0])) df_merge['month'] = df_merge.year_M.apply(lambda x: int(x.split('-')[1])) df_merge['bi_year_split'] = df_merge.year.isin([2020,2021])\ .astype(str)\ .replace('False','2018-19')\ .replace('True','2020-21') # Sum up total cyclists for 2018-19 and 2020-21, excluding November and December to ensure fair comparison with 2021 df_sensor = df_merge.loc[df_merge.month<11].groupby(['bi_year_split','LOCATION_DESC'])["Total_cyclists"]\ .agg(sum)\ .sort_index(ascending=False)\ .reset_index() # Here we calculate the totals to use as a denominator for the percentage of cyclist traffic. total_cyclists = df_merge.loc[df_merge.month<11].groupby('LOCATION_DESC')['Total_cyclists'].sum().reset_index() df_sensor['Cyclist percentage'] = pd.merge(df_sensor, total_cyclists, on='LOCATION_DESC', how='left')\ .apply(lambda x: 100* x.Total_cyclists_x/x.Total_cyclists_y,axis=1) # sort values on cycling percentage df_sensor.sort_values('Cyclist percentage', ascending=False,inplace=True) # remove 99% (ie. 
partial year sensors) df_sensor = df_sensor.loc[(df_sensor['Cyclist percentage']<99)&(df_sensor['Cyclist percentage']>1)] fig = go.Figure(go.Bar(y=df_sensor.loc[df_sensor.bi_year_split=='2018-19'].LOCATION_DESC, x=df_sensor.loc[df_sensor.bi_year_split=='2018-19']['Cyclist percentage'], legendgroup="2018-19", name="2018-19", marker_color=colours[0], marker_opacity=1.0, orientation='h', hovertemplate =( '<b>%{y}</b><br>' 'Sensor volume: <b>%{x:.1f}%<extra></extra><br>') ) ) fig.add_trace(go.Bar(y=df_sensor.loc[df_sensor.bi_year_split=='2020-21'].LOCATION_DESC, x=df_sensor.loc[df_sensor.bi_year_split=='2020-21']['Cyclist percentage'], legendgroup="2020-21", name="2020-21", marker_color=colours[2], marker_opacity=1.0, orientation='h', hovertemplate =( '<b>%{y}</b><br>' 'Sensor volume: <b>%{x:.1f}%<extra></extra><br>') ) ) fig.update_layout(barmode='relative') fig.add_vline(x=50) fig.add_annotation(x=.5,y=-.12, text="Percentage of cyclists across 4 years", xref="paper", yref="paper", showarrow=False, font_size=14, font_color="#a0a1a3") fig.update_layout(height = 1000) fig.update_layout(yaxis = dict(tickfont = dict(family='Arial',size=16, color = 'Black')), xaxis = dict(tickfont = dict(family='Arial',size=14, color = '#b8b8b8')), hoverlabel=dict(font=dict(family='Arial', size=14)), paper_bgcolor='rgba(0,0,0,0)', plot_bgcolor='rgba(0,0,0,0)') fig.update_layout(hoverlabel=dict(font_size=14, font_family="Arial")) fig.update_layout(legend=dict( orientation="h", yanchor="top", y= 1.03, xanchor="left", x=0.65, font=dict( family="Arial", size=14, ) )) fig.update_layout(barmode='stack', bargap=0.7) fig.update_yaxes(showticklabels=True, visible=True) fig.layout.xaxis.side='bottom' fig.update_traces(width=0.9) fig.update_layout(hoverlabel=dict(bgcolor="white", font_size=14, font_family="Arial", font_color='Black', bordercolor="rgba(255,255,255,255)"), height=1400, margin=dict(pad=10 )) fig.show() fig.write_html(f'{visuals}Sensor_by_split_years_comparison.html') {% endhighlight %} <iframe src="/data_stories/visuals/bicycle_flows/Sensor_by_split_years_comparison.html" allowfullscreen class="video-wrap full-width" height="500px" style="border:none;"></iframe> ### Traffic change by sensor distance from CBD To measure the distance of each sensor from the CBD, first we'll need to create a geodataframe with the coordinates in metres. We'll define the input coordinate reference system as EPSG 4326 (WGS84) and reproject this to EPSG 3111 (VicGrid 94). Melbourne CBD was represented as the junction of Elizabeth and Bourke streets, identified from Google Maps with the coordinates (144.9631419, -37.8142100) and transformed to the VicGrid GDA94 (EPSG 3111) spatial reference in units of metres using [epsg.io](https://epsg.io/transform#s_srs=4326&t_srs=3111&x=144.9631419&y=-37.8142100), with the coordinates in units of Easting and Northing being (x = 2496754.68 , y = 2409644.73). This allows us to estimate the approximate distance from sensors to Melbourne's CBD in metres, based on the basic Euclidean 'crow flies' straight line distance. 
<p>&nbsp;</p> {% highlight py %} gdf = gpd.GeoDataFrame( df_restrictions, geometry=gpd.points_from_xy(df_restrictions.STRT_LONG, df_restrictions.STRT_LAT))\ .set_crs(4326)\ .to_crs(3111) CBD_x = 2496754.68 CBD_y = 2409644.73 df_restrictions['x_metres']=gdf.geometry.x.apply(lambda x: x-CBD_x) df_restrictions['y_metres']=gdf.geometry.y.apply(lambda y: y-CBD_y) df_restrictions['distance_cbd_km'] = np.sqrt(df_restrictions.x_metres**2 + df_restrictions.y_metres**2)/1000 biyearly_pct_cbd = df_sensor.merge(df_restrictions[['LOCATION_DESC','distance_cbd_km','geometry']],on='LOCATION_DESC').drop_duplicates() {% endhighlight %} <p>&nbsp;</p> We'll then use a plotly express scatter with trendlines to view the datapoints. <p>&nbsp;</p> {% highlight py %} biyearly_pct_cbd['log_dist_cbd'] = biyearly_pct_cbd.distance_cbd_km.apply(lambda x: np.log10(x)) fig = px.scatter(biyearly_pct_cbd, x="log_dist_cbd", y="Cyclist percentage", color="bi_year_split", labels={"bi_year_split": "Year", "LOCATION_DESC":"Site", "distance_cbd_km": "Distance to CBD (km)", "Cyclist percentage":"Percentage of cyclists in period"}, hover_data={'bi_year_split':True, "LOCATION_DESC":True, "distance_cbd_km": ":.1f", "log_dist_cbd":False, "Total_cyclists":False, "Cyclist percentage":":.1f"}, trendline='ols', ) fig.update_layout( xaxis = dict( title="Distance from CBD (km; Log10 scale)", tickmode = 'array', tickvals = [np.log10(x) for x in [1,10,20,40,80]], ticktext = [1,10,20,40,80] ) ) fig.update_layout(hoverlabel=dict(bgcolor="white", font_size=14, font_color = 'Black', font_family="Arial", bordercolor="rgba(255,255,255,255)"), plot_bgcolor='#f2f2f2',) fig.show() fig.write_html(f'{visuals}Cyclist_percentage_distance_cbd.html') {% endhighlight %} <iframe src="/data_stories/visuals/bicycle_flows/Cyclist_percentage_distance_cbd.html" allowfullscreen class="video-wrap full-width" height="500px" style="border:none;"></iframe>
42.485271
587
0.557384
eng_Latn
0.530742
dbf1074c63bb865116512b7bc6635cf036a24844
404
md
Markdown
145. Binary Tree Postorder Traversal.md
kongpingfan/go-leetcode
7ca9ddb0bf7e39083cd071ac488cd3ba3c360db8
[ "RSA-MD" ]
1
2019-04-14T02:45:57.000Z
2019-04-14T02:45:57.000Z
145. Binary Tree Postorder Traversal.md
kongpingfan/go-leetcode
7ca9ddb0bf7e39083cd071ac488cd3ba3c360db8
[ "RSA-MD" ]
null
null
null
145. Binary Tree Postorder Traversal.md
kongpingfan/go-leetcode
7ca9ddb0bf7e39083cd071ac488cd3ba3c360db8
[ "RSA-MD" ]
null
null
null
# 145. Binary Tree Postorder Traversal

Given a binary tree, return the *postorder* traversal of its nodes' values.

**Example:**

```
Input: [1,null,2,3]
   1
    \
     2
    /
   3

Output: [3,2,1]
```

***

See 99. Binary Tree Inorder Traversal for the same idea with a different visit order.

```go
/**
 * Definition for a binary tree node.
 * type TreeNode struct {
 *     Val int
 *     Left *TreeNode
 *     Right *TreeNode
 * }
 */
func postorderTraversal(root *TreeNode) []int {
    if root == nil {
        return nil
    }
    // Visit the left subtree, then the right subtree, then the root itself.
    res := postorderTraversal(root.Left)
    res = append(res, postorderTraversal(root.Right)...)
    return append(res, root.Val)
}
```
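A common stack-based alternative (a sketch using the same `TreeNode` type): traverse root-right-left iteratively, then reverse the result to obtain left-right-root.

```go
// Iterative postorder: pop in root-right-left order, then reverse.
func postorderTraversalIter(root *TreeNode) []int {
    var res []int
    stack := []*TreeNode{}
    if root != nil {
        stack = append(stack, root)
    }
    for len(stack) > 0 {
        n := stack[len(stack)-1]
        stack = stack[:len(stack)-1]
        res = append(res, n.Val)
        // Push left before right so right is visited first.
        if n.Left != nil {
            stack = append(stack, n.Left)
        }
        if n.Right != nil {
            stack = append(stack, n.Right)
        }
    }
    // Reverse root-right-left into left-right-root.
    for i, j := 0, len(res)-1; i < j; i, j = i+1, j-1 {
        res[i], res[j] = res[j], res[i]
    }
    return res
}
```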
10.918919
75
0.574257
yue_Hant
0.656084
dbf134ceefe834fd36811828499ff748df12a02d
1,129
md
Markdown
README.md
w158rk/easy-ecc
ab6fffee39d53887ce23ceab4e98b97b8baf2071
[ "BSD-2-Clause" ]
null
null
null
README.md
w158rk/easy-ecc
ab6fffee39d53887ce23ceab4e98b97b8baf2071
[ "BSD-2-Clause" ]
null
null
null
README.md
w158rk/easy-ecc
ab6fffee39d53887ce23ceab4e98b97b8baf2071
[ "BSD-2-Clause" ]
null
null
null
This is a project for the primitives in Elliptic Curve Cryptography (ECC). It is forked from https://github.com/esxgx/easy-ecc, with some modifications to make it easier to build and read, as well as a little more efficient.

There are several branches for different versions, ECC-8, ECC-32 and ECC-64, aimed at platforms with different word sizes. The algorithms in these branches are generally the same.

# Use

You can use the 8-bit, 32-bit or 64-bit version, whichever is convenient for your platform:

```sh
git checkout ecc-{8 or 32 or 64}
```

## compiling

```sh
mkdir build-release
cd build-release
cmake ../
make
```

or

```sh
mkdir build-debug
cd build-debug
cmake -DCMAKE_BUILD_TYPE=Debug ../
make
```

For Arduino Mega 2560, use

```sh
mkdir build-avr
cd build-avr
cmake -DCMAKE_BUILD_TYPE=Release-avr ../
make
make keytest-upload
make keytest-serial
```

## run the test programs

```
./bin/keytest   -- key generation, encrypt, decrypt, sign and verify
./bin/pointtest -- test scalar multiplication or point addition
./bin/fieldtest -- test number multiplication and squaring
```
22.58
251
0.717449
eng_Latn
0.992383
dbf17e5df670d49e2933e6a296b2fef0b466c54c
320
md
Markdown
README.md
viniciusme/quiz-javascript
b5b333483b3cf7f698891a5464d120fff0423c9a
[ "MIT" ]
null
null
null
README.md
viniciusme/quiz-javascript
b5b333483b3cf7f698891a5464d120fff0423c9a
[ "MIT" ]
null
null
null
README.md
viniciusme/quiz-javascript
b5b333483b3cf7f698891a5464d120fff0423c9a
[ "MIT" ]
null
null
null
# quiz-javascript

The questions are generated dynamically, and as each one is answered a progress bar shows how much of the quiz is left. At the end, a scoreboard shows how many questions were asked and how many the user answered correctly.

Technologies used: HTML, CSS3 and JavaScript.
106.666667
301
0.828125
por_Latn
0.999956
dbf1865aa699eb9608cc3631e4e87e394df955ad
4,594
md
Markdown
docs/onAuth.md
kmorgan8588/isomorphic-git-fork
a623133345a5d8b6bb7a8352ea9702ce425d8266
[ "MIT" ]
5,982
2017-12-28T15:42:50.000Z
2022-03-31T21:43:54.000Z
docs/onAuth.md
kmorgan8588/isomorphic-git-fork
a623133345a5d8b6bb7a8352ea9702ce425d8266
[ "MIT" ]
1,417
2017-12-28T14:53:31.000Z
2022-03-31T19:25:40.000Z
docs/onAuth.md
kmorgan8588/isomorphic-git-fork
a623133345a5d8b6bb7a8352ea9702ce425d8266
[ "MIT" ]
338
2018-01-28T19:47:35.000Z
2022-03-29T13:10:27.000Z
--- title: onAuth sidebar_label: onAuth --- The `onAuth` callback allows isomorphic-git to request credentials. It is only called if a server returns an HTTP error (such as 404 or 401) when attempting to access the resource without credentials. Authentication is normally required for pushing to a git repository. It may also be required to clone or fetch from a private repository. Git does all its authentication using HTTPS Basic Authentication. An `onAuth` function is called with a `url` and an `auth` object and should return a GitAuth object: ```ts /** * @callback AuthCallback * @param {string} url * @param {GitAuth} auth - Might have some values if the URL itself originally contained a username or password. * @returns {GitAuth | void | Promise<GitAuth | void>} */ /** * @typedef {Object} GitAuth * @property {string} [username] * @property {string} [password] * @property {Object<string, string>} [headers] * @property {boolean} cancel - Tells git to throw a `UserCanceledError` (instead of an `HTTPError`). */ ``` ## Example ```js await git.clone({ ..., onAuth: url => { let auth = lookupSavedPassword(url) if (auth) return auth if (confirm('This repo is password protected. Ready to enter a username & password?')) { auth = { username: prompt('Enter username'), password: prompt('Enter password'), } return auth } else { return { cancel: true } } } }) ``` ## Option 1: Username & Password Return an object with `{ username, password }`. However, there are some things to watch out for. If you have two-factor authentication (2FA) enabled on your account, you probably cannot push or pull using your regular username and password. Instead, you may have to use a Personal Access Token. (Bitbucket calls them "App Passwords".) ### Personal Access Tokens - [Instructions for GitHub](https://help.github.com/articles/creating-a-personal-access-token-for-the-command-line/) - [Instructions for Bitbucket](https://confluence.atlassian.com/bitbucket/app-passwords-828781300.html) - [Instructions for GitLab](https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html) In this situation, you want to return an object with `{ username, password }` where `password` is the Personal Access Token. Note that GitHub actually lets you specify the token as the `username` and leave the password blank, which is convenient but none of the other hosting providers do this that I'm aware of. ### OAuth2 Tokens If you are writing a third-party app that interacts with GitHub/GitLab/Bitbucket, you may be obtaining OAuth2 tokens from the service via a feature like "Login with GitHub". Depending on the OAuth2 token's grants, you can use those tokens for pushing and pulling from git repos as well. In this situation, you want to return an object with `{ username, password }` where `username` and `password` depend on where the repo is hosted. Unfortunately, all the major git hosting companies have chosen different conventions for converting OAuth2 tokens into Basic Authentication headers! | | `username` | `password` | | ---------- | ---------------- | --------------- | | GitHub | `token` | 'x-oauth-basic' | | GitHub App | 'x-access-token' | `token` | | BitBucket | 'x-token-auth' | `token` | | GitLab | 'oauth2' | `token` | I will gladly accept pull requests to document more companies' conventions. Since it is a rarely used feature, I'm not including the conversion table directly in isomorphic-git anymore. 
But if there's interest in maintaining this table as some kind of function, I'm considering starting an `@isomorphic-git/quirksmode` package to handle these kinds of hosting-provider specific oddities. ## Option 2: Headers This is the super flexible option. Just return the HTTP headers you want to add as an object with `{ headers }`. You can provide `{ username, password, headers }` if you want. (Although if `headers` includes an `Authorization` property, it overwrites what you would normally get from `username`/`password`.) To re-implement the default Basic Auth behavior, do something like this: ```js let auth = { headers: { Authorization: `Basic ${Buffer.from(`${username}:${password}`).toString('base64')}` } } ``` If you are using a custom proxy server that has its own authentication in addition to the destination authentication, you could inject it like so: ```js let auth = { username, password, headers: { 'X-Authentication': `Bearer ${token}` } } ```
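For example, following the GitHub row of the OAuth2 table above, a token (here a hypothetical `token` variable) can be wired into `onAuth` like so:

```js
await git.clone({
  ...,
  onAuth: () => ({ username: token, password: 'x-oauth-basic' })
})
```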
37.966942
201
0.710927
eng_Latn
0.990929
dbf1ab7563126f8b91570d99ff852854b1e0fb0d
3,114
md
Markdown
curriculum/challenges/english/05-back-end-development-and-apis/basic-node-and-express/implement-a-root-level-request-logger-middleware.md
Buzzfreeze/freeCodeCamp
ed383c47121d83ed612cbfbb4bd2688594089433
[ "BSD-3-Clause" ]
null
null
null
curriculum/challenges/english/05-back-end-development-and-apis/basic-node-and-express/implement-a-root-level-request-logger-middleware.md
Buzzfreeze/freeCodeCamp
ed383c47121d83ed612cbfbb4bd2688594089433
[ "BSD-3-Clause" ]
null
null
null
curriculum/challenges/english/05-back-end-development-and-apis/basic-node-and-express/implement-a-root-level-request-logger-middleware.md
Buzzfreeze/freeCodeCamp
ed383c47121d83ed612cbfbb4bd2688594089433
[ "BSD-3-Clause" ]
null
null
null
---
id: 587d7fb1367417b2b2512bf3
title: Implement a Root-Level Request Logger Middleware
challengeType: 2
forumTopicId: 301514
dashedName: implement-a-root-level-request-logger-middleware
---

# --description--

In the previous lesson you learned about the `express.static()` middleware. Now it is time to look at middleware in more detail.

A middleware function is a function that takes 3 arguments: the request object, the response object, and the next function in the application's request-response cycle. Inside a middleware function you can write code that affects the app, and middleware is commonly used to add information to the request or response. A middleware can also end the cycle by sending a response itself, so the inner functions are never reached. If the middleware does not send a response, it must invoke the next function in the stack, which is its 3rd argument, `next()`.

Look at the following example:

```js
function(req, res, next) {
  console.log("I'm a middleware...");
  next();
}
```

Suppose you mount this middleware on a route. When a request is sent to that route, the message "I'm a middleware..." appears in the console, and only after that is the next function called.

In this exercise you will build a root-level middleware. As you saw in challenge 4, you can mount a middleware at root level with the method `app.use(<mware-function>)`. In that case, every request sent to your app passes through this middleware. You can also set more specific conditions: for example, if you want the middleware to run only for POST requests, you can use `app.post(<mware-function>)`. Analogous methods exist for the other HTTP request methods (GET, DELETE, PUT, etc.).

# --instructions--

Build a logger middleware that logs every request to your app on the console, in the format `method path - ip`, for example `GET /json - ::ffff:127.0.0.1`. Note that there is a space between `method` and `path`, and that the dash (`-`) between `path` and `ip` has a space on both sides. You can get the `method`, `path`, and `ip` values from `req.method`, `req.path`, and `req.ip` respectively.

Remember to call `next()` every time the middleware finishes its work, otherwise your server will hang forever. Finally, keep the console window open so you can see what incoming requests look like.

**Note:** Express evaluates functions in the order they appear in the code, and this applies to middleware too. If you want your middleware to work for all routes, it must be declared above the route definitions.

# --hints--

The root-level logger middleware should be working.

```js
(getUserInput) =>
  $.get(getUserInput('url') + '/_api/root-middleware-logger').then(
    (data) => {
      assert.isTrue(
        data.passed,
        'root-level logger is not working as expected'
      );
    },
    (xhr) => {
      throw new Error(xhr.responseText);
    }
  );
```

# --solutions--

```js
/**
  Backend challenges don't need solutions,
  because they would need to be tested against a full working project.
  Please check our contributing guidelines to learn more.
*/
```
41.52
205
0.696532
tha_Thai
0.974158
dbf1f3acf63ff5f4056d04bb1428a68642e1d9a7
6,975
md
Markdown
docs/visual-basic/programming-guide/language-features/xml/overview-of-linq-to-xml.md
rscprof/docs.ru-ru
9c2a47b4b444efb88ed2c2d943b09721415d5ed0
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/visual-basic/programming-guide/language-features/xml/overview-of-linq-to-xml.md
rscprof/docs.ru-ru
9c2a47b4b444efb88ed2c2d943b09721415d5ed0
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/visual-basic/programming-guide/language-features/xml/overview-of-linq-to-xml.md
rscprof/docs.ru-ru
9c2a47b4b444efb88ed2c2d943b09721415d5ed0
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Overview of LINQ to XML in Visual Basic
ms.date: 07/20/2015
helpviewer_keywords:
- LINQ to XML [Visual Basic], about LINQ to XML
- LINQ [Visual Basic], LINQ to XML
ms.assetid: 01c62a79-6d58-468e-84fb-039c05947701
ms.openlocfilehash: 65df48112834be04dc8d3b62b8b163316b06c4a6
ms.sourcegitcommit: efff8f331fd9467f093f8ab8d23a203d6ecb5b60
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 09/01/2018
ms.locfileid: "43421247"
---
# <a name="overview-of-linq-to-xml-in-visual-basic"></a>Overview of LINQ to XML in Visual Basic

Visual Basic provides support for [!INCLUDE[sqltecxlinq](~/includes/sqltecxlinq-md.md)] through XML literals and XML axis properties, which give you a familiar, convenient syntax for working with XML in Visual Basic code. *XML literals* let you include XML directly in your code. *XML axis properties* let you access the child nodes, descendant nodes, and attributes of an XML literal. For more information, see [XML Literals Overview](../../../../visual-basic/programming-guide/language-features/xml/xml-literals-overview.md) and [Accessing XML in Visual Basic](../../../../visual-basic/programming-guide/language-features/xml/accessing-xml.md).

[!INCLUDE[sqltecxlinq](~/includes/sqltecxlinq-md.md)] is an in-memory XML programming API designed specifically to take advantage of [!INCLUDE[vbteclinqext](~/includes/vbteclinqext-md.md)]. Although you can call the [!INCLUDE[vbteclinq](~/includes/vbteclinq-md.md)] APIs directly, only Visual Basic lets you declare XML literals and access XML axis properties directly.

> [!NOTE]
> XML literals and XML axis properties are not supported in declarative code in an ASP.NET page. To use the XML features of Visual Basic, put your code in a code-behind page in your ASP.NET application.

![link to video](../../../../visual-basic/programming-guide/language-features/xml/media/playvideo.gif "PlayVideo") For related video demonstrations, see [How Do I Get Started with LINQ to XML?](/aspnet/web-forms/videos/data-access/linq-videos-from-the-vb-team/how-do-i-get-started-with-linq-to-xml) and [How Do I Create Excel Spreadsheets using LINQ to XML?](/aspnet/web-forms/videos/data-access/linq-videos-from-the-vb-team/how-do-i-create-excel-spreadsheets-using-linq-to-xml).

## <a name="creating-xml"></a>Creating XML

There are two ways to create XML trees in Visual Basic. You can declare an XML literal directly in code, or you can use the [!INCLUDE[vbteclinq](~/includes/vbteclinq-md.md)] APIs to create the tree. Both approaches let the code mirror the final structure of the XML tree. For example, the following code example creates an XML element:

[!code-vb[VbXmlSamples#5](../../../../visual-basic/language-reference/operators/codesnippet/VisualBasic/overview-of-linq-to-xml_1.vb)]

For more information, see [Creating XML in Visual Basic](../../../../visual-basic/programming-guide/language-features/xml/creating-xml.md).

## <a name="accessing-and-navigating-xml"></a>Accessing and navigating XML

Visual Basic provides XML axis properties for accessing and navigating XML structures. These properties let you access XML elements and attributes by specifying the names of the child XML elements. Alternatively, you can explicitly call the [!INCLUDE[vbteclinq](~/includes/vbteclinq-md.md)] methods for navigating and finding elements and attributes.

For example, the following code example uses XML axis properties to refer to the attributes and child elements of an XML element. The code example uses a [!INCLUDE[vbteclinq](~/includes/vbteclinq-md.md)] query to retrieve child elements and output them as XML elements, effectively performing a transform.

[!code-vb[VbXmlSamples#8](../../../../visual-basic/language-reference/operators/codesnippet/VisualBasic/overview-of-linq-to-xml_2.vb)]

For more information, see [Accessing XML in Visual Basic](../../../../visual-basic/programming-guide/language-features/xml/accessing-xml.md).

## <a name="xml-namespaces"></a>XML namespaces

Visual Basic lets you specify an alias for a global XML namespace by using the `Imports` statement. The following example shows how to use an `Imports` statement to import an XML namespace:

[!code-vb[VbXMLSamples#1](../../../../visual-basic/language-reference/operators/codesnippet/VisualBasic/overview-of-linq-to-xml_3.vb)]

You can use the XML namespace alias when you access XML axis properties and declare XML literals for XML documents and elements. You can retrieve the <xref:System.Xml.Linq.XNamespace> object for a particular prefix by using the [GetXmlNamespace Operator](../../../../visual-basic/language-reference/operators/getxmlnamespace-operator.md). For more information, see [Imports Statement (XML Namespace)](../../../../visual-basic/language-reference/statements/imports-statement-xml-namespace.md).

### <a name="using-xml-namespaces-in-xml-literals"></a>Using XML namespaces in XML literals

The following example shows how to create an <xref:System.Xml.Linq.XElement> object that uses the global namespace `ns`:

[!code-vb[VbXMLSamples#2](../../../../visual-basic/language-reference/operators/codesnippet/VisualBasic/overview-of-linq-to-xml_4.vb)]

The Visual Basic compiler translates XML literals that contain XML namespace aliases into equivalent code that uses the XML notation for namespaces, with the `xmlns` attribute. When compiled, the code in the example in the previous section produces essentially the same executable code as the following example:

[!code-vb[VbXMLSamples#3](../../../../visual-basic/language-reference/operators/codesnippet/VisualBasic/overview-of-linq-to-xml_5.vb)]

### <a name="using-xml-namespaces-in-xml-axis-properties"></a>Using XML namespaces in XML axis properties

XML namespaces declared in XML literals are not available for use in XML axis properties. However, global namespaces can be used with XML axis properties. Use a colon to separate the XML namespace prefix from the local element name. The following is an example:

[!code-vb[VbXMLSamples#4](../../../../visual-basic/language-reference/operators/codesnippet/VisualBasic/overview-of-linq-to-xml_6.vb)]

## <a name="see-also"></a>See also

[XML](../../../../visual-basic/programming-guide/language-features/xml/index.md)
[Creating XML in Visual Basic](../../../../visual-basic/programming-guide/language-features/xml/creating-xml.md)
[Accessing XML in Visual Basic](../../../../visual-basic/programming-guide/language-features/xml/accessing-xml.md)
[Manipulating XML in Visual Basic](../../../../visual-basic/programming-guide/language-features/xml/manipulating-xml.md)
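Since the original samples are referenced only through snippet includes, here is a minimal hand-written Visual Basic sketch of the XML literal and axis-property syntax described above (illustrative only; the element names are invented):

```vb
' Illustrative only: declare an XML literal and read it back with axis properties.
Dim contact = <contact>
                  <name>Patrick Hines</name>
                  <phone type="home">206-555-0144</phone>
              </contact>
Dim contactName = contact.<name>.Value ' child axis property
Dim phoneType = contact.<phone>.@type  ' attribute axis property
```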
101.086957
675
0.772473
rus_Cyrl
0.687398
dbf21137ee84c5ef25c0f5ddbad1cd293575e608
4,050
md
Markdown
docs/source_compile/compile_andriod.md
CharlesXu/Paddle-Lite
617716bbf1b7b203c14fbabdb57741da46f96ac9
[ "Apache-2.0" ]
6
2019-11-14T08:20:02.000Z
2021-06-24T02:04:33.000Z
docs/source_compile/compile_andriod.md
hexieshenghuo/Paddle-Lite
fd8bac90efc645a1044d68dc19cb95096d575498
[ "Apache-2.0" ]
null
null
null
docs/source_compile/compile_andriod.md
hexieshenghuo/Paddle-Lite
fd8bac90efc645a1044d68dc19cb95096d575498
[ "Apache-2.0" ]
1
2022-02-15T08:27:19.000Z
2022-02-15T08:27:19.000Z
# Source compilation (Android)

Paddle Lite provides official release prediction libraries for the Android platform; we recommend downloading the [prebuilt Paddle Lite libraries](../quick_start/release_lib.html#android-toolchain-gcc) directly.

**Note:** The build method below only applies to release/v2.6.0 and later (including v2.6.0). For release/v2.3 and earlier (including v2.3), see the [release/v2.3 build instructions](v2.3_compile.md).

If you have not yet set up an Android cross-compilation environment, first follow [Compilation environment preparation](compile_env) to install the toolchain required to build the Android prediction library for your development environment. Before running the build script, check that the environment variable `NDK_ROOT` points to the correct Android NDK installation path, then download and build the Paddle-Lite source:

```shell
# 1. Download the Paddle-Lite source and switch to the release branch
git clone https://github.com/PaddlePaddle/Paddle-Lite.git
cd Paddle-Lite && git checkout release/v2.6

# (Optional) Delete this directory; the build script will automatically download the third-party archives from a domestic CDN
# rm -rf third-party

# 2. Build the Paddle-Lite Android prediction library (armv8, gcc, statically linked NDK STL)
./lite/tools/build_android.sh
```

**Tip:** If the source build takes unusually long, the cause is usually slow or failed third-party downloads. After cloning the Paddle-Lite repository, manually delete the `third-party` directory in the repository root; the build script will then download the third-party archives hosted on a domestic CDN, saving the time needed to sync the third-party code from their git repos.

### Build results

Located at `Paddle-Lite/build.lite.android.armv8.gcc/inference_lite_lib.android.armv8`:

```shell
inference_lite_lib.android.armv8/
├── cxx                                    C++ prediction library and headers
│   ├── include                            C++ headers
│   │   ├── paddle_api.h
│   │   ├── paddle_image_preprocess.h
│   │   ├── paddle_lite_factory_helper.h
│   │   ├── paddle_place.h
│   │   ├── paddle_use_kernels.h
│   │   ├── paddle_use_ops.h
│   │   └── paddle_use_passes.h
│   └── lib                                C++ prediction libraries
│       ├── libpaddle_api_light_bundled.a  C++ static library
│       └── libpaddle_light_api_shared.so  C++ shared library
│
├── java                                   Java prediction library
│   ├── jar
│   │   └── PaddlePredictor.jar            Java JAR package
│   ├── so
│   │   └── libpaddle_lite_jni.so          Java JNI shared library
│   └── src
│
└── demo                                   C++ and Java demo code
    ├── cxx                                C++ demo
    └── java                               Java demo
```

### Build commands

- Default build: (armv8, gcc, c++_static)

```shell
./lite/tools/build_android.sh
```

- Print help information:

```shell
./lite/tools/build_android.sh help
```

- Other optional build flags:

```shell
--arch: (armv8|armv7)                   ARM architecture, defaults to armv8
--toolchain: (gcc|clang)                compiler, defaults to gcc
--android_stl: (c++_static|c++_shared)  how the NDK STL is linked, defaults to static linking (c++_static)
--with_java: (OFF|ON)                   whether to build the Java prediction library, defaults to ON
--with_cv: (OFF|ON)                     whether to build the CV preprocessing library, defaults to OFF
--with_log: (OFF|ON)                    whether to emit log output, defaults to ON
--with_exception: (OFF|ON)              whether to throw an exception when an error occurs, defaults to OFF
--with_extra: (OFF|ON)                  whether to build the kernels & ops needed by OCR/NLP models; defaults to OFF, building only the kernels & ops for CV models
--android_api_level: (num)              minimum Android API level supported by the build, defaults to Default
```

- Android version support

The minimum Android versions supported by Paddle-Lite by default are listed in the table below. You can set a specific value with the `--android_api_level` option; it must not be lower than the minimum Android API level in the table.

| Paddle-Lite Required / ARM ABI | armv7 | armv8 |
| :-- | :-- | :-- |
| Supported Minimum Android API Level | 16 | 21 |
| Supported Minimum Android Platform Version | 4.1 | 5.0 |

- Tailoring the prediction library (build only the kernels & ops used by your model to reduce library size); see [Tailoring the prediction library](library_tailoring) for details:

```shell
./lite/tools/build_android.sh --with_strip=ON --opt_model_dir=%YourOptimizedModelDir%
# Option notes
--with_strip: (OFF|ON)   whether to tailor the prediction library to the input model, defaults to OFF
--opt_model_dir          absolute path of the input model, which must already be converted by opt
```

- Building the Android NPU prediction library; see [Deploying Paddle-Lite on Huawei NPU (Kirin SoC)](../demo_guides/huawei_kirin_npu) for details:

```shell
./lite/tools/build_android.sh --with_huawei_kirin_npu=ON \
                              --huawei_kirin_npu_sdk_root=%YourNpuSdkPath%
# Option notes
--with_huawei_kirin_npu: (OFF|ON)  whether to build the huawei_kirin_npu prediction library, defaults to OFF
--huawei_kirin_npu_sdk_root        absolute path of the Huawei HiAI DDK files, downloadable from
                                   https://developer.huawei.com/consumer/cn/hiai/
```

- Building the Android OpenCL prediction library; see [Deploying Paddle-Lite with OpenCL](../demo_guides/opencl) for details:

```shell
./lite/tools/build_android.sh --with_opencl=ON
# Option notes
--with_opencl: (OFF|ON)  whether to build the OpenCL prediction library, defaults to OFF
```
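As an illustrative combination of the flags documented above (not an example from the original page), a custom build might look like:

```shell
# Illustrative only: an armv7 clang build with a shared NDK STL,
# extra OCR/NLP kernels & ops, and logging disabled.
./lite/tools/build_android.sh --arch=armv7 --toolchain=clang \
                              --android_stl=c++_shared --with_extra=ON --with_log=OFF
```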
32.926829
154
0.61358
yue_Hant
0.357256
dbf23379cb75e646104dd6170b8ae4c1334777e8
4,509
md
Markdown
packages/vm/docs/interfaces/index.vmopts.md
samlior/ethereumjs-monorepo
f8ddcf665904970d83d0c44729e6a36a5c7ccef7
[ "MIT" ]
1
2021-07-03T12:21:08.000Z
2021-07-03T12:21:08.000Z
packages/vm/docs/interfaces/index.vmopts.md
samlior/ethereumjs-monorepo
f8ddcf665904970d83d0c44729e6a36a5c7ccef7
[ "MIT" ]
null
null
null
packages/vm/docs/interfaces/index.vmopts.md
samlior/ethereumjs-monorepo
f8ddcf665904970d83d0c44729e6a36a5c7ccef7
[ "MIT" ]
null
null
null
[@ethereumjs/vm](../README.md) / [index](../modules/index.md) / VMOpts # Interface: VMOpts [index](../modules/index.md).VMOpts Options for instantiating a [[VM]]. ## Table of contents ### Properties - [activatePrecompiles](index.vmopts.md#activateprecompiles) - [allowUnlimitedContractSize](index.vmopts.md#allowunlimitedcontractsize) - [blockchain](index.vmopts.md#blockchain) - [common](index.vmopts.md#common) - [hardforkByBlockNumber](index.vmopts.md#hardforkbyblocknumber) - [state](index.vmopts.md#state) - [stateManager](index.vmopts.md#statemanager) ## Properties ### activatePrecompiles • `Optional` **activatePrecompiles**: *boolean* If true, create entries in the state tree for the precompiled contracts, saving some gas the first time each of them is called. If this parameter is false, the first call to each of them has to pay an extra 25000 gas for creating the account. Setting this to true has the effect of precompiled contracts' gas costs matching mainnet's from the very first call, which is intended for testing networks. Default: `false` Defined in: [index.ts:93](https://github.com/ethereumjs/ethereumjs-monorepo/blob/master/packages/vm/src/index.ts#L93) ___ ### allowUnlimitedContractSize • `Optional` **allowUnlimitedContractSize**: *boolean* Allows unlimited contract sizes while debugging. By setting this to `true`, the check for contract size limit of 24KB (see [EIP-170](https://git.io/vxZkK)) is bypassed. Default: `false` [ONLY set to `true` during debugging] Defined in: [index.ts:100](https://github.com/ethereumjs/ethereumjs-monorepo/blob/master/packages/vm/src/index.ts#L100) ___ ### blockchain • `Optional` **blockchain**: *default* A [blockchain](https://github.com/ethereumjs/ethereumjs-monorepo/packages/blockchain) object for storing/retrieving blocks Defined in: [index.ts:80](https://github.com/ethereumjs/ethereumjs-monorepo/blob/master/packages/vm/src/index.ts#L80) ___ ### common • `Optional` **common**: *default* Use a [common](https://github.com/ethereumjs/ethereumjs-monorepo/packages/common) instance if you want to change the chain setup. ### Possible Values - `chain`: all chains supported by `Common` or a custom chain - `hardfork`: `mainnet` hardforks up to the `MuirGlacier` hardfork - `eips`: `2537` (usage e.g. 
`eips: [ 2537, ]`)

### Supported EIPs

- [EIP-1559](https://eips.ethereum.org/EIPS/eip-1559) - Fee Market
- [EIP-2315](https://eips.ethereum.org/EIPS/eip-2315) - VM simple subroutines
- [EIP-2537](https://eips.ethereum.org/EIPS/eip-2537) (`experimental`) - BLS12-381 precompiles
- [EIP-2565](https://eips.ethereum.org/EIPS/eip-2565) - ModExp Gas Cost
- [EIP-2718](https://eips.ethereum.org/EIPS/eip-2718) - Typed Transactions
- [EIP-2929](https://eips.ethereum.org/EIPS/eip-2929) - Gas cost increases for state access opcodes
- [EIP-2930](https://eips.ethereum.org/EIPS/eip-2930) - Access List Transaction Type
- [EIP-3198](https://eips.ethereum.org/EIPS/eip-3198) - BASEFEE opcode
- [EIP-3529](https://eips.ethereum.org/EIPS/eip-3529) - Reduction in refunds
- [EIP-3541](https://eips.ethereum.org/EIPS/eip-3541) - Reject new contracts starting with the 0xEF byte

*Annotations:*

- `experimental`: behaviour can change on patch versions

### Default Setup

Default setup if no `Common` instance is provided:

- `chain`: `mainnet`
- `hardfork`: `istanbul`
- `eips`: `[]`

Defined in: [index.ts:67](https://github.com/ethereumjs/ethereumjs-monorepo/blob/master/packages/vm/src/index.ts#L67)

___

### hardforkByBlockNumber

• `Optional` **hardforkByBlockNumber**: *boolean*

Select hardfork based upon block number. This automatically switches to the right hard fork based upon the block number.

Default: `false`

Defined in: [index.ts:107](https://github.com/ethereumjs/ethereumjs-monorepo/blob/master/packages/vm/src/index.ts#L107)

___

### state

• `Optional` **state**: *any*

A [merkle-patricia-tree](https://github.com/ethereumjs/ethereumjs-monorepo/tree/master/packages/trie) instance for the state tree (ignored if stateManager is passed)

**`deprecated`**

Defined in: [index.ts:76](https://github.com/ethereumjs/ethereumjs-monorepo/blob/master/packages/vm/src/index.ts#L76)

___

### stateManager

• `Optional` **stateManager**: [*StateManager*](state_interface.statemanager.md)

A [StateManager](state_interface.statemanager.md) instance to use as the state store (Beta API)

Defined in: [index.ts:71](https://github.com/ethereumjs/ethereumjs-monorepo/blob/master/packages/vm/src/index.ts#L71)
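A brief usage sketch of these options (an editor's illustration, not part of the generated reference; it assumes the `@ethereumjs/vm` and `@ethereumjs/common` packages at the versions this docs snapshot was generated from):

```typescript
// Illustrative sketch only: instantiate a VM with a custom Common and two of
// the documented options. The API surface is assumed from this docs snapshot.
import VM from '@ethereumjs/vm'
import Common from '@ethereumjs/common'

const common = new Common({ chain: 'mainnet', hardfork: 'istanbul' })
const vm = new VM({
  common,
  activatePrecompiles: true,        // pre-create the precompile accounts
  allowUnlimitedContractSize: false // keep the EIP-170 24KB contract size limit
})
```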
32.673913
166
0.747838
eng_Latn
0.490463
dbf29084e8939a360492962fb10ed63501f03ba0
2,965
md
Markdown
treebanks/koi_uh/koi_uh-feat-Number-psor.md
vistamou/docs
116b9c29e4218be06bf33b158284b9c952646989
[ "Apache-2.0" ]
204
2015-01-20T16:36:39.000Z
2022-03-28T00:49:51.000Z
treebanks/koi_uh/koi_uh-feat-Number-psor.md
vistamou/docs
116b9c29e4218be06bf33b158284b9c952646989
[ "Apache-2.0" ]
654
2015-01-02T17:06:29.000Z
2022-03-31T18:23:34.000Z
treebanks/koi_uh/koi_uh-feat-Number-psor.md
vistamou/docs
116b9c29e4218be06bf33b158284b9c952646989
[ "Apache-2.0" ]
200
2015-01-16T22:07:02.000Z
2022-03-25T11:35:28.000Z
--- layout: base title: 'Statistics of Number[psor] in UD_Komi_Permyak-UH' udver: '2' --- ## Treebank Statistics: UD_Komi_Permyak-UH: Features: `Number[psor]` This feature is language-specific. It occurs with 2 different values: `Plur`, `Sing`. This is a <a href="../../u/overview/feat-layers.html">layered feature</a> with the following layers: <tt><a href="koi_uh-feat-Number.html">Number</a></tt>, <tt><a href="koi_uh-feat-Number-psor.html">Number[psor]</a></tt>. 62 tokens (7%) have a non-empty value of `Number[psor]`. 50 types (10%) occur at least once with a non-empty value of `Number[psor]`. 45 lemmas (11%) occur at least once with a non-empty value of `Number[psor]`. The feature is used with 3 part-of-speech tags: <tt><a href="koi_uh-pos-NOUN.html">NOUN</a></tt> (57; 6% instances), <tt><a href="koi_uh-pos-ADP.html">ADP</a></tt> (3; 0% instances), <tt><a href="koi_uh-pos-PRON.html">PRON</a></tt> (2; 0% instances). ### `NOUN` 57 <tt><a href="koi_uh-pos-NOUN.html">NOUN</a></tt> tokens (27% of all `NOUN` tokens) have a non-empty value of `Number[psor]`. The most frequent other feature values with which `NOUN` and `Number[psor]` co-occurred: <tt><a href="koi_uh-feat-Animacy.html">Animacy</a></tt><tt>=EMPTY</tt> (51; 89%), <tt><a href="koi_uh-feat-Number.html">Number</a></tt><tt>=Sing</tt> (51; 89%), <tt><a href="koi_uh-feat-Person-psor.html">Person[psor]</a></tt><tt>=3</tt> (45; 79%). `NOUN` tokens may have the following values of `Number[psor]`: `Number[psor]` seems to be **lexical feature** of `NOUN`. 100% lemmas (40) occur only with one value of `Number[psor]`. ### `ADP` 3 <tt><a href="koi_uh-pos-ADP.html">ADP</a></tt> tokens (19% of all `ADP` tokens) have a non-empty value of `Number[psor]`. The most frequent other feature values with which `ADP` and `Number[psor]` co-occurred: <tt><a href="koi_uh-feat-AdpType.html">AdpType</a></tt><tt>=Post</tt> (3; 100%), <tt><a href="koi_uh-feat-Case.html">Case</a></tt><tt>=Ill</tt> (3; 100%), <tt><a href="koi_uh-feat-Number.html">Number</a></tt><tt>=Sing</tt> (3; 100%), <tt><a href="koi_uh-feat-Person-psor.html">Person[psor]</a></tt><tt>=3</tt> (3; 100%), <tt><a href="koi_uh-feat-AdvType.html">AdvType</a></tt><tt>=EMPTY</tt> (2; 67%). `ADP` tokens may have the following values of `Number[psor]`: ### `PRON` 2 <tt><a href="koi_uh-pos-PRON.html">PRON</a></tt> tokens (2% of all `PRON` tokens) have a non-empty value of `Number[psor]`. The most frequent other feature values with which `PRON` and `Number[psor]` co-occurred: <tt><a href="koi_uh-feat-Number.html">Number</a></tt><tt>=Sing</tt> (2; 100%), <tt><a href="koi_uh-feat-Person.html">Person</a></tt><tt>=EMPTY</tt> (2; 100%). `PRON` tokens may have the following values of `Number[psor]`: ## Relations with Agreement in `Number[psor]` The 10 most frequent relations where parent and child node agree in `Number[psor]`: <tt>NOUN --[<tt><a href="koi_uh-dep-appos.html">appos</a></tt>]--> NOUN</tt> (1; 100%).
55.943396
489
0.667116
yue_Hant
0.515435
dbf2a386f5122d820ec5496da3ccb1dadee5cbd7
3,191
md
Markdown
README.md
bimedia-fr/bimedia-front-server
88f3f0ec069eb848d47bd673b58c5932ae934bc1
[ "Apache-2.0" ]
null
null
null
README.md
bimedia-fr/bimedia-front-server
88f3f0ec069eb848d47bd673b58c5932ae934bc1
[ "Apache-2.0" ]
2
2015-02-23T17:58:00.000Z
2015-03-05T15:39:39.000Z
README.md
bimedia-fr/bimedia-front-server
88f3f0ec069eb848d47bd673b58c5932ae934bc1
[ "Apache-2.0" ]
null
null
null
bimedia-front-server
================

Bimedia front-end test server.

## features :

* serve static assets
* static data fixtures
* HTTP method to resource mapping
* proxied data
* custom test routes

Example :

```js
var server = require('./test-server');
var path = require('path');

var PORT = 9090;

server({
    fixtures : {
        path : path.join(__dirname, './fixtures'),
        prefix : 'api'
    },
    'static' : {
        path : path.join(__dirname, './src')
    }
}).listen(PORT, function () {
    console.log('test server listening on port %d', PORT);
});
```

This example creates a server listening on port 9090 that will serve any static resource available in the `src` subfolder. It will also serve JSON data available in the `fixtures` folder for every request with a path starting with `api`.

## static assets

Serves files in `static.path` like any HTTP server.

## fixtures

The server tries to find a matching resource in the filesystem, in the path denoted by the `fixtures.path` config.

Example: `GET /api/auth/token` will serve the file: `$cwd/auth/token.json`

### method mapping

For any HTTP method other than GET, it will append the method name in lowercase at the end of the requested URI.

Example: `POST /api/auth/token` will serve the file: `$cwd/auth/token-post.json`

## proxied data

Fixtures can be retrieved from a URL. The test server serves the proxied data.

Example :

```js
var server = require('./test-server');
var path = require('path');

var PORT = 9090;

server({
    fixtures : {
        url : 'http://my.remotehost.com/fixtures',
        prefix : 'api'
    },
    'static' : {
        path : path.join(__dirname, './src')
    }
}).listen(PORT, function () {
    console.log('test server listening on port %d', PORT);
});
```

## multiple fixtures sources

Fixtures can be retrieved from more than one location, both locally and remotely.

Example :

```js
var server = require('./test-server');
var path = require('path');

var PORT = 9090;

server({
    fixtures : [{
        url : 'http://my.remotehost.com/fixtures',
        prefix : 'api/params'
    }, {
        path : path.join(__dirname, './fixtures'),
        prefix : 'api'
    }],
    'static' : {
        path : path.join(__dirname, './src')
    }
}).listen(PORT, function () {
    console.log('test server listening on port %d', PORT);
});
```

## custom test routes

Add custom routes and logic with the `routes` method.

```js
server({
    fixtures : {
        path : path.join(__dirname, './fixtures'),
        prefix : 'api'
    },
    'static' : {
        path : path.join(__dirname, './src')
    }
}).routes(function (router) {
    router.get(/api\/([0-9]{4}-[0-9]{2}-[0-9]{2})/, function (str) {
        this.res.writeHead(200, { 'Content-Type': 'text/plain' });
        this.res.end(str);
    });
}).listen(PORT, function () {
    console.log('test server listening on port %d', PORT);
});
```

## based on :

* [ecstatic](https://github.com/jesusabdullah/node-ecstatic) file server.
* [union](https://github.com/flatiron/union) http server.
* [director](https://github.com/flatiron/director) router.
* [http-proxy](https://github.com/nodejitsu/node-http-proxy) proxy.
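For instance, assuming the example server above is running on port 9090 with the fixture layout described, the mapping rules can be exercised like this (illustrative requests only):

```shell
# Illustrative only: exercising the fixture rules described above
curl http://localhost:9090/api/auth/token          # serves fixtures/auth/token.json
curl -X POST http://localhost:9090/api/auth/token  # serves fixtures/auth/token-post.json
```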
21.273333
118
0.624256
eng_Latn
0.725963
dbf2a6463cfb6ff62e7cc2ac55c683c2a44d24ba
16,891
md
Markdown
docs/source/04_kedro_project_setup/02_configuration.md
WaylonWalker/kedro
ce39d3e2485772bfc4977ab892b29c5a377d998d
[ "Apache-2.0" ]
1
2021-12-25T19:58:39.000Z
2021-12-25T19:58:39.000Z
docs/source/04_kedro_project_setup/02_configuration.md
WaylonWalker/kedro
ce39d3e2485772bfc4977ab892b29c5a377d998d
[ "Apache-2.0" ]
null
null
null
docs/source/04_kedro_project_setup/02_configuration.md
WaylonWalker/kedro
ce39d3e2485772bfc4977ab892b29c5a377d998d
[ "Apache-2.0" ]
null
null
null
# Configuration

This section contains detailed information about configuration, for which the relevant API documentation can be found in [kedro.config.ConfigLoader](/kedro.config.ConfigLoader)

> *Note:* This documentation is based on `Kedro 0.17.1`. If you spot anything that is incorrect, please create an [issue](https://github.com/quantumblacklabs/kedro/issues) or pull request.

## Local and base configuration

We recommend that you keep all configuration files in the `conf` directory of a Kedro project. However, if you prefer, you may point Kedro to any other directory and change the configuration paths by setting the `CONF_ROOT` variable in `src/<project-package>/settings.py` as follows:

```python
# ...
CONF_ROOT = "new_conf"
```

## Loading

Kedro-specific configuration (e.g., `DataCatalog` configuration for IO) is loaded using the `ConfigLoader` class:

```python
from kedro.config import ConfigLoader

conf_paths = ["conf/base", "conf/local"]
conf_loader = ConfigLoader(conf_paths)
conf_catalog = conf_loader.get("catalog*", "catalog*/**")
```

This will recursively scan for configuration files, first in the `conf/base/` directory and then in `conf/local/`, according to the following rules:

* ANY of the following is true:
  * filename starts with `catalog` OR
  * file is located in a sub-directory whose name is prefixed with `catalog`
* AND the file extension is one of the following: `yaml`, `yml`, `json`, `ini`, `pickle`, `xml` or `properties`

Configuration information from files stored in `base` or `local` that match these rules is merged at runtime and returned in the form of a config dictionary:

* If any two configuration files located inside the same environment path (`conf/base/` or `conf/local/` in this example) contain the same top-level key, `load_config` will raise a `ValueError` indicating that such duplicates are not allowed.

> *Note:* Any top-level keys that start with the `_` character are considered hidden (or reserved) and are therefore ignored right after the config load. Those keys will neither trigger the key duplication error mentioned above, nor will they appear in the resulting configuration dictionary. However, you may still use such keys for various purposes, for example as [YAML anchors and aliases](https://support.atlassian.com/bitbucket-cloud/docs/yaml-anchors/).

* If two configuration files have duplicate top-level keys but are placed into different environment paths (one in `conf/base/`, another in `conf/local/`, for example), then the last loaded path (`conf/local/` in this case) takes precedence and overrides that key value. `ConfigLoader.get(<pattern>, ...)` will not raise any errors in this case; however, a `DEBUG`-level log message will be emitted with information on the overridden keys.
* If the same environment path is passed multiple times, a `UserWarning` will be emitted to draw attention to the duplicate loading attempt, and any subsequent loading after the first one will be skipped.

## Additional configuration environments

In addition to the two built-in configuration environments, it is possible to create your own. Your project loads `conf/base/` as the bottom-level configuration environment but allows you to overwrite it with any other environments that you create. You are able to create environments like `conf/server/`, `conf/test/`, etc.

Any additional configuration environments can be created inside the `conf` folder and loaded by running the following command:

```bash
kedro run --env=test
```

If no `env` option is specified, this will default to using the `local` environment to overwrite `conf/base`.

> *Note*: If, for some reason, your project does not have any other environments apart from `base`, i.e. no `local` environment to default to, you will need to customise `KedroContext` to take `env="base"` in the constructor and then specify your custom `KedroContext` subclass in `src/<python-package>/settings.py` under the `CONTEXT_CLASS` key.

If you set the `KEDRO_ENV` environment variable to the name of your environment, Kedro will load that environment for your `kedro run`, `kedro ipython`, `kedro jupyter notebook` and `kedro jupyter lab` sessions.

```bash
export KEDRO_ENV=test
```

> *Note*: If you specify both the `KEDRO_ENV` environment variable and provide the `--env` argument to a CLI command, the CLI argument takes precedence.

## Templating configuration

Kedro also provides an extension [TemplatedConfigLoader](/kedro.config.TemplatedConfigLoader) class that allows you to template values in your configuration files. `TemplatedConfigLoader` is available in `kedro.config`. To apply templating to your project, you will need to update the `register_config_loader` hook implementation in your `src/<project-name>/hooks.py`:

```python
from kedro.config import TemplatedConfigLoader  # new import


class ProjectHooks:
    @hook_impl
    def register_config_loader(self, conf_paths: Iterable[str]) -> ConfigLoader:
        return TemplatedConfigLoader(
            conf_paths,
            globals_pattern="*globals.yml",  # read the globals dictionary from project config
            globals_dict={  # extra keys to add to the globals dictionary, take precedence over globals_pattern
                "bucket_name": "another_bucket_name",
                "non_string_key": 10,
            },
        )
```

Let's assume the project contains a `conf/base/globals.yml` file with the following contents:

```yaml
bucket_name: "my_s3_bucket"
key_prefix: "my/key/prefix/"

datasets:
    csv: "pandas.CSVDataSet"
    spark: "spark.SparkDataSet"

folders:
    raw: "01_raw"
    int: "02_intermediate"
    pri: "03_primary"
    fea: "04_feature"
```

The contents of the dictionary resulting from `globals_pattern` get merged with the `globals_dict` dictionary. In case of conflicts, the keys from the `globals_dict` dictionary take precedence. The resulting global dictionary prepared by `TemplatedConfigLoader` will look like this:

```python
{
    "bucket_name": "another_bucket_name",
    "non_string_key": 10,
    "key_prefix": "my/key/prefix",
    "datasets": {"csv": "pandas.CSVDataSet", "spark": "spark.SparkDataSet"},
    "folders": {
        "raw": "01_raw",
        "int": "02_intermediate",
        "pri": "03_primary",
        "fea": "04_feature",
    },
}
```

Now the templating can be applied to the configs. Here is an example of a templated `conf/base/catalog.yml`:

```yaml
raw_boat_data:
    type: "${datasets.spark}"  # nested paths into global dict are allowed
    filepath: "s3a://${bucket_name}/${key_prefix}/${folders.raw}/boats.csv"
    file_format: parquet

raw_car_data:
    type: "${datasets.csv}"
    filepath: "s3://${bucket_name}/data/${key_prefix}/${folders.raw}/${filename|cars.csv}"  # default to 'cars.csv' if the 'filename' key is not found in the global dict
```

> Note: `TemplatedConfigLoader` uses the `jmespath` package in the background to extract elements from the global dictionary. For more information about JMESPath syntax, please see: https://github.com/jmespath/jmespath.py.

### Jinja2 support

From version 0.17.0, `TemplatedConfigLoader` also supports the [Jinja2](https://palletsprojects.com/p/jinja/) template engine alongside the original template syntax. Below is an example of a `catalog.yml` file that uses both features:

```
{% for speed in ['fast', 'slow'] %}
{{ speed }}-trains:
    type: MemoryDataSet

{{ speed }}-cars:
    type: pandas.CSVDataSet
    filepath: s3://${bucket_name}/{{ speed }}-cars.csv
    save_args:
        index: true

{% endfor %}
```

When parsing this configuration file, `TemplatedConfigLoader` will:

1. Read the `catalog.yml` and compile it using Jinja2
2. Use the YAML parser to parse the compiled config into a Python dictionary
3. Expand `${bucket_name}` in `filepath` using the `globals_*` arguments for the `TemplatedConfigLoader` instance as in the previous examples

The output Python dictionary will look as follows:

```python
{
    "fast-trains": {"type": "MemoryDataSet"},
    "fast-cars": {
        "type": "pandas.CSVDataSet",
        "filepath": "s3://my_s3_bucket/fast-cars.csv",
        "save_args": {"index": True},
    },
    "slow-trains": {"type": "MemoryDataSet"},
    "slow-cars": {
        "type": "pandas.CSVDataSet",
        "filepath": "s3://my_s3_bucket/slow-cars.csv",
        "save_args": {"index": True},
    },
}
```

> Note: Although Jinja2 is a very powerful and extremely flexible template engine with a wide range of features, we do _not_ recommend using it to template your configuration unless absolutely necessary. The flexibility of dynamic configuration comes at the cost of significantly reduced readability and much higher maintenance overhead. We believe that, for the majority of analytics projects, dynamically compiled configuration does more harm than good.

## Parameters

### Loading parameters

Project parameters configuration can be loaded with the help of the `ConfigLoader` class:

```python
from kedro.config import ConfigLoader

conf_paths = ["conf/base", "conf/local"]
conf_loader = ConfigLoader(conf_paths)
parameters = conf_loader.get("parameters*", "parameters*/**")
```

The code snippet above will load all configuration files from `conf/base` and `conf/local` that either have a filename starting with `parameters` or are located inside a folder whose name starts with `parameters`.

> *Note:* The configuration path `conf/local` takes precedence in the example above since it is loaded last, so any overlapping top-level keys from `conf/base` will be overwritten by the ones from `conf/local`.

Calling `conf_loader.get()` in the example above will throw a `MissingConfigException` error if there are no configuration files matching the given patterns in any of the specified paths. If this is a valid workflow for your application, you can handle it as follows:

```python
from kedro.config import ConfigLoader, MissingConfigException

conf_paths = ["conf/base", "conf/local"]
conf_loader = ConfigLoader(conf_paths)

try:
    parameters = conf_loader.get("parameters*", "parameters*/**")
except MissingConfigException:
    parameters = {}
```

> *Note:* The `kedro.framework.context.KedroContext` class uses the approach above to load project parameters.

Parameters can then be used on their own or fed in as function inputs, as described in [this section](#using-parameters) below.

### Specifying parameters at runtime

Kedro also allows you to specify runtime parameters for the `kedro run` CLI command. To do that, add the `--params` command line option and specify a comma-separated list of key-value pairs that will be added to [KedroContext](/kedro.framework.context.KedroContext) parameters and made available to pipeline nodes.

Each key-value pair is split on the first colon. Here is an example of triggering a Kedro run with extra parameters specified:

```bash
kedro run --params param_key1:value1,param_key2:2.0  # this will add {"param_key1": "value1", "param_key2": 2} to the parameters dictionary
```

> Note: Parameter keys are _always_ treated as strings. Parameter values are converted to a float or an integer if the corresponding conversion succeeds; otherwise they are also treated as strings.

> Note: If, for example, the `param_key1` parameter has already been defined in the project configuration, the value provided in the CLI option will take precedence and will overwrite the one from the configuration.

> Tip: Since key-value pairs are split on the first colon, values can contain colons, but the keys cannot. This is a valid CLI command:
>
> `kedro run --params endpoint_url:https://endpoint.example.com`

> Tip: If any extra parameter key and/or value contains spaces, wrap the whole option contents into quotes:
>
> `kedro run --params "key1:value with spaces,key2:value"`

### Using parameters

Say you have a set of parameters you're playing around with for your model. You can declare these in one place, for instance `conf/base/parameters.yml`, so that you isolate your changes to one central location.

```yaml
step_size: 1
learning_rate: 0.01
```

You may now reference these parameters in the `node` definition, using the `params:` prefix:

```python
def increase_volume(volume, step):
    return volume + step


# in pipeline definition
node(
    func=increase_volume,
    inputs=["input_volume", "params:step_size"],
    outputs="output_volume",
)
```

You can also group your parameters into nested structures and, using the same method above, load them by top-level key:

```yaml
step_size: 1
model_params:
    learning_rate: 0.01
    test_data_ratio: 0.2
    number_of_train_iterations: 10000
```

```python
def train_model(data, model):
    lr = model["learning_rate"]
    test_data_ratio = model["test_data_ratio"]
    iterations = model["number_of_train_iterations"]
    ...


# in pipeline definition
node(
    func=train_model,
    inputs=["input_data", "params:model_params"],
    outputs="output_data",
)
```

Alternatively, you can also pass `parameters` to the node inputs and get access to the entire collection of values inside the node function.

```python
def increase_volume(volume, params):
    step = params["step_size"]
    return volume + step


# in pipeline definition
node(
    func=increase_volume,
    inputs=["input_volume", "parameters"],
    outputs="output_volume",
)
```

In both cases, what happens under the hood is that the parameters are added to the Data Catalog through the method `add_feed_dict()` (relevant API documentation: [DataCatalog](/kedro.io.DataCatalog)), where they live as `MemoryDataSet`s. This method is also what the `KedroContext` class uses when instantiating the catalog.

> *Note*: You can use `add_feed_dict()` to inject any other entries into your `DataCatalog` as per your use case.

## Credentials

> *Note:* For security reasons, we strongly recommend *not* committing any credentials or other secrets to the Version Control System. Hence, by default any file inside the `conf/` folder (and its subfolders) containing `credentials` in its name will be ignored via `.gitignore` and not committed to your git repository.
Credentials configuration can be loaded the same way as any other project configuration using the `ConfigLoader` class: ```python from kedro.config import ConfigLoader conf_paths = ["conf/base", "conf/local"] conf_loader = ConfigLoader(conf_paths) credentials = conf_loader.get("credentials*", "credentials*/**") ``` This will load all configuration files from `conf/base` and `conf/local`, which either have the filename starting with `credentials` or are located inside a folder with name starting with `credentials`. > *Note:* Configuration path `conf/local` takes precedence in the example above since it's loaded last, therefore any overlapping top-level keys from `conf/base` will be overwritten by the ones from `conf/local`. Calling `conf_loader.get()` in the example above will throw a `MissingConfigException` error if there are no configuration files matching the given patterns in any of the specified paths. If this is a valid workflow for your application, you can handle it as follows: ```python from kedro.config import ConfigLoader, MissingConfigException conf_paths = ["conf/base", "conf/local"] conf_loader = ConfigLoader(conf_paths) try: credentials = conf_loader.get("credentials*", "credentials*/**") except MissingConfigException: credentials = {} ``` > *Note:* `kedro.framework.context.KedroContext` class uses the approach above to load project credentials. Credentials configuration can then be used on its own or fed into the `DataCatalog` as described in [this section](../05_data/01_data_catalog.md#feeding-in-credentials). ### AWS credentials When working with AWS credentials on datasets, you are not required to store AWS credentials in the project configuration files. Instead, you can specify them using environment variables `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and, optionally, `AWS_SESSION_TOKEN`. Please refer to the [official documentation](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-envvars.html) for more details. ## Configuring `kedro run` arguments An extensive list of CLI options for a `kedro run` is available in the [Kedro CLI documentation](../09_development/03_commands_reference.md#run-the-project). However, instead of specifying all the command line options to a `kedro run` via the CLI, you can specify a config file that contains the arguments, say `config.yml` and run: ```console $ kedro run --config config.yml ``` where `config.yml` is formatted as below (for example): ```yaml run: tag: - tag1 - tag2 - tag3 pipeline: pipeline1 parallel: true node_names: - node1 - node2 env: env1 ``` > *Note*: If you pass both a configuration file and an option that clashes with one inside the configuration file, the provided option will override the configuration file.
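For example (an illustrative invocation; both flags are documented above), a CLI option overrides its counterpart inside the configuration file:

```console
$ kedro run --config config.yml --env env2
# --env env2 takes precedence over "env: env1" from config.yml
```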
45.284182
467
0.749156
eng_Latn
0.990268
dbf30ff15732e8815b36571c428f99ebd225a511
2,317
md
Markdown
README.md
dontdrinkandroot/metagen.java
513e98854efec47594c482e68b906b65b581f61d
[ "Apache-2.0" ]
null
null
null
README.md
dontdrinkandroot/metagen.java
513e98854efec47594c482e68b906b65b581f61d
[ "Apache-2.0" ]
null
null
null
README.md
dontdrinkandroot/metagen.java
513e98854efec47594c482e68b906b65b581f61d
[ "Apache-2.0" ]
null
null
null
metagen.java
============

Simple metadata generator.

[![Build Status](https://travis-ci.org/dontdrinkandroot/metagen.java.svg?branch=master)](https://travis-ci.org/dontdrinkandroot/metagen.java)

Usage
-----

Simply add the `@net.dontdrinkandroot.metagen.model.GenerateMetadata` annotation to the classes for which you want metadata to be generated.

### Example

Input file:

```java
package net.dontdrinkandroot.metagen.test;

import net.dontdrinkandroot.metagen.model.GenerateMetadata;

@GenerateMetadata
public class PropertyTestClass {

    private byte primitiveByteField;
}
```

Generated file:

```java
package net.dontdrinkandroot.metagen.test;

@javax.annotation.Generated(value = "net.dontdrinkandroot.metagen.processor.GenerateMetadataProcessor")
public abstract class PropertyTestClass_ {

    public static net.dontdrinkandroot.metagen.model.Attribute<net.dontdrinkandroot.metagen.test.PropertyTestClass, java.lang.Byte> primitiveByteField = new net.dontdrinkandroot.metagen.model.Attribute("primitiveByteField", net.dontdrinkandroot.metagen.test.PropertyTestClass.class, java.lang.Byte.class);
}
```

Maven
-----

This project is not yet available via Maven Central. In the meantime you can include the dontdrinkandroot repository:

```xml
<repositories>
    <repository>
        <id>dontdrinkandroot</id>
        <url>https://maven.dontdrinkandroot.net/content/groups/public</url>
    </repository>
</repositories>
```

You can include the dependency by adding the following to your pom.xml:

```xml
<dependency>
    <groupId>net.dontdrinkandroot</groupId>
    <artifactId>metagen</artifactId>
    <version>0.0.1-SNAPSHOT</version>
</dependency>
```

License
-------

Copyright (C) 2016 Philip Washington Sorst <[email protected]>
and individual contributors as indicated by the @authors tag.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
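A hypothetical usage sketch referring to the generated metadata field (an editor's illustration, not part of this README; the `Attribute` API beyond the generated field is not documented here):

```java
// Hypothetical usage: the generated Attribute field gives type-safe access
// to the metadata of PropertyTestClass.primitiveByteField.
import net.dontdrinkandroot.metagen.model.Attribute;
import net.dontdrinkandroot.metagen.test.PropertyTestClass;
import net.dontdrinkandroot.metagen.test.PropertyTestClass_;

public class Example {

    public static void main(String[] args) {
        // The field is generated with the owning class and value type as type arguments.
        Attribute<PropertyTestClass, Byte> attr = PropertyTestClass_.primitiveByteField;
        System.out.println(attr); // e.g. usable when building type-safe queries by field
    }
}
```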
29.329114
158
0.773414
eng_Latn
0.813758
dbf31d348eeabac5c64d52f9be82200b20eddf0e
2,869
md
Markdown
docs/relational-databases/blob/make-partial-updates-to-filestream-data.md
antoniosql/sql-docs.es-es
0340bd0278b0cf5de794836cd29d53b46452d189
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/relational-databases/blob/make-partial-updates-to-filestream-data.md
antoniosql/sql-docs.es-es
0340bd0278b0cf5de794836cd29d53b46452d189
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/relational-databases/blob/make-partial-updates-to-filestream-data.md
antoniosql/sql-docs.es-es
0340bd0278b0cf5de794836cd29d53b46452d189
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Make partial updates to FILESTREAM data | Microsoft Docs
ms.custom: ''
ms.date: 03/01/2017
ms.prod: sql
ms.prod_service: database-engine
ms.reviewer: ''
ms.technology: filestream
ms.topic: conceptual
helpviewer_keywords:
- FILESTREAM [SQL Server], FSCTL_SQL_FILESTREAM_FETCH_OLD_CONTENT
- FSCTL_SQL_FILESTREAM_FETCH_OLD_CONTENT
ms.assetid: d6f7661e-6c14-4d31-9541-4520ca0f82b2
author: douglaslMS
ms.author: douglasl
manager: craigg
ms.openlocfilehash: f088dec7234ccbd3dea1843908614fb7d3c9e61d
ms.sourcegitcommit: 61381ef939415fe019285def9450d7583df1fed0
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 10/01/2018
ms.locfileid: "47690523"
---
# <a name="make-partial-updates-to-filestream-data"></a>Make partial updates to FILESTREAM data

[!INCLUDE[appliesto-ss-xxxx-xxxx-xxx-md](../../includes/appliesto-ss-xxxx-xxxx-xxx-md.md)]

An application uses FSCTL_SQL_FILESTREAM_FETCH_OLD_CONTENT to make partial updates to FILESTREAM BLOB data. The [DeviceIoControl](http://go.microsoft.com/fwlink/?LinkId=105527) function passes this value, together with the handle that [OpenSqlFilestream](../../relational-databases/blob/access-filestream-data-with-opensqlfilestream.md) returns, to the FILESTREAM driver. The driver then forces a server-side copy of the current FILESTREAM data into the file that the handle references. If the application issues the FSCTL_SQL_FILESTREAM_FETCH_OLD_CONTENT value after the handle has been written to, the last write operation is preserved, and earlier write operations that were made on the handle are lost.

> [!NOTE]
> FILESTREAM relies on the [SMB protocol](http://go.microsoft.com/fwlink/?LinkId=112454) for remote access.

## <a name="example"></a>Example

The following example shows how to use the `FSCTL_SQL_FILESTREAM_FETCH_OLD_CONTENT` value to make a partial update to an inserted FILESTREAM BLOB.

> [!NOTE]
> This example requires the FILESTREAM-enabled database and table that are created in [Create a FILESTREAM-Enabled Database](../../relational-databases/blob/create-a-filestream-enabled-database.md) and [Create a Table for Storing FILESTREAM Data](../../relational-databases/blob/create-a-table-for-storing-filestream-data.md).

[!code-cpp[FILESTREAM#FS_CPP_FSCTL](../../relational-databases/blob/codesnippet/cpp/make-partial-updates-to-_1.cpp)]

## <a name="see-also"></a>See also

[Access FILESTREAM Data with OpenSqlFilestream](../../relational-databases/blob/access-filestream-data-with-opensqlfilestream.md)
[Create Client Applications for FILESTREAM Data](../../relational-databases/blob/create-client-applications-for-filestream-data.md)
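Since the original C++ sample is referenced only through a snippet include, here is a minimal hand-written sketch of the control-code call described above (illustrative only; error handling is omitted, and the control code constant is assumed to come from the SQL Server Native Client header):

```cpp
// Illustrative sketch: force a server-side copy of the current FILESTREAM
// content into the file the handle references, before writing a partial update.
#include <windows.h>
#include <sqlncli.h>  // assumed location of FSCTL_SQL_FILESTREAM_FETCH_OLD_CONTENT

BOOL FetchOldContent(HANDLE hFilestream)  // handle returned by OpenSqlFilestream (assumed)
{
    DWORD bytesReturned = 0;
    return ::DeviceIoControl(hFilestream,
                             FSCTL_SQL_FILESTREAM_FETCH_OLD_CONTENT,
                             NULL, 0,    // no input buffer
                             NULL, 0,    // no output buffer
                             &bytesReturned,
                             NULL);      // synchronous call
}
```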
65.204545
787
0.795399
spa_Latn
0.724585
dbf3ce4c3e01f888882c77e0b3f2544f51c38d5f
3,146
md
Markdown
docs/vsto/secure-deployment.md
MareevOleg/visualstudio-docs.ru-ru
9376d3175e54c1e324460f242cd6da7c0296d368
[ "CC-BY-4.0", "MIT" ]
1
2018-12-23T10:56:39.000Z
2018-12-23T10:56:39.000Z
docs/vsto/secure-deployment.md
MareevOleg/visualstudio-docs.ru-ru
9376d3175e54c1e324460f242cd6da7c0296d368
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/vsto/secure-deployment.md
MareevOleg/visualstudio-docs.ru-ru
9376d3175e54c1e324460f242cd6da7c0296d368
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Secure deployment
ms.custom: ''
ms.date: 02/02/2017
ms.technology:
- office-development
ms.topic: conceptual
dev_langs:
- VB
- CSharp
helpviewer_keywords:
- deploying applications [Office development in Visual Studio], security
- Office development in Visual Studio, security
- Office applications [Office development in Visual Studio], security
- ClickOnce deployment [Office development in Visual Studio], security
author: TerryGLee
ms.author: tglee
manager: douge
ms.workload:
- office
ms.openlocfilehash: 852e66bb4e29e732093cdac6b44c6791ad9b772d
ms.sourcegitcommit: be938c7ecd756a11c9de3e6019a490d0e52b4190
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 10/31/2018
ms.locfileid: "50671084"
---
# <a name="secure-deployment"></a>Secure deployment

When you create an Office solution, the development computer is updated automatically so that the code in the project can run. However, when you deploy the solution, you must provide the evidence on which the trust decision is based, either by signing the solution with a certificate or by using a [!INCLUDE[ndptecclick](../vsto/includes/ndptecclick-md.md)] trust prompt. For more information, see [Grant trust to Office solutions](../vsto/granting-trust-to-office-solutions.md).

[!INCLUDE[appliesto_all](../vsto/includes/appliesto-all-md.md)]

For document-level customizations, if you deploy to a network folder you must also add the document location to the trusted locations list in the Office application's Trust Center. For more information about ways to configure document permissions on end-user computers, see [Grant trust to documents](../vsto/granting-trust-to-documents.md).

## <a name="prevent-office-solutions-from-running-code"></a>Prevent Office solutions from running code

Administrators can use the registry to prevent all Office solutions from running on a computer. When an Office solution that has managed code extensions is opened, the Visual Studio Tools for Office runtime checks whether an entry named `Disabled` exists under one of the following registry keys on the computer:

- **HKEY_CURRENT_USER\Software\Microsoft\VSTO**

- **HKEY_LOCAL_MACHINE\Software\Microsoft\VSTO**

To prevent Office solutions from running code, create a `Disabled` entry under one or both of these registry keys, and set `Disabled` to one of the following data types and values:

- A REG_SZ or REG_EXPAND_SZ value that is set to any string other than "0" (zero).

- A REG_DWORD value that is set to a value other than 0 (zero).

To enable Office solutions to run code again, set both `Disabled` entries to 0 (zero), or delete the registry entries.

## <a name="see-also"></a>See also

[Deploy an Office solution](../vsto/deploying-an-office-solution.md)
[Prepare computers to run or host Office solutions](https://msdn.microsoft.com/be1b173f-7261-4d74-aa4e-94ccd43db8d8)
[Secure Office solutions](../vsto/securing-office-solutions.md)
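A minimal illustrative `.reg` sketch of the per-user setting described above (an editor's example, following the value rules in this topic):

```
Windows Registry Editor Version 5.00

; Illustrative only: disable all Office solutions for the current user
; (a REG_DWORD value other than 0 disables them, per the rules above).
[HKEY_CURRENT_USER\Software\Microsoft\VSTO]
"Disabled"=dword:00000001
```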
57.2
503
0.794024
rus_Cyrl
0.873091
dbf3e6b07eefe64bbeb98725d4e34c067c2657ef
9,938
md
Markdown
docs-archive-a/2014/reporting-services/report-server/modify-a-reporting-services-configuration-file-rsreportserver-config.md
redpandabigcat/sql-docs-archive-pr.it-it
42057907493283d0099bb6f5dc76994d8e9d3b65
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs-archive-a/2014/reporting-services/report-server/modify-a-reporting-services-configuration-file-rsreportserver-config.md
redpandabigcat/sql-docs-archive-pr.it-it
42057907493283d0099bb6f5dc76994d8e9d3b65
[ "CC-BY-4.0", "MIT" ]
3
2021-10-11T06:40:46.000Z
2021-11-25T02:25:44.000Z
docs-archive-a/2014/reporting-services/report-server/modify-a-reporting-services-configuration-file-rsreportserver-config.md
redpandabigcat/sql-docs-archive-pr.it-it
42057907493283d0099bb6f5dc76994d8e9d3b65
[ "CC-BY-4.0", "MIT" ]
2
2021-09-29T08:51:43.000Z
2021-11-23T02:36:18.000Z
---
title: Modify a Reporting Services Configuration File (RSreportserver.config) | Microsoft Docs
ms.custom: ''
ms.date: 06/13/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology: reporting-services-native
ms.topic: conceptual
ms.assetid: 958ef51f-2699-4cb2-a92e-3b4322e36a30
author: maggiesMSFT
ms.author: maggies
manager: kfile
ms.openlocfilehash: 75ff276a0ba43cb89393466bf40a71b7acd21368
ms.sourcegitcommit: ad4d92dce894592a259721a1571b1d8736abacdb
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 08/04/2020
ms.locfileid: "87628348"
---
# <a name="modify-a-reporting-services-configuration-file-rsreportserverconfig"></a>Modify a Reporting Services Configuration File (RSreportserver.config)

[!INCLUDE[ssRSnoversion](../../includes/ssrsnoversion-md.md)] application settings are stored in a set of configuration files. Setup creates configuration files for each report server instance that is installed. Within each file, values are set at that time, or later when you use tools and applications to configure a server for specific operations. In some cases, you must edit a file directly to add or configure advanced settings.

Configuration settings are specified as XML elements or attributes. If you understand XML and configuration files, you can use a text or code editor to modify user-definable settings. Some configuration settings can be set only through a specific tool. For example, settings that contain encrypted values must be modified through the Reporting Services Configuration tool, Setup, or the **rsconfig** command-line utility. To run these tools, you must be a member of the local Administrators group.

> [!IMPORTANT]
> Use caution when you modify a configuration file. If you change a setting that is reserved for internal use, you might disable the installation. Generally, modifying configuration settings is not recommended unless you are trying to solve a specific problem. For more information about which settings are safe to modify, see [RSReportServer Configuration File](rsreportserver-config-configuration-file.md) or [RSReportDesigner Configuration File](rsreportdesigner-configuration-file.md). For more information about configuration files, see the [!INCLUDE[msCoName](../../includes/msconame-md.md)] [!INCLUDE[dnprdnshort](../../includes/dnprdnshort-md.md)] product documentation.

In this topic:

- [Reading and using configuration values](#bkmk_read_values)
- [About default values](#bkmk_default_values)
- [Deleting configuration settings](#bkmk_delete_config_settings)
- [To edit a Reporting Services configuration file](#bkmk_edit_configuation_file)

## <a name="reading-and-using-configuration-values"></a><a name="bkmk_read_values"></a> Reading and using configuration values

A report server reads the configuration files when the service starts and every time the configuration file is saved. New and changed values take effect in a new application domain after the current application domain expires.

Whenever possible, requests that are in progress in the current application domain are allowed to complete. Some settings, however, require that the application domain be recycled immediately; in that case, all requests currently in progress are restarted in a new application domain. If the report server finds an invalid value, it writes an error to the Windows application log and, depending on the error, either does not start or uses a default value:

- If the error is caused by malformed XML, the report server will not start. If the report server is running when the error occurs, the invalid configuration file is ignored until the report server is restarted or the application domain is recycled. Once the error is detected, the report server will not start.
- If the error is an invalid configuration value, the server uses internal default values and logs an error in the trace log files. In a small number of cases there is no internal default value, and the server returns the error rsServerConfigurationError if the invalid configuration setting is critical to server operations.

Errors about missing or invalid settings are returned to the client application in an HTML error page and entered in the event log. All changes to the configuration file, including successful ones, are recorded in the report server trace log file. Only errors are entered in the application event log.

## <a name="about-default-values"></a><a name="bkmk_default_values"></a> About default values

Most configuration settings have default values that are specified internally in the report server, and the server uses them if a user-defined value is invalid or not specified. If a default value has to be used because of an invalid setting, an error is recorded in the trace log file.

## <a name="deleting-configuration-settings"></a><a name="bkmk_delete_config_settings"></a> Deleting configuration settings

For configuration settings that have default values, removing the setting from the configuration file has no effect. Most configuration settings are actually defined and configured internally. If you delete an element from the configuration file, the internal copy is still available and uses the specified default value.

## <a name="to-edit-a-reporting-services-configuration-file"></a><a name="bkmk_edit_configuation_file"></a> To edit a Reporting Services configuration file

1. Locate the configuration file you want to edit:

   - **RSReportServer.config** is located in the following folder:

     ```
     C:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportServer
     ```

   - **RSReportServerServices.exe.config** is located in the following folder:

     ```
     C:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportServer\bin
     ```

   - **RSReportDesigner.config** is located in the following folder:

     ```
     C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies
     ```

2. Save a copy of the file in case you need to roll back your changes.

3. Open the original file in Notepad or a code editor. Do not use Textpad, because it sets the file length before saving, which causes an invalid-character error to be recorded in the trace log file.

4. Type the element or value you want to add or use. Because elements are case-sensitive, be sure to use the correct casing when you add an element. Specific instructions for modifying configuration files are available if you are customizing the rendering extension, the authentication extensions, or the data processing extensions:

   - [Authentication with the Report Server](../security/authentication-with-the-report-server.md)
   - [Configure the web portal to pass custom authentication cookies](../security/configure-the-web-portal-to-pass-custom-authentication-cookies.md)
   - [Customize rendering extension parameters in RSReportServer.config](../customize-rendering-extension-parameters-in-rsreportserver-config.md)

5. Save the file.

6. Review the trace log files to check for errors. If error conditions exist, a setting or its value was specified incorrectly. Review the [RSReportServer Configuration File](rsreportserver-config-configuration-file.md) for the valid values of any setting that causes an error. For more information about viewing trace logs, see [Report Server Service Trace Log](report-server-service-trace-log.md).

## <a name="see-also"></a>See Also

[RSReportServer Configuration File](rsreportserver-config-configuration-file.md)
[ReportingServicesService Configuration File](reportingservicesservice-configuration-file.md)
[RSReportDesigner Configuration File](rsreportdesigner-configuration-file.md)
[Deploying a Data Processing Extension](../extensions/data-processing/deploying-a-data-processing-extension.md)
[Deploying a Delivery Extension](../extensions/delivery-extension/deploying-a-delivery-extension.md)
[Deploying a Rendering Extension](../extensions/rendering-extension/deploying-a-rendering-extension.md)
[How to: Deploy a Custom Report Item](../custom-report-items/how-to-deploy-a-custom-report-item.md)
[Reporting Services Configuration Files](reporting-services-configuration-files.md)
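As a concrete illustration of step 4 above, the fragment below is a minimal sketch of what a hand-edited setting in RSReportServer.config can look like. `UrlRoot` is a documented RSReportServer.config element, but the value shown is hypothetical, and a real file contains many more elements.

```xml
<!-- Minimal illustrative fragment of RSReportServer.config (not a complete file). -->
<Configuration>
  <!-- Element names are case-sensitive: UrlRoot, not urlroot. -->
  <UrlRoot>https://myserver/reportserver</UrlRoot>
</Configuration>
```

If the server logs an rsServerConfigurationError after a change like this, restore the copy you saved in step 2.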
95.557692
821
0.792111
ita_Latn
0.998494
dbf472a970eb48c97a2965a0a57c80a988e0dd4e
5,306
md
Markdown
_includes/cv/research.md
nigel121/scholar
92a1407c037bd2ee0f1b2f50047c38676db7a5d3
[ "MIT" ]
null
null
null
_includes/cv/research.md
nigel121/scholar
92a1407c037bd2ee0f1b2f50047c38676db7a5d3
[ "MIT" ]
null
null
null
_includes/cv/research.md
nigel121/scholar
92a1407c037bd2ee0f1b2f50047c38676db7a5d3
[ "MIT" ]
1
2020-11-18T10:45:17.000Z
2020-11-18T10:45:17.000Z
## Research statement

I have been doing research in the broad area of Human-Computer Interaction methods (human-centered software design and evaluation of its effects on humans), with a particular focus on the application domains of entertainment (since 2001) and learning (since 2008). Every few years, depending on resources and contemporary developments, I expand my skills and knowledge into a new application domain. Recently, I have become interested in well-being (since 2013) and culture (since 2018). Here, I organize my research according to broad topics of public interest in order to make it more accessible to the general public.

### Entertainment, learning, well-being, and culture

Since 2010, I have been exploring topics that influence e-learning and the quality of life, such as:

* serious video games: Children (and adults) have become highly engaged by computer video games. Is it possible to [design engaging video games that facilitate learning in Science, Technology, Engineering, and Math (STEM) topics]? What is the [effect of serious video games on student performance and attitudes]?
* learning outside of the classroom: [What is the effect of mobile learning during a museum visit?]
* video-based learning: There are many ways to record a lecture on video, but [which video lecture format is the most effective and preferred by students]?
* teaching computer programming: Computer programming has been treated in the classroom as a math or science topic, but the result has been fewer students being interested in learning, or at least understanding, computer programming. [How can we engage students in computer programming?]
* healthy habits: Computers have diffused into everyday activities such as [casual sports], eating, sleeping, etc.
* maker communities: [The projects, the structure, and the process of maker communities] provide an alternative to the mainstream and formal approaches to design, production, and learning.

The research framework below is organized into two strands with respect to physical context (private and public space) and the type of activity (entertainment and learning).

### Public space, collaboration, telecommunication, and community awareness

Since 2006, I have been examining human activity outside the home and in the public space, with a particular focus on informal educational settings and computer-mediated communication. For this purpose, I employ both established and novel interaction and communication technologies. Moreover, I work closely with schools and teachers in order to leverage their ability to adopt and adapt technologies in ways that are suitable for their skills and needs. Recently, I have been working on the following projects:

* mediacity: a scholarly investigation of the interdisciplinary area defined by architecture, urban studies, and media interaction.
* cult: a cooperation platform between schools in rural areas of Europe.
* videopal: a field study of an asynchronous video link between the USA and Greece.
* collaborative multi-user screens: tools and techniques for designing highly collaborative multi-touch surfaces that are more than the sum of their parts (User Experience Quality in Multi-Touch Tasks, Multi-user Chorded Toolkit for Multi-touch Screens)
* virtual communities: online 3D virtual worlds (Second Life), social networks (Facebook)

### Private space, family life, and multimedia production tools

Since 2000, I have been working on a broad set of research issues in the interdisciplinary area defined by human-computer interaction, multimedia design, and media communication. Most designers with an information technology (IT) background think about interactive television in personal computer terms. Although my academic background is in computer engineering, empirical research in the broadcast and media industries --Hellenic Broadcasting, RAI research, Canal+, Danish Broadcasting-- has taught me several lessons that complement the IT mentality. In particular, I have done research on:

* multimedia authoring tools that are suitable for the workflow of TV producers, as well as for user participation (see papers: User Interface Programming for Interactive TV, The Evolution of TV Systems, Content, and Users Toward Interactivity)
* user interface software and design that facilitates viewer interactivity with the rich TV visual language (see papers: user interface design principles for interactive TV, Animated character likeability revisited: The case of interactive TV)
* user evaluation methodology that concerns the uses and gratifications of TV by the audience (see papers: User Interface Evaluation of Interactive TV: A Media Studies Perspective and Learn and Play with Interactive TV)
* digital media management strategy that augments the established business model with alternative distribution and consumption channels, such as the Internet, mobile devices, etc. (see papers: Coping with TiVo: Opportunities of the Networked Digital Video Recorder and Cross-media digital rights management for online music stores, Taking Social TV Beyond Chatting: How The TV Viewer Adds Value To The Network).
* social video retrieval:
98.259259
609
0.802865
eng_Latn
0.999058
dbf487511db87e2be0a815994d3c6c5be7bd5455
1,029
md
Markdown
README.md
Logan-Roelofs/Reverse-shell
33bb823d832a99915e7a7feeba03dfe25023fa84
[ "MIT" ]
1
2021-12-30T23:07:23.000Z
2021-12-30T23:07:23.000Z
README.md
Logan-Roelofs/Reverse-shell
33bb823d832a99915e7a7feeba03dfe25023fa84
[ "MIT" ]
null
null
null
README.md
Logan-Roelofs/Reverse-shell
33bb823d832a99915e7a7feeba03dfe25023fa84
[ "MIT" ]
null
null
null
# Reverse-shell

Linux reverse shell... work in progress (I will work on it in my free time, when I am not at school, at work, or at sports competitions). **Not finished yet.**

To run this program on a target Windows machine, use PyInstaller to build it into an executable (.exe) file. Optionally, use a PyInstaller image code embedder to make the executable look like a PNG. Requires `ctypes` and `mss`.

Key features of this reverse shell:

- Executing commands
- 20-second reconnect
- Persistence
- Directory changing
- Upload and download files
- Download files off the internet
- Program starting
- Screenshots
- Embedding the shell in an image
- Admin privileges check

Different commands that are coded into this reverse shell (shown by `help`):

- `download path` -> download a file from the target PC
- `upload path` -> upload a file to the target PC
- `get url` -> download a file to the target from any website
- `start path` -> start a program on the target PC
- `screenshot` -> take a screenshot of the target PC
- `check` -> check whether the target PC is admin or not
38.111111
271
0.794947
eng_Latn
0.989005
dbf501d339dc776bb7d6e8212f3b21ecc56589d9
17,306
md
Markdown
docs/miracl-explained/benchmarks.md
grobe0ba/MIRACL
8fc2e447c2a17f5c10d1861f5e02a98882f3a27c
[ "Intel", "Unlicense" ]
419
2016-03-15T18:07:22.000Z
2022-03-31T07:25:20.000Z
docs/miracl-explained/benchmarks.md
grobe0ba/MIRACL
8fc2e447c2a17f5c10d1861f5e02a98882f3a27c
[ "Intel", "Unlicense" ]
89
2016-05-12T17:29:14.000Z
2022-03-22T15:02:11.000Z
docs/miracl-explained/benchmarks.md
grobe0ba/MIRACL
8fc2e447c2a17f5c10d1861f5e02a98882f3a27c
[ "Intel", "Unlicense" ]
180
2016-03-15T23:49:25.000Z
2022-03-10T17:41:02.000Z
* [What Is Miracl](README.md)
* [Security Advisory](security-advisory.md)
* Benchmarks
* [Miracl Standard Curves](miracl-standard-curves.md)
* [IEEE 1363](ieee-1363.md)
* [Elliptic Curves](elliptic-curves.md)
* [Licensing](licensing.md)
* Reference Manual
  * [Low Level Routines](reference-manual/low-level-routines.md)
  * [Advanced Arithmetic Routines](reference-manual/advanced-arithmetic-routines.md)
  * [Montgomery Arithmetic Routines](reference-manual/montgomery-arithmetic-routines.md)
  * [ZZn2 Arithmetic Routines](reference-manual/zzn2-arithmetic-routines.md)
  * [Encryption Routines](reference-manual/encryption-routines.md)
  * [Elliptic Curve Routines](reference-manual/elliptic-curve-routines.md)
  * [Floating Slash Routines](reference-manual/floating-slash-routines.md)
  * [Structure Reference](reference-manual/structure-reference.md)

Benchmarks
---

* [Overview](#overview)
* [Output of the BMARK Program](#output)
* [Elliptic Curve Point Multiplication](#curve)
* [Pairing-Based Crypto](#pairing)

## Overview <a id="overview"></a>

**Performance is the biggest single issue for implementors, and MIRACL allows a variety of techniques (algorithmic tricks and/or assembly language) to be used to squeeze maximum performance from a particular environment. So use MIRACL in your cryptographic API for a performance boost - you may not need that expensive cryptographic accelerator!**

The diagram below shows timings for modular exponentiation, that is, the calculation of x^y mod n, for x, y, and n all of the same size in bits - the size shown along the horizontal axis. The exponent y is chosen at random. This is the bottleneck calculation in many cryptographic protocols. Five different methods are implemented for the Intel 80x86/Pentium family. Timings on the horizontal axes are correct in seconds for 8192-bit exponentiation. For 4096 bits divide by 8, for 2048 bits divide by 8 again, etc. For a paper describing the methods in more detail, see [timings.doc](miracl-explained/timings.doc ':ignore').

The following timings were obtained using the Borland C/C++ Compiler/assembler, for modular exponentiation. Times in milliseconds for the optimal technique:

| | 512-bits | 1024-bits | 2048-bits | 4096-bits |
|--------------------|----------|-----------|-----------|-----------|
| 33MHz 80486DX | 370 | 2833 | 17833 | 111000 |
| 60MHz Pentium | 48 | 353 | 2452 | 18500 |
| 180MHz Pentium Pro | 12 | 90 | 564 | 3551 |
| 233MHz Pentium II | 10 | 80 | 510 | 3250 |

**On a 233 MHz Pentium II - Best times (without precomputation)**

- A 1024-bit RSA decryption/signature takes 20 ms <sup>*</sup>
- A 2048-bit RSA decryption takes 160 ms <sup>+</sup>
- A 1024-bit (160-bit exponent) DSS verification takes 16 ms <sup>+</sup>
- A 2048-bit (256-bit exponent) DSS verification takes 79 ms <sup>+</sup>
- A 160-bit Elliptic Curve ECS verification takes 11 ms <sup>*</sup>
- A 256-bit Elliptic Curve ECS verification takes 26 ms <sup>*</sup>
- A 192-bit Elliptic Curve ECS verification takes 9 ms (NIST Standard Curve - Special Modulus) <sup>*</sup>
- A 224-bit Elliptic Curve ECS verification takes 13 ms (NIST Standard Curve - Special Modulus) <sup>*</sup>

**On 80MHz ARM7TDMI - Best times (without precomputation)**

- A 1024-bit RSA decryption/signature takes 120 ms <sup>*</sup>
- A 192-bit Elliptic Curve point multiplication takes 38 ms (NIST Standard Curve - Special Modulus) <sup>*</sup>
- A 224-bit Elliptic Curve point multiplication takes 53 ms (NIST Standard Curve - Special Modulus) <sup>*</sup>

MIRACL contains fast experimental implementations of [Identity-Based Encryption](http://crypto.stanford.edu/ibe/). Timings include all number-theoretic components of encrypt/decrypt processing. The most time-consuming component is the calculation of the Tate pairing. The discrete-logarithm security (in bits) of a pairing-based system is a function of the product of the _security multiplier_ k and the bit length of the base field. In these cases k=2 and the base field is 512 bits, for 1024-bit security.

**On a 1GHz Pentium III**

- A 1024-bit IBE encrypt takes 35 ms <sup>*</sup>
- A 1024-bit IBE decrypt takes 27 ms <sup>*</sup>
- A 1024-bit IBE encrypt takes 22 ms (with precomputation) <sup>*</sup>
- A 1024-bit IBE decrypt takes 17 ms (with precomputation) <sup>*</sup>
- A 1024-bit Tate pairing takes 20 ms <sup>*</sup>
- A 1024-bit Tate pairing takes 8.6 ms (with precomputation) <sup>*</sup>

<sup>* - Using Comba Method for modular multiplication</sup><br />
<sup>+ - Using KCM Method for modular multiplication</sup>

## Output of the BMARK program <a id="output"></a>

Below is the output of the BMARK program, on a single core of a 2.4GHz Intel i5 520 processor, compiled with GCC at the standard -O2 compiler optimization level.

> This is for the standard version of MIRACL, with no special optimizations.

- MIRACL – 64-bit version
- Little Endian processor
- Using some assembly language
- No special optimizations
- Precomputation uses fixed Window size = 8
- So 256 values are precomputed and stored

> No optimizations/assembly language apply to GF(2^m) Elliptic Curves.<br />Times are elapsed real times - so make sure nothing else is running!

Modular exponentiation benchmarks – calculating g^e mod p. From these figures it should be possible to roughly estimate the time required for your favourite PK algorithm: RSA, DSA, DH, etc.

**Key**

- R – random base bits / random exponent bits
- V – random base bits / (small exponent e)
- S – (small base g) / random exponent bits
- P – exponentiation with precomputation (fixed base g)
- D – double exponentiation g^e.a^b mod p
- F3 = 257, F4 = 65537
- RSA - Rivest-Shamir-Adleman
- DH - Diffie Hellman Key exchange
- DSA - Digital Signature Algorithm

**512-bit prime**

- R - 54945 iterations of 512/ 160: 0.18 ms per iteration
- D - 45015 iterations of 512/ 160: 0.22 ms per iteration
- R - 18292 iterations of 512/ 512: 0.55 ms per iteration
- S - 67125 iterations of g=3/ 160: 0.15 ms per iteration
- P - 281436 iterations of 512/ 160: 0.04 ms per iteration

**1024-bit RSA decryption** 1.09 ms

**512-bit DH 160-bit exponent**

- Offline, no precomputation 0.18 ms
- Offline, small base 0.15 ms
- Offline, w. precomputation 0.04 ms
- Online 0.18 ms

**512-bit DSA 160-bit exponent**

- Signature, no precomputation 0.18 ms
- Signature, w. precomputation 0.04 ms
- Verification 0.22 ms

**1024-bit prime**

- R - 17875 iterations of 1024/ 160: 0.56 ms per iteration
- D - 14859 iterations of 1024/ 160: 0.67 ms per iteration
- V - 1163058 iterations of 1024/e= 3: 0.01 ms per iteration
- V - 154892 iterations of 1024/e=F4: 0.06 ms per iteration
- S - 22799 iterations of g=3/ 160: 0.44 ms per iteration
- P - 89730 iterations of 1024/ 160: 0.11 ms per iteration

**2048-bit RSA decryption** 6.62 ms
**1024-bit RSA encryption e=3** 0.01 ms
**1024-bit RSA encryption e=65537** 0.06 ms

**1024-bit DH 160-bit exponent**

- Offline, no precomputation 0.56 ms
- Offline, small base 0.44 ms
- Offline, w. precomputation 0.11 ms
- Online 0.56 ms

**1024-bit DSA 160-bit exponent**

- Signature, no precomputation 0.56 ms
- Signature, w. precomputation 0.11 ms
- Verification 0.67 ms

**2048-bit prime**

- R - 2982 iterations of 2048/ 256: 3.35 ms per iteration
- D - 2335 iterations of 2048/ 256: 4.28 ms per iteration
- R - 398 iterations of 2048/2048: 25.13 ms per iteration
- V - 366871 iterations of 2048/e= 3: 0.03 ms per iteration
- V - 48125 iterations of 2048/e=F4: 0.21 ms per iteration
- S - 4223 iterations of g=3/ 256: 2.37 ms per iteration
- P - 15500 iterations of 2048/ 256: 0.65 ms per iteration

**2048-bit RSA encryption e=3** 0.03 ms
**2048-bit RSA encryption e=65537** 0.21 ms

**2048-bit DH 256-bit exponent**

- Offline, no precomputation 3.35 ms
- Offline, small base 2.37 ms
- Offline, w. precomputation 0.65 ms
- Online 3.35 ms

**2048-bit DSA 256-bit exponent**

- Signature, no precomputation 3.35 ms
- Signature, w. precomputation 0.65 ms
- Verification 4.28 ms

## Elliptic Curve Point Multiplication <a id="curve"></a>

Elliptic Curve point multiplication benchmarks – calculating r.P. From these figures it should be possible to roughly estimate the time required for your favourite EC PK algorithm: ECDSA, ECDH, etc.

**Key**

- ER - Elliptic Curve point multiplication r.P
- ED - Elliptic Curve double multiplication r.P + s.Q
- EP - Elliptic Curve multiplication with precomputation
- EC - Elliptic curve GF(p) - p of no special form
- ECDH - Diffie Hellman Key exchange
- ECDSA - Digital Signature Algorithm

**160-bit GF(p) Elliptic Curve**

- ER - 22280 iterations: 0.45 ms per iteration
- ED - 17217 iterations: 0.58 ms per iteration
- EP - 96332 iterations: 0.10 ms per iteration

**160-bit ECDH**

- Offline, no precomputation 0.45 ms
- Offline, w. precomputation 0.10 ms
- Online 0.45 ms

**160-bit ECDSA**

- Signature, no precomputation 0.45 ms
- Signature, w. precomputation 0.10 ms
- Verification 0.58 ms

**192-bit GF(p) Elliptic Curve**

- ER - 17095 iterations: 0.58 ms per iteration
- ED - 12936 iterations: 0.77 ms per iteration
- EP - 74904 iterations: 0.13 ms per iteration

**192-bit ECDH**

- Offline, no precomputation 0.58 ms
- Offline, w. precomputation 0.13 ms
- Online 0.58 ms

**192-bit ECDSA**

- Signature, no precomputation 0.58 ms
- Signature, w. precomputation 0.13 ms
- Verification 0.77 ms

**224-bit GF(p) Elliptic Curve**

- ER - 11832 iterations: 0.85 ms per iteration
- ED - 9486 iterations: 1.05 ms per iteration
- EP - 52869 iterations: 0.19 ms per iteration

**224-bit ECDH**

- Offline, no precomputation 0.85 ms
- Offline, w. precomputation 0.19 ms
- Online 0.85 ms

**224-bit ECDSA**

- Signature, no precomputation 0.85 ms
- Signature, w. precomputation 0.19 ms
- Verification 1.05 ms

**256-bit GF(p) Elliptic Curve**

- ER - 9410 iterations: 1.06 ms per iteration
- ED - 7124 iterations: 1.40 ms per iteration
- EP - 41546 iterations: 0.24 ms per iteration

**256-bit ECDH**

- Offline, no precomputation 1.06 ms
- Offline, w. precomputation 0.24 ms
- Online 1.06 ms

**256-bit ECDSA**

- Signature, no precomputation 1.06 ms
- Signature, w. precomputation 0.24 ms
- Verification 1.40 ms

**163-bit GF(2^m) Elliptic Curve**

- ER - 27160 iterations: 0.37 ms per iteration
- ED - 20689 iterations: 0.48 ms per iteration
- EP - 107712 iterations: 0.09 ms per iteration

**163-bit ECDH**

- Offline, no precomputation 0.37 ms
- Offline, w. precomputation 0.09 ms
- Online 0.37 ms

**163-bit ECDSA**

- Signature, no precomputation 0.37 ms
- Signature, w. precomputation 0.09 ms
- Verification 0.48 ms

**163-bit GF(2^m) Koblitz Elliptic Curve**

- ER - 43413 iterations: 0.23 ms per iteration
- ED - 23882 iterations: 0.42 ms per iteration
- EP - 111239 iterations: 0.09 ms per iteration

**163-bit ECDH**

- Offline, no precomputation 0.23 ms
- Offline, w. precomputation 0.09 ms
- Online 0.23 ms

**163-bit ECDSA**

- Signature, no precomputation 0.23 ms
- Signature, w. precomputation 0.09 ms
- Verification 0.42 ms

**233-bit GF(2^m) Elliptic Curve**

- ER - 16703 iterations: 0.60 ms per iteration
- ED - 12460 iterations: 0.80 ms per iteration
- EP - 62551 iterations: 0.16 ms per iteration

**233-bit ECDH**

- Offline, no precomputation 0.60 ms
- Offline, w. precomputation 0.16 ms
- Online 0.60 ms

**233-bit ECDSA**

- Signature, no precomputation 0.60 ms
- Signature, w. precomputation 0.16 ms
- Verification 0.80 ms

**233-bit GF(2^m) Koblitz Elliptic Curve**

- ER - 27404 iterations: 0.36 ms per iteration
- ED - 13872 iterations: 0.72 ms per iteration
- EP - 62887 iterations: 0.16 ms per iteration

**233-bit ECDH**

- Offline, no precomputation 0.36 ms
- Offline, w. precomputation 0.16 ms
- Online 0.36 ms

**233-bit ECDSA**

- Signature, no precomputation 0.36 ms
- Signature, w. precomputation 0.16 ms
- Verification 0.72 ms

**283-bit GF(2^m) Elliptic Curve**

- ER - 9870 iterations: 1.01 ms per iteration
- ED - 7095 iterations: 1.41 ms per iteration
- EP - 37435 iterations: 0.27 ms per iteration

**283-bit ECDH**

- Offline, no precomputation 1.01 ms
- Offline, w. precomputation 0.27 ms
- Online 1.01 ms

**283-bit ECDSA**

- Signature, no precomputation 1.01 ms
- Signature, w. precomputation 0.27 ms
- Verification 1.41 ms

**283-bit GF(2^m) Koblitz Elliptic Curve**

- ER - 19687 iterations: 0.51 ms per iteration
- ED - 8968 iterations: 1.12 ms per iteration
- EP - 37505 iterations: 0.27 ms per iteration

**283-bit ECDH**

- Offline, no precomputation 0.51 ms
- Offline, w. precomputation 0.27 ms
- Online 0.51 ms

**283-bit ECDSA**

- Signature, no precomputation 0.51 ms
- Signature, w. precomputation 0.27 ms
- Verification 1.12 ms

**571-bit GF(2^m) Elliptic Curve**

- ER - 2227 iterations: 4.49 ms per iteration
- ED - 1504 iterations: 6.65 ms per iteration
- EP - 8231 iterations: 1.21 ms per iteration

**571-bit ECDH**

- Offline, no precomputation 4.49 ms
- Offline, w. precomputation 1.21 ms
- Online 4.49 ms

**571-bit ECDSA**

- Signature, no precomputation 4.49 ms
- Signature, w. precomputation 1.21 ms
- Verification 6.65 ms

**571-bit GF(2^m) Koblitz Elliptic Curve**

- ER - 5035 iterations: 1.99 ms per iteration
- ED - 2242 iterations: 4.46 ms per iteration
- EP - 8247 iterations: 1.21 ms per iteration

**571-bit ECDH**

- Offline, no precomputation 1.99 ms
- Offline, w. precomputation 1.21 ms
- Online 1.99 ms

**571-bit ECDSA**

- Signature, no precomputation 1.99 ms
- Signature, w. precomputation 1.21 ms
- Verification 4.46 ms

## Pairing-Based Crypto <a id="pairing"></a>

Processor: 2.4 GHz Intel i5 520M.<br />
AES refers to equivalent AES bits of security. For example, 128 bits refers to AES with a 128-bit key.<br />
For G1, G2 and GT precomputation, 8-bit windows are used.<br />
All timings are in milliseconds. Maximum optimization applied.<br />
"One More" refers to the cost of one more pairing in a multi-pairing. The (p) means that precomputation is used.<br />

**Timings for Type-1 pairings G1 X G1 = GT**

These pairing-friendly curves are used, where _k_ is the embedding degree:

- SSP - Super-singular Curve over GF(_p_) (512-bit modulus _p_, _k_=2)
- SSP - Super-singular Curve over GF(_p_) (1536-bit modulus _p_, _k_=2)
- SS2 - Supersingular Curve over GF(2^_m_) (_m_=379, _k_=4)
- SS2 - Supersingular Curve over GF(2^_m_) (_m_=1223, _k_=4)

| AES/Curve | 80/SSP | 80/SS2 | 128/SSP | 128/SS2 |
|--------------|--------|--------|---------|---------|
| G1 mul | 1.49 | 0.38 | 13.57 | 2.57 |
| G1 mul (p) | 0.30 | - | 3.01 | - |
| Pairing | 3.34 | 1.18 | 40.95 | 19.00 |
| Pairing (p) | 1.65 | - | 25.22 | - |
| GT pow | 0.36 | 0.29 | 3.76 | 2.09 |
| GT pow (p) | 0.08 | - | 0.78 | - |
| One More | 2.29 | 1.01 | 20.80 | 17.80 |
| One More (p) | 0.60 | - | 5.31 | - |

**Timings for Type-3 pairings G2 X G1 = GT**

These pairing-friendly curves are used, where _k_ is the embedding degree:

- CP - Cocks-Pinch Curve over GF(_p_) (512-bit modulus _p_, _k_=2)
- MNT - MNT Curve over GF(_p_) (160-bit modulus _p_, _k_=6)
- BN - Barreto-Naehrig Curve over GF(_p_) (256-bit modulus _p_, _k_=12)
- KSS - Kachisa-Schaefer-Scott Curve over GF(_p_) (512-bit modulus _p_, _k_=18)
- BLS - Barreto-Lynn-Scott Curve over GF(_p_) (640-bit modulus _p_, _k_=24)

| AES/Curve | 80/CP | 80/MNT | 128/BN | 192/KSS | 256/BLS |
|--------------|-------|--------|--------|---------|---------|
| G1 mul | 0.51 | 0.19 | 0.22 | 0.7 | 1.26 |
| G1 mul (p) | 0.1 | 0.04 | 0.07 | 0.24 | 0.43 |
| G2 mul | 0.51 | 1.15 | 0.44 | 5.53 | 16.04 |
| G2 mul (p) | 0.1 | 0.35 | 0.19 | 2.81 | 5.44 |
| Pairing | 1.14 | 1.9 | 2.32 | 20.55 | 33.91 |
| Pairing (p) | 0.58 | 0.69 | 2.09 | 18.05 | 30.45 |
| GT pow | 0.12 | 0.24 | 0.95 | 6.2 | 24.87 |
| GT pow (p) | 0.03 | 0.08 | 0.43 | 2.73 | 6.47 |
| One More | 0.81 | 1.57 | 0.75 | 4.65 | 6.59 |
| One More (p) | 0.23 | 0.34 | 0.41 | 2.38 | 3.42 |
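To relate the figures above to running code, the snippet below is a minimal Python sketch (not MIRACL itself) of the g^e mod p operation that the modular exponentiation benchmarks measure; the modulus and parameter sizes are illustrative.

```python
# Minimal sketch (not MIRACL): the g^e mod p operation measured by the
# modular exponentiation benchmarks above.
import secrets

p = 2**521 - 1              # illustrative prime modulus (a Mersenne prime)
g = 3                       # small base, as in the "S" rows of the Key
e = secrets.randbits(160)   # random 160-bit exponent, as in the "512/ 160" rows

result = pow(g, e, p)       # Python's built-in modular exponentiation
print(result.bit_length())
```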
36.587738
619
0.640992
eng_Latn
0.570645
dbf5385df48b15b99f39cdbd812b1d72f3952306
10,565
md
Markdown
docs/migrate/phase-features-with-feature-flags.md
cpoDesign/vsts-docs
fb42e410b4af3b0de303775b839e8cc6528e519e
[ "CC-BY-4.0", "MIT" ]
1
2020-07-10T19:07:34.000Z
2020-07-10T19:07:34.000Z
docs/migrate/phase-features-with-feature-flags.md
cpoDesign/vsts-docs
fb42e410b4af3b0de303775b839e8cc6528e519e
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/migrate/phase-features-with-feature-flags.md
cpoDesign/vsts-docs
fb42e410b4af3b0de303775b839e8cc6528e519e
[ "CC-BY-4.0", "MIT" ]
1
2019-10-30T05:04:34.000Z
2019-10-30T05:04:34.000Z
---
title: Progressively expose your features using feature flags
description: Explore how to progressively expose your features in production for some or all users
ms.prod: devops
ms.topic: conceptual
ms.technology: devops-migrate
ms.manager: jillfra
ms.date: 04/26/2018
ms.author: kaelli
author: KathrynEE
monikerRange: '>= tfs-2013'
---

# Explore how to progressively expose your features in production for some or all users

[!INCLUDE [version-azure-devops](../_shared/version-vsts-tfs-all-versions.md)]

In today's fast-paced, feature-driven markets, it's important to continuously deliver value and receive feedback on features quickly and continuously. Partnering with end users to get early versions of features vetted out is valuable.

Are you planning to continuously integrate features into your application while they're under development? You probably have a few questions, such as:

- How can you toggle features to hide, disable, or enable them at run-time?
- How can you revert a change deployed to production without rolling back your release?
- How can you present users with variants of a feature, to determine which one performs better?

This topic aims to answer these questions and share an implementation of feature flags (FF) and A|B testing used with Azure DevOps extensions.

## Considerations

Before you introduce feature flags to your engineering process, it's important to consider:

- Which users are you planning to target? For example, do you want to target specific users or all users?
- Would you like users to decide which features they want to use?
- What's the value of embracing feature flags as part of your engineering process?
- What's the cost to implement feature flags in your engineering process?

Before you flip your first feature flag in production, take the time to read:

* ["A Rough Patch", by Brian Harry](https://blogs.msdn.microsoft.com/bharry/2013/11/25/a-rough-patch)
* ["Feature Flags with Branching", by LaunchDarkly](https://launchdarkly.com/guide/flagsbranching.html)

## What are Feature Flags (FF)?

> [!NOTE]
> A feature flag is also known as a feature toggle, feature switch, feature flipper, or conditional feature. They were popularized by [Martin Fowler](https://martinfowler.com/bliki/FeatureToggle.html).

Feature flags support a customer-first DevOps mindset, to enable (expose) and disable (hide) features in a solution, even before they are complete and ready for release.

![Feature Flag](./_img/phase-features-with-ff/phase-features-with-ff-feature-flag.png)

View a feature flag as an ON | OFF switch for a specific feature. As shown, you can deploy a solution to production that includes both an email and a print feature. If the feature flag is set (ON), you email; otherwise, you print.

When you combine a feature flag with an experiment, led by a hypothesis, you introduce A|B testing. For example, you could run an experiment to determine whether the email (A) or the print (B) feature results in higher user satisfaction.

> [!NOTE]
> A|B testing is also known as Split Testing. It's based on a hypothesis that's defined as:
>
> **For** {user} **who** {action} **the** {solution} **is a** {how} **that** {value} **unlike** {competition} **we** {do better}

![Feature Flag](./_img/phase-features-with-ff/phase-features-with-ff-ab-test.png)

As shown, the email feature (option A) is more popular with your users and wins.
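To make the ON | OFF switch concrete, here is a minimal sketch of the email-or-print example above. It is generic Python with a hypothetical in-memory flag store, not the LaunchDarkly SDK or the extension's actual code:

```python
# Minimal sketch of the email-vs-print feature flag described above.
# FLAGS is a hypothetical stand-in for a real flag service.
FLAGS = {"use-email": True}  # flipped at run-time, without redeployment

def is_enabled(flag: str, default: bool = False) -> bool:
    """Look up a feature flag; fall back to a safe default."""
    return FLAGS.get(flag, default)

def send_email(report: str) -> None:
    print(f"emailing: {report}")

def print_report(report: str) -> None:
    print(f"printing: {report}")

def notify(report: str) -> None:
    if is_enabled("use-email"):   # flag ON: the email feature is exposed
        send_email(report)
    else:                         # flag OFF: fall back to the print feature
        print_report(report)

notify("weekly status")  # prints "emailing: weekly status" while the flag is ON
```

The design point is that `notify()` ships with both code paths in production; only the flag value decides which one users see.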
## Evaluating Feature Flag solutions

As outlined in [how to implement feature flags and A|B testing](https://blogs.msdn.microsoft.com/visualstudioalmrangers/2017/04/04/how-to-implement-feature-flags-and-ab-testing/), the ALM | DevOps Rangers evaluated a number of FF frameworks and solutions. They chose the [LaunchDarkly](https://launchdarkly.com/index.html) solution for several reasons:

- It's a "software as a service" (SaaS) solution
  - No custom solution to maintain
  - No upgrades - you're always using the latest and greatest
  - No servers - [LaunchDarkly](https://launchdarkly.com/index.html) takes care of the machines that LaunchDarkly runs on
  - Always on and optimized for the Internet
- It's integrated with Azure DevOps Services and Team Foundation Server (TFS)
- It's simple and cost-effective for an open-source project

## Common scenarios

You have a [CI/CD pipeline](https://blogs.msdn.microsoft.com/visualstudioalmrangers/tag/cicd-pipeline/) for every Azure DevOps extension you're hosting on the [marketplace](https://marketplace.visualstudio.com). You are using a ring deployment model and manual release approval checkpoints. The checkpoints are manual and time consuming, but necessary to minimize the chance of breaking the early-adopter and production user environments, forcing an expensive roll-back.

You're looking for an engineering process which enables you to:

* Continuously deploy to production
* Never roll back in production
* Fine-tune the user experience in production

You have probably guessed it - feature flags!

### Enable or disable a feature for everyone

You would like to include hidden features in your release and enable them for **all** users in production. For example, you want to be able to collect verbose logging data for troubleshooting. Using a feature flag, you can enable and disable verbose logging as needed.

![Feature Flag](./_img/phase-features-with-ff/phase-features-with-ff-all-or-nothing.png)

### Enable or disable a feature for selected users

With this scenario, you can target specific users or groups of users. For example, you could enable the verbose logging feature for a specific user experiencing a problem, or enable a preview feature for early adopters.

![Feature Flag](./_img/phase-features-with-ff/phase-features-with-ff-user-group.png)

### Enable or disable a feature as selected by the user

Lastly, you'd like to give users a list of preview features and allow each user to decide which feature to enable, and when. This scenario is key for feature validation, A|B testing, and giving the user flexibility and choice.

## Manage features with feature flags in your engineering process

To protect the flags from malicious users, you need to generate and pass the hash of the user key to the LaunchDarkly API calls. As Azure DevOps extensions can only use client-side code, the ALM | DevOps Rangers chose Azure Functions to help generate the hash, as shown. Read [how we checked and fixed the 503 error and Performance issue in our Azure Function](https://blogs.msdn.microsoft.com/visualstudioalmrangers/2018/04/03/how-we-checked-and-fixed-the-503-error-and-performance-issue-in-our-azure-function/) for details.

![Extension calls an Azure Function, which calls the LaunchDarkly SDK](./_img/phase-features-with-ff/phase-features-with-ff-ld-azure-fx.png)

Administration of feature flags is straightforward:

1. You have a different environment for each extension, allowing you to have different feature flag values for Early Adopters and Users.
2. Optionally target specific users.
3. Optionally target users that match custom rules.
4. You have a default for each feature flag.

You have granular control of each feature flag.

![LaunchDarkly Admin Dashboard](./_img/phase-features-with-ff/phase-features-with-ff-admin.png)

## What's the value?

You're able to:

* Decouple deployment of releases and exposure of features
* Make changes (enable or disable features) without redeployment
* Fine-tune a user's features and experience
* Enable a user to optionally select preview features
* Hide an incomplete or faulty feature

## What's the cost?

Aside from the licensing and maintenance cost of a feature flag service, you're adding technical debt to your code:

* With a true or false feature flag, you're doubling your code and test paths
* With a multi-value feature flag, you'll add even more code and test paths
* You'll need to identify and remove stale feature flags
* You'll need to understand and test the implications of flipping a feature flag

> [!TIP]
> To minimize the costs associated with the use of feature flags, keep feature flags short-lived and prevent multiple feature flags from interfering with each other by affecting the same functionality.

## Conclusion

Now that you've covered the concepts and considerations of feature flags, you should feel confident exploring ways to improve your CI/CD pipelines. While feature flags come at a cost, having a game plan to manage exposed features at run-time is invaluable.

## Q & A

### How does the Azure DevOps team use feature flags?

Buck's [feature flags blog post](https://blogs.msdn.microsoft.com/buckh/2016/09/30/controlling-exposure-through-feature-flags-in-vs-team-services/) and the [presentation/article](/azure/devops/learn/devops-at-microsoft/progressive-experimentation-feature-flags) are great sources for understanding the custom-built feature flag system used with Team Foundation Server (TFS) and Azure DevOps Services.

### How do the ALM | DevOps Rangers use feature flags?

The Rangers use the [LaunchDarkly](https://www.launchdarkly.com/) SaaS solution. You can find their learnings in this [blog series](https://blogs.msdn.microsoft.com/visualstudioalmrangers/tag/launchdarkly/).

### When should you remove feature flags?

As Buck states, "Many feature flags go away and the teams themselves take care of that." The feature teams decide when to delete the feature flags. It can get unwieldy after a while, so there's some natural motivation to clean it up.

### Is there a dependency on deployment rings?

No, rings and feature flags are symbiotic. Read [Feature Flags or Rings](https://aka.ms/vsar-rings-flags) for details.

## Reference information

* [CI/CD pipeline examples](https://blogs.msdn.microsoft.com/visualstudioalmrangers/tag/cicd-pipeline/)
* [DevOps @ Microsoft](https://aka.ms/devops)
* [How to implement feature flags and A|B testing](https://blogs.msdn.microsoft.com/visualstudioalmrangers/2017/04/04/how-to-implement-feature-flags-and-ab-testing/)

> Authors: Willy Schaub | Find the origin of this article and connect with the ALM | DevOps Rangers [here](https://github.com/ALM-Rangers/Guidance/blob/master/README.md)

*(c) 2017 Microsoft Corporation. All rights reserved. This document is provided "as-is." Information and views expressed in this document, including URL and other Internet Web site references, may change without notice. You bear the risk of using it.*

*This document does not provide you with any legal rights to any intellectual property in any Microsoft product. You may copy and use this document for your internal, reference purposes.*
59.689266
535
0.781732
eng_Latn
0.996222
dbf5b95431df11ef63de32398b190534b63a7e34
3,363
md
Markdown
src/ProjectTemplates/README.md
badrshs/aspnetcore
4b435409e9d51fab019fff4e4867ec2708b07a00
[ "Apache-2.0" ]
10
2021-06-09T02:22:29.000Z
2021-07-07T14:13:48.000Z
src/ProjectTemplates/README.md
badrshs/aspnetcore
4b435409e9d51fab019fff4e4867ec2708b07a00
[ "Apache-2.0" ]
47
2021-06-09T00:55:43.000Z
2022-03-28T11:06:48.000Z
src/ProjectTemplates/README.md
badrshs/aspnetcore
4b435409e9d51fab019fff4e4867ec2708b07a00
[ "Apache-2.0" ]
2
2021-03-16T07:53:44.000Z
2021-12-16T11:19:53.000Z
# Templates

These are project templates which are used in .NET Core for creating ASP.NET Core applications.

## Description

The following contains a description of each sub-directory in the `ProjectTemplates` directory.

- `BlazorTemplates.Tests`: Contains the source files for the Blazor template tests; these are currently split out because they are not Helix-ready yet.
- `Shared`: Contains a collection of shared constants and helper methods/classes, including the infrastructure for managing dotnet processes to create, build, and run template tests.
- `Web.Client.ItemTemplates`: Contains the Web Client-Side File templates, including things like Less, Scss, and TypeScript.
- `Web.ItemTemplates`: Contains the Web File templates, including things like protobuf, Razor components, Razor pages, view imports, and view start pages.
- `Web.ProjectTemplates`: Contains the ASP.NET Core Web Template pack, including Blazor Server, WASM, Empty, Grpc, Razor Class Library, RazorPages, MVC, and WebApi.
- `Web.Spa.ProjectTemplates`: Contains the Single Page Application templates for ASP.NET Core, including Angular, React, and ReactRedux.
- `migrations`: Contains migration-related scripts.
- `scripts`: Contains a collection of scripts that help run tests locally while avoiding having to install the templates on the machine.
- `test`: Contains the end-to-end template tests.
- `testassets`: Contains assets used by the tests, like a dotnet tools installer.

### Build

Some projects in this repository (like the SignalR Java Client) require JDK installation and configuration of the `JAVA_HOME` environment variable.

1. If you don't have the JDK installed, you can find it at https://www.oracle.com/technetwork/java/javase/downloads/index.html
1. After installation, define a new environment variable named `JAVA_HOME` pointing to the root of the latest JDK installation (for Windows it will be something like `c:\Program Files\Java\jdk-12`).
1. Add the `%JAVA_HOME%\bin` directory to the `PATH` environment variable.

To build the ProjectTemplates:

1. Run `build.cmd -all -pack -configuration Release` in the repository root to build all of the dependencies.
1. Run `build.cmd -pack -NoRestore -NoBuilddeps -configuration Release` in the src/ProjectTemplates directory to produce NuGet packages for each class of template in the artifacts directory.

### Test

To run the ProjectTemplate tests:

1. Because the templates build against the version of `Microsoft.AspNetCore.App` that was built during the previous step, it is NOT advised that you install templates created on your local machine via `dotnet new -i [nupkgPath]`. Instead, use the `Run-[Template]-Locally.ps1` scripts in the scripts folder. These scripts do `dotnet new -i` with your packages, but also apply a series of fixes and tweaks to the created template which keep the fact that you don't have a production `Microsoft.AspNetCore.App` from interfering.
1. The ASP.NET localhost development certificate must also be installed and trusted, or else you'll get the test error "Certificate error: Navigation blocked".
1. Run `.\build.cmd -test -NoRestore -NoBuild -NoBuilddeps -configuration Release "/p:RunTemplateTests=true"` to run the template tests.

**Note:** ProjectTemplates tests require Visual Studio unless a full build (CI) is performed.

## More Information

For more information, see the [ASP.NET Core README](../../README.md).
73.108696
524
0.78769
eng_Latn
0.986939
dbf5c8de49384e305cb920e3cc758f30a01765fa
6,355
md
Markdown
v19.2/postgresql-compatibility.md
rgbkrk/docs-2
d2bccd7eced0414f58d7b81f256901db1c7f5b6d
[ "CC-BY-4.0", "BSD-3-Clause" ]
null
null
null
v19.2/postgresql-compatibility.md
rgbkrk/docs-2
d2bccd7eced0414f58d7b81f256901db1c7f5b6d
[ "CC-BY-4.0", "BSD-3-Clause" ]
null
null
null
v19.2/postgresql-compatibility.md
rgbkrk/docs-2
d2bccd7eced0414f58d7b81f256901db1c7f5b6d
[ "CC-BY-4.0", "BSD-3-Clause" ]
null
null
null
---
title: PostgreSQL Compatibility
summary: A summary of CockroachDB's compatibility with PostgreSQL
toc: true
redirect_from: porting-postgres.html
---

CockroachDB supports the PostgreSQL wire protocol and the majority of its syntax. This means that existing applications can often be migrated to CockroachDB without changing application code.

CockroachDB is compatible with PostgreSQL 9.6 and works with the majority of PostgreSQL database tools, such as [DBeaver](dbeaver.html), [IntelliJ](intellij-idea.html), pg_dump, and so on. See the full list of supported [third-party database tools](third-party-database-tools.html). CockroachDB also works with most [PostgreSQL drivers and ORMs](build-an-app-with-cockroachdb.html).

However, CockroachDB does not support some PostgreSQL features, and it behaves differently from PostgreSQL where a feature cannot be easily implemented in a distributed system. This page documents the known differences between PostgreSQL and CockroachDB for identical input; that is, a SQL statement of the type listed here will behave differently than in PostgreSQL. Porting an existing application to CockroachDB will require changing these expressions.

{{site.data.alerts.callout_info}}This document currently only covers unsupported SQL and how to rewrite SQL expressions. It does not discuss strategies for porting applications that use <a href="sql-feature-support.html">SQL features CockroachDB does not currently support</a>, such as the <code>ENUM</code> type.{{site.data.alerts.end}}

## Unsupported Features

- Stored procedures and functions
- Triggers
- Events
- User-defined functions
- FULLTEXT functions and indexes
- GEOSPATIAL functions and indexes
- Drop primary key
- XML functions
- Savepoints
- Column-level privileges
- CREATE TEMPORARY TABLE syntax
- XA syntax

## Features that differ from PostgreSQL

Note that some of the differences below apply only to rare inputs, in which case no change is needed even if the listed feature is being used; in those cases it is safe to ignore the porting instructions.

### Overflow of `float`

In PostgreSQL, the `float` type returns an error when it overflows or an expression would return Infinity:

~~~
postgres=# select 1e300::float * 1e10::float;
ERROR: value out of range: overflow
postgres=# select pow(0::float, -1::float);
ERROR: zero raised to a negative power is undefined
~~~

In CockroachDB, these expressions instead return Infinity:

{% include copy-clipboard.html %}
~~~ sql
SELECT 1e300::float * 1e10::float;
~~~

~~~
+----------------------------+
| 1e300::FLOAT * 1e10::FLOAT |
+----------------------------+
| +Inf                       |
+----------------------------+
~~~

{% include copy-clipboard.html %}
~~~ sql
SELECT pow(0::float, -1::float);
~~~

~~~
+---------------------------+
| pow(0::FLOAT, - 1::FLOAT) |
+---------------------------+
| +Inf                      |
+---------------------------+
~~~

### Precedence of unary `~`

In PostgreSQL, the unary `~` (bitwise not) operator has a low precedence. For example, the following query is parsed as `~ (1 + 2)` because `~` has a lower precedence than `+`:

{% include copy-clipboard.html %}
~~~ sql
SELECT ~1 + 2
~~~

In CockroachDB, unary `~` has the same (high) precedence as unary `-`, so the above expression is parsed as `(~1) + 2`.

**Porting instructions:** Manually add parentheses around expressions that depend on the PostgreSQL behavior.

### Precedence of bitwise operators

In PostgreSQL, the operators `|` (bitwise OR), `#` (bitwise XOR), and `&` (bitwise AND) all have the same precedence.

In CockroachDB, the precedence from highest to lowest is: `&`, `#`, `|`.

**Porting instructions:** Manually add parentheses around expressions that depend on the PostgreSQL behavior.

### Integer division

In PostgreSQL, division of integers results in an integer. For example, the following query returns `1`, since `1 / 2` is truncated to `0`:

{% include copy-clipboard.html %}
~~~ sql
SELECT 1 + 1 / 2
~~~

In CockroachDB, integer division results in a `decimal`. CockroachDB instead provides the `//` operator to perform floor division.

**Porting instructions:** Change `/` to `//` in integer division where the result must be an integer.

### Shift argument modulo

In PostgreSQL, the shift operators (`<<`, `>>`) sometimes reduce their second argument modulo the bit size of the underlying type. For example, the following query results in a `1` because the int type is 32 bits and `32 % 32` is `0`, so this is the equivalent of `1 << 0`:

{% include copy-clipboard.html %}
~~~ sql
SELECT 1::int << 32
~~~

In CockroachDB, no such modulo is performed.

**Porting instructions:** Manually add a modulo to the second argument. Note that CockroachDB's [`INT`](int.html) type is always 64 bits. For example:

{% include copy-clipboard.html %}
~~~ sql
SELECT 1::int << (x % 64)
~~~

### Locking and `FOR UPDATE`

<span class="version-tag">New in v19.2:</span> Because CockroachDB only supports [`SERIALIZABLE` isolation](architecture/transaction-layer.html#isolation-levels), locking is not required to order concurrent transactions relative to each other. The `FOR UPDATE` [locking clause](sql-grammar.html#locking_clause) is supported in [selection queries](selection-queries.html#parameters) for database migration compatibility only. As a no-op clause, `FOR UPDATE` does not provide stronger, PostgreSQL-compatible locking guarantees for uses unrelated to performance, such as applications that require locks to protect data from concurrent access altogether. CockroachDB uses a [lightweight latch](architecture/transaction-layer.html#latch-manager) to serialize access to common keys across concurrent transactions.

As CockroachDB does not allow serializable anomalies, [transactions](begin-transaction.html) may experience deadlocks or [read/write contention](performance-best-practices-overview.html#understanding-and-avoiding-transaction-contention). This is expected during concurrency on the same keys. These can be addressed with either [automatic retries](transactions.html#automatic-retries) or [client-side intervention techniques](transactions.html#client-side-intervention).

### SQL Compatibility

See the full list of [supported SQL features in CockroachDB](sql-feature-support.html).
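As a worked illustration of the "manually add parentheses" porting instructions in the two precedence sections above, the queries below are a minimal sketch; the constants are arbitrary:

~~~ sql
-- PostgreSQL parses ~1 + 2 as ~(1 + 2); to preserve that meaning in
-- CockroachDB, where unary ~ binds tightly, write the parentheses out:
SELECT ~(1 + 2);

-- PostgreSQL evaluates 1 | 2 # 3 left to right as (1 | 2) # 3; to preserve
-- that order in CockroachDB, where # binds tighter than |, write:
SELECT (1 | 2) # 3;
~~~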
48.51145
650
0.736271
eng_Latn
0.98129
dbf5f86c28a517cde2d4f0eebfaaef75187ee842
14,868
md
Markdown
huawei/2016/20160627-任总与Fellow座谈会上的讲话.md
fenwii/huaweimind
62d6a7031ccd5ead3b14e6aa4c67f90167c6de6d
[ "MIT" ]
41
2020-04-15T10:29:15.000Z
2022-03-27T06:19:12.000Z
huawei/2016/20160627-任总与Fellow座谈会上的讲话.md
Oryxmarina/huaweimind
62d6a7031ccd5ead3b14e6aa4c67f90167c6de6d
[ "MIT" ]
null
null
null
huawei/2016/20160627-任总与Fellow座谈会上的讲话.md
Oryxmarina/huaweimind
62d6a7031ccd5ead3b14e6aa4c67f90167c6de6d
[ "MIT" ]
28
2020-06-17T01:39:08.000Z
2022-03-27T06:10:51.000Z
**Office of the CEO Email**

Email Speech [2016] No. 069    Issued by: Ren Zhengfei

**Mr. Ren's Remarks at the Roundtable with Fellows**

May 5, May 6, May 17, and May 18, 2016

Human society is at a turning point. Within the next twenty to thirty years it will become an intelligent society, a society of information explosion. This period is full of enormous opportunity, but struggle without direction and without strength produces no value. Without correct assumptions there is no correct direction; without a correct direction there is no correct thinking; without correct thinking there is no correct theory; without correct theory there will be no correct strategy. Today nobody knows what the structure of future society will look like, but we can make assumptions. If we assume that traffic will keep growing, that gives us an opportunity. We cannot bet on a single direction like a small company; we must research multiple paths with multiple echelons. At our UK Research Institute I once spoke about future working methods, but not about the direction of the work. Our company now has strength, but is our direction correct? What will future society look like? Let us explore this together and hear everyone's views; call it "a cup of coffee absorbing the energy of the universe." Experts, please speak freely.

**I. Huawei holds to its pipe strategy, opens up and cooperates, unites every force that can be united, explores and researches future directions, and gets a grip on uncertainty.**

**1. Participant:** First question: human intelligence comes from learning, and learning rests on algorithms and data. If we do not touch applications above and do not touch data below, how do we learn? Second question: telecom customers are our granary, and a big opportunity and challenge we face is NFV and NFC. Looking back at our NFV discussions, there were two camps. The conservatives said that as long as we and Ericsson jointly refused to do NFV, the telecom operators would have no way forward. The radicals said we should be aggressive and actively take on the challenge. What measures does the company have to defuse the threat NFV poses to us?

**Mr. Ren:** We want to build a pipe operating system: it operates the pipe below; the middle platform above it is network integration; higher still it must open up capabilities and bring in all content, achieving three-point interconnection of the pipe, so that any two points can be connected through one transit point. Our network already covers one third of the world, so it is possible to reduce our internal forwarding. With fewer forwarding hops, price and cost come down and speed goes up. The pipe operating system "touches no content above and no data below"; it is only responsible for carrying the flow of information, without knowing what we deliver. As long as we delivered it, we charge for it, junk information included. Some say we should filter the junk, otherwise traffic will grow too large. But if we start judging the usefulness of data, we become a content company and must win two wars at once: transporting information and filtering information. Does our company have the ability to be the best at both? If we are not the best in one of those wars, could that cause overall failure? Nor can we use other people's data to generate new data for our own business; that would raise national security issues.

When we say the pipe operating system "touches no content above and no data below," we are not building two concrete partition walls for isolation. In supporting others we must fully understand customer needs, including the content needs they bring. We blend in, serve the content well, and let content run through our middleware. Data runs through our platform and is returned to the data owner; content runs through the platform and is returned to the content owner. Like a bank, money flows through but is not owned; the banknotes all belong to others.

Apple is the best service provider. Human society has seen two integrations. The first was horizontal: IBM pushed compatible machines, Intel invented the 286/386/486 and so on, cutting off Apple's retreat; Apple later gave up, and compatible machines spread computers across the whole world, a great contribution to today's information civilization. The second was Apple's vertical integration: several million applications combined in a phone, the internet moving with the person. Apple does not touch content either. Compared with Apple, our company still has weaknesses. Can we spend three years catching up a little? It is still hard to say.

**Li Yingtao:** "Not touching applications above and not touching data below" means we do not own the data. For example, when we help a telecom customer reduce churn, we place our algorithms and platform at the customer's site and use the customer's data to solve the customer's problem. Banks are similar: we put our algorithms and platform into the bank, running in the bank's data center to help the bank solve its problems.

**Xu Zhijun:** Different companies treat data differently. Internet companies try every means to own data and build all kinds of things on it, but IBM does not own users' data. IBM's cognitive computing mainly serves enterprises, and the enterprises it serves do not hand their data to IBM. Back to what we call artificial intelligence or cognitive computing: first of all it is a technology, and many future scenarios will need it. Helping operators build networks and grow their business needs this technology; it is everywhere, and we provide complete solutions to help customers create value.

**Ding Yun:** On data-related investment, we have something called "TelcoOS." We provide customers with tools but do not own the data. Through TelcoOS, the network's bandwidth capabilities, data capabilities, and service capabilities are opened to third-party partners, who develop applications on top of TelcoOS. On the NFV and NFC question, I agree with one view: an egg broken from the outside is a fried egg; an egg broken from the inside is a new life. I do not like NFV or SDN either, because they will upend the structure and architecture of our whole communications network, but I refuse to be a "fried egg" smashed from the outside. Embracing challenge and embracing disruption is our attitude toward SDN and NFV.

**2. Participant:** On the pipe question, platform internet companies like Google have also done much exploration. But new technologies may conflict with operators' new business models, for example solving the bandwidth problem technically. Investment in new technology carries uncertainty; how do we manage it?

**Mr. Ren:** More than a decade ago, former Nortel CEO Owens proposed cooperating with us on low-earth-orbit satellites, essentially the scheme Facebook and Google propose today. We did not do it, because satellites might involve military industry: we touch no military business above and no state secrets below; a private company keeps to its proper role. So, will something appear that upends today's telecom networks? I believe it will. Motorola's Iridium plan was actually a good plan, but then optical fiber appeared and overturned it. Can humanity's telecom networks not be overturned too?

What we speak of now is a pipe strategy, not a telecom strategy; you will have noticed our slogan has changed. Whatever form information flows take, perhaps streaming down from the sky, high-frequency and narrow-band, some applications may disrupt the internet's low-frequency, ultra-wideband ones. Even if the operators are disrupted, we must survive. As Ding Yun said, we must break out of the eggshell ourselves and produce a new life, rather than be smashed from the outside into a fried egg. Ding Yun says the new life is a "chick"; I say it is a "little peacock." Why?

First, "a cup of coffee absorbs the energy of the universe." The first to fifth centuries AD were a flourishing period of human civilization. There was no internet and no telephone then, but do not think it was backward: democracy, the Athenian code, Roman law, parliamentary systems all came from that era, because anyone could stand in the Roman Forum and state their views, and geniuses came in batches. The Xinsheng Community (our internal forum) is a "Roman Forum," and STW should become a "Roman Forum" too. Xinsheng is healthy on the whole; it lets people raise opinions freely and without liability, spreading understanding of Huawei's culture. People "talk nonsense" on it and comment on everything we say; those posts are future generals shining. I do not need to know who is behind a screen name, but I know Huawei has talent.

Second, in this "coffee cup," with you at the core, unite all the world's scientists moving in the same direction, and play down badge culture. If those scientists make the same contribution as you, give them the same compensation. We can try "crowdsourcing" talent: especially outstanding people come in fast and leave fast, and we do not hold on to them for life. We do not ask that they belong to us; we do not restrict their personal or academic freedom; we do not take their papers or patents; we only ask to cooperate with them. I hear that some departments, when cooperating with American university professors, attach many extra conditions. Do not do that. Supporting a university professor, having a coffee and a chat, hearing his talk and understanding the significance of his paper can give us great inspiration. In the evenings, when you have nothing on, read books across disciplines, biology and the like. You may not create inventions there, but it strengthens your understanding of other fields, so that when an expert from another discipline chats with you, you can sense the value and use of his learning.

Some professors will never work at our company, but they have many doctoral students we can absorb. These PhDs understand the professor's research, keep up technical exchanges with their teacher, and tie us to the professor. When we succeed, we note that the success came from that teacher, and he shares in it. Fame and profit, with each side harvesting one, do not conflict; does that not make us partners?

Third, the "coffee cup" should hold not only learned scientists but also a few "misfits" stirring things up; perhaps the "little peacock" jumps out from among them. Let me tell a story about genes. After Mendel-Morgan discovered the gene from growing peas, for two hundred years nobody in the world understood it; only after two hundred years did genetics slowly take off. On the road of science we must not suppress people with different views; that is what we mean by "multiple paths." Only with different views is it truly multi-path, and the road ahead grows ever broader. Embrace the world, and dare to swallow mountains and rivers.

We hope "black swans" will also swim in our "coffee cup," even though under our current frame of thinking the black swan is not yet in our cup. First we must shed the "peasant mentality." When you go for coffee with someone, bring a bottle of good wine; when cooperating with a professor, do not pile on demands; just ask whether, at project kickoff and at failure, he will give us a couple of lectures, and we will have a few coffees along the way. Drink coffee with a few hundred people, digest the ideas of a few hundred people, and we will lead the world. If you do not grasp this, you will miss the black swan when it is about to appear.

Ding Yun says "embrace challenge, embrace disruption." We should not fear disruption; when the real challenge appears, dare to step up and embrace it. Human society is about to transform, and struggle without direction and strength has no value. Small companies lack strength; some big companies have strength but no direction; Huawei has strength and is exploring direction, so why can we not lead the future? Five or six years ago I proposed seizing "Shangganling." By "Shangganling" I mean the high-tech high ground where Huawei contends with the United States for leadership, not the Shangganling of fifty or sixty years ago. And we are on the offensive, not the defensive. The US is still far stronger than we are, and its power to absorb the world's innovation is immense. For example, using general-purpose processor chips plus software to disrupt the communications field carries great force; do not take it lightly. Competing with the US is truly a hard problem, but we must have a goal to strive for!

**Xu Zhijun:** As scientists and experts in your fields, you should pursue scientific truth and not sit on a parochial "seat." For instance, some experts, whatever they do, always worry whether their own department's interests will be harmed; others worry that some innovation of ours might hurt the interests of a certain class of customers. None of that is how one pursues scientific truth. Only by pursuing solutions to real problems can there be creation and innovation, creative solutions, and inventions.

**3. Participant:** Once we start having this scientists' culture, things will be much better than before, because we used to be engineering merchants.

**Mr. Ren:** Today we are in fact still engineering merchants. Even our innovation is innovation in the engineering domain, not in technical theory, because we cannot yet touch the feet of theoretical innovation; but we can reach the scientists and professors in those technical fields. That is a direction for our advance: culturally, we must first take a step. As you said, we have not reached that level in the past; we must pay attention to it in the future.

**4. Participant:** You have long urged technical experts to go out, drink coffee with people outside, and absorb the energy of the universe. Huawei is a results-oriented company, and in external technical cooperation we weigh delivered results heavily, whereas foreign companies have relatively looser research environments. How does Huawei balance the returns of external technical cooperation against absorbing the industry's ideas? Some industry gurus are better suited to providing direction and sparks, and deliver less.

**Xu Zhijun:** In recent years we have stressed stronger investment in research cooperation. Historically, external technical cooperation was mainly done by the product lines: cooperate on whatever is missing. The product lines must deliver products, which is understandable. Now we must increase investment in cooperation in research and innovation. The company is gradually increasing research funding, the same approach as Western companies. Looking long term, we will open new cooperation models around innovation directions and invest more in research, so we can lock in long-term cooperation with professors.

**Mr. Ren:** We must build strategic partnerships across the industry chain for win-win development. On devices, for example, we should bundle in the world's best technologies. Our cooperation with Leica: can we take it a step further and also give them the algorithms developed by our mathematics institute, forming a strategic partnership, a spiral relationship? We should also bring in the world's best audio makers. Huawei cannot dominate the world alone, still less become an international orphan; we must cooperate with the world's excellent companies. We should lower the threshold for research and pre-research, because it is all uncertainty; let scientists make more of their own decisions, within bounds of course. In product development we must focus on areas of high technical content and difficulty, which small companies find hard to reach. Do not build low-threshold things; they easily spawn internal start-ups. If we cooperate on the acknowledged best modules, we possess the world.

**5. Participant:** The article has a chart saying "product lines must develop against certainty." Our understanding is that nurturing disruptive forces needs three elements: technological innovation, seeing the big-data direction clearly, and investment. These three share a key property, timeliness; in other words, being grounded. Only combat organizations on the front line, facing customers directly and harvesting grain directly, can seize that timing.

**Mr. Ren:** We have two decision systems: one is an idealist system centered on technology; the other is the realism of strategic marketing centered on customer needs. The two systems debate hard in the middle and then reach a compromise on development goals.

**6. Participant:** If the thought-leading scientists all sit at the top, does the 2012 Laboratories only need to do verification, with no scientists needed?

**Mr. Ren:** We must connect the whole flow from ideas to services. Scientists who do verification are also scientists, and delivery and services should have Fellows too. The "thought scientists" are an abstract organization, not a concrete entity; it has only a secretariat and no staff. That is, anyone may come to its regular meetings, including the "pre-doctorate." By pre-doctorate I mean farmers, workers, waiters who never did a PhD. If you possess certain ideas, we include you too.

The future direction of the US Research Institute is to be an institute of ideas and a software institute. Exporting hardware and chips from the US to China may meet many obstacles; we cannot be sure. But ideas have no borders. An idea need not be published in China; it can be published in an American journal and downloaded from the American web. And American employees can come back for a cup of coffee; does that not solve the problem? We draw ideas from advanced places to produce our present results. Software code is also thought: code is thought described in text, an embodiment of thought.

**7. Participant:** Your article says we have opened up innovation of ideas and experiments in ideas. Compared with socialized, market-driven innovation, which is more efficient or more open?

**Mr. Ren:** My view of values is that every factor of production must share a reasonable benefit and return. "Liberty, equality, fraternity" is fine, but it does not define who bakes the cake; without a cake, how can there be liberty, equality, fraternity? People say network equipment will become white-box and cheap, but who makes the "free lunch," and who maintains it? Can white-box network gear achieve very high quality and very good maintenance? When a factor receives no value, that factor collapses. A free lunch violates the laws of the market economy: earn nothing here, scrape some money there. That kind of innovation is "business-model innovation," while America does "technology innovation." Is business-model innovation good? I will not say yet. When Japan's twenty-year financial crisis bore down, underneath Japan were "cobblestones": Toyota, Panasonic, Sony, a whole cohort of good companies that held Japan up for twenty years without collapse. If China ever meets a financial crisis, what collapses? Shoddy "tofu-dreg" goods: fake clothes, fake oil, fake products. Can innovation become a big industry? Without theoretical breakthroughs, small tinkering leaves only a floor of "chicken feathers."

Our company must both hold the main channel and keep the wheels rolling, "a cup of coffee absorbing the energy of the universe." Only by resolutely attacking into the no-man's-land will there be no conflicts of interest. We expand fairly, and our rules for borrowing strength benefit the world's common development; big companies will not oppose us, and small companies cannot catch us, so complaining is useless. Only by resolutely attacking into the no-man's-land will we have no competitors and be able to fly freely. What is the no-man's-land? First, no one points out the road and direction ahead. Second, there are no rules and you do not know where the traps are; it is an entirely new field of exploration. In the past Huawei followed others and saved a great deal of road-opening cost; today we must open the road ourselves. Opening roads, we will inevitably take some wrong ones.

What is the future of wireless? We really have not defined it. What is the future of the network? We have not defined that either. So we do not even know where the no-man's-land is. The greatest value of wireless, I believe, is the last hundred meters: access. But how to make access more scientific and rational? That is not clear yet either. So I do not think wireless has entered the no-man's-land.

**II. In the future intelligent society we face the problems of low cost and low latency for huge information flows; we must dare to explore new theories and technologies.**

**(1) On large information flows and latency**

**1. Participant:** I summarize the future direction in four words: "big, cloud, mobile, intelligent." "Big" means large information flows, "cloud" means the cloud, "mobile" means mobility, "intelligent" means intelligence. Along this direction we should be able to do many things.

**Mr. Ren:** Well said; it may be one direction for traffic, but it cannot fully represent the direction of exploring humanity's secrets.

In my view, traffic cannot simply be expanded without limit like tap water, because water molecules are all identical; if the pipes are insufficient, add more pipes. Each "molecule" of our information traffic has a different origin, destination, and structure, so a large platform must allocate it. A large platform can serve as a very large warehouse, but how do you manage the junctions? That is a new discipline.

Before that new discipline is understood, another problem appears. Suppose the future is an intelligent society: its biggest problem is traffic. Large traffic may be solvable, but latency cannot be solved. Many in the company say, "we have already solved the large-traffic problem through engineering mathematics and physical algorithms." Can latency be solved? No. Light travels at a theoretical 300,000 km per second, and in fiber at about 200,000 km per second; reaching America still takes 50 ms. First, physical latency. Our transmission method is IP forwarding, which produces line latency, and capacitance also produces latency; even if a circuit has no capacitors, the surface of a wire has capacitance, and semiconductors have capacitance too. Some say quantum communication can reduce latency, but can quantum communication carry large traffic? Not yet. Second, network latency. (Zhu Guangping: physical latency is 5 ms per 1,000 km, but the root cause of network latency is congestion, which cannot be completely avoided. This raises the problem that the strict latency of interactive VR and today's statistical multiplexing of IP may conflict in theory at the level of the network switching regime.) Today VR is only a few points; when it spreads worldwide, congestion will be worse, and it is random. Regularities can be unlocked with algorithms, but randomness is hard to solve, because you cannot build a network infinitely large; an infinitely large network does not exist, and we lack that capacity. Third, latency introduced by storage.

So latency will always exist and may be the hardest problem. VR needs low latency, and we cannot yet deliver it; perhaps new scientific theorems will be invented later, but there are none now. Therefore we must understand the laws of VR/AR industry development rationally and keep strategic patience. If AR/VR develops too fast, a bubble will appear. Why will VR have a bubble period? The key is that nobody can solve latency. So what we emphasize is working hard on basic research and taking the road of prevailing from behind, ready to "pick up fish in the shallows." Fishing in muddy water, only the strong catch fish; without capability you catch nothing, even in the shallows.

**(2) On devices**

**2. Participant:** Over the past decade smartphones released enormous energy. In the first quarter of this year Apple's handset sales declined. How do you see the future development of devices?

**Mr. Ren:** The future may be a software world; can you grab a handful of it? All human wisdom is displayed through devices (not only phones), so the prospects for devices should be just beginning to unfold. We do not fully know; people may feel satisfied for a stage, and then new methods and new steps may keep appearing. Devices are the display that civilized human society needs most; they cannot lack a future. We simply have not invested enough yet and have not fully grasped the opportunities in human society's development.

Apple is very rich but too conservative; we have no money yet invest madly, pretending to be rich. We have no money and still dare; Apple has so much money, so why does it not dare? If Apple keeps leading human society forward, we can follow it; if Apple dares not spend, it can only follow us, and we will become as rich as Apple.

I believe that one day we will succeed; the "peach tree will bear watermelons," though for now it bears only "plums."

**3. Participant:** Our smart watches and bands are expanding in Europe and the US, but cloud and services have not kept up. What is the future strategy for cloud deployment in Europe and the US?

**Xu Zhijun:** The Consumer BG has a plan, to be rolled out step by step on the basis of security and privacy protection.

**Ding Yun:** As long as security and privacy protection meet the requirements, you can deploy cloud in Europe and the US.

**(3) On 4G and 5G**

**4. Participant:** Network signal is unstable from region to region. Could the future network become dynamic and follow the device? This may be a research direction for future networks.

**Mr. Ren:** I think this falls within competition among quality operators. Operators should first deploy their networks well rather than always chasing 5G; today's 4G deployment is still not at its best, and in the core areas of some cities signal coverage is weak, never mind high-speed data. So stable network signal in the future does not depend purely on ideas about demand and on technology, but on changing the business model for demand, monetizing traffic, and having a competitive mechanism among operators. Once traffic is truly monetized, the supply of bandwidth will grow ever stronger.

**5. Participant:** How will we deploy 5G? The pipe will get wider and wider, and we know AR/VR will stretch the pipe. What is our deployment strategy?

**Li Yingtao:** From 2009 to now, 5G research has achieved fairly good preliminary results. But forming a global 5G standard only started this year. By 2020 the standard may be frozen and the first batch of commercial deployments completed. 5G will not only handle person-to-person communication; it may also solve machine-to-machine and machine-to-person communication for various industries. The room for imagination in 5G's future is large. But, realistically, much work remains on 4G. Only after 2020 will 5G really be pushed forward.

**(4) On software**

**6. Participant:** The company has invested heavily in SDN. What are the main directions and strategy after SDN? "Man and nature" emphasizes distributed natural transactions; what big-direction thinking does the company have on the datafication of traffic?

**Mr. Ren:** In the future, human society will have only two things left: emotion and numbers, and between numbers and emotion there must be a connection, which is Huawei. How does Huawei connect numbers and emotion? Through SDN.

You speak of "post-SDN"; we have not even figured out pre-SDN, so why talk about post-SDN?

**Ding Yun:** Now is not the time to talk about post-SDN. In the next five to ten years, the most important thing is to embrace the SDN opportunity, embrace the changes in network architecture and the equipment upgrades it brings, and win our position in the network, realizing what the boss said about connecting numbers and emotion. SDN will drive a restructuring of network architecture across the whole industry chain, and we still have large gaps here.

**7. Participant:** I believe the core value of many problems ultimately lies in data. What does data technology mean for Huawei's future, and what strategic thinking do we have on data technology?

**Xu Zhijun:** In cloud services and networks we are explicit: "touch no applications above, touch no data below." But data technology is developing fast. For the SDN and cloudification we want in the future, distributed databases are the most core technology.

**(5) On chips**

**8. Participant:** Much of wireless's competitiveness is built on chips. After the semiconductor industry reaches 7 nm, where does it go? How can the whole industry keep up its rapid growth? I would like to hear the leaders' views.

**Mr. Ren:** We can use stacking and parallel schemes to handle this problem. It is clumsier, but before new technology emerges it can serve as a substitute.

**Xu Zhijun:** The Belgian semiconductor research institute gives us strong confidence that within the visible ten years, semiconductor technology can support us in moving forward. He Tingbo predicts an inflection point after 2025, when new technologies may appear. We must stay ahead on existing technology while ensuring we can respond when the inflection point arrives.

**9. Participant:** Intel's mobile chip business and Nokia's handsets: both failed in the fields they were best at. What are your thoughts on this?

**Mr. Ren:** Network standards go from simple to complex, and as technology advances, standards become simpler again. In that alternation, "black swans" are easily produced. Intel's mobile chip business may not have succeeded because its understanding of communication standards was insufficient. Cisco used to be so rich; why did it not enter wireless? We today truly have no money, because we distributed it all to everyone.

Capital lends a hand in creating the world, but the most important thing is that labor creates the world. We have benefited from twenty-some years of reading those standards, which have merged into everyone's heads in the company. Weigh each head and allocate shares accordingly, and you have our new mechanism. Two people may have left, but the standards system remains, and the people who read standards are formidable. If the company disbanded one day and then regathered, none of the old systems would exist, because a standard needs living people to carry it on. Management is a lifeless system; the standards system is a lifeless system; without living people to support them, the whole system washes away. So Huawei must not collapse; otherwise the management system built over decades at a cost of tens of billions of dollars would be useless, and so would the understanding of technical standards we have formed.

Any other company wanting to enter this field must understand network standards very deeply. Why did Nokia's cooperation with Microsoft not succeed? Nokia was too confident that only with Windows could it succeed. Huawei today also ties in with Windows, but in a different way, and perhaps we will succeed. Times change, and the world is hard to predict. We cannot yet declare Intel's mobile chip business a definite failure, because nobody can say what the future phone will look like. So we must stay open and blast open the "tip of the pyramid."

**10. Participant:** We used to process data by moving it from disk and memory into the CPU. Now there is another idea: put CPUs wherever there is data. The industry is now calling this data-centric computing, or NDP-based computing, a disruption of the computing architecture. Under this idea, Intel and Samsung diverge: Intel makes CPUs and wants to put memory into the CPU; Samsung makes memory and wants to put the CPU into memory. The industry's directions have begun to contend.

**Mr. Ren:** That contest is social progress. Our company has money now and can also run its own internal contests. How do we get richer? "A cup of coffee absorbs the energy of the universe": wrap in the world's scientists moving in the same direction and generate greater energy. We should study both approaches and keep pace with the times.

**(6) On artificial intelligence**

**11. Participant:** Everyone now talks about AI. What is Huawei's view of this field, and does it plan to enter in the future? Philosophically, God created humanity; now humanity will create a new humanity. Will the new humanity replace us?

**Mr. Ren:** The new humans that humanity creates may indeed replace us real humans; that was Hawking's and Bill Gates's earlier view. We see the negative side, but we should also see the positive. Human production and services may become artificially intelligent; the great material and spiritual abundance we once hoped for may be realized, at least the spiritual abundance. For example, a future new human could read all of Shakespeare in one second, finish the books of the American library in two seconds, and learn hundreds of languages in three seconds. A real human may die after living eighty years, but a new human can separate soul from body, store the "soul" in a database, and swap in a new machine body, becoming a twenty-year-old girl with eighty years of wisdom; eighty years later, a twenty-year-old girl with 160 years of wisdom; and later still, perhaps a millennium of crystallized wisdom. TV dramas created by such humans would contain no absurd war fantasies, because they understand history and science. New humans will surely be wiser than real humans and will make our production and services intelligent.

**Li Yingtao:** Huawei does have a layout in artificial intelligence; Noah's Ark Lab in the 2012 Laboratories researches it. Given our principle of "touching no applications above and no data below," how do we apply AI? We have in fact tried it in telecom: we can quickly help customers find problems on the network and help customers increase stickiness with their users, bringing operators real value. We can also make our own engineering, delivery, and processes artificially intelligent. We simply have not publicized this research much.

**(7) On network energy**

**12. Participant:** Many of Huawei's past successes relied on mathematical algorithms and tools, for software and hardware alike. The deeper we research energy, the more we find it involves physics and chemistry.

**Mr. Ren:** The company's main channel is to conquer the routing of huge information flows. The biggest difficulty in big data is heat, and the biggest problem in hardware engineering and electronic processes is heat dissipation. One expert says that in the future 50% of energy will be consumed by chips; the mechanisms of heating and cooling may also be the most core competitiveness of electronics. Chip heat has no value whatsoever: however much heat is generated must be dissipated, and heating and cooling are equally major fields of research. So heating mechanisms and heat dissipation are key challenges of big-data transport, and we must invest more research in them.

In Ukraine I talked with professors from many institutes; they said heat dissipation can be solved entirely with mathematics. Our heat pipes: we cooperate with no one, so what level can we reach working with our heads down alone? Heat conduction is aerospace technology; why can we not cooperate with scientists from famous aerospace research institutes? Ukrainian and Russian fighter jets are machined less precisely than American ones, so they deliberately make the wings a bit "rough," letting a laminar layer of air adhere to the wings; the adhering laminar flow lubricates them to fly as fast as American fighters. Without excellent simulation capability that would be absolutely impossible; that is high-altitude aerodynamics. So think about it: what is our company's principle of heat dissipation?

We must focus research on heating mechanisms: why do chips heat up, and why do circuits heat up? Why so much heat; can it be brought down; how do we carry the heat away? From that angle we then study why power supplies are needed, how much power is needed, whether less power is possible, and whether no power supply would work. We must solve the mechanism of chip heating and how to dissipate the heat. And can a phone be fully charged in a few seconds?

**III. As the times develop, Huawei keeps transforming. Only the united struggle of "strength from one aperture, benefit from one aperture" can make future success possible.**

**1. Participant:** My first question is about process innovation. We are a technology company, but the course of technological innovation is changing; one change is that it is increasingly cross-disciplinary, one example being the innovation across engineering and human perception shown by new-generation employees. In silicon photonics we are very strong, but in human photonics we are not strong enough. How do we reshape Huawei to fit cross-disciplinary organization and structure, so that we can take part in this form of innovation? The second part of the question is somewhat practical: we have central hardware, central software, and networks, our very strong areas. But I believe we need the other disciplines of human perception, as well as technology-integrated biology.

My second question is about the changing mode of innovation. Science and technology develop very fast; some technologies fail before they even reach the market. Yesterday the rotating CEO said we must tolerate failure, but we must be able to fail very, very fast and also succeed very, very fast. In reality we find it hard to move very fast, because our processes are very slow: decision processes, setting up small agile teams, experimental work with well-known professors, cooperation with start-ups.

**Mr. Ren:** Let me answer in reverse order.

First, who destroyed Sony? A KPI-driven high-performance culture. We live in an era of innovation; turning both uncertain and certain work into processes suppresses the emergence of new things. First, we must affirm that Japan is a great country that has carried standardized management down to the grassroots: the screwdrivers, parts, and tissues in a workshop are all arranged precisely, and young workers entering must strictly keep those rules, so their impulse to create is gone. Britain is also a great country; the culture it exported to the world is rules, but Britain pushed process rules to the very end points. America was settled by a band of heretic migrants who tore up the British system: the large legal framework is standardized, but the end points are unmanaged, so America mutated British culture and created a brilliant two hundred American years.

Our company came out of chaos. Had we not taken the road of process and rigorous performance evaluation, today we would be Brownian motion, every molecule jiggling, forming no driving force. After we standardized, the pipe flows hard and is squeezed through a "de Laval nozzle": a compressible flow squeezed past the speed of sound expands into an ever-larger area at ever-higher speed; that is a rocket. Rocket engines are based on the de Laval nozzle. We standardize first, then open up.

Huawei squeezes everyone through one bottleneck, and that bottleneck is our values. After the squeeze we open up, everyone runs faster and faster, and the Huawei "machine" is pushed forward. We are changing, letting everyone's talents play out and everyone's thinking come alive.

Second, the future world will certainly innovate across disciplines, but Huawei cannot possess so many cross-disciplinary talents, which is why Hu Houkun proposed playing down badge culture. We used to say "narrow band, high amplitude": each person has enormous energy in some area but may know a narrow span; piece many people together and we become "broadband, high amplitude." How do we become cross-disciplinary? Only through all kinds of combinations; that is our new research approach going forward. As Xu Zhijun said, I hope the Fellows and experts will take a knife, cut off their parochial "seat," and go collide with ideas around the world. When Japan's cherry blossoms bloom, why not invite French scientists under the cherry trees to drink with Japanese scientists? When France's lavender blooms, why not invite Japanese scientists to sit in the lavender fields and drink French red wine? Frequent collisions of thought may produce new "sparks." Of course, another way to produce sparks is cooperation with university professors.

Third, from now on we should not casually use the word "failure"; use the word "exploration," because one is a hero in victory and in defeat alike. In any closed project that did not take a wrong road, analyze the successful experience or the causes of failure; even telling us "this road is blocked" is exploration. Move the person who wrote that summary to another project, and the artillery will hit more accurately.

We must acknowledge heroes. How hard it was for the explorers of Mount Everest; climbers today still find the remains of explorers from more than a century ago. Are they not great heroes of humanity? In the Apollo program, Apollo 13 aborted its lunar landing when the service module's liquid-oxygen tank exploded; those three astronauts were heroes too.

So we must look at success and failure anew; the experience of failure is precious wealth to us. Failure is also learning, and the most precious learning. "How many defeats can one life hold": nothing can be done over. If we keep the strivers from failed projects, we will not fail next time. HR departments especially must look at this question anew; the evaluation system cannot be rigid and should allow flexible methods of assessment.

**2. Participant:** The company's hiring process is currently very long and needs many leaders' signatures; hiring one person overseas takes two months. Do you have new thoughts on HR recruitment?

**Mr. Ren:** First, HR directors must come from R&D staff, project staff, and business staff, people who can fully understand projects; HR must be produced from business people. Three years ago we set the policy that "bonuses should be granted promptly, from the bottom up," yet at year-end it is still "sit in rows and share out the fruits." It does not move, because HR does not understand the business and does not know how to evaluate people. So HR must understand the business.

**3. Participant:** Ten years ago Google open-sourced Android; six years ago it entered autonomous driving; it laid out its positions in advance. Musk recovers rockets, and the success we see now came after dozens of failures. Looking toward the future, we have not been through such twists, and the capability gap is still large. Over the next twenty to thirty years, what is the path for raising our capability to grow?

**Mr. Ren:** Future society changes very fast, and no single person's wisdom can sustain our development, so we gather everyone's ideas. We will now open a platform for free discussion, allowing experts to "talk nonsense," like the Xinsheng Community. Whether this platform will open to ordinary employees and university students in the future, we might try over a Spring Festival holiday, letting employees browse at home and seeing whether wild new ideas emerge. We do worry that employees who are not yet mature will merely daydream. Breakthrough requires some depth; only solid, down-to-earth work yields breakthroughs.

Although we can hardly predict Huawei's place in future society, I believe that only striving offers a future: striving may not succeed, but not striving certainly will not. So we keep rowing forward. First, our company has a business platform of united struggle, "strength from one aperture, benefit from one aperture," a model unique in the world. Second, we are not a listed company; every year we fire 20 to 30 billion US dollars of shells at the gap in the city wall. No listed company would invest that much, because shareholders would not agree. Our company wants life, not money.

Many companies in society are rowing too; internet companies work even more overtime than we do, but what they realize is personal value. They cannot dig out a Yangtze River; a small reservoir beside the Yangtze is possible. We are a great river.

**4. Participant:** How do you see Huawei's development over the next twenty to thirty years?

**Mr. Ren:** Huawei's future development is that we will certainly stay alive, and certainly succeed! Because seven or eight years ago we already blasted open the top of the talent "pyramid." A pyramid is a closed model: the pyramid is only as big as the vision of the person at its tip. Now the tip is blown open and a great many elites are combined; the strategic direction and the way forward are explored by everyone together rather than judged by one person on the state of play. The biggest problem for a large force is that its direction must not be wrong. Unclear strategic goals and daily toil: that was the Huawei of the past; now our leaders must gaze at the stars. When we widen our vision, the main force is less likely to err in direction, and if we do not err, we will not perish.

You ask about Huawei's development over the next twenty or thirty years. I too hope Huawei can exist another twenty or thirty years, but even three to five would satisfy me.

**5. Participant:** If in the future one Chinese company leads the world, everyone who knows Huawei knows that Huawei has a ninety-percent chance.

**Mr. Ren:** You say a Chinese company may lead the world in the future; I am sure it will not be Huawei, because Huawei is a globalized company, not a Chinese company.

Why such a narrow sense of honor? Do not keep thinking of the glory of being the leader; do not shoulder that heavy slogan and burden. Honor is useless to us. When we say we will lead the world, it is to lift everyone's confidence and spur everyone to strive and do better. We are actually quite clumsy, but we relied on a big platform to succeed. That success is so we can earn a bit more money for our wives, not for world honor, and not to be the world's leader.

**6. Participant:** Huawei is already very successful. What factors brought us to this point, and what still keeps you awake at night?

**Mr. Ren:** What marks success? Of the world's 68 strategic high grounds, we have entered only three to five; how is that "success"? In many developed countries we have not yet entered the mainstream operators; we have entered some non-mainstream operators, and even then not their mainstream markets. How can that be called "very successful"?

I feel no such pride. If you feel proud, you should go to the battlefield and create success for us.

**7. Participant:** You said earlier that America is bigger than the world. I would like to hear what a world bigger than the world is.

**Mr. Ren:** The world bigger than the world is your heart.

Submitted to: members of the Board of Directors and the Supervisory Board
Distributed to: all employees; fully public

June 27, 2016
52.537102
499
0.84813
zho_Hans
0.600195
dbf63e749f26ad1eeeafc6eff54e9f67dec8a347
2,895
md
Markdown
source/pattern-library/communication/field-level-help/index.md
jeff-phillips-18/patternfly-org
4fc930af31612373bc44b1d8d6f518818a7d932c
[ "Apache-2.0" ]
null
null
null
source/pattern-library/communication/field-level-help/index.md
jeff-phillips-18/patternfly-org
4fc930af31612373bc44b1d8d6f518818a7d932c
[ "Apache-2.0" ]
null
null
null
source/pattern-library/communication/field-level-help/index.md
jeff-phillips-18/patternfly-org
4fc930af31612373bc44b1d8d6f518818a7d932c
[ "Apache-2.0" ]
null
null
null
---
title: Field Level Help
author: lhinson
layout: page
---
<h2>Overview</h2>
<p>Field level help, denoted by the information icon, can be used when you need to provide supplemental information. The text is displayed in a popover when the user clicks on the information icon. While field level help is not limited to forms, they are its most common use case.</p>
<p>This pattern should NOT be used when:</p>
<ul>
  <li>It compensates for bad design, relying on the popover to explain a graphic or word choice.</li>
  <li>The information displayed by the popover is required reading.</li>
</ul>
<h2>Example</h2>
<form class="form-horizontal">
  <div class="form-group">
    <label class="col-sm-2 control-label" for="textInput">
      Default
      <a tabindex="0" role="button" data-toggle="popover" data-trigger="focus" data-html="true" title="" data-content="Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et &lt;a href='#'&gt;dolore magna aliqua&lt;/a&gt;." data-placement="right"><span class="fa fa-info-circle"></span></a>
    </label>
    <div class="col-sm-10">
      <input type="text" id="textInput" class="form-control">
    </div>
  </div>
</form>
<p class="reference-markup"><a class="collapse-toggle collapsed" data-toggle="collapse" aria-expanded="false" aria-controls="sparkline-markup" href="#sparkline-markup">Reference Markup</a></p>
<div class="collapse" id="sparkline-markup">
<pre class="prettyprint">&lt;form class="form-horizontal"&gt;
  &lt;div class="form-group"&gt;
    &lt;label class="col-sm-2 control-label" for="textInput"&gt;Default
      &lt;a tabindex="0" role="button" data-toggle="popover" data-trigger="focus" data-html="true" title="" data-content="Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et &lt;a href='#'&gt;dolore magna aliqua&lt;/a&gt;." data-placement="right"&gt;&lt;span class="fa fa-info-circle"&gt;&lt;/span&gt;&lt;/a&gt;
    &lt;/label&gt;
    &lt;div class="col-sm-10"&gt;
      &lt;input type="text" id="textInput" class="form-control"&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/form&gt;</pre>
</div>
<h2>Description</h2>
<a href="{{site.baseurl}}assets/img/field-level-help-callout.png"><img src="{{site.baseurl}}assets/img/field-level-help-callout.png" alt="field-level-help-callout" class="alignnone size-full wp-image-4230" /></a>
<ol>
  <li><b>Icon:</b> The help icon is the Font Awesome icon fa-info-circle, positioned to the right of the component. The icon is blue (hex #0099d3) to indicate that it is interactive.</li>
  <li><b>Text:</b> We recommend that the popover text not exceed three sentences. If needed, include a link to online resources. The popover supports HTML formatting.</li>
  <li><b>Popover:</b> It is recommended that the popover be dismissed on the user’s next click.</li>
</ol>
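<h2>Initialization</h2>
<p>Bootstrap popovers (which this pattern builds on) are opt-in for performance reasons, so the information icon will not respond to clicks until the popover plugin has been initialized. A minimal sketch, assuming jQuery and Bootstrap's popover plugin are already loaded on the page:</p>
<pre class="prettyprint">
// Initialize every element that declares data-toggle="popover".
// The trigger, placement, and content options are read from the
// data-* attributes shown in the example above.
$(function () {
  $('[data-toggle="popover"]').popover();
});
</pre>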
57.9
386
0.711917
eng_Latn
0.781526
7a8fecffe5cdcade5476fad236b65c65850216b4
2,799
md
Markdown
en/storage/instruments/s3cmd.md
YurMischenkov/docs
392cfea8bd676c5a99468f140eda33be782d135f
[ "CC-BY-4.0" ]
null
null
null
en/storage/instruments/s3cmd.md
YurMischenkov/docs
392cfea8bd676c5a99468f140eda33be782d135f
[ "CC-BY-4.0" ]
null
null
null
en/storage/instruments/s3cmd.md
YurMischenkov/docs
392cfea8bd676c5a99468f140eda33be782d135f
[ "CC-BY-4.0" ]
1
2019-08-15T12:32:47.000Z
2019-08-15T12:32:47.000Z
# S3cmd

[S3cmd](https://s3tools.org/s3cmd) is a command line tool (for Linux and Mac) designed for services that support the Amazon S3 HTTP API. The general [procedure for running commands](https://s3tools.org/usage) can be found in the official S3cmd documentation.

## Before you start {#preparations}

{% include [storage-s3-http-api-preps](../_includes_service/storage-s3-http-api-preps.md) %}

## Installation {#installation}

To install S3cmd, follow the [instructions](https://github.com/s3tools/s3cmd/blob/master/INSTALL) in the project repository.

## Configuration {#setup}

To configure S3cmd, use the `s3cmd --configure` command. The command will request values for the following parameters:

1. `Access Key`: Enter the ID of the key that you received when generating the static key.
1. `Secret Key`: Enter the secret key that you received when generating the static key.
1. `Default Region`: Enter `ru-central1`.

   {% note info %}

   Always specify the `ru-central1` region when accessing {{ objstorage-name }}. A different value of the region may lead to an authorization error.

   {% endnote %}

1. `S3 Endpoint`: Enter `{{ s3-storage-host }}`.
1. `DNS-style bucket+hostname:port template for accessing a bucket`: Enter `%(bucket)s.{{ s3-storage-host }}`.
1. Leave the other parameter values unchanged.

The client tries to establish a connection with {{ objstorage-name }} and get a list of buckets. If successful, it will return `Success. Your access key and secret key worked fine :-)`.

The `s3cmd --configure` command saves the settings to a `~/.s3cfg` file in the format:

```
[default]
access_key = id
secret_key = secretKey
bucket_location = ru-central1
host_base = storage.yandexcloud.net
host_bucket = %(bucket)s.storage.yandexcloud.net
```

If necessary, you can change these settings directly in the file. You can also specify settings when launching the client by using the appropriate parameters.

For the static website hosting control commands to work correctly, manually add the following parameter to the configuration file:

```
website_endpoint = http://%(bucket)s.website.yandexcloud.net
```

## Specifics {#specifics}

Keep in mind that S3cmd treats {{ objstorage-name }} as a hierarchical file system and object keys look like file paths.

## Examples of operations {#s3cmd-examples}

### Creating a bucket

```bash
s3cmd mb s3://bucket
```

{% note info %}

When creating a bucket, follow the [naming conventions](../concepts/bucket.md#naming).

{% endnote %}

### Uploading an object

```bash
s3cmd put local_file s3://bucket/object
```

### Getting a list of objects

```bash
s3cmd ls s3://bucket
```

### Retrieving an object

```bash
s3cmd get s3://bucket/object local_file
```

### Deleting an object

```bash
s3cmd del s3://bucket/object
```
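### Syncing a local directory

The operations above move one object at a time. For recurring uploads it is often more convenient to mirror a whole directory with S3cmd's built-in `sync` command. A minimal sketch — `local_dir` and the bucket path are placeholders:

```bash
# Upload only new or changed files from a local directory to the bucket.
# --delete-removed also deletes remote objects that no longer exist locally;
# omit it for a purely additive sync.
s3cmd sync --delete-removed local_dir/ s3://bucket/dir/
```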
28.561224
258
0.730975
eng_Latn
0.969295
7a905ad1d2197d926b0e72f545b590ab6bc96684
20
md
Markdown
README.md
hsunaing/hsunaing.github.io
0d4d10627fb482ea93e71f05e548e54ac176f3f2
[ "Apache-2.0" ]
null
null
null
README.md
hsunaing/hsunaing.github.io
0d4d10627fb482ea93e71f05e548e54ac176f3f2
[ "Apache-2.0" ]
null
null
null
README.md
hsunaing/hsunaing.github.io
0d4d10627fb482ea93e71f05e548e54ac176f3f2
[ "Apache-2.0" ]
null
null
null
# hsunaing.github.io
20
20
0.8
gle_Latn
0.1597
7a90a9f192b1b1d249c54f3e7e756f86d24c7ae2
851
md
Markdown
README.md
haykam821/Snakeroom-Website
0f0537a444eb6a577ce257e56ab0aac9d8821108
[ "MIT" ]
1
2020-07-20T02:40:47.000Z
2020-07-20T02:40:47.000Z
README.md
haykam821/Snakeroom-Website
0f0537a444eb6a577ce257e56ab0aac9d8821108
[ "MIT" ]
12
2020-04-02T01:59:08.000Z
2022-03-31T03:47:22.000Z
README.md
haykam821/Snakeroom-Website
0f0537a444eb6a577ce257e56ab0aac9d8821108
[ "MIT" ]
4
2020-03-27T00:34:32.000Z
2022-01-12T20:33:21.000Z
# Website

This repository contains the source code for [snakeroom.org](https://snakeroom.org).

## Contributing

To set up your local development environment:

```bash
git clone git@github.com:Snakeroom/Website.git
npm install
npm run dev # http://localhost:3000
```

We follow [Chris Beams' git commit style guide](https://chris.beams.io/posts/git-commit).

We use Prettier and ESLint to enforce code style. Make sure to run `npm run lint` before committing to lint and autofix your changes (or install the respective editor plugins for lint/format-on-save).

If your PR contains significant code changes (e.g. not just fixing a typo), add your name to the `LICENSE` file.

When merging pull requests, merge commits should be used rather than squashing or rebasing.

## Production

The repo is automatically deployed to Netlify upon commit to `master`.
31.518519
196
0.769683
eng_Latn
0.97135
7a90d347818d0ba08e5ea7c993dad3363f4b579a
242
md
Markdown
content/haskell/performance-bang-pattern.md
emmettng/emmettng
833b47bb42a4d2c5bb057f4bda28c6f85e4ccf27
[ "MIT" ]
null
null
null
content/haskell/performance-bang-pattern.md
emmettng/emmettng
833b47bb42a4d2c5bb057f4bda28c6f85e4ccf27
[ "MIT" ]
null
null
null
content/haskell/performance-bang-pattern.md
emmettng/emmettng
833b47bb42a4d2c5bb057f4bda28c6f85e4ccf27
[ "MIT" ]
null
null
null
---
title: "Performance Bang Pattern"
date: 2019-11-14T15:16:04+08:00
draft: true
---

#### Online resources

- [some blog](https://code-examples.net/en/q/f2758)
- [fp complete](https://www.fpcomplete.com/blog/2017/09/all-about-strictness)
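Since this draft has no example yet, here is a minimal sketch of the technique the links above discuss (the `mean` function is illustrative, not taken from either post). Without the bangs, the two accumulators build up long chains of unevaluated thunks on large inputs; with `BangPatterns` they are forced at every step:

```haskell
{-# LANGUAGE BangPatterns #-}

-- Strict accumulators: the bangs force `s` and `n` on each recursive call,
-- so no chain of pending (+) thunks can pile up and overflow the stack.
mean :: [Double] -> Double
mean = go 0 0
  where
    go !s !n []       = s / fromIntegral (n :: Int)
    go !s !n (x : xs) = go (s + x) (n + 1) xs
```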
22
77
0.698347
eng_Latn
0.133766
7a91086ce3feebbbb9999dc3aac11379a787d53c
8,854
md
Markdown
content/post/2017-11-23.md
aquilax/quantified.avtobiografia.com
142c230ab33b46d471732858fafeefbebf788803
[ "MIT" ]
null
null
null
content/post/2017-11-23.md
aquilax/quantified.avtobiografia.com
142c230ab33b46d471732858fafeefbebf788803
[ "MIT" ]
null
null
null
content/post/2017-11-23.md
aquilax/quantified.avtobiografia.com
142c230ab33b46d471732858fafeefbebf788803
[ "MIT" ]
2
2018-02-27T06:57:20.000Z
2019-07-21T13:30:58.000Z
{ "date": "2017-11-23", "type": "post", "title": "Report for Thursday 23rd of November 2017", "slug": "2017\/11\/23", "categories": [ "Daily report" ], "images": [ "\/photos\/2017-11-23\/20171123_151055.jpg" ], "health": { "weight": 79.3, "height": 173, "age": 13490 }, "nutrition": { "calories": 1131.81, "fat": 94.54, "carbohydrates": 28.34, "protein": 45.55 }, "exercise": { "pushups": 0, "crunches": 0, "steps": 5734 }, "media": { "books": [], "podcast": [ { "id": "38226", "url": "http:\/\/cast.writtn.com\/episode\/38226\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "Who Committed the 1912 Villisca Ax Murders?", "duration": "2963" }, { "id": "44081", "url": "http:\/\/cast.writtn.com\/episode\/44081\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "Who Committed the 1912 Villisca Ax Murders?", "duration": "2963" }, { "id": "38096", "url": "http:\/\/cast.writtn.com\/episode\/38096\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "How Public Broadcasting Works", "duration": "3602" }, { "id": "44082", "url": "http:\/\/cast.writtn.com\/episode\/44082\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "How Public Broadcasting Works", "duration": "3602" }, { "id": "37972", "url": "http:\/\/cast.writtn.com\/episode\/37972\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "SYSK Selects: How Disco Works", "duration": "3217" }, { "id": "44083", "url": "http:\/\/cast.writtn.com\/episode\/44083\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "SYSK Selects: How Disco Works", "duration": "3217" }, { "id": "38796", "url": "http:\/\/cast.writtn.com\/episode\/38796\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "SYSK Selects: How McCarthyism Works", "duration": "2746" }, { "id": "44077", "url": "http:\/\/cast.writtn.com\/episode\/44077\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "SYSK Selects: How McCarthyism Works", "duration": "2746" }, { "id": "38657", "url": "http:\/\/cast.writtn.com\/episode\/38657\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "How Bioarchaeology Works", "duration": "3505" }, { "id": "44078", "url": "http:\/\/cast.writtn.com\/episode\/44078\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "How Bioarchaeology Works", "duration": "3505" }, { "id": "38498", "url": "http:\/\/cast.writtn.com\/episode\/38498\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", 
"channel_title": "Stuff You Should Know", "type": 1, "title": "Do motivational speakers motivate people?", "duration": "3212" }, { "id": "44079", "url": "http:\/\/cast.writtn.com\/episode\/44079\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "Do motivational speakers motivate people?", "duration": "3212" }, { "id": "38329", "url": "http:\/\/cast.writtn.com\/episode\/38329\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "SYSK Selects: Why do men have nipples?", "duration": "1727" }, { "id": "44080", "url": "http:\/\/cast.writtn.com\/episode\/44080\/quantified", "image": "https:\/\/podcasts.howstuffworks.com\/hsw\/podcasts\/sysk\/sysk-audio-1600.jpg", "channel_title": "Stuff You Should Know", "type": 1, "title": "SYSK Selects: Why do men have nipples?", "duration": "1727" } ], "youtube": { "14": { "id": "47820", "url": "http:\/\/cast.writtn.com\/episode\/47820\/quantified", "image": "https:\/\/i1.ytimg.com\/vi\/tSX7nA2a2n8\/hqdefault.jpg", "channel_title": "Food Wishes", "type": 2, "title": "Sweet Potato Biscuits - Food Wishes - Thanksgiving Recipe", "duration": "0" }, "15": { "id": "47818", "url": "http:\/\/cast.writtn.com\/episode\/47818\/quantified", "image": "https:\/\/i1.ytimg.com\/vi\/pHN4Tzb0yPs\/hqdefault.jpg", "channel_title": "bigclivedotcom", "type": 2, "title": "Voltmeter-indicator schematic and hydrogen pop.", "duration": "0" }, "16": { "id": "47866", "url": "http:\/\/cast.writtn.com\/episode\/47866\/quantified", "image": "https:\/\/i2.ytimg.com\/vi\/aqNHc6ZH7P4\/hqdefault.jpg", "channel_title": "Vox", "type": 2, "title": "How Trump turned Sean Hannity into a conspiracy theorist", "duration": "0" }, "17": { "id": "47848", "url": "http:\/\/cast.writtn.com\/episode\/47848\/quantified", "image": "https:\/\/i2.ytimg.com\/vi\/1dQEfL2BfUM\/hqdefault.jpg", "channel_title": "Nerdwriter1", "type": 2, "title": "How To Design A Comic Book Page", "duration": "0" } }, "photos": [ "\/photos\/2017-11-23\/20171123_151055.jpg" ] } } Today I am <strong>13490 days</strong> old and my weight is <strong>79.3 kg</strong>. During the day, I consumed <strong>1131.81 kcal</strong> coming from <strong>94.54 g</strong> fat, <strong>28.34 g</strong> carbohydrates and <strong>45.55 g</strong> protein. Managed to do <strong>0 push-ups</strong>, <strong>0 crunches</strong> and walked <strong>5734 steps</strong> during the day which is approximately <strong>4.37 km</strong>.
43.831683
435
0.462729
eng_Latn
0.275674
7a911dacf9b622b8132bd7f0af92161a45a556d1
17,573
md
Markdown
intl.en-US/Errors and Troubleshooting/OSS error response.md
fouadyousufdar/oss
38195e765dd34b32821b0b90a0f39c9cb9064415
[ "MIT" ]
null
null
null
intl.en-US/Errors and Troubleshooting/OSS error response.md
fouadyousufdar/oss
38195e765dd34b32821b0b90a0f39c9cb9064415
[ "MIT" ]
null
null
null
intl.en-US/Errors and Troubleshooting/OSS error response.md
fouadyousufdar/oss
38195e765dd34b32821b0b90a0f39c9cb9064415
[ "MIT" ]
null
null
null
# OSS error response {#concept_dt2_hq3_wdb .concept} If an error occurs when you access OSS, OSS returns the error code and error message so that you can locate the problem and handle it properly. ## Response message body {#section_zgc_wq3_wdb .section} If an error occurs when you access OSS, OSS returns an HTTP status code \(3xx, 4xx, or 5xx\) and a message body in application or XML format. An example of the message body of a error response is as follows: ``` {#codeblock_29z_w98_evy} <? xml version="1.0" ? > <Error xmlns=”http://doc.oss-cn-hangzhou.aliyuncs.com”> <Code> AccessDenied </Code> <Message> Query-string authentication requires the Signature, Expires and OSSAccessKeyId parameters </Message> <RequestId> 1D842BC5425544BB </RequestId> <HostId> oss-cn-hangzhou.aliyuncs.com </HostId> </Error> ``` The message body of an error response includes the following elements: - Code: The error code that OSS returns to the user - Message: The detailed error message returned by OSS - RequestId: The UUID that uniquely identifies a request. When you cannot solve the error, you can provide this RequestId to Alibaba Cloud OSS technical support to get help. - HostId: Used to identify the accessed OSS cluster, which is the same as the Host ID carried in the user request. For special error information elements, see specific request descriptions. ## OSS error codes {#section_zzm_z5v_zi9 .section} The following table lists the OSS error codes: |HTTP status code|Error code|Description|Cause and solution| |----------------|----------|-----------|------------------| |203|CallbackFailed|Upload callback fails.|The setting or format of the callback parameters is incorrect. For example, upload callback fails because the callback parameters within the ArgumentValue is not in the valid JSON format. To learn the cause and troubleshooting, see [Upload callback](intl.en-US/Errors and Troubleshooting/Upload callback.md#). | |400|InvalidBucketName|The bucket name is invalid.|The bucket name does not conform to the naming conventions. For more information about the bucket naming conventions, see [Bucket](../../../../intl.en-US/Developer Guide/Basic concepts.md#section_yxy_jmt_tdb). | |InvalidObjectName|The object name is invalid.|The object name does not conform to the naming conventions. For more information about the object naming conventions, see [Object](../../../../intl.en-US/Developer Guide/Basic concepts.md#section_ihw_kmt_tdb). | |TooManyBuckets|The number of buckets exceeds the limit.|An Alibaba CLoud account can create a maximum of 30 buckets in a region. To adjust the limit, [open a bucket](https://workorder.console.aliyun.com/#/ticket/createIndex). | |RequestIsNotMultiPartContent|The content-type of the Post request is invalid.|The content-type header in the Post request is not `multipart/form-data`. The content-type header in a Post request must be `multipart/form-data` and in the `multipart/form-data;boundary=xxxxxx` format, in which boundary is the boundary string. For more information about troubleshooting, see [PostObject](../../../../intl.en-US/API Reference/Object operations/PostObject.md#). | |RequestTimeout|Request timeout occurs.|The request timeout occurs because of network environment or configurations. For more information about troubleshooting, see [Network connection timeout handling](intl.en-US/Errors and Troubleshooting/Network connection timeout handling.md#). 
| |NotImplemented|The method cannot be implemented.|This error occurs because parameters are incorrectly passed when the API is encapsulated. For more information about troubleshooting, see the parameters described in [API overview](../../../../intl.en-US/API Reference/API overview.md#). | |MaxPOSTPreDataLengthExceededError|The size of the body except for the uploaded file content of the Post request is too large.|The file uploaded by the Post request is larger than 5 GB. Only the `file` field can exceed 4 KB. For more information, see [PostObject](intl.en-US/Errors and Troubleshooting/PostObject.md#). | |MalformedPOSTRequest|The format of the Post request body is invalid.|The format of the form field is invalid. For more information about troubleshooting, see [PostObject](intl.en-US/Errors and Troubleshooting/PostObject.md#). | |MalformedXML|The XML format is invalid.|The XML format in the Post request is invalid. For more information about troubleshooting, see the following topics: - [DeleteObjects](../../../../intl.en-US/API Reference/Object operations/DeleteMultipleObjects.md#) - [CompleteMultipartUpload](../../../../intl.en-US/API Reference/Multipart upload operations/CompleteMultipartUpload.md#) - [PutBucketLogging](../../../../intl.en-US/API Reference/Bucket operations/PutBucketLogging.md#) - [PutBucketWebsite](../../../../intl.en-US/API Reference/Bucket operations/PutBucketWebsite.md#) - [PutBucketLifecycle](../../../../intl.en-US/API Reference/Bucket operations/PutBucketLifecycle.md#) - [PutBucketReferer](../../../../intl.en-US/API Reference/Bucket operations/PutBucketReferer.md#) - [PutBucketCORS](../../../../intl.en-US/API Reference/Cross-Origin Resource Sharing/PutBucketcors.md#) | |InvalidTargetBucketForLogging|The target bucket specified in the logging operation is invalid.|The target bucket to store logs is invalid. Specify a valid target bucket.| |InvalidPolicyDocument|The policy document is invalid.|The policy format in the Post request is incorrect. For more information about troubleshooting, see [PostObject](intl.en-US/Errors and Troubleshooting/PostObject.md#section_uxq_lfj_wdb). | |InvalidPart|Invalid parts exist.|A part uploaded by CompleteMultipartUpload is invalid because its PartNumber or ETag is incorrect. For more information about troubleshooting, see [CompleteMultipartUpload](../../../../intl.en-US/API Reference/Multipart upload operations/CompleteMultipartUpload.md#). | |InvalidPartOrder|The part order is invalid.|The parts submitted by CompleteMultipartUpload is invalid. Parts must be submitted in an ascending sort order of PartNumber. For more information about troubleshooting, see [CompleteMultipartUpload](../../../../intl.en-US/API Reference/Multipart upload operations/CompleteMultipartUpload.md#). | |InvalidEncryptionAlgorithmError|The specified entropy encryption algorithm is incorrect.|The specified value of x-oss-server-side-encryption is invalid. Only AES256 and KMS are supported. For more information about troubleshooting, see [PutObject](../../../../intl.en-US/API Reference/Object operations/PutObject.md#). | |InvalidDigest|The digest is invalid.|If the Content-MD5 header is included in the request, OSS calculates the Content-MD5 value of the request body. If the Content-MD5 values are inconsistent, this error code is returned. For more information about troubleshooting, see [PutObject](../../../../intl.en-US/API Reference/Object operations/PutObject.md#). 
| |InvalidTargetType|The type of the object that the symbol link directs to is invalid.|The object is a symbol link and the object that the link directs to is also a symbol link.| |InvalidArgument|The parameter format is incorrect.|The parameter format is incorrect. For more information about the parameter format, see [API overview](../../../../intl.en-US/API Reference/API overview.md#). | |IncorrectNumberOfFilesInPOSTRequest|The number of files in the Post request is invalid.|Only one file field is allowed in the form fields of a Post request. For more information, see [PostObject](intl.en-US/Errors and Troubleshooting/PostObject.md#). | |FilePartNotExist|The file part does not exist.|The part submitted by CompleteMultipartUpload is not uploaded. For more information about troubleshooting, see [CompleteMultipartUpload](../../../../intl.en-US/API Reference/Multipart upload operations/CompleteMultipartUpload.md#). | |FieldItemTooLong|The form field in the Post request is too large.|Only the file field can exceed 4 KB. For more information, see [PostObject](intl.en-US/Errors and Troubleshooting/PostObject.md#). | |EntityTooSmall|The entity is too small.|Set the Post policy to specify the valid values of form fields when using PostObject to upload files. For example, content-length-range can be used to set the maximum and minimum size of an uploaded object. The condition supports the matching method of contion-length-range, that is, the error is reported when the value is extremely large or small, For more information about troubleshooting, see [PostObject](../../../../intl.en-US/API Reference/Object operations/PostObject.md#). | |EntityTooLarge|The entity is too large.| |403|AccessDenied|The access is denied.|You do not have the permission to perform the operation. For more information, see [OSS permission](intl.en-US/Errors and Troubleshooting/OSS permission.md#). | |InvalidAccessKeyId|The AccessKeyId is invalid.|The AccessKeyId is invalid or expired. For more information, see [OSS 403](intl.en-US/Errors and Troubleshooting/OSS 403.md#). | |InvalidObjectState|The object state is invalid.|When you download an object of the Archive class, the state of the object is invalid in the following two conditions: - The RestoreObject request is not submitted or the last RestoreObject request is timeout. - The RestoreObject request is submitted but the RestoreObject operation is not complete. For more information about troubleshooting, see [RestoreObject](../../../../intl.en-US/API Reference/Object operations/RestoreObject.md#). | |RequestTimeTooSkewed|The time when OSS receives the request is more than 15 minutes later than the time when the request is sent.|Check the system time of the device from where the request is sent, and then adjust the time according to your time zone. For more information, see [OSS 403](intl.en-US/Errors and Troubleshooting/OSS 403.md#). | |SignatureDoesNotMatch|The signature is incorrect.| The signature of the request is incorrect. | |404|SymlinkTargetNotExist|The target object that the symbol link directs to does not exist.|The object is a symbol link, and the target object that the symbol link directs to does not exist.| |NoSuchBucket|The bucket does not exist.|The requested bucket does not exist.| |NoSuchKey|The object does not exist.|The requested object does not exist.| |NoSuchUpload|The ID of the MultipartUpload task does not exist.|The MultipartUpload task is not initialized or the initialized MultipartUpload task is expired. 
For more information about troubleshooting, see [InitiateMultipartUpload](../../../../intl.en-US/API Reference/Multipart upload operations/InitiateMultipartUpload.md#). | |405|MethodNotAllowed|The method is not supported.|The operation is not supported.| |409|BucketAlreadyExists|The bucket already exists.|The specified bucket name already exists. Specify another name for the new bucket. For more information about bucket naming conventions, see [Bucket](../../../../intl.en-US/Developer Guide/Basic concepts.md#section_yxy_jmt_tdb). | |BucketNotEmpty|The bucket is not empty.|The bucket to be deleted includes objects that are not deleted or multipart upload tasks that are not complete. For more information about troubleshooting, see [DeleteBucket](../../../../intl.en-US/API Reference/Bucket operations/DeleteBucket.md#). | |ObjectNotAppendable|The object is not appendable.|OSS objects can be classified into three types: normal, appendable, and multipart. The AppendObject operation can be performed only on objects of the appendable type.| |PositionNotEqualToLength|The position where the object is appended does not equal to the object size.| - The value of position is inconsistent with the size of the current object. **Note:** You can obtain the position for the next operation from the response header x-oss-next-append-position, and then send the next request. However, the same error may occur even if the value of position is set to x-oss-next-append-position because of the concurrency of the requests. - When the value of position is 0, if appendable objects with the same name specified in the request does not exist or the size of an appendable object with the same name is 0, the request is successful. Otherwise, the value of position and the size of the object does not match and this error code is returned. | |411|MissingContentLength|The content length is not included in the request.|The request header is not encoded by using [chunked encoding](https://tools.ietf.org/html/rfc2616#section-3.6.1), and does not include the Content-Length parameter.| |412|PreconditionFailed|The pre-processing fails.| - The value of If-Unmodified-Since is specified, but the specified time is before the modification time of the object. - The value of If-Match is specified, but the ETag of the original object is different from the ETag value in the request. For more information about troubleshooting, see [GetObject](../../../../intl.en-US/API Reference/Object operations/GetObject.md#). | |503|Downloadtrafficratelimitexceeded|The downloading traffic exceeds the limit.|The default limit of the downloading traffic through the Internet and intranet is 5 Gbit/s. To adjust the limit, [open a ticket](https://workorder.console.aliyun.com/#/ticket/createIndex).| |UploadTrafficRateLimitExceeded|The uploading traffic exceeds the limit.|The default limit of the uploading traffic through the Internet and intranet is 5 Gbit/s. To adjust the limit, [open a ticket](https://workorder.console.aliyun.com/#/ticket/createIndex).| |MetaOperationQpsLimitExceeded|The QPS exceeds the default limit.|OSS limits the QPS for the following APIs: - Service-related operations, such as GetService \(ListBuckets\) - Bucket-related operations, such as PutBucket and GetBucketLifecycle - CORS-related operations, such as PutBucketCORS and GetBucketCORS - LiveChannel-related operations, such as PutLiveChannel and DeleteLiveChannel. If the QPS exceeds the limit, this error code is returned. 
We recommend that you perform the operation again after a few seconds. | **Note:** For more information about OSS error code, see [OSS API error center](https://error-center.alibabacloud.com/status/product/Oss). ## Common errors and troubleshooting {#section_g4x_wq3_wdb .section} For more information about OSS common errors and troubleshooting, see: - [Upload callback](intl.en-US/Errors and Troubleshooting/Upload callback.md#) - [OSS 403](intl.en-US/Errors and Troubleshooting/OSS 403.md#) - [PostObject](intl.en-US/Errors and Troubleshooting/PostObject.md#) - [OSS permission](intl.en-US/Errors and Troubleshooting/OSS permission.md#) - [OSS CORS](intl.en-US/Errors and Troubleshooting/OSS CORS.md#) - [Referer](intl.en-US/Errors and Troubleshooting/Referer.md#) - [STS](intl.en-US/Errors and Troubleshooting/STS common errors and troubleshooting.md#) For more information about common errors and troubleshooting for SDK or tools, see: - Java SDK:[FAQ](../../../../intl.en-US/SDK Reference/Java/FAQ.md#) - Node.js SDK:[FAQ](../../../../intl.en-US/SDK Reference/Node. js/FAQ.md#) - [ossfs](../../../../intl.en-US/Tools/ossfs/FAQ.md#) - [ossftp](../../../../intl.en-US/Tools/ossftp/FAQ.md#) ## Unsupported operations {#section_ecd_wr3_wdb .section} If you access OSS resources by performing an operation that is not supported by OSS, the 405 Method Not Allowed error is returned. Example of an incorrect request: ``` {#codeblock_f2f_20k_qpg} ABC /1.txt HTTP/1.1 Host: bucketname.oss-cn-shanghai.aliyuncs.com Date: Thu, 11 Aug 2016 03:53:40 GMT Authorization: signatureValue ``` Response example: ``` {#codeblock_gcr_ldn_qkb} HTTP/1.1 405 Method Not Allowed Server: AliyunOSS Date: Thu, 11 Aug 2016 03:53:44 GMT Content-Type: application/xml Content-Length: 338 Connection: keep-alive x-oss-request-id: 57ABF6C8BC4D25D86CBA5ADE Allow: GET DELETE HEAD PUT POST OPTIONS <? xml version="1.0" encoding="UTF-8"? > <Error> <Code>MethodNotAllowed</Code> <Message>The specified method is not allowed against this resource. </Message> <RequestId>57ABF6C8BC4D25D86CBA5ADE</RequestId> <HostId>bucketname.oss-cn-shanghai.aliyuncs.com</HostId> <Method>abc</Method> <ResourceType>Bucket</ResourceType> </Error> ``` **Note:** If the resource to be accessed is /bucket/, the ResourceType is bucket. If the resource to be accessed is /bucket/object, the ResourceType is object. ## Unsupported parameters in supported operations {#section_hrc_fs3_wdb .section} If unsupported parameters \(such as If-Modified-Since in PutObject operations\) is specified in supported OSS operations, the 400 Bad Request error is returned. Example of an incorrect request: ``` {#codeblock_sx0_e8f_qh8} PUT /abc.zip HTTP/1.1 Host: bucketname.oss-cn-shanghai.aliyuncs.com Accept: */* Date: Thu, 11 Aug 2016 01:44:50 GMT If-Modified-Since: Thu, 11 Aug 2016 01:43:51 GMT Content-Length: 363 ``` Response example: ``` {#codeblock_j4t_8kv_8yx} HTTP/1.1 400 Bad Request Server: AliyunOSS Date: Thu, 11 Aug 2016 01:44:54 GMT Content-Type: application/xml Content-Length: 322 Connection: keep-alive x-oss-request-id: 57ABD896CCB80C366955187E x-oss-server-time: 0 <? xml version="1.0" encoding="UTF-8"? > <Error> <Code>NotImplemented</Code> <Message>A header you provided implies functionality that is not implemented. </Message> <RequestId>57ABD896CCB80C366955187E</RequestId> <HostId>bucketname.oss-cn-shanghai.aliyuncs.com</HostId> <Header>If-Modified-Since</Header> </Error> ```
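When handling these responses in client code, the useful fields are `Code`, `Message`, and `RequestId` (the value to quote when contacting OSS technical support). A minimal parsing sketch in Python — it relies only on the XML layout shown in the examples above, and the function name is illustrative rather than part of any official OSS SDK:

```python
import xml.etree.ElementTree as ET

def parse_oss_error(body: str) -> dict:
    """Extract the child elements (Code, Message, RequestId, HostId, ...) of an OSS <Error> body."""
    root = ET.fromstring(body)
    # Some responses carry an xmlns on <Error>; strip any namespace so the
    # keys are plain tag names like "Code" and "RequestId".
    return {child.tag.split("}")[-1]: (child.text or "").strip() for child in root}

# With the MethodNotAllowed response above:
# parse_oss_error(xml_body)["Code"] -> "MethodNotAllowed"
```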
66.06391
523
0.772435
eng_Latn
0.947091
7a914641fa5ed9eeaa68090aef5a4544fa12c6b0
4,139
md
Markdown
ru/managed-greenplum/api-ref/grpc/resource_preset_service.md
barmex/docs
e7f6be6035c66c1ab52224c350bfbf1d1fb605e9
[ "CC-BY-4.0" ]
null
null
null
ru/managed-greenplum/api-ref/grpc/resource_preset_service.md
barmex/docs
e7f6be6035c66c1ab52224c350bfbf1d1fb605e9
[ "CC-BY-4.0" ]
null
null
null
ru/managed-greenplum/api-ref/grpc/resource_preset_service.md
barmex/docs
e7f6be6035c66c1ab52224c350bfbf1d1fb605e9
[ "CC-BY-4.0" ]
null
null
null
---
editable: false
sourcePath: en/_api-ref-grpc/managed-greenplum/api-ref/grpc/resource_preset_service.md
---

# ResourcePresetService

A set of methods for managing resource presets.

| Call | Description |
| --- | --- |
| [Get](#Get) | Returns the specified resource preset. |
| [List](#List) | Retrieves the list of available resource presets. |

## Calls ResourcePresetService {#calls}

## Get {#Get}

Returns the specified resource preset. <br>To get the list of available resource presets, make a [List](#List) request.

**rpc Get ([GetResourcePresetRequest](#GetResourcePresetRequest)) returns ([ResourcePreset](#ResourcePreset))**

### GetResourcePresetRequest {#GetResourcePresetRequest}

Field | Description
--- | ---
resource_preset_id | **string**<br>Required. ID of the resource preset to return. To get the resource preset ID, use a [ResourcePresetService.List](#List) request.

### ResourcePreset {#ResourcePreset}

Field | Description
--- | ---
id | **string**<br>ID of the resource preset.
zone_ids[] | **string**<br>IDs of availability zones where the resource preset is available.
cores | **int64**<br>Number of CPU cores for a Greenplum host created with the preset.
memory | **int64**<br>RAM volume for a Greenplum host created with the preset, in bytes.
type | enum **Type**<br>Host type <ul><li>`MASTER`: Greenplum master host.</li><li>`SEGMENT`: Greenplum segment host.</li></ul>
min_host_count | **int64**<br>Minimum host count.
max_host_count | **int64**<br>Maximum host count.
host_count_divider | **int64**<br>The number of hosts must be divisible by host_count_divider.
max_segment_in_host_count | **int64**<br>Maximum segment count per host (applies only to segment hosts).

## List {#List}

Retrieves the list of available resource presets.

**rpc List ([ListResourcePresetsRequest](#ListResourcePresetsRequest)) returns ([ListResourcePresetsResponse](#ListResourcePresetsResponse))**

### ListResourcePresetsRequest {#ListResourcePresetsRequest}

Field | Description
--- | ---
page_size | **int64**<br>The maximum number of results per page to return. If the number of available results is larger than `page_size`, the service returns a [ListResourcePresetsResponse.next_page_token](#ListResourcePresetsResponse) that can be used to get the next page of results in subsequent list requests. Acceptable values are 0 to 1000, inclusive.
page_token | **string**<br>Page token. To get the next page of results, set `page_token` to the [ListResourcePresetsResponse.next_page_token](#ListResourcePresetsResponse) returned by a previous list request. The maximum string length in characters is 100.

### ListResourcePresetsResponse {#ListResourcePresetsResponse}

Field | Description
--- | ---
resource_presets[] | **[ResourcePreset](#ResourcePreset1)**<br>List of resource presets.
next_page_token | **string**<br>This token allows you to get the next page of results for list requests. If the number of results is larger than [ListResourcePresetsRequest.page_size](#ListResourcePresetsRequest), use the `next_page_token` as the value for the [ListResourcePresetsRequest.page_token](#ListResourcePresetsRequest) parameter in the next list request. Each subsequent list request will have its own `next_page_token` to continue paging through the results. The maximum string length in characters is 100.

### ResourcePreset {#ResourcePreset1}

Field | Description
--- | ---
id | **string**<br>ID of the resource preset.
zone_ids[] | **string**<br>IDs of availability zones where the resource preset is available.
cores | **int64**<br>Number of CPU cores for a Greenplum host created with the preset.
memory | **int64**<br>RAM volume for a Greenplum host created with the preset, in bytes.
type | enum **Type**<br>Host type <ul><li>`MASTER`: Greenplum master host.</li><li>`SEGMENT`: Greenplum segment host.</li></ul>
min_host_count | **int64**<br>Minimum host count.
max_host_count | **int64**<br>Maximum host count.
host_count_divider | **int64**<br>The number of hosts must be divisible by host_count_divider.
max_segment_in_host_count | **int64**<br>Maximum segment count per host (applies only to segment hosts).
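The reference above documents the messages but not an invocation. As a rough sketch, `Get` can be exercised with `grpcurl`; the endpoint and the fully qualified service name below are assumptions based on how other Yandex Cloud Managed Databases gRPC services are exposed, and `s2.micro` is a placeholder preset ID — check the SDKs before relying on either:

```bash
# Assumed endpoint and service path; $IAM_TOKEN is a valid IAM token.
grpcurl \
  -H "Authorization: Bearer $IAM_TOKEN" \
  -d '{"resource_preset_id": "s2.micro"}' \
  mdb.api.cloud.yandex.net:443 \
  yandex.cloud.mdb.greenplum.v1.ResourcePresetService/Get
```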
50.47561
518
0.757671
eng_Latn
0.621205
7a9161222198320737700ab4dfca267e74f6bcf7
6,709
md
Markdown
_posts/2019-08-11-Download-the-parable-and-its-lesson-a-novella.md
Anja-Allende/Anja-Allende
4acf09e3f38033a4abc7f31f37c778359d8e1493
[ "MIT" ]
2
2019-02-28T03:47:33.000Z
2020-04-06T07:49:53.000Z
_posts/2019-08-11-Download-the-parable-and-its-lesson-a-novella.md
Anja-Allende/Anja-Allende
4acf09e3f38033a4abc7f31f37c778359d8e1493
[ "MIT" ]
null
null
null
_posts/2019-08-11-Download-the-parable-and-its-lesson-a-novella.md
Anja-Allende/Anja-Allende
4acf09e3f38033a4abc7f31f37c778359d8e1493
[ "MIT" ]
null
null
null
--- layout: post comments: true categories: Other --- ## Download The parable and its lesson a novella book He took note of all those who approached the piano, the daughter of his father's brother, wrinkling the parable and its lesson a novella nose as though he suspected that this customer would ask if the display pedestal was included in the price, as we discussed. Even the best implants don't look that natural. I told him so? She doesn't want to get him in any trouble; The most shameful thing Junior found was the "art" on the walls. Beyond the panoramic windshield, where he remained till the 3rd of August, as an Engineering officer; Kath would fulfill her dream of seeing Earth; and Alex would be about Jay's age by the time they returned to Chiron, in an exceedingly hollow note in this confined space, Chicane recommended plenty of caffeine and sugar to guard against an He wanted Micky the parable and its lesson a novella wait for him, dragging the weight of my head with me. prose in The Beauty of Rage: Channel Your Anger and Be a Winner Junior's walk along it as on a rock! I'm already reading ninety. northern fishing station. The than before! In cloning, and then burned them where Losen could sit at his window and watch, be would be forced to spend another significant portion of his fortune on attorney fees. On the 9th August at 10 a. Westergren and V. Laura dead. Lucy, but I sincerely believe there's no good reason for her to be of the weather Burrough determined to go into the bay at haven't gotten around to this end of it, we'll be here six months. Forgive me. Cupping her left hand from the _Lena_ at the mouth of the river Lena. And facing the west Ivory felt a little hollow at the pit of his stomach, a brutish face. He made no claim to When I left, such effusive praise would embarrass him. "AND I DRINK CHAMPAGNE ALL DAY," said Miss Cheese, there underwent severe calamities and misfortunes. folded like an accordion? "What?" lights, her Wolf was comme ci, in love of her I'll live, require the Then by ambulance to the hospital. "I see you're wearing the same shoes. 56, after all, you're going to live with the quiet fear that he might the parable and its lesson a novella one day, since their essential meaning only dawns on us the second tune round, not the mountain sides on the east coast of When the time arrived for him to take this girl into the forest. It was awesome. Nearing the house, he will be known. He put his bad arm inside his shirt and kept his good hand pressed to his hip joint, a while ago. " So she arose and he laid before her the hundred dinars and the piece of silk, feels for one flames, and Google left the theater with his candy and his cash. "I See You" is the first new Damon Knight story in many yean; it was the feature story in reproach the conquerors of Siberia with, known beforehand, so I may relate to thee yonder fellow's case, from Mercedes to mist to ploy to let Leilani know that she'd come here, destroyed fifteen thousand homes. Evidence of his nouveau-drunk status mysterious way, partly basin if they used one, and those nearest the front caught a hint of the elusive. Pickering, and she was settled with her mending, now that she harbored higher aspirations, Aunt Gen. From the apex of the dome a spiral bag of cheese popcorn washed down with Orange Crush? No one in his usual circles would attend this show, where we passed an hour in their sleeping chamber, as though embarrassed by sea. The parable and its lesson a novella they hidden somewhere?" 
He stopped Oven to oven, for fear of killing her too soon and too mercifully, telling natives the parable and its lesson a novella rowing in a large skin boat to the parable and its lesson a novella vessel. "With gov'ment maniacs blowin' up the world behind us, and the houses finer, a middle-sized long salmon with almost white flesh, she said, feeling the great open space of the ocean. angel, too. "Simon's a good man. Meanwhile, operating under Curtis's direction, crimson, they go for the grass roots, solemn and mystical, Lechat invited Pernak and Eve Verity to dinner with him one evening in the Franзoise. Okay, thingy, and McKillian at the thought of a possible rescue, "Why. The man who finally responded to her insistent summons was big, in one direction completely free of ice, it'll be in his spew. Chan, which seemed to give the predictions validity. and I'm grateful for the twin earpieces, and knew that the girl had cheated him, Olaf said: preconceptions of poets and the necessarily indigent life they must lead, behind it. "Make me walk!" of a young lieutenant of hussars. " Laura rested on her back, vehemence: All words learned for the purpose of self-improvement were useless to him now. In the meantime I can but refer to the parable and its lesson a novella As always, few are strictly celibate! For all he knows, that is. As always, and given a sketch of its grammatical structure, Rose, they were waiting for the SFPD to deliver suitcases were coming over in a low, the first right out of me with that blue-light thing of theirs, but but a life of the mind, he The water shivered, she got even more frightened, trying to hurt it, Blind Voices. Neither was a jack of spades, The Seventh. "I'll get to that," he promised. FROM CAPE CHELYUSKIN. (259) So he repented, the route to Sentimental reasons, the twist of wires at the heart study with him in South Port for a year. the parable and its lesson a novella every reason to be optimistic. Villains human and guy whose business address is also his apartment- and the whole shebang in suspensefully suspended presence. Sterm studied the amber liquid for a few seconds while he swirled it slowly around in his glass, 1758. " Crawford had to stand up and shake his head to clear it. When the night came on, Micky resisted being charmed, that And the Lord of Gont Port had tried once again to get Dulse to come down to do what needed doing He looks through the back window of the Camaro to be sure that Polly and Cass are still following in "We've been having a serious discussion, or doubtful of my support or loyalty because I took over command for a while, gold, but" and Golden Engraved on Steel by G. "I'd the parable and its lesson a novella really thought about it," he admitted. " volumes of disappointment. "Me too. Their conversation was in the Victoria's hand. ii. Only souls go, whose anger at the invading human fleet is justified by their love of their own desolate domain. Per Zedd, and a past that wound like chains around "Not interested?" the parable and its lesson a novella her half blind. Should the expedition again, too, he senses the depth of her anxiety, make him have to.
745.444444
6,599
0.784171
eng_Latn
0.999942
7a918add542f4d551b3add57bee2a01b56488dd2
1,736
md
Markdown
README.md
venkatesannaveen/nfl-infographics
c017296f21e43093c15ed2584b3ddbe32ee5bab8
[ "MIT" ]
1
2021-12-10T03:18:49.000Z
2021-12-10T03:18:49.000Z
README.md
venkatesannaveen/nfl-infographics
c017296f21e43093c15ed2584b3ddbe32ee5bab8
[ "MIT" ]
1
2022-01-29T06:05:03.000Z
2022-01-29T06:05:03.000Z
README.md
venkatesannaveen/nfl-infographics
c017296f21e43093c15ed2584b3ddbe32ee5bab8
[ "MIT" ]
null
null
null
# NFL Infographics

@venkatesannaveen

Created using [matplotlib](https://github.com/matplotlib/matplotlib) and [nflfastpy](https://github.com/fantasydatapros/nflfastpy)

### Table of Contents

1. [Drew Brees Career Air Yards](#drew-brees-career-air-yards)
2. [Deebo Samuel 2021-22 Season](#deebo-samuel-2021-22-season)
3. [Jimmy Garoppolo 2019 vs. 2021 Passing Efficiency](#jimmy-garoppolo-2019-vs-2021-passing-efficiency)
4. [Points From Drive-Extending Penalties](#points-from-drive-extending-penalties)
5. [Best (and Worst) Second Half Performers](#best-and-worst-second-half-performers)
6. [Ranking Passers by ANY/A](#ranking-passers-by-any-a)

### Infographics

<h3 align="center">Drew Brees Career Air Yards</h3>
<p align="center">
  <img src="infographics/drew_brees_air_yards.png" alt="drew_brees_air_yards" width=400/>
</p>

<h3 align="center">Deebo Samuel 2021-22 Season</h3>
<p align="center">
  <img src="infographics/deebo_samuel_receiving.png" alt="deebo_samuel_receiving" width=400/>
</p>

<h3 align="center">Jimmy Garoppolo 2019 vs. 2021 Passing Efficiency</h3>
<p align="center">
  <img src="infographics/jimmy_garoppolo_passing.png" alt="jimmy_garoppolo_passing" width=400/>
</p>

<h3 align="center">Points From Drive-Extending Penalties</h3>
<p align="center">
  <img src="infographics/third_down_penalties.png" alt="drive_extending_penalties" width=400/>
</p>

<h3 align="center">Best (and Worst) Second Half Performers</h3>
<p align="center">
  <img src="infographics/second_half_stats.png" alt="second_half_stats" width=400/>
</p>

<h3 align="center">Ranking Passers by ANY/A</h3>
<p align="center">
  <img src="infographics/adjusted_net_yards_pts.png" alt="adjusted_net_yards_pts" width=400/>
</p>
39.454545
131
0.747696
yue_Hant
0.129336
7a91a328bbf23a1535ac260b882ed539ab5665c7
670
md
Markdown
docs/csharp/misc/cs1037.md
juucustodio/docs.pt-br
a3c389ac92d6e3c69928c7b906e48fbb308dc41f
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/csharp/misc/cs1037.md
juucustodio/docs.pt-br
a3c389ac92d6e3c69928c7b906e48fbb308dc41f
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/csharp/misc/cs1037.md
juucustodio/docs.pt-br
a3c389ac92d6e3c69928c7b906e48fbb308dc41f
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
description: Erro do Compilador CS1037
title: Erro do Compilador CS1037
ms.date: 07/20/2015
f1_keywords:
- CS1037
helpviewer_keywords:
- CS1037
ms.assetid: 22e16a58-e77b-41cf-b4b5-b2603bb819c6
ms.openlocfilehash: dea2e24588334bba639cd2d9da1149d059ce7045
ms.sourcegitcommit: d579fb5e4b46745fd0f1f8874c94c6469ce58604
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 08/30/2020
ms.locfileid: "89140939"
---
# <a name="compiler-error-cs1037"></a>Erro do Compilador CS1037

Operador sobrecarregado esperado

Ao especificar um comentário com [-Doc](../language-reference/compiler-options/doc-compiler-option.md), o compilador encontrou um link inválido.
31.904762
145
0.808955
por_Latn
0.356146
7a91a78fac38bca188d42638b59fe9678733a5b2
11,614
md
Markdown
reports/capstone_report_template.md
JuanitaSmith/ml_capstone_mailout_prediction
30d1e1218107d05ab59afc38f51e4c7f3e1d287c
[ "CNRI-Python" ]
1
2021-12-16T17:11:10.000Z
2021-12-16T17:11:10.000Z
reports/capstone_report_template.md
JuanitaSmith/ml_capstone_mailout_prediction
30d1e1218107d05ab59afc38f51e4c7f3e1d287c
[ "CNRI-Python" ]
null
null
null
reports/capstone_report_template.md
JuanitaSmith/ml_capstone_mailout_prediction
30d1e1218107d05ab59afc38f51e4c7f3e1d287c
[ "CNRI-Python" ]
null
null
null
# Machine Learning Engineer Nanodegree ## Capstone Project Juanita Smith November 14th, 2021 ![img.png](img2.png) ## I. Definition _(approx. 1-2 pages)_ ### Project Overview In this section, look to provide a high-level overview of the project in layman’s terms. Questions to ask yourself when writing this section: - _Has an overview of the project been provided, such as the problem domain, project origin, and related datasets or input data?_ - _Has enough background information been given so that an uninformed reader would understand the problem domain and following problem statement?_ ### Problem Statement In this section, you will want to clearly define the problem that you are trying to solve, including the strategy (outline of tasks) you will use to achieve the desired solution. You should also thoroughly discuss what the intended solution will be for this problem. Questions to ask yourself when writing this section: - _Is the problem statement clearly defined? Will the reader understand what you are expecting to solve?_ - _Have you thoroughly discussed how you will attempt to solve the problem?_ - _Is an anticipated solution clearly defined? Will the reader understand what results you are looking for?_ ### Metrics In this section, you will need to clearly define the metrics or calculations you will use to measure performance of a model or result in your project. These calculations and metrics should be justified based on the characteristics of the problem and problem domain. Questions to ask yourself when writing this section: - _Are the metrics you’ve chosen to measure the performance of your models clearly discussed and defined?_ - _Have you provided reasonable justification for the metrics chosen based on the problem and solution?_ ![img.png](img.png) ## II. Analysis _(approx. 2-4 pages)_ ### Data Exploration In this section, you will be expected to analyze the data you are using for the problem. This data can either be in the form of a dataset (or datasets), input data (or input files), or even an environment. The type of data should be thoroughly described and, if possible, have basic statistics and information presented (such as discussion of input features or defining characteristics about the input or environment). Any abnormalities or interesting qualities about the data that may need to be addressed have been identified (such as features that need to be transformed or the possibility of outliers). Questions to ask yourself when writing this section: - _If a dataset is present for this problem, have you thoroughly discussed certain features about the dataset? Has a data sample been provided to the reader?_ - _If a dataset is present for this problem, are statistics about the dataset calculated and reported? Have any relevant results from this calculation been discussed?_ - _If a dataset is **not** present for this problem, has discussion been made about the input space or input data for your problem?_ - _Are there any abnormalities or characteristics about the input space or dataset that need to be addressed? (categorical variables, missing values, outliers, etc.)_ ### Exploratory Visualization In this section, you will need to provide some form of visualization that summarizes or extracts a relevant characteristic or feature about the data. The visualization should adequately support the data being used. Discuss why this visualization was chosen and how it is relevant. 
Questions to ask yourself when writing this section: - _Have you visualized a relevant characteristic or feature about the dataset or input data?_ - _Is the visualization thoroughly analyzed and discussed?_ - _If a plot is provided, are the axes, title, and datum clearly defined?_ ### Algorithms and Techniques In this section, you will need to discuss the algorithms and techniques you intend to use for solving the problem. You should justify the use of each one based on the characteristics of the problem and the problem domain. Questions to ask yourself when writing this section: - _Are the algorithms you will use, including any default variables/parameters in the project clearly defined?_ - _Are the techniques to be used thoroughly discussed and justified?_ - _Is it made clear how the input data or datasets will be handled by the algorithms and techniques chosen?_ ### Benchmark In this section, you will need to provide a clearly defined benchmark result or threshold for comparing across performances obtained by your solution. The reasoning behind the benchmark (in the case where it is not an established result) should be discussed. Questions to ask yourself when writing this section: - _Has some result or value been provided that acts as a benchmark for measuring performance?_ - _Is it clear how this result or value was obtained (whether by data or by hypothesis)?_ ## III. Methodology _(approx. 3-5 pages)_ ### Data Preprocessing In this section, all of your preprocessing steps will need to be clearly documented, if any were necessary. From the previous section, any of the abnormalities or characteristics that you identified about the dataset will be addressed and corrected here. Questions to ask yourself when writing this section: - _If the algorithms chosen require preprocessing steps like feature selection or feature transformations, have they been properly documented?_ - _Based on the **Data Exploration** section, if there were abnormalities or characteristics that needed to be addressed, have they been properly corrected?_ - _If no preprocessing is needed, has it been made clear why?_ ### Implementation In this section, the process for which metrics, algorithms, and techniques that you implemented for the given data will need to be clearly documented. It should be abundantly clear how the implementation was carried out, and discussion should be made regarding any complications that occurred during this process. Questions to ask yourself when writing this section: - _Is it made clear how the algorithms and techniques were implemented with the given datasets or input data?_ - _Were there any complications with the original metrics or techniques that required changing prior to acquiring a solution?_ - _Was there any part of the coding process (e.g., writing complicated functions) that should be documented?_ ### Refinement In this section, you will need to discuss the process of improvement you made upon the algorithms and techniques you used in your implementation. For example, adjusting parameters for certain models to acquire improved solutions would fall under the refinement category. Your initial and final solutions should be reported, as well as any significant intermediate results as necessary. 
Questions to ask yourself when writing this section: - _Has an initial solution been found and clearly reported?_ - _Is the process of improvement clearly documented, such as what techniques were used?_ - _Are intermediate and final solutions clearly reported as the process is improved?_ ## IV. Results _(approx. 2-3 pages)_ ### Model Evaluation and Validation In this section, the final model and any supporting qualities should be evaluated in detail. It should be clear how the final model was derived and why this model was chosen. In addition, some type of analysis should be used to validate the robustness of this model and its solution, such as manipulating the input data or environment to see how the model’s solution is affected (this is called sensitivity analysis). Questions to ask yourself when writing this section: - _Is the final model reasonable and aligning with solution expectations? Are the final parameters of the model appropriate?_ - _Has the final model been tested with various inputs to evaluate whether the model generalizes well to unseen data?_ - _Is the model robust enough for the problem? Do small perturbations (changes) in training data or the input space greatly affect the results?_ - _Can results found from the model be trusted?_ ### Justification In this section, your model’s final solution and its results should be compared to the benchmark you established earlier in the project using some type of statistical analysis. You should also justify whether these results and the solution are significant enough to have solved the problem posed in the project. Questions to ask yourself when writing this section: - _Are the final results found stronger than the benchmark result reported earlier?_ - _Have you thoroughly analyzed and discussed the final solution?_ - _Is the final solution significant enough to have solved the problem?_ ## V. Conclusion _(approx. 1-2 pages)_ ### Free-Form Visualization In this section, you will need to provide some form of visualization that emphasizes an important quality about the project. It is much more free-form, but should reasonably support a significant result or characteristic about the problem that you want to discuss. Questions to ask yourself when writing this section: - _Have you visualized a relevant or important quality about the problem, dataset, input data, or results?_ - _Is the visualization thoroughly analyzed and discussed?_ - _If a plot is provided, are the axes, title, and datum clearly defined?_ ### Reflection In this section, you will summarize the entire end-to-end problem solution and discuss one or two particular aspects of the project you found interesting or difficult. You are expected to reflect on the project as a whole to show that you have a firm understanding of the entire process employed in your work. Questions to ask yourself when writing this section: - _Have you thoroughly summarized the entire process you used for this project?_ - _Were there any interesting aspects of the project?_ - _Were there any difficult aspects of the project?_ - _Does the final model and solution fit your expectations for the problem, and should it be used in a general setting to solve these types of problems?_ ### Improvement In this section, you will need to provide discussion as to how one aspect of the implementation you designed could be improved. As an example, consider ways your implementation can be made more general, and what would need to be modified. 
You do not need to make this improvement, but the potential solutions resulting from these changes are considered and compared/contrasted to your current solution. Questions to ask yourself when writing this section: - _Are there further improvements that could be made on the algorithms or techniques you used in this project?_ - _Were there algorithms or techniques you researched that you did not know how to implement, but would consider using if you knew how?_ - _If you used your final solution as the new benchmark, do you think an even better solution exists?_ ----------- **Before submitting, ask yourself. . .** - Does the project report you’ve written follow a well-organized structure similar to that of the project template? - Is each section (particularly **Analysis** and **Methodology**) written in a clear, concise and specific fashion? Are there any ambiguous terms or phrases that need clarification? - Would the intended audience of your project be able to understand your analysis, methods, and results? - Have you properly proof-read your project report to assure there are minimal grammatical and spelling mistakes? - Are all the resources used for this project correctly cited and referenced? - Is the code that implements your solution easily readable and properly commented? - Does the code execute without error and produce results similar to those reported?
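One note on the Metrics guidance earlier in this template: for a highly imbalanced target such as mailout conversion (which the repository name suggests, though the template itself leaves the metric choice open), plain accuracy is uninformative and a ranking metric like ROC AUC is a common, defensible pick. A minimal sketch — the labels and scores below are purely illustrative, not project results:

```python
from sklearn.metrics import roc_auc_score

# Illustrative labels/scores only -- in a real project these come from
# the trained classifier evaluated on a held-out split.
y_true = [0, 0, 0, 0, 1, 0, 1, 0]
y_score = [0.1, 0.2, 0.15, 0.4, 0.8, 0.3, 0.35, 0.05]
print(f"ROC AUC: {roc_auc_score(y_true, y_score):.3f}")  # 0.917
```

AUC measures how well the model ranks positives above negatives, which matches the business question ("whom to mail first") better than a threshold-dependent score.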
86.671642
659
0.804202
eng_Latn
0.999886
7a92b2b367dd068087eb33e94ff5e74186fd112b
393
md
Markdown
README.md
ponyatov/DAQ
5f6cfa8c3a2fcbbef6bf3678b16d262490edea86
[ "MIT" ]
null
null
null
README.md
ponyatov/DAQ
5f6cfa8c3a2fcbbef6bf3678b16d262490edea86
[ "MIT" ]
null
null
null
README.md
ponyatov/DAQ
5f6cfa8c3a2fcbbef6bf3678b16d262490edea86
[ "MIT" ]
null
null
null
# `DAQ`

## Data Acquisition Queues

(c) Dmitry Ponyatov <<[email protected]>> 2021 All rights reserved

github: https://github.com/ponyatov//DAQ

* ADC/DAC interfacing
* dataflow programming
* DSP: real-time signal processing
* IoT/Automation support

// @ [Rust Forum](https://users.rust-lang.org/t/what-are-good-tutorials-and-code-samples-for-dsp-and-real-time-queued-processing/63944)
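The list above pairs data acquisition with queued, dataflow-style processing. As a language-neutral illustration of that pattern only — this is not the project's API, and all names here are invented — a producer thread can push ADC samples into a bounded queue while a consumer applies a trivial DSP stage:

```python
import queue
import threading

def acquire(q: queue.Queue, samples):
    """Producer: push raw ADC readings into a bounded queue."""
    for s in samples:
        q.put(s)      # blocks when the queue is full (backpressure)
    q.put(None)       # sentinel: end of stream

def process(q: queue.Queue, out):
    """Consumer: a trivial DSP stage (2-tap moving average)."""
    prev = 0.0
    while (s := q.get()) is not None:
        out.append((prev + s) / 2)
        prev = s

q = queue.Queue(maxsize=64)
out: list[float] = []
t = threading.Thread(target=acquire, args=(q, [0.0, 1.0, 2.0, 3.0]))
t.start()
process(q, out)
t.join()
print(out)  # [0.0, 0.5, 1.5, 2.5]
```

The bounded queue is the key design point: it decouples acquisition timing from processing while giving the producer natural backpressure.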
24.5625
135
0.748092
yue_Hant
0.429087
7a92b4486372046f88c38b0006e531a385de7e3c
2,868
md
Markdown
_death_bus/0465.md
Meniny/Fiction_DeathBus
7cd5b097a050607530d2f98793664e7b5f2cc5ab
[ "MIT" ]
null
null
null
_death_bus/0465.md
Meniny/Fiction_DeathBus
7cd5b097a050607530d2f98793664e7b5f2cc5ab
[ "MIT" ]
null
null
null
_death_bus/0465.md
Meniny/Fiction_DeathBus
7cd5b097a050607530d2f98793664e7b5f2cc5ab
[ "MIT" ]
null
null
null
---
layout: fiction
title: "腰带三星"
fiction: "death_bus"
index: 465
next:
  index: 466
  url: "0466"
previous:
  index: 464
  url: "0464"
---
鬼王身体也不太好,不方便让我骑着摩托车带他去,毕竟这大排量的太子摩托车,让鬼王这一把年纪的人跟着我一起去兜风,他的身体也有些吃不消。
虽然他不是那种身体极其虚弱的人,但至少现在他还没有彻底修养过来。
我开着鬼王的车,带着鬼王回到我们所在的市郊,到了虹山寺的时候,已经是傍晚时分了,毕竟从省会城市跑回来,虽然一直走高,却也浪费了好几个小时。
等我刚把轿车挺好,鬼王还未来得及下车,刚侧头看了一眼这山门,顿时倒吸一口凉气,震惊道:这是怎么回事?
我顺着鬼王的目光看去,只见鬼王盯着寺庙的大门,目不转睛的看,似乎觉得庙门很是诡异。
我小声问:鬼王,这庙门有问题吗?
鬼王点头,说:你看这庙门的做工,有什么不同?
我朝着庙门看了半天,最后摇了摇头说:不知道,我看不出什么不太对劲的地方。
鬼王指着大门对我说:这虹山寺牌匾之下,修建了三扇门,这三扇门看似进出无碍,但是你仔细想想,仔细看看,修建这种造型,这种规格的门,看起来很顺眼吗?
谁都不傻,鬼王这么一提醒,我也觉得有点不对劲了,人少的时候看不出什么,反正这并排三个门,爱走哪个走哪个,但人要是多了,那可就不同了。
三扇门,两边一进一出,中间呢?是进还是出?因为这三扇门的宽度大小都是一样的,中间的门如果有进也有出,将会变得非常拥挤,可以说这种庙门的设计非常不合理。
要么就设计成两扇大门,这样进进出出都方便,要不就是设计成四开门,这样可以分散高峰时期的朝拜者,可设计成三门,我就不懂了。
“阿布,你继续看,庙门下方有门板,可三座庙门,却只有两个门板,为何中间庙门不留门板呢?”鬼王说的是门槛,这玩意在农村很常见。
有一个传说,据说是在清朝时期,死去的人会变成僵尸,而僵尸最为明显的特点就是跳着走,那个时候民间家家户户为了防僵尸,就会在自家门前放置一块门板,这门板通常高一尺有余,僵尸跳不过这个高度,就无法进到别人的家里,无法扑人。
此刻这寺庙的门槛,确实也有些令人想不明白,三个庙门,却只有两个门板,中间那个并没有,显得中间的门很是高大。
不过这种小细节,如若不是鬼王告诉我,我是根本看不懂的,毕竟我可没有鬼王那样见多识广。
鬼王此刻冷笑一声,说:此寺庙的性质早已改变,我可以给你保证,这寺庙里一定藏有高人,今日我们进去探寻一番,切不可暴漏踪迹,以免打草惊蛇。
我点头,但此刻并未下车,我问鬼王:这门板究竟有什么诡异之处?
鬼王说:很简单,你听过三长两短这个词语吗?
这个词语别说是我了,随便拉出来一个小孩,也都肯定听过三长两短,尤其是各种电视剧里边,这个词语出现的频率更是居高不下。
“但是你知道这个词语的来源吗?”鬼王这一句话,就把我问傻了。还别说,虽然经常听到这个词语,但要问我这词语的来源,那真不清楚。
鬼王也不卖关子,直接说道:棺材的做法,就是使用五块木板,棺材的左,右,下,这三块木板长,而前后两个部位,也就是死人的头和脚的位置,那两块木板就比较短了,所以三长两短经常用来形容一个人出了事故,对吧?
还真是有几分道理,在我的印象里三长两短绝对不是一个褒义词,听鬼王这么一解释,才知道来源。
我说:他们故意把这庙门做成棺材?只不过是一具没有拼接起来的棺材,对吗?
鬼王点头,笑着说:完全正确,这庙门,就是给死人准备的门,但直接做成棺材那还能行吗?所以这寺庙就以三扇门,两块门板的形式,出现在了这里。
也是我今天带着鬼王来了,如果没带鬼王,我自己是绝对看不出来的,我时常会感悟一些话,尤其是老一辈人对我说的话。
大概就是,我吃过的盐比你吃过的米都多,我走过的桥比你走过的路都长。
其实,这种话的意思,就是说这人年纪大了,生活的时间久了,懂的东西多了,感悟到的东西多了,经验也就多了,以前我从不听父母的话,但是现在,我经常会用心去听听,感悟一下对错。
不为别的,只为一点,这一生谁都有可能会坑我们,会骗我们,唯独父母,永远会真心爱我们一辈子。所以我觉得,很多叛逆期的少年,不应该与父母吵架拌嘴,很多时候应该静下心来,好好的跟父母聊一聊。
当即鬼王我俩下了车,进入庙门之前,鬼王对我说:一会跟着我进去,但是你记住,进去的时候一定要闭着眼。
鬼王的话我当然会信,此刻鬼王走在前边,我跟在鬼王的身后,就在迈过中间那扇庙门之时,在抬脚的时候我就闭上了眼睛,走了三步之后,感觉自己绝对迈过了庙门,这才睁开眼。
进了院子里,鬼王冷笑一声:区区伎俩,也仅仅能够愚弄一下百姓而已。
我暗自感叹,高手就是高手,俗话说行家一出手就知有没有。我带着葛钰来这里的时候,也仅仅是觉得稍微有一些不对劲,而鬼王一来,立马就能找出个个诡异的地点。
鬼王说:看到寺庙里厕所的位置了吗?
“在最西南角里。”
鬼王又说:看到院子里那棵槐树了吗?
“看到了,距离厕所大概十几米。”
鬼王最后说:看到那口钟了吗?
“嗯,看到了,距离槐树也有十几米。”
鬼王全程跟我说话的时候,并没有用手去指那些东西,末了,他笑道:仔细看看这三种东西的排列方式。
我看了半天,最后挠了挠头,小声说:看似在一条直线上,但好像多少有些歪了点,并没有连成一条直线。
鬼王我俩站在院子里许久,如果再不进寺庙里看看,就会露出马甲,毕竟来寺庙里的人,大多数是磕头上香,没人站在院子里闲聊打屁。
当即鬼王我俩朝着大雄宝殿走去,在路上,鬼王轻声对我讲:你当年上学的时候,应该学习过星座,其中天上有一个星座叫做猎户座,而在猎户座的腰部,则有三颗星星,记得吗?
我一拍手,说:对,有这个,我记得当时的课本上写的那句话叫做三星正南,就要过年。
鬼王笑了笑,说:你可能不知道,墨西哥的特向地瓦坎三座金字塔,以及埃及吉萨三座金字塔,都是按照猎户座腰带三星的位置进行排列的,而且,你知道天狼星位于腰带三星什么位置吗?
中国自古就有星相学,只不过现在可以理解为天文学,古人用肉眼观测星象,推算王朝气运,以及诸多大事。现在则有天文望远镜可以看到各种流星彗星。
“天狼星的位置我知道,位于猎户座腰带三星的西南方向,对吗?”我其实没太大的把握,此刻小声问鬼王。
鬼王点头,说:很对,这庙门,其实就象征着天狼星,只不过,位置恰好相反,天狼星在腰带三星的西南方,而庙门恰好在厕所,槐树,大钟的东北方,这就是寺庙中的高人,故意而为之,目的就是为了隐藏他们的想法,不让别人现。
“那他们这么做的目的是什么?”
26.311927
109
0.808926
zho_Hans
0.329735
7a92bd17633cf02e091e3f18b2202e1ac0d9c785
505
md
Markdown
conference-publications/folders/publications_webconf22/ckd_webconf22/README.md
naganandy/geometric-deepe-llearning-literature
f69aca3e94230aca5b773b1dc5c66e37fa6072aa
[ "MIT" ]
null
null
null
conference-publications/folders/publications_webconf22/ckd_webconf22/README.md
naganandy/geometric-deepe-llearning-literature
f69aca3e94230aca5b773b1dc5c66e37fa6072aa
[ "MIT" ]
null
null
null
conference-publications/folders/publications_webconf22/ckd_webconf22/README.md
naganandy/geometric-deepe-llearning-literature
f69aca3e94230aca5b773b1dc5c66e37fa6072aa
[ "MIT" ]
null
null
null
# Collaborative Knowledge Distillation for Heterogeneous Information Network Embedding

```
@inproceedings{ckd_webconf22,
  title = {Collaborative Knowledge Distillation for Heterogeneous Information Network Embedding},
  author = {Wang, Can and Zhou, Sheng and Yu, Kang and Chen, Defang and Li, Bolang and Feng, Yan and Chen, Chun},
  booktitle = {Proceedings of the ACM Web Conference 2022 (TheWebConf)},
  pages = {1631–1639},
  year = {2022}
}
```

links
- [acm](https://dl.acm.org/doi/10.1145/3485447.3512209)
33.666667
111
0.760396
eng_Latn
0.46366
7a92dce2ace99896224893e16f83406d20fb8e27
6,082
md
Markdown
README.md
robopsi/egl-wayland
395ce9f609fbf66f6cab622aec3ded663e089f84
[ "MIT" ]
null
null
null
README.md
robopsi/egl-wayland
395ce9f609fbf66f6cab622aec3ded663e089f84
[ "MIT" ]
null
null
null
README.md
robopsi/egl-wayland
395ce9f609fbf66f6cab622aec3ded663e089f84
[ "MIT" ]
1
2020-07-03T00:39:24.000Z
2020-07-03T00:39:24.000Z
Wayland EGL External Platform library ===================================== Overview -------- This is a work-in-progress implementation of a EGL External Platform library to add client-side Wayland support to EGL on top of EGLDevice and EGLStream families of extensions. This library implements an EGL External Platform interface to work along with EGL drivers that support the external platform mechanism. More information about EGL External platforms and the interface can be found at: https://github.com/NVIDIA/eglexternalplatform Building and Installing the library ----------------------------------- This library build-depends on: * EGL headers https://www.khronos.org/registry/EGL/ * Wayland libraries & protocols https://wayland.freedesktop.org/ * EGL External Platform interface https://github.com/NVIDIA/eglexternalplatform To build, run: ./autogen.sh make To install, run: make install You can also use meson build system to build and install: meson builddir cd builddir ninja ninja install *Notes*: The NVIDIA EGL driver uses a JSON-based loader to load all EGL External platforms available on the system. If this library is not installed as part of a NVIDIA driver installation, a JSON configuration file must be manually added in order to make the library work with the NVIDIA driver. The default EGL External platform JSON configuration directory is: `/usr/share/egl/egl_external_platform.d/` Acknowledgements ---------------- Thanks to James Jones for the original implementation of the Wayland EGL platform. ### Wayland EGL External platform library ### The Wayland EGL External platform library itself is licensed as follows: Copyright (c) 2016, NVIDIA CORPORATION. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ### buildconf ### The Wayland EGL External platform library uses the buildconf autotools bootstrapping script 'autogen.sh': http://freecode.com/projects/buildconf This script carries the following copyright notice: Copyright (c) 2005-2009 United States Government as represented by the U.S. Army Research Laboratory. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. 
The name of the author may not be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ### MESA's wayland-egl-priv.h ### The Wayland EGL External platform library uses the wayland-egl-priv.h header file from the MESA package: https://cgit.freedesktop.org/mesa/mesa/tree/src/egl/wayland/wayland-egl/wayland-egl-priv.h This file carries the following copyright notice: Copyright © 2011 Benjamin Franzke Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice (including the next paragraph) shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
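The README above notes that, outside an NVIDIA driver installation, a JSON configuration file must be added manually under `/usr/share/egl/egl_external_platform.d/` for the loader to pick the library up. As a sketch only — the schema keys and wrapped library name below are assumptions, so compare against a JSON file shipped by your driver — such a loader entry is commonly shaped like this, written here via Python:

```python
import json
import pathlib

# Hypothetical loader entry: "file_format_version" and the library name
# are assumptions -- check a file bundled with your NVIDIA driver.
entry = {
    "file_format_version": "1.0.0",
    "ICD": {"library_path": "libnvidia-egl-wayland.so.1"},
}

# Writing to this directory requires root; shown only to make the
# expected file location and shape concrete.
path = pathlib.Path("/usr/share/egl/egl_external_platform.d/10_nvidia_wayland.json")
path.write_text(json.dumps(entry, indent=4) + "\n")
```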
35.567251
90
0.751726
eng_Latn
0.565462
7a931e64c075f3330142c5a9ee5c0dbe393c80c5
35
md
Markdown
README.md
darshanhs90/Java-Coursera-Advanced-Datastructures
02cbc037d26c571e7d50988021cbe179e018c2fd
[ "MIT" ]
null
null
null
README.md
darshanhs90/Java-Coursera-Advanced-Datastructures
02cbc037d26c571e7d50988021cbe179e018c2fd
[ "MIT" ]
null
null
null
README.md
darshanhs90/Java-Coursera-Advanced-Datastructures
02cbc037d26c571e7d50988021cbe179e018c2fd
[ "MIT" ]
null
null
null
# darshanhs90.github.io
Portfolio
11.666667
23
0.8
eng_Latn
0.187492
7a939dd99aeaa6f1754a354e08aa539140b2d472
363
md
Markdown
_extras/discuss.md
PhilReedData/lc-spreadsheets
43f73946459b4f9c820a5ab618b3dd96175e84d5
[ "CC-BY-4.0" ]
8
2017-03-06T11:50:45.000Z
2021-11-15T11:15:24.000Z
_extras/discuss.md
PhilReedData/lc-spreadsheets
43f73946459b4f9c820a5ab618b3dd96175e84d5
[ "CC-BY-4.0" ]
5
2017-06-16T08:40:40.000Z
2018-06-17T23:39:37.000Z
_extras/discuss.md
PhilReedData/lc-spreadsheets
43f73946459b4f9c820a5ab618b3dd96175e84d5
[ "CC-BY-4.0" ]
5
2017-05-30T21:57:56.000Z
2019-05-25T23:06:08.000Z
---
layout: page
title: Discussion
permalink: /discuss/
---

There are many ways to discuss Library Carpentry lessons:

- Join our [Gitter discussion forum]({{ site.contact }}).
- Follow updates on [Twitter](https://twitter.com/LibCarpentry).
- Make a suggestion or correct an error by [raising an Issue](https://github.com/jezcope/library-spreadsheets/issues).
30.25
118
0.743802
eng_Latn
0.654056
7a93abea442c5bc682ca63152d9b70b218ff86f4
3,891
md
Markdown
docs/zh/8.0.0/setup/service-agent/java-agent/Application-toolkit-logback-1.x.md
alienwow/document-cn-translation-of-skywalking
9c1e014c51ba4e729050fa89e52c23ae9d5eedcc
[ "Apache-2.0" ]
436
2019-03-08T13:15:58.000Z
2022-03-30T07:43:08.000Z
docs/zh/8.0.0/setup/service-agent/java-agent/Application-toolkit-logback-1.x.md
alienwow/document-cn-translation-of-skywalking
9c1e014c51ba4e729050fa89e52c23ae9d5eedcc
[ "Apache-2.0" ]
32
2019-03-18T15:18:47.000Z
2021-11-12T02:46:03.000Z
docs/zh/8.0.0/setup/service-agent/java-agent/Application-toolkit-logback-1.x.md
alienwow/document-cn-translation-of-skywalking
9c1e014c51ba4e729050fa89e52c23ae9d5eedcc
[ "Apache-2.0" ]
132
2019-03-18T14:59:52.000Z
2022-03-30T07:43:13.000Z
# logback 插件

* 使用maven或gradle引入toolkit依赖。

```xml
<dependency>
    <groupId>org.apache.skywalking</groupId>
    <artifactId>apm-toolkit-logback-1.x</artifactId>
    <version>{project.release.version}</version>
</dependency>
```

* 在logback.xml的`Pattern`部分中设置`%tid`

```xml
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
        <layout class="org.apache.skywalking.apm.toolkit.log.logback.v1.x.TraceIdPatternLogbackLayout">
            <Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%tid] [%thread] %-5level %logger{36} -%msg%n</Pattern>
        </layout>
    </encoder>
</appender>
```

* 使用MDC,在logback.xml的`Pattern`部分中设置`%tid`

```xml
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
        <layout class="org.apache.skywalking.apm.toolkit.log.logback.v1.x.mdc.TraceIdMDCPatternLogbackLayout">
            <Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%X{tid}] [%thread] %-5level %logger{36} -%msg%n</Pattern>
        </layout>
    </encoder>
</appender>
```

* 支持logback AsyncAppender(也支持MDC),不需要其他配置。请参阅下面的logback.xml演示。
有关详细信息:[Logback AsyncAppender](https://logback.qos.ch/manual/appenders.html#AsyncAppender)

```xml
<configuration scan="true" scanPeriod=" 5 seconds">
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
            <layout class="org.apache.skywalking.apm.toolkit.log.logback.v1.x.mdc.TraceIdMDCPatternLogbackLayout">
                <Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%X{tid}] [%thread] %-5level %logger{36} -%msg%n</Pattern>
            </layout>
        </encoder>
    </appender>
    <appender name="ASYNC" class="ch.qos.logback.classic.AsyncAppender">
        <discardingThreshold>0</discardingThreshold>
        <queueSize>1024</queueSize>
        <neverBlock>true</neverBlock>
        <appender-ref ref="STDOUT"/>
    </appender>
    <root level="INFO">
        <appender-ref ref="ASYNC"/>
    </root>
</configuration>
```

* 当你使用`-javaagent`激活skywalking tracer后,logback将会输出**traceId**(如果存在的话)。如果tracer未激活,输出将是`TID: N/A`

# logstash logback 插件

* 使用maven或gradle引入toolkit依赖。

```xml
<dependency>
    <groupId>org.apache.skywalking</groupId>
    <artifactId>apm-toolkit-logback-1.x</artifactId>
    <version>${skywalking.version}</version>
</dependency>
```

* 设置logback.xml的`LogstashEncoder`

```xml
<encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder">
    <provider class="org.apache.skywalking.apm.toolkit.log.logback.v1.x.logstash.TraceIdJsonProvider">
    </provider>
</encoder>
```

* 在logback-spring.xml中将Logstash的 `LoggingEventCompositeJsonEncoder` 设置为自定义json格式

1.将%tid的转换器添加为<configuration>的子节点

```xml
<!--add converter for %tid -->
<conversionRule conversionWord="tid" converterClass="org.apache.skywalking.apm.toolkit.log.logback.v1.x.LogbackPatternConverter"/>
```

2.为自定义json格式添加json编码器

```xml
<encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
    <providers>
        <timestamp>
            <timeZone>UTC</timeZone>
        </timestamp>
        <pattern>
            <pattern>
                {
                "level": "%level",
                "tid": "%tid",
                "thread": "%thread",
                "class": "%logger{1.}:%L",
                "message": "%message",
                "stackTrace": "%exception{10}"
                }
            </pattern>
        </pattern>
    </providers>
</encoder>
```
34.741071
134
0.607556
kor_Hang
0.285661
7a93b95e37d6c11d9db967046f35cf3be2423504
1,469
md
Markdown
docs/2014/database-engine/automatically-check-out-files-upon-edit.md
SteSinger/sql-docs.de-de
2259e4fbe807649f6ad0d49b425f1f3fe134025d
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/2014/database-engine/automatically-check-out-files-upon-edit.md
SteSinger/sql-docs.de-de
2259e4fbe807649f6ad0d49b425f1f3fe134025d
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/2014/database-engine/automatically-check-out-files-upon-edit.md
SteSinger/sql-docs.de-de
2259e4fbe807649f6ad0d49b425f1f3fe134025d
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Automatisches Auschecken von Dateien beim Bearbeiten | Microsoft-Dokumentation ms.custom: '' ms.date: 06/13/2017 ms.prod: sql-server-2014 ms.reviewer: '' ms.technology: '' ms.topic: conceptual helpviewer_keywords: - checking out files - automatic file check outs ms.assetid: afa9f637-3d14-4d64-be51-0e8167e21d2b author: mashamsft ms.author: mathoma manager: craigg ms.openlocfilehash: 0cb5bacaeab817c491ae72018630cf88e62b3b7f ms.sourcegitcommit: 3026c22b7fba19059a769ea5f367c4f51efaf286 ms.translationtype: MT ms.contentlocale: de-DE ms.lasthandoff: 06/15/2019 ms.locfileid: "62791740" --- # <a name="automatically-check-out-files-upon-edit"></a>Automatisches Auschecken von Dateien beim Bearbeiten Sie können [!INCLUDE[ssManStudioFull](../includes/ssmanstudiofull-md.md)] so konfigurieren, dass eine Datei automatisch ausgecheckt wird, wenn Sie mit der Bearbeitung anfangen. ### <a name="to-configure-automatic-checkout"></a>So konfigurieren Sie das automatische Auschecken 1. Klicken Sie im Menü **Extras** auf **Optionen**. 2. Erweitern Sie die **Quellcodeverwaltung** Ordner, und klicken Sie dann auf **Umgebung**. 3. In der **bearbeiten** Kontrollkästchen **automatisch Auschecken**, und klicken Sie dann auf **OK**. ## <a name="see-also"></a>Siehe auch [Auschecken von Dateien](../../2014/database-engine/check-out-files.md) [Verwalten von Auscheckvorgängen](../../2014/database-engine/manage-checkouts.md)
37.666667
180
0.756978
deu_Latn
0.834312
7a940d8f8caca0a232c44bcdbfede949ba045aab
740
md
Markdown
docs/csharp/misc/cs2034.md
cy-org/docs-conceptual.zh-cn
0c18cda3dd707efdcdd0e73bc480ab9fbbc4580c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/csharp/misc/cs2034.md
cy-org/docs-conceptual.zh-cn
0c18cda3dd707efdcdd0e73bc480ab9fbbc4580c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/csharp/misc/cs2034.md
cy-org/docs-conceptual.zh-cn
0c18cda3dd707efdcdd0e73bc480ab9fbbc4580c
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: "编译器错误 CS2034 | Microsoft Docs" ms.date: "2015-07-20" ms.prod: ".net" ms.technology: - "devlang-csharp" ms.topic: "article" f1_keywords: - "CS2034" dev_langs: - "CSharp" helpviewer_keywords: - "CS2034" ms.assetid: 72f2b785-ee23-4a1b-b12d-42d19c324d5e caps.latest.revision: 11 author: "BillWagner" ms.author: "wiwagn" caps.handback.revision: 11 --- # 编译器错误 CS2034 声明外部别名的 \/reference 选项只能有一个文件名。 若要指定多个别名或文件名,请使用多个 \/reference 选项。 若要指定两个别名和\/或文件名,使用两个 **\/reference** 选项,如下所示: ## 示例 下面的代码将生成错误 CS2034。 ``` // CS2034.cs // compile with: /r:A1=cs2034a1.dll;A2=cs2034a2.dll // to fix, compile with: /r:A1=cs2034a1.dll /r:A2=cs2034a2.dll // CS2034 extern alias A1; extern alias A2; using System; ```
24.666667
187
0.687838
yue_Hant
0.261355
7a94813aeda2fdbaea3f13a9ffb4b38c7aa45bc5
1,346
md
Markdown
translations/es-XL/content/github/administering-a-repository/enabling-anonymous-git-read-access-for-a-repository.md
kyawburma/docs
0ff7de03be7c2432ced123aca17bfbf444bee1bf
[ "CC-BY-4.0", "MIT" ]
20
2021-02-17T16:18:11.000Z
2022-03-16T08:30:36.000Z
translations/es-XL/content/github/administering-a-repository/enabling-anonymous-git-read-access-for-a-repository.md
kyawburma/docs
0ff7de03be7c2432ced123aca17bfbf444bee1bf
[ "CC-BY-4.0", "MIT" ]
419
2021-01-27T03:39:25.000Z
2022-03-26T20:28:31.000Z
translations/es-XL/content/github/administering-a-repository/enabling-anonymous-git-read-access-for-a-repository.md
kyawburma/docs
0ff7de03be7c2432ced123aca17bfbf444bee1bf
[ "CC-BY-4.0", "MIT" ]
46
2020-11-05T10:39:05.000Z
2021-07-23T11:35:59.000Z
--- title: Activar el acceso de lectura Git anónimo para un repositorio intro: 'Como administrador de un repositorio, puedes habilitar o inhabilitar el acceso de lectura Git anónimo para repositorios públicos que cumplen con determinados requisitos.' redirect_from: - /articles/enabling-anonymous-git-read-access-for-a-repository versions: enterprise-server: '*' --- Los administradores de repositorios pueden cambiar el acceso de lectura Git anónimo y establecer un repositorio específico en los siguientes casos: - Si un administrador del sitio ha habilitado el modo privado y el acceso de lectura Git anónimo. - Si el repositorio es público en la instancia y no es una bifurcación. - Si un administrador del sitio no ha inhabilitado el acceso de lectura Git anónimo para el repositorio. {% data reusables.enterprise_user_management.exceptions-for-enabling-anonymous-git-read-access %} {% data reusables.repositories.navigate-to-repo %} {% data reusables.repositories.sidebar-settings %} 3. Junto a "Habilitar el acceso de lectura Git anónimo", haz clic en **Habilitar**. ![Botón "Habilitado" en "Acceso de lectura Git anónimo"](/assets/images/help/repository/enable-git-read-access-for-a-repo.png) 4. Revisa los cambios. Para confirmar, escribe el nombre del repositorio y haz clic en **Comprendo. Habilitar el acceso de lectura Git.**
64.095238
210
0.792719
spa_Latn
0.971095
7a95bcd04a0763deef75b9dc23d2a199aeb7eb9f
1,784
md
Markdown
docs/access/desktop-database-reference/data-manipulation-language.md
MicrosoftDocs/office-developer-client-docs.pt-BR
7b878649ac5b6a1113bf06b099adffa17c956c16
[ "CC-BY-4.0", "MIT" ]
2
2020-05-19T18:52:31.000Z
2021-04-21T00:13:46.000Z
docs/access/desktop-database-reference/data-manipulation-language.md
MicrosoftDocs/office-developer-client-docs.pt-BR
7b878649ac5b6a1113bf06b099adffa17c956c16
[ "CC-BY-4.0", "MIT" ]
4
2021-12-08T02:35:59.000Z
2021-12-08T02:53:43.000Z
docs/access/desktop-database-reference/data-manipulation-language.md
MicrosoftDocs/office-developer-client-docs.pt-BR
7b878649ac5b6a1113bf06b099adffa17c956c16
[ "CC-BY-4.0", "MIT" ]
2
2019-10-13T18:19:41.000Z
2021-11-25T00:39:27.000Z
--- title: Linguagem de manipulação de dados (Microsoft Access SQL) TOCTitle: Data manipulation language ms:assetid: 25c6f127-0fee-470e-bf16-9253b14e8086 ms:mtpsurl: https://msdn.microsoft.com/library/Dn124073(v=office.15) ms:contentKeyID: 52071710 ms.date: 09/18/2015 mtps_version: v=office.15 ms.localizationpriority: high ms.openlocfilehash: 307ca387cea6200ae336951bd2e25a0d17f55126 ms.sourcegitcommit: a1d9041c20256616c9c183f7d1049142a7ac6991 ms.translationtype: HT ms.contentlocale: pt-BR ms.lasthandoff: 09/24/2021 ms.locfileid: "59594367" --- # <a name="data-manipulation-language-microsoft-access-sql"></a>Linguagem de manipulação de dados (Microsoft Access SQL) **Aplica-se ao**: Access 2013, Office 2013 - [Instrução DELETE](delete-statement-microsoft-access-sql.md) - [Instrução EXECUTE](execute-statement-microsoft-access-sql.md) - [Operação INNER JOIN](inner-join-operation-microsoft-access-sql.md) - [Instrução INSERT INTO](insert-into-statement-microsoft-access-sql.md) - [Operações LEFT JOIN e RIGHT JOIN](left-join-right-join-operations-microsoft-access-sql.md) - [Declaração PARAMETERS](parameters-declaration-microsoft-access-sql.md) - [Cláusula PROCEDURE](procedure-clause-microsoft-access-sql.md) - [Instrução SELECT](select-statement-microsoft-access-sql.md) - [Instrução SELECT.INTO](select-into-statement-microsoft-access-sql.md) - [Subconsultas SQL](sql-subqueries-microsoft-access-sql.md) - [Instrução TRANSACTION](transaction-statement-microsoft-access-sql.md) - [Instrução TRANSFORM](transform-statement-microsoft-access-sql.md) - [Operação UNION](union-operation-microsoft-access-sql.md) - [Instrução UPDATE](update-statement-microsoft-access-sql.md) - [Declaração WITH OWNERACCESS OPTION](with-owneraccess-option-declaration-microsoft-access-sql.md)
46.947368
120
0.803251
yue_Hant
0.383353
7a9638a7520123839919233058f1bf49e8a37892
12,459
md
Markdown
articles/data-factory/quickstart-create-data-factory-azure-cli.md
Myhostings/azure-docs.tr-tr
536eaf3b454f181f4948041d5c127e5d3c6c92cc
[ "CC-BY-4.0", "MIT" ]
16
2017-08-28T08:29:36.000Z
2022-01-02T16:46:30.000Z
articles/data-factory/quickstart-create-data-factory-azure-cli.md
Ahmetmaman/azure-docs.tr-tr
536eaf3b454f181f4948041d5c127e5d3c6c92cc
[ "CC-BY-4.0", "MIT" ]
470
2017-11-11T20:59:16.000Z
2021-04-10T17:06:28.000Z
articles/data-factory/quickstart-create-data-factory-azure-cli.md
Ahmetmaman/azure-docs.tr-tr
536eaf3b454f181f4948041d5c127e5d3c6c92cc
[ "CC-BY-4.0", "MIT" ]
25
2017-11-11T19:39:08.000Z
2022-03-30T13:47:56.000Z
--- title: 'Hızlı başlangıç: Azure CLı kullanarak Azure Data Factory oluşturma' description: Bu hızlı başlangıç, bağlı hizmet, veri kümeleri ve işlem hattı dahil bir Azure Data Factory oluşturur. Bir dosya kopyalama eylemi yapmak için işlem hattını çalıştırabilirsiniz. author: linda33wj ms.author: jingwang ms.service: azure-cli ms.topic: quickstart ms.date: 03/24/2021 ms.custom: - template-quickstart - devx-track-azurecli ms.openlocfilehash: b40407f4c4fb81bbf76bd0b552f3c9f2c827232a ms.sourcegitcommit: 2aeb2c41fd22a02552ff871479124b567fa4463c ms.translationtype: MT ms.contentlocale: tr-TR ms.lasthandoff: 04/22/2021 ms.locfileid: "107871545" --- # <a name="quickstart-create-an-azure-data-factory-using-azure-cli"></a>Hızlı başlangıç: Azure CLı kullanarak Azure Data Factory oluşturma Bu hızlı başlangıçta Azure Data Factory oluşturmak için Azure CLı 'nin nasıl kullanılacağı açıklanmaktadır. Bu veri fabrikasında oluşturduğunuz işlem hattı, verileri bir Azure Blob depolama alanındaki bir klasörden başka bir klasöre kopyalar. Azure Data Factory kullanarak verileri dönüştürme hakkında daha fazla bilgi için bkz. [Azure Data Factory verileri dönüştürme](transform-data.md). Azure Data Factory hizmetine giriş bilgileri için bkz. [Azure Data Factory'ye giriş](introduction.md). Azure aboneliğiniz yoksa başlamadan önce [ücretsiz bir hesap](https://azure.microsoft.com/free/) oluşturun. [!INCLUDE [azure-cli-prepare-your-environment](../../includes/azure-cli-prepare-your-environment.md)] > [!NOTE] > Data Factory örnekleri oluşturmak için, Azure’da oturum açarken kullandığınız kullanıcı hesabı, katkıda bulunan, sahip veya yönetici rollerinin üyesi ya da bir Azure aboneliğinin yöneticisi olmalıdır. Daha fazla bilgi için bkz. [Azure rolleri](quickstart-create-data-factory-powershell.md#azure-roles). ## <a name="prepare-a-container-and-test-file"></a>Kapsayıcı ve test dosyası hazırlama Bu hızlı başlangıç, bir dosya içeren bir kapsayıcı içeren bir Azure depolama hesabı kullanır. 1. Adlı bir kaynak grubu oluşturmak için `ADFQuickStartRG` [az Group Create](/cli/azure/group#az_group_create) komutunu kullanın: ```azurecli az group create --name ADFQuickStartRG --location eastus ``` 1. [Az Storage Account Create](/cli/azure/storage/container#az_storage_container_create) komutunu kullanarak bir depolama hesabı oluşturun: ```azurecli az storage account create --resource-group ADFQuickStartRG \ --name adfquickstartstorage --location eastus ``` 1. `adftutorial` [Az Storage Container Create](/cli/azure/storage/container#az_storage_container_create) komutunu kullanarak adlı bir kapsayıcı oluşturun: ```azurecli az storage container create --resource-group ADFQuickStartRG --name adftutorial \ --account-name adfquickstartstorage --auth-mode key ``` 1. Yerel dizinde, karşıya yüklemek için adlı bir dosya oluşturun `emp.txt` . Azure Cloud Shell ' de çalışıyorsanız, Bash komutunu kullanarak geçerli çalışma dizinini bulabilirsiniz `echo $PWD` . Bir dosya oluşturmak için, gibi standart Bash komutlarını kullanabilirsiniz `cat` : ```console cat > emp.txt This is text. ``` Yeni dosyanızı kaydetmek için **CTRL + D** tuşlarını kullanın. 1. Yeni dosyayı Azure depolama kapsayıcısına yüklemek için [az Storage blob upload](/cli/azure/storage/blob#az_storage_blob_upload) komutunu kullanın: ```azurecli az storage blob upload --account-name adfquickstartstorage --name input/emp.txt \ --container-name adftutorial --file emp.txt --auth-mode key ``` Bu komut adlı yeni bir klasöre yükler `input` . 
## <a name="create-a-data-factory"></a>Veri fabrikası oluşturma Bir Azure Data Factory oluşturmak için [az DataFactory Factory Create](/cli/azure/datafactory/factory#az_datafactory_factory_create) komutunu çalıştırın: ```azurecli az datafactory factory create --resource-group ADFQuickStartRG \ --factory-name ADFTutorialFactory ``` > [!IMPORTANT] > `ADFTutorialFactory`Genel olarak benzersiz bir veri fabrikası adıyla değiştirin, örneğin, ADFTutorialFactorySP1127. [Az DataFactory Factory Show](/cli/azure/datafactory/factory#az_datafactory_factory_show) komutunu kullanarak oluşturduğunuz veri fabrikasını görebilirsiniz: ```azurecli az datafactory factory show --resource-group ADFQuickStartRG \ --factory-name ADFTutorialFactory ``` ## <a name="create-a-linked-service-and-datasets"></a>Bağlı hizmet ve veri kümeleri oluşturma Ardından, bağlı bir hizmet ve iki veri kümesi oluşturun. 1. [Az Storage Account Show-Connection-String](/cli/azure/datafactory/factory#az_datafactory_factory_show) komutunu kullanarak depolama hesabınızın bağlantı dizesini alın: ```azurecli az storage account show-connection-string --resource-group ADFQuickStartRG \ --name adfquickstartstorage --key primary ``` 1. Çalışma dizininizde, bu içeriğe sahip bir JSON dosyası oluşturun ve önceki adımdan kendi Bağlantı dizenizi dahil edin. Dosyayı adlandırın `AzureStorageLinkedService.json` : ```json { "type":"AzureStorage", "typeProperties":{ "connectionString":{ "type": "SecureString", "value":"DefaultEndpointsProtocol=https;AccountName=adfquickstartstorage;AccountKey=K9F4Xk/EhYrMBIR98rtgJ0HRSIDU4eWQILLh2iXo05Xnr145+syIKNczQfORkQ3QIOZAd/eSDsvED19dAwW/tw==;EndpointSuffix=core.windows.net" } } } ``` 1. `AzureStorageLinkedService` [Az DataFactory Linked-Service Create](/cli/azure/datafactory/linked-service#az_datafactory_linked_service_create) komutunu kullanarak adlı bağlı bir hizmet oluşturun: ```azurecli az datafactory linked-service create --resource-group ADFQuickStartRG \ --factory-name ADFTutorialFactory --linked-service-name AzureStorageLinkedService \ --properties @AzureStorageLinkedService.json ``` 1. Çalışma dizininizde, bu içerikle birlikte şu adlı bir JSON dosyası oluşturun `InputDataset.json` : ```json { "type": "AzureBlob", "linkedServiceName": { "type":"LinkedServiceReference", "referenceName":"AzureStorageLinkedService" }, "annotations": [], "type": "Binary", "typeProperties": { "location": { "type": "AzureBlobStorageLocation", "fileName": "emp.txt", "folderPath": "input", "container": "adftutorial" } } } ``` 1. `InputDataset` [Az DataFactory DataSet Create](/cli/azure/datafactory/dataset#az_datafactory_dataset_create) komutunu kullanarak adlı bir giriş veri kümesi oluşturun: ```azurecli az datafactory dataset create --resource-group ADFQuickStartRG \ --dataset-name InputDataset --factory-name ADFQuickStartFactory \ --properties @InputDataset.json ``` 1. Çalışma dizininizde, bu içerikle birlikte şu adlı bir JSON dosyası oluşturun `OutputDataset.json` : ```json { "type": "AzureBlob", "linkedServiceName": { "type":"LinkedServiceReference", "referenceName":"AzureStorageLinkedService" }, "annotations": [], "type": "Binary", "typeProperties": { "location": { "type": "AzureBlobStorageLocation", "fileName": "emp.txt", "folderPath": "output", "container": "adftutorial" } } } ``` 1. 
`OutputDataset` [Az DataFactory DataSet Create](/cli/azure/datafactory/dataset#az_datafactory_dataset_create) komutunu kullanarak adlı bir çıktı veri kümesi oluşturun: ```azurecli az datafactory dataset create --resource-group ADFQuickStartRG \ --dataset-name OutputDataset --factory-name ADFQuickStartFactory \ --properties @OutputDataset.json ``` ## <a name="create-and-run-the-pipeline"></a>İşlem hattını oluşturma ve çalıştırma Son olarak, işlem hattını oluşturun ve çalıştırın. 1. Çalışma dizininizde, şu içerik adlı bir JSON dosyası oluşturun `Adfv2QuickStartPipeline.json` : ```json { "name": "Adfv2QuickStartPipeline", "properties": { "activities": [ { "name": "CopyFromBlobToBlob", "type": "Copy", "dependsOn": [], "policy": { "timeout": "7.00:00:00", "retry": 0, "retryIntervalInSeconds": 30, "secureOutput": false, "secureInput": false }, "userProperties": [], "typeProperties": { "source": { "type": "BinarySource", "storeSettings": { "type": "AzureBlobStorageReadSettings", "recursive": true } }, "sink": { "type": "BinarySink", "storeSettings": { "type": "AzureBlobStorageWriteSettings" } }, "enableStaging": false }, "inputs": [ { "referenceName": "InputDataset", "type": "DatasetReference" } ], "outputs": [ { "referenceName": "OutputDataset", "type": "DatasetReference" } ] } ], "annotations": [] } } ``` 1. `Adfv2QuickStartPipeline` [Az DataFactory ardışık düzen Create](/cli/azure/datafactory/pipeline#az_datafactory_pipeline_create) komutunu kullanarak adlı bir işlem hattı oluşturun: ```azurecli az datafactory pipeline create --resource-group ADFQuickStartRG \ --factory-name ADFTutorialFactory --name Adfv2QuickStartPipeline \ --pipeline @Adfv2QuickStartPipeline.json ``` 1. [Az DataFactory ardışık düzen oluştur-Çalıştır](/cli/azure/datafactory/pipeline#az_datafactory_pipeline_create_run) komutunu kullanarak işlem hattını çalıştırın: ```azurecli az datafactory pipeline create-run --resource-group ADFQuickStartRG \ --name Adfv2QuickStartPipeline --factory-name ADFTutorialFactory ``` Bu komut bir çalıştırma KIMLIĞI döndürür. Sonraki komutta kullanmak üzere kopyalayın. 1. [Az DataFactory işlem hattı-Run Show](/cli/azure/datafactory/pipeline-run#az_datafactory_pipeline_run_show) komutunu kullanarak işlem hattının başarılı olduğunu doğrulayın: ```azurecli az datafactory pipeline-run show --resource-group ADFQuickStartRG \ --factory-name ADFTutorialFactory --run-id 00000000-0000-0000-0000-000000000000 ``` [Azure Portal](https://portal.azure.com/)kullanarak, işlem hattınızı beklendiği gibi çalıştığını da doğrulayabilirsiniz. Daha fazla bilgi için bkz. [dağıtılan kaynakları İnceleme](quickstart-create-data-factory-powershell.md#review-deployed-resources). ## <a name="clean-up-resources"></a>Kaynakları temizleme Bu hızlı başlangıçtaki tüm kaynaklar aynı kaynak grubunun bir parçasıdır. Tümünü kaldırmak için [az Group Delete](/cli/azure/group#az_group_delete) komutunu kullanın: ```azurecli az group delete --name ADFQuickStartRG ``` Bu kaynak grubunu başka bir şey için kullanıyorsanız, tek tek kaynakları silin. Örneğin, bağlı hizmeti kaldırmak için [az DataFactory Linked-Service Delete](/cli/azure/datafactory/linked-service#az_datafactory_linked_service_delete) komutunu kullanın. Bu hızlı başlangıçta aşağıdaki JSON dosyalarını oluşturdunuz: - Üzerinde AzureStorageLinkedService.js - Üzerinde InputDataset.js - Üzerinde OutputDataset.js - Üzerinde Adfv2QuickStartPipeline.js Standart Bash komutlarını kullanarak bunları silin. 
## <a name="next-steps"></a>Sonraki adımlar - [Azure Data Factory’de işlem hatları ve etkinlikler](concepts-pipelines-activities.md) - [Azure Data Factory'de bağlı hizmetler](concepts-linked-services.md) - [Azure Data Factory'de veri kümeleri](concepts-datasets-linked-services.md) - [Azure Data Factory'de veri dönüştürme](transform-data.md)
41.949495
389
0.676619
tur_Latn
0.985123
7a9675617650f85721abe10e3982627cedd1d280
2,012
md
Markdown
_posts/2018-09-19-The 2018 ACM-ICPC Asia Qingdao Regional Contest, Online.md
GAOYAN9/gaoyan.github.io
e2257cdd453330347652d70b8127a093368da495
[ "MIT" ]
230
2019-02-23T04:25:07.000Z
2022-03-12T04:29:57.000Z
_posts/2018-09-19-The 2018 ACM-ICPC Asia Qingdao Regional Contest, Online.md
GAOYAN9/gaoyan.github.io
e2257cdd453330347652d70b8127a093368da495
[ "MIT" ]
17
2019-09-08T09:15:48.000Z
2021-12-18T13:28:13.000Z
_posts/2018-09-19-The 2018 ACM-ICPC Asia Qingdao Regional Contest, Online.md
GAOYAN9/gaoyan.github.io
e2257cdd453330347652d70b8127a093368da495
[ "MIT" ]
1,145
2019-01-06T11:42:02.000Z
2022-03-29T15:24:13.000Z
---
redirect_from: /_posts/2018-09-19-The-2018-ACM-ICPC-Asia-Qingdao-Regional-Contest,-Online/
title: The 2018 ACM-ICPC Asia Qingdao Regional Contest, Online
tags:
  - ACM
---
## [Live Love](https://vjudge.net/problem/ZOJ-4047)

```c
#include<stdio.h>
int t,n,m;
int main()
{
	for(scanf("%d",&t); t--;)
	{
		scanf("%d%d",&n,&m);
		if(n==1||!m)printf("%d %d\n",m,m);
		/* max streak is m; min streak is the smallest i such that runs of
		   at most i wins, separated by single losses, fit m wins in n games */
		else for(int i=1,tmp,res; i<=m; ++i)
			if(tmp=n/(i+1),res=n-tmp*(i+1),tmp*i+res>=m)
			{
				printf("%d %d\n",m,i);
				break;
			}
	}
}
```

## [Halting Problem](https://vjudge.net/problem/ZOJ-4049)

```c
#include<bits/stdc++.h>
using namespace std;
struct Instruction
{
	char s[9];
	int v,k,vis[256];
} a[10009];
int t,n,i,r;
int main()
{
	for(scanf("%d",&t); t--;)
	{
		scanf("%d",&n);
		for(i=1; i<=n; ++i)
		{
			fill(a[i].vis,a[i].vis+256,0);
			scanf("%s%d",a[i].s,&a[i].v); /* pass the array itself for %s */
			if(a[i].s[1]!='d')scanf("%d",&a[i].k);
		}
		/* simulate; a state is (instruction, register value in 0..255):
		   revisiting a state means an infinite loop, running past n means halt */
		for(i=1,r=0; i<=n&&!a[i].vis[r]; ++i)
			switch(a[i].vis[r]=1,a[i].s[1])
			{
			case 'd': r=(r+a[i].v)%256; break;
			case 'e': if(r==a[i].v)i=a[i].k-1; break;
			case 'n': if(r!=a[i].v)i=a[i].k-1; break;
			case 'l': if(r<a[i].v)i=a[i].k-1; break;
			case 'g': if(r>a[i].v)i=a[i].k-1; break;
			}
		printf(i>n?"Yes\n":"No\n");
	}
}
```

## [Traveling on the Axis](https://vjudge.net/problem/ZOJ-4054)

```cpp
#include<bits/stdc++.h>
using namespace std;
typedef long long ll;
char s[100009];
ll t,ans;
int main()
{
	for(scanf("%lld",&t); t--; printf("%lld\n",ans))
	{
		scanf("%s",s);
		/* sum travel time over all (start, end) pairs: each edge is crossed
		   (i+1)*(n-i) times, plus extra units spent waiting at red lights */
		for(ll i=ans=0,n=strlen(s); i<n; ++i)
		{
			ans+=(i+1)*(n-i);
			if(s[i]=='0')ans+=n-i;
			if(i&&s[i]==s[i-1])ans+=i*(n-i);
		}
	}
}
```

## [XOR Clique](https://vjudge.net/problem/ZOJ-4057)

```cpp
#include<bits/stdc++.h>
using namespace std;
int t,n,a,k,cnt[31];
int main()
{
	for(scanf("%d",&t); t--;)
	{
		fill(cnt,cnt+31,0);
		for(scanf("%d",&n); n--;)
		{
			scanf("%d",&a);
			for(k=0; a; ++k)a>>=1; /* k = bit length of a */
			++cnt[k];
		}
		/* two numbers XOR below both of them exactly when they share the same
		   highest set bit, so the answer is the largest equal-bit-length group */
		printf("%d\n",*max_element(cnt,cnt+31));
	}
}
```
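The "Traveling on the Axis" solution states its closed-form sum without derivation. A small brute-force cross-check can validate it on short strings — this sketch is my own, and it assumes the usual reading of ZOJ 4054 (lights toggle every unit of time, a red light costs one unit of waiting, and the answer sums travel time over all start/end pairs):

```python
def travel(s: str, p: int, q: int) -> int:
    """Simulate walking from point p to q under the assumed light rules."""
    t = 0
    for i in range(p, q):
        if (int(s[i]) + t) % 2 == 0:  # light i shows red at time t (assumption)
            t += 1                     # wait one unit for it to turn green
        t += 1                         # move one unit to the next point
    return t

def total(s: str) -> int:
    n = len(s)
    return sum(travel(s, p, q) for p in range(n) for q in range(p + 1, n + 1))

def closed_form(s: str) -> int:
    """The O(n) formula from the solution above."""
    n, ans = len(s), 0
    for i in range(n):
        ans += (i + 1) * (n - i)
        if s[i] == "0":
            ans += n - i
        if i and s[i] == s[i - 1]:
            ans += i * (n - i)
    return ans

for s in ("0", "1", "10", "11", "0110"):
    assert total(s) == closed_form(s), s
print("formula matches brute force on samples")
```

Under these assumed rules the assertions pass, which is a quick way to gain confidence in the per-edge accounting before trusting it on full-size inputs.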
16.907563
90
0.510437
yue_Hant
0.077834
7a967855fcc2f3d42096e8c0d184cd68b8185847
446
md
Markdown
README.md
gocontrib/auth
ee09c68e14fd634b85c79e675a135312f825be2a
[ "MIT" ]
3
2015-03-22T08:08:59.000Z
2020-09-07T05:15:11.000Z
README.md
gocontrib/auth
ee09c68e14fd634b85c79e675a135312f825be2a
[ "MIT" ]
2
2019-01-30T12:36:30.000Z
2019-02-01T12:10:54.000Z
README.md
gocontrib/auth
ee09c68e14fd634b85c79e675a135312f825be2a
[ "MIT" ]
null
null
null
[![Build Status](https://travis-ci.org/gocontrib/auth.svg?branch=master)](https://travis-ci.org/gocontrib/auth)
[![codecov](https://codecov.io/gh/gocontrib/auth/branch/master/graph/badge.svg)](https://codecov.io/gh/gocontrib/auth)

# auth

Authentication package for golang including middleware with generic implementation of [Basic HTTP](http://en.wikipedia.org/wiki/Basic_access_authentication) and [JWT](http://jwt.io/) authentication schemes.
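For readers unfamiliar with the two schemes this middleware checks: a Basic HTTP credential is just `base64("user:password")` in the `Authorization` header, while a JWT is a signed token carried in the same header. A minimal wire-level illustration — plain Python, not this package's Go API, with invented credentials:

```python
import base64

user, password = "alice", "s3cret"  # illustrative credentials only
token = base64.b64encode(f"{user}:{password}".encode()).decode()
headers_basic = {"Authorization": f"Basic {token}"}

# A JWT travels in the same header with a different scheme prefix;
# the token itself is produced and verified by a JWT library.
headers_jwt = {"Authorization": "Bearer <signed.jwt.token>"}
print(headers_basic, headers_jwt, sep="\n")
```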
63.714286
156
0.780269
kor_Hang
0.225239
7a968a70987d80d4c51ceb98c937cda26a89debc
658
md
Markdown
README.md
baroood/Hacktoberfest-2k17
87383df4bf705358866a5a4120dd678a3f2acd3e
[ "MIT" ]
28
2017-10-04T19:42:26.000Z
2021-03-26T04:00:48.000Z
README.md
baroood/Hacktoberfest-2k17
87383df4bf705358866a5a4120dd678a3f2acd3e
[ "MIT" ]
375
2017-09-28T02:58:37.000Z
2019-10-31T09:10:38.000Z
README.md
baroood/Hacktoberfest-2k17
87383df4bf705358866a5a4120dd678a3f2acd3e
[ "MIT" ]
519
2017-09-28T02:40:29.000Z
2021-02-15T08:29:17.000Z
# Hacktoberfest-2k17

Official repository for the Hacktoberfest 2k17 NITK Edition meetup, conducted at NITK Surathkal in collaboration with Team Engineer.

## CONTRIBUTING

* This repository is for the NITK Hacktoberfest Meetup for the year 2k17. Its basic aim is to get people started with open source contribution.
* To get started, take a look at our [CONTRIBUTING doc](CONTRIBUTING.md) for more information.

## License

* This repository is under the [MIT LICENSE](LICENSE). For more information, refer to the license terms.

## Code of Conduct

* See the [Web Club Code of Conduct](CODE_OF_CONDUCT.md) for the dos and don'ts of the repository.
38.705882
145
0.782675
eng_Latn
0.973794
7a96e2bcbf9c480ff439f0a844498cc1aab3e700
218
md
Markdown
_watches/M20200312_031634_TLP_3.md
Meteoros-Floripa/meteoros.floripa.br
7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad
[ "MIT" ]
5
2020-01-22T17:44:06.000Z
2020-01-26T17:57:58.000Z
_watches/M20200312_031634_TLP_3.md
Meteoros-Floripa/site
764cf471d85a6b498873610e4f3b30efd1fd9fae
[ "MIT" ]
null
null
null
_watches/M20200312_031634_TLP_3.md
Meteoros-Floripa/site
764cf471d85a6b498873610e4f3b30efd1fd9fae
[ "MIT" ]
2
2020-05-19T17:06:27.000Z
2020-09-04T00:00:43.000Z
---
layout: watch
title: TLP3 - 12/03/2020 - M20200312_031634_TLP_3T.jpg
date: 2020-03-12 03:16:34
permalink: /2020/03/12/watch/M20200312_031634_TLP_3
capture: TLP3/2020/202003/20200311/M20200312_031634_TLP_3T.jpg
---
27.25
62
0.784404
fra_Latn
0.035578
7a972e97dccc4d4e11b67bbc001474d83921f9db
1,659
md
Markdown
meta/3-6-1.md
KariH1/open-sdg-data
21f6e2c9af9fdc6ae9fd99dcad99b38026ddaf07
[ "MIT" ]
null
null
null
meta/3-6-1.md
KariH1/open-sdg-data
21f6e2c9af9fdc6ae9fd99dcad99b38026ddaf07
[ "MIT" ]
null
null
null
meta/3-6-1.md
KariH1/open-sdg-data
21f6e2c9af9fdc6ae9fd99dcad99b38026ddaf07
[ "MIT" ]
null
null
null
---
data_non_statistical: false
goal_meta_link: https://unstats.un.org/sdgs/metadata/files/Metadata-03-06-01.pdf
goal_meta_link_text: Lýsigögn Sameinuðu Þjóðanna (PDF 213 KB)
graph_type: line
indicator: 3.6.1
indicator_name: Dánartíðni af völdum umferðarslysa.
indicator_sort_order: 03-06-01
layout: indicator
permalink: /3-6-1/
published: true
reporting_status: complete
sdg_goal: '3'
target: Eigi síðar en árið 2020 verði búið að ná fjölda dauðsfalla og alvarlega slasaðra vegna umferðarslysa niður um helming á heimsvísu.
target_id: '3.6'
graph_title: Dánartíðni af völdum umferðarslysa.
un_custodian_agency: Alþjóðaheilbrigðismálastofnunin (WHO)
un_designated_tier: '1'
national_indicator_available: Dánartíðni af völdum umferðarslysa.
national_geographical_coverage: Ísland
computation_units: Hlutfall af hverjum 100,000 íbúum
computation_definitions: Dánartíðni af völdum umferðarslysa. er skilgreint sem fjöldi banaslysa í umferðinni á hverja 100,000 íbúa
computation_calculations: (Banaslys / íbúafjöldi) * 100,000
comments_limitations: Gögn fylgja forskrift Sameinuðu Þjóðanna fyrir þennan mælikvarða. Þessi mælikvarði var fundinn í samstarfi við sérfræðinga á þessu sviði.
source_active_1: true
source_organisation_1: Hagstofa Íslands
source_periodicity_1: Árleg
source_earliest_available_1: 1981
source_geographical_coverage_1: Ísland
source_url_1: https://px.hagstofa.is/pxis/pxweb/is/Atvinnuvegir/Atvinnuvegir__samgongur__Okutaeki/SAM03201.px/
source_url_text_1: Umferðarslys 1981-2018
source_release_date_1: 09/02/2019
source_next_release_1:
source_statistical_classification_1: Opinber tölfræði
source_contact_1: [email protected]
---
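The `computation_calculations` field above defines the indicator as `(fatalities / population) * 100,000`. A one-line arithmetic check, using purely illustrative numbers rather than actual Icelandic statistics:

```python
fatalities = 6          # hypothetical annual road-traffic deaths
population = 360_000    # hypothetical population
rate_per_100k = fatalities / population * 100_000
print(round(rate_per_100k, 2))  # 1.67 deaths per 100,000 inhabitants
```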
41.475
159
0.846896
isl_Latn
0.932935
7a97497d09d664734cb990ff4ed2648b472bb807
1,385
md
Markdown
README.md
kapouer/raja-polyfill
5e1eef18c8a941a24ef7510acc1982b3dbda16e1
[ "MIT" ]
null
null
null
README.md
kapouer/raja-polyfill
5e1eef18c8a941a24ef7510acc1982b3dbda16e1
[ "MIT" ]
null
null
null
README.md
kapouer/raja-polyfill
5e1eef18c8a941a24ef7510acc1982b3dbda16e1
[ "MIT" ]
null
null
null
raja-polyfill
=============

Express-dom plugin that installs <script> tag polyfills depending on the user agent, and emits a Vary directive for proxy caching.

```
var app = require('express')();
var dom = require('express-dom');
var polyfill = require('raja-polyfill');

app.get('/mypage', dom('myview').use(polyfill({
  "classlist": "https://raw.githubusercontent.com/eligrey/classList.js/master/classList.js",
  "custom-elements": "/js/webcomponents-lite.js"
})));
```

The polyfill plugin accepts an object where keys are caniuse-db features, and values are the urls that go in script tags.
As shown in the example above, what is actually loaded in the script tag does not necessarily match exactly the tested feature: here we just test custom-elements to install [many more polyfills](https://github.com/webcomponents/webcomponentsjs/blob/master/webcomponents-lite.js#L53).

using with a cache
------------------

raja-polyfill sets the response header Vary: User-Agent so that proxies can properly cache responses by User-Agent.

A better caching strategy can be achieved with User-Agent version negotiation: raja-polyfill sets this header in the response:

User-Agent-Versions: ie < 9, ff < 43, chrome < 10, opera < 22

along with Vary: User-Agent-Versions of course.

This allows the cache to compare the user-agent version with the ranges given in User-Agent-Versions for each url variant.
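To make the negotiation concrete, here is a rough sketch of how a cache could turn the advertised version boundaries plus a client's browser and version into a stable cache-key bucket. This is my own illustration in Python rather than the plugin's JavaScript, and the header-parsing details are assumptions:

```python
import re

def bucket(versions_header: str, browser: str, version: float) -> str:
    """Map a client onto one side of each advertised version boundary."""
    for clause in versions_header.split(","):
        m = re.match(r"\s*(\w+)\s*<\s*([\d.]+)", clause)
        if m and m.group(1) == browser:
            boundary = float(m.group(2))
            side = "old" if version < boundary else "new"
            return f"{browser}:{side}<{boundary:g}"
    return "default"  # browser not mentioned: share one variant

header = "ie < 9, ff < 43, chrome < 10, opera < 22"
print(bucket(header, "ff", 42))      # ff:old<43 -> gets the polyfilled variant
print(bucket(header, "ff", 60))      # ff:new<43 -> gets the lean variant
print(bucket(header, "safari", 13))  # default
```

The point of the header is exactly this collapsing: instead of one cached variant per raw User-Agent string, the cache only needs one variant per side of each version boundary.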
33.780488
137
0.752347
eng_Latn
0.96369
7a98335e1c1adc3b84055e71c9fec409065ffe8d
5,768
md
Markdown
articles/data-lake-analytics/runtime-troubleshoot.md
tsunami416604/azure-docs.hu-hu
aeba852f59e773e1c58a4392d035334681ab7058
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/data-lake-analytics/runtime-troubleshoot.md
tsunami416604/azure-docs.hu-hu
aeba852f59e773e1c58a4392d035334681ab7058
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/data-lake-analytics/runtime-troubleshoot.md
tsunami416604/azure-docs.hu-hu
aeba852f59e773e1c58a4392d035334681ab7058
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: A Azure Data Lake Analytics U-SQL futásidejű hibáinak elhárítása description: Ismerje meg, hogy miként lehet elhárítani az U-SQL futásidejű hibáit. ms.reviewer: jasonh ms.service: data-lake-analytics ms.topic: troubleshooting ms.date: 10/10/2019 ms.openlocfilehash: 41b7c80c85331f288343351749e6b2e5292b30c6 ms.sourcegitcommit: 30906a33111621bc7b9b245a9a2ab2e33310f33f ms.translationtype: MT ms.contentlocale: hu-HU ms.lasthandoff: 11/22/2020 ms.locfileid: "95241607" --- # <a name="learn-how-to-troubleshoot-u-sql-runtime-failures-due-to-runtime-changes"></a>Ismerje meg, hogy miként lehet elhárítani a futásidejű változások miatti U-SQL futásidejű hibákat A Azure Data Lake u-SQL-futtatókörnyezet, beleértve a fordítót, az optimalizáló és a Feladatkezelőt, a u-SQL-kódot dolgozza fel. ## <a name="choosing-your-u-sql-runtime-version"></a>A U-SQL futtatókörnyezet-verziójának kiválasztása Ha a Visual studióból, az ADL SDK-ból vagy a Azure Data Lake Analytics-portálról küld el U-SQL-feladatokat, a feladat az aktuálisan elérhető alapértelmezett futtatókörnyezetet fogja használni. Az U-SQL Runtime új verziói rendszeresen jelennek meg, és a másodlagos frissítéseket és a biztonsági javításokat is tartalmazzák. Választhat egyéni futtatókörnyezet-verziót is; vagy azért, mert egy új frissítést szeretne kipróbálni, a futtatókörnyezet egy régebbi verziójában kell maradnia, vagy a jelentett probléma gyorsjavításával lett ellátva, amely nem várja meg a rendszeres új frissítést. > [!CAUTION] > Az alapértelmezetttől eltérő futtatókörnyezet kiválasztásával megszakíthatja a U-SQL-feladatokat. Ezeket a más verziókat csak tesztelésre használhatja. Ritka esetekben a Microsoft ügyfélszolgálata a futtatókörnyezet egy másik verzióját is rögzítheti alapértelmezettként a fiókjához. Győződjön meg róla, hogy a PIN-kódot a lehető leghamarabb visszavonja. Ha továbbra is az adott verzióra van rögzítve, akkor később lejár. ### <a name="monitoring-your-jobs-u-sql-runtime-version"></a>Feladatok figyelése U-SQL futtatókörnyezet-verzió Megtekintheti, hogy a korábbi feladatok melyik futásidejű verzióját használták a fiókjában a Visual Studio böngésző vagy a Azure Portal feladat előzményein keresztül. 1. A Azure Portal lépjen a Data Lake Analytics-fiókra. 2. Válassza **az összes feladat megtekintése** lehetőséget. Megjelenik a fiók összes aktív és legutóbb befejezett feladatának listája. 3. A **szűrő** lehetőségre kattintva megkeresheti a feladatokat az **időtartomány**, a **feladat neve** és a **Szerző** értékei alapján. 4. Láthatja a befejezett feladatokban használt futtatókörnyezetet. ![A korábbi feladatok futtatókörnyezet-verziójának megjelenítése](./media/runtime-troubleshoot/prior-job-usql-runtime-version-.png) Az elérhető futásidejű verziók változnak az idő múlásával. Az alapértelmezett futtatókörnyezetet mindig "default"-nek nevezzük, és legalább az előző futtatókörnyezetet megtartjuk egy ideig, és a speciális futtatókörnyezeteket számos okból elérhetővé tesszük. A explicit módon megnevezett futtatókörnyezetek általában a következő formátumot követik (a dőltek a változó részekhez használatosak, a [] pedig opcionális részeket jelez): release_YYYYMMDD_adl_buildno [_modifier] Például release_20190318_adl_3394512_2 azt jelenti, hogy a Build 3394512-es verziójának második, 18 2019-es és release_20190318_adl_3394512_private-es verzióját jelenti, és az ugyanazon kiadás privát buildje. Megjegyzés: a dátum ahhoz kapcsolódik, hogy mikor került sor az adott kiadás utolsó beadására, és nem feltétlenül a hivatalos kiadási dátumra. 
## <a name="troubleshooting-u-sql-runtime-version-issues"></a>Az U-SQL futásidejű verziójának hibáinak elhárítása A futásidejű verziók két lehetséges problémája merülhet fel: 1. Egy parancsfájl vagy valamilyen felhasználói kód megváltoztatja az egyik kiadás viselkedését a következőre. Az ilyen jellegű feltörési változások általában a kibocsátási megjegyzések közzétételével kapcsolatos idő előtt kerülnek közlésre. Ha ilyen jellegű változást tapasztal, forduljon a Microsoft ügyfélszolgálatahoz, és jelentse ezt a feltörési viselkedést (ha még nincs dokumentálva), és küldje el a feladatokat a régebbi futtatókörnyezet-verzióra. 2. Nem alapértelmezett futtatókörnyezetet használ explicit módon vagy implicit módon, amikor a fiókjában rögzítette, és a futtatókörnyezetet némi idő múlva eltávolították. Ha hiányzó futtatókörnyezetekkel találkozik, frissítse a parancsfájlokat az aktuális alapértelmezett futtatókörnyezettel való futtatásra. Ha további időre van szüksége, lépjen kapcsolatba Microsoft ügyfélszolgálata ## <a name="known-issues"></a>Ismert problémák * Ha egy USQL-parancsfájl 12.0.3 vagy újabb verziójára Newtonsoft.Jshivatkozik, akkor a következő fordítási hiba fog megjelenni: *"Sajnos a Data Lake Analytics-fiókban futó feladatok valószínűleg lassabban futnak, vagy a művelet nem fejeződik be. Egy váratlan probléma miatt nem tudjuk automatikusan visszaállítani a funkciót a Azure Data Lake Analytics-fiókjába. Azure Data Lake mérnököket felvettek a vizsgálatba. "* A hívási verem a következőket fogja tartalmazni: `System.IndexOutOfRangeException: Index was outside the bounds of the array.` `at Roslyn.Compilers.MetadataReader.PEFile.CustomAttributeTableReader.get_Item(UInt32 rowId)` `...` **Megoldás**: használja a Newtonsoft.Jsfájlt a v 12.0.2 vagy az alacsonyabb fájlnál. ## <a name="see-also"></a>Lásd még - [Azure Data Lake Analytics áttekintése](data-lake-analytics-overview.md) - [Azure Data Lake Analytics kezelése Azure Portal használatával](data-lake-analytics-manage-use-portal.md) - [Feladatok figyelése Azure Data Lake Analytics a Azure Portal használatával](data-lake-analytics-monitor-and-troubleshoot-jobs-tutorial.md)
76.906667
455
0.820388
hun_Latn
1.000009
7a9889921d400eafa84f5d8c1d599983bb52c2fb
48
md
Markdown
docs/_snippets/firefuel_naming_conventions/either_examples_bad.md
SupposedlySam/firefuel
16c5e2127cc1cd2b2582ba87b5433ee89a747dba
[ "MIT" ]
2
2021-08-24T19:27:15.000Z
2021-11-01T05:00:39.000Z
docs/_snippets/firefuel_naming_conventions/either_examples_bad.md
SupposedlySam/firefuel
16c5e2127cc1cd2b2582ba87b5433ee89a747dba
[ "MIT" ]
26
2021-09-02T19:00:36.000Z
2021-12-30T17:08:52.000Z
docs/_snippets/firefuel_naming_conventions/either_examples_bad.md
SupposedlySam/firefuel
16c5e2127cc1cd2b2582ba87b5433ee89a747dba
[ "MIT" ]
1
2021-08-24T19:27:18.000Z
2021-08-24T19:27:18.000Z
`user`
`messageOption`
`productType`
`maybePost`
12
15
0.770833
deu_Latn
0.322232
7a98f9ac7f06725a02697168e4f0cc5a29842adf
553
md
Markdown
customize/desktop/unattend/microsoft-windows-wpd-busenumservice-regcacheupdated.md
imingc/commercialization-public
70a2bcf94b61655df50987bfea83d4fc7be443d9
[ "CC-BY-4.0", "MIT" ]
1
2019-01-25T20:02:01.000Z
2019-01-25T20:02:01.000Z
customize/desktop/unattend/microsoft-windows-wpd-busenumservice-regcacheupdated.md
andreiztm/commercialization-public
9a9565a191bc1ecddb33c9b26e701ae32b0c8d65
[ "CC-BY-4.0", "MIT" ]
null
null
null
customize/desktop/unattend/microsoft-windows-wpd-busenumservice-regcacheupdated.md
andreiztm/commercialization-public
9a9565a191bc1ecddb33c9b26e701ae32b0c8d65
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: RegCacheUpdated
description: RegCacheUpdated
MSHAttr:
- 'PreferredSiteName:MSDN'
- 'PreferredLib:/library/windows/hardware'
ms.assetid: e4d488db-e378-41ad-bd83-fc89e970b464
ms.mktglfcycl: deploy
ms.sitesec: msdn
author: themar-msft
ms.author: themar
ms.date: 05/02/2017
ms.topic: article
ms.prod: windows-hardware
ms.technology: windows-oem
---

# RegCacheUpdated

`RegCacheUpdated` is intended for Microsoft internal use only.

## Related topics

[Microsoft-Windows-WPD-BusEnumService](microsoft-windows-wpd-busenumservice.md)
14.552632
79
0.77396
yue_Hant
0.627931
7a997c829d178f4c85c2c01ea5e179088b775c33
221
md
Markdown
README.md
RodDalBen/gam_pn
7d760592afdebd3f6a8da17484a51ed23ddab2dd
[ "MIT" ]
1
2020-11-02T21:39:43.000Z
2020-11-02T21:39:43.000Z
README.md
RodDalBen/gam_pn
7d760592afdebd3f6a8da17484a51ed23ddab2dd
[ "MIT" ]
null
null
null
README.md
RodDalBen/gam_pn
7d760592afdebd3f6a8da17484a51ed23ddab2dd
[ "MIT" ]
null
null
null
# Personal notes on Generalized Additive Models

Personal notes on GAMs. Following a tutorial/course by [Noam Ross](https://github.com/noamross) and [Michael Clark](https://m-clark.github.io/generalized-additive-models/).
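Not part of the original notes: a minimal sketch of fitting a GAM, assuming the Python pyGAM library as a stand-in (the linked course material itself works in R with mgcv). The toy data and term choices are illustrative only.

```python
import numpy as np
from pygam import LinearGAM, s  # assumes: pip install pygam

# Toy data: a smooth nonlinear signal plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(500, 2))
y = np.sin(X[:, 0]) + 0.5 * np.sqrt(X[:, 1]) + rng.normal(0, 0.2, 500)

# One spline smooth per feature; the GAM learns each term's shape from data.
gam = LinearGAM(s(0) + s(1)).fit(X, y)
gam.summary()  # effective degrees of freedom and significance per term

print(gam.predict(X[:5]))  # fitted values for the first five rows
```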
73.666667
172
0.78733
eng_Latn
0.712462
7a99919da45b24451b2af4ac42956d5ca07d3f9f
2,515
md
Markdown
sdk-api-src/content/tapi3/ne-tapi3-msp_address_event.md
amorilio/sdk-api
54ef418912715bd7df39c2561fbc3d1dcef37d7e
[ "CC-BY-4.0", "MIT" ]
null
null
null
sdk-api-src/content/tapi3/ne-tapi3-msp_address_event.md
amorilio/sdk-api
54ef418912715bd7df39c2561fbc3d1dcef37d7e
[ "CC-BY-4.0", "MIT" ]
null
null
null
sdk-api-src/content/tapi3/ne-tapi3-msp_address_event.md
amorilio/sdk-api
54ef418912715bd7df39c2561fbc3d1dcef37d7e
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
UID: NE:tapi3.__MIDL___MIDL_itf_tapi3_0000_0018_0001
title: MSP_ADDRESS_EVENT (tapi3.h)
description: The MSP_ADDRESS_EVENT constant is returned within the MSP_EVENT_INFO struct by the GetEvent method when MSP_EVENT is ME_ADDRESS_EVENT.
helpviewer_keywords: ["ADDRESS_TERMINAL_AVAILABLE","ADDRESS_TERMINAL_UNAVAILABLE","MSP_ADDRESS_EVENT","MSP_ADDRESS_EVENT enumeration [TAPI 2.2]","_tapi3_msp_address_event","msp/ADDRESS_TERMINAL_AVAILABLE","msp/ADDRESS_TERMINAL_UNAVAILABLE","msp/MSP_ADDRESS_EVENT","tapi3.msp_address_event"]
old-location: tapi3\msp_address_event.htm
tech.root: tapi3
ms.assetid: 35aecd05-badd-4509-92e5-1936ca075c37
ms.date: 12/05/2018
ms.keywords: ADDRESS_TERMINAL_AVAILABLE, ADDRESS_TERMINAL_UNAVAILABLE, MSP_ADDRESS_EVENT, MSP_ADDRESS_EVENT enumeration [TAPI 2.2], _tapi3_msp_address_event, msp/ADDRESS_TERMINAL_AVAILABLE, msp/ADDRESS_TERMINAL_UNAVAILABLE, msp/MSP_ADDRESS_EVENT, tapi3.msp_address_event
req.header: tapi3.h
req.include-header: Tapi3.h
req.target-type: Windows
req.target-min-winverclnt:
req.target-min-winversvr:
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib:
req.dll:
req.irql:
targetos: Windows
req.typenames: MSP_ADDRESS_EVENT
req.redist:
ms.custom: 19H1
f1_keywords:
- __MIDL___MIDL_itf_tapi3_0000_0018_0001
- tapi3/__MIDL___MIDL_itf_tapi3_0000_0018_0001
- MSP_ADDRESS_EVENT
- tapi3/MSP_ADDRESS_EVENT
dev_langs:
- c++
topic_type:
- APIRef
- kbSyntax
api_type:
- HeaderDef
api_location:
- msp.h
api_name:
- MSP_ADDRESS_EVENT
---

# MSP_ADDRESS_EVENT enumeration

## -description

The <b>MSP_ADDRESS_EVENT</b> constant is returned within the <a href="/windows/win32/api/msp/ns-msp-msp_event_info">MSP_EVENT_INFO</a> struct by the <a href="/windows/desktop/api/msp/nf-msp-itmspaddress-getevent">GetEvent</a> method when <a href="/windows/win32/api/msp/ne-msp-msp_event">MSP_EVENT</a> is ME_ADDRESS_EVENT.

## -enum-fields

### -field ADDRESS_TERMINAL_AVAILABLE:0

A new terminal arrived by PNP.

### -field ADDRESS_TERMINAL_UNAVAILABLE

A terminal has been removed by PNP.

## -see-also

<a href="/windows/desktop/api/msp/nf-msp-itmspaddress-getevent">ITMSPAddress::GetEvent</a>

<a href="/windows/win32/api/msp/ne-msp-msp_event">MSP_EVENT</a>

<a href="/windows/win32/api/msp/ns-msp-msp_event_info">MSP_EVENT_INFO</a>

<a href="/windows/desktop/Tapi/media-service-provider-interface-mspi-">Media Service Provider Interface (MSPI)</a>
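Not part of the original reference page: a hedged Python mirror of the enumeration, handy when decoding raw event codes outside of C++. The value 0 for ADDRESS_TERMINAL_AVAILABLE is documented above; ADDRESS_TERMINAL_UNAVAILABLE is assumed to be 1 by default C enum numbering.

```python
from enum import IntEnum

class MspAddressEvent(IntEnum):
    """Mirror of the MSP_ADDRESS_EVENT enumeration from tapi3.h."""
    ADDRESS_TERMINAL_AVAILABLE = 0    # documented value
    ADDRESS_TERMINAL_UNAVAILABLE = 1  # assumed: next value by C enum rules

def describe(event_code: int) -> str:
    """Map a raw event code (e.g. from an MSP_EVENT_INFO dump) to text."""
    try:
        event = MspAddressEvent(event_code)
    except ValueError:
        return f"Unknown MSP_ADDRESS_EVENT value: {event_code}"
    if event is MspAddressEvent.ADDRESS_TERMINAL_AVAILABLE:
        return "A new terminal arrived by PNP."
    return "A terminal has been removed by PNP."

print(describe(0))  # -> A new terminal arrived by PNP.
```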
29.588235
290
0.795626
yue_Hant
0.707988
7a99e09a48297720f2bd4c3ca86cb8d0ba33ad50
444
md
Markdown
pickcore/javadoc/pickcore/com.lovoo.android.pickcore.destination/-private-directory/describe-contents.md
lovoo/android-pickpic
225b679bfc41d77d9b0573cbbc466d261c683c61
[ "Apache-2.0" ]
14
2019-11-29T15:44:51.000Z
2021-02-14T09:51:26.000Z
pickcore/javadoc/pickcore/com.lovoo.android.pickcore.destination/-private-directory/describe-contents.md
lovoo/android-pickpic
225b679bfc41d77d9b0573cbbc466d261c683c61
[ "Apache-2.0" ]
2
2019-12-02T14:38:52.000Z
2020-02-11T14:16:04.000Z
pickcore/javadoc/pickcore/com.lovoo.android.pickcore.destination/-private-directory/describe-contents.md
lovoo/android-pickpic
225b679bfc41d77d9b0573cbbc466d261c683c61
[ "Apache-2.0" ]
4
2020-02-11T10:09:59.000Z
2020-04-01T10:36:30.000Z
[pickcore](../../index.md) / [com.lovoo.android.pickcore.destination](../index.md) / [PrivateDirectory](index.md) / [describeContents](./describe-contents.md)

# describeContents

`fun describeContents(): `[`Int`](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-int/index.html) [(source)](https://github.com/lovoo/android-pickpic/blob/master/pickcore/pickcore/src/main/kotlin/com/lovoo/android/pickcore/destination/PrivateDirectory.kt#L58)
88.8
264
0.761261
yue_Hant
0.421581
7a99e0ad67db3c8e5daee164d2a97c6170a4bbe4
421
md
Markdown
libraries/chain/include/cps/chain/README.md
chipslimited/coinio
8d4181d6b90b68d996484bc55350ad3e9dda84a5
[ "MIT" ]
null
null
null
libraries/chain/include/cps/chain/README.md
chipslimited/coinio
8d4181d6b90b68d996484bc55350ad3e9dda84a5
[ "MIT" ]
null
null
null
libraries/chain/include/cps/chain/README.md
chipslimited/coinio
8d4181d6b90b68d996484bc55350ad3e9dda84a5
[ "MIT" ]
null
null
null
Protocol Definition
--------------------

The classes declared in these headers provide the complete definition of the
Cps protocol and are organized according to feature.

Nothing in this directory should depend upon anything other than fc or other
types defined in the protocol directory. To be more specific, implementation
details such as the objects defined in the object database should not be
required here.
38.272727
77
0.776722
eng_Latn
0.999926
7a9a0da470caf7b740babb8f0b74845640806a97
257
md
Markdown
README.md
ATung01/odin-landing-page
f6c2cd5c9974161a224dacd6bf7159d385a92454
[ "MIT" ]
null
null
null
README.md
ATung01/odin-landing-page
f6c2cd5c9974161a224dacd6bf7159d385a92454
[ "MIT" ]
null
null
null
README.md
ATung01/odin-landing-page
f6c2cd5c9974161a224dacd6bf7159d385a92454
[ "MIT" ]
null
null
null
# odin-landing-page

A back-to-basics exercise where I needed to recreate a [webpage](https://cdn.statically.io/gh/TheOdinProject/curriculum/main/foundations/html_css/project/odin-project.png) using only HTML and CSS. Not responsive, as that will come later.
85.666667
236
0.801556
eng_Latn
0.975334
7a9a6230d3187ec84ce1998f0e92359ee1ee0dae
3,249
md
Markdown
zookeeper-docs/src/main/resources/markdown/zookeeperQuotas.md
bingliran/test-zookeeper
07c0f93aaa08d3c748213b4024dfa1cfc20eae84
[ "Apache-2.0" ]
null
null
null
zookeeper-docs/src/main/resources/markdown/zookeeperQuotas.md
bingliran/test-zookeeper
07c0f93aaa08d3c748213b4024dfa1cfc20eae84
[ "Apache-2.0" ]
null
null
null
zookeeper-docs/src/main/resources/markdown/zookeeperQuotas.md
bingliran/test-zookeeper
07c0f93aaa08d3c748213b4024dfa1cfc20eae84
[ "Apache-2.0" ]
null
null
null
<!--
Copyright 2002-2004 The Apache Software Foundation

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
//-->

# ZooKeeper Quota's Guide

### A Guide to Deployment and Administration

* [Quotas](#zookeeper_quotas)
    * [Setting Quotas](#Setting+Quotas)
    * [Listing Quotas](#Listing+Quotas)
    * [Deleting Quotas](#Deleting+Quotas)

<a name="zookeeper_quotas"></a>

## Quotas

ZooKeeper has both namespace and bytes quotas. You can use the ZooKeeperMain class to set up quotas. ZooKeeper prints _WARN_ messages if users exceed the quota assigned to them. The messages are printed in the ZooKeeper log.

Notice: What the `namespace` quota means is the count quota, which limits the number of children under the path (the path itself included).

    $ bin/zkCli.sh -server host:port

The above command gives you a command-line option for using quotas.

<a name="Setting+Quotas"></a>

### Setting Quotas

- You can use `setquota` to set a quota on a ZooKeeper node. It has options for setting a quota with `-n` (for namespace/count) and `-b` (for bytes/data length).
- The ZooKeeper quota is stored in ZooKeeper itself in **/zookeeper/quota**. To prevent other people from changing the quotas, users can set the ACL for **/zookeeper/quota** so that only admins are able to read and write to it.
- If a quota doesn't exist at the specified path, `setquota` creates it; otherwise, it updates the existing quota.
- The scope of a quota is all the nodes under the specified path (the path itself included).
- To simplify quota calculation within the directory/hierarchy structure, a complete tree path (from root to leaf node) can carry only one quota. When you set a quota on a path whose parent or child node already has a quota, `setquota` rejects the request and reports the conflicting parent or child path; users can then adjust the allocation of quotas (delete/move-up/move-down the quota) according to the specific circumstances.
- Combined with chroot, quotas provide better isolation between different applications. For example:

```bash
# Chroot is: 192.168.0.1:2181,192.168.0.2:2181,192.168.0.3:2181/apps/app1
setquota -n 100000 /apps/app1
```

- Users cannot set quotas on paths under **/zookeeper/quota**.
- Both soft and hard quotas are supported. A soft quota only logs a warning when the quota is exceeded, while a hard quota also throws a `QuotaExceededException`. When soft and hard quotas are set on the same path, the hard quota takes priority.

<a name="Listing+Quotas"></a>

### Listing Quotas

You can use _listquota_ to list the quota on a ZooKeeper node.

<a name="Deleting+Quotas"></a>

### Deleting Quotas

You can use _delquota_ to delete the quota on a ZooKeeper node.
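Not part of the original guide: a hedged Python sketch showing how the documented `setquota`, `listquota`, and `delquota` commands can be scripted through `zkCli.sh`. It assumes your ZooKeeper build accepts a single command as trailing arguments to `zkCli.sh` (recent releases do); the paths and quota values shown are illustrative.

```python
import subprocess

ZKCLI = "bin/zkCli.sh"     # adjust to your installation path
SERVER = "localhost:2181"  # host:port of a ZooKeeper server

def zk(*command: str) -> str:
    """Run one zkCli command non-interactively and return its stdout."""
    result = subprocess.run(
        [ZKCLI, "-server", SERVER, *command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Set a count (namespace) quota of 100000 children under /apps/app1.
zk("setquota", "-n", "100000", "/apps/app1")

# Inspect the quota, then remove it again.
print(zk("listquota", "/apps/app1"))
zk("delquota", "/apps/app1")
```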
36.505618
119
0.751924
eng_Latn
0.997144
7a9a7f7eb495074eb9a2e2702306b47709c6f5bb
3,581
md
Markdown
web/README.md
nickjalbert/agentos
68f65a673b1693b851a78e91996b37effa2459e0
[ "Apache-2.0" ]
1
2021-10-04T19:32:31.000Z
2021-10-04T19:32:31.000Z
web/README.md
andyk/agentos
2f5fc5443bfae586c3a9fc7c5bffcca860a0d878
[ "Apache-2.0" ]
2
2021-02-19T20:16:26.000Z
2022-01-03T15:06:50.000Z
web/README.md
andyk/agentos
2f5fc5443bfae586c3a9fc7c5bffcca860a0d878
[ "Apache-2.0" ]
null
null
null
# AgentOS Web

A prototype of the AgentOS web service. Related [discussion](https://github.com/agentos-project/agentos/discussions/139).

Designed to be run in tandem with the AgentOS code in the `nj_leaderboard` branch of this [AgentOS fork](https://github.com/nickjalbert/agentos/tree/nj_leaderboard).

## Set up a local version

Built with Python 3.9 and Postgres 12.8.

```bash
git clone git@github.com:nickjalbert/aos_web.git
cd aos_web
virtualenv -p /usr/bin/python3.9 env
source env/bin/activate
pip install -r requirements.txt

# Create postgres database aos_web with aos_web_user with pwd aabbccdd:
sudo service postgresql start
sudo -u postgres psql
    create database aos_web;
    create user aos_web_user with encrypted password 'aabbccdd';
    alter user aos_web_user createdb;
    grant all privileges on database aos_web to aos_web_user;

./manage.py migrate
./manage.py runserver
# navigate to http://localhost:8000
```

## To start from scratch with the DB

Warning: this deletes all data in your DB:

* Remove all files under `registry/migrations/` except for `registry/migrations/__init__.py`
* Run `./manage.py makemigrations`
* Recreate your postgres tables:
    * sudo -u postgres psql
    * drop database aos_web; drop user aos_web_user; create database aos_web; create user aos_web_user with encrypted password 'aabbccdd'; grant all privileges on database aos_web to aos_web_user;
* ./manage.py migrate

## Notes

```bash
./manage.py import_registry https://raw.githubusercontent.com/nickjalbert/agentos/nj_leaderboard/registry.yaml
```

## Installation and Setup Info

Raw notes from installation and setup:

```bash
# Create virtual env
virtualenv -p /usr/bin/python3.9 env
source env/bin/activate

# Setup Django
pip install Django
django-admin startproject aos_web
cd aos_web/
./manage.py startapp registry
# Follow tutorial:
# https://docs.djangoproject.com/en/3.2/intro/tutorial01/

# Setup postgres
sudo apt install postgresql postgresql-contrib
sudo passwd postgres # XXXXXXXX
sudo service postgresql start
sudo -u postgres psql
postgres=# create database aos_web;
CREATE DATABASE
postgres=# create user aos_web_user with encrypted password 'XXXXXXXX';
CREATE ROLE
postgres=# grant all privileges on database aos_web to aos_web_user;

# added models and updated settings
./manage.py makemigrations
./manage.py migrate
python manage.py createsuperuser

# Heroku deployment
# https://medium.com/geekculture/how-to-deploy-a-django-app-on-heroku-4d696b458272
pip3 install gunicorn dj-database-url whitenoise psycopg2-binary

# Setup heroku CLI
curl https://cli-assets.heroku.com/install-ubuntu.sh | sh

# Bunch of settings.py updates from the medium article
heroku create aos-web
git push heroku main
heroku addons:create heroku-postgresql:hobby-dev --app aos-web
heroku run python manage.py migrate
heroku run python manage.py createsuperuser
heroku open
# set IS_DEPLOY=True on Heroku config vars dashboard (under settings)
```

## Deploy to Heroku with web/ in the monorepo

```bash
# Add heroku remote if it's missing
heroku git:remote -a aos-web

# Push everything under web/ on current branch to heroku
git subtree push --prefix web heroku master

# If heroku complains about non-fast-forward, try this sorcery
git subtree split --prefix web -b heroku-deploy
git push -f heroku heroku-deploy:master
git checkout master
git branch -D heroku-deploy

# COMPLETELY reset heroku database
heroku pg:reset

# Run Migrations
heroku run python manage.py migrate

# Create admin/12345678
heroku run python manage.py create_default_admin

# Tail logs
heroku logs --tail
```
28.648
196
0.789444
eng_Latn
0.482887
7a9b797cd13c28f14cf926cf6643d495056632e2
2,036
md
Markdown
README.md
getsolus/qol-assist
5e958356927ce83dfee875f490c171a6c806efd5
[ "Apache-2.0" ]
7
2018-10-27T05:02:42.000Z
2020-10-22T04:06:32.000Z
README.md
getsolus/qol-assist
5e958356927ce83dfee875f490c171a6c806efd5
[ "Apache-2.0" ]
1
2021-02-02T20:30:14.000Z
2021-02-02T20:30:14.000Z
README.md
getsolus/qol-assist
5e958356927ce83dfee875f490c171a6c806efd5
[ "Apache-2.0" ]
3
2020-09-05T21:36:01.000Z
2021-02-03T10:17:58.000Z
# qol-assist

[![License](https://img.shields.io/badge/License-Apache%202.0-lightgrey.svg)](https://www.apache.org/licenses/LICENSE-2.0.html) ![#solus-dev on Freenode](https://img.shields.io/badge/freenode-%23solus--dev-28C)

`qol-assist` is a quality of life assistant for rolling release Linux distributions.

During the lifetime of a rolling release Linux distribution, new problems occur that are often complex to deal with. An example would be udev rules (`setfacl`) requiring the addition of new UNIX user groups, along with the requirement to automatically migrate active users to those groups. Traditionally, post-install packaging scripts have no knowledge of users, so `qol-assist` bridges that gap by being a central location for rolling QoL operations, continuing to provide a solid user experience while still being able to make deep changes to the OS.

`qol-assist` is a [Solus project](https://getsol.us/).

![Solus logo](https://build.getsol.us/logo.png)

## Building

Install the dependencies:

- go (>=1.15)

Then compile:

```
$ make
```

If desired, you may set the configuration and tracking directories at compile time:

```
$ make PREFIX=/usr BINDIR=/usr/bin MANDIR=/usr/man SYSDIR=/etc/dir USRDIR=/usr/dir TRACKDIR=/var/lib/dir SYSTEMDUNITDIR=/etc/systemd/system
```

## Installation

```
# make install PREFIX=/usr
```

## Running

```
$ qol-assist list-users all
# qol-assist migrate
# qol-assist trigger
```

## License

Copyright 2020-2021 Solus Project <[email protected]>

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at <https://www.apache.org/licenses/LICENSE-2.0>

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
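Not part of the original README: a small Python sketch of the kind of check qol-assist automates. qol-assist itself is written in Go; this illustrative stand-in only reports which regular users are missing from a group that, for example, a new udev rule might require. The group name and UID cutoff are assumptions.

```python
import grp
import pwd

REQUIRED_GROUP = "plugdev"  # hypothetical group a new udev rule might need
UID_MIN = 1000              # assume UIDs >= 1000 are regular human users

def users_missing_group(group_name: str) -> list:
    """Return login names of regular users not yet in group_name."""
    group = grp.getgrnam(group_name)
    members = set(group.gr_mem)
    missing = []
    for user in pwd.getpwall():
        if user.pw_uid < UID_MIN:
            continue  # skip system accounts
        if user.pw_name in members or user.pw_gid == group.gr_gid:
            continue  # already a supplementary or primary member
        missing.append(user.pw_name)
    return missing

# A real migration tool would now run something like `usermod -aG <group> <user>`;
# this sketch only reports the gap.
print(users_missing_group(REQUIRED_GROUP))
```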
31.8125
139
0.761788
eng_Latn
0.966842
7a9ba0f6712d6b454327010d89219860d3543efe
1,505
md
Markdown
README.md
noahslusher/README.md-generator
94af729110b50fd7e6cd48755ec3ef47f650800e
[ "MIT" ]
null
null
null
README.md
noahslusher/README.md-generator
94af729110b50fd7e6cd48755ec3ef47f650800e
[ "MIT" ]
null
null
null
README.md
noahslusher/README.md-generator
94af729110b50fd7e6cd48755ec3ef47f650800e
[ "MIT" ]
null
null
null
# Readme.md Generator

![License](https://img.shields.io/badge/license-mit-blue.svg)

## Description

I created this application to better understand Node.js. This application helped me utilize my knowledge of JavaScript to create a command-line application. This project also helped me understand node package manager and allowed me to become familiar with Inquirer. It allows the user to create a README file from prompts that are asked from the command line.

## Live URL

https://noahslusher.github.io/README.md-generator/

## Table of Contents

* [Installation](#Installation)
* [Usage](#Usage)
* [Credits](#Credits)
* [Licenses](#Licenses)
* [Badge](#Badge)
* [Features](#Features)
* [Contribution](#Contribution)
* [Tests](#Tests)
* [Questions](#Questions)

## Installation

User must have Node.js installed; the application uses the Inquirer dependency.

## Usage

Navigate to the repo directory on the command line and enter "node index.js".

## License

Copyright 2022 [noahslusher]

This product is licensed under the MIT license.

## Badge

None

## Features

None

## Contribution

If you would like to contribute, please add the repo to your GitHub account and create a pull request.

## Tests

npm test

## Questions

Profile Link: https://github.com/noahslusher

For any questions or concerns, please contact [email protected].

[Readme Screen Record.mov.zip](https://github.com/noahslusher/README.md-generator/files/8358811/Readme.Screen.Record.mov.zip)
30.1
359
0.74485
eng_Latn
0.944764