hexsha | size | ext | lang | max_stars_repo_path | max_stars_repo_name | max_stars_repo_head_hexsha | max_stars_repo_licenses | max_stars_count | max_stars_repo_stars_event_min_datetime | max_stars_repo_stars_event_max_datetime | max_issues_repo_path | max_issues_repo_name | max_issues_repo_head_hexsha | max_issues_repo_licenses | max_issues_count | max_issues_repo_issues_event_min_datetime | max_issues_repo_issues_event_max_datetime | max_forks_repo_path | max_forks_repo_name | max_forks_repo_head_hexsha | max_forks_repo_licenses | max_forks_count | max_forks_repo_forks_event_min_datetime | max_forks_repo_forks_event_max_datetime | content | avg_line_length | max_line_length | alphanum_fraction | lid | lid_prob
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
stringlengths 40-40 | int64 5-1.04M | stringclasses 6 values | stringclasses 1 value | stringlengths 3-344 | stringlengths 5-125 | stringlengths 40-78 | sequencelengths 1-11 | int64 1-368k ⌀ | stringlengths 24-24 ⌀ | stringlengths 24-24 ⌀ | stringlengths 3-344 | stringlengths 5-125 | stringlengths 40-78 | sequencelengths 1-11 | int64 1-116k ⌀ | stringlengths 24-24 ⌀ | stringlengths 24-24 ⌀ | stringlengths 3-344 | stringlengths 5-125 | stringlengths 40-78 | sequencelengths 1-11 | int64 1-105k ⌀ | stringlengths 24-24 ⌀ | stringlengths 24-24 ⌀ | stringlengths 5-1.04M | float64 1.14-851k | int64 1-1.03M | float64 0-1 | stringclasses 191 values | float64 0.01-1
9c2de8e49678920bbf58a85408e5c270ca44c3cb | 1,091 | md | Markdown | CONTRIBUTING.md | codetriage-readme-bot/Helpdesk | 5c5271fc4bebee59878ffd48ed82d3f4b0e94b66 | [
"MIT"
] | 15 | 2017-10-17T19:22:02.000Z | 2021-03-30T15:26:52.000Z | CONTRIBUTING.md | codetriage-readme-bot/Helpdesk | 5c5271fc4bebee59878ffd48ed82d3f4b0e94b66 | [
"MIT"
] | 40 | 2017-10-16T13:31:36.000Z | 2017-12-22T20:07:59.000Z | CONTRIBUTING.md | codetriage-readme-bot/Helpdesk | 5c5271fc4bebee59878ffd48ed82d3f4b0e94b66 | [
"MIT"
] | 15 | 2017-10-17T20:07:31.000Z | 2020-02-20T10:11:21.000Z | # Contributing to Helpdesk
As part of Hacktoberfest, we're labelling issues with "Hacktoberfest" and "Help Wanted" to make the low-hanging fruit easy to spot.
Feel free to submit a pull-request, with the posted issues as pointers to find areas that need work. If you find something that isn't an issue yet, please feel free to post it as an issue on GitHub.
To make sure that you aren't working on an issue that someone else is already coding for, add a comment to the issue to let everyone know that you've started work on it.
You can try to get the application running on your computer in one of two ways:
**Method 1.** Install Ruby and MongoDB - it runs natively on your computer. Install the bundler gem with "gem install bundler" and, in the project directory, run "bundle install". Then, execute "runme.bat".
**Method 2.** Install Oracle VirtualBox 5.1 and Vagrant - it sets up a virtual machine. Then, from the project directory, run the command "vagrant up"
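For convenience, the commands from both methods are collected below (a sketch only; it assumes the repository is already cloned and the prerequisites above are installed):
```sh
# Method 1: run natively (Ruby + MongoDB installed)
gem install bundler
bundle install        # run inside the project directory
runme.bat             # starts the server (Windows)

# Method 2: run inside a VM (VirtualBox 5.1 + Vagrant installed)
vagrant up            # run from the project directory
```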
This should get the server running and you can access the application from your web browser at: http://localhost:8000/
| 68.1875 | 205 | 0.771769 | eng_Latn | 0.999278 |
9c2deffea43d121b7820ce974f567fc32d013787 | 196 | md | Markdown | source/_assessments/91654.md | RuapehuCollege/Website | 2cf4122115ea0a3bcfc0059f3ad17f23df79729b | [
"MIT"
] | null | null | null | source/_assessments/91654.md | RuapehuCollege/Website | 2cf4122115ea0a3bcfc0059f3ad17f23df79729b | [
"MIT"
] | 1 | 2020-12-04T17:56:25.000Z | 2020-12-04T17:56:25.000Z | source/_assessments/91654.md | RuapehuCollege/Website | 2cf4122115ea0a3bcfc0059f3ad17f23df79729b | [
"MIT"
] | 1 | 2019-11-13T11:20:34.000Z | 2019-11-13T11:20:34.000Z | ---
title: "91654"
description: "Waihanga tuhinga whai take i te reo Maori o te ao whanui"
level: "3"
assessment: "Internal"
credits: "6"
pdf: <pdf>
courses:
- "Te Reo Maori Level 3"
---
| 16.333333 | 71 | 0.647959 | mri_Latn | 0.790923 |
9c2e9fbd0f90b675a65424ed43bc30a0365c6293 | 13,570 | md | Markdown | _posts/2021-3-29-Block.md | KGDeveloper/kgdeveloper.github.io | 67950862f1b928063c4143706a61519c9d4e48ed | [
"MIT"
] | 1 | 2020-10-16T07:29:12.000Z | 2020-10-16T07:29:12.000Z | _posts/2021-3-29-Block.md | KGDeveloper/kgdeveloper.github.io | 67950862f1b928063c4143706a61519c9d4e48ed | [
"MIT"
] | null | null | null | _posts/2021-3-29-Block.md | KGDeveloper/kgdeveloper.github.io | 67950862f1b928063c4143706a61519c9d4e48ed | [
"MIT"
] | null | null | null | ---
layout: post
title: Block
subtitle: Block
date: 2021-03-29
author: KG丿夏沫
header-img: img/post-bg-ios9-web.jpg
catalog: true
tags:
- iOS
- OC
- 笔记
- Block
---
# Block
<img src="https://raw.githubusercontent.com/KGDeveloper/KGImg/master/img/20210329001.png?token=AHPRJRCCMP5CVGPA62676PDAMFWRS" alt="Block图解"/>
### Block definition and syntax
In iOS development with Objective-C we talk about blocks all the time; the Swift counterpart is the closure. This post focuses on exploring blocks, so let's start with two questions: what is a block, and why use one?
>First of all, a block is an Objective-C object: internally it also has an isa pointer. A block is an OC object that wraps a function implementation together with the function's context.
>Blocks are used so that the call to a function and its implementation can be written in one place.
Block declaration: ```returnType(^blockName)(parameterList){ block body }```
Block invocation: ```blockName(arguments)```
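For example, a minimal sketch of declaring, assigning, and invoking a block (the names `sum`, `a`, and `b` are purely illustrative):
```
int (^sum)(int, int) = ^int(int a, int b) {
    return a + b;
};
int result = sum(1, 2);   // result == 3
NSLog(@"%d", result);
```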
### Block categories
Blocks are used very widely at runtime, for example in array enumeration ```enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {}```, and you will also find them in many excellent third-party libraries such as Masonry. So let's look at the usage scenarios and at which type a block has in each scenario.
On iOS a block can be one of six kinds:
>_NSConcreteStackBlock: stack block
>
>_NSConcreteMallocBlock: heap block
>
>_NSConcreteGlobalBlock: global block
>
>_NSConcreteAutoBlock: under GC, when the captured object is qualified with ```__weak,__block``` and the block is copied from the stack to the heap, the block is marked with this mode
>
>_NSConcreteFinalizingBlock: under GC, when the block is copied and it has ctors & dtors, it is converted to this mode
>
>_NSConcreteWeakBlockVariable: otherwise (the opposite of ```_NSConcreteFinalizingBlock```) it is converted to this mode
### Block structure
A block, being an Objective-C object, is a struct with the following layout:
```
struct Block_layout {
void *isa;
volatile int32_t flags;
int32_t reserved;
BlockInvokeFunction invoke;
struct Block_descriptor_1 *descriptor;
};
```
Key points:
>1. The struct above shows why we say a block is also an OC object: the block holds an isa pointer internally.
>2. The ```volatile``` qualifier used here is very common in system structs. It tells the compiler not to optimize the qualified field, including its storage. ```volatile``` also guarantees visibility when different threads operate on the field: once one thread modifies the value, the new value is immediately visible to the other threads.
>3. GlobalBlock: lives in the global data section; the block uses no external variables, or only static and global variables.
>4. MallocBlock: lives on the heap; the block uses local variables or OC properties and is assigned to a variable with strong or copy semantics.
>5. StackBlock: lives on the stack; like MallocBlock it can use local variables or OC properties internally, but it is not assigned to a strong or copy variable (see the example below).
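A small sketch of how the common kinds show up at runtime (ARC is assumed; the printed names are the runtime's private class names for the types above):
```
void (^noCapture)(void) = ^{ NSLog(@"no external variables"); };
NSLog(@"%@", [noCapture class]);   // __NSGlobalBlock__ - nothing is captured

int a = 10;
void (^captures)(void) = ^{ NSLog(@"%d", a); };
NSLog(@"%@", [captures class]);    // __NSMallocBlock__ - captures a local and is assigned to a strong variable
// A literal that captures `a` but is never assigned or copied starts out as a __NSStackBlock__.
```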
### Exploring the Block source code
First download the [block source code](https://opensource.apple.com/source/libclosure/libclosure-78/), open the project, create a ```TARGET```, open the ```main.m``` file, and declare and invoke a simple block in it:
```
int main() {
void(^block0)(void) = ^{
NSLog(@"==================\n");
};
block0();
}
```
Then set a breakpoint at the block declaration, enable assembly mode, and run the program. When the breakpoint is hit, click ```step over``` and you will find that execution enters the ```_Block_copy``` function. Open the source and find the implementation of ```_Block_copy```:
```
// Copy, or bump refcount, of a block. If really copying, call the copy helper if present.
// 拷贝 block,
// 如果原来就在堆上,就将引用计数加 1;
// 如果原来在栈上,会拷贝到堆上,引用计数初始化为 1,并且会调用 copy helper 方法(如果存在的话);
// 如果 block 在全局区,不用加引用计数,也不用拷贝,直接返回 block 本身
// 参数 arg 就是 Block_layout 对象,
// 返回值是拷贝后的 block 的地址
void *_Block_copy(const void *arg) {
struct Block_layout *aBlock;
// 如果 arg 为 NULL,直接返回 NULL
if (!arg) return NULL;
// The following would be better done as a switch statement
// 强转为 Block_layout 类型
aBlock = (struct Block_layout *)arg;
// 获取Block签名
const char *signature = _Block_descriptor_3(aBlock)->signature;
// 如果现在已经在堆上
if (aBlock->flags & BLOCK_NEEDS_FREE) {
// latches on high
// 就只将引用计数加 1
latching_incr_int(&aBlock->flags);
return aBlock;
}
// 如果 block 在全局区,不用加引用计数,也不用拷贝,直接返回 block 本身
else if (aBlock->flags & BLOCK_IS_GLOBAL) {
return aBlock;
}
else {
// Its a stack block. Make a copy.
// block 现在在栈上,现在需要将其拷贝到堆上
// 在堆上重新开辟一块和 aBlock 相同大小的内存
struct Block_layout *result =
(struct Block_layout *)malloc(aBlock->descriptor->size);
// 开辟失败,返回 NULL
if (!result) return NULL;
// 将 aBlock 内存上的数据全部复制新开辟的 result 上
memmove(result, aBlock, aBlock->descriptor->size); // bitcopy first
#if __has_feature(ptrauth_calls)
// Resign the invoke pointer as it uses address authentication.
result->invoke = aBlock->invoke;
#endif
// reset refcount
// 将 flags 中的 BLOCK_REFCOUNT_MASK 和 BLOCK_DEALLOCATING 部分的位全部清为 0
result->flags &= ~(BLOCK_REFCOUNT_MASK|BLOCK_DEALLOCATING); // XXX not needed
// 将 result 标记位在堆上,需要手动释放;并且引用计数初始化为 1
result->flags |= BLOCK_NEEDS_FREE | 2; // logical refcount 1
// copy 方法中会调用做拷贝成员变量的工作
_Block_call_copy_helper(result, aBlock);
// Set isa last so memory analysis tools see a fully-initialized object.
// isa 指向 _NSConcreteMallocBlock
result->isa = _NSConcreteMallocBlock;
return result;
}
}
```
This function mainly performs the following operations:
>1. Cast the argument to a ```Block_layout``` struct object.
>
>2. Call ```Block_descriptor_3``` to obtain the signature information. Inside that function it first checks whether a signature exists at all, using the block struct's ```flags``` field: ```flags``` is tested against ```BLOCK_HAS_SIGNATURE``` (1 shifted left by 30). If the result is 0 there is no signature and null is returned. Otherwise it takes the block's ```descriptor``` pointer and offsets it by the size of ```Block_descriptor_1```; it then tests ```flags``` against ```BLOCK_HAS_COPY_DISPOSE``` (1 shifted left by 25) and, if that is non-zero, offsets the pointer again by the size of ```Block_descriptor_2```. The result is the address of ```Block_descriptor_3```, which is cast and returned, so the caller can access the fields of the ```Block_descriptor_3``` struct directly.
>
>3. Check whether the block is already on the heap; if it is, just increment its reference count and return the block object.
>
>4. If the block is a global block, return the block object directly without doing anything.
>
>5. If the block is a stack block, first allocate a region on the heap with the same size as the block; if the allocation fails, return null. If it succeeds, copy all of the block's data into the newly allocated heap block and re-assign the invoke pointer. Then clear the BLOCK_REFCOUNT_MASK and BLOCK_DEALLOCATING bits in the heap block's flags, mark it as a heap block with a reference count initialized to 1, call ```_Block_call_copy_helper``` to copy the stack block's captured members onto the heap block, and finally set the heap block's isa pointer to the heap block class.
### Block implementation and variable capture
We all know that a block captures external variables, but how does it do that? What happens to a block at compile time? How does it find the external variable, and how does it find the block's implementation? Let's explore by compiling the code down to a .cpp file.
1. Use no variables or constants inside the block, then compile main.m into main.cpp with ```clang -rewrite-objc main.m -o main.cpp```. Here is the code before and after compilation:
>Before compilation:
```
#import <Foundation/Foundation.h>
int main() {
void(^block0)(void) = ^{
NSLog(@"==================\n");
};
block0();
}
```
>After compilation (with the extra generated code removed):
```
struct __main_block_impl_0 {
struct __block_impl impl;
struct __main_block_desc_0* Desc;
__main_block_impl_0(void *fp, struct __main_block_desc_0 *desc, int flags=0) {
impl.isa = &_NSConcreteStackBlock;
impl.Flags = flags;
impl.FuncPtr = fp;
Desc = desc;
}
};
// Block块
static void __main_block_func_0(struct __main_block_impl_0 *__cself) {
NSLog((NSString *)&__NSConstantStringImpl__var_folders_np_kl73wv292blcg8h8zbxlhlfh0000gn_T_main_ac9b9b_mi_0);
}
static struct __main_block_desc_0 {
size_t reserved;
size_t Block_size;
} __main_block_desc_0_DATA = { 0, sizeof(struct __main_block_impl_0)};
int main() {
// Block的申明
void(*block0)(void) = &__main_block_impl_0(__main_block_func_0, &__main_block_desc_0_DATA));
// Block的调用
(block0->FuncPtr)(block0);
}
```
Look at the main function first; it contains the block's declaration and invocation.
>The first half of the block definition is exactly the same as the pre-compilation code; the second half calls the ```__main_block_impl_0``` struct's constructor with two arguments: the block's implementation function and the block's descriptor information.
>
>The ```__main_block_func_0``` function simply prints the log message.
>
>The constructor of the ```__main_block_impl_0``` struct initializes its fields: isa points to ```_NSConcreteStackBlock``` (a stack block); the flags field stores the block's bookkeeping information, such as its deallocation flag, reference count, whether it has a copy function, whether it has a dispose function, whether garbage collection is involved, whether it is a global block, whether it has a signature, whether it is extended, and so on; and Desc stores information such as the block's size.
2. Use an external local variable inside the block, then compile main.m into main.cpp with ```clang -rewrite-objc main.m -o main.cpp```. Here is the code before and after compilation:
>Before compilation:
```
#import <Foundation/Foundation.h>
int main() {
int a = 10;
void(^block0)(void) = ^{
NSLog(@"==================%d\n",a);
};
block0();
}
```
>After compilation (with the extra generated code removed):
```
struct __block_impl {
void *isa;
int Flags;
int Reserved;
void *FuncPtr;
};
struct __main_block_impl_0 {
struct __block_impl impl;
struct __main_block_desc_0* Desc;
int a;
__main_block_impl_0(void *fp, struct __main_block_desc_0 *desc, int _a, int flags=0) : a(_a) {
impl.isa = &_NSConcreteStackBlock;
impl.Flags = flags;
impl.FuncPtr = fp;
Desc = desc;
}
};
static void __main_block_func_0(struct __main_block_impl_0 *__cself) {
int a = __cself->a; // bound by copy
NSLog((NSString *)&__NSConstantStringImpl__var_folders_np_kl73wv292blcg8h8zbxlhlfh0000gn_T_main_5f4e61_mi_0,a);
}
static struct __main_block_desc_0 {
size_t reserved;
size_t Block_size;
} __main_block_desc_0_DATA = { 0, sizeof(struct __main_block_impl_0)};
int main() {
int a = 10;
void(*block0)(void) = &__main_block_impl_0(__main_block_func_0, &__main_block_desc_0_DATA, a));
(block0->FuncPtr)(block0);
}
```
Look at the main function first; it contains the block's declaration and invocation.
>The first half of the block definition is exactly the same as the pre-compilation code; the second half calls the ```__main_block_impl_0``` constructor with three arguments: the block's implementation function, the block's descriptor information, and ```the local variable a```, whose value is passed in.
>
>The constructor of ```__main_block_impl_0``` initializes the struct as before: isa points to ```_NSConcreteStackBlock```, flags stores the block's bookkeeping information, and Desc stores the block's size. Most importantly, ```__main_block_impl_0``` now defines a member a, and the constructor assigns the value of the external variable to it.
>
>The ```__main_block_func_0``` function defines a temporary variable a, assigns it from ```__cself->a```, and then prints it. The a defined here is a brand-new local variable, not the externally defined local variable a. This is why modifying the local variable after the block is declared but before it is invoked does not change the value printed inside the block: the block captures the local variable by value, so the value was already captured when the block was created, assigned to the internal member a, and then assigned again to the block-local a inside the implementation. No matter how the outside variable changes, the captured value is unaffected (see the example below).
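A minimal sketch of this capture-by-value behavior (values are illustrative):
```
int a = 10;
void (^block)(void) = ^{ NSLog(@"%d", a); };   // the value 10 is captured here

a = 20;      // changing a afterwards does not touch the captured copy
block();     // still prints 10
```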
3. Use an external global variable inside the block, then compile main.m into main.cpp with ```clang -rewrite-objc main.m -o main.cpp```. Here is the code before and after compilation:
>Before compilation:
```
#import <Foundation/Foundation.h>
int a = 10;
int main() {
void(^block0)(void) = ^{
NSLog(@"==================%d\n",a);
};
block0();
}
```
>After compilation (with the extra generated code removed):
```
struct __block_impl {
void *isa;
int Flags;
int Reserved;
void *FuncPtr;
};
int a = 10;
struct __main_block_impl_0 {
struct __block_impl impl;
struct __main_block_desc_0* Desc;
__main_block_impl_0(void *fp, struct __main_block_desc_0 *desc, int flags=0) {
impl.isa = &_NSConcreteStackBlock;
impl.Flags = flags;
impl.FuncPtr = fp;
Desc = desc;
}
};
static void __main_block_func_0(struct __main_block_impl_0 *__cself) {
NSLog((NSString *)&__NSConstantStringImpl__var_folders_np_kl73wv292blcg8h8zbxlhlfh0000gn_T_main_aac8d9_mi_0,a);
}
static struct __main_block_desc_0 {
size_t reserved;
size_t Block_size;
} __main_block_desc_0_DATA = { 0, sizeof(struct __main_block_impl_0)};
int main() {
void(*block0)(void) = (&__main_block_impl_0(__main_block_func_0, &__main_block_desc_0_DATA));
(block0->FuncPtr)(block0);
}
```
Look at the main function first; it contains the block's declaration and invocation.
>The first half of the block definition is exactly the same as the pre-compilation code; the second half calls the ```__main_block_impl_0``` constructor with two arguments: the block's implementation function and the block's descriptor information.
>
>The constructor of ```__main_block_impl_0``` initializes the struct as before: isa points to ```_NSConcreteStackBlock```, flags stores the block's bookkeeping information, and Desc stores information such as the block's size.
>
>The ```__main_block_func_0``` function uses the global variable directly; no value is captured. ```__main_block_impl_0``` declares no member for it and never touches it, the block simply reads the global. That is why modifying a global variable after the block is declared does affect the value seen inside the block.
4. Use an external local variable qualified with ```__block``` inside the block, then compile main.m into main.cpp with ```clang -rewrite-objc main.m -o main.cpp```. Here is the code before and after compilation:
>Before compilation:
```
#import <Foundation/Foundation.h>
int main() {
__block int a = 10;
void(^block0)(void) = ^{
NSLog(@"==================%d\n",a);
};
block0();
}
```
>After compilation (with the extra generated code removed):
```
struct __block_impl {
void *isa;
int Flags;
int Reserved;
void *FuncPtr;
};
struct __Block_byref_a_0 {
void *__isa;
__Block_byref_a_0 *__forwarding;
int __flags;
int __size;
int a;
};
struct __main_block_impl_0 {
struct __block_impl impl;
struct __main_block_desc_0* Desc;
__Block_byref_a_0 *a; // by ref
__main_block_impl_0(void *fp, struct __main_block_desc_0 *desc, __Block_byref_a_0 *_a, int flags=0) : a(_a->__forwarding) {
impl.isa = &_NSConcreteStackBlock;
impl.Flags = flags;
impl.FuncPtr = fp;
Desc = desc;
}
};
static void __main_block_func_0(struct __main_block_impl_0 *__cself) {
__Block_byref_a_0 *a = __cself->a; // bound by ref
NSLog((NSString *)&__NSConstantStringImpl__var_folders_np_kl73wv292blcg8h8zbxlhlfh0000gn_T_main_379bf1_mi_0,(a->__forwarding->a));
}
static void __main_block_copy_0(struct __main_block_impl_0*dst, struct __main_block_impl_0*src) {_Block_object_assign((void*)&dst->a, (void*)src->a, 8/*BLOCK_FIELD_IS_BYREF*/);}
static void __main_block_dispose_0(struct __main_block_impl_0*src) {_Block_object_dispose((void*)src->a, 8/*BLOCK_FIELD_IS_BYREF*/);}
static struct __main_block_desc_0 {
size_t reserved;
size_t Block_size;
void (*copy)(struct __main_block_impl_0*, struct __main_block_impl_0*);
void (*dispose)(struct __main_block_impl_0*);
} __main_block_desc_0_DATA = { 0, sizeof(struct __main_block_impl_0), __main_block_copy_0, __main_block_dispose_0};
int main() {
__attribute__((__blocks__(byref))) __Block_byref_a_0 a = {(void*)0,(__Block_byref_a_0 *)&a, 0, sizeof(__Block_byref_a_0), 10};
void(*block0)(void) = (&__main_block_impl_0(__main_block_func_0, &__main_block_desc_0_DATA, (__Block_byref_a_0 *)&a, 570425344));
(block0->FuncPtr)(block0);
}
```
Look at the main function first; it contains the block's declaration and invocation.
>After compilation, ```__block int a = 10;``` becomes a ```__Block_byref_a_0``` struct object; through that struct's constructor both the ```address``` of a and its value are captured.
>The first half of the block definition is exactly the same as the pre-compilation code; the second half calls the ```__main_block_impl_0``` constructor with four arguments: the block's implementation function, the block's descriptor information, the address of the ```__Block_byref_a_0``` object, and the flags.
>
>The constructor of ```__main_block_impl_0``` initializes the struct as before: isa points to ```_NSConcreteStackBlock```, flags stores the block's bookkeeping information, and Desc stores information such as the block's size.
>
>In ```__main_block_desc_0_DATA``` the captured value gets copy/dispose handling, and the copy is done by address, which effectively increments a's reference count so it is not released too early.
>
>The ```__main_block_func_0``` function obtains the value through a ```__Block_byref_a_0``` struct pointer and then accesses and operates on the local variable a stored inside that struct.
### Heap blocks
Although in the compiled source above every block's isa points to the stack block class, at runtime, when we inspect blocks with breakpoints, we find that as long as a block has not been copied and does not capture a variable qualified with ```__block```, it shows up as a ```_NSConcreteGlobalBlock``` global block; a copied block, or one that captures a ```__block```-qualified variable, becomes a ```_NSConcreteMallocBlock``` heap block. This is shown in the figure below:
<img src="https://raw.githubusercontent.com/KGDeveloper/KGImg/master/img/20210402001.png?token=AHPRJRA3PW5M4ZOHEGCL2KDAM3S7A" alt="堆block"/> | 31.705607 | 459 | 0.73832 | yue_Hant | 0.352917 |
9c2ef5db7244d26e1d269e4e0921159339df22b7 | 1,284 | md | Markdown | includes/storage-import-export-ship-drives.md | Microsoft/azure-docs.cs-cz | 1e2621851bc583267d783b184f52dc4b853a058c | [
"CC-BY-4.0",
"MIT"
] | 6 | 2017-08-28T07:43:21.000Z | 2022-01-04T10:32:24.000Z | includes/storage-import-export-ship-drives.md | MicrosoftDocs/azure-docs.cs-cz | 1e2621851bc583267d783b184f52dc4b853a058c | [
"CC-BY-4.0",
"MIT"
] | 428 | 2018-08-23T21:35:37.000Z | 2021-03-03T10:46:43.000Z | includes/storage-import-export-ship-drives.md | Microsoft/azure-docs.cs-cz | 1e2621851bc583267d783b184f52dc4b853a058c | [
"CC-BY-4.0",
"MIT"
] | 16 | 2018-03-03T16:52:06.000Z | 2021-12-22T09:52:44.000Z | ---
title: zahrnout soubor
description: zahrnout soubor
author: alkohli
services: storage
ms.service: storage
ms.topic: include
ms.date: 04/08/2019
ms.author: alkohli
ms.custom: include file
ms.openlocfilehash: 7ecc36218df23d81c4646612b5474a1465f428eb
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 03/29/2021
ms.locfileid: "80282473"
---
FedEx, UPS, or DHL can be used to ship the package to the Azure datacenter. If you want to use a carrier other than FedEx/DHL, contact the Azure Data Box operations team at `[email protected]`
* Provide a valid FedEx, UPS, or DHL carrier account number that Microsoft will use to ship the drives back to you.
  * A FedEx, UPS, or DHL account number is required for shipping drives back from US and Europe locations.
  * A DHL account number is preferred for shipping drives back from Asia and Australia locations.
* If you do not have an account number, create a [FedEx](https://www.fedex.com/us/oadr/) or [DHL](http://www.dhl.com/) carrier account.
* When shipping packages, you must comply with the [Microsoft Azure terms of service](https://azure.microsoft.com/support/legal/services-terms/).
* Pack your disks properly to avoid potential damage and processing delays.
| 49.384615 | 198 | 0.794393 | ces_Latn | 0.997192 |
9c2f3161cb2574c3792a50be077f1359279b58c0 | 208 | md | Markdown | _project/living-room-tour-on-twopeasandtheirpodcom.md | rumnamanya/rumnamanya.github.io | 2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9 | [
"MIT"
] | null | null | null | _project/living-room-tour-on-twopeasandtheirpodcom.md | rumnamanya/rumnamanya.github.io | 2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9 | [
"MIT"
] | null | null | null | _project/living-room-tour-on-twopeasandtheirpodcom.md | rumnamanya/rumnamanya.github.io | 2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9 | [
"MIT"
] | null | null | null | ---
layout: project_single
title: "Living Room Tour on twopeasandtheirpod.com"
slug: "living-room-tour-on-twopeasandtheirpodcom"
parent: "living-room-furniture"
---
Living Room Tour on twopeasandtheirpod.com | 29.714286 | 52 | 0.793269 | eng_Latn | 0.805059 |
9c2f47f57282f543ecd19c9907ac0b6f6658da97 | 303 | md | Markdown | submission/Readme.md | realdealneil/realdealrobotics_vision | e4ef37452ac377c2d56366bf2525ee8549399108 | [
"BSD-3-Clause"
] | null | null | null | submission/Readme.md | realdealneil/realdealrobotics_vision | e4ef37452ac377c2d56366bf2525ee8549399108 | [
"BSD-3-Clause"
] | null | null | null | submission/Readme.md | realdealneil/realdealrobotics_vision | e4ef37452ac377c2d56366bf2525ee8549399108 | [
"BSD-3-Clause"
] | null | null | null | Run Real Deal Robotics' submission as follows:
python3 rdr_generate_submission.py -p <pathToImages>
where <pathToImages> is replaced with the actual path to the folder containing JPG test images.
This should deconflict with whatever version of generate_submission.py
script is run by AlphaPilot.
| 33.666667 | 97 | 0.811881 | eng_Latn | 0.997066 |
9c305194ed904dfdb730b3f1746f9e275b66f7fb | 1,538 | md | Markdown | index.md | carterburn/marinecoders.github.io | 48502e68338623da0cf21d95769516d59dead7fd | [
"Apache-2.0"
] | 1 | 2020-12-16T15:12:53.000Z | 2020-12-16T15:12:53.000Z | index.md | devinsmiley/marinecoders.github.io | d51c10c939d35481fea2ad65318dafed0926faa2 | [
"Apache-2.0"
] | null | null | null | index.md | devinsmiley/marinecoders.github.io | d51c10c939d35481fea2ad65318dafed0926faa2 | [
"Apache-2.0"
] | null | null | null | ---
layout: splash
feature_row:
- title: "Projects"
excerpt: "Learn more about our projects."
url: "/projects/"
btn_label: "Go to Projects"
btn_class: "btn--inverse"
- title: "Learn to Code"
excerpt: "Learn more about coding, DevSecOps, and enjoy our list of free courses."
url: "/learn/"
btn_label: "Start Learning"
btn_class: "btn--inverse"
- title: "DoD DevSecOps"
excerpt: "Learn more about the Department of Defense's software goals and enabling platforms on the Chief Software Officer's website."
url: "https://software.af.mil"
btn_label: "CSO Website"
btn_class: "btn--inverse"
---
<br /><br />
{: .align-center}
{% include feature_row %}
<h3 class="archive__subtitle">{{ site.data.ui-text[site.locale].recent_posts | default: "Recent Posts" }}</h3>
{% if paginator %}
{% assign posts = paginator.posts %}
{% else %}
{% assign posts = site.posts %}
{% endif %}
{% for post in posts %}
{% include archive-single.html %}
{% endfor %}
{% include paginator.html %}
## Team Guidelines
* We build code to help Marines not our Pros/Cons or FITREPS!
* We open source as much as possible [cio.gov](https://sourcecode.cio.gov/OSS/) [code.mil](https://code.mil)
* We are responsible users of existing open source code
* We help each other
* Cybersecurity is important
## Have questions or want to join us?
Send an email to collin.chew [at] usmc.mil / andrew.hutcheon [at] usmc.mil; we would love to hear from you!
| 31.387755 | 138 | 0.680754 | eng_Latn | 0.908129 |
9c31358d477dd1f41b037b39a814e54a99148539 | 4,779 | md | Markdown | sources/talk/20190614 Western Digital launches open-source zettabyte storage initiative.md | PandaWizard/TranslateProject | 529abd5c64c3a50d6458fa2e5b58da36e900802b | [
"Apache-2.0"
] | 22 | 2019-04-03T06:30:29.000Z | 2019-11-07T08:57:16.000Z | sources/talk/20190614 Western Digital launches open-source zettabyte storage initiative.md | PandaWizard/TranslateProject | 529abd5c64c3a50d6458fa2e5b58da36e900802b | [
"Apache-2.0"
] | 1 | 2019-03-02T03:16:12.000Z | 2019-03-02T03:16:12.000Z | sources/talk/20190614 Western Digital launches open-source zettabyte storage initiative.md | PandaWizard/TranslateProject | 529abd5c64c3a50d6458fa2e5b58da36e900802b | [
"Apache-2.0"
] | 6 | 2016-09-22T02:30:11.000Z | 2017-07-28T00:36:36.000Z | [#]: collector: (lujun9972)
[#]: translator: ( )
[#]: reviewer: ( )
[#]: publisher: ( )
[#]: url: ( )
[#]: subject: (Western Digital launches open-source zettabyte storage initiative)
[#]: via: (https://www.networkworld.com/article/3402318/western-digital-launches-open-source-zettabyte-storage-initiative.html)
[#]: author: (Andy Patrizio https://www.networkworld.com/author/Andy-Patrizio/)
Western Digital launches open-source zettabyte storage initiative
======
Western Digital's Zoned Storage initiative leverages new technology to create more efficient zettabyte-scale data storage for data centers by improving how data is organized when it is stored.
![monsitj / Getty Images][1]
Western Digital has announced a project called the Zoned Storage initiative that leverages new technology to create more efficient zettabyte-scale data storage for data centers by improving how data is organized when it is stored.
As part of this, the company also launched a [developer site][2] that will host open-source, standards-based tools and other resources.
The Zoned Storage architecture is designed for Western Digital hardware and its shingled magnetic recording (SMR) HDDs, which hold up to 15TB of data, as well as the emerging zoned namespaces (ZNS) standard for NVMe SSDs, designed to deliver better endurance and predictability.
**[ Now read:[What is quantum computing (and why enterprises should care)][3] ]**
This initiative is not being retrofitted for non-SMR drives or non-NVMe SSDs. Western Digital estimates that by 2023, half of all its HDD shipments are expected to be SMR. And that will be needed because IDC predicts data will be generated at a rate of 103 zettabytes a year by 2023.
With this project Western Digital is targeting cloud and hyperscale providers and anyone building a large data center who has to manage a large amount of data, according to Eddie Ramirez, senior director of product marketing for Western Digital.
Western Digital is changing how data is written and stored from the traditional random 4K block writes to large blocks of sequential data, like Big Data workloads and video streams, which are rapidly growing in size and use in the digital age.
“We are now looking at a one-size-fits-all architecture that leaves a lot of TCO [total cost of ownership] benefits on the table if you design for a single architecture,” Ramirez said. “We are looking at workloads that don’t rely on small block randomization of data but large block sequential write in nature.”
Because drives use 4k write blocks, that leads to overprovisioning of storage, especially around SSDs. This is true of consumer and enterprise SSDs alike. My 1TB SSD drive has only 930GB available. And that loss scales. An 8TB SSD has only 6.4TB available, according to Ramirez. SSDs also have to be built with DRAM for caching of small block random writes. You need about 1GB of DRAM per 1TB of NAND to act as a buffer, according to Ramirez.
### The benefits of Zoned Storage
Zoned Storage allows for 15-20% more storage on an HDD than the traditional storage mechanism. It eliminates the overprovisioning of SSDs, so you get all the NAND flash the drive has and you need far fewer DRAM chips on an SSD. Additionally, Western Digital promises you will need up to one-eighth as much DRAM to act as a cache in future SSD drives, lowering the cost.
Ramirez also said quality of service will improve, not necessarily that peak performance is better, but it will manage latency from outliers better.
Western Digital has not disclosed what, if any, pricing is associated with the project. It plans to work with the open-source community, customers, and industry players to help accelerate application development around Zoned Storage through its website.
Join the Network World communities on [Facebook][4] and [LinkedIn][5] to comment on topics that are top of mind.
--------------------------------------------------------------------------------
via: https://www.networkworld.com/article/3402318/western-digital-launches-open-source-zettabyte-storage-initiative.html
作者:[Andy Patrizio][a]
选题:[lujun9972][b]
译者:[译者ID](https://github.com/译者ID)
校对:[校对者ID](https://github.com/校对者ID)
本文由 [LCTT](https://github.com/LCTT/TranslateProject) 原创编译,[Linux中国](https://linux.cn/) 荣誉推出
[a]: https://www.networkworld.com/author/Andy-Patrizio/
[b]: https://github.com/lujun9972
[1]: https://images.idgesg.net/images/article/2019/02/big_data_center_server_racks_storage_binary_analytics_by_monsitj_gettyimages-951389152_3x2-100787358-large.jpg
[2]: http://ZonedStorage.io
[3]: https://www.networkworld.com/article/3275367/what-s-quantum-computing-and-why-enterprises-need-to-care.html
[4]: https://www.facebook.com/NetworkWorld/
[5]: https://www.linkedin.com/company/network-world
| 78.344262 | 442 | 0.775267 | eng_Latn | 0.989349 |
9c319c7bfc41ba530af7a8c8532d47954327b9ec | 3,037 | md | Markdown | src/pages/en/warning-trips/index.md | matikin9/gtfs.org | b0bb63e73611b1fd44d2e7f9d9c61174bac8fcfd | [
"CC-BY-3.0",
"Apache-2.0",
"MIT"
] | null | null | null | src/pages/en/warning-trips/index.md | matikin9/gtfs.org | b0bb63e73611b1fd44d2e7f9d9c61174bac8fcfd | [
"CC-BY-3.0",
"Apache-2.0",
"MIT"
] | null | null | null | src/pages/en/warning-trips/index.md | matikin9/gtfs.org | b0bb63e73611b1fd44d2e7f9d9c61174bac8fcfd | [
"CC-BY-3.0",
"Apache-2.0",
"MIT"
] | null | null | null | ---
path: /warning-trips/
lang: en
---
# Category: Trips
| Warning | Description |
|-------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Block trips with inconsistent route types | A block trip implies that passengers can remain on a vehicle to transfer from one route to the next. If the ```route_type``` changes, either the block or the ```route_type``` is incorrectly defined; this must be fixed. |
| Stop headsign duplicates stop name | The ```stop_headsign``` describes the direction for trips departing from the specified stop. This prevents routing results directing you to take a route towards Central Terminal if you are departing from Central Terminal. |
| Trip headsign contains ```route_long_name``` | Since the ```trip_headsign``` may be displayed together with the ```route_long_name```, information should not be duplicated. |
| Trip headsign contains ```route_short_name``` | Since the system displays ```trip_headsign``` with the ```route_short_name``` field, this information should not be duplicated. This prevents routing results like "1 towards 1 Central Terminal". |
| Trips with incorrect stop headsigns | The following trips have an incorrect ```stop_headsign```. The correct ```stop_headsign``` for loop trips is the next stop. Trips need to have at least one stop with the correct headsign. |
| Trip has duplicate stops | Stops within a trip should be distinct. |
| Fast travel between stops | Two stop times, belonging to the same trip in the ```stop_times.txt``` file, were found where the speed required for the transit vehicle to travel between the specified stops in the specified time seems suspiciously fast. |
| Fast travel between far stops | Two stops, which were far apart and belonging to the same trip in the ```stop_times.txt``` file, were found where the speed required for the transit vehicle to travel between the specified stops in the specified time seems suspiciously fast. |
9c31ecc54a1fb2e9a9d06d713544bd19b136d1d2 | 326 | md | Markdown | README.md | Gtsz/SimpleSmoothMouseMovement | 186ba9e4f6e8401136a8456af72013722b71b62b | [
"MIT"
] | null | null | null | README.md | Gtsz/SimpleSmoothMouseMovement | 186ba9e4f6e8401136a8456af72013722b71b62b | [
"MIT"
] | null | null | null | README.md | Gtsz/SimpleSmoothMouseMovement | 186ba9e4f6e8401136a8456af72013722b71b62b | [
"MIT"
] | null | null | null | # SimpleSmoothMouseMovement
### Usage
- Double click to run
- End task in Task Manager
- No "installation"
### Warning
The program is not perfect. It causes high CPU usage while moving the mouse, which can make the system unresponsive.
Don't drag any window while this program is enabled. The drag may be recognized as a shake gesture.
| 29.636364 | 113 | 0.763804 | eng_Latn | 0.999156 |
9c327d0f07aa75c23d6cf784848aa906a2014756 | 6,517 | md | Markdown | README.md | EthanRutherford/fast-fuzzy | c233f3c95d52be95435ef39b1b322e5976153563 | [
"0BSD"
] | 46 | 2017-04-08T17:45:50.000Z | 2022-02-26T14:51:22.000Z | README.md | EthanRutherford/fast-fuzzy | c233f3c95d52be95435ef39b1b322e5976153563 | [
"0BSD"
] | 14 | 2017-04-09T12:31:12.000Z | 2022-01-21T09:25:09.000Z | README.md | EthanRutherford/fast-fuzzy | c233f3c95d52be95435ef39b1b322e5976153563 | [
"0BSD"
] | 6 | 2017-04-28T17:04:08.000Z | 2020-10-06T21:22:45.000Z | # fast-fuzzy [](https://travis-ci.com/EthanRutherford/fast-fuzzy) [](https://www.npmjs.com/package/fast-fuzzy)
Fast fuzzy-search utility
## methodology
fast-fuzzy is a tiny, lightning-quick fuzzy-searching utility.
The ranking algorithm is a modification of [levenshtein distance](https://en.wikipedia.org/wiki/Levenshtein_distance)
proposed by Peter H. Sellers ([paper](https://pdfs.semanticscholar.org/0517/aa6d420f66f74bd4b281e2ed0e2021f3d359.pdf)).
fast-fuzzy also uses the [damerau-levenshtein distance](https://en.wikipedia.org/wiki/Damerau%E2%80%93Levenshtein_distance)
by default, which, compared to normal levenshtein, punishes transpositions less.
Inputs are normalized before search.
Normalization consists of standard utf8-normalization,
optionally taking the lowercase of a string,
optionally removing non-word characters,
and optionally flattening/trimming whitespace.
Graphemes, such as conjoined emoji 👨👩👧, are treated as single characters.
Inputs are scored from `0` to `1`, where a higher score indicates a closer match.
When searching, results are returned in descending order of score.
Ties in score are broken by earliness of match (when using sellers substring match only).
Further ties are broken by favoring the candidate whose length is closest to the length of the search term.
This causes matches which are closer to exact full string matches to be effectively ranked higher.
Ties in length difference are broken by insertion order.
Lists of candidates are stored in a [trie](https://en.wikipedia.org/wiki/Trie) internally, which
avoids doing redundant work on candidates with common prefixes.
Additionally, when a subtree of the trie can be determined to have no string which could possibly
score >= the threshold, the entire subtree is skipped.
This significantly improves search times compared to a bruteforce search.
While the default options are to use Damerau and Sellers (transposition-friendly substring search),
either of these options can be opted out of if the need arises.
## exports
| name | description | signature |
| ---- | --------- | ------------ |
| `fuzzy` | fuzzy ranking algorithm; returns match strength | `(term, candidate, options?) => score` |
| `search` | for one-off searches; returns a sorted array of matches | `(term, candidates, options?) => matches` |
| `Searcher` | for searching the same set of candidates multiple times; caches the constructed trie<sup>1</sup> | `N/A` |
<sup>1</sup> it is recommended that you use a `Searcher` when searching the same set multiple times.
`search` will create a new trie every time, and while this is relatively cheap, it can have an
impact on responsiveness if you intend to update search results in real time, i.e. while typing.
### `Searcher` methods
| name | description | signature |
| ---- | --------- | ------------ |
| `constructor` | supply the options and initial list of candidates | `(candidates?, options?) => searcher` |
| `add` | add new candidates to the list | `(...candidates) => void` |
| `search` | perform a search against the instance's candidates |`(term, options?) => matches`<sup>2</sup> |
<sup>2</sup> allows overriding the `threshold`, `returnMatchData`, and `useDamerau` options
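For example, a small sketch of adding candidates after construction (the candidate strings are illustrative):

```javascript
const {Searcher} = require("fast-fuzzy");

const searcher = new Searcher(["apple", "banana"]);

// add() accepts any number of new candidates; the cached trie is reused
// on the next search rather than rebuilt from scratch.
searcher.add("cherry", "apricot");

searcher.search("chery"); // ["cherry"]
```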
## options
`Searcher` and `search` both take an options object for configuring behavior.
| option | type | description | default |
| ------ | ---- | ----------- | ------- |
| keySelector | `Function` | selects the string(s)<sup>3</sup> to search when candidates are objects | `s => s`
| threshold | `Number` | the minimum score that can be returned | `.6`
| ignoreCase | `Bool` | normalize case by calling `toLower` on input and pattern | `true`
| ignoreSymbols | `Bool` | strip non-word symbols<sup>4</sup> from input | `true`
| normalizeWhitespace | `Bool`| normalize and trim whitespace | `true`
| returnMatchData | `Bool` | return match data<sup>5</sup> | `false`
| useDamerau | `Bool` | use damerau-levenshtein distance | `true`
| useSellers | `Bool` | use the Sellers method for substring matching | `true`
<sup>3</sup> if the keySelector returns an array, the candidate will take the score of the highest scoring key.
<sup>4</sup> `` `~!@#$%^&*()-=_+{}[]\|\;':",./<>? ``
<sup>5</sup> in the form `{item, original, key, score, match: {index, length}}`.
Match index and length are in terms of the original, non-normalized string.
Also note that `match` will be `undefined` if `useSellers` is `false`.
`fuzzy` accepts a subset of these options (excluding keySelector and threshold) with the same defaults.
## examples
You can call `fuzzy` directly to get a match score for a single string
```javascript
const {fuzzy} = require("fast-fuzzy");
fuzzy("hello", "hello world"); //returns 1
fuzzy("word", "hello world"); //returns .75
//pass in custom options
fuzzy("hello world", "hello world"); //returns 1
fuzzy("hello world", "hello world", {normalizeWhitespace: false}); //returns .90909090...
```
Use `search` to search a list of strings or objects
```javascript
const {search} = require("fast-fuzzy");
search("abc", ["def", "bcd", "cde", "abc"]); //returns ["abc", "bcd"]
//pass in a keySelector to search for objects
search(
"abc",
[{name: "def"}, {name: "bcd"}, {name: "cde"}, {name: "abc"}],
{keySelector: (obj) => obj.name},
);
//returns [{name: "abc"}, {name: "bcd"}]
//pass returnMatchData to receive the matchData for each result
search("abc", ["def", "bcd", "cde", "abc"], {returnMatchData: true});
/* returns [{
item: 'abc', original: 'abc', key: 'abc', score: 1,
match: {index: 0, length: 3},
}, {
item: 'bcd', original: 'bcd', key: 'bcd', score: 0.6666666666666667,
match: {index: 0, length: 2},
}] */
```
Use `Searcher` in much the same way as `search`
```javascript
const {Searcher} = require("fast-fuzzy");
const searcher = new Searcher(["def", "bcd", "cde", "abc"]);
searcher.search("abc"); //returns ["abc", "bcd"]
//options are passed in on construction
const anotherSearcher = new Searcher(
[{name: "thing1"}, {name: "thing2"}],
{keySelector: (obj) => obj.name},
);
//some options can be overridden per call
searcher.search("abc", {returnMatchData: true});
/* returns [{
item: 'abc', original: 'abc', key: 'abc', score: 1,
match: {index: 0, length: 3},
}, {
item: 'bcd', original: 'bcd', key: 'bcd', score: 0.6666666666666667,
match: {index: 0, length: 2},
}] */
```
| 45.894366 | 244 | 0.699862 | eng_Latn | 0.970235 |
9c32e5dadee2c3b9833e09ccf37a4f3acdd10e0c | 2,681 | md | Markdown | availability/proxy_pac.md | mirkodziadzka-avi/datascript-library | 0d00f4a98168b791e8707d6b5cd1a0f16b311550 | [
"MIT"
] | 24 | 2017-06-21T07:42:21.000Z | 2022-01-06T07:31:44.000Z | availability/proxy_pac.md | mirkodziadzka-avi/datascript-library | 0d00f4a98168b791e8707d6b5cd1a0f16b311550 | [
"MIT"
] | 3 | 2019-01-16T20:40:49.000Z | 2021-12-13T12:31:57.000Z | availability/proxy_pac.md | mirkodziadzka-avi/datascript-library | 0d00f4a98168b791e8707d6b5cd1a0f16b311550 | [
"MIT"
] | 19 | 2018-05-10T13:35:55.000Z | 2021-12-12T15:59:53.000Z | # Managing Proxy Auto-Configuration (PAC) file
The example outlines how to integrate existing proxy.pac file to DataScript Lua code and serve it to the end users.
1. Proxy Auto-Configuration (PAC) file.
```
function FindProxyForURL(url, host) {
if (isPlainHostName(host) ||
dnsDomainIs(host, ".avinetworks.com") ||
isInNet(host, "192.168.0.0", "255.255.0.0")) {
return "DIRECT";
} else {
return "PROXY proxy.avinetworks.net:8080";
}
}
```
2. The above file has to be prepared/formatted for DataScript, leveraging the sed function. The output of the sed command will be used in the DataScript.
```
root@avitools:/tmp# sed ':a;N;$!ba;s/\n/\\n/g' proxy.pac
function FindProxyForURL(url, host) {\n if (isPlainHostName(host) ||\n dnsDomainIs(host, ".avinetworks.com") ||\n isInNet(host, "192.168.0.0", "255.255.0.0")) {\n return "DIRECT";\n } else {\n return "PROXY proxy.avinetworks.net:8080";\n }\n}
```
3. Create an HTTP_REQ DataScript and associate it with the corresponding virtual service.
```lua
-- HTTP_REQ
proxy_pac_body = 'function FindProxyForURL(url, host) {\n if (isPlainHostName(host) ||\n dnsDomainIs(host, ".avinetworks.com") ||\n isInNet(host, "192.168.0.0", "255.255.0.0")) {\n return "DIRECT";\n } else {\n return "PROXY proxy.avinetworks.net:8080";\n }\n}'
if string.contains(avi.http.get_path(), 'proxy.pac') then
avi.http.response(200,{content_type="application/x-ns-proxy-autoconfig", pragma="no-cache"}, proxy_pac_body)
end
```
4. Verify the applied DataScript.
```
root@avitools:/tmp# curl -i http://10.57.0.52/proxy.pac
HTTP/1.1 200 OK
Content-Type: application/x-ns-proxy-autoconfig
Content-Length: 252
Connection: keep-alive
pragma: no-cache
function FindProxyForURL(url, host) {
if (isPlainHostName(host) ||
dnsDomainIs(host, ".avinetworks.com") ||
isInNet(host, "192.168.0.0", "255.255.0.0")) {
return "DIRECT";
} else {
return "PROXY proxy.avinetworks.net:8080";
}
root@avitools:/tmp# apt-get install libpacparser1
root@avitools:/tmp# curl -O http://10.57.0.52/proxy.pac
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 253 100 253 0 0 1144 0 --:--:-- --:--:-- --:--:-- 1150
root@avitools:/tmp# pactester -p proxy.pac -u http://www.avinetworks.com
DIRECT
root@avitools:/tmp# pactester -p proxy.pac -u http://vmware.com
PROXY proxy.avinetworks.net:8080
```
| 46.224138 | 284 | 0.621037 | yue_Hant | 0.331888 |
9c33fddb987ab37a2a6e9fc844b2ad93903ccf2d | 5,444 | md | Markdown | docs/Appendix01_R_base/R_pacages_manage.md | wan230114/BioNote | 259a96f678667e364e416581d11e528aad762ef4 | [
"Apache-2.0"
] | 1 | 2020-04-24T07:29:18.000Z | 2020-04-24T07:29:18.000Z | docs/Appendix01_R_base/R_pacages_manage.md | wan230114/BioNote | 259a96f678667e364e416581d11e528aad762ef4 | [
"Apache-2.0"
] | 3 | 2020-04-08T02:50:05.000Z | 2020-04-08T05:51:41.000Z | docs/Appendix01_R_base/R_pacages_manage.md | wan230114/BioNote | 259a96f678667e364e416581d11e528aad762ef4 | [
"Apache-2.0"
] | 2 | 2020-04-04T04:52:52.000Z | 2021-01-19T16:33:04.000Z | # R包的管理及安装汇总
总结:
```R
install.packages('dplyr', repo="http://mirrors.tuna.tsinghua.edu.cn/CRAN")
# Download and install a package from CRAN.
library(dplyr)
# Load the package into the session, making all
# its functions available to use.
dplyr::select
# Use a particular function from a package.
data(iris)
# Load a built-in dataset into the environment.
```
## 1. Inspecting R packages
Check which packages are already installed:
```R
installed.packages()
```
Check which packages can be installed on your machine:
```R
available.packages()
```
Export environment variables pointing at a custom package library path:
```bash
export R_LIBS_USER=$PATH:/TJPROJ1/DENOVO/PROJECT/chenjun/R_lib/R_3.5.1
export R_LIBS=$PATH:/TJPROJ1/DENOVO/PROJECT/chenjun/R_lib/R_3.5.1
```
## 2. Package download mirrors
How to choose a domestic (China) CRAN mirror when installing packages in R – Su Yuening's blog
- http://suyuening.com/archives/1506.html
Commonly used mirrors:
- http://mirrors.tuna.tsinghua.edu.cn/CRAN (commonly used, fairly stable)
- http://mirrors.ustc.edu.cn/CRAN
## 3. Installing R packages
### 3.1. Installing from CRAN with install.packages()
Install R packages from CRAN.
Syntax:
```R
########安装R包的几种方式#############
# 修改清华镜像站
site="https://mirrors.tuna.tsinghua.edu.cn/CRAN"
install.packages("DESeq2", repo=site)
#完全可以多个R包一起安装
ins_pac = c("DO.db", "fgsea", "qvalue", "ggforce",
"DOSE", "ggraph", "GOSemSim", "biomaRt",
"enrichplot", "GenomicFeatures", "gridBase",
"rtracklayer", "TxDb.Hsapiens.UCSC.hg19.knownGene")
install.packages(ins_pac, repo=site)
```
Installing with an explicitly specified repo and library path [details to be refined]:
```R
install.packages("包名",
"repos"=c(CRAN="下载源"),
lib='自定义安装路径')
```
- Examples
```R
install.packages("RColorBrewer", "repos" = c(CRAN="http://mirrors.ustc.edu.cn/CRAN"), lib='/home/chenjun/software/R_lib')
install.packages("MatrixEQTL", "repos" = c(CRAN="http://mirrors.ustc.edu.cn/CRAN"), lib='/TJPROJ1/DENOVO/PROJECT/chenjun/R_lib/R_3.5.1')
install.packages("MatrixEQTL", "repos" = c(CRAN="http://www.bios.unc.edu/research/genomic_software/Matrix_eQTL/"), lib='/TJPROJ1/DENOVO/PROJECT/chenjun/R_lib')
```
### 3.2. Installing with BiocInstaller::biocLite()
Many bioinformatics packages cannot be installed successfully the direct way; installing them with biocLite often succeeds.
Code summary:
Source the Bioconductor installer over the network:
```R
source("https://bioconductor.org/biocLite.R")
biocLite("phangorn")
```
Install the BiocInstaller package:
```R
install.packages('BiocInstaller', repos='http://bioconductor.org/packages/3.7/bioc')
library("BiocInstaller")
options(BioC_mirror="http://mirrors.ustc.edu.cn/bioc/")
biocLite("phangorn", lib='/ifs/TJPROJ3/Plant/chenjun/software/R/R_3.6.0_package')
biocLite("synbreed", lib='/ifs/TJPROJ3/Plant/chenjun/software/R/R_3.6.0_package')
biocLite("snpStats", lib='/home/chenjun/software/R_lib')
```
- Reference [setting the installation mirror for biocLite]:
Choosing a mirror for R packages and installing locally – wangyunpeng_bio's blog – CSDN
https://blog.csdn.net/qq_29300341/article/details/53229215
- Fixing installation errors [Google search keywords]:
> ERROR: dependency ‘snpStats’ is not available for package ‘LDheatmap’ * removing ‘/home/chenjun/software/R_lib/LDheatmap’
> ERROR: dependency ‘LDheatmap’ is not available for package ‘synbreed’ * removing ‘/home/chenjun/software/R_lib/synbreed’
> - R: unable to install the snpStats package
> - https://www.biostars.org/p/202152/
[Fixing the error: biocLite.R]
R | You can now happily update to the latest version of R!
http://www.360doc.com/content/18/1012/22/47596298_794245943.shtml
### 3.3. Managing Bioconductor packages with BiocManager::install()
```R
##BiocManager
chooseCRANmirror()
install.packages("BiocManager")
#安装的软件包可以更新到当前版本
BiocManager::install()
#使用version()查看Bioconductor版本
BiocManager::version()
```
### 3.4. Installing packages from GitHub with devtools::install_github()
- Preparation: install the devtools package
```R
# 判断devtools工具是否存在,选择是否需要安装,因为很大。
require(devtools)
if (!requireNamespace("devtools", quietly = TRUE))
install.packages("devtools")
```
- Remote installation
```R
library("devtools")
# 安装phyloseq包
install_github("joey711/phyloseq")
library(phyloseq)
# 安装ggvegan包
devtools::install_github("gavinsimpson/ggvegan")
install_github("ggvegan")
# 其他
devtools::install_github("calligross/ggthemeassist")
# 安装开发版(连github不稳定有时间下载失败,多试几次可以成功)
devtools::install_github("phyloseq", build_vignettes = TRUE)
# 安装新功能最优版
devtools::install_github("phyloseq", ref = "optimization")
```
- Local installation:
> Downloads of GitHub packages often fail: the site may not open, R may be unable to download the package, and even a manual clone can be interrupted at any time.
> If the GitHub package cannot be downloaded or installed directly, download it manually, unzip it, note the folder name, and install from that local path.
```R
install.packages("C:/Users/wentao/Desktop/hrbrthemes-master/", repos = NULL, type = "source")
install.packages("C:/Users/wentao/Desktop/microbiomeutilities-master/", repos = NULL, type = "source")
library(microbiomeutilities)
```
How to install a local package through the graphical interface:
[R from scratch: how to install a package that has already been downloaded locally](https://mp.weixin.qq.com/s/RJ4-1i8QvtpO3Ay_XWeQOg)
### 3.5. Installing a fixed version with install_version()
```R
install_version("igraph", version = "0.6.5",
repo="http://mirrors.tuna.tsinghua.edu.cn/CRAN/")
```
## 4. Loading R packages
### 4.1. Ways to load a package
```R
# 常规载入
library("包名")
require("包名")
# 通过变量指定载入
pkg <- "包名"
library(pkg, character.only=TRUE)
# 多个包一起载入
# Load packages into session
cran_packages <- c("ggplot2", "gridExtra")
bioc_packages <- c("dada2", "msa", "phyloseq")
sapply(c(cran_packages, bioc_packages), require, character.only = TRUE)
```
> **The difference between library and require in R**
> - Inside a function, if a package does not exist, execution stops when library() is reached, whereas require() lets execution continue.
> - require() returns TRUE or FALSE depending on whether the package exists (see the example below).
> References:
> - [r - What is the difference between require() and library()? - Stack Overflow](https://stackoverflow.com/questions/5595512/what-is-the-difference-between-require-and-library)
> - [The difference between library and require in R - todoit - cnblogs](https://www.cnblogs.com/todoit/archive/2012/10/24/2736514.html)
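A small sketch of the difference (the package name is deliberately one that is assumed not to be installed):
```R
ok <- require("someMissingPkg")   # prints a warning, returns FALSE, execution continues
ok
# [1] FALSE

library("someMissingPkg")         # throws an error and stops execution here
```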
---
### 4.2. Checking which packages are loaded
```R
# 查看默认载入的包
getOption("defaultPackages") # 查看启动R时自动载入的包
# 查看R中载入的包
sessionInfo() # 查看R中载入的包
```
### 4.3. Checking the package installation directories
Check the directories where packages are installed:
```R
.libPaths() # 查看包的安装目录
library() # 查看已经安装的包目录
```
| 24.303571 | 178 | 0.709772 | yue_Hant | 0.303075 |
9c354ca468b9ad8db7f495837ddf0156bb1cbe9a | 1,151 | md | Markdown | 8_Cloudrip_Mountain/455-Square_Shield/readme.md | katitek/Code-Combat | fbda1ac0ae4a2e2cbfce21492a2caec8098f1bef | [
"MIT"
] | null | null | null | 8_Cloudrip_Mountain/455-Square_Shield/readme.md | katitek/Code-Combat | fbda1ac0ae4a2e2cbfce21492a2caec8098f1bef | [
"MIT"
] | null | null | null | 8_Cloudrip_Mountain/455-Square_Shield/readme.md | katitek/Code-Combat | fbda1ac0ae4a2e2cbfce21492a2caec8098f1bef | [
"MIT"
] | null | null | null | ## _Square Shield_
#### _Legend says:_
> Only faith, ruler, and calculation can protect us.
#### _Goals:_
+ _Survive_
#### _Topics:_
+ **Strings**
+ **Variables**
+ **Array Length**
+ **Return Statements**
+ **Object Literals**
+ **Accessing Properties**
#### _Solutions:_
+ **[JavaScript](squareSquad.js)**
+ **[Python](square_squad.py)**
#### _Rewards:_
+ 387 xp
+ 178 gems
#### _Victory words:_
+ _NOW THAT'S SOME GEOMETRY!_
___
### _HINTS_
There are a lot of wild yetis and they are very aggressive! Four paladins and a prayer can make the divine shield. Form a square with the paladins at the vertices and the enemies will not pass.
Two paladins are already in position; find the places for the other two.
___
To survive, you must assemble the Paladins into a square formation!
Since the Paladins will be making the corners of a square, consider these properties of a square:
1. The corners of a square share an x or y coordinate with an adjacent corner.
2. A square has equilateral (same length) sides.
To solve this level, move Vaelia and Illumina a `sideLength` away from each corner, but make sure they share a similar corner position (`x`).
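A rough JavaScript sketch of the corner math (the coordinates are made up, and the exact command call depends on the level's API, so treat the last line as an assumption):
```javascript
// Two paladins already stand on one side of the square (they share an x coordinate).
var corner1 = {x: 30, y: 20};                    // illustrative position
var corner2 = {x: 30, y: 20 + sideLength};       // shares x with corner1, sideLength away in y

// The two missing corners share a y coordinate with the known ones
// and sit exactly sideLength away along x:
var corner3 = {x: corner1.x + sideLength, y: corner1.y};
var corner4 = {x: corner2.x + sideLength, y: corner2.y};

// e.g. hero.command(vaelia, "move", corner3); hero.command(illumina, "move", corner4);
```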
___
| 24.489362 | 188 | 0.721112 | eng_Latn | 0.992942 |
9c35e9d97a3db0d4144c4a91d83f1936f8bc5e97 | 509 | md | Markdown | includes/spatial-anchors-create-locate-anchors-session-status.md | changeworld/azure-docs.nl-nl | bdaa9c94e3a164b14a5d4b985a519e8ae95248d5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/spatial-anchors-create-locate-anchors-session-status.md | changeworld/azure-docs.nl-nl | bdaa9c94e3a164b14a5d4b985a519e8ae95248d5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/spatial-anchors-create-locate-anchors-session-status.md | changeworld/azure-docs.nl-nl | bdaa9c94e3a164b14a5d4b985a519e8ae95248d5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
ms.openlocfilehash: bb75ea0f8eb3fcf85a06cadd68ae111d51369891
ms.sourcegitcommit: 87781a4207c25c4831421c7309c03fce5fb5793f
ms.translationtype: MT
ms.contentlocale: nl-NL
ms.lasthandoff: 01/23/2020
ms.locfileid: "76694464"
---
As described earlier, you need enough environment data to create a new cloud spatial anchor. This means that `ReadyForCreateProgress` must be above 1, but we recommend waiting until `RecommendedForCreateProgress` is greater than 1.
| 50.9 | 279 | 0.827112 | nld_Latn | 0.995242 |
9c36c1f5375d3003cdb603cecadc1ac6a9b35f42 | 2,714 | md | Markdown | README.md | flowkey/meteor-raven | 06120e0b77c04d77f07b7b6d105a2eba78021487 | [
"MIT"
] | null | null | null | README.md | flowkey/meteor-raven | 06120e0b77c04d77f07b7b6d105a2eba78021487 | [
"MIT"
] | null | null | null | README.md | flowkey/meteor-raven | 06120e0b77c04d77f07b7b6d105a2eba78021487 | [
"MIT"
] | 1 | 2018-08-01T14:25:00.000Z | 2018-08-01T14:25:00.000Z | # Raven
Raven/[Sentry](https://www.getsentry.com) integration for Meteor. Includes [Raven.js](https://github.com/getsentry/raven-js) for frontend logging and [raven-node](https://github.com/getsentry/raven-node) for backend logging.
Provides consolidated error logging to Sentry via Raven from both the client and the server.
**Although this is a fork of `dVelopment/meteor-raven` and `deepwell/meteor-raven`, the API is completely different!**
## Usage
Grab your client keys (DSN) from your project settings in Sentry. I recommend saving them in Meteor's `setting.json`:
```js
{
// ...
"sentryPrivateDSN": "https://[email protected]/app_id",
"public": {
// ...
// public key has to be available from the client!
"sentryPublicDSN": "https://public_key:[email protected]/app_id"
}
}
```
Now you can initialize the RavenLogger:
```js
import { Meteor } from 'meteor/meteor';
import RavenLogger from 'meteor/flowkey:raven';
// ...
export const ravenLogger = new RavenLogger({
publicDSN: Meteor.settings.public.sentryPublicDSN, // will be used on the client
privateDSN: Meteor.settings.sentryPrivateDSN, // will be used on the server
shouldCatchConsoleError: true, // default
trackUser: false, // default
}, ravenOptions);
// ...
```
You can pass options for raven directly into the client. Which parameters are accepted can be found at the Sentry docs for [the Node client](https://docs.sentry.io/clients/node/config/#optional-settings) and the [JavaScript client](https://docs.sentry.io/clients/javascript/config/#optional-settings). The options are being passed into both clients.
If you don't want to catch errors thrown globally or using `console.error` set `shouldCatchConsoleError` to `false`.
If you are using the Meteor Accounts package, you can enable user tracking on errors by settings `trackUser` to `true`. It will associate the error with the user's userId.
Now you can finally log messages to Sentry!
```js
import { ravenLogger } from './path/to/logger';
// ...
ravenLogger.log('Error transmitted using Raven.captureMessage', additionalData);
ravenLogger.log(new Error('Error transmitted using Raven.captureException'), additionalData);
```
If the first argument is an `instanceof Error` then `captureException` from Raven is used, otherwise `captureMessage`. If an error is passed, Raven will be saving full error and exception stack traces.
`additionalData` can be anything described in the [Sentry docs](https://docs.sentry.io/clients/javascript/usage/#passing-additional-data).
To set tags on the whole context, use `ravenLogger.setTagsContext`:
```js
ravenLogger.setTagsContext({ component: 'system' });
```
| 40.507463 | 349 | 0.747605 | eng_Latn | 0.881291 |
9c386b7fd1f15bfad15a8ccc4f9a63e596b8b68f | 1,145 | md | Markdown | _posts/2005-01-23-Iron-Fe.md | HeavyWeather/HeavyWeather.github.io | 40816530bf57a4023ac57686a85003e1e77e671f | [
"CC-BY-3.0"
] | null | null | null | _posts/2005-01-23-Iron-Fe.md | HeavyWeather/HeavyWeather.github.io | 40816530bf57a4023ac57686a85003e1e77e671f | [
"CC-BY-3.0"
] | null | null | null | _posts/2005-01-23-Iron-Fe.md | HeavyWeather/HeavyWeather.github.io | 40816530bf57a4023ac57686a85003e1e77e671f | [
"CC-BY-3.0"
] | null | null | null | ---
layout: post
title: Iron (Fe)
date: '2005-01-23T21:33:00+00:00'
tags: [fiction]
image:
imageurl:
---
Electric cat's frame ripples with voltage. Coated with ultra-black fur it boils with nannites. Electric cat uncoils and leaps exactly like greased lightning, servos purring their pleasure. Multiband, telefocus eyes glint from neon: reflected back and forth forever. Everything it sees, I see.
<!--more-->
Electric cat purrs, leaps, purrs, and microgyros whir along its spine. Four axis stabilised, in ninety nine out of a hundred tests electric cat lands on its feet running. Its tail switches to mewl out encrypted radio in frequencies only I can hear. Dogs oblivious to its dampened footfall or its telecommunication.
Electric cat is better than the real thing. Electric cat responds to my will only slighter slower than the speed of thought. It war-drives through the sleeping city. Through network after poorly protected network. Returning home each morning to download its prey at my doorstep, to flex and charge at my stroking touch. Summoned from the east, a demon... my familiar, housed in flesh that pounds with howling current.
| 76.333333 | 417 | 0.787773 | eng_Latn | 0.999157 |
9c38a960452552d058d63cf1952d5edab01c20aa | 1,765 | md | Markdown | docs/extensibility/debugger/reference/idebugprogramnode2-detachdebugger-v7.md | MicrosoftDocs/visualstudio-docs.de-de | edda581743b0eede0b99441d8e52a8d0e133dec8 | [
"CC-BY-4.0",
"MIT"
] | 10 | 2018-09-27T09:13:44.000Z | 2021-09-08T07:12:47.000Z | docs/extensibility/debugger/reference/idebugprogramnode2-detachdebugger-v7.md | MicrosoftDocs/visualstudio-docs.de-de | edda581743b0eede0b99441d8e52a8d0e133dec8 | [
"CC-BY-4.0",
"MIT"
] | 68 | 2018-02-07T12:07:58.000Z | 2021-03-19T00:35:58.000Z | docs/extensibility/debugger/reference/idebugprogramnode2-detachdebugger-v7.md | MicrosoftDocs/visualstudio-docs.de-de | edda581743b0eede0b99441d8e52a8d0e133dec8 | [
"CC-BY-4.0",
"MIT"
] | 41 | 2018-01-05T16:53:02.000Z | 2021-10-09T11:00:50.000Z | ---
title: IDebugProgramNode2::DetachDebugger_V7 | Microsoft Docs
description: This method is an old, deprecated form of the detach method that was used before Visual Studio 2005.
ms.date: 11/04/2016
ms.topic: reference
f1_keywords:
- IDebugProgramNode2::DetachDebugger
helpviewer_keywords:
- IDebugProgramNode2::DetachDebugger
- IDebugProgramNode2::DetachDebugger_V7
author: leslierichardson95
ms.author: lerich
manager: jmartens
ms.technology: vs-ide-debug
ms.workload:
- vssdk
dev_langs:
- CPP
- CSharp
ms.openlocfilehash: 47ad13cc0f3f01665535e6b9d8168af79eb299f0
ms.sourcegitcommit: 68897da7d74c31ae1ebf5d47c7b5ddc9b108265b
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 08/13/2021
ms.locfileid: "122029982"
---
# <a name="idebugprogramnode2detachdebugger_v7"></a>IDebugProgramNode2::DetachDebugger_V7
> [!Note]
> Veraltet. VERWENDEN SIE NICHT.
## <a name="syntax"></a>Syntax
```cpp
HRESULT DetachDebugger_V7 (
void
);
```
```csharp
int DetachDebugger_V7 ();
```
## <a name="return-value"></a>Rückgabewert
Eine Implementierung sollte immer `E_NOTIMPL` zurückgeben.
## <a name="remarks"></a>Hinweise
> [!WARNING]
> Ab Visual Studio 2005 wird diese Methode nicht mehr verwendet und sollte immer `E_NOTIMPL` zurückgeben.
Diese Methode wird aufgerufen, wenn der Debugger unerwartet beendet wird. Wenn diese Methode aufgerufen wird, sollte die De das Programm so fortsetzen, als ob der Benutzer davon getrennt wäre. Es sollten keine Debugereignisse mehr gesendet werden. Das Programm sollte sich in einem Zustand befinden, in dem es von einer anderen Instanz des Debuggers angefügt werden kann.
## <a name="see-also"></a>Siehe auch
- [IDebugProgramNode2](../../../extensibility/debugger/reference/idebugprogramnode2.md)
| 30.431034 | 371 | 0.788102 | deu_Latn | 0.889351 |
9c398b821ea9ea5500919b9483b92cf6b2f11bea | 2,389 | md | Markdown | Hardware/Readme.md | cpldcpu/SimPad | 22191102d1bc6d280b5c20b2e4b45a70a983abee | [
"MIT"
] | 66 | 2019-01-17T04:48:10.000Z | 2022-03-25T17:47:30.000Z | Hardware/Readme.md | cpldcpu/SimPad | 22191102d1bc6d280b5c20b2e4b45a70a983abee | [
"MIT"
] | 2 | 2019-10-10T08:21:01.000Z | 2020-06-30T06:45:40.000Z | Hardware/Readme.md | cpldcpu/SimPad | 22191102d1bc6d280b5c20b2e4b45a70a983abee | [
"MIT"
] | 6 | 2019-08-15T20:26:10.000Z | 2022-03-25T17:47:32.000Z | # Hardware
Experimental programmer hardware based on an Arduino Nano with a ATMega168PA.
The programming interface of the MCU is directly connected to GPIO of the ATMega as shown below. Also VDD is controlled by a GPIO. Maximum load on VDD is 20mA according to the PMS150C datasheet, which can be easily sourced from the ATMega. Vpp is generated with a simple boost converter.
6-wire 5-wire
PWM = VPP = ICVPP
PB0 = VDD = VDD
PB1 = MISO = ICPDA (data output of MCU or bidirectional)
PB2 = MOSI = Not used (data input of MCU)
PB3 = SCLK = ICPCK
PA6 = ADC6 = voltage monitor input (5k1/10k divider)
This is a minimal hardware implementation and is not representative of a production worthy programmer. See below for open issues.
## Boost converter to control Vpp
The programming voltage Vpp is generated with a simple boost converter that is controlled directly by the periphery of the ATMega. The circuit and a LTspice simulation is shown in the images below. This concept was previously used in the Openprog PIC programmer and is also used in the official PICkit by Microchip. See [here](http://openprog.altervista.org/OP_eng.html#Regulator) for more details.
Timer TC0 is used in fast PWM mode to generate a 62.5 kHz square wave on the input of the switching transistor. The duty cycle of the PWM signal is varied to modify swithing. The voltage across the load is divided with R1 and R3 and fed back into the ADC of the ATMega. Right now the voltage in the programmer is not controlled in a closed loop, therefore changes of the load lead to a deviation of Vpp. To reduce the impact of varying load, a constant loading resistor was added. This may be merged with the voltage divider.
A 1N4148 PN junction diode with ~0.7 Vf drop is used intentionally to reduce the standby voltage of the switching converter to below 5.0 V.
## Open issues
- To properly implement the programming sequence, including corner case verification, it is also necessary to control the voltage of VDD from 2 V to 6.5 V. This requires additional hardware.
- Control of VPP is somewhat instable and has a slow step response. This could be improved with a closed loop converter or a dedicated boost converter IC.
### Circuit

### Simulation

### Breadboard

| 66.361111 | 525 | 0.768522 | eng_Latn | 0.999219 |
9c399169ab851b79deb1e8c5ceb2464871544140 | 56 | md | Markdown | README.md | david-gang/coroutines_presentation | 7324f5c94cd0080d6a7df3e0bcee56fd13861f0c | [
"MIT"
] | null | null | null | README.md | david-gang/coroutines_presentation | 7324f5c94cd0080d6a7df3e0bcee56fd13861f0c | [
"MIT"
] | null | null | null | README.md | david-gang/coroutines_presentation | 7324f5c94cd0080d6a7df3e0bcee56fd13861f0c | [
"MIT"
] | null | null | null | # coroutines_presentation
This is code for presentation
| 18.666667 | 29 | 0.857143 | eng_Latn | 0.998844 |
9c3dca2392e8e67147017268bad7ed40f73a91f9 | 381 | md | Markdown | Tree/LeetCode-Tree/Minimum Depth of Binary Tree/README.md | steveLauwh/Algorithms | e8ce0c5cc2a54fbe9700ed8c02acf0f758243eaa | [
"Apache-2.0"
] | 39 | 2018-09-19T06:57:33.000Z | 2022-01-29T09:11:20.000Z | Tree/LeetCode-Tree/Minimum Depth of Binary Tree/README.md | steveLauwh/Data-Structures-And-Algorithms | e8ce0c5cc2a54fbe9700ed8c02acf0f758243eaa | [
"Apache-2.0"
] | null | null | null | Tree/LeetCode-Tree/Minimum Depth of Binary Tree/README.md | steveLauwh/Data-Structures-And-Algorithms | e8ce0c5cc2a54fbe9700ed8c02acf0f758243eaa | [
"Apache-2.0"
] | 17 | 2018-07-09T08:33:14.000Z | 2021-12-08T09:30:01.000Z | ## Minimum Depth of Binary Tree「LeetCode 111」
题目:求二叉树的最小深度。
```
Given a binary tree, find its minimum depth.
The minimum depth is the number of nodes along the shortest path from the root node down to the nearest leaf node.
```
解题思路:递归思想
1. 与 Maximum Depth of Binary Tree 类似
2. 从根节点开始,当左子树为 NULL,需要求解右子树的最小深度
3. 当右子树为 NULL,需要求解左子树的最小深度
4. 当都不为 NULL,比较左子树和右子树的最小深度,求出最小深度后加 1
| 22.411765 | 114 | 0.76378 | eng_Latn | 0.943551 |
9c3dd47835e713678ba0ce8f279d6ed203418688 | 9,090 | md | Markdown | README.md | lukasl-dev/waterlink | 794eabccf61fb39ab7feda9d2289dcb7e5bca93b | [
"MIT"
] | 11 | 2021-04-17T14:12:02.000Z | 2022-01-15T13:14:09.000Z | README.md | lukasl-dev/waterlink | 794eabccf61fb39ab7feda9d2289dcb7e5bca93b | [
"MIT"
] | 2 | 2021-08-17T03:33:33.000Z | 2021-08-17T15:23:56.000Z | README.md | lukasl-dev/waterlink | 794eabccf61fb39ab7feda9d2289dcb7e5bca93b | [
"MIT"
] | 2 | 2021-04-18T09:16:43.000Z | 2021-08-28T09:06:21.000Z | # waterlink
<div align="center">
<a href="https://golang.org/">
<img
src="https://img.shields.io/badge/Written%20in-Go-%23EF4041?style=for-the-badge"
height="30"
/>
</a>
<a href="https://pkg.go.dev/github.com/lukasl-dev/waterlink">
<img
src="https://img.shields.io/badge/godoc-reference-5272B4.svg?style=for-the-badge"
height="30"
/>
</a>
<a href="https://goreportcard.com/report/github.com/lukasl-dev/waterlink">
<img
src="https://goreportcard.com/badge/github.com/lukasl-dev/waterlink?style=for-the-badge"
height="30"
/>
</a>
</div>
## :books: Introduction
Waterlink is a [Lavalink](https://github.com/freyacodes/Lavalink) client written in Go. **The library is based on
the [Lavalink 3.x.x protocol](https://github.com/freyacodes/Lavalink/blob/master/IMPLEMENTATION.md).**
---
## :mag_right: Compatibility
The following Lavalink versions have been tested for compatibility with waterlink:
- [x] [v3.3.2.5](https://github.com/freyacodes/Lavalink/releases/tag/3.3.2.5)
---
## :ballot_box: Installation
It is assumed that you have already worked with the Go environment. If this is not the case,
see [this page first](https://golang.org/doc/install).
```shell
go get -u github.com/lukasl-dev/waterlink
```
---
## :art: Structural design
### :house: Architecture
I have tried to implement my interpretation of [**Clean Architecture by Robert C. Martin (Uncle
Bob)**](https://blog.cleancoder.com/uncle-bob/2012/08/13/the-clean-architecture.html). If you have any corrections or
suggestions, please create an issue.
### :mosquito: Mocking
To simplify testing for the handling of the library, waterlink offers the possibility of mock implementations. The
mocking library used for this is [stretchr/testify](https://github.com/stretchr/testify).
---
## :bamboo: Getting started
Firstly, we need to differentiate between **connectionless** and **connection-oriented** use cases. **
Connection-oriented** use cases require an **active web socket connection** to the Lavalink server and **
connectionless** use cases are **only based on simple HTTP requests**.
### :boat: Opening a connection
The Connection is the interface between waterlink and **Lavalink's web socket API**. It is required to access the **
connection-oriented use cases** and can be opened by the `waterlink.Connect` function.
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"context"
"net/url"
"github.com/lukasl-dev/waterlink"
)
var (
host = url.URL{ // TODO: adjust
Scheme: "ws",
Host: "localhost:2333",
}
passphrase = "youshallnotpass" // TODO: adjust
)
func main() {
opts := waterlink.NewConnectOptions().WithPassphrase(passphrase) // more options available
conn, err := waterlink.Connect(context.TODO(), host, opts)
if err != nil {
// TODO: handle error
return
}
// TODO: use conn
}
```
</p>
</details>
### :phone: Creating a requester
The Requester is the interface between waterlink and **Lavalink's HTTP API**. It is required to access the **
connectionless use cases** and can be created by the `waterlink.NewRequester` function.
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"net/url"
"github.com/lukasl-dev/waterlink"
)
var (
host = url.URL{ // TODO: adjust
Scheme: "http",
Host: "localhost:2333",
}
passphrase = "youshallnotpass" // TODO: adjust
)
func main() {
opts := waterlink.NewRequesterOptions().WithPassphrase(passphrase) // more options available
req := waterlink.NewRequester(host, opts)
// TODO: use req
}
```
</p>
</details>
### :musical_keyboard: Interacting with tracks
#### Loading multiple tracks
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"github.com/lukasl-dev/waterlink"
)
var (
req waterlink.Requester // TODO: create req
identifier = "https://www.youtube.com/watch?v=dQw4w9WgXcQ" // TODO: adjust
)
func main() {
resp, err := req.LoadTracks(identifier)
if err != nil {
// TODO: handle error
return
}
// TODO: use resp
}
```
</p>
</details>
#### Decoding multiple tracks
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"github.com/lukasl-dev/waterlink"
)
var (
req waterlink.Requester // TODO: create req
trackIDs []string // TODO: define trackIDs
)
func main() {
tracks, err := req.DecodeTracks(trackIDs...)
if err != nil {
// handle error
return
}
// TODO: use tracks
}
```
</p>
</details>
### :notes: Interacting with an audio player
The interaction with an audio player **requires an active web socket connection**.
Additionally, a [voice update event **must be intercepted**](#briefcase-intercepting-a-voice-update-event) to play a
track.
#### Destroying an audio player
<details>
<summary>Usage</summary>
<p>
```go
package main
import "github.com/lukasl-dev/waterlink"
var (
conn waterlink.Connection // TODO: open conn
guildID string // TODO: define guildID
)
func main() {
if err := conn.Destroy(guildID); err != nil {
// TODO: handle error
}
}
```
</p>
</details>
#### Pausing/Resuming the current playing track
<details>
<summary>Usage</summary>
<p>
```go
package main
import "github.com/lukasl-dev/waterlink"
var (
conn waterlink.Connection // TODO: open conn
guildID string // TODO: define guildID
paused bool // TODO: define paused
)
func main() {
if err := conn.SetPaused(guildID, paused); err != nil {
// TODO: handle error
}
}
```
</p>
</details>
#### Playing a track
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"github.com/lukasl-dev/waterlink"
"github.com/lukasl-dev/waterlink/usecase/play"
)
var (
conn waterlink.Connection // TODO: open conn
guildID string // TODO: define guildID
trackID string // TODO: load trackID
volume uint // TODO: define volume
)
func main() {
opts := play.NewOptions().WithVolume(volume) // more options available
if err := conn.Play(guildID, trackID, opts); err != nil {
// TODO: handle error
}
}
```
</p>
</details>
#### Seeking the current playing track
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"github.com/lukasl-dev/waterlink"
)
var (
conn waterlink.Connection // TODO: open conn
guildID uint // TODO: define guildID
position uint // TODO: define position
)
func main() {
if err := conn.Seek(guildID, position); err != nil {
// TODO: handle error
}
}
```
</p>
</details>
#### Stopping the current playing track
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"github.com/lukasl-dev/waterlink"
)
var (
conn waterlink.Connection // TODO: open conn
guildID string // TODO: define guildID
)
func main() {
if err := conn.Stop(guildID); err != nil {
// TODO: handle error
}
}
```
</p>
</details>
#### Intercepting a voice update event
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"github.com/lukasl-dev/waterlink"
)
var (
conn waterlink.Connection // TODO: open conn
guildID uint // TODO: define guildID
sessionID string // TODO: define sessionID
token string // TODO: define token
endpoint string // TODO: define endpoint
)
func main() {
if err := conn.UpdateVoice(guildID, sessionID, token, endpoint); err != nil {
// TODO: handle error
}
}
```
</p>
</details>
#### Updating the volume of an audio player
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"github.com/lukasl-dev/waterlink"
)
var (
conn waterlink.Connection // TODO: open conn
guildID string // TODO: define guildID
volume uint // TODO: define volume
)
func main() {
if err := conn.UpdateVolume(guildID, volume); err != nil {
// TODO: handle error
}
}
```
</p>
</details>
### :mailbox: Monitoring events
<details>
<summary>Usage</summary>
<p>
```go
package main
import (
"github.com/lukasl-dev/waterlink"
"github.com/lukasl-dev/waterlink/entity/event"
"github.com/lukasl-dev/waterlink/entity/player"
"github.com/lukasl-dev/waterlink/entity/server"
)
var (
conn waterlink.Connection // TODO: open conn
)
func main() {
for evt := range conn.Events() {
switch evt.Type() {
case event.Stats: // more events available
evt := evt.(server.Stats)
println("Server uses", evt.Memory.Used, "memory")
case event.TrackStart: // more events available
evt := evt.(player.TrackStart)
println("Track", evt.TrackID, "started on guild", evt.GuildID)
}
}
}
```
</p>
</details>
---
## :notebook: Examples
### Integrating [bwmarrin/discordgo](https://github.com/bwmarrin/discordgo)
[\<External Repository\>](https://github.com/lukasl-dev/waterlink-discordgo)
| 19.464668 | 117 | 0.647085 | eng_Latn | 0.630341 |
9c4061865369becb6668a3a9e0ceade4b8be59f1 | 67 | md | Markdown | _includes/02-image.md | AntonGluschuk/markdown-portfolio | 77d03585344d6d1f24b5ea2b638b7dc010ff05a6 | [
"MIT"
] | null | null | null | _includes/02-image.md | AntonGluschuk/markdown-portfolio | 77d03585344d6d1f24b5ea2b638b7dc010ff05a6 | [
"MIT"
] | 5 | 2020-10-13T17:04:02.000Z | 2020-10-13T17:37:59.000Z | _includes/02-image.md | AntonGluschuk/markdown-portfolio | 77d03585344d6d1f24b5ea2b638b7dc010ff05a6 | [
"MIT"
] | null | null | null | 
| 33.5 | 66 | 0.776119 | vie_Latn | 0.274273 |
9c408323313bfea7e1451dbeaef42996c4a6b168 | 1,046 | md | Markdown | docs/notes/fragments.md | CaoJiayuan/gh-pages | 84f9b9d27d5899088ddaf6e4e3a22b45213e76f2 | [
"MIT"
] | null | null | null | docs/notes/fragments.md | CaoJiayuan/gh-pages | 84f9b9d27d5899088ddaf6e4e3a22b45213e76f2 | [
"MIT"
] | 6 | 2021-03-01T19:59:39.000Z | 2022-02-10T11:24:01.000Z | docs/notes/fragments.md | CaoJiayuan/gh-pages | 84f9b9d27d5899088ddaf6e4e3a22b45213e76f2 | [
"MIT"
] | 1 | 2018-08-13T06:54:42.000Z | 2018-08-13T06:54:42.000Z | ---
title: Fragments
---
<center>
<h1>代码片段</h1>
</center>
## 小程序直传OSS
```javascript
function upload() {
this.$request.get('http://api.demo.test/api/upload/auth').then(res => { //请求服务器获取授权信息
wx.chooseImage({
success: file => {
let filename = file.tempFilePaths[0].substring(file.tempFilePaths[0].lastIndexOf('/') + 1);
let key = `${res.data.dir}${filename}`;
wx.uploadFile({
url: res.data.host,
filePath: file.tempFilePaths[0],
name: 'file',
formData: {
name: file.tempFilePaths[0],
key: key,
policy: res.data.policy,
OSSAccessKeyId: res.data.accessid,
signature: res.data.signature,
},
success: data => {
let url = res.data.host + '/' + key; //上传后的文件url
console.log(url)
}
})
}
})
})
}
``` | 26.820513 | 104 | 0.443595 | yue_Hant | 0.142792 |
9c40d98383c951f59608c1feb6b30b4911f328d0 | 13,089 | md | Markdown | README.md | kouk/permifrost | 713aee06c287ba128032c03eead3469a79d90560 | [
"MIT"
] | null | null | null | README.md | kouk/permifrost | 713aee06c287ba128032c03eead3469a79d90560 | [
"MIT"
] | null | null | null | README.md | kouk/permifrost | 713aee06c287ba128032c03eead3469a79d90560 | [
"MIT"
] | null | null | null | # `permifrost`
We welcome contributions, so please feel free to submit MRs or [issues](https://gitlab.com/gitlab-data/permifrost/-/issues/new) if you'd like to help in any way. To get started with contributions read the [Contributing](#contributing) section at the bottom of this README to get started.
## Installation
Install the most stable version using the following command:
```
pip install permifrost
```
If you would like to work with the most up-to-date functionality in permifrost install directly from GitLab using the following command:
```
pip install git+https://gitlab.com/gitlab-data/permifrost.git
```
## Usage
Use this command to check and manage the permissions of a Snowflake account.
```bash
permifrost [-v] run <spec_file> [--role] [--dry] [--diff] [--user] [--ignore-memberships]
```
```shell
#> permifrost run --help
Usage: permifrost run [OPTIONS] SPEC
Grant the permissions provided in the provided specification file for
specific users and roles
Options:
--dry Do not actually run, just check.
--diff Show full diff, both new and existing permissions.
--role TEXT Run grants for specific roles. Usage: --role testrole --role
testrole2.
--user TEXT Run grants for specific users. Usage: --user testuser --user
testuser2.
--ignore-memberships Do not handle role membership grants/revokes
--help Show this message and exit.
```
Use this utility command to run the SnowFlake specification loader to confirm that your `roles.yml` file is valid.
```bash
permifrost [-v] spec-test <spec_file> [--role] [--user] [--ignore-memberships]
```
```shell
#> permifrost spec-test --help
Usage: permifrost spec-test [OPTIONS] SPEC
Load SnowFlake spec based on the roles.yml provided. CLI use only for confirming specifications are valid.
Options:
--role TEXT Run grants for specific roles. Usage: --role testrole
--role testrole2.
--user TEXT Run grants for specific users. Usage: --user testuser
--user testuser2.
--ignore-memberships Do not handle role membership grants/revokes
--run-list TEXT Run grants for specific users. Usage: --user testuser
--user testuser2.
--help Show this message and exit.
```
Given the parameters to connect to a Snowflake account and a YAML file (a
"spec") representing the desired database configuration, this command makes sure
that the configuration of that database matches the spec. If there are
differences, it will return the sql grant and revoke commands required to make
it match the spec. If there are additional permissions set in the database this
command will create the necessary revoke commands with the exception of:
- Object Ownership
- Warehouse Privileges
Furthermore, if you are using the recommended role of `SECURITYADMIN`, `ALTER USER ...` commands will fail on users that are owned by `ACCOUNTADMIN`. In these circumstances, it is highly recommended to log into the Snowflake instance and update ownership of all users to belong to `USERADMIN` as per Snowflake recommended best practices.
Lastly, note that the default roles cannot have their role hierarchies modified. As such, any `GRANT ROLE <default role> TO ROLE <default role>;` will be excluded from the permission set generated by Permifrost.
For example:
```yaml
...
roles:
public:
member_of:
- useradmin
securityadmin:
member_of:
- useradmin
...
```
Both of the above relationships will be skipped as this attempts to modify a default Snowflake permission structure which would generate an error on attempting to implement.
Permifrost is heavily inspired by
[pgbedrock](https://github.com/Squarespace/pgbedrock) which can be used for
managing the permissions in a Postgres database.
## spec_file
The YAML specification file is used to define in a declarative way the
databases, roles, users and warehouses in a Snowflake account, together with the
permissions for databases, schemas and tables for the same account.
All permissions are abbreviated as `read` or `write` permissions, with
Permifrost generating the proper grants for each type of object. This includes
shared databases which have simpler and more limited permissions than non-shared
databases.
According to the `read` vs. `write` permissions approach, you should be able to
grant granular access like `read` permissions for usage of database and schema
and `write` permissions to insert data into a specific table within that
database and schema.
Tables and views are listed under `tables` and handled properly behind the
scenes.
If `*` is provided as the parameter for tables the grant statement will use the
`ALL <object_type>s in SCHEMA` syntax. It will also grant to future tables and
views. See Snowflake documenation for [`ON
FUTURE`](https://docs.snowflake.net/manuals/sql-reference/sql/grant-privilege.html#optional-parameters)
If a schema name includes an asterisk, such as `snowplow_*`, then all schemas
that match this pattern will be included in the grant statement _unless it is
for ownership_, in which case the asterisk is not supported. This can be coupled
with the asterisk for table grants to grant permissions on all tables in all
schemas that match the given pattern. This is useful for date-partitioned
schemas.
All entities must be explicitly referenced. For example, if a permission is
granted to a schema or table then the database must be explicitly referenced for
permissioning as well. Additionally, role membership must be explicit in the
config file. If a role does not have a `member_of` list, it will have all roles
it currently has revoked.
Roles can accept "_" as a role name either alone or nested under the `include`
key. There is optionally an `exclude` key that can be used if `include` is used.
`"_"`will grant membership to all roles defined in the spec. Any roles defined
in`exclude`will be removed from the list defined in`include`.
A specification file has the following structure:
```bash
# Databases
databases:
- db_name:
shared: boolean
- db_name:
shared: boolean
owner: role_name
... ... ...
# Roles
roles:
- role_name:
warehouses:
- warehouse_name
- warehouse_name
...
member_of:
- role_name
- role_name
...
# or
member_of:
include:
- "*"
exclude:
- role_name
privileges:
databases:
read:
- database_name
- database_name
...
write:
- database_name
- database_name
...
schemas:
read:
- database_name.*
- database_name.schema_name
- database_name.schema_partial_*
...
write:
- database_name.*
- database_name.schema_name
- database_name.schema_partial_*
...
tables:
read:
- database_name.*.*
- database_name.schema_name.*
- database_name.schema_partial_*.*
- database_name.schema_name.table_name
...
write:
- database_name.*.*
- database_name.schema_name.*
- database_name.schema_partial_*.*
- database_name.schema_name.table_name
...
owns:
databases:
- database_name
...
schemas:
- database_name.*
- database_name.schema_name
...
tables:
- database_name.*.*
- database_name.schema_name.*
- database_name.schema_name.table_name
...
- role_name:
owner: role_name
... ... ...
# Users
users:
- user_name:
can_login: boolean
member_of:
- role_name
...
- user_name:
owner: role_name
... ... ...
# Warehouses
warehouses:
- warehouse_name:
size: x-small
- warehouse_name:
size: x-small
owner: role_name
... ... ...
```
For a working example, you can check [the Snowflake specification
file](https://gitlab.com/gitlab-data/permifrost/blob/master/tests/permifrost/core/permissions/specs/snowflake_spec.yml)
that we are using for testing `permifrost permissions`.
### Settings
All settings are declared here with their default values and are described
below. These can be added to your spec.yaml file.
```yaml
require-owner: false
```
`require-owner`: Set to true to force having to set the `owner` property on all
objects defined.
## --diff
When this flag is set, a full diff with both new and already granted commands is
returned. Otherwise, only required commands for matching the definitions on the
spec are returned.
## --dry
When this flag is set, the permission queries generated are not actually sent to
the server and run; They are just returned to the user for examining them and
running them manually.
When this flag is not set, the commands will be executed on Snowflake and their
status will be returned and shown on the command line.
## Connection Parameters
The following environmental variables must be available to connect to Snowflake:
```bash
$PERMISSION_BOT_USER
$PERMISSION_BOT_ACCOUNT
$PERMISSION_BOT_WAREHOUSE
```
### Username and Password
To connect using a username and password, also include the following:
```bash
$PERMISSION_BOT_PASSWORD
$PERMISSION_BOT_DATABASE
$PERMISSION_BOT_ROLE
```
Currently, Permifrost assumes you are using the SECURITYADMIN role and will fail
validation if you are not.
### OAuth
To connect using an OAuth token, also include the following:
```bash
$PERMISSION_BOT_OAUTH_TOKEN
```
### Key Pair Authentication
Rather than supplying a password or an oauth token, it's possible to connect via
Snowflake's Key Pair authentication by setting the following:
```bash
$PERMISSION_BOT_KEY_PATH
$PERMISSION_BOT_KEY_PASSPHRASE
```
See [Snowflake-sqlalchemy](https://github.com/snowflakedb/snowflake-sqlalchemy#key-pair-authentication-support) for more info.
## Contributing
Contributing to Permifrost is easy, and most commands to do so are available
within the Makefile.
The easiest way to start developing is to run `make initial-setup` to install
all the necessary packages to develop on the project. Next run `make
permifrost` in a second terminal, this will open a shell in a docker container
with the local version of Permifrost installed.
You can now make changes to the files in your editor and it will be reflected in
the commands that you run from the docker shell.
To check code quality prior to committing changes, you can use `make local-lint`.
See the [Makefile](Makefile) for more details.
**WARNINGS**
DO NOT name git branches with forward slashes `/` as the current CI pipeline is
unable to manage names like this. (i.e. `username/feature/feature-name` will
break the CI pipeline so `username.feature.feature-name` should be used
instead)
This project has [pre-commit
hooks](https://github.com/pre-commit/pre-commit-hooks) installed to maintain
the existing code quality. As such, we strongly recommend you use a terminal to
**commit** and **push** code changes. Specifically, avoid using git
integrations on IDEs to make **commits** or **pushes**. **Adding** files
through the IDE git integrations are okay, but do not **commit** through the
IDE. Use the terminal to commit changes because it will show the output of each
of the pre-commit checks to allow you to make changes as needed.
For committing work-in-progress changes use `git commit --no-verify -m "WIP:
<message>"`.
For committing finalized changes, the below workflow will identify errors and allow for easier development:
* Make your changes and `git add <file name(s)>`
* `git commit` to identify/format errors in the changed files
* Repeat the following steps until all checks pass
* `git add <file name(s)>`
* `git commit`
* Add message at the prompt and save/exit the commit file
* When you are ready to push changes to the remote host, run `git push origin <branch name>`. This will perform additional linting/formatting checks.
* Repeat the following steps until all checks pass
* `git push origin <branch name>`
* `git add <file name(s)>`
* `git commit`
* Add message at the prompt and save/exit the commit file
* `git push origin <branch name>` until all checks pass
## Releasing
See the [issue template](https://gitlab.com/gitlab-data/permifrost/-/blob/master/.gitlab/issue_templates/Releasing%20update.md)
for guidance on how to release a new version of this project to PyPi
| 34.264398 | 337 | 0.691573 | eng_Latn | 0.995667 |
9c411c0925533cea77a1a480fcafa4326a07ae03 | 3,186 | md | Markdown | docs/access/desktop-database-reference/rundatamacro-macro-action.md | isabella232/office-developer-client-docs.de-DE | f244ed2fdf76004aaef1de6b6c24b8b1c5a6942e | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-05-19T18:52:16.000Z | 2021-04-21T00:13:46.000Z | docs/access/desktop-database-reference/rundatamacro-macro-action.md | MicrosoftDocs/office-developer-client-docs.de-DE | f244ed2fdf76004aaef1de6b6c24b8b1c5a6942e | [
"CC-BY-4.0",
"MIT"
] | 2 | 2021-12-08T03:25:19.000Z | 2021-12-08T03:43:48.000Z | docs/access/desktop-database-reference/rundatamacro-macro-action.md | isabella232/office-developer-client-docs.de-DE | f244ed2fdf76004aaef1de6b6c24b8b1c5a6942e | [
"CC-BY-4.0",
"MIT"
] | 5 | 2018-07-17T08:19:45.000Z | 2021-10-13T10:29:41.000Z | ---
title: RunDataMacro-Makroaktion
TOCTitle: RunDataMacro macro action
ms:assetid: fe4ac2f4-7851-7797-ce91-5f2dd3ba4d22
ms:mtpsurl: https://msdn.microsoft.com/library/Ff837269(v=office.15)
ms:contentKeyID: 48548933
ms.date: 09/18/2015
mtps_version: v=office.15
f1_keywords:
- vbaac10.chm168493
f1_categories:
- Office.Version=v15
ms.localizationpriority: medium
ms.openlocfilehash: d24cf33bd9b5ced31ec7a71ce67efc70b26b5e02
ms.sourcegitcommit: a1d9041c20256616c9c183f7d1049142a7ac6991
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 09/24/2021
ms.locfileid: "59557792"
---
# <a name="rundatamacro-macro-action"></a>RunDataMacro-Makroaktion
**Gilt für**: Access 2013, Office 2013
Mit der **AusführenDatenmakro**-Aktion können Sie ein benanntes Datenmakro ausführen.
## <a name="setting"></a>Einstellung
Die **AusführenDatenmakro**-Aktion kann mit dem folgenden Argument verwendet werden.
<table>
<colgroup>
<col style="width: 50%" />
<col style="width: 50%" />
</colgroup>
<thead>
<tr class="header">
<th><p>Aktionsargument</p></th>
<th><p>Beschreibung</p></th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><p>Name</p></td>
<td><p>Der Name des Datenmakros, das ausgeführt werden soll.</p></td>
</tr>
</tbody>
</table>
## <a name="remarks"></a>HinwBemerkungeneise
Sie können die **Aktion "RunDataMacro"** in Makros, benannten Datenmakros und den folgenden Makroereignissen verwenden: **[Makroereignis "Nach Löschvorgang",](after-delete-macro-event.md)** **[Makroereignis "Nach Einfügen"](after-insert-macro-event.md)** und **[Makroereignis "Nach Aktualisierung".](after-update-macro-event.md)**
Der Name des Datenmakros muss die Tabelle enthalten, der es angefügt ist (z. **B. Comments.AddComment**, nicht nur **AddComment**).
Wenn Sie das auszuführende Datenmakro im Makro-Designer auswählen, wird von Access ermittelt, ob das Datenmakro Parameter erfordert. Wenn für das Datenmakro Parameter erforderlich sind, werden Textfelder angezeigt, in die Sie die Argumente eingeben können.
Wenn Sie ein Makro ausführen, das die **AusführenDatenmakro** -Aktion enthält, und dieses die **AusführenDatenmakro** -Aktion erreicht, wird das aufgerufene Datenmakro in Access ausgeführt. Sobald das aufgerufene Datenmakro beendet wurde, kehrt Access zum ursprünglichen Makro zurück und führt die nächste Aktion aus.
## <a name="example"></a>Beispiel
Das folgende Beispiel zeigt, wie Sie einen Parameter an ein benanntes Datenmakro übergeben. Das dmGetCurrentServiceRequest-Datenmakro der tblServiceRequests-Tabelle wird mithilfe der RunDataMacro-Aktion aufgerufen. Wenn dmGetCurrentServiceRequest abgeschlossen ist, wird die CurrentServiceRequest-Variable zurückgegeben, aus der das Datenmakro in das Textfeld txtCurrentSR geschrieben wird.
**Der Beispielcode stammt von:**[Microsoft Access 2010 Programmer's Reference](https://www.amazon.com/Microsoft-Access-2010-Programmers-Reference/dp/8126528125).
```vb
RunDataMacro
Macro Name tblServiceRequests.dmGetCurrentServiceRequest
Parameters
prmAssignedTo =[ID]
SetProperty
Control Name txtCurrentSR
Property Value
Value =[ReturnVars]![CurrentServiceRequest]
```
| 40.329114 | 390 | 0.77307 | deu_Latn | 0.925948 |
9c41738d6a1bbb54e0fc11aace6ba055f2225de0 | 48 | md | Markdown | README.md | smsubham/Bengali.AI-Handwritten-Grapheme-Classification | 178da5c62309e9d364792a1fc1d14d6d6ed3073a | [
"Apache-2.0"
] | null | null | null | README.md | smsubham/Bengali.AI-Handwritten-Grapheme-Classification | 178da5c62309e9d364792a1fc1d14d6d6ed3073a | [
"Apache-2.0"
] | null | null | null | README.md | smsubham/Bengali.AI-Handwritten-Grapheme-Classification | 178da5c62309e9d364792a1fc1d14d6d6ed3073a | [
"Apache-2.0"
] | null | null | null | # Bengali.AI-Handwritten-Grapheme-Classification | 48 | 48 | 0.875 | eng_Latn | 0.385038 |
9c43bbdabd5acf4bb2f7c4cd978923cae19c3557 | 39 | md | Markdown | README.md | hersheychadha/Visual-Biases | e9c8161078c4fab13ddb8d28fa0a9a12574c0e02 | [
"MIT"
] | null | null | null | README.md | hersheychadha/Visual-Biases | e9c8161078c4fab13ddb8d28fa0a9a12574c0e02 | [
"MIT"
] | null | null | null | README.md | hersheychadha/Visual-Biases | e9c8161078c4fab13ddb8d28fa0a9a12574c0e02 | [
"MIT"
] | null | null | null | # Visual-Biases
Minor Project Contents
| 13 | 22 | 0.820513 | eng_Latn | 0.691609 |
9c451d7bd0003ccef823628fa08a193573655e13 | 3,386 | md | Markdown | includes/vpn-gateway-table-gwtype-aggtput-include.md | junichia/azure-docs.ja-jp | 9fa4a5e0a4ff7a741c2efb6c5a5010fa2abc810c | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-10-22T22:16:30.000Z | 2019-10-22T22:16:30.000Z | includes/vpn-gateway-table-gwtype-aggtput-include.md | junichia/azure-docs.ja-jp | 9fa4a5e0a4ff7a741c2efb6c5a5010fa2abc810c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/vpn-gateway-table-gwtype-aggtput-include.md | junichia/azure-docs.ja-jp | 9fa4a5e0a4ff7a741c2efb6c5a5010fa2abc810c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: インクルード ファイル
description: インクルード ファイル
services: vpn-gateway
author: cherylmc
ms.service: vpn-gateway
ms.topic: include
ms.date: 05/22/2019
ms.author: cherylmc
ms.custom: include file
ms.openlocfilehash: 65c1011e6e005c190d1ae5d51fdd009f66a20956
ms.sourcegitcommit: 083aa7cc8fc958fc75365462aed542f1b5409623
ms.translationtype: HT
ms.contentlocale: ja-JP
ms.lasthandoff: 09/11/2019
ms.locfileid: "70919703"
---
|**SKU** | **S2S/VNet 間<br>トンネル** | **P2S<br> SSTP 接続** | **P2S<br> IKEv2/OpenVPN 接続** | **合計<br>スループット ベンチマーク** | **BGP** | **ゾーン冗長** |
|--- | --- | --- | --- | --- | --- | --- |
|**Basic** | 最大 10 | 最大 128 | サポートされていません | 100 Mbps | サポートされていません| いいえ |
|**VpnGw1**| 最大 30* | 最大 128 | 最大 250 | 650 Mbps | サポートされています | いいえ |
|**VpnGw2**| 最大 30* | 最大 128 | 最大 500 | 1 Gbps | サポートされています | いいえ |
|**VpnGw3**| 最大 30* | 最大 128 | 最大 1000 | 1.25 Gbps | サポートされています | いいえ |
|**VpnGw1AZ**| 最大 30* | 最大 128 | 最大 250 | 650 Mbps | サポートされています | はい |
|**VpnGw2AZ**| 最大 30* | 最大 128 | 最大 500 | 1 Gbps | サポートされています | はい |
|**VpnGw3AZ**| 最大 30* | 最大 128 | 最大 1000 | 1.25 Gbps | サポートされています | はい |
(*) 30 個を超える S2S VPN トンネルが必要な場合は、[Virtual WAN](../articles/virtual-wan/virtual-wan-about.md) を使用してください。
* これらの接続の制限は別々になっています。 たとえば、VpnGw1 SKU では 128 の SSTP 接続が利用できると共に 250 の IKEv2 接続を利用できます。
* 料金情報については、[価格](https://azure.microsoft.com/pricing/details/vpn-gateway)に関するページをご覧ください。
* SLA (サービス レベル アグリーメント) 情報は [SLA](https://azure.microsoft.com/support/legal/sla/vpn-gateway/) のページで確認できます。
* VpnGw1、VpnGw2、VpnGw3、VpnGw1AZ、VpnGw2AZ、および VpnGw3AZ は、Resource Manager デプロイ モデルを使用する VPN ゲートウェイでのみサポートされます。
* Basic SKU はレガシ SKU とみなされます。 Basic SKU には一定の機能制限があります。 Basic SKU を使用するゲートウェイのサイズを変更し、新しいゲートウェイ SKU のいずれかにすることはできません。その代わり、新しい SKU に変更する必要があります。 Basic SKU を使用する前に、必要としている機能がサポートされていることを確認してください。
* 合計スループット ベンチマークは、1 つのゲートウェイから集計された複数のトンネルの測定値に基づいています。 VPN ゲートウェイの合計スループット ベンチマークは、S2S と P2S を組み合わせたものです。 **多数のポイント対サイト接続がある場合、スループットの制限が原因でサイト間接続に悪影響が及ぶ可能性があります。** 合計スループット ベンチマークは、インターネット トラフィックの状況とアプリケーションの動作に依存するため、保証されたスループットではありません。
* VpnGw Sku の相対的なパフォーマンスをお客様が容易に把握できるように、一般公開されている iPerf ツールと CTSTraffic ツールを使用してパフォーマンスを測定しました。 次の表に、さまざまなアルゴリズムを使用したパフォーマンス テストの結果を示します。 ご覧のとおり、IPsec 暗号化と整合性の両方に GCMAES256 アルゴリズムを使用した場合に、最高のパフォーマンスが得られました。 IPsec 暗号化に AES256 を使用し、整合性に SHA256 を使用した場合は、平均的なパフォーマンスが得られました。 IPsec 暗号化に DES3 を使用し、整合性に SHA256 を使用した場合は、パフォーマンスが最も低くなりました。
|**SKU** | **使用した<br>アルゴリズム** | **測定された<br>スループット** | **測定された<br> 1 秒あたりのパケット数** |
|--- | --- | --- | --- |
|**VpnGw1**| GCMAES256<br>AES256 と SHA256<br>DES3 と SHA256| 650 Mbps<br>500 Mbps<br>120 Mbps | 58,000<br>50,000<br>50,000|
|**VpnGw2**| GCMAES256<br>AES256 と SHA256<br>DES3 と SHA256| 1 Gbps<br>500 Mbps<br>120 Mbps | 90,000<br>80,000<br>55,000|
|**VpnGw3**| GCMAES256<br>AES256 と SHA256<br>DES3 と SHA256| 1.25 Gbps<br>550 Mbps<br>120 Mbps | 105,000<br>90,000<br>60,000|
|**VpnGw1AZ**| GCMAES256<br>AES256 と SHA256<br>DES3 と SHA256| 650 Mbps<br>500 Mbps<br>120 Mbps | 58,000<br>50,000<br>50,000|
|**VpnGw2AZ**| GCMAES256<br>AES256 と SHA256<br>DES3 と SHA256| 1 Gbps<br>500 Mbps<br>120 Mbps | 90,000<br>80,000<br>55,000|
|**VpnGw3AZ**| GCMAES256<br>AES256 と SHA256<br>DES3 と SHA256| 1.25 Gbps<br>550 Mbps<br>120 Mbps | 105,000<br>90,000<br>60,000|
| 65.115385 | 332 | 0.690786 | jpn_Jpan | 0.480723 |
051e9db7a56f25348bde610e063caf7f4517cb5a | 650 | md | Markdown | README.md | aln447/gaja_mgr | d794047c42f38aa203df1365a225cd5dead5e47e | [
"MIT"
] | null | null | null | README.md | aln447/gaja_mgr | d794047c42f38aa203df1365a225cd5dead5e47e | [
"MIT"
] | null | null | null | README.md | aln447/gaja_mgr | d794047c42f38aa203df1365a225cd5dead5e47e | [
"MIT"
] | null | null | null | # Gaja
Statyczna strona graficzna napisana w ramach części praktycznej pracy magisterskiej *"Projektowanie interfejsów, wybrane wzorce projektowe"
- Autor: Alan Krygowski
- Kontakt email: [email protected]
Strona korzysta ze schematu graficznego _Material Bootstrap_ od Creative Tim, dostępnej pod tym źródłem: https://www.creative-tim.com/product/material-kit
Zdjęcia bazowe, zapisane w katalogu:
- post1: https://unsplash.com/photos/I_71KTe1Nwk
- post2: https://unsplash.com/photos/Cbf5Qzt2QIY
- post3: https://unsplash.com/photos/DLS6UsGH7VE
Źródła wszystkich innych wykorzystanych publikacji dostępne są w stronie projektu, przy każdym zdjęciu | 46.428571 | 154 | 0.816923 | pol_Latn | 0.999386 |
0520d54684032f446586226c482a6d8fbe28896a | 2,350 | md | Markdown | README.md | leobarros/nginx | 36c5dc71957a78793e673bc6504ba5d1f380d738 | [
"Apache-2.0"
] | null | null | null | README.md | leobarros/nginx | 36c5dc71957a78793e673bc6504ba5d1f380d738 | [
"Apache-2.0"
] | null | null | null | README.md | leobarros/nginx | 36c5dc71957a78793e673bc6504ba5d1f380d738 | [
"Apache-2.0"
] | null | null | null | # Nginx web server
Máquina virtual para estudo do servidor web Nginx.
Os exemplos foram tirados do curso de Nginx Fundamental em
(https://www.udemy.com/nginx-fundamentals/learn/v4/overview)
# Requerimentos
* VirturalBox
* Vagrant (Ubuntu)
* Puppet 4
* Adicionar em sua máquina o host web.testlabs.com.br em seu /etc/hosts.
Exemplo: xxx.xxx.xxx.xxx web.testlabs.com.br
* Baixar o wordpress (https://wordpress.org/download/)
Criar a pasta /sites/wordpress, pois está root /sites/wordpress
no arquivo de configuração de nginx.conf.
# Usando o Vagrant
* Iniciando a máquina virtual com o VagrantFile presente:
vagrant up web
* Para iniciar a sessão ssh:
vagrant ssh web
* Adicionar o repositorio de PHP 5 na máquina virtual
Sites para informações sobre o repositório do PHP 5.6:
https://joshtronic.com/2014/08/31/upgrade-to-php-56-on-ubuntu-1404-lts/
https://www.dotdeb.org/instructions/
https://www.dotdeb.org/mirrors/
* Baixar o wordpress (https://wordpress.org/download/)
Criar a pasta /sites/wordpress, pois está root /sites/wordpress
no arquivo de configuração de nginx.conf.
* Instalar o MySQL para abrigar a base de dados do wordpress.
apt-get install mysql-server
* Instalar apache2-utils para usar o ab, htpasswd e etc.
apt-get install apache2-utils
Copiar o arquivo .htpassw para /etc/nginx/
Usuário de exemplo e senha de exemplo:
usuário: testlabs
senha: 280600
## Instalação do php5.6
Depois de informar o repositorio do php devemos instalar os pacotes:
* php5.6
* php5.6-fpm
* php5.6-cgi
* php5.6-mysql
apt-get install php5.6 php5.6-fpm php5.6-cgi php5.6-mysql
# Exemplo de um site
Está sendo usado o site de exemplo da pasta bootstrap assim como o
wordpress.
Será necessário informar se deseja usar o bootstrap ou wordpress no
arquivo nginx.conf do projeto, copiar para /etc/nginx.conf e depois
reiniciar o nginx com systemctl restart nginx.
# Gerando certificado para uso de https no nginx.conf
Use o comando abaixo na máquina virtual para gerar o ssl
sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout /etc/nginx/ssl/nginx.key -out /etc/nginx/ssl/nginx.crt
# Arquivo de Configuração
Colocar o arquivo nginx.conf em /etc/nginx/
Colocar o arquivo www.conf em /etc/php/5.6/fpm/pool.d
| 34.057971 | 121 | 0.732766 | por_Latn | 0.972085 |
05233c268355dff6faa852077b98e03dfa16b59c | 10,350 | md | Markdown | _posts/LearnCpp/chapterM. Move Semantics and Smart Pointers/2021-12-13-M.02-R-value references.md | mgtruuuu/mgtruuuu.github.io | 2fc22f2f995ea1dfef2f64201e8b112fab55322f | [
"MIT"
] | null | null | null | _posts/LearnCpp/chapterM. Move Semantics and Smart Pointers/2021-12-13-M.02-R-value references.md | mgtruuuu/mgtruuuu.github.io | 2fc22f2f995ea1dfef2f64201e8b112fab55322f | [
"MIT"
] | null | null | null | _posts/LearnCpp/chapterM. Move Semantics and Smart Pointers/2021-12-13-M.02-R-value references.md | mgtruuuu/mgtruuuu.github.io | 2fc22f2f995ea1dfef2f64201e8b112fab55322f | [
"MIT"
] | null | null | null | ---
title : "M.02 — R-value references"
category :
- LearnCpp
tag :
- C++
- https://www.learncpp.com/
- value category
- l-value, locator value
- r-value
- l-value reference, reference
- r-value reference
toc: true
toc_sticky: true
use_math : true
---
Way back in chapter 1, we mentioned l-values and r-values, and then told you not to worry that much about them. That was fair advice prior to `C++11`. But understanding move semantics in `C++11` requires a re-examination of the topic. So let’s do that now.
## L-values and r-values
Despite having the word “value” in their names, **l-values and r-values are actually not properties of values, but rather, *properties of expressions***.
**Every expression in C++ has two properties:** a **type** (which is used for type checking), and a **value category** (which is used for certain kinds of syntax checking, such as whether the result of the expression can be assigned to). In `C++03` and earlier, l-values and r-values were the only two value categories available.
*The actual definition of which expressions are l-values and which are r-values is surprisingly complicated, so we’ll take a simplified view of the subject that will largely suffice for our purposes.*
It’s simplest to think of an **l-value** (also called a **locator value**) as *a function or an object (or an expression that evaluates to a function or object).* *All l-values have assigned memory addresses.*
*When l-values were originally defined*, they were defined as “values that are suitable to be on the left-hand side of an assignment expression”. However, *later*, **the `const` keyword was added to the language, and l-values were split into two sub-categories:** **modifiable l-values**, which can be changed, and **non-modifiable l-values**, which are const.
It’s simplest to think of an **r-value** as *“everything that is not an l-value”*. This notably includes **literals** (e.g. `5`), **temporary values** (e.g. `x + 1`), and **anonymous objects** (e.g. `Fraction(5, 2)`). **r-values are typically evaluated for their values, have expression scope (they die at the end of the expression they are in), and cannot be assigned to.** *This non-assignment rule makes sense, because assigning a value applies a side effect to the object.* Since r-values have expression scope, if we were to assign a value to an r-value, then the r-value would either go out of scope before we had a chance to use the assigned value in the next expression (which makes the assignment useless) or we’d have to use a variable with a side effect applied more than once in an expression (which by now you should know causes undefined behavior!).
>>>???(lvalue=rvalue=rvalue ... contradiction??)
**In order to support move semantics, `C++11` introduces 3 new value categories: pr-values, x-values, and gl-values.** We will largely ignore these since understanding them isn’t necessary to learn about or use move semantics effectively. If you’re interested, [cppreference.com](https://en.cppreference.com/w/cpp/language/value_category) has an extensive list of expressions that qualify for each of the various value categories, as well as more detail about them.
## L-value references
*Prior to `C++11`, only one type of reference existed in C++, and so it was just called a “reference”.* **However, in `C++11`, it’s sometimes called an l-value reference.** *L-value references can only be initialized with modifiable l-values.*
<div class="cpp-table-wrapper"><p></p><table class="cpp-table"><tbody><tr><th>\L-value reference</th><th>Can be initialized with</th><th>Can modify</th></tr><tr><td>Modifiable l-values</td><td>Yes</td><td>Yes</td></tr><tr><td>Non-modifiable l-values</td><td>No</td><td>No</td></tr><tr><td>R-values</td><td>No</td><td>No</td></tr></tbody></table></div>
*L-value references to const objects can be initialized with l-values and r-values alike. However, those values can’t be modified.*
<div class="cpp-table-wrapper"><p></p><table class="cpp-table"><tbody><tr><th>\L-value reference to const</th><th>Can be initialized with</th><th>Can modify</th></tr><tr><td>Modifiable l-values</td><td>Yes</td><td>No</td></tr><tr><td>Non-modifiable l-values</td><td>Yes</td><td>No</td></tr><tr><td>R-values</td><td>Yes</td><td>No</td></tr></tbody></table></div>
L-value references to const objects are particularly useful because they allow us to pass any type of argument (l-value or r-value) into a function without making a copy of the argument.
## R-value references
*`C++11` adds a new type of reference called an r-value reference.* An **r-value reference** is a reference that is designed to be initialized with an r-value *(only)*. While an l-value reference is created using a single ampersand, an r-value reference is created using a double ampersand:
```c++
int x{ 5 };
// L-value reference initialized with l-value x.
int &lref{ x };
// R-value reference initialized with r-value 5.
int &&rref{ 5 };
```
**R-values references cannot be initialized with l-values.**
<div class="cpp-table-wrapper"><p></p><table class="cpp-table"><tbody><tr><th>\R-value reference</th><th>Can be initialized with</th><th>Can modify</th></tr><tr><td>Modifiable l-values</td><td>No</td><td>No</td></tr><tr><td>Non-modifiable l-values</td><td>No</td><td>No</td></tr><tr><td>R-values</td><td>Yes</td><td>Yes</td></tr></tbody></table></div>
<div class="cpp-table-wrapper"><p></p><table class="cpp-table"><tbody><tr><th>\R-value reference to const</th><th>Can be initialized with</th><th>Can modify</th></tr><tr><td>Modifiable l-values</td><td>No</td><td>No</td></tr><tr><td>Non-modifiable l-values</td><td>No</td><td>No</td></tr><tr><td>R-values</td><td>Yes</td><td>No</td></tr></tbody></table></div>
R-value references have two properties that are useful.
- First, **r-value references extend the lifespan of the object they are initialized with to the lifespan of the r-value reference** (l-value references to const objects can do this too).
- Second, **non-const r-value references allow you to modify the r-value**!
Let’s take a look at some examples:
```c++
#include <iostream>
class Fraction {
private:
int m_numerator;
int m_denominator;
public:
Fraction(int numerator = 0, int denominator = 1)
: m_numerator{ numerator }, m_denominator{ denominator }{}
friend std::ostream& operator<<(std::ostream& out, const Fraction& f1) {
out << f1.m_numerator << '/' << f1.m_denominator;
return out;
}
};
int main() {
// r-value reference to temporary Fraction.
auto&& rref{ Fraction{ 3, 5 } };
// f1 of operator<< binds to the temporary,
// no copies are created.
std::cout << rref << '\n';
} // rref (and the temporary Fraction) goes out of scope here.
```
This program prints:
```
3/5
```
As an anonymous object, `Fraction(3, 5)` would normally go out of scope at the end of the expression in which it is defined. However, since we’re initializing an r-value reference with it, its duration is extended until the end of the block. We can then use that r-value reference to print the `Fraction`’s value.
Now let’s take a look at a less intuitive example:
```c++
#include <iostream>
int main() {
// Because we're initializing an r-value reference with a literal,
// a temporary with value 5 is created here.
int&& rref{ 5 };
rref = 10;
std::cout << rref << '\n';
}
```
This program prints:
```
10
```
While it may seem weird to initialize an r-value reference with a literal value and then be able to change that value, **when initializing an r-value reference with a literal, a temporary object is constructed from the literal so that the reference is referencing *a temporary object, not a literal value***.
*R-value references are not very often used in either of the manners illustrated above.*
## R-value references as function parameters
R-value references are more often used as function parameters. This is most useful for function overloads when you want to have different behavior for l-value and r-value arguments.
```c++
#include <iostream>
// L-value arguments will select this function.
void fun(const int& lref) {
std::cout << lref << ", l-value reference to const\n";
}
// R-value arguments will select this function.
void fun(int&& rref) {
std::cout << rref << ", r-value reference\n";
}
int main() {
int x{ 5 };
// L-value argument calls l-value version of function.
fun(x);
// R-value argument calls r-value version of function.
fun(5);
}
```
This prints:
```
5, l-value reference to const
5, r-value reference
```
As you can see, *when passed an l-value*, the overloaded function resolved to the version with the l-value reference. *When passed an r-value*, the overloaded function resolved to the version with the r-value reference **(this is considered a better match than a l-value reference to const)**.
Why would you ever want to do this? We’ll discuss this in more detail in the next lesson. Needless to say, it’s an important part of move semantics.
One interesting note:
```c++
#include <iostream>
// L-value arguments will select this function.
void fun(const int& lref) {
std::cout << lref << ", l-value reference to const\n";
}
// R-value arguments will select this function.
void fun(int&& rref) {
std::cout << rref << ", r-value reference\n";
}
int main() {
int &&ref{ 5 };
fun(ref);
}
```
Function `fun()` actually calls the l-value version of the function! **Although variable `ref` has type r-value reference to an integer, it is actually an l-value itself (as are all named variables).** The confusion stems from the use of the term r-value in two different contexts. Think of it this way: Named-objects are l-values. Anonymous objects are r-values. **The type of the named object or anonymous object is independent from whether it’s an l-value or r-value.** Or, put another way, if r-value reference had been called anything else, this confusion wouldn’t exist.
## Returning an r-value reference
**You should almost never return an r-value reference, for the same reason you should almost never return an l-value reference.** In most cases, you’ll end up returning a hanging reference when the referenced object goes out of scope at the end of the function. | 49.052133 | 863 | 0.710242 | eng_Latn | 0.995711 |
052401fd8ffc8587734529978d965318aa22eb22 | 43 | md | Markdown | guides/data/data-collection-mi.md | cannlytics/cannlytics-ai | c9d94e6fe9961129d1e29cd70c11ad6d267f3d48 | [
"MIT"
] | 2 | 2021-11-14T00:57:23.000Z | 2022-02-05T23:31:05.000Z | guides/data/data-collection-mi.md | cannlytics/cannlytics-ai | c9d94e6fe9961129d1e29cd70c11ad6d267f3d48 | [
"MIT"
] | null | null | null | guides/data/data-collection-mi.md | cannlytics/cannlytics-ai | c9d94e6fe9961129d1e29cd70c11ad6d267f3d48 | [
"MIT"
] | 1 | 2021-11-14T09:07:00.000Z | 2021-11-14T09:07:00.000Z | # Data Collection | Michigan
## Resources
| 10.75 | 28 | 0.72093 | kor_Hang | 0.813124 |
05243e098a62c3c2c22ab45f4f15216695c8de90 | 19 | md | Markdown | README.md | mdsaraujo/projeto-espaco | a38e772be8bb5f04fab4cb1252f18f84aab6ca45 | [
"MIT"
] | null | null | null | README.md | mdsaraujo/projeto-espaco | a38e772be8bb5f04fab4cb1252f18f84aab6ca45 | [
"MIT"
] | null | null | null | README.md | mdsaraujo/projeto-espaco | a38e772be8bb5f04fab4cb1252f18f84aab6ca45 | [
"MIT"
] | null | null | null | # projeto-espaco
| 6.333333 | 16 | 0.684211 | fra_Latn | 0.847867 |
05249188d74a86c22843060a22cc74ac28320606 | 1,272 | md | Markdown | README.md | banyan/gh_contrib | 39f3fb7f7f6ed862c12b5e02678dade27c5e896f | [
"MIT"
] | null | null | null | README.md | banyan/gh_contrib | 39f3fb7f7f6ed862c12b5e02678dade27c5e896f | [
"MIT"
] | null | null | null | README.md | banyan/gh_contrib | 39f3fb7f7f6ed862c12b5e02678dade27c5e896f | [
"MIT"
] | null | null | null | # gh_contrib
GitHub has a nice feature of [contributions](https://help.github.com/articles/viewing-contributions-on-your-profile-page/).
But unfortunately they don't offer an official API.
Now somehow we can get only from here (`https://github.com/users/<username>/contributions`). It returns HTML :cry:.
This is just parsing HTML to JSON or returns as Ruby's object.
## Usage
### CLI
```zsh
$ gem install gh_contrib
$ gh_contrib username
$ echo 'GITHUB_USERNAME=username
$ GITHUB_PASSWORD=password' > .env
# or you can define on shell
# export GITHUB_USERNAME=username
# export GITHUB_PASSWORD=password
$ gh_contrib username
$ gh_contrib username -d month
```
### API
```ruby
require 'gh_contrib'
agent = GhContrib::Agent.new
puts agent.contributions 'username'
agent.login 'username', 'password'
puts agent.contributions 'username'
puts agent.contributions_by_month 'username'
```
## Tips
* Sum all the contributions with [jq](http://stedolan.github.io/jq/).
```zsh
$ gh_contrib banyan | jq 'reduce .[].count as $item (0; . + $item)'
4698
```
## Limitations
* You can't enable two factor authentication for this user if you want to get the data as logged in :cry:.
## Caveat
* Since it's not an official API, it might be broken anytime :dizzy_face:.
| 21.931034 | 123 | 0.735063 | eng_Latn | 0.815158 |
0524e4db94d132fde4eccf3f5583407266ec7fe3 | 3,750 | md | Markdown | content/blog/algorithm/10.md | leverz/zblog | a0082e799c75e1de0ebac8561bf44777d2cf945b | [
"MIT"
] | null | null | null | content/blog/algorithm/10.md | leverz/zblog | a0082e799c75e1de0ebac8561bf44777d2cf945b | [
"MIT"
] | 4 | 2019-12-03T05:26:49.000Z | 2019-12-03T05:29:54.000Z | content/blog/algorithm/10.md | leverz/zblog | a0082e799c75e1de0ebac8561bf44777d2cf945b | [
"MIT"
] | null | null | null | ---
title: 10. 正则表达式匹配
date: '2019-11-30T23:32:00.000Z'
---
[https://leetcode-cn.com/problems/regular-expression-matching/](https://leetcode-cn.com/problems/regular-expression-matching/)
使用语言:Go
## 解:
```Go
func isMatch(s string, p string) bool {
if s == p {
return true
}
if len(p) == 0 {
return len(s) == 0
}
if len(p) == 1 || string(p[1]) != "*" {
// p 有值而 s 无值的情况返回 false
// p[0] != s[0] 返回 false
if len(s) == 0 || (string(p[0]) != "." && p[0] != s[0]) {
return false
}
// p 长度为 1,则递归传入的 p 为 "", 需要判断 s 是否为空
// p 长度大于 1,但是 p[1] != "*",因此不需要考虑重复匹配的情况,直接进入下一个阶段的匹配
return isMatch(string(s[1:]), string(p[1:]))
}
// p 长度大于 1 并且 p[1] == "*",需要考虑重复匹配的情况
// 如果 p[0] == ".",而已知 p[1] == "*",这是一个万能匹配式,我们只需要知道 p[2:] 是否与 s 中的半段匹配即可。
// 因此,这里靠 i 做驱动,不断地与 p[2:] 做匹配,只要匹配成功,就全部成功,遍历完 i 依然不成功,就返回不成功
// 如果 p[0] != ".",而已知 p[1] == "*",则可以得出 p[0] 对应的字母可以重复任意次。
// 这里 i 从 -1 开始计算的意义在于 x* 可以匹配 0 次或任意多次。从 -1 开始的意思是假设匹配 0 次,则直接将 s 与 p[2:] 进行匹配。匹配如果不成功,就认为应该选择匹配任意多次的分支,则需要保证 p[0] 与 s[i]相等(可以理解为 s[i-1] 已经与 p[0] 想等了,而 s[i:] 又跟 p[2:] 不匹配,这个时候如果 s[i] 还不跟 p[0] 相等,就可以认为完全不匹配了)
// 按照递归的思想,我们只处理子问题,剩下的交给递归去解决
// 这里的子问题是 s[i] 如果跟 p[0] 相等,就继续看 s[i] 后面的元素是否与 p[2:] 匹配即可,一旦后面的元素完全匹配,即可得出 s 和 p 匹配;但是如果不匹配,可能只是因为我们的 s[i:] 还包含了应该跟 p[:2] 匹配的字符,所以继续 i++ 向后检查
sl := len(s)
i := -1
for i < sl && (i < 0 || string(p[0]) == "." || p[0] == s[i]) {
if isMatch(s[(i+1):], p[2:]) {
return true
}
i++
}
return false
}
```
这道题目的难点在于 “*” 的处理上。如果没有 “*” 我们就只需要考虑 s 和 p 的每个字符是否相等,或者 p 中的某个字符是否为 “.”。
为了规避复杂的 “*” 处理,该解法的前半段先处理不带 “*” 的情况。
后半段对 “*” 的处理上,实际上是对各种情况的检测,我画了个示意图:

As the diagram shows, the cases break down into these branches:
1. The first character of p is "."
  + 1.1. Some contiguous suffix of s matches p[2:], so s and p **match**
  + 1.2. No contiguous suffix of s matches p[2:], so s and p **do not match**
2. The first character of p is not ".", and s[0] == p[0]
  + 2.1. The first i elements of s all equal p[0]
    + 2.1.1. Some suffix starting after the i-th element matches p[2:], so s and p **match**
    + 2.1.2. No such suffix exists, so s and p **do not match**
  + 2.2. s[i:] does not match p[2:] and, in addition, s[i] != p[0], so s and p **do not match** (for example: s = "aba", p = "a*a")
3. The first character of p is not ".", and s[0] != p[0]
  + 3.1. s matches p[2:], so s and p **match**
  + 3.2. s does not match p[2:], so s and p **do not match**
Based on the analysis above, the recursive matching clearly explores many repeated paths. To lower the time complexity, we can introduce a two-dimensional array that caches the result of every path already visited.
```Go
func isMatch(s string, p string) bool {
if s == p {
return true
}
if len(p) == 0 {
return len(s) == 0
}
sl := len(s)
pl := len(p)
mem := make([][]*bool, sl + 1)
for i := range mem {
mem[i] = make([]*bool, pl + 1)
}
return _isMatch(s, p, 0, 0, mem)
}
func _isMatch(s, p string, i, j int, mem [][]*bool) bool {
matched := true
notMatched := false
if j >= len(p) {
return i >= len(s)
}
if mem[i][j] != nil {
return *mem[i][j]
}
var subS string
if i >= len(s) {
subS = ""
} else {
subS = string(s[i:])
}
subP := string(p[j:])
if subS == subP {
mem[i][j] = &matched
return true
}
if len(subP) == 1 || string(subP[1]) != "*" {
if len(subS) == 0 || (string(subP[0]) != "." && subP[0] != subS[0]) {
			mem[i][j] = &notMatched
return false
}
r := _isMatch(s, p, i + 1, j + 1, mem)
mem[i][j] = &r
return r
}
for k:=-1;k<len(subS);k++ {
if k < 0 || string(subP[0]) == "." || subP[0] == subS[k] {
if _isMatch(s, p, i + k + 1, j + 2, mem) {
mem[i][j] = &matched
return true
}
} else {
			mem[i][j] = &notMatched
return false
}
}
	mem[i][j] = &notMatched
return false
}
```
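For a quick sanity check, the function can be exercised with the classic examples from the problem statement. This small driver is an addition for illustration and assumes the solution above lives in the same package:

```Go
package main

import "fmt"

func main() {
	fmt.Println(isMatch("aa", "a"))                   // false
	fmt.Println(isMatch("aa", "a*"))                  // true
	fmt.Println(isMatch("ab", ".*"))                  // true
	fmt.Println(isMatch("aab", "c*a*b"))              // true
	fmt.Println(isMatch("mississippi", "mis*is*p*.")) // false
}
```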
| 27.372263 | 212 | 0.485333 | yue_Hant | 0.795163 |
0525283c34cfd8e8f3c275e00551b316d9cf9102 | 75 | md | Markdown | README.md | LordSapling/UnityGitHubTutorial | 4f5b07c092d834f3959934c55f87547a3f19a43f | [
"MIT"
] | null | null | null | README.md | LordSapling/UnityGitHubTutorial | 4f5b07c092d834f3959934c55f87547a3f19a43f | [
"MIT"
] | null | null | null | README.md | LordSapling/UnityGitHubTutorial | 4f5b07c092d834f3959934c55f87547a3f19a43f | [
"MIT"
] | null | null | null | # UnityGitHubTutorial
Demonstrate commit, push, pull, branches and merging
| 25 | 52 | 0.826667 | eng_Latn | 0.953262 |
0525a64836fd4496076f5d85656bf63f6511a302 | 422 | md | Markdown | README.md | my-dish/packer | 46c5849e1f24484c19fb67ead9ac46c6e86ab500 | [
"MIT"
] | null | null | null | README.md | my-dish/packer | 46c5849e1f24484c19fb67ead9ac46c6e86ab500 | [
"MIT"
] | null | null | null | README.md | my-dish/packer | 46c5849e1f24484c19fb67ead9ac46c6e86ab500 | [
"MIT"
] | null | null | null | # packer
<!-- travis https://travis-ci.org/ -->
<!-- appveyor https://ci.appveyor.com -->
<!-- codecov https://codecov.io/gh -->
<!-- npm version badge: https://badge.fury.io/ -->
## Install
```
$ npm install @my-dish/packer
```
## Usage
```js
const packer = require('@my-dish/packer');
packer.installTemplate();
packer.createPackageJSON();
packer.installPackages();
```
### CLI
```sh
$ packer --debug [--reload]
```
| 16.230769 | 50 | 0.618483 | yue_Hant | 0.577073 |
0525bd0af1128c3e2495eb645a3b1ca7d5d9e481 | 30 | md | Markdown | README.md | hydragium/panes | 77c7e8cd19ed7aca6240acef190b5d586b8e336c | [
"MIT"
] | null | null | null | README.md | hydragium/panes | 77c7e8cd19ed7aca6240acef190b5d586b8e336c | [
"MIT"
] | null | null | null | README.md | hydragium/panes | 77c7e8cd19ed7aca6240acef190b5d586b8e336c | [
"MIT"
] | null | null | null | # panes
Web UI pane component
| 10 | 21 | 0.766667 | eng_Latn | 0.840069 |
05260ae00532655bdf6487a2dcea8a2bed91e696 | 1,218 | md | Markdown | memdocs/configmgr/core/understand/what-happened-to-sccm.md | hyoshioka0128/memdocs.ja-jp | e165484b113c81dd8464d1548afad83e1f18d1d8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | memdocs/configmgr/core/understand/what-happened-to-sccm.md | hyoshioka0128/memdocs.ja-jp | e165484b113c81dd8464d1548afad83e1f18d1d8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | memdocs/configmgr/core/understand/what-happened-to-sccm.md | hyoshioka0128/memdocs.ja-jp | e165484b113c81dd8464d1548afad83e1f18d1d8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: What happened to SCCM?
description: Understand the rebranding of System Center Configuration Manager to Microsoft Endpoint Configuration Manager
ms.date: 11/29/2019
ms.prod: configuration-manager
ms.technology: configmgr-core
ms.topic: conceptual
ms.assetid: 68430abb-d18e-4266-aa5a-3ad3ab753f4c
author: aczechowski
ms.author: aaroncz
manager: dougeby
ms.openlocfilehash: 150da86c7a78484d13bbc81c533db292bc2b4756
ms.sourcegitcommit: bbf820c35414bf2cba356f30fe047c1a34c5384d
ms.translationtype: HT
ms.contentlocale: ja-JP
ms.lasthandoff: 04/21/2020
ms.locfileid: "81706690"
---
# <a name="what-happened-to-system-center-configuration-manager"></a>What happened to System Center Configuration Manager?
Starting in version 1910, the Configuration Manager current branch is now part of Microsoft Endpoint Manager. Versions 1906 and earlier are still branded as System Center Configuration Manager. The Microsoft Endpoint Manager brand will appear in the product and documentation over the coming months.
There is no change to the other components of the [System Center suite](https://docs.microsoft.com/system-center).
Earlier product versions, such as System Center 2012 Configuration Manager, aren't being rebranded.
For more information, see the following articles:
- [What is Configuration Manager](introduction.md)
- [Microsoft Endpoint Configuration Manager FAQ](microsoft-endpoint-manager-faq.md)
| 39.290323 | 219 | 0.834975 | yue_Hant | 0.284585 |
0527a3d352b8674d1f77567ee44dc1e80621a12c | 1,932 | md | Markdown | README.md | scbd/drupal-code-base | 9a1d1a306916b813a98d5922a587d6b3c2cc599f | [
"MIT"
] | null | null | null | README.md | scbd/drupal-code-base | 9a1d1a306916b813a98d5922a587d6b3c2cc599f | [
"MIT"
] | null | null | null | README.md | scbd/drupal-code-base | 9a1d1a306916b813a98d5922a587d6b3c2cc599f | [
"MIT"
] | null | null | null |
| [](https://microbadger.com/images/scbd/drupal-code-base:dev "Get your own version badge on microbadger.com") | [](https://microbadger.com/images/scbd/drupal-code-base:stg "Get your own version badge on microbadger.com") |
| --------------- | ------------------------ |
| [](https://circleci.com/gh/scbd/drupal-code-base/tree/dev) | [](https://circleci.com/gh/scbd/drupal-code-base/tree/stg) |
|[](https://microbadger.com/images/scbd/drupal-code-base:dev "Get your own image badge on microbadger.com")| [](https://microbadger.com/images/scbd/drupal-code-base:stg "Get your own image badge on microbadger.com")|
| [](https://codeclimate.com/github/scbd/drupal-code-base/maintainability)| [](https://codeclimate.com/github/scbd/drupal-code-base/maintainability) |
TODO:
- auto import db local and server env
- bug in encrypted config - Could not open input file: security-checker.phar
- auto sync 'files' from canonical site (one way) possible s3 module
- cli export
- test theme
- cron container
- test module
- install search/solr
- install seo module
- circle ci - simple tests
| 62.322581 | 404 | 0.733954 | yue_Hant | 0.448083 |
052917bff4fb2063476410ae6403235d356f49a7 | 298 | md | Markdown | example/readme.md | Agraphie/pubspec-version | 0b36e9d1d18bf66915b0380d11fefe49f39d083f | [
"MIT"
] | 29 | 2018-10-04T17:01:00.000Z | 2021-10-17T15:38:26.000Z | example/readme.md | Agraphie/pubspec-version | 0b36e9d1d18bf66915b0380d11fefe49f39d083f | [
"MIT"
] | 12 | 2018-11-16T00:58:38.000Z | 2019-12-29T10:41:26.000Z | example/readme.md | Agraphie/pubspec-version | 0b36e9d1d18bf66915b0380d11fefe49f39d083f | [
"MIT"
] | 4 | 2019-03-15T18:23:38.000Z | 2021-09-27T10:51:16.000Z | #### Examples
Before | Command | After
--- | --- | ---
1.2.3 | `pubver bump breaking` | 2.0.0
0.2.1 | `pubver bump breaking` | 0.3.0
0.2.1 | `pubver bump major` | 1.0.0
0.2.1 | `pubver bump minor` | 0.3.0
0.2.1 | `pubver bump patch` | 0.2.1
0.2.1 | `pubver set 5.4.3-dev` | 5.4.3-dev | 33.111111 | 43 | 0.540268 | eng_Latn | 0.131927 |
0529422eb9f2956303a7edc899081313a6ca3ac8 | 1,259 | md | Markdown | content/wiki/Robocon/intro.md | MasayaMorinaga/Molinablog | 1405a4717ab95021ed40aee6d548f63a6b52c045 | [
"MIT"
] | null | null | null | content/wiki/Robocon/intro.md | MasayaMorinaga/Molinablog | 1405a4717ab95021ed40aee6d548f63a6b52c045 | [
"MIT"
] | null | null | null | content/wiki/Robocon/intro.md | MasayaMorinaga/Molinablog | 1405a4717ab95021ed40aee6d548f63a6b52c045 | [
"MIT"
] | null | null | null | ---
title: Characteristics of the NHK Robocon competition
linktitle: Competition characteristics
toc: true
type: docs
date: "2019-05-05T00:00:00+01:00"
draft: false
menu:
Robocon:
parent: NHKロボコン
weight: 1
# Prev/next pager order (if `docs_section_pager` enabled in `params.toml`)
weight: 1
---
Odds and ends about the NHK Robocon.
## What is the NHK Robocon in the first place?
The NHK Robocon is a robot contest organized by NHK Enterprises.
There are two flavors: the Kosen Robocon, mainly for technical college (kosen) students, and the Student Robocon, for university and vocational school students. The Kosen Robocon seems to be the better known of the two. When I say I did the Student Robocon people look puzzled, but when I say "it's the university version of the Kosen Robocon" everyone understands. I wonder why.
Above the NHK Robocon sits the ABU Robocon, where representative teams from each country compete against each other, and the NHK Robocon also serves to pick that representative team. Basically, the winning team goes on to the ABU Robocon as Japan's representative.
## Characteristics
The biggest characteristic is that the rules change every year. The next set of rules is announced at the ABU Robocon held at the end of August, and the next NHK Robocon takes place around the beginning of June, so there are roughly nine months to go from deciding on a concept to finishing the robot's mechanical design, machining, and control. It is quite a tough schedule.
Another characteristic may be that there are very many beginners and almost nobody with long Robocon experience. Concretely, only university students can take part, so most people retire after two or three years, and most start Robocon only after entering university. I doubt even half of the participants have done robot contests since middle or high school; my feeling is it is around 10-20%, though that probably varies by university and by year.
## How each university organizes its team
At many universities the work is usually split by area of responsibility. The split varies, but it often comes down to three parts: mechanics, circuits, and control. Some universities combine circuits and control, but from what I hear there do not seem to be many people who actually do both.
The main reason, I think, is that the time to learn each area is very short (in most cases only about a year and a half passes between starting Robocon and becoming a core member), so each person narrows down what they study to shorten the learning time. Besides, since the robot has to be finished in a short period, division of labor is indispensable, and either way the skills each person needs are limited. With some exceptions, many of the universities that get strong results can be seen streamlining their development through this kind of division of labor.
| 35.971429 | 211 | 0.846704 | jpn_Jpan | 0.974661 |
05299f18503f9e39d26033dad6b161f72deb9320 | 1,175 | md | Markdown | content/ja/problems/IOL/2019/3.md | fulfom/kotohazi | bf192b52f9bd82c1da07934869fe088543e6275a | [
"Apache-2.0"
] | 5 | 2020-08-07T09:52:21.000Z | 2022-03-16T04:12:52.000Z | content/ja/problems/IOL/2019/3.md | fulfom/kotohazi | bf192b52f9bd82c1da07934869fe088543e6275a | [
"Apache-2.0"
] | 18 | 2020-07-22T21:00:37.000Z | 2021-07-30T19:56:47.000Z | content/ja/problems/IOL/2019/3.md | fulfom/kotohazi | bf192b52f9bd82c1da07934869fe088543e6275a | [
"Apache-2.0"
] | null | null | null | ---
title: "IOL2019-3 書物のパフラヴィー文字"
weight: 101
type: docs
pagetype: prob
description: "翻訳・ヒント・解答・解説のまとめ"
problems:
- 101
---
{{% hasProbdata id=101 key="introduction" %}}
## Introduction
{{% problems/introduction id=101 %}}{{% /hasProbdata %}}
## Information
{{% problems/probcard id=101 detailed=true %}}{{% hasProbdata id=101 key="probcorrection" %}}
### Problem corrections
{{% probdata id=101 key="probcorrection" %}}{{% /hasProbdata %}}{{% hasProbdata id=101 key="translation" %}}
### Translation and notes
{{% problems/translation id=101 %}}{{% /hasProbdata %}}{{% hasProbdata id=101 key="hint" %}}
## Hints
{{% problems/hint id=101 %}}{{% /hasProbdata %}}
## Solution
[Solution]({{% probdata id=101 key="solution" default="link" %}}){{% hasProbdata id=101 key="solcorrection" %}}
### Solution corrections
{{% collapse title="Solution corrections" %}}
{{% probdata id=101 key="solcorrection" %}}
{{% /collapse %}}{{% /hasProbdata %}}
## Explanation
{{% hasProbdata id=101 key="sketch" %}}
{{% problems/sketch id=101 %}}
{{% /hasProbdata %}}{{% hasProbdata id=101 key="explanationlinks" %}}
### Detailed explanation
{{% problems/explanationLinks id=101 %}}{{% /hasProbdata %}}
## Related articles
{{< problems/relatedarticle id=101 >}}
## Similar problems
{{< problems/relatedprob id=101 >}}
| 19.583333 | 108 | 0.618723 | eng_Latn | 0.08943 |
052ad97beec3570bc7255955fb7087bb9eea804f | 54 | md | Markdown | README.md | jordicasesnoves/yeelightcontroller-backend | aabbb1e9ac4a2fed872305e874e34e9797551039 | [
"MIT"
] | null | null | null | README.md | jordicasesnoves/yeelightcontroller-backend | aabbb1e9ac4a2fed872305e874e34e9797551039 | [
"MIT"
] | null | null | null | README.md | jordicasesnoves/yeelightcontroller-backend | aabbb1e9ac4a2fed872305e874e34e9797551039 | [
"MIT"
] | null | null | null | # Yeelightcontroller-backend
Control your smart bulb
| 13.5 | 28 | 0.833333 | eng_Latn | 0.608449 |
052b6f0defcafd0dafbfbcd8a49f505018a6c016 | 1,112 | md | Markdown | README.md | Jamesits/myip | 13e0a51bb3dc606d9fe945824a9b658c9f1e213e | [
"MIT"
] | 9 | 2018-12-07T13:32:50.000Z | 2021-01-04T23:29:21.000Z | README.md | Jamesits/myip | 13e0a51bb3dc606d9fe945824a9b658c9f1e213e | [
"MIT"
] | 1 | 2019-12-13T03:37:21.000Z | 2019-12-15T06:32:09.000Z | README.md | Jamesits/myip | 13e0a51bb3dc606d9fe945824a9b658c9f1e213e | [
"MIT"
] | 4 | 2018-12-07T13:54:42.000Z | 2021-12-29T06:10:05.000Z | # myip
Get your external IP address from the command line.
[](https://dev.azure.com/nekomimiswitch/General/_build/latest?definitionId=72&branchName=master)
## Usage
### Basic Usage
```shell
$ myip
2001:db8::2
$ myip -4
192.0.2.2
$ myip -6
2001:db8::2
```
### STUN
* This is the default method
* `stun.l.google.com:19302` is the default server
* Connection over UDP only
```shell
myip --method STUN --server stun.l.google.com:19302
```
### ip.sb HTTPS API
```shell
myip --method ip.sb
```
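Under the hood this kind of lookup is just an HTTPS GET that returns the caller's address as plain text. A rough standalone sketch in Go (not the code used by this tool; the `api.ip.sb/ip` endpoint is assumed here) could look like:

```go
package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"strings"
)

func main() {
	// Ask the service which address this request appeared to come from.
	resp, err := http.Get("https://api.ip.sb/ip")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	// The response body is the IP address as plain text.
	fmt.Println(strings.TrimSpace(string(body)))
}
```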
### OpenDNS DNS Query
```shell
myip --method OpenDNS
```
### OpenDNS HTTPS API
* `-4`/`-6` don't work with this method
```shell
myip --method OpenDNS-API
```
## Building
Use Go 1.11 or higher. Run `build.sh` to build and collect your artifacts in the `build` directory.
## Donation
If this project is helpful to you, please consider buying me a coffee.
[](https://www.buymeacoffee.com/Jamesits) or [PayPal](https://paypal.me/Jamesits)
| 18.229508 | 199 | 0.704137 | eng_Latn | 0.405242 |
052c092b014a55bac522b40af7b890991e38ecb8 | 66 | md | Markdown | README.md | covix/shiny-pancake | 0e6e133be55d6d8e2ee25ab94ebb5633424794e2 | [
"MIT"
] | 1 | 2019-07-01T11:06:37.000Z | 2019-07-01T11:06:37.000Z | README.md | covix/shiny-pancake | 0e6e133be55d6d8e2ee25ab94ebb5633424794e2 | [
"MIT"
] | null | null | null | README.md | covix/shiny-pancake | 0e6e133be55d6d8e2ee25ab94ebb5633424794e2 | [
"MIT"
] | null | null | null | # shiny-pancake
Shiny app for twitter visualization and analysis
| 16.5 | 48 | 0.818182 | eng_Latn | 0.947361 |
052c4c21b16e910b207f391cae991d5b9a9967ef | 3,847 | md | Markdown | add/metadata/System.Web.UI.MobileControls/MobileListItem.meta.md | v-maudel/docs-1 | f849afb0bd9a505311e7aec32c544c3169edf1c5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | add/metadata/System.Web.UI.MobileControls/MobileListItem.meta.md | v-maudel/docs-1 | f849afb0bd9a505311e7aec32c544c3169edf1c5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | add/metadata/System.Web.UI.MobileControls/MobileListItem.meta.md | v-maudel/docs-1 | f849afb0bd9a505311e7aec32c544c3169edf1c5 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-11-16T19:24:50.000Z | 2020-11-16T19:24:50.000Z | ---
uid: System.Web.UI.MobileControls.MobileListItem
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.OnBubbleEvent(System.Object,System.EventArgs)
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.GetHashCode
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.#ctor
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.SaveViewState
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.System#Web#UI#IStateManager#TrackViewState
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.Value
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.TrackViewState
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.Index
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.System#Web#UI#IStateManager#IsTrackingViewState
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.System#Web#UI#IStateManager#LoadViewState(System.Object)
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.DataItem
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.#ctor(System.Object,System.String,System.String)
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.ToString
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.System#Web#UI#IStateManager#SaveViewState
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.Equals(System.Object)
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.Text
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.#ctor(System.String)
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.Selected
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.IsTrackingViewState
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.#ctor(System.Web.UI.MobileControls.MobileListItemType)
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.FromString(System.String)
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.LoadViewState(System.Object)
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
---
uid: System.Web.UI.MobileControls.MobileListItem.#ctor(System.String,System.String)
ms.technology:
- "dotnet-webforms"
ms.author: "riande"
manager: "wpickett"
---
| 20.036458 | 105 | 0.735638 | yue_Hant | 0.207971 |
052cff858e840d61c7cc1568b562e1497f9b636e | 1,772 | md | Markdown | docs/visual-basic/language-reference/modifiers/module-keyword.md | Youssef1313/docs.it-it | 15072ece39fae71ee94a8b9365b02b550e68e407 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/language-reference/modifiers/module-keyword.md | Youssef1313/docs.it-it | 15072ece39fae71ee94a8b9365b02b550e68e407 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/language-reference/modifiers/module-keyword.md | Youssef1313/docs.it-it | 15072ece39fae71ee94a8b9365b02b550e68e407 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Module <keyword> (Visual Basic)
ms.date: 07/20/2015
f1_keywords:
- vb.ModuleAttribute
helpviewer_keywords:
- Module keyword [Visual Basic]
- Module modifier
- attribute blocks, Module keyword
ms.assetid: d971b940-05ab-4d56-8485-e3b8a661906b
ms.openlocfilehash: cd2f762181b5a702f0b0defd5b71bb7bdf129c7b
ms.sourcegitcommit: 17ee6605e01ef32506f8fdc686954244ba6911de
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 11/22/2019
ms.locfileid: "74351558"
---
# <a name="module-keyword-visual-basic"></a>> \<parola chiave Module (Visual Basic)
Specifica che un attributo all'inizio di un file di origine viene applicato al modulo di assembly corrente.
## <a name="remarks"></a>Note
Molti attributi riguardano un singolo elemento di programmazione, ad esempio una classe o una proprietà. Applicare tale attributo alleghindo il blocco di attributi, racchiuso tra parentesi angolari (`< >`), direttamente all'istruzione di dichiarazione.
Se un attributo riguarda non solo l'elemento seguente ma al modulo di assembly corrente, il blocco di attributi viene inserito all'inizio del file di origine e l'attributo viene identificato con la parola chiave `Module`. Se si applica all'intero assembly, usare la parola chiave [assembly](../../../visual-basic/language-reference/modifiers/assembly.md) .
Il modificatore `Module` non corrisponde all' [istruzione Module](../../../visual-basic/language-reference/statements/module-statement.md).
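As a brief illustration (this example is an addition, not part of the original article), a module-level attribute block sits at the top of the source file, after any `Imports` statements:

```vb
' Applies the CLSCompliant attribute to the current assembly module.
<Module: CLSCompliant(True)>
```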
## <a name="see-also"></a>Vedere anche
- [Assembly](../../../visual-basic/language-reference/modifiers/assembly.md)
- [Istruzione Module](../../../visual-basic/language-reference/statements/module-statement.md)
- [Panoramica degli attributi](../../../visual-basic/programming-guide/concepts/attributes/index.md)
| 53.69697 | 359 | 0.772009 | ita_Latn | 0.892961 |
052d2058088f069465b324547c0029bd6a8ea441 | 582 | md | Markdown | mission.md | RockefellerArchiveCenter/DTeamDocs | fa4d9de8888ad759dbd2f6d2dd60c812075b5c4f | [
"CC-BY-4.0"
] | 1 | 2020-10-08T20:08:14.000Z | 2020-10-08T20:08:14.000Z | mission.md | RockefellerArchiveCenter/DTeamDocs | fa4d9de8888ad759dbd2f6d2dd60c812075b5c4f | [
"CC-BY-4.0"
] | 10 | 2018-02-09T18:02:04.000Z | 2020-06-26T21:19:07.000Z | mission.md | RockefellerArchiveCenter/DTeamDocs | fa4d9de8888ad759dbd2f6d2dd60c812075b5c4f | [
"CC-BY-4.0"
] | null | null | null | # Rockefeller Archive Center Digital Strategies Mission
The Digital Strategies team leads the ethical application of technology in all aspects of the RAC's work by connecting people to systems and expertise. We develop, implement, and maintain the organization’s archival systems and digital infrastructure; build capacity through training designed to empower our colleagues; center user perspectives to continually improve the usability and accessibility of our systems and collections; and participate in professional communities to further our work and the archival profession.
| 145.5 | 524 | 0.843643 | eng_Latn | 0.999018 |
052d57fad0a4b3d3ee496c6ad2670a32c87cbfe7 | 1,386 | md | Markdown | _posts/2019-03-14-Neural-Style-Transfer-for-Point-Clouds.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | 7 | 2018-02-11T01:50:19.000Z | 2020-01-14T02:07:17.000Z | _posts/2019-03-14-Neural-Style-Transfer-for-Point-Clouds.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | null | null | null | _posts/2019-03-14-Neural-Style-Transfer-for-Point-Clouds.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | 4 | 2018-02-04T15:58:04.000Z | 2019-08-29T14:54:14.000Z | ---
layout: post
title: "Neural Style Transfer for Point Clouds"
date: 2019-03-14 03:56:06
categories: arXiv_CV
tags: arXiv_CV Style_Transfer Classification
author: Xu Cao, Weimin Wang, Katashi Nagao
mathjax: true
---
* content
{:toc}
##### Abstract
How can we edit or transform the geometric or color property of a point cloud? In this study, we propose a neural style transfer method for point clouds which allows us to transfer the style of geometry or color from one point cloud either independently or simultaneously to another. This transfer is achieved by manipulating the content representations and Gram-based style representations extracted from a pre-trained PointNet-based classification network for colored point clouds. As Gram-based style representation is invariant to the number or the order of points, the same method can be extended to transfer the style extracted from an image to the color expression of a point cloud by merely treating the image as a set of pixels. Experimental results demonstrate the capability of the proposed method for transferring style from either an image or a point cloud to another point cloud of a single object or even an indoor scene.
##### Abstract (translated by Google)
##### URL
[http://arxiv.org/abs/1903.05807](http://arxiv.org/abs/1903.05807)
##### PDF
[http://arxiv.org/pdf/1903.05807](http://arxiv.org/pdf/1903.05807)
| 53.307692 | 936 | 0.784993 | eng_Latn | 0.994752 |
052d859f0873e7649a81045b47b2c540c229a410 | 16,708 | md | Markdown | site/content/vscode__devcontainers__cpp.md | sajayantony/mcr-images | 9ecf2b4f6873cc33da68f1c608c7bb3a175ca5d3 | [
"MIT"
] | null | null | null | site/content/vscode__devcontainers__cpp.md | sajayantony/mcr-images | 9ecf2b4f6873cc33da68f1c608c7bb3a175ca5d3 | [
"MIT"
] | null | null | null | site/content/vscode__devcontainers__cpp.md | sajayantony/mcr-images | 9ecf2b4f6873cc33da68f1c608c7bb3a175ca5d3 | [
"MIT"
] | null | null | null | ---
title: vscode/devcontainers/cpp
---
- 0
- 0-bionic
- 0-bullseye
- 0-buster
- 0-debian
- 0-debian-10
- 0-debian-11
- 0-debian-9
- 0-debian10
- 0-debian11
- 0-debian9
- 0-focal
- 0-hirsute
- 0-stretch
- 0-ubuntu
- 0-ubuntu-18.04
- 0-ubuntu-20.04
- 0-ubuntu-21.04
- 0-ubuntu18.04
- 0-ubuntu20.04
- 0-ubuntu21.04
- 0.132-buster
- 0.132-debian
- 0.132-debian-10
- 0.132-debian-9
- 0.132-debian10
- 0.132-debian9
- 0.132-stretch
- 0.132.0-buster
- 0.132.0-debian
- 0.132.0-debian-10
- 0.132.0-debian-9
- 0.132.0-debian10
- 0.132.0-debian9
- 0.132.0-stretch
- 0.133-buster
- 0.133-debian
- 0.133-debian-10
- 0.133-debian-9
- 0.133-debian10
- 0.133-debian9
- 0.133-stretch
- 0.133.0-buster
- 0.133.0-debian
- 0.133.0-debian-10
- 0.133.0-debian-9
- 0.133.0-debian10
- 0.133.0-debian9
- 0.133.0-stretch
- 0.134-buster
- 0.134-debian
- 0.134-debian-10
- 0.134-debian-9
- 0.134-debian10
- 0.134-debian9
- 0.134-stretch
- 0.134.0-buster
- 0.134.0-debian
- 0.134.0-debian-10
- 0.134.0-debian-9
- 0.134.0-debian10
- 0.134.0-debian9
- 0.134.0-stretch
- 0.134.1-buster
- 0.134.1-debian
- 0.134.1-debian-10
- 0.134.1-debian-9
- 0.134.1-debian10
- 0.134.1-debian9
- 0.134.1-stretch
- 0.136-buster
- 0.136-debian
- 0.136-debian-10
- 0.136-debian-9
- 0.136-debian10
- 0.136-debian9
- 0.136-stretch
- 0.136.0-buster
- 0.136.0-debian
- 0.136.0-debian-10
- 0.136.0-debian-9
- 0.136.0-debian10
- 0.136.0-debian9
- 0.136.0-stretch
- 0.137-buster
- 0.137-debian
- 0.137-debian-10
- 0.137-debian-9
- 0.137-debian10
- 0.137-debian9
- 0.137-stretch
- 0.137.0-buster
- 0.137.0-debian
- 0.137.0-debian-10
- 0.137.0-debian-9
- 0.137.0-debian10
- 0.137.0-debian9
- 0.137.0-stretch
- 0.138-buster
- 0.138-debian
- 0.138-debian-10
- 0.138-debian-9
- 0.138-debian10
- 0.138-debian9
- 0.138-stretch
- 0.138.0-buster
- 0.138.0-debian
- 0.138.0-debian-10
- 0.138.0-debian-9
- 0.138.0-debian10
- 0.138.0-debian9
- 0.138.0-stretch
- 0.139-buster
- 0.139-debian
- 0.139-debian-10
- 0.139-debian-9
- 0.139-debian10
- 0.139-debian9
- 0.139-stretch
- 0.139.1-buster
- 0.139.1-debian
- 0.139.1-debian-10
- 0.139.1-debian-9
- 0.139.1-debian10
- 0.139.1-debian9
- 0.139.1-stretch
- 0.140-buster
- 0.140-debian
- 0.140-debian-10
- 0.140-debian-9
- 0.140-debian10
- 0.140-debian9
- 0.140-stretch
- 0.140.0-buster
- 0.140.0-debian
- 0.140.0-debian-10
- 0.140.0-debian-9
- 0.140.0-debian10
- 0.140.0-debian9
- 0.140.0-stretch
- 0.140.1-buster
- 0.140.1-debian
- 0.140.1-debian-10
- 0.140.1-debian-9
- 0.140.1-debian10
- 0.140.1-debian9
- 0.140.1-stretch
- 0.141-buster
- 0.141-debian
- 0.141-debian-10
- 0.141-debian-9
- 0.141-debian10
- 0.141-debian9
- 0.141-stretch
- 0.141.0-buster
- 0.141.0-debian
- 0.141.0-debian-10
- 0.141.0-debian-9
- 0.141.0-debian10
- 0.141.0-debian9
- 0.141.0-stretch
- 0.142-buster
- 0.142-debian
- 0.142-debian-10
- 0.142-debian-9
- 0.142-debian10
- 0.142-debian9
- 0.142-stretch
- 0.142.0-buster
- 0.142.0-debian
- 0.142.0-debian-10
- 0.142.0-debian-9
- 0.142.0-debian10
- 0.142.0-debian9
- 0.142.0-stretch
- 0.143-buster
- 0.143-debian
- 0.143-debian-10
- 0.143-debian-9
- 0.143-debian10
- 0.143-debian9
- 0.143-stretch
- 0.143.0-buster
- 0.143.0-debian
- 0.143.0-debian-10
- 0.143.0-debian-9
- 0.143.0-debian10
- 0.143.0-debian9
- 0.143.0-stretch
- 0.144-buster
- 0.144-debian
- 0.144-debian-10
- 0.144-debian-9
- 0.144-debian10
- 0.144-debian9
- 0.144-stretch
- 0.144.0-buster
- 0.144.0-debian
- 0.144.0-debian-10
- 0.144.0-debian-9
- 0.144.0-debian10
- 0.144.0-debian9
- 0.144.0-stretch
- 0.145-buster
- 0.145-debian
- 0.145-debian-10
- 0.145-debian-9
- 0.145-debian10
- 0.145-debian9
- 0.145-stretch
- 0.145.0-buster
- 0.145.0-debian
- 0.145.0-debian-10
- 0.145.0-debian-9
- 0.145.0-debian10
- 0.145.0-debian9
- 0.145.0-stretch
- 0.145.1-buster
- 0.145.1-debian
- 0.145.1-debian-10
- 0.145.1-debian-9
- 0.145.1-debian10
- 0.145.1-debian9
- 0.145.1-stretch
- 0.146-buster
- 0.146-debian
- 0.146-debian-10
- 0.146-debian-9
- 0.146-debian10
- 0.146-debian9
- 0.146-stretch
- 0.146.0-buster
- 0.146.0-debian
- 0.146.0-debian-10
- 0.146.0-debian-9
- 0.146.0-debian10
- 0.146.0-debian9
- 0.146.0-stretch
- 0.147-bionic
- 0.147-buster
- 0.147-debian
- 0.147-debian-10
- 0.147-debian-9
- 0.147-debian10
- 0.147-debian9
- 0.147-focal
- 0.147-stretch
- 0.147-ubuntu
- 0.147-ubuntu-18.04
- 0.147-ubuntu-20.04
- 0.147-ubuntu18.04
- 0.147-ubuntu20.04
- 0.147.0-bionic
- 0.147.0-buster
- 0.147.0-debian
- 0.147.0-debian-10
- 0.147.0-debian-9
- 0.147.0-debian10
- 0.147.0-debian9
- 0.147.0-focal
- 0.147.0-stretch
- 0.147.0-ubuntu
- 0.147.0-ubuntu-18.04
- 0.147.0-ubuntu-20.04
- 0.147.0-ubuntu18.04
- 0.147.0-ubuntu20.04
- 0.148-bionic
- 0.148-buster
- 0.148-debian
- 0.148-debian-10
- 0.148-debian-9
- 0.148-debian10
- 0.148-debian9
- 0.148-focal
- 0.148-stretch
- 0.148-ubuntu
- 0.148-ubuntu-18.04
- 0.148-ubuntu-20.04
- 0.148-ubuntu18.04
- 0.148-ubuntu20.04
- 0.148.0-bionic
- 0.148.0-buster
- 0.148.0-debian
- 0.148.0-debian-10
- 0.148.0-debian-9
- 0.148.0-debian10
- 0.148.0-debian9
- 0.148.0-focal
- 0.148.0-stretch
- 0.148.0-ubuntu
- 0.148.0-ubuntu-18.04
- 0.148.0-ubuntu-20.04
- 0.148.0-ubuntu18.04
- 0.148.0-ubuntu20.04
- 0.148.1-bionic
- 0.148.1-buster
- 0.148.1-debian
- 0.148.1-debian-10
- 0.148.1-debian-9
- 0.148.1-debian10
- 0.148.1-debian9
- 0.148.1-focal
- 0.148.1-stretch
- 0.148.1-ubuntu
- 0.148.1-ubuntu-18.04
- 0.148.1-ubuntu-20.04
- 0.148.1-ubuntu18.04
- 0.148.1-ubuntu20.04
- 0.149-bionic
- 0.149-buster
- 0.149-debian
- 0.149-debian-10
- 0.149-debian-9
- 0.149-debian10
- 0.149-debian9
- 0.149-focal
- 0.149-stretch
- 0.149-ubuntu
- 0.149-ubuntu-18.04
- 0.149-ubuntu-20.04
- 0.149-ubuntu18.04
- 0.149-ubuntu20.04
- 0.149.0-bionic
- 0.149.0-buster
- 0.149.0-debian
- 0.149.0-debian-10
- 0.149.0-debian-9
- 0.149.0-debian10
- 0.149.0-debian9
- 0.149.0-focal
- 0.149.0-stretch
- 0.149.0-ubuntu
- 0.149.0-ubuntu-18.04
- 0.149.0-ubuntu-20.04
- 0.149.0-ubuntu18.04
- 0.149.0-ubuntu20.04
- 0.150-bionic
- 0.150-buster
- 0.150-debian
- 0.150-debian-10
- 0.150-debian-9
- 0.150-debian10
- 0.150-debian9
- 0.150-focal
- 0.150-stretch
- 0.150-ubuntu
- 0.150-ubuntu-18.04
- 0.150-ubuntu-20.04
- 0.150-ubuntu18.04
- 0.150-ubuntu20.04
- 0.150.0-bionic
- 0.150.0-buster
- 0.150.0-debian
- 0.150.0-debian-10
- 0.150.0-debian-9
- 0.150.0-debian10
- 0.150.0-debian9
- 0.150.0-focal
- 0.150.0-stretch
- 0.150.0-ubuntu
- 0.150.0-ubuntu-18.04
- 0.150.0-ubuntu-20.04
- 0.150.0-ubuntu18.04
- 0.150.0-ubuntu20.04
- 0.151-bionic
- 0.151-buster
- 0.151-debian
- 0.151-debian-10
- 0.151-debian-9
- 0.151-debian10
- 0.151-debian9
- 0.151-focal
- 0.151-stretch
- 0.151-ubuntu
- 0.151-ubuntu-18.04
- 0.151-ubuntu-20.04
- 0.151-ubuntu18.04
- 0.151-ubuntu20.04
- 0.151.0-bionic
- 0.151.0-buster
- 0.151.0-debian
- 0.151.0-debian-10
- 0.151.0-debian-9
- 0.151.0-debian10
- 0.151.0-debian9
- 0.151.0-focal
- 0.151.0-stretch
- 0.151.0-ubuntu
- 0.151.0-ubuntu-18.04
- 0.151.0-ubuntu-20.04
- 0.151.0-ubuntu18.04
- 0.151.0-ubuntu20.04
- 0.152
- 0.152-bionic
- 0.152-buster
- 0.152-debian
- 0.152-debian-10
- 0.152-debian-9
- 0.152-debian10
- 0.152-debian9
- 0.152-focal
- 0.152-stretch
- 0.152-ubuntu
- 0.152-ubuntu-18.04
- 0.152-ubuntu-20.04
- 0.152-ubuntu18.04
- 0.152-ubuntu20.04
- 0.152.0-bionic
- 0.152.0-buster
- 0.152.0-debian
- 0.152.0-debian-10
- 0.152.0-debian-9
- 0.152.0-debian10
- 0.152.0-debian9
- 0.152.0-focal
- 0.152.0-stretch
- 0.152.0-ubuntu
- 0.152.0-ubuntu-18.04
- 0.152.0-ubuntu-20.04
- 0.152.0-ubuntu18.04
- 0.152.0-ubuntu20.04
- 0.152.1
- 0.152.1-bionic
- 0.152.1-buster
- 0.152.1-debian
- 0.152.1-debian-10
- 0.152.1-debian-9
- 0.152.1-debian10
- 0.152.1-debian9
- 0.152.1-focal
- 0.152.1-stretch
- 0.152.1-ubuntu
- 0.152.1-ubuntu-18.04
- 0.152.1-ubuntu-20.04
- 0.152.1-ubuntu18.04
- 0.152.1-ubuntu20.04
- 0.153
- 0.153-bionic
- 0.153-buster
- 0.153-debian
- 0.153-debian-10
- 0.153-debian-9
- 0.153-debian10
- 0.153-debian9
- 0.153-focal
- 0.153-stretch
- 0.153-ubuntu
- 0.153-ubuntu-18.04
- 0.153-ubuntu-20.04
- 0.153-ubuntu18.04
- 0.153-ubuntu20.04
- 0.153.0
- 0.153.0-bionic
- 0.153.0-buster
- 0.153.0-debian
- 0.153.0-debian-10
- 0.153.0-debian-9
- 0.153.0-debian10
- 0.153.0-debian9
- 0.153.0-focal
- 0.153.0-stretch
- 0.153.0-ubuntu
- 0.153.0-ubuntu-18.04
- 0.153.0-ubuntu-20.04
- 0.153.0-ubuntu18.04
- 0.153.0-ubuntu20.04
- 0.154
- 0.154-bionic
- 0.154-buster
- 0.154-debian
- 0.154-debian-10
- 0.154-debian-9
- 0.154-debian10
- 0.154-debian9
- 0.154-focal
- 0.154-stretch
- 0.154-ubuntu
- 0.154-ubuntu-18.04
- 0.154-ubuntu-20.04
- 0.154-ubuntu18.04
- 0.154-ubuntu20.04
- 0.154.0
- 0.154.0-bionic
- 0.154.0-buster
- 0.154.0-debian
- 0.154.0-debian-10
- 0.154.0-debian-9
- 0.154.0-debian10
- 0.154.0-debian9
- 0.154.0-focal
- 0.154.0-stretch
- 0.154.0-ubuntu
- 0.154.0-ubuntu-18.04
- 0.154.0-ubuntu-20.04
- 0.154.0-ubuntu18.04
- 0.154.0-ubuntu20.04
- 0.154.1
- 0.154.1-bionic
- 0.154.1-buster
- 0.154.1-debian
- 0.154.1-debian-10
- 0.154.1-debian-9
- 0.154.1-debian10
- 0.154.1-debian9
- 0.154.1-focal
- 0.154.1-stretch
- 0.154.1-ubuntu
- 0.154.1-ubuntu-18.04
- 0.154.1-ubuntu-20.04
- 0.154.1-ubuntu18.04
- 0.154.1-ubuntu20.04
- 0.154.2
- 0.154.2-bionic
- 0.154.2-buster
- 0.154.2-debian
- 0.154.2-debian-10
- 0.154.2-debian-9
- 0.154.2-debian10
- 0.154.2-debian9
- 0.154.2-focal
- 0.154.2-stretch
- 0.154.2-ubuntu
- 0.154.2-ubuntu-18.04
- 0.154.2-ubuntu-20.04
- 0.154.2-ubuntu18.04
- 0.154.2-ubuntu20.04
- 0.155
- 0.155-bionic
- 0.155-buster
- 0.155-debian
- 0.155-debian-10
- 0.155-debian-9
- 0.155-debian10
- 0.155-debian9
- 0.155-focal
- 0.155-stretch
- 0.155-ubuntu
- 0.155-ubuntu-18.04
- 0.155-ubuntu-20.04
- 0.155-ubuntu18.04
- 0.155-ubuntu20.04
- 0.155.0
- 0.155.0-bionic
- 0.155.0-buster
- 0.155.0-debian
- 0.155.0-debian-10
- 0.155.0-debian-9
- 0.155.0-debian10
- 0.155.0-debian9
- 0.155.0-focal
- 0.155.0-stretch
- 0.155.0-ubuntu
- 0.155.0-ubuntu-18.04
- 0.155.0-ubuntu-20.04
- 0.155.0-ubuntu18.04
- 0.155.0-ubuntu20.04
- 0.155.1
- 0.155.1-bionic
- 0.155.1-buster
- 0.155.1-debian
- 0.155.1-debian-10
- 0.155.1-debian-9
- 0.155.1-debian10
- 0.155.1-debian9
- 0.155.1-focal
- 0.155.1-stretch
- 0.155.1-ubuntu
- 0.155.1-ubuntu-18.04
- 0.155.1-ubuntu-20.04
- 0.155.1-ubuntu18.04
- 0.155.1-ubuntu20.04
- 0.156
- 0.156-bionic
- 0.156-buster
- 0.156-debian
- 0.156-debian-10
- 0.156-debian-9
- 0.156-debian10
- 0.156-debian9
- 0.156-focal
- 0.156-stretch
- 0.156-ubuntu
- 0.156-ubuntu-18.04
- 0.156-ubuntu-20.04
- 0.156-ubuntu18.04
- 0.156-ubuntu20.04
- 0.156.0
- 0.156.0-bionic
- 0.156.0-buster
- 0.156.0-debian
- 0.156.0-debian-10
- 0.156.0-debian-9
- 0.156.0-debian10
- 0.156.0-debian9
- 0.156.0-focal
- 0.156.0-stretch
- 0.156.0-ubuntu
- 0.156.0-ubuntu-18.04
- 0.156.0-ubuntu-20.04
- 0.156.0-ubuntu18.04
- 0.156.0-ubuntu20.04
- 0.157
- 0.157-bionic
- 0.157-buster
- 0.157-debian
- 0.157-debian-10
- 0.157-debian-9
- 0.157-debian10
- 0.157-debian9
- 0.157-focal
- 0.157-stretch
- 0.157-ubuntu
- 0.157-ubuntu-18.04
- 0.157-ubuntu-20.04
- 0.157-ubuntu18.04
- 0.157-ubuntu20.04
- 0.157.0
- 0.157.0-bionic
- 0.157.0-buster
- 0.157.0-debian
- 0.157.0-debian-10
- 0.157.0-debian-9
- 0.157.0-debian10
- 0.157.0-debian9
- 0.157.0-focal
- 0.157.0-stretch
- 0.157.0-ubuntu
- 0.157.0-ubuntu-18.04
- 0.157.0-ubuntu-20.04
- 0.157.0-ubuntu18.04
- 0.157.0-ubuntu20.04
- 0.200
- 0.200-bionic
- 0.200-buster
- 0.200-debian
- 0.200-debian-10
- 0.200-debian-9
- 0.200-debian10
- 0.200-debian9
- 0.200-focal
- 0.200-stretch
- 0.200-ubuntu
- 0.200-ubuntu-18.04
- 0.200-ubuntu-20.04
- 0.200-ubuntu18.04
- 0.200-ubuntu20.04
- 0.200.0
- 0.200.0-bionic
- 0.200.0-buster
- 0.200.0-debian
- 0.200.0-debian-10
- 0.200.0-debian-9
- 0.200.0-debian10
- 0.200.0-debian9
- 0.200.0-focal
- 0.200.0-stretch
- 0.200.0-ubuntu
- 0.200.0-ubuntu-18.04
- 0.200.0-ubuntu-20.04
- 0.200.0-ubuntu18.04
- 0.200.0-ubuntu20.04
- 0.201
- 0.201-bionic
- 0.201-buster
- 0.201-debian
- 0.201-debian-10
- 0.201-debian-9
- 0.201-debian10
- 0.201-debian9
- 0.201-focal
- 0.201-stretch
- 0.201-ubuntu
- 0.201-ubuntu-18.04
- 0.201-ubuntu-20.04
- 0.201-ubuntu18.04
- 0.201-ubuntu20.04
- 0.201.0
- 0.201.0-bionic
- 0.201.0-buster
- 0.201.0-debian
- 0.201.0-debian-10
- 0.201.0-debian-9
- 0.201.0-debian10
- 0.201.0-debian9
- 0.201.0-focal
- 0.201.0-stretch
- 0.201.0-ubuntu
- 0.201.0-ubuntu-18.04
- 0.201.0-ubuntu-20.04
- 0.201.0-ubuntu18.04
- 0.201.0-ubuntu20.04
- 0.201.1
- 0.201.1-bionic
- 0.201.1-buster
- 0.201.1-debian
- 0.201.1-debian-10
- 0.201.1-debian-9
- 0.201.1-debian10
- 0.201.1-debian9
- 0.201.1-focal
- 0.201.1-stretch
- 0.201.1-ubuntu
- 0.201.1-ubuntu-18.04
- 0.201.1-ubuntu-20.04
- 0.201.1-ubuntu18.04
- 0.201.1-ubuntu20.04
- 0.201.2
- 0.201.2-bionic
- 0.201.2-buster
- 0.201.2-debian
- 0.201.2-debian-10
- 0.201.2-debian-9
- 0.201.2-debian10
- 0.201.2-debian9
- 0.201.2-focal
- 0.201.2-stretch
- 0.201.2-ubuntu
- 0.201.2-ubuntu-18.04
- 0.201.2-ubuntu-20.04
- 0.201.2-ubuntu18.04
- 0.201.2-ubuntu20.04
- 0.201.3
- 0.201.3-bionic
- 0.201.3-buster
- 0.201.3-debian
- 0.201.3-debian-10
- 0.201.3-debian-9
- 0.201.3-debian10
- 0.201.3-debian9
- 0.201.3-focal
- 0.201.3-stretch
- 0.201.3-ubuntu
- 0.201.3-ubuntu-18.04
- 0.201.3-ubuntu-20.04
- 0.201.3-ubuntu18.04
- 0.201.3-ubuntu20.04
- 0.201.4
- 0.201.4-bionic
- 0.201.4-buster
- 0.201.4-debian
- 0.201.4-debian-10
- 0.201.4-debian-9
- 0.201.4-debian10
- 0.201.4-debian9
- 0.201.4-focal
- 0.201.4-stretch
- 0.201.4-ubuntu
- 0.201.4-ubuntu-18.04
- 0.201.4-ubuntu-20.04
- 0.201.4-ubuntu18.04
- 0.201.4-ubuntu20.04
- 0.201.5
- 0.201.5-bionic
- 0.201.5-buster
- 0.201.5-debian
- 0.201.5-debian-10
- 0.201.5-debian-9
- 0.201.5-debian10
- 0.201.5-debian9
- 0.201.5-focal
- 0.201.5-stretch
- 0.201.5-ubuntu
- 0.201.5-ubuntu-18.04
- 0.201.5-ubuntu-20.04
- 0.201.5-ubuntu18.04
- 0.201.5-ubuntu20.04
- 0.201.6
- 0.201.6-bionic
- 0.201.6-buster
- 0.201.6-debian
- 0.201.6-debian-10
- 0.201.6-debian-9
- 0.201.6-debian10
- 0.201.6-debian9
- 0.201.6-focal
- 0.201.6-stretch
- 0.201.6-ubuntu
- 0.201.6-ubuntu-18.04
- 0.201.6-ubuntu-20.04
- 0.201.6-ubuntu18.04
- 0.201.6-ubuntu20.04
- 0.201.7
- 0.201.7-bionic
- 0.201.7-buster
- 0.201.7-debian
- 0.201.7-debian-10
- 0.201.7-debian-9
- 0.201.7-debian10
- 0.201.7-debian9
- 0.201.7-focal
- 0.201.7-stretch
- 0.201.7-ubuntu
- 0.201.7-ubuntu-18.04
- 0.201.7-ubuntu-20.04
- 0.201.7-ubuntu18.04
- 0.201.7-ubuntu20.04
- 0.201.8
- 0.201.8-bionic
- 0.201.8-buster
- 0.201.8-debian
- 0.201.8-debian-10
- 0.201.8-debian-9
- 0.201.8-debian10
- 0.201.8-debian9
- 0.201.8-focal
- 0.201.8-stretch
- 0.201.8-ubuntu
- 0.201.8-ubuntu-18.04
- 0.201.8-ubuntu-20.04
- 0.201.8-ubuntu18.04
- 0.201.8-ubuntu20.04
- 0.202
- 0.202-bionic
- 0.202-bullseye
- 0.202-buster
- 0.202-debian
- 0.202-debian-10
- 0.202-debian-11
- 0.202-debian-9
- 0.202-debian10
- 0.202-debian11
- 0.202-debian9
- 0.202-focal
- 0.202-stretch
- 0.202-ubuntu
- 0.202-ubuntu-18.04
- 0.202-ubuntu-20.04
- 0.202-ubuntu18.04
- 0.202-ubuntu20.04
- 0.202.0
- 0.202.0-bionic
- 0.202.0-bullseye
- 0.202.0-buster
- 0.202.0-debian
- 0.202.0-debian-10
- 0.202.0-debian-11
- 0.202.0-debian-9
- 0.202.0-debian10
- 0.202.0-debian11
- 0.202.0-debian9
- 0.202.0-focal
- 0.202.0-stretch
- 0.202.0-ubuntu
- 0.202.0-ubuntu-18.04
- 0.202.0-ubuntu-20.04
- 0.202.0-ubuntu18.04
- 0.202.0-ubuntu20.04
- 0.202.1
- 0.202.1-bionic
- 0.202.1-bullseye
- 0.202.1-buster
- 0.202.1-debian
- 0.202.1-debian-10
- 0.202.1-debian-11
- 0.202.1-debian-9
- 0.202.1-debian10
- 0.202.1-debian11
- 0.202.1-debian9
- 0.202.1-focal
- 0.202.1-stretch
- 0.202.1-ubuntu
- 0.202.1-ubuntu-18.04
- 0.202.1-ubuntu-20.04
- 0.202.1-ubuntu18.04
- 0.202.1-ubuntu20.04
- 0.203
- 0.203-bionic
- 0.203-bullseye
- 0.203-buster
- 0.203-debian
- 0.203-debian-10
- 0.203-debian-11
- 0.203-debian-9
- 0.203-debian10
- 0.203-debian11
- 0.203-debian9
- 0.203-focal
- 0.203-hirsute
- 0.203-stretch
- 0.203-ubuntu
- 0.203-ubuntu-18.04
- 0.203-ubuntu-20.04
- 0.203-ubuntu-21.04
- 0.203-ubuntu18.04
- 0.203-ubuntu20.04
- 0.203-ubuntu21.04
- 0.203.0
- 0.203.0-bionic
- 0.203.0-bullseye
- 0.203.0-buster
- 0.203.0-debian
- 0.203.0-debian-10
- 0.203.0-debian-11
- 0.203.0-debian-9
- 0.203.0-debian10
- 0.203.0-debian11
- 0.203.0-debian9
- 0.203.0-focal
- 0.203.0-hirsute
- 0.203.0-stretch
- 0.203.0-ubuntu
- 0.203.0-ubuntu-18.04
- 0.203.0-ubuntu-20.04
- 0.203.0-ubuntu-21.04
- 0.203.0-ubuntu18.04
- 0.203.0-ubuntu20.04
- 0.203.0-ubuntu21.04
- bionic
- bullseye
- buster
- debian
- debian-10
- debian-11
- debian-9
- debian10
- debian11
- debian9
- dev
- dev-bionic
- dev-bullseye
- dev-buster
- dev-debian
- dev-debian-10
- dev-debian-11
- dev-debian-9
- dev-debian10
- dev-debian11
- dev-debian9
- dev-focal
- dev-hirsute
- dev-stretch
- dev-ubuntu
- dev-ubuntu-18.04
- dev-ubuntu-20.04
- dev-ubuntu-21.04
- dev-ubuntu18.04
- dev-ubuntu20.04
- dev-ubuntu21.04
- focal
- hirsute
- latest
- stretch
- ubuntu
- ubuntu-18.04
- ubuntu-20.04
- ubuntu-21.04
- ubuntu18.04
- ubuntu20.04
- ubuntu21.04
| 17.532004 | 31 | 0.66124 | yue_Hant | 0.103336 |
052df7cfd5a506c00f5835985b8a2099eabaf1ea | 270 | md | Markdown | README.md | unimpressedturtle/unimpressedturtle.github.io | 6280d56ab2358c94956927e0160c64af1969033a | [
"MIT"
] | null | null | null | README.md | unimpressedturtle/unimpressedturtle.github.io | 6280d56ab2358c94956927e0160c64af1969033a | [
"MIT"
] | null | null | null | README.md | unimpressedturtle/unimpressedturtle.github.io | 6280d56ab2358c94956927e0160c64af1969033a | [
"MIT"
] | null | null | null | 
### About
Website for Unimpressed Turtle - independent game developer based in San Francisco, CA.
### Live
[http://www.unimpressedturtle.com](http://www.unimpressedturtle.com)
### License
See [LICENSE](/LICENSE)
| 20.769231 | 87 | 0.744444 | kor_Hang | 0.352862 |
052dfde2d29baea719edfe8353400bb96eb058b2 | 1,953 | md | Markdown | posts/2012-04-17-two-tools-for-gateway-trial-host-nplhost.md | qmacro-org/blog | bbf9dc5908fa9a0961ff869e4784291bcb67c3dd | [
"MIT"
] | null | null | null | posts/2012-04-17-two-tools-for-gateway-trial-host-nplhost.md | qmacro-org/blog | bbf9dc5908fa9a0961ff869e4784291bcb67c3dd | [
"MIT"
] | null | null | null | posts/2012-04-17-two-tools-for-gateway-trial-host-nplhost.md | qmacro-org/blog | bbf9dc5908fa9a0961ff869e4784291bcb67c3dd | [
"MIT"
] | null | null | null | ---
layout: post
title: Two tools for Gateway trial host nplhost
tags:
- gateway
- multitail
- netweaver
- sap
- screen
---
The [SAP NetWeaver Gateway](http://scn.sap.com/community/netweaver-gateway) trial system is a great way to get your hands on all that OData and HTTP goodness. There are a couple of tools that I find myself re-installing when I build a new copy of the VM + trial – multitail and screen.
[Multitail](http://www.vanheusden.com/multitail/) is something I mentioned on my [Enterprise Geeks slot with Craig Cmehil](/undefined/) and allows you to tail more than one file at once. Very useful for keeping an eye on all those log files in the instance work directory!
And [screen](http://en.wikipedia.org/wiki/GNU_Screen) is one of those great utilities that I put in the same class as putty and vim: absolutely essential. It allows you to maintain multiple persistent sessions on a remote *nix host. Great for disconnecting and reconnecting (especially on dodgy ‘net connections) and being able to continue exactly where you left off.
I realised that people might benefit from these too, so I thought I’d offer them for you to download in binary form, so you can avoid going through the hassle of firing up the package manager and wrestling with repositories and dependencies, or building from source. I built them from source on an 64bit SUSE Linux VM ‘nplhost’ straight from SAP, so they should work if you’re using the same as the standard VM recommended for the trial. If you’ve decided on a Windows VM to run Gateway, then you’re out of luck, in more ways than one :-)
They’re available here: [http://www.pipetree.com/~dj/2012/04/nplhost/](http://www.pipetree.com/~dj/2012/04/nplhost/)

Download them to npladm’s home directory to run them from there. Don’t forget to (a) chmod +x each of the binaries, and (b) rename the _ to . for each of the dotfiles.
Share and enjoy!
| 65.1 | 538 | 0.762929 | eng_Latn | 0.998167 |
052fa53508787fd833d1baca8c5289d522d80c87 | 7,210 | md | Markdown | docs/azure-speech-article.md | A7250AG/Karvis | ad7be70c358ded8b5afe4028c7c87cc2204ecf81 | [
"MIT"
] | null | null | null | docs/azure-speech-article.md | A7250AG/Karvis | ad7be70c358ded8b5afe4028c7c87cc2204ecf81 | [
"MIT"
] | null | null | null | docs/azure-speech-article.md | A7250AG/Karvis | ad7be70c358ded8b5afe4028c7c87cc2204ecf81 | [
"MIT"
] | null | null | null | # DSharpPlus + Azure Speech Services
## 1. Introduction
With a little bit of effort, it is entirely possible to leverage Microsoft Azure's Cognitive Services for your Discord bot. Of particular interest are the speech services, which will allow you to do [text-to-speech](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/text-to-speech) (the focus of this article) and [speech-to-text](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-to-text). This article will show you how to create a command to have your bot say something back to you, like Simon says. Once you've got that under your belt, you can make modifications to have your bot speak to you based on whatever trigger and content your heart desires. If you are feeling ambitious, you can even have your bot respond to voice messages using speech-to-text. Just parse the returned text as if it were a regular Channel message or pass it through your Interactivity module for some real fun.
## 2. Prerequisites
* DSharpPlus >= 4.0
* DSharpPlus.VoiceNext
* This article assumes that you've already followed the [VoiceNext](https://dsharpplus.emzi0767.com/articles/vnext_setup.html) article and have a bot that works and that can play music or some other audio.
* NAudio [[GitHub]](https://github.com/naudio/NAudio) [[NuGet]](https://www.nuget.org/packages/NAudio/)
* An account for [Microsoft Azure](https://portal.azure.com)
## 3. Text-to-Speech
Jumping right in, the process is as follows:
1. Follow along with [Try Speech Services for free](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/get-started) to get an API key for Speech Services. The Free Tier pricing includes up to 5M characters free per month. After that, it is $4 per 1M characters. Not bad, really.
2. Add the [Azure Speech SDK NuGet package](https://aka.ms/csspeech/nuget) to your project.
3. Create a new class to do the Azure Speech processing using the Speech SDK. It needs to request audio data from the Speech Synthesizer and then use NAudio to resample to Discord/Opus's 48kHz, 16bit stereo PCM requirement.
```c#
using DSharpPlus;
using Microsoft.CognitiveServices.Speech;
using Microsoft.CognitiveServices.Speech.Audio;
using NAudio.Wave;
using System;
using System.IO;
using System.Threading.Tasks;
public class SpeechModule
{
private DebugLogger debugLogger;
public SpeechModule(DebugLogger debugLogger)
{
this.debugLogger = debugLogger;
}
public async Task<byte[]> SynthesisToSpeakerAsync(string text)
{
debugLogger.LogMessage(LogLevel.Info, Constants.ApplicationName, $"Azure Speech: Synthesizing speech for text [{text}]", DateTime.Now);
// Creates an instance of a speech config with specified subscription key and service region.
// Replace with your own subscription key and service region (e.g., "westus").
var config = SpeechConfig.FromSubscription("subscription-key", "westus");
var audioConfig = AudioConfig.FromStreamOutput(AudioOutputStream.CreatePullStream(AudioStreamFormat.GetDefaultOutputFormat()));
//var audioConfig = AudioConfig.FromDefaultSpeakerOutput(); // if you want to hear it before processing
// Creates a speech synthesizer using the default speaker as audio output.
using (var synthesizer = new SpeechSynthesizer(config, audioConfig))
using (var result = await synthesizer.SpeakTextAsync(text))
{
if (result.Reason == ResultReason.SynthesizingAudioCompleted)
{
debugLogger.LogMessage(LogLevel.Info, Constants.ApplicationName, "Azure Speech: Speech synthesized.", DateTime.Now);
}
else if (result.Reason == ResultReason.Canceled)
{
var cancellation = SpeechSynthesisCancellationDetails.FromResult(result);
debugLogger.LogMessage(LogLevel.Error, Constants.ApplicationName, $"Azure Speech: CANCELED: Reason={cancellation.Reason}", DateTime.Now);
if (cancellation.Reason == CancellationReason.Error)
{
debugLogger.LogMessage(LogLevel.Error, Constants.ApplicationName, $"Azure Speech: CANCELED: ErrorCode={cancellation.ErrorCode}", DateTime.Now);
debugLogger.LogMessage(LogLevel.Error, Constants.ApplicationName, $"Azure Speech: CANCELED: ErrorDetails=[{cancellation.ErrorDetails}]", DateTime.Now);
debugLogger.LogMessage(LogLevel.Error, Constants.ApplicationName, $"Azure Speech: CANCELED: Did you update the subscription info?", DateTime.Now);
}
}
// NAudio resampling from Azure Speech default to Opus default
using (var output = new MemoryStream())
using (var ms = new MemoryStream(result.AudioData))
using (var rs = new RawSourceWaveStream(ms, new WaveFormat(16000, 16, 1)))
using (var resampler = new MediaFoundationResampler(rs, new WaveFormat(48000, 16, 2)))
{
byte[] bytes = new byte[rs.WaveFormat.AverageBytesPerSecond * 4];
while (true)
{
int bytesRead = resampler.Read(bytes, 0, bytes.Length);
if (bytesRead == 0)
break;
output.Write(bytes, 0, bytesRead);
}
return output.GetBuffer();
}
}
}
}
```
4. Add a new command to your CommandsModule.cs from the previous articles to use VoiceNext to send the resampled audio data to Discord through your bot.
```c#
[Command("speak")]
public async Task Speak(CommandContext ctx, string text)
{
if (String.IsNullOrWhiteSpace(text))
throw new InvalidOperationException("No text to speak.");
var vnext = ctx.Client.GetVoiceNext();
var vnc = vnext.GetConnection(ctx.Guild);
if (vnc == null)
throw new InvalidOperationException("Not connected in this guild.");
await ctx.RespondAsync("👌");
var buffer = await new SpeechModule(vnext.Client.DebugLogger).SynthesisToSpeakerAsync(text);
vnc.SendSpeaking(); // send a speaking indicator
await vnc.GetTransmitStream().WriteAsync(buffer);
await vnc.GetTransmitStream().FlushAsync();
vnc.SendSpeaking(false);
}
```
5. With your bot connected to your Guild and joined to a voice channel, like you did in the previous article, send the speak command to your bot (e.g., `-> speak "testing testing testing"`) and listen as your text is spoken back to you.
## 4. Speech-to-Text
TBD
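Until this section is written up properly, here is a minimal sketch of the other direction (a starting point under stated assumptions, not the final Discord integration). It reuses the same Speech SDK subscription values as the text-to-speech example above and recognizes a single utterance from the default microphone; wiring it to Discord instead would additionally require decoding the incoming Opus packets from VoiceNext and feeding the PCM into an `AudioConfig.FromStreamInput` push stream.

```c#
public async Task<string> RecognizeOnceFromMicrophoneAsync()
{
    // Same subscription key and region as in the text-to-speech example above.
    var config = SpeechConfig.FromSubscription("subscription-key", "westus");

    using (var recognizer = new SpeechRecognizer(config))
    {
        // One-shot recognition from the default microphone.
        var result = await recognizer.RecognizeOnceAsync();
        return result.Reason == ResultReason.RecognizedSpeech ? result.Text : string.Empty;
    }
}
```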
## 5. References
[DSharpPlus Voice Next Setup](https://dsharpplus.emzi0767.com/articles/vnext_setup.html)
[Quickstart: Synthesize speech with the Speech SDK for .NET Core](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/quickstart-text-to-speech-dotnetcore) | 53.80597 | 951 | 0.673925 | eng_Latn | 0.633394 |
052fd07ec1cf997aef5437b09b363d25e3aa4f36 | 1,427 | md | Markdown | 2020/09/08/2020-09-08 12:20.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | 3 | 2020-07-14T14:54:15.000Z | 2020-08-21T06:48:24.000Z | 2020/09/08/2020-09-08 12:20.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020/09/08/2020-09-08 12:20.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020年09月08日12时数据
Status: 200
1.钟南山哽咽说什么都压不倒中国人
微博热度:4622186
2.刘璇孕肚倒立照
微博热度:2549209
3.与郎朗一起月见不凡
微博热度:2520016
4.贝克汉姆夫妇曾感染新冠
微博热度:1965348
5.彭于晏像拉黄包车的车夫
微博热度:1708887
6.聚划算99划算节
微博热度:1697569
7.刘亦菲素颜
微博热度:1666965
8.iPhone12
微博热度:1521022
9.陈一鸣裸辞
微博热度:1246638
10.大连13岁行凶男孩家人始终不道歉
微博热度:1219976
11.特朗普让记者摘下口罩遭拒绝
微博热度:1040083
12.阿娇或需再做手术
微博热度:1036324
13.半是蜜糖半是伤预告
微博热度:974454
14.诺兰太会拍男人了
微博热度:720467
15.抗击新冠疫情表彰大会
微博热度:677565
16.信小呆一元转让中国锦鲤
微博热度:635416
17.农夫山泉创始人将成中国新首富
微博热度:583392
18.陈学冬警告粉丝不要爆料
微博热度:581708
19.三星将关闭中国唯一一座电视工厂
微博热度:571311
20.14亿中国人的代表受表彰
微博热度:514803
21.王者荣耀
微博热度:511753
22.熊黛林身材
微博热度:504902
23.第一炉香选角
微博热度:491876
24.穿迷彩进网红书店被当农民工拦下
微博热度:415280
25.钟南山获授共和国勋章
微博热度:391175
26.张定宇步履蹒跚走入人民大会堂
微博热度:382617
27.李湘王岳伦夫妻综艺
微博热度:379865
28.白敬亭的自拍原图
微博热度:377898
29.印度
微博热度:374073
30.苹果发布会时间
微博热度:370764
31.插着把梳子就出门了
微博热度:370438
32.怎么会有璇玑这么笨的女主
微博热度:365530
33.谢谢你为湖北拼过命
微博热度:345872
34.何俊尧被调职
微博热度:328727
35.成都300年桂花巷内桂花树全被砍
微博热度:322612
36.闺蜜拍照技术下的我
微博热度:295933
37.钟南山获勋现场发言画面
微博热度:269717
38.张伯礼张定宇陈薇获人民英雄称号
微博热度:260659
39.人固有一暮
微博热度:251032
40.郑州一高校每天仅开720个洗澡名额
微博热度:227897
41.怀念意气风发的司凤
微博热度:216421
42.当边牧遇上听力考试
微博热度:215993
43.田馥甄新歌
微博热度:215335
44.青岛凤凰音乐节
微博热度:214173
45.课间操跳得像康复训练
微博热度:213015
46.光遇
微博热度:211701
47.父母起诉22岁女儿拒养2岁弟弟胜诉
微博热度:210410
48.特朗普公开攻击美军高层
微博热度:209340
49.陈薇被颁授国家荣誉称号奖章
微博热度:209293
50.向新冠肺炎牺牲烈士和逝世同胞默哀
微博热度:200828
| 6.995098 | 20 | 0.786265 | yue_Hant | 0.327705 |
05309c6b83aec67f52ba27828c2b2c32721c30f2 | 35 | md | Markdown | docs/reference/util.md | lindsay-stevens/openclinica_sqldatamart | ecdda4208021ff5a1cb7a20036eed23bd4196af0 | [
"MIT"
] | 8 | 2016-03-31T08:49:35.000Z | 2021-03-11T06:31:43.000Z | docs/reference/util.md | lindsay-stevens-kirby/openclinica_sqldatamart | ecdda4208021ff5a1cb7a20036eed23bd4196af0 | [
"MIT"
] | 4 | 2015-04-20T05:57:58.000Z | 2016-01-20T04:35:59.000Z | docs/reference/util.md | lindsay-stevens/openclinica_sqldatamart | ecdda4208021ff5a1cb7a20036eed23bd4196af0 | [
"MIT"
] | 5 | 2016-05-18T06:56:04.000Z | 2019-01-24T16:12:53.000Z | # utils
## Overview
## Scripts
| 4.375 | 11 | 0.571429 | fra_Latn | 0.63126 |
053141449db19e8ba31579ff7fa556611d7d9e72 | 2,595 | md | Markdown | README.md | Bader-Research/ListRanking | ee0f1cd93324cf35413ace143d9a1b7ec5b242a4 | [
"BSD-3-Clause"
] | 1 | 2020-12-14T05:11:25.000Z | 2020-12-14T05:11:25.000Z | README.md | dbader13/ListRanking | ee0f1cd93324cf35413ace143d9a1b7ec5b242a4 | [
"BSD-3-Clause"
] | null | null | null | README.md | dbader13/ListRanking | ee0f1cd93324cf35413ace143d9a1b7ec5b242a4 | [
"BSD-3-Clause"
] | 1 | 2021-09-26T15:02:47.000Z | 2021-09-26T15:02:47.000Z | # Parallel List Ranking
## SIMPLE
This SIMPLE/SMP module (written by David R. Helman and Joseph JáJá ,
and later modified by David A. Bader) implements a randomized,
parallel code for link ranking.
Helman and JáJá introduce a new optimal prefix computation algorithm
on linked lists which builds upon the sparse ruling set approach of
Reid-Miller and Blelloch. Besides being somewhat simpler and requiring
nearly half the number of memory accesses, and can bound our
complexity with high probability instead of merely on
average. Moreover, whereas Reid-Miller and Blelloch targeted their
algorithm for implementation on a vector multiprocessor architecture,
they develop an algorithm for implementation on the symmetric
multiprocessor architecture (SMP). These symmetric multiprocessors
dominate the high-end server market and are currently the primary
candidate for constructing large scale multiprocessor systems. The
authors ran this code using a variety of benchmarks which they
identified to examine the dependence of the algorithm on memory access
patterns. For some problems, their algorithm actually matched or
exceeded the optimal sequential solution using only a single
thread. Moreover, in spite of the fact that the processors must
compete for access to main memory, their algorithm still resulted in
scalable performance up to 16 processors, which was the largest
platform available to them.
References:
D. R. Helman and J. JáJá , "Prefix Computations on Symmetric Multiprocessors," Journal of Parallel and Distributed Computing, 61(2):265--278, 2001.
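For readers new to the problem, a trivial sequential baseline clarifies what "list ranking" computes. This sketch is only the obvious O(n) serial version, not the parallel sparse-ruling-set algorithm described above, and the successor-array representation is an assumption:

```c
#include <stdio.h>

/* rank[v] = number of links from node v to the end of the list. */
void list_rank(int n, const int *succ, int head, int *rank)
{
    int pos = 0;
    /* Walk the list once to record each node's position from the head... */
    for (int v = head; v != -1; v = succ[v])
        rank[v] = pos++;
    /* ...then convert that position into the distance to the tail. */
    for (int v = 0; v < n; v++)
        rank[v] = (n - 1) - rank[v];
}

int main(void)
{
    int succ[4] = {3, -1, 0, 1};   /* the list 2 -> 0 -> 3 -> 1, -1 marks the end */
    int rank[4];
    list_rank(4, succ, 2, rank);
    for (int v = 0; v < 4; v++)
        printf("rank[%d] = %d\n", v, rank[v]);
    return 0;
}
```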
## SMP
Irregular problems such as those from graph theory pose serious
challenges for parallel machines due to non-contiguous accesses to
global data structures with low degrees of locality. These parallel
codes perform list ranking on two types of shared-memory computers:
symmetric multiprocessors (SMP) and multithreaded architectures (MTA)
such as the Cray MTA-2. Previous studies show that for SMPs
performance is primarily a function of non-contiguous memory accesses,
whereas for the MTA, it is primarily a function of the number of
concurrent operations.
References:
D.A. Bader, G. Cong, and J. Feo, "On the Architectural Requirements
for Efficient Execution of Graph Algorithms," The 33rd International
Conference on Parallel Processing (ICPP 2005), pp. 547-556, Georg
Sverdrups House, University of Oslo, Norway, June 14-17, 2005.
## IBM-CELL
Parallel List Ranking for the Sony-Toshiba-IBM Cell Broadband Engine Processor
## CRAY-MTA
Parallel List Ranking for the Cray Multithreaded Architecture (MTA/XMT)
| 43.25 | 147 | 0.813487 | eng_Latn | 0.995116 |
0532400f6dfa7bd65023653c495ff4a4bc2d4ffc | 233 | md | Markdown | _project/66-elegant-bohemian-decor-to-give-different-touch.md | rumnamanya/rumnamanya.github.io | 2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9 | [
"MIT"
] | null | null | null | _project/66-elegant-bohemian-decor-to-give-different-touch.md | rumnamanya/rumnamanya.github.io | 2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9 | [
"MIT"
] | null | null | null | _project/66-elegant-bohemian-decor-to-give-different-touch.md | rumnamanya/rumnamanya.github.io | 2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9 | [
"MIT"
] | null | null | null | ---
layout: project_single
title: "66+ Elegant Bohemian Decor to Give Different Touch"
slug: "66-elegant-bohemian-decor-to-give-different-touch"
parent: "elegant-bohemian-decor"
---
66+ Elegant Bohemian Decor to Give Different Touch | 33.285714 | 60 | 0.776824 | eng_Latn | 0.664083 |
053307bad036553d55b00f6dc0940bf28ece5b74 | 569 | md | Markdown | _posts/2017-09-23-BeginLearnPython.md | yxyClub/Www.xiaoyan.work | ea0745ed685cfab00c1001e4a76ad6001137e142 | [
"Apache-2.0"
] | 6 | 2019-07-20T09:12:55.000Z | 2021-07-22T12:43:09.000Z | _posts/2017-09-23-BeginLearnPython.md | yxyClub/Www.xiaoyan.work | ea0745ed685cfab00c1001e4a76ad6001137e142 | [
"Apache-2.0"
] | 6 | 2019-04-06T12:05:23.000Z | 2021-05-21T09:47:41.000Z | _posts/2017-09-23-BeginLearnPython.md | yxyClub/Www.xiaoyan.work | ea0745ed685cfab00c1001e4a76ad6001137e142 | [
"Apache-2.0"
] | 1 | 2020-03-30T15:01:11.000Z | 2020-03-30T15:01:11.000Z | ---
layout: post
title: 开始学 Python
date: 2017-09-23
categories: blog
tags: [Python,笨办法学Python,LearPython2TheHardWay,LP2THW]
description: 立个Flag开始学Python。
---
## 正文
>引子:在老阳汪洋的知识体系中认识到了Python编程的重要性。编程可以给自己带来创作感和成就感,这正是我想要的。因此学习好Python,成了回到自己喜欢做的事上来的一种回归,这让我能过获得心流。
## 编程软件
主要操作系统采用了Mac。
安装了Python3.6(Idle和Luncher),在python网站上下载最新版本。
Mac系统里自然存在了Python2版本。因此,我的电脑里其实是存在两个版本的Python编程软件的。2.x版本的编程和3.x版本的编程还是有一定的区别的,目前许多程序员正在做这方面的转换。
## 参考资料
- 《教孩子学Python》
- 《笨办法学Python(电子版,中文翻译)4.0》
- Zoom.Quiet大神
## Log
- 20170923创建了这个版本。
- 20170928在原始框架版本上添加内容。
- 20180711从简书转过来。
| 17.78125 | 98 | 0.797891 | yue_Hant | 0.911374 |
0533e63843b07771ff3d89a2253e436587c2a698 | 42,368 | md | Markdown | all/README.md | daily-interview/fe-interview | a067b324c734b3e1c50bbec3790011f24561e6e2 | [
"MIT"
] | 167 | 2019-07-05T10:08:35.000Z | 2022-03-28T08:53:14.000Z | all/README.md | daily-interview/fe-interview | a067b324c734b3e1c50bbec3790011f24561e6e2 | [
"MIT"
] | 67 | 2019-07-04T05:49:49.000Z | 2022-02-22T08:38:14.000Z | all/README.md | daily-interview/fe-interview | a067b324c734b3e1c50bbec3790011f24561e6e2 | [
"MIT"
] | 26 | 2019-07-28T12:31:27.000Z | 2022-01-13T10:54:57.000Z | # fe-interview
[](../../commits/master)
[](https://github.com/daily-interview/fe-interview/stargazers)
[](https://github.com/daily-interview/fe-interview/network)
[](https://github.com/daily-interview/fe-interview/blob/master/LICENSE)
## 所有面试题
1、 [js] [介绍一下 JS 的基本数据类型。](https://github.com/daily-interview/fe-interview/issues/1)
JS 7 种基本数据类型(原始类型),即 (Undefined、Null、Boolean、Number 、String) + (Symbol、BigInt)和 3种引用数据类型:对象(Object)、数组(Array)、函数(Function)。
基本类型值:指的是保存在栈内存中的简单数据段。
引用类型值:指的是那些保存在堆内存中的对象。变量中保存的实际上只是一个指针,这个指针指向内存堆中实际的值。
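A small illustration of that difference (the variable names are only for the example):

```js
// Primitive: the value itself is copied, so a is unaffected.
let a = 1;
let b = a;
b = 2;
console.log(a); // 1

// Reference type: only the pointer is copied; both variables share one heap object.
let obj1 = { count: 1 };
let obj2 = obj1;
obj2.count = 2;
console.log(obj1.count); // 2
```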
> 注:Symbol 是 ES6 引入了一种新的原始数据类型,表示独一无二的值; BigInt即是第七种基本类型,V8引擎v6.7 默认启用对 BigInt 的支持。
Symbol用法
语法
Symbol (value)
eg.
```
let a=Symbol ("welcome");
console.log(a); //输出 Symbol(welcome)
```
BigInt用法
语法
BigInt(value) || 数字后面加n;
eg.
```
let b1 = BigInt(10);
let b2 = 10n;
console.log(b1,b2); //输出 10n 10n
```
2、 [js] [js实现几种常见排序算法。( 手写 )](https://github.com/daily-interview/fe-interview/issues/2)
冒泡排序:

```javascript
const arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
function bubbleSort(arr) {
let len = arr.length;
if(len >= 1) {
for(let i = 0;i < len - 1; i++) {
for(let j = 0;j < len - 1 - i; j++) {
if(arr[j] > arr[j + 1]) {
let temp = arr[j + 1];
arr[j + 1] = arr[j];
arr[j] = temp;
}
}
}
}
return arr;
}
console.log(bubbleSort(arr));
```
冒泡排序优化版:
```javascript
const arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
function bubbleSort(arr) {
let len = arr.length;
let lastExchangeIndex = 0;
//无序数列的边界,每次比较只需要比到这里为止
let sortBorder = len - 1;
if(len >= 1) {
for(let i = 0;i < len; i++) {
//有序标记,每一轮的初始是true
let isSorted = true;
for(let j = 0;j < sortBorder - i; j++) {
if(arr[j] > arr[j + 1]) {
let temp = arr[j + 1];
arr[j + 1] = arr[j];
arr[j] = temp;
//有元素交换,所以不是有序,标记变为false
isSorted = false;
//把无序数列的边界更新为最后一次交换元素的位置
lastExchangeIndex = j;
}
}
sortBorder = lastExchangeIndex;
if(isSorted) { //有序,跳出循环
break;
}
}
}
return arr;
}
console.log(bubbleSort(arr));
```
选择排序:

```javascript
const arr=[3,44,38,5,47,15,36,26,27,2,46,4,19,50,48];
function selectionSort(arr) {
let len = arr.length;
let minIndex, temp;
for (let i = 0; i < len - 1; i++) {
minIndex = i;
for (let j = i + 1; j < len; j++) {
if (arr[j] < arr[minIndex]) {
// 寻找最小的数
minIndex = j;
// 将最小数的索引保存
}
}
temp = arr[i];
arr[i] = arr[minIndex];
arr[minIndex] = temp;
}
return arr;
}
console.log(selectionSort(arr));
```
选择排序优化版:
```javascript
const arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
function selectionSort(arr) {
let len = arr.length;
let left = 0;
let right = len - 1;
while (left < right) {
let max = left;//记录无序区最大元素下标
let min = left;//记录无序区最小元素下标
let j = 0;
for (j = left + 1; j <= right; j++) {
            //找最小元素下标
if (arr[j] < arr[min])
{
min = j;
}
            //找最大元素下标
if (arr[j] > arr[max])
{
max = j;
}
}
//最小值如果是第一个则没有必要交换
if (min != left) {
let tmp = arr[left];
arr[left] = arr[min];
arr[min] = tmp;
}
//这里很重要,如果最大元素下标是left,前面已经和最小元素交换了,此时最大元素下标应该是min
if (max == left) {
max = min;
}
//最大值如果是最后一个则没必要交换
if (max != right) {
let tmp = arr[right];
arr[right] = arr[max];
arr[max] = tmp;
}
left++;
right--;
}
return arr;
}
console.log(selectionSort(arr));
```
插入排序:

```javascript
const arr=[3,44,38,5,47,15,36,26,27,2,46,4,19,50,48];
function insertSort(arr) {
const len = arr.length;
let preIndex, current;
for (let i = 1; i < len; i++) {
preIndex = i - 1;
current = arr[i];
while (preIndex >= 0 && arr[preIndex] > current) {
arr[preIndex + 1] = arr[preIndex];
preIndex--;
}
arr[preIndex + 1] = current;
}
return arr;
}
console.log(insertSort(arr));
```
3、 [css] [有哪几种常用的清除浮动方法?](https://github.com/daily-interview/fe-interview/issues/3)
- 父级元素添加伪元素
```css
.clear-float:after {
content: '';
display: block;
clear: both;
}
```
- 在与浮动元素平级的最后面添加新元素 div.clear
```css
.clear {
clear: both;
}
```
- 在父级元素添加样式 overflow: auto; 或者 overflow: hidden; 会存在兼容性问题。
4、 [js] [Object.assign 是浅拷贝还是深拷贝?实现深拷贝的方法有哪些?](https://github.com/daily-interview/fe-interview/issues/4)
> Object.assign() 方法用于将所有可枚举属性的值从一个或多个源对象复制到目标对象。它将返回目标对象。
- 如果目标对象中的属性具有相同的键,则属性将被源对象中的属性覆盖。后面的源对象的属性将类似地覆盖前面的源对象的属性。
- Object.assign 方法只会拷贝源对象自身的并且可枚举的属性到目标对象。该方法使用源对象的[[Get]]和目标对象的[[Set]],所以它会调用相关 getter 和 setter。因此,它分配属性,而不仅仅是复制或定义新的属性。如果合并源包含getter,这可能使其不适合将新属性合并到原型中。为了将属性定义(包括其可枚举性)复制到原型,应使用Object.getOwnPropertyDescriptor()和Object.defineProperty() 。
- String类型和 Symbol 类型的属性都会被拷贝。
- 在出现错误的情况下,例如,如果属性不可写,会引发TypeError,如果在引发错误之前添加了任何属性,则可以更改target对象。
- Object.assign 不会在那些source对象值为 `null `或 `undefined` 的时候抛出错误。
- 针对**深拷贝**,需要使用其他办法,因为 Object.assign()拷贝的是属性值。假如源对象的属性值是一个对象的引用,那么它也只指向那个引用。也就是说,如果对象的属性值为简单类型(如string, number),通过Object.assign({},srcObj);得到的新对象为`深拷贝`;如果属性值为对象或其它引用类型,那对于这个对象而言其实是`浅拷贝`的。
## 深拷贝的几种实现方法
### JSON.stringify 和 JSON.parse
>用 JSON.stringify 把对象转换成字符串,再用 JSON.parse 把字符串转换成新的对象。
可以转成 JSON 格式的对象才能使用这种方法,如果对象中包含 function 或 RegExp 这些就不能用这种方法了。
```
//通过js的内置对象JSON来进行数组对象的深拷贝
function deepClone(obj) {
let _obj = JSON.stringify(obj);
let objClone = JSON.parse(_obj);
return objClone;
}
```
### Object.assign()拷贝
当对象中只有一级属性,没有二级属性的时候,此方法为深拷贝,但是对象中有对象的时候,此方法,在二级属性以后就是浅拷贝。
### 通过jQuery的extend方法实现深拷贝
```
let $ = require('jquery');
let obj1 = {
a: 1,
b: {
f: {
g: 1
}
},
c: [1, 2, 3]
};
let obj2 = $.extend(true, {}, obj1);
```
### lodash.cloneDeep()实现深拷贝
```
let _ = require('lodash');
let obj1 = {
a: 1,
b: { f: { g: 1 } },
c: [1, 2, 3]
};
let obj2 = _.cloneDeep(obj1);
```
### 使用递归的方式实现深拷贝
```
function _deepClone(source) {
let target;
    if (source !== null && typeof source === 'object') { // typeof null is also 'object', so exclude it
target = Array.isArray(source) ? [] : {}
for (let key in source) {
if (source.hasOwnProperty(key)) {
if (typeof source[key] !== 'object') {
target[key] = source[key]
} else {
target[key] = _deepClone(source[key])
}
}
}
} else {
target = source
}
return target
}
```
5、 [js] [promise和setTimeout执行顺序是怎样的?](https://github.com/daily-interview/fe-interview/issues/5)
写出下列程序运行结果并做出解释:
```javascript
setTimeout(function(){
console.log(1);
},0);
new Promise(function(resolve) {
console.log(2)
for(let i=0; i<10000 ; i++ ) {
i==9999 && resolve();
}
console.log(3)
}).then(function(){
console.log(4)
});
console.log(5);
```
这个就涉及到**事件循环(Event Loop)**
> JS运行时,对代码执行顺序的一个算法(任务调度算法)
JS 分类:同步任务和异步任务
JS 的执行机制:
- 首先判断JS代码是同步还是异步,同步就进入主线程,异步就进入 event table
- 异步任务在 event table 中注册函数,当满足触发条件后,被推入event queue
- 同步任务进入主线程后一直执行,直到主线程空闲时,才回去 event queue 中查看是否有可执行的异步任务,如果有就推入主线程
event loop 里有维护两个不同的异步任务队列
- macro Tasks(宏任务):script(整体代码), setTimeout, setInterval, setImmediate, I/O, UI rendering
- micro Tasks(微任务):process.nextTick, Promise(浏览器实现的原生Promise), Object.observe, MutationObserver, MessageChannel
每次执行一段代码(一个script标签)都是一个 macroTask
执行流程:
- event loop 开始
- 从macro Tasks 队列抽取一个任务,执行
- micro Tasks 清空队列执行,若有任务不可执行,推入下一轮 micro Tasks
- 结束 event loop
浏览器执行代码的过程如下整个流程

那么回到题目上去,就是
```js
setTimeout(function(){
console.log(1); // 1-放入宏任务队列,7-执行下一轮事件循环,宏任务输出1
},0);
new Promise(function(resolve) {
console.log(2); // 2-同步输出 2
for(let i=0; i<10000 ; i++ ) {
i==9999 && resolve();
}
console.log(3); // 4-同步输出 3
}).then(function(){
console.log(4); // 3-放入微任务队列,6-回到微任务队列,执行剩余的微任务,输出4
});
console.log(5); // 5-同步输出 5
```
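A further minimal example of the rule above — the microtask queue is drained completely (including microtasks queued by other microtasks) before the next macrotask runs; this snippet is illustrative only:

```js
setTimeout(() => console.log('macro 1'), 0);
setTimeout(() => console.log('macro 2'), 0);

Promise.resolve()
  .then(() => console.log('micro 1'))   // queued as a microtask
  .then(() => console.log('micro 2'));  // queued by the previous microtask

console.log('sync');
// Output: sync, micro 1, micro 2, macro 1, macro 2
```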
6、 [ts] [说一说Typescript中的泛型的作用及使用场景。](https://github.com/daily-interview/fe-interview/issues/6)
## 什么是TypeScript
> TypeScript是由Microsoft Corporation开发和维护的面向对象的编程语言。它是JavaScript的超集,包含所有元素。
> TypeScript完全遵循OOPS概念,在TSC(TypeScript编译器)的帮助下,我们可以将Typescript代码(.ts文件)转换为JavaScript(.js文件)。
## 为什么要使用TypeScript
> TypeScript的设计目的应该是解决JavaScript的“痛点”:弱类型和没有命名空间,导致很难模块化,不适合开发大型程序。另外它还提供了一些语法糖来帮助大家更方便地实践面向对象的编程。
- TypeScript简化了JavaScript代码,使其更易于阅读和调试。
- TypeScript是开源的。
- TypeScript为JavaScript IDE和实践提供了高效的开发工具,例如静态检查。
- 使用TypeScript,我们可以比普通的JavaScript做出巨大的改进。
- TypeScript为我们提供了ES6(ECMAScript 6)的所有优点,以及更高的工作效率。
- TypeScript可以帮助我们避免开发人员通过类型检查代码编写JavaScript时经常遇到的痛苦错误。
- 强大的类型系统,包括泛型。
- TypeScript代码可以按照ES5和ES6标准进行编译,以支持最新的浏览器。
- 支持静态类型。
- TypeScript将节省开发人员的时间。
## 什么是泛型
泛型的本质是参数化类型,通俗的将就是所操作的数据类型被指定为一个参数,这种参数类型可以用在类、接口和方法的创建中,分别成为泛型类,泛型接口、泛型方法。
> TypeScript中的泛型跟java中的泛型基本类似。
## 为什么使用泛型
TypeScript 中不建议使用 any 类型,不能保证类型安全,调试时缺乏完整的信息。
TypeScript可以使用泛型来创建`可重用`的组件。支持当前数据类型,同时也能支持未来的数据类型。扩展灵活。可以在编译时发现你的类型错误,从而保证了类型安全。
## 泛型的使用
使用泛型可以创建泛型函数、泛型接口,泛型类
1.使用泛型变量
```
// 泛型变量的使用
function identity<T>(arg:T):T{
console.log(typeof arg);
return arg;
}
let output1=identity<string>('myString');
let output2=identity('myString');
let output3:number=identity<number>(100);
let output4:number=identity(200);
```
```
// 使用集合的泛型
function loggingIdentity<T>(arg:Array<T>):Array<T>{
console.log(arg.length);
return arg;
}
loggingIdentity([1,2,3]);
```
2.定义泛型函数
```
// 泛型函数
function identity<T>(arg:T):T{
return arg;
}
let myIdentity:{<T>(arg:T):T}=identity;
```
3.定义泛型接口
```
// 泛型接口
interface GenericIdentityFn<T> {
(arg: T): T;
}
function identity<T>(arg: T): T {
return arg;
}
let myIdentity: GenericIdentityFn<number> = identity;
```
4.定义泛型类
```
// 泛型类
class GenericNumber<T>{
zeroValue:T;
add:(x:T,y:T)=>T;
}
let myGenericNumber=new GenericNumber<number>();
myGenericNumber.zeroValue=0;
myGenericNumber.add=function(x,y){return x+y;};
console.info(myGenericNumber.add(2,5));
let stringNumberic=new GenericNumber<string>();
stringNumberic.zeroValue='abc';
stringNumberic.add=function(x,y){return `${x}--${y}`};
console.info(stringNumberic.add('张三丰','诸葛亮'));
```
7、 [http] [说说你对http和https的理解](https://github.com/daily-interview/fe-interview/issues/7)
## 什么是HTTP?
> 超文本传输协议,是一个基于请求与响应,无状态的,应用层的协议,常基于TCP/IP协议传输数据,互联网上应用最为广泛的一种网络协议,所有的WWW文件都必须遵守这个标准。设计HTTP的初衷是为了提供一种发布和接收HTML页面的方法。
## 什么是HTTPS?
> HTTPS是一种通过计算机网络进行安全通信的传输协议,经由HTTP进行通信,利用SSL/TLS建立全信道,加密数据包。HTTPS使用的主要目的是提供对网站服务器的身份认证,同时保护交换数据的隐私与完整性。
PS:TLS是传输层加密协议,前身是SSL协议,由网景公司1995年发布,有时候两者不区分。
## 什么是TLS/SSL?
> TLS/SSL全称安全传输层协议Transport Layer Security, 是介于TCP和HTTP之间的一层安全协议,不影响原有的TCP协议和HTTP协议,所以使用HTTPS基本上不需要对HTTP页面进行太多的改造。
SSL有三种不同类型,需要了解下:
- 扩展验证型(EV)SSL证书,适用于大企业,像银行,证券网站都会使用这个证书,信任等级,安全等级是最高的。
- 组织验证型(OV)SSL证书,适用于企业网站,需要验证企业身份,安全等级比DV高些
- 域名验证型(DV)SSL证书 ,适用于个人网站,一般验证下网站信息就可以通过,很多免费版本
## 如果获取SSL证书?
一般分为三种价位的:贵的,便宜的,免费的
- 贵的(上千甚至上万的价):Symantec、globalsign、comodo、geotrust,除非大企业需要,中小企业网站没必要购买
- 便宜的(大概50美元上下):godday,RapaidSSL,Comodo positiveSSL
- 免费的:Let's Encrypt(比较推荐),Wosign,GlobeSSL
## HTTP特点
- 支持客户/服务器模式。(C/S模式)
- 简单快速:客户向服务器请求服务时,只需传送请求方法和路径。请求方法常用的有GET、HEAD、POST。每种方法规定了客户与服务器联系的类型不同。由于HTTP协议简单,使得HTTP服务器的程序规模小,因而通信速度很快。
- 灵活:HTTP允许传输任意类型的数据对象。正在传输的类型由Content-Type加以标记。
- 无连接:无连接的含义是限制每次连接只处理一个请求。服务器处理完客户的请求,并收到客户的应答后,即断开连接。采用这种方式可以节省传输时间。
- 无状态:HTTP协议是无状态协议。无状态是指协议对于事务处理没有记忆能力。缺少状态意味着如果后续处理需要前面的信息,则它必须重传,这样可能导致每次连接传送的数据量增大。另一方面,在服务器不需要先前信息时它的应答就较快
## HTTPS特点
- 内容加密:采用混合加密技术,中间者无法直接查看明文内容
- 验证身份:通过证书认证客户端访问的是自己的服务器
- 保护数据完整性:防止传输的内容被中间人冒充或者篡改
## HTTPS和HTTP的区别主要如下:
- HTTP 的URL 以http:// 开头,而HTTPS 的URL 以https:// 开头
- HTTP 是不安全的,而 HTTPS 是安全的
- HTTP 标准端口是80 ,而 HTTPS 的标准端口是443
- HTTP 无法加密,而HTTPS 对传输的数据进行加密
- HTTP无需证书,而HTTPS 需要CA机构(如Wosign)的颁发的SSL证书
## 总结
> HTTP + 加密 + 认证 + 完整性保护 = HTTPS
8、 [js] [说说你对 Promise 的理解](https://github.com/daily-interview/fe-interview/issues/8)
## Promise 核心
- Promise 概括来说是对异步的执行结果的描述对象。(这句话的理解很重要)
- Promise 规范中规定了,promise 的状态只有3种:
- pending
- fulfilled
- rejected
Promise 的状态一旦改变则不会再改变。
- Promise 规范中还规定了 Promise 中必须有 then 方法,这个方法也是实现异步的链式操作的基本。
## ES6 Promise细节
- Promise 构造器中必须传入函数,否则会抛出错误。(没有执行器还怎么做异步操作。。。)
- Promise.prototype上的 catch(onrejected) 方法是 then(null,onrejected) 的别名,并且会处理链之前的任何的reject。
- Promise.prototype 上的 then和 catch 方法总会返回一个全新的 Promise 对象。
- 如果传入构造器的函数中抛出了错误,该 promise 对象的[[PromiseStatus]]会赋值为 rejected,并且[[PromiseValue]]赋值为 Error 对象。
- then 中的回调如果抛出错误,返回的 promise 对象的[[PromiseStatus]]会赋值为 rejected,并且[[PromiseValue]]赋值为 Error 对象。
- then 中的回调返回值会影响 then 返回的 promise 对象。
## Promise优点
- Promise最大的好处是在异步执行的流程中,把执行代码和处理结果的代码清晰地分离了。

- 解决回调地狱(Callback Hell)问题
Promise还可以做更多的事情,比如,有若干个异步任务,需要先做任务1,如果成功后再做任务2,任何任务失败则不再继续并执行错误处理函数。
要串行执行这样的异步任务,不用Promise需要写一层一层的嵌套代码。有了Promise,我们只需要简单地写:
```
job1.then(job2).then(job3).catch(handleError);
```
> 其中,job1、job2和job3都是Promise对象。
- Promise.all()并行执行异步任务
- Promise.race()获得先返回的结果即可
>eg.同时向两个URL读取用户的个人信息,只需要获得先返回的结果即可。
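
A minimal sketch of both helpers (the request URLs are placeholders, not part of the original answer):

```js
// Promise.all: succeeds only when every task succeeds, fails as soon as one fails.
Promise.all([fetch('/api/user'), fetch('/api/orders')])
  .then(([userRes, ordersRes]) => console.log('both requests finished'))
  .catch(err => console.log('one of them failed', err));

// Promise.race: settles as soon as the first task settles.
Promise.race([fetch('//cdn-a.example.com/profile'), fetch('//cdn-b.example.com/profile')])
  .then(res => console.log('fastest response wins'))
  .catch(err => console.log('fastest one failed first', err));
```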
## Promise如何解决这两个问题
- 解决可读性的问题
这一点不用多说,用过Promise的人很容易明白。Promise的应用相当于给了你一张可以把解题思路清晰记录下来的草稿纸,你不在需要用脑子去记忆执行顺序。
- 解决信任问题
Promise并没有取消控制反转,而是把反转出去的控制再反转一次,也就是反转了控制反转。
这种机制有点像事件的触发。它与普通的回调的方式的区别在于,普通的方式,回调成功之后的操作直接写在了回调函数里面,而这些操作的调用由第三方控制。在Promise的方式中,回调只负责成功之后的通知,而回调成功之后的操作放在了then的回调里面,由Promise精确控制。
Promise有这些特征:只能决议一次,决议值只能有一个,决议之后无法改变。任何then中的回调也只会被调用一次。Promise的特征保证了Promise可以解决信任问题。
对于回调过早的问题,由于Promise只能是异步的,所以不会出现异步的同步调用。即便是在决议之前的错误,也是异步的,并不是会产生同步(调用过早)的困扰。
```
let a = new Promise((resolve, reject) => {
let b = 1 + c; // ReferenceError: c is not defined,错误会在下面的a打印出来之后报出。
resolve(true);
})
console.log(1, a);
a.then(res => {
console.log(2, res);
})
.catch(err => {
console.log(err);
})
```
对于回调过晚或没有调用的问题,Promise本身不会回调过晚,只要决议了,它就会按照规定运行。至于服务器或者网络的问题,并不是Promise能解决的,一般这种情况会使用Promise的竞态APIPromise.race加一个超时的时间:
```
function timeoutPromise(delay) {
return new Promise(function(resolve, reject) {
setTimeout(function() {
reject("Timeout!");
}, delay);
});
}
Promise.race([doSomething(), timeoutPromise(3000)])
.then(...)
.catch(...);
```
对于回调次数太少或太多的问题,由于Promise只能被决议一次,且决议之后无法改变,所以,即便是多次回调,也不会影响结果,决议之后的调用都会被忽略。
9、 [js] [说一下对bind,call,apply三个函数的认识,自己实现一下bind方法](https://github.com/daily-interview/fe-interview/issues/9)
粗略讲一下,希望大佬们能补充下。
首先这三个方法都是用来改变函数的 this 的绑定(指向)的。
它们的用法如下:
```js
func.apply(thisArg, [argsArray])
fun.call(thisArg, arg1, arg2, ...)
function.bind(thisArg[, arg1[, arg2[, ...]]])
```
区别:
- call 和 apply 的区别在于传参的形式不一样,apply 的参数形式是数组或类数组对象,call 的参数形式则是一个个排列的参数值;
- bind 返回的是原函数的拷贝,并拥有指定的 this 值和初始参数;而 call 和 apply 都是直接返回原函数的返回值,或 undefined;即 bind 是需要手动去调用的,而 apply 和 call 都是立即自动执行。
实现 bind 方法可以参考 [MDN bind polyfill](https://developer.mozilla.org/zh-CN/docs/Web/JavaScript/Reference/Global_Objects/Function/bind#Compatibility)
或者
```js
const bind = (fn, context, ...boundArgs) => (...args) => fn.apply(context, [...boundArgs, ...args]);
```
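
A slightly fuller hand-written sketch, which also keeps the prototype chain so the bound function still works with `new` (a common follow-up); the function and variable names here are made up for the example, not a standard API:

```js
function myBind(fn, context, ...boundArgs) {
  if (typeof fn !== 'function') throw new TypeError('myBind expects a function');
  function bound(...args) {
    // When invoked with `new`, `this` is the fresh instance and `context` must be ignored.
    const calledWithNew = this instanceof bound;
    return fn.apply(calledWithNew ? this : context, [...boundArgs, ...args]);
  }
  // Keep the prototype chain so `new bound()` instances inherit from fn.prototype.
  bound.prototype = Object.create(fn.prototype || Object.prototype);
  return bound;
}
```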
10、 [css] [说说对 BFC(Block formatting contexts) 的理解](https://github.com/daily-interview/fe-interview/issues/10)
## BFC是什么?
- BFC(Block Formatting Context)即“块级格式化上下文”
- IFC(Inline Formatting Context)即“行内格式化上下文”
- BFC是W3C CSS 2.1 规范中的一个概念,它决定了元素如何对其内容进行定位,以及与其他元素的关系和相互作用。当涉及到可视化布局的时候,Block Formatting Context提供了一个独立的渲染区域(作用范围或者盒子),HTML元素在这个独立的渲染区域中按照一定规则进行布局。并且与这个渲染区域外部毫不相干。
## 如何产生BFC
- 浮动元素:float 除 none 以外的值。
- 绝对定位元素:position (absolute、fixed)。
- display 为 inline-block、table-cells、flex。
- overflow 除了 visible 以外的值。
11、 [js] [什么是函数节流和函数防抖?应用场景是怎么样的?](https://github.com/daily-interview/fe-interview/issues/11)
## 防抖debounce
防抖(Debounce): 多次触发,只在最后一次触发时,执行目标函数。
函数防抖就是,延迟一段时间再执行函数,如果这段时间内又触发了该函数,则延迟重新计算。
### 应用场景
(1)通过监听某些事件完成对应的需求,比如:
- 通过监听 scroll 事件,检测滚动位置,根据滚动位置显示返回顶部按钮
- 通过监听 resize 事件,对某些自适应页面调整DOM的渲染(通过CSS实现的自适应不再此范围内)
- 通过监听 keyup 事件,监听文字输入并调用接口进行模糊匹配
(2)其他场景
- 表单组件输入内容验证
- 防止多次点击导致表单多次提交
- search模糊搜索,用户在不断输入值时,用防抖来节约请求资源
......
### 简单实现
```
function debounce(fn, wait) {
let t;
    return function (...args) {
        let context = this;
if (t) clearTimeout(t);
t= setTimeout(() => {
fn.apply(context, args);
}, wait)
}
}
```
### 完整实现
```
function debounce(func, wait, immediate) {
let time;
let debounced = function() {
let context = this;
if(time) clearTimeout(time);
if(immediate) {
let callNow = !time;
if(callNow) func.apply(context, arguments);
time = setTimeout(
()=>{time = null} //见注解
, wait)
} else {
time = setTimeout(
()=>{func.apply(context, arguments)}
, wait)
}
};
debounced.cancel = function() {
clearTimeout(time);
time = null;
};
return debounced;
}
```
// underscore.js debounce
```
//
// Returns a function, that, as long as it continues to be invoked, will not
// be triggered. The function will be called after it stops being called for
// N milliseconds. If `immediate` is passed, trigger the function on the
// leading edge, instead of the trailing.
_.debounce = function(func, wait, immediate) {
var timeout, args, context, timestamp, result;
// 处理时间
var later = function() {
var last = _.now() - timestamp;
if (last < wait && last >= 0) {
timeout = setTimeout(later, wait - last); // 10ms 6ms 4ms
} else {
timeout = null;
if (!immediate) {
result = func.apply(context, args);
if (!timeout) context = args = null;
}
}
};
  return function() {
    context = this;
    args = arguments;
    timestamp = _.now();
    var callNow = immediate && !timeout;
    if (!timeout) timeout = setTimeout(later, wait);
    if (callNow) {
      result = func.apply(context, args);
      context = args = null;
    }
    return result;
  };
};
```
## 节流 throttle
节流(Throttle):函数间隔一段时间后才能再触发,避免某些函数触发频率过高,比如滚动条滚动事件触发的函数。
### 应用场景
- 鼠标不断点击触发,mousedown(单位时间内只触发一次)
- 监听滚动事件,比如是否滑到底部自动加载更多
### 简单实现
```
function throttle (fn, wait, mustRun) {
let start = new Date()
let timeout
  return function (...args) {
    // 在返回的函数内部保留上下文和参数
    let context = this;
let current = new Date();
clearTimeout(timeout);
let remaining = current - start;
// 达到了指定触发时间,触发该函数
if (remaining > mustRun) {
fn.apply(context, args);
start = current;
} else {
// 否则wait时间后触发,闭包保留一个timeout实例
timeout = setTimeout(fn, wait);
}
}
}
```
### 完整实现
```
function throttle(func, wait, options) {
let time, context, args, result;
let previous = 0;
if (!options) options = {};
let later = function () {
previous = options.leading === false ? 0 : new Date().getTime();
time = null;
func.apply(context, args);
if (!time) context = args = null;
};
let throttled = function () {
let now = new Date().getTime();
if (!previous && options.leading === false) previous = now;
let remaining = wait - (now - previous);
context = this;
args = arguments;
if (remaining <= 0 || remaining > wait) {
if (time) {
clearTimeout(time);
time = null;
}
previous = now;
func.apply(context, args);
if (!time) context = args = null;
} else if (!time && options.trailing !== false) {
time = setTimeout(later, remaining);
}
};
return throttled;
}
```
// underscore.js throttle
```
// Returns a function, that, when invoked, will only be triggered at most once
// during a given window of time. Normally, the throttled function will run
// as much as it can, without ever going more than once per `wait` duration;
// but if you'd like to disable the execution on the leading edge, pass
// `{leading: false}`. To disable execution on the trailing edge, ditto.
_.throttle = function(func, wait, options) {
var context, args, result;
var timeout = null;
var previous = 0;
if (!options) options = {};
var later = function() {
previous = options.leading === false ? 0 : _.now();
timeout = null;
result = func.apply(context, args);
if (!timeout) context = args = null;
};
return function() {
var now = _.now();
if (!previous && options.leading === false) previous = now;
var remaining = wait - (now - previous);
context = this;
args = arguments;
if (remaining <= 0 || remaining > wait) {
if (timeout) {
clearTimeout(timeout);
timeout = null;
}
previous = now;
result = func.apply(context, args);
if (!timeout) context = args = null;
} else if (!timeout && options.trailing !== false) {
timeout = setTimeout(later, remaining);
}
return result;
};
};
```
[详情见我的简书-js防抖函数、节流函数实现, 以及在react项目中的使用](https://www.jianshu.com/p/38be6513992f)
12、 [css] [css中position属性值有哪些?各有什么特点。](https://github.com/daily-interview/fe-interview/issues/12)
position 属性介绍
(1)position 属性自 CSS2 起就有了,该属性规定元素的定位类型。所有主流浏览器都支持 position 属性。
(2)position 的可选值有四个:static、relative、absolute、fixed。下面分别进行介绍。(其实还有个 inherit,不过这个是 IE 特有的,这里就不做讨论)
## static
### position: static(默认值)
1,基本介绍
(1)static 是默认值。表示没有定位,或者说不算具有定位属性。
(2)如果元素 position 属性值为 static(或者未设 position 属性),该元素出现在正常的流中(忽略 top, bottom, left, right 或者 z-index 声明)。
2,使用样例
css:
```
<style>
div {
width: 200px;
height: 100px;
background-color: #C9FFFF;
}
</style>
```
html:
```
<div></div>
<input type="text"/>
```
我们不设置元素的 postion 属性值,那么默认的显示效果如下:
[点我预览](https://artdong.github.io/blog/2018/07/23/css-position)
## relative
1,基本介绍
(1)relative 生成相对定位的元素,相对于其正常位置进行定位。
(2)相对定位完成的过程如下:
首先按默认方式(static)生成一个元素(并且元素像层一样浮动了起来)。
然后相对于以前的位置移动,移动的方向和幅度由 left、right、top、bottom 属性确定,偏移前的位置保留不动。
2,样例代码
下面代码将文本输入框 position 设置为 relative(相对定位),并且相对于默认的位置向右、向上分别移动 15 个像素。
css:
```
div {
width: 200px;
height: 100px;
background-color: #C9FFFF;
}
input {
position: relative;
left: 15px;
top: -15px;
}
```
html:
```
<div></div>
<input type="text" />
```
运行效果如下:
[点我预览](https://artdong.github.io/blog/2018/07/23/css-position)
## absolute
1,基本介绍
(1)absolute 生成绝对定位的元素。
(2)绝对定位的元素使用 left、right、top、bottom 属性相对于其最接近的一个具有定位属性的父元素进行绝对定位。
(3)如果不存在这样的父元素,则相对于 body 元素,即相对于浏览器窗口。
2,样例代码
下面代码让标题元素相对于它的父容器做绝对定位(注意父容器 position 要设置为 relative)。
同时通过 top 属性让标题元素上移,使其覆盖在父容器的上边框。
最后通过 left 和 margin-left 配合实现这个绝对定位元素的水平居中。
css:
```
#box {
width: 200px;
height: 100px;
-webkit-box-flex:1;
border: 1px solid #28AE65;
border-radius:6px;
padding: 20px;
position: relative;
font-size: 12px;
}
#title {
background: #FFFFFF;
color: #28AE65;
font-size: 15px;
text-align: center;
width: 70px;
height: 20px;
line-height: 20px;
position: absolute;
top: -10px;
left: 50%;
margin-left: -35px;
}
```
html:
```
<div id="box">
<div id="title">标题</div>
欢迎访问我的博客
</div>
```
运行效果如下,标题元素虽然是在边框容器的内部。但由于其使用绝对定位,并做位置调整,最后显示效果就是覆盖在父容器边框上。
[点我预览](https://artdong.github.io/blog/2018/07/23/css-position)
## fixed
1,基本介绍
(1)fixed 生成绝对定位的元素,该元素相对于浏览器窗口进行定位。
(2)固定定位的元素不会随浏览器窗口的滚动条滚动而变化,也不会受文档流动影响,而是始终位于浏览器窗口内视图的某个位置。
2,样例代码
(1)下面代码让输入框位于浏览器窗口的底部。
css:
```
input {
position: fixed;
bottom: 10px;
}
```
html:
```
<ol>
<li>数据</li><li>数据</li><li>数据</li><li>数据</li><li>数据</li><li>数据</li>
<li>数据</li><li>数据</li><li>数据</li><li>数据</li><li>数据</li><li>数据</li>
<li>数据</li><li>数据</li><li>数据</li><li>数据</li><li>数据</li><li>数据</li>
<li>数据</li><li>数据</li><li>数据</li><li>数据</li><li>数据</li><li>数据</li>
</ol>
<input type="text" />
```
(2)可以看到不管滚动条如何滚动,输入框始终处于窗口的最下方。
[点我预览](https://artdong.github.io/blog/2018/07/23/css-position)
13、 [js] [谈一谈你对this指针的理解](https://github.com/daily-interview/fe-interview/issues/13)
## 为什么要用this
> this提供了一种更优雅的方法来隐式`传递`一个对象的引用,因此可以将API设计得更加简洁并且易于复用。
## 什么是 this
> this 就是一个指针,指向我们调用函数的对象。
## this的值由什么决定
> this的值并不是由函数定义放在哪个对象里面决定,而是函数执行时由谁来唤起决定。
## 什么是执行上下文
> 执行上下文 是语言规范中的一个概念,用通俗的话讲,大致等同于函数的执行“环境”。具体的有:变量作用域(和 作用域链条,闭包里面来自外部作用域的变量),函数参数,以及 this 对象的值。
现在起,我们专注于查明 this 关键词到底指向哪。因此,我们现在要思考的就一个问题:
- 是什么调用函数?是哪个对象调用了函数?
为了理解这个关键概念,我们来测一下下面的代码。
```
let person = {
name: "Jay",
greet: function() {
console.log("hello, " + this.name);
}
};
person.greet();
```
谁调用了 greet 函数?是 person 这个对象对吧?在 greet() 调用的左边是一个 person 对象,那么 this 关键词就指向 person,this.name 就等于 "Jay"。现在,还是用上面的例子,我加点料:
```
let greet = person.greet; // 将函数引用存起来;
greet(); // 调用函数
```
你觉得在这种情况下控制台会输出什么?“Jay”?undefined?还是别的?
正确答案是 undefined。这说明this 的值并不是由函数定义放在哪个对象里面决定,而是函数执行时由谁来唤起决定。
## 思考题
找出 this 的指向
```
let name = "Jay Global";
let person = {
name: 'Jay Person',
details: {
name: 'Jay Details',
print: function() {
return this.name;
}
},
print: function() {
return this.name;
}
};
console.log(person.details.print()); // ?
console.log(person.print()); // ?
let name1 = person.print;
let name2 = person.details;
console.log(name1()); // ?
console.log(name2.print()) // ?
```
## 词法作用域
> 词法作用域也就是在词法阶段定义的作用域,也就是说词法作用域在代码书写时就已经确定了。箭头函数就是遵循词法作用域。
## this 和 箭头函数
在 ES6 里面,不管你喜欢与否,箭头函数被引入了进来。对于那些还没用惯箭头函数或者新学 JavaScript 的人来说,当箭头函数和 this 关键词混合使用时会发生什么,这个点可能会给你带来小小的困惑和蛋蛋的忧伤。
当涉及到 this 关键词,箭头函数 和 普通函数 主要的不同是什么?
> 箭头函数按词法作用域来绑定它的上下文,所以 this 实际上会引用到原来的上下文。
## 思考题
找出this指向
```
let object = {
data: [1,2,3],
dataDouble: [1,2,3],
double: function() {
console.log("this inside of outerFn double()");
console.log(this);
return this.data.map(function(item) {
console.log(this); // 这里的 this 是什么??
return item * 2;
});
},
doubleArrow: function() {
console.log("this inside of outerFn doubleArrow()");
console.log(this);
return this.dataDouble.map(item => {
console.log(this); // 这里的 this 是什么??
return item * 2;
});
}
};
object.double();
object.doubleArrow();
```
14、 [js] [你遇到过跨域问题吗?跨域请求资源的方式有哪些?](https://github.com/daily-interview/fe-interview/issues/14)
## 什么是跨域
> 在JavaScript中,有一个很重要的安全性限制,被称为“Same-Origin Policy”(同源策略)。它是一种约定,由Netscape公司1995年引入浏览器,是浏览器最核心也最基本的安全功能,如果缺少了同源策略,浏览器很容易受到XSS、CSFR等攻击。所谓同源是指"协议+域名+端口"三者相同,即便两个不同的域名指向同一个ip地址,也非同源。
## 常用的几种跨域解决方案:
- JSONP
- CORS策略
- Nginx代理跨域
## 跨域的原理解析及实现方法
1. 通过JSONP(JSON with padding)跨域
> 通常为了减轻web服务器的负载,我们把js、css,img等静态资源分离到另一台独立域名的服务器上,在html页面中再通过相应的标签从不同域名下加载静态资源,而被浏览器允许,基于此原理,我们可以通过动态创建script,再请求一个带参网址实现跨域通信。
> 而jsonp就是利用了script标签的src属性是没有跨域的限制的,从而达到跨域访问的目的。因此它的最基本原理就是:动态添加一个<script>标签来实现。
原生实现:
```
<script>
let script = document.createElement('script');
script.type = 'text/javascript';
// 传参一个回调函数名给后端,方便后端返回时执行这个在前端定义的回调函数
script.src = 'http://www.domain2.com:3000/login?user=admin&callback=handleCallback';
document.head.appendChild(script);
// 回调执行函数
function handleCallback(res) {
alert(JSON.stringify(res));
}
</script>
```
服务端返回如下(返回时即执行全局函数):
```
handleCallback({"status": true, "user": "admin"})
```
后端node.js代码示例:
```
let qs = require('querystring');
let http = require('http');
let server = http.createServer();
server.on('request', function(req, res) {
var params = qs.parse(req.url.split('?')[1]);
let fn = params.callback;
// jsonp返回设置
res.writeHead(200, { 'Content-Type': 'text/javascript' });
res.write(fn + '(' + JSON.stringify(params) + ')');
res.end();
});
server.listen('3000');
console.log('Server is running at port 3000...');
```
JSONP的不足之处:
- 只能使用get方法,不能使用post方法:我们知道 script,link, img 等等标签引入外部资源,都是 get 请求的,那么就决定了 jsonp 一定是 get 的。但有时候我们使用的 post 请求也成功,为啥呢?这是因为当我们指定dataType:'jsonp',不论你指定:type:"post" 或者type:"get",其实质上进行的都是 get 请求!
- 没有关于 JSONP 调用的错误处理。如果动态脚本插入有效,就执行调用;如果无效,就静默失败。失败是没有任何提示的。例如,不能从服务器捕捉到 404 错误,也不能取消或重新开始请求。不过,等待一段时间还没有响应的话,就不用理它了。
2. CORS策略
- 原理:
CORS是一个W3C标准,全称是"跨域资源共享"(Cross-origin resource sharing)。它允许浏览器向跨源服务器,发出XMLHttpRequest请求,从而克服了AJAX只能同源使用的限制。它为Web服务器定义了一种方式,允许网页从不同的域访问其资源.
CORS系统定义了一种浏览器和服务器交互的方式来确定是否允许跨域请求。 它是一个妥协,有更大的灵活性,但比起简单地允许所有这些的要求来说更加安全。
- 实现方法:
CORS需要浏览器和服务器同时支持。目前,所有浏览器都支持该功能,IE浏览器不能低于IE10(IE8/9需要使用XDomainRequest对象来支持CORS)。
整个CORS通信过程,都是浏览器自动完成,不需要用户参与。对于开发者来说,CORS通信与同源的AJAX通信没有差别,代码完全一样。浏览器一旦发现AJAX请求跨源,就会自动添加一些附加的头信息,有时还会多出一次附加的请求,但用户不会有感觉。
- 前端[www.domain1.com]设置:
原生ajax
// 前端设置是否带cookie
xhr.withCredentials = true;
示例代码:
```
let xhr = new XMLHttpRequest(); // IE8/9需用window.XDomainRequest兼容
// 前端设置是否带cookie
xhr.withCredentials = true;
xhr.open('post', 'http://www.domain2.com:3000/login', true);
xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
xhr.send('user=admin');
xhr.onreadystatechange = function() {
if (xhr.readyState == 4 && xhr.status == 200) {
alert(xhr.responseText);
}
};
```
- Nodejs后台示例:
```
let http = require('http');
let server = http.createServer();
let qs = require('querystring');
server.on('request', function(req, res) {
let postData = '';
// 数据块接收中
req.addListener('data', function(chunk) {
postData += chunk;
});
// 数据接收完毕
req.addListener('end', function() {
postData = qs.parse(postData);
// 跨域后台设置
res.writeHead(200, {
'Access-Control-Allow-Credentials': 'true', // 后端允许发送Cookie
'Access-Control-Allow-Origin': 'http://www.domain1.com', // 允许访问的域(协议+域名+端口)
/*
* 此处设置的cookie还是domain2的而非domain1,因为后端也不能跨域写cookie(nginx反向代理可以实现),
* 但只要domain1中写入一次cookie认证,后面的跨域接口都能从domain2中获取cookie,从而实现所有的接口都能跨域访问
*/
'Set-Cookie': 'l=a123456;Path=/;Domain=www.domain2.com;HttpOnly' // HttpOnly的作用是让js无法读取cookie
});
res.write(JSON.stringify(postData));
res.end();
});
});
server.listen('3000');
console.log('Server is running at port 3000...');
```
- CORS策略的优缺点:
> 优点:
1. CORS支持所有类型的HTTP请求。
2. 使用CORS,开发者可以使用普通的XMLHttpRequest发起请求和获得数据,比起JSONP有更好的错误处理。
> 缺点: 兼容性方面相对差一点,ie10或以上才支持
3. nginx代理跨域
- nginx配置解决iconfont跨域
浏览器跨域访问js、css、img等常规静态资源被同源策略许可,但iconfont字体文件(eot|otf|ttf|woff|svg)例外,此时可在nginx的静态资源服务器中加入以下配置。
```
location / {
add_header Access-Control-Allow-Origin *;
}
```
- nginx反向代理接口跨域
跨域原理: 同源策略是浏览器的安全策略,不是HTTP协议的一部分。服务器端调用HTTP接口只是使用HTTP协议,不会执行JS脚本,不需要同源策略,也就不存在跨越问题。
实现思路:通过nginx配置一个代理服务器(域名与domain1相同,端口不同)做跳板机,反向代理访问domain2接口,并且可以顺便修改cookie中domain信息,方便当前域cookie写入,实现跨域登录。
nginx具体配置:
```
#proxy服务器
server {
listen 81;
server_name www.domain1.com;
location / {
proxy_pass http://www.domain2.com:3000; #反向代理
proxy_cookie_domain www.domain2.com www.domain1.com; #修改cookie里域名
index index.html index.htm;
# 当用webpack-dev-server等中间件代理接口访问nignx时,此时无浏览器参与,故没有同源限制,下面的跨域配置可不启用
add_header Access-Control-Allow-Origin http://www.domain1.com; #当前端只跨域不带cookie时,可为*
add_header Access-Control-Allow-Credentials true;
}
}
```
1.) 前端代码示例:
```
let xhr = new XMLHttpRequest();
// 前端开关:浏览器是否读写cookie
xhr.withCredentials = true;
// 访问nginx中的代理服务器
xhr.open('get', 'http://www.domain1.com:81/?user=admin', true);
xhr.send();
```
2.) Nodejs后台示例:
```
let http = require('http');
let server = http.createServer();
let qs = require('querystring');
server.on('request', function(req, res) {
let params = qs.parse(req.url.substring(2));
// 向前台写cookie
res.writeHead(200, {
'Set-Cookie': 'l=a123456;Path=/;Domain=www.domain2.com;HttpOnly' // HttpOnly:脚本无法读取
});
res.write(JSON.stringify(params));
res.end();
});
server.listen('3000');
console.log('Server is running at port 3000...');
```
15、 [http] [什么是强缓存和协商缓存?](https://github.com/daily-interview/fe-interview/issues/15)
## 什么是浏览器缓存
浏览器缓存(Brower Caching)是浏览器在本地磁盘对用户最近请求过的文档进行存储,当访问者再次访问同一页面时,浏览器就可以直接从本地磁盘加载文档。
## 为什么要使用浏览器缓存
1.减少了冗余的数据传输,节省了网费
2.减少了服务器的负担,大大提升了网站的性能
3.加快了客户端加载网页的速度
4.更好的用户体验
## 浏览器缓存类型
> 浏览器缓存主要有两类:缓存协商和彻底缓存,也有称之为协商缓存和强缓存。
1.强缓存:强制缓存整体流程比较简单,就是在第一次访问服务器取到数据之后,在过期时间之内不会再去重复请求。实现这个流程的核心就是如何知道当前时间是否超过了过期时间。
> 强制缓存的过期时间通过第一次访问服务器时返回的响应头获取。在 http 1.0 和 http 1.1 版本中通过不同的响应头字段实现。
- http 1.0
在 http 1.0 版本中,强制缓存通过 Expires 响应头来实现。 expires 表示未来资源会过期的时间。也就是说,当发起请求的时间超过了 expires 设定的时间,即表示资源缓存时间到期,会发送请求到服务器重新获取资源。而如果发起请求的时间在 expires 限定的时间之内,浏览器会直接读取本地缓存数据库中的信息(from memory or from disk),两种方式根据浏览器的策略随机获取。
- http 1.1
在 http 1.1 版本中,强制缓存通过 Cache-Control 响应头来实现。Cache-Control 拥有多个值:
private:客户端可以缓存
public:客户端和代理服务器均可缓存;
max-age=xxx:缓存的资源将在 xxx 秒后过期;
no-cache:需要使用协商缓存来验证是否过期;
no-store:不可缓存
最常用的字段就是 max-age=xxx ,表示缓存的资源将在 xxx 秒后过期。一般来说,为了兼容,两个版本的强制缓存都会被实现。
> 强制缓存只有首次请求才会跟服务器通信,读取缓存资源时不会发出任何请求,资源的 Status 状态码为 200,资源的 Size 为 from memory 或者 from disk ,http 1.1 版本的实现优先级会高于 http 1.0 版本的实现。
2.协商缓存:协商缓存与强制缓存的不同之处在于,协商缓存每次读取数据时都需要跟服务器通信,并且会增加缓存标识。在第一次请求服务器时,服务器会返回资源,并且返回一个资源的缓存标识,一起存到浏览器的缓存数据库。当第二次请求资源时,浏览器会首先将缓存标识发送给服务器,服务器拿到标识后判断标识是否匹配,如果不匹配,表示资源有更新,服务器会将新数据和新的缓存标识一起返回到浏览器;如果缓存标识匹配,表示资源没有更新,并且返回 304 状态码,浏览器就读取本地缓存服务器中的数据。
在 http 协议的 1.0 和 1.1 版本中也有不同的实现方式。
- http 1.0
在 http 1.0 版本中,第一次请求资源时服务器通过 Last-Modified 来设置响应头的缓存标识,并且把资源最后修改的时间作为值填入,然后将资源返回给浏览器。在第二次请求时,浏览器会首先带上 If-Modified-Since 请求头去访问服务器,服务器会将 If-Modified-Since 中携带的时间与资源修改的时间匹配,如果时间不一致,服务器会返回新的资源,并且将 Last-Modified 值更新,作为响应头返回给浏览器。如果时间一致,表示资源没有更新,服务器返回 304 状态码,浏览器拿到响应状态码后从本地缓存数据库中读取缓存资源。
这种方式有一个弊端,就是当服务器中的资源增加了一个字符,后来又把这个字符删掉,本身资源文件并没有发生变化,但修改时间发生了变化。当下次请求过来时,服务器也会把这个本来没有变化的资源重新返回给浏览器。
- http 1.1
在 http 1.1 版本中,服务器通过 Etag 来设置响应头缓存标识。Etag 的值由服务端生成。在第一次请求时,服务器会将资源和 Etag 一并返回给浏览器,浏览器将两者缓存到本地缓存数据库。在第二次请求时,浏览器会将 Etag 信息放到 If-None-Match 请求头去访问服务器,服务器收到请求后,会将服务器中的文件标识与浏览器发来的标识进行对比,如果不相同,服务器返回更新的资源和新的 Etag ,如果相同,服务器返回 304 状态码,浏览器读取缓存。
> 协商缓存每次请求都会与服务器交互,第一次是拿数据和标识的过程,第二次开始,就是浏览器询问服务器资源是否有更新的过程。每次请求都会传输数据,如果命中缓存,则资源的 Status 状态码为 304 而不是 200 。同样的,一般来讲为了兼容,两个版本的协商缓存都会被实现,http 1.1 版本的实现优先级会高于 http 1.0 版本的实现。
> 两者的共同点是,都是从客户端缓存中读取资源;区别是强缓存不会发请求,协商缓存会发请求。
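
As an illustrative sketch only (not part of the original answer), a Node.js handler that combines a strong cache with an ETag-based negotiated cache could look roughly like this:

```js
const http = require('http');
const crypto = require('crypto');

http.createServer((req, res) => {
  const body = 'hello cache';
  const etag = crypto.createHash('md5').update(body).digest('hex');

  // Negotiated cache: the validator still matches, so answer 304 with no body.
  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304);
    return res.end();
  }

  res.writeHead(200, {
    'Cache-Control': 'max-age=60', // strong cache: reuse freely for 60 seconds
    'ETag': etag                   // validator for the negotiated cache
  });
  res.end(body);
}).listen(3000);
```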
16、 [js] [了解js闭包吗?谈谈你对js闭包的理解。](https://github.com/daily-interview/fe-interview/issues/16)
## 什么是闭包
MDN的解释:闭包是`函数`和`声明该函数的词法环境`的组合。
简单讲,闭包就是指有权访问另一个函数作用域中的变量的函数。
它由两部分构成:函数,以及创建该函数的环境。环境由闭包创建时在作用域中的所有局部变量组成。
理解闭包的关键在于:外部函数调用之后其变量对象本应该被销毁,但闭包的存在使我们仍然可以访问外部函数的变量对象,这就是闭包的重要概念。
## 如何产生一个闭包函数
> 创建闭包最常见方式,就是在一个函数内部创建另一个函数。
```
function outer() {
let name = "hello"; // 闭包创建时所能访问的局部变量
function sayHello() { // 闭包函数
alert(name);
}
return sayHello; // 返回闭包函数
}
let myFunc = outer();
myFunc();
```
> 闭包的作用域链包含着它自己的作用域,以及包含它的函数的作用域和全局作用域。
outer有了myFunc的引用,内存一直得不到释放,咋办呢?这样的函数多了是不是会造成内存溢出?
手动释放一下:
```
myFunc = null;
```
## 闭包的注意事项(如何防止内存泄漏)
通常,函数的作用域及其所有变量都会在函数执行结束后被销毁。但是,在创建了一个闭包以后,这个函数的作用域就会一直保存到闭包不存在为止。
```
function makeAdder(x) {
return function(y) {
return x + y;
};
}
let add5 = makeAdder(5);
let add10 = makeAdder(10);
console.log(add5(2)); // 7
console.log(add10(2)); // 12
add5 = null;
add10 = null;
```
add5 和 add10 都是闭包。它们共享相同的函数定义,但是保存了不同的词法环境。在 add5 的环境中,x 为 5。而在 add10 中,x 则为 10。
最后通过 null 释放了 add5 和 add10 对闭包的引用。
在javascript中,如果一个对象不再被引用,那么这个对象就会被垃圾回收机制回收;
如果两个对象互相引用,而不再被第3者所引用,那么这两个互相引用的对象也会被回收。
## 闭包中的this对象
```
let name = "window";
let obj = {
name: 'object',
getName: function() {
return function() {
return this.name;
}
}
}
obj.getName()(); // window
```
在上面这段代码中,obj.getName()()实际上是在全局作用域中调用了匿名函数,this指向了window。
window才是匿名函数功能执行的环境。
如果想使this指向外部函数的执行环境,可以这样改写:
```
let name = "window";
let obj = {
name: 'object',
getName: function() {
var that = this;
return function() {
return that.name;
}
}
}
obj.getName()();
```
## 函数内部的定时器
当函数内部的定时器引用了外部函数的变量对象时,该变量对象不会被销毁。
```
(function() {
let a = 0;
setInterval(function(){
console.log(a++);
}, 1000)
})()
```
## 闭包的用途
- 模拟块级作用域
```
var isShow = true;
if(isShow){
var a=1000;
console.log(a);
}
console.log(a); // 在if定义的变量在外部可以访问
```
```
(function(){ // a在外部就不认识啦
var isShow = true;
if(isShow){
var a=10000;
console.log(a);
}
})();
console.log(a); // 报错,无法访问
```
- 让变量的值始终保持在内存中,对结果进行缓存
```
function fn(){
let count = 0;
return function(){
count++;
return count;
}
}
let add=fn();
add(); // 1
add(); // 2
add(); // 3
```
- 封装工具函数
```
let counter = (function(){
let privateCounter = 0; // 私有变量
function change(val){
privateCounter += val;
}
return {
increment:function(){ // 三个闭包共享一个词法环境
change(1);
},
decrement:function(){
change(-1);
},
value:function(){
return privateCounter;
}
};
})();
counter.value(); // 0
counter.increment();
counter.value(); // 1
```
17、 [nodejs] [用过nginx吗?nginx负载均衡如何实现?](https://github.com/daily-interview/fe-interview/issues/17)
## 什么是nginx?
Nginx("engine x")是一款是由俄罗斯的程序设计师Igor Sysoev所开发高性能的Web和反向代理服务器,也是一个 IMAP/POP3/SMTP 代理服务器。
在高连接并发的情况下,Nginx是Apache服务器不错的替代品。
## nginx服务器基本特征
- 处理静态文件,索引文件以及自动索引;打开文件描述符缓冲
- 无缓存的反向代理加速,简单的负载均衡和容错
- FastCGI,简单的负载均衡和容错
- 模块化的结构。包括gzipping, byte ranges, chunked responses,以及 SSI-filter等filter。如果由FastCGI或 其它代理服务器处理单页中存在的多个SSI,则这项处理可以并行运行,而不需要相互等待
- 支持SSL 和 TLSSNI
## nginx常用功能
1、Http代理,反向代理:作为web服务器最常用的功能之一,尤其是反向代理。
Nginx在做反向代理时,提供性能稳定,并且能够提供配置灵活的转发功能。Nginx可以根据不同的正则匹配,采取不同的转发策略,比如图片文件结尾的走文件服务器,动态页面走web服务器,只要你正则写的没问题,又有相对应的服务器解决方案,你就可以随心所欲的玩。并且Nginx对返回结果进行错误页跳转,异常判断等。如果被分发的服务器存在异常,他可以将请求重新转发给另外一台服务器,然后自动去除异常服务器。
2、负载均衡
Nginx的负载均衡是通过upstream实现的。
eg.
```
upstream test.aaa {
ip_hash; ## 调度算法
server 192.168.1.10:80;
server 192.168.1.11:80 down;
server 192.168.1.12:8009 max_fails=3 fail_timeout=20s;
server 192.168.1.13:8080;
}
server {
listen 80;
server_name localhost;
location / {
proxy_pass http://test.aaa;
}
}
```
upstream 支持的负载均衡算法:
- 轮询(默认)
> 每个请求按时间顺序逐一分配到不同的后端服务器,如果后端某台服务器宕机,故障系统被自动剔除,使用户访问不受影响。
- weight
> 指定轮询几率,weight和访问比率成正比,用于后端服务器性能不均的情况。
- fair(第三方)
> 按后端服务器的响应时间来分配请求,响应时间短的优先分配。Nginx本身是不支持fair的,如果需要使用这种调度算法,必须下载Nginx的upstream_fair模块。
- url_hash(第三方)
> 按访问URL的hash结果来分配请求,使每个URL定向到同一个后端服务器,后端服务器为缓存时比较适用。另外,在upstream中加入hash语句后,server语句不能写入weight等其他参数。Nginx本身是不支持url_hash的,如果需要使用这种调度算法,必须安装Nginx 的hash软件包。
upstream 支持的状态参数
- down,表示当前的server暂时不参与负载均衡。
- backup,预留的备份机器。当其他所有的非backup机器出现故障或者忙的时候,才会请求backup机器,因此这台机器的压力最轻。
- max_fails,允许请求失败的次数,默认为1。当超过最大次数时,返回proxy_next_upstream 模块定义的错误。
- fail_timeout,在经历了max_fails次失败后,暂停服务的时间。max_fails可以和fail_timeout一起使用。
> 注,当负载调度算法为ip_hash时,后端服务器在负载均衡调度中的状态不能是weight和backup。
3、web缓存
Nginx可以对不同的文件做不同的缓存处理,配置灵活,并且支持FastCGI_Cache,主要用于对FastCGI的动态程序进行缓存。配合着第三方的ngx_cache_purge,对制定的URL缓存内容可以的进行增删管理。
18、 [h5] [如果图片加载失败,如何做统一处理及优化?](https://github.com/daily-interview/fe-interview/issues/18)
## 项目中遇到的问题
在实际项目中,不可避免的会遇到在页面中加载大量图片,但可能由于网络问题,或者图片文件缺失等问题,导致图片不能正常展示。
我们希望有一种降级处理的方式,可以在图片加载失败后显示一张我们预先设定好的默认图片。
## 如何解决
> 监听图片的 error 事件
由于图片加载失败后,会抛出一个 error 事件,我们可以通过监听 error 事件的方式来对图片进行降级处理。
```
<img id="img" src="//xxx/img.png">
let img = document.getElementById('img');
img.addEventListener('error',function(e){
e.target.src = '//xxx/default.png'; // 为当前图片设定默认图
})
```
这种方式,确实实现了对异常图片的降级处理,但每张图片都需要通过 JS 进行获取,并且监听 error 事件,对于大量图片的情况并不适用。
为此,我们可以使用内联事件来监听 error 事件
```
<img src="//xxx/img.png" onerror="this.src = '//xxx/default.png'">
```
我们可以看到,完全不需要单独去写 JS 的监听,我们就实现了异常图片的降级处理,但这种方式还不够好,因为我们仍然需要手动的向 img 标签中添加内联事件,在实际开发过程中,很难保证每张图片都不漏写。
## 优化方案
> 通过在全局监听的方式,来对异常图片做降级处理
DOM2级事件规定事件流包含三个阶段:
事件捕获阶段
处于目标阶段
事件冒泡阶段
首先发生的是事件捕获,为截获事件提供了机会。然后是实际的目标接收到的事件。最后一个阶段是冒泡阶段。
我们上文中的监听图片自身的 error 事件,实际上在事件流中是处于目标阶段。
对于 img 的 error 事件来说,是无法冒泡的,但是是可以捕获的,我们的实现如下:
```
window.addEventListener('error',function(e){
// 当前异常是由图片加载异常引起的
if( e.target.tagName.toUpperCase() === 'IMG' ){
e.target.src = '//xxx/default.jpg';
}
},true)
```
最后,我们在思考一个问题,当网络出现异常的时候,必然会出现什么网络图片都无法加载的情况,这样就会导致我们监听的 error 事件。被无限触发,所以我们可以设定一个计数器,当达到期望的错误次数时停止对图片赋予默认图片的操作,改为提供一个Base64的图片。
实现起来也很简单,如下:
```
window.addEventListener('error',function(e){
let target = e.target, // 当前dom节点
tagName = target.tagName,
count = Number(target.dataset.count ) || 0, // 以失败的次数,默认为0
max= 3; // 总失败次数,此时设定为3
// 当前异常是由图片加载异常引起的
if( tagName.toUpperCase() === 'IMG' ){
if(count >= max){
target.src = 'data:image/jpeg;base64,/9j/4AAQSkZJRgABAQEAYABgAAD//AK3/ALYH+5hX6FV5N4Y/5GHwx/vyf+iJa9ZrysPhoYVShDZu/potDmwWFhhIzhT2bv6aLQ//Z';
}else{
target.dataset.count = count + 1;
target.src = '//xxx/default.jpg';
}
}
},true)
```
19、 [js] [从输入URL到页面加载完成发生了什么?](https://github.com/daily-interview/fe-interview/issues/19)
## 从输入 URL 到页面加载完成的过程
- 首先通过DNS解析获得网址的对应IP地址,如果这一步做了智能 DNS 解析的话,会提供访问速度最快的 IP 地址。
- 接下来是 TCP 握手 (3次握手4次挥手),应用层会下发数据给传输层,这里 TCP 协议会指明两端的端口号,然后下发给网络层。网络层中的 IP 协议会确定 IP 地址,并且指示了数据传输中如何跳转路由器。然后包会再被封装到数据链路层的数据帧结构中,最后就是物理层面的传输了。
- TCP 握手结束后会进行 TLS 握手,然后就开始正式的传输数据。
- 数据在进入服务端之前,可能还会先经过负责负载均衡的服务器,它的作用就是将请求合理的分发到多台服务器上,这时假设服务端会响应一个 HTML 文件。
- 首先浏览器会判断状态码是什么,如果是 200 那就继续解析,如果 400 或 500 的话就会报错,如果 300 的话会进行重定向,这里会有个重定向计数器,避免过多次的重定向,超过次数也会报错。
- 浏览器开始解析文件,如果是 gzip 格式的话会先解压一下,然后通过文件的编码格式知道该如何去解码文件。
- 文件解码成功后会正式开始渲染流程,先会根据 HTML 构建 DOM 树,有 CSS 的话会去构建 CSSOM 树。如果遇到 script 标签的话,会判断是否存在 async 或者 defer ,前者会并行进行下载并执行 JS,后者会先下载文件,然后等待 HTML 解析完成后顺序执行,如果以上都没有,就会阻塞住渲染流程直到 JS 执行完毕。遇到文件下载的会去下载文件,这里如果使用 HTTP 2.0 协议的话会极大的提高多图的下载效率。
- 初始的 HTML 被完全加载和解析后会触发 DOMContentLoaded 事件。
- CSSOM 树和 DOM 树构建完成后会开始生成 Render 树,这一步就是确定页面元素的布局、样式等等诸多方面的东西。
- 在生成 Render 树的过程中,浏览器就开始调用 GPU 绘制,合成图层,将内容显示在屏幕上。
| 22.988606 | 281 | 0.67636 | yue_Hant | 0.628777 |
053420ac1e4dab75305def245df611878ce278c8 | 218 | md | Markdown | _watches/M20200228_060909_TLP_2.md | Meteoros-Floripa/meteoros.floripa.br | 7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad | [
"MIT"
] | 5 | 2020-01-22T17:44:06.000Z | 2020-01-26T17:57:58.000Z | _watches/M20200228_060909_TLP_2.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | null | null | null | _watches/M20200228_060909_TLP_2.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | 2 | 2020-05-19T17:06:27.000Z | 2020-09-04T00:00:43.000Z | ---
layout: watch
title: TLP2 - 28/02/2020 - M20200228_060909_TLP_2T.jpg
date: 2020-02-28 06:09:09
permalink: /2020/02/28/watch/M20200228_060909_TLP_2
capture: TLP2/2020/202002/20200227/M20200228_060909_TLP_2T.jpg
---
| 27.25 | 62 | 0.784404 | eng_Latn | 0.04112 |
05349cdd677aad402fe6f82c62ce13656d6b49a8 | 4,626 | md | Markdown | _posts/2021-06-24-3261e6107adbefc58955de2aa613847a.md | develup4/develup4.github.io | 4478d54f11b3aa4fff75ed396ba38d752575ac41 | [
"MIT"
] | null | null | null | _posts/2021-06-24-3261e6107adbefc58955de2aa613847a.md | develup4/develup4.github.io | 4478d54f11b3aa4fff75ed396ba38d752575ac41 | [
"MIT"
] | null | null | null | _posts/2021-06-24-3261e6107adbefc58955de2aa613847a.md | develup4/develup4.github.io | 4478d54f11b3aa4fff75ed396ba38d752575ac41 | [
"MIT"
] | null | null | null | ---
title: Logistic Regression
categories: machine_learning
tags: logistic_regression
---
앞서 말했듯 Logistic Regression은 분류를 목적으로 한다 (주어진 입력에 따라 discrete한 클래스를 추정한다)
=> Binary Classification(0 or 1, False or True)
이를 위해서 기준이 되는 임계함수를 두는데,
(계단 모양이라서 0 아니면 1인 값을 가지게 된다)

다만, 이전의 Linear regression 모델에서 바로 임계함수를 적용하면 아래와 같은 문제가 발생할 수 있다.

0.5를 기준으로 악성인 5, 6번이 새로운 데이터에 의해 양성으로 바껴버릴 수 있다.
직선을 기반으로는 나누기가 어려운 것이다.

그래서 중간에 활성화 함수 단계를 두어서 이를 해결한다.
**(Sigmoid 함수**…스타트업 드라마에서도 남주혁이 쓰더라)


이렇게 생긴 함수이며 아래와 같은 수식으로 표현한다.
1 / (1 + e^-x)
학습할 변수를 추가하면,
1/(1 + e^-(WX+b))
이렇게 될 것이다.
이 경우 Linear Regression처럼 아름다운 Convex 형태(극점이 유일)가 아니기 때문에,
(이전에는 그냥 2차함수의 그래프였을 뿐이므로)
Cost function에 Gradient Descent방식을 사용하기가 어렵다.

그래서 Binary Cross Entropy Error 방식을 사용한다.
엔트로피란다…점점 어려운 느낌이 들지만…여차저차해서 계산이 끝나면 쉬운 포맷이 나오더라. 참자.
결과적으로 아래와 같은 형태의 비용함수가 나오게 되는 것이고,

실제 정답(label)의 확률 분포와 h(x)의 확률분포를 cost로 계산하면,
cost가 0에서 무한대까지의 형태로 표현이 가능하고 위와 같이 극점이 하나인 형태로 나타낼 수 있는 것이다.
즉, 여기서부터는 로그함수의 형태라서 이전처럼 Gradient Descent를 이용할 수 있다.

수식으로 표현하면 cost function은 위와 같다.
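In its standard form, the binary cross-entropy cost referred to here is (assuming m samples, labels y ∈ {0, 1}, and sigmoid predictions h(x)):

$$
\text{cost} = -\frac{1}{m}\sum_{i=1}^{m}\Big[\,y^{(i)}\log h(x^{(i)}) + \big(1-y^{(i)}\big)\log\big(1-h(x^{(i)})\big)\Big]
$$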
이제 cost에 대해서 미분을 해서 역전파를 해야하는데 이런 저런 계산을 거치고나면,

기존 y_hat 부분이 **a(sigmoid 함수의 결과값)**로 바뀌었을뿐 계산법은 같다.
```python
import matplotlib.pyplot as plt
import numpy as np
class Neuron:
def __init__(self):
self.w = 1.0 # 가중치를 초기화합니다
self.b = 1.0 # 절편을 초기화합니다
def forpass(self, x):
z = x * self.w + self.b # 직선 방정식을 계산합니다
return z
def backprop(self, x, err):
w_grad = x * err # 가중치에 대한 그래디언트를 계산합니다
b_grad = 1 * err # 절편에 대한 그래디언트를 계산합니다
return w_grad, b_grad
def activation(self, z):
a = z;
a = 1/(1+np.exp(-z))
return a
def fit(self, x, y, epochs=200):
for i in range(epochs): # 에포크만큼 반복합니다
for x_i, y_i in zip(x, y): # 모든 샘플에 대해 반복합니다
z = self.forpass(x_i) # 정방향 계산
a = self.activation(z)
err = a - y_i # 오차 계산
w_grad, b_grad = self.backprop(x_i, err) # 역방향 계산
self.w -= 0.1*w_grad # 가중치 업데이트
self.b -= 0.1*b_grad # 절편 업데이트
# x = np.array([1,2,3,4,5,6,7,8])
# y = np.array([0,0,0,0,1,1,1,1])
x = np.array([1,2,3,4,5,6,7,8,20])
y = np.array([0,0,0,0,1,1,1,1,1])
neuron = Neuron()
neuron.fit(x, y)
for xi, yi in zip(x,y):
plt.plot(xi,yi,"rx")
for x_i in x:
y_hat = neuron.forpass(x_i)
a = neuron.activation(y_hat)
if( a >= 0.5 ):
print("%d : 악성종양"%x_i)
else:
print("%d : 양성종양"%x_i)
x = np.arange(0,x[-1],0.1)
y_temp = []
for i, x_i in enumerate(x):
y_hat = neuron.forpass(x_i)
a = neuron.activation(y_hat)
y_temp.append(a)
plt.plot(x,y_temp)
plt.show()
```
따라서 코드도 크게 다르지 않다.
forpass 결과값에 대해서 sigmoid(activation) 함수를 돌려서 a값을 얻어내고 이것을 이용하는 형태이다.

이렇게 했더니 새로운 데이터에도 끄떡없게 된다.
결과적으로는 구현과 수식에 있어서 크게 다른게 없었다.
이제 저기서 절반을 나눠서 양성, 음성 등으로 구분을 하면 될 것이다.
| 29.278481 | 127 | 0.603113 | kor_Hang | 0.999921 |
05350b4b944fae5160b35a17bea5dae69b7fcb4f | 259 | md | Markdown | iot-c-reference/iot-c-ref-iothubtransportamqp-websockets-h/amqp-protocol-over-websocketstls.md | nschonni/azure-reference-other | 6ab55a08d43984965d4e75fc8ebfa8e477819cf1 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | iot-c-reference/iot-c-ref-iothubtransportamqp-websockets-h/amqp-protocol-over-websocketstls.md | nschonni/azure-reference-other | 6ab55a08d43984965d4e75fc8ebfa8e477819cf1 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | iot-c-reference/iot-c-ref-iothubtransportamqp-websockets-h/amqp-protocol-over-websocketstls.md | nschonni/azure-reference-other | 6ab55a08d43984965d4e75fc8ebfa8e477819cf1 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | # AMQP_Protocol_over_WebSocketsTls()
## Syntax
\#include "[azure-iot-sdk-c/iothub_client/inc/iothubtransportamqp_websockets.h](../iot-c-ref-iothubtransportamqp-websockets-h.md)"
```C
const TRANSPORT_PROVIDER* AMQP_Protocol_over_WebSocketsTls(void);
```
| 23.545455 | 132 | 0.783784 | yue_Hant | 0.456269 |
05356204a45bbcc09673fceb4218d0b2d56d0104 | 530 | md | Markdown | docs/Model/Destiny/Components/PlugSets/DestinyPlugSetsComponent.md | Yogarine/bungie-sdk-php | 93c643afc4e1d951964ec6ce8194818c1a152a57 | [
"BSD-3-Clause"
] | 4 | 2019-06-22T19:01:02.000Z | 2020-12-21T13:25:06.000Z | docs/Model/Destiny/Components/PlugSets/DestinyPlugSetsComponent.md | Yogarine/bungie-sdk-php | 93c643afc4e1d951964ec6ce8194818c1a152a57 | [
"BSD-3-Clause"
] | null | null | null | docs/Model/Destiny/Components/PlugSets/DestinyPlugSetsComponent.md | Yogarine/bungie-sdk-php | 93c643afc4e1d951964ec6ce8194818c1a152a57 | [
"BSD-3-Clause"
] | null | null | null | # DestinyPlugSetsComponent
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**plugs** | [**map[string,\Bungie\Model\Destiny\Sockets\DestinyItemPlug[]]**](array.md) | The shared list of plugs for each relevant PlugSet, keyed by the hash identifier of the PlugSet (DestinyPlugSetDefinition). | [optional]
[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
| 48.181818 | 227 | 0.64717 | yue_Hant | 0.309892 |
05358f751d5e884b1e675918f2599465be22de23 | 4,208 | md | Markdown | posts/understanding-typescripts-exclude.md | antonholmberg/my-blog | d9569651d41d9b2e9144a7279211b7b460cb37c0 | [
"MIT"
] | null | null | null | posts/understanding-typescripts-exclude.md | antonholmberg/my-blog | d9569651d41d9b2e9144a7279211b7b460cb37c0 | [
"MIT"
] | 10 | 2021-03-01T20:23:01.000Z | 2022-02-26T01:36:30.000Z | posts/understanding-typescripts-exclude.md | antonholmberg/my-blog | d9569651d41d9b2e9144a7279211b7b460cb37c0 | [
"MIT"
] | null | null | null | ---
path: '/posts/understanding-typescripts-exclude'
date: 2019-07-4:17:25.962Z
title: 'Understanding TypeScripts Exclude'
description: 'Tackling one the more advanced type featues in typescript, Exclude.'
---
I recently started to do more TypeScript. I have plenty of previous experience
with typed languages but there were still some things in TypeScript that I didn't
really feel comfortable with at first.
### That Weird Exclude Type
While reading release notes for TypeScript 2.8 I stumbled across _Omit_. Not
knowing what it was I set out to understand it. However, the problem grew since
I found that _Omit_ was defined as a combination of _Pick_ and _Exclude_. I just
couldn't for the life of me figure out what _Exclude_ did.
Most of the articles I found about _Exclude_ would show an example of how it was
used in conjunction with another type. It felt like they sort of assumed that
the reader already knew what _Exclude_ did.
### Lets Start With Union Types
So TypeScript has this awesome feature called _union types_. I think it is
easier to show an example of a _union type_ rather than explaining it in text.
```TypeScript
type Language = "swedish" | "danish" | "english" | "french";
const firstLanguage: Language = "swedish";
const secondLanguage: Language = "english";
// Will not compile
const thirdLanguage = "meowing"
```
So in the example above we create a type called _Language_. A variable of type
_Language_ can now only be one of the languages we defined in the type. In this
case _meowing_ is not an acceptable language and therefore the program above
will not compile.
### So What Is This Exclude Thing?
This is when _Exclude_ comes in. _Exclude_ takes two _union types_ and, sort of,
subtracts the values in the second _union type_ from the first _union type_.
```TypeScript
type Language = "swedish" | "danish" | "english" | "french";
type NordicLanguage = Exclude<Language, "english" | "french">;
const firstLanguage: NordicLanguage = "swedish";
// This will not compile
const secondLanguage: NordicLanguage = "english";
```
So in the above example we create another type called _NordicLanguage_. This
type can take on all the same values as _Language_ except for the excluded values
_english_ and _french_. This is more or less the same as writing.
```TypeScript
type Language = "swedish" | "danish" | "english" | "french";
type NordicLanguage = "swedish" | "danish";
```
### A Cool Use Case
So I recently had a problem where I had an object that contained multiple keys of
the same type. I also wanted to store which keys was currently
active/selected.
As it turned out; this perfect case for _Exclude_.
```TypeScript
type AvailableArea = Exclude<keyof Map, 'selectedArea'>;
type Climate = 'grass' | 'snow' | 'sand' | 'water';
interface Area {
climate: Climate;
}
interface Map {
selectedArea: AvailableArea;
north: Area;
south: Area;
west: Area;
east: Area;
}
```
The first thing that we need to understand if what _keyof_ means.
```TypeScript
// Same as: type keys = "selectedArea" | "north" | "south" | "west" | "east";
type keys = keyof Map;
interface Map {
selectedArea: AvailableArea;
north: Area;
south: Area;
west: Area;
east: Area;
}
```
So now that we have that down the question is: Do we really want _selectedArea_
to be able to refer to it self? In this case the answer was no. If I create a
_union type_ with the key names hard coded, what if I start adding more areas
like _southWest_? These questions lead me to the conclusion that probably it is
best if I use _Exclude_ here.
We know that _keyof_ returns a _union type_ where the values can be any of the
keys in the object. All we need to do now is to "exclude" _selectedArea_ and we
should be left with exactly what we want!
```typescript
type AvailableArea = Exclude<keyof Map, 'selectedArea'>;
```
This gives me the possibility to include more areas in the future and still keep
type safety throughout my application.
### Closing Thoughts
Hopefully someone found this useful in some way. Next time I might cover _Pick_
but there are plenty of tutorials out there for that and once I understood
_Exclude_ I found that _Pick_ wasn't that hard to grasp.
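
As a teaser for that follow-up: once _Exclude_ makes sense, the _Omit_ type that started this whole journey is just a one-liner combining it with _Pick_. The sketch below names it `MyOmit` to avoid clashing with the built-in, and reuses the `Map` interface from the earlier example:

```TypeScript
type MyOmit<T, K extends keyof any> = Pick<T, Exclude<keyof T, K>>;

// Same as { north: Area; south: Area; west: Area; east: Area }
type MapWithoutSelection = MyOmit<Map, 'selectedArea'>;
```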
| 32.875 | 82 | 0.756416 | eng_Latn | 0.999716 |
053627dabbf41af5edcd4e3160e7856d58f38e0d | 2,316 | md | Markdown | docs/analysis-services/scripting/properties/account-element-impersonationinfo-assl.md | sql-aus-hh/sql-docs.de-de | edfac31211cedb5d13440802f131a1e48934748a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/analysis-services/scripting/properties/account-element-impersonationinfo-assl.md | sql-aus-hh/sql-docs.de-de | edfac31211cedb5d13440802f131a1e48934748a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/analysis-services/scripting/properties/account-element-impersonationinfo-assl.md | sql-aus-hh/sql-docs.de-de | edfac31211cedb5d13440802f131a1e48934748a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Konto-Element (ImpersonationInfo) (ASSL) | Microsoft Docs
ms.date: 05/08/2018
ms.prod: sql
ms.technology: analysis-services
ms.custom: assl
ms.topic: reference
ms.author: owend
ms.reviewer: owend
author: minewiskan
manager: kfile
ms.openlocfilehash: ce73dcfc07da406c01f1df0edce01da360ecd49a
ms.sourcegitcommit: c12a7416d1996a3bcce3ebf4a3c9abe61b02fb9e
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 05/10/2018
ms.locfileid: "34035901"
---
# <a name="account-element-impersonationinfo-assl"></a>Account-Element (ImpersonationInfo) (ASSL)
[!INCLUDE[ssas-appliesto-sqlas](../../../includes/ssas-appliesto-sqlas.md)]
Enthält den Namen des Benutzerkontos für die [ImpersonationInfo](../../../analysis-services/scripting/data-type/impersonationinfo-data-type-assl.md) -Datentyp.
## <a name="syntax"></a>Syntax
```xml
<ImpersonationInfo
...
<Account>...</Account>
...
</Action>
```
## <a name="element-characteristics"></a>Elementmerkmale
|Merkmal|Beschreibung|
|--------------------|-----------------|
|Datentyp und -länge|String|
|Standardwert|Keine|
|Kardinalität|0-1: Optionales Element, das nur einmal auftreten kann.|
## <a name="element-relationships"></a>Elementbeziehungen
|Beziehung|Element|
|------------------|-------------|
|Übergeordnete Elemente|[ImpersonationInfo](../../../analysis-services/scripting/data-type/impersonationinfo-data-type-assl.md)|
|Untergeordnete Elemente|Keine|
## <a name="remarks"></a>Hinweise
Den Wert des der **Konto** Element als auch der Wert des der [Kennwort](../../../analysis-services/scripting/properties/password-element-assl.md) -Elements werden zum Zweck des Identitätswechsels verwendet, wenn der Wert des der [ImpersonationMode-Wert](../../../analysis-services/scripting/properties/impersonationmode-element-assl.md) -Element für jedes Element abgeleitet aus der **ImpersonationInfo** Datentyp wird festgelegt, um *ImpersonateAccount*.
## <a name="see-also"></a>Siehe auch
[DataSourceImpersonationInfo-Element (ASSL)](../../../analysis-services/scripting/properties/datasourceimpersonationinfo-element-assl.md)
[Datenbankeigenschaften & #40; ASSL & #41;](../../../analysis-services/scripting/properties/properties-assl.md)
| 40.631579 | 458 | 0.708117 | deu_Latn | 0.4422 |
05365264eb29a8c5ce6f8a91e75f9d37d18e2a47 | 273 | md | Markdown | _project/great-room-pictures.md | rumnamanya/rumnamanya.github.io | 2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9 | [
"MIT"
] | null | null | null | _project/great-room-pictures.md | rumnamanya/rumnamanya.github.io | 2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9 | [
"MIT"
] | null | null | null | _project/great-room-pictures.md | rumnamanya/rumnamanya.github.io | 2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9 | [
"MIT"
] | null | null | null | ---
layout: project_single
title: "Great Room Pictures"
slug: "great-room-pictures"
parent: "beautiful-living-room-for-your-dream-house"
---
<3 Our Dream Home <3 with Antlers Used For Decor & our country Style & A Beautiful View Of The Lake & Mountains (oh i wish one day) | 39 | 131 | 0.739927 | eng_Latn | 0.899429 |
0536ff8d5172ce1e79bd46cbcc1500b409b76e93 | 1,143 | md | Markdown | draft/pages/about.md | techoi/techoi-blog-lumen-custom | f55b38ff4fb66404a209eb19198dab9c2425ad37 | [
"MIT"
] | null | null | null | draft/pages/about.md | techoi/techoi-blog-lumen-custom | f55b38ff4fb66404a209eb19198dab9c2425ad37 | [
"MIT"
] | 8 | 2021-06-28T20:37:25.000Z | 2022-02-27T11:10:55.000Z | draft/pages/about.md | techoi/techoi-blog-lumen-custom | f55b38ff4fb66404a209eb19198dab9c2425ad37 | [
"MIT"
] | null | null | null | ---
title: "About me"
template: "page"
---
> ## *INTRODUCE*
Be a better Programmer가 되는 목표를 가지고 있습니다.
> ## *SKILLS*
- Front-end
- React.js
- React Hooks
- Vue.js
- javascript(ES5/ES6)
- HTML/CSS
- Typescript
- Next.js
- Sass
- Back-end
- Node.js
- Express.js
- AWS
- nginx
- PHP
- Apache
- Java
- Database
- MySQL
- PostgreSQL
- Redis
- MongoDB
- GraphQL
- ETC
- Ubuntu
- CentOS
- Docker
- Docker Swarm
- Kubernetes
- Vim
- Socket.io
- Git/Github
- VS Code
> ## *EXPERIENCE*
- ### 2018.11 ~ 현재
와그트래블(여행액티비티 플랫폼) 프론트엔드 개발자
- ### 2018.03 ~ 2018.12
일산대진고등학교 클러스터(대진고, 주엽고, 대화고) 안드로이드 프로그래밍 강의
- ### 2018.01 ~ 2018.11
핀플(소상공인 회계관리 서비스) 프론트엔드 개발자
- ### 2018.08 ~ 2018.01
파인애플소프트(머신 프레임워크) 개발자
- ### 2018.02 ~ 2018.07
쌍용정보통신 빅데이터 교육(반장)
- ### 2017.01 ~ 2017.11
웨딩의여신 BM기획 및 세일즈
- ### 2016.07 ~ 2016.12
벤플 사업개발
- ### 2012.03 ~ 2016. 06
대한민국 공군 방공관제사령부 재정처 계약담당/출납담당
> ## *PROJECTS*
- ### 콜메모
- ### 스크래핑
- ### 도큐메라
- ### 알람스
> ## *EDUCATION*
- ### 한동대학교 경제학/상담심리학 학사
> ## *ETC*
- ### 정보처리기사 | 14.2875 | 47 | 0.528434 | kor_Hang | 0.994486 |
0537feb7a930d8b22149d63d15452b23c529f7cc | 165 | md | Markdown | FUN.md | MitchellJThomas/cosign | 2ef684f8f0e5ea41d3483a0a84ea906c33d4ecbf | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | FUN.md | MitchellJThomas/cosign | 2ef684f8f0e5ea41d3483a0a84ea906c33d4ecbf | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2022-02-18T15:41:42.000Z | 2022-02-18T15:41:42.000Z | FUN.md | coyote240/cosign | a4cb262dc3d45a283a6a7513bb767a38a2d3f448 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # Fun Tips And Tricks!
## Signing Git Commits
Git commit signing has been broken out into its own project! Check out
https://github.com/sigstore/gitsign for more.
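For orientation only (the gitsign README is the authoritative source), a repository is typically pointed at gitsign with git configuration along these lines, assuming `gitsign` is installed and on your `PATH`:

```sh
# Sketch of a typical gitsign setup; verify the keys against the gitsign docs.
git config --local commit.gpgsign true       # sign all commits in this repo
git config --local tag.gpgsign true          # sign all tags in this repo
git config --local gpg.x509.program gitsign  # use gitsign as the signing program
git config --local gpg.format x509           # gitsign produces x509 signatures
```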
| 23.571429 | 70 | 0.763636 | eng_Latn | 0.944283 |
0539d74fe0f01ef08790b1ca289a37a9ccb17ed1 | 98 | md | Markdown | README.md | Galeria-Kaufhof/cassbus | 41b093de9a0fccbe5e8f4fe3122c478ed9477816 | [
"MIT"
] | 6 | 2016-03-14T10:03:53.000Z | 2016-03-14T12:13:12.000Z | README.md | Galeria-Kaufhof/cassbus | 41b093de9a0fccbe5e8f4fe3122c478ed9477816 | [
"MIT"
] | null | null | null | README.md | Galeria-Kaufhof/cassbus | 41b093de9a0fccbe5e8f4fe3122c478ed9477816 | [
"MIT"
] | null | null | null | # cassbus
A Cassandra-based message bus for your Ruby applications.
*Code to be provided soonish!*
| 19.6 | 55 | 0.795918 | eng_Latn | 0.995184 |
0539d84551056c2b3eb489b1e3d85edc387f518a | 959 | md | Markdown | README.md | marthagmoreno/pacman | 77b0e90e727e2e904e44df8d9d07ad5bcbaeec77 | [
"MIT"
] | null | null | null | README.md | marthagmoreno/pacman | 77b0e90e727e2e904e44df8d9d07ad5bcbaeec77 | [
"MIT"
] | null | null | null | README.md | marthagmoreno/pacman | 77b0e90e727e2e904e44df8d9d07ad5bcbaeec77 | [
"MIT"
] | null | null | null | # **PACMAN**
## Introduction
> This project displays a single pacman moving from side to side of the screen
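As background only, this is not the project's actual code, just one generic way such a side-to-side animation is often written in plain JavaScript (the `.pacman` selector and the speed value are assumptions):

```javascript
// Generic sketch: move an element back and forth across the screen.
const pacman = document.querySelector('.pacman'); // assumed class name
let x = 0;
let direction = 1;

function step() {
  x += direction * 2;                          // speed in pixels per frame
  if (x <= 0 || x >= window.innerWidth - 50) { // bounce at the screen edges
    direction *= -1;
  }
  pacman.style.transform = `translateX(${x}px)`;
  requestAnimationFrame(step);
}
requestAnimationFrame(step);
```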
## Installation
> Download ALL FILES to the same location/folder
> Drag & drop index.html into your web browser
## Built With
>- HTML
>- CSS
>- Javascript
## Links
>- [My Portfolio](https://github.com/martha-moreno/martha-moreno.github.io)
>- [PACMAN Repo](https://github.com/martha-moreno/pacman)
## License
> MIT
## Language Card Layout
[](https://github.com/martha-moreno/github-readme-stats)
## Contact
[<img src='https://cdn.jsdelivr.net/npm/[email protected]/icons/github.svg' alt='github' height='40'>](https://github.com/martha-moreno/martha-moreno.github.io) [<img src='https://cdn.jsdelivr.net/npm/[email protected]/icons/linkedin.svg' alt='linkedin' height='40'>](https://www.linkedin.com/in/martha-gissela-moreno/)
| 34.25 | 326 | 0.72367 | kor_Hang | 0.289854 |
053b7770e78269b89f301f0b82a0ff86977a224b | 39 | md | Markdown | README.md | redster51/milp-to-marketplace-news-converter | 2fa4fe77d38464f401a4a82fcf5dc1585eaa3fdd | [
"MIT"
] | null | null | null | README.md | redster51/milp-to-marketplace-news-converter | 2fa4fe77d38464f401a4a82fcf5dc1585eaa3fdd | [
"MIT"
] | null | null | null | README.md | redster51/milp-to-marketplace-news-converter | 2fa4fe77d38464f401a4a82fcf5dc1585eaa3fdd | [
"MIT"
] | null | null | null | # milp-to-marketplace-news-converter
| 13 | 36 | 0.769231 | eng_Latn | 0.62641 |
053ceeb2b98f41b4fddd6a6a35b6d579f5cf81e4 | 696 | md | Markdown | core/README.md | tglaeser/life | afc4467bc44e264864ab99e5cce87f66680e4995 | [
"Apache-2.0",
"MIT"
] | null | null | null | core/README.md | tglaeser/life | afc4467bc44e264864ab99e5cce87f66680e4995 | [
"Apache-2.0",
"MIT"
] | null | null | null | core/README.md | tglaeser/life | afc4467bc44e264864ab99e5cce87f66680e4995 | [
"Apache-2.0",
"MIT"
] | null | null | null | ## _Life_ Core Crate
[wasm-bindgen]:https://github.com/rustwasm/wasm-bindgen
[console_error_panic_hook]:https://github.com/rustwasm/console_error_panic_hook
[wee_alloc]:https://github.com/rustwasm/wee_alloc
#### Build Core Crate
```
$ wasm-pack build
```
#### Test in Headless Browser
```
$ wasm-pack test --headless --firefox
```
#### More Build Information
```
$ cargo --help
$ wasm-pack --help
```
#### Further Reading
- [`wasm-bindgen`][wasm-bindgen] - For communicating between WebAssembly and JavaScript.
- [`console_error_panic_hook`][console_error_panic_hook] - For logging panic messages to the developer console.
- [`wee_alloc`][wee_alloc] - An allocator optimized for small code size.
| 31.636364 | 111 | 0.738506 | eng_Latn | 0.632518 |
053cf091ae3e0d4f31566b655c42d02ad4f0a5c2 | 44,802 | md | Markdown | guidelines/list-info/o-packs-in-system-fedora-34.md | sensor-dream/APIVOF | d418819fd278da4fbe579c05c015c2582316e98b | [
"Apache-2.0"
] | null | null | null | guidelines/list-info/o-packs-in-system-fedora-34.md | sensor-dream/APIVOF | d418819fd278da4fbe579c05c015c2582316e98b | [
"Apache-2.0"
] | null | null | null | guidelines/list-info/o-packs-in-system-fedora-34.md | sensor-dream/APIVOF | d418819fd278da4fbe579c05c015c2582316e98b | [
"Apache-2.0"
] | null | null | null | # Information about installation packages beginning with the letter o
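Each entry below is formatted like the output of a package query; assuming a Fedora 34 host with DNF available, equivalent information for any single package can be reproduced with a command such as:

```sh
# Example: show the same metadata for one package (here, openssl-libs).
dnf info openssl-libs
```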
<details>
<summary>objectweb-asm</summary>
```
From repo : fedora
Short desc : Java bytecode manipulation and analysis framework
URL : http://asm.ow2.org/
License : BSD
Descript : ASM is an all purpose Java bytecode manipulation and analysis
: framework. It can be used to modify existing classes or dynamically
: generate classes, directly in binary form. Provided common
: transformations and analysis algorithms allow to easily assemble
: custom complex transformations and code analysis tools.
```
</details>
<details>
<summary>objenesis</summary>
```
From repo : fedora
Short desc : A library for instantiating Java objects
URL : http://objenesis.org/
License : ASL 2.0
Descript : Objenesis is a small Java library that serves one purpose: to instantiate
: a new object of a particular class.
: Java supports dynamic instantiation of classes using Class.newInstance();
: however, this only works if the class has an appropriate constructor. There
: are many times when a class cannot be instantiated this way, such as when
: the class contains constructors that require arguments, that have side effects,
: and/or that throw exceptions. As a result, it is common to see restrictions
: in libraries stating that classes must require a default constructor.
: Objenesis aims to overcome these restrictions by bypassing the constructor
: on object instantiation. Needing to instantiate an object without calling
: the constructor is a fairly specialized task, however there are certain cases
: when this is useful:
: * Serialization, Remoting and Persistence - Objects need to be instantiated
: and restored to a specific state, without invoking code.
: * Proxies, AOP Libraries and Mock Objects - Classes can be sub-classed without
: needing to worry about the super() constructor.
: * Container Frameworks - Objects can be dynamically instantiated in
: non-standard ways.
```
</details>
<details>
<summary>ocaml-srpm-macros</summary>
```
From repo : fedora
Short desc : OCaml architecture macros
License : GPLv2+
Descript : This package contains macros needed by RPM in order to build
: SRPMS. It does not pull in any other OCaml dependencies.
```
</details>
<details>
<summary>ocl-icd</summary>
```
From repo : fedora
Short desc : OpenCL Library (Installable Client Library) Bindings
URL : https://github.com/OCL-dev/ocl-icd/
License : BSD
Descript : OpenCL Library (Installable Client Library) Bindings.
```
</details>
<details>
<summary>ocl-icd</summary>
```
From repo : fedora
Short desc : OpenCL Library (Installable Client Library) Bindings
URL : https://github.com/OCL-dev/ocl-icd/
License : BSD
Descript : OpenCL Library (Installable Client Library) Bindings.
```
</details>
<details>
<summary>oddjob</summary>
```
From repo : anaconda
Short desc : A D-Bus service which runs odd jobs on behalf of client applications
URL : https://pagure.io/oddjob
License : BSD
Descript : oddjob is a D-Bus service which performs particular tasks for clients which
: connect to it and issue requests using the system-wide message bus.
```
</details>
<details>
<summary>oddjob-mkhomedir</summary>
```
From repo : anaconda
Short desc : An oddjob helper which creates and populates home directories
URL : https://pagure.io/oddjob
License : BSD
Descript : This package contains the oddjob helper which can be used by the
: pam_oddjob_mkhomedir module to create a home directory for a user
: at login-time.
```
</details>
<details>
<summary>ogdi</summary>
```
From repo : fedora
Short desc : Open Geographic Datastore Interface
URL : http://ogdi.sourceforge.net/
License : BSD
Descript : OGDI is the Open Geographic Datastore Interface. OGDI is an
: application programming interface (API) that uses a standardized
: access methods to work in conjunction with GIS software packages (the
: application) and various geospatial data products. OGDI uses a
: client/server architecture to facilitate the dissemination of
: geospatial data products over any TCP/IP network, and a
: driver-oriented approach to facilitate access to several geospatial
: data products/formats.
```
</details>
<details>
<summary>okteta</summary>
```
Epoch        : 1
From repo : fedora
Short desc : Binary/hex editor
URL : https://cgit.kde.org/okteta.git
License : GPLv2+ and GFDL
Descript : Okteta is a binary/hex editor for KDE
```
</details>
<details>
<summary>okteta-libs</summary>
```
Epoch        : 1
From repo : fedora
Short desc : Runtime libraries and kpart plugins for okteta
URL : https://cgit.kde.org/okteta.git
License : GPLv2+ and GFDL
Descript : Runtime libraries and kpart plugins for okteta.
```
</details>
<details>
<summary>oniguruma</summary>
```
From repo : anaconda
Short desc : Regular expressions library
URL : https://github.com/kkos/oniguruma/
License : BSD
Descript : Oniguruma is a regular expressions library.
: The characteristics of this library is that different character encoding
: for every regular expression object can be specified.
: (supported APIs: GNU regex, POSIX and Oniguruma native)
```
</details>
<details>
<summary>open-sans-fonts</summary>
```
From repo : fedora
Short desc : Open Sans is a humanist sans-serif typeface designed by Steve Matteson
URL : http://www.google.com/fonts/specimen/Open+Sans
License : ASL 2.0
Descript : Open Sans is a humanist sans serif typeface designed by Steve Matteson, Type
: Director of Ascender Corp. This version contains the complete 897 character
: set, which includes the standard ISO Latin 1, Latin CE, Greek and Cyrillic
: character sets. Open Sans was designed with an upright stress, open forms and
: a neutral, yet friendly appearance. It was optimized for print, web, and mobile
: interfaces, and has excellent legibility characteristics in its letter forms.
```
</details>
<details>
<summary>open-vm-tools</summary>
```
From repo : anaconda
Short desc : Open Virtual Machine Tools for virtual machines hosted on VMware
URL : https://github.com/vmware/open-vm-tools
License : GPLv2
Descript : The open-vm-tools project is an open source implementation of VMware Tools. It
: is a suite of open source virtualization utilities and drivers to improve the
: functionality, user experience and administration of VMware virtual machines.
: This package contains only the core user-space programs and libraries of
: open-vm-tools.
```
</details>
<details>
<summary>open-vm-tools</summary>
```
From repo : updates-testing
Short desc : Open Virtual Machine Tools for virtual machines hosted on VMware
URL : https://github.com/vmware/open-vm-tools
License : GPLv2
Descript : The open-vm-tools project is an open source implementation of VMware Tools. It
: is a suite of open source virtualization utilities and drivers to improve the
: functionality, user experience and administration of VMware virtual machines.
: This package contains only the core user-space programs and libraries of
: open-vm-tools.
```
</details>
<details>
<summary>open-vm-tools-desktop</summary>
```
From repo : updates-testing
Short desc : User experience components for Open Virtual Machine Tools
URL : https://github.com/vmware/open-vm-tools
License : GPLv2
Descript : This package contains only the user-space programs and libraries of
: open-vm-tools that are essential for improved user experience of VMware virtual
: machines.
```
</details>
<details>
<summary>openal-soft</summary>
```
From repo : fedora
Short desc : Open Audio Library
URL : http://openal-soft.org/
License : LGPLv2+
Descript : OpenAL Soft is a cross-platform software implementation of the OpenAL 3D
: audio API. It's built off of the open-sourced Windows version available
: originally from the SVN repository at openal.org. OpenAL provides
: capabilities for playing audio in a virtual 3d environment. Distance
: attenuation, doppler shift, and directional sound emitters are among
: the features handled by the API. More advanced effects, including air
: absorption, low-pass filters, and reverb, are available through the
: EFX extension. It also facilitates streaming audio, multi-channel buffers,
: and audio capture.
```
</details>
<details>
<summary>openal-soft</summary>
```
From repo : fedora
Short desc : Open Audio Library
URL : http://openal-soft.org/
License : LGPLv2+
Descript : OpenAL Soft is a cross-platform software implementation of the OpenAL 3D
: audio API. It's built off of the open-sourced Windows version available
: originally from the SVN repository at openal.org. OpenAL provides
: capabilities for playing audio in a virtual 3d environment. Distance
: attenuation, doppler shift, and directional sound emitters are among
: the features handled by the API. More advanced effects, including air
: absorption, low-pass filters, and reverb, are available through the
: EFX extension. It also facilitates streaming audio, multi-channel buffers,
: and audio capture.
```
</details>
<details>
<summary>openal-soft-qt</summary>
```
From repo : fedora
Short desc : Qt frontend for configuring OpenAL Soft
URL : http://openal-soft.org/
License : LGPLv2+
Descript : The openal-soft-qt package contains alsoft-config, a Qt-based tool
: for configuring OpenAL features.
```
</details>
<details>
<summary>openblas</summary>
```
From repo : anaconda
Short desc : An optimized BLAS library based on GotoBLAS2
URL : https://github.com/xianyi/OpenBLAS/
License : BSD
Descript :
: OpenBLAS is an optimized BLAS library based on GotoBLAS2 1.13 BSD
: version. The project is supported by the Lab of Parallel Software and
: Computational Science, ISCAS. http://www.rdcps.ac.cn
```
</details>
<details>
<summary>openblas-openmp</summary>
```
From repo : anaconda
Short desc : An optimized BLAS library based on GotoBLAS2, OpenMP version
URL : https://github.com/xianyi/OpenBLAS/
License : BSD
Descript :
: OpenBLAS is an optimized BLAS library based on GotoBLAS2 1.13 BSD
: version. The project is supported by the Lab of Parallel Software and
: Computational Science, ISCAS. http://www.rdcps.ac.cn
:
: This package contains the library compiled with OpenMP support with
: 32-bit integer interface.
```
</details>
<details>
<summary>openblas-openmp64</summary>
```
From repo : fedora
Short desc : An optimized BLAS library based on GotoBLAS2, OpenMP version
URL : https://github.com/xianyi/OpenBLAS/
License : BSD
Descript :
: OpenBLAS is an optimized BLAS library based on GotoBLAS2 1.13 BSD
: version. The project is supported by the Lab of Parallel Software and
: Computational Science, ISCAS. http://www.rdcps.ac.cn
:
: This package contains the library compiled with OpenMP support and
: 64-bit integer interface.
```
</details>
<details>
<summary>openblas-serial</summary>
```
From repo : fedora
Short desc : An optimized BLAS library based on GotoBLAS2, serial version
URL : https://github.com/xianyi/OpenBLAS/
License : BSD
Descript :
: OpenBLAS is an optimized BLAS library based on GotoBLAS2 1.13 BSD
: version. The project is supported by the Lab of Parallel Software and
: Computational Science, ISCAS. http://www.rdcps.ac.cn
:
: This package contains the sequential library compiled with a 32-bit
: integer interface.
```
</details>
<details>
<summary>openblas-srpm-macros</summary>
```
From repo : fedora
Short desc : OpenBLAS architecture macros
License : MIT
Descript : OpenBLAS architecture macros.
```
</details>
<details>
<summary>opencl-filesystem</summary>
```
From repo : fedora
Short desc : OpenCL filesystem layout
URL : http://www.khronos.org/registry/cl/
License : Public Domain
Descript : This package provides some directories required by packages which use OpenCL.
```
</details>
<details>
<summary>opencl-headers</summary>
```
From repo : fedora
Short desc : OpenCL (Open Computing Language) header files
URL : https://www.khronos.org/registry/cl/
License : MIT
Descript : OpenCL (Open Computing Language) header files.
```
</details>
<details>
<summary>opencl-utils</summary>
```
From repo : fedora
Short desc : Useful OpenCL tools and utilities
URL : http://code.google.com/p/opencl-utils
License : MIT
Descript :
: OpenCL Utils is a project that aims to create various tools and utilities to
: make the use of OpenCL more useful and efficient, such as: useful functions,
: optimization hints and common kernel templates. This package currently only
: contains CLRun, which allows for dynamic loading of OpenCL.
```
</details>
<details>
<summary>openconnect</summary>
```
From repo : anaconda
Short desc : Open client for Cisco AnyConnect VPN, Juniper Network Connect/Pulse, PAN GlobalProtect
URL : http://www.infradead.org/openconnect.html
License : LGPLv2+
Descript : This package provides a multiprotocol VPN client for Cisco AnyConnect,
: Juniper SSL VPN / Pulse Connect Secure, and Palo Alto Networks GlobalProtect
: SSL VPN.
```
</details>
<details>
<summary>opencore-amr</summary>
```
From repo : rpmfusion-free
Short desc : OpenCORE Adaptive Multi Rate Narrowband and Wideband speech lib
URL : http://sourceforge.net/projects/opencore-amr/
License : ASL 2.0
Descript : Library of OpenCORE Framework implementation of Adaptive Multi Rate Narrowband
: and Wideband speech codec.
```
</details>
<details>
<summary>opencryptoki-libs</summary>
```
From repo : fedora
Short desc : The run-time libraries for opencryptoki package
URL : https://github.com/opencryptoki/opencryptoki
License : CPL
Descript : Opencryptoki implements the PKCS#11 specification v2.11 for a set of
: cryptographic hardware, such as IBM 4764 and 4765 crypto cards, and the
: Trusted Platform Module (TPM) chip. Opencryptoki also brings a software
: token implementation that can be used without any cryptographic
: hardware.
: This package contains the PKCS#11 library implementation, and requires
: at least one token implementation (packaged separately) to be fully
: functional.
```
</details>
<details>
<summary>opencsd</summary>
```
From repo : fedora
Short desc : An open source CoreSight(tm) Trace Decode library
URL : https://github.com/Linaro/OpenCSD
License : BSD
Descript : This library provides an API suitable for the decode of ARM(r)
: CoreSight(tm) trace streams.
```
</details>
<details>
<summary>opencv-contrib</summary>
```
From repo : updates-testing
Short desc : OpenCV contributed functionality
URL : https://opencv.org
License : BSD
Descript : This package is intended for development of so-called "extra" modules, contributed
: functionality. New modules quite often do not have stable API, and they are not
: well-tested. Thus, they shouldn't be released as a part of official OpenCV
: distribution, since the library maintains binary compatibility, and tries
: to provide decent performance and stability.
```
</details>
<details>
<summary>opencv-core</summary>
```
From repo : updates-testing
Short desc : OpenCV core libraries
URL : https://opencv.org
License : BSD
Descript : This package contains the OpenCV C/C++ core libraries.
```
</details>
<details>
<summary>openexr-libs</summary>
```
From repo : anaconda
Short desc : OpenEXR Libraries
URL : https://www.openexr.com/
License : BSD
Descript : OpenEXR is an open-source high-dynamic-range floating-point image file format
: for high-quality image processing and storage. This document presents a brief
: overview of OpenEXR and explains concepts that are specific to this format.
:
: OpenEXR Features:
:
: * High dynamic range and color precision. Support for 16-bit floating-point,
: * 32-bit floating-point, and 32-bit integer pixels.
: * Multiple image compression algorithms, both lossless and lossy. Some of
: the included codecs can achieve 2:1 lossless compression ratios on images
: with film grain. The lossy codecs have been tuned for visual quality and
: decoding performance.
: * Extensibility. New compression codecs and image types can easily be added
: by extending the C++ classes included in the OpenEXR software distribution.
: New image attributes (strings, vectors, integers, etc.) can be added to
: OpenEXR image headers without affecting backward compatibility with existing
: OpenEXR applications.
: * Support for stereoscopic image workflows and a generalization
: to multi-views.
: * Flexible support for deep data: pixels can store a variable-length list
: of samples and, thus, it is possible to store multiple values at different
: depths for each pixel. Hard surfaces and volumetric data representations are
: accommodated.
: * Multipart: ability to encode separate, but related, images in one file.
: This allows for access to individual parts without the need to read other
: parts in the file.
: * Versioning: OpenEXR source allows for user configurable C++
: namespaces to provide protection when using multiple versions of the library
: in the same process space.
:
: The IlmBase Library:
:
: Also a part of OpenEXR, the IlmBase library is a basic, light-weight, and
: efficient representation of 2D and 3D vectors and matrices and other simple but
: useful mathematical objects, functions, and data types common in computer
: graphics applications, including the “half” 16-bit floating-point type.
```
</details>
<details>
<summary>openexr-libs</summary>
```
From repo : updates-testing
Short desc : OpenEXR Libraries
URL : https://www.openexr.com/
License : BSD
Descript : OpenEXR is an open-source high-dynamic-range floating-point image file format
: for high-quality image processing and storage. This document presents a brief
: overview of OpenEXR and explains concepts that are specific to this format.
:
: OpenEXR Features:
:
: * High dynamic range and color precision. Support for 16-bit floating-point,
: * 32-bit floating-point, and 32-bit integer pixels.
: * Multiple image compression algorithms, both lossless and lossy. Some of
: the included codecs can achieve 2:1 lossless compression ratios on images
: with film grain. The lossy codecs have been tuned for visual quality and
: decoding performance.
: * Extensibility. New compression codecs and image types can easily be added
: by extending the C++ classes included in the OpenEXR software distribution.
: New image attributes (strings, vectors, integers, etc.) can be added to
: OpenEXR image headers without affecting backward compatibility with existing
: OpenEXR applications.
: * Support for stereoscopic image workflows and a generalization
: to multi-views.
: * Flexible support for deep data: pixels can store a variable-length list
: of samples and, thus, it is possible to store multiple values at different
: depths for each pixel. Hard surfaces and volumetric data representations are
: accommodated.
: * Multipart: ability to encode separate, but related, images in one file.
: This allows for access to individual parts without the need to read other
: parts in the file.
: * Versioning: OpenEXR source allows for user configurable C++
: namespaces to provide protection when using multiple versions of the library
: in the same process space.
:
: The IlmBase Library:
:
: Also a part of OpenEXR, the IlmBase library is a basic, light-weight, and
: efficient representation of 2D and 3D vectors and matrices and other simple but
: useful mathematical objects, functions, and data types common in computer
: graphics applications, including the “half” 16-bit floating-point type.
```
</details>
<details>
<summary>openh264</summary>
```
From repo : fedora-cisco-openh264
Short desc : H.264 codec library
URL : http://www.openh264.org/
License : BSD
Descript : OpenH264 is a codec library which supports H.264 encoding and decoding. It is
: suitable for use in real time applications such as WebRTC.
```
</details>
<details>
<summary>openjpeg2</summary>
```
From repo : anaconda
Short desc : C-Library for JPEG 2000
URL : https://github.com/uclouvain/openjpeg
License : BSD and MIT
Descript : The OpenJPEG library is an open-source JPEG 2000 library developed in order to
: promote the use of JPEG 2000.
:
: This package contains
: * JPEG 2000 codec compliant with the Part 1 of the standard (Class-1 Profile-1
: compliance).
: * JP2 (JPEG 2000 standard Part 2 - Handling of JP2 boxes and extended multiple
: component transforms for multispectral and hyperspectral imagery)
```
</details>
<details>
<summary>openldap</summary>
```
From repo : updates
Short desc : LDAP support libraries
URL : http://www.openldap.org/
License : OpenLDAP
Descript : OpenLDAP is an open source suite of LDAP (Lightweight Directory Access
: Protocol) applications and development tools. LDAP is a set of
: protocols for accessing directory services (usually phone book style
: information, but other information is possible) over the Internet,
: similar to the way DNS (Domain Name System) information is propagated
: over the Internet. The openldap package contains configuration files,
: libraries, and documentation for OpenLDAP.
```
</details>
<details>
<summary>openldap</summary>
```
From repo : updates-testing
Short desc : LDAP support libraries
URL : http://www.openldap.org/
License : OpenLDAP
Descript : OpenLDAP is an open source suite of LDAP (Lightweight Directory Access
: Protocol) applications and development tools. LDAP is a set of
: protocols for accessing directory services (usually phone book style
: information, but other information is possible) over the Internet,
: similar to the way DNS (Domain Name System) information is propagated
: over the Internet. The openldap package contains configuration files,
: libraries, and documentation for OpenLDAP.
```
</details>
<details>
<summary>openldap-clients</summary>
```
From repo : updates
Short desc : LDAP client utilities
URL : http://www.openldap.org/
License : OpenLDAP
Descript : OpenLDAP is an open-source suite of LDAP (Lightweight Directory Access
: Protocol) applications and development tools. LDAP is a set of
: protocols for accessing directory services (usually phone book style
: information, but other information is possible) over the Internet,
: similar to the way DNS (Domain Name System) information is propagated
: over the Internet. The openldap-clients package contains the client
: programs needed for accessing and modifying OpenLDAP directories.
```
</details>
<details>
<summary>openldap-compat</summary>
```
From repo : updates
Short desc : Package providing legacy non-threded libldap
URL : http://www.openldap.org/
License : OpenLDAP
Descript : The openldap-compat package contains non-threaded variant of libldap
: which should not be used. Instead, applications should link to libldap_r
: which provides thread-safe variant with the very same API.
```
</details>
<details>
<summary>openldap-devel</summary>
```
From repo : updates
Short desc : LDAP development libraries and header files
URL : http://www.openldap.org/
License : OpenLDAP
Descript : The openldap-devel package includes the development libraries and
: header files needed for compiling applications that use LDAP
: (Lightweight Directory Access Protocol) internals. LDAP is a set of
: protocols for enabling directory services over the Internet. Install
: this package only if you plan to develop or will need to compile
: customized LDAP clients.
```
</details>
<details>
<summary>openpgm</summary>
```
From repo : fedora
Short desc : An implementation of the PGM reliable multicast protocol
URL : https://github.com/steve-o/openpgm
License : LGPLv2
Descript : OpenPGM is an open source implementation of the Pragmatic General
: Multicast (PGM) specification in RFC 3208.
```
</details>
<details>
<summary>opensc</summary>
```
From repo : anaconda
Short desc : Smart card library and applications
URL : https://github.com/OpenSC/OpenSC/wiki
License : LGPLv2+
Descript : OpenSC provides a set of libraries and utilities to work with smart cards. Its
: main focus is on cards that support cryptographic operations, and facilitate
: their use in security applications such as authentication, mail encryption and
: digital signatures. OpenSC implements the PKCS#11 API so applications
: supporting this API (such as Mozilla Firefox and Thunderbird) can use it. On
: the card OpenSC implements the PKCS#15 standard and aims to be compatible with
: every software/card that does so, too.
```
</details>
<details>
<summary>openslide</summary>
```
From repo : fedora
Short desc : C library for reading virtual slides
URL : http://openslide.org/
License : LGPLv2
Descript : The OpenSlide library allows programs to access virtual slide files
: regardless of the underlying image format.
```
</details>
<details>
<summary>openssh</summary>
```
From repo : anaconda
Short desc : An open source implementation of SSH protocol version 2
URL : http://www.openssh.com/portable.html
License : BSD
Descript : SSH (Secure SHell) is a program for logging into and executing
: commands on a remote machine. SSH is intended to replace rlogin and
: rsh, and to provide secure encrypted communications between two
: untrusted hosts over an insecure network. X11 connections and
: arbitrary TCP/IP ports can also be forwarded over the secure channel.
:
: OpenSSH is OpenBSD's version of the last free version of SSH, bringing
: it up to date in terms of security and features.
:
: This package includes the core files necessary for both the OpenSSH
: client and server. To make this package useful, you should also
: install openssh-clients, openssh-server, or both.
```
</details>
<details>
<summary>openssh</summary>
```
From repo : updates-testing
Short desc : An open source implementation of SSH protocol version 2
URL : http://www.openssh.com/portable.html
License : BSD
Descript : SSH (Secure SHell) is a program for logging into and executing
: commands on a remote machine. SSH is intended to replace rlogin and
: rsh, and to provide secure encrypted communications between two
: untrusted hosts over an insecure network. X11 connections and
: arbitrary TCP/IP ports can also be forwarded over the secure channel.
:
: OpenSSH is OpenBSD's version of the last free version of SSH, bringing
: it up to date in terms of security and features.
:
: This package includes the core files necessary for both the OpenSSH
: client and server. To make this package useful, you should also
: install openssh-clients, openssh-server, or both.
```
</details>
<details>
<summary>openssh-clients</summary>
```
From repo : anaconda
Short desc : An open source SSH client applications
URL : http://www.openssh.com/portable.html
License : BSD
Descript : OpenSSH is a free version of SSH (Secure SHell), a program for logging
: into and executing commands on a remote machine. This package includes
: the clients necessary to make encrypted connections to SSH servers.
```
</details>
<details>
<summary>openssh-clients</summary>
```
From repo : updates-testing
Short desc : An open source SSH client applications
URL : http://www.openssh.com/portable.html
License : BSD
Descript : OpenSSH is a free version of SSH (Secure SHell), a program for logging
: into and executing commands on a remote machine. This package includes
: the clients necessary to make encrypted connections to SSH servers.
```
</details>
<details>
<summary>openssh-server</summary>
```
From repo : updates-testing
Short desc : An open source SSH server daemon
URL : http://www.openssh.com/portable.html
License : BSD
Descript : OpenSSH is a free version of SSH (Secure SHell), a program for logging
: into and executing commands on a remote machine. This package contains
: the secure shell daemon (sshd). The sshd daemon allows SSH clients to
: securely connect to your SSH server.
```
</details>
<details>
<summary>openssl</summary>
```
Epoch        : 1
From repo : updates-testing
Short desc : Utilities from the general purpose cryptography library with TLS implementation
URL : http://www.openssl.org/
License : OpenSSL and ASL 2.0
Descript : The OpenSSL toolkit provides support for secure communications between
: machines. OpenSSL includes a certificate management tool and shared
: libraries which provide various cryptographic algorithms and
: protocols.
```
</details>
<details>
<summary>openssl-devel</summary>
```
Epoch        : 1
From repo : updates-testing
Short desc : Files for development of applications which will use OpenSSL
URL : http://www.openssl.org/
License : OpenSSL and ASL 2.0
Descript : OpenSSL is a toolkit for supporting cryptography. The openssl-devel
: package contains include files needed to develop applications which
: support various cryptographic algorithms and protocols.
```
</details>
<details>
<summary>openssl-gost-engine</summary>
```
From repo : fedora
Short desc : A reference implementation of the Russian GOST crypto algorithms for OpenSSL
URL : https://github.com/gost-engine/engine
License : OpenSSL
Descript : A reference implementation of the Russian GOST crypto algorithms for OpenSSL.
```
</details>
<details>
<summary>openssl-ibmpkcs11</summary>
```
From repo : fedora
Short desc : IBM OpenSSL PKCS#11 engine
URL : https://github.com/opencryptoki/openssl-ibmpkcs11
License : OpenSSL
Descript : This package contains a shared object OpenSSL dynamic engine for the use
: with a PKCS#11 implementation such as openCryptoki.
```
</details>
<details>
<summary>openssl-libs</summary>
```
Epoch        : 1
From repo : anaconda
Short desc : A general purpose cryptography library with TLS implementation
URL : http://www.openssl.org/
License : OpenSSL and ASL 2.0
Descript : OpenSSL is a toolkit for supporting cryptography. The openssl-libs
: package contains the libraries that are used by various applications which
: support cryptographic algorithms and protocols.
```
</details>
<details>
<summary>openssl-libs</summary>
```
Epoch        : 1
From repo : fedora
Short desc : A general purpose cryptography library with TLS implementation
URL : http://www.openssl.org/
License : OpenSSL and ASL 2.0
Descript : OpenSSL is a toolkit for supporting cryptography. The openssl-libs
: package contains the libraries that are used by various applications which
: support cryptographic algorithms and protocols.
```
</details>
<details>
<summary>openssl-libs</summary>
```
Epoch        : 1
From repo : updates-testing
Short desc : A general purpose cryptography library with TLS implementation
URL : http://www.openssl.org/
License : OpenSSL and ASL 2.0
Descript : OpenSSL is a toolkit for supporting cryptography. The openssl-libs
: package contains the libraries that are used by various applications which
: support cryptographic algorithms and protocols.
```
</details>
<details>
<summary>openssl-perl</summary>
```
Epoch        : 1
From repo : fedora
Short desc : Perl scripts provided with OpenSSL
URL : http://www.openssl.org/
License : OpenSSL and ASL 2.0
Descript : OpenSSL is a toolkit for supporting cryptography. The openssl-perl
: package provides Perl scripts for converting certificates and keys
: from other formats to the formats used by the OpenSSL toolkit.
```
</details>
<details>
<summary>openssl-pkcs11</summary>
```
From repo : fedora
Short desc : A PKCS#11 engine for use with OpenSSL
URL : https://github.com/OpenSC/libp11
License : LGPLv2+ and BSD
Descript : openssl-pkcs11 enables hardware security module (HSM), and smart card support in
: OpenSSL applications. More precisely, it is an OpenSSL engine which makes
: registered PKCS#11 modules available for OpenSSL applications. The engine is
: optional and can be loaded by configuration file, command line or through the
: OpenSSL ENGINE API.
```
</details>
<details>
<summary>openssl-pkcs11</summary>
```
From repo : anaconda
Short desc : A PKCS#11 engine for use with OpenSSL
URL : https://github.com/OpenSC/libp11
License : LGPLv2+ and BSD
Descript : openssl-pkcs11 enables hardware security module (HSM), and smart card support in
: OpenSSL applications. More precisely, it is an OpenSSL engine which makes
: registered PKCS#11 modules available for OpenSSL applications. The engine is
: optional and can be loaded by configuration file, command line or through the
: OpenSSL ENGINE API.
```
</details>
<details>
<summary>openssl-static</summary>
```
Epoch        : 1
From repo : fedora
Short desc : Libraries for static linking of applications which will use OpenSSL
URL : http://www.openssl.org/
License : OpenSSL and ASL 2.0
Descript : OpenSSL is a toolkit for supporting cryptography. The openssl-static
: package contains static libraries needed for static linking of
: applications which support various cryptographic algorithms and
: protocols.
```
</details>
<details>
<summary>opentest4j</summary>
```
From repo : fedora
Short desc : Open Test Alliance for the JVM
URL : https://github.com/ota4j-team/opentest4j
License : ASL 2.0
Descript : Open Test Alliance for the JVM is a minimal common foundation for
: testing libraries on the JVM. The primary goal of the project is to
: enable testing frameworks like JUnit, TestNG, Spock, etc. and
: third-party assertion libraries like Hamcrest, AssertJ, etc. to use a
: common set of exceptions that IDEs and build tools can support in a
: consistent manner across all testing scenarios -- for example, for
: consistent handling of failed assertions and failed assumptions as
: well as visualization of test execution in IDEs and reports.
```
</details>
<details>
<summary>openvpn</summary>
```
From repo : updates
Short desc : A full-featured TLS VPN solution
URL : https://community.openvpn.net/
License : GPLv2
Descript : OpenVPN is a robust and highly flexible tunneling application that uses all
: of the encryption, authentication, and certification features of the
: OpenSSL library to securely tunnel IP networks over a single UDP or TCP
: port. It can use the Marcus Franz Xaver Johannes Oberhumers LZO library
: for compression.
```
</details>
<details>
<summary>opus</summary>
```
From repo : fedora
Short desc : An audio codec for use in low-delay speech and audio communication
URL : https://www.opus-codec.org/
License : BSD
Descript : The Opus codec is designed for interactive speech and audio transmission over
: the Internet. It is designed by the IETF Codec Working Group and incorporates
: technology from Skype's SILK codec and Xiph.Org's CELT codec.
```
</details>
<details>
<summary>opus</summary>
```
From repo : anaconda
Short desc : An audio codec for use in low-delay speech and audio communication
URL : https://www.opus-codec.org/
License : BSD
Descript : The Opus codec is designed for interactive speech and audio transmission over
: the Internet. It is designed by the IETF Codec Working Group and incorporates
: technology from Skype's SILK codec and Xiph.Org's CELT codec.
```
</details>
<details>
<summary>opusfile</summary>
```
From repo : fedora
Short desc : A high-level API for decoding and seeking within .opus files
URL : https://www.opus-codec.org/
License : BSD
Descript : libopusfile provides a high-level API for decoding and seeking
: within .opus files. It includes:
: * Support for all files with at least one Opus stream (including
: multichannel files or Ogg files where Opus is muxed with something else).
: * Full support, including seeking, for chained files.
: * A simple stereo downmixing API (allowing chained files to be
: decoded with a single output format, even if the channel count changes).
: * Support for reading from a file, memory buffer, or over HTTP(S)
: (including seeking).
: * Support for both random access and streaming data sources.
```
</details>
<details>
<summary>orc</summary>
```
From repo : fedora
Short desc : The Oil Run-time Compiler
URL : http://cgit.freedesktop.org/gstreamer/orc/
License : BSD
Descript : Orc is a library and set of tools for compiling and executing
: very simple programs that operate on arrays of data. The "language"
: is a generic assembly language that represents many of the features
: available in SIMD architectures, including saturated addition and
: subtraction, and many arithmetic operations.
```
</details>
<details>
<summary>orc</summary>
```
From repo : anaconda
Short desc : The Oil Run-time Compiler
URL : http://cgit.freedesktop.org/gstreamer/orc/
License : BSD
Descript : Orc is a library and set of tools for compiling and executing
: very simple programs that operate on arrays of data. The "language"
: is a generic assembly language that represents many of the features
: available in SIMD architectures, including saturated addition and
: subtraction, and many arithmetic operations.
```
</details>
<details>
<summary>orca</summary>
```
From repo : updates-testing
Short desc : Assistive technology for people with visual impairments
URL : https://wiki.gnome.org/Projects/Orca
License : LGPLv2+
Descript : Orca is a screen reader that provides access to the graphical desktop via
: user-customizable combinations of speech and/or braille. Orca works with
: applications and toolkits that support the assistive technology service
: provider interface (AT-SPI), e.g. the GNOME desktop.
```
</details>
<details>
<summary>os-prober</summary>
```
From repo : anaconda
Short desc : Probes disks on the system for installed operating systems
URL : http://kitenet.net/~joey/code/os-prober/
License : GPLv2+ and GPL+
Descript : This package detects other OSes available on a system and outputs the results
: in a generic machine-readable format. Support for new OSes and Linux
: distributions can be added easily.
```
</details>
<details>
<summary>osinfo-db</summary>
```
From repo : updates-testing
Short desc : osinfo database files
URL : http://libosinfo.org/
License : LGPLv2+
Descript : The osinfo database provides information about operating systems and
: hypervisor platforms to facilitate the automated configuration and
: provisioning of new virtual machines
```
</details>
<details>
<summary>osinfo-db-tools</summary>
```
From repo : anaconda
Short desc : Tools for managing the osinfo database
URL : http://libosinfo.org/
License : GPLv2+
Descript : This package provides tools for managing the osinfo database of
: information about operating systems for use with virtualization
```
</details>
<details>
<summary>ostree</summary>
```
From repo : updates
Short desc : Tool for managing bootable, immutable filesystem trees
URL : https://ostree.readthedocs.io/en/latest/
License : LGPLv2+
Descript : libostree is a shared library designed primarily for
: use by higher level tools to manage host systems (e.g. rpm-ostree),
: as well as container tools like flatpak and the atomic CLI.
```
</details>
<details>
<summary>ostree-devel</summary>
```
From repo : updates
Short desc : Development headers for ostree
URL : https://ostree.readthedocs.io/en/latest/
License : LGPLv2+
Descript : The ostree-devel package includes the header files for the ostree library.
```
</details>
<details>
<summary>ostree-libs</summary>
```
From repo : updates
Short desc : Development headers for ostree
URL : https://ostree.readthedocs.io/en/latest/
License : LGPLv2+
Descript : The ostree-libs provides shared libraries for ostree.
```
</details>
| 34.838258 | 101 | 0.665484 | eng_Latn | 0.975272 |
053ef688e74b8ab3848236a760391c4811b3d65e | 1,052 | md | Markdown | _posts/2016-4-29-Established-Connections.md | plottico/plottico.github.io | 262ff928376c9480cdc829cb1127dcc49dacae99 | [
"MIT"
] | null | null | null | _posts/2016-4-29-Established-Connections.md | plottico/plottico.github.io | 262ff928376c9480cdc829cb1127dcc49dacae99 | [
"MIT"
] | null | null | null | _posts/2016-4-29-Established-Connections.md | plottico/plottico.github.io | 262ff928376c9480cdc829cb1127dcc49dacae99 | [
"MIT"
] | null | null | null | ---
layout: post
title: Linux established connections
comments: true
snippet: true
---
Plot the number of established connections on a Linux box.
<object data="https://plotti.co/plotticonn/300x80.svg" type="image/svg+xml"></object>
Paste this script into console as user root:
```bash
#!/bin/bash
while true; do
wget -O /dev/null -q "https://plotti.co/YOUR_HASH?k=YOUR_KEY&d=`netstat -tn | grep ESTAB | grep -v 127.0.0.1 | wc -l`,established_connections"
sleep 1
done &
```
To automatically run it at startup in ubuntu/debian, paste this into a root shell:
```bash
sed -i "/exit 0/d" /etc/rc.local # remove exit 0
cat >> /etc/rc.local << 'EOF'
while true; do
wget -O /dev/null -q "https://plotti.co/YOUR_HASH?k=YOUR_KEY&d=`netstat -tn | grep ESTAB | grep -v 127.0.0.1 | wc -l`,established_connections"
sleep 1
done & # fork to background
exit 0
EOF
```
Notice that the commands use `| grep -v 127.0.0.1` to exclude local connections from the counter; if you want to count these too, you can remove that part of the pipeline.
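For example, the counting part of the command, with the local-connection filter removed, reduces to:

```bash
# Count all established TCP connections, including loopback ones.
netstat -tn | grep ESTAB | wc -l
```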
Tags: bash, shell, linux | 28.432432 | 168 | 0.711977 | eng_Latn | 0.923048 |
053f9170b4421f1c2c9e2ba33cf8abf649d6afa6 | 424 | md | Markdown | _posts/Home.md | propainter/blog | e366fb12159ad44303a94ce7d57f1ff0af24b622 | [
"MIT"
] | null | null | null | _posts/Home.md | propainter/blog | e366fb12159ad44303a94ce7d57f1ff0af24b622 | [
"MIT"
] | null | null | null | _posts/Home.md | propainter/blog | e366fb12159ad44303a94ce7d57f1ff0af24b622 | [
"MIT"
] | null | null | null | ---
layout: post
title: Himanshu Gautam
---
Next you can update your site name, avatar and other options using the _config.yml file in the root of your repository (shown below).

This is my blog, where I write posts, document my learnings, and collect other good-to-know material about software engineering.
TODO:
- [ ] Make a list of tech sites, blogs, and forums worth checking
| 22.315789 | 133 | 0.731132 | eng_Latn | 0.998558 |
0540dc01fafe3f8125a8c44d73e871edff3992ed | 894 | md | Markdown | README.md | CliffS/couch-queue | 00d2fcadd5f741b7e0e8438691451ec3fc8780cc | [
"0BSD"
] | 1 | 2016-06-18T00:07:43.000Z | 2016-06-18T00:07:43.000Z | README.md | CliffS/couch-queue | 00d2fcadd5f741b7e0e8438691451ec3fc8780cc | [
"0BSD"
] | 13 | 2020-01-01T07:16:42.000Z | 2021-06-25T15:30:56.000Z | README.md | CliffS/couch-queue | 00d2fcadd5f741b7e0e8438691451ec3fc8780cc | [
"0BSD"
] | null | null | null | # couch-queue
Safe, concurrent queuing allowing multiple writers
and multiple workers using CouchDB as its backend.
The principal difference from other, similar modules
is that this will wait for another entry to become
available, if the queue is exhausted.
## Example
```javascript
const Queue = require('couch-queue');
const queue = new Queue(undefined, undefined, {
  username: 'cliff',
  password: 'pass'
});
queue.on('ready', function() {
  queue.createQueue();
})
.on('dequeued', function(data) {
  console.log("Dequeued payload " + JSON.stringify(data, null, 2));
})
.on('created', function() {
  var payload = {
    anything: "you like",
    canGo: "in here"
  };
  queue.enqueue(payload);
});
```
## Database Fields
* pending: boolean
* enqueued: time pushed
* dequeued: time pulled
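Put together, a single queue document in CouchDB might therefore look roughly like the following; the exact field layout beyond the three fields above is an assumption:

```json
{
  "payload": { "anything": "you like", "canGo": "in here" },
  "pending": false,
  "enqueued": "2016-06-18T00:07:43.000Z",
  "dequeued": "2016-06-18T00:07:44.512Z"
}
```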
## Installation
npm install couch-queue
## Events
ready
Emitted by the constructor
| 17.529412 | 67 | 0.699105 | eng_Latn | 0.953532 |
05418a246922a2ee2eedfd9f0f91099c1fa9d3bc | 403 | md | Markdown | catalog/qi-cai-lao-fu-zi/en-US_qi-cai-lao-fu-zi.md | htron-dev/baka-db | cb6e907a5c53113275da271631698cd3b35c9589 | [
"MIT"
] | 3 | 2021-08-12T20:02:29.000Z | 2021-09-05T05:03:32.000Z | catalog/qi-cai-lao-fu-zi/en-US_qi-cai-lao-fu-zi.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | [
"MIT"
] | 8 | 2021-07-20T00:44:48.000Z | 2021-09-22T18:44:04.000Z | catalog/qi-cai-lao-fu-zi/en-US_qi-cai-lao-fu-zi.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | [
"MIT"
] | 2 | 2021-07-19T01:38:25.000Z | 2021-07-29T08:10:29.000Z | # Qi Cai Lao Fu Zi

- **type**: movie
- **episodes**: 1
- **original-name**: 七彩老夫子
- **start-date**: 1981-07-16
- **rating**: G - All Ages
## Tags
- action
- comedy
- seinen
## Synopsis
Adaptation of the popular Chinese manhua.
## Links
- [My Anime list](https://myanimelist.net/anime/30095/Qi_Cai_Lao_Fu_Zi)
| 16.791667 | 73 | 0.630273 | yue_Hant | 0.205276 |
0544ce810ddd02ff388c1d9c1767c68d5af3496a | 337 | md | Markdown | docs/tm.charlie.expandabletextview/-expandable-text-view/update-state.md | arslancharyev31/Anko-ExpandableTextView | 5c5979edbdb1dd19f1fb1002af353b0557e0f454 | [
"MIT"
] | 88 | 2017-09-19T11:59:07.000Z | 2021-09-08T09:13:21.000Z | docs/tm.charlie.expandabletextview/-expandable-text-view/update-state.md | naseemakhtar994/Anko-ExpandableTextView | 5c5979edbdb1dd19f1fb1002af353b0557e0f454 | [
"MIT"
] | 5 | 2018-01-30T23:22:43.000Z | 2021-03-23T14:59:40.000Z | docs/tm.charlie.expandabletextview/-expandable-text-view/update-state.md | naseemakhtar994/Anko-ExpandableTextView | 5c5979edbdb1dd19f1fb1002af353b0557e0f454 | [
"MIT"
] | 9 | 2018-01-06T22:20:57.000Z | 2021-08-12T12:06:23.000Z | ---
title: ExpandableTextView.updateState - expandable-textview
---
[expandable-textview](../../index.html) / [tm.charlie.expandabletextview](../index.html) / [ExpandableTextView](index.html) / [updateState](.)
# updateState
`protected fun updateState(): `[`Unit`](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-unit/index.html) | 37.444444 | 142 | 0.72997 | kor_Hang | 0.452633 |
05465fc5a64ac36a8f6bb707a322ced3ea1847f7 | 387 | md | Markdown | content/jobs/HPE/index.md | arjun1237/v4 | 9f2a0d99db3c06e9e0dfc7103d8cbf04b0b516b1 | [
"MIT"
] | null | null | null | content/jobs/HPE/index.md | arjun1237/v4 | 9f2a0d99db3c06e9e0dfc7103d8cbf04b0b516b1 | [
"MIT"
] | null | null | null | content/jobs/HPE/index.md | arjun1237/v4 | 9f2a0d99db3c06e9e0dfc7103d8cbf04b0b516b1 | [
"MIT"
] | null | null | null | ---
date: '4'
title: 'Server Support Engineer'
company: 'HP Enterprise'
location: 'Bengaluru, India'
range: 'February - June 2019'
url: 'https://www.hpe.com/us/en/home.html'
---
- Successfully deployed nodes for several HPE clients on their HPE SimpliVity federations.
- Responsible for the entire deployment, from IP network analysis to checking the federation health after deployment.
| 32.25 | 114 | 0.764858 | eng_Latn | 0.961498 |
0548914b09df668cfde59fdbf2cefc29eed2c489 | 3,780 | md | Markdown | doc/how/theming.md | wp-jungle/bonzai-core | 49bbf5f105eda0249671c48a0380f146ee0251dc | [
"RSA-MD"
] | null | null | null | doc/how/theming.md | wp-jungle/bonzai-core | 49bbf5f105eda0249671c48a0380f146ee0251dc | [
"RSA-MD"
] | 6 | 2018-09-18T17:37:16.000Z | 2021-08-15T12:33:50.000Z | doc/how/theming.md | wp-jungle/bonzai-core | 49bbf5f105eda0249671c48a0380f146ee0251dc | [
"RSA-MD"
] | null | null | null | Themes development
==================
Usage
-----
See [Grunt documentation](grunt.md#Developing-themes).
Themes required structure
-------------------------
To make a theme Bonzai-compatible, you will need several things:
* Your theme folder name must match one of the `themes_pattern` values that can be found in `config/bonzai/application.json`.
You can either add your theme folder name to this pattern, or rename your theme folder to look something like
`bonzai-my-awesome-theme` (see the sketch just below this list).
* Your theme must contain a composer.json file that matches the configuration explained below
* Your theme must be structured as explained below
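The exact shape of `config/bonzai/application.json` is not shown in this document, so the following is only a hypothetical illustration of what a `themes_pattern` entry could look like; check your own configuration file for the real key names and values:

```json
{
    "themes_pattern": [
        "bonzai-*"
    ]
}
```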
Theme composer file
-------------------
This is almost a basic composer.json file, except that there are some additional required vars, such as:
* extra.textDomain
* extra.slug
* extra.prefix
* extra.grunt
These configuration values are passed to Grunt when you use the Grunt tasks related to theme development.
```
{
"name": "mytheme",
"version": "0.0.1",
"homepage": "http://mytheme.com",
"description": "MyTheme description",
"type": "wordpress-theme",
"license": "proprietary",
"keywords": [
"WordPress",
"theme"
],
"authors": [
{
"name": "Firstname Lastname",
"email": "[email protected]",
"homepage": "http://mypersonalsite.com/"
}
],
"repositories": [
{
"type": "composer",
"url": "https://wpackagist.org"
}
],
"require": {
"php": ">=5.4",
"composer/installers": "~v1.0.25"
},
"extra": {
"textDomain": "textdomainslug",
"slug": "themeslug",
"prefix": "slg",
"installer-paths": {
"vendor/{$name}": ["type:wordpress-plugin"]
},
"grunt": {
"assets": {
"copy": {
"files": [
{
"cwd": "public/app/themes/bonzai-mytheme/dev/libs/font-awesome",
"expand": true,
"src": [
"css/font-awesome.min.css",
"fonts/*"
],
"dest": "public/app/themes/bonzai-mytheme/assets/fonts/font-awesome"
},
{
"cwd": "public/app/themes/bonzai-mytheme/dev/libs/summernote/dist/font",
"expand": true,
"src": [
"*"
],
"dest": "public/app/themes/bonzai-mytheme/assets/fonts"
}
]
}
}
}
},
"config": {
"preferred-install": "dist"
},
"minimum-stability": "dev"
}
```
Theme folder structure
----------------------
* app/ `: the folder where you will put all your PHP files, functions, templates, etc ...`
* assets/
- admin/ `: JS and CSS files loaded on wp-admin`
+ css
+ js
- frontend/ `: JS and CSS files loaded on frontend`
+ css
+ js
- backend/ `: JS and CSS files loaded on WP Customer Area Backend`
+ css
+ js
- images/ `: the gfx`
- fonts/ `: the fonts`
- vendor/ `: all the vendor files directly copied in it`
* dev/
    - libs/ `: the folder where Bower installs dependencies`
    - releases/ `: the folder where you can find your theme zip files`
    - src/ `: this folder contains the source files that will be compiled to the assets folder`
+ js/
* admin/
* backend/
* frontend/
* custom/ `: create a file app.custom_js.myscript.js to compile it to app/custom_js/myscript.js`
+ less/
* admin/
* backend/
* frontend/
            * commons/ `: less files that are included in all the other folders (admin, backend, frontend)`
* languages/ `: the folder where compiled languages are placed`
* vendor/ `: the folder where Composer installs dependencies`
* bower.json
* composer.json
* etc ...
| 26.808511 | 119 | 0.566931 | eng_Latn | 0.923844 |
0549686013a9c334dcbd910ded3a38d8f213b1a0 | 1,433 | md | Markdown | scripting-docs/winscript/reference/iremotedebugapplicationex-forcestepmode.md | rfakhouri/visualstudio-docs.cs-cz | 3d540a168c09a23b855f746696062fd9954b8dd5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | scripting-docs/winscript/reference/iremotedebugapplicationex-forcestepmode.md | rfakhouri/visualstudio-docs.cs-cz | 3d540a168c09a23b855f746696062fd9954b8dd5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | scripting-docs/winscript/reference/iremotedebugapplicationex-forcestepmode.md | rfakhouri/visualstudio-docs.cs-cz | 3d540a168c09a23b855f746696062fd9954b8dd5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: IRemoteDebugApplicationEx:ForceStepMode | Microsoft Docs
ms.custom: ''
ms.date: 01/18/2017
ms.reviewer: ''
ms.suite: ''
ms.tgt_pltfrm: ''
ms.topic: reference
apiname:
- IRemoteDebugApplicationEx:ForceStepMode
apilocation:
- scrobj.dll
helpviewer_keywords:
- IRemoteDebugApplicationEx:ForceStepMode
ms.assetid: 83e69a3e-e4c9-4ddd-b01b-1820e4177a03
caps.latest.revision: 5
author: mikejo5000
ms.author: mikejo
manager: ghogen
ms.openlocfilehash: 04a2c700d2cac4bcdc845ebf442de29863e87deb
ms.sourcegitcommit: 94b3a052fb1229c7e7f8804b09c1d403385c7630
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 04/23/2019
ms.locfileid: "62934638"
---
# <a name="iremotedebugapplicationexforcestepmode"></a>IRemoteDebugApplicationEx:ForceStepMode
Forces the debugger into step mode.
## <a name="syntax"></a>Syntax
```cpp
HRESULT ForceStepMode(
IRemoteDebugApplicationThread* pStepThread
);
```
### <a name="parameters"></a>Parameters
`pStepThread`
[in] The thread to watch while stepping through the process being debugged. If this value is null, the PDM clears its stepping thread.
## <a name="return-value"></a>Return Value
The method returns an `HRESULT`. Possible values include, but are not limited to, the values in the following table.
|Value|Description|
|-----------|-----------------|
|`S_OK`|The method succeeded.|
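A minimal usage sketch, assuming an `IRemoteDebugApplicationEx` pointer and the thread to step were obtained elsewhere:

```cpp
// Minimal sketch: put the debugger into step mode for one thread.
// pDebugAppEx and pStepThread are assumed to have been obtained elsewhere.
HRESULT hr = pDebugAppEx->ForceStepMode(pStepThread);
if (FAILED(hr)) {
    // Handle the failure; hr holds the error code.
}
```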
## <a name="see-also"></a>See also
- [IRemoteDebugApplicationEx Interface](iremotedebugapplicationex-interface.md)
0549b623ad53697cd9182ffb817ebd6950587757 | 77 | md | Markdown | README.md | M9k/Progetto-P2 | bd92bf278939ea59f8d41d8e9fd69f4172b3b6ac | [
"BSD-3-Clause"
] | 1 | 2021-12-02T10:56:00.000Z | 2021-12-02T10:56:00.000Z | README.md | M9k/Progetto-P2 | bd92bf278939ea59f8d41d8e9fd69f4172b3b6ac | [
"BSD-3-Clause"
] | null | null | null | README.md | M9k/Progetto-P2 | bd92bf278939ea59f8d41d8e9fd69f4172b3b6ac | [
"BSD-3-Clause"
] | null | null | null | # Progetto-P2
A small object-oriented programming project built with the Qt framework.
| 25.666667 | 62 | 0.831169 | ita_Latn | 0.999905 |
054a4d1d97060f1c335b7b7f5c964d9260b68f29 | 1,514 | md | Markdown | README.md | scafol/KP-WEB | d4998a8da57268cb02c78522aef1fa1beb2a5f97 | [
"MIT"
] | null | null | null | README.md | scafol/KP-WEB | d4998a8da57268cb02c78522aef1fa1beb2a5f97 | [
"MIT"
] | null | null | null | README.md | scafol/KP-WEB | d4998a8da57268cb02c78522aef1fa1beb2a5f97 | [
"MIT"
] | null | null | null | # <center> Scafol Web Framework Base For Codeigniter </center>
This repository contains a custom library for making network requests.
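The usage examples below call the library through `$this->request`. Assuming it is registered as a regular CodeIgniter library named `request` (the alias is an assumption; use the real class/file name from this repository), it might be loaded like this:

```php
<?php
// Hypothetical controller sketch: load the library once, then reuse it.
class Users extends CI_Controller
{
    public function __construct()
    {
        parent::__construct();
        $this->load->library('request'); // makes $this->request available
    }

    public function index()
    {
        $users = $this->request->get('/users', ['team' => 'development']);
        var_dump($users);
    }
}
```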
## Usage
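Assuming the library follows the usual CodeIgniter custom-library convention (this is a sketch, not part of the original README), it would typically be loaded once before calling the methods below. The library name `request` is an assumption inferred from the `$this->request` calls used in the examples.
```
// Hypothetical: load the library, e.g. in a controller's constructor.
// The name "request" is assumed from the $this->request usage below.
$this->load->library('request');
// After loading, the documented methods are available:
$users = $this->request->get('/users');
```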
### `GET`
```
String $route route of API
Array $query query data [optional]
Array $header header data [optional]
return Array result from API
```
Example :
`
$this->request->get('/users', ['age' => 14, 'team' => 'development'], ['Authorization: 123', 'Content-Type: json'])
`
### `POST FORM-DATA`
```
String $route route of API
Array $body body form data
Array $header header data [optional]
return Array result from API
```
Example :
`
$this->request->postFormData('/users', ['age' => 14, 'team' => 'development'], ['Authorization: 123', 'Content-Type: json'])
`
### `POST MULTIPART`
```
String $route route of API
Array $body body multipart data
Array $header header data [optional]
return Array result from API
```
Example :
`
$this->request->postMultipart('/users', ['age' => 14, 'team' => 'development', 'image' => new CURLFile('image_path')], ['Authorization: 123', 'Content-Type: json'])
`
### `PUT`
```
String $route route of API
Array $body body put data
Array $header header data [optional]
return Array result from API
```
Example :
`
$this->request->put('/users/1', ['age' => 14, 'team' => 'development'], ['Authorization: 123', 'Content-Type: json'])
`
### `DELETE`
```
String $route route of API
Array $header header data [optional]
return Array result from API
```
Example :
`
$this->request->delete('/users/1', ['Authorization: 123', 'Content-Type: json'])
`
| 23.65625 | 164 | 0.674373 | eng_Latn | 0.490546 |
054a5c48091f7a19d6719ee879392f74e2e80ed5 | 7,285 | md | Markdown | articles/fin-ops-core/dev-itpro/data-entities/dual-write/migrate-prospect-to-cash.md | MicrosoftDocs/Dynamics-365-Operations.is-is | ce362ebbd8aabebe5e960567bddc97e5d1f37b56 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-05-18T17:14:14.000Z | 2021-04-20T21:13:46.000Z | articles/fin-ops-core/dev-itpro/data-entities/dual-write/migrate-prospect-to-cash.md | MicrosoftDocs/Dynamics-365-Operations.is-is | ce362ebbd8aabebe5e960567bddc97e5d1f37b56 | [
"CC-BY-4.0",
"MIT"
] | 6 | 2017-12-12T11:46:48.000Z | 2019-04-30T11:45:51.000Z | articles/fin-ops-core/dev-itpro/data-entities/dual-write/migrate-prospect-to-cash.md | MicrosoftDocs/Dynamics-365-Operations.is-is | ce362ebbd8aabebe5e960567bddc97e5d1f37b56 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2019-10-12T18:18:43.000Z | 2022-02-09T23:55:11.000Z | ---
title: Flytja PTC-gögn úr Data Integrator í tvöfalda skráningu
description: Þetta efnisatriði lýsir hvernig á að flytja PTC-gögn úr Data Integrator í tvöfalda skráningu.
author: RamaKrishnamoorthy
ms.date: 01/04/2021
ms.topic: article
audience: Application User, IT Pro
ms.reviewer: rhaertle
ms.search.region: global
ms.author: ramasri
ms.search.validFrom: 2020-01-06
ms.openlocfilehash: d216f1c46aa3362730c126ffc33fefdddddf1853
ms.sourcegitcommit: 259ba130450d8a6d93a65685c22c7eb411982c92
ms.translationtype: HT
ms.contentlocale: is-IS
ms.lasthandoff: 08/24/2021
ms.locfileid: "7416378"
---
# <a name="migrate-prospect-to-cash-data-from-data-integrator-to-dual-write"></a>Flytja PTC-gögn úr Data Integrator í tvöfalda skráningu
[!include [banner](../../includes/banner.md)]
Til að flytja PTC-gögn úr Data Integrator í tvöfalda skráningu skal fylgja þessum skrefum.
1. Keyrið Data Integrator-verk PTC til að gera eina fulla samstillingu að lokum. Á þennan hátt er tryggt að bæði kerfi (Finance and Operations forrit og Customer Engagement-forrit) séu með öll gögn.
2. Til að hjálpa til við að koma í veg fyrir hugsanlegt gagnatap er Prospect to cash gögn flutt út úr Microsoft Dynamics 365 Sales í Excel-skrá eða skrá með aðskildum gildum (CSV). Flytja út gögn úr eftirfarandi einingum:
- [Lykill](#account-table)
- [Tengiliður](#contact-table)
- [Reikningur](#invoice-table)
- Reikningsfæra vörur
- [Pöntun](#order-table)
- [Panta vörur](#order-products-table)
- [Afurðir](#products-table)
- [Tilboð](#quote-and-quote-product-tables)
- [Tilboð í afurðir](#quote-and-quote-product-tables)
3. Fjarlægið Prospect to cash lausn úr Sales-umhverfi. Þetta skref fjarlægir dálka og samsvarandi gögn sem viðfangið til að reiðufjárlausn sé kynnt.
4. Setja upp lausn tvöfaldrar skráningar.
5. Stofnið tengingu tvöfaldrar skráningar á milli Finance and Operations-forritsins og forrits viðskiptavinar fyrir einn eða fleiri lögaðila.
6. Virkið töfluvörpun tvöfaldrar skráningar og keyrið fyrstu samstillinguna fyrir áskild tilvísunargögn. (Frekari upplýsingar er að finna í [Hvað skal hafa í huga við fyrstu samstillingu](initial-sync-guidance.md).) Dæmi um áskilin gögn eru m.a. viðskiptavinaflokkar, greiðsluskilmálar og greiðsluáætlanir. Ekki skal virkja vörpun tvöfaldrar skráningar fyrir töflur sem krefast frumstillingar, t.d. töflur lykla, tilboðs, tilboðslínu, pöntunar og pöntunarlínu.
7. Í forriti viðskiptavinar skal opna **Ítarlegar stillingar \> Kerfisstillingar \> Gagnastjórnun \> Afrita greiningarreglur** og slökkva á öllum reglum.
8. Frumstillið töflurnar sem eru gefnar upp í skrefi tvö. Frekari upplýsingar er að finna í eftirstandandi hlutum í þessu efnisatriði.
9. Opnið Finance and Operations-forritið og virkið töfluvarpanir, t.d. töfluvarpanir lykla, tilboðs, tilboðslínu, pöntunar og pöntunarlínu. Keyrið síðan fyrstu samstillingu. (Frekari upplýsingar er að finna [Hvað skal hafa í huga við fyrstu samstillingu](initial-sync-guidance.md).) Þetta ferli mun samstilla viðbótarupplýsingar úr Finance and Operations-forritinu, t.d. vinnslustöðu, sendingu og reikningsheimilisfang, svæði og vöruhús.
## <a name="account-table"></a>Lyklatafla
1. Í dálkinn **Fyrirtæki** skal færa inn heiti fyrirtækisins, til dæmis **USMF**.
2. Í dálkinn **Gerð vensla** skal færa inn **Viðskiptavin** sem fast gildi. Ekki er víst að hægt sé að flokka hverja reikningsfærslu sem viðskiptavin í viðskiptagrunni.
3. Í dálkinn **Kenni viðskiptavinaflokks** skal færa inn númer viðskiptavinaflokks úr Finance and Operations-forritinu. Sjálfgefið gildi úr Prospect to cahs lausn er **10**.
4. Ef notuð er PTC-lausnin án sérstillingar á **Reikningsnúmeri** skal færa inn gildi fyrir **Reikningsnúmer** í dálkinn **Aðilanúmer**. Ef það eru sérstillingar og aðilanúmerið er óþekkt skaltu ná í þær upplýsingar úr Finance and Operations forritinu.
## <a name="contact-table"></a>Taflan Tengiliður
1. Í dálkinn **Fyrirtæki** skal færa inn heiti fyrirtækisins, til dæmis **USMF**.
2. Stillið eftirfarandi dálka út frá **IsActiveCustomer** gildinu í CSV-skrá:
- Ef **IsActiveCustomer** er stillt á **Já** í CSV-skrá skal stilla dálkinn **Seljanlegt** á **Já**. Í dálkinn **Kenni viðskiptavinaflokks** skal færa inn númer viðskiptavinaflokks úr Finance and Operations-forritinu. Sjálfgefið gildi úr Prospect to cahs lausn er **10**.
- Ef **IsActiveCustomer** er stillt á **Nei** í CSV-skrá skal stilla dálkinn **Seljanlegt** á **Nei** og stilla dálkinn **Tengiliður fyrir** á **Viðskiptavinur**.
3. Ef notuð er PTC-lausnin án sérstillingar á **Númer tengiliðar** skal stilla eftirfarandi dálka:
- Yfirfærið númer tengiliðar á CSV-skrána (**msdynce\_contactnumber**) á tengiliðanúmerið í töflunni **Tengiliður** (**msdyn\_contactnumber**).
- Notið gildið úr töflunni **Númer tengiliðar** í dálkinn **Aðilanúmer**.
- Notið gildin úr töflunni **Númer tengiliðar** í dálkinum **Lykilnúmer/kenni tengiliðar**.
## <a name="invoice-table"></a>Reikningstafla
Vegna þess að gögnin úr töflunni **Reikningur** eru hönnuð til að flæða í eina átt, úr Finance and Operations-forritinu til forrits viðskiptavinar, gerist ekki þörf á frumstillingu. Keyrið fyrstu samstillingu til að flytja öll áskilin gögn úr Finance and Operations-forritinu í forrit viðskiptavinar. Frekari upplýsingar má finna á [Hvað skal hafa í huga við fyrstu samstillingu](initial-sync-guidance.md).
## <a name="order-table"></a>Pöntunartafla
1. Í dálkinn **Fyrirtæki** skal færa inn heiti fyrirtækisins, til dæmis **USMF**.
2. Afritið gildi dálksins **Pöntunarkenni** í CSV-skránni í dálkinn **Sölupöntunarnúmer**.
3. Afritið gildi dálksins **Viðskiptavinur** í CSV-skránni í dálkinn **Númer viðskiptavinareiknings**.
4. Afritið gildið úr dálkinum **Sendist til svæðis/lands** í CSV-skránni í dálkinn **Sendist til lands/svæðis**. Dæmi um þetta gildi eru m.a. **US** og **Bandaríkin**.
5. Stillið dálkinn **Umbeðin móttökudagsetning**. Ef ekki er hægt notuð móttökudagsetning skal nota dálkana **Umbeðinn afhendingardag**, **Dagsetningu uppfyllingar** og **Dagsetning innsendingar** í CSV-skránni. Dæmi um þetta gildi er **2020-03-27T00:00:00Z**.
6. Stillið dálkinn **Tungumál**. Dæmi um þetta gildi er **en-us**.
7. Stillið dálkinn **Gerð pöntunar** með því að nota dálkinn **Vörutengd**.
## <a name="order-products-table"></a>Töflur afurðapöntunar
- Í dálkinn **Fyrirtæki** skal færa inn heiti fyrirtækisins, til dæmis **USMF**.
## <a name="products-table"></a>Afurðatöflur
Vegna þess að gögnin úr töflunni **Afurðir** eru hönnuð til að flæða í eina átt, úr Finance and Operations-forritinu til forrits viðskiptavinar, gerist ekki þörf á frumstillingu. Keyrið fyrstu samstillingu til að flytja öll áskilin gögn úr Finance and Operations-forritinu í forrit viðskiptavinar. Frekari upplýsingar má finna á [Hvað skal hafa í huga við fyrstu samstillingu](initial-sync-guidance.md).
## <a name="quote-and-quote-product-tables"></a>Töflur tilboðs og afurðartilboðs
Fyrir töfluna **Tilboð** skal fylgja leiðbeiningunum í hlutanum [Pöntunartafla](#order-table) fyrr í þessu efnisatriði. Fyrir töfluna **Afurðartilboð** skal fylgja leiðbeiningunum í hlutanum [Tafla afurðapöntunar](#order-products-table).
[!INCLUDE[footer-include](../../../../includes/footer-banner.md)] | 77.5 | 460 | 0.773782 | isl_Latn | 0.999782 |
054a76e5363a803110f547b2ac8695210a67e375 | 448 | md | Markdown | about.md | vmayoral/vmayoral.github.io | 2f7ba7030d1211ed583c3453988a4082cd36c662 | [
"MIT"
] | null | null | null | about.md | vmayoral/vmayoral.github.io | 2f7ba7030d1211ed583c3453988a4082cd36c662 | [
"MIT"
] | null | null | null | about.md | vmayoral/vmayoral.github.io | 2f7ba7030d1211ed583c3453988a4082cd36c662 | [
"MIT"
] | null | null | null | ---
layout: page
title: About
permalink: /about/
tags: about
---
This blog describes some of the milestones in a journey dedicated to robotics mixed with a _deep_ passion for Artificial Intelligence.
Code will be opened and shared at [Github](http://github.com/vmayoral), where you can download it,
request a feature, report a bug, or contribute. Since I'm doing this in my free time, it's free and open source
([MIT](http://opensource.org/licenses/MIT)).
| 32 | 134 | 0.754464 | eng_Latn | 0.991716 |
054af78109100d0f8ee692075501e7ecabee6ab1 | 47 | md | Markdown | README.md | acanoenfr/guard-1 | 1ed3ce6beda038266357752693a8b0e668be07b7 | [
"MIT"
] | 1 | 2022-02-12T08:32:06.000Z | 2022-02-12T08:32:06.000Z | README.md | acanoenfr/guard-1 | 1ed3ce6beda038266357752693a8b0e668be07b7 | [
"MIT"
] | null | null | null | README.md | acanoenfr/guard-1 | 1ed3ce6beda038266357752693a8b0e668be07b7 | [
"MIT"
] | null | null | null | # guard-1
Directory manager for an association
| 15.666667 | 36 | 0.808511 | eng_Latn | 0.988158 |
054b321b642e42e1bbe1f3a86cc9096ff966e5af | 1,027 | md | Markdown | README.md | chrislawlor/astrosky | 2fbe7af5a1d06061370580d071d37f8132539255 | [
"CC0-1.0"
] | null | null | null | README.md | chrislawlor/astrosky | 2fbe7af5a1d06061370580d071d37f8132539255 | [
"CC0-1.0"
] | null | null | null | README.md | chrislawlor/astrosky | 2fbe7af5a1d06061370580d071d37f8132539255 | [
"CC0-1.0"
] | null | null | null | # AstroSky

A simple space shooter built on PyGame
## Getting Started
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.
### Prerequisites
* Python 3
* PyGame
* SDL
```
# check that the prerequisites are available
python3 --version
python3 -c "import pygame; print(pygame.ver)"
```
### Installing
A step-by-step series of examples that tell you how to get a development environment running.
Install the PyGame dependency:
```
pip3 install pygame
```
## Built With
* [PyGame](https://www.pygame.org/news)
## Authors
* **Chris Lawlor** - *Initial work*
## License
This project is placed in the public domain under CC0- see the [LICENSE.txt](LICENSE.txt) file for details
All assets used in the creation of this game retain their respective licenses.
## Acknowledgments
* [Space Shooter Redux](http://kenney.nl/assets/space-shooter-redux) - Art assets
* [Bfxr](https://rometools.github.io/rome/) - Sound effects | 19.75 | 200 | 0.73223 | eng_Latn | 0.982789 |
054ba4b704f6dec6c68a433f89fbf2c2017171e7 | 2,841 | md | Markdown | output/2019/01/prettier-with-lighting-web-components.md | BrettMN/wipdeveloper-wordpress-archive | f7cd9dea1ef83e61f03494a71de4fd605ac58f68 | [
"MIT"
] | null | null | null | output/2019/01/prettier-with-lighting-web-components.md | BrettMN/wipdeveloper-wordpress-archive | f7cd9dea1ef83e61f03494a71de4fd605ac58f68 | [
"MIT"
] | null | null | null | output/2019/01/prettier-with-lighting-web-components.md | BrettMN/wipdeveloper-wordpress-archive | f7cd9dea1ef83e61f03494a71de4fd605ac58f68 | [
"MIT"
] | null | null | null | ---
layout: "post.11ty.js"
title: "Prettier with Lighting Web Components"
date: "2019-01-23"
tags:
- "Blog"
- "LWC"
- "Prettier"
- "Salesforce"
- "SalesforceDeveloper"
- "SalesforceDX"
slug: "prettier-with-lighting-web-components"
coverImage: "Screen-Shot-2019-01-25-at-3.50.46-PM.png"
---
* * *
Hello, I'm Brett with WIPDeveloper.com. If you've been working with Lightning Web Components in Visual Studio Code with the Prettier extension installed, you might have encountered issues when you try to assign a property to an attribute in your HTML.
## A Prettier Problem
If I try to change the title from a hard-coded string to the property `label`, Prettier is going to reformat it and cause issues. When I hit save, Prettier automatically inserts quotes around my property.
The Lightning Web Components linter says that you should not have quotes around your property bindings, so we need to stop Prettier from automatically inserting double quotes around the properties in the HTML templates of our Lightning Web Components.
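As a hypothetical illustration (the component and attribute names here are made up, not from the original post), the conflict looks like this:

```html
<!-- what you write: an unquoted property binding -->
<lightning-card title={label}></lightning-card>

<!-- what Prettier produces on save, which the LWC linter then flags -->
<lightning-card title="{label}"></lightning-card>
```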
We are going to have to tell Prettier to ignore the HTML files. We can do that in our project, so we don't have to uninstall the extension.
## A Prettier Solution
We create a new file called `.prettierignore`
Now, we're going to do more than just ignore the HTML files. I actually got this from René Winkelmeyer (I hope he's okay with me saying his name); he works at Salesforce as a dev evangelist. It's a `.prettierignore` file from one of the Trailhead example apps.

What it does is tell Prettier to ignore CSS, HTML, SVG, and XML files, plus anything in the `.sfdx` folder, so Prettier will not check the formatting on those.
All we do is add that to our `.prettierignore` file. Now when we remove the double quotes,
We press save. It did not work. What did I spell wrong?

Where did I create the `.prettierignore` file?

You have to create it at the base of your project; I was not paying enough attention.
Now we can delete the double quotes.
Press save: Prettier did not re-insert the double quotes, the linter does not have problems with our syntax, and we can carry on with using
properties from our web component in attributes. I will provide a link to the example `.prettierignore`, and I'll also have the entire contents below.
#### `.prettierignore`
```
**/lwc/**/*.css
**/lwc/**/*.html
**/lwc/**/*.svg
**/lwc/**/*.xml
.sfdx
```
## Links
- [.prettierignore Example](https://github.com/trailheadapps/purealoe-lwc/blob/master/.prettierignore)
- [René Winkelmeyer](https://twitter.com/muenzpraeger)
## That’s it for now.
Remember to sign up for **[The Weekly Stand-Up!](https://wipdeveloper.wpcomstaging.com/newsletter/)** and you can get updated with any new information we have on WIPDeveloper.com.
| 41.779412 | 259 | 0.755368 | eng_Latn | 0.995136 |
054c6a84a1a37273263cf26c5c739ad8138ab2dd | 48 | md | Markdown | README.md | mmertdemirhan/Second-HTML-Task | 86b05490507512f19f68ed30b86b19f97e05d35a | [
"MIT"
] | null | null | null | README.md | mmertdemirhan/Second-HTML-Task | 86b05490507512f19f68ed30b86b19f97e05d35a | [
"MIT"
] | null | null | null | README.md | mmertdemirhan/Second-HTML-Task | 86b05490507512f19f68ed30b86b19f97e05d35a | [
"MIT"
] | null | null | null | # Second-HTML-Task
Patika.dev ikinci HTML Odevi
| 16 | 28 | 0.791667 | tur_Latn | 0.789769 |
054ce4f5914d2df9972963cbf5fec7c32d3474d8 | 389 | md | Markdown | README.md | dee0512/UI-Challenge | 9829a72a6f8220aca7dbc54079be3637565b58ec | [
"MIT"
] | null | null | null | README.md | dee0512/UI-Challenge | 9829a72a6f8220aca7dbc54079be3637565b58ec | [
"MIT"
] | 1 | 2017-05-08T06:34:31.000Z | 2017-05-08T06:34:31.000Z | README.md | dee0512/UI-Challenge | 9829a72a6f8220aca7dbc54079be3637565b58ec | [
"MIT"
] | null | null | null | ## What is this?
This is a personal project where I will challenge myself to code cool mobile user interfaces in ionic.
Project hosted at (current progress): https://ui-challenge-site.herokuapp.com/
### Example 1:
To start the project, I am thinking of creating a splash screen text effect like this one inspired by [this](https://www.youtube.com/watch?v=_JBuHh1HIAA&feature=youtu.be).
| 48.625 | 171 | 0.768638 | eng_Latn | 0.989902 |
054d2663bce3c302e395c6fcb181103b8f469729 | 1,122 | md | Markdown | src/pages/practice-areas/corporate-law-in-romania.md | cristealaw/site | 5aa8f600561b3380e8ff5fe16b0c4b48c7adb0b5 | [
"MIT"
] | null | null | null | src/pages/practice-areas/corporate-law-in-romania.md | cristealaw/site | 5aa8f600561b3380e8ff5fe16b0c4b48c7adb0b5 | [
"MIT"
] | null | null | null | src/pages/practice-areas/corporate-law-in-romania.md | cristealaw/site | 5aa8f600561b3380e8ff5fe16b0c4b48c7adb0b5 | [
"MIT"
] | null | null | null | ---
templateKey: about-page
title: Corporate Law
---
Legal assistance for the incorporation of companies based in Romania, the subsidiaries and branches of foreign companies in Romania. Incorporation and registration of representative offices in Romania of foreign law companies. Drafting of articles of association, drafting of resolutions modifying articles of association, assistance for the registration of companies with the Trade Register. Assistance for the dissolution and removal of companies. Mergers and spin-offs of companies. Assistance for managers / partners / shareholders for the summoning and preparation within the general meetings of the associates / shareholders. Assistance within the conduct of general meetings of the associates / shareholders. Assistance and representation before the Romanian courts within proceedings for the annulment of decisions made in the general meetings of shareholders. The firm was established in 1999 and offers a full range of legal services. The firm has close cooperation relationship with law firms in Italy, England, Germany, Norway and United States of America. | 224.4 | 1,069 | 0.83066 | eng_Latn | 0.999648 |
054d6d5fd1b7a4568d7b73a4ca787deb35d2c6c0 | 1,756 | md | Markdown | King County Housing Regression/README.md | kiritowu/Machine-Learning | 65d5452b4ffbd266f1579f5db15b94113f8bd260 | [
"MIT"
] | 1 | 2021-11-04T03:38:50.000Z | 2021-11-04T03:38:50.000Z | King County Housing Regression/README.md | kiritowu/Machine-Learning | 65d5452b4ffbd266f1579f5db15b94113f8bd260 | [
"MIT"
] | null | null | null | King County Housing Regression/README.md | kiritowu/Machine-Learning | 65d5452b4ffbd266f1579f5db15b94113f8bd260 | [
"MIT"
] | null | null | null | # King County Housing Regression
### Buy Low Sell High
Author : Wong Zhao Wu, Bryan
## Modelling Objective
Perform EDA and modelling to find the optimal solution for estimating the housing prices, minimizing Root Mean Squared Error (RMSE) as the primary metric.
## Keywords
- Supervised Learning
- Regression
- Feature Engineering
- Hierarchical Clustering
- Gradient Boosted Trees
## Credits
- King Country Housing Dataset: [https://geodacenter.github.io/data-and-lab//KingCounty-HouseSales2015/](https://geodacenter.github.io/data-and-lab//KingCounty-HouseSales2015/)
## Personal Learning Reflection
Through the Seattle housing price prediction problem, I've learned more about **geo-location feature engineering** without leaking the actual housing prices, and gained a better grasp of the **bias-variance trade-off** through countless iterations of redefining the params grid, hyperparameter tuning, and model evaluation, again and again! Initially, due to the poor design of the parameter search grid, the resulting model ended up more overfitted than with the default parameters, despite the drop in test error. By doing more research and reading up on Gradient Boosting, I've identified several key hyperparameters that can improve the performance without increasing the variance as much. I've also decided to make use of **AWS Sagemaker** to host and run the entire experiment, to speed up the experimenting iteration for this project.
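The sketch below illustrates the kind of search described above. It assumes scikit-learn; the parameter ranges and the `X_train`/`y_train` names are assumptions for illustration, not the exact values used in the project notebooks.

```python
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

# Keep the grid modest: deeper trees and higher learning rates tend to overfit.
param_grid = {
    "learning_rate": [0.05, 0.1],
    "n_estimators": [200, 400],
    "max_depth": [2, 3],
    "subsample": [0.8, 1.0],
}

search = GridSearchCV(
    GradientBoostingRegressor(random_state=42),
    param_grid,
    scoring="neg_root_mean_squared_error",  # primary metric: RMSE
    cv=5,
)
# search.fit(X_train, y_train)
# best_rmse = -search.best_score_
```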
Written By : Wong Zhao Wu
Last Modified : 25 May 2021

Image retrieved from [Unsplash](https://unsplash.com/photos/skUTVJi8-jc).
| 58.533333 | 846 | 0.802961 | eng_Latn | 0.97198 |
054e274c098e149b0278d908c166c88eb20b295b | 72 | md | Markdown | README.md | CMD-dot-exe/biggranny-discord-bot | f1a3d3ae8afc8fe0f26166fe8f92dc2c63fb4abd | [
"MIT"
] | null | null | null | README.md | CMD-dot-exe/biggranny-discord-bot | f1a3d3ae8afc8fe0f26166fe8f92dc2c63fb4abd | [
"MIT"
] | null | null | null | README.md | CMD-dot-exe/biggranny-discord-bot | f1a3d3ae8afc8fe0f26166fe8f92dc2c63fb4abd | [
"MIT"
] | null | null | null | # biggranny-discord-bot
Discord bot made in JS for Biggranny (Youtuber)
| 24 | 47 | 0.791667 | eng_Latn | 0.801504 |
054ece59193db3414f81195625683294f6d74874 | 209 | md | Markdown | products/_sprzegla/en/bulgarian/index.md | fimbes/fimbes-web | 94cb37214ab86d703996d7c71f01ac3906f98a62 | [
"MIT"
] | null | null | null | products/_sprzegla/en/bulgarian/index.md | fimbes/fimbes-web | 94cb37214ab86d703996d7c71f01ac3906f98a62 | [
"MIT"
] | null | null | null | products/_sprzegla/en/bulgarian/index.md | fimbes/fimbes-web | 94cb37214ab86d703996d7c71f01ac3906f98a62 | [
"MIT"
] | null | null | null | ---
layout: category
type: clutches
title: Bulgarian clutches
category: bulgarian
permalink: '/en/clutches/bulgarian/'
translation_url: '/sprzegla/bulgarskie/'
---
Clutches from Bulgarian manufacturers - Sigma | 23.222222 | 45 | 0.789474 | eng_Latn | 0.88639 |
054f22d598c0780f6575c836469926a3a683840e | 2,324 | md | Markdown | README.md | azu/move-github-repository | 6fb6706dccd852a42c8bffd3066582010442887b | [
"MIT"
] | 4 | 2017-05-01T15:37:22.000Z | 2021-04-08T06:06:35.000Z | README.md | azu/move-github-repository | 6fb6706dccd852a42c8bffd3066582010442887b | [
"MIT"
] | null | null | null | README.md | azu/move-github-repository | 6fb6706dccd852a42c8bffd3066582010442887b | [
"MIT"
] | null | null | null | # move-github-repository
It makes your repository "301 Moved Permanently".
It does:
- Update description && homepage
- Create "301_moved_permanently" branch that has a README.md
- Set "301_moved_permanently" as default branch
- It aims to preserve existing links
## Install
Install with [npm](https://www.npmjs.com/):
npm install -g move-github-repository
## Usage
Usage
$ GH_TOKEN=xxx move-github-repository --description "[[MOVED]]" --homepage http://example.com/new
Options
--description -d Description repository
--homepage -h New URL
Env
GH_TOKEN=xxx move-github-repository --description "[[MOVED]]" --homepage http://example.com/new
Examples
$ GH_TOKEN=xxx move-github-repository --description "[[MOVED]]" --homepage http://example.com/new
## Example
GH_TOKEN="xxxx" move-github-repository -d "[301 Moved]" -h "https://github.com/azu/move-github-repository"
Result: <https://github.com/azu/movemovemomvomeove>
Also:
- [textlint/txt-to-ast: [CAUTION] This repository is MOVED to monorepo.](https://github.com/textlint/txt-to-ast "textlint/txt-to-ast: [CAUTION] This repository is MOVED to monorepo.")
- [textlint/textlint-plugin-text: [CAUTION] This repository is MOVED to monorepo.](https://github.com/textlint/textlint-plugin-text "textlint/textlint-plugin-text: [CAUTION] This repository is MOVED to monorepo.")
- [textlint/textlint-plugin-markdown: [CAUTION] This repository is MOVED to monorepo.](https://github.com/textlint/textlint-plugin-markdown "textlint/textlint-plugin-markdown: [CAUTION] This repository is MOVED to monorepo.")
## Changelog
See [Releases page](https://github.com/azu/move-github-repository/releases).
## Running tests
Install devDependencies and Run `npm test`:
npm i -d && npm test
## Contributing
Pull requests and stars are always welcome.
For bugs and feature requests, [please create an issue](https://github.com/azu/move-github-repository/issues).
1. Fork it!
2. Create your feature branch: `git checkout -b my-new-feature`
3. Commit your changes: `git commit -am 'Add some feature'`
4. Push to the branch: `git push origin my-new-feature`
5. Submit a pull request :D
## Author
- [github/azu](https://github.com/azu)
- [twitter/azu_re](https://twitter.com/azu_re)
## License
MIT © azu
| 29.794872 | 225 | 0.722892 | eng_Latn | 0.410758 |
054ff46c4755e7881b21aec4d7ed010d3db036f7 | 14,111 | md | Markdown | Documentation/references/pioneer.md | d-exclaimation/pioneer | 4641841f16146546a04e75c960edb43bf24e494a | [
"Apache-2.0"
] | null | null | null | Documentation/references/pioneer.md | d-exclaimation/pioneer | 4641841f16146546a04e75c960edb43bf24e494a | [
"Apache-2.0"
] | 17 | 2021-12-08T08:33:55.000Z | 2022-03-21T03:06:05.000Z | Documentation/references/pioneer.md | d-exclaimation/pioneer | 4641841f16146546a04e75c960edb43bf24e494a | [
"Apache-2.0"
] | null | null | null | ---
icon: telescope-fill
order: 100
---
# Pioneer
## Pioneer
[Pioneer](#pioneer) is a GraphQL server for [Vapor](https://vapor.codes/), handling all GraphQL operations.
### `init`
Returns an initialized [Pioneer](#pioneer) server instance.
=== Example
```swift
let server = Pioneer(
schema: schema,
resolver: .init(),
contextBuilder: { req, res in
Context(req: req, res: res, auth: req.headers[.authorization].first)
},
websocketContextBuilder: { req, params, gql in
let res = Response()
guard case .string(let auth) = params?["Authorization"] else {
return Context(req: req, res: res, auth: nil)
}
Context(req: req, res: res, auth: auth)
}
)
```
===
==- Options
| Name | Type | Description |
| ------------------------- | ---------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------- |
| `schema` | [!badge variant="primary" text="GraphQLSchema"] | GraphQL schema used to execute operations |
| `resolver` | [!badge variant="success" text="Resolver"] | Resolver used by the GraphQL schema |
| `contextBuilder` | [!badge variant="danger" text="(Request, Response) async throws -> Context"] | Context builder from request (Can be async and can throw an error) |
| `httpStrategy` | [!badge variant="primary" text="HTTPStrategy"] | HTTP strategy <br/> **Default**: `.queryOnlyGet` |
| `websocketContextBuilder` | [!badge variant="danger" text="(Request, ConnectionParams, GraphQLRequest) async throws -> Context"] | Context builder for the websocket |
| `websocketProtocol` | [!badge variant="primary" text="WebsocketProtocol"] | Websocket sub-protocol <br/> **Default**: `.subscriptionsTransportws` |
| `introspection` | [!badge variant="primary" text="Bool"] | Allowing introspection <br/> **Default**: `true` |
| `playground` | [!badge variant="primary" text="IDE"] | Allowing playground <br/> **Default**: `.graphiql` |
| `keepAlive`                | [!badge variant="warning" text="UInt64?"]                                                              | Keep alive interval in nanoseconds, `nil` for disabling <br/> **Default**: 12.5 seconds |
===
### `init`
**Constraint**:
```swift
where WebsocketContextBuilder == ContextBuilder
```
Returns an initialized [Pioneer](#pioneer) server instance.
=== Example
```swift
let server = Pioneer(
schema: schema,
resolver: .init(),
contextBuilder: { req, res in
Context(req: req, res: res, auth: req.headers[.authorization].first)
}
)
```
===
==- Options
| Name | Type | Description |
| ------------------- | ---------------------------------------------------------------------------- | -------------------------------------------------------------------------------------- |
| `schema` | [!badge variant="primary" text="GraphQLSchema"] | GraphQL schema used to execute operations |
| `resolver` | [!badge variant="success" text="Resolver"] | Resolver used by the GraphQL schema |
| `contextBuilder` | [!badge variant="danger" text="(Request, Response) async throws -> Context"] | Context builder from request (Can be async and can throw an error) |
| `httpStrategy` | [!badge variant="primary" text="HTTPStrategy"] | HTTP strategy <br/> **Default**: `.queryOnlyGet` |
| `websocketProtocol` | [!badge variant="primary" text="WebsocketProtocol"] | Websocket sub-protocol <br/> **Default**: `.subscriptionsTransportws` |
| `introspection` | [!badge variant="primary" text="Bool"] | Allowing introspection <br/> **Default**: `true` |
| `playground` | [!badge variant="primary" text="IDE"] | Allowing playground <br/> **Default**: `.graphiql` |
| `keepAlive`          | [!badge variant="warning" text="UInt64?"]                                     | Keep alive interval in nanoseconds, `nil` for disabling <br/> **Default**: 12.5 seconds |
===
### `init` (No context)
**Constraint**:
```swift
where Context == Void
```
Returns an initialized [Pioneer](#pioneer) server instance without explicitly specifying `contextBuilder`.
=== Example
```swift
let server = Pioneer(
schema: schema,
resolver: .init()
)
```
===
==- Options
| Name | Type | Description |
| ------------------- | --------------------------------------------------- | -------------------------------------------------------------------------------------- |
| `schema` | [!badge variant="primary" text="GraphQLSchema"] | GraphQL schema used to execute operations |
| `resolver` | [!badge variant="success" text="Resolver"] | Resolver used by the GraphQL schema |
| `httpStrategy` | [!badge variant="primary" text="HTTPStrategy"] | HTTP strategy <br/> **Default**: `.queryOnlyGet` |
| `websocketProtocol` | [!badge variant="primary" text="WebsocketProtocol"] | Websocket sub-protocol <br/> **Default**: `.subscriptionsTransportws` |
| `introspection` | [!badge variant="primary" text="Bool"] | Allowing introspection <br/> **Default**: `true` |
| `playground` | [!badge variant="primary" text="IDE"] | Allowing playground <br/> **Default**: `.graphiql` |
| `keepAlive`          | [!badge variant="warning" text="UInt64?"]            | Keep alive interval in nanoseconds, `nil` for disabling <br/> **Default**: 12.5 seconds |
===
### `init` (Graphiti)
**Constraint**:
```swift
where Schema == Graphiti.Schema<Resolver, Context> and WebsocketContextBuilder == ContextBuilder
```
Returns an initialized [Pioneer](#pioneer) server instance using [Graphiti](https://github.com/GraphQLSwift/Graphiti) schema.
=== Example
```swift
let server = try Pioneer(
schema: Schema<Resolver, Context>(...),
resolver: .init(),
contextBuilder: { req, res in
Context(req: req, res: res)
}
)
```
===
==- Options
| Name | Type | Description |
| ------------------- | ------------------------------------------------------------------------- | -------------------------------------------------------------------------------------- |
| `schema` | [!badge variant="warning" text="Schema<Resolver, Context>"] | Graphiti schema used to execute operations |
| `resolver` | [!badge variant="success" text="Resolver"] | Resolver used by the GraphQL schema |
| `contextBuilder` | [!badge variant="danger" text="(Request, Response) async throws -> Void"] | Context builder from request (Can be async and can throw an error) |
| `httpStrategy` | [!badge variant="primary" text="HTTPStrategy"] | HTTP strategy <br/> **Default**: `.queryOnlyGet` |
| `websocketProtocol` | [!badge variant="primary" text="WebsocketProtocol"] | Websocket sub-protocol <br/> **Default**: `.subscriptionsTransportws` |
| `introspection` | [!badge variant="primary" text="Bool"] | Allowing introspection <br/> **Default**: `true` |
| `playground` | [!badge variant="primary" text="IDE"] | Allowing playground <br/> **Default**: `.graphiql` |
| `keepAlive`          | [!badge variant="warning" text="UInt64?"]                                   | Keep alive interval in nanoseconds, `nil` for disabling <br/> **Default**: 12.5 seconds |
===
### `init` (Graphiti)
**Constraint**:
```swift
where Schema == Graphiti.Schema<Resolver, Context>
```
Returns an initialized [Pioneer](#pioneer) server instance using [Graphiti](https://github.com/GraphQLSwift/Graphiti) schema.
=== Example
```swift
let server = try Pioneer(
schema: Schema<Resolver, Context>(...),
resolver: .init(),
contextBuilder: { req, res in
Context(req: req, res: res)
},
websocketContextBuilder: { req, params, gql in
let res = Response()
guard case .string(let auth) = params?["Authorization"] else {
return Context(req: req, res: res, auth: nil)
}
Context(req: req, res: res, auth: auth)
}
)
```
===
==- Options
| Name | Type | Description |
| ------------------------- | ---------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------- |
| `schema` | [!badge variant="warning" text="Schema<Resolver, Context>"] | Graphiti schema used to execute operations |
| `resolver` | [!badge variant="success" text="Resolver"] | Resolver used by the GraphQL schema |
| `contextBuilder` | [!badge variant="danger" text="(Request, Response) async throws -> Void"] | Context builder from request (Can be async and can throw an error) |
| `httpStrategy` | [!badge variant="primary" text="HTTPStrategy"] | HTTP strategy <br/> **Default**: `.queryOnlyGet` |
| `websocketContextBuilder` | [!badge variant="danger" text="(Request, ConnectionParams, GraphQLRequest) async throws -> Context"] | Context builder for the websocket |
| `websocketProtocol` | [!badge variant="primary" text="WebsocketProtocol"] | Websocket sub-protocol <br/> **Default**: `.subscriptionsTransportws` |
| `introspection` | [!badge variant="primary" text="Bool"] | Allowing introspection <br/> **Default**: `true` |
| `playground` | [!badge variant="primary" text="IDE"] | Allowing playground <br/> **Default**: `.graphiql` |
| `keepAlive`                | [!badge variant="warning" text="UInt64?"]                                                              | Keep alive interval in nanoseconds, `nil` for disabling <br/> **Default**: 12.5 seconds |
===
### `applyMiddleware`
Apply Pioneer GraphQL handlers to a Vapor route
!!!info Route overwrites
Avoid using the same path and methods for:
- **GET** at: `"\(path)"` and `"\(path)/websocket"`
- **POST** at: `"\(path)"`
- **GET** at: `"playground"`
As that will overwrite the applied routing and block certain operations in those endpoints.
It's best to group any other routes or apply the routing after all custom routes.
!!!
=== Example
```swift
server.applyMiddleware(
on: app,
at: "graphql"
)
```
===
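Building on the route-overwrites note above, a minimal sketch of the recommended ordering is shown below: register all custom routes first, then apply Pioneer last. The `"health"` route and its path are made up for illustration.

```swift
// Hypothetical ordering inside configure(_:); the custom route is an assumption.
app.get("health") { _ in "OK" }

// Apply Pioneer's GraphQL handlers after all custom routes,
// on a path that does not collide with them.
server.applyMiddleware(on: app, at: "graphql")
```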
==- Options
| Name | Type | Description |
| ---- | ----------------------------------------------- | ------------------------------------------ |
| `on` | [!badge variant="danger" text="RoutesBuilder"]  | Vapor routes builder to attach the GraphQL handlers to |
| `at` | [!badge variant="primary" text="PathComponent"] | Path component where the GraphQL endpoints are applied |
===
---
!!!success DocC API Reference
Swift Package Index can now host DocC compatible API documentation taken from the code comments
You can check out Pioneer's DocC API Docs here:
[!ref Pioneer API Reference](https://swiftpackageindex.com/d-exclaimation/pioneer/main/documentation/pioneer)
!!!
| 54.906615 | 221 | 0.437318 | eng_Latn | 0.389053 |
0551d9ec732a6d34e9a0becd58d39b099dc5154f | 1,225 | md | Markdown | README.md | soumyadip1995/numgrad | 8f11fdc9244e0baea221cd57b89127f607263346 | [
"MIT"
] | 1 | 2021-12-15T22:46:21.000Z | 2021-12-15T22:46:21.000Z | README.md | soumyadip1995/numgrad | 8f11fdc9244e0baea221cd57b89127f607263346 | [
"MIT"
] | null | null | null | README.md | soumyadip1995/numgrad | 8f11fdc9244e0baea221cd57b89127f607263346 | [
"MIT"
] | 1 | 2021-12-15T22:46:34.000Z | 2021-12-15T22:46:34.000Z | ### numgrad
If torch is based on the Torch Tensor, then numgrad is based on the numpy array: a Tensor class wrapping the numpy array. Where [karpathy/micrograd](https://github.com/karpathy/micrograd) provides support for scalar values and their gradients, numgrad provides support for both scalar values and matrices.
Note- NNs to be implemented. Yet to be finished.
#### A few Examples
A few examples have been provided below.
```
1)
x = Tensor(np.arange(-4, 4).reshape(2, 4))
y = Tensor(np.arange(-2, 2).reshape(4, 1))
n = dot(x, y)
n1 = relu(n)
backward_graph(n1)
print(x.grad, y.grad)
2)
x_init = np.random.randn(3,3).astype(np.float32)
W_init = np.random.randn(3,3).astype(np.float32)
m_init = np.random.randn(1,3).astype(np.float32)
x = Tensor(x_init)
y = Tensor(W_init)
c = mul(x,y)
out = relu(c)
d = sum(out)
tr = backward_graph(d)
print(out, d.data, tr)
```
#### Scalar values
The input can either be a scalar or an np.ndarray; some scalar value computations are included. Not final.
```
a = Tensor(-8.0)
b = Tensor(9.0)
c = Tensor(-3.0)
outadd = add(a, b)
outm = mul(outadd, c)
d = backward_graph(outm)
print(outm.data, a.grad, b.grad)
# More operations for dd/da and dd/db will be supported
```
| 22.272727 | 300 | 0.694694 | eng_Latn | 0.906759 |
055268f3aecc2c42eacc3e990752129d04ec01f6 | 64 | md | Markdown | README.md | devxchangeio/php-restful-plugin | 877adbd1baaf9c1a5653be8adb7047a2fed945ea | [
"MIT"
] | null | null | null | README.md | devxchangeio/php-restful-plugin | 877adbd1baaf9c1a5653be8adb7047a2fed945ea | [
"MIT"
] | null | null | null | README.md | devxchangeio/php-restful-plugin | 877adbd1baaf9c1a5653be8adb7047a2fed945ea | [
"MIT"
] | null | null | null | # php-restful-plugin
PHP RESTful Eclipse Plugin for PHP Projects
| 21.333333 | 42 | 0.8125 | kor_Hang | 0.404927 |