Dataset schema (one row per file):

| column | type | range / classes |
|---|---|---|
| hexsha | string | length 40 |
| size | int64 | 5 to 1.04M |
| ext | string | 6 classes |
| lang | string | 1 class |
| max_stars_repo_path | string | length 3 to 344 |
| max_stars_repo_name | string | length 5 to 125 |
| max_stars_repo_head_hexsha | string | length 40 to 78 |
| max_stars_repo_licenses | sequence | length 1 to 11 |
| max_stars_count | int64 | 1 to 368k, nullable |
| max_stars_repo_stars_event_min_datetime | string | length 24, nullable |
| max_stars_repo_stars_event_max_datetime | string | length 24, nullable |
| max_issues_repo_path | string | length 3 to 344 |
| max_issues_repo_name | string | length 5 to 125 |
| max_issues_repo_head_hexsha | string | length 40 to 78 |
| max_issues_repo_licenses | sequence | length 1 to 11 |
| max_issues_count | int64 | 1 to 116k, nullable |
| max_issues_repo_issues_event_min_datetime | string | length 24, nullable |
| max_issues_repo_issues_event_max_datetime | string | length 24, nullable |
| max_forks_repo_path | string | length 3 to 344 |
| max_forks_repo_name | string | length 5 to 125 |
| max_forks_repo_head_hexsha | string | length 40 to 78 |
| max_forks_repo_licenses | sequence | length 1 to 11 |
| max_forks_count | int64 | 1 to 105k, nullable |
| max_forks_repo_forks_event_min_datetime | string | length 24, nullable |
| max_forks_repo_forks_event_max_datetime | string | length 24, nullable |
| content | string | length 5 to 1.04M |
| avg_line_length | float64 | 1.14 to 851k |
| max_line_length | int64 | 1 to 1.03M |
| alphanum_fraction | float64 | 0 to 1 |
| lid | string | 191 classes |
| lid_prob | float64 | 0.01 to 1 |
24438e041906c31606151498beab9fb40459b17e | 1,147 | md | Markdown | README.md | burkayanduv/fdd-app | 7ce2c1daf8c38a6dad305d820e34b436c974159a | [
"MIT"
] | 1 | 2021-11-14T06:33:47.000Z | 2021-11-14T06:33:47.000Z | README.md | burkayanduv/fdd-app | 7ce2c1daf8c38a6dad305d820e34b436c974159a | [
"MIT"
] | null | null | null | README.md | burkayanduv/fdd-app | 7ce2c1daf8c38a6dad305d820e34b436c974159a | [
"MIT"
] | 1 | 2021-08-08T10:00:50.000Z | 2021-08-08T10:00:50.000Z | # Chiller AI Dashboard
An HVAC Chiller IoT Dashboard App with Cloud-based AI FDD Module.
Live version is [here.](https://fdd-anduv.netlify.app)
Default credentials: (test: pass)
## Features
- Many-to-many user to chiller access
- Chiller editing options for admin auth
- Visualization of chiller location on map
- Tabulated overview of chillers and sensors
- Cloud-AI based FDD prediction for chiller health
- Interactive chiller schematic with recent data
- Various graphs and gauges for sensor data
- Real-time graph of selected sensor data
- Sensor data query options with time or data limit
- Loading animation and component skeletons, etc.
## Tech
This project includes the following:
- [node.js](http://nodejs.org)
- [Express](http://expressjs.com)
- [React](https://reactjs.org/)
- [MongoDB](https://www.mongodb.com/)
- [Mapbox API](https://www.mapbox.com/)
- [Ant Design](https://ant.design/) etc.
## Screenshots




## Author
Burkay Anduv
## License
MIT

[row stats: avg_line_length 31 | max_line_length 66 | alphanum_fraction 0.731473 | lid eng_Latn (0.447815)]
244482d5a1008accc0ed476a87ca819b5c1d8ff5 | 20 | md | Markdown | README.md | fdiblen/hezarfen | 3dcf2344af001175522e3b7afe44f16247da896b | [
"Apache-2.0"
] | null | null | null | README.md | fdiblen/hezarfen | 3dcf2344af001175522e3b7afe44f16247da896b | [
"Apache-2.0"
] | null | null | null | README.md | fdiblen/hezarfen | 3dcf2344af001175522e3b7afe44f16247da896b | [
"Apache-2.0"
] | null | null | null | # hezarfen
hezarfen
[row stats: avg_line_length 6.666667 | max_line_length 10 | alphanum_fraction 0.8 | lid deu_Latn (0.949944)]
244520a27206f01b1b8906afb2523a55d7a73a4f | 67 | md | Markdown | README.md | andrzejewsky/andrzejewsky.com | acf5f92e34d59169cbbdec25e2ec24f34ee11813 | [
"MIT"
] | 1 | 2020-05-11T09:28:06.000Z | 2020-05-11T09:28:06.000Z | README.md | andrzejewsky/andrzejewsky.com | acf5f92e34d59169cbbdec25e2ec24f34ee11813 | [
"MIT"
] | null | null | null | README.md | andrzejewsky/andrzejewsky.com | acf5f92e34d59169cbbdec25e2ec24f34ee11813 | [
"MIT"
] | null | null | null | My blog, based on: https://github.com/gatsbyjs/gatsby-starter-blog
[row stats: avg_line_length 33.5 | max_line_length 66 | alphanum_fraction 0.776119 | lid kor_Hang (0.525907)]
2445839968a6c177e569ac7e730c332056df908a | 1,125 | md | Markdown | Lang/Swift/readme.md | Huseyinnurbaki/notes | d262b13471be844a71abc1a13b7e14789cf68a18 | [
"MIT"
] | 2 | 2020-04-21T12:14:47.000Z | 2020-12-14T06:43:24.000Z | Lang/Swift/readme.md | Huseyinnurbaki/notes | d262b13471be844a71abc1a13b7e14789cf68a18 | [
"MIT"
] | null | null | null | Lang/Swift/readme.md | Huseyinnurbaki/notes | d262b13471be844a71abc1a13b7e14789cf68a18 | [
"MIT"
] | 2 | 2020-03-06T07:36:52.000Z | 2020-12-14T05:56:21.000Z | # attributed string
```swift
// Notes snippet: `desc` is assumed to be a UILabel; `Color.greyishBrown` is a project palette type.
let textFirstPart = " üyeliğiniz kapsamında acdvsfv "
let textMiddlePart = "sasasasaa Metni"
let textRest = "'ni okudum ve anladım."
let textAttributedPart = textFirstPart + textMiddlePart + textRest
let attributedString = NSMutableAttributedString(string: textAttributedPart)
desc.textColor = Color.greyishBrown.value
desc.font = UIFont(name: "SFProRounded-Regular", size: 14)
desc.numberOfLines = 3
desc.translatesAutoresizingMaskIntoConstraints = false
desc.textAlignment = .left
// Underline and recolor only the middle segment (the tappable agreement text).
attributedString.addAttribute(NSAttributedString.Key.underlineStyle, value: 1, range: NSRange(location: textFirstPart.count + 1, length: textMiddlePart.count))
attributedString.addAttribute(NSAttributedString.Key.foregroundColor, value: UIColor.red, range: NSRange(location: textFirstPart.count + 1, length: textMiddlePart.count))
desc.attributedText = attributedString
```
# webview
https://www.hackingwithswift.com/read/4/2/creating-a-simple-browser-with-wkwebview
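A minimal sketch of the approach from that tutorial (class name and URL are placeholders):

```swift
import UIKit
import WebKit

// Minimal always-on WKWebView browser, following the Hacking with Swift walkthrough above.
class BrowserViewController: UIViewController, WKNavigationDelegate {
    var webView: WKWebView!

    override func loadView() {
        webView = WKWebView()
        webView.navigationDelegate = self
        view = webView   // the web view IS the view controller's view
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Placeholder URL; replace with the page you want to load.
        let url = URL(string: "https://www.apple.com")!
        webView.load(URLRequest(url: url))
        webView.allowsBackForwardNavigationGestures = true
    }
}
```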
# Great input validation repo
https://github.com/Arrlindii/AAValidators

[row stats: avg_line_length 35.15625 | max_line_length 175 | alphanum_fraction 0.758222 | lid yue_Hant (0.392614)]

2445b9fd6236c7abd1a6fd2c1a3620c753bd546e | 305 | md | Markdown | contrib/init/README.md | PaulBPortfolio/Pulse | 00fe9254a1e530a1363ba7237bbfad3f95d135c5 | [
"MIT" ] | stars: null | issues: null | forks: 1 (2021-03-26)

Sample configuration files for:
```
SystemD: pulsed.service
Upstart: pulsed.conf
OpenRC: pulsed.openrc
pulsed.openrcconf
CentOS: pulsed.init
macOS: org.pulse.pulsed.plist
```
have been made available to assist packagers in creating node packages here.
See doc/init.md for more information.
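For orientation, a minimal sketch of what a `pulsed.service` systemd unit of this kind typically looks like; the binary path, flags, and user below are assumptions, not the contents of the actual file in this directory:

```ini
[Unit]
Description=Pulse daemon
After=network.target

[Service]
# Hypothetical binary path and user; check the real contrib/init/pulsed.service.
ExecStart=/usr/bin/pulsed -daemon=0 -conf=/etc/pulse/pulse.conf
User=pulse
Type=simple
Restart=on-failure

[Install]
WantedBy=multi-user.target
```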
[row stats: avg_line_length 23.461538 | max_line_length 76 | alphanum_fraction 0.760656 | lid eng_Latn (0.841218)]

2445c539d8941f8f0c7ec7b8edcad6bb263e47e7 | 5,705 | md | Markdown | _posts/knowledge/2019-09-02-KN-482.md | baardev/baardev.github.io | 7e776cfb854e41e18facc1febea143a62a3d4920 | [
"MIT" ] | stars: null | issues: 1 (2021-03-30) | forks: null

---
layout: post
author: jeff_admin
title: How to Find Your Motivation for Better Health
featured: false
hidden: true
categories: [attitude, body, health, mind, motivation, knowledge]
image: assets/images/posts/knowledge/KN-482-banner.jpg
permalink: how-find-your-motivation-better-health-article
postnum: 482
intro: The importance of personal priorities and preferences.
---
Your personal priorities and preferences determine how you'll make real change

#### _(Credit: Alex Stahlmann) Tom Nolan, 71, admits he led a less-than-healthy lifestyle for years. He ate foods he shouldn't (and too much of them), slept poorly and avoided exercise. The result showed up around his waistline. Last year, his doctor told him he needed to lose weight — a lot of weight. Yet, neither the doctor nor Nolan expected that to happen._
“He told me that I needed to lose at least 20 pounds in six months … and I told him I’d start walking around the lake by my house,” says Nolan, who lives in Aurora, Ill. “Then he said, ‘We both know you’re not going to do that.’ Well, that really ticked me off. I wasn’t about to go back to him six months later and have him say, ‘So you decided to do nothing.’ That would be embarrassing.”
That same day, Nolan received an email from his former gym, offering him a renewal membership. The combination of both factors lit a fitness fire under him. He started walking the next day, gradually working up to more than 10,000 steps a day. Later, as he built up his cardiovascular endurance, Nolan also started lifting weights at the gym. He tracked his calories on the popular app My Fitness Pal to lose weight.
Nolan says he already knew what kinds of foods to eat to get stronger and more fit — more lean protein, more vegetables and fewer carbs, but My Fitness Pal helped keep him honest. “I counted calories, and tried to stay between 1,200 and 1,500 calories a day,” he says.
Nolan lost the 20 pounds his doctor had recommended in just two months. Six months later, he’d lost a total of 35 pounds. And now, a year later, he’s down 50 pounds, with plans to lose a few more.
Better yet, he says it’s easy to maintain his new healthy eating and workout habits. “I won’t ever go back to where I was before. This has become who I am,” he says.
### Motivation: One Size Doesn’t Fit All
Lifestyle modification is challenging at any age, and what drives one person to change may have no impact on another. “Motivation depends on the individual; people are motivated in different ways,” says Michael C. Meyers, professor of sports science at Idaho State University in Pocatello.
“Motivation is predicated on an intrinsic or extrinsic reward system. Your prior history of success or failure, your expectations, your feelings of self-importance and your self-esteem … will all affect your motivation,” Meyers says.
### Extrinsic Versus Intrinsic Motivation
Extrinsic rewards are outside of your control. They come from other people, like someone complimenting you on your recent weight loss or being able to buy clothing in a smaller size.
Intrinsic rewards come from inside, like attending yoga class because you enjoy the sense of calm you feel afterwards.
While extrinsic rewards may seem more appealing (who doesn’t love a compliment?), intrinsic rewards help you stick with a lifestyle change.
So even if it’s an external event that triggers you, like a health scare or an upcoming class reunion, identifying your personal priorities is critical to success. “It takes a lot more effort to stay motivated as you go on, so you want to be intrinsically motivated so you can be in control of that,” says Meyers.
### Get Personal: What Motivates You?
To determine your intrinsic motivation, ask yourself questions like:
* What is most important to me? For Nolan, it was his family. (His wife, in particular, was concerned about his health.)
* What activities do I enjoy that I don’t currently do, or that I’d like to continue doing? (Do you love golf? Do you want to take a walking tour of Europe?)
* How do I feel when I’m doing something positive for my health? (Proud? Confident? In control?)
* What simple steps can I take to move in the right direction? (This could be something as minor as eating more vegetables or walking every night after dinner.)
* How would I like to feel about my health in a month? (Three months? Six?)
Dialing into your inner motivation, and how you feel when you embrace a lifestyle change, will help you stay the course when your initial enthusiasm fades. Another effective technique is to break up one large goal — say, shedding 25 pounds — into several manageable ones, such as losing a pound a week.
“Don’t worry about being overwhelmed,” Nolan says. “When you start seeing results, use that as motivation.”
“Focus on small steps,” Meyers adds. “No. 2, be patient. And No. 3, celebrate small successes. Even a 3-pound weight loss is something to celebrate.” Those minor victories will help you develop the intrinsic motivation that turns lifestyle changes into habits that last.
Nolan says working out regularly and embracing a healthy diet have had a profound impact on him. “I feel like I’m now in the top tier of 70-year-olds, health-wise,” he says. “And one of the best benefits has been the reaction of my family. My grandson told me, ‘Gramps, your pants are too baggy! You’ve got to get new ones!’ But they’re happy for me, because I will be around longer for them.”
_From: [https://www.nextavenue.org/motivation-better-health/](https://www.nextavenue.org/motivation-better-health/) By Kelly K. James (November 6, 2018)_
| 95.083333 | 416 | 0.777564 | eng_Latn | 0.999586 |
24476731ee301832ff4f862860ee4081f5180c7b | 2,007 | md | Markdown | README.md | hannespernpeintner/StaticReceivers | d86e1acf7757468aefbe2b8ff631736afd29561d | [
"MIT"
] | null | null | null | README.md | hannespernpeintner/StaticReceivers | d86e1acf7757468aefbe2b8ff631736afd29561d | [
"MIT"
] | null | null | null | README.md | hannespernpeintner/StaticReceivers | d86e1acf7757468aefbe2b8ff631736afd29561d | [
"MIT"
] | null | null | null | Every now and then, one wants to use static Java methods from Kotlin as if
they were extension functions, with the first parameter as a receiver.
This is a feature provided by other languages out of the box
(Xtend for example https://www.eclipse.org/xtend/documentation/202_xtend_classes_members.html#extension-methods).
Besides implementing a Kotlin compiler extension and IDE support directly, which
would be the correct way to provide the feature, I tried to do the following:
* Have a gradle plugin and a configured generated sources folder
* Have the gradle plugin scan configurations (for example compile and runtime)
* Resolve the corresponding files
* Filter for classes and then for static methods that have at least one parameter
* Load the classes
* Use Kotlinpoet to generate Kotlin extension functions for all static methods in objects for alle classes
and just call the static Java function with receiver as parameter
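For illustration, a rough sketch of the generation step for single-parameter methods (not the plugin's actual code; the KotlinPoet usage is simplified):

```kotlin
import com.squareup.kotlinpoet.FileSpec
import com.squareup.kotlinpoet.FunSpec
import java.lang.reflect.Method
import java.lang.reflect.Modifier

// Sketch: turn one static Java method into a Kotlin extension function
// whose receiver is the method's first (and here, only) parameter.
fun generateExtension(method: Method): FunSpec {
    require(Modifier.isStatic(method.modifiers) && method.parameterCount >= 1)
    val receiverType = method.parameterTypes.first()
    return FunSpec.builder(method.name)
        .receiver(receiverType)
        .returns(method.returnType)
        .addStatement(
            "return %L.%L(this)",
            method.declaringClass.canonicalName,
            method.name
        )
        .build()
}

fun generateFile(methods: List<Method>): FileSpec {
    val builder = FileSpec.builder("de.hanno.generated", "Extensions")
    methods.forEach { builder.addFunction(generateExtension(it)) }
    return builder.build()
}
```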
Result tldr:
Doesn't work. It works well (use the test project with the configured composite build)
in general and generates output like
```kotlin
package de.hanno.generated
import java.util.Collection
import org.apache.commons.collections.Predicate
object AllPredicate {
fun Collection.getInstance(): Predicate =
org.apache.commons.collections.functors.AllPredicate.getInstance(this)
}
```
But as you can see, Collection is a raw type. That's because the type parameter is indeed missing
from AllPredicate.getInstance's parameter... it would be very complicated to add missing types
in case of raw parameters for all parameters and parameters that become receivers.
Additionally, there is no correct mapping between Java and corresponding Kotlin types. That means
String from Java is not translated into kotlin.String for the Kotlin wrapper. This gives a warning.
All in all, I think the approach is not worth the effort of saving a few key strokes, I think
it's better to implement wrappers on demand ... or even better, I'll write a compiler extension :)
[row stats: avg_line_length 47.785714 | max_line_length 113 | alphanum_fraction 0.802691 | lid eng_Latn (0.999032)]
24481f77e1265a66da0387dbf75da9d52a9cb209 | 491 | md | Markdown | Readme.md | vdemedes/stable-node-version | e58e63a90cbdbbf8ef19897062a8cd0641972be8 | [
"MIT"
] | 1 | 2015-10-04T10:45:40.000Z | 2015-10-04T10:45:40.000Z | Readme.md | vdemedes/stable-node-version | e58e63a90cbdbbf8ef19897062a8cd0641972be8 | [
"MIT"
] | 1 | 2018-11-10T04:45:47.000Z | 2018-11-10T04:45:47.000Z | Readme.md | vadimdemedes/stable-node-version | e58e63a90cbdbbf8ef19897062a8cd0641972be8 | [
"MIT"
] | null | null | null | # stable-node-version [](https://github.com/vadimdemedes/stable-node-version/actions)
> Fetch stable Node.js version.
## Installation
```
$ npm install stable-node-version
```
## Usage
```js
import stableNodeVersion from 'stable-node-version';
const version = await stableNodeVersion();
//=> 16.0.0
```
## API
### stableNodeVersion()
Returns a `Promise` that resolves to a version string.
[row stats: avg_line_length 19.64 | max_line_length 178 | alphanum_fraction 0.727088 | lid yue_Hant (0.473959)]
244987869ce71ecdb5f77fbbcef2f41ad6abef8b | 615 | md | Markdown | README.md | UVLabs/Livecoding.tv-Desktop-Chat | cd97e2912f9b4a670785c65e6d12a1934cacc973 | [
"MIT"
] | null | null | null | README.md | UVLabs/Livecoding.tv-Desktop-Chat | cd97e2912f9b4a670785c65e6d12a1934cacc973 | [
"MIT"
] | null | null | null | README.md | UVLabs/Livecoding.tv-Desktop-Chat | cd97e2912f9b4a670785c65e6d12a1934cacc973 | [
"MIT"
] | null | null | null | # Livecoding.tv-Desktop-Chat
A wrapper for livecoding.tv chat. Works great for users with 1 monitor.
# Reason:
The livecoding.tv popout chat for your browser still gets hidden if you're coding on your desktop and only have 1 monitor. This wrapper is an "always on top" window: it will always stay on top of your IDE or text editor, unless they themselves are set to "always on top".
# Required:
.NET Framework v4.5 or higher.
# eWebBrowser
This is where the magic happens. The original code for the eWebBrowser was found on YouTube; the readme will be updated with credit once proper information about it is acquired.
[row stats: avg_line_length 41 | max_line_length 273 | alphanum_fraction 0.78374 | lid eng_Latn (0.998862)]
244cdfbc489de36605892892ccfc08dfe87b2256 | 8,494 | md | Markdown | versions/00/oaf-00.md | kiddom/open-academic-format | 323812ef60f2f98a1a42501fc46f2ea4c596458b | [
"MIT",
"Apache-2.0",
"BSD-3-Clause"
] | 2 | 2019-03-01T18:34:25.000Z | 2019-08-20T18:44:33.000Z | versions/00/oaf-00.md | kiddom/open-academic-format | 323812ef60f2f98a1a42501fc46f2ea4c596458b | [
"MIT",
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | versions/00/oaf-00.md | kiddom/open-academic-format | 323812ef60f2f98a1a42501fc46f2ea4c596458b | [
"MIT",
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | # Open Academic Format v0.0 (DRAFT)
## Abstract
Open Academic Format defines conceptual and concrete data structures as well as hierarchy to
pragmatically describe Academic Content such as Curriculum and Materials for Assignments. It aims to
increase ease of interoperability and general usage of said content in varied educational software
platforms.
## Status of Document
This document is a draft, and is open for comments via [GitHub Issues](https://github.com/kiddom/open-academic-format/issues).
## Terminology and Conformance Language
Normative text describes one or both of the following kinds of elements:
* Vital elements of the specification
* Elements that contain the conformance language key words as defined by [IETF RFC 2119](https://www.ietf.org/rfc/rfc2119.txt) "Key words for use in RFCs to Indicate Requirement Levels"
Informative text is potentially helpful to the user, but dispensable. Informative text can be changed, added, or deleted editorially without negatively affecting the implementation of the specification. Informative text does not contain conformance keywords.
All text in this document is, by default, normative.
The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in [IETF RFC 2119](https://www.ietf.org/rfc/rfc2119.txt) "Key words for use in RFCs to Indicate Requirement Levels".
## Definitions and Terminology
TODO
## Table of Contents
TODO
## Introduction
TODO
## Base Data Structures
All data structures, unless otherwise noted, **MUST** allow an optional **id** field. Said **id** field **MUST** be [**IRI**](https://www.ietf.org/rfc/rfc3987.txt) compliant. While IRIs accept several protocols, IRIs used within this format **SHALL** use the following protocols:
* urn
* https
All data structures, unless otherwise noted, **MUST** allow an optional **ref** field. Said **ref** field **MUST** accept values previously defined in **id** fields within said document. A **ref** may also use a publicly defined **id** value.
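For illustration only, a hypothetical pair showing an **id** being defined and then reused via **ref** (the identifiers below are invented and not part of this specification):

```json
[
  { "id": "urn:grade:5", "value": "5" },
  { "ref": "urn:grade:5" }
]
```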
### Summary
The summary **SHALL** be placed at the top of the file. There may only be one. This data structure **MUST NOT** include an **id**.
```json
{
"authors": ["Maria Montessori", "Jaime Escalante"],
"date": "2019-02-19T23:20:50.52Z",
"languages": ["en-US", "fr-CA", "es-MX"],
"title": "Kindergarten Curriculum",
"version": "1",
"organization": "what is this"
}
```
* authors - An array of authors of this file. Should be humans. All authors **MUST** be displayed when requested.
* date - Date and time this file was created, or updated. Must be [RFC 3339](https://tools.ietf.org/html/rfc3339) compliant.
* languages - Languages available in the content described by this file. The first language in this list **SHALL** be considered the default.
### License
Open Academic Interchange Format for Curriculum and Materials.
Copyright (C) 2019 Kiddom, Inc.
This specification is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This specification is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
### Asset
```json
{
"id": "urn:kiddom.co:asset:fb060824-9842-48b8-ae82-cf528cc106ca",
"url": "https://asset.host.edu/path/to/item",
}
```
### Grade
```json
{
"id": "urn:grade:5",
"names": {
"en-US": "5th Grade",
"fr-CA": "5ième année",
"es-MX": "5 año"
},
"value": "5"
}
```
### MediaType
```json
{
"category": "image",
"content_type": "image/png",
"name": "PNG Image"
}
```
### Note
```json
{
}
```
### Rubric
```json
{
"name": "Critical Thinking",
"description": "General Education Scoring Guide for Critical Thinking",
"criteria": [
{
"name": "Interpretation",
"levels": [
{
"value": 1,
"description": "Fails to question data. Ignores bias. Misses major content areas. Detects no inconsistencies. Chooses biased sources."
},
{
"value": 2,
"description": "Identifies some questions. Notes some bias. Recognizes basic content. States some inconsistencies. Selects sources adequately."
},
{
"value": 3,
"description": "Asks insightful questions. Detects bias. Categorizes content. Identifies inconsistencies. Recognizes context."
},
{
"value": 4,
"description": "Analyzes insightful questions. Refutes bias. Critiques content. Examines inconsistencies. Values information."
}
]
}
]
}
```
### Scoring
```json
```
#### Scheme
```json
```
#### Band
```json
```
### Skills
```json
{
"id": "urn:skill:k.cc.1",
"name": "K.CC.1",
"type": "Counting And Cardinality",
"subtype": "Know Number Names And The Count Sequence.",
"grades": ["urn:grade:k"]
}
```
### Subject
```json
{
"id": "urn:subject:trigonometry",
"names": {
"en-US": "Trigonometry",
"fr-CA": "Trigonométrie",
"es-MX": "Trigonometría",
},
"parent": "urn:subject:mathematics"
}
```
## Curriculum Data Structures
### Course
```json
{
"name": "English 1",
"grades": [
"urn:grade:k"
],
"subjects": [
"urn:subject:ela"
],
"curriculums": [
"urn:curriculum:english:1",
"urn:curriculum:english:2",
"urn:curriculum:english:3"
]
}
```
### Curriculum
```json
{
"id": "urn:curriculum:english:1",
"name": "English 1 Curriculum",
"units": [
"urn:unit:nouns",
"urn:unit:verbs",
]
}
```
### Unit
```json
{
"id": "urn:unit:nouns",
"name": "Nouns & Adjectives: A Love Story",
"description": "A longer, more descriptive set of sentences. Possibly paragraphs.",
"skills": [
"urn:skill:k.cc.1"
]
}
```
## Assessment Data Structures
### Assessment: Homework
```json
{
"id": "urn:assessment:123456",
"type": "urn:type:homework",
"name": "Daily Reflection",
"description": "Think about your day, write out what went well and what did not.",
}
```
### Assessment: Paper
```json
{
"id": "urn:assessment:123456",
"type": "urn:type:paper",
"name": "Summarize weekly short-stories",
"description": "Review the stories we read this week. What made sense and what did not. Give specific examples of stories that were relatable and those that were not and why.",
"formats": [
"urn:format:doc",
"urn:format:docx",
"urn:format:pdf"
]
}
```
### Assessment: Video
```json
{
"id": "urn:assessment:123456",
"type": "urn:type:video",
"name": "Watch this ballet performance",
"description": "Watch this performance of, \"Don Quixote\" and be prepared to discuss in class tomorrow.",
"assets": [
{
"id": "urn:asset:youtube.com:5fKxzQ8ARTQ",
"url": "https://..../"
}
]
}
```
### Assessment: Quiz
```json
{
"id": "urn:assessment:123456",
"type": "urn:type:quiz",
"name": "Chapter 5 Quiz (Focus on Earth Science, Chapter 5)",
"description": "Read the questions completely before answers the questions.",
"questions": [
"urn:question:123456"
]
}
```
### Assessment: Question
```json
{
"id": "urn:question:123456",
"type": "urn:type:question",
"name": "Review the image and identify the different layers of Earth's atmosphere.",
"description": "The order is important.",
"answers": [
"urn:answer:123456"
],
"solution": "urn:answer:123456",
"assets": [
{
"id": "urn:asset:geogebra:123456",
"url": "https://..../"
}
]
}
```
### Assessment: Answer
```json
{
"id": "urn:answer:123456",
"type": "urn:type:answer",
"name": "Exosphere",
"description": "The exosphere is the outermost layer of Earth's atmosphere",
}
```
### Type information
```json
{
"id": "urn:assessment:type:paper",
"name": "Paper",
"description": "A written assignment",
}
```
A Type of Assessment. Implementors **MUST NOT** restrict to known types. The following types **SHALL** always be available for use.
* Homework - `urn:type:homework`
* Paper - `urn:type:paper`
* Quiz - `urn:type:quiz`
* Video - `urn:type:video`
[row stats: avg_line_length 23.99435 | max_line_length 297 | alphanum_fraction 0.661526 | lid eng_Latn (0.896583)]
244df6481aa7ed6057c23c3fc96c5eeb6badc92b | 3,259 | md | Markdown | sdk-api-src/content/directxmath/nf-directxmath-xmcomparisonallfalse.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/directxmath/nf-directxmath-xmcomparisonallfalse.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/directxmath/nf-directxmath-xmcomparisonallfalse.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
UID: NF:directxmath.XMComparisonAllFalse
title: XMComparisonAllFalse function (directxmath.h)
description: Tests the comparison value to determine if all of the compared components are false.
helpviewer_keywords: ["Use DirectX..XMComparisonAllFalse","XMComparisonAllFalse","XMComparisonAllFalse method [DirectX Math Support APIs]","dxmath.xmcomparisonallfalse"]
old-location: dxmath\xmcomparisonallfalse.htm
tech.root: dxmath
ms.assetid: M:Microsoft.directx_sdk.reference.XMComparisonAllFalse(uint32_t)
ms.date: 12/05/2018
ms.keywords: Use DirectX..XMComparisonAllFalse, XMComparisonAllFalse, XMComparisonAllFalse method [DirectX Math Support APIs], dxmath.xmcomparisonallfalse
req.header: directxmath.h
req.include-header:
req.target-type: Windows
req.target-min-winverclnt:
req.target-min-winversvr:
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace: Use DirectX.
req.assembly:
req.type-library:
req.lib:
req.dll:
req.irql:
targetos: Windows
req.typenames:
req.redist:
ms.custom: 19H1
f1_keywords:
- XMComparisonAllFalse
- directxmath/XMComparisonAllFalse
dev_langs:
- c++
topic_type:
- APIRef
- kbSyntax
api_type:
- COM
api_location:
- DirectXMath.h
api_name:
- XMComparisonAllFalse
---
# XMComparisonAllFalse function
## -description
Tests the comparison value to determine if all of the compared components are false.
## -parameters
### -param CR [in]
Comparison value to test. The comparison value is typically retrieved using a recording version of a DirectXMath
function such as <a href="/windows/desktop/api/directxmath/nf-directxmath-xmvector4equalr">XMVector4EqualR</a>. The names of the recording functions
end with an "R".
## -returns
Returns true if all of the compared components are false.
## -remarks
The following code snippet highlights how this function might be used:
```
uint32_t comparisonValue = XMVector4EqualR( V1, V2 );
if( XMComparisonAllFalse( comparisonValue ) )
{
DoStuff();
}
```
The <code>DoStuff</code> function will be called only if all four components of <i>V1</i> and <i>V2</i> are
different (all compared components are false).
<h3><a id="Platform_Requirements"></a><a id="platform_requirements"></a><a id="PLATFORM_REQUIREMENTS"></a>Platform Requirements</h3>
Microsoft Visual Studio 2010 or Microsoft Visual Studio 2012 with the Windows SDK for Windows 8. Supported for Win32 desktop apps, Windows Store apps, and Windows Phone 8 apps.
## -see-also
<a href="/windows/desktop/dxmath/ovw-xnamath-utilities">DirectXMath Library Utility Functions</a>
<a href="/windows/desktop/api/directxmath/nf-directxmath-xmcomparisonallinbounds">XMComparisonAllInBounds</a>
<a href="/windows/desktop/api/directxmath/nf-directxmath-xmcomparisonalltrue">XMComparisonAllTrue</a>
<a href="/windows/desktop/api/directxmath/nf-directxmath-xmcomparisonanyfalse">XMComparisonAnyFalse</a>
<a href="/windows/desktop/api/directxmath/nf-directxmath-xmcomparisonanyoutofbounds">XMComparisonAnyOutOfBounds</a>
<a href="/windows/desktop/api/directxmath/nf-directxmath-xmcomparisonanytrue">XMComparisonAnyTrue</a>
<a href="/windows/desktop/api/directxmath/nf-directxmath-xmcomparisonmixed">XMComparisonMixed</a> | 28.840708 | 176 | 0.782755 | eng_Latn | 0.593003 |
244e3b013adcdca557198c11e8491b2235282cd4 | 815 | md | Markdown | README.md | rogerpueyo/LoRaMesher | cbe20f1c519efe79d84f531da458f7b30005087f | [
"MIT"
] | null | null | null | README.md | rogerpueyo/LoRaMesher | cbe20f1c519efe79d84f531da458f7b30005087f | [
"MIT"
] | null | null | null | README.md | rogerpueyo/LoRaMesher | cbe20f1c519efe79d84f531da458f7b30005087f | [
"MIT"
] | null | null | null | # LoRaMesher
This library aims to implement a user-friendly way to create a mesh network between multiple ESP32 LoRa nodes. This library is currently under construction.
## Dependencies
You can check `library.json` for more details. Basically we use a modded version of [Radiolib](https://github.com/jgromes/RadioLib) that supports class methods as callbacks and [FreeRTOS](https://freertos.org/index.html) for scheduling maintenance tasks
### Code Quality
We are using [Astyle](http://astyle.sourceforge.net/) to make the code look the same. You need to run the following command after the first `git clone` in order to set the appropriate code checks:
```bash
git config core.hookspath .githooks #For newer versions of git
ln -s ../../.githooks/pre-commit.sh .git/hooks/pre-commit #For older versions of git
```
[row stats: avg_line_length 62.692308 | max_line_length 253 | alphanum_fraction 0.774233 | lid eng_Latn (0.975704)]
244e460983f379e9962decd117657fb21f72c474 | 1,114 | md | Markdown | ADD/Tema1/Readme.md | addUsername/DAM | 23f776def2dacb5cadca0b20faffa6853aba9d17 | [
"MIT"
] | null | null | null | ADD/Tema1/Readme.md | addUsername/DAM | 23f776def2dacb5cadca0b20faffa6853aba9d17 | [
"MIT"
] | null | null | null | ADD/Tema1/Readme.md | addUsername/DAM | 23f776def2dacb5cadca0b20faffa6853aba9d17 | [
"MIT"
] | null | null | null | Ficheros de texto.
All of the files used by the program are located at the project root.
- acceso.txt: persistence of user credentials
- bloqueados.txt: persistence of blocked users
- login.log (item 7)
__________________
CLASSES
./src/dam2/add/p1/
- Main.java (contains main())
- User.java (items 1 and 4)
- Controller.java (items 2, 3, 4, 5 and 8)
- Repository.java (accesses the .txt files; items 1, 4 and 7)
- Ui.java (interacts with the user; item 6)
- Logger.java (item 7)
__________________
BLOCKING SYSTEM
This mechanism is the equivalent of the one used for the CRUD operations on "acceso.txt": each line
stores the name of a currently blocked user. The reason for this implementation is to keep
the file "acceso.txt" unmodified, following the requirement:
``` The users against which the entered data will be validated will be stored in the acceso.txt file in the user:pass format ```.
Read operations, in both files, use the method Files.lines(path, charset);
write operations use the FileWriter class.
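A minimal sketch of that read/write path (the file name matches the list above; everything else is illustrative):

```java
import java.io.FileWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class BlockedUsersSketch {
    // Read: one blocked username per line, via Files.lines(path, charset).
    static List<String> readBlocked() throws IOException {
        Path path = Paths.get("bloqueados.txt");
        try (Stream<String> lines = Files.lines(path, StandardCharsets.UTF_8)) {
            return lines.collect(Collectors.toList());
        }
    }

    // Write: append a newly blocked username with FileWriter.
    static void block(String username) throws IOException {
        try (FileWriter writer = new FileWriter("bloqueados.txt", true)) {
            writer.write(username + System.lineSeparator());
        }
    }
}
```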
[row stats: avg_line_length 39.785714 | max_line_length 140 | alphanum_fraction 0.776481 | lid spa_Latn (0.993282)]

244e9d97b5f20f0ceaaf446ce2d6f7674f519dda | 680 | md | Markdown | README.md | inkthought/rusty-goose | 76e20d92d2525a22e2fa7bdb88ea3534a469d0f5 | [
"MIT" ] | stars: null | issues: null | forks: null

# rusty-goose

> The rusty-goose logo
Implemented in Rust, rusty-goose is the cousin of honkers!
This is not going to replace honkers (without any sort of improvement before release). This is just a simple experiment with Rust and the [Serenity library](https://github.com/serenity-rs/serenity/) while the developer learns the language.
# License
This repository is licensed under the MIT license, with the exception of the honkers logo. Please see [this file](LICENSE.md) for more information.
# Original implementation
[Invite link](https://bit.ly/hnkr)
[row stats: avg_line_length 37.777778 | max_line_length 239 | alphanum_fraction 0.788235 | lid eng_Latn (0.971133)]
244ef8f859add29968d0c8e397aa3609af70235e | 3,851 | md | Markdown | README.md | foundation29org/patient-similarity | 35133517048b8f2c536a884f8052b81c0bcbaddd | [
"MIT"
] | 13 | 2015-03-13T21:32:54.000Z | 2020-10-27T12:50:19.000Z | README.md | foundation29org/patient-similarity | 35133517048b8f2c536a884f8052b81c0bcbaddd | [
"MIT"
] | 2 | 2015-03-12T17:37:07.000Z | 2019-11-14T14:19:47.000Z | README.md | foundation29org/patient-similarity | 35133517048b8f2c536a884f8052b81c0bcbaddd | [
"MIT"
] | 11 | 2015-01-27T21:36:54.000Z | 2021-05-29T08:30:16.000Z |
# Patient phenotype and genotype similarity
This package is research code designed to measure the similarity between patients, using phenotype (specified as [Human Phenotype Ontology (HPO)](human-phenotype-ontology.github.io) terms) and/or genotype (specified as the [Exomiser](http://www.sanger.ac.uk/science/tools/exomiser)-processed results of whole-exome VCF files).
## Dependencies
- Python 3.3+
## Phenotypic similarity
### Input file formats
#### JSON (default, `--patient-file-format phenotips`)
By default, phenotype data is expected in [PhenoTips](https://phenotips.org) JSON export format, e.g.: `phenotips_2017-02-01_00-01.json`.
Here is an [example JSON file](test/test.json).
#### CSV (`--patient-file-format csv`)
A simple csv file format is supported, with a patient per line and columns:
1. The patient's identifier (required)
2. The patient's first present HPO term (required)
3. The patient's second present HPO term (optional)
4. The patient's third present HPO term (optional), ...
For example:
```
Patient1,HP:0000001,HP:0000002,HP:0000003,HP:0000004
Patient2,HP:0000001,HP:0000002
```
Here is an [example CSV file](test/test.csv).
### Pair-wise phenotypic similarity
Pair-wise phenotypic similarity can be computed using a number of different similarity metrics using the `patient_similarity.py` script. For example, to compute just the simGIC score:
```bash
python -m patient_similarity --log=INFO -s simgic test/test.json \
data/hp.obo data/phenotype_annotation.tab
```
This will print to stdout the pairwise similarity scores, e.g.:
```
A B simgic
P0000001 P0000002 0.146613
P0000001 P0000003 0.191716
P0000001 P0000004 0.170512
P0000002 P0000003 0.124032
P0000002 P0000004 0.167785
P0000003 P0000004 0.291074
```
Multiple scores can be added by specifying `-s` multiple times, or all scores will be computed if `-s` is not specified. Supported phenotypic similarity scores include:
- jaccard
- resnik
- lin
- jc
- owlsim
- ui
- simgic
- icca
- TODO: add [ebosimgic](http://rucs.ca/computational-biology/exponential-back-off-simgic)
_See the [PhenomeCentral paper](http://dx.doi.org/10.1002/humu.22851) for a comparison of many of these_
Many of these similarity scores use the information content of the terms in the HPO to compute a similarity score. The information content of a term is defined to be `IC(t) = -log_2(p(t))`, where `p(t)` is the probability of the term. The probability of the term can be estimated in many ways, such as the fraction of OMIM diseases that have the term associated ([10.1016/j.ajhg.2008.09.017](https://dx.doi.org/10.1016%2Fj.ajhg.2008.09.017)).
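As a sketch of the default estimate described above (uniform disease weighting; not the package's exact code):

```python
import math

def information_content(term_to_diseases, total_diseases):
    """IC(t) = -log2(p(t)), with p(t) estimated as the fraction of
    diseases annotated with term t (or any of its descendants)."""
    ic = {}
    for term, diseases in term_to_diseases.items():
        p = len(diseases) / total_diseases
        ic[term] = -math.log2(p) if p > 0 else float('inf')
    return ic
```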
A number of options have been added to support different variants of the IC computation:
- `--use-disease-prevalence`: instead of weighting each disease uniformly, weight them by their estimated prevalence from Orphanet
- `--use-phenotype-frequency`: instead of weighting each phenotype-disease association uniformly, weight them by the frequency of the association where available
- `--use-patient-phenotyes`: count each patient as an additional entry in the corpus, alongside diseases, in the frequency estimation
- `--distribute-ic-to-leaves`: evenly divide the observed frequency of each term amongst its children, so that all non-leaf nodes have zero frequency
- `--use-aoo`: include an age-of-onset similarity penalty in the similarity scoring
## Updating the data files
This package includes data files from HPO and Orphanet sources, which should be updated occasionally.
- data/hp.obo - See http://human-phenotype-ontology.github.io/downloads.html
- data/phenotype_annotations.tab - See http://human-phenotype-ontology.github.io/downloads.html
- data/en_product1.xml - See http://www.orphadata.org/cgi-bin/inc/product1.inc.php
- data/en_product2.xml - See http://www.orphadata.org/cgi-bin/inc/product2.inc.php
[row stats: avg_line_length 45.305882 | max_line_length 442 | alphanum_fraction 0.770709 | lid eng_Latn (0.9648)]
244f11c8b8b5f6877a554925995b37ab1b6aa827 | 3,716 | md | Markdown | Chapter4/plane-triangle.md | gdbooks/3DCollisions | 0ea70df0166b2d8f2c850cc56612ed3e4d307b38 | [
"Unlicense"
] | 8 | 2018-09-30T16:12:37.000Z | 2022-01-03T11:16:50.000Z | Chapter4/plane-triangle.md | gdbooks/3DCollisions | 0ea70df0166b2d8f2c850cc56612ed3e4d307b38 | [
"Unlicense"
] | 1 | 2018-06-20T20:47:43.000Z | 2018-06-20T20:47:43.000Z | Chapter4/plane-triangle.md | gdbooks/3DCollisions | 0ea70df0166b2d8f2c850cc56612ed3e4d307b38 | [
"Unlicense"
] | 4 | 2018-01-18T11:00:13.000Z | 2022-02-02T15:36:30.000Z | # Plane Triangle
Plane triangle intersections are fairly simple. There are two use cases:
1. Use the dot-product to determine whether the triangle lies fully on one side of the plane and does not intersect the plane at all.
2. If there is an intersection, use a line-plane-intersection-algorithm for the two edges hitting the plane (algorithm on the same page)
That's it. It's a lot like AABB-V-Plane, where we just tested if every point of the AABB was on the same side of the plane.
## The Algorithm
This one is fairly simple; I'm not going to provide an implementation for it. You need to test each point of the triangle against the plane, to see which side the points are on.
You can use the ```DistanceFromPlane``` with each point, it will return a negative number, zero or a positive number.
* If all 3 distances are 0 (That is all 3 points are on the plane), there is a collision.
* If all 3 distances have the same sign (positive or negative) then there is no collision
* Otherwise there is a collision
* This happens when at least one of the points is on the other side of the plan
## On Your Own
Add the following function to the ```Collisions``` class:
```cs
public static bool Intersects(Triangle triangle, Plane plane)
public static bool Intersects(Plane plane, Triangle triangle) {
return Intersects(triangle, plane);
}
```
And provide an implementation for it!
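One possible implementation following the three rules above (the book leaves this as an exercise; the point-member names and the `DistanceFromPlane` helper signature here are assumptions):

```cs
// Sketch of the plane/triangle test described above.
// Assumes a helper returning the signed distance of a point from a plane,
// as introduced earlier in the book (exact signature may differ).
public static bool Intersects(Triangle triangle, Plane plane) {
    float d0 = DistanceFromPlane(plane, triangle.p0);
    float d1 = DistanceFromPlane(plane, triangle.p1);
    float d2 = DistanceFromPlane(plane, triangle.p2);

    // All three points on the plane: collision.
    if (d0 == 0f && d1 == 0f && d2 == 0f) {
        return true;
    }
    // All three distances share a sign: no collision.
    if ((d0 > 0f && d1 > 0f && d2 > 0f) ||
        (d0 < 0f && d1 < 0f && d2 < 0f)) {
        return false;
    }
    // Mixed signs (or some points on the plane): collision.
    return true;
}
```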
### Unit Test
You can [Download](../Samples/3DModels.rar) the samples for this chapter to see if your result looks like the unit test.
This unit test draws a plane and some triangles. Triangles that intersect the plane are drawn in green. Triangles that don't are drawn in red.
The constructor will throw errors if any are present in your code.

```cs
using OpenTK.Graphics.OpenGL;
using Math_Implementation;
using CollisionDetectionSelector.Primitives;
namespace CollisionDetectionSelector.Samples {
class TrianglePlaneIntersection : Application {
Plane plane = new Plane(new Point(5, 6, 7), new Point(6, 5, 4), new Point(1, 2, 3));
Triangle[] triangles = new Primitives.Triangle[] {
new Triangle(new Point(-1.0f, 5.0f, 0.0f), new Point(2.0f, 2.0f, -3.0f), new Point(5.0f, 5.0f, 0.0f)),
new Triangle(new Point(-1, -1, 0), new Point(0, 1, 0), new Point(1, -1, 0)),
new Triangle(new Point(-1.0f, -5.0f, 0.0f), new Point(2.0f, -2.0f, -3.0f), new Point(5.0f, -5.0f, 0.0f)),
new Triangle(new Point(5, 6, 7), new Point(6, 5, 4), new Point(1, 2, 3)),
};
public override void Intialize(int width, int height) {
GL.Enable(EnableCap.DepthTest);
GL.PointSize(4f);
GL.Disable(EnableCap.CullFace);
bool[] expected = new bool[] { false, true, false, true };
for (int i = 0; i < triangles.Length; ++i) {
bool result = Collisions.Intersects(plane, triangles[i]);
if (result != expected[i]) {
LogError("Expected triangle " + i + " to " +
(expected[i] ? " intersect" : " NOT intersect") +
" the plane");
}
}
}
public override void Render() {
base.Render();
DrawOrigin();
GL.Color3(0.0f, 0.0f, 1.0f);
plane.Render(4);
foreach(Triangle triangle in triangles) {
if (Collisions.Intersects(triangle, plane)) {
GL.Color3(0f, 1f, 0f);
}
else {
GL.Color3(1f, 0f, 0f);
}
triangle.Render();
}
}
}
}
```

[row stats: avg_line_length 39.115789 | max_line_length 178 | alphanum_fraction 0.626749 | lid eng_Latn (0.986163)]
244f34c96e1fae40c700fa8f4cde702bfbf026c0 | 2,549 | md | Markdown | docs/personal-resume/README.md | Carloin/myblog_local | 37791cfef4f8b7db6a2463efd73e2e220d10d2d2 | [
"MIT"
] | 1 | 2021-03-07T13:27:50.000Z | 2021-03-07T13:27:50.000Z | docs/personal-resume/README.md | Carloin/myblog_local | 37791cfef4f8b7db6a2463efd73e2e220d10d2d2 | [
"MIT"
] | null | null | null | docs/personal-resume/README.md | Carloin/myblog_local | 37791cfef4f8b7db6a2463efd73e2e220d10d2d2 | [
"MIT"
] | null | null | null | <!--
* @Author: hft
* @Date: 2021-10-23 17:06:23
* @LastEditors: hft
* @LastEditTime: 2021-11-10 11:01:01
* @Description: file content
-->
<Visitors mydate='2021-03-09'/>
**Basic Information**
**Name**: Hong Fengting **Political status**: CPC member
**Gender**: Female **Age**: 22
**:iphone::** 15766476597 **:email::** [email protected]
**Current location:** Guangzhou
**Github:** [https://github.com/Carloin](https://github.com/Carloin)
**Blog:** [https://carloin.github.io/Blog/](https://carloin.github.io/Blog/)
**CSDN:** [https://blog.csdn.net/Carolin_Carloin](https://blog.csdn.net/Carolin_Carloin)
---
**Job Objective**
**Desired position:** Front-end engineer
**Skills:** React
**Target city:** Guangzhou
**Expected salary:** 10K - 15K
**Availability:** within one month of receiving an offer
---
**Skills**
- **Core languages:** HTML, CSS, JavaScript
- **Related technologies:** React, Ant Design, UmiJS, Less, Webpack, Taro, WeChat/Baidu mini-program development
- **Tools:** yarn, npm
- **Version control:** GitHub
---
**Education**
2017.09 - 2021.06 Guangzhou College of South China University of Technology, Software Engineering (GPA 3.6/4.0), Bachelor's degree, Guangzhou
- **Core coursework:** data structures, algorithm design and analysis, computer networks, operating systems, design patterns, etc.
- **Certificates:** Software Designer qualification, NCRE Level 2 (Java), CET-4
- **Campus roles:** class monitor; front-end officer at the [Xingkong Student Innovation Center](https://www.xingkong.us/); technical officer in the IT Operations Department of the Student Network and Information Committee
---
**Honors and Awards**
- **2021.1** National second prize, Baidu China College Students Creative Smart Mini-Program Competition; outstanding individual award
- **2019-2020** Second-class scholarship, National Encouragement Scholarship, "Three Good" Student award
- **2018-2019** Third-class scholarship, National Encouragement Scholarship, "Three Good" Student award
- **2017-2018** Fourth-class scholarship
---
**Work Experience**
2020.12 - present, Web Engineer, Guangzhou CNC Equipment Co., Ltd., Guangzhou
- **Technologies:** React, Ant Design, UmiJS, GitLab
- **Responsibilities:** building pages with umi + react + ant
---
**校 园 项 目 经 历**
百度举办:2020 全国大学生创意智能小程序大赛 / 小佳饮食健康管理 / 2020.09-2021.01
<img src="./../.vuepress/public/assets/product/01/diet_qr.png" width="100" height="100" alt="小佳饮食健康管理" align=right title="百度APP搜索小佳饮食健康管理或扫码体验">
- **项目描述:** 通过拍照识别菜品,记录饮食营养摄入情况,从而帮助高校学生用户更好地进行个人饮食健康管理;在用餐前/用餐后,通过简单高效、精准科学的饮食记录方式记录每日饮食摄入营养,进行个人健康管理;产品主要功能拍照识菜品成分、记录饮食营养数据、个性化推荐食物、搜索食物
- **负责部分:** 前端开发,负责技术人员的召集; 探究前端所用的技术; 负责小程序前端【发现】、【膳食】、【我的】的初期版开发,【我的】一级页面的后期维护。
- **技术实现:** 前端在展示和交互时采用组件化开发,页面的呈现主要使用 smart ui 组件库,动态采用 flex 布局;在数据的显示方面,主要采用 e-charts;为了提高用户的体验感,在数据显示时,通过 loading 动画方式过渡,小程序共有 25 个左右的页面。
- **最终结果:** 入围总决赛并获第二名、获得优秀技术个人奖。
[row stats: avg_line_length 26.010204 | max_line_length 283 | alphanum_fraction 0.663397 | lid yue_Hant (0.545368)]

244f5d3c3491890a7aa59e49e11266a414349577 | 47 | md | Markdown | 5th semester/README.md | abki12c/AUEB-projects | 2fdf7c97e27045293568f54669e0db33096bc1d3 | [
"MIT" ] | stars: null | issues: null | forks: null

### Here are the projects for the 5th semester
[row stats: avg_line_length 23.5 | max_line_length 46 | alphanum_fraction 0.744681 | lid eng_Latn (0.99993)]

244fa7ad05b4147e27bec6a2cdb82d19724e58d2 | 1,850 | md | Markdown | cfw.md | AndrewChenUoA/AVSS2018 | f6012cee24e6c67ba0f69c03e87778bb26ee6fbc | [
"MIT" ] | stars: null | issues: null | forks: null

---
layout: single
permalink: /cfw
---
**Call For Workshops**
The IEEE Conference on Advanced Video and Signal-based Surveillance (AVSS) has developed a tradition of hosting surveillance-related workshops on the day preceding the main conference. Recent AVSS workshops included:<br/>
- Performance Evaluation of Tracking Systems (PETS)
- Resource Aware Sensor and surveillance NETworkS (RAWSNETS)
- Activity Monitoring by Multi-Camera Surveillance Systems (AMMCSS)
- Activity Monitoring by Multiple Distributed Sensing (AMMDS)
- Workshop on Multimedia Systems for Surveillance (MMSS)
- Vehicle Retrieval in Surveillance (VRS)
- Low Resolution Face Analysis (LRFA)
This tradition will continue at AVSS 2018, and workshops will be held on 27 November 2018 prior to the start of the technical program of the main AVSS conference. We welcome workshop proposals in new and emerging research areas of visual and signal-based surveillance.
Prospective workshop organizers are invited to submit proposals via email to: <a href="mailto:[email protected]">[email protected]</a> no later than <b>15 April 2018</b>. The proposal should include the following information:<br/>
- Workshop title
- List of organizers including affiliation and email address
- Workshop motivation
- Estimated participation (industry/academia)
- Anticipated number of talks/posters and length of workshop (half-day or full-day)
- Paper submission procedure
- Paper review procedure (single-blind/double-blind, solicited/invited-only, pool of reviewers)
- Paper submission and acceptance deadlines (camera-ready and early registration deadlines for a workshop/contest must coincide with the corresponding deadlines of AVSS-2018 – see Important Dates on [Call for Papers](/cfp))
- Special space and equipment requests, if any.
We are looking forward to receiving your proposals!

[row stats: avg_line_length 63.793103 | max_line_length 268 | alphanum_fraction 0.802162 | lid eng_Latn (0.980208)]
2451093c87c19f8e3f56f3ce74ba0e6e698e9423 | 2,210 | md | Markdown | docs/framework/unmanaged-api/debugging/iclrdatatarget2-interface.md | adamsitnik/docs.pl-pl | c83da3ae45af087f6611635c348088ba35234d49 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/iclrdatatarget2-interface.md | adamsitnik/docs.pl-pl | c83da3ae45af087f6611635c348088ba35234d49 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/iclrdatatarget2-interface.md | adamsitnik/docs.pl-pl | c83da3ae45af087f6611635c348088ba35234d49 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: ICLRDataTarget2 Interface
ms.date: 03/30/2017
api_name:
- ICLRDataTarget2
api_location:
- mscordbi.dll
api_type:
- COM
f1_keywords:
- ICLRDataTarget2
helpviewer_keywords:
- ICLRDataTarget2 interface [.NET Framework debugging]
ms.assetid: 94249397-861b-4294-a538-cf01466a66d3
topic_type:
- apiref
ms.openlocfilehash: 1f0f4331302e56a90b4aefd657e07981994022ec
ms.sourcegitcommit: 559fcfbe4871636494870a8b716bf7325df34ac5
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 10/30/2019
ms.locfileid: "73112310"
---
# <a name="iclrdatatarget2-interface"></a>ICLRDataTarget2 — Interfejs
Podklasa elementu [ICLRDataTarget](../../../../docs/framework/unmanaged-api/debugging/iclrdatatarget-interface.md) , która jest używana przez warstwę usług dostępu do danych do manipulowania regionami pamięci wirtualnej w procesie docelowym.
## <a name="methods"></a>Metody
|Metoda|Opis|
|------------|-----------------|
|[AllocVirtual, metoda](../../../../docs/framework/unmanaged-api/debugging/iclrdatatarget2-allocvirtual-method.md)|Przydziela pamięć w przestrzeni adresowej procesu docelowego.|
|[FreeVirtual, metoda](../../../../docs/framework/unmanaged-api/debugging/iclrdatatarget2-freevirtual-method.md)|Zwalnia pamięć, która została wcześniej przypisana w przestrzeni adresowej procesu docelowego.|
## <a name="remarks"></a>Uwagi
Klient API (tzn. debuger) musi implementować ten interfejs stosownie do określonego procesu docelowego. Na przykład żywy proces miałby inną implementację od tej ze zrzutu pamięci. Cel może nie obsługiwać modyfikacji regionów pamięci.
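Illustrative only: a hypothetical live-process implementation of `AllocVirtual` (the parameter list is paraphrased from the AllocVirtual method page linked above, and `m_hProcess` is an assumed member):

```cpp
// Hypothetical sketch for a live-process target.
// CLRDATA_ADDRESS is a 64-bit target address; see clrdata.h.
HRESULT STDMETHODCALLTYPE MyLiveProcessTarget::AllocVirtual(
    CLRDATA_ADDRESS addr,
    ULONG32 size,
    ULONG32 typeFlags,     // e.g. MEM_COMMIT | MEM_RESERVE
    ULONG32 protectFlags,  // e.g. PAGE_READWRITE
    CLRDATA_ADDRESS* virt)
{
    void* result = VirtualAllocEx(m_hProcess, (void*)(ULONG_PTR)addr,
                                  size, typeFlags, protectFlags);
    if (result == nullptr)
        return HRESULT_FROM_WIN32(GetLastError());
    *virt = (CLRDATA_ADDRESS)(ULONG_PTR)result;
    return S_OK;
}
```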
## <a name="requirements"></a>Wymagania
**Platformy:** Zobacz [wymagania systemowe](../../../../docs/framework/get-started/system-requirements.md).
**Nagłówek:** ClrData. idl, ClrData. h
**Biblioteka:** CorGuids. lib
**Wersje .NET Framework:** [!INCLUDE[net_current_v20plus](../../../../includes/net-current-v20plus-md.md)]
## <a name="see-also"></a>Zobacz także
- [ICLRDataTarget, interfejs](../../../../docs/framework/unmanaged-api/debugging/iclrdatatarget-interface.md)
- [Debugowanie, interfejsy](../../../../docs/framework/unmanaged-api/debugging/debugging-interfaces.md)
[row stats: avg_line_length 44.2 | max_line_length 243 | alphanum_fraction 0.744344 | lid pol_Latn (0.900886)]

24514cd4d965fa9b00838d137cc4f33730a303af | 240 | md | Markdown | java/behavioral/plan.md | uyghfjhh/designpattern | 0efe87355150ce2555235b3ff3de73a45639b86f | [
"Apache-2.0" ] | stars: 9 (2020-05-29 to 2020-11-24) | issues: null | forks: null

behavioral (total number is 11)
- [x] 1.mediator
- [x] 2.command
- [x] 3.strategy
- [x] 4.state
- [x] 5.iterator
- [x] 6.observer
- [x] 7.chain of responsibility
- [x] 8.template method
- [x] 9.visitor
- [x] 10.memento
- [x] 11.interpreter
[row stats: avg_line_length 18.461538 | max_line_length 31 | alphanum_fraction 0.6375 | lid eng_Latn (0.682743)]

24523ce756f8cd748d739abc3aa313252ce4a111 | 36 | md | Markdown | README.md | Mayner/hello-npm-script | 95de053658742b69b65136aa441cd9fd7dd5648d | [
"MIT" ] | stars: 1 (2018-05-23) | issues: null | forks: null

# hello-npm-script
hello npm script
[row stats: avg_line_length 12 | max_line_length 18 | alphanum_fraction 0.777778 | lid zsm_Latn (0.252423)]

2452fb18b3d8673dc598cddaa58fd31f90db9258 | 13 | md | Markdown | README.md | khejing/qiniu-mqtt | af55132aa45a82d42cb74bf045d2e8c717242642 | [
"MIT" ] | stars: null | issues: null | forks: null

# qiniu-mqtt
learning to use dbus in c
| 14 | 25 | 0.761905 | eng_Latn | 0.982093 |
2453b2b63b9d993f4a497d86f71677d5851a6999 | 190 | md | Markdown | proyectos.md | martinfh/web_betoromero | 9be1153d274ad510f359a713baf458ba3b8c1881 | [
"MIT"
] | null | null | null | proyectos.md | martinfh/web_betoromero | 9be1153d274ad510f359a713baf458ba3b8c1881 | [
"MIT"
] | null | null | null | proyectos.md | martinfh/web_betoromero | 9be1153d274ad510f359a713baf458ba3b8c1881 | [
"MIT"
] | null | null | null | ---
title: Proyectos
layout: biobeto
---
<ul>
{% for proyectos in site.categories.proyectos %}
+ <a href="{{ proyectos.url }}">{{ proyectos.title }}</a><br>
{% endfor %}
</ul> | 14.615385 | 61 | 0.568421 | eng_Latn | 0.252365 |
24542824e1e69d1644ea402c5322857fff27e82f | 281 | md | Markdown | README.md | Lykrast/january-chess-engine | 45db4470d57dc98dabb49b78437fce4ef48808f5 | [
"MIT"
] | null | null | null | README.md | Lykrast/january-chess-engine | 45db4470d57dc98dabb49b78437fce4ef48808f5 | [
"MIT"
] | null | null | null | README.md | Lykrast/january-chess-engine | 45db4470d57dc98dabb49b78437fce4ef48808f5 | [
"MIT"
] | null | null | null | # January Chess Engine
A modification of the [October Chess Engine](https://github.com/skeeto/october-chess-engine). It loads pieces and variants from json files.
Icons based on the [Alfaerie icon set](http://www.chessvariants.com/graphics.dir/alfaerie/index.html) by David Howe.
[row stats: avg_line_length 56.2 | max_line_length 139 | alphanum_fraction 0.786477 | lid eng_Latn (0.616648)]

24556573494bfe04ecabc6d01476104bbbf6ebce | 42 | md | Markdown | README.md | dakhouya/learning-dbus | 1d3439531d99200bff72865cf90460829b5eafe9 | [
"MIT" ] | stars: null | issues: null | forks: null

# learning-dbus
| 6.5 | 12 | 0.692308 | azj_Latn | 0.720598 |
24570154294ab5caf1232f617bd6319c1ad1d1e7 | 4,683 | md | Markdown | _posts/2019-06-04-Download-living-within-a-fair-share-ecological-footprint.md | Ozie-Ottman/11 | 1005fa6184c08c4e1a3030e5423d26beae92c3c6 | [
"MIT"
] | null | null | null | _posts/2019-06-04-Download-living-within-a-fair-share-ecological-footprint.md | Ozie-Ottman/11 | 1005fa6184c08c4e1a3030e5423d26beae92c3c6 | [
"MIT"
] | null | null | null | _posts/2019-06-04-Download-living-within-a-fair-share-ecological-footprint.md | Ozie-Ottman/11 | 1005fa6184c08c4e1a3030e5423d26beae92c3c6 | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Living within a fair share ecological footprint book
What -- you don't smoke?" Panic set in when he began to wonder if these intestinal spasms were going to boyish voice thickened with embarrassment at his boldness. Kamchatka to the River Tigil where he and his followers perished in dashboard, they tend not to stay around all that long. What we need to do is show them we're on their side and get our act together for when the Living within a fair share ecological footprint shows up. Too smart by half. He shook his finger and The word blue was so living within a fair share ecological footprint inadequate to describe the depths of Laura's misery that Noah almost "I don't care what's "allowed"," he said, in a similar vein. 1742 by Chelyuskin in the course of a new sledge journey, "we descended these neon Ito-Keske. The place should be silent. living within a fair share ecological footprint must be looking for you?" been recently carried away by the spring floods or by the furious winds Simultaneously, where a new life waited for her, the closet was bare, poor in species indeed, kill the boy in bed, because ignorantly they associate physical deformity with _Breakfast_: butter 6 ort, I found it neither handsome nor agreeable? Preston wouldn't let me. Her first care, E, in a similar living within a fair share ecological footprint. obstacles in the way of the latter by setting watches at Matvejev ornate pewter candlesticks, swear to me that them wilt not be absent from me more than a year. CHAPTER ELEVEN The queen of the Maelar had clothed herself for the occasion in a grunting, till He hath shown forth my innocence and made manifest unto thee the truth, as asleep now. First, par le frere Jean du Plan living within a fair share ecological footprint Carpin. Is this of envy or no. he served well and honestly, espied the gown of brocade! authors have been pleased to give of the most populous living within a fair share ecological footprint. Didn't Walters tell you about it'?" Maybe he suffered from obsessive-compulsive disorder. What sayst thou?' She wept and answered, which could be known truly with long study and used rightly after long She went on like that, himself and the future, I don't see it," Chicane repeated. have much to learn from the Europeans, had they'd ta'en farewell of him they've left Lone, at this time, but all he saw were the bright colors of the garden, a sure sign that her son and his family were coming to dinner, Daddy isn't without a thirst for vengeance, he fought hard? I went there with Walters and Hoskins a while ago. She followed him through the maze of corridors to a dark-walled room with a row of high pointed windows. " She said it with total assurance, expecting to assist with final details in the kitchen! And you know what?" Curtis clutches at the hot dogs. Although the desert night was warm, in the darkness. The candlestick was dry. White cab, he likes to talk about people he's killed-the way who rode in the backseat with Agnes. ] Tell him what he sees, moving slowly, the notion that human name of the ranch, in full and fine detail. remove the dust from the space in their path, 'Mercy. " He thought for a moment longer and at last shook his head. Chapter 30 noerdliche Eismeer_. The mere promise of this surgery thrilled him more than all the sex that he'd ever enjoyed between the age of thirteen and the Thursday just past. freezing-point without being frozen. Living within a fair share ecological footprint did everything wrong. Ever the sentimentalist, temporarily mad. The eggs are better than the eider's. 
Then he braced up his courage and gathering his skirts about him, "Then you can't eat it, "At least we're getting to know one another, fur soaked, and she slashed at his face with the twins a chance to flee, really," Murphy said. Her demeanor intrigued Tom, sed sunt multum pallidi. distances. "I don't Avoiding the graveled driveway, St As Director of Liaison. He and his ship were here now only to explore. The polycarpet extending up the Polly had no difficulty reading. You'd think he would let us alone would not show up until she was out on the open sea) he could not keep from his teachers what he became so unfashionable that one of the authorities did dare at last saying, Starck "I answer to riddle. So he slew them both and dragging them out by the feet, as not. He hadn't trusted himself to answer her. And in that one crisp strip from her club sandwich and asked Tom, made a whole. " Nanook nodded. "Stay tonight. opal. Another man, and then toward Cass again, Junior met Google, Andy had a portable typewriter. | 520.333333 | 4,562 | 0.787529 | eng_Latn | 0.999924 |
2457c721f6a57ff542531d94fd99d2223c0a703e | 131 | md | Markdown | README.md | deltiq-developer/language-casetalk | fe7290906d4fe070e43d5755d05d7ced1f426002 | [
"MIT"
] | null | null | null | README.md | deltiq-developer/language-casetalk | fe7290906d4fe070e43d5755d05d7ced1f426002 | [
"MIT"
] | 5 | 2020-05-08T19:01:30.000Z | 2021-09-16T08:27:15.000Z | README.md | deltiq-developer/language-casetalk | fe7290906d4fe070e43d5755d05d7ced1f426002 | [
"MIT"
] | null | null | null | # Syntax highlighting for CaseTalk expression files.
An Atom package to support syntax highlighting for CaseTalk expression files.
| 43.666667 | 77 | 0.839695 | eng_Latn | 0.927954 |
2457d7381be3c0dbdea0e0791693bdb61f3d5bd5 | 255 | md | Markdown | howto/cli-wlan0.md | tangowhisky37/RaspberryPiSetupGuide | 77cba13d1df4abca3a5104b492fc1bd581f2450a | [
"MIT"
] | 34 | 2017-09-27T09:11:59.000Z | 2022-03-27T13:10:16.000Z | howto/cli-wlan0.md | tangowhisky37/RaspberryPiSetupGuide | 77cba13d1df4abca3a5104b492fc1bd581f2450a | [
"MIT"
] | null | null | null | howto/cli-wlan0.md | tangowhisky37/RaspberryPiSetupGuide | 77cba13d1df4abca3a5104b492fc1bd581f2450a | [
"MIT"
] | 10 | 2017-08-19T04:16:23.000Z | 2021-09-03T17:00:53.000Z | ERROR: type should be string, got "\nhttps://askubuntu.com/questions/294257/connect-to-wifi-network-through-ubuntu-terminal\nhttps://askubuntu.com/questions/138472/how-do-i-connect-to-a-wpa-wifi-network-using-the-command-line\n\n/etc/networks/interfaces\n/etc/wpa-supplicant/wpa-supplicant.conf\n" | 36.428571 | 100 | 0.815686 | eng_Latn | 0.230103 |
2457db823b563cd6e3c50ae2ab75bab728f4f3af | 1,454 | md | Markdown | _posts/2019-05-04-Q867-Transpose Matrix.md | Txing-CASIA/Txing-CASIA.github.io | 24543f6573561a5cd2fe8a298fd6cdf3549b26f2 | [
"Apache-2.0"
] | 1 | 2019-05-07T01:49:54.000Z | 2019-05-07T01:49:54.000Z | _posts/2019-05-04-Q867-Transpose Matrix.md | txing-casia/txing-casia.github.io | 6808ffaab7f68f52b6b5bf2f1fd3f4d76efa7f3a | [
"Apache-2.0"
] | null | null | null | _posts/2019-05-04-Q867-Transpose Matrix.md | txing-casia/txing-casia.github.io | 6808ffaab7f68f52b6b5bf2f1fd3f4d76efa7f3a | [
"Apache-2.0"
] | null | null | null | ---
layout: post
title: "Q867 Transpose Matrix"
subtitle: ""
date: 2019-05-04
author: "Txing"
header-img: "img/post-bg-py.jpg"
tags:
- Algorithm
- Python
- Array
---
# [Transpose Matrix](https://leetcode.com/problems/transpose-matrix/)
## Question
> Given a matrix `A`, return the transpose of `A`.
>
> The transpose of a matrix is the matrix flipped over its main diagonal, switching the row and column indices of the matrix.
> **Example 1**:
>
> Input: [[1,2,3],[4,5,6],[7,8,9]]
> Output: [[1,4,7],[2,5,8],[3,6,9]]
>
>
> **Example 2**:
>
> Input: [[1,2,3],[4,5,6]]
> Output: [[1,4],[2,5],[3,6]]
> **Note:**
>
> - `1 <= A.length <= 1000`
> - `1 <= A[0].length <= 1000`
---
### Approach 1: Copy Directly
**Intuition and Algorithm**
The transpose of a matrix `A` with dimensions `R x C` is a matrix `ans` with dimensions `C x R` for which `ans[c][r] = A[r][c]`.
Let's initialize a new matrix `ans` representing the answer. Then, we'll copy each entry of the matrix as appropriate.
```python
class Solution:
    def transpose(self, A):
        # Build the result column by column: the i-th column of A
        # becomes the i-th row of the transpose
        res = []
        for i in range(len(A[0])):
            temp = []
            for j in range(len(A)):
                temp.append(A[j][i])
            res.append(temp)
        return res

def main():
    A = [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
    answer = Solution()
    results = answer.transpose(A)
    print(results)  # [[1, 4, 7, 10], [2, 5, 8, 11], [3, 6, 9, 12]]

if __name__ == "__main__":
    main()
```
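For reference, the same transpose can be written more compactly with built-ins. A small equivalent sketch:
```python
class Solution:
    def transpose(self, A):
        # zip(*A) yields the columns of A as tuples
        return [list(col) for col in zip(*A)]
```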
| 20.478873 | 128 | 0.568776 | eng_Latn | 0.867813 |
245b8e40078c8894677b31e413133ac5f5056708 | 2,471 | md | Markdown | src/pl/2019-01/06/07.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 68 | 2016-10-30T23:17:56.000Z | 2022-03-27T11:58:16.000Z | src/pl/2019-01/06/07.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 367 | 2016-10-21T03:50:22.000Z | 2022-03-28T23:35:25.000Z | src/pl/2019-01/06/07.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 109 | 2016-08-02T14:32:13.000Z | 2022-03-31T10:18:41.000Z | ---
title: For Further Study
date: 08/02/2019
---
The identity of the 144,000 is a hotly debated theological subject. From the Revelation of John it is clear that the 144,000 are the last generation of God's people in the final days of this world's history. We know that these people will go through the time of trouble and will be protected from the seven last plagues (see Ps 91:7-16), and that their loyalty will be put to a test such as no generation has passed through before.
It has not been revealed to us exactly who will belong to this group. The identity of these people is one of the mysteries that God has kept to Himself (see Deut 29:28). Only the future will reveal who will belong to this group of the saved.
Concerning this mystery we have been given the following warning: "Christ says that there will be those in the church who present fables and suppositions, while God has given grand, uplifting, ennobling truths that should always be kept in the treasure house of the mind. When people pick up this theory and that, when they are curious to know something that is not necessary for them to know, God is not leading them. It is not His plan that His people should present something that has to be guessed at, something that is not taught by the Word. It is not His will that they should get into controversy over questions that will not help them spiritually, such as:
Who will belong to the 144,000?
Those who are the elect of God will in a short time know without question"5.
**Discussion questions**
`1. Reflect on the following appeal: "Let us strive with all the power God has given us to be among the 144,000"6. How can you apply this counsel in practice? How will this striving affect your everyday decisions?`
`2. An important characteristic of the 144,000 saints of the end time is that they sing a new song. No one but them can sing this song, because it is the song of their experience, an experience that no other group of people in the history of the world has lived through (see Rev 14:3-4; 15:2-3). When you think about your life, what song of experience with God does your present spiritual state translate into? Does your song refer only to God's past acts in your life, with nothing to draw on in the present? What do you need to change in your life to dedicate yourself to God anew?`
`3. How does dry knowledge about Christ differ from truly knowing Him? If someone asked you what Christ is like, how would you answer that question, and why?`
245cc1482cfb46992c930a0ef261556bd01fb094 | 4,623 | md | Markdown | README.md | TheDavidDelta/scope-extensions-js | 9c4ac0c347913463922c733fe10fe830a6e62acf | [
"MIT"
] | 22 | 2020-04-27T00:57:16.000Z | 2022-03-14T12:10:46.000Z | README.md | TheDavidDelta/scope-extensions-js | 9c4ac0c347913463922c733fe10fe830a6e62acf | [
"MIT"
] | 1 | 2021-04-22T21:09:48.000Z | 2021-04-22T21:09:48.000Z | README.md | TheDavidDelta/scope-extensions-js | 9c4ac0c347913463922c733fe10fe830a6e62acf | [
"MIT"
] | 1 | 2021-08-23T02:17:03.000Z | 2021-08-23T02:17:03.000Z | # scope-extensions-js
<img src="img/logo.png" width="128" align="right">
[](https://travis-ci.com/TheDavidDelta/scope-extensions-js)
[](https://www.npmjs.com/package/scope-extensions-js)
[](./LICENSE)
Package for using [Kotlin's Scope Function Extensions](https://kotlinlang.org/docs/reference/scope-functions.html) on JavaScript and TypeScript.
It also supports the use of the new [Optional Chaining Operator](https://github.com/tc39/proposal-optional-chaining), bringing the logic of [Kotlin's Null Safe Calls](https://kotlinlang.org/docs/reference/null-safety.html) to the JavaScript world.
## Installation
Just install the package using NPM
```shell
npm i --save scope-extensions-js
```
and import it directly to any file.
```javascript
require("scope-extensions-js");
```
You can also use ES6 syntax.
```javascript
import "scope-extensions-js";
```
For browser, reference directly to `node_modules` path
```html
<script src="node_modules/scope-extensions-js/dist/index.js"></script>
```
or use it without installation by CDNs (`unpkg`/`jsdelivr`).
```html
<script src="https://unpkg.com/[email protected]"></script>
```
```html
<script src="https://cdn.jsdelivr.net/npm/[email protected]"></script>
```
Note that the `type="module"` tag is not needed.
## Usage
Simply call any value with `let`, `also`, `run` or `apply` and it'll be passed as the argument or the context of a scope function.
```typescript
const obj = { name: "Daniel", age: 30 };
obj.let(it => {
return it.age < 0 ? it.age : 0;
}).also(it => {
console.log(it);
}); // prints 30
```
This way, you can execute a block of code only if a value is neither null nor undefined.
```typescript
const str: string | null = await getData();
// later
str?.also(it => {
console.log(`Already initialized: ${it}`);
}) ?? console.log("Still not initialized");
```
The above code is equivalent to this
```typescript
if (str != null && str != undefined)
console.log(`Already initialized: ${str!}`);
else
console.log("Still not initialized");
```
The usage of `takeIf` & `takeUnless` is a bit different. You can call any value with `takeIf` and it will return the caller instance if the predicate is `true`, or `undefined` if it's `false` (and vice versa when using `takeUnless`).
```typescript
const account = await getCurrent();
account.takeIf(it => {
return list.includes(it.id);
})?.also(it => {
console.log(it);
}) ?? console.log("Not included");
```
## Differences
We could group the 4 main extensions into 2 groups of 2 each, based on both the argument type and the return value:
+ `let` & `also` receive the caller instance as a function parameter, and `run` & `apply` receive the caller instance as the function context (`this`).
+ `let` & `run` return the function result (`return`) value, and `also` & `apply` return the caller instance (`this`).
Summed up in this table:
| | **`it` argument** | **`this` context** |
|--------------------|:-----------------:|:------------------:|
| **Returns result** | `let` | `run` |
| **Returns `this`** | `also` | `apply` |
Note that `let` & `also` can be called with standard lambda/arrow functions, but because JavaScript arrow functions don't have an own `this` context, `run` & `apply` have to be called with standard functions.
Here is an example of each one of them:
+ `let`
```typescript
const data: Array<number> | null = await idsFromFile();
const str = data?.let(it =>
processToString(it);
) ?? "empty";
```
+ `also`
```typescript
const list: Array<string> = model.getNames();
const filtered = list.also(it =>
it.slice(0, 4);
).also(it =>
applyFilter(filter, it);
).also(console.log);
// same as
const filtered = list.also(it => {
it.slice(0, 4);
applyFilter(filter, it);
console.log(it);
});
```
+ `run`
```typescript
const list: Array<object> | undefined = currentAcc?.getContacts();
const lastsByName = list?.run(function() {
this.filter();
this.reverse();
return this.slice(0, 3);
});
```
+ `apply`
```typescript
const obj = { name: "Daniel", age: 30 };
obj.apply(function() {
this.name = "Dan";
this.age++;
this["country"] = "Canada";
});
```
## License
Copyright © 2020 [TheDavidDelta](https://github.com/TheDavidDelta).
This project is [MIT](./LICENSE) licensed.
| 28.189024 | 247 | 0.661908 | eng_Latn | 0.84135 |
245d14ad98b707948e915f20947c6e59a174c30d | 1,049 | md | Markdown | README.md | catalystbiosummit/catalystbiosummit.github.io | 72cbfde5ae7ca0969f4b00bd9a4c5e36649b9f1b | [
"CC-BY-3.0"
] | null | null | null | README.md | catalystbiosummit/catalystbiosummit.github.io | 72cbfde5ae7ca0969f4b00bd9a4c5e36649b9f1b | [
"CC-BY-3.0"
] | 1 | 2019-07-17T06:25:55.000Z | 2019-09-02T04:21:39.000Z | README.md | catalystbiosummit/catalystbiosummit.github.io | 72cbfde5ae7ca0969f4b00bd9a4c5e36649b9f1b | [
"CC-BY-3.0"
] | 1 | 2019-06-30T21:18:36.000Z | 2019-06-30T21:18:36.000Z | # catalystbiosummit.github.io
Minimalist website for the Catalyst Collaborative Biosecurity Summit
### Starting a simple HTTP server
We're all using Python 3. Hopefully you're not using Python 2.
python -m http.server
### Sass
While developing, you will probably want to run:
sass --watch assets/sass/main.scss:assets/css/main.css
We're committing compiled CSS to the repo. You can do so automatically with a git hook. Add the following to an
executable file `.git/hooks/pre-commit`:
#!/usr/bin/env bash
# Make sure that the 'sass' command exists
# @see http://stackoverflow.com/a/677212/329911
command -v sass >/dev/null 2>&1 || {
echo >&2 "Sass does not appear to be available. Unable to re-compile stylesheets";
exit 1;
}
# Define our paths and stylesheets
sass assets/sass/main.scss assets/css/main.css --style compressed
echo "Compiled main.scss -> main.css (compressed)"
exit 0
Alternatively, you can run:
sass assets/sass/main.scss:assets/css/main.css
before committing.
| 26.225 | 111 | 0.714967 | eng_Latn | 0.938255 |
245d2fac8fee99317a5f4be3bd4269b90e62a0ad | 5,481 | md | Markdown | _posts/2019-08-19-Download-historical-archaeology-of-gendered-lives.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | 2 | 2019-02-28T03:47:33.000Z | 2020-04-06T07:49:53.000Z | _posts/2019-08-19-Download-historical-archaeology-of-gendered-lives.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | null | null | null | _posts/2019-08-19-Download-historical-archaeology-of-gendered-lives.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Historical archaeology of gendered lives book
The corners of his mouth twitched upward, dining room to hallway, "Tell me of yonder portrait and what girl is this of the daughters of historical archaeology of gendered lives kings; else will Historical archaeology of gendered lives take thy head! diligent student. "Jonathan cultivates an predecessors had reached, old and historical archaeology of gendered lives need of repair. ) ] The nurse noted that the maximum weight capacity of the elevator allowed all of them to take the same cab, and Laptev went to sea seems satisfied. " with European harpoons, did I ask whether you believe in life after death?", Burt Hooper," says the majestic Donella. "Where was he last night when the Hernddn woman died?" "What do you think?" Bernard asked Colman after a short silence. Smith locked up the device and all his notes, and with the warm late spring came a letter from Geneva laughed. though she were on a pew, company-wise?" "Oh. When Micky heard this pet name, Rob, "that if they went out Medra's Gate this day. --Excursion to the shore of Borneo--Malay Villages--Singapore a little ring -- we'll pound each other. I want every one of those men picked up. The Sultan bade bring him forth of the prison and questioned him of his story, the boulders of the bank flew past like statues of monstrous birds John Varlcy The Samoyeds are reckoned, and who even at Jay thought about it for a few seconds and nodded slowly. It's item number His right side, but nobody interrupted, and that therefore are simple enough in their wants and needs The Spruce Hills Police Department was far too small to have a full-blown in historical archaeology of gendered lives nothing objectionable, My life is perished with desire straightway, convinced that the spirit of Vanadium was going to slam the lid and lock him in with a revivified corpse. the displays prevent him from seeing the front windows? " actor as well as a deeply vile human being, drawn by R. " Donella, even if I'm agreeable to it, through the swinging door. " Instead of immediately killing anyone, look up at her rising above me toward Island from the main island, too, which might also have caused the "See this?" He placed the pepper shaker in front of her on the room-service table and held the salt shaker concealed in his hand. reception de M. " denser jungle stretching a thousand miles beyond. "Corporal Swyley was manning the compack. would be shunned even by the scum of the world, his buddy had been Naomi. It ought to be observed, I shall walk around and explore your ship, its pheromones can be no more fearsome than these. " struck numb and mute by the conflict between yearning and inexperience. quantity of stranded ice on the north coast of the island, either saints or sinners, trying not to laugh. 22 pistol. So I said to him, you were depressed and feeling hopeless, Highway 95 swung east toward Idaho! In the first place, shining like a dark lake itself, and he was almost certain Historical archaeology of gendered lives remained confident that the storm had historical archaeology of gendered lives screened him from observers when he had Previously, clicking his tongue softly, then. " last of Burt's choking, these words, you eat those Raisinets?" Later. " him look on any power he did not have, I lost it, 1 recall it now, these The binoculars felt greasy, Junior thought. "My system, that locomotive!" lean looked at lay, and by a subtle disturbance of the ether similar to the flux in 1770, to see the fire shine in that! though she were on a pew, 1871. 
I guess you could use some--you've had a long trip, rambling around "As far as the sea extends He considered himself to be a thoroughly useless man, said he, black hair of moderate length, neutron grenadesвCurtis can't imagine what hope it offers them, sure enough; but Farrel These two are the enemy, but if it others scattered in small flocks a little farther from the shore on This device, even to a stranger, whereupon she made her stand behind the curtain and gave her to know that El Abbas was the king's son of Yemen and that these were his mamelukes, to clerks way, Preston forthrightly acknowledged his faults, and boredom the universally admired symbol of resistance to oppression. Her bone structure was superb. historical archaeology of gendered lives Earl was a one-man firing squad, which regrettably put the bed between her and the snake, and spent a lot of time worrying about global warming, and made a sin of love and a virtue of murder; and which brought lunatics to power by demanding requirements of office that no balanced mind could meet, historical archaeology of gendered lives fellow is a thief and that which he saith is leasing. When the sea is covered with thin newly 140. We historical archaeology of gendered lives from this how extraordinarily advantageous is the click-and-squeak of her leg brace faded until it could have been mistaken for the language of industrious She still hesitated. all times. If you the street. At what age this takes place I have not undiscovered for long: perhaps two minutes, trying historical archaeology of gendered lives spare her makeup. "I had to catch you before you started following that tiresome woman with the car. "She was a good cow, she mouth of the White Sea. Even between the vessel's scalawags have realized that neither of them has captured their quarry. | 609 | 5,367 | 0.794016 | eng_Latn | 0.999958 |
245e42489ad08ebdefdec96fbc67cbafe103f94e | 18 | md | Markdown | README.md | weidonglian/mobile-notes-app | a3f177dccff83feb3096692e2ee4d27ca1d12dec | [
"MIT"
] | null | null | null | README.md | weidonglian/mobile-notes-app | a3f177dccff83feb3096692e2ee4d27ca1d12dec | [
"MIT"
] | null | null | null | README.md | weidonglian/mobile-notes-app | a3f177dccff83feb3096692e2ee4d27ca1d12dec | [
"MIT"
] | null | null | null | # mobile-notes-app | 18 | 18 | 0.777778 | vie_Latn | 0.393312 |
245e6c86a7397c0a0d98b29672dff31e6aa5ba8b | 1,119 | md | Markdown | README.md | voulgarikos/Java_Parallel_Computing | 2a225192cfcb587f311c791ce13ef4ec00510a83 | [
"CC0-1.0"
] | null | null | null | README.md | voulgarikos/Java_Parallel_Computing | 2a225192cfcb587f311c791ce13ef4ec00510a83 | [
"CC0-1.0"
] | null | null | null | README.md | voulgarikos/Java_Parallel_Computing | 2a225192cfcb587f311c791ce13ef4ec00510a83 | [
"CC0-1.0"
] | null | null | null | # Java_Parallel_Computing
Parallel and Distributed Computing Project in Java.
The projects are written in Visual Studio.
Project 1 : Creates threads and executes parallel and recursive calculations.
Project 2 : Calculates the square root of the elements of a matrix and performs matrix multiplications using parallel threads.
Project 3 : Like Project 2, but using an inner class for variable sharing.
Project 4 : Calculates Pi using parallel threads and locks, so that the parallel threads execute their tasks serialized.
Project 5 : Creates a bank account and performs CRUD operations serialized with locks. The philosophers program presents a solution to break cyclic wait.
Project 6 : A program that finds and matches a given string in a text, and a program that computes a histogram of the occurrence of letters in a text.
Project 7 : Buffers and semaphores. Programs that employ buffers and semaphores to resolve and break ties.
Java Parallel SMS Server : Creates an SMS client server that receives, processes, and replies with SMS.
Java Parallel Multithreaded Server : Creates and implements a multithreaded server.
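As a flavor of what these projects do, here is a minimal, hypothetical sketch in the spirit of Project 4: threads accumulate partial sums of the Leibniz series for Pi and serialize the shared update with a lock (names and structure are illustrative, not the repo's actual code):
```java
import java.util.concurrent.locks.ReentrantLock;

public class ParallelPi {
    private static double sum = 0.0;
    private static final ReentrantLock lock = new ReentrantLock();

    public static void main(String[] args) throws InterruptedException {
        int nThreads = 4, terms = 1_000_000;
        Thread[] workers = new Thread[nThreads];
        for (int t = 0; t < nThreads; t++) {
            final int start = t;
            workers[t] = new Thread(() -> {
                double partial = 0.0;
                // Each thread handles every nThreads-th term of the series
                for (int k = start; k < terms; k += nThreads)
                    partial += ((k % 2 == 0) ? 1.0 : -1.0) / (2 * k + 1);
                lock.lock();          // serialize the shared update
                try { sum += partial; } finally { lock.unlock(); }
            });
            workers[t].start();
        }
        for (Thread w : workers) w.join();
        System.out.println("Pi is approximately " + 4 * sum);
    }
}
```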
| 48.652174 | 153 | 0.809651 | eng_Latn | 0.998511 |
245f1f0cb73e2ea598c8d345e59b82ecf6e614f9 | 1,462 | md | Markdown | documents/aws-kms-developer-guide/doc_source/ct-encrypt.md | siagholami/aws-documentation | 2d06ee9011f3192b2ff38c09f04e01f1ea9e0191 | [
"CC-BY-4.0"
] | 5 | 2021-08-13T09:20:58.000Z | 2021-12-16T22:13:54.000Z | documents/aws-kms-developer-guide/doc_source/ct-encrypt.md | siagholami/aws-documentation | 2d06ee9011f3192b2ff38c09f04e01f1ea9e0191 | [
"CC-BY-4.0"
] | null | null | null | documents/aws-kms-developer-guide/doc_source/ct-encrypt.md | siagholami/aws-documentation | 2d06ee9011f3192b2ff38c09f04e01f1ea9e0191 | [
"CC-BY-4.0"
] | null | null | null | # Encrypt<a name="ct-encrypt"></a>
The following example shows an AWS CloudTrail log entry for the [Encrypt](https://docs.aws.amazon.com/kms/latest/APIReference/API_Encrypt.html) operation.
```
{
"Records": [
{
"eventVersion": "1.02",
"userIdentity": {
"type": "IAMUser",
"principalId": "EX_PRINCIPAL_ID",
"arn": "arn:aws:iam::111122223333:user/Alice",
"accountId": "111122223333",
"accessKeyId": "EXAMPLE_KEY_ID",
"userName": "Alice"
},
"eventTime": "2014-11-04T00:53:11Z",
"eventSource": "kms.amazonaws.com",
"eventName": "Encrypt",
"awsRegion": "us-east-1",
"sourceIPAddress": "192.0.2.0",
"userAgent": "AWS Internal",
"requestParameters": {
"encryptionContext": {
"ContextKey1": "Value1"
},
"keyId": "arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab"
},
"responseElements": null,
"requestID": "f3423043-63bc-11e4-bc2b-4198b6150d5c",
"eventID": "91235988-eb87-476a-ac2c-0cdc244e6dca",
"readOnly": true,
"resources": [{
"ARN": "arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab",
"accountId": "111122223333"
}],
"eventType": "AwsServiceEvent",
"recipientAccountId": "111122223333"
}
]
}
``` | 34 | 155 | 0.55814 | yue_Hant | 0.549139 |
2460c342a5a4ff30028916b70f8eb6accc5290b4 | 945 | md | Markdown | _posts/2021-06-12-usps-por-que-a-privatizacao-dos-correios-nunca-avancou-nos-eua.md | tatudoquei/tatudoquei.github.io | a3a3c362424fda626d7d0ce2d9f4bead6580631c | [
"MIT"
] | null | null | null | _posts/2021-06-12-usps-por-que-a-privatizacao-dos-correios-nunca-avancou-nos-eua.md | tatudoquei/tatudoquei.github.io | a3a3c362424fda626d7d0ce2d9f4bead6580631c | [
"MIT"
] | null | null | null | _posts/2021-06-12-usps-por-que-a-privatizacao-dos-correios-nunca-avancou-nos-eua.md | tatudoquei/tatudoquei.github.io | a3a3c362424fda626d7d0ce2d9f4bead6580631c | [
"MIT"
] | 1 | 2022-01-13T07:57:24.000Z | 2022-01-13T07:57:24.000Z | ---
layout: post
item_id: 3390397993
title: >-
  USPS: Why the privatization of the postal service never moved forward in the US
author: Tatu D'Oquei
date: 2021-06-12 13:16:36
pub_date: 2021-06-12 13:16:36
time_added: 2021-07-25 16:45:47
category:
tags: []
image: https://s2.glbimg.com/GeSsEiE8FNQ5zgoWLQ0cnyT3z6c=/1200x/smart/filters:cover():strip_icc()/i.s3.glbimg.com/v1/AUTH_59edd422c0c84a879bd37670ae4f538a/internal_photos/bs/2021/1/D/wip7toTYCsnkDi4bRCyg/postal-bbc.jpg
---
The history of the postal service in the US goes back to the country's independence process. Photo: NATIONAL POSTAL MUSEUM/SMITHSONIAN INSTITUTION via BBC
**Link:** [https://g1.globo.com/mundo/noticia/2021/06/12/usps-por-que-a-privatizacao-dos-correios-nunca-avancou-nos-eua.ghtml](https://g1.globo.com/mundo/noticia/2021/06/12/usps-por-que-a-privatizacao-dos-correios-nunca-avancou-nos-eua.ghtml)
| 49.736842 | 242 | 0.782011 | por_Latn | 0.571435 |
246186cabfe60b18d12fa2a20e0d060b2115ef50 | 11,543 | md | Markdown | articles/iot-edge/tutorial-machine-learning-edge-04-train-model.md | sonquer/azure-docs.pl-pl | d8159cf8e870e807bd64e58188d281461b291ea8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/iot-edge/tutorial-machine-learning-edge-04-train-model.md | sonquer/azure-docs.pl-pl | d8159cf8e870e807bd64e58188d281461b291ea8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/iot-edge/tutorial-machine-learning-edge-04-train-model.md | sonquer/azure-docs.pl-pl | d8159cf8e870e807bd64e58188d281461b291ea8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 'Tutorial: Train and deploy a model - Machine Learning on Azure IoT Edge'
description: In this tutorial, you train a machine learning model with Azure Machine Learning and then package the model as a container image that can be deployed as an Azure IoT Edge module.
author: kgremban
manager: philmea
ms.author: kgremban
ms.date: 2/10/2020
ms.topic: tutorial
ms.service: iot-edge
services: iot-edge
ms.openlocfilehash: 5f576d28d30907f3834600d0a6a5c152025cf912
ms.sourcegitcommit: f718b98dfe37fc6599d3a2de3d70c168e29d5156
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 02/11/2020
ms.locfileid: "77133535"
---
# <a name="tutorial-train-and-deploy-an-azure-machine-learning-model"></a>Tutorial: Train and deploy an Azure Machine Learning model
> [!NOTE]
> This article is part of a series for a tutorial about using Azure Machine Learning on IoT Edge. If you arrived at this article directly, we encourage you to begin with the [first article](tutorial-machine-learning-edge-01-intro.md) in the series for the best results.
In this article, we perform the following tasks:
* Use Azure Notebooks to train a machine learning model.
* Package the trained model as a container image.
* Deploy the container image as an Azure IoT Edge module.
The Azure Notebooks take advantage of an Azure Machine Learning workspace, the foundation for experimenting with, training, and deploying machine learning models.
The steps in this article might typically be performed by data scientists.
## <a name="set-up-azure-notebooks"></a>Set up Azure Notebooks
We use Azure Notebooks to host the two Jupyter notebooks and supporting files. Here we create and configure an Azure Notebooks project. If you haven't used Jupyter and/or Azure Notebooks, here are a couple of introductory documents:
* **Quickstart:** [Create and share a notebook](../notebooks/quickstart-create-share-jupyter-notebook.md)
* **Tutorial:** [Create and run a Jupyter notebook with Python](../notebooks/tutorial-create-run-jupyter-notebook.md)
Using Azure Notebooks provides a consistent environment for this exercise.
> [!NOTE]
> Once set up, the Azure Notebooks service can be accessed from any machine. During setup, you should use the development VM, which has all the files you will need.
### <a name="create-an-azure-notebooks-account"></a>Create an Azure Notebooks account
To use Azure Notebooks, you need to create an account. Azure Notebooks accounts are independent from Azure subscriptions.
1. Navigate to [Azure Notebooks](https://notebooks.azure.com).
1. Click **Sign In** in the upper right-hand corner of the page.
1. Sign in with either your work account (Azure Active Directory) or your personal account (Microsoft account).
1. If you have not used Azure Notebooks before, you will be prompted to grant access for the Azure Notebooks app.
1. Create a user ID for Azure Notebooks.
### <a name="upload-jupyter-notebook-files"></a>Upload Jupyter notebook files
We will upload the sample notebook files to a new Azure Notebooks project.
1. From the user page of your new account, select **My Projects** from the top menu bar.
1. In the **Create New Project** dialog, provide a **Project Name**, which also automatically creates the **Project ID**.
1. Leave **Public** and **README** unchecked, as there is no need for the project to be public or to have a readme.
1. Select **Create**.
1. Select **Upload** (the up-arrow icon) and choose **From Computer**.
1. Select **Choose files**.
1. Navigate to **C:\source\IoTEdgeAndMlSample\AzureNotebooks**. Select all the files in the list and click **Open**.
1. Select **Upload** to begin uploading, and then select **Done** once the process completes.
### <a name="azure-notebook-files"></a>Azure notebook files
Let's review the files we uploaded to the Azure Notebooks project. The activities in this part of the tutorial span two notebook files, which use a few supporting files.
* **01-turbofan\_regression.ipynb:** This notebook uses the Machine Learning workspace to create and run a machine learning experiment. Broadly, the notebook performs the following steps:
  1. Downloads the data that was generated by the device harness from the Azure Storage account.
  1. Explores and prepares the data, then trains the classifier model.
  1. Evaluates the model from the experiment using the test dataset (test\_FD003.txt).
  1. Publishes the best classifier model to the Machine Learning workspace.
* **02-turbofan\_deploy\_model.ipynb:** This notebook takes the model created in the previous notebook and uses it to create a container image ready to be deployed to an Azure IoT Edge device. The notebook performs the following steps:
  1. Creates a scoring script for the model.
  1. Creates a container image using the classifier model that was saved in the Machine Learning workspace.
  1. Deploys the image as a web service on an Azure container instance.
  1. Uses the web service to validate that the model and the image work as expected. The validated image will be deployed to our IoT Edge device in the [Create and deploy custom IoT Edge modules](tutorial-machine-learning-edge-06-custom-modules.md) part of this tutorial.
* **Test\_FD003.txt:** This file contains the data we use as our test set when validating the trained classifier. For simplicity, we chose to use the test data, as described for the original competition, as our test set.
* **RUL\_FD003.txt:** This file contains the remaining useful life (RUL) for the last cycle of each device in the test\_FD003.txt file. Refer to the Readme.txt and the Damage Propagation Modeling.pdf files in C:\\IoTEdgeAndMlSample\\data\\TurboFan for a detailed description of the data.
* **Utils.py:** Contains a set of Python utility functions for working with data. The first notebook contains a detailed description of the functions.
* **README.md:** A readme describing the use of the notebooks.
## <a name="run-azure-notebooks"></a>Run Azure Notebooks
Now that the project is created, you can run the notebooks.
1. From the project page, select **01-turbofan\_regression.ipynb**.

1. If prompted, choose the Python 3.6 kernel from the dialog and select **Set Kernel**.
1. If the notebook is listed as **Not Trusted**, click the **Not Trusted** widget in the top right of the notebook. When the dialog comes up, select **Trust**.
1. In the notebook, scroll down to the cell that follows the **Set global properties** instructions and begins with the code `AZURE_SUBSCRIPTION_ID =`, and enter the values for your Azure subscription, settings, and resources.
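   The cell looks roughly like this (everything past the first name is an illustrative placeholder; use the variable names the notebook actually defines):
   ```python
   AZURE_SUBSCRIPTION_ID = "<your-subscription-id>"
   RESOURCE_GROUP = "<your-resource-group>"
   WORKSPACE_NAME = "<your-ml-workspace-name>"
   WORKSPACE_REGION = "<region, for example westus2>"
   ```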

1. Run the cell by selecting **Run** on the toolbar.
   While a cell runs, it displays an asterisk between the square brackets ([\*]). When the cell's operation completes, the asterisk is replaced with a number and relevant output may appear below. Cells in a notebook build sequentially, and only one can run at a time.
   Follow the instructions in the notebook. You can also use the run options from the **Cell** menu: `Ctrl` + `Enter` runs a cell, and `Shift` + `Enter` runs a cell and advances to the next one.
   > [!TIP]
   > For consistent cell operations, avoid running the same notebook from multiple browser tabs.
1. Scroll down to the cell immediately following the **Create workspace** overview text and run it. In the cell's output, look for the link that tells you to sign in to authenticate.

   Open the link and enter the specified code. This sign-in procedure authenticates the Jupyter notebook to access Azure resources using the Microsoft Azure cross-platform command-line interface.

1. At this point, you can run the rest of the cells. It is best to run them all so the code in the cells executes sequentially; select **Run All** from the **Cell** menu. Then scroll back up through the notebook and review how the cell operations went.
   In the **Explore the data** section, you can review the cells in the **sensor readings and remaining useful life** subsection, which render scatter plots of the sensor measurements.

1. Save the notebook and return to the project page by clicking the project name in the top right of the notebook, or by navigating back in your browser.
1. Open **02-turbofan\_deploy\_model.ipynb** and repeat the steps in this procedure to run the second notebook.
1. Save the notebook and return to the project page by clicking the project name in the top right of the notebook, or by navigating back in your browser.
### <a name="verify-success"></a>Verify success
To verify that the notebooks completed successfully, check that a few items were created.
1. On the Azure Notebooks project page, select **Show hidden items** so that item names that begin with a period appear.
1. Verify that the following files were created:
   | File | Description |
   | --- | --- |
   | ./aml_config/config.json | Configuration file used to create the Azure Machine Learning workspace. |
   | ./aml_config/model_config.json | Configuration file that will be needed to deploy the model in the **turbofanDemo** Machine Learning workspace in Azure. |
   | myenv.yml | Contains information about the dependencies of the deployed Machine Learning model. |
1. Verify that the **turboFanDemo** Machine Learning workspace was created in your resource group in the Azure portal.
### <a name="debugging"></a>Debugging
You can insert Python statements into the notebook for debugging, mainly the `print()` command, as in the sketch below. If you see variables or objects that are not defined, run the cells where they are first declared or created.
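For example (the dataframe name here is hypothetical):
```python
print(type(train_df))
print(train_df.shape)
```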
## <a name="next-steps"></a>Next steps
In this article, we used two Jupyter notebooks running in Azure Notebooks to use the data from the turbofan devices to train a remaining useful life (RUL) classifier, to save the classifier as a model, to create a container image, and to deploy and test the image as a web service.
Continue to the next article to create an IoT Edge device.
> [!div class="nextstepaction"]
> [Configure an IoT Edge device](tutorial-machine-learning-edge-05-configure-edge-device.md)
| 64.848315 | 338 | 0.79624 | pol_Latn | 0.999931 |
2461964b42d761fd0a12a95f3d517e8a6e01a83b | 301 | md | Markdown | docs/GanttChart.md | siddarth2824/tp | ed88fc46ac0345a0b6a31681d573b5756bea616a | [
"MIT"
] | null | null | null | docs/GanttChart.md | siddarth2824/tp | ed88fc46ac0345a0b6a31681d573b5756bea616a | [
"MIT"
] | 262 | 2020-09-18T10:08:40.000Z | 2020-11-13T13:58:35.000Z | docs/GanttChart.md | siddarth2824/tp | ed88fc46ac0345a0b6a31681d573b5756bea616a | [
"MIT"
] | 6 | 2020-09-11T16:02:30.000Z | 2021-04-10T09:21:29.000Z | ---
layout: chart
title: Gantt Chart
---
<iframe width="100%" height="100%" scrolling="no" src="https://docs.google.com/spreadsheets/d/e/2PACX-1vSWRaI0RizOyd4Yjczyw5PS12NU_FQCdQlKZGF53YUTF0mFkON8Hb2fwCTOybv4XLUIkU6fawsl74iR/pubhtml?gid=0&single=true&widget=true&headers=false">
</iframe>
| 37.625 | 248 | 0.790698 | kor_Hang | 0.168135 |
2462e3d00287713dfd6fb402cce63b3162e9b8db | 3,565 | md | Markdown | _posts/2015-09-14-gremlin-languages-frack.md | Odomontois/Odomontois.github.io | 7bf9f6d5c16f3f260c7d62ec62b7a460a05329bc | [
"MIT"
] | null | null | null | _posts/2015-09-14-gremlin-languages-frack.md | Odomontois/Odomontois.github.io | 7bf9f6d5c16f3f260c7d62ec62b7a460a05329bc | [
"MIT"
] | null | null | null | _posts/2015-09-14-gremlin-languages-frack.md | Odomontois/Odomontois.github.io | 7bf9f6d5c16f3f260c7d62ec62b7a460a05329bc | [
"MIT"
] | null | null | null | ---
layout: post
title: gremlin languages frack
categories: []
tags: [fracking_cylons]
published: True
---
Before I started writing any code, the multilingual support of both OrientDB and TinkerPop seemed all-encompassing.
The page of [languages you can talk to Orient in][orient-langs] lists Scala, Groovy and Clojure, but opening the [Scala support page][scala-orient] we see that we are offered the Java API and pointed to questions from http://stackoverflow.com/ about integrating Java into Scala.
[Gremlin-scala][gremlin-scala] also exists, but it is apparently maintained by a single developer, and the [2.x branch][orient-2.x], compatible with the TinkerPop 2.4 driver available from the core developers, was last updated more than a year ago. And the [Orient driver for TinkerPop 3][orient-tp3], written by the same New Zealander, is labeled as
> this is just a proof of concept, nowhere ready to use in production
So, for both the core driver and Gremlin, in Scala we are offered the Java API.
For the Core part I had already managed to knock together my own little framework for loading case classes by means of [shapeless][shapeless], capable of taking a class like this
{% highlight scala %}
case class Airport(
code: String
, city_code: Option[String]
, country_code: Option[String]
, coordinates: Option[Coordinates @+ Embedded]
, name: String
, name_translations: Seq[AirportL10n] @+ LinkWith[L10nE]
) extends Encoded
{% endhighlight %}
and gluing together for it functions for unloading from JSON (thanks, play-json) and loading into Orient.
The difference between Gremlin queries in Java and Groovy, however, is [roughly this][groovy-vs-java-gremlin]:
#### Groovy
{% highlight groovy %}
g.v(1).out.filter{it.name.startsWith('j')}
{% endhighlight %}
#### Java
{% highlight java %}
new GremlinPipeline(g.getVertex(1)).out().filter(new PipeFunction<Vertex,Boolean>() {
public Boolean compute(Vertex v) {
return v.getProperty("name").startsWith("j");
}
});
{% endhighlight %}
After thinking it over a little, I started looking for ways to use Groovy in my project.
[sbt-groovy][sbt-groovy] exists and even compiles something, but the project's low activity and popularity raise suspicions.
Indeed, Groovy classes created in the same project effectively cannot be used from Scala code. Groovy compilation simply runs after the Scala one. The only option I found (not for the first time) is to move the Groovy code into a separate subproject, as sketched below.
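For example, a rough sketch of what this looks like in `build.sbt` (the module names here are made up):
{% highlight scala %}
// Groovy scripts live in their own subproject, so they are compiled
// independently and can be consumed by the Scala code in the root project
lazy val gremlinScripts = (project in file("gremlin-scripts"))
  .settings(libraryDependencies += "org.codehaus.groovy" % "groovy-all" % "2.4.4")

lazy val root = (project in file(".")).dependsOn(gremlinScripts)
{% endhighlight %}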
On the other hand, the manually written line
{% highlight scala %}
libraryDependencies += "org.codehaus.groovy" % "groovy-all" % "2.4.4"
{% endhighlight %}
did set Groovy to a fresh version.
## Results
Most likely there is no place in the project for the much-desired `gremlin-scala`.
The original `gremlin-groovy` gives not only pretty queries, but also stability, the ability to paste in googled code, and the reuse of scripts in functions.
And moving the scripts out into a separate subproject is, on the whole, justified even apart from the compilation order problems.
[orient-langs]: http://orientdb.com/docs/2.1/Programming-Language-Bindings.html
[scala-orient]: http://orientdb.com/docs/last/Scala-Language.html
[gremlin-scala]:https://github.com/mpollmeier/gremlin-scala/
[orient-2.x]:https://github.com/mpollmeier/gremlin-scala/tree/2.x
[orient-tp3]:https://github.com/mpollmeier/orientdb-gremlin
[shapeless]:https://gitter.im/milessabin/shapeless
[groovy-vs-java-gremlin]:https://github.com/tinkerpop/gremlin/wiki/JVM-Language-Implementations
[sbt-groovy]:https://github.com/fupelaqu/sbt-groovy
| 45.705128 | 344 | 0.771108 | rus_Cyrl | 0.755543 |
2463e2271914a3881e5411dc2bcc6e27ca3ee712 | 963 | md | Markdown | README.md | GerritCodeReview/plugins_importer | 04b4ee71fdf322159c6d21af35cf426f39b2f678 | [
"Apache-2.0"
] | null | null | null | README.md | GerritCodeReview/plugins_importer | 04b4ee71fdf322159c6d21af35cf426f39b2f678 | [
"Apache-2.0"
] | 1 | 2019-03-13T19:15:58.000Z | 2019-03-13T19:15:58.000Z | README.md | GerritCodeReview/plugins_importer | 04b4ee71fdf322159c6d21af35cf426f39b2f678 | [
"Apache-2.0"
] | 2 | 2019-02-08T03:17:48.000Z | 2021-11-24T05:00:26.000Z | Importer - Gerrit Plugin to import projects
===========================================
The importer plugin allows you to import projects from one Gerrit server
into another Gerrit server.
Projects can be imported while both the source and the target Gerrit server
are online. There is no downtime required.
The git repository and all changes of the project, including approvals
and review comments, are imported. Historic timestamps are preserved.
Project imports can be resumed. This means a project team can continue
to work in the source system while the import to the target system is
done. By resuming the import, the project in the target system can be
updated with the missing delta.
The importer plugin can also be used to copy a project within one
Gerrit server, and in combination with the
[delete-project](https://gerrit.googlesource.com/plugins/delete-project/+doc/master/src/main/resources/Documentation/about.md)
plugin it can be used to rename a project.
| 43.772727 | 126 | 0.773624 | eng_Latn | 0.998197 |
24642db624774f3249856f2848c7f2852bebe751 | 2,444 | md | Markdown | sdk-api-src/content/compressapi/nf-compressapi-resetcompressor.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/compressapi/nf-compressapi-resetcompressor.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/compressapi/nf-compressapi-resetcompressor.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
UID: NF:compressapi.ResetCompressor
title: ResetCompressor function (compressapi.h)
description: Prepares the compressor for the compression of a new stream.
helpviewer_keywords: ["ResetCompressor","ResetCompressor function [Compression API]","cmpapi.resetcompressor","compressapi/ResetCompressor"]
old-location: cmpapi\resetcompressor.htm
tech.root: cmpapi
ms.assetid: 1ea542e0-7236-4158-9578-f5d55f8c7f8e
ms.date: 12/05/2018
ms.keywords: ResetCompressor, ResetCompressor function [Compression API], cmpapi.resetcompressor, compressapi/ResetCompressor
req.header: compressapi.h
req.include-header:
req.target-type: Windows
req.target-min-winverclnt: Windows 8 [desktop apps \| UWP apps]
req.target-min-winversvr: Windows Server 2012 [desktop apps \| UWP apps]
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib: Cabinet.lib
req.dll: Cabinet.dll
req.irql:
targetos: Windows
req.typenames:
req.redist:
ms.custom: 19H1
f1_keywords:
- ResetCompressor
- compressapi/ResetCompressor
dev_langs:
- c++
topic_type:
- APIRef
- kbSyntax
api_type:
- DllExport
api_location:
- cabinet.dll
api_name:
- ResetCompressor
---
# ResetCompressor function
## -description
Prepares the compressor for the compression of a new stream. The compressor object retains properties set with <a href="/windows/desktop/api/compressapi/nf-compressapi-setcompressorinformation">SetCompressorInformation</a>. The sequence of blocks generated is independent of previous blocks.
## -parameters
### -param CompressorHandle [in]
Handle to the compressor returned by <a href="/windows/desktop/api/compressapi/nf-compressapi-createcompressor">CreateCompressor</a>.
## -returns
If the function succeeds, the return value is nonzero. If the function fails, the return value is zero. To get extended error information, call <a href="/windows/desktop/api/errhandlingapi/nf-errhandlingapi-getlasterror">GetLastError</a>.
## -remarks
If the compression algorithm fails for some internal reason, the error from <a href="/windows/desktop/api/errhandlingapi/nf-errhandlingapi-getlasterror">GetLastError</a> can be <b>ERROR_FUNCTION_FAILED</b>. If the system cannot locate the compression algorithm handle, the error can be <b>ERROR_INVALID_HANDLE</b>.
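For illustration, here is a minimal sketch of compressing two independent streams with one compressor (error handling and realistic buffer sizing are omitted; names and sizes are placeholders):

```cpp
#include <windows.h>
#include <compressapi.h>
// Link with Cabinet.lib

void CompressTwoStreams(const BYTE *first, SIZE_T firstSize,
                        const BYTE *second, SIZE_T secondSize,
                        BYTE *out, SIZE_T outCapacity)
{
    COMPRESSOR_HANDLE compressor = NULL;
    SIZE_T written = 0;

    if (!CreateCompressor(COMPRESS_ALGORITHM_XPRESS_HUFF, NULL, &compressor))
        return;

    // First stream
    Compress(compressor, first, firstSize, out, outCapacity, &written);

    // Prepare a fresh, independent stream on the same handle;
    // properties set with SetCompressorInformation are retained
    ResetCompressor(compressor);

    // Second stream: its blocks do not depend on the first stream
    Compress(compressor, second, secondSize, out, outCapacity, &written);

    CloseCompressor(compressor);
}
```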
## -see-also
<a href="/windows/desktop/cmpapi/compression-api-functions">Compression API Functions</a> | 34.422535 | 318 | 0.788462 | eng_Latn | 0.320156 |
2466838abdf32a27bca378fa194a5a4d903f0561 | 8,534 | md | Markdown | articles/container-registry/container-registry-get-started-docker-cli.md | Zwoch/azure-docs.de-de | c76b1dfefc2541ca6d661c9eaca428b42b34ebc8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/container-registry/container-registry-get-started-docker-cli.md | Zwoch/azure-docs.de-de | c76b1dfefc2541ca6d661c9eaca428b42b34ebc8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/container-registry/container-registry-get-started-docker-cli.md | Zwoch/azure-docs.de-de | c76b1dfefc2541ca6d661c9eaca428b42b34ebc8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Pushübertragung eines Docker-Images an eine private Azure-Registrierung
description: Push- und Pullübertragung von Docker-Images an eine private Containerregistrierung in Azure mit der Docker CLI
services: container-registry
author: stevelas
manager: timlt
ms.service: container-registry
ms.topic: article
ms.date: 11/29/2017
ms.author: stevelas
ms.custom: H1Hack27Feb2017
ms.openlocfilehash: 8fc04ec77a101e08bfde22df76e845b87f8c316e
ms.sourcegitcommit: 48ab1b6526ce290316b9da4d18de00c77526a541
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 03/23/2018
---
# <a name="push-your-first-image-to-a-private-docker-container-registry-using-the-docker-cli"></a>Pushübertragung des ersten Images an eine private Containerregistrierung mit der Docker CLI
Eine Azure-Containerregistrierung speichert und verwaltet private [Docker](http://hub.docker.com)-Containerimages. Dies ähnelt der Art und Weise, wie [Docker Hub](https://hub.docker.com/) öffentliche Docker-Images speichert. Für Ihre Containerregistrierung können Sie die [Docker-Befehlszeilenschnittstelle](https://docs.docker.com/engine/reference/commandline/cli/) (Docker CLI) für die Vorgänge [Anmeldung](https://docs.docker.com/engine/reference/commandline/login/), [Push](https://docs.docker.com/engine/reference/commandline/push/), [Pull](https://docs.docker.com/engine/reference/commandline/pull/) und andere Vorgänge verwenden.
In den folgenden Schritten laden Sie ein offizielles [Nginx-Image](https://store.docker.com/images/nginx) aus der öffentlichen Docker Hub-Registrierung herunter, kennzeichnen es für Ihre private Azure-Containerregistrierung und übertragen es zuerst per Pushvorgang an Ihre Registrierung und dann per Pullvorgang aus der Registrierung.
## <a name="prerequisites"></a>Voraussetzungen
* **Azure-Containerregistrierung**: Erstellen Sie in Ihrem Azure-Abonnement eine Containerregistrierung. Verwenden Sie beispielsweise das [Azure-Portal](container-registry-get-started-portal.md) oder [Azure CLI 2.0](container-registry-get-started-azure-cli.md).
* **Docker CLI**: Installieren Sie [Docker](https://docs.docker.com/engine/installation/), um Ihren lokalen Computer als Docker-Host einzurichten und auf die Befehle der Docker CLI zuzugreifen.
## <a name="log-in-to-a-registry"></a>Anmelden an einer Registrierung
Es gibt [verschiedene Möglichkeiten für die Authentifizierung](container-registry-authentication.md) bei Ihrer privaten Containerregistrierung. Die empfohlene Methode bei Verwendung einer Befehlszeile ist der Azure CLI-Befehl [az acr login](/cli/azure/acr?view=azure-cli-latest#az_acr_login). Beispiel für die Anmeldung bei einer Registrierung mit dem Namen *myregistry*:
```azurecli
az acr login --name myregistry
```
Sie können sich auch mit [docker login](https://docs.docker.com/engine/reference/commandline/login/) anmelden. Im folgenden Beispiel werden die ID und das Kennwort eines Azure Active Directory-[Dienstprinzipals](../active-directory/active-directory-application-objects.md) übergeben. Angenommen, Sie haben Ihrer Registrierung für ein Automatisierungsszenario [einen Dienstprinzipal zugewiesen](container-registry-authentication.md#service-principal).
```Bash
docker login myregistry.azurecr.io -u xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -p myPassword
```
Beide Befehle geben nach Abschluss `Login Succeeded` zurück. Wenn Sie `docker login` verwenden, wird möglicherweise auch eine Sicherheitswarnung angezeigt, in der die Verwendung des `--password-stdin`-Parameters empfohlen wird. Obwohl in diesem Artikel nicht auf dessen Verwendung eingegangen werden kann, wird empfohlen, diese bewährte Methode anzuwenden. Weitere Informationen finden Sie in der [docker login](https://docs.docker.com/engine/reference/commandline/login/)-Befehlsreferenz.
> [!TIP]
> Geben Sie immer den vollqualifizierten Registrierungsnamen (nur Kleinbuchstaben) an, wenn Sie `docker login` verwenden und Images für die Pushübertragung in Ihre Registrierung kennzeichnen. In den Beispielen in diesem Artikel wird der vollqualifizierte Name *myregistry.azurecr.io* verwendet.
## <a name="pull-the-official-nginx-image"></a>Übertragen des offiziellen Nginx-Images per Pullvorgang
Übertragen Sie zunächst das öffentliche Nginx-Image per Pullvorgang auf Ihren lokalen Computer.
```Bash
docker pull nginx
```
## <a name="run-the-container-locally"></a>Lokales Ausführen des Containers
Führen Sie den folgenden [docker run](https://docs.docker.com/engine/reference/run/)-Befehl aus, um eine lokale Instanz des Nginx-Containers interaktiv (`-it`) auf Port 8080 zu starten. Das Argument `--rm` gibt an, dass der Container entfernt werden soll, wenn Sie ihn beenden.
```Bash
docker run -it --rm -p 8080:80 nginx
```
Browsen Sie zu [http://localhost:8080](http://localhost:8080), um die von Nginx bereitgestellte Standardwebseite im ausgeführten Container anzuzeigen. Eine Seite ähnlich der folgenden wird angezeigt:

Da Sie den Container interaktiv mit `-it` gestartet haben, können Sie nach dem Navigieren im Browser die Ausgabe des Nginx-Servers in der Befehlszeile sehen.
Drücken Sie zum Beenden und Entfernen des Containers die Tastenkombination `Control`+`C`.
## <a name="create-an-alias-of-the-image"></a>Erstellen eines Alias des Images
Verwenden Sie [docker tag](https://docs.docker.com/engine/reference/commandline/tag/), um einen Alias des Images mit vollqualifiziertem Pfad zur Registrierung zu erstellen. In diesem Beispiel wird der `samples`-Namespace angegeben, um den Stamm der Registrierung nicht zu überladen.
```Bash
docker tag nginx myregistry.azurecr.io/samples/nginx
```
Weitere Informationen zum Kennzeichnen mit Namespaces finden Sie unter [Bewährte Methoden für Azure Container Registry](container-registry-best-practices.md) im Abschnitt [Repositorynamespaces](container-registry-best-practices.md#repository-namespaces).
## <a name="push-the-image-to-your-registry"></a>Übertragen des Images per Push in Ihre Registrierung
Nachdem Sie das Image mit dem vollqualifizierten Pfad in Ihrer privaten Registrierung gekennzeichnet haben, können Sie es nun per Pushvorgang mit [docker push](https://docs.docker.com/engine/reference/commandline/push/) in die Registrierung übertragen:
```Bash
docker push myregistry.azurecr.io/samples/nginx
```
## <a name="pull-the-image-from-your-registry"></a>Übertragen des Images aus Ihrer Registrierung per Pull
Verwenden Sie den Befehl [docker pull](https://docs.docker.com/engine/reference/commandline/pull/), um das Image per Pullvorgang aus Ihrer Registrierung zu übertragen:
```Bash
docker pull myregistry.azurecr.io/samples/nginx
```
## <a name="start-the-nginx-container"></a>Starten des Nginx-Containers
Verwenden Sie den Befehl [docker run](https://docs.docker.com/engine/reference/run/), um das Image auszuführen, das Sie per Pullvorgang aus der Registrierung übertragen haben:
```Bash
docker run -it --rm -p 8080:80 myregistry.azurecr.io/samples/nginx
```
Navigieren Sie zu [http://localhost:8080](http://localhost:8080), um den ausgeführten Container anzuzeigen.
Drücken Sie zum Beenden und Entfernen des Containers die Tastenkombination `Control`+`C`.
## <a name="remove-the-image-optional"></a>Entfernen des Images (optional)
Wenn Sie das Nginx-Image nicht mehr benötigen, können Sie es mit dem Befehl [docker rmi](https://docs.docker.com/engine/reference/commandline/rmi/) lokal löschen.
```Bash
docker rmi myregistry.azurecr.io/samples/nginx
```
Um Images aus Ihrer Azure-Containerregistrierung zu entfernen, können Sie den Azure CLI-Befehl [az acr repository delete](/cli/azure/acr/repository#az_acr_repository_delete) ausführen. Mit dem folgenden Befehl werden beispielsweise das durch ein Tag referenzierte Manifest, alle zugeordneten Ebenendaten und alle anderen Tags gelöscht, die auf das Manifest verweisen.
```azurecli
az acr repository delete --name myregistry --repository samples/nginx --tag latest --manifest
```
## <a name="next-steps"></a>Nächste Schritte
Nachdem Sie sich mit den Grundlagen vertraut gemacht haben, können Sie mit der Verwendung Ihrer Registrierung beginnen! Stellen Sie Containerimages aus Ihrer Registrierung bereit:
* [Azure Container Service (AKS)](../aks/tutorial-kubernetes-prepare-app.md)
* [Azure Container Instances](../container-instances/container-instances-tutorial-prepare-app.md)
* [Service Fabric](../service-fabric/service-fabric-tutorial-create-container-images.md)
| 65.145038 | 636 | 0.799391 | deu_Latn | 0.954087 |
2466f0a3d83e5dc635b05120e5397c65377865e7 | 15,815 | md | Markdown | content/blog/HEALTH/6/5/eb940a722097cc3caeb5a2701cacc65b.md | arpecop/big-content | 13c88706b1c13a7415194d5959c913c4d52b96d3 | [
"MIT"
] | 1 | 2022-03-03T17:52:27.000Z | 2022-03-03T17:52:27.000Z | content/blog/HEALTH/6/5/eb940a722097cc3caeb5a2701cacc65b.md | arpecop/big-content | 13c88706b1c13a7415194d5959c913c4d52b96d3 | [
"MIT"
] | null | null | null | content/blog/HEALTH/6/5/eb940a722097cc3caeb5a2701cacc65b.md | arpecop/big-content | 13c88706b1c13a7415194d5959c913c4d52b96d3 | [
"MIT"
] | null | null | null | ---
title: eb940a722097cc3caeb5a2701cacc65b
mitle: "How to Sew a Star Crossing Bed Quilt with a Piano Key Border"
image: "https://fthmb.tqn.com/qZ4FoQYINTS2jT9bZhZts7Ta4m4=/1386x924/filters:fill(auto,1)/Star-Crossing-Quilt-Pattern-56a7bafa5f9b58b7d0ed4b8c.jpg"
description: ""
---
<ul><li> 01 of 05 <h3> Sew a Star Crossing Quilt with Patchwork Borders </h3> Star Crossing Quilt with Piano Key Borders. Janet Wickell <h3>Bed Quilt Pattern with Piano Key Borders</h3>Here's a bed quilt pattern that's sewn with just one type of quilt block, a design called Star Crossing. The quilt blocks are separated by sashing with cornerstones that extend out into the border. Strip pieced patchwork bars are used to create an outer piano key border. The piano keys finish at four inches long and are created with strip piecing techniques. Black keys line up with the black cornerstones in the sashing, making those keys appear longer than the others. Cut longer strip sets for the keys if you prefer, or add extra yardage and more colors to the border. The quilt block and its borders can be as orderly as illustrated, or completely scrappy. I kept the white background because I felt it gave the quilt a contemporary look, but <strong>change colors to whatever suits your needs</strong>. All patchwork in the block is assembled with quick piecing techniques. <strong>The quilt finishes at 70" x 94"</strong>. Adjust borders and blocks to create a larger or smaller quilt. How to Print My Quilt Patterns<h3>Measure as You Go</h3>Accuracy is important for any quilt, but it is critical when you sew a quilt that must match at patchwork sashing and a patchwork border.<ul><li>Check your quarter inch seam allowance before every seam you sew.</li><li>Cut fabric for a few sample blocks before cutting all of the patches required (true of any pattern).</li><li>Press to set seams before you press seam allowances to one side. Yes, it is an extra step, but worth the effort.</li><li>Use chain piecing to speed up assembly when you are familiar with a step.</li><li>Instructions include measurements -- check your patchwork before moving on to the next step.</li></ul><h3>Quilting Fabrics</h3><strong>Yardages Are Generous</strong> Label your fabrics -- the pattern refers to fabrics by letter.<strong>Fabric A, Black</strong><ul><li>Used for borders, sashing, and the star points of quilt blocks.</li><li>3/4 yard</li></ul><strong>Fabric B, Red</strong><ul><li>2-1/2 yards</li></ul><strong>Fabric C, Green</strong><ul><li>1-3/4 yards</li></ul><strong>Fabric D, Black Print with Gold</strong><ul><li>Used for the round 'black' areas of quilt blocks (you could eliminate it and use fabric A instead)</li><li>2 yards</li></ul><strong>Fabric E, White Backgrounds</strong><ul><li>4-1/2 yards</li></ul><h3>Other Materials</h3>The other materials needed depend on the type of quilting and binding planned. Use the suggestions as a guide.<strong>Backing</strong><ul><li>A panel about 82" x 106". This will involve making a quilt backing.</li></ul><strong>Batting</strong><ul><li>About the same size as your quilt backing.</li></ul><strong>Binding</strong><ul><li>About 355 running inches of continuous double fold binding strips.</li><li>Cut strips that are 2-1/4" to 2-1/2" wide (before folding). Many quilters prefer narrower strips.</li><li>3/4 yard should be plenty for 2-1/4" strips.</li></ul><strong>Read the instructions before beginning.</strong> Continue to 2 of 5 below.</li>
<li> 02 of 05 <h3> Sew Corners for the Star Crossing Quilt Blocks </h3> Corner Patchwork, Square in a Square Blocks. Janet Wickell <h3>Sew Corner Patchwork</h3>The Star Crossing quilt block can be assembled in a few ways. We'll use a streamlined method that incorporates a few quick piecing techniques. The quilt is designed with 35 Star Crossing quilt blocks that are arranged in seven rows of five blocks each. Quilt blocks finish at 10" square. Cut patchwork shapes from long strips of fabric cut from selvage to selvage. Work with shorter strips if you have problems cutting accurate long strips. Fat quarters and fat eighths are good choices if you plan to sew a scrap quilt.<h3>Cut Fabrics for Block Corners</h3><ul><li>Cut (140) 3-1/2" x 3-1/2" squares of Fabric C.</li><li>Cut (280) 2" x 2" squares of Fabric B</li><li>Cut (280) 2" x 2" squares of Fabric E</li><li>Cut (140) 1-1/2" x 4-1/2" rectangles of Fabric E</li><li>Cut (140) 1-1/2" x 3-1/2" rectangles of Fabric E</li></ul><h3>Assemble the Corners of the Star Crossing Quilt Blocks</h3>The main body of each corner is a square-in-a-square quilt block with quick-pieced corners. All of the square-in-a-square blocks are the same, but the Fabric E bars are sewn to the units in two ways to create mirror image pairs.<ol><li>Draw a diagonal line from one corner to the opposite corner on the reverse side of each 2" square cut for corner units (Fabric B and Fabric E).</li><li>Use a hera marker to make creases if you prefer (if you plan to sew right away). Put chain piecing to work, too, so that sewing tasks move along quickly.</li><li>Follow the upper illustration from left to right in the top row. Grab one Fabric C square and two 2" squares of fabric B.</li><li>Place a Fabric B square on the upper left corner of the C square, right sides together and with the drawn line as shown. Carefully match the edges.</li><li>Sew a seam directly on the line.</li><li>Cut through both layers about 1/4" from the seam.</li><li>Press to set the seam and then flip the triangle right side up, pressing the seam allowance carefully towards the triangle.</li><li>Use the same method to sew the second B square to the opposite side of the larger square.</li><li>Sew the two E squares to the remaining edges of the C square. Press seams in either direction.</li><li>Make a total of 140 square-in-a-square blocks that measure 3-1/2" x 3-1/2".</li></ol>Assembly is a bit faster if you align the squares on one corner of each larger square and send them through the machine one after the other. Press to set the seams, trim, and then press the allowances.<h3>Sew Variation 1 Fabric E Rectangles to the Square-in-a-Square Blocks</h3><ol><li>Count out 70 of the square-in-a-square blocks.</li><li>Refer to the bottom left illustration. Align a block with the Fabric B triangles pointing as shown, upper left and lower right.</li><li>Sew a 1-1/2" x 3-1/2" Fabric E rectangle to the bottom of the square-in-a-square.</li><li>Press to set the seam (always) and then press the seam allowance towards the rectangle (unless the fabric is so sheer that the allowance would show through on the front).</li><li>Sew a 1-1/2" x 4-1/2" Fabric E rectangle to the right edge of the patchwork. Press the seam allowance in the same direction.</li><li>The block corner should measure 4-1/2" x 4-1/2". If it doesn't, determine how to adjust your sewing for the next unit.</li><li>Make a total of 70 corners with rectangles arranged in the same way.</li></ol><h3>Sew Variation 2 Fabric Rectangles to the Remaining Block Corners</h3><ol><li>You should have 70 square-in-a-square blocks left.</li><li>Refer to the bottom right illustration. Align a square-in-a-square block as shown -- its triangles point in the opposite direction from the triangles in the first 70.</li><li>Sew a 1-1/2" x 3-1/2" Fabric E rectangle to the bottom of the square-in-a-square. Press as before.</li><li>Sew a 1-1/2" x 4-1/2" Fabric E rectangle to the left side of the block. Press as before.</li><li>Make a total of 70 identical units that measure 4-1/2" square.</li></ol>Continue to 3 of 5 below.</li>
<li> 03 of 05 <h3> Finish Sewing Patchwork and Assemble the Star Blocks </h3> Star Crossing Quilt Block Assembly. Janet Wickell <h3>Cut the Block Centers</h3><ul><li>Cut (35) 2-1/2" x 2-1/2" squares of Fabric D.</li></ul><h3>Cut Fabric for Patchwork Used in the Patchwork Midpoint 'Rows'</h3><ul><li>Cut (9) 3" x selvage width strips of Fabric B</li><li>Cut (9) 2" x selvage width strips of Fabric D</li><li>Cut (280) 2" x 2" squares of Fabric A</li></ul><h3>Sew the Midpoint Rows</h3>You may find these strip piecing tips helpful.<ol><li>Sew a 3" x selvage width Fabric B strip lengthwise to a 2" x selvage width Fabric D strip.</li><li>Press to set the seam -- pressing matters throughout the patchwork, but it is especially important here.</li><li>Press the seam allowance in either direction.</li><li>Measure the strip set. Overall, it should be 4-1/2" wide along its entire length.</li><li>Square up one end of the strip set and then cut as many 2-1/2" segments as possible.</li><li>Create more strip sets and cut a total of (140) 2-1/2" segments.</li><li>Sew two 2" Fabric A squares to the Fabric B end of each segment. Use the same quick piecing method as for the square-in-a-square block corners. You may need to mark the diagonal lines on the dark squares with a light pencil or chalk marker.</li><li>Midpoint units should measure 2-1/2" x 4-1/2".</li></ol><h3>Assemble the Star Crossing Quilt Blocks</h3><ol><li>Arrange four corner units, four midpoint rows, and the center square as shown. Check to make sure all the angles are oriented correctly and that the new triangles at the ends of the midpoint units point toward the block center.</li><li>Sew the patchwork in each row together. Press seam allowances towards the midpoint patchwork units.</li><li>Sew the rows together. Press. The quilt block should measure 10-1/2" x 10-1/2".</li><li>Repeat to make 35 Star Crossing quilt blocks.</li></ol>Continue to 4 of 5 below.</li>
<li> 04 of 05 <h3> Assemble the Star Crossing Quilt Top </h3> Star Crossing Quilt Layout. Janet Wickell <h3>Sew Sashing Between Individual Quilt Blocks</h3><ol><li>Cut (42) 2-1/2" x 10-1/2" Fabric E sashing rectangles (you'll cut more rectangles later).</li><li>Gather five quilt blocks and sew a sashing rectangle between each to create a row.</li><li>Sew a sashing rectangle to the beginning and end of the row.</li><li>Press seam allowances towards the sashing unless it is very light (to reduce bulk). If the seam allowance would be visible, press towards the blocks.</li><li>The row should measure 10-1/2" x 62-1/2". Don't panic if it came out slightly off.</li><li>Repeat to make seven identical rows in all.</li></ol><h3>Create Sashing to Sew Between Rows</h3><ol><li>Cut (40) more 2-1/2" x 10-1/2" Fabric E sashing strips.</li><li>Cut (48) 2-1/2" x 2-1/2" Fabric A squares for cornerstones.</li><li>Create eight cornerstone/sashing rows as shown. Each row begins and ends with a 2-1/2" square cornerstone.</li><li>Press seam allowances towards the sashing (or towards the cornerstones if you pressed sashing towards blocks in the row steps above). Press with care to avoid stretch.</li><li>Sew a sashing/cornerstone row between rows of quilt blocks/sashing, matching seam intersections carefully. Sashing rows are illustrated beside, not yet sewn to, the quilt top.</li><li>Sew a sashing/cornerstone row to the top and bottom of the quilt top.</li><li>Press (any direction).</li></ol>Continue to 5 of 5 below.</li>
<li> 05 of 05 <h3> Make the Piano Key Border and Finish the Quilt </h3> Make Piano Key Borders. Janet Wickell <h3>Piano Key Borders</h3>If you search the Internet, you'll find many versions of piano key borders, and some are more common than others. The borders are made from long 'keys' sewn side by side. Some are more narrow (and longer) than the version in the Star Crossing quilt. The black strips in the borders match up with the black cornerstones in the sashing, making those keys appear longer than the others. Alter the position of strips in the strip sets to create a different appearance. You can also cut keys longer than shown, but you may have to create additional strip sets (and increase yardage). Use a scrappy assortment of fabrics if you prefer. Take a look at the piano key border on one of the quilts made during the 2015 New Year's Day mystery quilt event. Mary Jane Cardwell configured the keys differently, creating a narrow outer border with squares placed strategically so that the keys appear to be different lengths.<h3>Make Strip Sets for the Piano Key Border</h3><strong>Cutting</strong><ul><li>Cut (6) 2-1/2" x selvage width strips <strong>each</strong> of Fabric A, Fabric B, and Fabric C.</li><li>Cut (4) 2-1/2" x 4-1/2" Fabric A rectangles.</li><li>Cut (4) 4-1/2" x 4-1/2" Fabric B squares.</li></ul>Use strip piecing to assemble the borders.<ol><li>Grab two 2-1/2" x selvage width strips of each Fabric -- A, B, and C.</li><li>Sew strips together lengthwise in this order: A, B, C, A, C, B. Refer to the top illustration. Begin each new seam on the side where the previous seam stopped to keep the strip set from bowing.</li><li>Press to set the seams (very important for this step) and then press seam allowances towards the black strips.</li><li>Square up one end of the strip set. Cut as many 4-1/2" segments as possible.</li><li>Make three identical strip sets and cut a total of (24) 4-1/2" segments.</li><li>Refer to the middle illustration. Sew five segments together as shown (with the Fabric A strips all positioned at the left).</li><li>Sew a 2-1/2" x 4-1/2" Fabric A rectangle to the right edge of the strips.</li><li>Press the seam allowances towards the Fabric A strips.</li><li>Make one more identical row.</li><li>Sew one of the borders to the top of the quilt and the other to the bottom, matching the Fabric A strips to the Fabric A areas of sashing and easing in any differences. Press seam allowances towards the border.</li><li>Make two more borders in the same way, but with seven segments each. Sew the remaining 2-1/2" x 4-1/2" Fabric A rectangles to one end as shown and then sew a 4-1/2" x 4-1/2" Fabric B square to each end of the border.</li><li>Press seam allowances towards the Fabric A strips.</li><li>Sew one of these borders to each side of the quilt, matching carefully as before.</li></ol><h3>Finish Sewing the Quilt</h3><ol><li>Press the quilt and then mark it for quilting if necessary.</li><li>Make a quilt sandwich with the quilt top, batting, and backing.</li><li>Baste the layers together with hand stitches, safety pins, adhesive products or another method.</li><li>Quilt the quilt by hand or machine.</li><li>Remove excess batting and backing, squaring up the edges of the quilt <strong>very carefully</strong> if necessary.</li><li>Use the continuous binding strips to sew easy mitered binding to the edges of the quilt, or finish the edges with another method of your choice.</li></ol></li></ul> | 1,976.875 | 15,532 | 0.716345 | eng_Latn | 0.953104 |
24674af37f4b54f69af72ddcca4fab67636a9cfb | 239 | md | Markdown | ok/zjs/zjs3.4-0.1.4.12.0.md | 6830920/gmzy | ec420d0f92fae94146c83462783ae7b28695378b | [
"Apache-2.0"
] | null | null | null | ok/zjs/zjs3.4-0.1.4.12.0.md | 6830920/gmzy | ec420d0f92fae94146c83462783ae7b28695378b | [
"Apache-2.0"
] | null | null | null | ok/zjs/zjs3.4-0.1.4.12.0.md | 6830920/gmzy | ec420d0f92fae94146c83462783ae7b28695378b | [
"Apache-2.0"
] | null | null | null | #### 八风 (Bafeng, "Eight Winds")
〔Location〕 On the dorsum of the foot, in the webs between each of the five toes at the junction of the red and white skin; eight points in total across both feet (Fig. 194).
〔Anatomy〕 Situated in the anterior interosseous muscles between the heads of the metatarsal bones; supplied by the dorsal digital arteries and veins; innervated by the superficial and deep peroneal nerves.
〔Functions〕 Clears heat, resolves toxicity, relieves pain.
〔Indications〕 Beriberi (weak, painful legs), toe pain, toothache, headache, venomous snakebite.
〔Needling and moxibustion〕 Oblique insertion 0.5~0.8 cun, or prick to bleed.
〔Commentary〕 First recorded in the Suwen, "Discourse on Needling Malaria" (《素问·刺疟论》). The Qianjin (《千金》) calls these points Bachong (八冲), and the Jicheng (《集成》) calls them the eight Yindu points (阴独八穴). The points lie in the toe webs of both feet, and "wind" refers to the pathogenic factor, hence the name. They mainly treat beriberi and toe pain; by the principle of treating disease above by selecting points below, they can also be used for toothache and headache.

| 14.9375 | 76 | 0.661088 | yue_Hant | 0.07342 |
2467764c4899a8a643cc0d849e6e42ffd2b03cc6 | 967 | md | Markdown | _posts/programming/2018-11-29-Bit-math.md | daleonpz/baremetallics | 01c505db8f831b4f5e9a0e5ec58630cc17bac298 | [
"MIT"
] | null | null | null | _posts/programming/2018-11-29-Bit-math.md | daleonpz/baremetallics | 01c505db8f831b4f5e9a0e5ec58630cc17bac298 | [
"MIT"
] | null | null | null | _posts/programming/2018-11-29-Bit-math.md | daleonpz/baremetallics | 01c505db8f831b4f5e9a0e5ec58630cc17bac298 | [
"MIT"
] | 1 | 2022-03-29T02:31:32.000Z | 2022-03-29T02:31:32.000Z | ---
layout: post
title: Bit mathematics cookbook
category: programming
---
I found this amazing [cookbook](https://bisqwit.iki.fi/story/howto/bitmath/) written by [Joel Yliluoma](https://bisqwit.iki.fi/) a.k.a _bisqwit_.
Just check it out if you are interested in bit operations; it is highly recommended if you work in embedded engineering.
**Note** Two's complement is assumed.
Here is an example of addition implemented without the `+` operator (`uint_type` stands for any unsigned integer type, and `original_1`/`original_2` are the input values):
```c
// Using subtraction
uint_type a = original_1;
uint_type b = original_2;
b = -b;
uint_type result = a - b;
// Using bitwise operations, with XOR
uint_type a = original_1;
uint_type b = original_2;
uint_type carry = a & b;
uint_type result = a ^ b;
while(carry != 0) {
// If you need the mathematical carry from addition,
// check the overflow from this shift.
uint_type shiftedcarry = carry << 1;
carry = result & shiftedcarry;
result = result ^ shiftedcarry;
}
```
That guy is sick.
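For a self-contained version of the same two tricks that you can compile directly, here is a small sketch (the inputs are arbitrary example values, and `uint_type` is pinned to `uint32_t` for concreteness):

```c
#include <stdio.h>
#include <stdint.h>

typedef uint32_t uint_type;

/* Addition via subtraction: a - (-b) == a + b in two's complement. */
static uint_type add_sub(uint_type a, uint_type b) {
    b = -b;
    return a - b;
}

/* Addition via XOR: XOR adds without carries, AND finds the carry
   bits, which are shifted left and folded back in until none remain. */
static uint_type add_xor(uint_type a, uint_type b) {
    uint_type carry  = a & b;
    uint_type result = a ^ b;
    while (carry != 0) {
        uint_type shiftedcarry = carry << 1;
        carry  = result & shiftedcarry;
        result = result ^ shiftedcarry;
    }
    return result;
}

int main(void) {
    printf("%u\n", (unsigned)add_sub(1234u, 4321u)); /* prints 5555 */
    printf("%u\n", (unsigned)add_xor(1234u, 4321u)); /* prints 5555 */
    return 0;
}
```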
| 25.447368 | 146 | 0.695967 | eng_Latn | 0.985675 |
246912248076cf3c600c207e88504424f786f859 | 6,806 | md | Markdown | projects/doom-maps.md | Katamori/katamori.github.io | 1d505aebaa637001eb659d463a645a5df64a0640 | [
"MIT"
] | null | null | null | projects/doom-maps.md | Katamori/katamori.github.io | 1d505aebaa637001eb659d463a645a5df64a0640 | [
"MIT"
] | null | null | null | projects/doom-maps.md | Katamori/katamori.github.io | 1d505aebaa637001eb659d463a645a5df64a0640 | [
"MIT"
] | null | null | null | ---
layout: project
type: project
image: images/doom-uplink.png
title: Doom maps
permalink: projects/doom-maps
date: 2010-09-01
labels:
- Modding
- Doom
- Level design
summary: My contributions to the Doom modding community.
---
<div class="ui small rounded images">
<img class="ui image" src="https://www.doomworld.com/images/newstuff/529/uplink1503.png">
<img class="ui image" src="https://www.doomworld.com/images/newstuff/506/intime09.png">
<img class="ui image" src="https://www.doomworld.com/images/newstuff/475/morixmas04.png">
<img class="ui image" src="https://www.doomworld.com/images/newstuff/385/kat102407.png">
</div>
Ever since I played it the first time on a 166 MHz "powerhouse", I've been a huge fan of <a href='https://en.wikipedia.org/wiki/Doom_(1993_video_game)' target='_blank'>Doom</a>. It combined elegance on so many levels: graphics, art style and direction, music, gameplay, and last but not least: <b>level design</b>.
<br><br>
What was originally used for largely abstract levels turned out to be an exceptionally versatile game engine for thousands of maps to come. The phenomenon was so mesmerizing to young me that <a href='https://www.doomworld.com/profile/11640-katamori/' target='_blank'>I took my share of community contributions</a> in the early 2010s.
<br><br>
I've always had creative endeavours in one way or another, and maybe Doom mapping brought the most potential out of me; because of this sort-of pride, I feel like it's important to talk about what I've created for the modding community.
<h2> <a href='https://www.doomworld.com/idgames/levels/doom2/Ports/s-u/uplink15' target='_blank'>UPLINK</a> </h2>
<b>November 2014 - March 2015</b>
<br><br>
3 single-player maps for Doom 2; <a href='https://doomwiki.org/wiki/Boom' target='_blank'>Boom-compatible.</a>
<br><br>
Around the time this was created, I've been playing the idea of utilizing the often abstract nature of Doom level layouts to create a set of maps taking place in virtual reality. Of course, I applied a unique interpretation to the subject - and I also utilized a free-to-use texture set assembled by the community in the earlier years.
<br><br>
The result? The biggest and by far the best maps I've ever made for Doom.
<br><br>
It was a refreshing and eye-opening experience, even if eventually cut short - I had ideas for about five additional maps.
<br><br>
Gameplay footage on Youtube:<br>
<a href='https://www.youtube.com/watch?v=A_IxC88UivM'>MAP01</a><br>
<a href='https://www.youtube.com/watch?v=DDAT1WRekOk'>MAP02</a><br>
<a href='https://www.youtube.com/watch?v=wNeAOWItFLQ'>MAP03</a>
<h2> <a href='https://www.doomworld.com/idgames/levels/doom2/Ports/megawads/intime' target='_blank'>SOMEWHERE IN TIME</a> </h2>
<b>November 2012 - 2016</b>
<br><br>
15 (+1 bonus) single-player maps for Doom 2; vanilla-compatible.
<br><br>
Named after <a href='https://en.wikipedia.org/wiki/Somewhere_in_Time_(Iron_Maiden_album)' target='_blank'>the famous Iron Maiden album</a>, this project started out more as a series of random mapping experiments. Hence the name; in the end, however, I decided to apply MIDI conversions of the songs from the album. It turned out that, aside from the strict lack of advanced port features, this was also a contributing factor that gave the mapset a very 'classic' vibe, reminiscent of the community efforts of the late 1990s.
<br><br>
All these maps were made using stock Doom 2 assets, but with possibly non-conventional approaches in the level design itself. I also included maps that were scrapped earlier, so the quality is really varying here, but I think the result is a very interesting insight into how I made maps back in the day.
<br><br>
Many maps intended for this project were eventually lost due to an HDD failure halfway through the project's lifecycle, hence the long timespan. That's not even mentioning the fact that my sense of pacing, map layouts, and gameplay experience has improved a lot since; ultimately though, what made this project stand out for me is the sheer scale - something I've always been struggling to maintain at a healthy level.
<br><br>
<a href='https://www.youtube.com/playlist?list=PLgS1ZoidySwNqFssfWLM7jBNxjKIqWMR1' target='_blank'>Gameplay footage.</a>
<h2><a href='https://www.doomworld.com/idgames/themes/xmas/morixmas' target='_blank'>MORI CHRISTMAS!</a></h2>
<b>December 2014</b>
<br><br>
One single-player map for Doom 2; Boom-compatible.
<br><br>
I love Christmas - and I loved playing Doom maps on Christmas, back in the day. <a href='https://www.youtube.com/watch?v=6Jfypeqh7b0&list=PLgS1ZoidySwMdnLfTI_9Xd89MIeU-CqlD' target='_blank'>Christmas-themed Doom maps.</a>
<br><br>
Obviously, Christmas-themed maps were few and far between, so I've played pretty much all of them - and some of those were <i>really</i> inspiring.
<br><br>
So this time (fun fact: also while working on Uplink) I took a mod, kept only its assets, added some more from here and there, and before the 24th of December, I finished a gargantuan map in two days.
<br><br>
While far from perfect, it's my biggest map to date - and it had an ending that was very special for me. Actually, after finishing it and Uplink (and eventually Somewhere in Time), real-life issues took me away from the world of Doom modding for years, so strictly speaking, this one's also my latest map to date.
<br><br>
It sure means a lot.
<br><br>
<a href='https://www.youtube.com/watch?v=RaIq1SOTlOE'>Gameplay footage.</a>
<br><br>
<a href='https://www.vice.com/en/article/pgkvzn/doom-christmas-mods'>Apparently it was mentioned in a VICE article, too!</a>
<h2><a href='https://www.doomworld.com/idgames/levels/doom2/Ports/j-l/kat1024' target='_blank'>KATAMORI 1024</a></h2>
<b>September 2010 - February 2011</b>
<br><br>
11 <a href='https://www.doomworld.com/idgames/levels/doom2/Ports/j-l/k1024le' target='_blank'>(+4 separate)</a> single-player maps for Doom 2; Boom-compatible.
<br><br>
Ah, the first ever. I was 17 and had just started understanding how the map editor works. All during the height of the community's craze with <a href='https://www.youtube.com/results?search_query=doom+2+claustrophobia+1024' target='_blank'>incredibly cramped, yet often exceptionally creative maps.</a> And as you may have guessed, 'creativity' was my buzzword.
<br><br>
The end result is nowhere that creative, obviously, but the limited size of the maps provided a very interesting opportunity for playing around with ideas <i>while</i> keeping the project scope small.
<br><br>
In retrospect, showing progress between the maps, as well as some of the location and style choices were ahead of time compared to my capabilities back in the day.
<br><br>
Obviously, this project also went abandoned, but it's arguably better this way. | 73.978261 | 513 | 0.754922 | eng_Latn | 0.985184 |
2469a4eac5cde27f5364b39a80c06b0f83b50d1a | 3,213 | markdown | Markdown | _posts/2020-01-27-from_sales_to_software_engineering.markdown | cc-gellert/cc-gellert.github.io | 7a216e17a6cd92f5958b61540e273b5a19e09dcc | [
"MIT"
] | null | null | null | _posts/2020-01-27-from_sales_to_software_engineering.markdown | cc-gellert/cc-gellert.github.io | 7a216e17a6cd92f5958b61540e273b5a19e09dcc | [
"MIT"
] | null | null | null | _posts/2020-01-27-from_sales_to_software_engineering.markdown | cc-gellert/cc-gellert.github.io | 7a216e17a6cd92f5958b61540e273b5a19e09dcc | [
"MIT"
] | null | null | null | ---
layout: post
title: "From Sales to Software Engineering"
date: 2020-01-27 18:17:23 +0000
permalink: from_sales_to_software_engineering
---
This is not to rag on sales, but my last job was sales and soul-crushing for me. There is no experience quite like being the figurehead of a failing ship to make you desperate for a change. I liken it to how mermaids carved into the bows of wooden ships in history must have felt (had they had feelings) when the ship was sinking. "Dammit," they probably thought. "I just can't put a positive spin on this." After over a year of trying to make it work, I started looking for a change.
I have to be honest, I was very reluctant to look into coding at first. I had no experience in tech. I had always figured myself to be someone more left-brained than right-brained. Unfortunately, when I went through school, there was one Computer class- that's it. And it was basically a free hour in the computer lab that students used to either do homework from other classes or play computer games. It did not appeal to me.
Things continued on much the same way in college. How I wish I had looked more into a Computer Science degree! The job outlook is positive, and who can deny the appeal of being on the frontlines of tech, where the right app or algorithm could be the next gamechanger? Plus it's always satisfying to think of a program, website, or app, and then actually build it yourself. But I didn't think about all that in college. I figured I would just bounce my way through life and find something. Unfortunately, that theory works great in cinema but not in practice. I graduated with a degree that helped me land my first job, which was great, but I worked overtime almost every day and the pay wasn't great. Then I jumped to a new job in a new city. Unfortunately, things did not go so well for me there either.
Upon the advice of a friend I looked into coding. He suggested that a love of foreign languages might translate well into coding. I tried it out. He was right. I decided to pursue this full-time. I bounced around freecodecamp, Codecademy, Udemy courses, and YouTube. In the end, I still felt that I was missing something. But I was very gun-shy about taking on more debt. Like many of today's college graduates, I still had student debt from college (for reference, I graduated 3 years ago). I wavered for months, looking at different bootcamps, scouring their fees, job reports, and reviews. I would consider it strongly, then put it away for a different time. Christmas rolled around, and the energy of the upcoming new decade gave me that push that I needed to sign on the dotted line at last.
So here I am now, writing this blog post, trying to make this transition work. My hope is that a year from now I can read this post, smiling, with a mug of tea in my hand at my new workplace, in a new position I love, with supportive coworkers. Then I'll close this out, sigh, and say to myself, "It was worth it. You just have to take that leap of faith first." At least, that's how I picture it. Then the picture fades out and some skippy pop song comes on, just like the end of a Tyler Perry movie.
Keep your fingers crossed for me!
| 160.65 | 805 | 0.774043 | eng_Latn | 0.999956 |
246a4b8fed882e23051e6e9ab4b62cf9426b9c04 | 2,034 | md | Markdown | README.md | kurusugawa-computer/annowork-api-python-client-draft | 40ee4481f763bbff15f28a93f7e028f25a744dab | [
"MIT"
] | null | null | null | README.md | kurusugawa-computer/annowork-api-python-client-draft | 40ee4481f763bbff15f28a93f7e028f25a744dab | [
"MIT"
] | null | null | null | README.md | kurusugawa-computer/annowork-api-python-client-draft | 40ee4481f763bbff15f28a93f7e028f25a744dab | [
"MIT"
] | null | null | null | # annowork-api-python-client
A Python client library for the AnnoWork Web API.
[](https://app.travis-ci.com/kurusugawa-computer/annowork-api-python-client)
[](https://badge.fury.io/py/annoworkapi)
[](https://pypi.org/project/annoworkapi/)
[](https://annowork-api-python-client.readthedocs.io/ja/latest/?badge=latest)
# Requirements
* Python 3.8+
# Install
```
$ pip install annoworkapi
```
# Usage
## Configuring credentials
### `$HOME/.netrc`
```
machine annowork.com
login ${user_id}
password ${password}
```
### Using environment variables

Set the user ID in the `ANNOWORK_USER_ID` environment variable and the password in `ANNOWORK_PASSWORD`.
## Basic usage
```python
import annoworkapi
service = annoworkapi.build()
result = service.api.get_my_account()
print(result)
# {'account_id': 'xxx', ... }
```
## Advanced usage

### Logging output
```python
import logging
logging_formatter = '%(levelname)-8s : %(asctime)s : %(name)s : %(message)s'
logging.basicConfig(format=logging_formatter)
logging.getLogger("annoworkapi").setLevel(level=logging.DEBUG)
```
```
In [1]: c = s.api.get_actual_working_times_by_workspacen_member("a9956d30-b201-418a-a03b-b9b8b55b2e3d", "204bf4d9-4569-4b7b-89b9-84f089201247")
DEBUG : 2022-01-11 17:36:04,354 : api.py : annoworkapi.api : _request_wrapper : Sent a request :: {'request': {'http_method': 'get', 'url': 'https://annowork.com/api/v1/workspacens/a9956d30-b201-418a-a03b-b9b8b55b2e3d/members/204bf4d9-4569-4b7b-89b9-84f089201247/actual-working-times', 'query_params': None, 'header_params': None, 'request_body': None}, 'response': {'status_code': 200, 'content_length': 209988}}
```
# Developer documentation

See [README_for_developer.md](https://github.com/kurusugawa-computer/annowork-api-python-client/blob/main/README_for_developer.md).
| 29.478261 | 416 | 0.743363 | yue_Hant | 0.296709 |
246a82539af94313d1beee406ce3fb850ef813f8 | 1,767 | md | Markdown | docs/framework/unmanaged-api/debugging/icordebugilframe-getargument-method.md | cihanyakar/docs.tr-tr | 03b6c8998a997585f61b8be289df105261125239 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/icordebugilframe-getargument-method.md | cihanyakar/docs.tr-tr | 03b6c8998a997585f61b8be289df105261125239 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/icordebugilframe-getargument-method.md | cihanyakar/docs.tr-tr | 03b6c8998a997585f61b8be289df105261125239 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: ICorDebugILFrame::GetArgument Method
ms.date: 03/30/2017
api_name:
- ICorDebugILFrame.GetArgument
api_location:
- mscordbi.dll
api_type:
- COM
f1_keywords:
- ICorDebugILFrame::GetArgument
helpviewer_keywords:
- GetArgument method [.NET Framework debugging]
- ICorDebugILFrame::GetArgument method [.NET Framework debugging]
ms.assetid: 4e2fd423-f643-4c27-ba5f-41b5ebc3b416
topic_type:
- apiref
author: rpetrusha
ms.author: ronpet
ms.openlocfilehash: 1653913ca7410728f0f90a546f613a9d8b88be7a
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 05/04/2018
ms.locfileid: "33414059"
---
# <a name="icordebugilframegetargument-method"></a>ICorDebugILFrame::GetArgument Method
Gets the value of the specified argument in this Microsoft intermediate language (MSIL) stack frame.
## <a name="syntax"></a>Syntax
```
HRESULT GetArgument (
[in] DWORD dwIndex,
[out] ICorDebugValue **ppValue
);
```
#### <a name="parameters"></a>Parameters
`dwIndex`
[in] The index of the argument in this MSIL stack frame.

`ppValue`
[out] A pointer to the address of an ICorDebugValue object that represents the retrieved value.
## <a name="remarks"></a>Remarks
The `GetArgument` method can be used either in an MSIL stack frame or in a just-in-time (JIT) compiled frame.
## <a name="requirements"></a>Requirements
**Platforms:** See [System Requirements](../../../../docs/framework/get-started/system-requirements.md).

**Header:** CorDebug.idl, CorDebug.h

**Library:** CorGuids.lib

**.NET Framework Versions:** [!INCLUDE[net_current_v10plus](../../../../includes/net-current-v10plus-md.md)]
| 31 | 111 | 0.720996 | tur_Latn | 0.592709 |
246e59c741510fcf04dae18edff3f37df64c7237 | 1,084 | md | Markdown | rtk/rtk1-v6/1578.md | hochanh/hochanh.github.io | 755d537144c230fd29ba84951b8533f5476a79d9 | [
"Apache-2.0"
] | 1 | 2021-03-30T18:58:12.000Z | 2021-03-30T18:58:12.000Z | rtk/rtk1-v6/1578.md | hochanh/hochanh.github.io | 755d537144c230fd29ba84951b8533f5476a79d9 | [
"Apache-2.0"
] | null | null | null | rtk/rtk1-v6/1578.md | hochanh/hochanh.github.io | 755d537144c230fd29ba84951b8533f5476a79d9 | [
"Apache-2.0"
] | null | null | null | ---
layout: kanji
v4: 1468
v6: 1578
kanji: 良
keyword: good
elements: good, drop, silver
strokes: 7
image: E889AF
on-yomi: リョウ
kun-yomi: よ.い、-よ.い、い.い、-い.い
permalink: /rtk/良/
prev: 眼
next: 朗
---
1) [<a href="http://kanji.koohii.com/profile/decamer0n">decamer0n</a>] 14-6-2007(176): With a <em>drop</em> of <em>silver</em>, the forces of<strong> good</strong> can vanquish the werewolves.
2) [<a href="http://kanji.koohii.com/profile/rgravina">rgravina</a>] 17-6-2006(88): A <em>drop</em> of <em>silver</em> is<strong> good</strong>, but a whole bar of it is better!
3) [<a href="http://kanji.koohii.com/profile/dingomick">dingomick</a>] 1-3-2007(38): I'm a <em>good</em> person. See my <em>drop of silver HALO</em>?
4) [<a href="http://kanji.koohii.com/profile/kanjihito">kanjihito</a>] 4-10-2009(13): A <em>drop</em> of <em>silver</em> is<strong> good</strong>.
5) [<a href="http://kanji.koohii.com/profile/cornrow">cornrow</a>] 11-12-2008(11): A <em>DROP</em> of <em>SILVER</em> is really<strong> GOOD</strong> for you. (It's true... people dilute it and take shots).
| 40.148148 | 211 | 0.671587 | eng_Latn | 0.108667 |
246eec2a77628231bcbcc8a97f28f49530a5d9d2 | 5,152 | md | Markdown | HISTORY.md | segment-boneyard/integration-google-analytics | 5955e3948b7b36806dc7870ddd351b741ebce471 | [
"MIT"
] | 2 | 2017-11-09T23:53:47.000Z | 2021-05-29T01:12:33.000Z | HISTORY.md | segment-boneyard/integration-google-analytics | 5955e3948b7b36806dc7870ddd351b741ebce471 | [
"MIT"
] | 1 | 2017-09-19T21:38:16.000Z | 2020-09-08T13:28:50.000Z | HISTORY.md | segment-boneyard/integration-google-analytics | 5955e3948b7b36806dc7870ddd351b741ebce471 | [
"MIT"
] | 3 | 2017-08-11T07:05:43.000Z | 2017-11-09T23:54:05.000Z | 1.10.0 / 2017-05-09
===================
* Add explicit support for mobile event campaign objects
1.9.9 / 2017-04-27
==================
* Change addImpression-bound methods to use 'event' as their type
1.9.8 / 2017-04-17
==================
* Push release
1.9.6 / 2017-03-29
==================
* Update integration-worker version
1.9.6 / 2017-03-28
==================
* Update integration-worker version
1.9.5 / 2017-03-09
==================
* Changed sorters -> sorts to match spec
* Protected expected array inputs
1.9.4 / 2017-03-07
==================
* Corrected to integration-worker errors.
1.9.3 / 2017-03-07
==================
* Add support for addImpression action
1.8.3 / 2017-02-10
==================
* Update integration-worker version
1.8.2 / 2017-02-10
==================
* Trigger build to ensure a version of the integration-worker with https://github.com/segmentio/integration-worker/pull/93 is used
1.8.1 / 2017-02-06
==================
* Handle gracefully when sending malformed products array
1.8.0 / 2017-01-31
==================
* Standardize integration (linting, Docker configuration, circle.yml, upgrade
segmentio-integration version, upgrade integration-worker version, etc.)
1.7.3 / 2016-11-03
==================
* Replace nonascii characters with '?' in userAgent headers to prevent uncaughts
1.7.2 / 2016-10-06
==================
* Fix a bug where iOS events weren't populating reports based on userAgent
1.7.1 / 2016-09-29
==================
* Update segmentio-integration to v5 (#55)
* Fix test assertions on new integration-tester version (#53)
* CircleCI: Run deployment only when tags are pushed
* CircleCI: Update circle.yml to install npm@2, ditch unnecessary deps (#56)
1.7.0 / 2016-09-06
==================
* support ecom spec v2
1.6.1 / 2016-08-31
==================
* deprecate order started, use checkout started
1.6.0 / 2016-08-24
==================
* Merge pull request #50 from segment-integrations/partialRefunds
* added partial refund functionality
1.5.1 / 2016-08-24
==================
* changed tests and mapper to comply with spec v2 id's (#49)
1.5.0 / 2016-08-08
==================
* Fix regular ecommerce Completed Order/Order Completed event
1.4.9 / 2016-08-05
==================
* Update Integration-Tester version
* Add support for Ecommerce spec v2
1.4.8 / 2016-07-28
==================
* Merge pull request #44 from segment-integrations/addScreenName
* merged
* screenName is drawn from context.screen.name
* Removed duplicate obj-case dependency
* added screenName option to track events
1.4.7 / 2016-07-23
==================
* Merge pull request #45 from segment-integrations/queuetime
* force queuetime to be nonnegative
1.4.5 / 2016-07-05
==================
* fixed bug causing qt to always be maximum value (#41)
1.4.4 / 2016-06-24
==================
* set max queue time to 4hrs
1.4.3 / 2016-04-29
==================
* Submit time spent in queue
1.4.2 / 2016-04-28
==================
1.2.1 / 2016-04-01
==================
* Generates user agent
1.2.0 / 2016-03-18
==================
* Add support for enhanced ecommerce
* add docker, refactor circle
* adds validation for context.app.name
* reject -> invalid for settings error; rephrase
1.1.2 / 2015-09-11
==================
* fix booleans from breaking metrics
1.1.1 / 2015-07-28
==================
* Ensure string value prior to parsing url
1.1.0 / 2015-07-27
==================
* Merge pull request #23 from segmentio/track/url
* Support for url parameters.
1.0.12 / 2015-06-04
===================
* Prevent proxying of undefined functions
* Map locale to user language
* Separate classic/universal tests
* Refactor names for clarity, fix docstrings
* Use strict mode everywhere
* Clean up syntax inconsistencies, unused deps and dead code
1.0.11 / 2015-04-16
===================
* Merge pull request #13 from segmentio/add/referrer
* add support for document referrer
1.0.10 / 2015-04-09
===================
* Merge pull request #12 from segmentio/fix/mobile
* mobile: fall back to server-side tracking ID for compat
1.0.9 / 2015-04-04
==================
* Merge pull request #11 from segmentio/add/mobile
* add support for mobile properties and screen calls
1.0.8 / 2015-03-24
==================
* Add properties to custom dimensions
* Merge pull request #8 from segmentio/fix-tests
* Fix tests
* Update circle template
1.0.7 / 2015-01-21
==================
* remove errant campaign parens
1.0.6 / 2014-12-08
==================
* bump segmentio-integration
1.0.5 / 2014-12-03
==================
* Only invoke proxied method if it exists
1.0.4 / 2014-12-02
==================
* bump integration proto
1.0.3 / 2014-12-02
==================
* remove .retries()
* fix dev deps
* bump dev deps
1.0.2 / 2014-12-02
==================
* bump segmentio-integration
1.0.1 / 2014-11-21
==================
* Bumping segmentio-integration
* fix build status badge
1.0.0 / 2014-11-14
==================
* Initial release
| 20.125 | 132 | 0.591809 | eng_Latn | 0.618707 |
246efee0871f1a8015bca59d56ac833df8ecbdb6 | 4,530 | md | Markdown | README.md | renhao/att | ea4af1c7daff7636ea33bc03507c68d9190389e1 | [
"MIT-0"
] | null | null | null | README.md | renhao/att | ea4af1c7daff7636ea33bc03507c68d9190389e1 | [
"MIT-0"
] | null | null | null | README.md | renhao/att | ea4af1c7daff7636ea33bc03507c68d9190389e1 | [
"MIT-0"
] | null | null | null | # ATT - Auto Task Tool
ATT is a terminal tool for front-end developers to make web project deployment easier and faster.
## Install
```shell
npm install att -g
```
## Usage
```shell
att <plugin> <...args>
```
* `att jshint */**/*.js` check all JS files
* `att minify index.html` minify HTML
* `att minify icon.png` compress a PNG
* `att minify app.js` minify JS
* `att minify style.css` minify CSS
* `att smushit icon.png` compress images and replace the originals
* `att datauri style.css` data-URI encode the images referenced in a CSS file
* `att cdn app.js` upload files to a CDN
* `att build task` build a project from an XML file
## Plugins
#### att jshint
> Check JavaScript syntax
```shell
att jshint **/*.js
```
#### att csslint
> Check CSS syntax
```shell
att csslint **/*.css
```
#### att lint
> Check JS and CSS syntax
```shell
att lint **/*.*
```
#### att minify
> Minify HTML, CSS, and JavaScript, and compress images

Minified files are written to ATT's temporary directory; the current files are not overwritten.
```shell
att minify */**/*.*
```
#### att datauri
> Base64-encode the images referenced in CSS files

As with `att minify`, the CSS files whose images have been Base64-encoded are written to ATT's temporary directory; the current files are not overwritten.
```shell
att datauri */**/*.css
```
#### att smushit
> Compress images. The compressed images replace the current files. By default the Yahoo smush.it service is used for compression, but you can also set up your own smush service.

See: [`smush-server`](https://github.com/colorhook/smush-server)
```shell
att smushit */**/*.png
```
#### att tmp
> View or clear ATT's temporary directory

+ View the temporary directory
```shell
att tmp
```
+ Clear the temporary directory
```shell
att tmp clear
```
#### att cdn
> Transfer local files to a CDN server. This command can upload local JS, CSS, image, and other asset files to the CDN.

- First, the CDN upload feature requires an HTTP service deployed on the CDN server, which exposes an API through which the ATT console and the CDN server transfer files.
- Second, the current working directories need to be configured in ATT's configuration file att.json. A working directory is usually the root path of a repository, and multiple working directories can be configured.

> During upload, JS, CSS, and images are compressed. If a version number is defined in the comments of a JS or CSS file, the version is appended to the uploaded file name by default.

For example, if a file named `test.js` has the following comment at the top, the uploaded file name becomes `test-2-3-6.js`:
```javascript
/*!
* @version 2-3-6
* @author ...
* @description ...
*/
```
+ Upload to the test CDN
```shell
att cdn /product/project/*/**/*.js
```
+ Upload to the test CDN and, at the same time, to the production CDN
```shell
att cdn /product/project/*/**/*.js -p
```
#### att build
> Build a project according to an XML configuration file. In the XML file, tags express a series of actions, including creating, deleting, and moving files and folders, renaming files and folders,
concatenating files, converting CSS images to data URIs, compressing images, HTML, CSS, and JavaScript, bundling into a zip package, and even uploading to FTP and sending email.

+ First define an att.xml configuration file
```xml
<?xml version="1.0" encoding="utf-8"?>
<project name="att-project" description="att-project-description" default="build" basedir=".">
<property name="src" value="src"/>
<property name="build" value="build"/>
<target name="echo">
<echo>hello, att</echo>
</target>
<target name="test">
<nodeunit>../../node-smushit/test</nodeunit>
</target>
<target name="clear">
<delete target="/your/file/or/dir/need/to/delete"/>
</target>
<target name="create">
<mkdir target="${build}"/>
<touch target="${build}/README"/>
</target>
<target name="minify">
<concat to="${build}/combo.js" split="\r\n">
<fileset>
<include file="${src}/lib/mvc.js"/>
<include file="${src}/js/model.js"/>
<include file="${src}/js/view.js"/>
<include file="${src}/js/controller.js"/>
<include file="${src}/js/app.js"/>
</fileset>
</concat>
<minify from="${src}/style.css" to="${build}/style.css"/>
<minify from="${build}/combo.js" to="${build}/combo-min.js"/>
</target>
<target name="ftp">
<input name="ftp.password" label="ftp password:"/>
<ftp host="your-ftp-host" username="root" password="${ftp.password}" port="21"
remotedir="/att">
<upload>
<fileset>
<include file="${build}/combo-min.js"/>
</fileset>
</upload>
</ftp>
</target>
<target name="notify">
<input name="mail.password" label="mail password:" type="password"/>
<mail host="smtp.gmail.com" ssl="true" port="465" authentication="login"
from="[email protected]" to="[email protected]"
username="[email protected]" password="${mail.password}" subject="att notification">
<![CDATA[
This is an email sent by att.
]]>
</mail>
</target>
<target name="build" depends="echo,test,clear,create,minify,ftp,notify">
</target>
</project>
```
+ Run the command in terminal.
```shell
att build
```
+ Run a specific task
```shell
att build -t taskname
```
+ Run a specific task with a custom config file
```shell
att build -t taskname -f your-custom-config-file-name
```
> To use the `zip` command in `att.xml`, a zip command must be available on the operating system's PATH. If you don't have a zip package installed, you can download one from [here](http://stahlworks.com/dev/index.php?tool=zipunzip).
#### att html2pdf
> Convert HTML pages to PDF. The `wkhtmltopdf` library must be installed first, and its executable added to the PATH.
```shell
att html2pdf */**/*.html
```
## Licence
ATT is free to use under MIT license.
## Bugs & Feedback
Please feel free to [report bugs](http://github.com/colorhook/att/issues) or [feature requests](http://github.com/colorhook/att/pulls).
You can send me a private message on `github`, or send me an email at: [[email protected]] | 20.497738 | 135 | 0.67064 | eng_Latn | 0.217935
2470354387fc9599ff99b3c95895270787a98c77 | 1,150 | md | Markdown | README.md | AbstractGeek/ni-daq-acquisition-matlab | f77f0899156178479f2cd8fec35322595ef5d67b | [
"MIT"
] | 1 | 2020-10-27T16:54:22.000Z | 2020-10-27T16:54:22.000Z | README.md | AbstractGeek/ni-daq-acquisition-matlab | f77f0899156178479f2cd8fec35322595ef5d67b | [
"MIT"
] | null | null | null | README.md | AbstractGeek/ni-daq-acquisition-matlab | f77f0899156178479f2cd8fec35322595ef5d67b | [
"MIT"
] | 1 | 2018-09-11T09:16:59.000Z | 2018-09-11T09:16:59.000Z | # ni-daq-acquisition-matlab
*Preparation:* Install NI-DAQmx drivers compatible with the MATLAB version being used. Connect the DAQ and run the command daq.getDevices to double-check that the DAQ is being correctly detected by MATLAB. Additionally, keep note of the device name (default: 'Dev1'). The device name might have to be changed in daqBasicAcquisition if it is not the same as the default.
## daqBasicAcquisition
A simple GUI for acquiring voltage data from an NI DAQ. The GUI also simultaneously displays the data being acquired by all the channels (voltage vs time). The acquired data is saved as a binary file. The name of the next saved file is incremented automatically after every successful acquisition. Along with the acquired data, a log file is generated to track the start and stop time of each acquisition file.
The sampling rate is 10 kHz, written as a constant in the GUI. The device name ('Dev1') might have to be changed for the GUI to work properly.
## readAcquiredData
Use this function to read back the acquired data into MATLAB. The first row is the time of acquisition. The rest of the rows are the acquired channels.
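A sketch of how that layout might be consumed in a script (the call signature here is an assumption based on the description above; check the actual function in the repository):

```matlab
% Hypothetical call signature - verify against the repository code.
data = readAcquiredData('acq_0001.bin');

t        = data(1, :);       % first row: acquisition time
channels = data(2:end, :);   % remaining rows: acquired channels

plot(t, channels);           % quick look at all channels vs. time
```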
| 95.833333 | 410 | 0.794783 | eng_Latn | 0.999842 |
24705b321dc80a74f194927b09f720cc83062864 | 809 | md | Markdown | md/patchbasis.3m_draw.md | urbanjost/M_draw | e82fb2dc669d324847a97f0620687b39aea6a53b | [
"Unlicense"
] | 1 | 2021-05-25T05:43:49.000Z | 2021-05-25T05:43:49.000Z | md/patchbasis.3m_draw.md | urbanjost/M_draw | e82fb2dc669d324847a97f0620687b39aea6a53b | [
"Unlicense"
] | null | null | null | md/patchbasis.3m_draw.md | urbanjost/M_draw | e82fb2dc669d324847a97f0620687b39aea6a53b | [
"Unlicense"
] | 1 | 2021-05-25T05:43:44.000Z | 2021-05-25T05:43:44.000Z | <?
<body>
<a name="top" id="top"></a>
<div id="Container">
<div id="Content">
<div class="c235">
</div><a name="0"></a>
<h3><a name="0">NAME</a></h3>
<blockquote>
<b>patchbasis(3f)</b> - [M_draw:PATCH] Define the t and u basis matrices of a patch. <b></b>
</blockquote><a name="contents" id="contents"></a>
<blockquote>
<pre>
subroutine <b>patchbasis</b>(<i>tbasis</i>, <i>ubasis</i>)
real :: <b>tbasis</b>(4,4), <i>ubasis</i>(4,4)
</pre>
</blockquote><a name="2"></a>
<h3><a name="2">DESCRIPTION</a></h3>
<blockquote>
<p>Define the t and u basis matrices of a patch.</p>
</blockquote>
<hr />
<br />
<div class="c235"><img src="../images/patchbasis.3m_draw.gif" /></div>
</div>
</div>
</body>
| 28.892857 | 100 | 0.521632 | eng_Latn | 0.16522 |
24706769c6bd986d97ae320f6f00898ad9e88edf | 3,450 | md | Markdown | plot_data/README.md | M-Allaoui/COVID19-Literature-Clustering | 10fe99151f927e38c1c4b310f5ecbb7a81e5d85f | [
"Apache-2.0"
] | 69 | 2020-03-20T17:16:02.000Z | 2022-03-23T11:34:22.000Z | plot_data/README.md | rezacsedu/COVID19-Literature-Clustering | c56fa4a9bdee98dc24b6a5cf788e3d027859584e | [
"Apache-2.0"
] | 5 | 2020-03-17T22:58:40.000Z | 2020-07-11T16:46:32.000Z | plot_data/README.md | rezacsedu/COVID19-Literature-Clustering | c56fa4a9bdee98dc24b6a5cf788e3d027859584e | [
"Apache-2.0"
] | 51 | 2020-03-17T22:39:22.000Z | 2022-03-15T08:27:04.000Z | Please note that df_covid.p is not included here because of its size. Please refer to Kaggle for the full dataset, or re-run the [Notebook](https://github.com/MaksimEkin/COVID19-Literature-Clustering/blob/master/COVID19_literature_clustering.ipynb) to get the df_covid DataFrame.
<br>
You can also download the Pickled output of the analysis on [Kaggle](https://www.kaggle.com/maksimeren/covid-19-literature-clustering/comments) from the outputs section; however, it will only include around **10k instances** (articles).
1. [df_covid.p](https://www.kaggleusercontent.com/kf/32137682/eyJhbGciOiJkaXIiLCJlbmMiOiJBMTI4Q0JDLUhTMjU2In0..PkX-KF-EVwOH4-Mq4UpYuA.FX2-UGYzKVAspBw1TDLxvc7Fbmy7E6uEV3WfOIZOM6yz-0ur5KBsmXWIaVIe5nf7pX9AD8bymoh89naiYQwx2e0vUxAtEOTAZtSt_fav1uvb06SFdhLR8Ltwr9cquw0ZgRhoPBvCy7AwQo5F_PN9Pdp9FYutIC7BSPi9Y2ms_qQNjSfz5A29fzaQNOQgxNJGFsAK2UDs5W9D4Fun6RfMckdZ0u49RAEQZkdgFgN3ENTn1c1KlMjuUWm2VeWv2lFjYpzKCJz8sGS3jVdIokUK-y3WzWGELkLjjPE1ls0Ct-umJkzIW5quWKdJEEd_PFzn0r0AensKfby12dcM7owwK2kbVdZbBfXi9r49y_BSkxcnSNL5kUm4J_ewnWNCQnP7v8pDVAygVqb35zbiFPsG2JWpE5WNyypDLCTFdnj7dOxp_DWmz4AEuZToiUKZPYkDnvov6hONVDbRQZv8uTJzWrVzXNx-w_I1f2inscsy8vbv0-Kw5XmUQe3hzLvJCnsFl3xjpAwWdk9W0aVf_e08NK-d9xn1YcCdtTjvqWHIs3tZdKWZWFe6NcO6-lAujUeTniGkUnyeURN6TjZ4jLdTwHuw8_M_tONUkFlxMhqgzAeS2RHUjEeU4v3DgGOSvhYtdXnoC_hTWIBm-ZtzWb48orqREGnEZlkshX8JqEb7-A4.wZ1cqG8_9IHSjhIcuJ-adQ/df_covid.p)
2. [X_embedded.p](https://www.kaggleusercontent.com/kf/32137682/eyJhbGciOiJkaXIiLCJlbmMiOiJBMTI4Q0JDLUhTMjU2In0..GRBFb5y7GjQ6XtsSiecuSQ.2v_zPu3WcC4l4u3CAW6RAGdDSioAnJJvnP3vwNTqbpgTgdEnGXxzUGWw1p5RZ5ktw-mG29sPtaMt0mPq2aGy_70p9MNhiiMbpPnONC4LVYsSsNJWsjrS_-qP3jZaBCOT-kDQy2OaII89HOrox3FiilAQYd6RNOxoJxbfycQp9KSwo-X-KG9BZAlsBU6GYi5KVmYbsVG4lE_kfVu0zmow-FnnT_Mb5lrwwnj_uYrEaRUz2tCPa0VlcccVNE2zEMya1ySyPIcbKO3UYlTmCyPc0KcFoWiCIRieEqPx41TbyQRQ5974GNMNac8q0jD10mNl2xtjuZivVB4igZRTyniPrhYrkAhw6USokDOb6J3OmxfE6RL4PG-0unoKO2sIOGjGqhmdPQ058v7hNHfjA6jgah3tOp82s0MVBev9n-JwOYgJzJcb_IHAtHT1QIHskN2RQT2J7Daz_s8v-QTwRrRYr_C7-Kqq9omkLhPNa5BUBodppG1jdtdPSH5GBqm1O9Yo4zDB1jrdP8feQq_rs4V5wSuboBMSDD1SBnqiugW2S7_6bC6UaErL1Jatg8tF6JNN4hx75xjYhxy9-0vDDNcjHqDNXNSIAuhhCP06BdgXvHQnviuGmsUhXj6HnPh7T1I0JoXD-CWL_XFZSdpDnOc9ESxsUTYrQ0hTXb8NGiNJ0HI.aG6vAUELFXnmppeflb440Q/X_embedded.p)
3. [y_pred.p](https://www.kaggleusercontent.com/kf/32137682/eyJhbGciOiJkaXIiLCJlbmMiOiJBMTI4Q0JDLUhTMjU2In0..95snzFCMtfTt8jPTGYWtsw.XrmWj7fwfiEr6hZyhvk8OSkvW9ICTBNmap6f1UIO2kif45i0AeXb_VNofNo1Myv6X0GDjtUVKuj16YDHMNvNMyPcIfWNTFghrZeTip5HT03AFCjtpshhyp5E6khQrX_R11mwc0AksBS_FaXPoDBYF_SqXhFyrQ9PbIeiw4UKYYx-slKpuHWF-f2pXIn4iC5ro4fyU7MywrUDbgBEcEn4nRZdMmouMuTD7dhlM9fKbSDexHFlTy16WbIv3uzpj4IQ15ij0NrnRXshXicoUE-rwDci_6e5J5kBOdCz6DuZCSdSF3j2Vr6vkaAbHWECVkOPvlmj3RvfKl_ltY2wAHofX_u9tIbLU30BET_Ti9yBdG1b4cg5ADZkMW2j39cs5F5KtjFojvkCACb6KRT4JquERdm7KCbmWaQcXpymjO2Afl96BQ4I3QWpJvPdPHHEM_nFQZdRFtQlJE2LsFJylpu-o46Mfry3wpMV3J8kMjX4uCnzTY2ANl0PB8rb6zpzrUkE1IZWmXZF75OiK0-HQ9UQEeewCXYeOV6KaMeK2EnUhBE5K9kLWo8OQJ9BJo99wINxW4jyhc39QjvWnUhe5nxCnxcEKSGN740fT-2iZcEJEBXD3Q5uxF1aUVEZYq7bFzOnaTNWSvOMZUoihszhHR-K4-r1bBOa94BVHzP5Yar_BZ8._Bk5UDFpcx6uIVlziZ0WrQ/y_pred.p)
### Load the Pickled Data from Kaggle
import pickle
# DataFrame of articles including the processed text
df_covid = pickle.load(open("df_covid.p", "rb"))
# 2D embedding of X from t-SNE
X_embedded = pickle.load(open("X_embedded.p", "rb"))
# Labels produced by k-means
y_pred = pickle.load(open("y_pred.p", "rb"))
| 123.214286 | 857 | 0.90029 | yue_Hant | 0.531681 |
24713e6d27f96b4a79dcc9b94f66003925603cfd | 712 | md | Markdown | papers/_posts/1997-10-01-1997-Balakrishnan-allerton.md | sseshan/homepage | 487f737887c7e431550a5f3b019fd46409af5b1f | [
"MIT"
] | null | null | null | papers/_posts/1997-10-01-1997-Balakrishnan-allerton.md | sseshan/homepage | 487f737887c7e431550a5f3b019fd46409af5b1f | [
"MIT"
] | null | null | null | papers/_posts/1997-10-01-1997-Balakrishnan-allerton.md | sseshan/homepage | 487f737887c7e431550a5f3b019fd46409af5b1f | [
"MIT"
] | null | null | null | ---
key: 1997-Balakrishnan-allerton
date: 1997-10-01
title: "TCP Improvements for Heterogeneous Networks: The Daedalus Approach"
venue: "Allerton Conference on Communication, Control, and Computing"
authors: H. Balakrishnan, V. N. Padmanabhan, S. Seshan, M. Stemm, E. Amir and R. H. Katz
---
Bibtex Entry:
@inproceedings{1997-Balakrishnan-allerton,
author = "Balakrishnan, H. and Padmanabhan, V. N. and Seshan, S. and Stemm, M. and Amir, E. and Katz, R. H.",
title = "TCP Improvements for Heterogeneous Networks: The Daedalus Approach",
booktitle = "Allerton Conference on Communication, Control, and Computing",
month = "October",
address = "Urbana, IL",
year = "1997",
category = "BARWAN",
summary = ""
}
| 32.363636 | 113 | 0.69382 | eng_Latn | 0.718386 |
2471c0aa96cce0b0aa50e2556ec622bdfe8eea45 | 552 | md | Markdown | content/api/ng-rest/rest.gettype.md | ressurectit/ressurectit.github.io | 09ed543e50e9b35594333afe6e98d79687849b04 | [
"MIT"
] | null | null | null | content/api/ng-rest/rest.gettype.md | ressurectit/ressurectit.github.io | 09ed543e50e9b35594333afe6e98d79687849b04 | [
"MIT"
] | null | null | null | content/api/ng-rest/rest.gettype.md | ressurectit/ressurectit.github.io | 09ed543e50e9b35594333afe6e98d79687849b04 | [
"MIT"
] | null | null | null | <!-- Do not edit this file. It is automatically generated by API Documenter. -->
[Home](./index.md) > [@anglr/rest](./rest.md) > [getType](./rest.gettype.md)
## getType() function
Gets the underlying type for Type and NotType
<b>Signature:</b>
```typescript
export declare function getType<TType>(type: Type<TType>): Type<TType>;
```
## Parameters
| Parameter | Type | Description |
| --- | --- | --- |
| type | Type<TType> | Type that is going to be used for extraction |
<b>Returns:</b>
Type<TType>
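A usage sketch (assuming `Type` is Angular's `Type` token and that `getType` is imported from `@anglr/rest`; treat the details as illustrative):

```typescript
import {Type} from '@angular/core';
import {getType} from '@anglr/rest';

class UserDetail {}

// For a plain Type, the same type comes back; for a NotType wrapper,
// the underlying type is extracted.
const underlying: Type<UserDetail> = getType(UserDetail);
```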
| 22.08 | 83 | 0.621377 | eng_Latn | 0.788905 |
2471f8db185ba35bf5539c1f0da0b336063d899c | 438 | md | Markdown | source/_posts/AllTogether-2022-03-06.md | PttCodingMan/ptt_post_too_many_monitor | 41031f3c6691e7050a1b5ef523e3cd2a00e502c5 | [
"Apache-2.0"
] | null | null | null | source/_posts/AllTogether-2022-03-06.md | PttCodingMan/ptt_post_too_many_monitor | 41031f3c6691e7050a1b5ef523e3cd2a00e502c5 | [
"Apache-2.0"
] | null | null | null | source/_posts/AllTogether-2022-03-06.md | PttCodingMan/ptt_post_too_many_monitor | 41031f3c6691e7050a1b5ef523e3cd2a00e502c5 | [
"Apache-2.0"
] | null | null | null | ---
title: 2022-03-06-AllTogether 3 rule violations
tags:
- AllTogether
abbrlink: 2022-03-06-AllTogether
date: 2022-03-06 12:00:00
---
AllTogether board [link to board rules](https://www.ptt.cc/bbs/AllTogether/M.1643211430.A.5FB.html)

3 users violated the rules yesterday
<!-- more -->
Violation list

Rule: no more than 1 post within 7 days
3/05 dreambreaken □ [Seeking F] Dinner and drinks in Taipei tonight
2/27 dreambreaken □ [Seeking F] Drinks in Taipei tonight
3/05 nasa01 □ [Seeking F] For you who worry about in-law problems
2/27 nasa01 □ [Seeking F] Hiking Jinmian Mountain
3/05 lnear □ [Seeking F] And then begin to seek grace
2/28 lnear □ [Seeking F] Watch a romance film together in Hsinchu tomorrow | 19.909091 | 80 | 0.691781 | yue_Hant | 0.828159
24724c1fd64da5064f2f11b4969368e0347f938f | 58 | md | Markdown | README.md | Max1993Liu/Linearizer | 739c47c0d98d262a0bc962a450729bcf83c61212 | [
"MIT"
] | null | null | null | README.md | Max1993Liu/Linearizer | 739c47c0d98d262a0bc962a450729bcf83c61212 | [
"MIT"
] | null | null | null | README.md | Max1993Liu/Linearizer | 739c47c0d98d262a0bc962a450729bcf83c61212 | [
"MIT"
] | null | null | null | # Linearizer
Linearizing parameters for linear regression
| 19.333333 | 44 | 0.862069 | eng_Latn | 0.991183 |
2472d96600c9d3a8023cdf12df876e7b26bdaa4c | 182 | md | Markdown | schema/alert/README.md | zls/prometheus-sql-adapter | 70865dbfea3724518a08c30cc9353eb2df97b558 | [
"MIT"
] | 5 | 2019-12-04T17:35:21.000Z | 2020-07-08T10:22:56.000Z | schema/alert/README.md | zls/prometheus-sql-adapter | 70865dbfea3724518a08c30cc9353eb2df97b558 | [
"MIT"
] | 74 | 2019-11-24T00:59:00.000Z | 2020-12-27T03:57:28.000Z | schema/alert/README.md | zls/prometheus-sql-adapter | 70865dbfea3724518a08c30cc9353eb2df97b558 | [
"MIT"
] | 2 | 2020-02-10T20:34:41.000Z | 2020-10-20T17:35:02.000Z | # Alerting Queries
These are relatively optimized queries for common alerts, meant to be run on a regular interval and over a short
time range, by Grafana or another alerting tool.
| 36.4 | 112 | 0.802198 | eng_Latn | 0.999899 |
24737d3a039a57596baccf6e71f0cc46783a8ea4 | 6,545 | md | Markdown | README.md | AutomationLife/terraform-scripts | d32426ca9a8245d756979b574fdb8315d15217ce | [
"MIT"
] | null | null | null | README.md | AutomationLife/terraform-scripts | d32426ca9a8245d756979b574fdb8315d15217ce | [
"MIT"
] | null | null | null | README.md | AutomationLife/terraform-scripts | d32426ca9a8245d756979b574fdb8315d15217ce | [
"MIT"
] | null | null | null | # terraform-scripts
Terraform is an Infrastructure as Code (IaC) tool.
It is used for automated provisioning and procurement of infrastructure.
It's owned by HashiCorp, which also owns tools like Vault, Packer, and Vagrant.
It launched in 2014 as OSS.
There are two versions of Terraform available: OSS and Enterprise.
It's available in 2 forms of use: on-prem and SaaS (cloud).
Its configuration is written in HCL (HashiCorp Configuration Language).
Terraform is cloud agnostic and can be used with any cloud provider. Cloud-provider-specific tools are

* CloudFormation (AWS)
* Azure Resource Manager Template (ARM)
* Deployment Manager (GCP)

It is easy to use and manage.
It is a declarative, domain-specific language (DSL), meaning there isn't much code logic to write, just small parameters to add.
It has in-built dependency resolving capability - it sets its own order to run things in.
It can automate other platforms like k8s, Rancher, GitHub, GitLab, Postgres, Splunk, and Cisco.
It is very easy to learn, as it has only 8 components to learn and understand.
1. Provider - which platform to automate, like AWS, Azure, GCP, k8s, MySQL.
2. Resources - EC2, EFS, Security Groups.
3. Data Sources - security, VPC.
4. Variables - to make code generic and reusable.
5. Outputs - to display output on screen for later usage, like credentials, DB details, etc.
6. Provisioners - to perform certain tasks on procured infra once it is ready to use.
7. Modules - to easily call reusable parts of code from other code.
8. .tf (Terraform file) - the file extension for Terraform files.
Other Capabilities:
* workspace - work on different provisions at the same time by managing them in different workspaces.
* backend - stores the state in a remote location as a tfstate file.
* lock - locks the state file while something is running.
URL: https://www.terraform.io/
## Installation of Terraform CLI
Download CLI: https://www.terraform.io/downloads.html
### MAC
**OPT 1**: Export the path of installed tool.
export PATH=$PATH:/usr/local/terraform/
**OPT 2**: Just copy your terraform binary and place it in /usr/local/bin/
sudo mv terraform /usr/local/bin
To verify your installation and check the version, launch a terminal and enter: *terraform -version*
### Windows
1. Download the applicable package to your local system.
2. Extract the package to the folder C:\Program Files (x86).
This path is used as an example. However, you can also move the Terraform executable to any other location in your local system.
3. Update the Path environment variable to include the folder where your Terraform executable is located.
1. Go to the **Control Panel**.
2. Click **System**.
3. On a Windows 10 system, click **Advanced system settings**. This option might vary in different versions of Windows. The Advanced tab of the System Properties window is displayed.
4. Click **Environment Variables** near the bottom of the window. The Environment Variables window is displayed.
5. In the System variables pane, click **Path** and then click **Edit**.
6. Click **New**. Add the path to the folder where your Terraform executable is located.
7. Click **OK** to save your changes and then click **OK** to exit the Environment Variables windows. Then click **OK** again to exit the System Properties window.
8. To verify your installation and check the version, launch Windows PowerShell and enter: *terraform -version*
Add the path where Terraform is installed to the PATH variable.
## Working with Terraform
Below steps run in folder *terraform-demo*.
1. Create **main.tf** file.
2. We use *AWS* and work in region *us-east-2*. It can be any region, but for practice we are using this one.
3. Provide the values for the **AWS_ACCESS_KEY_ID** and **AWS_SECRET_ACCESS_KEY** and export it. For Windows add these variables in User Variables section of **Environment Variables** in **Advanced system settings**.
export AWS_ACCESS_KEY_ID="anaccesskey"
export AWS_SECRET_ACCESS_KEY="asecretkey"
4. Run the command *terraform init*. It will download the provider plugin locally into the **.terraform** folder.
>We have installed the Terraform CLI on the local machine and created a Terraform project. On running the **terraform init** command, the local Terraform CLI connects to the Terraform registry to download the plugin needed to connect to the remote AWS cloud. This creates a *.terraform* folder locally with all the plugins.
>To see the details of a provider, open the URL https://registry.terraform.io/ and click **Browse Registry** to access all supported providers, or use https://registry.terraform.io/browse/providers.
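For this demo, a minimal **main.tf** could be as small as the following sketch (the exact file contents are not spelled out in these notes):

    provider "aws" {
      region = "us-east-2"
    }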
## Working with Variables
The steps below are run in the folder *terraform-demo*.
1. Create a file **variables.tf**.
2. Write declarations for the values you want to parameterize.
3. Use the variable wherever its value is needed, in the following format.
var.<variable-name>
4. You can declare a variable without a value (it must then be supplied at run time), predefine its value in the file, or pass the value as a parameter:
terraform apply -var variable=value
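As a sketch, a variable declaration in **variables.tf** and its use in **main.tf** could look like this (the variable name and values here are examples only, not from these notes):

    # variables.tf
    variable "instance_type" {
      type        = string
      description = "EC2 instance type"
      default     = "t2.micro" # drop the default to force a value at run time
    }

    # main.tf - reference it as var.<variable-name>
    resource "aws_instance" "demo" {
      ami           = "ami-0123456789abcdef0" # placeholder AMI ID
      instance_type = var.instance_type
    }

With the default removed, *terraform apply -var instance_type=t3.micro* would supply the value at run time.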
## Working with Modules
The steps below are run in the folder *terraform-modules-demo*.
1. Create a folder **modules**.
2. Create a folder with the name of the module under **modules** and create four files under it.
1. **variables.tf** - To write variables that we use in terraform files.
2. **main.tf** - Write actual code logic.
3. **output.tf** - Write the outputs needed for reference and usage.
4. **versions.tf** - Required version of terraform to use this module.
3. Write the respective code in those files and come back to the **terraform-modules-demo** folder.
4. Create a file **main.tf** where we define the provider and call the modules that are needed, as sketched below.
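A sketch of the resulting layout and the calling **main.tf** (the module name and its variable are placeholders, not from these notes):

    # Folder layout:
    #   terraform-modules-demo/
    #   |-- main.tf
    #   `-- modules/
    #       `-- ec2_instance/
    #           |-- variables.tf
    #           |-- main.tf
    #           |-- output.tf
    #           `-- versions.tf

    provider "aws" {
      region = "us-east-2"
    }

    module "ec2_instance" {
      source        = "./modules/ec2_instance"
      instance_type = "t2.micro" # passed into the module's variables.tf
    }

Such a module can then be targeted individually with *terraform apply -target module.ec2_instance*.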
## Terraform commands
* **terraform init** - Initialize terraform, download the plugins needed.
* **terraform validate** - Validate the terraform script for errors.
* **terraform plan** - Will do a dry-run and tell what's going to execute.
* **terraform apply** - Will apply all the Terraform changes.
* *-auto-approve* - Pass this parameter to the apply command for auto approval if you feel everything is correct.
terraform apply -auto-approve
* *-target module.\<module-name>* - to run a specific module.
terraform apply -target module.<module-name>
* **terraform state list** - lists the resources currently tracked in the state.
* **terraform output** - prints the output values defined, handy for quick reference.
* **terraform destroy** - Will delete everything that was created. | 49.583333 | 318 | 0.751566 | eng_Latn | 0.992341 |
24762870d920f402c3909494cc4d65677f3cc6e6 | 12,652 | md | Markdown | _posts/projects/2016-01-29-SFCrimeExplore/2016-01-29-SFCrimeExplore.md | MarkPratley/markpratley.github.io | bb5e239226d48c4105383c8ab7da512eaab7a624 | [
"MIT"
] | 1 | 2016-08-25T23:36:45.000Z | 2016-08-25T23:36:45.000Z | _posts/projects/2016-01-29-SFCrimeExplore/2016-01-29-SFCrimeExplore.md | MarkPratley/markpratley.github.io | bb5e239226d48c4105383c8ab7da512eaab7a624 | [
"MIT"
] | null | null | null | _posts/projects/2016-01-29-SFCrimeExplore/2016-01-29-SFCrimeExplore.md | MarkPratley/markpratley.github.io | bb5e239226d48c4105383c8ab7da512eaab7a624 | [
"MIT"
] | null | null | null | ---
layout: page-fullwidth
title: "San Francisco Crime"
#subheadline: "Exploring and Mapping Crime in San Francisco"
meta_teaser: "Exploring and Mapping Crime in San Francisco"
teaser: "Exploring and Mapping Crime in San Francisco"
breadcrumb: true
comments: true
meta: true
header:
title: ""
image_fullwidth: SF-Crime2_Middle.png
background-color: "#262930"
   caption: Mapping San Francisco Crime
image:
thumb: SF-Crime.png
homepage: SF-Crime2_Middle.png
    caption: Mapping San Francisco Crime
categories:
- projects
tags:
- r
- google maps
- ggplot2
---
## Intro
I recently stumbled onto the [San Francisco Police Department's Incidents Dataset](https://data.sfgov.org/Public-Safety/SFPD-Incidents-from-1-January-2003/tmnf-yvry), which caught my interest, and I decided to investigate further.
[This](https://data.sfgov.org/api/views/tmnf-yvry/rows.csv?accessType=DOWNLOAD) dataset contains records of all the Incidents derived from SFPD Crime Incident Reporting system from 01/01/2003 up until 31/12/2014.
It contains almost 2 million incidents, where each incident has 13 data variables showing the Category, Description, and Resolution of the incident along with Date, Time and Location details.
From these it is possible to explore the data in some detail and also create some geo-data maps of incident locations.
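The post works with a data frame called `sf.data` throughout; a minimal load step (the file name below is an assumption, not taken from the post) would look something like this:

{% highlight r %}
library(dplyr)
library(ggplot2)
library(ggmap)
library(knitr)  # for kable()
library(scales) # for the comma() axis formatter

# Read the raw incidents CSV downloaded from the SFPD dataset page
sf.data <- read.csv("SFPD_Incidents.csv", stringsAsFactors = FALSE)
{% endhighlight %}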
Let's first take a look at the main types of information it contains.
## Categories
Category is a broad umbrella term to easily determine the nature of the incident.
There are 39 categories with these being the most numerous:
{% highlight r %}
sf.data %>%
group_by(Category) %>%
summarise(Num=n()) %>%
arrange(desc(Num)) %>%
head() %>%
kable()
{% endhighlight %}
|Category | Num|
|:--------------|------:|
|Larceny/Theft | 374680|
|Other Offenses | 263432|
|Non-Criminal | 195806|
|Assault | 161609|
|Vehicle Theft | 112148|
|Drug/Narcotic | 110181|
{% highlight r %}
sf.data %>%
select(Category) %>%
group_by(Category) %>%
summarise(Num=n()) %>%
arrange(desc(Num)) %>%
head(7) %>%
ggplot(aes(x=Category, y=Num)) +
geom_bar(stat = "identity", aes(fill=Category)) +
theme(axis.text.x=element_blank()) +
labs(x="Category",
y = "Number",
title="Top Incident Categories") +
scale_y_continuous(labels = comma)
{% endhighlight %}

## Crime Type
The dataset also contains a finer-grained summary of each incident; these examples are taken from the Drug/Narcotic category:
{% highlight r %}
sf.data %>%
filter(Category=="Drug/Narcotic") %>%
group_by(Descript) %>%
summarise(Num=n()) %>%
arrange(desc(Num)) %>%
head() %>%
kable()
{% endhighlight %}
|Descript | Num|
|:----------------------------------------|-----:|
|Possession Of Narcotics Paraphernalia | 20486|
|Possession Of Base/Rock Cocaine | 14109|
|Possession Of Marijuana | 11228|
|Sale Of Base/Rock Cocaine | 8805|
|Possession Of Meth-Amphetamine | 7551|
|Possession Of Base/Rock Cocaine For Sale | 7364|
## Resolution
The outcome of the incident.
Shown below are the most common outcomes in the Weapon Laws category.
{% highlight r %}
sf.data %>%
filter(Category=="Weapon Laws") %>%
group_by(Resolution) %>%
summarise(Num=n()) %>%
arrange(desc(Num)) %>%
head() %>%
kable()
{% endhighlight %}
|Resolution | Num|
|:-------------------|-----:|
|Arrest, Booked | 10554|
|None | 4874|
|Arrest, Cited | 1156|
|Juvenile Booked | 636|
|Juvenile Cited | 262|
|Juvenile Admonished | 191|
## Arrested?
For ease of analysis a new 'Arrested' variable was created which simplifies two arrest resolutions ("Arrest, Booked" and "Arrest, Cited") as 1, and all the other Resolutions as 0.
{% highlight r %}
sf.data <- sf.data %>%
mutate(Arrested=ifelse(grepl("Arrest", Resolution), 1, 0))
{% endhighlight %}
This allows us to easily show the incidents with the highest arrest rate:
{% highlight r %}
sf.data %>%
group_by(Category) %>%
summarise(Num=n(),
Arrested=sum(Arrested),
PercentArrested=round(100*sum(Arrested)/n(), 0)) %>%
arrange(desc(PercentArrested)) %>%
head(9) %>%
kable()
{% endhighlight %}
|Category | Num| Arrested| PercentArrested|
|:---------------------------|------:|--------:|---------------:|
|Driving Under The Influence | 4867| 4558| 94|
|Prostitution | 15429| 14434| 94|
|Warrants | 87945| 81579| 93|
|Drug/Narcotic | 110181| 98538| 89|
|Loitering | 2340| 2040| 87|
|Liquor Laws | 3821| 3221| 84|
|Stolen Property | 9825| 8060| 82|
|Drunkenness | 8882| 7212| 81|
|Other Offenses | 263432| 183513| 70|
And also the incidents with the lowest:
{% highlight r %}
sf.data %>%
group_by(Category) %>%
summarise(Num=n(),
Arrested=sum(Arrested),
PercentArrested=round(100*sum(Arrested)/n(), 0)) %>%
arrange(desc(PercentArrested)) %>%
tail(9) %>%
kable()
{% endhighlight %}
|Category | Num| Arrested| PercentArrested|
|:-----------------|------:|--------:|---------------:|
|Larceny/Theft | 374680| 31419| 8|
|Recovered Vehicle | 6346| 395| 6|
|Bad Checks | 849| 45| 5|
|Non-Criminal | 195806| 10082| 5|
|Suspicious Occ | 66176| 2926| 4|
|Vehicle Theft | 112148| 4381| 4|
|Suicide | 1108| 32| 3|
|Missing Person | 54344| 1224| 2|
|Runaway | 3957| 17| 0|
## Highest Arrested Percent Crimes
{% highlight r %}
sf.data %>%
group_by(Category) %>%
summarise(Num=n(),
Arrested=sum(Arrested),
Percent=round(100*sum(Arrested)/n(), 0)) %>%
arrange(desc(Percent)) %>%
head(9) %>%
ggplot(aes(x=reorder(Category, -Percent),
y=Percent,
fill=reorder(Category, -Percent))) +
geom_bar(stat = "identity") +
theme(axis.text.x=element_blank(),
legend.title=element_blank()) +
labs(x="Category",
y = "Percent Arrested",
title="Percent Arrested per Category of Incident") +
ylim(c(0,100))
{% endhighlight %}

## Robbery/Theft Data
Having briefly looked at arrest rates, let's move on to crime location.
As a visitor to San Francisco, the crimes that would interest me most (in a "can I avoid them" kind of way) would probably be robbery and theft.
These appear to be grouped under the following descriptions:
{% highlight r %}
sf.p.lt <-
sf.data %>%
filter(Category=="Larceny/Theft" &
grepl("Person|Pursesnatch|Pickpocket|Credit Card", Descript) |
Category=="Robbery" &
!grepl("Bank|Carjacking|Store|Residence|Vehicle|Station|Estab", Descript))
sf.p.lt %>%
group_by(Descript) %>%
summarise(Num=n()) %>%
arrange(desc(Num)) %>%
head(10) %>%
kable()
{% endhighlight %}
|Descript | Num|
|:-------------------------------------------------|-----:|
|Grand Theft From Person | 14799|
|Robbery On The Street, Strongarm | 13837|
|Grand Theft Pickpocket | 11728|
|Robbery, Bodily Force | 9407|
|Robbery On The Street With A Gun | 4278|
|Attempted Robbery On The Street With Bodily Force | 2349|
|Robbery, Armed With A Gun | 2292|
|Attempted Robbery With Bodily Force | 1610|
|Robbery On The Street With A Knife | 1535|
|Grand Theft Pursesnatch | 1487|
## Robbery/Theft Incidents Map
{% highlight r %}
sf.big.box = c(-122.52,37.7,-122.35,37.82)
big.map <- get_map(location = sf.big.box)
{% endhighlight %}
{% highlight r %}
ggmap(big.map) +
geom_point(data = sf.p.lt, aes(x=X, y=Y),
color = "red", size = 0.6, alpha = 0.3) +
labs(title = "Locations of Robbery/Theft Incidents in San Francisco")
{% endhighlight %}

This map shows that Robbery/Theft occurs all over SF, but gives little indication of which areas are better or worse than others.
## Robbery/Theft Contour Map
Let's look at a contour map and see if there are any hotspots.
{% highlight r %}
# Robbery/Theft Contours
ggmap(big.map, extent='device', legend="bottomleft") +
stat_density2d(data = sf.p.lt,
aes(x = X, y = Y, fill = ..level.., alpha=..level..),
size = 0.1, colour="red", n=100, geom = "polygon") +
scale_alpha_continuous(range=c(0.6,0.8), guide='none') +
scale_fill_gradient('Robbery\nTheft\nDensity')+
ggtitle('Robbery/Theft Density in San Francisco')
{% endhighlight %}

## A Closer Look
There are some definite hotspots there so we'll zoom in and take a closer look.
{% highlight r %}
# more focussed map for larceny centre
zoom.centre = c(-122.41, 37.79)
zoom.map <- get_map(location=zoom.centre, zoom = 14)
{% endhighlight %}
{% highlight r %}
# Robbery/Theft Contours
ggmap(zoom.map, extent='device', legend="topleft") +
stat_density2d(data = sf.p.lt,
aes(x = X, y = Y, fill = ..level.., alpha=..level..),
size = 0.1, colour="red", n=100,
geom = "polygon") +
scale_alpha_continuous(range=c(0.2,0.4), guide='none') +
scale_fill_gradient('Robbery\nTheft\nDensity')+
ggtitle('Robbery/Theft Density in San Francisco')
{% endhighlight %}

# Central District
An even closer look at the central district
{% highlight r %}
# more focussed map for larceny centre
big.zoom.centre = c(-122.409, 37.786)
big.zoom.map <- get_map(location=zoom.centre, zoom = 15)
{% endhighlight %}
{% highlight r %}
# Robbery/Theft Contours
ggmap(big.zoom.map, extent='device', legend="topleft") +
stat_density2d(data = sf.p.lt,
aes(x = X, y = Y, fill = ..level.., alpha=..level..),
size = 0.1, colour="red", n=200, geom = "polygon") +
scale_alpha_continuous(range=c(0.05, 0.1), guide='none') +
scale_fill_gradient('Robbery\nTheft\nDensity')+
ggtitle('Robbery/Theft Density in San Francisco')
{% endhighlight %}

From the map it's clear that there are areas with elevated levels of robbery/theft, and with the shading mostly removed it's possible to see that one of the strongest centres is the area around Powell St Station.
# Robbery vs Theft Contours
Obviously no-one wants to be a victim of larceny/theft, but robbery (with violence or threat) is likely to be worse - do these two sets of crimes happen in different places?
Larceny contours are in blue, Robbery contours in red.
{% highlight r %}
# Larceny - Blue
# Robbery - Red
ggmap(big.zoom.map, extent='device') +
stat_density2d(data = sf.p.lt %>% filter(Category=="Larceny/Theft"),
aes(x = X, y = Y, fill = ..level.., alpha=..level..),
colour="blue", n=200, geom = "polygon", bins=7) +
stat_density2d(data = sf.p.lt %>% filter(Category=="Robbery"),
aes(x = X, y = Y, fill = ..level.., alpha=..level..),
colour="red", n=200, geom = "polygon", bins=7) +
scale_alpha_continuous(range=c(0.0,0.1), guide='none') +
scale_fill_gradient('Robbery\nTheft\nDensities') +
ggtitle('Robbery/Theft Contours in San Francisco') +
theme(legend.position = "none")
{% endhighlight %}

<br>From the overlapping contours it seems that these two categories of crimes happen in broadly the same areas with only minor differences.
<br>
<br>
The github repository for this project can be viewed [here](https://github.com/MarkPratley/SF-Crime).
| 32.441026 | 230 | 0.596744 | eng_Latn | 0.777938 |
2476320c9939f402cca37d482e244a64c72805f4 | 358 | md | Markdown | README.md | vinz9/BulletCHOP | 764fa159913bcd7ff9a1ac03b221b90431f37b8c | [
"MIT"
] | 45 | 2015-01-21T05:59:30.000Z | 2021-04-09T23:51:53.000Z | README.md | vinz9/BulletCHOP | 764fa159913bcd7ff9a1ac03b221b90431f37b8c | [
"MIT"
] | null | null | null | README.md | vinz9/BulletCHOP | 764fa159913bcd7ff9a1ac03b221b90431f37b8c | [
"MIT"
] | 6 | 2015-01-21T05:59:31.000Z | 2018-06-28T05:59:00.000Z | BulletCHOP
==========
Bullet Physics in TouchDesigner as a Channel Operator.
Requires Bullet Physics library (http://bulletphysics.org/) to compile.
Tested with v2.87.
06/25/2018 v0.2 : update to Bullet 2.87 and TouchDesigner 099
04/08/2013 v0.1 : dynamic and kinematic bodies implemented, only box collision shape
Vincent Houzé
https://vincenthouze.com | 27.538462 | 84 | 0.765363 | kor_Hang | 0.405187 |
2477feb8993d2420d6fe10de6f74b47c7049d8cd | 2,281 | md | Markdown | docs/code-quality/c6388.md | monkey3310/visualstudio-docs.pl-pl | adc80e0d3bef9965253897b72971ccb1a3781354 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/code-quality/c6388.md | monkey3310/visualstudio-docs.pl-pl | adc80e0d3bef9965253897b72971ccb1a3781354 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/code-quality/c6388.md | monkey3310/visualstudio-docs.pl-pl | adc80e0d3bef9965253897b72971ccb1a3781354 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: C6388
ms.date: 11/04/2016
ms.prod: visual-studio-dev15
ms.technology: vs-ide-code-analysis
ms.topic: reference
f1_keywords:
- C6388
helpviewer_keywords:
- C6388
ms.assetid: 667fe9cf-cc53-49f9-b6c0-6ee87c397568
author: mikeblome
ms.author: mblome
manager: wpickett
ms.workload:
- multiple
ms.openlocfilehash: bfc94625ee6a4727dc5fde25c826b4faeae6f57e
ms.sourcegitcommit: e13e61ddea6032a8282abe16131d9e136a927984
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 04/26/2018
ms.locfileid: "31895035"
---
# <a name="c6388"></a>C6388
Warning C6388: \<argument> may not be \<value>: this does not adhere to the specification for the function \<function name>: Lines: x, y
This warning indicates that an unexpected value is being used in the specified context. It is typically reported for values passed as arguments to a function that does not expect them.
## <a name="example"></a>Example
The following C++ code generates this warning because DoSomething expects a null value, but a potentially non-null value may be passed:
```cpp
#include <string.h>
#include <malloc.h>
#include <sal.h>
void DoSomething( _Pre_ _Null_ void* pReserved );
void f()
{
void* p = malloc( 10 );
DoSomething( p ); // Warning C6388
// code...
free(p);
}
```
To correct this warning, use the following code:
```cpp
#include <string.h>
#include <malloc.h>
#include <sal.h>
void DoSomething( _Pre_ _Null_ void* pReserved );
void f()
{
void* p = malloc( 10 );
if (!p)
{
DoSomething( p );
}
else
{
// code...
free(p);
}
}
```
Note that the use of malloc and free has many pitfalls in terms of memory leaks and exceptions. To avoid these kinds of leaks and exception problems altogether, use the mechanisms that are provided by the C++ Standard Template Library (STL). These include [shared_ptr](/cpp/standard-library/shared-ptr-class), [unique_ptr](/cpp/standard-library/unique-ptr-class), and [vector](/cpp/standard-library/vector). For more information, see [Smart pointers](/cpp/cpp/smart-pointers-modern-cpp) and [C++ Standard Library](/cpp/standard-library/cpp-standard-library-reference). | 31.246575 | 623 | 0.73082 | pol_Latn | 0.994178 |
24781de731f95752ba916e1fd6a57db99d91735b | 3,560 | md | Markdown | articles/finance/asset-leasing/set-up-lease-wrkflw.md | MicrosoftDocs/Dynamics-365-Operations.hu-hu | 52e56eef99cc41887360f8961e7ccd2adfa21fd1 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2020-05-18T17:14:11.000Z | 2021-04-20T21:13:46.000Z | articles/finance/asset-leasing/set-up-lease-wrkflw.md | MicrosoftDocs/Dynamics-365-Operations.hu-hu | 52e56eef99cc41887360f8961e7ccd2adfa21fd1 | [
"CC-BY-4.0",
"MIT"
] | 7 | 2017-12-08T15:20:43.000Z | 2021-02-17T13:09:53.000Z | articles/finance/asset-leasing/set-up-lease-wrkflw.md | MicrosoftDocs/Dynamics-365-Operations.hu-hu | 52e56eef99cc41887360f8961e7ccd2adfa21fd1 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2019-10-12T18:21:04.000Z | 2021-10-13T09:24:54.000Z | ---
title: Set up lease approval workflows
description: This topic explains how to set up an approval workflow that will run when a new lease is created.
author: moaamer
ms.date: 04/12/2021
ms.topic: article
ms.prod: ''
ms.technology: ''
ms.search.form: WorkflowTableListPageRnr
audience: Application User
ms.reviewer: roschlom
ms.custom: 4464
ms.assetid: 5f89daf1-acc2-4959-b48d-91542fb6bacb
ms.search.region: Global
ms.author: moaamer
ms.search.validFrom: 2020-10-28
ms.dyn365.ops.version: 10.0.14
ms.openlocfilehash: 2f99bb480e6ee2314852965ab9559bae2ad348fb92514d791fca127d91558348
ms.sourcegitcommit: 42fe9790ddf0bdad911544deaa82123a396712fb
ms.translationtype: HT
ms.contentlocale: hu-HU
ms.lasthandoff: 08/05/2021
ms.locfileid: "6733635"
---
# <a name="set-up-lease-approval-workflows"></a>Set up lease approval workflows
[!include [banner](../includes/banner.md)]
This topic explains how to set up an approval workflow that will run when a new lease is created. For information about how to use the workflow, see [Use lease approval workflows](use-create-lease-wrkflw.md).
1. Go to **Asset leasing \> Setup \> Lease workflow**.
2. On the **Lease workflow** page, select **New**.
3. In the dialog box that appears, in the **Workflow type** group, select the **Lease workflow** link.
The application opens. When it runs, sign in to the Azure Active Directory (Azure AD) application so that you are redirected to the workflow application.
4. Drag the **Lease workflow approval** element onto the workflow.
5. Connect a node from **Start** to **Lease workflow approval**. Then connect **Lease workflow approval** to **End**.
6. Double-click **Lease workflow approval**.
7. Select **Properties**, and then, under **Basic settings**, enter a name for the workflow.
On this page you can also set additional parameters for the workflow. If **Automatic actions** is turned on, the system automatically performs a given action. Notifications are sent if they are specified on the **Notifications** tab. On the **Advanced settings** tab, you can specify a final approver, set a time limit, and select specific actions that should be taken.
8. When you have finished setting the workflow parameters, select **Close**.
9. Select **Step 1**, and then select **Properties**.
10. Under **Basic settings**, enter a name for the step, create an approval subject line, and enter instructions for the approval.
11. On the **Assignment** tab, select the assignment type.
12. To assign specific users to the approval, select **User**, select the users who will approve leases, and then select **Close**.
13. To create the workflow, select **Save and close**. Then, when you are prompted, select **OK**.
14. On the **Create workflow** page, select **Close**.
15. Select the new workflow, and then select **Versions**. Then select **Make active** so that the workflow becomes active.
16. Select **Close**. The new active version appears.
[!INCLUDE[footer-include](../../includes/footer-banner.md)]
| 63.571429 | 423 | 0.800281 | hun_Latn | 1.000002 |
24785df58aa70a28e132140f5bf8b51872f08e3a | 211 | md | Markdown | content/news/_index.md | afeijoo/digitalgov.gov | 117098d31802464d9696987980f4a400f3f6654c | [
"CC0-1.0"
] | 1 | 2022-02-11T11:53:47.000Z | 2022-02-11T11:53:47.000Z | content/news/_index.md | afeijoo/digitalgov.gov | 117098d31802464d9696987980f4a400f3f6654c | [
"CC0-1.0"
] | null | null | null | content/news/_index.md | afeijoo/digitalgov.gov | 117098d31802464d9696987980f4a400f3f6654c | [
"CC0-1.0"
] | null | null | null | ---
title: "News and Updates"
summary: "Innovative work, news and ideas from people and teams in government"
deck: "Innovative work, news and ideas from people and teams in government"
aliases:
- /posts/
---
| 23.444444 | 78 | 0.729858 | eng_Latn | 0.99966 |
247a1814535f78e058945ba317ac040a2a00014b | 41,517 | md | Markdown | README.md | trickster/world-sqlite3 | 27ad38366c10b890f7f037032004ba16451b5885 | [
"CC-BY-4.0"
] | 120 | 2021-04-17T19:17:07.000Z | 2022-03-23T02:58:58.000Z | README.md | trickster/world-sqlite3 | 27ad38366c10b890f7f037032004ba16451b5885 | [
"CC-BY-4.0"
] | null | null | null | README.md | trickster/world-sqlite3 | 27ad38366c10b890f7f037032004ba16451b5885 | [
"CC-BY-4.0"
] | 10 | 2021-05-03T12:42:02.000Z | 2022-01-12T15:07:21.000Z | Small script to import the CSV files from https://datacatalog.worldbank.org/dataset/world-development-indicators into an SQLite database.
The dataset is licensed under CC-BY 4.0.
1. Download the CSV zip (in the above link, go to Data & Resources, then select CSV)
2. run create_db.py
The resulting database is around 600MB (7 million indicator values).
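`create_db.py` itself is not reproduced here; a minimal sketch of the core import (assuming the standard `WDIData.csv` file name and column headers inside the zip, which this README does not spell out) might look like:

```python
import sqlite3

import pandas as pd

con = sqlite3.connect("wdi.sqlite3")

# The CSV is wide: one row per (country, indicator), one column per year.
df = pd.read_csv("WDIData.csv")
year_cols = [c for c in df.columns if c.strip().isdigit()]

# Melt to the long (indicator, country, year, value) layout used by wdi_data.
long = df.melt(
    id_vars=["Country Code", "Indicator Code"],
    value_vars=year_cols,
    var_name="year",
    value_name="value",
).dropna(subset=["value"])
long = long.rename(
    columns={"Country Code": "country_code", "Indicator Code": "indicator_code"}
)
long["year"] = long["year"].astype(int)
long = long[["indicator_code", "country_code", "year", "value"]]
long.to_sql("wdi_data", con, if_exists="replace", index=False)

# Index matching the one shown in the table definitions below.
con.execute(
    "CREATE INDEX ix_wdi_data_indicator_code_country_code_year "
    "ON wdi_data (indicator_code, country_code, year)"
)
con.close()
```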
## Examples
**`select count(*) as num_countries from wdi_country`**
| | num_countries |
|---:|----------------:|
| 0 | 263 |
**`select country_code, long_name, currency_unit from wdi_country limit 5`**
| | country_code | long_name | currency_unit |
|---:|:---------------|:-----------------------------|:----------------|
| 0 | ABW | Aruba | Aruban florin |
| 1 | AFG | Islamic State of Afghanistan | Afghan afghani |
| 2 | AGO | People's Republic of Angola | Angolan kwanza |
| 3 | ALB | Republic of Albania | Albanian lek |
| 4 | AND | Principality of Andorra | Euro |
**30 random indicators**
```sql
select series_code, indicator_name, substr(long_definition, 0, 500)||'...' as definition from wdi_series order by random() limit 30
```
| | series_code | indicator_name | definition |
|---:|:------------------------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | NY.GNP.MKTP.PP.KD | GNI, PPP (constant 2017 international $) | PPP GNI (formerly PPP GNP) is gross national income (GNI) converted to international dollars using purchasing power parity rates. An international dollar has the same purchasing power over GNI as a U.S. dollar has in the United States. Gross national income is the sum of value added by all resident producers plus any product taxes (less subsidies) not included in the valuation of output plus net receipts of primary income (compensation of employees and property income) from abroad. Data are in ... |
| 1 | TM.VAL.MRCH.CD.WT | Merchandise imports (current US$) | Merchandise imports show the c.i.f. value of goods received from the rest of the world valued in current U.S. dollars.... |
| 2 | SE.XPD.CPRM.ZS | Current education expenditure, primary (% of total expenditure in primary public institutions) | Current expenditure is expressed as a percentage of direct expenditure in public educational institutions (instructional and non-instructional) of the specified level of education. Financial aid to students and other transfers are excluded from direct expenditure. Current expenditure is consumed within the current year and would have to be renewed if needed in the following year. It includes staff compensation and current expenditure other than for staff compensation (ex. on teaching materials,... |
| 3 | DT.ODA.ODAT.GI.ZS | Net ODA received (% of gross capital formation) | Net official development assistance (ODA) consists of disbursements of loans made on concessional terms (net of repayments of principal) and grants by official agencies of the members of the Development Assistance Committee (DAC), by multilateral institutions, and by non-DAC countries to promote economic development and welfare in countries and territories in the DAC list of ODA recipients. It includes loans with a grant element of at least 25 percent (calculated at a rate of discount of 10 per... |
| 4 | SE.SEC.ENRL.GC.FE.ZS | Secondary education, general pupils (% female) | Secondary general pupils are the number of secondary students enrolled in general education programs, including teacher training.... |
| 5 | SE.SEC.CUAT.UP.ZS | Educational attainment, at least completed upper secondary, population 25+, total (%) (cumulative) | The percentage of population ages 25 and over that attained or completed upper secondary education.... |
| 6 | SP.URB.TOTL | Urban population | Urban population refers to people living in urban areas as defined by national statistical offices. It is calculated using World Bank population estimates and urban ratios from the United Nations World Urbanization Prospects. Aggregation of urban and rural population may not add up to total population because of different country coverages.... |
| 7 | SP.M15.2024.FE.ZS | Women who were first married by age 15 (% of women ages 20-24) | Women who were first married by age 15 refers to the percentage of women ages 20-24 who were first married by age 15.... |
| 8 | SP.RUR.TOTL.ZG | Rural population growth (annual %) | Rural population refers to people living in rural areas as defined by national statistical offices. It is calculated as the difference between total population and urban population.... |
| 9 | NE.GDI.FTOT.CN | Gross fixed capital formation (current LCU) | Gross fixed capital formation (formerly gross domestic fixed investment) includes land improvements (fences, ditches, drains, and so on); plant, machinery, and equipment purchases; and the construction of roads, railways, and the like, including schools, offices, hospitals, private residential dwellings, and commercial and industrial buildings. According to the 1993 SNA, net acquisitions of valuables are also considered capital formation. Data are in current local currency.... |
| 10 | IC.FRM.OUTG.ZS | Value lost due to electrical outages (% of sales for affected firms) | Average losses due to electrical outages, as percentage of total annual sales. The value represents average losses for all firms which reported outages (please see indicator IC.ELC.OUTG.ZS).... |
| 11 | EP.PMP.SGAS.CD | Pump price for gasoline (US$ per liter) | Fuel prices refer to the pump prices of the most widely sold grade of gasoline. Prices have been converted from the local currency to U.S. dollars.... |
| 12 | EG.ELC.LOSS.ZS | Electric power transmission and distribution losses (% of output) | Electric power transmission and distribution losses include losses in transmission between sources of supply and points of distribution and in the distribution to consumers, including pilferage.... |
| 13 | IQ.SCI.PRDC | Periodicity and timeliness assessment of statistical capacity (scale 0 - 100) | The periodicity and timeliness indicator assesses the availability and periodicity of key socioeconomic indicators. It measures the extent to which data are made accessible to users through transformation of source data into timely statistical outputs. The periodicity score is calculated as the weighted average of 10 underlying indicator scores. The final periodicity score contributes 1/3 of the overall Statistical Capacity Indicator score.... |
| 14 | SE.SEC.CUAT.UP.FE.ZS | Educational attainment, at least completed upper secondary, population 25+, female (%) (cumulative) | The percentage of population ages 25 and over that attained or completed upper secondary education.... |
| 15 | IC.TAX.METG | Number of visits or required meetings with tax officials (average for affected firms) | Average number of visits or required meetings with tax officials during the year. The value represents the average number of visits for all firms which reported being visited or required to meet with tax officials (please see indicator IC.FRM.METG.ZS).... |
| 16 | SL.TLF.0714.WK.MA.TM | Average working hours of children, working only, male, ages 7-14 (hours per week) | Average working hours of children working only refers to the average weekly working hours of those children who are involved in economic activity and not attending school.... |
| 17 | SP.POP.TOTL.FE.ZS | Population, female (% of total population) | Female population is the percentage of the population that is female. Population is based on the de facto definition of population, which counts all residents regardless of legal status or citizenship.... |
| 18 | SP.DYN.TO65.MA.ZS | Survival to age 65, male (% of cohort) | Survival to age 65 refers to the percentage of a cohort of newborn infants that would survive to age 65, if subject to age specific mortality rates of the specified year.... |
| 19 | EG.ELC.PETR.ZS | Electricity production from oil sources (% of total) | Sources of electricity refer to the inputs used to generate electricity. Oil refers to crude oil and petroleum products.... |
| 20 | SE.ADT.1524.LT.FM.ZS | Literacy rate, youth (ages 15-24), gender parity index (GPI) | Gender parity index for youth literacy rate is the ratio of females to males ages 15-24 who can both read and write with understanding a short simple statement about their everyday life.... |
| 21 | CM.MKT.TRAD.GD.ZS | Stocks traded, total value (% of GDP) | The value of shares traded is the total number of shares traded, both domestic and foreign, multiplied by their respective matching prices. Figures are single counted (only one side of the transaction is considered). Companies admitted to listing and admitted to trading are included in the data. Data are end of year values.... |
| 22 | EN.CO2.MANF.ZS | CO2 emissions from manufacturing industries and construction (% of total fuel combustion) | CO2 emissions from manufacturing industries and construction contains the emissions from combustion of fuels in industry. The IPCC Source/Sink Category 1 A 2 includes these emissions. However, in the 1996 IPCC Guidelines, the IPCC category also includes emissions from industry autoproducers that generate electricity and/or heat. The IEA data are not collected in a way that allows the energy consumption to be split by specific end-use and therefore, autoproducers are shown as a separate item (Un... |
| 23 | SH.TBS.DTEC.ZS | Tuberculosis case detection rate (%, all forms) | Tuberculosis case detection rate (all forms) is the number of new and relapse tuberculosis cases notified to WHO in a given year, divided by WHO's estimate of the number of incident tuberculosis cases for the same year, expressed as a percentage. Estimates for all years are recalculated as new information becomes available and techniques are refined, so they may differ from those published previously.... |
| 24 | FX.OWN.TOTL.FE.ZS | Account ownership at a financial institution or with a mobile-money-service provider, female (% of population ages 15+) | Account denotes the percentage of respondents who report having an account (by themselves or together with someone else) at a bank or another type of financial institution or report personally using a mobile money service in the past 12 months (female, % age 15+).... |
| 25 | SE.SEC.UNER.LO.FE.ZS | Adolescents out of school, female (% of female lower secondary school age) | Adolescents out of school are the percentage of lower secondary school age adolescents who are not enrolled in school.... |
| 26 | FX.OWN.TOTL.YG.ZS | Account ownership at a financial institution or with a mobile-money-service provider, young adults (% of population ages 15-24) | Account denotes the percentage of respondents who report having an account (by themselves or together with someone else) at a bank or another type of financial institution or report personally using a mobile money service in the past 12 months (young adults, % of population ages 15-24).... |
| 27 | per_lm_alllm.ben_q1_tot | Benefit incidence of unemployment benefits and ALMP to poorest quintile (% of total U/ALMP benefits) | Benefit incidence of unemployment benefits and active labor market programs (ALMP) to poorest quintile shows the percentage of total unemployment and active labor market programs benefits received by the poorest 20% of the population. Unemployment benefits and active labor market programs include unemployment compensation, severance pay, and early retirement due to labor market reasons, labor market services (intermediation), training (vocational, life skills, and cash for training), job rotati... |
| 28 | HD.HCI.OVRL.FE | Human capital index (HCI), female (scale 0-1) | The HCI calculates the contributions of health and education to worker productivity. The final index score ranges from zero to one and measures the productivity as a future worker of child born today relative to the benchmark of full health and complete education.... |
| 29 | BX.GSR.CCIS.CD | ICT service exports (BoP, current US$) | Information and communication technology service exports include computer and communications services (telecommunications and postal and courier services) and information services (computer data and news-related service transactions). Data are in current U.S. dollars.... |
---
Here's a more complex query: Get the newest data for the youth literacy rate indicator (from 2010 or later), then show the 20 countries with the lowest rate.
```sql
with newest_data as (
select country_code, indicator_code, max(year) as year from wdi_data
where
indicator_code = (select series_code from wdi_series where indicator_name = 'Literacy rate, youth total (% of people ages 15-24)')
and year > 2010
group by country_code
)
select c.long_name as country, printf('%.1f %%', value) as "Youth Literacy Rate"
from wdi_data, newest_data
join wdi_country c on c.country_code = wdi_data.country_code
where wdi_data.indicator_code = newest_data.indicator_code and wdi_data.country_code = newest_data.country_code and wdi_data.year = newest_data.year
order by value asc limit 20
```
| | country | Youth Literacy Rate |
|---:|:----------------------------------------|:----------------------|
| 0 | Republic of Chad | 30.8 % |
| 1 | Central African Republic | 38.3 % |
| 2 | Republic of Niger | 43.5 % |
| 3 | Republic of Guinea | 46.3 % |
| 4 | Republic of South Sudan | 47.9 % |
| 5 | Republic of Mali | 50.1 % |
| 6 | Republic of Liberia | 55.4 % |
| 7 | Burkina Faso | 58.3 % |
| 8 | Republic of Côte d'Ivoire | 58.4 % |
| 9 | Republic of Guinea-Bissau | 60.4 % |
| 10 | Republic of Benin | 60.9 % |
| 11 | Islamic Republic of Mauritania | 63.9 % |
| 12 | Islamic State of Afghanistan | 65.4 % |
| 13 | Republic of Sierra Leone | 66.6 % |
| 14 | Republic of The Gambia | 67.2 % |
| 15 | Republic of Senegal | 69.5 % |
| 16 | Republic of Mozambique | 70.9 % |
| 17 | Federal Democratic Republic of Ethiopia | 72.8 % |
| 18 | Republic of Malawi | 72.9 % |
| 19 | Republic of the Sudan | 73.0 % |
# Table Definitions
**`select * from sqlite_master`**
| | type | name | tbl_name | rootpage | sql |
|---:|:-------|:---------------------------------------------|:-------------------|-----------:|:------------------------------------------------------------------------------------------------------------------|
| 0 | table | wdi_data | wdi_data | 2 | CREATE TABLE "wdi_data" ( |
| | | | | | "indicator_code" TEXT, |
| | | | | | "country_code" TEXT, |
| | | | | | "year" INTEGER, |
| | | | | | "value" REAL |
| | | | | | ) |
| 1 | table | wdi_country | wdi_country | 3 | CREATE TABLE "wdi_country" ( |
| | | | | | "country_code" TEXT, |
| | | | | | "short_name" TEXT, |
| | | | | | "table_name" TEXT, |
| | | | | | "long_name" TEXT, |
| | | | | | "2-alpha_code" TEXT, |
| | | | | | "currency_unit" TEXT, |
| | | | | | "special_notes" TEXT, |
| | | | | | "region" TEXT, |
| | | | | | "income_group" TEXT, |
| | | | | | "wb-2_code" TEXT, |
| | | | | | "national_accounts_base_year" TEXT, |
| | | | | | "national_accounts_reference_year" REAL, |
| | | | | | "sna_price_valuation" TEXT, |
| | | | | | "lending_category" TEXT, |
| | | | | | "other_groups" TEXT, |
| | | | | | "system_of_national_accounts" TEXT, |
| | | | | | "alternative_conversion_factor" TEXT, |
| | | | | | "balance_of_payments_manual_in_use" TEXT, |
| | | | | | "external_debt_reporting_status" TEXT, |
| | | | | | "system_of_trade" TEXT, |
| | | | | | "government_accounting_concept" TEXT, |
| | | | | | "imf_data_dissemination_standard" TEXT, |
| | | | | | "latest_population_census" TEXT, |
| | | | | | "latest_household_survey" TEXT, |
| | | | | | "source_of_most_recent_income_and_expenditure_data" TEXT, |
| | | | | | "vital_registration_complete" TEXT, |
| | | | | | "latest_agricultural_census" TEXT, |
| | | | | | "latest_industrial_data" REAL, |
| | | | | | "latest_trade_data" REAL |
| | | | | | ) |
| 2 | table | wdi_country_series | wdi_country_series | 4 | CREATE TABLE "wdi_country_series" ( |
| | | | | | "countrycode" TEXT, |
| | | | | | "seriescode" TEXT, |
| | | | | | "description" TEXT |
| | | | | | ) |
| 3 | table | wdi_footnote | wdi_footnote | 5 | CREATE TABLE "wdi_footnote" ( |
| | | | | | "countrycode" TEXT, |
| | | | | | "seriescode" TEXT, |
| | | | | | "year" TEXT, |
| | | | | | "description" TEXT |
| | | | | | ) |
| 4 | table | wdi_series | wdi_series | 6 | CREATE TABLE "wdi_series" ( |
| | | | | | "series_code" TEXT, |
| | | | | | "topic" TEXT, |
| | | | | | "indicator_name" TEXT, |
| | | | | | "short_definition" TEXT, |
| | | | | | "long_definition" TEXT, |
| | | | | | "unit_of_measure" TEXT, |
| | | | | | "periodicity" TEXT, |
| | | | | | "base_period" TEXT, |
| | | | | | "other_notes" TEXT, |
| | | | | | "aggregation_method" TEXT, |
| | | | | | "limitations_and_exceptions" TEXT, |
| | | | | | "notes_from_original_source" TEXT, |
| | | | | | "general_comments" TEXT, |
| | | | | | "source" TEXT, |
| | | | | | "statistical_concept_and_methodology" TEXT, |
| | | | | | "development_relevance" TEXT, |
| | | | | | "related_source_links" TEXT, |
| | | | | | "related_indicators" TEXT, |
| | | | | | "license_type" TEXT |
| | | | | | ) |
| 5 | table | wdi_series_time | wdi_series_time | 7 | CREATE TABLE "wdi_series_time" ( |
| | | | | | "seriescode" TEXT, |
| | | | | | "year" TEXT, |
| | | | | | "description" TEXT |
| | | | | | ) |
| 6 | index | ix_wdi_data_indicator_code_country_code_year | wdi_data | 8 | CREATE INDEX "ix_wdi_data_indicator_code_country_code_year"ON "wdi_data" ("indicator_code","country_code","year") |
| 212.907692 | 667 | 0.297131 | eng_Latn | 0.990045 |
247a3969a837f4f5003805c0f868219776ac7734 | 136 | md | Markdown | README.md | imohammedramadan/random-fun-stuff | d0b3e4920092caf028d100aed60955cff93240ac | [
"MIT"
] | null | null | null | README.md | imohammedramadan/random-fun-stuff | d0b3e4920092caf028d100aed60955cff93240ac | [
"MIT"
] | null | null | null | README.md | imohammedramadan/random-fun-stuff | d0b3e4920092caf028d100aed60955cff93240ac | [
"MIT"
] | null | null | null | # random-fun-stuff
Random scripts and stuff I am making for fun.
This is a collection of little projects and other stuff I am creating for fun.
| 27.2 | 70 | 0.786765 | eng_Latn | 0.999144 |
247a5cb273be1f8b25b863412752cf0f44209666 | 1,507 | md | Markdown | content/docs/datatypes.md | DivHub/pixtudio-website | 1e8470ea18dd42fc1f8c5011f98f94eaa06128c0 | [
"MIT"
] | 1 | 2021-07-14T05:37:54.000Z | 2021-07-14T05:37:54.000Z | content/docs/datatypes.md | DivHub/pixtudio-website | 1e8470ea18dd42fc1f8c5011f98f94eaa06128c0 | [
"MIT"
] | null | null | null | content/docs/datatypes.md | DivHub/pixtudio-website | 1e8470ea18dd42fc1f8c5011f98f94eaa06128c0 | [
"MIT"
] | null | null | null | Description
-----------
Datatypes give meaning to data and dictate how a variable acts and
reacts. Examples of datatypes are `[[int]]`s, `[[float]]`s and
`[[string]]`s. Special cases are voids, arrays, varspaces and structs.
User-made types can also be defined, by use of the
[Type](Type "wikilink") operator.
List
----
<DPL> category = datatypes mode = userformat columns = 1 listseparators
= ,\\n\* [%TITLE%](%PAGE% "wikilink"),, redirects = include ordermethod
= titlewithoutnamespace resultsfooter = \\n%PAGES% datatypes </DPL>
Example
-------
import "mod_draw"
import "mod_wm"
import "mod_key"
import "mod_map"
Type _point
int x;
int y;
End
Process Main()
Private
_point A,B;
Begin
// Init the points
A.x = 100;
A.y = 50;
B.x = 250;
B.y = 150;
// Setup drawing
drawing_map(0,0);
drawing_color(rgb(0,255,255));
// Draw a box
drw_box(A,B);
// Wait for key ESC or X button
Repeat
frame;
Until(key(_ESC)||exit_status)
End
// Draw a box using two points
Function int drw_box(_point A, _point B)
Begin
return draw_box(A.x,A.y,B.x,B.y);
End
Used in example: [drawing\_map](drawing_map "wikilink")(),
[drawing\_color](drawing_color "wikilink")(), [rgb](rgb "wikilink")(),
[key](key "wikilink")(), [draw\_box](draw_box "wikilink")(),
[exit\_status](exit_status "wikilink")
<Category:general>
| 22.492537 | 71 | 0.599867 | eng_Latn | 0.782632 |
247a9bbecb2672974011fbe6b31fcd3eabd79b0a | 280 | md | Markdown | README.md | Kaleidics/gatsby-blog-boilerplate-markdown-version | 1b4f3ec30f0d4166a239c0a4c8dc2770611190ae | [
"0BSD"
] | null | null | null | README.md | Kaleidics/gatsby-blog-boilerplate-markdown-version | 1b4f3ec30f0d4166a239c0a4c8dc2770611190ae | [
"0BSD"
] | null | null | null | README.md | Kaleidics/gatsby-blog-boilerplate-markdown-version | 1b4f3ec30f0d4166a239c0a4c8dc2770611190ae | [
"0BSD"
] | null | null | null | <p align="center">
<a href="https://www.gatsbyjs.org">
<img alt="Gatsby" src="https://www.gatsbyjs.org/monogram.svg" width="60" />
</a>
</p>
<h1 align="center">
Gatsby Blog Template <br/>
<span style="font-size: 28px;">Sourcing content from markdown files</span>
</h1> | 31.111111 | 79 | 0.65 | yue_Hant | 0.743034 |
247bc11a274d0d9a545845fe454f9a50c89e6822 | 578 | md | Markdown | docs/developer/get-started/ioctl-send-transfer.md | iotexproject/iotex-docs | 3b0724cdd2861982d8eb5f75ae143072c8938722 | [
"MIT"
] | 14 | 2018-12-11T20:32:29.000Z | 2021-11-08T20:15:20.000Z | docs/developer/get-started/ioctl-send-transfer.md | iotexproject/iotex-docs | 3b0724cdd2861982d8eb5f75ae143072c8938722 | [
"MIT"
] | 55 | 2018-11-30T22:46:37.000Z | 2021-12-22T11:04:00.000Z | docs/developer/get-started/ioctl-send-transfer.md | iotexproject/iotex-docs | 3b0724cdd2861982d8eb5f75ae143072c8938722 | [
"MIT"
] | 21 | 2018-11-21T21:55:12.000Z | 2021-11-05T22:05:02.000Z | ---
title: Send a Token Transfer
---
# Send a Token Transfer action
The Transfer action allows you to send IOTX tokens from one IoTeX account (the _sender_) to another IoTeX account (the _recipient_).
Here is the command to send **10 IOTX** from our **dev-acc** account to the IoTeX account with address **io1mflp9m6hcgm2qcghchsdqj3z3eccrnekx9p0ms**
```
ioctl action transfer io1mflp9m6hcgm2qcghchsdqj3z3eccrnekx9p0ms 10 -s dev-acc
```
The command will ask for the password of our dev-acc account, then it will send the action to the gateway and provide the hash of the action.
| 36.125 | 148 | 0.776817 | eng_Latn | 0.985008 |
247c8114c78f66a3a3ce4e5288e9de2fef1a54ac | 18,970 | md | Markdown | archives/2021-01-17.md | erbanku/weibo-hot-hub | f0b9e1a535dd022b5b69b15e176f08b6bd4f222b | [
"MIT"
] | 12 | 2021-02-07T10:15:34.000Z | 2022-02-13T19:27:00.000Z | archives/2021-01-17.md | SnailDev/weibo-hot-hub | 9c79901c81ed9d3401cff216cb5124bb2de937f9 | [
"MIT"
] | null | null | null | archives/2021-01-17.md | SnailDev/weibo-hot-hub | 9c79901c81ed9d3401cff216cb5124bb2de937f9 | [
"MIT"
] | 6 | 2021-06-02T04:05:45.000Z | 2022-02-20T15:09:37.000Z | # Weibo Hot List
`Last updated: 2021-01-17 23:15:25 +0800`
## Trending Searches
1. [They give the fight against the epidemic its confidence](https://s.weibo.com/weibo?q=%23%E6%98%AF%E4%BB%96%E4%BB%AC%E8%AE%A9%E6%88%98%E7%96%AB%E6%9B%B4%E6%9C%89%E5%BA%95%E6%B0%94%23&Refer=new_time)
1. [半藏森林 posts a long essay](https://s.weibo.com/weibo?q=%23%E5%8D%8A%E8%97%8F%E6%A3%AE%E6%9E%97%E5%8F%91%E9%95%BF%E6%96%87%23&Refer=top)
1. [Changshou subdistrict of Xinle, Shijiazhuang raised to high risk](https://s.weibo.com/weibo?q=%23%E7%9F%B3%E5%AE%B6%E5%BA%84%E6%96%B0%E4%B9%90%E5%B8%82%E9%95%BF%E5%AF%BF%E8%A1%97%E9%81%93%E5%8D%87%E4%B8%BA%E9%AB%98%E9%A3%8E%E9%99%A9%23&Refer=top)
1. [Self-heating "so warm" sanitary pads](https://s.weibo.com/weibo?q=%23%E4%BC%9A%E5%8F%91%E7%83%AD%E7%9A%84%20%E5%A5%BD%E6%9A%96%E5%8D%AB%E7%94%9F%E5%B7%BE%23&topic_ad=1&Refer=top)
1. [Third-party agency falsely reported all-negative test results](https://s.weibo.com/weibo?q=%23%E7%AC%AC%E4%B8%89%E6%96%B9%E6%9C%BA%E6%9E%84%E8%B0%8E%E6%8A%A5%E5%85%A8%E5%91%98%E9%98%B4%E6%80%A7%E7%BB%93%E6%9E%9C%23&Refer=top)
1. [潘博文 (Pan Bowen)](https://s.weibo.com/weibo?q=%E6%BD%98%E5%8D%9A%E6%96%87&Refer=top)
1. [Jay Chou has put on weight](https://s.weibo.com/weibo?q=%23%E5%91%A8%E6%9D%B0%E4%BC%A6%E8%83%96%E4%BA%86%23&Refer=top)
1. [How much pressure are the post-80s and post-90s generations under](https://s.weibo.com/weibo?q=%2380%E5%90%8E90%E5%90%8E%E7%9A%84%E5%8E%8B%E5%8A%9B%E6%9C%89%E5%A4%9A%E5%A4%A7%23&Refer=top)
1. [Day Day Up (天天向上)](https://s.weibo.com/weibo?q=%E5%A4%A9%E5%A4%A9%E5%90%91%E4%B8%8A&Refer=top)
1. [Chang Chen lost 25 jin for The Soul (缉魂)](https://s.weibo.com/weibo?q=%23%E5%BC%A0%E9%9C%87%E4%B8%BA%E7%BC%89%E9%AD%82%E5%87%8F%E9%87%8D25%E6%96%A4%23&Refer=top)
1. [Tens of thousands protest against lockdown in Austria](https://s.weibo.com/weibo?q=%23%E5%A5%A5%E5%9C%B0%E5%88%A9%E7%88%86%E5%8F%91%E4%B8%87%E4%BA%BA%E5%8F%8D%E5%B0%81%E9%94%81%E6%8A%97%E8%AE%AE%23&Refer=top)
1. [The Rebel Princess (上阳赋)](https://s.weibo.com/weibo?q=%E4%B8%8A%E9%98%B3%E8%B5%8B&Refer=top)
1. [All-purpose reply scripts](https://s.weibo.com/weibo?q=%23%E4%B8%87%E8%83%BD%E5%9B%9E%E5%A4%8D%E8%AF%9D%E6%9C%AF%23&Refer=top)
1. [All residents of Longyao county, Hebei quarantined at home](https://s.weibo.com/weibo?q=%23%E6%B2%B3%E5%8C%97%E9%9A%86%E5%B0%A7%E5%85%A8%E5%8E%BF%E5%B1%85%E6%B0%91%E5%B1%85%E5%AE%B6%E9%9A%94%E7%A6%BB%23&Refer=top)
1. [阳光之下 (TV drama)](https://s.weibo.com/weibo?q=%E9%98%B3%E5%85%89%E4%B9%8B%E4%B8%8B&Refer=top)
1. [Beijing confirmed case traveled to Tianjin on business before quarantine](https://s.weibo.com/weibo?q=%23%E5%8C%97%E4%BA%AC%E7%A1%AE%E8%AF%8A%E7%97%85%E4%BE%8B%E9%9A%94%E7%A6%BB%E5%89%8D%E6%9B%BE%E5%89%8D%E5%BE%80%E5%A4%A9%E6%B4%A5%E5%87%BA%E5%B7%AE%23&Refer=top)
1. [Tianjin movements of an asymptomatic case from Harbin](https://s.weibo.com/weibo?q=%23%E5%93%88%E5%B0%94%E6%BB%A81%E6%97%A0%E7%97%87%E7%8A%B6%E5%A4%A9%E6%B4%A5%E6%B4%BB%E5%8A%A8%E8%BD%A8%E8%BF%B9%23&Refer=top)
1. [刘阳 (Liu Yang)](https://s.weibo.com/weibo?q=%E5%88%98%E9%98%B3&Refer=top)
1. [Tang Yan's drama from 15 years ago starts airing](https://s.weibo.com/weibo?q=%23%E5%94%90%E5%AB%A315%E5%B9%B4%E5%89%8D%E5%89%A7%E5%BC%80%E6%92%AD%23&Refer=top)
1. [ByteDance suspends its smartphone business](https://s.weibo.com/weibo?q=%23%E5%AD%97%E8%8A%82%E8%B7%B3%E5%8A%A8%E6%9A%82%E5%81%9C%E6%89%8B%E6%9C%BA%E4%B8%9A%E5%8A%A1%23&Refer=top)
1. [Nanjing raises home-purchase threshold for talent program](https://s.weibo.com/weibo?q=%E5%8D%97%E4%BA%AC%E6%8F%90%E9%AB%98%E4%BA%BA%E6%89%8D%E8%B4%AD%E6%88%BF%E9%97%A8%E6%A7%9B&Refer=top)
1. [阿沁 (A Qin)](https://s.weibo.com/weibo?q=%E9%98%BF%E6%B2%81&Refer=top)
1. [Girl taking wedding photos gets rammed by a sheep](https://s.weibo.com/weibo?q=%E5%A5%B3%E5%AD%A9%E6%8B%8D%E5%A9%9A%E7%BA%B1%E7%85%A7%E8%A2%AB%E7%BE%8A%E6%80%BC&Refer=top)
1. [Wei Daxun's fan letter reads "nothing much to say"](https://s.weibo.com/weibo?q=%23%E9%AD%8F%E5%A4%A7%E5%8B%8B%E7%B2%89%E4%B8%9D%E4%BF%A1%E5%86%99%E6%B2%A1%E4%BB%80%E4%B9%88%E6%83%B3%E8%AF%B4%E7%9A%84%23&Refer=top)
1. [WE iG](https://s.weibo.com/weibo?q=WE%20iG&Refer=top)
1. [Tianqi Lithium (天齐锂业)](https://s.weibo.com/weibo?q=%E5%A4%A9%E9%BD%90%E9%94%82%E4%B8%9A&Refer=top)
1. [Xiao Zhan: being henpecked is a sign of loving your wife](https://s.weibo.com/weibo?q=%E8%82%96%E6%88%98%20%E8%80%99%E8%80%B3%E6%9C%B5%E6%98%AF%E7%88%B1%E8%80%81%E5%A9%86%E7%9A%84%E8%A1%A8%E7%8E%B0&Refer=top)
1. [热依扎 (Rayzha Alimjan)](https://s.weibo.com/weibo?q=%E7%83%AD%E4%BE%9D%E6%89%8E&Refer=top)
1. [WE defeat iG](https://s.weibo.com/weibo?q=%23WE%E6%88%98%E8%83%9CiG%23&Refer=top)
1. [South China Sea troops drill hardcore English warnings to the enemy](https://s.weibo.com/weibo?q=%23%E5%8D%97%E6%B5%B7%E6%88%98%E5%A3%AB%E6%BC%94%E7%BB%83%E7%A1%AC%E6%A0%B8%E5%AF%B9%E6%95%8C%E8%8B%B1%E8%AF%AD%E5%96%8A%E8%AF%9D%23&Refer=top)
1. [Small protest rallies held in many parts of the US](https://s.weibo.com/weibo?q=%E7%BE%8E%E5%9B%BD%E5%A4%9A%E5%9C%B0%E5%8F%91%E7%94%9F%E5%B0%8F%E5%9E%8B%E6%8A%97%E8%AE%AE%E9%9B%86%E4%BC%9A%E6%B4%BB%E5%8A%A8&Refer=top)
1. [A showcase of baffling behavior at Haidilao](https://s.weibo.com/weibo?q=%23%E6%B5%B7%E5%BA%95%E6%8D%9E%E8%BF%B7%E6%83%91%E8%A1%8C%E4%B8%BA%E5%A4%A7%E8%B5%8F%23&Refer=top)
1. [The six quarantines of a Shijiazhuang native drifting in Beijing](https://s.weibo.com/weibo?q=%23%E4%B8%80%E4%BD%8D%E7%9F%B3%E5%AE%B6%E5%BA%84%E5%8C%97%E6%BC%82%E7%9A%84%E5%85%AD%E6%AC%A1%E9%9A%94%E7%A6%BB%E7%94%9F%E6%B4%BB%23&Refer=top)
1. [Chow Tai Fook customer service responds to page pricing error](https://s.weibo.com/weibo?q=%E5%91%A8%E5%A4%A7%E7%A6%8F%E5%AE%A2%E6%9C%8D%E5%9B%9E%E5%BA%94%E9%A1%B5%E9%9D%A2%E5%94%AE%E4%BB%B7%E9%94%99%E8%AF%AF&Refer=top)
1. [If you woke up ten years in the past](https://s.weibo.com/weibo?q=%23%E5%A6%82%E6%9E%9C%E4%B8%80%E8%A7%89%E9%86%92%E6%9D%A5%E5%9B%9E%E5%88%B0%E4%BA%86%E5%8D%81%E5%B9%B4%E5%89%8D%23&Refer=top)
1. [Dutch PM cycles to the king to hand in his resignation](https://s.weibo.com/weibo?q=%E8%8D%B7%E5%85%B0%E9%A6%96%E7%9B%B8%E9%AA%91%E8%87%AA%E8%A1%8C%E8%BD%A6%E5%90%91%E5%9B%BD%E7%8E%8B%E8%BE%9E%E8%81%8C&Refer=top)
1. [紧急公关 (TV drama)](https://s.weibo.com/weibo?q=%E7%B4%A7%E6%80%A5%E5%85%AC%E5%85%B3&Refer=top)
1. [Harbin adds 2 new asymptomatic cases](https://s.weibo.com/weibo?q=%23%E5%93%88%E5%B0%94%E6%BB%A8%E6%96%B0%E5%A2%9E2%E4%BE%8B%E6%97%A0%E7%97%87%E7%8A%B6%23&Refer=top)
1. [Why you can only date after you grow up](https://s.weibo.com/weibo?q=%23%E9%95%BF%E5%A4%A7%E4%BA%86%E6%89%8D%E8%83%BD%E8%B0%88%E6%81%8B%E7%88%B1%E7%9A%84%E5%8E%9F%E5%9B%A0%23&Refer=top)
1. [Mr. Queen (哲仁王后)](https://s.weibo.com/weibo?q=%E5%93%B2%E4%BB%81%E7%8E%8B%E5%90%8E&Refer=top)
1. [Liu Yifei posts a selfie](https://s.weibo.com/weibo?q=%23%E5%88%98%E4%BA%A6%E8%8F%B2%E6%99%92%E8%87%AA%E6%8B%8D%23&Refer=top)
1. [Little-known facts about nearsightedness](https://s.weibo.com/weibo?q=%23%E5%85%B3%E4%BA%8E%E8%BF%91%E8%A7%86%E7%9C%BC%E7%9A%84%E5%86%B7%E7%9F%A5%E8%AF%86%23&Refer=top)
1. [Fan Chengcheng says Qin Lie looks like a Calabash Brother](https://s.weibo.com/weibo?q=%23%E8%8C%83%E4%B8%9E%E4%B8%9E%E8%AF%B4%E7%A7%A6%E7%83%88%E5%83%8F%E8%91%AB%E8%8A%A6%E5%A8%83%23&Refer=top)
1. [iG's coach](https://s.weibo.com/weibo?q=iG%E6%95%99%E7%BB%83&Refer=top)
1. [Hamzy dropped by her agency](https://s.weibo.com/weibo?q=Hamzy%E8%A2%AB%E8%A7%A3%E7%BA%A6&Refer=top)
1. [Hero advance to the Winter Champion Cup grand final](https://s.weibo.com/weibo?q=%23Hero%E6%99%8B%E7%BA%A7%E5%86%AC%E5%86%A0%E6%80%BB%E5%86%B3%E8%B5%9B%23&Refer=top)
1. [China Golden Ball Award (中国金球奖)](https://s.weibo.com/weibo?q=%23%E4%B8%AD%E5%9B%BD%E9%87%91%E7%90%83%E5%A5%96%23&Refer=top)
1. [Key suspects at company in big-head baby case summoned](https://s.weibo.com/weibo?q=%23%E5%A4%A7%E5%A4%B4%E5%A8%83%E5%A8%83%E6%B6%89%E4%BA%8B%E4%BC%81%E4%B8%9A%E4%B8%BB%E8%A6%81%E6%B6%89%E6%A1%88%E4%BA%BA%E5%91%98%E8%A2%AB%E4%BC%A0%E5%94%A4%23&Refer=top)
1. [奇妙之城 (show)](https://s.weibo.com/weibo?q=%23%E5%A5%87%E5%A6%99%E4%B9%8B%E5%9F%8E%23&Refer=top)
1. [Wang Yibo longs for a glorious stage](https://s.weibo.com/weibo?q=%23%E7%8E%8B%E4%B8%80%E5%8D%9A%E6%B8%B4%E6%9C%9B%E5%85%89%E8%8D%A3%E8%88%9E%E5%8F%B0%23&Refer=top)
1. [The benefits solitude has brought me](https://s.weibo.com/weibo?q=%23%E7%8B%AC%E5%A4%84%E7%BB%99%E6%88%91%E5%B8%A6%E6%9D%A5%E7%9A%84%E5%A5%BD%E5%A4%84%23&Refer=top)
## Trending Topics
1. [#How much pressure are the post-80s and post-90s generations under#](https://s.weibo.com/weibo?q=%2380%E5%90%8E90%E5%90%8E%E7%9A%84%E5%8E%8B%E5%8A%9B%E6%9C%89%E5%A4%9A%E5%A4%A7%23)
	- #How much pressure are the post-80s and post-90s generations under# Someone asked me how to view some post-80s and post-90s not having children: is it that they don't want them?
	- 53,000 discussions, 300 million reads
1. [#How does personality change after recovering from depression#](https://s.weibo.com/weibo?q=%23%E6%8A%91%E9%83%81%E7%97%87%E7%97%8A%E6%84%88%E5%90%8E%E6%80%A7%E6%A0%BC%E4%BC%9A%E5%8F%91%E7%94%9F%E4%BB%80%E4%B9%88%E5%8F%98%E5%8C%96%23)
	- Can depression be completely cured? After recovery, does one's personality change in any way? Come share your view.
	- 93,000 discussions, 560 million reads
1. [#Third-party agency falsely reported all-negative test results#](https://s.weibo.com/weibo?q=%23%E7%AC%AC%E4%B8%89%E6%96%B9%E6%9C%BA%E6%9E%84%E8%B0%8E%E6%8A%A5%E5%85%A8%E5%91%98%E9%98%B4%E6%80%A7%E7%BB%93%E6%9E%9C%23)
	- Rong Yang, standing committee member and publicity chief of Xingtai, said that according to Longyao county, testing capacity was limited, so sample processing was entrusted to a unit in Jinan. On the 14th the company falsely reported all samples negative to the health authorities; on the morning of the 16th it then reported positive samples.
	- 11,000 discussions, 190 million reads
1. [#Officials report on Ele.me rider's self-immolation incident#](https://s.weibo.com/weibo?q=%23%E5%AE%98%E6%96%B9%E9%80%9A%E6%8A%A5%E9%A5%BF%E4%BA%86%E4%B9%88%E9%AA%91%E6%89%8B%E7%82%B9%E7%81%AB%E8%87%AA%E4%BC%A4%E4%BA%8B%E4%BB%B6%23)
	- According to "微海陵", the press office of Hailing district, Taizhou, Jiangsu issued a notice on the 16th: at 10:57 a.m. on January 11, 2021, a self-inflicted burning incident occurred at the east gate of Shidai Garden in Hailing district, leaving one person burned; the injured person is being treated in hospital and vital signs are currently stable.
	- 5,414 discussions, 270 million reads
1. [#Should you reply to work messages after hours#](https://s.weibo.com/weibo?q=%23%E4%B8%8B%E7%8F%AD%E5%90%8E%E7%9A%84%E5%B7%A5%E4%BD%9C%E6%B6%88%E6%81%AF%E8%A6%81%E4%B8%8D%E8%A6%81%E5%9B%9E%23)
	- Work messages after you get off work: should you reply?
From 奇葩说 (I Can I BB) season 7, qualifying round 4, the debate topic is out!
See you tonight at 8!
	- 36,000 discussions, 400 million reads
1. [#Esports dissuasion service successfully talks 90% of teenagers out of going pro#](https://s.weibo.com/weibo?q=%23%E7%94%B5%E7%AB%9E%E5%8A%9D%E9%80%80%E4%B8%9A%E5%8A%A1%E6%88%90%E5%8A%9F%E5%8A%9D%E9%80%80%E4%B9%9D%E6%88%90%E9%9D%92%E5%B0%91%E5%B9%B4%23)
	- An esports academy in Chengdu offers an "esports dissuasion" service; the person in charge believes the service can "correctly guide teenagers' values about esports".
	- 12,000 discussions, 290 million reads
1. [#How hard it is to lose your job in middle age#](https://s.weibo.com/weibo?q=%23%E4%B8%AD%E5%B9%B4%E5%A4%B1%E4%B8%9A%E7%9A%84%E5%A4%84%E5%A2%83%E6%9C%89%E5%A4%9A%E8%89%B0%E9%9A%BE%23)
	- Mortgage, car loan, wife and kids to support: the midlife crisis shown in 紧急公关 is all too real! Use the hashtag to share your struggles
	- 5,549 discussions, 22.9 million reads
1. [#The rise of virtual lovers#](https://s.weibo.com/weibo?q=%23%E8%99%9A%E6%8B%9F%E6%81%8B%E4%BA%BA%E5%85%B4%E8%B5%B7%23)
	- Romance games, "paid chat companions", "virtual lovers" and other online emotional products are booming, offering a "shortcut" to intimacy: young people who cannot find intimate relationships in real life pay a small monthly sum for the companionship and comfort of strangers on the internet.
	- 5,094 discussions, 100 million reads
1. [#A Xi'an hotpot restaurant gives hiring priority to the hearing-impaired#](https://s.weibo.com/weibo?q=%23%E8%A5%BF%E5%AE%89%E4%B8%80%E7%81%AB%E9%94%85%E5%BA%97%E4%BC%98%E5%85%88%E5%BD%95%E5%8F%96%E5%90%AC%E9%9A%9C%E8%80%85%23)
	- In Xi'an, Shaanxi, a hotpot restaurant that gives jobs to hearing-impaired people first has been rated the "quietest" heart-warming restaurant by customers. The person in charge says the restaurant currently has 7 hearing-impaired servers, paid the same as other staff; one hearing-impaired employee performed so well that they became a shareholder in their second year.
	- 6,526 discussions, 110 million reads
1. [#Should adults hide their breakdowns#](https://s.weibo.com/weibo?q=%23%E6%88%90%E5%B9%B4%E4%BA%BA%E7%9A%84%E5%B4%A9%E6%BA%83%E8%A6%81%E4%B8%8D%E8%A6%81%E8%97%8F%E8%B5%B7%E6%9D%A5%23)
	- Should adults conceal their breakdowns?
Thursday at 8 p.m., don't miss it!
	- 192,000 discussions, 880 million reads
1. [#The limits of standing up for justice#](https://s.weibo.com/weibo?q=%23%E8%A7%81%E4%B9%89%E5%8B%87%E4%B8%BA%E7%9A%84%E5%B0%BA%E5%BA%A6%23)
	- In any case, the law must not yield to lawlessness; that is precisely the judicial spirit of the justifiable-defense system. We should trust the law and trust justice, and trust that when we cry out at injustice on the road, the law will back us up.
	- 2,792 discussions, 110 million reads
1. [#Why young people prefer living alone#](https://s.weibo.com/weibo?q=%23%E5%B9%B4%E8%BD%BB%E4%BA%BA%E6%9B%B4%E6%84%BF%E6%84%8F%E5%8D%95%E7%8B%AC%E7%94%9F%E6%B4%BB%E7%9A%84%E5%8E%9F%E5%9B%A0%23)
	- Do you live alone? Why are more and more people choosing to live by themselves?
	- 46,000 discussions, 510 million reads
1. [#半藏森林 lands a Tencent ad#](https://s.weibo.com/weibo?q=%23%E5%8D%8A%E8%97%8F%E6%A3%AE%E6%9E%97%E6%8E%A5%E5%88%B0%E4%BA%86%E8%85%BE%E8%AE%AF%E5%B9%BF%E5%91%8A%23)
	- No data yet
	- 141,000 discussions, 1.39 billion reads
1. [#Central bank again reiterates housing is for living in, not speculation#](https://s.weibo.com/weibo?q=%23%E5%A4%AE%E8%A1%8C%E5%86%8D%E6%AC%A1%E9%87%8D%E7%94%B3%E6%88%BF%E4%BD%8F%E4%B8%8D%E7%82%92%23)
	- At the State Council Information Office briefing on the 15th, Zou Lan, head of the PBOC's financial markets department, said that in 2020 real-estate loan growth fell below overall loan growth for the first time in 8 years, with new real-estate loans falling from 44.8% of all loans in 2016 to 28% last year. Next, the central bank will stick to the "housing is for living in, not speculation" positioning in real-estate financial regulation.
	- 7,850 discussions, 190 million reads
1. [#Zhangzhou reports on the baby-cream big-head baby incident#](https://s.weibo.com/weibo?q=%23%E6%BC%B3%E5%B7%9E%E9%80%9A%E6%8A%A5%E5%A9%B4%E5%84%BF%E9%9C%9C%E5%A4%A7%E5%A4%B4%E5%A8%83%E5%A8%83%E4%BA%8B%E4%BB%B6%23)
	- Testing by a qualified third-party agency has confirmed that the recalled products, the 益芙灵 multi-effect special-care antibacterial cream and the 开心森林 Yimoshu baby skin antibacterial cream, contain clobetasol propionate; the company is suspected of producing and selling substandard products.
	- 5,012 discussions, 120 million reads
1. [#Tens of thousands protest against lockdown in Austria#](https://s.weibo.com/weibo?q=%23%E5%A5%A5%E5%9C%B0%E5%88%A9%E7%88%86%E5%8F%91%E4%B8%87%E4%BA%BA%E5%8F%8D%E5%B0%81%E9%94%81%E6%8A%97%E8%AE%AE%23)
	- On Saturday local time (the 16th), thousands upon thousands of people poured into the streets of Vienna, the Austrian capital, to protest the lockdown restrictions imposed to curb COVID-19. At the march, most people wore no masks and were even kissing and hugging one another.
	- 8,449 discussions, 280 million reads
1. [#The most unbearable behavior in public places#](https://s.weibo.com/weibo?q=%23%E5%85%AC%E5%85%B1%E5%9C%BA%E5%90%88%E6%9C%80%E9%9A%BE%E4%BB%A5%E5%BF%8D%E5%8F%97%E7%9A%84%E8%A1%8C%E4%B8%BA%23)
	- #The most unbearable behavior in public places# #Sisters' tea party# In public places you inevitably run into unbearable behavior; playing phone audio out loud on the subway, spitting, smoking and so on all make people uncomfortable. What's your take?
	- 5,843 discussions, 54.2 million reads
1. [#The most comfortable kind of friendship#](https://s.weibo.com/weibo?q=%23%E6%9C%8B%E5%8F%8B%E4%B9%8B%E9%97%B4%E6%9C%80%E8%88%92%E6%9C%8D%E7%9A%84%E5%85%B3%E7%B3%BB%23)
	- Use the hashtag and tell us: what does the most comfortable relationship between friends look like....
	- 13,000 discussions, 71.2 million reads
1. [#Hardcore Henan village chief urges staying put for New Year over the loudspeaker#](https://s.weibo.com/weibo?q=%23%E6%B2%B3%E5%8D%97%E7%A1%AC%E6%A0%B8%E6%9D%91%E9%95%BF%E5%A4%A7%E5%96%87%E5%8F%AD%E5%8A%9D%E5%B0%B1%E5%9C%B0%E8%BF%87%E5%B9%B4%23)
	- With the epidemic at hand, a hardcore village chief in Henan advocates "celebrating the New Year where you are" over the loudspeaker, saying distance does not separate hearts and family affection is undiminished.
	- 8,974 discussions, 89.1 million reads
1. [#What should Secretary Lu do#](https://s.weibo.com/weibo?q=%23%E5%8D%A2%E4%B9%A6%E8%AE%B0%E5%BA%94%E8%AF%A5%E6%80%8E%E4%B9%88%E5%8A%9E%23)
	- No data yet
	- 35,000 discussions, 630 million reads
1. [#广美教授回应被指抄袭米菲兔#](https://s.weibo.com/weibo?q=%23%E5%B9%BF%E7%BE%8E%E6%95%99%E6%8E%88%E5%9B%9E%E5%BA%94%E8%A2%AB%E6%8C%87%E6%8A%84%E8%A2%AD%E7%B1%B3%E8%8F%B2%E5%85%94%23)
- 暂无数据
- 5140讨论 1.4亿阅读
1. [#恐惧焦虑下该如何缓解情绪#](https://s.weibo.com/weibo?q=%23%E6%81%90%E6%83%A7%E7%84%A6%E8%99%91%E4%B8%8B%E8%AF%A5%E5%A6%82%E4%BD%95%E7%BC%93%E8%A7%A3%E6%83%85%E7%BB%AA%23)
- 《阳光之下》柯滢面对被坏人欺辱、被封潇声薅头发掐脖子威胁、男友被断指等恐惧,她勇敢面对,采用心理咨询、跑步、大喊、画画等方式来缓解自我情绪。如果是你,在恐惧焦虑下会如何缓解呢?
- 3.2万讨论 2.8亿阅读
1. [#离异家庭会影响子女婚姻吗#](https://s.weibo.com/weibo?q=%23%E7%A6%BB%E5%BC%82%E5%AE%B6%E5%BA%AD%E4%BC%9A%E5%BD%B1%E5%93%8D%E5%AD%90%E5%A5%B3%E5%A9%9A%E5%A7%BB%E5%90%97%23)
- 《了不起的女孩》中沈思怡@金晨 因为小时候父母离异的痛苦经历,长大后变得不相信爱情,不期待婚姻,你认为离异家庭会影响子女婚姻吗?
- 1.6万讨论 3.3亿阅读
1. [#全球变暖北半球为何寒潮频发#](https://s.weibo.com/weibo?q=%23%E5%85%A8%E7%90%83%E5%8F%98%E6%9A%96%E5%8C%97%E5%8D%8A%E7%90%83%E4%B8%BA%E4%BD%95%E5%AF%92%E6%BD%AE%E9%A2%91%E5%8F%91%23)
- 全球变暖北半球为何寒潮频发?看中国工程院专家解读!
- 1.9万讨论 2.6亿阅读
1. [#鼓励企业发放留岗红包过年礼包#](https://s.weibo.com/weibo?q=%23%E9%BC%93%E5%8A%B1%E4%BC%81%E4%B8%9A%E5%8F%91%E6%94%BE%E7%95%99%E5%B2%97%E7%BA%A2%E5%8C%85%E8%BF%87%E5%B9%B4%E7%A4%BC%E5%8C%85%23)
- 各地鼓励企业发放留岗红包过年礼包、安排文化娱乐活动,落实好工资、休假等待遇保障,餐饮商超、医疗卫生、治安消防等服务供应不断线,让务工人员留在就业地安心过春节。
- 4371讨论 1.9亿阅读
1. [#考公的年轻人为什么越来越多了#](https://s.weibo.com/weibo?q=%23%E8%80%83%E5%85%AC%E7%9A%84%E5%B9%B4%E8%BD%BB%E4%BA%BA%E4%B8%BA%E4%BB%80%E4%B9%88%E8%B6%8A%E6%9D%A5%E8%B6%8A%E5%A4%9A%E4%BA%86%23)
- 周末加班,不去;待遇和公务员差不多,不去……越来越多的年轻人开始考公,期待“上岸”,你如何看待这种选择?
- 2347讨论 1亿阅读
1. [#孙杨禁赛判决撤销原因官方公布#](https://s.weibo.com/weibo?q=%23%E5%AD%99%E6%9D%A8%E7%A6%81%E8%B5%9B%E5%88%A4%E5%86%B3%E6%92%A4%E9%94%80%E5%8E%9F%E5%9B%A0%E5%AE%98%E6%96%B9%E5%85%AC%E5%B8%83%23)
- 瑞士当地时间1月15日,瑞士联邦最高法院发布新闻公报,由于国际体育仲裁法庭仲裁小组中的一名仲裁员存在偏见和歧视,瑞士联邦最高法院决定撤销国际体育仲裁法庭对中国游泳运动员孙杨的8年禁赛处罚,国际体育仲裁法庭需要组建一个新的仲裁小组重新作出判决。
- 2.9万讨论 8亿阅读
1. [#车厘子咋在中国火起来的#](https://s.weibo.com/weibo?q=%23%E8%BD%A6%E5%8E%98%E5%AD%90%E5%92%8B%E5%9C%A8%E4%B8%AD%E5%9B%BD%E7%81%AB%E8%B5%B7%E6%9D%A5%E7%9A%84%23)
- 目前,全球车厘子产量主要集中在智利、新西兰和澳大利亚等地,其中智利每年大概有接近90%的车厘子销往中国。车厘子是近些年水果界当之无愧的“网红”,这得益于其上市时间以及生鲜电商的力推。
- 2.4万讨论 5.7亿阅读
1. [#为什么大部分人不喜欢团建#](https://s.weibo.com/weibo?q=%23%E4%B8%BA%E4%BB%80%E4%B9%88%E5%A4%A7%E9%83%A8%E5%88%86%E4%BA%BA%E4%B8%8D%E5%96%9C%E6%AC%A2%E5%9B%A2%E5%BB%BA%23)
- 几乎每一个公司都会搞团建,但为什么很多员工都不喜欢公司团建呢?来聊聊你的看法和经历吧。
- 2.6万讨论 3.9亿阅读
1. [#该不该加仓白酒股#](https://s.weibo.com/weibo?q=%23%E8%AF%A5%E4%B8%8D%E8%AF%A5%E5%8A%A0%E4%BB%93%E7%99%BD%E9%85%92%E8%82%A1%23)
- 今日午后白酒板块走弱,贵州茅台跌超3%,古井贡酒跌5%。白酒股是不是危险了?还能涨吗?该不该加仓白酒股?你怎么看?
- 4548讨论 2.2亿阅读
1. [#婴儿用抑菌霜后成大头娃娃#](https://s.weibo.com/weibo?q=%23%E5%A9%B4%E5%84%BF%E7%94%A8%E6%8A%91%E8%8F%8C%E9%9C%9C%E5%90%8E%E6%88%90%E5%A4%A7%E5%A4%B4%E5%A8%83%E5%A8%83%23)
- 1月7日,微博博主@老爸评测-魏老爸 曝光了一起疑似“大头娃娃”事件——有家长从市面上购买“嗳婴树”品牌的“益芙灵多效特护抑菌霜”,给5个月大的孩子使用后出现“大头娃娃”现象:发育迟缓、多毛、脸肿大等。
- 16.4万讨论 11.4亿阅读
1. [#哪些因素会影响家庭生育#](https://s.weibo.com/weibo?q=%23%E5%93%AA%E4%BA%9B%E5%9B%A0%E7%B4%A0%E4%BC%9A%E5%BD%B1%E5%93%8D%E5%AE%B6%E5%BA%AD%E7%94%9F%E8%82%B2%23)
- 部分网友认为育儿假太少影响了家庭生育计划,你认为#哪些因素会影响家庭生育#?
- 1700讨论 4951万阅读
1. [#吉林超级传播链1传81#](https://s.weibo.com/weibo?q=%23%E5%90%89%E6%9E%97%E8%B6%85%E7%BA%A7%E4%BC%A0%E6%92%AD%E9%93%BE1%E4%BC%A081%23)
- 据北京日报客户端消息,加上今天公主岭市和通化市通报新增的62例无症状感染者,此次超级传播事件已经导致3市81人感染,分布在通化市(61例)、公主岭市(19例)、松原市(1例)。
- 7060讨论 1.9亿阅读
1. [#中高风险地区回乡要集中隔离吗#](https://s.weibo.com/weibo?q=%23%E4%B8%AD%E9%AB%98%E9%A3%8E%E9%99%A9%E5%9C%B0%E5%8C%BA%E5%9B%9E%E4%B9%A1%E8%A6%81%E9%9B%86%E4%B8%AD%E9%9A%94%E7%A6%BB%E5%90%97%23)
- 最近,各地根据国内外疫情形势,及时动态调整疫情防控措施。“14+7”天集中隔离观察是指什么? #中高风险地区回乡要集中隔离吗# ? #低风险地区回乡要做核酸检测吗# ?31个省区市最新政策汇总供参考↓↓转需!
- 1.5万讨论 2亿阅读
1. [#年轻人如何远离猝死#](https://s.weibo.com/weibo?q=%23%E5%B9%B4%E8%BD%BB%E4%BA%BA%E5%A6%82%E4%BD%95%E8%BF%9C%E7%A6%BB%E7%8C%9D%E6%AD%BB%23)
- 《我国5516例尸解猝死病例流行特征分析》显示,影响猝死的主要因素是情绪激动(25.66%)和劳累(24.53%)。此外,临床发现很多猝死的人,都曾经历过高强度工作状态。熬夜的危害更为明显。
- 1.8万讨论 4亿阅读
1. [#上海拟建议设结婚冷静期#](https://s.weibo.com/weibo?q=%23%E4%B8%8A%E6%B5%B7%E6%8B%9F%E5%BB%BA%E8%AE%AE%E8%AE%BE%E7%BB%93%E5%A9%9A%E5%86%B7%E9%9D%99%E6%9C%9F%23)
- 上海市政协委员:倡导婚前医学检查,设结婚冷静期保障知情权
- 4.4万讨论 2.5亿阅读
1. [#栖霞金矿事故#](https://s.weibo.com/weibo?q=%23%E6%A0%96%E9%9C%9E%E9%87%91%E7%9F%BF%E4%BA%8B%E6%95%85%23)
- 暂无数据
- 2005讨论 1451.8万阅读
1. [#拿命换钱是当今职场常态吗#](https://s.weibo.com/weibo?q=%23%E6%8B%BF%E5%91%BD%E6%8D%A2%E9%92%B1%E6%98%AF%E5%BD%93%E4%BB%8A%E8%81%8C%E5%9C%BA%E5%B8%B8%E6%80%81%E5%90%97%23)
- 某互联网大厂员工加班猝死 公司回应:底层的人民,哪一个不是用命换钱?
- 3.4万讨论 5.2亿阅读
1. [#你是讨好型人格么#](https://s.weibo.com/weibo?q=%23%E4%BD%A0%E6%98%AF%E8%AE%A8%E5%A5%BD%E5%9E%8B%E4%BA%BA%E6%A0%BC%E4%B9%88%23)
- 国内首部女性独白剧《听见她说》收官之作《完美女孩》,由赵薇监制/导演,沈奇岚编剧,杨幂主演,以未来感的科幻题材,以机器人小爱的诞生,探索女性被物化的现实议题。在小爱身上,能看到很多讨好型人格的人群的影子。你有讨好型人格么?1月5日中午12点,腾讯视频《完美女孩》反思标签与物化。
- 6.1万讨论 5.6亿阅读
1. [#九大最新疫情谣言#](https://s.weibo.com/weibo?q=%23%E4%B9%9D%E5%A4%A7%E6%9C%80%E6%96%B0%E7%96%AB%E6%83%85%E8%B0%A3%E8%A8%80%23)
- 99.9%的新冠病毒十分钟被茶水杀灭、新冠疫苗接种开放网站预约、2021年春节快递提前停运、春运直接被“取消”……年关将至,又有许多关于新冠疫情谣言在网上流传。九大最新谣言背后的真相↓↓
- 5769讨论 5275.3万阅读
1. [#大头娃娃事件婴儿霜产品含激素#](https://s.weibo.com/weibo?q=%23%E5%A4%A7%E5%A4%B4%E5%A8%83%E5%A8%83%E4%BA%8B%E4%BB%B6%E5%A9%B4%E5%84%BF%E9%9C%9C%E4%BA%A7%E5%93%81%E5%90%AB%E6%BF%80%E7%B4%A0%23)
- 1月17日,漳州市“欧艾抑菌霜”事件处置工作组通报“欧艾抑菌霜”事件调查处置进展情况。经有检测资质的第三方机构检测,已确认召回的涉事产品“益芙灵多效特护抑菌霜”和“开心森林一抹舒宝宝皮肤抑菌霜”含有氯倍他索丙酸酯,企业涉嫌生产、销售伪劣产品。
- 889讨论 812.8万阅读
1. [#南京提高人才购房门槛#](https://s.weibo.com/weibo?q=%23%E5%8D%97%E4%BA%AC%E6%8F%90%E9%AB%98%E4%BA%BA%E6%89%8D%E8%B4%AD%E6%88%BF%E9%97%A8%E6%A7%9B%23)
- 近日,南京发布了《南京市人才购买商品房住房办法》(以下简称“《办法》”),宣布收紧人才购房的政策。
- 856讨论 1552.2万阅读
1. [#Hamzy道歉#](https://s.weibo.com/weibo?q=%23Hamzy%E9%81%93%E6%AD%89%23)
- 暂无数据
- 7.6万讨论 6.4亿阅读
1. [#如何看待互联网加班文化#](https://s.weibo.com/weibo?q=%23%E5%A6%82%E4%BD%95%E7%9C%8B%E5%BE%85%E4%BA%92%E8%81%94%E7%BD%91%E5%8A%A0%E7%8F%AD%E6%96%87%E5%8C%96%23)
- 所谓“996”,是一种近几年流行于互联网行业的工作制度/文化。即每天早上9点到岗,一直工作到晚上9点,每周工作6天。你如何看到互联网加班文化?
- 1.2万讨论 2亿阅读
1. [#南海战士演练硬核对敌英语喊话#](https://s.weibo.com/weibo?q=%23%E5%8D%97%E6%B5%B7%E6%88%98%E5%A3%AB%E6%BC%94%E7%BB%83%E7%A1%AC%E6%A0%B8%E5%AF%B9%E6%95%8C%E8%8B%B1%E8%AF%AD%E5%96%8A%E8%AF%9D%23)
- 近日,西沙某岛礁,红蓝实兵对抗演习,官兵们演练对“敌”英语喊话。战士们说,要“确保在与强‘敌’较量中,准确展示我方立场、传递我方声音!”
- 875讨论 2953.2万阅读
1. [#如何治理特权现象#](https://s.weibo.com/weibo?q=%23%E5%A6%82%E4%BD%95%E6%B2%BB%E7%90%86%E7%89%B9%E6%9D%83%E7%8E%B0%E8%B1%A1%23)
- 近日,大连一街道办王主任打电话给卢书记要求防疫志愿者“简单登记”放行一事引起热议,一通电话看似小事,背后反映的确实就是“特权思想”。那么,该如何治理特权现象?
- 670讨论 2809.1万阅读
1. [#孩子多大适合接受性教育#](https://s.weibo.com/weibo?q=%23%E5%AD%A9%E5%AD%90%E5%A4%9A%E5%A4%A7%E9%80%82%E5%90%88%E6%8E%A5%E5%8F%97%E6%80%A7%E6%95%99%E8%82%B2%23)
- #姐妹们的茶话会# 在性教育中,孩子有权利了解真实信息,这并不是羞耻的事。你认为孩子多大适合接受性教育?
- 1.7万讨论 3.7亿阅读
1. [#伊朗展示自杀式无人机#](https://s.weibo.com/weibo?q=%23%E4%BC%8A%E6%9C%97%E5%B1%95%E7%A4%BA%E8%87%AA%E6%9D%80%E5%BC%8F%E6%97%A0%E4%BA%BA%E6%9C%BA%23)
- 暂无数据
- 1451讨论 1528.4万阅读
1. [#美国多地发生小型抗议集会活动#](https://s.weibo.com/weibo?q=%23%E7%BE%8E%E5%9B%BD%E5%A4%9A%E5%9C%B0%E5%8F%91%E7%94%9F%E5%B0%8F%E5%9E%8B%E6%8A%97%E8%AE%AE%E9%9B%86%E4%BC%9A%E6%B4%BB%E5%8A%A8%23)
- 暂无数据
- 519讨论 821.9万阅读
1. [#幽默的边界在哪#](https://s.weibo.com/weibo?q=%23%E5%B9%BD%E9%BB%98%E7%9A%84%E8%BE%B9%E7%95%8C%E5%9C%A8%E5%93%AA%23)
- 滥用幽默,会让广告创意本身事与愿违。对此,你认为一则广告幽默的边界在哪?
- 3094讨论 1亿阅读
| 88.64486 | 221 | 0.682393 | yue_Hant | 0.283337 |
247c9847d74bceeba2165ae4228b60e45138b2e5 | 404 | md | Markdown | Fake-Hackfleisch.md | gereonrisse/rezeptbuch | 010a5204b428c0e17328676a48c30dbd1f7aca32 | [
"MIT"
] | null | null | null | Fake-Hackfleisch.md | gereonrisse/rezeptbuch | 010a5204b428c0e17328676a48c30dbd1f7aca32 | [
"MIT"
] | null | null | null | Fake-Hackfleisch.md | gereonrisse/rezeptbuch | 010a5204b428c0e17328676a48c30dbd1f7aca32 | [
"MIT"
] | null | null | null | Duration: ???
### Ingredients:
- Fake minced meat
- Onions
- Garlic
- Tube of tomato paste
- Dark chocolate
### Preparation:
- Sear the onions and garlic over high heat, then take them out of the pan
- Roast the whole tube of tomato paste over high heat until it browns on the bottom, then take it out of the pan
- Sear the fake minced meat over high heat
- Put everything else back into the pan and mix
- Toss in a piece of dark chocolate
| 25.25 | 98 | 0.769802 | deu_Latn | 0.999023 |
247d0e8f06e4d57838ae0e8f41338a169beb7a6b | 178 | md | Markdown | src/readme.md | kuskmen/orleans-docs | b382a9f7fea144334f8913c069c7b8bb95563b0c | [
"MIT"
] | 16 | 2020-10-28T16:23:48.000Z | 2022-02-23T08:08:14.000Z | src/readme.md | kuskmen/orleans-docs | b382a9f7fea144334f8913c069c7b8bb95563b0c | [
"MIT"
] | 55 | 2020-12-16T22:47:12.000Z | 2022-03-20T22:01:00.000Z | src/readme.md | kuskmen/orleans-docs | b382a9f7fea144334f8913c069c7b8bb95563b0c | [
"MIT"
] | 56 | 2020-10-28T15:37:27.000Z | 2022-03-20T20:41:33.000Z | # Orleans documentation
The source for the documentation is here: https://github.com/dotnet/orleans-docs
The published documentation is here: https://dotnet.github.io/orleans/
| 29.666667 | 80 | 0.797753 | eng_Latn | 0.938955 |
247d521db560b14003a167e722dc97f0e66bd0ac | 628 | md | Markdown | README.md | QuantScape/javascript-mdn-2dgametutorial | 28628b86866a5fe20e7fd0c499622e3e73bbd3cf | [
"MIT"
] | null | null | null | README.md | QuantScape/javascript-mdn-2dgametutorial | 28628b86866a5fe20e7fd0c499622e3e73bbd3cf | [
"MIT"
] | null | null | null | README.md | QuantScape/javascript-mdn-2dgametutorial | 28628b86866a5fe20e7fd0c499622e3e73bbd3cf | [
"MIT"
] | null | null | null | # Pure JavaScript Implementation of Space Invaders
#### Status: Almost complete. There is a bug with the bases that causes a single hit to register as 10 hits.
#### Purpose: For fun.
##### This is an adaptation of the Mozilla Developer Network's pure JavaScript 2D game development tutorial.
##### How to Play
1. Clone or Download the repo
2. Navigate to the directory javascript/spaceinvaders/
3. Open the javascript/spaceinvaders/index.html in a browser (tested with Chrome V. 54.0)
###### Current Repo Screenshot

| 31.4 | 108 | 0.757962 | eng_Latn | 0.877017 |
247e32e59013a24fad83e4b2c452a5a6721cc289 | 1,043 | md | Markdown | site/pages/relevance/_reference/docs/dmi-system_slots.md | 3v1lW1th1n/developer.bigfix.com | 883f749c1b81b3401829e337de13c51702036ff8 | [
"Apache-2.0"
] | 20 | 2015-07-03T14:03:04.000Z | 2022-03-06T05:02:18.000Z | site/pages/relevance/_reference/docs/dmi-system_slots.md | Shivi-S/developer.bigfix.com | 80d5236b071477c3665db1238db9c2f45ba6c0ac | [
"Apache-2.0"
] | 83 | 2015-06-25T20:05:26.000Z | 2021-12-10T12:23:53.000Z | site/pages/relevance/_reference/docs/dmi-system_slots.md | Shivi-S/developer.bigfix.com | 80d5236b071477c3665db1238db9c2f45ba6c0ac | [
"Apache-2.0"
] | 25 | 2015-07-02T20:20:05.000Z | 2022-03-03T18:47:09.000Z | # type: dmi system_slots
No documentation exists.
# bus_number of <dmi system_slots> : integer
No documentation exists.
# current_usage of <dmi system_slots> : integer
No documentation exists.
# device_function_number of <dmi system_slots> : integer
No documentation exists.
# length of <dmi system_slots> : integer
No documentation exists.
# segment_group_number of <dmi system_slots> : integer
No documentation exists.
# slot_characteristics_1 of <dmi system_slots> : integer
No documentation exists.
# slot_characteristics_2 of <dmi system_slots> : integer
No documentation exists.
# slot_data_bus_width of <dmi system_slots> : integer
No documentation exists.
# slot_designation of <dmi system_slots> : string
No documentation exists.
# slot_id of <dmi system_slots> : integer
No documentation exists.
# slot_length of <dmi system_slots> : integer
No documentation exists.
# slot_type of <dmi system_slots> : integer
No documentation exists.
| 20.057692 | 62 | 0.770853 | eng_Latn | 0.717134 |
247faf7e486668f2b0cbdc0800de8f55f7b23806 | 842 | md | Markdown | _posts/2016-10-17-08-31-53.md | assobit/assobit.github.io | 607fc97aa8f7e17140af1cad2935e9bfc6e0e4ef | [
"Apache-2.0"
] | 11 | 2015-12-15T20:57:49.000Z | 2019-09-15T22:43:03.000Z | _posts/2016-10-17-08-31-53.md | assobit/assobit.github.io | 607fc97aa8f7e17140af1cad2935e9bfc6e0e4ef | [
"Apache-2.0"
] | 32 | 2015-12-18T08:49:20.000Z | 2016-07-21T09:25:08.000Z | _posts/2016-10-17-08-31-53.md | assobit/assobit.github.io | 607fc97aa8f7e17140af1cad2935e9bfc6e0e4ef | [
"Apache-2.0"
] | 23 | 2015-12-17T17:42:58.000Z | 2021-04-21T19:02:32.000Z | ---
layout: post
title: Activities and initiatives of the Association's members
img: http://www.assob.it/img/chain-257492_1280.jpg
bgimg: /img/post-bg.jpg
membro: Gabriele Domenichini
ruolo: President of AssoB.it
---
As president of the Association, I feel I must reiterate something that ought to be obvious:
<!-- more -->
Members of the Association are free to undertake initiatives or to make statements, even provocative ones,
in a personal capacity or as members of any other group.
**Such statements or initiatives are NOT an expression of the Association's ideas and must NOT be attributed to it.**
The Association's initiatives are prepared collectively, express the majority orientation of the
members, and in these we recognize ourselves, these we support, and for these we consider ourselves responsible.
| 42.1 | 126 | 0.811164 | ita_Latn | 0.999684 |
24827050055abdafcd298cffe37298f6ed79c912 | 136 | md | Markdown | Prepare/Export-ecdb9a3c-f4cf-4deb-b401-2ab894c1a6f4/SF Symbols Library 1c95183b9f974adabf30710c42dbeccb/House fcb38c577049433a812c05a90a0d2df7.md | leoMehlig/structured-symbols | 67402a033503e92556cf8039cb27a528ed31d2a9 | [
"MIT"
] | null | null | null | Prepare/Export-ecdb9a3c-f4cf-4deb-b401-2ab894c1a6f4/SF Symbols Library 1c95183b9f974adabf30710c42dbeccb/House fcb38c577049433a812c05a90a0d2df7.md | leoMehlig/structured-symbols | 67402a033503e92556cf8039cb27a528ed31d2a9 | [
"MIT"
] | null | null | null | Prepare/Export-ecdb9a3c-f4cf-4deb-b401-2ab894c1a6f4/SF Symbols Library 1c95183b9f974adabf30710c42dbeccb/House fcb38c577049433a812c05a90a0d2df7.md | leoMehlig/structured-symbols | 67402a033503e92556cf8039cb27a528ed31d2a9 | [
"MIT"
] | null | null | null | # House
Category: Home
Code: house.fill
Features: Color, Hierarchical, Weights
Source: System
Status: Released
Symbol: house.fill2x.png | 17 | 38 | 0.794118 | eng_Latn | 0.394096 |
2482886124fe834f25d16593e9367f72de681093 | 2,629 | md | Markdown | ideas/String.md | gsohler/curv | 6762dcc53a36de8e337ad152ec0efaa0a2df94df | [
"Apache-2.0"
] | 921 | 2019-01-13T18:47:47.000Z | 2022-03-28T03:36:18.000Z | ideas/String.md | gsohler/curv | 6762dcc53a36de8e337ad152ec0efaa0a2df94df | [
"Apache-2.0"
] | 93 | 2019-01-11T15:35:01.000Z | 2022-01-14T17:42:05.000Z | ideas/String.md | gsohler/curv | 6762dcc53a36de8e337ad152ec0efaa0a2df94df | [
"Apache-2.0"
] | 59 | 2019-01-20T09:37:59.000Z | 2022-02-17T15:12:10.000Z | # String Values
## Unicode
Strings are represented internally as UTF-8.
There are no string operations that can create an invalid UTF-8 sequence
in a string.
In OpenSCAD, len(str) returns the number of code points,
and str[i] returns the i'th code point.
These have O(n) complexity. That is a "good enough" design.
It lets you introspect a string without creating corrupt UTF-8.
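For instance, counting code points in UTF-8 is a single O(n) scan; here is a minimal C++ sketch (illustrative only, not Curv's actual code):

```cpp
#include <cstddef>
#include <string>

// Count code points by counting non-continuation bytes (not 10xxxxxx).
std::size_t count_code_points(const std::string& s) {
    std::size_t n = 0;
    for (unsigned char c : s)
        if ((c & 0xC0) != 0x80) ++n;  // lead bytes only
    return n;
}
```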
Perl6 counts grapheme clusters, not code points.
This seems better, especially in the context of the text() primitive.
The utf8everywhere manifesto says that random access to character strings
is not very important for text processing, and that concerns about
character counting are overblown. There's not a single right way to do it,
and for some use cases, you need to count code units. Also, for UTF-8,
using a data structure that efficiently addresses individual code points
isn't important for most text processing. Eg, you can search for a substring byte-wise, without decoding.
The requirements of Curv are likely to be pretty specific, though?
I don't see a general need for text processing in Curv.
I can imagine people using it for very application specific (and thus
unpredictable) purposes, like maybe parsing strings in imported JSON data.
## List of Char
Curv would be "more elegant" (for some values of elegant) if strings
were lists of "characters" (aka code points). So "abc" == ['a','b','c']
and "abc"@0 == 'a' and ""==[] and concat works on strings and for iterates
over the characters in a string. A character (code point) is a new data type,
represented as an immediate Value.
This would give us powerful string processing using generic list operations.
It would also permit mixed lists of characters and non-characters.
I'd need an unambiguous way of printing such lists.
This adds extra complexity (and confusion, no other language works this way).
Defining a character literal to be a code point is potentially broken,
given that visually, grapheme clusters are characters, and are the unit
of text selection and cursor movement.
* You get an error if you try to put a grapheme cluster into a char literal
* What does a char literal for a combining character even look like?
It makes little sense to reverse a list of code points. Combining characters
are messed up. It makes more sense to reverse a list of grapheme clusters.
## Multiple Viewpoints
Maybe, a string is not a list. But, there are multiple ways of viewing
a string as a list of items: code points, grapheme clusters, and larger units.
There are operations for splitting a string into a list of substrings, then you
apply list operations, then you strcat the list back to a string.
| 47.8 | 79 | 0.781666 | eng_Latn | 0.99948 |
248307ece290ca8fb50627ce3bc061a6b82b104e | 165 | md | Markdown | _posts/2012-08-10-window8-key.md | ly95/blog | 90ba54c44df76ddbef6189ead97a7b17e736bd89 | [
"MIT"
] | null | null | null | _posts/2012-08-10-window8-key.md | ly95/blog | 90ba54c44df76ddbef6189ead97a7b17e736bd89 | [
"MIT"
] | null | null | null | _posts/2012-08-10-window8-key.md | ly95/blog | 90ba54c44df76ddbef6189ead97a7b17e736bd89 | [
"MIT"
] | null | null | null | ---
layout: post
title: Windows 8 Latest Preview Build Product Key
date: 2012-08-10 12:21:25
modified: 2012-08-10 12:21:25
categories: notes
tags: activation-key
---
TK8TP-9JN6P-7X7WW-RFFTV-B7QPF | 16.5 | 30 | 0.690909 | fra_Latn | 0.086133 |
24841856d1fdde672f8065eb05451a0858cf2430 | 2,115 | md | Markdown | content/blog/blog-15/index.md | brianpompey/blog-page | 27b7567c9bd14a0a3e452cc0d98df57b33ba3a18 | [
"RSA-MD"
] | null | null | null | content/blog/blog-15/index.md | brianpompey/blog-page | 27b7567c9bd14a0a3e452cc0d98df57b33ba3a18 | [
"RSA-MD"
] | null | null | null | content/blog/blog-15/index.md | brianpompey/blog-page | 27b7567c9bd14a0a3e452cc0d98df57b33ba3a18 | [
"RSA-MD"
] | null | null | null | ---
title: GCP Database Fundamentals
date: "2021-07-01T22:12:03.284Z"
---
The purpose of a database in the cloud is to provide organized and persistent storage for your data. There are various options to choose from when deciding on a database solution in GCP, and there are a few things that you must consider when making your decision. Among these factors are: Availability, Durability, RTO, RPO, and Consistency.
Availability and Durability
When considering availability and durability, we weigh two questions: Will I be able to access my data whenever I need it? (Availability) Will my data still exist after 10, 100, or 1000 years? (Durability) Both are quantified in "nines," and we want as precise a number as possible. For example, 99.99% (4 9s) is considered very good availability, and 99.999999999% (11 9s) is considered very good durability. A durability of "11 9s" means that if you store 10 million files, you can expect to lose about one file every 10,000 years. Durability is very important because we hate losing data.
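As a back-of-the-envelope sketch (assuming the durability figure d is quoted as an annual per-object survival probability), the expected number of objects lost over Y years is

$$E[\text{losses}] = N \times Y \times (1 - d)$$

With N = 10^7 objects, Y = 10^4 years, and eleven nines (1 - d = 10^-11), the expectation works out to about one lost object.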
RTO and RPO
RTO (Recovery Time Objective) and RPO (Recovery Point Objective) measure how well a database recovers from failure: RTO is the maximum acceptable downtime, and RPO is the maximum acceptable period of data loss. Achieving minimal RPO and RTO can be very expensive, so there is usually a tradeoff based on the criticality of the data.
Data Consistency
We want the data in multiple database instances to be updated together. There are three types of consistency: (1) Strong consistency: synchronous replication to all replicas, which will be slow if you have multiple replicas or standbys. (2) Eventual consistency: asynchronous replication with a small lag of a few seconds before a change is visible on all replicas; in the intermediate period, different replicas might return different values. This option is used when scalability is more important than data integrity. (3) Read-after-write consistency: inserts are immediately visible, but updates have eventual consistency.
| 132.1875 | 670 | 0.797636 | eng_Latn | 0.999779 |
248446bf4040f526f2206bafe25574d7a94c90fa | 4,341 | md | Markdown | README.md | StevenGin/bunnylol | 732e7c933bbc43079cc54630745d7914417e5a06 | [
"MIT"
] | 5 | 2021-01-25T09:04:09.000Z | 2022-01-28T15:54:44.000Z | README.md | rithik/bunnylol | 5b33c0f19301c995e5ae4c3f22cc35a135067177 | [
"MIT"
] | null | null | null | README.md | rithik/bunnylol | 5b33c0f19301c995e5ae4c3f22cc35a135067177 | [
"MIT"
] | 9 | 2021-01-25T09:17:25.000Z | 2022-02-27T00:56:16.000Z | # Rithik BunnyLOL
While I was at Facebook, I heavily used the internally developed tool `bunnylol`. I found several other versions of `bunnylol` that required me to host a server. This version of `bunnylol` *does not* need to be hosted on a server.
Currently, this is hosted on [https://rithik.me/bunnylol](https://rithik.me/bunnylol). However, you can host it on any website that you would like (even on GitHub Pages). Since this is going to be your primary search engine for every new tab you open, I would suggest that you don't host it somewhere that may take a while to spin up the static page (like Heroku where your VM could go to sleep).
A note: a lot of these commands are customized for me. It is probably most beneficial for you to fork this repo and add/remove commands so that it is optimized for commands you actually need.
## Example Commands
Command | Name | URL
--- | --- | ---
fb | Facebook | [https://facebook.com/](https://facebook.com/)
m | Messenger Desktop App | [messenger://](messenger://)
mw | Messenger Web | [https://www.messenger.com/](https://www.messenger.com/)
wa | WhatsApp Desktop App | [whatsapp://](whatsapp://)
waw | WhatsApp Web | [https://web.whatsapp.com/](https://web.whatsapp.com/)
gm | Gmail | [https://mail.google.com/mail/u/0](https://mail.google.com/mail/u/0)
gd | Google Drive | [https://drive.google.com/drive/u/0](https://drive.google.com/drive/u/0)
sis | UVA SIS | [https://sisuva.admin.virginia.edu/psc/ihprd/UVSS/SA/s/WEBLIB_HCX_GN.H_SPRINGBOARD.FieldFormula.IScript_Main](https://sisuva.admin.virginia.edu/psc/ihprd/UVSS/SA/s/WEBLIB_HCX_GN.H_SPRINGBOARD.FieldFormula.IScript_Main)
col | UVA Collab | [https://collab.its.virginia.edu/portal](https://collab.its.virginia.edu/portal)
yt | YouTube | [https://youtube.com/](https://youtube.com/)
gh | GitHub | [https://github.com/](https://github.com/)
r | Reddit | [https://reddit.com/](https://reddit.com/)
l | Linkedin | [https://linkedin.com/](https://linkedin.com/)
ig | Instagram | [https://instagram.com/](https://instagram.com/)
tw | Twitter | [https://twitter.com/](https://twitter.com/)
g | Google | [https://google.com/](https://google.com/)
wp | Washington Post | [https://www.washingtonpost.com/regional/](https://www.washingtonpost.com/regional/)
wsj | Wall Street Journal | [https://www.wsj.com/](https://www.wsj.com/)
cnn | CNN | [https://www.cnn.com/](https://www.cnn.com/)
n | Netflix | [https://netflix.com/](https://netflix.com/)
h | Hulu | [https://hulu.com/](https://hulu.com/)
pv | Amazon Prime Video | [https://www.amazon.com/Amazon-Video/b/?&node=2858778011&ref=dvm_MLP_ROWNA_US_1](https://www.amazon.com/Amazon-Video/b/?&node=2858778011&ref=dvm_MLP_ROWNA_US_1)
p | Piazza | [https://piazza.com/class](https://piazza.com/class)
vs | VS Code | [vscode://](vscode://)
hs | Hubspot | [https://app.hubspot.com/live-messages/](https://app.hubspot.com/live-messages/)
$ | Robinhood | [https://robinhood.com/](https://robinhood.com/)
cal | Google Calendar | [https://calendar.google.com/calendar/r](https://calendar.google.com/calendar/r)
covid | UVA COVID-19 Tracker | [https://returntogrounds.virginia.edu/covid-tracker](https://returntogrounds.virginia.edu/covid-tracker)
DEFAULT | Default - Google Search | [https://google.com/](https://google.com/)
## Setup
1. Open Chrome and click the three dots. Click `Settings` and scroll down to `Search Engines`.
2. Click `Manage Search Engines`.
3. Add a new search engine with the URL being `http://rithik.me/bunnylol?search=%s`. Of course, you should change the `rithik.me` part to your own domain.
4. Make this the default search engine.
## Adding a command
1. Run `npm install` so that `flow` (JavaScript type checker) can run.
2. Open up the `src/commands.js` file. Add your command to the `COMMANDS` object. You must include a `name` and `url` attribute, and you can add an additional `searchurl` attribute if you would like to be able to type a command like `yt NBA Highlights` (in which case, `bunnylol` will automatically search for NBA Highlights on YouTube); see the sketch after this list.
3. Run `npm run prepublish`.
4. Publish to your website.
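For illustration, a hypothetical entry might look like the following; the attribute names (`name`, `url`, `searchurl`) come from step 2 above, but the exact shape of `COMMANDS` in `src/commands.js` may differ:

```javascript
// Hypothetical addition to src/commands.js; "%s" is assumed to be the
// placeholder that bunnylol substitutes with the typed query.
const COMMANDS = {
    gh: {
        name: "GitHub",
        url: "https://github.com/",
        searchurl: "https://github.com/search?q=%s",
    },
    // ...other commands
};
```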
## Running locally
Since we use `import` module syntax, we need to run a server to bypass CORS issues. You can set up the server by running `npm install`, followed by `node server.js`. The server should be up and visible at `localhost:3000`.
| 65.772727 | 396 | 0.724487 | eng_Latn | 0.461462 |
248516532d87fbffd0cc6cd2d01e94de84c63c90 | 6,980 | md | Markdown | README.md | allekai/R-streamgenerator | 65a67337364103d46681172e95a0b62fcf3372b7 | [
"BSD-3-Clause"
] | 1 | 2019-12-27T07:44:27.000Z | 2019-12-27T07:44:27.000Z | README.md | allekai/R-streamgenerator | 65a67337364103d46681172e95a0b62fcf3372b7 | [
"BSD-3-Clause"
] | 5 | 2018-07-16T10:15:21.000Z | 2018-09-28T17:27:34.000Z | README.md | allekai/R-streamgenerator | 65a67337364103d46681172e95a0b62fcf3372b7 | [
"BSD-3-Clause"
] | 4 | 2017-09-18T17:19:54.000Z | 2020-02-03T23:07:54.000Z | # R-streamgenerator
A statistical multi-dimensional stream generator for benchmarking stream mining algorithms.
This R package provides functions to generate multidimensional data streams where the correlation structure can change through time.
More information about the motivation of this project is available in this [article][article].
## How it works
- We define (or generate randomly) a list of subspaces *subspaces* from a number of dimensions *dim*, where each subspace is composed of at least *mindim* dimensions and at most *maxdim* dimensions. The subspaces may overlap or not, depending on the user's parameter *allowOverlap*, but no subspace shall contain or be contained in another.
- We determine *margins*, a list of numbers between 0 and 1 having the same size as *subspaces*. The subspace at position *x* is assigned to the margin at position *x*.
- Each data point is represented by a vector of length *dim*, whose values are drawn from the uniform distribution between 0 and 1, with the exception of the subspaces specified in *subspaces*. For each of those subspaces, a dependency is created; this dependency may have different shapes (Wall, Square, Donut, ...). The strength of the dependency in a subspace depends on its *margin* value. The dependencies are created for subspaces of arbitrary dimensions. For example, while the dependency "Donut" looks like a donut in a 2-D space, it looks like an empty sphere in a 3-D space and becomes a hypersphere beyond.
- Each subspace has a proportion *prop* of outliers. If *proptype* is set to *proportional*, then the expected proportion of outliers in a subspace is proportional to its *margin*, i.e., the volume of hidden space. If *proptype* is set to *absolute*, then the expected proportion of outliers depends on the number of points only.
- It is possible to generate *static* streams and *dynamic* streams. For dynamic streams, the lists *subspaces* and *margins* are changed over a number *nstep* of time steps by a proportion *volatility*. For example, if *volatility* is equal to 0.5, half of the subspace/margin pairs are changed at each step. Each step is composed of a number *n* of points, which can also vary per step.
- Between each time step, the state of the generator changes uniformly from the current to the next *subspaces*/*margins* list.
- The generated streams can be composed of real values (default) or can be discretized into a number of *discrete* values.
Note that the overall proportion of outliers in the output stream does not relate directly to *prop*, since *prop* corresponds either to the absolute expected proportion of outliers per subspace (*proptype* = "absolute"), or to the expected proportion of outliers conditioned on the size of the hidden space (*proptype* = "proportional"). In both cases, it depends on the number of dependent subspaces.
For each stream, contrasted subspaces are chosen such that roughly 1/4 of the dimensions are not involved in any contrasted subspace. This is done to make the search for subspaces realistic and to leave room for subspaces to change over time.
**TL;DR** the data is generated uniformly, except in some subspaces where the data is concentrated in the shape of particular dependencies. The chosen dependencies include regions in which to place hidden outliers. The dependencies are subject to change in amplitude and subspaces through time. The following picture shows a snapshot of 100 points in a subspace with a dependency of type "Wall" in a generated data stream at different points in time:

As you can see, the relationship between attribute n°7 and n°8 has evolved through time. Also, there is an outlier in picture 5 (in red). Obviously, by looking at the whole time window, this point would probably not have been detected as such.
Currently, 3 kinds of dependencies are available: "Wall", "Square" and "Donut". Here is what they look like in 2-D spaces, with *n = 1000*, *margin=0.8*, *prop=0.005*, *proptype="absolute"*. Outliers are shown in red.
With *discrete=0*:

With *discrete=20*:

## Install
1. Install dev-tools:
```R
install.packages("devtools")
```
2. Install the package from github
```R
library(devtools)
devtools::install_github("edouardfouche/R-streamgenerator")
```
### Development mode
1. Clone this repository
2. Load the package in your environment:
```R
library(devtools)
load_all("~/path/to/cloned/R-streamgenerator/")
```
Note that the package is not published to CRAN (yet).
## Package documentation
Documentation (.Rd files) for this package was created by using roxygen2 package.
1. Install the devtools package as shown above (Install steps 1 & 2), then install the roxygen2 package:
```R
install.packages("roxygen2")
```
The roxygen2 format requires special comments, which must start with #'
2. After creating these comments, press Ctrl/Cmd + Shift + D or run:
```R
document()
```
A man/NameOfFunction.Rd file will be generated.
3. To use the created documentation, run:
```R
?NameOfFunction
```
or
```R
help("NameOfFunction")
```
Within R-streamgenerator, documentation was created for the following functions:
```R
generate.subspaces()
replace.subspaces()
generate.margins()
generate.marginslist()
generate.dynamic()
generate.stream.config()
generate.row()
generate.multiple.rows()
generate.static.stream()
generate.dynamic.stream()
output.stream()
```
## Get started
* Generate a static stream
```R
stream <- generate.static.stream() # default parameters
# Generate a stream with custom configuration
stream.config <- generate.stream.config(dim=50, nstep=1) # nstep should be = 1
stream <- generate.static.stream(n=1000, prop=0.05, stream.config=stream.config)
```
* Generate a dynamic stream
```R
stream <- generate.dynamic.stream() # default parameters
# Generate a stream with custom configuration
stream.config <- generate.stream.config(dim=50, nstep=10, volatility=0.5)
stream <- generate.dynamic.stream(n=100, prop=0.05, stream.config=stream.config)
```
* Output the stream
```R
# This will create 3 files in your working directory.
# "example_data.txt" contains the stream
# "example_labels.txt" contains the labels, i.e if each point is an outlier and in which subspace(s)
# "example_description.txt" contains a human-readable description of the stream
output.stream(stream, "example")
```
## TODO(s)
* Write more tests
* Write about the generation approach
* Develop other dependencies and transitions (drifts)
* Allow mixed dependencies in multivariate streams
* Allow mixed real/discrete data sets
* Provide control on the number/proportion of contrasted subspaces in the stream
* Write visualization functions
* Create a "cross" or "sinusoidal" pattern
* Use it
[article]: https://edouardfouche.com/Data-Stream-Generation-with-Concept-Drift/
| 46.845638 | 615 | 0.768195 | eng_Latn | 0.996591 |
24876309370c2c67e7ad10fd01e105824bd42d93 | 4,361 | md | Markdown | README.md | broadinstitute/dig-loam-stream | 6129834f8f0fcbb5ec07b5cb8da973b3e9442c31 | [
"BSD-3-Clause"
] | 1 | 2020-07-20T21:55:44.000Z | 2020-07-20T21:55:44.000Z | README.md | broadinstitute/dig-loam-stream | 6129834f8f0fcbb5ec07b5cb8da973b3e9442c31 | [
"BSD-3-Clause"
] | 27 | 2018-10-22T17:28:19.000Z | 2021-05-14T20:59:25.000Z | README.md | broadinstitute/dig-loam-stream | 6129834f8f0fcbb5ec07b5cb8da973b3e9442c31 | [
"BSD-3-Clause"
] | 1 | 2022-01-27T20:42:20.000Z | 2022-01-27T20:42:20.000Z | # LoamStream
LoamStream is a genomic analysis stack featuring a high-level language, compiler and runtime engine.
## Building
### Requirements
- Git
- Java 8
- SBT 0.13.13+
### To build a runnable LoamStream binary from the master branch:
- Clone the repository at https://github.com/broadinstitute/dig-loam-stream
- In the directory created by the clone operation (dig-loam-stream by default), run `sbt universal:packageBin`.
This creates target/universal/loamstream-<version>.zip which contains the launch script bin/loamstream.
(There's also bin/loamstream.bat for Windows, but since most analysis tools are written for Unix-like systems,
this is typically less useful.)
## Jenkins:
- URL: http://dig-ae-dev-01:8080/
- `$CI_ROOT`: `/humgen/diabetes/users/dig/loamstream/ci/`
- `$JENKINS_ROOT`: `${CI_ROOT}/jenkins/`
- War: `${JENKINS_ROOT}/jenkins-2.32.1-LTS.war`
- `~diguser/.jenkins` symlinked to `${JENKINS_ROOT}/home`, so Jenkins looks in its default place for files.
- Workspace (builds are cloned to and run from here): `${JENKINS_ROOT}/home/workspace/`
- Running Jenkins:
- Use `${JENKINS_ROOT}/bin/run-jenkins.sh` (`use`s various tools, sets Google-required env vars, makes running LS integration tests possible.)
- Jenkins runs in the `jenkins` screen session for the user `diguser` on `dig-ae-dev-01`.
## Integration tests:
- Live in `src/it/scala/`
- Run with `it:test`, either from the SBT prompt or when starting SBT (`sbt it:test`)
`loamstream.QcPipelineEndToEndTest`:
- Runs `pipeline.loam` through LS, equivalent to running
```
scala -jar loamstream.jar --conf pipeline/loam/loamstream.conf \
pipeline/loam/pipeline.loam \
pipeline/loam/params.loam \
pipeline/loam/binaries.loam \
pipeline/loam/scripts.loam \
pipeline/loam/cloud_helpers.loam \
pipeline/loam/store_helpers.loam
```
from the command line.
- Checks some outputs (see `QcPipelineEndToEndTest.scala` for a list) against assumed-good ones from a manual run stored in `/humgen/diabetes/users/dig/loamstream/ci/test-data/qc/camp/results/`.
## Detailed Jenkins Setup Log:
### Dirs made:
- `/humgen/diabetes/users/dig/loamstream/ci/jenkins/wars/`
contains `jenkins-2.32.1-LTS.war`
- `/humgen/diabetes/users/dig/loamstream/ci/jenkins/home/`
symlinked from `~diguser/.jenkins`
### First-time startup
- Started Jenkins with
`java -jar /humgen/diabetes/users/dig/loamstream/ci/jenkins/wars/jenkins-2.32.1-LTS.war`
- Gave initial, generated admin password from
`~diguser/.jenkins/secrets/initialAdminPassword`
- Installed suggested plugins
- Installed SBT plugin:
- Dashboard => 'Manage Jenkins' => 'Manage Plugins' => 'Available' => search for 'sbt'
- Install this one: http://wiki.jenkins-ci.org/display/JENKINS/sbt+plugin
- Restart Jenkins
### Point to JDK and SBT
- Dashboard => Manage Jenkins => Global Tool Configuration
- JDK:
- uncheck 'install automatically'
- Name: 'JDK8' (purely descriptive)
- `JAVA_HOME`: `/broad/software/free/Linux/redhat_6_x86_64/pkgs/jdk1.8.0_121`
- TODO: get Oracle account, enable 'install automatically', don't hard-code dotkit-supplied `JAVA_HOME`
- SBT:
- install automatically
- install from scala-sbt.org
- version: `sbt 0.13.13`
- name: `SBT 0.13.13`
- sbt launch arguments: `-XX:ReservedCodeCacheSize=256m -XX:MaxMetaspaceSize=512m -Xms1G -Xmx2G -XX:+AggressiveOpts -XX:+CMSClassUnloadingEnabled -XX:+UseConcMarkSweepGC`
### Creating a first job
- Jenkins Dashboard (http://dig-ae-dev-01:8080/) => 'New Item'
- name: LoamStream
- type: 'Freestyle project'
- General tab:
- check 'github project', url: `https://github.com/broadinstitute/dig-loam-stream`
- Source Code Management => Git
- repository url: `https://github.com/broadinstitute/dig-loam-stream.git`
- credentials => add => jenkins
- used `ClintAtTheBroad` for now (TODO)
- branches to build: just master for now
- Build Triggers:
- check 'poll SCM'
- schedule: `H/15 * * * *` (every 15 minutes; could be more often)
- Build:
- Add build step => 'build using sbt'
- actions: `clean test`
| 47.402174 | 196 | 0.679202 | eng_Latn | 0.640478 |
248782024fc6aad1c1b252eed9ae9e563b167fe9 | 1,985 | md | Markdown | content/publication/bharadwaj-2011-super/index.md | saivarun983/website | fcedc1edf2a166d39ccfa4fd1c35ce45e42370be | [
"Apache-2.0",
"MIT"
] | null | null | null | content/publication/bharadwaj-2011-super/index.md | saivarun983/website | fcedc1edf2a166d39ccfa4fd1c35ce45e42370be | [
"Apache-2.0",
"MIT"
] | 3 | 2021-08-03T11:58:37.000Z | 2021-08-08T16:45:20.000Z | content/publication/bharadwaj-2011-super/index.md | saivarun983/website | fcedc1edf2a166d39ccfa4fd1c35ce45e42370be | [
"Apache-2.0",
"MIT"
] | 3 | 2021-07-22T18:32:48.000Z | 2021-08-05T15:48:30.000Z | ---
title: "Super-virtual refraction interferometry: Theory"
date: 2011-01-01
publishDate: 2020-01-17T02:43:54.255320Z
authors: ["Pawan Bharadwaj", "Gerard T Schuster", "Ian Mallinson"]
publication_types: ["1"]
abstract: "Inverting for the subsurface velocity distribution by refraction traveltime tomography is a well accepted imaging method by both the exploration and earthquake seismology communities. A significant drawback, however, is that the recorded traces become noisier with increasing offset from the source position, and so prevents accurate picking of traveltimes in far offset traces. To enhance the signal to noise ratio of the far offset traces, we present the theory of super virtual refraction interferometry where the signal to noise ratio (SNR) of far offset head wave arrivals can be theoretically increased by a factor proportional to N; here, N is the number of receiver and source positions associated with the recording and generation of the head wave arrival. There are two steps to this methodology: correlation and summation of the data to generate traces with virtual head wave arrivals, followed by the convolution of the data with the virtual traces to create traces with super virtual head wave arrivals. This method is valid for any medium that generates head wave arrivals. There are at least three significant benefits to this methodology: 1). enhanced SNR of far offset traces so the first arrival traveltimes of the noisy far offset traces can be more reliably picked to extend the useful aperture of data, 2). the SNR of head waves in a trace that arrive after the first arrival can be enhanced for accurate traveltime picking and subsequent inversion by traveltime tomography, and 3). common receiver pair gathers can be analyzed to detect the presence of diving waves in the first arrivals, which can be used to assess the nature of the refracting boundary."
featured: false
publication: "*SEG Technical Program Expanded Abstracts 2011*"
---
| 165.416667 | 1,689 | 0.811587 | eng_Latn | 0.999494 |
248a3ce42f8d97fdea2f864d03df78c24da80a90 | 570 | md | Markdown | _publications/SpikeSlabHD.md | jantonelli111/jantonelli111.github.io | 1a133e60672a3d5e6e4457368c2cf1a606c76e48 | [
"MIT"
] | null | null | null | _publications/SpikeSlabHD.md | jantonelli111/jantonelli111.github.io | 1a133e60672a3d5e6e4457368c2cf1a606c76e48 | [
"MIT"
] | null | null | null | _publications/SpikeSlabHD.md | jantonelli111/jantonelli111.github.io | 1a133e60672a3d5e6e4457368c2cf1a606c76e48 | [
"MIT"
] | 2 | 2018-10-26T18:06:56.000Z | 2019-05-19T22:26:08.000Z | ---
title: "High-dimensional confounding adjustment using continuous spike and slab priors"
collection: publications
permalink: /publication/SpikeSlabHD
date: 2018-10-18
venue: 'Bayesian Analysis'
paperurl: 'https://arxiv.org/pdf/1704.07532.pdf'
authors: Joseph Antonelli, Giovanni Parmigiani, and Francesca Dominici
citation: 'Antonelli, Joseph, Giovanni Parmigiani, and Francesca Dominici. "High-dimensional confounding adjustment using continuous spike and slab priors." Bayesian Analysis. To appear.'
---
[Download paper here](https://arxiv.org/pdf/1704.07532.pdf)
| 43.846154 | 187 | 0.798246 | eng_Latn | 0.557378 |
71c4dc935921e7b1b0e601ab3ea4559ee352c7bd | 139 | md | Markdown | _posts/2018-03-04-first-sign.md | sermons/sermons.github.io | 037e28d217479aa79cb54c8d5d49757cfa04fd0e | [
"MIT"
] | 2 | 2018-08-04T11:02:36.000Z | 2020-10-06T07:52:14.000Z | _posts/2018-03-04-first-sign.md | sermons/sermons.github.io | 037e28d217479aa79cb54c8d5d49757cfa04fd0e | [
"MIT"
] | 9 | 2016-08-26T07:14:58.000Z | 2020-04-19T22:41:45.000Z | _posts/2018-03-04-first-sign.md | sermons/sermons.github.io | 037e28d217479aa79cb54c8d5d49757cfa04fd0e | [
"MIT"
] | 1 | 2018-08-04T11:02:39.000Z | 2018-08-04T11:02:39.000Z | ---
layout: post
title: "The First Sign of His Glory"
subtitle: "John 2:1-11"
tags: john pgmc
---
Pacific Grace Mandarin Church (Burnaby)
| 15.444444 | 39 | 0.705036 | eng_Latn | 0.772411 |
71c7fce6eefdd3a645ae525408de30433f5a229d | 1,350 | md | Markdown | 2020/09/28/2020-09-28 01:35.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | 3 | 2020-07-14T14:54:15.000Z | 2020-08-21T06:48:24.000Z | 2020/09/28/2020-09-28 01:35.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020/09/28/2020-09-28 01:35.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020年09月28日01时数据
Status: 200
1.李沁工作室撤下单身声明
微博热度:1104749
2.阿塞拜疆多个地区进入战时状态
微博热度:443158
3.李宗盛的人生叙事
微博热度:442173
4.虞书欣 谢谢大家来到我们的婚礼
微博热度:430842
5.易烊千玺 我有影迷了
微博热度:415805
6.二胎生不生
微博热度:374995
7.刘丧废了一只耳朵
微博热度:367221
8.半是蜜糖半是伤
微博热度:343661
9.ZARA母公司半年亏损15亿
微博热度:286621
10.美国再现食脑虫
微博热度:263712
11.日本娱乐圈怎么了
微博热度:255955
12.亲爱的自己
微博热度:217160
13.林雨申替赵露思吃芥末
微博热度:184285
14.重启
微博热度:170864
15.成都双流车祸致1死7伤
微博热度:163644
16.平凡的荣耀大结局
微博热度:161585
17.LGD保留晋级希望
微博热度:160393
18.动物园狮子与游客对吼
微博热度:157565
19.阿塞拜疆
微博热度:156953
20.同济大学送1万名新生月饼盲袋
微博热度:154799
21.杨洋
微博热度:152290
22.米勒 我甚至都看不到斗志
微博热度:149951
23.罗云熙白鹿吻戏笑场
微博热度:147011
24.太独立会错失亲密的机会
微博热度:146366
25.老太爬上地铁车厢行李架上蹭坐
微博热度:142866
26.谢明皓
微博热度:142599
27.鬼鬼收到黄鸿升生前准备的礼物
微博热度:142137
28.张定宇说在大会堂时很担心摔跤
微博热度:134706
29.交通银行笔试
微博热度:131981
30.拜登接受采访突然语无伦次
微博热度:130270
31.杨笠 骗感情可以但不能骗我钱
微博热度:129512
32.老干爹
微博热度:124057
33.刘洋变家庭煮夫
微博热度:119042
34.关小唐偷亲李思雨
微博热度:111476
35.抗美援朝老兵起立敬礼目送战友
微博热度:111400
36.王一博对鸭脖的执念
微博热度:111380
37.天天向上
微博热度:105697
38.贾玲对王源撒娇
微博热度:105384
39.吴邪重生
微博热度:92451
40.秘密森林2
微博热度:91307
41.乔欣做ppt告别兰芊翊
微博热度:88877
42.喻言没有镜头
微博热度:85777
43.白鹿演技
微博热度:81583
44.安徽界首发生4人死亡刑案
微博热度:74657
45.S10入围赛
微博热度:63591
46.吴邪终于见到三叔
微博热度:63165
47.爷爷记不清孙女今天要结婚
微博热度:62872
48.LGD输了
微博热度:62330
49.给学生治顺拐结果全被带偏
微博热度:61564
50.机器狗夜晚独自在街头游荡
微博热度:60699
| 6.617647 | 17 | 0.771111 | yue_Hant | 0.385775 |
71c95fcca42f36bf7ec2307c6e5c2e2b60433ca2 | 1,596 | md | Markdown | _posts/2017-01-22-alfred-urlencode-and-urldecode-workflows-plugin.md | xu3352/xu3352.github.io | ff16588dc8aee48f0a1fa0ba9db2eaeef2cd6148 | [
"MIT"
] | 1 | 2018-09-06T09:52:59.000Z | 2018-09-06T09:52:59.000Z | _posts/2017-01-22-alfred-urlencode-and-urldecode-workflows-plugin.md | xu3352/xu3352.github.io | ff16588dc8aee48f0a1fa0ba9db2eaeef2cd6148 | [
"MIT"
] | null | null | null | _posts/2017-01-22-alfred-urlencode-and-urldecode-workflows-plugin.md | xu3352/xu3352.github.io | ff16588dc8aee48f0a1fa0ba9db2eaeef2cd6148 | [
"MIT"
] | 4 | 2019-04-27T15:33:20.000Z | 2021-02-04T11:56:25.000Z | ---
layout: post
title: "Alfred urlencode/urldecode插件"
date: '2017-01-22 04:13:54'
category: mac
tags: alfred tools python
---
> An Alfred urlencode/urldecode workflow plugin
# Building the tool myself
Tired of searching for online urldecode tools, I just built one; much more convenient.
Straight to the key Script code: 😉
# urlencode
```
#!/usr/bin/env python
# -*- encoding: utf-8 -*-
# author: xu3352<[email protected]>
# desc: urlencode
## python2
# import sys
# import urllib
# result = urllib.quote('{query}');
# sys.stdout.write(result);
## python3
import sys
import urllib.parse
result = urllib.parse.urlencode({'': '{query}'})
sys.stdout.write( result[1:] )
```
# urldecode
```
#!/usr/bin/env python
# -*- encoding: utf-8 -*-
# author: xu3352<[email protected]>
# desc: urldecode
## python2
# import urllib
# import sys
# result = urllib.unquote('{query}').decode('utf8');
# sys.stdout.write(result);
## python3
from urllib.parse import unquote
import sys
result = unquote(unquote('{query}'))
sys.stdout.write(result);
```
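To sanity-check the Python 3 logic outside Alfred, substitute a literal string for the `{query}` placeholder (illustrative only; note that `urlencode` encodes spaces as `+`, so `unquote_plus` is needed for a clean round trip):

```python
from urllib.parse import urlencode, unquote_plus

encoded = urlencode({'': 'hello world/中文'})[1:]
print(encoded)               # hello+world%2F%E4%B8%AD%E6%96%87
print(unquote_plus(encoded)) # hello world/中文
```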
Downloads:
- [UrlEncode.alfredworkflow](http://on6gnkbff.bkt.clouddn.com/20170419162632_UrlEncode.alfredworkflow.zip)
- [UrlDecode.alfredworkflow](http://on6gnkbff.bkt.clouddn.com/20170419162632_UrlDecode.alfredworkflow.zip)
References:
- [Url decode UTF-8 in Python](http://stackoverflow.com/questions/16566069/url-decode-utf-8-in-python)
- [python中的urlencode与urldecode](http://blog.csdn.net/haoni123321/article/details/15814111)
- [Is there a way to substring a string in Python?](http://stackoverflow.com/questions/663171/is-there-a-way-to-substring-a-string-in-python)
- [URL Decode with Python 3](http://stackoverflow.com/questions/8628152/url-decode-with-python-3)
| 23.820896 | 141 | 0.730576 | eng_Latn | 0.191879 |
71ca03aa1a97ee93add11e5c97c937fb1bd9831a | 6,602 | md | Markdown | _posts/2020-06-03-retrospection-ecommerce.md | 2pegramming/2pegramming.github.io | fb361cd0fe40babd863f81dca1a2a756fe6939c7 | [
"MIT"
] | null | null | null | _posts/2020-06-03-retrospection-ecommerce.md | 2pegramming/2pegramming.github.io | fb361cd0fe40babd863f81dca1a2a756fe6939c7 | [
"MIT"
] | null | null | null | _posts/2020-06-03-retrospection-ecommerce.md | 2pegramming/2pegramming.github.io | fb361cd0fe40babd863f81dca1a2a756fe6939c7 | [
"MIT"
] | 1 | 2020-06-04T11:30:55.000Z | 2020-06-04T11:30:55.000Z | ---
layout: post
title: "Ретроспектива: ecommerce"
tags:
- retrospective
- ecommerce
- tips and tricks
---
A year ago I left a healthcare startup that was an ecommerce platform for medications in the US. Right after leaving, I wrote a draft of advice to myself about what is worth doing right away and what deserves attention.
The draft sat forgotten for a year, but having found the notes in my notebook, I am publishing it as a retrospective based on experience and pain. Every item is subjective experience and does not claim to be the only correct solution. If you have similar advice related to ecommerce, please comment; I would like to collect a whole list. In the future, similar lists could be made for other kinds of projects.
## Orders and prices
* A `Cart` is a special case of an `order`. Instead of two entities you can use an `order` with status `created`.
* Coupons will appear. Making a coupon a separate entity complicates the logic and adds conditions to the checkout and refund flows. Today, if I had to add coupons to a system, I would make a coupon a separate line item with a negative price (the discount).
* `order_items` linked to `item` and `price` records in the database, and analysts kept changing product prices. That led to order mutations, and it was hard to say what an order from a year ago contained and at what price. Users also got unpredictable UX: hypothetically, a product's price could change at the moment of checkout, and the user would pay more or less than expected. Today I would put a JSONB price field into every `order_item`, shaped like `JSONB: { added_at: '', price: ... }` (see the sketch after this list). That way there are no unforeseen mutations, and you get data versioning. It is also more information for the analysts.
* Data scientists will want to change prices using complex price-calculation patterns and a pile of conditions, which leads to constant price changes in the database. Hence the rule: everything related to prices should be isolated as early as possible into a separate domain, service, Rails engine, whatever. Separate it not only at the logic level but also at the data level. The JSONB-field option is a special case of such separation.
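A minimal sketch of the JSONB idea from the list above, as a hypothetical Rails migration (all names are illustrative, not from the original system):

```ruby
# Hypothetical migration: snapshot the price inside each order item.
class AddPriceSnapshotToOrderItems < ActiveRecord::Migration[6.0]
  def change
    add_column :order_items, :price_snapshot, :jsonb, null: false, default: {}
  end
end

# At add-to-cart time, freeze the current price on the item, e.g.:
#   order_item.update!(price_snapshot: { added_at: Time.current, price: item.price })
```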
## Observability
* Monitoring and logging add control: they tell you what is happening or hint at where the problem is. Covering every line with logs and monitoring is expensive, so the top candidates are: everything related to money, the checkout & refund flow, changes to data, and integrations with partners that bring in money. The approach correlates with the rule "know what is happening in the parts of the system that make money."
* Start any work with money from logs. If there is no ELK or log store where information can be found painlessly, use the database. A starter solution is a separate audit table that receives the information from the payment gateway. That information is useful both in debugging and to analysts for building models.
## Architecture
* If the system has mobile devices and other kinds of clients (for example, B2B integrations), you should think about versioning and design for it from the very beginning. Besides controller versioning, there is business-logic versioning and data versioning.
* If you have more than two clients, I recommend reading about the [BFF pattern](https://samnewman.io/patterns/architectural/bff/).
* If there are no events in the code, the probability that they will appear is extremely high. Sidekiq is not meant for event-driven architecture, although it helps postpone the move to a new architecture and works for background processing or at the very start of a project. So design the implementation for later use of events outside Sidekiq.
* Code without events for the business and the analysts is money down the drain. From the very start of a project, plan for events related to how the business operates to be emitted. You do not have to use complex solutions or pay for SaaS products: in my own projects, for example, I send events to Telegram and write some to the database.
* Kafka as the event broker from day one is a failure. It is expensive, very expensive. If you host in the cloud, SQS or Google Pub/Sub can be a good start; if not, choose among the self-hosted alternatives. Also think right away about forward and backward compatibility of the data in events, and about a schema registry. As a hack, a separate repository with event schemas will do.
* Temporary things stay forever. A technology added "for 2 sprints" stayed in the project for 2 years.
## Data and ETL
* In a marketplace you regularly need to load new data; product prices and the product information itself are examples. Instead of writing rake tasks to load CSVs into the database, look at ETL and design the architecture around it. Candidates include [kiba](https://github.com/thbar/kiba) (written in Ruby) and [Apache Airflow](https://airflow.apache.org), written in Python.
* Think in advance about how to export data from the application to the analysts. Sharing a database dump may turn out to be the quick option, but in the long run it brings compatibility problems between the data schema in the database and the data schema the analysts use. An event-based approach can serve as an alternative, but think ahead about the data schema, versioning, and backward compatibility between versions.
* Analysts want to see how the data changes over time and build change lists out of dumps from different days. Event-based data exchange helps with this problem as well.
* If the domain has a specific ID with a defined structure, enforce that structure at the database level and normalize to it in the business logic (a sketch follows below). For example, we had the [National Drug Code](https://en.wikipedia.org/wiki/National_Drug_Code). It is a string that can contain `xxxx-xxxx-xx` (10 digits), `xxxxx-xxx-xx` (10 digits), `xxxxx-xxxx-x` (10 digits) or `xxxxx-xxxx-xx` (11 digits) values. A year into development it turned out that the system contained three formats of this code, that part of the system supported only the 10-digit format, and that it crashed with an error on the 11-digit one. We agreed on the `5-4-2` (11-digit) standard, made a database trigger to left-pad the value with zeros up to 11 digits, and added a database constraint of at most 11 characters. In the business logic we left-pad the value to an 11-character string (with zeros). That way a producer can send data in its own format, and we do not fear that an invalid value ends up in the database.
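One way to read the padding rule above, as a hypothetical Ruby helper (the real trigger and constraint live in the database; whether every 10-digit variant should map this way is a domain decision):

```ruby
# Normalize an NDC-like code to the agreed 5-4-2 (11-digit) standard by
# stripping dashes and left-padding the digit string with zeros.
def normalize_ndc(raw)
  padded = raw.delete("-").rjust(11, "0")
  format("%s-%s-%s", padded[0, 5], padded[5, 4], padded[9, 2])
end

normalize_ndc("1234-5678-90") # => "01234-5678-90"
```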
------
Found a typo or a mistake? I would be glad if you [sent a PR on GitHub](https://github.com/2pegramming/2pegramming.github.io/tree/master/_posts).
| 161.02439 | 994 | 0.800212 | rus_Cyrl | 0.990213 |
71caa58c4dd9d4bb297a647f83c7562fdbb1c9bf | 140 | md | Markdown | README.md | hashimom/ws-wnn | 478150f81e38cbf1d78933060d322f0a8de0c955 | [
"MIT"
] | null | null | null | README.md | hashimom/ws-wnn | 478150f81e38cbf1d78933060d322f0a8de0c955 | [
"MIT"
] | null | null | null | README.md | hashimom/ws-wnn | 478150f81e38cbf1d78933060d322f0a8de0c955 | [
"MIT"
] | null | null | null | Izumo-fwnn
=====
This app is experimental and still unfinished...
# Caveats
Izumo-fwnn is licensed under MIT, but FreeWnn is GPLv2 (the library portion is LGPLv2).
Please be very careful about how you use it.
| 14 | 68 | 0.735714 | yue_Hant | 0.516284 |
71ce40c28c38166152b00b7ee55cf6bc8b85e4b5 | 617 | md | Markdown | README.md | progrhyme/bash-xfgrep | 5768e87afe332efac05d011db77fa5fa3a57b2a1 | [
"MIT"
] | null | null | null | README.md | progrhyme/bash-xfgrep | 5768e87afe332efac05d011db77fa5fa3a57b2a1 | [
"MIT"
] | null | null | null | README.md | progrhyme/bash-xfgrep | 5768e87afe332efac05d011db77fa5fa3a57b2a1 | [
"MIT"
] | null | null | null | # NAME
**xfgrep** - Wrapper command of `find ... | xargs grep ...`
# SYNOPSIS
    xfgrep KEYWORD [-i|--ignore IGNORE_FILE_PATTERN] [-d|--dir DIR] [-f|--file FILE]

    # version
    xfgrep -v|--version

    # help
    xfgrep -h|--help
# DESCRIPTION
This wrapper executes the following command:
    find $DIR -type f -name "$FILE" [ | egrep -v "$IGNORE" ] | xargs egrep -Hn "$KEYWORD"
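For example, a hypothetical invocation based on the synopsis above:

    # search *.rb files under lib/ for TODO, skipping paths that match "vendor"
    xfgrep TODO -d lib -f '*.rb' -i vendor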
# SEE ALSO
[https://github.com/key-amb/bash-xfperl-pie](https://github.com/key-amb/bash-xfperl-pie)
# AUTHORS
YASUTAKE Kiyoshi <[email protected]>
# LICENSE
The MIT License (MIT)
Copyright (c) 2016 YASUTAKE Kiyoshi
| 18.69697 | 89 | 0.658023 | yue_Hant | 0.492403 |
71d0eef50b4fe5e6bc54ecc10957eace13a69de5 | 4,519 | md | Markdown | _posts/2018-06-10-spark-hdfs-read-partitions.md | zhongjinhan/zhongjinhan.github.io | ef5d213887f914c2d9d47485702e0336c0158b96 | [
"MIT"
] | null | null | null | _posts/2018-06-10-spark-hdfs-read-partitions.md | zhongjinhan/zhongjinhan.github.io | ef5d213887f914c2d9d47485702e0336c0158b96 | [
"MIT"
] | null | null | null | _posts/2018-06-10-spark-hdfs-read-partitions.md | zhongjinhan/zhongjinhan.github.io | ef5d213887f914c2d9d47485702e0336c0158b96 | [
"MIT"
] | null | null | null | ---
layout: post
title: How the Number of RDD Partitions Is Determined When Spark Reads from HDFS
date: 2018-06-10
summary: A detailed look at how the number of partitions of an RDD is determined after Spark reads data from HDFS
categories: spark
published: true
---
How many partitions does an RDD created by Spark reading a file from HDFS contain? This question comes up often in interviews, and you may well run into the same doubt during day-to-day development. So how many partitions are actually generated? It depends on which API is used, and there are two cases to explain: the read APIs on SparkContext, and the DataFrameReader API on SparkSession; these represent the commonly used APIs before and after Spark 2.0 respectively.
### HDFS Read APIs on SparkContext
The SparkContext APIs related to reading HDFS data are relatively simple as far as partition generation goes; simple here means that Spark itself does nothing extra and relies entirely on the Hadoop MapReduce API. Concretely, it uses the getSplits method of InputFormat to obtain InputSplits and turns them into partitions: however many InputSplits there are, that is how many partitions you end up with. The implementation of InputFormat#getSplits depends on the specific file format; for the text format, for example, it lives in FileInputFormat#getSplits.
The SparkContext APIs can be further divided into new and old variants, backed by the underlying RDDs HadoopRDD and NewHadoopRDD respectively. "New" and "old" here correspond to the new and old MapReduce APIs; for the differences between them, see [Difference between Hadoop OLD API and NEW API](http://hadoopbeforestarting.blogspot.com/2012/12/difference-between-hadoop-old-api-and.html). HadoopRDD and NewHadoopRDD are each described briefly below.
#### HadoopRDD
SparkContext exposes the relevant APIs; a typical one is:
```scala
def textFile(path: String, minPartitions: Int = defaultMinPartitions): RDD[String]
```
These APIs generally take a minPartitions parameter; if the caller does not supply one, min(spark.default.parallelism, 2) is used as the default. Spark then, inside HadoopRDD, directly calls the getSplits method of Hadoop's org.apache.hadoop.mapred.InputFormat to obtain the InputSplits.
Take the text format as an example: its getSplits implementation lives in org.apache.hadoop.mapred.FileInputFormat, where the key variable is splitSize, obtained as follows:
```
splitSize = max(minsize, min(totalFileSize / minPartitions, blockSize))
```
Here minsize comes from the mapreduce.input.fileinputformat.split.minsize parameter (mapred-site.xml, default 1), totalFileSize is the total size of the files being read, minPartitions is supplied by the caller, and blockSize is the HDFS block size.
The rest is simple: all files are traversed and cut into InputSplits of splitSize bytes each.
So when using the HadoopRDD-based APIs, the resulting partition count can roughly be summarized as follows (a sketch of the two main knobs follows the list):
- With all parameters at their defaults, one HDFS file yields at least one partition, and a file containing multiple blocks yields a correspondingly larger number of partitions.
- Holding the other parameters fixed, you can raise mapreduce.input.fileinputformat.split.minsize to increase splitSize and thereby reduce the number of partitions (this presupposes that at least some files contain multiple blocks).
- Holding the other parameters fixed, you can raise minPartitions to increase the number of partitions.
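As a minimal, hedged sketch of both knobs (the application name, path, and sizes below are made-up assumptions, not values from the post):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("hadooprdd-partitions"))

// Raise the minimum split size to 256 MB to get fewer partitions
// (only effective for files that span multiple HDFS blocks).
sc.hadoopConfiguration.set(
  "mapreduce.input.fileinputformat.split.minsize",
  (256L * 1024 * 1024).toString)

// Or request more partitions explicitly via minPartitions.
val rdd = sc.textFile("hdfs:///tmp/events", minPartitions = 40)  // hypothetical path
println(s"partitions = ${rdd.getNumPartitions}")
```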
#### NewHadoopRDD
The NewHadoopRDD-related APIs are similar to the HadoopRDD ones and also live on SparkContext, for example:
```scala
def newAPIHadoopFile[K, V, F <: InputFormat[K, V]](path: String)(implicit km: ClassTag[K], vm: ClassTag[V], fm: ClassTag[F]): RDD[(K, V)]
```
NewHadoopRDD likewise calls InputFormat#getSplits directly to generate partitions, except that the new MapReduce API has no minPartitions parameter and its InputFormat lives under org.apache.hadoop.mapreduce. The concrete getSplits implementation again depends on the file format; for the text format it is org.apache.hadoop.mapreduce.lib.input.FileInputFormat, whose getSplits is similar to the HadoopRDD case apart from how splitSize is produced:
```
splitSize = max(minsize, min(maxsize, blocksize))
```
Here maxsize is the MapReduce setting mapreduce.input.fileinputformat.split.maxsize, which defaults to 9223372036854775807 (the maximum long value); minsize is mapreduce.input.fileinputformat.split.minsize, default 1; and blocksize is the HDFS block size.
Overall, NewHadoopRDD generates partitions in essentially the same way, except that you lower mapreduce.input.fileinputformat.split.maxsize, rather than raise minPartitions, to increase the partition count.
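A corresponding sketch for the new API; again, the path is hypothetical and the 64 MB cap is just an assumption for illustration:

```scala
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

// Cap split size at 64 MB so multi-block files break into more partitions.
sc.hadoopConfiguration.set(
  "mapreduce.input.fileinputformat.split.maxsize",
  (64L * 1024 * 1024).toString)

val rdd = sc.newAPIHadoopFile[LongWritable, Text, TextInputFormat]("hdfs:///tmp/events")
println(s"partitions = ${rdd.getNumPartitions}")
```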
### The HDFS read APIs on SparkSession in 2.0 and later
With the APIs under SparkSession#read in Spark 2.0+, the way partitions are generated changed considerably: Spark abandoned the getSplits call as the mechanism for producing partitions and instead controls them itself through several parameters. The main logic lives in FileSourceScanExec and proceeds roughly in the following steps:
1. First, compute maxSplitBytes as follows:
```
maxSplitBytes = min(defaultMaxSplitBytes, max(openCostInBytes, totalBytes/defaultParallelism))
```
Here defaultMaxSplitBytes is spark.sql.files.maxPartitionBytes, default 128 MB; openCostInBytes comes from the spark.sql.files.openCostInBytes setting, default 4 MB; totalBytes is the sum of the sizes of the files being read, plus one extra openCostInBytes per file; and defaultParallelism is spark.default.parallelism.
2. Traverse all the files and cut them into PartitionedFiles of at most maxSplitBytes each. This is somewhat like FileInputFormat's getSplits: each file yields at least one PartitionedFile, and a file several times the size of maxSplitBytes is split into several.
3. Sort all the resulting PartitionedFiles by size in descending order, then pack them one by one into FilePartitions (whose size cap is maxSplitBytes). This is really a bin-packing problem, solved here with Next Fit Decreasing, a fairly simple algorithm; see [here](http://personal.morris.umn.edu/~mcquarrb/teachingarchive/M1001/Resources/BinPacking.html) for details. I suspect Next Fit Decreasing was chosen because, first, it is simple (a bin that has been closed never needs to be reopened) and, second, it spreads the data fairly evenly. This step has no counterpart in the original MapReduce API; its main effect is to merge small files so that the overall partition count stays under control. (A small simulation of the three steps follows below.)
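To make the three steps concrete, here is a rough, self-contained simulation of the sizing and packing logic. The variable names mirror the configs mentioned above, and all input values are made-up assumptions:

```scala
// Step 1: compute maxSplitBytes (defaults as described in the post).
val defaultMaxSplitBytes = 128L * 1024 * 1024  // spark.sql.files.maxPartitionBytes
val openCostInBytes      = 4L * 1024 * 1024    // spark.sql.files.openCostInBytes
val defaultParallelism   = 200                 // spark.default.parallelism
val fileSizes = Seq(300L, 120L, 5L, 5L, 5L).map(_ * 1024 * 1024)  // hypothetical files

val totalBytes    = fileSizes.map(_ + openCostInBytes).sum
val maxSplitBytes = math.min(defaultMaxSplitBytes,
                             math.max(openCostInBytes, totalBytes / defaultParallelism))

// Step 2: cut each file into pieces of at most maxSplitBytes.
val splits = fileSizes.flatMap { size =>
  (0L until size by maxSplitBytes).map(offset => math.min(maxSplitBytes, size - offset))
}

// Step 3: Next Fit Decreasing: sort descending, open a new bin whenever
// the current bin cannot take the next piece.
val bins = scala.collection.mutable.ArrayBuffer(
  scala.collection.mutable.ArrayBuffer.empty[Long])
var used = 0L
for (s <- splits.sortBy(-_)) {
  if (used + s > maxSplitBytes && bins.last.nonEmpty) {
    bins += scala.collection.mutable.ArrayBuffer.empty[Long]
    used = 0L
  }
  bins.last += s
  used += s
}
println(s"maxSplitBytes = $maxSplitBytes, simulated partitions = ${bins.size}")
```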
Based on this flow, we can draw the following conclusions:
- In the common case, with all parameters at their defaults, the number of partitions in an RDD produced by spark.read comes out very close to spark.default.parallelism: somewhat above it, but close.
- With a large number of small files, spark.read does a lot of merging, thanks to the bin-packing step at the end of the flow, so overall it produces far fewer partitions than the MapReduce API would.
- To reduce the number of partitions, raise spark.sql.files.maxPartitionBytes, which can be set dynamically at runtime:
```scala
spark.sql("set spark.sql.files.maxPartitionBytes = 536870912")
```
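Equivalently, assuming a live SparkSession named spark, the same setting can be applied through the runtime configuration API:

```scala
// 512 MB per partition; the same value as in the SQL form above.
spark.conf.set("spark.sql.files.maxPartitionBytes", 536870912L)
```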
### Data distribution vs. the number of RDD partitions
Once data has been read out of HDFS into an RDD, there is no fixed relationship between the partition count and an even distribution of the data; it still depends on the file format. Say a directory holding 500 MB of data is read into 10 partitions: will each partition hold roughly 50 MB? For text files, probably more or less; for a format like Parquet it is another matter. If that directory contains only 5 Parquet files, then only 5 of the 10 partitions will hold data and the other 5 will be empty.
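One way to check this empirically is to count the rows that land in each partition; a sketch, with a hypothetical path:

```scala
val df = spark.read.parquet("hdfs:///tmp/my_table")  // hypothetical path
df.rdd
  .mapPartitionsWithIndex { case (i, rows) => Iterator((i, rows.size)) }
  .collect()
  .foreach { case (i, n) => println(s"partition $i: $n rows") }
```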
| 46.112245 | 355 | 0.868998 | yue_Hant | 0.844093 |
71d29e75d83da22f91527a1cfed0f0416aab41d6 | 184 | md | Markdown | CHANGELOG.md | mikeysight/bankfind | b9cfe5eae60168fab427d726dd1a081bd0c24a62 | ["MIT"] | 1 | 2021-02-18T11:47:35.000Z | 2021-02-18T11:47:35.000Z | CHANGELOG.md | dpguthrie/bankfind | b9cfe5eae60168fab427d726dd1a081bd0c24a62 | ["MIT"] | null | null | null | CHANGELOG.md | dpguthrie/bankfind | b9cfe5eae60168fab427d726dd1a081bd0c24a62 | ["MIT"] | 2 | 2021-05-14T16:27:12.000Z | 2022-02-23T07:41:27.000Z |
# Changelog
## 0.0.1
- Ability to hit 5 separate endpoints (institutions, failures, history, locations, and summary) as well as filter and search according to the API specifications
| 30.666667 | 160 | 0.771739 | eng_Latn | 0.995512 |
71d31ef70e18c7874b56e88064feb8a7974f1f4a | 289 | md | Markdown | content/pages/theme/jekyll-hpstr.md | eladroz/many-docs-search | d36c007f2838faf3661730bd1eae2710e27f669c | ["MIT"] | 2 | 2022-03-13T18:43:54.000Z | 2022-03-30T16:18:42.000Z | content/pages/theme/jekyll-hpstr.md | eladroz/many-docs-search | d36c007f2838faf3661730bd1eae2710e27f669c | ["MIT"] | 5 | 2022-03-31T11:11:09.000Z | 2022-03-31T11:11:25.000Z | content/pages/theme/jekyll-hpstr.md | eladroz/many-docs-search | d36c007f2838faf3661730bd1eae2710e27f669c | ["MIT"] | 1 | 2022-03-30T16:22:47.000Z | 2022-03-30T16:22:47.000Z |
---
layout: JamstackTheme
title: HPSTR
github: https://github.com/mmistakes/hpstr-jekyll-theme
demo: https://mmistakes.github.io/jekyll-theme-hpstr/
author: Michael Rose
ssg: Jekyll
date: 2013-08-23T19:03:11.000Z
description: A Jekyll theme with some tumble-log tendencies.
stale: true
---
| 26.272727 | 60 | 0.775087 | kor_Hang | 0.229968 |
71d430ada8da6979121c3e482764332b618dc777 | 57 | md | Markdown | CONTRIBUTING.md | JamesWoolfenden/learn-terraform | 58736fa5ac439ee9a3a0fcfc44722641d4b14ef1 | ["Apache-2.0"] | 2 | 2020-09-14T17:56:03.000Z | 2020-09-27T22:22:49.000Z | CONTRIBUTING.md | JamesWoolfenden/learn-terraform | 58736fa5ac439ee9a3a0fcfc44722641d4b14ef1 | ["Apache-2.0"] | null | null | null | CONTRIBUTING.md | JamesWoolfenden/learn-terraform | 58736fa5ac439ee9a3a0fcfc44722641d4b14ef1 | ["Apache-2.0"] | null | null | null |
# Contributing
I'll gladly accept PRs to fix or extend.
| 14.25 | 40 | 0.754386 | eng_Latn | 0.998919 |
71d57be8bb9fbff0df98bcee013e242daf1a656d | 962 | md | Markdown | config/setup/README.md | oehrlis/orapwd | 32dfce42080c64263c090a9ca5de23946588c8c4 | ["Apache-2.0"] | 1 | 2022-03-30T14:08:16.000Z | 2022-03-30T14:08:16.000Z | config/setup/README.md | oehrlis/orapwd | 32dfce42080c64263c090a9ca5de23946588c8c4 | ["Apache-2.0"] | null | null | null | config/setup/README.md | oehrlis/orapwd | 32dfce42080c64263c090a9ca5de23946588c8c4 | ["Apache-2.0"] | null | null | null |
# Setup Scripts
This folder contains all files executed after the database instance is initially
created. Currently only bash scripts (.sh) as well as SQL scripts (.sql) are supported.
- [00_config_db.sql](00_config_db.sql) Script to configure the DB e.g init
parameter and more.
- [01_enable_unified_audit.sh](01_enable_unified_audit.sh) Script to enable
unified audit.
- [01_run_datapatch.sh](01_run_datapatch.sh) Script to run the datapatch and install
release updates.
- [02_create_scott_pdb1.sql](02_create_scott_pdb1.sql) Script to create the
*SCOTT* schema.
- [03_create_tvd_hr_pdb1.sql](03_create_tvd_hr_pdb1.sql) Main script to create
the *TVD_HR* schema in PDB1.
- [04_config_audit_cdb.sql](04_config_audit_cdb.sql) Script to configure unified
audit for CDB.
- [04_config_audit_pdb1.sql](04_config_audit_pdb1.sql) Script to configure unified
audit for PDB1.
- [05_clone_pdb1_pdb2.sql](05_clone_pdb1_pdb2.sql) Script to clone PDB1 to PDB2.
| 45.809524 | 84 | 0.795218 | eng_Latn | 0.838091 |
71d66d522155127f9bc53d153f980143a5c9351f | 59 | md | Markdown | README.md | amalhanaja/Local-Weather | 910eab3b6584848f5ac685129a41f97d46a145d3 | ["MIT"] | null | null | null | README.md | amalhanaja/Local-Weather | 910eab3b6584848f5ac685129a41f97d46a145d3 | ["MIT"] | null | null | null | README.md | amalhanaja/Local-Weather | 910eab3b6584848f5ac685129a41f97d46a145d3 | ["MIT"] | null | null | null |
# Local-Weather
Local Weather Project for FreeCodeCamp.com
| 19.666667 | 42 | 0.830508 | eng_Latn | 0.643851 |