I already mentioned that it's [Lisp In Summer Projects](http://lispinsummerprojects.org/) time, and called it
>...like a multi-month [NaNoWriMo](http://www.nanowrimo.org/) with parentheses instead of character development and sleep.
> --Inaimathi
The project I picked out wasn't something I'd ever talked about here before. It was something [a friend](http://chillier17.deviantart.com/) of mine and I were thinking of putting together to make his hobby/job slightly easier. He makes tabletop games, you see. Mostly card stuff, but potentially other tabletop stuff too, from what I gather. This is to be the running journal of my project as I'm writing it. I'm not too worried about it denting my motivation, because I don't intend to publish it until I'm done.
To be perfectly honest with you, Lisp wouldn't be my first choice for this. I mean, the language is always a good thing to have in your corner, but there's a notable lack of battle-tested, feature-full, asynchronous web-servers for it. There's [Antiweb](http://hoytech.com/antiweb/), if you want to get down into the nuts and bolts of the system and serve things at the most efficient possible rate, at the expense of more complex installation and configuration. There's [Wookie](http://wookie.beeets.com/), if you don't particularly care about speed or security. There's [sw-http](http://www.cliki.net/sw-http), which I've been [warned off of directly](http://stackoverflow.com/questions/9388893/details-about-application-finder-fn-in-sw-http). And there's [Hunchentoot](http://weitz.de/hunchentoot/) if you don't care about being asynchronous. There's no Common Lisp equivalent to [Warp](http://hackage.haskell.org/package/warp) or [Yaws](http://hyber.org/) or [Tornado](http://www.tornadoweb.org/en/stable/), and I'm fairly comfortable with each of them, so if not for this contest, this probably would have been a [Haskell](http://www.haskell.org/haskellwiki/Haskell)/[Elm](http://elm-lang.org/), or perhaps [Erlang](http://www.erlang.org/)/[Angular.js](http://angularjs.org/) project rather than a [CL](http://www.cliki.net/)/[Parenscript](http://common-lisp.net/project/parenscript/) one.
I'm not *too* worried. The only part of this system that capital N *needs* to be asynchronous is the SSE handler I'll be using for browser pushes, and I'm fairly confident I'll be able to tweak Hunchentoot slightly to offload those onto a single, dedicated thread rather than keeping each one running in its own.
### The Approach
I want to battle-test some of my own ideas. Starting with the front-end/back-end separation I've been on about for a while, and continuing with some notions I've had about self-documenting APIs. To that end, `deal` is going to be a pair of projects. A game server implementation which will huddle behind [nginx](http://wiki.nginx.org/Main), deal with the application requests, and whose handler definitions are going to be simple enough that you'll actually be able to read them. And a reference-implementation of a web UI that will communicate with that server and do useful things in a browser.
Now then, without further ado.
### The Journal
## Day One
So here's the minimal amount of stuff we need to model in order to be a useful play-testing tool:
- cards
- collections of cards *(I'm going with "stack" for the moment)*
- hands *(different from stacks in that they're not on the table, but being held by players)*
- players
- die-rolls/coin-flips
- counters/notes
And we need to be able to interact with each one in a variety of ways.
- rotate/move/flip cards and collections *("rotate" as in "on an axis", "flip" as in "from face-up to face-down or vice-versa")*
- play *(either face up or face down)*
- play to *(onto a stack rather than onto the board directly)*
- pick up
- shuffle *(this one just applies to a stack)*
- peek *(a player looks at n cards of a given stack)*
- show *(all players see n cards of a given stack)*
- re-arrange *(peek at `n` cards and put them back in an order you specify)*
Each line of that second group needs to be a handler. Each line of the first group needs to be represented somewhere. Despite my confidence, I'm not *entirely* sure I won't be porting away from Hunchentoot if hacking SSE support into it turns out to be too difficult, so I'd rather define a little sub-language for handler definitions than call `define-easy-handler`s manually. While I'm at it, let that mini-language take type-hints so I don't have to deal with chucking strings around myself. The initial version of `define-handler` does simple type conversion, and thinly wraps `define-easy-handler`
```lisp
(defmacro define-handler ((name &key (default-type :integer)) (&rest args) &body body)
(let ((opts `(,name :uri ,(concatenate 'string "/" (string-downcase (symbol-name name))))))
(if (not args)
`(define-easy-handler ,opts nil (encode-json (progn ,@body)))
(flet ((type-exp (arg type)
(case type
(:integer `(parse-integer ,arg))
(:string arg)
(:keyword `(intern (string-upcase ,arg) :keyword)))))
(let ((type-conversion (mapcar (lambda (a)
(if (atom a)
(list a (type-exp a default-type))
(list (car a) (type-exp (first a) (second a)))))
args))
(final-args (mapcar (lambda (a) (if (atom a) a (car a))) args)))
`(define-easy-handler ,opts ,final-args
(let ,type-conversion
(encode-json (progn ,@body)))))))))
(defmacro define-game-handler ((name &key (default-type :integer)) (&rest args) &body body)
`(define-handler (,name :default-type ,default-type) (game-id ,@args) ,@body))
```
And it lets me write things like
```lisp
(define-handler (play/move) (thing-id x y z rot)
(list :moving thing-id :to x y z rot))
```
and have it mean `"give me the Integers thing-id, x, y, z and rot, and I'll give you a JSON-encoded response of the list ':moving thing-id :to x y z rot'"`.
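For concreteness, hand-expanding that `play/move` definition through the macro above gives roughly the following (tidied up for readability; the real macroexpansion output is noisier):

```lisp
;; Approximate hand expansion of the play/move handler above.
;; Each :integer-typed argument gets a parse-integer conversion,
;; and the body's result is JSON-encoded on the way out.
(define-easy-handler (play/move :uri "/play/move") (thing-id x y z rot)
  (let ((thing-id (parse-integer thing-id))
        (x (parse-integer x))
        (y (parse-integer y))
        (z (parse-integer z))
        (rot (parse-integer rot)))
    (encode-json (progn (list :moving thing-id :to x y z rot)))))
```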
That's it for day one.
## Day 3
I skipped one. In truth, this is a few days later, and I *have* been throwing hours/half-hours at the problem in the meantime, but haven't managed to get any dedicated time in.
The type annotations are a good idea here, I think. Even in a dynamically typed language, you want to surround any kind of outside input with `[assert](http://www.lispworks.com/documentation/HyperSpec/Body/m_assert.htm#assert)`ions or similar, and being able to read the types is going to help people trying to interact with your API. The separate handler definition macro for tables was a misstep though. All it actually does at this point is provide a `with-lock` around the body, and add an invisible argument of `(table :table)` to whatever you define with it.
```lisp
(defmacro define-server-handler ((name) (&rest args) &body body)
"Specifically defines handlers dealing with global server state.
Shares similarity with define-table-handler (if another one comes along, we'll abstract this)"
`(define-handler (,name) ,args
(with-lock-held ((lock *server*))
,@body)))
```
The first is bad because you don't always want a lock with a `table`. For instance, when you're serving up an `EventSource`, it would be a phenomenally bad idea to keep a lock on the related table. The second is bad because we're trying to make this self-documenting. Which means that, while invisible arguments are going to save some typing, they'll be just a little bit more annoying to any front-end developers who try to develop against our server. So, this has to go.
There's also the point that my existing type annotations aren't saving me as much work as they could be. Specifically, whenever I ask for a `foo-id`, I end up looking it up in the appropriate place; either `(things table)`, or possibly `(hand *player*)`<a name="note-Sun-Aug-25-220244EDT-2013"></a>[|1|](#foot-Sun-Aug-25-220244EDT-2013), then assert that the thing coming out of the lookup is the sort of `thing` I'm expecting, then I do something to that `thing`. The "type" system really should be able to do this for me.
```lisp
(defun type-exp (arg type)
"Given a symbol name and a type, returns the expression to read that type from a string"
(match type
(:string nil)
(:int `(parse-integer ,arg))
(:json `(decode-json-from-string ,arg))
((or :keyword :facing)
`(intern (string-upcase ,arg) :keyword))
(:table
(lookup-exp arg '(private-tables *server*) '(public-tables *server*)))
((or :stack :flippable :placeable
(list :card :from-table))
(lookup-exp arg '(things table)))
((list :card :from-hand)
(lookup-exp arg '(hand *player*)))
(_ (error "Invalid type label: '~a'" type))))
(defun lookup-exp (arg &rest places)
(with-gensyms (sym)
`(let ((,sym (intern ,arg :keyword)))
(or ,@(loop for p in places
collect `(gethash ,sym ,p))))))
(defun lookup-assn (arg type)
(match type
(:table `(assert ,arg))
(:stack `(assert (typep ,arg 'stack)))
(:facing `(assert (or (eq ,arg :up) (eq ,arg :down))))
(:placeable `(assert (typep ,arg 'placeable)))
(:flippable `(assert (typep ,arg 'flippable)))
((list :card _) `(assert (typep ,arg 'card)))
(_ nil)))
(defun type-pieces (args)
"Takes a list of arguments and returns three values:
- The conversion expressions
- The names (for use as final args)
- The lookup assertions"
(loop for (name type) in args
when (aif (type-exp name type) (list name it)) collect it into convs
collect name into as
when (lookup-assn name type) collect it into assrs
finally (return (values convs as assrs))))
(defmacro define-handler ((name) (&rest args) &body body)
"Defines handlers with an eye for self-documentation, DRY and portability"
(let ((opts `(,name :uri ,(concatenate 'string "/" (string-downcase (symbol-name name))))))
(if (not args)
`(define-easy-handler ,opts nil (encode-json (progn ,@body)))
(multiple-value-bind (type-conversion final-args lookup-assertions) (type-pieces args)
`(define-easy-handler ,opts ,final-args
(assert (and ,@final-args))
,(if type-conversion
`(let* ,type-conversion
,@lookup-assertions
(encode-json ,@body))
`(progn ,@lookup-assertions
(encode-json ,@body))))))))
```
And it can.
The other thing I'm finalizing is the `id` system. An earlier crack just had each component keep count of its contents and assign that as the next id. There are a few obvious problems with this. Firstly that it would result in duplicate `id`s sometimes. Secondly, unless I wanted to update the item `id` every time I moved the item, this would mean a global counter in `*server*`, which would mean a lock on the whole server any time anything changed play zones. The change I ended up making is just using `[gensym](http://www.lispworks.com/documentation/HyperSpec/Body/f_gensym.htm#gensym)`. Ordinarily, I wouldn't *but*: these `id`s don't need to be cryptographically random, they just need to be unique with respect to all other active `id`s. Of course, doing it this way is going to run me up against potential problems when I get to loading games from disk storage, but that's a pretty long way off. Anyhow, as a result, all the `foo-id` and `id` fields are now `keyword`s rather than `integer`s.
## Day 4
First stab at the interface. And by "first stab", I mean "stupid basic interface that quote renders end-quote things by echoing them to console". It's nowhere near complete, but it's already enough to iron out a wrinkle or two. Specifically, I've had to go back through the model and change every `belongs-to` slot to expect an ID rather than a pointer to a `player`. It became obvious that this was necessary when I got memory-use warnings followed by a crash when I tried to "render" a card. `encode-json-to-string` doesn't like circular references, you see.
Now that everything uses IDs, there's one semi-obvious good thing about it: it'll make putting together the front-end much easier. Because the IDs are now globally unique, I can use them as a `class` tag in the `DOM` to identify objects on the board. That'll let me update the state of a lot of things in the UI without having to re-render very much at all.
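As a sketch of what that buys (a hypothetical helper, not part of the actual UI code): with globally unique ids doubling as DOM hooks, a server-side change to one `thing` can be reflected by touching just that element.

```lisp
;; Sketch only: with a thing's id used as its DOM class tag, a move
;; can be applied in place instead of re-rendering the whole board.
(defun move-rendered-thing (thing-id x y)
  (chain (j-query (+ "." thing-id))
         (css (create :left (+ x "px") :top (+ y "px")))))
```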
## Day 6
I've been refining the model a bit to take into account some of the games I'll want to model for this project. There's also a slightly revised `define-handler` macro that stores information about any `handler`s it `define`s, which then gets served through the `list-handlers` handler. That'll make certain parts of the front-end easier to put together.
Not much work other than that, sadly. I'm still moving forward in increments of an hour or half-hour at the outside. What I *have* been able to do is read through pieces of the [Hunchentoot](http://weitz.de/hunchentoot/) code to try figuring out how, exactly, to hack conditional SSE support into it. Near as I can tell, I'll need to define a `:before` method for [`handle-request`](https://github.com/edicl/hunchentoot/blob/master/acceptor.lisp#L563-L582) and then figure out how to let its call chain know not to terminate the appropriate socket stream. Something else has occurred to me though. Because there's really only one handler I'm going to need to be served asynchronously, *and* that handler will *only* serve up public information, a reasonably simple approach here might be to just off-load SSE serving to [something](http://hackage.haskell.org/package/warp) better [suited](http://hyber.org/server_sent_events.yaws) for [it](http://nic.ferrier.me.uk/blog/2012_08/elnode-nears-1-point-0), [specifically](https://github.com/ztellman/aleph). Yet another approach, since I'm considering [aleph](https://github.com/ztellman/aleph), is to just write the whole thing in Clojure to begin with...
## Fatherly Interlude
My son is at a stage where everything he gets his hands on automatically goes in his mouth. Food, toys, cats, carpet, the computer I got him to paw at. Everything. He's also gotten to teething, which seems to be a very painful experience judging from his vocal emissions.
## Day 9
The past few days have been mostly prospective development and a little thought about secrecy. The end result is going to be some minor mechanical changes to how `id`s function, and they won't be shown for cards inside stacks.
Let me try to take you through it. What I was thinking earlier is that I can just assign a canonical ID to each `thing` that needs to go on the table. The trouble with that approach is that it canonically identifies a `thing`. So, for example, if you take a card from the table, put it into a stack, shuffle that stack, and then play a card face-down, it will be possible for each player to tell whether it's the same card. If it has the same `id` as the starting card, everyone knows what it is, otherwise, no one knows what it is but they can at least knock one option out of the possibility space.
This is not what you want.
The default for that situation is that no one should know what the card is, or have any additional information about it. There are two ways to solve this:
1. We could create canonical `id`s for everything, but display a salted+hashed version to the front end, changing out the salt whenever the zone of play changes. That would let us keep a single `id` in the back-end, but it would keep everything reasonably anonymous to the front end. It seems kind of expensive, and complicated, and not particularly useful in any case.
1. We could assign a new `id` to a `thing` when it crosses play zones. So, for example, when you `play` a card, it gets a new `id` while in play. If you then put it into a stack, it gets a new `id` while there. If you play it face-down out of the stack again, that's a third in-play `id`.
We don't actually need a central way of addressing a given `thing`. Or, at least, we don't yet, so I'm inclined to go for this second option. Remember, we generate `id`s through `gensym`, which is a pretty cheap computation as far as I know. We could, of course, keep our own global counter as part of `*server*`, but I'll see if that's necessary later. What I might want to do at the moment is name the function `make-id` just to make it a bit simpler to change if we end up needing to.
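A minimal `make-id` along those lines might look like this (a sketch; the actual definition may differ):

```lisp
;; Sketch of the id generator described above: a fresh keyword per
;; call, unique with respect to other live ids but in no way
;; cryptographically random.
(defun make-id ()
  (intern (symbol-name (gensym "G")) :keyword))
```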
## Day 10
I've been thinking about the SSE situation, and it occurred to me that since
- I'd only need one SSE channel per game
- It would contain only public data
- I would want it to support spectators *(and therefore wouldn't want to restrict access to it)*
- I plan to deploy Deal by running a reverse-proxy from [nginx](http://wiki.nginx.org/Main)
it wouldn't be a bad idea to off-load that particular handler onto nginx itself. The ideal situation would be one where I could just serve up a file per game as the "stream", then keep appending to it from within Deal. That *doesn't* seem to be trivially possible, but nginx *does* have an optional, production-ready [push_stream_module](https://github.com/wandenberg/nginx-push-stream-module) licensed under [GPL3](http://gplv3.fsf.org/). That's something to consider, since it would really only take a bit of configuration twiddling as opposed to actual code to get this up-and-running.
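For the record, the nginx side of that would be on the order of the following (an untested sketch; the directives come from the push_stream module's documentation, and the location paths are placeholders):

```nginx
# Sketch of a push_stream setup for Deal's public game streams.
location /pub {
    # only the Deal process itself should be allowed to publish
    allow 127.0.0.1;
    deny all;
    push_stream_publisher admin;
    push_stream_channels_path $arg_id;
}

location ~ /sub/(.*) {
    # public, spectator-friendly subscription endpoint
    push_stream_subscriber eventsource;
    push_stream_channels_path $1;
}
```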
## Day 12
Ok, I'm ignoring the SSE question for now; we don't really have any call for it until I get enough of a front-end together to support more than one player in any case. That's proceeding apace. I've been thinking about how to approach this task; should I abstract as much and as aggressively as possible, or should I keep it plain, straightforward and stupid? Typically, I go for the second option if I can help it at all, but I decided to go the opposite way this time. Here's a list of utilities I defined. Mostly thin wrappers around existing [jQuery](http://jquery.com/) constructs, and two *very* tasty pieces of syntactic sugar to help me define things.
```lisp
(in-package #:deal-ui)
(defparameter *debugging* t)
(defpsmacro log (&body body)
(when *debugging*
`(chain console (log ,@body))))
;;;;;;;;;; JS Basics
(defpsmacro obj->string (thing)
`(chain -j-s-o-n (stringify ,thing)))
(defpsmacro string->obj (thing)
`(chain j-query (parse-j-s-o-n ,thing)))
(defpsmacro fn (&body body) `(lambda () ,@body))
;;;;;;;;;; jQuery Basics
(defpsmacro $ (selector &body chains)
`(chain (j-query ,selector) ,@chains))
(defpsmacro doc-ready (&body body)
`($ document (ready (fn ,@body))))
(defpsmacro $map (lst &body body)
`(chain j-query (map ,lst (lambda (elem i) ,@body))))
(defpsmacro $post (uri arg-plist &body body)
`(chain j-query
(post ,uri (create ,@arg-plist)
(lambda (data status jqXHR)
(let ((res (string->obj (@ jqXHR response-text))))
,@body)))))
(defpsmacro $droppable (target &rest class/action-list)
`($ ,target (droppable
(create
:drop (lambda (event ui)
(let ((dropped (@ ui helper context)))
;; not sure if this should be a cond or a list of independent whens
(cond ,@(loop for (class action) in class/action-list
collect `(($ dropped (has-class ,class)) ,action)))))))))
(defpsmacro $draggable (target (&key revert) &body body)
`($ ,target (draggable (create :stop (lambda (event ui) ,@body) :revert ,revert))))
;;;;;;;;;; Define client-side ajax handlers
(defpsmacro define-ajax (name uri arg-list &body body)
`(defun ,name ,arg-list
(log *current-table-id* ,@arg-list)
($post ,uri (:table *current-table-id* ,@(args->plist arg-list))
,@body)))
;;;;;;;;;; Defining markup/behavior hybrids made easier
(defun expand-self-expression (form self-elem)
(flet ((recur (frm) (expand-self-expression frm self-elem)))
(cond ((null form) nil)
((atom form) form)
((and (eq 'self (car form)) (eq (second form) 'position))
(recur '(+ "top:" (self y) "px;" "left:" (self x) "px;" "z-index:" (self z) ";" "transform:rotate(" (self rot) "deg)")))
((eq 'self (car form))
`(@ ,self-elem ,@(cdr form)))
((atom (car form))
(cons (car form) (recur (cdr form))))
((listp (car form))
(cons (recur (car form))
(recur (cdr form)))))))
(defpsmacro define-thing (name markup &body behavior)
(deal::with-gensyms (thing container)
`(defun ,(intern (format nil "create-~a" name)) (container thing)
(let* ((,thing thing)
(,container container)
(css-id (+ "#" (@ ,thing id))))
($ ,container (append (who-ps-html ,(expand-self-expression markup thing))))
,@(loop for clause in behavior
collect (expand-self-expression clause thing))))))
```
The first bunch already kind of got addressed [last time I talked about parenscript](/posts/javascript-with-a-lisp). Some newcomers include sugar for using map, draggables and droppables in a simpler way than the default jQuery UI package allows for
```lisp
(defpsmacro $map (lst &body body)
`(chain j-query (map ,lst (lambda (elem i) ,@body))))
(defpsmacro $droppable (target &rest class/action-list)
`($ ,target (droppable
(create
:drop (lambda (event ui)
(let ((dropped (@ ui helper context)))
;; not sure if this should be a cond or a list of independent whens
(cond ,@(loop for (class action) in class/action-list
collect `(($ dropped (has-class ,class)) ,action)))))))))
(defpsmacro $draggable (target (&key revert) &body body)
`($ ,target (draggable (create :stop (lambda (event ui) ,@body) :revert ,revert))))
```
All of the correspondingly wrapped structures suffer from the same syntactic problem; they want you to pass them a function, but that function will always get the same arguments passed to it. In plain JS, you can't really bust out of this pattern without using `eval`. Which you shouldn't do. If you're dealing with JS through a language like Lisp though, you can just define macros like these to take the appropriate `body` arguments and then drop the appropriate `lambda`s around them. As long as you remember what the arguments *are*, that frees you from having to check the documentation on their order every goddamn time you write any serious front-end JavaScript.
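To see the sugar at work, here's one level of expansion of a `$draggable` call (a made-up usage; `$` itself still has to expand into the underlying `chain` form):

```lisp
;; A call like this:
($draggable "#some-card" (:revert t)
  (log "drag finished at" (@ ui offset left) (@ ui offset top)))

;; expands into the full jQuery UI incantation, with the body
;; dropped into the :stop callback's lambda for you:
($ "#some-card"
   (draggable (create :stop (lambda (event ui)
                              (log "drag finished at"
                                   (@ ui offset left) (@ ui offset top)))
                      :revert t)))
```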
`define-thing` and `define-ajax` are more complex constructs. The second one is a way for me to define connecting functions between the front-end and the back end. Specifically, it lets me say things like
```lisp
(define-ajax show-table "/show-table" ()
  (render-board res))
```
That'll do exactly what you think it should; send an ajax request to the uri `/show-table`, then pass the JSON-parsed result to `render-board`. `define-thing` buys me more of the same, except it's good for defining local UI components rather than asynchronous handlers. Here's an example
```lisp
(define-thing stack
(:div :id (self id)
:class (+ "stack" (when (= (self face) "down") " face-down"))
:style (self position)
:title (self id)
(:button :class "draw" "Draw")
(:div :class "card-count" (+ "x" (self card-count))))
($draggable css-id ()
(move (self id) (@ ui offset left) (@ ui offset top) 0 0))
($ (+ css-id " .draw") (click (fn (draw (self id) 1)))))
```
Note that this makes use of the `$` and `$draggable` macros as well. What this does is sugar-coat the definition of a function called `create-stack`, which will take a container selector and a JSON object and
1. slot the object into that markup specification
1. append the result to the given container
1. run the behavior applying code on the newly formed element
I'm still considering having the macro itself add the declaration of `:id (self id)`, because I do that literally everywhere. The only other interesting part is that this macro goes through the trees you pass it and expands anything that looks like `(self foo)` into something that looks like `(@ self foo)`, which is how you're supposed to index things in Parenscript. It also special-cases `(self position)` into the complete CSS style rule, making sure that the `x`, `y`, `z` and `rot` slots are reflected in the relevant CSS properties.
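To make that rewrite concrete, here's what it does to one of the behavior clauses above (the actual macro binds the thing object to a gensym; I'm writing it as `thing` for readability):

```lisp
;; The clause as written in the define-thing body:
($ (+ css-id " .draw") (click (fn (draw (self id) 1))))

;; After expand-self-expression, with the thing object bound to THING:
($ (+ css-id " .draw") (click (fn (draw (@ thing id) 1))))
```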
That's that for now. Hopefully, I can 0.1 this thing fairly soon, and finally publish part one of this journal. I *was* going to wait 'till the end, but it looks like the complete document will be far too long to publish at once.
## Day 37
Kind of a big jump this time. Haven't really had the chance to do stuff related to this project lately. My time's been getting filled with extremely interesting, lispy things that I'm unfortunately not allowed to tell you about. Yet. Hopefully, I can convince the correct humans to let me publish some or all of it in the near future.
I've implemented the [session](http://weitz.de/hunchentoot/#sessions) system, which actually lets multiple people sit down at a single table and play together. That's basically it. I've been thinking about what I want the join/new-game interface to look like, but at this point that's all it'll have to be. An interface. The hard part is more or less done. There's one big architectural question I have to answer, and one big feature I need to properly implement, and then I can move on to the task of making the UI pretty, and maybe build some basic tools for deck construction as well as playing.
#### The Big Architectural Decision
Is whether to explicitly represent stacks in the final model. It *kind* of makes sense, given that you don't want anyone to know where cards actually get shuffled to, so it's possible to conceptualize "in a stack" as a state change for the card on the table. It still doesn't work that way in real life. You can take a bunch of cards and stack them, but you never lose the ability to interact with each of them individually. There might be one or two things that either view of the world enables or prohibits, but it also seems that it'd be pretty straight-forward to switch between them later if I wanted to. Maybe this is one I hold off on until I see a direct need.
#### The Big Feature
Is data pushing.
Fuck, I had vaguely hoped that in the year 2013, [this](http://langnostic.blogspot.ca/2012/02/client-communication.html) would be a solved problem, but none of the options provided natively as part of the http/js/html stack are both simple and compatible with the thread-per-request model of serving up data. I'm still heavily leaning toward just using the [nginx](https://github.com/wandenberg/nginx-push-stream-module) stream module, given that this project's published data fits some specific criteria that would make a full public solution possible.
That's that. Once those are ironed out, I can finally post a one point oh and get people playing it. Oh, and get this piece published already so I can get on with the next one: taking it from "working" to "beautiful".
## Day 38
So the trivial part of the feature is done. It seems that the nginx stream module is easy to set up and get running properly. I haven't restricted publishing rights to `localhost` yet, but I can't imagine that'll be much more difficult to configure. Now comes the slightly harder part: defining the infrastructure inside of `deal` to publish to these streams and get new arrivals up and running. The basics will look *something* like
```lisp
(defmethod publish-move! ((table table) move &optional (stream-server *stream-server*))
(push move (history table))
(http-request
(format nil "~apub?id=~a" stream-server (id table))
:method :post :content move))
```
Except that I think I'm going to make `move` itself a JSON object just to make it easier to work with on the other end. The `*stream-server*` variable will then be set to the location of the nginx instance that handles stream serving for me. Note two things about this setup, incidentally:
- the nginx stream module natively handles multiple protocols. By default it uses a forever-frame, but it can be configured to expose *the same stream* as an EventSource<a name="note-Sun-Aug-25-220313EDT-2013"></a>[|2|](#foot-Sun-Aug-25-220313EDT-2013), and a web-socket, and a long-poll handler.
- the server handling my stream publishing doesn't have to be on the same machine as the rest of the application, which opens up some interesting hosting possibilities if scaling up ever gets to be *the* problem I'm staring down
It also really, truly looks like it'll be both more performant and much easier than trying to re-write pieces of Hunchentoot to support asynchronous requests in certain contexts.
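On the client side, subscribing to one of these streams would look something like the following Parenscript sketch (the `/sub/` path and `handle-stream-event` are placeholders for whatever the nginx config and UI end up using):

```lisp
(defun subscribe-to-table (table-id)
  ;; EventSource is the native browser API; the push_stream module
  ;; can expose the same channel in this format via configuration.
  (let ((src (new (-event-source (+ "/sub/" table-id)))))
    (setf (@ src onmessage)
          (lambda (event)
            ;; each published move arrives as a JSON-encoded object
            (handle-stream-event (string->obj (@ event data)))))
    src))
```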
## Day 41
I have no idea what happened, but I finally ended up getting a solid day to put stuff together for this project. As a result, I've got a pretty-much-playable edition sitting up on my server, waiting for a couple more edits before I unveil it, and this massive `Journal: Part One` I've had going. Right now, I'm in the guts of the `define-handler` mini-language, trying to get my pseudo-type-system to automatically solve the problems of argument bounding for me. That is, I want to be able to specify the min and max for various argument types and have it do the right thing. Specifically, I'd like to be able to specify minimum/maximum *values* for `:int`s, and minimum/maximum *lengths* for `:string`s.
The `:int` changes only come into play in the `new-table` handlers, and the dice-rolling system. I don't want people to start tables that seat fewer than 2 or more than 12. Also, I don't want people to be able to roll dice with two or fewer sides, or fewer than one of them. A `d2` is a coin-flip, which I have a separate handler defined for, and any less than that would be entirely too predictable. That's pretty obvious: assert that the incoming parameter fits within the specified range, and we're done.
The `:string` changes are going to be used as part of chatting and tagging. Chat messages are going to be delivered from the user to the named table/lobby, and tags are user-specified strings that will be applied either to themselves or games they start. Tags can be empty strings<a name="note-Sun-Aug-25-220319EDT-2013"></a>[|3|](#foot-Sun-Aug-25-220319EDT-2013), but chat messages need to be at least two characters. And *neither* thing should ever be longer than 255 characters<a name="note-Sun-Aug-25-220323EDT-2013"></a>[|4|](#foot-Sun-Aug-25-220323EDT-2013). Now, if I get a chat message shorter than I want it, that's obvious: just throw an error and do nothing.
But.
What do I do with a string *longer* than I want? There's two reasonable-sounding ways to handle that situation
1. **Error out**; after all, interface this user is piloting doesn't conforming to my API, so it should come as no surprise to anyone
1. **Truncat**; take a chunklet of whatever they sent small enough for my purposes, and proceed to fulfill the request with only the relevant data
Erring means chat messages get dropped, truncating means something goes out over the wire, even if it wasn't exactly what the user intended. Now that I think about it, it seems obvious that what you'd really want, as a user, is for the server to be hard-assed about it, but the front-end to tell you what's going on. In the interests of loose coupling, this means I actually want to specify that limitation in both places. Which works perfectly, because my server already emits the specifications for its handlers through [`/server-info` requests](https://github.com/Inaimathi/deal/blob/master/deal.lisp#L6-L12), and that will automatically include any mins/maxes I define in the relevant argument lines.
## Day 42
Basically, finished a bunch of the UI changes I needed to make in order to get this into a playable state. Not quasi-playable, not semi-playable, just plain playable. You can actually go [here](http://deal.inaimathi.ca/static/index.html) and use it for realzies. I mean, it's not *enjoyable* yet, and there's a lot of basic functionality still missing<a name="note-Sun-Aug-25-220327EDT-2013"></a>[|5|](#foot-Sun-Aug-25-220327EDT-2013), but you can actually go there with a couple of friends, start a game of [crazy eights](http://en.wikipedia.org/wiki/Crazy_Eights) or [something](http://www.pagat.com/climbing/asshole.html), and have an excellent chance of finishing it before anything crashes. If anything crashes, incidentally, [do report that](https://github.com/Inaimathi/deal/issues?state=open). Or patch it and send me a pull request.
I've got a bunch of things still in my head concerning where this project ought to go. Some of them involve more degrees of freedom in terms of the reference UI I've been putting together, one of them is a deck builder (which it'll need to fulfill the "prototyping" promise of this project), and another is a board position editor (which it'll need to fulfill the "prototyping" promise of this project for anything other than card games). Now that I think of it, those last to could possibly be combined elegantly. Hmm.
Anyhow, so concludes part one of my journal: `zero` to `playable`. Now I'll try to take it from `playable` to as close to `beautiful` as I can before the contest deadline is up.
Wish me luck.
* * *
##### Footnotes
1 - <a name="foot-Sun-Aug-25-220244EDT-2013"></a>[|back|](#note-Sun-Aug-25-220244EDT-2013) - This second one will change, incidentally. In the real system, this will be referencing a player record from the current users' session rather than the global one, so it's an even better idea to handle that in a macro rather than manually as part of each handler.
2 - <a name="foot-Sun-Aug-25-220313EDT-2013"></a>[|back|](#note-Sun-Aug-25-220313EDT-2013) - Which is basically a formally-specified forever-frame with direct JavaScript support in modern browsers.
3 - <a name="foot-Sun-Aug-25-220319EDT-2013"></a>[|back|](#note-Sun-Aug-25-220319EDT-2013) - The appropriate `id` gets used in that case so that there's an unambiguous way to refer to a player or game, if you're wondering, tags are just supposed to provide something human-readable.
4 - <a name="foot-Sun-Aug-25-220323EDT-2013"></a>[|back|](#note-Sun-Aug-25-220323EDT-2013) - Arbitrarily chosen. It's what all the cool kids were doing, and it serves my purposes well enough, so I went with it.
5 - <a name="foot-Sun-Aug-25-220327EDT-2013"></a>[|back|](#note-Sun-Aug-25-220327EDT-2013) - Such as changing your screen name, playing things face down, picking things up, coin-flips, dice rolls, and pretty much anything other than playing the standard 54-card deck.
| 82.312207 | 1,394 | 0.714388 | eng_Latn | 0.997706 |
0d82edf7e12f19eee0a849b2e6452f5b136fa471 | 147 | md | Markdown | about.md | ummjevel/ummjevel.github.io | 9cdb40e167ea17a98fedada466221022a1cfa621 | [
"MIT"
] | null | null | null | about.md | ummjevel/ummjevel.github.io | 9cdb40e167ea17a98fedada466221022a1cfa621 | [
"MIT"
] | null | null | null | about.md | ummjevel/ummjevel.github.io | 9cdb40e167ea17a98fedada466221022a1cfa621 | [
"MIT"
] | null | null | null | ---
layout: post
title: "about"
date: 2020-08-29
categories: about
permalink: /about/
tags: about
---
휴학생입니다.
매주 기록용으로 개설하였습니다.
현재는 방황중입니다.
| 9.1875 | 18 | 0.673469 | kor_Hang | 0.992564 |
0d8335f5994067fc722f1cfd2e3217b20c5bc831 | 3,302 | md | Markdown | README.md | jordipons/EUSIPCO2017 | 23485a145d6c7043c819ff59d7dd1a910168959b | [
"MIT"
] | 25 | 2017-07-03T17:25:06.000Z | 2021-11-16T02:36:48.000Z | README.md | jordipons/EUSIPCO2017 | 23485a145d6c7043c819ff59d7dd1a910168959b | [
"MIT"
] | null | null | null | README.md | jordipons/EUSIPCO2017 | 23485a145d6c7043c819ff59d7dd1a910168959b | [
"MIT"
] | 4 | 2018-11-16T12:58:48.000Z | 2019-10-13T10:21:03.000Z | Music auto-tagging experiments for the paper entitled "Timbre Analysis of Music Audio Signals with Convolutional Neural Networks" by Jordi Pons, Olga Slizovskaia, Rong Gong, Emilia Gómez and Xavier Serra.
We provide the code for data preprocessing, training and evaluation of our approach.
## Installation
Python scripts based on Lasagne-Theano for deep learning and Librosa for feature extraction.
Requires installing Lasagne-Theano (http://lasagne.readthedocs.org/en/latest/user/installation.html) and Librosa (https://github.com/librosa/librosa).
Lasagne is already in a /src folder, to install Theano do:
> sudo pip install --upgrade https://github.com/Theano/Theano/archive/master.zip
Dependencies: pandas, numpy and scipy.
## Steps for reproducing our results
1. Download the MagnaTagATune (MTT) dataset. See: http://mirg.city.ac.uk/codeapps/the-magnatagatune-dataset and https://github.com/keunwoochoi/magnatagatune-list
2. Copy the MTT datset to `./data/audio/MagnaTagATune/` in its original format: 16 folders, '0' to '9' and then 'a' to 'f'. It will look like: `./data/audio/MagnaTagATune/a/`, `./data/audio/MagnaTagATune/2/`, etc. Note that you will have to create the folder: `./data/audio/MagnaTagATune/`.
3. Go to src folder: `cd EUSIPCO2017/src/`
4. Run `python spectrograms.py`.
5. Run `python exp_setup.py`.
6. Run `python patches.py`.
7. Configure `train.py`:
1. Set `config['patches']` to be the folder generated by `patches.py`, ie: `patches/patches_dieleman_setup_eusipco2017_187_logC_elementWise_memory_1489752607/`.
2. Choose the architecture by setting `config['type']` to: `smallSquared` or `proposed` or `proposed2`.
8. Run `python train.py`. If you want to use your GPU, you probably want to run the following: THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32 python test_2.py
9. Configure `test.py`: set test_params['model_name'] to be the model created by `train.py`, ie: `dieleman_setup_eusipco2017_proposed2_v0_210270563616880246637098465086143809559` - without the extension.
10. Run `python test.py`.
## Additional information
You might want to use this code for your own porpuses. If this is the case, the following additional information might be useful for you.
### Scipts pipeline
`spectrograms.py` > `exp_setup.py` > `patches.py` > `train.py` > `test.py`
- `spectrograms.py`: computes spectrograms.
- `exp_setup.py`: splits data in train, val, test. Requires: previous run of 'spectrograms.py'.
- `patches.py`: computes patches and normalizes the data. Requires: previous run of 'exp_setup.py'.
- `train.py`: trains a deep learning model defined in 'builid_architecture.py'. Requires: previous run of 'patches.py'.
- `test.py`: evaluates how the trained model performs. Requires: previous run of 'train.py'.
### Folders structure
Root folders:
- `./data`: with audio, groundtruth and all intermediate files (spectrograms, patches and train/test results).
- `./src`: with core and static scripts.
Default data folders:
- `./data/audio/`: with the raw audio files.
- `./data/index/`: with pre-processed *(i)* 'index_file' and *(ii)* 'gt_file'.
When running the scripts throughout the pipeline, the following folders will be created:
- `./data/spectrograms/`
- `./data/exp_setup/`
- `./data/patches/`
- `./data/train/`
- `./data/test/`
| 58.964286 | 290 | 0.752574 | eng_Latn | 0.901615 |
0d83b6e17c1f79bf1431133bba9b34f22e3a4a21 | 4,057 | md | Markdown | brands/icon/leetcode.md | liuwave/icon-helper | 6339786f595d6f12a432db71e7aaccb693b9fe59 | [
"MIT"
] | 2 | 2020-10-26T16:32:26.000Z | 2021-01-13T09:28:45.000Z | brands/icon/leetcode.md | liuwave/icon-helper | 6339786f595d6f12a432db71e7aaccb693b9fe59 | [
"MIT"
] | 48 | 2020-04-04T12:39:59.000Z | 2022-02-27T01:30:09.000Z | brands/icon/leetcode.md | liuwave/icon-helper | 6339786f595d6f12a432db71e7aaccb693b9fe59 | [
"MIT"
] | null | null | null | ---
title: LeetCode(LeetCode) ICON转svg、png下载
name: leetcode
zhTips: LeetCode
search:
image: https://iconhelper.cn/svg/brands/leetcode.svg
---
# LeetCode <small style="font-size: 60%;font-weight: 100">LeetCode</small>
<div id="svg" class="svg-wrap">
<svg role="img" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg"><title>LeetCode icon</title><path d="M16.102 17.93l-2.697 2.607c-.466.467-1.111.662-1.823.662s-1.357-.195-1.824-.662l-4.332-4.363c-.467-.467-.702-1.15-.702-1.863s.235-1.357.702-1.824l4.319-4.38c.467-.467 1.125-.645 1.837-.645s1.357.195 1.823.662l2.697 2.606c.514.515 1.365.497 1.9-.038.535-.536.553-1.387.039-1.901l-2.609-2.636a5.055 5.055 0 0 0-2.445-1.337l2.467-2.503c.516-.514.498-1.366-.037-1.901-.535-.535-1.387-.552-1.902-.038l-10.1 10.101c-.981.982-1.494 2.337-1.494 3.835 0 1.498.513 2.895 1.494 3.875l4.347 4.361c.981.979 2.337 1.452 3.834 1.452s2.853-.512 3.835-1.494l2.609-2.637c.514-.514.496-1.365-.039-1.9s-1.386-.553-1.899-.039zM20.811 13.01H10.666c-.702 0-1.27.604-1.27 1.346s.568 1.346 1.27 1.346h10.145c.701 0 1.27-.604 1.27-1.346s-.569-1.346-1.27-1.346z"/></svg>
</div>
<detail full-name='leetcode'></detail>
<div class="detail-page">
<p>
<span><span class="badge-success badge">免费图标</span> <span class="badge-success badge">免费修改</span> <span class="badge-success badge">免费下载</span> </span>
<br/>
<span>
ICON库:
<span class="badge-secondary badge">品牌图标(Simple Icon)</span>
</span>
<br/>
<span>
CSS名称:
<span class="badge-secondary badge">leetcode</span>
</span>
<br/>
<span>
版本:
<span class="badge-secondary badge">2.8.0</span>
</span>
<br/>
<span>图标来源/作者:<span class="badge-light badge">Simple Icon</span></span>
<br/>
<span class="zh-detail">中文描述:<span class="badge-primary badge">LeetCode</span><span class="help-link"><span>帮助改进</span>(<a href="https://gitee.com/liuwave/icon-helper/edit/master/json/brands/leetcode.json" target="_blank" rel="noopener noreferrer">gitee</a><a href="https://github.com/liuwave/icon-helper/edit/master/json/brands/leetcode.json" target="_blank" rel="noopener noreferrer">github</a></span>)</span><br/>
</p>
</div><div class="description description alert alert-light"><p>图标来源地址:<a href="https://leetcode.com" target="_blank" rel="noopener noreferrer">https://leetcode.com</a></p></div>
<div class="alert alert-dark">
<img height="21" width="21" src="https://cdn.jsdelivr.net/npm/simple-icons@latest/icons/leetcode.svg" />
<img height="24" width="24" src="https://cdn.jsdelivr.net/npm/simple-icons@latest/icons/leetcode.svg" />
<img height="32" width="32" src="https://cdn.jsdelivr.net/npm/simple-icons@latest/icons/leetcode.svg" />
<img height="48" width="48" src="https://cdn.jsdelivr.net/npm/simple-icons@latest/icons/leetcode.svg" />
<img height="64" width="64" src="https://cdn.jsdelivr.net/npm/simple-icons@latest/icons/leetcode.svg" />
<img height="96" width="96" src="https://cdn.jsdelivr.net/npm/simple-icons@latest/icons/leetcode.svg" />
</div>
<div>
<p>可以像图片那样使用:
</p>
<div class="alert alert-primary" style="font-size: 14px">
<img height="32" width="32" src="https://cdn.jsdelivr.net/npm/simple-icons@latest/icons/leetcode.svg" />
<copy-btn content='<img height="32" width="32" src="https://cdn.jsdelivr.net/npm/simple-icons@latest/icons/leetcode.svg" />'></copy-btn>
</div>
<div class="alert alert-secondary">
<img height="32" width="32" src="https://cdn.jsdelivr.net/npm/simple-icons@latest/icons/leetcode.svg" />leetcode
<copy-btn content="leetcode" btn-title="复制图标名称"></copy-btn>
</div>
</div>
<div class="icon-detail__container">
<p>关于“<b>LeetCode</b>”的评论:</p>
</div>
<Vssue title="关于“LeetCode”的评论" />
<div><p>Simple Icon是一个免费的品牌图标库。图标LeetCode可以免费下载使用。更多关于 Simple Icon的信息,参见:<a target="_blank" href="https://iconhelper.cn/brands.html">品牌图标(Simple Icon)</a>
</p></div>
<div style="padding:2rem 0 " class="page-nav"><p class="inner"><span class="prev">←<router-link to="/icon/launchpad.html">Launchpad</router-link></span> <span class="next"><router-link to="/icon/lenovo.html">Lenovo</router-link>→</span></p></div>
| 55.575342 | 851 | 0.697313 | yue_Hant | 0.349659 |
0d83c1fac572371e4bb40fe344a1e72ca6a31976 | 2,156 | md | Markdown | misc-utils-of-mine-generic/api/modules/_string_boxes_.md | cancerberoSgx/misc-utils-of-mine | b2d6050c895cd72789562a6482b565b0b85a9949 | [
"MIT"
] | 2 | 2019-09-04T15:25:47.000Z | 2020-04-07T09:51:44.000Z | misc-utils-of-mine-generic/api/modules/_string_boxes_.md | cancerberoSgx/misc-utils-of-mine | b2d6050c895cd72789562a6482b565b0b85a9949 | [
"MIT"
] | 8 | 2020-08-09T01:02:50.000Z | 2021-11-11T00:49:32.000Z | misc-utils-of-mine-generic/api/modules/_string_boxes_.md | cancerberoSgx/misc-utils-of-mine | b2d6050c895cd72789562a6482b565b0b85a9949 | [
"MIT"
] | 1 | 2020-10-19T09:04:48.000Z | 2020-10-19T09:04:48.000Z | [misc-utils-of-mine-generic](../README.md) › [Globals](../globals.md) › ["string/boxes"](_string_boxes_.md)
# Module: "string/boxes"
## Index
### Enumerations
* [BorderSide](../enums/_string_boxes_.borderside.md)
* [BorderStyle](../enums/_string_boxes_.borderstyle.md)
### Type aliases
* [BoxStyles](_string_boxes_.md#boxstyles)
### Variables
* [borderStyles](_string_boxes_.md#const-borderstyles)
* [boxStyles](_string_boxes_.md#let-boxstyles)
### Functions
* [getBoxChar](_string_boxes_.md#getboxchar)
* [getBoxStyles](_string_boxes_.md#const-getboxstyles)
## Type aliases
### BoxStyles
Ƭ **BoxStyles**: *object*
*Defined in [src/string/boxes.ts:84](https://github.com/cancerberoSgx/misc-utils-of-mine/blob/cb3d17a/misc-utils-of-mine-generic/src/string/boxes.ts#L84)*
#### Type declaration:
## Variables
### `Const` borderStyles
• **borderStyles**: *string[]* = enumKeys(BorderStyle)
*Defined in [src/string/boxes.ts:28](https://github.com/cancerberoSgx/misc-utils-of-mine/blob/cb3d17a/misc-utils-of-mine-generic/src/string/boxes.ts#L28)*
___
### `Let` boxStyles
• **boxStyles**: *[BoxStyles](_string_boxes_.md#boxstyles) | undefined*
*Defined in [src/string/boxes.ts:88](https://github.com/cancerberoSgx/misc-utils-of-mine/blob/cb3d17a/misc-utils-of-mine-generic/src/string/boxes.ts#L88)*
## Functions
### getBoxChar
▸ **getBoxChar**(`s`: [BorderStyle](../enums/_string_boxes_.borderstyle.md), `si`: [BorderSide](../enums/_string_boxes_.borderside.md)): *string*
*Defined in [src/string/boxes.ts:45](https://github.com/cancerberoSgx/misc-utils-of-mine/blob/cb3d17a/misc-utils-of-mine-generic/src/string/boxes.ts#L45)*
**Parameters:**
Name | Type |
------ | ------ |
`s` | [BorderStyle](../enums/_string_boxes_.borderstyle.md) |
`si` | [BorderSide](../enums/_string_boxes_.borderside.md) |
**Returns:** *string*
___
### `Const` getBoxStyles
▸ **getBoxStyles**(): *[BoxStyles](_string_boxes_.md#boxstyles)*
*Defined in [src/string/boxes.ts:90](https://github.com/cancerberoSgx/misc-utils-of-mine/blob/cb3d17a/misc-utils-of-mine-generic/src/string/boxes.ts#L90)*
**Returns:** *[BoxStyles](_string_boxes_.md#boxstyles)*
| 27.641026 | 154 | 0.718924 | yue_Hant | 0.374183 |
0d8540979b9c0390947b1c10c9a5e2911171a9ec | 44,115 | markdown | Markdown | _posts/2009-04-02-bulk-dimensional-nanocomposites-for-thermoelectric-applications.markdown | LizetteO/Tracking.virtually.unlimited.number.tokens.and.addresses | 89c6b0fa3defa14758d6794027cc15150d005bbd | [
"Apache-2.0"
] | null | null | null | _posts/2009-04-02-bulk-dimensional-nanocomposites-for-thermoelectric-applications.markdown | LizetteO/Tracking.virtually.unlimited.number.tokens.and.addresses | 89c6b0fa3defa14758d6794027cc15150d005bbd | [
"Apache-2.0"
] | null | null | null | _posts/2009-04-02-bulk-dimensional-nanocomposites-for-thermoelectric-applications.markdown | LizetteO/Tracking.virtually.unlimited.number.tokens.and.addresses | 89c6b0fa3defa14758d6794027cc15150d005bbd | [
"Apache-2.0"
] | 2 | 2019-10-31T13:03:55.000Z | 2020-08-13T12:57:08.000Z | ---
title: Bulk dimensional nanocomposites for thermoelectric applications
abstract: Thermoelectric elements may be used for heat sensors, heat pumps, and thermoelectric generators. A quantum-dot or nano-scale grain size polycrystalline material the effects of size-quantization are present inside the nanocrystals. A thermoelectric element composed of densified Groups IV-VI material, such as calcogenide-based materials are doped with metal or chalcogenide to form interference barriers form along grains. The dopant used is either silver or sodium. These chalcogenide materials form nanoparticles of highly crystal grains, and may specifically be between 1- and 100 nm. The compound is densified by spark plasma sintering.
url: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=%2Fnetahtml%2FPTO%2Fsearch-adv.htm&r=1&f=G&l=50&d=PALL&S1=08759662&OS=08759662&RS=08759662
owner: University of South Florida
number: 08759662
owner_city: Tampa
owner_country: US
publication_date: 20090402
---
This invention was made with Government support under Grant No. DE PS26 04NT42113 awarded by the Department of Energy and Grant No. W81XWH 07 1 0708 awarded by the Department of Defense U.S. Army Medical Research and Materiel Command. The Government has rights in the invention.
This application claims priority to U.S. Provisional Patent Application No. 61 041 748 entitled Thermoelectric Nanocomposites and Manufacturing Method filed on Apr. 2 2008 the contents of which are herein incorporated by reference.
This invention relates to enhancing thermoelectric properties of devices. Specifically the invention entails using nano scale fabrication on thermoelectric materials to improve the efficiency of thermoelectric devices.
The demand for new technologies to enhance the efficiency of our Nation s automobiles is at the forefront of new materials research towards power conversion technologies. Thermoelectric TE elements have been used as heat sensors for temperature measurement heat pumps in applications with low power demand or where compression refrigerating systems cannot be used for other reasons and in thermoelectric generators TEG . However unit costs are relatively high and the efficiency too low in TEGs to allow for general commercialization of TE devices. There exists a vast array of potential applications for TE elements due to decreasing energy resources and increasing energy demand. TE power conversion from exhaust waste heat is a viable technology that can be instrumental in improving the efficiency and thus reducing emitted pollutants of our nation s automobiles. Because TE devices can translate heat flow into electrical current thermoelectric generators may be used as a renewable energy source in automotive fuel economy electricity generation. TE technology is advantageous in many respects including reliability no moving parts safety and environmental friendliness. When a temperature gradient is maintained across a TE device electric potential is generated due to the Seebeck effect which can be used to drive a load or generator.
Research into higher efficiency thermoelectric TE materials continues to require advanced synthesis techniques. The specific material property requirements for TE materials can be quantified by the dimensionless figure of merit ZT S where S is the Seebeck coefficient the electrical conductivity and the total thermal conductivity the lattice and electronic contributions respectively . The power factor S or S where is the electrical resistivity is typically optimized as a function of carrier concentration typically 10carriers cmin materials presently available in devices through doping to give the largest ZT. The current TE materials used in devices are rather inefficient ZT 1 even though they are able to address many niche applications. In order that TE devices achieve their full potential new materials and new material synthesis approaches are needed. Materials with larger values of ZT warrant better thermoelectric devices thus researchers have devoted much effort in finding ways to increase ZT. Since the transport properties that define ZT are interrelated it is difficult to control them independently for a three dimensional crystal.
The thermoelectric potential which determines the electrical voltage that can be generated depends on the material specific thermoelectric characteristics based on the Seebeck coefficient and temperature differences. High Seebeck coefficients and large temperature differences lead to high thermoelectric voltages. In order to be able to draw large electrical power a large temperature difference must be maintained between the elements of a TE device. This necessitates the use of bulk materials with high Seebeck coefficients. With the present methods for manufacturing TE elements either it is not possible in practice to manufacture such bulk materials or the manufacture of such bulk TE elements would lead to exorbitant manufacturing costs.
Recent research on TE materials has resulted in ZT values greater than unity at different temperatures of application due to the advanced material synthesis characterization and modeling capabilities developed. Several classes of materials have added to these advances. These include bulk materials such as skutterudites and clathrates G. S. Nolas et al. Skutterudites A phonon glass electron crystal approach to advanced thermoelectric energy conversion applications Ann Rev. Mat. Res. 29 8. 1999 C. Uher Skutterudites Perspective novel materials in 69 139 2000 B. C. Sales et al. Filled skutterudite antimonides A new class of thermoelectric materials 272 1325 1996 G. S. Nolas et al. Semiconductor Clathrates A phonon glass electron crystal material with potential for thermoelectric applications in Vol. 69 edited by T. M. Tritt Academic Press 2000 p. 255 G. S. Nolas Structure transport and thermoelectric properties of clathrate compounds in edited by D. M. Rowe CRC Press Boca Raton Fla. 2005 33 1 A. Saramat et al. Large thermoelectric figure of merit at high temperature in Czochralski grown clathrate BaGaGe J. Appl. Phys. 99 023708 2006 and complex chalcogenides M. G. Kanatzidis The role of solid state chemistry in the discovery of new thermoelectric materials in 69 51 2000 with unique crystal structures allowing for low and therefore enhanced thermoelectric properties.
Additionally research and development in nanotechnology has shown promising improvements in Seebeck coefficient with new TE materials. The controlled fabrication of nanoscale semiconductors with enhanced physical properties is a current goal of technical as well as fundamental interest. Thin film coatings and nanotube technology allows production of two dimensional or one dimensional thermoelectric structures with improved TE characteristics compared to traditional bulk materials but still do not solve the problem of manufacturing bulk materials that are inexpensive. Nevertheless nanostructured materials have become of great interest because they offer the opportunity of independently varying the transport properties that define ZT G. S. Nolas J. Sharp and H. J. Goldsmid Springer New York 2001 G. Chen M. S. Dresselhaus G. Dresselhaus J. P. Fleurial and T. Caillat Vol. 48 London 2003 p. 45 66 .
Semiconductor grain boundaries on carrier transport becomes increasingly important in nanoscale polycrystalline systems where surface point defect dislocation and interfacial energy barrier scatterings can dominate the transport G. Blatter and F. Greuter Phys. Rev. B 34 8555 1986 H. Marom M. Ritterband M. Eizenberg Thin Solid Films 510 62 2006 C. H. Seager J. Appl. Phys. 52 3960 1981 . Recent identification of several higher efficiency thermoelectric TE materials can be attributed to nanoscale enhancement M. S. Dresselhaus G. Chen M. Y. Tang R. Yang H. Lee D. Wang Z. Ren J. P. Fleurial and P. Gogna Adv. Mater. 19 1043 2007 L. D. Hicks M. S. Dresselhaus Phys. Rev. B 47 16631 1993 J. P. Heremans C. M. Thrush and D. T. Morelli J. Appl. Phys. 98 063703 2005 K. F. Hsu S. Loo F. Guo W. Chen J. S. Dyck C. Uher T. Hogan E. K. Polychroniadis M. G. Kanatzidis Science 303 818 2004 M. S. Dresselhaus G. Chen M. Y. Tang R. G. Yang H. Lee D. Z. Wang Z. F. Ren J. P. Fleurial and P. Gogna 886 3 2006 . These materials demonstrate increased Seebeck coefficient and decreased thermal conductivity due to the phenomenological properties of nanometer length scales including enhanced interfacial phonon scattering and charge carrier filtering.
Nanometer scale materials have been at the core of one of the most substantial advances in thermoelectric technology since the development of semiconductor technology Mater. Res. Soc. Proc. Vol. 886 edited by J. Yang. T. P. Hogan R. Funahachi and G. S. Nolas 2006 . The two mechanisms that led to this success were a strong reduction in and an increase in S . The reduction in can be understood as a function of phonon mean free path J. W. Sharp et al. Boundary Scattering of Phonons and thermoelectric figure of merit Physica Status Solidi a 187 507 2001 G. S. Nolas and H. J. Goldsmid The Figure of Merit in Amorphous Thermoelectrics Physica B 194 271 2002 S. Bhattacharya et al. Grain Structure Effects on the Lattice Thermal Conductivity of Ti Based Half Heusler Alloys Appl. Phys. Lett. 81 43 2002 . Quantum confinement results in a strongly increased Seebeck coefficient due to the increase in the density of states of the carriers L. D. Hicks and M. S. Dresselhaus Effect of quantum well structures on the thermoelectric figure of merit Phys. Rev. B 47 12727 1993 . This observation was at the origin of the interest in nanoscale thermoelectricity and experimentally observed on bismuth nanowires J. P. Heremans et al. Thermoelectric Power of Bismuth Nanocomposites Phys. Rev. Lett. 88 216801 2002 .
Nanostructured TE enhancement aims to split the interdependence of the electrical and thermal transport allowing for better optimization of the TE figure of merit G. S. Nolas et al. Springer New York N.Y. 2001 . The reduction of through the interface scattering of phonons in nanostructured materials has been viewed as the primary way to increase ZT M. S. Dresselhaus et al. New Directions for Low Dimensional Thermoelectric Materials Adv. Mater. 19 1043 2007 R. Venkatasubramanian et al. Thin film thermoelectric devices with high room temperature figures of merit Nature 413 597 2001 T. C. Harman et al. Quantum Dot Superlattice Thermoelectric Materials and Devices Science 297 2229 2002 T. Koga et al. Experimental proof of principle investigation of enhanced ZT in 001 oriented Si Ge superlattices Applied Physics Letters 77 1490 2000 . The particular mechanism is the effective scattering of phonons due to the presence of interfaces.
Superior TE performance requires enhancement of the power factor S . Carrier filtering where the presence of interfacial energy barriers filters low energy charge carriers traversing the interface has been theoretically predicted Y. I. Ravich In edited by D. M. Rowe pages 67 73 CRC Press New York 1995 B. Moyzhes and V. Nemchinsky Appl. Phy. Lett. 73 1895 1998 . This increases ISI as its value depends on the mean carrier energy relative to the Fermi level G. S. Nolas J. Sharp and H. J. Goldsmid Springer New York N.Y. 2001 .
Carrier scattering at the boundaries that inhibits the conduction of low energy charge carriers thus causing an overall increase in S was first proposed theoretically L. W. Whitlow and T. Hirano Superlattice applications to thermoelectricity J. Appl. Phys. 78 5460 1995 Moyzhes and V. Nemchinsky Thermoelectric figure of merit of metal semiconductor barrier structure based on energy relaxation length Appl. Phys. Lett. 73 1895 1998 and has been experimentally investigated in III V heterostructures D. Vashaee and A. Shakouri Improved thermoelectric power factor in metal based superlattices Phys. Rev. Lett. 92 106103 2004 . Very little experimental research effort has been undertaken into the investigation of S enhancement of nano scale inclusions in bulk materials. Research on PbTe K. Kishimoto and T. Koyanagi Preparation of sintered degenerate n type PbTe with a small grain size and its thermoelectric properties J. Appl. Phys. 92 2544 2002 J. P. Heremans et al. Thermopower enhancement in lead telluride nanostructures Phys. Rev. B 70 115334 2004 Si Ge alloys M. S. Dresselhaus et al. New directions for nanoscale thermoelectric materials research Proc. Mater. Res. Soc. 886 3 2006 BiTe B. Poudel et al High Thermoelectric Performance of Nanostructured Bismuth Antimony Telluride Bulk Alloys Science 320 634 2008 and Ce filled skutterudites with unfilled skutterudite nano grains P. C. Znai et al. Nanostructures and enhanced thermoelectric properties in Ce filled skutterudite bulk materials Appl. Phys. Lett. 89 052111 2006 have shown an improvement in S as compared to their bulk counterparts. These studies however have not fully developed the physics behind the underlying phenomena affecting the transport properties as a function of grain size. 
In all five of these studies the ball milling technique that was used to synthesize nanoscale powders of the respective materials causes strain in these materials that surely affected their transport properties as compared to hot pressed HP sintered bulk polycrystalline specimens. This was specifically noted in the work of Heremans et al. Thermopower enhancement in lead telluride nanostructures Phys. Rev. B 70 115334 2004 . The affect on nano feature size is therefore not well established from these studies. More recently a study by Heremans et al. Thermoelectric enhancement in PbTe with Pb precipitates J. Appl. Phys. 98 063703 2005 showed an increase in S due to 3 and 6 nano grain Pb precipitates. Most recently we have synthesized and measured the transport properties of polycrystalline PbTe with 100 nm PbTe inclusions and showed enhanced S with minimal degradation in a as compared to bulk polycrystalline PbTe A. J. Crocker and L. M. Rogers Brit. Interpretation of the Hall coefficient electrical resistivity and Seebeck coefficient of p type lead telluride J. Appl. Phys. 18 563 1967 .
In a quantum dot or nano scale grain size polycrystalline material the effects of size quantization is present inside the nanocrystals. However experimental studies involving bulk thermoelectric materials with nanostructure inclusions or granular nanocomposites indicated that TE property improvements for bulk nanocrystalline materials are feasible K. Kishimoto and T. Koyanagi J. Appl. Phys. 92 2544 2002 J. P. Heremans C. M. Thrush and D. T. Morelli Physical Review B 70 115334 2004 J. Martin G. S. Nolas W. Zhang and L. Chen Appl. Phys. Lett. 99 222112 2007 . These studies have suggested that the thermoelectric properties can be enhanced by increasing S . The physical mechanism responsible for S enhancement in these materials has not been developed. It has been demonstrated that PbTe PbTeSe quantum dot superlattices can have a two fold increase in room temperature ZT as compared to bulk PbTe T. C. Harman P. J. Taylor M. P. Walsh and B. E. LaForge Science 297 2229 2002 T. C. Harman M. P. Walsh B. E. LaForge and G. W. Turner J. Electron. Mat. 34 L19 2005 . Additionally materials containing PbTe nanogranular formations J. P. Heremans C. M. Thrush and D. T. Morelli Phys. Rev. B 70 115334 2004 J. Martin G. S. Nolas W. Zhang and L. Chen Appl. Phys. Lett. 90 222112 2007 or Si and Ge nanoparticles have also been shown to exhibit an increase in ZT.
Experimental studies indicate that the carrier scattering at the interfaces may be an important factor in the overall increase in ZT. In particular, it is speculated that for polycrystalline materials the grains appear as traps for low energy carriers, while the higher energy carriers diffuse through the specimen (Y. I. Ravich, CRC Press, New York, 1995). In fact, the enhancement of the TE properties in nanocomposites has been attributed to the presence of a carrier interface potential barrier scattering mechanism, which is not typical for bulk materials. The grain boundaries therefore appear to filter out the lower energy carriers. Since the mean energy per carrier is increased, S increases while σ is not degraded significantly. Filtering by energy barriers has been theoretically predicted and experimentally observed in thin films and heterostructure systems (D. Vashaee and A. Shakouri, Phys. Rev. Lett. 92, 106103 (2004); D. Vashaee and A. Shakouri, J. of Appl. Phys. 95, 1233 (2004); G. D. Mahan and L. M. Woods, Phys. Rev. Lett. 80, 4016 (1998)).
The thermoelectric properties may be altered by approaching nano scale dimensions. These include superlattice structures of Bi2Te3/Sb2Te3 (R. Venkatasubramanian et al., Thin film thermoelectric devices with high room temperature figures of merit, Nature 413, 597 (2001)) and PbSeTe/PbTe thin films that contain nanodots of PbSe (T. C. Harman et al., Quantum Dot Superlattice Thermoelectric Materials and Devices, Science 297, 2229 (2002)). Nanodots of PbTe can form inside AgPbmSbTe2+m crystals with ZT of 1.7 (K. F. Hsu et al., Cubic AgPbmSbTe2+m Bulk Thermoelectric Materials with High Figure of Merit, Science 303, 818 (2004); H. Wang et al., High performance AgPbmSbTe2+m thermoelectric bulk materials fabricated by mechanical alloying and spark plasma sintering, Appl. Phys. Lett. 88, 092104 (2006); A. Kosuga, M. Uno, K. Kurosaki, and S. Yamanaka, Thermoelectric properties of stoichiometric Ag1-xPb18SbTe20 (x = 0, 0.1, 0.2), J. Alloys Comp. 391, 288 (2005)). In most of the present research the effort focuses on reducing κ to increase ZT.
In certain embodiments, a thermoelectric element is composed of densified Group IV-VI material. For example, chalcogenide based materials such as PbTe are useful. A metal or chalcogenide dopant is integrated into the chalcogenide, forming nanoparticles such that interface barriers form along grain boundaries. The dopant is either silver or sodium in specific embodiments.
In certain embodiments the Group IV-VI materials are chalcogenides. These chalcogenide materials are optionally nanoparticles of highly crystalline grains and may specifically be between 1 and 100 nm. The chalcogenide nanoparticles may be Bi2Te3, Sb2Te3, PbSeTe, PbTe, CdS, CdSe, ZnSe, ZnS, CsS, PbS, or PbSeS. In specific embodiments the nanocrystals are PbTe particles ranging in size from 100-150 nm. The chalcogenide nanoparticles can also be core-shell nanocrystals with a metal dopant core and a semiconducting coating in specific embodiments. The semiconducting compound is optionally densified by spark plasma sintering.
The thermoelectric element may be used for heat sensors, heat pumps, and thermoelectric generators, and may specifically be a thermoelectric generator adapted for use on an automobile.
A phenomenological model is proposed to describe the diffusion transport of carriers through a material composed of nanogranular regions. The material is viewed as containing potential interface barriers due to the grains, and the transport includes quantum transmission through and reflection from those barriers. Additional scattering mechanisms, such as carrier-acoustic phonon, carrier-non-polar optical phonon, and carrier-ionized impurity scattering, which are relevant for thermoelectric materials, are also taken into account. The model involves a set of physical parameters which can be measured independently or taken from the literature to calculate or make predictions for the transport characteristics for relevant thermoelectrics, or to define grain barrier characteristics such as height, width, and distance between them and predict grain barrier influence on thermoelectric transport. The model is tested by comparing the calculated carrier conductivity and thermopower to experimental data for PbTe dimensional nanocomposites.
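The way these mechanisms combine in a relaxation-time picture can be sketched numerically. A standard way to merge independent scattering channels is Matthiessen's rule, 1/tau_total = sum over i of 1/tau_i. The sketch below is illustrative only: the prefactors tau0_ac and tau0_imp and the power-law exponents are generic textbook values, not parameters fitted to PbTe.

```python
KB = 8.617e-5  # Boltzmann constant, eV/K

def tau_total(E, T, tau0_ac=1e-13, tau0_imp=5e-14):
    """Combine energy-dependent relaxation times via Matthiessen's rule.

    The power laws are generic textbook forms, not PbTe fits:
      acoustic phonons:    tau_ac  ~ E^(-1/2)
      ionized impurities:  tau_imp ~ E^(+3/2)
    E in eV, T in K, times in seconds.
    """
    x = E / (KB * T)                  # reduced carrier energy
    tau_ac = tau0_ac * x ** -0.5
    tau_imp = tau0_imp * x ** 1.5
    return 1.0 / (1.0 / tau_ac + 1.0 / tau_imp)

energies = [0.01 * k for k in range(1, 51)]       # 0.01 ... 0.50 eV
taus = [tau_total(E, 300.0) for E in energies]
```

The combined time is always bounded by the strongest (shortest-time) channel at each energy, which is the property such a model exploits when a barrier-scattering term is added on top of the phonon and impurity terms.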
A nanofabricated thermoelectric may be created using at least one semiconducting Group IV-VI element. A metal or chalcogenide dopant is mixed with the semiconducting compound, forming nanoparticles of dopant and semiconducting material. The material is then densified. The nanoparticles are optionally formed by an ultrasonic homogenizer. In specific embodiments the dopant-semiconducting material is densified by spark plasma sintering or hot pressing.
In some embodiments the dopant is dissolved in the lead acetate. In certain embodiments the dopant may be silver or sodium and may specifically be silver acetate.
A model for describing the diffusive transport properties in granular nanocomposite materials is proposed and successfully applied to explain relevant experimental data. The interfaces of the nano scaled granular regions are modeled as rectangular barriers. The transport through the material includes carrier quantum mechanical transmission and scattering from the barriers, carrier-phonon scattering, and carrier-ionized impurity scattering within the grains. The theoretically calculated σ and S can reproduce the experimental observations obtained for Ag doped PbTe and undoped PbTe nanocomposite specimens. We have shown that the interplay between the different scattering mechanisms, as well as the carrier concentration and the physical parameters for the barriers, is important in finding an optimal regime for the TE performance of a given material. Specifically, our model reveals that by manipulating the barrier size and height, the mean energy per carrier may be increased, leading to an increase in S without substantial degradation of σ. Furthermore, this model can be adapted within the relaxation time approximation for other IV-VI material systems composed of highly crystalline grains by incorporating the relevant physical parameters in the total density of states and scattering mechanisms.
As used herein, the term Group IV-VI refers to compounds of elements located in Groups IVA through VIA of the periodic table. These include semiconducting materials and specifically chalcogenides.
As used herein the term nanoparticle means one or more nanoparticles with an average diameter ranging from 1 to 100 nm.
As used herein, the term dopant means an element, compound, or compounds which provide a substantial number of shallow donor or acceptor levels in the semiconductor material so as to substantially alter the conductivity of the material. Exemplary dopants are metals or chalcogenides such as silver or sodium. Those of skill in the art will understand that other dopant compounds or elements exist for use in the semiconductor materials.
TE samples were synthesized employing solution phase and sonochemical synthesis methods. The samples were then characterized structurally, chemically, electrically, and thermally for evaluation of the materials' properties. Core-shell nanocrystals permit manipulation of physical parameters by forming metallic and semiconducting coatings around the nanocrystals to produce additional quantum confined states at these interfaces. A solution phase synthesis approach is well suited to producing large batches of phase pure nanoscale powders of chalcogenides. Chalcogenide compounds lend themselves to the development of core-shell nanocrystalline bulk composite materials synthesis. The approach has been previously described (C. B. Murray et al., Synthesis and characterization of monodisperse nanocrystals and closed packed nanocrystal assemblies, Annu. Rev. Mater. Sci. 30, 545 (2000)) and is optimized to produce the large quantities of PbTe and PbSe nanocrystals required for the nanocomposite formation.
PbTe was used as an exemplary material; however, other nanocomposites are envisioned. Without limiting the invention, examples include semiconductor-semiconductor composites, metal-metal composites, and metal-non-metal composites, such as PbTe/PbSe, PbTe/Sb2Te3, PbTe/AgSbTe2, PbSe, and alloys of PbTe and PbSe such as PbSnSe.
Integration of core-shell nanocrystals into bulk materials suggests new possibilities for tuning interfacial energy barriers and observing the effect on the transport properties. Core-shell nanocomposite thermoelectric materials may reduce the carrier mobility, thus having the potential to degrade the thermoelectric properties, although the potential increase in S may more than offset this degradation.
Monodisperse nanocrystals were produced using solution phase synthesis and sonochemical methods. Nano spheres of PbTe or PbSe were synthesized using solvent techniques employing a low temperature reaction of tellurium (for PbTe) or selenium (for PbSe) in alkaline aqueous solutions with a lead acetate trihydrate solution. Large yields of nano sphere semiconductors were obtained, up to 10 grams per batch, as shown in the accompanying figure. It is noted that large batch sizes are a prerequisite in synthesizing bulk quantities of thermoelectric materials.
Structural, microscopic, and chemical analyses of the desired materials were undertaken, including x-ray diffraction and electron diffraction analysis for structural characterization. Transmission Electron Microscopy (TEM) was used to study nano grain size and structure as well as inter-grain regions in the compacted materials, and to monitor and characterize the presence of any disordered or other extraneous phases resulting from localized deformation at inter-particle regions. SEM with energy dispersive spectroscopy permitted analysis of grain size, composition, and composite uniformity.
Hot Pressing (HP) of powders has been the mainstay of densification of polycrystalline materials for thermoelectrics applications. Spark Plasma Sintering (SPS) was used to densify composites with nanocrystals of 100 nm. SPS utilizes a direct electrical current or a pulse of electric current applied to the graphite punch and die set, which passes through the sample, thereby heating both the outside and inside of the sample simultaneously. This process generates internal localized heating and allows for the rapid low temperature densification of fine grained powders (M. Omori, Sintering, consolidation, reaction and crystal growth by the spark plasma system (SPS), Mater. Sci. Eng. A287, 183 (2000); Z. A. Munir et al., The effect of electric field and pressure on the synthesis and consolidation of materials: A review of the spark plasma sintering method, J. Mater. Sci. 41, 763 (2006)) and has been used for fine grain densification of thermoelectric alloys such as Bi-Sb and Bi2Te3, resulting in lower κ while maintaining excellent electrical properties. In addition, the grains possessed orientational direction as required for optimization in Bi-Sb and Bi2Te3 (Y. Lee and T. Koyanagi, Thermoelectric Properties of n-type Bi-Sb Sintered Alloys Prepared by Spark Plasma Sintering Method, in Proc. 20th International Conference on Thermoelectrics (2001), p. 279). Long distance mass diffusion is restricted during the SPS process; thus phase formation is determined by local chemistry, allowing for phase assembly far from equilibrium and avoiding the formation of transient phases (Z. Shen and M. Nygren, Kinetic aspects of superfast consolidation of silicon nitride based ceramics by spark plasma sintering, J. Mater. Chem. 11, 204 (2001); Z. Shen et al., Formation of tough interlocking microstructures in silicon nitride ceramics by dynamic ripening, Nature 417, 266 (2002)).
Varying the parameters pressure and current (DC and pulsed) adjusts the grain size in the 10 to 100 nm range, allowing for tuning of grain size and transport properties. The nanostructure is preserved following the SPS procedure, with grains ranging from 100 nm to over 1 micron, as seen in the representative SEM image of a PbTe fracture surface in the accompanying figure. This image also indicates that SPS densification allows for nanocomposite formation with minimal conglomeration of the nanograins.
Different chalcogenide constituents were doped using the techniques described above. A major complication in consolidating nanoparticles within a bulk matrix is the formation of an organic or oxide molecular layer on the surface, which is very hard to remove completely. Such a layer may lead to high electrical resistivity due to inter-particle scattering. In addition, for thermoelectric applications, doping towards optimized carrier concentrations is essential. As shown below, the overall S of PbTe nanocomposites is enhanced while σ is not drastically reduced. Ag doped PbTe nanocrystals were prepared as described above with the addition of Ag(CH3COO) to the aqueous solution. X-ray diffraction indicated phase purity, while Hall measurements indicated an increase in carrier concentration as compared to the nominally undoped PbTe nanocomposites, as seen in Table I.
Core-shell composites permit the formation of an ultra thin layer of a desired material, either metallic or, more interestingly, another chalcogenide, around chalcogenide nanocrystals, forming a cladding that can act as a quantum well structure. The thickness and type of material of the cladding will determine the strength of the scattering of charge carriers as well as the tunneling probability of the higher energy carriers, thus allowing for the tunability of electrical properties. While carrier mobility decreases, the increase in S for an optimized configuration compensates for the mobility decrease and leads to an overall enhancement in S^2σ, while also reducing κ, increasing the overall TE properties. Core-shell composites also prevent grain growth that may lead to the reduction in the nano features during extended high temperature TE device operation.
The growth of epitaxial shells over the core nanocrystals has been demonstrated using a modified two injection synthesis process (M. Brumer et al., PbSe/PbS and PbSe/PbSexS1-x Core-Shell Nanocrystals, Advanced Functional Materials 15, 1111 (2005)). This method has demonstrated success in many core-shell systems, including Si/SiO2, CdS/Cd(OH)2, CdSe/ZnSe, CdSe/ZnS, CdS/HgS/CdS, PbSe/PbS, and PbSe/PbSexS1-x (M. Brumer et al., PbSe/PbS and PbSe/PbSexS1-x Core-Shell Nanocrystals, Advanced Functional Materials 15, 1111 (2005); X. Peng et al., Epitaxial Growth of Highly Luminescent CdSe/CdS Core-Shell Nanocrystals with Photostability and Electronic Accessibility, J. Amer. Chem. Soc. 119, 7019 (1997)). Extending this technique to nanocomposites such as PbSe/PbTe and their alloys enables relative bandgap tailoring between the two materials and possible tuning of the electronic structure by varying shell thickness.
PbTe nanocomposites were prepared by densifying 100 nm PbTe nanocrystals synthesized in high yield employing a solution phase reaction of two monometallic aqueous precursor solutions (J. Martin et al., PbTe nanocomposites synthesized from PbTe nanocrystals, Appl. Phys. Lett. 90, 222112 (2007); W. Zhang et al., Synthesis of nanocrystalline lead chalcogenides PbE (E = S, Se, or Te) from alkaline aqueous solutions, Materials Research Bulletin 35, 2009 (2000)). Lead telluride nanocrystals were synthesized employing an aqueous solution phase reaction by mixing a Te/KOH aqueous solution and a lead acetate trihydrate (Pb(CH3COO)2.3H2O) solution at low temperature and standard atmosphere. The carrier concentrations were modified by directly doping the PbTe nanocrystals with Ag prior to the densification procedure (specimens III and IV). Ag doped PbTe nanocrystals were prepared by dissolving 0.33 mg of a Ag compound in 6.68 gm of Pb(CH3COO)2.3H2O solution to achieve the desired carrier concentration. This procedure reproducibly synthesizes 100-150 nm spherical PbTe nanocrystals, confirmed by TEM, with a high yield. The amounts disclosed are exemplary and should not be construed as limiting the invention. The amounts disclosed may be varied by one in the art based on the desired quantity of end product, such as the amount of Ag doping in the PbTe nanocrystals disclosed in the present example.
These nanocrystals were subjected to Spark Plasma Sintering to consolidate these nanoscale grains within a dense PbTe matrix at 95% bulk theoretical density, resulting in a dimensional nanocomposite structure as observed in scanning electron microscope images of both fracture and polished surfaces. Densifying solely the nanocrystals results in the dispersion of non-conglomerated nanostructure within a bulk matrix (J. Martin et al., PbTe nanocomposites synthesized from PbTe nanocrystals, Appl. Phys. Lett. 90, 222112 (2007)) with grains ranging from 100 nm to over 1 micron. Table I lists the room temperature physical properties described below. X-ray diffraction following SPS indicated 5 vol. % PbTeO3 impurity for specimens I and II and 3 vol. % TeO2 impurity for specimens III and IV.
The nanocomposites were cut into 2 x 2 x 5 mm parallelepipeds for transport property measurements. Four probe resistivity and steady state gradient sweep Seebeck coefficient (ρ and S, respectively) were measured from 12 to 300 K in a custom radiation shielded vacuum probe, with maximum uncertainties of 4% and 6%, respectively, at 300 K (J. Martin et al., Thermoelectric properties of silicon germanium type I clathrates, J. Appl. Phys. 102, 103719 (2007)). Temperature dependent four probe Hall measurements were conducted from 5 to 300 K at both positive and negative magnetic fields of up to 5 T to eliminate voltage probe misalignment effects, and room temperature measurements were taken to eliminate thermal instabilities, with a 10% uncertainty.
For all specimens a linear and positive magnetic field dependence of the Hall resistance confirms dominant p-type conduction. The carrier concentrations increase upon Ag doping by more than a factor of 5, as listed in Table I. Correspondingly, the ρ values shown in Table I exhibit a significant reduction in magnitude compared to the undoped specimens. All specimens exhibit relatively large room temperature S values of approximately 325 μV/K for the two undoped specimens and 200 μV/K for the two Ag doped specimens.
The nanocomposite model was tested by comparing low temperature transport measurements for Ag doped PbTe nanocomposites. These results demonstrate how effectively the model describes experimental data for PbTe nanocomposite material.
The scattering mechanisms that dominate the transport in bulk lead chalcogenides do not fully describe the unique temperature dependence of μ in these nanocomposites, implying the presence of an additional scattering mechanism. In non-degenerate semiconductors the carriers are scattered by long wavelength acoustic phonons, with μ varying as m*^(-5/2)T^(-3/2) (Yu. I. Ravich, B. A. Efimova, and I. A. Smirnov, Plenum, N.Y., 1970). Since the factor m*^(-1) is inversely proportional to temperature in lead chalcogenides, based on the experimental temperature dependence of m*, the mobility therefore varies approximately as T^(-5/2), as experimentally observed in single crystal and polycrystalline lead chalcogenides, with a weaker dependence in degenerate specimens (Yu. I. Ravich, B. A. Efimova, and I. A. Smirnov, Plenum, N.Y., 1970; W. Scanlon, in Solid State Physics, Vol. 9, Academic Press, NY, 1959). This dependence is opposite of that observed in the PbTe nanocomposites, suggesting phonon scattering is present in combination with an additional mechanism. Furthermore, the experimental data indicate μ is not proportional to T^(3/2) in these nanocomposites and suggest scattering by ionized impurities is not a dominant mechanism. The high dielectric constant in PbTe implies suppression of long range Coulomb potentials, limiting scattering to near the internal point of an impurity (a region on the order of the lattice constant) due to the large Bohr radius (E. H. Putley, Proc. Phys. Soc. B 65, 736 (1952); N. A. Poklonski, S. A. Vyrko, V. I. Yatskevich, and A. A. Kocherzhenko, J. Appl. Phys. 93, 9749 (2003)) and, consequently, a small screening length. Therefore it is unlikely that ionized impurity scattering is effectively present in these nanocomposites, particularly at room temperature, where the interaction time, the time required for the carrier to pass the region of one impurity ion (N. A. Poklonski, S. A. Vyrko, V. I. Yatskevich, and A. A. Kocherzhenko, J. Appl. Phys. 93, 9749 (2003)), is significantly shorter.
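The opposing temperature trends invoked here (acoustic-phonon-limited mobility falling roughly as T^(-5/2) in lead chalcogenides, ionized-impurity-limited mobility rising as T^(3/2)) are easy to check with a short numerical sketch. The room temperature mobility values below are arbitrary placeholders, not measured PbTe data.

```python
import math

def mu_acoustic(T, mu_300=800.0):
    """Acoustic-phonon-limited mobility with the ~T^(-5/2) dependence
    reported for lead chalcogenides; mu_300 (cm^2/V/s) is a placeholder."""
    return mu_300 * (T / 300.0) ** -2.5

def mu_ionized(T, mu_300=100.0):
    """Ionized-impurity-limited mobility, ~T^(3/2); mu_300 is a placeholder."""
    return mu_300 * (T / 300.0) ** 1.5

def loglog_slope(f, T1=200.0, T2=400.0):
    """Exponent n in f ~ T^n, recovered from two points on a log-log plot."""
    return (math.log(f(T2)) - math.log(f(T1))) / (math.log(T2) - math.log(T1))

slope_ac = loglog_slope(mu_acoustic)   # ~ -2.5 (mobility falls with T)
slope_ii = loglog_slope(mu_ionized)    # ~ +1.5 (mobility rises with T)
```

Fitting the measured log-log slope of mobility against these signatures is the kind of check used in the text to rule ionized-impurity scattering out as the dominant mechanism.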
The nanocomposite carrier conduction can be effectively described as dominated by grain boundary potential barrier scattering in combination with phonon scattering. Similar models have successfully described the electrical properties of silicon, CdTe, and nanostructured metal oxide films (C. H. Seager, Grain boundary recombination: Theory and experiment in silicon, J. Appl. Phys. 52, 3960 (1981); J. Y. W. Seto, J. Appl. Phys. 46, 5247 (1975); O. Vigil-Galan et al., Influence of the growth conditions and postdeposition treatments upon the grain boundary barrier height of CdTe thin films deposited by close space vapor transport, J. Appl. Phys. 90, 3427 (2001); G. Kiriakidis et al., High Performance Gas Sensing Materials Based on Nanostructured Metal Oxide Films, Rev. Adv. Mater. Sci. 10, 215 (2005)). Studies indicated oxygen adsorption in the PbTe nanocomposites (J. Martin et al., PbTe nanocomposites synthesized from PbTe nanocrystals, Appl. Phys. Lett. 90, 222112 (2007)). This surface reactivity is difficult to prevent considering the aqueous nature of the synthesis technique (W. Zhang et al., Synthesis of nanocrystalline lead chalcogenides PbE (E = S, Se, or Te) from alkaline aqueous solutions, Materials Research Bulletin 35, 2009 (2000)). The surface oxidation of PbTe is a sequential process, proceeding first through the formation of weak peroxide-like structures up to 70% coverage, then by the chemisorption of oxygen (T. S. Zyubina, V. S. Neudachina, L. V. Yashina, and V. I. Shtanov, Surface Science 574, 52 (2005)). Density Functional Theory (DFT) calculations of the surface reactivity of PbTe (T. S. Zyubina, V. S. Neudachina, L. V. Yashina, and V. I. Shtanov, XPS and ab initio study of the interaction of PbTe with molecular oxygen, Surface Science 574, 52 (2005)) indicate these oxygen complexes form chemical bonds by transferring charge from the tellurium atoms.
These chemical shifts were experimentally confirmed through X-ray Photoemission Spectroscopy (XPS). The chemisorption of oxygen essentially forms carrier trapping acceptor states by removing electrons from the grain surface, reducing the itinerant carrier density. For nanocrystalline materials this chemisorption results in increased trapping of carriers at grain boundaries, forming energy barriers that impede the conduction of carriers between grains. Assuming a uniformly distributed concentration of ionized carrier traps N (cm^-3), a grain boundary thickness much less than the crystallite size L, grain morphologies and size distributions that are identical, and a mobility within the grains much greater than that through the boundary, the effective mobility is given by (J. Y. W. Seto, The electrical properties of polycrystalline silicon films, J. Appl. Phys. 46, 5247 (1975)): μ_eff = Lq(2πm*k_BT)^(-1/2) exp(-E_B/k_BT). (1)
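The Seto expression of equation (1) can be evaluated directly to see the thermally activated character of the grain-boundary-limited mobility. The grain size, barrier height, and effective mass used below are hypothetical round numbers, not the fitted values of Table I.

```python
import math

Q = 1.602e-19    # elementary charge, C
KB = 1.381e-23   # Boltzmann constant, J/K
M0 = 9.109e-31   # electron rest mass, kg

def seto_mobility(T, L, E_B_eV, m_eff=1.5):
    """Grain-boundary-limited effective mobility (Seto form), SI units:

        mu_eff = L*q / sqrt(2*pi*m*kB*T) * exp(-E_B / (kB*T))

    T in K, L in metres, E_B in eV; returns m^2/(V*s).
    The default m_eff = 1.5 (in units of m0) mimics a heavy-hole band.
    """
    m = m_eff * M0
    prefactor = Q * L / math.sqrt(2.0 * math.pi * m * KB * T)
    return prefactor * math.exp(-E_B_eV * Q / (KB * T))

# hypothetical round numbers: 350 nm grains, 60 meV barrier
mu_300 = seto_mobility(300.0, 350e-9, 0.060)
```

Because the Boltzmann factor outweighs the T^(-1/2) prefactor whenever E_B exceeds roughly k_BT/2, this mobility rises with temperature, the opposite of the phonon-limited trend and the signature exploited in the analysis above.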
The one dimensional time independent WKB transmission probability for the potential barrier is given by (V. V. Mitin, Contribution of light holes to thermionic field emission in Si and Ge, Phys. Rev. B 31, 2584 (1985)): T(E) = exp[-2 ∫ from x1 to x2 of (2m*(qV(x)-E))^(1/2)/ħ dx], where x1 and x2 are the classical turning points for a carrier with energy E, m* is the effective mass, and qV(x) is the interfacial barrier energy. Therefore the tunneling probability is a maximum for charge carriers with smaller m*. The electrical transport in p-type PbTe is dominated by two bands: a lower mobility heavy hole (HH) valence band below the light hole (LH) valence band at low temperature, where m*_HH is roughly 10 m*_LH (Yu. I. Ravich, B. A. Efimova, and I. A. Smirnov, Plenum, N.Y., 1970; L. M. Rogers, Drift mobility of light mass holes in PbTe heavily doped with Na, Brit. J. Appl. Phys. 1, 1067 (1968)). We assume similar m* and band structure for the nanocomposites. At low temperature and higher hole densities the electrical properties are dominated nearly exclusively by the LH carriers. As the temperature increases the HH band rises, resulting in a decreasing μ and an increase in carrier scattering for the higher carrier density specimens. At higher temperature, when the average energy of the charge carriers is sufficient to overcome the grain boundary energy barrier, conduction is dominated by thermionic emission (varying as T^(-1/2) exp(-E_B/k_BT)) and σ increases with temperature. Grain boundary potential barrier scattering of the carriers in combination with phonon scattering gives rise to the unique temperature dependence of the electrical conductivity and the mobility in these nanocomposites.
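For a rectangular barrier the WKB integral has a closed form, which makes the mass dependence stated above easy to verify: T = exp(-2w sqrt(2m*(qV - E))/ħ) for E < qV. The barrier height, width, and the two effective masses below are illustrative stand-ins for the LH and HH bands, not measured PbTe parameters.

```python
import math

HBAR = 1.055e-34  # reduced Planck constant, J*s
Q = 1.602e-19     # elementary charge, C (also converts eV to J)
M0 = 9.109e-31    # electron rest mass, kg

def wkb_transmission(E_eV, V_eV, width_m, m_eff):
    """WKB tunnelling probability through a rectangular barrier of height
    V (eV) and the given width (m); m_eff is in units of the free-electron
    mass. For a rectangular barrier the WKB integral is analytic:
        T = exp(-2 * width * sqrt(2*m*(qV - E)) / hbar)   for E < qV.
    """
    if E_eV >= V_eV:
        return 1.0  # classically allowed: treat as fully transmitted
    kappa = math.sqrt(2.0 * m_eff * M0 * (V_eV - E_eV) * Q) / HBAR
    return math.exp(-2.0 * kappa * width_m)

# illustrative stand-ins for light- and heavy-hole masses
t_light = wkb_transmission(0.020, 0.060, 2e-9, 0.05)
t_heavy = wkb_transmission(0.020, 0.060, 2e-9, 1.5)
```

The light carriers tunnel orders of magnitude more readily than the heavy ones at the same energy, which is why the LH band dominates conduction at low temperature in this picture.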
The effective crystallite size was estimated using equation (1), the energy barriers obtained from fitting the temperature dependence of the mobility, the μ values calculated from the room temperature carrier concentration, and the HH effective mass m*_HH = 1.5 m_0 (Yu. I. Ravich, B. A. Efimova, and I. A. Smirnov, Plenum, N.Y., 1970, and references therein). These estimates indicate effective crystallite sizes between 300 and 400 nm, listed in Table I, and are consistent with the dimensional nanocomposite structure observed in our SEM analyses. This suggests the grain boundary energy barrier scattering is dominated through these nanoscale features. Of note, inclusion of LH carriers in the calculation only slightly lowers the effective crystallite size.
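The crystallite size estimate amounts to inverting equation (1) for L given a measured effective mobility and a fitted barrier height. The numerical inputs below are hypothetical; the sketch shows the procedure, not the values of Table I.

```python
import math

Q = 1.602e-19    # elementary charge, C
KB = 1.381e-23   # Boltzmann constant, J/K
M0 = 9.109e-31   # electron rest mass, kg

def crystallite_size(mu_eff, T, E_B_eV, m_eff=1.5):
    """Invert mu_eff = L*q*(2*pi*m*kB*T)^(-1/2)*exp(-E_B/(kB*T)) for L (m).

    mu_eff in m^2/(V*s); E_B in eV; m_eff in units of the free-electron
    mass (default mimics the heavy-hole band used in the text).
    """
    m = m_eff * M0
    return (mu_eff * math.sqrt(2.0 * math.pi * m * KB * T) / Q
            * math.exp(E_B_eV * Q / (KB * T)))

# hypothetical inputs: mu_eff = 30 cm^2/V/s = 30e-4 m^2/V/s, 60 meV barrier
L_est = crystallite_size(30e-4, 300.0, 0.060)
```

The inferred L scales linearly with the measured mobility and exponentially with the fitted barrier height, so the barrier fit is the sensitive step in this estimate.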
Furthermore, conduction through the boundary potential barrier between grains essentially filters lower energy charge carriers, increasing the average carrier energy and consequently S. The accompanying figure shows the room temperature S for the PbTe nanocomposites in comparison to theoretically calculated bulk values (Yu. I. Ravich, B. A. Efimova, and I. A. Smirnov, Plenum, N.Y., 1970, and references therein; A. J. Crocker and L. M. Rogers, Brit. J. Appl. Phys. 18, 563 (1967)), indicating an enhancement in S as compared to bulk PbTe at the same carrier concentration. In addition, we compare the room temperature S for the nanocomposites to two of our undoped and two Na doped bulk PbTe specimens, indicating an enhancement in S over that of bulk PbTe by up to a factor of two (inset in the figure). The larger S in the nanocomposites as compared to bulk polycrystalline materials, in addition to similar thermal conductivities (Table I and Yu. I. Ravich et al., Plenum, N.Y., 1970), results in an enhanced room temperature ZT of up to a factor of two as compared to bulk PbTe. Therefore interfacial energy barrier carrier filtering is an effective method of thermoelectric performance enhancement in these bulk nanocomposites.
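The filtering argument (removing carriers below the barrier raises the mean energy of the conducting population, and with it S) can be illustrated with a simple Boltzmann average. This is a deliberately crude sketch: a nondegenerate distribution, a 3D sqrt(E) density of states, and a hard low-energy cutoff, not a full solution of the Boltzmann transport equation for PbTe.

```python
import math

KB = 8.617e-5  # Boltzmann constant, eV/K

def mean_transport_energy(T, E_B=0.0, n=20000, E_max=1.0):
    """Mean energy (eV) of carriers that make it over a barrier E_B (eV).

    Crude sketch: nondegenerate Boltzmann occupation, 3D density of
    states ~ sqrt(E), and a hard cutoff removing carriers below E_B.
    """
    dE = E_max / n
    num = den = 0.0
    for k in range(1, n + 1):
        E = k * dE
        if E < E_B:
            continue  # filtered out by the grain-boundary barrier
        w = E ** 0.5 * math.exp(-E / (KB * T))
        num += E * w
        den += w
    return num / den

e_bulk = mean_transport_energy(300.0)             # ~ (3/2)*kB*T ~ 0.039 eV
e_filt = mean_transport_energy(300.0, E_B=0.060)  # barrier-filtered mean
```

With a 60 meV cutoff the mean carrier energy rises well above the unfiltered (3/2)k_BT value, the qualitative origin of the S enhancement claimed for these nanocomposites.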
PbTe nanocomposites were prepared by densifying 100 nm PbTe nanocrystals synthesized in high yield employing a solution phase technique. SPS successfully consolidated these nanoscale grains within a dense PbTe matrix. The carrier concentrations were modified by directly doping the PbTe nanocrystals with Ag prior to densification. The unique temperature dependence of σ and μ suggests an additional scattering mechanism in combination with the phonon carrier scattering dominant in single crystal and polycrystalline lead chalcogenides. For these nanocrystalline materials, the chemisorption of oxygen likely results in increased trapping of carriers at grain boundaries, forming energy barriers that impede the conduction of carriers between grains. This conduction is effectively described as dominated by grain boundary potential barrier scattering in combination with phonon scattering. Furthermore, these nanocomposites demonstrate an enhanced TE performance as compared to bulk PbTe; thus interfacial energy barrier carrier scattering is an effective method of thermoelectric performance enhancement in bulk nanocomposites.
In the preceding specification, all documents, acts, or information disclosed do not constitute an admission that the document, act, or information, or any combination thereof, was publicly available, known to the public, part of the general knowledge in the art, or was known to be relevant to solve any problem at the time of priority.
The disclosures of all publications cited above are expressly incorporated herein by reference each in its entirety to the same extent as if each were incorporated by reference individually.
While there has been described and illustrated specific embodiments of a thermoelectric device it will be apparent to those skilled in the art that variations and modifications are possible without deviating from the broad spirit and principle of the present invention. It is also to be understood that the following claims are intended to cover all of the generic and specific features of the invention herein described and all statements of the scope of the invention which as a matter of language might be said to fall therebetween.
| 393.883929 | 2,865 | 0.824209 | eng_Latn | 0.998229 |
0d85cc6f9d3aaaedfde4dc583342e87eb5f84e8f | 888 | md | Markdown | DOCUMENTATION.md | BibleJS/bible-romanian | 942f4b6926773263026f19436b4a26eeb0c11afa | [
"MIT"
] | 5 | 2015-08-19T15:57:05.000Z | 2021-11-01T18:19:58.000Z | DOCUMENTATION.md | BibleJS/bible-romanian | 942f4b6926773263026f19436b4a26eeb0c11afa | [
"MIT"
] | 1 | 2015-10-09T06:29:36.000Z | 2019-05-01T09:43:01.000Z | DOCUMENTATION.md | BibleJS/bible-romanian | 942f4b6926773263026f19436b4a26eeb0c11afa | [
"MIT"
] | 2 | 2017-11-10T07:36:31.000Z | 2021-11-01T18:19:52.000Z | ## Documentation
You can see below the API reference of this module.
### `getVerse(reference, callback)`
Fetches the verses that were requested in reference
e.g. Geneza 1:1 - returns one verse
or Geneza 1:1,2 - returns two verses (1 and 2)
or Geneza 1:1-10 - returns the verses 1 - 10
or Geneza 1 - returns the whole chapter
#### Params
- **String** `reference`: Bible verse reference
- **Function** `callback`: The callback function
#### Return
- ****
### `search(term, callback)`
Searches a String/Regular expression in all verses
#### Params
- **String|RegExp** `term`: String/Regular expression that should be contained in verses.
- **Function** `callback`: The callback function
#### Return
- ****
### `getBooks(callback)`
Returns an array with Bible books in Romanian
#### Params
- **Function** `callback`: The callback function
#### Return
- ****
| 21.142857 | 89 | 0.675676 | eng_Latn | 0.926874 |
0d873b9c46570e3ab58e904567cf0c09adf0d2ec | 129 | md | Markdown | docs/en/03_developers_guides/01_modules/elgg-salmon.md | Yaco/lorea-website | 1a44b82e58c6c453e34bc6ac87f9cb38e7e4a9c9 | [
"MIT"
] | null | null | null | docs/en/03_developers_guides/01_modules/elgg-salmon.md | Yaco/lorea-website | 1a44b82e58c6c453e34bc6ac87f9cb38e7e4a9c9 | [
"MIT"
] | null | null | null | docs/en/03_developers_guides/01_modules/elgg-salmon.md | Yaco/lorea-website | 1a44b82e58c6c453e34bc6ac87f9cb38e7e4a9c9 | [
"MIT"
] | null | null | null | Standard protocol for comments and annotations to swim upstream to original update sources
https://github.com/lorea/elgg-salmon
| 32.25 | 90 | 0.829457 | eng_Latn | 0.852401 |
0d87b892f7ca3b26a1744fd4e6fa14d08a6eda34 | 2,169 | md | Markdown | README.md | zhuyunfeng1224/XHUserStatistics | c19af0f3a53e1821971f41a71ba1a37fe247319a | [
"MIT"
] | 2 | 2016-12-08T07:06:17.000Z | 2017-03-13T11:01:58.000Z | README.md | zhuyunfeng1224/XHUserStatistics | c19af0f3a53e1821971f41a71ba1a37fe247319a | [
"MIT"
] | null | null | null | README.md | zhuyunfeng1224/XHUserStatistics | c19af0f3a53e1821971f41a71ba1a37fe247319a | [
"MIT"
] | null | null | null | # XHUserStatistics
[](https://travis-ci.org/echo/XHUserStatistics)
[](http://cocoapods.org/pods/XHUserStatistics)
[](http://cocoapods.org/pods/XHUserStatistics)
[](http://cocoapods.org/pods/XHUserStatistics)
## Example
To run the example project, clone the repo, and run `pod install` from the Example directory first.
## Requirements
## Installation
XHUserStatistics is available through [CocoaPods](http://cocoapods.org). To install
it, simply add the following line to your Podfile:
### Installation
```ruby
pod "XHUserStatistics", '~> 1.0.1'
```
### Create the plist file
* First, create a plist file named `UserStatistics.plist`
### Configure the plist file
* Add two nodes under the root node of the plist: `pageEvents` and `actionEvents`
`pageEvents` is used to add page events.
`actionEvents` is used to add custom events, but only methods inside a controller can be added.
The structure of the plist is as follows:
```
<dict>
<key>pageEvents</key>
<dict>
<key>XHViewController</key>
<dict>
<key>pageName</key>
<string>mainPage</string>
<key>appear</key>
<true/>
<key>disAppear</key>
<true/>
</dict>
</dict>
<key>actionEvents</key>
<dict>
<key>buttonClicked:</key>
<dict>
<key>XHViewController</key>
<dict>
<key>eventId</key>
<string>clickButton</string>
</dict>
</dict>
</dict>
</dict>
```
* Add the initialization code in your AppDelegate:
```
[XHUserStastisticsManager manager].actionEventBlock = ^(XHActionEvent *actionEvent) {
NSLog(@"there is a action Event: %@", actionEvent.eventId);
};
[XHUserStastisticsManager manager].appearPageEventBlock = ^(XHPageEvent *pageEvent) {
NSLog(@"there is a appear event of page: %@", pageEvent.pageName);
};
[XHUserStastisticsManager manager].disappearPageEventBlock = ^(XHPageEvent *pageEvent) {
NSLog(@"there is a disappear event of page: %@", pageEvent.pageName);
};
```
## Note:
All method parameter types for custom events must be Objective-C objects, not primitive types; for example, use NSNumber instead of NSInteger.
## Author
xihe, [email protected]
## License
XHUserStatistics is available under the MIT license. See the LICENSE file for more info.
---
Source: akashagarwal7/homebrew-tools, README.md (MIT)

# homebrew-tools
A homebrew tap
Usage (macOS):
`brew tap akashagarwal7/tools`
After adding the tap, you must run `brew update` so that the formulae present in this repo are available for you to install.
---
Source: thiesgerken/wavepi, README.md (BSD-3-Clause)

# WavePI (Parameter Identification for Wave Equations)
© 2017-2019 Thies Gerken, University of Bremen, `[email protected]`
Developed as part of my PhD-Project [_Dynamic Inverse Problems for Wave Phenomena_](https://nbn-resolving.de/urn:nbn:de:gbv:46-00107730-18)

## Dependencies
- `cmake >= 2.8.8`
- `deal.II >= 9.1.0-pre`
- `boost >= 1.62`
- `gtest >= 1.8.0` (optional)
Note that `deal.II` has to be configured with [TBB](https://www.threadingbuildingblocks.org/), MPI and UMFPACK support (either bundled or external). I use
```shell
cmake -DCMAKE_INSTALL_PREFIX=/usr/local -DDEAL_II_WITH_MPI=ON
```
for configuring `deal.II`.
## How to Build
Compile using `N` parallel jobs:
```shell
mkdir build
cd build
cmake ..
make -jN
```
Generate Eclipse Project Files: (Do not do this in a child directory)
```shell
cmake -G "Eclipse CDT4 - Unix Makefiles" -DCMAKE_ECLIPSE_VERSION=4.7 -DCMAKE_ECLIPSE_MAKE_ARGUMENTS=-j1 /path/to/wavepi
```
Change Build type to release (no assertions, typically runs 10 times faster):
```shell
cmake -DCMAKE_BUILD_TYPE=Release ..
```
Use the same command with `Debug` to go back. There are also `make` targets that switch the build type.
To only build the documentation (Doxygen), run `make doc` inside the build directory. The command `make run-doc` will also open the result in a browser.
## MPI
Add `-DWAVEPI_WITH_MPI` to the `cmake` invocation to enable MPI support. In this case, MPI is used to parallelize PDE solutions for different right-hand sides.
## Tests
This project uses [Google Test](https://github.com/google/googletest). Run the test suite using the binary `wavepi_test` (only built if `gtest` was found). You can also list all tests (`--gtest_list_tests`) and only run a subset of them (`--gtest_filter="[filter]"`, wildcards are allowed). Currently, a few of the tests should fail (L2 Adjoint to the wave equation by integrating backwards is not as good as `WaveEquationAdjoint`, and is not even correct if $`\nu\neq 0`$).
When using `CMake >= 3.10`, one can also run the tests using [`ctest`](https://cmake.org/cmake/help/latest/manual/ctest.1.html). Just run `ctest` in the build directory. `ctest -V` also shows test output, `ctest -N` lists all tests and `ctest -R <regex>` runs all tests that match the specified regex (use `.*` instead of `*`!). If you want colors using `ctest`, run `export GTEST_COLOR=1` beforehand.
## Remarks on the Code
It is common C++ practice to put all the code of templated classes into the header file, because the compiler needs to instantiate them for every compilation unit. For classes/functions that only depend on the space dimension as a template parameter, I ignored this rule and just added instances for one, two and three dimensions to increase compilation speed, as is common also in `deal.II`.
## Shell Autocompletion (ZSH)
Put (or symlink) [completions.zsh](completions.zsh) in `~/.zsh-completions` and make sure you have the following lines in `~/.zshrc`:
```shell
fpath=($HOME/.zsh-completions $fpath)
autoload -U compinit
compinit
```
---
Source: alexanduyin/vue_shop, README.md (CC0-1.0)

# vue_shop
This is a simple practice project: having just finished learning Vue, I built it to get some hands-on experience.
The project is a back-office management system. It is not very complex, but it involves a server and APIs, so the project cannot be fully demonstrated on its own.
---
Source: k4kpk/k4kpk.github.io, sota-guides/Buckeye Knob (W4G_NG-017).md (CC-BY-4.0)

---
layout: sota-guide
---
# SOTA Guide - Buckeye Knob, W4G/NG-017
#### Drive Guide - Buckeye Knob from Atlanta
* **Duration**: 1:45
* **Google Maps** URL from Atlanta (33.917, -84.3378):
* Atlanta, Buckeye, Akin, Atlanta (Double-header): http://goo.gl/maps/0XZ0E
* ... plus dinner: http://goo.gl/maps/cR97o
* **Seasonal/Limited Access**: All-season, dirt. Patrick drove it in February. May require SUV.
* **Directions**:
* GA-400 N for 47
* L on US-19 and go 5.1
* R on GA-9 N - go 8 miles.
* L on GA-60 - go 7.3.
* This is more of a Y than a left turn. It is near the entrance to the R Ranch resort community.
* R on GA-180 (Wolf Pen Gap Rd) - go 7.7
* L on Duncan Ridge Rd - go 3.9. (a.k.a. Forest Service road FS 39.)
* Duncan Ridge Rd Landmarks, where Duncan Ridge Rd meets GA-180:
* "Wolf Pen Gap" highway sign (green background, white letters) on the right.
* On the left, the dirt road has a big "Coopers Creek WMA" sign.
* There's a green-blazed trail on the right side of 180. I suspect it is the Coosa trail (which does not go to Coosa Bald).
* Park at campsite/wide spot in road on L, just past Duncan Ridge Trail (trailhead) on R.
* **Food**
* Last McDonalds: Dahlonega
* Penultimate McDonalds: GA-400 at GA-53 (38 miles north of I-285)
* Dinner: Smoke-n-Gold in Dahlonega
#### Drive Guide - Buckeye Knob from Akin Mtn
* **Duration**:
* **Google Maps**: http://goo.gl/maps/h64zm
* **Seasonal/Limited Access**: All-season, dirt. Patrick drove it in February. May require SUV.
* **Directions**:
* Head west (or southwest) on Mulkey Gap Rd and go 2.1
* Keep L (straight) at Ridge Runner Trail and go 0.7
* L on Duncan Ridge Rd (USFS-39) and go 5.2
* Might see cemetery across creek on L, just before turn.
* Will cross creek and follow a fork.
* Note: At 3.1 something merges from R
* Note: At about 3.5, keep L.
* Park at campsite on R, just past Buckeye Knob (on L) just before Duncan Ridge Trail (trailhead) on L.
#### Trail Guide
* **Duration**: 0.7 mile, 0:30
* **Navigation**
* NNE (toward Coosa) on Duncan Ridge Rd 500' from the campsite
* L on old road up to the ridge line and the Duncan Ridge Trail.
* The old road is recognizable only by some piles of dirt to stop people from driving it. It is not readily apparent as a road.
* Go W, then NW on DRT about .65 miles and 500' altitude.
* **Trailhead altitude**: 3415
* **Summit altitude**: 3734
* **GPS tracks/waypoints**:
* Trailhead: 34.78459,-83.99194
* Summit: 34.7873, -83.9982
#### Summit Guide
* Hang antenna from tree: yes
* Space to guy mast: yes
* Cell coverage: AT&T=OK, Vzn=OK, APRS=OK
#### Plan-B Candidates
* Coosa Bald, Sheriff Knob, Akin Mountain
---
Source: NusCracker25/stapleMD, STAPLER_APP.md (MIT)

To fix:
I had to deactivate the Ivy compiler because of ngx-socketio.
Watch for an update of that package.
My temporary solution was to add the following to my `tsconfig.json` file (the nested `"enableIvy": false` inside `angularCompilerOptions` is the important line):
````json
"enableIvy": false,
"angularCompilerOptions": {
  "fullTemplateTypeCheck": true,
  "enableIvy": false,
  "strictInjectionParameters": true
}
````
---
Source: Gags1409/lookforfood, CHANGELOG.md (MIT)

# Changelog
All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.
### [1.4.2](https://github.com/Gags1409/lookforfood/compare/v1.4.1...v1.4.2) (2020-10-05)
### Bug Fixes
* patch release task and updated README ([3cfd02c](https://github.com/Gags1409/lookforfood/commit/3cfd02c98feb3b60d582075cc148fd21bff282ab))
### [1.4.1](https://github.com/Gags1409/lookforfood/compare/v1.4.0...v1.4.1) (2020-10-05)
### Bug Fixes
* fixed commit lint and put urls in config ([7889e33](https://github.com/Gags1409/lookforfood/commit/7889e333bc778df40569f5e66fa74c17c8c9a95a))
## [1.4.0](https://github.com/Gags1409/lookforfood/compare/v1.2.0...v1.4.0) (2020-10-05)
### Features
* put documentation and fixed linting issues ([a38ec1b](https://github.com/Gags1409/lookforfood/commit/a38ec1be1622b638956fc0efd300cec2086dbc28))
## [1.3.0](https://github.com/Gags1409/lookforfood/compare/v1.2.0...v1.3.0) (2020-10-05)
### Features
* put documentation and fixed linting issues ([a38ec1b](https://github.com/Gags1409/lookforfood/commit/a38ec1be1622b638956fc0efd300cec2086dbc28))
## [1.2.0](https://github.com/Gags1409/lookforfood/compare/v1.1.0...v1.2.0) (2020-10-05)
### Features
* setup client ([91a51fd](https://github.com/Gags1409/lookforfood/commit/91a51fda2979bda9bff60455bceb06e2b3da47d0))
## 1.1.0 (2020-10-05)
### Features
* set up server ([2f6a122](https://github.com/Gags1409/lookforfood/commit/2f6a1221e033fb5e22c2a2a7f451c2f68c5071e3))
---
Source: adriens/katacoda-scenarios, README.md (MIT)

# Interactive Katacoda Scenarios

[](https://www.katacoda.com/rastadidi "Get your profile on Katacoda.com")
Visit https://www.katacoda.com/rastadidi to view the profile and interactive scenarios.
---
Source: luoyangpeng/laravel-superman, README.md (MIT)

# laravel-superman
Speed up your Laravel application with Workerman.
## Installation
```shell
$ composer require "itas/laravel-superman:^1.0" -vvv
```
## Configuration
```shell
$ php artisan vendor:publish --provider="Itas\\LaravelSuperman\\ServiceProvider" --tag=config
```
## Usage
```shell
$ php artisan superman:serve
```
---
Source: emwilbanks/SL2_32, README.md (MIT)

# SL2_32
Unit to shift a 32-bit value left by 2
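Functionally, the unit computes `out = in << 2` on a fixed 32-bit word. A small software model of that behavior (the repository itself presumably implements the unit in a hardware description language, so this Python sketch is only an illustration) has to mask the result so that bits shifted past bit 31 are discarded, just as in hardware:

```python
MASK32 = 0xFFFFFFFF  # keep only the low 32 bits, like a 32-bit register

def sl2(value):
    """Software model of the SL2 unit: shift a 32-bit value left by 2.

    Bits shifted out of the top are dropped, matching a fixed-width
    32-bit hardware shifter.
    """
    return (value << 2) & MASK32

print(hex(sl2(0x00000001)))  # 0x4
print(hex(sl2(0xC0000001)))  # 0x4 -- the top two bits are shifted out
```

In a MIPS-style datapath this kind of unit is typically used to scale word-aligned branch and jump offsets by 4.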
---
Source: OrangeKicker1/cfxnes, README.md (MIT)

# cfxnes
JavaScript NES emulator and emulation library.

:video_game: [Live demo](https://cfxnes.herokuapp.com)
:information_source: [How to use cfxnes as a library](lib)
## Features
- Supported ROM images: iNES, NES 2.0.
- Supported mappers: NROM, MMC1, MMC3, UNROM, CNROM, AOROM, BNROM,
NINA-001, Color Dreams.
- ROM images can be loaded from ZIP archive.
- Persistence of battery-backed RAM (game saves) in IndexedDB.
- Rendering using WebGL (with canvas API fallback).
- Fullscreen mode.
- Sound emulation using Web Audio.
- Zapper emulation using mouse.
- Gamepad support.
- Customizable key bindings.
- Plenty of configuration options.
## Supported Browsers
- Chrome (last 2 versions)
- Firefox (last 2 versions)
- Opera (last 2 versions)
- IE 11, Edge >= 12
- Safari >= 9
## Known Issues
- No sound in IE due to missing Web Audio support.
- Poor performance in IE, Edge.
- Very poor performance on mobile devices.
- Occasional graphical glitches in games using the MMC3 mapper.
- See [list of broken games](broken-games.md).
## Project Structure
- **[Core](core)** - [Readme](core/README.md)
- **[Lib](lib)** - [Readme](lib/README.md)
/ [Changelog](lib/CHANGELOG.md)
/ [API](lib/API.md)
/ [Examples](lib/examples)
- **[App](app)** - [Readme](app/README.md)
/ [Changelog](app/CHANGELOG.md)
## License
Cfxnes is licensed under the [MIT license](LICENSE.md).
---
Source: AndOrMaxi/OthelloAI, README.md (MIT)

# OthelloAI
An Othello game using the minimax algorithm (artificial intelligence).
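Minimax recursively evaluates a game tree, with the two players alternately maximizing and minimizing the score. A minimal, game-agnostic Python sketch of the idea (the repository's own Othello-specific implementation is not shown here, so the names and tree representation are illustrative):

```python
def minimax(node, maximizing):
    """Evaluate a game tree given as nested lists with numeric leaves.

    Leaves are terminal scores from the maximizing player's point of
    view; inner lists are the moves available at that position.
    """
    if isinstance(node, (int, float)):  # leaf node: a terminal score
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# The maximizing player picks the branch whose worst case is best:
tree = [[3, 5],   # opponent would answer with min(3, 5) = 3
        [2, 9]]   # opponent would answer with min(2, 9) = 2
print(minimax(tree, True))  # 3
```

In a real Othello engine the leaves come from an evaluation function (disc count, mobility, corner control), the recursion is cut off at a fixed depth, and alpha-beta pruning is usually added to skip branches that cannot affect the result.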
---
Source: abondr/15bell_starter_kit, vendor/trntv/yii2-aceeditor/README.md (BSD-3-Clause)

Ace Editor Widget For Yii2
=======================
Installation
------------
The preferred way to install this extension is through [composer](http://getcomposer.org/download/).
Either run
```
php composer.phar require trntv/yii2-aceeditor "*"
```
or add
```
"trntv/yii2-aceeditor": "*"
```
to the require section of your `composer.json` file.
Usage
-----------------------
Using a model
```php
trntv\aceeditor\AceEditor::widget([
    // You can either use it for a model attribute
    'model' => $my_model,
    'attribute' => 'my_field',
    // or just for an input field
    'name' => 'my_input_name',
    'mode' => 'html',   // programming language mode. Default "html"
    'theme' => 'github' // editor theme. Default "github"
]);
```
With an active field
```php
$form->field($model, 'field')->widget(
    'trntv\aceeditor\AceEditor',
    [
        'mode' => 'html',   // programming language mode. Default "html"
        'theme' => 'github' // editor theme. Default "github"
    ]
)
```
Lists of all available modes and themes see [here](https://github.com/ajaxorg/ace)
---
Source: anooshabn/Behavioral-Cloning, writeup_report.md (MIT)

# **Behavioral Cloning**
## Writeup
---
**Behavioral Cloning Project**
The goals / steps of this project are the following:
* Use the simulator to collect data of good driving behavior
* Build, a convolution neural network in Keras that predicts steering angles from images
* Train and validate the model with a training and validation set
* Test that the model successfully drives around track one without leaving the road
* Summarize the results with a written report
[//]: # (Image References)
## Rubric Points
### Here I will consider the [rubric points](https://review.udacity.com/#!/rubrics/432/view) individually and describe how I addressed each point in my implementation.
---
### Files Submitted & Code Quality
#### 1. Submission includes all required files and can be used to run the simulator in autonomous mode
My project includes the following files:
* model.py containing the script to create and train the model
* drive.py for driving the car in autonomous mode
* model.h5 containing a trained convolution neural network
* writeup_report.md summarizing the results
* video.mp4 video recording of vehicle running autonomously for two laps around track 1
#### 2. Submission includes functional code
Using the Udacity provided simulator and my drive.py file, the car can be driven autonomously around the track by executing
```sh
python drive.py model.h5 run1
```
#### 3. Submission code is usable and readable
The model.py file contains the code for training and saving the convolution neural network. The file shows the pipeline I used for training and validating the model, and it contains comments to explain how the code works.
### Model Architecture and Training Strategy
#### 1. An appropriate model architecture has been employed
My model consists of a convolution neural network with 5 convolutional layers of 5x5 (3 layers), 3x3 (2 layers) filter sizes and four fully connected layers. (model.py lines 100-110)
The model includes RELU layers to introduce nonlinearity (code line 20), and the data is normalized in the model using a Keras lambda layer (code line 100-104).
#### 2. Attempts to reduce overfitting in the model
The model was trained and validated on different data sets to ensure that the model was not overfitting (code line 94-95). The model was tested by running it through the simulator and ensuring that the vehicle could stay on the track.
#### 3. Model parameter tuning
The model used an adam optimizer, so the learning rate was not tuned manually (model.py line 114).
#### 4. Appropriate training data
I faced a lot of issues with the online simulator, and I kept losing data due to my bad internet connectivity. Though I downloaded the simulator and collected data on my local machine, I faced issues while uploading it to the workspace. So I ended up using the data provided by Udacity, and also flipped the images to generate more data.
### Model Architecture and Training Strategy
#### 1. Solution Design Approach
The overall strategy for deriving a model architecture was to make sure :
* the car should stay in the center of the road as much as possible
* if the car veers off to the side, it should recover back to center
My first step was to use a convolutional neural network model similar to the one used in the classroom, which was published by the autonomous vehicle team at NVIDIA. I thought this model was appropriate because it was tested in the classroom, and it was also the model NVIDIA used to train a real car.
In order to gauge how well the model was working, I split my image and steering angle data into a training and validation set.
```python
model.fit_generator(train_generator, steps_per_epoch=len(train_lines),
                    validation_data=validation_generator,
                    validation_steps=len(validation_lines), epochs=5, verbose=1)
```
Initially I used 5 epochs and the above-mentioned line of code to fit the model, but it took a lot of time, and due to my bad internet it kept disconnecting, so I had to train again and never finished. Instead, after taking a suggestion on the Slack channel, I modified it into the line of code below (setting the number of minibatches per epoch to the size of the dataset divided by the size of a minibatch).
```python
model.fit_generator(train_generator, steps_per_epoch=int(len(train_lines)/32),
                    validation_data=validation_generator,
                    validation_steps=int(len(validation_lines)/32), epochs=3, verbose=1)
```
The classroom video on generators really helped me organize my project.
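The generator pattern discussed above needs nothing from Keras itself. A minimal sketch of a batching generator (names like `samples` and `batch_size` are illustrative; a real implementation would load and preprocess images inside the loop) looks like this:

```python
import random

def batch_generator(samples, batch_size=32):
    """Yield shuffled minibatches forever, as fit_generator expects.

    `samples` stands in for the list of driving-log lines; here the
    items are passed through unchanged.
    """
    while True:  # loop forever; the caller decides when an epoch ends
        random.shuffle(samples)
        for offset in range(0, len(samples), batch_size):
            yield samples[offset:offset + batch_size]

gen = batch_generator(list(range(100)), batch_size=32)
first_batch = next(gen)
print(len(first_batch))  # 32
```

With this shape, setting `steps_per_epoch` to `len(samples) // batch_size` makes one epoch consume the whole dataset roughly once.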
As a final step I ran the simulator to see how well the car was driving around track one. The model was trained and validated on different data sets to ensure that the model was not overfitting (code line 94-95). The model was tested by running it through the simulator and ensuring that the vehicle could stay on the track.
At the end of the process, the vehicle is able to drive autonomously around the track without leaving the road.
#### 2. Final Model Architecture
The final model architecture (model.py lines 18-24) consisted of a convolution neural network with the following layers and layer sizes ...
#### 3. Creation of the Training Set & Training Process
As mentioned above, because of the issues I faced while collecting my own data using the simulator and uploading it, I ended up using the data provided by Udacity. The dataset consists of images from 3 different angles. Below are the images captured from the center, left, and right side of the road.
![Center](./examples/center_2016_12_01_13_30_48_287.jpg)
![Left](./examples/left_2016_12_01_13_30_48_287.jpg)
![Right](./examples/right_2016_12_01_13_30_48_287.jpg)
To augment the data set, I also flipped the images and angles.
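Flipping can be sketched without any image library. Treating an image as a list of pixel rows (a simplification of the NumPy arrays the project would actually use), a horizontal flip with the matching steering-angle negation looks like this:

```python
def flip_sample(image, angle):
    """Horizontally flip an image (a list of pixel rows) and negate
    the steering angle, mirroring left turns into right turns."""
    flipped = [row[::-1] for row in image]
    return flipped, -angle

image = [[1, 2, 3],
         [4, 5, 6]]
flipped, new_angle = flip_sample(image, 0.25)
print(flipped)    # [[3, 2, 1], [6, 5, 4]]
print(new_angle)  # -0.25
```

Applied to every sample, this doubles the training data and balances the left/right turn distribution.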
After the collection process, I had X number of data points. I then preprocessed this data by normalizing and mean-centering it using a Lambda layer (model.py line 62), as follows:

```python
model.add(Lambda(lambda x: x / 255.0 - 0.5, input_shape=(160, 320, 3)))
```
I finally randomly shuffled the data set (model.py lines 53-55).
I used this training data for training the model. The validation set helped determine if the model was over or under fitting. The model was trained and validated on different data sets to ensure that the model was not overfitting (code line 94-95). The model was tested by running it through the simulator and ensuring that the vehicle could stay on the track. I used an adam optimizer so that manually training the learning rate wasn't necessary.
---
Source: Rygel-XVI/Rygel-XVI.github.io, _posts/2018-02-25-activerecord_associations.markdown (MIT)

---
layout: post
title: "ActiveRecord Associations"
date: 2018-02-26 00:18:00 +0000
permalink: activerecord_associations
---
When writing a simple Ruby program we create associations by hand.
Say we are writing a video game that has characters and they own objects and they belong to a group that is questing merrily along attempting to save the town from a band of marauding goblins.
The Dungeon class would have instances of each Enemy class (e.g. goblins, wolves, etc.), and each of those Enemy objects would have equipment and loot. Equipment and loot would also be objects of a class such as Treasure or Equipment. We haven't even touched the adventuring group, who would have equipment and treasure as well as quests, levels, skills, and so on. It gets messy <em>fast</em>. As programmers we don't like messy: it makes code difficult to debug and refactor.
Databases make this so much easier and when using Ruby we use ActiveRecord which automates all of that manual object associating.
There are many different types of associations:
<ul>
<li>belongs_to</li>
<li>has_one</li>
<li>has_many</li>
<li>has_many :through</li>
<li>has_one :through</li>
<li>has_and_belongs_to_many</li>
</ul>
Instead of requiring us to manually write out each of the associations, these macros add a lot of different methods so that we don't have to micromanage or write them ourselves. Most importantly, they associate the objects by assigning their primary and foreign ids automatically for us, and also automatically dissociate them when we destroy an object.
I'm going to go over how the id's are assigned for the more common associations.
Let's say a character has an individual character quest.
```
class Quest
belongs_to :character
end
```
One method it inherits is:
```
character=
```
This associates the Quest and Character classes so that** Quest has a foreign id of character_id.** Therefore, this instance of Quest belongs *to only this instance* of Character.
Though a Quest can belong to only one character, say your character has many health potions to keep them alive on their adventure.
```
class Character
has_many :potions
end
```
This creates an association where **each potion has a foreign id of character_id** which associates that potion with that character.
```
character.potions
character.potion_ids
```
Allows you to see an array of potions that the character owns and allows the programmer to use the potions array as they would any other array of objects. It also gives you the potion_ids method which is an array of the primary keys of the potions.
Now, to make things more complicated, say that you have a group of adventurers and they go on a quest where they acquire treasure. No individual character owns all the treasure, and therefore the treasure belongs to the whole group (not just an individual).
We end up with the has_and_belongs_to_many association.
```
class Treasure
has_and_belongs_to_many :characters
end
```
Though this will probably work, has_and_belongs_to_many is deprecated, so it is better to use the has_many :through association with a join model, which would be represented as shown below.
```
class CharacterTreasure
  belongs_to :character
  belongs_to :treasure
end

class Treasure
  has_many :character_treasures
  has_many :characters, through: :character_treasures
end

class Character
  has_many :character_treasures
  has_many :treasures, through: :character_treasures
end
```
The database table for the CharacterTreasure model has a foreign id for both the Treasure and the Character models and associates them <em>through</em> the join table. Both the Character and Treasure models get the treasure_ids and character_ids arrays respectively, making it easy to iterate through the different associated objects.
<a href='http://roseweixel.github.io/images/revised_character_actor_with_join.png'>Here</a> is a good visual representation of the tables in a has_many through relationship.
There are many more methods that are given through ActiveRecord. I recommend checking out the Rails guide on <a href='http://guides.rubyonrails.org/association_basics.html#choosing-between-has-many-through-and-has-and-belongs-to-many'>ActiveRecord Associations</a> and the <a href='http://api.rubyonrails.org/classes/ActiveRecord/Associations/ClassMethods.html'>class methods</a> documentation.
---
Source: fabrizioschiavi/FSD-Wordpress-template-hierarchy, README.md (MIT)

# Wordpress template hierarchy
Simplified, text-only version of the popular WordPress template hierarchy diagram.

---
Source: npapoylias/ngps2019.github.io, about.md (MIT)

---
layout: page
title: About
permalink: /about/
---
Images courtesy of:
* [Andreas Komodromos](https://www.flickr.com/photos/andreas_komodromos/)
---
Source: bradmccoydev/keptn, releasenotes/releasenotes_V0.8.2-hotfix1.md (Apache-2.0)

# Release Notes 0.8.2-hotfix1
This is a hotfix release for Keptn 0.8.2 `helm-service` if deployed as an execution plane service.

---
## Fixes
- Fix duplicated Helm Deployment.Started/Finished CloudEvents when using helm-service as a remote execution plane [3888](https://github.com/keptn/keptn/issues/3888)
## Upgrade to 0.8.2-hotfix1
You only need to upgrade `helm-service` if [deployed in execution-plane (Multi-cluster setup)](https://keptn.sh/docs/0.8.x/operate/multi_cluster/) as follows:
```console
helm upgrade helm-service https://github.com/keptn/keptn/releases/download/0.8.2-hotfix1/helm-service-0.8.2-hotfix1.tgz -n keptn-exec
```
It is not required to upgrade the cli or any other services for this hotfix release.
# Whatsapp Electron App
Install the .deb package from the [releases](https://github.com/r-ush/whatsapp-electron-app/releases) and you are good to go!

Uses a simple Electron webview to open a WhatsApp window in the Electron app.
# eliteKit
<img src="https://api.nuget.org/v3-flatcontainer/elitekit/1.3.0.2/icon" width="144">
SkiaSharp-based components for .NET.
# Motivation
- Create reusable and beautiful controls using SkiaSharp. This is a community effort so all PR's are welcome.
# Supported Platforms
- Currently it is only available for Xamarin.Forms (iOS, Android and UWP) but more platforms will be added.
# Documentation
- This is an in-progress task which I will be updating regularly on a separate branch. You can check the progress at https://arqueror.github.io/eliteKit/
# License
Under MIT (see license file)
# Want to support this project?
Please submit bugs, features and send PR's. If you want to go further you can buy me a coffee using the link below and check my other repos.
<a href="https://www.buymeacoffee.com/jOUwyzl" target="_blank"><img src="https://www.buymeacoffee.com/assets/img/custom_images/purple_img.png" alt="Buy Me A Coffee" style="height: auto !important;width: auto !important;" ></a>
**Happy coding! :sparkles: :camel: :boom:**
---
icon: like
date: 2016-12-09
category: 随笔
tag:
- 轻言细语
---
# 轻言细语【十四】 (Gentle Whispers, XIV)
::: center
The fierce winter wind beats against the carriage window,
mingled with a faint, thin chill,
scattering beside the speeding train.
Outside the window, the scenery rushes away;
the heart grows cool, the mood turns languid,
a wisp of weariness piles up in the heart.
For whom, after all, is this toil?
The cold stillness deepens, thoughts crowd in disarray,
like a lonely little snowflake—
with no destination, yet full of attachments,
drifting through the void, slowly sublimating.
:::
---
title: Edge Integrations
subtitle: Introduction
description: A modern approach to audience-based content personalization.
categories: [develop]
tags: [collaborate, composer, continuous-integrations, webops, workflow]
contributors: [michellecolon-pantheon, jazzsequence, jspellman814]
type: guide
layout: guide
showtoc: true
anchorid: edge-integrations
permalink: docs/guides/edge-integrations/
editpath: edge-integrations/01-introduction.md
reviewed: "2022-03-23"
---
This guide is made to facilitate the onboarding process for developers who are implementing content personalization via Pantheon's [Advanced Global CDN](/guides/professional-services/advanced-global-cdn) into their own Drupal or WordPress website.
## What Is Edge Integrations? (Pilot)
Edge Integrations is a Software Development Kit (SDK) that allows users to personalize WordPress and Drupal. Pantheon's approach leverages tight integration between the CMS and our Global CDN with Edge Computing capabilities to deliver the right content to the right audience directly, and with fewer moving parts.
### How Does Edge Integrations Work?
Edge Integrations uses configuration at the "edge" or the CDN to enable personalization options for Geolocation or Interests. This is done by using HTTP vary headers that tell the CDN to return cached variations of content based on values identified by the user browsing the site.
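The vary-based caching just described can be sketched in a few lines. This is an illustrative simulation only (not Pantheon's or any CDN's actual code), and the `X-Geo-Country` header name is a hypothetical stand-in for whatever personalization value the edge derives:

```python
# Illustrative sketch: a cache that keys variants on the values of the
# request headers named in the response's Vary header.
cache = {}

def cache_key(url, vary_header, request_headers):
    # One cached variant per combination of varied header values.
    varied = tuple(request_headers.get(name.strip(), "")
                   for name in vary_header.split(","))
    return (url, varied)

def fetch(url, request_headers, origin, vary="X-Geo-Country"):
    key = cache_key(url, vary, request_headers)
    if key not in cache:              # miss: ask the origin once per variant
        cache[key] = origin(request_headers)
    return cache[key]

# Hypothetical origin that personalizes the homepage by country.
origin = lambda headers: "homepage for " + headers.get("X-Geo-Country", "default")

ca = fetch("/", {"X-Geo-Country": "CA"}, origin)
us = fetch("/", {"X-Geo-Country": "US"}, origin)
```

Because the country header is part of the cache key, the Canadian and US variants are stored and served independently, which is exactly what the `Vary` response header asks a cache to do.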
## Is Edge Integrations Right for You?
The benefits of Edge Integrations:

- Unified Experience
  - You can improve productivity by using existing content, style guides, media, and CMS integrations. No need to manage assets in multiple places.
- Performance-forward
  - Improve credibility with a fast, seamless customer experience, and distribute personalized content across dozens of global and US points of presence.
- Cost-effective
  - Increase business impact by instrumenting and measuring success with your current analytics products. No need for new segmentation tooling.
- Geographic targeting
  - Based on the location of the visitor, the site can deliver a different homepage — for example, Canadian visitors might see poutine, whereas US visitors would see pizza.
- Interest fingerprinting
  - Repeated engagement with types of content — e.g. looking at multiple vegan recipes — will create a specific Interest-based cache, which will reorganize a landing page to highlight content that matches the visitor's interest.
There are many more ways to leverage content variation to identify valuable audience segments or variants. Contact your Account Manager to learn more about Edge Integrations and get started.
## Glossary
<dl>
<dt>Vary Header</dt>
<dd>
The cache layer stores and registers content variants utilizing the [vary header](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Vary). The Vary HTTP response header describes the parts of the request message, aside from the method and URL, that influenced the content of the response it occurs in. It’s a key concept in the process of *content negotiation*. In HTTP, content negotiation is the mechanism that is used for serving different representations of a resource (for example, a WordPress Post or Drupal node) to the same URI to help the user agent specify which representation is best suited for the user (e.g. document language, personalization blocks, content-encoding, version of the content).
Example: `Vary: <header-name>, <header-name>`
- `<header-name>` corresponds to the personalization property or condition
</dd>
<dt>Segmentation (Drupal)</dt>
<dd>
Each segment corresponds to a different value within the personalization property/condition.
Example: `US` `CA` `ES` `UK`
- Used within the Geolocation condition, where each segment is a country.
- Within the Interest condition, we may have segments that correspond to particular terms in the Interest taxonomy vocabulary.
Every segment is defined through the Smart Content module UI, and is connected to the block content that will show up when the condition is met and the user is placed within the segment. For example, when the user is in Canada, the condition for the Geolocation is met and the user is placed into a `CA` segment. Then, blocks that respond to the `CA` segment will be rendered instead of generic blocks.
The combination of rendered segmented blocks creates a page variant that is later on stored inside the AGCDN cache and shown to the user without the need for CMS engagement.
</dd>
</dl>
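As a toy illustration of the segmentation flow described in the glossary (the block contents and segment data below are invented, and this is not the Smart Content module's real API): blocks that respond to a segment override the generic blocks, and the combined result is the page variant that gets cached:

```python
# Toy model of segment-driven block rendering; all data is invented.
generic_blocks = {"hero": "Welcome!", "menu": "Today's dishes"}
segmented_blocks = {
    "CA": {"hero": "Welcome, Canada: poutine today!"},
    "US": {"hero": "Welcome, US: pizza today!"},
}

def render_variant(segment):
    # Segment-specific blocks replace their generic counterparts.
    blocks = dict(generic_blocks)
    blocks.update(segmented_blocks.get(segment, {}))
    return " | ".join(blocks[name] for name in ("hero", "menu"))

variant_ca = render_variant("CA")
variant_generic = render_variant("ES")  # no ES blocks defined, generic page
```

A user placed in the `CA` segment gets the Canadian hero block, while a segment with no dedicated blocks falls back to the generic page.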
## Additional Resources
Support for Edge Integrations is available through the Pantheon Community slack channel `#edge-integrations`. If you're not already a member, [join the community here](https://slackin.pantheon.io/)!
# LicensePlates OCR
# JRCP - Java Rest Client Proxy
## Overview
A framework to create an HTTP client based on a given annotation framework. It creates proxies for defined interfaces, used like RMI proxies.
## How to use
### Maven Import
TODO
### Use in code
To create a simple JRCP client, always use the builder:
```Java
final var client = JRCPClient.createBuilder("http://localhost:8080/rest/api")
    .withApiInterface(Customer.class, Carret.class)
    .withAnnotationProvider(SpringAnnotationProvider.INSTANCE)
    .withStandardContentProviders()
    .build();
```
Now you can use this client to contact the server via the API interface proxies:
```Java
final var proxy = client.getProxy(Customer.class);
proxy.addCustomer(...);
proxy.getCustomers();
...
```
## Annotation Supports
Currently there is one supported annotation framework:
* **Spring** - for Spring Boot Annotations - Maven Artefact (TODO)
## Custom Extensions
### Annotation Provider
To add a custom annotation provider, implement the `AnnotationProvider` interface.
TODO: Example
### Content Provider
To add a custom content provider, implement one of these interfaces:
* `StringContentProvider` - Creates a string based entity (for body), used for e. g. JSON
* `BinaryContentProvider` - Creates a binary based entity (for body), used for e. g. Java Object Serialization
TODO: Example
# @calatrava/arc-utils
## 0.0.11
### Patch Changes
- 8839035: clean up console logs and change relevant logs into debug logs
- Updated dependencies [8839035]
  - @calatrava/[email protected]
## 0.0.10
### Patch Changes
- add init step for hasTeams and conditionally import teams stuff
## 0.0.9
### Patch Changes
- add missing arc template files to boilerplate and arc-utils
## 0.0.8
### Patch Changes
- rebuild dependencies and republish
## 0.0.7
### Patch Changes
- use proper config prop paths for arc stuff
## 0.0.6
### Patch Changes
- add debug util and use it in all packages
- Updated dependencies
  - @calatrava/[email protected]
## 0.0.5
### Patch Changes
- add arc cli command and clean up boilerplate commands
## 0.0.4
### Patch Changes
- Use custom delimiter in all packages
## 0.0.3
### Patch Changes
- Add files attribute package jsons and add tsconfigs where needed
## 0.0.2
### Patch Changes
- 23f045a: Implemented changesets
# react-with-actioncable
A higher order component (like redux's `connect`) to connect our components to ActionCable.
This contains assumptions on your channel behavior (for example, regarding which connection is rejected) so use at your own risk, or fork it if needed.
This is used with [react-actioncable-provider](https://github.com/cpunion/react-actioncable-provider) so you need to install it too.
```bash
yarn add react-actioncable-provider
```
## Usage
```javascript
import ActionCable from 'actioncable'
import ActionCableProvider from 'react-actioncable-provider'
<ActionCableProvider cable={ActionCable.createConsumer(WEBSOCKET_URL)}>
  <App />
</ActionCableProvider>
```
For components that need to subscribe to channels, we connect it like this:
```javascript
import withActionCable from 'react-with-actioncable'

const SomeComponent = (props) => {
  const {
    name,
    helloFunction
  } = props

  return (
    <div>
      {name && `Hello ${name}!`}
      <Button onPress={helloFunction} />
    </div>
  )
}

export default withActionCable({
  channel: 'HelloChannel', // The channel that we will subscribe to.
  // Parameters that we pass to the channel. Props are passed to it from the component
  params: (props) => ({ name: 'foo' }),
  // tell the HOC if we want to connect the component automatically or if we want to
  // connect to it manually using the provided props.cable.connectToChannel(params) function
  autoConnect: true,
  // this is called after we connect to the channel
  onConnect: (channel) => {},
  // this is called after we disconnect from the channel
  onDisconnect: () => {},
  // this is called if the server rejects our connection
  onReject: () => {},
  // this is called when we receive new data from the channel.
  // data is from the channel, props come from the props passed to the component.
  // this function should return an object, which will be passed as props to the component.
  onReceive: (data, props) => {
    return {
      name: data.name
    }
  },
  // this contains functions that we are going to use to broadcast to the channel
  // these will be passed down as props to the component as well
  broadcasters: {
    // it should return an object. the channel will now receive data containing { name: name }
    helloFunction: (name) => {
      return {
        name
      }
    }
  },
  // this contains functions that are also instance methods for the actioncable channel.
  // these will be passed down as props to the component.
  serverMethods: {
    foo: (bar) => {
      // it should return an object. the channel will now call HelloChannel#foo, with
      // { bar: bar } as data
      return {
        bar
      }
    }
  }
})(SomeComponent)
```
We can also pass this function to the `compose` function if we want to use it with redux or apollo.
```javascript
import { connect } from 'react-redux'
import { compose, graphql } from 'react-apollo'
import { withActionCable } from '../components/withActionCable'
import Component from './components/Component'
export default compose(
  graphql(),
  connect(mapState, mapDispatch),
  withActionCable({ channel: 'Foobar' })
)(Component)
```
## Todo
- Tests
# License
MIT
# Log correlation
Example of log correlation using Spring Cloud Sleuth and ELK stack.
This project has 4 services, with the following flow:
```
             -> Service-C
Service-A -> Service-B
             -> Service-D
```
- Service-A - is Spring MVC, call Service-B
- Service-B - is Spring Webflux, call Service-C and Service-D simultaneously, both Spring MVC
Each service has one endpoint to generate a log:
- http://localhost:8080/servicea/message
- http://localhost:8180/serviceb/message
- http://localhost:8280/servicec/message
- http://localhost:8380/serviced/message
The Docker Compose file has all services and the ELK stack configured.

Use `start.sh` to run the project; it will build the jar files, run docker-compose, wait for Kibana to become healthy, create the index pattern, and make some requests to generate logs.

After startup is complete, access: http://localhost:5601/app/discover#/
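Under the hood, the correlation relies on one idea: a trace id is generated at the entry point and propagated on every downstream call, and each service prints it in its log lines so Kibana can group them. The sketch below is a simplified illustration (the id format and log layout are assumptions, not real Spring Cloud Sleuth code); the service names follow this README:

```python
# Simplified illustration of trace-id propagation across the four services.
import uuid

def log(service, trace_id, message):
    # Sleuth-style prefix: [service,traceId] message
    return f"[{service},{trace_id}] {message}"

def service_c(trace_id):
    return [log("service-c", trace_id, "GET /servicec/message")]

def service_d(trace_id):
    return [log("service-d", trace_id, "GET /serviced/message")]

def service_b(trace_id):
    lines = [log("service-b", trace_id, "calling service-c and service-d")]
    return lines + service_c(trace_id) + service_d(trace_id)

def service_a():
    trace_id = uuid.uuid4().hex[:16]   # generated once at the entry point
    return [log("service-a", trace_id, "GET /servicea/message")] + service_b(trace_id)

lines = service_a()
```

Every line carries the same trace id, so a single Kibana filter on that id pulls up the whole request across all four services.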
### Log flow:

### Log correlation in Kibana

# Code_Challenge_Study_Tools
https://colab.research.google.com/drive/1dJFunMg6wT-ht8TQ5xpo3PNrnvDkjlhx#scrollTo=TBbI6bRqx_3j
---
title: Quitar usuarios de un grupo de SharePoint
ms.author: v-todmc
author: todmccoy
manager: mnirkhe
ms.date: 04/21/2020
ms.audience: Admin
ms.topic: article
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.custom:
- "9000237"
- "3198"
ms.openlocfilehash: 81b05e14fb3755c6602548087617f19ee1d585a5
ms.sourcegitcommit: 286000b588adef1bbbb28337a9d9e087ec783fa2
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 04/27/2020
ms.locfileid: "43911741"
---
# <a name="remove-users-from-a-sharepoint-group"></a>Remove users from a SharePoint group

A SharePoint group is a collection of users who have the same set of permissions for sites and content. Instead of assigning permissions to one person at a time, you can use groups to conveniently assign the same permission level to many people at once.

- [Remove users from a group](https://docs.microsoft.com/sharepoint/customize-sharepoint-site-permissions#remove-users-from-a-group)
- [Add or remove members from Microsoft 365 groups using the admin center](https://docs.microsoft.com/office365/admin/create-groups/add-or-remove-members-from-groups?view=o365-worldwide)
This repository was extracted from a larger internal project at
[Esper](https://esper.com).
We released it in the hope that it might be useful to other
OCaml developers.
It won't build as is but most of the code was used in production.
Description
-----------
Reports latencies of various API calls, in a daily email.
---
title: "life"
layout: categories
permalink: /categories/life/
author_profile: true
taxonomy: life
---
---
permalink: /
title: "About me"
excerpt: "About me"
author_profile: true
redirect_from:
- /about/
- /about.html
---
<br>
I am a Doctoral Student at KTH Royal Institute of Technology in Stockholm, Sweden, where I am part of the Division of Software and Computer Systems (SCS) of the School of Electrical Engineering and Computer Science (EECS). I am supervised by Prof. [Jim Dowling](https://www.kth.se/profile/jdowling), Prof. [Seif Haridi](https://www.kth.se/profile/haridi), Prof. [Henrik Boström](https://www.kth.se/profile/henbos), and Asst. Prof. [Paris Carbone](https://www.kth.se/profile/parisc).
My research is tightly intertwined with the [Continuous Deep Analytics](https://cda-group.github.io/) (CDA) project, whose purpose is to build the foundations of next-generation scalable Big Data platforms that can make real-time decisions based on massive live data. In this project, I focus on systems for Graph Data Management and large-scale Graph Analytics. More specifically, I investigate how Graph Representation Learning can be leveraged to solve graph data management problems.
My research focuses on <b>Large Graph Analytics</b>, <b>Graph Representation Learning</b>, and <b>Stream Processing</b>.
# openSNP
[](https://travis-ci.org/openSNP/snpr) [](https://codeclimate.com/github/openSNP/snpr) [](https://codeclimate.com/github/openSNP/snpr/coverage) [](https://gitter.im/openSNP/snpr?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
Hello! [openSNP.org](https://opensnp.org) is a repository to which users can upload their SNP-sets (and exome-VCFs) from
23andme, deCODEme, FamilyTreeDNA, AncestryDNA and IYG-format (for participants
of EBI genotyping). On upload, SNPs are annotated using the PLoS and
Mendeley-APIs to show users the newest scientific research results on their
SNPs. Each SNP is also linked to the relevant page on SNPedia. SNPs are ranked
according to how many results could be gathered for SNPedia, PLoS and Mendeley
(in that order). Users can send each other private messages as well as comment
on SNPs and Phenotypes.
## Installing openSNP
Please see [INSTALL.md](https://github.com/openSNP/snpr/blob/master/INSTALL.md) for more detailed instructions on how to run and setup openSNP.
## Contributing to openSNP
Thanks for your interest in helping us out! You are awesome! Please see [CONTRIBUTING.md](https://github.com/openSNP/snpr/blob/master/CONTRIBUTING.md) for more detailed instructions on how to contribute. We also [have a ROADMAP.md](https://github.com/openSNP/snpr/blob/master/ROADMAP.md), containing our idea of where the project should head.
The project has a [Code of Conduct](https://github.com/openSNP/snpr/blob/master/CODE_OF_CONDUCT.md) in order to make this a safe and inclusive space for everyone.
Thanks go to everyone who has contributed so far. May you be celebrated, [inside our humans.txt](https://github.com/openSNP/snpr/blob/master/public/humans.txt) and outside of it!
## Getting in contact
You can always open an issue for specific problems, or send a mail to [email protected] if you want to discuss something or if you have any questions or need help with something. There's also [email protected] if something broke on the site itself.
We're also available on Twitter:
@[gedankenstuecke](https://twitter.com/gedankenstuecke)
@[helgerausch](https://twitter.com/helgerausch)
@[philippbayer](https://twitter.com/philippbayer)
[You can also join us on Gitter](https://gitter.im/openSNP/snpr).
<!---
.. ===============LICENSE_START=======================================================
.. Acumos CC-BY-4.0
.. ===================================================================================
.. Copyright (C) 2018 AT&T Intellectual Property & Tech Mahindra. All rights reserved.
.. ===================================================================================
.. This Acumos documentation file is distributed by AT&T and Tech Mahindra
.. under the Creative Commons Attribution 4.0 International License (the "License");
.. you may not use this file except in compliance with the License.
.. You may obtain a copy of the License at
..
.. http://creativecommons.org/licenses/by/4.0
..
.. This file is distributed on an "AS IS" BASIS,
.. WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
.. See the License for the specific language governing permissions and
.. limitations under the License.
.. ===============LICENSE_END=========================================================
-->
# Zero-to-Acumos README
>NOTE: Work in progress - subject to change.
In the Acumos `system-integration` repository, the `z2a` sub-directory contains
the scripts that perform installation actions based the flows described below.
## Flow-1
Flow-1 consists of three (3) steps using the following scripts (and descriptions):
```bash
# Step 0[a-c]
z2a/0-kind/0a-env.sh # z2a environment creation
z2a/0-kind/0b-depends.sh # dependency installation and setup
z2a/0-kind/0c-cluster.sh # Kubernetes ('kind') cluster creation
# Step 1
z2a/1-acumos/1-acumos.sh # Acumos noncore and core component setup
# Step 2 (optional)
z2a/2-plugins/2-plugins.sh # Acumos plugins setup (including dependencies)
```
>NOTE: In Flow-1, the `z2a` environment creation script (`0a-env.sh`) will have
>to be executed during the initial setup and again after logging out and logging
>back into the new session.
## Flow-1 VM Requirements
* At the time of this writing, the Operating System installed on the VM must
be either RedHat/CentOS (v7 or greater, v8 recommended) or Ubuntu (18.04 or
greater, 20.04 recommended).
>NOTE: earlier versions of RedHat/CentOS (v6) or Ubuntu (16.04) may be
>sufficient to run the z2a installation, but they have not been tested.
>
>NOTE: Version 0.8.1 of `kind` provides new cluster recovery capabilities.
>`kind` v0.8.1 requires that the VM used be Ubuntu 20.04 or Centos 8 to
>operate properly.
* Flow-1 VM Resource Sizing Recommendations
  * four (4) vCPU (minimum)
  * 32GB of memory (minimum)
  * 80GB disk space (minimum) (~100GB+ for MLWB and other plugins)
  * additional disk space for models (based on developer requirements)
* VM Distribution Recommendations
  * git (source code tool)
    * git is not installed by default by Linux distributions
    * git must be installed to allow for Acumos repository replication
  * yq (YAML processing tool)
  * jq (JSON processing tool)
### Miscellaneous Requirements
* An SSH client with port-forward/tunnel/proxy capabilities, such as:
  * PuTTY (Windows SSH client)
  * SecureCRT (MacOS SSH client)
  * OpenSSH (Linux SSH client)
* For Flow-1 installation, the user **must** have sudo rights on the VM (i.e.
must exist in the `/etc/sudoers` file).
* For Flow-1, the VM requires Internet access such that OS updates, OS
supplemental packages and Helm chart installations can be performed. Either
the VM has proxied access to the Internet or the user must be able to
configure the proxy setting for the VM.
>NOTE: internet proxy configurations are beyond the scope of the installation
>documentation. Please see the README-PROXY.md document for assistance with
>proxy configuration requirements.
## Flow-1 Deployment
Flow One (Flow-1) performs a complete `z2a` Acumos installation including
environment creation, VM Operating System preparation, dependency installation,
Kubernetes cluster creation and deployment of Acumos noncore and core
components. Flow-1 is based on the original `z2a` process flow targeting
development/test environments where a Kubernetes cluster is built from scratch
on a single VM.
### Flow 1 - Steps 0[a-c]-*
In the directory `z2a/0-kind` there are three (3) scripts which perform the
following tasks:
* End-user environment setup (`0a-env.sh` script)
* Linux distribution (RHEL/CentOS or Ubuntu) setup
* Dependency and OS tools installation (`0b-depends.sh` script)
* Kubernetes cluster creation (`0c-cluster.sh` script)
>NOTE: Execution of the `z2a/0-kind/0a-env.sh` script creates and populates
>environment variables necessary for proper operation of subsequent scripts.
>
>NOTE: For 1st time users, the user performing the installation MUST log out
>of their session after the successful completion of `z2a/0-kind/0b-depends.sh`
>script. The logout is required such that the user (installer) can join the
>`docker` group that has just been created.
>
>Upon logging back into a session, the user (installer) will be a member of
>the `docker` group and can proceed by re-executing the `0a-env.sh` script
>and then the `0c-cluster.sh` script located in the
>`~/system-integration/z2a/0-kind` directory. Any subsequent re-run of the
>`z2a/0-kind/0b-depends.sh` script does not require the user to log out
>(one time requirement).
### Flow 1 - Step 1-acumos
In the directory `z2a/1-acumos` there is a single (1) script which performs:
* the installation of the Acumos non-core components (`1-acumos.sh` script)
* the installation of the Acumos core components (`1-acumos.sh` script)
### Flow 1 - Step 2-plugins
In the directory `z2a/2-plugins` there is a single (1) script which performs:
* the installation of the Acumos plugin dependencies (`2-plugins.sh` script)
* the installation of the Acumos plugins (`2-plugins.sh` script)
Currently, the only Acumos plugin supported is MLWB (Machine Learning WorkBench).
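For quick reference, the full Flow-1 sequence can be listed in the same annotated style used for Flow-2 below. This is only a sketch of the steps described above; paths are relative to the default `~/system-integration` checkout location:

```bash
# Step 0
z2a/0-kind/0a-env.sh      # z2a environment creation
z2a/0-kind/0b-depends.sh  # dependency and OS tools installation
# (1st-time users: log out, log back in, then re-run 0a-env.sh)
z2a/0-kind/0c-cluster.sh  # Kubernetes (kind) cluster creation
# Step 1
z2a/1-acumos/1-acumos.sh  # Acumos noncore and core component setup
# Step 2 (optional)
z2a/2-plugins/2-plugins.sh  # Acumos plugins setup (including dependencies)
```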
## Flow-2
Flow-2 consists of three (3) steps using the following scripts (and descriptions):
```bash
# Step 0
z2a/0-kind/0a-env.sh # z2a environment creation
# Step 1
z2a/1-acumos/1-acumos.sh # Acumos noncore and core component setup
# Step 2 (optional)
z2a/2-plugins/2-plugins.sh # Acumos plugins setup (including dependencies)
```
## Flow-2 Deployment
Flow Two (Flow-2) performs a `z2a` Acumos installation including environment
creation and deployment of Acumos noncore and core components. Flow-2 is based
on the original `z2a` process flow, but is targeted at Acumos installations
onto a Kubernetes cluster that is already built and ready for application
installation.
### Flow 2 - Step 0a
In the directory `z2a/0-kind` there is a single (1) script which performs the
following task:
* End-user environment setup (`0a-env.sh` script)
>NOTE: Execution of the `z2a/0-kind/0a-env.sh` script creates and populates
>environment variables necessary for proper operation of subsequent scripts.
### Flow 2 - Step 1-acumos
In the directory `z2a/1-acumos` there is a single (1) script which performs:
* the installation of the Acumos non-core components (`1-acumos.sh` script)
* the installation of the Acumos core components (`1-acumos.sh` script)
### Flow 2 - Step 2-plugins
In the directory `z2a/2-plugins` there is a single (1) script which performs:
* the installation of the Acumos plugin dependencies (`2-plugins.sh` script)
* the installation of the Acumos plugins (`2-plugins.sh` script)
Currently, the only Acumos plugin supported is MLWB (Machine Learning WorkBench).
-----
## Known Issues
ISSUE: At the time of this writing, the `kind` (Kubernetes in Docker) cluster
does not persist across a VM reboot OR a Docker service reconfigure/restart
operation. Development activities to add this cluster recovery capability are
being performed by the upstream developers. At this time, if (for some reason)
the VM is rebooted or the Docker service is restarted, portions of the `z2a`
installation process must be executed again and any "work" may be lost. End-users
must ensure that any work performed in the current `z2a` environment is saved
outside of `z2a`.
>NOTE: Version 0.8.1 of `kind` provides new cluster recovery capabilities.
>`kind` v0.8.1 requires Ubuntu 20.04 or CentOS 7/8 to install correctly and
>operate properly.
ISSUE: `z2a` performs post-installation component configuration. The `z2a`
scripts perform a complete installation of Acumos and where automation can be
applied, automated configuration is performed. As `z2a` matures, automated
post-installation configuration will be extended to additional components so
that their configurations can be easily maintained.
At this time, automated configuration of only the following components is
being performed:
* MariaDB (for Common Data Services)
* Sonatype Nexus
* Kong (and PostgreSQL)
* Note: Kong has been deprecated. Replaced with native k8s ingress w/ Nginx.
* Nginx (for k8s ingress and native service proxies)
```bash
// Created: 2020/03/20
// Last modified: 2020/12/18
```
# Images
## Creating Docker images with Dockerfile
Dockerfile = contains the instructions for how to build a Docker image.
```Dockerfile
FROM microsoft/dotnet:aspnetcore-runtime
WORKDIR /app
COPY ./out .
ENTRYPOINT ["dotnet", "samplewebapp.dll"]
```
To build an image from the Dockerfile above we can just run this command:
```PowerShell
docker build -t samplewebapp .
```
## Multi-stage Dockerfiles
Single-stage Dockerfile:

- Copies in a pre-built application

Using Docker to build your application:

- No developer SDKs needed locally

Multi-stage Dockerfile:

- Stage 1: build the application in a container with the SDKs
- Stage 2: copy the published application into a container with the runtime
```Dockerfile
FROM microsoft/dotnet:sdk AS build-env
WORKDIR /app
COPY *.csproj ./
RUN dotnet restore
COPY . ./
RUN dotnet publish -c Release -o out
FROM microsoft/dotnet:aspnetcore-runtime
WORKDIR /app
COPY --from=build-env /app/out .
ENTRYPOINT ["dotnet", "samplewebapp.dll"]
```
We run dotnet restore separately to create a distinct layer for the dependencies. That's for efficiency.
| 22.285714 | 104 | 0.765568 | eng_Latn | 0.846111 |
0da80749d84ab590e72d919bbb4ad80870b367aa | 8,182 | md | Markdown | _posts/2010-08-19-661.md | TkTech/skins.tkte.ch | 458838013820531bc47d899f920916ef8c37542d | [
"MIT"
] | 1 | 2020-11-20T20:39:54.000Z | 2020-11-20T20:39:54.000Z | _posts/2010-08-19-661.md | TkTech/skins.tkte.ch | 458838013820531bc47d899f920916ef8c37542d | [
"MIT"
] | null | null | null | _posts/2010-08-19-661.md | TkTech/skins.tkte.ch | 458838013820531bc47d899f920916ef8c37542d | [
"MIT"
] | null | null | null | ---
title: >
Easter Egg Claptrap
layout: post
permalink: /view/661
votes: 2
preview: "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACUAAAAgCAIAAAAaMSbnAAAABnRSTlMA/wD/AP5AXyvrAAAGH0lEQVRIiZWXXWwU1xXH//djZnbtjdd23WC5OP4qJI0ULME6bMD4I0EqUkRfmpcoEoY4saoi85S8pE+JQiX6goA4ckKtVkqqNA9Vq7YPfWhqryWEBSGmpAKHDdCU1MuHvfbOrmdn5s7cm4dRxjNLg7bn6T+7v6P/PWfO3nuXKCXxbbRfpG8tvnfmzBld14UQACYmJt7ePXlnYJN5SLR/Sn9x/vTMzAwhxPM8TdOOHDly/Oljd3ZvptNYhoLrupxz13UppQCUUnDr8QIAODieOXb48GHHcSqVyvj4+PF9x2rS434+NE3buXMngHK53N/fTwipQR4WPqBQKpVM02SMUUpxF4i3hseeqjh37tzly5f/NTmJXbvSr74qpUSmbr8K0ATDMDjn6+vrUko01lYUf0pB13Up5a6JCQBSSl3X/49+PgIAnuclEommpibLskCAhofUZ0Ap5fu+89RT/YwppaSUaKrbTwMIWltbi8UiAF3XQQDy3X63n3QvvHzB9/3FxUXP87LZ7NTUlBBTddr9s+fO3bt35+W8Uooxpuv6jf6KYRhRJtZPSmk2mz179mwmkxkYGJienr5y5Yqu63X6JZPJnp4exhhjLPgkkUgQEiuQt7V9P3xYXl6WUgI4ffo0AELIjh07AESZlZX70fzgq2BNS0tLjLFCoaCUIoSUSqXAuK2tLWQ4IYRS6vs+57xcLre2tgavTQjhed7y8jJjLMrU1EQp1TSNMeY4jqZpnucxxoQQ0a5EGbqyslIqlSqVimVZvu9bljU2NraxseG67sGDB9va2jo6OqJMjd/a2lq1Wl1ZWVldXeWcF4tF3/cZY0opAFJKz/NizMDAgGEYzc3Nvu/v+HwLUvgxPTQ5OSml7O7u3rbUBIEoU+OXyWQYYw0NDaVSqfOyDgc/Xf25bduJRMLzvI7PGTaQyWRDhv9n6iIooAAFlIAqCoXCgQMH7t27l8vlUAE0xJh43Dq1APbtmN8BkgCglCoWi7ZtA4ATYzgqAAVMgAECaEE6nZ6ens7n89lsFgJwARFhasIGFBDMfAVIo7m52bKsVCplmiYEkIwxFD5AgBTQAEhAoFKp5PP5kydPJpNJVIH/IsbUhAASwVQACvf2kN+9/157OvVog/6HD3/b8Pz3YMcY/sKfjpqmaRgGIeTPL56FBsdxTpw4QQgxDANJoAsv/HGTwUjMbv9vXhJCpNNp0zTnxj9u0PWeRx8RvteeTn1RWBe+sBLY/84mQymlwV6QTqe92/Bu4crR2WvXrl26dOniK3/zVuF9jShTU56mael0OphJ7ypaGpOE4N2P/2pZhfbmlPDgXY8x/AcdHcEvjHMubwPN2PLLJy7QC5zzLZeeLLxyFRRRpsavt6cnWIppmvJrbLy5fvF1+7UXh6b/8tlPRnavvbGKYozhnZ2dmqb5vi+l9JfR+uvupaPjaGwEIRCi79Sp+y/d6Ny+ydT4bd26NZj1crnsXwXug75un/uV2jeQqfxsHUXAwtYfbTIcgBDC932llKygpaUlMTNjGAZjzHVdlzF5H9i+yTwwMJBSVqtVALIMWABAX3ZM5WANUIATY0j0/jJ2aCwQruuGG5Lruh/9/qMHbR6Mxx9/YnhoCMA/ZmcZY6H+8st8yMTOh7lcjhCilJrL5aK6HjMAe555xrZtIcTw0FBUR5lYfb29fQAopeF7CvTNmzfq8evq6qaUEkKCbS/UX33175CJzdvoyEjgJKXknIe6zvoIIfsGBxljf//kE8ZYqKNMrJ+zc3PBYTQ7NxfVdfqNDA8HYv9zz0V1bE3Rfvb1/ZAxZtt2cJsL9a1bN+vx6+rqJoRIKRljUspQR9Nj/dw3OEgpDU5L3/dDXWd9UspnR0cB5Obno/o76+vq6t67Zw9j7PzCghAi
1NGBfkiMHRpzXVfTNKUUpTTUH3z4wf/2e+yxLimlpmmcc8dxQp3PX6/Hr7e3j1Jq23bQklBHxzvWz2dHRznnwY1KCBHqOoMxNjI8XCqVPltcJISEOsrE5vP8wkIgcvPzUV2n3/DQkJSysbFxcO/eqI4ysX5u27Y9GCpKaXCnC/T161/U4xf0M/hj5XleqKPp3wAn6mI4P/yRdQAAAABJRU5ErkJggg=="
---
<dl class="side-by-side">
<dt>Preview</dt>
<dd>
<img class="preview" src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACUAAAAgCAIAAAAaMSbnAAAABnRSTlMA/wD/AP5AXyvrAAAGH0lEQVRIiZWXXWwU1xXH//djZnbtjdd23WC5OP4qJI0ULME6bMD4I0EqUkRfmpcoEoY4saoi85S8pE+JQiX6goA4ckKtVkqqNA9Vq7YPfWhqryWEBSGmpAKHDdCU1MuHvfbOrmdn5s7cm4dRxjNLg7bn6T+7v6P/PWfO3nuXKCXxbbRfpG8tvnfmzBld14UQACYmJt7ePXlnYJN5SLR/Sn9x/vTMzAwhxPM8TdOOHDly/Oljd3ZvptNYhoLrupxz13UppQCUUnDr8QIAODieOXb48GHHcSqVyvj4+PF9x2rS434+NE3buXMngHK53N/fTwipQR4WPqBQKpVM02SMUUpxF4i3hseeqjh37tzly5f/NTmJXbvSr74qpUSmbr8K0ATDMDjn6+vrUko01lYUf0pB13Up5a6JCQBSSl3X/49+PgIAnuclEommpibLskCAhofUZ0Ap5fu+89RT/YwppaSUaKrbTwMIWltbi8UiAF3XQQDy3X63n3QvvHzB9/3FxUXP87LZ7NTUlBBTddr9s+fO3bt35+W8Uooxpuv6jf6KYRhRJtZPSmk2mz179mwmkxkYGJienr5y5Yqu63X6JZPJnp4exhhjLPgkkUgQEiuQt7V9P3xYXl6WUgI4ffo0AELIjh07AESZlZX70fzgq2BNS0tLjLFCoaCUIoSUSqXAuK2tLWQ4IYRS6vs+57xcLre2tgavTQjhed7y8jJjLMrU1EQp1TSNMeY4jqZpnucxxoQQ0a5EGbqyslIqlSqVimVZvu9bljU2NraxseG67sGDB9va2jo6OqJMjd/a2lq1Wl1ZWVldXeWcF4tF3/cZY0opAFJKz/NizMDAgGEYzc3Nvu/v+HwLUvgxPTQ5OSml7O7u3rbUBIEoU+OXyWQYYw0NDaVSqfOyDgc/Xf25bduJRMLzvI7PGTaQyWRDhv9n6iIooAAFlIAqCoXCgQMH7t27l8vlUAE0xJh43Dq1APbtmN8BkgCglCoWi7ZtA4ATYzgqAAVMgAECaEE6nZ6ens7n89lsFgJwARFhasIGFBDMfAVIo7m52bKsVCplmiYEkIwxFD5AgBTQAEhAoFKp5PP5kydPJpNJVIH/IsbUhAASwVQACvf2kN+9/157OvVog/6HD3/b8Pz3YMcY/sKfjpqmaRgGIeTPL56FBsdxTpw4QQgxDANJoAsv/HGTwUjMbv9vXhJCpNNp0zTnxj9u0PWeRx8RvteeTn1RWBe+sBLY/84mQymlwV6QTqe92/Bu4crR2WvXrl26dOniK3/zVuF9jShTU56mael0OphJ7ypaGpOE4N2P/2pZhfbmlPDgXY8x/AcdHcEvjHMubwPN2PLLJy7QC5zzLZeeLLxyFRRRpsavt6cnWIppmvJrbLy5fvF1+7UXh6b/8tlPRnavvbGKYozhnZ2dmqb5vi+l9JfR+uvupaPjaGwEIRCi79Sp+y/d6Ny+ydT4bd26NZj1crnsXwXug75un/uV2jeQqfxsHUXAwtYfbTIcgBDC932llKygpaUlMTNjGAZjzHVdlzF5H9i+yTwwMJBSVqtVALIMWABAX3ZM5WANUIATY0j0/jJ2aCwQruuGG5Lruh/9/qMHbR6Mxx9/YnhoCMA/ZmcZY6H+8st8yMTOh7lcjhCilJrL5aK6HjMAe555xrZtIcTw0FBUR5lYfb29fQAopeF7CvTNmzfq8evq6qaUEkKCbS/UX33175CJzdvoyEjgJKXknIe6zvoIIfsGBxljf//kE8ZYqKNMrJ+zc3PBYTQ7NxfVdfqNDA8HYv9zz0V1bE3Rfvb1/ZAxZtt2cJsL9a1bN+vx6+rqJoRIKRljUspQR9Nj/dw3OEgpDU5L3/dDXWd9UspnR0cB5Obno/o76+vq
6t67Zw9j7PzCghAi1NGBfkiMHRpzXVfTNKUUpTTUH3z4wf/2e+yxLimlpmmcc8dxQp3PX6/Hr7e3j1Jq23bQklBHxzvWz2dHRznnwY1KCBHqOoMxNjI8XCqVPltcJISEOsrE5vP8wkIgcvPzUV2n3/DQkJSysbFxcO/eqI4ysX5u27Y9GCpKaXCnC/T161/U4xf0M/hj5XleqKPp3wAn6mI4P/yRdQAAAABJRU5ErkJggg==">
</dd>
<dt>Original</dt>
<dd>
<img class="preview" src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAEAAAAAgCAYAAACinX6EAAAJWUlEQVR42t2ZeWhX2RXHnwuuMU0aNS5R45po3Je4xGWEERRUhLpvidG4ov+oaHUURyMuqRrXMRZ3axmnM5TC0OmMdTQmUWPcQBCXioqgguK+L6f3c+69yQ869BepRSeBw7vv3vve757v+Z7tJQjC/MWeDWTNmjUSHx8vCQkJ0qZNG2nWrJls2rRJWOvUqdN/leDX/oeSa9eulaZNm0piYqKC0Lx5c1m/fr0C0LZt27IPAAxISkpS5Rs2bKiSlZWlALRu3Vo6dOggkyZNkokTJ0pGRoYKc2WKAdAeBjRq1EjZsHr1agUAYHCLCRMmoKwKAKA8IJQZBjRu3FgF5UNdAAa0a9dOre8BgA0o3759+7IBAHTH8iiOAMTmzZsVgFatWknHjh1l8uTJCsK4ceOKXQBgygQA69atU7/H+i1btlR32LhxYzEAWBpXYIzSuATMAJhfPwAnA1UWy6M8ShILvAu0aNFCFWbep0kyA0Bw/fQVPB5IbIGRU+5a6MZnAlWQ+5ycnGKLoiCiDGDvabfPS1HI86wZADds2KCAwBSfNZjjtz8+ACec4vlGDhs5ZCTXCPPH7BVl/eE9xVWBPKugKsz4B3f9yb3joLsvsG4EgyiotIg6Y/d8Ggw44pT/2Sn9z0Cto3Pm8Fu3bpX09HQFoEmTJjJ27FjLgAIHgGfRMWf1E+4dP7p3G3CXLFki9evX1ziyZcsWC9bhT4UBP4cof9pZ7aiRv1tFiOxYHQvi/1B49OjRdv9x96yn/RnnBgUOwO/tu1asWKGZJCYmxjLgqAXmvQ5rXDEoKvqwoOlhDzr65jvr5brxQXtPeiPKE9RwBdwgLS3NWrfAgXXUgXHKxZFcx4Jcu7Zs2TINkqRQLaLyLFjvc9ZOIv8HAPKd1UMor4f2gJjABv2xvI8BADF+/HhrbR80TznwCkPExxHzbuqGWrVqqWRnZxe/+6O7wJMnT+TSpUty7tw5OXnypJw/f14uXLggFy9elPv378urV68kLy9PUlNTVXGYACNevHghjx8/lrt378q9e/fkxo0bcvnyZX3X7du35c6dO7rGmHcTM2rWrCmxsbEaA/jd169ff3wAnj17pgfhoLdu3ZKrV6/KlStXVAGUf/PmjQpjmAD12V9YWCgvX76Up0+fyvPnz/U5FOY5gPAAPHz4UB49emQDn1HeA+Df+8nVBRmmjkcmGmUnGGX9vZdwzxMjvNATIKFz4dbDxoFfaLl3bN+u8kG6Tyo7qj5KXwIWac8rz1ppAXB9gIovjXEfL6HrnxQAKJ9m/D3VBDnf+HAlfwNGuOdRjLKYeMFzKEj6BACA8ADwvg8JwAf7AIOlx5u8jwtMMsEOMNJNr898uu35wzIAEBACJwHTZw+fQfyYbwjEFMYeoPcFoHPnzv8bAz7r00f69O4tvXv1kpQePVR5LyNHjJCeKSkqrLEXN/AugRX58xEdITASFBHmEYIdWYM1gunbt291juBIkCSQLlq0SGrXrq1BkqLJ/1E4Iawh/t5LvXr1/mMOqVu3rkpYAFAexXp07y4djPXwfeg/zpS7derUkW5duyoAXNkLI1jzzEAprSNMvr9586ZG/nfv3qmSAEKWQXmyxPXr1zXFauVoyuRr167pOkBQKkdHR/8iALTfzFNDkEo9EMxxXuYQX2cASoMGDXRfqQDAut27dVNFQ7/8cO2anCwdTekLAIz5QRoa1hhrL3DCVYJmTPrD6vQLDx48UOnXr58qCSDFFeJhWzLDCICAAV45APAAEiMiIiKkSpUqUrVqVb2vXLmySo0aNXStWrVqUqlSJalevbpUrFhRgWRvhQoVSgeApz+CVbEughv4+eQuXRQA6nnPgLi4OK3oli5daq132pXPpvKjUqSHGDNmjH4h0sow365hodmzZ5e03wa8
6dOnqzKAQNmsrXW+/erc25yxDyw15+hiztGzZ08ZOHCgDBgwQMEdPHiwDBkyRMfdjCFTjCEZJ5vzhgWAT1la15sfJGhhWVwA5aERxQ9rND+pLjt4ABCsv2DBAt2r5fBPtjniAHxLnDdvnh5K2+yjttQmO6xatcoy5wfbMQIAFixfvrwsXLjQAnDIAeRL7uPuWuS60MKQ7tOBq+Oiku8TYQHgIwcbSVUAQarjgCgE1aEyVmAP0RrawwK/zjzNDS6jP/5Xq9DQoUOL06AC8A+nrDkcPr148WJ74L/Y+AGI5cqVU1rPnDnTKngspLHyPcoh5z5/c53m965r/dGB71v7Y1bCAuCVI32RlrAwFeBYQ910l6ZIUX4PtPdZAiA46Pz587XX1x/9zrpB//79VXk+hPTt29cCs92yALCXL19urfZNoIo3qmkid1QNaRprInt0DWlcO6akRc9zLfeRECUPubncEmWLP8IcL/mYExYA8jQHAgiUxcKjRo5UALAqtPefwVJdXCA94gbs4ccyMzMlMjLSWsFRHaWhOZ0fYCgDXHsNA9QFjljLVTcB7bNW8ZKSECe/SzZdZ4M6klAvRpUYNmyYTJs2Tb86IzNmzNDYQglNbOHMU6dOVaYSb5Dhw4fr2ghzzrAA4OM8RNCCAVAba0JvDupfPGrUKI0HPkOwh3HMgUDpTPqJ2RhIzG4je4wYanM4Dss4Zr+RXYHuj4qKUrfRvdmBxP02Svomxcu5s2dlQOvfyOdtmkkTwwDWBw0aJFOmTNF3YQDcgzExY86cOTJr1iwdo7CPZ4w5K3GrVFmgi6mmSIG9THTFBXwViBD5WyYmahpEAIX5YcbHASJ6U6ApBwCiMwMrZo7vB74k5hq91sxnW2GO90SvNPdzjBjgIkya69cyUv5lOtGkuBi7d51xLxNEVxu2LDOZJtNkh5UmRWYZ8P6QlSXZxr0YrzcsW2sC7qqVK2Xpl1/KcsNI5hd98UV4AEiB1AAUQr7S8wAQxFhva+hPKmTMmk+VgBW5KFA6k8cjfx9I5NxA2ROYAwam7Q2++kqCDRsUrMjFgQpxYe7cuRK50NzPtnO8p5pxhXrRUfY9zBlwcrZulT/v3y+7du6Ubw4ckD/t2ycHvv5a5xjv2b1b9u7Zo9fvvv1WS+Ldu3bpXsalAgAWeMEyvvnhCiihEvrvMSRiiv3fIb4eMS1Q4ZshDCBrkGUIogDr14kZgBaRYe4nB8XzKlOdmPdGpAWyLSdH/rhtmyqJYijtFed+544dsm/vXp1jjNJcWS8NAP8G0gEsRBjGz04AAAAASUVORK5CYII=">
</dd>
<dt>Title</dt>
<dd>Easter Egg Claptrap</dd>
<dt>Description</dt>
<dd>A easter egg CL4PTP From Borderlands</dd>
<dt>Added By</dt>
<dd>Dominhiho</dd>
<dt>Added On</dt>
<dd>2010-08-19</dd>
<dt>Votes</dt>
<dd>2</dd>
</dl>
---
author: "Flips-Admin"
title: "PinGod 💀 Nightmare"
date: 2021-12-05T12:00:12+09:00
description: "Nightmare (pinball dreams) on PinGod - (Pinball Dreams, HorsePin)"
draft: true
image:
hideToc: false
enableToc: true
enableTocContent: false
author: HorsePin
authorEmoji: 🐎
categories:
- games
- download
- pingod
tags:
- games
- games pingod
- horsepin
---
---
0da90fdb7ffbffea20d969de35b486b720a2a888 | 2,123 | md | Markdown | _posts/2021-02-03-BOJ-11559.md | chowisely/chowisely.github.io | f0a4f8b4142be6c9e48bef8ca749dee95231eabf | [
"MIT"
] | null | null | null | _posts/2021-02-03-BOJ-11559.md | chowisely/chowisely.github.io | f0a4f8b4142be6c9e48bef8ca749dee95231eabf | [
"MIT"
] | null | null | null | _posts/2021-02-03-BOJ-11559.md | chowisely/chowisely.github.io | f0a4f8b4142be6c9e48bef8ca749dee95231eabf | [
"MIT"
] | null | null | null | ---
layout: post
title: "[BOJ] Problem 11559: Puyo Puyo"
date: 2021-02-03 12:00:00 +0930
tags: [algorithm, implementation, bfs]
series: BOJ
comments: true
---
#### [Link to the problem](https://www.acmicpc.net/problem/11559)
#### Approach
This is an implementation problem that applies graph traversal.
Scan the whole map once and collect every puyo that can be popped into a vector. After the scan, turn all of the collected positions into empty cells, then drop the puyos down until no empty space remains between the puyos and the floor.
```cpp
#include <bits/stdc++.h>
using namespace std;
const int H = 12, W = 6;
int dy[4] = {-1, 1, 0, 0};
int dx[4] = {0, 0, -1, 1};
char arr[H][W], new_arr[H][W];
bool ckd[H][W];
vector<pair<int,int> > groups, cand;
// pop the collected puyos, then apply gravity column by column
void move() {
for(int i = 0; i < groups.size(); i++)
arr[groups[i].first][groups[i].second] = '.';
groups.clear();
memset(new_arr, '.', sizeof(new_arr));
for(int j = 0; j < W; j++) {
int k = H - 1;
for(int i = H - 1; i >= 0; i--) {
if(arr[i][j] != '.')
new_arr[k--][j] = arr[i][j];
}
}
memcpy(arr, new_arr, sizeof(arr));
}
// flood-fill the same-color group containing (y, x); true if its size >= 4
bool bfs(int y, int x) {
int ny, nx;
char color = arr[y][x];
queue<pair<int,int> > q;
ckd[y][x] = true;
q.push(make_pair(y, x));
cand.clear();
cand.push_back(make_pair(y, x));
while(!q.empty()) {
pair<int,int> p = q.front();
q.pop();
for(int i = 0; i < 4; i++) {
ny = p.first + dy[i];
nx = p.second + dx[i];
if(-1 < ny && ny < H && -1 < nx && nx < W && arr[ny][nx] == color && !ckd[ny][nx]) {
ckd[ny][nx] = true;
q.push(make_pair(ny, nx));
cand.push_back(make_pair(ny, nx));
}
}
}
return (cand.size() >= 4 ? true : false);
}
int main() {
std::ios::sync_with_stdio(false);
cin.tie(NULL); cout.tie(NULL);
int seq = 0;
for(int i = 0; i < H; i++) {
for(int j = 0; j < W; j++) {
cin >> arr[i][j];
}
}
while(1) {
memset(ckd, false, sizeof(ckd));
for(int i = 0; i < H; i++) {
for(int j = 0; j < W; j++) {
if(arr[i][j] != '.' && !ckd[i][j]) {
if(bfs(i, j))
groups.insert(groups.end(), cand.begin(), cand.end());
}
}
}
if(groups.size() == 0)
break;
seq += 1;
move();
}
cout << seq;
}
```
# goblet-of-java
---
layout: default
img: ipad.png
category: services
title: About the Owner
description: |
---
---
title: static
type: condition
---
<!--
THIS FILE IS AUTOGENERATED!
To make changes please edit the contents of:
lib/condition/static.go
-->
```yaml
static: true
```
Static is a condition that always resolves to the same static boolean value.
---
layout: paper
title: "Arrested development of sheep strongyles: onset and resumption under field conditions of Central Europe"
image: /images/papers/Langrova_2008_ParRes.png
authors: Langrova I, Makovcova K, Vadlejch J, Jankovska I, Petrtyl M, Fetchner J, Lytvynets A, Borkovcova M
year: 2008
ref: Langrova et al 2008 Parasitol Res
journal: "Parasitology Research 103: 387-392"
pdf: /pdfs/papers/Langrova_2008_ParRes.pdf
doi: 10.1007/s00436-008-0984-6
---
# Abstract
Two tracer tests were conducted between August 2004 and March 2007 at an ecological farm in western Bohemia. The first tracer test was performed for the summer–autumn grazing period (onset of arrested development), the second for spring (resumption of arrested development). In the first tracer test, the percentage of nematodes arresting development over the winter months reached 87.7% for Teladorsagia circumcincta, 66.7% for Haemonchus contortus, 89.9% for Nematodirus filicollis, 21.6% for Trichostrongylus axei, and 23.9% for both Trichostrongylus vitrinus and Trichostrongylus colubriformis. None of the arrested larvae were observed with species Cooperia curticei, Nematodirus battus, and Oesophagostomum venulosum. In the second tracer test, a significant increase of adult worms was discovered in March of species T. circumcincta and N. filicollis and Trichostrongylus spp. in February. Redundancy analysis and generalized linear models analyses have confirmed that environmental conditions play a crucial role in hypobiosis of sheep strongyles in the Czech Republic. The analysis of influences of various environmental factors revealed that the number of arrested larvae was negatively influenced by light—day length, sunshine, or daylight decrease (p < 0.01).
# **ZLCR**
## low-cost 100k LCR meter, based on a digital lock-in amplifier
Frequency range: 1 Hz ~ 100 kHz (0.01 Hz step);
Impedance range: 10 mOhm ~ 10 MOhm (1k reference only);
Drop the PSD and PGA, and forget the DFT, FFT and LMS algorithms: with just 2 op-amps plus an ADC/DAC you can take a 100k LCR meter home. The core algorithm is similar to the digital down-conversion (digital I/Q demodulation) used in software-defined radio; more precisely, it is a DLIA (Digital Lock-in Amplifier). In essence, the signal processing formerly done by analog parts is moved entirely into the digital domain, trading floating-point computation for lower hardware complexity. Compared with the PSD method, the dynamic range is higher: once the signal has been converted into the digital domain, noise and offset no longer need to be considered, and the dynamic range exceeds 100 dB. Compared with FFT and LMS, a 16-bit 4096-point transform already counts as fairly high, whereas a DLIA can use floating-point multipliers and CIC/FIR/IIR filters; DFT: Vi/Vq = 4095/0, DLIA: Vi/Vq = 4.0952341E3 / 1.234354E-2. Furthermore, an FFT with n=4096 at fs=100 kHz has a frequency resolution of about 24 Hz, while a DLIA has no such limit. For example: sweeping 80,000.00 Hz ~ 80,001.00 Hz with 0.01 Hz resolution;
A hand-wound transformer:

Impedance spectrum:

Settling time for 10 nF at 1 kHz to 1%:

[More images](https://github.com/yitiandelan/ZLCR/tree/master/docs/images)
## Hardware
Standard (100k):
4 x OPA + 2 x ADC + 1 x DAC + MCU (> 80 DMIPS);
Minimum (20k):
2 x OPA + USB Audio Codec (such as PCM2904);
Development board (100k):
Main controller: STM32F411CE (> 80 DMIPS), CNY 24
ADC/DAC: TLV320AIC3204 (TI audio codec), CNY 10
AFE: AD8065, 4 x CNY 4
Debugger: CMSIS-DAP for STM32F072 with CDC / J-Link OB-STM32F072-CortexM, CNY 20
Interface: USB VCP (via debugger) or Bluetooth serial port (Simple Shell, data format: JSON)
Hardware:[100k ZLCR.rev.c.pdf](https://github.com/yitiandelan/ZLCR/blob/master/Hardware/100k%20ZLCR.rev.c.PDF)
Software:[Firmware for stm32f4xx](https://github.com/yitiandelan/ZLCR/tree/master/Firmware)
## Software
[Python](https://github.com/yitiandelan/ZLCR/tree/master/pyLCR) / [C#](https://github.com/yitiandelan/ZLCR/tree/master/ZLCR) / Matlab / Windows UWP
### Communication format
```JSON
uart tx:
zlcr -raw -f 1000\n
uart rx:
{"FREQ":1.000000e+03,"a":-2.304493e+03,"b":-5.388904e+03,"c":-2.319749e+03,"d":-5.420242e+03}\n
{"FREQ":1.000000e+03,"a":-2.304510e+03,"b":-5.388904e+03,"c":-2.319749e+03,"d":-5.419875e+03}\n
{"FREQ":1.000000e+03,"a":-2.304507e+03,"b":-5.388909e+03,"c":-2.319749e+03,"d":-5.421182e+03}\n
{"FREQ":1.000000e+03,"a":-2.304502e+03,"b":-5.388918e+03,"c":-2.319749e+03,"d":-5.423528e+03}\n
uart tx:
zlcr -f 1000\n
uart rx:
{"FREQ":1.000000e+03,"MAG":9.937274e-01,"PHASE":-1.291339e-04}\n
{"FREQ":1.000000e+03,"MAG":9.940118e-01,"PHASE":-2.515127e-04}\n
{"FREQ":1.000000e+03,"MAG":9.941343e-01,"PHASE":-3.042429e-04}\n
{"FREQ":1.000000e+03,"MAG":9.939352e-01,"PHASE":-2.185376e-04}\n
```
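On the host side, each reply line is plain JSON and easy to parse. A minimal sketch follows (plain Python, independent of the linked pyLCR tool; the serial-port read is replaced by a literal reply string, and `MAG`/`PHASE` are treated simply as the magnitude/phase pair reported by the meter):

```python
import json
import math

# One reply line as produced by `zlcr -f 1000` (copied from the example above).
line = '{"FREQ":1.000000e+03,"MAG":9.937274e-01,"PHASE":-1.291339e-04}'

reply = json.loads(line)
freq = reply["FREQ"]    # excitation frequency in Hz
mag = reply["MAG"]      # magnitude reported by the meter
phase = reply["PHASE"]  # phase in radians

# Rebuild the complex ratio from the magnitude/phase pair.
z = complex(mag * math.cos(phase), mag * math.sin(phase))

print(f"{freq:.2f} Hz -> {z.real:.6f} {z.imag:+.6f}j")
```

In a real host script the `line` variable would come from reading the USB VCP or Bluetooth serial port one line at a time.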
## Extension (up to 20M)
An auto-balancing bridge built from op-amps is not suitable for test frequencies above 100k, so a digital bridge-balancing control system was designed. It generates two excitation signals (2 x AD9834), detects the unbalance voltage/current (HPF + 40 dB amplifier + DLIA), and lets a control algorithm adjust the amplitude and phase of the excitation signals until the bridge approaches balance. At the same time, an ADC + DLIA measures V(DUT) and I(DUT) and computes the complex impedance; the measurement error is smallest once the bridge reaches balance. The increase in ADC sampling rate and computational load is handled by undersampling (IF sampling). Back in university I ran an STM32F4 undersampling in the 20th Nyquist zone and observed obvious aperture jitter, which lowers the SNR; above that, the bandwidth is simply insufficient. For narrow-band signals such as V(DUT) and I(DUT), undersampling plus PSD/DLIA is a good balance of performance and cost; for the sigma-delta ADC, a simple sample-and-hold was designed (4 x TS5A3159A, not yet verified). In addition, the AD9834 cannot control its output amplitude, so the design adjusts I(FSADJUST) via PWM (AF OD mode) for amplitude control, while the phase reaches 16-bit resolution through dithering;
Finally, here is a not-yet-verified version: [20M ZLCR.rev.a.pdf](https://github.com/yitiandelan/ZLCR/blob/master/Hardware/20M%20ZLCR.rev.a.PDF)
## ref:
[Keysight Technologies Impedance Measurement Handbook](http://literature.cdn.keysight.com/litweb/pdf/5950-3000.pdf)
[A starting point: trial build of a DSP-based LCR meter, for reference](http://www.amobbs.com/thread-5590156-1-1.html)
[MT-002: What the Nyquist criterion means for data-sampled system design](http://www.analog.com/media/cn/training-seminars/tutorials/MT-002_cn.pdf)
[MS-2698: Use synchronous detection to make precision low-level measurements](http://www.analog.com/media/cn/technical-documentation/technical-articles/Use-Synchronous-Detection-to-Make-Precision-Low-Level-Measurements-MS-2698_cn.pdf)
[Research on key technologies of a DLIA-based AC impedance spectroscopy measurement system](http://cdmd.cnki.com.cn/Article/CDMD-10487-1012361681.htm)
---
title: "The Indian Tradition through Chinese Buddhist Writings"
authors:
- kieschnick
file_links:
- "exclusive_01/Buddhist%20Chinese%20Primer%20Volume%202%20-%20John%20Kieschnick.pdf"
external_url: "http://www.primerbuddhism.org/volume2.html"
formats: [pdf]
drive_links:
- "https://drive.google.com/file/d/1TiiSZiga9YnVV7EAD-4ioo1jB8GNTsDB/view?usp=drivesdk"
course: chinese-primer
year: 2017
series: buddhist-chinese-primer_kieschnick
number: 2
tags:
- mahayana-roots
---
> This volume assumes knowledge of [the first](/content/booklets/foundations_kieschnick), introducing three types of writings from texts translated in China from Indian originals in medieval times.
The answer key for this textbook can be found [on Google Drive, here](https://drive.google.com/file/d/1ltpbQgRxQcWJAiHxsDeWSuA5fCSBKKps/view?usp=drivesdk){:target="_blank"}.
# Ayehu

AY GeneralDeleteMultipleErrorMessages

Method: Post

OperationID: General_DeleteMultipleErrorMessages

EndPoint: /Api/General/deleteMultipleErrorMessages
---
title: espn_wbb_teams
sidebar_label: espn_wbb_teams
---
# `espn_wbb_teams`
## Description
Get ESPN women's college basketball team names and ids
## Usage
```r
espn_wbb_teams()
```
## Examples
```r
espn_wbb_teams()
```
---
title: View inventory of your instances in the Azure portal
description: View inventory of your instances in the Azure portal
services: azure-arc
ms.service: azure-arc
ms.subservice: azure-arc-data
author: twright-msft
ms.author: twright
ms.reviewer: mikeray
ms.date: 09/22/2020
ms.topic: how-to
ms.openlocfilehash: 3c7299ff211035f81cc08e9f191641c780ad02c4
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 03/29/2021
ms.locfileid: "91826642"
---
# <a name="view-inventory-of-your-instances-in-the-azure-portal"></a>View inventory of your instances in the Azure portal
After you upload your [metrics, logs](upload-metrics-and-logs-to-azure-monitor.md), or [usage](view-billing-data-in-azure.md), you can view your instance as an Azure resource from the Azure portal. To view the resource in the Azure portal, go to the special URL [https://portal.azure.com](https://portal.azure.com) and complete the following steps:
1. Go to **All services**.
1. Search for your database instance type.
1. Add the type to your favorites.
1. On the left menu, select the instance type.
1. View your instances in the same view as your other Azure SQL or Azure PostgreSQL Hyperscale resources (use filters for a granular view).
---
layout: post
title: "Hello, World!"
date: 2017-02-24 02:39:00 +0900
---
{% highlight java %}
System.out.println("Hello, World!");
{% endhighlight %}
Welcome to my github.io.
This GitHub Page is mainly for me to try out Jekyll, Travis CI, and other tools. Therefore the site will not be rigorously maintained. Nevertheless, thank you for visiting!
If you would like to get in touch with me, check out the [contact][mjkim610-contact] page.
Thanks >:D
[mjkim610-contact]: https://mjkim610.github.io/contact
| 28.611111 | 172 | 0.726214 | eng_Latn | 0.974183 |
0db709e0ff085f3ce89f18b295d65831523e505c | 285 | md | Markdown | posts/2011/01/z-type.md | atmos/tumblr.atmos.org | 1865e6fe271d4c28047ac50fd4ace154be411ff1 | [
"MIT"
] | null | null | null | posts/2011/01/z-type.md | atmos/tumblr.atmos.org | 1865e6fe271d4c28047ac50fd4ace154be411ff1 | [
"MIT"
] | null | null | null | posts/2011/01/z-type.md | atmos/tumblr.atmos.org | 1865e6fe271d4c28047ac50fd4ace154be411ff1 | [
"MIT"
] | 2 | 2019-05-06T18:02:23.000Z | 2019-05-06T18:27:47.000Z | <!--
id: 2716813757
link: http://tumblr.atmos.org/post/2716813757/z-type
slug: z-type
date: Wed Jan 12 2011 11:56:48 GMT-0800 (PST)
publish: 2011-01-012
tags:
title: Z-Type
-->
Z-Type
======
[http://www.phoboslab.org/ztype/](http://www.phoboslab.org/ztype/)
learn to type better
| 15 | 66 | 0.680702 | kor_Hang | 0.303291 |
0db75c7ac5ea4abda1777f48704eb422ad4e74ad | 3,692 | md | Markdown | _posts/2010-10-17-How-Much-Process-Do-We-Need.md | shawnewallace/shawnewallace.github.com | b4c709019eb448d1086478f164d74568ca44ad87 | [
"Apache-2.0"
] | null | null | null | _posts/2010-10-17-How-Much-Process-Do-We-Need.md | shawnewallace/shawnewallace.github.com | b4c709019eb448d1086478f164d74568ca44ad87 | [
"Apache-2.0"
] | null | null | null | _posts/2010-10-17-How-Much-Process-Do-We-Need.md | shawnewallace/shawnewallace.github.com | b4c709019eb448d1086478f164d74568ca44ad87 | [
"Apache-2.0"
] | null | null | null | ---
layout: post
title: How Much Process Do We Need
year: 2010
month: 10
day: 17
updated:
comments: false
categories: Retrospective Agile
published: true
---
I observed a little twitter conversation this week where one person was attempting to decide if "TDD" meant Test Driven Development or Test Driven Design. After quite a few posts <a href="http://twitter.com/#!/arcware/status/27271929683">one</a> stuck out:<br />
<br />
<blockquote>"What I care about nowadays is shipping stuff."</blockquote><br />
Couple this conversation with a blog posting by <a href="http://kohari.org/2010/08/24/looking-back/">Nate Kohari</a> and I think there is an interesting conversation being framed up.<br />
<br />
What we see is the natural tension that sometimes exists between product owners and implementers. What makes this interesting is that some people who made a living as implementers are now becoming hybrid product owners-implementers. I believe this unique perspective can help our industry.<br />
<br />
I guess I experience some of that when we started <a href="http://www.autoclick.com/">AutoClick</a> back in 1998. <br />
<br />
AutoClick was to be a brand new web application for car dealerships. It was my sincerest desire to do software development correctly. At the time doing software development correctly meant <a href="http://en.wikipedia.org/wiki/Waterfall_model">waterfall</a>...big up front design and LOTS of documentation. Frankly all of that process took a lot of time and really made it hard to ship features. Sometimes it took more time to update the documentation than it did to implement the feature. To me this made our process impractical for us.<br />
<br />
As a developer I wanted rigorous process, but as a company owner I wanted to sell stuff. So, in the desire to ship features that add value quickly, I ended up overcompensating by applying almost no process, no documentation and no discipline. The business team would just call a developer and tell them what change to make, and they did it quickly, sometimes even before I knew about it. The end result was brittle, inconsistent software that I was scared to death to deploy. We couldn't trust any of the old documentation and regression testing simply did not happen.<br />
<br />
What I found out is that there is no process checklist that I can apply to all projects. I also learned that each project and even each feature can have their own "personality" that can't be ignored. The key is to find the right amount of process to apply for a particular job and team that satisfies the needs of the project and the business, and adjust accordingly as the situation changes.<br />
<br />
At AutoClick in about 2002, we began doing short, feature based development cycles. In between releases we would evaluate our process with the goal of making it better. We made adjustments that were acceptable to (or at least understood by) the business and that were focused on delivering value/features first and foremost. In this, I think, we became a good software development organization.<br />
<br />
Since then I have heard of the agile concept of the Retrospective, and it immediately rang true with me. The Retrospective is one of our most powerful tools to help us get to the correct amount of process that keeps us organized while still meeting the needs of our business. Spend some time looking at your processes while they are being applied and make the small changes that improve them, and pretty soon you'll find you have your project right where you need it to be. | 123.066667 | 605 | 0.779794 | eng_Latn | 0.999442 |
0db7713794faa29956e703d0e9bc61ac2f738dfd | 6,650 | md | Markdown | ilp.md | JHBalaji/cs6290-notes | 6c7ff7aae95a066ecf9eb626f2adfca28f0cf25a | [
"MIT"
] | 44 | 2019-09-06T19:50:34.000Z | 2022-02-15T08:10:15.000Z | ilp.md | drharris/cs6290-notes | 6c7ff7aae95a066ecf9eb626f2adfca28f0cf25a | [
"MIT"
] | null | null | null | ilp.md | drharris/cs6290-notes | 6c7ff7aae95a066ecf9eb626f2adfca28f0cf25a | [
"MIT"
] | 17 | 2020-04-11T23:09:45.000Z | 2022-02-25T02:39:33.000Z | ---
id: ilp
title: ILP
sidebar_label: ILP
---
[🔗Lecture on Udacity (54 min)](https://classroom.udacity.com/courses/ud007/lessons/3615429333/concepts/last-viewed)
## ILP All in the Same Cycle
Ideal scenario is all instructions executing during the same cycle. This may work sometimes for some instructions, but typically there will be dependencies that prevent this, as below.

## The Execute Stage
Can forwarding help? In the previous example, Inst 1 would be able to forward the result to Instruction 2 in the next cycle, but not during the same cycle. Instruction 2 would need to be stalled until the next cycle. But, if Instructions 3-5 do not have dependencies, they can continue executing during the current cycle.
## RAW Dependencies
Even the ideal processor that can execute any number of instructions per cycle still has to obey RAW dependencies, which creates delays and affects overall CPI. So, ideal CPI can never be 0. For example, for instructions 1, 2, 3, 4, and 5, where there is a dependency between 1-2, 3-4, and 4-5, it would take 3 cycles (cycle 1 executes 1 and 3, cycle 2 executes 2 and 4, cycle 3 executes 5), for a total CPI of 3/5 = 0.60. If every instruction had a dependency with the next, you can't do anything in parallel and the minimum CPI is 1.
## WAW Dependencies
In the below example, the second instruction has a data dependency on the first, and gets delayed one cycle. Meanwhile, all of the other instructions have no dependencies and can execute in the first cycle. However, we see in the last instruction that R4 could be written, but then, due to the delay in instruction 2, could be overwritten. This out-of-order execution could result in the final value of R4 not being what is expected based on the program. Thus the processor needs a way to find this dependency and delay the last instruction enough cycles to avoid the write issue.

## Removing False Dependencies
RAW dependencies are "true" dependencies - one instruction truly depends on data from another and the dependency must be obeyed. WAR and WAW dependencies are "false" (name) dependencies. They are named this because there is nothing fundamental about them - they are the result of using the same register to store results. If the second instruction used a different register value, there would be no dependency.
## Duplicating Register Values
In the case of a false dependency, you could simply duplicate registers by using separate versions of them. In the below example, you can store two versions of R4 - one on the 2nd instruction, and another on the 4th, and we remember both. The dependency can be resolved when the future instruction that uses R4 can "search" through those past versions and select the most recent.

Likewise, instruction 3 must also search among "versions" of R4 in instructions 2 and 4 and determine the version it needs is from instruction 2. This is possible and correct, but keeping multiple version is very complicated.
## Register Renaming
Register renaming separates registers into two types:
- Architectural = registers that programmer/compiler use
- Physical = all places value can actually go within the processor
As the processor fetches and decodes instructions, it "rewrites" the program to use physical registers. This requires a table called the Register Allocation Table (RAT). This table says which physical register has a value for which architectural register.
### RAT Example

## False Dependencies After Renaming?
In the below example, you can see the true dependencies in purple, and the output/anti dependencies in green. In our renamed program, only the true dependencies remain. This also results in a much lower CPI.

## Instruction Level Parallelism (ILP)
ILP is the IPC when:
- Processor does entire instruction in 1 cycle
- Processor can do any number of instructions in the same cycle
- Has to obey True Dependencies
So, ILP is really what an ideal processor can do with a program, subject only to obeying true dependencies. ILP is a property of a ***program***, not of a processor.
Steps to get ILP:
1. Rename Registers - use RAT
2. "Execute" - ensure no false dependencies, determine when instructions are executed
3. Calculate ILP = (\# instructions)/(\# cycles)
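The three steps above can be sketched as a toy calculator (illustrative only, not from the lecture; the `(destination, sources)` instruction encoding is made up). Renaming is modeled by simply ignoring WAR/WAW dependencies, and each instruction executes one cycle after the latest producer of its sources:

```python
# Toy ILP calculator for an ideal machine: unlimited issue width,
# 1-cycle instructions, and only true (RAW) dependencies respected.
# False (WAR/WAW) dependencies are ignored, since register renaming
# removes them. Each instruction is (dest_register, [source_registers]).

def ilp(program):
    produced = {}       # register -> cycle in which its value is produced
    total_cycles = 0
    for dest, sources in program:
        # Execute one cycle after the latest producer of any source value;
        # registers never written by the program are available at cycle 0.
        cycle = 1 + max((produced.get(s, 0) for s in sources), default=0)
        produced[dest] = cycle
        total_cycles = max(total_cycles, cycle)
    return len(program) / total_cycles

# Dependencies 1->2, 3->4, 4->5, as in the earlier 5-instruction example:
program = [
    ("R1", []),       # I1
    ("R2", ["R1"]),   # I2 needs I1's result
    ("R3", []),       # I3
    ("R4", ["R3"]),   # I4 needs I3's result
    ("R5", ["R4"]),   # I5 needs I4's result
]
print(ilp(program))   # 5 instructions / 3 cycles, roughly 1.67
```

This matches the hand calculation earlier in the notes: 3 cycles for 5 instructions, i.e. ILP = 5/3 (the reciprocal of the CPI of 3/5 = 0.60).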
### ILP Example
Tips:
1. You don't have to first do renaming, just pay attention to true dependencies, and trust renaming to handle false dependencies.
2. Be mindful to count how many cycles you're computing over
3. Make sure you're dividing the right direction (instructions/cycles)

## ILP with Structural and Control Dependencies
When computing ILP we only consider true dependencies, not false dependencies. But what about structural and control dependencies?
When considering ILP, there are no structural dependencies. Those are caused by lack of hardware parallelism. ILP assumes ideal hardware - any instructions that can possibly compute in one cycle will do so.
For control dependencies, we assume perfect same-cycle branch prediction (even predicted before it is executed). For example, below we see that the branch is predicted at the point of program load, such that the label instruction will execute in the first cycle.
| | 1 | 2 | 3 |
|------------------|:---:|:---:|:---:|
| `ADD R1, R2, R3` | x | | |
| `MUL R1, R1, R1` | | x | |
| `BNE R5, R1, Label` | | | x |
| ... | | | |
| `Label:`<br />`MUL R5, R7, R8` | x | | |
## ILP vs IPC
ILP is not equal to IPC except on a perfect/ideal out-of-order processor. So IPC should be computed based on the properties of the processor that it is run on, as seen below (note: consider the IPC was calculated ignoring the "issue" property).

Therefore, we can state ILP \\(\geq\\) IPC, as ILP is calculated using no processor limitations.
### ILP and IPC Discussion
The following are considerations when thinking about effect of processor issue width and order to maximize IPC.

*[ALU]: Arithmetic Logic Unit
*[CPI]: Cycles Per Instruction
*[ILP]: Instruction-Level Parallelism
*[IPC]: Instructions per Cycle
*[PC]: Program Counter
*[RAT]: Register Allocation Table
*[RAW]: Read-After-Write
*[WAR]: Write-After-Read
*[WAW]: Write-After-Write
*[RAR]: Read-After-Read | 55.882353 | 586 | 0.762707 | eng_Latn | 0.999091 |
0db7a9498e0c4638ea116939dd04858658512d56 | 429 | md | Markdown | README.md | eric-nth/efview | 79154b059b236e0da74f97ca0550eadc713cbbc5 | [
"MIT"
] | null | null | null | README.md | eric-nth/efview | 79154b059b236e0da74f97ca0550eadc713cbbc5 | [
"MIT"
] | null | null | null | README.md | eric-nth/efview | 79154b059b236e0da74f97ca0550eadc713cbbc5 | [
"MIT"
] | null | null | null | # efview
一个简单而美观的远端文件分享/浏览工具。

## 部署
需要服务器安装了Apache或者PHP。对于版本没有特殊要求。
建议使用Linux服务器。
您需要安装/克隆整个目录到您的服务器,然后将其作为根目录创建php站点。
之后,打开config.php,根据提示信息进行编辑。
一切准备就绪后,打开index.php即可!
## 特性
- 简洁,美观,大方。
- 管理方便,快捷。
- 无用户系统,不支持前端管理,暂时仅支持手动FTP文件上传。
- 没有任何的cookies/local storage设置。
- 代码可读性强,方便用户更改。
- 支持全站ajax。
本项目正在完善中,若有任何意见或建议请联系我: [email protected]。 | 13 | 76 | 0.750583 | yue_Hant | 0.872937 |
0dba62e67985b067fa76bd3ad3c25a20c8098337 | 2,183 | md | Markdown | T-SQL/1. Variables/README.md | mikolajsemeniuk/SQL_Pack | af4513ef776d60f9143d04050b68e5c4208c99f4 | [
"MIT"
] | null | null | null | T-SQL/1. Variables/README.md | mikolajsemeniuk/SQL_Pack | af4513ef776d60f9143d04050b68e5c4208c99f4 | [
"MIT"
] | null | null | null | T-SQL/1. Variables/README.md | mikolajsemeniuk/SQL_Pack | af4513ef776d60f9143d04050b68e5c4208c99f4 | [
"MIT"
] | null | null | null | # Variables
In this section you will learn how to:
* Create and assign custom values to variables in T-SQL.
* Get the number of rows in a table, if you need to iterate through the whole table.
* Save a value from a given column and row to a string variable, even if you don't have an 'ID' column.
* Save a whole row to a string variable, even if you don't have an 'ID' column.
* Save a whole column to a string variable.
* Declare a temporary table for storing rows.
**Enjoy !** :wink:
[Go back](https://github.com/mikolajs123/SQL_Pack/tree/master)
## Declare int
Declare and assign sample int
```sql
DECLARE @i BIGINT
SET @i = 0
PRINT 'i: ' + CAST(@i AS VARCHAR)
```
## Declare string
Declare and assign sample string
```sql
DECLARE @n VARCHAR(255)
SET @n = 'Hi SQL'
PRINT 'n: ' + @n
```
## Get length
Get the number of rows in a table into a variable
```sql
DECLARE @len BIGINT
SET @len = (
SELECT
COUNT(*)
FROM
[Table]
)
PRINT 'len: ' + CAST(@len AS VARCHAR)
```
## Get value
Get a single value from column 'col1' into a variable, even if you don't have an 'ID' column (OFFSET selects the row position, FETCH NEXT the number of rows)
```sql
DECLARE @val VARCHAR(255)
SET @val = (
SELECT
[col1] --Y position
FROM
[Table]
ORDER BY
(SELECT NULL)
OFFSET 1 ROWS -- X position
FETCH NEXT 1 ROWS ONLY -- Range of rows
)
PRINT 'val: ' + @val
```
## Get row
Get a whole row (here the row at OFFSET 1) from a table into a variable, even if you don't have an 'ID' column
```sql
DECLARE @row VARCHAR(MAX) = '' -- initialize to '', otherwise NULL + string = NULL
SELECT
@row = @row + [col1] + ',' + [col2] + ',' + [col3] -- Remember to use CAST(col3 AS VARCHAR) or CAST(ISNULL(col3, 0) AS VARCHAR)
FROM
[Table]
ORDER BY
(SELECT NULL)
OFFSET 1 ROWS
FETCH NEXT 1 ROWS ONLY
PRINT 'row: ' + @row
```
## Get column
Get all values from column 'col1' from table to variable
```sql
DECLARE @col VARCHAR(MAX)
SELECT
@col = COALESCE(@col + ',', '') + CONVERT(VARCHAR(255), [col1])
FROM
[Table]
PRINT 'col: ' + @col
```
## Declare table
Declare a table variable to store rows as temporary data
```sql
DECLARE @temp table (col1 BIGINT, col2 VARCHAR(MAX))
INSERT INTO @temp VALUES (1, 'Hi')
SELECT * FROM @temp
```
[Go back](https://github.com/mikolajs123/SQL_Pack/tree/master/T-SQL)
| 22.739583 | 132 | 0.672927 | yue_Hant | 0.564521 |
0dbaea9a3e86604617012936d716313203e8534f | 4,194 | md | Markdown | content/posts/how-lua-avoids-semicolons.md | 17cupsofcoffee/seventeencups.net | 18ddeac2a5a05f309f91662b3e219f2b974314cf | [
"MIT"
] | 3 | 2018-03-07T20:07:07.000Z | 2021-09-24T01:53:41.000Z | content/posts/how-lua-avoids-semicolons.md | 17cupsofcoffee/seventeencups.net | 18ddeac2a5a05f309f91662b3e219f2b974314cf | [
"MIT"
] | 6 | 2019-04-06T20:25:30.000Z | 2022-01-11T09:49:11.000Z | content/posts/how-lua-avoids-semicolons.md | 17cupsofcoffee/seventeencups.net | 18ddeac2a5a05f309f91662b3e219f2b974314cf | [
"MIT"
] | 2 | 2017-10-31T00:06:00.000Z | 2021-04-17T14:15:57.000Z | +++
title = "How Lua Avoids Semicolons"
date = 2018-04-03
aliases = ["/posts/how-lua-banished-the-semicolons"]
[taxonomies]
tags = ["language design", "ein", "lua"]
+++
My current pet project outside of work is developing a little programming language called [Ein](https://github.com/17cupsofcoffee/ein). I decided fairly early on in development that I didn't want Ein to have semicolons, so I've spent a fair chunk of the past week investigating how other languages make this work.
Lua's solution to this problem is (in my opinion) fairly nifty, so I thought I'd write about it on the off-chance that someone else will find it as interesting as I do 😄
## The Problem
First, some background - why is getting rid of semicolons tricky? Can't we just remove them from our language's grammar and be done with it?
The answer to this question can be summed up in one snippet of pseudo-code:
```rust
let x = 1 // Does the statement end here?
- 1 // Or does it end here?
```
How does our language's parser decide whether this should be one statement (`let x = 1 - 1`) or two (`let x = 1` followed by `-1`)? In the parser's eyes, they're both perfectly valid!
## The (Potential) Solutions
There's several ways that languages try to get around this problem. Some make the whitespace in their language significant, like Python. Others, like Go, try to insert the semicolons for you behind the scenes based on [a set of rules](https://golang.org/ref/spec#Semicolons).
Personally though, I'm not a fan of those solutions. Making whitespace have meaning rubs me the wrong way for reasons I don't quite understand, and automatic semicolon insertion feels like placing too much trust in the compiler to 'guess' where I meant for the statements to end.
## How Lua Does It
Lua's syntax is unambigous, even if you leave out all the semicolons and nice formatting, and the main way it achieves this is by dropping a feature a lot of us take for granted - *expressions-as-statements*.
In most languages, it's perfectly valid to use an expression (a piece of code that can be evaluated to get a value, like adding two numbers together) in the same place that you would use a statement (a piece of code run for its side effects, like a variable declaration).
Lua takes a much more hardline stance on this - programs are a list of statements, some statements may contain expressions (like the condition of an `if`), but expressions are *not* allowed to be used as statements.
Let's go back to our original example, translated to Lua:
```lua
local x = 1 -- Does the statement end here?
- 1 -- Or does it end here?
```
In Lua, a variable declaration is a statement, but `-1` is an expression - therefore, the only valid way of interpreting this code is `local x = 1 - 1`!
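The effect of this rule can be modeled with a toy grammar (a sketch of the idea only, not Lua's actual parser; the token format and the tiny grammar are made up). It counts how many ways a token list can be split into statements, first allowing bare expressions as statements and then forbidding them:

```python
# Toy model of why banning expression-statements removes the ambiguity.
# Grammar: stmt := 'local' NAME '=' expr | expr (only if allowed)
#          expr := ['-'] NUM ('-' NUM)*
# parse_expr returns every position where a valid expression could end,
# since an expression like `1 - 1` could also stop early after `1`.

def parse_expr(toks, i):
    if i < len(toks) and toks[i] == '-':    # unary minus
        i += 1
    if i >= len(toks) or not toks[i].isdigit():
        return []
    i += 1
    ends = [i]
    while i + 1 < len(toks) and toks[i] == '-' and toks[i + 1].isdigit():
        i += 2
        ends.append(i)
    return ends

def count_parses(toks, i=0, expr_stmts=True):
    if i == len(toks):
        return 1                            # consumed everything: one parse
    total = 0
    # local NAME = expr
    if (toks[i] == 'local' and i + 2 < len(toks)
            and toks[i + 1].isidentifier() and toks[i + 2] == '='):
        for end in parse_expr(toks, i + 3):
            total += count_parses(toks, end, expr_stmts)
    # bare expression as a statement (the thing Lua forbids)
    if expr_stmts:
        for end in parse_expr(toks, i):
            total += count_parses(toks, end, expr_stmts)
    return total

toks = ['local', 'x', '=', '1', '-', '1']
print(count_parses(toks, expr_stmts=True))   # 2: ambiguous (two splits)
print(count_parses(toks, expr_stmts=False))  # 1: unambiguous
```

With expression-statements allowed, the trailing `- 1` can start a new (unary minus) statement, giving two parses; without them, only `local x = 1 - 1` survives.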
## What's The Catch?
Ah yes, there's always a catch, and in this case it's a fairly obvious one: what if I *want* to run an expression for its side effects?
For example, a lot of the time you'll want to use the return value of a function, but sometimes you'll just want to run it. Lua caters for this scenario by making an exception to the rule, allowing function calls to be used both in statement and expression position.
This is one of the only places that Lua bends the rules, however. In some languages, you can use the short circuting behavior of the logical AND/OR operators as short and sweet control flow statements:
```js
// JavaScript
isActive && doSomething();
```
The equivalent in Lua isn't valid unless you [assign the result to a temporary variable:](http://lua-users.org/wiki/ExpressionsAsStatements)
```lua
local _ = isActive and doSomething()
-- _ has no special meaning - just a common Lua
-- naming convention for throwing away variables!
```
## Conclusion
Thank you for reading! I hope I didn't bore you to death rambling on about semicolons for $x words! ❤️
If you're interested in this kind of thing, I'd recommend taking a look at the [full grammar for Lua](http://www.lua.org/manual/5.3/manual.html#9) from its reference manual - it's really short and really clean, and there's a lot of really interesting design decisions in there which I'm interested in digging deeper into.
| 55.184211 | 321 | 0.751311 | eng_Latn | 0.999333 |
0dbbd604f55c2170c1d387c95c1c8dc83dcf251f | 84 | md | Markdown | README.md | nickcolley/erno-legacy | da2d88623e54d0adca276c69d4cde6faeb44c50a | [
"MIT"
] | null | null | null | README.md | nickcolley/erno-legacy | da2d88623e54d0adca276c69d4cde6faeb44c50a | [
"MIT"
] | 1 | 2018-10-17T16:17:46.000Z | 2018-10-17T16:17:46.000Z | README.md | nickcolley/erno-legacy | da2d88623e54d0adca276c69d4cde6faeb44c50a | [
"MIT"
] | null | null | null | # Erno - Puzzle Timer

| 28 | 61 | 0.738095 | vie_Latn | 0.141891 |
0dbbef976294c1339a9df9bae7d69eb8445d54ea | 1,888 | md | Markdown | articles/supply-chain/troubleshooting/inventory/multiple-inventory-t.md | MicrosoftDocs/Dynamics-365-Operations.is-is | ce362ebbd8aabebe5e960567bddc97e5d1f37b56 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-05-18T17:14:14.000Z | 2021-04-20T21:13:46.000Z | articles/supply-chain/troubleshooting/inventory/multiple-inventory-t.md | MicrosoftDocs/Dynamics-365-Operations.is-is | ce362ebbd8aabebe5e960567bddc97e5d1f37b56 | [
"CC-BY-4.0",
"MIT"
] | 6 | 2017-12-12T11:46:48.000Z | 2019-04-30T11:45:51.000Z | articles/supply-chain/troubleshooting/inventory/multiple-inventory-t.md | MicrosoftDocs/Dynamics-365-Operations.is-is | ce362ebbd8aabebe5e960567bddc97e5d1f37b56 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2019-10-12T18:18:43.000Z | 2022-02-09T23:55:11.000Z | ---
title: Multiple inventory transactions for batch numbers when On physical update is disabled
description: Multiple inventory transactions are created after you set up a purchase order line for items where the On physical update option for the batch number group is set to No.
author: niwang
ms.date: 4/11/2021
ms.topic: troubleshooting
ms.search.form: InventNumGroup
audience: Application User
ms.reviewer: kamaybac
ms.search.region: Global
ms.author: smnatara
ms.search.validFrom: 2021-04-11
ms.dyn365.ops.version: 10.0.19
ms.openlocfilehash: b8aef8835e90b7437bb7833c13c3d287d4ca836bf1fefc01bfeafef1c93329bc
ms.sourcegitcommit: 42fe9790ddf0bdad911544deaa82123a396712fb
ms.translationtype: HT
ms.contentlocale: is-IS
ms.lasthandoff: 08/05/2021
ms.locfileid: "6768595"
---
# <a name="multiple-inventory-transactions-for-batch-numbers-when-on-physical-update-is-disabled"></a>Multiple inventory transactions for batch numbers when On physical update is disabled

KB number: 4613390

## <a name="symptoms"></a>Symptoms

Multiple inventory transactions are created after you set up a purchase order line for items where the **On physical update** option for the batch number group is set to *No*.

After an item has been created where the **On physical update** option in its batch number group is set to *No*, the system automatically creates a new batch number whenever you change the purchase line quantity and save the purchase order page.

## <a name="resolution"></a>Resolution

The **On physical update** setting for batch number groups works as follows:

- When the option is set to *Yes*, new batch numbers are only created on physical updates (for example, when items are shipped or received).
- When the option is set to *No*, a new batch number is created every time a relevant update occurs (for example, when a new quantity is added to a purchase order).
| 51.027027 | 227 | 0.814089 | isl_Latn | 0.999808 |
0dbc4c0301a02d370c45faf9abecb836dc432f02 | 76 | md | Markdown | README.md | rayd512/Card-Toolkit | 87a3afc334c6110073710eaa57ac3bbfe7d0b5af | [
"Apache-2.0"
] | null | null | null | README.md | rayd512/Card-Toolkit | 87a3afc334c6110073710eaa57ac3bbfe7d0b5af | [
"Apache-2.0"
] | null | null | null | README.md | rayd512/Card-Toolkit | 87a3afc334c6110073710eaa57ac3bbfe7d0b5af | [
"Apache-2.0"
] | null | null | null | # Card Toolkit
An Android app to keep track of your playing card collection
| 25.333333 | 60 | 0.802632 | eng_Latn | 0.994058 |
0dbc79cc5db5d93b36d55be206f52fd979309cba | 1,973 | md | Markdown | README.md | kharidiron/NowPlaying | aa116d4d2e622608fe4e6dffc11b7e51c8f00278 | [
"WTFPL"
] | null | null | null | README.md | kharidiron/NowPlaying | aa116d4d2e622608fe4e6dffc11b7e51c8f00278 | [
"WTFPL"
] | null | null | null | README.md | kharidiron/NowPlaying | aa116d4d2e622608fe4e6dffc11b7e51c8f00278 | [
"WTFPL"
] | null | null | null | # Now Playing
[](http://www.wtfpl.net/about/)
[](https://pypi.python.org/pypi/mitmproxy)
[](https://github.com/qwertyquerty/pypresence)
*Now Playing* is a simple companion application for creating custom Discord rich presence text.

This is a toy application, so I make no promises about its viability, sustainability, compatibility, longevity, or for
that matter any details regarding my sanity. If, after all that, you still want to mess with this, then have a blast
and good luck!
## Installation
1. Clone the repo.
2. Install requirements: `pip install -r requirements.txt`
3. Make a copy of the `discord-ids.json.example` file.
4. Edit the file with your own unique Application IDs as provided by Discord.
5. Run and profit.
## Theory of Operations
You will need to create a new application via the Discord Developer interface for each 'System' instance you want
listed within the program. You can access the developer portal [here](https://discord.com/developers/applications).
You will then need to capture the *Application ID* for your new application, and paste it as a new entry into the
`discord-ids.json` file (see the `discord-ids.json.example` file for an example of the structure).
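As a hypothetical sketch of how the pieces fit together (the flat name-to-Application-ID JSON layout below is my assumption, not necessarily the real structure defined by `discord-ids.json.example`; the presence calls use the standard `pypresence` `Presence` API):

```python
# Hypothetical sketch of the flow described above. Assumptions: the JSON
# file maps a system name to its Discord Application ID (check
# discord-ids.json.example for the real layout), and a Discord client
# must be running locally for the presence calls to succeed.
import json

def load_app_ids(path):
    """Return the mapping of system name -> Discord Application ID."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

def show_presence(app_id, details, state):
    """Publish a simple rich-presence status via pypresence."""
    from pypresence import Presence  # imported lazily; needs Discord running
    rpc = Presence(app_id)
    rpc.connect()
    rpc.update(details=details, state=state)
    return rpc

# Usage sketch (requires a running Discord client and a real config file;
# "SNES" is a made-up entry name):
#   ids = load_app_ids("discord-ids.json")
#   show_presence(ids["SNES"], "Now Playing", "Some track title")
```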
**Important Note:** You must run this application on the system where your active Discord client is running! (This
might be obvious to some, but I feel it is worth stating for absolute clarity.) The
[pypresence](https://github.com/qwertyquerty/pypresence) module this toy is based around requires a running Discord
instance in order to function.
## License
This project is under the What the Fuck Public License. See [LICENSE](LICENSE) for details. | 51.921053 | 163 | 0.775976 | eng_Latn | 0.979189 |
0dbc8ce8f8707b4808444d0c97a33567d4a87649 | 2,737 | md | Markdown | README.md | SRechenberger/chr_python_prototype | a1f10a7c9275ec15ab3ed48f9e41a0bf19611617 | [
"MIT"
] | null | null | null | README.md | SRechenberger/chr_python_prototype | a1f10a7c9275ec15ab3ed48f9e41a0bf19611617 | [
"MIT"
] | null | null | null | README.md | SRechenberger/chr_python_prototype | a1f10a7c9275ec15ab3ed48f9e41a0bf19611617 | [
"MIT"
] | null | null | null | # CHR(Python)
A _Constraint Handling Rules_ (CHR) implementation for _Python_.
# Example
Given the following _CHR(Python)_ program, saved in `fibonacci.chr`:
```
class Fibonacci.
constraints fib/1, result/1, read/1.
fib($N) <=> $N > 1 | fib($N-1), fib($N-2).
fib($N) <=> $N <= 1 | result($N).
result($N), result($M) <=> result($N+$M).
read($N), result($M) <=> $M = $N.
```
After compilation into `fibonacci.py`, this file contains a class `Fibonacci`,
which has the public methods `fib`, `result` and `read`.
It can be used as such:
```python
from fibonacci import Fibonacci
# Generates an isolated solver instance
solver = Fibonacci()
# Add fib(6) to the constraint store,
# and immediately starts computation.
solver.fib(6)
# At this point, the constraint store
# contains the constraint result(8),
# so this assertion should hold
assert ("result/1", 8) in solver.dump_chr_store()
# Generate a fresh and unbound logical variable
result = solver.fresh_var()
# read the result from the store
# this will add the constraint read(result)
# to the store, and start computation, which
# will then (due to the last rule) bind result
# to the value in the result constraint.
# (in this case 8)
solver.read(result)
# At this point, this assertion holds:
assert result == 8
# To actually extract the value from result,
# use chr.runtime.get_value:
from chr.runtime import get_value
assert not isinstance(result, int)
assert isinstance(get_value(result), int)
```
See the `test_files` folder for more examples.
# Usage
## Command line tool
There is a command line tool `chr_python` to compile _CHR(Python)_ files to
_Python_ code.
Given a file `my_program.chr`, you can compile it to _Python_ by
issuing the command
```shell
chr_python my_program.chr
```
which will create the file `my_program.py`, which you can then import in
_Python_ (see above).
If you want to change the output path, you can use the `-o` or `--outfile` flag:
```shell
chr_python my_program.chr -o some/funky/path/my_cool_program.py
# or
chr_python my_program.chr --outfile some/funky/path/my_cool_program.py
```
If you use some kind of automatic build, you may want to compile a file only if
it has actually changed. In this case, you can use the `-t` or
`--timestamp` flag, which checks the time of last modification and only
runs the compilation if the source file is newer than the existing output file.
To get usage information, use the `-h` or `--help` flags.
## Automatic compilation
Add the following lines to an `__init__.py` file of a package to automatically
compile all `*.chr` files in that folder.
```python
import os
from chr.core import chr_compile_module
chr_compile_module(os.path.dirname(__file__))
```
| 25.820755 | 80 | 0.734015 | eng_Latn | 0.992693 |
54bcf49a19f1a2e0dbc3a30aa73b8acf2269c733 | 89 | md | Markdown | oeis/1/2/3/6/26/96/210/1106/3759/12577/A000341.md | hloeffler/oeis-dirs | 1a43b8b9d09cd6304a5d04789c133dcccb57e532 | [
"BSD-2-Clause"
] | null | null | null | oeis/1/2/3/6/26/96/210/1106/3759/12577/A000341.md | hloeffler/oeis-dirs | 1a43b8b9d09cd6304a5d04789c133dcccb57e532 | [
"BSD-2-Clause"
] | null | null | null | oeis/1/2/3/6/26/96/210/1106/3759/12577/A000341.md | hloeffler/oeis-dirs | 1a43b8b9d09cd6304a5d04789c133dcccb57e532 | [
"BSD-2-Clause"
] | null | null | null | http://oeis.org/A000341
Number of ways to pair up {1..2n} so sum of each pair is prime.
| 22.25 | 63 | 0.707865 | eng_Latn | 0.901492 |
54bdcb431bea4a254ad0ff2ac30f74e1834a6a61 | 898 | md | Markdown | _posts/2010-10-12-表白帖.md | backup53/1984bbs | 152406c37afab79176f0d094de5ac4cb0c780730 | [
"MIT"
] | 18 | 2020-01-02T21:43:02.000Z | 2022-02-14T02:40:34.000Z | _posts/2010-10-12-表白帖.md | wzxwj/1984bbs | 152406c37afab79176f0d094de5ac4cb0c780730 | [
"MIT"
] | 3 | 2020-01-01T16:53:59.000Z | 2020-01-05T10:14:11.000Z | _posts/2010-10-12-表白帖.md | backup53/1984bbs | 152406c37afab79176f0d094de5ac4cb0c780730 | [
"MIT"
] | 13 | 2020-01-20T14:27:39.000Z | 2021-08-16T02:13:21.000Z | ---
layout: default
date: 2010-10-12
title: 表白帖
categories: 罗马假日公寓
---
# Confession Post (表白帖)

Board: 参政议政 (Political Participation)

给我张选票 ("Give me a ballot")
Floor 1, posted 2010-10-12 00:22

Confession post

Wumao girl, please go out with me!!!

---

Compiled by [Terminusbot](https://github.com/TerminusBot); for discussion, please visit [2049bbs.xyz](http://2049bbs.xyz/)

---

moonsucker
(Signature: "The great hermit hides in the toilet")
Floor 2, posted 2010-10-12 00:32

At a time like this, a confession post should at least come with a photo ("the truth").

不敬神的YODA
(Signature: "Hello wumao, goodbye wumao")
Floor 3, posted 2010-10-12 01:00

Agree with the poster above.

康帅博
(Signature: "One sleepy guy")
Floor 4, posted 2010-10-12 01:08

Agree with floor 2; a confession needs sincerity.

metaphy
(Signature: "Around noon everyone had gone up to the roof garden for some air; 'the brown one' stayed. Seizing the chance while no one was around, she made her 'appeal' to me. I don't know what that word means, but I find it funny. She wept before me and said: Big Brother, I want to write a novel... @metaphyp")
Floor 5, posted 2010-10-12 01:15

Agree with floors 2, 3, and 4.

鲁西南
(Signature: "The bright moon knows not that you are gone; deep in the night it still shines on the study window.")
Floor 6, posted 2010-10-12 01:24

Haha
| 3.904348 | 105 | 0.52784 | yue_Hant | 0.99693 |
54beeac63a3239e603ac827e8b9bf4b62106c22c | 3,939 | md | Markdown | CHANGELOG.md | digital-dj-tools/dj-data-converter | 3561e4803ef0160a73c7d908ea016f78196f9467 | [
"MIT"
] | 85 | 2018-11-24T23:31:58.000Z | 2022-03-25T02:30:41.000Z | CHANGELOG.md | digital-dj-tools/dj-data-converter | 3561e4803ef0160a73c7d908ea016f78196f9467 | [
"MIT"
] | 21 | 2019-01-24T10:44:48.000Z | 2022-03-09T13:10:42.000Z | CHANGELOG.md | digital-dj-tools/dj-data-converter | 3561e4803ef0160a73c7d908ea016f78196f9467 | [
"MIT"
] | 10 | 2019-06-16T05:08:45.000Z | 2022-03-31T17:37:05.000Z | # Changelog
All notable changes to this project will be documented here.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [Unreleased]
### Added
- Copy additional tag metadata when converting (e.g. year)
- Convert color tags
- Convert from Serato DJ to Rekordbox
- Convert from Rekordbox to Serato DJ
- Support the case where an entry has a tempo, but no cues (Traktor to Rekordbox)
- Support disabling the "Store Beatmarker as Hotcue" Traktor setting
## 0.5.1 (2021-09-06)
### Fixed
- Allow indexing element in Traktor nml files [#42](https://github.com/digital-dj-tools/dj-data-converter/issues/42)
## 0.5.0 (2020-11-03)
### Changed
- Migrated from JavaScript/NodeJS runtime to Java/GraalVM runtime, in order to fix [#33](https://github.com/digital-dj-tools/dj-data-converter/issues/33) and improve performance generally
## 0.4.1 (2020-02-04)
### Fixed
- Allow DateAdded blank string for Rekordbox tracks [#27](https://github.com/digital-dj-tools/dj-data-converter/issues/27)
## 0.4.0 (2019-10-24)
### Added
- Convert from Rekordbox to Traktor [#9](https://github.com/digital-dj-tools/dj-data-converter/issues/9)
- Offset correction [#3](https://github.com/digital-dj-tools/dj-data-converter/issues/3)
### Changed
- Downgrade Nodejs from 10 to 8 for the pkg build, to avoid `"buffer" argument must be one of type Buffer or Uint8Array. Received type object` error (will be investigated)
### Fixed
- Correct Rekordbox date format [#26](https://github.com/digital-dj-tools/dj-data-converter/issues/26). Rekordbox xml data written by earlier versions of the converter has malformed date formatting, which Rekordbox will unfortunately accept without validation. This effectively corrupts the Rekordbox database, and so it will need to be deleted and re-created before running this new version of the converter to produce correctly formatted xml data, which can then be imported. The Rekordbox database is located at `C:\Users\youruser name\AppData\Roaming\Pioneer\rekordbox` on Windows, and `/yourharddrivename/Users/yourusername/Library/Pioneer/rekordbox` on a Mac.
## 0.3.4 (2019-10-08)
### Added
- Convert Traktor import date to Rekordbox date added
- Add optional stems in Traktor entry spec
### Fixed
- Filter entries with location and non-blank location file, so that only these Traktor entries are converted [#21](https://github.com/digital-dj-tools/dj-data-converter/issues/21)
- Correct Rekordbox hot cue colours, match Rekordbox green for cues
## 0.3.3 (2019-08-02)
### Added
- Copy additional tag metadata when converting (genre, comment)
## 0.3.2 (2019-06-27)
### Changed
- Packages for Windows and Mac are now made available as archives: zip for Windows and tar.gz for Mac, to preserve execute permissions
## 0.3.1 (2019-06-24)
### Changed
- Add digital-dj-tools/utils dependency
- All bpm values are now type double
- The max tempo inizio and marker start/end is now 7200 seconds
## 0.3.0 (2019-03-02)
### Changed
- Revised colour mapping for markers (cue points). The default cue point colours are used where possible, except when there is a conflict between Traktor and Rekordbox
## 0.2.2 (2019-02-03)
### Changed
- Fixed track numbers in Rekordbox (Rekordbox xml uses a TrackNumber attribute, not a Track attribute)
## 0.2.1 (2019-01-26)
### Changed
- End attr of position marks is now optional, it's only included when the marker is a loop. This avoids "zero-length" position marks, which Rekordbox accepts but DJ hardware e.g. CDJ does not
## 0.2.0 (2019-01-14)
### Changed
- Internal changes to simplify data conversions
- Internal changes allowing this project to be used as a library and extended by other projects.
### Removed
- The -c/--check-input command line option (no longer required since the input is now always checked).
## 0.1.0 (2018-11-24)
### Added
- Initial release.
---
layout: slide
title: "Welcome to our second slide!"
---
":smile: konga labonga," he said
Use the left arrow to go back!
---
title: Unprivileged Docker Build in GitLab CI With Kaniko
date: 2022-02-08T08:52:00Z
draft: true
tags: ["container", "container/image", "gitlab-ci", "gitlab-ci/job", "kaniko"]
---
```yaml
build:
image:
name: gcr.io/kaniko-project/executor:debug
entrypoint: [""]
variables:
docker_dir: ./docker
script:
- |
echo "{ \"auths\" : { \"${CI_REGISTRY}\" : {
\"username\" : \"${CI_REGISTRY_USER}\",
\"password\" : \"${CI_REGISTRY_PASSWORD}\"
}}}" > /kaniko/.docker/config.json
- |
/kaniko/executor \
--context ${CI_PROJECT_DIR} \
--dockerfile ${CI_PROJECT_DIR}/${docker_dir}/${docker_image}/Dockerfile \
--destination ${CI_REGISTRY_IMAGE}/${docker_image}:${CI_COMMIT_REF_SLUG}
rules:
- changes:
- ${docker_dir}/**/*
```
# BuilderBox Component
This module contains the basic components of BuilderBox.

[Build Status](https://travis-ci.org/ElMijo/builderbox-component) [Coverage Status](https://coveralls.io/github/ElMijo/builderbox-component?branch=master)
54c0c022b2d113215520db6008d85e65ef08a102 | 3,068 | md | Markdown | _posts/2021-09-16-B_createJavaProject.md | minyeon9/minyeon9.github.io | fb1e5c10930c3f3ed58f5ba7fc4d3eb375d252f1 | [
"MIT"
] | null | null | null | _posts/2021-09-16-B_createJavaProject.md | minyeon9/minyeon9.github.io | fb1e5c10930c3f3ed58f5ba7fc4d3eb375d252f1 | [
"MIT"
] | null | null | null | _posts/2021-09-16-B_createJavaProject.md | minyeon9/minyeon9.github.io | fb1e5c10930c3f3ed58f5ba7fc4d3eb375d252f1 | [
"MIT"
] | null | null | null | ---
layout: single
title: "Creating a Java Project"
categories: ['Java']
---

#### 1. Setting up Eclipse for Java



Click the [Open Perspective] button at the top right of Eclipse, then select Java.

The purpose of this button is to switch Eclipse to a view layout suited to the selected language.
Since we are using the Java language, we chose Java.
Once selected, the layout changes slightly from the one you saw when Eclipse first opened.

* * *

#### 2. Checking the JREs

[Window] - [Preferences] - search for 'Installed JREs' - check that the version you want to use is selected.

If the desired version is not listed: [Add] - [Standard VM] - choose its path.

* * *

#### 3. Creating a Project



[File] - [New] - [Java Project]

1) Set the project name

2) Check the JRE

3) Uncheck "Create module-info.java file": I wasn't sure what this is, so I searched for it... and I still don't get it. Something about JDK 9 and above... ㅠㅠ

* * *

#### 4. Java Structure and Organization

1) Package

* A bundle of classes
* Can contain sub-packages
* Separated by . (dot)
* Declared only once, on the first line of a source file
* Usually written as a single word

2) Import

* Used to specify the package that a class you want to use belongs to
* With an import statement, the package name can be omitted when using the class
* Classes in the java.lang package can be used without an import statement
* e.g. String / Object / System

3) Class

* Contains methods
* All code is written inside a class
* Classes together make up a single Java application
* Objects are created through classes
* The class name and the file name must be identical
* Can contain multiple methods
* Class names start with an uppercase letter (CamelCase)

4) Method

* A block that implements a class's functionality
* The entry method (the main method)
* public static void main(String[] args)
* Executed first when java.exe runs the class
* The name main cannot be changed
* Not required in every class, but must exist in one of them
* Method names start with a lowercase letter (camelCase)
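Putting the pieces of section 4 together, a minimal source file might look like this (the class name `Hello` is made up for illustration):

```java
// A package declaration (e.g. "package study;") would go on the very first line.
// Classes from java.lang, such as String and System, need no import statement.

// The class name starts with an uppercase letter and matches the file name: Hello.java
public class Hello {

    // Method names start with a lowercase letter (camelCase)
    static String greet() {
        return "Hello World!";
    }

    // The main method: executed first when java.exe runs this class.
    // Its name and signature cannot be changed.
    public static void main(String[] args) {
        System.out.println(greet());
    }
}
```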
***

#### 5. Creating a Class



[File] - [New] - [Class]



Create it inside the src folder.

1) Check the project path

2) Set the package name (starts with a lowercase letter)

3) Set the class name (starts with an uppercase letter)

4) Modifiers - check public

***

#### 6. Output Methods

* Print variables / characters / numbers / boolean values
* System.out.print()
* System.out.println()
* Handles the line break

***

#### 7. Comments

* For writing explanations or information about the code
* /* */ : block comment
* // : single-line comment
* /** */ : documentation comment (used when generating API documents with javadoc.exe)

***

#### 8. Printing and Operating on Numbers / Characters

1) Numbers

- No quotation marks

```java
public void printTest() {
    System.out.println(123); // integer
    System.out.println(3.14); // floating-point number
}
```

2) Characters

- ' ' : a single character
- " " : a single character / a string / an empty value

```java
public void printTest() {
    System.out.println('가'); // a single character
    System.out.println("Hello World!"); // a string
}
```

3) Operations

* number + number
* number + character (the order can vary)
* string + string

```java
public void printTest() {
    System.out.println(1 + 3); // 4
    System.out.println('a' + 1); // ASCII value + 1 = 98
    System.out.println("a" + 1); // string "a" + 1 = a1
    System.out.println("Hello" + (1 + 5)); // Hello6
    System.out.println("Hello" + 1 + 5); // Hello15
    System.out.println("Hi" + " " + ""); // spaces and empty values are allowed
}
```

***

I learned a whole lot today.
In class I just did what the instructor said, so it seemed easy,
but working on my own there are so many places where I get stuck.
I still mix up packages / classes / methods,
and writing import statements is really confusing too.
It's this confusing even though I wrote it all down; imagine if I hadn't. ㅠㅠ
Tomorrow, stay sharp again!!

***
<properties
	pageTitle="Adding an Azure Active Directory by using Connected Services in Visual Studio | Microsoft Azure"
	description="Add an Azure Active Directory by using the Add Connected Services dialog in Visual Studio"
	services="visual-studio-online"
	documentationCenter="na"
	authors="TomArcher"
	manager="douge"
	editor="" />
<tags
	ms.service="active-directory"
	ms.devlang="multiple"
	ms.topic="article"
	ms.tgt_pltfrm="na"
	ms.workload="na"
	ms.date="08/15/2016"
	ms.author="tarcher" />

# <a name="adding-an-azure-active-directory-by-using-connected-services-in-visual-studio"></a>Adding an Azure Active Directory by using Connected Services in Visual Studio

##<a name="overview"></a>Overview

Azure Active Directory (Azure AD) lets you support Single Sign-On (SSO) for ASP.NET MVC web applications, or Azure AD authentication for Web API services. With Azure AD authentication, your users can use their Azure AD accounts to sign in to your web applications. Combining Web API with Azure AD authentication has the benefit of improved data protection when you expose an API from your web application. With Azure AD, you don't need to maintain a separate authentication system with its own account management.

## <a name="supported-project-types"></a>Supported project types

You can connect to Azure AD by using the Connected Services dialog with the following project types:

- ASP.NET MVC projects
- ASP.NET Web API projects

### <a name="connect-to-azure-ad-using-the-connected-services-dialog"></a>Connect to Azure AD using the Connected Services dialog

1. Make sure that you have an Azure account. If you don't have an Azure account, you can sign up for a [free trial](http://go.microsoft.com/fwlink/?LinkId=518146).

1. Open the context menu of the **References** node of your project in Visual Studio, and choose **Add Connected Service**.

1. Select **Azure AD Authentication**, and then choose **Configure**.

	

1. On the first page of **Configure Azure AD Authentication**, check **Configure Single Sign-On using Azure AD**.

	If the project has been configured with another authentication setting, the wizard warns you that continuing will disable the previous configuration.

	

1. On the second page, select a domain from the **Domain** drop-down list. The list contains all of the domains available in the Account Settings dialog for your account. Alternatively, you can type a domain name if you don't find the one you're looking for, such as mydomain.onmicrosoft.com. You can also choose whether to create a new Azure AD app or use the settings of an existing Azure AD app.

	

1. On the third page of the wizard, make sure that **Read directory data** is checked. The wizard fills in the **Client Secret** for you.

	

1. Choose the **Finish** button. The dialog adds the necessary configuration code and references to enable Azure AD authentication in the project. You can see the AD domain in the [Azure portal](http://go.microsoft.com/fwlink/p/?LinkID=525040).

1. Review the Getting Started page that appears in your browser, which covers the next steps, and the What Happened page, which describes how your project was modified. If you want to verify that everything worked, open one of the modified configuration files and confirm that the settings mentioned on the What Happened page are present. For example, the main web.config in an ASP.NET MVC project has the following settings added:

        <appSettings>
            <add key="ida:ClientId" value="ClientId from the new Azure AD App" />
            <add key="ida:AADInstance" value="https://login.windows.net/" />
            <add key="ida:Domain" value="Your selected domain" />
            <add key="ida:TenantId" value="The Id of your selected Azure AD Tenant" />
            <add key="ida:PostLogoutRedirectUri" value="The default redirect URI from the project" />
        </appSettings>

## <a name="how-your-project-is-modified"></a>How your project is modified

When you run the wizard, Visual Studio adds Azure AD and associated references to your project. Configuration files and code files in the project are also modified to add Azure AD support. The specific changes that Visual Studio makes depend on the project type. For details on how ASP.NET MVC projects are modified, see [What happened – MVC projects](http://go.microsoft.com/fwlink/p/?LinkID=513809). For Web API projects, see [What happened – Web API projects](http://go.microsoft.com/fwlink/p/?LinkId=513810).

##<a name="next-steps"></a>Next steps

Ask questions and get help:

- [MSDN forum: Azure AD](https://social.msdn.microsoft.com/forums/azure/home?forum=WindowsAzureAD)
- [Azure AD documentation](https://azure.microsoft.com/documentation/services/active-directory/)
- [Blog post: Introduction to Azure AD](http://blogs.msdn.com/b/brunoterkaly/archive/2014/03/03/introduction-to-windows-azure-active-directory.aspx)
54c354b103e8ec8e6fbd4eb2d32a41a38eec5556 | 3,587 | md | Markdown | AgriPest.md | GSMADeveloper/HarmonisedEntityDefinitions | 9876bc9c6a2e33a4c2ee9a97673a1375f2d67d6c | [
"Apache-2.0"
] | 5 | 2017-02-16T09:23:16.000Z | 2021-03-20T09:56:38.000Z | AgriPest.md | JulianoCristian/HarmonisedEntityDefinitions | 9876bc9c6a2e33a4c2ee9a97673a1375f2d67d6c | [
"Apache-2.0"
] | null | null | null | AgriPest.md | JulianoCristian/HarmonisedEntityDefinitions | 9876bc9c6a2e33a4c2ee9a97673a1375f2d67d6c | [
"Apache-2.0"
] | 3 | 2017-01-20T19:11:58.000Z | 2017-10-03T09:14:39.000Z | ### AgriPest
This entity contains a harmonised description of a generic agricultural pest. This entity is primarily associated with the agricultural vertical and related IoT applications.
<AgriPest><Generic Attributes>
| Attribute Name | Attribute Type | Description | Mandatory /Optional | May be Null |
|----------------|----------------|-------------------------------------------------------------------------------|--------------------|-------------|
| id | Text | Unique id of this instance of this entity. | M | N |
| type | Text | Must be equal to "**AgriPest**". | M | N |
| dateCreated | DateTime | Entity creation timestamp. | M | N |
| dateModified | DateTime | Timestamp of the last modification of the entity. | M | Y |
| source | Text | A sequence of characters giving the source of the entity data as a URL. | M | Y |
| dataProvider | Text | A sequence of characters identifying the originator of the harmonised entity. | M | Y |
<AgriPest><Entity Specific Attributes>
| Attribute Name | Attribute Type | Description | Mandatory /Optional | May be Null |
|----------------|--------------------|-----------------------------------------------------------------------------------------------------------------------------------|--------------------|-------------|
| name | Text | The name of this agricultural pest. | M | N |
| alternateName | Text | Alternative name of this agricultural pest. | O | Y |
| description | Text | A description of this agricultural pest. | O | Y |
| refAgriProduct | Array of Reference | An array containing a JSON encoded sequence of characters referencing the unique ids of the recommended AgriProduct pesticide(s). | O | Y |
#### AgriPest JSON
The JSON code can be downloaded from:
<https://gist.github.com/GSMADeveloper/cbab7c37630eb7ea87b503267d0696a1>
```json
{
"id": "fb3f1295-500c-4aa3-b995-c909097d5c01",
"type": "AgriPest",
"dateCreated": {
"value": "2016-08-22T10:18:16Z",
"type": "DateTime"
},
"dateModified": {
"value": "2016-08-22T10:18:16Z",
"type": "DateTime"
},
"source": {
"value": "http://www.samplefarmproduct.com",
"type": "URL"
},
"dataProvider": {
"value": "OperatorA",
"type": "Text"
},
"name": {
"value": "Grasshopper",
"type": "Text"
},
"alternateName": {
"value": "Chorthippus parallelus",
"type": "Text"
},
"description": {
"value": "Common European grasshopper",
"type": "Text"
},
"refAgriProduct": [
"7f1d962b-0d14-479b-a50a-baaef261263a"
]
}
```
---
layout: post
title: A Ramble on Music Games
description: As titled
disqus: true
tags: Features Game-Design
---
Like film genres, game genres were first carved out in design practice. The popular genres of today follow no unified standard: some are defined by mechanics, some by player experience, and most are simply convention. The music game (MUG) emerged the same way. Out of a twin interest in games and music, attempts to combine the two kept appearing, until "playing" music was no longer just a dream, and games built around music came to be called music games. According to Wikipedia, a music game is a genre whose gameplay centers on the player's interaction with musical elements. This definition is clearly quite broad. In current design practice, most music games are played by acting in time with a rhythm, and through deeper exploration and genre extension they now span several distinct styles of play. Music games also frequently appear as mini-games embedded inside other titles.

## Music Memory Games

As the name suggests, music memory games are built on testing the player's musical memory; existing games mostly target short-term and eidetic memory. The mechanic is not exclusive to music games, and can already be glimpsed in children's games such as Simon Says. Simple as it is, the music memory game was the earliest music-game type to appear, initially in the form of non-digital children's toys. Simon (Milton Bradley, 1978) is the classic example; even today, the listen-then-repeat segments in some music games show Simon's influence. Simon has four colored buttons, each producing a different sound. Each round the device generates a random sound sequence, and the player must press the buttons to reproduce it. As play continues the number of sounds grows, and the difficulty rises with it.

Once music games had matured, some titles reinterpreted this simple concept. Tetsuya Mizuguchi's Space Channel 5 is one example. Released for the Dreamcast in 1999 and later ported to the PS2 and GBA, it was followed in 2002 by Space Channel 5: Part 2. The official description notes that despite the added rhythm element, it plays like an electronic "Simon Says". The player controls space-channel reporter Ulala, rescuing hostages and repelling enemies, all by repeating the enemies' commands. Besides the usual directional commands, the "Chu" command lets the player attack enemies or free hostages, and mistakes cost Ulala health. The press welcomed this reuse of the idea: IGN scored it 9.2, and GameSpot even wrote that if there were some kind of GameSpot rating for genre, Space Channel 5 could not be contained by it; it escapes the notion of genre entirely, yet remains simple, delightful fun. Early in development, Mizuguchi was asked to make a game with a broad audience, suitable even for casual female players. After interviewing a number of women, he concluded that female players love puzzle games while male players want to feel on top, so satisfying both at once was hard. But after drawing inspiration from the dance troupe Stomp, Space Channel 5, with musical memory as its mechanic, finally won players over, even attracting Michael Jackson to make an appearance.

As a type with a simple mechanic, the music memory game is the easiest way to fold musical elements into a game, and it holds a natural appeal for casual players. The flip side is that it is hard to extend. Perhaps for that reason, after the success of Space Channel 5 and its sequel, designers rarely attempted the type again. That does not mean the mechanic failed or died. In fact, it is being transformed and blended into other genres, gaining new life. Patapon's use of rhythmic note sequences as commands is one case. Its simplicity also makes it easy to embed as a mini-game in larger titles: in the action-adventure Legend of Zelda series, both Ocarina of Time and Spirit Tracks use the concept, having the player repeat fixed note combinations to play a lovely tune and solve puzzles at the same time. Though the rhythm element is absent there, it still counts as an extended application of the music memory mechanic.

## Rhythm Games

Rhythm games have always been the main branch of music games. Indeed, when music games come up, rhythm games are what players think of first. Their popularity can perhaps be traced to our natural love of rhythm. The base-level fun of games is usually physiological, and researchers hold that humans are innately rhythmic: heartbeat, breathing and so on. Many rhythm-related activities can summon physical pleasure.

Compared with other music games, rhythm games stress action. They are usually presented as simulated dance or instrument performance, testing the player's grasp of rhythm and hand-eye reactions. The player must press keys at the precise moments that match the music's beat, making an avatar dance brilliantly or play a beautiful tune, and earning higher scores for it. Most rhythm games include multiplayer modes in which players compete or play in ensemble. An ordinary controller can serve as input, but rhythm games often call for special devices such as dance mats. Rhythm games sprouted in Japan in the 1970s, took shape by the early 1990s, then became wildly popular in the West. By 2008, rhythm games had overtaken action games as the most popular genre; an oversaturated market and homogeneous content, however, later eroded their competitiveness.

PaRappa the Rapper (SCE, 1996) is generally considered the first rhythm game to enter the public eye. It was designed and produced by the Japanese musician and game designer Masaya Matsuura. The player follows prompts that scroll down the screen with the music; only precisely timed presses match the beat and let PaRappa rap. The game grades the player's performance on accuracy and style. Most challenging is freestyle, in which the player must depart from the given note sequence while still keeping the strikes consistent with the song. The game's humor also gives it a style of its own. By 1997, PaRappa the Rapper had sold 761,621 copies in Japan, the seventh best-selling game of the year, and in 1998 it took two prizes at the first Interactive Achievement Awards, for outstanding achievement in interactive design and in sound design. PaRappa the Rapper is regarded as the first rhythm game in the modern sense; its success set off a wave of music games in Japan, and pioneering developers began to work in the genre.

In the rhythm-game field, Konami holds the lead. Most of Konami's music games are published under the umbrella brand Bemani, a name taken from the Beatmania series the company released from 1997 to 2002. That series made the type explode and drove the rhythm-game boom of 1998. Beatmania is an arcade DJ simulation built around five keys and a turntable; when a note falls onto the judgment line, the player must press the key to clear it. Beatmania's popularity also founded the so-called DJ game subgenre. Beyond the arcade, the series had entries on PS, GBC and other platforms, and concluded in 2002 with Beatmania The Final. The derivative series Beatmania IIDX is known for seven keys and higher difficulty, while Beatmania III added a pedal. Beatmania and its variants swept the world, and online competition only fed the fire.

Among the limited number of Bemani works to enter Western markets, the most striking is the Dance Dance Revolution series (1998), the "dance machine" familiar to Chinese players. Praised for its original gameplay and lasting market appeal, DDR's enormous success drew crowds of imitators, making it the most copied of existing games. DDR has even been used as fitness equipment in schools.

At the start of the new century, on the popularity of series such as Rock Band and Guitar Hero, music games triumphed in the Western world and drew in a good number of non-core players. For non-core players, the "music" seems far more alluring than the "game"; the joy of personally taking part in an idol's music is hard to resist.

Guitar Hero is distinguished by its faithful simulation of a guitar's functions, while Rock Band combines Guitar Hero with Karaoke Revolution and downplays the guitar's importance somewhat. Rock Band lets players form their own band, taking the roles of lead guitarist, bassist, drummer and vocalist; points are scored when a player's performance matches the scrolling notes, and a steadily updated stream of downloadable songs delights players further. Sales of four million units and worldwide revenue of $600 million testify to the series' success.

Examine the mechanics of rhythm games closely and you find the fun is a compound: pull out the musical element and the bare key presses become meaningless. If a comparison must be forced, rhythm games somewhat resemble action games in notions such as judgment windows and combos. But what matters in a rhythm game is striking to the beat, not the tactical timing of an action game, and the pleasure that flows from rhythmic expression clearly shares nothing with action games. In any case, for all the "game" in the name, there is far less gameplay here than in other genres; the fun still comes mainly from the music. This partly explains the correlation between a game's built-in song catalogue and its sales.

In early rhythm games, the presence of visual cues (such as key zones) meant rhythm played only a supporting role; the games chiefly tested quick reactions. Dance Dance Revolution's explicit arrows and hit zones are an example, and that model was inherited by many successors. Later rhythm games gradually toned down visual cues: Bust A Move asks the player to input from memory of the rhythm, and only then does feel for the beat truly matter. One might say that visual cues separate an OK from a Miss, but only a good grasp of rhythm reaches the higher judgments. As designers gave the rhythm element more weight, games appeared that play down the music itself and focus on pure beat-striking. Rhythm Heaven is such a game, drawing its rhythms largely from scenes of everyday life, which adds a sense of familiarity. By this point, the rhythm game seems to have stepped outside the boundary of the music game and become a genre of its own.

Besides differing attitudes toward rhythm, the degree of realism also marks a rhythm game's character. Basic mechanics include key count, judgment tiers and combo rules; generally, higher realism means more complex mechanics. The latest versions of Guitar Hero and Rock Band call for guitar or drum peripherals as input. The upgraded Beatmania has seven keys plus a turntable, and in play the notes pour down the screen like a waterfall; only true devotees can enjoy it. With creative ideas running dry, complex mechanics became a trend for a time, turning music games, alongside fighting games, into games for elite players. Advanced fingering technique must be mastered through hard practice, and players have worked out standard techniques such as partitioning the screen by eye and combo fingerings to conquer the charts.

Another factor that decides which rhythm game a player picks is the kind of songs built into it. One reason rhythm games were slow to break into Western markets in the twentieth century was their J-POP and anime songs. Relatively speaking, enthusiasts prefer original game music, while licensed music wins over more non-core players. Music thus becomes a key element of a game's style: Taiko no Tatsujin, by simulating the Japanese taiko drum, carries a distinctively rich Japanese flavor.

One direction of development for rhythm games, and music games generally, is letting players load their own songs rather than performing only the built-in tracks. At present, some music games can import songs through dedicated software, but players must author playable charts themselves. Making such charts is no easy task, and the quality of player-made charts varies widely.

It should also be noted that early in the development of rhythm games, visual information was the player's main basis for timing key presses, and such games are also called sight-reading games. In fact, since music has more elements than rhythm alone, the notion of a sight-reading game is far broader than that of a rhythm game. Pitch, loudness and more are all usable elements, but so far very few games have touched them. Mad Maestro! (Eidos Interactive, 2002) could be called a loudness game: it has ordinary rhythm gameplay but adds pressure sensitivity, so the player must press buttons at one of three pressure levels to play notes of different volumes. The representative pitch game is Karaoke Revolution.

## Sandbox Music Games

Attempts to invent new music games have never ceased. Designers have tried handing the initiative to players and letting them experience the fun of music for themselves: the so-called sandbox music game, which can be defined as a music game whose core pleasure, placed above the mechanics, is musical creation. For that reason, compared with other music games, sandbox games look more like non-game music-making software. In some other music games, free-form music is instead extracted into an optional mode.

The earliest sandbox music game was Toshio Iwai's SimTunes (Maxis). In the game, bugs move across a grid covered with colored dots; as a bug passes over a dot it triggers a pitch, a timbre and a visual effect. The player can paint different colors onto the grid, or change a bug's direction of travel, which changes the outcome and allows improvised visual-music performance. Mainstream media paid little attention to this game of less than a megabyte, but people who discovered it through word of mouth usually praised it. One player called it, astonishingly, a game that can grow together with the player.

A far better known example is the same author's Electroplankton, composed of ten different interactive musical creatures; combining the DS touch screen and microphone, it integrates much of Iwai's earlier work. Electroplankton is in essence interactive music art. It offers several distinctive modes of musical creation themed on sea creatures; in each, the player manipulates the creatures and their environment to produce different timbres, which join into music. Unlike a rhythm game, its creative element largely grows out of the DS's touch concept, so even messy, dissonant sounds become living parts of the piece, showing the game's inventiveness. IGN gave the work a 7. The art historian Diana Poulsen saw in it the meeting point of games and art.

Another representative sandbox work is Shigeru Miyamoto's Wii Music for the Wii platform. In it the makers tell you, over and over, that this is not a rhythm game; what the player is meant to experience is the joy of music itself. The game pursues that goal through free rearrangement of existing songs, a process that is entirely unconstrained; after performing, players can record their rearranged songs onto an album. But how exactly one is to experience the joy of music itself, the makers never say, and the lovely fantasy is soon punctured by the noise produced by players who do not know music. Mainstream reviews were mixed: GameSpy judged it an aloof creation lacking anything to catch a player's eye, while NintendoWorldReport held that for people without musical experience it is a very good learning tool.

Against the rigidity of rhythm games, sandbox games offer a more open way to play. Their sense of participation in the music is stronger, but they can look like interactive music software rather than games; in some titles, sandbox play is tucked in as one mode among others. And although sandbox games in general are beloved by players, sandbox music games are not as lovely as they sound. Jesse Schell, author of The Art of Game Design, argues that when players face too many choices, choice itself loses all meaning; to turn a sandbox toy into a game, tasks and guidance are indispensable. From another angle, for a player to find joy in a sandbox, some musical cultivation is essential, which suits sandbox games to expert players. Put more extremely, a guitar could serve as an excellent sandbox game, except that I have no idea how to play one. On this point the designer Toshio Iwai's non-musical background served him well. Leading ordinary players to feel, and then to understand, music may be the sandbox game's finer prospect, and of course its harder task.

## Hybrid Music Games

In hybrid music games, the player interacts with the game environment in non-musical ways, yet receives an interactive experience tied to musical elements. Depending on whether the music or the game mechanics carries more weight, hybrid music games divide into the generative and the reactive. In the former, the mechanics determine the musical experience; in the latter, the reverse.

An early generative hybrid is Toshio Iwai's Otocky (ASCII, 1987), a work regarded as a design forerunner of games like Rez. Otocky is really a side-scrolling action game: the player character, armed with various instruments, attacks enemies, with different directions mapped to different pitches. Beyond that, the game's fiction is musical: the player's goal is to collect notes, which must first be released and then shot down to be retrieved. The game even contains a music-composition mode. By handing so much autonomy to the player, it gained the fun of freedom but sacrificed some of the music's integrity.

Tetsuya Mizuguchi's Rez improved on this. Rez's original concept came from the Russian artist Wassily Kandinsky's theory of synesthesia; Mizuguchi said he wanted the game to express the synesthetic experience of "hearing color, seeing notes". The game's space brims with digital and cyber flavor. The player moves along a predetermined rail and, with no possibility of dodging, can only shoot to hold off enemies. Key presses, hits and other actions each map to different sound effects, and the rhythmic effects even vary with the combo count. Compared with Otocky, Rez weakened the place of the player's actions in the background music, preserving the music's integrity while keeping the player in the position of a free creator: the player chooses the manner and order of attacks, which brings the fun of improvisation. In truth, because of Rez's intense artistry, ordinary players understood the work only to a limited degree. IGN voiced the confusion in the header of its review: perhaps you won't understand what this review is saying; honestly, neither do we; but of course you should play this game. In 2008 the remaster Rez HD went on sale, and interestingly, the average press score leapt from 82% seven years earlier to 91%. Beyond the better sights and sounds, this perhaps shows that the public needed time to savor Rez's charm. As 1UP put it, the times have finally caught up with Rez.

The representative reactive hybrid is Vib-Ribbon. Born on the PS in 1999, its singular feature is that it can generate different levels from whatever music CD the player inserts. The visuals are simple and distinctive: vector line segments draw the sharply angled levels and the character Vib. The player must lead Vib across courses strewn with obstacles generated automatically from the song being played. A button pressed at the right time carries Vib through unharmed; otherwise Vib degrades from a rabbit into a frog, then a worm, and finally vanishes. Clearing 18 obstacles in a row evolves Vib into a princess. There are four obstacle types in all, each requiring a different button to pass, and two different obstacles can merge into a more complex type. GameSpot observed that it would be easy to dismiss it as "another weird Japanese music game", when in fact it is a work of art designed in an exceedingly elegant way.

The examples above show the difference between the two kinds of hybrid. In a generative hybrid, whatever the restrictions, the player remains the creator of the music. In a reactive hybrid, the player only performs the music along the route the designer has prescribed, merely in a manner unlike an ordinary rhythm game. One might say the distinction is between playing a game by means of music and experiencing music by means of a game. Either way, hybrid music games have contributed conspicuously to the expansion of the genre. With rhythm games deadlocked today, hybrids are the new hope for music games to break out.

After the experiment of Rez, Mizuguchi tried blending musical elements with puzzle games in Lumines, and dug into the concept of the music shooter with Every Extend Extra. The former resembles Tetris: each round drops a 2×2 block; rotating blocks clockwise or counterclockwise to assemble four squares of the same color marks them for clearing, which happens when the sweeping timeline passes over them. Differences in block fall speed and timeline speed give the game its varying difficulty. Moving, rotating and clearing blocks each carry their own sound effects, changing from stage to stage, and together with the background music they produce a singular feeling. The latter takes "self-destruction" and "chains" as its theme. Through these curious works we can trace both the designer's personal growth and the trajectory of music games. It is not hard to imagine that, as inquiry into music games deepens and design practice accumulates, they will bring us ever more wondrous experiences.

## Music Management Games

Compared with other music games, music management games look as if they had little to do with music itself. But that does not mean the fun drawn from musical elements is any less. Rather than interacting with music directly, these games train their sights on the production and management side of music, on scheduling, production and the promotion of stars, and so reach experiences other music games cannot touch. With simulation as the chief mechanic, carrying players through a musical life beyond the everyday is the charm of the music management game. Although works of this kind appeared as early as the 1990s, it was the arrival of NBGI's THE iDOLM@STER series in 2005 that pushed the music management game to its height. After the original arcade version, related products soon followed on the Xbox 360 and PSP. In the game, the player becomes a music producer, training idols and competing for fans. Officially the work's genre is an "idol-raising experience game". It may look as if it merely lures players with the idols' pretty faces, but the simulation is not at all casual: the player must schedule the singers' lessons to build their skills, communicate with them to keep their spirits up, and at auditions strive to present the results and win the judges' favor. The game's popularity has sent its idol singers into other media such as animation, and NBGI even announced the "Real 765 Project", an attempt to bring the idols into real life and accept genuine work offers in the real world. Pure music management games are relatively few, but games involving star-raising are in fact numerous; it is just that most of them have nothing to do with music. How to deliver the twin pleasures of music and nurturing at once is the problem the music management game must solve. Judging from players' enthusiasm for the iDOLM@STER series, the type still has potential left to mine.

## Conclusion

Judged from the practice of game design, the refinement and fusion of genres is the direction of development, and music games will change by the same law. As a hybrid product of musical elements and game mechanics, the music game has given us an entirely new way to appreciate music; game experiences built on sound not only lay a good foundation for the genre's broad audience, but also enrich our lives.

Matsuura, the maker of PaRappa the Rapper, believes the most attractive thing about interactive media is that players can truly take part in musical expression. Many commentators likewise credit the success of the Rock Band series to young people's longing to become rock stars. Games are often seen as abstract simulations of reality, and music games are no exception; compared with traditional ways of performing music, the lower threshold of the music game loses little of the joy of musical expression. As a song ends, the masters on the dance mat feel a delight much like that of a concert. The musical element is this genre's natural advantage, and gameplay, in turn, is the music game's very foundation.

Compared with other games, the history of music games is much shorter. Despite the late start, the music game reached a highly mature state early in its development; most later designs reused existing concepts and mechanics, and innovation rarely achieved true success. This runs against players' appetite for the new and the changing, and is precisely the driving force behind the genre's further evolution.

As players, we look forward to the evolution of the traditional music game, and even more to new types and new experiences.
---
**Copyright notice: free to repost, non-commercial, no derivatives, keep attribution ([Creative Commons 3.0 License](https://creativecommons.org/licenses/by-nc-nd/3.0/deed.zh))**
## BrightApricot
A PWA/SPA Blazor WASM and ASP.NET Core app for amateur cooks to store and perfect recipes 🥘
# Machine_Learning_Studies
Models and machine learning studies
# Relaks Event Emitter
Relaks Event Emitter is the base class of data sources used by [Relaks](https://github.com/trambarhq/relaks) applications. It's designed for promised-based asynchronous code. It provides a notable feature allowing event listeners to postpone the default action associated with an event.
## Methods of emitter
* [addEventListener](#addeventlistener)
* [removeEventListener](#removeeventlistener)
* [triggerEvent](#triggerevent)
* [waitForEvent](#waitforevent)
### addEventListener
```typescript
function addEventListener(name: string, handler: function, beginning?:boolean): void
```
Add an event listener. If `beginning` is `true`, then `handler` will receive the event prior to handlers added previously. Otherwise it's placed at the end of the queue.
### removeEventListener
```typescript
function removeEventListener(name: string, handler: function): void
```
Remove an event listener.
### waitForEvent
```typescript
async function waitForEvent(type: string): Event
```
Return a promise that is fulfilled when an event of the specified type occurs.
### triggerEvent
```typescript
function triggerEvent(evt: object): boolean
```
Send an event object to any listeners interested in the event. A method used by the event emitter itself. The return value indicates whether there were any listeners.
## Methods of event object
* [preventDefault](#preventdefault)
* [postponeDefault](#postponedefault)
* [stopImmediatePropagation](#stopimmediatepropagation)
* [waitForDecision](#waitfordecision)
### preventDefault
```typescript
function preventDefault(void): void
```
Indicate that the default action should not be performed.
### postponeDefault
```typescript
function postponeDefault(proceed: Promise): void
```
```typescript
function postponeDefault(callback: AsyncFunction): void
```
Request that the default action be postponed. The method accepts either a promise or a callback function that returns a promise. An event emitter would wait for this promise to be fulfilled before performing the default action. If the promise's fulfillment value is `false` (and not merely "falsy"), that'd be the equivalent of calling `preventDefault()` and `stopImmediatePropagation()`--i.e. the default action will not occur.
When there are multiple listeners, a call to this method will keep other listeners from receiving the event until the promise given is fulfilled.
### stopImmediatePropagation
```typescript
function stopImmediatePropagation(void): void
```
Keep listeners further down the chain from receiving this event.
### waitForDecision
```typescript
async function waitForDecision(void): void
```
A method used by the event emitter itself. If a promise is given to `postponeDefault()` within an event handler, this method will wait for its fulfillment. The method returns immediately otherwise.
54c6ab75984eff62a4cd13089e10a7c310bb05a5 | 3,343 | md | Markdown | README.md | kksharma1618/dropbox-streaming-upload | 45485878629e71aa06ab11afce3b346c5c2760f6 | ["MIT"] | 3 | 2018-01-22T23:41:15.000Z | 2018-11-19T15:56:13.000Z | README.md | kksharma1618/dropbox-streaming-upload | 45485878629e71aa06ab11afce3b346c5c2760f6 | ["MIT"] | 1 | 2020-11-16T15:18:34.000Z | 2020-11-16T16:14:24.000Z | README.md | kksharma1618/dropbox-streaming-upload | 45485878629e71aa06ab11afce3b346c5c2760f6 | ["MIT"] | null | null | null |

# dropbox-streaming-upload
Streaming uploads for dropbox api v2
[Official js sdk](https://github.com/dropbox/dropbox-sdk-js) by dropbox lacks streaming upload feature. This package provide just that missing feature, nothing else.
## Installation
```
npm install --save dropbox-streaming-upload
```
## Supported features
- Simple basic upload
- Upload session for bigger files (Dropbox requires an upload session for files bigger than 150 MB, though you can use one for files of any size)
- Ability to cancel upload
## Usage
``` javascript
const upload = require('dropbox-streaming-upload').default
upload(options).then(function(successMetadata) {
    // upload finished; successMetadata is the metadata returned by Dropbox
}, function(error) {
    // upload failed
})
```
## Options
- *access_token:* Dropbox access token.
- *readable_stream:* Readable stream to upload.
- *file_size:* Total size of upload in bytes. Since we just have the readable stream we need size information separately.
- *destination:* Destination path in dropbox where to upload the file (full file path: ie, if you are uploading roses.jpg to /weeds/ folder, then "/weeds/roses.jpg").
- *forced_chunked_upload:* By default, the library will use an upload session if *file_size* is greater than 150 MB. If you set this to true, it will force an upload session regardless of *file_size*.
- *chunk_size:* By default, the library uses 5 MB chunks during a chunked upload. You can override that here (in bytes).
- *mute:* See dropbox [documentation](https://www.dropbox.com/developers/documentation/http/documentation#files-upload) for /upload or /upload_session/finish. Default is false.
- *autorename:* See dropbox [documentation](https://www.dropbox.com/developers/documentation/http/documentation#files-upload) for /upload or /upload_session/finish. Default is true.
- *mode:* See dropbox [documentation](https://www.dropbox.com/developers/documentation/http/documentation#files-upload) for /upload or /upload_session/finish. Default is "add".
- *client_modified:* See dropbox [documentation](https://www.dropbox.com/developers/documentation/http/documentation#files-upload) for /upload or /upload_session/finish.
## Cancelling upload
``` javascript
const upload = require('dropbox-streaming-upload').default
const options = {
access_token: 'token',
readable_stream: someStream,
file_size: totalSize,
destination: '/mypath/myfile.ext',
}
upload(options).then(function(successMetadata) {
    // upload finished
}, function(error) {
    if (error.message === 'user_aborted') {
        // you cancelled the upload below
    }
})
// when you want to cancel it (before upload is done/failed)
// cancel function is added to your options object by library
// calling cancel function after upload is done/failed has no effect
options.cancel()
```
## Unit testing
- clone this repository
- run
```
npm install
```
- copy ./sample\_test\_data.json to ./tests/test\_data.json
- fill in "access\_token"
- fill in "unitTestBaseFolder". Unit test will create a tmp folder in that folder and upload files inside that (tmp folder will be removed in the end)
- update "uploads" array as needed. structure of uploads array item:
``` json
{
"localFilePath": "localfilepath.ext",
"destination": "./relative/to/tmp/folder/risingearth.jpg",
"forced_chunked_upload": false,
"chunk_size": 5e+6
}
```
- forced\_chunked\_upload and chunk\_size are optional
- run
```
npm test
```
54c8922625d498a3a8367103af4e60824b507588 | 5,014 | md | Markdown | 2016-08/2016-08-13_short.md | admariner/trending_archive | ffca6522c4fdaa5203c3f18d9a5a7d7da098169c | ["MIT"] | 80 | 2015-02-13T16:52:22.000Z | 2022-03-10T20:13:08.000Z | 2016-08/2016-08-13_short.md | admariner/trending_archive | ffca6522c4fdaa5203c3f18d9a5a7d7da098169c | ["MIT"] | 65 | 2021-10-02T05:54:01.000Z | 2021-12-28T22:50:23.000Z | 2016-08/2016-08-13_short.md | admariner/trending_archive | ffca6522c4fdaa5203c3f18d9a5a7d7da098169c | ["MIT"] | 16 | 2015-10-08T11:06:28.000Z | 2021-06-30T07:26:49.000Z |

### 2016-08-13
diff between today and yesterday
####python
* [Enteleform/-RES-](https://github.com/Enteleform/-RES-): Resources
* [RedBalloonShenanigans/MonitorDarkly](https://github.com/RedBalloonShenanigans/MonitorDarkly): Poc, Presentation of Monitor OSD Exploitation, and shenanigans of high quality.
* [hynek/attrs](https://github.com/hynek/attrs): Attributes without boilerplate.
* [gunthercox/ChatterBot](https://github.com/gunthercox/ChatterBot): ChatterBot is a machine learning, conversational dialog engine.
* [orobix/retina-unet](https://github.com/orobix/retina-unet): Retina blood vessel segmentation with a convolutional neural network
* [rg3/youtube-dl](https://github.com/rg3/youtube-dl): Command-line program to download videos from YouTube.com and other video sites
* [airbnb/caravel](https://github.com/airbnb/caravel): Caravel is a data exploration platform designed to be visual, intuitive, and interactive
* [MagicStack/asyncpg](https://github.com/MagicStack/asyncpg): A fast PostgreSQL Database Client Library for Python/asyncio
* [XX-net/XX-Net](https://github.com/XX-net/XX-Net): a web proxy tool
#### go
* [miekg/coredns](https://github.com/miekg/coredns): CoreDNS is a DNS server that runs middleware
* [jfrazelle/ghb0t](https://github.com/jfrazelle/ghb0t): A GitHub Bot to automatically delete your fork's branches after a pull request has been merged.
* [yyyar/gobetween](https://github.com/yyyar/gobetween): Modern & minimalistic load balancer for the loud era
* [mholt/caddy](https://github.com/mholt/caddy): Fast, cross-platform HTTP/2 web server with automatic HTTPS
* [grafana/grafana](https://github.com/grafana/grafana): Gorgeous metric viz, dashboards & editors for Graphite, InfluxDB & OpenTSDB
#### cpp
* [phaistos-networks/TANK](https://github.com/phaistos-networks/TANK): A very high performance distributed log service
* [JohnLangford/vowpal_wabbit](https://github.com/JohnLangford/vowpal_wabbit): Vowpal Wabbit is a machine learning system which pushes the frontier of machine learning with techniques such as online, hashing, allreduce, reductions, learning2search, active, and interactive learning.
* [JuliaComputing/llvm-cbe](https://github.com/JuliaComputing/llvm-cbe): resurrected LLVM "C Backend", with improvements
* [Ardour/ardour](https://github.com/Ardour/ardour): Mirror of Ardour Source Code
* [networkprotocol/libyojimbo](https://github.com/networkprotocol/libyojimbo): A library for creating secure client/server network protocols over UDP
* [google/leveldb](https://github.com/google/leveldb): LevelDB is a fast key-value storage library written at Google that provides an ordered mapping from string keys to string values.
* [bitcoin/bitcoin](https://github.com/bitcoin/bitcoin): Bitcoin Core integration/staging tree
* [mongodb/mongo](https://github.com/mongodb/mongo): The Mongo Database
* [dmlc/xgboost](https://github.com/dmlc/xgboost): Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Flink and DataFlow
#### javascript
* [recharts/recharts](https://github.com/recharts/recharts): Redefined chart library built with React and D3
* [gatsbyjs/gatsby](https://github.com/gatsbyjs/gatsby): Transform plain text into dynamic blogs and websites using React.js
* [zeit/micro](https://github.com/zeit/micro): Async ES6 HTTP microservices made easy
* [nodejs/node](https://github.com/nodejs/node): Node.js JavaScript runtime
* [ryanflorence/react-lumberjack](https://github.com/ryanflorence/react-lumberjack): Logging setState for React
* [adilmoujahid/kaggle-talkingdata-visualization](https://github.com/adilmoujahid/kaggle-talkingdata-visualization): Source code for blog post: Interactive Data Visualization of Geospatial Data using D3.js, DC.js, Leaflet.js and Python
* [jumpsuit/jumpsuit](https://github.com/jumpsuit/jumpsuit): Jump in. Zip up. Build great apps.
#### coffeescript
* [bkeepers/github-notifications](https://github.com/bkeepers/github-notifications): A client for reading GitHub notifications
* [atom/language-puppet](https://github.com/atom/language-puppet): Puppet package for Atom
* [coffee-js/coffee-script](https://github.com/coffee-js/coffee-script): CoffeeScript
* [fairfieldt/coffeescript-concat](https://github.com/fairfieldt/coffeescript-concat): A utility that preprocesses and concatenates CoffeeScript source files
* [resin-io/resin-wifi-connect](https://github.com/resin-io/resin-wifi-connect): CoffeeScript
* [todda00/meteor-collection-revisions](https://github.com/todda00/meteor-collection-revisions): Keep revision history for Meteor collection documents and provide restore functionality.
* [fcoury/atom-rspec](https://github.com/fcoury/atom-rspec): Atom RSpec runner package
* [Lordmau5/node-spotify-downloader](https://github.com/Lordmau5/node-spotify-downloader): A CLI and GUI solution to download music from Spotify
* [koding/kd](https://github.com/koding/kd): UI Framework for web applications.
54c89a770e4d0f315c1eebb5f36ff38778d00905 | 864 | md | Markdown | docs/readmes/orc8r/testing.md | sarathbrp/magma | 2641929b1adbaaabf344b8f0e23fc442afcdae4a | ["BSD-3-Clause"] | null | null | null | docs/readmes/orc8r/testing.md | sarathbrp/magma | 2641929b1adbaaabf344b8f0e23fc442afcdae4a | ["BSD-3-Clause"] | null | null | null | docs/readmes/orc8r/testing.md | sarathbrp/magma | 2641929b1adbaaabf344b8f0e23fc442afcdae4a | ["BSD-3-Clause"] | null | null | null |

---
id: testing
sidebar_label: Testing
title: Testing in Orchestrator
hide_title: true
---
# Testing in Orchestrator
### Unit Tests
One easy way to test is to run unit tests. This can be done by running
```
magma/orc8r/cloud: make test
```
### Run the services and check their health
Unit tests are great for checking small logic chunks,
but another way to test is to run the services and check their status.
The services can be built and started by running
```
magma/orc8r/cloud: make run
```
or
```
magma/orc8r/cloud: make restart
```
to restart the services without rebuilding.
The state of these services can be checked by running
```
sudo service magma@* status
```
for all services, and
```
sudo service magma@SERVICE_NAME status
```
for a specific service.
Run
```
sudo journalctl -u magma@SERVICE_NAME
```
to see logs relating to a specific service.
54c9454468bcbf0568cfc64d9631b2f87f25c468 | 92 | md | Markdown | README.md | mdkearns/carpe-noctem | b7fe20bc47c029bf579da45aa28159dc1458ff6c | ["MIT"] | null | null | null | README.md | mdkearns/carpe-noctem | b7fe20bc47c029bf579da45aa28159dc1458ff6c | ["MIT"] | null | null | null | README.md | mdkearns/carpe-noctem | b7fe20bc47c029bf579da45aa28159dc1458ff6c | ["MIT"] | null | null | null |

# Carpe Noctem
A platform to learn/share important statistical concepts in layman's terms.
54ca055af4b7d23036a04795bd2f6b432b597876 | 809 | md | Markdown | linux/README.md | dengshenkk/Keylogger | 3afd114734344bacdc270dc2c31a07cc5b7a244a | ["MIT"] | 1,483 | 2016-11-27T18:56:50.000Z | 2022-03-30T03:33:33.000Z | linux/README.md | dengshenkk/Keylogger | 3afd114734344bacdc270dc2c31a07cc5b7a244a | ["MIT"] | 95 | 2016-11-25T19:25:27.000Z | 2022-02-23T10:23:29.000Z | linux/README.md | dengshenkk/Keylogger | 3afd114734344bacdc270dc2c31a07cc5b7a244a | ["MIT"] | 621 | 2016-11-27T20:33:35.000Z | 2022-03-30T19:27:42.000Z |

# Keylogger
**Keylogger** is a simple keylogger for Windows, Linux and Mac.
## Installation
The following instructions will install Keylogger using pip3.
```
pip3 install -r requirements.txt
```
or
```
pip3 install pyxhook
```
## How to run it
Run `nohup python3 keylogger.py &` to start logging your keystrokes:
The meaning of nohup is 'no hangup'.
When nohup is combined with '&', the command keeps running in the background, survives the terminal being closed, and the shell prompt is returned immediately.
```
$~/Keylogger/linux$ nohup python3 keylogger.py &
[1] 12529 //this is the keylogger's PID (process ID)
$:~/Keylogger/linux$ fg
```
The Keylogger is now running! It will log your strokes to a file.
Stop it by typing the command `fg` then hitting `CTRL+C`
or
`kill {PID}` for example `kill 12529`
---
54cad45ff37ad19fd26bcfac0fb6a8fb473067ca | 143 | md | Markdown | dotfiles/xfce-flags/README.md | sio/homelab | 984e7568a403259639abe89955612f1cef67f92b | ["Apache-2.0"] | 1 | 2021-07-26T09:27:54.000Z | 2021-07-26T09:27:54.000Z | dotfiles/xfce-flags/README.md | sio/homelab | 984e7568a403259639abe89955612f1cef67f92b | ["Apache-2.0"] | null | null | null | dotfiles/xfce-flags/README.md | sio/homelab | 984e7568a403259639abe89955612f1cef67f92b | ["Apache-2.0"] | null | null | null |

# Text-only labels for keyboard layout in XFCE
`~/.local/share/...` path does not work in Debian 9
(XFCE 4.12, Keyboard Layouts Plugin 0.7.1)
54cb1885c6e0efb0216387054f336f07ec922266 | 69,714 | md | Markdown | posdealers/04-after-sales/troubleshooting-firewall.md | MLU0408/getting-started | f0ad5ae61b621bb96c33414989b36ff700363d44 | ["MIT"] | null | null | null | posdealers/04-after-sales/troubleshooting-firewall.md | MLU0408/getting-started | f0ad5ae61b621bb96c33414989b36ff700363d44 | ["MIT"] | null | null | null | posdealers/04-after-sales/troubleshooting-firewall.md | MLU0408/getting-started | f0ad5ae61b621bb96c33414989b36ff700363d44 | ["MIT"] | null | null | null |

---
slug: /posdealers/get-started/after-sales/troubleshooting-firewall
title: 'Troubleshooting: Firewall'
---
# Troubleshooting: Firewall
Behind these DNS names sits a traffic manager that redirects to a load balancer; the load balancer forwards the request to a server, although the response is usually returned by the load balancer rather than by the server directly.
Depending on the security concept and level, as well as the hardware in use, it can be quite tricky to create a correct firewall configuration here that also works in the fail-over case. Using a dedicated proxy should not be a problem as long as the transferred data is not modified in any way.
### Further information
Host headers for the traffic manager: it is DNS-based and returns HTTP 30x redirects to an active environment (in an emergency, this can be bypassed).
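Because the traffic manager can fail over between environments, allow-list all of a service's published IPs rather than only the currently active one. A hedged sketch that generates allow rules from a few of the production IPs listed in the tables below; the iptables syntax is an assumption, so translate it to your firewall product, and the rules are printed rather than applied so they can be reviewed first:

```shell
# Illustrative only: generate outbound HTTPS allow rules for some of the
# fiskaltrust production endpoints (IPs from the tables in this document).
# iptables is an assumption; adapt the output to your firewall product.
FT_IPS="52.232.31.243 52.232.31.248 52.166.149.161 52.232.31.234"
for ip in $FT_IPS; do
  echo "iptables -A OUTPUT -p tcp -d ${ip} --dport 443 -j ACCEPT"
done
```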
### IPs until June 2020
| Datacenter | FT.Country | URL | IP | Package Download | Configuration Download Upload | Onlinesignature SCU (AT only) | Portal | SignaturCloud |
| ------------- | ---------- | ----------------------------------------------------- | -------------- | ----------------- | ------------------------------ | ------------------------------ | ------ | ------------- |
| westeurope | at | ft-a103-helipad.fiskaltrust.at | 52.232.31.243 | | X | | | |
| westeurope | at | ft-a103-packages.fiskaltrust.at | 52.232.31.243 | X | | | | |
| westeurope | at | ft-a103-portal.fiskaltrust.at | 52.232.31.243 | | | | X | |
| westeurope | at | ft-a103-signaturcloud-azure.fiskaltrust.at | 52.232.31.243 | | | | | X |
| westeurope | at | ft-a103-signaturcloud-sql.fiskaltrust.at | 52.232.31.243 | | | | | X |
| westeurope | at | ft-a103-signing.fiskaltrust.at | 52.232.31.243 | | | X | | |
| westeurope | at | ft-a103-signing-secondary.fiskaltrust.at | 52.232.31.243 | | | X | | |
| westeurope | at | ft-a104-helipad.fiskaltrust.at | 52.232.31.248 | | X | | | |
| westeurope | at | ft-a104-packages.fiskaltrust.at | 52.232.31.248 | X | | | | |
| westeurope | at | ft-a104-portal.fiskaltrust.at | 52.232.31.248 | | | | X | |
| westeurope | at | ft-a104-signaturcloud-azure.fiskaltrust.at | 52.232.31.248 | | | | | X |
| westeurope | at | ft-a104-signaturcloud-sql.fiskaltrust.at | 52.232.31.248 | | | | | X |
| westeurope | at | ft-a104-signing.fiskaltrust.at | 52.232.31.248 | | | X | | |
| westeurope | at | ft-a104-signing-secondary.fiskaltrust.at | 52.232.31.248 | | | X | | |
| westeurope | at | ft-a105-helipad.fiskaltrust.at | 52.166.149.161 | | X | | | |
| westeurope | at | ft-a105-packages.fiskaltrust.at | 52.166.149.161 | X | | | | |
| westeurope | at | ft-a105-portal.fiskaltrust.at | 52.166.149.161 | | | | X | |
| westeurope | at | ft-a105-signaturcloud-azure.fiskaltrust.at | 52.166.149.161 | | | | | X |
| westeurope | at | ft-a105-signaturcloud-sql.fiskaltrust.at | 52.166.149.161 | | | | | X |
| westeurope | at | ft-a105-signing.fiskaltrust.at | 52.166.149.161 | | | X | | |
| westeurope | at | ft-a105-signing-secondary.fiskaltrust.at | 52.166.149.161 | | | X | | |
| westeurope | at | ft-a106-helipad.fiskaltrust.at | 52.232.31.234 | | X | | | |
| westeurope | at | ft-a106-packages.fiskaltrust.at | 52.232.31.234 | X | | | | |
| westeurope | at | ft-a106-portal.fiskaltrust.at | 52.232.31.234 | | | | X | |
| westeurope | at | ft-a106-signaturcloud-azure.fiskaltrust.at | 52.232.31.234 | | | | | X |
| westeurope | at | ft-a106-signaturcloud-sql.fiskaltrust.at | 52.232.31.234 | | | | | X |
| westeurope | at | ft-a106-signing.fiskaltrust.at | 52.232.31.234 | | | X | | |
| westeurope | at | ft-a106-signing-secondary.fiskaltrust.at | 52.232.31.234 | | | X | | |
| westeurope | at | ft-a240-helipad-sandbox.fiskaltrust.at | 52.232.105.99 | | X | | | |
| westeurope | at | ft-a240-packages-sandbox.fiskaltrust.at | 52.232.105.99 | X | | | | |
| westeurope | at | ft-a240-portal-sandbox.fiskaltrust.at | 52.232.105.99 | | | | X | |
| westeurope | at | ft-a240-signaturcloud-azure-sandbox.fiskaltrust.at | 52.232.105.99 | | | | | X |
| westeurope | at | ft-a240-signaturcloud-sql-sandbox.fiskaltrust.at | 52.232.105.99 | | | | | X |
| westeurope | at | ft-a240-signing-sandbox.fiskaltrust.at | 52.232.105.99 | | | X | | |
| westeurope | at | ft-a240-signing-sandbox-secondary.fiskaltrust.at | 52.232.105.99 | | | X | | |
| westeurope | at | ft-a248-helipad-sandbox.fiskaltrust.at | 52.233.153.172 | | X | | | |
| westeurope | at | ft-a248-packages-sandbox.fiskaltrust.at | 52.233.153.172 | X | | | | |
| westeurope | at | ft-a248-portal-sandbox.fiskaltrust.at | 52.233.153.172 | | | | X | |
| westeurope | at | ft-a248-signaturcloud-azure-sandbox.fiskaltrust.at | 52.233.153.172 | | | | | X |
| westeurope | at | ft-a248-signaturcloud-sql-sandbox.fiskaltrust.at | 52.233.153.172 | | | | | X |
| westeurope | at | ft-a248-signing-sandbox.fiskaltrust.at | 52.233.153.172 | | | X | | |
| westeurope | at | ft-a248-signing-sandbox-secondary.fiskaltrust.at | 52.233.153.172 | | | X | | |
| westeurope | at | ft-a252-helipad-sandbox.fiskaltrust.at | 52.233.199.77 | | X | | | |
| westeurope | at | ft-a252-packages-sandbox.fiskaltrust.at | 52.233.199.77 | X | | | | |
| westeurope | at | ft-a252-portal-sandbox.fiskaltrust.at | 52.233.199.77 | | | | X | |
| westeurope | at | ft-a252-signaturcloud-azure-sandbox.fiskaltrust.at | 52.233.199.77 | | | | | X |
| westeurope | at | ft-a252-signaturcloud-sql-sandbox.fiskaltrust.at | 52.233.199.77 | | | | | X |
| westeurope | at | ft-a252-signing-sandbox.fiskaltrust.at | 52.233.199.77 | | | X | | |
| westeurope | at | ft-a252-signing-sandbox-secondary.fiskaltrust.at | 52.233.199.77 | | | X | | |
| graz | at | ft-at11-signing.fiskaltrust.at | 149.154.100.21 | | | X | | |
| graz | at | ft-at11-signing-secondary.fiskaltrust.at | 149.154.100.21 | | | X | | |
| westeurope | cloud | ft-a103-data.fiskaltrust.cloud | 52.232.31.245 | | | | | |
| westeurope | cloud | ft-a103-helipad.fiskaltrust.cloud | 52.232.31.245 | | X | | | |
| westeurope | cloud | ft-a103-packages.fiskaltrust.cloud | 52.232.31.245 | X | | | | |
| westeurope | cloud | ft-a103-signaturcloud-azure.fiskaltrust.cloud | 52.232.31.245 | | | | | X |
| westeurope | cloud | ft-a103-signaturcloud-sql.fiskaltrust.cloud | 52.232.31.245 | | | | | X |
| westeurope | cloud | ft-a104-data.fiskaltrust.cloud | 52.232.31.252 | | | | | |
| westeurope | cloud | ft-a104-helipad.fiskaltrust.cloud | 52.232.31.252 | | X | | | |
| westeurope | cloud | ft-a104-packages.fiskaltrust.cloud | 52.232.31.252 | X | | | | |
| westeurope | cloud | ft-a104-signaturcloud-azure.fiskaltrust.cloud | 52.232.31.252 | | | | | X |
| westeurope | cloud | ft-a104-signaturcloud-sql.fiskaltrust.cloud | 52.232.31.252 | | | | | X |
| westeurope | cloud | ft-a105-data.fiskaltrust.cloud | 52.166.150.3 | | | | | |
| westeurope | cloud | ft-a105-helipad.fiskaltrust.cloud | 52.166.150.3 | | X | | | |
| westeurope | cloud | ft-a105-packages.fiskaltrust.cloud | 52.166.150.3 | X | | | | |
| westeurope | cloud | ft-a105-signaturcloud-azure.fiskaltrust.cloud | 52.166.150.3 | | | | | X |
| westeurope | cloud | ft-a105-signaturcloud-sql.fiskaltrust.cloud | 52.166.150.3 | | | | | X |
| westeurope | cloud | ft-a106-data.fiskaltrust.cloud | 13.80.159.55 | | | | | |
| westeurope | cloud | ft-a106-helipad.fiskaltrust.cloud | 13.80.159.55 | | X | | | |
| westeurope | cloud | ft-a106-packages.fiskaltrust.cloud | 13.80.159.55 | X | | | | |
| westeurope | cloud | ft-a106-signaturcloud-azure.fiskaltrust.cloud | 13.80.159.55 | | | | | X |
| westeurope | cloud | ft-a106-signaturcloud-sql.fiskaltrust.cloud | 13.80.159.55 | | | | | X |
| westeurope | cloud | ft-a240-data-sandbox.fiskaltrust.cloud | 52.232.105.98 | | | | | |
| westeurope | cloud | ft-a240-helipad-sandbox.fiskaltrust.cloud | 52.232.105.98 | | X | | | |
| westeurope | cloud | ft-a240-packages-sandbox.fiskaltrust.cloud | 52.232.105.98 | X | | | | |
| westeurope | cloud | ft-a248-data-sandbox.fiskaltrust.cloud | 52.174.199.114 | | | | | |
| westeurope | cloud | ft-a248-helipad.fiskaltrust.cloud | 52.174.199.114 | | X | | | |
| westeurope | cloud | ft-a248-helipad-sandbox.fiskaltrust.cloud | 52.174.199.114 | | X | | | |
| westeurope | cloud | ft-a248-packages.fiskaltrust.cloud | 52.174.199.114 | X | | | | |
| westeurope | cloud | ft-a248-packages-sandbox.fiskaltrust.cloud | 52.174.199.114 | X | | | | |
| westeurope | cloud | ft-a248-signaturcloud-azure.fiskaltrust.cloud | 52.174.199.114 | | | | | X |
| westeurope | cloud | ft-a248-signaturcloud-azure-sandbox.fiskaltrust.cloud | 52.174.199.114 | | | | | X |
| westeurope | cloud | ft-a248-signaturcloud-sql.fiskaltrust.cloud | 52.174.199.114 | | | | | X |
| westeurope | cloud | ft-a248-signaturcloud-sql-sandbox.fiskaltrust.cloud | 52.174.199.114 | | | | | X |
| westeurope | cloud | ft-a252-data-sandbox.fiskaltrust.cloud | 52.233.198.78 | | | | | |
| westeurope | cloud | ft-a252-helipad-sandbox.fiskaltrust.cloud | 52.233.198.78 | | X | | | |
| westeurope | cloud | ft-a252-packages-sandbox.fiskaltrust.cloud | 52.233.198.78 | X | | | | |
| westeurope | cloud | ft-a252-signaturcloud-azure-sandbox.fiskaltrust.cloud | 52.233.198.78 | | | | | X |
| westeurope | cloud | ft-a252-signaturcloud-sql-sandbox.fiskaltrust.cloud | 52.233.198.78 | | | | | X |
| westeurope | de | ft-a103-portal.fiskaltrust.de | 52.174.101.91 | | | | X | |
| westeurope | de | ft-a103-signaturcloud-azure.fiskaltrust.de | 52.174.101.91 | | | | | X |
| westeurope | de | ft-a103-signaturcloud-sql.fiskaltrust.de | 52.174.101.91 | | | | | X |
| westeurope | de | ft-a104-portal.fiskaltrust.de | 13.80.144.75 | | | | X | |
| westeurope | de | ft-a104-signaturcloud-azure.fiskaltrust.de | 13.80.144.75 | | | | | X |
| westeurope | de | ft-a104-signaturcloud-sql.fiskaltrust.de | 13.80.144.75 | | | | | X |
| westeurope | de | ft-a105-portal.fiskaltrust.de | 13.95.25.48 | | | | X | |
| westeurope | de | ft-a105-signaturcloud-azure.fiskaltrust.de | 13.95.25.48 | | | | | X |
| westeurope | de | ft-a105-signaturcloud-sql.fiskaltrust.de | 13.95.25.48 | | | | | X |
| westeurope | de | ft-a106-portal.fiskaltrust.de | 52.233.135.71 | | | | X | |
| westeurope | de | ft-a106-signaturcloud-azure.fiskaltrust.de | 52.233.135.71 | | | | | X |
| westeurope | de | ft-a106-signaturcloud-sql.fiskaltrust.de | 52.233.135.71 | | | | | X |
| westeurope | de | ft-a240-portal-sandbox.fiskaltrust.de | 13.93.13.52 | | | | X | |
| westeurope | de | ft-a240-siganturcloud-azure-sandbox.fiskaltrust.de | 13.93.13.52 | | | | | X |
| westeurope | de | ft-a240-siganturcloud-sandbox.fiskaltrust.de | 13.93.13.52 | | | | | X |
| westeurope | de | ft-a240-signaturcloud-sql-sandbox.fiskaltrust.de | 13.93.13.52 | | | | | X |
| westeurope | de | ft-a248-portal-sandbox.fiskaltrust.de | 13.80.148.248 | | | | X | |
| westeurope | de | ft-a248-signaturcloud-azure-sandbox.fiskaltrust.de | 13.80.148.248 | | | | | X |
| westeurope | de | ft-a248-signaturcloud-sql-sandbox.fiskaltrust.de | 13.80.148.248 | | | | | X |
| westeurope | de | ft-a252-portal-sandbox.fiskaltrust.de | 13.80.149.240 | | | | X | |
| westeurope | de | ft-a252-signaturcloud-azure-sandbox.fiskaltrust.de | 13.80.149.240 | | | | | X |
| westeurope | fr | ft-a103-portal.fiskaltrust.fr | 52.166.149.170 | | | | X | |
| westeurope | fr | ft-a103-signaturcloud-azure.fiskaltrust.fr | 52.166.149.170 | | | | | X |
| westeurope | fr | ft-a103-signaturcloud-sql.fiskaltrust.fr | 52.166.149.170 | | | | | X |
| westeurope | fr | ft-a104-portal.fiskaltrust.fr | 52.174.149.36 | | | | X | |
| westeurope | fr | ft-a104-signaturcloud-azure.fiskaltrust.fr | 52.174.149.36 | | | | | X |
| westeurope | fr | ft-a104-signaturcloud-sql.fiskaltrust.fr | 52.174.149.36 | | | | | X |
| westeurope | fr | ft-a105-portal.fiskaltrust.fr | 23.101.74.60 | | | | X | |
| westeurope | fr | ft-a105-signaturcloud-azure.fiskaltrust.fr | 23.101.74.60 | | | | | X |
| westeurope | fr | ft-a105-signaturcloud-sql.fiskaltrust.fr | 23.101.74.60 | | | | | X |
| westeurope | fr | ft-a106-portal.fiskaltrust.fr | 52.233.130.60 | | | | X | |
| westeurope | fr | ft-a106-signaturcloud-azure.fiskaltrust.fr | 52.233.130.60 | | | | | X |
| westeurope | fr | ft-a106-signaturcloud-sql.fiskaltrust.fr | 52.233.130.60 | | | | | X |
| francecentral | fr | ft-a108-portal.fiskaltrust.fr | 40.89.143.55 | | | | X | |
| francecentral | fr | ft-a108-signaturcloud-azure.fiskaltrust.fr | 40.89.143.55 | | | | | X |
| francecentral | fr | ft-a108-signaturcloud-sql.fiskaltrust.fr | 40.89.143.55 | | | | | X |
| westeurope | fr | ft-a240-portal-sandbox.fiskaltrust.fr | 13.93.11.101 | | | | X | |
| westeurope | fr | ft-a240-signaturcloud-azure-sandbox.fiskaltrust.fr | 13.93.11.101 | | | | | X |
| westeurope | fr | ft-a240-signaturcloud-sql-sandbox.fiskaltrust.fr | 13.93.11.101 | | | | | X |
| francecentral | fr | ft-a241-portal-sandbox.fiskaltrust.fr | 40.89.140.232 | | | | X | |
| francecentral | fr | ft-a241-signaturcloud-azure-sandbox.fiskaltrust.fr | 40.89.140.232 | | | | | X |
| francecentral | fr | ft-a241-signaturcloud-sql-sandbox.fiskaltrust.fr | 40.89.140.232 | | | | | X |
| westeurope | fr | ft-a248-portal-sandbox.fiskaltrust.fr | 40.68.86.85 | | | | X | |
| westeurope | fr | ft-a248-signaturcloud-azure-sandbox.fiskaltrust.fr | 40.68.86.85 | | | | | X |
| westeurope | fr | ft-a248-signaturcloud-sql-sandbox.fiskaltrust.fr | 40.68.86.85 | | | | | X |
| westeurope | fr | ft-a252-portal-sandbox.fiskaltrust.fr | 13.81.216.246 | | | | X | |
| westeurope | fr | ft-a252-signaturcloud-azure-sandbox.fiskaltrust.fr | 13.81.216.246 | | | | | X |
| westeurope | fr | ft-a252-signaturcloud-sql-sandbox.fiskaltrust.fr | 13.81.216.246 | | | | | X |
| Westeurope | cloud | ft-a103-fiskaltrust-cloud.cloud | 52.232.31.245 | | | | | |
| Westeurope | at | ft-a103-fiskaltrust-cloud.at | 52.232.31.245 | | | | | |
| Westeurope | cloud | ft-a104-fiskaltrust-cloud.cloud | 52.232.31.252 | | | | | |
| Westeurope | at | ft-a104-fiskaltrust-cloud.at | 52.232.31.252 | | | | | |
| Westeurope | cloud | ft-a104-fiskaltrust-at.cloud | 52.232.31.248 | | | | | |
| Westeurope | at | ft-a104-fiskaltrust-at.at | 52.232.31.248 | | | | | |
| Westeurope | cloud | ft-a105-fiskaltrust-cloud.cloud | 52.166.150.3 | | | | | |
| Westeurope | at | ft-a105-fiskaltrust-cloud.at | 52.166.150.3 | | | | | |
| Westeurope | cloud | ft-a105-fiskaltrust-at.cloud | 52.166.149.161 | | | | | |
| Westeurope | at | ft-a105-fiskaltrust-at.at | 52.166.149.161 | | | | | |
| Westeurope | cloud | ft-a106-fiskaltrust-cloud.cloud | 13.80.159.55 | | | | | |
| Westeurope | at | ft-a106-fiskaltrust-cloud.at | 13.80.159.55 | | | | | |
| Westeurope | cloud | ft-a106-fiskaltrust-at.cloud | 52.232.31.234 | | | | | |
| Westeurope | at | ft-a106-fiskaltrust-at.at | 52.232.31.234 | | | | | |
| Paris | fr | ft-a108-fiskaltrust-fr.fr | 40.89.143.55 | | | | | |
| Paris | fr | ft-a109-fiskaltrust-fr.fr | 40.66.60.130 | | | | | |
| Frankfurt | de | ft-a111-fiskaltrust-de.de | 51.116.232.218 | | | | | |
| Frankfurt | de | ft-a112-fiskaltrust-de.de | 51.116.168.9 | | | | | |
| Westeurope | at | ft-a114-fiskaltrust-at.at | 13.95.22.62 | | | | | |
| Westeurope | at | ft-a115-fiskaltrust-at.at | 52.166.206.12 | | | | | |
| Graz | at | ft-at11-fiskaltrust-at.at | 149.154.100.21 | | | | | |
## IPs from June 2020
| Datacenter | FT.Country | URL | IP | Package Download | Configuration Download Upload | Onlinesignature SCU (AT only) | Portal | SignaturCloud |
| ------------- | ---------- | ----------------------------------------------------- | -------------- | ---------------- | ----------------------------- | ----------------------------- | ------ | ------------- |
| westeurope | at | ft-a103-helipad.fiskaltrust.at | 52.232.31.243 | | X | | | |
| westeurope | at | ft-a103-packages.fiskaltrust.at | 52.232.31.243 | X | | | | |
| westeurope | at | ft-a103-portal.fiskaltrust.at | 52.232.31.243 | | | | X | |
| westeurope | at | ft-a103-signaturcloud-azure.fiskaltrust.at | 52.232.31.243 | | | | | X |
| westeurope | at | ft-a103-signaturcloud-sql.fiskaltrust.at | 52.232.31.243 | | | | | X |
| westeurope | at | ft-a103-signing.fiskaltrust.at | 52.232.31.243 | | | X | | |
| westeurope | at | ft-a103-signing-secondary.fiskaltrust.at | 52.232.31.243 | | | X | | |
| westeurope | at | ft-a104-helipad.fiskaltrust.at | 52.232.31.248 | | X | | | |
| westeurope | at | ft-a104-packages.fiskaltrust.at | 52.232.31.248 | X | | | | |
| westeurope | at | ft-a104-portal.fiskaltrust.at | 52.232.31.248 | | | | X | |
| westeurope | at | ft-a104-signaturcloud-azure.fiskaltrust.at | 52.232.31.248 | | | | | X |
| westeurope | at | ft-a104-signaturcloud-sql.fiskaltrust.at | 52.232.31.248 | | | | | X |
| westeurope | at | ft-a104-signing.fiskaltrust.at | 52.232.31.248 | | | X | | |
| westeurope | at | ft-a104-signing-secondary.fiskaltrust.at | 52.232.31.248 | | | X | | |
| westeurope | at | ft-a105-helipad.fiskaltrust.at | 52.166.149.161 | | X | | | |
| westeurope | at | ft-a105-packages.fiskaltrust.at | 52.166.149.161 | X | | | | |
| westeurope | at | ft-a105-portal.fiskaltrust.at | 52.166.149.161 | | | | X | |
| westeurope | at | ft-a105-signaturcloud-azure.fiskaltrust.at | 52.166.149.161 | | | | | X |
| westeurope | at | ft-a105-signaturcloud-sql.fiskaltrust.at | 52.166.149.161 | | | | | X |
| westeurope | at | ft-a105-signing.fiskaltrust.at | 52.166.149.161 | | | X | | |
| westeurope | at | ft-a105-signing-secondary.fiskaltrust.at | 52.166.149.161 | | | X | | |
| westeurope | at | ft-a106-helipad.fiskaltrust.at | 52.232.31.234 | | X | | | |
| westeurope | at | ft-a106-packages.fiskaltrust.at | 52.232.31.234 | X | | | | |
| westeurope | at | ft-a106-portal.fiskaltrust.at | 52.232.31.234 | | | | X | |
| westeurope | at | ft-a106-signaturcloud-azure.fiskaltrust.at | 52.232.31.234 | | | | | X |
| westeurope | at | ft-a106-signaturcloud-sql.fiskaltrust.at | 52.232.31.234 | | | | | X |
| westeurope | at | ft-a106-signing.fiskaltrust.at | 52.232.31.234 | | | X | | |
| westeurope | at | ft-a106-signing-secondary.fiskaltrust.at | 52.232.31.234 | | | X | | |
| westeurope | at | ft-a240-helipad-sandbox.fiskaltrust.at | 52.232.105.99 | | X | | | |
| westeurope | at | ft-a240-packages-sandbox.fiskaltrust.at | 52.232.105.99 | X | | | | |
| westeurope | at | ft-a240-portal-sandbox.fiskaltrust.at | 52.232.105.99 | | | | X | |
| westeurope | at | ft-a240-signaturcloud-azure-sandbox.fiskaltrust.at | 52.232.105.99 | | | | | X |
| westeurope | at | ft-a240-signaturcloud-sql-sandbox.fiskaltrust.at | 52.232.105.99 | | | | | X |
| westeurope | at | ft-a240-signing-sandbox.fiskaltrust.at | 52.232.105.99 | | | X | | |
| westeurope | at | ft-a240-signing-sandbox-secondary.fiskaltrust.at | 52.232.105.99 | | | X | | |
| westeurope | at | ft-a248-helipad-sandbox.fiskaltrust.at | 52.233.153.172 | | X | | | |
| westeurope | at | ft-a248-packages-sandbox.fiskaltrust.at | 52.233.153.172 | X | | | | |
| westeurope | at | ft-a248-portal-sandbox.fiskaltrust.at | 52.233.153.172 | | | | X | |
| westeurope | at | ft-a248-signaturcloud-azure-sandbox.fiskaltrust.at | 52.233.153.172 | | | | | X |
| westeurope | at | ft-a248-signaturcloud-sql-sandbox.fiskaltrust.at | 52.233.153.172 | | | | | X |
| westeurope | at | ft-a248-signing-sandbox.fiskaltrust.at | 52.233.153.172 | | | X | | |
| westeurope | at | ft-a248-signing-sandbox-secondary.fiskaltrust.at | 52.233.153.172 | | | X | | |
| westeurope | at | ft-a252-helipad-sandbox.fiskaltrust.at | 52.233.199.77 | | X | | | |
| westeurope | at | ft-a252-packages-sandbox.fiskaltrust.at | 52.233.199.77 | X | | | | |
| westeurope | at | ft-a252-portal-sandbox.fiskaltrust.at | 52.233.199.77 | | | | X | |
| westeurope | at | ft-a252-signaturcloud-azure-sandbox.fiskaltrust.at | 52.233.199.77 | | | | | X |
| westeurope | at | ft-a252-signaturcloud-sql-sandbox.fiskaltrust.at | 52.233.199.77 | | | | | X |
| westeurope | at | ft-a252-signing-sandbox.fiskaltrust.at | 52.233.199.77 | | | X | | |
| westeurope | at | ft-a252-signing-sandbox-secondary.fiskaltrust.at | 52.233.199.77 | | | X | | |
| graz | at | ft-at11-signing.fiskaltrust.at | 149.154.100.21 | | | X | | |
| graz | at | ft-at11-signing-secondary.fiskaltrust.at | 149.154.100.21 | | | X | | |
| westeurope | cloud | ft-a103-data.fiskaltrust.cloud | 52.232.31.245 | | | | | |
| westeurope | cloud | ft-a103-helipad.fiskaltrust.cloud | 52.232.31.245 | | X | | | |
| westeurope | cloud | ft-a103-packages.fiskaltrust.cloud | 52.232.31.245 | X | | | | |
| westeurope | cloud | ft-a103-signaturcloud-azure.fiskaltrust.cloud | 52.232.31.245 | | | | | X |
| westeurope | cloud | ft-a103-signaturcloud-sql.fiskaltrust.cloud | 52.232.31.245 | | | | | X |
| westeurope | cloud | ft-a104-data.fiskaltrust.cloud | 52.232.31.252 | | | | | |
| westeurope | cloud | ft-a104-helipad.fiskaltrust.cloud | 52.232.31.252 | | X | | | |
| westeurope | cloud | ft-a104-packages.fiskaltrust.cloud | 52.232.31.252 | X | | | | |
| westeurope | cloud | ft-a104-signaturcloud-azure.fiskaltrust.cloud | 52.232.31.252 | | | | | X |
| westeurope | cloud | ft-a104-signaturcloud-sql.fiskaltrust.cloud | 52.232.31.252 | | | | | X |
| westeurope | cloud | ft-a105-data.fiskaltrust.cloud | 52.166.150.3 | | | | | |
| westeurope | cloud | ft-a105-helipad.fiskaltrust.cloud | 52.166.150.3 | | X | | | |
| westeurope | cloud | ft-a105-packages.fiskaltrust.cloud | 52.166.150.3 | X | | | | |
| westeurope | cloud | ft-a105-signaturcloud-azure.fiskaltrust.cloud | 52.166.150.3 | | | | | X |
| westeurope | cloud | ft-a105-signaturcloud-sql.fiskaltrust.cloud | 52.166.150.3 | | | | | X |
| westeurope | cloud | ft-a106-data.fiskaltrust.cloud | 13.80.159.55 | | | | | |
| westeurope | cloud | ft-a106-helipad.fiskaltrust.cloud | 13.80.159.55 | | X | | | |
| westeurope | cloud | ft-a106-packages.fiskaltrust.cloud | 13.80.159.55 | X | | | | |
| westeurope | cloud | ft-a106-signaturcloud-azure.fiskaltrust.cloud | 13.80.159.55 | | | | | X |
| westeurope | cloud | ft-a106-signaturcloud-sql.fiskaltrust.cloud | 13.80.159.55 | | | | | X |
| westeurope | cloud | ft-a240-data-sandbox.fiskaltrust.cloud | 52.232.105.98 | | | | | |
| westeurope | cloud | ft-a240-helipad-sandbox.fiskaltrust.cloud | 52.232.105.98 | | X | | | |
| westeurope | cloud | ft-a240-packages-sandbox.fiskaltrust.cloud | 52.232.105.98 | X | | | | |
| westeurope | cloud | ft-a248-data-sandbox.fiskaltrust.cloud | 52.174.199.114 | | | | | |
| westeurope | cloud | ft-a248-helipad.fiskaltrust.cloud | 52.174.199.114 | | X | | | |
| westeurope | cloud | ft-a248-helipad-sandbox.fiskaltrust.cloud | 52.174.199.114 | | X | | | |
| westeurope | cloud | ft-a248-packages.fiskaltrust.cloud | 52.174.199.114 | X | | | | |
| westeurope | cloud | ft-a248-packages-sandbox.fiskaltrust.cloud | 52.174.199.114 | X | | | | |
| westeurope | cloud | ft-a248-signaturcloud-azure.fiskaltrust.cloud | 52.174.199.114 | | | | | X |
| westeurope | cloud | ft-a248-signaturcloud-azure-sandbox.fiskaltrust.cloud | 52.174.199.114 | | | | | X |
| westeurope | cloud | ft-a248-signaturcloud-sql.fiskaltrust.cloud | 52.174.199.114 | | | | | X |
| westeurope | cloud | ft-a248-signaturcloud-sql-sandbox.fiskaltrust.cloud | 52.174.199.114 | | | | | X |
| westeurope | cloud | ft-a252-data-sandbox.fiskaltrust.cloud | 52.233.198.78 | | | | | |
| westeurope | cloud | ft-a252-helipad-sandbox.fiskaltrust.cloud | 52.233.198.78 | | X | | | |
| westeurope | cloud | ft-a252-packages-sandbox.fiskaltrust.cloud | 52.233.198.78 | X | | | | |
| westeurope | cloud | ft-a252-signaturcloud-azure-sandbox.fiskaltrust.cloud | 52.233.198.78 | | | | | X |
| westeurope | cloud | ft-a252-signaturcloud-sql-sandbox.fiskaltrust.cloud | 52.233.198.78 | | | | | X |
| westeurope | de | ft-a103-portal.fiskaltrust.de | 52.174.101.91 | | | | X | |
| westeurope | de | ft-a103-signaturcloud-azure.fiskaltrust.de | 52.174.101.91 | | | | | X |
| westeurope | de | ft-a103-signaturcloud-sql.fiskaltrust.de | 52.174.101.91 | | | | | X |
| westeurope | de | ft-a104-portal.fiskaltrust.de | 13.80.144.75 | | | | X | |
| westeurope | de | ft-a104-signaturcloud-azure.fiskaltrust.de | 13.80.144.75 | | | | | X |
| westeurope | de | ft-a104-signaturcloud-sql.fiskaltrust.de | 13.80.144.75 | | | | | X |
| westeurope | de | ft-a105-portal.fiskaltrust.de | 13.95.25.48 | | | | X | |
| westeurope | de | ft-a105-signaturcloud-azure.fiskaltrust.de | 13.95.25.48 | | | | | X |
| westeurope | de | ft-a105-signaturcloud-sql.fiskaltrust.de | 13.95.25.48 | | | | | X |
| westeurope | de | ft-a106-portal.fiskaltrust.de | 52.233.135.71 | | | | X | |
| westeurope | de | ft-a106-signaturcloud-azure.fiskaltrust.de | 52.233.135.71 | | | | | X |
| westeurope | de | ft-a106-signaturcloud-sql.fiskaltrust.de | 52.233.135.71 | | | | | X |
| westeurope | de | ft-a240-portal-sandbox.fiskaltrust.de | 13.93.13.52 | | | | X | |
| westeurope | de | ft-a240-siganturcloud-azure-sandbox.fiskaltrust.de | 13.93.13.52 | | | | | X |
| westeurope | de | ft-a240-siganturcloud-sandbox.fiskaltrust.de | 13.93.13.52 | | | | | X |
| westeurope | de | ft-a240-signaturcloud-sql-sandbox.fiskaltrust.de | 13.93.13.52 | | | | | X |
| westeurope | de | ft-a248-portal-sandbox.fiskaltrust.de | 13.80.148.248 | | | | X | |
| westeurope | de | ft-a248-signaturcloud-azure-sandbox.fiskaltrust.de | 13.80.148.248 | | | | | X |
| westeurope | de | ft-a248-signaturcloud-sql-sandbox.fiskaltrust.de | 13.80.148.248 | | | | | X |
| westeurope | de | ft-a252-portal-sandbox.fiskaltrust.de | 13.80.149.240 | | | | X | |
| westeurope | de | ft-a252-signaturcloud-azure-sandbox.fiskaltrust.de | 13.80.149.240 | | | | | X |
| westeurope | fr | ft-a103-portal.fiskaltrust.fr | 52.166.149.170 | | | | X | |
| westeurope | fr | ft-a103-signaturcloud-azure.fiskaltrust.fr | 52.166.149.170 | | | | | X |
| westeurope | fr | ft-a103-signaturcloud-sql.fiskaltrust.fr | 52.166.149.170 | | | | | X |
| westeurope | fr | ft-a104-portal.fiskaltrust.fr | 52.174.149.36 | | | | X | |
| westeurope | fr | ft-a104-signaturcloud-azure.fiskaltrust.fr | 52.174.149.36 | | | | | X |
| westeurope | fr | ft-a104-signaturcloud-sql.fiskaltrust.fr | 52.174.149.36 | | | | | X |
| westeurope | fr | ft-a105-portal.fiskaltrust.fr | 23.101.74.60 | | | | X | |
| westeurope | fr | ft-a105-signaturcloud-azure.fiskaltrust.fr | 23.101.74.60 | | | | | X |
| westeurope | fr | ft-a105-signaturcloud-sql.fiskaltrust.fr | 23.101.74.60 | | | | | X |
| westeurope | fr | ft-a106-portal.fiskaltrust.fr | 52.233.130.60 | | | | X | |
| westeurope | fr | ft-a106-signaturcloud-azure.fiskaltrust.fr | 52.233.130.60 | | | | | X |
| westeurope | fr | ft-a106-signaturcloud-sql.fiskaltrust.fr | 52.233.130.60 | | | | | X |
| francecentral | fr | ft-a108-portal.fiskaltrust.fr | 40.89.143.55 | | | | X | |
| francecentral | fr | ft-a108-signaturcloud-azure.fiskaltrust.fr | 40.89.143.55 | | | | | X |
| francecentral | fr | ft-a108-signaturcloud-sql.fiskaltrust.fr | 40.89.143.55 | | | | | X |
| westeurope | fr | ft-a240-portal-sandbox.fiskaltrust.fr | 13.93.11.101 | | | | X | |
| westeurope | fr | ft-a240-signaturcloud-azure-sandbox.fiskaltrust.fr | 13.93.11.101 | | | | | X |
| westeurope | fr | ft-a240-signaturcloud-sql-sandbox.fiskaltrust.fr | 13.93.11.101 | | | | | X |
| francecentral | fr | ft-a241-portal-sandbox.fiskaltrust.fr | 40.89.140.232 | | | | X | |
| francecentral | fr | ft-a241-signaturcloud-azure-sandbox.fiskaltrust.fr | 40.89.140.232 | | | | | X |
| francecentral | fr | ft-a241-signaturcloud-sql-sandbox.fiskaltrust.fr | 40.89.140.232 | | | | | x |
| westeurope | fr | ft-a248-portal-sandbox.fiskaltrust.fr | 40.68.86.85 | | | | X | |
| westeurope | fr | ft-a248-signaturcloud-azure-sandbox.fiskaltrust.fr | 40.68.86.85 | | | | | X |
| westeurope | fr | ft-a248-signaturcloud-sql-sandbox.fiskaltrust.fr | 40.68.86.85 | | | | | X |
| westeurope | fr | ft-a252-portal-sandbox.fiskaltrust.fr | 13.81.216.246 | | | | X | |
| westeurope | fr | ft-a252-signaturcloud-azure-sandbox.fiskaltrust.fr | 13.81.216.246 | | | | | X |
| westeurope | fr | ft-a252-signaturcloud-sql-sandbox.fiskaltrust.fr | 13.81.216.246 | | | | | X |
| westeurope | cloud | ft-a103-fiskaltrust-cloud.cloud | 52.232.31.245 | | | | | |
| westeurope | at | ft-a103-fiskaltrust-cloud.at | 52.232.31.245 | | | | | |
| westeurope | cloud | ft-a104-fiskaltrust-cloud.cloud | 52.232.31.252 | | | | | |
| westeurope | at | ft-a104-fiskaltrust-cloud.at | 52.232.31.252 | | | | | |
| westeurope | cloud | ft-a104-fiskaltrust-at.cloud | 52.232.31.248 | | | | | |
| westeurope | at | ft-a104-fiskaltrust-at.at | 52.232.31.248 | | | | | |
| westeurope | cloud | ft-a105-fiskaltrust-cloud.cloud | 52.166.150.3 | | | | | |
| westeurope | at | ft-a105-fiskaltrust-cloud.at | 52.166.150.3 | | | | | |
| westeurope | cloud | ft-a105-fiskaltrust-at.cloud | 52.166.149.161 | | | | | |
| westeurope | at | ft-a105-fiskaltrust-at.at | 52.166.149.161 | | | | | |
| westeurope | cloud | ft-a106-fiskaltrust-cloud.cloud | 13.80.159.55 | | | | | |
| westeurope | at | ft-a106-fiskaltrust-cloud.at | 13.80.159.55 | | | | | |
| westeurope | cloud | ft-a106-fiskaltrust-at.cloud | 52.232.31.234 | | | | | |
| westeurope | at | ft-a106-fiskaltrust-at.at | 52.232.31.234 | | | | | |
| paris | fr | ft-a108-fiskaltrust-fr.fr | 40.89.143.55 | | | | | |
| paris | fr | ft-a109-fiskaltrust-fr.fr | 40.66.60.130 | | | | | |
| frankfurt | de | ft-a111-fiskaltrust-de.de | 51.116.232.218 | | | | | |
| frankfurt | de | ft-a112-fiskaltrust-de.de | 51.116.168.9 | | | | | |
| westeurope | at | ft-a114-fiskaltrust-at.at | 13.95.22.62 | | | | | |
| westeurope | at | ft-a115-fiskaltrust-at.at | 52.166.206.12 | | | | | |
| graz | at | ft-at11-fiskaltrust-at.at | 149.154.100.21 | | | | | |
### Requirements when using the fiskaly TSE
Port 443 https outbound for
- kassensichv.io
### Requirements when using the Deutsche Fiskal TSE / the Swissbit Cloud TSE
Port 443 https outbound for
- fiskal.cloud
From the Deutsche Fiskal FAQ:
The Fiskal Cloud Connector requires https access to the hostname fiskal.cloud on port 443.
The service is DNS-based; the IP address cannot be guaranteed and may change without notice.
A cloud-based TSS Fiskal Cloud Connector requires continuous connectivity for all signing, export, and other operations.
With a token-based Fiskal Cloud Connector, signing operations also work without internet connectivity. However, configuration changes and export functions require a regular internet connection.
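A quick way to verify from a cash-register host that the required outbound connections are possible is a small `curl` loop (a minimal sketch; the hostnames come from the requirements above, and a failing line usually points at a missing firewall or proxy rule):

```shell
#!/bin/sh
# Try an outbound https (port 443) connection to each TSE endpoint.
check_tse_hosts() {
  for host in "$@"; do
    # curl exits 0 only if the TCP/TLS connection could be established
    # and an HTTP response was received within the timeout.
    if curl --silent --output /dev/null --max-time 10 "https://${host}"; then
      echo "${host}: reachable"
    else
      echo "${host}: NOT reachable (check firewall/proxy rules)"
    fi
  done
}

check_tse_hosts kassensichv.io fiskal.cloud
```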
| 193.113573 | 344 | 0.278093 | cat_Latn | 0.091088 |
54cbbdca8ff86d60e2b093634e979be0d6790a4a | 1,341 | md | Markdown | _posts/2019-01-27-Dual-Co-Matching-Network-for-Multi-choice-Reading-Comprehension.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | 7 | 2018-02-11T01:50:19.000Z | 2020-01-14T02:07:17.000Z | _posts/2019-01-27-Dual-Co-Matching-Network-for-Multi-choice-Reading-Comprehension.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | null | null | null | _posts/2019-01-27-Dual-Co-Matching-Network-for-Multi-choice-Reading-Comprehension.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | 4 | 2018-02-04T15:58:04.000Z | 2019-08-29T14:54:14.000Z | ---
layout: post
title: "Dual Co-Matching Network for Multi-choice Reading Comprehension"
date: 2019-01-27 14:15:37
categories: arXiv_CL
tags: arXiv_CL Relation
author: Shuailiang Zhang, Hai Zhao, Yuwei Wu, Zhuosheng Zhang, Xi Zhou, Xiang Zhou
mathjax: true
---
* content
{:toc}
##### Abstract
Multi-choice reading comprehension is a challenging task that requires a complex reasoning procedure. Given a passage and a question, a correct answer needs to be selected from a set of candidate answers. In this paper, we propose \textbf{D}ual \textbf{C}o-\textbf{M}atching \textbf{N}etwork (\textbf{DCMN}), which models the relationship among passage, question and answer bidirectionally. Different from existing approaches which only calculate question-aware or option-aware passage representations, we calculate the passage-aware question representation and the passage-aware answer representation at the same time. To demonstrate the effectiveness of our model, we evaluate it on a large-scale multiple choice machine reading comprehension dataset ({\em i.e.} RACE). Experimental results show that our proposed model achieves new state-of-the-art results.
##### Abstract (translated by Google)
##### URL
[http://arxiv.org/abs/1901.09381](http://arxiv.org/abs/1901.09381)
##### PDF
[http://arxiv.org/pdf/1901.09381](http://arxiv.org/pdf/1901.09381)
| 51.576923 | 847 | 0.778523 | eng_Latn | 0.929457 |
54cbe08572fec6b640c707f06932740c68a25a20 | 1,027 | md | Markdown | _pages/Weather.md | Marzogh/Marzogh.github.io | 0c985b54fef9391415da01004015df92c1c27954 | [
"MIT"
] | null | null | null | _pages/Weather.md | Marzogh/Marzogh.github.io | 0c985b54fef9391415da01004015df92c1c27954 | [
"MIT"
] | null | null | null | _pages/Weather.md | Marzogh/Marzogh.github.io | 0c985b54fef9391415da01004015df92c1c27954 | [
"MIT"
] | 3 | 2015-08-13T00:56:44.000Z | 2019-10-06T14:10:37.000Z | ---
title: "Weather Station"
layout: single
excerpt: "Arduino based weather station"
sitemap: false
permalink: /weather.html
---
# [Arduino Weather Station](http://marzogh.github.io/Arduino-Weather-Station)
This project aims to build a working weather station that can measure and report a number of environmental variables including:
- Temperature
- Relative humidity
- Wind speed
- Wind direction
- Rainfall
- Ambient light*
- UV*
- Lightning*
- CO<sub>2</sub> levels*
- SO<sub>2</sub> levels*

\* In future updates

----------------------
### Components
- [Sparkfun Weather Meters](https://www.sparkfun.com/products/8942)
- [Sparkfun Weather Shield](https://www.sparkfun.com/products/12081)
- Humidity/Temperature - HTU21D
- Barometric Pressure/Temperature - MPL3115A2
- Light - ALS-PT19
- [Electric Imp](https://www.sparkfun.com/products/11395)
- [Electric Imp Shield](https://www.sparkfun.com/products/12887)
- 2x [RJ11 6-pin connectors](https://www.sparkfun.com/products/132) (for the Weather shield)
{: .square}
| 27.026316 | 127 | 0.726388 | eng_Latn | 0.419225 |
54cc2802f0feaea763e195e341930add2e3c216c | 1,066 | md | Markdown | README.md | mina-zaki/react-native-sdk | 91ab25367828d010450a1a8f2a6ce0f8e7e86a90 | [
"MIT"
] | null | null | null | README.md | mina-zaki/react-native-sdk | 91ab25367828d010450a1a8f2a6ce0f8e7e86a90 | [
"MIT"
] | null | null | null | README.md | mina-zaki/react-native-sdk | 91ab25367828d010450a1a8f2a6ce0f8e7e86a90 | [
"MIT"
] | null | null | null | # javascript-sdk
[User documentation](https://lightelligence-javascript-sdk.azurewebsites.net/)
Provides tools to make app development with the lightelligence platform faster and easier. It contains authorization for node and browser environments, wrappers around all api endpoints, and util methods.
NOTE: currently you _need_ to commit the dist folder in order for the sdk to work. We are working on a better solution.
## Dependencies
- [oidc-client](https://github.com/IdentityModel/oidc-client-js) Library to provide OpenID Connect (OIDC) and OAuth2 protocol support for client-side, browser-based JavaScript client applications
## Technologies
- ECMAScript 2018
- Webpack
- Jest
- JSDoc
## Development Kickstart
### Install dependencies
```
npm install
```
### Run build in watch mode
```
npm run start
```
### Build library
```
npm run build
```
### Build documentation
available under `docs/index.html`
```
npm run documentation
```
## Deployment
Library is published to npm via bitbucket pipelines job.
## Access
TODO: link to npm package
## LICENCE
MIT
| 18.067797 | 196 | 0.754221 | eng_Latn | 0.909434 |
54cfc9cfd098525daf1682cbdc950af95414d79f | 2,775 | md | Markdown | docs/normalist/en/02.components/01.table-manager/docs.md | belgattitude/solublecomponents | f045b3967e42d352d8e70fb2c1f8a427df62599f | [
"MIT"
] | 2 | 2015-07-11T14:35:53.000Z | 2019-02-24T08:29:27.000Z | docs/normalist/en/02.components/01.table-manager/docs.md | belgattitude/solublecomponents | f045b3967e42d352d8e70fb2c1f8a427df62599f | [
"MIT"
] | 2 | 2018-03-26T12:18:04.000Z | 2018-03-26T12:18:15.000Z | docs/normalist/en/02.components/01.table-manager/docs.md | belgattitude/solublecomponents | f045b3967e42d352d8e70fb2c1f8a427df62599f | [
"MIT"
] | 1 | 2015-07-11T14:34:25.000Z | 2015-07-11T14:34:25.000Z | ---
title: Setting up TableManager
taxonomy:
category: docs
---
>>>>> `Synthetic\TableManager` holds the database configuration and acts as the central object to retrieve and work with synthetic tables.
### Usage example
The following example illustrates how to set up the TableManager object with a [singleton](http://en.wikipedia.org/wiki/Singleton_pattern) implementation.
```php
<?php
use Zend\Db\Adapter\Adapter;
use Soluble\Normalist\ZeroConfDriver;
use Soluble\Normalist\Synthetic\TableManager;
class MyExampleSingletonConfig {
protected static $tmInstance;
/**
* @return TableManager
*/
public static function getTableManager() {
if (self::$tmInstance === null) {
// 1. Setting up adapter
$adapterConfig = [
'driver' => 'Mysqli', // or Pdo_Mysql
'hostname' => 'localhost', // 'localhost' by default
'username' => 'db_user',
'password' => 'db_password',
'database' => 'db_name'
];
$adapter = new Adapter($adapterConfig);
// 2. Create a driver
$driver = new ZeroConfDriver($adapter);
            // 3. Instantiate the table manager
self::$tmInstance = new TableManager($driver);
}
return self::$tmInstance;
}
}
```
You can now retrieve the TableManager from the MyExampleSingletonConfig object.
```php
<?php
require_once './MyExampleSingletonConfig.php';
$tm = MyExampleSingletonConfig::getTableManager();
$tm->getTable('user')->find(1);
```
### Parameter
The TableManager requires the `$driver` parameter.
| name | type | description |
|---|---|---|
| $driver | `Soluble\Normalist\Driver\DriverInterface` | The database schema driver, the current bundled implementation is the [ZeroConfDriver](../../drivers/zeroconfdriver) providing autodiscovery of models for Mysql/MariaDb. |
>>>>> The bundled [ZeroConfDriver](../../drivers/zeroconfdriver) requires an `$adapter` ([Zend\Db\Adapter\Adapter](http://framework.zend.com/manual/current/en/modules/zend.db.adapter.html)) and a recommended `$driverOptions` parameter. See its [documentation](../drivers/zeroconfdriver).
### Implementations
>>>> The TableManager object must be instantiated only once (or once per database connection if there are several).
Normalist does not enforce a particular strategy to maintain uniqueness of the TableManager instance.
The example above intentionally uses the [singleton](http://en.wikipedia.org/wiki/Singleton_pattern) as it may be more readable. Please also consider the [following strategies](../../integration/configuration-strategies).
| 33.433735 | 289 | 0.661622 | eng_Latn | 0.729622 |
54d089c673198d84bd65dd44a2cbfc51e69d8659 | 1,059 | md | Markdown | _docs/02-welcome.md | arcolinuxz/calamares.github.io | 0eddc9c5dc69bec1986b3a917f55000eb2d99042 | [
"MIT"
] | 8 | 2015-02-04T14:49:36.000Z | 2021-03-24T04:11:16.000Z | _docs/02-welcome.md | dr460nf1r3/calamares.github.io | 148eeed9abc20aee09cac05f3f27e2c32a7d0136 | [
"MIT"
] | 9 | 2015-06-24T18:17:20.000Z | 2021-11-18T12:10:40.000Z | _docs/02-welcome.md | dr460nf1r3/calamares.github.io | 148eeed9abc20aee09cac05f3f27e2c32a7d0136 | [
"MIT"
] | 36 | 2015-03-01T17:39:42.000Z | 2021-11-18T11:58:07.000Z | ---
title: "Welcome"
permalink: /docs/welcome/
excerpt: "Here you will find a nice big friendly logo for your Linux distribution."
last_modified_at: 2021-01-04T15:19:22-04:00
toc: false
---
Welcome to the installer! Here you will find a nice big friendly logo for your Linux distribution. Calamares does some basic checking here to see if the system is usable for installation, and will warn you if something seems to be wrong. Having less than the shown disk space or available RAM will almost certainly cause the installation to fail.

You can change the language of the installer here. Over fifty languages are supported. There may be some buttons that link to your distribution's website (for instance, so you can read the release notes).

Calamares checks if there is internet connectivity (this is used by many distributions for package functions) by loading a well-known page. This is configured by the distribution.
| 58.833333 | 332 | 0.78848 | eng_Latn | 0.997018 |
54d1b602e8f7b6d098f0966500e79e16680ff1a6 | 570 | md | Markdown | build/parameter/README.md | JessicaLeu-code/CFSMotionPlanning_cpp | b0aff3b7c1f00f15a4867eb8221ddd5e523f193b | [
"MIT"
] | 1 | 2021-08-09T11:06:24.000Z | 2021-08-09T11:06:24.000Z | build/parameter/README.md | JessicaLeu-code/CFSMotionPlanning_cpp | b0aff3b7c1f00f15a4867eb8221ddd5e523f193b | [
"MIT"
] | null | null | null | build/parameter/README.md | JessicaLeu-code/CFSMotionPlanning_cpp | b0aff3b7c1f00f15a4867eb8221ddd5e523f193b | [
"MIT"
] | 2 | 2021-08-09T11:06:26.000Z | 2021-09-23T07:34:29.000Z | Parameter setups for CFS 2D motion planning example
- `parameters_2d.txt` sets up `max_iter` (maximum number of subproblems), `converge_trh` (threshold for convergence), `margin` (clearance away from obstacles), `v_max` (maximum velocity), and `weight_self` (weight for penalizing input).
- `env.txt` sets up obstacles in the following format:
```
(number of obstacles)
(x obs1)
(y obs1)
(radius obs1)
(x obs2)
(y obs2)
(radius obs2)
.
.
.
```
- `ref_init_goal.txt` sets up the planning problem in the following format:
```
(x initial)
(y initial)
(x goal)
(y goal)
(Horizon)
```
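For reference, a minimal Python sketch of how files in these formats could be parsed (function and variable names are my own, not part of the code base):

```python
def parse_env(lines):
    """Parse the env.txt layout: obstacle count, then (x, y, radius) per obstacle."""
    values = [float(v) for v in lines if v.strip()]
    n_obstacles = int(values[0])
    return [tuple(values[1 + 3 * i: 4 + 3 * i]) for i in range(n_obstacles)]

def parse_ref_init_goal(lines):
    """Parse the ref_init_goal.txt layout: initial x/y, goal x/y, horizon."""
    v = [float(s) for s in lines if s.strip()]
    return {"init": (v[0], v[1]), "goal": (v[2], v[3]), "horizon": int(v[4])}

# Example matching the format above: two circular obstacles.
obstacles = parse_env(["2", "1.0", "2.0", "0.5", "3.0", "4.0", "0.25"])
print(obstacles)  # [(1.0, 2.0, 0.5), (3.0, 4.0, 0.25)]
```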
| 22.8 | 229 | 0.714035 | eng_Latn | 0.950718 |
54d1c079ec5fbcf134ee4593ebaa047dfe28f322 | 5,882 | md | Markdown | articles/virtual-machines/image-version-vm-cli.md | julianosaless/azure-docs.pt-br | 461791547c9cc2b4df751bb3ed881ce57796f1e4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/image-version-vm-cli.md | julianosaless/azure-docs.pt-br | 461791547c9cc2b4df751bb3ed881ce57796f1e4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/image-version-vm-cli.md | julianosaless/azure-docs.pt-br | 461791547c9cc2b4df751bb3ed881ce57796f1e4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Create an image from a VM
description: Learn how to create an image in a Shared Image Gallery from a VM in Azure.
author: cynthn
ms.service: virtual-machines
ms.subservice: imaging
ms.topic: how-to
ms.workload: infrastructure
ms.date: 05/01/2020
ms.author: cynthn
ms.reviewer: akjosh
ms.openlocfilehash: f53a6b63c744b0e3e41f7ad22270cd842da57674
ms.sourcegitcommit: e0330ef620103256d39ca1426f09dd5bb39cd075
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 05/05/2020
ms.locfileid: "82796571"
---
# <a name="create-an-image-version-from-a-vm-in-azure-using-the-azure-cli"></a>Create an image version from a VM in Azure using the Azure CLI
If you have an existing VM that you want to use to make multiple, identical VMs, you can use that VM to create an image in a Shared Image Gallery using the Azure CLI. You can also create an image from a VM using [Azure PowerShell](image-version-vm-powershell.md).
An **image version** is what you use to create a VM when using a Shared Image Gallery. You can have multiple versions of an image as needed for your environment. When you use an image version to create a VM, the image version is used to create new disks for the VM. Image versions can be used multiple times.
## <a name="before-you-begin"></a>Before you begin
To complete this article, you must have an existing Shared Image Gallery.
You must also have an existing VM in Azure, in the same region as your gallery.
If the VM has a data disk attached, the data disk size cannot be larger than 1 TB.
As you work through this article, replace the resource names where needed.
## <a name="get-information-about-the-vm"></a>Get information about the VM
You can see a list of VMs that are available using [az vm list](/cli/azure/vm#az-vm-list).
```azurecli-interactive
az vm list --output table
```
Once you know the VM name and what resource group it is in, get the ID of the VM using [az vm get-instance-view](/cli/azure/vm#az-vm-get-instance-view).
```azurecli-interactive
az vm get-instance-view -g MyResourceGroup -n MyVm --query id
```
## <a name="create-an-image-definition"></a>Create an image definition
Image definitions create a logical grouping for images. They are used to manage information about the image versions that are created within them.
Image definition names can be made up of uppercase or lowercase letters, digits, dots, dashes, and periods.
Make sure your image definition is the right type. If you have generalized the VM (using Sysprep for Windows, or waagent -deprovision for Linux), then you should create a generalized image definition using `--os-state generalized`. If you want to use the VM without removing existing user accounts, create a specialized image definition using `--os-state specialized`.
For more information about the values you can specify for an image definition, see [Image definitions](https://docs.microsoft.com/azure/virtual-machines/linux/shared-image-galleries#image-definitions).
Create an image definition in the gallery using [az sig image-definition create](/cli/azure/sig/image-definition#az-sig-image-definition-create).
In this example, the image definition is named *myImageDefinition*, and is for a [specialized](https://docs.microsoft.com/azure/virtual-machines/linux/shared-image-galleries#generalized-and-specialized-images) Linux OS image. To create a definition for images using a Windows OS, use `--os-type Windows`.
```azurecli-interactive
az sig image-definition create \
--resource-group myGalleryRG \
--gallery-name myGallery \
--gallery-image-definition myImageDefinition \
--publisher myPublisher \
--offer myOffer \
--sku mySKU \
--os-type Linux \
--os-state specialized
```
## <a name="create-the-image-version"></a>Create the image version
Create an image version from the VM using [az sig image-version create](/cli/azure/sig/image-version#az-sig-image-version-create).
Allowed characters for the image version are numbers and periods. Numbers must be within the range of a 32-bit integer. Format: *MajorVersion*.*MinorVersion*.*Patch*.
In this example, the version of our image is *1.0.0* and we are going to create 2 replicas in the *West Central US* region, 1 replica in the *South Central US* region, and 1 replica in the *East US 2* region using zone-redundant storage. The replication regions must include the region where the source VM is located.
Replace the value of `--managed-image` in this example with the ID of the VM from the previous step.
```azurecli-interactive
az sig image-version create \
--resource-group myGalleryRG \
--gallery-name myGallery \
--gallery-image-definition myImageDefinition \
--gallery-image-version 1.0.0 \
--target-regions "westcentralus" "southcentralus=1" "eastus=1=standard_zrs" \
--replica-count 2 \
--managed-image "/subscriptions/<Subscription ID>/resourceGroups/MyResourceGroup/providers/Microsoft.Compute/virtualMachines/myVM"
```
> [!NOTE]
> You need to wait for the image version to completely finish being built and replicated before you can use the same managed image to create another image version.
>
> You can also store your image in Premium storage by adding `--storage-account-type premium_lrs`, or [zone-redundant storage](https://docs.microsoft.com/azure/storage/common/storage-redundancy-zrs) by adding `--storage-account-type standard_zrs` when you create the image version.
>
## <a name="next-steps"></a>Next steps
Create a VM from the [generalized image version](vm-generalized-image-version-cli.md) using the Azure CLI.
| 54.462963 | 375 | 0.774736 | por_Latn | 0.99914 |
54d1ec830b5577639fdfad8928931bac4f8704c0 | 1,618 | md | Markdown | README.md | GildedFinance/transaction-traveler | d69b32e5d8f7ba8cf42ff2a3b23020fd8b494bf7 | [
"MIT"
] | 3 | 2018-08-26T22:38:57.000Z | 2021-05-19T23:09:55.000Z | README.md | GildedFinance/transaction-traveler | d69b32e5d8f7ba8cf42ff2a3b23020fd8b494bf7 | [
"MIT"
] | null | null | null | README.md | GildedFinance/transaction-traveler | d69b32e5d8f7ba8cf42ff2a3b23020fd8b494bf7 | [
"MIT"
] | null | null | null | # Transaction Traveler
### Enables transaction and invoice data to travel effortlessly between common financial data formats
With Transaction Traveler you can:
- Convert **transaction data** from one format to another (e.g. from Coinbase or Etherscan into a common format)
- Convert **invoice data** from one format to another (e.g. from Request Network to QuickBooks)
We welcome the addition of new data formats. Just send us a pull request!
## Supported Transaction Formats
#### Blockchain APIs
- Etherscan
- Blockchain.com (WIP)
#### Exchange APIs
- Coinbase (WIP)
- Binance (WIP)
#### Accounting APIs
- QuickBooks (WIP)
## Supported Invoice Formats
- [Request Network](https://github.com/RequestNetwork/requestNetwork/tree/master/packages/requestNetworkDataFormat)
- QuickBooks API (WIP)
## Usage
#### Install
Install: `npm install transaction-traveler`
#### Convert Coinbase transactions to QuickBooks transactions
```
import { TransactionTraveler, Network } from "transaction-traveler"
let exampleTxn = {
// ... Coinbase API transaction data
}
let tt = new TransactionTraveler();
const toQuickBooks = tt.convertTransaction(exampleTxn, Network.Coinbase, Network.QuickBooks);
console.log(toQuickBooks);
```
#### Normalized Data
You can also convert data from multiple systems into a common "base" format. This allows all data to be normalized.
```js
const coinbaseTxn = {
  // ...
};
const etherscanTxn = {
  // ...
};
const normalizedTxns = [
  tt.convertTransaction(coinbaseTxn, Network.Coinbase, Network.Base),
  tt.convertTransaction(etherscanTxn, Network.Etherscan, Network.Base),
];
```
---
layout: post
title: "Solving NP-hard Problems on Graphs with Extended AlphaGo Zero 논문 리뷰 및 설명"
subtitle: ""
categories: deeplearning
tags: reinforcementlearning
---
Solving NP-hard Problems on Graphs with Extended AlphaGo Zero
1. **Abstract**
    - The paper shows that combining a graph neural network with MCTS can solve combinatorial optimization problems, and introduces **CombOpt Zero**, a training method adapted from AlphaGo Zero.
2. **Introduction**
    - Throughout the paper, comparisons are drawn with S2V. S2V could be trained without a dataset using a GNN and greedy selection, but it was found to perform poorly on graphs with characteristics different from those seen in training. The authors attribute this to the limited exploration of Q-learning.
    - For this reason, Q-learning is replaced with CombOpt Zero. AlphaGo Zero was restricted to two-player games; this restriction is lifted through sampling and a simple reward normalization.
3. **Preliminary**
    1. **Notation**
        - The paper uses undirected, unlabelled graphs $$G=(V,E)$$, where $$V$$ and $$E$$ denote the vertices and edges. $$V(G)$$ denotes the vertex set of graph $$G$$. $$\mathcal{N}(x)$$ denotes the neighbors of node $$x$$ connected by an edge, and for a node set $$S$$, $$\mathcal{N}(S) = \bigcup_{x \in S} \mathcal{N}(x)$$. $$\boldsymbol{p}$$ and $$\boldsymbol{\pi}$$ are vectors and are therefore written in bold.
2. **Machine Learning for Combinatorial Optimization**
3. **AlphaGo Zero**
        - AlphaGo Zero is the RL algorithm that overwhelmed human players at Go. It trains a neural network $$f_\theta$$ with parameters $$\theta$$ via RL. Given a state, the network outputs a distribution vector $$\boldsymbol{p}$$ over actions and a state value $$v \in [-1,1]$$.
        - AlphaGo Zero updates the network through self-play, which relies on a specialized version of **Monte Carlo Tree Search (MCTS)**. The network is trained to minimize the cross-entropy between $$\boldsymbol{p}$$ and the MCTS policy $$\boldsymbol{\pi}$$, and the l2 loss between the value estimate $$v$$ and the actual self-play outcome $$z$$. Together with a regularization term, the loss is:
$$\mathcal{L} = (z-v)^2 + \mathrm{CrossEntropy}(\boldsymbol{p},\boldsymbol{\pi}) + c_{\mathrm{reg}} \Vert\theta \Vert^2_2$$
        - MCTS is a heuristic search algorithm over tree-structured data, expanded from the root as follows.

            Here, each edge holds the following four pieces of information:
$$(N(s,a),W(s,a),Q(s,a),P(s,a)) \cdots (2)$$
            - $$N(s,a)$$ is the number of times the edge has been visited.
            - $$W(s,a)$$ is the total action value.
            - $$Q(s,a)$$ is the mean action value, given by $$\frac{W(s,a)}{N(s,a)}$$.
            - $$P(s,a)$$ is the prior probability.

            AlphaGo Zero replaces the third step in the figure (Sampling) with the network prediction alone; the steps are as follows. (In this work, however, sampling of random actions is still used to obtain a mean and standard deviation.)
- **selection**
                - From the root, repeatedly select the action that maximizes the following **upper confidence bound (UCB)**:
$$Q(s,a) + c_{\mathrm{puct}}P(s,a)\frac{\sqrt{\sum_{a'}N(s,a')}}{1+N(s,a)} \cdots(3)$$
                    Compared to the plain UCB, the prior probability steers exploration toward directions that were previously promising (this acts like penalizing the worst cases).
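To make the selection rule concrete, here is a small Python sketch of maximizing (3) over a node's outgoing edges (the data layout and the `c_puct` value are illustrative, not taken from the paper's code):

```python
import math

def select_action(edges, c_puct=1.0):
    """Pick the action maximizing Q(s,a) + c_puct * P(s,a) * sqrt(sum_b N(s,b)) / (1 + N(s,a)).

    `edges` maps each action to a dict holding its visit count N, mean value Q,
    and prior probability P (names are illustrative).
    """
    total_visits = sum(e["N"] for e in edges.values())

    def ucb(a):
        e = edges[a]
        return e["Q"] + c_puct * e["P"] * math.sqrt(total_visits) / (1 + e["N"])

    return max(edges, key=ucb)

edges = {
    "a1": {"N": 10, "Q": 0.2, "P": 0.5},   # well-explored, decent value
    "a2": {"N": 1,  "Q": 0.0, "P": 0.5},   # barely explored, so high bonus
}
print(select_action(edges))  # "a2": the exploration bonus dominates
```

Note how the visit count in the denominator shrinks the bonus for well-explored actions, which is exactly what drives the search toward under-visited branches.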
- **Expansion**
                - Add the unexplored node $$s$$ at the end of the selected edge, and initialize the edge weights using the network.
- **Backup**
                - Traverse the visited edges and update their values. $$Q$$ is computed from each edge's total visit count and the average of the evaluations of the states it leads to:
$$Q(s,a) = \frac{1}{N(s,a)}\sum_{s' \vert s,a →s'}v_{s'}$$
                    The MCTS policy $$\boldsymbol{\pi}$$ is defined from the visit counts together with a temperature:
$$\boldsymbol{\pi}_a = \frac{N(s_0,a)^{1/\tau}}{\sum_bN(s_0,b)^{1/\tau}}$$
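A minimal Python sketch of turning root visit counts into the policy $$\boldsymbol{\pi}$$ with temperature $$\tau$$ (names are illustrative):

```python
def mcts_policy(visit_counts, tau=1.0):
    """pi_a proportional to N(s0, a)^(1/tau)."""
    powered = {a: n ** (1.0 / tau) for a, n in visit_counts.items()}
    z = sum(powered.values())
    return {a: p / z for a, p in powered.items()}

pi = mcts_policy({"a1": 30, "a2": 10}, tau=1.0)
print(pi)  # {'a1': 0.75, 'a2': 0.25}
```

With $$\tau \to 0$$ this sharpens toward the most-visited action; with $$\tau = 1$$ it is just the normalized visit distribution shown above.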
4. **Graph Neural Network**
        - A GNN is a neural network used when the input is a graph.
- **structure2vec**
            - Explained in a previous post, so omitted here.
- **Graph Convolutional Network**
            $$H^{(l+1)} = \sigma(\tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2}H^{(l)}\theta^{(l)})$$
            - For the graph's adjacency matrix $$A$$, $$\tilde{A} = A + I_n$$, and with $$\tilde{D}$$ the diagonal degree matrix of $$\tilde{A}$$, $$\tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2}$$ normalizes $$A$$. Multiplying by the previous hidden states $$H^{(l)}$$ propagates each node's hidden state to its neighbors. Multiplying by the trainable matrix $$\theta^{(l)}$$ and applying a nonlinearity then yields $$H^{(l+1)}$$ recursively.
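As a toy illustration (dense NumPy, not an efficient implementation), one such layer can be written as:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: ReLU(D~^-1/2 (A+I) D~^-1/2 H W)."""
    A_tilde = A + np.eye(A.shape[0])          # add self-loops
    d = A_tilde.sum(axis=1)                   # degrees of A~
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D~^-1/2
    return np.maximum(D_inv_sqrt @ A_tilde @ D_inv_sqrt @ H @ W, 0.0)

A = np.array([[0, 1], [1, 0]], dtype=float)   # two connected nodes
H = np.eye(2)                                 # one-hot node features
W = np.ones((2, 1))                           # a single output channel
print(gcn_layer(A, H, W))                     # [[1.], [1.]]
```

Real implementations use sparse matrix operations, but the algebra is exactly the normalization-propagate-transform pattern described above.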
- **Graph Isomorphism Network**
            - Because the aggregation and readout must be injective functions, a node's own features and its neighbors' are summed (multiplying by $$\tilde{A}$$) and passed through an MLP; finally, the outputs of all MLP layers are concatenated:
$$H^{(l+1)} = \mathrm{MLP}^{(l)}(\tilde{A}H^{(l)}), \ y_v = \mathrm{MLP(CONCAT}(H^{(l)}_v \vert l = 0,1, \cdots,L))$$
- **2-IGN+**
            - Omitted here to keep the post from getting too long.
5. **NP-hard Problems on Graphs**
- **Minimum Vertex Cover**
            - Find the smallest subset of nodes $$V' \subset V$$ that covers every edge.
- **Max Cut**
            - Find a subset of nodes $$V'$$ that maximizes the size of the cut set $$C \subset E$$.
- **Maximum Clique**
            - Find the largest subset of nodes $$V' \subset V$$ in which every pair of nodes is connected.
4. **Method**
    This section describes how CombOpt Zero solves graph combinatorial optimization problems.
1. **Reduction to MDP**
        - To apply AlphaGo Zero's method to combinatorial optimization, graph problems must be formulated as MDPs. A deterministic MDP is defined by:
$$(S,A_s,T,R)$$
            $$S$$ is the set of states, $$A_s$$ the set of actions from state $$s$$, $$T : S \times A_s \to S$$ the transition function, and $$R : S \times A_s \to \mathbb{R}$$ the reward. Each state $$s \in S$$ carries a label $$d$$: with a labeling function $$d : V \to L$$, each state is written $$s = (G,d)$$ with $$G=(V,E)$$.

            Each problem has terminal states $$S_{\mathrm{end}}$$; starting from a state $$s$$, actions and transitions are applied repeatedly until a state in $$S_{\mathrm{end}}$$ is reached. This yields a sequence of states and actions called a trajectory. The reward of a trajectory is $$\sum^{N-1}_{n=0}R(s_n,a_n)$$, and $$r^*(s)$$ denotes the optimal reward sum obtainable from state $$s$$. $$\mathrm{Init}$$ converts an input graph into the initial state of the problem. The goal is then to obtain $$r^*(\mathrm{Init}(G_0))$$ for a given graph $$G_0$$.
- **Maximum Vertex Cover**
            - This problem needs no labels, so $$d$$ is taken to be a constant function. An action selects one node, and the reward is set to $$R(s,x) = -1$$ so that all edges are covered in as few steps as possible.
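This MDP can be sketched as a toy rollout in Python (assumptions: the graph is given as an edge list, and selecting a node removes all of its incident edges):

```python
def mvc_rollout(edges, order):
    """Apply node-pick actions until every edge is covered; return total reward (-1 per step)."""
    remaining = set(edges)
    total_reward = 0
    for x in order:
        if not remaining:
            break
        remaining = {e for e in remaining if x not in e}   # cover edges incident to x
        total_reward += -1
    assert not remaining, "order did not cover all edges"
    return total_reward

# Path graph 1-2-3: picking node 2 covers both edges in one step.
print(mvc_rollout([(1, 2), (2, 3)], order=[2]))  # -1
```

The maximum cumulative reward corresponds to the minimum vertex cover, which is exactly why $$R(s,x) = -1$$ is the right shaping.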
- **Max Cut**

                - Each action colors a node 0 or 1 (blue or red) and removes it, while recording how many adjacent nodes of each color have already been removed. $$A_s = \{ (x,c) \vert x \in V, c \in \{1,2\} \}$$ is the set of coloring actions, where $$(x,c)$$ colors node $$x$$ with color $$c$$. $$L = \mathbb{N}^2$$ records, for each node, how many removed neighbors had each color. $$R(s,(x,c))$$ is the $$(3-c)\mathrm{-th}$$ component of $$d(x)$$, i.e. the count for the opposite color (component 2 if $$c$$ is 1, component 1 if $$c$$ is 2).
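A hypothetical Python sketch of one Max Cut action, with the label $$d(x)$$ stored as a two-element counter per node (the data structures are illustrative):

```python
def maxcut_step(adj, labels, x, c):
    """Color node x with c (1 or 2), remove it, and return the reward:
    the number of already-removed neighbors of the OPPOSITE color,
    tracked in labels[x] = [count colored 1, count colored 2]."""
    reward = labels[x][(3 - c) - 1]      # the (3-c)-th component of d(x)
    for y in adj[x]:
        labels[y][c - 1] += 1            # tell remaining neighbors x got color c
        adj[y].discard(x)
    del adj[x]                           # remove x from the graph
    return reward

adj = {1: {2}, 2: {1}}
labels = {1: [0, 0], 2: [0, 0]}
r1 = maxcut_step(adj, labels, 1, 1)  # first pick: no removed neighbors yet -> 0
r2 = maxcut_step(adj, labels, 2, 2)  # neighbor 1 had the opposite color -> 1
print(r1, r2)  # 0 1
```

The edge (1, 2) ends up cut exactly once, so the cumulative reward equals the cut size.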
- **Maximum Clique**
                - Similar, so omitted.
2. **Ideas for CombOpt Zero**
- **Combinatorial Optimization vs. Go**
            - Two differences from Go arise when applying AlphaGo Zero to this MDP formulation. First, Go has a fixed input size, whereas combinatorial problems generally do not; a GNN handles this. Second, the outcome cannot be reduced to a simple result: Go ends in a win, a draw, or a loss, but combinatorial problems do not. One could mimic Go by mapping values into $$[-1,1]$$, with larger values meaning a higher chance of "winning", but as the graph grows, trajectories get longer and cumulative rewards larger, so the first term of (3) becomes dominant and exploration breaks down. A normalization technique is used here to resolve this.
            - First, the value output of AlphaGo Zero's network $$f_\theta(s) = (\boldsymbol{p},v)$$ is turned into a per-action value, i.e. a state-action value. This is written as:
$$f_\theta(s)= (\boldsymbol{p},\boldsymbol{v})$$
                Here $$v_a$$ predicts the normalized reward of taking action $$a$$ in state $$s$$, i.e. how much better taking this action is than taking a random action. $$v_a$$ is trained to predict the following quantity (the reward of action $$a$$ in state $$s$$ plus the optimal reward thereafter, normalized):

                $$(R(s,a)+r^*(T(s,a)) - \mu_s) / \sigma_s$$

                From this, an unnormalized value can be restored:
                $$r_{\mathrm{estim}}(s)=\begin{cases}0 & (s \in S_{\mathrm{end}})\\ \mu_s + \sigma_s \cdot \max_{a\in{A_s}}{v}_a & (\mathrm{otherwise})\end{cases}$$
                In this way, $$W(s,a)$$ and $$Q(s,a)$$ both hold normalized values.
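The normalization target and the restoration $$r_{\mathrm{estim}}$$ can be sketched in Python, with $$\mu_s$$ and $$\sigma_s$$ estimated from random rollouts (the values and names below are illustrative):

```python
import statistics

def normalize_target(r, r_star_next, mu_s, sigma_s):
    """Training target for v_a: (R(s,a) + r*(T(s,a)) - mu_s) / sigma_s."""
    return (r + r_star_next - mu_s) / sigma_s

def r_estim(is_end, mu_s, sigma_s, v):
    """Restore an unnormalized reward estimate from normalized per-action values v."""
    if is_end:
        return 0.0
    return mu_s + sigma_s * max(v)

rollouts = [-3.0, -5.0, -4.0]            # cumulative rewards of random rollouts from s
mu_s = statistics.mean(rollouts)         # -4.0
sigma_s = statistics.pstdev(rollouts)    # population std, ~0.816

print(normalize_target(-1.0, -2.0, mu_s, sigma_s))  # ~1.22: better than random
print(r_estim(False, mu_s, sigma_s, [0.5, 1.0]))    # ~-3.18 on the original scale
```

Because every tree node stores its own $$(\mu_s, \sigma_s)$$, the backed-up values stay on a comparable scale regardless of how long the remaining trajectory is.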
- **Reward Normalization Technique**
            - With this normalization, the tendency of the first term of (3) to dominate is mitigated.
3. **Algorithms**
- **MCTS Tree Structure**
            - As in AlphaGo Zero, each edge holds the information in (2). In addition, each node holds the tuple $$(\mu_s,\sigma_s)$$.
- **MCTS**

            The pseudocode is shown above. Since the action space changes dynamically with the state (a property of graph problems), the number of iterations is made proportional to the size of the action space via $$c_{\mathrm{iter}}$$. Dirichlet noise is also added to the root's prior probabilities to promote exploration.

            Selection, expansion, and backup then proceed as usual; let us focus on backup. The normalized state value obtained from $$f_\theta$$ is renormalized with the mean and standard deviation obtained by random sampling, so $$W(s,a)$$ and $$Q(s,a)$$ both end up holding normalized values. The MCTS policy $$\boldsymbol{\pi}$$ is produced as the output.
- **Training**
            - CombOpt Zero's training consists of three components.
- **data generators**

                    Continuously generate self-play records from randomly generated graphs. These can be seen as trajectories generated by the MCTS policy $$\boldsymbol{\pi}$$.
- **learners**
                - From randomly sampled records, compute the following loss (the same as AlphaGo Zero's):
$$\mathcal{L} = (z'- v_a)^2 + \mathrm{CrossEntropy}(\boldsymbol{p},\boldsymbol{\pi}) + c_\mathrm{reg} \Vert \theta \Vert ^2_2$$
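The loss can be sketched in NumPy for a single sample (illustrative only; a real implementation would use an autodiff framework and batch over records):

```python
import numpy as np

def combopt_loss(z, v_a, p, pi, theta, c_reg=1e-4):
    """(z' - v_a)^2 + CrossEntropy(p, pi) + c_reg * ||theta||_2^2."""
    value_loss = (z - v_a) ** 2
    policy_loss = -np.sum(pi * np.log(p))   # cross-entropy between MCTS pi and network p
    reg = c_reg * np.sum(theta ** 2)
    return value_loss + policy_loss + reg

loss = combopt_loss(
    z=0.5, v_a=0.3,
    p=np.array([0.5, 0.5]),      # network policy
    pi=np.array([0.75, 0.25]),   # MCTS target policy
    theta=np.zeros(4),           # weights (zero here, so no regularization term)
)
print(loss)  # 0.04 + 0.6931... ~= 0.7331
```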
- **model evaluators**
                - Since a winning rate between two players cannot be measured as in AlphaGo Zero, this component generates random graphs and measures the mean performance.
- References
[1] [Solving NP-Hard Problems on Graphs with Extended AlphaGo Zero](https://arxiv.org/abs/1905.11623)
[2] [https://coinse.kaist.ac.kr/projects/mctsps/](https://coinse.kaist.ac.kr/projects/mctsps/)
# DialogCreator
A Dialog-Creator for Godot (Game Engine)
cohort
======
# N# - The programming language
N# (N-Sharp) is a programming language developed with ease of creation in mind. It derives from a mix of C# and C++, but adds several new tools and completely rebuilds the libraries.
## What does it include?
This repository will come with documentation, a compiler, libraries, and extensions for popular IDEs to add the N# language to their portfolio.
## How will the language be executed?
N# is a compiled language. Using the built-in compiler, it will be converted into Assembly and then run by the computer. Because of this, certain parts of the built-in libraries won't be fully readable as N# code, since some of them may include pure Assembly code.
## Why make/use this?
In my time as a programmer, I have seen many types of programming languages, but never a good language with useful libraries. Take graphics libraries, for example. C# is a beautiful and easy-to-use language (in my opinion), but you'll have trouble finding a good graphics library that works for C#, and if you find one, it's most likely a port of a library built for another language. C++, on the other hand, *does* have many commonly-used graphics libraries, but its syntax is much harder to use, so I find it a lot harder to write good-looking code in it.
N#'s goal is to fix that problem, with a good syntax system, along with many useful libraries, such as graphics and threading libraries, that come with the language and compiler.
Since it is just me working on this, and as a side project, no doubt this will take a long time. But I believe that it's something that could prove very useful if I actually finish it, and I hope you agree.
## About the folders
| Folder | Description |
| - | - |
| `/Compiler` | An N# project that contains the source code for the compiler. |
| `/Documentation` | Documentation for the language. |
| `/Examples` | Written examples of running N# code. |
| `/Extensions` | Some extension projects for popular IDEs such as Visual Studio. |
| `/Libraries` | The built-in libraries that come with the language. |
## Miscellaneous
- Accepted file endings: `.ns`, `.nsharp`