Hi everybody! Before we get into the main meat of this message, I'd like to provide a bit of context. This is all based around a game called Space Station 13, which runs on the BYOND (short for Build Your Own Net Dream) platform. More specifically, it's focused on the goonstation branch of this game. If you're interested in finding out more about this game, you'll find a link to the main page of the goonstation wiki at the bottom of this submission. But for now, let's cut to the chase and talk about why we're here in the first place. You see, there's a lot of really intricate subsystems in Space Station 13, and one of them tasks players with discovering the hidden lore of the game by exploring the depths of space, delving into ancient ruins on alien worlds, evading dangerous traps, slaying eldritch horrors, and cracking codes all to locate mysterious artifacts that hold the key to unlocking the secret of the enigmatic stellar installation known only as the solarium.
The solarium puzzle proved so difficult to complete that a team of dedicated people had to be assembled to have even a remote chance of succeeding. I am one of those people.
Now here's the really crazy part. The team and I have been working on solving the solarium for literal years, and we STILL haven't solved it!
On the bright side, we've made consistent progress and have grown ever closer to finally cracking this nut. Or at least, we had been. For the past two months or so, we've been hung up on a single snag. We've been trying and failing to decode this strange piece of audio:
http://vocaroo.com/i/s0jkg4JDufZu
We are 100% positive that some clue is hidden within this thing, but we have yet to figure out its meaning. We tried running it through several different SSTV (slow scan television) programs, but those attempts were total busts. Then we tried using FL Studio to de-noise the audio in an attempt to find some obscured signal, but that also failed.
After that point, we acted upon a whole laundry list of different ideas to no avail: reversing it, slowing it down, pitch shifting it, bitcrushing it, running it through an online MIDI converter, and hooking it into a fax machine.
We've managed to acquire various other miscellaneous hints about the next stage in the solarium puzzle, but most of them have only led to dead ends and red herrings.
The few promising ones are:
A single, incomplete word: Polyph____
A message that, when decrypted by setting the rotors on a replica enigma machine to S O L, gives the message "THEXWORLDXHASXCHANGEDZSEEKXTHEXSTARSZ"
This image:
The phrase: No Gods in the Abyss
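For anyone joining in, here's how we currently read the Enigma output above. Assuming the common wartime convention of X standing in for spaces and Z for sentence breaks (our assumption, not something confirmed by the puzzle), it parses as:

```python
# Hedged reading of the decrypted Enigma text, assuming X = word break
# and Z = sentence break (a guess based on historical Enigma practice).
cipher_output = "THEXWORLDXHASXCHANGEDZSEEKXTHEXSTARSZ"
readable = cipher_output.replace("X", " ").replace("Z", ". ").strip()
print(readable)  # THE WORLD HAS CHANGED. SEEK THE STARS.
```

Which at least reads like a coherent sentence, even if we don't yet know what it points to.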
Now, normally, we're not the type of people to just give up, but we've run out of options at this point. I genuinely worry that the team might be forced to disband if we don't get something out of the audio signal soon.
Thus, I ask from the bottom of my heart, the depths of my soul, the very fiber of my being:
Will you help us?
Be warned: this isn't even the craziest thing we've had to solve, just the most difficult. If you decide to help us, be prepared to go down the rabbit hole and not come back up for a while.
If you're interested in learning more about the solarium, then you can drop by the official sol team wiki: https://sol.miraheze.org/wiki/Main_Page
If you'd like to learn more about goonstation in general, then you can check out the official goonstation wiki: http://wiki.ss13.co/Main_Page
If you want to contact the sol team, we're available in the #sol channel on SynIRC.
/** Statement base class implementing shared properties and logic */
public abstract class StatementBase implements Statement {
private StatementSource source;
private Integer index;
/** Comments preceding the statement in the source code. */
private List<Comment> comments;
public StatementBase(StatementSource source, List<Comment> comments) {
this.index = null;
this.source = source;
this.comments = comments;
}
@Override
public StatementSource getSource() {
return source;
}
@Override
public void setSource(StatementSource source) {
this.source = source;
}
@Override
public Integer getIndex() {
return index;
}
@Override
public void setIndex(Integer index) {
this.index = index;
}
@Override
public List<Comment> getComments() {
return comments;
}
@Override
public void setComments(List<Comment> comments) {
this.comments = comments;
}
@Override
public boolean equals(Object o) {
if(this == o) return true;
if(o == null || getClass() != o.getClass()) return false;
StatementBase that = (StatementBase) o;
return Objects.equals(source, that.source) &&
Objects.equals(index, that.index) &&
Objects.equals(comments, that.comments);
}
@Override
public int hashCode() {
return Objects.hash(source, index, comments);
}
public String idxString() {
return index == null ? "" : ("[" + index + "] ");
}
@Override
public String toString() {
return toString(null, false);
}
public String aliveString(Program program) {
if(program == null || !program.hasLiveRangeVariables()) {
return "";
}
LiveRangeVariables liveRanges = program.getLiveRangeVariables();
StringBuilder alive = new StringBuilder();
alive.append(getAliveString(liveRanges.getAlive(index)));
if(program.hasLiveRangeVariablesEffective()) {
LiveRangeVariablesEffective liveRangeVariablesEffective = program.getLiveRangeVariablesEffective();
LiveRangeVariablesEffective.AliveCombinations aliveCombinations = liveRangeVariablesEffective.getAliveCombinations(this);
alive.append(" ( ");
for(LiveRangeVariablesEffective.AliveCombination aliveCombination : aliveCombinations.getAll()) {
alive.append(aliveCombination.toString(program));
alive.append(" ");
}
alive.append(")");
}
return alive.toString();
}
private String getAliveString(Collection<VariableRef> alive) {
StringBuilder str = new StringBuilder();
str.append(" [ ");
for(VariableRef variableRef : alive) {
str.append(variableRef.getFullName());
str.append(" ");
}
str.append("]");
return str.toString();
}
}
    def is_neighbor(self, point1, point2):
        # Six-neighbor adjacency: the (1, -1) and (-1, 1) diagonals are
        # deliberately excluded, matching a hex-style grid layout.
        row = point1[0] - point2[0]
        col = point1[1] - point2[1]
        if col == 0 and row == 0:
            return False
        if 0 <= row <= 1 and 0 <= col <= 1:
            return True
        if -1 <= row <= 0 and -1 <= col <= 0:
            return True
        return False
import express from 'express';
import bodyParser from 'body-parser';
import { Logger } from 'tslog';
import { env } from 'process';
import { ListarProfilesController } from './controllers/listarProfileController';
import { loginController } from './controllers/loginController';
import { Auth } from './authentication/auth';
const log = new Logger();
const app = express();
const PORT = env["PORT"];
const auth = new Auth();
app.use(bodyParser.json());
app.use(auth.initialize());
app.get('/', (req, res) => res.send(""));
app.post('/login', loginController);
app.get('/profile', auth.authenticate(), ListarProfilesController);
app.listen(PORT, () => {
    log.info(`Server running on port ${PORT}`);
});
import { IAction } from "./foundation";
export declare const help: IAction;
export declare const man: IAction;
/**
 * Inserts the question at the specified index of the questions list. If the
* index exceeds the size of the questions list, it adds it to the end of the
* list.
*
* @param index
* The index to insert the question to
* @param question
* The question to add
*/
public void addQuestion(int index, AssignmentQuestion question) {
question.setAssignment(this);
if (index >= questions.size()) {
question.setIndex(questions.size());
questions.add(question);
	} else {
		question.setIndex(index);
		questions.add(index, question);
		// keep the stored indices in sync with the list positions
		// of the questions shifted right by the insert
		for (int i = index + 1; i < questions.size(); i++) {
			questions.get(i).setIndex(i);
		}
	}
}
Improving the Projection of Global Structures in Data through Spanning Trees
The connection of edges in a graph generates a structure that is independent of a coordinate system. This visual metaphor allows creating a more flexible representation of data than a two-dimensional scatterplot. In this work, we present STAD (Spanning Trees as Approximation of Data), a dimensionality reduction method to approximate the high-dimensional structure into a graph with or without formulating prior hypotheses. STAD generates an abstract representation of high-dimensional data by giving each data point a location in a graph which preserves the distances in the original high-dimensional space. The STAD graph is built upon the Minimum Spanning Tree (MST) to which new edges are added until the correlation between the distances from the graph and the original dataset is maximized. Additionally, STAD supports the inclusion of additional functions to focus the exploration and allow the analysis of data from new perspectives, emphasizing traits in data which otherwise would remain hidden. We demonstrate the effectiveness of our method by applying it to two real-world datasets: traffic density in Barcelona and temporal measurements of air quality in Castile and León in Spain.
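The MST-plus-edges construction described in the abstract can be sketched concretely. The following is a minimal illustrative implementation under our own simplifying assumptions (hop counts as the graph distance, greedy shortest-edge addition, stopping at the first non-improvement); it is not the authors' code:

```python
import numpy as np
from collections import deque

def mst_edges(D):
    """Prim's algorithm on a dense distance matrix; returns a list of (i, j)."""
    n = len(D)
    edges = []
    best = {j: (D[0][j], 0) for j in range(1, n)}  # node -> (weight, tree node)
    while best:
        j = min(best, key=lambda k: best[k][0])
        w, i = best.pop(j)
        edges.append((i, j))
        for k in best:
            if D[j][k] < best[k][0]:
                best[k] = (D[j][k], j)
    return edges

def hop_distances(adj, n):
    """All-pairs hop counts via BFS from every node."""
    H = np.full((n, n), np.inf)
    for s in range(n):
        H[s, s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if H[s, v] == np.inf:
                    H[s, v] = H[s, u] + 1
                    q.append(v)
    return H

def stad_sketch(D):
    """STAD-style graph: start from the MST, then greedily add the shortest
    unused edges while the correlation between hop distances in the graph
    and the original distances D keeps improving."""
    n = len(D)
    iu = np.triu_indices(n, 1)
    adj = [set() for _ in range(n)]
    for i, j in mst_edges(D):
        adj[i].add(j)
        adj[j].add(i)

    def corr():
        return np.corrcoef(hop_distances(adj, n)[iu], np.asarray(D)[iu])[0, 1]

    best = corr()
    unused = sorted(((D[i][j], i, j) for i in range(n) for j in range(i + 1, n)
                     if j not in adj[i]), key=lambda t: t[0])
    for _, i, j in unused:
        adj[i].add(j)
        adj[j].add(i)
        c = corr()
        if c <= best:  # correlation stopped improving: revert and stop
            adj[i].remove(j)
            adj[j].remove(i)
            break
        best = c
    return adj
```

The stopping rule here is greedy and local; the paper's actual optimization of the correlation objective may differ.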
    def _reduce(self: pool, op, xs_per_part):
        # Reduce each partition with `op`, then fold the partial results.
        if self._processes == 1 and len(xs_per_part) == 1:
            # Serial path: nothing to gain from dispatching to the pool.
            return reduce(op, map(partial(reduce, op), xs_per_part))
        else:
            return reduce(op, self._pool.map(partial(reduce, op), xs_per_part))
/** Prints out the graph. For debugging purposes. */
public void printGraph() {
    startNodeIterator();
    while (hasMoreNodes()) {
        Node node = nextNode();
        if (isInitialNode(node)) {
            System.out.println("Initial Node");
        }
        if (isFinalNode(node)) {
            System.out.println("Final Node");
        }
        System.out.println(node);
        node.print();
    }
    startEdgeIterator();
    while (hasMoreEdges()) {
        Edge edge = nextEdge();
        System.out.println(edge);
        edge.print();
    }
}
// CNAMEResource returns the CNAME via Customizations, otherwise nil
func CNAMEResource(fqdnString string) *dnsmessage.CNAMEResource {
if domain, ok := Customizations[strings.ToLower(fqdnString)]; ok && domain.CNAME != (dnsmessage.CNAMEResource{}) {
return &domain.CNAME
}
return nil
}
Guardians of the Galaxy 2 is likely to add in a number of new cosmic Marvel superheroes alongside the ones we already know, but there are still characters from the first film that could use a little more attention. Specifically, Nebula. The villain, played by Karen Gillan, was hyped up in the marketing campaigns, but ended up doing precious little in the movie. Mostly, she stuck around Ronan's spaceship, had one good fight with Gamora at the end, then hijacked a spaceship and ditched out of the movie.
Hopefully, director James Gunn has much bigger plans for Nebula in the upcoming Guardians of the Galaxy 2, and Gillan herself has been talking about the role and her continued appearances in the Marvel Cinematic Universe. Recently, Gillan stated: "I have a physical and emotional attachment to this hair. We'll have to see if I have to shave it. Maybe CGI will have developed further by the time we shoot it."
Which confirms two things. One, Gillan will definitely be returning in the second Guardians of the Galaxy film. And two, she'd prefer that her hair survive the filming process.
What kind of role might Nebula play in the second film? Well, it's doubtful she'd be the film's main villain. We've already seen her and understood the basics of her character in the first film; for the second, the audience would probably be better served learning about an entirely new villain. However, it's also unlikely that Nebula would be another secondary villain serving under the main one; what are the odds she'd find employment under another supervillain? Most likely, Nebula will be a rogue agent in the film. Could she potentially turn good by the end? Anything's possible, considering how far away the second Guardians film is.
Guardians of the Galaxy 2 will premiere on July 28, 2017.
package internal
import (
"log"
"os"
"time"
"github.com/qingfeng777/owls/server/global"
"gorm.io/gorm"
"gorm.io/gorm/logger"
)
type DBBASE interface {
GetLogMode() string
}
var Gorm = new(_gorm)
type _gorm struct{}
// Config builds the customized gorm configuration
// Author [SliverHorn](https://github.com/SliverHorn)
func (g *_gorm) Config() *gorm.Config {
config := &gorm.Config{DisableForeignKeyConstraintWhenMigrating: true}
_default := logger.New(NewWriter(log.New(os.Stdout, "\r\n", log.LstdFlags)), logger.Config{
SlowThreshold: 200 * time.Millisecond,
LogLevel: logger.Warn,
Colorful: true,
})
var logMode DBBASE
	switch global.GVA_CONFIG.System.DbType {
	case "mysql":
		logMode = &global.GVA_CONFIG.Mysql
	case "pgsql":
		logMode = &global.GVA_CONFIG.Pgsql
	default:
		logMode = &global.GVA_CONFIG.Mysql
	}
switch logMode.GetLogMode() {
case "silent", "Silent":
config.Logger = _default.LogMode(logger.Silent)
case "error", "Error":
config.Logger = _default.LogMode(logger.Error)
case "warn", "Warn":
config.Logger = _default.LogMode(logger.Warn)
case "info", "Info":
config.Logger = _default.LogMode(logger.Info)
default:
config.Logger = _default.LogMode(logger.Info)
}
return config
}
Network-based reference model tracking for vehicle turning with event-triggered transmission and interval communication delays
This paper deals with the network-based modelling and reference model tracking control of four-wheel-independent-drive electric vehicles for turning purpose. First, a reference model is designed by adjusting its input to characterise the expected neutral turning of the vehicle. By considering event-triggered transmission and interval communication delays, the network-based tracking control system attaining vehicle turning is modelled as an interval input delay system, where the input delay is bounded by the bounds of communication delays and the sampling period. Second, a new delay-dependent bounded real lemma with less conservatism is derived by proposing a matrix-based binary quadratic convex method, constructing a discontinuous augmented Lyapunov–Krasovskii functional tailored based on second-order Bessel–Legendre inequality, and exploiting the relation among the upper bounds of input delay and interval communication delays, and the sampling period. Third, the network-based tracking controller design result is further established by linear matrix inequalities. Finally, an example is provided to confirm the superiority of the analysis results and the effectiveness of control design.
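The delay model in the abstract can be stated compactly. With $\tau_m$ and $\tau_M$ denoting the lower and upper bounds of the communication delay and $h$ the sampling period (symbols assumed here, not taken from the paper), the interval input delay $d(t)$ of the tracking loop satisfies

```latex
\tau_m \le d(t) < \tau_M + h
```

since the worst-case delay seen by the plant combines the maximum communication delay with up to one full sampling interval of hold time.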
    /// Parse the key-value entry: skip the varint that encodes the internal
    /// key length and return the internal key bytes.
    fn key(&self) -> &[u8] {
        // the entry starts with a varint giving the internal key length
        let entry = self.iter.key();
        let (value, length) = u32::decode_varint(entry);
        // skip the varint prefix; the internal key follows immediately
        &entry[length..value as usize + length]
    }
import { LitElement, html, css } from 'lit'
import { customElement, property } from 'lit/decorators.js'
import { MenuItem } from "./item"
@customElement('wh-dropdown')
export class Dropdown extends LitElement {
static styles = css`
ul {
list-style-type: none;
margin: 0;
padding: var(--padding, 0);
min-width: var(--min-width, 0);
width: 14rem;
z-index: 50;
position: absolute;
overflow-y: scroll;
background-color: var(--background-color, white);
max-width: var(--max-width, auto);
border-radius: var(--border-radius, 0.25rem);
box-shadow: var(--box-shadow, rgba(0, 0, 0, 0.1) 0px 10px 15px -3px, rgba(0, 0, 0, 0.05) 0px 4px 6px -2px);
height: 0;
border-width: 1px;
border-style: solid;
border-color: #e8e8e8;
opacity: 0;
transition: opacity 150ms ease-out;
}
.open {
opacity: 1;
height: auto;
transition: opacity 150ms ease-out;
}
::slotted(*[slot="button"]) {
padding: .5rem 1rem .5rem 1rem;
font-size: 0.875rem;
font-weight: 500;
justify-content: center;
display: inline-flex;
border-width: 1px;
background-color: white;
border-style: solid;
color: #374151;
user-select: none;
border-color: #D1D5DB;
border-radius: 0.375rem;
box-shadow: rgba(0, 0, 0, 0.05) 0px 1px 2px 0px;
}
::slotted(*:hover[slot="button"]) {
background-color: #F9FAFB;
}
`
@property()
opened: boolean = false
@property()
distance: number = 0
@property()
skidding: number = 0
@property({ attribute: false })
list: Array<MenuItem>
connectedCallback() {
super.connectedCallback()
this.setAttribute("tabindex", "0")
this.addEventListener("keydown", this.handleKeyPress)
}
disconnectedCallback() {
this.removeEventListener("keydown", this.handleKeyPress)
super.disconnectedCallback()
}
firstUpdated() {
this.list = Array.from(document.getElementsByTagName("wh-menu-item")) as MenuItem[]
if (this.opened) this.focus()
}
open = () => {
this.opened = true
//Don't close dropdown when opening
setTimeout(() => document.addEventListener("click", this.handleClick))
}
close = () => {
this.opened = false
this.focus()
document.removeEventListener("click", this.handleClick)
}
protected render() {
return html`
<slot name="button" @click="${this.toggle}"></slot>
<ul class="${this.opened ? 'open' : ''}" style="margin-top: ${this.distance}px; margin-left: ${this.skidding}px;">
<slot></slot>
</ul>
`
}
private handleClick = (e: Event) => {
if (this.opened && !this.contains(e.target as Node)) return this.close()
this.dispatchEvent(new CustomEvent("wh-select", { detail: { item: e.target } }))
}
private toggle = () => {
if (this.opened) return this.close()
this.open()
}
private handleKeyPress(e: KeyboardEvent) {
switch (e.code) {
case "Tab": this.handleFocus(e); break
case "Enter": this.handleEnter(e.target); break
case "Escape": this.close(); break
case "ArrowDown": case "ArrowUp": this.handleArrowKeys(e); break
}
}
private handleEnter(target: EventTarget | null) {
if (target === this) return this.toggle()
this.dispatchEvent(new CustomEvent("wh-select", { detail: { item: target } }))
}
private navigate(e: KeyboardEvent) {
//Prevent page scroll
e.preventDefault()
if (document.activeElement === this) return this.list[0].focus()
this.move(e.code)
}
	private move(direction: string) {
if (direction === "ArrowDown") return this.moveDown()
this.moveUp()
}
	private moveUp() {
		const nextElement = document.activeElement?.previousElementSibling as MenuItem
		if (nextElement && this.list.includes(nextElement)) return nextElement.focus()
		this.list[this.list.length - 1].focus()
	}
	private moveDown() {
		const nextElement = document.activeElement?.nextElementSibling as MenuItem
		if (nextElement && this.list.includes(nextElement)) return nextElement.focus()
		this.list[0].focus()
	}
private handleFocus(e) {
if (this.opened) {
e.preventDefault()
if (document.activeElement === this) return this.list[0].focus()
if (document.activeElement?.tagName.toLowerCase() === "wh-menu-item") {
this.close()
this.focus()
}
}
}
private handleArrowKeys(e) {
e.preventDefault()
this.opened ? this.navigate(e) : this.open()
}
}
def net_resolve(self, host_name):
return self.request( "net-resolve", {
'host_name': [ host_name, 'host-name', [ basestring, 'None' ], False ],
}, {
'ip-addresses': [ basestring, True ],
    } )
Consensual reactions of human blood-aqueous barrier to implant operations.
Slit-lamp fluorophotometry was used to evaluate the disruption of the blood-aqueous barrier in eyes that underwent posterior chamber lens implantation following phacoemulsification and the consensual reaction of the barrier disruption in the contralateral eyes. Topical indomethacin or placebo was applied to surgically treated eyes to test the effect on the barrier disruption. Fluorophotometry was carried out before operation and 24 hours, one week, and four weeks after operation. In the surgically treated eyes, topical indomethacin effectively inhibited the disruption of the barrier during the first and fourth postoperative weeks; in the contralateral eyes it did not inhibit the reaction. The consensual reaction was observed in higher magnitude and frequency than expected. Its magnitude and frequency were higher during the first postoperative day than during the first or fourth postoperative weeks, but were proportional to the barrier disruption of the surgically treated eyes during the first postoperative day only.
// AssertEqualArgs gets two lists of arguments and returns nil if they are
// equal to each other and an error otherwise.
func AssertEqualArgs(as1, as2 Args) error {
if len(as1) != len(as2) {
return fmt.Errorf(
"Argument slices %#v and %#v have different length: %d != %d.",
as1, as2, len(as1), len(as2),
)
}
for i := range as1 {
if err := AssertEqualArg(&as1[i], &as2[i]); err != nil {
return err
}
}
return nil
}
// Copyright 1996-2020 Cyberbotics Ltd.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#ifndef WB_LIDAR_HPP
#define WB_LIDAR_HPP
#include "WbAbstractCamera.hpp"
#include "WbSFInt.hpp"
#include "../../../include/controller/c/webots/lidar_point.h"
struct WrRenderable;
struct WrDynamicMesh;
struct WrStaticMesh;
struct WrMaterial;
class WbLidar : public WbAbstractCamera {
Q_OBJECT
public:
// constructors and destructor
explicit WbLidar(WbTokenizer *tokenizer = NULL);
WbLidar(const WbLidar &other);
explicit WbLidar(const WbNode &other);
virtual ~WbLidar();
// reimplemented public functions
void createOdeObjects() override;
void createWrenObjects() override;
void preFinalize() override;
void postFinalize() override;
void reset() override;
void updateCollisionMaterial(bool triggerChange = false, bool onSelection = false) override;
void setSleepMaterial() override;
void setScaleNeedUpdate() override;
void attachResizeManipulator() override;
void detachResizeManipulator() const override;
void handleMessage(QDataStream &) override;
int nodeType() const override { return WB_NODE_LIDAR; }
QString pixelInfo(int x, int y) const override;
void prePhysicsStep(double ms) override;
void postPhysicsStep() override;
void write(WbVrmlWriter &writer) const override;
WbRgb enabledCameraFrustrumColor() const override { return WbRgb(0.0f, 1.0f, 1.0f); }
double maxRange() const override { return mMaxRange->value(); }
// These functions return the value actually used by the lidar (that was initially loaded from the world file or changed
// before the start of the simulation). It may be different from the current value of the field if it was changed after the
// start of the simulation. Once the simulation starts, such changes cannot be applied directly and are applied only after a
// save and reload. This is explained to the user in a warning message.
int actualNumberOfLayers() const;
int actualHorizontalResolution() const;
double actualVerticalFieldOfView() const;
double actualFieldOfView() const;
int textureGLId() const override;
int width() const override;
int height() const override;
double fieldOfView() const override { return actualFieldOfView(); }
WbSolid *solidEndPoint() const;
// selection
void propagateSelection(bool selected) override;
// lazy matrix multiplication system
void setMatrixNeedUpdate() override;
private:
// user accessible fields
WbSFDouble *mTiltAngle;
WbSFInt *mHorizontalResolution;
WbSFDouble *mVerticalFieldOfView;
WbSFInt *mNumberOfLayers;
WbSFDouble *mMinRange;
WbSFDouble *mMaxRange;
WbSFDouble *mResolution;
WbSFDouble *mDefaultFrequency;
WbSFDouble *mMinFrequency;
WbSFDouble *mMaxFrequency;
WbSFString *mType;
WbSFNode *mRotatingHead;
bool mIsPointCloudEnabled;
double mCurrentRotatingAngle;
double mPreviousRotatingAngle;
double mCurrentTiltAngle;
float *mTemporaryImage;
int mActualNumberOfLayers;
int mActualHorizontalResolution;
double mActualVerticalFieldOfView;
double mActualFieldOfView;
WrRenderable *mFrustumRenderable;
WrMaterial *mFrustumMaterial;
WrStaticMesh *mFrustumMesh;
WrRenderable *mLidarPointsRenderable;
WrDynamicMesh *mLidarPointsMesh;
WrMaterial *mLidarPointsMaterial;
WrRenderable *mLidarRaysRenderable;
WrDynamicMesh *mLidarRaysMesh;
WrMaterial *mLidarRaysMaterial;
// private functions
void addConfigureToStream(QDataStream &stream, bool reconfigure = false) override;
void copyAllLayersToSharedMemory();
void updatePointCloud(int minWidth, int maxWidth);
float *lidarImage() const;
WbLidar &operator=(const WbLidar &); // non copyable
WbNode *clone() const override { return new WbLidar(*this); }
void init();
void initializeSharedMemory() override;
int size() const override {
return (sizeof(float) + sizeof(WbLidarPoint)) * actualHorizontalResolution() * actualNumberOfLayers();
}
double minRange() const override { return mMinRange->value(); }
bool isRotating() const { return mType->value().startsWith('r', Qt::CaseInsensitive); }
double verticalFieldOfView() const { return actualFieldOfView() * ((double)height() / (double)width()); }
WbLidarPoint *pointArray() { return (WbLidarPoint *)(lidarImage() + actualHorizontalResolution() * actualNumberOfLayers()); }
// WREN methods
void createWrenCamera() override;
void deleteWren();
void displayPointCloud();
void hidePointCloud();
void applyMaxRangeToWren();
void applyResolutionToWren();
void applyTiltAngleToWren();
private slots:
void updateNear();
void updateMinRange();
void updateMaxRange();
void updateResolution();
void updateTiltAngle();
void updateType();
void updateMinFrequency();
void updateMaxFrequency();
void updateDefaultFrequency();
void updateHorizontalResolution();
void updateVerticalFieldOfView();
void updateNumberOfLayers();
void updateRotatingHead();
void updateBoundingSphere(WbBaseNode *subNode);
void applyCameraSettingsToWren() override;
void applyFrustumToWren() override;
void updateOptionalRendering(int option) override;
void updateFieldOfView() override;
};
#endif // WB_LIDAR_HPP
CMOS and sCMOS imaging performance comparison by digital holographic interferometry
Abstract. We use a digital holographic interferometric setup to assess, as a proof of concept, two state-of-the-art sensors (CMOS and sCMOS cameras) that are widely used in nondestructive testing (NDT). This interferometric study is intended to evaluate the image quality recorded by any camera used in NDT. The assessing relies on the quantification of the optical phase information recovered by the cameras used for this study. For this, we calculate the signal-to-noise ratio, correlation coefficient, and quality index (Q-index) as main figures-of-merit. As far as we know, the Q-index has not been used for evaluation of the optical phase coming from image holograms. The CMOS and sCMOS sensors used record the same deformation event under the same experimental conditions. The experiment involves the inspection of a large sample (>1 m2 of area) which implies low illumination conditions for the imaging sensors. The retrieved CMOS optical phase shows artifacts that are not observed in the sCMOS. An analysis of these two groups of interferometric images is presented and discussed. The methodology set forth here can be applied to evaluate other sensors such as CCDs and EM-CCDs.
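For readers unfamiliar with the Q-index: it is the universal image quality index of Wang and Bovik. A global, single-window version can be sketched as follows; practical implementations average this quantity over sliding windows, which is omitted here for brevity:

```python
import numpy as np

def q_index(x, y):
    """Global universal quality index (Wang & Bovik) between two images.

    Single-window simplification for illustration; real implementations
    compute this over local windows and average the results.
    """
    x = x.astype(float).ravel()
    y = y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    # Combines loss of correlation, luminance distortion, and contrast
    # distortion into one score in [-1, 1]; 1 means identical images.
    return 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))
```

Identical inputs give exactly 1, and any luminance, contrast, or correlation mismatch pulls the score below 1.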
class _MaskedUnaryOperation:
"""Defines masked version of unary operations, where invalid
values are pre-masked.
Parameters
----------
    f : callable
        The unary ufunc to wrap.
    fill : scalar, optional
        Default filling value (0).
    domain : callable, optional
        Default domain (None).
"""
def __init__ (self, mufunc, fill=0, domain=None):
""" _MaskedUnaryOperation(aufunc, fill=0, domain=None)
aufunc(fill) must be defined
self(x) returns aufunc(x)
with masked values where domain(x) is true or getmask(x) is true.
"""
self.f = mufunc
self.fill = fill
self.domain = domain
self.__doc__ = getattr(mufunc, "__doc__", str(mufunc))
self.__name__ = getattr(mufunc, "__name__", str(mufunc))
ufunc_domain[mufunc] = domain
ufunc_fills[mufunc] = fill
#
def __call__ (self, a, *args, **kwargs):
"Execute the call behavior."
#
m = getmask(a)
d1 = getdata(a)
#
if self.domain is not None:
dm = np.array(self.domain(d1), copy=False)
m = np.logical_or(m, dm)
# The following two lines control the domain filling methods.
d1 = d1.copy()
# We could use smart indexing : d1[dm] = self.fill ...
# ... but np.putmask looks more efficient, despite the copy.
np.putmask(d1, dm, self.fill)
        # Take care of the masked singleton first ...
if not m.ndim and m:
return masked
# Get the result class .......................
if isinstance(a, MaskedArray):
subtype = type(a)
else:
subtype = MaskedArray
# Get the result as a view of the subtype ...
result = self.f(d1, *args, **kwargs).view(subtype)
# Fix the mask if we don't have a scalar
if result.ndim > 0:
result._mask = m
result._update_from(a)
return result
#
def __str__ (self):
        return "Masked version of %s. [Invalid values are masked]" % str(self.f)
/* Main activity for the adb test program */
public class AdbTestActivity extends Activity {
private static final String TAG = "AdbTestActivity";
private TextView mLog;
private UsbManager mManager;
private UsbDevice mDevice;
private UsbDeviceConnection mDeviceConnection;
private UsbInterface mInterface;
private AdbDevice mAdbDevice;
private static final int MESSAGE_LOG = 1;
private static final int MESSAGE_DEVICE_ONLINE = 2;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.adb);
mLog = (TextView)findViewById(R.id.log);
mManager = (UsbManager)getSystemService(Context.USB_SERVICE);
// check for existing devices
for (UsbDevice device : mManager.getDeviceList().values()) {
UsbInterface intf = findAdbInterface(device);
if (setAdbInterface(device, intf)) {
break;
}
}
// listen for new devices
IntentFilter filter = new IntentFilter();
filter.addAction(UsbManager.ACTION_USB_DEVICE_ATTACHED);
filter.addAction(UsbManager.ACTION_USB_DEVICE_DETACHED);
registerReceiver(mUsbReceiver, filter);
}
@Override
public void onDestroy() {
unregisterReceiver(mUsbReceiver);
setAdbInterface(null, null);
super.onDestroy();
}
public void log(String s) {
Message m = Message.obtain(mHandler, MESSAGE_LOG);
m.obj = s;
mHandler.sendMessage(m);
}
private void appendLog(String text) {
Rect r = new Rect();
mLog.getDrawingRect(r);
int maxLines = r.height() / mLog.getLineHeight() - 1;
text = mLog.getText() + "\n" + text;
// see how many lines we have
int index = text.lastIndexOf('\n');
int count = 0;
while (index > 0 && count <= maxLines) {
count++;
index = text.lastIndexOf('\n', index - 1);
}
// truncate to maxLines
if (index > 0) {
text = text.substring(index + 1);
}
mLog.setText(text);
}
public void deviceOnline(AdbDevice device) {
Message m = Message.obtain(mHandler, MESSAGE_DEVICE_ONLINE);
m.obj = device;
mHandler.sendMessage(m);
}
private void handleDeviceOnline(AdbDevice device) {
log("device online: " + device.getSerial());
device.openSocket("shell:exec logcat");
}
// Sets the current USB device and interface
private boolean setAdbInterface(UsbDevice device, UsbInterface intf) {
if (mDeviceConnection != null) {
if (mInterface != null) {
mDeviceConnection.releaseInterface(mInterface);
mInterface = null;
}
mDeviceConnection.close();
mDevice = null;
mDeviceConnection = null;
}
if (device != null && intf != null) {
UsbDeviceConnection connection = mManager.openDevice(device);
if (connection != null) {
log("open succeeded");
if (connection.claimInterface(intf, false)) {
log("claim interface succeeded");
mDevice = device;
mDeviceConnection = connection;
mInterface = intf;
mAdbDevice = new AdbDevice(this, mDeviceConnection, intf);
log("call start");
mAdbDevice.start();
return true;
} else {
log("claim interface failed");
connection.close();
}
} else {
log("open failed");
}
}
if (mDeviceConnection == null && mAdbDevice != null) {
mAdbDevice.stop();
mAdbDevice = null;
}
return false;
}
// searches for an adb interface on the given USB device
static private UsbInterface findAdbInterface(UsbDevice device) {
Log.d(TAG, "findAdbInterface " + device);
int count = device.getInterfaceCount();
for (int i = 0; i < count; i++) {
UsbInterface intf = device.getInterface(i);
if (intf.getInterfaceClass() == 255 && intf.getInterfaceSubclass() == 66 &&
intf.getInterfaceProtocol() == 1) {
return intf;
}
}
return null;
}
BroadcastReceiver mUsbReceiver = new BroadcastReceiver() {
public void onReceive(Context context, Intent intent) {
String action = intent.getAction();
if (UsbManager.ACTION_USB_DEVICE_ATTACHED.equals(action)) {
UsbDevice device = (UsbDevice)intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);
UsbInterface intf = findAdbInterface(device);
if (intf != null) {
log("Found adb interface " + intf);
setAdbInterface(device, intf);
}
} else if (UsbManager.ACTION_USB_DEVICE_DETACHED.equals(action)) {
UsbDevice device = intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);
String deviceName = device.getDeviceName();
if (mDevice != null && mDevice.getDeviceName().equals(deviceName)) {
log("adb interface removed");
setAdbInterface(null, null);
}
}
}
};
Handler mHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
switch (msg.what) {
case MESSAGE_LOG:
appendLog((String)msg.obj);
break;
case MESSAGE_DEVICE_ONLINE:
handleDeviceOnline((AdbDevice)msg.obj);
break;
}
}
};
}
/**
* Here be dragons !
*
* @author Ezio
* created on 2020/3/18
*/
public class _836_IsRectangleOverlap {
static class Solution {
public boolean isRectangleOverlap(int[] rec1, int[] rec2) {
return !(rec2[0] >= rec1[2] || rec1[0] >= rec2[2] || rec1[1] >= rec2[3] || rec2[1] >= rec1[3]);
}
}
public static void main(String[] args) {
Solution solution = new Solution();
System.out.println(solution.isRectangleOverlap(new int[]{0, 0, 2, 2}, new int[]{1, 1, 3, 3}));
System.out.println(solution.isRectangleOverlap(new int[]{0, 0, 1, 1}, new int[]{1, 0, 2, 1}));
}
}
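The solution above negates the four "separated" cases (one rectangle entirely to the left, right, above, or below the other) via De Morgan's law. A standalone sketch of the same reasoning, with illustrative class and method names:

```java
// Separating-axis reasoning for axis-aligned rectangles given as [x1, y1, x2, y2].
// Two rectangles do NOT overlap iff one lies entirely to one side of the other;
// negating that disjunction yields the overlap test used in the solution above.
public class RectangleOverlapDemo {
    public static boolean overlaps(int[] a, int[] b) {
        boolean separated =
                b[0] >= a[2]   // b entirely to the right of a
             || a[0] >= b[2]   // b entirely to the left of a
             || b[1] >= a[3]   // b entirely above a
             || a[1] >= b[3];  // b entirely below a
        return !separated;
    }

    public static void main(String[] args) {
        // Overlapping interiors
        System.out.println(overlaps(new int[]{0, 0, 2, 2}, new int[]{1, 1, 3, 3})); // true
        // Rectangles that merely share an edge have zero-area intersection
        System.out.println(overlaps(new int[]{0, 0, 1, 1}, new int[]{1, 0, 2, 1})); // false
    }
}
```

Note that the strict `>=` comparisons are what make edge-touching rectangles count as non-overlapping.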
Compression and shear surface rheology in spread layers of beta-casein and beta-lactoglobulin.
We investigate the surface viscoelasticity of beta-lactoglobulin and beta-casein spread surface monolayers using a recently discovered method. Step compressions are performed, and the surface pressure is measured as a function of time. This is a common experiment for surface monolayers. However, in our experiments the pressure is recorded by two perpendicular sensors, parallel and perpendicular to the compression direction. This enables us to clearly measure the time relaxation of both the compression and shear moduli, at the same time, in a single experiment, and with a standard apparatus. beta-Lactoglobulin and beta-casein monolayers are interesting because of their importance in food science and because they exhibit universally slow dynamical behavior that is still not fully understood. Our results confirm that the compressional modulus dominates the total viscoelastic response in both proteins. Indeed, for beta-casein we confirm that the shear modulus is always negligible, i.e., the layer is in a fluid state. In beta-lactoglobulin a finite shear modulus emerges above a critical concentration. We emphasize that in Langmuir trough dynamic experiments the surface pressure should be measured in both the compression and the perpendicular directions.
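The two-sensor measurement described above can be summarised in a small linear-elasticity sketch. This is a simplification assuming a small step strain and linear response; the symbols K (compression modulus) and G (shear modulus) are our notation here, not necessarily the authors'.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch: how two perpendicular pressure sensors separate the two moduli.
For a small uniaxial step compression of relative area change
$\lvert\Delta A\rvert/A$, a 2D isotropic elastic layer responds with
\[
\Delta\pi_{\parallel} = (K + G)\,\frac{\lvert\Delta A\rvert}{A},
\qquad
\Delta\pi_{\perp} = (K - G)\,\frac{\lvert\Delta A\rvert}{A},
\]
so the sum and difference of the two signals give both moduli in a single
experiment:
\[
K = \frac{\Delta\pi_{\parallel} + \Delta\pi_{\perp}}{2}\,
    \frac{A}{\lvert\Delta A\rvert},
\qquad
G = \frac{\Delta\pi_{\parallel} - \Delta\pi_{\perp}}{2}\,
    \frac{A}{\lvert\Delta A\rvert}.
\]
A fluid layer ($G = 0$) then shows equal parallel and perpendicular pressure
jumps, consistent with what is reported above for beta-casein.
\end{document}
```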
/**
* Remove all the provider connections for the specified client uuid.
* @param uuid the uuid of the client for which to remove connections.
*/
public void removeProviderConnections(final String uuid) {
final Collection<AsyncClientClassContext> channels = providerConnections.removeKey(uuid);
if (channels != null) {
for (final AsyncClientClassContext channel: channels) {
try {
closeConnection(channel);
} catch (final Exception e) {
log.error("error closing channel {} : {}", channel, ExceptionUtils.getStackTrace(e));
}
}
}
}
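The method above follows a common cleanup pattern: detach all resources for a key in one step, then close each one, so that a failure on one channel cannot abort closing the rest. A minimal self-contained sketch of that pattern; `ProviderConnectionRegistry` and `Connection` are illustrative stand-ins for the real JPPF types:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the "remove key, then close every associated resource" pattern.
public class ProviderConnectionRegistry {
    static class Connection {
        boolean closed;
        void close() { closed = true; }
    }

    private final Map<String, List<Connection>> providerConnections =
            new ConcurrentHashMap<>();

    public void register(String uuid, Connection c) {
        providerConnections.computeIfAbsent(uuid, k -> new ArrayList<>()).add(c);
    }

    public void removeProviderConnections(String uuid) {
        // atomically detach the whole collection for this client first
        List<Connection> channels = providerConnections.remove(uuid);
        if (channels != null) {
            for (Connection channel : channels) {
                try {
                    channel.close();
                } catch (Exception e) {
                    // log and continue: one bad channel must not leak the others
                    System.err.println("error closing channel: " + e);
                }
            }
        }
    }
}
```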
/**
* Class representing the CML atom element
*
* @author Peter Murray-Rust, Ramin Ghorashi (2005)
*
*/
public class CMLAtom extends AbstractAtom {
final static Logger LOG = Logger.getLogger(CMLAtom.class);
/** namespaced element name.*/
public final static String NS = C_E+TAG;
static {
LOG.setLevel(Level.WARN);
}
List<CMLAtom> ligandAtoms = null;
List<CMLBond> ligandBonds = null;
/**
* Construct a new CMLAtom element without id.
*/
public CMLAtom() {
super();
init();
}
void init() {
ligandAtoms = new ArrayList<CMLAtom>();
ligandBonds = new ArrayList<CMLBond>();
}
/**
* Construct a new CMLAtom element with immutable id.
* @param id
*/
public CMLAtom(String id) {
super();
this.setId(id);
}
/**
* copy constructor.
*
* @param old
* to copy
*/
public CMLAtom(CMLAtom old) {
super(old);
init();
}
/**
* Create new CMLAtom with specified id and element symbol (ChemicalElement.AS).
* @param id
* @param as the element symbol
*/
public CMLAtom(String id, AS as) {
this(id, ChemicalElement.getChemicalElement(as));
}
/**
* Create new CMLAtom with specified id and ChemicalElement.
* @param id
* @param chem
*/
public CMLAtom(String id, ChemicalElement chem) {
this(id);
this.setElementType(chem.getSymbol());
}
/**
* copy node .
*
* @return Node
*/
public Element copy() {
return new CMLAtom(this);
}
/**
* create new instance in context of parent, overridable by subclasses.
*
* @param parent
* parent of element to be constructed (ignored by default)
* @return CMLAtom
* @throws RuntimeException
*/
public CMLElement makeElementInContext(Element parent)
throws RuntimeException {
String error = null;
CMLAtom atom = null;
// these rules need revising periodically
if (parent == null) {
atom = new CMLAtom();
} else if (parent instanceof CMLAtomArray) {
// atomArray parent must be child of Molecule or Formula
Element grandParent = (Element) parent.getParent();
if (grandParent == null) {
error = "Atom needs non-null grandparent";
} else if (grandParent instanceof CMLMolecule) {
atom = new CMLAtom();
} else if (grandParent instanceof CMLFormula) {
error = "Atom grandparent must not be formula";
} else {
error = "Atom grandparent must be molecule";
}
} else {
atom = new CMLAtom();
// error = "Atom needs atomArray parent";
}
if (error != null) {
throw new RuntimeException(error);
}
return atom;
}
/**
* make sure atomId is present
*
* @param parent
* element
*/
public void finishMakingElement(Element parent) {
try {
check();
} catch (RuntimeException e) {
LOG.warn(e.getMessage());
}
}
/**
* checks the CML compliance of this element
*
*/
public void check() {
String id = this.getId();
if (id == null) {
throw new RuntimeException("Atom id must not be null");
}
CMLMolecule molecule = getMolecule();
if (molecule != null) {
CMLAtom oldAtom = getMolecule().getAtomById(id);
if (oldAtom != null) {
if (oldAtom != this) {
oldAtom.debug("OLD ATOM");
throw new RuntimeException("atom check: duplicate atom id: " + id);
}
}
} else {
// is this an error?
// throw new CMLRuntime("Atom has no molecule");
}
}
/** set id.
* this will index the atom if it has a parent.
* id cannot be reset.
* @param id
*/
public void setId(String id) {
String id0 = this.getId();
if (id0 != null) {
if(!id0.equals(id)) {
throw new RuntimeException("Cannot reindex id");
}
} else {
// LOG.debug("ATOM SET ID "+id);
super.setId(id);
ParentNode parent = this.getParent();
if (parent != null && parent instanceof CMLAtomArray) {
CMLAtomArray atomArray = (CMLAtomArray) parent;
if (atomArray != null) {
atomArray.indexAtom(this);
}
}
}
}
/** gets current ligands.
* updated every time atoms or bonds are added or removed
* @return list
*/
public List<CMLAtom> getLigandAtoms() {
if (ligandAtoms == null) {
ligandAtoms = new ArrayList<CMLAtom>();
}
return ligandAtoms;
}
/** gets current bonds to ligands.
* list is aligned with ligandAtoms
* updated every time atoms or bonds are added or removed
* @return list
*/
public List<CMLBond> getLigandBonds() {
if (ligandBonds == null) {
ligandBonds = new ArrayList<CMLBond>();
}
return ligandBonds;
}
void addLigandBond(CMLBond bond, CMLAtom otherAtom) {
getLigandAtoms();
if (ligandAtoms.contains(otherAtom)) {
throw new RuntimeException("Duplicate ligand: "+otherAtom.getId());
}
getLigandBonds();
if (ligandBonds.contains(bond)) {
throw new RuntimeException("Duplicate bond: "+bond.getAtomRefs2());
}
ligandAtoms.add(otherAtom);
ligandBonds.add(bond);
}
/** deletes ligand info but not bond.
* therefore not for public use.
* @param bond
* @param otherAtom
*/
void clearLigandInfo(CMLBond bond, CMLAtom otherAtom) {
if (ligandAtoms != null) {
if (!ligandAtoms.contains(otherAtom)) {
throw new RuntimeException("Unknown ligand: "+otherAtom.getString());
}
ligandAtoms.remove(otherAtom);
} else {
; // is this an error?
}
if (ligandBonds != null) {
if (!ligandBonds.contains(bond)) {
throw new RuntimeException("Unknown bond: "+bond.getString());
}
ligandBonds.remove(bond);
} else {
; // is this an error?
}
}
/** delete ligand info but not bond.
* not for public use.
*/
void clearLigandInfo() {
int nlig = ligandAtoms.size();
for (int i = nlig - 1; i >= 0; i--) {
this.clearLigandInfo(ligandBonds.get(i), ligandAtoms.get(i));
}
}
/**
* Get owner molecule.
*
* @return owner molecule
*/
public CMLMolecule getMolecule() {
Node atomArray = this.getParent();
if (atomArray != null) {
Node grandParent = atomArray.getParent();
if (grandParent != null && grandParent instanceof CMLMolecule) {
return (CMLMolecule) grandParent;
}
}
return null;
}
CMLAtomArray getAtomArray() {
ParentNode parent = this.getParent();
return (parent == null || !(parent instanceof CMLAtomArray)) ? null :
(CMLAtomArray) parent;
}
/** remove atom.
* routed to atomArray.removeAtom()
*/
public void detach() {
CMLAtomArray atomArray = getAtomArray();
if (this.getParent() != null && atomArray != null) {
atomArray.removeAtom(this);
}
}
/** gets all explicit hydrogen ligand atoms.
* ignore hydrogen count
* @return list of atoms
*/
public List<CMLAtom> getLigandHydrogenAtoms() {
List<CMLAtom> hydrogenAtoms = new ArrayList<CMLAtom>();
List<CMLAtom> ligandAtoms = this.getLigandAtoms();
for (CMLAtom ligand : ligandAtoms) {
if (AS.H.equals(ligand.getElementType())) {
hydrogenAtoms.add(ligand);
}
}
return hydrogenAtoms;
}
/** if atom has one or more hydrogen atoms deletes one.
* mainly for managing count for aromatics.
* @return atom deleted or null
*/
public CMLAtom deleteAnyLigandHydrogenAtom() {
CMLAtom ligand = null;
List<CMLAtom> hydrogens = getLigandHydrogenAtoms();
if (hydrogens.size() > 0) {
CMLMolecule molecule = this.getMolecule();
ligand = hydrogens.get(0);
CMLBond bond = molecule.getBond(this, ligand);
molecule.deleteBond(bond);
molecule.deleteAtom(ligand);
}
return ligand;
}
/**
* Returns the number of valence electrons this atom has based on its
* chemical element
*
* @return number of valence electrons
*/
public int getValenceElectrons() {
ChemicalElement chemicalElement = this.getChemicalElement();
if (chemicalElement != null) {
return chemicalElement.getValenceElectrons();
}
return 0;
}
/**
* gets Real2 for x2 y2.
*
* @return the point; null if x2, etc. are unset
*/
public Real2 getXY2() {
if (hasCoordinates(CoordinateType.TWOD)) {
return new Real2(this.getX2(), this.getY2());
}
return null;
}
/**
* sets Real2 for x2 y2.
*
* @param point
*/
public void setXY2(Real2 point) {
this.setX2(point.getX());
this.setY2(point.getY());
}
/**
* gets Point3 for x3, y3, z3.
*
* @see #getPoint3(CoordinateType)
* @return the point; null if x3, etc. are unset
*/
public Point3 getXYZ3() {
if (hasCoordinates(CoordinateType.CARTESIAN)) {
return new Point3(this.getX3(), this.getY3(), this.getZ3());
}
return null;
}
/**
* gets Point3 for cartesians or fractionals.
*
* @see #getXYZ3()
* @see #getXYZFract()
* @param type
* selects cartesians or fractionals
* @return the point; null if x3 or xFract, etc. are unset
*/
public Point3 getPoint3(CoordinateType type) {
Point3 point = null;
if (type.equals(CoordinateType.CARTESIAN)) {
point = getXYZ3();
} else if (type.equals(CoordinateType.FRACTIONAL)) {
point = getXYZFract();
}
return point;
}
/**
* sets Point3 for x3 y3 z3.
*
* @param point
*/
public void setXYZ3(Point3 point) {
this.setX3(point.getArray()[0]);
this.setY3(point.getArray()[1]);
this.setZ3(point.getArray()[2]);
}
/**
* sets Point3 for cartesians or fractionals.
*
* @param point
* to set
* @param type
* selects cartesians or fractionals
* @return the point that was set
*/
public Point3 setPoint3(Point3 point, CoordinateType type) {
if (type.equals(CoordinateType.CARTESIAN)) {
setXYZ3(point);
} else if (type.equals(CoordinateType.FRACTIONAL)) {
setXYZFract(point);
}
return point;
}
/**
* unsets Point3 for cartesians or fractionals or 2D; removes the appropriate
* attributes
*
* @param type
* selects cartesians or fractionals
*/
public void unsetPoint(CoordinateType type) {
if (type.equals(CoordinateType.CARTESIAN)) {
unsetXYZ3();
} else if (type.equals(CoordinateType.FRACTIONAL)) {
unsetXYZFract();
} else if (type.equals(CoordinateType.TWOD)) {
unsetXY2();
}
}
/**
* unsets x3 y3 z3.
*/
public void unsetXYZ3() {
this.removeAttribute("x3");
this.removeAttribute("y3");
this.removeAttribute("z3");
}
/**
* unsets xFract yFract zFract.
*/
public void unsetXYZFract() {
this.removeAttribute("xFract");
this.removeAttribute("yFract");
this.removeAttribute("zFract");
}
/**
* unsets x2 y2
*/
public void unsetXY2() {
this.removeAttribute("x2");
this.removeAttribute("y2");
}
/**
* transform 3D coordinates. does NOT alter fractional or 2D coordinates
*
* @param transform
* the transformation
*/
public void transformCartesians(Transform3 transform) {
Point3 point = this.getXYZ3();
if (point != null) {
point = point.transform(transform);
this.setXYZ3(point);
}
}
/**
* gets Point3 for xFract, yFract, zFract.
*
* @return the point; null if xFract, etc. are unset
*/
public Point3 getXYZFract() {
if (hasCoordinates(CoordinateType.FRACTIONAL)) {
return new Point3(this.getXFract(), this.getYFract(), this
.getZFract());
}
return null;
}
/**
* sets Point3 for xFract yFract zFract.
*
* @param point
*/
public void setXYZFract(Point3 point) {
this.setXFract(point.getArray()[0]);
this.setYFract(point.getArray()[1]);
this.setZFract(point.getArray()[2]);
}
/**
* transform 3D fractional coordinates. modifies this does not modify x3,
* y3, z3 (may need to re-generate cartesians)
*
* @param transform
* the transformation
*/
public void transformFractionals(Transform3 transform) {
Point3 point = this.getXYZFract();
point = point.transform(transform);
this.setXYZFract(point);
}
/**
* The formalCharge on the atom. This attribute is often omitted; if so,
* getFormalCharge() will throw a RuntimeException. This routine allows the
* caller to decide whether an omission defaults to 0.
*
* @param control
* @return int
*/
public int getFormalCharge(FormalChargeControl control) {
int fc = 0;
try {
fc = getFormalCharge();
} catch (RuntimeException e) {
if (FormalChargeControl.NO_DEFAULT.equals(control)) {
throw e;
}
}
return fc;
}
/**
* Gets the fractional coordinate for this atom
*
* @return the fractional coordinate or null if it has not been set
*/
public Point3 getFractCoord() {
if (this.getXFractAttribute() != null
&& this.getYFractAttribute() != null
&& this.getZFractAttribute() != null) {
return new Point3(this.getXFract(), this.getYFract(), this
.getZFract());
} else {
return null;
}
}
/**
* gets chemical element corresponding to elementType.
*
* @return the chemical element (or null)
*/
public ChemicalElement getChemicalElement() {
return ChemicalElement.getChemicalElement(this.getElementType());
}
/** convenience method to determine whether atom is of given elementType;
*
* @param elementType
* @return true if element of same type as getElementType()
*/
public boolean hasElement(String elementType) {
return elementType != null && elementType.equals(this.getElementType());
}
/**
* gets atomicNumber corresponding to elementType.
*
* @return atomic number (0 if not found)
*/
public int getAtomicNumber() {
ChemicalElement chemicalElement = getChemicalElement();
return (chemicalElement == null) ? 0 : chemicalElement
.getAtomicNumber();
}
/**
* get convenience serial number for common elements. Only used by MolUtils
* and valency tools; do not use outside JUMBO. This should be reengineered
* to manage valency at some stage.
*
* @param elemType
* @return serial number, or -1 if not a common element
*/
public static int getCommonElementSerialNumber(String elemType) {
final String[] elems = { AS.H.value, AS.C.value, AS.N.value, AS.O.value, AS.F.value, AS.Si.value, AS.P.value, AS.S.value, AS.Cl.value,
AS.Br.value, AS.I.value };
for (int i = 0; i < elems.length; i++) {
if (elems[i].equals(elemType)) {
return i;
}
}
return -1;
}
/**
* gets cross product for 3 atoms in 3D.
*
* gets cross products of this->at1 X this->at2
*
* @param atom1
* first atom
* @param atom2
* second atom
*
* @return the cross product (null if parameters are null; zero if atoms are
* coincident or colinear)
*/
// should this really be a public function?
public Vector3 get3DCrossProduct(CMLAtom atom1, CMLAtom atom2) {
Vector3 cross = null;
Vector3 v1 = this.getVector3(atom1);
Vector3 v2 = this.getVector3(atom2);
cross = v1.cross(v2);
return cross;
}
/**
* gets cross product for 3 atoms in 2D.
*
* gets cross products of this->at1 X this->at2 the result is a 3D vector
* perpendicular to xy2 plane
*
* @param atom1
* first atom
* @param atom2
* second atom
*
* @return the cross product (null if parameters are null; zero if atoms are
* coincident or colinear)
*/
// should this really be a public function?
public Vector3 get2DCrossProduct(CMLAtom atom1, CMLAtom atom2) {
Vector3 cross = null;
if (atom1 != null && atom2 != null) {
Point3 p0 = get2DPoint3();
Point3 p1 = atom1.get2DPoint3();
Point3 p2 = atom2.get2DPoint3();
if (p1 != null && p2 != null) {
Vector3 v1 = p1.subtract(p0);
Vector3 v2 = p2.subtract(p0);
cross = v1.cross(v2);
}
}
return cross;
}
/**
* gets 2D coordinates as a 3D point.
*
* adds a z coordinate of 0.0. Result is not stored
*
* @return the point (null if no 2D coordinates)
*/
public Point3 get2DPoint3() {
Point3 point = null;
if (hasCoordinates(CoordinateType.TWOD)) {
point = new Point3(this.getX2(), this.getY2(), 0.0);
}
return point;
}
/**
* gets vector from this atom to another.
*
* gets vector this->at1 (i.e. at1 minus this)
*
* @param atom1
* other atom
*
* @return the vector (null if atoms are null or have no coordinates)
*/
public Vector3 getVector3(CMLAtom atom1) {
Vector3 v = null;
if (atom1 != null) {
Point3 p0 = this.getXYZ3();
Point3 p1 = atom1.getXYZ3();
if (p1 != null && p0 != null) {
v = p1.subtract(p0);
}
}
return v;
}
/**
* gets vector from this atom to another.
*
* gets vector this->at1 (i.e. at1 minus this)
*
* @param atom1
* other atom
*
* @return the vector (null if atoms are null or have no coordinates)
*/
public Real2 getVector2(CMLAtom atom1) {
Real2 v = null;
if (atom1 != null) {
Real2 p0 = this.getXY2();
Real2 p1 = atom1.getXY2();
if (p1 != null && p0 != null) {
v = p1.subtract(p0);
}
}
return v;
}
/**
* increase x2 and y2 coordinates.
*
* if x2 or y2 is unset, no action (to avoid a default of zero)
*
* @param dx
* amount to add
* @param dy
* amount to add
*/
public void increaseXY2(double dx, double dy) {
if (hasCoordinates(CoordinateType.TWOD)) {
this.setX2(this.getX2() + dx);
this.setY2(this.getY2() + dy);
}
}
/**
* transforms 2D coordinates of atom.
*
* if x2 or y2 is unset take no action
*
* @param t2
* transformation
*/
public void transform(Transform2 t2) {
if (hasCoordinates(CoordinateType.TWOD)) {
Real2 xy = new Real2(this.getX2(), this.getY2());
xy.transformBy(t2);
this.setX2(xy.getX());
this.setY2(xy.getY());
}
}
/**
* increase x3, y3 and z3 coordinates.
*
* if x3, y3 or z3 is unset, no action
*
* @param dx
* amount to add
* @param dy
* amount to add
* @param dz
* amount to add
*/
public void increaseXYZ3(double dx, double dy, double dz) {
if (hasCoordinates(CoordinateType.CARTESIAN)) {
this.setX3(this.getX3() + dx);
this.setY3(this.getY3() + dy);
this.setZ3(this.getZ3() + dz);
}
}
/**
* increase xFract, yFract and zFract coordinates.
*
* if xFract, yFract or zFract is unset, no action
*
* @param dx
* amount to add
* @param dy
* amount to add
* @param dz
* amount to add
*/
public void increaseXYZFract(double dx, double dy, double dz) {
if (hasCoordinates(CoordinateType.FRACTIONAL)) {
this.setXFract(this.getXFract() + dx);
this.setYFract(this.getYFract() + dy);
this.setZFract(this.getZFract() + dz);
}
}
/**
* get distance between atoms.
*
* @param atom2 the other atom
*
* @return distance (throws RuntimeException if atom(s) lack coordinates)
*/
public double getDistanceTo(CMLAtom atom2) {
Vector3 vector = getVector3(atom2);
if (vector != null) {
return getVector3(atom2).getLength();
} else {
throw new RuntimeException("cannot calculate distance");
}
}
/**
* get 2D distance between atoms.
* @param atom2 the other atom
* @return distance (Double.NaN if atom(s) lack coordinates)
*/
public double getDistance2(CMLAtom atom2) {
Real2 xy0 = this.getXY2();
Real2 xy1 = atom2.getXY2();
double distance2 = Double.NaN;
if (xy0 != null && xy1 != null) {
distance2 = xy0.getDistance(xy1);
}
return distance2;
}
/**
* get squared distance between atoms.
*
* @param atom2 the other atom
*
* @return squared distance (NaN if atom(s) lack coordinates)
*/
public double getSquaredDistanceTo(CMLAtom atom2) {
double dist2 = Double.NaN;
Point3 p = this.getPoint3(CoordinateType.CARTESIAN);
Point3 p2 = atom2.getPoint3(CoordinateType.CARTESIAN);
if (p != null && p2 != null) {
dist2 = p.getSquaredDistanceFromPoint(p2);
}
return dist2;
}
/**are two atoms within sum of radii.
*
* @param atom2
* @param radiusType
* @return true if atoms are within sum
*/
public boolean isWithinRadiusSum (
CMLAtom atom2, ChemicalElement.RadiusType radiusType) {
boolean within = false;
ChemicalElement elem = this.getChemicalElement();
ChemicalElement elem2 = atom2.getChemicalElement();
if (elem != null && elem2 != null) {
double radsum =
elem.getRadius(radiusType) +
elem2.getRadius(radiusType);
double radsum2 = radsum * radsum;
within = (radsum2 > this.getSquaredDistanceTo(atom2));
}
return within;
}
/**
* Rounds the coords that are within epsilon of 0 to 0. Works on the
* coordinates (XY2, XYZ3, XYZFract) selected by coordinateType.
*
* @param epsilon
* (must not be 0)
* @param coordinateType
*/
public void roundCoords(double epsilon, CoordinateType coordinateType) {
epsilon = (epsilon == 0.0) ? 1.0E-50 : epsilon;
final double factor = 1.0 / epsilon;
// 2D
int i;
if (CoordinateType.TWOD.equals(coordinateType)
&& this.hasCoordinates(CoordinateType.TWOD)) {
i = (int) (this.getX2() * factor);
this.setX2(((double) i) * epsilon);
i = (int) (this.getY2() * factor);
this.setY2(((double) i) * epsilon);
}
// 3D
if (CoordinateType.CARTESIAN.equals(coordinateType)
&& this.hasCoordinates(CoordinateType.CARTESIAN)) {
i = (int) (this.getX3() * factor);
this.setX3(((double) i) * epsilon);
i = (int) (this.getY3() * factor);
this.setY3(((double) i) * epsilon);
i = (int) (this.getZ3() * factor);
this.setZ3(((double) i) * epsilon);
}
// Fractionals
if (CoordinateType.FRACTIONAL.equals(coordinateType)
&& this.hasCoordinates(CoordinateType.FRACTIONAL)) {
i = (int) (this.getXFract() * factor);
this.setXFract(((double) i) * epsilon);
i = (int) (this.getYFract() * factor);
this.setYFract(((double) i) * epsilon);
i = (int) (this.getZFract() * factor);
this.setZFract(((double) i) * epsilon);
}
}
/**
* Determines whether or not this atom has coordinates of a given type
*
* @param type
* of coordinate
* @return true if all coordinates of a given type are set, false otherwise
*/
public boolean hasCoordinates(CoordinateType type) {
boolean has = false;
if (CoordinateType.TWOD.equals(type)) {
has = (this.getX2Attribute() != null && this.getY2Attribute() != null);
} else if (CoordinateType.CARTESIAN.equals(type)) {
has = (this.getX3Attribute() != null
&& this.getY3Attribute() != null && this.getZ3Attribute() != null);
} else if (CoordinateType.FRACTIONAL.equals(type)) {
has = (this.getXFractAttribute() != null
&& this.getYFractAttribute() != null && this
.getZFractAttribute() != null);
}
return has;
}
/**
* simple atom comparison based on atomic number (not recursive).
*
* @param otherAtom
* @return int the comparison
*/
public int compareByAtomicNumber(CMLAtom otherAtom) {
int thisAtnum = getAtomicNumber();
int otherAtnum = otherAtom.getAtomicNumber();
int comp = 0;
if (thisAtnum < otherAtnum) {
comp = -1;
} else if (thisAtnum > otherAtnum) {
comp = 1;
}
return comp;
}
/**
* gets count of hydrogens. The explicit ligand hydrogen count overrides the
* hydrogenCount attribute if it is greater.
* @return the larger of the hydrogenCount attribute and the explicit ligand count
*/
public int getHydrogenCount() {
int hcAttributeValue = 0;
if (super.getHydrogenCountAttribute() != null) {
hcAttributeValue = super.getHydrogenCount();
}
int hcExplicit = getLigandHydrogenAtoms().size();
return Math.max(hcAttributeValue, hcExplicit);
}
/** gets formal charge.
*
* if attribute is missing, returns 0
* If you don't like this behaviour, test for null getFormalChargeAttribute()
* and create your own behaviour
* @return count on attribute or 0
*/
public int getFormalCharge() {
int fc = 0;
if (super.getFormalChargeAttribute() != null) {
fc = super.getFormalCharge();
}
return fc;
}
/** get list of atoms filtered by elements.
*
* @param atomList list of atoms
* @param elementSet elements in filter
* @return atoms with elements in filter
*/
public static List<CMLAtom> filter(List<CMLAtom> atomList, Set<ChemicalElement> elementSet) {
List<CMLAtom> newAtomList = new ArrayList<CMLAtom>();
for (CMLAtom atom : atomList) {
ChemicalElement element = atom.getChemicalElement();
if (element != null && elementSet.contains(element)) {
newAtomList.add(atom);
}
}
return newAtomList;
}
/** rename atom and all bonds it occurs in.
*
* @param newId
*/
public void renameId(String newId) {
String oldId = this.getId();
List<CMLBond> bondList = this.getLigandBonds();
for (CMLBond ligandBond : bondList) {
ligandBond.renameAtomRef(oldId, newId);
}
// must delay this to the end to keep indexes OK
this.resetId(newId);
}
/** replace element in atomRefs array.
* used for swapping first and last atoms in either atomRefs2, atomRefs3 or atomRefs4
* @param atomRefs array with atom ids
* @param atom
* @param rGroup
* @param last
* @return true if swapped
*/
public static boolean replaceAtomRefs(String[] atomRefs, CMLAtom atom, CMLAtom rGroup, int last) {
boolean change = false;
if (atomRefs[0].equals(rGroup.getId())) {
atomRefs[0] = atom.getId();
change = true;
} else if (atomRefs[last].equals(rGroup.getId())) {
atomRefs[last] = atom.getId();
change = true;
}
return change;
}
/** to string.
* @return atom id and element type at present
*/
public String getString() {
StringBuilder sb = new StringBuilder();
sb.append("id='"+this.getId()+"'");
sb.append(" elementType='"+this.getElementType()+"'");
return sb.toString();
}
/**
* is atom of given type.
*
* @param typeList
* @return true if of type
*/
public boolean atomIsCompatible(List<Type> typeList) {
boolean isCompatible = false;
ChemicalElement chemicalElement = ChemicalElement
.getChemicalElement(this.getElementType());
for (Type type : typeList) {
if (type != null && chemicalElement.isChemicalElementType(type)) {
isCompatible = true;
}
}
return isCompatible;
}
/**
* gets list of ligands in 2D diagram in clockwise order.
*
* starting atom is arbitrary (makes smallest clockwise angle with xAxis).
* The 4 atoms can be compared to atomRefs4 given by author or other methods
* to see if they are of the same or alternative parity.
*
* use compareAtomRefs4(CMLAtom[] a, CMLAtom[] b) for comparison
*
* @param atom4
* the original list of 4 atoms
* @return ligands sorted into clockwise order
* @throws RuntimeException
*/
public CMLAtom[] getClockwiseLigands(CMLAtom[] atom4)
throws RuntimeException {
Vector2 vx = new Vector2(1.0, 0.0);
Real2 thisxy = getXY2();
double[] angle = new double[4];
Vector2 v = null;
for (int i = 0; i < 4; i++) {
try {
v = new Vector2(atom4[i].getXY2().subtract(thisxy));
// Angle class appears to be broken, hence the degrees
angle[i] = vx.getAngleMadeWith(v).getDegrees();
} catch (NullPointerException npe) {
throw new RuntimeException(
"Cannot compute clockwise ligands");
}
if (angle[i] < 0) {
angle[i] += 360.;
}
if (angle[i] > 360.) {
angle[i] -= 360.;
}
}
// get atom4Refs sorted in cyclic order
CMLAtom[] cyclicAtom4 = new CMLAtom[4];
for (int i = 0; i < 4; i++) {
double minAngle = Double.MAX_VALUE;
int low = -1;
for (int j = 0; j < 4; j++) {
if (angle[j] >= 0 && angle[j] < minAngle) {
low = j;
minAngle = angle[j];
}
}
if (low != -1) {
cyclicAtom4[i] = atom4[low];
angle[low] = -100.;
} else {
throw new RuntimeException(
"Couldn't get AtomRefs4 sorted in cyclic order");
}
}
// all 4 angles must be less than PI
// the ligands in clockwise order
for (int i = 0; i < 4; i++) {
CMLAtom cyclicAtomNext = cyclicAtom4[(i < 3) ? i + 1 : 0];
Real2 cyclicXy = cyclicAtom4[i].getXY2();
Real2 cyclicXyNext = cyclicAtomNext.getXY2();
v = new Vector2(cyclicXy.subtract(thisxy));
Vector2 vNext = new Vector2(cyclicXyNext.subtract(thisxy));
double ang = v.getAngleMadeWith(vNext).getDegrees();
if (ang < 0) {
ang += 360.;
}
if (ang > 360.) {
ang -= 360.;
}
if (ang > 180.) {
throw new RuntimeException("All 4 ligands on same side "
+ getId());
}
}
return cyclicAtom4;
}
/** is this atom close to another.
*
* @param atom
* @return true if close
*/
public boolean hasCloseContact(CMLAtom atom) {
double valenceDist = this.getChemicalElement().getCovalentRadius()+atom.getChemicalElement().getVDWRadius();
double dist = this.getDistanceTo(atom);
return valenceDist / 2 > dist;
}
public static void debugAtom(String msg, CMLAtom atom) {
if (atom == null) {
Util.println(msg+"...");
} else {
Util.println(msg+"..."+atom.getId());
}
}
} |
// Make graphics2d contexts for each test the same size, so we can layer
// the images without invalidating the previous ones.
PP_Resource CreateGraphics2D_90x90() {
PP_Resource graphics2d = PPBGraphics2D()->Create(
pp_instance(), &k90x90, kNotAlwaysOpaque);
CHECK(graphics2d != kInvalidResource);
PPBInstance()->BindGraphics(pp_instance(), graphics2d);
return graphics2d;
} |
/**
* Handles the redirection of incoming evaluation and assign group entity URLs to the proper views with the proper view params added
*
* @author Aaron Zeckoski (azeckoski @ unicon.net)
*/
@Slf4j
public class EvaluationVPInferrer implements EntityViewParamsInferrer {
private EvalCommonLogic commonLogic;
public void setCommonLogic(EvalCommonLogic commonLogic) {
this.commonLogic = commonLogic;
}
private EvalEvaluationService evaluationService;
public void setEvaluationService(EvalEvaluationService evaluationService) {
this.evaluationService = evaluationService;
}
private ModelAccessWrapperInvoker wrapperInvoker;
public void setWrapperInvoker(ModelAccessWrapperInvoker wrapperInvoker) {
this.wrapperInvoker = wrapperInvoker;
}
public void init() {
log.debug("VP init");
}
/* (non-Javadoc)
* @see org.sakaiproject.rsf.entitybroker.EntityViewParamsInferrer#getHandledPrefixes()
*/
public String[] getHandledPrefixes() {
return new String[] {
EvaluationEntityProvider.ENTITY_PREFIX,
AssignGroupEntityProvider.ENTITY_PREFIX
};
}
/* (non-Javadoc)
* @see org.sakaiproject.rsf.entitybroker.EntityViewParamsInferrer#inferDefaultViewParameters(java.lang.String)
*/
public ViewParameters inferDefaultViewParameters(String reference) {
//log.warn("Note: Routing user to view based on reference: " + reference);
final String ref = reference;
final ViewParameters[] togo = new ViewParameters[1];
// this is needed to provide transactional protection
wrapperInvoker.invokeRunnable(() -> { togo[0] = inferDefaultViewParametersImpl(ref); });
return togo[0];
}
private ViewParameters inferDefaultViewParametersImpl(String reference) {
EntityReference ref = new EntityReference(reference);
EvalEvaluation evaluation = null;
Long evaluationId = null;
String evalGroupId = null;
if (EvaluationEntityProvider.ENTITY_PREFIX.equals(ref.getPrefix())) {
// we only know the evaluation
evaluationId = Long.valueOf(ref.getId());
evaluation = evaluationService.getEvaluationById(evaluationId);
} else if (AssignGroupEntityProvider.ENTITY_PREFIX.equals(ref.getPrefix())) {
// we know the evaluation and the group
Long assignGroupId = Long.valueOf(ref.getId());
EvalAssignGroup assignGroup = evaluationService.getAssignGroupById(assignGroupId);
evalGroupId = assignGroup.getEvalGroupId();
evaluationId = assignGroup.getEvaluation().getId();
evaluation = evaluationService.getEvaluationById(evaluationId);
} else {
throw new IllegalArgumentException("Invalid reference (don't know how to handle): "+ref);
}
if ( EvalConstants.EVALUATION_AUTHCONTROL_NONE.equals(evaluation.getAuthControl()) ) {
// anonymous evaluation URLs ALWAYS go to the take_eval page
log.debug("User taking anonymous evaluation: " + evaluationId + " for group: " + evalGroupId);
EvalViewParameters vp = new EvalViewParameters(TakeEvalProducer.VIEW_ID, evaluationId, evalGroupId);
vp.external = true;
return vp;
} else {
// authenticated evaluation URLs depend on the state of the evaluation and the users permissions
String currentUserId = commonLogic.getCurrentUserId();
boolean userAdmin = commonLogic.isUserAdmin(currentUserId);
log.debug("Note: User ("+currentUserId+") accessing authenticated evaluation: " + evaluationId + " in state ("+EvalUtils.getEvaluationState(evaluation, false)+") for group: " + evalGroupId);
// eval has not started
if ( EvalUtils.checkStateBefore(EvalUtils.getEvaluationState(evaluation, false), EvalConstants.EVALUATION_STATE_INQUEUE, true) ) {
// go to the add instructor items view if permission
// NOTE: the checks below are slightly expensive and should probably be reworked to use the newer participants methods
if (evalGroupId == null) {
Map<Long, List<EvalAssignGroup>> m = evaluationService.getAssignGroupsForEvals(new Long[] {evaluationId}, true, null);
EvalGroup[] evalGroups = EvalUtils.getGroupsInCommon(
commonLogic.getEvalGroupsForUser(currentUserId, EvalConstants.PERM_BE_EVALUATED),
m.get(evaluationId) );
if (evalGroups.length > 0) {
// if we are being evaluated in at least one group in this eval then we can add items
// TODO - except we do not have a view yet so go to the preview eval page
EvalViewParameters vp = new EvalViewParameters(PreviewEvalProducer.VIEW_ID, evaluationId);
vp.external = true;
return vp;
}
} else {
if (commonLogic.isUserAllowedInEvalGroup(currentUserId, EvalConstants.PERM_BE_EVALUATED, evalGroupId)) {
// those being evaluated get to go to add their own questions
// TODO - except we do not have a view yet so go to the preview eval page
EvalViewParameters vp = new EvalViewParameters(PreviewEvalProducer.VIEW_ID, evaluationId);
vp.external = true;
return vp;
}
}
// otherwise just show the preview as long as the user is an admin
if (userAdmin) {
EvalViewParameters vp = new EvalViewParameters(PreviewEvalProducer.VIEW_ID, evaluationId);
vp.external = true;
return vp;
}
// else just require auth
throw new SecurityException("User must be authenticated to access this page");
}
// finally, try to go to the take evals view
if (! commonLogic.isUserAnonymous(currentUserId) ) {
// check perms if not anonymous
// switched to take check first
if ( evaluationService.canTakeEvaluation(currentUserId, evaluationId, evalGroupId) ) {
log.debug("User ("+currentUserId+") taking authenticated evaluation: " + evaluationId + " for group: " + evalGroupId);
EvalViewParameters vp = new EvalViewParameters(TakeEvalProducer.VIEW_ID, evaluationId, evalGroupId);
vp.external = true;
return vp;
} else if (currentUserId.equals(evaluation.getOwner()) ||
commonLogic.isUserAllowedInEvalGroup(currentUserId, EvalConstants.PERM_BE_EVALUATED, evalGroupId)) {
// cannot take, but can preview
EvalViewParameters vp = new EvalViewParameters(PreviewEvalProducer.VIEW_ID, evaluationId);
vp.external = true;
return vp;
} else {
// no longer want to show security exceptions - https://bugs.caret.cam.ac.uk/browse/CTL-1548
//throw new SecurityException("User ("+currentUserId+") does not have permission to take or preview this evaluation ("+evaluationId+")");
return new EvalViewParameters(TakeEvalProducer.VIEW_ID, evaluationId, evalGroupId);
}
}
throw new SecurityException("User must be authenticated to access this page");
}
}
} |
def design_protein(net, x, edge_index, edge_attr, results, cutoff):
x_proba = torch.ones_like(x).to(torch.float) * cutoff
heap = [PrioritizedItem(0, x, x_proba)]
i = 0
while heap:
item = heapq.heappop(heap)
if i % 1000 == 0:
print(
f"i: {i}; p: {item.p:.4f}; num missing: {(item.x == 20).sum()}; "
f"heap size: {len(heap):7d}; results size: {len(results)}"
)
if not (item.x == 20).any():
results.append(item)
else:
children = get_descendents(net, item.x, item.x_proba, edge_index, edge_attr, cutoff)
for x, x_proba in children:
heapq.heappush(heap, PrioritizedItem(-x_proba.sum(), x, x_proba))
i += 1
        if len(heap) > 1_000_000:
            # Heuristic prune: keep the first 700k heap entries (a heap is only
            # partially ordered, so this is approximate) and restore the invariant.
            heap = heap[:700_000]
            heapq.heapify(heap)
return results |
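`PrioritizedItem` used by `design_protein` above is not defined in this chunk. A minimal sketch consistent with how it is used here — ordered by the priority field only, so the payload tensors are never compared:

```python
import heapq
from dataclasses import dataclass, field
from typing import Any

@dataclass(order=True)
class PrioritizedItem:
    p: float                          # priority: lower pops first
    x: Any = field(compare=False)     # payload, excluded from ordering
    x_proba: Any = field(compare=False)

heap = []
heapq.heappush(heap, PrioritizedItem(0.5, "b", None))
heapq.heappush(heap, PrioritizedItem(0.1, "a", None))
best = heapq.heappop(heap)            # the item with the lowest p
```

`field(compare=False)` is what keeps `heappush` from falling back to comparing the payloads when two priorities tie only on the first field.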
#!/usr/bin/python3
# Need to install the Python3 anafile wrapper that can be downloaded here.
from cds.core import *
import numpy as np
if __name__ == "__main__":
# Create the ReadMe/Table maker
#tablemaker = CDSTablesMaker(out='ReadMe')
tablemaker = CDSTablesMaker()
# Add astropy Table
data = Table({'a': [1, 2, 3],
'b': [4.0, 1115.0, 6.12340],
'col3': [1.1,2.,12.3]},
names=['a', 'b','col3'])
table = tablemaker.addTable(data, name="data.cds")
#table.setByteByByteTemplate("/home/landais/workspace/pyreadme/userbytebybyte.template")
# Print bytebybyte
#tablemaker.printByteByByte(table)
# Add ASCII aligned columns (fortran) table
table = tablemaker.addTable("h_dm_com.dat",
name="h_dm_com.cds",
description="hipparcos table",
nullvalue="--")
# Add TSV table
table = tablemaker.addTable("asu.tsv", description="my tsv table")
# Add TSV table (ignore the first 10 lines)
cdstable = CDSFileTable("asu.tsv", description="my tsv table")
table = tablemaker.addTable(cdstable)
# Add TSV table (containing sexagesimal columns)
table = tablemaker.addTable("asu_sexa.tsv", description="my tsv table")
table.columns[5].setSexaRa(precision=1)
table.columns[6].setSexaDe()
# Add numpy table
ntab = np.array([(1.1,1,'test'), (2.2,2,'test2'), (3.3,3,None)],
dtype=[('mag', np.float64), ('recno', np.int32), ('comment', np.str_, 10)])
table = tablemaker.addTable(ntab,
name="myNumpyTable",
description="test numpy table")
# Print table index (all tables)
#tablemaker.printTablesIndex()
tablemaker.writeCDSTables()
# Customize ReadMe output
tablemaker.title = "my title"
tablemaker.author = 'G.Landais'
tablemaker.date = 2015
tablemaker.abstract = "This is my abstract"
# Print ReadMe
with open('ReadMe', 'w') as out:
tablemaker.makeReadMe(out=out)
|
from flask import request, url_for
from flask_restx import Resource, abort
from tconfig.orm import orm_utils
from tconfig.api.schemas import ParameterSchema, NewParameterSchema, MoveParameterSchema
from tconfig.api.resources import resource_utils
PARAMETER_SCHEMA = ParameterSchema()
NEW_PARAMETER_SCHEMA = NewParameterSchema()
MOVE_PARAMETER_SCHEMA = MoveParameterSchema()
# noinspection PyMethodMayBeStatic
class ParameterListResource(Resource):
def get(self):
parameter_list = orm_utils.get_parameter_list()
response_content = {
"parameter_list": [
PARAMETER_SCHEMA.dump(parameter) for parameter in parameter_list
],
            "parameter_list_url": url_for(".parameter_list"),
}
return response_content
def put(self):
parameter_set = orm_utils.get_or_404_parameter_set()
put_data = request.get_json() # @UndefinedVariable
put_data.update({"numItems": len(parameter_set)})
validation_errors = MOVE_PARAMETER_SCHEMA.validate(put_data)
if validation_errors:
abort(400, f"Validation error(s): {validation_errors}")
old_index = put_data["oldIndex"]
new_index = put_data["newIndex"]
moved_parameter = parameter_set.pop(old_index)
parameter_set.insert(new_index, moved_parameter)
resource_utils.perform_orm_commit_or_500(parameter_set, operation="update")
response_content = {
"message": "parameter moved within list",
"old_index": old_index,
"new_index": new_index,
"moved_parameter_url": url_for(".parameter", uid=moved_parameter.uid),
            "parameter_list_url": url_for(".parameter_list"),
}
return response_content
def post(self):
parameter_set = orm_utils.get_or_404_parameter_set()
post_data = request.get_json() # @UndefinedVariable
validation_errors = NEW_PARAMETER_SCHEMA.validate(post_data)
if validation_errors:
abort(400, f"Validation error(s): {validation_errors}")
value_info = {
key: NEW_PARAMETER_SCHEMA.fields[key].deserialize(post_data[key])
for key in post_data
}
new_parameter = orm_utils.create_parameter(**value_info)
parameter_set.append(new_parameter)
resource_utils.perform_orm_commit_or_500(parameter_set)
new_id = new_parameter.uid
response_content = {
"message": "new parameter created",
"new_parameter": PARAMETER_SCHEMA.dump(new_parameter),
"new_parameter_url": url_for(".parameter", uid=new_id),
"parameter_list_url": url_for(".parameter_list"),
"parameter_set_url": url_for(".parameter_set"),
}
return response_content, 201
|
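The PUT handler above reorders the parameter list with a pop/insert pair. The same operation as a standalone sketch (the surrounding ORM and schema helpers are project-specific, so they are omitted here):

```python
def move_item(items, old_index, new_index):
    """Move one element of a list from old_index to new_index, in place."""
    if not (0 <= old_index < len(items)) or not (0 <= new_index < len(items)):
        raise IndexError("move index out of range")
    # pop shifts later elements left; insert shifts them right again.
    items.insert(new_index, items.pop(old_index))
    return items
```

Validating both indices up front mirrors what `MOVE_PARAMETER_SCHEMA.validate` does with the `numItems` bound before the list is touched.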
package de.dc.javafx.emfsupport.website.model.ui;
import java.io.File;
import de.dc.javafx.efxclipse.runtime.model.IEmfManager;
import de.dc.javafx.emfsupport.website.model.Author;
import de.dc.javafx.emfsupport.website.model.ModelFactory;
import de.dc.javafx.emfsupport.website.model.Page;
import de.dc.javafx.emfsupport.website.model.Website;
import javafx.collections.FXCollections;
import javafx.collections.ObservableList;
import javafx.scene.Parent;
public class WebsiteModelTreeViewApplication extends BaseWebsiteModelViewApplication {
@Override
public Parent getRoot() {
IEmfManager<Website> manager = new WebsiteModelManager();
ObservableList<Page> pages = FXCollections.observableArrayList();
for (int i = 0; i < 20; i++) {
Page createPage = ModelFactory.eINSTANCE.createPage();
createPage.setName("<NAME>");
createPage.setUrl("www.johndoe.com");
createPage.setBody("No Content");
for (int j = 0; j < 10; j++) {
Author author = ModelFactory.eINSTANCE.createAuthor();
author.setFirstname("No");
author.setLastname("Name");
author.setEmail("<EMAIL>");
createPage.getAuthor().add(author);
}
pages.add(createPage);
}
manager.getRoot().getPages().addAll(pages);
return new WebsiteModelTreeView(manager);
}
@Override
protected String getTitle() {
return "EmfSupport Model Editor with PropertySheet, Editing Support, Undo/Redo";
}
public static void main(String[] args) {
new File("./workspace").mkdirs();
launch(args);
}
}
|
def update_db(self,
reading: float,
client: str,
vendor: str,
per_unit: float,
units: float,
reason: str
) -> None:
try:
buildInstertionCommand(DATABASE,
CURSOR,
reading,
client,
vendor,
per_unit,
units,
reason)
except Exception as e:
            messagebox.showerror(
                "Fatal error while updating the database",
                "The data was not loaded correctly. Try again, or check that "
                "no other program is working on the file.\n"
                f"(Error reported by the database: '{e}')")
            return None
        messagebox.showinfo(
            "Process finished successfully",
            "The data has been saved back to the database.\n"
            "You can create a new record or exit from the main menu.")
self.goto("go_home") |
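`buildInstertionCommand` is defined elsewhere in this project, so its signature here is all we know. A hypothetical minimal version using parameterized SQL — the table and column names below are illustrative assumptions, not the project's real schema:

```python
import sqlite3

def build_insertion_command(database, cursor, reading, client, vendor,
                            per_unit, units, reason):
    # Placeholders (?) keep the values out of the SQL text itself,
    # which avoids both quoting bugs and SQL injection.
    cursor.execute(
        "INSERT INTO readings (reading, client, vendor, per_unit, units, reason)"
        " VALUES (?, ?, ?, ?, ?, ?)",
        (reading, client, vendor, per_unit, units, reason),
    )
    database.commit()

# Usage against an in-memory database:
db = sqlite3.connect(":memory:")
cur = db.cursor()
cur.execute("CREATE TABLE readings (reading REAL, client TEXT, vendor TEXT,"
            " per_unit REAL, units REAL, reason TEXT)")
build_insertion_command(db, cur, 12.5, "acme", "vend", 1.0, 3.0, "restock")
```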
// Implementation of the insertion sort algorithm in C
// Sorts an array into ascending order
#include<stdio.h>
void Insertion(int arr[],int);
int main()
{
int n;
printf("Enter the size of array\n");
scanf("%d",&n);
int arr[n];
printf("Enter %d integers\n",n);
for(int i=0;i<n;i++)
scanf("%d",&arr[i]);
Insertion(arr,n);
printf("Array in sorted order is:\n");
for(int i=0;i<n;i++)
printf("%d ",arr[i]);
}
void Insertion(int arr[],int n)
{
for(int i=1;i<n;i++)
{
int key=arr[i];
int j=i-1;
while(j>=0 && arr[j]>key)
{
arr[j+1]=arr[j];
j=j-1;
}
arr[j+1]=key;
}
}
|
Jim Ritter, a gay officer with more than 30 years in the SPD, has started a campaign that politicians and others think is helping restore the security of Seattle’s gay community.
When he was a 14-year-old police cadet in Seattle, Jim Ritter knew he wanted to work for the Seattle Police Department someday. He also knew he was gay.
In the mid-1970s, you couldn’t be both.
After high school, Ritter joined the Kittitas County Sheriff’s Office knowing he wouldn’t face questions about his sexual orientation — a disqualifier in some departments in those days.
He waited for three years before he applied to the SPD, nervous that he might face a polygraph test with questions about his sexual orientation.
They didn’t ask. He got in.
Even then, it was 10 years before he felt comfortable coming out. But that didn’t keep him from advocating for the LGBTQ community throughout his career as a beat and patrol cop, background investigator, recruiter and founder of the Seattle Metropolitan Police Museum.
Last September, after more than three decades on the force, Ritter, 54, was appointed as SPD’s first full-time liaison to the city’s LGBTQ community.
Since then, Ritter has been making efforts to connect the SPD and the LGBTQ community through a program called “Safe Place,” a campaign against bias crimes that labels businesses as LGBTQ allies and trains employees to call 911 to report hate crimes while harboring victims until police arrive. The campaign also includes a webpage with resources and an anonymous reporting form.
Ritter’s appointment comes amid an uptick in reported anti-LGBTQ bias and hate crimes on Capitol Hill, the hub of Seattle’s gay community.
“Everybody comes up here because we’re fun and we’re safe — at least, we used to be,” said Shaun Knittel, the president and co-founder of LGBTQ rights organization Social Outreach Seattle. “On the Hill, should we have to change? Absolutely not. If you come here to party, to live, to work, you should have an understanding of our ideals.”
In March, Ed Murray, Seattle’s first gay mayor and a longtime resident of Capitol Hill, called the increase in crime against sexual minorities a “crisis.” SPD statistics show that bias crimes against the population climbed from 19 to 36 between 2013 and 2014. However, there’s widespread agreement that crimes are still sporadic and minor in comparison to the organized “gay bashing” of decades past.
The goal of the “Safe Place” is reducing violence and bias crimes within Seattle’s LGBTQ community and, most immediately, within the Capitol Hill neighborhood.
But to achieve that goal, the SPD will have to improve its image among some people who may be wary of the police.
Ritter understands.
When he joined the force in 1983, there was only one openly gay male officer, he said. Although comments about that officer were “not systematic,” Ritter said there were enough to convince him that revealing his own sexual orientation might do more harm than good.
“It’s basically leading at least two lives,” Ritter said. “It’s really difficult — it takes a lot of energy.”
But Ritter says that has changed. When he came out in 1993, it was “no big deal” to his colleagues. Today, there are about 50 openly gay officers on the force, Ritter says.
“The way we move forward is to show the public that this is 2015 — and we are not the department we were 30 years ago,” Ritter said. “In SPD, we’re about as progressive as it gets. I’ve been all over the U.S., and when it comes to police departments, I have yet to find one who can hold a candle to us.”
He hopes to help bust the stereotype around police by personally showing the LGBTQ community that police do care, and that SPD Safe Place is “a mechanism in place to prove that.”
Ideally, increased reports and arrests will follow.
“We can only act on the cases we know about so we need to be able to track it, understand what’s going on and hopefully adapt our procedures to address that segment of the community,” said Capt. Paul McDonagh, who heads the department’s East Precinct, which includes Capitol Hill. “We’re supposed to be able to serve everyone.”
Since June 19, SPD has responded to five bias incidents in Capitol Hill and one downtown, according to Ritter. Five of those six cases resulted in an arrest, Ritter said, and police are still searching for the suspects in the sixth.
John Wallace, a recent resident and employee of Capitol Hill, isn’t sure there is an overall consensus about rebuilding trust with police, particularly because suspicion of the police has been heightened on a national level.
“There’s a lot of history of bad blood, and whether or not it’s entirely justifiable on both sides is a matter of opinion,” said Wallace, who recently organized a march on Capitol Hill to protest LGBTQ crime. “I want their help, but we need it in an appropriate way going forward.”
Karyn Schwartz, whose apothecary Sugarpill is a part of the Safe Place campaign, says she was the victim of a mugging in a different part of the city a few years ago. She remembers the helpless feeling of not knowing where to go and doesn’t want anyone else to feel the same.
A longtime member of the Capitol Hill community, she also thinks the Safe Place logo — a rainbow flag inside a badge — is a reminder of the inclusive spirit that’s started to feel endangered in the neighborhood.
“It’s really visibly queer in a way that I’ve been craving,” Schwartz says. “Seeing some sort of representation of yourself that says ‘you belong here’ — it brings calm to your soul.”
Knittel, of Social Outreach Seattle, thinks that if the community continues to get behind the campaign and increase nonviolent bystander intervention, Capitol Hill could feel safe again.
“LGBTQ are the majority on the Hill. If there are two animals and 20 of us … they’re not very dangerous anymore,” Knittel said. “We have a police department; they need to do their job. We have a community, and we need to do our job. That’s to keep people safe.”
An early example of success, say Knittel and the SPD, is an incident that happened outside of R Place, a gay bar on Capitol Hill and a business branded as a “Safe Place.” On June 19, when two men started yelling homophobic slurs at a man outside R Place and threatened to attack him, witnesses stepped in to stop the attack and called the police. The men were arrested.
It’s an indicator of change, Knittel says. |
# Count pairs of numbers whose binary representations have the same number
# of set bits: each popcount bucket of size c contributes c*(c-1)/2 pairs.
n = input()  # first line (the count) is read but not otherwise needed
c = [0] * 99
for x in map(int, input().split()):
    c[bin(x).count('1')] += 1
print(sum(x * (x - 1) // 2 for x in c)) |
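The same counting logic as a reusable Python 3 function: group numbers by popcount, then count the pairs within each group.

```python
def count_equal_popcount_pairs(nums):
    # Two numbers form a pair iff their binary representations
    # contain the same number of set bits.
    counts = {}
    for x in nums:
        k = bin(x).count("1")
        counts[k] = counts.get(k, 0) + 1
    # c numbers with the same popcount yield C(c, 2) = c*(c-1)/2 pairs.
    return sum(c * (c - 1) // 2 for c in counts.values())
```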
/**
* Runtime exceptions generated by this class. Wraps exceptions generated by delegated operations,
* particularly when they are not, themselves, Runtime exceptions.
*/
public static class PersistenceFeatureException extends RuntimeException {
private static final long serialVersionUID = 1L;
/**
* Constructor.
* */
public PersistenceFeatureException(Exception ex) {
super(ex);
}
} |
package config
const (
CurrentAPIVersion = "v1"
ServiceName = "gocouchbase"
APIPrefix = "/" + ServiceName + "/" + CurrentAPIVersion
RequestContext = "RequestContext"
ClientIDKey = "clientID"
RequestIDKey = "requestID"
RequestTimeKey = "timestamp"
RemoteAddrKey = "remoteAddr"
MethodKey = "method"
RequestURIKey = "requestURI"
)
var (
OpenPaths = []string{"/health", "/authenticate"}
)
|
def dijkstra(start: "node", *goal, limit=None, imbalance=None) -> list:
if start in goal: return [start]
if imbalance is None: imbalance = []
node = start
frontier = [(0, start)]
seen = set()
explored = set()
prev = {}
while True:
if len(frontier) == 0:
raise ValueError("frontier is empty")
cost, node = heapq.heappop(frontier)
if limit and cost > limit:
raise TimeoutError("Time exceeded. No match found")
if node is None:
raise ValueError("Node is None")
new_cost = cost + node.time
late_cost = cost + node.time + LATEPENALTY
if node in goal:
li = [node]
n = node
while n != start:
li.append(prev[n])
n = prev[n]
return li[::-1]
explored.add(node)
not_seen = node.nextMove.difference(seen).difference(explored)
has_late = bool(node.lateMove)
late_not_seen = node.lateMove.difference(seen).difference(explored) if has_late else set()
frontier_copy = frontier.copy()
for i,j in frontier_copy:
if j not in explored:
if j in node.nextMove:
my_cost = bounty(j, imbalance, new_cost)
if node.last and j == node.last: my_cost += LAST_MOVE_PENALTY
if i > my_cost:
decrease_key(frontier, i, my_cost, j)
prev[j] = node
                elif has_late and j in node.lateMove:
                    my_cost = bounty(j, imbalance, late_cost)
                    if node.last and j == node.last: my_cost += LAST_MOVE_PENALTY
                    if i > my_cost:
                        decrease_key(frontier, i, my_cost, j)
                        prev[j] = node
for i in not_seen:
my_cost = bounty(i, imbalance, new_cost)
if node.last and i == node.last: my_cost += LAST_MOVE_PENALTY
heapq.heappush(frontier, (my_cost,i))
seen.add(i)
prev[i] = node
        for i in late_not_seen:
            my_cost = bounty(i, imbalance, late_cost)
            if node.last and i == node.last: my_cost += LAST_MOVE_PENALTY
heapq.heappush(frontier, (my_cost,i))
seen.add(i)
prev[i] = node |
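`decrease_key` is not shown in this chunk, and Python's `heapq` has no native decrease-key operation. One plausible implementation — consistent with the `(cost, node)` tuples used above — replaces the entry and re-heapifies, which is O(n) but fine for moderate frontier sizes:

```python
import heapq

def decrease_key(frontier, old_cost, new_cost, node):
    """Lower the priority of (old_cost, node) to new_cost in a heapq list."""
    i = frontier.index((old_cost, node))   # linear scan for the entry
    frontier[i] = (new_cost, node)
    heapq.heapify(frontier)                # restore the heap invariant

# Usage:
frontier = [(1, "a"), (5, "b"), (7, "c")]
heapq.heapify(frontier)
decrease_key(frontier, 7, 0, "c")
```

An alternative that avoids the O(n) scan is "lazy deletion": push the cheaper duplicate and skip stale entries when they are popped.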
import Vue from 'vue';
export { createNamespace } from './create';
export { addUnit } from './format/unit';
export const isServer: boolean = Vue.prototype.$isServer;
export function noop() {}
export function isDef(val: any): boolean {
return val !== undefined && val !== null;
}
export function isFunction(val: unknown): val is Function {
return typeof val === 'function';
}
export function isObject(val: any): val is Record<any, any> {
return val !== null && typeof val === 'object';
}
export function isPromise<T = any>(val: unknown): val is Promise<T> {
return isObject(val) && isFunction(val.then) && isFunction(val.catch);
}
export function get(object: any, path: string): any {
const keys = path.split('.');
let result = object;
keys.forEach((key) => {
result = isDef(result[key]) ? result[key] : '';
});
return result;
}
|
/**
* Exclude filter.
*
* @param shenyuConfig the shenyu config
* @return the web filter
*/
@Bean
@Order(-5)
@ConditionalOnProperty(name = "shenyu.exclude.enabled", havingValue = "true")
public WebFilter excludeFilter(final ShenyuConfig shenyuConfig) {
return new ExcludeFilter(shenyuConfig.getExclude().getPaths());
} |
#!/usr/bin/env python
from flask import Flask
app = Flask(__name__, static_url_path="")
app.debug = True
import twidder.views
|
//Searches a Date In a CSV File For the Company Symbol and Date Specified
private void searchDateCompany() {
String name;
String SearchDate;
name = JOptionPane.showInputDialog("Please Enter the Symbol or Filename of The Company");
SearchDate = JOptionPane.showInputDialog("Enter the Date Like So -> YYYY-MM-DD");
String newname;
try {
name = name.toUpperCase();
if (name.contains(".CSV") && new File(name).exists()) {
SearchandViewService.readFile(name, SearchDate);
} else if (!name.contains(".CSV")) {
newname = name + ".CSV";
if (newname.contains(".CSV") && new File(newname).exists()) {
SearchandViewService.readFile(newname, SearchDate);
} else if (newname.contains(".CSV") && !new File(newname).exists()) {
getupdate = new CSVUpdateService();
String crumb;
String crumbinput;
if (newname.contains(".CSV")) {
crumbinput = newname.substring(0, newname.length() - 4);
crumb = getupdate.getCrumb(crumbinput);
if (crumb != null && !crumb.isEmpty()) {
getupdate.downloadData(crumbinput, 0, System.currentTimeMillis(), crumb);
SearchandViewService.readFile(newname, SearchDate);
} else {
System.out.println("Unable to view data using the Symbol: " + crumbinput.toUpperCase());
JOptionPane.showMessageDialog(null, "Unable to Show Selected Data, Symbol May Not Exist");
}
}
}
} else {
System.out.println("Unable to view table data using the Symbol: " + name + "" +
" Please Ensure that .csv is added and the company symbol exists on the web");
JOptionPane.showMessageDialog(null, "Unable to show selected data");
}
} catch (Exception el) {
JOptionPane.showMessageDialog(null, "You have clicked Cancel");
}
} |
def main():
from collections import deque
K = int(input())
q = deque(range(1, 10))
for _ in range(K):
x = q.popleft()
lastdigit = x%10
base = x*10 + lastdigit
if lastdigit != 0:
q.append(base-1)
q.append(base)
if lastdigit != 9:
q.append(base+1)
print(x)
if __name__ == '__main__':
main() |
/*
* napms. Sleep for ms milliseconds. We don't expect a particularly good
* resolution - 60ths of a second is normal, 10ths might even be good enough,
* but the rest of the program thinks in ms because the unit of resolution
* varies from system to system. (In some countries, it's 50ths, for example.)
* Vaxen running 4.2BSD and 3B's use 100ths.
*
* Here are some reasonable ways to get a good nap.
*
* (1) Use the poll() or select() system calls in SVr3 or Berkeley 4.2BSD.
*
* (2) Use the 1/10th second resolution wait in the System V tty driver.
* It turns out this is hard to do - you need a tty line that is
* always unused that you have read permission on to sleep on.
*
* (3) Install the ft (fast timer) device in your kernel.
 * This is a pseudo-device to which an ioctl will wait n ticks
* and then send you an alarm.
*
* (4) Install the nap system call in your kernel.
* This system call does a timeout for the requested number of ticks.
*
* (5) Write a routine that busy waits checking the time with ftime.
* Ftime is not present on SYSV systems, and since this busy waits,
* it will drag down response on your system. But it works.
*/
int
napms(int ms)
{
struct pollfd pollfd;
if (poll(&pollfd, 0L, ms) == -1)
perror("poll");
return (OK);
} |
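Approach (1) above, sketched in Python on a POSIX system: `select` with empty fd sets and a fractional timeout gives a millisecond-resolution nap. (This is a platform assumption — on Windows, `select` requires at least one socket, so `time.sleep` would be used instead.)

```python
import select
import time

def napms(ms):
    """Sleep for roughly ms milliseconds via the select() timeout."""
    select.select([], [], [], ms / 1000.0)

start = time.monotonic()
napms(50)
elapsed_ms = (time.monotonic() - start) * 1000
```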
// Copyright 2018 <NAME>
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include "base/bind.h"
#include "ui/aura/window.h"
#include "content/public/browser/browser_thread.h"
#include "content/public/browser/navigation_controller.h"
#include "content/public/browser/render_widget_host_view.h"
#include "content/eglcontent/browser/browser.h"
#include "content/eglcontent/browser/browser_context.h"
#include "content/eglcontent/browser/aura_screen.h"
#include "content/eglcontent/api/application.h"
#include "content/eglcontent/api/browser_config.h"
namespace content {
std::unique_ptr<EGLContentBrowser> EGLContentBrowser::g_browser;
void EGLContentBrowser::Initialise(EGLContent::BrowserConfig& config,
EGLContent::BrowserDelegate* delegate) {
g_browser.reset(new EGLContentBrowser(delegate));
g_browser->CreateBrowserContext(config);
g_browser->CreateWindow(config);
g_browser->Ready();
}
void EGLContentBrowser::Release() {
display::Screen::SetScreenInstance(NULL);
g_browser.reset();
}
EGLContentBrowser::EGLContentBrowser(EGLContent::BrowserDelegate* delegate)
: delegate_(delegate) {
}
EGLContentBrowser::~EGLContentBrowser() {
}
void EGLContentBrowser::CreateBrowserContext(EGLContent::BrowserConfig& config) {
browser_context_.reset(new EGLContentBrowserContext(config));
browser_context_->Initialise();
}
void EGLContentBrowser::CreateWindow(EGLContent::BrowserConfig& config) {
gfx::Size window_size = gfx::Size(config.screen_width, config.screen_height);
screen_.reset(
new EGLContentAuraScreen(window_size, config.scale_factor));
display::Screen::SetScreenInstance(screen_.get());
screen_->Initialise();
WebContents::CreateParams create_params(browser_context_.get());
create_params.initial_size = window_size;
create_params.initially_hidden = false;
web_contents_ = WebContents::Create(create_params);
screen_->host()->SetBoundsInPixels(gfx::Rect(window_size));
screen_->host()->Show();
web_contents_->SetDelegate(this);
aura::Window* window = web_contents_->GetNativeView();
aura::Window* parent = screen_->host()->window();
if (!parent->Contains(window))
parent->AddChild(window);
window->Show();
RenderWidgetHostView* host_view = web_contents_->GetRenderWidgetHostView();
if (host_view)
host_view->SetSize(window_size);
}
void EGLContentBrowser::Ready() {
if (delegate_)
delegate_->BrowserCreated(this);
}
void EGLContentBrowser::LoadingStateChanged(WebContents* source, bool to_different_document) {
if (delegate_)
delegate_->LoadingStateChanged(source->IsLoading());
}
void EGLContentBrowser::LoadProgressChanged(WebContents* source, double progress) {
if (delegate_)
delegate_->LoadProgressed(progress);
}
void EGLContentBrowser::UpdateTargetURL(WebContents* source, const GURL& url) {
std::string str = url.spec();
if (delegate_)
delegate_->TargetURLChanged(str);
}
void EGLContentBrowser::LoadURL(std::string& url) {
BrowserThread::PostTask(
BrowserThread::UI, FROM_HERE,
base::Bind(&EGLContentBrowser::LoadURLTask, base::Unretained(this), url));
}
std::string EGLContentBrowser::GetURL() {
return web_contents_->GetVisibleURL().spec();
}
void EGLContentBrowser::Stop() {
BrowserThread::PostTask(
BrowserThread::UI, FROM_HERE,
base::Bind(&EGLContentBrowser::StopTask, base::Unretained(this)));
}
bool EGLContentBrowser::IsLoading() {
return web_contents_->IsLoading();
}
bool EGLContentBrowser::IsAudioMuted() {
return web_contents_->IsAudioMuted();
}
void EGLContentBrowser::SetAudioMuted(bool mute) {
BrowserThread::PostTask(
BrowserThread::UI, FROM_HERE,
base::Bind(&EGLContentBrowser::SetAudioMutedTask, base::Unretained(this), mute));
}
bool EGLContentBrowser::IsCrashed() const {
return web_contents_->IsCrashed();
}
void EGLContentBrowser::Reload() {
BrowserThread::PostTask(
BrowserThread::UI, FROM_HERE,
base::Bind(&EGLContentBrowser::ReloadTask, base::Unretained(this)));
}
void EGLContentBrowser::LoadURLTask(std::string url) {
GURL gurl(url);
NavigationController::LoadURLParams params(gurl);
LOG(INFO) << "Loading url : " << url;
params.frame_name = std::string();
params.transition_type = ui::PageTransitionFromInt(
ui::PAGE_TRANSITION_TYPED | ui::PAGE_TRANSITION_FROM_ADDRESS_BAR);
web_contents_->GetController().LoadURLWithParams(params);
web_contents_->Focus();
}
void EGLContentBrowser::StopTask() {
web_contents_->Stop();
}
void EGLContentBrowser::SetAudioMutedTask(bool mute) {
web_contents_->SetAudioMuted(mute);
}
void EGLContentBrowser::ReloadTask() {
web_contents_->GetController().Reload(ReloadType::NORMAL, false);
}
}
|
# lotto_env.py
import numpy as np
import matplotlib.pyplot as plt
import random
'''
Rules:
Select 6 numbers from 59
Pay -2 for each game
Prize
6 = 1000000
5 + bb = 100000
5 = 10000
4 = 1000
3 = 25
2 = 0
1 = 0
'''
class LottoEnv(object):
def __init__(self, num_balls=59, num_choices=6, has_bonus_ball=True, step_cost=-2, max_loss=10000) -> None:
self.num_balls = num_balls
self.num_choices = num_choices
self.has_bonus_ball = has_bonus_ball
self.step_cost = step_cost
self.max_loss = max_loss
        self.balls = list(range(1, num_balls + 1))
self.draw = []
self.bonus_ball = None # or 0?
self.selection = []
self.reward = max_loss
self.done = False
self.step_no = 0
self.stats = [0, 0, 0, 0, 0, 0, 0]
# self.render()
def reset(self):
self.draw_numbers()
self.reward = self.max_loss
self.done = False
self.step_no = 0
self.stats = [0, 0, 0, 0, 0, 0, 0]
return self.draw, self.reward, self.done, (self.step_no, self.stats)
def select_numbers(self):
self.selection = random.sample(self.balls, self.num_choices)
return self.selection
    def draw_numbers(self):
        # Draw the main numbers, then a separate bonus ball from the
        # remaining balls, so a full 6-ball match stays possible.
        self.draw = random.sample(self.balls, self.num_choices)
        self.bonus_ball = random.choice(
            [b for b in self.balls if b not in self.draw])
def step(self, action=None):
self.step_no += 1
self.draw_numbers()
# self.reward += self.step_cost
selection = action # or self.select_numbers()
self.calculate_reward(selection)
if self.reward < 1:
self.done = True
if self.step_no > 1000:
self.done = True
return self.draw, self.reward, self.done, (self.step_no, self.stats)
def calculate_reward(self, selection):
algo = {
6: 1000000,
# 5 + bb = 100000
5: 10000,
4: 1000,
3: 25,
2: 0,
1: 0,
0: 0
}
num_balls = 0
for ball in selection:
if ball in self.draw:
num_balls += 1
if num_balls == 5 and self.bonus_ball in selection:
reward_ = 100000
else:
reward_ = algo[num_balls]
self.reward += (reward_+self.step_cost)
self.stats[num_balls] += 1
if self.step_no % 100 == 0:
print(self.step_no, 'num_balls', num_balls, 'reward', reward_, 'cum reward', self.reward, self.stats)
def render(self):
# render the balls, draw and our selection
print('-----------------------------')
for n in self.balls:
if n in self.draw:
print("({})".format(n), end='\t')
else:
print(" {} ".format(n), end='\t')
if n % 8 == 0:
print('\n')
print('\n-----------------------------')
def max_action(Q, obs, possible_actions):
    # Placeholder stub: always returns action 1; replace with an argmax over Q-values.
    return 1
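As a quick sanity check on the payout table in `calculate_reward`, here is a standalone Monte Carlo sketch (my own, not part of the environment; it uses a simplified 6-vs-6 match, whereas the class above reserves one of the six drawn numbers as a bonus ball) that estimates how often a random ticket matches three or more numbers:

```python
import random

def match_count(num_balls=59, num_choices=6):
    # Draw a winning line and a random ticket; return how many numbers agree.
    balls = list(range(1, num_balls + 1))
    draw = set(random.sample(balls, num_choices))
    pick = set(random.sample(balls, num_choices))
    return len(draw & pick)

random.seed(0)
trials = 100_000
hits = sum(match_count() >= 3 for _ in range(trials))
print(f"P(match >= 3) ~ {hits / trials:.4f}")  # roughly 0.011 for a 6-of-59 draw
```

With three-plus matches landing only about once per hundred plays, the expected return per step is well below the step cost, which is presumably why the environment starts `self.reward` at `max_loss` rather than at zero.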
|
/*
only one level of child document supported
*/
private void addFieldsToDoc(Map<String, Object> objectMap, SolrInputDocument doc) {
objectMap.forEach(
(key, val) -> {
if (val != null) {
if ("_childDocuments_".equalsIgnoreCase(key)) {
doc.addChildDocuments(getChildDocuments(val));
} else {
if (val instanceof BigDecimal) {
val = ((BigDecimal) val).doubleValue();
}
doc.setField(key, val);
}
}
});
} |
    def buildDict(self, words):
        # Index every word under each key formed by deleting one character.
        self.dictSet = {}
        self.lenSet = set()
        for word in words:
            self.lenSet.add(len(word))
            for idx in range(0, len(word)):
                key = word[0:idx] + word[idx + 1:]
                if key in self.dictSet:
                    self.dictSet[key].append((idx, word))
                else:
                    self.dictSet[key] = [(idx, word)]
|
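The one-character-deleted index built above is the usual preparation for a lookup that asks whether a query word differs from some stored word in exactly one position. A self-contained sketch (the class name and the `search` method are assumptions; only `buildDict` appears above):

```python
class MagicDictionary:
    def buildDict(self, words):
        # Index each word under every key formed by deleting one character.
        self.index = {}
        for word in words:
            for i in range(len(word)):
                self.index.setdefault(word[:i] + word[i + 1:], []).append((i, word))

    def search(self, query):
        # A match needs the same deletion key, the same deleted position,
        # and a *different* character at that position.
        for i in range(len(query)):
            key = query[:i] + query[i + 1:]
            for idx, word in self.index.get(key, []):
                if idx == i and word[i] != query[i]:
                    return True
        return False

md = MagicDictionary()
md.buildDict(["hello", "leetcode"])
print(md.search("hhllo"))  # True: differs from "hello" in exactly one position
print(md.search("hello"))  # False: identical, no character changed
```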
# Source repo: victor-da-costa/Aprendendo-Python
from time import sleep
def contador(início, fim, passo):
    if passo < 0:
        passo *= -1
if passo == 0:
passo = 1
if início < fim:
cont = início
while cont <= fim:
print(f' {cont}', end='', flush=True)
sleep(0.5)
cont += passo
print()
else:
cont = início
while cont >= fim:
print(f' {cont}', end='', flush=True)
sleep(0.5)
cont -= passo
print()
início = int(input('Inicio: '))
fim = int(input('Fim: '))
passo = int(input('Passo: '))
contador(1, 10, 1)
contador(10, 0, 2)
contador(início, fim, passo)
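For reference, the same counting behavior can be sketched more compactly with `range` (a rewrite for comparison, not the original author's code; identifiers are transliterated to ASCII and the 0.5 s delay is kept):

```python
from time import sleep

def contador(inicio, fim, passo=1):
    # Guard against zero or negative steps, as the original intends.
    passo = abs(passo) or 1
    direcao = 1 if inicio <= fim else -1
    for cont in range(inicio, fim + direcao, direcao * passo):
        print(f' {cont}', end='', flush=True)
        sleep(0.5)
    print()

contador(1, 5)      # counts up:   1 2 3 4 5
contador(10, 0, 2)  # counts down: 10 8 6 4 2 0
```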
|
I haven’t done one of these since Iowa Eve, when there were still “others,” so let’s do it! Let’s vote on the community’s Democratic presidential candidate preference.
DATE    SANDERS  CLINTON  OTHER  UNSURE  SAMPLE SIZE (1,000s)
26-Jan    61       35       1      2       10.7
12-Jan    65       32       1      2       10.1
29-Dec    73       26       1      1       17.7
15-Dec    63       35       1      2        9
1-Dec     60       38       1      2        9
17-Nov    57       41       1      2       11.5
3-Nov     65       32       1      2       12.3
20-Oct    53       41       1      2        8
29-Sep    62       29       3      3        7.8
15-Sep    62       29       3      3        7.2
1-Sep     58       34       2      3        7.4
19-Aug    58       34       2      3        7.5
5-Aug     58       35       3      2        8.1
21-Jul    57       36       2      3        7.1
9-Jul     67       29       2      2        7.4
23-Jun    63       31       1      1        8
9-Jun     69       24       3      2       14.2
Vote! And here are some ground rules: If you feel like saying that Donald Trump will defeat Bernie Sanders and/or Hillary Clinton, you’re a moron—10 demerits! If you feel like saying that Hillary Clinton is more conservative than ANY Republican, you’re a moron—100 demerits. If you feel like saying that Bernie Sanders is like Barry Goldwater or George McGovern, you’re a moron—100 demerits. If you feel like saying that you’ll never vote for the Democratic nominee, whoever that is, go fuck yourself—1,000 demerits.
Even if Clinton was as conservative as the critics claim she is, are you really going to argue that her Scalia replacement will be no different than the GOP’s? And even if Sanders’ ideas could never get through a GOP House, don’t you realize that his Supreme Court nominee alone will be more important than anything he could accomplish legislatively?
We have two great candidates, no matter what your own psychosis tells you, and this election will decide the balance of the Supreme Court—which could mean an end to partisan gerrymandering and Citizens United and myriad other issues of critical importance to us. I get it, when you’re that committed to a single candidate, it’s hard to see beyond your biases. But that’s the truth. So fucking grow up. |
// Copyright 2022 The Moov Authors
// Use of this source code is governed by an Apache License
// license that can be found in the LICENSE file.
package main
import (
"encoding/json"
"math"
"reflect"
"sync"
)
type Result[T any] struct {
Data T
match float64
precomputedName string
precomputedAlts []string
}
func (e Result[T]) MarshalJSON() ([]byte, error) {
// Due to a problem with embedding type parameters we have to dig into
// the parameterized type fields and include them in one object.
//
// Helpful Tips:
// https://stackoverflow.com/a/64420452
// https://github.com/golang/go/issues/41563
v := reflect.ValueOf(e.Data)
result := make(map[string]interface{})
for i := 0; i < v.NumField(); i++ {
key := v.Type().Field(i)
value := v.Field(i)
if key.IsExported() {
result[key.Name] = value.Interface()
}
}
result["match"] = e.match
return json.Marshal(result)
}
func topResults[T any](limit int, minMatch float64, name string, data []*Result[T]) []*Result[T] {
if len(data) == 0 {
return nil
}
name = precompute(name)
xs := newLargest(limit, minMatch)
var wg sync.WaitGroup
wg.Add(len(data))
for i := range data {
go func(i int) {
defer wg.Done()
it := &item{
value: data[i],
weight: jaroWinkler(data[i].precomputedName, name),
}
for _, alt := range data[i].precomputedAlts {
if alt == "" {
continue
}
it.weight = math.Max(it.weight, jaroWinkler(alt, name))
}
xs.add(it)
}(i)
}
wg.Wait()
out := make([]*Result[T], 0)
for _, thisItem := range xs.items {
if v := thisItem; v != nil {
vv, ok := v.value.(*Result[T])
if !ok {
continue
}
res := &Result[T]{
Data: vv.Data,
match: v.weight,
precomputedName: vv.precomputedName,
precomputedAlts: vv.precomputedAlts,
}
out = append(out, res)
}
}
return out
}
|
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.Objects;
import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.xssf.usermodel.XSSFSheet;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
/**
* Class which helps to read data
* from XLSX files.
*
*/
public class ReaderXLSX {
/** list of the invoices */
private Invoices invoices;
/** name of the XLSX file */
private String filename;
/** correction invoice */
private InvoiceCorrection invoiceCorrection;
/**
* ReaderXLSX constructor. ReaderXLSX
* is responsible for reading data from the XLSX file
* and making a list of the created invoices in the invoices
* field.
*
* @param invoices Class in which Invoice instances
* are stored
* @param filenameToOpen Name of the file from which Invoices data
* is read
*/
ReaderXLSX(Invoices invoices, InvoiceCorrection invoiceCorrection, String filenameToOpen){
this.invoices = invoices;
this.filename = filenameToOpen;
this.invoiceCorrection = invoiceCorrection;
}
/**
* The method is used for
* reading records from the XLSX file.
*
*/
public void readFromXLSXfile(){
// create input stream to read from the file
FileInputStream file = null;
try {
// get data from the file
file = new FileInputStream(filename);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
// declare record iterator and sheet
XSSFSheet sheet;
Iterator<Row> rowIterator = null;
// create workbook in which data is stored
try (XSSFWorkbook workbook = new XSSFWorkbook(file)){
// get the first sheet from the XLSX file
sheet = workbook.getSheetAt(0);
// get records iterator
rowIterator = sheet.iterator();
} catch (IOException e) {
e.printStackTrace();
}
// initialize sum of invoices values
        float sumVal = 0f;
// iterate through each row
while (Objects.requireNonNull(rowIterator).hasNext()) {
// get next row
Row nextRow = rowIterator.next();
// through each cell iterator
Iterator<Cell> cellIterator = nextRow.cellIterator();
            // iterate through each cell in the row
String col0 = ""; String col1 = ""; String col2 = ""; String col3 = "";
String col4 = ""; String col5 = ""; String col6 = ""; String col7 = "";
String col8 = ""; String col9 = ""; String col10 = ""; String col11 = "";
String col12 = ""; String col13 = ""; String col14 = "";
            while(cellIterator.hasNext()) {
// get next cell
Cell nextCell = cellIterator.next();
// get string value of the cell
DataFormatter formatter = new DataFormatter();
String strValue = formatter.formatCellValue(nextCell);
// get value of each column in the current row
if(nextCell.getColumnIndex() == 0){
col0 = strValue;
} else if(nextCell.getColumnIndex() == 1) {
col1 = strValue;
}else if(nextCell.getColumnIndex() == 2){
col2 = strValue;
}else if(nextCell.getColumnIndex() == 3){
col3 = strValue;
}else if(nextCell.getColumnIndex() == 4){
col4 = strValue;
}else if(nextCell.getColumnIndex() == 5){
col5 = strValue;
}else if(nextCell.getColumnIndex() == 6){
if(strValue.contains("Faktura korygująca")){
invoiceCorrection.setPrzyczynaKorekty(strValue);
invoiceCorrection.setNrFaKorygowanej("Od " + col3 + " do " + col4);
String interval = strValue.replaceAll("[^(0-9)/-]", "");
invoiceCorrection.setOkresFaKorygowanej(interval.substring(0, interval.length() - 2));
continue;
}
col6 = strValue;
}else if(nextCell.getColumnIndex() == 7){
col7 = strValue;
}else if(nextCell.getColumnIndex() == 8){
col8 = strValue;
}else if(nextCell.getColumnIndex() == 9){
col9 = strValue;
}else if(nextCell.getColumnIndex() == 10){
col10 = strValue;
}else if(nextCell.getColumnIndex() == 11){
col11 = strValue;
}else if(nextCell.getColumnIndex() == 12){
col12 = strValue;
}else if(nextCell.getColumnIndex() == 13){
col13 = strValue;
}else if(nextCell.getColumnIndex() == 14) {
col14 = strValue;
}
}
invoices.getListInvoice().add((new Invoice()).setData(
col0,
col1,
col2,
col3,
col4,
col5,
col6,
col7,
col8,
col9,
col10,
col11,
col12,
col13,
col14
));
this.getInvoices().getSummary().setLiczbaFaktur(this.getInvoices().getListInvoice().size());
String digit = col14.replaceAll("[\\D]", "");
if(digit.length() > 0) {
sumVal += Float.parseFloat(digit);
}
}
this.getInvoices().getSummary().setWartoscFaktur(
String.valueOf(sumVal * 0.01) + " zł"
);
this.getInvoices().setInvoiceCorrection(this.getInvoiceCorrection());
}
public Invoices getInvoices() {
return invoices;
}
public InvoiceCorrection getInvoiceCorrection() {
return invoiceCorrection;
}
public void setInvoiceCorrection(InvoiceCorrection invoiceCorrection) {
this.invoiceCorrection = invoiceCorrection;
}
} |
// File: sampleprojects/eBookApp/src/main/java/com/dtrules/samples/bookpreview/app/GenCase.java
package com.dtrules.samples.bookpreview.app;
import com.dtrules.samples.bookpreview.TestCaseGen_BookPreview;
import com.dtrules.samples.bookpreview.datamodel.DataObj;
public class GenCase {
int level;
public void setLevel(int level){
if (level < 100)level = 100;
this.level = level;
}
BookPreviewApp app = null;
GenCase(BookPreviewApp app){
this.app = app;
}
DataObj generate(){
TestCaseGen_BookPreview gen = new TestCaseGen_BookPreview();
try{
return gen.generate();
        }catch(Exception e){
            // Retry until a case generates successfully; a persistent failure would recurse forever.
            return generate();
        }
}
    /**
     * This method watches the queue in the BookPreviewApp and fills
     * it with test cases until it is full (i.e. has level many jobs in it).
     */
public void fill() {
while(app.jobsWaiting()<level){
DataObj request = generate();
app.jobs.add(request);
}
}
}
|
/**
 * Close the xlog's file descriptor cleanly, without
 * side effects (for use in the atfork handler).
 */
void
xlog_atfork(struct xlog *xlog)
{
close(xlog->fd);
xlog->fd = -1;
} |
// Source repo: dpetrovych/jackson-databind-implicit
package io.dpetrovych.jackson.databind.implicit.types;
import org.jetbrains.annotations.NotNull;
import java.lang.reflect.Field;
import java.util.Arrays;
import java.util.List;
import static java.util.stream.Collectors.toList;
public class PropertiesExtractorMock implements PropertiesExtractor {
    @NotNull
    @Override
    @SuppressWarnings("unchecked") // the mock assumes callers pass a compatible Class<? extends T>
    public <T> PropertiesDescriptor<? extends T> getPropertiesDescriptor(Class<?> type) {
List<String> fields = Arrays.stream(type.getDeclaredFields()).map(Field::getName).collect(toList());
return new PropertiesDescriptor<T>(fields, (Class<T>) type);
}
}
|
/// <reference types="jquery" />
$('#myForm').validator();
$('#myForm').validator('update');
$('#myForm').validator('validate');
$('#myForm').validator('destroy');
$('#myForm').validator({
delay: 500,
html: false,
disable: false,
focus: true,
feedback: {},
custom: {}
});
|
import java.io.PrintWriter;
import java.util.*;
public class Normal {
public static void main(String[] args) throws Exception {
Scanner sc = new Scanner(System.in);
PrintWriter pw = new PrintWriter(System.out);
int N = sc.nextInt();
int k = sc.nextInt();
        int a, d, flag, count = 0;
        int[] digits = new int[10];
        for (int i = 0; i < N; i++) {
            a = sc.nextInt();
            Arrays.fill(digits, 0);
            flag = 0;
            while (a != 0) {
                d = a % 10;
                a = a / 10;
                digits[d]++;
            }
            for (int j = 0; j <= k; j++)
                if (digits[j] == 0)
                    flag = 1;
            if (flag == 0)
                count++;
        }
pw.println(count);
pw.close();
}
}
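For reference, the digit test at the heart of the loop above can be phrased compactly in Python (a paraphrase of the Java, not part of the original program):

```python
def contains_all_digits(n, k):
    # True if the decimal digits of n include every digit 0..k.
    digits = set()
    while n:
        digits.add(n % 10)
        n //= 10
    return set(range(k + 1)) <= digits

print(contains_all_digits(1230, 3))  # True: digits {0,1,2,3} cover 0..3
print(contains_all_digits(123, 3))   # False: the digit 0 is missing
```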
|
import Benefits from "./Benefits";
export default Benefits;
|
import {Component, EventEmitter, Input, Output} from '@angular/core';
import {Command} from '../../command-executor.service';
@Component({
selector: 'app-command-list',
templateUrl: './command-list.component.html'
})
export class CommandListComponent {
@Input() commands: Array<Command>;
@Output() run = new EventEmitter<Command>();
@Output() edit = new EventEmitter<Command>();
@Output() delete = new EventEmitter<Command>();
@Output() add = new EventEmitter();
get canItemBeModified(): boolean {
return this.delete.observers.length > 0 && this.edit.observers.length > 0;
}
get canItemBeAdded(): boolean {
return this.add.observers.length > 0;
}
}
|
In the much-discussed new book "The Dangerous Case of Donald Trump" (Salon review here), mental health professionals act on their “duty to warn” of the imminent danger the Trump presidency represents. As explained in its prologue, “Collectively with our coauthors, we warn that anyone as mentally unstable as Mr. Trump simply should not be entrusted with the life-and-death powers of the presidency.”
But what about journalists’ duty to warn? Press freedom’s stature, as enshrined in the First Amendment, is predicated on its importance in preserving all the other freedoms, in exposing and warning of violations and threats. As the contributors to "The Dangerous Case" make clear, Trump represents a threat unlike any America has ever seen in a sitting president, except perhaps for Richard Nixon in the last few weeks of his presidency. So journalists’ duty to warn should be clear. As Robert Jay Lifton reminds us in the "Dangerous Case" foreword, the larger framework of professional ethics that motivates mental health professionals to speak out also “applies to members of other professions who may have their own ‘duty to warn.’"
Journalists have a similar responsibility, and bear even more of it, given how flawed journalistic practices helped create our nation's current dire predicament. By adopting a set of conventions that undermined their civic commitment and even usefulness, journalists themselves have helped pave the way for Trump’s emergence, so the admonition, “Reporter, heal thyself!” is clearly in order.
Two critics of journalism struck me as particularly helpful in finding our footing, so I reached out to both for comments to build on what they’d already done. The first is NYU professor of journalism Jay Rosen, proprietor of the long-running Pressthink blog. The second is James Fallows, longtime national correspondent for the Atlantic and author of the 1996 book, "Breaking the News: How the Media Undermine American Democracy." Both have had cogent responses to Trump, but it’s their appreciation of how we got here that’s even more valuable.
During the campaign, Fallows wrote “Trump Time Capsules” – 152 in all – described as "a running chronicle ... of how Trump has broken the norms that applied to previous major-party candidates." It was an example of beat reporting at its best – but not the kind of beat that had ever been possible before. As I’m about to argue, that’s just what we need much more of in order to fulfill our duty to warn as journalists.
“I think Trump is revising the presidency, in some profound and disturbing ways that go beyond the natural evolution of the office as it passes into different hands,” Rosen told me via email. “I gave an example of that here. So much of what he is doing is unprecedented in the modern era, as James Fallows of the Atlantic documented during the campaign. I think Fallows proved that this could itself be a beat.”
As for Rosen himself, in a recent post, “Normalizing Trump: An incredibly brief explainer,” he points out that journalists covering Trump are painfully aware of his incompetence: “He isn’t good at anything a president has to do…. He doesn’t know anything about the issues. … He doesn’t care to learn. ... Nothing he says can be trusted. ... His ‘model’ of leadership is humiliation of others.”
Their code requires them to report all this, but it also calls on them “to respect the voters’ choice, as well as the American presidency, of which they see themselves a vital part, as well as the beat, the job of White House reporting. The two parts of the code are in conflict,” and that conflict is quite painful. They flee from it, if they can, which is why we see so many interpretive attempts to normalize Trump. “What they have to report brings ruin to what they have to respect,” he concludes. “So they occasionally revise it into something they can respect: at least a little.”
Rosen provides even better insight about how we got here in the first place. He is perhaps best known for his critique of “The View from Nowhere,” a pretense to pseudo-objectivity based on being neither left nor right. "American journalists have almost a lust for the View from Nowhere because they think it has more authority than any other possible stance," he later explained. That authority is unearned, he argues: “In journalism, real authority starts with reporting. Knowing your stuff, mastering your beat, being right on the facts, digging under the surface of things, calling around to find out what happened, verifying what you heard. ... Illuminating a murky situation because you understand it better than almost anyone.”
A classic example of how wildly distorted “balanced” journalism can be — and how far at odds with factual reporting — is PolitiFact’s 2011 “Lie of the Year.” In 2009, Republicans won the title with their lie about Obamacare’s nonexistent “death panels.” The following year, Republicans won again, with their false claims that Obamacare was a "government takeover of health care." By then, the gods of balance had gone crazy, so in 2011, PolitiFact awarded Democrats their “lie of the year,” for the entirely factual claim that Republicans were trying to end Medicare with Paul Ryan’s plan to voucherize it — and then slash the program's subsidy shares over time.
“This is really awful,” Paul Krugman wrote in response. “PolitiFact, which is supposed to police false claims in politics, has announced its Lie of the Year — and it’s a statement that happens to be true, the claim that Republicans have voted to end Medicare.” Even a writer at the National Review agreed: “I don’t think any of these examples rise to the level of ‘lie,’ much less ‘Lie of the Year,’” wrote Robert VerBruggen, who supported the Ryan Plan.
"[T]he people at PolitiFact are terrified of being considered partisan if they acknowledge the clear fact that there's a lot more lying on one side of the political divide than on the other,” Krugman went on to say. “So they've bent over backwards to appear 'balanced' -- and in the process made themselves useless and irrelevant. Way to go, guys."
Five years later, the imbalance in lying had gotten much worse, and brought us President Trump, even though PolitiFact’s 2015 “Lie of the Year” was “the campaign misstatements of Donald Trump.” (That’s right: “Misstatements.” SAD!)
Decades before all this, in "Breaking the News," Fallows highlighted two main developments that are especially relevant here: the devaluing of beat reporting — which ties in directly with Rosen’s critique — and the shift to treating politics as a contest or sporting event, which makes it both more entertaining and easier to cover without getting all bogged down in boring facts. The two are interrelated, as Fallow described when I asked him how the devaluing of beat reporting helped make Trump possible — and how it continues to make him more dangerous.
“Long before Trump, political reporters have had a natural preference for the how of public issues, rather than the what,” Fallows told me. “Can the Republicans win on their immigration (or tax cut or environmental) bill, versus what will that bill actually do. Those political angles are important -- but it's a matter of proportion and emphasis, and the main force for balance was the 'beat' reporters who could put the what of the issues into context.”
In short, political reporters were ripe targets for Trump’s exploitation, following a pattern I described last summer, except that they were ripe targets as a class, rather than as individuals.
“In his personal traits and in his rise, Donald Trump represents taking the how of politics to an extreme," Fallows said. "He is all about the way he presents himself and his promises and his complaints, rather than the specifics of any actual program. Back in 'Breaking the News,' I argued that journalism had to fight to maintain the distinction between ‘news’ and ‘entertainment,’ because on a pure contest for eyeballs and attention, outright entertainment would always win. That's what entertainment is for!”
Two decades later, things have only gotten worse. “As our politics has reached this extreme in pure posturing and showmanship, with all the consequences on the way we pay taxes and go to war and raise our children, the pressure is all the greater on journalism to keep explaining the what of our world,” Fallows said. “The pressure and responsibility are greater, when the resources are more stretched than they've been in many years.”
One way to cope is through leaner, more focused, innovative beat reporting, of the sort Fallows himself did during the campaign with his Trump Time Capsules. But perhaps the best example of that came from Toronto Star reporter Daniel Dale’s daily tallies of Trump’s lies, which he also dispersed on Twitter, beginning in September 2016. I asked Dale how and why he got started.
“I did it because I didn’t feel like the frequency of Trump’s dishonesty was being sufficiently communicated by mainstream media coverage,” Dale wrote by email. “Reporters were doing a decent job calling out his deception on Twitter, but if you were just to read their final story or watch their final segment on the evening news, the lying wouldn’t usually make the cut – the story would be ‘Trump talked health care today’ rather than ‘Trump said 20 false things today.’ I wanted to focus on the dishonesty itself, as I thought it was its own story.”
The end results can look so neat and tidy, but it’s a serious job, with plenty of work behind it. “It’s simply very time-consuming,” Dale told me. “He is averaging 2.8 false claims per day, more than four per day over the last month. It takes a while to check all of them.”
It also comes with its own set of specific challenges. “There’s an obvious fatigue factor here,” he said. “Trump frequently tells the same lies over and over; it’s harder to get people to care the 20th time than it is the first,” Dale noted. “You also risk coming across as a pedant or scold when you repeatedly call him out for misstating figures, for example. (I argue that the little lies can be just as revealing as the big ones, and that we shouldn’t let anything slide.)”
With so many lies, and so many reporters, why was it Dale who stepped up, I wondered. “My experience covering late Toronto mayor Rob Ford and his brother, fellow politician Doug Ford, helped prepare me for this,” Dale explained. “They were both regularly and flagrantly dishonest, and I called them out on that. My newspaper enthusiastically supported me in doing so, which is not always the case, and I knew I could do the same with Trump.”
Neither the Fords nor Trump are unprecedented. Sarah Palin, Michele Bachmann, Newt Gingrich … the list goes on and on. Perhaps if someone like Dale had been around in 1994 when Gingrich was lying his way to the speakership, we’d be living in a very different world today. We should bear that in mind, while still focused on our current situation.
Fallows and Rosen both appreciate what Dale has done. “I think Daniel Dale is doing an important job, both in his own work and as an example,” Fallows said. “In his own work, the indefatigable chronicle of daily lies is an important part of the historical record, for the real-time version of history we are living through and for those looking back.” There was an obvious parallel with his own work on the Trump Time Capsule series, Fallows noted.
“I think he's been very effective at documenting that the president of the United States doesn't care if what he says is true,” Rosen added. “This deserves to be a beat because it's unprecedented that a stream of falsehoods, many of them easy to check, moves outward from the office of the president into American life. That this is not considered a problem by the current White House is itself an amazing fact.”
As for broader impacts, “Other journalists have regularly retweeted my fact-checks on Twitter, which is good,” Dale said. “Some have used my corrections to begin correcting the false claims themselves when Trump has uttered the same thing again.”
But these are still the exceptions, not the rule. “I think there’s a reluctance on the part of editors and media entities – more than reporters themselves – to frequently declare the president a liar,” Dale reflected. “It’s still seen as a departure from journalistic norms. I’d argue that it’s a basic part of our job.”
Fallows has a similar view. “As an example, it emphasizes the crucial distinction between emotional outrage, which usually makes it harder to get a message across, and intellectual relentlessness,” he said. “It doesn't help anyone to have reporters yelling about the latest lie, boast or threat that has come from Donald Trump. But it is important to keep saying: This is not normal, this is not true, this is dangerous.” And, he went on to note, “David Fahrenthold's work in the [Washington] Post has had a similar effect: emotionally calm, intellectually relentless.”
I asked Dale how he hoped others would build on his work. “I’d like to see it become standard to call out the president’s lying at all times,” Dale replied. “Specifically, I think it should be a daily part of our coverage of the president. When there is a presidential debate, media outlets deploy teams of fact-checkers, for good reason – and yet there is almost no fact-checking included in the daily coverage of the president’s interviews and rallies.” Perhaps with most presidents this could be understandable — shading truths has long been much more common than outright lies. But Trump is not other presidents.
“For example, when he does a Fox News interview and makes more than 10 false claims, the mainstream coverage will not mention that,” Dale said. “Fact-checking is still mostly relegated to PolitiFact, Factcheck.org and the Washington Post’s fact checker rather than considered a core component of the coverage, as it should be.”
Beyond that is the question of what new journalistic beats may be called for to cover Trump and his impacts in ways that actually make sense of what is going on. Along these lines, Rosen suggested, “I think it would be useful to have a 'Republicans who cannot countenance Trump' beat, to regularize the kind of reporting seen here.” Instead, we’ve actually had the opposite — a “Trump voter beat” with a seemingly endless stream of stories in the New York Times and elsewhere, which has helped obscure the fact that Trump’s support among his base has slowly but steadily eroded.
Other beats are clearly possible too. Trump’s lies are but one facet of his troubling, threatening pattern of behavior. As "The Dangerous Case of Donald Trump" suggests, the president engages in troubling behavior almost every day. And he does not act in a vacuum. He and his appointees — and even his lack of appointees — are profoundly changing American government and governance, as Rosen says.
Trump is actually building on a long history here. Wide systemic attacks on scientists and other experts in government, as well as attacks on established decision-making practices cannot be adequately understood in isolation. Dismantling government is a multi-agency initiative of the Trump administration, which requires reporters covering multiple beats in order to grasp what is actually going on.
Beyond that, Trump represents a threat to liberal democracy, in concert with a wide range of political actors across the globe. The resurgence of right-wing, authoritarian political parties and governments is a worldwide phenomenon that deserves to be covered as a beat. There is a significant literature on the emergence of authoritarian regimes in the post-Cold War era, not only from journalists, and historians, but also from a psychiatric perspective. Frederick Burkle of the Harvard Humanitarian Initiative, for instance, recently published a paper on the subject, "Antisocial Personality Disorder and Pathological Narcissism in Prolonged Conflicts and Wars of the 21st Century."
There are some notable journalists whose work is informed by these international developments, but mostly as freelance writers who lack the sustained institutional support that gives real power to what they do. Connecting the psychological dimension with the political is especially challenging without a team-of-experts approach that even the best individual journalists would find difficult to master. If we are to grasp what is happening around us — and respond effectively to protect our country, and the best of what it stands for -- then we will need these new forms of beat reporting.
We're a long way from having the kind of journalism we need to respond adequately to the threats we face on a national and global scale. But there are signs of progress, and reasons for optimism. Consider this recent Daniel Dale thread about Donald Trump's propensity to become "hilariously lost in his own lying." That's the kind of reporting our president has made possible. |
// Source repo: rajesh-ibm-power/trusted-connector
import { HttpClient, HttpHeaders } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { Observable } from 'rxjs';
import { map } from 'rxjs/operators';
import { environment } from '../../environments/environment';
import { Configuration } from './configuration';
import { Settings } from './settings.interface';
@Injectable()
export class ConnectionConfigurationService {
constructor(private readonly http: HttpClient) { }
public storeConfiguration(config: Configuration): Observable<string> {
return this.http.post(environment.apiURL + '/config/connectionConfigs/' + encodeURIComponent(config.connection),
JSON.stringify(config.settings), {
headers: new HttpHeaders({ 'Content-Type': 'application/json' }),
responseType: 'text'
});
}
public getConfiguration(connection: string): Observable<Configuration> {
return this.http.get<Settings>(environment.apiURL + '/config/connectionConfigs/' + encodeURIComponent(connection))
.pipe(map(res => new Configuration(connection, res)));
}
public getAllConfiguration(): Observable<Configuration[]> {
return this.http.get<object>(environment.apiURL + '/config/connectionConfigs')
.pipe(map(configMap => Object.keys(configMap)
.map(key => new Configuration(key, configMap[key]))));
}
}
|
// Package chelsea exports the Animal Crossing villager Chelsea.
package chelsea
|
from .baseeffect import BaseEffect
from .returneffect import ReturnEffect
from .endbattleeffect import EndBattleEffect
from .experienceeffect import ExperienceEffect
from ..action import ActionType
class FaintEffect(BaseEffect):
    def __init__(self, scene, target):
        super().__init__(scene)
        self.spd_on_action = 1000
        self.target = target

    def on_action(self):
        self.scene.board.no_skip(
            f"{self.scene.board.get_actor(self.target).name} fainted",
            particle="",
        )
        end = False
        self.scene.on_faint_effects(self.target)
        self.scene.add_effect(ReturnEffect(self.scene, self.target))
        self.scene.remove_action_effects(self.target)
        self.scene.board.set_can_fight(self.target, False)
        if not self.scene.board.has_fighter(self.target[0]):
            end = True
        if self.target[0] == 1:
            # award experience if an enemy fainted
            self.scene.add_effect(
                ExperienceEffect(
                    self.scene,
                    (0, self.scene.board.get_active(0)),
                    self.scene.board.get_actor(self.target),
                )
            )
        if end:
            self.scene.add_effect(EndBattleEffect(self.scene))
        self.scene.board.fainted = True
        self.scene.force_action(self.target[0], ActionType.SENDOUT)
        return True, False, False

    def on_faint(self, target):
        return False, False, False
|
/* MH mail create mailbox
* Accepts: mail stream
* mailbox name to create
* Returns: T on success, NIL on failure
*/
long mh_create (MAILSTREAM *stream,char *mailbox)
{
char *s,tmp[MAILTMPLEN];
sprintf (tmp,"Can't create mailbox %.80s: invalid MH-format name",mailbox);
if (mailbox[0] == '#' && (mailbox[1] == 'm' || mailbox[1] == 'M') &&
(mailbox[2] == 'h' || mailbox[2] == 'H') && mailbox[3] == '/')
for (s = mailbox + 4; s && *s;) {
if (isdigit (*s)) s++;
else if (*s == '/') s = NIL;
else if (s = strchr (s+1,'/')) s++;
else tmp[0] = NIL;
}
if (tmp[0]) {
mm_log (tmp,ERROR);
return NIL;
}
if (mh_isvalid (mailbox,tmp,NIL)) {
sprintf (tmp,"Can't create mailbox %.80s: mailbox already exists",mailbox);
mm_log (tmp,ERROR);
return NIL;
}
if (!mh_path) return NIL;
if (!(mh_file (tmp,mailbox) &&
dummy_create_path (stream,strcat (tmp,"/"),
get_dir_protection (mailbox)))) {
sprintf (tmp,"Can't create mailbox %.80s: %s",mailbox,strerror (errno));
mm_log (tmp,ERROR);
return NIL;
}
return T;
} |
// RegisterTable registers a Table with the Schema by name
func (s *Schema) RegisterTable(n schema.Name, t table.Table) error {
if err := s.register(n); err != nil {
return err
}
return s.tables.Register(n, t)
} |
/**
* Config ApiBoot Logging Discovery Instance
* Draw the list of ApiBoot Logging Admin addresses from the registry
* and report the request log through load balancing
*/
@Data
public static class DiscoveryInstance {
/**
* ApiBoot Logging Admin Spring Security Username
*/
private String username;
/**
* ApiBoot Logging Admin Spring Security User Password
*/
private String password;
/**
* ApiBoot Logging Admin Service ID
*/
private String serviceId;
} |
///verbirgt bzw. zeigt den ersten Block
void BlockListWidget::setStartBlockHidden(bool hide)
{
qDebug() << "BlockListWidget::setStartBlockHidden, " << hide;
this->item(0)->setHidden(hide);
} |
// jpa/odata-jpa-processor-cb/src/main/java/com/sap/olingo/jpa/processor/cb/impl/SqlSubQuery.java
package com.sap.olingo.jpa.processor.cb.impl;
enum SqlSubQuery {
EXISTS("EXISTS"),
SOME("SOME"),
ALL("ALL"),
ANY("ANY");
private String keyWord;
private SqlSubQuery(final String keyWord) {
this.keyWord = keyWord;
}
@Override
public String toString() {
return keyWord;
}
}
|
// hadoop_yarn/applicationmaster_service.pb.go
// Code generated by protoc-gen-go.
// source: applicationmaster_protocol.proto
// DO NOT EDIT!
package hadoop_yarn
import proto "github.com/golang/protobuf/proto"
import json "encoding/json"
import math "math"
import "github.com/noelcosgrave/gohadoop"
import hadoop_ipc_client "github.com/noelcosgrave/gohadoop/hadoop_common/ipc/client"
import yarn_conf "github.com/noelcosgrave/gohadoop/hadoop_yarn/conf"
import "github.com/nu7hatch/gouuid"
// Reference proto, json, and math imports to suppress error if they are not otherwise used.
var _ = proto.Marshal
var _ = &json.SyntaxError{}
var _ = math.Inf
var APPLICATION_MASTER_PROTOCOL = "org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB"
func init() {
}
type ApplicationMasterProtocolService interface {
RegisterApplicationMaster(in *RegisterApplicationMasterRequestProto, out *RegisterApplicationMasterResponseProto) error
FinishApplicationMaster(in *FinishApplicationMasterRequestProto, out *FinishApplicationMasterResponseProto) error
Allocate(in *AllocateRequestProto, out *AllocateResponseProto) error
}
type ApplicationMasterProtocolServiceClient struct {
*hadoop_ipc_client.Client
}
func (c *ApplicationMasterProtocolServiceClient) RegisterApplicationMaster(in *RegisterApplicationMasterRequestProto, out *RegisterApplicationMasterResponseProto) error {
return c.Call(gohadoop.GetCalleeRPCRequestHeaderProto(&APPLICATION_MASTER_PROTOCOL), in, out)
}
func (c *ApplicationMasterProtocolServiceClient) FinishApplicationMaster(in *FinishApplicationMasterRequestProto, out *FinishApplicationMasterResponseProto) error {
return c.Call(gohadoop.GetCalleeRPCRequestHeaderProto(&APPLICATION_MASTER_PROTOCOL), in, out)
}
func (c *ApplicationMasterProtocolServiceClient) Allocate(in *AllocateRequestProto, out *AllocateResponseProto) error {
return c.Call(gohadoop.GetCalleeRPCRequestHeaderProto(&APPLICATION_MASTER_PROTOCOL), in, out)
}
func DialApplicationMasterProtocolService(conf yarn_conf.YarnConfiguration) (ApplicationMasterProtocolService, error) {
clientId, _ := uuid.NewV4()
ugi, _ := gohadoop.CreateSimpleUGIProto()
serverAddress, _ := conf.GetRMSchedulerAddress()
c := &hadoop_ipc_client.Client{ClientId: clientId, Ugi: ugi, ServerAddress: serverAddress}
return &ApplicationMasterProtocolServiceClient{c}, nil
}
/*
// DialApplicationMasterProtocolService connects to an ApplicationMasterProtocolService at the specified network address.
// DialApplicationMasterProtocolServiceTimeout connects to an ApplicationMasterProtocolService at the specified network address.
func DialApplicationMasterProtocolServiceTimeout(network, addr string,
timeout time.Duration) (*ApplicationMasterProtocolServiceClient, *rpc.Client, error) {
c, err := protorpc.DialTimeout(network, addr, timeout)
if err != nil {
return nil, nil, err
}
return &ApplicationMasterProtocolServiceClient{c}, c, nil
}
*/
|
package versions_test
import (
. "rabbitmq-upgrade-preparation/versions"
. "github.com/onsi/ginkgo/extensions/table"
. "github.com/onsi/ginkgo"
. "github.com/onsi/gomega"
)
var _ = Describe("Versions", func() {
Describe("RabbitVersions", func() {
DescribeTable("upgrade preparation required",
func(deployedVersion, desiredVersion string) {
versions := &RabbitVersions{Desired: desiredVersion, Deployed: deployedVersion}
Expect(versions.PreparationRequired()).To(BeTrue())
},
Entry("3.4.4.1 to 3.6.3 requires upgrade preparation", "3.4.4.1", "3.6.3"),
Entry("3.4.4.1 to 3.6.1.904 requires upgrade preparation", "3.4.4.1", "3.6.1.904"),
Entry("3.5.7 to 3.6.3 requires upgrade preparation", "3.5.7", "3.6.3"),
Entry("3.6.1.904 to 3.6.6 requires upgrade preparation", "3.6.1.904", "3.6.6"),
Entry("3.6.3 to 3.6.6 requires upgrade preparation", "3.6.3", "3.6.6"),
Entry("3.6.5 to 3.6.6 requires upgrade preparation", "3.6.5", "3.6.6"),
Entry("3.6.3 to 3.6.7 requires upgrade preparation", "3.6.3", "3.6.7"),
Entry("3.6.5 to 3.6.7 requires upgrade preparation", "3.6.5", "3.6.7"),
Entry("3.6.5 to 3.7.0 requires upgrade preparation", "3.6.5", "3.7.0"),
Entry("3.6.6 to 3.7.0 requires upgrade preparation", "3.6.6", "3.7.0"),
Entry("3.6.6 to 3.6.7 requires upgrade preparation", "3.6.6", "3.6.7"),
Entry("3.6.6 to 3.6.8 requires upgrade preparation", "3.6.6", "3.6.8"),
Entry("3.6.6 to 3.6.9 requires upgrade preparation", "3.6.6", "3.6.9"),
)
DescribeTable("upgrade preparation not required",
func(deployedVersion, desiredVersion string) {
versions := &RabbitVersions{Desired: desiredVersion, Deployed: deployedVersion}
Expect(versions.PreparationRequired()).To(BeFalse())
},
Entry("3.6.1.904 to 3.6.1.904 requires no upgrade preparation", "3.6.1.904", "3.6.1.904"),
Entry("3.6.1.904 to 3.6.3 requires no upgrade preparation", "3.6.1.904", "3.6.3"),
Entry("3.6.3 to 3.6.3 requires no upgrade preparation", "3.6.3", "3.6.3"),
Entry("3.6.3 to 3.6.5 requires no upgrade preparation", "3.6.3", "3.6.5"),
Entry("3.6.5 to 3.6.5 requires no upgrade preparation", "3.6.5", "3.6.5"),
Entry("3.6.6 to 3.6.6 requires no upgrade preparation", "3.6.6", "3.6.6"),
Entry("3.6.9 to 3.6.9 requires no upgrade preparation", "3.6.9", "3.6.9"),
Entry("3.6.6.903 to 3.6.7 requires no upgrade preparation", "3.6.6.903", "3.6.7"),
Entry("3.7.0 to 3.7.0 requires no upgrade preparation", "3.7.0", "3.7.0"),
)
Describe("UpgradeMessage", func() {
It("returns the upgrade message", func() {
versions := &RabbitVersions{Desired: "3.6.6-rc1", Deployed: "3.6.5"}
Expect(versions.UpgradeMessage()).To(Equal("It looks like you are trying to upgrade from RabbitMQ 3.6.5 to RabbitMQ 3.6.6-rc1"))
})
})
Describe("malformed versions", func() {
Context("when the desired version of RabbitMQ is malformed", func() {
It("returns an error", func() {
versions := &RabbitVersions{Desired: "malformed-version", Deployed: "3.6.5"}
_, err := versions.PreparationRequired()
Expect(err).To(MatchError("The desired version of RabbitMQ is malformed: malformed-version"))
})
})
Context("when the deployed version of RabbitMQ is malformed", func() {
It("returns an error", func() {
versions := &RabbitVersions{Desired: "3.6.5", Deployed: "malformed-version"}
_, err := versions.PreparationRequired()
Expect(err).To(MatchError("The deployed version of RabbitMQ is malformed: malformed-version"))
})
})
})
})
Describe("ErlangVersions", func() {
It("detects a change in Erlang if there is a major version bump", func() {
versions := &ErlangVersions{Desired: "18.1", Deployed: "17"}
Expect(versions.PreparationRequired()).To(BeTrue())
})
It("detects no change in Erlang if there is a minor change", func() {
versions := &ErlangVersions{Desired: "18.1", Deployed: "18"}
Expect(versions.PreparationRequired()).To(BeFalse())
})
It("detects no change in Erlang if there is no change", func() {
versions := &ErlangVersions{Desired: "18.1", Deployed: "18.1"}
Expect(versions.PreparationRequired()).To(BeFalse())
})
Describe("UpgradeMessage", func() {
It("returns the upgrade message", func() {
versions := &ErlangVersions{Desired: "18.3.4.1", Deployed: "18.3"}
Expect(versions.UpgradeMessage()).To(Equal("It looks like you are trying to upgrade from Erlang 18.3 to Erlang 18.3.4.1"))
})
})
})
})
|
Volkswagen has always been brilliant in their advertising and marketing campaigns. Always funny and a little weird, the ads you see today have evolved from some really good material of the past. Here are some of the neatest VW ads from the classic era.
It isn’t so.
Just to be clear, Volkswagen cars are not wound up by key, and the winding key mod was not, and is not, standard. How funny that they needed to clarify that! However, there are a lot of really cool things that are standard and this ad lays them out.
Think small.
VW has always been proud of their Beetle, and many of the vintage ads were geared towards highlighting the appeal of this little car. This one highlights the economical awesomeness that keeps the Beetle from being just another novelty.
It’s ugly, but it gets you there.
This one is a little on the bizarre side, but it has a point: by comparing its lineup to a spacecraft, Volkswagen wants you to know it’s not all about looks.
A Car of Many Colors
This multi-colored Beetle shows you that many of the parts from the classic VWs were interchangeable from one year to the next.
Bigger than the biggest, smaller than the smallest.
Not too big, not too small, Volkswagen offered a vehicle that was just the right size.
Dead Bug
Making a play on the bug’s name, Volkswagen wanted you to know they’d never kill the Beetle…however, they did put it into a coma for a while there.
What is it?
We have no idea. Couldn’t even begin to guess what this classic VW is toting around, or if they were ever able to get it out…
It’s a Bus
This ad highlights the styling and functionality cues taken from a bus to make this iconic Volkswagen.
Get Funky With It!
Volkswagen gets it, if you own a bug, you’re probably a little unique, and this ad encourages you to take over where they left off!
Take Your Teddy Bear and Stuff It!
Into the Bus that is!
Which VW ad inspired you to buy your first Volkswagen? |
/** Add a set-quota record to the edit log
 *
 * @param src the string representation of the path to a directory
 * @param nsQuota the namespace quota (limit on the number of names) for the directory
 * @param dsQuota the diskspace quota (directory size limit) for the directory
 */
public void logSetQuota(String src, long nsQuota, long dsQuota) {
SetQuotaOp op = SetQuotaOp.getInstance();
op.set(src, nsQuota, dsQuota);
logEdit(op);
} |
// platform/shared/xruby/src/com/xruby/compiler/codedom/ExpressionStatement.java
/**
* Copyright 2005-2007 <NAME>
* Distributed under the BSD License
*/
package com.xruby.compiler.codedom;
import java.util.ArrayList;
public class ExpressionStatement extends Statement {
private Expression expression;
public ExpressionStatement(Expression exp) {
expression = exp;
}
public Expression getExpression() {
return expression;
}
public void accept(CodeVisitor visitor) {
this.expression.addLineno(visitor);
expression.accept(visitor);
}
void getNewlyAssignedVariables(ISymbolTable symboltable, ArrayList<String> result) {
expression.getNewlyAssignedVariables(symboltable, result);
}
void pullBlock(ArrayList<Block> result) {
expression.pullBlock(result);
}
}
|
def ratio(self, n1, n2, explain=0):
    weight_normal_form = 3.0
    weight_normal_form_if_one_name_is_in_initials = old_div(weight_normal_form, 4)
    weight_normal_form_soundex = 9.0
    weight_normal_form_soundex_if_one_name_is_in_initials = old_div(weight_normal_form_soundex, 4)
    weight_geslachtsnaam1 = 7.0
    weight_geslachtsnaam2 = 7.0
    weight_initials = 2
    # the initials weigh double if one of the names consists of initials
    weight_initials_if_one_name_is_in_initials = weight_initials * 2

    nf1 = n1.guess_normal_form()
    nf2 = n2.guess_normal_form()
    nf1 = to_ascii(nf1)
    nf2 = to_ascii(nf2)
    ratio_normal_form = self.average_distance(split(nf1), split(nf2))

    nf1 = remove_stopwords(nf1)
    nf2 = remove_stopwords(nf2)
    se1 = n1.soundex_nl(nf1, group=2, length=-1)
    se2 = n2.soundex_nl(nf2, group=2, length=-1)
    ratio_normal_form_soundex = self.average_distance(se1, se2)

    g1 = n1.geslachtsnaam()  # geslachtsnaam = surname
    g2 = n2.geslachtsnaam()
    g1 = to_ascii(g1)
    g2 = to_ascii(g2)
    g1_soundex = n1.soundex_nl(g1, group=2, length=-1)
    g2_soundex = n2.soundex_nl(g2, group=2, length=-1)
    ratio_geslachtsnaam1 = self.average_distance(g1_soundex, g2_soundex)
    ratio_geslachtsnaam2 = self.average_distance(
        re.split(r'[ \.\,\-]', g1.lower()),
        re.split(r'[ \.\,\-]', g2.lower()),
        self.levenshtein_ratio)

    if len(n1.initials()) == 1 or len(n2.initials()) == 1:
        weight_initials = 0
        ratio_initials = .5
    elif n1.contains_initials() or n2.contains_initials():
        ratio_initials = self.levenshtein_ratio(n1.initials().lower(), n2.initials().lower())
        weight_initials = weight_initials_if_one_name_is_in_initials
    elif len(n1.initials()) > 1 and len(n2.initials()) > 1:
        ratio_initials = self.levenshtein_ratio(n1.initials().lower(), n2.initials().lower())
    else:
        ratio_initials = 0.7

    if n1.contains_initials() or n2.contains_initials():
        weight_normal_form = weight_normal_form_if_one_name_is_in_initials
        weight_normal_form_soundex = weight_normal_form_soundex_if_one_name_is_in_initials

    try:
        # teller = numerator, noemer = denominator
        teller = (ratio_normal_form * weight_normal_form
                  + ratio_normal_form_soundex * weight_normal_form_soundex
                  + ratio_geslachtsnaam1 * weight_geslachtsnaam1
                  + ratio_geslachtsnaam2 * weight_geslachtsnaam2
                  + ratio_initials * weight_initials)
        noemer = (weight_normal_form + weight_normal_form_soundex
                  + weight_initials + weight_geslachtsnaam1 + weight_geslachtsnaam2)
        final_ratio = old_div(teller, noemer)
    except ZeroDivisionError:
        return 0.0

    if explain:
        s = '-' * 100 + '\n'
        s += 'Naam1: %s [%s] [%s] %s\n' % (n1, n1.initials(), n1.guess_normal_form(), se1)
        s += 'Naam2: %s [%s] [%s] %s\n' % (n2, n2.initials(), n2.guess_normal_form(), se2)
        s += 'Similarity ratio: %s\n' % final_ratio
        s += '--- REASONS' + '-' * 30 + '\n'
        format_s = '%-30s | %-10s | %-10s | %-10s | %-10s | %-10s\n'
        s += format_s % ('\t property', ' ratio', ' weight', 'relative_weight', ' r*w', 'r * relative_w')
        s += '\t' + '-' * 100 + '\n'
        format_s = '\t%-30s | %-10f | %-10f | %-10f | %-10f | %-10f\n'
        s += format_s % (' normal_form', ratio_normal_form, weight_normal_form, old_div(weight_normal_form, teller), ratio_normal_form * weight_normal_form, old_div(ratio_normal_form * weight_normal_form, teller))
        s += format_s % ('soundex van normal_form', ratio_normal_form_soundex, weight_normal_form_soundex, old_div(weight_normal_form_soundex, teller), ratio_normal_form_soundex * weight_normal_form_soundex, old_div(ratio_normal_form_soundex * weight_normal_form_soundex, teller))
        s += format_s % ('soundex van geslachtsnaam1', ratio_geslachtsnaam1, weight_geslachtsnaam1, old_div(weight_geslachtsnaam1, teller), ratio_geslachtsnaam1 * weight_geslachtsnaam1, old_div(ratio_geslachtsnaam1 * weight_geslachtsnaam1, teller))
        s += format_s % ('geslachtsnaam', ratio_geslachtsnaam2, weight_geslachtsnaam2, old_div(weight_geslachtsnaam2, teller), ratio_geslachtsnaam2 * weight_geslachtsnaam2, old_div(ratio_geslachtsnaam2 * weight_geslachtsnaam2, teller))
        s += format_s % ('initials', ratio_initials, weight_initials, old_div(weight_initials, teller), ratio_initials * weight_initials, old_div(ratio_initials * weight_initials, teller))
        s += '\tTOTAL (teller = numerator) | %s (noemer = denominator = %s)\n' % (teller, noemer)
        return s
    return final_ratio |
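The final score in `ratio()` above is a plain weighted mean of the per-feature similarity ratios (`teller`/`noemer` is Dutch for numerator/denominator). A minimal, self-contained sketch of that combination step — the function name and example weights here are illustrative, not part of the original module:

```python
def weighted_mean(ratios, weights):
    """Combine per-feature similarity ratios into one score,
    mirroring the teller/noemer step: sum(r_i * w_i) / sum(w_i)."""
    numerator = sum(r * w for r, w in zip(ratios, weights))
    denominator = sum(weights)
    if denominator == 0:  # mirrors the ZeroDivisionError guard above
        return 0.0
    return numerator / denominator

# e.g. three features scored 1.0, 0.5 and 0.0 with weights 3, 9 and 2
score = weighted_mean([1.0, 0.5, 0.0], [3.0, 9.0, 2.0])  # (3 + 4.5 + 0) / 14
```

Setting a feature's weight to 0 (as `ratio()` does for `weight_initials` when a name has a single initial) simply removes it from the numerator while its slot remains in the list.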
// Key loop (for driving the robot)
void Teleop::keyLoop()
{
char c;
bool dirty = false;
tcgetattr(kfd, &cooked);
memcpy(&raw, &cooked, sizeof(struct termios));
raw.c_lflag &= ~(ICANON | ECHO);
raw.c_cc[VEOL] = 1;
raw.c_cc[VEOF] = 2;
tcsetattr(kfd, TCSANOW, &raw);
ROS_INFO("Robot Teleoperation: Reading from keyboard");
ROS_INFO("------------------------------------------");
ROS_INFO("Use arrow keys to move the robot.");
ROS_INFO("LEFT | RIGHT control direction (15 degree steps)");
ROS_INFO("UP | DOWN control speed (0.1 m/s steps)");
while (true) {
if (read(kfd, &c, 1) < 0) {
perror("read():");
exit(-1);
}
switch (c) {
case KEYCODE_L:
ROS_DEBUG("LEFT");
angular_ = 1.0;
dirty = true;
rot_angle_ += 15.0;
break;
case KEYCODE_R:
ROS_DEBUG("RIGHT");
angular_ = -1.0;
dirty = true;
rot_angle_ -= 15.0;
break;
case KEYCODE_U:
ROS_DEBUG("UP");
linear_ = 1.0;
dirty = true;
robot_speed += 0.1;
break;
case KEYCODE_D:
ROS_DEBUG("DOWN");
linear_ = -1.0;
dirty = true;
robot_speed -= 0.1;
break;
case KEYCODE_Q:
ROS_DEBUG("Stop/Pause");
linear_ = 0.0;
dirty = true;
robot_speed = 0.0;
}
// wrap the accumulated angle back into [-360, 360]
if (rot_angle_ > 360.0 || rot_angle_ < -360.0)
rot_angle_ = fmod(rot_angle_, 360.0);
ROS_INFO("Current Speed, Angle [%f, %f]", robot_speed, rot_angle_);
double vx = cos(rot_angle_ * M_PI / 180.0);
double vy = sin(rot_angle_ * M_PI / 180.0);
double stepx = robot_speed * vx;
double stepy = robot_speed * vy;
pedsim_msgs::AgentState astate;
std_msgs::Header header_;
header_.stamp = ros::Time::now();
astate.header = header_;
astate.type = 2;
astate.twist.linear.x = stepx;
astate.twist.linear.y = stepy;
if (dirty == true) {
vel_pub_.publish(astate);
dirty = false;
}
}
return;
} |
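The original `acos(cos(rot_angle_))` normalization in `keyLoop` mixed degrees with radians and could only map into [0, 180]; a modulo wrap is the usual fix. A small sketch of the wrap (written in Python for brevity; `wrap_angle` is a hypothetical helper, not part of the teleop node):

```python
import math

def wrap_angle(deg):
    """Wrap an angle in degrees into the half-open range [0, 360).

    The double fmod keeps negative inputs positive: fmod(-15, 360)
    is -15 in Python/C, so we shift by 360 and reduce once more.
    """
    return math.fmod(math.fmod(deg, 360.0) + 360.0, 360.0)
```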
# ABC 066 B: length of the longest even-length proper prefix of S
# that is the concatenation of two identical strings
S = input()
m = 0
for i, s in enumerate(S):
    tmp = S[:-1 - i]  # drop at least one trailing character
    tmp1 = tmp[:len(tmp) // 2]
    tmp2 = tmp[len(tmp) // 2:]
    if len(tmp) % 2 == 0 and tmp1 == tmp2:
        m = max(m, len(tmp))
print(m) |
// zhanibekrysbek/rft_sensor_serial
#include "RFT_IF_PACKET_Rev1.2.h"
#include <string.h>
CRT_RFT_IF_PACKET::CRT_RFT_IF_PACKET()
{
memset(m_rcvd_product_name, 0x00, sizeof(m_rcvd_product_name));
memset(m_rcvd_serial_number, 0x00, sizeof(m_rcvd_serial_number));
memset(m_rcvd_firmware_version, 0x00, sizeof(m_rcvd_firmware_version));
m_rcvd_curr_RX_ID = 0; // current sensor's RX ID, Only CAN version
m_rcvd_curr_TX_ID_1 = 0; // current sensor's TX ID #1, Only CAN version
m_rcvd_curr_TX_ID_2 = 0; // current sensor's TX ID #2, Only CAN version
m_rcvd_set_RX_ID = 0; // setting value of sensor's RX ID, Only CAN version
m_rcvd_set_TX_ID_1 = 0; // setting value of sensor's TX ID #1, Only CAN version
m_rcvd_set_TX_ID_2 = 0; // setting value of sensor's TX ID #2, Only CAN version
m_rcvd_curr_comm_baudrate = 0; // current baudrate
m_rcvd_set_comm_baudrate = 0; // setting value of baudrate
m_rcvd_filter_type = 0;
m_rcvd_filter_setting_value = 0;
m_rcvd_tx_frq = 0;
m_response_cmd = 0;
m_response_result = 0;
m_response_errcode = 0;
memset(m_rcvdForce, 0x00, sizeof(m_rcvdForce));
m_rcvdForceStatus = 0;
memset(m_rcvdOverloadCnt, 0x00, sizeof(m_rcvdOverloadCnt));
m_fDividerForce = FORCE_DIVIDER;
m_fDividerTorque = TORQUE_DIVIDER;
}
CRT_RFT_IF_PACKET::~CRT_RFT_IF_PACKET()
{
}
void CRT_RFT_IF_PACKET::setDivider(float force, float torque)
{
m_fDividerForce = force;
m_fDividerTorque = torque;
}
bool CRT_RFT_IF_PACKET::DFG_read_product_name(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_GET_PRODUCT_NAME;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_read_serial_number(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_GET_SERIAL_NUMBER;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_read_firmware_version(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_GET_FIRMWARE_VER;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_read_message_ID(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_GET_ID;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_read_comm_baudrate(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_GET_COMM_BAUDRATE;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_read_filter_type(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_GET_FT_FILTER;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_read_force_once(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_FT_ONCE;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_read_force(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_FT_CONT;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_read_output_frq(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_GET_CONT_OUT_FRQ;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_read_overload_count(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_GET_OVERLOAD_COUNT;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_set_message_ID(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE], unsigned char rxID, unsigned char txID_1, unsigned char txID_2)
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_SET_ID;
data_field[1] = rxID;
data_field[2] = txID_1;
data_field[3] = txID_2;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_set_comm_baudrate(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE], unsigned char baud_type)
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_SET_COMM_BAUDRATE;
data_field[1] = baud_type;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_set_filter_type(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE], unsigned char filter_type, unsigned char set_value)
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_SET_FT_FILTER;
data_field[1] = filter_type;
data_field[2] = set_value;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_set_stop_force_out(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE])
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_FT_CONT_STOP;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_set_output_frq(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE], unsigned char frq_type)
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_SET_CONT_OUT_FRQ;
data_field[1] = frq_type;
return true;
}
bool CRT_RFT_IF_PACKET::DFG_set_bias(unsigned char data_field[COMMAND_PACKET_DATA_FIELD_SIZE], unsigned char is_on)
{
if (data_field == 0) // pointer validation checking.
return false;
data_field[0] = CMD_SET_BIAS;
data_field[1] = is_on;
return true;
}
void CRT_RFT_IF_PACKET::convertPacket_To_Force(unsigned char *rcvdPacket, float *force, float forceDivider, float torqueDivider)
{
short raw;
unsigned short temp;
for (int idx = 0; idx < RFT_NUM_OF_FORCE; idx++)
{
temp = rcvdPacket[2 * idx] * 256 + rcvdPacket[2 * idx + 1];
raw = (signed short)temp;
if (idx < 3) // force
{
force[idx] = ((float)raw) / forceDivider;
}
else // torque
{
force[idx] = ((float)raw) / torqueDivider;
}
}
// force status - overload.....
m_rcvdForceStatus = rcvdPacket[12];
}
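`convertPacket_To_Force` above reads six big-endian signed 16-bit words (high byte * 256 + low byte, reinterpreted as `signed short`) and scales forces and torques by separate dividers. The same decoding as a standalone sketch in Python; the divider values here are placeholders — the real ones come from the sensor's datasheet:

```python
import struct

FORCE_DIVIDER = 50.0     # placeholder scale factor, not the real value
TORQUE_DIVIDER = 1000.0  # placeholder scale factor, not the real value

def decode_force_packet(payload, force_div=FORCE_DIVIDER, torque_div=TORQUE_DIVIDER):
    """Decode Fx, Fy, Fz, Tx, Ty, Tz from a 13-byte force frame.

    '>6h' means six big-endian signed shorts - the same reinterpretation
    the C++ code performs with (signed short)(hi * 256 + lo).
    """
    raw = struct.unpack('>6h', payload[:12])
    force = [r / force_div for r in raw[:3]]
    torque = [r / torque_div for r in raw[3:]]
    status = payload[12]  # overload flags, as in m_rcvdForceStatus
    return force + torque, status
```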
bool CRT_RFT_IF_PACKET::rcvd_data_field_processing(unsigned char data_field[RESPONSE_PACKET_DATA_FIELD_SIZE], unsigned char check_command_type)
{
if (data_field == 0) // pointer validation checking.
return false;
unsigned char rcvd_cmd_type = data_field[0]; // command type
if (check_command_type != 0)
{
if (rcvd_cmd_type != check_command_type)
return false;
}
bool prcs_result = true;
switch (rcvd_cmd_type)
{
case CMD_GET_PRODUCT_NAME:
for (int idx = 0; idx < RFT_PRODUCT_NAME_LENGTH; idx++)
m_rcvd_product_name[idx] = data_field[idx + 1];
break;
case CMD_GET_SERIAL_NUMBER:
for (int idx = 0; idx < RFT_SERIAL_NUMBER_LENGTH; idx++)
m_rcvd_serial_number[idx] = data_field[idx + 1];
break;
case CMD_GET_FIRMWARE_VER:
for (int idx = 0; idx < RFT_FIRMWARE_VER_LENGTH; idx++)
m_rcvd_firmware_version[idx] = data_field[idx + 1];
break;
case CMD_SET_ID:
m_response_cmd = rcvd_cmd_type;
m_response_result = data_field[1];
m_response_errcode = data_field[2];
break;
case CMD_GET_ID:
m_rcvd_curr_RX_ID = data_field[1]; // current sensor's RX ID, Only CAN version
m_rcvd_curr_TX_ID_1 = data_field[2]; // current sensor's TX ID #1, Only CAN version
m_rcvd_curr_TX_ID_2 = data_field[3]; // current sensor's TX ID #2, Only CAN version
m_rcvd_set_RX_ID = data_field[4]; // setting value of sensor's RX ID, Only CAN version
m_rcvd_set_TX_ID_1 = data_field[5]; // setting value of sensor's TX ID #1, Only CAN version
m_rcvd_set_TX_ID_2 = data_field[6]; // setting value of sensor's TX ID #2, Only CAN version
break;
case CMD_SET_COMM_BAUDRATE:
m_response_cmd = rcvd_cmd_type;
m_response_result = data_field[1];
m_response_errcode = data_field[2];
break;
case CMD_GET_COMM_BAUDRATE:
m_rcvd_curr_comm_baudrate = data_field[1]; // current baudrate
m_rcvd_set_comm_baudrate = data_field[2]; // setting value of baudrate
break;
case CMD_SET_FT_FILTER:
m_response_cmd = rcvd_cmd_type;
m_response_result = data_field[1];
m_response_errcode = data_field[2];
break;
case CMD_GET_FT_FILTER:
m_rcvd_filter_type = data_field[1];
m_rcvd_filter_setting_value = data_field[2];
break;
case CMD_FT_ONCE:
m_response_cmd = rcvd_cmd_type;
//convertPacket_To_Force((&data_field[1]), m_rcvdForce, FORCE_DIVIDER, TORQUE_DIVIDER);
convertPacket_To_Force((&data_field[1]), m_rcvdForce, m_fDividerForce, m_fDividerTorque);
break;
case CMD_FT_CONT:
m_response_cmd = rcvd_cmd_type;
//convertPacket_To_Force((&data_field[1]), m_rcvdForce, FORCE_DIVIDER, TORQUE_DIVIDER);
convertPacket_To_Force((&data_field[1]), m_rcvdForce, m_fDividerForce, m_fDividerTorque);
break;
case CMD_FT_CONT_STOP:
// NO - RESPONSE PACKET
break;
case CMD_RESERVED_1:
break;
case CMD_RESERVED_2:
break;
case CMD_SET_CONT_OUT_FRQ:
m_response_cmd = rcvd_cmd_type;
m_response_result = data_field[1];
m_response_errcode = data_field[2];
break;
case CMD_GET_CONT_OUT_FRQ:
m_rcvd_tx_frq = data_field[1];
break;
case CMD_SET_BIAS:
break;
case CMD_GET_OVERLOAD_COUNT:
m_response_cmd = rcvd_cmd_type;
for (int i = 0; i < RFT_NUM_OF_FORCE; i++)
{
m_rcvdOverloadCnt[i] = data_field[i + 1];
}
break;
default:
prcs_result = false;
break;
}
return prcs_result;
}
///////////////////////////////////////////////////////////////////////////////////////////
// functions for UART communication, checksum calculation, UART Packet generation
unsigned char CRT_RFT_IF_PACKET::calcChecksum(unsigned char *pkt_buff, int pkt_size)
{
unsigned char checksum = 0;
for (int idx = 1; idx < (pkt_size - 2); idx++) // except SOP, CHECKSUM and EOP
checksum += pkt_buff[idx];
return checksum;
}
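`calcChecksum` sums every byte between SOP and the CHECKSUM/EOP trailer, relying on `unsigned char` arithmetic for the mod-256 truncation. The same computation in Python needs that truncation made explicit — a sketch for illustration:

```python
def calc_checksum(pkt):
    """Sum all bytes except SOP (first) and the CHECKSUM/EOP trailer
    (last two), truncated to 8 bits like unsigned char addition."""
    return sum(pkt[1:-2]) & 0xFF
```

The C loop's `idx = 1 .. size - 3` range corresponds exactly to the `pkt[1:-2]` slice.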
// UPG(Uart Packet Generation) functions
bool CRT_RFT_IF_PACKET::UPG_read_product_name(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_read_product_name(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_read_serial_name(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_read_serial_number(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_read_firmware_version(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_read_firmware_version(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_read_comm_baudrate(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_read_comm_baudrate(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_read_filter_type(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_read_filter_type(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_read_force_once(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_read_force_once(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_read_force(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_read_force(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_read_output_frq(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_read_output_frq(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_read_overload_count(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_read_overload_count(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_set_comm_baudrate(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE], unsigned char baud_type)
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_set_comm_baudrate(packet_buff + 1, baud_type);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_set_filter_type(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE], unsigned char filter_type, unsigned char set_value)
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_set_filter_type(packet_buff + 1, filter_type, set_value);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_set_stop_force_out(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE])
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_set_stop_force_out(packet_buff + 1);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_set_output_frq(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE], unsigned char frq_type)
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_set_output_frq(packet_buff + 1, frq_type);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UPG_set_bias(unsigned char packet_buff[UART_COMMAND_PACKET_SIZE], unsigned char is_on)
{
if (packet_buff == 0)
return false;
// buffer clear
for (int idx = 0; idx < UART_COMMAND_PACKET_SIZE; idx++)
packet_buff[idx] = 0x00;
packet_buff[0] = SOP;
DFG_set_bias(packet_buff + 1, is_on);
packet_buff[UART_COMMAND_PACKET_SIZE - 2] = calcChecksum(packet_buff, UART_COMMAND_PACKET_SIZE);
packet_buff[UART_COMMAND_PACKET_SIZE - 1] = EOP;
return true;
}
bool CRT_RFT_IF_PACKET::UART_packet_processing(unsigned char packet_buff[UART_RESPONSE_PACKET_SIZE], unsigned char check_command_type)
{
if (packet_buff == 0)
return false;
unsigned char checksum = calcChecksum(packet_buff, UART_RESPONSE_PACKET_SIZE);
// Accept the packet only if framing (SOP/EOP) and checksum are valid.
if ((packet_buff[0] == SOP)
&& (packet_buff[UART_RESPONSE_PACKET_SIZE - 2] == checksum)
&& (packet_buff[UART_RESPONSE_PACKET_SIZE - 1] == EOP))
{
return rcvd_data_field_processing(packet_buff + 1, check_command_type);
}
return false;
}
// END OF FILE
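All of the UPG_* builders above share one frame layout: an SOP byte, a command/data field, a checksum computed while the tail bytes are still zeroed, and an EOP byte. A minimal Python sketch of that framing follows; the real calcChecksum, SOP, EOP and UART_COMMAND_PACKET_SIZE are not shown in this excerpt, so a simple XOR checksum and placeholder values are assumed here.

```python
PACKET_SIZE = 8        # assumed stand-in for UART_COMMAND_PACKET_SIZE
SOP, EOP = 0x02, 0x03  # assumed start/end-of-packet markers

def calc_checksum(buf):
    # Placeholder checksum: XOR of every byte except the checksum/EOP slots.
    x = 0
    for b in buf[:PACKET_SIZE - 2]:
        x ^= b
    return x

def build_packet(data_field):
    buf = [0x00] * PACKET_SIZE              # buffer clear
    buf[0] = SOP
    buf[1:1 + len(data_field)] = data_field  # command/data field
    buf[PACKET_SIZE - 2] = calc_checksum(buf)
    buf[PACKET_SIZE - 1] = EOP
    return buf

def validate_packet(buf):
    # Mirrors UART_packet_processing: check SOP, checksum, then EOP.
    return (buf[0] == SOP
            and buf[PACKET_SIZE - 2] == calc_checksum(buf)
            and buf[PACKET_SIZE - 1] == EOP)

pkt = build_packet([0x10, 0x01])
print(validate_packet(pkt))   # True
```

A frame corrupted in transit fails the checksum comparison and is rejected, which is exactly the branch that returns false in the C++ receiver.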
/**
* Selects one of the points.
*
* @param pointIndex
* the index of the point to select or -1 to deselect all.
*/
public void selectPoint(int pointIndex) {
@Nullable
ChartPoint[] points = this.points;
if (pointIndex > -1 && pointIndex >= points.length) {
throw new IndexOutOfBoundsException();
}
int lastIndex = this.selectedChartPointIndex;
if (pointIndex != lastIndex) {
if (lastIndex > -1) {
ChartPoint oldPoint = points[lastIndex];
oldPoint.setSelected(false);
}
this.selectedChartPointIndex = pointIndex;
if (pointIndex > -1) {
ChartPoint newPoint = points[pointIndex];
if (newPoint.getValue() < 0.0f) {
this.selectedChartPointIndex = -1;
} else {
newPoint.setSelected(true);
}
}
requestRender();
}
}
// packages/12-redux/todo-redux/src/application/store/store.ts
import { Action, applyMiddleware, createStore } from 'redux'
import { rootReducer } from './root-reducer'
import thunk, { ThunkAction } from 'redux-thunk'
const enhancers = applyMiddleware(thunk)
export const store = createStore(rootReducer, enhancers)
export type RootState = ReturnType<typeof rootReducer>
export type AppDispatch = typeof store.dispatch
export type Thunk<ReturnType = void> = ThunkAction<ReturnType, RootState, unknown, Action<string>>
def descriptor_one_image(imagem, feature_extraction_method, list_of_parameters):
    """Compute the feature vector of a single image with the chosen descriptor."""
    if feature_extraction_method == 'glcm':
        features = glcm.glcm(imagem, [], [], int(list_of_parameters[0]), int(list_of_parameters[1]), int(list_of_parameters[2]))
    elif feature_extraction_method == 'fotf':
        features = histogram.histogram(imagem, [], [])
    elif feature_extraction_method == 'lbp':
        features = lbp.lbpTexture(imagem, [], [], 8 * int(list_of_parameters[0]), int(list_of_parameters[0]))
    elif feature_extraction_method == 'hog':
        features = hog_rom.HOG(imagem, [], [], int(list_of_parameters[0]), int(list_of_parameters[1]))
    elif feature_extraction_method == 'daisy':
        features = daisy.daisy_features(imagem, [], [], int(list_of_parameters[0]), int(list_of_parameters[1]), int(list_of_parameters[2]), int(list_of_parameters[3]))
    else:
        raise ValueError('Unknown feature extraction method: %s' % feature_extraction_method)
    return features[0]
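The if/elif chain above can also be written as a dispatch table mapping each method name to a callable, which makes the unknown-method case explicit. This is only a hedged sketch of that design choice: the real glcm/histogram/lbp modules are not available here, so stand-in lambdas illustrate the shape.

```python
# Stand-in extractors; the real ones come from the glcm/histogram/lbp modules.
EXTRACTORS = {
    'glcm': lambda img, p: ('glcm', int(p[0]), int(p[1]), int(p[2])),
    'fotf': lambda img, p: ('fotf',),
    'lbp':  lambda img, p: ('lbp', 8 * int(p[0]), int(p[0])),
}

def descriptor_one_image(image, method, params):
    # Look up the extractor once; unknown names fail loudly instead of
    # silently leaving `features` unbound.
    try:
        extractor = EXTRACTORS[method]
    except KeyError:
        raise ValueError('Unknown feature extraction method: %r' % method)
    return extractor(image, params)

print(descriptor_one_image(None, 'lbp', ['2']))   # ('lbp', 16, 2)
```

Adding a new descriptor then only requires a new dictionary entry, not another elif branch.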
# census-us/python-connectors/census-us_us_census_custom_dataset/connector.py
# -*- coding: utf-8 -*-
from dataiku.connector import Connector
import dataiku
from dataiku import pandasutils as pdu
import pandas as pd
import numpy as np
import os
import census_resources
import common
import census_metadata
class USCensusConnector(Connector):
def __init__(self, config, plugin_config):
Connector.__init__(self, config, plugin_config) # pass the parameters to the base class
# perform some more initialization
self.P_state_list_str = str(self.config.get("param_state_list"))
self.P_STATES_TYPE_NAME = self.config.get("param_state_format")
self.P_CENSUS_CONTENT = self.config.get("param_census_content")
self.P_CENSUS_LEVEL = self.config.get("param_census_level")
self.P_census_fields = str(self.config.get("param_fields"))
self.P_USE_PREVIOUS_SOURCES = self.config.get("param_re_use_collected_census_sources")
self.P_DELETE_US_CENSUS_SOURCES = self.config.get("param_delete_census_sources")
def get_read_schema(self):
fields_list = self.P_census_fields.split(',')
P_CENSUS_TYPE = self.P_CENSUS_CONTENT[:3]
vint = census_resources.dict_vintage_[P_CENSUS_TYPE][self.P_CENSUS_CONTENT]
url_metadata = vint['variables_definitions']
if url_metadata.endswith('.json'):
metadata_results = census_metadata.get_metadata_sources_from_api(url_metadata)
else:
metadata_results = census_metadata.get_metadata_sources(url_metadata)
metadata_status = metadata_results[0]
df_metadata_source = metadata_results[1]
mlist = list(df_metadata_source['name'])
if metadata_status =='ok':
ok_fields_list = [c for c in fields_list if c in mlist]
all_fields_list = ['GEOID_DKU','STUSAB'] + ok_fields_list
else:
print metadata_status
all_fields_list = ['GEOID_DKU','STUSAB'] + fields_list
if self.P_STATES_TYPE_NAME != 'state_2letters':
all_fields_list = all_fields_list + [self.P_STATES_TYPE_NAME]
l=[]
for field in all_fields_list:
d={}
d['name']=field
d['type']='string'
l.append(d)
d_={"columns": l}
return d_
def generate_rows(self, dataset_schema=None, dataset_partitioning=None,
partition_id=None, records_limit = -1):
path_datadir_tmp = os.getenv("DIP_HOME") + '/tmp/'
FOLDER_NAME = 'tmp_census_us_'+ self.P_CENSUS_CONTENT
P_CENSUS_TYPE = self.P_CENSUS_CONTENT[:3]
CENSUS_TYPE = str(census_resources.dict_vintage_[self.P_CENSUS_CONTENT[:3]])
fields_list = self.P_census_fields.split(',')
#----------------------------------------- BASE FOLDER
print '1/6 Creating base folders...'
common.create_folder(path_datadir_tmp,FOLDER_NAME,False)
common.create_folder(path_datadir_tmp + '/' + FOLDER_NAME +'/',self.P_CENSUS_LEVEL,False)
#----------------------------------------- SOURCE HARVESTER
state_list_ = self.P_state_list_str.split(',')
state_conversion = common.state_to_2letters_format(self.P_STATES_TYPE_NAME, state_list_)
state_list = state_conversion[0]
state_list_rejected = state_conversion[1]
dict_states = state_conversion[2]
s_found = len(state_list)
s_rejected = len(state_list_rejected)
print '----------------------------------------'
print 'First diagnostic on input dataset'
print '----------------------------------------'
if s_found >0:
print 'States expected to be processed if enough records for feature selection:'
print state_list
print 'States rejected:'
if s_rejected < 60:
print state_list_rejected
else:
print '...too many elements rejected for displaying it in the log...'
if self.P_USE_PREVIOUS_SOURCES is False:
print '2/6 Collecting US Census Data...'
else:
print '2/6 Re using US Census Data if available...'
sources_collector = common.us_census_source_collector(self.P_USE_PREVIOUS_SOURCES,P_CENSUS_TYPE,self.P_CENSUS_CONTENT,self.P_CENSUS_LEVEL,path_datadir_tmp,FOLDER_NAME,state_list,dict_states)
sumlevel_val= sources_collector[0]
fdef_dir= sources_collector[1]
geo_header_file= sources_collector[2]
dict_pattern_files= sources_collector[3]
#status= sources_collector[4]
geo_header_file_dir = fdef_dir + '/' + geo_header_file
geo_header = pd.read_excel(geo_header_file_dir, sheetname=0, header=0)
census_level_code_len = census_resources.dict_level_corresp['v1'][self.P_CENSUS_LEVEL]['code_len']
print '4/6 Generating census...'
final_output_df = pd.DataFrame()
for state in state_list:
print 'Processing this state: %s' % (state)
state_dir = path_datadir_tmp + FOLDER_NAME+'/'+ state
if self.P_CENSUS_LEVEL in ('TRACT','BLOCK_GROUP'):
ziptocollect = dict_pattern_files['v1']['TB']
state_dir_level = state_dir +'/'+ 'TRACT_BG_SEG'
else:
ziptocollect = dict_pattern_files['v1']['OT']
state_dir_level = state_dir +'/'+ 'NO_TRACT_BG_SEG'
ustate = state.upper()
state_name = dict_states[state]['attributes']['state_fullname_w1']
state_number = dict_states[state]['attributes']['state_2digits']
vint = census_resources.dict_vintage_[P_CENSUS_TYPE][self.P_CENSUS_CONTENT]
master_segment_file = state_dir_level + '/' + vint['master_segment_file_pattern']+ vint['vintage_pattern']+state+'.csv'
geo_source_df = pd.read_csv(master_segment_file, sep =',', header = None, names = geo_header.columns)
geo_level_df = geo_source_df[geo_source_df['SUMLEVEL'].isin(sumlevel_val)].copy()
geo_level_df['GEOID_DKU'] = geo_level_df['GEOID'].map(lambda x: x.split('US')[1])
geo_level_df[self.P_CENSUS_LEVEL] = geo_level_df['GEOID_DKU'].map(lambda x: x[:census_level_code_len])
keep_cols = ['FILEID','SUMLEVEL','GEOID_DKU','STUSAB','LOGRECNO']
geo_level_df = geo_level_df[keep_cols]
geo_level_df['STUSAB'] = geo_level_df['STUSAB'].map(lambda x: x.lower()) ## basically the state lower
del geo_level_df['FILEID']
del geo_level_df['SUMLEVEL']
### added
n=0
for fr in os.listdir(state_dir_level):
if fr.startswith(vint['segments_estimations_files_pattern']):
n+=1
segment_list=[]
for i in range(1,n+1):
if i < 10:
segment_list.append('000' + str(i))
if i in range(10,100):
segment_list.append('00' + str(i))
if i >= 100:
segment_list.append('0' + str(i))
nb_segments = len(segment_list)
i=0
for segment_number in segment_list:
i=i+1
print 'Processing segment: %s/%s' % (i,nb_segments)
template_fields_def = census_resources.dict_vintage_[P_CENSUS_TYPE][self.P_CENSUS_CONTENT]['fields_definition']
seq_folder_name = template_fields_def['folder_name']
## For taking into account that some vintage like ACS52013 does not have a structure with the template and a folder
## If no template, recreate the same structure as the alternative one.
if seq_folder_name=='':
seq_folder_name = template_fields_def['geo_header_template_folder_name']
HEADER_PATH_FILE = fdef_dir + '/'+ seq_folder_name +'/Seq' + str(int(segment_number)) + '.xls'
header_df = pd.read_excel(HEADER_PATH_FILE,sheetname=0) ### 0 = 'E'
### Adjust the header to fit what we need.
kh_list = ['FILEID', 'FILETYPE', 'STUSAB', 'CHARITER', 'SEQUENCE', 'LOGRECNO']
f_list = [x for x in header_df.columns if x not in kh_list]
E_list = [x + 'E' for x in f_list]
newcolz_list = kh_list + E_list
t_ = [c for c in newcolz_list if c in fields_list]
if len(t_) >0:
SEGMENT_PATH_FILE = state_dir_level + '/' + vint['segments_estimations_files_pattern']+ vint['vintage_pattern'] + state + segment_number + '000.txt'
segment_df = pd.read_csv(SEGMENT_PATH_FILE, sep = ',', names = newcolz_list,low_memory=False)
out_list = kh_list + t_
out_list.remove('FILEID')
out_list.remove('FILETYPE')
out_list.remove('CHARITER')
out_list.remove('SEQUENCE')
segment_df = segment_df[out_list]
geo_level_df = pd.merge(left= geo_level_df, right= segment_df, how='inner', left_on = ['STUSAB','LOGRECNO'], right_on = ['STUSAB','LOGRECNO'])
print '-------------- volumes check------------------'
print geo_level_df.groupby('STUSAB').size()
print 'Check Tallies here :'
print 'https://www.census.gov/geo/maps-data/data/tallies/tractblock.html'
print '----------------------------------------------'
#del geo_level_df['STUSAB']
del geo_level_df['LOGRECNO']
if self.P_STATES_TYPE_NAME != 'state_2letters':
geo_level_df[self.P_STATES_TYPE_NAME] = dict_states[state]['attributes'][self.P_STATES_TYPE_NAME]
print '5/6 Building final output...'
final_output_df = pd.concat((final_output_df,geo_level_df),axis=0)
if self.P_DELETE_US_CENSUS_SOURCES is True:
print '6/6 Removing US Census temp data from: %s' % (path_datadir_tmp + FOLDER_NAME)
cmd = "rm -rf %s" % (path_datadir_tmp + FOLDER_NAME)
os.system(cmd)
else:
print '6/6 Keeping US Census data sources in: %s' % (path_datadir_tmp + FOLDER_NAME)
for f in os.listdir(path_datadir_tmp + FOLDER_NAME):
if not f.endswith('.zip'):
cmd = "rm -rf %s" % (path_datadir_tmp + FOLDER_NAME + '/' + f)
os.system(cmd)
for i, line in final_output_df.iterrows():
yield line.to_dict()
else:
print '!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!'
print 'US Census CANNOT be built, no states available in your dataset...'
print 'Check the following settings :'
print '-> are the states in the right format regarding the plugin set by the user ?'
print '-> is the column really containing states ?'
print '----------------------------------------'
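The three-branch segment numbering inside generate_rows ('000' + str(i), '00' + str(i), '0' + str(i)) is plain four-digit zero padding and, for segment counts below 1000 as here, is equivalent to str.zfill:

```python
def segment_labels(n):
    # Zero-pad 1..n to four digits, e.g. 1 -> '0001', 123 -> '0123'.
    return [str(i).zfill(4) for i in range(1, n + 1)]

print(segment_labels(3))   # ['0001', '0002', '0003']
```

This collapses the three `if` branches into one expression and removes the risk of the branches drifting apart.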
/**
 * An immutable ExpenseTracker that is serializable to JSON format.
*/
@JsonRootName(value = "expenseTracker")
public class JsonSerializableExpenseTracker {
private final List<JsonAdaptedExpense> expenses = new ArrayList<>();
/**
* Constructs a {@code JsonSerializableExpenseTracker } with the given expenses.
*/
@JsonCreator
public JsonSerializableExpenseTracker(@JsonProperty("expenses") List<JsonAdaptedExpense> expenses) {
this.expenses.addAll(expenses);
}
/**
* Converts a given {@code ReadOnlyExpenseTracker} into this class for Jackson use.
*
* @param source future changes to this will not affect the created {@code JsonSerializableExpenseTracker}.
*/
public JsonSerializableExpenseTracker(ReadOnlyExpenseTracker source) {
expenses.addAll(source.getExpenseList().stream().map(JsonAdaptedExpense::new).collect(Collectors.toList()));
}
/**
* Converts this expenseTracker into the model's {@code ExpenseTracker} object.
* Adds duplicate expenses for recurring expenses.
*
* @throws IllegalValueException if there were any data constraints violated.
*/
public ExpenseTracker toModelType() throws IllegalValueException {
ExpenseTracker expenseTracker = new ExpenseTracker();
LocalDate date = LocalDate.now();
for (JsonAdaptedExpense jsonAdaptedExpense : expenses) {
Expense expense = jsonAdaptedExpense.toModelType();
expenseTracker.addExpense(expense);
while (isRecurringExpense(expense, date) && !isSameMonth(expense, date)) {
expense.getIsFixed().markAsRecurred();
Expense duplicateExpense = createDuplicateExpense(expense, date);
expenseTracker.addExpense(duplicateExpense);
expense = duplicateExpense;
}
}
return expenseTracker;
}
/**
* Checks if the given expense is recurring and if it should be duplicated.
*/
private boolean isRecurringExpense(Expense expense, LocalDate date) {
boolean expenseIsRecurring = expense.getIsFixed().value && expense.getIsFixed().getIsRecurring();
if (!expenseIsRecurring) {
return false;
}
LocalDate expenseRecurringDate = expense.getDate().getLocalDate().plusMonths(1);
return !expenseRecurringDate.isAfter(date);
}
/**
* Creates a duplicate expense of the original fixed recurring expense, with a date one month later.
*/
public Expense createDuplicateExpense(Expense expense, LocalDate date) {
DateTimeFormatter formatter = DateTimeFormatter.ofPattern("dd-MM-yyyy");
Description duplicateDescription = new Description(expense.getDescription().value);
IsFixed duplicateIsFixed = new IsFixed("y");
Amount duplicateValue = new Amount(expense.getValue().value.doubleValue());
Date duplicateDate = new Date(expense.getDate().getLocalDate().plusMonths(1).format(formatter));
Tag duplicateTag = new Tag(expense.getTag().tagName);
return new Expense(duplicateDescription, duplicateIsFixed, duplicateValue, duplicateDate, duplicateTag);
}
/**
* Checks if the recurring expense was made in the current month.
*/
public boolean isSameMonth(Expense expense, LocalDate date) {
Month expenseMonth = expense.getDate().getMonth();
return expenseMonth == date.getMonth();
}
}
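The recurring-expense expansion in toModelType (one duplicate per elapsed month, stopping once the expense reaches the current month) can be sketched independently of the model classes. In this hedged Python sketch, plus_month and monthly_duplicates are illustrative names, and unlike Java's LocalDate.plusMonths no end-of-month clamping is performed:

```python
from datetime import date

def plus_month(d):
    # Naive +1 month; assumes day-of-month <= 28, so no end-of-month
    # clamping is needed (LocalDate.plusMonths clamps, e.g. Jan 31 -> Feb 28).
    y, m = divmod(d.month, 12)
    return date(d.year + y, m + 1, d.day)

def monthly_duplicates(start, today):
    # Mirror of the while-loop in toModelType: keep emitting a copy dated
    # one month later until the next copy would fall after `today`, or the
    # expense has reached the current month (compared by year and month).
    copies = []
    d = start
    while plus_month(d) <= today and (d.year, d.month) != (today.year, today.month):
        d = plus_month(d)
        copies.append(d)
    return copies

print(monthly_duplicates(date(2020, 1, 15), date(2020, 4, 10)))
# [datetime.date(2020, 2, 15), datetime.date(2020, 3, 15)]
```

Note that this sketch compares year and month together, whereas the Java isSameMonth compares only the Month value.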
"""
pylaxz -S arguments
Arguments :
os
os-info
checking everything about the network.
"""
from os import uname as _uname, name as _name
class System:
"""
For System Purposes
Arguments : (os), (os-all), -h
Examples:
$ pylaxz -S os # for short OS information
$ pylaxz -S os-info # for long description about OS
"""
def __init__(self) -> None:
self.arch = _name == "posix"
@property
def __partial(self) -> str:
return f"{_uname()}" if self.arch else "Not supported on Windows yet."
# logxs.printf(os.uname() if self.type else "r u on windows ? omg!" , _int=True)
@property
def __all(self) -> str:
return "Showing all information..."
def info(self, all=False):
return self.__all if all else self.__partial
# @check.setter
# def check(self, value) -> int:
# return None
#include <bits/stdc++.h>
#include <stdio.h>
#define FAST_IO ios_base::sync_with_stdio(false), cin.tie(nullptr)
#define FILE_IO freopen("input.txt", "r", stdin), freopen("output.txt", "w", stdout)
using namespace std;
typedef long long ll;
const int N = 26;
int n;
int tree[200005];
// Each leaf holds a one-hot bitmask of its letter; internal nodes OR their children.
void build() {
for (int i = n - 1; i > 0; i--) tree[i] = tree[2 * i] | tree[2 * i + 1];
}
// Point update: set position p to letter `val`, then repair masks up the tree.
void modify(int p, int val) {
for (tree[p += n] = 1 << val; p /= 2;) tree[p] = tree[2 * p] | tree[2 * p + 1];
}
// OR of letter masks over the half-open range [l, r).
int query(int l, int r) {
int resL = 0, resR = 0;
for (l += n, r += n; l < r; l /= 2, r /= 2) {
if (l & 1) resL = resL | tree[l++];
if (r & 1) resR = resR | tree[--r];
}
return resL | resR;
}
signed main() {
FAST_IO;
//FILE_IO;
string s;
cin >> s;
n = s.length();
for (int i = 0; i < n; i++) tree[i + n] = 1 << (s[i] - 'a');
build();
int q;
cin >> q;
while (q--) {
int type;
cin >> type;
if (type == 1) {
int pos;
char x;
cin >> pos >> x;
pos--;
modify(pos, x - 'a');
s[pos] = x;
} else {
int l, r;
cin >> l >> r;
l--;
cout << __builtin_popcount(query(l, r)) << "\n";
}
}
return 0;
}
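The same iterative (bottom-up) segment tree over letter bitmasks can be sketched in Python, with __builtin_popcount replaced by bin(x).count('1'); DistinctLetters is an illustrative name, not part of the original submission:

```python
class DistinctLetters:
    # Iterative segment tree; each node ORs the letter bitmasks of its children.
    def __init__(self, s):
        self.n = n = len(s)
        self.tree = [0] * (2 * n)
        for i, ch in enumerate(s):
            self.tree[n + i] = 1 << (ord(ch) - ord('a'))
        for i in range(n - 1, 0, -1):
            self.tree[i] = self.tree[2 * i] | self.tree[2 * i + 1]

    def update(self, pos, ch):
        # Point update, then repair the masks on the path to the root.
        i = self.n + pos
        self.tree[i] = 1 << (ord(ch) - ord('a'))
        while i > 1:
            i //= 2
            self.tree[i] = self.tree[2 * i] | self.tree[2 * i + 1]

    def distinct(self, l, r):
        # Number of distinct letters in the half-open range [l, r).
        res = 0
        l += self.n
        r += self.n
        while l < r:
            if l & 1:
                res |= self.tree[l]
                l += 1
            if r & 1:
                r -= 1
                res |= self.tree[r]
            l //= 2
            r //= 2
        return bin(res).count('1')

t = DistinctLetters("abacaba")
print(t.distinct(0, 7))   # 3  (a, b, c)
```

Both update and query are O(log n), matching the C++ version above.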
// reg_import.go
package goql
import (
"sync"
"github.com/fzerorubigd/goql/astdata"
)
type importProvider struct {
cache map[string][]interface{}
lock *sync.Mutex
}
func (v *importProvider) Provide(in interface{}) []interface{} {
v.lock.Lock()
defer v.lock.Unlock()
p := in.(*astdata.Package)
if d, ok := v.cache[p.Path()]; ok {
return d
}
va := p.Imports()
res := make([]interface{}, len(va))
for i := range va {
res[i] = va[i]
}
v.cache[p.Path()] = res
return res
}
type canonicalCol struct{}
func (canonicalCol) Value(in interface{}) String {
im := in.(*astdata.Import)
if im.Canonical() == "" {
return String{Null: true}
}
return String{String: im.Canonical()}
}
type pathCol struct{}
func (pathCol) Value(in interface{}) String {
im := in.(*astdata.Import)
return String{String: im.Path()}
}
type packageCol struct{}
func (packageCol) Value(in interface{}) String {
im := in.(*astdata.Import)
return String{String: im.TargetPackage()}
}
func registerImport() {
RegisterTable("imports", &importProvider{
cache: make(map[string][]interface{}),
lock: &sync.Mutex{},
})
RegisterField("imports", "pkg_name", genericPackageName{})
RegisterField("imports", "pkg_path", genericPackagePath{})
RegisterField("imports", "file", genericFileName{})
RegisterField("imports", "docs", genericDoc{})
RegisterField("imports", "canonical", canonicalCol{})
RegisterField("imports", "path", pathCol{})
RegisterField("imports", "package", packageCol{})
}
func init() {
registerImport()
}
t = int(input())
while t!=0:
n = int(input())
if n%2==0:
count=0
f2=0
while n%2==0:
f1=0
maxi=-1
for i in range(3,int(pow(n,.5)+1)):
if n%i==0:
if i%2==1:
if i>maxi:
maxi=i
if (n//i) %2==1:
if (n//i) >maxi:
maxi=n//i
if maxi!=-1:
n = n//maxi
count+=1
f1=1
if n==2:
f2=1
if f1==0:
n-=1
count+=1
if f2==1:
if count % 2 == 1:
print("Ashishgup")
else:
print("FastestFinger")
else:
if count%2==1:
print("FastestFinger")
else:
print("Ashishgup")
else:
if n==1:
print("FastestFinger")
else:
print("Ashishgup")
t-=1
import dark from "./dark";
function Button(props){
return (
<button style={dark.download}>{props.children}</button>
);
}
export default Button;
//
// PauseState.cpp
// Overworld
//
// Created by <NAME> on 6/29/18.
// Copyright © 2018 Noah. All rights reserved.
//
#include "PauseState.hpp"
PauseState::PauseState(sf::RenderWindow& rw) {
pausedText.setString("Paused");
pausedText.setColor(sf::Color::White);
pausedText.setPosition(100, 100);
pausedText.setFont(resources.getFont(Fonts::Bramble));
rect.setFillColor(sf::Color::Black);
rect.setSize(sf::Vector2f(105,37));
rect.setPosition(90, 100);
audioPlayer.pauseMusic();
backgroundTexture.loadFromImage(rw.capture());
backgroundSprite.setTexture(backgroundTexture);
save("save.bin");
//pause sounds?
}
void PauseState::update(sf::Clock& timer) {
//nothing?
timer.restart();
}
void PauseState::draw(sf::RenderWindow& rw) {
rw.clear(sf::Color::White);
rw.setView(rw.getDefaultView());
rw.draw(backgroundSprite);
rw.draw(rect);
rw.draw(pausedText);
rw.display();
}
void PauseState::handleInput(sf::RenderWindow& rw) {
sf::Event event;
while (rw.pollEvent(event)) {
//if Enter is pressed pop this state;
if (event.type == sf::Event::KeyPressed && event.key.code == sf::Keyboard::Return) {
requestStackPop();
audioPlayer.resumeMusic();
}
}
}
/* $Procedure PARTOF ( Parabolic time of flight ) */
/* Subroutine */
int partof_(doublereal *ma, doublereal *d__)
{
doublereal d__1;
doublereal m;
extern int chkin_(char *, ftnlen);
extern doublereal dcbrt_(doublereal *);
doublereal deriv, deriv2, fn, change;
extern int chkout_(char *, ftnlen);
extern logical return_(void);
if (return_()) {
return 0;
} else {
chkin_("PARTOF", (ftnlen)6);
}
if (*ma == 0.) {
*d__ = 0.;
chkout_("PARTOF", (ftnlen)6);
return 0;
} else {
m = abs(*ma);
}
d__1 = m * 3.;
*d__ = dcbrt_(&d__1);
change = 1.;
/* Second-order Newton iteration solving Barker's equation d + d**3/3 = m. */
while(abs(change) > 1e-13) {
d__1 = *d__;
fn = *d__ + d__1 * (d__1 * d__1) / 3. - m;
d__1 = *d__;
deriv = d__1 * d__1 + 1.;
deriv2 = *d__ * 2.;
d__1 = deriv;
change = fn / deriv * (fn * deriv2 / (d__1 * d__1 * 2.) + 1.);
*d__ -= change;
}
if (*ma < 0.) {
*d__ = -(*d__);
}
chkout_("PARTOF", (ftnlen)6);
return 0;
}
package main
import (
"encoding/json"
"fmt"
"io/ioutil"
"log"
"strings"
"github.com/dadosjusbr/executor"
"github.com/dadosjusbr/executor/status"
"github.com/spf13/pflag"
)
var (
input = pflag.String("in", "", "Path for the descriptor file.")
volumeName = pflag.String("volume-name", "", "Shared volume name.")
volumeDir = pflag.String("volume-dir", "", "Shared volume full path.")
defaultBaseDir = pflag.String("def-base-dir", "", "Base path to search for stages and to place the cloned repositorie")
defaultEnvFlag = pflag.StringSlice("def-run-env", []string{}, "Environment variables that override the default vars.")
)
func main() {
pflag.Parse()
defaultEnv := make(map[string]string)
for _, e := range *defaultEnvFlag {
env := strings.Split(e, ":")
if len(env) != 2 {
log.Fatalf("Invalid env var spec: %s", e)
}
defaultEnv[env[0]] = env[1]
}
if *input == "" {
log.Fatal("Path to the input file not found. Forgot --in?")
}
in, err := ioutil.ReadFile(*input)
if err != nil {
log.Fatalf("Error reading input file: %q", err)
}
var p executor.Pipeline
if err := json.Unmarshal(in, &p); err != nil {
log.Fatalf("Error parsing pipeline from input: %q\n\"%s\"", err, string(in))
}
p.DefaultRunEnv = mergeMaps(p.DefaultRunEnv, defaultEnv) // merging maps.
log.Printf("Pipeline: %+v\n\n", p)
// the flag replaces the pipeline description. Useful at runtime.
if *volumeName != "" {
p.VolumeName = *volumeName
}
if p.VolumeName == "" {
log.Printf("Field volume-name not set, using \"dadosjusbr\"")
p.VolumeName = "dadosjusbr"
}
if *volumeDir != "" {
p.VolumeDir = *volumeDir
}
if p.VolumeDir == "" {
log.Printf("Field volume-dir not set, using \"/output\"")
p.VolumeDir = "/output"
}
if *defaultBaseDir != "" {
p.DefaultBaseDir = *defaultBaseDir
}
log.Printf("Running pipeline %s", p.Name)
result := p.Run()
if result.Status != status.OK {
log.Printf("Error running pipeline %s. Printing result:\n\n", p.Name)
log.Printf("%+v", result)
return
}
log.Printf("Pipeline %s finished successfully! Printing result:\n\n", p.Name)
fmt.Printf("%+v", result)
}
// mergeMaps adds all elements of sec to first.
func mergeMaps(first, sec map[string]string) map[string]string {
if first == nil {
return sec
}
env := make(map[string]string)
for k, v := range first {
env[k] = v
}
for k, v := range sec {
env[k] = v
}
return env
}
// Class that holds the response from the last request
public class Response {
public int responseCode;
public String responseMessage;
public String responseData;
public ArrayList<Exception> exceptions;
public Response() {
this.exceptions = new ArrayList<Exception>();
}
}
/*
* pulseIn Function for the Spark Core - Version 0.1.1 (Beta)
* Copyright (2014) Timothy Brown - See: LICENSE
*
* Due to the current timeout issues with Spark Cloud
* this will return after 10 seconds, even if the
* input pulse hasn't finished.
*
* Input: Trigger Pin, Trigger State
* Output: Pulse Length in Microseconds (10uS to 10S)
*
*/
unsigned long pulseIn(uint16_t pin, uint8_t state) {
GPIO_TypeDef* portMask = (PIN_MAP[pin].gpio_peripheral);
uint16_t pinMask = (PIN_MAP[pin].gpio_pin);
unsigned long pulseCount = 0;
unsigned long loopCount = 0;
unsigned long loopMax = 20000000;
while (GPIO_ReadInputDataBit(portMask, pinMask) != state) {
if (loopCount++ == loopMax) {
return 0;
}
}
while (GPIO_ReadInputDataBit(portMask, pinMask) == state) {
if (loopCount++ == loopMax) {
return 0;
}
pulseCount++;
}
// Empirical factor converting busy-wait loop counts to microseconds.
return pulseCount * 0.405;
}
/**
* This class shows how to use Maui on a single document
* or just a string of text.
* @author alyona
*
*/
public class MauiWrapper {
/** Maui filter object */
private MauiFilter extractionModel = null;
private Vocabulary vocabulary = null;
private Stemmer stemmer;
private Stopwords stopwords;
private String language = "en";
/**
* Constructor, which loads the data
* @param dataDirectory - e.g. Maui's main directory (should have a "data" dir in it)
* @param vocabularyName - name of the rdf vocabulary
* @param modelName - name of the model
*/
public MauiWrapper(String dataDirectory, String vocabularyName, String modelName) {
stemmer = new PorterStemmer();
String englishStopwords = dataDirectory + "data/stopwords/stopwords_en.txt";
stopwords = new StopwordsEnglish(englishStopwords);
String vocabularyDirectory = dataDirectory + "data/vocabularies/";
String modelDirectory = dataDirectory + "data/models";
loadVocabulary(vocabularyDirectory, vocabularyName);
loadModel(modelDirectory, modelName, vocabularyName);
}
/**
* Loads a vocabulary from a given directory
* @param vocabularyDirectory
* @param vocabularyName
*/
public void loadVocabulary(String vocabularyDirectory, String vocabularyName) {
if (vocabulary != null)
return;
try {
vocabulary = new VocabularyJena(vocabularyName, "skos", vocabularyDirectory);
vocabulary.setStemmer(stemmer);
vocabulary.setStopwords(stopwords);
vocabulary.setLanguage(language);
vocabulary.initialize();
} catch (Exception e) {
System.err.println("Failed to load vocabulary!");
e.printStackTrace();
}
}
/**
* Loads the model
* @param modelDirectory
* @param modelName
* @param vocabularyName
*/
public void loadModel(String modelDirectory, String modelName, String vocabularyName) {
try {
BufferedInputStream inStream = new BufferedInputStream(
new FileInputStream(modelDirectory + "/" + modelName));
ObjectInputStream in = new ObjectInputStream(inStream);
extractionModel = (MauiFilter) in.readObject();
in.close();
} catch (Exception e) {
System.err.println("Failed to load model!");
e.printStackTrace();
}
extractionModel.setVocabularyName(vocabularyName);
extractionModel.setVocabularyFormat("skos");
extractionModel.setDocumentLanguage(language);
extractionModel.setStemmer(stemmer);
extractionModel.setStopwords(stopwords);
extractionModel.setVocabulary(vocabulary);
}
/**
* Main method to extract the main topics from a given text
* @param text
* @param topicsPerDocument
* @return
* @throws Exception
*/
public ArrayList<String> extractTopicsFromText(String text, int topicsPerDocument) throws Exception {
if (text.length() < 5) {
throw new Exception("Text is too short!");
}
// extractionModel.setWikipedia(null);
FastVector atts = new FastVector(3);
atts.addElement(new Attribute("filename", (FastVector) null));
atts.addElement(new Attribute("doc", (FastVector) null));
atts.addElement(new Attribute("keyphrases", (FastVector) null));
Instances data = new Instances("keyphrase_training_data", atts, 0);
double[] newInst = new double[3];
newInst[0] = (double) data.attribute(0).addStringValue("inputFile");
newInst[1] = (double) data.attribute(1).addStringValue(text);
newInst[2] = Instance.missingValue();
data.add(new Instance(1.0, newInst));
extractionModel.input(data.instance(0));
data = data.stringFreeStructure();
Instance[] topRankedInstances = new Instance[topicsPerDocument];
Instance inst;
// Iterating over all extracted keyphrases (inst)
while ((inst = extractionModel.output()) != null) {
int index = (int) inst.value(extractionModel.getRankIndex()) - 1;
if (index < topicsPerDocument) {
topRankedInstances[index] = inst;
}
}
ArrayList<String> topics = new ArrayList<String>();
for (int i = 0; i < topicsPerDocument; i++) {
if (topRankedInstances[i] != null) {
String topic = topRankedInstances[i].stringValue(extractionModel
.getOutputFormIndex());
topics.add(topic);
}
}
extractionModel.batchFinished();
return topics;
}
/**
* Triggers topic extraction from a text file
* @param filePath
* @param numberOfTopics
* @return
* @throws Exception
*/
public ArrayList<String> extractTopicsFromFile(String filePath, int numberOfTopics) throws Exception {
File documentTextFile = new File(filePath);
String documentText = FileUtils.readFileToString(documentTextFile);
return extractTopicsFromText(documentText, numberOfTopics);
}
/**
* Main method for testing MauiWrapper
* Add the path to a text file as command line argument
* @param args
*/
public static void main(String[] args) {
    String vocabularyName = "agrovoc_en";
    String modelName = "fao30";
    String dataDirectory = "../Maui1.2/";
    MauiWrapper wrapper = new MauiWrapper(dataDirectory, vocabularyName, modelName);
    String filePath = args[0];
    try {
        ArrayList<String> keywords = wrapper.extractTopicsFromFile(filePath, 15);
        for (String keyword : keywords) {
            System.out.println("Keyword: " + keyword);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
}
Reducing the birefringence of a ‐cut crystal rod using side‐pumping and suitable crystal rotation
The beam quality and output power of high‐power solid‐state lasers are influenced by birefringence. The thermal field inside the laser crystal rod is distributed inhomogeneously because the pump light is absorbed non‐uniformly inside the crystal and the heat sink acts only at the boundaries. Due to the photoelastic effect, this distribution leads to inhomogeneous thermal strains and birefringence inside the rod. Analytical models for calculating the birefringence have previously used plane‐stress and plane‐strain assumptions for an axially symmetric pumped crystal. For an ‐cut crystal, this model yields an axially symmetric birefringence pattern. However, these former models, which rely on the plane‐stress and plane‐strain assumptions, neglect the shear strains in the axial‐radial plane. These shear strains are taken into account by full 3D numerical calculations. The birefringence simulation reveals a threefold symmetry pattern caused by the anisotropic behaviour of the photoelastic tensor, which conflicts with the ideal use of a radially or azimuthally polarized beam. In order to reduce the effect of birefringence, a laser rod pumped on three sides with threefold symmetry is analysed. In this case the absorption is no longer axially symmetric. In regions of the crystal where pumping is stronger, the pump‐light absorption, and consequently the temperature, the strains and the birefringence, are higher. The degree of threefold symmetry of the birefringence is reduced if the region with low birefringence due to the photoelastic effect is pumped more strongly than the rest of the domain. This means the birefringence is affected by rotating the crystal around its ‐axis. The smallest birefringence can be obtained by an optimal rotation with respect to the edges of the crystal. The output beam of this laser device is therefore better suited for generating radial or azimuthal polarizations. (© 2014 Wiley‐VCH Verlag GmbH & Co. KGaA, Weinheim)
/**
 * Save a level state to a string.
 * @param l Input level
 * @return A Base64 string containing the serialized level
 * @throws IOException Exception thrown while serializing
 */
public static String exportLevel(Level l) throws IOException {
    ByteArrayOutputStream bo = new ByteArrayOutputStream();
    try (ObjectOutputStream so = new ObjectOutputStream(bo)) {
        so.writeObject(l);
    }
    // Base64-encode the raw bytes (requires java.util.Base64):
    // decoding them with the platform charset via bo.toString()
    // would corrupt the binary serialization stream.
    return Base64.getEncoder().encodeToString(bo.toByteArray());
}
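Because `ObjectOutputStream` produces raw binary, the serialized bytes need a text-safe encoding such as Base64 to survive a round trip through a `String`. A minimal, self-contained round-trip sketch follows; the nested `Level` class here is a hypothetical stand-in for the real `Level` (whose definition is not shown), and the `importLevel` counterpart is an assumed companion, not part of the original code:

```java
import java.io.*;
import java.util.Base64;

public class LevelCodec {
    // Hypothetical stand-in for the real Level class; assumed Serializable.
    static class Level implements Serializable {
        private static final long serialVersionUID = 1L;
        final String name;
        Level(String name) { this.name = name; }
    }

    // Serialize, then Base64-encode so the binary stream is text-safe.
    static String exportLevel(Level l) throws IOException {
        ByteArrayOutputStream bo = new ByteArrayOutputStream();
        try (ObjectOutputStream so = new ObjectOutputStream(bo)) {
            so.writeObject(l);
        }
        return Base64.getEncoder().encodeToString(bo.toByteArray());
    }

    // Counterpart: decode the Base64 text and deserialize the object.
    static Level importLevel(String s) throws IOException, ClassNotFoundException {
        byte[] bytes = Base64.getDecoder().decode(s);
        try (ObjectInputStream si = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (Level) si.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Level restored = importLevel(exportLevel(new Level("test-level")));
        System.out.println(restored.name);
    }
}
```

The try-with-resources blocks guarantee the streams are flushed and closed even if serialization throws.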
/**
 * On server stop, save every loaded ChunkContainer to the mod database,
 * then unregister all of them.
 * @param e the server-stopping event
 */
@Listener
public void onServerStop(GameStoppingServerEvent e) {
    ChunkContainerRegistry.getInstance().getRawMap().forEach((key, value) -> saveChunk(value));
    ChunkContainerRegistry.getInstance().clear();
}
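The listener above delegates to `saveChunk`, whose body is not shown. A hedged sketch of the save-then-clear shutdown pattern follows, using plain Java serialization to files and hypothetical stand-ins for the mod's `ChunkContainer` and registry (the real mod presumably persists to its database instead):

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

public class ChunkPersistenceSketch {
    // Hypothetical stand-in for the mod's ChunkContainer.
    static class ChunkContainer implements Serializable {
        private static final long serialVersionUID = 1L;
        final String key;
        ChunkContainer(String key) { this.key = key; }
    }

    // Minimal registry mirroring ChunkContainerRegistry's role:
    // a keyed map flushed to disk and then cleared on shutdown.
    static final Map<String, ChunkContainer> registry = new HashMap<>();

    // Sketch of saveChunk: serialize one container to its own file.
    static void saveChunk(ChunkContainer c, Path dir) throws IOException {
        Path file = dir.resolve(c.key + ".dat");
        try (ObjectOutputStream out = new ObjectOutputStream(Files.newOutputStream(file))) {
            out.writeObject(c);
        }
    }

    // Mirrors onServerStop: save every container, then unregister all.
    static void shutdown(Path dir) throws IOException {
        for (ChunkContainer c : registry.values()) {
            saveChunk(c, dir);
        }
        registry.clear();
    }

    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("chunks");
        registry.put("0_0", new ChunkContainer("0_0"));
        shutdown(dir);
        System.out.println(Files.exists(dir.resolve("0_0.dat")) && registry.isEmpty());
    }
}
```

Saving before clearing matters: once the registry is cleared, any unsaved container is lost for the rest of the shutdown.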
Protective Role of Polysaccharides from Gynostemma pentaphyllum Makino Supplementation against Exhaustive Swimming Exercise-Induced Oxidative Stress in Liver and Skeletal Muscle of Mice
The objective of this study was to investigate the protective role of polysaccharides from Gynostemma pentaphyllum Makino (PGP) supplementation against exhaustive swimming exercise-induced oxidative stress. A total of 48 mice were randomly divided into four groups: control, low-dose, medium-dose and high-dose PGP supplementation groups. The control group received distilled water and the supplementation groups received different doses of PGP (50, 100 and 200 mg/kg body weight) by gavage once a day for 28 consecutive days. After 28 days, the mice performed an exhaustive swimming exercise, and biochemical parameters related to oxidative stress, including superoxide dismutase (SOD), glutathione peroxidase (GPx), catalase (CAT) and malondialdehyde (MDA), were measured. The results showed that PGP supplementation could increase SOD, GPx and CAT contents, as well as decrease MDA contents, in the liver and skeletal muscle of mice, which suggests that PGP supplementation protects against exhaustive swimming exercise-induced oxidative stress.
def show_how_score():
    a = tk.Tk()
    how_score = yah.show_how_to_score()
    # One label per line of the scoring instructions.
    for i in range(len(how_score)):
        tk.Label(a, text=how_score[i]).grid(row=i, column=0)
    # destroy() closes the window; withdraw() would only hide it and
    # leave the mainloop running.
    tk.Button(a, text="Return", command=a.destroy).grid(row=len(how_score) + 1, column=0)
    a.mainloop()
The European Union and the Chinese ministry of science and technology have had a private row about an EU promise to help China finance feasibility studies for a carbon capture and storage site.
To remedy the dispute, the EU is proposing that instead of a €7-million grant to fund the studies, the EU would provide funds to support a "policy dialogue".
Documents released by the EU commission showed that the EU-China cooperation on carbon capture and storage has suffered significant delays (Photo: Peter Teffer)
The EU has invited China to discuss the issue in Brussels next year.
The dispute is the tentative anti-climax of an EU-China cooperation programme that began 12 years ago.
An investigation by EUobserver, based in part on newly revealed information after an access to documents request, showed that the project has been marred by delays.
Concrete action
At a summit in Beijing in September 2005, the EU and China established a Partnership on Climate Change.
The context was this: China was on its way to becoming the largest emitter in the world - a position it has held ever since - because of its reliance on coal, the fossil fuel with the highest CO2 intensity.
It was four years before the failed climate summit in Copenhagen, and a decade before the first truly global treaty on climate change was signed in Paris.
China was very much reluctant to commit to climate action, arguing that it was still a developing country that was not responsible for the historic (Western) emissions which led to global warming.
According to a European Commission press release, the focus of the partnership would be on "concrete action".
The two sides agreed that they would "develop and demonstrate, in China and the EU, advanced 'zero-emissions' coal technology".
In a memorandum of understanding (MoU) signed a year later, the two sides said they would do this by setting up a carbon capture and storage (CCS) project.
The programme was divided into three phases, and final operation of an EU co-financed CCS installation in China was set for 2020.
Three phases
The first, preparatory phase was concluded in 2009 at a China-EU conference in Beijing. It was led by the United Kingdom, which spent some £2.8 million (€3.2m) on it.
Also that year, the two sides signed another MoU, describing phase two, during which a feasibility study of a CCS demonstration project should be completed.
The MoU said that the commission "considered" financially supporting phase-two activities with up to €7 million. The two sides aimed to complete the second phase by 2012.
However, voices of impatience began to make themselves heard that year.
The House of Lords in the UK investigated broader EU-China relations, for which they invited John Ashton, the then foreign secretary's special representative on climate change.
On 23 April 2009, he gave testimony noting that a big challenge would be how to fund the actual construction.
"If you build a carbon capture and storage plant at commercial scale, you are talking about an additional cost at this current demonstration phase of the order of hundreds of millions of euros, and the question is - where is that additional cost going to come from?", he said.
He noted that China would say that Europe should pay for it.
"There will be a question that we need to address quite soon about the extent to which European taxpayers in the end will be willing to pay … We have not answered that question yet. If you were to ask me how the implementation stage will be paid for, I do not have an answer to it. It is a question that is bothering me a lot," Ashton said.
In their final report, which came out in March 2010, the Lords said they were "sceptical that the current pace of development, and the lack of committed funding, will lead to a successful and timely outcome".
Only the EU commission had pledged to pay €50 million for the actual construction, with estimates for the final costs between €300 million and €550 million.
Despite this, the then centre-left UK minister of state in charge of climate change, Philip Hunt, wrote that "a significant amount has already been achieved" which should not be downplayed, and that the initiative could serve "as a potential blueprint" for other projects.
He noted that he hoped to see the plant operational by 2015, a target date which the EU commission had also started using.
Hunt expressed his "belief that we are well placed to deliver a demonstration plant in China in parallel to those in the UK and elsewhere in the EU".
Delays
But the second phase of the project was plagued with setbacks, according to an external evaluation submitted to the commission in 2013, and made public after an access to documents request by this website.
"Progress on the project has been difficult right from the start, due to work programme changes, slow correspondence in communication and intermittent periods of project implementation," the report said.
"The project paused for a period of nearly one year before it was resumed during 2012," it added.
The reason for the one-year pause was "the need to resolve some contractual issues between the Chinese and European partners", it said, although another internal commission document had a different explanation.
It said, without going into detail, that the one-year hiatus was due to "political differences" between China and Norway, which had by then become a co-financer of the project.
Other difficulties included logistical issues, such as delayed visa issuances on both sides.
The first evaluation report also included some more mundane reasons for delays - for instance, two European diplomats not being allowed beyond the reception desk of the ministry of science and technology because of a "delayed communication" about their mission.
In 2015, a commission civil servant wrote in an email that China's ministry of science and technology had confirmed that China was still "much interested" in the CCS cooperation.
Money for momentum
A concept note, dated 21 January 2015, acknowledged that the implementation of the China-EU near-zero emissions coal project "has been slower than expected in the last years, but is still considered a success by all involved parties".
It said that to keep up momentum, a quick decision to finance a feasibility study of one of three CCS sites, to be selected by the Chinese ministry, was "vital". The note also acknowledged that one risk associated with the project was that the third phase "does not find sufficient financing".
However, the note said that risk was "mitigated by China's long-term commitment" to the programme, as well as the climate action promises the country was expected to make in Paris - which it did.
Over the following months, EU civil servants exchanged emails with China to find out which CCS site it would prefer. The EU hired an external expert to evaluate the two Chinese-proposed sites.
Do we accept the risks?
In 2016, the file moved up the commission hierarchy.
The directorate-general Climate Action (DG Clima) wanted to fund the feasibility study with money from the Partnership Instrument, a fund under control of the EU's diplomatic branch - the European External Action Service (EEAS).
Christian Leffler, deputy secretary general at EEAS, and Tung-Lai Margue, director for foreign policy instruments, wrote to Jos Delbeke, director-general of DG Clima, on 9 February 2016.
"We are prepared to consider this funding, despite the significant risks for the EU that are associated with this action, provided that we get commitment and guidance from DG Clima on mitigating measures to address the identified risks in the attached explanatory note," they wrote.
The risks included "non-completion", since there was no legal framework in place in China to force energy and industry operators to adopt the expensive CCS technology.
The note also mentioned a potential funding gap. The EU would fund €7 million, but that was nowhere near the €27.2 million needed for one feasibility study, or the €51 million for two, wrote Leffler and Margue.
They worried about potential public criticism if the EU were spending public money to help state-owned Chinese companies "at a time when the EU economic growth is slow".
The note also mentioned that CCS was not even being used at a commercial scale in the EU, "due to high investment costs and widespread adversity by public opinion".
On 29 February 2016, Delbeke gave them the go-ahead, arguing either that he was willing to take the risks or that those risks were not as large as feared.
"If the EU does not go ahead with its long-promised support towards CCS demonstration in China, there is a significant risk that the EU's commitment to CCS as a technology and to climate action will be severely criticised by stakeholders and the public," the highest EU civil servant in charge of climate action wrote.
The Chinese companies 'lost interest'
But it all ended in an anti-climax.
On 24 July 2017, a civil servant from DG Clima wrote an email about his or her trip to Beijing a few days earlier.
Apparently, the two companies that were candidates for the feasibility study had "lost interest" in the near-zero emissions coal project and had decided to fund the studies themselves.
The civil servant described a "confrontational" meeting at the Chinese ministry of science and technology.
"We made it clear that the EU cannot double-finance feasibility studies done by the firms," the civil servant wrote.
But China "demanded" the implementation of the 2009 MoU, which included the phrase that the EU had "considered" spending €7 million.
Some days later the commission sent a formal letter to the Chinese ministry, informing them that the EU was "not in a position to maintain the foreseen financial support", but that the EU would like to "reorient the EU-China CCUS cooperation to a policy and expert dialogue".
The EU would "provide funds" for the dialogue, but did not say how much.
"We would be honoured to receive you and other Chinese stakeholders in Brussels in 2018 for in-depth discussions on Carbon Capture Utilisation and Storage," the letter said.
"I am looking forward to a fruitful cooperation with you during the coming years," it concluded, posing the question of whether the next chapter of the story would end any differently. |