async def play(self, ctx: commands.Context, *, query: str):
if not ctx.voice_state.voice:
await ctx.invoke(self.join)
async with ctx.typing():
try:
lists = await YoutubeDLSource.search(query, loop=self.rinko.loop)
if lists:
source = await YoutubeDLSource.create_source(ctx, lists[0], loop=self.rinko.loop, speed=ctx.voice_state.speed, pitch=ctx.voice_state.pitch, pitch_d=ctx.voice_state.pitch_d)
else:
await ctx.send('No list')
except Exception as e:
await ctx.send('An error occurred while processing your request: {}'.format(str(e)))
else:
song = Song(source)
await ctx.voice_state.queue.put(song)
await ctx.send('__**Added the following songs to the playlist.**__\n{}'.format(str(source))) |
import json
import os
from pythonapm.util import current_milli_time, is_valid_rescode
from pythonapm.constants import manage_agent, delete_agent, shutdown
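# Tracks the agent instance's state: the last time it reported in, its
# current status code, and the time this record was created.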
class Instanceinfo:
def __init__(self):
self.last_reported = None
self.status = manage_agent
self.retry_counter = 0
self.modified_time = current_milli_time()
def update_status(self, rescode):
self.update_last_reported()
if is_valid_rescode(rescode) is not True:
return
self.status = rescode
if self.status == delete_agent:
self.status = shutdown
def get_status(self):
return self.status
def update_last_reported(self):
self.last_reported = current_milli_time()
def get_modified_time(self):
return self.modified_time
def get_retry_counter(self):
return self.retry_counter
def get_last_reported(self):
return self.last_reported
|
U.N. Official Says War Crimes Are Being Committed In Aleppo
David Greene speaks with Zeid Ra'ad Al Hussein, the U.N. high commissioner for human rights, who called what is taking place in the besieged Syrian city of Aleppo, "crimes of historic proportions."
DAVID GREENE, HOST:
You're about to hear more dire news from Syria, and a top official at the U.N. believes that in itself may be part of the problem. He wonders if we've gotten so inundated with news about conflict and suffering that we're becoming numb. But Zeid Ra'ad Al Hussein, the U.N. high commissioner for human rights, believes it is his duty to keep getting the word out. And he did that again recently. He said crimes of historic proportions are being committed in eastern Aleppo, the Syrian city where government troops backed by Russia have been carrying out deadly air strikes.
Let me just ask you. You lashed out at the international community with some pretty harsh words, calling what's happening in Aleppo crimes of historic proportions. What prompted you to say this?
ZEID RA'AD AL HUSSEIN: Well, even by the standards of the horrors we've seen in Syria over the last five years, the recent what seems to be utterly indiscriminate bombing by Syrian and Russian forces creates an impression that the people in eastern Aleppo are almost entombed. They are being sort of kept to their homes, yes, but if they try to venture out to bakeries, to schools, to hospitals, the likelihood of their being bombed is quite high. And there are grounds for believing that war crimes are indeed being committed.
GREENE: The Russian government has said that both sides in Aleppo are harming people, that there are armed groups that are hitting government-controlled areas in the western part of the city. I mean, how do you respond to Russian claims like that?
HUSSEIN: Well, we have seen improvised mortars being used by the armed groups, and we have also expressed how deplorable it is, of course, and condemned any targeting of civilians. But the overall level or number of civilian casualties seems to accrue from the aerial bombing.
GREENE: This is aerial bombing, to be clear, by Russian and Syrian forces in the east of Aleppo.
HUSSEIN: That's right, yes.
GREENE: I mean, you know better than anyone, Russian officials and diplomats who you've worked with. I mean, Russia has a United Nations representative. There's Sergey Lavrov, the foreign minister. Have you gotten any signs from the Russian government that makes you think you have a willing partner in trying to stop this violence?
HUSSEIN: There have been concerted efforts - you know, John Kerry and Sergey Lavrov have met umpteen times. And the point that we make - and I can't be drawn into the political dimensions of this conflict. The point we make is that so egregious is the suffering of the people of eastern Aleppo. Whatever the strategic motives, surely they cannot eclipse the scale and degree of suffering, and ultimately, this fighting has to cease.
GREENE: There have been these iconic photos of suffering from Syria.
HUSSEIN: Yeah.
GREENE: There have been words like yours, like crimes of historic proportions, but what will get countries to act?
HUSSEIN: Yeah. Over time, populations around the world have become almost sort of tranquillized to the suffering of others. There's so much of it, regrettably. There are so many conflicts, and the suffering of human beings is splashed across on newspapers, on the internet, through radio shows like this one. And what we worry about is that people are not moved anymore to take action and to put pressure on their governments to end this disastrous war. I mean, you hope that humanity is still recoiling at this sort of news, and - whether it be the bombing in Yemen or the actions in Iraq at the moment. And in all cases, we have to hold those who are prosecuting these conflicts responsible for gross violations of human rights.
GREENE: Mr. High Commissioner, before I let you go, you made a reference to you not wanting to get into the politics of this.
HUSSEIN: Yeah.
GREENE: And I'm reminded of some comments you made recently about Donald Trump, the Republican nominee for president in the United States, saying that he would be dangerous as president.
HUSSEIN: Yes, yes.
GREENE: Why did you say that?
HUSSEIN: If Donald Trump makes good on many of the comments he's made over the past year or so, it poses a threat to the human rights of many people.
GREENE: You specifically brought up some of the comments he made about expanding the use of torture. Is that what you're talking about?
HUSSEIN: Yes, yes, absolutely.
GREENE: I guess supporters of Donald Trump might say that the threat of terrorism makes them feel like, no, that they would want their president to take whatever action necessary, even if it meant violating the rights of someone who is suspected of terrorism.
HUSSEIN: Yeah, no, that in law never works. I mean, we will return, of course, to the Stone Age like this very quickly. There can be no exceptions, for instance, to the use of torture.
GREENE: Do you worry that even saying what you have here is crossing a line and it might not be appropriate for a U.N. official to get involved in a presidential campaign in this way?
HUSSEIN: It's well within my mandate. Once they assume power, if they're a candidate, it's often too late. The human rights violations are in full flow, and if in the future, of course, historians will look back and say, well, why didn't these people speak out if they knew it was going to be a significant threat?
GREENE: Mr. High Commissioner, a real pleasure talking to you, thank you so much.
HUSSEIN: Thank you, David.
GREENE: Zeid Ra’ad Al Hussein is the U.N. high commissioner for human rights.
Copyright © 2016 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record. |
/*
Aldew
PROB: Surmising a Sprinter's Speed
CLASSIFICATION: Implementation
*/
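/*
ADDED NOTE: sort the (time, position) observations by time; between any two
consecutive observations the average speed is |dx| / |dt|, and the sprinter
must have moved at least that fast at some moment in between, so the answer
is the maximum of these ratios over all consecutive pairs.
*/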
#include <bits/stdc++.h>
using namespace std;
int main()
{
int n;
cin >> n;
vector< pair<int, int> > s;
int t, x;
pair<int, int> p;
while (n--){
cin >> t >> x;
p = {t, x};
s.push_back(p);
}
sort(s.begin(), s.end());
double speed = 0;
double si;
for (size_t i = 0; i + 1 < s.size(); i++){
si = abs(s[i].second - s[i+1].second);
si /= abs(s[i].first - s[i+1].first);
if (si > speed) speed = si;
}
cout << speed << endl;
return 0;
}
|
Today’s Main Feature!
Hello everyone!! Kahotan here! (@gsc_kahotan)
Today I’m going to be taking a look at…
Nendoroid Kaban!
From the popular anime series “Kemono Friends” comes a Nendoroid of Kaban-chan! She comes with three face plates including a gentle standard expression, a shouting expression to recreate the “Please don’t eat me” scene as well as a smiling expression to show her enjoying her travels around Japari Park. Her trademark large backpack is included and can be opened and closed for various different poses. She also comes with a Japari Park map to pose her strolling around the park as well as the paper aeroplane from episode one. Be sure to display her together with the previously announced Nendoroid Serval!
“Please don’t eat me!”
From the anime series that captured the hearts of young and old fans alike “Kemono Friends” comes an adorable little Nendoroid of Kaban-chan! The large bag on her back, iconic feather in her hat, her slightly messy looking hair and even her baggy shorts have all been shrunk down carefully to Nendoroid size!
Let’s take a look at the optional parts that she comes with! (●‘∀‘●)ノ”
– Japari Park Map –
Special hands to display her holding the map are also included for the cute pose you can see above!
– Paper Plane –
The anti-Cerulean paper airplane seen in episode one!
It comes with its very own stand, allowing you to display it floating through the air as if it had just been thrown!
Next, let’s take a look at her other face plates!
Along with the gentle smiling face above, she also comes with this memorable expression from the series! (゚∀゚ヽ 三 ノ゚∀゚)ノ
“P-Please don’t eat me!“
“I won’t eat you!!“
A worried-looking shouting expression which can be used to recreate the scene where she first meets Serval!! Nendoroid Serval is scheduled to be released next month, so I decided to display her together with Kaban-chan for this review! ★
Kaban-chan is adorable all by herself, but recreating the adventures with her friends makes it all the more enjoyable!
Moving onto her third face plate… a cheerful smiling expression!
Kaban-chan’s innocent smile from her adventures can be enjoyed in Nendoroid size!! As you may have noticed, the pink feather she received from Arai-san is also included to put on the other side of her hat! ♪
Plus, the Lucky Beast that is included with Nendoroid Serval…
Can even fit inside of Nendoroid Kaban-chan’s bag! Not to mention the bag can open and close!
Lucky Beast’s tail is detachable – you will need to take the tail off in order for him to fit inside the bag!! (σ・∀・)σ
Be sure to add the friendly explorer to your collection!
Nendoroid Kaban!
She’ll be up for preorder from tomorrow!! ★
In addition, purchases from the GOODSMILE ONLINE SHOP will also include an Arm Part with Lucky Beast (Final Episode Details) + Bare Left and Right Hand Parts as a bonus!
Be sure to consider it when preordering!! (●´Д`●)ノ
Plus…
This Soft Vinyl Lucky Beast from AQUAMARINE will also be going up for preorder tomorrow! He can even wear Nendoroid Kaban-chan's hat! If you can call that wearing it!
See the AQUAMARINE Official Site or Good Smile Company Official Site tomorrow for more details!
Last but not least, a semi-related item…
PLM will have these Dioramamansion 150: Plains / Jungle display backgrounds up for preorder too! They of course match up with the Kemono Friends characters really nicely!! ♡
Be sure to consider them as well! (っ’ω’)っ))
○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+
Preorder Deadlines!!
Preorders for many products at the GOODSMILE ONLINE SHOP are closing soon! Here are the products closing on the 21st and 28th September (JST)!
⇒ MORE DETAILS
All Good Smile Company products are made to order, so if you want to be sure that you get your hands on them, preordering is always the safest bet! Make sure you don't miss out! (∩・∀・)∩
○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+○+●+
Planning Team / Kahotan / Twitter ID:@gsc_kahotan
English Updates: @gsc_kevin |
//Many thanks to Shar for creating this item :)
#include <std.h>
#include <daemons.h>
inherit ARMOUR;
#define MAXUSES 25
int uses;
string place;
void create(){
::create();
set_name("ring of refresh");
set_id(({ "ring", "ring of refresh","band","crystal band" }));
set_short("%^BOLD%^%^CYAN%^Ring of Refresh%^RESET%^");
set_obvious_short("A crystal band");
set_long(
"%^BOLD%^%^CYAN%^This band is fashioned out of a clear crystal. "+
"The ring has been unpolished to allow the valleys and peaks of "+
"the crystal to remain, catching the light. Only after you have "+
"touched the ring do you realize that it is actually made out of ice! "+
"Some form of magic seems to keep it from melting."
);
set("read",
@AVATAR
%^BOLD%^%^CYAN%^Hidden below the surface of the ice is one word %^WHITE%^~%^CYAN%^*%^WHITE%^snowfall%^CYAN%^*%^WHITE%^~
AVATAR
);
set("langage","str"); set_weight(2);
set_value(750);
set_property("lore difficulty",5);
set_type("ring");
set_limbs(({ "left hand","right hand" }));
set_size(2);
set_property("enchantment",0);
}
void init(){
::init();
add_action("snowfall","snowfall");
if(uses > 24){
set_value(0);
}
}
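// Command handler for "snowfall": a purely cosmetic refresh that works
// only while the ring is worn, and only for MAXUSES invocations before
// the ring's magic is depleted.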
int snowfall(string str){
if (str) {
notify_fail ("What?\n");
return 0;
}
if(TP->query_bound() || TP->query_unconcious() ||
TP->query_paralyzed() ||
TP->query_tripped()) {
TP->send_paralyzed_message("info",TP);
return 1;
}
if(query_worn()){
if(uses < MAXUSES){
tell_room(ETP,"%^BOLD%^%^WHITE%^"+TPQCN+"'s skin forms a fine white "+
"sheen that soon melts away, leaving "+TP->query_objective()+" "+
"seeming refreshed and cool.",TP);
tell_object(TP,"%^BOLD%^%^WHITE%^A fine layer of ice forms on your "+
"skin briefly before melting away, leaving you feeling cool and refreshed.");
uses += 1;
return 1;
}
write("The magic of the ring has been depleted!");
return 1;
}
write("You must wear the ring first.");
return 1;
}
|
// parseAmbiguousTextToEWKB parses a text as a number of different options
// that is available in the geospatial world using the first character as
// a heuristic.
// This matches the PostGIS direct cast from a string to GEOGRAPHY/GEOMETRY.
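//
// As an illustration (these names are not part of this package), the
// dispatch can be sketched as:
//
//	func classifyGeoText(str string) string {
//		switch {
//		case len(str) == 0:
//			return "error: empty input"
//		case str[0] == '0':
//			return "hex-encoded EWKB, e.g. \"0101000000...\""
//		case str[0] == 0x00 || str[0] == 0x01:
//			return "raw (E)WKB bytes, led by a byte-order marker"
//		default:
//			return "EWKT text, e.g. \"SRID=4326;POINT(1 2)\""
//		}
//	}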
func parseAmbiguousTextToEWKB(str string, defaultSRID geopb.SRID) (geopb.EWKB, error) {
if len(str) == 0 {
return nil, fmt.Errorf("geo: parsing empty string to geo type")
}
if str[0] == '0' {
t, err := ewkbhex.Decode(str)
if err != nil {
return nil, err
}
if defaultSRID != 0 && t.SRID() == 0 {
adjustGeomSRID(t, defaultSRID)
}
return ewkb.Marshal(t, ewkbEncodingFormat)
}
if str[0] == 0x00 || str[0] == 0x01 {
t, err := ewkb.Unmarshal([]byte(str))
if err != nil {
return nil, err
}
if defaultSRID != 0 && t.SRID() == 0 {
adjustGeomSRID(t, defaultSRID)
}
return ewkb.Marshal(t, ewkbEncodingFormat)
}
return decodeEWKT(str, defaultSRID)
} |
//--------------------------------------------------------------------------
// CStaticWindow M O V E W I N D O W
//--------------------------------------------------------------------------
void CStaticWindow::MoveWindow( int x, int y, int nWidth, int nHeight )
{
if( fCreated )
::MoveWindow( hWnd, x, y, nWidth, nHeight, TRUE );
} |
Organizational Architecture, Resilience, and Cyberattacks
This article develops a unique model of organizational resilience architecture with an emphasis on the ways in which organizations respond to cyberattacks. The model elucidates the dynamics and approaches through which organizations mobilize and utilize expertise and resources to combat the effects of cyberattacks on normal business operations. Drawing on recent cases of cyberattacks against organizations, the article identifies a host of strategic and tactical responses victims used to aid recovery and return to daily activities. The responses are grouped into three stages to demonstrate the steps that organizations can take to enhance their resilience: Stage 1 focuses on proactive environmental scanning and locating potential threats and attacks, Stage 2 emphasizes neutralizing threats and attacks, and Stage 3 focuses on redesigning, upgrading, and updating human, technological, and financial resources. On this basis, the article sheds light on levels of organizational resilience and strategies for organizational design in withstanding cyberattacks and security breaches. The theoretical and practical implications of these findings are discussed. |
package benchmark
import (
"time"
)
// Small payload, http log like structure. Size: 190 bytes
// SmallPayload contains the small payload data.
type SmallPayload struct {
St int
Sid int
Tt string
Gr int
Uuid string
Ip string
Ua string
Tz int
V int
}
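// Illustrative benchmark, not part of the original file: it assumes a
// sibling _test.go file importing "encoding/json" and "testing", and
// exercises SmallPayload through encoding/json as a baseline codec.
//
//	func BenchmarkSmallPayloadMarshal(b *testing.B) {
//		p := &SmallPayload{
//			St: 1, Sid: 2, Tt: "test", Gr: 3,
//			Uuid: "de305d54-75b4-431b-adb2-eb6b9e546014",
//			Ip: "127.0.0.1", Ua: "Mozilla/5.0", Tz: -8, V: 2,
//		}
//		b.ReportAllocs()
//		for i := 0; i < b.N; i++ {
//			if _, err := json.Marshal(p); err != nil {
//				b.Fatal(err)
//			}
//		}
//	}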
// Medium payload, based on Clearbit API response. Size: 2.3kb
// CBAvatar represents one Gravatar avatar for a Clearbit person.
type CBAvatar struct {
Url string
}
// CBGravatar represents a Clearbit person's Gravatar account data.
type CBGravatar struct {
Avatars []*CBAvatar
}
// CBGithub represents a Clearbit person's Github account data.
type CBGithub struct {
Followers int
}
// CBName represents a Clearbit person's name.
type CBName struct {
FullName string
}
// CBPerson represents a Clearbit person.
type CBPerson struct {
Name *CBName
Github *CBGithub
Gravatar *CBGravatar
}
// MediumPayload contains the medium payload data.
type MediumPayload struct {
Person *CBPerson
Company map[string]interface{}
}
// Large payload, based on Discourse API. Size: 41kb
// DSUser represents a Discourse user.
type DSUser struct {
Username string
}
// DSTopic represents one Discourse topic.
type DSTopic struct {
Id int
Slug string
}
// DSTopicsList represents a paginated set of Discourse topics.
type DSTopicsList struct {
Topics []*DSTopic
MoreTopicsUrl string
}
// LargePayload contains the large payload data.
type LargePayload struct {
Users []*DSUser
Topics *DSTopicsList
}
// Huge payload, based on a large Helm index. Size: 333mb
// IndexFile contains the huge payload data.
type IndexFile struct {
APIVersion string `json:"apiVersion"`
Generated time.Time `json:"generated"`
Entries map[string]ChartVersions `json:"entries"`
}
// ChartVersions is a list of versions of a chart.
type ChartVersions []ChartVersion
// ChartVersion is one version of a chart.
type ChartVersion struct {
Metadata
URLs []string `json:"urls"`
Created time.Time `json:"created,omitempty"`
Removed bool `json:"removed,omitempty"`
Digest string `json:"digest,omitempty"`
}
// Metadata is the metadata for a chart.
type Metadata struct {
Name string `json:"name,omitempty"`
Home string `json:"home,omitempty"`
Sources []string `json:"sources,omitempty"`
Version string `json:"version,omitempty"`
Description string `json:"description,omitempty"`
Keywords []string `json:"keywords,omitempty"`
Maintainers []*Maintainer `json:"maintainers,omitempty"`
Engine string `json:"engine,omitempty"`
Icon string `json:"icon,omitempty"`
ApiVersion string `json:"apiVersion,omitempty"`
Condition string `json:"condition,omitempty"`
Tags string `json:"tags,omitempty"`
AppVersion string `json:"appVersion,omitempty"`
Deprecated bool `json:"deprecated,omitempty"`
TillerVersion string `json:"tillerVersion,omitempty"`
Annotations map[string]string `json:"annotations,omitempty"`
KubeVersion string `json:"kubeVersion,omitempty"`
}
// Maintainer is the information of a chart maintainer.
type Maintainer struct {
Name string `json:"name,omitempty"`
Email string `json:"email,omitempty"`
Url string `json:"url,omitempty"`
}
|
// TODO(ahcorde): Replace this when this function is on ign-rendering6
/////////////////////////////////////////////////
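// Math note (added): the normalized device coordinates computed below are
// nx = 2*sx/w - 1 and ny = 1 - 2*sy/h, mapping the pixel into [-1, 1] on
// both axes with y flipped. The returned point is the ray-plane
// intersection origin + t * direction, where t = plane.Distance(origin,
// direction) for the plane with normal (0, 0, 1) at the given offset
// (sign conventions follow the ign-math API).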
math::Vector3d SpawnPrivate::ScreenToPlane(
const math::Vector2i &_screenPos,
const rendering::CameraPtr &_camera,
const rendering::RayQueryPtr &_rayQuery,
const float offset)
{
double width = _camera->ImageWidth();
double height = _camera->ImageHeight();
double nx = 2.0 * _screenPos.X() / width - 1.0;
double ny = 1.0 - 2.0 * _screenPos.Y() / height;
_rayQuery->SetFromCamera(
_camera, math::Vector2d(nx, ny));
math::Planed plane(math::Vector3d(0, 0, 1), offset);
math::Vector3d origin = _rayQuery->Origin();
math::Vector3d direction = _rayQuery->Direction();
double distance = plane.Distance(origin, direction);
return origin + direction * distance;
} |
import {
Component,
OnDestroy,
OnInit
} from '@angular/core';
import {Router} from '@angular/router';
import {
Attraction,
CategoryAttractionService,
Image,
ImageService
} from 'cr-lib';
import {Subscription} from 'rxjs';
import {ActiveAttractionService} from '../edit/active-attraction.service';
/**
* Presents all images associated with a Location.
*/
@Component({
selector: 'app-images',
templateUrl: './images.page.html',
styleUrls: ['./images.page.scss'],
})
export class ImagesPage implements OnInit, OnDestroy {
/* Exposed for the view. */
public attraction: Attraction;
public images: Image[] = [];
private subscription: Subscription;
constructor(
private imageService: ImageService,
private activeAttractionService: ActiveAttractionService,
private categoryAttractionService: CategoryAttractionService,
private router: Router,
) {
console.log('Hello, Image Page');
}
ngOnInit() {
console.log('Image Page OnInit()');
this.subscription = this.activeAttractionService.getActiveAttractionId()
.subscribe(
(id) => {
this.attraction = this.categoryAttractionService.getAttraction(id);
console.log('Image Page receiving attractionId', this.attraction.id);
this.imageService.getAllImagesForLocation(this.attraction.id)
.subscribe(
(images) => {
this.images = images;
}
);
}
);
}
ngOnDestroy() {
this.subscription.unsubscribe();
}
isFeaturedImage(imageId: number) {
return this.attraction.featuredImage && imageId === this.attraction.featuredImage.id;
}
setFeaturedImage(imageId: number) {
this.imageService.setFeaturedImage(this.attraction.id, imageId)
.subscribe(
(attraction: Attraction) => {
this.attraction.featuredImage = attraction.featuredImage;
}
);
}
save() {
this.router.navigate(['edit', this.attraction.id, 'place']);
}
/* TODO: CI-51 remove image. */
removeImage(imageId: number): void {
console.log('Not yet implemented -- remove', imageId);
}
}
|
package api
import (
"encoding/json"
"testing"
)
func TestGraphQLError_Code(t *testing.T) {
for name, tc := range map[string]struct {
in string
want string
wantErr bool
}{
"invalid code": {
in: `{
"errors": [
{
"message": "The feature \"campaigns\" is not activated because it requires a valid Sourcegraph license. Purchase a Sourcegraph subscription to activate this feature.",
"path": [
"createBatchSpec"
],
"extensions": {
"code": 42
}
}
],
"data": null
}`,
wantErr: true,
},
"invalid extensions": {
in: `{
"errors": [
{
"message": "The feature \"campaigns\" is not activated because it requires a valid Sourcegraph license. Purchase a Sourcegraph subscription to activate this feature.",
"path": [
"createBatchSpec"
],
"extensions": 42
}
],
"data": null
}`,
wantErr: true,
},
"no code": {
in: `{
"errors": [
{
"message": "The feature \"campaigns\" is not activated because it requires a valid Sourcegraph license. Purchase a Sourcegraph subscription to activate this feature.",
"path": [
"createBatchSpec"
],
"extensions": {}
}
],
"data": null
}`,
want: "",
},
"no extensions": {
in: `{
"errors": [
{
"message": "The feature \"campaigns\" is not activated because it requires a valid Sourcegraph license. Purchase a Sourcegraph subscription to activate this feature.",
"path": [
"createBatchSpec"
]
}
],
"data": null
}`,
want: "",
},
"valid code": {
in: `{
"errors": [
{
"message": "The feature \"campaigns\" is not activated because it requires a valid Sourcegraph license. Purchase a Sourcegraph subscription to activate this feature.",
"path": [
"createBatchSpec"
],
"extensions": {
"code": "ErrBatchChangesUnlicensed"
}
}
],
"data": null
}`,
want: "ErrBatchChangesUnlicensed",
},
} {
t.Run(name, func(t *testing.T) {
var result rawResult
if err := json.Unmarshal([]byte(tc.in), &result); err != nil {
t.Fatal(err)
}
if ne := len(result.Errors); ne != 1 {
t.Fatalf("unexpected number of GraphQL errors (this test can only handle one!): %d", ne)
}
ge := &GraphQlError{result.Errors[0]}
have, err := ge.Code()
if tc.wantErr {
if err == nil {
t.Errorf("unexpected nil error")
}
} else {
if err != nil {
t.Errorf("unexpected error: %+v", err)
}
if have != tc.want {
t.Errorf("unexpected code: have=%q want=%q", have, tc.want)
}
}
})
}
}
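// The test above depends on rawResult and GraphQlError, which are defined
// elsewhere in this package. A minimal sketch consistent with the five
// cases exercised here (the field names and shapes are assumptions, not
// the package's actual definitions):
//
//	type rawResult struct {
//		Errors []json.RawMessage `json:"errors"`
//		Data   json.RawMessage   `json:"data"`
//	}
//
//	type GraphQlError struct{ Raw json.RawMessage }
//
//	// Code returns extensions.code when present, "" when extensions or
//	// code are absent, and an error when either has an unexpected type.
//	func (e *GraphQlError) Code() (string, error) {
//		var env struct {
//			Extensions json.RawMessage `json:"extensions"`
//		}
//		if err := json.Unmarshal(e.Raw, &env); err != nil {
//			return "", err
//		}
//		if len(env.Extensions) == 0 {
//			return "", nil // "no extensions"
//		}
//		var ext map[string]json.RawMessage
//		if err := json.Unmarshal(env.Extensions, &ext); err != nil {
//			return "", err // "invalid extensions", e.g. 42
//		}
//		codeRaw, ok := ext["code"]
//		if !ok {
//			return "", nil // "no code"
//		}
//		var code string
//		if err := json.Unmarshal(codeRaw, &code); err != nil {
//			return "", err // "invalid code", e.g. a number
//		}
//		return code, nil
//	}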
|
Electron transfer in tetrahemic cytochromes c3: spectroelectrochemical evidence for a conformational change triggered by heme IV reduction.
Electron transfer in tetrahemic cytochromes c3 from Desulfovibrio vulgaris Hildenborough (D.v.H.) and Desulfovibrio desulfuricans Norway (D.d.N.) strains has been investigated by thin layer spectroelectrochemistry with visible absorption, CD, and resonance Raman (RR) monitoring. The observed splitting of the isosbestic point in the Soret absorption band indicates that the electron transfer from the (FeIII)4 state to the (FeII)4 state proceeds via an intermediate species, which corresponds to 25 and 50% reduction for the D.v.H. cyt.c3 and the D.d.N. cyt.c3, respectively. For the latter, a specific CD signal is observed at half-reduction. RR monitoring of the redox process does not reveal multiple splitting of the high-frequency RR bands, at variance with previously published results on the enzymatic reduction of cyt.c3 from Desulfovibrio vulgaris Miyazaki, a cytochrome highly homologous to D.v.H. cyt.c3 . The low-frequency RR spectra of the intermediate species differ significantly from the ones calculated from a linear combination of the all-ferric and all-ferrous states, for the same reduction ratio. Frequency shifts of the bending modes of the cysteine and propionate heme substituents are observed, as well as changes specific to each cytochrome; most notable is the activation of two torsional modes in the case of D.d.N. cyt.c3. Comparison of the results obtained for the two cytochromes leads to the conclusion that reduction of heme IV triggers the observed conformational change. This conclusion is supported by the spectroelectrochemical investigation of the mutant D.v.H. cyt.c3 H25M, in which the sixth ligand of heme III, histidine, is replaced by a methionine. |
// CreateListDelegatedServicesForAccountRequest creates a request to invoke ListDelegatedServicesForAccount API
func CreateListDelegatedServicesForAccountRequest() (request *ListDelegatedServicesForAccountRequest) {
request = &ListDelegatedServicesForAccountRequest{
RpcRequest: &requests.RpcRequest{},
}
request.InitWithApiInfo("ResourceManager", "2020-03-31", "ListDelegatedServicesForAccount", "", "")
request.Method = requests.POST
return
} |
import os
from LayoutFactory import LayoutFactory
from Utils import Project
from layout import ViewEnviroment
class LayoutProcessor :
totalLayouts = 0
# LayoutFactory mLayoutFactory
#
# void main( String[] args) :
# String projectFolder = "./Facebook"
# ProjectInfo projectInfo = new ProjectInfo("3", "Facebook",
# projectFolder, "MainActivity", "com.facebook")
# LayoutProcessor layoutReader = new LayoutProcessor()
# Project project = layoutReader.read(projectInfo.getPath())
# if (project != null) :
# System.out.println("Found the project: "
# + project.getLayouts().size())
def read(self, sFolder) :
# File folder = new File(sFolder)
self.mLayoutFactory = LayoutFactory()
# look for AndroidManfest.xml file
self.project = self.analyze(sFolder)
# System.out.println("Total layouts: " + totalLayouts)
# Enviroment.get().log()
return self.project
def analyze(self,folder) :
if (self.isAndroidProject(folder)) :
# now try to read some layout file
project = Project(os.path.basename(folder))
# Project project = new Project(folder.getName())
layoutFiles = self.getLayoutFiles(os.path.abspath(folder))
if (len(layoutFiles) == 0) :
print("Nothing to be done")
for file in layoutFiles:
self.totalLayouts = self.totalLayouts +1
layout = self.mLayoutFactory.createLayout(file)
project.addLayout(layout)
allView = layout.getAllView()
for view in allView:
ViewEnviroment.addViewMap(project.mName,
layout.mId, view.mName, view)
return project
else :
listFiles = os.listdir(folder)
# @Override
# boolean accept( File pathname) :
# return pathname.isDirectory()
#
# )
for file in listFiles:
path = os.path.join(folder, file)
if os.path.isdir(path):
return self.analyze(path)
return None
def getLayoutFiles(self,projectFolderPath) :
folder = projectFolderPath + "/res/layout"
allLayoutFiles = self.getAllFilesRecusivelyFile(folder, ".xml")
return allLayoutFiles
def getAllFilesRecusivelyFile(self,folder, filenameFilter) :
files = []
self.getAllFilesRecusively(folder, files, filenameFilter)
return files
def getAllFilesRecusively(self, folder, files,filenameFilter) :
if folder is None or os.path.isfile(folder) :
return
listFiles = [os.path.join(folder, x) for x in os.listdir(folder) if x.endswith(filenameFilter)]
if (len(listFiles)==0) :
return
files.extend(listFiles)
subFolders = [os.path.join(folder, x) for x in os.listdir(folder) if os.path.isdir(os.path.join(folder, x))]
for subFolder in subFolders:
self.getAllFilesRecusively(subFolder, files, filenameFilter)
def isAndroidProject(self,folder) :
listFiles = [x for x in os.listdir(folder) if x == "AndroidManifest.xml"]
return len(listFiles) == 1
|
// NewEventBus creates an EventBus, with optional settings.
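//
// Illustrative usage (not from the original source):
//
//	bus, err := NewEventBus("nats://localhost:4222", "orders")
//	if err != nil {
//		log.Fatalf("creating event bus: %v", err)
//	}
//
// On success the bus owns a JetStream stream named "orders_events" with
// subjects "orders_events.*.*", creating it if it does not already exist.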
func NewEventBus(url, appID string, options ...Option) (*EventBus, error) {
b := &EventBus{
appID: appID,
streamName: appID + "_events",
registered: map[eh.EventHandlerType]struct{}{},
errCh: make(chan eh.EventBusError, 100),
codec: &json.EventCodec{},
}
for _, option := range options {
if option == nil {
continue
}
if err := option(b); err != nil {
return nil, fmt.Errorf("error while applying option: %w", err)
}
}
var err error
if b.conn, err = nats.Connect(url, b.connOpts...); err != nil {
return nil, fmt.Errorf("could not create NATS connection: %w", err)
}
if b.js, err = b.conn.JetStream(); err != nil {
return nil, fmt.Errorf("could not create Jetstream context: %w", err)
}
if b.stream, err = b.js.StreamInfo(b.streamName); err == nil {
return b, nil
}
subjects := b.streamName + ".*.*"
cfg := &nats.StreamConfig{
Name: b.streamName,
Subjects: []string{subjects},
Storage: nats.FileStorage,
}
if b.stream, err = b.js.AddStream(cfg); err != nil {
return nil, fmt.Errorf("could not create NATS stream: %w", err)
}
return b, nil
} |
export interface IRuleConfig {
default?: any
min?: number
max?: number
reg?: RegExp
}
/* export interface IIsOptional {
default: any
}
export interface IIsLength {
min: number
max?: number
}
export interface IIsInt {
min?: number
max?: number
}
export interface IMatches {
reg: RegExp
} */
|
Holding it Straight: Sexual Orientation in the Middle Ages
Lecture by Dr Robert Mills, UCL History of Art
Given at University College London, on October 22, 2013
Historians tend to be reticent about applying the phrase ‘sexual orientation’ to periods before the nineteenth century, but should we be so quick to dismiss the concept? Focusing on depictions of virgins and sodomites – two seemingly opposing categories – this talk will explore how medieval encounters with sex were shaped by concepts of space and orientation.
See also Medieval representations of sodomy, an interview with Dr. Mills where he discusses how medieval artists developed strategies of depicting taboo practices.
See also: Sex in the Middle Ages
See also: Same-Sex Relations in the Middle Ages
|
import React, { ReactElement, useEffect, useState } from 'react'
import {
Image,
StyleSheet,
TouchableOpacity,
View,
} from 'react-native'
import { StackScreenProps } from '@react-navigation/stack'
import WalletConnect from '@walletconnect/client'
import { IClientMeta } from '@walletconnect/types'
import _ from 'lodash'
import {
NavigationProp,
StackActions,
useNavigation,
} from '@react-navigation/native'
import { COLOR } from 'consts'
import { RootStackParams } from 'types/navigation'
import Body from 'components/layout/Body'
import { navigationHeaderOptions } from 'components/layout/Header'
import WithAuth from 'components/layout/WithAuth'
import { Button, Icon, Loading, Text } from 'components'
import { useConfig, User } from 'lib'
import useWalletConnect from 'hooks/useWalletConnect'
import images from 'assets/images'
import useTopNoti from 'hooks/useTopNoti'
type Props = StackScreenProps<RootStackParams, 'WalletConnect'>
const TIMEOUT_DELAY = 1000 * 60
const PeerMetaInfo = ({
peerMeta,
}: {
peerMeta: IClientMeta
}): ReactElement => {
return (
<View
style={{
width: '100%',
padding: 20,
borderRadius: 5,
backgroundColor: '#ebeff8',
borderStyle: 'solid',
borderWidth: 1,
borderColor: '#cfd8ea',
}}
>
<View style={{ marginBottom: 20 }}>
<Text style={styles.infoTitle} fontType="medium">
Connect to
</Text>
<Text style={styles.infoDesc}>{peerMeta?.url}</Text>
</View>
{_.some(peerMeta?.description) && (
<View>
<Text style={styles.infoTitle} fontType="medium">
Description
</Text>
<Text style={styles.infoDesc}>{peerMeta.description}</Text>
</View>
)}
</View>
)
}
const Render = ({
user,
route,
}: { user: User } & Props): ReactElement => {
const autoCloseTimer = React.useRef<NodeJS.Timeout>()
// before connected
const [
localWalletConnector,
setLocalWalletConnector,
] = useState<WalletConnect>()
const [localPeerMeta, setLocalPeerMeta] = useState<IClientMeta>()
const { chain } = useConfig()
const { showNoti } = useTopNoti()
const { newWalletConnect, saveWalletConnector } = useWalletConnect()
const { goBack, dispatch, canGoBack, addListener } = useNavigation<
NavigationProp<RootStackParams>
>()
const goBackOrHome = (): void => {
if (canGoBack()) {
goBack()
} else {
dispatch(StackActions.replace('Tabs'))
}
}
const rejectConnect = (): void => {
localWalletConnector?.rejectSession()
}
const connect = async (uri: string): Promise<void> => {
const connector = newWalletConnect({ uri })
if (!connector.connected) {
await connector.createSession()
}
setLocalWalletConnector(connector)
connector.on('session_request', (error, payload) => {
if (error) {
throw error
}
const { peerMeta } = payload.params[0]
setLocalPeerMeta(peerMeta)
})
}
const confirmConnect = (): void => {
if (localWalletConnector) {
const { peerMeta } = localWalletConnector
localWalletConnector.approveSession({
chainId: chain.current.walletconnectID,
accounts: [user.address],
})
saveWalletConnector(localWalletConnector)
showNoti({
duration: 5000,
message: `${peerMeta?.name} is connected`,
description: 'Return to your browser and continue',
})
}
goBackOrHome()
}
useEffect(() => {
if (localPeerMeta) {
autoCloseTimer.current = setTimeout(() => {
goBack()
}, TIMEOUT_DELAY) // 1 minute
addListener('blur', (): void => {
if (!localWalletConnector?.connected) {
rejectConnect()
}
})
}
}, [localPeerMeta])
useEffect(() => {
// either payload or uri must be included
const payload = route.params?.payload
const uri = route.params?.uri
if (payload) {
connect(decodeURIComponent(payload))
} else if (uri) {
connect(uri)
} else {
showNoti({ message: 'no payload data', type: 'danger' })
}
return (): void => {
autoCloseTimer.current && clearTimeout(autoCloseTimer.current)
}
}, [])
return (
<>
<Body theme={'sky'} containerStyle={{ paddingTop: 20 }}>
{localPeerMeta ? (
<View
style={{
justifyContent: 'space-between',
flex: 1,
}}
>
<View style={{ alignItems: 'center' }}>
<Image
source={images.walletconnect_blue}
style={{ width: 60, height: 60, marginBottom: 10 }}
/>
<View style={{ marginBottom: 30 }}>
<Text style={styles.title} fontType="medium">
{localPeerMeta.name} is requesting to connect to
your wallet
</Text>
</View>
<PeerMetaInfo peerMeta={localPeerMeta} />
</View>
<View style={{ flexDirection: 'row', marginBottom: 40 }}>
<View style={{ flex: 1 }}>
<Button
title={'Deny'}
theme="red"
onPress={(): void => {
rejectConnect()
goBackOrHome()
}}
/>
</View>
<View style={{ width: 10 }} />
<View style={{ flex: 1 }}>
<Button
title={'Allow'}
theme="sapphire"
onPress={confirmConnect}
/>
</View>
</View>
</View>
) : (
<View>
<Loading style={{ marginBottom: 30 }} />
<Text style={styles.title} fontType="medium">
Ready to Connect
</Text>
</View>
)}
</Body>
</>
)
}
const ConnectToWalletConnect = (props: Props): ReactElement => {
return (
<WithAuth>
{(user): ReactElement => <Render {...{ ...props, user }} />}
</WithAuth>
)
}
const HeaderLeft = (): ReactElement => {
const { goBack, canGoBack, dispatch } = useNavigation()
const onPressGoBack = (): void => {
if (canGoBack()) {
goBack()
} else {
dispatch(StackActions.replace('Tabs'))
}
}
return (
<TouchableOpacity
onPress={onPressGoBack}
style={{ paddingLeft: 20 }}
>
<Icon name={'close'} color={COLOR.primary._02} size={28} />
</TouchableOpacity>
)
}
ConnectToWalletConnect.navigationOptions = navigationHeaderOptions({
theme: 'sky',
headerLeft: () => <HeaderLeft />,
})
export default ConnectToWalletConnect
const styles = StyleSheet.create({
title: {
fontSize: 18,
lineHeight: 21,
letterSpacing: 0,
textAlign: 'center',
color: COLOR.primary._02,
},
infoTitle: {
fontSize: 12,
lineHeight: 18,
marginBottom: 5,
},
infoDesc: {
fontSize: 12,
lineHeight: 18,
letterSpacing: 0,
},
})
|
// Copyright (c) 2017 <NAME>. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
package templates
const groupindexSrc = `
{{ define "content" }}
<div class="btn-row">
<a class="link-btn" href="/topics/new?gid={{ .GroupID }}">New topic</a>
{{ if or .IsAdmin .IsMod .IsSuperAdmin }}
<a class="link-btn" href="/groups/edit?id={{ .GroupID }}">Edit group</a>
{{ end }}
{{ if and .Common.UserName .Common.IsGroupSubAllowed }}
{{ if .SubToken }}
<form action="/groups/unsubscribe?token={{ .SubToken }}" method="POST">
<input type="hidden" name="csrf" value="{{ .Common.CSRF }}">
<input class="btn" type="submit" value="Unsubscribe">
</form>
{{ else }}
<form action="/groups/subscribe?id={{ .GroupID }}" method="POST">
<input type="hidden" name="csrf" value="{{ .Common.CSRF }}">
<input class="btn" type="submit" value="Subscribe">
</form>
{{ end }}
{{ end }}
</div>
<h1 id="title"><a href="/groups?name={{ .GroupName }}">{{ .GroupName }}</a></h1>
<div class="muted">{{ .GroupDesc }}</div>
{{ if .HeaderMsg }}
<h3>{{ .HeaderMsg }}</h3>
{{ end }}
{{ if .Topics }}
<div style="margin-top: 30px;">
{{ range .Topics }}
{{ if not .IsDeleted }}
<div class="topic-row">
<div><a href="/topics?id={{ .ID }}">{{ .Title }}{{ if .IsClosed }} [closed] {{ end }}</a></div>
<div class="muted"><a href="/users?u={{ .Owner }}">{{ .Owner }}</a> {{ .CreatedDate }} | <a href="/topics?id={{ .ID }}">{{ .NumComments }} comments</a></div>
</div>
<hr class="sep">
{{ end }}
{{ end }}
</div>
{{ else }}
<div class="row">
<div class="muted">No topics here.</div>
</div>
{{ end }}
{{ if .LastTopicDate }}
<div class="row">
<div><a href="/groups?name={{ .GroupName }}<d={{ .LastTopicDate }}">More</a></div>
</div>
{{ end }}
{{ end }}`
|
/*! \brief Get the number of nodes on a specified engine
*
* The function returns \p 0 on a success.
*
* \param [in] comm
* MDI communicator of the engine. If comm is set to
* MDI_NULL_COMM, the function will check for the calling engine.
* \param [out] nnodes
* On return, the number of nodes supported by the engine.
*/
int MDI_Get_NNodes(MDI_Comm comm, int* nnodes)
{
if ( is_initialized == 0 ) {
mdi_error("MDI_Get_NNodes called but MDI has not been initialized");
return 1;
}
vector* node_vec = get_node_vector(comm);
*nnodes = node_vec->size;
return 0;
} |
from gcloud import datastore
client = datastore.Client('newsai-1166')
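# Fetch every entity of the given kind that was created by this user.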
def user_to_resources(user_id, resource):
query = client.query(kind=resource)
query.add_filter('CreatedBy', '=', user_id)
resource = list(query.fetch())
return resource
def delete_resource(resource):
client.delete(resource.key)
def get_resource_and_delete(user_id, resource_name):
resources = user_to_resources(user_id, resource_name)
print(user_id, resource_name, len(resources))
for resource in resources:
delete_resource(resource)
user_id = 6496327902953472
get_resource_and_delete(user_id, 'Contact')
get_resource_and_delete(user_id, 'Email')
# Delete user
user_id_key = client.key('User', user_id)
client.delete(user_id_key)
|
// GetAgentStatus get gse agent status
func (c *Client) GetAgentStatus(req *GetAgentStatusReq) (*GetAgentStatusResp, error) {
resp := new(GetAgentStatusResp)
req.BaseReq = c.baseReq
err := c.client.Post().
WithEndpoints([]string{c.host}).
WithBasePath("/api/c/compapi/v2/gse/").
SubPathf("get_agent_status").
WithJSON(req).
Do().
Into(resp)
if err != nil {
return nil, err
}
return resp, nil
} |
ip = input()
print(ip[0].upper() + ip[1:]) |
// ToPollMutableStateRequest converts thrift PollMutableStateRequest type to internal
func ToPollMutableStateRequest(t *history.PollMutableStateRequest) *types.PollMutableStateRequest {
if t == nil {
return nil
}
return &types.PollMutableStateRequest{
DomainUUID: t.GetDomainUUID(),
Execution: ToWorkflowExecution(t.Execution),
ExpectedNextEventID: t.GetExpectedNextEventId(),
CurrentBranchToken: t.CurrentBranchToken,
}
} |
#### A LADYBIRD BOOK FOR GROWN–UPS
### 'How it works'
### THE BROTHER
## by J.A. Hazeley, N.S.F.W.
and J.P. Morris, O.M.G.
#### (Authors of 'Decorating With Wasps')
Publishers: Ladybird Books Ltd., Loughborough
_Printed in England. If wet, Italy._
## Contents
'How it Works': The Brother
#### THE ARTISTS
Martin Aitchison
Robert Ayton
John Berry
Ronald Sydney Embleton
Roger Hall
Frank Hampson
Frank Humphris
Kenneth Inns
B. Knight
John Leigh-Pemberton
Jorge Nunez
G. Robinson
Harry Wingfield
THE AUTHORS would like to record their gratitude and offer their apologies to the many Ladybird artists whose luminous work formed the glorious wallpaper of countless childhoods. Revisiting it for this book as grown-ups has been a privilege.
This is a brother.
He shares half of his DNA with you.
But he will not share any of his bubble mixture.
Ken will come to regret this in approximately 2048 when he needs to borrow £200 from Joy to get his van through its MOT.
Aisling has a butterfly book.
Ryan has a book about fish.
Ryan has decided this is not fair for a reason that will become no clearer over the next six days of his going on and on about it.
Brandon's sister wants her coat back but he is keeping it because he knows how much it annoys her.
Jumping off things. Sliding down things. Drinking magic potions made of vinegar and mud. Going over there and punching Dean Haggett.
There are plenty of things which an older brother can persuade a younger brother or sister to be the first to try.
Jessica has taught her little brother Trevor everything she knows.
If he ever makes a career out of rude armpit sounds and stealing sugar sachets from cafés, she hopes he will remember who it was who got him there.
Angus is the big brother. He always has the best toys.
Hayley is the little sister. She has lots of toys, but they are broken and used to belong to Angus.
Their brother Christian is sitting in the cupboard under the stairs with a paper bag on his head waiting for anyone to notice.
Middle children are different.
Abel has been singing the same song over and over again the whole way from Megiddo to the Plain of Jezreel.
He also keeps putting his arm on Cain's side of the donkey.
Brothers can be very irritating but it is ever so important to try and keep your temper.
Rollo is watching his brother on the news.
"I could have done that," says Rollo to his girlfriend.
"If they'd just bought me that spacehopper."
At home, Lucas and his sister Mia play together all the time.
But in front of his friends, Lucas pretends girls are stupid.
Their mum worries that the design of the local soft–play centre is not helping.
Music–making is easy when you are brothers like these Everly Brothers.
Phil knows what Don is thinking. Don knows what Phil is thinking.
Don is thinking, "I hate you". Phil is also thinking, "I hate you".
Two brothers in perfect harmony.
Joshua's sister Christine got a new bike for Christmas.
Joshua got book tokens.
"But your sister doesn't like books," says Joshua's mum.
Joshua wished he'd worked harder on looking pig ignorant in front of his parents.
Jenny is definitely telling Mum about this.
Finbar's sister Mary is on a picnic with her new boyfriend.
Finbar staked out a commanding position above the picnic area this morning. He has not met Mary's new boyfriend yet, but if the man's poor coleslaw etiquette leads to further unacceptable behaviour, Finbar is ready to act.
Hereward is not just Sam's big brother. He is also his best friend.
Hereward will support Sam in everything he does. He will be there for him no matter what. He will always give advice and encouragement.
Provided Sam does ever so slightly worse than him at everything.
Gavin is repeating everything his big brother Tom says, but in the voice of Zippy from Rainbow.
He has also prepared a story about Tom wetting his pants at Jack and the Beanstalk when he danced with the cow.
Tom's new girlfriend will be here in ten minutes. Gavin is ready.
Despite being a county–class hammer thrower, Tarragon has not been asked to open the school fete.
Once again, he suspects the only reason he was allowed on the PTA committee was to invite one of his brothers to open it.
Sometimes it is tough being a Dimbleby.
When Mike visits his big brother Dan, he always gets a parcel of things to take home with him.
"I don't really need this stuff any more. Have the lot," says Dan, generously.
When he gets back to his flat, Mike will put the VHS cassettes in the charity shop, the off–cuts of MDF in the shed and all the potato peelings in the bin.
Everett is writing his memoirs.
He knows he wants to mention his time as a prisoner of war, and his music career, and his work as a neurosurgeon, and what it was like at the earth's core.
But mainly he wants to settle an argument he once had with his brother about Gary Neville.
Chang–Chang is sad.
To stop his species becoming extinct, he is probably, at some point, going to have to be nice to his sister.
Chris did not know that his lonely old neighbour, Humbert, had any brothers or sisters.
When Humbert became too ill to leave his forty–six–room mansion, Chris looked in on Humbert and got his shopping. Chris arranged the funeral and even offered to carry Humbert's coffin.
But he does not need to. There are suddenly lots and lots of long–lost brothers and sisters who are very keen to help.
Every birthday, Lee and his brother Wayne re–gift the amusing present that Lee gave Wayne when he was six. It is a running joke.
Lee and Wayne suspect their joke stopped being funny some years ago, but they are carrying on because they are brothers, and not carrying on would indicate something terrible that they can neither explain nor consider.
Also, it is easier than finding an actual present.
Caspar's parents have decided he is "the musical one", so he must practise every day after school.
Caspar is cross because his brother Ludwig gets to play in the garden instead. Ludwig is "the sporty one".
The Beethoven family have great hopes for their boys.
Adrian is explaining that he will not give his sister Lucy any of his bone marrow if she ever needs it.
Not until she gives him back the Fine Young Cannibals single she borrowed in 1989.
"It's not the bone marrow," says Adrian. "It's the principle."
_The Publishers gratefully acknowledge assistance provided by Dr Chris Packet, Acting-up Senior Sibling of CAIN (the Consanguine Aggression Information Network) in preparing this book._
##### MICHAEL JOSEPH
UK | USA | Canada | Ireland | Australia
India | New Zealand | South Africa
Michael Joseph is part of the Penguin Random House group of companies whose addresses can be found at global.penguinrandomhouse.com
First published 2017
Copyright © Jason Hazeley and Joel Morris, 2017
All images copyright © Ladybird Books Ltd, 2017
The moral right of the authors has been asserted
ISBN: 978-1-405-93404-6
|
// Test that small number of intermediate navigations within OAuth start and
// completion are allowed.
TEST_F(OAuthLoginDetectorTest, IntermediateNavigationsAfterOAuthStart) {
EXPECT_FALSE(oauth_login_detector_->GetSuccessfulLoginFlowSite(
GURL("https://foo.com/login.html"),
{
GURL("https://oauth.com/authenticate?client_id=123"),
GURL("https://oauth.com/login"),
}));
EXPECT_FALSE(oauth_login_detector_->GetSuccessfulLoginFlowSite(
GURL("https://oauth.com/login"),
{
GURL("https://oauth.com/login?username=123&password=123"),
GURL("https://oauth.com/relogin"),
}));
EXPECT_FALSE(oauth_login_detector_->GetSuccessfulLoginFlowSite(
GURL("https://oauth.com/relogin"),
{
GURL("https://oauth.com/login?username=123&password=123"),
GURL("https://oauth.com/relogin"),
}));
EXPECT_TRUE(oauth_login_detector_->GetSuccessfulLoginFlowSite(
GURL("https://oauth.com/relogin"),
{GURL("https://oauth.com/login?username=123&password=123"),
GURL("https://oauth.com/loginsuccess"),
GURL("https://foo.com/redirect?code=secret")}));
EXPECT_FALSE(oauth_login_detector_->GetPopUpLoginFlowSite());
} |
/**
* Called once the client becomes visible.
*
* @see AutofillClient#isVisibleForAutofill()
*/
void onVisibleForAutofillLocked() {
AutofillClient client = getClientLocked();
ArraySet<AutofillId> updatedVisibleTrackedIds = null;
ArraySet<AutofillId> updatedInvisibleTrackedIds = null;
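// Re-check visibility for every tracked view and rebuild the
// visible/invisible sets from scratch, moving any view whose
// visibility changed while the client was not visible for autofill.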
if (client != null) {
if (mInvisibleTrackedIds != null) {
final ArrayList<AutofillId> orderedInvisibleIds =
new ArrayList<>(mInvisibleTrackedIds);
final boolean[] isVisible = client.getViewVisibility(
getViewIds(orderedInvisibleIds));
final int numInvisibleTrackedIds = orderedInvisibleIds.size();
for (int i = 0; i < numInvisibleTrackedIds; i++) {
final AutofillId id = orderedInvisibleIds.get(i);
if (isVisible[i]) {
updatedVisibleTrackedIds = addToSet(updatedVisibleTrackedIds, id);
if (sDebug) {
Log.d(TAG, "onVisibleForAutofill() " + id + " became visible");
}
} else {
updatedInvisibleTrackedIds = addToSet(updatedInvisibleTrackedIds, id);
}
}
}
if (mVisibleTrackedIds != null) {
final ArrayList<AutofillId> orderedVisibleIds =
new ArrayList<>(mVisibleTrackedIds);
final boolean[] isVisible = client.getViewVisibility(
getViewIds(orderedVisibleIds));
final int numVisibleTrackedIds = orderedVisibleIds.size();
for (int i = 0; i < numVisibleTrackedIds; i++) {
final AutofillId id = orderedVisibleIds.get(i);
if (isVisible[i]) {
updatedVisibleTrackedIds = addToSet(updatedVisibleTrackedIds, id);
} else {
updatedInvisibleTrackedIds = addToSet(updatedInvisibleTrackedIds, id);
if (sDebug) {
Log.d(TAG, "onVisibleForAutofill() " + id + " became invisible");
}
}
}
}
mInvisibleTrackedIds = updatedInvisibleTrackedIds;
mVisibleTrackedIds = updatedVisibleTrackedIds;
}
if (mVisibleTrackedIds == null) {
finishSessionLocked();
}
} |
// GetAll returns all metrics that have been setup.
//
// Docs: https://docs.cachethq.io/reference#get-metrics
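//
// Illustrative call (not from the original source; assumes an already
// constructed client):
//
//	metrics, resp, err := client.Metrics.GetAll(nil) // nil filter: no query params
//	if err != nil {
//		log.Fatal(err)
//	}
//	_, _ = metrics, resp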
func (s *MetricsService) GetAll(filter *MetricQueryParams) (*MetricResponse, *Response, error) {
u := "api/v1/metrics"
v := new(MetricResponse)
u, err := addOptions(u, filter)
if err != nil {
return nil, nil, err
}
resp, err := s.client.Call("GET", u, nil, v)
return v, resp, err
} |
#<NAME>
#6/21/2020
#An API for Yahoo Finance
import urllib3
from bs4 import BeautifulSoup
import lxml
import html5lib
import ast
import json
from datetime import datetime, timedelta
def testFunction():
print("test")
stocks = dict()
__debugMode = False
#general scraper to get the url
def scraper(url):
http = urllib3.PoolManager()
response = http.request('GET', url)
soup = BeautifulSoup(response.data, "lxml")
return soup
#sets the debug mode
def setDebugMode(mode):
global __debugMode
__debugMode = mode
def message(text):
if(__debugMode):
print(text)
#takes in a stock symbol and stores the data. Use this every time you want to refresh
def initializStockData(sym):
xml = -1
xmlHistorical = -1
try:
url = "https://finance.yahoo.com/quote/" + sym
xml = scraper(url)
url = "https://finance.yahoo.com/quote/"+sym+"/history?period1=1&period2=2000000000&interval=1d&filter=history&frequency=1d"
xmlHistorical = scraper(url)
message("Success initializing: " + sym)
except:
message("Error initializing: " + sym)
return None
stock = Stock(xml, xmlHistorical, sym)
__initStockData(stock)
stocks[sym] = stock
writeToJson(sym)
return True
def __initStockData(stock):
__initStockPriceAtClose(stock)
__initStockPriceAfterHours(stock)
__initChangeAtClose(stock)
__initChangeAfterHours(stock)
__initPreviousClose(stock)
__initOpen(stock)
__initDayRange(stock)
__init52WeekRange(stock)
__initVolume(stock)
__initAverageVolume(stock)
__initMarketCap(stock)
__initHistoricalDataAll(stock)
def writeToJson(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
file = open( sym + "_data.json", "w")
data = {}
data["symbol"] = stock.symbol
data["price at close"] = stock.priceAtClose
data["point change at close"] = stock.pointChangeAtClose
data["percentage change at close"] = stock.percentageChangeAtClose
data["price after hours"] = stock.priceAfterHours
data["percentage change after hours"] = stock.percentageChangeAfterHours
data["point change after hours"] = stock.pointChangeAfterHours
data["previous close"] = stock.previousClose
data["open price"] = stock.openPrice
data["day range"] = stock.dayRange
data["volume"] = stock.volume
data["average volume"] = stock.averageVolume
data["market cap"] = stock.marketCap
data["52 week low"] = stock.yearRange[0]
data["52 week high"] = stock.yearRange[1]
data["historical data"] = stock.OHCL
json.dump(data, file, indent=4)
def initializStockDataFromJson(sym):
stock = Stock("", "", sym)
try:
fileName = sym + "_data.json"
file = open(fileName, "r")
data = json.load(file)
stock.priceAtClose = data["symbol"]
stock.priceAfterHours = data["price at close"]
stock.pointChangeAtClose = data["point change at close"]
stock.percentageChangeAtClose = data["percentage change at close"]
stock.percentageChangeAfterHours = data["percentage change after hours"]
stock.pointChangeAfterHours = data["point change after hours"]
stock.previousClose = data["previous close"]
stock.openPrice = data["open price"]
stock.dayRange = data["day range"]
stock.volume = data["volume"]
stock.averageVolume = data["average volume"]
stock.marketCap = data["market cap"]
stock.yearRange = [data["52 week low"], data["52 week high"]]
stock.OHCL = data["historical data"]
except:
message("error trying to read data from " + sym + "_data.json")
return None
stocks[sym] = stock
return True
def __initStockPriceAtClose(stock):
try:
xml = stock.xml.find_all("span", {"data-reactid": "50"})
stock.priceAtClose = float(xml[0].string.strip().replace(",",""))
except:
message(stock.symbol + " close price data not currently available")
def __initStockPriceAfterHours(stock):
try:
xml = stock.xml.find_all("span", {"data-reactid": "55"})
stock.priceAfterHours = float(xml[0].string.strip().replace(",",""))
except:
message(stock.symbol + " after hours data not currently available")
def __initChangeAtClose(stock):
try:
xml = stock.xml.find_all("span", {"data-reactid": "51"})
temp = xml[0].string.strip().replace(",","")
amountPoints = ""
amountPercentage = ""
i = 0
while(i < len(temp) and temp[i] != '('):
amountPoints += temp[i]
i += 1
i += 1
while(i < len(temp) and temp[i] != '%'):
amountPercentage += temp[i]
i += 1
stock.pointChangeAtClose = float(amountPoints)
stock.percentageChangeAtClose = float(amountPercentage)
except:
message(stock.symbol + " change at close data not currently available")
def __initChangeAfterHours(stock):
try:
xml = stock.xml.find_all("span", {"data-reactid": "58"})
temp = xml[0].string.strip().replace(",","")
amountPoints = ""
amountPercentage = ""
i = 0
while(i < len(temp) and temp[i] != '('):
amountPoints += temp[i]
i += 1
i += 1
while(i < len(temp) and temp[i] != '%'):
amountPercentage += temp[i]
i += 1
stock.pointChangeAfterHours = float(amountPoints)
stock.percentageChangeAfterHours = float(amountPercentage)
except:
message(stock.symbol + " change after hours data not currently available")
def __initPreviousClose(stock):
try:
xml = stock.xml.find_all("span", {"data-reactid": "98"})
previousClose = float(xml[0].string.strip().replace(",",""))
stock.previousClose = previousClose
except:
message(stock.symbol + " previous close data not currently available")
def __initOpen(stock):
try:
xml = stock.xml.find_all("span", {"data-reactid" : "103"})
openPrice = float(xml[0].string.strip().replace(",",""))
stock.openPrice = openPrice
except:
message(stock.symbol + " open price data not currently available")
def __initDayRange(stock):
high = ""
low = ""
try:
xml = stock.xml.find_all("td", {"data-reactid" : "117"})
temp = xml[0].string.strip().replace(",","")
i = 0
while(i < len(temp) and temp[i] != " "):
low += temp[i]
i += 1
i += 3
while(i < len(temp)):
high += temp[i]
i+= 1
dayRange = [float(low), float(high)]
stock.dayRange = dayRange
except:
message(stock.symbol + " day range data not currently available")
def __init52WeekRange(stock):
high = ""
low = ""
try:
xml = stock.xml.find_all("td", {"data-reactid" : "121"})
temp = xml[0].string.strip().replace(",","")
i = 0
while(i < len(temp) and temp[i] != " "):
low += temp[i]
i += 1
i += 3
while(i < len(temp)):
high += temp[i]
i+= 1
yearRange = [float(low), float(high)]
stock.yearRange = yearRange
except:
message("52 week range data not currently available")
def __initVolume(stock):
volume = stock.getDataElement("126", "volume data not currently available")
if(volume is None):
return None
stock.volume = float(volume)
def __initAverageVolume(stock):
averageVolume = stock.getDataElement("131", "average volume data not currently available")
if(averageVolume is None):
return None
stock.averageVolume = float(averageVolume)
def __initMarketCap(stock):
temp = stock.getDataElement("139", "market cap data not currently available")
if(temp is None):
return None
marketCap = ""
for i in temp:
if(i.isdigit() or i == "."):
marketCap += i
marketCap = float(marketCap)
if(temp[len(temp) - 1] == "T"):
marketCap *= 1000000000000
elif(temp[len(temp) - 1] == "B"):
marketCap *= 1000000000
elif(temp[len(temp) - 1] == "M"):
marketCap *= 1000000
stock.marketCap = marketCap
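
# Worked example of the suffix scaling above: a scraped value of "2.5T"
# yields 2.5 * 10**12, "640.2B" yields 6.402 * 10**11, and "75M" yields
# 7.5 * 10**7.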
def __initHistoricalDataAll(stock):
    try:
        xml = stock.xmlHistorical
        temp = str(xml)
        # The OHCL records are embedded in the page as a JSON-like list
        # following the "HistoricalPriceStore" key.
        i = temp.find("HistoricalPriceStore")
        stringData = ""
        while(temp[i] != "["):
            i += 1
        while(temp[i] != "]"):
            stringData += temp[i]
            i += 1
        stringData += "]"
        ohcl = {}
        data = ast.literal_eval(stringData)
        openD = []
        high = []
        low = []
        close = []
        adjClose = []
        volume = []
        date = []
        firstDay = datetime(1970, 1, 1)
        for x in data:
            try:
                # x["date"] is a Unix timestamp; convert it to an ISO date.
                dateInt = int(int(x["date"]) / 86400)
                dateString = (firstDay + timedelta(days=dateInt)).date().isoformat()
                openD.append(float(x["open"]))
                high.append(float(x["high"]))
                low.append(float(x["low"]))
                close.append(float(x["close"]))
                volume.append(float(x["volume"]))
                adjClose.append(float(x["adjclose"]))
                date.append(dateString)
            except:
                continue  # skip malformed rows (e.g. dividend/split entries)
        ohcl["open"] = openD
        ohcl["high"] = high
        ohcl["low"] = low
        ohcl["close"] = close
        ohcl["volume"] = volume
        ohcl["adjclose"] = adjClose
        ohcl["date"] = date
    except:
        message("error trying to access OHCL data for " + stock.symbol)
        return None
    stock.OHCL = ohcl
#returns the initialized stock
def getInitializedStock(sym):
try:
stock = stocks[sym]
return stock
except:
message("Error trying to access: " + sym + ". Try initializing the stock first")
return None
#gets the stock price from a list of stocks
def getStockPriceAtClose(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
price = stock.priceAtClose
if(price is None):
return None
message(sym + " Price of at close: " + str(price))
return price
#gets the stock price after hours
def getStockPriceAfterHours(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
price = stock.priceAfterHours
if(price is None):
return None
message(sym + " Price after hours: " + str(price))
return price
#gets the change at close and returns the point and percentage change
def getChangeAtClose(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
amountPoints = stock.pointChangeAtClose
amountPercentage = stock.percentageChangeAtClose
if(amountPoints is None or amountPercentage is None):
return None
return [float(amountPoints), float(amountPercentage)]
#gets the point change at close
def getPointChangeAtClose(sym):
change = getChangeAtClose(sym)
if(change is None):
return None
message(sym + " point change at close: " + str(change[0]))
return change[0]
#gets the percentage change at close
def getPercentageChangeAtClose(sym):
change = getChangeAtClose(sym)
if(change is None):
return None
message( sym + " percentage change at close: " + str(change[1]) + "%")
return change[1]
#gets the change after hours and returns the point and percentage change
def getChangeAfterHours(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
amountPoints = stock.pointChangeAfterHours
amountPercentage = stock.percentageChangeAfterHours
if(amountPoints is None or amountPercentage is None):
return None
return [float(amountPoints), float(amountPercentage)]
#gets the point change after hours
def getPointChangeAfterHours(sym):
change = getChangeAfterHours(sym)
if(change is None):
return None
message(sym + " point change after hours: " + str(change[0]))
return change[0]
#gets the percentage change after hours
def getPercentageChangeAfterHours(sym):
change = getChangeAfterHours(sym)
if(change is None):
return None
message( sym + " percentage change after hours: " + str(change[1]) + "%")
return change[1]
#gets the previous close
def getPreviousClose(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
previousClose = stock.previousClose
if(previousClose is None):
return None
message(sym + " previous close: " + str(previousClose))
return previousClose
#gets the opening price
def getOpen(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
openPrice = stock.openPrice
if(openPrice is None):
return None
message(sym + " open: " + str(openPrice))
return openPrice
#gets the day's range
def getDayRange(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
dayRange = stock.dayRange
if(dayRange is None):
return None
return dayRange
#gets the day's low
def getDayLow(sym):
    stock = getInitializedStock(sym)
    if(stock is None or stock.dayRange is None):
        return None
    message(sym + " day low: " + str(stock.dayRange[0]))
    return stock.dayRange[0]
#gets the day's high
def getDayHigh(sym):
    stock = getInitializedStock(sym)
    if(stock is None or stock.dayRange is None):
        return None
    message(sym + " day high: " + str(stock.dayRange[1]))
    return stock.dayRange[1]
#gets the 52 week range
def get52WeekRange(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
yearRange = stock.yearRange
if(yearRange is None):
return None
return yearRange
#gets the 52 week low
def get52WeekLow(sym):
    stock = getInitializedStock(sym)
    if(stock is None or stock.yearRange is None):
        return None
    message(sym + " 52 week low: " + str(stock.yearRange[0]))
    return stock.yearRange[0]
#gets the 52 week high
def get52WeekHigh(sym):
    stock = getInitializedStock(sym)
    if(stock is None or stock.yearRange is None):
        return None
    message(sym + " 52 week high: " + str(stock.yearRange[1]))
    return stock.yearRange[1]
#gets the volume for a stock symbol
def getVolume(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
volume = stock.volume
if(volume is None):
return None
message(sym + " volume: " + str(volume))
return volume
def getAverageVolume(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
averageVolume = stock.averageVolume
if(averageVolume is None):
return None
message(sym + " average volume: " + str(averageVolume))
return averageVolume
def getMarketCap(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
marketCap = stock.marketCap
if(marketCap is None):
return None
message(sym + " market cap: " + str(marketCap))
return marketCap
#returns the historical data in a dict with keys: "open", "low", "high", "close"
#each key holds data from most recent to least recent
def getHistoricalDataAll(sym):
stock = getInitializedStock(sym)
if(stock is None):
return None
ohcl = stock.OHCL
return ohcl
#returns the historical data in OHCL format from a range of trading days
#num1 holds the start of how many days ago and num2 holds the end
#e.g. num1 = 100 and num2 = 20 returns the data from 20 days ago to 100 days ago
#returns data from most recent to least recent
def getHistoricalDataRangeTradingDays(sym, num1, num2):
if(num2 < num1):
t = num2
num2 = num1
num1 = t
stock = getInitializedStock(sym)
if(stock is None):
return None
ohcl = stock.OHCL
if(ohcl is None):
return None
data = {}
try:
data["open"] = ohcl["open"][num1 : num2]
data["close"] = ohcl["close"][num1 : num2]
data["low"] = ohcl["low"][num1 : num2]
data["high"] = ohcl["high"][num1 : num2]
return data
except:
return None
#returns the historical data in OHCL format from an x number of trading days from most recent to least recent
def getHistoricalDataPastXTradingDays(sym, num):
return getHistoricalDataRangeTradingDays(sym, 0, num)
#returns the OHCL data of the past 5 trading days from most recent to least recent
def getHistoricalDataPast5TradingDays(sym):
return getHistoricalDataPastXTradingDays(sym, 5)
#return the OHCL data of the past 30 trading days from most recent to least recent
def getHistoricalDataPast30TradingDays(sym):
return getHistoricalDataPastXTradingDays(sym, 30)
#takes in two dates of strings in 'YYYY-MM-DD' format and returns the historical data in that range
def getHistoricalDataRangeOfDates(sym, date1, date2):
    stock = getInitializedStock(sym)
    if(stock is None or stock.OHCL is None):
        return None
    dates = stock.OHCL["date"]
try:
d1 = dates.index(date1)
d2 = dates.index(date2)
except:
message("error trying to access data from dates given")
return None
return getHistoricalDataRangeTradingDays(sym, d1, d2)
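
# Example usage sketch (hypothetical symbol and dates; assumes the stock has
# been initialized and its historical data was scraped successfully):
#
#   pastWeek = getHistoricalDataPast5TradingDays("AAPL")
#   window = getHistoricalDataRangeOfDates("AAPL", "2020-03-02", "2020-01-02")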
class Stock:
    def __init__(self, xml, xmlHistorical, symbol):
        self.xml = xml
        self.xmlHistorical = xmlHistorical
        self.symbol = symbol
        self.priceAtClose = None
        self.priceAfterHours = None
        self.pointChangeAtClose = None
        self.percentageChangeAtClose = None
        self.pointChangeAfterHours = None
        self.percentageChangeAfterHours = None
        self.previousClose = None
        self.openPrice = None
        self.dayRange = None
        self.yearRange = None
        self.volume = None
        self.averageVolume = None
        self.marketCap = None
        self.OHCL = None
    def __str__(self):
        return self.symbol
    def getDataElement(self, id, errorMessage):
        try:
            xml = self.xml.find_all("span", {"data-reactid" : str(id)})
            return xml[0].string.strip().replace(",","")
        except:
            message(errorMessage)
            return None
|
package dbhandle
type IDBHandle interface {
// Detailed transaction information
GetTransactiondetail(StockCode string, date string) ([]map[string]interface{}, error)
SetTransactiondetail(C, D, T, TS, TK0, TK1, TLong, CH, N, NF, Y, Z, IP, TV, A, F, B, G, EX, IT, MT, O, OA, OB, OT, OV, OZ, I, L, H, V, W, U, S, P, PS, PZ string) error
// GetstockAlreadyDay(StockCode string) ([]map[string]interface{}, error)
// Historical per-stock trading data
Setstockyear(data ...interface{}) error
Setstockmonth(data ...interface{}) error
Setstockday(data ...interface{}) error
// Daily per-stock P/E ratio, dividend yield and price-to-book ratio
SetBwibbu(data ...interface{}) error
// Trading value statistics for the three major institutional investors
SetLegalperson(flag, date string, data interface{}) error
// Record of data already collected
Getcollectionflag(StockCode, Flag string) ([]map[string]interface{}, error)
Setcollectionflag(StockCode, Flag, Date string) error
}
// var stockBDSQL *processdb.SQLCLi
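// DBHandle wraps an IDBHandle implementation, so callers can embed or swap
// the concrete backend (e.g. the SQL client hinted at above) behind the
// interface.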
type DBHandle struct {
IDBHandle
}
var Instance IDBHandle
|
Structure and Asymptotic Theory for Multivariate Asymmetric Volatility: Empirical Evidence for Country Risk Ratings *
Following the rapid growth in the international debt of less developed countries in the 1970s and the increasing incidence of debt rescheduling in the early 1980s, country risk has become a topic of major concern for the international financial community. A critical assessment of country risk is essential because it reflects the ability and willingness of a country to service its financial obligations. Various risk rating agencies employ different methods to determine country risk ratings, combining a range of qualitative and quantitative information regarding alternative measures of economic, financial and political risk into associated composite risk ratings. This paper provides an international comparison of country risk ratings compiled by the International Country Risk Guide (ICRG), which is the only international rating agency to provide detailed and consistent monthly data over an extended period for a large number of countries. As risk ratings can be treated as indexes, their rate of change, or returns, merits attention in the same manner as financial returns. For this reason, a constant correlation multivariate asymmetric ARMA-GARCH model is presented and its underlying structure is established, including the unique, strictly stationary and ergodic solution of the model, its causal expansion, and convenient sufficient conditions for the existence of moments. Alternative empirically verifiable sufficient conditions for the consistency and asymptotic normality of the quasi-maximum likelihood estimator are established under non-normality of the conditional (or standardized) shocks. The empirical results provide a comparative assessment of the conditional means and volatilities associated with international country risk returns across countries and over time, enable a validation of the regularity conditions underlying the models, highlight the importance of economic, financial and political risk ratings as components of a composite risk rating, evaluate the multivariate effects of alternative risk returns and different countries, and evaluate the usefulness of the ICRG risk ratings in modelling risk returns. |
Characterising inflations of monotone grid classes of permutations
We characterise those permutation classes whose simple permutations are monotone griddable. This characterisation is obtained by identifying a set of nine substructures, at least one of which must occur in any simple permutation containing a long sum of 21s.
Introduction
A common route to understanding the structure of a permutation class (and hence, e.g. complete its enumeration) is via its simple permutations, as their structure can be considerably easier to characterise than the entire class. Albert, Atkinson, Homberger and Pantone introduced the notion of deflatability to study this phenomenon: that is, the property that the simples in a given permutation class C actually belong to a proper subclass D ⊊ C. See also Vatter's recent survey .
One general case of deflatability is where the set of simple permutations of a class is finite. Such classes are well-quasi-ordered, finitely based, and have algebraic generating functions , and via a Ramsey-type result for simple permutations , it is decidable when a permutation class has this property .
In this paper, we look beyond classes with finitely many simples to those whose simples are 'monotone griddable', and prove the following characterisation. We postpone formal definitions until later, but see Figure 1 for examples of the structures mentioned.

Theorem 1.1. The simple permutations of a class C are monotone griddable if and only if C does not contain any of the following structures (up to symmetry):
• arbitrarily long parallel sawtooth alternations,
• arbitrarily long sliced wedge sawtooth alternations,
• proper pin sequences with arbitrarily many turns, and
• spiral proper pin sequences with arbitrarily many extensions.
In general, classes whose simple permutations are monotone griddable do not immediately possess the range of properties that classes with only finitely many simples do. Indeed, few general properties are known even for classes that are themselves wholly monotone griddable, but this has not diminished the efficacy of the following characterisation for the structural understanding and enumeration of many classes (see, for example ).
A sum of k copies of 21 is the permutation 21 43 65 · · · (2k)(2k − 1), written in one line notation. We will often abbreviate this to ⊕ k 21. Similarly, a skew sum of k copies of 12 is the permutation ⊖ k 12 = (2k − 1)(2k) · · · 34 12. Theorem 1.2 (Huczynska and Vatter ). A class C is monotone griddable if and only if it does not admit arbitrarily long sums of 21 or skew sums of 12. That is, for some k neither ⊕ k 21 nor ⊖ k 12 belong to C.
Aside from the structural information that it provides in its own right, the reason that this simple-tocheck characterisation has proved so useful is that the classes to which it has been applied typically in fact possess the stronger property of being geometrically griddable. Such classes are well-quasi-ordered, finitely based and have rational generating functions . As we have no direct characterisation for a class to be geometrically griddable, the above theorem (which certainly provides a necessary condition) has often provided enough of a 'hook' to solve the task at hand.
It is our hope that Theorem 1.1 can provide a similar 'hook' to ease the study of classes whose simple permutations are geometrically griddable. Any such class is known to be well-quasi-ordered, finitely based, and strongly algebraic (meaning that it and every subclass have algebraic generating functions), see Albert, Ruškuc and Vatter . Furthermore, every class with growth rate less than κ ≈ 2.20557, is of this form . For instances of practical enumeration tasks that have exploited the geometric griddability of the simple permutations, see Albert, Atkinson and Vatter , and Pantone .
Our characterisation in Theorem 1.1 relies on the following auxiliary result, which guarantees the existence of certain types of structure in simple permutations that contain a long sum of 21s.
Theorem 1.3. There exists a function f(n) such that every simple permutation that contains a sum of f(n) copies of 21 must contain a parallel or wedge sawtooth alternation of length 3n or an increasing oscillation of length n.
See Figure 2 for examples of the three types of unavoidable structure. Note that wedge sawtooth alternations are not necessarily simple, so the existence of wedge sawtooth alternations in a permutation class does not guarantee that the simple permutations are not monotone griddable, but Theorem 1.3 nevertheless provides a sufficient condition.
The rest of this paper is organised as follows. In Section 2 we introduce basic notions and define the structures mentioned in the above results. In Section 3 we give the proof of Theorem 1.3, and in Section 4 we complete the proof of Theorem 1.1. In the final section, we discuss some future directions for this work.
Preliminaries
For definitions common to the wider study of permutation patterns, we refer the reader to Bevan's introduction . For a broader background to the study of permutation classes, see Vatter's excellent survey in the Handbook of enumerative combinatorics.
Geometry, simplicity and gridding
Critical to our work is the ability to visualise permutations and parts of permutations in the plane. The plot of a permutation π of length n is the set of coordinates (or points) (i, π(i)) for i = 1, . . . , n.
In a slight abuse of notation, we will rarely distinguish between a permutation and its plot.
This exposes an important collection of symmetries that are available -specifically the dihedral group generated by reflections in a vertical, horizontal, or diagonal axis. It is to these symmetries we refer when we make an appeal 'by symmetry'. In particular to prove Theorem 1.1 it suffices to show that if the simple permutations of a class C contain arbitrarily long sums of 21, then configurations of one of the specified types must occur.
Given points p 1 , . . . , p k in the plane (typically belonging to the plot of a permutation), denote by rect(p 1 , . . . , p k ) the smallest axes-parallel rectangle that contains them. We call rect(p 1 , . . . , p k ) the rectangular hull of p 1 , . . . , p k .
Let R be any axes-parallel rectangle in the plot of a permutation π. The rectangle R divides the plot of π into nine regions, and we identify the four 'corners' as NE, NW, SE and SW of π relative to R, as illustrated in the following diagram.
[Diagram: the rectangle R and the four corner regions NE, NW, SE and SW of π relative to R.]
For the rectangle R itself, denote by π| R the permutation that is order isomorphic to the points of π contained in R.
Any point (or collection of points) that lies in one of the four unlabelled regions in the above picture is said to slice the rectangle R. Put formally, if R = (a, b) × (c, d) is a rectangle, then the point (x, y) slices R vertically if x ∈ (a, b) and y ∉ (c, d), and horizontally if x ∉ (a, b) and y ∈ (c, d). By extension, we say that a point slices a collection of points in the plane if it slices their rectangular hull.
An interval of a permutation π is a (nonempty) set of points {(i, π(i)) : i ∈ I} for some set of indices I, such that both I and π(I) = {π(i) : i ∈ I} form contiguous sets of natural numbers.
We can easily identify an interval geometrically by noting that rect((i, π(i)) : i ∈ I) cannot be sliced, and must contain only points corresponding to indices from I. Equivalently, the (nonempty) set of points of π belonging to an unsliced axes-parallel rectangle R form an interval.
Trivially, every singleton of π and the whole of π form intervals. If there are no other intervals and π = 1, then π is said to be simple.
Given a permutation σ of length n, and permutations π 1 , . . . , π n , the inflation of σ by π 1 , . . . , π n is the permutation obtained by replacing each entry σ(i) by a sequence of points forming an interval order isomorphic to π i , and with the intervals in the same relative ordering as the corresponding points of σ. This permutation is commonly denoted by σ[π 1 , . . . , π n ].
The reverse process to inflation (i.e., decomposing a permutation into intervals) forms the basis for the substitution decomposition, the essence of which is captured in the following result.

Proposition 2.1 (Albert and Atkinson ). Every permutation π is expressible as the inflation of a unique simple permutation σ. Furthermore, if |σ| ≥ 4, then in the expression π = σ[π 1 , . . . , π n ], the intervals π 1 , . . . , π n are also unique.
For a permutation which is the inflation of a simple permutation of length 2 (i.e., σ = 12 or 21), we do not have the same guarantee of uniqueness of the intervals (although there are methods to recover this if needed). If π = 12[π 1 , π 2 ] for some permutations π 1 and π 2 , then we also write π = π 1 ⊕ π 2 , and say that π is sum decomposable. Any permutation that is not sum decomposable is sum indecomposable. Similarly, if π = 21[π 1 , π 2 ] we write π = π 1 ⊖ π 2 and say that π is skew decomposable, otherwise π is skew indecomposable. Finally, the case where π is both sum and skew indecomposable corresponds to the case in Proposition 2.1 where the unique simple permutation has length at least 4 (as there are no simple permutations of length 3).
For completeness, we now briefly introduce the notion of griddability. However, we do not actually require this definition in our work (the characterisation provided by Theorem 1.2 suffices). A class C is monotone griddable if there exist integers h and v such that for every permutation π ∈ C, we may divide the plot of π into cells using at most h horizontal and v vertical lines, in such a way as the points in each cell form a monotone increasing or decreasing sequence (or the cell is empty).
Pin sequences
Following , a pin sequence is a sequence of points p 1 , p 2 , . . . in the plot of π such that for each i ≥ 3, p i slices rect(p 1 , . . . , p i−1 ). Each pin p i for i ≥ 3 has a direction -one of left, right, up or downbased on its position relative to the rectangle that it slices. By convention, the pins p 1 and p 2 will be regarded as having no direction.
A proper pin sequence is one that satisfies two additional conditions: • Maximality: each pin must be maximal in its direction. For example, if p i is a right pin, then there are no points further to the right of p i that slice rect(p 1 , . . . , p i−1 ). • Separation: each pin p i+1 must separate p i from rect(p 1 , . . . , p i−1 ).
As all pin sequences required in the sequel will be proper, for brevity we will sometimes use the term 'pin sequence' to mean a proper pin sequence. We now recall some basic properties of (proper) pin sequences. See the lower-left part of Figure 1 for examples of pin sequences.
The following result is critical to what will follow later. A pin sequence p 1 , . . . , p m in a permutation π is said to be right reaching if p m is the rightmost point of π.

Lemma 2.3 (Brignall, Huczynska and Vatter ). For every simple permutation π and pair of points p 1 and p 2 (unless, trivially, p 1 is the right-most point of π), there is a (proper) right-reaching pin sequence beginning with p 1 and p 2 .
We now introduce some new terminology relating to pins that we will require for our characterisation. Let p 1 , . . . , p m be a pin sequence in a permutation π. We say that a pin p i turns if the direction of p i is the same as the direction of p i−2 . The significance of this concept is in the following observation.
Lemma 2.4. For every k, a pin sequence containing at least 3k turns contains a sum of at least ⌈k/2⌉ copies of 21 or a skew sum of at least ⌈k/2⌉ copies of 12.

Proof. For a pin sequence p 1 , . . . , p m , let p denote the length of the longest sum of 21s and q the length of the longest skew sum of 12s that the permutation corresponding to the sequence contains. Let ℓ(p 1 , . . . , p m ) = p + q.
We will prove by induction on k the following statement: every pin sequence p 1 , . . . , p m containing 3k turns satisfies ℓ(p 1 , . . . , p m ) ≥ k. The lemma will follow directly.
The base case k = 0 is trivially true, so let p 1 , . . . , p m be a pin sequence with 3(k + 1) turns. Let p j be the latest pin in the sequence which forms a turn, noting that j ≥ 3(k + 1) + 3 ≥ 6 (since the first three pins cannot be turns). By symmetry, we may assume that p j is a right pin (and hence so is p j−2 ) and p j−1 an up pin. Note also that p 1 , . . . , p j−3 contains at least 3(k + 1) − 3 = 3k turns, so by induction we know that ℓ(p 1 , . . . , p j−3 ) ≥ k. We now have the following situation. By inspection, we see that the pair of points p j−1 , p j forms a copy of 21 that is NE of rect(p 1 , . . . , p j−3 ), and so we may add this copy of 21 to the longest sum of 21s that can be found in rect(p 1 , . . . , p j−3 ). Thus we conclude ℓ(p 1 , . . . , p m ) ≥ ℓ(p 1 , . . . , p j ) ≥ k + 1.
The permutations in Theorems 1.1 and 1.3
Sawtooth alternations A sawtooth alternation of length 3n is a permutation on 3n points that contains a sum of n copies of 21 placed alongside (horizontally or vertically) a monotone sequence of n points, in such a way that each copy of 21 is sliced by a single entry from the monotone sequence (see the first two illustrations in Figure 2). We divide the family of sawtooth alternations into two types: a parallel sawtooth alternation is one in which the monotone sequence is increasing, while a wedge sawtooth alternation is one in which the monotone sequence is decreasing. See Figure 2 (on page 3).
It is easy to verify that for n ≥ 2, the parallel sawtooth alternations of length 3n are simple. On the other hand, no wedge sawtooth alternation is simple. This fact underpins the extra work that is required in order to get from Theorem 1.3 to Theorem 1.1.
To recover simplicity in wedge sawtooth alternations, consider the sawtooth alternation shown on the left of Figure 2. The leftmost three points of this permutation forms an interval that is order isomorphic to 312. In order to break this (and every other) interval, we form sliced wedge sawtooth alternations in one of three ways: pull the '1' of this 312 below all the other points of the monotone sequence (type 1), pull the '2' to the right of all other points of the permutation (type 2), or replace the '1' with a new maximal element in the permutation (type 3). See the top-right portion of Figure 1 (on page 2). For wedge sawtooth alternations that are oriented differently, we make the analogous definitions by appealing to symmetry.
Proper pin sequences with turns As defined in the previous subsection, a turn in a pin sequence is a pin p i that has the same direction as p i−2 . By Lemma 2.4, a pin sequence that contains a lot of turns also contains a long sum of 21s or skew sum of 12s (or both). See the bottom left portion of Figure 1, where the pins that are turns have been marked with hollow points. By Proposition 2.2(d), pin sequences with turns either correspond to simple permutations, or they correspond to permutations for which we may remove one point to recover a simple permutation.
Increasing oscillations An increasing oscillation of length n is a permutation on n points formed by a pin sequence that starts from a copy of 21, and then is entirely made up of right and up pins. (That is, for every i ≥ 5 the pin is a turn.) There are two increasing oscillations of each length, which may be obtained from one another by symmetry. See the rightmost illustration in Figure 2 (on page 3).
Extended spiral pin sequences A pin sequence p 1 , . . . , p m that contains no turns must either follow the repeating pattern of directions 'left, up, right, down' or 'left, down, right, up'. We call both of these pin sequences spirals.
Unlike pin sequences with many turns, spiral pin sequences do not contain long sums of 21 or skew sums of 12 (indeed, spiral pin sequences are contained in the class of skew-merged permutations, Av(3412, 2143)). To recover long sums of 21, we add points in specific locations that we call extensions. We will consider two types. For ease of explanation, we assume that p i−1 is an up pin and p i a right pin; all other cases follow by symmetry.
Type 1: An additional point q is a type 1 extension of p i if rect(p i , q) is sliced by p i−1 and/or p i+1 , and by no other points, and the points p i , q form a copy of 21.
Type 2: Three additional points q, r, s that are placed relative to p i form a type 2 extension of p i if: (i) s lies either so that p i−1 , s forms a copy of 21 and the only pin slicing rect(p i−1 , s) is p i , or so that s, p i+1 forms a copy of 21 and the only pin slicing rect(p i+1 , s) is p i+2 ; (ii) q and r form a copy of 21 that is SW of p i , NE of rect(p 1 , . . . , p i−2 ) and is sliced only by s; and (iii) p i+1 separates p i from q and r, and it is the only pin other than s that slices rect(p i , q, r).
A spiral pin sequence with k extensions is a spiral pin sequence for which there exists k distinct pins to which extensions of either type have been added. See Figure 3 for illustrations of the possible extensions to spirals, and the lower right portion of Figure 1 for two examples of a spiral pin sequence with several extensions.
We observe that any spiral pin sequence with k extensions is simple: starting from the fact that a spiral pin sequence is itself simple, the only possible intervals that could be created when the extensions are added can contain at most one point of the original pin sequence.

Figure 3: Forming extended spirals. For Type 1, any one of the three hollow points may be added. For Type 2, the copy of 21 is added, together with one of the two slicing points.

Furthermore, every point belonging to a Type 1 extension is separated from any other point by at least one point belonging to the spiral, which prevents these points from being contained in a proper interval. For a Type 2 extension, the only exception to this is that the two points q and r (forming the copy of 21) are separated by the third point s, but rect(q, r, s) is sliced by at least two points from the original spiral.
Lemma 2.5. A spiral pin sequence with 2k extensions contains either ⊕ k 21 or ⊖ k 12.
Proof. For any of the three possible type 1 extensions q to the pin p i in Figure 3, we see that p i , q forms a copy of 21 that is NE of rect(p 1 , . . . , p i−2 ). Similarly, either of the type 2 extensions also provides a copy of 21 that is NE of the same rectangle, rect(p 1 , . . . , p i−2 ). Similar arguments apply by symmetry to the other corners NW, SE and SW. Thus, from 2k extensions, by symmetry we can find k that contribute a copy of 21. Furthermore, since any such copy of 21 arising from an extension of p i is NE or SW of rect(p 1 , . . . , p i−2 ), we conclude that this collection of k copies of 21 forms a copy of ⊕ k 21, as required.
Proof of Theorem 1.3
Let π be a permutation, and let R be an axes-parallel rectangle in the plot of π. A sliced copy of 21 that spans R is a copy of 21 whose rectangular hull is NE or SW of R and sliced by a point that is NW or SE of R.
For our proof we will need the following result.

Lemma 3.2. Let π be a sum indecomposable permutation with |π| > 1. Any line slicing π must slice a copy of 21.
Proof. If π were sliced by a line not slicing a copy of 21, then π would equal π 1 ⊕ π 2 with π 1 the subpermutation of π below (or left of) the line, and π 2 the subpermutation above (or right of) the line.

Lemma 3.4. Let π be a simple permutation and let R be an axes-parallel rectangle such that π| R contains a sum of $L(8m^2)^{8m^2}$ copies of 21. Then π contains a sawtooth alternation of length 3m, or π| R contains a copy of a simple permutation that contains a sum of at least L copies of 21.

Proof. Starting with R 0 = R and Σ 0 a set of points in R forming a sum of $L(8m^2)^{8m^2}$ copies of 21, we will construct a sequence of rectangles $R_0 \supsetneq R_1 \supsetneq \cdots \supsetneq R_k$ with $k \le 8m^2$ with the following properties, for i ≥ 1.
(i) Each R i contains a set of points $\Sigma_i \subsetneq \Sigma_{i-1}$ that forms a sum of at least $L(8m^2)^{8m^2-i}$ copies of 21; (ii) the subpermutation π| R i is an interval inside π| R i−1 (and thus by induction is also an interval inside π| R 0 ).
(iii) In π| R i−1 , there exists a copy of 21 that is NE or SW of R i , and either forms a sliced copy of 21 spanning R i , or is sliced by a point outside R 0 .
Our construction of rectangles will terminate at R k for some k < 8m 2 if in π| R k we can find a copy of a simple permutation that contains a sum of (at least) L copies of 21 or a sawtooth alternation of length 3m.
Otherwise, our construction terminates when $k = 8m^2$, at which point condition (iii) will guarantee that we have a sum of at least $8m^2$ copies of 21 inside R 0 (one for each rectangle). Each copy of 21 is either sliced by a point outside R 0 , or it forms a sliced copy of 21 that spans the next rectangle R i in the sequence.

If there are $4m^2$ copies of 21 that are sliced by points outside R 0 , then we can find $m^2$ pairs that are sliced on the same side of R 0 , and we can apply the Erdős-Szekeres Theorem to find a monotone sequence of m points outside R 0 slicing copies of 21, giving a sawtooth alternation of length 3m.

On the other hand, if there are $4m^2$ copies of 21 sliced inside R 0 , then we can find $m^2$ copies of 21 that are sliced by points in the same way (i.e., one of the 21 lying NE or SW of the next rectangle, with the slicing point being NW or SE). Because the sequence of rectangles R 0 , R 1 , . . . are nested, the slicing points already form a monotone sequence (see Figure 4), which means that we have in fact found a wedge sawtooth alternation of length $3m^2$ (and hence one of length 3m).
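
For reference, both of the preceding arguments use the Erdős-Szekeres theorem in the following standard quantitative form (a textbook statement, not specific to this paper):

$$\text{any sequence of } (r-1)(s-1)+1 \text{ distinct real numbers contains an increasing subsequence of length } r \text{ or a decreasing subsequence of length } s.$$

In particular, any $m^2$ of the slicing points contain a monotone subsequence of $m$ points, which supplies the monotone side of the sawtooth alternation.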
Thus, it now suffices to describe the process to construct R i+1 from R i to satisfy (i)-(iii) above. Consider the substitution decomposition of π| R i . First, if π| R i is skew decomposable, then one of the skew indecomposable components must contain all of Σ i , so we could restrict π| R i to this single component, with (i)-(iii) still being satisfied. Thus, without loss of generality, we can assume that π| R i is skew indecomposable.
Next, suppose that π| R i is sum decomposable. If Σ i is contained entirely inside a single sum indecomposable component of π| R i , then we may replace R i with the rectangular hull of this component, in which case π| R i is no longer sum decomposable and we have a different case. Otherwise, Σ i is distributed across at least two of the sum indecomposable components of π| R i , and note that each such component τ intersects Σ i in a whole number of 21s.

Figure 4: A sequence of m sliced copies of 21 that span nested rectangles in the same direction immediately yields a wedge sawtooth alternation of length 3m.
Since π is simple, every nonsingleton sum indecomposable component of π| R i must be sliced by a point outside R i (and hence outside R 0 ), and by Lemma 3.2 this slice must also slice a copy of 21 inside the component. Thus, if we have at least $4m^2$ nonsingleton components in π| R i , we can find a sum of $4m^2$ copies of 21 sliced by points outside R 0 . Since there are four sides of R 0 , $m^2$ of these components have their slicing points on the same side of R 0 , and the Erdős-Szekeres Theorem then yields a wedge or parallel sawtooth alternation of length 3m. See Figure 5(a).
Now we may suppose that we have fewer than $4m^2$ nonsingleton sum indecomposable components in π| R i . By the pigeonhole principle, this means that there is some component τ which contains at least $L(8m^2)^{8m^2-i}/(4m^2) > L(8m^2)^{8m^2-(i+1)}$ copies of 21. We also know (by assumption) that there exists some other nonsingleton component τ ′ , which must be sliced by some point outside R 0 , see Figure 5(b). We set R i+1 to be the rectangular hull of τ, and any other nonsingleton component τ ′ provides the copy of 21 sliced by a point outside R 0 .

Figure 6: The scenarios when π| R i is sum and skew indecomposable: (a) π| R i contains 2L intervals intersecting Σ i in 1 point; (b) At least $4m^2$ intervals intersect Σ i in two or more points; (c) τ contains many copies of 21 from Σ i and another interval τ ′ contains at least 1; (d) τ contains many copies of 21, and another copy of 21 is split across two other intervals.
We now turn to the case where π| R i is sum and skew indecomposable. In this case, consider the substitution decomposition of π| R i , and write π| R i = σ[τ 1 , . . . , τ n ]. By Observation 3.1, each interval τ j may contain 0, 1, or a multiple of 2 points from Σ i . First, if Σ i ⊂ τ j for some j, then we may replace R i with the rectangular hull of τ j , and consider this instead.
If for some copy of 21 in Σ i one of the points is a singleton inside some interval τ j , then there exists another interval that contains the other point as a singleton. Thus, if we can find at least 2L intervals of π| R i each of which intersects Σ i in exactly one point, then σ contains a sum of at least L copies of 21, see Figure 6(a). On the other hand, if we can find at least $4m^2$ intervals each of which intersects Σ i in (at least) two points, then each of these intervals must contain a copy of 21 that is sliced by a point outside R 0 , and by the earlier argument we can find a sawtooth alternation of length 3m. See Figure 6(b).
We may now suppose that fewer than 2L intervals intersect in exactly one point (so there are at most L copies of 21 for which this occurs), and fewer than $4m^2$ intersect in two or more points. Since Σ i comprises at least $L(8m^2)^{8m^2-i}$ copies of 21, there are at least $L\bigl((8m^2)^{8m^2-i} - 1\bigr)$ copies of 21 in which the '2' and '1' of each pair lie in the same interval. By the pigeonhole principle, there exists an interval τ that contains at least $L\bigl((8m^2)^{8m^2-i} - 1\bigr)/(4m^2) > L(8m^2)^{8m^2-(i+1)}$ copies of 21 from Σ i . We set R i+1 to be the rectangular hull of τ.
Finally, since τ is not the only interval containing copies of 21 from Σ i , we can now either find another interval τ ′ that contains a whole number of copies of 21 from Σ i , or two intervals that together contain a copy of 21 from Σ i . In the first case (illustrated in Figure 6(c)), the interval τ ′ must contain a copy of 21 that is sliced by a point outside R 0 as required. In the second case (illustrated in Figure 6(d)), the copy of 21 must be sliced either outside R 0 , or by Lemma 3.3 we can find a copy of 21 in the same region of R i relative to R i+1 as the original 21 (NE or SW) which is sliced by a point either NW or SE of R i+1 and within R i . This gives us the necessary sliced copy of 21 that spans R i+1 for condition (iii).
This completes the possible cases for the substitution decomposition of π| R i , and hence the proof. Note that since π is a finite permutation, the number of times that we may replace the rectangle R i with a smaller rectangle (e.g., when π| R i is skew decomposable) is bounded.
Given a simple permutation π, let ρ(π) denote the sum of the lengths of the maximal sawtooth alternations in π of each of the eight types depicted in Figure 2. Our proof of Theorem 1.3 will be complete after we have proved the following lemma.

Lemma 3.5. There exists a function g(m, s) such that every simple permutation π that contains a sum of at least g(m, s) copies of 21 contains an increasing oscillation of length m or satisfies ρ(π) ≥ 3s.

Proof of Theorem 1.3, given Lemma 3.5. We set f(n) = g(n, 8n), where g is the function from Lemma 3.5. Thus, any π that contains a sum of at least g(n, 8n) copies of 21 must contain an increasing oscillation of length n, or have ρ(π) ≥ 24n. Since ρ(π) is the sum of the sizes of the eight different maximal sawtooth alternations that can be found in π, one of these sawtooth alternations has length at least 3n.
Proof of Lemma 3.5. First, for m = 1, 2, 3, and any s ∈ N, we can set g(m, s) = 2 (since any simple permutation of length at least four contains an increasing oscillation of length 3). Additionally, for any fixed m ≥ 4, it is not hard to see that we may take g(m, 1) = 2 (since any simple permutation of length at least four contains a sawtooth alternation of length three). Thus, we may now assume that m ≥ 4 and s > 1, and we will show that we may take g(m, s) = (m + 3) · k + 1, where $k = (8s^2)^{8s^2} g(m, s-1)$, by induction on s.
We start with a simple permutation π which contains a sum of at least (m + 3)k + 1 copies of 21, and denote the points in some longest such sum by Σ. We now partition Σ into m + 4 disjoint rectangles, R 0 through R m+3 , where R 0 is the bounding rectangle for the first copy of 21 in Σ, and, for 1 ≤ i ≤ m + 3, R i is the bounding rectangle for the k least copies of 21 in Σ not contained in any previous R j .
If any rectangle R i contains a sawtooth alternation of length 3s, then we are done. Therefore, we may assume that no rectangle contains such a sawtooth alternation, and so in each π| R i we can find a simple permutation σ i with g(m, s − 1) copies of 21 by Lemma 3.4. For i = 1, . . . , m + 3, let S i = rect(σ i ). If any σ i contained an increasing oscillation of length m, then we would be done; thus by the inductive hypothesis we can assume that ρ(σ i ) = 3s − 3 for i = 1, . . . , m + 3 (since ρ(σ i ) ≥ 3s − 3, and if strictly greater we are done).
The next five paragraphs are best read in conjunction with Figure 7. Let h denote the horizontal line crossing through the '1' of the initial 21, and v the vertical line crossing through the '2'. Note that the bottom-left corner of π below h and to the left of v is increasing, else we would be able to find a longer sum of 21s in π than Σ. For convenience, we will refer to the L-shaped region below h and/or to the left of v as the outside region of π, and the rest of π will be inside.

Figure 7: The general set-up in the proof of Lemma 3.5. The crosshatched regions must be empty to avoid sliced copies of 21 that span some rectangle S i . The pin p k is shown here so as to slice S 3 , and the shaded rectangle denotes rect(p 1 , . . . , p k−2 ).
Recall that a sliced copy of 21 that spans S j is a copy of 21 that is NE or SW of S j , sliced by a point that is NW or SE of S j . If we can find a sliced copy of 21 that spans S j , then we may append it to one of the eight types of sawtooth alternation in S j . Since ρ(σ j ) = 3s − 3, this implies that ρ(π) ≥ 3s and we are done. Consequently, from now on we will assume that there are no sliced copies of 21 spanning any rectangle S j .
Under this assumption, since any point p that slices some S i must (by Lemma 3.2) slice a copy of 21, all such slicing points must be below and to the left of the top-right corner of S i+1 , and above and to the right of the bottom-left corner of S i−1 (when these rectangles exist). This implies that a number of regions defined by the four boundary lines of each S i must be empty, as identified by the crosshatched areas in Figure 7.
Now consider a shortest right-reaching pin sequence starting from the initial 21 of Σ. We will denote this pin sequence by p 1 , p 2 , . . . , p n . For any initial segment p 1 , . . . , p j , let i j denote the least index (if it exists) such that S i j is contained in the NE region of rect(p 1 , . . . , p j ). Observe that i 1 = i 2 = 1. For any j satisfying 2 ≤ j < n, the pin p j+1 slices the rectangular hull rect(p 1 , . . . , p j ) in such a way as to slice a copy of 21. Thus, whenever p j+1 is a right pin or an up pin, we can assume that p j+1 does not extend beyond S i j . From this, we make two conclusions: first, that i j+1 ≤ i j + 1, and second, that every S i must be sliced by some pin. Note that if p j+1 is a down or left pin, then i j+1 = i j .
In this pin sequence, we identify pin p k , which is the first pin such that S 1 ⊂ rect(p 1 , . . . , p k ). Clearly p k must be an up pin or a right pin, and we will assume that it is an up pin, the other case being analogous. We claim that i k ≤ 4. If not, then some p ℓ (with ℓ < k) slices S 3 . Since p k is the earliest up pin that extends at least as far as the top of S 1 , we conclude that p ℓ must be a right pin, and can be no higher than the top of S 1 . However, any such pin must then contribute to a sliced copy of 21 that spans S 2 . Thus i k ≤ 4, and note that this bound is tight, as illustrated by the example placement of p k and p k−1 in Figure 7. Note further that, in any case, none of the pins p k+2 , . . . , p n can slice S 1 .
We now classify more precisely which j > k can satisfy i j < i j+1 . By the earlier comments, this can only happen when p j+1 is an up or right pin, and then i j+1 ≤ i j + 1. We claim that i j = i j+1 unless both p j and p j+1 lie in the inside region.

Figure 8: Up to symmetry, the three situations where p j+1 is an up pin but p j and p j+1 are not both in the inside region all give rise to sliced copies of 21 that span S 1 or S i j . The shaded region denotes rect(p 1 , . . . , p j ). The case where p j+1 is a right pin is analogous.
First, suppose p j+1 is a right or up pin in the outside region, then it cannot slice S i j (else p j+1 slices a copy of 21 in S i j , and this sliced copy spans S 1 , see Figure 8(a)), and it cannot extend beyond S i j (else we can find a copy of 21 in {p 1 , . . . , p j } sliced by p j+1 that spans S i j , see Figure 8(b)). Thus, we conclude that S i j is NE of rect(p 1 , . . . , p j+1 ), i.e. i j+1 = i j .
Next, suppose p j+1 is a right or up pin in the inside region, but p j is outside. By definition, p j cannot slice S i j , from which we conclude that p j+1 cannot be contained in S i j . Furthermore, p j+1 cannot slice S i j , else we may take a point in S i j together with p j and p j+1 , and form a sliced copy of 21 that spans S 1 , see Figure 8(c). This completes the claim.
We now identify the least index k ′ > k + 1 such that i k ′ +1 = i k ′ + 1, and note that i k ′ ≤ 6 (since i k ≤ 4) and none of p k ′ , . . . , p n slices S 1 . By the above argument, the sequence of pins p k ′ −1 , p k ′ , p k ′ +1 must be 'up-right-up' or 'right-up-right'. Now consider p k ′ +2 : if it is a down or left pin, then {p k ′ , p k ′ +1 , p k ′ +2 } forms a sliced copy of 21 that spans S 1 . Thus, p k ′ +2 must also be an up or right pin, and the same argument applies to all subsequent pins. Thus, the sequence of points p k ′ −1 , p k ′ , p k ′ +1 , . . . , p n defines an increasing oscillation. Furthermore, since i k ′ ≤ 6 and each subsequent pin can slice at most one more rectangle than its immediate predecessor, since there are m + 3 rectangles in total, this oscillation contains at least m points, completing the proof.
Monotone griddability for simple permutations
In this section, we complete our proof of Theorem 1.1. Starting from Theorem 1.3, our main concern is handling wedge sawtooth alternations since they are not simple. Our key result is the following, whose proof will take up the majority of this section.

Proposition 4.1. If the simple permutations of a class C contain arbitrarily long wedge sawtooth alternations, then C contains arbitrarily long sliced wedge sawtooth alternations, proper pin sequences with arbitrarily many turns, or spiral proper pin sequences with arbitrarily many extensions.

Proof of Theorem 1.1, given Proposition 4.1. In one direction, each of the four listed structures yields simple permutations (or permutations one point away from simple) containing arbitrarily long sums of 21 or skew sums of 12, so by Theorem 1.2 the simple permutations of C cannot be monotone griddable.

Conversely, let Si(C) denote the set of simple permutations in C, and suppose that the permutations in Si(C) are not monotone griddable. By Theorem 1.2, the permutation class formed by taking the closure of the set Si(C) must contain arbitrarily long sums of 21 or skew sums of 12, and hence there must be simple permutations in C that contain arbitrarily long sums of 21 or skew sums of 12. If C contains arbitrarily long sums of 21, then by Theorem 1.3 the simple permutations of C must contain arbitrarily long sawtooth alternations or increasing oscillations. We are done unless Si(C) contains only long wedge sawtooth alternations, but in this case we may apply Proposition 4.1 to conclude that C contains arbitrarily long sliced wedge sawtooth alternations, proper pin sequences with arbitrarily many turns, or spiral proper pin sequences with arbitrarily many extensions. In any case, we conclude that we have found one of the forbidden substructures specified in Theorem 1.1.
A symmetric argument applies in the case when C contains arbitrarily long skew-sums of 12.
First, if there exists i such that ∆(i) ≥ m then the pin p i slices a copy of 21 in rect(p 1 , . . . , p i−1 ), which together with m − 1 sliced copies of 21 from ω, T t(i)+1 , . . . , T t(i)+m−1 , forms a sliced wedge sawtooth alternation of length 3m, of type 1 if p i is a down pin, type 2 if p i is a right pin, and type 3 if p i is an up pin. Consequently, we can now assume that ∆(i) < m for all i. Letting M = {i : ∆(i) > 0} denote the set of indices i for which ∆(i) is non-zero, we have |M| ≥ 3p(2s + 1).
Next, we are done if the pin sequence contains at least p turns, thus we will assume that there are fewer than p turns in total. Since |M| ≥ 3p(2s + 1), there exists a turn-free factor of p 1 , . . . , p n containing at least 3(2s + 1) distinct indices i for which ∆(i) > 0. Let this factor be p k , . . . , p ℓ , which we will assume forms a clockwise spiral pin sequence (i.e. the directions follow the order up, right, down, left).
We will assume that p k is a down pin, otherwise we may remove at most three pins from the beginning of p k , . . . , p ℓ to recover a spiral sequence beginning with a down pin. The cost of doing this is that p k , . . . , p ℓ is now only guaranteed to contain at least 3(2s + 1) − 3 = 6s distinct indices i with ∆(i) > 0. Irrespective of this, we have k ≥ 3 (since the first two pins have no direction), which means that p k extends from a non-trivial rect(p 1 , . . . , p k−1 ), and therefore p k slices a copy of 21.
Our discussion is now accompanied by Figure 9. In these diagrams, the dark grey regions contain no points because of the maximality of pins, and the light grey regions contain no points because any such point would enable us to take a 'shortcut' in the pin sequence, contradicting our choice of a shortest right-reaching pin sequence. There are also a number of crosshatched areas: these denote regions where the existence of a point will contribute a Type 1 extension.
We now consider any pin p i (with k ≤ i ≤ ℓ − 4) for which ∆(i) > 0, with a view to identifying a Type 1 or Type 2 extension in each case. Note that if p i is a left pin, then ∆(i) = 0 since rect(p 1 , . . . , p i−1 ) already slices or contains the leftmost sliced copy of 21 in ω. Thus the cases that remain are where p i is an up, right or down pin.
If p i is a down pin, then there must be at least one point of ω in a crosshatched region (regions A and B in Figure 9(a)), allowing a Type 1 extension to p i . Similarly, if p i is a right pin, then we again conclude that we can find a Type 1 extension to p i , since there must exist a point of ω in regions A or B of Figure 9(b).
This leaves the case where p i is an up pin, illustrated in Figure 9(c). If any one of the crosshatched regions contains a point, then we can find a Type 1 extension for one of p i−3 , p i+1 or p i+2 , so we now assume that these are empty. Fix some triple T j which contains a point that slices rect(p 1 , . . . , p i ) \ rect(p 1 , . . . , p i−1 ). The '1' of this triple must either coincide with p i+1 , or lie in one of the regions A, B or C. This implies the same of the '2', and thence the slicing point must equal p i+2 , or lie in one of the regions D, E or F.
If both the '2' and the '1' of T j lie in C and the slicing point lies in F, then we have a Type 2 extension of p i+1 and we are done. There are two other cases to consider: either the slicing point is in region F and the '1' is in region B, or the slicing point is below the pin p i+3 (i.e. it equals p i+2 or lies in regions D or E). In either case, we have that the '2' and the '1' are sliced by a point that lies below p i+3 . We can now substitute the pin p i+1 with the '1', and the pin p i+2 with the slicing point of T j , and then the '2' is a Type 1 extension of the '1' (acting as a right pin). We have now shown how to find an extension whenever we have a pin p i with ∆(i) > 0. There are at least 6s of these pins, but the above analysis does not guarantee that the 6s extensions are applied to distinct pins, and extremality may have been (temporarily) violated.
To resolve these issues, define a spiral to be a set of four contiguous pins that begins with a down pin. We observe that the above analysis shows us that for any pin p i with ∆(i) > 0, the pin(s) which can be extended (or substituted and extended) all lie in the same spiral as p i , or the spiral immediately after p i . Thus, by restricting our collection of pins p i with ∆(i) > 0 in p k , . . . , p ℓ to a subset for which any pair is separated by a complete spiral of pins, we can ensure that the extensions are applied to distinct pins. Note also that this corrects any issues arising from the violation of extremality.
In order to do this, recall that every left pin p i satisfies ∆(i) = 0. Thus, we may choose every sixth pin from the collection of 6s pins with ∆(i) > 0, leaving us with a set of at least s pins separated by at least seven points. This, in turn, gives us a set of s distinct pins which have extensions of types 1 or 2, and thus we have formed an extended spiral with s extensions.
Finally, while we cannot appeal to symmetry to cover the case where the pin sequence spirals in the opposite direction, the arguments are similar and so we omit the details.
Concluding remarks
Decision procedure In this paper we have characterised the classes whose simple permutations are monotone griddable. From this, it should be possible to describe a decision procedure to answer the following algorithmic problem.

Question 5.1. Given a finitely based permutation class C, is it decidable whether the simple permutations in C are monotone griddable?
The crux of such an algorithm would likely be to extend existing algorithms that handle pin sequences (such as those given in and ) to identify turns and (for spiral pin sequences) extensions.
Geometric griddability For a class C whose simple permutations are all geometrically griddable, the class itself is contained in the substitution closure of Si(C), whence we can conclude that C has a number of important properties: it is finitely based, well-quasi-ordered, and is enumerated by an algebraic generating function (see Theorems 4.4 and 6.1 of ). For this reason, a geometric analogue to Theorem 1.1 is highly desirable.
Question 5.2. Does there exist a characterisation of classes whose simple permutations are geometrically griddable?
This would appear to be a difficult question. In particular, there is no known analogue of Theorem 1.2 to characterise when a class is itself geometrically griddable. |
/**
* Persistent_ptr base (non-template) class
*
* Implements some of the functionality of the persistent_ptr class. It defines
* all applicable conversions from and to a persistent_ptr_base.
*
 * It can be used e.g. as a parameter where a persistent_ptr of any template
 * type is required. It is similar to persistent_ptr<void> (it can point
 * to any type), but it is useful when you want a pointer to some
 * unspecified persistent_ptr (which persistent_ptr<void> cannot provide,
 * because persistent_ptr<T>* does not convert to persistent_ptr<void>*).
*/
class persistent_ptr_base {
public:
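	/* Default constructor: creates a null persistent pointer (OID_NULL). */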
persistent_ptr_base() noexcept : oid(OID_NULL)
{
}
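	/* Construct from a raw PMEMoid handle. */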
persistent_ptr_base(PMEMoid oid) noexcept : oid(oid)
{
}
persistent_ptr_base(persistent_ptr_base const &r) noexcept : oid(r.oid)
{
}
persistent_ptr_base(persistent_ptr_base &&r) noexcept
: oid(std::move(r.oid))
{
}
persistent_ptr_base &
operator=(persistent_ptr_base &&r)
{
detail::conditional_add_to_tx(this);
this->oid = std::move(r.oid);
return *this;
}
persistent_ptr_base &
operator=(persistent_ptr_base const &r)
{
detail::conditional_add_to_tx(this);
this->oid = r.oid;
return *this;
}
persistent_ptr_base &
operator=(std::nullptr_t &&)
{
detail::conditional_add_to_tx(this);
this->oid = {0, 0};
return *this;
}
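	/*
	 * Swap this pointer's PMEMoid with another persistent_ptr_base;
	 * both objects are first conditionally registered with the active
	 * transaction (see detail::conditional_add_to_tx).
	 */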
void
swap(persistent_ptr_base &other)
{
detail::conditional_add_to_tx(this);
detail::conditional_add_to_tx(&other);
std::swap(this->oid, other.oid);
}
const PMEMoid &
raw() const noexcept
{
return this->oid;
}
PMEMoid *
raw_ptr() noexcept
{
return &(this->oid);
}
protected:
PMEMoid oid;
};
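As a rough sketch of the type-erased use described in the comment above (the helper name is hypothetical; this assumes <cstdio> plus the libpmemobj C API, which defines PMEMoid with pool_uuid_lo and off members), a routine can accept a persistent_ptr of any template type through the base class:

#include <cstdio>

/* Hypothetical helper: works for a persistent_ptr<T> of any T. */
static void
print_oid(const persistent_ptr_base &p)
{
	const PMEMoid &oid = p.raw();
	std::printf("pool uuid: %llu, offset: %llu\n",
		    (unsigned long long)oid.pool_uuid_lo,
		    (unsigned long long)oid.off);
}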
import logging
import time
from datetime import datetime, timedelta
from time import perf_counter, sleep
from uuid import uuid4
import pytest
import celery
from celery import chain, chord, group
from celery.canvas import StampingVisitor
from celery.utils.serialization import UnpickleableExceptionWrapper
from celery.worker import state as worker_state
from .conftest import TEST_BACKEND, get_active_redis_channels
from .tasks import (ClassBasedAutoRetryTask, ExpectedException, add, add_ignore_result, add_not_typed, fail,
fail_unpickleable, print_unicode, retry, retry_once, retry_once_headers, retry_once_priority,
retry_unpickleable, return_properties, sleeping)
TIMEOUT = 10
_flaky = pytest.mark.flaky(reruns=5, reruns_delay=2)
_timeout = pytest.mark.timeout(timeout=300)
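# Combine rerun-on-failure (up to 5 retries) with a hard 300-second timeout for integration tests.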
def flaky(fn):
return _timeout(_flaky(fn))
class test_class_based_tasks:
@flaky
def test_class_based_task_retried(self, celery_session_app,
celery_session_worker):
task = ClassBasedAutoRetryTask()
celery_session_app.register_task(task)
res = task.delay()
assert res.get(timeout=TIMEOUT) == 1
def _producer(j):
"""Single producer helper function"""
results = []
for i in range(20):
results.append([i + j, add.delay(i, j)])
for expected, result in results:
value = result.get(timeout=10)
assert value == expected
assert result.status == 'SUCCESS'
assert result.ready() is True
assert result.successful() is True
return j
class test_tasks:
def test_simple_call(self):
"""Tests direct simple call of task"""
assert add(1, 1) == 2
assert add(1, 1, z=1) == 3
@flaky
def test_basic_task(self, manager):
"""Tests basic task call"""
results = []
# Tests calling task only with args
for i in range(10):
results.append([i + i, add.delay(i, i)])
for expected, result in results:
value = result.get(timeout=10)
assert value == expected
assert result.status == 'SUCCESS'
assert result.ready() is True
assert result.successful() is True
results = []
# Tests calling task with args and kwargs
for i in range(10):
results.append([3*i, add.delay(i, i, z=i)])
for expected, result in results:
value = result.get(timeout=10)
assert value == expected
assert result.status == 'SUCCESS'
assert result.ready() is True
assert result.successful() is True
@flaky
def test_multiprocess_producer(self, manager):
"""Testing multiple processes calling tasks."""
from multiprocessing import Pool
pool = Pool(20)
ret = pool.map(_producer, range(120))
assert list(ret) == list(range(120))
@flaky
def test_multithread_producer(self, manager):
"""Testing multiple threads calling tasks."""
from multiprocessing.pool import ThreadPool
pool = ThreadPool(20)
ret = pool.map(_producer, range(120))
assert list(ret) == list(range(120))
@flaky
def test_ignore_result(self, manager):
"""Testing calling task with ignoring results."""
result = add.apply_async((1, 2), ignore_result=True)
assert result.get() is None
# We wait since it takes a bit of time for the result to be
# persisted in the result backend.
sleep(1)
assert result.result is None
@flaky
def test_timeout(self, manager):
"""Testing timeout of getting results from tasks."""
result = sleeping.delay(10)
with pytest.raises(celery.exceptions.TimeoutError):
result.get(timeout=5)
@flaky
def test_expired(self, manager):
"""Testing expiration of task."""
# Fill the queue with tasks which took > 1 sec to process
for _ in range(4):
sleeping.delay(2)
# Execute task with expiration = 1 sec
result = add.apply_async((1, 1), expires=1)
with pytest.raises(celery.exceptions.TaskRevokedError):
result.get()
assert result.status == 'REVOKED'
assert result.ready() is True
assert result.failed() is False
assert result.successful() is False
# Fill the queue with tasks which took > 1 sec to process
for _ in range(4):
sleeping.delay(2)
# Execute task with expiration at now + 1 sec
result = add.apply_async((1, 1), expires=datetime.utcnow() + timedelta(seconds=1))
with pytest.raises(celery.exceptions.TaskRevokedError):
result.get()
assert result.status == 'REVOKED'
assert result.ready() is True
assert result.failed() is False
assert result.successful() is False
@flaky
def test_eta(self, manager):
"""Tests tasks scheduled at some point in future."""
start = perf_counter()
# Schedule task to be executed in 3 seconds
result = add.apply_async((1, 1), countdown=3)
sleep(1)
assert result.status == 'PENDING'
assert result.ready() is False
assert result.get() == 2
end = perf_counter()
assert result.status == 'SUCCESS'
assert result.ready() is True
# Difference between calling the task and result must be bigger than 3 secs
assert (end - start) > 3
start = perf_counter()
# Schedule task to be executed at time now + 3 seconds
result = add.apply_async((2, 2), eta=datetime.utcnow() + timedelta(seconds=3))
sleep(1)
assert result.status == 'PENDING'
assert result.ready() is False
assert result.get() == 4
end = perf_counter()
assert result.status == 'SUCCESS'
assert result.ready() is True
# Difference between calling the task and result must be bigger than 3 secs
assert (end - start) > 3
@flaky
def test_fail(self, manager):
"""Tests that the failing task propagates back correct exception."""
result = fail.delay()
with pytest.raises(ExpectedException):
result.get(timeout=5)
assert result.status == 'FAILURE'
assert result.ready() is True
assert result.failed() is True
assert result.successful() is False
@flaky
def test_revoked(self, manager):
"""Testing revoking of task"""
# Fill the queue with tasks to fill the queue
for _ in range(4):
sleeping.delay(2)
# Execute task and revoke it
result = add.apply_async((1, 1))
result.revoke()
with pytest.raises(celery.exceptions.TaskRevokedError):
result.get()
assert result.status == 'REVOKED'
assert result.ready() is True
assert result.failed() is False
assert result.successful() is False
def test_revoked_by_headers_simple_canvas(self, manager):
"""Testing revoking of task using a stamped header"""
# Try to purge the queue before we start
# to attempt to avoid interference from other tests
while True:
count = manager.app.control.purge()
if count == 0:
break
target_monitoring_id = uuid4().hex
class MonitoringIdStampingVisitor(StampingVisitor):
def on_signature(self, sig, **headers) -> dict:
return {'monitoring_id': target_monitoring_id}
for monitoring_id in [target_monitoring_id, uuid4().hex, 4242, None]:
stamped_task = add.si(1, 1)
stamped_task.stamp(visitor=MonitoringIdStampingVisitor())
result = stamped_task.freeze()
result.revoke_by_stamped_headers(headers={'monitoring_id': [monitoring_id]})
stamped_task.apply_async()
if monitoring_id == target_monitoring_id:
with pytest.raises(celery.exceptions.TaskRevokedError):
result.get()
assert result.status == 'REVOKED'
assert result.ready() is True
assert result.failed() is False
assert result.successful() is False
else:
assert result.get() == 2
assert result.status == 'SUCCESS'
assert result.ready() is True
assert result.failed() is False
assert result.successful() is True
# Clear the set of revoked stamps in the worker state.
# This step is performed in each iteration of the loop to ensure that only tasks
# stamped with a specific monitoring ID will be revoked.
# For subsequent iterations with different monitoring IDs, the revoked stamps will
# not match the task's stamps, allowing those tasks to proceed successfully.
worker_state.revoked_stamps.clear()
# Try to purge the queue after we're done
# to attempt to avoid interference to other tests
while True:
count = manager.app.control.purge()
if count == 0:
break
def test_revoked_by_headers_complex_canvas(self, manager, subtests):
"""Testing revoking of task using a stamped header"""
try:
manager.app.backend.ensure_chords_allowed()
except NotImplementedError as e:
raise pytest.skip(e.args[0])
for monitoring_id in ["4242", [1234, uuid4().hex]]:
# Try to purge the queue before we start
# to attempt to avoid interference from other tests
while True:
count = manager.app.control.purge()
if count == 0:
break
            target_monitoring_id = monitoring_id[0] if isinstance(monitoring_id, list) else monitoring_id
class MonitoringIdStampingVisitor(StampingVisitor):
def on_signature(self, sig, **headers) -> dict:
return {'monitoring_id': target_monitoring_id, 'stamped_headers': ['monitoring_id']}
stamped_task = sleeping.si(4)
stamped_task.stamp(visitor=MonitoringIdStampingVisitor())
result = stamped_task.freeze()
canvas = [
group([stamped_task]),
chord(group([stamped_task]), sleeping.si(2)),
chord(group([sleeping.si(2)]), stamped_task),
chain(stamped_task),
group([sleeping.si(2), stamped_task, sleeping.si(2)]),
chord([sleeping.si(2), stamped_task], sleeping.si(2)),
chord([sleeping.si(2), sleeping.si(2)], stamped_task),
chain(sleeping.si(2), stamped_task),
chain(sleeping.si(2), group([sleeping.si(2), stamped_task, sleeping.si(2)])),
chain(sleeping.si(2), group([sleeping.si(2), stamped_task]), sleeping.si(2)),
chain(sleeping.si(2), group([sleeping.si(2), sleeping.si(2)]), stamped_task),
]
result.revoke_by_stamped_headers(headers={'monitoring_id': monitoring_id})
for sig in canvas:
sig_result = sig.apply_async()
with subtests.test(msg='Testing if task was revoked'):
with pytest.raises(celery.exceptions.TaskRevokedError):
sig_result.get()
assert result.status == 'REVOKED'
assert result.ready() is True
assert result.failed() is False
assert result.successful() is False
worker_state.revoked_stamps.clear()
# Try to purge the queue after we're done
# to attempt to avoid interference to other tests
while True:
count = manager.app.control.purge()
if count == 0:
break
@flaky
def test_wrong_arguments(self, manager):
"""Tests that proper exceptions are raised when task is called with wrong arguments."""
with pytest.raises(TypeError):
add(5)
with pytest.raises(TypeError):
add(5, 5, wrong_arg=5)
with pytest.raises(TypeError):
add.delay(5)
with pytest.raises(TypeError):
add.delay(5, wrong_arg=5)
# Tasks with typing=False are not checked but execution should fail
result = add_not_typed.delay(5)
with pytest.raises(TypeError):
result.get(timeout=5)
assert result.status == 'FAILURE'
result = add_not_typed.delay(5, wrong_arg=5)
with pytest.raises(TypeError):
result.get(timeout=5)
assert result.status == 'FAILURE'
@pytest.mark.xfail(
condition=TEST_BACKEND == "rpc",
reason="Retry failed on rpc backend",
strict=False,
)
def test_retry(self, manager):
"""Tests retrying of task."""
# Tests when max. retries is reached
result = retry.delay()
tik = time.monotonic()
while time.monotonic() < tik + 5:
status = result.status
if status != 'PENDING':
break
sleep(0.1)
else:
raise AssertionError("Timeout while waiting for the task to be retried")
assert status == 'RETRY'
with pytest.raises(ExpectedException):
result.get()
assert result.status == 'FAILURE'
# Tests when task is retried but after returns correct result
result = retry.delay(return_value='bar')
tik = time.monotonic()
while time.monotonic() < tik + 5:
status = result.status
if status != 'PENDING':
break
sleep(0.1)
else:
raise AssertionError("Timeout while waiting for the task to be retried")
assert status == 'RETRY'
assert result.get() == 'bar'
assert result.status == 'SUCCESS'
def test_retry_with_unpickleable_exception(self, manager):
"""Test a task that retries with an unpickleable exception.
We expect to be able to fetch the result (exception) correctly.
"""
job = retry_unpickleable.delay(
"foo",
"bar",
retry_kwargs={"countdown": 10, "max_retries": 1},
)
# Wait for the task to raise the Retry exception
tik = time.monotonic()
while time.monotonic() < tik + 5:
status = job.status
if status != 'PENDING':
break
sleep(0.1)
else:
raise AssertionError("Timeout while waiting for the task to be retried")
assert status == 'RETRY'
# Get the exception
res = job.result
assert job.status == 'RETRY' # make sure that it wasn't completed yet
# Check it
assert isinstance(res, UnpickleableExceptionWrapper)
assert res.exc_cls_name == "UnpickleableException"
assert res.exc_args == ("foo",)
job.revoke()
def test_fail_with_unpickleable_exception(self, manager):
"""Test a task that fails with an unpickleable exception.
We expect to be able to fetch the result (exception) correctly.
"""
result = fail_unpickleable.delay("foo", "bar")
with pytest.raises(UnpickleableExceptionWrapper) as exc_info:
result.get()
exc_wrapper = exc_info.value
assert exc_wrapper.exc_cls_name == "UnpickleableException"
assert exc_wrapper.exc_args == ("foo",)
assert result.status == 'FAILURE'
@flaky
def test_task_accepted(self, manager, sleep=1):
r1 = sleeping.delay(sleep)
sleeping.delay(sleep)
manager.assert_accepted([r1.id])
@flaky
def test_task_retried_once(self, manager):
res = retry_once.delay()
assert res.get(timeout=TIMEOUT) == 1 # retried once
@flaky
def test_task_retried_once_with_expires(self, manager):
res = retry_once.delay(expires=60)
assert res.get(timeout=TIMEOUT) == 1 # retried once
@flaky
def test_task_retried_priority(self, manager):
res = retry_once_priority.apply_async(priority=7)
assert res.get(timeout=TIMEOUT) == 7 # retried once with priority 7
@flaky
def test_task_retried_headers(self, manager):
res = retry_once_headers.apply_async(headers={'x-test-header': 'test-value'})
headers = res.get(timeout=TIMEOUT)
assert headers is not None # retried once with headers
assert 'x-test-header' in headers # retry keeps custom headers
@flaky
def test_unicode_task(self, manager):
manager.join(
group(print_unicode.s() for _ in range(5))(),
timeout=TIMEOUT, propagate=True,
)
@flaky
def test_properties(self, celery_session_worker):
res = return_properties.apply_async(app_id="1234")
assert res.get(timeout=TIMEOUT)["app_id"] == "1234"
class test_trace_log_arguments:
args = "CUSTOM ARGS"
kwargs = "CUSTOM KWARGS"
def assert_trace_log(self, caplog, result, expected):
# wait for logs from worker
sleep(.01)
records = [(r.name, r.levelno, r.msg, r.data["args"], r.data["kwargs"])
for r in caplog.records
if r.name in {'celery.worker.strategy', 'celery.app.trace'}
if r.data["id"] == result.task_id
]
assert records == [(*e, self.args, self.kwargs) for e in expected]
def call_task_with_reprs(self, task):
return task.set(argsrepr=self.args, kwargsrepr=self.kwargs).delay()
@flaky
def test_task_success(self, caplog):
result = self.call_task_with_reprs(add.s(2, 2))
value = result.get()
assert value == 4
assert result.successful() is True
self.assert_trace_log(caplog, result, [
('celery.worker.strategy', logging.INFO,
celery.app.trace.LOG_RECEIVED,
),
('celery.app.trace', logging.INFO,
celery.app.trace.LOG_SUCCESS,
),
])
@flaky
def test_task_failed(self, caplog):
result = self.call_task_with_reprs(fail.s(2, 2))
with pytest.raises(ExpectedException):
result.get(timeout=5)
assert result.failed() is True
self.assert_trace_log(caplog, result, [
('celery.worker.strategy', logging.INFO,
celery.app.trace.LOG_RECEIVED,
),
('celery.app.trace', logging.ERROR,
celery.app.trace.LOG_FAILURE,
),
])
class test_task_redis_result_backend:
@pytest.fixture()
def manager(self, manager):
if not manager.app.conf.result_backend.startswith('redis'):
raise pytest.skip('Requires redis result backend.')
return manager
def test_ignoring_result_no_subscriptions(self, manager):
channels_before_test = get_active_redis_channels()
result = add_ignore_result.delay(1, 2)
assert result.ignored is True
new_channels = [channel for channel in get_active_redis_channels() if channel not in channels_before_test]
assert new_channels == []
def test_asyncresult_forget_cancels_subscription(self, manager):
channels_before_test = get_active_redis_channels()
result = add.delay(1, 2)
assert set(get_active_redis_channels()) == {
f"celery-task-meta-{result.id}".encode(), *channels_before_test
}
result.forget()
new_channels = [channel for channel in get_active_redis_channels() if channel not in channels_before_test]
assert new_channels == []
def test_asyncresult_get_cancels_subscription(self, manager):
channels_before_test = get_active_redis_channels()
result = add.delay(1, 2)
assert set(get_active_redis_channels()) == {
f"celery-task-meta-{result.id}".encode(), *channels_before_test
}
assert result.get(timeout=3) == 3
new_channels = [channel for channel in get_active_redis_channels() if channel not in channels_before_test]
assert new_channels == []
Commemorative Privilege in National Statuary Hall: Spatial Constructions of Racial Citizenship
ABSTRACT This article takes on a rhetorical investigation of the spatial and racial politics at play in the Capitol Building’s National Statuary Hall (NSH) collection. I argue that the material arrangement of the NSH collection enacts a form of what I call commemorative privilege, wherein the Capitol’s most prestigious places valorize those citizens that emulate the nation’s history of ascriptive citizenship ideals while the building’s basement houses those citizens whose voices and bodies have resisted such norms. I unpack both structural and embodied forms of commemorative privilege, underscoring the mutually constitutive relationship between people and places. Thus, the analysis demonstrates that not all space is created equal, and where citizens are placed within a symbolic space has material and ideological implications for racialized citizenship ideals.
package dk.stacktrace.messagingforwarder;
import android.util.Log;
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
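/**
 * Runnable that forwards a single message to an HTTP endpoint as a
 * UTF-8 encoded text/plain POST body. Meant to run off the UI thread,
 * since Android forbids network I/O on the main thread.
 */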
class HttpPostThread implements Runnable {
private static final String TAG = HttpPostThread.class.getName();
public HttpPostThread(URL url, String message) {
this.url = url;
this.message = message;
}
@Override
public void run() {
HttpURLConnection connection = null;
try {
connection = (HttpURLConnection)this.url.openConnection();
connection.setDoOutput(true);
connection.setRequestMethod("POST");
connection.setRequestProperty("Content-Type", "text/plain; charset=UTF-8");
            byte[] bytes = this.message.getBytes(StandardCharsets.UTF_8);
            // try-with-resources closes the stream even if writing fails
            try (OutputStream out = connection.getOutputStream()) {
                out.write(bytes);
                out.flush();
            }
            int status = connection.getResponseCode();
            Log.i(TAG, "Server replied with HTTP status: " + status);
}
catch (IOException e) {
Log.w(TAG, "Error communicating with HTTP server", e);
}
finally {
if (connection != null) {
connection.disconnect();
}
}
}
private final URL url;
private final String message;
}
// SetPcrTotalTests sets the "pcrTotalTests" field.
func (iru *InfectedRecordUpdate) SetPcrTotalTests(i int) *InfectedRecordUpdate {
iru.mutation.ResetPcrTotalTests()
iru.mutation.SetPcrTotalTests(i)
return iru
}
Kyle Lowry is irreplaceable for the Toronto Raptors. If you take the money out of the equation for a second, it's easier to love what Lowry means to this team. Lowry is the leader on the court and in the locker room.
What About DeRozan?
DeMar DeRozan is an all-star, and even if you think he's better than Kyle Lowry, DeRozan isn't the vocal leader in the locker room. That's fine, but a franchise needs a star player who will hold guys accountable behind closed doors. For the Raptors, that guy has been Kyle Lowry.
Bad Playoffs?
People are quick to point out that Kyle Lowry has been bad in the playoffs. He has struggled and there is no way of covering that up, but I think calling Lowry a bad playoff player is a stretch. His post-all-star injury history with the Raptors over the last four years hasn't been good, and those injuries are the most plausible reason why Lowry hasn't looked right in the playoffs. If he is healthy, there is no reason to expect Lowry's play to dip come playoff time.
Kyle Lowry Was on Pace for a Career Year
Prior to getting injured, Lowry was having the best year of his career. In fact, for a while, the Raptors as a team were on a historical pace in terms of offense. The combination of Lowry and DeRozan was yielding this potent offense until it wasn't. Before the all-star break, the Raptors ran into a slump. There were fewer of DeRozan's outrageous scoring games. Terrence Ross's shooting reverted to the mean. And Kyle Lowry ran into problems on defense. The Raptors made moves to acquire Serge Ibaka and PJ Tucker to fix the PF hole and improve wing defense. However, Kyle Lowry was hurt, played only four games post-all-star, and clearly wasn't 100% heading into the playoffs. If Kyle had been healthy and able to build chemistry with Ibaka and Tucker, maybe the Raptors would have had a better shot against the Cavs.
Kyle Lowry Pre-All-Star Stats:
 G   MPG   FG%   3PM  3P%   FT%   RPG  APG  TPG  SPG  BPG  PPG
56   37.7  46.3  3.3  41.7  82.6  4.7  6.9  2.8  1.4  0.3  22.8
Why Does Lowry Seemingly Get Injured Every Year?
There is no definitive answer to this question. During the regular season, Kyle Lowry has played heavy minutes. In fact, Lowry has averaged under 30 minutes only once as a member of the Toronto Raptors; that was his first year with the team, when he played 29.7 minutes per game. In the four seasons since, he has averaged under 35 minutes per game only once, and in each of the last two seasons he has played at least 37 minutes per game. Playing heavy minutes like that puts miles on the odometer. If Lowry is breaking down every season, isn't the fix just to play him fewer minutes? The Raptors have capable backups now in Cory Joseph, Delon Wright, and Fred VanVleet. There's no reason why Kyle Lowry needs to play more than 30 minutes a game during the regular season. I would even argue the Raptors would still get into the playoffs if Lowry only played 25 minutes a game. The Raptors have won around 50 games in each of the last four years. If it means sacrificing a few regular-season wins for Kyle Lowry to be 100% come playoff time, sign me up. There was a time when just getting to the playoffs was enough for the Raptors fanbase. Not anymore. Fans want to see the Raptors compete, and without a healthy Lowry that is probably impossible.
If Kyle Lowry Leaves What Happens at Point Guard?
Cory Joseph and Delon Wright are good backups. Hell, even Fred VanVleet might be a good backup, but none of these guys should be starters in the NBA. Joseph isn't a good enough playmaker to be a starter and he's atrocious on the defensive end. Joseph had a defensive rating of 110 this season. That is the same rating as DeRozan, and DeRozan is known to be a poor defender. You might not want to write Cory off as an awful defender based on one stat alone, but pick any stat or metric and it will tell a similar story. Joseph had a negative box score plus/minus and ranked 73rd in defensive real plus/minus among point guards. He shoots okay, but not well enough to make up for his below-average passing ability as a point guard. And again, Joseph isn't helpful when the other team has the ball.
But Delon Wright Can Be a Starter, Right?
I think what people don't realize with Wright is that he is already 25 years old and a couple of weeks older than Jonas Valanciunas. Unlike Joseph, Wright has been a very good defender. Delon can probably develop his game a little, but to be a starting point guard in the NBA he would likely need to significantly improve both his shooting and his playmaking. I think it's possible he can improve one, but not both, which could still make him an elite backup point guard in the NBA.
Point guard is the deepest position in the NBA now. If you are a team not named the Spurs and you don't have a top-15 point guard, you are probably a fringe playoff team at best. If the Raptors lose Kyle Lowry, that's likely where they will be in the standings next year, and that would be taking a step backward.
Freddy V?
Fred VanVleet was a pleasant surprise. He was undrafted but didn’t look out of place in his rookie season with the Raptors. VanVleet could very well become a backup point guard in the NBA, but anything more than that is wishful thinking.
Debunking PG Replacements
I’ve heard from Raptors fans that Eric Bledsoe and Jrue Holiday could be viable replacements for Kyle Lowry. Personally, I like both players, but it’s hard to imagine either would be better than Lowry. Eric Bledsoe would require the Raptors to give up assets in a trade. If the Raptors are giving up something of value for Bledsoe and Bledsoe isn’t as good as Lowry, that’s sliding a little too far down the treadmill. Jrue Holiday is interesting, but the Raptors would have to renounce all their free agents to have a hope of signing him. That means not only giving up Kyle Lowry, but also giving up Serge Ibaka, PJ Tucker, and Patrick Patterson. The cap space simply isn’t there to add Holiday as well as replace the Raptors’ free agents.
The $200M Question
At the age of 31 and with health concerns, is Kyle Lowry worth a 5-year, $200+M contract? Put that way, most people would say the Raptors shouldn’t sign Kyle Lowry. However, I don’t think it will take $200M to re-sign him. There isn’t much of a market at max money for him in the NBA. Philly and Houston, two teams rumored to be interested in Lowry, have already got their point guards (Fultz and Paul, respectively). Kyle has stated he wants to be on a winning team with the chance of competing for a title. The Raptors are certainly a winning team, but probably don’t have a real shot at a title next year. Then again, unless Lowry signs with the Warriors, the odds of him winning a title next year are extremely low. Some might argue that if Lowry were to join the Spurs they would have a shot at beating the Warriors. They might be right, but I don’t think the Spurs are an organization that would throw big money at an aging point guard with health concerns. The Spurs are probably the best-managed franchise in the business, from being able to develop talent at any spot in the draft to making their players feel like they are part of an elite fraternity. My question is: why would they take a risk with big money on Kyle Lowry when they can continue to be very good without getting stuck with a potentially awful contract? If Kyle were willing to leave a lot of money on the table, I’m sure the Spurs would love to have him, but he’s likely about to sign his last NBA contract and has been underpaid his entire career. Lowry implied this season that he would re-sign if the Raptors paid him the $200M, but I doubt he would scoff at a figure like $150M to stay with the Raptors, especially if there aren’t a lot of other teams ready to pony up that kind of money.
“You have said Lowry isn’t worth $200M but is worth $150M?”
Kyle Lowry probably isn’t worth a $150M contract, but if that’s what it takes to keep him, you do it. Lowry probably has two elite seasons left in him. The Raptors’ window to be good is now, so they need him for those two prime years. Beyond that, Kyle will likely decline, but the hope is that even if he’s no longer an elite player he can still be serviceable in years 3 and 4. This contract isn’t going to be moveable, so it’s hard to stomach the dead money on the books in the final years. But if it’s only the last year of this new contract where Kyle Lowry doesn’t have anything left, I’m okay with that. He has given the Raptors the best four-year run in franchise history and didn’t bolt the last time he was a free agent, when every American media outlet said he would or should. At $150M, Kyle Lowry would be the highest-paid player on the Raptors, but not so much so that DeRozan would be jealous.
The Raptors need Kyle Lowry back. There is no way to replace his contributions to the team. Since there is no alternative to Lowry, the Raptors have to do everything in their power to bring him back (without paying the $200M). Otherwise, the Raptors may have to blow up the team and tank. If you have been a fan of the Raptors for a long time, that is not something you should want.
"""
gromacstopfile.py: Used for loading Gromacs top files.
This is part of the OpenMM molecular simulation toolkit originating from
Simbios, the NIH National Center for Physics-Based Simulation of
Biological Structures at Stanford, funded under the NIH Roadmap for
Medical Research, grant U54 GM072970. See https://simtk.org.
Portions copyright (c) 2012-2018 Stanford University and the Authors.
Authors: <NAME>
Contributors: <NAME>
Permission is hereby granted, free of charge, to any person obtaining a
copy of this software and associated documentation files (the "Software"),
to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense,
and/or sell copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
THE AUTHORS, CONTRIBUTORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
USE OR OTHER DEALINGS IN THE SOFTWARE.
"""
from __future__ import absolute_import
__author__ = "<NAME>"
__version__ = "1.0"
from simtk.openmm.app import Topology
from simtk.openmm.app import PDBFile
from . import forcefield as ff
from . import element as elem
from . import amberprmtopfile as prmtop
import simtk.unit as unit
import simtk.openmm as mm
import math
import os
import re
import distutils.spawn
from collections import OrderedDict, defaultdict
from itertools import combinations, combinations_with_replacement
HBonds = ff.HBonds
AllBonds = ff.AllBonds
HAngles = ff.HAngles
OBC2 = prmtop.OBC2
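# Matches any character that cannot appear in a preprocessor token name.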
novarcharre = re.compile(r'\W')
def _find_all_instances_in_string(string, substr):
""" Find indices of all instances of substr in string """
indices = []
idx = string.find(substr, 0)
while idx > -1:
indices.append(idx)
idx = string.find(substr, idx+1)
return indices
def _replace_defines(line, defines):
""" Replaces defined tokens in a given line """
if not defines: return line
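    # Walk the defines most-recently-added first; only whole tokens outside quoted strings get replaced.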
for define in reversed(defines):
value = defines[define]
indices = _find_all_instances_in_string(line, define)
if not indices: continue
# Check to see if it's inside of quotes
inside = ''
idx = 0
n_to_skip = 0
new_line = []
for i, char in enumerate(line):
if n_to_skip:
n_to_skip -= 1
continue
if char in ('\'"'):
if not inside:
inside = char
else:
if inside == char:
inside = ''
if idx < len(indices) and i == indices[idx]:
if inside:
new_line.append(char)
idx += 1
continue
if i == 0 or novarcharre.match(line[i-1]):
endidx = indices[idx] + len(define)
if endidx >= len(line) or novarcharre.match(line[endidx]):
new_line.extend(list(value))
n_to_skip = len(define) - 1
idx += 1
continue
idx += 1
new_line.append(char)
line = ''.join(new_line)
return line
class GromacsTopFile(object):
"""GromacsTopFile parses a Gromacs top file and constructs a Topology and (optionally) an OpenMM System from it."""
class _MoleculeType(object):
"""Inner class to store information about a molecule type."""
def __init__(self):
self.atoms = []
self.bonds = []
self.angles = []
self.dihedrals = []
self.exclusions = []
self.pairs = []
self.constraints = []
self.cmaps = []
self.vsites2 = []
self.has_virtual_sites = False
self.has_nbfix_terms = False
def _processFile(self, file):
append = ''
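        # Re-join lines that end in a backslash continuation before processing them.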
for line in open(file):
if line.strip().endswith('\\'):
append = '%s %s' % (append, line[:line.rfind('\\')])
else:
self._processLine(append+' '+line, file)
append = ''
def _processLine(self, line, file):
"""Process one line from a file."""
if ';' in line:
line = line[:line.index(';')]
stripped = line.strip()
ignore = not all(self._ifStack)
if stripped.startswith('*') or len(stripped) == 0:
# A comment or empty line.
return
elif stripped.startswith('[') and not ignore:
# The start of a category.
if not stripped.endswith(']'):
raise ValueError('Illegal line in .top file: '+line)
self._currentCategory = stripped[1:-1].strip()
elif stripped.startswith('#'):
# A preprocessor command.
fields = stripped.split()
command = fields[0]
if len(self._ifStack) != len(self._elseStack):
raise RuntimeError('#if/#else stack out of sync')
if command == '#include' and not ignore:
# Locate the file to include
name = stripped[len(command):].strip(' \t"<>')
searchDirs = self._includeDirs+(os.path.dirname(file),)
for dir in searchDirs:
file = os.path.join(dir, name)
if os.path.isfile(file):
# We found the file, so process it.
self._processFile(file)
break
else:
raise ValueError('Could not locate #include file: '+name)
elif command == '#define' and not ignore:
# Add a value to our list of defines.
if len(fields) < 2:
raise ValueError('Illegal line in .top file: '+line)
name = fields[1]
valueStart = stripped.find(name, len(command))+len(name)+1
value = line[valueStart:].strip()
value = value or '1' # Default define is 1
self._defines[name] = value
elif command == '#ifdef':
# See whether this block should be ignored.
if len(fields) < 2:
raise ValueError('Illegal line in .top file: '+line)
name = fields[1]
self._ifStack.append(name in self._defines)
self._elseStack.append(False)
elif command == '#undef':
# Un-define a variable
if len(fields) < 2:
raise ValueError('Illegal line in .top file: '+line)
if fields[1] in self._defines:
self._defines.pop(fields[1])
elif command == '#ifndef':
# See whether this block should be ignored.
if len(fields) < 2:
raise ValueError('Illegal line in .top file: '+line)
name = fields[1]
self._ifStack.append(name not in self._defines)
self._elseStack.append(False)
elif command == '#endif':
# Pop an entry off the if stack.
if len(self._ifStack) == 0:
raise ValueError('Unexpected line in .top file: '+line)
del(self._ifStack[-1])
del(self._elseStack[-1])
elif command == '#else':
# Reverse the last entry on the if stack
if len(self._ifStack) == 0:
raise ValueError('Unexpected line in .top file: '+line)
if self._elseStack[-1]:
raise ValueError('Unexpected line in .top file: '
'#else has already been used ' + line)
self._ifStack[-1] = (not self._ifStack[-1])
self._elseStack[-1] = True
elif not ignore:
# Gromacs occasionally uses #define's to introduce specific
# parameters for individual terms (for instance, this is how
# ff99SB-ILDN is implemented). So make sure we do the appropriate
# pre-processor replacements necessary
line = _replace_defines(line, self._defines)
# A line of data for the current category
if self._currentCategory is None:
raise ValueError('Unexpected line in .top file: '+line)
if self._currentCategory == 'defaults':
self._processDefaults(line)
elif self._currentCategory == 'moleculetype':
self._processMoleculeType(line)
elif self._currentCategory == 'molecules':
self._processMolecule(line)
elif self._currentCategory == 'atoms':
self._processAtom(line)
elif self._currentCategory == 'bonds':
self._processBond(line)
elif self._currentCategory == 'angles':
self._processAngle(line)
elif self._currentCategory == 'dihedrals':
self._processDihedral(line)
elif self._currentCategory == 'exclusions':
self._processExclusion(line)
elif self._currentCategory == 'pairs':
self._processPair(line)
elif self._currentCategory == 'constraints':
self._processConstraint(line)
elif self._currentCategory == 'cmap':
self._processCmap(line)
elif self._currentCategory == 'atomtypes':
self._processAtomType(line)
elif self._currentCategory == 'bondtypes':
self._processBondType(line)
elif self._currentCategory == 'angletypes':
self._processAngleType(line)
elif self._currentCategory == 'dihedraltypes':
self._processDihedralType(line)
elif self._currentCategory == 'implicit_genborn_params':
self._processImplicitType(line)
elif self._currentCategory == 'pairtypes':
self._processPairType(line)
elif self._currentCategory == 'cmaptypes':
self._processCmapType(line)
elif self._currentCategory == 'nonbond_params':
self._processNonbondType(line)
elif self._currentCategory == 'virtual_sites2':
self._processVirtualSites2(line)
elif self._currentCategory.startswith('virtual_sites'):
if self._currentMoleculeType is None:
raise ValueError('Found %s before [ moleculetype ]' %
self._currentCategory)
self._currentMoleculeType.has_virtual_sites = True
def _processDefaults(self, line):
"""Process the [ defaults ] line."""
fields = line.split()
if len(fields) < 5:
# fudgeLJ and fudgeQQ not specified, assumed 1.0 by default
if len(fields) == 3:
fields.append(1.0)
fields.append(1.0)
else:
raise ValueError('Too few fields in [ defaults ] line: '+line)
if fields[0] != '1':
raise ValueError('Unsupported nonbonded type: '+fields[0])
if not fields[1] in ('1', '2', '3'):
raise ValueError('Unsupported combination rule: '+fields[1])
if fields[2].lower() == 'no':
self._genpairs = False
self._defaults = fields
def _processMoleculeType(self, line):
"""Process a line in the [ moleculetypes ] category."""
fields = line.split()
if len(fields) < 1:
raise ValueError('Too few fields in [ moleculetypes ] line: '+line)
type = GromacsTopFile._MoleculeType()
self._moleculeTypes[fields[0]] = type
self._currentMoleculeType = type
def _processMolecule(self, line):
"""Process a line in the [ molecules ] category."""
fields = line.split()
if len(fields) < 2:
raise ValueError('Too few fields in [ molecules ] line: '+line)
self._molecules.append((fields[0], int(fields[1])))
def _processAtom(self, line):
"""Process a line in the [ atoms ] category."""
if self._currentMoleculeType is None:
raise ValueError('Found [ atoms ] section before [ moleculetype ]')
fields = line.split()
if len(fields) < 5:
raise ValueError('Too few fields in [ atoms ] line: '+line)
self._currentMoleculeType.atoms.append(fields)
def _processBond(self, line):
"""Process a line in the [ bonds ] category."""
if self._currentMoleculeType is None:
raise ValueError('Found [ bonds ] section before [ moleculetype ]')
fields = line.split()
if len(fields) < 3:
raise ValueError('Too few fields in [ bonds ] line: '+line)
if fields[2] != '1':
raise ValueError('Unsupported function type in [ bonds ] line: '+line)
self._currentMoleculeType.bonds.append(fields)
def _processAngle(self, line):
"""Process a line in the [ angles ] category."""
if self._currentMoleculeType is None:
raise ValueError('Found [ angles ] section before [ moleculetype ]')
fields = line.split()
if len(fields) < 4:
raise ValueError('Too few fields in [ angles ] line: '+line)
if fields[3] not in ('1', '5'):
raise ValueError('Unsupported function type in [ angles ] line: '+line)
self._currentMoleculeType.angles.append(fields)
def _processDihedral(self, line):
"""Process a line in the [ dihedrals ] category."""
if self._currentMoleculeType is None:
raise ValueError('Found [ dihedrals ] section before [ moleculetype ]')
fields = line.split()
if len(fields) < 5:
raise ValueError('Too few fields in [ dihedrals ] line: '+line)
if fields[4] not in ('1', '2', '3', '4', '5', '9'):
raise ValueError('Unsupported function type in [ dihedrals ] line: '+line)
self._currentMoleculeType.dihedrals.append(fields)
def _processExclusion(self, line):
"""Process a line in the [ exclusions ] category."""
if self._currentMoleculeType is None:
raise ValueError('Found [ exclusions ] section before [ moleculetype ]')
fields = line.split()
if len(fields) < 2:
raise ValueError('Too few fields in [ exclusions ] line: '+line)
self._currentMoleculeType.exclusions.append(fields)
def _processPair(self, line):
"""Process a line in the [ pairs ] category."""
if self._currentMoleculeType is None:
raise ValueError('Found [ pairs ] section before [ moleculetype ]')
fields = line.split()
if len(fields) < 3:
raise ValueError('Too few fields in [ pairs ] line: '+line)
if fields[2] != '1':
raise ValueError('Unsupported function type in [ pairs ] line: '+line)
self._currentMoleculeType.pairs.append(fields)
def _processConstraint(self, line):
"""Process a line in the [ constraints ] category."""
if self._currentMoleculeType is None:
raise ValueError('Found [ constraints ] section before [ moleculetype ]')
fields = line.split()
if len(fields) < 4:
raise ValueError('Too few fields in [ constraints ] line: '+line)
self._currentMoleculeType.constraints.append(fields)
def _processCmap(self, line):
"""Process a line in the [ cmaps ] category."""
if self._currentMoleculeType is None:
raise ValueError('Found [ cmap ] section before [ moleculetype ]')
fields = line.split()
if len(fields) < 6:
raise ValueError('Too few fields in [ cmap ] line: '+line)
self._currentMoleculeType.cmaps.append(fields)
def _processAtomType(self, line):
"""Process a line in the [ atomtypes ] category."""
fields = line.split()
if len(fields) < 6:
raise ValueError('Too few fields in [ atomtypes ] line: '+line)
if len(fields[3]) == 1:
# Bonded type and atomic number are both missing.
fields.insert(1, None)
fields.insert(1, None)
elif len(fields[4]) == 1 and fields[4].isalpha():
if fields[1][0].isalpha():
# Atomic number is missing.
fields.insert(2, None)
else:
# Bonded type is missing.
fields.insert(1, None)
self._atomTypes[fields[0]] = fields
def _processBondType(self, line):
"""Process a line in the [ bondtypes ] category."""
fields = line.split()
if len(fields) < 5:
raise ValueError('Too few fields in [ bondtypes ] line: '+line)
if fields[2] != '1':
raise ValueError('Unsupported function type in [ bondtypes ] line: '+line)
self._bondTypes[tuple(fields[:2])] = fields
def _processAngleType(self, line):
"""Process a line in the [ angletypes ] category."""
fields = line.split()
if len(fields) < 6:
raise ValueError('Too few fields in [ angletypes ] line: '+line)
if fields[3] not in ('1', '5'):
raise ValueError('Unsupported function type in [ angletypes ] line: '+line)
self._angleTypes[tuple(fields[:3])] = fields
def _processDihedralType(self, line):
"""Process a line in the [ dihedraltypes ] category."""
fields = line.split()
if len(fields) < 7:
raise ValueError('Too few fields in [ dihedraltypes ] line: '+line)
if fields[4] not in ('1', '2', '3', '4', '5', '9'):
raise ValueError('Unsupported function type in [ dihedraltypes ] line: '+line)
key = tuple(fields[:5])
if fields[4] == '9' and key in self._dihedralTypes:
# There are multiple dihedrals defined for these atom types.
self._dihedralTypes[key].append(fields)
else:
self._dihedralTypes[key] = [fields]
def _processImplicitType(self, line):
"""Process a line in the [ implicit_genborn_params ] category."""
fields = line.split()
if len(fields) < 6:
raise ValueError('Too few fields in [ implicit_genborn_params ] line: '+line)
self._implicitTypes[fields[0]] = fields
def _processPairType(self, line):
"""Process a line in the [ pairtypes ] category."""
fields = line.split()
if len(fields) < 5:
raise ValueError('Too few fields in [ pairtypes] line: '+line)
if fields[2] != '1':
raise ValueError('Unsupported function type in [ pairtypes ] line: '+line)
self._pairTypes[tuple(fields[:2])] = fields
def _processCmapType(self, line):
"""Process a line in the [ cmaptypes ] category."""
fields = line.split()
if len(fields) < 8 or len(fields) < 8+int(fields[6])*int(fields[7]):
raise ValueError('Too few fields in [ cmaptypes ] line: '+line)
if fields[5] != '1':
raise ValueError('Unsupported function type in [ cmaptypes ] line: '+line)
self._cmapTypes[tuple(fields[:5])] = fields
def _processNonbondType(self, line):
"""Process a line in the [ nonbond_params ] category."""
fields = line.split()
if len(fields) < 5:
raise ValueError('Too few fields in [ nonbond_params ] line: '+line)
if fields[2] != '1':
raise ValueError('Unsupported function type in [ nonbond_params ] line: '+line)
self._nonbondTypes[tuple(sorted(fields[:2]))] = fields
def _processVirtualSites2(self, line):
"""Process a line in the [ virtual_sites2 ] category."""
fields = line.split()
if len(fields) < 5:
raise ValueError('Too few fields in [ virtual_sites2 ] line: ' + line)
self._currentMoleculeType.vsites2.append(fields[:5])
def __init__(self, file, periodicBoxVectors=None, unitCellDimensions=None, includeDir=None, defines=None):
"""Load a top file.
Parameters
----------
file : str
the name of the file to load
periodicBoxVectors : tuple of Vec3=None
the vectors defining the periodic box
unitCellDimensions : Vec3=None
the dimensions of the crystallographic unit cell. For
non-rectangular unit cells, specify periodicBoxVectors instead.
includeDir : string=None
A directory in which to look for other files included from the
top file. If not specified, we will attempt to locate a gromacs
installation on your system. When gromacs is installed in
/usr/local, this will resolve to /usr/local/gromacs/share/gromacs/top
defines : dict={}
preprocessor definitions that should be predefined when parsing the file
"""
if includeDir is None:
includeDir = _defaultGromacsIncludeDir()
self._includeDirs = (os.path.dirname(file), includeDir)
# Most of the gromacs water itp files for different forcefields,
# unless the preprocessor #define FLEXIBLE is given, don't define
# bonds between the water hydrogen and oxygens, but only give the
# constraint distances and exclusions.
self._defines = OrderedDict()
self._defines['FLEXIBLE'] = True
self._genpairs = True
if defines is not None:
            for define, value in defines.items():
self._defines[define] = value
# Parse the file.
self._currentCategory = None
self._ifStack = []
self._elseStack = []
self._moleculeTypes = {}
self._molecules = []
self._currentMoleculeType = None
self._atomTypes = {}
self._bondTypes= {}
self._angleTypes = {}
self._dihedralTypes = {}
self._implicitTypes = {}
self._pairTypes = {}
self._cmapTypes = {}
self._nonbondTypes = {}
self._processFile(file)
# Create the Topology from it.
top = Topology()
## The Topology read from the prmtop file
self.topology = top
if periodicBoxVectors is not None:
if unitCellDimensions is not None:
raise ValueError("specify either periodicBoxVectors or unitCellDimensions, but not both")
top.setPeriodicBoxVectors(periodicBoxVectors)
else:
top.setUnitCellDimensions(unitCellDimensions)
PDBFile._loadNameReplacementTables()
for moleculeName, moleculeCount in self._molecules:
if moleculeName not in self._moleculeTypes:
raise ValueError("Unknown molecule type: "+moleculeName)
moleculeType = self._moleculeTypes[moleculeName]
if moleculeCount > 0 and moleculeType.has_virtual_sites:
raise ValueError('Virtual sites not yet supported by Gromacs parsers')
# Create the specified number of molecules of this type.
for i in range(moleculeCount):
atoms = []
lastResidue = None
c = top.addChain()
for index, fields in enumerate(moleculeType.atoms):
resNumber = fields[2]
if resNumber != lastResidue:
lastResidue = resNumber
resName = fields[3]
if resName in PDBFile._residueNameReplacements:
resName = PDBFile._residueNameReplacements[resName]
r = top.addResidue(resName, c)
if resName in PDBFile._atomNameReplacements:
atomReplacements = PDBFile._atomNameReplacements[resName]
else:
atomReplacements = {}
atomName = fields[4]
if atomName in atomReplacements:
atomName = atomReplacements[atomName]
# Try to determine the element.
atomicNumber = self._atomTypes[fields[1]][2]
if atomicNumber is None:
# Try to guess the element from the name.
upper = atomName.upper()
if upper.startswith('CL'):
element = elem.chlorine
elif upper.startswith('NA'):
element = elem.sodium
elif upper.startswith('MG'):
element = elem.magnesium
else:
try:
element = elem.get_by_symbol(atomName[0])
except KeyError:
element = None
elif atomicNumber == '0':
element = None
else:
element = elem.Element.getByAtomicNumber(int(atomicNumber))
atoms.append(top.addAtom(atomName, element, r))
# Add bonds to the topology
for fields in moleculeType.bonds:
top.addBond(atoms[int(fields[0])-1], atoms[int(fields[1])-1])
def createSystem(self, nonbondedMethod=ff.NoCutoff, nonbondedCutoff=1.0*unit.nanometer,
constraints=None, rigidWater=True, implicitSolvent=None, soluteDielectric=1.0, solventDielectric=78.5,
ewaldErrorTolerance=0.0005, removeCMMotion=True, hydrogenMass=None, switchDistance=None):
"""Construct an OpenMM System representing the topology described by this
top file.
Parameters
----------
nonbondedMethod : object=NoCutoff
The method to use for nonbonded interactions. Allowed values are
NoCutoff, CutoffNonPeriodic, CutoffPeriodic, Ewald, PME, or LJPME.
nonbondedCutoff : distance=1*nanometer
The cutoff distance to use for nonbonded interactions
constraints : object=None
Specifies which bonds and angles should be implemented with
constraints. Allowed values are None, HBonds, AllBonds, or HAngles.
Regardless of this value, constraints that are explicitly specified
in the top file will always be included.
rigidWater : boolean=True
If true, water molecules will be fully rigid regardless of the value
passed for the constraints argument
implicitSolvent : object=None
If not None, the implicit solvent model to use. The only allowed
value is OBC2.
soluteDielectric : float=1.0
The solute dielectric constant to use in the implicit solvent model.
solventDielectric : float=78.5
The solvent dielectric constant to use in the implicit solvent
model.
ewaldErrorTolerance : float=0.0005
The error tolerance to use if nonbondedMethod is Ewald, PME or LJPME.
removeCMMotion : boolean=True
If true, a CMMotionRemover will be added to the System
hydrogenMass : mass=None
The mass to use for hydrogen atoms bound to heavy atoms. Any mass
added to a hydrogen is subtracted from the heavy atom to keep their
total mass the same.
switchDistance : float=None
The distance at which the potential energy switching function is turned on for
Lennard-Jones interactions. If this is None, no switching function will be used.
Returns
-------
System
the newly created System
"""
# Create the System.
sys = mm.System()
boxVectors = self.topology.getPeriodicBoxVectors()
if boxVectors is not None:
sys.setDefaultPeriodicBoxVectors(*boxVectors)
elif nonbondedMethod in (ff.CutoffPeriodic, ff.Ewald, ff.PME, ff.LJPME):
raise ValueError('Illegal nonbonded method for a non-periodic system')
nb = mm.NonbondedForce()
sys.addForce(nb)
if self._defaults[1] in ('1', '3'):
lj = mm.CustomNonbondedForce('A1*A2/r^12-C1*C2/r^6')
lj.addPerParticleParameter('C')
lj.addPerParticleParameter('A')
sys.addForce(lj)
if implicitSolvent is OBC2:
gb = mm.GBSAOBCForce()
gb.setSoluteDielectric(soluteDielectric)
gb.setSolventDielectric(solventDielectric)
sys.addForce(gb)
nb.setReactionFieldDielectric(1.0)
elif implicitSolvent is not None:
raise ValueError('Illegal value for implicitSolvent')
bonds = None
angles = None
periodic = None
rb = None
harmonicTorsion = None
cmap = None
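        # Cache CMAP energy grids so that identical maps are added to the force only once.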
mapIndices = {}
bondIndices = []
topologyAtoms = list(self.topology.atoms())
exclusions = []
pairs = []
fudgeQQ = float(self._defaults[4])
fudgeLJ = float(self._defaults[3])
# Build a lookup table to let us process dihedrals more quickly.
dihedralTypeTable = {}
for key in self._dihedralTypes:
if key[1] != 'X' and key[2] != 'X':
if (key[1], key[2]) not in dihedralTypeTable:
dihedralTypeTable[(key[1], key[2])] = []
dihedralTypeTable[(key[1], key[2])].append(key)
if (key[2], key[1]) not in dihedralTypeTable:
dihedralTypeTable[(key[2], key[1])] = []
dihedralTypeTable[(key[2], key[1])].append(key)
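        # Wildcard ('X') dihedral types serve as fallbacks; collect them separately and append them to every specific entry.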
wildcardDihedralTypes = []
for key in self._dihedralTypes:
if key[1] == 'X' or key[2] == 'X':
wildcardDihedralTypes.append(key)
for types in dihedralTypeTable.values():
types.append(key)
        # NBFIX: check whether any pair of atom types present has an explicit [ nonbond_params ] override.
atom_types = []
for moleculeName, moleculeCount in self._molecules:
moleculeType = self._moleculeTypes[moleculeName]
for _ in range(moleculeCount):
for atom in moleculeType.atoms:
atom_types.append(atom[1])
has_nbfix_terms = any([pair in self._nonbondTypes for pair in combinations_with_replacement(sorted(set(atom_types)), 2)])
if has_nbfix_terms:
# Build a lookup table and angle/dihedral indices list to
# let us handle exclusion manually.
angleIndices = []
torsionIndices = []
atom_partners = defaultdict(lambda : defaultdict(set))
atom_charges = []
# Loop over molecules and create the specified number of each type.
for moleculeName, moleculeCount in self._molecules:
moleculeType = self._moleculeTypes[moleculeName]
for i in range(moleculeCount):
# Record the types of all atoms.
baseAtomIndex = sys.getNumParticles()
atomTypes = [atom[1] for atom in moleculeType.atoms]
try:
bondedTypes = [self._atomTypes[t][1] for t in atomTypes]
except KeyError as e:
                    raise ValueError('Unknown atom type: ' + str(e))
bondedTypes = [b if b is not None else a for a, b in zip(atomTypes, bondedTypes)]
# Add atoms.
for fields in moleculeType.atoms:
if len(fields) >= 8:
mass = float(fields[7])
else:
mass = float(self._atomTypes[fields[1]][3])
sys.addParticle(mass)
# Add bonds.
atomBonds = [{} for x in range(len(moleculeType.atoms))]
for fields in moleculeType.bonds:
atoms = [int(x)-1 for x in fields[:2]]
types = tuple(bondedTypes[i] for i in atoms)
if len(fields) >= 5:
params = fields[3:5]
elif types in self._bondTypes:
params = self._bondTypes[types][3:5]
elif types[::-1] in self._bondTypes:
params = self._bondTypes[types[::-1]][3:5]
else:
raise ValueError('No parameters specified for bond: '+fields[0]+', '+fields[1])
# Decide whether to use a constraint or a bond.
useConstraint = False
if rigidWater and topologyAtoms[baseAtomIndex+atoms[0]].residue.name == 'HOH':
useConstraint = True
if constraints in (AllBonds, HAngles):
useConstraint = True
elif constraints is HBonds:
elements = [topologyAtoms[baseAtomIndex+i].element for i in atoms]
if elem.hydrogen in elements:
useConstraint = True
# Add the bond or constraint.
length = float(params[0])
if useConstraint:
sys.addConstraint(baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], length)
else:
if bonds is None:
bonds = mm.HarmonicBondForce()
sys.addForce(bonds)
bonds.addBond(baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], length, float(params[1]))
# Record information that will be needed for constraining angles.
atomBonds[atoms[0]][atoms[1]] = length
atomBonds[atoms[1]][atoms[0]] = length
# Add angles.
degToRad = math.pi/180
for fields in moleculeType.angles:
atoms = [int(x)-1 for x in fields[:3]]
types = tuple(bondedTypes[i] for i in atoms)
if len(fields) >= 6:
params = fields[4:]
elif types in self._angleTypes:
params = self._angleTypes[types][4:]
elif types[::-1] in self._angleTypes:
params = self._angleTypes[types[::-1]][4:]
else:
raise ValueError('No parameters specified for angle: '+fields[0]+', '+fields[1]+', '+fields[2])
# Decide whether to use a constraint or a bond.
useConstraint = False
if rigidWater and topologyAtoms[baseAtomIndex+atoms[0]].residue.name == 'HOH':
useConstraint = True
if constraints is HAngles:
elements = [topologyAtoms[baseAtomIndex+i].element for i in atoms]
if elements[0] == elem.hydrogen and elements[2] == elem.hydrogen:
useConstraint = True
elif elements[1] == elem.oxygen and (elements[0] == elem.hydrogen or elements[2] == elem.hydrogen):
useConstraint = True
# Add the bond or constraint.
theta = float(params[0])*degToRad
if useConstraint:
# Compute the distance between atoms and add a constraint
if atoms[0] in atomBonds[atoms[1]] and atoms[2] in atomBonds[atoms[1]]:
l1 = atomBonds[atoms[1]][atoms[0]]
l2 = atomBonds[atoms[1]][atoms[2]]
length = math.sqrt(l1*l1 + l2*l2 - 2*l1*l2*math.cos(theta))
sys.addConstraint(baseAtomIndex+atoms[0], baseAtomIndex+atoms[2], length)
else:
if angles is None:
angles = mm.HarmonicAngleForce()
sys.addForce(angles)
angles.addAngle(baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], baseAtomIndex+atoms[2], theta, float(params[1]))
if fields[3] == '5':
# This is a Urey-Bradley term, so add the bond.
if bonds is None:
bonds = mm.HarmonicBondForce()
sys.addForce(bonds)
k = float(params[3])
if k != 0:
bonds.addBond(baseAtomIndex+atoms[0], baseAtomIndex+atoms[2], float(params[2]), k)
# Add torsions.
for fields in moleculeType.dihedrals:
atoms = [int(x)-1 for x in fields[:4]]
types = tuple(bondedTypes[i] for i in atoms)
dihedralType = fields[4]
reversedTypes = types[::-1]+(dihedralType,)
types = types+(dihedralType,)
if (dihedralType in ('1', '4', '5', '9') and len(fields) > 7) or (dihedralType == '3' and len(fields) > 10) or (dihedralType == '2' and len(fields) > 6):
paramsList = [fields]
else:
# Look for a matching dihedral type.
paramsList = None
if (types[1], types[2]) in dihedralTypeTable:
dihedralTypes = dihedralTypeTable[(types[1], types[2])]
else:
dihedralTypes = wildcardDihedralTypes
for key in dihedralTypes:
if all(a == b or a == 'X' for a, b in zip(key, types)) or all(a == b or a == 'X' for a, b in zip(key, reversedTypes)):
paramsList = self._dihedralTypes[key]
if 'X' not in key:
break
if paramsList is None:
raise ValueError('No parameters specified for dihedral: '+fields[0]+', '+fields[1]+', '+fields[2]+', '+fields[3])
for params in paramsList:
if dihedralType in ('1', '4', '9'):
# Periodic torsion
k = float(params[6])
if k != 0:
if periodic is None:
periodic = mm.PeriodicTorsionForce()
sys.addForce(periodic)
periodic.addTorsion(baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], baseAtomIndex+atoms[2], baseAtomIndex+atoms[3], int(float(params[7])), float(params[5])*degToRad, k)
elif dihedralType == '2':
# Harmonic torsion
k = float(params[6])
phi0 = float(params[5])
if k != 0:
if harmonicTorsion is None:
harmonicTorsion = mm.CustomTorsionForce('0.5*k*(thetap-theta0)^2; thetap = step(-(theta-theta0+pi))*2*pi+theta+step(theta-theta0-pi)*(-2*pi); pi = %.15g' % math.pi)
harmonicTorsion.addPerTorsionParameter('theta0')
harmonicTorsion.addPerTorsionParameter('k')
sys.addForce(harmonicTorsion)
# map phi0 into correct space
phi0 = phi0 - 360 if phi0 > 180 else phi0
harmonicTorsion.addTorsion(baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], baseAtomIndex+atoms[2], baseAtomIndex+atoms[3], (phi0*degToRad, k))
else:
# RB Torsion
c = [float(x) for x in params[5:11]]
if any(x != 0 for x in c):
if rb is None:
rb = mm.RBTorsionForce()
sys.addForce(rb)
if dihedralType == '5':
# Convert Fourier coefficients to RB coefficients.
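# For V = 0.5*(F1*(1+cos(phi)) + F2*(1-cos(2*phi)) + F3*(1+cos(3*phi)) + F4*(1-cos(4*phi))),
# the equivalent Ryckaert-Bellemans coefficients are C0 = F2+0.5*(F1+F3),
# C1 = 0.5*(-F1+3*F3), C2 = -F2+4*F4, C3 = -2*F3, C4 = -4*F4, C5 = 0,
# which is exactly the transformation applied below (c[0]..c[3] hold F1..F4).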
c = [c[1]+0.5*(c[0]+c[2]), 0.5*(-c[0]+3*c[2]), -c[1]+4*c[3], -2*c[2], -4*c[3], 0]
rb.addTorsion(baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], baseAtomIndex+atoms[2], baseAtomIndex+atoms[3], c[0], c[1], c[2], c[3], c[4], c[5])
# Add CMAP terms.
for fields in moleculeType.cmaps:
atoms = [int(x)-1 for x in fields[:5]]
types = tuple(bondedTypes[i] for i in atoms)
if len(fields) >= 8 and len(fields) >= 8+int(fields[6])*int(fields[7]):
params = fields
elif types in self._cmapTypes:
params = self._cmapTypes[types]
elif types[::-1] in self._cmapTypes:
params = self._cmapTypes[types[::-1]]
else:
raise ValueError('No parameters specified for cmap: '+fields[0]+', '+fields[1]+', '+fields[2]+', '+fields[3]+', '+fields[4])
if cmap is None:
cmap = mm.CMAPTorsionForce()
sys.addForce(cmap)
mapSize = int(params[6])
if mapSize != int(params[7]):
raise ValueError('Non-square CMAPs are not supported')
map = []
for i in range(mapSize):
for j in range(mapSize):
map.append(float(params[8+mapSize*((j+mapSize//2)%mapSize)+((i+mapSize//2)%mapSize)]))
map = tuple(map)
if map not in mapIndices:
mapIndices[map] = cmap.addMap(mapSize, map)
cmap.addTorsion(mapIndices[map], baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], baseAtomIndex+atoms[2], baseAtomIndex+atoms[3],
baseAtomIndex+atoms[1], baseAtomIndex+atoms[2], baseAtomIndex+atoms[3], baseAtomIndex+atoms[4])
# Set nonbonded parameters for particles.
for fields in moleculeType.atoms:
params = self._atomTypes[fields[1]]
if len(fields) > 6:
q = float(fields[6])
else:
q = float(params[4])
if has_nbfix_terms:
if self._defaults[1] != '2':
raise NotImplementedError('NBFIX terms with the LB combination rule are not yet supported')
nb.addParticle(q, 1.0, 0.0)
atom_charges.append(q)
else:
if self._defaults[1] == '1':
nb.addParticle(q, 1.0, 0.0)
lj.addParticle([math.sqrt(float(params[6])), math.sqrt(float(params[7]))])
elif self._defaults[1] == '2':
nb.addParticle(q, float(params[6]), float(params[7]))
elif self._defaults[1] == '3':
nb.addParticle(q, 1.0, 0.0)
sigma = float(params[6])
epsilon = float(params[7])
lj.addParticle([math.sqrt(4*epsilon*sigma**6), math.sqrt(4*epsilon*sigma**12)])
for fields in moleculeType.atoms:
if implicitSolvent is OBC2:
if fields[1] not in self._implicitTypes:
raise ValueError('No implicit solvent parameters specified for atom type: '+fields[1])
gbparams = self._implicitTypes[fields[1]]
# Recompute this atom's charge here; reusing q left over from the loop
# above would assign the last atom's charge to every GB particle.
if len(fields) > 6:
q = float(fields[6])
else:
q = float(self._atomTypes[fields[1]][4])
gb.addParticle(q, float(gbparams[4]), float(gbparams[5]))
for fields in moleculeType.bonds:
atoms = [int(x)-1 for x in fields[:2]]
bondIndices.append((baseAtomIndex+atoms[0], baseAtomIndex+atoms[1]))
for fields in moleculeType.constraints:
if fields[2] == '1':
atoms = [int(x)-1 for x in fields[:2]]
bondIndices.append((baseAtomIndex+atoms[0], baseAtomIndex+atoms[1]))
if has_nbfix_terms:
for fields in moleculeType.bonds:
atoms = [int(x)-1 for x in fields[:2]]
atom_partners[baseAtomIndex+atoms[0]]['bond'].add(baseAtomIndex+atoms[1])
atom_partners[baseAtomIndex+atoms[1]]['bond'].add(baseAtomIndex+atoms[0])
for fields in moleculeType.angles:
atoms = [int(x)-1 for x in fields[:3]]
angleIndices.append((baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], baseAtomIndex+atoms[2]))
for pair in combinations(atoms, 2):
atom_partners[baseAtomIndex+pair[0]]['angle'].add(baseAtomIndex+pair[1])
atom_partners[baseAtomIndex+pair[1]]['angle'].add(baseAtomIndex+pair[0])
for fields in moleculeType.dihedrals:
atoms = [int(x)-1 for x in fields[:4]]
torsionIndices.append((baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], baseAtomIndex+atoms[2], baseAtomIndex+atoms[3]))
for pair in combinations(atoms, 2):
atom_partners[baseAtomIndex+pair[0]]['torsion'].add(baseAtomIndex+pair[1])
atom_partners[baseAtomIndex+pair[1]]['torsion'].add(baseAtomIndex+pair[0])
# Record nonbonded exceptions.
for fields in moleculeType.pairs:
atoms = [int(x)-1 for x in fields[:2]]
types = tuple(atomTypes[i] for i in atoms)
atom1params = nb.getParticleParameters(baseAtomIndex+atoms[0])
atom2params = nb.getParticleParameters(baseAtomIndex+atoms[1])
atom1params = [x.value_in_unit_system(unit.md_unit_system) for x in atom1params]
atom2params = [x.value_in_unit_system(unit.md_unit_system) for x in atom2params]
if len(fields) >= 5:
params = [float(x) for x in fields[3:5]]
elif types in self._pairTypes:
params = [float(x) for x in self._pairTypes[types][3:5]]
elif types[::-1] in self._pairTypes:
params = [float(x) for x in self._pairTypes[types[::-1]][3:5]]
elif not self._genpairs:
raise ValueError('No pair parameters defined for atom '
'types %s and gen-pairs is "no"' % types)
elif has_nbfix_terms:
continue
else:
# Generate the parameters based on the atom parameters.
if self._defaults[1] == '2':
params = [0.5*(atom1params[1]+atom2params[1]), fudgeLJ*math.sqrt(atom1params[2]*atom2params[2])]
else:
atom1lj = lj.getParticleParameters(baseAtomIndex+atoms[0])
atom2lj = lj.getParticleParameters(baseAtomIndex+atoms[1])
params = [fudgeLJ*atom1lj[0]*atom2lj[0], fudgeLJ*atom1lj[1]*atom2lj[1]]
pairs.append((baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], atom1params[0]*atom2params[0]*fudgeQQ, params[0], params[1]))
for fields in moleculeType.exclusions:
atoms = [int(x)-1 for x in fields]
for atom in atoms[1:]:
if atom > atoms[0]:
exclusions.append((baseAtomIndex+atoms[0], baseAtomIndex+atom))
# Record virtual sites
for fields in moleculeType.vsites2:
atoms = [int(x)-1 for x in fields[:3]]
c1 = float(fields[4])
vsite = mm.TwoParticleAverageSite(baseAtomIndex+atoms[1], baseAtomIndex+atoms[2], (1-c1), c1)
sys.setVirtualSite(baseAtomIndex+atoms[0], vsite)
# Add explicitly specified constraints.
for fields in moleculeType.constraints:
atoms = [int(x)-1 for x in fields[:2]]
length = float(fields[2])
sys.addConstraint(baseAtomIndex+atoms[0], baseAtomIndex+atoms[1], length)
# Create nonbonded exceptions.
if not has_nbfix_terms:
nb.createExceptionsFromBonds(bondIndices, fudgeQQ, fudgeLJ)
else:
excluded_atom_pairs = set() # save these pairs so we don't zero them out
for tor in torsionIndices:
# First check to see if atoms 1 and 4 are already excluded because
# they are 1-2 or 1-3 pairs (would happen in 6-member rings or
# fewer). Then check that they're not already added as exclusions
if 'bond' in atom_partners[tor[3]] and tor[0] in atom_partners[tor[3]]['bond']: continue
if 'angle' in atom_partners[tor[3]] and tor[0] in atom_partners[tor[3]]['angle']: continue
key = min((tor[0], tor[3]),
(tor[3], tor[0]))
if key in excluded_atom_pairs: continue # multiterm...
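# Build the 1-4 exception with Lorentz-Berthelot combining rules: arithmetic
# mean of the per-type rmin/sigma values and geometric mean of the epsilons.
# The NBFIX override table is not consulted for these 1-4 pairs.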
params1 = self._atomTypes[atom_types[tor[0]]]
params4 = self._atomTypes[atom_types[tor[3]]]
q1 = atom_charges[tor[0]]
rmin1 = float(params1[6])
eps1 = float(params1[7])
q4 = atom_charges[tor[3]]
rmin4 = float(params4[6])
eps4 = float(params4[7])
charge_prod = (q1*q4)
epsilon = (math.sqrt(abs(eps1 * eps4)))
rmin14 = (rmin1 + rmin4) / 2
nb.addException(tor[0], tor[3], charge_prod, rmin14, epsilon)
excluded_atom_pairs.add(key)
# Add excluded atoms
for atom_idx, atom in atom_partners.items():
# Exclude all bonds and angles
for atom2 in atom['bond']:
if atom2 > atom_idx:
nb.addException(atom_idx, atom2, 0.0, 1.0, 0.0)
excluded_atom_pairs.add((atom_idx, atom2))
for atom2 in atom['angle']:
if ((atom_idx, atom2) in excluded_atom_pairs):
continue
if atom2 > atom_idx:
nb.addException(atom_idx, atom2, 0.0, 1.0, 0.0)
excluded_atom_pairs.add((atom_idx, atom2))
for atom2 in atom['dihedral']:
if atom2 <= atom_idx: continue
if ((atom_idx, atom2) in excluded_atom_pairs):
continue
nb.addException(atom_idx, atom2, 0.0, 1.0, 0.0)
for exclusion in exclusions:
nb.addException(exclusion[0], exclusion[1], 0.0, 1.0, 0.0, True)
if self._defaults[1] in ('1', '3'):
# We're using a CustomNonbondedForce for LJ interactions, so also create a CustomBondForce
# to handle the exceptions.
pair_bond = mm.CustomBondForce('-C/r^6+A/r^12')
pair_bond.addPerBondParameter('C')
pair_bond.addPerBondParameter('A')
sys.addForce(pair_bond)
lj.createExclusionsFromBonds(bondIndices, 3)
for pair in pairs:
nb.addException(pair[0], pair[1], pair[2], 1.0, 0.0, True)
pair_bond.addBond(pair[0], pair[1], [pair[3], pair[4]])
for exclusion in exclusions:
lj.addExclusion(exclusion[0], exclusion[1])
elif self._defaults[1] == '2':
for pair in pairs:
nb.addException(pair[0], pair[1], pair[2], pair[3], pair[4], True)
# Finish configuring the NonbondedForce.
methodMap = {ff.NoCutoff:mm.NonbondedForce.NoCutoff,
ff.CutoffNonPeriodic:mm.NonbondedForce.CutoffNonPeriodic,
ff.CutoffPeriodic:mm.NonbondedForce.CutoffPeriodic,
ff.Ewald:mm.NonbondedForce.Ewald,
ff.PME:mm.NonbondedForce.PME,
ff.LJPME:mm.NonbondedForce.LJPME}
nb.setNonbondedMethod(methodMap[nonbondedMethod])
nb.setCutoffDistance(nonbondedCutoff)
nb.setEwaldErrorTolerance(ewaldErrorTolerance)
if switchDistance is not None:
nb.setUseSwitchingFunction(True)
nb.setSwitchingDistance(switchDistance)
if self._defaults[1] in ('1', '3'):
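# CustomNonbondedForce has no Ewald/PME option, so the reciprocal-space
# methods below fall back to a periodic cutoff for the LJ part handled here.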
methodMap = {ff.NoCutoff:mm.CustomNonbondedForce.NoCutoff,
ff.CutoffNonPeriodic:mm.CustomNonbondedForce.CutoffNonPeriodic,
ff.CutoffPeriodic:mm.CustomNonbondedForce.CutoffPeriodic,
ff.Ewald:mm.CustomNonbondedForce.CutoffPeriodic,
ff.PME:mm.CustomNonbondedForce.CutoffPeriodic,
ff.LJPME:mm.CustomNonbondedForce.CutoffPeriodic}
lj.setNonbondedMethod(methodMap[nonbondedMethod])
lj.setCutoffDistance(nonbondedCutoff)
if switchDistance is not None:
lj.setUseSwitchingFunction(True)
lj.setSwitchingDistance(switchDistance)
if has_nbfix_terms:
if self._defaults[1] != '2':
raise NotImplementedError('NBFIX terms with the LB combination rule are not yet supported')
atom_nbfix_types = set([])
for pair in self._nonbondTypes:
atom_nbfix_types.add(pair[0])
atom_nbfix_types.add(pair[1])
lj_idx_list = [0 for _ in atom_types]
lj_radii, lj_depths = [], []
num_lj_types = 0
lj_type_list = []
for i,atom_type in enumerate(atom_types):
atom = self._atomTypes[atom_type]
if lj_idx_list[i]: continue # already assigned
atom_rmin = atom[6]
atom_epsilon = atom[7]
num_lj_types += 1
lj_idx_list[i] = num_lj_types
ljtype = (atom_rmin, atom_epsilon)
lj_type_list.append(atom)
lj_radii.append(float(atom_rmin))
lj_depths.append(float(atom_epsilon))
for j in range(i+1, len(atom_types)):
atom_type2 = atom_types[j]
if lj_idx_list[j] > 0: continue # already assigned
atom2 = self._atomTypes[atom_type2]
atom2_rmin = atom2[6]
atom2_epsilon = atom2[7]
if atom2 is atom:
lj_idx_list[j] = num_lj_types
elif atom_type not in atom_nbfix_types:
# Only non-NBFIXed atom types can be compressed
ljtype2 = (atom2_rmin, atom2_epsilon)
if ljtype == ljtype2:
lj_idx_list[j] = num_lj_types
# Now everything is assigned. Create the A-coefficient and
# B-coefficient arrays
acoef = [0 for i in range(num_lj_types*num_lj_types)]
bcoef = acoef[:]
for i in range(num_lj_types):
namei = lj_type_list[i][0]
for j in range(num_lj_types):
namej = lj_type_list[j][0]
try:
params = self._nonbondTypes[tuple(sorted((namei, namej)))]
rij = float(params[3])
wdij = float(params[4])
except KeyError:
rij = (lj_radii[i] + lj_radii[j]) / 2
wdij = math.sqrt(lj_depths[i] * lj_depths[j])
acoef[i+num_lj_types*j] = 2 * math.sqrt(wdij) * rij**6
bcoef[i+num_lj_types*j] = 4 * wdij * rij**6
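# With a = 2*sqrt(wdij)*rij^6 and b = 4*wdij*rij^6, the expression below expands
# to (a/r^6)^2 - b/r^6 = 4*wdij*((rij/r)^12 - (rij/r)^6), i.e. the standard 12-6
# form with the pairwise (possibly NBFIX-overridden) parameters baked into the tables.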
cforce = mm.CustomNonbondedForce('(a/r6)^2-b/r6; r6=r^6;'
'a=acoef(type1, type2);'
'b=bcoef(type1, type2)')
cforce.addTabulatedFunction('acoef',
mm.Discrete2DFunction(num_lj_types, num_lj_types, acoef))
cforce.addTabulatedFunction('bcoef',
mm.Discrete2DFunction(num_lj_types, num_lj_types, bcoef))
cforce.addPerParticleParameter('type')
if (nonbondedMethod in (ff.PME, ff.LJPME, ff.Ewald, ff.CutoffPeriodic)):
cforce.setNonbondedMethod(cforce.CutoffPeriodic)
cforce.setCutoffDistance(nonbondedCutoff)
cforce.setUseLongRangeCorrection(True)
elif nonbondedMethod is ff.NoCutoff:
cforce.setNonbondedMethod(cforce.NoCutoff)
elif nonbondedMethod is ff.CutoffNonPeriodic:
cforce.setNonbondedMethod(cforce.CutoffNonPeriodic)
cforce.setCutoffDistance(nonbondedCutoff)
else:
raise ValueError('Unrecognized nonbonded method')
if switchDistance is not None:
cforce.setUseSwitchingFunction(True)
cforce.setSwitchingDistance(switchDistance)
for i in lj_idx_list:
cforce.addParticle((i - 1,)) # adjust for indexing from 0
for i in range(nb.getNumExceptions()):
ii, jj, q, eps, sig = nb.getExceptionParameters(i)
cforce.addExclusion(ii, jj)
sys.addForce(cforce)
# Adjust masses.
if hydrogenMass is not None:
for atom1, atom2 in self.topology.bonds():
if atom1.element == elem.hydrogen:
(atom1, atom2) = (atom2, atom1)
if atom2.element == elem.hydrogen and atom1.element not in (elem.hydrogen, None):
transferMass = hydrogenMass-sys.getParticleMass(atom2.index)
sys.setParticleMass(atom2.index, hydrogenMass)
sys.setParticleMass(atom1.index, sys.getParticleMass(atom1.index)-transferMass)
# Add a CMMotionRemover.
if removeCMMotion:
sys.addForce(mm.CMMotionRemover())
return sys
def _defaultGromacsIncludeDir():
"""Find the location where gromacs #include files are referenced from, by
searching for (1) gromacs environment variables, (2) for the gromacs binary
'pdb2gmx' or 'gmx' in the PATH, or (3) just using the default gromacs
install location, /usr/local/gromacs/share/gromacs/top """
if 'GMXDATA' in os.environ:
return os.path.join(os.environ['GMXDATA'], 'top')
if 'GMXBIN' in os.environ:
return os.path.abspath(os.path.join(os.environ['GMXBIN'], '..', 'share', 'gromacs', 'top'))
pdb2gmx_path = distutils.spawn.find_executable('pdb2gmx')
if pdb2gmx_path is not None:
return os.path.abspath(os.path.join(os.path.dirname(pdb2gmx_path), '..', 'share', 'gromacs', 'top'))
else:
gmx_path = distutils.spawn.find_executable('gmx')
if gmx_path is not None:
return os.path.abspath(os.path.join(os.path.dirname(gmx_path), '..', 'share', 'gromacs', 'top'))
return '/usr/local/gromacs/share/gromacs/top'
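# A minimal usage sketch (an assumption for illustration, not part of this file):
# how the parser above is normally driven through OpenMM's public app API.
# 'conf.gro' and 'topol.top' are placeholder file names.
if __name__ == '__main__':
from openmm.app import GromacsGroFile, GromacsTopFile, PME, HBonds
from openmm.unit import nanometer
gro = GromacsGroFile('conf.gro')
top = GromacsTopFile('topol.top', periodicBoxVectors=gro.getPeriodicBoxVectors(),
includeDir=_defaultGromacsIncludeDir())
system = top.createSystem(nonbondedMethod=PME, nonbondedCutoff=1*nanometer,
constraints=HBonds)
print('Created a System with', system.getNumParticles(), 'particles')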
/**
* A {@link Mixin} for showing a progress bar.
*/
public class ProgressBarMixin implements Mixin {
private TemplateLayout mTemplateLayout;
@Nullable
private ColorStateList mColor;
/**
* @param layout The layout this mixin belongs to.
*/
public ProgressBarMixin(TemplateLayout layout) {
mTemplateLayout = layout;
}
/**
* @return True if the progress bar is currently shown.
*/
public boolean isShown() {
final View progressBar = mTemplateLayout.findManagedViewById(R.id.suw_layout_progress);
return progressBar != null && progressBar.getVisibility() == View.VISIBLE;
}
/**
* Sets whether the progress bar is shown. If the progress bar has not been inflated from the
* stub, this method will inflate the progress bar.
*
* @param shown True to show the progress bar, false to hide it.
*/
public void setShown(boolean shown) {
if (shown) {
View progressBar = getProgressBar();
if (progressBar != null) {
progressBar.setVisibility(View.VISIBLE);
}
} else {
View progressBar = peekProgressBar();
if (progressBar != null) {
progressBar.setVisibility(View.GONE);
}
}
}
/**
* Gets the progress bar in the layout. If the progress bar has not been used before, it will be
* installed (i.e. inflated from its view stub).
*
* @return The progress bar of this layout. May be null only if the template used doesn't have a
* progress bar built-in.
*/
private ProgressBar getProgressBar() {
final View progressBar = peekProgressBar();
if (progressBar == null) {
final ViewStub progressBarStub =
(ViewStub) mTemplateLayout.findManagedViewById(R.id.suw_layout_progress_stub);
if (progressBarStub != null) {
progressBarStub.inflate();
}
setColor(mColor);
}
return peekProgressBar();
}
/**
* Gets the progress bar in the layout only if it has been installed.
* {@link #setShown(boolean)} should be called before this to ensure the progress bar
* is set up correctly.
*
* @return The progress bar of this layout, or null if the progress bar is not installed. The
* null case can happen either if {@link #setShown(boolean)} with true was
* not called before this, or if the template does not contain a progress bar.
*/
public ProgressBar peekProgressBar() {
return (ProgressBar) mTemplateLayout.findManagedViewById(R.id.suw_layout_progress);
}
/**
* Sets the color of the indeterminate progress bar. This method is a no-op on SDK < 21.
*/
public void setColor(@Nullable ColorStateList color) {
mColor = color;
if (Build.VERSION.SDK_INT >= VERSION_CODES.LOLLIPOP) {
final ProgressBar bar = peekProgressBar();
if (bar != null) {
bar.setIndeterminateTintList(color);
if (Build.VERSION.SDK_INT >= VERSION_CODES.M || color != null) {
// There is a bug in Lollipop where setting the progress tint color to null
// will crash with "java.lang.NullPointerException: Attempt to invoke virtual
// method 'int android.graphics.Paint.getAlpha()' on a null object reference"
// at android.graphics.drawable.NinePatchDrawable.draw(:250)
// The bug doesn't affect ProgressBar on M because it uses ShapeDrawable instead
// of NinePatchDrawable. (commit 6a8253fdc9f4574c28b4beeeed90580ffc93734a)
bar.setProgressBackgroundTintList(color);
}
}
}
}
/**
* @return The color previously set in {@link #setColor(ColorStateList)}, or null if the color
* is not set. In case of null, the color of the progress bar will be inherited from the theme.
*/
@Nullable
public ColorStateList getColor() {
return mColor;
}
}
Antiviral Roles of Abscisic Acid in Plants
Abscisic acid (ABA) is a key hormone involved in tuning responses to several abiotic stresses and also has remarkable impacts on plant defense against various pathogens. The roles of ABA in plant defense against bacteria and fungi are multifaceted, inducing or reducing defense responses depending on its time of action. However, ABA induces different resistance mechanisms to viruses regardless of the induction time. Recent studies have linked ABA to the antiviral silencing pathway, which interferes with virus accumulation, and the microRNA (miRNA) pathway, through which ABA affects the maturation and stability of miRNAs. ABA also induces callose deposition at plasmodesmata, a mechanism that limits viral cell-to-cell movement. Bamboo mosaic virus (BaMV) is a member of the potexvirus group and is one of the most studied viruses in terms of the effects of ABA on its accumulation and resistance. In this review, we summarize how ABA interferes with the accumulation and movement of BaMV and other viruses. We also highlight aspects of ABA that may have an effect on other types of resistance and that require further investigation.
INTRODUCTION
Plants adapt to or tolerate stress through production of specific hormones that are produced at very low concentrations. One of the classical and well-studied phytohormones is abscisic acid (ABA), the importance of which is highlighted by its various roles in development (such as seed dormancy, germination, and floral induction) and stress responses (such as drought, salinity, and pathogen infection) (Mauch-Mani and Mauch, 2005; Wasilewska et al., 2008; Finkelstein, 2013; Humplik et al., 2017).
Abscisic acid affects the plant defense response to pathogens of different lifestyles, such as biotrophs, which thrive on a living host without killing it, and necrotrophs, which cause host death and thrive on dead matter (Mauch-Mani and Mauch, 2005; Fan et al., 2009; Xu et al., 2013). However, the effects of ABA are multifaceted, depending on the pathosystem studied and the timing of induction (Ton et al., 2009). ABA can enhance plant defense if it is triggered at early stages of infection by closing stomata and inducing callose deposition at cell walls (Ton et al., 2009; Ellinger et al., 2013). In contrast, if a pathogen is successfully established inside a plant tissue, then ABA induction can hamper plant defense by antagonizing other hormone pathways, such as those responsible for salicylic acid (SA) or ethylene synthesis (Anderson et al., 2004; Yasuda et al., 2008).
While ABA can both induce and reduce plant defense against fungal and bacterial pathogens, it appears to only enhance plant antiviral defense, as shown for several viruses (Alazem et al., 2014; Alazem and Lin, 2015). Two ABA-dependent defense mechanisms against viruses have been reported in plants, callose deposition at plasmodesmata (PD) (Iglesias and Meins, 2000; De Storme and Geelen, 2014) and the RNA silencing pathway (Alazem and Lin, 2015; Alazem et al., 2017). In addition, ABA-related recessive resistance has been reported for two RNA viruses, bamboo mosaic virus (BaMV) and cucumber mosaic virus (CMV) (Alazem et al., 2014). These findings have attributed novel antiviral roles to ABA in plants, and have raised outstanding questions, discussed below, that require further investigation.
Bamboo mosaic virus is a positive-sense, single-stranded RNA virus of the genus Potexvirus (family Alphaflexiviridae) with a genomic RNA of 6.4 kb (Lin et al., 1994). The BaMV genome encodes five open reading frames, which translate into a replicase composed of three domains (a capping enzyme domain, a helicase-like domain, and an RNA-dependent RNA polymerase domain) (Li et al., 1998, 2001a; Huang et al., 2004), three movement proteins (Lin et al., 1994, 2004, 2006), and a capsid protein (Lan et al., 2010).
Since ABA effects on plant antiviral defense have been mostly studied using BaMV, here we summarize how ABA interferes with the accumulation, movement, and symptom development of BaMV and other viruses following infection. We also highlight several aspects of the ABA signaling pathway that may have potential effects on other types of antiviral resistance and that require further investigation.
VIRUS INFECTION INDUCES ABA
Several RNA viruses have been shown to induce drought tolerance in plants, a phenomenon observed following infection by CMV, tobacco mosaic virus (TMV) and tobacco rattle virus (TRV) in different host plants including Nicotiana tabacum, Beta vulgaris, and Oryza sativa. Xu et al. (2008) ascribed this drought tolerance to the increase in the concentrations of osmoprotectants and antioxidants following viral infection. However, apart from the effects of osmolytes, drought tolerance is usually attributed to an increase in ABA content (Finkelstein, 2013). In fact, an increase in ABA content in virus-infected hosts has been reported for a number of compatible interactions (successful infection leading to disease), such as CMV/Nicotiana benthamiana (Alazem et al., 2014), BaMV/Arabidopsis thaliana and BaMV/N. benthamiana (Alazem et al., 2014), and TMV/N. tabacum (Fraser and Whenham, 1989). However, in some incompatible interactions (successful plant defense), viral infection does not induce ABA (Kovac et al., 2009; Baebler et al., 2014). For example, infection of the resistant potato cultivar Sante, which harbors the Rysto extreme resistance gene, by potato virus Y (PVY-NTN) did not induce ABA. Instead, jasmonic acid (JA) increased within the first few hours after PVY-NTN infection (Flis et al., 2005; Kovac et al., 2009). Unaltered ABA content has also been reported for the resistant potato cultivar Rywal (carrying the R-gene Ny-1) following PVY infection, and for a resistant tomato cultivar (carrying the R-gene Tm-1) infected with TMV, although, in this latter case, the TMV-resistant tomato cultivar contained more ABA than a susceptible cultivar (Whenham et al., 1986; Baebler et al., 2014). Another study showed that infecting resistant soybean (carrying the R-gene Rsv3) with an avirulent strain (G5H) of soybean mosaic virus (SMV) resulted in higher ABA content during the first 24 h of infection. Interestingly, SA was not induced throughout the time course of the experiment, but increased late in response to a virulent SMV strain (G7H) (Seo et al., 2014).
Although viroids represent an interesting class of infectious entities without encoded proteins, studies on defense responses to viroids are still preliminary and lack solid conclusions on the roles of ABA or other hormones. For example, in response to potato spindle tuber viroid (PSTVd) infection (RG1 severe strain), ABA-related genes have shown different patterns of expression in tomato cultivars. Some genes in the ABA biosynthesis pathway were upregulated, such as the subunit of farnesyl transferase and phospholipase D α-1, whereas a few components of the guard cell ABA signaling pathway were downregulated (Owens et al., 2012). A similar study showed that no ABA or SA genes were induced following infection with the PSTVd RG1 strain, and only β-1,3-glucanase was induced at 25 days post-infection (Itaya et al., 2002). The difference between these two studies may be attributable to the annotation of the tomato genome, which was not available at the time of the latter study. However, given the documented effect of ABA on callose accumulation, it can be speculated that ABA contributes to defense against viroids through callose. We discuss the example of chrysanthemum stunt viroid (CSVd) spread in apical domains in the following section.
Since SA plays a major role in R-gene-mediated resistance, it is taken for granted that SA levels are elevated following viral infections (Baebler et al., 2014; de Ronde et al., 2014). However, there are some cases where JA or ABA increase during early responses, as with PVY-NTN or SMV (Kovac et al., 2009; Baebler et al., 2014; Seo et al., 2014). In both examples, SA was induced at later stages of infection. This sequential induction of ABA/JA and then SA suggests that each hormone contributes differently to defense. It remains unanswered why hormone responses in incompatible interactions differ according to the infecting virus.
Abscisic acid deficiency has been reported to have an influential role in R-gene-mediated resistance against bacterial pathogens. For example, high temperature inhibits nuclear localization of the proteins SNC1 and RPS4, which is required for resistance against the bacterial pathogen Pseudomonas syringae. However, when the ABA biosynthesis pathway was impaired, nuclear localization of both proteins was enhanced regardless of temperature, leading to temperature-insensitive resistance against P. syringae (Mang et al., 2012). Since the effect of ABA was achieved through the biosynthesis pathway (tested with aba1 and aba2 mutants) rather than through ABA signaling (tested with abi1-1 and abi4-1 mutants), the authors suggested a role for ABA2 in R-gene-mediated resistance (Mang et al., 2012). Similar effects of ABA on R-genes that function against viruses are possible. Some R-genes have previously been shown to be temperature-sensitive, such as the Rx gene against potato virus X (PVX) and the N gene against TMV; when plant culture temperatures were increased from 22 to 28 °C, the hypersensitive response disappeared in infected tobacco and tomato plants (Samuel, 1931; Whitham et al., 1996; Wang et al., 2009). A recent study revealed that the temperature-sensitive Wsm1 gene, which confers resistance to wheat streak mosaic virus (WSMV) and triticum mosaic virus (TriMV), and Wsm2, which confers resistance to WSMV alone, block the systemic movement of both viruses in wheat at low temperature. Both viruses failed to enter the leaf sheaths of inoculated leaves at 18 °C (but not at 24 °C), so resistance is conferred by impairing their long-distance movement (Tatineni et al., 2016). Whether or not ABA mediates these effects has yet to be investigated.
ABA-DEPENDENT CALLOSE ACCUMULATION IS AN ANTIVIRAL MECHANISM
Plant viruses move from cell to cell via PD, with specific viral proteins (mostly movement proteins) modifying PD and increasing the size exclusion limit (which determines the size of the molecules traversing PD), thereby allowing the large viral movement complex to pass through (Fridborg et al., 2003; Lucas, 2006; Su et al., 2010; Heinlein, 2015). Trafficking through PD can be modulated by the controlled deposition of callose, a polysaccharide of the class β-1,3-glucan, at the necks of PD (Iglesias and Meins, 2000; Li et al., 2012b). Callose is a key component involved in cell fortification, and is found in different tissues at various developmental stages because it is required for growth and development. It is encoded by callose synthase (CalS) genes (also known as glucan synthase-like, GSL), a gene family comprising 12 members in Arabidopsis that are involved in producing callose in different tissues/organelles (Verma and Hong, 2001; Dong et al., 2008; Ellinger and Voigt, 2014). Callose is also involved in the plant response to biotic stress, with its deposition on the cell wall and at PD being important for restricting pathogen progression (Mauch-Mani and Mauch, 2005; Luna et al., 2011; Ellinger and Voigt, 2014). Among CalS genes, CalS10 (or GSL8) has been identified as the primary regulator of callose deposition at PD (Guseman et al., 2010; Ellinger and Voigt, 2014; Han et al., 2014).
In contrast, ABA has been shown to suppress expression of PR2 (a β-1,3-glucanase), which allows more callose to accumulate at PD (Rezzonico et al., 1998) and thereby reduces viral intercellular movement and spread (Iglesias and Meins, 2000; Heinlein, 2015). The negative effect of ABA on β-1,3-glucanases suggests that ABA can increase callose accumulation in different tissues and organelles (PD, cell wall, phloem sieve plates). In fact, a few studies listed below have shown the link between ABA induction, callose deposition and restriction of virus movement.
Below, we summarize the findings on the roles of callose in both compatible and incompatible plant-virus interactions:
Roles of Callose in Compatible Interactions
Most of the cases reporting a role for ABA in plant defense against viruses involve compatible interactions. ABA pretreatment has been shown to reduce levels of different RNA viruses, such as tobacco necrosis virus (TNV) on Phaseolus vulgaris (Iriti and Faoro, 2008), TMV on N. tabacum (Whenham et al., 1986; Fraser and Whenham, 1989), and BaMV on A. thaliana (Alazem et al., 2014). These works postulated that enhanced callose deposition at PD could explain the ABA-dependent resistance, which is supported by the inability of TNV, for example, to spread in ABA-treated leaves (Iriti and Faoro, 2008).
In compatible interactions, the response of plants to virus or viroid infections is not strong enough to prevent spread of the viral agents to other tissues, which is evident from the levels of defense responses such as ABA, SA, callose and reactive oxygen species (ROS) (Kovac et al., 2009; Baebler et al., 2014; Seo et al., 2014; Lopez-Gresa et al., 2016). Considering that the biosynthesis pathway of ABA (like those of other hormones such as SA and JA) takes place in the chloroplast (Finkelstein, 2013), and that certain viruses and viroids interfere with several machineries in such plastids (Zhao et al., 2016), this might be the reason why some plants do not produce sufficient amounts of ABA or callose in response to infection in leaves. In contrast, callose deposition in meristematic tissues seems to be more efficient in preventing viroid spread. For instance, the response of two different Argyranthemum cultivars (Yellow Empire and Border Dark Red) to infection with chrysanthemum stunt viroid (CSVd) revealed that less callose was deposited at PD in the shoot apical meristem (SAM) of Yellow Empire compared to Border Dark Red, which resulted in the spread of CSVd to the uppermost cell layers in the apical dome and the youngest leaf primordia 1 and 2 of Yellow Empire (Zhang Z. et al., 2015). However, the SAM in the Border Dark Red cultivar presented more callose particles, which prevented CSVd from spreading beyond the lower part of the apical domain and the older leaf primordia (Zhang Z. et al., 2015). Which factor controls or induces callose deposition in the SAM is unknown. Notably, both cultivars showed disease symptoms after infection with CSVd, which raises the question of whether callose deposition at PD occurs in other tissues (such as leaves) and whether this accumulation affects CSVd movement (Flores, 2016).
Roles of Callose in Incompatible Interactions
Callose deposition has been documented in resistant soybean plants (carrying the R-resistance gene) in response to SMV. This response restricted SMV to the inoculated sites, as no SMV RNA was detected beyond these sites (Li et al., 2012b). The same study also showed that susceptible soybean plants infected with SMV could not accumulate callose and, as a result, SMV infection spread (Li et al., 2012b). A similar study showed that another soybean cultivar, which possessed the Rsv3 gene, exhibited extreme resistance to SMV (Seo et al., 2014). This resistance was achieved by a subset of PP2C-encoding genes that comprise components of the ABA signaling pathway and are induced by ABA. Recognition of SMV's cylindrical inclusion effector by the cultivar's Rsv3 protein induced the ABA pathway and activated the PP2Ca3 gene which, in turn, induced callose deposition and conferred extreme resistance against SMV (Seo et al., 2014). However, the mechanism linking PP2C proteins and CalS genes (or their protein products) or β-1,3-glucanases is unknown. Thus, induction of ABA in some incompatible plant-virus interactions suggests a role for ABA in innate immunity that needs to be experimentally validated (Whenham et al., 1986; Melotto et al., 2008; Kovac et al., 2009; Pacheco et al., 2012; Seo et al., 2014).
Callose in the Early Antiviral Response: Is it Controlled by SA or ABA?
Early induction of ABA in some incompatible interactions supports the hypothesis that ABA plays a role during early immune responses against some viruses (Whenham et al., 1986; Melotto et al., 2008; Pacheco et al., 2012; Seo et al., 2014). However, it remains unclear whether or not callose deposition at that stage is completely ABA-dependent, because no ABA mutants have been assessed to confirm the role of ABA-dependent callose deposition in incompatible interactions (Figure 1A).
In contrast, much more is known about how SA affects PD and callose. Several reports have shown that SA induces PD closure and impairs their permeability by increasing the amount of callose deposited at PD (Cui and Lee, 2016). This effect requires the action of plasmodesmata-located protein 5 (PDLP5), which is dependent on NPR-1. PDLP5 controls the expression of the CalS1 and CalS8 genes that are responsible for callose synthesis and deposition at PD in response to SA treatment (Cui and Lee, 2016). The major gene involved in callose deposition at PD, CalS10, functions independently of PDLP5 or SA, as evidenced by the normal plasmodesmal permeability induced by exogenous SA in the cals10-1 mutant (Cui and Lee, 2016).
Despite the fact that both SA and ABA enhance callose deposition at PD (Figure 1B), the mechanism regulating this effect is quite different in each case. While the action of SA is mediated directly via specific genes (PDLP5, CalS1, and CalS8), ABA exerts a general indirect effect by transcriptionally decreasing β-1,3-glucanases, whose protein products may target all kinds of callose (Oide et al., 2013; Wang et al., 2013; Cui and Lee, 2016).
It is important to note that, in some cases, ABA does not lead to more callose deposition and, depending on growth conditions, its effect on callose deposition can even be reversed. For example, under conditions of low light intensity, high sucrose levels and the addition of Gamborg's vitamins to the growth medium, applications of ABA have been shown to repress callose deposition (Luna et al., 2011).
FIGURE 1 | (Caption, partially recovered.) (ii) Positive regulation of several AGO genes in the sRNA pathway, which reduces BaMV and PVX levels. ABA has an additional role in non-host resistance against PVX, because ABA deficiency resulted in limited accumulation of AGO2 so that Arabidopsis became susceptible to PVX accumulation and systemic movement. The role of ABA in incompatible interactions has not been addressed. However, the effects of ABA on the callose and sRNA pathways, as well as the increased ABA content in some incompatible interactions, may suggest a role in such interactions. (B) The antagonistic SA and ABA pathways positively regulate common subsets of antiviral resistance mechanisms: callose deposition and sRNA (half-green, half-blue circles). SA controls R-gene resistance and induces hypersensitive responses (HR) and the accumulation of reactive oxygen species (ROS) (green circles). SA contributes to the production of siRNAs and enhances callose deposition during early immune responses in incompatible interactions. In addition, exogenous application of SA increases plant tolerance to viruses in compatible interactions, which is supported by the increased susceptibility of lines with an impaired SA pathway.
Suppression of Callose-Mediated Defense
Although ABA reduces the expression of β-1,3-glucanases, which are responsible for callose degradation, some viroids have evolved ways to overcome the potential increase in callose deposition at PD. For example, PSTVd in tomato benefits from the activation of the small RNA (sRNA) pathway, which produces sRNAs derived from the virulence-modulating region of PSTVd. These viroid-derived sRNAs target the CalS11-like and CalS12-like genes to interfere with callose synthase mRNA levels, although their role in callose accumulation at PD in tomato plants is unknown (Adkar-Purushothama et al., 2015a). In addition, some viruses recruit host factors that help degrade or remove callose from PD, such as TAG4.4/SAG2.3, AtBG_ppap (a beta-glucanase), or ANK/TIP1-3, so that callose does not hinder viral intercellular trafficking (Burch-Smith and Zambryski, 2016).
Several gaps remain in our knowledge of the roles of the antagonistic ABA and SA pathways in callose-mediated restriction of virus spread in incompatible interactions. Some studies have addressed the roles of either hormone in incompatible interactions and callose deposition (Figure 1B). However, since both hormones appear to affect callose levels, a study that jointly tests the effects of both hormones on β-1,3-glucanase and CalS genes and proteins, and the consequent callose accumulation at PD or cell walls, would greatly clarify how cells react early to infection and induce defense responses.
ABA-DEPENDENT ANTIVIRAL DEFENSE THROUGH RNA SILENCING PATHWAYS
Because sRNAs are repressors of gene expression, their mechanism of action is referred to as RNA silencing, gene silencing, or RNA interference (Vaucheret, 2006). RNA silencing occurs on two levels: transcriptional gene silencing, and RNA degradation (post-transcriptional gene silencing, PTGS), which correlates with the accumulation of short-interfering RNAs (siRNAs) (Vaucheret, 2006). siRNAs are loaded into the RNA-induced silencing complex (RISC) and guide Argonaute (AGO) proteins (the key players in RISC) to cleave or inactivate RNAs derived from transposons, viral genes, transgenes, or endogenous genes, leading to their degradation (Vaucheret, 2006; Chapman and Carrington, 2007). In Arabidopsis, the backbone of the RNA silencing pathway consists of proteins from three families: (1) The Dicer-like (DCL) family, which comprises four genes (DCL1, DCL2, DCL3, and DCL4).
(2) The AGO family, which comprises 10 members (AGO1 to AGO10, of which AGO8 is thought to be a pseudogene) (Takeda et al., 2008; Mallory and Vaucheret, 2010; Seo et al., 2013). (3) The RNA-directed RNA polymerase (RDR) family, which comprises three functional genes: RDR1, RDR2, and RDR6. The antiviral RNA silencing pathway is PTGS-based, and several genes in the DCL, AGO, and RDR families appear to act redundantly against invading viruses (Vaucheret, 2008; Garcia-Ruiz et al., 2010, 2015; Pelaez and Sanchez, 2013; Seo et al., 2013). While siRNAs, which are derived from viruses, transgenes or a subset of endogenous genes, are cis-acting siRNAs, so that their action is termed autosilencing, micro-RNAs (miRNAs) originate from distinct genes, different from the ones they regulate, and their action is referred to as heterosilencing (Bartel, 2004; Vaucheret, 2006). Viruses have evolved viral suppressors of RNA silencing (VSRs) that enable them to counteract the antiviral RNA silencing pathway (Li and Ding, 2006; Burgyan and Havelda, 2011). Generally, VSRs are multifunctional and play vital roles in virus movement, replication or pathogenesis (Cao et al., 2010; Csorba et al., 2015). For example, the movement protein triple gene block protein 1 (TGBp1) of several potexviruses has VSR function along with its role in virus movement (Senshu et al., 2009; Lim et al., 2010; Brosseau and Moffett, 2015). Viruses often encode one VSR that can interfere with the RNA silencing pathway at different steps, such as binding dsRNA, preventing siRNA translocation or RISC assembly, or interacting with AGO proteins and impairing their silencing function (Li and Ding, 2006; Jin and Zhu, 2010; Burgyan and Havelda, 2011; Kenesi et al., 2017).
Until very recently, ABA-dependent callose deposition at PD was the only documented link between ABA and resistance to viruses. However, a recently revealed connection between ABA and the RNA silencing pathway has added another role for ABA in resistance to viruses (Alazem and Lin, 2015; Alazem et al., 2017). ABA-dependent defense against BaMV and PVX in Arabidopsis, for example, is mainly achieved through the RNA silencing pathway, not through callose deposition at PD (Jaubert et al., 2011; Alazem et al., 2017).
Role of ABA in Endogenous sRNA Pathways
Expanding evidence has attributed a regulatory role to ABA in sRNA pathways, such as the siRNA and miRNA pathways. Previous work reported that ABA is required for stabilization of cap-binding proteins (CBP) 20 and 80 through a post-translational mechanism. These two proteins function in the formation of pre-miRNA transcripts and facilitate splicing during miRNA biogenesis. In addition, cbp20 and cbp80 mutants are hypersensitive to ABA. It is known that CBP20 is a negative regulator of ABA-dependent drought tolerance, and mutation of this gene renders plants tolerant to drought (Papp et al., 2004; Kim et al., 2008; Kuhn et al., 2008). Similarly, CBP80 downregulation in potato reduced miR159 levels, thereby allowing accumulation of the miR159-target genes MYB33 and MYB101 and consequently increasing drought tolerance (Pieczynski et al., 2013). In fact, mutants of several components of the miRNA pathway, such as hyponastic leaves 1 (HYL1), HUA enhancer 1 (HEN1) and DCL1, also exhibit hypersensitivity to ABA (Lu and Fedoroff, 2000; Zhang et al., 2008). Other mutants, such as dcl2, dcl3, dcl4 and the corresponding triple mutant, have shown ABA supersensitivity. Expression of ABA-responsive genes such as RD22 and ABF3 was significantly increased in all dcl mutants. The mutants dcl2, dcl3 and dcl4, but not dcl1, showed increased levels of the ABI3, ABI4, and ABI5 gene products (Zhang et al., 2008). Of note, abi3-1 and abi4-1 increased plant susceptibility to BaMV infection, but the genes regulated by these factors are still unknown (Alazem et al., 2014). Indeed, several works have indicated that abiotic stresses such as drought, salinity, or cold (all of which are partially regulated by ABA) induce genes in the DCL and RDR families in tomato, maize and Populus trichocarpa (Qian et al., 2011; Bai et al., 2012; Zhao et al., 2015). The direct effect of ABA on the expression of DCLs or RDRs is exemplified by the increased expression of RDR1 in A. thaliana and of all RDRs in O. sativa, although only RDR6 was responsible for persistent ABA post-transcriptional control of gene silencing in O. sativa (Hunter et al., 2013).
Antiviral Role of ABA through Regulation of AGOs
Argonaute proteins are integral players in all sRNA pathways in plants and animals, comprising a family of 10 members in Arabidopsis (Carbonell, 2017). By associating with different sRNAs, they regulate the expression of many genes and thereby control several aspects of growth, development and resistance to viruses (Vaucheret, 2008; Carbonell and Carrington, 2015; Zhang H. et al., 2015). All AGOs have been reported to reduce levels of different viruses, with variations in efficiency probably due to the effects of VSRs (Brosseau and Moffett, 2015; Carbonell and Carrington, 2015; Brosseau et al., 2016; Alazem et al., 2017). For example, when the VSR of PVX (P25) was deleted, all overexpressed AGOs downregulated PVX-P25 in N. benthamiana (Brosseau and Moffett, 2015). Levels of miR168, which maintains AGO1 homeostasis, are regulated by ABA (Li et al., 2012a). Li et al. (2012a) found that AGO1 RNA is not induced within 12 h of ABA treatment in Arabidopsis seedlings because of the effect of miR168, but another study found that extending the ABA treatment to 4 days induced not only AGO1 but also AGO2 and AGO3 (Alazem et al., 2017). The latter study was conducted on ∼30-day-old Arabidopsis, compared to the 7-day-old seedlings used by Li et al. (2012a). These results were confirmed in ABA-deficient mutants (aba2-1 and aao3), in which AGO1, AGO2, and AGO3 were expressed at very low levels (Alazem et al., 2017). In that study, BaMV infection also induced AGO1, 2 and 3 expression, but when the aba2-1 and aao3 mutants were infected with BaMV, AGO1 and 2, but not 3, failed to accumulate to wild-type levels, indicating that the expression of these AGOs is ABA-dependent. Furthermore, ABA was found to have negative effects on AGO4 and AGO10 expression, but differential effects on AGO7 expression, since ABA treatment did not induce AGO7 mRNA accumulation in wild-type plants, whereas the ABA-deficient mutants (aba2-1 and aao3) showed significantly reduced expression of AGO7 (Alazem et al., 2017). These findings imply that ABA generally affects several genes in the RNA silencing pathway, perhaps representing an important tool by which ABA tunes plant responses to different stimuli.
Although AGO1 has antiviral activity against several viruses (Morel et al., 2002; Qu et al., 2008), BaMV levels were not reduced in the 4mAGO1 transgenic line, in which AGO1 was made resistant to the downregulatory effect of the AGO1-miR168 complex by four mismatches that prevent binding with miR168a. In the same context, the miR168a-2 mutant accumulates the AGO1 protein, but BaMV levels were still unaffected in this mutant compared to wild-type plants (Vaucheret, 2009; Alazem et al., 2017). It could be that either AGO1 has no clear effect against BaMV or that a VSR of BaMV (probably TGBp1) impairs the antiviral activity of AGO1. In contrast, the ago1-27 mutant showed reduced BaMV levels compared to wild-type plants because of the increased expression of AGO2 and AGO3 in this mutant. Surprisingly, BaMV levels were not affected in the ABA-treated ago3-2 mutant and were not significantly reduced in the ABA-treated ago2-1 mutant compared with the corresponding mock-treated mutants (Alazem et al., 2017). These findings imply that ABA-dependent resistance to BaMV is mainly achieved through AGO2 and AGO3, and that callose deposition at PD may not be the main resistance mechanism controlled by ABA, at least in some compatible interactions. In fact, restriction of viruses to the sites of infection during incompatible interactions can also be ascribed to the activity of the RNA silencing pathway, and further studies on this topic could reveal much about the involvement of ABA in incompatible interactions. In the same context, it was found that the RNA silencing pathway controls the non-host resistance of A. thaliana to PVX infection, mainly through AGO2 (Jaubert et al., 2011). This finding was also confirmed for the aba2-1 mutant, which produces very little AGO2, thereby allowing PVX to accumulate locally and move systemically compared to the scenario in wild-type plants (Alazem et al., 2017).
FIGURE 2 | Abscisic acid (ABA) effects on BaMV accumulation and plant antiviral resistance. A large part of ABA biosynthesis takes place in the chloroplast. Impairment of genes that function in the chloroplast, such as ABA1 in Nicotiana benthamiana or NCED3 in Arabidopsis thaliana, significantly reduces BaMV levels. The last two steps in ABA biosynthesis take place in the cytosol, where ABA2 converts xanthoxin into ABA-aldehyde and AAO3 oxidizes ABA-aldehyde to produce ABA. ABA2 mutants have markedly reduced levels of BaMV (−)RNA. Whether ABA2, ABA-aldehyde or other factors controlled by ABA2 are required for BaMV to accumulate is unknown. In contrast, mutation of AAO3 and downstream genes increases susceptibility to BaMV. ABA partially controls the expression of the AGO gene family and induces AGO1, 2 and 3, with AGO2 and 3 but not AGO1 acting against BaMV. In addition, ABA induction of callose is ineffective against BaMV, because plants silenced in CalS10 still show resistance after ABA treatment. HF, host factor.
Several studies have addressed the roles of the RNA silencing pathway in resistance to viroids, since RNA or DNA replication intermediates trigger this pathway (Minoia et al., 2014; Adkar-Purushothama et al., 2015b; Carbonell and Daros, 2017). Since ABA regulates several genes in this pathway, ABA could also play a role in mediating resistance to viroids. For example, Minoia et al. (2014) found that A. thaliana AGO1, AGO2, AGO3, AGO4, AGO5, and AGO9 were loaded with PSTVd-derived sRNAs in infected N. benthamiana plants. Given the regulatory role of ABA in AGO1, 2, and 3, it is possible that ABA participates in resistance to viroids through these AGOs.
ABA AND RECESSIVE RESISTANCE
Recessive resistance is defined as the loss of susceptibility when an important host factor required for virus replication is impaired (Hashimoto et al., 2016). To date, most of the discovered recessive-resistance genes belong to the translation initiation factor (eIF) 4E and eIF4G groups (Hashimoto et al., 2016). However, other host factors are involved in BaMV accumulation and they localize to the cytosol and chloroplast (Figure 2). Further information on those factors is described in a recent review (Huang et al., 2017). Here, we briefly focus on the chloroplast-related genes since ABA and other hormones are biosynthesized in chloroplasts.
Chloroplast phosphoglycerate kinase (cPGK) interacts with the 3′-untranslated region of BaMV to direct BaMV RNA to the chloroplasts, and silencing or mislocalization of cPGK significantly reduces BaMV levels (Lin et al., 2007; Cheng et al., 2013). BaMV minus-strand (−)RNA has been detected within chloroplasts, which suggests localization of BaMV replication intermediates there (Lin et al., 2007; Cheng et al., 2013). In accordance with these findings, the Arabidopsis genotype Cvi-0 carries a natural recessive resistance gene, rwm1, which encodes a mutated cPGK protein and confers resistance to two potyviruses (watermelon mosaic virus and plum pox virus) but not to the potexvirus PVX or the cucumovirus CMV (Lin et al., 2007; Ouibrahim et al., 2014; Poque et al., 2015). Furthermore, the ABA biosynthesis gene ABA2 and the upstream gene NCED3 are important for BaMV (−)RNA accumulation (Alazem et al., 2014). Because of the feedback loop in the ABA biosynthesis pathway, the nced3 mutant exhibited low levels of ABA2, accounting for the low level of BaMV in that mutant. Hence, ABA2 is required for a step preceding BaMV translation, and a similar role was also suggested for the accumulation of CMV in A. thaliana (Alazem et al., 2014).
In the same context, in the ABA biosynthesis pathway, ABA1 and NCED3 are localized in the chloroplasts, whereas ABA2 and AAO3 (the aao3 mutant is highly susceptible to BaMV, unlike the aba2-1 mutant; Alazem et al., 2014) are localized in the cytosol. Hence, the chloroplast part of the ABA biosynthesis pathway may be required for BaMV accumulation (Figure 2). It is still not known whether this recessive resistance is the result of the ABA2 substrate or of other factors controlled by ABA2. The different localization of cPGK and ABA2 (Cheng et al., 2002) and the different nature of the substrates they handle may suggest different roles.
CONCLUDING REMARKS
The increased expression of several genes of the AGO, RDR, and DCL families in response to ABA, as well as the observation that several of these genes are important players in the antiviral RNA silencing pathway, strengthens the notion that the antiviral role of ABA is partially achieved through the RNA silencing pathway. The additional effect of ABA-dependent callose deposition at PD thus endows ABA with a dual function in restricting virus spread (Figure 1A). Both mechanisms have been assessed only for BaMV (Figure 2), and the findings have shown that callose deposition is not the only defense mechanism mediated by ABA. Further studies with other viruses and viroids will reveal how efficient these mechanisms are in different pathosystems.
The antagonism between SA and ABA is well documented, whereby downstream genes of either pathway are suppressed if the other hormone is applied or induced (Yasuda et al., 2008; Zabala et al., 2009; Moeder et al., 2010). It is known that viruses disrupt hormonal balance in compatible interactions, leading to simultaneous induction of some antagonistic pathways, such as ABA and SA in the case of BaMV and CMV (Alazem et al., 2014). However, because of the positive effects that both hormones have on the same subset of defense responses (Figure 1B), it is not clear whether these two antagonistic pathways actually act antagonistically during viral infections. Antagonism is evident in some incompatible interactions in which the induction of these pathways is strong and sequential rather than concurrent, implying that each hormone takes a role in triggering several redundant antiviral mechanisms (Alazem and Lin, 2015), but experimental evidence is lacking.
AUTHOR CONTRIBUTIONS
MA and N-SL wrote, revised, and approved this manuscript.
import time
import json
import re
import tekore as tk
from termcolor import colored
from SpotifyRecommender import config_project, mpd_connector
class TagExtractor:
"""
This class calls the Spotify API to get high-level descriptors for the songs on the MPD server specified in the config file.
These descriptors are serialized to a JSON file along with "artist", "title", "popularity", "genre", "album" and "date".
The top 3 related artists for every artist in the MPD media library are stored in a second JSON file.
Instantiate this class to trigger the functionality described above.
"""
def __init__(self):
print("Extracting tags from Spotify")
self.spotify = self.init_spotify()
song_list = mpd_connector.MpdConnector(config_project.MPD_IP, config_project.MPD_PORT).get_all_songs()
id_name_list = self.get_spotify_data(song_list)
self.get_similar_artists(id_name_list)
list_with_high_level_tags = self.match_high_level_tags(id_name_list)
self.save_as_json(list_with_high_level_tags, config_project.PATH_SONG_DATA)
print("Finished extracting Tags.")
def init_spotify(self):
"""
Initialize the main entity for the Spotify API. This includes Authorization.
:return: Spotify object on which API methods can be called
"""
cred = tk.RefreshingCredentials(config_project.CLIENT_ID, config_project.CLIENT_SECRET)
app_token = cred.request_client_token()
sender = tk.RetryingSender(sender=tk.CachingSender())
spotify = tk.Spotify(app_token, sender=sender)
return spotify
def get_spotify_data(self, songnames_dict):
"""
Getting the spotify ids by searching for artists and songnames parsed from the mpd tags
"""
songnames_dict = self._remove_brackets(songnames_dict)
spotify_id_list = []
error_list = []
for single_track_info in songnames_dict:
try:  # just a safeguard in case RetryingSender fails, e.g. on unexpected exceptions
track_paging_object, = self.spotify.search(
single_track_info["title"] + " " + single_track_info["artist"], limit=1)
if len(track_paging_object.items) != 0:
spotify_id_list.append({"artist": single_track_info["artist"], "title": single_track_info["title"],
"popularity": track_paging_object.items[0].popularity,
"id": track_paging_object.items[0].id, "genre": single_track_info["genre"],
"album": single_track_info["album"],
"artist_id": track_paging_object.items[0].artists[0].id})
else:
error_list.append(single_track_info)
except Exception as e:
print(e)
time.sleep(1)
print("waiting 1s, unexpected api exception")
error_list.append(single_track_info)
print(colored("Found on Spotify:", "green"), len(spotify_id_list), "/", len(songnames_dict))
return spotify_id_list
def _remove_brackets(self, dict_list):
"""
        Remove bracketed segments from song and artist names to increase the chance of a match on Spotify.
        :param dict_list: list of song dicts with "title" and "artist" keys
        :return: the same list with cleaned names
"""
for song in dict_list:
song["title"] = re.sub("\((.*?)\)", "", song["title"]).strip()
song["artist"] = re.sub("\((.*?)\)", "", song["artist"]).strip()
return dict_list
    def get_similar_artists(self, spotify_data):
        """
        Get the top 3 related artists on Spotify and serialize them to a JSON file.
        Path defined by PATH_RELATED_ARTISTS.
:param spotify_data: Returned by get_spotify_data
:return:
"""
artist_dict = {}
for song_info in spotify_data:
if song_info["artist"] not in artist_dict:
related_artists = self.spotify.artist_related_artists(song_info["artist_id"])
related_artists_temp = []
                for i in range(0, 3):  # just append the first 3 related artists
                    if len(related_artists) <= i:  # fewer than i + 1 related artists exist
                        related_artists_temp.append("placeholder artist")
                        continue
                    related_artists_temp.append(related_artists[i].name)
artist_dict[song_info["artist"]] = related_artists_temp
self.save_as_json(artist_dict, config_project.PATH_RELATED_ARTISTS)
return artist_dict
def match_high_level_tags(self, id_name_list):
"""
:param id_name_list: from get_spotify_data
:return: list of dict
"""
for song_info in id_name_list:
audio_features = self.spotify.track_audio_features(song_info["id"])
reduced_audio_features = AudioFeatures(audio_features.valence, audio_features.danceability,
audio_features.energy, self._scale_tempo_down(audio_features.tempo),
audio_features.acousticness,
audio_features.speechiness)
song_info["audio_features"] = reduced_audio_features.asdict()
song_info.pop("id", None)
song_info.pop("artist_id", None)
return id_name_list
def _scale_tempo_down(self, tempo_in_bpm):
"""
        Scale the tempo attribute down to a 0 - 1 range. Max BPM (beats per minute) is assumed to be 205, since it's
        extremely rare for a song to have a higher BPM.
"""
max_bpm = 205
min_bpm = 60
if tempo_in_bpm > max_bpm:
tempo_in_bpm = max_bpm
if tempo_in_bpm <= min_bpm:
tempo_in_bpm = min_bpm + 1
max_bpm -= min_bpm
tempo_in_bpm -= min_bpm
return round(tempo_in_bpm / max_bpm, 3)
    def save_as_json(self, high_level_dict_list, save_path):
        with open(save_path, "w") as json_file:
            json.dump(high_level_dict_list, json_file, indent=4)
class AudioFeatures:
def __init__(self, valence, danceability, energy, tempo, acousticness, speechiness):
self.valence = valence
self.danceability = danceability
self.energy = energy
self.tempo = tempo
self.acousticness = acousticness
self.speechiness = speechiness
def asdict(self):
return {"valence": self.valence, "danceability": self.danceability, "energy": self.energy, "tempo": self.tempo,
"acousticness": self.acousticness, "speechiness": self.speechiness}
|
/*
For nanoseconds, most platforms have nothing available that
(a) doesn't require bringing in a 40-kb librt.so library
(b) really has nanosecond resolution.
*/
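
/*
  Usage sketch (assumes the ulonglong typedef provided by the surrounding
  headers):

    ulonglong start = my_timer_nanoseconds();
    ... timed work ...
    ulonglong elapsed_ns = my_timer_nanoseconds() - start;
*/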
ulonglong my_timer_nanoseconds(void) {
#if defined(HAVE_SYS_TIMES_H) && defined(HAVE_GETHRTIME)
return (ulonglong)gethrtime();
#elif defined(HAVE_CLOCK_GETTIME) && defined(CLOCK_REALTIME)
{
struct timespec tp;
clock_gettime(CLOCK_REALTIME, &tp);
return (ulonglong)tp.tv_sec * 1000000000 + (ulonglong)tp.tv_nsec;
}
#elif defined(__APPLE__) && defined(__MACH__)
{
ulonglong tm;
static mach_timebase_info_data_t timebase_info = {0, 0};
if (timebase_info.denom == 0) (void)mach_timebase_info(&timebase_info);
tm = mach_absolute_time();
return (tm * timebase_info.numer) / timebase_info.denom;
}
#else
return 0;
#endif
} |
package com.xunlei.downloadprovider.download.tasklist.list.c;
import android.view.View;
import android.view.View.OnClickListener;
import com.xunlei.downloadprovider.download.util.g;
import com.xunlei.downloadprovider.download.util.n;
import org.android.spdy.SpdyProtocol;
// compiled from: DownloadFreeTrialBanner.java
final class f implements OnClickListener {
final /* synthetic */ d a;
f(d dVar) {
this.a = dVar;
}
public final void onClick(View view) {
if (!g.a().a(this.a.c.getTaskId())) {
g.a().a(this.a.c.getTaskId(), g.a, true);
        } else if (this.a.c.mVipChannelStatus == 16) {
            g.a().a(this.a.c.getTaskId(), g.c, true);
        } else if (n.a(this.a.c, n.g(this.a.c))) {
            g.a().a(this.a.c.getTaskId(), g.c, true);
        } else {
            g.a().a(this.a.c.getTaskId(), g.b, true);
        }
this.a.a(SpdyProtocol.PUBKEY_SEQ_ADASH);
d.g(this.a);
if (this.a.d != null) {
this.a.d.onClick(view);
}
}
}
|
pub mod prelude {
pub use super::Args;
pub use super::CliCommand;
pub use super::CliCommands;
pub use super::CliOption;
pub use super::CliOptions;
}
mod names {
pub(super) const CMD_HELP: &str = "help";
pub(super) const CMD_STATUS: &str = "status";
pub(super) const CMD_DUMP_CONFIG: &str = "dump-config";
pub(super) const OPT_DOUBLE_HELP: &str = "help";
pub(super) const OPT_DOUBLE_VERSION: &str = "version";
pub(super) const OPT_SINGLE_HELP: char = 'h';
pub(super) const OPT_SINGLE_VERSION: char = 'v';
}
mod commands;
mod options;
pub use commands::{CliCommand, CliCommands};
pub use options::{CliOption, CliOptions};
use crate::error::prelude::*;
use std::convert::TryFrom;
use std::env;
pub struct Args {
pub commands: CliCommands,
pub options: CliOptions,
}
impl Args {
pub fn new() -> MyResult<Self> {
let (commands, options) = env::args().skip(1).try_fold(
(CliCommands::default(), CliOptions::default()),
|(mut commands, mut options), arg| {
                if let Ok(opts) = CliOptions::try_from(arg.as_str()) {
                    options.0.append(&mut opts.into());
                    Ok((commands, options))
                } else if let Ok(cmd) = CliCommand::try_from(arg.as_str()) {
                    commands.0.push(cmd);
                    Ok((commands, options))
                } else {
                    Err(Error::InvalidArgument(arg))
                }
},
)?;
Ok(Self { commands, options })
}
}
pub fn print_help() {
let opt_help = {
let opt = CliOption::Help;
format!("-{}, --{}", opt.name_single(), opt.name_double())
};
let opt_vers = {
let opt = CliOption::Version;
format!("-{}, --{}", opt.name_single(), opt.name_double())
};
let cmd_status = CliCommand::Status.name();
let cmd_help = CliCommand::Help.name();
let cmd_dump_config = CliCommand::DumpConfig.name();
println!(
r#"{description}
USAGE:
{name} [OPTIONS] [COMMAND]
OPTIONS:
{opt_help:<opt_width$} Print this help message and exit.
{opt_vers:<opt_width$} Print version information and exit.
COMMANDS:
{cmd_status}
Print the current cmus playback status
with the format configured in the config.toml file.
This is the default command, so you may omit this argument.
{cmd_dump_config}
Print the default config as TOML to stdout.
To write the default config to the proper config file, run something like:
mkdir -p ~/.config/{name}
{name} {cmd_dump_config} > ~/.config/{name}/config.toml
{cmd_help}
Print this help message and exit."#,
description = crate::meta::DESCRIPTION,
name = crate::meta::NAME,
opt_width = 16,
opt_help = opt_help,
opt_vers = opt_vers,
cmd_status = cmd_status,
cmd_help = cmd_help,
cmd_dump_config = cmd_dump_config,
);
}
pub fn print_version() {
println!("{} v{}", crate::meta::NAME, crate::meta::VERSION)
}
pub fn dump_config() {
print!(
r#"# DEFAULT CONFIG FOR {name}
# To write this config to the proper config file, run something like:
# mkdir -p ~/.config/{name}
# {name} {cmd_dump_config} > ~/.config/{name}/config.toml
{config}"#,
name = crate::meta::NAME,
cmd_dump_config = CliCommand::DumpConfig.name(),
config = crate::config::DEFAULT_CONFIG
);
}
|
Restriction of Foamy Viruses by APOBEC Cytidine Deaminases
ABSTRACT Foamy viruses (FVs) are nonpathogenic retroviruses infecting many species of mammals, notably primates, cattle, and cats. We have examined whether members of the apolipoprotein B-editing catalytic polypeptide-like subunit (APOBEC) family of antiviral cytidine deaminases restrict replication of simian FV. We show that human APOBEC3G is a potent inhibitor of FV infectivity in cell culture experiments. This antiviral activity is associated with cytidine editing of the viral genome. Both molecular FV clones and primary uncloned viruses were susceptible to APOBEC3G, and viral infectivity was also inhibited by murine and simian APOBEC3G homologues, as well as by human APOBEC3F. Wild-type and bet-deleted viruses were similarly sensitive to this antiviral activity, suggesting that Bet does not significantly counteract APOBEC proteins. Moreover, we did not detect FV sequences that may have been targeted by APOBEC in naturally infected macaques, but we observed a few G-to-A substitutions in humans that have been accidentally contaminated by simian FV. In infected hosts, the persistence strategy employed by FV might be based on low levels of replication, as well as avoidance of cells expressing large amounts of active cytidine deaminases. |
import { Injectable, SecurityContext } from '@angular/core';
import { DomSanitizer } from '@angular/platform-browser';

// jQuery is assumed to be loaded globally by the host page.
declare const $: any;
@Injectable({
providedIn: 'root'
})
export class MytipService {
public a: any;
public wsTip: any;
public interval: any;
constructor(
public domSanitizer: DomSanitizer
) {
this.a = window.document.createElement('div');
this.a.className = 'tip-box';
this.a.innerHTML = '';
window.document.querySelectorAll('body')[0].appendChild(this.a);
this.wsTip = window.document.createElement('div');
this.wsTip.className = 'wsTip-box';
this.wsTip.innerHTML = '';
window.document.querySelectorAll('body')[0].appendChild(this.wsTip);
}
public escapeHtml(strings: any) {
    const entityMap: any = {
      '&': '&amp;',
      '<': '&lt;',
      '>': '&gt;',
      '"': '&quot;',
      '\'': '&#39;',
      '/': '&#x2F;'
    };
return String(strings).replace(/[&<>"'\/]/g, (s) => {
return entityMap[s];
});
}
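  // e.g. escapeHtml('<a href="x">') === '&lt;a href=&quot;x&quot;&gt;'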
public alertInfo(options: any) {
let interval: any;
window.document.querySelectorAll('.tip-box')[0].innerHTML = '';
let img = '';
let imgClose = '';
let html = '';
switch (options.type) {
case 'warn':
img = ` <img src='./assets/img/newSvg/warn.svg' class="tip-icon" />`;
imgClose = ` <img src='./assets/img/newSvg/warn_close.svg' onclick='' class='tip-close-btn'/>`;
html = ` <div class='mytip'>
<div class="tip-content tip-content-warn"><div>` + img +
this.domSanitizer.sanitize(SecurityContext.HTML, options.content) +
` </div><div class="tip-close"> ` + imgClose + ` </div></div>
</div>`;
break;
case 'success':
img = ` <img src='./assets/img/newSvg/ok.svg' class="tip-icon" />`;
imgClose = ` <img src='./assets/img/newSvg/ok_close.svg' onclick='' class='tip-close-btn'/>`;
html = ` <div class='mytip'>
<div class="tip-content tip-content-success"><div>` + img +
this.domSanitizer.sanitize(SecurityContext.HTML, options.content) +
` </div><div class="tip-close"> ` + imgClose + ` </div></div>
</div>`;
break;
case 'error':
img = ` <img src='./assets/img/newSvg/error.svg' class="tip-icon" />`;
imgClose = ` <img src='./assets/img/newSvg/error_close.svg' onclick='' class='tip-close-btn'/>`;
html = ` <div class='mytip'>
<div class="tip-content tip-content-error"><div>` + img +
this.domSanitizer.sanitize(SecurityContext.HTML, options.content) +
` </div><div class="tip-close"> ` + imgClose + ` </div></div>
</div>`;
break;
case 'tip':
img = ` <img src='./assets/img/newSvg/tip.svg' class="tip-icon" />`;
imgClose = ` <img src='./assets/img/newSvg/tip_close.svg' onclick='' class='tip-close-btn'/>`;
html = ` <div class='mytip'>
<div class="tip-content tip-content-tip"><div>` + img +
this.domSanitizer.sanitize(SecurityContext.HTML, options.content) +
` </div><div class="tip-close"> ` + imgClose + ` </div></div>
</div>`;
break;
default:
img = ` <img src='./assets/img/newSvg/ok.svg' class="tip-icon" />`;
imgClose = ` <img src='./assets/img/newSvg/ok_close.svg' onclick='' class='tip-close-btn'/>`;
html = ` <div class='mytip'>
<div class="tip-content tip-content-success"><div>` + img +
this.domSanitizer.sanitize(SecurityContext.HTML, options.content) +
` </div><div class="tip-close"> ` + imgClose + ` </div></div>
</div>`;
break;
}
html = `${html}`;
if (this.wsTip.innerHTML === '') {
this.a.style.top = '50px';
} else {
this.a.style.top = '100px';
}
this.a.innerHTML = html;
this.a.style.display = 'block';
interval = setTimeout(() => {
this.a.style.display = 'none';
this.wsTip.style.top = '50px';
}, options.time);
    $('.tip-close-btn').on('click', (e) => {
      clearTimeout(interval);  // `interval` holds a setTimeout handle
      interval = null;
      this.a.style.display = 'none';
    });
}
public wsErrorTip(options: any) {
let img = '';
let imgClose = '';
let html = '';
switch (options.type) {
case 'warn':
img = ` <img src='./assets/img/newSvg/warn.svg' class="tip-icon" />`;
imgClose = ` <img src='./assets/img/newSvg/warn_close.svg' class='tip-close-btn-ws'/>`;
html = ` <div class='myWstip'>
<div class="tip-content tip-content-warn"><div>` + img +
this.domSanitizer.sanitize(SecurityContext.HTML, options.content) +
` </div><div class="tip-close"> ` + imgClose + ` </div></div>
</div>`;
break;
default:
img = ` <img src='./assets/img/newSvg/ok.svg' class="tip-icon" />`;
imgClose = ` <img src='./assets/img/newSvg/ok_close.svg' onclick='' class='tip-close-btn'/>`;
html = ` <div class='myWstip'>
<div class="tip-content tip-content-success"><div>` + img +
this.domSanitizer.sanitize(SecurityContext.HTML, options.content) +
` </div><div class="tip-close"> ` + imgClose + ` </div></div>
</div>`;
break;
}
html = `${html}`;
if (this.a.style.display !== 'block') {
this.wsTip.style.top = '50px';
} else {
this.wsTip.style.top = '100px';
}
this.wsTip.innerHTML = html;
this.wsTip.style.display = 'block';
}
public clearWsTip() {
this.wsTip.innerHTML = '';
this.wsTip.style.display = 'none';
}
}
|
//-----------------------------------------------------------------------------
// Abort any pending or blocked reception.
//-----------------------------------------------------------------------------
void ts::TunerDevice::abort(bool silent)
{
_aborted = true;
if (!_reading_dvr) {
hardClose(silent ? nullptr : &_duck.report());
}
} |
3D beamforming trials with an active antenna array
Beamforming techniques for mobile wireless communication systems like LTE using multiple antenna arrays for transmission and reception to increase the signal-to-noise-and-interference ratio (SINR) are state of the art. The increased SINR is not only due to a larger gain in the direction of the desired user, but also due to a better control of the spatial distribution of interference in the cell. To further enhance the system performance not only the horizontal, but also the vertical dimension can be exploited for beam pattern adaptation, thus giving an additional degree of freedom for interference avoidance among adjacent cells. This horizontal and vertical beam pattern adaptation is also referred to as 3D beamforming in the following. This paper describes investigations of the potential of 3D beamforming with lab and field trial setups and provides initial performance results. |
AN international magazine with ties to the Financial Times newspaper has named Limerick as a ‘European City of the Future’.
fDi magazine, an international news and foreign direct investment magazine published by fDi Intelligence, a specialist division of the Financial Times, named Limerick as one of the principal winners among the European Cities of the Future Awards for 2016/2017.
fDi Magazine’s biennial league tables have become a key benchmark of competitiveness and serve as a barometer of attractiveness for the many cities and regions that are proactively pitching themselves for inward investment.
Limerick was named European City of the Future in its population category and has been included in a list of top 10 European cities in its population category for Human Capital and Lifestyles.
It also included Limerick among the Top 10 Northern European Cities. The city was also runner-up in the population category for Business Friendliness, Economic Potential and FDI Strategy.
It was nominated by the local authority for the awards, and Mayor Cllr Liam Galvin said “success in this prestigious annual awards scheme is testament to the council's ongoing good work with the likes of IDA Ireland and Enterprise Ireland in securing numerous significant investment projects in the past 12-18 months, including Uber, Dell, AMAX and viagogo.
“Limerick is continuing to demonstrate its ability to attract considerable FDI in comparison to other parts of this country and I am delighted that this fantastic achievement is now receiving international recognition which will further boost efforts to promote the city's attractiveness as an investment location,” he added.
Pat Daly of the council's economic development and planning department said the local authority was "particularly pleased with its showing in the FDI Strategy category, as this element of the awards scheme gives recognition to the work being undertaken by European cities to attract foreign direct investment.
"In recent years, Limerick has emerged as an attractive investment location for businesses from inside and outside of Ireland. Ongoing investment has transformed the city centre with the arrival of more than 30 new retailers in the past 12 months. In the Limerick city region today there is more than 14,500 people working in 116 overseas companies.
"To capitalise on this progress, Limerick is investing over €1 billion in enterprise and investment infrastructure and targeting the delivery of 12,000 as part of the Limerick 2030 Vision: An Economic and Spatial Plan for Limerick," he added. |
Amber Hollibaugh, a long-time activist, told Laura Flanders on Sunday that those with alternative sexualities were “nowhere near close” to being sexually liberated.
Despite the repeal on the military policy “Don’t Ask, Don’t Tell” and growing acceptance of same sex marriage, Hollibaugh said the LGBT movement still had a long way to go in regards to sexual freedom.
“I’m not sorry that we can now enter the military and I’m not sorry that we can marry, but frankly I come from a moment in time and a radical vision in time that never made marriage or the military my criteria of success,” she explained. “I didn’t want us to have wars, I didn’t want us to have armies, and I didn’t want to register my relationship with the state.”
“So, are those victories? They are,” Hollibaugh added. “Were they discriminatory? They were. Were they my idea of what it was that we were trying to build as a liberation movement for queer people? No.”
She expressed her disappointment that the LGBT movement did not fight for “the importance of the erotic, of people actually getting to fulfill desire and not be punished because they have it.”
Hollibaugh is the Interim Executive Director of Queers for Economic Justice.
Watch video, uploaded to YouTube, below: |
import java.util.Scanner;
public class Main {
private static int m, n, p, q;
private static int cur, pos, sum, len;
private static int max, min, ans, res;
private static int ai, bi, ci, di;
private static int[] aix, bix;
private static long al, bl, cl, dl;
private static long[] alx, blx;
private static double ad, bd, cd, dd;
private static double[] adx, bdx;
private static boolean ab, bb, cb, db;
private static boolean[] abx, bbx;
private static char ac, bc, cc, dc;
private static char[] acx, bcx;
private static String as, bs, cs, ds;
private static Scanner in = new Scanner(System.in);
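    // Scans the input for the pattern "[ : |...| : ]" in order: the first '[',
    // a ':' after it, the last ']', and a ':' before it. It then counts the '|'
    // characters strictly between the two colons and prints that count + 4
    // (for '[', ':', ':', ']'), or -1 if no such ordered pattern exists.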
public static void main(String[] args) {
as = in.next();
acx = as.toCharArray();
m = acx.length;
for (p = 0; p < m; p++) {
if (acx[p] == '[') {
ai++;
break;
}
}
for (; p < m; p++) {
if (acx[p] == ':') {
ai++;
break;
}
}
for (q = m - 1; q >= 0; q--) {
if (acx[q] == ']') {
bi++;
break;
}
}
for (; q >= 0; q--) {
if (acx[q] == ':') {
bi++;
break;
}
}
        if (ai == 2 && bi == 2 && p < q) {
            for (int i = p + 1; i < q; i++) {
                if (acx[i] == '|') {
                    ci++;
                }
            }
            System.out.println(ci + 4);
        } else {
            System.out.println(-1);
        }
}
} |
// server/src/config.rs
use anyhow::Context;
use hocon::de::wrappers::Serde;
use serde::Deserialize;
use std::env;
use std::time::Duration;
const CONFIG_PATH: &str = "/config/config.conf";
// pub type Config = Arc<AppConfig>
#[derive(Deserialize, Debug, Clone)]
pub struct Config {
pub port: u32,
pub addr: String,
pub file_dir: String,
#[serde(default)]
pub cors: bool,
pub admin_config: Option<AdminConfig>,
pub https: Option<HttpsConfig>,
#[serde(default)]
pub cache: CacheConfig,
#[serde(default)]
pub domains: Vec<DomainConfig>,
}
// TODO: create a config with lots of default values
impl Config {
pub fn load() -> anyhow::Result<Self> {
        let config_path = env::var("SPA_CONFIG").unwrap_or_else(|_| CONFIG_PATH.to_string());
let load_file = hocon::HoconLoader::new()
.load_file(&config_path)
.with_context(|| "can not read config file")?;
load_file
.resolve::<Config>()
.with_context(|| "parse config file error")
}
}
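
// Example config.conf, a sketch with illustrative values only (the keys
// mirror the struct fields above; optional sections may be omitted):
//
//   port: 8080
//   addr: "0.0.0.0"
//   file_dir: "/data/web"
//   cors: true
//   cache { max_size: 10485760, compression: true }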
#[derive(Deserialize, Debug, Clone, Eq, PartialEq)]
pub struct AdminConfig {
pub port: u32,
pub addr: String,
pub token: String,
#[serde(default = "default_max_upload_size")]
pub max_upload_size: u64,
pub deprecated_version_delete: Option<DeprecatedVersionRemove>,
}
fn default_max_upload_size() -> u64 {
30 * 1024 * 1024
}
#[derive(Deserialize, Debug, Clone)]
pub struct DomainConfig {
pub domain: String,
pub cors: Option<bool>,
pub cache: Option<DomainCacheConfig>,
pub https: Option<DomainHttpsConfig>,
}
#[derive(Deserialize, Debug, Clone)]
pub struct DomainHttpsConfig {
pub ssl: Option<SSL>,
pub http_redirect_to_https: Option<bool>,
//#[serde(default)]
//pub disabled: bool,
}
#[derive(Deserialize, Debug, Clone)]
pub struct SSL {
pub private: String,
pub public: String,
}
#[derive(Deserialize, Debug, Clone)]
pub struct HttpsConfig {
pub ssl: Option<SSL>,
pub port: i32,
pub addr: String,
#[serde(default)]
pub http_redirect_to_https: bool,
}
// should write Deserialize by hand.
#[derive(Deserialize, Debug, Clone)]
pub struct CacheConfig {
#[serde(default = "default_max_size")]
pub max_size: u64,
#[serde(default)]
pub compression: bool,
#[serde(default)]
pub client_cache: Vec<ClientCacheItem>,
}
#[derive(Deserialize, Debug, Clone)]
pub struct DomainCacheConfig {
pub max_size: Option<u64>,
pub compression: Option<bool>,
pub client_cache: Option<Vec<ClientCacheItem>>,
}
fn default_max_size() -> u64 {
10 * 1024 * 1024
}
#[derive(Deserialize, Debug, Clone)]
pub struct ClientCacheItem {
#[serde(deserialize_with = "Serde::<Duration>::with")]
pub expire: Duration,
pub extension_names: Vec<String>,
}
impl Default for CacheConfig {
fn default() -> Self {
CacheConfig {
max_size: default_max_size(),
client_cache: Vec::new(),
compression: false,
}
}
}
#[derive(Deserialize, Debug, Clone, Eq, PartialEq)]
pub struct DeprecatedVersionRemove {
#[serde(default = "default_cron")]
pub cron: String,
#[serde(default = "default_max_reserve")]
pub max_reserve: u32,
}
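// Cron expressions here use the 6-field form (sec min hour day-of-month month
// day-of-week); the default "0 0 3 * * *" fires at 03:00:00 every day.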
pub fn default_cron() -> String {
String::from("0 0 3 * * *")
}
pub fn default_max_reserve() -> u32 {
    2
}
|
/**
* Writes a general header of debug information iff the debug flag is set to
* true
*/
protected void writeDebugHeader() {
if (this.debug) {
System.out.println("Product location to search : " + this.currentBaselineLocation);
System.out.println("Report location : " + this.reportLocation);
System.out.println("Searching for API references : " + this.considerapi);
System.out.println("Searching for internal references : " + this.considerinternal);
System.out.println("Searching for illegal API use : " + this.considerillegaluse);
if (this.excludeListLocation != null) {
System.out.println(
"Exclude list location : " + this.excludeListLocation);
} else {
System.out.println(
"No exclude list location");
}
if (this.includeListLocation != null) {
System.out.println(
"Include list location : " + this.includeListLocation);
} else {
System.out.println(
"No include list location");
}
if (this.filters != null) {
System.out.println(
"API Filter location : " + this.filters);
} else {
System.out.println(
"No API filter location");
}
if (this.scopepattern == null) {
System.out.println(
"No scope pattern defined - searching all bundles");
} else {
System.out.println(
"Scope pattern : " + this.scopepattern);
}
if (this.referencepattern == null) {
System.out.println(
"No baseline pattern defined - reporting references to all bundles");
} else {
System.out.println(
"Baseline pattern : " + this.referencepattern);
}
System.out.println("-----------------------------------------------------------------------------------------------------");
}
} |
import React from "react";
import { render, screen } from "@testing-library/react";
import userEvent from "@testing-library/user-event";
import { DQNode } from "../models/dq-node";
import { Variable } from "../models/variable";
import { DQRoot } from "../models/dq-root";
import { GenericContainer } from "../models/test-utils";
import { Diagram } from "./diagram";
beforeAll(() => {
// Setup ResizeObserver and offset* properties
// see: https://github.com/wbkd/react-flow/issues/716
window.ResizeObserver =
window.ResizeObserver ||
jest.fn().mockImplementation(() => ({
disconnect: jest.fn(),
observe: jest.fn(),
unobserve: jest.fn(),
}));
Object.defineProperties(window.HTMLElement.prototype, {
offsetHeight: {
get() {
return parseFloat(this.style.height) || 1;
},
},
offsetWidth: {
get() {
return parseFloat(this.style.width) || 1;
},
},
});
(window.SVGElement as any).prototype.getBBox = () => ({x:0, y:0, width: 0, height: 0});
});
describe("Quantity Node", () => {
it("diagram should include quantity node", async () => {
const variable = Variable.create({});
const root = DQRoot.create();
const node = DQNode.create({ variable: variable.id, x: 0, y: 0 });
root.addNode(node);
// references have to be within the same tree so we need some container
const container = GenericContainer.create();
container.add(variable);
container.setRoot(root);
render(<Diagram dqRoot={root} />);
expect(screen.getByTestId("diagram")).toBeInTheDocument();
expect(screen.getByTestId("quantity-node")).toBeInTheDocument();
expect(screen.getByTestId("delete-node-button")).toBeInTheDocument();
});
it("quantity node entries should be saved", () => {
const variable = Variable.create({});
const root = DQRoot.create();
const node = DQNode.create({ variable: variable.id, x: 0, y: 0 });
root.addNode(node);
// references have to be within the same tree so we need some container
const container = GenericContainer.create();
container.add(variable);
container.setRoot(root);
render(<Diagram dqRoot={root} />);
expect(screen.getByTestId("variable-name")).toBeInTheDocument();
const nameTextBox = screen.getByTestId("variable-name");
const valueTextBox = screen.getByTestId("variable-value");
const unitTextBox = screen.getByTestId("variable-unit");
userEvent.type(nameTextBox, "my variable name");
userEvent.type(valueTextBox, "45");
userEvent.type(unitTextBox, "miles");
expect(variable.name).toBe("my variable name");
expect(variable.value).toEqual(45);
expect(variable.unit).toBe("miles");
    // verify a non-digit cannot be entered into the value input
userEvent.type(valueTextBox, "letter");
expect(variable.value).toBe(undefined);
});
});
|
Being tasked with finding the definition of a good and a bad logo seems like it would be quite easy; however, once you've got past the blatantly bad logos and start looking towards the multi-million pound companies that you see daily, you begin to look really in-depth at the quality and thought put into a logo, and at some of the elements behind it.
To me a good logo is something that represents the company in a clear and concise way, and this should be done in two ways:
The first is to have a logo whose elements make up what you stand for, whether that be Costa Coffee, with their coffee beans boldly at the forefront of their logo, or Wikipedia's world built up of connecting jigsaw pieces with various written languages on them. The second would be a logo that can stand by itself and be acknowledged clearly by the public. Perfect examples are Nike, Apple & Facebook, which have no need for text cluttering up their products. Nike and Apple are great at this, with Apple's clear black logo at the forefront of the iMac's and iPhone's design. It's instantaneous: you know who it is. Nike are the same with their famous tick on shoe boxes & TV advertisements.
The three logos that I specifically chose as ‘Good Logos’ were Apple (surprise, surprise!), Google and Buffer, the scheduled tweeting app. The one I will be focusing on in this blog post is Google as I’d like to talk about the recent changes that have appeared on everyone’s homepage. If you use Bing, please stop reading now…
The Google logo that we all knew and loved was present on our homepage from May 31, 1999 to May 5, 2010; since then there have been a few facelifts and adjustments, but nothing too different from the norm. On the 1st of September 2015, we saw a whole new logo, something completely different from what we had seen before. This wasn't just removing the shadow or flattening the image, this was BIG!
Straight off the bat you can see that the new logo is completely sans serif with a simplified look; when I saw the logo for the first time I disliked it, as I believed it looked unprofessional, drastically different and a bit ugly. Over the following weeks this logo has grown on me to the point where I feel it represents Google as a whole. No longer do they look like a scary, billion dollar corporation that's going to take over the world; we are left with a very playful, creative and bright logo. Upon first viewing the logo I believed it to be very childish looking, due to the type looking like lettering you would find on children's wooden blocks; then again, maybe that's the point, with their new parent company called 'Alphabet' after all. I felt they had pushed the logo past simplification to the point where it seemed unprofessional. Although, after giving myself some time to think, I realised that Google is a company that has its fingers in many, many pies and is involved in so many areas, experimenting, building products and providing services, and that this new modernised logo gives off connotations of how they want to be represented: as a bold, creative and playful corporation that is friendly and not trying to take over the world by monopolizing. It has also been suggested that Google and many other large corporations try to hook young children onto their services by making their logos easy to read through streamlining and flattening. I also believe that in our day to day lives we need to recognise a logo within 1-3 seconds due to the fast speeds that society works at when on their smartphone devices; a small logo icon that can be recognised on the side of a building or a billboard is just as effective.
Bad logos are everywhere; I think that's a given, whether it be a local company's plumbing service or a multi-million pound company such as Boots. They come in all shapes and sizes, and logos have evolved to the point where 'bad' logos have started to fade away; it's relatively simple to create a good looking logo in 2015, with websites helping you build them in seconds. Still, it seems that for some large companies it's just as easy to screw up…
An example of a bad logo would have to be Taco Bell; I can't think of a more clashing and eye-hurting logo than that, and if you can, please leave a comment, as I'm reluctantly curious. The colours are wrong and completely destroy whatever Taco Bell want potential customers to think. Fresh, tasty tacos are the last thing on my mind when I see this logo; I feel that with a pastel colour palette, some flattening, and the complexity of the logo's details rearranged, it could achieve a higher quality look. The pink & purple distract and could potentially capture a customer's eye when walking down a street, however it's for the wrong reasons.
The previous segment of this blog focused on why these logos are deemed 'good', and how Google recently undertook a major revamp, although they were not the first to dabble in this area; it was actually Microsoft & eBay, major giants who have almost pioneered this change.
Microsoft not only changed their logo but their entire operating system with the 'Metro' look introduced in Windows 8, which blurred the lines between computers, laptops and the rising tablet; this flat look was deemed ugly, basic and immature when it first launched years ago. It is no surprise that years later other giants have followed in their footsteps, and a huge culture shift has happened regarding logos and how we as a society receive and perceive data.
In conclusion, I see that logos as a whole are at, or are approaching, the midway point in their evolution through this flattening, simplified stage. Since 2010, logos have slowly snowballed in this direction, and it's at a point where small businesses will begin to copy and emulate the stylizations they are seeing larger corporations put into effect. Over the next 5-7 years we will reach a point where a new post-simplified style of logo will emerge. I have no idea how that will look, and most likely neither do you, but more detail and clarity will be taken into account when new logo designs are being created. There isn't anywhere to go from this point except to merge the two & sprinkle in new design ideas to create a unique hybrid we haven't really seen before.
IMPLEMENTATION OF THE GOVERNMENT'S DISCRETION IN INDONESIA, AUTHORITY, AND RESPONSIBILITY IN THE MANAGEMENT OF GOVERNMENT
The research entitled "Implementation of the Government's Discretionary Authority and Accountability in the Administration of Government" has five problem formulations. First, how is discretionary authority implemented in the administration of government? Second, what obstacles arise in the exercise of discretionary authority? Third, how can those obstacles be overcome? Fourth, what are the limits of discretion in decision making? Fifth, what are the instruments of government? The purpose of this study was to identify and analyze the five problem formulations above. This research is descriptive and qualitative. The results indicate that the exercise of discretionary authority in the administration of government is a logical consequence of the welfare state, in which the government is given the task of, and responsibility for, the welfare of its citizens. Discretionary authority does not, however, mean that it can be used freely; it must follow the rules written in Law Number 30 of 2004 concerning Good Governance and the Principles of Good Governance (AAUPB). The implementation of discretionary authority in the administration of government also faces several obstacles that make the government less efficient and less effective. Overcoming these obstacles is not easy because it must involve all parties, namely the legislative, executive, and judicial powers, and of course it also requires the participation of citizens. These measures are meant to ensure that discretionary authority is exercised efficiently and effectively in realizing the people's welfare.
export const PORT_NAME = '__TIME_RECLAIMER__';
|
#include "antsCommandLineParser.h"
#include "antsUtilities.h"
#include "ReadWriteData.h"
#include "itkAffineTransform.h"
#include "itkAntiAliasBinaryImageFilter.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "vtkSTLReader.h"
#include "vtkSTLWriter.h"
#include "vtkPLYReader.h"
#include "vtkPLYWriter.h"
#include "itkImageToVTKImageFilter.h"
#include "vtkActor.h"
#include "vtkCallbackCommand.h"
#include "vtkExtractEdges.h"
#include "vtkGraphicsFactory.h"
#include "vtkImageData.h"
#include "vtkImageStencil.h"
#include "vtkLookupTable.h"
#include "vtkMarchingCubes.h"
#include "vtkMetaImageWriter.h"
#include "vtkPointData.h"
#include "vtkPolyData.h"
#include "vtkPolyDataConnectivityFilter.h"
#include "vtkPolyDataMapper.h"
#include "vtkPolyDataNormals.h"
#include "vtkProperty.h"
#include "vtkSmartPointer.h"
#include "vtkTriangleFilter.h"
#include "vtkUnsignedCharArray.h"
#include "vtkWindowedSincPolyDataFilter.h"
#include "vtkPolyDataWriter.h"
#include "vtkPolyDataReader.h"
#include "vtkPolyDataToImageStencil.h"
#include "vtkPNGWriter.h"
#include "vtkRenderer.h"
#include "vtkRenderWindow.h"
#include "vtkRenderWindowInteractor.h"
#include "vtkScalarBarActor.h"
#include "vtkSmoothPolyDataFilter.h"
#include "vtkTextProperty.h"
#include "vtkWindowToImageFilter.h"
#include "itkMath.h"
#include <vector>
#include <string>
namespace ants
{
float CalculateGenus( vtkPolyData *mesh, bool verbose )
{
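  // For a closed orientable surface the Euler characteristic satisfies
  // V - E + F = 2 - 2g, hence genus g = (2 - V + E - F) / 2.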
vtkSmartPointer<vtkExtractEdges> extractEdges = vtkSmartPointer<vtkExtractEdges>::New();
extractEdges->SetInputData( mesh );
extractEdges->Update();
auto numberOfEdges = static_cast<float>( extractEdges->GetOutput()->GetNumberOfLines() );
auto numberOfVertices = static_cast<float>( mesh->GetNumberOfPoints() );
auto numberOfFaces = static_cast<float>( mesh->GetNumberOfPolys() );
float genus = 0.5 * ( 2.0 - numberOfVertices + numberOfEdges - numberOfFaces );
if( verbose )
{
std::cout << "Genus = " << genus << std::endl;
std::cout << " number of vertices = " << numberOfVertices << std::endl;
std::cout << " number of edges = " << numberOfEdges << std::endl;
std::cout << " number of faces = " << numberOfFaces << std::endl;
}
return genus;
}
void Display( vtkPolyData *vtkMesh,
const std::vector<float> rotationAngleInDegrees,
const std::vector<float> backgroundColor,
const std::string screenCaptureFileName,
const bool renderScalarBar = false,
vtkLookupTable *scalarBarLookupTable = nullptr,
const std::string scalarBarTitle = std::string( "" ),
unsigned int scalarBarNumberOfLabels = 5,
unsigned int scalarBarWidthInPixels = 0,
unsigned int scalarBarHeightInPixels = 0
)
{
vtkSmartPointer<vtkGraphicsFactory> graphicsFactory =
vtkSmartPointer<vtkGraphicsFactory>::New();
graphicsFactory->SetOffScreenOnlyMode( false );
graphicsFactory->SetUseMesaClasses( 1 );
vtkSmartPointer<vtkPolyDataMapper> mapper = vtkSmartPointer<vtkPolyDataMapper>::New();
mapper->SetInputData( vtkMesh );
mapper->ScalarVisibilityOn();
vtkSmartPointer<vtkActor> actor = vtkSmartPointer<vtkActor>::New();
actor->SetMapper( mapper );
actor->GetProperty()->SetInterpolationToFlat();
actor->GetProperty()->ShadingOff();
actor->GetProperty()->SetSpecular( 0.0 );
actor->GetProperty()->SetSpecularPower( 0 );
actor->RotateX( rotationAngleInDegrees[0] );
actor->RotateY( rotationAngleInDegrees[1] );
actor->RotateZ( rotationAngleInDegrees[2] );
vtkSmartPointer<vtkRenderer> renderer = vtkSmartPointer<vtkRenderer>::New();
renderer->SetBackground( backgroundColor[0] / 255.0, backgroundColor[1] / 255.0, backgroundColor[2] / 255.0 );
vtkSmartPointer<vtkRenderWindow> renderWindow = vtkSmartPointer<vtkRenderWindow>::New();
renderWindow->AddRenderer( renderer );
vtkSmartPointer<vtkCallbackCommand> callback = vtkSmartPointer<vtkCallbackCommand>::New();
renderer->AddObserver( vtkCommand::KeyPressEvent, callback );
vtkSmartPointer<vtkRenderWindowInteractor> renderWindowInteractor =
vtkSmartPointer<vtkRenderWindowInteractor>::New();
renderWindowInteractor->SetRenderWindow( renderWindow );
renderer->AddActor( actor );
if( renderScalarBar )
{
vtkSmartPointer<vtkScalarBarActor> scalarBar = vtkSmartPointer<vtkScalarBarActor>::New();
scalarBar->SetLookupTable( scalarBarLookupTable );
scalarBar->SetTitle( scalarBarTitle.c_str() );
scalarBar->SetMaximumNumberOfColors( 256 );
scalarBar->SetNumberOfLabels( scalarBarNumberOfLabels );
scalarBar->SetLabelFormat( "%.2g" );
if( scalarBarWidthInPixels > 0 && scalarBarHeightInPixels > 0 )
{
if( scalarBarWidthInPixels > scalarBarHeightInPixels )
{
scalarBar->SetOrientationToHorizontal();
}
else
{
scalarBar->SetOrientationToVertical();
}
scalarBar->SetMaximumWidthInPixels( scalarBarWidthInPixels );
scalarBar->SetMaximumHeightInPixels( scalarBarHeightInPixels );
}
vtkSmartPointer<vtkTextProperty> titleTextProperty = vtkSmartPointer<vtkTextProperty>::New();
titleTextProperty->ItalicOff();
titleTextProperty->BoldOn();
titleTextProperty->SetColor( 0.0, 0.0, 0.0 );
titleTextProperty->SetJustificationToCentered();
// titleTextProperty->SetFontSize( 50 );
scalarBar->SetTitleTextProperty( titleTextProperty );
vtkSmartPointer<vtkTextProperty> labelTextProperty = vtkSmartPointer<vtkTextProperty>::New();
labelTextProperty->ItalicOff();
labelTextProperty->BoldOff();
labelTextProperty->SetColor( 0.0, 0.0, 0.0 );
// labelTextProperty->SetFontSize( 5 );
scalarBar->SetLabelTextProperty( labelTextProperty );
scalarBar->VisibilityOn();
renderer->AddActor2D( scalarBar );
}
renderWindow->Render();
if( screenCaptureFileName.empty() )
{
renderWindowInteractor->Start();
}
else
{
vtkSmartPointer<vtkWindowToImageFilter> windowToImageFilter =
vtkSmartPointer<vtkWindowToImageFilter>::New();
windowToImageFilter->SetInput( renderWindow );
windowToImageFilter->SetMagnification( 5 );
windowToImageFilter->Update();
vtkSmartPointer<vtkPNGWriter> writer = vtkSmartPointer<vtkPNGWriter>::New();
writer->SetFileName( screenCaptureFileName.c_str() );
writer->SetInputConnection( windowToImageFilter->GetOutputPort() );
writer->Write();
}
}
int antsImageToSurface( itk::ants::CommandLineParser *parser )
{
constexpr unsigned int ImageDimension = 3;
typedef float RealType;
typedef itk::Image<RealType, ImageDimension> ImageType;
typedef itk::Image<int, ImageDimension> MaskImageType;
typedef unsigned char RgbComponentType;
typedef itk::RGBPixel<RgbComponentType> RgbPixelType;
typedef itk::Image<RgbPixelType, ImageDimension> RgbImageType;
ImageType::PointType zeroOrigin;
zeroOrigin.Fill( 0.0 );
// Read in input surface image
ImageType::Pointer inputImage = nullptr;
RealType defaultColorRed = 255.0;
RealType defaultColorGreen = 255.0;
RealType defaultColorBlue = 255.0;
RealType defaultAlpha = 1.0;
itk::ants::CommandLineParser::OptionType::Pointer inputImageOption =
parser->GetOption( "surface-image" );
if( inputImageOption && inputImageOption->GetNumberOfFunctions() )
{
if( inputImageOption->GetFunction( 0 )->GetNumberOfParameters() == 0 )
{
std::string inputFile = inputImageOption->GetFunction( 0 )->GetName();
ReadImage<ImageType>( inputImage, inputFile.c_str() );
inputImage->SetOrigin( zeroOrigin );
}
else
{
std::string inputFile = inputImageOption->GetFunction( 0 )->GetParameter( 0 );
ReadImage<ImageType>( inputImage, inputFile.c_str() );
inputImage->SetOrigin( zeroOrigin );
if( inputImageOption->GetFunction( 0 )->GetNumberOfParameters() > 1 )
{
std::vector<RealType> defaultColors = parser->ConvertVector<RealType>(
inputImageOption->GetFunction( 0 )->GetParameter( 1 ) );
if( defaultColors.size() == 1 )
{
defaultColorRed = defaultColors[0];
defaultColorGreen = defaultColors[0];
defaultColorBlue = defaultColors[0];
defaultAlpha = 1.0;
}
else if( defaultColors.size() == 3 )
{
defaultColorRed = defaultColors[0];
defaultColorGreen = defaultColors[1];
defaultColorBlue = defaultColors[2];
defaultAlpha = 1.0;
}
else if( defaultColors.size() == 4 )
{
defaultColorRed = defaultColors[0];
defaultColorGreen = defaultColors[1];
defaultColorBlue = defaultColors[2];
defaultAlpha = defaultColors[3];
}
else
{
std::cerr << "Incorrect color format specified." << std::endl;
return EXIT_FAILURE;
}
}
}
}
else
{
std::cerr << "Input image not specified." << std::endl;
return EXIT_FAILURE;
}
// There's a reorientation issue between itk image physical space and the mesh space
// for which we have to account. See
// http://www.vtk.org/pipermail/vtkusers/2011-July/068595.html
// and
// http://www.vtk.org/Wiki/VTK/ExamplesBoneYard/Cxx/VolumeRendering/itkVtkImageConvert
typedef itk::AffineTransform<RealType> RigidTransformType;
RigidTransformType::Pointer meshToItkImageTransform = RigidTransformType::New();
RigidTransformType::OutputVectorType offset;
offset[0] = -inputImage->GetOrigin()[0];
offset[1] = -inputImage->GetOrigin()[1];
offset[2] = -inputImage->GetOrigin()[2];
RigidTransformType::MatrixType matrix;
for( unsigned int i = 0; i < ImageDimension; i++ )
{
for( unsigned int j = 0; j < ImageDimension; j++ )
{
matrix( i, j ) = inputImage->GetDirection()( i, j );
}
}
meshToItkImageTransform->SetMatrix( matrix );
meshToItkImageTransform->SetOffset( offset );
// Get anti-alias RMSE parameter
RealType antiAliasRmseParameter = 0.03;
itk::ants::CommandLineParser::OptionType::Pointer antiAliasRmseOption =
parser->GetOption( "anti-alias-rmse" );
if( antiAliasRmseOption && antiAliasRmseOption->GetNumberOfFunctions() )
{
antiAliasRmseParameter = parser->Convert<RealType>( antiAliasRmseOption->GetFunction( 0 )->GetName() );
}
typedef itk::AntiAliasBinaryImageFilter<ImageType, ImageType> AntiAliasFilterType;
AntiAliasFilterType::Pointer antiAlias = AntiAliasFilterType::New();
antiAlias->SetMaximumRMSError( antiAliasRmseParameter );
antiAlias->SetInput( inputImage );
antiAlias->Update();
// Reconstruct binary surface.
typedef itk::ImageToVTKImageFilter<ImageType> ConnectorType;
ConnectorType::Pointer connector = ConnectorType::New();
connector->SetInput( antiAlias->GetOutput() );
connector->Update();
vtkSmartPointer<vtkMarchingCubes> marchingCubes = vtkSmartPointer<vtkMarchingCubes>::New();
marchingCubes->SetInputData( connector->GetOutput() );
marchingCubes->ComputeScalarsOff();
marchingCubes->ComputeGradientsOff();
marchingCubes->SetNumberOfContours( 1 );
marchingCubes->SetValue( 0, 0.0 );
marchingCubes->Update();
vtkSmartPointer<vtkPolyDataConnectivityFilter> connectivityFilter =
vtkSmartPointer<vtkPolyDataConnectivityFilter>::New();
connectivityFilter->SetExtractionModeToLargestRegion();
connectivityFilter->SetInputData( marchingCubes->GetOutput() );
connectivityFilter->Update();
vtkSmartPointer<vtkTriangleFilter> triangularizer = vtkSmartPointer<vtkTriangleFilter>::New();
triangularizer->SetInputData( connectivityFilter->GetOutput() );
triangularizer->Update();
vtkPolyData *vtkMesh = triangularizer->GetOutput();
CalculateGenus( vtkMesh, true );
// Add the functional overlays
std::vector<RgbImageType::Pointer> functionalRgbImages;
std::vector<MaskImageType::Pointer> functionalMaskImages;
std::vector<RealType> functionalAlphaValues;
itk::ants::CommandLineParser::OptionType::Pointer functionalOverlayOption =
parser->GetOption( "functional-overlay" );
if( functionalOverlayOption && functionalOverlayOption->GetNumberOfFunctions() )
{
for( unsigned int n = 0; n < functionalOverlayOption->GetNumberOfFunctions(); n++ )
{
if( functionalOverlayOption->GetFunction( n )->GetNumberOfParameters() < 2 )
{
std::cerr << "Error: each functional overlay must have an RGB image and mask."
<< "See help menu." << std::endl;
return EXIT_FAILURE;
}
// read RGB image
std::string rgbFileName = functionalOverlayOption->GetFunction( n )->GetParameter( 0 );
typedef itk::ImageFileReader<RgbImageType> RgbReaderType;
RgbReaderType::Pointer rgbReader = RgbReaderType::New();
rgbReader->SetFileName( rgbFileName.c_str() );
try
{
rgbReader->Update();
rgbReader->GetOutput()->SetOrigin( zeroOrigin );
}
catch( ... )
{
std::cerr << "Error reading RGB file " << rgbFileName << std::endl;
return EXIT_FAILURE;
}
functionalRgbImages.emplace_back(rgbReader->GetOutput() );
// read mask
std::string maskFileName = functionalOverlayOption->GetFunction( n )->GetParameter( 1 );
typedef itk::ImageFileReader<MaskImageType> MaskReaderType;
MaskReaderType::Pointer maskReader = MaskReaderType::New();
maskReader->SetFileName( maskFileName.c_str() );
try
{
maskReader->Update();
maskReader->GetOutput()->SetOrigin( zeroOrigin );
}
catch( ... )
{
std::cerr << "Error reading mask file " << maskFileName << std::endl;
return EXIT_FAILURE;
}
functionalMaskImages.emplace_back(maskReader->GetOutput() );
if( functionalOverlayOption->GetFunction( n )->GetNumberOfParameters() > 2 )
{
auto alpha = parser->Convert<RealType>(
functionalOverlayOption->GetFunction( n )->GetParameter( 2 ) );
functionalAlphaValues.push_back( alpha );
}
else
{
functionalAlphaValues.push_back( 1.0 );
}
}
}
// Reset mesh points to physical space of ITK images
vtkSmartPointer<vtkPoints> meshPoints = vtkMesh->GetPoints();
int numberOfPoints = meshPoints->GetNumberOfPoints();
for( int n = 0; n < numberOfPoints; n++ )
{
RigidTransformType::InputPointType inputTransformPoint;
RigidTransformType::OutputPointType outputTransformPoint;
for( unsigned int d = 0; d < ImageDimension; d++ )
{
inputTransformPoint[d] = meshPoints->GetPoint( n )[d];
}
outputTransformPoint = meshToItkImageTransform->TransformPoint( inputTransformPoint );
meshPoints->SetPoint( n, outputTransformPoint[0], outputTransformPoint[1], outputTransformPoint[2] );
}
// Do the painting
vtkSmartPointer<vtkUnsignedCharArray> colors = vtkSmartPointer<vtkUnsignedCharArray>::New();
colors->SetNumberOfComponents( 4 ); // R, G, B, and alpha components
colors->SetName( "Colors" );
for( int n = 0; n < numberOfPoints; n++ )
{
ImageType::IndexType index;
ImageType::PointType imagePoint;
for( unsigned int d = 0; d < ImageDimension; d++ )
{
imagePoint[d] = meshPoints->GetPoint( n )[d];
}
RealType currentRed = defaultColorRed / 255.0;
RealType currentGreen = defaultColorGreen / 255.0;
RealType currentBlue = defaultColorBlue / 255.0;
RealType currentAlpha = defaultAlpha;
for( int i = functionalAlphaValues.size() - 1; i >= 0; i-- )
{
bool isInsideImage = functionalMaskImages[i]->TransformPhysicalPointToIndex( imagePoint, index );
if( isInsideImage && functionalMaskImages[i]->GetPixel( index ) != 0 )
{
// http://stackoverflow.com/questions/726549/algorithm-for-additive-color-mixing-for-rgb-values
// or
// http://en.wikipedia.org/wiki/Alpha_compositing
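        // "over" operator: a_out = a_f + a_b * (1 - a_f);
        // C_out = ( C_f * a_f + C_b * a_b * (1 - a_f) ) / a_out.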
RgbPixelType rgbPixel = functionalRgbImages[i]->GetPixel( index );
RealType functionalRed = rgbPixel.GetRed() / 255.0;
RealType functionalGreen = rgbPixel.GetGreen() / 255.0;
RealType functionalBlue = rgbPixel.GetBlue() / 255.0;
RealType functionalAlpha = functionalAlphaValues[i];
RealType backgroundRed = currentRed;
RealType backgroundGreen = currentGreen;
RealType backgroundBlue = currentBlue;
RealType backgroundAlpha = currentAlpha;
currentAlpha = 1.0 - ( 1.0 - functionalAlpha ) * ( 1.0 - backgroundAlpha );
currentRed = functionalRed * functionalAlpha / currentAlpha + backgroundRed * backgroundAlpha * ( 1.0 - functionalAlpha ) / currentAlpha;
currentGreen = functionalGreen * functionalAlpha / currentAlpha + backgroundGreen * backgroundAlpha * ( 1.0 - functionalAlpha ) / currentAlpha;
currentBlue = functionalBlue * functionalAlpha / currentAlpha + backgroundBlue * backgroundAlpha * ( 1.0 - functionalAlpha ) / currentAlpha;
}
}
unsigned char currentColor[4];
currentColor[0] = static_cast<unsigned char>( currentRed * 255.0 );
currentColor[1] = static_cast<unsigned char>( currentGreen * 255.0 );
currentColor[2] = static_cast<unsigned char>( currentBlue * 255.0 );
currentColor[3] = static_cast<unsigned char>( currentAlpha * 255.0 );
colors->InsertNextTupleValue( currentColor );
}
vtkMesh->GetPointData()->SetScalars( colors );
// Inflation
vtkSmartPointer<vtkWindowedSincPolyDataFilter> inflater =
vtkSmartPointer<vtkWindowedSincPolyDataFilter>::New();
itk::ants::CommandLineParser::OptionType::Pointer inflationOption = parser->GetOption( "inflation" );
if( inflationOption && inflationOption->GetNumberOfFunctions() )
{
unsigned int numberOfIterations = 0;
if( inflationOption->GetFunction( 0 )->GetNumberOfParameters() == 0 )
{
numberOfIterations = parser->Convert<unsigned int>( inflationOption->GetFunction( 0 )->GetName() );
}
else
{
numberOfIterations = parser->Convert<unsigned int>( inflationOption->GetFunction( 0 )->GetParameter( 0 ) );
}
if( numberOfIterations > 0 )
{
inflater->SetInputData( vtkMesh );
inflater->SetNumberOfIterations( numberOfIterations );
inflater->BoundarySmoothingOn();
inflater->FeatureEdgeSmoothingOff();
inflater->SetFeatureAngle( 180.0 );
inflater->SetEdgeAngle( 180.0 );
inflater->SetPassBand( 0.001 );
inflater->NonManifoldSmoothingOn();
inflater->NormalizeCoordinatesOff();
inflater->Update();
vtkMesh = inflater->GetOutput();
}
}
// Write the vtk mesh to file.
itk::ants::CommandLineParser::OptionType::Pointer outputOption = parser->GetOption( "output" );
if( outputOption && outputOption->GetNumberOfFunctions() )
{
std::string outputFile = outputOption->GetFunction( 0 )->GetName();
std::string ext = itksys::SystemTools::GetFilenameExtension( outputFile );
if( strcmp( ext.c_str(), ".stl" ) == 0 )
{
vtkSmartPointer<vtkSTLWriter> writer = vtkSmartPointer<vtkSTLWriter>::New();
writer->SetInputData( vtkMesh );
writer->SetFileName( outputFile.c_str() );
writer->Write();
}
if( strcmp( ext.c_str(), ".ply" ) == 0 )
{
vtkSmartPointer<vtkPLYWriter> writer = vtkSmartPointer<vtkPLYWriter>::New();
writer->SetInputData( vtkMesh );
writer->SetFileName( outputFile.c_str() );
writer->Write();
}
if( strcmp( ext.c_str(), ".vtk" ) == 0 )
{
vtkSmartPointer<vtkPolyDataWriter> writer = vtkSmartPointer<vtkPolyDataWriter>::New();
writer->SetInputData( vtkMesh );
writer->SetFileName( outputFile.c_str() );
writer->Write();
}
}
vtkSmartPointer<vtkLookupTable> lookupTable = vtkSmartPointer<vtkLookupTable>::New();
std::string scalarBarTitle( "antsSurf" );
unsigned int scalarBarNumberOfLabels = 5;
unsigned int scalarBarWidthInPixels = 0;
unsigned int scalarBarHeightInPixels = 0;
bool renderScalarBar = false;
itk::ants::CommandLineParser::OptionType::Pointer scalarBarOption = parser->GetOption( "scalar-bar" );
if( scalarBarOption && scalarBarOption->GetNumberOfFunctions() )
{
renderScalarBar = true;
std::string lookupTableFile;
if( scalarBarOption->GetFunction( 0 )->GetNumberOfParameters() == 0 )
{
lookupTableFile = scalarBarOption->GetFunction( 0 )->GetName();
}
else
{
lookupTableFile = scalarBarOption->GetFunction( 0 )->GetParameter( 0 );
if( scalarBarOption->GetFunction( 0 )->GetNumberOfParameters() > 1 )
{
scalarBarTitle = scalarBarOption->GetFunction( 0 )->GetParameter( 1 );
}
if( scalarBarOption->GetFunction( 0 )->GetNumberOfParameters() > 2 )
{
scalarBarNumberOfLabels = parser->Convert<unsigned int>( scalarBarOption->GetFunction( 0 )->GetParameter( 2 ) );
}
if( scalarBarOption->GetFunction( 0 )->GetNumberOfParameters() > 3 )
{
std::vector<unsigned int> dimensions = parser->ConvertVector<unsigned int>( scalarBarOption->GetFunction( 0 )->GetParameter( 3 ) );
scalarBarWidthInPixels = dimensions[0];
scalarBarHeightInPixels = dimensions[1];
}
}
// Read in color table
std::ifstream fileStr( lookupTableFile.c_str() );
if( !fileStr.is_open() )
{
std::cerr << " Could not open file " << lookupTableFile << '\n';
renderScalarBar = false;
}
int tableSize = std::count( std::istreambuf_iterator<char>( fileStr ), std::istreambuf_iterator<char>(), '\n' );
fileStr.clear();
fileStr.seekg( 0, std::ios::beg );
lookupTable->SetNumberOfTableValues( tableSize );
lookupTable->Build();
RealType value;
RealType redComponent;
RealType greenComponent;
RealType blueComponent;
RealType alphaComponent;
char comma;
RealType minValue = itk::NumericTraits<RealType>::max();
    RealType maxValue = itk::NumericTraits<RealType>::NonpositiveMin(); // min() would be the smallest positive float
unsigned int index = 0;
while( fileStr >> value >> comma >> redComponent >> comma >> greenComponent >> comma >> blueComponent >> comma >> alphaComponent )
{
lookupTable->SetTableValue( index++, redComponent / 255.0, greenComponent / 255.0, blueComponent / 255.0, alphaComponent );
if( value < minValue )
{
minValue = value;
}
if( value > maxValue )
{
maxValue = value;
}
}
lookupTable->SetTableRange( minValue, maxValue );
fileStr.close();
}
// Display vtk mesh
itk::ants::CommandLineParser::OptionType::Pointer displayOption = parser->GetOption( "display" );
if( displayOption && displayOption->GetNumberOfFunctions() )
{
std::vector<float> rotationAnglesInDegrees;
rotationAnglesInDegrees.push_back( 0.0 );
rotationAnglesInDegrees.push_back( 0.0 );
rotationAnglesInDegrees.push_back( 0.0 );
std::vector<float> backgroundColor;
backgroundColor.push_back( 255.0 );
backgroundColor.push_back( 255.0 );
backgroundColor.push_back( 255.0 );
std::string screenCaptureFileName = std::string( "" );
screenCaptureFileName = displayOption->GetFunction( 0 )->GetName();
if( strcmp( screenCaptureFileName.c_str(), "false" ) == 0 ||
strcmp( screenCaptureFileName.c_str(), "0" ) == 0 )
{
// do not render and exit
return EXIT_SUCCESS;
}
std::size_t position = screenCaptureFileName.find( "png" );
if( position == std::string::npos )
{
screenCaptureFileName.clear();
}
else
{
std::cout << "Writing surface to image file " << screenCaptureFileName << "." << std::endl;
}
if( displayOption->GetFunction( 0 )->GetNumberOfParameters() > 0 )
  {
  rotationAnglesInDegrees = parser->ConvertVector<float>(
    displayOption->GetFunction( 0 )->GetParameter( 0 ) );
  }
if( displayOption->GetFunction( 0 )->GetNumberOfParameters() > 1 )
  {
  backgroundColor = parser->ConvertVector<float>(
    displayOption->GetFunction( 0 )->GetParameter( 1 ) );
  if( backgroundColor.size() == 1 )
    {
    backgroundColor.push_back( backgroundColor[0] );
    backgroundColor.push_back( backgroundColor[0] );
    }
  }
Display( vtkMesh, rotationAnglesInDegrees, backgroundColor, screenCaptureFileName,
         renderScalarBar, lookupTable, scalarBarTitle, scalarBarNumberOfLabels,
         scalarBarWidthInPixels, scalarBarHeightInPixels );
}
return EXIT_SUCCESS;
}
int antsSurfaceToImage( itk::ants::CommandLineParser *parser )
{
itk::ants::CommandLineParser::OptionType::Pointer surfaceOption =
parser->GetOption( "mesh" );
vtkSmartPointer<vtkPolyData> vtkMesh;
std::string inputFile;
if( surfaceOption && surfaceOption->GetNumberOfFunctions() > 0 )
{
inputFile = surfaceOption->GetFunction( 0 )->GetName();
std::string ext = itksys::SystemTools::GetFilenameExtension( inputFile );
try
{
if( strcmp( ext.c_str(), ".stl" ) == 0 )
{
vtkSmartPointer<vtkSTLReader> reader = vtkSmartPointer<vtkSTLReader>::New();
reader->SetFileName( inputFile.c_str() );
reader->Update();
vtkMesh = reader->GetOutput();
}
if( strcmp( ext.c_str(), ".ply" ) == 0 )
{
vtkSmartPointer<vtkPLYReader> reader = vtkSmartPointer<vtkPLYReader>::New();
reader->SetFileName( inputFile.c_str() );
reader->Update();
vtkMesh = reader->GetOutput();
}
if( strcmp( ext.c_str(), ".vtk" ) == 0 )
{
vtkSmartPointer<vtkPolyDataReader> reader = vtkSmartPointer<vtkPolyDataReader>::New();
reader->SetFileName( inputFile.c_str() );
reader->Update();
vtkMesh = reader->GetOutput();
}
}
catch( ... )
{
std::cerr << "Error. Unable to read mesh input file." << std::endl;
return EXIT_FAILURE;
}
if( vtkMesh.GetPointer() == nullptr )
  {
  std::cerr << "Error. Unsupported mesh extension '" << ext << "'." << std::endl;
  return EXIT_FAILURE;
  }
}
else
{
std::cerr << "No mesh file specified." << std::endl;
return EXIT_FAILURE;
}
double bounds[6];
vtkMesh->GetBounds( bounds );
std::string outputFile;
std::vector<double> spacing;
itk::ants::CommandLineParser::OptionType::Pointer outputOption = parser->GetOption( "output" );
if( outputOption && outputOption->GetNumberOfFunctions() )
{
outputFile = outputOption->GetFunction( 0 )->GetName();
if( outputOption->GetFunction( 0 )->GetNumberOfParameters() == 0 )
{
spacing.push_back( 1.0 );
std::cout << "Warning. No spacing is specified---defaulting to 1.0." << std::endl;
}
else
{
spacing = parser->ConvertVector<double>(
outputOption->GetFunction( 0 )->GetParameter( 0 ) );
}
}
else
{
std::cerr << "Error. No output specified." << std::endl;
return EXIT_FAILURE;
}
vtkSmartPointer<vtkImageData> whiteImage = vtkSmartPointer<vtkImageData>::New();
double spacing2[3]; // desired volume spacing
if( spacing.size() == 1 )
{
spacing2[0] = spacing[0];
spacing2[1] = spacing[0];
spacing2[2] = spacing[0];
}
else if( spacing.size() == 3 )
{
spacing2[0] = spacing[0];
spacing2[1] = spacing[1];
spacing2[2] = spacing[2];
}
else
{
std::cerr << "Error. Incorrect spacing specified." << std::endl;
return EXIT_FAILURE;
}
whiteImage->SetSpacing( spacing2 );
// compute dimensions
int dim[3];
for( unsigned int i = 0; i < 3; i++ )
{
dim[i] = static_cast<int>( std::ceil( ( bounds[i * 2 + 1] - bounds[i * 2] ) / spacing2[i] ) );
}
whiteImage->SetDimensions( dim );
whiteImage->SetExtent( 0, dim[0] - 1, 0, dim[1] - 1, 0, dim[2] - 1 );
double origin[3];
origin[0] = bounds[0] + spacing2[0] / 2;
origin[1] = bounds[2] + spacing2[1] / 2;
origin[2] = bounds[4] + spacing2[2] / 2;
whiteImage->SetOrigin( origin );
whiteImage->AllocateScalars( VTK_UNSIGNED_CHAR, 1 );
// fill the image with foreground voxels:
unsigned char inval = 1;
unsigned char outval = 0;
vtkIdType count = whiteImage->GetNumberOfPoints();
for( vtkIdType i = 0; i < count; ++i )
{
whiteImage->GetPointData()->GetScalars()->SetTuple1( i, inval );
}
// polygonal data --> image stencil:
vtkSmartPointer<vtkPolyDataToImageStencil> pol2stenc = vtkSmartPointer<vtkPolyDataToImageStencil>::New();
pol2stenc->SetInputData( vtkMesh );
pol2stenc->SetOutputOrigin( origin );
pol2stenc->SetOutputSpacing( spacing2 );
pol2stenc->SetOutputWholeExtent( whiteImage->GetExtent() );
pol2stenc->Update();
// cut the corresponding white image and set the background:
vtkSmartPointer<vtkImageStencil> imgstenc = vtkSmartPointer<vtkImageStencil>::New();
imgstenc->SetInputData( whiteImage );
imgstenc->SetStencilConnection( pol2stenc->GetOutputPort() );
imgstenc->ReverseStencilOff();
imgstenc->SetBackgroundValue( outval );
imgstenc->Update();
// Write the vtk mesh to image file.
if( outputOption && outputOption->GetNumberOfFunctions() )
{
vtkSmartPointer<vtkMetaImageWriter> writer = vtkSmartPointer<vtkMetaImageWriter>::New();
writer->SetFileName( outputFile.c_str() );
writer->SetInputData( imgstenc->GetOutput() );
writer->Write();
}
return EXIT_SUCCESS;
}
void InitializeCommandLineOptions( itk::ants::CommandLineParser *parser )
{
typedef itk::ants::CommandLineParser::OptionType OptionType;
{
std::string description =
std::string( "Main input binary image for 3-D rendering. One can also " )
+ std::string( "set a default color value in the range [0,255]. The " )
+ std::string( "fourth default color element is the alpha value in " )
+ std::string( "the range [0,1]." );
OptionType::Pointer option = OptionType::New();
option->SetLongName( "surface-image" );
option->SetShortName( 's' );
option->SetUsageOption( 0, "surfaceImageFilename" );
option->SetUsageOption( 1, "[surfaceImageFilename,<defaultColor=255x255x255x1>]" );
option->SetDescription( description );
parser->AddOption( option );
}
{
std::string description =
std::string( "The user can also specify a vtk polydata file to be converted " )
+ std::string( "to a binary image." );
OptionType::Pointer option = OptionType::New();
option->SetLongName( "mesh" );
option->SetShortName( 'm' );
option->SetUsageOption( 0, "meshFilename" );
option->SetDescription( description );
parser->AddOption( option );
}
{
std::string description =
  std::string( "A functional overlay can be specified using both " )
  + std::string( "an rgb image and a mask specifying where that " )
  + std::string( "rgb image should be applied. Both images must " )
  + std::string( "have the same image geometry as the input image. " )
  + std::string( "Optionally, an alpha parameter can be specified. " )
  + std::string( "Note that more than one functional overlay can " )
  + std::string( "be rendered; the order in which they are specified " )
  + std::string( "on the command line matters, and rgb images are " )
  + std::string( "assumed to be unsigned char [0,255]." );
OptionType::Pointer option = OptionType::New();
option->SetLongName( "functional-overlay" );
option->SetShortName( 'f' );
option->SetUsageOption( 0, "[rgbImageFileName,maskImageFileName,<alpha=1>]" );
option->SetDescription( description );
parser->AddOption( option );
}
{
std::string description =
std::string( "Anti-alias maximum RMSE parameter for surface reconstruction " )
+ std::string( "(default = 0.03)." );
OptionType::Pointer option = OptionType::New();
option->SetLongName( "anti-alias-rmse" );
option->SetShortName( 'a' );
option->SetUsageOption( 0, "value" );
option->SetDescription( description );
parser->AddOption( option );
}
{
std::string description =
std::string( "Perform inflation of the mesh." );
OptionType::Pointer option = OptionType::New();
option->SetLongName( "inflation" );
option->SetShortName( 'i' );
option->SetUsageOption( 0, "numberOfIterations" );
option->SetUsageOption( 1, "[numberOfIterations]" );
option->SetDescription( description );
parser->AddOption( option );
}
{
std::string description =
std::string( "Display output surface function in VTK window. Rotation " )
+ std::string( "angles are in degrees and the default background color " )
+ std::string( "is white (255x255x255). Note that the filename, to be " )
+ std::string( "considered such, must have a \"png\" extension. If the " )
+ std::string( "filename is omitted in the third usage option, then the " )
+ std::string( "window is displayed." );
OptionType::Pointer option = OptionType::New();
option->SetLongName( "display" );
option->SetShortName( 'd' );
option->SetUsageOption( 0, "doWindowDisplay" );
option->SetUsageOption( 1, "filename" );
option->SetUsageOption( 2, "<filename>[rotateXxrotateYxrotateZ,<backgroundColor=255x255x255>]" );
option->SetDescription( description );
parser->AddOption( option );
}
{
std::string description =
std::string( "Given a binary image input, the output is a vtk polydata file (possible " )
+ std::string( "extensions include .stl, .ply, and .vtk). " )
+ std::string( "Alternatively, if a mesh file is specified as input, the output " )
+ std::string( "is an itk binary image." );
OptionType::Pointer option = OptionType::New();
option->SetLongName( "output" );
option->SetShortName( 'o' );
option->SetUsageOption( 0, "surfaceFilename" );
option->SetUsageOption( 1, "imageFilename[spacing]" );
option->SetDescription( description );
parser->AddOption( option );
}
{
std::string description =
std::string( "Add a scalar bar to the rendering for the final overlay. One can tailor " )
+ std::string( "the aesthetic by changing the number of labels and/or the orientation and ")
+ std::string( "size of the scalar bar. If the \'width\' > \'height\' (in pixels) then the ")
+ std::string( "orientation is horizontal. Otherwise it is vertical (default)." );
OptionType::Pointer option = OptionType::New();
option->SetLongName( "scalar-bar" );
option->SetShortName( 'b' );
option->SetUsageOption( 0, "lookupTable" );
option->SetUsageOption( 1, "[lookupTable,<title=antsSurf>,<numberOfLabels=5>,<widthxheight>]" );
option->SetDescription( description );
parser->AddOption( option );
}
{
std::string description = std::string( "Print the help menu (short version)." );
OptionType::Pointer option = OptionType::New();
option->SetShortName( 'h' );
option->SetDescription( description );
parser->AddOption( option );
}
{
std::string description = std::string( "Print the help menu." );
OptionType::Pointer option = OptionType::New();
option->SetLongName( "help" );
option->SetDescription( description );
parser->AddOption( option );
}
}
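// Illustrative invocations assembled from the usage options above; the file
// names are hypothetical:
//
//   antsSurf -s [input.nii.gz,255x0x0x1] -o surface.vtk
//   antsSurf -m surface.vtk -o volume.mha[1.0]
//   antsSurf -s input.nii.gz -d screenshot.png[90x0x0,255x255x255]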
// entry point for the library; parameter 'args' is equivalent to 'argv' in (argc,argv) of commandline parameters to
// 'main()'
int antsSurf( std::vector<std::string> args, std::ostream* /*out_stream = nullptr */ )
{
// put the arguments coming in as 'args' into standard (argc,argv) format;
// 'args' doesn't have the command name as the first argument, so add it manually;
// 'args' may have adjacent arguments concatenated into one argument,
// which the parser should handle
args.insert( args.begin(), "antsSurf" );
int argc = args.size();
char* * argv = new char *[args.size() + 1];
for( unsigned int i = 0; i < args.size(); ++i )
{
// allocate space for the string plus a null character
argv[i] = new char[args[i].length() + 1];
std::strncpy( argv[i], args[i].c_str(), args[i].length() );
// place the null character in the end
argv[i][args[i].length()] = '\0';
}
argv[argc] = nullptr;
// class to automatically cleanup argv upon destruction
class Cleanup_argv
{
public:
Cleanup_argv( char* * argv_, int argc_plus_one_ ) : argv( argv_ ), argc_plus_one( argc_plus_one_ )
{
}
~Cleanup_argv()
{
for( unsigned int i = 0; i < argc_plus_one; ++i )
{
delete[] argv[i];
}
delete[] argv;
}
private:
char* * argv;
unsigned int argc_plus_one;
};
Cleanup_argv cleanup_argv( argv, argc + 1 );
// antscout->set_stream( out_stream );
itk::ants::CommandLineParser::Pointer parser =
itk::ants::CommandLineParser::New();
parser->SetCommand( argv[0] );
std::string commandDescription =
std::string( "Produce a 3-D surface rendering with optional RGB overlay. Alternatively, " )
+ std::string( "one can input a mesh which can then be converted to a binary image. " );
parser->SetCommandDescription( commandDescription );
InitializeCommandLineOptions( parser );
if( parser->Parse( argc, argv ) == EXIT_FAILURE )
{
return EXIT_FAILURE;
}
if( argc == 1 )
{
parser->PrintMenu( std::cout, 5, false );
return EXIT_FAILURE;
}
else if( parser->GetOption( "help" )->GetFunction() && parser->Convert<bool>( parser->GetOption( "help" )->GetFunction()->GetName() ) )
{
parser->PrintMenu( std::cout, 5, false );
return EXIT_SUCCESS;
}
else if( parser->GetOption( 'h' )->GetFunction() && parser->Convert<bool>( parser->GetOption( 'h' )->GetFunction()->GetName() ) )
{
parser->PrintMenu( std::cout, 5, true );
return EXIT_SUCCESS;
}
// Get dimensionality
itk::ants::CommandLineParser::OptionType::Pointer imageOption =
parser->GetOption( "surface-image" );
itk::ants::CommandLineParser::OptionType::Pointer surfaceOption =
parser->GetOption( "mesh" );
if( imageOption && imageOption->GetNumberOfFunctions() > 0 )
{
std::string inputFile;
if( imageOption->GetFunction( 0 )->GetNumberOfParameters() == 0 )
{
inputFile = imageOption->GetFunction( 0 )->GetName();
}
else if( imageOption->GetFunction( 0 )->GetNumberOfParameters() > 0 )
{
inputFile = imageOption->GetFunction( 0 )->GetParameter( 0 );
}
itk::ImageIOBase::Pointer imageIO = itk::ImageIOFactory::CreateImageIO(
  inputFile.c_str(), itk::ImageIOFactory::ReadMode );
if( imageIO.IsNull() )
  {
  std::cerr << "Unable to create an image reader for " << inputFile << std::endl;
  return EXIT_FAILURE;
  }
imageIO->SetFileName( inputFile );
imageIO->ReadImageInformation();
unsigned int dimension = imageIO->GetNumberOfDimensions();
if( dimension == 3 )
{
return antsImageToSurface( parser );
}
else
{
std::cerr << "Unsupported dimension" << std::endl;
return EXIT_FAILURE;
}
}
else if( surfaceOption && surfaceOption->GetNumberOfFunctions() > 0 )
{
return antsSurfaceToImage( parser );
}
else
{
std::cerr << "Input not specified. See help menu." << std::endl;
return EXIT_FAILURE;
}
return EXIT_SUCCESS;
}
} // namespace ants
#!/usr/bin/env python
import serial
import os


class PowerMeter():
    """ Newport Optical Power Meter 1830-C """
    def __init__(self):
        # Probe the first ten serial ports until one opens and answers a
        # reading query.
        if os.name == "posix":
            portbase = '/dev/ttyUSB'
        else:
            portbase = 'COM'
        self.ser = None
        for i in range(10):
            try:
                self.ser = serial.Serial("%s%d" % (portbase, i),
                                         baudrate=9600,
                                         bytesize=8,
                                         stopbits=1,
                                         parity=serial.PARITY_NONE,
                                         timeout=1,
                                         xonxoff=1)
                self.getReading()
                break
            except (serial.SerialException, OSError):
                self.ser = None
        if self.ser is None:
            print("No connection...")
        else:
            print("Powermeter connected")

    def sendCom(self, command):
        # pyserial expects bytes on Python 3
        self.ser.write(("%s\n" % (command)).encode())

    def readReply(self):
        return self.ser.readline().decode().strip()

    def getReading(self):
        # "D?" queries the instrument's current detector reading
        self.sendCom("D?")
        value = self.readReply()
        try:
            fvalue = float(value)
        except ValueError:
            fvalue = None
        return fvalue


if __name__ == "__main__":
    powermeter = PowerMeter()
/**
* Copies one list of markers to another list of markers so that it can be used as a
* local variable so the member variable won't be altered
* @param list1 list of markers that the copy is going to be made from
* @return list of markers identical to the parameter
*/
private List<Marker> copyList(List<Marker> list1) {
    // ArrayList's copy constructor performs the same shallow copy as an
    // element-by-element loop.
    return new ArrayList<>(list1);
}
import sys
def getBuildNumber(releaseVersion):
main, major, minor = releaseVersion.split(".")
buildNumber = 0x010000 * int(main) + 0x0100 * int(major) + 0x01 * int(minor)
return buildNumber
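# Worked example (hypothetical version string):
#   getBuildNumber("2.3.4") == 0x010000*2 + 0x0100*3 + 0x01*4
#                           == 0x020304 == 131844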
def updateBuildNumber(fileName, releaseVersion):
with open(fileName) as fp:
lines = fp.readlines()
with open(fileName, "w") as fp:
for line in lines:
if line.startswith(" number:"):
fp.write(f" number: {getBuildNumber(releaseVersion)}{os.linesep}")
else:
fp.write(line)
if __name__ == "__main__":
updateBuildNumber(sys.argv[1], sys.argv[2])
/**
* Rewrite the source code with resource extracted.
*
* @param tree AST tree (or subtree) to be examined.
* @param codeUnit AbstractCodeUnit instance that has all information related to a source file.
* @return source code after resource replacement.
*/
private String rewriteSource(CommonTree tree, AbstractCodeUnit codeUnit) {
CommonTokenStream tokens = codeUnit.getTokens();
AstCodeReplacement replacement = (AstCodeReplacement)codeUnit.getReplacement(tree);
if (replacement != null) {
return generateReplacementCode(replacement, codeUnit);
}
// 'ind' walks the token stream: tokens that fall between child subtrees are
// copied through verbatim, while each child is rewritten recursively.
int ind = tree.getTokenStartIndex();
StringBuilder sourceCode = new StringBuilder();
for (int i = 0; i < tree.getChildCount(); i++) {
CommonTree child = (CommonTree)tree.getChild(i);
for (; ind < child.getTokenStartIndex(); ind++) {
sourceCode.append(tokens.get(ind).getText());
}
sourceCode.append(rewriteSource(child, codeUnit));
ind = child.getTokenStopIndex() + 1;
}
for (; ind <= tree.getTokenStopIndex(); ind++) {
String text = tokens.get(ind).getText();
if (!text.equals("<EOF>")) {
sourceCode.append(tokens.get(ind).getText());
}
}
return sourceCode.toString();
}
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE DeriveGeneric #-}
module ExercismWizard.Execute.Overview.RawExercise
( RawExercise (..)
)
where
import Data.Aeson
import GHC.Generics
data RawExercise = RawExercise
{ status :: String
, uri :: Maybe String
, name :: String
, eId :: String
, core :: Bool
}
deriving (Eq, Show, Generic, FromJSON, ToJSON)
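-- With generic deriving, the JSON field names match the record selectors
-- verbatim. A hypothetical payload that decodes successfully:
--
--   {"status":"completed","uri":null,"name":"Two Fer","eId":"two-fer","core":false}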
/**
* Parses a {@code String csvFileLocation} into a {@code Path}.
* Leading and trailing whitespaces will be trimmed.
*
* @throws ParseException if the given {@code fileLocation} is invalid.
*/
public static Path parseCsv(String fileLocation) throws ParseException {
    requireNonNull(fileLocation);
    String trimmedFileLocation = fileLocation.trim();
    try {
        return Paths.get(trimmedFileLocation);
    } catch (InvalidPathException e) {
        // Without this guard the documented ParseException was never thrown;
        // assumes the project's ParseException has a String constructor.
        throw new ParseException("Invalid csv file location: " + trimmedFileLocation);
    }
}
// Renders integral values with locale-aware grouping; null values are
// displayed as blanks, and non-integers fall back to their toString().
private static class NumberOrNullRenderer extends MyTCRStripedHighlight {
private NumberFormat formatter;
public NumberOrNullRenderer() { super(); }
@Override
public void setValue(final Object value) {
if (formatter == null) {
formatter = NumberFormat.getIntegerInstance();
}
if (value == null)
setText("");
else if (! isInteger(value))
setText(value.toString());
else
setText(formatter.format(value));
}
private boolean isInteger(final Object value) {
try {
Integer.parseInt(value.toString());
return true;
}
catch (final Exception ignore) {
return false;
}
}
}
import { Entity, PrimaryKey, Property } from "mikro-orm";
import { MikroCourseRepository } from "@libs/First-aprox-lib/Courses/Infrastructure/persistense/MikroCourseRepository";
@Entity({
tableName: 'course',
customRepository: () => MikroCourseRepository,
})
export class CourseEntity {
@PrimaryKey()
id!: string;
@Property()
name: string;
@Property()
duration: string;
constructor(id: string, name: string, duration: string) {
this.id = id;
this.name = name;
this.duration = duration;
}
}
// packages/function/copyTextToClipBoard.ts
/**
 * @description Copies text to the clipboard. Because of browser security
 * restrictions, this function must be called from within a DOM event handler.
 * @param {string} text The text to copy to the clipboard
 */
export function copyTextToClipBoard(text:string) {
const inputEle = document.createElement("input");
const inputStyle = {
opacity: "0",
zIndex: "-100000",
height: "1px",
width: "1px",
position: "absolute",
left: "10000000px",
top: "0",
border: "none",
padding: "0",
margin: "0",
};
Object.assign(inputEle.style, inputStyle);
document.body.appendChild(inputEle);
inputEle.value = text;
inputEle.select();
document.execCommand("copy");
document.body.removeChild(inputEle);
}
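
// Minimal usage sketch (the button id is hypothetical): the call is made from
// inside a click handler, as required above. Note that document.execCommand
// is deprecated; navigator.clipboard.writeText(text) is the modern
// replacement in secure contexts.
document.querySelector("#copy-btn")?.addEventListener("click", () => {
  copyTextToClipBoard("hello clipboard");
});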
This article contains live links and may be accessed here:
http://www.tenthamendmentcenter.com/2009/11/29/resist-dc-a-step-by-…
29 Nov 2009
by State Rep. Matthew Shea (WA-4th)
This summer, legislators from several states met to discuss the steps needed to
restore our Constitutional Republic. The federal government has ignored the many
state sovereignty resolutions from 2009 notifying it to cease and desist its
current and continued overreach. The group decided it was time to actively
counter the tyranny emanating from Washington D.C.
From those discussions it became clear three things needed to happen.
1. State Legislatures need to pass 10 key pieces of legislation “with teeth” to
put the federal government back in its place.
2. The people must pass the legislation through the Initiative process if any
piece of the legislative agenda fails.
3. County Sheriffs must reaffirm and uphold their oaths to protect and defend
the Constitution of the United States.
With the advent of the Tea Party Movement, many people have been asking how
exactly we can make the above reality. What follows is Part I of the outline of
that plan regarding state legislation, the action steps any concerned citizen
can take to see this legislation to fruition, and the brief history and
justifications behind each.
Step 1: Reclaim State Sovereignty through Key Nullification Legislation
Our Constitutional Republic is founded on a system of checks and balances known
as the “separation of powers.” Rarely, however, are the states considered part
of this essential principle.
Enter the “doctrine of nullification.”
Nullification is based on the simple principle that the federal government
cannot be the final arbiter of the extent and boundaries of its own power. This
includes all branches of the federal government. In the law this is known as a
“conflict of interest.”
Additionally, since the states created the federal government, the federal
government was an agent of the states, not the other way around. Thus, Thomas
Jefferson believed that, by extension, the states had a natural right to nullify
(render as of no effect) any laws they believed were unconstitutional.
In the Kentucky Resolutions of 1798 he wrote,
“co-States, recurring to their natural right…will concur in declaring these acts
void, and of no force, and will each take measures of its own for providing that
neither these acts, nor any others of the General Government not plainly and
intentionally authorized by the Constitution, shalt be exercised within their
respective territories.”1
Alexander Hamilton echoed this sentiment in Federalist #85: “We may safely rely
on the disposition of the state legislatures to erect barriers against the
encroachments of the national authority.” 2
It is clear then that State Legislatures can stop the unconstitutional overreach
of the Obama administration through nullification. Here is a list of proposed
nullification legislation to introduce in all 50 States.
1. Nullification of Socialized Health Care [current efforts] [example legislation]
2. Nullification of National Cap and Trade [example legislation]
3. Federal Enumerated Powers Requirement (Blanket Nullification) [details]
4. Establishment of a Federal Tax Escrow Account [example legislation]
If imposed, socialized health care and cap and trade will crush our economy.
These programs are both unconstitutional, creating government powers beyond
those enumerated by the Constitution. If those programs are nullified, it will
give the individual states a fighting chance to detach from a federal budget in
freefall and save the economies of the individual states.
Next, blanket nullification.
The Federal Government, particularly the House of Representatives, needs to
abide by its own rules. In particular, House Rule XIII 3(d) specifically states
that:
“Each report of a committee on a public bill or public joint resolution shall
contain the following: (1) A statement citing the specific powers granted to
Congress in the Constitution to enact the law proposed by the bill or resolution.” 3
Needless to say, this rule is generally ignored. The idea behind blanket
nullification is that if the Congress does not specify the enumerated power it
is using according to its own rules, or the power specified is not one of the
enumerated powers granted to Congress in the United States Constitution, then
the “law” is automatically null and void.
Lastly, the federal government cannot survive without money. I know that seems
obvious but many states are missing the opportunity to use money as an incentive
for the federal government to return to its proper role. Most visibly, states
help collect the federal portion of the gasoline tax. That money should be put
into an escrow account at the state level and held there. The Escrow Account
legislation includes a provision that all consumer, excise, and income taxes
payable to the federal government would go through this account first. This
would do two things. First, it would give states the ability to collect interest
on that money to help offset revenue shortfalls. Second, it would allow states
to hold that money as long as needed as an incentive for the federal government
to return within the enumerated boundaries of its power.
Step 2: Erect an impenetrable wall around the County Sheriff and the 2nd
Amendment.
As recently stated in the famous Heller opinion by the United States Supreme
Court, the right to bear arms “is an individual right protecting against both
public and private violence” and “when the able-bodied men of a nation are
trained in arms and organized they are better able to resist tyranny.” 4
Thus, it is clear that the 2nd Amendment not only protects the right to
self-defense but that right extends to defending oneself against tyranny. As
with any historical attempt to establish a dictatorship weapons must be seized
or severely regulated. 5
Here is a list of legislation to prevent this from happening, some of which has
already been introduced in states around the country:
• Sheriff First [model legislation]
• Extension of the Castle Doctrine (right to protection) [sample legislation]
• Prohibition of Gun and Ammunition Tracking [see above]
• Firearms Freedom Act [current efforts] [model legislation]
The county Sheriff is the senior law enforcement officer both in terms of rank
and legal authority in a county. This comes from a tradition of over 1000 years
of Anglo-Saxon common law. Anglo-Saxon communities were typically organized into
“shires” consisting of approximately 1000 people. 6
The chief law enforcement officer of the shire was the “reeve” or “reef.” Hence,
the modern combination of the two words, as we know them today, “shire reef” or
“Sheriff.” 7
Consequently, the Sheriff’s pre-eminent legal authority is well established.
This was confirmed in Printz v. United States. 8 Justice Scalia quotes James
Madison, who wrote in Federalist 39:
“In the latter, the local or municipal authorities form distinct and independent
portions of the supremacy, no more subject, within their respective spheres, to
the general authority, than the general authority is subject to them, within its
own sphere.”9
Sheriff 1st legislation would formally declare that all federal agents and
officers must give notice of, and seek permission before, any arrest, search, or
seizure occurs. Thus, federal agents and officers seeking to enforce
unconstitutional laws must go through the county Sheriff first.
Extending the castle doctrine to one’s person would go a long way toward
eliminating the arbitrary “no carry” areas. Like Virginia Tech, it is these
areas where guns for self-defense are most needed.
Many gun and ammunition tracking schemes have been, and are still being,
attempted. The intended purpose of reducing gun-related crime is never
realized. Instead, law-abiding citizens are punished with regulatory burdens
and fees. Quite simply, we need transparency in government, not in the people.
Montana started the firearms freedom act to rein in the federal government’s use
of the Commerce Clause to regulate everything within the stream of commerce. The
original intent of the Commerce Clause was to regulate commerce between states
not within states as Professor Rob Natelson points out in his 2007 Montana Law
Review article.10
The Montana FFA simply returns to that original understanding regarding firearms
made, sold, and kept within a state’s borders.
This list is by no means exhaustive. However, it does contain some immediate
steps that can be taken toward freedom and restoring our God honoring
Constitutional Republic. Hitler’s laws of January 30 and February 14, 1934,
should serve as a stark reminder of what happens when state sovereignty is
abolished.
In the coming few weeks I will publish the next part of the plan.
Matthew Shea [send him email] is a State Representative in Washington’s 4th
District. He’s the author of HJM4009 for State Sovereignty. Visit his website.
Copyright © 2009 by TenthAmendmentCenter.com. Permission to reprint in whole or
in part is gladly granted, provided full credit is given.
NOTES:
• 1. Kentucky Resolution of 1798, Thomas Jefferson, Adopted by Kentucky
Legislature on November 10, 1798.
• 2. Federalist No. 85, Publius (Alexander Hamilton), August 13 and 16, 1788.
• 3. Rules of the House XIII 3(d), “Content of Reports,” Page 623, 110th Congress.
• 4. District of Columbia v. Heller, 554 U.S. ___ (Actual Pages 11, 13) (2008)
• 5. Id at (Actual Page 11).
• 6. http://www.thenewamerican.com/index.php/history/ancient/1859-teutob…
• 7. http://www.etymonline.com/index.php?search=sheriff&searchmode=none
• 8. Printz v. United States, 521 U.S. 898 (1997)
• 9. Federalist No. 39, Publius (James Madison), January 16, 1788
• 10. Tempering the Commerce Power, 68 Mont. L. Rev. 95 (2007).
Kottayam: Pulsar Suni, the prime accused in the actress attack case, has revealed that a film actress was behind the February 17 actress attack case.
Suni on Sunday said the person referred to as 'madam' was an actress and he would reveal her name on Wednesday. Suni was brought to Kottayam to be produced before the chief judicial magistrate's court in connection with a case of forging identity proofs to secure SIM cards. The court adjourned the hearing.
Suni, who has been throwing hints every time he came across TV cameras, had earlier this month said that the 'madam' involved in the actress attack case was from the film industry.
It was Suni's revelation regarding alleged conspiracy in the case that led to the arrest of actor Dileep. Dileep, who was arrested on July 10, has been lodged in Aluva sub-jail and is yet to be granted bail.
The victim actress, who has worked in Malayalam, Tamil and Telugu films, was abducted and allegedly molested inside her car for two hours by the accused, who had forced their way into the vehicle on February 17.
import { BeforeInsert, BeforeUpdate, Column, Entity, ManyToOne, PrimaryGeneratedColumn } from "typeorm";
import { SubstationEntity } from "./substation.entity";
import { EquipmentTypeEntity } from "./equipmentType.entity";
@Entity({name: 'equipment'})
export class EquipmentEntity {
@PrimaryGeneratedColumn()
id: number;
@Column({type: 'varchar', length: 10, unique: true, nullable: false})
invNum: string; // inventory number
@Column({type: 'int', nullable: false})
equipmentTypeId: number;
@Column({type: 'date', nullable: false})
inspectDate: Date; // inspection date
@Column({type: 'date', nullable: true}) // DB format: YYYY-MM-DD, e.g. 2022-04-10
lastCheckoutDate: Date; // date of the last test
@Column({type: 'date', nullable: true})
nextCheckoutDate: Date; // date of the next test
@Column({type: 'boolean', nullable: false, default: true})
isGoodCondition: boolean; // in working condition
@Column({type: 'varchar', length: 30, nullable: false})
inspectedBy: string; // inspector's full name
@Column()
substationId: number; // substation ID
@Column({type: 'text', default: ''})
notation: string; // notes
@ManyToOne(() => SubstationEntity)
substation: SubstationEntity;
@ManyToOne(() => EquipmentTypeEntity, {eager: true})
equipmentType: EquipmentTypeEntity;
}
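
// A minimal sketch, separate from the entity above, of how nextCheckoutDate
// might be derived with date-fns; the interval length and helper name are
// hypothetical assumptions, not part of the original code.
import { addMonths } from "date-fns";

const TEST_INTERVAL_MONTHS = 6; // assumed test interval

export function computeNextCheckoutDate(lastCheckoutDate: Date): Date {
  return addMonths(lastCheckoutDate, TEST_INTERVAL_MONTHS);
}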
import http.cookiejar
import logging
import requests
from lxml import etree as ET
namespaces = {
'S': 'http://schemas.xmlsoap.org/soap/envelope/',
'paos': 'urn:liberty:paos:2003-08',
'ecp': 'urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp',
'saml': 'urn:oasis:names:tc:SAML:2.0:assertion',
'saml2p': 'urn:oasis:names:tc:SAML:2.0:protocol',
'ds': 'http://www.w3.org/2000/09/xmldsig#',
}
class NoNameService(Exception):
pass
class NoURL(Exception):
pass
class UndefinedURN(Exception):
pass
class InvalidURN(Exception):
pass
class RemoteMethodException(Exception):
pass
class LoginFailure(Exception):
pass
class ECP(requests.auth.AuthBase):
def __init__(self, username, password, realm, debug=False):
self.debug = debug
self.username = username
self.password = password
self.realm = realm
def handle_ecp(self, r, **kwargs):
if r.headers.get('content-type', None) == 'application/vnd.paos+xml':
logging.debug("Got PAOS Header. Redirecting through ECP login.")
e = ET.fromstring(r.content)
r.close()
session = requests.Session()
# XXX
# Extract the relay state to use later
(relaystate,) = ET.XPath(
'S:Header/ecp:RelayState', namespaces=namespaces)(e)
# Extract the response consumer URL to compare later
responseconsumer = ET.XPath(
'S:Header/paos:Request', namespaces=namespaces)(e)[0].get('responseConsumerURL')
logging.debug("SP expects the response at: %s", responseconsumer)
# Clean up the SP login request
e.remove(ET.XPath('S:Header', namespaces=namespaces)(e)[0])
# Log into the IdP with the SP request
logging.debug(
"Logging into the IdP via %s as %s",
self.realm, self.username
)
login_r = session.post(
self.realm,
auth=(self.username, self.password),
data=ET.tostring(e),
headers={'content-type': 'text/xml'}
)
if login_r.status_code != requests.codes.ok:
raise RemoteMethodException(
"Received status code {0} from IdP".format(login_r.status_code))
ee = ET.fromstring(login_r.content)
# Make sure we got back the same response consumer URL and assertion consumer service URL
idpACS = ET.XPath(
'S:Header/ecp:Response', namespaces=namespaces)(ee)[0].get('AssertionConsumerServiceURL')
logging.debug("IdP said to send the response to %s", idpACS)
if responseconsumer != idpACS:
raise LoginFailure("SP and IdP ACS mismatch")
# Make sure we got a successful login
if ET.XPath('S:Body/saml2p:Response/saml2p:Status/saml2p:StatusCode', namespaces=namespaces)(ee)[0].get('Value') != 'urn:oasis:names:tc:SAML:2.0:status:Success':
raise LoginFailure("Login to IdP unsuccessful")
logging.debug("IdP accepted login.")
# Clean up login token
(h,) = ET.XPath('S:Header', namespaces=namespaces)(ee)
for el in h:
h.remove(el)
h.append(relaystate)
# Pass login token to SP
logging.debug("Sending login token to SP.")
return_r = session.post(responseconsumer,
data=ET.tostring(ee),
headers={
'Content-Type': 'application/vnd.paos+xml'},
allow_redirects=False)
if return_r.status_code not in [requests.codes.ok, requests.codes.found]:
raise RemoteMethodException(
"Received status code {0} from SP".format(return_r.status_code))
# Prepare the original request with the new login cookies
prep = r.request.copy()
if not hasattr(prep, '_cookies'):
prep._cookies = requests.cookies.RequestsCookieJar()
requests.cookies.extract_cookies_to_jar(
prep._cookies, r.request, r.raw)
prep._cookies.update(session.cookies)
prep.prepare_cookies(prep._cookies)
# Re-launch the original request
logging.debug("Re-launching original request after logging in.")
_r = r.connection.send(prep, **kwargs)
# Add the login flow to the request history
_r.history.append(r)
_r.history.append(login_r)
_r.history.append(return_r)
_r.request = prep
return _r
logging.debug(
"No PAOS header. Assuming already logged in, or no Shib required.")
return r
def __call__(self, r):
# Update or add Accept header to indicate we want to do ECP
if 'Accept' in r.headers:
r.headers['Accept'] += ', application/vnd.paos+xml'
else:
r.headers['Accept'] = '*/*, application/vnd.paos+xml'
# Signal that we support ECP
r.headers['PAOS'] = 'ver="urn:liberty:paos:2003-08";"urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp"'
r.register_hook('response', self.handle_ecp)
return r
def __eq__(self, other):
return all([self.username == getattr(other, 'username', None),
self.password == getattr(other, 'password', None),
self.realm == getattr(other, 'realm', None)])
def __ne__(self, other):
return not self == other
class WSC(object):
_debug = None
_url = None
_username = None
_password = None
_urn = None
_ns = None
_ns_etree = None
_realm = None
_raw = False
_strict_content_type = True
_session = None
_timeout = None
def __init__(self, ns='/etc/grnoc/name-service-cacher/name-service.xml', debug=False):
logging.debug("Initialized WSC object")
self._debug = debug
self.ns = ns
self.session = requests.Session()
self.timeout = 60
@property
def ns(self):
return self._ns
@ns.setter
def ns(self, ns):
logging.debug("Setting NS cache: %s", ns)
self._ns = ns
@property
def url(self):
return self._url
@url.setter
def url(self, url):
logging.debug("Setting URL: %s", url)
self._url = url
@property
def username(self):
return self._username
@username.setter
def username(self, username):
logging.debug("Setting Username: %s", username)
self._username = username
@property
def password(self):
return self._password
@password.setter
def password(self, password):
logging.debug("Setting Password")
self._password = password
@property
def realm(self):
return self._realm
@realm.setter
def realm(self, realm):
logging.debug("Setting Realm: %s", realm)
self._realm = realm
@property
def raw(self):
return self._raw
@raw.setter
def raw(self, raw):
logging.debug("Setting Raw: %s", raw)
self._raw = raw
@property
def strict_content_type(self):
return self._strict_content_type
@strict_content_type.setter
def strict_content_type(self, strict_content_type):
logging.debug("Setting Strict Content Type: %s", strict_content_type)
self._strict_content_type = strict_content_type
@property
def session(self):
return self._session
@session.setter
def session(self, session):
logging.debug("Setting Session: %s", session)
self._session = session
@property
def timeout(self):
return self._timeout
@timeout.setter
def timeout(self, timeout):
logging.debug("Setting Timeout: %s", timeout)
self._timeout = timeout
@property
def urn(self):
return self._urn
@urn.setter
def urn(self, urn):
self._ns_etree = ET.parse(self.ns)
if not self._ns:
raise NoNameService()
if not urn.startswith('urn:publicid:IDN+grnoc.iu.edu:'):
raise InvalidURN()
(_, _, _, urn_cloud, urn_class, urn_version, urn_service) = urn.split(':')
ns_cloud = [c for c in self._ns_etree.findall(
"./cloud") if c.attrib.get('id') == urn_cloud]
if len(ns_cloud) != 1:
raise UndefinedURN("Looking for {0} found {1} matching clouds".format(
urn_cloud, len(ns_cloud)))
ns_cloud = ns_cloud[0]
ns_class = [c for c in ns_cloud.findall(
'./class') if c.attrib.get('id') == urn_class]
if len(ns_class) != 1:
raise UndefinedURN("Looking for {0}:{1} found {2} matching classes".format(
urn_cloud, urn_class, len(ns_class)))
ns_class = ns_class[0]
ns_version = [c for c in ns_class.findall(
'./version') if c.attrib.get('value') == urn_version]
if len(ns_version) != 1:
raise UndefinedURN("Looking for {0}:{1}:{2} found {3} matching versions".format(
urn_cloud, urn_class, urn_version, len(ns_version)))
ns_version = ns_version[0]
ns_service = [c for c in ns_version.findall(
'./service') if c.attrib.get('id') == urn_service]
if len(ns_service) != 1:
raise UndefinedURN("Looking for {0}:{1}:{2}:{3} found {4} matching services".format(
urn_cloud, urn_class, urn_version, urn_service, len(ns_service)))
ns_service = ns_service[0]
ns_locations = [c for c in ns_service.findall('./location')]
if len(ns_locations) < 1:
raise UndefinedURN("Looking for {0}:{1}:{2}:{3} found no matching locations".format(
urn_cloud, urn_class, urn_version, urn_service))
ns_locations.sort(key=lambda loc: loc.attrib.get('weight'))
logging.debug("Setting and resolving URN: %s", urn)
self.url = ns_locations[0].attrib.get('url')
self._urn = urn
def _remoteHandler(self, name):
def handler(*args, **kwargs):
if not self.url:
raise NoURL()
data = {'method': name}
data.update(kwargs)
if not self.realm:
logging.debug(
"Realm not set. Launching as HTTP Basic without a fixed realm.")
r = self.session.post(self.url, auth=(
self.username, self.password), data=data, timeout=self.timeout)
elif "https://" in self.realm:
logging.debug(
"Realm set and looks like Shibboleth ECP. Launching with ECP")
r = self.session.post(self.url, auth=ECP(
self.username, self.password, self.realm, debug=self._debug), data=data, timeout=self.timeout)
else:
raise LoginFailure("Realm is not an IdP ECP Endpoint")
if r.status_code != requests.codes.ok:
raise RemoteMethodException(
"Received status code {0}".format(r.status_code))
if self._raw:
return r.content
if self._strict_content_type and '/json' not in r.headers.get('content-type', ''):
raise RemoteMethodException(
"Unknown content type {0}".format(r.headers.get('content-type')))
try:
    return r.json()
except ValueError:
    # requests raises ValueError (json.JSONDecodeError) on malformed JSON
    raise RemoteMethodException("JSON parse error")
return handler
def __getattr__(self, name):
return self._remoteHandler(name)
def _save(self, filename):
jar = http.cookiejar.LWPCookieJar(filename)
for cookie in self.session.cookies:
jar.set_cookie(cookie)
jar.save(ignore_discard=True)
def _load(self, filename):
jar = http.cookiejar.LWPCookieJar(filename)
jar.load(ignore_discard=True)
for cookie in jar:
self.session.cookies.set_cookie(cookie)
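
if __name__ == "__main__":
    # Minimal usage sketch: the URN, credentials, and remote method name are
    # hypothetical placeholders. Remote methods resolve dynamically through
    # __getattr__ and are POSTed as form data to the URL resolved from the
    # name-service cache.
    client = WSC()
    client.urn = 'urn:publicid:IDN+grnoc.iu.edu:GlobalNOC:CDS:1:Node'
    client.username = 'user'
    client.password = 'secret'
    print(client.get_nodes(limit=10))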
{-# LANGUAGE LambdaCase
, ScopedTypeVariables
, OverloadedStrings
, RecordWildCards #-}
module Main (main) where
import System.Exit
import System.Directory
import System.Environment
import System.FilePath
import System.Process
import System.Console.GetOpt
import System.Console.ANSI
import Control.Monad
import Control.Monad.Except
import Control.Monad.State
import Control.Applicative
import Control.Exception
import Control.Concurrent.Async
import Data.Function
import Data.List
import Data.Either
import Data.Char
import qualified Data.Text as T
import qualified Data.Text.IO as TI
import Data.Attoparsec.Text hiding (try)
import Data.Attoparsec.Combinator (lookAhead)
import Text.Printf
import Distribution.PackageDescription
import Distribution.PackageDescription.Parse
import Distribution.Verbosity (normal)
import Distribution.Simple.Utils (findPackageDesc)
import Distribution.ModuleName (toFilePath, components)
import Language.Haskell.Exts as E
import Language.Preprocessor.Cpphs
import Network.HTTP
main :: IO ()
main = do
-- Process command line arguments
(pkgName, argVerA, argVerB, flags) <-
runExcept <$> (getCmdOpt <$> getProgName <*> getArgs) >>= either die return
when (argVerA == argVerB) $
die "Need to specify different versions / packages for comparison"
mode <- case foldr (\f r -> case f of FlagMode m -> m; _ -> r) "downloaddb" flags of
"downloaddb" -> return ModeDownloadDB
"builddb" -> return ModeBuildDB
"parsehs" -> return ModeParseHS
m -> die $ printf "'%s' is not a valid mode" m
let disableColor = FlagDisableColor `elem` flags
silentFlag = FlagSilent `elem` flags
-- Did we get a package version, DB path or package path?
([verA, verB] :: [EitherVerPath]) <- forM [argVerA, argVerB] $ \ver ->
case parseOnly pkgVerParser (T.pack ver) of
-- Not a version, check if we got a valid DB file or package path
Left _ | mode == ModeParseHS -> do
flip unless (die $ errHdr ++ " or package path" ) =<< doesDirectoryExist ver
return $ Right ver
| otherwise -> do
flip unless (die $ errHdr ++ " or database path") =<< doesFileExist ver
return $ Right ver
where errHdr = "'" ++ ver ++ "' is not a valid version string (1.0[.0[.0]])"
-- Looks like a valid version string
Right _ -> return $ Left ver
diff <- withTmpDirectory $ \tmpDir -> do
-- Need to download packages?
when (mode `elem` [ModeBuildDB, ModeParseHS]) $
forM_ (lefts [verA, verB]) $ \verString -> do
let pkg = pkgName ++ "-" ++ verString
unless silentFlag . putStrLn $ "Downloading " ++ pkg ++ "..."
runExceptT (downloadPackage pkg tmpDir) >>= either die return
-- Parse, compute difference
either die return =<<
( runExceptT $
let cp = ComputeParams tmpDir pkgName verA verB silentFlag
in case mode of
ModeDownloadDB -> computeDiffDownloadHoogleDB cp
ModeBuildDB -> computeDiffBuildHoogleDB cp
ModeParseHS -> computeDiffParseHaskell cp
)
-- Output results
unless silentFlag $ printf "\n--- Diff for | %s → %s | ---\n\n"
(either id id verA)
(either id id verB)
outputDiff diff disableColor silentFlag
data FlagMode = ModeDownloadDB | ModeBuildDB | ModeParseHS
deriving (Eq)
data CmdFlag = FlagDisableColor | FlagSilent | FlagMode String
deriving (Eq)
getCmdOpt :: String -> [String] -> Except String (String, String, String, [CmdFlag])
getCmdOpt prgName args =
case getOpt RequireOrder opt args of
(flags, (pkgName:verA:verB:[]), []) -> return (pkgName, verA, verB, flags)
(_, _, []) -> throwError usage
(_, _, err) -> throwError (concat err ++ "\n" ++ usage)
where
header =
"hackage-diff | Compare the public API of different versions of a Hackage library\n" ++
"github.com/blitzcode/hackage-diff | www.blitzcode.net | (C) 2016 <NAME>\n\n" ++
"Usage: " ++ prgName ++ " [options] <package-name> <old-version|path> <new-version|path>"
footer =
"\nExamples:\n" ++
" " ++ prgName ++ " mtl 2.1 2.2.1\n" ++
" " ++ prgName ++ " --mode=builddb JuicyPixels 3.1.4.1 3.1.5.2\n" ++
" " ++ prgName ++ " conduit 1.1.5 ~/tmp/conduit-1.1.6/dist/doc/html/conduit/conduit.txt\n" ++
" " ++ prgName ++ " --mode=parsehs QuickCheck 2.6 2.7.6\n" ++
" " ++ prgName ++ " --mode=parsehs -s Cabal ~/tmp/Cabal-1.18.0/ 1.20.0.0\n"
usage = usageInfo header opt ++ footer
opt = [ Option []
["mode"]
(ReqArg FlagMode "[downloaddb|builddb|parsehs]")
( "what to download / read, how to compare\n" ++
" downloaddb - download Hoogle DBs and diff (Default)\n" ++
" builddb - download packages, build Hoogle DBs and diff\n" ++
" parsehs - download packages, directly diff .hs exports"
)
, Option ['c']
["disable-color"]
(NoArg FlagDisableColor)
"disable color output"
, Option ['s']
["silent"]
(NoArg FlagSilent)
"disable progress output"
]
-- Check a package version string (1.0[.0[.0]])
pkgVerParser :: Parser ()
pkgVerParser = (nDigits 4 <|> nDigits 3 <|> nDigits 2) *> endOfInput
where digitInt = void (decimal :: Parser Int)
nDigits n = count (n - 1) (digitInt *> char '.') *> digitInt
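-- Accepted by pkgVerParser: "1.0", "1.2.3", "1.2.3.4";
-- rejected: a bare "1" or anything with more than four components.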
-- Create and clean up temporary working directory
withTmpDirectory :: (FilePath -> IO a) -> IO a
withTmpDirectory = bracket
( do sysTmpDir <- getTemporaryDirectory
let tmpDir = addTrailingPathSeparator $ sysTmpDir </> "hackage-diff"
createDirectoryIfMissing True tmpDir
return tmpDir
)
( removeDirectoryRecursive )
cabalInstall :: [String] -> ExceptT String IO ()
cabalInstall args = do
(cabalExit, _, cabalStdErr) <- liftIO $ readProcessWithExitCode "cabal" args []
unless (cabalExit == ExitSuccess) . throwError $ cabalStdErr
-- Use cabal-install to download a package from hackage
downloadPackage :: String -> FilePath -> ExceptT String IO ()
downloadPackage pkg destination = cabalInstall [ "get", pkg, "--destdir=" ++ destination ]
data ExportCmp = EAdded | ERemoved | EModified String {- Old signature -} | EUnmodified
deriving (Show, Eq, Ord)
data ModuleCmp = MAdded [String] -- Module was added
| MAddedParseError -- Like above, but we couldn't parse the new one
| MRemoved [String] -- Module was removed
| MRemovedParseError -- Like above, but we couldn't parse the old one
| MNotSureIfModifiedParseError -- New and/or old didn't parse, can't tell
| MModified [(ExportCmp, String)] -- Modified
| MUnmodifed -- Changed
deriving (Show, Eq, Ord)
type Diff = [(ModuleCmp, String)]
-- Print out the computed difference, optionally with ANSI colors
outputDiff :: Diff -> Bool -> Bool -> IO ()
outputDiff diff disableColor disableLengend = do
let putStrCol color str
| disableColor = liftIO $ putStr str
| otherwise = liftIO . putStr $ setSGRCode [SetColor Foreground Vivid color] ++
str ++ setSGRCode [Reset]
putStrLnCol color str = liftIO $ putStrCol color str >> putStrLn ""
breakingChanges <- flip execStateT (0 :: Int) . forM_ diff $ \case
(MAdded exps , mname) -> do
putStrLnCol Green $ "+ " ++ mname
mapM_ (putStrLnCol Green . printf " + %s") exps
(MAddedParseError , mname) ->
putStrLnCol Green $
printf " + %s (ERROR: failed to parse new version, exports not available)" mname
(MRemoved exps , mname) -> do
putStrLnCol Red $ "- " ++ mname
mapM_ (\e -> modify' (+ 1) >> putStrLnCol Red (printf " - %s" e)) exps
(MRemovedParseError , mname) -> do
modify' (+ 1)
putStrLnCol Red $
" - " ++ mname ++ " (ERROR: failed to parse old version, exports not available)"
(MNotSureIfModifiedParseError, mname) -> do
putStrLnCol Yellow $ "× " ++ mname ++
" (Potentially modified, ERROR: failed to parse new and/or old version)"
(MModified exps , mname) -> do
putStrLnCol Yellow $ "× " ++ mname
forM_ exps $ \(cmp, expname) -> case cmp of
EAdded -> putStrLnCol Green $ " + " ++ expname
ERemoved -> do modify' (+ 1)
putStrLnCol Red $ " - " ++ expname
EModified old -> do modify' (+ 1)
putStrLnCol Yellow $ " × New: " ++ expname ++ "\n" ++
" Old: " ++ old
EUnmodified -> return ()
(MUnmodifed , mname) -> putStrLnCol White $ "· " ++ mname
unless disableLengend $ do
putStrLn ""
putStrCol Green "[+ Added] "
putStrCol Red "[- Removed] "
putStrCol Yellow "[× Modified] "
putStrCol White "[· Unmodified]\n"
unless (breakingChanges == 0) $
putStrLnCol Red $ printf "\n%i potential breaking changes found" breakingChanges
-- All the parameters required by the various compute* functions that actually prepare the
-- data and compute the difference
data ComputeParams = ComputeParams { cpTmpDir :: FilePath
, cpPackage :: String
, cpVerA :: EitherVerPath
, cpVerB :: EitherVerPath
, cpSilentFlag :: Bool
} deriving (Eq, Show)
-- A package can be specified by a version string, a Hoogle DB file path or a package path
type VersionString = String
type EitherVerPath = Either VersionString FilePath
-- Compute a Diff by comparing the package's Hoogle DB read from disk or downloaded from Hackage
computeDiffDownloadHoogleDB :: ComputeParams -> ExceptT String IO Diff
computeDiffDownloadHoogleDB ComputeParams { .. } = do
-- Get Hoogle databases
putS "Downloading / Reading Hoogle DBs..."
(dbA, dbB) <-
either (\(e :: IOException) -> throwError $ "DB Error: " ++ show e ++ tip) return =<<
(liftIO . try $ concurrently (downloadOrRead cpVerA) (downloadOrRead cpVerB))
-- Parse
putS "Parsing Hoogle DBs..."
[parsedDBA, parsedDBB] <- forM [dbA, dbB] $ \db ->
either throwError return $ parseOnly (hoogleDBParser <* endOfInput) db
-- Debug parser in GHCi: parseOnly hoogleDBParser <$> TI.readFile "base.txt" >>=
-- \(Right db) -> mapM_ (putStrLn . show) db
-- Compare
putS "Comparing Hoogle DBs..."
return $ diffHoogleDB parsedDBA parsedDBB
where getHoogleDBURL ver = "http://hackage.haskell.org/package" </> cpPackage ++ "-" ++ ver </>
"docs" </> cpPackage <.> "txt"
-- Network.HTTP is kinda crummy, but pulling in http-client/conduit
-- just for downloading two small text files is probably not worth it
downloadURL url = T.pack <$> do
req <- simpleHTTP (getRequest url)
-- HTTP will throw an IOException for any connection error,
-- also examine the response code and throw one for every
-- non-200 one we get
code <- getResponseCode req
unless (code == (2, 0, 0)) . throwIO . userError $
"Status code " ++ show code ++ " for request " ++ url
getResponseBody req
tip = "\nYou can try building missing Hoogle DBs yourself by running with --mode=builddb"
putS = unless cpSilentFlag . liftIO . putStrLn
downloadOrRead = either (downloadURL . getHoogleDBURL) (TI.readFile)
-- Compute a Diff by comparing the package's Hoogle DB build through Haddock. Unfortunately,
-- running Haddock requires to have the package configured with all dependencies
-- installed. This can often be very slow and frequently fails for older packages, on top
-- of any Haddock failures that might happen
computeDiffBuildHoogleDB :: ComputeParams -> ExceptT String IO Diff
computeDiffBuildHoogleDB ComputeParams { .. } =
flip catchError (\e -> throwError $ e ++ tip) $ do
forM_ (lefts [cpVerA, cpVerB]) $ \ver -> do -- Only build if we don't have a DB file path
let pkg = cpPackage ++ "-" ++ ver
putS $ "Processing " ++ pkg ++ "..."
-- TODO: This is rather ugly. Cabal does not allow us to specify the target
-- directory, and the current directory is not a per-thread property.
-- While createProcess allows the specification of a working directory, our
-- preferred wrapper readProcessWithExitCode does not expose that.
-- Duplicating that function and its web of private helpers here would be
-- quite some overhead. For now we simply change the working directory of
-- the process
--
-- https://ghc.haskell.org/trac/ghc/ticket/9322#ticket
--
liftIO . setCurrentDirectory $ cpTmpDir </> pkg
-- All the steps required to get the Hoogle DB
putS " Creating Sandbox" >> cabalInstall [ "sandbox", "init" ]
putS " Installing Dependencies" >> cabalInstall [ "install"
, "--dependencies-only"
-- Try building as fast as
-- possible
, "-j"
, "--disable-optimization"
, "--ghc-option=-O0"
, "--disable-library-for-ghci"
]
putS " Configuring" >> cabalInstall [ "configure" ]
putS " Building Haddock" >> cabalInstall [ "haddock", "--hoogle" ]
-- Read DBs from disk
[dbA, dbB] <-
forM [cpVerA, cpVerB] $ \ver ->
(liftIO . try . TI.readFile $ either getHoogleDBPath id ver)
>>= either (\(e :: IOException) -> throwError $ show e) return
-- Parse
[parsedDBA, parsedDBB] <- forM [dbA, dbB] $ \db ->
either throwError return $ parseOnly hoogleDBParser db
-- Compare
return $ diffHoogleDB parsedDBA parsedDBB
where
putS = unless cpSilentFlag . liftIO . putStrLn
getHoogleDBPath ver = cpTmpDir </> cpPackage ++ "-" ++ ver </> "dist/doc/html" </>
cpPackage </> cpPackage <.> "txt"
tip = "\nIf downloading / building Hoogle DBs fails, you can try directly parsing " ++
"the source files by running with --mode=parsehs"
-- Compare two packages made up of readily parsed Hoogle DBs
diffHoogleDB :: [DBEntry] -> [DBEntry] -> Diff
diffHoogleDB dbA dbB = do
let [verA, verB] = flip map [dbA, dbB]
( -- Sort exports by name
map (\(nm, exps) -> (nm, sortBy (compare `on` dbeName) exps))
-- Sort modules by name
. sortBy (compare `on` fst)
-- Extract module name, put into (name, exports) pair
. map (\case ((DBModule nm):exps) -> (nm , exps)
exps -> ("(Unknown)", exps)
)
-- Group by module
. groupBy (\a b -> or $ (\case DBModule _ -> False
_ -> True
) <$> [a, b]
)
-- Filter out comments and package information
. filter (\case (DBPkgInfo _ _) -> False
(DBComment _ ) -> False
_ -> True
)
)
modulesAdded = allANotInBBy ((==) `on` fst) verB verA
modulesRemoved = allANotInBBy ((==) `on` fst) verA verB
modulesKept = intersectBy ((==) `on` fst) verA verB
resAdded = flip map modulesAdded $ \(nm, exps) ->
(MAdded . map (show) $ exps, T.unpack nm)
resRemoved = flip map modulesRemoved $ \(nm, exps) ->
(MRemoved . map (show) $ exps, T.unpack nm)
resKept =
sortBy compareKept . flip map modulesKept $ \(mname, modA') ->
-- Did the exports change?
case (modA', snd <$> find ((== mname) . fst) verB) of
(_ , Nothing ) -> -- This really should not ever happen here
(MNotSureIfModifiedParseError, T.unpack mname)
(modA, Just modB)
| didExpChange -> (MModified expCmp , T.unpack mname)
| otherwise -> (MUnmodifed , T.unpack mname)
where -- Which exports were added / removed / modified?
didExpChange = or $ map (\case (EUnmodified, _) -> False; _ -> True) expCmp
expCmp = expAdded ++ expRemoved ++ expKept
expAdded =
[(EAdded , show x) | x <- allANotInBBy compareDBEName modB modA]
expRemoved =
[(ERemoved, show x) | x <- allANotInBBy compareDBEName modA modB]
expKept =
-- We don't sort by modified / unmodified here as we currently
-- don't list the unmodified ones
flip map (intersectBy compareDBEName modA modB) $ \eOld ->
case find (compareDBEName eOld) modB of
Nothing -> error "intersectBy / find is broken..."
Just eNew | compareDBEType eOld eNew ->
(EUnmodified, show eOld)
| otherwise ->
(EModified $ show eOld, show eNew)
-- Sort everything by modification type, but make sure we sort
-- modified modules by their name, not their export list
compareKept a b = case (a, b) of
((MModified _, nameA), (MModified _, nameB)) -> compare nameA nameB
_ -> compare a b
in resAdded ++ resRemoved ++ resKept
-- Stupid helper to build module / export lists. Should probably switch to using
-- Data.Set for all of these operations to stop having O(n*m) everywhere
allANotInBBy :: (a -> a -> Bool) -> [a] -> [a] -> [a]
allANotInBBy f a b = filter (\m -> not $ any (f m) b) a
data DBEntry = DBModule !T.Text
| DBPkgInfo !T.Text !T.Text
| DBComment !T.Text
| DBType !T.Text !T.Text
| DBNewtype !T.Text !T.Text
| DBData !T.Text !T.Text
| DBCtor !T.Text !T.Text
| DBClass !T.Text !T.Text
| DBInstance !T.Text !T.Text
| DBFunction !T.Text !T.Text
deriving (Eq)
-- When comparing names we have to take the kind of the export into account, i.e.
-- type and value constructors may have the same name without being identical
compareDBEName :: DBEntry -> DBEntry -> Bool
compareDBEName a b = case (a, b) of
(DBModule _ , DBModule _ ) -> cmp; (DBPkgInfo _ _ , DBPkgInfo _ _ ) -> cmp;
(DBComment _ , DBComment _ ) -> cmp; (DBType _ _ , DBType _ _ ) -> cmp;
(DBNewtype _ _ , DBNewtype _ _ ) -> cmp; (DBData _ _ , DBData _ _ ) -> cmp;
(DBCtor _ _ , DBCtor _ _ ) -> cmp; (DBClass _ _ , DBClass _ _ ) -> cmp;
(DBInstance _ _, DBInstance _ _) -> cmp; (DBFunction _ _, DBFunction _ _) -> cmp;
_ -> False
where cmp = ((==) `on` dbeName) a b
-- Compare the type of two entries. If we simply compare the type string, we will
-- have mistakes like classifying those two functions as having a change in type:
--
-- func :: Num a => a -> a
-- func :: (Num a) => a -> a
--
-- So we try to parse the type with haskell-src-exts and then fall back on a string
-- compare if that fails. Parsing again every time the comparison function is called is
-- obviously rather slow, but it hasn't been an issue so far
--
-- TODO: We should do a name normalization pass on the parsed type, otherwise
-- 'id :: a -> a' and 'id :: b -> b' will be reported as different
--
compareDBEType :: DBEntry -> DBEntry -> Bool
compareDBEType a b =
-- We assume that a and b are the same kind of export (i.e. they have already been
-- matched with dbeName, which only compares exports of the same kind), and now we
-- want to know if the type differs between them
case a of
-- The syntax we use to list exported Ctors and their types can't be parsed as a
-- declaration, just compare the type part
DBCtor _ _ -> case ( parseTypeWithMode mode . T.unpack $ dbeType a
, parseTypeWithMode mode . T.unpack $ dbeType b
) of
(E.ParseOk resA, E.ParseOk resB) -> resA == resB
_ -> stringTypeCmp
-- Also can't parse our type / newtype syntax, fall back to string compare
DBType _ _ -> stringTypeCmp
DBNewtype _ _ -> stringTypeCmp
-- Parse everything else in its entirety as a top-level declaration
_ -> case ( parseDeclWithMode mode $ show a
, parseDeclWithMode mode $ show b
) of
(E.ParseOk resA, E.ParseOk resB) -> resA == resB
_ -> stringTypeCmp
    where mode = -- Enable some common extensions to make parsing more likely to succeed
defaultParseMode
{
extensions = [ EnableExtension FunctionalDependencies
, EnableExtension MultiParamTypeClasses
, EnableExtension TypeOperators
, EnableExtension KindSignatures
, EnableExtension MagicHash
, EnableExtension FlexibleContexts
]
}
stringTypeCmp = ((==) `on` dbeType) a b
-- Extract a database entry's "name" (i.e. a function name vs its type)
dbeName :: DBEntry -> T.Text
dbeName = \case
DBModule nm -> nm; DBPkgInfo k _ -> k ; DBComment _ -> "";
DBType nm _ -> nm; DBNewtype nm _ -> nm; DBData nm _ -> nm;
DBCtor nm _ -> nm; DBClass nm _ -> nm; DBInstance nm _ -> nm;
DBFunction nm _ -> nm
-- Extract a database entry's "type" (i.e. a function type vs its name)
dbeType :: DBEntry -> T.Text
dbeType = \case
DBModule _ -> ""; DBPkgInfo _ v -> v ; DBComment _ -> "";
DBType _ ty -> ty; DBNewtype _ ty -> ty; DBData _ ty -> ty;
DBCtor _ ty -> ty; DBClass _ ty -> ty; DBInstance _ ty -> ty;
DBFunction _ ty -> ty
instance Show DBEntry where
show = \case
DBModule nm -> "module " ++ T.unpack nm
DBPkgInfo k v -> "@" ++ T.unpack k ++ T.unpack v
DBComment txt -> "-- " ++ T.unpack txt
DBType nm ty -> "type " ++ T.unpack nm ++ " " ++ T.unpack ty
DBNewtype nm ty -> "newtype " ++ T.unpack nm ++ " " ++ T.unpack ty
DBData nm ty -> "data " ++ T.unpack nm ++ (if T.null ty then "" else " " ++ T.unpack ty)
DBCtor nm ty -> T.unpack nm ++ " :: " ++ T.unpack ty
DBClass _ ty -> "class " ++ T.unpack ty
DBInstance _ ty -> "instance " ++ T.unpack ty
DBFunction nm ty -> T.unpack nm ++ " :: " ++ T.unpack ty
-- Parse a Hoogle text database
hoogleDBParser :: Parser [DBEntry]
hoogleDBParser = many parseLine
where
parseLine = (*>) skipEmpty $ parseComment <|> parseData <|> parsePkgInfo <|>
parseDBModule <|> parseCtor <|> parseNewtype <|>
parseDBType <|> parseClass <|> parseInstance <|>
parseFunction
parseComment = string "-- " *> (DBComment <$> tillEoL)
parsePkgInfo = char '@' *> (DBPkgInfo <$> takeTill (== ' ') <*> tillEoL)
parseData = string "data " *>
( (DBData <$> takeTill (`elem` [ ' ', '\n' ]) <* endOfLine <*> "") <|>
(DBData <$> takeTill (== ' ') <* skipSpace <*> tillEoL)
)
parseNewtype = string "newtype " *>
( (DBNewtype <$> takeTill (`elem` [ ' ', '\n' ]) <* endOfLine <*> "") <|>
(DBNewtype <$> takeTill (== ' ') <* skipSpace <*> tillEoL)
)
    -- TODO: At some point Hoogle DBs started to have Ctor and function
-- names wrapped in brackets. Not sure what's up with that, just
-- parse them as part of the name so the parser doesn't stop
parseCtor = do void . lookAhead $ satisfy isAsciiUpper <|>
(char '[' *> satisfy isAsciiUpper)
DBCtor <$> takeTill (== ' ') <* string " :: " <*> tillEoL
-- TODO: This doesn't parse function lists correctly
parseFunction = do void . lookAhead $ satisfy isAsciiLower <|> char '[' <|> char '('
DBFunction <$> takeTill (== ' ') <* string " :: " <*> tillEoL
parseInstance = do void $ string "instance "
line <- T.words <$> tillEoL
-- The name of an instance is basically everything
-- after the typeclass requirements
let nm = case break (== "=>") line of
(xs, []) -> T.unwords xs
(_, (_:xs)) -> T.unwords xs
return . DBInstance nm $ T.unwords line
parseClass = do void $ string "class "
line <- T.words <$> tillEoL
let nm = case break (== "=>") line of
((n:_), []) -> n
(_, (_:n:_)) -> n
_ -> ""
-- TODO: Sometimes typeclasses have all their default method
-- implementations listed right after the 'where' part,
-- just cut all of this off for now
trunc = fst . break (== "where") $ line
in return . DBClass nm $ T.unwords trunc
parseDBType = string "type " *> (DBType <$> takeTill (== ' ') <* skipSpace <*> tillEoL)
parseDBModule = string "module " *> (DBModule <$> takeTill (== '\n')) <* endOfLine
skipEmpty = many endOfLine
tillEoL = takeTill (== '\n') <* endOfLine
-- Compute a Diff by processing Haskell files directly. We use the Cabal API to locate and
-- parse the package .cabal file, extract a list of modules from it, and then pre-process
-- each module with cpphs and finally parse it with haskell-src-exts. The principal issue
-- with this approach is the often complex use of the CPP inside Haskell packages, making
-- this fail fairly often. This method also currently does not look at type signatures and
-- has various other limitations, like not working with modules that do not have an
-- export list
computeDiffParseHaskell :: ComputeParams -> ExceptT String IO Diff
computeDiffParseHaskell ComputeParams { .. } = do
[mListA, mListB] <- forM [cpVerA, cpVerB] $ \ver -> do
let pkgPath = either (\v -> cpTmpDir </> cpPackage ++ "-" ++ v) id ver
unless cpSilentFlag . liftIO . putStrLn $ "Processing " ++ pkgPath ++ "..."
-- Find .cabal file
dotCabal <- (liftIO . findPackageDesc $ pkgPath) >>= either throwError return
-- Parse .cabal file, extract exported modules
exports <- condLibrary <$> (liftIO $ readGenericPackageDescription normal dotCabal) >>= \case
Nothing -> throwError $ pkgPath ++ " is not a library"
Just node -> return $ exposedModules . condTreeData $ node
-- Build module name / module source file list
--
-- TODO: Some packages have a more complex source structure, need to look at the
-- cabal file some more to locate the files
let modules = flip map exports $
\m -> ( concat . intersperse "." . components $ m
, pkgPath </> toFilePath m <.> "hs" -- TODO: Also .lhs?
)
-- Parse modules
liftIO . forM modules $ \(modName, modPath) -> do
unless cpSilentFlag . putStrLn $ " Parsing " ++ modName
Main.parseModule modPath >>= either
-- Errors only affecting single modules are recoverable, just
-- print them instead of throwing
(\e -> putStrLn (" " ++ e) >> return (modName, Nothing))
(\r -> return (modName, Just r ))
-- Compute difference
return $ comparePackageModules mListA mListB
-- Parse a Haskell module interface using haskell-src-exts and cpphs
parseModule :: FilePath -> IO (Either String (Module SrcSpanInfo))
parseModule modPath = runExceptT $ do
(liftIO $ doesFileExist modPath) >>= flip unless
(throwError $ "Can't open source file '" ++ modPath ++ "'")
-- Run cpphs as pre-processor over our module
--
-- TODO: This obviously doesn't have the same defines and include paths set like
-- when compiling with GHC, major source of failures right now
modSrcCPP <- liftIO $ readFile modPath >>= runCpphs defaultCpphsOptions modPath
-- Parse pre-processed Haskell source. This pure parsing function unfortunately throws
-- exceptions for things like encountering an '#error' directive in the code, so we
-- also have to handle those as well
(liftIO . try . evaluate $
parseFileContentsWithMode defaultParseMode { parseFilename = modPath } modSrcCPP)
>>= \case Left (e :: ErrorCall) ->
throwError $ "Haskell Parse Exception - " ++ show e
Right (E.ParseFailed (SrcLoc fn ln cl) err) ->
throwError $ printf "Haskell Parse Error - %s:%i:%i: %s" fn ln cl err
Right (E.ParseOk parsedModule) ->
return parsedModule
type PackageModuleList = [(String, Maybe (Module SrcSpanInfo))]
-- Compare two packages made up of readily parsed Haskell modules
comparePackageModules :: PackageModuleList -> PackageModuleList -> Diff
comparePackageModules verA verB = do
let -- Compare lists of modules
modulesAdded = allANotInBBy ((==) `on` fst) verB verA
modulesRemoved = allANotInBBy ((==) `on` fst) verA verB
modulesKept = intersectBy ((==) `on` fst) verA verB
-- Build result Diff of modules
resAdded = flip map modulesAdded $ \case
(mname, Just m ) ->
(MAdded . map (prettyPrint) $ moduleExports m, mname)
(mname, Nothing) ->
(MAddedParseError, mname)
resRemoved = flip map modulesRemoved $ \case
(mname, Just m ) ->
(MRemoved . map (prettyPrint) $ moduleExports m, mname)
(mname, Nothing) ->
(MRemovedParseError, mname)
-- TODO: This doesn't sort correctly by type of change + name
resKept = sortBy (compare `on` fst) . flip map modulesKept $ \(mname, modA') ->
-- Did the exports change?
case (modA', findModule verB mname) of
(_, Nothing) -> (MNotSureIfModifiedParseError, mname)
(Nothing, _) -> (MNotSureIfModifiedParseError, mname)
(Just modA, Just modB)
| moduleExports modA == moduleExports modB
-> (MUnmodifed , mname)
| otherwise -> (MModified expCmp, mname)
where -- Which exports were added / removed?
expCmp =
[(EAdded , prettyPrint x) | x <- expAdded ] ++
[(ERemoved , prettyPrint x) | x <- expRemoved ] ++
[(EUnmodified, prettyPrint x) | x <- expUnmodified]
-- TODO: We do not look for type changes, no EModified
expAdded = allANotInBBy (==) (moduleExports modB)
(moduleExports modA)
expRemoved = allANotInBBy (==) (moduleExports modA)
(moduleExports modB)
expUnmodified = intersectBy (==) (moduleExports modA)
(moduleExports modB)
-- TODO: If the module does not have an export spec, we assume it exports nothing
moduleExports (Module _ (Just (ModuleHead _ _ _ (Just (ExportSpecList _ exportSpec)))) _ _ _ ) = exportSpec
moduleExports _ = []
findModule mlist mname = maybe Nothing snd $ find ((== mname) . fst) mlist
in resAdded ++ resRemoved ++ resKept
|
package a
// TestWithPeriodF that is multi line and has a
// period in the end.
func TestWithPeriodF() {}
|
package com.lighters.demo;
import android.graphics.Color;
import android.os.Bundle;
import android.support.design.widget.FloatingActionButton;
import android.support.design.widget.Snackbar;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.Toolbar;
import android.view.View;
import com.lighters.library.guide.BubbleGuide;
import com.lighters.library.guide.BubbleGuideOption;
import com.lighters.library.guide.enumtype.BubbleAlignType;
import com.lighters.library.guide.enumtype.BubbleArrowDirection;
/**
* Created by alighters on 16/4/29.
* Email: <EMAIL>
* GitHub: https://github.com/alighters
*/
public class GuideNormalActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_guide_normal);
Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
toolbar.setTitle(R.string.guide_normal);
FloatingActionButton fab = (FloatingActionButton) findViewById(R.id.fab);
fab.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Snackbar.make(view, "Replace with your own action", Snackbar.LENGTH_LONG)
.setAction("Action", null).show();
}
});
showGuide();
}
private void showGuide() {
BubbleGuideOption guideOption = new BubbleGuideOption.Builder()
.alignType(BubbleAlignType.RIGHT)
.arrowDirection(BubbleArrowDirection.BOTTOM)
.confirmText("I Got it")
.tipText("This a normal button")
.bubbleColor(Color.WHITE)
.arrowWidth(20)
.roundCornerSize(50)
.arrowHeight(20)
.build();
new BubbleGuide.Builder(this)
.container(findViewById(android.R.id.content))
.guide(R.id.fab)
.option(guideOption)
.build().show();
}
}
|
Samsung Galaxy Nexus no longer available from the Google Play store. Could it be because of the injunction?
The Samsung Galaxy Nexus is no longer available for you to purchase from the Google Play store. At the moment, the only thing displayed when you visit the Galaxy Nexus page at the Play store is a “Coming Soon” message with the option to be notified via e-mail when the phone is available to purchase again.
Unfortunately, we’re not sure if the device is simply out of stock or if this has something to do with the recent injunction that prevents the Galaxy Nexus from being sold in the United States. The ban is currently in effect but allows remaining Galaxy Nexus inventory to be sold. Once the inventory has been sold, Samsung and Google won’t be able to import any new shipments of the Galaxy Nexus until things are worked out in the courts.
Of course, it could also be because the device is simply out of stock. A lot of people decided to purchase the Galaxy Nexus after news broke that Apple was going to keep it from being sold in the U.S.
Hopefully this is just a temporary thing and the Galaxy Nexus will be available for you to purchase at $349.99 again pretty soon. In the meantime, fill out the form on the page to get notified when the Galaxy Nexus is being sold on Google Play again.
Source: Google Play |
// trims `offset` components from the beginning of the multiaddr.
func offset(maddr ma.Multiaddr, offset int) ma.Multiaddr {
_, after := ma.SplitFunc(maddr, func(c ma.Component) bool {
if offset == 0 {
return true
}
offset--
return false
})
return after
} |
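A minimal usage sketch for the helper above, assuming it lives in the same package and that the go-multiaddr module (imported as ma, matching the snippet) is available:

package main

import (
	"fmt"

	ma "github.com/multiformats/go-multiaddr"
)

func main() {
	// StringCast panics on malformed input, which is fine for a fixed literal.
	addr := ma.StringCast("/ip4/127.0.0.1/tcp/4001")
	// Trimming one component should leave /tcp/4001.
	fmt.Println(offset(addr, 1))
}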
/**
* Handles actions involving authenticating
* user. Sets the logged-in session to true
* @return success or error
*/
public String authenticate()
{
logger.debug("in method authenticate");
    boolean loginStatus = userService.validateUser(user.getUserID(), user.getPassword());
    if (loginStatus)
    {
        // Look up the full user record so the session can greet them by name.
        user = userService.getUserID(user.getUserID());
        String loginName = user.getFirstName() + " " + user.getLastName();
        getSession().put("logged", "true");
        getSession().put("userid", user.getUserID());
        getSession().put("loginName", loginName);
return SUCCESS;
}
else
{
return ERROR;
}
} |
package packages
import (
"fmt"
"os/exec"
"strings"
"github.com/strattadb/setup/internal/pkg/helpers"
)
var aptPackages = [...]string{
"apt-transport-https",
"ca-certificates",
"curl",
"software-properties-common",
"build-essential",
"cmake",
"git",
"vim",
"fish",
"nodejs",
"python3.7",
"docker-ce",
"kubectl",
"fonts-firacode",
"xsel",
}
func installAptPackages() {
removeAptRepositories()
addAptRepositories()
	err := exec.Command("sudo", "apt-get", "update").Run()
	helpers.LogAndExitIfError(err)
	// Pass each package as its own argument; joining them into one string
	// would make apt-get treat the whole list as a single package name.
	installArgs := append([]string{"apt-get", "-y", "install"}, aptPackages[:]...)
	err = exec.Command("sudo", installArgs...).Run()
	helpers.LogAndExitIfError(err)
postInstallDocker()
}
func removeAptRepositories() {
	// Currently a no-op.
}
var aptRepositories = [1]string{"ppa:deadsnakes/ppa"}
func addAptRepositories() {
repositories := strings.Join(aptRepositories[:], " ")
cmd := exec.Command("sudo", "add-apt-repository", "-y", repositories)
cmd.Run()
addNodeJSAptRepository()
addYarnAptRepository()
addDockerAptRepository()
addKubectlAptRepository()
}
func addNodeJSAptRepository() {
const nodeVersion = "10"
nodeSourceURL := fmt.Sprintf(
"https://deb.nodesource.com/setup_%s.x",
nodeVersion)
cmdStr := fmt.Sprintf("curl -sL %s | sudo -E bash -", nodeSourceURL)
helpers.RunBashCommandAndLogAndExitIfError(cmdStr)
}
func addYarnAptRepository() {
const yarnPublicKeyURL = "https://dl.yarnpkg.com/debian/pubkey.gpg"
const echoYarnList = `echo "deb https://dl.yarnpkg.com/debian/ stable main"`
cmdStr := fmt.Sprintf(`
curl -sS %s | sudo apt-key add - && \
%s | sudo tee /etc/apt/sources.list.d/yarn.list`,
yarnPublicKeyURL,
echoYarnList)
helpers.RunBashCommandAndLogAndExitIfError(cmdStr)
}
func addDockerAptRepository() {
	const dockerPublicKeyURL = "https://download.docker.com/linux/ubuntu/gpg"
	// Docker's apt source for the current Ubuntu release (lsb_release -cs);
	// docker-ce is only available from this repository.
	const echoDockerList = `echo "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"`
	cmdStr := fmt.Sprintf(`
	curl -fsSL %s | sudo apt-key add - && \
	%s | sudo tee /etc/apt/sources.list.d/docker.list`,
		dockerPublicKeyURL,
		echoDockerList)
	helpers.RunBashCommandAndLogAndExitIfError(cmdStr)
}
func addKubectlAptRepository() {
const kubectlPublicKeyURL = "https://packages.cloud.google.com/apt/doc/apt-key.gpg"
const echoKubectlList = `echo "deb http://apt.kubernetes.io/ kubernetes-xenial main"`
cmdStr := fmt.Sprintf(`
curl -s %s | sudo apt-key add - && \
%s | sudo tee /etc/apt/sources.list.d/kubernetes.list`,
kubectlPublicKeyURL,
echoKubectlList)
helpers.RunBashCommandAndLogAndExitIfError(cmdStr)
}
func postInstallDocker() {
const cmdStr = `
sudo groupadd docker && \
sudo usermod -aG docker "${USER}"`
helpers.RunBashCommandAndLogAndExitIfError(cmdStr)
}
|
import java.util.*;
import java.io.*;
import java.lang.*;
import java.math.*;
public class A {
public static void main(String[] args) throws Exception {
BufferedReader bf = new BufferedReader(new InputStreamReader(System.in));
// Scanner scan = new Scanner(System.in);
PrintWriter out = new PrintWriter(new OutputStreamWriter(System.out));
int n = Integer.parseInt(bf.readLine());
for(int t=0; t<n; t++) {
StringTokenizer st = new StringTokenizer(bf.readLine());
int r = Integer.parseInt(st.nextToken());
int c = Integer.parseInt(st.nextToken());
int k = Integer.parseInt(st.nextToken());
char[][] grid = new char[r][c];
for(int i=0; i<r; i++) grid[i] = bf.readLine().toCharArray();
int rice_count = 0;
for(int i=0; i<r; i++) for(int j=0; j<c; j++) if(grid[i][j]=='R') rice_count++;
int num = rice_count/k;
int extra = rice_count % k;
int counter = 0;
char[][] ans = new char[r][c];
int cur = (int)('a');
int curChickCount = 1;
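            // Traverse the grid in snake (boustrophedon) order so consecutive
            // cells are adjacent; switch to the next chicken's label once the
            // current one has collected its share of rice cells.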
for(int i=0; i<r; i++) {
int start = 0; if(i%2 == 1) start = c-1;
int end = c; if(i%2 == 1) end = -1;
int diff = 1; if(i%2 == 1) diff = -1;
for(int j=start; j!=end; j+=diff) {
int threshold = num;
if(extra > 0) threshold = num+1;
if(counter == threshold) {
extra -= 1;
if(curChickCount < k) {
if(cur == (int)('z')) cur = (int)'A';
else if (cur == (int)('Z')) cur = (int)('0');
else cur += 1;
curChickCount+=1;
}
counter = 0;
}
ans[i][j] = (char)cur;
if(grid[i][j] == 'R') counter++;
}
}
for(int i=0; i<r; i++) {
StringBuilder sb = new StringBuilder();
for(int j=0; j<c; j++) sb.append(ans[i][j]);
out.println(sb.toString());
}
}
// StringTokenizer st = new StringTokenizer(bf.readLine());
// int[] a = new int[n]; for(int i=0; i<n; i++) a[i] = Integer.parseInt(st.nextToken());
// int n = Integer.parseInt(st.nextToken());
// int n = scan.nextInt();
out.close(); System.exit(0);
}
}
|
So this is the corp deck that I took to a regionals event I travelled to with a few good friends from my local scene. I'd been testing the deck a lot, and was confident that it would fare well against a new crowd. With a record of 4-1 with this deck, I couldn't have asked for much more. Let's delve into how this mess of cards comes together to confuse and surprise the runner.
Let's start with the elephant in the room: the all-sentry suite. I know what you're thinking: this is madness. There is hardly any ETR, so how do you keep the runner at bay? I feel this is balanced by the low agenda density, and the possibility of recursion making trashed assets not game-ending. Most of the ice is fairly taxing to get through if all you're using is Garrote. Imagine having to use Femme to break every piece of ice in a 4- or 5-deep server. The ice strength is spread out nicely, making Atman at 4 not necessarily ideal. The lone Swordsman over R&D can do wonders as well. Lots of program destruction to catch nosy runners early, and potentially cripple them.
Agendas:
With only 7 agendas in 49 cards, it's the lowest you can go in ETF, making multi-access runs rarely fruitful and often disappointing. Whatever you score first helps your game immensely. Using Priority Req to rez Janus, NEXT Gold, or Flare is the start of a mean scoring server. Using Eden Fragment to build giant scoring servers, while making a credit on the install, is crushing.
Assets:
The real stars of the deck. Junebugs and Cerebrals make great traps late game, but the most amazing, unexpected, game-changing card is Thomas Haas. Having him early is amazing. After a few turns of setting up and making creds, you install, gain one, advance, advance. The next few turns consist of corp draw, install either ice or an asset for a cred, double advance. There have been times I've trashed him for 24 credits. People just don't know what to make of a super-advanced card. He also acts as an amazing trap. You spend most of your credits advancing him, getting as low as 3 or 4. The runner, assuming you're broke, decides to make a ballsy run on your R&D. Surprise! I have 26 credits. Rez Janus. Tech Startup is also crazy. Using it to get Melange is out of this world (ha..ha), and finding EBC in the case of the Valencia matchup is crucial. Finding JHow is also nice.
The operations speak for themselves. Interns to recur Ash, Will or Melange. Even agendas that have been pitched, waiting for JHow to work his magic, can go into the scoring server when an opportunity presents itself.
#include <bits/stdc++.h>
#define ll long long
using namespace std;

int main(){
    ll t; cin >> t;
    while(t--){
        ll n; cin >> n;
        vector<ll> arr(n), dep(n), tm(n);
        for(ll u = 0; u < n; u++) cin >> arr[u] >> dep[u];
        for(ll u = 0; u < n; u++) cin >> tm[u];
        ll current = 0, ans = 0, travelTime, stay;
        for(ll u = 0; u < n; u++){
            // Time spent travelling to station u, including this leg's delay tm[u].
            if(u == 0) travelTime = arr[u] + tm[u];
            else travelTime = arr[u] - dep[u-1] + tm[u];
            current += travelTime;
            ans = current;
            // The train must wait at least ceil((dep - arr) / 2) time units,
            // and cannot leave before the scheduled departure dep[u].
            stay = (dep[u] - arr[u] + 1) / 2;
            current = max(dep[u], current + stay);
        }
        cout << ans << endl;
    }
    return 0;
}
|
package main
import (
"bytes"
"encoding/json"
"io/ioutil"
"net/http"
"net/http/httptest"
"testing"
"github.com/gin-gonic/gin"
"github.com/stretchr/testify/require"
"github.com/tendermint/tendermint/types"
"xa.org/xablockchain/xchain-meta/relaychain"
)
func TestProof(t *testing.T) {
// test gin context
iccpBytes, err := ioutil.ReadFile("./testdata/cosmos-Iccp")
require.Nil(t, err)
validatorBytes, err := ioutil.ReadFile("./testdata/validator_set.txt")
require.Nil(t, err)
validators := &types.ValidatorSet{}
cdc := relaychain.ModuleCdc
require.Nil(t, cdc.UnmarshalJSON(validatorBytes, validators))
iccp := &relaychain.ICCP{}
require.Nil(t, json.Unmarshal(iccpBytes, iccp))
proofInfo := &ProofInfo{
Validators: validatorBytes,
ChainID: "appchain1",
Iccp: iccp,
}
proofInfoBytes, err := json.Marshal(proofInfo)
require.Nil(t, err)
buf := bytes.NewBuffer(proofInfoBytes)
gin.SetMode(gin.TestMode)
g, _ := gin.CreateTestContext(httptest.NewRecorder())
r, err := http.NewRequest("POST", "http://localhost/verify", buf)
require.Nil(t, err)
g.Request = r
verify(g)
require.Equal(t, g.Writer.Status(), 200)
}
|
#pragma once
#include "component.hpp"
#include "common.hpp"
#include "texture.hpp"
namespace meov::core::components {
class SkyboxComponent : public Component {
public:
explicit SkyboxComponent(const fs::path &path);
~SkyboxComponent() override = default;
void PreDraw(Graphics &g) override;
void Draw(Graphics &g) override;
void PostDraw(Graphics &g) override;
void Update(double) override;
void Serialize() override;
bool Valid() const;
private:
bool mDirtyFlag{ true };
fs::path mPath;
std::shared_ptr<Texture> mSkyboxTexture;
void OnInvalidSerialize();
void OnValidSerialize();
};
} // namespace meov::core::components
|
Focus on collagen: in vitro systems to study fibrogenesis and antifibrosis - state of the art
Abstract
Fibrosis represents a major global disease burden, yet a potent antifibrotic compound is still not in sight. Part of the explanation for this situation is the difficulties that both academic laboratories and research and development departments in the pharmaceutical industry have been facing in reenacting the fibrotic process in vitro for screening procedures prior to animal testing. Effective in vitro characterization of antifibrotic compounds has been hampered by cell culture settings that are lacking crucial cofactors or are not holistic representations of the biosynthetic and depositional pathway leading to the formation of an insoluble pericellular collagen matrix. In order to appreciate the task which in vitro screening of antifibrotics is up against, we will first review the fibrotic process by categorizing it into events that are upstream of collagen biosynthesis and the actual biosynthetic and depositional cascade of collagen I. We point out oversights such as the omission of vitamin C, a vital cofactor for the production of stable procollagen molecules, as well as the little known in vitro tardy procollagen processing by collagen C-proteinase/BMP-1, another reason for minimal collagen deposition in cell culture. We review current methods of cell culture and collagen quantitation vis-à-vis the high content options and requirements for normalization against cell number for meaningful data retrieval. Only when collagen has formed a fibrillar matrix that becomes cross-linked, invested with ligands, and can be remodelled and resorbed, the complete picture of fibrogenesis can be reflected in vitro. We show here how this can be achieved. A well thought-out in vitro fibrogenesis system represents the missing link between brute force chemical library screens and rational animal experimentation, thus providing both cost-effectiveness and streamlined procedures towards the development of better antifibrotic drugs.
Fibrosis - ubiquitous problem and global burden
Repair of damaged tissues is an essential biological process which allows directed replacement of dead or damaged cells with connective tissue after injury. The repaired area is addressed as a scar. Hence, scarring represents a survival mechanism that is conserved throughout evolution and appears to be most pronounced in humans. If this wound healing process goes awry, fibrosis results, often causing an excessively large scar or the scar-like transformation of organ parts or whole organs. Besides local scarring at sites of acute trauma, a variety of other causes, such as chronic infections, chronic exposure to alcohol and other toxins, autoimmune and allergic reactions, and radio- and chemotherapy, can all lead to fibrosis. This pathological process, therefore, can occur in almost any organ or tissue of the body and, typically, results from situations persisting for several weeks or months in which inflammation, tissue destruction and repair occur simultaneously. In this setting, fibrosis most frequently affects the lungs, liver, skin and kidneys. There are approximately 5 million cases of idiopathic lung fibrosis globally, not counting rare disorders like cystic fibrosis or very common ones such as asthma. Chronic hepatitis B and C virus infections are a major cause of liver fibrosis/cirrhosis, which currently ranks 18th in the global disease burden. Scar formation after myocardial infarction can, on the one hand, prevent the injured myocardium from dilatation and rupture but, on the other hand, it can impair cardiac function by increasing ventricular wall stiffness. Atherosclerotic lesions contain fibrotic tissue which can occupy up to 87% of total plaque area.
Peri-implantational fibrosis represents a current clinical roadblock in regenerative medicine, which is gaining attention in the tissue engineering field. Every implant is surrounded by a fibrotic tissue reaction that depends on the material, its surface and its degradation profile. This is a consequence of chronic local inflammation and a reflection of the host tissue's attempt to destroy the implant or to cope with it. If destruction is not an option, the implant gets wrapped in a fibrous shroud with sparse or no vascularization, so that it becomes effectively isolated from the surrounding tissue. This is seen in artificial ligaments, implanted biosensors, joint implants, breast implants, encapsulated tissues/cells, drug delivery systems and eye implants, and regularly impairs the proper function of the implant. This has prompted the field to alter surface structures and coatings to contain this problem. A potential strategy could be to develop biomaterials that deliver an antifibrotic substance locally.
It becomes clear that the development of effective antifibrotics is an important unmet clinical need, and with it comes the necessity for rapid in vitro screening tools to characterize lead antifibrotic compounds before they are tested in animal models. This review will focus on the current state of the art in emulating a fibrotic process in vitro, the associated challenges and pitfalls, and suggestions on how to address them.
Fibrogenesis in vivo - complexity and key players
In order to appreciate the task which in vitro screening of antifibrotics is up against, we shall dissect the fibrotic process into two categories: first, events that are upstream of collagen biosynthesis; and, secondly, the biosynthetic and depositional cascade of collagen I.
Upstream events of fibrosis - cellular players in vivo
Trauma disrupts the anatomical cohesion of tissue structures, most evident by bleeding, which indicates breakage of blood vessels and disruption of their endothelial lining. This immediately induces a haemostatic response encompassing platelet aggregation, blood clot formation and accumulation of provisional ECM. Damaged epithelia secrete cytokines, growth factors and chemoattractants for mononuclear cells to phagocytose cellular debris at the site of injury and for fibroblasts to deposit collagen and remodel it. Thus, a scar is formed that eventually matures. The origin of these fibroblasts is currently a matter of debate. They are either differentiating homing mesenchymal stem cells, fibrocytes from the blood circulation, or cells derived from epithelia via epithelial-mesenchymal transition (EMT). The fibroblasts involved in scarring have a myofibroblast phenotype characterised by α-smooth muscle actin (α-SMA) expression, increased secretion of collagen types I and III, and contractility. These cells are believed to be responsible for the majority of collagen production in most organs.
'Soluble' factors mediating fibrosis
The cellular effectors of fibrosis are activated and phenotypically modulated by humoral players, namely chemokines, growth factors and cytokines. Most notorious is transforming growth factor β1 (TGF-β1), which supports wound healing and repair. Under pathological conditions, TGF-β1 coordinates a cross-talk between parenchymal, inflammatory and collagen-expressing cells, and plays a key role in fibrosis progression. TGF-β1 is often referred to as a 'soluble' factor. We use quotation marks here because TGF-β1 is stored in its latent form bound to TGF-β1 binding proteins in the matrix, and can in its active form be scavenged and possibly neutralized by decorin-mediated binding into the ECM (for review see ).
Along with factors such as epithelial growth factor, basic fibroblast growth factor and interleukin-1, TGF-β1 appears to play a key role in EMT. Connective tissue growth factor and platelet-derived growth factor have also been reported to be involved in fibrosis (for a more in-depth review on cytokines and molecular mechanisms involved in fibrogenesis, refer to ). Inflammation typically precedes fibrosis, although it has been demonstrated that fibrosis is not always driven by inflammation. This suggests that the mechanisms that regulate fibrogenesis are, to a certain extent, distinct from those regulating inflammation. This may explain the lack of efficacy of anti-inflammatory compounds in the treatment of fibrotic disease. Antifibrotic strategies at the upstream level aim to interfere with fibrotic growth factors and make use of interfering antibodies, small molecules, proteins, antisense technology or human recombinant TGF-β3. TGF-β3 is an alleged TGF-β1 antagonist that has shown some promise in phase III clinical trials in a prophylactic setting of small skin wounds.
Understanding the last mile of the fibrotic pathway
Irrespective of upstream events that trigger and entertain fibrosis, the final product of cellular activity is the massive deposition of collagen which results in scar formation, organ or peri-implantational fibrosis. Therefore, we would like to turn now to the obvious target in fibrosis, namely the biosynthetic pathway of collagen itself.
The overall amount of collagen deposited by fibroblasts is a regulated balance between collagen synthesis and collagen catabolism, which is a carefully controlled process. During a pathological maturation and remodelling phase, collagen synthesized by fibroblasts exceeds the rate at which it is degraded, such that the net amount of collagen continues to increase. There are several key points along the collagen biosynthesis pathway that can be targeted to effect a net reduction of collagen secretion and/or deposition (Figure 1). Transcription interference can be effected using histone deacetylase inhibitors or substances like halofuginone. At the post-transcriptional level, siRNAs targeting growth factors and key players have been investigated and, recently, we and others have suggested the use of microRNAs. Interfering with post-translational modifications by inhibiting prolyl-4 hydroxylase renders collagen triple helices less thermostable and prevents their secretion; downregulation of the collagen chaperone hsp47 does likewise. At the extracellular level, inhibition of procollagen C-proteinase/BMP1 prevents the removal of the C-terminal propeptide from the procollagen I molecule and, thus, the supramolecular assembly of collagen into fibrils. Inhibition of lysyl oxidase-mediated intermolecular cross-links between collagen triple helices can reduce the collagen content of the ECM, presumably by rendering collagen aggregates more susceptible to proteolytic remodelling. Similarly, the administration of hepatocyte growth factor and matrix metalloproteinase 1 (MMP1) increases collagen turnover in the ECM. It becomes clear that a meaningful in vitro system for the testing and characterization of antifibrotics should be able to emulate the above-described complete collagen matrix formation cascade, encompassing its biosynthesis and all post-translational (intra- and extracellular) modifications that give rise to a stable supramolecular assembly, and also ideally allow the study of remodelling/fibrolysis. For the sake of convenience and efficiency for screening purposes, quantitation of collagen and other proteins of interest should preferentially be in and from one well.
Fibrogenesis in vitro - constraints and options
The task of emulating collagen matrix formation in vitro has been surprisingly difficult, partly due to the omission of important cofactors and partly because of intrinsic properties of contemporary cell culture conditions that are still largely unknown. We will discuss these problems and their solutions.
Biosynthetic issues - getting collagen made and deposited in vitro
(1) The first challenge for in vitro fibrogenesis is the sufficient production of collagen and its subsequent incorporation into a pericellular matrix. The omission of ascorbic acid in cell culture results in minimal production and deposition of collagen on the cell layer. Ascorbate is a crucial cosubstrate of the enzymes responsible for the post-translational hydroxylation of prolyl and lysyl residues, which is necessary for rendering the collagen triple helix thermostable and for the extracellular cross-linking of collagen fibers, respectively. In other words, scurvy can exist in cell culture. However, like its counterpart in vivo, it can be easily treated in vitro by the administration of ascorbate. On the other hand, L-ascorbic acid has a short half-life in culture and completely oxidizes after 3 days, so use of a stable form of ascorbate, such as a magnesium salt of L-ascorbic acid 2-phosphate hexahydrate, is highly recommended.
About 23% of the collagen molecule is composed of proline and hydroxyproline. As a non-essential amino acid, proline is synthesized from arginine/ornithine via the urea cycle and from glutamate (directly, or indirectly from glutamine via glutaminase) through the citric acid cycle. Clinical observations in burn patients suggest a drain of arginine, ornithine and glutamate, while wound fluid proline levels are at least 50% higher than plasma levels, suggesting active import of proline into the wound. Providing additional proline or glutamine in the diet to enhance collagen biosynthesis, however, does not result in increased collagen accumulation. In contrast, arginine and ornithine supplementation are most effective in increasing collagen deposition. As cell culture media contain L-arginine and are usually supplemented with L-glutamine, direct proline supplementation of cell cultures under static and bioreactor conditions is not necessary for increasing collagen synthesis, as also shown recently. However, systematic studies of proline-precursor supplementation in vitro under conditions of increased synthesis and deposition have yet to be conducted.
(2) Even under ascorbate supplementation, fibroblasts deposit only minimal amounts of secreted collagen I into their matrices. The reason for this lies in the tardy procollagen C-proteinase/BMP1 activity under current aqueous culture conditions. This results in an accumulation of unprocessed procollagen in the cell culture medium, where it does not belong and is discarded with every medium change. Analytical methods have, therefore, mostly focused on the convenient measurement of procollagen secreted into culture medium. This may be sufficient to assess compounds that primarily influence biosynthesis and/or secretion, but it would not allow assessment of any later step in collagen matrix formation.
To improve collagen deposition in cell culture, a fibroplasia model was developed using hyperconfluent human dermal fibroblasts exposed to TGF-β1 for up to 1 month to allow the formation of a fibroplastic tissue. In an abbreviated form (8 days of TGF-β1 treatment) this model is in use at Pfizer Global R&D (Sandwich, Kent). This model moves closer to emulating the entire collagen biosynthesis pathway as it allows the assessment of procollagen C-proteinase inhibitors, which are predicted to interfere with collagen deposition. Destructive analysis of deposited collagen by high-performance liquid chromatography of 4-hydroxyproline is performed on the insoluble culture fraction. Our laboratory has systematically developed technology to accelerate collagen deposition in vitro by introducing macromolecules into the culture medium. We have developed two deposition technologies that differ in terms of speed and morphology of the deposited collagen. The first approach employs charged macromolecules such as dextran sulphate 500 kDa (DxS) and polysodium-4-styrene sulfonate. DxS leads to a granular deposition of collagen within 48-72 h, exceeding that of non-crowded cultures by 20- to 30-fold within the same time frame. The second approach, using neutral macromolecules in a Ficoll cocktail (Fc), increases collagen deposition 10-fold in 6 days and in a reticular deposition pattern (Figure 2B). Both approaches are based on the creation of the excluded volume effect, as explained elsewhere. Briefly, macromolecules drive reaction partners into closer collaboration, resulting in improved protein folding and protein-protein interactions. In the case of fibrogenic cell culture, the conversion of procollagen to collagen is sped up, as well as the supramolecular assembly of collagen triple helices to form fibres.
Quantitative issues - measuring collagen and normalizing the data
(3) Determination of the amount of collagen produced in vitro is the next challenge, and this can be done in a variety of ways ranging from simple colorimetric assays to elaborate chromatographic procedures using radioactive and non-radioactive material. What most of these procedures have in common is the need to destroy the cell layer to obtain solubilized collagen from the pericellular matrix. The oldest established colorimetric method employs chloramine-T to measure hydroxyproline. The destructive pre-solubilization requirement of collagen for this assay, lack of specificity in discriminating between collagen types, as well as a lack of internal normalization within samples, are disadvantages in a screening setting. The Sirius dye has been used since 1964 to identify collagen in histology specimens. It is based on the selective binding of Sirius Red F3BA to collagen. Subsequent elution with sodium hydroxide-methanol and read-out at 540 nm can be done in cuvettes and in microtiterplates with collagen extract adsorbed to the plastic, using microplate readers. Biocolor Ltd (County Antrim, Northern Ireland, UK) made the Sircol™ Collagen Assay commercially available in 2007. Precipitation of soluble collagen with the Sirius Red dye is required prior to release of the dye with an alkali. In our hands, this assay grossly overestimates collagen and procollagen secreted into cell culture medium due to interference of non-collagenous serum proteins, so we would recommend prepurification (peptic digest and ultrafiltration) to improve both sensitivity and specificity (Ricky R Lareu, Dimitrios Zeugolis and Michael Raghunath, unpublished experiments). An interesting solution appears to be the adsorption of this dye onto collagen deposited on the cell layer in culture. This was recently suggested in a modified version for the testing of antifibrotic agents, although this screening system did not use or recommend the addition of ascorbic acid. Various enzyme-linked immunoassays can be used to quantify specific collagen types, such as coating collagen extracts onto multiwell plates followed by detection using antibodies, sandwich assays and competitive enzyme immunoassays. These assays are very suitable for assessing soluble collagen from culture media but would require extraction and dialysis procedures to release insoluble collagen from pericellular matrices. In this case, normalization for differences between protein amounts or cell numbers in samples is not possible. The determination of the hydroxyproline/proline ratio via HPLC is based on total hydrolysis of protein samples, separation of hydroxyproline from proline and back-calculating the possible collagen content of the sample. While this method allows for the analysis of any given biological material in experienced laboratories, it is not amenable to a screening setting and would not allow for normalization or discrimination between different collagen types; the same holds true for metabolically labelled cell cultures, with the attending issues of isotope handling. Similar issues arise with gas chromatography/mass spectrometry, which requires derivatization via trifluoroacetylation and methanol esterification of 4-hydroxyproline in collagen. Incorporation of the stable isotope of oxygen, 18O2, into collagen is also possible with this method and enables the examination of collagen synthesis in vitro.
Polyacrylamide gel electrophoresis can be very collagen-specific using metabolic labelling of cell cultures with the radiolabelled amino acids glycine and proline and subsequent detection of radioactive bands using fluorography. Metabolic labelling can be replaced with safer and very sensitive silver staining or immunoblotting. The latter has the advantage of differentiating between various collagen types and of comparison with an internal standard like actin as a cell-mass equivalent for normalization. Collagen antibodies tend to recognize protein conformation in addition to sequence specificity, which can pose a problem for the detection of denatured collagen α-chains in sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE); hence, native PAGE might be an alternative option. All gel electrophoresis-based approaches are excellent qualitative and quantitative back-up techniques. However, their laboriousness would exclude them as screening tools.
Figure 2. The Scar-in-the-Jar system combines enhanced collagen deposition with optical analysis for in situ quantitation. (A) Cell layers were pepsin digested, resolved by sodium dodecyl sulphate-polyacrylamide gel electrophoresis and silver stained. In comparison with fibroplasia models (FP1: Ref, FP2: Ref), macromolecular crowding increased matrix formation, including stronger lysyl oxidase-mediated cross-linking, in both deposition modes (rapid: dextran sulphate; accelerated: Ficoll cocktail) within a shorter time frame. Note the presence of collagen V in FP and the accelerated deposition mode and its absence in the rapid deposition mode. Collagen V is usually absent from fibrotic tissue; hence, the extracellular matrix obtained in the rapid deposition mode will probably be more similar to a fibrotic matrix. (B) Cell layers were immunostained for collagen I and fibronectin. Cell nuclei were stained with 4',6-diamidino-2-phenylindoldilactate (DAPI). The rapid deposition mode (negatively charged, DxS) produces granular collagen I and fibronectin within 2 days, and the accelerated mode (neutral, Fc) produces collagen I with a reticular deposition pattern within 6 days. Therefore, the amount, velocity and morphology of deposited collagen can be manipulated depending on the macromolecules used. (C) Optical analysis of deposited collagen I using a 2× objective eliminated corner auto-fluorescence in the four corner fields with triangular masks to conceal these regions during quantitation. (D) Cytometry and quantitation of the area of deposited collagen I in a 24-well multiplate format enabled identification of antifibrotic substances that perturb the collagen biosynthesis pathway, resulting in a net reduction of deposited collagen I. (i) DAPI-stained nuclei at 20× total magnification in monochrome pseudocolour; 600× magnification (inset). (ii) Nuclei scored in red by the Count Nuclei module for cytometry. (iii) Immunostained deposited collagen I. (iv) Regions with fluorescent pixel intensity above a selected value based on controls are demarcated by the software in green for quantitation of deposited collagen I area at 100× magnification. This figure is reproduced with permission (Ref).
(4) Normalization of data to account for cell number variations is another important consideration. Most of the methods discussed above lack the option of normalization, necessitating parallel cultures subjected to the separate measurement of total protein or DNA content. In any case, cell density influences the amount of collagen deposited: in fact, subconfluent cultures produce the most collagen. Depending on the substance screened, inhibition or stimulation of proliferation, or cytotoxicity, may occur. As this will certainly impact the collagen amount secreted/deposited, well-to-well variances notwithstanding, the acquired data must account for these changes in cell number/density. Data normalization methods involve the destruction of cell layers to quantify housekeeping proteins by Western blot, or DNA. This not only increases sample-processing steps, but also accumulates error.
Qualitative issues - looking beyond collagen I
(5) Fibrosis is not only characterized by an excessive buildup of collagen I; depending on the tissue context, other proteins might be of interest. Also, the transition of fibrogenic cells from a quiescent fibroblast stage to a fibrotic myofibroblast phenotype should be considered. Markers such as α-SMA or fibroblast activation protein-α are up-regulated in stimulated fibroblasts, and a survey of the production and deposition of non-collagenous extracellular matrix (ECM) proteins as an internal control for the action of antifibrotic agents would be desirable. As well as capturing the actual number of cells in a given well, options for a high content read-out covering as many of the above markers as possible in addition to collagen I would constitute an ideal antifibrotic screening system.
A current solution: the Scar-in-a-Jar
The Scar-in-a-Jar has been developed in our laboratory in order to address the problems of in vitro fibrogenesis discussed above. This system: (1) solves the problem of tardy collagen deposition; (2) quantitatively measures relative changes in collagen I deposition via immunofluorescence and in situ optical analysis; (3) allows for high content screening within a single well, including α-SMA and another ECM protein of choice; and (4) implements normalization for cell number by counting nuclei via 4',6-diamidino-2-phenylindoldilactate (DAPI) staining. As discussed above, the addition of charged and neutral macromolecules into culture medium dramatically enhances the deposition of collagen into the pericellular matrix (Figure 2B), within a shorter time, to a greater extent and with a higher degree of cross-linking than possible in the fibroplasia models by Clark et al. (1997) and Fish et al. (2007) (Figure 2A). In both cases, enhanced collagen deposition reduces culture time to 48 h (DxS) or 6 days (Fc) prior to optical quantitation. TGF-β1 addition is optional. For optical analysis, samples in a 24-well format are immunostained for collagen I and fibronectin, and cell nuclei are stained with DAPI. Automated image acquisition using a 2× objective is performed and the area of collagen I and fibronectin per cell is ascertained using the Metamorph® Imaging System software (Figure 2C-D).
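To make the per-cell read-out arithmetic concrete, here is a minimal, hypothetical sketch of the normalization step. This is not the Metamorph pipeline; the function name, the thresholds and the SciPy-based nucleus counting are illustrative assumptions only.

import numpy as np
from scipy import ndimage  # assumed available for labelling nuclei


def collagen_area_per_cell(collagen_img, dapi_img, collagen_thresh, dapi_thresh):
    """Return thresholded collagen-I area divided by the DAPI nucleus count.

    collagen_img, dapi_img: 2-D fluorescence intensity arrays.
    In practice the thresholds would be chosen from control wells.
    """
    # Pixels above threshold count towards deposited collagen area.
    collagen_area = np.count_nonzero(collagen_img > collagen_thresh)
    # Label connected DAPI-positive regions and treat each as one nucleus.
    _, n_nuclei = ndimage.label(dapi_img > dapi_thresh)
    if n_nuclei == 0:
        raise ValueError("no nuclei detected; check the DAPI threshold")
    return collagen_area / n_nuclei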
The Scar-in-a-Jar enables the analysis of proteins of interest and cell enumeration all within a single well, thus minimizing processing steps, material loss and sample variation. Cell enumeration can give an indication of potential anti-proliferative effects of a compound and allows for correction of protein production to account for the variance in cell numbers due to that compound. We also demonstrated its capability to discern the reduction of deposited collagen by known and novel inhibitors targeting various points of the collagen biosynthesis pathway, from the epigenetic to the extracellular level, including two C-proteinase inhibitors: FSY002, originally developed in a German pharmaceutical company, and PCP56 from Pfizer (NY, USA). PCP56 demonstrated efficacy in agreement with an earlier report on the abridged fibroplasia model, but FSY002 did not work in our system. FSY002 has an interesting history. This phosphinate inhibitor did inhibit purified C-proteinase in the test tube and an IC50 value could be obtained. However, tests in conventional monolayer fibroblast culture proved inconclusive, and we have to conclude in hindsight that this was due to tardy C-proteinase activity. Running this substance in our system, with fully active C-proteinase activity, revealed its ineffectiveness. At this point in time, optical analysis is unable to discern a reduction in collagen cross-links if there is no net collagen reduction on the cell layer. Hence, the testing of lysyl oxidase inhibitors will be better analysed by the biochemical analysis method employing pepsin digestion of cell layers followed by SDS-PAGE and silver staining to visualize collagen cross-links (Figure 2A).
The combination of short culture time, rapid collagen biosynthesis and complete deposition with an optical analysis that circumvents the need for protein extraction makes the Scar-in-a-Jar a convenient assay that remains in a single well from start to finish.
Future developments
Besides the current use of fibroblasts, this system has the flexibility and potential to screen the effect of potential antifibrotic compounds on other organ-specific culprits such as hepatic stellate cells. The optical accessibility of the Scar-in-a-Jar leaves room for the introduction of one or more additional cell types, like monocytes or smooth muscle cells, to augment the fibrotic context and tailor it for a dermal wound healing situation or an atherosclerotic plaque. This could be done in direct coculture or using inserts, with the fibrotic target cells adhering to the bottom of the well. Besides preliminary testing of exogenously added matrix metalloproteinases, the current system has not yet been fathomed for its ability to study collagen turnover by matrix metalloproteinases, which would be relevant for analysing the effects of collagen cross-link inhibitors and inducers of matrix metalloproteinases like hepatocyte growth factor. We envision longer observation times in this instance, or challenging cell cultures after an initial ECM build-up with additional cell types or substances that induce remodelling. A very promising line of research using this system could be non-enzymatic glycation studies to capture a diabetic situation and testing breakers of advanced glycation end products.
Conclusions
With the current burden of fibrosis worldwide and acquired connective tissue disorders as seen in diabetes, the development of in vitro test systems that mimic fibrosis and connective tissue formation is needed more than ever. The lack of adequate systems to study fibrogenesis in vitro has either impeded the development of antifibrotics or has forced research and development to move too early into animal models. Although in vivo evaluation is pivotal for preclinical development, a rapid in vitro quantitative screening method is mandatory for the first-round determination of potential antifibrotic compounds, since any animal model represents the same complexity as a human, and data that are unclear in vitro are unlikely to become clearer in vivo. As there are several key points along the collagen biosynthesis pathway that can be interfered with, it is necessary to emulate all of these steps in vitro and, ideally, to get high content information out of a single well. This problem has finally been solved with the Scar-in-a-Jar. However, we look forward with great interest to seeing refinements and further developments of antifibrotics screening in the quest for the most potent and versatile antifibrotic compound.
|
/**
 * Old version of converting a CIDR prefix length to its dotted binary
 * subnet mask (e.g. "/24" -> "11111111.11111111.11111111.00000000").
 *
 * @param ipCidr an address in CIDR notation, e.g. "192.168.0.0/24"
 * @return the subnet mask as four dot-separated binary octets
 */
private static String convertToBinaryUsingTurnedOnBits(String ipCidr) {
    String afterSlash = ipCidr.split("/")[1];
    int numberOfTurnedOnBits = Integer.parseInt(afterSlash);
    StringBuilder binaryConvertedOctetsBuilder = new StringBuilder();
    for (int i = 1; i <= 32; i++) {
        // The first `numberOfTurnedOnBits` bits are set, the rest cleared.
        binaryConvertedOctetsBuilder.append(i <= numberOfTurnedOnBits ? "1" : "0");
        // Separate the four octets with dots (no dot after the last octet).
        if (i == 8 || i == 16 || i == 24) {
            binaryConvertedOctetsBuilder.append(".");
        }
    }
    return binaryConvertedOctetsBuilder.toString();
} |
import numpy as np


def correct_ceil(obj, fill_value=1e-7, var_name='backscatter'):
    """Clamp non-positive values of a variable to a small fill value,
    convert the variable to log10 space, and update its units attribute."""
    data = obj[var_name].data
    # log10 is undefined for values <= 0, so clamp them to a small positive number.
    data[data <= 0] = fill_value
    data = np.log10(data)
    obj[var_name].values = data
    if 'units' in obj[var_name].attrs:
        obj[var_name].attrs['units'] = 'log(' + obj[var_name].attrs['units'] + ')'
    else:
        obj[var_name].attrs['units'] = 'log(unknown)'
    return obj |
package banking
import (
bankingtypes "git.ooo.ua/vipcoin/chain/x/banking/types"
"git.ooo.ua/vipcoin/lib/filter"
"github.com/forbole/bdjuno/v2/database/types"
)
// SaveMsgSystemRewardTransfers - saves transfers to the "vipcoin_chain_banking_system_reward_transfer" table
func (r Repository) SaveMsgSystemRewardTransfers(transfers ...*bankingtypes.MsgSystemRewardTransfer) error {
if len(transfers) == 0 {
return nil
}
query := `INSERT INTO vipcoin_chain_banking_system_reward_transfer
(creator, wallet_from, wallet_to, asset, amount, extras)
VALUES
(:creator, :wallet_from, :wallet_to, :asset, :amount, :extras)`
if _, err := r.db.NamedExec(query, toMsgSystemRewardTransfersDatabase(transfers...)); err != nil {
return err
}
return nil
}
// GetMsgSystemRewardTransfers - gets transfers from the "vipcoin_chain_banking_system_reward_transfer" table
func (r Repository) GetMsgSystemRewardTransfers(filter filter.Filter) ([]*bankingtypes.MsgSystemRewardTransfer, error) {
query, args := filter.Build(
tableMsgSystemRewardTransfer,
types.FieldCreator, types.FieldWalletFrom, types.FieldWalletTo,
types.FieldAsset, types.FieldAmount, types.FieldExtras,
)
var result []types.DBSystemRewardTransfer
if err := r.db.Select(&result, query, args...); err != nil {
return []*bankingtypes.MsgSystemRewardTransfer{}, err
}
transfers := make([]*bankingtypes.MsgSystemRewardTransfer, 0, len(result))
for _, transfer := range result {
transfers = append(transfers, toMsgSystemRewardTransferDomain(transfer))
}
return transfers, nil
}
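// A minimal usage sketch (hypothetical; "repo" and "f" stand in for a
// configured Repository and a filter.Filter value from the imported package):
//
//	transfers, err := repo.GetMsgSystemRewardTransfers(f)
//	if err != nil {
//		// handle the error
//	}
//	_ = transfers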
|
package apu
import (
"log"
"github.com/mtojek/nes-emulator/bus"
)
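// NTSC CPU clock rate in Hz; the APU frame counter ticks at 240 Hz.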
const cpuFrequency = 1789773
const frameCounterRate = cpuFrequency / 240.0
var lengthTable = []byte{
10, 254, 20, 2, 40, 4, 80, 6, 160, 8, 60, 10, 14, 12, 26, 14,
12, 16, 24, 18, 48, 20, 96, 22, 192, 24, 72, 26, 16, 28, 32, 30,
}
var dutyTable = [][]byte{
{0, 1, 0, 0, 0, 0, 0, 0},
{0, 1, 1, 0, 0, 0, 0, 0},
{0, 1, 1, 1, 1, 0, 0, 0},
{1, 0, 0, 1, 1, 1, 1, 1},
}
var pulseTable [31]float32
var tndTable [203]float32
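// init precomputes the non-linear mixer lookup tables using the canonical
// NES APU mixing approximations: 95.52/(8128/n + 100) for the two pulse
// channels and 163.67/(24329/n + 100) for triangle, noise, and DMC.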
func init() {
for i := 0; i < 31; i++ {
pulseTable[i] = 95.52 / (8128.0/float32(i) + 100)
}
for i := 0; i < 203; i++ {
tndTable[i] = 163.67 / (24329.0/float32(i) + 100)
}
}
type APU2303 struct {
TriggerIRQ bool
dmcModer dmcModer
channel chan float32
pulse1 *Pulse
pulse2 *Pulse
noise *Noise
triangle *Triangle
dmc *DMC
framePeriod byte
frameValue byte
frameIRQ bool
filterChain FilterChain
sampleRate float64
cycle uint64
}
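// Compile-time check that *APU2303 satisfies bus.ReadableWriteable.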
var _ bus.ReadableWriteable = new(APU2303)
type dmcModer interface {
DMCMode()
}
func Create(cpuBus bus.ReadableWriteable, dmcModer dmcModer) *APU2303 {
return &APU2303{
channel: make(chan float32, 44100),
dmcModer: dmcModer,
		sampleRate: float64(cpuFrequency) / 48000, // CPU cycles per audio sample at 48 kHz
filterChain: FilterChain{
HighPassFilter(48000, 90),
HighPassFilter(48000, 440),
LowPassFilter(48000, 14000),
},
pulse1: &Pulse{
channel: 1,
},
pulse2: &Pulse{
channel: 2,
},
noise: &Noise{
shiftRegister: 1,
},
triangle: new(Triangle),
dmc: &DMC{
cpuBus: cpuBus,
dmcModer: dmcModer,
},
}
}
func (apu *APU2303) Read(addr uint16) uint8 {
switch addr {
case 0x4015:
return apu.readStatus()
}
log.Printf("APU: read from unmapped address %04X\n", addr)
return 0
}
func (apu *APU2303) Write(addr uint16, value uint8) {
switch addr {
case 0x4000:
apu.pulse1.writeControl(value)
case 0x4001:
apu.pulse1.writeSweep(value)
case 0x4002:
apu.pulse1.writeTimerLow(value)
case 0x4003:
apu.pulse1.writeTimerHigh(value)
case 0x4004:
apu.pulse2.writeControl(value)
case 0x4005:
apu.pulse2.writeSweep(value)
case 0x4006:
apu.pulse2.writeTimerLow(value)
case 0x4007:
apu.pulse2.writeTimerHigh(value)
	case 0x4008:
		apu.triangle.writeControl(value)
	case 0x4009:
		// unused register
	case 0x400A:
		apu.triangle.writeTimerLow(value)
	case 0x400B:
		apu.triangle.writeTimerHigh(value)
	case 0x400C:
		apu.noise.writeControl(value)
	case 0x400D:
		// unused register
	case 0x400E:
		apu.noise.writePeriod(value)
	case 0x400F:
		apu.noise.writeLength(value)
	case 0x4010:
		apu.dmc.writeControl(value)
	case 0x4011:
		apu.dmc.writeValue(value)
	case 0x4012:
		apu.dmc.writeAddress(value)
	case 0x4013:
		apu.dmc.writeLength(value)
case 0x4015:
apu.writeControl(value)
case 0x4017:
apu.writeFrameCounter(value)
}
}
func (apu *APU2303) writeControl(value uint8) {
apu.pulse1.enabled = value&1 == 1
apu.pulse2.enabled = value&2 == 2
apu.triangle.enabled = value&4 == 4
apu.noise.enabled = value&8 == 8
apu.dmc.enabled = value&16 == 16
if !apu.pulse1.enabled {
apu.pulse1.lengthValue = 0
}
if !apu.pulse2.enabled {
apu.pulse2.lengthValue = 0
}
if !apu.triangle.enabled {
apu.triangle.lengthValue = 0
}
if !apu.noise.enabled {
apu.noise.lengthValue = 0
}
if !apu.dmc.enabled {
apu.dmc.currentLength = 0
} else {
if apu.dmc.currentLength == 0 {
apu.dmc.restart()
}
}
}
func (apu *APU2303) writeFrameCounter(value byte) {
apu.framePeriod = 4 + (value>>7)&1
apu.frameIRQ = (value>>6)&1 == 0
// apu.frameValue = 0
if apu.framePeriod == 5 {
apu.stepEnvelope()
apu.stepSweep()
apu.stepLength()
}
}
func (apu *APU2303) readStatus() uint8 {
var result byte
if apu.pulse1.lengthValue > 0 {
result |= 1
}
if apu.pulse2.lengthValue > 0 {
result |= 2
}
if apu.triangle.lengthValue > 0 {
result |= 4
}
if apu.noise.lengthValue > 0 {
result |= 8
}
if apu.dmc.currentLength > 0 {
result |= 16
}
return result
}
func (apu *APU2303) stepEnvelope() {
apu.pulse1.stepEnvelope()
apu.pulse2.stepEnvelope()
apu.triangle.stepCounter()
apu.noise.stepEnvelope()
}
func (apu *APU2303) stepSweep() {
apu.pulse1.stepSweep()
apu.pulse2.stepSweep()
}
func (apu *APU2303) stepLength() {
apu.pulse1.stepLength()
apu.pulse2.stepLength()
apu.triangle.stepLength()
apu.noise.stepLength()
}
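// Clock advances the APU by one CPU cycle: it steps the channel timers and,
// by comparing consecutive cycle counts against the frame-counter rate and
// the sample rate, fires frame-sequencer steps and emits audio samples at
// the correct moments.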
func (apu *APU2303) Clock() {
cycle1 := apu.cycle
apu.cycle++
cycle2 := apu.cycle
apu.stepTimer()
f1 := int(float64(cycle1) / frameCounterRate)
f2 := int(float64(cycle2) / frameCounterRate)
if f1 != f2 {
apu.stepFrameCounter()
}
s1 := int(float64(cycle1) / apu.sampleRate)
s2 := int(float64(cycle2) / apu.sampleRate)
if s1 != s2 {
apu.sendSample()
}
}
func (apu *APU2303) stepTimer() {
if apu.cycle%2 == 0 {
apu.pulse1.stepTimer()
apu.pulse2.stepTimer()
apu.noise.stepTimer()
apu.dmc.stepTimer()
}
apu.triangle.stepTimer()
}
func (apu *APU2303) stepFrameCounter() {
switch apu.framePeriod {
case 4:
apu.frameValue = (apu.frameValue + 1) % 4
switch apu.frameValue {
case 0, 2:
apu.stepEnvelope()
case 1:
apu.stepEnvelope()
apu.stepSweep()
apu.stepLength()
case 3:
apu.stepEnvelope()
apu.stepSweep()
apu.stepLength()
apu.fireIRQ()
}
case 5:
apu.frameValue = (apu.frameValue + 1) % 5
switch apu.frameValue {
case 0, 2:
apu.stepEnvelope()
case 1, 3:
apu.stepEnvelope()
apu.stepSweep()
apu.stepLength()
}
}
}
func (apu *APU2303) fireIRQ() {
if apu.frameIRQ {
apu.TriggerIRQ = true
}
}
func (apu *APU2303) sendSample() {
output := apu.filterChain.Step(apu.output())
select {
case apu.channel <- output:
default:
}
}
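// output mixes the five channel levels through the precomputed non-linear
// lookup tables (see init); 3*t + 2*n + d is the standard tnd index weighting.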
func (apu *APU2303) output() float32 {
p1 := apu.pulse1.output()
p2 := apu.pulse2.output()
t := apu.triangle.output()
n := apu.noise.output()
d := apu.dmc.output()
pulseOut := pulseTable[p1+p2]
tndOut := tndTable[3*t+2*n+d]
return pulseOut + tndOut
}
func (apu *APU2303) AudioBuffer() chan float32 {
return apu.channel
}
|
/* Build the RB tree corresponding to the VMA list. */
void build_mmap_rb(struct mm_struct * mm)
{
struct vm_area_struct * vma;
rb_node_t ** rb_link, * rb_parent;
mm->mm_rb = RB_ROOT;
rb_link = &mm->mm_rb.rb_node;
rb_parent = NULL;
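	/* Each VMA is linked as the right child of its predecessor;
	 * __vma_link_rb performs the actual rb-tree insertion. */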
for (vma = mm->mmap; vma; vma = vma->vm_next) {
__vma_link_rb(mm, vma, rb_link, rb_parent);
rb_parent = &vma->vm_rb;
rb_link = &rb_parent->rb_right;
}
} |
#include<bits/stdc++.h>
#define ll long long
#define INF 0x3f3f3f3f
using namespace std;
const int MAXN=200005;
int n,m;
int dis[MAXN][55];
bool vis[MAXN][55];
struct node{
int to,w;
};
vector<node>g[MAXN];
struct pos{
int u,pre,d;
bool operator < (const pos & p) const {
return p.d<d;
}
};
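// State (u, pre): pre == 0 means no half-finished pair of edges at node u;
// pre == w > 0 means the first edge of a pair, of weight w, has been taken.
// Completing the pair with a second edge of weight w2 costs (pre + w2)^2.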
void Dijkstra(int s)
{
priority_queue<pos>que;
dis[s][0]=0;
que.push((pos){s,0,0});
while(!que.empty()){
pos now=que.top();
que.pop();
if(vis[now.u][now.pre])continue;
vis[now.u][now.pre]=true;
for(int i=0;i<g[now.u].size();i++){
int to=g[now.u][i].to,w=g[now.u][i].w;
if(now.pre==0){
if(dis[to][w]>dis[now.u][now.pre]){
dis[to][w]=dis[now.u][now.pre];
que.push((pos){to,w,dis[to][w]});
}
}else{
if(dis[to][0]>dis[now.u][now.pre]+(now.pre+w)*(now.pre+w)){
dis[to][0]=dis[now.u][now.pre]+(now.pre+w)*(now.pre+w);
//cout<<dis[to][0]<<endl;
que.push((pos){to,0,dis[to][0]});
}
}
}
}
}
int main()
{
memset(dis,INF,sizeof dis);
memset(vis,false,sizeof vis);
scanf("%d%d",&n,&m);
for(int i=1;i<=m;i++){
int u,v,w;
scanf("%d%d%d",&u,&v,&w);
g[u].push_back((node){v,w});
g[v].push_back((node){u,w});
}
Dijkstra(1);
for(int i=1;i<=n;i++){
//cout<<INF<<" "<<dis[i][0]<<endl;
if(dis[i][0]==INF){
printf("-1");
}else{
printf("%d",dis[i][0]);
}
printf("%c"," \n"[i==n]);
}
}
// dis[i][j]: shortest distance from node 1 to node i, where j is the weight of
// the pending first edge of the current pair (j == 0 means no edge is pending)
<reponame>jayserdny/utopian-api-types
/**
* Typescript definition for utopian-api
*
* @author <NAME>
* https://github.com/jayserdny
*
*/
declare module "utopian-api" {
var utopian_api: UTOPIAN_API;
export = utopian_api;
}
declare class UTOPIAN_API {
    getModerators(): Promise<string[]>;
} |
package execute
import (
"context"
"sync"
)
// Executor knows how to limit the execution using different kind of execution workflows
// like worker pools.
// It also has different policies of how to work, for example waiting a time before
// erroring, or directly erroring.
type Executor interface {
// Execute will execute the received function and will return the
// result of the executed function, or reject error from the executor.
Execute(ctx context.Context, f func() error) error
WorkerPool
}
// WorkerPool maintains a worker pool what knows how to increase and decrease the worker pool.
type WorkerPool interface {
SetWorkerQuantity(quantity int)
}
// workerPool knows how to increase and decrease the current workers executing jobs.
// Its only objective is to set the desired number of concurrent execution flows.
type workerPool struct {
workerStoppers []chan struct{}
jobQueue chan func()
mu sync.Mutex
}
func newWorkerPool() workerPool {
return workerPool{
jobQueue: make(chan func()),
}
}
// SetWorkerQuantity knows how to increase or decrease the worker pool.
func (w *workerPool) SetWorkerQuantity(quantity int) {
if quantity < 0 {
return
}
w.mu.Lock()
defer w.mu.Unlock()
	// If we don't need to increase or decrease the worker quantity, then do nothing.
if len(w.workerStoppers) == quantity {
return
}
// If we have less workers then we need to add workers.
if len(w.workerStoppers) < quantity {
w.increaseWorkers(quantity - len(w.workerStoppers))
return
}
// If we reached here then we need to reduce workers.
w.decreaseWorkers(len(w.workerStoppers) - quantity)
}
func (w *workerPool) decreaseWorkers(workers int) {
// Stop the not needed workers.
toStop := w.workerStoppers[:workers]
for _, stopC := range toStop {
close(stopC)
}
// Set the new worker quantity.
w.workerStoppers = w.workerStoppers[workers:]
}
func (w *workerPool) increaseWorkers(workers int) {
for i := 0; i < workers; i++ {
// Create a channel to stop the worker.
stopC := make(chan struct{})
go w.newWorker(stopC)
w.workerStoppers = append(w.workerStoppers, stopC)
}
}
func (w *workerPool) newWorker(stopC chan struct{}) {
for {
select {
case <-stopC:
return
case f := <-w.jobQueue:
f()
}
}
}
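// A minimal usage sketch (hypothetical; assumes the calling code lives in this
// package, since workerPool and jobQueue are unexported, and doWork is a
// placeholder for the caller's job function):
//
//	pool := newWorkerPool()
//	pool.SetWorkerQuantity(4)            // start four workers
//	pool.jobQueue <- func() { doWork() } // submit a job
//	pool.SetWorkerQuantity(0)            // stop all workers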
|
// min-lldb-version: 310
// compile-flags:-C debuginfo=1
// gdb-command:run
// lldb-command:run
// Nothing to do here really, just make sure it compiles. See issue #8513.
fn main() {
let _ = ||();
let _ = (1_usize..3).map(|_| 5);
}
|
<reponame>niko-bikchen/Ribenteuer
package com.scriptizergs.ribenteuer.model.Location.Monsters.CastleDungeonMonsters;
public enum CastleMonsters {
FANTOM,
SKELETON
}
|
#include "ctranslate2/layers/common.h"
#include <cmath>
#include "../device_dispatch.h"
namespace ctranslate2 {
namespace layers {
Embeddings::Embeddings(const models::Model& model, const std::string& scope)
: _embeddings(model.get_variable(scope + "/weight"))
, _qscale(model.get_variable_if_exists(scope + "/weight_scale"))
, _scale(model.get_flag_with_default(scope + "/multiply_by_sqrt_depth", true)
? new StorageView(static_cast<float>(sqrt(_embeddings.dim(-1))))
: nullptr) {
}
void Embeddings::operator()(const StorageView& ids,
StorageView& output) {
PROFILE("Embeddings");
if (_embeddings.dtype() == DataType::INT16 || _embeddings.dtype() == DataType::INT8) {
const auto device = output.device();
StorageView gathered(_embeddings.dtype(), device);
_gather_op(_embeddings, ids, gathered);
if (_qscale->is_scalar())
ops::Dequantize()(gathered, *_qscale, output);
else {
StorageView scale(_qscale->dtype(), device);
_gather_op(*_qscale, ids, scale);
ops::Dequantize()(gathered, scale, output);
}
} else {
_gather_op(_embeddings, ids, output);
}
if (_scale)
ops::Mul()(output, *_scale, output);
}
static bool should_shift_input_to_u8(Device device, DataType dtype) {
// If the target Gemm implementation prefers the u8s8s32 format, we can shift
// the input to the u8 domain and add a compensation term.
return (device == Device::CPU
&& dtype == DataType::INT8
&& primitives<Device::CPU>::prefer_u8s8s32_gemm());
}
static StorageView* compute_u8_compensation(const StorageView& weight) {
// The compensation term for the shifted input only depends on the weight, so
// we can compute it once.
const dim_t k = weight.dim(1);
const dim_t n = weight.dim(0);
auto* compensation = new StorageView({n}, DataType::INT32);
primitives<Device::CPU>::compute_u8_compensation(weight.data<int8_t>(),
/*transpose=*/true,
k, n,
/*alpha=*/1,
compensation->data<int32_t>());
return compensation;
}
Dense::Dense(const models::Model& model, const std::string& scope)
: _weight(model.get_variable(scope + "/weight"))
, _bias(model.get_variable_if_exists(scope + "/bias"))
, _qscale(model.get_variable_if_exists(scope + "/weight_scale"))
, _partial_weight(_weight.device(), _weight.dtype())
, _partial_bias(_weight.device(), DataType::FLOAT)
, _partial_qscale(_weight.device())
, _gemm_op(1, 0, false, true)
, _u8_quantization_shift(should_shift_input_to_u8(_weight.device(), _weight.dtype())
? 128 : 0)
, _u8_shift_compensation(_u8_quantization_shift != 0
? compute_u8_compensation(_weight) : nullptr) {
}
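    // mask_weights restricts the layer to a subset of output rows (for example
    // a reduced target vocabulary) by gathering the selected rows of the
    // weight, bias, and quantization scale; reset_mask restores the full layer.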
void Dense::mask_weights(const StorageView& index) {
ops::Gather()(_weight, index, _partial_weight);
if (_u8_shift_compensation)
_u8_shift_compensation.reset(compute_u8_compensation(_partial_weight));
if (_bias)
ops::Gather()(*_bias, index, _partial_bias);
if (_qscale && !_qscale->is_scalar())
ops::Gather()(*_qscale, index, _partial_qscale);
}
void Dense::reset_mask() {
_partial_weight.clear();
_partial_bias.clear();
_partial_qscale.clear();
}
void Dense::operator()(const StorageView& input, StorageView& output) {
PROFILE("Dense");
const StorageView* qscale = _partial_qscale.empty() ? _qscale : &_partial_qscale;
const StorageView* weight = _partial_weight.empty() ? &_weight : &_partial_weight;
const StorageView* bias = _partial_bias.empty() ? _bias : &_partial_bias;
if (_weight.dtype() == DataType::INT16 || _weight.dtype() == DataType::INT8) {
const auto device = input.device();
StorageView qinput(_weight.dtype(), device);
StorageView qinput_scale(_qscale->dtype(), device);
StorageView qoutput(DataType::INT32, device);
ops::Quantize()(input, qinput, qinput_scale, _u8_quantization_shift);
_gemm_op(qinput, *weight, qoutput, _u8_shift_compensation.get());
ops::Dequantize()(qoutput, qinput_scale, *qscale, output);
} else {
_gemm_op(input, *weight, output);
}
if (bias) {
DEVICE_DISPATCH(output.device(),
primitives<D>::add_batch_broadcast(bias->data<float>(),
output.data<float>(),
bias->size(),
output.size()));
}
}
LayerNorm::LayerNorm(const models::Model& model, const std::string& scope)
: _beta(model.get_variable(scope + "/beta"))
, _gamma(model.get_variable(scope + "/gamma")) {
}
void LayerNorm::operator()(const StorageView& input, StorageView& output) {
_norm_op(_beta, _gamma, input, output);
}
}
}
|
<reponame>shanghuiyang/rpi-devices<gh_stars>10-100
package main
import (
"testing"
"github.com/stretchr/testify/assert"
)
func TestStart(t *testing.T) {
monitor := cpuMonitor{}
assert.NotNil(t, monitor)
}
func TestUsage(t *testing.T) {
	cpuinfo := `
top - 20:04:01 up 9 min, 2 users, load average: 0.22, 0.22, 0.18
Tasks: 72 total, 1 running, 71 sleeping, 0 stopped, 0 zombie
%Cpu(s): 2.0 us, 2.0 sy, 0.0 ni, 96.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
MiB Mem : 432.7 total, 330.8 free, 34.7 used, 67.2 buff/cache
MiB Swap: 100.0 total, 100.0 free, 0.0 used. 347.1 avail Mem
`
monitor := cpuMonitor{}
assert.NotNil(t, monitor)
	usage, err := monitor.usage(cpuinfo)
assert.NoError(t, err)
assert.InDelta(t, 4.0, usage, 1e-9)
}
|
package apig;
public class Palindrome {
public static void main(String[] args) {
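        // Note: run the JVM with -ea so these assertions are actually checked.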
Palindrome solver = new Palindrome();
assert solver.isPalindrome("madam");
assert solver.isPalindrome("abccba");
assert !solver.isPalindrome("abcd");
}
public boolean isPalindrome(String in) {
if (in == null ) {
return false;
}
int s = 0;
int e = in.length() - 1;
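        // Two-pointer scan: compare characters from both ends, moving inward.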
while (s < e) {
if (in.charAt(s++) != in.charAt(e--)) {
return false;
}
}
return true;
}
}
|