import csv

def csv_template(folder, outputfile):
    # Append a header row plus one row per feature class described in the folder.
    with open(outputfile, "a", newline="") as f:
        print("\nCreating template csv file...\n")
        w = csv.writer(f)
        header = ("Feature_Dataset", "Feature_Class", "Shape_Type", "Populated",
                  "Feature_Count", "Workspace", "Spatial_Reference", "Editor")
        w.writerow(header)
        rows = describe_data(folder)  # describe_data() is expected to be defined elsewhere in this module
        w.writerows(rows)
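A hypothetical call, assuming describe_data() exists in the same module and yields one row per feature class (the folder and file names below are placeholders):
if __name__ == "__main__":
    csv_template(r"C:\gis\workspace", "feature_inventory.csv") |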
Correlation between Hamstring Flexor Power Restoration and Functional Performance Test: 2-Year Follow-Up after ACL Reconstruction Using Hamstring Autograft
Purpose: To evaluate the restoration of flexor power and the correlation between flexor power and functional performance tests (FPTs) after anterior cruciate ligament (ACL) reconstruction with hamstring autograft. Materials and Methods: Twenty-three men who underwent ACL reconstruction with hamstring autograft were evaluated using the Lysholm score, subjective IKDC score, Tegner activity score, isokinetic flexion and hyperflexion power tests, and the FPTs at the 1- and 2-year follow-ups. We analyzed the mean change from 1 to 2 years and the correlation of both the flexion and hyperflexion power deficits with the FPTs at each follow-up. Results: The mean age of the patients was 30.9 years (range, 19 to 44 years). The Tegner activity score increased significantly from 5.7 to 6.3 (p=0.010). The hyperflexion power deficit of the involved knee decreased significantly at the 2-year follow-up compared with the 1-year follow-up (p<0.001). There was a correlation between the flexor power deficit and the co-contraction, carioca, and involved one-legged hop tests at each follow-up. However, no significant correlations were found between the hyperflexion power deficit and the FPTs. Conclusions: The hyperflexion power deficit after ACL reconstruction with hamstring autograft decreased at the 2-year follow-up compared with the 1-year follow-up and did not affect the results of the FPTs.
Introduction
The importance of the hamstrings to the stability of the anterior cruciate ligament (ACL) during sports activities has been documented in various studies 1,2) . Unfortunately, a knee flexor strength deficit assessed in the sitting position can persist after ACL
at 1 year and 2 years after surgery and analyzed the correlations with the functional performance tests (FPTs). Our hypothesis was that the flexion deficit rate would not change between 1 year and 2 years after surgery but would correlate with knee function, whereas the hyperflexion deficit rate would change between 1 year and 2 years after surgery but would have no influence on knee function.
Materials and Methods
Of the patients who underwent ACL reconstruction using hamstring (semitendinosus and gracilis tendon) autograft between March 2006 and September 2010, 23 patients who were available for subjective tests, muscle strength tests, and FPTs at 1 year and 2 years after surgery were included in this study. All the patients were young, active males who presented with an isolated ACL injury, a combined meniscus injury that required partial meniscectomy or surgical repair, or a combined medial collateral ligament injury that required non-surgical treatment. Patients with other combined injuries were excluded from the study. The mean interval from injury to surgery was 24.2±12.4 days. Female patients were excluded from the study for the following reasons: 1) direct comparison with males was impossible due to the significant intergender difference in muscle volume increase during muscle strength training 8) ; and 2) female patients have relatively low levels of activity, which makes mean value calculation difficult.
Surgical Technique
An autologous quadrupled hamstring tendon graft was used in all cases. The femoral tunnel was drilled at the 10-o'clock position for the right knee and the 2-o'clock position for the left knee within the femoral notch. Femoral fixation was performed with bioabsorbable cross pins. For tibial fixation, bioabsorbable interference screws were used together with staples or cortical screws and washers. All the operations were performed by one surgeon.
Assessment Methods
For subjective assessment, Lysholm knee score, International Knee Documentation Committee (IKDC) subjective knee score, and Tegner activity score were used. Muscle strength testing was performed in the sitting position and in the prone position. The FPTs included co-contraction test, shuttle run test, carioca test, and one leg hop test.
1) Isokinetic strength test
Muscle strength was measured using a Biodex dynamometer (Biodex Corp., Shirley, NY, USA) with the patient in a sitting position. The lateral femoral condyle was aligned with the rotational axis of the dynamometer. The knee flexion peak torque was determined from 4 trials performed at 90° at an angular velocity of 60°/sec, with full extension considered as 0°. For the assessment of hyperflexion strength, peak torque was measured between 60° and 120° at an angular velocity of 60°/sec in the prone position and determined from 4 trials (Fig. 1). The interlimb difference was recorded as a percentage to obtain the flexion deficit rate.
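The exact formula for this percentage is not stated in the text; a common definition of such an interlimb deficit, given here only as an assumed illustration (in LaTeX notation), is
\text{deficit rate (\%)} = \frac{PT_{\text{uninvolved}} - PT_{\text{involved}}}{PT_{\text{uninvolved}}} \times 100
where PT denotes the peak flexion (or hyperflexion) torque of each limb.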
2) Co-contraction test
The co-contraction test was designed to reproduce rotational forces at the knee that necessitate tibial translation and counter-contraction of the femoral muscles. The test was performed with a Velcro belt secured around the patient's waist. The belt was attached to a rubber tube with a length of 122 cm (48 inches) and a diameter of 2.54 cm (1 inch). The tube was secured on a wall 154 cm (60 inches) above the floor. A semicircle with a radius of 244 cm (96 inches) was painted on the floor. The patient was asked to run along the semicircular line 5 times and the time to completion was measured (Fig. 2).
3) Shuttle run test
The shuttle run test was designed to reproduce acceleration and deceleration forces that are common in sports activities. The patient ran back and forth on a 6.1-meter course twice for a total of 24.4 m and the fastest speed was recorded (Fig. 3).
4) Carioca test
The carioca test was designed to reproduce the pivot shift movement in ACL-deficient patients. The patient was asked to run laterally using a cross-over step on a 12-meter course from left to right and then in a reverse direction. The fastest speed was recorded (Fig. 4).
5) One leg hop test
The one leg hop test was designed to assess dynamic stability of the ACL-deficient knee. The patient was asked to jump on one leg as far as possible, and the hop distance was measured.
Statistical Analysis
All the data were analyzed using SPSS/PC ver. 18.0 (IBM, Armonk, NY, USA) for Windows to obtain means and standard deviations. Subjective test results were analyzed using nonparametric Wilcoxon signed-rank tests. The FPT results and flexion strength at 1 year and 2 years after surgery were compared using parametric paired t-tests. Correlations between the flexion strength deficit, hyperflexion strength deficit, and FPTs were determined using Pearson's correlation coefficient.
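The paper does not include analysis code; as a rough illustration only (not the authors' implementation), the paired comparisons and correlations described above could be run with SciPy along these lines, with all function and variable names hypothetical:
from scipy import stats

def compare_followups(values_1yr, values_2yr, parametric=True):
    # Paired comparison of 1-year vs. 2-year follow-up values:
    # paired t-test for strength/FPT results, Wilcoxon signed-rank for ordinal scores.
    if parametric:
        return stats.ttest_rel(values_1yr, values_2yr)
    return stats.wilcoxon(values_1yr, values_2yr)

def deficit_fpt_correlation(deficit_pct, fpt_results):
    # Pearson correlation between a strength deficit (%) and an FPT result.
    return stats.pearsonr(deficit_pct, fpt_results)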
Results
The data from a total of 23 patients were analyzed. The mean age of the patients was 30.9 years (range, 19 to 44 years). Their mean height and mean weight were 171.7 cm and 77.3 kg, respectively (Table 1). On the subjective tests, a statistically significant change was found in the Tegner activity score only, from 5.7±1.1 at 1 year after surgery to 6.4±1.3 at 2 years after surgery (p=0.010) (Table 2). On the FPTs, the shuttle run test results were slightly worse at 2 years after surgery. The co-contraction test, carioca test, and one leg hop test results were slightly improved at 2 years after surgery, but no statistically significant difference was noted (Table 3). No notable change was observed between the two follow-up assessments in the flexion strength deficit measured in a sitting position. On the hyperflexion strength test performed in the prone position, the muscle strength of the affected leg was significantly improved from 47±12 Nm at 1 year after surgery to 54±13 Nm at 2 years after surgery (p=0.022). The hyperflexion deficit rate was significantly decreased from 25%±13% to 12%±12% (p<0.001) (Table 4).
Significant correlations were found between the flexion deficit rate and the FPTs at 1 year and 2 years after the surgery. The 1 year and 2 years correlation coefficients were r=0.401 and r=0.432, respectively, for the co-contraction test; r=0.442 and r=0.451, respectively, for the carioca test; and r=-0.312 and r=-0.354, respectively, for the one leg hop test (Table 5). No significant correlations were found between the hyperflexion deficit rate and the FPTs, both at 1 year and 2 years after surgery (Table 6).
Discussion
The purpose of this study was to assess hamstring tendon regeneration following harvest for ACL reconstruction and to investigate possible correlations between the recovery of muscle strength and the FPTs. The hyperflexion deficit rate was improved between 1 year after surgery and 2 years after surgery; the improvement in the Tegner activity score at 2 years after surgery was statistically significant. However, the subjective knee score was not improved and no statistically significant change was observed in the FPTs. The flexion and hyperflexion strength deficit and regeneration of the hamstring tendon after harvest have not been clearly elucidated. Eriksson et al. 9) reported that magnetic resonance imaging (MRI) performed at 6 to 12 months after ACL reconstruction using a semitendinosus tendon graft showed a regeneration of the semitendinosus tendon with normal anatomical topographies to the level of the tibial plateau in 8 of the total 11 patients. Papandrea et al. 10) examined 40 patients who had undergone ACL reconstruction with semitendinosus and gracilis tendon autografts using ultrasound preoperatively and at 2 weeks and 1, 2, 3, 6, 12, 18, and 24 months postoperatively and observed that the anatomical structure of the donor site was similar to that of the normal tendon at 18 to 24 months postoperatively. Simonian et al. 11) evaluated 9 patients who had undergone ACL reconstruction with semitendinosus and gracilis tendon autografts using MRI for a minimum 3-year follow-up and concluded that tendon harvest did not significantly compromise function and strength in spite of the more proximal insertion of the tendons after harvest compared to the nonoperated side.
Keays et al. 12) found a muscle strength deficit in only 10% of 31 patients at 6 months after ACL reconstruction using semitendinosus tendon grafts. However, some recent studies have shown that a statistically significant muscle strength deficit can persist ≥2 years after ACL reconstruction. Ardern et al. 13) found an isokinetic muscle strength deficit of about 27% in the operated limb compared with the nonoperated limb in 50 patients with high sports activity after a mean follow-up of 32.5 months. In this study, we obtained more favorable results than the previous studies: the muscle strength deficit of the operated limb relative to the nonoperated limb was 12% at 1 year after surgery and 11% at 2 years after surgery. We attribute these findings to the rigorous rehabilitation performed at least once a week until 6 months after surgery.
Onishi et al. 14) investigated the relationship between the EMG activity of the four different hamstring muscles and the knee flexion angle and reported that the EMG activity of the semitendinosus increases at 70° and peaks between knee flexion angles of 90° and 105°. Tashiro et al. 15) reported that hamstring muscle weakness in the operated limb was observed at ≥70° of flexion in the sitting and prone positions in 85 patients who underwent ACL reconstruction using hamstring tendon grafts. Adachi et al. 16) found no significant side-to-side difference in flexion strength in 58 patients who underwent ACL reconstruction using hamstring grafts, but the active knee flexion angle was significantly lower in the operated limb. Thus, they suggested that hyperflexion strength should be tested after ACL reconstruction using hamstring tendon grafts and that the indication should be determined with care in patients who perform sports activities requiring strong knee flexion at deep flexion angles. In this study, the mean hyperflexion strength deficit of the operated limb was 25.4%, significantly higher than that of the nonoperated limb at 1 year after surgery, but it decreased to 12% at 2 years after surgery, which was below the 15% level regarded as normal.
In this study, the subjective test results did not change significantly between 1 year and 2 years after surgery. This might be because the Lysholm knee score and IKDC subjective score were already close to normal at 1 year after surgery. Obtaining near-normal scores before sufficient functional recovery can be explained by a ceiling effect, meaning that the treatment was so effective or the test level was so low that all the patients could obtain high scores. This result may also indicate that knee scores are not helpful for assessing knee function in long-term follow-up. In contrast, the mean Tegner activity score improved significantly from 5.7 points at 1 year after surgery to 6.3 points at 2 years after surgery. Thus, we believe that the Tegner activity score can be useful for the long-term follow-up assessment of knee function. Beard et al. 17) and Ejerhed et al. 18) also reported that the mean Tegner activity score improved to 4.3-6.5 at 2 years after surgery. On the other hand, Ejerhed et al. 18) reported that the Tegner activity level was reduced by two to three units compared with the preinjury level. In our study, however, the mean Tegner activity level before injury was 6.4, and the value at 2 years after surgery was similar to the preinjury level. Keays et al. 19) followed 62 patients who underwent ACL reconstruction using patellar tendon grafts (n=31) or hamstring tendon grafts (n=31) and compared them with normal healthy controls at 6 years after surgery. In their study, no significant intergroup differences were found in muscle strength. In terms of the FPT results (shuttle run test, carioca test, and one leg hop test), the hamstring tendon graft group was similar to the normal group, whereas the patellar tendon graft group was significantly different from the normal group. In our previous study 20), we investigated the correlations among the FPTs (one leg hop test, co-contraction test, shuttle run test, and carioca test) in 40 active subjects who had a normal Tegner activity score of 6-7 points. The mean times for the co-contraction test, shuttle run test, and carioca test were 15.34 seconds, 7.67 seconds, and 8.47 seconds, respectively, and the mean hop distance was 157.8 cm for the dominant leg and 160.1 cm for the non-dominant leg. In the current study, the mean times for the co-contraction test, shuttle run test, and carioca test were 15.8 seconds, 8.0 seconds, and 9.48 seconds, respectively, at 1 year after surgery and 15.8 seconds, 8.1 seconds, and 9.3 seconds, respectively, at 2 years after surgery. The mean co-contraction test value at 2 years after surgery was similar to that of the normal group, whereas the mean times for the shuttle run test and carioca test at 2 years after surgery were longer by 0.5 seconds and 0.9 seconds, respectively. Compared with the normal values above, the mean hop distance at 2 years after surgery was close to normal: 164.2 cm for the nonoperated leg and 152.5 cm for the operated leg. Regarding the relationship between the flexion deficit rate in the sitting position and the FPTs, the flexion deficit rate was associated with the co-contraction test, carioca test, and one leg hop test of the operated limb. Viola et al. 
21) reported that ACL reconstruction using hamstring tendon grafts may result in a deficit in knee flexor strength and internal tibial rotation weakness, which could interfere with postoperative rehabilitation and potentially impair athletic performance. The correlation of the flexion deficit rate with the co-contraction and carioca tests might be attributable to the fact that both tests assess tibial rotation strength. However, they cautioned that, since the functional importance of flexor strength and internal rotation weakness has not yet been defined, the adverse impact on athletic performance remains unclear.
In our previous study 22), we evaluated correlations between hyperflexion deficit recovery and the Lysholm knee score, IKDC subjective score, Tegner activity score, KT-2000 measurements, hop test, and the three FPTs at 19 months after ACL reconstruction using hamstring autografts. There was no correlation between the hyperflexion deficit rate and the FPTs.
The weaknesses of this study include the small study population, the possibility of patient selection bias, and inclusion of male patients only. We believe further studies involving a larger study population and a long-term follow-up should be conducted to verify our results.
Conclusions
The hyperflexion deficit rate after ACL reconstruction with hamstring autografts was lower at 2 years after surgery than at 1 year after surgery and had no influence on the FPT results. |
// Client/game-box-web-ui/src/app/modules/user/+state/users.state.ts (repository with 1-10 GitHub stars)
import { IUsersListModel } from 'src/app/modules/user/models/users-list.model';
import { IAppState } from 'src/app/store/app.state';
export interface IState extends IAppState {
users: IUsersState;
}
export interface IUsersState {
all: IUsersListModel[];
}
|
Transmission Path Planning based on Improved Artificial Potential Field and Enhanced Ant Colony Algorithm
Using a geographic information system (GIS) as the information platform and the cost standard for typical 500 kV transmission lines of State Grid Corporation of China power transmission and transformation projects, the environmental influencing factors were clustered by principal component analysis. On this basis, the comprehensive cost of each unit grid cell is evaluated by a BP neural network to obtain the final comprehensive cost matrix. Based on the principle of minimizing construction cost, the transmission path through the grid region is searched with an ant colony algorithm. Considering the environmental constraints of the search area, cost compensation and avoidance/crossing mechanisms are set for regions with different levels of constraint to improve the environmental resultant force. An improved artificial potential field is introduced to estimate the initial direction of the ant colony path, and an optimal corner-processing mechanism is added to further reduce the comprehensive cost of transmission lines and improve the convergence speed of the algorithm.
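The abstract gives no implementation; as a rough sketch of just the core idea above, ant colony search for a low-cost path over a unit-grid comprehensive-cost matrix, the following Python fragment may help. The improved artificial potential field initialization, cost compensation/avoidance mechanisms, and corner optimization are omitted, and all names and parameter values are assumptions, not the paper's code:
import numpy as np

def aco_grid_path(cost, start, goal, n_ants=30, n_iter=50, alpha=1.0, beta=3.0, rho=0.3, seed=0):
    # Minimal ant colony search on a 2-D per-cell cost matrix; returns (best_path, best_cost).
    cost = np.asarray(cost, dtype=float)
    rng = np.random.default_rng(seed)
    h, w = cost.shape
    tau = np.ones((h, w))              # pheromone per cell
    eta = 1.0 / (cost + 1e-9)          # heuristic desirability = inverse cost
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    best_path, best_cost = None, np.inf
    for _ in range(n_iter):
        completed = []
        for _ in range(n_ants):
            pos, path, visited = start, [start], {start}
            while pos != goal and len(path) < h * w:
                cands = [(pos[0] + dr, pos[1] + dc) for dr, dc in moves]
                cands = [c for c in cands if 0 <= c[0] < h and 0 <= c[1] < w and c not in visited]
                if not cands:
                    break                                  # dead end: abandon this ant
                weights = np.array([tau[c] ** alpha * eta[c] ** beta for c in cands])
                pos = cands[rng.choice(len(cands), p=weights / weights.sum())]
                path.append(pos)
                visited.add(pos)
            if pos == goal:
                completed.append((path, sum(cost[c] for c in path)))
        tau *= (1.0 - rho)                                 # pheromone evaporation
        for path, c in completed:
            for cell in path:
                tau[cell] += 1.0 / (c + 1e-9)              # cheaper paths deposit more pheromone
            if c < best_cost:
                best_path, best_cost = path, c
    return best_path, best_cost |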
/**
* Copy the current immutable object by setting a value for the {@link PatternCoordinates#chrono()
* chrono} attribute. A value equality check is used to prevent copying of the same value by
* returning {@code this}.
*
* @param value A new value for chrono
* @return A modified copy of the {@code this} object
*/
public final ImmutablePatternCoordinates withChrono(Chrono value) {
if (this.chrono == value) return this;
Chrono newValue = Objects.requireNonNull(value, "chrono");
if (this.chrono.equals(newValue)) return this;
return new ImmutablePatternCoordinates(this, newValue, this.locale);
} |
import numpy as np

# `limit` is expected to be set at module level (maximum number of slices).
def add_nan(data, n_zeros=4):
    global limit
    slices = data.shape[0]
    print(slices)
    # Pad with zero rows up to `limit`, then mark zeros (including the padding) as NaN.
    zero_pad = np.zeros((limit - slices, n_zeros))
    padded_data = np.vstack([data, zero_pad])
    nan_data = np.where(padded_data == 0, np.nan, padded_data)
    return nan_data
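A hypothetical call, assuming the module-level limit has been set beforehand (the values below are placeholders):
limit = 6
arr = np.arange(1, 13, dtype=float).reshape(3, 4)  # 3 slices of 4 values, no zeros
padded = add_nan(arr)                               # shape (6, 4); the 3 padding rows become NaN |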
import styled from "@emotion/native";
import debounce from "lodash.debounce";
import React, { Component } from "react";
import { Dimensions, ScrollView, TextStyle, ViewStyle } from "react-native";
import { TIME_MODES, TIME_MODES_TO_TIME_UNITS } from "./constants";
import EventSelect from "./event-select";
import TimelineBody from "./timeline-body";
import { DateArgs, enumerateDatesBetweenDates } from "./utils";
export type Data = Array<{ props: ItemProps; subItems?: Array<ItemProps> }>;
export type Styles = { container?: ViewStyle; text?: TextStyle };
export type DatesFormat = Record<string, string>;
export type DateLinesStyles = { day?: ViewStyle; weekend?: ViewStyle; today?: ViewStyle };
export type ID = string | number;
export type EventsExpanding = Record<number, boolean>;
export type ModesToDayContainerSize = Record<string, number>;
export type EventsPosition = Record<ID, { top: number; left: number }>;
export type ScrollRef = null | ScrollView;
export type SelectProps = { [key: string]: any };
export type OnItemPress = (id: ID) => void;
export type Period = { startDate: DateArgs; endDate: DateArgs };
export { TIME_MODES };
const MODES_TO_DAY_CONTAINER_SIZE: ModesToDayContainerSize = {
[TIME_MODES.M]: 10,
[TIME_MODES.W]: 20,
[TIME_MODES.D]: 50,
};
const GAP_BETWEEN_EVENTS = 50;
export interface ItemProps {
startDate: DateArgs;
endDate: DateArgs;
title: string;
styles?: Styles;
id: ID;
}
interface Props {
defaultTimeMode?: string;
onMainItemPress?: OnItemPress;
onSubItemPress?: OnItemPress;
data: Data;
showSubItemsOnMainItemPress?: boolean;
period: Period;
useSelectForScrollingToItems?: boolean;
selectProps?: SelectProps;
useTapOnDatesToChangeTimeMode?: boolean;
useStickyItemsText?: boolean;
datesStyles?: Styles;
datesFormat?: DatesFormat;
dateLinesStyles?: DateLinesStyles;
modesToDayContainerSize?: ModesToDayContainerSize;
gapBetweenEvents?: number;
horizontal?: boolean;
}
interface State {
timeMode: string;
size: number;
eventsExpanding: EventsExpanding;
scrollPosition: number;
}
const TimelineContainer = styled.View`
display: flex;
flex: 1;
padding: 5px;
`;
class Timeline extends Component<Props, State> {
state = {
timeMode: this.props.defaultTimeMode || TIME_MODES.D,
size: Dimensions.get("window")[this.props.horizontal ? "height" : "width"],
eventsExpanding: {},
scrollPosition: 0,
};
eventsPositions: EventsPosition = {};
scrollYRef: ScrollRef = null;
scrollXRef: ScrollRef = null;
updateScrollPosition = (scrollPosition: number): void => {
const { useStickyItemsText = true } = this.props;
if (useStickyItemsText) {
this.setState({ scrollPosition });
}
};
debouncedScrollPositionUpdater = debounce(this.updateScrollPosition, 100);
setTimeMode = (timeMode: string): void => this.setState({ timeMode });
updateEventsExpanding = (isExpanded: boolean, index: number): void => {
this.setState({ eventsExpanding: { ...this.state.eventsExpanding, [index]: isExpanded } });
};
updateSize = (size: number): void => this.setState({ size });
updateTimeModeByDateTap = (): void => {
const { useTapOnDatesToChangeTimeMode = true } = this.props;
if (useTapOnDatesToChangeTimeMode) {
const { timeMode } = this.state;
let updatedTimeMode;
switch (timeMode) {
case TIME_MODES.M: {
updatedTimeMode = TIME_MODES.W;
break;
}
case TIME_MODES.W: {
updatedTimeMode = TIME_MODES.D;
break;
}
case TIME_MODES.D: {
updatedTimeMode = TIME_MODES.M;
break;
}
}
if (updatedTimeMode) {
this.setTimeMode(updatedTimeMode);
}
}
};
setEventsPosition = (eventsPositions: EventsPosition): void => {
this.eventsPositions = eventsPositions;
};
setScrollRef = (ref: ScrollRef, isX?: boolean): void => {
if (isX) {
this.scrollXRef = ref;
} else {
this.scrollYRef = ref;
}
};
onSelect = (id: ID): void => {
if (this.scrollXRef && this.scrollYRef) {
const { left, top } = this.eventsPositions[id];
const { gapBetweenEvents = GAP_BETWEEN_EVENTS, horizontal } = this.props;
this.debouncedScrollPositionUpdater.cancel();
this.updateScrollPosition(horizontal ? left : top);
this.scrollYRef.scrollResponderScrollTo({
y: top - (horizontal ? gapBetweenEvents : 0),
animated: true,
});
this.scrollXRef.scrollResponderScrollTo({
x: left - (horizontal ? 0 : gapBetweenEvents),
animated: true,
});
}
};
render() {
const {
onMainItemPress,
onSubItemPress,
data,
showSubItemsOnMainItemPress = true,
period,
useSelectForScrollingToItems = true,
datesFormat,
datesStyles,
dateLinesStyles,
modesToDayContainerSize = {},
gapBetweenEvents = GAP_BETWEEN_EVENTS,
selectProps = {},
horizontal,
} = this.props;
const { timeMode, size, eventsExpanding, scrollPosition } = this.state;
const dates = enumerateDatesBetweenDates(
period.startDate,
period.endDate,
TIME_MODES_TO_TIME_UNITS[timeMode],
);
return (
<TimelineContainer>
{useSelectForScrollingToItems && (
<EventSelect
items={data.map(({ props }) => ({ key: props.id, label: props.title }))}
onChange={this.onSelect}
{...selectProps}
/>
)}
<TimelineBody
timeMode={timeMode}
size={size}
eventsExpanding={eventsExpanding}
dates={dates}
data={data}
onMainItemPress={onMainItemPress}
onSubItemPress={onSubItemPress}
showSubItemsOnMainItemPress={showSubItemsOnMainItemPress}
updateTimeModeByDateTap={this.updateTimeModeByDateTap}
updateEventsExpanding={this.updateEventsExpanding}
updateSize={this.updateSize}
scrollPosition={scrollPosition}
updateScrollPosition={this.debouncedScrollPositionUpdater}
datesFormat={datesFormat}
datesStyles={datesStyles}
dateLinesStyles={dateLinesStyles}
modesToDayContainerSize={{ ...MODES_TO_DAY_CONTAINER_SIZE, ...modesToDayContainerSize }}
gapBetweenEvents={gapBetweenEvents}
setEventsPosition={this.setEventsPosition}
setScrollRef={this.setScrollRef}
horizontal={horizontal}
/>
</TimelineContainer>
);
}
}
export default Timeline;
|
// CommitRenameReplaceDentry must be called after the file represented by from
// is renamed without RENAME_EXCHANGE. If to is not nil, it represents the file
// that was replaced by from.
//
// Preconditions: PrepareRenameDentry was previously called on from and to.
func (vfs *VirtualFilesystem) CommitRenameReplaceDentry(from, to *Dentry) {
from.mu.Unlock()
if to != nil {
to.dead = true
to.mu.Unlock()
if to.isMounted() {
vfs.forgetDeadMountpoint(to)
}
}
} |
//
// (C) Copyright 2020 Intel Corporation.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
//
// GOVERNMENT LICENSE RIGHTS-OPEN SOURCE SOFTWARE
// The Government's rights to use, modify, reproduce, release, perform, display,
// or disclose this software are subject to the terms of the Apache License as
// provided in Contract No. 8F-30005.
// Any reproduction of computer software, computer software documentation, or
// portions thereof marked with this legend must also reproduce the markings.
//
package server
import (
"fmt"
"github.com/pkg/errors"
"github.com/daos-stack/daos/src/control/build"
"github.com/daos-stack/daos/src/control/common/proto"
ctlpb "github.com/daos-stack/daos/src/control/common/proto/ctl"
"github.com/daos-stack/daos/src/control/fault"
"github.com/daos-stack/daos/src/control/server/storage/bdev"
"github.com/daos-stack/daos/src/control/server/storage/scm"
)
// newMntRet creates and populates SCM mount result.
// Currently only used for format operations.
func (srv *IOServerInstance) newMntRet(inErr error) *ctlpb.ScmMountResult {
var info string
if fault.HasResolution(inErr) {
info = fault.ShowResolutionFor(inErr)
}
return &ctlpb.ScmMountResult{
Mntpoint: srv.scmConfig().MountPoint,
State: newResponseState(inErr, ctlpb.ResponseStatus_CTL_ERR_SCM, info),
Instanceidx: srv.Index(),
}
}
// newCret creates and populates NVMe controller result and logs error
func (srv *IOServerInstance) newCret(pciAddr string, inErr error) *ctlpb.NvmeControllerResult {
var info string
if pciAddr == "" {
pciAddr = "<nil>"
}
if inErr != nil && fault.HasResolution(inErr) {
info = fault.ShowResolutionFor(inErr)
}
return &ctlpb.NvmeControllerResult{
Pciaddr: pciAddr,
State: newResponseState(inErr, ctlpb.ResponseStatus_CTL_ERR_NVME, info),
}
}
// scmFormat will return either successful result or error.
func (srv *IOServerInstance) scmFormat(reformat bool) (*ctlpb.ScmMountResult, error) {
srvIdx := srv.Index()
cfg := srv.scmConfig()
req, err := scm.CreateFormatRequest(cfg, reformat)
if err != nil {
return nil, errors.Wrap(err, "generate format request")
}
scmStr := fmt.Sprintf("SCM (%s:%s)", cfg.Class, cfg.MountPoint)
srv.log.Infof("Instance %d: starting format of %s", srvIdx, scmStr)
res, err := srv.scmProvider.Format(*req)
if err == nil && !res.Formatted {
err = errors.WithMessage(scm.FaultUnknown, "is still unformatted")
}
if err != nil {
srv.log.Errorf(" format of %s failed: %s", scmStr, err)
return nil, err
}
srv.log.Infof("Instance %d: finished format of %s", srvIdx, scmStr)
return srv.newMntRet(nil), nil
}
func (srv *IOServerInstance) bdevFormat(p *bdev.Provider) (results proto.NvmeControllerResults) {
srvIdx := srv.Index()
cfg := srv.bdevConfig()
results = make(proto.NvmeControllerResults, 0, len(cfg.DeviceList))
// A config with SCM and no block devices is valid.
if len(cfg.DeviceList) == 0 {
return
}
srv.log.Infof("Instance %d: starting format of %s block devices %v",
srvIdx, cfg.Class, cfg.DeviceList)
res, err := p.Format(bdev.FormatRequest{
Class: cfg.Class,
DeviceList: cfg.DeviceList,
MemSize: cfg.MemSize,
})
if err != nil {
results = append(results, srv.newCret("", err))
return
}
for dev, status := range res.DeviceResponses {
// TODO DAOS-5828: passing status.Error directly triggers segfault
var err error
if status.Error != nil {
err = status.Error
}
results = append(results, srv.newCret(dev, err))
}
srv.log.Infof("Instance %d: finished format of %s block devices %v",
srvIdx, cfg.Class, cfg.DeviceList)
return
}
// StorageFormatSCM performs format on SCM and identifies if superblock needs
// writing.
func (srv *IOServerInstance) StorageFormatSCM(reformat bool) (mResult *ctlpb.ScmMountResult) {
srvIdx := srv.Index()
needsScmFormat := reformat
srv.log.Infof("Formatting scm storage for %s instance %d (reformat: %t)",
build.DataPlaneName, srvIdx, reformat)
var scmErr error
defer func() {
if scmErr != nil {
srv.log.Errorf(msgFormatErr, srvIdx)
mResult = srv.newMntRet(scmErr)
}
}()
if srv.isStarted() {
scmErr = errors.Errorf("instance %d: can't format storage of running instance",
srvIdx)
return
}
// If not reformatting, check if SCM is already formatted.
if !reformat {
needsScmFormat, scmErr = srv.NeedsScmFormat()
if scmErr == nil && !needsScmFormat {
scmErr = scm.FaultFormatNoReformat
}
if scmErr != nil {
return
}
}
if needsScmFormat {
mResult, scmErr = srv.scmFormat(true)
}
return
}
// StorageFormatNVMe performs format on NVMe if superblock needs writing.
func (srv *IOServerInstance) StorageFormatNVMe(bdevProvider *bdev.Provider) (cResults proto.NvmeControllerResults) {
srv.log.Infof("Formatting nvme storage for %s instance %d", build.DataPlaneName, srv.Index())
// If no superblock exists, format NVMe and populate response with results.
needsSuperblock, err := srv.NeedsSuperblock()
if err != nil {
return proto.NvmeControllerResults{
srv.newCret("", err),
}
}
if needsSuperblock {
cResults = srv.bdevFormat(bdevProvider)
}
return
}
|
# Django
from django.views.generic import ListView, FormView, DetailView
from django.http.response import JsonResponse
from django.http.response import Http404, HttpResponseBadRequest
from django.urls import reverse_lazy
from django.utils import timezone
from django.utils.translation import gettext as _
# Django REST Framework
from rest_framework.generics import UpdateAPIView, CreateAPIView
from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework import status
from rest_framework.permissions import IsAuthenticated
from rest_framework.viewsets import GenericViewSet
from rest_framework.decorators import action
# Logic
from logs.logic.convert_time import ConvertTime
# Serializers
from logs.serializers import UpdateTimeLogModelSerializer, TimeLogModelSerializer, CreateTimeLogModelSerializer
# Models
from logs.models import TimeLog, Phase
from programs.models import Program
# Forms
from logs.forms import CreateLogProgramForm
# Mixins
from psp.mixins import MemberUserProgramRequiredMixin
from logs.mixins import IsUserOwnerProgram, TimeLogNotStop
class ListProgramTimeLogView(MemberUserProgramRequiredMixin, ListView):
template_name = 'logs/time_log.html'
context_object_name = 'programs'
def get_queryset(self):
return TimeLog.objects.filter(program=self.program).order_by('phase__order_index')
def get_context_data(self, **kwargs):
context = super().get_context_data(**kwargs)
context["program_opened"] = self.program
context["phases_not_in_program"] = Phase.objects.exclude(id__in=TimeLog.objects.filter(program=self.program).values('phase__pk')).values('name').order_by('order_index')
context["is_active_phase"] = self.program.program_log_time.filter(finish_date=None).exists()
if context["is_active_phase"]:
context["time_log"] = self.program.program_log_time.get(finish_date=None)
return context
class CreateTimeLogView(MemberUserProgramRequiredMixin, CreateAPIView):
queryset = TimeLog.objects.all()
serializer_class = CreateTimeLogModelSerializer
def dispatch(self, request, *args, **kwargs):
self.program = Program.objects.get(pk=kwargs['pk_program'])
self.is_active_phase = self.program.program_log_time.filter(finish_date=None).exists()
if self.is_active_phase:
return HttpResponseBadRequest(reason='There is a phase active')
return super().dispatch(request, *args, **kwargs)
def create(self, request, *args, **kwargs):
serializer = self.serializer_class(data=request.data)
if serializer.is_valid():
time_log = serializer.save(self.program)
return Response(data=TimeLogModelSerializer(instance=time_log).data, status=status.HTTP_201_CREATED)
return Response(data=serializer.errors, status=status.HTTP_400_BAD_REQUEST)
# Pause time log
class UpdateCurrentTimeLog(UpdateAPIView):
queryset = TimeLog.objects.all()
permission_classes = [IsUserOwnerProgram, TimeLogNotStop]
lookup_url_kwarg = 'pk_time_log'
serializer_class = UpdateTimeLogModelSerializer
class StopCurrentTimeLogView(UpdateAPIView):
queryset = TimeLog.objects.all()
permission_classes = [IsUserOwnerProgram, TimeLogNotStop]
lookup_url_kwarg = 'pk_time_log'
serializer_class = UpdateTimeLogModelSerializer
def update(self, request, *args, **kwargs):
serializer = self.serializer_class(data=request.data)
if serializer.is_valid():
data = serializer.validated_data
time_log = self.get_object()
time_log.delta_time = data['delta_time']
time_log.is_paused = data['is_paused']
time_log.finish_date = timezone.now()
time_log.save()
return Response(data=TimeLogModelSerializer(instance=time_log).data, status=status.HTTP_200_OK)
return Response(data=serializer.errors, status=status.HTTP_400_BAD_REQUEST)
# Restart time log
class RestartTimeLog(APIView):
permission_classes = [IsUserOwnerProgram, TimeLogNotStop]
def dispatch(self, request, *args, **kwargs):
try:
self.time_log = TimeLog.objects.get(pk=kwargs['pk_time_log'])
except TimeLog.DoesNotExist:
raise Http404("The time log doesn't exist")
return super().dispatch(request, *args, **kwargs)
def patch(self, request, pk_time_log):
self.time_log.is_paused = False
self.time_log.last_restart_time = timezone.now()
self.time_log.save()
return Response(data=TimeLogModelSerializer(instance=self.time_log).data, status=status.HTTP_200_OK)
class DetailTimeLogView(MemberUserProgramRequiredMixin, DetailView):
model = TimeLog
template_name = 'logs/timer_time_log.html'
context_object_name = 'time_log'
def get_object(self):
if self.program.program_log_time.filter(finish_date=None).exists():
return self.program.program_log_time.get(finish_date=None)
return None
|
/**
* The TextSimilarity class provides methods that estimate the similarity of two
* different strings.
*
* @author Vasilis Vryniotis <[email protected]>
*/
public class TextSimilarity {
/**
* This calculates the similarity between two strings as described in Programming
* Classics: Implementing the World's Best Algorithms by Oliver (ISBN 0-131-00413-1).
*
* @param text1 the first text
* @param text2 the second text
* @return the estimated similarity score in the range [0, 1]
*/
public static double oliverSimilarity(String text1, String text2) {
text1 = preprocessDocument(text1);
text2 = preprocessDocument(text2);
String smallerDoc=text1;
String biggerDoc=text2;
if(text1.length()>text2.length()) {
smallerDoc=text2;
biggerDoc=text1;
}
double p=PHPSimilarText.similarityPercentage(smallerDoc, biggerDoc);
p/=100.0;
return p;
}
/**
* Estimates the w-shingler similarity between two texts. The w is the number
* of word sequences that are used for the estimation.
*
* References:
* http://phpir.com/shingling-near-duplicate-detection
* http://www.std.org/~msm/common/clustering.html
*
* @param text1 the first text
* @param text2 the second text
* @param w the number of words per shingle
* @return the resemblance (Jaccard) score in the range [0, 1]
*/
public static double shinglerSimilarity(String text1, String text2, int w) {
text1 = preprocessDocument(text1);
text2 = preprocessDocument(text2);
NgramsExtractor.Parameters parameters = new NgramsExtractor.Parameters();
parameters.setMaxCombinations(w);
parameters.setMaxDistanceBetweenKwds(0);
parameters.setExaminationWindowLength(w);
NgramsExtractor ngrams = new NgramsExtractor(parameters);
Map<String, Double> keywords1 = ngrams.extract(text1);
Map<String, Double> keywords2 = ngrams.extract(text2);
//remove all the other combinations except of the w-grams
filterKeywordCombinations(keywords1, w);
filterKeywordCombinations(keywords2, w);
//ngrams=null;
//parameters=null;
double totalKeywords=0.0;
double commonKeywords=0.0;
Set<String> union = new HashSet<>(keywords1.keySet());
union.addAll(keywords2.keySet());
totalKeywords+=union.size();
//union=null;
Set<String> intersect = new HashSet<>(keywords1.keySet());
intersect.retainAll(keywords2.keySet());
commonKeywords+=intersect.size();
//intersect=null;
double resemblance=commonKeywords/totalKeywords;
//keywords1=null;
//keywords2=null;
return resemblance;
}
private static String preprocessDocument(String text) {
//URLs
text=StringCleaner.tokenizeURLs(text);
//Remove HTML
text=HTMLParser.extractText(text);
//Remove any accents
text=StringCleaner.removeAccents(text);
//Remove extra spaces
text=StringCleaner.removeExtraSpaces(text);
return text;
}
private static void filterKeywordCombinations(Map<String, Double> keywords, int w) {
Iterator<Map.Entry<String, Double>> it = keywords.entrySet().iterator();
while(it.hasNext()) {
Map.Entry<String, Double> entry = it.next();
if(PHPMethods.substr_count(entry.getKey(), ' ')!=w-1) {
it.remove();
}
}
}
} |
package dbrouter
import (
"bytes"
"context"
"strings"
"sync"
"sync/atomic"
"time"
"github.com/shawnfeng/sutil/sconf/center"
"github.com/shawnfeng/sutil/slog/slog"
)
var (
configCenter center.ConfigCenter
)
const (
globalGranularityKey = "global.granularity" // granularity (time window)
globalThresholdKey = "global.threshold" // error threshold within one granularity window
globalBreakerGapKey = "global.breakergap" // break interval after the breaker trips, in seconds
checkTick = time.Millisecond * 25
defaultThreshold = 10
defaultBreakerGap = 10
)
// TODO: circuit breaking is implemented with a simple counter; switch to a sliding window or a third-party component later
type BreakerManager struct {
lock sync.Mutex
Breakers map[string]*Breaker
}
type Breaker struct {
Rejected int32
RejectedStart int64
Count int32
}
var bm *BreakerManager
func statBreaker(cluster, table string, err error) {
if err != nil && (strings.Contains(err.Error(), "timeout") || strings.Contains(err.Error(), "invalid connection")) {
key := concat(cluster, "_", table)
bm.lock.Lock()
if _, ok := bm.Breakers[key]; !ok {
breaker := new(Breaker)
breaker.Run()
bm.Breakers[key] = breaker
}
breaker := bm.Breakers[key]
bm.lock.Unlock()
atomic.AddInt32(&breaker.Count, 1)
}
}
func Entry(cluster, table string) bool {
key := concat(cluster, "_", table)
bm.lock.Lock()
breaker := bm.Breakers[key]
bm.lock.Unlock()
if breaker != nil {
return atomic.LoadInt32(&breaker.Rejected) != 1
}
return true
}
func (breaker *Breaker) Run() {
go func() {
granularityStr, exist := configCenter.GetStringWithNamespace(context.TODO(), center.DefaultApolloMysqlNamespace, globalGranularityKey)
if !exist {
slog.Warnf(context.TODO(), "dbrouter: get granularity from apollo failed, exist: %v", exist)
granularityStr = "1s"
}
granularity, err := time.ParseDuration(granularityStr)
if err != nil {
slog.Warnf(context.TODO(), "dbrouter: granularity in apollo is invalid, %s", granularityStr)
granularity = time.Second * 1
}
granularityTickC := time.Tick(granularity)
checkTickC := time.Tick(checkTick)
for {
select {
case <-granularityTickC:
atomic.StoreInt32(&breaker.Count, 0)
// check 1s/checkTick times in 1s
case <-checkTickC:
threshold, exist := configCenter.GetIntWithNamespace(context.TODO(), center.DefaultApolloMysqlNamespace, globalThresholdKey)
if !exist {
slog.Warnf(context.TODO(), "dbrouter: get threshold from apollo failed, exist: %v", exist)
threshold = defaultThreshold
}
breakerGap, exist := configCenter.GetIntWithNamespace(context.TODO(), center.DefaultApolloMysqlNamespace, globalBreakerGapKey)
if !exist {
slog.Warnf(context.TODO(), "dbrouter: get breakGap from apollo failed, exist: %v", exist)
breakerGap = defaultBreakerGap
}
if atomic.LoadInt32(&breaker.Count) > int32(threshold) {
atomic.StoreInt32(&breaker.Rejected, 1)
breaker.RejectedStart = time.Now().Unix()
} else {
now := time.Now().Unix()
if now-breaker.RejectedStart > int64(breakerGap) {
atomic.StoreInt32(&breaker.Rejected, 0)
}
}
}
}
}()
}
func initConfig() error {
var err error
configCenter, err = center.NewConfigCenter(center.ApolloConfigCenter)
if err != nil {
return err
}
err = configCenter.Init(context.TODO(), center.DefaultApolloMiddlewareService, []string{center.DefaultApolloMysqlNamespace})
if err != nil {
return err
}
return nil
}
func concat(strings ...string) string {
var buffer bytes.Buffer
for _, s := range strings {
buffer.WriteString(s)
}
return buffer.String()
}
func init() {
bm = &BreakerManager{Breakers: make(map[string]*Breaker)}
err := initConfig()
if err != nil {
slog.Panicf(context.TODO(), "dbrouter: init apollo config failed, err: %v", err)
}
}
|
import { Protocol } from './protocol';
import { Messages } from './messages';
import { MessageConnection } from 'vscode-jsonrpc';
import { EventEmitter } from 'events';
/**
* Server incoming
*/
export class Incoming {
private connection: MessageConnection;
private emitter: EventEmitter;
/**
* Constructs a new discovery handler
* @param connection message connection to the RSP
* @param emitter event emitter to handle notification events
*/
constructor(connection: MessageConnection, emitter: EventEmitter) {
this.connection = connection;
this.emitter = emitter;
this.listen();
}
/**
* Subscribes to notifications sent by the server
*/
private listen() {
this.connection.onNotification(Messages.Client.MessageBoxNotification.type, param => {
this.emitter.emit('messageBox', param);
});
this.connection.onNotification(Messages.Client.DiscoveryPathAddedNotification.type, param => {
this.emitter.emit('discoveryPathAdded', param);
});
this.connection.onNotification(Messages.Client.DiscoveryPathRemovedNotification.type, param => {
this.emitter.emit('discoveryPathRemoved', param);
});
this.connection.onNotification(Messages.Client.ServerAddedNotification.type, param => {
this.emitter.emit('serverAdded', param);
});
this.connection.onNotification(Messages.Client.ServerRemovedNotification.type, param => {
this.emitter.emit('serverRemoved', param);
});
this.connection.onNotification(Messages.Client.ServerAttributesChangedNotification.type, param => {
this.emitter.emit('serverAttributesChanged', param);
});
this.connection.onNotification(Messages.Client.ServerStateChangedNotification.type, param => {
this.emitter.emit('serverStateChanged', param);
});
this.connection.onNotification(Messages.Client.ServerProcessCreatedNotification.type, param => {
this.emitter.emit('serverProcessCreated', param);
});
this.connection.onNotification(Messages.Client.ServerProcessTerminatedNotification.type, param => {
this.emitter.emit('serverProcessTerminated', param);
});
this.connection.onNotification(Messages.Client.ServerProcessOutputAppendedNotification.type, param => {
this.emitter.emit('serverProcessOutputAppended', param);
});
this.connection.onNotification(Messages.Client.JobAddedNotification.type, param => {
this.emitter.emit('jobAdded', param);
});
this.connection.onNotification(Messages.Client.JobRemovedNotification.type, param => {
this.emitter.emit('jobRemoved', param);
});
this.connection.onNotification(Messages.Client.JobChangedNotification.type, param => {
this.emitter.emit('jobChanged', param);
});
}
onPromptString(listener: (arg: Protocol.StringPrompt) => Promise<string>): void {
this.connection.onRequest(Messages.Client.PromptStringRequest.type, listener);
}
onMessageBox(listener: (arg: Protocol.MessageBoxNotification) => void): void {
this.emitter.on('messageBox', listener);
}
removeOnMessageBox(listener: (arg: Protocol.MessageBoxNotification) => void): void {
this.emitter.removeListener('messageBox', listener);
}
onDiscoveryPathAdded(listener: (arg: Protocol.DiscoveryPath) => void): void {
this.emitter.on('discoveryPathAdded', listener);
}
removeOnDiscoveryPathAdded(listener: (arg: Protocol.DiscoveryPath) => void): void {
this.emitter.removeListener('discoveryPathAdded', listener);
}
onDiscoveryPathRemoved(listener: (arg: Protocol.DiscoveryPath) => void): void {
this.emitter.on('discoveryPathRemoved', listener);
}
removeOnDiscoveryPathRemoved(listener: (arg: Protocol.DiscoveryPath) => void): void {
this.emitter.removeListener('discoveryPathRemoved', listener);
}
onServerAdded(listener: (arg: Protocol.ServerHandle) => void): void {
this.emitter.on('serverAdded', listener);
}
removeOnServerAdded(listener: (arg: Protocol.ServerHandle) => void): void {
this.emitter.removeListener('serverAdded', listener);
}
onServerRemoved(listener: (arg: Protocol.ServerHandle) => void): void {
this.emitter.on('serverRemoved', listener);
}
removeOnServerRemoved(listener: (arg: Protocol.ServerHandle) => void): void {
this.emitter.removeListener('serverRemoved', listener);
}
onServerAttributesChanged(listener: (arg: Protocol.ServerHandle) => void): void {
this.emitter.on('serverAttributesChanged', listener);
}
removeOnServerAttributesChanged(listener: (arg: Protocol.ServerHandle) => void): void {
this.emitter.removeListener('serverAttributesChanged', listener);
}
onServerStateChanged(listener: (arg: Protocol.ServerState) => void): void {
this.emitter.on('serverStateChanged', listener);
}
removeOnServerStateChanged(listener: (arg: Protocol.ServerState) => void): void {
this.emitter.removeListener('serverStateChanged', listener);
}
onServerProcessCreated(listener: (arg: Protocol.ServerProcess) => void): void {
this.emitter.on('serverProcessCreated', listener);
}
removeOnServerProcessCreated(listener: (arg: Protocol.ServerProcess) => void): void {
this.emitter.removeListener('serverProcessCreated', listener);
}
onServerProcessTerminated(listener: (arg: Protocol.ServerProcess) => void): void {
this.emitter.on('serverProcessTerminated', listener);
}
removeOnServerProcessTerminated(listener: (arg: Protocol.ServerProcess) => void): void {
this.emitter.removeListener('serverProcessTerminated', listener);
}
onServerProcessOutputAppended(listener: (arg: Protocol.ServerProcessOutput) => void): void {
this.emitter.on('serverProcessOutputAppended', listener);
}
removeOnServerProcessOutputAppended(listener: (arg: Protocol.ServerProcessOutput) => void): void {
this.emitter.removeListener('serverProcessOutputAppended', listener);
}
onJobAdded(listener: (arg: Protocol.JobHandle) => void): void {
this.emitter.on('jobAdded', listener);
}
removeOnJobAdded(listener: (arg: Protocol.JobHandle) => void): void {
this.emitter.removeListener('jobAdded', listener);
}
onJobRemoved(listener: (arg: Protocol.JobRemoved) => void): void {
this.emitter.on('jobRemoved', listener);
}
removeOnJobRemoved(listener: (arg: Protocol.JobRemoved) => void): void {
this.emitter.removeListener('jobRemoved', listener);
}
onJobChanged(listener: (arg: Protocol.JobProgress) => void): void {
this.emitter.on('jobChanged', listener);
}
removeOnJobChanged(listener: (arg: Protocol.JobProgress) => void): void {
this.emitter.removeListener('jobChanged', listener);
}
}
|
// stack/transport/uacp/secure_channel_open.go
package uacp
import (
"bytes"
"fmt"
"io/ioutil"
"time"
"github.com/searis/guma/stack/encoding/binary"
"github.com/searis/guma/stack/transport"
"github.com/searis/guma/stack/uatype"
)
func (sc *SecureChannel) open(deadline time.Time) error {
// Wait for receiveQueue spot or deadline.
requestID, err := sc.recvState.WaitForRequestID(deadline)
if err != nil {
return err
}
// The receivequeue must always be freed, and it is always safe to cancel,
// even on success.
defer sc.recvState.CancelRequestID(requestID)
// Prepare and encode request.
var msgBuff bytes.Buffer
enc := binary.NewEncoder(&msgBuff)
requestType := uatype.SecurityTokenRequestTypeIssue
if sc.securityToken.ChannelId != 0 {
requestType = uatype.SecurityTokenRequestTypeRenew
}
var timeoutHint uint32
if !deadline.IsZero() {
timeoutHint = encodeUnsignedDuration(time.Until(deadline))
}
if err := enc.Encode(uatype.OpenSecureChannelRequest{
RequestHeader: uatype.RequestHeader{
Timestamp: time.Now().UTC(),
TimeoutHint: timeoutHint,
},
RequestType: requestType,
SecurityMode: sc.security.MessageSecurity,
RequestedLifetime: encodeUnsignedDuration(sc.timeouts.RequestLifetime),
}); err != nil {
return transport.LocalError(uatype.StatusBadInternalError, err)
}
// Send request.
if err := sc.sendState.SendMsg(secureMsg{
Type: secureMsgTypeOpn,
ChannelID: sc.securityToken.ChannelId,
RequestID: requestID,
SecurityHeader: sc.security.SecurityHeader,
Request: transport.Request{
NodeID: uatype.NewFourByteNodeID(0, uatype.NodeIdOpenSecureChannelRequest_Encoding_DefaultBinary).Expanded(),
Body: &msgBuff,
},
}, deadline); err != nil {
return err
}
resp, err := sc.recvState.WaitForResponse(requestID, msgTypeOpn, deadline)
if err != nil {
return err
}
p, err := ioutil.ReadAll(resp.Body)
if err != nil {
return transport.LocalError(uatype.StatusBadInternalError, err)
}
dec := binary.NewDecoder(bytes.NewBuffer(p))
switch resp.NodeID.Uint() {
case uatype.NodeIdOpenSecureChannelResponse_Encoding_DefaultBinary:
target := uatype.OpenSecureChannelResponse{}
if err := dec.Decode(&target); err != nil {
return transport.LocalError(uatype.StatusBadInternalError, err)
}
if sc.securityToken.ChannelId == 0 {
sc.securityToken = target.SecurityToken
}
// TODO handle more security stuff.
case uatype.NodeIdServiceFault_Encoding_DefaultBinary:
target := uatype.ServiceFault{}
if err := dec.Decode(&target); err != nil {
return transport.LocalError(uatype.StatusBadInternalError, err)
}
return &target
default:
err := fmt.Errorf("unexpected node ID %d", resp.NodeID.Uint())
return transport.LocalError(uatype.StatusBadUnknownResponse, err)
}
return nil
}
|
// From the stevegury/rust-by-example repository
fn main() {
let _immutable_variable = 1;
let mut mutable_variable = 1;
println!("Before mutation: {}", mutable_variable);
// Ok
mutable_variable += 1;
println!("After mutation: {}", mutable_variable);
// Error!
_immutable_variable += 1;
// FIXME ^ Comment out this line
}
|
// From the tomascury/mbty repository
package com.mobiquityinc.model;
import java.io.Serializable;
import java.math.BigDecimal;
public class PackageItem implements Serializable {
private int index;
private BigDecimal weight;
private BigDecimal cost;
public PackageItem() {
}
public int getIndex() {
return index;
}
public void setIndex(int index) {
this.index = index;
}
public BigDecimal getWeight() {
return weight;
}
public void setWeight(BigDecimal weight) {
this.weight = weight;
}
public BigDecimal getCost() {
return cost;
}
public void setCost(BigDecimal cost) {
this.cost = cost;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
PackageItem that = (PackageItem) o;
if (weight != null ? !weight.equals(that.weight) : that.weight != null) return false;
return cost != null ? cost.equals(that.cost) : that.cost == null;
}
@Override
public int hashCode() {
int result = weight != null ? weight.hashCode() : 0;
result = 31 * result + (cost != null ? cost.hashCode() : 0);
return result;
}
@Override
public String toString() {
return "PackageItem{" +
"index=" + index +
", weight=" + weight +
", cost=" + cost +
'}';
}
}
|
import json
import re
from bs4 import BeautifulSoup as bs
from urllib.parse import quote, unquote
# Configure URL with city, state
def get_base_url(root_url, city, state):
base_url = root_url + city + ",-" + state + "_rb/?searchQueryState="
return base_url
def decode_query_params(full_url):
'''
Decode URL query parameters
Returns: decoded params
'''
decoded_url = unquote(full_url)
decoded_params = decoded_url.split("searchQueryState=")[1]
params = json.loads(decoded_params)
return params
def configure_query_params(**kwargs):
'''
Configures query parameters for the URL endpoint
- TO DO: add args for Lat, Lon coordinates; region type; etc.
'''
# Pull the expected values from keyword arguments (raises KeyError if one is missing).
user_search_term = kwargs['user_search_term']
w_coords, e_coords = kwargs['w_coords'], kwargs['e_coords']
s_coords, n_coords = kwargs['s_coords'], kwargs['n_coords']
region_id, region_type = kwargs['region_id'], kwargs['region_type']
pmin, pmax = kwargs['pmin'], kwargs['pmax']
min_price, max_price = kwargs['min_price'], kwargs['max_price']
min_beds, min_sqft = kwargs['min_beds'], kwargs['min_sqft']
lau, fr, ac = kwargs['lau'], kwargs['fr'], kwargs['ac']
query_params = {
'pagination': {
},
'usersSearchTerm': user_search_term,
'mapBounds': {
'west': w_coords,
'east': e_coords,
'south': s_coords,
'north': n_coords
},
'regionSelection': [{
'regionId': region_id, 'regionType': region_type
}],
'isMapVisible': True,
'filterState': {
'price': {
'min': pmin,
'max': pmax
},
'beds': {
'min': min_beds
},
'baths': {
'min': 1
},
'sqft': {
'min': min_sqft
},
'con': {
'value': False
},
'pmf': {
'value': False
},
'fore': {
'value': False
},
'lau': {
'value': lau
},
'mp': {
'min': min_price,
'max': max_price
},
'auc': {
'value': False
},
'nc': {
'value': False
},
'fr': {
'value': fr
},
'fsbo': {
'value': False
},
'cmsn': {
'value': False
},
'pf': {
'value': False
},
'fsba': {
'value': False
},
'ac': {
'value': ac
}
},
'isListVisible': True
}
string_params = json.dumps(query_params)
encoded_params = quote(string_params)
return encoded_params
def configure_full_url(base_url, encoded_params):
'''
Configures, returns full_url
'''
full_url = base_url + encoded_params
return full_url
def get_page_count(soup_object):
'''
Get total pages in result set
Use to modify URL for pagination
'''
search_pagination = soup_object.find("div", class_="search-pagination").nav.ul.find_all('li')
page_count = len(search_pagination) - 2
return page_count
def get_room_count(pattern, text):
'''
Searches for the pattern within the text
- Suitable for bedrooms, bathrooms
'''
if len(re.findall(pattern, text)) > 0:
match = re.split(pattern, text)[0][-1]
else:
match = "Not Listed"
return match
def get_new_url(full_url, current_page):
'''
Prepare the URL for the nth page of results
'''
params = decode_query_params(full_url)
params['pagination']['currentPage'] = current_page
string_params = json.dumps(params)
encoded_params = quote(string_params)
# Rebuild the base URL from full_url instead of relying on a module-level base_url
base_url = full_url.split("searchQueryState=")[0] + "searchQueryState="
new_url = base_url + encoded_params
return new_url
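A hypothetical end-to-end call tying these helpers together; every value below (coordinates, region id/type, price and size filters) is a placeholder for illustration, not real listing data:
base_url = get_base_url("https://www.zillow.com/homes/", "Seattle", "WA")
encoded_params = configure_query_params(
    user_search_term="Seattle, WA",
    w_coords=-122.46, e_coords=-122.22, s_coords=47.49, n_coords=47.74,
    region_id=12345, region_type=6,
    pmin=300000, pmax=900000, min_price=300000, max_price=900000,
    min_beds=2, min_sqft=800, lau=True, fr=True, ac=True,
)
full_url = configure_full_url(base_url, encoded_params)
page_2_url = get_new_url(full_url, current_page=2) |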
import styled from 'styled-components/native'
import { Feather } from '@expo/vector-icons'
import { FlatList, FlatListProps } from 'react-native'
import { RFPercentage, RFValue } from 'react-native-responsive-fontsize'
import { getBottomSpace, getStatusBarHeight } from 'react-native-iphone-x-helper'
import { DataListProps } from '.'
export const Container = styled.View`
flex: 1;
background-color: ${({ theme }) => theme.colors.background};
`;
export const LoadContainer = styled.View`
flex:1;
justify-content:center;
align-items:center;
`
export const Header = styled.View`
width:100%;
height:${RFPercentage(40)}px;
justify-content:center;
align-items:flex-start;
flex-direction: row;
background-color:${({ theme }) => theme.colors.primary};
`;
export const UserWrapper = styled.View`
width:100%;
padding:0 24px;
margin-top:${getStatusBarHeight() + RFValue(28)}px;
flex-direction: row;
justify-content: space-between;
align-items: center;
`;
export const UserInfo = styled.View`
flex-direction: row;
align-items: center;
`;
export const Photo = styled.Image`
width:${RFValue(55)}px;
height:${RFValue(55)}px;
border-radius:10px;
`;
export const User = styled.View`
margin-left:17px;
`;
export const UserGreeting = styled.Text`
color:${({ theme }) => theme.colors.shape};
font-family:${({ theme }) => theme.fonts.regular};
font-size:${RFValue(18)}px;
`;
export const UserName = styled.Text`
color:${({ theme }) => theme.colors.shape};
font-family:${({ theme }) => theme.fonts.bold};
font-size:${RFValue(18)}px;
`;
export const LogoutButton = styled.TouchableOpacity`
`;
export const Icon = styled(Feather)`
color:${({ theme }) => theme.colors.secondary};
font-size:${RFValue(24)}px;
`;
export const HighlightCards = styled.ScrollView.attrs({
horizontal: true,
showsHorizontalScrollIndicator: false,
contentContainerStyle: { paddingHorizontal: 24 },
})`
width:100%;
position:absolute;
margin-top:${RFPercentage(20)}px;
`;
export const Transactions = styled.View`
flex: 1;
padding: 0 24px;
margin-top:${RFPercentage(15)}px;
`;
export const Title = styled.Text`
font-family:${({ theme }) => theme.fonts.regular};
font-size:${RFValue(18)}px;
color:${({ theme }) => theme.colors.title};
margin-bottom:16px;
`;
export const TransactionsList = styled(FlatList as new (props: FlatListProps<DataListProps>) => FlatList<DataListProps>).attrs({
showsVerticalScrollIndicator: false,
contentContainerStyle: {
paddingBottom: getBottomSpace()
}
})`
`;
|
/**
* @return The external name of the configuration
*/
public String externalName()
{
switch (this) {
case ON:
return "on";
case OFF:
return "off";
case IO:
return "io";
}
throw new UnreachableCodeException();
} |
package test.point.of.sale;
import static org.junit.Assert.*;
import org.apache.commons.codec.digest.DigestUtils;
import org.junit.Test;
import point.of.sale.*;
public class TestConsistency {
@Test
public void testSaleStorageConsistency() {
ArrayStorage storage = new ArrayStorage();
storage.put("1", "Milk, 3.99");
storage.put("2", "Beer, 4.99");
StoreConsistencyChecker checker = new StoreConsistencyChecker(storage);
//get the consistency value
checker.updateConsistencyCheck();
//check that nothing has changed
assertTrue(checker.checkConsistency());
storage.put("3", "Wine, 19.99");
assertFalse(checker.checkConsistency());
storage.put("2", "Beer, 0.01");
assertFalse(checker.checkConsistency());
storage.put("2", "Beer, 1.99");
checker.updateConsistencyCheck();
//checker.updateConsistencyCheck();
assertTrue(checker.checkConsistency());
}
//@Test
public void test() {
String password = "<PASSWORD>";
String oldHash = "A591A6D40BF420404A011733CFB7B190D62C65BF0BCDA32B57B277D9AD9F146E";
String hashed = DigestUtils.sha256Hex(password).toUpperCase();
assertEquals(oldHash, hashed);
}
}
|
Sen. John McCain (R-Ariz.) gave a staunch defense of the free press Saturday, noting that attacks on the media are “how dictators get started.”
Speaking on NBC’s “Meet The Press,” to be aired Sunday, McCain took a swipe at President Donald Trump’s volleys against the Fourth Estate, particularly a Friday tweet in which the press was called the “enemy of the American people.”
“We need a free press,” said the 2008 Republican presidential candidate. “We must have it. It’s vital.”
“If you want to preserve ― I’m very serious now ― if you want to preserve democracy as we know it, you have to have a free and many times adversarial press,” he added.
McCain said that without a free press, “we would lose so much of our individual liberties over time.”
“That’s how dictators get started,” he added, noting that attacks on journalists questioning those in power are a tactic used by autocratic governments.
“When you look at history, the first thing that dictators do is shut down the press,” he said. “I’m not saying that President Trump is trying to be a dictator. I’m just saying we need to learn the lessons of history.”
“A fundamental part of that new world order was a free press,” he added. “I hate the press; I hate you especially,” McCain quipped. “But the fact is we need you.”
Trump has ratcheted up his assaults against media organizations in recent weeks, culminating in a belligerent press conference Thursday in which he excoriated the members of the press as “fake news.”
McCain, in Germany for the Munich Security conference, has unleashed a series of thinly veiled attacks on the White House.
In a speech before the conference, he slammed a “hardening resentment” toward “immigrants, and refugees, and minority groups, especially Muslims” and asked world leaders not to give up on America despite the country’s current politics. |
import { DMChannel, GuildMember, TextBasedChannels } from 'discord.js';
import { Db, ObjectID } from 'mongodb';
import dbInstance from '../../utils/MongoDbUtils';
import constants from '../constants/constants';
import fqConstants from '../constants/firstQuest';
import Log, { LogUtils } from '../../utils/Log';
import channelIds from '../constants/channelIds';
import roleIds from '../constants/roleIds';
import ServiceUtils from '../../utils/ServiceUtils';
import { CommandContext } from 'slash-create';
export default async (member: GuildMember, ctx: CommandContext): Promise<any> => {
try {
ServiceUtils.validateLevel2AboveMembers(member);
} catch (e) {
LogUtils.logError('L2 validation failed', e);
ctx?.send(`Hi, ${ctx.user.mention}! You do not have permission to use this command.`);
return;
}
await ctx?.send(`Hi, ${ctx.user.mention}! I sent you a DM with more information.`);
const dmChannel = await member.user.createDM();
await dmChannel.send({ content: 'Which message would you like to edit?' });
Log.debug('Asking user which message to edit for first quest');
await createSelectMessage(dmChannel, member);
};
const createSelectMessage = async (dmChannel, member): Promise<void> => {
const data = await fetchData();
Log.debug('pulled first quest content from db');
const embed = await createEmbed(data);
const selectMessage = await dmChannel.send({ embeds: [embed] });
for (let i = 0; i < embed.fields.length; i++) {
await selectMessage.react(constants.EMOJIS[(i + 1).toString()]);
}
const emojiArray = createEmojiArray(embed.fields.length);
const filter = (reaction, user) => {
return emojiArray.includes(reaction.emoji.name) && !user.bot;
};
const collector = selectMessage.createReactionCollector({ filter, max: 1, time: (7000 * 60), dispose: true });
collector.on('end', async (collected, reason) => {
if (reason === 'limit') {
for (const reac of collected.values()) {
const users = await reac.users.fetch();
if (users.has(member.user.id)) {
const key = 'fq' + reac.emoji.name.slice(0, 1);
const selectedContent = data[0].messages[key].replace(/\\n/g, '\n');
await dmChannel.send({ content: selectedContent });
const confirmationMessage = await dmChannel.send({ content:
'\n\n**Please confirm your selection:** \n\n' +
'👍 - Replace this content with new content \n' +
'🔃 - Change selection \n' +
'❌ - Cancel',
});
await confirmationMessage.react('👍');
await confirmationMessage.react('🔃');
await confirmationMessage.react('❌');
await collectConfirmation(confirmationMessage, member, key, data[0].messages);
}
}
} else {
Log.warn('Command timed out for first quest configuration');
await dmChannel.send({ content: 'Command timed out.' });
}
});
};
const collectConfirmation = async (message, member, key, origMessages): Promise<void> => {
const filter = (reaction, user) => {
return ['👍', '🔃', '❌'].includes(reaction.emoji.name) && !user.bot;
};
const collector = message.createReactionCollector({ filter, max: 1, time: (7000 * 60), dispose: true });
collector.on('end', async (collected, reason) => {
if (reason === 'limit') {
for (const reac of collected.values()) {
const users = await reac.users.fetch();
if (users.has(member.user.id)) {
if (reac.emoji.name === '👍') {
await collectUserInput(message.channel, member, key, origMessages);
return;
} else if (reac.emoji.name === '🔃') {
await createSelectMessage(message.channel, member);
return;
} else if (reac.emoji.name === '❌') {
await message.channel.send({ content: 'Command cancelled.' });
}
}
}
} else {
Log.warn('Command timed out for first quest configuration emoji reaction');
await message.channel.send({ content: 'Command timed out.' });
}
});
};
const collectUserInput = async (dmChannel: DMChannel, member: GuildMember, key: string, origMessages: Record<string, string>): Promise<void> => {
await dmChannel.send({ content: '**Your input please: ATTENTION max character count of 2000 per message ! ** \n(Go here for guidance on how to format your message ' +
'<https://support.discord.com/hc/en-us/articles/210298617-Markdown-Text-101-Chat-Formatting-Bold-Italic-Underline->)' });
const msgCollector = dmChannel.createMessageCollector({ time: (1000 * 60 * 25), max: 1 });
msgCollector.on('collect', async (m) => {
if (m.content.length <= 2000) {
await confirmMessageCollected(dmChannel, member, m.content, key, origMessages);
} else {
await dmChannel.send({ content: 'Input too long. Please reduce message to 2000 characters.' });
await collectUserInput(dmChannel, member, key, origMessages);
}
});
};
const confirmMessageCollected = async (dmChannel, member, responseContent, key, origMessages) => {
const finalConfirmation = await dmChannel.send({ content: '👍 - Confirm and exit \n➡️ - Confirm and select another \n❌ - Cancel' });
await finalConfirmation.react('👍');
await finalConfirmation.react('➡️');
await finalConfirmation.react('❌');
const filter = (reaction, user) => {
return ['👍', '➡️', '❌'].includes(reaction.emoji.name) && !user.bot;
};
const collector = finalConfirmation.createReactionCollector({ filter, max: 1, time: (7000 * 60), dispose: true });
collector.on('end', async (collected, reason) => {
if (reason === 'limit') {
for (const reac of collected.values()) {
const users = await reac.users.fetch();
if (users.has(member.user.id)) {
if (reac.emoji.name === '👍') {
const dbResponse = await updateDatabase(member, responseContent, key, origMessages);
await dmChannel.send({ content: `Database update complete. Status: ${dbResponse}` });
return;
} else if (reac.emoji.name === '➡️') {
const dbResponse = await updateDatabase(member, responseContent, key, origMessages);
await dmChannel.send({ content: dbResponse });
await createSelectMessage(dmChannel, member);
return;
} else if (reac.emoji.name === '❌') {
await dmChannel.send({ content: 'Command cancelled.' });
}
}
}
} else {
await dmChannel.send({ content: 'Command timed out.' });
}
});
};
const updateDatabase = async (member, content, key, origMessages) => {
const db: Db = await dbInstance.connect(constants.DB_NAME);
const timestamp = Date.now();
const logMeta = {
origContent: origMessages[key],
newContent: content,
messageKey: key,
updatedBy: member.user.id,
timestamp: timestamp,
};
const opts = {
level: 'info',
meta: logMeta,
};
const firstQuestContent = await db.collection(constants.DB_COLLECTION_FIRST_QUEST_CONTENT);
const filter = { _id: ObjectID(fqConstants.FIRST_QUEST_DB_DOCUMENT_ID) };
const options = { upsert: false };
origMessages[key] = content;
const updateDoc = { $set: { messages: origMessages, last_updated: timestamp } };
const update = await firstQuestContent.updateOne(filter, updateDoc, options);
Log.info('First Quest message content updated', opts);
const channels = await member.guild.channels.fetch();
const fqProjectChannel = channels.get(channelIds.firstQuestProject) as TextBasedChannels;
// This has to be split up into separate messages to not exceed 2000 character limit of discord
await fqProjectChannel.send({ content: `<@&${ roleIds.firstQuestProject }> : First Quest message content was updated by user ${ member.user.username } with user id: ${ member.user.id } ` });
await fqProjectChannel.send({ content: '**Original message:**' });
await fqProjectChannel.send({ content: logMeta.origContent });
await fqProjectChannel.send({ content: '**New message:**' });
await fqProjectChannel.send({ content: logMeta.newContent });
return (update.result.ok && update.result.nModified) ? 'Message updated successfully' : 'Could not update message, please try again';
};
const fetchData = async () => {
const db: Db = await dbInstance.connect(constants.DB_NAME);
const firstQuestContent = await db.collection(constants.DB_COLLECTION_FIRST_QUEST_CONTENT).find({});
return await firstQuestContent.toArray();
};
const createEmbed = async (data) => {
const embed = {
title: 'Overview of current message content',
fields: [],
footer: { text: 'select emote to edit corresponding question' },
};
for (const [index, [, value]] of Object.entries(Object.entries(data[0].messages as Record<string, string>))) {
// eslint-disable-next-line
const regexUrl = /(?:(?:https?|ftp|file):\/\/|www\.|ftp\.)(?:\([-A-Z0-9+&@#\/%=~_|$?!:,.]*\)|[-A-Z0-9+&@#\/%=~_|$?!:,.])*(?:\([-A-Z0-9+&@#\/%=~_|$?!:,.]*\)|[A-Z0-9+&@#\/%=~_|$])/igm;
embed.fields.push({
name: `Message ${constants.EMOJIS[(parseInt(index) + 1).toString()]}`,
value: value.replace(regexUrl, 'URL REMOVED').replace(/\\n/g, '\n').slice(0, 200) + '...',
});
}
return embed;
};
const createEmojiArray = (len) => {
const emojiArray = [];
for (let i = 0; i < len; i++) {
emojiArray.push(constants.EMOJIS[(i + 1).toString()]);
}
return emojiArray;
}; |
// WithNumber configures the Payload to accept a JSON Number value (disabled by
// default).
//
// When WithNumber is used to enable this behavior, it works like calling
// WithInt. Otherwise, JSON Number will be disabled (regardless of whether it
// had been previously enabled using WithUint or WithFloat).
//
// The default behavior when calling this method is to enable this
// configuration.
func (p *Payload) WithNumber(enable ...bool) *Payload {
p.with[Number] = len(enable) == 0 || enable[0]
if p.with[Number] {
return p.WithInt()
}
return p
} |
/** Creates the unicast and multicast sockets and starts the unicast and multicast receiver threads */
public void start() throws Exception {
timer.start();
if(time_service != null)
time_service.start();
fetchLocalAddresses();
startDiagnostics();
bundler.start();
setInAllThreadFactories(cluster_name != null? cluster_name.toString() : null, local_addr, thread_naming_pattern);
} |
/*
* Copyright (c) 2008-2018 Haulmont.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.haulmont.cuba.web.gui.facets;
import com.haulmont.bali.util.ParamsMap;
import com.haulmont.cuba.gui.GuiDevelopmentException;
import com.haulmont.cuba.gui.components.Timer;
import com.haulmont.cuba.gui.components.compatibility.LegacyFragmentAdapter;
import com.haulmont.cuba.gui.screen.FrameOwner;
import com.haulmont.cuba.gui.xml.FacetProvider;
import com.haulmont.cuba.gui.xml.layout.ComponentLoader.ComponentContext;
import com.haulmont.cuba.web.gui.components.WebTimer;
import org.apache.commons.lang3.StringUtils;
import org.dom4j.Element;
import org.springframework.stereotype.Component;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.util.function.Consumer;
import static java.lang.Boolean.parseBoolean;
import static java.lang.Integer.parseInt;
import static org.apache.commons.lang3.StringUtils.isNotEmpty;
@Component("cuba_TimerFacetProvider")
public class TimerFacetProvider implements FacetProvider<Timer> {
@Override
public Class<Timer> getFacetClass() {
return Timer.class;
}
@Override
public Timer create() {
return new WebTimer();
}
@Override
public String getFacetTag() {
return "timer";
}
@Override
public void loadFromXml(Timer facet, Element element, ComponentContext context) {
loadTimer(facet, element, context);
}
protected void loadTimer(Timer timer, Element element, ComponentContext context) {
String id = element.attributeValue("id");
if (isNotEmpty(id)) {
timer.setId(id);
}
String delay = element.attributeValue("delay");
if (StringUtils.isEmpty(delay)) {
throw new GuiDevelopmentException("Timer 'delay' can't be empty", context,
"Timer ID", timer.getId());
}
int value = parseInt(delay);
if (value <= 0) {
throw new GuiDevelopmentException("Timer 'delay' must be greater than 0",
context, "Timer ID", timer.getId());
}
timer.setDelay(value);
timer.setRepeating(parseBoolean(element.attributeValue("repeating")));
// use @Subscribe event handlers instead
String onTimer = element.attributeValue("onTimer");
if (isNotEmpty(onTimer)) {
String timerMethodName = onTimer;
if (StringUtils.startsWith(onTimer, "invoke:")) {
timerMethodName = StringUtils.substring(onTimer, "invoke:".length());
}
timerMethodName = StringUtils.trim(timerMethodName);
addInitTimerMethodTask(timer, timerMethodName, context);
}
String autostart = element.attributeValue("autostart");
if (isNotEmpty(autostart)
&& parseBoolean(autostart)) {
timer.start();
}
}
// for compatibility only
@Deprecated
protected void addInitTimerMethodTask(Timer timer, String timerMethodName, ComponentContext context) {
FrameOwner controller = context.getFrame().getFrameOwner();
if (controller instanceof LegacyFragmentAdapter) {
controller = ((LegacyFragmentAdapter) controller).getRealScreen();
}
Class<? extends FrameOwner> windowClass = controller.getClass();
Method timerMethod;
try {
timerMethod = windowClass.getMethod(timerMethodName, Timer.class);
} catch (NoSuchMethodException e) {
throw new GuiDevelopmentException("Unable to find invoke method for timer", context,
ParamsMap.of(
"Timer Id", timer.getId(),
"Method name", timerMethodName));
}
timer.addTimerActionListener(new DeclarativeTimerActionHandler(timerMethod, controller));
}
@Deprecated
protected static class DeclarativeTimerActionHandler implements Consumer<Timer.TimerActionEvent> {
protected final Method timerMethod;
protected final FrameOwner controller;
public DeclarativeTimerActionHandler(Method timerMethod, FrameOwner controller) {
this.timerMethod = timerMethod;
this.controller = controller;
}
@Override
public void accept(Timer.TimerActionEvent e) {
try {
timerMethod.invoke(controller, e.getSource());
} catch (IllegalAccessException | InvocationTargetException ex) {
throw new RuntimeException("Unable to invoke onTimer", ex);
}
}
}
} |
def create_from_file(filename, **kwargs):
    """Instantiate a reduction object from a YAML config file.

    Keyword arguments override keys loaded from the file; the optional
    "type" key selects the class to load, defaulting to DEFAULT_REDUCTION_TYPE.
    """
cfg = load_yaml(filename)
cfg.update(kwargs)
class_type = cfg.pop("type", DEFAULT_REDUCTION_TYPE)
ReductionClass = load_module(class_type)
return ReductionClass(**cfg) |
#include <algorithm>
#include <cmath>
#include <iostream>
#include <limits>
#include <vector>
#ifdef LIKWID_PERFMON
#include <likwid.h>
#else
#define LIKWID_MARKER_INIT
#define LIKWID_MARKER_THREADINIT
#define LIKWID_MARKER_SWITCH
#define LIKWID_MARKER_REGISTER(regionTag)
#define LIKWID_MARKER_START(regionTag)
#define LIKWID_MARKER_STOP(regionTag)
#define LIKWID_MARKER_CLOSE
#define LIKWID_MARKER_GET(regionTag, nevents, events, time, count)
#endif
#include "nearest_neighbors_cpu.h"
NearestNeighborsCPU::NearestNeighborsCPU(uint32_t tau, uint32_t Tp,
bool verbose)
: NearestNeighbors(tau, Tp, verbose)
{
}
// clang-format off
void NearestNeighborsCPU::compute_lut(LUT &out, const Series &library,
const Series &target, uint32_t E,
uint32_t top_k)
{
const auto shift = (E - 1) * tau + Tp;
const auto n_library = library.size() - shift;
const auto n_target = target.size() - shift + Tp;
const auto p_library = library.data();
const auto p_target = target.data();
// Allocate temporary buffer for distance matrix
distances.resize(n_target * n_library);
timer_distances.start();
// Compute distances between all library and target points
#pragma omp parallel
{
LIKWID_MARKER_START("calc_distances");
}
#pragma omp parallel for
for (auto i = 0u; i < n_target; i++) {
#pragma omp simd
for (auto j = 0u; j < n_library; j++) {
distances[i * n_library + j] = 0.0f;
}
for (auto k = 0u; k < E; k++) {
const float tmp = p_target[i + k * tau];
#pragma omp simd
for (auto j = 0u; j < n_library; j++) {
// Perform embedding on-the-fly
auto diff = tmp - p_library[j + k * tau];
distances[i * n_library + j] += diff * diff;
}
}
}
#pragma omp parallel
{
LIKWID_MARKER_STOP("calc_distances");
}
// Ignore degenerate neighbors
#pragma omp parallel for
for (auto i = 0u; i < n_target; i++) {
for (auto j = 0u; j < n_library; j++) {
if (p_target + i == p_library + j) {
distances[i * n_library + j] =
std::numeric_limits<float>::infinity();
}
}
}
timer_distances.stop();
// Allocate buffer in LUT
out.resize(n_target, top_k);
timer_sorting.start();
// Sort indices
#pragma omp parallel
{
LIKWID_MARKER_START("partial_sort");
#pragma omp for
for (auto i = 0u; i < n_target; i++) {
std::partial_sort_copy(Counter<uint32_t>(0),
Counter<uint32_t>(n_library),
out.indices.begin() + i * top_k,
out.indices.begin() + (i + 1) * top_k,
[&](uint32_t a, uint32_t b) -> uint32_t {
return distances[i * n_library + a] <
distances[i * n_library + b];
});
}
LIKWID_MARKER_STOP("partial_sort");
}
timer_sorting.stop();
// Compute L2 norms from SSDs and reorder them to match the indices
// Shift indices
#pragma omp parallel for
for (auto i = 0u; i < n_target; i++) {
for (auto j = 0u; j < top_k; j++) {
auto idx = out.indices[i * top_k + j];
out.distances[i * top_k + j] =
std::sqrt(distances[i * n_library + idx]);
out.indices[i * top_k + j] = idx + shift;
}
}
}
// clang-format on
|
Sociodemographic factors and obesity in preadolescent black and white girls: NHLBI's Growth and Health Study.
The association of sociodemographic and family composition data with obesity was studied in 1213 black and 1166 white girls, ages 9 and 10, enrolled in the National Heart, Lung, and Blood Institute's Growth and Health Study. Obesity was defined as body mass index at or greater than age- and sex-specific 85th percentile as outlined in the Second National Health and Nutrition Examination Survey. The prevalence of obesity was higher for pubertal girls than for prepubertal girls and for girls with older mothers/female guardians. An odds ratio of 1.14 was observed for each 5-year increase in maternal age. Obesity was less common for girls with more siblings; the odds for obesity decreased by 14% for each additional sibling in the household. In blacks, the prevalence of obesity was not related to parental employment or to parental education. In whites, the odds of obesity were higher for girls with no employed parent/guardian in the household and for girls with parents or guardians with lower levels of educational attainment. Examining the associations between sociodemographic factors and risk of childhood obesity provides important clues for understanding racial differences in obesity, a major risk factor for coronary heart disease.
package com.cooper.nwemail.models;
import com.cooper.nwemail.enums.ContactGroupEnum;
import java.util.List;
/**
* A ContactMethodType represents a group of contact methods and contains a list
* of contact methods within it.
*/
public class ContactMethodType {
private ContactGroupEnum mContactType;
private String header;
private List<ContactMethod> methods;
public ContactMethodType(final ContactGroupEnum type) {
mContactType = type;
}
public ContactGroupEnum getContactType() {
return mContactType;
}
public void setContactType(final ContactGroupEnum mContactType) {
this.mContactType = mContactType;
}
public void setHeader(final String header) {
this.header = header;
}
public String getHeader() {
return header;
}
public List<ContactMethod> getMethods() {
return methods;
}
public void setMethods(final List<ContactMethod> methods) {
this.methods = methods;
}
}
|
Circulating Leptin and Bone Mineral Density in Rheumatoid Arthritis
Objective. To evaluate the association between circulating leptin and bone mineral density (BMD) in patients with rheumatoid arthritis (RA). Methods. One hundred thirty postmenopausal women with RA were assessed for body mass index (BMI), disease characteristics, history of drug use, rheumatoid factor, and erythrocyte sedimentation rate (ESR). BMD (g/cm2) was determined in the hip and spine by DEXA. Serum leptin concentrations were measured by ELISA. Spearman’s correlation coefficients (rho) were determined between BMD and leptin and other variables. A multiple regression analysis was used to adjust for confounders. Results. Patients’ serum leptin levels varied widely (range 2–128 ng/ml). Thirty-three patients (25%) had osteoporosis. Higher levels of leptin correlated significantly with BMD in the lumbar spine (rho = 0.17, p = 0.04) and total hip (rho = 0.21, p = 0.01). The variables that were negatively correlated with BMD were age, duration of menopause, and ESR. After adjustment for confounders, leptin was no longer associated with BMD. In the multivariate model, factors that remained associated with BMD in the total hip were age (p = 0.021) and BMI (p = 0.003); and the factors that remained associated with BMD in the lumbar spine were BMI (p = 0.03) and ESR (p = 0.01). Conclusion. No relevant association was found between circulating leptin levels and BMD in patients with RA in this cross-sectional study. Follow-up studies are needed to evaluate whether abnormal leptin levels confer a risk for fractures due to osteoporosis.
Go By Feel, Skip The Lactate Threshold Test Matt Fitzgerald / November 25, 2013
The good news for runners is that they can take an at-home lactate threshold test. Photo: www.shutterstock.com
One study reveals that athletes can find their lactate threshold heart rate by feel.
Any good training plan for runners includes workouts that are intended to be performed at “lactate threshold” intensity. What is the lactate threshold? It is the intensity of exercise at which lactate — an intermediate product of carbohydrate metabolism in the muscles — begins to accumulate in the muscles because it’s being produced faster than the muscles can use it and the excess “leaks” into the bloodstream.
Why does every coach in the universe want runners to train at lactate threshold (LT) intensity about once a week? Because training at LT is a powerful and efficient way to build fitness. For most runners, the lactate threshold corresponds to the fastest pace that can be sustained for 30 to 60 minutes (closer to 30 minutes for less fit individuals, closer to 60 minutes for highly fit individuals). So it’s not a super-high intensity, but it’s not a low intensity either. It’s somewhere in the middle — call it moderately high. As such, it is hard enough to stimulate big gains in fitness but not so hard that it leaves a runner wiped out, as long as it is done in judicious amounts.
The greatest benefit of LT training is that it greatly increases a runner’s capacity to sustain faster running paces for prolonged periods of time. It doesn’t make you faster, but it does make you much slower to fatigue when running fast. Since LT intensity is in the neighborhood of 10K and half marathon race pace for most runners, we’re talking about a type of training that significantly increases how long a runner can sustain a desired race pace for such events.
The lactate threshold is defined by a concentration of lactate in the blood — specifically, a concentration of 4 millimoles per liter. But runners don’t care about that. They care about the heart rate or pace that is associated with that particular lactate concentration. If they know either of these values then they can train at lactate threshold intensity on their own by monitoring their heart rate or pace. Heart rate is a little more useful than pace because it is relevant to all environments, whereas the pace that is associated with LT varies depending on whether you’re running uphill, downhill, or on level terrain.
RELATED: Rethinking Threshold Training With A New Approach
The traditional method of determining LT heart rate is a lactate threshold test conducted in an exercise laboratory. After warming up on a treadmill you run at incrementally increasing speeds while wearing a heart rate monitor. For example, you might start with 3 minutes at 6.5 mph, then do 3 minutes at 6.7 mph, and so on. At each speed a blood sample is taken from a fingertip and its lactate concentration is measured. You keep going until you’re running at a pace that produces a blood lactate concentration that exceeds 4 mmol/L. The results are graphed and used to plot the exact heart rate at which that special threshold concentration was reached.
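For the numerically inclined, that graphing step amounts to a simple interpolation. The sketch below (a minimal Python illustration, using made-up stage readings rather than anything from the article) finds the heart rate at which blood lactate crosses the 4 mmol/L threshold.

def lt_heart_rate(stages, threshold=4.0):
    # stages: list of (heart_rate_bpm, lactate_mmol_per_l) tuples, ordered
    # from the easiest stage to the hardest.
    for (hr0, lac0), (hr1, lac1) in zip(stages, stages[1:]):
        if lac0 <= threshold <= lac1:
            frac = (threshold - lac0) / (lac1 - lac0)
            return hr0 + frac * (hr1 - hr0)
    return None  # the threshold was never reached during the test

# Hypothetical test: lactate creeps up, then climbs past 4 mmol/L.
stages = [(120, 1.5), (135, 2.0), (148, 2.8), (158, 3.9), (166, 5.2)]
print(lt_heart_rate(stages))  # roughly 158.6 bpm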
The problem with lab-based LT testing is that it’s expensive, inconvenient, and invasive. To spare athletes from these negatives, some coaches have come up with field tests to determine lactate threshold heart rate. One such field test is a 30-minute time trial. You warm up and then run as far as you can in 30 minutes. Your average heart rate in the last 10 minutes is your estimated LT heart rate. A comparison of this procedure to the lab-based LT test found that it was quite accurate. But it has a downside too: It’s very stressful — the equivalent of running a 30-minute race.
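The time-trial estimate itself is trivial to compute once the heart-rate trace has been recorded; a minimal sketch, assuming one sample per second from the monitor:

def lt_from_time_trial(heart_rates_bpm):
    # Average heart rate over the final 10 minutes (600 samples at 1 Hz)
    # of a 30-minute all-out run approximates the LT heart rate.
    last_ten_minutes = heart_rates_bpm[-600:]
    return sum(last_ten_minutes) / len(last_ten_minutes)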
Isn’t there some accurate way to determine LT heart rate in the field that does not leave a runner hyperventilating at the end? The short answer is yes. Johannes Scherr and colleagues at Munich Technical University recently demonstrated that runners and cyclists can find their own lactate threshold by feel — or, more precisely, by perceived exertion.
Perceived exertion is a global, subjective perception of how hard an exercise effort feels at any given moment. Scientists have traditionally used a tool called the Borg Scale to quantify perceived effort. The scale ranges from 6 to 20, which seems weird, but it was originally intended to correlate with heart rate values of 60 to 200 (which it doesn’t really do very well in practice anyway). On this scale, a perceived exertion rating (or RPE) of 6 represents a laughably easy effort and a rating of 20 represents an effort that is so miserably hard that exhaustion is but moments away.
RELATED: Run Faster By Improving Your Lactate Clearance Rate
Over a long period of time, Dr. Scherr and his collaborators performed cycling and running LT tests on 2,560 men and women between the ages of 17 and 44 years. Except there was a twist: Instead of taking objective measurements only, as in conventional LT testing, they also asked the subjects to rate their perceived exertion on Borg’s 6-20 scale at each step. When the researchers crunched the numbers, they discovered that a subjective effort rating of 13 on this scale consistently correlated with a blood lactate concentration of 4 mmol/L. This was true across ages, genders, exercise modalities and fitness levels.
The practical implication of this finding is that runners like you can now find their own LT heart rate at home without undue suffering. All you have to do is strap on a heart rate monitor and perform your own graded exercise test. Start off at a low intensity and rate the effort on a 6-20 scale. After 2 or 3 minutes, increase your speed slightly and rate your effort again. Keep doing this until your RPE reaches 13 and then note your heart rate. That's your lactate threshold heart rate.
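A rough sketch of that at-home protocol as a small interactive script; the starting speed, increment, and prompts are placeholders, and the heart rate is entered by hand rather than read from the monitor automatically:

def rpe_field_test(start_speed_mph=5.0, step_mph=0.3):
    # Step up the speed until the runner reports an RPE of 13 on the 6-20
    # Borg scale, then record the heart rate shown on the monitor.
    speed = start_speed_mph
    while True:
        print("Run 2-3 minutes at {:.1f} mph, then rate the effort.".format(speed))
        rpe = int(input("RPE (6-20): "))
        if rpe >= 13:
            return int(input("Current heart rate (bpm): "))
        speed += step_mph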
Personally, I am particularly gratified by Johannes Scherr's study because it validates my own perceived exertion-based lactate threshold field test I designed for PEAR Sports a few years ago. This one is even easier to do because it uses a less unwieldy 1-10 scale (LT falls at 6 on this scale) and I guide users through it step by step with audio instructions delivered through headphones. When I did the test myself I got an LT heart rate of 160 bpm. When I went to a lab for a formal LT test a few days later the result was 159 bpm. Not too shabby!
RELATED: Is Threshold Psychological Or Physiological?
****
About The Author:
Matt Fitzgerald is the author of numerous books, including Iron War: Dave Scott, Mark Allen & The Greatest Race Ever Run (VeloPress, 2011). He is also a Training Intelligence Specialist for PEAR Sports. To learn more about Matt visit www.mattfitzgerald.org. |
//Wildcard Capture and Helper Methods
import java.util.List;

public class WildcardError {
/*void foo(List<?> i) {
i.set(0, i.get(0));
}
}*/
void foo(List<?> i) {
fooHelper(i);
}
private <T> void fooHelper(List<T> l) {
T t;
t=l.get(0);
l.set(0, l.get(0));
}
/*void swapFirst(List<? extends Number> l1, List<? extends Number> l2) {
Number temp = l1.get(0);
l1.set(0, l2.get(0)); // expected a CAP#1 extends Number,
// got a CAP#2 extends Number;
// same bound, but different types
l2.set(0, temp); // expected a CAP#1 extends Number,
// got a Number
}*/
//There is no helper method to work around the problem, because the code is fundamentally wrong.
} |
def retrieve_observation(obsid, suffix=['FLC'], archive=False, clobber=False):
    """Query MAST for the given observation ID and download its FLC products,
    falling back to FLT files when no FLC products exist.

    Returns a list of local filenames; existing files are skipped unless
    `clobber` is True. Downloads are copied out of the 'mastDownload' tree when
    `archive` is True (keeping the tree), or moved out and the tree removed
    otherwise.
    """
local_files = []
if Observations is None:
log.warning("The astroquery package was not found. No files retrieved!")
return local_files
obs_table = Observations.query_criteria(obs_id=obsid, obstype='all')
if not obs_table:
log.info("WARNING: Query for {} returned NO RESULTS!".format(obsid))
return local_files
dpobs = Observations.get_product_list(obs_table)
data_products_by_id = Observations.filter_products(dpobs,
productSubGroupDescription=suffix,
extension='fits',
mrp_only=False)
if not data_products_by_id:
log.info("WARNING: No FLC files found for {} - will look for FLT "
"files instead.".format(obsid))
suffix = ['FLT']
data_products_by_id = Observations.filter_products(dpobs,
productSubGroupDescription=suffix,
extension='fits',
mrp_only=False)
if not data_products_by_id:
log.info(
"WARNING: No FLC or FLT files found for {}.".format(obsid))
return local_files
all_images = data_products_by_id['productFilename'].tolist()
log.info(all_images)
if not clobber:
rows_to_remove = []
for row_idx, row in enumerate(data_products_by_id):
fname = row['productFilename']
if os.path.isfile(fname):
log.info(fname + " already exists. File download skipped.")
rows_to_remove.append(row_idx)
data_products_by_id.remove_rows(rows_to_remove)
manifest = Observations.download_products(data_products_by_id,
mrp_only=False)
if not clobber:
for rownum in rows_to_remove[::-1]:
if manifest:
manifest.insert_row(rownum,
vals=[all_images[rownum], "LOCAL", "None", "None"])
else:
return all_images
download_dir = None
for file, file_status in zip(manifest['Local Path'], manifest['Status']):
if file_status != "LOCAL":
if download_dir is None:
download_dir = os.path.dirname(os.path.abspath(file))
local_file = os.path.abspath(os.path.basename(file))
if archive:
shutil.copy(file, local_file)
else:
shutil.move(file, local_file)
local_files.append(os.path.basename(local_file))
else:
local_files.append(file)
if not archive:
shutil.rmtree('mastDownload')
return local_files |
/**
* Instance state service.
*/
public final class StateService {
private final StateNode stateNode;
private final RegistryCenterRepository regCenter;
private final OrchestrationInstance instance;
public StateService(final String name, final RegistryCenterRepository regCenter) {
stateNode = new StateNode(name);
this.regCenter = regCenter;
instance = OrchestrationInstance.getInstance();
}
/**
* Persist instance online.
*/
public void persistInstanceOnline() {
regCenter.persistEphemeral(stateNode.getInstancesNodeFullPath(instance.getInstanceId()), "");
}
/**
* Initialize data sources node.
*/
public void persistDataSourcesNode() {
regCenter.persist(stateNode.getDataSourcesNodeFullRootPath(), "");
}
} |
#include<bits/stdc++.h>
using namespace std;
// Greedy interval covering: starting from the first uncovered position,
// repeatedly pick the interval that starts at or before it and reaches
// furthest to the right; print -1 if a gap cannot be covered.
int main()
{
int n,i,a[2000]={0},b[2000]={0},cnt_1=0,k=1,maxi=-5,f=1,j=1,g=1,ans=0,l[2000]={0},r[2000]={0},r_u;
cin>>n>>r_u;
for(i=1;i<=n;i++)
{
cin>>a[i];
if(a[i]==1)
{
cnt_1++;
l[k]=i-r_u+1;
if(l[k]>n){l[k]=n;}
if(l[k]<1){l[k]=1;}
r[k]=i+r_u-1;
if(r[k]>n){r[k]=n;}
if(r[k]<1){r[k]=1;}
k++;
}
}
while(1)
{ maxi=-5;
for(j=f;j<k;j++)
{
if(l[j]<=g)
{
int temp;
temp=r[j]-g+1;
if(temp>maxi)
{
maxi=temp;f=j+1;
}
}
else
{
break;
}
}
if(maxi==-5)
{
cout<<"-1"<<endl;return 0;
}
g=g+maxi;ans++;
if(g>n)
{
break;
}
}
cout<<ans<<endl;
return 0;
}
|
/*
* Copyright (c) 2016 <NAME>
*/
#ifndef CPPLINT_EXAMPLE2_FILE1_H_
#define CPPLINT_EXAMPLE2_FILE1_H_
void test_me();
#endif /* CPPLINT_EXAMPLE2_FILE1_H_ */
|
/**
* Token identifier for container operations, similar to block token.
*/
@InterfaceAudience.Private
public class ContainerTokenIdentifier extends ShortLivedTokenIdentifier {
public static final Text KIND = new Text("HDDS_CONTAINER_TOKEN");
private ContainerID containerID;
public ContainerTokenIdentifier() {
}
public ContainerTokenIdentifier(String ownerId, ContainerID containerID,
String certSerialId, Instant expiryDate) {
super(ownerId, expiryDate, certSerialId);
this.containerID = containerID;
}
@Override
public Text getKind() {
return KIND;
}
@Override
public void write(DataOutput out) throws IOException {
ContainerTokenSecretProto.Builder builder = ContainerTokenSecretProto
.newBuilder()
.setOwnerId(getOwnerId())
.setCertSerialId(getCertSerialId())
.setExpiryDate(getExpiry().toEpochMilli())
.setContainerId(containerID.getProtobuf());
out.write(builder.build().toByteArray());
}
@Override
public void readFields(DataInput in) throws IOException {
final DataInputStream dis = (DataInputStream) in;
if (!dis.markSupported()) {
throw new IOException("Could not peek first byte.");
}
ContainerTokenSecretProto proto =
ContainerTokenSecretProto.parseFrom((DataInputStream) in);
setCertSerialId(proto.getCertSerialId());
setExpiry(Instant.ofEpochMilli(proto.getExpiryDate()));
setOwnerId(proto.getOwnerId());
this.containerID = ContainerID.getFromProtobuf(proto.getContainerId());
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
ContainerTokenIdentifier that = (ContainerTokenIdentifier) o;
return super.equals(that) &&
        Objects.equals(containerID, that.containerID);
}
@Override
public int hashCode() {
return Objects.hash(super.hashCode(), getExpiry());
}
@Override
public String getService() {
return containerID.toString();
}
} |
// ToStr applies a function that takes an int and returns an S. If the AoAoI
// is invalid or if any function returns an invalid S, ToStr returns an
// invalid AoAoS. Note: unlike Map, this is a deep conversion of individual
// elements of the 2-D slice of ints.
func (m AoAoI) ToStr(f func(x int) S) AoAoS {
if m.IsErr() {
return ErrAoAoS(m.err)
}
xss := make([][]string, len(m.just))
for i, xs := range m.just {
xss[i] = make([]string, len(xs))
for j, v := range xs {
num, err := f(v).Unbox()
if err != nil {
return ErrAoAoS(err)
}
xss[i][j] = num
}
}
return JustAoAoS(xss)
} |
#ifndef ALLNET_IDS_H
#define ALLNET_IDS_H
//The DevIDs from ALLNET IP multi socket device.
#define ALLNET_DEVID_WW 1
#define ALLNET_DEVID_FBH 2
#define ALLNET_DEVID_HK 3
#endif /* ALLNET_IDS_H */
|
/**
* The subject has changed. Notify listeners that the value has changed.
*/
protected synchronized void subjectChanged() {
Object oldValue = this.getValue();
boolean hasListeners = this.hasListeners();
if (hasListeners) {
this.disengageSubject();
}
this.subject = this.subjectHolder.getValue();
if (hasListeners) {
this.engageSubject();
this.fireAspectChange(oldValue, this.getValue());
}
} |
def property_MPI() -> ModuleType:
    # `load()` and the module-level `this` object are assumed to be defined
    # elsewhere in this file; ModuleType comes from the standard `types` module.
    load()
    assert this._MPI is not None
    return this._MPI
// Copyright (c) 2019 Seams-CMS and contributors. All rights reserved.
// Use of this source code is governed by the MIT License that can be found in
// the LICENSE file.
package change
import (
"github.com/stretchr/testify/assert"
"testing"
)
func TestNewClient(t *testing.T) {
api := NewClient("space", "api-key")
assert.Contains(t, api.seamsClient.BaseUrl, "https://change.seams-api.com")
}
func TestNewClientWithConfig(t *testing.T) {
config := Configuration{
"space",
"api-key",
false,
"",
}
api := NewClientWithConfig(&config)
assert.Contains(t, api.seamsClient.BaseUrl, "https://change-nocdn.seams-api.com")
}
func TestNewClientWithConfig_baseurl(t *testing.T) {
config := Configuration{
"space",
"api-key",
false,
"http://base.url",
}
api := NewClientWithConfig(&config)
assert.Contains(t, api.seamsClient.BaseUrl, "http://base.url")
}
|
/**
*** JSON Parse Exception
**/
public static class JSONParsingException
extends Exception
{
private int index = 0;
public JSONParsingException(String msg, int ndx) {
super(msg);
this.index = ndx;
}
public int getIndex() {
return this.index;
}
public String toString() {
String s = super.toString();
return s + " ["+this.index+"]";
}
} |
import heapq
# Best-first search: keep a max-heap (values negated for Python's min-heap) of
# candidate index triples into the three sorted lists, pop the largest sum,
# print it, and push its three neighbouring triples; repeat k times.
x,y,z,k= map(int,input().split())
alist = sorted(list(map(int,input().split())),reverse=True)
blist = sorted(list(map(int,input().split())),reverse=True)
clist = sorted(list(map(int,input().split())),reverse=True)
l = [(-(alist[0]+blist[0]+clist[0]),0,0,0)]
heapq.heapify(l)
count=0
ai=bi=ci=0
selected = set()
selected.add((0,0,0))
while len(l)!=0:
temp = heapq.heappop(l)
(tempa,tempb,tempc)=temp[1:]
print(-temp[0])
count += 1
if count==k:break
if tempa+1<=x-1:
if not (tempa+1,tempb,tempc) in selected:
heapq.heappush(l,(-(alist[tempa+1]+blist[tempb]+clist[tempc]),tempa+1,tempb,tempc))
selected.add((tempa+1,tempb,tempc))
if tempb+1<=y-1:
if not (tempa,tempb+1,tempc) in selected:
heapq.heappush(l,(-(alist[tempa]+blist[tempb+1]+clist[tempc]),tempa,tempb+1,tempc))
selected.add((tempa,tempb+1,tempc))
if tempc+1<=z-1:
if not (tempa,tempb,tempc+1) in selected:
heapq.heappush(l,(-(alist[tempa]+blist[tempb]+clist[tempc+1]),tempa,tempb,tempc+1))
selected.add((tempa,tempb,tempc+1)) |
/*
* Use this function to delete a vport object. Fabric object should
* be stopped before this function call.
*
 * !!!!!!! Do not invoke this from within FCS !!!!!!!
*
* param[in] vport - pointer to bfa_fcs_vport_t.
*
* return None
*/
bfa_status_t
bfa_fcs_vport_delete(struct bfa_fcs_vport_s *vport)
{
if (vport->lport.port_cfg.preboot_vp)
return BFA_STATUS_PBC;
bfa_sm_send_event(vport, BFA_FCS_VPORT_SM_DELETE);
return BFA_STATUS_OK;
} |
An open letter to our friends, family, and fans of Immersion Arcade,
Since the conception of our company in Summer 2016, we’ve been on a mission to deliver the very best experience in virtual reality.
After several months of operations we’ve learned so much about what it takes to run an arcade of the future.
Unfortunately, we recently shut our doors and have decided to cease operations entirely. After abruptly losing our lease, we have both moved on to different academic and entrepreneurial pursuits which has consumed most of our available time.
While Immersion Arcade is no longer in operation, we've taken our knowledge and understanding of arcades and the virtual reality industry as a whole and pivoted toward our recent venture, VR League, an e-sports company focused on VR tournament play, virtual reality news, education, consumer research, and consulting.
Our mission is to create a community of arcades to share their love for virtual reality while providing thought leadership on the future of this growing industry.
We are still in the exploratory stages for VR League and look forward to seeing where it takes us in the future! Thank you all so much for the support and encouragement along the way!
Visit VR League on Facebook for more information.
Sincerely,
Brian Jesse and Derick Downey (Co-Founders, Immersion Arcade). |
// clone returns a shallow copy of the current jwt instance.
func (j *jwt) clone() *jwt {
return &jwt{
issuer: j.issuer,
signMethod: j.signMethod,
expiredTime: j.expiredTime,
refreshTime: j.refreshTime,
tokenCtxKey: j.tokenCtxKey,
tokenSeeks: j.tokenSeeks,
rsaPublicKey: j.rsaPublicKey,
rsaPrivateKey: j.rsaPrivateKey,
ecdsaPublicKey: j.ecdsaPublicKey,
ecdsaPrivateKey: j.ecdsaPrivateKey,
secretKey: j.secretKey,
adapter: j.adapter,
identityKey: j.identityKey,
ctx: j.ctx,
}
} |
Focal therapy for prostate cancer: The current status
Purpose In an era of increasing prostate cancer incidence and earlier detection, the assessment of clinical significance of prostate cancer is critical. Minimally invasive therapies are increasingly being investigated in localized prostate cancer. Methods and results In this review, we discuss the current status of magnetic resonance imaging targeted fusion prostate biopsy and focal therapy for prostate cancer, its rationale, and techniques. Conclusion Focal therapy offers a promising outlook for prostate cancer treatment, with the goal of effectively achieving cancer control while minimizing morbidity. Long-term studies are needed.
Rationale for focal therapy for prostate cancer
With the widespread use of prostate-specific antigen (PSA) screening and increasing life-expectancy, more men are being diagnosed with localized, low-risk, low-grade prostate cancer. 1 These patients can be managed with definitive therapy, including radical prostatectomy (RP) or radiation therapy (RT). However, these radical therapies are associated with significant complication risks and side effects, which may be unsuitable for or undesired by the patient with low-risk prostate cancer. In an era of increasing prostate cancer incidence and stage migration toward earlier disease, appropriate management of the disease requires assessment of the risk of clinical significance of the disease. Minimally invasive therapies are increasingly being investigated as an alternative.
Prostate cancer is relatively slow growing, with doubling times for local tumors estimated at 2–4 years. Some prostate cancers prove to be so small, low-grade, and noninvasive that they appear to pose little risk to the patient, and are considered indolent. A recent review suggests that 49% of men undergoing RP have pathological features in the RP specimen consistent with an insignificant or indolent cancer (organ-confined cancer < 0.5 mL, no Gleason Grade 4 or 5 component). 2 Up to 33% of patients on active surveillance (AS) eventually fall out of surveillance and undergo definitive treatment after 2–5 years because of initial understaging or disease progression. 3 Seventy-three percent of patients initially enrolled in AS who undergo RP have a significant cancer on RP specimens. 4 Other downsides of AS include the mental and emotional burden and anxieties associated with untreated cancer. Therefore, AS is an option for only a select group of men.
In order to cure and control localized prostate cancer, the concept of focal therapy has emerged. Focal therapy is the middle ground between AS and radical therapy, offering much less morbidity with cancer control. Focal destruction of cancer, with preservation of the surrounding organ, has already been used widely in the oncological treatment of kidney, liver, breast, and brain.
The concept of focal therapy is relevant for prostate cancer in a number of ways. First of all, there is strong evidence that the vast majority of metastases find their origin in the same prostate cancer cell clone, derived from the same lesion called the index lesion. 5,6 Histopathological features of the index lesion predict the clinical behavior of the entire gland despite multiple synchronous tumors in >90% of patients. 7,8 While prostate cancer is typically multifocal with clonal heterogeneity of prostate cancer within the gland, not all tumors within a single gland have the potential for lethality. Historically, the threshold for clinically significant disease, capable of metastatic progression, has been set at 0.5 mL, with some Gleason grade 4 component. 7,8 It has been shown that in >80% of patients with an index lesion of cancer, the aggregate volume of secondary tumors is < 0.5 mL. 7,8 Since most metastatic cancers originate from a single clonal cancer cell, it would be reasonable and effective to identify and target this potentially lethal lesion with focal therapy. Thus, selective treatment of clinically significant disease, with acceptance of residual, insignificant disease, may serve as a meaningful treatment paradigm. To date, limited clinical data exist regarding outcomes of focal therapy. 9
Candidate selection/risk strata
The selection of patients is a critical element of the challenges of focal therapy adoption and use. Patient candidate selection should ultimately be based on the intent of focal therapy. In those patients in whom focal therapy is utilized for cure, the disease should be low risk and low volume in a targetable area of the prostate. The ideal patient would be one with low-stage, low-risk prostate cancer that could be completely eradicated.
Focal therapy can be used with the intent of disease control. A therapy to control cancer would prolong the natural history of prostate cancer and delay the morbidity of radical treatment. In this situation, focal therapy would treat the dominant lesion or index lesion. In doing so, focal therapy could prolong the period of surveillance, and mitigate the uncertainties and anxieties of pure AS. 10 Lastly, focal therapy could be utilized as a part of a multimodal treatment approach in the high-risk patient who would likely fail single-modality therapy, but avoid the morbidities associated with radical treatment. Use of focal therapy for noncurative intent has yet to be validated and studied. 10 Up to now, most trials have included only low-risk patients under the premise that men with low-risk disease are at little risk of systemic relapse, and thus, local disease control can be a measure of treatment efficacy. 11 As focal targeting methods develop, there is a stronger impetus to treat men who are at risk of disease-related mortality, as they may be the ones to benefit the most. In treating only low-risk patients, one can argue that the benefit of therapy may never be proven, as these patients would have fared well on surveillance anyway. However, most focal therapy trials include lowrisk patients due to the known risk of 30e40% upgrading of surgical pathology from biopsy pathology. At this time, it is not clear if Gleason 7 (3 þ 4) with small proportion of 4 has a similar favorable outcome as Gleason 6. Gleason 7 (3 þ 4) has an intermediate risk of relapse, and therefore gives focal therapy the opportunity to treat and prevent prostate cancer relapse. The heterogeneity in biological behavior of Gleason 7 tumors has been shown. Gleason score 4 þ 3 tumors had an increased risk of progression (compared to Gleason 3 þ 4 tumors) independent of stage and margin status, and were predictive of metastatic disease (as opposed to Gleason 3 þ 4 tumors). 12 In addition, Gleason 4 þ 3 tumors were more strongly associated with extraprostatic extension and upgrading on surgical specimens than Gleason 3 þ 4 tumors were. 13 Gleason 7 (4 þ 3) tumors have a similar risk of relapse as Gleason 8 (4 þ 4) tumors.
Candidate selection relies heavily on accurate patient identification and risk stratification. Risk stratification can be used to assess the chance of unfavorable pathology, poor oncological outcome, biochemical recurrence, and survival. Low-risk category patients have a low risk of short-term cancer mortality. The D'Amico classification is the most common classification used to stratify the risk of biochemical recurrence after radical treatment. 14 The percentage of Gleason 4 tumors is sharply correlated with outcome. Stamey et al suggested that 20% of Gleason 4/5 tumors on biopsy (which is correlated to the same percentage of Gleason 4/5 tumor in RP specimens) represents the lower-risk subset of those harboring a Gleason 4 pattern. 15
Limitations of standard systematic biopsy
Transrectal ultrasound (TRUS)-guided biopsy using a 12-core sampling scheme is the standard approach for prostate cancer diagnosis. 16 Performing TRUS biopsy for focal therapy selection is felt to be inadequate due to the risk of underestimating disease risk, volume, and focality. 17 It has been shown that if a 12-core biopsy shows unilateral disease, there is a 75% chance of a tumor on the contralateral side. 18 Focal therapy selection and planning requires accurate assessment of these parameters.
The success of focal therapy clearly depends on the ability to detect the extent and laterality of prostate cancer and then accurately target it. There is no consensus currently on patient selection protocols for focal therapy. The reason for this is twofold. So far, there has been a lack of adequate biopsy techniques that can accurately detect prostate cancer lesions, and also a lack of imaging modalities to complement inadequate biopsies. Detection relies upon reduction of sampling error through the number of samples taken and the location of the samples in the prostate. 19,20 In men with negative biopsies, repeat biopsy is often used up to five or six times before detection – sampling error is overcome through increased sampling. This approach of random sampling leads to three intrinsic errors: (1) underdetection by missing a potentially lethal cancer; (2) overdetection by identifying a small nonlethal cancer; and (3) misclassification by identifying an apparent low-risk cancer in someone with high-risk disease. Even extended TRUS-guided saturation biopsy appears to be inadequate in the proper selection of patients for focal therapy. 21 Transperineal (TP) biopsy with three-dimensional (3D) mapping was thought to improve on cancer localization, as samples are taken every 5 mm throughout the volume of the prostate using a brachytherapy template grid under TRUS guidance. However, >61% of patients diagnosed with unilateral cancer on TP biopsy were found to have bilateral disease, and 27% were upstaged in Gleason score. 22,23 Moreover, TP biopsy has fallen out of favor due to time demands, need for anesthesia, and cost.
Biopsy sampling error may be better addressed through localization of the cancer region by imaging than through simply increasing sampling. To achieve this goal, fusion biopsy has evolved as the standard for accurate maximal fusion of disease foci, according to a consensus panel. 24
MRI-targeted fusion biopsy
The evolution of MRI to multiparametric MRI (MP MRI) is an important innovation for focal therapy in prostate cancer. A typical MP MRI includes T1-weighted sequences with dynamic contrast enhancement (DCE) sequence, T2-weighted sequences, and diffusion-weighted imaging (DWI) sequences performed by torso phased-array coils. 25 MP MRI is the best noninvasive imaging test for the visualization of cancer foci in prostate. While MP-MRI may not detect all foci of disease in the prostate, it appears to better detect clinically significant foci based upon Gleason score and cancer volume. 26 For significant lesions, as defined previously, sensitivity and specificity of MP MRI are up to 90%. 27 In one study, sensitivity, specificity, negative predictive value, and accuracy for peripheral zone cancer detection at biopsy were, respectively, 100, 51.4, 100 and 66.7%. 28 In a series of 83 patients studied by multiparametric imaging (T2 + DWI + DCE) at 1.5 T before biopsy, MRI was associated with a high sensitivity, specificity, and accuracy for detection of prostate cancer of 95%, 74%, and 86%, respectively. 29 MP MRI, as a 3D technique, can determine prostate cancer foci location within the gland and volume/shape of the tumor and can be used to target lesions.
MRI–ultrasound fusion technology has recently allowed targeted biopsies to cancer-suspicious regions noted on MRI. The Artemis spatial tracking and computerized biopsy system functions to record the position of biopsy cores within a 3D template reconstruction of the prostate. Computer software allows fusion of the patient's MRI with real-time ultrasound while performing the Artemis biopsy, allowing targeting of the abnormal region on MRI during Artemis biopsy.
Contemporary series suggest that 63% of men with elevated PSA would have an abnormality suspicious for PC on MP MRI. Targeted biopsies of these areas would lead to cancer identification in ~65% of those men. 30–32 While overall cancer detection rates are lower with targeted than systematic biopsy, the majority of cancers missed on targeted biopsy are deemed clinically insignificant (likely nonlethal) as measured by current pathological classification methods. 30,31,33,34 MRI-targeted biopsy results in a 42% clinically significant cancer detection rate. 30 Haffner et al studied extended systematic biopsies and MRI-targeted biopsies in the same patient in their study group. Targeted biopsies detected 16% more Grade 4–5 cases and better quantified the cancer than did extended systematic biopsies, with cancer lengths of 5.56 mm versus 4.70 mm (P = 0.002). 31 In a recent study by Sonn et al 35 of patients with negative prior biopsies or with prostate cancer on AS, the addition of MRI–US fusion targeted biopsies to systematic biopsies increased the rate of diagnosis of all cancers, as well as Gleason 7 cancer. Thirty-eight percent of men with Gleason 7 cancer had disease detected only via targeted biopsies of lesions identified on MRI. 35 In men with clinically suspected prostate cancer, a biopsy using MRI to inform the sampling was associated with a 42% clinically significant cancer detection rate. 30 Fusion biopsy is more accurate than transrectal biopsy, with a higher cancer detection rate of 55% (vs. 24–40% in standard 12-core biopsy) and more upgrading of the Gleason score in up to 32% of cases, with increased detection of clinically significant higher Gleason score cancers. 36,37
Cryotherapy
The initial experience in focal cryoablation was reported by Onik and colleagues. 38 In their study, nine men with unilateral prostate cancer on biopsy underwent cryoablation with preservation of the neurovascular bundle on the contralateral unaffected side. With a mean follow up of 3 years, all men had stable PSAs and all six men who underwent repeat biopsies were negative for pathological recurrence. Seven of nine men were potent.
Several other clinical studies have investigated the use of focal cryotherapy since Onik's initial study. 39–41 While a majority of trials utilized a hemiablative approach, optimal cryoprobe placement has yet to be determined (Fig. 1). Computer-based techniques have been developed to improve cryoprobe placement so as to maximize destruction of targeted tissue while sparing adjacent noncancerous tissue 42 (Figs. 2 and 3).
There is a lack of consensus on how recurrence is defined after cryotherapy, and no accepted definition of PSA failure after primary therapy, making outcome data hard to interpret. Among the existing literature, it is also difficult to ascertain whether these patients had true recurrence from missed treatment versus cancer that was originally missed on staging and biopsy. Lambert et al reported a 12% biochemical recurrence rate (defined as PSA nadir > 50%), with 43% showing biopsy-proven recurrence on repeat biopsy. Bahn et al reported a low 7% rate of biochemical recurrence, with only 1/25 (4%) men having evidence of cancer on repeat biopsy. 40 Truesdale et al 43 reported a biochemical failure rate of 27.3% according to the Phoenix definition of PSA nadir + 2 and a 46% positive rebiopsy rate among cases with suspicion for recurrence. Most of these recurrences (70–93%) occurred on the untreated contralateral side, which may indicate a failure of initial staging rather than of the treatment.
The largest published experience and outcomes with focal cryotherapy come from the Cryo On-Line Data (COLD) registry. In its latest update, of 1160 patients treated with focal cryoablation, the biochemical recurrence-free rate (ASTRO definition of three consecutive PSA rises after the post-treatment nadir) at 3 years was 75.7%. Prostate biopsy was performed in 14.1% and was positive in 26.3% of these patients, which comprised only 3.7% (43/1160) of all treated patients. 44 Older trials and reviews reflect a combination of older and newer cryosystems, making the data difficult to interpret or apply to current methods. From the COLD registry, urinary continence (defined as use of 0 pads) was 98.4%, and maintenance of spontaneous erections was 58.1%. Urinary retention and fistula rates were both low, with prolonged urinary retention (>30 days) occurring in six (1.1%) patients and rectourethral fistula observed in one (0.1%) patient. 44 Focal cryoablation is increasingly used for selected patients with prostate cancer, with a 10-fold increase in use from 1999 to 2005 based on the COLD registry. Oncological efficacy in the most recent COLD series update appears similar to that of whole-gland cryoablation. 44
High-intensity focused ultrasound
The initial study demonstrating high-intensity focused ultrasound (HIFU) success in treating prostate cancer was published in 1995. 45 HIFU works by ablating tissue via US-guided application of mechanical and thermal energy. The two mechanisms of tissue damage are the conversion of mechanical energy into heat and inertial cavitation.
The majority of published results using HIFU have investigated its efficacy as a whole-gland treatment (Fig. 4). Ganzer and colleagues recently reported 14-year follow-up data on oncological and functional outcomes in 538 men. The biochemical disease-free rate was 81% at 5 years and 61% at 10 years. 46 Previous studies have cited biochemical disease-free rates ranging from 45% to 84% at 5 years and 69% at 7 years, using ASTRO or Phoenix criteria. 47 In the Ganzer study, metastatic disease was reported in 0.4–6% of low- and intermediate-risk patients, and in 15.4% of high-risk patients. Prostate-cancer-specific death occurred in 18 (3.3%) patients. 46 Based on recent reviews, the most commonly encountered morbidities after whole-gland therapy include impotence (44%), urinary incontinence (8%), urinary retention (5.3%), chronic perineal pain (3.4%), and rectourethral fistula (1%). 48 Other common complications include stress urinary incontinence (1–28%), urinary tract infection (0–58%), and urethral/bladder neck stenosis or strictures (1–31%). For the Ablatherm HIFU device, the rate of complications has been significantly reduced over the years owing to technical improvements; the rate of urinary retention was <10% and that of rectourethral fistula was 0–3%. 49 To date, few studies have used HIFU as focal therapy. Muto and colleagues compared 70 patients undergoing whole-gland HIFU to 29 with unilateral disease undergoing focal HIFU. At 12 months, there was an 82% negative biopsy rate, with focal treatment not appearing to compromise cancer control. Urinary symptoms did not differ significantly. 50 El Fegoun and colleagues 51 reported results of focal HIFU hemiablation performed in 12 patients. Median follow-up was 10 years. Recurrence-free survival was 90% at 5 years and 38% at 10 years. Five patients had salvage therapy with repeat HIFU (n = 1) or hormonal therapy (n = 4), and there were no metastases. Complications included one case of urinary retention and two urinary tract infections. 51 Another study by Ahmed et al 9 reported results of HIFU hemiablation in 20 patients with unilateral cancer. On follow-up biopsy of the treated side at 6 months, 89% of men had a negative biopsy. At 12 months of follow-up, 95% of men reported erections sufficient for intercourse and 90% of men were pad- and leak-free. 9
Other approaches
While cryoablation and HIFU are currently the two modalities with the longest-standing experience in focal therapy, various other treatment strategies are under investigation. Focal laser ablation (FLA) is a recent technique that uses laser energy to ablate MRI-visible lesions. The advantage of this approach is that it can be done with real-time monitoring via MRI, allowing the surgeon to ensure completeness of treatment as well as to avoid vital structures in order to minimize morbidity (Figs. 5 and 6). A Phase I trial of 12 patients reported by Lindner et al showed that 50% of the cohort had no evidence of disease after FLA, while 67% had no evidence of disease at the site of ablation. One patient with residual disease at the ablation site underwent RP without complications. There were no changes in erectile function or voiding symptoms. 52 Another recent study, by Oto et al, 53 examined nine patients who underwent FLA. Immediate contrast-enhanced post-treatment MRI showed a hypovascular defect in eight patients. Average International Prostate Symptom Score (IPSS) and Sexual Health Inventory for Men (SHIM) scores did not change significantly. MRI-guided biopsy of the ablation zone showed no cancer in seven patients (78%) and Gleason Grade 6 cancer in two (22%). Self-resolving perineal abrasion and focal paresthesia of the glans penis each occurred in one patient. 53 Bipolar radiofrequency ablation (RFA) is another technique under investigation that can be performed under TRUS guidance. 54–56 The ultrasound images and the probe-driving mechanism template are mated to allow accurate positioning of the probe so as to precisely target the regions of the prostate that were mapped during the biopsy. A specially designed driver mechanism is used to position the probe to within 0.5 mm of the desired location. Appropriate insertions of bipolar RFA probes are designed in the planning process to target the selected regions. Although the FDA has approved bipolar RFA for the treatment of prostate cancer, no trials have yet reported its outcomes. 56
Follow-up of patients for assessment of efficacy
The best method to follow patients after focal therapy is controversial. Current methods generally use follow-up PSA, MP MRI, and/or biopsy in some combination. When biopsy is used as a follow-up parameter, its rigor should match that of the selection biopsy in order to determine treatment efficacy.
The follow-up of focal therapy using MRI is possible, especially using the DCE sequence as the treated lesion/region no longer enhances 57 (Figs. 7 and 8). The optimal timing of such imaging depends upon the goal, with immediate imaging at 1 week best demonstrating the zone of treatment effect and delayed imaging at 6 months demonstrating residual regions of cancer left untreated.
The relative amount of tissue ablated varies greatly in focal therapy, owing to differences in tumor volume, location, and Gleason score. Thus, PSA levels may not be reflective, predictable, or comparable across patients. The ideal PSA nadir is also undefined. It has been shown that PSA decreases by 30–60% after focal therapy. 40,50,58,59 In a study using focal HIFU, the PSA decreased by 80% at 6 months. 9 The mean PSA after focal therapy is 2–3 ng/mL. 10 Many groups have used the ASTRO criteria to define biochemical recurrence after focal therapy, to account for variability in PSA due to benign prostatic hyperplasia, inflammation, and residual disease. The Phoenix definition is thought to be more specific for recurrence in patients who have had definitive radiation treatment. 10
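The two recurrence definitions mentioned above can be stated concretely. The sketch below is illustrative only: the PSA values are invented, and it simply encodes the ASTRO rule (three consecutive PSA rises after the post-treatment nadir) and the Phoenix rule (any PSA of nadir + 2 ng/mL or more).

def astro_failure(psa_series):
    """ASTRO definition: three consecutive PSA rises after the post-treatment nadir."""
    nadir_index = psa_series.index(min(psa_series))
    consecutive_rises = 0
    for previous, current in zip(psa_series[nadir_index:], psa_series[nadir_index + 1:]):
        consecutive_rises = consecutive_rises + 1 if current > previous else 0
        if consecutive_rises >= 3:
            return True
    return False

def phoenix_failure(psa_series):
    """Phoenix definition: any PSA value reaching the nadir plus 2 ng/mL."""
    nadir = min(psa_series)
    return any(value >= nadir + 2.0 for value in psa_series)

# Hypothetical post-treatment PSA values in ng/mL
psa = [4.1, 1.2, 0.9, 1.1, 1.4, 1.8, 3.2]
print(astro_failure(psa), phoenix_failure(psa))   # both True for this made-up series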
Conclusion
Focal therapy has the potential to offer an array of treatments that stand midway between AS and radical therapy for patients with low- to intermediate-risk disease. Future randomized studies of focal therapy must critically evaluate candidate selection criteria and robustly answer questions regarding outcomes and follow-up monitoring. Focal therapy offers a promising outlook for the future of prostate cancer treatment, with the goal of effectively achieving cancer control while minimizing morbidity.
package org.infinispan.server.test.task.servertask;
import org.infinispan.tasks.ServerTask;
import org.infinispan.tasks.TaskContext;
import org.infinispan.tasks.TaskExecutionMode;
import java.util.Map;
import java.util.Set;
/**
* Author: <NAME>, <EMAIL>
* Date: 1/20/16
* Time: 6:33 AM
*/
public class DistributedTestServerTask implements ServerTask {
public static final String NAME = "serverTask777792";
public static final String EXCEPTION_MESSAGE = "Intentionally Thrown Exception";
private TaskContext taskContext;
@Override
public Object call() {
Map<String, Boolean> parameters = (Map<String, Boolean>) taskContext.getParameters().get();
if (parameters == null || parameters.isEmpty()) {
return System.getProperty("jboss.node.name");
} else {
throw new RuntimeException(EXCEPTION_MESSAGE);
}
}
@Override
public void setTaskContext(TaskContext taskContext) {
this.taskContext = taskContext;
}
@Override
public String getName() {
return NAME;
}
@Override
public TaskExecutionMode getExecutionMode() {
return TaskExecutionMode.ALL_NODES;
}
}
|
/**
* Exchanges an Authorization Code for an Access Token, Refresh Token and (optional) ID Token.
* This provides the benefit of not exposing any tokens to the User Agent and possibly other
* malicious applications with access to the User Agent.
* The Authorization Server can also authenticate the Client before exchanging the Authorization
* Code for an Access Token.
*
* Needs to be run on a separate thread.
*
* @param authCode the authorization code received from the authorization endpoint
* @return the parsed successful token response received from the token endpoint
* @throws IOException for an error response
*/
public TokenResponse requestTokensWithCodeGrant(String authCode) throws IOException {
AuthorizationCodeTokenRequest request = new AuthorizationCodeTokenRequest(
AndroidHttp.newCompatibleTransport(),
new GsonFactory(),
new GenericUrl(tokenEndpoint),
authCode
);
request.setRedirectUri(redirectUrl);
if (extras != null) {
for (Map.Entry<String, String> queryParam : extras.entrySet()) {
request.set(queryParam.getKey(), queryParam.getValue());
}
}
if (!TextUtils.isEmpty(clientSecret)) {
request.setClientAuthentication(new BasicAuthentication(clientId, clientSecret));
} else {
request.set("client_id", clientId);
}
if (useOAuth2) {
Log.d(TAG, "tokens request OAuth2 sent");
TokenResponse tokenResponse = request.executeUnparsed().parseAs(TokenResponse.class);
String accessToken = tokenResponse.getAccessToken();
if (!TextUtils.isEmpty(accessToken)){
Log.d(TAG, String.format("Manage to parse and extract AT : %1$s", accessToken));
return tokenResponse;
}
else {
throw new IOException("Invalid Access Token returned.");
}
} else {
Log.d(TAG, "tokens request OIDC sent");
IdTokenResponse response = IdTokenResponse.execute(request);
String idToken = response.getIdToken();
if (isValidIdToken(idToken)) {
try {
String accessToken = response.getAccessToken();
if (!TextUtils.isEmpty(accessToken)) {
return response;
} else {
throw new IOException("Invalid access token. The at_hash does not match with the return access token.");
}
} catch (Exception e) {
throw new IOException("Can not validate AccessToken.", e);
}
} else {
throw new IOException("Invalid ID token returned.");
}
}
} |
class MassTable:
"""
The purpose of this class is to:
a) Create a dictionary between MassNuclide objects, previously constructed
from the table mass_excess2020.txt, and their mass excess A_nuc - A
measured in MeV.
b) Provide the information in this dictionary to the Nucleus class.
The only variable required to construct an instance of this class is :var filename:,
the path to the .txt table file with the nuclei and their mass excesses. If this
variable is not provided, mass_excess2020.txt is used by default.
"""
def __init__(self, filename=None):
self._mass_diff = {}
if filename:
self.filename = filename
else:
self.filename = dir_mass_data
self._read_table()
def _add_mass_nuclide(self, nuc):
assert isinstance(nuc, MassNuclide)
assert not str(nuc) in self._mass_diff.keys()
self._mass_diff[str(nuc)] = nuc
def _read_table(self):
file = open(self.filename, 'r')
for _ in range(4):
file.readline()
for line in file:
data_list = line.strip().split()
#print(data_list)
A_str = data_list.pop(0)
Z_str = data_list.pop(0)
dm_str = data_list.pop(0)
A = int(A_str)
Z = int(Z_str)
dm = float(dm_str)
nuc = MassNuclide(a=A, z=Z, dm=dm)
self._add_mass_nuclide(nuc)
file.close()
def get_mass_diff(self, nuc):
if str(nuc) in self._mass_diff.keys():
return self._mass_diff[str(nuc)]
else:
raise NotImplementedError("Nuclear mass difference is not available") |
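A minimal usage sketch of the class above; it assumes that MassNuclide exposes the a, z, and dm attributes used in _read_table, that str(nuc) produces the same key stored by _add_mass_nuclide, and that the default mass_excess2020.txt data file is available — none of which is shown here.

# Hypothetical usage of MassTable
table = MassTable()                       # reads the default mass_excess2020.txt
query = MassNuclide(a=4, z=2, dm=0.0)     # dm is irrelevant for the lookup; only str(query) is used as the key
helium4 = table.get_mass_diff(query)
print(helium4.dm)                         # mass excess of 4He in MeV, as read from the table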
#!/usr/bin/python
# -*- coding: utf-8 -*-
from os.path import abspath, dirname, join
from subprocess import check_output
from codecs import getdecoder
from argparse import ArgumentParser
from promela import Parser
def run_ltl2ba(formula):
script_dir = dirname(abspath(__file__))
ltl2ba = join(script_dir, "ltl2ba")
raw_output = check_output([ltl2ba, "-f", "%s" % formula])
ascii_decoder = getdecoder("ascii")
(output, _) = ascii_decoder(raw_output)
return output
def parse_ltl(formula):
ltl2ba_output = run_ltl2ba(formula)
parser = Parser(ltl2ba_output)
edges = parser.parse()
return edges
if __name__ == "__main__":
argparser = ArgumentParser(description="Call the ltl2ba program and parse the output")
argparser.add_argument('LTL')
args = argparser.parse_args()
ltl2ba_output = run_ltl2ba(args.LTL)
parser = Parser(ltl2ba_output)
transitions = parser.parse()
    print(transitions)
|
The UK's young unemployed could form a dole queue stretching from London to Edinburgh, research by MPs and the Commons library has calculated.
As MPs prepare to launch a new cross-party group dealing with youth unemployment, the figures suggest that out-of-work 16- to 24-year-olds could stand in a line that extends for 434 miles.
This is equivalent to an area the size of Leeds, assuming each young person has 0.75 metres (2.5 feet) square to stand in. The figures were worked out for the new all-party group on youth unemployment, launched by Pamela Nash, at 29 the youngest MP in the House of Commons. Other members of the group include the Tory MPs Zac Goldsmith and Laura Sandys, the Lib Dems Julian Huppert and John Leech, and the former Labour home secretary Lord Reid.
About 917,000 young people in the UK are classified as unemployed, with half claiming jobseeker's allowance.
Nash, the Labour MP for Airdrie and Shotts, said the situation was a national disgrace, adding: "I want to ensure that … youth unemployment remains a top priority for both governments and for all politicians in Edinburgh and London." Youth unemployment dropped slightly in the final three months of 2013, but the rate remains stubbornly high at 19.9% of 16- to 24-year-olds.
According to the Office for National Statistics, young people in the UK appear to have borne the brunt of the financial crisis, with a larger proportion of them now out of work than any other age group.
The unemployment rate among 16- to 17-year-olds is 35.9%, and 18% among 18- to 24-year-olds, according to the latest economic review by the statistics watchdog.
In contrast, the rate falls to 4.7% among 35- to 49-year-olds, and 4.4% among 50- to 64-year-olds. Those aged between 18 and 24 accounted for almost 30% of the rise in the unemployment rate between the first quarter of 2008 and the peak in unemployment in the fourth quarter of 2011, roughly double their proportion of the labour force.
A DWP spokesman said: "The number of young people in work increased by 49,000 in the last 3 months with the number claiming Jobseeker's Allowance falling for the last 20 months.
"We know there is still more to do but more young people in work means more people have the security of a regular wage." |
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{-# LANGUAGE NamedFieldPuns #-}
module RandomAccessMachine where
import Control.Monad.State
import Numeric.Natural
import Debug.Trace
-- Register number
newtype RegNum = RegNum Natural
deriving (Show, Eq, Num, Real, Ord, Enum, Integral)
-- Instruction number
newtype InstrNum = InstrNum Natural
deriving (Show, Eq, Num, Real, Ord, Enum, Integral)
-- | Instruction
data Instr =
Z RegNum
| S RegNum
| M (RegNum, RegNum)
| J (RegNum, RegNum, InstrNum)
| E
deriving (Show, Eq)
-- | Program
newtype Program = Program [Instr]
deriving (Show, Eq)
-- | Register
newtype Register = Register [Natural]
deriving (Show, Eq)
-- | Environment
data Env =
Env
{ pc :: InstrNum -- Program counter
    , register :: Register -- Register
, isTerminated :: Bool -- Whether terminated or not
}
deriving (Show, Eq)
-- | Initial enviromemnt
initEnv :: [Natural] -> Env
initEnv args =
Env
{ pc = 1
, register = Register (0 : args ++ repeat 0)
, isTerminated = False
}
-- | Random Access Machine
newtype RandomAccessMachineT m a =
RandomAccessMachineT
{ runRandomAccessMachineT :: StateT Env m a
}
deriving (Functor, Applicative, Monad, MonadState Env, MonadTrans)
-- | Set nat to regNum
setRegister :: Monad m => RegNum -> Natural -> RandomAccessMachineT m ()
setRegister regNum nat = do
-- Calc index
let idx :: Int
idx = (fromInteger . toInteger) regNum
-- Set nat to regNum
modify (\env@Env{register=Register naturals} ->
let newRegL = take idx naturals
newRegR = drop (idx+1) naturals
in env{register=Register (newRegL ++ [nat] ++ newRegR)}
)
-- | Get nat by regNum
getNatFromRegister :: Monad m => RegNum -> RandomAccessMachineT m Natural
getNatFromRegister regNum = do
-- Calc index
let idx :: Int
idx = (fromInteger . toInteger) regNum
Register naturals <- gets register
-- Return natural
return (naturals !! idx)
-- | Increment program counter
incPc :: Monad m => RandomAccessMachineT m ()
incPc = modify (\env@Env{pc} -> env{pc=pc+1})
-- | Run an instruction
runInstr :: Monad m => Instr -> RandomAccessMachineT m ()
runInstr (Z regNum) = setRegister regNum 0 >> incPc
runInstr (S regNum) = do
nat <- getNatFromRegister regNum
setRegister regNum (nat + 1)
incPc
runInstr (M (regNum1, regNum2)) = do
nat <- getNatFromRegister regNum2
setRegister regNum1 nat
incPc
runInstr (J (regNum1, regNum2, instrNum)) = do
n1 <- getNatFromRegister regNum1
n2 <- getNatFromRegister regNum2
if n1 == n2
then modify (\env -> env{pc=instrNum})
else incPc
runInstr (E) = modify (\env -> env{isTerminated=True})
-- | Make random access machine
makeRandomAccessMachine :: Monad m => Program -> RandomAccessMachineT m Natural
makeRandomAccessMachine (Program instrs) = do
-- Get instruction number
instrNum <- gets pc
-- Get index
let idx = (fromInteger . toInteger) instrNum - 1
when (idx >= length instrs ) $
error ("Index " ++ show idx ++ ": out of instructions")
-- Get instruction
let instr = instrs !! idx
-- Run the instruction
runInstr instr
end <- gets isTerminated
if end
then getNatFromRegister 0
else makeRandomAccessMachine (Program instrs)
-- | Execute a program
execProgram :: Program -> [Natural] -> (Natural, Env)
execProgram program args =
runState (runRandomAccessMachineT (makeRandomAccessMachine program)) (initEnv args)
|
// TestCallback tests that a callback can be created and called.
func TestCallback(t *testing.T) {
chunkAddr := chunktesting.GenerateTestRandomChunk().Address()
targets := pss.Targets{[]byte{0xED}}
callbackWasCalled := make(chan bool)
pssSender := &mockPssSender{
callbackC: callbackWasCalled,
}
recoveryCallback := recovery.NewCallback(pssSender)
go recoveryCallback(chunkAddr, targets)
select {
case <-callbackWasCalled:
break
case <-time.After(100 * time.Millisecond):
t.Fatal("recovery callback was not called")
}
} |
/*
* Pixel Dungeon
* Copyright (C) 2012-2015 <NAME>
*
* Shattered Pixel Dungeon
* Copyright (C) 2014-2021 <NAME>
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>
*/
package com.postmodern.postmoderndungeon.actors.mobs;
import com.postmodern.postmoderndungeon.Assets;
import com.postmodern.postmoderndungeon.Dungeon;
import com.postmodern.postmoderndungeon.actors.Actor;
import com.postmodern.postmoderndungeon.effects.CellEmitter;
import com.postmodern.postmoderndungeon.effects.Speck;
import com.postmodern.postmoderndungeon.items.EquipableItem;
import com.postmodern.postmoderndungeon.items.Heap;
import com.postmodern.postmoderndungeon.items.Item;
import com.postmodern.postmoderndungeon.items.armor.Armor;
import com.postmodern.postmoderndungeon.items.artifacts.Artifact;
import com.postmodern.postmoderndungeon.items.wands.Wand;
import com.postmodern.postmoderndungeon.items.weapon.Weapon;
import com.postmodern.postmoderndungeon.items.weapon.missiles.MissileWeapon;
import com.postmodern.postmoderndungeon.messages.Messages;
import com.postmodern.postmoderndungeon.sprites.MimicSprite;
import com.postmodern.postmoderndungeon.utils.GLog;
import com.watabou.noosa.audio.Sample;
import com.watabou.utils.Random;
public class GoldenMimic extends Mimic {
{
spriteClass = MimicSprite.Golden.class;
}
@Override
public String name() {
if (alignment == Alignment.NEUTRAL){
return Messages.get(Heap.class, "locked_chest");
} else {
return super.name();
}
}
@Override
public String description() {
if (alignment == Alignment.NEUTRAL){
return Messages.get(Heap.class, "locked_chest_desc") + "\n\n" + Messages.get(this, "hidden_hint");
} else {
return super.description();
}
}
public void stopHiding(){
state = HUNTING;
if (Actor.chars().contains(this) && Dungeon.level.heroFOV[pos]) {
enemy = Dungeon.hero;
target = Dungeon.hero.pos;
enemySeen = true;
GLog.w(Messages.get(this, "reveal") );
CellEmitter.get(pos).burst(Speck.factory(Speck.STAR), 10);
Sample.INSTANCE.play(Assets.Sounds.MIMIC, 1, 0.85f);
}
}
@Override
public void setLevel(int level) {
super.setLevel(Math.round(level*1.33f));
}
@Override
protected void generatePrize() {
super.generatePrize();
//all existing prize items are guaranteed uncursed, and have a 50% chance to be +1 if they were +0
for (Item i : items){
if (i instanceof EquipableItem || i instanceof Wand){
i.cursed = false;
i.cursedKnown = true;
if (i instanceof Weapon && ((Weapon) i).hasCurseEnchant()){
((Weapon) i).enchant(null);
}
if (i instanceof Armor && ((Armor) i).hasCurseGlyph()){
((Armor) i).inscribe(null);
}
if (!(i instanceof MissileWeapon || i instanceof Artifact) && i.level() == 0 && Random.Int(2) == 0){
i.upgrade();
}
}
}
}
}
|
import FWCore.ParameterSet.Config as cms
process = cms.Process("QcdLowPtDQM")
process.load("DQMServices.Core.DQM_cfg")
process.load("DQMServices.Components.DQMEnvironment_cfi")
process.load('FWCore/MessageService/MessageLogger_cfi')
process.load('Configuration.StandardSequences.GeometryRecoDB_cff')
process.load('Configuration/StandardSequences/MagneticField_AutoFromDBCurrent_cff')
process.load('Configuration/StandardSequences/FrontierConditions_GlobalTag_cff')
process.load('DQM/Physics/qcdLowPtDQM_cfi')
process.GlobalTag.globaltag = 'STARTUP3X_V8D::All'
process.options = cms.untracked.PSet(
FailPath = cms.untracked.vstring("ProductNotFound")
)
process.maxEvents = cms.untracked.PSet(
input = cms.untracked.int32(-1)
)
process.dump = cms.EDAnalyzer('EventContentAnalyzer')
process.source = cms.Source("PoolSource",
duplicateCheckMode = cms.untracked.string('noDuplicateCheck'),
fileNames = cms.untracked.vstring(
'file:/putfilehere.root'
)
)
##uncomment if you run on MC raw
#process.p1 = cms.Path(
# process.myRecoSeq1
#)
#process.siPixelDigis.InputLabel = cms.InputTag("rawDataCollector")
process.p2 = cms.Path(
process.myRecoSeq2 *
process.QcdLowPtDQM +
process.dqmSaver
)
process.dqmSaver.workflow = cms.untracked.string('/Physics/QCDPhysics/LowPt')
|
/// remove an item from the list
void remove(GarbageCollected *p)
{
ct--;
if(p->prev)
p->prev->next = p->next;
else
hd = p->next;
if(p->next)
p->next->prev = p->prev;
else
tl = p->prev;
} |
package com.github.TKnudsen.ComplexDataObject.data.uncertainty;
public enum ValueUncertaintyCharacteristics {
AMOUNT, LOWERBOUND, UPPERBOUND, LOWERQUARTILE, UPPERQUARTILE, MEDIAN, MEAN
}
|
def take_screenshot(self,filename=None):
if self._browser is None:
return
if filename is None:
dts = datetime.datetime.today().strftime("%Y%m%d%H%M%S")
filename = 'hcss_%s.png' % dts
self.logger.debug("screenshot filename: %s" % (filename))
self._browser.save_screenshot(filename) |
Story highlights Feds take 52 bus companies and 340 unsafe vehicles off the road in Operation Quick Strike
Federal Motor Carrier Safety Administration investigates unsafe motor coach firms
Investigators conducted reviews of safety practices at 250 most at-risk motor coach companies
U.S. transportation officials shut down 52 bus companies and 340 vehicles Thursday as part of an eight-month effort targeting unsafe motor coach operations, according to the Federal Motor Carrier Safety Administration.
Called Operation Quick Strike, the sweeping action from New York to California involved companies in 22 states and the District of Columbia. The companies were put out of business after more than 50 specially trained investigators conducted detailed reviews of safety practices at the 250 most at-risk motor coach companies based on roadside inspection and safety data, according to a statement from the agency.
"Bus travel is increasingly popular because it is a convenient, inexpensive option for students, groups and families," U.S. Transportation Secretary Anthony Foxx said. "But it must also be safe."
Thursday's announcement comes weeks after the National Transportation Safety Board expressed concerns with the thoroughness of FMCSA investigations, citing four deadly crashes involving operators who were already on the radar of the oversight agency.
"While FMCSA deserves recognition for putting bad operators out of business, they need to crack down before crashes occur, not just after high-visibility events," NTSB Chairwoman Deborah A.P. Hersman said in a statement last month.
Two deadly bus accidents in the last year prompted the investigations.
FMCSA shut down the U.S. operations of Mi Joo Tour & Travel of Coquitlam, British Columbia, after an accident in Oregon in December 2012 that killed nine and injured 39. The driver had far exceeded the 70-hour work limit, the investigation found.
The agency also shuttered bus company Scapadas Magicas after an accident in which eight people were killed -- seven bus passengers and a driver in another vehicle -- and dozens were injured. The February wreck occurred east of San Bernardino, California. The company failed to maintain its buses and ensure that its drivers were licensed, the agency said.
"In the aftermath of two deadly crashes earlier this year, FMCSA re-examined the way we investigate passenger carriers to make our methods even more effective at preventing crashes," agency spokeswoman Marissa Padilla said. "Using safety and roadside inspection data, FMCSA identified 250 high-risk carriers for top-to-bottom investigations designed to uncover any patterns of unsafe behavior or faulty bus maintenance."
FMCSA administrator Anne Ferro said Thursday, "I think the most important thing to point out is that we didn't wait for NTSB to tell us we had to re-examine the way we were carrying out investigations. That's exactly what we did early this year, which then led to this thorough sweep of 250 of the highest risk companies."
Dan Ronan, spokesman for the American Bus Association, a Washington-based trade group representing the motor coach, tourism and travel industry, praised the federal initiative.
"The Federal Motor Carrier Safety Administration has done the right thing by finding companies running illegally or in an unsafe manner," he said. "More than half the fatalities and injuries in the industry were caused by a small group of providers. There's no shortage of bus companies. There are plenty of good ones out there."
As a result of the investigation and inspections, 52 motor coach companies were put out of business and shut down for safety violations such as failure to maintain their buses adequately, inadequate drug and alcohol driver testing and overwork of drivers.
Other companies took action to correct the safety violations, and 28 avoided being shut down, the agency said.
In all, 340 vehicles of the more than 1,300 inspected were taken off the road for safety and maintenance violations, according to the report.
Additionally, inspectors looked into more than 1,300 carriers that had little to no inspection history with the safety agency, and more than 240 will be investigated further.
"This year we evaluated and enhanced our investigation methods to dig deeper than ever before and uncover dangerous patterns of unsafe behavior and business practices," said Anne S. Ferro, administrator of the Federal Motor Carrier Safety Administration. |
package cards
import (
"context"
"git.900sui.cn/kc/rpcinterface/interface/common"
)
// Empty input parameters
type EmptyParams struct {
}
// Empty output parameters
type EmptyReplies struct {
}
// NCardBase is the basic information structure of a limited-count card
type NCardBase struct {
    Name                string  `mapstructure:"name"`           // name
    BusID               int     `mapstructure:"bus_id"`         // business (merchant) ID
    ShortDesc           string  `mapstructure:"sort_desc"`      // short description
    RealPrice           float64 `mapstructure:"real_price"`     // current (selling) price
    Price               float64 `mapstructure:"price"`          // listed price
    ServicePeriod       int     `mapstructure:"service_period"` // validity period, in months
    SaleShopNum         int     `mapstructure:"sale_shop_num"`  // number of shops selling the card
    ImgID               int     `mapstructure:"img_id"`         // image ID
    Sales               int     `mapstructure:"sales"`          // sales volume
    Ctime               int     `mapstructure:"ctime"`          // publish time
    ValidCount          int     `mapstructure:"validcount"`     // total number of included single-service uses
    CtimeStr            string  // create time as a formatted string
    IsPermanentValidity int     `mapstructure:"is_permanent_validity"` // permanently valid: 1 = yes; 2 = no
}
// Input parameters for adding a limited-count card
type ArgsAddNCard struct {
    common.BsToken
    NCardBase
    Notes          []CardNote       // friendly tips
    IncludeSingles []IncSingle      // included single services
    GiveSingles    []IncSingle      // complimentary single services
    GiveSingleDesc []GiveSingleDesc // description of the complimentary items
    ImgHash        string           // cover image hash string
}
// Output for adding a limited-count card
type RepliesAddNCard struct {
    NCardID int
}
// Input parameters for editing a limited-count card
type ArgsEditNCard struct {
    common.BsToken
    NCardBase
    Notes          []CardNote // friendly tips
    CardID         int        `mapstructure:"card_id"` // limited-count card ID
    IncludeSingles []IncSingle      // included single services
    GiveSingles    []IncSingle      // complimentary single services
    GiveSingleDesc []GiveSingleDesc // description of the complimentary items
    ImgHash        string           `mapstructure:"img_hash"` // cover image hash string
}
// Input parameters for fetching limited-count card details
type ArgsNCardInfo struct {
    NCardID int `mapstructure:"card_id"` // limited-count card ID
    ShopID  int `mapstructure:"shop_id"` // shop ID, optional; pass it when the card's details at a specific shop are needed
}
type BusInfo struct {
    BusIcon        string // business icon URL
    BusCompanyName string // business company name
    BusBrandName   string // business brand name
}
// Data returned for limited-count card details
type ReplyNCardInfo struct {
    NCardBase
    ShareLink      string             // share link
    Notes          []CardNote         // friendly tips
    NCardID        int                `mapstructure:"ncard_id"` // limited-count card ID
    SsId           int                // ID of the card at the shop
    BindID         int                `mapstructure:"bind_id"` // ID of the business's primary industry
    ImgHash        string             // cover image hash string
    ImgUrl         string             // cover image URL
    IncludeSingles []IncSingleDetail2 // included single services
    GiveSingles    []IncSingleDetail2 // complimentary single services
    IsAllSingle    bool               // applies to all single services
    IsAllProduct   bool               // applies to all products
    GiveSingleDesc []GiveSingleDesc   // description of the complimentary items
    IsGround       int                `mapstructure:"is_ground"` // whether listed by the head store: 0 = no, 1 = yes
    ShopStatus     int                // sale status at the branch shop: 1 = delisted, 2 = listed, 3 = disabled by the head store
    BusInfo
    SingleTotalNum int             // total number of uses included in the card
    ShopLists      []ReplyShopName // shop-addition info for the head store's limited-count card
}
// Input parameters for the head store's limited-count card list
type ArgsBusNCardPage struct {
    common.Paging
    common.BsToken
    FilterShopHasAdd bool   // false = fetch all, true = filter out cards the shop has already added
    IsGround         string // status filter: all by default, 1 = delisted, 2 = listed
}
// Data structure of a limited-count card in list views
type NCardDesc struct {
    NCardBase // basic card information
    NCardID        int `mapstructure:"ncard_id"` // limited-count card ID
    BindID         int `mapstructure:"bind_id"`  // ID of the business's primary industry
    Clicks         int
    Sales          int `mapstructure:"sales"`
    IsGround       int `mapstructure:"is_ground"` // whether listed by the head store: 0 = no, 1 = yes
    ShopStatus     int // sale status at the branch shop: 1 = delisted, 2 = listed, 3 = disabled by the head store (only meaningful for a shop)
    ShopHasAdd     int // whether the branch shop has added the card: 0 = no, 1 = yes (only meaningful for a shop)
    ShopDelStatus  int // deletion status at the shop
    ShopItemId     int // ID of the item at the shop
    ApplySingleNum int // number of applicable single services
    GiveSingleNum  int // number of complimentary single services
}
// Data returned for the limited-count card list
type ReplyNCardPage struct {
    TotalNum int            // total number of limited-count cards
    List     []NCardDesc    // list of limited-count cards
    IndexImg map[int]string // cover images of the cards
}
// Set the shops where a card is available
type ArgsSetNCardShop struct {
    common.BsToken
    NCardIDs  []int `mapstructure:"card_ids"` // limited-count card IDs
    ShopIDs   []int `mapstructure:"shop_ids"` // IDs of the applicable shops
    IsAllShop bool  `mapstructure:"all_shop"` // whether the card applies to all shops; when true, ShopIDs need not be passed and is ignored
}
// Input parameters for the head store to list/delist limited-count cards
type ArgsDownUpNCard struct {
    common.BsToken
    NCardIDs []int `mapstructure:"card_ids"` // limited-count card IDs
    OptType  uint8 // operation type, see the OPT_UP/OPT_DOWN constants
}
// Input parameters for a branch shop to fetch the cards applicable to it
type ArgsShopGetBusNCardPage struct {
    common.Paging
    common.BsToken
}
// Input parameters for a branch shop to add limited-count cards to its own shop
type ArgsShopAddNCard struct {
    common.BsToken
    NCardIDs []int `mapstructure:"card_ids"` // limited-count card IDs
}
// Input parameters for fetching a branch shop's limited-count card list
type ArgsShopNCardPage struct {
    common.Paging
    ShopID int `mapstructure:"shop_id"` // shop ID
    Status int // listing status of the limited-count card
}
// A branch shop lists/delists limited-count cards
type ArgsShopDownUpNCard struct {
    common.BsToken
    CardIDs []int
    OptType uint8 // operation type, see the OPT_UP/OPT_DOWN constants
}
type ArgsShopNcardRpc struct {
    ShopId   int
    NcardIds []int
}
type ReplyShopNcardRpc struct {
    List []NCardDesc // list of limited-count cards
}
// Head store - delete
type ArgsDeleteNCard struct {
    common.BsToken
    NcardIds []int
}
// Branch shop - delete
type ArgsDeleteShopNCard struct {
    common.BsToken
    NcardIds []int
}
type NCard interface {
    // Add a limited-count card
    AddNCard(ctx context.Context, args *ArgsAddNCard, replies *RepliesAddNCard) error
    // Edit a limited-count card
    EditNCard(ctx context.Context, args *ArgsEditNCard, replies *EmptyReplies) error
    // Get the details of a limited-count card
    NCardInfo(ctx context.Context, args *ArgsNCardInfo, reply *ReplyNCardInfo) error
    // Get the head store's limited-count card list
    BusNCardPage(ctx context.Context, args *ArgsBusNCardPage, reply *ReplyNCardPage) error
    // Set the shops where a card is available
    SetNCardShop(ctx context.Context, args *ArgsSetNCardShop, reply *EmptyReplies) error
    // Head store lists/delists limited-count cards
    DownUpNCard(ctx context.Context, args *ArgsDownUpNCard, reply *EmptyReplies) error
    // Branch shop fetches the limited-count cards applicable to it
    ShopGetBusNCardPage(ctx context.Context, args *ArgsShopGetBusNCardPage, reply *ReplyNCardPage) error
    // Branch shop adds limited-count cards to its own shop
    ShopAddNCard(ctx context.Context, args *ArgsShopAddNCard, reply *EmptyReplies) error
    // Get a branch shop's limited-count card list
    ShopNCardPage(ctx context.Context, args *ArgsShopNCardPage, reply *ReplyNCardPage) error
    // Branch shop lists/delists limited-count cards
    ShopDownUpNCard(ctx context.Context, args *ArgsShopDownUpNCard, reply *EmptyReplies) error
    // ShopNcardRpc
    ShopNcardRpc(ctx context.Context, args *ArgsShopNcardRpc, reply *ReplyShopNcardRpc) error
    // Head store - soft delete
    DeleteNCard(ctx context.Context, args *ArgsDeleteNCard, reply *bool) error
    // Branch shop - soft delete
    DeleteShopNCard(ctx context.Context, args *ArgsDeleteShopNCard, reply *bool) error
}
|
import { Component, OnInit } from '@angular/core';
import { UserService } from '../user.service';
import { NgForm } from '@angular/forms';
import { User } from '../user.model';
import { ServerService } from '../../server.service';
@Component({
selector: 'app-edit-user',
templateUrl: './edit-user.component.html',
styleUrls: ['./edit-user.component.css']
})
export class EditUserComponent implements OnInit {
constructor(private userService: UserService, private serverService: ServerService) {
// this.userService.UserEditing = new User("omar","0987098709","<EMAIL>","bsdasd","123456",true,true,true,true);
}
ngOnInit() {
}
onSumbit(form: NgForm, Vper, Dper, Aper, Freeze, exit) {
// let newUser: User;
// let swap: boolean = false;
// let currIndex: number = 0;
if (Vper == false && Dper == false && Aper == false && Freeze == false) {
alert("freeze the account or delete it give at least 1 permission");
return;
}
// for (let index = 0; index < this.userService.usersList.length; index++) {
// if (index == this.userService.usersList.length - 1) {
// swap = true;
// }
// if (this.userService.usersList[index].username == form.value.username) {
// if (this.userService.UserEditing.username == form.value.username) {
// currIndex = this.userService.usersList.indexOf(this.userService.UserEditing);
// newUser = new User(form.value.name, form.value.phone, form.value.email, this.userService.UserEditing.username, form.value.password, Vper, Dper, Aper,Freeze);
// this.userService.usersList[currIndex] = newUser;
// exit.click();
// return;
// }
// else {
// alert("username is already in use");
// return;
// }
// } else {
// if (swap == true) {
// currIndex = this.userService.usersList.indexOf(this.userService.UserEditing);
// newUser = new User(form.value.name, form.value.phone, form.value.email, this.userService.UserEditing.username, form.value.password, Vper, Dper, Aper,Freeze);
// this.userService.usersList[currIndex] = newUser;
// exit.click();
// return;
// }
// }
// }
let newUser = new User(form.value.name, form.value.phone, form.value.email, this.userService.UserEditing.username, form.value.password, Vper, Dper, Aper, Freeze);
this.userService.UserEditing.name = form.value.name;
this.userService.UserEditing.phone = form.value.phone;
this.userService.UserEditing.email = form.value.email;
this.userService.UserEditing.password = <PASSWORD>;
this.userService.UserEditing.VolPer = Vper;
this.userService.UserEditing.DonorPer = Dper;
this.userService.UserEditing.AdoptPer = Aper;
this.userService.UserEditing.Freeze = Freeze;
this.serverService.editUser(this.userService.UserEditing)
.subscribe((res) => {
if (res.status === 200) {
exit.click();
} else {
          return alert('Server-side error');
}
}, (e) => alert(e));
}
}
|
Is Same-Sex Marriage an Equality Issue? Framing Effects Among African Americans
Public opinion about gay rights is often shaped by egalitarian values. While the extant literature has suggested that African Americans’ value structure tends to be very egalitarian, many popular media accounts as well as some scholarly research indicate that Blacks have, at times, opposed gay rights. We assert that when the media frame gay rights as equality issues, Blacks are more likely to rely on egalitarianism to form their opinions. We use content analysis to show that equality framing of gay rights dramatically increased in 2012, and we use national survey data to show that Blacks’ support for marriage equality also precipitously rose beginning in 2012. Then, we use data from an original framing experiment to show that exposure to an equality frame does, indeed, encourage Blacks to be more reliant on their egalitarian values to form an opinion about same-sex marriage and to be more supportive of the policy. |
/*++
Copyright (c) Microsoft Corporation
Module Name:
MDMRegistration.h
Abstract:
This file contains structures, function signatures for 3rd Party
management software that intends to interact with Windows MDE
Environment:
User Mode - Win32
Notes:
--*/
#if (NTDDI_VERSION >= NTDDI_WINBLUE)
#ifndef _MDM_REG_
#define _MDM_REG_
#ifdef _MSC_VER
#pragma once
#endif
#include <winapifamily.h>
#include <winerror.h>
#pragma region Desktop Family
#if WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_DESKTOP)
#ifdef __cplusplus
extern "C" {
#endif
////////////////////////////////////////////////////////////////////////////////
//
// Error Codes
//
#define E_DATATYPE_MISMATCH HRESULT_FROM_WIN32(ERROR_DATATYPE_MISMATCH)
// We will use this facility code for back end registration errors.
#define MDM_REGISTRATION_FACILITY_CODE 25 // Errors from desktop 8.0-8.1
#define DEVICE_ENROLLER_FACILITY_CODE 24 // Errors from 10+ (and phone 8+)
// Invalid Schema , Message Format Error from server.
#define MREGISTER_E_DEVICE_MESSAGE_FORMAT_ERROR MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 1)
// Server failed to authenticate the user.
#define MREGISTER_E_DEVICE_AUTHENTICATION_ERROR MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 2)
// User is not authorized to enroll.
#define MREGISTER_E_DEVICE_AUTHORIZATION_ERROR MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 3)
// User has no permission on the cert template or CA unreachable.
#define MREGISTER_E_DEVICE_CERTIFCATEREQUEST_ERROR MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 4)
#define MENROLL_E_DEVICE_CERTIFCATEREQUEST_ERROR MAKE_HRESULT(SEVERITY_ERROR, DEVICE_ENROLLER_FACILITY_CODE, 4)
// Generic Failure from management server, such as DB access error.
#define MREGISTER_E_DEVICE_CONFIGMGRSERVER_ERROR MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 5)
// Unhandled exception from server.
#define MREGISTER_E_DEVICE_INTERNALSERVICE_ERROR MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 6)
// Unhandled exception from server.
#define MREGISTER_E_DEVICE_INVALIDSECURITY_ERROR MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 7)
// Unknown server error.
#define MREGISTER_E_DEVICE_UNKNOWN_ERROR MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 8)
// Another enrollment operation is currently underway.
#define MREGISTER_E_REGISTRATION_IN_PROGRESS MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 9)
// NO LONGER USED
#define MREGISTER_E_DEVICE_ALREADY_REGISTERED MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 10)
// Device is already enrolled.
#ifndef MENROLL_E_DEVICE_ALREADY_ENROLLED
#define MENROLL_E_DEVICE_ALREADY_ENROLLED MAKE_HRESULT(SEVERITY_ERROR, DEVICE_ENROLLER_FACILITY_CODE, 10)
#endif
// NO LONGER USED
#define MREGISTER_E_DEVICE_NOT_REGISTERED MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 11)
// Device is not enrolled.
#ifndef MENROLL_E_DEVICE_NOT_ENROLLED
#define MENROLL_E_DEVICE_NOT_ENROLLED MAKE_HRESULT(SEVERITY_ERROR, DEVICE_ENROLLER_FACILITY_CODE, 11)
#endif
// NO LONGER USED
#define MREGISTER_E_DISCOVERY_REDIRECTED MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 12)
// NO LONGER USED
#define MREGISTER_E_DEVICE_NOT_AD_REGISTERED_ERROR MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 13)
// NO LONGER USED
#define MREGISTER_E_DISCOVERY_FAILED MAKE_HRESULT(SEVERITY_ERROR, MDM_REGISTRATION_FACILITY_CODE, 14)
// User already enrolled too many devices. Delete or unenroll old ones to fix this error (user can fix it without admin)
#define MENROLL_E_DEVICECAPREACHED MAKE_HRESULT(SEVERITY_ERROR, DEVICE_ENROLLER_FACILITY_CODE, 19)
// Mobile device management generally not supported (would save an admin call)
#define MENROLL_E_NOTSUPPORTED MAKE_HRESULT(SEVERITY_ERROR, DEVICE_ENROLLER_FACILITY_CODE, 21)
// License of user is in bad state blocking enrollment (user still needs to call admin)
#define MENROLL_E_USERLICENSE MAKE_HRESULT(SEVERITY_ERROR, DEVICE_ENROLLER_FACILITY_CODE, 24)
// The user canceled the operation
#define MENROLL_E_USER_CANCELED MAKE_HRESULT(SEVERITY_ERROR, DEVICE_ENROLLER_FACILITY_CODE, 42)
// Struct returned by the discovery service containing
// the endpoints and information about the management service.
typedef struct _MANAGEMENT_SERVICE_INFO
{
LPWSTR pszMDMServiceUri;
LPWSTR pszAuthenticationUri;
}MANAGEMENT_SERVICE_INFO,*PMANAGEMENT_SERVICE_INFO;
#define DEVICEREGISTRATIONTYPE_MDM_ONLY 0
#define DEVICEREGISTRATIONTYPE_MAM 5
#define DEVICEREGISTRATIONTYPE_MDM_DEVICEWIDE_WITH_AAD 6
#define DEVICEREGISTRATIONTYPE_MDM_USERSPECIFIC_WITH_AAD 13
// Struct returned by the discovery service containing
// the endpoints and information about the management service.
typedef struct _MANAGEMENT_REGISTRATION_INFO
{
BOOL fDeviceRegisteredWithManagement;
DWORD dwDeviceRegistionKind;
LPWSTR pszUPN;
LPWSTR pszMDMServiceUri;
}MANAGEMENT_REGISTRATION_INFO, *PMANAGEMENT_REGISTRATION_INFO;
typedef enum _REGISTRATION_INFORMATION_CLASS {
DeviceRegistrationBasicInfo = 1,
MaxDeviceInfoClass // MaxDeviceInfoClass should always be the last enum
} REGISTRATION_INFORMATION_CLASS, *PREGISTRATION_INFORMATION_CLASS;
/*++
Routine Description:
This function is used to check if this device is registered with an MDM service.
Arguments:
    ppDeviceRegistrationInfo - details of the registration, free with HeapFree
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
GetDeviceRegistrationInfo(
_In_ REGISTRATION_INFORMATION_CLASS DeviceInformationClass,
_Out_ PVOID* ppDeviceRegistrationInfo
);
/*++
Routine Description:
This function is used to check if this device is registered with an MDM service.
Arguments:
pfIsDeviceRegisteredWithManagement - will be set to TRUE if device is registered, FALSE otherwise.
cchUPN - maximum length of pszUPN
pszUPN - string parameter to return the UPN
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
IsDeviceRegisteredWithManagement(
_Out_ BOOL *pfIsDeviceRegisteredWithManagement,
_In_ DWORD cchUPN,
_Out_opt_z_cap_(cchUPN) LPWSTR pszUPN
);
/*++
Routine Description:
This function is used to check if registration is allowed
Arguments:
pfIsManagementRegistrationAllowed - will be set to TRUE if device is managed externally, FALSE otherwise.
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
IsManagementRegistrationAllowed(
_Out_ BOOL *pfIsManagementRegistrationAllowed
);
/*++
Routine Description:
This function is used to check if device is not already enrolled and SKU license (SLAPI) allows enrollment
Arguments:
isEnrollmentAllowed - TRUE if allowed
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
IsMdmUxWithoutAadAllowed(
_Out_ BOOL* isEnrollmentAllowed
);
/*++
Routine Description:
This function is used to set if the device is externally managed
Arguments:
IsManagedExternally - TRUE if device is managed externally, FALSE otherwise.
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
SetManagedExternally(
_In_ BOOL IsManagedExternally
);
/*++
Routine Description:
This function is used to auto-discover the MDM service.
Arguments:
pszUPN - email.
Return Value:
HRESULT indicating success or failure.
pszUPN - UPN (email address) of the user
ppMgmtInfo - Struct containing MDM service URIs
--*/
HRESULT WINAPI
DiscoverManagementService(
_In_z_ LPCWSTR pszUPN,
_Out_ PMANAGEMENT_SERVICE_INFO* ppMgmtInfo
);
/*++
Routine Description:
This function is used to register a device with the MDM service synchronously.
It will get the MDM information, including authentication token from AAD
Arguments:
UserToken - The User to impersonate when attempting to get AAD token
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
RegisterDeviceWithManagementUsingAADCredentials(HANDLE UserToken);
/*++
Routine Description:
This function is used to register a device with the MDM service synchronously.
It will get the MDM information, including authentication device token from AAD
Arguments:
None
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
RegisterDeviceWithManagementUsingAADDeviceCredentials();
/*++
Routine Description:
This function is used to register a device with the MDM service synchronously.
It will automatically discover the MDM information, including MDM device enrollment resource URL and authentication device token from AAD
Arguments:
MDMApplicationID Unique ID of MDM application that is configured in Azure AD.
Only required when multiple MDM applications are configured on the Azure AD.
The maximum length is 255 characters, excluding the terminal null.
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
RegisterDeviceWithManagementUsingAADDeviceCredentials2(_In_opt_ PCWSTR MDMApplicationID);
/*++
Routine Description:
This function is used to register a device with the MDM service synchronously.
Arguments:
pszUPN - UPN (email address) of the user
ppszMDMServiceUri - Management service URI.
pszAccessToken - Access token obtained from the authentication service
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
RegisterDeviceWithManagement(
_In_z_ LPCWSTR pszUPN,
_In_z_ LPCWSTR ppszMDMServiceUri,
_In_z_ LPCWSTR ppzsAccessToken
);
/*++
Routine Description:
This function is used to unregister a device synchronously.
Arguments:
enrollmentID - enrollmentID of specific management server to unregister from. If null, then unregister from all.
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
UnregisterDeviceWithManagement(
_In_opt_z_ LPCWSTR enrollmentID
);
/*++
Routine Description:
This API is used to get the config info associated with the provider ID.
Arguments:
providerID - string parameter containing the providerID
configStringBufferLength - pointer to the buffer length (size of configString in chars)
configString - a buffer that will contain the ConfigInfo if the function completes successfully
If the buffer specified by configString parameter is not large enough to hold the data, the function
returns ERROR_MORE_DATA and stores the required buffer size in the variable pointed to by configStringBufferLength.
In this case, the contents of the configString buffer are undefined.
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
GetDeviceManagementConfigInfo(
_In_ PCWSTR providerID,
_Inout_ DWORD* configStringBufferLength,
_Out_writes_to_opt_(*configStringBufferLength, *configStringBufferLength) PWSTR configString
);
/*++
Routine Description:
This API is used to set the config info associated with the provider ID.
Arguments:
providerID - string parameter containing the providerID
configString - string parameter containing the ConfigInfo(data to write)
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
SetDeviceManagementConfigInfo(
_In_ PCWSTR providerID,
_In_ PCWSTR configString
);
/*++
Routine Description:
This API is used to get the management app hyperlink associated with the MDM service.
Arguments:
cchHyperlink - maximum length of pszHyperlink
pszHyperlink - string parameter to return a hyperlink to the MDM service app
Return Value:
HRESULT indicating success or failure.
--*/
HRESULT WINAPI
GetManagementAppHyperlink(
_In_ DWORD cchHyperlink,
_Out_z_cap_(cchHyperlink) LPWSTR pszHyperlink
);
#if (NTDDI_VERSION > NTDDI_WINBLUE || \
(NTDDI_VERSION == NTDDI_WINBLUE && defined(WINBLUE_KBSPRING14)))
/*++
Routine Description:
This function is used to discover the MDM service given a candidate service URL
Arguments:
pszUPN - email
pszDiscoveryServiceCandidate - candidate service URL for for discovery
Return Value:
HRESULT indicating success or failure.
ppMgmtInfo - Struct containing MDM service URIs
--*/
HRESULT WINAPI
DiscoverManagementServiceEx(
_In_z_ LPCWSTR pszUPN,
_In_z_ LPCWSTR pszDiscoveryServiceCandidate,
_Out_ PMANAGEMENT_SERVICE_INFO* ppMgmtInfo
);
#endif
#ifdef __cplusplus
}
#endif
#endif // WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_DESKTOP)
#pragma endregion
#endif // _MDM_REG_
#endif // NTDDI_VERSION >= NTDDI_WINBLUE
|
A 56-year-old patient fitted with a femoro-popliteal Dacron prosthesis (Weavenit, Meadox Medical) 84 months earlier presented with a severe palpable dilation in the middle part of the graft. The dilated segment of the prosthesis was resected locally and spliced to a new fabric device. The explanted section of prosthesis was analyzed; the dilation was found to extend only over several centimeters near the centre of the prosthesis. The degree of "healing" was found to be satisfactory except in the part which was most distended; this area also revealed mild bacterial colonization. Physico-chemical analysis of the fabric revealed that the dilation had taken place subsequent to hydrolytic degradation of the polyester fibres. Manufacturing conditions may have contributed to predisposing parts of the prosthesis to accelerated biodeterioration in vivo.
def shave_bd(img, bd):
    """Crop a border of width bd pixels from every side of an H x W x C image."""
    return img[bd:-bd, bd:-bd, :]
Lifetime heavy drinking years predict alcohol use disorder severity over and above current alcohol use
ABSTRACT Background: Preclinical studies demonstrate that chronic and heavy alcohol use facilitates neuroadaptations that perpetuate addiction-like behaviors. In clinical studies, it is unclear whether the extent of heavy alcohol use over the lifetime contributes to alcohol use disorder (AUD) severity over and above current alcohol use patterns (i.e. last 30 days to 3-months). Such information may improve our understanding of the phenomenology of AUD. Objectives: The purpose of this study was to examine lifetime heavy drinking years in relation to a clinical assessment of AUD. Methods: Participants, who were non-treatment-seeking and engaged in heavy drinking (n = 140; 50% male), completed an interview-based assessment of lifetime regular and heavy drinking years along with a battery of measures indexing alcohol use and problems, drinking motives, and depression and anxiety symptomatology. Results: Lifetime heavy drinking years was positively associated with lifetime regular drinking years, current alcohol use, alcohol problems, tonic alcohol craving, drinking for the enhancing effects of alcohol, and drinking to cope (r’s = .21–.58). Adjusting for lifetime regular drinking years and current alcohol use, lifetime heavy drinking years predicted higher scores on the Alcohol Use Disorder Identification Test (AUDIT; B = .382; SE = .123). A multivariate logistic regression found that lifetime heavy drinking years predicted greater odds of more severe AUD over and above current alcohol use (OR = 1.147). Conclusion: Our findings suggest that lifetime heavy drinking years are a clinically meaningful indicator of AUD severity that is not redundant with current alcohol use measures. |
package amf0
import (
"bytes"
"io/ioutil"
"testing"
"github.com/stretchr/testify/assert"
)
func TestNullBuildsAndEncodes(t *testing.T) {
buf := new(bytes.Buffer)
n, err := new(Null).Encode(buf)
assert.Nil(t, err)
assert.Equal(t, 0, n)
assert.Empty(t, buf.Bytes())
}
func TestNullDecodes(t *testing.T) {
o := new(Null)
err := o.Decode(new(bytes.Buffer))
assert.Nil(t, err)
}
func BenchmarkNullDecode(b *testing.B) {
out := new(Null)
for i := 0; i < b.N; i++ {
out.Decode(bytes.NewReader([]byte{}))
}
}
func BenchmarkNullEncode(b *testing.B) {
n := new(Null)
for i := 0; i < b.N; i++ {
n.Encode(ioutil.Discard)
}
}
func TestUndefinedBuildsAndEncodes(t *testing.T) {
buf := new(bytes.Buffer)
n, err := new(Undefined).Encode(buf)
assert.Nil(t, err)
assert.Equal(t, 0, n)
assert.Empty(t, buf.Bytes())
}
func TestUndefinedDecodes(t *testing.T) {
o := new(Undefined)
err := o.Decode(new(bytes.Buffer))
assert.Nil(t, err)
}
func BenchmarkUndefinedDecode(b *testing.B) {
out := new(Undefined)
for i := 0; i < b.N; i++ {
out.Decode(bytes.NewReader([]byte{}))
}
}
func BenchmarkUndefinedEncode(b *testing.B) {
n := new(Undefined)
for i := 0; i < b.N; i++ {
n.Encode(ioutil.Discard)
}
}
|
def fibonacci_recursive(n=10, first_pass=True, fibonacci_db=None, prints=True):
    """Recursively build (and optionally print) the first n Fibonacci numbers."""
    # Input validation: only non-negative integers are accepted.
    if not isinstance(n, int) or n < 0:
        return "Invalid input! Input must be positive integers."
    if first_pass and n == 2:
        return [0, 1]
    if first_pass and n == 1:
        return [0]
    if fibonacci_db is None:
        fibonacci_db = [0, 1]
    if n == 0:
        return fibonacci_db
    if first_pass:
        if prints:
            print("0 1", end=" ")
        n -= 2
    # The next Fibonacci number is the sum of the last two stored values.
    nxt = fibonacci_db[-1] + fibonacci_db[-2]
    if prints:
        print(nxt, end=" ")
    fibonacci_db.append(nxt)
    return fibonacci_recursive(n - 1, first_pass=False, fibonacci_db=fibonacci_db, prints=prints)
"""
CRUD de SQLite3 con Python 3
"""
import sqlite3
try:
bd = sqlite3.connect("libros.db")
cursor = bd.cursor()
sentencia = "SELECT * FROM libros;"
cursor.execute(sentencia)
libros = cursor.fetchall()
print("+{:-<20}+{:-<20}+{:-<10}+{:-<50}+".format("", "", "", ""))
print("|{:^20}|{:^20}|{:^10}|{:^50}|".format("Autor", "Género", "Precio", "Título"))
print("+{:-<20}+{:-<20}+{:-<10}+{:-<50}+".format("", "", "", ""))
for autor, genero, precio, titulo in libros:
print("|{:^20}|{:^20}|{:^10}|{:^50}|".format(autor, genero, precio, titulo))
print("+{:-<20}+{:-<20}+{:-<10}+{:-<50}+".format("", "", "", ""))
    bd.close()
except sqlite3.OperationalError as error:
    print("Error opening the database:", error)
// TODO(eroman): Move this to anonymous namespace.
Status ExportKeyDontCheckExtractability(blink::WebCryptoKeyFormat format,
const blink::WebCryptoKey& key,
std::vector<uint8>* buffer) {
switch (format) {
case blink::WebCryptoKeyFormatRaw: {
platform::SymKey* sym_key;
Status status = ToPlatformSymKey(key, &sym_key);
if (status.IsError())
return status;
return platform::ExportKeyRaw(sym_key, buffer);
}
case blink::WebCryptoKeyFormatSpki: {
platform::PublicKey* public_key;
Status status = ToPlatformPublicKey(key, &public_key);
if (status.IsError())
return status;
return platform::ExportKeySpki(public_key, buffer);
}
case blink::WebCryptoKeyFormatPkcs8: {
platform::PrivateKey* private_key;
Status status = ToPlatformPrivateKey(key, &private_key);
if (status.IsError())
return status;
return platform::ExportKeyPkcs8(private_key, key.algorithm(), buffer);
}
case blink::WebCryptoKeyFormatJwk:
return ExportKeyJwk(key, buffer);
default:
return Status::ErrorUnsupported();
}
}
def child(self):
        return Context(parent=self)
package accountviewnet_test
import (
"encoding/json"
"log"
"testing"
)
func TestDjPageGet(t *testing.T) {
req := client.NewDjPageGetRequest()
req.QueryParams().PageSize = 20
req.QueryParams().FilterControlSource1 = "DJ_CODE"
req.QueryParams().FilterOperator1 = "Equal"
req.QueryParams().FilterValueType1 = "C"
req.QueryParams().FilterValue1 = "860"
req.QueryParams().FilterValue1 = "123"
resp, err := req.Do()
if err != nil {
t.Error(err)
}
b, _ := json.MarshalIndent(resp, "", " ")
log.Println(string(b))
}
// packages/web/src/components/FormCreateQuiz/components/FormQuiz/styles.ts
import styled from 'styled-components'
import theme from '../../../../styles/theme'
export const InputStyled = styled.input`
width: 100%;
height: 40px;
border-radius: 5px;
outline: 0;
background: transparent;
border: 1px solid ${theme.colors.mainBg};
color: ${theme.colors.contrastText};
padding: 0 10px;
font-size: 18px;
margin: 10px 0;
`
export const TextAreaStyled = styled.textarea`
width: 100%;
height: 80px;
display: flex;
border-radius: 5px;
outline: 0;
background: transparent;
border: 1px solid ${theme.colors.mainBg};
color: ${theme.colors.contrastText};
padding: 0 10px;
font-size: 18px;
margin: 10px 0;
`
def parse(cls, key_bytes: bytes) -> "PublicKey":
        return cls(ecdsa.VerifyingKey.from_string(key_bytes, curve=SECP256k1))
log_level = "INFO"
max_task_count = 1000
poll_db_interval = 100
mq_queue = "downloader_queue"
mq_routing_key = "downloader_routing_key"
mq_exchange = "downloader_exchange"
config_domains = ['youku.com', 'ykimg.com', 'tudou.com',
'tudouui.com', 'tdimg.com', 'le.com',
'letv.com', 'letvcdn.com', 'iqiyi.com',
'qiyi.com', 'sohu.com', 'qq.com',
'qzoneapp.com', 'gtimg.com']
# config_dates = "AND url_date>='20160827' AND url_date<='20160829'"
config_dates = ""
def first(self) -> "Locator":
        return mapping.from_impl(self._impl_obj.first)
// repository: potherca-contrib/origin-community-cartridges
/*
* RHQ Management Platform
* Copyright (C) 2005-2011 Red Hat, Inc.
* All rights reserved.
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation version 2 of the License.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
*/
package org.rhq.enterprise.gui.coregui.client.components.table;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import com.google.gwt.user.client.Timer;
import com.smartgwt.client.data.Criteria;
import com.smartgwt.client.data.DSCallback;
import com.smartgwt.client.data.DSRequest;
import com.smartgwt.client.data.DSResponse;
import com.smartgwt.client.data.DataSourceField;
import com.smartgwt.client.data.Record;
import com.smartgwt.client.data.ResultSet;
import com.smartgwt.client.data.SortSpecifier;
import com.smartgwt.client.types.ListGridFieldType;
import com.smartgwt.client.types.Overflow;
import com.smartgwt.client.types.SelectionStyle;
import com.smartgwt.client.types.VerticalAlignment;
import com.smartgwt.client.util.BooleanCallback;
import com.smartgwt.client.util.SC;
import com.smartgwt.client.widgets.Canvas;
import com.smartgwt.client.widgets.HTMLFlow;
import com.smartgwt.client.widgets.IButton;
import com.smartgwt.client.widgets.Label;
import com.smartgwt.client.widgets.events.ClickEvent;
import com.smartgwt.client.widgets.events.ClickHandler;
import com.smartgwt.client.widgets.events.DoubleClickEvent;
import com.smartgwt.client.widgets.events.DoubleClickHandler;
import com.smartgwt.client.widgets.form.fields.ComboBoxItem;
import com.smartgwt.client.widgets.form.fields.FormItem;
import com.smartgwt.client.widgets.form.fields.HiddenItem;
import com.smartgwt.client.widgets.form.fields.SelectItem;
import com.smartgwt.client.widgets.form.fields.TextItem;
import com.smartgwt.client.widgets.form.fields.events.ChangedEvent;
import com.smartgwt.client.widgets.form.fields.events.ChangedHandler;
import com.smartgwt.client.widgets.form.fields.events.KeyPressEvent;
import com.smartgwt.client.widgets.form.fields.events.KeyPressHandler;
import com.smartgwt.client.widgets.grid.ListGrid;
import com.smartgwt.client.widgets.grid.ListGridField;
import com.smartgwt.client.widgets.grid.events.DataArrivedEvent;
import com.smartgwt.client.widgets.grid.events.DataArrivedHandler;
import com.smartgwt.client.widgets.grid.events.SelectionChangedHandler;
import com.smartgwt.client.widgets.grid.events.SelectionEvent;
import com.smartgwt.client.widgets.layout.Layout;
import com.smartgwt.client.widgets.layout.LayoutSpacer;
import com.smartgwt.client.widgets.menu.IMenuButton;
import com.smartgwt.client.widgets.menu.MenuItem;
import com.smartgwt.client.widgets.menu.events.MenuItemClickEvent;
import org.rhq.core.domain.search.SearchSubsystem;
import org.rhq.enterprise.gui.coregui.client.CoreGUI;
import org.rhq.enterprise.gui.coregui.client.InitializableView;
import org.rhq.enterprise.gui.coregui.client.RefreshableView;
import org.rhq.enterprise.gui.coregui.client.components.TitleBar;
import org.rhq.enterprise.gui.coregui.client.components.form.DateFilterItem;
import org.rhq.enterprise.gui.coregui.client.components.form.EnhancedSearchBarItem;
import org.rhq.enterprise.gui.coregui.client.util.CriteriaUtility;
import org.rhq.enterprise.gui.coregui.client.util.Log;
import org.rhq.enterprise.gui.coregui.client.util.RPCDataSource;
import org.rhq.enterprise.gui.coregui.client.util.message.Message;
import org.rhq.enterprise.gui.coregui.client.util.selenium.LocatableDynamicForm;
import org.rhq.enterprise.gui.coregui.client.util.selenium.LocatableHLayout;
import org.rhq.enterprise.gui.coregui.client.util.selenium.LocatableIButton;
import org.rhq.enterprise.gui.coregui.client.util.selenium.LocatableIMenuButton;
import org.rhq.enterprise.gui.coregui.client.util.selenium.LocatableListGrid;
import org.rhq.enterprise.gui.coregui.client.util.selenium.LocatableMenu;
import org.rhq.enterprise.gui.coregui.client.util.selenium.LocatableToolStrip;
import org.rhq.enterprise.gui.coregui.client.util.selenium.LocatableVLayout;
import org.rhq.enterprise.gui.coregui.client.util.selenium.SeleniumUtility;
/**
* A tabular view of set of data records from an {@link RPCDataSource}.
*
* WARNING! If you make _any_ changes to this class, no matter how seemingly
* trivial, you must get it peer reviewed. Send out your proposed changes
* to the dev mailing list and ask for comments. Any problems introduced to
* this class are magnified because it is used in so many UI views and problems
* are hard to detect due to the various ways it is used.
*
* @author <NAME>
* @author <NAME>inger
*/
@SuppressWarnings("rawtypes")
public class Table<DS extends RPCDataSource> extends LocatableHLayout implements RefreshableView, InitializableView {
private static final int DATA_PAGE_SIZE = 50;
protected static final String FIELD_ID = "id";
protected static final String FIELD_NAME = "name";
private LocatableVLayout contents;
private HTMLFlow titleCanvas;
private LocatableVLayout titleLayout;
private TitleBar titleBar;
private String titleIcon;
private TableFilter filterForm;
private ListGrid listGrid;
private Label tableInfo;
private List<String> headerIcons = new ArrayList<String>();
private boolean showHeader = true;
private boolean showFooter = true;
private boolean showFooterRefresh = true;
private boolean showFilterForm = true;
private String titleString;
private Criteria initialCriteria;
private boolean initialCriteriaFixed = true;
private SortSpecifier[] sortSpecifiers;
private String[] excludedFieldNames;
private boolean autoFetchData;
private boolean flexRowDisplay = true;
private boolean hideSearchBar = false;
private String initialSearchBarSearchText = null;
private DS dataSource;
private DoubleClickHandler doubleClickHandler;
private List<TableActionInfo> tableActions = new ArrayList<TableActionInfo>();
private boolean tableActionDisableOverride = false;
protected List<Canvas> extraWidgetsAboveFooter = new ArrayList<Canvas>();
protected List<Canvas> extraWidgetsInMainFooter = new ArrayList<Canvas>();
private LocatableToolStrip footer;
private LocatableToolStrip footerExtraWidgets;
private LocatableIButton refreshButton;
private boolean initialized;
public Table(String locatorId) {
this(locatorId, null, null, null, null, true);
}
public Table(String locatorId, String tableTitle) {
this(locatorId, tableTitle, null, null, null, true);
}
public Table(String locatorId, String tableTitle, String icon) {
this(locatorId, tableTitle, null, null, null, true);
setTitleIcon(icon);
}
public Table(String locatorId, String tableTitle, Criteria criteria, String icon) {
this(locatorId, tableTitle, criteria, null, null, (criteria == null));
setTitleIcon(icon);
}
public Table(String locatorId, String tableTitle, Criteria criteria) {
this(locatorId, tableTitle, criteria, null, null, (criteria == null));
}
public Table(String locatorId, String tableTitle, SortSpecifier[] sortSpecifiers) {
this(locatorId, tableTitle, null, sortSpecifiers, null, true);
}
protected Table(String locatorId, String tableTitle, SortSpecifier[] sortSpecifiers, Criteria criteria) {
this(locatorId, tableTitle, criteria, sortSpecifiers, null, (criteria == null));
}
public Table(String locatorId, String tableTitle, boolean autoFetchData) {
this(locatorId, tableTitle, null, null, null, autoFetchData);
}
public Table(String locatorId, String tableTitle, SortSpecifier[] sortSpecifiers, String[] excludedFieldNames) {
this(locatorId, tableTitle, null, sortSpecifiers, excludedFieldNames, true);
}
public Table(String locatorId, String tableTitle, Criteria criteria, SortSpecifier[] sortSpecifiers,
String[] excludedFieldNames) {
this(locatorId, tableTitle, criteria, sortSpecifiers, excludedFieldNames, (criteria == null));
}
public Table(String locatorId, String tableTitle, Criteria criteria, SortSpecifier[] sortSpecifiers,
String[] excludedFieldNames, boolean autoFetchData) {
super(locatorId);
if (criteria != null && autoFetchData) {
throw new IllegalArgumentException(
"Non-null initialCriteria and autoFetchData=true cannot be specified together, due to a bug in SmartGWT.");
}
setWidth100();
setHeight100();
//setOverflow(Overflow.HIDDEN);
this.titleString = tableTitle;
this.initialCriteria = criteria;
this.sortSpecifiers = sortSpecifiers;
this.excludedFieldNames = excludedFieldNames;
this.autoFetchData = autoFetchData;
}
/**
* If this returns true, then even if a {@link #getSearchSubsystem() search subsystem}
* is defined by the table class, the search bar will not be shown.
*
* @return true if the search bar is to be hidden (default is false)
*/
public boolean getHideSearchBar() {
return this.hideSearchBar;
}
public void setHideSearchBar(boolean flag) {
this.hideSearchBar = flag;
}
public String getInitialSearchBarSearchText() {
return this.initialSearchBarSearchText;
}
public void setInitialSearchBarSearchText(String text) {
this.initialSearchBarSearchText = text;
}
public void setFlexRowDisplay(boolean flexRowDisplay) {
this.flexRowDisplay = flexRowDisplay;
}
/**
* Override Point:
* Override this method to use a customized layout.
*
* @return the Layout for all of the Table contents
* @see #configureTableContents(Layout)
*/
protected LocatableVLayout createTableContents() {
if (null == contents) {
contents = new LocatableVLayout(extendLocatorId("Contents"));
}
return contents;
}
/**
* Override Point:
* Override this method to set Layout attributes that can be set at initialization time. By default
* the contents layout is:
* <pre>
* - set to 100% width and height
* - set to Overflow.AUTO
* </pre>
*
* @param contents
*/
protected void configureTableContents(Layout contents) {
contents.setWidth100();
contents.setHeight100();
contents.setOverflow(Overflow.AUTO);
}
public LocatableVLayout getTableContents() {
return contents;
}
/**
* Override Point:
* Override this method to set grid properties outside of those required by Table or set via Table's constructors.
* The default implementation:
* <pre>
* - sets to 100% width and height
* </pre>
* This is called from onInit() and guarantees grid not null.
*
* @param grid
*/
protected void configureListGrid(ListGrid grid) {
        grid.setWidth100();
        grid.setHeight100();
        grid.setLoadingDataMessage("${loadingImage} " + MSG.common_msg_loading());
        grid.setEmptyMessage(MSG.common_msg_noItemsToShow());
}
@Override
protected void onInit() {
super.onInit();
contents = createTableContents();
configureTableContents(contents);
addMember(contents);
filterForm = new TableFilter(this);
// Table filters and search bar are currently mutually exclusive.
if (getSearchSubsystem() == null) {
configureTableFilters();
} else {
if (!this.hideSearchBar) {
final EnhancedSearchBarItem searchFilter = new EnhancedSearchBarItem("search", getSearchSubsystem(),
getInitialSearchBarSearchText());
setFilterFormItems(searchFilter);
}
}
listGrid = createListGrid(contents.extendLocatorId("ListGrid"));
configureListGrid(listGrid);
listGrid.setAutoFetchData(autoFetchData);
if (initialCriteria != null) {
listGrid.setInitialCriteria(initialCriteria);
}
if (sortSpecifiers != null) {
listGrid.setInitialSort(sortSpecifiers);
}
listGrid.setAlternateRecordStyles(true);
listGrid.setResizeFieldsInRealTime(false);
listGrid.setSelectionType(getDefaultSelectionStyle());
listGrid.setDataPageSize(DATA_PAGE_SIZE); // the default is 75 - lower it to speed up data loading
// Don't fetch more than 200 records for the sake of an attempt to group-by.
listGrid.setGroupByMaxRecords(200); // the default is 1000
// Disable the group-by option in the column header context menu, since group-by requires the entire data set to
// by loaded on the client-side, which isn't practical for most of our list views, since they can contain
// thousands of records.
listGrid.setCanGroupBy(false);
if (flexRowDisplay) {
//listGrid.setAutoFitData(Autofit.HORIZONTAL); // do NOT set this - smartgwt appears to have a problem that causes it to eat CPU
listGrid.setWrapCells(true);
listGrid.setFixedRecordHeights(false);
}
// By default, SmartGWT will disable any rows that have a record named "enabled" with a value of false - setting
// these fields to a bogus field name will disable this behavior. Note, setting them to null does *not* disable
// the behavior.
listGrid.setRecordEnabledProperty("foobar");
listGrid.setRecordEditProperty("foobar");
// TODO: Uncomment the below line once we've upgraded to SmartGWT 2.3.
//listGrid.setRecordCanSelectProperty("foobar");
DS dataSource = getDataSource();
if (dataSource != null) {
dataSource.setDataPageSize(DATA_PAGE_SIZE);
listGrid.setDataSource(dataSource);
}
this.initialized = true;
}
@Override
public boolean isInitialized() {
return this.initialized;
}
protected SelectionStyle getDefaultSelectionStyle() {
return SelectionStyle.MULTIPLE;
}
// Table is an InitializableView. This onDraw() waits until we're sure we're initialized and then
// lays down the canvas. This gives subclasses a chance to perform initialization (including async calls)
// required to support the overrides (like configureTable()) they may have provided and that are called in
// doOnDraw().
@Override
protected void onDraw() {
super.onDraw();
if (isInitialized()) {
doOnDraw();
} else {
new Timer() {
final long startTime = System.currentTimeMillis();
public void run() {
if (isInitialized()) {
doOnDraw();
cancel();
} else {
// if after 10s we still aren't initialized just keep going, else try again
long elapsedMillis = System.currentTimeMillis() - startTime;
if (elapsedMillis < 10000L) {
schedule(100); // Reschedule the timer.
} else {
doOnDraw();
cancel();
}
}
}
}.run(); // fire the timer immediately
}
}
protected void doOnDraw() {
try {
// I'm not sure this is necessary as I'm not sure it's the case that draw()/onDraw() will get called
// multiple times. But if it did/does, this protects us by removing the current members before they
// get set below. Note that by having this here we *can not* add members in onInit, because they will
// immediately get removed. -jshaughn
for (Canvas child : contents.getMembers()) {
contents.removeChild(child);
}
// Title
this.titleCanvas = new HTMLFlow();
updateTitleCanvas(this.titleString);
if (showHeader) {
// titleLayout not really needed now as TitleBar has a LocatableVLayout
titleLayout = new LocatableVLayout(contents.extendLocatorId("Title"));
titleLayout.setAutoHeight();
titleLayout.setAlign(VerticalAlignment.BOTTOM);
contents.addMember(titleLayout, 0);
}
if (filterForm.hasContent()) {
contents.addMember(filterForm);
}
// add the listGrid defined in onInit
contents.addMember(listGrid);
// Footer
// A second toolstrip that optionally appears before the main footer - it will contain extra widgets.
// This is hidden from view unless extra widgets are actually added to the table above the main footer.
this.footerExtraWidgets = new LocatableToolStrip(contents.extendLocatorId("FooterExtraWidgets"));
footerExtraWidgets.setPadding(5);
footerExtraWidgets.setWidth100();
footerExtraWidgets.setMembersMargin(15);
footerExtraWidgets.hide();
contents.addMember(footerExtraWidgets);
this.footer = new LocatableToolStrip(contents.extendLocatorId("Footer"));
footer.setPadding(5);
footer.setWidth100();
footer.setMembersMargin(15);
if (!showFooter) {
footer.hide();
}
contents.addMember(footer);
// The ListGrid has been created and configured - now give subclasses a chance to configure the table.
configureTable();
listGrid.addDoubleClickHandler(new DoubleClickHandler() {
@Override
public void onDoubleClick(DoubleClickEvent event) {
if (doubleClickHandler != null && !getTableActionDisableOverride()) {
doubleClickHandler.onDoubleClick(event);
}
}
});
Label tableInfo = new Label();
tableInfo.setWrap(false);
setTableInfo(tableInfo);
refreshRowCount();
// NOTE: It is essential that we wait to hide any excluded fields until after super.onDraw() is called, since
// super.onDraw() is what actually adds the fields to the ListGrid (based on what fields are defined in
// the underlying datasource).
if (this.excludedFieldNames != null) {
for (String excludedFieldName : excludedFieldNames) {
this.listGrid.hideField(excludedFieldName);
}
}
if (showHeader) {
drawHeader();
}
if (showFooter) {
drawFooter();
}
if (!autoFetchData && (initialCriteria != null)) {
refresh();
}
} catch (Exception e) {
CoreGUI.getErrorHandler().handleError(MSG.view_table_drawFail(this.toString()), e);
}
markForRedraw();
}
private void refreshRowCount() {
Label tableInfo = getTableInfo();
if (tableInfo != null) {
boolean lengthIsKnown = false;
if (listGrid != null) {
ResultSet results = listGrid.getResultSet();
if (results != null) {
Boolean flag = results.lengthIsKnown();
if (flag != null) {
lengthIsKnown = flag.booleanValue();
}
} else {
lengthIsKnown = (listGrid.getDataSource() == null); // not bound by a datasource, assume we know
}
}
String contents;
if (lengthIsKnown) {
int totalRows = this.listGrid.getTotalRows();
int selectedRows = this.listGrid.getSelectedRecords().length;
contents = MSG.view_table_totalRows(String.valueOf(totalRows), String.valueOf(selectedRows));
} else {
contents = MSG.view_table_totalRowsUnknown();
}
tableInfo.setContents(contents);
}
}
@Override
public void destroy() {
this.initialized = false;
// immediately null out the listGrid to stop async refresh requests from executing during the destroy
// logic. This happens in selenium testing or when a user navs away prior to the refresh.
this.listGrid = null;
SeleniumUtility.destroyMembers(createTableContents());
super.destroy();
}
private void drawHeader() {
// just use the first icon (not sure use case for multiple icons in title)
titleBar = new TitleBar(titleLayout, titleString);
if (titleIcon != null) {
titleBar.setIcon(titleIcon);
}
titleLayout.addMember(titleBar);
titleLayout.addMember(titleCanvas);
}
private void drawFooter() {
// populate the extraWidgets toolstrip
footerExtraWidgets.removeMembers(footerExtraWidgets.getMembers());
if (!extraWidgetsAboveFooter.isEmpty()) {
for (Canvas extraWidgetCanvas : extraWidgetsAboveFooter) {
footerExtraWidgets.addMember(extraWidgetCanvas);
}
footerExtraWidgets.show();
}
footer.removeMembers(footer.getMembers());
for (final TableActionInfo tableAction : tableActions) {
if (null == tableAction.getValueMap()) {
// button action
IButton button = new LocatableIButton(tableAction.getLocatorId(), tableAction.getTitle());
button.setDisabled(true);
button.setOverflow(Overflow.VISIBLE);
button.addClickHandler(new ClickHandler() {
public void onClick(ClickEvent clickEvent) {
disableAllFooterControls();
if (tableAction.confirmMessage != null) {
String message = tableAction.confirmMessage.replaceAll("\\#",
String.valueOf(listGrid.getSelectedRecords().length));
SC.ask(message, new BooleanCallback() {
public void execute(Boolean confirmed) {
if (confirmed) {
tableAction.action.executeAction(listGrid.getSelectedRecords(), null);
} else {
refreshTableInfo();
}
}
});
} else {
tableAction.action.executeAction(listGrid.getSelectedRecords(), null);
}
}
});
tableAction.actionCanvas = button;
footer.addMember(button);
} else {
// menu action
LocatableMenu menu = new LocatableMenu(tableAction.getLocatorId() + "Menu");
final Map<String, Object> menuEntries = tableAction.getValueMap();
for (final String key : menuEntries.keySet()) {
MenuItem item = new MenuItem(key);
item.addClickHandler(new com.smartgwt.client.widgets.menu.events.ClickHandler() {
public void onClick(MenuItemClickEvent event) {
disableAllFooterControls();
tableAction.getAction().executeAction(listGrid.getSelectedRecords(), menuEntries.get(key));
}
});
menu.addItem(item);
}
IMenuButton menuButton = new LocatableIMenuButton(tableAction.getLocatorId(), tableAction.getTitle());
menuButton.setMenu(menu);
menuButton.setDisabled(true);
menuButton.setAutoFit(true); // this makes it pretty tight, but maybe better than the default, which is pretty wide
menuButton.setOverflow(Overflow.VISIBLE);
menuButton.setShowMenuBelow(false);
tableAction.actionCanvas = menuButton;
footer.addMember(menuButton);
}
}
for (Canvas extraWidgetCanvas : extraWidgetsInMainFooter) {
footer.addMember(extraWidgetCanvas);
}
footer.addMember(new LayoutSpacer());
if (isShowFooterRefresh()) {
this.refreshButton = new LocatableIButton(extendLocatorId("Refresh"), MSG.common_button_refresh());
refreshButton.addClickHandler(new ClickHandler() {
public void onClick(ClickEvent clickEvent) {
disableAllFooterControls();
refresh();
}
});
footer.addMember(refreshButton);
}
footer.addMember(tableInfo);
// Manages enable/disable buttons for the grid
listGrid.addSelectionChangedHandler(new SelectionChangedHandler() {
public void onSelectionChanged(SelectionEvent selectionEvent) {
refreshTableInfo();
}
});
listGrid.addDataArrivedHandler(new DataArrivedHandler() {
public void onDataArrived(DataArrivedEvent dataArrivedEvent) {
if (null != listGrid) {
refreshTableInfo();
}
}
});
// Ensure buttons are initially set correctly.
refreshTableInfo();
}
public void disableAllFooterControls() {
for (TableActionInfo tableAction : tableActions) {
tableAction.actionCanvas.disable();
}
for (Canvas extraWidget : extraWidgetsAboveFooter) {
extraWidget.disable();
}
for (Canvas extraWidget : extraWidgetsInMainFooter) {
extraWidget.disable();
}
if (isShowFooterRefresh() && this.refreshButton != null) {
this.refreshButton.disable();
}
}
/**
* Subclasses can use this as a chance to configure the list grid after it has been
* created but before it has been drawn to the DOM. This is also the proper place to add table
* actions so that they're rendered in the footer.
*/
protected void configureTable() {
return;
}
public void setFilterFormItems(FormItem... formItems) {
setShowHeader(false);
this.filterForm.setItems(formItems);
this.filterForm.setNumCols(4);
}
/**
* Overriding components can use this as a chance to add {@link FormItem}s which will filter
* the table that displays their data.
*/
protected void configureTableFilters() {
}
public String getTitleString() {
return this.titleString;
}
/**
* Set the Table's title string. This will subsequently call {@link #updateTitleCanvas(String)}.
* @param titleString
*/
public void setTitleString(String titleString) {
this.titleString = titleString;
if (this.titleCanvas != null) {
updateTitleCanvas(titleString);
}
}
public void setTitleIcon(String titleIcon) {
this.titleIcon = titleIcon;
}
public Canvas getTitleCanvas() {
return this.titleCanvas;
}
/**
* To set the Table's title, call {@link #setTitleString(String)}. This is primarily declared for purposes of
* override.
* @param titleString
*/
public void updateTitleCanvas(String titleString) {
if (titleString == null) {
titleString = "";
}
        titleCanvas.setContents(titleString);
        titleCanvas.markForRedraw();
}
public boolean isShowHeader() {
return showHeader;
}
public void setShowHeader(boolean showHeader) {
this.showHeader = showHeader;
}
public boolean isShowFooter() {
return showFooter;
}
public void setShowFooter(boolean showFooter) {
this.showFooter = showFooter;
}
/**
* Refreshes the list grid's data, filtered by any fixed criteria, as well as any user-specified filters.
*/
public void refresh() {
refresh(false);
}
/**
* Refreshes the list grid's data, filtered by any fixed criteria, as well as any user-specified filters.
* <p/>
* If resetPaging is true, resets paging on the grid prior to refreshing the data. resetPaging=true should be
* specified when refreshing right after records have been deleted, since the current paging settings may have
* become invalid due to the decrease in the total data set size.
*/
public void refresh(boolean resetPaging) {
if (!isInitialized()) {
return;
}
final ListGrid listGrid = getListGrid();
Criteria criteria = getCurrentCriteria();
Log.debug(getClass().getName() + ".refresh() using criteria [" + CriteriaUtility.toString(criteria) + "]...");
listGrid.setCriteria(criteria);
if (resetPaging) {
listGrid.scrollToRow(0);
}
// Only call invalidateCache() and fetchData() if the ListGrid is backed by a DataSource.
if (listGrid.getDataSource() != null) {
// Invalidate the cached records - if listGrid.getAutoFetchData() is true, this will cause the ListGrid to
// automatically call fetchData().
listGrid.invalidateCache();
if (!this.autoFetchData && (initialCriteria != null)) {
listGrid.fetchData(criteria);
}
}
listGrid.markForRedraw();
}
protected Criteria getInitialCriteria() {
return initialCriteria;
}
/**
* Can be called in constructor to reset initialCriteria.
* @param initialCriteria
*/
protected void setInitialCriteria(Criteria initialCriteria) {
this.initialCriteria = initialCriteria;
}
protected boolean isInitialCriteriaFixed() {
return initialCriteriaFixed;
}
/**
* @param initialCriteriaFixed If true initialCriteria is applied to all subsequent fetch criteria. If false
* initialCriteria is used only for the initial autoFetch. Irrelevant if autoFetch is false. Default is true.
*/
protected void setInitialCriteriaFixed(boolean initialCriteriaFixed) {
this.initialCriteriaFixed = initialCriteriaFixed;
}
/**
*
* @return the current criteria, which includes any fixed criteria, as well as any user-specified filters; may be
* null if there are no fixed criteria or user-specified filters
*/
protected Criteria getCurrentCriteria() {
Criteria criteria = null;
// If this table has a filter form (table filters OR search bar),
// we need to refresh it as per the filtering, combined with any fixed criteria.
if (this.filterForm != null && this.filterForm.hasContent()) {
criteria = this.filterForm.getValuesAsCriteria();
if (this.initialCriteriaFixed) {
if (criteria != null) {
if (this.initialCriteria != null) {
// There is fixed criteria - add it to the filter form criteria.
CriteriaUtility.addCriteria(criteria, this.initialCriteria);
}
} else {
criteria = this.initialCriteria;
}
}
} else if (this.initialCriteriaFixed) {
criteria = this.initialCriteria;
}
return criteria;
}
public DS getDataSource() {
return dataSource;
}
public void setDataSource(DS dataSource) {
this.dataSource = dataSource;
}
/**
* Creates this Table's list grid (called by onInit()). Subclasses can override this if they require a custom
* subclass of LocatableListGrid.
*
* @param locatorId the locatorId that should be set on the returned LocatableListGrid
*
* @return this Table's list grid (must be an instance of LocatableListGrid)
*/
protected LocatableListGrid createListGrid(String locatorId) {
return new LocatableListGrid(locatorId);
}
/**
* Returns this Table's list grid - may be null if the Table has not yet been {@link #isInitialized() initialized}.
* Subclasses should *not* override this method.
*
* @return this Table's list grid - may be null if the Table has not yet been {@link #isInitialized() initialized}
*/
public ListGrid getListGrid() {
return listGrid;
}
/**
* Wraps ListGrid.setFields(...) but takes care of "id" field display handling. Equivalent to calling:
* <pre>
* setFields( false, fields );
* </pre>
*
* @param fields the fields
*/
public void setListGridFields(ListGridField... fields) {
setListGridFields(false, fields);
}
/**
* Wraps ListGrid.setFields(...) but takes care of "id" field display handling.
*
* @param forceIdField if true, and "id" is a defined field, then display it. If false, it is displayed
* only in debug mode.
* @param fields the fields
*/
public void setListGridFields(boolean forceIdField, ListGridField... fields) {
if (getDataSource() == null) {
throw new IllegalStateException("setListGridFields() called on " + getClass().getName()
+ ", which is not a DataSource-backed Table.");
}
String[] dataSourceFieldNames = getDataSource().getFieldNames();
Set<String> dataSourceFieldNamesSet = new LinkedHashSet<String>();
dataSourceFieldNamesSet.addAll(Arrays.asList(dataSourceFieldNames));
Map<String, ListGridField> listGridFieldsMap = new LinkedHashMap<String, ListGridField>();
for (ListGridField listGridField : fields) {
listGridFieldsMap.put(listGridField.getName(), listGridField);
}
dataSourceFieldNamesSet.removeAll(listGridFieldsMap.keySet());
DataSourceField dataSourceIdField = getDataSource().getField(FIELD_ID);
boolean hideIdField = (!CoreGUI.isDebugMode() && !forceIdField);
if (dataSourceIdField != null && hideIdField) {
// setHidden() will not work on the DataSource field - use the listGrid.hideField() instead.
this.listGrid.hideField(FIELD_ID);
}
ListGridField listGridIdField = listGridFieldsMap.get(FIELD_ID);
if (listGridIdField != null) {
listGridIdField.setHidden(hideIdField);
}
if (!dataSourceFieldNamesSet.isEmpty()) {
ListGridField[] newFields = new ListGridField[fields.length + dataSourceFieldNamesSet.size()];
int destIndex = 0;
if (dataSourceFieldNamesSet.contains(FIELD_ID)) {
String datasourceFieldTitle = getDataSource().getField(FIELD_ID).getTitle();
String listGridFieldTitle = (datasourceFieldTitle != null) ? datasourceFieldTitle : MSG
.common_title_id();
listGridIdField = new ListGridField(FIELD_ID, listGridFieldTitle, 55);
// Override the DataSource id field metadata for consistent display across all Tables.
listGridIdField.setType(ListGridFieldType.INTEGER);
listGridIdField.setCanEdit(false);
listGridIdField.setHidden(hideIdField);
newFields[destIndex++] = listGridIdField;
dataSourceFieldNamesSet.remove(FIELD_ID);
}
System.arraycopy(fields, 0, newFields, destIndex, fields.length);
destIndex += fields.length;
for (String dataSourceFieldName : dataSourceFieldNamesSet) {
DataSourceField dataSourceField = getDataSource().getField(dataSourceFieldName);
ListGridField listGridField = new ListGridField(dataSourceField.getName());
this.listGrid.hideField(dataSourceFieldName);
listGridField.setHidden(true);
newFields[destIndex++] = listGridField;
}
this.listGrid.setFields(newFields);
} else {
this.listGrid.setFields(fields);
}
}
public void setTitleBar(TitleBar titleBar1) {
this.titleBar = titleBar1;
}
/**
* Note: To prevent user action while a current action completes, all widgets on the footer are disabled
* when footer actions take place, typically a button click. It is up to the action to ensure the page
* (via refresh() or CoreGUI.refresh()) or footer (via refreshTableActions) are refreshed as needed at action
* completion. Failure to do so may leave the widgets disabled.
*/
public void addTableAction(String locatorId, String title, TableAction tableAction) {
this.addTableAction(locatorId, title, null, null, tableAction);
}
/**
* Note: To prevent user action while a current action completes, all widgets on the footer are disabled
* when footer actions take place, typically a button click. It is up to the action to ensure the page
* (via refresh() or CoreGUI.refresh()) or footer (via refreshTableActions) are refreshed as needed at action
* completion. Failure to do so may leave the widgets disabled.
*/
public void addTableAction(String locatorId, String title, String confirmation, TableAction tableAction) {
this.addTableAction(locatorId, title, confirmation, null, tableAction);
}
/**
* Note: To prevent user action while a current action completes, all widgets on the footer are disabled
* when footer actions take place, typically a button click. It is up to the action to ensure the page
* (via refresh() or CoreGUI.refresh()) or footer (via refreshTableActions) are refreshed as needed at action
* completion. Failure to do so may leave the widgets disabled.
*/
public void addTableAction(String locatorId, String title, String confirmation,
Map<String, Object> valueMap, TableAction tableAction) {
// If the specified locator ID is qualified, strip off the ancestry prefix, so we can make sure its locator ID
// extends the footer's locator ID as it should.
int underscoreIndex = locatorId.lastIndexOf('_');
String unqualifiedLocatorId;
if (underscoreIndex >= 0 && underscoreIndex != (locatorId.length() - 1)) {
unqualifiedLocatorId = locatorId.substring(underscoreIndex + 1);
} else {
unqualifiedLocatorId = locatorId;
}
TableActionInfo info = new TableActionInfo(this.footer.extendLocatorId(unqualifiedLocatorId), title,
confirmation, valueMap, tableAction);
tableActions.add(info);
}
/**
* Updates the list of table's associated actions <code>tableActions</code>.
* It automatically updates the gui by calling <code>drawFooter()</code> provided the table has been initialized.
*
* Note: To prevent user action while a current action completes, all widgets on the footer are disabled
* when footer actions take place, typically a button click. It is up to the action to ensure the page
* (via refresh() or CoreGUI.refresh()) or footer (via refreshTableActions) are refreshed as needed at action
* completion. Failure to do so may leave the widgets disabled.
*
* @param title the title of a modified action
* @param valueMap the map containing the tuples with name of a select item and <code>actionValue</code> which is
* then passed to <code>tableAction.executeAction()</code>; use the <code>LinkedHashMap</code> if you want to
* preserve the order of map items
* @param tableAction the tableAction object (on this object the <code>executeAction()</code> is actually invoked)
*/
public void updateTableAction(String title, Map<String, Object> valueMap,
TableAction tableAction) {
if (title == null) {
return;
}
for (TableActionInfo info : tableActions) {
if (title.equals(info.getTitle())) {
if (valueMap != null) info.setValueMap(valueMap);
if (tableAction != null) info.setAction(tableAction);
// the action listeners have to be re-added
if (isInitialized()) drawFooter();
break;
}
}
}
public void setListGridDoubleClickHandler(DoubleClickHandler handler) {
doubleClickHandler = handler;
}
/**
* Adds extra widgets to the bottom of the table view.
* <br/><br/>
* Note: To prevent user action while a current action completes, all widgets on the footer are disabled
* when footer actions take place, typically a button click. It is up to the action to ensure the page
* (via refresh() or CoreGUI.refresh()) or footer (via refreshTableActions) are refreshed as needed at action
* completion. Failure to do so may leave the widgets disabled.
*
* @param widget the new widget to add to the table view
* @param aboveFooter if true, the widget will be placed in a second toolstrip just above the main footer.
* if false, the widget will be placed in the main footer toolstrip itself. This is
* useful if the widget is really big and won't fit in the main footer along with the
* rest of the main footer members.
*/
public void addExtraWidget(Canvas widget, boolean aboveFooter) {
if (aboveFooter) {
this.extraWidgetsAboveFooter.add(widget);
} else {
this.extraWidgetsInMainFooter.add(widget);
}
}
public void setHeaderIcon(String headerIcon) {
if (this.headerIcons.size() > 0) {
this.headerIcons.clear();
}
addHeaderIcon(headerIcon);
}
public void addHeaderIcon(String headerIcon) {
this.headerIcons.add(headerIcon);
}
/**
* By default, all table actions have buttons that are enabled or
* disabled based on if and how many rows are selected. There are
* times when you don't want the user to be able to press table action
     * buttons regardless of which rows are selected. This method lets
* you set this override-disable flag.
*
     * Note: this also affects the double-click handler - if this disable override
* is on, the double-click handler is not called.
*
* @param disabled if true, all table action buttons will be disabled
* if false, table action buttons will be enabled based on their predefined
* selection enablement rule.
*/
public void setTableActionDisableOverride(boolean disabled) {
this.tableActionDisableOverride = disabled;
refreshTableInfo();
}
public boolean getTableActionDisableOverride() {
return this.tableActionDisableOverride;
}
public void refreshTableInfo() {
if (this.showFooter && (this.listGrid != null)) {
if (this.tableActionDisableOverride) {
this.listGrid.setSelectionType(SelectionStyle.NONE);
} else {
this.listGrid.setSelectionType(getDefaultSelectionStyle());
}
//int selectionCount = this.listGrid.getSelectedRecords().length;
for (TableActionInfo tableAction : this.tableActions) {
if (tableAction.actionCanvas != null) { // if null, we haven't initialized our buttons yet, so skip this
boolean enabled = (!this.tableActionDisableOverride && tableAction.action.isEnabled(this.listGrid
.getSelectedRecords()));
tableAction.actionCanvas.setDisabled(!enabled);
}
}
for (Canvas extraWidget : this.extraWidgetsAboveFooter) {
extraWidget.enable();
if (extraWidget instanceof TableWidget) {
((TableWidget) extraWidget).refresh(this.listGrid);
}
}
for (Canvas extraWidget : this.extraWidgetsInMainFooter) {
extraWidget.enable();
if (extraWidget instanceof TableWidget) {
((TableWidget) extraWidget).refresh(this.listGrid);
}
}
refreshRowCount();
if (isShowFooterRefresh() && this.refreshButton != null) {
this.refreshButton.enable();
}
}
}
protected void deleteSelectedRecords() {
deleteSelectedRecords(null);
}
protected void deleteSelectedRecords(DSRequest requestProperties) {
ListGrid listGrid = getListGrid();
final int selectedRecordCount = listGrid.getSelectedRecords().length;
final List<String> deletedRecordNames = new ArrayList<String>(selectedRecordCount);
listGrid.removeSelectedData(new DSCallback() {
public void execute(DSResponse response, Object rawData, DSRequest request) {
if (response.getStatus() == DSResponse.STATUS_SUCCESS) {
Record[] deletedRecords = response.getData();
for (Record deletedRecord : deletedRecords) {
String name = deletedRecord.getAttribute(getTitleFieldName());
deletedRecordNames.add(name);
}
if (deletedRecordNames.size() == selectedRecordCount) {
// all selected schedules were successfully deleted.
String deletedMessage = getDeletedMessage(deletedRecordNames.size());
String deletedMessageDetail = deletedMessage + ": [" + deletedRecordNames.toString() + "]";
Message message = new Message(deletedMessage, deletedMessageDetail);
CoreGUI.getMessageCenter().notify(message);
refresh();
}
}
// TODO: Print error messages for failures or partial failures.
}
}, requestProperties);
}
protected String getTitleFieldName() {
return FIELD_NAME;
}
protected String getDeletedMessage(int numDeleted) {
String num = String.valueOf(numDeleted);
String thing = (1 == numDeleted) ? MSG.common_label_item() : MSG.common_label_items();
return MSG.common_msg_deleted(num, thing);
}
protected String getDeleteConfirmMessage() {
return MSG.common_msg_deleteConfirm(MSG.common_label_items());
}
protected void hideField(ListGridField field) {
getListGrid().hideField(field.getName());
field.setHidden(true);
}
// -------------- Inner utility classes ------------- //
/**
* A subclass of SmartGWT's DynamicForm widget that provides a more convenient interface for filtering a
* {@link Table} of results.
*
* @author <NAME>
*/
private static class TableFilter extends LocatableDynamicForm implements KeyPressHandler, ChangedHandler {
private Table<?> table;
private EnhancedSearchBarItem searchBarItem;
private HiddenItem hiddenItem;
public TableFilter(Table<?> table) {
super(table.extendLocatorId("TableFilter"));
setWidth100();
setPadding(5);
this.table = table;
}
@Override
public void setItems(FormItem... items) {
for (FormItem nextFormItem : items) {
nextFormItem.setWrapTitle(false);
nextFormItem.setWidth(300); // wider than default
if (nextFormItem instanceof TextItem) {
nextFormItem.addKeyPressHandler(this);
} else if (nextFormItem instanceof SelectItem) {
nextFormItem.addChangedHandler(this);
} else if (nextFormItem instanceof DateFilterItem) {
nextFormItem.addChangedHandler(this);
} else if (nextFormItem instanceof EnhancedSearchBarItem) {
searchBarItem = (EnhancedSearchBarItem) nextFormItem;
searchBarItem.getSearchBar().getSearchComboboxItem().addKeyPressHandler(this);
String name = searchBarItem.getName();
// postfix the name of the item so it is not processed by the filters and that the
// hidden item is used instead.
searchBarItem.setName(name + "_hidden");
hiddenItem = new HiddenItem(name);
hiddenItem.setValue(searchBarItem.getSearchBar().getSearchComboboxItem().getValueAsString());
}
}
if (hiddenItem != null) {
Log.debug("Found hidden items");
// Add the hidden item if it exists
FormItem[] tmpItems = new FormItem[items.length + 1];
System.arraycopy(items, 0, tmpItems, 0, items.length);
tmpItems[items.length] = hiddenItem;
items = tmpItems;
}
for (FormItem item : items) {
Log.debug(" ******** Form Items sent: " + item.getName() + ": " + item.getValue());
}
super.setItems(items);
}
private void fetchFilteredTableData() {
table.refresh();
}
@Override
public void onKeyPress(KeyPressEvent event) {
if (event.getKeyName().equals("Enter")) {
Log.debug("Table.TableFilter Pressed Enter key");
if (null != searchBarItem) {
ComboBoxItem comboBoxItem = searchBarItem.getSearchBar().getSearchComboboxItem();
String searchBarValue = comboBoxItem.getValueAsString();
String hiddenValue = (String) hiddenItem.getValue();
Log.debug("Table.TableFilter searchBarValue :" + searchBarValue + ", hiddenValue" + hiddenValue);
// Only send a fetch request if the user actually changed the search expression.
if (!equals(searchBarValue, hiddenValue)) {
hiddenItem.setValue(searchBarValue);
Log.debug("Table.TableFilter fetchFilteredTableData");
fetchFilteredTableData();
}
} else {
fetchFilteredTableData();
}
}
}
@Override
public void onChanged(ChangedEvent event) {
fetchFilteredTableData();
}
public boolean hasContent() {
return super.getFields().length != 0;
}
private static boolean equals(String string1, String string2) {
if (string1 == null) {
return (string2 == null);
} else {
return (string1.equals(string2));
}
}
}
public static class TableActionInfo {
private String locatorId;
private String title;
private String confirmMessage;
private Map<String, Object> valueMap;
private TableAction action;
private Canvas actionCanvas;
protected TableActionInfo(String locatorId, String title, String confirmMessage,
Map<String, Object> valueMap, TableAction action) {
this.locatorId = locatorId;
this.title = title;
this.confirmMessage = confirmMessage;
this.valueMap = valueMap;
this.action = action;
}
public String getLocatorId() {
return locatorId;
}
public String getTitle() {
return title;
}
public String getConfirmMessage() {
return confirmMessage;
}
public Map<String, Object> getValueMap() {
return valueMap;
}
public void setValueMap(Map<String, Object> valueMap) {
this.valueMap = valueMap;
}
public Canvas getActionCanvas() {
return actionCanvas;
}
public void setActionCanvas(Canvas actionCanvas) {
this.actionCanvas = actionCanvas;
}
public TableAction getAction() {
return action;
}
public void setAction(TableAction action) {
this.action = action;
}
}
public boolean isShowFooterRefresh() {
return showFooterRefresh;
}
public void setShowFooterRefresh(boolean showFooterRefresh) {
this.showFooterRefresh = showFooterRefresh;
}
public Label getTableInfo() {
return tableInfo;
}
public void setTableInfo(Label tableInfo) {
this.tableInfo = tableInfo;
}
public boolean isShowFilterForm() {
return showFilterForm;
}
public void setShowFilterForm(boolean showFilterForm) {
this.showFilterForm = showFilterForm;
}
/*
* By default, no search bar is shown above this table. if this table represents a subsystem that is capable
* of search, return the specific object here.
*/
protected SearchSubsystem getSearchSubsystem() {
return null;
}
}
n = int(input())
if n > 1:
    d = dict()
    for i in range(n):
        s = input().split()
        d[int(s[0])] = int(s[1])
    l = sorted(d.keys())
    b = False
    for i in range(1, n):
        if d[l[i]] < d[l[i - 1]]:
            b = True
            break
    if b:
        print("Happy Alex")
    else:
        print("Poor Alex")
else:
    input()
    print("Poor Alex")
import {
GraphQLField,
GraphQLObjectType,
GraphQLSchema,
isListType,
isNonNullType
} from 'graphql';
import camelCase from 'lodash/camelCase';
import upperFirst from 'lodash/upperFirst';
export type ResolverStoreItem = {
type: GraphQLObjectType; //eg: [city]
schemaKey: string; // eg: cities
entityName: string; // eg: City
argsTSName: string; // eg: QueryMyCitiesArgs
returnTSName: string; // eg: `Query['cities']`
isList: boolean;
isMutation: boolean;
isQuery: boolean;
isSubscription: boolean;
field: GraphQLField<any, any>;
isNonNull: boolean;
kind: string // mutation, subscription, query
};
export type ResolversStore = Map<string, ResolverStoreItem>;
export function getResolversHelper(schema: GraphQLSchema) {
let _resolversStore: ResolversStore = new Map();
const finalRootTypes = {
mutation: schema.getMutationType(),
query: schema.getQueryType(),
subscription: schema.getSubscriptionType()
};
const queryFields = finalRootTypes.query
? finalRootTypes.query.getFields()
: {};
const mutationFields = finalRootTypes.mutation
? finalRootTypes.mutation.getFields()
: {};
const subscriptionFields = finalRootTypes.subscription
? finalRootTypes.subscription.getFields()
: {};
let fields = {
...queryFields,
...mutationFields,
...subscriptionFields
};
const fieldKeys = Object.keys(fields);
fieldKeys.forEach(schemaKey => {
const field = fields[schemaKey];
const type = field.type as GraphQLObjectType;
const isList = isListType(field.type);
const isNonNull = isNonNullType(field.type);
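    // Strip a NonNull wrapper (if any) so the underlying type can be inspected below;
    // entityName then resolves to the element type name for both `Type!` and `[Type]!`.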
let finalType = isNonNullType(field.type) ? field.type.ofType : field.type;
const entityName = isListType(finalType)
? finalType.ofType.toString()
: finalType.toString();
const isMutation = !!mutationFields[schemaKey];
const isQuery = !!queryFields[schemaKey];
const isSubscription = !!subscriptionFields[schemaKey];
let argsPrefix = '';
if (isMutation) {
argsPrefix = 'Mutation';
}
if (isQuery) {
argsPrefix = 'Query';
}
if (isSubscription) {
argsPrefix = 'Subscription';
}
const kind = argsPrefix.toLowerCase();
let argsTSName = field.args.length
? `${argsPrefix}${upperFirst(camelCase(schemaKey))}Args`
: '{}';
let returnTSName = `${argsPrefix}['${schemaKey}']`;
// returnTSName = isNonNull ? returnTSName : `Maybe<${returnTSName}>`; // TODO
const item = {
type,
schemaKey,
entityName,
isList,
argsTSName,
returnTSName,
isMutation,
isQuery,
isSubscription,
field,
isNonNull,
kind
};
_resolversStore.set(schemaKey, item);
});
return _resolversStore;
}
#pragma once
#include <ostream>
#include <iomanip>
#include "com_include.h"
#if !defined(_MSC_VER) && !defined(DXVK_NATIVE)
# ifdef __WINE__
# define DXVK_DEFINE_GUID(iface) \
template<> inline GUID const& __wine_uuidof<iface> () { return iface::guid; }
# else
# define DXVK_DEFINE_GUID(iface) \
template<> inline GUID const& __mingw_uuidof<iface> () { return iface::guid; }
# endif
#endif
std::ostream& operator << (std::ostream& os, REFIID guid);
# Link: https://oj.leetcode.com/problems/longest-substring-without-repeating-characters/
class Solution:
# @return an integer
def lengthOfLongestSubstring(self, s):
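        # Sliding-window approach: d maps each character in the current window to the
        # index where it was last seen. When a repeat appears, evict every character up
        # to (and including) its previous occurrence and move the window start past it.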
d = {}
start = 0
maxlen = 0
for i, c in enumerate(s):
            if c not in d:
if (i - start + 1) > maxlen:
maxlen = i - start + 1
d[c] = i
else:
for _ in range(start, d[c]):
d.pop(s[_])
start = d[c] + 1
d[c] = i
return maxlen
package connectors
import (
osBrick "github.com/ydcool/os-brick-go"
"github.com/ydcool/os-brick-go/initiator"
"log"
"path/filepath"
"time"
)
// discoverMPathDevice discovers a multipath device.
//
// It discovers a multipath device based on a defined connection_property
// and a device_wwn, and returns the path and multipath_id of the
// multipath-enabled device if there is one.
func discoverMPathDevice(deviceWwn string, connProperties map[string]interface{}, deviceName string) (string, string, error) {
path, err := initiator.FindMultipathDevicePath(deviceWwn)
if err != nil {
return "", "", err
}
var (
devicePath, multipathID string
)
if path == "" {
//find_multipath_device only accept realpath not symbolic path
deviceRealPath, err := filepath.EvalSymlinks(deviceName)
if err != nil {
return "", "", err
}
mPathInfo, err := initiator.FindMultipathDevice(deviceRealPath)
if mPathInfo != nil && err == nil {
devicePath = mPathInfo["device"].(string)
multipathID = deviceWwn
} else {
//we didn't find a multipath device.
//so we assume the kernel only sees 1 device
devicePath = deviceName
}
} else {
devicePath = path
multipathID = deviceWwn
}
if am, ok := connProperties["access_mode"]; ok && am != "ro" {
//Sometimes the multipath devices will show up as read only
//initially and need additional time/rescans to get to RW.
success := osBrick.RunWithRetry(5, time.Second, func(_ int) bool {
err := initiator.WaitForRW(deviceWwn, devicePath)
return err == nil
})
if !success {
log.Printf("block device %s is still read-only. Continuing anyway.", devicePath)
}
}
return devicePath, multipathID, nil
}
/**
* Format the course entry as a CSV file. The seasons part has a variable length
*
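     * Example (illustrative values only): an entry with id 42, name "Algorithms" and
     * seasons [FALL, SPRING] would produce the line "42,Algorithms,FALL,SPRING".
     *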
* @param entry The course entry to write as a CSV string
* @return The generated string
*/
private String entryToLine(CourseEntry entry) {
String line = MessageFormat.format("{0},{1}", entry.id, entry.name);
for(Season s : entry.seasons)
line += MessageFormat.format(",{0}", s);
return line;
    }
// Lab 2-1.
// This is the same as the first simple example in the course book,
// but with a few error checks.
// Remember to copy your file to a new on appropriate places during the lab so you keep old results.
// Note that the files "lab1-1.frag", "lab1-1.vert" are required.
// Should work as is on Linux and Mac. MS Windows needs GLEW or glee.
// See separate Visual Studio version of my demos.
#ifdef __APPLE__
#include <OpenGL/gl3.h>
// Linking hint for Lightweight IDE
// uses framework Cocoa
#endif
#include "MicroGlut.h"
#include "GL_utilities.h"
#include <math.h>
#include "loadobj.h"
#include "VectorUtils3.h"
//Defines
#define near 1.0
#define far 40.0
#define right 0.5
#define left -0.5
#define top 0.5
#define bottom -0.5
// Globals
// Data would normally be read from files
GLfloat myRotMatrix[] =
{
1.0f, 0.0f, 0.0f,0.0f,
0.0f, 1.0f, 0.0f,0.0f,
0.0f, 0.0f, 1.0f,0.0f,
0.0f, 0.0f, 0.0f, 1.0f
};
GLfloat myRotMatrix2[] =
{
1.0f, 0.0f, 0.0f,0.0f,
0.0f, 1.0f, 0.0f,0.0f,
0.0f, 0.0f, 1.0f,0.0f,
0.0f, 0.0f, 0.0f, 1.0f
};
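// Standard perspective frustum matrix built from the near/far/left/right/top/bottom clip
// planes; written row-major here and uploaded with transpose = GL_TRUE in init().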
GLfloat projectionMatrix[] =
{
2.0f*near/(right-left), 0.0f, (right+left)/(right-left), 0.0f,
0.0f, 2.0f*near/(top-bottom), (top+bottom)/(top-bottom), 0.0f,
0.0f, 0.0f, -(far + near)/(far - near), -2*far*near/(far - near),
0.0f, 0.0f, -1.0f, 0.0f
};
mat4 rot, rot2, trans, total, worldToViewMatrix;
Model *blade1;
Model *blade2;
Model *blade3;
Model *blade4;
Model *walls;
Model *roof;
Model *balcony;
void LoadTGATextureSimple(char *filename, GLuint *tex);
// Reference to program
GLuint program;
GLuint myTex;
GLuint myTex2;
// vertex array object
unsigned int vertexArrayObjID;
unsigned int vertexArrayObjID2;
void init(void)
{
dumpInfo();
glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
glEnable(GL_DEPTH_TEST);
// GL inits
glClearColor(0.9,0.8,0.5,0);
printError("GL inits");
// Load and compile shader
program = loadShaders("lab3-1.vert","lab3-1.frag");
glUseProgram(program);
printError("init shader");
// Upload geometry to the GPU:
blade1 = LoadModelPlus("windmill/blade.obj");
blade2 = LoadModelPlus("windmill/blade.obj");
blade3 = LoadModelPlus("windmill/blade.obj");
blade4 = LoadModelPlus("windmill/blade.obj");
walls = LoadModelPlus("windmill/windmill-walls.obj");
balcony = LoadModelPlus("windmill/windmill-balcony.obj");
roof = LoadModelPlus("windmill/windmill-roof.obj");
//glActiveTexture(GL_TEXTURE0);
//Frustum matrix
glUniformMatrix4fv(glGetUniformLocation(program, "projMatrix"), 1, GL_TRUE, projectionMatrix);
printError("init arrays");
// End of upload of geometry
}
void SetRotationMatrix(GLfloat t, GLfloat *m)
{
m[0] = cos(t); m[1] = 0; m[2] = -sin(t); m[3] = 0.0;
m[4] = 0; m[5] = 1; m[6] = 0; m[7] = 0.0;
m[8] = sin(t); m[9] = 0; m[10] = cos(t); m[11] = 0.0;
m[12] = 0.0; m[13] = 0.0; m[14] = 0.0; m[15] = 1.0;
}
void SetRotationMatrix2(GLfloat t, GLfloat *m)
{
m[0] = 1; m[1] = 0; m[2] =0; m[3] = 0.0;
m[4] = 0; m[5] = cos(t); m[6] = -sin(t); m[7] = 0.0;
m[8] = 0; m[9] = sin(t); m[10] = cos(t); m[11] = 0.0;
m[12] = 0.0; m[13] = 0.0; m[14] = 0.0; m[15] = 1.0;
}
void SetTranslationMatrix(GLfloat t, GLfloat *m)
{
m[0] = 1.0; m[1] = 0.0; m[2] = 0; m[3] = sin(t);
m[4] = 0.0; m[5] = 1.0; m[6] = 0; m[7] = cos(2*t);
m[8] = 0; m[9] = 0; m[10] = 1.0; m[11] = 0;
m[12] = 0.0; m[13] = 0.0; m[14] = 0.0; m[15] = 1.0;
}
void OnTimer(int value)
{
glutPostRedisplay();
glutTimerFunc(20, &OnTimer, value);
}
void display(void)
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
printError("pre display");
GLfloat t = (GLfloat)glutGet(GLUT_ELAPSED_TIME);
worldToViewMatrix = lookAt(25*sin(t/1000.0), 0, 25*cos(t/1000.0), 0,0,0, 0,1,0);
glUniformMatrix4fv(glGetUniformLocation(program, "camMatrix"), 1, GL_TRUE, worldToViewMatrix.m);
//For Model 1
trans = T(5, 4.2, 0);
rot = Ry(3.14);
rot2 = Rx(0+t/100.0);
total = Mult(trans,Mult(rot,rot2));
glUniform1f(glGetUniformLocation(program, "t"), t);
glUniform1i(glGetUniformLocation(program, "texUnit"), 0);
SetRotationMatrix2(t/1000.0, myRotMatrix2);
glUniformMatrix4fv(glGetUniformLocation(program, "mdlMatrix"), 1, GL_TRUE, total.m);
DrawModel(blade1, program, "in_Position", "inNormal", "inTexCoord");
rot = Ry(3.14);
rot2 = Rx(3.14*0.5+t/100.0);
total = Mult(Mult(trans,rot), rot2);
SetRotationMatrix2(t/1000.0, myRotMatrix2);
glUniformMatrix4fv(glGetUniformLocation(program, "mdlMatrix"), 1, GL_TRUE, total.m);
DrawModel(blade2, program, "in_Position", "inNormal", "inTexCoord");
rot = Ry(3.14);
rot2 = Rx(3.14+t/100.0);
total = Mult(trans,Mult(rot, rot2));
SetRotationMatrix2(t/1000.0, myRotMatrix2);
glUniformMatrix4fv(glGetUniformLocation(program, "mdlMatrix"), 1, GL_TRUE, total.m);
DrawModel(blade3, program, "in_Position", "inNormal", "inTexCoord");
rot = Ry(3.14);
rot2 = Rx(3.14*1.5+t/100.0);
total = Mult(Mult(trans,rot), rot2);
SetRotationMatrix2(t/1000.0, myRotMatrix2);
glUniformMatrix4fv(glGetUniformLocation(program, "mdlMatrix"), 1, GL_TRUE, total.m);
DrawModel(blade4, program, "in_Position", "inNormal", "inTexCoord");
//For Model 2
trans = T(0, -5, 0);
rot = Ry(0);
total = Mult(trans,rot);
SetRotationMatrix2(t/1000.0, myRotMatrix2);
glUniformMatrix4fv(glGetUniformLocation(program, "mdlMatrix"), 1, GL_TRUE, total.m);
DrawModel(walls, program, "in_Position", "inNormal", "inTexCoord");
// For model 3
trans = T(0, -5, 0);
rot = Ry(0);
total = Mult(trans,rot);
SetRotationMatrix2(t/1000.0, myRotMatrix2);
glUniformMatrix4fv(glGetUniformLocation(program, "mdlMatrix"), 1, GL_TRUE, total.m);
DrawModel(roof, program, "in_Position", "inNormal", "inTexCoord");
// For model 4
trans = T(0, -5, 0);
rot = Ry(0);
total = Mult(trans,rot);
SetRotationMatrix2(t/1000.0, myRotMatrix2);
glUniformMatrix4fv(glGetUniformLocation(program, "mdlMatrix"), 1, GL_TRUE, total.m);
DrawModel(balcony, program, "in_Position", "inNormal", "inTexCoord");
printError("display");
glutSwapBuffers();
}
int main(int argc, char *argv[])
{
glutInit(&argc, argv);
glutInitContextVersion(3, 2);
glutCreateWindow("GL3 triangle example");
glutDisplayFunc(display);
glutTimerFunc(20, &OnTimer, 0);
init();
glutMainLoop();
return 0;
}
|
/**
* List training data is successful.
*
* @throws InterruptedException the interrupted exception
*/
@Test
public void listTrainingDataIsSuccessful() throws InterruptedException {
server.enqueue(jsonResponse(listTrainingDataResp));
ListTrainingDataOptions getRequest =
new ListTrainingDataOptions.Builder(environmentId, collectionId).build();
TrainingDataSet response = discoveryService.listTrainingData(getRequest).execute().getResult();
RecordedRequest request = server.takeRequest();
assertEquals(TRAINING1_PATH, request.getPath());
assertEquals(GET, request.getMethod());
assertEquals(listTrainingDataResp, response);
} |
package controllers
import (
"gAPIManagement/api/config"
"fmt"
"gopkg.in/mgo.v2"
"gAPIManagement/api/servicediscovery"
"gAPIManagement/api/database"
"encoding/json"
"gopkg.in/mgo.v2/bson"
"gAPIManagement/api/http"
"github.com/qiangxue/fasthttp-routing"
"strconv"
)
func ListServiceGroupsHandler(c *routing.Context) error {
sg, err := ServiceDiscovery().GetListOfServicesGroup()
if err != nil {
http.Response(c, `{"error" : true, "msg": ` + strconv.Quote(err.Error()) + `}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
json, _ := json.Marshal(sg)
http.Response(c, string(json), 200, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
func RegisterServiceGroupHandler(c *routing.Context) error {
servicegroup, err := servicediscovery.ValidateServiceGroupBody(c)
if err != nil {
http.Response(c, err.Error(), 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
session, db := database.GetSessionAndDB(database.MONGO_DB)
servicegroup.Id = bson.NewObjectId()
collection := db.C(servicediscovery.SERVICE_GROUP_COLLECTION)
index := mgo.Index{
Key: []string{"name"},
Unique: true,
DropDups: true,
Background: true,
Sparse: true,
}
	if err = collection.EnsureIndex(index); err == nil {
		err = collection.Insert(&servicegroup)
	}
database.MongoDBPool.Close(session)
if err != nil {
http.Response(c, `{"error" : true, "msg": ` + strconv.Quote(err.Error()) + `}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
http.Response(c, `{"error" : false, "msg": "Service created successfuly."}`, 201, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
func AddServiceToGroupHandler(c *routing.Context) error {
serviceGroupId := c.Param("group_id")
var bodyMap map[string]string
err := json.Unmarshal(c.Request.Body(), &bodyMap)
if err != nil {
http.Response(c, err.Error(), 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
if _, ok := bodyMap["service_id"]; !ok {
http.Response(c, `{"error": "Invalid body. Missing service_id."}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
if serviceGroupId == "null" || bodyMap["service_id"] == "null" || bodyMap["service_id"] == "" {
http.Response(c, `{"error": "Invalid body."}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
serviceGroupIdHex := bson.ObjectIdHex(serviceGroupId)
serviceId := bson.ObjectIdHex(bodyMap["service_id"])
removeFromAllGroups := bson.M{"$pull": bson.M{"services": serviceId }}
updateGroup := bson.M{"$addToSet": bson.M{"services": serviceId }}
session, db := database.GetSessionAndDB(database.MONGO_DB)
_,err = db.C(servicediscovery.SERVICE_GROUP_COLLECTION).UpdateAll(bson.M{}, removeFromAllGroups)
err = db.C(servicediscovery.SERVICE_GROUP_COLLECTION).UpdateId(serviceGroupIdHex, updateGroup)
var sg servicediscovery.ServiceGroup
db.C(servicediscovery.SERVICE_GROUP_COLLECTION).FindId(serviceGroupIdHex).One(&sg)
fmt.Println(sg.IsReachable)
updateService := bson.M{"$set": bson.M{"groupid": sg.Id,"groupvisibility": sg.IsReachable}}
err = db.C(servicediscovery.SERVICES_COLLECTION).UpdateId(serviceId, updateService)
if err != nil {
database.MongoDBPool.Close(session)
http.Response(c, `{"error" : true, "msg": ` + strconv.Quote(err.Error()) + `}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
	database.MongoDBPool.Close(session)
http.Response(c, `{"error" : false, "msg": "Service added to group successfuly."}`, 201, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
func DeassociateServiceFromGroup(c *routing.Context) error {
serviceGroupId := c.Param("group_id")
serviceId := c.Param("service_id")
if serviceGroupId == "null" || serviceId == "null" || serviceId == "" {
http.Response(c, `{"error": "Invalid body."}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
serviceGroupIdHex := bson.ObjectIdHex(serviceGroupId)
serviceIdHex := bson.ObjectIdHex(serviceId)
updateGroup := bson.M{"$pull": bson.M{"services": serviceIdHex }}
updateService := bson.M{"$set": bson.M{"groupid": nil, "usegroupattributes": false}}
session, db := database.GetSessionAndDB(database.MONGO_DB)
err := db.C(servicediscovery.SERVICES_COLLECTION).UpdateId(serviceIdHex, updateService)
if err != nil {
database.MongoDBPool.Close(session)
http.Response(c, `{"error" : true, "msg": ` + strconv.Quote(err.Error()) + `}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
err = db.C(servicediscovery.SERVICE_GROUP_COLLECTION).UpdateId(serviceGroupIdHex, updateGroup)
database.MongoDBPool.Close(session)
if err != nil {
http.Response(c, `{"error" : true, "msg": "` + strconv.Quote(err.Error()) + `}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
http.Response(c, `{"error" : false, "msg": "Service deassociated from group successfuly."}`, 201, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
func UpdateServiceGroup(c *routing.Context) error {
serviceGroupId := bson.ObjectIdHex(c.Param("group_id"))
var sGroup servicediscovery.ServiceGroup
sgNew := c.Request.Body()
json.Unmarshal(sgNew, &sGroup)
session, db := database.GetSessionAndDB(database.MONGO_DB)
err := db.C(servicediscovery.SERVICE_GROUP_COLLECTION).UpdateId(serviceGroupId, sGroup)
if err != nil {
database.MongoDBPool.Close(session)
http.Response(c, `{"error" : true, "msg": ` + strconv.Quote(err.Error()) + `}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
database.MongoDBPool.Close(session)
http.Response(c, `{"error" : false, "msg": "Service group update successfuly."}`, 200, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
func RemoveServiceGroup(c *routing.Context) error {
serviceGroupId := bson.ObjectIdHex(c.Param("group_id"))
session, db := database.GetSessionAndDB(database.MONGO_DB)
	err := db.C(servicediscovery.SERVICE_GROUP_COLLECTION).RemoveId(serviceGroupId)
	if err == nil {
		// Only detach services once the group itself has been removed.
		_, err = db.C(servicediscovery.SERVICES_COLLECTION).UpdateAll(bson.M{}, bson.M{"$set": bson.M{"groupid": nil}})
	}
database.MongoDBPool.Close(session)
if err != nil {
http.Response(c, `{"error" : true, "msg": ` + strconv.Quote(err.Error()) + `}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
http.Response(c, `{"error" : false, "msg": "Service group removed successfuly."}`, 200, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
func GetServiceGroupHandler(c *routing.Context) error {
serviceGroup := string(c.Param("group"))
session, db := database.GetSessionAndDB(database.MONGO_DB)
var sg servicediscovery.ServiceGroup
var err error
if bson.IsObjectIdHex(serviceGroup) {
err = db.C(servicediscovery.SERVICE_GROUP_COLLECTION).FindId(bson.ObjectIdHex(serviceGroup)).One(&sg)
} else {
err = db.C(servicediscovery.SERVICE_GROUP_COLLECTION).Find(bson.M{"name":serviceGroup}).One(&sg)
}
database.MongoDBPool.Close(session)
if err != nil {
http.Response(c, `{"error" : true, "msg": ` + strconv.Quote(err.Error()) + `}`, 400, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
}
sgByte, _ := json.Marshal(sg)
http.Response(c, string(sgByte), 200, ServiceDiscoveryServiceName(), config.APPLICATION_JSON)
return nil
} |
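The handlers above all share the func(*routing.Context) error signature used by github.com/qiangxue/fasthttp-routing, so they can be mounted directly on a router. The following is only a minimal wiring sketch: the route paths, the port, and the fasthttp import are illustrative assumptions, not part of gAPIManagement itself.
package main
import (
	"gAPIManagement/api/controllers"
	"github.com/qiangxue/fasthttp-routing"
	"github.com/valyala/fasthttp"
)
func main() {
	router := routing.New()
	// Hypothetical paths; the real gAPIManagement routing table may differ.
	router.Get("/service-groups", controllers.ListServiceGroupsHandler)
	router.Post("/service-groups", controllers.RegisterServiceGroupHandler)
	router.Post("/service-groups/<group_id>/services", controllers.AddServiceToGroupHandler)
	router.Delete("/service-groups/<group_id>/services/<service_id>", controllers.DeassociateServiceFromGroup)
	router.Put("/service-groups/<group_id>", controllers.UpdateServiceGroup)
	router.Delete("/service-groups/<group_id>", controllers.RemoveServiceGroup)
	router.Get("/service-groups/<group>", controllers.GetServiceGroupHandler)
	fasthttp.ListenAndServe(":8080", router.HandleRequest)
}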
// Performs a CheckoutIndex on the files read from opts.Stdin
func CheckoutIndexFromReader(c *Client, opts CheckoutIndexOptions) error {
idx, err := c.GitDir.ReadIndex()
if err != nil {
return err
}
return CheckoutIndexFromReaderUncommited(c, idx, opts)
} |
#include<bits/stdc++.h>
#define ll long long
using namespace std;
int n, d, dis;
ll s[505000], laz[505000], k, x;
bool chk(ll now){
	// chk(now): can every position reach coverage >= now using at most k extra additions?
	// Each addition covers a window of dis = 2*d+1 positions; laz[] is a difference
	// array that carries the effect of earlier additions forward.
	memset(laz,0,sizeof laz);
	ll las=k, exa=0, need;
	for (int i=1;i<=n;++i){
		exa+=laz[i];
		if (s[i]+exa>=now) continue;
		need=now-s[i]-exa;          // shortfall at position i
		if (need>las) return 0;     // not enough additions left
		las-=need; exa+=need;       // add just enough, starting at position i
		laz[min(n+1,i+dis)]-=need;  // their effect expires after the window
	}
	return 1;
}
int main(){
	cin>>n>>d>>k; dis=d*2+1;
	for (int i=1;i<=n;++i){
		scanf("%lld",&x);
		// Difference array: the value x at position i covers [i-d, i+d].
		s[max(1,i-d)]+=x;
		s[min(n+1,i+d+1)]-=x;
	}
	for (int i=1;i<=n;++i) s[i+1]+=s[i]; // prefix-sum to get the initial coverage
	// Binary search the largest feasible minimum coverage.
	ll l=0, r=1.1e18, mid, ans=0;
	for (;l<=r;){
		mid=(l+r)>>1;
		if (chk(mid)) ans=mid, l=mid+1;
		else r=mid-1;
	}
	cout<<ans<<endl;
} |
<filename>builder_join_test.go
// Copyright 2018 The Xorm Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.
package builder
import (
"testing"
"github.com/stretchr/testify/assert"
)
func TestJoin(t *testing.T) {
sql, args, err := Select("c, d").From("table1").LeftJoin("table2", Eq{"table1.id": 1}.And(Lt{"table2.id": 3})).
RightJoin("table3", "table2.id = table3.tid").Where(Eq{"a": 1}).ToSQL()
assert.NoError(t, err)
assert.EqualValues(t, "SELECT c, d FROM table1 LEFT JOIN table2 ON table1.id=? AND table2.id<? RIGHT JOIN table3 ON table2.id = table3.tid WHERE a=?",
sql)
assert.EqualValues(t, []interface{}{1, 3, 1}, args)
sql, args, err = Select("c, d").From("table1").LeftJoin("table2", Eq{"table1.id": 1}.And(Lt{"table2.id": 3})).
FullJoin("table3", "table2.id = table3.tid").Where(Eq{"a": 1}).ToSQL()
assert.NoError(t, err)
assert.EqualValues(t, "SELECT c, d FROM table1 LEFT JOIN table2 ON table1.id=? AND table2.id<? FULL JOIN table3 ON table2.id = table3.tid WHERE a=?",
sql)
assert.EqualValues(t, []interface{}{1, 3, 1}, args)
sql, args, err = Select("c, d").From("table1").LeftJoin("table2", Eq{"table1.id": 1}.And(Lt{"table2.id": 3})).
CrossJoin("table3", "table2.id = table3.tid").Where(Eq{"a": 1}).ToSQL()
assert.NoError(t, err)
assert.EqualValues(t, "SELECT c, d FROM table1 LEFT JOIN table2 ON table1.id=? AND table2.id<? CROSS JOIN table3 ON table2.id = table3.tid WHERE a=?",
sql)
assert.EqualValues(t, []interface{}{1, 3, 1}, args)
sql, args, err = Select("c, d").From("table1").LeftJoin("table2", Eq{"table1.id": 1}.And(Lt{"table2.id": 3})).
InnerJoin("table3", "table2.id = table3.tid").Where(Eq{"a": 1}).ToSQL()
assert.NoError(t, err)
assert.EqualValues(t, "SELECT c, d FROM table1 LEFT JOIN table2 ON table1.id=? AND table2.id<? INNER JOIN table3 ON table2.id = table3.tid WHERE a=?",
sql)
assert.EqualValues(t, []interface{}{1, 3, 1}, args)
subQuery2 := Select("e").From("table2").Where(Gt{"e": 1})
subQuery3 := Select("f").From("table3").Where(Gt{"f": "2"})
sql, args, err = Select("c, d").From("table1").LeftJoin(subQuery2, Eq{"table1.id": 1}.And(Lt{"table2.id": 3})).
InnerJoin(subQuery3, "table2.id = table3.tid").Where(Eq{"a": 1}).ToSQL()
assert.NoError(t, err)
assert.EqualValues(t, "SELECT c, d FROM table1 LEFT JOIN (SELECT e FROM table2 WHERE e>?) ON table1.id=? AND table2.id<? INNER JOIN (SELECT f FROM table3 WHERE f>?) ON table2.id = table3.tid WHERE a=?",
sql)
assert.EqualValues(t, []interface{}{1, 1, 3, "2", 1}, args)
}
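Outside of a test, the same builder calls can be used directly. The sketch below only reuses the Select/From/LeftJoin/Where/ToSQL calls exercised above; the import path is an assumption (the builder package has lived at more than one path), and the output in the comments is inferred from the assertions in the test.
package main
import (
	"fmt"
	"xorm.io/builder" // assumed import path for this builder package
)
func main() {
	sql, args, err := builder.Select("c, d").
		From("table1").
		LeftJoin("table2", builder.Eq{"table1.id": 1}.And(builder.Lt{"table2.id": 3})).
		Where(builder.Eq{"a": 1}).
		ToSQL()
	if err != nil {
		panic(err)
	}
	// Expected (based on the assertions above):
	// SELECT c, d FROM table1 LEFT JOIN table2 ON table1.id=? AND table2.id<? WHERE a=?
	fmt.Println(sql)
	fmt.Println(args) // [1 3 1]
}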
|
<reponame>ponomarenkotema/emxlive
import { registerAs } from "@nestjs/config";
const alphavantage = registerAs('alphavantage', () => ({
token: '<KEY>',
url: 'https://www.alphavantage.co/'
}));
const finnhub = registerAs('finnhub', () => ({
token: 'bqillg7<PASSWORD>8<PASSWORD>q<PASSWORD>',
url: 'https://finnhub.io/api/v1/stock/'
}));
const tiingo = registerAs('tiingo', () => ({
token: '<KEY>',
url: 'https://api.tiingo.com/'
}));
export { alphavantage };
|
Have you bought your transcranial direct current stimulation (tDCS) head-set yet? You've heard of this, right? It's a device with electrodes that zap your brain through your skull, using mild electrical currents to make you smarter. The man in the picture above sports one of the first commercially available devices. Produced by Foc.us, it's available for $249, and also comes in black. This technology is far from new - Roman physician Galen was on to something similar when he slapped electric fish on his patients' heads. But tDCS is now in the process of going mainstream: there are DIY brain-zapping enthusiasts on YouTube; last year MTV editor Mary H K Choi wrote an amusing but inconclusive tDCS self-experimentation piece for Aeon; and just the other day, Oliver Burkeman included tDCS in his roundup of new brain-enhancing technologies for The Guardian.
The manufacturers claim that the tDCS headset will "overclock your brain", increase your brain's plasticity and "make your synapses fire faster". Overclocking sounds a bit dangerous, and rather than your synapses, wouldn't it be better to make your neurons fire faster? Synapses are the junctions between neurons. We usually say it's neurons that "fire" and their message is passed across one or more synapses to other neurons using chemicals. Unless the marketing people were talking specifically about electrical synapses? But sorry, I'm rambling. Must focus. "Foc.us". Need more electric current. Hang on …
Phew, better … To be fair to Foc.us, the idea of having faster synapses at the flick of switch sounds appealing, and, believe it or not, their claims for the brain-enhancing effects of tDCS are not entirely unfounded. In fact, almost each week there's a new study claiming that tDCS can boost yet another aspect of mental function. Zapping different parts of the brain has been linked with superior learning of new motor skills; better math skills; better social skills; superior learning and memory; and on the list goes. tDCS is also being investigated as a treatment for a range of psychiatric and neurological problems, but for today let's focus on mental enhancement for healthy people.
From a physiological perspective, tDCS affects brain function in two ways - by altering the baseline activity level of targeted neurons and by modifying functioning at synapses. The effect on neuron activity levels occurs while you zap; the synaptic influence is a longer-lasting after-effect. The specific changes depend on a whole range of factors, most obviously whether the current is positive ("anodal"), which increases neuronal activity, or negative ("cathodal") which suppresses it. tDCS is not to be confused with electro-convulsive therapy (for severe depression and other conditions) in which a much higher current of electricity is used to deliberately induce a brain seizure.
So the brain changes triggered by tDCS are real. And there are those findings in peer-reviewed journals showing a range of appealing cognitive benefits. What's not to like? Well, I confess I'm geeky enough to have compiled and read a number of cautionary science papers on tDCS published by experts over the last couple of years, and they certainly give pause for thought. Before you start revving up your grey matter with extra electricity, I suggest you bear in mind the following caveats and warnings:
1. Most studies looking at the cognitive benefits of tDCS fail to include adequate blinding and control conditions. This means the researchers and the lab rats - sorry, participants - both know who is receiving the real intervention. Big placebo effects are therefore likely because participants will have expectations of some kind of effect, and researchers could also influence outcomes with their enthusiasm or expectations.
2. You don't just strap on a tDCS headset and become instantly smarter. Shucks. The experts say that the technique works by enhancing the effects of learning and practice. You still have to put effort in. "tDCS alone is of little use," Roi Cohen Kadosh, a leading researcher in this area, told me last year. "The advantage of it is when it is combined with a cognitive training, rather than just applied alone to the brain," he said. But even then it doesn't work for everyone.
3. There's huge variability in the effects of tDCS between individuals, and probably also in the same individual from one session to another. "Unfortunately … response reliability at the level of the individual has not been explored (or at least reported) in the literature to date," say Jared Horvath and colleagues. Factors to do with fatigue and hormone levels are also likely to interact with tDCS in ways we don't yet understand. All of which makes it hard to know the optimal and safe level of zapping to use.
4. Bad news! "Meddling with the tDCS dose is potentially as dangerous as tampering with a drug's chemical composition," say Marom Bikson and colleagues in their recent Letter to *Nature* entitled: "Transcranial devices are not playthings". Other factors that will interfere with the dose include how much hair you have on your head and whether or not you sweat a lot.
5. Effects of brain zapping can accumulate over time and the long-term consequences of this are unknown.
6. Researchers studying tDCS are very careful to target specific brain areas. How will you know you're zapping the right part of your brain? This is particularly important for left-handers, who can have functional hubs located on a different side of the brain than usual.
7. What you do after a brain zapping session can modify or completely nullify any effects of the electricity. Walking around or having specific thoughts is all it takes to potentially reverse the effects. Research on this problem is still in its infancy, so there's no way you can know how best to behave after a tDCS session to preserve any potential benefits.
8. If you enhance mental function in one area, it can actually have a negative impact on another aspect of mental function. Because the neural effects of tDCS can be long-lasting, what might be advantageous in one situation could therefore leave you impaired in a different context later.
9. Misuse of the technology could risk seizures or scalp burns. Also watch out for itching, fatigue and nausea. Nick Davis and colleagues say it's a mistake to think of brain zapping as non-invasive. "Any technique which directly affects brain tissue to generate such powerful acute and long-lasting effects should be treated with the same respect as any surgical technique," they write. On the plus side, a 2011 paper stated that "no serious side effects have occurred" in more than 100 studies with patients and healthy controls.
10. Photographic evidence from Foc.us suggests that too much tDCS causes a desire to squinch.
Disclaimer: Despite possible appearances to the contrary, this post was written by an under-clocked brain that's not yet been zapped by tDCS. |
//
// RGBUIColor.h
// RGB
//
// Created by Orange on 8/12/16.
// Copyright © 2016 colorcun. All rights reserved.
//
#import <Foundation/Foundation.h>
@interface RGBUIColor : NSObject
@end
|
/**
* Connects the session to remote nadron server. Depending on the connection
* parameters provided to LoginHelper, it can connect both TCP and UDP
* transports.
*
* @param session
* The session to be connected to remote nadron server.
* @throws InterruptedException
* @throws Exception
*/
public void connectSession(final Session session)
throws InterruptedException, Exception
{
connectSession(session, (EventHandler[]) null);
} |
/**
* MIIS Core Identifier (Item 94).
*
* <p>From ST:
*
* <blockquote>
*
* Use according to the rules and requirements defined in ST 1204.
*
* <p>The MIIS Core Identifier allows users to include the MIIS Core Identifier (MISB ST 1204)
* Binary Value (opposed to the text-based representation) within MISB ST 0601. Item 94's value does
* not include MISB ST 1204's 16-byte Key or length, only the value portion. See MISB ST 1204 for
* generation and usage requirements.
*
* </blockquote>
*/
public class MiisCoreIdentifier implements IUasDatalinkValue {
private CoreIdentifier coreIdentifier;
/**
* Create from value.
*
* @param identifier a valid ST1204 Core Identifier
*/
public MiisCoreIdentifier(CoreIdentifier identifier) {
coreIdentifier = identifier;
}
/**
* Create from encoded bytes.
*
* @param bytes The byte array containing the raw values.
*/
public MiisCoreIdentifier(byte[] bytes) {
coreIdentifier = CoreIdentifier.fromBytes(bytes);
}
/**
* Get the identifier.
*
* @return The identifier (which can be null if the parsing failed).
*/
public CoreIdentifier getCoreIdentifier() {
return coreIdentifier;
}
@Override
public byte[] getBytes() {
if (coreIdentifier != null) {
return coreIdentifier.getRawBytesRepresentation();
} else {
return null;
}
}
@Override
public String getDisplayableValue() {
if (coreIdentifier != null) {
return coreIdentifier.getTextRepresentation();
} else {
return "[NULL]";
}
}
@Override
public String getDisplayName() {
return "MIIS Core Identifier";
}
} |
// VersionLessThan returns true if a < b
func VersionLessThan(a, b string) bool {
xa, err := extractInts(a)
if err != nil {
panic(err)
}
xb, err := extractInts(b)
if err != nil {
panic(err)
}
for i := 0; i < len(xa); i++ {
		// Guard: a may have more numeric components than b (e.g. "1.2.3" vs "1.2").
		if i >= len(xb) {
			return false
		}
		if xa[i] == xb[i] {
			continue
		}
		return xa[i] < xb[i]
	}
	// All shared components are equal; a is "less" only if it has fewer components.
	return len(xa) < len(xb)
} |
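A quick usage sketch for the VersionLessThan helper above follows. It assumes extractInts (not shown here) splits a dotted version string into its numeric components, as its use above implies, and that the call is made from within the same package; the package name is hypothetical.
package version // hypothetical package name
import "fmt"
func ExampleVersionLessThan() {
	fmt.Println(VersionLessThan("1.2.3", "1.3.0")) // true: 2 < 3 in the second component
	fmt.Println(VersionLessThan("1.3.0", "1.2.3")) // false
	fmt.Println(VersionLessThan("1.2.0", "1.2.0")) // false: equal versions are not "less than"
}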
/**
* Removes a structure element kid.
*
* @param structureElement the structure element
* @return <code>true</code> if the kid was removed, <code>false</code> otherwise
*/
public boolean removeKid(PDStructureElement structureElement)
{
boolean removed = this.removeObjectableKid(structureElement);
if (removed)
{
structureElement.setParent(null);
}
return removed;
} |
def process_example(
interface: Interface, example_id: int
) -> Tuple[List[Any], List[float]]:
example_set = interface.examples[example_id]
raw_input = [
interface.input_components[i].preprocess_example(example)
for i, example in enumerate(example_set)
]
prediction = interface.process(raw_input)
return prediction |
/// Draw the specified image, with rounded corners
pub fn add_image_rounded(
&'ui self,
texture_id: TextureId,
p_min: [f32; 2],
p_max: [f32; 2],
rounding: f32,
) -> ImageRounded {
ImageRounded::new(self, texture_id, p_min, p_max, rounding)
} |