CAPE CANAVERAL, Fla. – Astronomers have found a dwarf planet far beyond the orbit of Pluto and can only guess how it got there.
The diminutive world, provisionally called "2012 VP 113" by the international Minor Planet Center, is estimated to be about 280 miles (450 km) in diameter, less than half the size of a neighbouring dwarf planet named Sedna discovered a decade ago.
Sedna and VP 113 are the first objects found in a region of the solar system previously believed to be devoid of planetary bodies.
The proverbial no-man’s land extended from the outer edge of the Kuiper Belt, home to the dwarf planet Pluto and more than 1,000 other small icy bodies, to the comet-rich Oort Cloud, which orbits the sun some 10,000 times farther away than Earth.
"When Sedna was discovered 10 years ago it kind of redefined what we thought about the solar system," astronomer Scott Sheppard of the Carnegie Institution of Washington D.C. said in an interview.
Nothing in the appearance of the modern-day solar system can account for Sedna and VP 113’s existence, say astronomers who published their findings on Wednesday in the journal Nature.
Sedna’s 11,400-year orbit takes it only as close as 76 times the distance that Earth orbits the sun. VP 113’s closest approach is 80 times as far as Earth’s orbit of the sun – roughly twice as far as the Kuiper Belt.
"In the current architecture of the solar system, Sedna and 2012 VP113 should not be there," writes astronomer Megan Schwamb, of the Academia Sinica in Taipei, Taiwan, in a separate Nature article.
Computer simulations provide a few potential scenarios.
Lead researcher Chad Trujillo favours the idea that a sibling star forming in the same stellar nursery as the sun gravitationally elbowed some Oort Cloud residents inward as it flew by.
Sheppard suggests that another planet at least as massive as Earth got bumped out of the solar system, taking some Kuiper Belt bodies with it along the way.
That renegade planet or planets actually may still be lurking in the farthest reaches of the solar system, too dim and remote to be detected by currently available telescopes and cameras, Sheppard said.
A third option is that the sun has a companion, something five- to 10 times the mass of Earth, whose gravity is pinning Sedna, VP 113 and potentially millions of other dwarf-like planets in unusual and distant orbits.
"With our discovery of one more object, we can’t rule out one theory or another," Trujillo said.
More residents of the region, now known as the inner Oort Cloud, soon may make their presence known.
Astronomers are working to confirm six other Sedna-like objects found last year. That requires imaging the mini-planets several times over a year or longer to measure how much they have moved relative to background stars.
"They’re really hard to find," Trujillo said.
Astronomers suspect there may be 150 million Sedna-like dwarf planets measuring between 31 and 5,000 miles (50 and 8,000 km) in diameter, a larger population than that of the Kuiper Belt. |
/**
* dma_region_init - clear out all fields but do not allocate anything
*/
void dma_region_init(struct dma_region *dma)
{
dma->kvirt = NULL;
dma->dev = NULL;
dma->n_pages = 0;
dma->n_dma_pages = 0;
dma->sglist = NULL;
} |
/**
* A tabular validation exception.
* <p>
* <b>NOTE: This package was never completed and isn't used anywhere.</b>
* <p>
* @author Erich P. Gatejen
* @version 1.0
* <p>
* <i>Version History</i>
* <pre>EPG - Initial - 10 NOV 04
* </pre>
*/
public class TabularValidationException extends TabularException {
private static final long serialVersionUID = 1L;
// PRIVATE FIELDS
private int theLineNumber;
/*
* This field defines the value used when no line number is set. All real line numbers are natural numbers: 0 through Integer.MAX_VALUE.
*/
public final static int NO_LINE_NUMBER = -1;
/**
* Default constructor.
*/
public TabularValidationException() {
super();
theLineNumber = NO_LINE_NUMBER;
}
public TabularValidationException(String message) {
super(message);
theLineNumber = NO_LINE_NUMBER;
}
public TabularValidationException(String message, int lineNumber) {
super(message);
theLineNumber = lineNumber;
}
public TabularValidationException(String message, Throwable cause) {
super(message,cause);
theLineNumber = NO_LINE_NUMBER;
}
public TabularValidationException(Throwable cause) {
super(cause);
theLineNumber = NO_LINE_NUMBER;
}
public int getLineNumber() {
return theLineNumber;
}
} |
def display_credential():
return Credential_Sect.display_credentials() |
Paravertebral cervical chordoma – a case report
Chordomas constitute <5% of vertebral column tumours and a third of these arise in the upper cervical spine and tend to be clival – usually midline, with occasional eccentric extension. We report a case of cervical chordoma presenting as a lateral neck mass and discuss its origin, diagnosis and management. |
def create_hash_table(self):
nchains = len(self.obj.symbols) + 1
nbuckets = 8
buckets = [0] * nbuckets
chain = [0] * nchains
for symbol in self.obj.symbols:
symbol_index = self.symbol_id_map[symbol.id]
hash_value = elf_hash(symbol.name)
bucket_index = hash_value % nbuckets
if buckets[bucket_index] == 0:
buckets[bucket_index] = symbol_index
else:
chain_index = buckets[bucket_index]
while chain[chain_index] != 0:
chain_index = chain[chain_index]
chain[chain_index] = symbol_index |
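The bucket/chain layout above follows the classic SysV ELF hash table: `nchains` includes the null symbol at index 0, so a chain entry of 0 marks the end of a chain. The `elf_hash` function the code relies on is not shown; a minimal sketch, assuming the standard SysV algorithm:

```python
def elf_hash(name: str) -> int:
    """SysV ELF symbol hash (sketch; assumes the standard algorithm)."""
    h = 0
    for ch in name.encode():
        h = ((h << 4) + ch) & 0xFFFFFFFF   # shift in the next byte
        g = h & 0xF0000000                 # top nibble
        if g:
            h ^= g >> 24                   # fold the top nibble back in
        h &= ~g & 0xFFFFFFFF               # and clear it
    return h
```

The bucket index for a symbol is then `elf_hash(name) % nbuckets`, exactly as in the loop above.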
// frontend/pages/api/quiz/count.ts
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from "next";
import prisma from "../../../lib/prisma";
type Data = {
count: number;
};
export default async function handle(
req: NextApiRequest,
res: NextApiResponse<Data>
) {
const questionsCount = await prisma.question.count();
res.status(200).json({ count: questionsCount });
}
|
// repository: angie1148/azure-sdk-for-js2
/*
* Copyright (c) Microsoft Corporation.
* Licensed under the MIT License.
*
* Code generated by Microsoft (R) AutoRest Code Generator.
* Changes may cause incorrect behavior and will be lost if the code is regenerated.
*/
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT License.
import {
Account,
MLTeamAccountManagementClient
} from "@azure/arm-machinelearningexperimentation";
import { DefaultAzureCredential } from "@azure/identity";
/**
 * This sample demonstrates how to create or update a team account with the specified parameters.
*
* @summary Creates or updates a team account with the specified parameters.
* x-ms-original-file: specification/machinelearningexperimentation/resource-manager/Microsoft.MachineLearningExperimentation/preview/2017-05-01-preview/examples/CreateAccount.json
*/
async function accountCreate() {
const subscriptionId = "00000000-1111-2222-3333-444444444444";
const resourceGroupName = "accountcrud-1234";
const accountName = "accountcrud5678";
const parameters: Account = {
keyVaultId:
"/subscriptions/00000000-1111-2222-3333-444444444444/resourceGroups/accountcrud-1234/providers/Microsoft.KeyVault/vaults/testkv",
location: "East US",
storageAccount: {
accessKey: "key",
storageAccountId:
"/subscriptions/00000000-1111-2222-3333-444444444444/resourceGroups/accountcrud-1234/providers/Microsoft.Storage/storageAccounts/testStorageAccount"
},
tags: { tagKey1: "TagValue1" },
vsoAccountId:
"/subscriptions/00000000-1111-2222-3333-444444444444/resourceGroups/accountcrud-1234/providers/microsoft.visualstudio/account/vsotest"
};
const credential = new DefaultAzureCredential();
const client = new MLTeamAccountManagementClient(credential, subscriptionId);
const result = await client.accounts.createOrUpdate(
resourceGroupName,
accountName,
parameters
);
console.log(result);
}
accountCreate().catch(console.error);
|
Three different methods for treating multiple enchondromatosis in one hand.
Ollier's disease is a comparatively rare, non-hereditary cartilage dysplasia of bone, usually associated with problems such as deformity and fracture. Three different methods were used in one hand of a 15-year-old boy who presented with pain and swellings in the left hand. After curettage of the tumours, and taking into account the differences in bone structure reconstruction required in the various parts of the patient's hand, we chose the following three methods: fixation with a locking plate and calcium phosphate cement, filling with allograft bone, and curettage of the tumour without any bone graft. After surgery, the patient was able to perform full motion of the operated hand, and no evidence of recurrence was noted four years after surgery. The choice of whether to use a bone graft for bone structure reconstruction depends on the patient's condition; for patients with large osseous defects or pathological fractures, however, we recommend a full bone graft and reliable internal fixation. Early postoperative exercises can achieve a desirable result and functional recovery. |
#include<bits/stdc++.h>
using namespace std;
#define mod 1000000007
#define ll long long
#define Author std::ios_base::sync_with_stdio(0);
#define u_map unordered_map<ll,ll>
#define n_map map<ll,ll>
#define n_pair pair<ll,ll>
#define all(v) v.begin(),v.end()
#define frr(i,j,k) for(int i=j; i<k; i++)
#define frp(i,j,k) for(int i=j; i>k; i--)
#define pb(a) push_back(a)
#define lb(v,t) lower_bound(all(v),t)-v.begin()
#define ub(v,t) upper_bound(all(v),t)-v.begin()
#define in(t) scanf("%lld",&t);
#define out(t) printf("%lld\n",t);
int main(){
ll t;
in(t);
while(t--){
ll n;
in(n);
ll arr[n+1],br[n+1];
frr(i,0,n){in(arr[i]);
in(br[i]);}
map<pair<int,int>,ll>mp;
bool die[n+2];
frr(i,0,n)die[i]=false;
frr(i,0,n){
if(i!=(0)){
ll diff=(br[i-1]-arr[i]);
if(diff>0)die[i]=true;
pair<int,int>p=make_pair(i-1,i);
if(!die[i])
mp[p]=abs(diff);}
else{
ll diff=(br[n-1]-arr[0]);
if(diff>0)die[0]=true;
else{
die[0]=false;
pair<int,int>p=make_pair(n-1,0);
mp[p]=abs(diff);
}
}
}
ll ans[n+1];
memset(ans,0,sizeof(ans));
ans[0]=arr[0];
frr(i,1,n){
// cout<<mp[{i-1,i}]<<endl;
if(die[i]==false){
ans[0]+=mp[{i-1,i}];
}
}
// cout<<ans[0]<<endl;
ll an1=ans[0];
frr(i,1,n){
if(i==1){
ans[i]=ans[i-1]+arr[i]-arr[i-1]+mp[{n-1,0}]-mp[{0,1}];}
else
ans[i]=ans[i-1]+arr[i]-arr[i-1]+mp[{i-2,i-1}]-mp[{i-1,i}];
// cout<<ans[i]<<endl;
an1=min(an1,ans[i]);
}
out(an1);
}
return 0;
} |
// pkg/common/operator/backendconfig_test.go
package operator
import (
"testing"
"k8s.io/ingress-gce/pkg/backendconfig"
api_v1 "k8s.io/api/core/v1"
)
func TestDoesServiceReferenceBackendConfig(t *testing.T) {
testCases := []struct {
desc string
service *api_v1.Service
expected bool
}{
{
desc: "service with no backend config",
service: backendconfig.SvcWithoutConfig,
expected: false,
},
{
desc: "service with test backend config",
service: backendconfig.SvcWithTestConfig,
expected: true,
},
{
desc: "service with default test backend config",
service: backendconfig.SvcWithDefaultTestConfig,
expected: true,
},
{
desc: "service with test backend config in a different namespace",
service: backendconfig.SvcWithTestConfigOtherNamespace,
expected: false,
},
{
desc: "service with a different backend config",
service: backendconfig.SvcWithOtherConfig,
expected: false,
},
{
desc: "service with a different default backend config",
service: backendconfig.SvcWithDefaultOtherConfig,
expected: false,
},
{
desc: "service with invalid backend config",
service: backendconfig.SvcWithInvalidConfig,
expected: false,
},
}
for _, tc := range testCases {
result := doesServiceReferenceBackendConfig(tc.service, backendconfig.TestBackendConfig)
if result != tc.expected {
t.Fatalf("%s: got %v, want %v", tc.desc, result, tc.expected)
}
}
}
|
Apple initially claimed the iPhone 4 antenna flaw really isn't a flaw at all, but a software bug that misrepresents the number of bars of signal strength. It's now well-known that the iPhone 4 suffers from an antenna design flaw, and a new report by Bloomberg News reveals that Steve Jobs was warned early in the design phase by a top engineer that the antenna design could lead to dropped calls.
Steve Jobs highlighted the iPhone 4 antenna design as "brilliant engineering" during his WWDC keynote. For many users it does offer better reception in areas that previous iPhones experienced problems, but it does have a reproducible issue that causes signal loss when a particular spot on the bottom left of the device is touched when a user holds the iPhone 4 in their left hand.
According to an anonymous source speaking to Bloomberg, senior director of engineering for iPhone and iPod Ruben Caballero warned top Apple management that the antenna design had potential to cause reception problems under certain conditions. This same source also told Bloomberg that one of Apple's carrier partners expressed concern over the antenna design before the iPhone 4's June launch.
Apple offered the explanation last week that the real problem is that all iPhones running iOS 4 actually display too many bars due to the range of numeric signal strength values programmed for each "bar" typically displayed on all cell phones. A beta of iOS 4.1 containing Apple's fix was released to developers yesterday, and reports have confirmed that iPhones now typically show fewer bars of signal strength in areas without perfect reception. Unfortunately for iPhone 4 owners, the change doesn't actually mitigate the signal loss that happens when the small gap in the stainless steel bezel that separates two of the iPhone 4's three antennas is touched, causing the two to be bridged electrically. If the signal is weak enough, it can result in dropped calls or data connections.
Analysts disagree about the potential impact for Apple, but news earlier this week that Consumer Reports decided it could not recommend the device to consumers because of the potential for reception degradation caused a serious slump (about $10, or 4 percent) in Apple's stock value.
Even politicians are expressing concern over the problem. Sen. Charles Schumer (D-NY) inked a missive to Apple over the antenna woes. "I am concerned that the nearly two million purchasers of the iPhone 4 may not have complete information about the quality of the product they have purchased," reads a copy of the letter posted by AppleInsider. "The burden for consumers caused by this glitch, combined with the confusion over its cause and how it will be fixed, has the potential to undermine the many benefits of this innovative device."
Apple will be holding a special press conference at 10am PDT on Friday, and Ars will be on the scene. So far the company hasn't revealed any details of the announcement except to say that it concerns the iPhone 4. Speculation so far is that Apple will publicly address the antenna flaw and offer some kind of fix, though there is always a possibility it could be something else.
We would like to know what you think Apple will announce at the press conference. Will Apple dismiss the issue? Offer a recall? Teach us all how to hold the phone properly?
Update: Apple has officially denied everything about the Bloomberg report, and says that it won't be issuing a recall. "We challenge Bloomberg BusinessWeek to produce anything beyond rumors to back this up. It's simply not true," an Apple spokesperson told the Wall Street Journal.
What will Apple reveal during its press conference on Friday? |
def enter_enemy_attack_state(self):
self.state = self.info_box.state = c.ENEMY_ATTACK
enemy = self.enemy_list[self.enemy_index]
enemy.enter_enemy_attack_state() |
Investigation of Topologies of Low Voltage Multilevel Inverters
This paper investigates topologies for the main circuit of low-voltage, high-performance power inverters. Two new multilevel inverter topologies are proposed. Inverters built on the proposed topologies share the advantages of a simple circuit and control method, fewer controllable semiconductor devices, high-quality output waveforms, and high efficiency. Simulations in PSIM 4.0 of the proposed adding-bidirectional-switch multilevel inverter, based on the multi-carrier SPWM method, were carried out and demonstrate the validity and practicability of the proposed topology. |
/*
* nist_increment_block
* Increment the output block as a big-endian number.
*/
static void
nist_increment_block(unsigned int* V)
{
int i;
unsigned int x;
for (i = NIST_BLOCK_OUTLEN_INTS - 1; i >= 0; --i) {
x = NIST_NTOHL(V[i]) + 1;
V[i] = NIST_HTONL(x);
if (x)
return;
}
} |
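The same increment-with-carry logic can be sketched in Python over a list of words, mirroring the C loop above (the 32-bit word width is an assumption, matching what `NIST_NTOHL`/`NIST_HTONL` imply):

```python
def increment_block(words):
    """Increment a big-endian multi-word counter in place.

    words: list of 32-bit integers, most significant word first.
    Wraps to all zeros on overflow, like the C version.
    """
    for i in range(len(words) - 1, -1, -1):
        words[i] = (words[i] + 1) & 0xFFFFFFFF
        if words[i] != 0:   # no carry out of this word, done
            return
```

For example, incrementing `[0, 0xFFFFFFFF]` carries into the high word, yielding `[1, 0]`.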
def d1_holland(x):
return np.arctan(x) |
/**
* Hip and Shoulder to Foot joints position vectors
*/
public void jointsToFootPositionVectors()
{
FrameVector hipToFootInWorld = new FrameVector(ReferenceFrame.getWorldFrame());
FrameVector shoulderToFootInWorld = new FrameVector(ReferenceFrame.getWorldFrame());
/*
* Foot location in world
*/
footLocation.set(robot.computeFootLocation());
/*
* Foot to hip position vector
*/
robot.getHipJoint().getTranslationToWorld(hipToFootInWorld.getVector());
hipJointPosition.set(hipToFootInWorld);
hipToFootPositionVector.sub(footLocation.getVector3dCopy(), hipToFootInWorld.getVector());
hipToFootUnitVector.set(hipToFootPositionVector);
hipToFootUnitVector.normalize();
/*
* Shoulder to Foot position vector
*/
robot.getShoulderJoint().getTranslationToWorld(shoulderToFootInWorld.getVector());
shoulderJointPosition.set(shoulderToFootInWorld);
shoulderToFootPositionVector.sub(footLocation.getVector3dCopy(), shoulderToFootInWorld.getVector());
shoulderToFootUnitVector.set(shoulderToFootPositionVector);
shoulderToFootUnitVector.normalize();
} |
OBJECTIVE
To observe the clinical efficacy differences between acupuncture combined with dynamic moxibustion and acupuncture alone for adult ankylosing spondylitis (AS) at early-to-mid stage based on medication.
METHODS
Fifty-five cases of adult AS were randomly divided into an acupuncture-moxibustion group (28 cases) and an acupuncture group (27 cases). Both groups were treated with oral administration of sulfasalazine tablets. In addition, the acupuncture-moxibustion group was treated with acupuncture at Jiaji (EX-B 2), Shenshu (BL 23), Dachangshu (BL 25) and Weizhong (BL 40), as well as dynamic moxibustion along the first line of the bladder meridian of foot-taiyang and the governor vessel from Dazhui (GV 14) to Yaoshu (GV 2). The acupuncture group was treated with acupuncture alone, using acupoints and manipulation identical to those of the acupuncture-moxibustion group. The treatment was given once a day, five days per week; one session consisted of one month of treatment, and three sessions were given in total. The Bath ankylosing spondylitis functional index (BASFI) and Bath ankylosing spondylitis disease activity index (BASDAI) were compared before and after treatment in the two groups; the clinical effective rates were also compared between the two groups.
RESULTS
The total effective rate was 96.4% (27/28) in the acupuncture-moxibustion group, which was superior to 88.9% (24/27) in the acupuncture group (P<0.05). Compared with before treatment, BASFI and BASDAI were reduced after treatment in both groups (all P<0.05), and the reductions were more significant in the acupuncture-moxibustion group (both P<0.05).
CONCLUSIONS
Based on medication, acupuncture combined with dynamic moxibustion could improve the clinical symptoms of AS, which is superior to simple acupuncture. |
def _mul_div(self, scaling_factor, div=False):
if not isinstance(scaling_factor, UFloat):
try:
scaling_factor = float(scaling_factor)
except (TypeError, ValueError):
raise TypeError("Spectrum must be multiplied/divided by a scalar")
if (
scaling_factor == 0
or np.isinf(scaling_factor)
or np.isnan(scaling_factor)
):
raise ValueError("Scaling factor must be nonzero and finite")
else:
if (
scaling_factor.nominal_value == 0
or np.isinf(scaling_factor.nominal_value)
or np.isnan(scaling_factor.nominal_value)
):
raise ValueError("Scaling factor must be nonzero and finite")
if div:
multiplier = 1 / scaling_factor
else:
multiplier = scaling_factor
if self._counts is not None:
data_arg = {"counts": self.counts * multiplier}
else:
data_arg = {"cps": self.cps * multiplier}
if self.is_calibrated:
spect_obj = Spectrum(bin_edges_kev=self.bin_edges_kev, **data_arg)
else:
spect_obj = Spectrum(bin_edges_raw=self.bin_edges_raw, **data_arg)
return spect_obj |
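The validation in `_mul_div` rejects zero, infinite, and NaN scaling factors whether they arrive as plain scalars or as `uncertainties.UFloat` values. The plain-scalar branch can be sketched standalone; the helper name here is illustrative, not part of the original class:

```python
import math

def validate_scaling_factor(value):
    """Mirror of the scalar branch above (illustrative helper):
    coerce to float, then reject zero, infinite, or NaN factors."""
    try:
        value = float(value)
    except (TypeError, ValueError):
        raise TypeError("Spectrum must be multiplied/divided by a scalar")
    if value == 0 or math.isinf(value) or math.isnan(value):
        raise ValueError("Scaling factor must be nonzero and finite")
    return value
```

Division then reuses the same path with `multiplier = 1 / scaling_factor`, which is why a zero factor must be caught up front.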
import os, math
import pickle
import numpy as np
import geopy.distance
from scipy.sparse import linalg
import scipy.sparse as sp
from tqdm import tqdm
class DataLoader(object):
def __init__(self, xs, ys, batch_size, pad_with_last_sample=True):
"""
:param xs:
:param ys:
:param batch_size:
:param pad_with_last_sample: pad with the last sample to make number of samples divisible to batch_size.
"""
self.batch_size = batch_size
self.current_ind = 0
if pad_with_last_sample:
num_padding = (batch_size - (len(xs) % batch_size)) % batch_size
x_padding = np.repeat(xs[-1:], num_padding, axis=0)
y_padding = np.repeat(ys[-1:], num_padding, axis=0)
xs = np.concatenate([xs, x_padding], axis=0)
ys = np.concatenate([ys, y_padding], axis=0)
self.size = len(xs)
self.num_batch = int(self.size // self.batch_size)
self.xs = xs
self.ys = ys
def shuffle(self):
permutation = np.random.permutation(self.size)
xs, ys = self.xs[permutation], self.ys[permutation]
self.xs = xs
self.ys = ys
def get_iterator(self):
self.current_ind = 0
def _wrapper():
while self.current_ind < self.num_batch:
start_ind = self.batch_size * self.current_ind
end_ind = min(self.size, self.batch_size * (self.current_ind + 1))
x_i = self.xs[start_ind: end_ind, ...]
y_i = self.ys[start_ind: end_ind, ...]
yield (x_i, y_i)
self.current_ind += 1
return _wrapper()
class StandardScaler():
"""
Standard the input
"""
def __init__(self, mean, std):
self.mean = mean
self.std = std
def transform(self, data):
return (data - self.mean) / self.std
def inverse_transform(self, data):
return (data * self.std) + self.mean
def sym_adj(adj):
"""Symmetrically normalize adjacency matrix."""
adj = sp.coo_matrix(adj)
rowsum = np.array(adj.sum(1))
d_inv_sqrt = np.power(rowsum, -0.5).flatten()
d_inv_sqrt[np.isinf(d_inv_sqrt)] = 0.
d_mat_inv_sqrt = sp.diags(d_inv_sqrt)
return adj.dot(d_mat_inv_sqrt).transpose().dot(d_mat_inv_sqrt).astype(np.float32).todense()
def asym_adj(adj):
adj = sp.coo_matrix(adj)
rowsum = np.array(adj.sum(1)).flatten()
d_inv = np.power(rowsum, -1).flatten()
d_inv[np.isinf(d_inv)] = 0.
d_mat= sp.diags(d_inv)
return d_mat.dot(adj).astype(np.float32).todense()
def calculate_normalized_laplacian(adj):
"""
# L = D^-1/2 (D-A) D^-1/2 = I - D^-1/2 A D^-1/2
# D = diag(A 1)
:param adj:
:return:
"""
adj = sp.coo_matrix(adj)
d = np.array(adj.sum(1))
d_inv_sqrt = np.power(d, -0.5).flatten()
d_inv_sqrt[np.isinf(d_inv_sqrt)] = 0.
d_mat_inv_sqrt = sp.diags(d_inv_sqrt)
normalized_laplacian = sp.eye(adj.shape[0]) - adj.dot(d_mat_inv_sqrt).transpose().dot(d_mat_inv_sqrt).tocoo()
return normalized_laplacian
def calculate_scaled_laplacian(adj_mx, lambda_max=2, undirected=True):
if undirected:
adj_mx = np.maximum.reduce([adj_mx, adj_mx.T])
L = calculate_normalized_laplacian(adj_mx)
if lambda_max is None:
lambda_max, _ = linalg.eigsh(L, 1, which='LM')
lambda_max = lambda_max[0]
L = sp.csr_matrix(L)
M, _ = L.shape
I = sp.identity(M, format='csr', dtype=L.dtype)
L = (2 / lambda_max * L) - I
return L.astype(np.float32).todense()
def load_pickle(pickle_file):
try:
with open(pickle_file, 'rb') as f:
pickle_data = pickle.load(f)
except UnicodeDecodeError as e:
with open(pickle_file, 'rb') as f:
pickle_data = pickle.load(f, encoding='latin1')
except Exception as e:
print('Unable to load data ', pickle_file, ':', e)
raise
return pickle_data
def load_adj(pkl_filename, adjtype):
"""
Load directed adjacency matrix (predefined)
:param pkl_filename:
:param adjtype:
:return:
"""
sensor_ids, sensor_id_to_ind, adj_mx = load_pickle(pkl_filename)
if adjtype == "scalap":
adj = [calculate_scaled_laplacian(adj_mx)]
elif adjtype == "normlap":
adj = [calculate_normalized_laplacian(adj_mx).astype(np.float32).todense()]
elif adjtype == "symnadj":
adj = [sym_adj(adj_mx)]
elif adjtype == "transition":
adj = [asym_adj(adj_mx)]
elif adjtype == "doubletransition":
adj = [asym_adj(adj_mx), asym_adj(np.transpose(adj_mx))]
elif adjtype == "identity":
adj = [np.diag(np.ones(adj_mx.shape[0])).astype(np.float32)]
else:
raise ValueError("adj type not defined")
return sensor_ids, sensor_id_to_ind, adj
def load_dataset(dataset_dir, batch_size):
data = {}
for category in ['train', 'val', 'test']:
cat_data = np.load(os.path.join(dataset_dir, category + '.npz'))
data['x_' + category] = cat_data['x']
data['y_' + category] = cat_data['y']
scaler = StandardScaler(mean=data['x_train'][..., 0].mean(), std=data['x_train'][..., 0].std())
# Data format
for category in ['train', 'val', 'test']:
data['x_' + category][..., 0] = scaler.transform(data['x_' + category][..., 0])
data['train_loader'] = DataLoader(data['x_train'], data['y_train'], batch_size)
data['val_loader'] = DataLoader(data['x_val'], data['y_val'], batch_size)
data['test_loader'] = DataLoader(data['x_test'], data['y_test'], batch_size)
data['scaler'] = scaler
return data
## newly added method for generating the adjacency matrix: directed graph
def get_adjacency_matrix(distance_df, sensor_ids, normalized_k=0.1):
"""
Compute the directed adjacency matrix
:param distance_df: data frame with three columns: [from, to, distance].
:param sensor_ids: list of sensor ids.
:param normalized_k: entries that become lower than normalized_k after normalization are set to zero for sparsity.
:return:
"""
num_sensors = len(sensor_ids)
dist_mx = np.zeros((num_sensors, num_sensors), dtype=np.float32)
dist_mx[:] = np.inf
# Builds sensor id to index map.
sensor_id_to_ind = {}
for i, sensor_id in enumerate(sensor_ids):
sensor_id_to_ind[sensor_id] = i
# Fills cells in the matrix with distances.
for index, row in tqdm(distance_df.iterrows(), total=distance_df.shape[0]):
if row["from"] not in sensor_ids or row["to"] not in sensor_ids:
continue
dist_mx[sensor_id_to_ind[row["from"]], sensor_id_to_ind[row["to"]]] = row["cost"]
# Calculates the standard deviation as theta.
distances = dist_mx[~np.isinf(dist_mx)].flatten()
std = distances.std()
adj_mx = np.exp(-np.square(dist_mx / std))
# Make the adjacent matrix symmetric by taking the max.
# adj_mx = np.maximum.reduce([adj_mx, adj_mx.T])
return sensor_ids, sensor_id_to_ind, adj_mx
def get_dist_matrix(sensor_locs):
"""
Compute the absolute spatial distance matrix
:param sensor_locs: with header and index, [index, sensor_id, longitude, latitude]
:return:
"""
sensor_ids = sensor_locs[1:, 1] #remove header and index
sensor_id_to_ind = {}
num_sensors = len(sensor_ids)
dist_mx = np.zeros((num_sensors, num_sensors), dtype=np.float32)
dist_mx[:] = np.inf
for i, sensor_id in enumerate(sensor_ids):
sensor_id_to_ind.update({sensor_id : i})
for id1 in sensor_ids:
coords_1 = sensor_locs[sensor_locs[:, 1] == id1][0][2:]
for id2 in sensor_ids:
if math.isinf(dist_mx[sensor_id_to_ind[id1], sensor_id_to_ind[id2]]):
coords_2 = sensor_locs[sensor_locs[:, 1] == id2][0][2:]
dist = round(geopy.distance.distance(coords_1, coords_2).km, 2)
dist_mx[sensor_id_to_ind[id1], sensor_id_to_ind[id2]] = dist
dist_mx[sensor_id_to_ind[id2], sensor_id_to_ind[id1]] = dist
else:
continue
return sensor_ids, sensor_id_to_ind, dist_mx
def get_undirect_adjacency_matrix(dist_mx, k):
"""
Compute the undirected adjacency matrix with the formula given by ChebyNet:
- https://github.com/hazdzz/STGCN/issues/8
- https://github.com/mdeff/cnn_graph/blob/c4d2c75d1807a1d1189b84bd6f4a0aafca5b8c53/lib/graph.py#L57
:param dist_mx: exact spatial distance matrix, [num_sensor, num_sensors].
:param k: entries larger than k are set to zero for sparsity, or knn: entries far away from k-neareast-neighbor are set to zero for sparsity.
:return: weighted undirected adjacency matrix
"""
sigma2 = np.std(dist_mx)**2
#sigma2 = np.mean(dist_mx)**2
dist_mx = dist_mx * (dist_mx < k)
print(dist_mx.shape)
W = np.exp(-dist_mx**2 / sigma2) #the diagonal is set to 1
return W |
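One subtlety in `get_undirect_adjacency_matrix`: zeroing distances before the kernel maps those entries to exp(0) = 1, the same weight as the diagonal, so the thresholding as written does not actually sparsify W. A sketch that applies the mask after the kernel instead (a common fix; the function name and example values here are illustrative):

```python
import numpy as np

def gaussian_kernel_adjacency(dist_mx, k):
    """Weighted undirected adjacency: W = exp(-d^2 / sigma^2),
    with entries at distance >= k masked out *after* the kernel."""
    sigma2 = np.std(dist_mx) ** 2
    W = np.exp(-dist_mx ** 2 / sigma2)   # diagonal (d = 0) maps to 1
    W[dist_mx >= k] = 0.0                # sparsify without inflating far pairs
    return W

D = np.array([[0.0, 1.0, 5.0],
              [1.0, 0.0, 2.0],
              [5.0, 2.0, 0.0]])
W = gaussian_kernel_adjacency(D, k=4.0)
# the (0, 2) pair at distance 5 is masked out; nearby pairs keep weights in (0, 1)
```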
def ht_inv(mat: np.ndarray) -> np.ndarray:
mat_inv = np.zeros_like(mat)
mat_inv[..., 3, 3] = 1
mat_inv[..., :3, :3] = np.swapaxes(mat[..., :3, :3], -1, -2)
mat_inv[..., :3, 3] = np.squeeze(-mat_inv[..., :3, :3] @ mat[..., :3, 3, np.newaxis])
return mat_inv |
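`ht_inv` uses the closed form for inverting a rigid homogeneous transform: the rotation block is orthogonal, so its inverse is its transpose, and the translation becomes -Rᵀt. A self-contained check that the inverse composes to the identity (the example transform is illustrative):

```python
import numpy as np

def ht_inv(mat: np.ndarray) -> np.ndarray:
    """Invert a 4x4 homogeneous rigid transform: R^-1 = R^T, t' = -R^T t."""
    mat_inv = np.zeros_like(mat)
    mat_inv[..., 3, 3] = 1
    mat_inv[..., :3, :3] = np.swapaxes(mat[..., :3, :3], -1, -2)
    mat_inv[..., :3, 3] = np.squeeze(-mat_inv[..., :3, :3] @ mat[..., :3, 3, np.newaxis])
    return mat_inv

# 90-degree rotation about z plus a translation
T = np.array([[0.0, -1.0, 0.0, 1.0],
              [1.0,  0.0, 0.0, 2.0],
              [0.0,  0.0, 1.0, 3.0],
              [0.0,  0.0, 0.0, 1.0]])
```

Because the formula relies on Rᵀ = R⁻¹, it is only valid when the upper-left 3x3 block is a pure rotation; a transform with scale or shear would need a general matrix inverse.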
def ClippedAdam(optim_args):
return PyroOptim(pt_ClippedAdam, optim_args) |
def _location_canonical_name(location):
location = _location_normalize(location)
if location.lower() in LOCATION_ALIAS:
return LOCATION_ALIAS[location.lower()]
return location |
#ifndef CAMERA_HEADER
#define CAMERA_HEADER
#include <GLFW/glfw3.h>
#include <glm/glm.hpp>
#include <glm/gtx/transform.hpp>
#include "../engine/entities/entity.h"
class Camera
{
public:
explicit Camera(glm::vec2 mousePosition)
: m_up(0.0f, 1.0f, 0.0f)
{
m_oldMousePosition = mousePosition;
}
public:
void Bind(Entity* ent)
{
m_boundEnt = ent;
}
glm::mat4 ViewMat(void)
{
return glm::lookAt(m_boundEnt->Position(), m_boundEnt->Position() + *(m_boundEnt->ViewDirection()), m_up);
}
void Look(glm::vec2 newMousePosition)
{
if (m_oldMousePosition != newMousePosition)
{
glm::vec2 mouseDelta = newMousePosition - m_oldMousePosition;
*(m_boundEnt->ViewDirection()) = glm::mat3(glm::rotate(glm::radians(-mouseDelta.x) * 0.02f, m_up)) * *(m_boundEnt->ViewDirection());
glm::vec3 toRotateAround = glm::cross(*(m_boundEnt->ViewDirection()), m_up);
*(m_boundEnt->ViewDirection()) = glm::mat3(glm::rotate(glm::radians(-mouseDelta.y) * 0.02f, toRotateAround))
* *(m_boundEnt->ViewDirection());
m_oldMousePosition = newMousePosition;
}
}
private:
Entity* m_boundEnt;
const glm::vec3 m_up;
glm::vec2 m_oldMousePosition;
};
#endif |
// GetPhysicalScan returns PhysicalBlockScan for the LogicalBlockScan.
func (s *LogicalBlockScan) GetPhysicalScan(schemaReplicant *memex.Schema, stats *property.StatsInfo) *PhysicalBlockScan {
ds := s.Source
ts := PhysicalBlockScan{
Block: ds.blockInfo,
DeferredCausets: ds.DeferredCausets,
BlockAsName: ds.BlockAsName,
DBName: ds.DBName,
isPartition: ds.isPartition,
physicalBlockID: ds.physicalBlockID,
Ranges: s.Ranges,
AccessCondition: s.AccessConds,
}.Init(s.ctx, s.blockOffset)
ts.stats = stats
ts.SetSchema(schemaReplicant.Clone())
if ts.Block.PKIsHandle {
if pkDefCausInfo := ts.Block.GetPkDefCausInfo(); pkDefCausInfo != nil {
if ds.statisticBlock.DeferredCausets[pkDefCausInfo.ID] != nil {
ts.Hist = &ds.statisticBlock.DeferredCausets[pkDefCausInfo.ID].Histogram
}
}
}
return ts
} |
The SDCC sent out the following emergency press release after a very homophobic mailer dropped yesterday.
Seattle, WA — November 2 – Yesterday, the day before Election Day, Senate Republicans hit the 48th Legislative District with a mail piece attacking Senator Rodney Tom for his support of equality and Washington’s LGBT community. Paid for through several political committees by the Senate Republican Campaign Committee, the piece cites Senator Tom’s October 20 interview with Publicola and his support of marriage equality for gay and lesbian couples and then lists perceived consequences this would have for straight couples. These include the, “End of no-fault divorce…[the] repeal of custody rights…[and] no more community property.†Senate Majority Leader Lisa Brown decries the mailer as, “A last ditch effort on Mike Hewitt’s part with little foundation in reality. It implies that if you are in favor of gay rights, then you are against women’s rights, which is false. In addition, it implies that Rodney is against women’s rights and that he’s harassed women, both of which are completely untrue. It’s genuinely repulsive and represents a new low in Washington State politics.†This isn’t the first time Republican leadership has deceived the electorate with an outright lie regarding a Senator’s position this election season. Two weeks ago, the Spokesman Review decried a television ad against incumbent Senator Chris Marr. The ad said that Marr had made cruel remarks to female employees at his place of business. Appalled by the blatant lies in the ad, the woman quoted came forward of her own will refuting any inappropriate activity. The Spokeman Review saw the ad for what it was, “Scumbaggy.†Sorely embarrassed, funders of the Leadership Council demanded that Senate Republicans pull the ad, which he did. This mail piece is just as deplorable and just as disgusting. 
Chris Gregorich of the Senate Democratic Campaign Committee said, “This is the most nonsensical political ad of the entire election cycle, and it’s very telling that the Republican leadership would release it the day before Election Day. If such a blatantly homophobic and embarrassing ad were sent out any earlier, there would be a media firestorm in response. They are so desperate for a majority, they are willing to use hate, intolerance, and misinformation to mislead Washington voters.”
Josh Friedes, Executive Director of Equal Rights Washington, told me via e-mail, “Rodney Tom has championed basic civil rights for Washington’s gay and lesbian families. Nothing should anger fair-minded voters more than when people attack a legislator by telling voters that a legislator’s support for gay rights will hurt women. It’s the most absurd claim out there, and it is a blatant lie. But this is exactly the attack being levied against State Senator Rodney Tom in the 48th district. And of course Tom’s opponents know that their mailer is false; that’s why they waited until the last minute, so that the pro-choice community and gay rights community, which overlap considerably, would not have time to respond. But voters in the 48th can respond by reminding each other of the importance of voting for Rodney Tom.
“The content of the mailer itself ironically demonstrates the support that exists in the 48th for LGBT civil rights. Years ago, hatemongers could argue with some degree of success that one should not support a candidate because they support gay rights, but times have changed; today the majority of voters in Washington State support gay rights, so the arguments used by the right are changing. The right today is trying to make the ridiculous claim that supporting gay rights will hurt women or other historically discriminated-against groups. What the right doesn’t want the people of the 48th district to know is that last year it was the pro-choice community and over 30 organizations representing communities of color who helped propel Referendum 71 to victory. Referendum 71 upheld the state’s domestic partnership law, a law Rodney Tom supported and a law supported by the people of the 48th.” |
/**
* Returns whether the given digit sequence is in one of the ranges specified by this instance.
* This is more efficient than obtaining the associated {@code RangeSet} and checking that.
*/
public boolean matches(DigitSequence digits) {
if (digits.length() != length()) {
return false;
}
for (int n = 0; n < length(); n++) {
if ((bitmasks.charAt(n) & (1 << digits.getDigit(n))) == 0) {
return false;
}
}
return true;
} |
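The loop above tests each digit against a per-position bitmask. A minimal Python sketch of the same idea (the mask values below are invented for illustration):

```python
def matches(masks, digits):
    """Return True if every digit is allowed at its position.

    masks:  list of ints; bit d of masks[n] is set iff digit d may appear
            at position n (mirrors bitmasks.charAt(n) in the Java version).
    digits: string of decimal digits; must have the same length as masks.
    """
    if len(digits) != len(masks):
        return False
    return all(masks[n] & (1 << int(d)) for n, d in enumerate(digits))

# Hypothetical plan: first digit must be 1-2, second digit anything, third digit 0.
example_masks = [0b0000000110, 0b1111111111, 0b0000000001]
```

Bit d of `masks[n]` being set means digit d is allowed at position n, so a full match reduces to one AND per position.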
package com.xixi.approval.myapproval.enums;
/**
* @author shengchengchao
* @Description
* @createTime 2021/4/26
*/
public enum StatusEnum {
/**
* A simple node on which approval actions can currently be taken is in the READY state
*/
READY("READY","审批"),
/**
* Approval is complete
*/
NORMAL("COMPLETE","审批完成"),
/**
* State of a node that has not been reached yet
*/
FUTURE("FUTURE","待审批"),
/**
* Only composite nodes have this state; it means the node is waiting for its child nodes to be approved
*/
WAITING("WAITING","子节点待审批"),
/**
* Rolled back, i.e. the approval was rejected
*/
ROLLBACK("ROLLBACK","驳回"),
/**
* Only composite nodes have this state; under a concurrent node, once one child node passes approval, the remaining child nodes are marked as skipped
*/
SKIP("SKIP","跳过")
;
private String status;
private String statusCn;
public String getStatus() {
return status;
}
public String getStatusCn() {
return statusCn;
}
StatusEnum(String status, String statusCn) {
this.status = status;
this.statusCn = statusCn;
}
}
|
The last barricades come down Thursday on North Texas’ newest and flashiest roadway.
We’ve known it for years as I-635, LBJ and the more formal LBJ Freeway. But the “free” doesn’t entirely apply anymore to the new stretch across North Dallas.
Get used to the monikers LBJ Express and LBJ TEXpress. And get ready for a driving experience that reflects Dallas’ reputation for glitz.
This is the rare roadway that comes with an online instruction manual. It’s highly advisable to read the long list of FAQs to comprehend the ins and outs of this multibillion-dollar extravaganza.
Consider: There’s the free portion of the new LBJ, and there’s the pay portion partially hidden from overhead view in the form of six sunken toll lanes. You can’t jump on and off just anywhere.
The pay lanes cost as little as 10 cents a mile or as much as 75 cents a mile, depending on traffic and time of day. Prices can change by the minute, but motorists “lock in” the price displayed on message boards when they enter the toll lanes.
Wait, there’s more: Drivers get a discount if they use a TollTag. Carpooling TollTaggers get an extra discount if they pre-register online.
Heading downtown from LBJ? Take the free exit ramps south onto I-35E, or use the pay ramps that take you halfway to the center city. Were the express lanes worth the cost? Try it and be the judge.
North Dallas might be justified in the euphoria of this final segment opening up to traffic, after decades of planning and four years of construction mess. But there is no joy immediately to the east, in Northeast Dallas, Garland and Mesquite, where the original, 1970s-vintage LBJ East still awaits financing and a final plan for improvement. Those communities are justified in their sense of impatience.
What brought North Dallas its reincarnated LBJ was a sense of urgency despite the high cost and the mere trickle of funding available from Austin. That hatched an innovative public-private partnership with Spanish infrastructure investors who bid for 52 years of tolling rights on LBJ’s new pay lanes. Foreign money was largely responsible for accomplishing what Texas motor fuels taxes couldn’t afford.
Now, progress on LBJ East is caught in a tug-of-war between locally elected officials and a delegation of anti-toll state lawmakers; the legislators have drawn a line in the sand against incorporating any new toll lanes in the design of a widened LBJ East.
We’ll side with the local officials on this one. Tolls may not always be the perfect financing for roadways, and highways may not solve every mobility need. But if tolling can help accelerate a project, and if drivers have a choice between pay and free lanes, lawmakers should not be so eager to rule out financing options. Just ask drivers zipping down the new pay lanes in North Dallas.
The LBJ Express
Length: 10.7 miles on I-635, 5.8 miles on I-35E
Construction cost: $2.6 billion
Lanes: Four free lanes in each direction on I-635, plus continuous frontage roads; three toll lanes in each direction
Financing: $490 million from TxDOT and other public funds; $664 million from outside developers LBJ Infrastructure Group; $1.5 billion in federal loans that developers must repay
LBJ Infrastructure Group: Cintra U.S., the American arm of a Spanish infrastructure giant; Meridiam Infrastructure Finance, global investor in public works; Dallas Police & Fire Pension System
Lease terms: Developers have tolling rights for 52 years and are responsible for maintenance and returning roadway in good condition
Tolling operations: Provided by North Texas Tollway Authority
SOURCE: DMN research |
use crate::components;
use crate::game;
use crate::map;
use crate::player::character;
use log;
use rltk::{Point, Rltk, VirtualKeyCode};
use specs;
use specs::prelude::*;
use std::cmp::{max, min};
pub fn try_move(delta_x: i32, delta_y: i32, ecs: &mut specs::World) {
let mut positions = ecs.write_storage::<components::Position>();
let players = ecs.read_storage::<components::Player>();
let mut viewsheds = ecs.write_storage::<components::Viewshed>();
let entities = ecs.entities();
let combat_stats = ecs.read_storage::<components::CombatStats>();
let game_map = ecs.fetch::<map::Map>();
let mut wants_to_melee = ecs.write_storage::<components::WantsToMelee>();
for (entity, _player, pos, viewshed) in
(&entities, &players, &mut positions, &mut viewsheds).join()
{
if pos.x + delta_x < 1
|| pos.x + delta_x > game_map.width - 1
|| pos.y + delta_y < 1
|| pos.y + delta_y > game_map.height - 1
{
return;
}
let destination_idx = game_map.xy_idx(pos.x + delta_x, pos.y + delta_y);
for potential_target in game_map.tile_content[destination_idx].iter() {
let target = combat_stats.get(*potential_target);
if let Some(_target) = target {
wants_to_melee
.insert(
entity,
components::WantsToMelee {
target: *potential_target,
},
)
.expect("Add target failed");
return;
}
}
if !game_map.blocked[destination_idx] {
pos.x = min(game_map.width - 1, max(0, pos.x + delta_x));
pos.y = min(game_map.height - 1, max(0, pos.y + delta_y));
viewshed.dirty = true;
let mut ppos = ecs.write_resource::<Point>();
ppos.x = pos.x;
ppos.y = pos.y;
}
}
}
pub fn input(gs: &mut game::state::State, ctx: &mut Rltk) -> game::state::RunState {
// Player movement
match ctx.key {
None => return game::state::RunState::AwaitingInput, // Nothing happened
Some(key) => match key {
// Movement
VirtualKeyCode::Left | VirtualKeyCode::A | VirtualKeyCode::Key4 => {
try_move(-1, 0, &mut gs.ecs)
}
VirtualKeyCode::Right | VirtualKeyCode::D | VirtualKeyCode::Key6 => {
try_move(1, 0, &mut gs.ecs)
}
VirtualKeyCode::Up | VirtualKeyCode::W | VirtualKeyCode::Key8 => {
try_move(0, -1, &mut gs.ecs)
}
VirtualKeyCode::Down | VirtualKeyCode::S | VirtualKeyCode::Key2 => {
try_move(0, 1, &mut gs.ecs)
}
VirtualKeyCode::Key7 | VirtualKeyCode::Q => try_move(-1, -1, &mut gs.ecs),
VirtualKeyCode::Key9 | VirtualKeyCode::E => try_move(1, -1, &mut gs.ecs),
VirtualKeyCode::Key3 | VirtualKeyCode::C => try_move(1, 1, &mut gs.ecs),
VirtualKeyCode::Key1 | VirtualKeyCode::Z => try_move(-1, 1, &mut gs.ecs),
// Items management
VirtualKeyCode::P => character::get_item(&mut gs.ecs), // pick-up
VirtualKeyCode::I => return game::state::RunState::ShowInventory,
VirtualKeyCode::L => return game::state::RunState::ShowDropItem, // let-go
// Main menu
VirtualKeyCode::Escape => return game::state::RunState::ShowMainMenu,
VirtualKeyCode::Space => {
log::debug!("Pausing game ...");
return game::state::RunState::Paused;
}
_ => {
log::debug!("Got user input: {:?}", key);
return game::state::RunState::AwaitingInput;
}
},
}
game::state::RunState::PlayerTurn
}
|
/*
* @(#)$Id$
*
* Author : <NAME>, <EMAIL>
*
*
* Purpose :
*
* -----------------------------------------------------------------------
*
* Revision Information:
*
* Date Who Reason
*
* 23.03.2008 ukurmann Initial Release
*
* -----------------------------------------------------------------------
*
* Copyright 1999-2009 ETH Zurich. All Rights Reserved.
*
* This software is the proprietary information of ETH Zurich.
* Use is subject to license terms.
*
*/
package org.ximtec.igesture.tool.view.testset;
import java.beans.PropertyChangeEvent;
import java.util.logging.Logger;
import org.ximtec.igesture.tool.core.Controller;
import org.ximtec.igesture.tool.core.DefaultController;
import org.ximtec.igesture.tool.core.EdtProxy;
import org.ximtec.igesture.tool.core.TabbedView;
import org.ximtec.igesture.tool.explorer.ExplorerTreeController;
import org.ximtec.igesture.tool.explorer.ExplorerTreeModel;
import org.ximtec.igesture.tool.util.NodeInfoFactory;
import org.ximtec.igesture.tool.view.MainModel;
public class TestSetController extends DefaultController {
private static final Logger LOGGER = Logger.getLogger(TestSetController.class
.getName());
private ITestSetView testSetView;
private MainModel mainModel;
private ExplorerTreeController explorerTreeController;
public TestSetController(Controller parentController) {
super(parentController);
mainModel = getLocator().getService(
MainModel.IDENTIFIER, MainModel.class);
initController();
}
private void initController() {
testSetView = EdtProxy.newInstance(new TestSetView(this), ITestSetView.class);
ExplorerTreeModel explorerModel = new ExplorerTreeModel(mainModel
.getTestSetList(), NodeInfoFactory.createTestSetNodeInfo(this));
explorerTreeController = new ExplorerTreeController(this, testSetView,
explorerModel);
addController(explorerTreeController);
}
@Override
public TabbedView getView() {
return testSetView;
}
@Override
public void propertyChange(PropertyChangeEvent evt) {
LOGGER.info("PropertyChange");
super.propertyChange(evt);
explorerTreeController.getExplorerTreeView().refresh();
}
}
|
// This test verifies that an upstream connection failure during ask redirection processing is
// handled correctly. In this case the "asking" command and original client request have been sent
// to the target server, and then the connection is closed. The fake Redis client should receive an
// upstream failure error in response to its request.
TEST_P(RedisProxyWithRedirectionIntegrationTest, ConnectionFailureBeforeAskingResponse) {
initialize();
std::string request = makeBulkStringArray({"get", "foo"});
std::stringstream redirection_error;
redirection_error << "-ASK 1111 " << redisAddressAndPort(fake_upstreams_[1]) << "\r\n";
std::string proxy_to_server;
IntegrationTcpClientPtr redis_client = makeTcpConnection(lookupPort("redis_proxy"));
redis_client->write(request);
FakeRawConnectionPtr fake_upstream_connection_1, fake_upstream_connection_2;
EXPECT_TRUE(fake_upstreams_[0]->waitForRawConnection(fake_upstream_connection_1));
EXPECT_TRUE(fake_upstream_connection_1->waitForData(request.size(), &proxy_to_server));
EXPECT_EQ(request, proxy_to_server);
proxy_to_server.clear();
EXPECT_TRUE(fake_upstream_connection_1->write(redirection_error.str()));
EXPECT_TRUE(fake_upstreams_[1]->waitForRawConnection(fake_upstream_connection_2));
std::string asking_request = makeBulkStringArray({"asking"});
EXPECT_TRUE(fake_upstream_connection_2->waitForData(asking_request.size() + request.size(),
&proxy_to_server));
EXPECT_EQ(asking_request + request, proxy_to_server);
EXPECT_TRUE(fake_upstream_connection_2->close());
std::stringstream error_response;
error_response << "-" << RedisCmdSplitter::Response::get().UpstreamFailure << "\r\n";
redis_client->waitForData(error_response.str());
EXPECT_EQ(error_response.str(), redis_client->data());
redis_client->close();
EXPECT_TRUE(fake_upstream_connection_1->close());
} |
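The `-ASK 1111 host:port` reply fabricated above follows the Redis Cluster redirection convention. As a rough sketch (not Envoy's actual implementation), a client-side classifier for such replies might look like:

```python
def parse_redirection(error_line):
    """Classify a RESP error reply from Redis Cluster.

    Returns (kind, slot, "host:port") for '-MOVED'/'-ASK' redirections,
    or None for any other reply.
    """
    if not error_line.startswith("-"):
        return None
    parts = error_line[1:].rstrip("\r\n").split(" ")
    if len(parts) == 3 and parts[0] in ("MOVED", "ASK"):
        return parts[0], int(parts[1]), parts[2]
    return None
```

An ASK redirection obliges the client (or proxy) to send `ASKING` followed by the original command to the target node, which is exactly the byte sequence the test asserts on.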
An Analysis of Saturated Critical Heat Flux in Micro/Mini-Channels
This paper verified the macro-to-micro-scale transitional criterion Bo·Re_l^0.5 = 200 proposed by Li and Wu, because data points with Bo·Re_l^0.5 ≤ 200 and those with Bo·Re_l^0.5 > 200 show very different trends across the entire database (1,672 data points). For the 859 data points with Bo·Re_l^0.5 ≤ 200, the boiling number at CHF decreases sharply with the length-to-diameter ratio L_h/d_he when L_h/d_he is small, while L_h/d_he has a negligible effect on the boiling number when L_h/d_he > 150. For the region where L_h/d_he ≤ 150 and Bo·Re_l^0.5 ≤ 200, a simple saturated CHF correlation was proposed using the boiling number, the length-to-diameter ratio, and the exit quality. The heated length and the heated equivalent diameter were adopted in the length-to-diameter ratio, reflecting the actual heat transfer conditions. A combined dimensionless number We_m·Ca_l^0.8 was introduced to correlate the micro/mini-channel database for the region L_h/d_he > 150 and Bo·Re_l^0.5 ≤ 200. The new method predicts the overall micro/mini-channel database accurately on the whole: it captures almost 95.5% of the non-aqueous data and 93.5% of the water data within the ±30% error band. Copyright © 2010 by ASME |
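The Li–Wu transitional criterion is a one-line check on two dimensionless groups. A minimal sketch (the input values below are illustrative, not data from the paper's database):

```python
def flow_regime(bo, re_l, threshold=200.0):
    """Classify a CHF data point with the Li-Wu transitional criterion.

    bo:   boiling number at CHF (dimensionless)
    re_l: all-liquid Reynolds number (dimensionless)
    A point is treated as micro/mini-scale when bo * re_l**0.5 <= threshold.
    """
    return "micro/mini" if bo * re_l ** 0.5 <= threshold else "macro"
```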
/*
* Copyright (C) 2012-2014 Open Source Robotics Foundation
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
#include "gazebo/physics/Road.hh"
#include "test/util.hh"
using namespace gazebo;
class RoadTest : public gazebo::testing::AutoLogFixture { };
/////////////////////////////////////////////////
TEST_F(RoadTest, Texture)
{
std::ostringstream roadStr;
roadStr << "<sdf version='" << SDF_VERSION << "'><road name='my_road'>"
<< "<width>5</width>"
<< "<point>0 0 0</point>"
<< "<point>25 0 0</point>"
<< "<material>"
<< "<script>"
<< "<name>primary</name>"
<< "</script>"
<< "</material>"
<< "</road></sdf>";
sdf::ElementPtr roadSDF(new sdf::Element);
sdf::initFile("road.sdf", roadSDF);
sdf::readString(roadStr.str(), roadSDF);
physics::RoadPtr road(
new physics::Road(physics::BasePtr()));
sdf::ElementPtr scriptElem = roadSDF->GetElement(
"material")->GetElement("script");
ASSERT_TRUE(scriptElem);
road->Load(roadSDF);
EXPECT_STREQ(road->GetName().c_str(), "my_road");
EXPECT_STREQ(road->GetSDF()->GetElement("material")->
GetElement("script")->Get<std::string>("name").c_str(), "primary");
scriptElem->GetElement("name")->Set("lanes_2");
road->Load(roadSDF);
EXPECT_STREQ(road->GetSDF()->GetElement("material")->
GetElement("script")->Get<std::string>("name").c_str(), "lanes_2");
scriptElem->GetElement("name")->Set("motorway");
road->Load(roadSDF);
EXPECT_STREQ(road->GetSDF()->GetElement("material")->
GetElement("script")->Get<std::string>("name").c_str(), "motorway");
}
/////////////////////////////////////////////////
int main(int argc, char **argv)
{
::testing::InitGoogleTest(&argc, argv);
return RUN_ALL_TESTS();
}
|
// decodedAPIError decodes and returns the error message from the API response.
// If the message is blank, it returns a fallback message with the status code.
func decodedAPIError(resp *http.Response) error {
var apiError struct {
Error struct {
Type string `json:"type"`
Message string `json:"message"`
PossibleTrackIDs []string `json:"possible_track_ids"`
} `json:"error,omitempty"`
}
if err := json.NewDecoder(resp.Body).Decode(&apiError); err != nil {
return fmt.Errorf("failed to parse API error response: %s", err)
}
if apiError.Error.Message != "" {
if apiError.Error.Type == "track_ambiguous" {
return fmt.Errorf(
"%s: %s",
apiError.Error.Message,
strings.Join(apiError.Error.PossibleTrackIDs, ", "),
)
}
return fmt.Errorf("%s", apiError.Error.Message)
}
return fmt.Errorf("unexpected API response: %d", resp.StatusCode)
} |
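The precedence in decodedAPIError (decoded message first, a special case for ambiguous tracks, then a status-code fallback) can be sketched in Python; the field names mirror the Go struct above:

```python
def api_error_message(payload, status_code):
    """Pick the most useful error message from a decoded API payload.

    payload:     dict decoded from the response body (may lack 'error').
    status_code: HTTP status, used only as a last-resort fallback.
    """
    err = (payload or {}).get("error") or {}
    message = err.get("message", "")
    if message:
        if err.get("type") == "track_ambiguous":
            return "%s: %s" % (message, ", ".join(err.get("possible_track_ids", [])))
        return message
    return "unexpected API response: %d" % status_code
```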
/**
* @return the edge punctuation of the default
* {@link EdgePunctuationPatternProvider}
*/
public static String edgePuncPattern() {
return new EdgePunctuationPatternProvider(new PunctuationPatternProvider()) {
@Override
public String correctEdges(String s) {
return null;
}
@Override
public String patternString() {
return null;
}
}.EdgePunct;
} |
// MapToSubFiles maps a key-generating function across an entire geobuf file, adding each feature to the sub-file(s) named by its keys.
func (splitter *Splitter) MapToSubFiles(myfunc func(feature *geojson.Feature) []string) {
i := 0
s := time.Now()
for splitter.Reader.Next() {
feature := splitter.Reader.Feature()
keys := myfunc(feature)
for _, key := range keys {
splitter.AddFeature(key, feature)
}
i++
if i%1000 == 0 {
fmt.Printf("\r%d Features Split in %s", i, time.Since(s))
}
}
fmt.Println()
} |
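The split itself is a one-to-many grouping of features by key. A Python sketch of the same shape (the feature layout and key function here are invented):

```python
def map_to_subfiles(features, key_func):
    """Bucket features by every key the key function yields for them.

    A feature lands in several buckets when key_func returns several keys,
    mirroring the repeated AddFeature calls in the Go version.
    """
    buckets = {}
    for feature in features:
        for key in key_func(feature):
            buckets.setdefault(key, []).append(feature)
    return buckets
```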
// PostChallenge is the method handler of "POST /challenges"
func PostChallenge(c echo.Context) error {
me := c.Get("me").(*model.User)
req := &challengeJSON{}
if err := c.Bind(req); err != nil {
return echo.NewHTTPError(http.StatusBadRequest, fmt.Sprintf("failed to bind request body: %v", err))
}
captions, penalties := make([]string, len(req.Hints)), make([]int, len(req.Hints))
for _, _hintJSON := range req.Hints {
idSplit := strings.Split(_hintJSON.ID, ":")
i, _ := strconv.Atoi(idSplit[1])
captions[i] = _hintJSON.Caption
penalties[i] = _hintJSON.PenaltyPercent
}
flags, scores := make([]string, len(req.Flags)), make([]int, len(req.Flags))
for _, _flagJSON := range req.Flags {
idSplit := strings.Split(_flagJSON.ID, ":")
i, _ := strconv.Atoi(idSplit[1])
flags[i] = _flagJSON.Flag
scores[i] = _flagJSON.Score
}
challenge, err := model.NewChallenge(req.Genre, req.Name, req.Author.ID, req.Score, req.Caption, captions, penalties, flags, scores, req.Answer)
if err != nil {
return echo.NewHTTPError(http.StatusInternalServerError, err.Error())
}
solvedMap, openedMap, foundMap := makeSolvedOpenedFoundMaps(me)
json := newChallengeJSON(me, challenge, solvedMap, openedMap, foundMap)
c.Response().Header().Set(echo.HeaderLocation, os.Getenv("API_URL_PREFIX")+"/challenges/"+challenge.ID)
return c.JSON(http.StatusCreated, json)
} |
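The handler reorders hints and flags by the integer after the colon in each ID (e.g. `hint:2` goes to slot 2). The same trick as a small Python sketch (field names are illustrative):

```python
def order_by_id_index(items, field):
    """Place each item's field value at the slot encoded in its ID.

    items: dicts with an 'id' like 'hint:2'; the integer after the colon
    is the destination index (mirrors strings.Split + strconv.Atoi above).
    """
    out = [None] * len(items)
    for item in items:
        out[int(item["id"].split(":")[1])] = item[field]
    return out
```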
/**
* This class implements a platform independent I2C LED panel. This class is intended to be extended by a platform
* dependent I2C LED panel which provides the abstract methods required by this class. This class provides the APIs
* to assemble command requests and send them over to the panel asynchronously.
*/
public abstract class TrcI2cLEDPanel
{
protected static final String moduleName = "TrcI2cLEDPanel";
protected static final boolean debugEnabled = false;
protected static final boolean tracingEnabled = false;
protected static final boolean useGlobalTracer = false;
protected static final TrcDbgTrace.TraceLevel traceLevel = TrcDbgTrace.TraceLevel.API;
protected static final TrcDbgTrace.MsgLevel msgLevel = TrcDbgTrace.MsgLevel.INFO;
protected TrcDbgTrace dbgTrace = null;
private static final int I2C_BUFF_LEN = 32;
/**
* This method writes the data buffer to the device asynchronously.
*
* @param data specifies the data buffer.
*/
public abstract void asyncWriteData(byte[] data);
private final String instanceName;
/**
* Constructor: Creates an instance of the object.
*
* @param instanceName specifies the instance name.
*/
public TrcI2cLEDPanel(final String instanceName)
{
if (debugEnabled)
{
dbgTrace = useGlobalTracer?
TrcDbgTrace.getGlobalTracer():
new TrcDbgTrace(moduleName + "." + instanceName, tracingEnabled, traceLevel, msgLevel);
}
this.instanceName = instanceName;
} //TrcI2cLEDPanel
/**
* This method returns the instance name.
*
* @return instance name.
*/
public String toString()
{
return instanceName;
} //toString
/**
* This method sets the specified line in the LED panel with all the text info for displaying text on the panel.
* Note that the (x, y) coordinates is rotation sensitive. If rotation is 0, the text orientation is normal
* horizontal and (0, 0) corresponds to the upper left corner of the physical panel. If rotation is 2, the
* text orientation is inverted horizontal and (0, 0) corresponds to the lower right corner of the physical
* panel.
*
* @param index specifies the line index of the array.
* @param x specifies the x coordinate of the upper left corner of the text rectangle.
* @param y specifies the y coordinate of the upper left corner of the text rectangle.
* @param fontColor specifies the font color for displaying the text.
* @param orientation specifies the text orientation (0: normal horizontal, 1: clockwise vertical,
* 2: inverted horizontal, 3: anti-clockwise vertical).
* @param fontSize specifies the size of the font (1: 6x8, 2: 12x16, 3: 18x24, 4: 24x32).
* @param scrollInc specifies the scroll increment (0: no scroll, 1: scroll to the right, -1: scroll to the left).
* @param text specifies the text string to be displayed.
*/
public void setTextLine(
int index, int x, int y, int fontColor, int orientation, int fontSize, int scrollInc, String text)
{
sendCommand("setTextLine " + index + " " + x + " " + y + " " + fontColor + " " + orientation + " " +
fontSize + " " + scrollInc + " " + text);
} //setTextLine
/**
* This method clears the specified line in the lines array.
*
* @param index specifies the line index of the array.
*/
public void clearTextLine(int index)
{
sendCommand("clearTextLine " + index);
} //clearTextLine
/**
* This method clears all text lines in the lines array.
*/
public void clearAllTextLines()
{
sendCommand("clearAllTextLines");
} //clearAllTextLines
/**
* This method sets the Arduino loop delay. This effectively controls how fast the text will scroll.
*
* @param delay specifies the delay in msec.
*/
public void setDelay(int delay)
{
sendCommand("setDelay " + delay);
} //setDelay
/**
* This method converts the specified RGB values into a 16-bit color value in 565 format (5-bit R, 6-bit G and
* 5-bit B: RRRRRGGGGGGBBBBB).
*
* @param red specifies the red value.
* @param green specifies the green value.
* @param blue specifies the blue value.
* @return 16-bit color value in 565 format.
*/
public int color(int red, int green, int blue)
{
final String funcName = "color";
int colorValue = ((red & 0xf8) << 8) | ((green & 0xfc) << 3) | (blue >> 3);
if (debugEnabled)
{
dbgTrace.traceEnter(funcName, TrcDbgTrace.TraceLevel.API, "red=%d,green=%d,blue=%d", red, green, blue);
dbgTrace.traceExit(funcName, TrcDbgTrace.TraceLevel.API, "=0x%x", colorValue);
}
return colorValue;
} //color
/**
* This method sends the command string to the I2C device. If the command string is longer than 32 bytes,
* it will break down the command string into multiple I2C requests so that they can be reassembled on the
* device side.
*
* @param command specifies the command string to be sent to the I2C device.
*/
private void sendCommand(String command)
{
final String funcName = "sendCommand";
if (debugEnabled)
{
dbgTrace.traceEnter(funcName, TrcDbgTrace.TraceLevel.FUNC, "command=%s", command);
}
command += "~";
int cmdLen = command.length();
if (debugEnabled)
{
dbgTrace.traceInfo(funcName, "sendCommand(%s)=%d", command, command.length());
}
for (int i = 0; i < cmdLen; )
{
int len = Math.min(cmdLen - i, I2C_BUFF_LEN);
if (len > 0)
{
byte[] data = command.substring(i, i + len).getBytes();
if (debugEnabled)
{
dbgTrace.traceInfo(funcName, "asyncWrite%s=%d", Arrays.toString(data), data.length);
}
asyncWriteData(data);
i += len;
}
}
if (debugEnabled)
{
dbgTrace.traceExit(funcName, TrcDbgTrace.TraceLevel.FUNC);
}
} //sendCommand
} |
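Two helpers in the class above are pure enough to check in isolation: the 565 color packing and the 32-byte command chunking. A Python sketch of both (the `~` terminator and the buffer length follow sendCommand and I2C_BUFF_LEN):

```python
I2C_BUFF_LEN = 32

def rgb565(red, green, blue):
    """Pack 8-bit R/G/B into the 16-bit 565 format: RRRRRGGGGGGBBBBB."""
    return ((red & 0xF8) << 8) | ((green & 0xFC) << 3) | (blue >> 3)

def chunk_command(command, buff_len=I2C_BUFF_LEN):
    """Append the '~' terminator and split into chunks of at most buff_len bytes."""
    command += "~"
    return [command[i:i + buff_len] for i in range(0, len(command), buff_len)]
```

The 565 packing keeps the 5 (or 6) most significant bits of each channel, so pure white maps to 0xFFFF and pure red to 0xF800.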
# utils/compile_contracts.py
import json
import subprocess
import os
from os import path
import shutil
import re

# A proof of concept / convenient script to quickly compile contracts and their go bindings.
# Can be run from the Makefile with make compile_contracts
solc_versions = ["v0.4", "v0.6", "v0.7"]
rootdir = "./artifacts/contracts/ethereum/"
targetdir = "./contracts/ethereum"

# The names of the contracts that we're actually compiling to use.
used_contract_names = [
    "APIConsumer",
    "BlockhashStore",
    "DeviationFlaggingValidator",
    "Flags",
    "FluxAggregator",
    "KeeperConsumer",
    "KeeperRegistry",
    "LinkToken",
    "MockETHLINKAggregator",
    "MockGASAggregator",
    "OffchainAggregator",
    "Oracle",
    "SimpleReadAccessController",
    "SimpleWriteAccessController",
    "UpkeepRegistrationRequests",
    "VRF",
    "VRFConsumer",
    "VRFCoordinator",
]

print("Locally installing hardhat...")
subprocess.run('npm install --save-dev hardhat', shell=True, check=True)

print("Modifying hardhat settings...")
with open("hardhat.config.js", "w") as hardhat_config:
    hardhat_config.write("""module.exports = {
  solidity: {
    compilers: [
      {
        version: "0.8.0",
        settings: {
          optimizer: {
            enabled: true,
            runs: 50
          }
        }
      },
      {
        version: "0.7.6",
        settings: {
          optimizer: {
            enabled: true,
            runs: 50
          }
        }
      },
      {
        version: "0.6.6",
        settings: {
          optimizer: {
            enabled: true,
            runs: 50
          }
        }
      },
      {
        version: "0.6.0",
        settings: {
          optimizer: {
            enabled: true,
            runs: 50
          }
        }
      },
      {
        version: "0.4.24",
        settings: {
          optimizer: {
            enabled: true,
            runs: 50
          }
        }
      }
    ]
  }
}""")

print("Compiling contracts...")
subprocess.run('npx hardhat compile', shell=True, check=True)

print("Creating contract go bindings...")
for version in solc_versions:
    for subdir, dirs, files in os.walk(rootdir + version):
        for f in files:
            if ".dbg." not in f:
                print(f)
                with open(subdir + "/" + f, "r") as compile_contract:
                    data = json.load(compile_contract)
                contract_name = data["contractName"]
                abi_name = targetdir + "/" + version + "/abi/" + contract_name + ".abi"
                with open(abi_name, "w") as abi_file:
                    abi_file.write(json.dumps(data["abi"], indent=2))
                bin_name = targetdir + "/" + version + "/bin/" + contract_name + ".bin"
                with open(bin_name, "w") as bin_file:
                    bin_file.write(str(data["bytecode"]))
                if contract_name in used_contract_names:
                    go_file_name = targetdir + "/" + contract_name + ".go"
                    subprocess.run("abigen --bin=" + bin_name + " --abi=" + abi_name + " --pkg=" + contract_name +
                                   " --out=" + go_file_name, shell=True, check=True)
                    # Replace the package name in the file; abigen doesn't let you specify it differently
                    with open(go_file_name, 'r+') as go_file:
                        text = go_file.read()
                        text = re.sub("package " + contract_name, "package ethereum", text)
                        go_file.seek(0)
                        go_file.write(text)
                        go_file.truncate()

print("Cleaning up Hardhat...")
subprocess.run('npm uninstall --save-dev hardhat', shell=True)
if path.exists("hardhat.config.js"):
    os.remove("hardhat.config.js")
if path.exists("package-lock.json"):
    os.remove("package-lock.json")
if path.exists("package.json"):
    os.remove("package.json")
if path.exists("node_modules/"):
    shutil.rmtree("node_modules/")
if path.exists("artifacts/"):
    shutil.rmtree("artifacts/")
if path.exists("cache/"):
    shutil.rmtree("cache/")
print("Done!") |
def iter_changes(self):
    while not self._disposed.is_set():
        initial_time = time.time()

        old_visit_info = self._single_visit_info
        old_file_to_mtime = old_visit_info.file_to_mtime
        changes = []
        append_change = changes.append

        self._single_visit_info = single_visit_info = _SingleVisitInfo()
        for path_watcher in self._path_watchers:
            path_watcher._check(single_visit_info, append_change, old_file_to_mtime)

        # Whatever remained from the previous visit was not seen again, so it was deleted.
        for entry in old_file_to_mtime:
            append_change((Change.deleted, entry))

        for change in changes:
            yield change

        actual_time = (time.time() - initial_time)
        if self.print_poll_time:
            print('--- Total poll time: %.3fs' % actual_time)

        if actual_time > 0:
            if self.target_time_for_single_scan <= 0.0:
                for path_watcher in self._path_watchers:
                    path_watcher.sleep_time = 0.0
            else:
                # Clamp the correction factor so sleep times drift slowly in the
                # right direction (e.g. a machine waking from sleep should not
                # skew the values).
                perc = self.target_time_for_single_scan / actual_time
                if perc > 2.:
                    perc = 2.
                elif perc < 0.5:
                    perc = 0.5

                for path_watcher in self._path_watchers:
                    if path_watcher.sleep_time <= 0.0:
                        path_watcher.sleep_time = 0.001
                    new_sleep_time = path_watcher.sleep_time * perc
                    diff_sleep_time = new_sleep_time - path_watcher.sleep_time
                    path_watcher.sleep_time += (diff_sleep_time / (3.0 * len(self._path_watchers)))

                    if actual_time > 0:
                        self._disposed.wait(actual_time)

                    if path_watcher.sleep_time < 0.001:
                        path_watcher.sleep_time = 0.001

        diff = self.target_time_for_notification - actual_time
        if diff > 0.:
            self._disposed.wait(diff) |
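The sleep-time adjustment above moves each watcher a fraction of the way toward the target ratio, with the ratio clamped to [0.5, 2.0] and a 1 ms floor. One step isolated as a pure function (a sketch of the logic, not the original API):

```python
def adjust_sleep_time(sleep_time, target_time, actual_time, n_watchers):
    """One smoothing step of the poll-loop sleep time.

    The target/actual ratio is clamped to [0.5, 2.0] so an outlier scan
    (e.g. after machine sleep) cannot skew the value, and the result never
    drops below 1 ms.
    """
    if actual_time <= 0:
        return sleep_time
    if target_time <= 0.0:
        return 0.0
    perc = min(2.0, max(0.5, target_time / actual_time))
    sleep_time = max(sleep_time, 0.001)
    # Move only 1/(3 * n_watchers) of the way toward the scaled value.
    sleep_time += (sleep_time * perc - sleep_time) / (3.0 * n_watchers)
    return max(sleep_time, 0.001)
```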
// Maxopoly/cards-database
import { Card } from '../../../interfaces'
import Set from '../Vivid Voltage'
const card: Card = {
name: {
en: "Phanpy",
fr: "Phanpy",
es: "Phanpy",
it: "Phanpy",
pt: "Phanpy",
de: "Phanpy"
},
illustrator: "Shibuzoh.",
rarity: "Common",
category: "Pokemon",
set: Set,
hp: 70,
types: [
"Fighting",
],
attacks: [
{
cost: [
"Fighting",
],
name: {
en: "Stampede",
fr: "Ruée",
es: "Estampida",
it: "<NAME>",
pt: "Estouro",
de: "Zertrampeln"
},
damage: 10,
},
{
cost: [
"Fighting",
"Colorless",
],
name: {
en: "Strike Back",
fr: "Vengeur",
es: "Contraimpacto",
it: "Risposta",
pt: "Revidar",
de: "Kontern"
},
effect: {
en: "This attack does 30 damage for each damage counter on this Pokémon.",
fr: "Cette attaque inflige 30 dégâts pour chaque marqueur de dégâts sur ce Pokémon.",
es: "Este ataque hace 30 puntos de daño por cada contador de daño en este Pokémon.",
it: "Questo attacco infligge 30 danni per ogni segnalino danno presente su questo Pokémon.",
pt: "Este ataque causa 30 pontos de dano para cada contador de dano neste Pokémon.",
de: "Diese Attacke fügt für jede Schadensmarke auf diesem Pokémon 30 Schadenspunkte zu."
},
damage: "30×",
},
],
weaknesses: [
{
type: "Grass",
value: "×2"
},
],
retreat: 2,
regulationMark: "D",
variants: {
normal: true,
reverse: true,
holo: false,
firstEdition: false
},
stage: "Basic",
description: {
en: "It is strong despite its compact size. It can easily pick up and carry an adult human on its back."
}
}
export default card
|
// Make sure that the filter context has everything that might be
// nuked from it during URLRequest teardown before the SdchFilter
// destructor.
TEST_F(SdchFilterTest, SparseContextOk) {
std::vector<Filter::FilterType> filter_types;
filter_types.push_back(Filter::FILTER_TYPE_SDCH);
char output_buffer[20];
std::string url_string("http://ignore.com");
filter_context()->SetURL(GURL(url_string));
scoped_ptr<Filter> filter(Filter::Factory(filter_types, *filter_context()));
int output_bytes_or_buffer_size = sizeof(output_buffer);
Filter::FilterStatus status = filter->ReadData(output_buffer,
&output_bytes_or_buffer_size);
EXPECT_EQ(0, output_bytes_or_buffer_size);
EXPECT_EQ(Filter::FILTER_NEED_MORE_DATA, status);
filter_context()->NukeUnstableInterfaces();
}
// SetChaosAddrDisconnect will allow connections to and from addr but, after
// 10-70 seconds, it will abruptly close a connection.
func (t *TCPMsgRing) SetChaosAddrDisconnect(addr string, disconnect bool) {
t.chaosAddrDisconnectsLock.Lock()
t.chaosAddrDisconnects[addr] = disconnect
t.chaosAddrDisconnectsLock.Unlock()
}
/* The DSP reads only one ADC channel, so all ADCL controls should be closed */
static void close_adc_channel(struct mixer *amixer, bool main, bool aux, bool hp)
{
	struct mixer_ctl *mic_adcl = NULL;
	if (amixer) {
		/* mixer_get_ctl_by_name() returns NULL for unknown controls; guard before use */
		if (main) {
			mic_adcl = mixer_get_ctl_by_name(amixer, KCTL_MAIN_MIC_ADCL);
			if (mic_adcl)
				mixer_ctl_set_value(mic_adcl, 0, 0);
		}
		if (aux) {
			mic_adcl = mixer_get_ctl_by_name(amixer, KCTL_AUX_MIC_ADCL);
			if (mic_adcl)
				mixer_ctl_set_value(mic_adcl, 0, 0);
		}
		if (hp) {
			mic_adcl = mixer_get_ctl_by_name(amixer, KCTL_HP_MIC_ADCL);
			if (mic_adcl)
				mixer_ctl_set_value(mic_adcl, 0, 0);
		}
	}
}
package main
import (
"bytes"
"net"
"time"
"github.com/urfave/cli"
)
type bot struct {
*cli.Context
net.Conn
}
func newBot(c *cli.Context) *bot {
return &bot{Context: c}
}
func startBot(c *cli.Context, address string) error {
bot := newBot(c)
return bot.start(address)
}
func (b *bot) start(address string) error {
conn, err := net.Dial("tcp", address)
if err != nil {
return err
}
b.Conn = conn
buff := bytes.NewBufferString("echo\r\n").Bytes()
	// err is nil here (checked after Dial), so loop until a write fails.
	for err == nil {
		_, err = b.Conn.Write(buff)
		time.Sleep(time.Millisecond)
	}
return err
}
/**
* Loads methods called by tested invoked and merges with attribute
* {@link #methodsCalledByTestedInvoked}.
*
* @throws FileNotFoundException If 'mcti.ef' does not exist, is a
* directory rather than a regular file, or for some other reason cannot be
* opened for reading.
* @throws IOException If 'mcti.ef' cannot be read
*/
private void load() throws FileNotFoundException, IOException {
if (!MCTI_FILE.exists())
return;
try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream(MCTI_FILE))) {
if (ois.available() == 0)
return;
@SuppressWarnings("unchecked")
Map<Invoked, Set<String>> storedCollection =
(Map<Invoked, Set<String>>) ois.readObject();
combineCollectedMethodWithStoredCollection(storedCollection);
}
catch(java.io.EOFException | ClassNotFoundException e) {
MCTI_FILE.delete();
}
}
// Translate glslang type to SPIR-V memory decorations.
void TranslateMemoryDecoration(const glslang::TQualifier& qualifier, std::vector<spv::Decoration>& memory)
{
if (qualifier.coherent)
memory.push_back(spv::DecorationCoherent);
if (qualifier.volatil)
memory.push_back(spv::DecorationVolatile);
if (qualifier.restrict)
memory.push_back(spv::DecorationRestrict);
if (qualifier.readonly)
memory.push_back(spv::DecorationNonWritable);
if (qualifier.writeonly)
memory.push_back(spv::DecorationNonReadable);
}
def is_super(self) -> bool:
    return (self.administrator and self.member['id'] == 1) or self.id in settings.SUPER_USERS
A Study of Biodegradable Oil as Transformer Insulating Material
In power transformers, liquid insulating materials serve as both coolant and electrical insulator. Mineral oil is currently the most widely used liquid insulation in power transformers; it is a very good insulating material with excellent electrical properties, but it has drawbacks such as poor biodegradability and high flammability. Researchers are therefore looking for new kinds of insulating materials that are more environmentally friendly, and the focus is on biodegradable vegetable oils. This paper investigates a mixture of corn oil and palm kernel oil as an alternative to mineral oil. Characteristics of the mixture such as breakdown voltage, aging behaviour and moisture content were measured according to standards, and the results were compared with those of mineral oil. The mixture of corn oil and palm kernel oil was shown to have characteristics equal to or better than those of mineral oil.
Linear trends over 20 years in sexually transmitted infections among patients attending a tertiary care center in north Kerala, India.
BACKGROUND
Worldwide, a declining trend is observed in sexually transmitted infections of bacterial origin which is reflected as a rise in the proportion of viral sexually transmitted infections.
AIMS
To find out the clinical referral patterns of sexually transmitted infections among patients who attended the sexually transmitted infection clinic attached to Dermatology and Venereology Department of Government Medical College, Kozhikode from 1.1.1998 to 31.12.2017 and to study the linear trends in the pattern of sexually transmitted infections over 20 years.
METHODS
After clearance from the institutional ethics committee, a retrospective study was conducted among patients who attended the sexually transmitted infection clinic of Government Medical College, Kozhikode from 1.1.1998 to 31.12.2017 and were diagnosed to have sexually transmitted infections.
RESULTS
During the 20-year study period, 5227 patients attended the sexually transmitted infection clinic of our institution. Diagnosis of sexually transmitted infection was made in 2470 (47.3%) cases. Predominant sexually transmitted infections were herpes genitalis (964, 39%), condyloma acuminata (921, 37.9%) and syphilis (418, 17.2%). Viral sexually transmitted infections (1885, 76.3%) outnumbered bacterial sexually transmitted infections (575, 23.3%). A declining trend was noted for both bacterial and viral sexually transmitted infections over the 20-year period, which was more marked for the former. But the latter years of the study documented a rising trend in total sexually transmitted infections including bacterial sexually transmitted infections.
LIMITATIONS
The study does not reflect the status of sexually transmitted infections in the general population since it was conducted in a tertiary referral center.
CONCLUSION
The disturbing ascending trend recorded in sexually transmitted infections including syphilis during the final years of the 20-year period needs to be watched closely, to plan future strategies.
import sys
import time
import os, re, math
import numpy as np
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
from multiprocessing import Process, Queue
from subprocess import Popen, PIPE, STDOUT, call
class Runner(QtCore.QObject):
'''
Runs a job in a separate process and forwards messages from the job to the
main thread through a pyqtSignal.
'''
msg_from_job = QtCore.pyqtSignal(object)
def __init__(self, start_signal):
'''
:param start_signal: the pyqtSignal that starts the job
'''
super(Runner, self).__init__()
self.job_input = None
self.job_function = None
self.comm_queue = None
start_signal.connect(self._run)
def _run(self):
self.p = Process(target=self.job_function, args=(self.comm_queue, self.job_input,))
self.p.start()
class hitime_hit (object):
def __init__(self, mz, rt, score):
self.mz = mz
self.rt = rt
self.score = score
    def __repr__(self):
        return '%s, %s, %s, ' % (self.mz, self.rt, self.score)
def run_job(runner, runner_thread, queue, function, input):
""" Call this to start a new job """
runner.job_function = function
runner.job_input = input
runner.comm_queue = queue
runner_thread.start()
def prepare_HM_color_grad(data):
pos = np.array([0.0, 1.0])
color = np.array([[255,255,255,1], [255,0,0,1]], dtype=np.ubyte)
a = pg.ColorMap(pos, color)
lut = a.getLookupTable(0.0, 1.0, 256)
points = np.asarray([x.score for x in data])
# dmax = np.max(points)
points = pg.applyLookupTable(points, lut)
dmax0 = np.max(points[:,0])
dmax1 = np.max(points[:,1])
dmax2 = np.max(points[:,2])
pens = []
for i in points:
r, g, b = float(i[0])/dmax0*255, float(i[1])/dmax1*255, float(i[2])/dmax2*255
colour = pg.mkColor(r,g,b)
#print i, r, g, b
pens.append(colour)
return pens
def plot_EIC(results, options):
mzml_file = str(options.mzmlFile)
neutral_mod_delta = options.mzDelta
EIC_width = options.eicWidth
# build list of light EIC targets including ranges for heavy/light isotopes
for result in results:
result.light_ll = result.mz - EIC_width
result.light_hl = result.mz + EIC_width
result.heavy_ll = result.mz - EIC_width + neutral_mod_delta
result.heavy_hl = result.mz + EIC_width + neutral_mod_delta
result.EIC_rt = []
result.EIC_int_light = []
result.EIC_int_heavy = []
# get spectra - returns tuple of time, mzs, ints
spectra = readSpectra(mzml_file, 1)
#print results[0].ht_rt_index
print('Extracting EIC...')
for n, spectrum in enumerate(spectra):
time, mzs, ints, lvl = spectrum
# test if spectrum corresponds to HT hit point
for res in results:
res.EIC_rt.append(float(time))
res.EIC_int_light.append(0)
res.EIC_int_heavy.append(0)
if res.ht_rt_index == n:
# print 'found spectrum: spec rt: %s, result rt: %s' %(time, res.rt)
# if so, record data
res.HT_MS_mz = mzs
res.HT_MS_int = ints
for res in results:
# get mzs indices within window for light and heavy EICs
light = np.where((mzs > res.light_ll) & (mzs < res.light_hl))
heavy = np.where((mzs > res.heavy_ll) & (mzs < res.heavy_hl))
# check that the array is non-empty
if light[0].shape[0] > 0: res.EIC_int_light[-1] += np.sum(ints[light])
if heavy[0].shape[0] > 0: res.EIC_int_heavy[-1] += np.sum(ints[heavy])
return results
def readSpectra(mzml_file, msLevel = None):
'''
read MS_spectra from an mzML file of the given msLevel
'''
import pymzml
msrun = pymzml.run.Reader(str(mzml_file))
for n, spectrum in enumerate(msrun):
# only consider MS1 level
if msLevel:
if spectrum['ms level'] != msLevel: continue
lvl = spectrum['ms level']
try:
time = spectrum['scan time']
except:
try:
time = spectrum['scan start time']
except Exception as e:
print ('Warning, skipping spectrum %s' %n)
print ('Stack trace:')
print (str(e))
continue
try:
mzs = np.array(spectrum.mz, dtype = "float32")
ints = np.array(spectrum.i, dtype = 'float32')
assert mzs.shape == ints.shape
yield time, mzs, ints, lvl
except Exception as e:
print ('Warning, skipping spectrum %s' %n)
print ('Stack trace:')
print (str(e))
continue
def getXRange(hit, pc):
'''
Get the xrange values at +/- Xpc of a target value
'''
hit = float(hit)
pc = float(pc)
xmin = hit - hit * pc/100
xmax = hit + hit * pc/100
return (xmin, xmax)
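The percent-window arithmetic above is easy to sanity-check; here is a standalone sketch (function renamed to avoid clashing with the method above):

```python
def get_x_range(hit, pc):
    # (xmin, xmax) at +/- pc percent of hit, mirroring getXRange above.
    hit = float(hit)
    pc = float(pc)
    return (hit - hit * pc / 100, hit + hit * pc / 100)

print(get_x_range(100, 5))  # (95.0, 105.0)
```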
def retrieve_file_matching_string(string, files):
'''
Input: list of filename objects
'''
for f in files:
if f.ifname == str(string):
return f.ifpath
def zero_fill(xData, yData):
x = np.repeat(xData, 3)
y = np.dstack((np.zeros(yData.shape[0]), yData, np.zeros(yData.shape[0]))).flatten()
return x, y
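To see what the `np.repeat`/`np.dstack` trick in `zero_fill` actually produces, here is the same function re-stated so the example is self-contained: each (x, y) point is padded with zero-intensity neighbours, which turns continuous traces into "stick" plots.

```python
import numpy as np

def zero_fill(xData, yData):
    # Triple each x value and surround each y value with zeros,
    # so a line plot draws vertical sticks instead of a joined trace.
    x = np.repeat(xData, 3)
    y = np.dstack((np.zeros(yData.shape[0]), yData, np.zeros(yData.shape[0]))).flatten()
    return x, y

x, y = zero_fill(np.array([1.0, 2.0]), np.array([3.0, 4.0]))
print(x)  # [1. 1. 1. 2. 2. 2.]
print(y)  # [0. 3. 0. 0. 4. 0.]
```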
import java.awt.*;
/**
* Panel holding the extracted trees and associated controls and sliders
*
* @author <NAME>
*/
public class ScannerCard extends Panel
{
/// The control
Control c;
/// The extracted trees
Animation a;
/// The slider
Slider sld;
/// Dimensions
int width,height;
/**
* New scanner card
*
* @param width desired width of card
* @param height desired height of card
*/
public ScannerCard(int width, int height)
{
setLayout(new BorderLayout());
String s[] = {"Rewind","BackStep","Step","FForward"};
c = new Control(s);
add(c,"South");
sld = new Slider(width);
add(sld,"Center");
Dimension d = c.getPreferredSize();
Dimension d2 = sld.getPreferredSize();
a = new Animation(width,height - d.height - d2.height);
add(a,"North");
c.setAnimationListener(sld);
sld.setControlListener(c);
sld.setAnimationListener(a);
this.width = width;
this.height = height;
}
/**
* Get all animation listeners on this card (one)
*
* @return the animation listener on the card
*/
public AnimationListener getAnimationListener() { return sld; }
/**
* Set an info listener for this card
*
* @param i the info listener
*/
public void setInfoListener(InfoListener i)
{
if (a != null)
a.setInfoListener(i);
}
/**
* Make correct preferred size
*/
public Dimension getPreferredSize()
{
return new Dimension(width,height);
}
}
n= int(input())
l= list(map(int,input().split()))
s= 0
for i in range(n):
if(l[i]<l[0]):
l[0],l[i]= l[i],l[0]
for j in range(n):
if(l[n-1]<l[j]):
l[n-1],l[j]= l[j],l[n-1]
for k in range(n):
s+= l[k]
print(l[0],l[n-1],s)
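The three loops above swap the minimum into `l[0]`, the maximum into `l[n-1]`, and accumulate the total. The same three values can be read off directly with builtins; a sketch (not the original submission):

```python
def min_max_sum(l):
    # Equivalent to the swap-based loops: smallest, largest, and total.
    return min(l), max(l), sum(l)

print(*min_max_sum([3, 1, 4, 1, 5]))  # 1 5 14
```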
import { Injectable } from '@nestjs/common';
// special case, must use require :)
const { promisify } = require('util');
import * as NodeCache from "node-cache"
enum KeyCacheEnum {
}
export { KeyCacheEnum }
@Injectable()
export class CacheService {
constructor() { }
private myCache = new NodeCache({ stdTTL: 100, checkperiod: 120 });
public set = (key, val, ttl) => this.myCache.set(key, val, ttl)
public get(key): any {
const value = this.myCache.get(key);
if (!!value) return value;
return null;
}
public has(key) {
return !!this.myCache.has(key);
}
}
/* Report that FD <fd> may receive again without polling. */
static inline void fd_may_recv(const int fd)
{
HA_ATOMIC_OR(&fdtab[fd].state, FD_EV_READY_R);
if (atleast2(fdtab[fd].thread_mask))
HA_SPIN_LOCK(FD_LOCK, &fdtab[fd].lock);
fd_update_cache(fd);
if (atleast2(fdtab[fd].thread_mask))
HA_SPIN_UNLOCK(FD_LOCK, &fdtab[fd].lock);
}
REQUEST_FAILED = "Ой, чёт не зашло..."  # "Oops, that didn't work out..."
ARCHIVE = "Архивный"  # "Archived"
CONN_FEE = "Стоимость подключения"  # "Connection fee"
SUBSCR_FEE = "Стоимость использования"  # "Usage fee"
TIME_FMT = "%Y-%m-%dT%H:%M:%S"
def yesno(flag):
    return 'Да' if flag else 'Нет'  # "Да"/"Нет" = "Yes"/"No"
|
export default {
transform: {
'^.+\\.tsx?$': 'ts-jest',
'\\.[j]sx?$': 'babel-jest'
},
testRegex: '(/__test__/.*|(\\.|/)(test|spec))\\.(jsx?|tsx?)$',
moduleFileExtensions: ['ts', 'tsx', 'js', 'jsx', 'json', 'node'],
// collectCoverageFrom: ['src/**/!(*.d).{js,ts}'],
// collectCoverage: true,
testEnvironment: 'jsdom'
// coverageDirectory: './coverage',
// reporters: [
// 'default',
// [
// '../../node_modules/jest-html-reporter',
// {
// pageTitle: 'Dom Util Report',
// includeFailureMsg: true
// }
// ]
// ]
}
|
# Queue implementation using a Python list
class Queue:
def __init__(self):
self.data = []
def enqueue(self, item):
self.data.append(item)
def dequeue(self):
if self.data == []:
return None
else:
x = self.data[0]
self.data = self.data[1:]
return x
def peek(self):
if self.data == []:
return None
return self.data[0]
queue = Queue()
queue.enqueue(1)
queue.enqueue(2)
queue.enqueue(3)
queue.enqueue(4)
queue.enqueue(5)
print(queue.peek())
print(queue.dequeue())
print(queue.peek())
print(queue.dequeue())
print(queue.peek())
print(queue.data)
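Note that the list-based `dequeue` above copies the remaining items on every call, making it O(n). `collections.deque` gives O(1) at both ends; a sketch of the same interface:

```python
from collections import deque

class DequeQueue:
    # Same enqueue/dequeue/peek interface, but popleft() is O(1).
    def __init__(self):
        self.data = deque()
    def enqueue(self, item):
        self.data.append(item)
    def dequeue(self):
        return self.data.popleft() if self.data else None
    def peek(self):
        return self.data[0] if self.data else None

q = DequeQueue()
for i in (1, 2, 3):
    q.enqueue(i)
print(q.peek())     # 1
print(q.dequeue())  # 1
print(q.peek())     # 2
```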
/**
* <pre>
 * Creates a TagValue as a child of the specified TagKey. If another
 * request with the same parameters is sent while the original request is in
 * progress, the second request will receive an error. A maximum of 300
 * TagValues can exist under a TagKey at any given time.
* </pre>
*/
public void createTagValue(com.google.cloud.resourcemanager.v3.CreateTagValueRequest request,
io.grpc.stub.StreamObserver<com.google.longrunning.Operation> responseObserver) {
io.grpc.stub.ClientCalls.asyncUnaryCall(
getChannel().newCall(getCreateTagValueMethod(), getCallOptions()), request, responseObserver);
}
/**
* Modify a {@code FusekiServer.Builder} setup for data access control.
*/
public static FusekiServer.Builder fusekiBuilderAccessCtl(FusekiServer.Builder builder, Function<HttpAction, String> determineUser) {
builder.registerOperation(Operation.Query, WebContent.contentTypeSPARQLQuery, new AccessCtl_SPARQL_QueryDataset(determineUser));
builder.registerOperation(Operation.GSP_R, new AccessCtl_GSP_R(determineUser));
builder.registerOperation(Operation.Update, WebContent.contentTypeSPARQLUpdate, new AccessCtl_Deny("Update"));
builder.registerOperation(Operation.GSP_RW, new AccessCtl_AllowGET(new GSP_RW(), "GSP Write"));
builder.registerOperation(Operation.GSP_RW, new AccessCtl_GSP_R(determineUser));
return builder;
}
import { arrayWith, objectLike } from '@aws-cdk/assert';
import '@aws-cdk/assert/jest';
import { App, Stack, Stage, StageProps } from '@aws-cdk/core';
import { Construct } from 'constructs';
import * as cdkp from '../lib';
import { sortedByRunOrder } from './testmatchers';
import { BucketStack, PIPELINE_ENV, TestApp, TestGitHubNpmPipeline } from './testutil';
let app: App;
let pipelineStack: Stack;
let pipeline: cdkp.CdkPipeline;
beforeEach(() => {
app = new TestApp();
pipelineStack = new Stack(app, 'PipelineStack', { env: PIPELINE_ENV });
pipeline = new TestGitHubNpmPipeline(pipelineStack, 'Cdk');
});
test('interdependent stacks are in the right order', () => {
// WHEN
pipeline.addApplicationStage(new TwoStackApp(app, 'MyApp'));
// THEN
expect(pipelineStack).toHaveResourceLike('AWS::CodePipeline::Pipeline', {
Stages: arrayWith({
Name: 'MyApp',
Actions: sortedByRunOrder([
objectLike({ Name: 'Stack1.Prepare' }),
objectLike({ Name: 'Stack1.Deploy' }),
objectLike({ Name: 'Stack2.Prepare' }),
objectLike({ Name: 'Stack2.Deploy' }),
]),
}),
});
});
test('multiple independent stacks go in parallel', () => {
// WHEN
pipeline.addApplicationStage(new ThreeStackApp(app, 'MyApp'));
// THEN
expect(pipelineStack).toHaveResourceLike('AWS::CodePipeline::Pipeline', {
Stages: arrayWith({
Name: 'MyApp',
Actions: sortedByRunOrder([
// 1 and 2 in parallel
objectLike({ Name: 'Stack1.Prepare' }),
objectLike({ Name: 'Stack2.Prepare' }),
objectLike({ Name: 'Stack1.Deploy' }),
objectLike({ Name: 'Stack2.Deploy' }),
// Then 3
objectLike({ Name: 'Stack3.Prepare' }),
objectLike({ Name: 'Stack3.Deploy' }),
]),
}),
});
});
test('manual approval is inserted in correct location', () => {
// WHEN
pipeline.addApplicationStage(new TwoStackApp(app, 'MyApp'), {
manualApprovals: true,
});
// THEN
expect(pipelineStack).toHaveResourceLike('AWS::CodePipeline::Pipeline', {
Stages: arrayWith({
Name: 'MyApp',
Actions: sortedByRunOrder([
objectLike({ Name: 'Stack1.Prepare' }),
objectLike({ Name: 'ManualApproval' }),
objectLike({ Name: 'Stack1.Deploy' }),
objectLike({ Name: 'Stack2.Prepare' }),
objectLike({ Name: 'ManualApproval2' }),
objectLike({ Name: 'Stack2.Deploy' }),
]),
}),
});
});
test('extra space for sequential intermediary actions is reserved', () => {
// WHEN
pipeline.addApplicationStage(new TwoStackApp(app, 'MyApp'), {
extraRunOrderSpace: 1,
});
// THEN
expect(pipelineStack).toHaveResourceLike('AWS::CodePipeline::Pipeline', {
Stages: arrayWith({
Name: 'MyApp',
Actions: sortedByRunOrder([
objectLike({
Name: 'Stack1.Prepare',
RunOrder: 1,
}),
objectLike({
Name: 'Stack1.Deploy',
RunOrder: 3,
}),
objectLike({
Name: 'Stack2.Prepare',
RunOrder: 4,
}),
objectLike({
Name: 'Stack2.Deploy',
RunOrder: 6,
}),
]),
}),
});
});
test('combination of manual approval and extraRunOrderSpace', () => {
// WHEN
pipeline.addApplicationStage(new OneStackApp(app, 'MyApp'), {
extraRunOrderSpace: 1,
manualApprovals: true,
});
// THEN
expect(pipelineStack).toHaveResourceLike('AWS::CodePipeline::Pipeline', {
Stages: arrayWith({
Name: 'MyApp',
Actions: sortedByRunOrder([
objectLike({
Name: 'Stack1.Prepare',
RunOrder: 1,
}),
objectLike({
Name: 'ManualApproval',
RunOrder: 2,
}),
objectLike({
Name: 'Stack1.Deploy',
RunOrder: 4,
}),
]),
}),
});
});
class OneStackApp extends Stage {
constructor(scope: Construct, id: string, props?: StageProps) {
super(scope, id, props);
new BucketStack(this, 'Stack1');
}
}
class TwoStackApp extends Stage {
constructor(scope: Construct, id: string, props?: StageProps) {
super(scope, id, props);
const stack2 = new BucketStack(this, 'Stack2');
const stack1 = new BucketStack(this, 'Stack1');
stack2.addDependency(stack1);
}
}
/**
* Three stacks where the last one depends on the earlier 2
*/
class ThreeStackApp extends Stage {
constructor(scope: Construct, id: string, props?: StageProps) {
super(scope, id, props);
const stack1 = new BucketStack(this, 'Stack1');
const stack2 = new BucketStack(this, 'Stack2');
const stack3 = new BucketStack(this, 'Stack3');
stack3.addDependency(stack1);
stack3.addDependency(stack2);
}
}
|
def slice(self, count, last=False):
    items = list(self.keys())
    if last is not True:
        return items[:count]
    return items[-count:]
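A hedged usage sketch of the slicing logic above, using a plain dict in place of whatever mapping `self` wraps (the standalone name `slice_keys` is illustrative only):

```python
def slice_keys(mapping, count, last=False):
    # Mirrors the method above: first `count` keys, or the last `count` when last=True.
    items = list(mapping.keys())
    if last is not True:
        return items[:count]
    return items[-count:]

d = {"a": 1, "b": 2, "c": 3, "d": 4}
print(slice_keys(d, 2))             # ['a', 'b']
print(slice_keys(d, 2, last=True))  # ['c', 'd']
```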
/**
 * Tests backward compatibility for extended items in design files whose
 * version is earlier than "3.1.0".
* @throws Exception
*
*/
public void testCompatibleBoundDataColumns( ) throws Exception
{
openDesign( fileName );
save( );
assertTrue( compareFile( goldenFileName) );
}
/********************************************************************************
* Copyright (c) 2018 TypeFox and others.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License v. 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0.
*
* This Source Code may also be made available under the following Secondary
* Licenses when the conditions for such availability set forth in the Eclipse
* Public License v. 2.0 are satisfied: GNU General Public License, version 2
* with the GNU Classpath Exception which is available at
* https://www.gnu.org/software/classpath/license.html.
*
* SPDX-License-Identifier: EPL-2.0 OR GPL-2.0 WITH Classpath-exception-2.0
********************************************************************************/
import { VNode } from 'snabbdom/vnode';
import { IView, RenderingContext } from '../../base/views/view';
import { SIssueMarker, SIssueSeverity } from './model';
export declare class IssueMarkerView implements IView {
render(marker: SIssueMarker, context: RenderingContext): VNode;
protected getMaxSeverity(marker: SIssueMarker): SIssueSeverity;
protected getPath(severity: SIssueSeverity): "M768 128q209 0 385.5 103t279.5 279.5 103 385.5-103 385.5-279.5 279.5-385.5 103-385.5-103-279.5-279.5-103-385.5 103-385.5 279.5-279.5 385.5-103zm128 1247v-190q0-14-9-23.5t-22-9.5h-192q-13 0-23 10t-10 23v190q0 13 10 23t23 10h192q13 0 22-9.5t9-23.5zm-2-344l18-621q0-12-10-18-10-8-24-8h-220q-14 0-24 8-10 6-10 18l17 621q0 10 10 17.5t24 7.5h185q14 0 23.5-7.5t10.5-17.5z" | "M1024 1376v-160q0-14-9-23t-23-9h-96v-512q0-14-9-23t-23-9h-320q-14 0-23 9t-9 23v160q0 14 9 23t23 9h96v320h-96q-14 0-23 9t-9 23v160q0 14 9 23t23 9h448q14 0 23-9t9-23zm-128-896v-160q0-14-9-23t-23-9h-192q-14 0-23 9t-9 23v160q0 14 9 23t23 9h192q14 0 23-9t9-23zm640 416q0 209-103 385.5t-279.5 279.5-385.5 103-385.5-103-279.5-279.5-103-385.5 103-385.5 279.5-279.5 385.5-103 385.5 103 279.5 279.5 103 385.5z";
}
//# sourceMappingURL=views.d.ts.map
def reset(self, kind="HARD"):
    if kind not in self.m_reset:
        return
    self.m_mirrored = self.m_reset[kind]
    self.m_desired = self.m_mirrored
    self.value = self.m_mirrored
    if kind == "HARD":
        self.m_written = False
// Handle returns http.HandlerFunc depends on the given parameters
func Handle(dir string, isSpa bool) http.HandlerFunc {
handler := handleNonSPA(dir)
if isSpa {
handler = handleSPA(dir)
}
return middleware(handler)
}
Wild Edible Plants of Nutritional and Medicinal Significance to the Tribes of Palghar, Maharashtra, India
Wild edible plants (WEP) are an important component of the tribal diet. There are various traditional practices and beliefs relating to the use of wild edibles among tribal communities in Maharashtra. In this study, the WEP found in the Jawhar block of Palghar district were surveyed, and detailed information on their local use for medicine or food was documented, along with the traditional methods of preparation, collection and storage of these edible wild plants. The paper presents a total of 162 WEP species, of which almost 74% are consumed as food, 14% possess medicinal significance, and 12% exhibit both dietary and medicinal significance. This type of study could help record the traditional heritage of food culture and generate awareness about the importance of wild edible species. Documentation of these wild species can support commercialization and domestication of wild varieties and their entry into urban marketplaces, generating higher revenue for farmers. Wild edibles could prove to be a remedy for food scarcity, a source of nutritional security, and a boost to the economy in tribal areas.
/*
* Inorder successor of a node in binary Search Tree (BST) is the next node in inorder traversal.
* Write a method to find inorder successor of a given value "d" in a BST.
* Binary Node: 100, 50, 200, 25, 75, 125, 350
*
* Inorder successor of 25 is 50
* Inorder successor of 50 is 75
* Inorder successor of 75 is 100
* Inorder successor of 100 is 125
* Inorder successor of 125 is 200
* Inorder successor of 200 is 350
* Inorder successor of 350 is NULL since it is the last node
*
* Runtime Complexity:
* Logarithmic, O(logn)
*
* Memory Complexity:
* Constant, O(1).
*
* Find the value d in BST. If d has a right child then the left most child in right child's subtree
* will be the in-order successor of d. This would also be the child with the minimum value in that subtree.
* Find the value d in BST. If d has no right child then:
* in-order successor is NULL if d is right most node in the BST i.e. last node in the in-order traversal
* in-order successor is the node with minimum value higher than d in the parent chain of d
*
 */
private static class Node {
private int data;
private Node left, right;
Node(int item) {
data = item;
left = right = null;
}
public int getData() {
return data;
}
public Node getLeft() {
return left;
}
public Node getRight() {
return right;
}
}
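The search described in the comment block above can be sketched as follows (hedged: the surrounding Java code only defines the node class, so this Python translation is illustrative, not the repository's implementation). It tracks the closest larger ancestor on the way down, then handles the right-subtree case:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def insert(root, v):
    # Standard BST insert, used only to build the example tree.
    if root is None:
        return Node(v)
    if v < root.data:
        root.left = insert(root.left, v)
    else:
        root.right = insert(root.right, v)
    return root

def inorder_successor(root, d):
    # Walk down from the root, remembering the closest ancestor larger than d.
    successor = None
    node = root
    while node:
        if d < node.data:
            successor = node      # candidate from the parent chain
            node = node.left
        elif d > node.data:
            node = node.right
        else:
            # Found d: successor is the leftmost node of the right subtree, if any.
            if node.right:
                node = node.right
                while node.left:
                    node = node.left
                return node.data
            return successor.data if successor else None
    return None

# Build the example BST from the comment: 100, 50, 200, 25, 75, 125, 350.
root = None
for v in (100, 50, 200, 25, 75, 125, 350):
    root = insert(root, v)
print(inorder_successor(root, 75))   # 100
print(inorder_successor(root, 350))  # None
```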
/**
* Tests service reassignment.
*/
public class GridServiceReassignmentSelfTest extends GridServiceProcessorAbstractSelfTest {
/** */
private static final String SERVICE_NAME = "testService";
/** */
private static final long SERVICE_TOP_WAIT_TIMEOUT = 2_000L;
/** {@inheritDoc} */
@Override protected int nodeCount() {
return 1;
}
/**
* @throws Exception If failed.
*/
@Test
public void testClusterSingleton() throws Exception {
checkReassigns(1, 1);
}
/**
* @throws Exception If failed.
*/
@Test
public void testNodeSingleton() throws Exception {
checkReassigns(0, 1);
}
/**
* @throws Exception If failed.
*/
@Test
public void testLimited1() throws Exception {
checkReassigns(5, 2);
}
/**
* @throws Exception If failed.
*/
@Test
public void testLimited2() throws Exception {
checkReassigns(7, 3);
}
/**
* @throws Exception If failed.
*/
private CounterService proxy(Ignite g) throws Exception {
return g.services().serviceProxy(SERVICE_NAME, CounterService.class, false);
}
/**
* @param total Total number of services.
* @param maxPerNode Maximum number of services per node.
* @throws IgniteCheckedException If failed.
*/
private void checkReassigns(int total, int maxPerNode) throws Exception {
CountDownLatch latch = new CountDownLatch(nodeCount());
DummyService.exeLatch(SERVICE_NAME, latch);
grid(0).services().deployMultiple(SERVICE_NAME, new CounterServiceImpl(), total, maxPerNode);
for (int i = 0; i < 10; i++)
proxy(randomGrid()).increment();
Collection<Integer> startedGrids = new HashSet<>();
try {
startedGrids.add(0);
int maxTopSize = 5;
boolean grow = true;
Random rnd = new Random();
for (int i = 0; i < 20; i++) {
if (grow) {
assert startedGrids.size() < maxTopSize;
startRandomNodesMultithreaded(maxTopSize, rnd, startedGrids);
if (startedGrids.size() == maxTopSize)
grow = false;
}
else {
assert startedGrids.size() > 1;
int gridIdx = nextRandomIdx(startedGrids, rnd);
stopGrid(gridIdx);
startedGrids.remove(gridIdx);
if (startedGrids.size() == 1)
grow = true;
}
for (int attempt = 0; attempt <= 10; ++attempt) {
U.sleep(500);
if (checkServices(total, maxPerNode, F.first(startedGrids), attempt == 10))
break;
}
}
}
finally {
grid(F.first(startedGrids)).services().cancel(SERVICE_NAME);
stopAllGrids();
startGrid(0);
}
}
/**
* Checks services assignments.
*
* @param total Total number of services.
* @param maxPerNode Maximum number of services per node.
* @param gridIdx Grid index to check.
* @param lastTry Last try flag.
* @throws Exception If failed.
* @return {@code True} if check passed.
*/
private boolean checkServices(int total, int maxPerNode, int gridIdx, boolean lastTry) throws Exception {
IgniteEx grid = grid(gridIdx);
waitForServicesReadyTopology(grid, grid.context().discovery().topologyVersionEx());
Map<UUID, Integer> srvcTop = grid.context().service().serviceTopology(SERVICE_NAME, SERVICE_TOP_WAIT_TIMEOUT);
Collection<UUID> nodes = F.viewReadOnly(grid.context().discovery().aliveServerNodes(), F.node2id());
assertNotNull("Grid assignments object is null", srvcTop);
int sum = 0;
for (Map.Entry<UUID, Integer> entry : srvcTop.entrySet()) {
UUID nodeId = entry.getKey();
if (!lastTry && !nodes.contains(nodeId))
return false;
assertTrue("Dead node is in assignments: " + nodeId, nodes.contains(nodeId));
Integer nodeCnt = entry.getValue();
if (maxPerNode > 0)
assertTrue("Max per node limit exceeded [nodeId=" + nodeId + ", max=" + maxPerNode +
", actual=" + nodeCnt, nodeCnt <= maxPerNode);
sum += nodeCnt;
}
if (total > 0)
assertTrue("Total number of services limit exceeded [sum=" + sum +
", assigns=" + srvcTop + ']', sum <= total);
else
assertEquals("Reassign per node failed.", nodes.size(), srvcTop.size());
if (!lastTry && proxy(grid).get() != 10)
return false;
assertEquals(10, proxy(grid).get());
return true;
}
/**
* Start 1, 2 or 3 random nodes simultaneously.
*
* @param limit Cluster size limit.
     * @param rnd Random generator.
* @param grids Collection with indexes of running nodes.
* @throws Exception If failed.
*/
private void startRandomNodesMultithreaded(int limit, Random rnd, Collection<Integer> grids) throws Exception {
int cnt = rnd.nextInt(Math.min(limit - grids.size(), 3)) + 1;
for (int i = 1; i <= cnt; i++) {
int gridIdx = nextAvailableIdx(grids, limit, rnd);
if (i == cnt)
startGrid(gridIdx);
else
GridTestUtils.runAsync(() -> startGrid(gridIdx));
grids.add(gridIdx);
}
}
/**
* Gets next available index.
*
* @param startedGrids Indexes for started grids.
* @param maxTopSize Max topology size.
* @return Next available index.
*/
private int nextAvailableIdx(Collection<Integer> startedGrids, int maxTopSize, Random rnd) {
while (true) {
int idx = rnd.nextInt(maxTopSize);
if (!startedGrids.contains(idx))
return idx;
}
}
/**
* @param startedGrids Started grids.
* @param rnd Random numbers generator.
* @return Randomly chosen started grid.
*/
private int nextRandomIdx(Iterable<Integer> startedGrids, Random rnd) {
while (true) {
for (Integer idx : startedGrids) {
if (rnd.nextBoolean())
return idx;
}
}
}
}
Dundee United left-back Andrew Robertson has won PFA Scotland's young player of the year award.
The 20-year-old has been a mainstay in the United side this season, having joined from Queen's Park last summer.
And his performances earned him his first Scotland cap in the March friendly win over Poland.
Robertson's Tannadice team-mates Stuart Armstrong and Ryan Gauld and St Johnstone's Stevie May had also been nominated for the young player award.
With United having reached the Scottish Cup final, where they will face Saints, Robertson is eager to end a memorable season with a medal.
"It has been a really successful season for me but this award probably means the most to me, because the players have voted for me," he explained.
"Stuart and Gauldy have had amazing seasons but I am glad to win it. We have been joking since we were nominated and there was a few laughs.
"I wouldn't say we were really surprised when the three of us were nominated because we were playing well as a whole team but there has to be credit given to the other players like Paul Paton and John Rankin, who have played every week and made us tick.
"If we could win the Scottish Cup that would top off the most amazing season for me. We know it will be tough against St Johnstone but we will try to get the win on that day.
"We let ourselves down in the league last week by getting beat in the league by St Johnstone but we were magnificent against Motherwell in the 5-1 win on Saturday.
"The Scottish Cup final will be completely different with the fans, the atmosphere and what is at stake."
PFA Scotland awards
Premiership player of the year - Kris Commons (Celtic)
Championship player of the year - Kane Hemmings (Cowdenbeath)
League One player of the year - Lee Wallace (Rangers)
League Two player of the year - Rory McAllister (Peterhead)
Young player of the year - Andrew Robertson (Dundee United)
Manager of the year - Derek McInnes (Aberdeen)
Goal of the season - Jonny Hayes (for Aberdeen against Celtic, February 2014)
Special merit award - Frank McKeown (Stranraer) |
#include <mutex>

// Class handling the locking and unlocking of the ADC.
class Lock
{
public:
    Lock() : open(false) {}
    void lock()
    {
        mutexADC.lock();
        open = true; // set the flag only once the mutex is actually held
    }
    void unlock()
    {
        open = false;
        mutexADC.unlock();
    }
    bool isLocked() const
    {
        return open;
    }
private:
    bool open;
    std::mutex mutexADC;
}; // a class definition requires a trailing semicolon
// src/prod/src/Management/healthmanager/HealthEventStoreData.cpp (AnthonyM/service-fabric)
// ------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License (MIT). See License.txt in the repo root for license information.
// ------------------------------------------------------------
#include "stdafx.h"
using namespace Common;
using namespace Federation;
using namespace std;
using namespace Store;
using namespace ServiceModel;
using namespace Management::HealthManager;
HealthEventStoreData::HealthEventStoreData()
: StoreData()
, sourceId_()
, property_()
, state_(FABRIC_HEALTH_STATE_INVALID)
, lastModifiedUtc_(DateTime::Now())
, sourceUtcTimestamp_(DateTime::Zero)
, timeToLive_(TimeSpan::MaxValue)
, description_()
, reportSequenceNumber_(FABRIC_INVALID_SEQUENCE_NUMBER)
, removeWhenExpired_(false)
, persistedIsExpired_(false)
, cachedIsExpired_(false)
, isPendingUpdateToStore_(false)
, lastOkTransitionAt_(DateTime::Zero)
, lastWarningTransitionAt_(DateTime::Zero)
, lastErrorTransitionAt_(DateTime::Zero)
, isInStore_(false)
, diff_()
, priority_(Priority::NotAssigned)
{
}
HealthEventStoreData::HealthEventStoreData(
HealthInformation const & healthInfo,
ServiceModel::Priority::Enum priority,
Store::ReplicaActivityId const & activityId)
: StoreData(activityId)
, sourceId_(healthInfo.SourceId)
, property_(healthInfo.Property)
, state_(healthInfo.State)
, lastModifiedUtc_(DateTime::Now())
, sourceUtcTimestamp_(healthInfo.SourceUtcTimestamp)
, timeToLive_(healthInfo.TimeToLive)
, description_(healthInfo.Description)
, reportSequenceNumber_(healthInfo.SequenceNumber)
, removeWhenExpired_(healthInfo.RemoveWhenExpired)
, persistedIsExpired_(false)
, cachedIsExpired_(false)
, isPendingUpdateToStore_(false)
, lastOkTransitionAt_(DateTime::Zero)
, lastWarningTransitionAt_(DateTime::Zero)
, lastErrorTransitionAt_(DateTime::Zero)
, isInStore_(false)
, diff_()
, priority_(priority)
{
switch (healthInfo.State)
{
case FABRIC_HEALTH_STATE_OK:
lastOkTransitionAt_ = lastModifiedUtc_;
break;
case FABRIC_HEALTH_STATE_WARNING:
lastWarningTransitionAt_ = lastModifiedUtc_;
break;
case FABRIC_HEALTH_STATE_ERROR:
lastErrorTransitionAt_ = lastModifiedUtc_;
break;
default:
Assert::CodingError("Unsupported health state for event {0}", healthInfo.State);
break;
}
}
// Called to update an existing event.
// The existing event must be in store.
HealthEventStoreData::HealthEventStoreData(
HealthEventStoreData const & previousValue,
Store::ReplicaActivityId const & activityId)
: StoreData(activityId)
, sourceId_(previousValue.sourceId_)
, property_(previousValue.property_)
, state_(previousValue.state_)
, lastModifiedUtc_(previousValue.LastModifiedUtc)
, sourceUtcTimestamp_(previousValue.SourceUtcTimestamp)
, timeToLive_(previousValue.timeToLive_)
, description_(previousValue.description_)
, reportSequenceNumber_(previousValue.ReportSequenceNumber)
, removeWhenExpired_(previousValue.removeWhenExpired_)
, persistedIsExpired_(previousValue.cachedIsExpired_) // Copy the cached value, not the persisted value
, cachedIsExpired_(previousValue.cachedIsExpired_)
, isPendingUpdateToStore_(previousValue.isPendingUpdateToStore_)
, lastOkTransitionAt_(previousValue.lastOkTransitionAt_)
, lastWarningTransitionAt_(previousValue.lastWarningTransitionAt_)
, lastErrorTransitionAt_(previousValue.lastErrorTransitionAt_)
, isInStore_(true)
, diff_() // no diff
, priority_(previousValue.priority_)
{
ASSERT_IF(
!previousValue.diff_ && previousValue.cachedIsExpired_ == previousValue.persistedIsExpired_,
"{0}: create from diff: no changes {1}",
activityId.ActivityId,
previousValue);
}
HealthEventStoreData::HealthEventStoreData(HealthEventStoreData && other)
: StoreData(move(other))
, sourceId_(move(other.sourceId_))
, property_(move(other.property_))
, state_(move(other.state_))
, lastModifiedUtc_(move(other.lastModifiedUtc_))
, sourceUtcTimestamp_(move(other.sourceUtcTimestamp_))
, timeToLive_(move(other.timeToLive_))
, description_(move(other.description_))
, reportSequenceNumber_(move(other.reportSequenceNumber_))
, removeWhenExpired_(move(other.removeWhenExpired_))
, persistedIsExpired_(move(other.persistedIsExpired_))
, cachedIsExpired_(move(other.cachedIsExpired_))
, isPendingUpdateToStore_(move(other.isPendingUpdateToStore_))
, lastOkTransitionAt_(move(other.lastOkTransitionAt_))
, lastWarningTransitionAt_(move(other.lastWarningTransitionAt_))
, lastErrorTransitionAt_(move(other.lastErrorTransitionAt_))
, isInStore_(move(other.isInStore_))
, diff_(move(other.diff_))
, priority_(move(other.priority_))
{
}
HealthEventStoreData & HealthEventStoreData::operator = (HealthEventStoreData && other)
{
if (this != &other)
{
sourceId_ = move(other.sourceId_);
property_ = move(other.property_);
state_ = move(other.state_);
lastModifiedUtc_ = move(other.lastModifiedUtc_);
sourceUtcTimestamp_ = move(other.sourceUtcTimestamp_);
timeToLive_ = move(other.timeToLive_);
description_ = move(other.description_);
reportSequenceNumber_ = move(other.reportSequenceNumber_);
removeWhenExpired_ = move(other.removeWhenExpired_);
persistedIsExpired_ = move(other.persistedIsExpired_);
cachedIsExpired_ = move(other.cachedIsExpired_);
isPendingUpdateToStore_ = move(other.isPendingUpdateToStore_);
lastOkTransitionAt_ = move(other.lastOkTransitionAt_);
lastErrorTransitionAt_ = move(other.lastErrorTransitionAt_);
lastWarningTransitionAt_ = move(other.lastWarningTransitionAt_);
isInStore_ = move(other.isInStore_);
diff_ = move(other.diff_);
priority_ = move(other.priority_);
        StoreData::operator=(move(other));
    }
    return *this;
}
HealthEventStoreData::~HealthEventStoreData()
{
}
bool HealthEventStoreData::get_IsSystemReport() const
{
return Common::StringUtility::StartsWithCaseInsensitive<std::wstring>(sourceId_, ServiceModel::Constants::EventSystemSourcePrefix);
}
bool HealthEventStoreData::get_IsStatePropertyError() const
{
return state_ == FABRIC_HEALTH_STATE_ERROR && property_ == ServiceModel::Constants::HealthStateProperty;
}
int64 HealthEventStoreData::get_ReportSequenceNumber() const
{
return diff_ ? diff_->ReportSequenceNumber : reportSequenceNumber_;
}
Common::DateTime HealthEventStoreData::get_LastModifiedUtc() const
{
return diff_ ? diff_->LastModifiedUtc : lastModifiedUtc_;
}
Common::DateTime HealthEventStoreData::get_SourceUtcTimestamp() const
{
return diff_ ? diff_->SourceUtcTimestamp : sourceUtcTimestamp_;
}
Priority::Enum HealthEventStoreData::get_Priority() const
{
ASSERT_IF(priority_ == Priority::NotAssigned, "{0}: Priority not set", *this);
return priority_;
}
// After the differences between diff and current have been persisted to disk, move diff into current
void HealthEventStoreData::MoveDiffToCurrent(Store::ReplicaActivityId const & replicaActivityId)
{
if (diff_)
{
lastModifiedUtc_ = diff_->LastModifiedUtc;
sourceUtcTimestamp_ = diff_->SourceUtcTimestamp;
reportSequenceNumber_ = diff_->ReportSequenceNumber;
diff_.reset();
}
else
{
ASSERT_IF(persistedIsExpired_ == cachedIsExpired_, "{0}: MoveDiffToCurrent when there are no changes: {1}", replicaActivityId.ActivityId, *this);
}
persistedIsExpired_ = cachedIsExpired_;
this->ReInitializeTracing(replicaActivityId);
}
FABRIC_HEALTH_STATE HealthEventStoreData::GetEvaluatedHealthState(bool considerWarningAsError) const
{
// Use already set value for expired
if (this->IsExpired)
{
// Consider expired events as error
return FABRIC_HEALTH_STATE_ERROR;
}
if (considerWarningAsError && state_ == FABRIC_HEALTH_STATE_WARNING)
{
return FABRIC_HEALTH_STATE_ERROR;
}
return state_;
}
void HealthEventStoreData::Update(HealthEventStoreData && other)
{
#ifdef DBG
ASSERT_IF(other.diff_, "HealthEventStoreData::Update: Can't update with event with diff. Old={0}, new={1}", *this, other);
ASSERT_IFNOT(sourceId_ == other.sourceId_, "Update: SourceId changed, old={0}, new={1}", *this, other);
ASSERT_IFNOT(property_ == other.property_, "Update: Property changed, old={0}, new={1}", *this, other);
#endif
// Compute the transition state times if the state changed
if (state_ != other.state_)
{
// Mark when the previous state ended
switch (state_)
{
case FABRIC_HEALTH_STATE_OK:
lastOkTransitionAt_ = other.lastModifiedUtc_;
break;
case FABRIC_HEALTH_STATE_WARNING:
lastWarningTransitionAt_ = other.lastModifiedUtc_;
break;
case FABRIC_HEALTH_STATE_ERROR:
lastErrorTransitionAt_ = other.lastModifiedUtc_;
break;
default:
Assert::CodingError("Unsupported health state for event {0}", *this);
break;
}
// Mark when the new state began
switch (other.state_)
{
case FABRIC_HEALTH_STATE_OK:
lastOkTransitionAt_ = move(other.lastOkTransitionAt_);
break;
case FABRIC_HEALTH_STATE_WARNING:
lastWarningTransitionAt_ = move(other.lastWarningTransitionAt_);
break;
case FABRIC_HEALTH_STATE_ERROR:
lastErrorTransitionAt_ = move(other.lastErrorTransitionAt_);
break;
default:
Assert::CodingError("Unsupported health state for event {0}", other);
break;
}
state_ = move(other.state_);
}
lastModifiedUtc_ = move(other.lastModifiedUtc_);
sourceUtcTimestamp_ = move(other.sourceUtcTimestamp_);
timeToLive_ = move(other.timeToLive_);
description_ = move(other.description_);
reportSequenceNumber_ = move(other.reportSequenceNumber_);
removeWhenExpired_ = move(other.removeWhenExpired_);
persistedIsExpired_ = move(other.persistedIsExpired_);
cachedIsExpired_ = move(other.cachedIsExpired_);
isPendingUpdateToStore_ = move(other.isPendingUpdateToStore_);
diff_ = move(other.diff_);
}
bool HealthEventStoreData::UpdateExpired()
{
if (!cachedIsExpired_)
{
// If diff_ is set, look at diff_ and ignore persisted value.
// If there are any differences, they will be made consistent by writing diff to store
// on next cleanup timer or report.
if (this->LastModifiedUtc.AddWithMaxValueCheck(timeToLive_) <= DateTime::Now())
{
cachedIsExpired_= true;
}
}
return cachedIsExpired_;
}
void HealthEventStoreData::UpdateOnLoadFromStore(FABRIC_HEALTH_REPORT_KIND healthInfoKind)
{
// If store persisted data says expired, maintain the information
if (!persistedIsExpired_ && !removeWhenExpired_)
{
// Otherwise, ignore the persisted last modified time and use the load store time instead
diff_ = make_unique<HealthEventDiff>(sourceUtcTimestamp_, reportSequenceNumber_);
}
cachedIsExpired_ = persistedIsExpired_;
priority_ = HealthReport::GetPriority(healthInfoKind, sourceId_, property_);
}
std::wstring HealthEventStoreData::GeneratePrefix(std::wstring const & entityId)
{
return wformatString(
"{0}{1}",
entityId,
Constants::TokenDelimeter);
}
void HealthEventStoreData::WriteTo(__in Common::TextWriter & w, Common::FormatOptions const &) const
{
w.Write(
"{0}({1},{2},Generated@{3}, Updated@{4}, timeToLive={5}, reportLSN={6}, desc={7}, removeWhenExpired={8}, {9})",
this->Type,
this->Key,
state_,
this->SourceUtcTimestamp,
this->LastModifiedUtc,
timeToLive_,
this->ReportSequenceNumber,
description_,
removeWhenExpired_,
GetTransitionHistory());
}
void HealthEventStoreData::WriteToEtw(uint16 contextSequenceId) const
{
HMCommonEvents::Trace->HealthEventStoreDataTrace(
contextSequenceId,
this->Type,
this->Key,
wformatString(state_),
this->SourceUtcTimestamp,
this->LastModifiedUtc,
timeToLive_,
this->ReportSequenceNumber,
description_,
removeWhenExpired_,
GetTransitionHistory());
}
std::wstring HealthEventStoreData::GetTransitionHistory() const
{
wstring history;
StringWriter writer(history);
if (lastOkTransitionAt_ != DateTime::Zero)
{
writer.Write("LastOkAt:{0}.", lastOkTransitionAt_);
}
if (lastWarningTransitionAt_ != DateTime::Zero)
{
writer.Write("LastWarningAt:{0}.", lastWarningTransitionAt_);
}
if (lastErrorTransitionAt_ != DateTime::Zero)
{
writer.Write("LastErrorAt:{0}.", lastErrorTransitionAt_);
}
return history;
}
bool HealthEventStoreData::CanUpdateEvent(
FABRIC_SEQUENCE_NUMBER reportSequenceNumber) const
{
// Check that the sequence number increased
ASSERT_IF(reportSequenceNumber <= FABRIC_INVALID_SEQUENCE_NUMBER, "{0}: can't update with event with sn {1}", *this, reportSequenceNumber);
if (reportSequenceNumber <= this->ReportSequenceNumber)
{
HMEvents::Trace->DropReportStaleSourceLSN(
this->ReplicaActivityId,
this->Key,
reportSequenceNumber_,
reportSequenceNumber);
return false;
}
return true;
}
bool HealthEventStoreData::HasSameFields(ServiceModel::HealthReport const & report) const
{
#ifdef DBG
ASSERT_IFNOT(sourceId_ == report.SourceId, "Update: SourceId changed, old={0}, new={1}", *this, report);
ASSERT_IFNOT(property_ == report.Property, "Update: Property changed, old={0}, new={1}", *this, report);
#endif
// Do not check lastModifiedUtc, sourceUtc, and sequence number
return state_ == report.State &&
timeToLive_ == report.TimeToLive &&
description_ == report.Description &&
removeWhenExpired_ == report.RemoveWhenExpired;
}
bool HealthEventStoreData::TryUpdateDiff(ServiceModel::HealthReport const & report)
{
if (persistedIsExpired_)
{
// The report was persisted to disk as expired, need to write it again
return false;
}
if (removeWhenExpired_)
{
// Always persist transient events
return false;
}
if (!diff_)
{
diff_ = make_unique<HealthEventDiff>(report.SourceUtcTimestamp, report.SequenceNumber);
}
else
{
diff_->Update(report);
}
cachedIsExpired_ = false;
return true;
}
HealthEvent HealthEventStoreData::GenerateEvent() const
{
return HealthEvent(
sourceId_,
property_,
timeToLive_,
state_,
description_,
reportSequenceNumber_, // return persisted value, not in-memory one
sourceUtcTimestamp_, // return persisted value, not in-memory one
this->LastModifiedUtc, // return last modified value, the in-memory one
cachedIsExpired_, // previously computed value of expired
removeWhenExpired_,
lastOkTransitionAt_,
lastWarningTransitionAt_,
lastErrorTransitionAt_);
}
// ==========================================
// Internal class
// ==========================================
HealthEventStoreData::HealthEventDiff::HealthEventDiff(
Common::DateTime const sourceUtcTimestamp,
FABRIC_SEQUENCE_NUMBER reportSequenceNumber)
: lastModifiedUtc_(DateTime::Now())
, sourceUtcTimestamp_(sourceUtcTimestamp)
, reportSequenceNumber_(reportSequenceNumber)
{
}
HealthEventStoreData::HealthEventDiff::~HealthEventDiff()
{
}
void HealthEventStoreData::HealthEventDiff::Update(ServiceModel::HealthReport const & report)
{
lastModifiedUtc_ = DateTime::Now();
sourceUtcTimestamp_ = report.SourceUtcTimestamp;
reportSequenceNumber_ = report.SequenceNumber;
}
|
// Light-Bearing/JavaDeveloper-from-0-to-pro
package core;
/**
* Created by Danya on 24.08.2015.
*/
public class WeightMeter
{
public static Double getWeight(Car car)
{
return car.getWeight();
}
} |
// pegah5665/ngx-mapBox
import { Component } from '@angular/core';
import { ActivatedRoute } from '@angular/router';
@Component({
selector: 'app-add-an-icon-to-the-map',
templateUrl: './add-an-icon-to-the-map.component.html',
styleUrls: ['./add-an-icon-to-the-map.component.css']
})
export class AddAnIconToTheMapComponent {
title = '';
style = 'mapbox://styles/mapbox/streets-v9';
source = {
type: 'geojson',
data: {
type: 'FeatureCollection',
features: [{
type: 'Feature',
geometry: {
type: 'Point',
coordinates: [0, 0],
},
}],
},
};
layout = {
'icon-image': 'cat',
'icon-size': 0.25
};
imageSrc = 'https://upload.wikimedia.org/wikipedia/commons/thumb/6/60/Cat_silhouette.svg/400px-Cat_silhouette.svg.png';
constructor(route: ActivatedRoute) {
this.title = route.snapshot.data['title'];
}
}
|
import React, {
Component
} from 'react';
import PropTypes from 'prop-types';
import addClass from 'dom-helpers/addClass';
import removeClass from 'dom-helpers/removeClass';
import Transition, {
TransitionProps
} from 'react-transition-group/Transition';
import {
StateMap
} from '@stylable/runtime';
import {
CombinePropsAndAttributes,
Bind,
omit
} from '../../helpers';
export interface ITransitionState {
active?: string;
enter?: string;
enterActive?: string;
enterDone?: string;
exit?: string;
exitActive?: string;
exitDone?: string;
}
interface ISelfProps {
states: ((stateMap: StateMap) => string) | ITransitionState;
}
export type IProps = CombinePropsAndAttributes<
ISelfProps,
TransitionProps
>;
const defaultStylableStates: ITransitionState = {
active: 'active',
enter: 'enter',
enterActive: 'enterActive',
enterDone: 'enterDone',
exit: 'exit',
exitActive: 'exitActive',
exitDone: 'exitDone'
};
const TransitionStateKeys = Object.keys(defaultStylableStates);
export default class StylableTransition extends Component<IProps> {
static propTypes = {
...(Transition as any).propTypes,
states: PropTypes.oneOfType([
PropTypes.func,
PropTypes.object
]).isRequired,
onEnter: PropTypes.func,
onEntering: PropTypes.func,
onEntered: PropTypes.func,
onExit: PropTypes.func,
onExiting: PropTypes.func,
onExited: PropTypes.func
};
private readonly stylableStates: {
active: Record<string, string>,
enter: Record<string, string>,
exit: Record<string, string>
};
constructor(props: IProps) {
super(props);
const {
states
} = props;
const getStateClass = typeof states === 'function'
? (state: string) => states({ [state]: true })
: (state: string) => states[state];
const stylableStates = {
active: {},
enter: {},
exit: {}
};
TransitionStateKeys.forEach((key: string) => {
const [state, phase] = key
.replace(/(Active|Done)/, ' $1')
.toLowerCase()
.split(' ');
if (stylableStates.hasOwnProperty(state)) {
stylableStates[state][phase || 'state'] = getStateClass(key);
}
});
this.stylableStates = stylableStates;
}
render() {
const props: any = omit(this.props, [
'states'
]);
return (
<Transition
{...props}
onEnter={this.onEnter}
onEntered={this.onEntered}
onEntering={this.onEntering}
onExit={this.onExit}
onExiting={this.onExiting}
onExited={this.onExited}
/>
);
}
@Bind()
private onEnter(node: HTMLElement) {
const stateClass = this.getStateClass('enter');
this.removeStateClass(node, 'exit');
if (stateClass) {
addClass(node, stateClass);
}
const {
onEnter
} = this.props;
if (typeof onEnter === 'function') {
onEnter(node);
}
}
@Bind()
private onEntering(node: HTMLElement) {
const enterStateClass = this.getStateClass('enter', 'active');
const activeStateClass = this.getStateClass('active');
if (enterStateClass) {
this.reflowAndAddStateClass(node, enterStateClass);
}
if (activeStateClass) {
if (enterStateClass) {
addClass(node, activeStateClass);
} else {
this.reflowAndAddStateClass(node, activeStateClass);
}
}
const {
onEntering
} = this.props;
if (typeof onEntering === 'function') {
onEntering(node);
}
}
@Bind()
private onEntered(node: HTMLElement) {
const stateClass = this.getStateClass('enter', 'done');
this.removeStateClass(node, 'enter');
if (stateClass) {
addClass(node, stateClass);
}
const {
onEntered
} = this.props;
if (typeof onEntered === 'function') {
onEntered(node);
}
}
@Bind()
private onExit(node: HTMLElement) {
const stateClass = this.getStateClass('exit');
this.removeStateClass(node, 'enter');
this.removeStateClass(node, 'active');
if (stateClass) {
addClass(node, stateClass);
}
const {
onExit
} = this.props;
if (typeof onExit === 'function') {
onExit(node);
}
}
@Bind()
private onExiting(node: HTMLElement) {
const stateClass = this.getStateClass('exit', 'active');
if (stateClass) {
this.reflowAndAddStateClass(node, stateClass);
}
const {
onExiting
} = this.props;
if (typeof onExiting === 'function') {
onExiting(node);
}
}
@Bind()
private onExited(node: HTMLElement) {
const stateClass = this.getStateClass('exit', 'done');
this.removeStateClass(node, 'exit');
if (stateClass) {
addClass(node, stateClass);
}
const {
onExited
} = this.props;
if (typeof onExited === 'function') {
onExited(node);
}
}
private getStateClass(state: string, phase = 'state') {
const phases = this.stylableStates[state];
if (!phases) {
return false;
}
return phases[phase] || false;
}
private removeStateClass(node: HTMLElement, state: string) {
const phases = this.stylableStates[state];
if (!phases) {
return;
}
const {
state: stateClass,
active: activePhaseClass,
done: donePhaseClass
} = phases;
if (stateClass) {
removeClass(node, stateClass);
}
if (activePhaseClass) {
removeClass(node, activePhaseClass);
}
if (donePhaseClass) {
removeClass(node, donePhaseClass);
}
}
private reflowAndAddStateClass(node: HTMLElement, stateClass: string) {
if (node) {
			// Reading scrollTop forces a synchronous reflow, which is necessary
			// in order to transition styles when adding a class name.
node.scrollTop; // tslint:disable-line
addClass(node, stateClass);
}
}
}
|
#include <bits/stdc++.h>
typedef long long ll;
typedef long double ld;
using namespace std;
mt19937 rng((unsigned int) chrono::steady_clock::now().time_since_epoch().count());
const int MOD = 1e9 + 7;
const double EPS = 1e-9;
const int INF = 1e8;
#define IO ios_base::sync_with_stdio(0);cin.tie(0);cout.tie(0)
template<typename T, typename U>
pair<T, U> operator+(const std::pair<T, U> &l, const std::pair<T, U> &r) {
return {l.first + r.first, l.second + r.second};
}
template<typename T, typename U>
pair<T, U> operator-(const std::pair<T, U> &l, const std::pair<T, U> &r) {
return {l.first - r.first, l.second - r.second};
}
const static int N = 1e5 + 10;
struct fenwick {
vector<ll> tree_num;
vector<ll> tree_price_num;
ll n;
fenwick(ll n) {
this->n = n + 1;
tree_num.resize(this->n);
tree_price_num.resize(this->n);
}
void add(ll index, ll value) {
ll temp = index;
for (++index; index < n; index += index & -index) {
tree_num[index] += value;
}
index = temp;
value = value * index;
for (++index; index < n; index += index & -index) {
tree_price_num[index] += value;
}
}
pair<ll, ll> get(ll index) {
ll pre_cost = 0;
ll pre_num = 0;
for (++index; index > 0; index -= index & -index) {
pre_cost += tree_price_num[index];
pre_num += tree_num[index];
}
return {pre_cost, pre_num};
}
};
int main() {
IO;
ll n, k, m;
cin >> n >> k >> m;
vector<vector<ll>> append(n + 5);
vector<vector<ll>> rem(n + 5);
vector<pair<pair<ll, ll>, pair<ll, ll >>> all(m + 1);
for (int i = 1; i <= m; i++) {
ll l, r, c, p;
cin >> l >> r >> c >> p;
all[i] = {{l, r},
{c, p}};
append[l].push_back(i);
rem[r + 1].push_back(i);
}
fenwick *f = new fenwick(1e6 + 50);
ll ans = 0;
for (int i = 1; i <= n; i++) {
for (int h : append[i]) {
f->add(all[h].second.second, all[h].second.first);
}
for (int h : rem[i]) {
f->add(all[h].second.second, -1 * all[h].second.first);
}
ll l = 1;
ll r = 1e6+10 ;
ll p = -1;
pair<ll , ll > it;
while (l < r) {
int m = l + (r - l) / 2;
it = f->get(m);
if (it.second >= k) {
r = m;
} else {
l = m + 1;
}
}
if (l > 1e6 + 2) {
ans += it.first;
continue;
}
p = l;
ll c = it.first;
ll ex = it.second - k ;
c -= ex * p;
ans += c;
}
cout << ans << endl;
return 0;
} |
If MLB wants to speed up pace of play, it should. But as an olive branch, grandfather in veterans like Max Scherzer.
All long-term deals carry risk, but Colorado and Arenado are the perfect storm of a franchise in win-now mode committing to a superstar cashing in at the right time.
New York Mets third baseman Todd Frazier injured his left oblique muscle and will return to New York for a cortisone injection.
Top international prospect Victor Victor Mesa, who had been expected to play a lot for the Marlins in spring training, has a strained right hamstring.
"""Data utilities.
TODO: more info...
----------
"""
__all__ = [
'DataGenerator',
'make_generator',
]
import numpy as np
import pandas as pd
from probflow.core.settings import get_backend
from probflow.core.settings import get_datatype
from probflow.core.base import BaseDataGenerator
class DataGenerator(BaseDataGenerator):
"""Generate data to feed through a model.
TODO
Parameters
----------
x : |ndarray| or |DataFrame| or |Series| or |DataGenerator|
Independent variable values (or, if fitting a generative model,
the dependent variable values). Should be of shape (Nsamples,...)
y : |None| or |ndarray| or |DataFrame| or |Series|
Dependent variable values (or, if fitting a generative model,
``None``). Should be of shape (Nsamples,...). Default = ``None``
batch_size : int
Number of samples to use per minibatch. Use ``None`` to use a single
batch for all the data.
Default = ``None``
shuffle : bool
Whether to shuffle the data each epoch.
Default = ``False``
    test : bool
Whether to treat data as testing data (allow no dependent variable).
Default = ``False``
"""
def __init__(self,
x=None,
y=None,
batch_size=None,
shuffle=False,
test=False):
# Check types
data_types = (np.ndarray, pd.DataFrame, pd.Series)
if x is not None and not isinstance(x, data_types):
raise TypeError('x must be an ndarray, a DataFrame, or a Series')
if y is not None and not isinstance(y, data_types):
raise TypeError('y must be an ndarray, a DataFrame, or a Series')
if batch_size is not None:
if not isinstance(batch_size, int):
raise TypeError('batch_size must be an int')
if batch_size < 1:
raise ValueError('batch_size must be >0')
if not isinstance(shuffle, bool):
raise TypeError('shuffle must be True or False')
if not isinstance(test, bool):
raise TypeError('test must be True or False')
# Check sizes are consistent
if x is not None and y is not None:
if x.shape[0] != y.shape[0]:
raise ValueError('x and y must contain same number of samples')
# Generative model?
if not test and y is None:
y = x
x = None
# Number of samples
if x is None:
self._n_samples = y.shape[0]
else:
self._n_samples = x.shape[0]
# Batch size
        if batch_size is None or self._n_samples < batch_size:
self._batch_size = self._n_samples
else:
self._batch_size = batch_size
# Store references to data
self.x = x
self.y = y
# Shuffle data
self.shuffle = shuffle
self.on_epoch_end()
@property
def n_samples(self):
"""Number of samples in the dataset"""
return self._n_samples
@property
def batch_size(self):
"""Number of samples per batch"""
return self._batch_size
def __getitem__(self, index):
"""Generate one batch of data"""
# Get shuffled indexes
ix = self.ids[index*self.batch_size:(index+1)*self.batch_size]
# Get x data
if self.x is None:
x = None
elif isinstance(self.x, pd.DataFrame):
x = self.x.iloc[ix, :]
elif isinstance(self.x, pd.Series):
x = self.x.iloc[ix]
else:
x = self.x[ix, ...]
# Get y data
if self.y is None:
y = None
elif isinstance(self.y, pd.DataFrame):
y = self.y.iloc[ix, :]
elif isinstance(self.y, pd.Series):
y = self.y.iloc[ix]
else:
y = self.y[ix, ...]
# Return both x and y
return x, y
def on_epoch_end(self):
"""Shuffle data each epoch"""
if self.shuffle:
self.ids = np.random.permutation(self.n_samples)
else:
self.ids = np.arange(self.n_samples, dtype=np.uint64)
def make_generator(x=None, y=None, batch_size=None, shuffle=False, test=False):
"""Make input a DataGenerator if not already"""
if isinstance(x, DataGenerator):
return x
else:
return DataGenerator(x, y, batch_size=batch_size,
shuffle=shuffle, test=test)
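The indexing logic in `__getitem__` — permute the sample ids once per epoch, then slice that id array batch by batch — is easy to see in isolation. Below is a minimal, self-contained sketch of the same minibatch slicing in plain Python (no probflow or NumPy; `DataGenerator` does the equivalent with NumPy fancy indexing):

```python
import random

def batches(x, y, batch_size, shuffle=False, seed=0):
    """Yield (x_batch, y_batch) pairs, optionally over a shuffled permutation."""
    n = len(x)
    ids = list(range(n))
    if shuffle:
        random.Random(seed).shuffle(ids)  # fixed permutation for the epoch
    for start in range(0, n, batch_size):
        ix = ids[start:start + batch_size]
        yield [x[i] for i in ix], [y[i] for i in ix]

x = [[i] for i in range(10)]
y = list(range(10))
bs = list(batches(x, y, batch_size=4))
print(len(bs))    # 3 batches: sizes 4, 4, 2
print(bs[-1][1])  # last batch's targets: [8, 9]
```

Like `DataGenerator`, the final batch is simply shorter when `batch_size` does not divide the number of samples.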
|
"""Utilities. """
from padl.dumptools import inspector
from padl.print_utils import format_argument
from padl.transforms import AtomicTransform
from padl.transforms import _pd_trace
def _maketrans(attr, getitem=False):
class T(AtomicTransform):
"""Dynamically generated transform for the "same" object.
:param args: Arguments to pass to the input's method.
:param kwargs: Keyword arguments to pass to the input's method.
"""
def __init__(self, *args, **kwargs):
self._args = args
self._kwargs = kwargs
caller_frameinfo = inspector.outer_caller_frameinfo(__name__)
call_info = inspector.CallInfo(caller_frameinfo)
if getitem:
call = inspector.get_segment_from_frame(caller_frameinfo.frame, 'getitem')
else:
call = inspector.get_segment_from_frame(caller_frameinfo.frame, 'call')
AtomicTransform.__init__(
self,
call=call,
call_info=call_info,
)
def __call__(self, args):
return getattr(args, attr)(*self._args, **self._kwargs)
def _formatted_args(self) -> str:
"""Format the object's init arguments for printing. """
args_list = [format_argument(val) for val in self._args]
args_list += [f'{key}={format_argument(val)}' for key, val in self._kwargs.items()]
return ', '.join(args_list)
def _pd_longrepr(self, formatting=True, marker=None):
out = self._pd_shortrepr()
return out + marker[1] if marker else out
def _pd_shortrepr(self, formatting=True, max_width=None):
return f'{attr}({self._formatted_args()})'
def _pd_tinyrepr(self, formatting=True, max_width=None):
return self.pd_name or attr
return T
class _Same:
"""Transform factory for capturing attributes/ get-items. """
def __getitem__(self, item):
return _maketrans('__getitem__', True)(item)
def __getattr__(self, attr):
return _maketrans(attr)
#: Transform factory for capturing attributes/ get-items.
same = _Same()
class _Debug:
"""Customized debugger for :class:`padl.transforms.Transform`s.
When an exception on the execution of a :class:`padl.transforms.Transform` is produced and a
:class:`_Debug` object is called, an interactive debugger at different levels in the
:class:`padl.transforms.Transform` is gotten.
At the top, the user interacts with the entire transform and its absolute input. One level
down, it goes directly to the stage that got the Exception (either to
:meth:`padl.transforms.Transform.pd_preprocess`, :meth:`padl.transforms.Transform.pd_forward`,
or :meth:`padl.transforms.Transform.pd_postprocess`) and each level deeper moves recursively
inside the element that failed until the :class:`padl.transforms.AtomicTransform` that got the
Exception is reached.
"""
def __init__(self):
self.trans = None
self.args = None
self.default_msg = (
'Defined commands are: \n'
' u(p): step up\n'
' d(own): step down\n'
' w(here am I?): show code position\n'
' i(nput): show input here\n'
' r(epeat): repeat here (will produce the same exception)\n'
' t(ransform): displays the current transform\n'
' h(elp): print help about the commands\n'
' q(uit): quit'
)
def __call__(self) -> None:
"""Call me for getting an interactive debugger in case of error.
User can give following input and expect response
u(p): step up
d(own): step down
w(here am I?): show code position
i(nput): show input here
r(epeat): repeat here (will produce the same exception)
h(elp): print help about the commands
q(uit): quit'
"""
pos = len(_pd_trace) - 1
print(self.default_msg + '\n' + _pd_trace[pos].transform_str)
while True:
try:
x = input('> ')
except IndexError:
continue
if x == 'd':
pos, msg = self._down_step(pos, _pd_trace)
elif x == 'u':
pos, msg = self._up_step(pos, _pd_trace)
elif x == 'q':
self.args = _pd_trace[pos].args
self.trans = _pd_trace[pos].transform
break
elif x == 'w':
msg = _pd_trace[pos].code_position
elif x == 'i':
msg = _pd_trace[pos].args
elif x == 'r':
self.args = _pd_trace[pos].args
self.trans = _pd_trace[pos].transform
# This 0 is because the last element carries a problem: when adding the last
# element to _pd_trace *Transform.pd_mode* has been already set up to None again.
self.repeat(_pd_trace[0].pd_mode, pos)
elif x == 'h' or x == 'help':
msg = self.default_msg
elif x == 't':
msg = _pd_trace[pos].transform_str
else:
i = _pd_trace[pos].args
try:
code = compile(x, '', 'single')
exec(code)
except Exception as err:
print(err)
if x in {'d', 'u', 'w', 'i', 'h', 'help', 't'}:
print(f'\n{msg}\n')
def repeat(self, mode: str, pos: int) -> None:
"""Repeat the execution from the current position *pos* (the same Exception will be
produced).
:param mode: mode ('train', 'eval', 'infer').
:param pos: level of the :class:`Transform` we are inspecting.
"""
        assert mode in ('train', 'eval', 'infer'), 'Mode should be "train", "eval" or "infer".'
        repeat_entire = pos == len(_pd_trace) - 1
        _pd_trace.clear()
        if repeat_entire:
            self._repeat_entire(mode)
        else:
            self._repeat_on_stage(mode)
@staticmethod
def _down_step(pos, pd_trace):
if pos > 0:
pos -= 1
return pos, pd_trace[pos].transform_str
return pos, 'Reached the bottom.'
@staticmethod
def _up_step(pos, pd_trace):
if pos < len(pd_trace) - 1:
pos += 1
return pos, pd_trace[pos].transform_str
return pos, 'Reached top level.'
    def _repeat_entire(self, mode):
        if mode == 'train':
            list(self.trans.train_apply(self.args, batch_size=len(self.args), num_workers=0))
        elif mode == 'eval':
            list(self.trans.eval_apply(self.args, batch_size=len(self.args), num_workers=0))
        elif mode == 'infer':
            self.trans.infer_apply(self.args)
        else:
            raise ValueError('Mode is not set, it should be "train", "eval" or "infer".')
def _repeat_on_stage(self, mode):
self.trans.pd_call_in_mode(self.args, mode)
pd_debug = _Debug()
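The up/down navigation in `_down_step`/`_up_step` is just a clamped cursor over the trace list. A self-contained sketch of that logic, with placeholder strings standing in for `_pd_trace` entries:

```python
# Clamped-cursor navigation over a trace, mirroring _Debug's helpers.
# The frame contents are hypothetical stand-ins for _pd_trace entries.

def down_step(pos, trace):
    """Move one level deeper (towards index 0), clamping at the bottom."""
    if pos > 0:
        pos -= 1
        return pos, trace[pos]
    return pos, 'Reached the bottom.'

def up_step(pos, trace):
    """Move one level up (towards the last index), clamping at the top."""
    if pos < len(trace) - 1:
        pos += 1
        return pos, trace[pos]
    return pos, 'Reached top level.'

trace = ['atomic transform', 'compose level 1', 'pipeline root']
pos = len(trace) - 1                 # start at the top level, as __call__ does
pos, msg = down_step(pos, trace)     # -> 1, 'compose level 1'
pos, msg = down_step(pos, trace)     # -> 0, 'atomic transform'
pos, msg = down_step(pos, trace)     # clamped: stays at 0
print(pos, msg)
```

Starting from the top (as `__call__` does with `len(_pd_trace) - 1`) and stepping down walks towards the `AtomicTransform` that raised the exception.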
|
package fr.hugosimony.pokemontopaze.pokemon.items;
public enum PokeBalls {
POKE_BALL
}
|
// PerMsec resamples the flat [t0, v0, t1, v1, ...] series in orig onto a
// 1 ms grid, returning one linearly interpolated value per millisecond.
func PerMsec(orig []float64) []float64 {
ost := orig[0]
nca := len(orig) / 2
oet := orig[(nca-1)*2]
dur := oet - ost
dms := int(dur / 0.001)
rdt := make([]float64, dms)
si := 0
mxi := 0
for i := 0; i < dms; i++ {
ct := ost + float64(i)*0.001
st := orig[si*2]
et := orig[(si+1)*2]
sca := orig[si*2+1]
eca := orig[(si+1)*2+1]
if ct > et {
si++
if si >= nca-1 {
break
}
st = orig[si*2]
et = orig[(si+1)*2]
sca = orig[si*2+1]
eca = orig[(si+1)*2+1]
}
mxi = i
pt := (ct - st) / (et - st)
ca := sca + pt*(eca-sca)
rdt[i] = ca
}
return rdt[:mxi+1]
} |
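A hedged Python sketch of what `PerMsec` does, useful for checking the interpolation by hand. It mirrors the Go logic (flat time/value pairs, 1 ms steps, per-segment linear interpolation) rather than reproducing it exactly:

```python
# Resample a flat [t0, v0, t1, v1, ...] series onto a 1 ms grid
# by linear interpolation, following the structure of the Go PerMsec.

def per_msec(orig):
    ost = orig[0]                       # overall start time
    nca = len(orig) // 2                # number of (time, value) pairs
    oet = orig[(nca - 1) * 2]           # overall end time
    dms = int((oet - ost) / 0.001)      # number of 1 ms steps
    out = []
    si = 0                              # index of the current segment
    for i in range(dms):
        ct = ost + i * 0.001
        if ct > orig[(si + 1) * 2]:     # advance to the segment containing ct
            si += 1
            if si >= nca - 1:
                break
        st, sca = orig[si * 2], orig[si * 2 + 1]
        et, eca = orig[(si + 1) * 2], orig[(si + 1) * 2 + 1]
        pt = (ct - st) / (et - st)      # fractional position in the segment
        out.append(sca + pt * (eca - sca))
    return out

# A 10 ms ramp from 0.0 to 1.0 climbs by ~0.1 per millisecond step.
print([round(v, 3) for v in per_msec([0.0, 0.0, 0.01, 1.0])][:3])  # -> [0.0, 0.1, 0.2]
```

Like the Go version, this assumes strictly increasing timestamps; coincident times would divide by zero.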
package edu.ecu.cs.csci6030.search;
import java.util.Arrays;
/**
 * @author Finch
 *
 * This is a single class that does single-term, two-term AND boolean, and two-term proximity queries.
 * TODO: refactor into multiple classes and an interface.
 */
public class Query {
private String term1;
private String term2;
private Integer separation;
public Query(String term1, String term2, Integer separation) {
if (term1 == null) {
this.term1 = term2;
this.term2 = null;
} else {
this.term1 = term1;
this.term2 = term2;
}
this.separation = separation;
}
public String getTerm1() {
return term1;
}
public String getTerm2() {
return term2;
}
public Integer getSeparation() {
return separation;
}
public int[] search(PositionalIndex index) {
int[] results = new int[0];
int[] list1 = null;
PositionalPosting posting1 = null;
PositionalPosting posting2 = null;
if (term1 == null) {
return results;
} else {
posting1 = index.getPosting(term1);
if (null != posting1) {
list1 = posting1.getDocumentList();
} else {
//null AND anything will be null, so quit looking
return results;
}
}
if (term2 != null) {
posting2 = index.getPosting(term2);
int[] list2 =null;
if (null != posting2) {
list2 = posting2.getDocumentList();
} else {
return results;
//Intersection of results with null will be null
}
int[] finalList2 = list2;
results = Arrays.stream(list1).filter(x -> Arrays.stream(finalList2).anyMatch(y -> y == x)).toArray();
} else {
results = Arrays.copyOf(list1, list1.length);
}
        if (separation != null && posting2 != null) results = intersectPosition(results, posting1, posting2, separation);
return results;
}
private int[] intersectPosition(int[] results, PositionalPosting posting1, PositionalPosting posting2, Integer separation) {
for (int document = 0 ; document < results.length; document++) {
boolean match = false;
int ptr1 = 0;
int ptr2 = 0;
int[] list1 = posting1.get(results[document]).getList();
int[] list2 = posting2.get(results[document]).getList();
            while (ptr1 < list1.length && ptr2 < list2.length) {
                if (Math.abs(list1[ptr1] - list2[ptr2]) <= separation + 1) {
match = true;
break;//no need to check rest of the list
}
if (list1[ptr1] < list2[ptr2] ) {
ptr1++;
} else {
ptr2 ++;
}
}
if (!match) results[document] = -1;
}
return Arrays.stream(results).filter(x -> x>=0).toArray();
}
@Override
public String toString() {
return "Query: term1 ["+term1+"], term2 ["+term2+"], with separation: " + separation;
}
}
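The two-pointer scan in `intersectPosition` is the classic proximity-intersection step from positional-index retrieval. An illustrative Python sketch of just that check, on two sorted position lists:

```python
# Two-pointer proximity check over sorted position lists, mirroring
# Query.intersectPosition: a document matches when some pair of term
# positions lies within `separation` (+1, as in the Java code) of each other.

def within_separation(list1, list2, separation):
    p1 = p2 = 0
    while p1 < len(list1) and p2 < len(list2):
        if abs(list1[p1] - list2[p2]) <= separation + 1:
            return True                  # no need to check the rest
        # Advance whichever pointer trails, keeping the scan linear.
        if list1[p1] < list2[p2]:
            p1 += 1
        else:
            p2 += 1
    return False

print(within_separation([3, 17], [9, 40], 5))   # |3 - 9| = 6 <= 5 + 1 -> True
```

Because both lists are sorted, the scan is O(len(list1) + len(list2)) rather than quadratic.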
|
/**
 * Release what we don't need anymore after the last available row has
 * been returned from the data nodes.
 */
void
NdbWorker::postFetchRelease()
{
if (m_resultStreams != NULL)
{
for (unsigned opNo=0; opNo<m_query->getNoOfOperations(); opNo++)
{
m_resultStreams[opNo].~NdbResultStream();
}
}
m_resultStreams = NULL;
} |
#include "libwincocoainput.h"
#include <imm.h>
void (*javaDone)(wchar_t*);
int* (*javaDraw)(wchar_t*,int,int);
int (*javaRect)(float*);
void setCallback(int*(*c_draw)(wchar_t*,int,int),void(*c_done)(wchar_t*),int (*c_rect)(float*)){
javaDraw=c_draw;
javaDone=c_done;
javaRect=c_rect;
}
HWND hwnd;
HIMC himc;
LRESULT compositionLocationNotify(HWND hWnd){
HIMC imc=NULL;
float *rect= malloc(sizeof(float)*4);
CILog("ready call javaRect");
if(javaRect(rect)){
free(rect);
return FALSE;
}
CANDIDATEFORM rectStruct = {0,CFS_EXCLUDE,{rect[0],rect[1]},{rect[0]-rect[2],rect[1]-rect[3],rect[0]+1,rect[1]+1}};
imc=ImmGetContext(hwnd);
ImmSetCandidateWindow(imc,&rectStruct);
ImmReleaseContext(hWnd,imc);
free(rect);
return TRUE;
}
LRESULT CALLBACK (*glfwWndProc)(HWND,UINT,WPARAM,LPARAM);
LRESULT CALLBACK wrapper_wndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam){
switch(msg){
case WM_IME_NOTIFY:{
if(wParam==IMN_OPENCANDIDATE){
compositionLocationNotify(hWnd);
return TRUE;
}
else{
break;
}
}
case WM_IME_STARTCOMPOSITION:{
compositionLocationNotify(hWnd);
return TRUE;
}
case WM_IME_ENDCOMPOSITION:{
javaDone(L"");
return TRUE;
}
case WM_IME_COMPOSITION:{
HIMC imc=NULL;
LONG textSize,attrSize,clauseSize;
			int length;
LPWSTR buffer;
LPSTR attributes;
DWORD* clauses;
LONG cursor=0;
if(lParam&GCS_RESULTSTR){
CILog("ResultStr");
imc=ImmGetContext(hwnd);
textSize = ImmGetCompositionStringW(imc, GCS_RESULTSTR, NULL, 0);
length = textSize / sizeof(WCHAR);
buffer = calloc(length + 1, sizeof(WCHAR));
ImmGetCompositionStringW(imc, GCS_RESULTSTR, buffer, textSize);
ImmReleaseContext(hWnd, imc);
javaDone(buffer);
}
if(lParam&GCS_COMPSTR){
imc=ImmGetContext(hwnd);
if(lParam&GCS_CURSORPOS){
cursor=ImmGetCompositionStringW(imc, GCS_CURSORPOS, NULL, 0);
}
textSize = ImmGetCompositionStringW(imc, GCS_COMPSTR, NULL, 0);
attrSize = ImmGetCompositionStringW(imc, GCS_COMPATTR, NULL, 0);
clauseSize = ImmGetCompositionStringW(imc, GCS_COMPCLAUSE, NULL, 0);
if(textSize<=0){
ImmReleaseContext(hWnd, imc);
javaDone(L"");
}
else{
length = textSize / sizeof(WCHAR);
buffer = calloc(length + 1, sizeof(WCHAR));
attributes = calloc(attrSize, 1);
clauses = calloc(clauseSize, 1);
ImmGetCompositionStringW(imc, GCS_COMPSTR, buffer, textSize);
ImmGetCompositionStringW(imc, GCS_COMPATTR, attributes, attrSize);
ImmGetCompositionStringW(imc, GCS_COMPCLAUSE, clauses, clauseSize);
ImmReleaseContext(hWnd, imc);
int selected_begin=-1;
int selected_length=0;
int i;
for(i=0;i<attrSize;i++){
if(attributes[i]&(ATTR_TARGET_CONVERTED)){
					if(selected_begin==-1){
selected_begin=i;
}
selected_length++;
}
}
if(selected_begin>=0){
cursor=selected_begin;
}
compositionLocationNotify(hWnd);
javaDraw(buffer,cursor,selected_length);
}
}
return TRUE;
}
default:break;
}
return CallWindowProc(glfwWndProc,hWnd,msg,wParam,lParam);
}
void initialize(
long hwndp,
int*(*c_draw)(wchar_t*,int,int),
void(*c_done)(wchar_t*),
int (*c_rect)(float*),
LogFunction log,
LogFunction error,
LogFunction debug
){
initLogPointer(log,error,debug);
CILog("CocoaInput Windows Clang Initializer start. library compiled at %s %s",__DATE__,__TIME__);
setCallback(c_draw,c_done,c_rect);
hwnd=(HWND)hwndp;
	glfwWndProc = (WNDPROC)GetWindowLongPtr(hwnd,GWLP_WNDPROC);
	SetWindowLongPtr(hwnd,GWLP_WNDPROC,(LONG_PTR)wrapper_wndProc);
CILog("Window procedure replaced");
//input_himc = ImmGetContext(hwnd);
/*if(!hImc){
hImc = ImmCreateContext();
HIMC oldhImc = ImmAssociateContext( hwnd, hImc );
}*/
//ImmReleaseContext(hwnd,input_himc);
himc = ImmGetContext(hwnd);
if(!himc){
himc = ImmCreateContext();
}
ImmReleaseContext(hwnd,himc);
himc=ImmAssociateContext(hwnd,0);
CILog("CocoaInput Windows initializer done!");
}
void set_focus(int flag){
CILog("setFocused:%d",flag);
if(flag){
ImmAssociateContext(hwnd,himc);
compositionLocationNotify(hwnd);
}
else{
himc=ImmAssociateContext(hwnd,0);
}
}
|
// src/db/db.ts
import sqlite from 'better-sqlite3';
import * as path from 'path';
export const db = sqlite(path.join(process.cwd(), './devsincrypto.sqlite'));
|
package documents
import (
"encoding/json"
"github.com/gorilla/mux"
"github.com/holmes89/book-organizer/internal/common"
"github.com/sirupsen/logrus"
"io/ioutil"
"net/http"
)
func MakeDocumentHandler(mr *mux.Router, service DocumentService) http.Handler {
r := mr.PathPrefix("/documents").Subrouter()
h := &documentHandler{
service: service,
}
r.HandleFunc("/{id}", h.FindByID).Methods("GET")
r.HandleFunc("/{id}", h.UpdateFields).Methods("PATCH")
r.HandleFunc("/{id}", h.Delete).Methods("DELETE")
r.HandleFunc("/scan", h.Scan).Methods("PUT")
r.HandleFunc("/", h.FindAll).Methods("GET")
return r
}
type documentHandler struct {
service DocumentService
}
func (h *documentHandler) FindByID(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
vars := mux.Vars(r)
id, ok := vars["id"]
if !ok {
common.MakeError(w, http.StatusBadRequest, "document", "Missing Id", "findbyid")
return
}
entity, err := h.service.FindByID(ctx, id)
if err != nil {
common.MakeError(w, http.StatusInternalServerError, "document", "Server Error", "findbyid")
return
}
common.EncodeResponse(r.Context(), w, entity)
}
func (h *documentHandler) UpdateFields(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
b, _ := ioutil.ReadAll(r.Body)
defer r.Body.Close()
req := Document{}
if err := json.Unmarshal(b, &req); err != nil {
logrus.WithError(err).Error("unable to unmarshal link tag")
common.MakeError(w, http.StatusBadRequest, "document", "Bad Request", "updateFields")
return
}
vars := mux.Vars(r)
id, ok := vars["id"]
if !ok {
common.MakeError(w, http.StatusBadRequest, "document", "Missing Id", "updateFields")
return
}
entity, err := h.service.UpdateFields(ctx, id, req)
if err != nil {
common.MakeError(w, http.StatusInternalServerError, "document", "Server Error", "updateFields")
return
}
common.EncodeResponse(r.Context(), w, entity)
}
func (h *documentHandler) Delete(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
vars := mux.Vars(r)
id, ok := vars["id"]
if !ok {
common.MakeError(w, http.StatusBadRequest, "document", "Missing Id", "delete")
return
}
if err := h.service.Delete(ctx, id); err != nil {
common.MakeError(w, http.StatusInternalServerError, "document", "Server Error", "delete")
return
}
common.EncodeResponse(r.Context(), w, map[string]string{"status": "success"})
}
func (h *documentHandler) FindAll(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
entity, err := h.service.FindAll(ctx, nil)
if err != nil {
common.MakeError(w, http.StatusInternalServerError, "document", "Server Error", "findall")
return
}
common.EncodeResponse(r.Context(), w, entity)
}
func (h *documentHandler) Scan(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
err := h.service.Scan(ctx)
if err != nil {
common.MakeError(w, http.StatusInternalServerError, "document", "Server Error", "scan")
return
}
common.EncodeResponse(r.Context(), w, map[string]string{"status": "success"})
}
|
/**
* Initializes the list of client interceptors.
*
* @return The list of global client interceptors.
*/
protected List<ClientInterceptor> initClientInterceptors() {
final List<ClientInterceptor> interceptors = new ArrayList<>();
for (final GlobalClientInterceptorConfigurer configurer : this.applicationContext
.getBeansOfType(GlobalClientInterceptorConfigurer.class).values()) {
configurer.configureClientInterceptors(interceptors);
}
sortInterceptors(interceptors);
return interceptors;
} |
def _path(key):
return sys.modules['seisflows_paths'][key.upper()] |
-- src/Message.hs
{-# LANGUAGE OverloadedStrings #-}
module Message
(
Message (..)
, initSchema
, insertMessage
, findMessagesByRecipient
, findMessagesByRecipientChanges
) where
import Control.Monad (mzero, void, when)
import Data.Aeson
import qualified Data.Aeson as Aeson
import Data.Text (Text)
import Database.RethinkDB.NoClash
data Message = Message { getText :: Text
, getRecipient :: Text
}
deriving Show
instance FromDatum Message
instance ToDatum Message
instance Expr Message
instance ToJSON Message where
toJSON (Message text recipient) = object [ "text" .= text
, "recipient" .= recipient
]
instance FromJSON Message where
parseJSON (Aeson.Object v) = Message <$>
(v .: "text") <*>
(v .: "recipient")
parseJSON _ = mzero
messageTable :: Table
messageTable = table "messages"
messageRecipientIndex :: Index
messageRecipientIndex = Index "message_recipient_index"
insertMessage :: Message -> RethinkDBHandle -> IO ()
insertMessage m h = (void . run' h) $ messageTable # insert m
findMessagesByRecipient :: Text -> RethinkDBHandle -> IO [Message]
findMessagesByRecipient recipient h =
run h $ findMessagesByRecipientQuery recipient
findMessagesByRecipientChanges :: Text -> RethinkDBHandle -> IO (Cursor Message)
findMessagesByRecipientChanges recipient h =
run h $ (findMessagesByRecipientQuery recipient # changes) ! "new_val"
findMessagesByRecipientQuery :: Text -> ReQL
findMessagesByRecipientQuery recipient =
messageTable # getAll messageRecipientIndex [recipient]
initSchema :: RethinkDBHandle -> Database -> IO ()
initSchema h database = do
tables <- (run h $ tableList database) :: IO [String]
when ("messages" `elem` tables) $
(void . run' h) $ messageTable # tableDrop
(void . run' h) $ messageTable # tableCreate
(void . run' h) $
messageTable # indexCreate "message_recipient_index" (! "recipient")
(void . run' h) $ messageTable # indexWait []
|
// Display cell warnings on the main screen.
void Core0::cellPartialUpdate(int errorType, int cellNum)
{
uint16_t box_w = 14;
uint16_t box_h = 23;
uint16_t box_y = 63;
uint16_t box_x = 0;
if (cellNum < 8) {
box_x = 11 + (box_w - 1) * cellNum;
}
else {
box_x = 180 + (box_w - 1) * (cellNum - 8);
}
if (errorType == 0) {
display.fillRect(box_x+1, box_y - box_h+1, box_w-2, box_h-2, GxEPD_WHITE);
}
else if (errorType == 1) {
display.fillRect(box_x, box_y - box_h, box_w, box_h, GxEPD_BLACK);
}
	else if (errorType >= 2 && errorType <= 5) {
		// Error types 2-5 all draw a filled black box with a single status
		// glyph: V(oltage), T(emperature), C(urrent), or "!" for a generic fault.
		static const char glyphs[] = {'V', 'T', 'C', '!'};
		display.fillRect(box_x, box_y - box_h, box_w, box_h, GxEPD_BLACK);
		display.setTextColor(GxEPD_WHITE);
		display.setCursor(box_x + 1, box_y - 6);
		display.print(glyphs[errorType - 2]);
	}
} |
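The box-placement arithmetic above can be sanity-checked outside the firmware. A small Python sketch of the same layout (constants copied from the function):

```python
# Cell-box layout from cellPartialUpdate: boxes are 14 px wide and overlap
# by 1 px; cells 8-15 start at a second column block at x = 180.

def cell_box_x(cell_num, box_w=14):
    if cell_num < 8:
        return 11 + (box_w - 1) * cell_num
    return 180 + (box_w - 1) * (cell_num - 8)

print([cell_box_x(c) for c in (0, 7, 8, 15)])   # -> [11, 102, 180, 271]
```

With these constants the first block spans x = 11..115 and the second x = 180..284, so both groups of eight cells occupy equal widths.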
"DXM" redirects here. For the cricket bats, see Gunn & Moore
Not to be confused with Dextrorphan or Dexamethasone
Dextromethorphan (DXM or DM) is a medication most often used as a cough suppressant in over-the-counter cold and cough medicines. It is sold in syrup, tablet, spray, and lozenge forms.
It is in the morphinan class of medications with sedative, dissociative, and stimulant properties (at lower doses). In its pure form, dextromethorphan occurs as a white powder.[3]
DXM is also used recreationally. When exceeding approved dosages, dextromethorphan acts as a dissociative anesthetic. It has multiple mechanisms of action, including actions as a nonselective serotonin reuptake inhibitor[4] and a sigma-1 receptor agonist.[5][6] DXM and its major metabolite, dextrorphan, also act as NMDA receptor antagonists at high doses, producing effects similar to, yet distinct from, the dissociative states created by other dissociative anesthetics such as ketamine and phencyclidine.
The metabolic pathway continues from dextrorphan to 3-methoxymorphinan to 3-hydroxymorphinan. The 3-methoxymorphinan metabolite produces local anesthetic effects in rats, with potency between dextrorphan and DXM.[7]
Medical uses
Generic dextromethorphan cough syrup.
Cough suppression
The primary use of dextromethorphan is as a cough suppressant, for the temporary relief of cough caused by minor throat and bronchial irritation (such as commonly accompanies the flu and common cold), as well as those resulting from inhaled particle irritants.[8] However, controlled studies have found the symptomatic effectiveness of dextromethorphan similar to placebo.[9]
Neuropsychiatric disorders
In 2010, the FDA approved the combination drug dextromethorphan/quinidine for the treatment of pseudobulbar affect (uncontrollable laughing/crying). Dextromethorphan is the actual therapeutic agent in the combination; quinidine merely serves to inhibit the enzymatic degradation of dextromethorphan and thereby increase its circulating concentrations via inhibition of CYP2D6.[10]
In 2016, the ASA released a promising study with the combination of dextromethorphan with pregabalin, acetaminophen, and naproxen which showed a decrease in postoperative pain intensity (preemptive analgesia).[11]
Contraindications
Because dextromethorphan can trigger a histamine release (allergic reaction), atopic children, who are especially susceptible to allergic reactions, should be administered dextromethorphan only if absolutely necessary, and only under the strict supervision of a healthcare professional.[12]
Adverse effects
Side effects of dextromethorphan at normal therapeutic doses can include:[2][8][12]
A rare side effect is respiratory depression.[8]
Neurotoxicity
Dextromethorphan had been thought to cause Olney's lesions when administered intravenously; however, this remains inconclusive, owing to the lack of research on humans. Tests were performed on rats, administering 50 mg and upwards every day for up to a month. Neurotoxic changes, including vacuolation, have been observed in the posterior cingulate and retrosplenial cortices of rats administered other NMDA receptor antagonists such as PCP, but not with dextromethorphan.[13][14]
Dependence and withdrawal
In many documented cases, dextromethorphan has produced psychological dependence in people who used it recreationally. However, it does not produce physical addiction, according to the WHO Committee on Drug Dependence.[15] It is considered less addictive than the other common weak opiate cough suppressant, codeine.[2] Since dextromethorphan also acts as a serotonin reuptake inhibitor, users describe that regular recreational use over a long period of time can cause withdrawal symptoms similar to those of antidepressant discontinuation syndrome. Additionally, disturbances have been reported in sleep, senses, movement, mood, and thinking.
Overdose
Adverse effects of dextromethorphan in overdose, at doses 3 to 10 times the recommended therapeutic dose, include:[16]
euphoria
increased energy
increased confidence
mild nausea
restlessness
insomnia
talking fast
feelings of increased strength
dilated pupils
glassy eyes
dizziness
At doses 15 to 75 times the recommended therapeutic dose:[16]
Interactions
Dextromethorphan should not be taken with monoamine oxidase inhibitors (MAOIs)[12] due to the potential for serotonin syndrome, which is a potentially life-threatening condition that can occur rapidly, due to a buildup of an excessive amount of serotonin in the body.
Caution should be exercised when taking dextromethorphan when drinking grapefruit juice or eating grapefruits, as compounds in grapefruit affect a number of drugs, including dextromethorphan, through the inhibition of the cytochrome p450 system in the liver, and can lead to excessive accumulation and prolonged effects. Grapefruit and grapefruit juices (especially white grapefruit juice, but also including other citrus fruits such as bergamot and lime, as well as a number of noncitrus fruits[17]) generally are recommended to be avoided while using dextromethorphan and numerous other medications.
Pharmacology
Pharmacodynamics
Dextromethorphan has been found to possess the following actions (<1 μM) using rat tissues:[19][25]
Rather than acting as a direct NMDA receptor antagonist itself, dextromethorphan acts as a prodrug of its much more potent metabolite dextrorphan, and this is the actual mediator of its dissociative effects.[26] What role, if any, (+)-3-methoxymorphinan, dextromethorphan's other major metabolite, plays in its effects is not entirely clear.[27]
Pharmacokinetics
Following oral administration, dextromethorphan is rapidly absorbed from the gastrointestinal tract, where it enters the bloodstream and crosses the blood–brain barrier.[citation needed]
At therapeutic doses, dextromethorphan acts centrally (meaning that it acts on the brain) as opposed to locally (on the respiratory tract). It elevates the threshold for coughing, without inhibiting ciliary activity. Dextromethorphan is rapidly absorbed from the gastrointestinal tract and converted into the active metabolite dextrorphan in the liver by the cytochrome P450 enzyme CYP2D6. The average dose necessary for effective antitussive therapy is between 10 and 45 mg, depending on the individual. The International Society for the Study of Cough recommends "an adequate first dose of medication is 60 mg in the adult and repeat dosing should be infrequent rather than the qds recommended."[28]
DXM has an elimination half-life of approximately 4 hours in individuals with an extensive metabolizer phenotype; this is increased to approximately 13 hours when DXM is given in combination with quinidine.[21] The duration of action after oral administration is about three to eight hours for dextromethorphan hydrobromide, and 10 to 12 hours for dextromethorphan polistirex. Around one in 10 of the Caucasian population has little or no CYP2D6 enzyme activity, leading to long-lived high drug levels.[28]
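The half-lives quoted above translate directly into remaining-dose fractions via first-order elimination. An illustrative calculation (the formula is the standard exponential-decay relation, not specific to this article's sources):

```python
# First-order elimination: fraction of a dose remaining after `hours`,
# given an elimination half-life in hours.

def fraction_remaining(hours, half_life_hours):
    return 0.5 ** (hours / half_life_hours)

print(round(fraction_remaining(8, 4), 3))    # two half-lives -> 0.25
print(round(fraction_remaining(8, 13), 3))   # much slower decay with quinidine
```

With the ~4 h half-life of extensive metabolizers, only a quarter of a dose remains after 8 hours; at the ~13 h half-life seen with quinidine co-administration, most of it does.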
Metabolism
Figure: main metabolic pathways of DXM degradation, catalyzed by cytochrome P450 monooxygenases (CYP3A4 and CYP2D6) and UDP-glucuronosyltransferase (UGT).[29]
The first pass through the hepatic portal vein results in some of the drug being metabolized by O-demethylation into an active metabolite of dextromethorphan called dextrorphan (DXO). DXO is the 3-hydroxy derivative of dextromethorphan. The therapeutic activity of dextromethorphan is believed to be caused by both the drug and this metabolite. Dextromethorphan also undergoes N-demethylation (to 3-methoxymorphinan or MEM),[30] and partial conjugation with glucuronic acid and sulfate ions. Hours after dextromethorphan therapy, (in humans) the metabolites (+)-3-hydroxy-N-methylmorphinan, (+)-3-morphinan, and traces of the unchanged drug are detectable in the urine.[12]
A major metabolic catalyst involved is the cytochrome P450 enzyme known as 2D6, or CYP2D6. A significant portion of the population has a functional deficiency in this enzyme and are known as poor CYP2D6 metabolizers. O-demethylation of DXM to DXO contributes to at least 80% of the DXO formed during DXM metabolism.[30] As CYP2D6 is a major metabolic pathway in the inactivation of dextromethorphan, the duration of action and effects of dextromethorphan can be increased by as much as three times in such poor metabolizers.[31] In one study on 252 Americans, 84.3% were found to be "fast" (extensive) metabolizers, 6.8% to be "intermediate" metabolizers, and 8.8% were "slow" metabolizers of DXM.[32] A number of alleles for CYP2D6 are known, including several completely inactive variants. The distribution of alleles is uneven amongst ethnic groups.
A large number of medications are potent inhibitors of CYP2D6. Some types of medications known to inhibit CYP2D6 include certain SSRIs and tricyclic antidepressants, some antipsychotics, and the commonly available antihistamine diphenhydramine. Therefore, the potential of interactions exists between dextromethorphan and medications that inhibit this enzyme, particularly in slow metabolizers.[citation needed]
DXM is also metabolized by CYP3A4. N-demethylation is primarily accomplished by CYP3A4, contributing to at least 90% of the MEM formed as a primary metabolite of DXM.[30]
A number of other CYP enzymes are implicated as minor pathways of DXM metabolism. CYP2B6 is actually more effective than CYP3A4 at N-demethylation of DXM, but, since the average individual has a much lower CYP2B6 content in his/her liver relative to CYP3A4, most N-demethylation of DXM is catalyzed by CYP3A4.[30]
Chemistry
Dextromethorphan is the dextrorotatory enantiomer of levomethorphan, which is the methyl ether of levorphanol, both opioid analgesics. It is named according to IUPAC rules as (+)-3-methoxy-17-methyl-9α,13α,14α-morphinan. As its pure form, dextromethorphan occurs as an odorless, opalescent white powder. It is freely soluble in chloroform and insoluble in water; the hydrobromide salt is water-soluble up to 1.5 g/100 mL at 25 °C.[33] Dextromethorphan is commonly available as the monohydrated hydrobromide salt, however some newer extended-release formulations contain dextromethorphan bound to an ion-exchange resin based on polystyrene sulfonic acid. Dextromethorphan's specific rotation in water is +27.6° (20 °C, Sodium D-line).[citation needed]
History
The racemic parent compound racemorphan was first described in a Swiss and US patent application from Hoffmann-La Roche in 1946 and 1947, respectively; a patent was granted in 1950. A resolution of the two isomers of racemorphan with tartaric acid was published in 1952,[34] and DXM was successfully tested in 1954 as part of US Navy and CIA-funded research on nonaddictive substitutes for codeine.[35] DXM was approved by the FDA in 1958 as an over-the-counter antitussive.[34] As had been initially hoped, DXM was a solution for some of the problems associated with the use of codeine phosphate as a cough suppressant, such as sedation and opiate dependence, but like the dissociative anesthetics phencyclidine and ketamine, DXM later became associated with nonmedical use.[34][36]
During the 1960s and 1970s, dextromethorphan became available in an over-the-counter tablet form by the brand name Romilar. In 1973, Romilar was taken off the shelves after a burst in sales because of frequent misuse, and was replaced by cough syrup in an attempt to cut down on abuse.[36] The advent of widespread internet access in the 1990s allowed users to rapidly disseminate information about DXM, and online discussion groups formed around use and acquisition of the drug. As early as 1996, DXM HBr powder could be purchased in bulk from online retailers, allowing users to avoid consuming DXM in syrup preparations.[34] As of January 1, 2012, dextromethorphan is prohibited for sale to minors in California (and, as of January 1, 2018, in Oregon), except with a doctor's prescription.[37] Several other states have also begun regulating sales of dextromethorphan to minors.
In Indonesia, the National Agency of Drug and Food Control (BPOM-RI) prohibited single-component dextromethorphan drug sales with or without prescription. Indonesia is the only country in the world that makes single-component dextromethorphan illegal even by prescription[38] and violators may be prosecuted by law. National Anti-Narcotics Agency (BNN RI) has even threatened to revoke pharmacies' and drug stores' licenses if they still stock dextromethorphan, and will notify the police for criminal prosecution.[39] As a result of this regulation, 130 drugs have been withdrawn from the market, but drugs containing multicomponent dextromethorphan can be sold over the counter.[40] In its official press release, BPOM-RI also stated that dextromethorphan is often used as a substitute for marijuana, amphetamine, and heroin by drug abusers, and its use as an antitussive is less beneficial nowadays.[41]
Society and culture
Marketing
It is sold under generic labels and store brands, as well as brand names such as Benylin DM, Mucinex DM, Camydex-20 tablets, Robitussin, NyQuil, Dimetapp, Vicks, Coricidin, Delsym, TheraFlu, Charcoal D, and Cinfatós. It has been used in counterfeit medications.[42]
Recreational use
Dextromethorphan gel capsules
Over-the-counter preparations containing dextromethorphan have been used in manners inconsistent with their labeling, often as a recreational drug.[36] At doses much higher than medically recommended, DXM and its major metabolite, dextrorphan, act as NMDA receptor antagonists, producing dissociative hallucinogenic states somewhat similar to ketamine and phencyclidine.[43] Like ketamine and phencyclidine (PCP), dextromethorphan-infused substances go by the street name "Angel". DXM may produce distortions of the visual field, feelings of dissociation, distorted bodily perception, excitement, and a loss of sense of time. Some users report stimulant-like euphoria, particularly in response to music. Dextromethorphan usually provides its recreational effects in a non-linear fashion, so that they are experienced in significantly varied stages. These stages are commonly referred to as "plateaus", numbered from one (the lowest) to four, each said to come with different effects and experiences.[44] Teens are more likely to use dextromethorphan-related drugs because they are easy to access and are seen as an easy way to cope with psychiatric disorders.[45]
Research
Dextromethorphan/quinidine is also under investigation for the treatment of a variety of other neurological and neuropsychiatric conditions besides pseudobulbar affect, such as agitation associated with Alzheimer's disease and major depressive disorder.[10]
|
/**
 * Gets the role matching the given name.
 * @param nombreRol the role name to look up
 * @return the matching Rol
 * @throws ExceptionFatal if the lookup fails
 */
public static Rol consultarRol(String nombreRol) throws ExceptionFatal {
    try {
        // NOTE: concatenating nombreRol into the query string is vulnerable to
        // injection; prefer a parameterized query if the API supports one.
        return (Rol) FabricaConexiones.conectar("postgres").consultarObjeto("select r from Rol r where r.nombre = '" + nombreRol + "'");
} catch (Exception e) {
throw new ExceptionFatal(e.getMessage());
}
} |
// copyFileWithTar returns a function which copies a single file from outside
// of any container, or another container, into our working container, mapping
// read permissions using the passed-in ID maps, writing using the container's
// ID mappings, possibly overridden using the passed-in chownOpts
func (b *Builder) copyFileWithTar(tarIDMappingOptions *IDMappingOptions, chownOpts *idtools.IDPair, hasher io.Writer) func(src, dest string) error {
if tarIDMappingOptions == nil {
tarIDMappingOptions = &IDMappingOptions{
HostUIDMapping: true,
HostGIDMapping: true,
}
}
return func(src, dest string) error {
logrus.Debugf("copyFileWithTar(%s, %s)", src, dest)
f, err := os.Open(src)
if err != nil {
return errors.Wrapf(err, "error opening %q to copy its contents", src)
}
defer func() {
if f != nil {
f.Close()
}
}()
sysfi, err := system.Lstat(src)
if err != nil {
return errors.Wrapf(err, "error reading attributes of %q", src)
}
hostUID := sysfi.UID()
hostGID := sysfi.GID()
containerUID, containerGID, err := util.GetContainerIDs(tarIDMappingOptions.UIDMap, tarIDMappingOptions.GIDMap, hostUID, hostGID)
if err != nil {
return errors.Wrapf(err, "error mapping owner IDs of %q: %d/%d", src, hostUID, hostGID)
}
fi, err := os.Lstat(src)
if err != nil {
return errors.Wrapf(err, "error reading attributes of %q", src)
}
hdr, err := tar.FileInfoHeader(fi, filepath.Base(src))
if err != nil {
return errors.Wrapf(err, "error generating tar header for: %q", src)
}
hdr.Name = filepath.Base(dest)
hdr.Uid = int(containerUID)
hdr.Gid = int(containerGID)
pipeReader, pipeWriter := io.Pipe()
writer := tar.NewWriter(pipeWriter)
var copyErr error
go func(srcFile *os.File) {
err := writer.WriteHeader(hdr)
if err != nil {
logrus.Debugf("error writing header for %s: %v", srcFile.Name(), err)
copyErr = err
}
n, err := pools.Copy(writer, srcFile)
if n != hdr.Size {
logrus.Debugf("expected to write %d bytes for %s, wrote %d instead", hdr.Size, srcFile.Name(), n)
}
if err != nil {
logrus.Debugf("error reading %s: %v", srcFile.Name(), err)
copyErr = err
}
if err = writer.Close(); err != nil {
logrus.Debugf("error closing write pipe for %s: %v", srcFile.Name(), err)
}
if err = srcFile.Close(); err != nil {
logrus.Debugf("error closing %s: %v", srcFile.Name(), err)
}
pipeWriter.Close()
pipeWriter = nil
}(f)
untar := b.untar(chownOpts, hasher)
err = untar(pipeReader, filepath.Dir(dest))
if err == nil {
err = copyErr
}
f = nil
if pipeWriter != nil {
pipeWriter.Close()
}
return err
}
} |
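The Go helper above streams a single file through an in-memory tar archive so that ownership can be rewritten in the header before extraction into the container. A rough Python analogue of that header-rewriting idea, using the stdlib `tarfile` module (the UID/GID values and file name are illustrative):

```python
import io
import tarfile

def tar_single_file(name: str, payload: bytes, uid: int, gid: int) -> bytes:
    """Pack one file into a tar stream, overriding owner IDs in the header."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tw:
        hdr = tarfile.TarInfo(name=name)
        hdr.size = len(payload)
        hdr.uid = uid  # container-side UID, analogous to hdr.Uid in the Go code
        hdr.gid = gid  # container-side GID, analogous to hdr.Gid in the Go code
        tw.addfile(hdr, io.BytesIO(payload))
    return buf.getvalue()

data = tar_single_file("dest.txt", b"hello", uid=1000, gid=1000)
with tarfile.open(fileobj=io.BytesIO(data)) as tr:
    member = tr.getmember("dest.txt")
    print(member.uid, member.gid, tr.extractfile(member).read())  # → 1000 1000 b'hello'
```

The Go version additionally runs writer and reader concurrently over an `io.Pipe` so large files never need to fit in memory; the sketch above buffers for simplicity.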
import numpy as np
import pytest
import os
from sympde.topology import Domain
from psydac.api.discretization import discretize
from psydac.utilities.utils import refine_array_1d
try:
mesh_dir = os.environ['PSYDAC_MESH_DIR']
except KeyError:
base_dir = os.path.dirname(os.path.realpath(__file__))
base_dir = os.path.join(base_dir, '..', '..', '..')
mesh_dir = os.path.join(base_dir, 'mesh')
@pytest.mark.parametrize('geometry_file', ['collela_3d.h5', 'collela_2d.h5', 'bent_pipe.h5'])
@pytest.mark.parametrize('refinement', [2, 3, 4])
def test_build_mesh(geometry_file, refinement):
filename = os.path.join(mesh_dir, geometry_file)
domain = Domain.from_file(filename)
domainh = discretize(domain, filename=filename)
for mapping in domainh.mappings.values():
space = mapping.space
grid = [refine_array_1d(space.breaks[i], refinement, remove_duplicates=False) for i in range(mapping.ldim)]
x_mesh, y_mesh, z_mesh = mapping.build_mesh(grid, npts_per_cell=refinement + 1)
if mapping.ldim == 2:
eta1, eta2 = grid
pcoords = np.array([[mapping(e1, e2) for e2 in eta2] for e1 in eta1])
x_mesh_l = pcoords[..., 0:1]
y_mesh_l = pcoords[..., 1:2]
z_mesh_l = np.zeros_like(x_mesh_l)
elif mapping.ldim == 3:
eta1, eta2, eta3 = grid
pcoords = np.array([[[mapping(e1, e2, e3) for e3 in eta3] for e2 in eta2] for e1 in eta1])
x_mesh_l = pcoords[..., 0]
y_mesh_l = pcoords[..., 1]
z_mesh_l = pcoords[..., 2]
else:
assert False
assert x_mesh.flags['C_CONTIGUOUS'] and y_mesh.flags['C_CONTIGUOUS'] and z_mesh.flags['C_CONTIGUOUS']
assert np.allclose(x_mesh, x_mesh_l)
assert np.allclose(y_mesh, y_mesh_l)
assert np.allclose(z_mesh, z_mesh_l)
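The test refines each breakpoint array before building the mesh. As a rough illustration of what a uniform 1D refinement along the lines of `refine_array_1d` does (this sketch is a guess at the behavior, not psydac's actual implementation), one can insert `n` evenly spaced points inside every cell:

```python
import numpy as np

def refine_uniform_1d(breaks, n, remove_duplicates=True):
    """Insert n extra evenly spaced points inside each cell of `breaks`."""
    pieces = []
    for a, b in zip(breaks[:-1], breaks[1:]):
        cell = np.linspace(a, b, n + 2)  # n interior points plus both endpoints
        if remove_duplicates and pieces:
            cell = cell[1:]              # drop the shared left endpoint
        pieces.append(cell)
    return np.concatenate(pieces)

print(refine_uniform_1d([0.0, 0.5, 1.0], 1))
```

With `remove_duplicates=False` (as in the test above) the shared breakpoints are kept twice, which lines up with the `npts_per_cell=refinement + 1` structure expected by `build_mesh`.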
|
// src/disco_l475.rs
use stm32l4xx_hal::usb::{Peripheral, UsbBus};
use stm32l4xx_hal::serial::{Serial, self};
use stm32l4xx_hal::prelude::*;
use stm32l4xx_hal::stm32::{self, Peripherals};
use cortex_m::peripheral::syst::SystClkSource;
pub fn reset() -> ! {
panic!("reset");
}
fn enable_crs() {
let rcc = unsafe { &(*stm32::RCC::ptr()) };
rcc.apb1enr1.modify(|_, w| w.crsen().set_bit());
let crs = unsafe { &(*stm32::CRS::ptr()) };
// Initialize clock recovery
// Set autotrim enabled.
crs.cr.modify(|_, w| w.autotrimen().set_bit());
// Enable CR
crs.cr.modify(|_, w| w.cen().set_bit());
}
/// Enables VddUSB power supply
fn enable_usb_pwr() {
// Enable PWR peripheral
let rcc = unsafe { &(*stm32::RCC::ptr()) };
rcc.apb1enr1.modify(|_, w| w.pwren().set_bit());
// Enable VddUSB
let pwr = unsafe { &*stm32::PWR::ptr() };
pwr.cr2.modify(|_, w| w.usv().set_bit());
}
pub fn init() -> (impl usb_device::class_prelude::UsbBus,(),(),()) {
// Get access to the device specific peripherals from the peripheral access crate
let p = Peripherals::take().unwrap_or_else(|| unreachable!());
let mut cp = cortex_m::Peripherals::take().unwrap_or_else(|| unreachable!());
// Take ownership over the raw flash and rcc devices and convert them
// into the corresponding HAL structs
let mut flash = p.FLASH.constrain();
let mut rcc = p.RCC.constrain();
let mut pwr = p.PWR.constrain(&mut rcc.apb1r1);
// Freeze the configuration of all the clocks in the system and store
// the frozen frequencies in `clocks`
let clocks = rcc.cfgr.sysclk(80.mhz()).freeze(&mut flash.acr, &mut pwr);
// Acquire the GPIOB peripheral
    // PB6/PB7 and PB3 all live on GPIOB, so split it once and reuse it
    let mut gpiob = p.GPIOB.split(&mut rcc.ahb2);
    let tx = gpiob.pb6.into_af7(&mut gpiob.moder, &mut gpiob.afrl);
    let rx = gpiob.pb7.into_af7(&mut gpiob.moder, &mut gpiob.afrl);
    let mut led = gpiob
        .pb3
        .into_push_pull_output(&mut gpiob.moder, &mut gpiob.otyper);
    led.set_low().ok(); // Turn off
    let mut gpioa = p.GPIOA.split(&mut rcc.ahb2);
    let usb = Peripheral {
        usb: p.USB,
        pin_dm: gpioa.pa11.into_af10(&mut gpioa.moder, &mut gpioa.afrh),
        pin_dp: gpioa.pa12.into_af10(&mut gpioa.moder, &mut gpioa.afrh),
    };
let usb_bus = UsbBus::new(usb);
cp.SYST.set_clock_source(SystClkSource::External);
    cp.SYST.set_reload(clocks.sysclk().0 / (8 * 1_000)); // 1 ms tick at HCLK/8
cp.SYST.clear_current();
cp.SYST.enable_counter();
let (tx, rx) = Serial::usart1(
p.USART1,
(tx, rx),
serial::Config::default().baudrate(115_200.bps()),
clocks,
&mut rcc.apb2,
)
.split();
    // The surrounding board struct was stripped from this init stub, so the
    // serial handles are simply dropped here instead of being stored.
    let _ = (tx, rx);
((),(),(),())
}
pub async fn trigger(_ctx: ()) {
}
pub fn consume_debug(_f: impl FnMut(&[u8])->usize) {
}
#[macro_export]
macro_rules! dbgprint {
($($arg:tt)*) => {{}};
}
macro_rules! impl_capabilities {
($name:ty) => {
impl usbd_dfu::Capabilities for $name {
const CAN_UPLOAD: bool = true;
const CAN_DOWNLOAD: bool = true;
const IS_MANIFESTATION_TOLERANT: bool = true;
const WILL_DETACH: bool = false;
const DETACH_TIMEOUT: u16 = 5000;
const TRANSFER_SIZE: u16 = 4096;
}
};
}
pub struct DFURuntimeImpl;
impl_capabilities!(DFURuntimeImpl);
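The SysTick configuration in `init` derives its reload value from the core clock. A quick sketch of that arithmetic (assuming the 80 MHz sysclk configured above and the divide-by-8 external SysTick clock source used on STM32 parts):

```python
SYSCLK_HZ = 80_000_000   # sysclk(80.mhz()) in the init code
SYSTICK_DIV = 8          # SystClkSource::External runs SysTick at HCLK/8
TICK_HZ = 1_000          # desired 1 ms tick

reload = SYSCLK_HZ // (SYSTICK_DIV * TICK_HZ)
print(reload)            # → 10000 counts per millisecond interrupt

# SysTick is a 24-bit down-counter, so the reload value must fit:
assert reload < 2**24
```

At slower tick rates or higher clocks the 24-bit limit becomes the binding constraint, which is why a 1 ms tick is a common choice.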
|
package org.fade.pattern.bp.observer.improve;
/**
 * Observer pattern
 * Improved version
 * Client
 * @author fade
 * */
public class Client {
public static void main(String[] args) {
WeatherData weatherData = new WeatherData();
Observer baidu = new BaiduWeather();
Observer sina = new SinaWeather();
weatherData.registerObserver(baidu);
weatherData.registerObserver(sina);
weatherData.setData(25f,1020.3f,0.85f);
}
}
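The client above registers two observers and pushes new weather data to both. A compact sketch of the same push-style observer wiring in Python (class and method names mirror the Java snippet but are illustrative; the Java sources for WeatherData and the observers are not shown here):

```python
class WeatherData:
    """Subject: keeps a list of observers and pushes updates to them."""
    def __init__(self):
        self._observers = []

    def register_observer(self, obs):
        self._observers.append(obs)

    def set_data(self, temperature, pressure, humidity):
        # Push the new measurements to every registered observer.
        for obs in self._observers:
            obs.update(temperature, pressure, humidity)

class Site:
    """Observer: remembers the last measurement it was pushed."""
    def __init__(self, name):
        self.name, self.last = name, None

    def update(self, t, p, h):
        self.last = (t, p, h)

weather = WeatherData()
baidu, sina = Site("baidu"), Site("sina")
weather.register_observer(baidu)
weather.register_observer(sina)
weather.set_data(25.0, 1020.3, 0.85)
print(baidu.last, sina.last)
```

The "improved" variant the package name hints at typically means the subject pushes data through a common `Observer` interface rather than calling each site's concrete class directly.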
|
The 49ers have sought permission to interview Colts special assistant Rob Chudzinski for their vacated offensive coordinator spot, the NFL Network reported.
Chudzinski, 46, was the Browns head coach in 2013, but was fired following a 4-12 season. He has served as the offensive coordinator with the Panthers (2011-12) and Browns (2007-08). He was also the offensive coordinator at the University of Miami from 2001-03, when 49ers running back Frank Gore was at the school.
The Rams and Bears have also reportedly sought permission to interview Chudzinski.
********************************************************************
Niners safety Antoine Bethea will replace Seattle’s Kam Chancellor in the Pro Bowl, which will be held Sunday in Glendale, Ariz.
In his first season with the 49ers, Bethea, 30, ranked third on the team in tackles (86) and second in interceptions (4).
Niners left guard Mike Iupati, 27, who appeared on the injury report with knee and elbow injuries in the second half of the season, has withdrawn from the Pro Bowl and will be replaced by Miami’s Mike Pouncey.
Bethea and left tackle Joe Staley will be the 49ers’ lone Pro Bowl representatives.
Twitter: @Eric_Branch |
/**
* This class handles the OS specific resource locations
*
* @author aguelle
*
*/
@Configuration
public class Resources {
static Logger log = Logger.getLogger(Resources.class.getName());
private final String WORKING_DIRECTORY;
private final String IMAGE_PATH;
private final String PROPERTIES_PATH;
private final String DEFAULT_FILE_LIST_PATH;
private final String DEFAULT_FILE_NAME = "file_list.mido";
private final String MIDIAUTOMATOR_PROPERTIES = "midiautomator.properties";
private final String PROJECT_PROPERTIES = "project.properties";
private final String KEY_VERSION = "midiautomator.version";
private final String LOG_FILE_NAME = "MidiAutomator.log";
private final String LOG_CONSOLE_APPENDER_NAME = "Console Appender";
private final String LOG_FILE_APPENDER_NAME = "File Appender";
public Resources() {
WORKING_DIRECTORY = MidiAutomator.wd;
if (System.getProperty("os.name").contains("Mac")) {
IMAGE_PATH = WORKING_DIRECTORY + "/images";
PROPERTIES_PATH = SystemUtils
.replaceSystemVariables("$HOME/Library/Application Support/Midi Automator");
DEFAULT_FILE_LIST_PATH = SystemUtils
.replaceSystemVariables("$HOME/Library/Application Support/Midi Automator");
configureLog4J(SystemUtils
.replaceSystemVariables("$HOME/Library/Application Support/Midi Automator/")
+ LOG_FILE_NAME);
} else if (System.getProperty("os.name").contains("Windows")) {
IMAGE_PATH = "images";
PROPERTIES_PATH = SystemUtils
.replaceSystemVariables("%HOMEPATH%\\AppData\\Roaming\\Midi Automator");
DEFAULT_FILE_LIST_PATH = SystemUtils
.replaceSystemVariables("%HOMEPATH%\\AppData\\Roaming\\Midi Automator");
configureLog4J(SystemUtils
.replaceSystemVariables("%HOMEPATH%\\AppData\\Roaming\\Midi Automator\\")
+ LOG_FILE_NAME);
} else {
IMAGE_PATH = "null";
PROPERTIES_PATH = "null";
DEFAULT_FILE_LIST_PATH = "null";
}
migrate();
log.info("Starting Midi Automator version: " + getVersion());
log.info("Working Directory (-wd) set to: " + WORKING_DIRECTORY);
}
public String getImagePath() {
return IMAGE_PATH;
}
public String getPropertiesPath() {
return PROPERTIES_PATH;
}
public String getDefaultFileListPath() {
return DEFAULT_FILE_LIST_PATH;
}
public String getWorkingDirectory() {
return WORKING_DIRECTORY;
}
public String getDefaultFileName() {
return DEFAULT_FILE_NAME;
}
public String getMidiAutomatorPropertiesFileName() {
return MIDIAUTOMATOR_PROPERTIES;
}
/**
* Gets the version from the properties file
*
* @return application version
*/
public String getVersion() {
Properties properties = new Properties();
InputStream inputStream = getClass().getClassLoader()
.getResourceAsStream(PROJECT_PROPERTIES);
try {
properties.load(inputStream);
} catch (IOException e) {
e.printStackTrace();
}
return properties.getProperty(KEY_VERSION);
}
/**
* Configures the Log4J properties
*
* @param logFilePath
* The log file path
*/
private void configureLog4J(String logFilePath) {
// This is the root logger provided by log4j
Logger rootLogger = Logger.getRootLogger();
rootLogger.setLevel(Level.DEBUG);
// Define log pattern layout
PatternLayout layout = new PatternLayout("[%-5p] %d %c - %m%n");
// Add console appender to root logger
if (rootLogger.getAppender(LOG_CONSOLE_APPENDER_NAME) == null) {
ConsoleAppender consoleAppender = new ConsoleAppender(layout);
consoleAppender.setName(LOG_CONSOLE_APPENDER_NAME);
rootLogger.addAppender(consoleAppender);
}
// Add file appender with layout and output log file name
try {
if (rootLogger.getAppender(LOG_FILE_APPENDER_NAME) == null) {
RollingFileAppender fileAppender = new RollingFileAppender(
layout, logFilePath);
fileAppender.setAppend(false);
fileAppender.setImmediateFlush(true);
fileAppender.setName(LOG_FILE_APPENDER_NAME);
fileAppender.setMaxFileSize("5MB");
fileAppender.setMaxBackupIndex(10);
rootLogger.addAppender(fileAppender);
}
} catch (IOException e) {
System.out.println("Failed to add appender !!");
}
}
/**
* Migrates the persistence files to this version.
*/
private void migrate() {
// add version to properties file
log.info("Migrating to version: " + getVersion());
}
} |
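The constructor above branches on the operating system to pick per-OS locations for images, properties, and logs. A small sketch of the same branching in Python (the macOS and Windows paths mirror the Java snippet; the fallback for other platforms is illustrative, since the Java code leaves them as "null"):

```python
import platform

def app_support_dir(app="Midi Automator"):
    """Return the conventional per-user data directory for this OS."""
    system = platform.system()
    if system == "Darwin":
        return f"$HOME/Library/Application Support/{app}"
    if system == "Windows":
        return f"%HOMEPATH%\\AppData\\Roaming\\{app}"
    # The Java snippet leaves other platforms unconfigured ("null");
    # an XDG-style default would be the usual Linux choice instead.
    return f"$HOME/.config/{app}"

print(app_support_dir())
```

Resolving the `$HOME`/`%HOMEPATH%` placeholders is left to a helper like the Java code's `SystemUtils.replaceSystemVariables`.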
/*
* GoScans, a collection of network scan modules for infrastructure discovery and information gathering.
*
* Copyright (c) Si<EMAIL> AG, 2016-2021.
*
* This work is licensed under the terms of the MIT license. For a copy, see the LICENSE file in the top-level
* directory or visit <https://opensource.org/licenses/MIT>.
*
*/
package banner
import (
"bytes"
"crypto/tls"
"fmt"
"github.com/ziutek/telnet"
"go-scans/utils"
"net"
"strings"
"time"
)
const Label = "Banner"
const receiveSize = 2048
const tagPlain = "Plain"
const tagSsl = "Ssl"
const tagTelnet = "Telnet"
const tagHttp = "Http"
const tagHttps = "Https"
const triggerWindows = "\r\n" // Line feed to use. However, Linux systems often don't like it and don't respond to it
const triggerLinux = "\n" // Line feed that makes Linux systems happy to respond
const triggerHttp = "GET / HTTP/1.1\r\nHost: %s\r\n\r\n" // GET request that makes HTTP servers to respond. They usually don't care about the used line feed, independent of the underlying OS
// Setup configures the environment accordingly, if the scan module has some special requirements. A successful setup
// is required before a scan can be started.
func Setup(logger utils.Logger) error {
return nil
}
// CheckSetup checks whether Setup() executed accordingly. Scan arguments should be checked by the scanner.
func CheckSetup() error {
return nil
}
type ResultData struct {
Plain []byte
Ssl []byte
Telnet []byte
Http []byte
Https []byte
}
type Result struct {
Data *ResultData // Bytes array, to be converted by consumer as required
Status string // Final scan status (success or graceful error). Should be stored along with the scan results.
Exception bool // Indicates if something went wrong badly and results shall be discarded. This should never be
// true, because all errors should be handled gracefully. Logging an error message should always precede setting
// this flag! This flag may additionally come along with a message put into the status attribute.
}
type Scanner struct {
Label string
Started time.Time
Finished time.Time
logger utils.Logger
target string
port int
protocol string
dialTimeout time.Duration
receiveTimeout time.Duration
}
func NewScanner(
logger utils.Logger, // Can be any logger implementing our minimalistic interface. Wrap your logger to satisfy the interface, if necessary (like utils.LoggerTest).
target string,
port int,
protocol string,
dialTimeout time.Duration,
receiveTimeout time.Duration,
) (*Scanner, error) {
// Check whether input target is valid
if !utils.IsValidAddress(target) {
return nil, fmt.Errorf("invalid target '%s'", target)
}
// Check whether input protocol is valid
if !(protocol == "tcp" || protocol == "udp") {
return nil, fmt.Errorf("invalid protocol '%s'", protocol)
}
// Initiate scanner with sanitized input values
scan := Scanner{
Label,
time.Time{}, // zero time
time.Time{}, // zero time
logger,
strings.TrimSpace(target), // Address to be scanned (might be IPv4, IPv6 or hostname)
port,
strings.TrimSpace(protocol),
dialTimeout,
receiveTimeout,
}
// Return scan struct
return &scan, nil
}
// Run starts scan execution. This must either be executed as a goroutine, or another thread must be active listening
// on the scan's result channel, in order to avoid a deadlock situation.
func (s *Scanner) Run() (res *Result) {
// Recover potential panics to gracefully shut down scan
defer func() {
if r := recover(); r != nil {
// Log exception
s.logger.Errorf(fmt.Sprintf("Unexpected error: %s", r))
// Build error status from error message and formatted stacktrace
errMsg := fmt.Sprintf("%s%s", r, utils.StacktraceIndented("\t"))
// Return result set indicating exception
res = &Result{
nil,
errMsg,
true,
}
}
}()
// Set scan started flag
s.Started = time.Now()
s.logger.Infof("Started scan of %s:%d (%s).", s.target, s.port, s.protocol)
// Execute scan logic
res = s.execute()
// Log scan completion message
s.Finished = time.Now()
duration := s.Finished.Sub(s.Started).Minutes()
s.logger.Infof("Finished scan of %s:%d (%s) in %fm.", s.target, s.port, s.protocol, duration)
// Return result set
return res
}
func (s *Scanner) execute() *Result {
// Declare temporary results variable collecting intermediate results
tmpResults := make(map[string][]byte)
// Try plain socket trigger for Windows
s.logger.Debugf("Sending plain socket trigger (Windows).")
respPlainWin, errPlainWin := sendPlain(s.target, s.port, s.protocol, triggerWindows, s.dialTimeout, s.receiveTimeout)
// Check first error response for connection timeout. Don't proceed if host/port ist not reachable at all
if socketErrorType(errPlainWin) == "dial" {
s.logger.Debugf("Endpoint not reachable, aborting scan.")
return &Result{
&ResultData{},
utils.StatusNotReachable,
false,
}
}
// Store result if not empty
s.updateResultMap(tmpResults, tagPlain, respPlainWin, errPlainWin)
// Try Linux line feed if Windows line feed did not work (wrong line feed might cause read timeout)
if _, ok := tmpResults[tagPlain]; !ok {
s.logger.Debugf("Sending plain socket trigger (Linux).")
respPlainLin, errPlainLin := sendPlain(s.target, s.port, s.protocol, triggerLinux, s.dialTimeout, s.receiveTimeout)
s.updateResultMap(tmpResults, tagPlain, respPlainLin, errPlainLin) // Store result if not empty
}
// Continue with triggers only working on TCP
if s.protocol == "tcp" {
// Try SSL socket trigger for Windows
s.logger.Debugf("Sending SSL socket trigger (Windows).")
respSslWin, errSslWin := sendSsl(s.target, s.port, triggerWindows, s.dialTimeout, s.receiveTimeout)
s.updateResultMap(tmpResults, tagSsl, respSslWin, errSslWin) // Store result if not empty
// Try Linux line feed if Windows line feed did not work (wrong line feed might cause read timeout)
if _, ok := tmpResults[tagSsl]; !ok {
// Try SSL socket trigger for Linux
s.logger.Debugf("Sending SSL socket trigger (Linux).")
respSslLin, errSslLin := sendSsl(s.target, s.port, triggerLinux, s.dialTimeout, s.receiveTimeout)
s.updateResultMap(tmpResults, tagSsl, respSslLin, errSslLin) // Store result if not empty
}
// Try telnet trigger for Windows
s.logger.Debugf("Sending telnet trigger (Windows).")
respTelnetWin, errTelWin := sendTelnet(s.target, s.port, true, s.dialTimeout, s.receiveTimeout)
s.updateResultMap(tmpResults, tagTelnet, respTelnetWin, errTelWin) // Store result if not empty
// Try Linux line feed if Windows line feed did not work (wrong line feed might cause read timeout)
if _, ok := tmpResults[tagTelnet]; !ok {
// Try telnet trigger for Linux
s.logger.Debugf("Sending telnet trigger (Linux).")
respTelnetLin, errTelLin := sendTelnet(s.target, s.port, false, s.dialTimeout, s.receiveTimeout)
s.updateResultMap(tmpResults, tagTelnet, respTelnetLin, errTelLin) // Store result if not empty
}
// Prepare HTTP request
req := fmt.Sprintf(triggerHttp, s.target)
// Try HTTP trigger
s.logger.Debugf("Sending HTTP trigger.")
respHttp, errHttp := sendPlain(s.target, s.port, "tcp", req, s.dialTimeout, s.receiveTimeout)
s.updateResultMap(tmpResults, tagHttp, respHttp, errHttp) // Store result if not empty
// Try HTTPs trigger
s.logger.Debugf("Sending HTTPS trigger.")
respHttps, errHttps := sendSsl(s.target, s.port, req, s.dialTimeout, s.receiveTimeout)
s.updateResultMap(tmpResults, tagHttps, respHttps, errHttps) // Store result if not empty
}
// Prepare results data
results := &ResultData{
Plain: tmpResults[tagPlain], // Returns result bytes or empty bytes
Ssl: tmpResults[tagSsl], // Returns result bytes or empty bytes
Telnet: tmpResults[tagTelnet], // Returns result bytes or empty bytes
Http: tmpResults[tagHttp], // Returns result bytes or empty bytes
Https: tmpResults[tagHttps], // Returns result bytes or empty bytes
}
// Return pointer to result struct
s.logger.Debugf("Returning scan result")
return &Result{
results,
utils.StatusCompleted,
false,
}
}
func (s *Scanner) updateResultMap(res map[string][]byte, triggerName string, response []byte, err error) {
// Get rid of whitespaces
response = bytes.TrimSpace(response)
// Log error or store valid response
if err != nil {
s.logger.Debugf("Trigger '%s' failed: %s", triggerName, err)
} else if len(response) == 0 {
s.logger.Debugf("Trigger response '%s' was empty.", triggerName)
} else {
res[triggerName] = response
}
}
func sendPlain(
address string,
port int,
protocol string,
trigger string,
dialTimeout time.Duration,
receiveTimeout time.Duration,
) ([]byte, error) {
// Establish TCP/UDP connection
conn, errCon := net.DialTimeout(protocol, fmt.Sprintf("%s:%d", address, port), dialTimeout)
// Return error if TCP/UDP connection failed
if errCon != nil {
return []byte{}, errCon
}
// Make sure connection gets closed on exit
defer func() { _ = conn.Close() }()
// Set maximum time to wait. Go sockets require timestamp to timeout, not int (seconds)
errSet := conn.SetDeadline(time.Now().Add(receiveTimeout))
if errSet != nil {
return []byte{}, errSet
}
// Send trigger
_, errWrite := conn.Write([]byte(trigger))
if errWrite != nil {
return []byte{}, errWrite
}
	// Receive response
	responseBuffer := make([]byte, receiveSize)
	n, errRead := conn.Read(responseBuffer)
	if errRead != nil && errRead.Error() != "EOF" {
		return []byte{}, errRead
	}
	// Keep only the bytes actually read, then cut at the first null byte, if any
	responseBuffer = responseBuffer[:n]
	if i := bytes.IndexByte(responseBuffer, 0); i >= 0 {
		responseBuffer = responseBuffer[:i]
	}
// Return response
return responseBuffer, nil
}
func sendSsl(address string, port int, trigger string, dialTimeout, receiveTimeout time.Duration) ([]byte, error) {
// Connect to address
conn, errDial := tls.DialWithDialer(
&net.Dialer{Timeout: dialTimeout},
"tcp",
fmt.Sprintf("%s:%d", address, port),
utils.InsecureTlsConfigFactory(),
)
if errDial != nil {
return []byte{}, errDial
}
// Make sure connection gets closed on exit
defer func() { _ = conn.Close() }()
// Set maximum time to wait. Go sockets require timestamp to timeout, not int (seconds)
errSet := conn.SetDeadline(time.Now().Add(receiveTimeout))
if errSet != nil {
return []byte{}, errSet
}
// Send trigger
_, errWrite := conn.Write([]byte(trigger))
if errWrite != nil {
return []byte{}, errWrite
}
	// Receive response
	responseBuffer := make([]byte, receiveSize)
	n, errRead := conn.Read(responseBuffer)
	if errRead != nil && errRead.Error() != "EOF" {
		return []byte{}, errRead
	}
	// Keep only the bytes actually read, then cut at the first null byte, if any
	responseBuffer = responseBuffer[:n]
	if i := bytes.IndexByte(responseBuffer, 0); i >= 0 {
		responseBuffer = responseBuffer[:i]
	}
// Return response
return responseBuffer, nil
}
func sendTelnet(address string, port int, isWindows bool, dialTimeout, receiveTimeout time.Duration) ([]byte, error) {
// Connect to address
conn, errDial := telnet.DialTimeout("tcp", fmt.Sprintf("%s:%d", address, port), dialTimeout)
if errDial != nil {
return []byte{}, errDial
}
// Make sure connection gets closed on exit
defer func() { _ = conn.Close() }()
// Set maximum time to wait. Go sockets require timestamp to timeout, not int (seconds)
errSet := conn.SetDeadline(time.Now().Add(receiveTimeout))
if errSet != nil {
return []byte{}, errSet
}
// If connection is to Linux, set Unix write mode
if !isWindows {
conn.SetUnixWriteMode(true)
}
// Send trigger
_, errWrite := conn.Write([]byte("\n"))
if errWrite != nil {
return []byte{}, errWrite
}
	// Receive response
	responseBuffer := make([]byte, receiveSize)
	n, errRead := conn.Read(responseBuffer)
	if errRead != nil && errRead.Error() != "EOF" {
		return []byte{}, errRead
	}
	// Keep only the bytes actually read, then cut at the first null byte, if any
	responseBuffer = responseBuffer[:n]
	if i := bytes.IndexByte(responseBuffer, 0); i >= 0 {
		responseBuffer = responseBuffer[:i]
	}
// Return response
return responseBuffer, nil
}
// Digging deep to find out which kind of socket error happened. Because Golang is too stupid to give you an error code
// or a certain error type. Just an error string, which might be different on different OS/languages.
// Returns empty string if error is not a socket issue
func socketErrorType(err error) string {
switch t := err.(type) {
case *net.OpError:
return t.Op // Might be "dial", "read", "write",...
}
return "" // Empty if not a socket error
}
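The three send* helpers share the same shape: connect with a dial timeout, write a trigger, read up to a fixed buffer size, and trim the response. A minimal Python sketch of that pattern, exercised here against a local one-shot server so it is self-contained (the server, port, and banner text are stand-ins for a real scan target):

```python
import socket
import threading

RECEIVE_SIZE = 2048  # mirrors receiveSize in the Go module

def send_plain(host, port, trigger, timeout=2.0):
    """Connect, send a trigger, and return the trimmed banner bytes."""
    with socket.create_connection((host, port), timeout=timeout) as conn:
        conn.settimeout(timeout)        # read deadline, like SetDeadline in Go
        conn.sendall(trigger)
        data = conn.recv(RECEIVE_SIZE)
    # Mirror the Go helper: stop at the first null byte, then drop whitespace.
    data = data.split(b"\x00", 1)[0]
    return data.strip()

# Local stand-in server that consumes the trigger, greets, and closes.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)

def serve_once():
    c, _ = srv.accept()
    c.recv(16)
    c.sendall(b"220 test-banner\r\n")
    c.close()

threading.Thread(target=serve_once, daemon=True).start()
print(send_plain("127.0.0.1", srv.getsockname()[1], b"\r\n"))  # → b'220 test-banner'
```

The Go module layers its Windows/Linux line-feed retries, SSL, telnet, and HTTP variants on top of exactly this core loop.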
|