// NewStateStore creates a new persistent state storage.
func NewStateStore(path string, l logging.Logger) (storage.StateStorer, error) {
	db, err := leveldbDriver.Open(path, "")
	if err != nil {
		if !ldberr.IsCorrupted(err) {
			return nil, err
		}
		// A corrupted database falls through so a recovery path can reopen
		// it; if no usable handle was returned, fail instead of panicking
		// on the nil interface assertion below.
		if db == nil {
			return nil, err
		}
	}
	s := &store{
		db:     db.(driver.BatchDB),
		logger: l,
	}
	if err := migrate(s); err != nil {
		return nil, err
	}
	return s, nil
}
Universal Home Entertainment is bringing the action-thriller Eliminators to Blu-ray & DVD (read our review) on December 6, 2016. The film stars Scott Adkins (Close Range, Hard Target 2) and former WWE champ Wade Barrett. The film is directed by James Nunn, who previously worked with Adkins in Green Street 3: Never Back Down.
When his home in London is attacked, a former federal agent (Adkins) must emerge from hiding in the witness protection program to protect his daughter. With his true identity exposed to the criminal underworld, he goes on the run with Europe’s most dangerous assassin (Barrett) on his trail and must use every trick he knows to keep his family alive.
Adkins fans should consider themselves spoiled for the next several months, as the action icon has an array of films in the works, including Boyka: Undisputed IV, Savage Dog, Altar Rock, The Returner, a possible Ninja 3, and an appearance in Marvel’s upcoming Doctor Strange.
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *

class ScitokensCpp(CMakePackage):
    """A C++ implementation of the SciTokens library with a C library interface.

    SciTokens provide a token format for distributed authorization."""

    homepage = "https://github.com/scitokens/scitokens-cpp"
    url = "https://github.com/scitokens/scitokens-cpp/archive/refs/tags/v0.7.0.tar.gz"

    version('0.7.0', sha256='72600cf32523b115ec7abf4ac33fa369e0a655b3d3b390e1f68363e6c4e961b6')

    depends_on('sqlite')
    depends_on('curl')
    depends_on('uuid', type='build')

    # https://github.com/scitokens/scitokens-cpp/issues/72
    @when('^openssl@3:')
    def patch(self):
        filter_file(' -Werror', '', 'CMakeLists.txt')
def u_decay(self, value):
    if not isinstance(value, int):
        warnings.warn('u_decay should be an integer')
        value = int(value)
    if not 0 <= value <= 2 ** 12 - 1:
        raise ValueError('u_decay should be between 0 and 2**12-1')
    self._u_decay = value
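A minimal usage sketch of the setter above (the surrounding `Neuron` class and property wiring are invented here for illustration; only the setter body comes from the snippet) shows the warning-and-coerce path and the range check in action:

```python
import warnings

class Neuron:
    """Hypothetical host class for the u_decay setter shown above."""
    def __init__(self):
        self._u_decay = 0

    @property
    def u_decay(self):
        return self._u_decay

    @u_decay.setter
    def u_decay(self, value):
        if not isinstance(value, int):
            warnings.warn('u_decay should be an integer')
            value = int(value)
        if not 0 <= value <= 2 ** 12 - 1:
            raise ValueError('u_decay should be between 0 and 2**12-1')
        self._u_decay = value

n = Neuron()
n.u_decay = 100      # an int in range is accepted as-is
with warnings.catch_warnings(record=True):
    warnings.simplefilter("always")
    n.u_decay = 3.7  # warns, then stores int(3.7) == 3
```

Values outside [0, 4095] raise `ValueError` rather than being silently clamped.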
Hydrogen Sulfide and Glucose Homeostasis: A Tale of Sweet and the Stink.
SIGNIFICANCE
Among many endogenous mediators, the gasotransmitter hydrogen sulfide (H2S) plays an important role in the regulation of glucose homeostasis. In this article we discuss different functional roles of H2S in several metabolic organs/tissues required in the maintenance of glucose homeostasis.
RECENT ADVANCES
New evidence has emerged revealing the insulin-sensitizing role of H2S in adipose tissue and skeletal muscle biology. In addition, H2S was demonstrated to be a potent stimulator of gluconeogenesis via the induction and stimulation of various glucose-producing pathways in the liver.
CRITICAL ISSUES
Similar to its other physiological effects, H2S exhibits paradoxical characteristics in the regulation of glucose homeostasis: (1) H2S stimulates glucose production via activation of gluconeogenesis and glycogenolysis in hepatocytes, yet inhibits lipolysis in adipocytes; (2) H2S stimulates glucose uptake into adipocytes and skeletal muscle but inhibits glucose uptake into hepatocytes; (3) H2S inhibits insulin secretion from pancreatic β cells, yet sensitizes insulin signaling and insulin-triggered responses in adipose tissues and skeletal muscle. The impact H2S may have on glucose metabolism and utilization by other vital organs, such as the brain, also remains unclear.
FUTURE DIRECTIONS
Recent reports and ongoing studies lay the foundation for a general, although highly incomplete, understanding of the effect of H2S on regulating glucose homeostasis. In this review, we describe the molecular mechanisms and physiological outcomes of the gasotransmitter H2S on organs and tissues required for homeostatic maintenance of blood glucose. Future directions highlighting the H2S-mediated homeostatic control of glucose metabolism under physiological and insulin-resistant conditions are also discussed. Antioxid. Redox Signal. 28, 1463-1482.
Advanced male breast cancer treatment with the LH‐RH analogue buserelin alone or in combination with the antiandrogen flutamide
Ten men with advanced breast cancer were evaluated for response to treatment with the luteinizing hormone‐releasing hormone (LH‐RH) analogue, buserelin, alone or in combination with the antiandrogen, flutamide. One of five patients receiving buserelin as a single agent had a partial remission lasting 12 months, and with the addition of flutamide, this lasted over 24 additional months. Three patients had stable disease with a median duration of 6 months (range, two to 14). One patient had progressive disease. Of five patients receiving the combination of buserelin and flutamide from the beginning of therapy, four patients had a partial remission with a median duration of over 15 months (range, over five to 16). One patient's disease remained stable for 12 months. Major side effects were hot flushes, loss of libido, and impotence. Buserelin initiates a castration‐like endocrine response and has potential in the treatment of men with disseminated breast cancer when used either alone or in combination with flutamide.
// Close closes the datastore and releases all db resources.
func (s *nutsDBStore) Close() error {
	s.log.Debugf("closing store at path: %s", s.path)
	err := s.db.Close()
	s.db = nil
	s.log.Info("store closed")
	return s.logError("close", err)
}
#include <iostream>
#include <algorithm>
#include <vector>
using namespace std;

int main()
{
    int n;
    cin >> n;
    vector<int> arr(n);
    // 1-based indices of the children with skill 1, 2 and 3
    vector<int> idx1, idx2, idx3;
    for (int i = 0; i < n; i++) {
        cin >> arr[i];
        if (arr[i] == 1) idx1.push_back(i + 1);
        else if (arr[i] == 2) idx2.push_back(i + 1);
        else if (arr[i] == 3) idx3.push_back(i + 1);
    }
    // the number of complete teams is limited by the scarcest skill
    size_t teams = min({idx1.size(), idx2.size(), idx3.size()});
    cout << teams << '\n';
    for (size_t i = 0; i < teams; i++) {
        cout << idx1[i] << ' ' << idx2[i] << ' ' << idx3[i] << '\n';
    }
    return 0;
}
/**
 * Removes a role from a member
 *
 * @param argument
 *            the name of the role to remove
 * @param member
 *            the member to modify
 */
private void removeRole(String argument, Member member)
{
    try
    {
        List<Role> guildRoles = guild.getRolesByName(argument, false);
        if (guildRoles.isEmpty())
        {
            Helpers.send(channel, "Role " + argument
                    + " does not exist. Maybe try creating it first");
            return;
        }
        Role r = guildRoles.get(0);
        List<Role> oldRoles = member.getRoles();
        if (!oldRoles.contains(r))
        {
            Helpers.send(channel, argument + " is not assigned to "
                    + member.getEffectiveName());
            return;
        }
        guildController.removeSingleRoleFromMember(member, r).queue(a -> {
            Helpers.send(channel, "Removed role " + r.getName() + " from "
                    + member.getEffectiveName());
        }, b -> {
            Helpers.send(channel, "Failed to remove role " + r.getName()
                    + " from " + member.getEffectiveName());
        });
    }
    catch (Exception ex)
    {
        ex.printStackTrace();
    }
}
Over the past two weeks, word has begun to spread that Big Dairy is shelling out lawsuit dollars to American consumers of milk, yogurt, and other creamy products. But while 'free' money is often sweet news by itself, the story behind this relatively slim settlement has a distinctly more sour and, for milk cows, far deadlier flavor.
As NPR reported Friday, a class-action lawsuit against the dairy industry's largest umbrella group has resulted in a $52 million settlement that's available to select consumers of its products. Cooperatives Working Together (CWT), whose members contribute about 70% of all U.S. milk, responded to a suit by animal rights activists that alleged price-fixing with a no-fault settlement package of cash and store credit worth just shy of 0.15% of the industry's annual haul.
The class-action suit was brought by lawyers with the animal rights group Compassion Over Killing, which, through the help of gallant general counsel Cheryl Leahy, "spends a lot of time trying to get the dairy industry to be nicer to cows," says NPR. Several years back, Leahy caught wind of an industry practice, adopted during increasingly frequent hard times for the price per gallon, of launching herd buyout programs, in which arms of the CWT or the affiliated National Milk Producers Federation offered to buy out smaller farms' milk cows and process them en masse for meat.
And while the practice of cutting down livestock populations is an entirely legal and, on a smaller scale, quite regular and integral one for the industry, Leahy felt it still "[sounded] a lot like price-fixing." She told NPR, "It raised questions for us about antitrust laws. And the especially outrageous part to us was that this was all done by killing over 500,000 young cows."
So, rather than leak money in court 'til the cows come home addressing a long-troubled tradition, the industry group decided to settle. Jim Mulhern, President and Chief Executive Officer of the National Milk Producers Federation, said in a statement [PDF], “Our CWT leadership team, with support from the CWT membership, has worked diligently to put this legacy issue behind us. Settlement of this litigation is the most sensible and responsible course of action to maintain the current CWT Export Assistance program and allow us to focus on the future." He continued,
It is important to note that the court has found no antitrust violation and CWT makes no admission of wrongdoing in this settlement. The activity at issue in this litigation — the herd retirement program — has long since been terminated by CWT.
Unfortunately for some, rebates will only be available to claimants who purchased fresh milk products in 15 states after 2003, since which time milk production has indeed been climbing steadily. Between 2005 and 2015, annual production rose from around 177 billion pounds to 209 billion, while cash receipts from 2015 "marketings of milk" dropped nearly 28% from the previous year, according to the U.S. Department of Agriculture. As NPR previously pointed out, these numbers also reflect an industry that's seen the number of farms with milking cows drop from about 3.5 million to fewer than 58,000 in the past six decades.
Nevertheless, Big Dairy will be picking up the tab for eligible claimants who file by the end of January, i.e. in the next two days. So while there's still time for dairy fans to apply for at least some compensation, it couldn't hurt to shake a leg; a stampede of 3.5 million reportedly already have.
[h/t NPR]
#include "mainwindow.h"
#include <QApplication>
#include <QMenu>

#define DLL_PUBLIC __attribute__ ((visibility ("default")))

extern "C" DLL_PUBLIC void initializeWindow(Configuration configuration);
extern "C" DLL_PUBLIC int execute();
extern "C" DLL_PUBLIC void sendToBrowser(const char* text);
extern "C" DLL_PUBLIC QMenu* addMenu(const char* title);
extern "C" DLL_PUBLIC QMenu* addSubmenu(const char* title, QMenu* parent);
extern "C" DLL_PUBLIC int setMenuItem(QMenu* menu, MenuItem menuItem);
extern "C" DLL_PUBLIC void setMenuItemChecked(int cmdId, bool checked);
extern "C" DLL_PUBLIC void setMenuItemSelected(int cmdId, int groupCount, int id);
extern "C" DLL_PUBLIC void closeWindow();
extern "C" DLL_PUBLIC void showDevTools();
extern "C" DLL_PUBLIC void showFullscreen(bool fullscreen);

QApplication* app{nullptr};
MainWindow* window{nullptr};

void create_window(Configuration configuration) {
    window = new MainWindow(configuration);
    window->show();
}

QMenu* addMenu(const char* title) {
    return window->add_menu(title);
}

QMenu* addSubmenu(const char* title, QMenu* parent) {
    return window->add_menu(title, parent);
}

int setMenuItem(QMenu* menu, MenuItem menu_item) {
    return window->set_menu_item(menu, menu_item);
}

void setMenuItemChecked(int cmdId, bool checked) {
    window->setMenuItemChecked(cmdId, checked);
}

void setMenuItemSelected(int cmdId, int, int id) {
    window->setMenuItemChecked(cmdId + id, true);
}

void closeWindow() {
    window->exit();
}

void showDevTools() {
    auto webView = new WebEngineView;
    auto url = QUrl("http://localhost:8888");
    webView->page()->load(url);
    webView->show();
}

void showFullscreen(bool fullscreen) {
    window->showFullscreen(fullscreen);
}

QString create_debugging_arg(int port) {
    return "--remote-debugging-port=" + QString::number(port > 0 ? port : 8888);
}

void initializeWindow(Configuration configuration) {
    QCoreApplication::setAttribute(Qt::AA_EnableHighDpiScaling);
    // QApplication stores references to argc/argv, so they must outlive it:
    // keep them in static storage rather than on this function's stack.
    static int c = configuration.debugging_enabled ? 2 : 0;
    static char* args[2];
    static QByteArray arg;
    args[0] = (char*)"WebWindow";
    arg = create_debugging_arg(configuration.debugging_port).toUtf8();
    args[1] = arg.data();
    char** argv = configuration.debugging_enabled ? args : nullptr;
    app = new QApplication(c, argv);
    create_window(configuration);
}

int execute() {
    auto ret = app->exec();
    delete window;
    delete app;
    return ret;
}

void sendToBrowser(const char* text) {
    window->send_to_browser(text);
}
#include <opencv2/core/core.hpp>
#include "baseapi.h"
#include <leptonica/allheaders.h>
#include <iostream>
#include <string>
#include <regex>
#include <utility>
#include <fstream>
using namespace std;

// Remove every occurrence of `mark` from `s`.
void deleteAllMark(string &s, const string &mark) {
    size_t nSize = mark.size();
    while (1) {
        size_t pos = s.find(mark);
        if (pos == string::npos) {
            return;
        }
        s.erase(pos, nSize);
    }
}

// Extract the text following "企业名称:" (company name).
string regex_matchEN(const string &str, const string &path) {
    smatch match;
    regex re("企业名称:.*");
    if (regex_search(str, match, re)) {
        return match[0].str().substr(13);
    }
    return "匹配企业名称错误 :" + path;  // failed to match the company name
}

// Extract the text following "法定代表人:" (legal representative).
string regex_matchMN(const string &str, const string &path) {
    smatch match;
    regex re("法定代表人:.*");
    if (regex_search(str, match, re)) {
        return match[0].str().substr(16);
    }
    return "";
}

int getCh(string path) {
    tesseract::TessBaseAPI *api = new tesseract::TessBaseAPI();
    if (api->Init(NULL, "chi_sim")) {
        fprintf(stderr, "Could not initialize tesseract.\n");
        exit(1);
    }
    string outText;
    extern ofstream ofile;
    const char *cpath = path.data();
    Pix *image = pixRead(cpath);
    if (!image) {
        ofile << "非图片文件" << endl;  // not an image file
        api->End();
        delete api;
        return 0;
    }
    api->SetImage(image);
    char *raw = api->GetUTF8Text();
    outText = raw ? raw : "";
    delete[] raw;  // GetUTF8Text allocates the buffer; the caller must free it
    deleteAllMark(outText, " ");
    deleteAllMark(outText, ";");
    api->End();
    delete api;
    pixDestroy(&image);
    ofile << regex_matchEN(outText, path) << "," << regex_matchMN(outText, path) << endl;
    return 1;
}
An Analysis of the Spatial Distribution of Soil Erosion in Shitai County
The spatial distribution of soil erosion in Shitai County is studied based on the universal soil loss equation and GIS spatial analysis, and the relationships between soil erosion and terrain, soil types, and land uses are investigated in this paper. The results show that the total amount of soil erosion was 2850069.39 t/a in 2007. Moderate erosion is predominant, occupying 34.16% of the region, and the spatial distribution of soil erosion shows a clear block-like pattern. Among the elevation zones of the study area, soil erosion is strongest between 200 m and 500 m. By slope, erosion is strongest on 15° to 30° inclines and weakest above 35°. Erosion is strongest in the northeast of the county and intermediate in the southeast. Among land use types, erosion is strongest in forestland soil, followed by grassland soil. Red soil and limestone soil show the most prominent erosion, while erosion in stony soil is the weakest.
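The universal soil loss equation referenced above is conventionally written as follows (the factor definitions are the standard USLE ones and are assumed here, since the abstract does not state them):

```latex
A = R \cdot K \cdot L \cdot S \cdot C \cdot P
```

where $A$ is the mean annual soil loss, $R$ the rainfall erosivity factor, $K$ the soil erodibility factor, $L$ and $S$ the slope length and steepness factors, $C$ the cover-management factor, and $P$ the support practice factor.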
import java.util.Scanner;
import java.util.Arrays;
import java.lang.Math;

public class CF1003DD3 {
    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        int n = in.nextInt();
        int q = in.nextInt();
        // coins[i] counts how many coins of value 2^i we have
        int[] coins = new int[33];
        for (int i = 0; i < n; i++) {
            int input = in.nextInt();
            coins[log2OfPower2(input)]++;
        }
        int[] qs = new int[q];
        for (int i = 0; i < q; i++) {
            qs[i] = in.nextInt();
        }
        for (int i = 0; i < q; i++) {
            solve(qs[i], coins.clone());
        }
    }

    // Greedily spend the largest coins first; each query gets its own copy
    // of the coin counts.
    private static void solve(int b, int[] coins) {
        int res = 0;
        for (int i = 30; i >= 0; i--) {
            int coin = TwoPowerI(i);
            while (coins[i] > 0 && b >= coin) {
                int needed = b / coin;
                if (needed > coins[i]) needed = coins[i];
                b -= coin * needed;
                coins[i] -= needed;
                res += needed;
            }
        }
        if (b == 0) System.out.println(res);
        else System.out.println(-1);
    }

    public static int log2OfPower2(int a) {
        int res = 0;
        while ((a & 1) == 0 && a != 0) {
            res++;
            a >>= 1;
        }
        return res;
    }

    public static int TwoPowerI(int i) {
        return 1 << i;
    }
}
/// Returns the log level for this error; anything security related
/// triggers an Alert.
fn error_level(&self) -> Level {
    match self {
        MessageError::InvalidAddress(_) => Level::Error,
        MessageError::UnknownSender(_) => Level::Error,
        MessageError::ForbiddenSender(_) => Level::Alert,
        MessageError::UnknownRecipient(_) => Level::Error,
        // trigger an Alert because the sender did not send payment
        MessageError::SenderPaymentRequired(_) => Level::Alert,
        MessageError::DecodingError(_) => Level::Error,
        // Mark this as critical because encoding should never fail besides IO errors
        MessageError::EncodingError(_) => Level::Critical,
        MessageError::InvalidSignature(_) => Level::Alert,
        MessageError::InvalidSessionIdLength { .. } => Level::Alert,
        MessageError::InvalidDigestLength { .. } => Level::Alert,
        MessageError::ChecksumFailed(_) => Level::Alert,
        MessageError::DecryptionFailed(_) => Level::Alert,
        MessageError::InvalidSessionId { .. } => Level::Error,
        MessageError::MessageDataDeserializationFailed(_, _) => Level::Error,
        MessageError::EncodedMessageSerializationFailed(_, _) => Level::Error,
    }
}
On power control in cellular communication using ANFIS
Transmitter power is an important resource in cellular mobile radio systems. Effective power control can increase system capacity and quality of service. A commonly used measure of the quality of service is the Carrier to Interference Ratio (CIR) at the receiver. The main idea of power control schemes is to maximize the minimum CIR in each channel of the system. In this paper, a new adaptive neuro-fuzzy distributed power control algorithm, which can be used to maximize the minimum CIR among all co-channel users, is introduced. Simulation results are compared with a classical method and show that the intelligent control system performs better than other existing methods.
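As an illustrative sketch of the kind of classical distributed baseline such schemes are compared against (this is the well-known Foschini-Miljanic-style iteration, not code from the paper; the gain matrix, noise level, and CIR target below are invented example values), each transmitter repeatedly scales its power by the ratio of the target CIR to its currently measured CIR:

```python
import numpy as np

def cir(p, G, noise):
    """Carrier-to-interference ratio of every link for power vector p."""
    signal = np.diag(G) * p                # received power of the wanted signal
    interference = G @ p - signal + noise  # co-channel interference plus noise
    return signal / interference

def power_control(G, noise, target, steps=200):
    """Distributed update: scale each link's power by target / measured CIR."""
    p = np.ones(G.shape[0])                # initial transmit powers
    for _ in range(steps):
        p = p * target / cir(p, G, noise)
    return p

# Invented example: 3 co-channel links with modest cross gains
G = np.array([[1.0, 0.1, 0.2],
              [0.1, 1.0, 0.1],
              [0.2, 0.2, 1.0]])
noise = 0.01
p = power_control(G, noise, target=2.0)
# after convergence, every link's CIR sits at (approximately) the target
```

When the target CIR is feasible for the given gain matrix, this fixed-point iteration converges geometrically, which is why it is the usual benchmark for learned or fuzzy controllers.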
/**
 * Test method for adding a brand new group to the system. It verifies that,
 * after adding the group, the browser is redirected to the group management
 * page, that the group now exists in the groups list, and that the number of
 * existing groups was incremented by one.
 */
@Test
public void testAddGroup() {
    logger.info("Testing add a brand new group");
    try {
        driver.get(UiTestUtilities.ADD_GROUPS_URL);
        WebElement inputGroupname = driver.findElement(By.id("inputGroupname"));
        WebElement btnEditUsers = driver.findElement(By.id("showUsersListBtn"));
        WebElement submitGroupFormBtn = driver.findElement(By.id("submitGroupFormBtn"));
        String newGroupname = "webdriver_group" + System.currentTimeMillis();
        inputGroupname.sendKeys(newGroupname);
        btnEditUsers.click();
        List<WebElement> cbUsersList = driver.findElements(By.name("idsList"));
        for (WebElement checkbox : cbUsersList) {
            checkbox.click();
        }
        submitGroupFormBtn.click();
        assertEquals(UiTestUtilities.GROUPS_URL, driver.getCurrentUrl());
        WebElement divAlertSuccess = driver.findElement(By.className("alert-success"));
        assertTrue(divAlertSuccess.isDisplayed());
        assertTrue(divAlertSuccess.getText().contains(newGroupname));
    } catch (Exception e) {
        logger.error("Could not run the test properly: {}", e.getMessage());
        // fail explicitly: swallowing the exception would let a broken run pass
        fail("Test aborted by exception: " + e.getMessage());
    }
}
#!/usr/bin/env python3

class ctruncate:
    ''' ctruncate wrapper '''
    def __init__(self, mtzin, mtzout, showcommand=False, log=None, logview=None):
        # Generate timestamp
        import datetime
        timestamp = datetime.datetime.now().strftime("%y%m%d-%H%M%S")
        # Assign general inputs to class variables
        self.showcommand = showcommand
        self.mtzin = mtzin
        self.mtzout = mtzout
        self.log = log
        self.logview = logview
        if self.mtzout is None: self.mtzout = f"{timestamp}-ctruncate.mtz"
        if self.log is None: self.log = f"{timestamp}-ctruncate.log"

    def run(self):
        ''' Run it! '''
        import subprocess
        import sys
        # Set up the command options (raw string so the \( \) escapes survive)
        cmd = ("ctruncate "
               "-mtzin " + self.mtzin + " "
               "-mtzout " + self.mtzout + " "
               "-colin /*/*/[IMEAN,SIGIMEAN] "
               r"-colano /*/*/[I\(+\),SIGI\(+\),I\(-\),SIGI\(-\)] ")
        cmd += f"<< eof >>{self.log}\n"
        # End the command entry
        cmd += ("\n"
                "eof")
        # Print the final command to terminal
        if self.showcommand: print(cmd)
        # Start the log file
        log = open(self.log, "w")
        log.write("ctruncate run through python wrapper using command:\n\n")
        log.write("truncatewrap.py " + (" ".join(sys.argv[1:])) + "\n\n")
        log.close()
        # Show logview?
        if self.logview:
            subprocess.Popen(["logview", self.log],
                             stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        # Terminal colour codes
        underline = "\033[4m"
        clear = "\033[0m"
        green = "\033[32m"
        red = "\033[31m"
        purple = "\033[35m"
        print(f"\n{underline}ctruncate.py{clear} > {purple}{self.log}{clear}\n")
        print("Running... ", end='', flush=True)
        # Run the command
        try:
            s = subprocess.check_output(cmd, shell=True, stderr=subprocess.DEVNULL)
            self.result = s.decode('ascii')
        except subprocess.CalledProcessError:
            print(f"{red}Error!{clear}\n")
            sys.exit(1)
        print(f"{green}Complete!{clear}\n")


import argparse

parser = argparse.ArgumentParser(prog='ctruncate.py', usage='%(prog)s [options]')
optional = parser._action_groups.pop()
required = parser.add_argument_group('required arguments')
# note: type=str, not ascii -- ascii() would wrap the value in quotes
required.add_argument("--mtz", metavar="input.mtz",
                      type=str,
                      required=True,
                      help="MTZ input file")
optional.add_argument("--mtzout", metavar="output.mtz",
                      type=str,
                      help="MTZ output file (Default: YYMMDD-HHMMSS-ctruncate.mtz)")
optional.add_argument("--logout", metavar="output.log",
                      type=str,
                      help="Log filename (Default: YYMMDD-HHMMSS-ctruncate.log)")
optional.add_argument("--logview",
                      help="Run CCP4 logview while ctruncate is running.",
                      action="store_true")
parser.add_argument("--showcommand", help="Show ctruncate command", action="store_true")
parser._action_groups.append(optional)

# If running directly (not imported)
if __name__ == "__main__":
    # Get the command line arguments
    args = parser.parse_args()
    # Pass args to the main class
    program = ctruncate(mtzin=args.mtz, mtzout=args.mtzout, showcommand=args.showcommand,
                        log=args.logout, logview=args.logview)
    # Run the main class
    program.run()
def remove_pose(self):
    # remove every shape in the pose, then the eye
    for shape in self.shape_box:
        window.remove(shape)
    self.shape_box.clear()
    window.remove(self.eye)
// ReverseResolve resolves an address to an ENS name.
// It returns an error if no address is supplied or if the address
// has no reverse resolution.
func ReverseResolve(client *ethclient.Client, input *common.Address) (name string, err error) {
	if input == nil {
		err = errors.New("no address supplied")
		return
	}
	nameHash := NameHash(input.Hex()[2:] + ".addr.reverse")
	contract, err := ReverseResolver(client)
	if err != nil {
		return
	}
	name, err = contract.Name(nil, nameHash)
	if name == "" {
		err = errors.New("no resolution")
	}
	return
}
Epidemiology, Prevention, Diagnosis, and Management of Venous Thromboembolism in Gastrointestinal Cancers
Venous thromboembolism (VTE) is a leading cause of cardiovascular death and is associated with significant morbidity. Patients with cancer, and gastrointestinal (GI) malignancies in particular, are at increased risk of VTE, increased risk of bleeding with VTE treatment, and increased risk of recurrent VTE compared with the general population. VTE has been shown to be a leading cause of death among patients with cancer. This review will discuss special considerations in the prevention, diagnosis, and management of VTE in patients with GI malignancies. Given the increased risk of VTE observed in ambulatory patients with GI malignancies, multiple trials have examined and demonstrated the efficacy of prophylactic anticoagulation in high-risk patients with cancer undergoing chemotherapy, particularly in patients with gastric and pancreatic cancers. Patients with GI malignancies have also played a central role in discussions of the risks and benefits of the use of direct oral anticoagulants in patients with cancers, with first-line anticoagulation options expanding to include low-molecular-weight heparin, rivaroxaban, edoxaban, and apixaban. However, there continue to be concerns regarding an increased risk of bleeding with edoxaban and rivaroxaban in patients with GI malignancies. In addition to anticoagulation, individualized risk and benefit analysis should be undertaken for interventions including inferior vena cava (IVC) filter placement and catheter-directed thrombolysis in the setting of increased risk of bleeding and recurrent VTE for patients with GI malignancies. Several unique scenarios that may be seen with GI malignancies, including incidental VTE, splanchnic vein thrombosis, IVC thrombosis, and iliac vein compression, require individualized decision making.
export const c_menu_m_drilldown_c_menu_Transition: {
  "name": "--pf-c-menu--m-drilldown--c-menu--Transition",
  "value": "transform 250ms, visibility 250ms",
  "var": "var(--pf-c-menu--m-drilldown--c-menu--Transition)"
};
export default c_menu_m_drilldown_c_menu_Transition;
import torch as t
from tqdm import tqdm

def train_subject_specific(subject, epochs=500, batch_size=32, lr=0.001, silent=False,
                           plot=True, **kwargs):
    train_samples, train_labels = get_data(subject, training=True)
    test_samples, test_labels = get_data(subject, training=False)
    train_loader = as_data_loader(train_samples, train_labels, batch_size=batch_size)
    test_loader = as_data_loader(test_samples, test_labels, batch_size=batch_size)

    model = EEGNet(T=train_samples.shape[2], **kwargs)
    model.initialize_params()
    if t.cuda.is_available():
        model = model.cuda()

    loss_function = t.nn.CrossEntropyLoss()
    optimizer = t.optim.Adam(model.parameters(), lr=lr, eps=1e-7)
    scheduler = None
    print_summary(model, optimizer, loss_function, scheduler)

    with tqdm(desc=f"Subject {subject}", total=epochs, leave=False, disable=silent,
              unit='epoch', ascii=True) as pbar:
        model, metrics, _, history = _train_net(subject, model, train_loader, test_loader,
                                                loss_function, optimizer, scheduler=scheduler,
                                                epochs=epochs, early_stopping=False, plot=plot,
                                                pbar=pbar)
    if not silent:
        print(f"Subject {subject}: accuracy = {metrics[0, 0]}")
    return model, metrics, history
Meadowind was kind enough to make most of the brushes Photoshop friendly! Here's a link!
Meadow ended up deactivating. Have no fear though! You can still install the brushes using this method, or download the brushes individually here.
EDIT 3: Here's a link to the brush installation tutorial!
EDIT 2:
EDIT: Oops, I forgot to resize the preview image. Sorry it was super huge!
This was blowing up on my tumblr randomly, so I decided to update it and put it here on my deviantART! Added some better examples, tweaked things a bit, and added a nice how-to on installing the alphas into FireAlpaca! Everywhere you see blue/blueish shades will be replaced with your foreground color! Enjoy, guys!
If you like it, please think of donating via points on my profile, or via my PayPal. Thanks so much, guys! Please feel free to leave requests, questions, or comments below!
You MAY use it for public and private use.
You MAY use it in for-profit work so long as I am properly and CLEARLY CREDITED.
You MAY modify the brush as you see fit.
You MAY direct your friends to this brush so they can download it.
You MAY show me your art using this brush! I'd love to see it. ~
You MAY NOT repost these brushes anywhere.
You MAY NOT claim these brushes as your own.
You MAY NOT distribute the brushes in any way.
Brushes (c) me
import java.io.*;
import java.util.*;

public class problem339D {
    public static int pow(int base, int exp) {
        int ans = 1;
        for (int i = 1; i <= exp; i++) {
            ans *= base;
        }
        return ans;
    }

    public static void main(String[] args) throws IOException {
        BufferedReader x = new BufferedReader(new InputStreamReader(System.in));
        StringTokenizer st = new StringTokenizer(x.readLine());
        int n = Integer.parseInt(st.nextToken());
        int m = Integer.parseInt(st.nextToken());
        int[] tree = new int[pow(2, n + 1)];
        st = new StringTokenizer(x.readLine());
        for (int i = pow(2, n); i <= pow(2, n + 1) - 1; i++) {
            tree[i] = Integer.parseInt(st.nextToken());
        }
        // build the tree bottom-up, alternating OR and XOR per level
        for (int i = n - 1; i >= 0; i--) {
            for (int j = 0; j < pow(2, i); j++) {
                if ((n - 1 - i) % 2 == 0) {
                    tree[pow(2, i) + j] = tree[2 * (pow(2, i) + j)] | tree[2 * (pow(2, i) + j) + 1];
                } else {
                    tree[pow(2, i) + j] = tree[2 * (pow(2, i) + j)] ^ tree[2 * (pow(2, i) + j) + 1];
                }
            }
        }
        // for each update, recompute only the path from the changed leaf to the root
        for (int query = 0; query < m; query++) {
            st = new StringTokenizer(x.readLine());
            int index = Integer.parseInt(st.nextToken()) - 1;
            tree[pow(2, n) + index] = Integer.parseInt(st.nextToken());
            int cur = (pow(2, n) + index) / 2;
            for (int i = n - 1; i >= 0; i--) {
                if ((n - 1 - i) % 2 == 0) {
                    tree[cur] = tree[2 * cur] | tree[2 * cur + 1];
                } else {
                    tree[cur] = tree[2 * cur] ^ tree[2 * cur + 1];
                }
                cur /= 2;
            }
            System.out.println(tree[1]);
        }
    }
}
import argparse
import json
from datetime import datetime
from typing import Dict
import pathlib
parser = argparse.ArgumentParser()
parser.add_argument('--first_step_dir', type=str, default=None)
parser.add_argument('--target_dir', type=str, default=None)
args = parser.parse_args()
def get_timestamp() -> str:
return datetime.now().isoformat()
def process_stats_file(source_fp: pathlib.Path, hash_table: Dict[str, str]):
deduped_stats = []
deduped_hashes = []
with open(source_fp, mode="r") as in_file:
while True:
jstr = in_file.readline()
if not jstr:
break
record_stats = json.loads(jstr)
content_hash = record_stats["content_hash"]
if content_hash in hash_table:
# skip this record since it's a duplicate
continue
hash_table[content_hash] = content_hash
deduped_stats.append(record_stats)
deduped_hashes.append(content_hash)
return hash_table, deduped_stats, deduped_hashes
def main():
first_step_dir = pathlib.Path(args.first_step_dir)
deduped_stats_fp = pathlib.Path(args.target_dir) / "stats_deduped.jsonl"
print(f"[{get_timestamp()}][INFO] Deduplicating "
f"records from {first_step_dir}")
# get list of stats files
stats_filepaths = list(first_step_dir.glob("stats_*.jsonl"))
total_files_to_process = len(stats_filepaths)
deduped_stats_file = open(deduped_stats_fp, "w")
hash_set = {}
for file_num, fp in enumerate(stats_filepaths, start=1):
print(f"[{get_timestamp()}][INFO]"
f"[{file_num}/{total_files_to_process}] "
f"Processing {fp}")
hash_set, deduped_stats, deduped_hashes = process_stats_file(
fp, hash_set
)
# write out stats
for stats in deduped_stats:
deduped_stats_file.write(json.dumps(stats) + "\n")
# write out jsonl to hashes
out_fn = fp.name.replace("stats_", "hashes_")
with open(pathlib.Path(args.target_dir) / out_fn, "w") as f:
f.write(json.dumps({"hashes": deduped_hashes}) + "\n")
print(f"[{get_timestamp()}][INFO] Flushing ...")
deduped_stats_file.flush()
deduped_stats_file.close()
print(f"[{get_timestamp()}][INFO] "
f"Total number of unique records: {len(hash_set)}")
if __name__ == '__main__':
main()
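The core of process_stats_file — the first record per content hash wins, and the hash table carries across files — can be illustrated in isolation (toy records, not the script's real schema):

```python
def dedup_by_hash(records, seen=None):
    """Keep the first record per content_hash; later duplicates are skipped.

    `seen` plays the role of the script's hash_table / hash_set, threaded
    through successive calls so duplicates are also caught across files.
    """
    seen = {} if seen is None else seen
    kept = []
    for rec in records:
        h = rec["content_hash"]
        if h in seen:
            continue  # duplicate: already emitted by an earlier record/file
        seen[h] = h
        kept.append(rec)
    return kept, seen
```

Passing the returned `seen` into the next call reproduces the cross-file behaviour of the main loop.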
|
def top(self, category=0):
return Top(self.base_url, category) |
def map_scale(self):
return self.gxmap.get_map_scale() |
/*
Author: <NAME>, Coventry University
*/
#pragma once
#include <stdint.h>
// PI*2
constexpr double pi2 = 2.0 * 3.14159265358979323846;
/**
* @brief Get microseconds since the ESP started.
*/
int64_t getMicros();
/**
* @brief Get milliseconds since the ESP started.
*/
int64_t getMillis();
/**
* @brief Get the next integer between [0,max), wrapping around to 0 if >= max.
* @param s Source integer from which to get the next integer in the sequence [0,max)
* @param max Maximum value the next integer can be.
*/
uint32_t getNextInt(uint32_t s, uint32_t max);
/**
* @brief Get the previous integer between [0,max), wrapping around if < 0.
* @param s Source integer from which to get the prev integer in the sequence [0,max)
* @param max Maximum value the prev integer can be.
*/
uint32_t getPrevInt(uint32_t s, uint32_t max);
// Result of calculateTimeDelta().
extern double timeDelta;
/**
* @brief Calculate the time delta between this call and the previous call to calculateTimeDelta()
* NOTE: Do not call more than once per frame, and make sure to always call this from the same one task.
* The result is stored in the global variable "timeDelta".
*/
void calculateTimeDelta();
/**
* @brief Interpolate from a double to the other, storing the result in the first parameter.
* @param from Double to interpolate from.
* @param to Destination double to interpolate to.
*/
void smoothLerp(double &from, double to); |
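The wrap-around semantics documented for getNextInt/getPrevInt amount to modular increment and decrement over [0, max). A Python illustration of that contract (an assumption about the intended behaviour; the actual C++ definitions are not shown in this header):

```python
def get_next_int(s, max_value):
    """Next integer in [0, max_value), wrapping around to 0 at the top."""
    return (s + 1) % max_value

def get_prev_int(s, max_value):
    """Previous integer in [0, max_value), wrapping to max_value - 1 below 0."""
    return (s - 1) % max_value
```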
package com.app.chenyang.sweather.network;
import com.app.chenyang.sweather.entity.HeWeather;
import com.app.chenyang.sweather.global.MyConst;
import java.util.concurrent.TimeUnit;
import okhttp3.OkHttpClient;
import retrofit2.Retrofit;
import retrofit2.adapter.rxjava.RxJavaCallAdapterFactory;
import retrofit2.converter.gson.GsonConverterFactory;
import rx.Subscription;
import rx.android.schedulers.AndroidSchedulers;
import rx.functions.Action1;
import rx.schedulers.Schedulers;
/**
* Created by chenyang on 2017/3/14.
*/
public class WeatherRequest {
public static final String BASE_URL = "https://free-api.heweather.com/v5/";
private static final int CONNECT_TIMEOUT = 10;
private static final int WRITE_TIMEOUT = 10;
private static final int READ_TIMEOUT = 20;
private Retrofit retrofit;
private WeatherService service;
private WeatherRequest(){
OkHttpClient.Builder builder = new OkHttpClient.Builder();
builder.connectTimeout(CONNECT_TIMEOUT, TimeUnit.SECONDS)
.writeTimeout(WRITE_TIMEOUT,TimeUnit.SECONDS)
.readTimeout(READ_TIMEOUT,TimeUnit.SECONDS);
retrofit = new Retrofit.Builder()
.client(builder.build())
.addConverterFactory(GsonConverterFactory.create())
.addCallAdapterFactory(RxJavaCallAdapterFactory.create())
.baseUrl(BASE_URL)
.build();
service = retrofit.create(WeatherService.class);
}
private static class SingletonFactory{
private static final WeatherRequest INSTANCE = new WeatherRequest();
}
public static WeatherRequest getInstance(){
return SingletonFactory.INSTANCE;
}
public Subscription getWeather(Action1<HeWeather> a1, Action1<Throwable> a2, String city){
return service.getWeather(city, MyConst.KEY)
.subscribeOn(Schedulers.io())
.observeOn(AndroidSchedulers.mainThread())
.unsubscribeOn(Schedulers.io())
.subscribe(a1,a2);
}
public Subscription getWeatherNowThread(Action1<HeWeather> a1, Action1<Throwable> a2, String city){
return service.getWeather(city, MyConst.KEY)
.subscribe(a1,a2);
}
}
|
Hillary Clinton divulged Top Secret nuclear security intelligence to tens of millions of worldwide television audience viewers Wednesday night during the third presidential debate, according to high-ranking Department of Defense personnel.
Clinton, responding to opponent Donald Trump and a question posed by debate moderator Chris Wallace, boasted specific and “damaging” details about the United States’ nuclear response time to retaliate during a nuclear attack. Clinton said:
“But here’s the deal. The bottom line on nuclear weapons is that when the president gives the order, it must be followed. There’s about four minutes between the order being given and the people responsible for launching nuclear weapons to do so.” –Hillary Clinton, National TV Appearance
“Secretary Clinton proved tonight she is unfit to be commander-in-chief,” a top-ranking DOD intelligence source said. “What she did compromises our national security. She is cavalier and reckless and in my opinion should be detained and questioned so we can unravel why she did what she did.”
According to Pentagon sources, the information Clinton disseminated publicly is Top Secret intelligence governed under the U.S. Special Access Program (SAP) which dictates safeguards and protocols for accessing and discussing highly classified and Top Secret intelligence. The specific details of the country’s nuclear response time discussed by Clinton, sources said, are only known by a handful of individuals outside top military brass, including the following “need-to-know” (NTK) officials:
President
Vice President
Secretary of State
Secretary of Defense
Secretary of Homeland Security
Attorney General
Director of National Intelligence
CIA Director
Deputy Secretary of State
Deputy Secretary of Defense
Special personnel designated solely by the President in writing
Sources said late Wednesday Clinton likely violated two different types of Dept. of Defense SAP protocols. Since nuclear response is part of the sensitive national plan for nuclear war operations, all of its schematics are covered under both “Intelligence SAPs” and “Operation and Support SAPs,” sources said. Both contain Top Secret information.
“Targeting options by ICBM (intercontinental ballistic missiles), air or sea, launch order, launch procedures and response are some of the most secretly guarded tenets of national security and nuclear war policy,” a Pentagon source said. “It’s truly incredible that (nuclear) response time as part of an ERO (Emergency Response Option) is now out there in the public domain to our adversaries.”
U.S. Defense sources said according to developed U.S. counterintelligence, military officials in China, North Korea, Syria, Russia, Iran and even actors like ISIS had no previous definitive intelligence to determine the U.S. nuclear response time, especially during an ERO, prior to Clinton’s admission Wednesday night. Sources reluctantly acknowledged her calculations were accurate.
“Any time frame calculated would have merely been an educated hypothesis, absent leaked documents and there have been no such breaches,” the DOD source said.
Clinton has come under fire time and time again for mishandling national security secrets via email, telephone and secure facsimile during and after her tenure as secretary of state. Her mishandling of classified and top secret intelligence sparked a year-long investigation by the FBI and various Congressional committees which continue to examine Clinton’s lackluster security controls with and attitude toward protecting some of the nation’s most sensitive data which she stored on an unprotected home server in the basement of her Chappaqua, New York home during her post at Foggy Bottom.
Clinton, just this week, unleashed a series of campaign ads painting Trump as a loose cannon who should never be in control of nuclear secrets and weapons. Clinton trumpeted the seemingly ill-timed ads Wednesday night at the debate prior to her rant divulging nuclear secrets herself.
See Also: Hillary Clinton Supplied Cash, Weapons, Tanks, Training to Al-Qaeda to Kill Gaddafi & Weaponize “ISIS” in Syria
-30- |
package chapter.android.aweme.ss.com.homework;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.v7.widget.RecyclerView;
import android.util.Log;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ImageView;
import android.widget.TextView;
import java.util.List;
import chapter.android.aweme.ss.com.homework.model.Message;
import chapter.android.aweme.ss.com.homework.widget.CircleImageView;
public class MyAdapter extends RecyclerView.Adapter<MyAdapter.MyViewHolder> {
private static final String TAG="MyAdapter";
private int mNumberItems;
private static int viewHolderCount;
private List<Message> mMessages;
private final ListItemClickListener mOnClickListener;
public MyAdapter(List<Message> messages, ListItemClickListener listener) {
mNumberItems=messages.size();
mMessages=messages;
viewHolderCount=0;
mOnClickListener=listener;
Log.d(TAG,"Created");
}
@NonNull
@Override
public MyViewHolder onCreateViewHolder(@NonNull ViewGroup viewGroup, int i) {
Context context=viewGroup.getContext();
int layoutIdForListItem = R.layout.im_list_item;
LayoutInflater inflater = LayoutInflater.from(context);
boolean shouldAttachToParentImmediately = false;
View view = inflater.inflate(layoutIdForListItem, viewGroup, shouldAttachToParentImmediately);
MyViewHolder viewHolder=new MyViewHolder(view);
Log.d(TAG, "onCreateViewHolder: number of ViewHolders created: " + viewHolderCount);
viewHolderCount++;
return viewHolder;
}
@Override
public void onBindViewHolder(@NonNull MyViewHolder myViewHolder, int position) {
Log.d(TAG, "onBindViewHolder: #" + position);
myViewHolder.bind(position);
}
@Override
public int getItemCount() {
return mNumberItems;
}
public class MyViewHolder extends RecyclerView.ViewHolder implements View.OnClickListener{
private final CircleImageView iv_avatar;
private final ImageView robot_notice;
private final TextView tv_title;
private final TextView tv_description;
private final TextView tv_time;
public MyViewHolder(View view){
super(view);
iv_avatar=view.findViewById(R.id.iv_avatar);
robot_notice=view.findViewById(R.id.robot_notice);
tv_title=view.findViewById(R.id.tv_title);
tv_description=view.findViewById(R.id.tv_description);
tv_time=view.findViewById(R.id.tv_time);
            view.setOnClickListener(this);
        }
public void bind(int position){
            switch (mMessages.get(position).getIcon()){
                case "TYPE_ROBOT":
                    iv_avatar.setImageResource(R.drawable.session_robot);
                    break;
                case "TYPE_GAME":
                    iv_avatar.setImageResource(R.drawable.icon_micro_game_comment);
                    break;
                case "TYPE_SYSTEM":
                    iv_avatar.setImageResource(R.drawable.session_system_notice);
                    break;
                case "TYPE_STRANGER":
                    iv_avatar.setImageResource(R.drawable.session_stranger);
                    break;
                case "TYPE_USER":
                    iv_avatar.setImageResource(R.drawable.icon_girl);
                    break;
                default:
                    iv_avatar.setImageResource(R.drawable.icon_blacksend_touch);
                    break;
            }
if(mMessages.get(position).isOfficial())
robot_notice.setVisibility(View.VISIBLE);
else
robot_notice.setVisibility(View.INVISIBLE);
tv_title.setText(mMessages.get(position).getTitle());
tv_description.setText(mMessages.get(position).getDescription());
tv_time.setText(mMessages.get(position).getTime());
}
@Override
public void onClick(View v) {
int clickedPosition = getAdapterPosition();
if (mOnClickListener != null) {
mOnClickListener.onListItemClick(clickedPosition);
}
}
}
public interface ListItemClickListener {
void onListItemClick(int clickedItemIndex);
}
}
|
Tolerability of four‐drug antiretroviral combination therapy in primary HIV‐1 infection
Objectives Rapid initiation of antiretroviral therapy (ART) is important for individuals with high baseline viral loads, such as in primary HIV‐1 infection (PHI). Four‐drug regimens are sometimes considered; however, data are lacking on tolerability. We aimed to evaluate the tolerability of four‐drug regimens used in the Research in Viral Eradication of HIV‐1 Reservoirs (RIVER) study. Methods At enrolment, ART‐naïve adult participants or those newly commenced on ART were initiated or intensified to four‐drug regimens within 4 weeks of PHI. Rapid start was defined as pre‐confirmation or ≤ 7 days of confirmed diagnosis. Primary and secondary outcomes were patient‐reported adherence measured by 7‐day recall and regimen switches between enrolment and randomization, respectively. Results Overall, 54 men were included: 72.2% were of white ethnicity, with a median age of 32 years old, 42.6% had a viral load of ≥ 100 000 HIV‐1 RNA copies/mL, and in 92.6% sex with men was the mode of acquisition of HIV‐1. Twenty (37%) started a four‐drug regimen and 34 (63%) were intensified. Rapid ART initiation occurred in 28%, 100% started in ≤ 4 weeks. By weeks 4, 12, and 24, 37.0%, 69.0%, and 94.0% were undetectable (viral load < 50 copies/mL), respectively. Adherence rates of 100% at weeks 4, 12, 22 and 24 were reported in 88.9%, 87.0%, 82.4% and 94.1% of participants, respectively. Five individuals switched to three drugs, four changed their regimen constituents, and two switched post‐randomization. Conclusions Overall, four‐drug regimens were well tolerated and had high levels of adherence. Whilst their benefit over three‐drug regimens is lacking, our findings should provide reassurance if a temporarily intensified regimen is clinically indicated to help facilitate treatment.
Introduction
Following the findings of the START and TEMPRANO trials in 2015, HIV-1 treatment guidelines are unified in their recommendation to initiate antiretroviral therapy (ART) irrespective of CD4 count. There is also a consensus that the rapid initiation of ART (ideally ≤ 7 days after confirmed HIV-1 diagnosis) is feasible, can achieve faster virological suppression, minimizes the HIV-1 reservoir and subsequent immune recovery, and improves uptake of ART and retention of care. Rapid ART initiation is particularly important for individuals with primary HIV-1 infection (PHI) to mitigate the elevated risk of onward transmission due to very high HIV-1 viral loads.
Current guidelines for rapidly starting ART in PHI recommend triple ART regimens comprising a tenofovir-based, dual nucleos(t)ide reverse transcriptase inhibitor (NRTI) backbone combined with integrase strand transfer inhibitors (INSTIs), or a boosted protease inhibitor (PI) such as darunavir/ritonavir (DRV/r). Some physicians elect to start all four components, particularly with high viral loads in PHI. The rationale for four drugs is to access the benefits of faster viral suppression seen with INSTIs combined with the higher genetic barrier to resistance associated with PIs. This also safely negates the need to await genotype resistance and HLA-B*5701 results.
Despite the recommendations for rapid start of ART in PHI, there is a paucity of data on the tolerability and adherence in this setting, a time when patients are dealing with the burden of a new diagnosis of HIV-1, potentially compounded by symptoms of seroconversion. As such, our aim was to review the tolerability of four-drug regimens in the Research in Viral Eradication of HIV-1 Reservoirs (RIVER) trial (NCT02336074).
Methods
The RIVER trial methodology is described in the primary manuscript. RIVER was conducted in the UK during 2016-2018. At enrolment, ART-naïve adult participants or those newly commenced on ART were initiated or intensified to four-drug regimens within 4 weeks of PHI. They were randomized 6 months later to adjuvant ChAdV63.HIVconsv-prime and vorinostat or to continue ART alone. The post-randomization intervention period lasted 18 weeks. This analysis only includes participants who received a four-drug ART regimen.
Participants were recommended a four-drug ART regimen as per the RIVER protocol. This included daily DRV/r 800/100 mg, as per the guidance for ART initiation prior to genotype availability at the time of recruitment, raltegravir 400 mg twice a day to facilitate rapid viral load suppression, and a dual, tenofovir-based NRTI backbone. For those on a three-drug combination pre-enrolment, intensification was proposed with raltegravir if on a PI-based regimen or a boosted PI if on a raltegravir-based regimen.
Our primary outcome was patient-reported adherence measured by 7-day recall at weeks 0, 4, 12, 22 and 24 (randomization). The 7-day recall tool is widely used in trials conducted by the AIDS Clinical Trials Group and the International Network for Strategic Initiatives in Global HIV Trials. Our secondary outcome was the number of regimen switches between enrolment and randomization.
Only five participants changed to a three-drug regimen, despite a pill burden of four to six pills/day. While only 15 participants (28%) had a truly 'rapid' ART start, everyone had commenced ART within four weeks of confirmed PHI diagnosis. Importantly, the design of the trial allowed the inclusion of those who had already started ART, and thus any delays should not have been attributed to screening/enrolment procedures for the trial. The RIVER trial patient population was small, male and highly motivated, limiting the generalizability to other cohorts, although despite this, our data demonstrate that four-drug regimens are feasible in the PHI setting, including for rapid ART initiation.
While three-drug regimens using a dolutegravir, bictegravir or darunavir/r third agent are recommended in PHI, barriers to four-drug regimens (e.g. pill burden) are diminished with modern fixed-dose combinations. Similarly, while the prevalence of transmitted INSTI resistance in the UK is currently low, this may rise with greater use. However, it is acknowledged that the use of four-drug combinations has become less common and is partially driven by physician choice dependent on the clinical circumstance (e.g. concerns about drug-resistant HIV acquisition). It is also noted that studies comparing standard three-drug regimens with five-drug combinations that included raltegravir and maraviroc found no difference in viral suppression or HIV reservoir size, although sample sizes were small. Viable four-drug regimens offer flexibility by expanding treatment options for people newly diagnosed with HIV, particularly in PHI, wishing to start treatment promptly. In these scenarios, clinical teams may also be reassured that a robust regimen is being utilized pending initial investigations; the regimen can then easily be rationalized when results are available or when viral suppression has been achieved.
package net.andreinc.carsandpolice.query;
import net.andreinc.carsandpolice.query.utils.KsqlDbStreamingQuery;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Component;
import javax.annotation.PostConstruct;
import java.util.Map;
@Component
public class CarsBlockedStreamingQuery {
@Autowired
private SimpMessagingTemplate simpMessagingTemplate;
@Autowired
private KsqlDbStreamingQuery ksqlDbStreamingQuery;
@PostConstruct
public void carsBlocked() {
ksqlDbStreamingQuery.query("select * from carsBlocked emit changes;", (row) -> {
Map<String, Object> objectMap = row.asObject().getMap();
simpMessagingTemplate.convertAndSend("/topic/carsblocked", objectMap);
});
}
}
|
/**
* Combines the results of a join between ExpandIntermediateResults and an edge embedding by growing
* the intermediate result.
* Before growing it is checked whether distinctiveness conditions would still apply.
*/
@FunctionAnnotation.ForwardedFieldsFirst("f0")
@FunctionAnnotation.ForwardedFieldsSecond("f2")
public class MergeExpandEmbeddings
extends RichFlatJoinFunction<ExpandEmbedding, EdgeWithTiePoint, ExpandEmbedding> {
/**
* Holds the index of all vertex columns that should be distinct
*/
private final List<Integer> distinctVertices;
/**
* Holds the index of all edge columns that should be distinct
*/
private final List<Integer> distinctEdges;
/**
* Specifies a base column that should be equal to the paths end node
*/
private final int closingColumn;
/**
* Create a new Combine Expand Embeddings Operator
* @param distinctVertices distinct vertex columns
* @param distinctEdges distinct edge columns
* @param closingColumn base column that should be equal to a paths end node
*/
public MergeExpandEmbeddings(List<Integer> distinctVertices,
List<Integer> distinctEdges, int closingColumn) {
this.distinctVertices = distinctVertices;
this.distinctEdges = distinctEdges;
this.closingColumn = closingColumn;
}
@Override
public void join(ExpandEmbedding base, EdgeWithTiePoint edge,
Collector<ExpandEmbedding> out) throws Exception {
if (checkDistinctiveness(base, edge)) {
out.collect(base.grow(edge));
}
}
/**
* Checks the distinctiveness criteria for the expansion
* @param prev previous intermediate result
* @param edge edge along which we expand
* @return true if distinct criteria apply for the expansion
*/
private boolean checkDistinctiveness(ExpandEmbedding prev, EdgeWithTiePoint edge) {
if (distinctVertices.isEmpty() && distinctEdges.isEmpty()) {
return true;
}
// the new candidate is invalid under vertex isomorphism
if (edge.getSource().equals(edge.getTarget()) &&
!distinctVertices.isEmpty()) {
return false;
}
// check if there are any clashes in the path
for (GradoopId ref : prev.getPath()) {
      if (((ref.equals(edge.getSource()) || ref.equals(edge.getTarget())) &&
        !distinctVertices.isEmpty()) || (ref.equals(edge.getId()) && !distinctEdges.isEmpty())) {
return false;
}
}
List<GradoopId> ref;
// check for clashes with distinct vertices in the base
for (int i : distinctVertices) {
ref = prev.getBase().getIdAsList(i);
if ((ref.contains(edge.getTarget()) && i != closingColumn) ||
ref.contains(edge.getSource())) {
return false;
}
}
// check for clashes with distinct edges in the base
ref = prev.getBase().getIdsAsList(distinctEdges);
return !ref.contains(edge.getId());
}
} |
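Stripped of the Flink/Gradoop types, the distinctiveness rule above is: under vertex isomorphism the new edge may not form a self-loop or revisit a vertex already on the path, and under edge isomorphism its id may not be reused. A simplified Python sketch (hypothetical helper; the base-embedding column checks are omitted):

```python
def can_grow(path_vertices, path_edges, edge_id, src, tgt,
             distinct_vertices=True, distinct_edges=True):
    """Return True if growing the path along (src)-[edge_id]->(tgt)
    keeps it distinct under the requested morphism semantics."""
    if not distinct_vertices and not distinct_edges:
        return True                        # homomorphism: anything goes
    if distinct_vertices and src == tgt:
        return False                       # self-loop clashes under vertex isomorphism
    if distinct_vertices and (src in path_vertices or tgt in path_vertices):
        return False                       # vertex already on the path
    if distinct_edges and edge_id in path_edges:
        return False                       # edge id already used
    return True
```

Here path_vertices holds the vertices already visited (excluding the tie-point the new edge attaches to), playing the role of prev.getPath() above.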
// src/unwinder/sentry_unwinder_libunwindstack.cpp
extern "C" {
#include "sentry_boot.h"
#include "sentry_core.h"
}
#include <memory>
#include <ucontext.h>
#include <unwindstack/Maps.h>
#include <unwindstack/Memory.h>
#include <unwindstack/Regs.h>
#include <unwindstack/RegsGetLocal.h>
#include <unwindstack/Unwinder.h>
extern "C" {
size_t
sentry__unwind_stack_libunwindstack(
void *addr, const sentry_ucontext_t *uctx, void **ptrs, size_t max_frames)
{
std::unique_ptr<unwindstack::Regs> regs;
if (uctx) {
regs = std::unique_ptr<unwindstack::Regs>(
unwindstack::Regs::CreateFromUcontext(
unwindstack::Regs::CurrentArch(), uctx->user_context));
} else if (!addr) {
regs = std::unique_ptr<unwindstack::Regs>(
unwindstack::Regs::CreateFromLocal());
unwindstack::RegsGetLocal(regs.get());
} else {
return 0;
}
unwindstack::LocalMaps maps;
if (!maps.Parse()) {
SENTRY_WARN("unwinder failed to parse process maps\n");
ptrs[0] = (void *)regs->pc();
return 1;
}
const std::shared_ptr<unwindstack::Memory> process_memory
= unwindstack::Memory::CreateProcessMemoryCached(getpid());
unwindstack::Unwinder unwinder(
max_frames, &maps, regs.get(), process_memory);
unwinder.Unwind();
std::vector<unwindstack::FrameData> &frames = unwinder.frames();
size_t rv = 0;
for (unwindstack::FrameData &frame : frames) {
ptrs[rv++] = (void *)frame.pc;
}
return rv;
}
}
|
# -*- coding: latin-1 -*-
from emoji import get_emoji_regexp
from re import sub
# Defining function to remove emojis
def remove_emoji(text):
return get_emoji_regexp().sub(u'', text)
def clean_data(input):
# Removing emojis
output = input.map( lambda my_text: remove_emoji(my_text) )
# Encoding into ascii (comment out this line if using other languages than English)
#output = output.map( lambda my_text: my_text.encode("ascii", errors="ignore").decode() )
# Removing strings starting by $, #, @ or http
output = output.map( lambda my_text: sub(pattern=r'http(\S+)(\s+)' ,repl=" " ,string=my_text) )
output = output.map( lambda my_text: sub(pattern=r'http(\S+)$' ,repl="" ,string=my_text) )
output = output.map( lambda my_text: sub(pattern=r'\@(\S+)(\s+)' ,repl=" " ,string=my_text) )
output = output.map( lambda my_text: sub(pattern=r'\@(\S+)$' ,repl="" ,string=my_text) )
output = output.map( lambda my_text: sub(pattern=r'\#(\S+)(\s+)' ,repl=" " ,string=my_text) )
output = output.map( lambda my_text: sub(pattern=r'\#(\S+)$' ,repl="" ,string=my_text) )
output = output.map( lambda my_text: sub(pattern=r'\$(\S+)(\s+)' ,repl=" " ,string=my_text) )
output = output.map( lambda my_text: sub(pattern=r'\$(\S+)$' ,repl="" ,string=my_text) )
# Removing retweets
output = output.map( lambda my_text: sub(pattern=r'^RT',repl="",string=my_text) )
# Removing space-like symbols
output = output.map( lambda my_text: my_text
.replace( "(" ,' ')
.replace( ")" ,' ')
.replace( "[" ,' ')
.replace( "]" ,' ')
.replace( "{" ,' ')
.replace( "}" ,' ')
.replace( "\\" ,' ')
.replace( "/" ,' ')
.replace( "#" ," ")
.replace( "@" ," ")
.replace( "$" ," ")
.replace( "?" ," ")
.replace( "!" ," ")
.replace( ":" ,' ')
.replace( ";" ,' ')
.replace( "." ,' ')
.replace( "," ," ")
.replace( '"' ,' ')
.replace( "'" ,' ')
)
# Removing undesired spaces
output = output.map( lambda my_text: sub(pattern=r'\s+', repl=" ", string=my_text).strip() )
# Converting to lowercase
output = output.map( lambda my_text: my_text.lower() )
# Removing undesired characters (i.e. all non-alphabetic characters)
#output = output.map( lambda my_text: sub(pattern=r'[^a-z]',repl="",string=my_text) )
# Uncomment this line to print first 20 results
#result.map( lambda my_text: "gcg "+my_text+" gcg" ).pprint(20)
return output
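The regex portion of the pipeline can be exercised on a single string outside the Spark map context (the space-like-symbol replacements and other steps are omitted; the patterns are copied from the function above):

```python
import re

def clean_text(my_text):
    """Strip URLs, @mentions, #hashtags, $cashtags and a leading RT,
    then collapse whitespace and lowercase -- the regex subset of clean_data."""
    for pat in (r'http(\S+)(\s+)', r'\@(\S+)(\s+)', r'\#(\S+)(\s+)', r'\$(\S+)(\s+)'):
        my_text = re.sub(pat, ' ', my_text)          # token followed by whitespace
    for pat in (r'http(\S+)$', r'\@(\S+)$', r'\#(\S+)$', r'\$(\S+)$'):
        my_text = re.sub(pat, '', my_text)           # token at end of string
    my_text = re.sub(r'^RT', '', my_text)            # retweet marker
    return re.sub(r'\s+', ' ', my_text).strip().lower()
```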
|
The bulk of the book’s second half spends time slowly setting the table of the world immediately surrounding Nazarick. I liked that we immediately got the sense of a living and breathing world with people going about their lives that existed independently of Momonga’s little fiefdom. We also get introduced to the political dynamics of this world, and although they weren’t anything too special or interesting I definitely wanted to see Momonga get himself involved in them for interest’s sake. And that he does after a village situated near Nazarick is attacked by bandits, giving Momonga the opportunity to gauge his powers relative to the residents of the world as well as gain information. This was quite heavy on the dialogue overall just as in the first half of the volume, but it felt like comparatively less happened in this space, making this segment feel like a little bit of a slog at times despite being interesting on the whole.
Naturally, Momonga’s expedition led to a couple of fierce battles, which were interesting because Momonga’s struggle to fit into this world clashed with his desire to demonstrate his overwhelming power to influence the world around him. However, once Momonga’s power relative to the rest of the world was demonstrated, the battles lost a lot of their tension. This caused this volume’s climactic battle to fall a little bit flat, because it felt quite dragged out in comparison to what actually happened in it; as a result, it felt as though the second half of this volume probably could have been trimmed down by quite a bit to avoid dragging along. It was still quite fascinating to watch Momonga do his plotting and slowly gain confidence, but the political dynamic explored didn’t feel quite as interesting as I thought it might have. However, this was still primarily about setting the table for Momonga’s future adventures, so I wasn’t overly bothered by this second half because I still felt that it did a good job portraying Momonga’s slow process of feeling out this new world.
As a side note, Yen Press did a fantastic job with this physical release by including all of the illustrations and character profiles in colour. The illustrations are absolutely stunning, and are definitely unique compared to the anime-inspired illustrations most other light novels have. The book is also printed on higher-quality paper than their usual light novel releases, giving this volume a premium feel.
Final Thoughts
Overlord Vol. 1 is a compelling look at the Momonga’s process of coping with being given virtually unlimited power in a richly developed world. I really enjoyed the way Yggdrasil was introduced through Momonga’s wistful recollections of the glory days of Ainz Ooal Gown, and this gave the story a personal feel that made me emotionally invested both in Momonga’s own story as well as that of his burgeoning kingdom. The characters are given interesting personalities, and I really liked seeing their interactions with Momonga for the way they cemented his unique position within the world. Although the second half of this volume drags a little bit at times, I liked the way that it introduced the broader world that Momonga will be inhabiting. This volume was all about setting the table for Momonga’s future adventures, and it definitely succeeds in characterizing him in a fascinating and impactful manner. |
/*---------------------------------------------------------------------------------------------
* Copyright (c) 2019 Bentley Systems, Incorporated. All rights reserved.
* Licensed under the MIT License. See LICENSE.md in the project root for license terms.
*--------------------------------------------------------------------------------------------*/
/** @module Inputs */
import * as React from "react";
import * as classnames from "classnames";
import Select, { SelectProps } from "./Select";
import InputStatus from "./InputStatus";
/** Properties for [[LabeledSelect]] components */
export interface LabeledSelectProps extends SelectProps {
label: string;
status?: InputStatus;
message?: string;
}
/** Dropdown wrapper that allows for additional styling and labelling */
export class LabeledSelect extends React.Component<LabeledSelectProps> {
public render(): JSX.Element {
return (
<label className={classnames(
"uicore-inputs-labeled-select",
{ disabled: this.props.disabled },
this.props.status,
this.props.className,
)}>
<div className={"label"}>{this.props.label}</div>
<Select disabled={this.props.disabled} {...this.props} />
{this.props.message &&
<div className={"message"}>{this.props.message}</div>}
</label>
);
}
}
export default LabeledSelect;
|
package com.freetmp.mbg.merge.expression;
import com.freetmp.mbg.merge.AbstractMerger;
import com.github.javaparser.ast.expr.DoubleLiteralExpr;
/**
* Created by LiuPin on 2015/5/13.
*/
public class DoubleLiteralExprMerger extends AbstractMerger<DoubleLiteralExpr> {
@Override public DoubleLiteralExpr doMerge(DoubleLiteralExpr first, DoubleLiteralExpr second) {
DoubleLiteralExpr dle = new DoubleLiteralExpr();
dle.setValue(first.getValue());
return dle;
}
  @Override public boolean doIsEquals(DoubleLiteralExpr first, DoubleLiteralExpr second) {
    return first.getValue().equals(second.getValue());
  }
}
|
/****************************************************************************
Function
ES_PostToServiceLIFO
Parameters
uint8_t : Which service to post to (index into ServDescList)
ES_Event : The Event to be posted
Returns
boolean : False if the post function failed during execution
Description
Posts, using LIFO strategy, to one of the services' queues
Notes
used by the Defer/Recall event capability
Author
J. Edward Carryer, 11/02/13
****************************************************************************/
bool ES_PostToServiceLIFO( uint8_t WhichService, ES_Event TheEvent){
if ((WhichService < ARRAY_SIZE(EventQueues)) &&
(ES_EnQueueLIFO( EventQueues[WhichService].pMem, TheEvent) ==
true )){
Ready |= BitNum2SetMask[WhichService];
return true;
} else
return false;
} |
# -*- coding: utf-8 -*-
"""
This file contains the Qudi Interface file for control wavemeter hardware.
Qudi is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Qudi is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Qudi. If not, see <http://www.gnu.org/licenses/>.
Copyright (c) the Qudi Developers. See the COPYRIGHT.txt file at the
top-level directory of this distribution and at <https://github.com/Ulm-IQO/qudi/>
"""
from core.interface import abstract_interface_method
from core.meta import InterfaceMetaclass
class WavemeterInterface(metaclass=InterfaceMetaclass):
""" Define the controls for a wavemeter hardware.
    Note: This interface is very similar in features to the slow counter interface.
"""
@abstract_interface_method
def start_acqusition(self):
""" Method to start the wavemeter software.
@return (int): error code (0:OK, -1:error)
Also the actual threaded method for getting the current wavemeter
reading is started.
"""
pass
@abstract_interface_method
def stop_acqusition(self):
""" Stops the Wavemeter from measuring and kills the thread that queries the data.
@return (int): error code (0:OK, -1:error)
"""
pass
@abstract_interface_method
def get_current_wavelength(self, kind="air"):
""" This method returns the current wavelength.
@param (str) kind: can either be "air" or "vac" for the wavelength in air or vacuum, respectively.
@return (float): wavelength (or negative value for errors)
"""
pass
@abstract_interface_method
def get_current_wavelength2(self, kind="air"):
""" This method returns the current wavelength of the second input channel.
@param (str) kind: can either be "air" or "vac" for the wavelength in air or vacuum, respectively.
        @return (float): wavelength (or negative value for errors)
"""
pass
@abstract_interface_method
def get_timing(self):
""" Get the timing of the internal measurement thread.
@return (float): clock length in second
"""
pass
@abstract_interface_method
def set_timing(self, timing):
""" Set the timing of the internal measurement thread.
@param (float) timing: clock length in second
@return (int): error code (0:OK, -1:error)
"""
pass
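A hardware module would subclass this interface and implement every abstract method. The following standalone sketch illustrates the shape of such an implementation; it substitutes plain `abc` for Qudi's `InterfaceMetaclass` so it runs on its own, mirrors a subset of the method names above (keeping their original spelling), and returns made-up wavelength values:

```python
from abc import ABC, abstractmethod


class WavemeterInterface(ABC):
    """Stand-in for the Qudi interface above (subset of methods, abc-based)."""

    @abstractmethod
    def start_acqusition(self): ...

    @abstractmethod
    def stop_acqusition(self): ...

    @abstractmethod
    def get_current_wavelength(self, kind="air"): ...


class DummyWavemeter(WavemeterInterface):
    """Simulated wavemeter that reports a fixed wavelength while running."""

    def __init__(self):
        self._running = False

    def start_acqusition(self):
        self._running = True
        return 0  # 0: OK

    def stop_acqusition(self):
        self._running = False
        return 0  # 0: OK

    def get_current_wavelength(self, kind="air"):
        if not self._running:
            return -1.0  # negative value signals an error
        # arbitrary illustrative values (roughly HeNe), not real calibration
        return 632.8 if kind == "air" else 632.99


wm = DummyWavemeter()
wm.start_acqusition()
print(wm.get_current_wavelength())  # 632.8
```

Because the base class is abstract, forgetting to implement one of the mirrored methods raises a `TypeError` at instantiation time, which is the same safety net Qudi's metaclass provides.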
|
"""524. Longest Word in Dictionary through Deleting
https://leetcode.com/problems/longest-word-in-dictionary-through-deleting/
"""
import bisect
from typing import List
class Solution:
def find_longest_word(self, s: str, dictionary: List[str]) -> str:
mapper = [[] for _ in range(26)]
for i, c in enumerate(s):
mapper[ord(c) - 97].append(i)
dictionary.sort(key=lambda x: len(x), reverse=True)
ans = ''
for word in dictionary:
if len(word) < len(ans):
break
i, j = 0, len(word) - 1
left, right = -1, len(s)
while i < j:
pi = mapper[ord(word[i]) - 97]
pj = mapper[ord(word[j]) - 97]
if not pi or not pj:
break
pos = bisect.bisect_right(pi, left)
if pos == len(pi):
break
left = pi[pos]
pos = bisect.bisect_left(pj, right) - 1
if pos < 0:
break
right = pj[pos]
if left >= right:
break
i += 1
j -= 1
            if i >= j:
                if i == j:
                    # the pointers met on an unmatched middle character;
                    # verify it occurs strictly between left and right
                    pm = mapper[ord(word[i]) - 97]
                    pos = bisect.bisect_right(pm, left)
                    if pos == len(pm) or pm[pos] >= right:
                        continue
                if not ans or word < ans:
                    ans = word
return ans
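The bisect-based two-pointer scan above optimizes the plain greedy subsequence test. For comparison, a straightforward standalone version of that test (an illustrative sketch, not part of the original solution) produces the same answer on the classic example:

```python
def is_subsequence(word, s):
    # greedy check: consume s left-to-right, matching characters of word in order
    it = iter(s)
    return all(c in it for c in word)


def find_longest_word_naive(s, dictionary):
    best = ''
    for word in dictionary:
        if is_subsequence(word, s):
            # prefer longer words; break length ties lexicographically
            if len(word) > len(best) or (len(word) == len(best) and word < best):
                best = word
    return best


print(find_longest_word_naive("abpcplea", ["ale", "apple", "monkey", "plea"]))  # apple
```

The naive version re-scans `s` for every word; the bisect version instead jumps directly to the next occurrence of each character via the precomputed position lists.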
|
256×256, 100kfps, 61% Fill-factor time-resolved SPAD image sensor for microscopy applications
A 256×256 Single Photon Avalanche Diode (SPAD) image sensor operating at 100kfps with fill factor of 61% and pixel pitch of 16μm is reported. An all-NMOS 7T pixel allows high uniformity gated operation down to 4ns and ∼600ps fall time with on-chip delay generation. The sensor operates with 0.996 temporal aperture ratio (TAR) in rolling shutter and has a parasitic light sensitivity (PLS) in excess of −160dB when operated in global shutter. Gating and cooling allow the suppression of dark noise, which, in combination with the high fill factor, enables competitive low-light performance with electron multiplying CCDs (EMCCDs) whilst offering time-resolved imaging modes. |
Syracuse, N.Y. — Popular Syracuse Crunch defenseman J.P. Cote said Tampa Bay has told him it will not re-sign him for the 2015-16 season.
Cote, on an NHL deal, will be an unrestricted free agent on July 1.
Cote, 33, was a heart-and-soul defenseman for Syracuse who helped build winning cultures in Norfolk and Syracuse. He was a key part of the Admirals' run to the Calder Cup title in 2011-12 and then played the same role as the Crunch reached the finals in 2012-13.
He skated for Syracuse the past three seasons, earning a reputation as one of the toughest stay-at-home defensemen in the AHL. He was also one of the most active Crunch players in community work.
"The way I play the game and the way I approach it, any team I'm with I liked to get involved,'' he said Friday night. "The most fun with hockey is getting involved. I guess that's what hurts the most, to be told we don't need you to be part of this group anymore.
"That's being said, it is a business. I have nothing but good memories in Syracuse. I look back on it with pride.''
As happens with most veterans eventually, Cote was pushed aside by a youth movement. Tampa Bay brought in a core of good rookie blueliners to the Crunch last year — Jake Dotchin, Slater Koekkoek and Dylan Blujus — and this season highly regarded prospect Tony DeAngelo is expected to join that mix.
"They are taking a different approach,'' Cote said. "There's a lot of guys coming up. At the end of the day, you're exchangeable. It's a breakup. That's what it feels like.''
Cote said Tampa Bay assistant general manager Julien BriseBois told him of the decision a couple weeks ago. Still, the Lightning flew Cote from his home in Quebec to Amalie Arena for Game 5 of the Stanley Cup playoffs. It gave him a chance to spend a few more hours around his former Norfolk/Syracuse teammates, such as Tyler Johnson, Mike Angelidis and Ondrej Palat.
"It was nice to see the boys,'' Cote said. "It was just a nice thought that Julien came up with. It's a great organization. I would have wished to keep going, keep working hard for the organization, for the fans of Syracuse.''
Cote is skating with several NHL players in Quebec, including his good friend Antoine Vermette. Vermette was a member of the Chicago Blackhawks team that beat the Lightning in the Stanley Cup Finals.
Cote said he's looking forward to Vermette's Stanley Cup party.
"I know I'll be putting my little guy in the Cup,'' Cote said of his 5-month-old son.
The always optimistic Cote said his preference is to keep playing in North America, even if it's only on an AHL deal. While Cote's skating ability likely precludes a spot for him in the NHL, his savvy and leadership should make him a good candidate for another job in the minors.
"I've had some time to process it. I'm super-confident in the future,'' he said. "Hockey is a great game, I want to keep playing it. Looking forward, I'm not going to play for another 10 years. I will listen, if it's a good deal at the AHL level. If it's an NHL deal, even better.
"I'm hoping to go to a team that appreciates (my grit). Teams have shopping lists. I'm sure I'm not at the top of their lists. I know my role. I'm sure I can be helpful for the younger guys. I'm very optimistic for what's ahead. It's another adventure. I'm pumped for it.''
Contact Lindsay Kramer anytime: Email | Twitter | 315-470-2151 |
/*************************************
*
* DSK board DSP32C I/O handlers
*
*************************************/
void harddriv_state::hd68k_dsk_dsp32_w(offs_t offset, uint16_t data)
{
m_dsk_pio_access = true;
if (m_dsp32.found()) m_dsp32->pio_w(offset, data);
m_dsk_pio_access = false;
} |
/// Class does some book keeping for timing an interval
/// Resolution: 1 second.
///
class IntervalTimer {
private:
time_t _timeout;
time_t _start_time {0};
bool _running {false};
public:
IntervalTimer( time_t to ) :
_timeout(to) { }
bool expired() {
return (_running and ((time(0) - _start_time) > _timeout));
}
bool running() const {
return _running;
}
time_t timeout_secs() const {
return _timeout;
}
void set_timeout( time_t to ) {
_timeout = to;
}
    void start() { // no effect if already running
if (not _running) {
std::cout << "start timer" << std::endl;
_start_time = time(0);
_running = true;
}
}
    void stop() { // no effect if already stopped
_start_time = 0;
std::cout << "stop timer" << std::endl;
_running = false;
}
};
package bayern.steinbrecher.woodpacker.data;
import java.util.Collection;
import java.util.Collections;
/**
* @author <NAME>
* @since 0.1
*/
public class CuttingPlan {
private final Collection<PlankSolutionRow> rows;
private final BasePlank basePlank;
private final int oversize;
public CuttingPlan(final Collection<PlankSolutionRow> rows, final BasePlank basePlank, final int oversize) {
this.rows = Collections.unmodifiableCollection(rows);
this.basePlank = basePlank;
this.oversize = oversize;
}
public Collection<PlankSolutionRow> getRows() {
return rows;
}
public BasePlank getBasePlank() {
return basePlank;
}
public int getOversize() {
return oversize;
}
}
|
/**
 * Navigates to the specified point one axis at a time, for an int[] destination
*
* @param currentPoint Current point tile coordinates
* @param targetPoint Target location in tile coordinates
*/
public synchronized static void navigateToPoint(double currentPoint[], int targetPoint[]) {
double distXToPoint;
double distYToPoint;
distXToPoint = targetPoint[0] - currentPoint[0];
distYToPoint = targetPoint[1] - currentPoint[1];
if (Math.abs(distXToPoint) >= Math.abs(distYToPoint)) {
travelTo(targetPoint[0] * TILE_SIZE, currentPoint[1] * TILE_SIZE);
} else {
travelTo(currentPoint[0] * TILE_SIZE, targetPoint[1] * TILE_SIZE);
}
travelTo(targetPoint[0] * TILE_SIZE, targetPoint[1] * TILE_SIZE);
} |
/**
* returns privilege entities by code collection
*
 * @param codes the privilege codes to look up
* @return List<Privilege>
* @author umit.kas
*/
@Override
public List<Privilege> findAllByCode(List<Long> codes) throws DataNotFoundException {
List<Privilege> entities = privilegeRepository.findAllByCodeIn(codes);
if (entities == null) {
throw new DataNotFoundException("No such privilege collection is found");
}
return entities;
} |
package io.microconfig.core.properties.templates.definition.parser;
import io.microconfig.core.properties.Property;
import io.microconfig.core.properties.templates.TemplateDefinition;
import io.microconfig.core.properties.templates.TemplateDefinitionParser;
import io.microconfig.core.properties.templates.TemplatePattern;
import lombok.RequiredArgsConstructor;
import java.io.File;
import java.io.IOException;
import java.nio.file.Path;
import java.util.Collection;
import java.util.List;
import java.util.stream.Stream;
import static java.nio.file.Files.list;
import static java.util.Collections.emptyList;
import static java.util.Collections.singletonList;
import static java.util.stream.Collectors.toList;
@RequiredArgsConstructor
public class ArrowNotationParser implements TemplateDefinitionParser {
private final TemplatePattern templatePattern;
@Override
public Collection<TemplateDefinition> parse(Collection<Property> componentProperties) {
return componentProperties.stream()
.map(this::processProperty)
.flatMap(Collection::stream)
.collect(toList());
}
private List<TemplateDefinition> processProperty(Property property) {
if (!correctNotation(property.getKey())) return emptyList();
String[] split = property.getValue().trim().split(" -> ");
if (split.length != 2) return emptyList();
if (split[0].endsWith("/*")) {
return processWithAsterisk(property.getKey(), split[0], split[1]);
}
return singletonList(createTemplate(property.getKey(), split[0], split[1]));
}
private List<TemplateDefinition> processWithAsterisk(String key, String from, String to) {
String fromDir = from.substring(0, from.length() - 2);
try (Stream<Path> templates = list(new File(fromDir).toPath())) {
return templates
.map(path -> createTemplate(key, path.toString(), to + "/" + path.getFileName()))
.collect(toList());
} catch (IOException e) {
throw new RuntimeException("Can't get templates from dir " + from, e);
}
}
private TemplateDefinition createTemplate(String key, String from, String to) {
TemplateDefinition templateDefinition = new TemplateDefinition(
templatePattern.extractTemplateType(key),
templatePattern.extractTemplateName(key),
templatePattern);
templateDefinition.setFromFile(from);
templateDefinition.setToFile(to);
return templateDefinition;
}
private boolean correctNotation(String key) {
return key.endsWith(templatePattern.extractTemplateName(key)) && !key.contains("[");
}
}
|
/*
** EPITECH PROJECT, 2020
** PSU_42sh_2019
** File description:
** init
*/
#include <stdlib.h>
/* getcwd */
#include <unistd.h>
/* strdup */
#include <string.h>
#include "hasher/insert_data.h"
#include "proto/shell/local_variables.h"
#include "types/local_variables.h"
static void local_variables_add(struct hasher_s **hasher,
char *value, char *name)
{
struct local_var_s *var = NULL;
if (value) {
var = local_variable_from_data(*hasher, name, value);
if (var->data.string) {
hasher_insert_data_ordered(hasher, strdup(name), var);
}
}
}
struct hasher_s *local_variables_init(void)
{
struct hasher_s *hasher = NULL;
local_variables_add(&hasher, getcwd(NULL, 0), "cwd");
local_variables_add(&hasher, getenv("TERM"), "term");
local_variables_add(&hasher, "/usr/bin:/bin", "path");
return (hasher);
}
|
package metadata
import (
"github.com/romberli/go-util/config"
"github.com/romberli/das/pkg/message"
)
func init() {
initDebugMonitorSystemMessage()
initInfoMonitorSystemMessage()
initErrorMonitorSystemMessage()
}
const (
// debug
DebugMetadataGetMonitorSystemAll = 100601
DebugMetadataGetMonitorSystemByEnv = 100602
DebugMetadataGetMonitorSystemByID = 100603
DebugMetadataGetMonitorSystemByHostInfo = 100604
DebugMetadataAddMonitorSystem = 100605
DebugMetadataUpdateMonitorSystem = 100606
DebugMetadataDeleteMonitorSystem = 100607
// info
InfoMetadataGetMonitorSystemAll = 200601
InfoMetadataGetMonitorSystemByEnv = 200602
InfoMetadataGetMonitorSystemByID = 200603
InfoMetadataGetMonitorSystemByHostInfo = 200604
InfoMetadataAddMonitorSystem = 200605
InfoMetadataUpdateMonitorSystem = 200606
InfoMetadataDeleteMonitorSystem = 200607
// error
ErrMetadataGetMonitorSystemAll = 400601
ErrMetadataGetMonitorSystemByEnv = 400602
ErrMetadataGetMonitorSystemByID = 400603
ErrMetadataGetMonitorSystemByHostInfo = 400604
ErrMetadataAddMonitorSystem = 400605
ErrMetadataUpdateMonitorSystem = 400606
ErrMetadataDeleteMonitorSystem = 400607
)
func initDebugMonitorSystemMessage() {
message.Messages[DebugMetadataGetMonitorSystemAll] = config.NewErrMessage(message.DefaultMessageHeader, DebugMetadataGetMonitorSystemAll, "metadata: get all monitor systems completed. message: %s")
message.Messages[DebugMetadataGetMonitorSystemByEnv] = config.NewErrMessage(message.DefaultMessageHeader, DebugMetadataGetMonitorSystemByEnv, "metadata: get monitor systems by environment completed. message: %s")
message.Messages[DebugMetadataGetMonitorSystemByID] = config.NewErrMessage(message.DefaultMessageHeader, DebugMetadataGetMonitorSystemByID, "metadata: get monitor system by id completed. message: %s")
message.Messages[DebugMetadataGetMonitorSystemByHostInfo] = config.NewErrMessage(message.DefaultMessageHeader, DebugMetadataGetMonitorSystemByHostInfo, "metadata: get monitor system by host info completed. message: %s")
message.Messages[DebugMetadataAddMonitorSystem] = config.NewErrMessage(message.DefaultMessageHeader, DebugMetadataAddMonitorSystem, "metadata: add new monitor system completed. message: %s")
message.Messages[DebugMetadataUpdateMonitorSystem] = config.NewErrMessage(message.DefaultMessageHeader, DebugMetadataUpdateMonitorSystem, "metadata: update monitor system completed. message: %s")
message.Messages[DebugMetadataDeleteMonitorSystem] = config.NewErrMessage(message.DefaultMessageHeader, DebugMetadataDeleteMonitorSystem, "metadata: delete monitor system completed. message: %s")
}
func initInfoMonitorSystemMessage() {
message.Messages[InfoMetadataGetMonitorSystemAll] = config.NewErrMessage(message.DefaultMessageHeader, InfoMetadataGetMonitorSystemAll, "metadata: get all monitor systems completed")
message.Messages[InfoMetadataGetMonitorSystemByEnv] = config.NewErrMessage(message.DefaultMessageHeader, InfoMetadataGetMonitorSystemByEnv, "metadata: get monitor systems by environment completed. env_id: %d")
message.Messages[InfoMetadataGetMonitorSystemByID] = config.NewErrMessage(message.DefaultMessageHeader, InfoMetadataGetMonitorSystemByID, "metadata: get monitor system by id completed. id: %d")
message.Messages[InfoMetadataGetMonitorSystemByHostInfo] = config.NewErrMessage(message.DefaultMessageHeader, InfoMetadataGetMonitorSystemByHostInfo, "metadata: get monitor system by host info completed. host_ip: %s, port_num: %d")
message.Messages[InfoMetadataAddMonitorSystem] = config.NewErrMessage(message.DefaultMessageHeader, InfoMetadataAddMonitorSystem, "metadata: add new monitor system completed. system_name: %s, system_type: %d, host_ip: %s, port_num: %d, port_num_slow: %d, base_url: %s, env_id: %d")
message.Messages[InfoMetadataUpdateMonitorSystem] = config.NewErrMessage(message.DefaultMessageHeader, InfoMetadataUpdateMonitorSystem, "metadata: update monitor system completed. id: %d")
message.Messages[InfoMetadataDeleteMonitorSystem] = config.NewErrMessage(message.DefaultMessageHeader, InfoMetadataDeleteMonitorSystem, "metadata: delete monitor system completed. id: %d")
}
func initErrorMonitorSystemMessage() {
message.Messages[ErrMetadataGetMonitorSystemAll] = config.NewErrMessage(message.DefaultMessageHeader, ErrMetadataGetMonitorSystemAll, "metadata: get all monitor systems failed.\n%s")
message.Messages[ErrMetadataGetMonitorSystemByEnv] = config.NewErrMessage(message.DefaultMessageHeader, ErrMetadataGetMonitorSystemByEnv, "metadata: get monitor systems by environment failed. env_id: %d\n%s")
message.Messages[ErrMetadataGetMonitorSystemByID] = config.NewErrMessage(message.DefaultMessageHeader, ErrMetadataGetMonitorSystemByID, "metadata: get monitor system by id failed. id: %d\n%s")
message.Messages[ErrMetadataGetMonitorSystemByHostInfo] = config.NewErrMessage(message.DefaultMessageHeader, ErrMetadataGetMonitorSystemByHostInfo, "metadata: get monitor system by host info failed. host_ip: %s, port_num: %d\n%s")
message.Messages[ErrMetadataAddMonitorSystem] = config.NewErrMessage(message.DefaultMessageHeader, ErrMetadataAddMonitorSystem, "metadata: add new monitor system failed. system_name: %s, system_type: %d, host_ip: %s, port_num: %d, port_num_slow: %d, base_url: %s, env_id: %d\n%s")
message.Messages[ErrMetadataUpdateMonitorSystem] = config.NewErrMessage(message.DefaultMessageHeader, ErrMetadataUpdateMonitorSystem, "metadata: update monitor system failed. id: %d\n%s")
message.Messages[ErrMetadataDeleteMonitorSystem] = config.NewErrMessage(message.DefaultMessageHeader, ErrMetadataDeleteMonitorSystem, "metadata: delete monitor system failed. id: %d\n%s")
}
|
// Run executes kube-test given an artifact directory, and sets settings
// required for kubetest to work with Cluster API. JUnit files are
// also gathered for inclusion in Prow.
func Run(ctx context.Context, input RunInput) error {
if input.ClusterProxy == nil {
return errors.New("ClusterProxy must be provided")
}
if input.GinkgoNodes == 0 {
input.GinkgoNodes = DefaultGinkgoNodes
}
if input.GinkgoSlowSpecThreshold == 0 {
input.GinkgoSlowSpecThreshold = 120
}
if input.NumberOfNodes == 0 {
numNodes, err := countClusterNodes(ctx, input.ClusterProxy)
if err != nil {
return errors.Wrap(err, "Unable to count number of cluster nodes")
}
input.NumberOfNodes = numNodes
}
if input.KubernetesVersion == "" && input.ConformanceImage == "" {
discoveredVersion, err := discoverClusterKubernetesVersion(input.ClusterProxy)
if err != nil {
return errors.Wrap(err, "Unable to discover server's Kubernetes version")
}
input.KubernetesVersion = discoveredVersion
}
input.ArtifactsDirectory = framework.ResolveArtifactsDirectory(input.ArtifactsDirectory)
reportDir := path.Join(input.ArtifactsDirectory, "kubetest")
outputDir := path.Join(reportDir, "e2e-output")
kubetestConfigDir := path.Join(reportDir, "config")
if err := os.MkdirAll(outputDir, 0o750); err != nil {
return err
}
if err := os.MkdirAll(kubetestConfigDir, 0o750); err != nil {
return err
}
ginkgoVars := map[string]string{
"nodes": strconv.Itoa(input.GinkgoNodes),
"slowSpecThreshold": strconv.Itoa(input.GinkgoSlowSpecThreshold),
}
tmpConfigFilePath := path.Join(kubetestConfigDir, "viper-config.yaml")
if err := copyFile(input.ConfigFilePath, tmpConfigFilePath); err != nil {
return err
}
tmpKubeConfigPath, err := dockeriseKubeconfig(kubetestConfigDir, input.ClusterProxy.GetKubeconfigPath())
if err != nil {
return err
}
var testRepoListVolumeArgs []string
if input.KubeTestRepoListPath != "" {
testRepoListVolumeArgs, err = buildKubeTestRepoListArgs(kubetestConfigDir, input.KubeTestRepoListPath)
if err != nil {
return err
}
}
e2eVars := map[string]string{
"kubeconfig": "/tmp/kubeconfig",
"provider": "skeleton",
"report-dir": "/output",
"e2e-output-dir": "/output/e2e-output",
"dump-logs-on-failure": "false",
"report-prefix": "kubetest.",
"num-nodes": strconv.FormatInt(int64(input.NumberOfNodes), 10),
"viper-config": "/tmp/viper-config.yaml",
}
ginkgoArgs := buildArgs(ginkgoVars, "-")
e2eArgs := buildArgs(e2eVars, "--")
if input.ConformanceImage == "" {
input.ConformanceImage = versionToConformanceImage(input.KubernetesVersion)
}
kubeConfigVolumeMount := volumeArg(tmpKubeConfigPath, "/tmp/kubeconfig")
outputVolumeMount := volumeArg(reportDir, "/output")
viperVolumeMount := volumeArg(tmpConfigFilePath, "/tmp/viper-config.yaml")
user, err := user.Current()
if err != nil {
return errors.Wrap(err, "unable to determine current user")
}
userArg := user.Uid + ":" + user.Gid
networkArg := "--network=kind"
e2eCmd := exec.Command("docker", "run", "--user", userArg, kubeConfigVolumeMount, outputVolumeMount, viperVolumeMount, "-t", networkArg)
if len(testRepoListVolumeArgs) > 0 {
e2eCmd.Args = append(e2eCmd.Args, testRepoListVolumeArgs...)
}
e2eCmd.Args = append(e2eCmd.Args, input.ConformanceImage)
e2eCmd.Args = append(e2eCmd.Args, "/usr/local/bin/ginkgo")
e2eCmd.Args = append(e2eCmd.Args, ginkgoArgs...)
e2eCmd.Args = append(e2eCmd.Args, "/usr/local/bin/e2e.test")
e2eCmd.Args = append(e2eCmd.Args, "--")
e2eCmd.Args = append(e2eCmd.Args, e2eArgs...)
e2eCmd = framework.CompleteCommand(e2eCmd, "Running e2e test", false)
if err := e2eCmd.Run(); err != nil {
return errors.Wrap(err, "Unable to run conformance tests")
}
if err := framework.GatherJUnitReports(reportDir, input.ArtifactsDirectory); err != nil {
return err
}
return nil
} |
import { AboutPage } from './AboutPage'
export default AboutPage
|
/**
* @param initialSettings a map with settings to be present in the config.
* @return a configuration with default values augmented with the provided <code>initialSettings</code>.
*/
@Nonnull
public static Config defaults( @Nonnull final Map<String,String> initialSettings )
{
return builder().withSettings( initialSettings ).build();
} |
/*
* Copyright (c) 2021 Huawei Device Co., Ltd.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
 * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
export const source: string = `
import router from '@system.router'
import app from '@system.app'
@Entry
@Component
struct MyComponent {
@State text_num: number = 0.0
build() {
Column() {
Text('fingers:2,angle: ' + this.text_num)
.fontSize(25)
.width(400)
.height(400)
.backgroundColor('red')
.gesture(
RotationGesture({fingers: 2, angle: 5})
.onActionStart((event: GestureEvent) => {
this.text_num = event.angle
console.error('rotation gesture on clicked')
})
.onActionUpdate((event: GestureEvent) => {
this.text_num = event.angle
console.error('rotation gesture on clicked')
})
.onActionEnd((event: GestureEvent) => {
this.text_num = event.angle
console.error('rotation gesture on clicked')
})
.onActionCancel(() => {
})
)
}
}
}`
export const expectResult: string =
`var router = globalThis.requireNativeModule('system.router');
var app = globalThis.requireNativeModule('system.app');
class MyComponent extends View {
constructor(compilerAssignedUniqueChildId, parent, params) {
super(compilerAssignedUniqueChildId, parent);
this.__text_num = new ObservedPropertySimple(0.0, this, "text_num");
this.updateWithValueParams(params);
}
updateWithValueParams(params) {
if (params.text_num !== undefined) {
this.text_num = params.text_num;
}
}
aboutToBeDeleted() {
this.__text_num.aboutToBeDeleted();
SubscriberManager.Get().delete(this.id());
}
get text_num() {
return this.__text_num.get();
}
set text_num(newValue) {
this.__text_num.set(newValue);
}
render() {
Column.create();
Text.create('fingers:2,angle: ' + this.text_num);
Text.fontSize(25);
Text.width(400);
Text.height(400);
Text.backgroundColor('red');
Gesture.create(GesturePriority.Low);
RotationGesture.create({ fingers: 2, angle: 5 });
RotationGesture.onActionStart((event) => {
this.text_num = event.angle;
console.error('rotation gesture on clicked');
});
RotationGesture.onActionUpdate((event) => {
this.text_num = event.angle;
console.error('rotation gesture on clicked');
});
RotationGesture.onActionEnd((event) => {
this.text_num = event.angle;
console.error('rotation gesture on clicked');
});
RotationGesture.onActionCancel(() => {
});
RotationGesture.pop();
Gesture.pop();
Text.pop();
Column.pop();
}
}
loadDocument(new MyComponent("1", undefined, {}));
`
|
// ProgressWriter reports the write progress.
func ProgressWriter(op *operations.Operation, key string, description string) func(io.WriteCloser) io.WriteCloser {
return func(writer io.WriteCloser) io.WriteCloser {
if op == nil {
return writer
}
progress := func(progressInt int64, speedInt int64) {
progressWrapperRender(op, key, description, progressInt, speedInt)
}
writePipe := &ioprogress.ProgressWriter{
WriteCloser: writer,
Tracker: &ioprogress.ProgressTracker{
Handler: progress,
},
}
return writePipe
}
} |
If Bill Nye loves space so much, how come he's never gone into it?
It's not for lack of trying. In a recent interview at AOL headquarters in New York, America's beloved "Science Guy" and the CEO of The Planetary Society said that he had applied to join the astronaut corps four times, but NASA rejected him.
"The kind of people that become astronauts are amazing," Nye explained. "They're these crazy, wonderful, overachiever, wild people."
Nye, whose gift for explaining science and promoting space exploration has earned him worldwide fame, isn't exactly an underachiever. As he said in the interview, he was a good athlete and not prone to the motion sickness that troubles some astronauts.
Still, it was a no-go. And so Nye joked that there might have been a very basic explanation for why NASA didn't want him -- just watch the interview in the video above to hear the story!
// make sure we don't crash on highly nested expressions
// or rather, crash in a controlled way
fn recursion_check(&self) -> RecursionGuard {
// this is just a guesstimate, it should probably be configurable
#[cfg(debug_assertions)]
const MAX_DEPTH: usize = 50;
#[cfg(not(debug_assertions))]
const MAX_DEPTH: usize = 200;
let guard = self.recursion_guard.clone();
let depth = Rc::strong_count(&guard.0);
if depth > MAX_DEPTH {
eprintln!(
"fatal: maximum recursion depth exceeded ({} > {})",
depth, MAX_DEPTH
);
std::process::exit(102);
}
guard
} |
package client
import (
"context"
"fmt"
"github.com/consensys/quorum-key-manager/src/stores/api/types"
)
const secretsPath = "secrets"
func (c *HTTPClient) SetSecret(ctx context.Context, storeName, id string, req *types.SetSecretRequest) (*types.SecretResponse, error) {
secret := &types.SecretResponse{}
reqURL := fmt.Sprintf("%s/%s/%s", withURLStore(c.config.URL, storeName), secretsPath, id)
response, err := postRequest(ctx, c.client, reqURL, req)
if err != nil {
return nil, err
}
defer closeResponse(response)
err = parseResponse(response, secret)
if err != nil {
return nil, err
}
return secret, nil
}
func (c *HTTPClient) GetSecret(ctx context.Context, storeName, id, version string) (*types.SecretResponse, error) {
secret := &types.SecretResponse{}
reqURL := fmt.Sprintf("%s/%s/%s", withURLStore(c.config.URL, storeName), secretsPath, id)
if version != "" {
reqURL = fmt.Sprintf("%s?version=%s", reqURL, version)
}
response, err := getRequest(ctx, c.client, reqURL)
if err != nil {
return nil, err
}
defer closeResponse(response)
err = parseResponse(response, secret)
if err != nil {
return nil, err
}
return secret, nil
}
func (c *HTTPClient) ListSecrets(ctx context.Context, storeName string) ([]string, error) {
var ids []string
reqURL := fmt.Sprintf("%s/%s", withURLStore(c.config.URL, storeName), secretsPath)
response, err := getRequest(ctx, c.client, reqURL)
if err != nil {
return nil, err
}
defer closeResponse(response)
err = parseResponse(response, &ids)
if err != nil {
return nil, err
}
return ids, nil
}
|
Efficient estimation of human immunodeficiency virus incidence rate using a pooled cross‐sectional cohort study design
Development of methods to accurately estimate human immunodeficiency virus (HIV) incidence rate remains a challenge. Ideally, one would follow a random sample of HIV‐negative individuals under a longitudinal study design and identify incident cases as they arise. Such designs can be prohibitively resource intensive and therefore alternative designs may be preferable. We propose such a simple, less resource‐intensive study design and develop a weighted log likelihood approach which simultaneously accounts for selection bias and outcome misclassification error. The design is based on a cross‐sectional survey which queries individuals' time since last HIV‐negative test, validates their test results with formal documentation whenever possible, and tests all persons who do not have documentation of being HIV‐positive. To gain efficiency, we update the weighted log likelihood function with potentially misclassified self‐reports from individuals who could not produce documentation of a prior HIV‐negative test and investigate large sample properties of validated sub‐sample only versus pooled sample estimators through extensive Monte Carlo simulations. We illustrate our method by estimating incidence rate for individuals who tested HIV‐negative within 1.5 and 5 years prior to Botswana Combination Prevention Project enrolment. This article establishes that accurate estimates of HIV incidence rate can be obtained from individuals' history of testing in a cross‐sectional cohort study design by appropriately accounting for selection bias and misclassification error. Moreover, this approach is notably less resource‐intensive compared to longitudinal and laboratory‐based methods. |
/**
 * This class is used when the sort order has not changed, so the row does not need to be updated
*/
public class DummyRowUpdater implements SortTempRowUpdater {
private static final long serialVersionUID = 5989093890994039617L;
@Override public void updateSortTempRow(IntermediateSortTempRow intermediateSortTempRow) {
// DO NOTHING
}
@Override public void updateOutputRow(Object[] out, int[] dimArray, Object[] noDictArray,
Object[] measureArray) {
out[WriteStepRowUtil.DICTIONARY_DIMENSION] = dimArray;
out[WriteStepRowUtil.NO_DICTIONARY_AND_COMPLEX] = noDictArray;
out[WriteStepRowUtil.MEASURE] = measureArray;
}
} |
/**
* The Constants class provides a convenient place for teams to hold robot-wide numerical or boolean
* constants. This class should not be used for any other purpose. All constants should be declared
* globally (i.e. public static). Do not put anything functional in this class.
*
* <p>It is advised to statically import this class (or one of its inner classes) wherever the
* constants are needed, to reduce verbosity.
*/
public final class Constants {
// DriveTrain
// device numbers?
public static final int DT_LEFT_FRONT = 13;
public static final int DT_LEFT_BACK = 15;
public static final int DT_RIGHT_FRONT = 2;
public static final int DT_RIGHT_BACK = 20;
//Ingestor
public static final int INGESTOR_TALON_CARGO = 10;
public static final int INGESTOR_TALON_BELT = 11;
public static final double INGESTOR_BELT_POWER = -1;
//Shooter
public static final int S_POW = 10;
// Stick Sensitivity
public static final double SENSITIVITY = 0.4;
} |
package array
// Sum returns the sum of all elements in the given slice.
func Sum(numbers []int) int {
sum := 0
for _, number := range numbers {
sum += number
}
return sum
}
// SumAll computes the sum of each given slice and returns the sums in a new slice.
func SumAll(numbersToSum ...[]int) (sum []int) {
for _, numbers := range numbersToSum {
sum = append(sum, Sum(numbers))
}
return
}
// SumAllTails computes the sum of each given slice, excluding the head element, and returns the sums in a new slice.
func SumAllTails(numbersToSum ...[]int) (sum []int) {
for _, numbers := range numbersToSum {
if len(numbers) == 0 {
sum = append(sum, 0)
} else {
sum = append(sum, Sum(numbers[1:]))
}
}
return
}
|
The Department of Veterans Affairs’ top health official on Monday told a House committee hearing into “preventable deaths” at VA medical facilities that the incidents represented “serious but not systemic” problems for the department.
The hearing, held in Pittsburgh at the Allegheny County Courthouse, featured testimony from family members and veterans who had troubling and in some cases tragic interactions with VA medical facilities in Pittsburgh, Atlanta, Buffalo, Dallas, and Jackson, Miss.
“The patient care issues the committee has raised are serious, but not systemic,” Robert A. Petzel, VA’s secretary for health, told the House Committee on Veterans’ Affairs, which held the hearing. The hearing was titled, “A Matter of Life and Death: Examining Preventable Deaths, Patient Safety Issues and Bonuses for VA Execs Who Oversaw Them.”
Rep. Jeff Miller (R-Fla.), chairman of the committee, said that VA inspector general reports “have linked a number of these incidents to widespread mismanagement . . . the department has consistently given executives who presided over these events glowing performance reviews and cash bonuses.”
In Pittsburgh, where Legionnaires’ disease is blamed for five veterans’ death, “VA officials knew they had a Legionnaires’ disease outbreak on their hands, but they kept it secret for more than a year,” Miller said.
In Atlanta, IG reports blamed mismanagement for the overdose of one patient and the suicides of two others. In Buffalo, at least 18 veterans tested positive for hepatitis after it was discovered that the medical center had been reusing disposable insulin pens.
The medical centers in Jackson and Dallas are the subject of numerous allegations of poor patient care, Miller said.
Petzel provided the committee with details on disciplinary actions taken in connection with the events, but did not discuss them publicly. “When adverse events do occur, there are many ways to hold people accountable — when it is appropriate to do so,” he said.
Petzel said the VA has taken steps to avoid repeating problems that have surfaced in connection to the incidents, including the spread of Legionnaires at the Pittsburgh medical center.
“Lessons learned from Pittsburgh, and they are extensive, are now being used to ensure water safety at all VA medical centers throughout the nation,” Petzel said. The Buffalo incident has “triggered a national change in how our system manages the use of insulin pens,” he added.
Addressing families present at the hearing, Petzel added, “I am saddened by the stories of loss that I have heard from the families and I offer my sincerest condolences to the families here today.” |
package net.minecraft.advancements;
import com.google.gson.JsonObject;
import net.minecraft.advancements.critereon.DeserializationContext;
import net.minecraft.resources.ResourceLocation;
import net.minecraft.server.PlayerAdvancements;
public interface CriterionTrigger<T extends CriterionTriggerInstance> {
ResourceLocation getId();
void addPlayerListener(PlayerAdvancements pPlayerAdvancements, CriterionTrigger.Listener<T> pListener);
void removePlayerListener(PlayerAdvancements pPlayerAdvancements, CriterionTrigger.Listener<T> pListener);
void removePlayerListeners(PlayerAdvancements pPlayerAdvancements);
T createInstance(JsonObject pJson, DeserializationContext pContext);
public static class Listener<T extends CriterionTriggerInstance> {
private final T trigger;
private final Advancement advancement;
private final String criterion;
public Listener(T pTrigger, Advancement pAdvancement, String pCriterion) {
this.trigger = pTrigger;
this.advancement = pAdvancement;
this.criterion = pCriterion;
}
public T getTriggerInstance() {
return this.trigger;
}
public void run(PlayerAdvancements pPlayerAdvancements) {
pPlayerAdvancements.award(this.advancement, this.criterion);
}
public boolean equals(Object pOther) {
if (this == pOther) {
return true;
} else if (pOther != null && this.getClass() == pOther.getClass()) {
CriterionTrigger.Listener<?> listener = (CriterionTrigger.Listener)pOther;
if (!this.trigger.equals(listener.trigger)) {
return false;
} else {
return !this.advancement.equals(listener.advancement) ? false : this.criterion.equals(listener.criterion);
}
} else {
return false;
}
}
public int hashCode() {
int i = this.trigger.hashCode();
i = 31 * i + this.advancement.hashCode();
return 31 * i + this.criterion.hashCode();
}
}
} |
import {Client as DiscordClient, Message} from "discord.js";
import fetch from "node-fetch";
import * as dcu from '../discordbot/discordUtil'
import {formatFile} from "./pasteFormatter";
import {createPaste} from "./pasteApi";
const ALLOWED_SUFFIXES = [ '.txt', '.log', '.cfg', '.json', '.json5', '.iml', '.xml', '.yml', '.yaml' ]
export function startPasteHandler(client: DiscordClient): void {
client.on('interactionCreate', async interaction => {
if (!interaction.isMessageContextMenu()) return;
if (interaction.commandName == 'Create_Paste') {
try {
const channel = await dcu.tryTextChannel(client, interaction.channelId)
const msg = await channel?.messages?.fetch(interaction.targetMessage.id)
if (channel == null || msg == null) {
await dcu.sendError(interaction, 'Can\'t create paste: No message selected.')
} else {
const paste = findTextToPaste(msg)
if (paste == null) {
await dcu.sendError(interaction, 'Can\'t create paste: No suitable attachment found.')
} else if (paste == 'too_large') {
await dcu.sendError(interaction, 'Can\'t paste file: Too large')
} else {
await interaction.deferReply({
ephemeral: true,
fetchReply: true
})
const text = await (await fetch(paste.url)).text()
const formatted = formatFile(paste.fileName, text)
const result = await createPaste(paste.fileName, formatted)
await channel.send({
content: `:page_facing_up: <${result.url}>`,
reply: {
messageReference: msg,
failIfNotExists: false
},
allowedMentions: {
repliedUser: false
}
})
await interaction.editReply({
// code block needed so discord won't try to create an embed which would cause the
// link to be called and delete the paste
content: '**Delete paste:** `' + result.delete + '`'
})
}
}
} catch (err) {
console.log(err)
}
}
})
}
function findTextToPaste(msg: Message): PasteText | 'too_large' | null {
let defaultReturn: 'too_large' | null = null
for (const attachment of msg.attachments.values()) {
const name = attachment.name
if (name != null && ALLOWED_SUFFIXES.some(suffix => name.toLowerCase().endsWith(suffix))) {
if (attachment.size > (100 * 1024)) {
defaultReturn = 'too_large'
} else {
return {
fileName: name,
url: attachment.url
}
}
}
}
return defaultReturn;
}
interface PasteText {
fileName: string,
url: string
}
|
from typing import Optional
import pandas as pd
import numpy as np
from .common.helpers.helpers import Frame
from .settings import default_ticker, PeriodLength, _MONTHS_PER_YEAR
from .api.data_queries import QueryData
from .api.namespaces import get_assets_namespaces
class Asset:
"""
    A financial asset that can be used in an asset list or in a portfolio.
Parameters
----------
symbol: str, default "SPY.US"
Symbol is an asset ticker with namespace after dot. The default value is "SPY.US" (SPDR S&P 500 ETF Trust).
"""
def __init__(self, symbol: str = default_ticker):
if symbol is None or len(str(symbol).strip()) == 0:
raise ValueError("Symbol can not be empty")
self._symbol = str(symbol).strip()
self._check_namespace()
self._get_symbol_data(symbol)
self.ror: pd.Series = QueryData.get_ror(symbol)
self.first_date: pd.Timestamp = self.ror.index[0].to_timestamp()
self.last_date: pd.Timestamp = self.ror.index[-1].to_timestamp()
self.period_length: float = round(
(self.last_date - self.first_date) / np.timedelta64(365, "D"), ndigits=1
)
self.pl = PeriodLength(
self.ror.shape[0] // _MONTHS_PER_YEAR,
self.ror.shape[0] % _MONTHS_PER_YEAR,
)
def __repr__(self):
dic = {
"symbol": self.symbol,
"name": self.name,
"country": self.country,
"exchange": self.exchange,
"currency": self.currency,
"type": self.type,
"isin": self.isin,
"first date": self.first_date.strftime("%Y-%m"),
"last date": self.last_date.strftime("%Y-%m"),
"period length": "{:.2f}".format(self.period_length),
}
return repr(pd.Series(dic))
def _check_namespace(self):
namespace = self._symbol.split(".", 1)[-1]
allowed_namespaces = get_assets_namespaces()
if namespace not in allowed_namespaces:
raise ValueError(
f"{namespace} is not in allowed assets namespaces: {allowed_namespaces}"
)
def _get_symbol_data(self, symbol) -> None:
x = QueryData.get_symbol_info(symbol)
self.ticker: str = x["code"]
self.name: str = x["name"]
self.country: str = x["country"]
self.exchange: str = x["exchange"]
self.currency: str = x["currency"]
self.type: str = x["type"]
self.isin: str = x["isin"]
self.inflation: str = f"{self.currency}.INFL"
@property
def symbol(self) -> str:
"""
Return a symbol of the asset.
Returns
-------
str
"""
return self._symbol
@property
def price(self) -> Optional[float]:
"""
Return live price of an asset.
Live price is delayed (15-20 minutes).
For certain namespaces (FX, INDX, PIF etc.) live price is not supported.
Returns
-------
float, None
Live price of the asset. Returns None if not defined.
"""
return QueryData.get_live_price(self.symbol)
@property
def close_daily(self):
"""
Return close price time series historical daily data.
Returns
-------
Series
Time series of close price historical data (daily).
"""
return QueryData.get_close(self.symbol, period='D')
@property
def close_monthly(self):
"""
Return close price time series historical monthly data.
        The monthly close time series is not adjusted for corporate actions: dividends and splits.
Returns
-------
Series
Time series of close price historical data (monthly).
Examples
--------
>>> import matplotlib.pyplot as plt
>>> x = ok.Asset('VOO.US')
>>> x.close_monthly.plot()
>>> plt.show()
"""
return Frame.change_period_to_month(self.close_daily)
@property
def adj_close(self):
"""
Return adjusted close price time series historical daily data.
The adjusted closing price amends a stock's closing price after accounting
for corporate actions: dividends and splits. All values are adjusted by reducing the price
prior to the dividend payment (or split).
Returns
-------
Series
Time series of adjusted close price historical data (daily).
"""
return QueryData.get_adj_close(self.symbol, period='D')
@property
def dividends(self) -> pd.Series:
"""
Return dividends time series historical monthly data.
Returns
-------
Series
Time series of dividends historical data (monthly).
Examples
--------
>>> x = ok.Asset('VNQ.US')
>>> x.dividends
Date
2004-12-22 1.2700
2005-03-24 0.6140
2005-06-27 0.6440
2005-09-26 0.6760
...
2020-06-25 0.7590
2020-09-25 0.5900
2020-12-24 1.3380
2021-03-25 0.5264
Freq: D, Name: VNQ.US, Length: 66, dtype: float64
"""
div = QueryData.get_dividends(self.symbol)
if div.empty:
# Zero time series for assets where dividend yield is not defined.
index = pd.date_range(
start=self.first_date, end=self.last_date, freq="MS", closed=None
)
period = index.to_period("D")
div = pd.Series(data=0, index=period)
div.rename(self.symbol, inplace=True)
return div.resample("M").sum()
@property
def nav_ts(self) -> Optional[pd.Series]:
"""
Return NAV time series (monthly) for mutual funds.
"""
if self.exchange == "PIF":
return QueryData.get_nav(self.symbol)
return np.nan
|
// Describe a local image with the highest confidence guess.
func (c *Client) Describe(localImagePath string) (*computervision.ImageCaption, error) {
if c.loud {
fmt.Printf("Trying to describe %s\n", localImagePath)
}
var localImage io.ReadCloser
localImage, err := os.Open(localImagePath)
if err != nil {
return nil, err
}
maxNumberDescriptionCandidates := new(int32)
*maxNumberDescriptionCandidates = 1
localImageDescription, err := c.visionClient.DescribeImageInStream(
c.visionContext,
localImage,
maxNumberDescriptionCandidates,
"",
)
if err != nil {
return nil, err
}
if len(*localImageDescription.Captions) == 0 {
return nil, ErrorNoLabel
}
imageCaption := (*localImageDescription.Captions)[0]
if *imageCaption.Confidence < c.threshold {
return &imageCaption, &ConfidenceError{*imageCaption.Confidence}
}
return &imageCaption, nil
} |
/// Creates and configures a config instance.
fn configure<F>(f: F) -> Self::Config
where
F: FnOnce(Self::Config) -> Self::Config,
{
f(Self::Config::default())
} |
/**
 * Performs linear interpolation to compute the value at a point, given lists of x and y coordinates.
 * @param point Point at which the interpolated value is needed
 * @param x X-coordinates of the data points (must be increasing)
 * @param y Y-coordinates of the data points
 * @return The interpolated value
 */
public static double interpolate(double point, double[] x, double[] y) {
if (!isSorted(x, false)) {
throw new IllegalArgumentException("X-coordinates must be increasing");
}
LinearInterpolator li = new LinearInterpolator();
PolynomialSplineFunction psf = li.interpolate(x, y);
return psf.value(point);
} |
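For readers without the Commons Math dependency, the same piecewise-linear evaluation can be sketched directly. This Python toy is illustrative only; it mirrors the increasing-x contract above rather than reproducing the library's implementation:

```python
def interpolate(point, x, y):
    """Evaluate the piecewise-linear interpolant of (x, y) at `point`."""
    if any(b <= a for a, b in zip(x, x[1:])):
        raise ValueError("X-coordinates must be increasing")
    if not x[0] <= point <= x[-1]:
        raise ValueError("point is outside the interpolation range")
    # Find the bracketing segment and interpolate linearly within it.
    for (x0, x1), (y0, y1) in zip(zip(x, x[1:]), zip(y, y[1:])):
        if x0 <= point <= x1:
            return y0 + (y1 - y0) * (point - x0) / (x1 - x0)
```

Unlike the Java method, this version also rejects points outside `[x[0], x[-1]]`, where a linear spline is undefined.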
// Some values are custom and specific to Denizen
public static CuboidBlockSet fromSpongeStream(InputStream is) {
CuboidBlockSet cbs = new CuboidBlockSet();
try {
NBTInputStream nbtStream = new NBTInputStream(new GZIPInputStream(is));
NamedTag rootTag = nbtStream.readNamedTag();
nbtStream.close();
if (!rootTag.getName().equals("Schematic")) {
throw new Exception("Tag 'Schematic' does not exist or is not first!");
}
CompoundTag schematicTag = (CompoundTag) rootTag.getTag();
Map<String, Tag> schematic = schematicTag.getValue();
if (schematic.containsKey("DenizenEntities")) {
String entities = getChildTag(schematic, "DenizenEntities", StringTag.class).getValue();
cbs.entities = ListTag.valueOf(entities, CoreUtilities.errorButNoDebugContext);
}
short width = getChildTag(schematic, "Width", ShortTag.class).getValue();
short length = getChildTag(schematic, "Length", ShortTag.class).getValue();
short height = getChildTag(schematic, "Height", ShortTag.class).getValue();
int originX = 0;
int originY = 0;
int originZ = 0;
if (schematic.containsKey("DenizenOffset")) {
int[] offsetArr = getChildTag(schematic, "DenizenOffset", IntArrayTag.class).getValue();
originX = offsetArr[0];
originY = offsetArr[1];
originZ = offsetArr[2];
}
cbs.x_width = width;
cbs.z_height = length;
cbs.y_length = height;
cbs.center_x = originX;
cbs.center_y = originY;
cbs.center_z = originZ;
cbs.blocks = new FullBlockData[width * length * height];
Map<String, Tag> paletteMap = getChildTag(schematic, "Palette", CompoundTag.class).getValue();
HashMap<Integer, BlockData> palette = new HashMap<>(256);
for (String key : paletteMap.keySet()) {
int id = getChildTag(paletteMap, key, IntTag.class).getValue();
BlockData data;
try {
data = NMSHandler.getBlockHelper().parseBlockData(key);
}
catch (Exception ex) {
Debug.echoError(ex);
MaterialTag material = MaterialTag.valueOf(BlockHelper.getMaterialNameFromBlockData(key), CoreUtilities.noDebugContext);
data = (material == null ? new MaterialTag(Material.AIR) : material).getModernData();
}
palette.put(id, data);
}
Map<BlockVector, Map<String, Tag>> tileEntitiesMap = new HashMap<>();
if (schematic.containsKey("BlockEntities")) {
List<Tag> tileEntities = getChildTag(schematic, "BlockEntities", JNBTListTag.class).getValue();
for (Tag tag : tileEntities) {
if (!(tag instanceof CompoundTag)) {
continue;
}
CompoundTag t = (CompoundTag) tag;
int[] pos = getChildTag(t.getValue(), "Pos", IntArrayTag.class).getValue();
int x = pos[0];
int y = pos[1];
int z = pos[2];
BlockVector vec = new BlockVector(x, y, z);
tileEntitiesMap.put(vec, t.getValue());
}
}
byte[] blocks = getChildTag(schematic, "BlockData", ByteArrayTag.class).getValue();
int i = 0;
int index = 0;
while (i < blocks.length) {
int value = 0;
int varintLength = 0;
while (true) {
value |= (blocks[i] & 127) << (varintLength++ * 7);
if (varintLength > 5) {
throw new Exception("Schem file blocks tag data corrupted");
}
if ((blocks[i] & 128) != 128) {
i++;
break;
}
i++;
}
FullBlockData block = new FullBlockData(palette.get(value));
int y = index / (width * length);
int z = (index % (width * length)) / width;
int x = (index % (width * length)) % width;
int cbsIndex = z + y * cbs.z_height + x * cbs.z_height * cbs.y_length;
BlockVector pt = new BlockVector(x, y, z);
if (tileEntitiesMap.containsKey(pt)) {
block.tileEntityData = NMSHandler.getInstance().createCompoundTag(tileEntitiesMap.get(pt));
}
cbs.blocks[cbsIndex] = block;
index++;
}
if (schematic.containsKey("DenizenFlags")) {
Map<String, Tag> flags = getChildTag(schematic, "DenizenFlags", CompoundTag.class).getValue();
for (Map.Entry<String, Tag> flagData : flags.entrySet()) {
int flagIndex = Integer.valueOf(flagData.getKey());
cbs.blocks[flagIndex].flags = MapTag.valueOf(((StringTag) flagData.getValue()).getValue(), CoreUtilities.noDebugContext);
}
}
}
catch (Exception e) {
Debug.echoError(e);
}
return cbs;
} |
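The block-decoding loop above reads the schematic's little-endian base-128 varints. Isolated as a hedged Python sketch (the function name and interface are mine, not from the codebase):

```python
def read_varint(data, i):
    """Decode a little-endian base-128 varint starting at data[i].

    Returns (value, next_index). Raises if the run exceeds 5 bytes,
    matching the corruption check in the Java loop above.
    """
    value = 0
    length = 0
    while True:
        byte = data[i]
        value |= (byte & 0x7F) << (length * 7)  # low 7 bits carry payload
        length += 1
        i += 1
        if length > 5:
            raise ValueError("varint too long")
        if byte & 0x80 == 0:  # high bit clear marks the last byte
            return value, i
```

Each byte contributes 7 payload bits; the high bit is a continuation flag, so values up to the palette size fit in one or two bytes.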
// SendReply sends a reply to the caller
func (r *Request) SendReply(resp proto.Message, withError *Error) error {
if r.StreamedReply() {
r.KeepStreamAlive.Stop()
if withError == nil {
return r.sendReply(
nil, &Error{Type: Error_EOS, MsgCount: r.StreamMsgCount},
)
}
}
return r.sendReply(resp, withError)
} |
Bioactive four-membered heterocyclic compounds: the anti --> syn interconversion in dithietane-1,3-dioxide.
The anti-dithietane-1,3-dioxide --> syn-dithietane-1,3-dioxide isomerization reaction has been studied theoretically within the framework of MO theory, both in the gas phase and in solution. In the gas phase the anti (II(a)) <--> syn (II(s)) equilibrium is slightly displaced toward formation of the anti isomer. The syn concentration is ca. 36% in the gas phase, whereas in a low-polarity solvent such as carbon tetrachloride it is ca. 63%. In medium- to high-polarity solvents like acetonitrile and dimethyl sulfoxide the syn/anti ratio is ca. 0.37.
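As an illustrative back-of-the-envelope check (assuming ideal mixing at T = 298 K; only the 36% figure comes from the text above), the gas-phase composition corresponds to

```latex
K = \frac{x_{\mathrm{syn}}}{x_{\mathrm{anti}}} = \frac{0.36}{0.64} \approx 0.56,
\qquad
\Delta G^{\circ} = -RT \ln K
  \approx -\bigl(8.314\,\mathrm{J\,mol^{-1}\,K^{-1}}\bigr)\bigl(298\,\mathrm{K}\bigr)\ln 0.56
  \approx +1.4\ \mathrm{kJ\,mol^{-1}},
```

i.e. the anti isomer is favored by only about 1.4 kJ/mol, consistent with the equilibrium being only slightly displaced toward anti.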
package com.tiscon.service;
import com.tiscon.code.OptionalServiceType;
import com.tiscon.code.PackageType;
import com.tiscon.dao.EstimateDao;
import com.tiscon.domain.Customer;
import com.tiscon.domain.CustomerOptionService;
import com.tiscon.domain.CustomerPackage;
import com.tiscon.dto.UserOrderDto;
import org.springframework.beans.BeanUtils;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
/**
 * Class responsible for the business logic of the moving-estimate feature.
 *
 * @author <NAME>
 */
@Service
public class EstimateService {
/** Price per 1 km of moving distance [yen] */
private static final int PRICE_PER_DISTANCE = 100;
private final EstimateDao estimateDAO;
/**
 * Constructor.
 *
 * @param estimateDAO the EstimateDao instance
 */
public EstimateService(EstimateDao estimateDAO) {
this.estimateDAO = estimateDAO;
}
/**
 * Registers an estimate request in the DB.
 *
 * @param dto the estimate request information
 */
@Transactional
public void registerOrder(UserOrderDto dto) {
Customer customer = new Customer();
BeanUtils.copyProperties(dto, customer);
estimateDAO.insertCustomer(customer);
if (dto.getWashingMachineInstallation()) {
CustomerOptionService washingMachine = new CustomerOptionService();
washingMachine.setCustomerId(customer.getCustomerId());
washingMachine.setServiceId(OptionalServiceType.WASHING_MACHINE.getCode());
estimateDAO.insertCustomersOptionService(washingMachine);
}
List<CustomerPackage> packageList = new ArrayList<>();
packageList.add(new CustomerPackage(customer.getCustomerId(), PackageType.BOX.getCode(), dto.getBox()));
packageList.add(new CustomerPackage(customer.getCustomerId(), PackageType.BED.getCode(), dto.getBed()));
packageList.add(new CustomerPackage(customer.getCustomerId(), PackageType.BICYCLE.getCode(), dto.getBicycle()));
packageList.add(new CustomerPackage(customer.getCustomerId(), PackageType.WASHING_MACHINE.getCode(), dto.getWashingMachine()));
estimateDAO.batchInsertCustomerPackage(packageList);
}
/**
 * Calculates a rough estimate for the given estimate request.
 *
 * @param dto the estimate request information
 * @return the price of the rough estimate
 */
public Integer getPrice(UserOrderDto dto) throws ParseException {
double distance = estimateDAO.getDistance(dto.getOldPrefectureId(), dto.getNewPrefectureId());
// Truncate the fractional part
int distanceInt = (int) Math.floor(distance);
int boxes = getBoxForPackage(dto.getBox(), PackageType.BOX)
+ getBoxForPackage(dto.getBed(), PackageType.BED)
+ getBoxForPackage(dto.getBicycle(), PackageType.BICYCLE)
+ getBoxForPackage(dto.getWashingMachine(), PackageType.WASHING_MACHINE);
// The truck type, and hence the price, depends on the number of boxes, so calculate the truck price.
int pricePerTruck = estimateDAO.getPricePerTruck(boxes);
// Calculate the price of the optional services.
int priceForOptionalService = 0;
if (dto.getWashingMachineInstallation()) {
priceForOptionalService = estimateDAO.getPricePerOptionalService(OptionalServiceType.WASHING_MACHINE.getCode());
}
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd");
sdf.setLenient(false);
Date d = sdf.parse(dto.getScheduleddate());
SimpleDateFormat msdf = new SimpleDateFormat("MM");
String datem = msdf.format(d);
System.out.println("month : " + datem);
double priceDistanceTruck = 0;
int priceDistance = distanceInt * PRICE_PER_DISTANCE;
priceDistanceTruck = priceDistance + pricePerTruck;
// Calculate the distance-based price
if(datem.equals("03") || datem.equals("04")) {
priceDistanceTruck = priceDistanceTruck * 1.5;
}else if(datem.equals("09")){
priceDistanceTruck = priceDistanceTruck * 1.2;
}
// Truncate the fractional part
int priceDistanceTruckInt = (int) Math.floor(priceDistanceTruck);
return priceDistanceTruckInt + priceForOptionalService;
}
/**
 * Calculates the number of cardboard boxes per package item.
 *
 * @param packageNum the number of items
 * @param type the type of package item
 * @return the number of cardboard boxes
 */
private int getBoxForPackage(int packageNum, PackageType type) {
return packageNum * estimateDAO.getBoxPerPackage(type.getCode());
}
} |
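The arithmetic in getPrice — a per-km fee plus a truck fee, a seasonal multiplier of 1.5 for March/April and 1.2 for September, then optional-service charges — can be summarized as a pure function. This Python sketch is illustrative only; the placeholder prices stand in for the DAO lookups:

```python
import math

def moving_price(distance_km, truck_price, optional_price, month):
    """Distance fee + truck fee, with a seasonal multiplier, plus options."""
    base = math.floor(distance_km) * 100 + truck_price  # 100 yen per km
    if month in ("03", "04"):
        base *= 1.5   # peak moving season
    elif month == "09":
        base *= 1.2
    return math.floor(base) + optional_price
```

Keeping the pricing rule as a pure function like this also makes the seasonal-surcharge branches trivial to unit-test, independently of the database.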
March 13th, 2015
Linus Under Wraps, Fedora Tests Wayland & More…
FOSS Week in Review
To be honest, I’m really not finished going through all the materials I picked up at the Open Compute Project 2015 U.S. Summit this week in San Jose. There is a lot of interesting stuff here to wade through, and I’m still going through it.
Meanwhile, much of what makes the FOSS world interesting didn’t wait for me to finish. Like:
Linus Under Wraps? We all know Linus Torvalds. We “get it” — he’s a guy with a big vocabulary who doesn’t suffer fools gladly. Many of us are okay with “Linus being Linus,” though some lately have begun to question how positive his criticisms can be. Earlier this week, Business Insider reported the Linux Foundation appeared to try to rein him in, slapping him on the wrist when they issued a new “Code of Conflict” policy that declared “personal insults or abuse are not welcome.”
The “Code of Conflict” says that if “anyone feels personally abused, threatened, or otherwise uncomfortable” while working on Linux, they should report the situation to the Technical Advisory Board, who will step in and mediate.
Greg Kroah-Hartman wrote it and submitted it as a “patch” to the Linux system, which meant ultimately Linus had to see the “patch” to approve it. He did, of course, adding the comment “Let’s see how this works.”
Let’s see, indeed.
Fedora 22 Alpha out: It’s nearly spring, and with the warmer weather and blooming flowers usually comes the even-numbered Fedora releases. The Fedora 22 Alpha is now ready for your test-driving pleasure, should you choose to give it a spin.
Hidden in the release notes in the link above is this morsel: “The login screen now uses Wayland by default. This is a step towards replacing X with Wayland, and users should not actually notice the difference.” Okay, let’s just see about that. If you want to see a preview of what that might look like, you can take a look here.
Fedora 22 Beta is scheduled for release in mid-April.
It’s all in the terminology: I’ve always liked the term “Metal-as-a-Service” (MAAS). It invokes instantly conjuring up Judas Priest or Anthrax (the band, not the livestock disease) at a moment’s notice, though I know that’s not really what it means.
In any case, Canonical trumpets its partnership with Microsoft — yep, Microsoft — this week at the Open Compute Summit, where the Isle of Man reached across to Redmond to demonstrate how Canonical and Microsoft are working together to create scalable, OCP-compliant architecture.
“Canonical is supporting bare-metal provisioning on Microsoft’s OCS hardware with our open source Metal-as-a-Service (MAAS) deployment product,” the article states. “This support means Windows and Linux (Ubuntu, CentOS, SUSE) operating systems, as well as application software on top, can be one-touch provisioned on OCS hardware. At the summit, Canonical is demonstrating how to provision a multi-tier web architecture effectively, including content publishing and database to separate nodes in an OCS chassis.”
What’s mysteriously missing from that parenthetical in the previous paragraph? If you said Red Hat, you’d go on to the bonus round.
Okay then…get a room, you two. See you next week. |
use crate::commands::CommandData;
use futures::{channel::mpsc::SendError, Sink, SinkExt};
use tracing::instrument;
pub struct Noop;
impl Noop {
#[instrument(skip(self, lines, command_data))]
pub async fn exec<S>(
&self,
lines: &mut S,
command_data: &CommandData<'_>,
) -> color_eyre::eyre::Result<()>
where
S: Sink<String, Error = SendError> + std::marker::Unpin + std::marker::Send,
{
// TODO return status as suggested in https://www.rfc-editor.org/rfc/rfc9051.html#name-noop-command
lines
.send(format!("{} OK NOOP completed", command_data.tag))
.await?;
Ok(())
}
}
|
"""
testresult.py - a script implementing TestResult and TestStatus classes
The former is the main class (as the name of the file suggests...) and it
is a container for test result. The latter is just a simple helper class
serving as a enum for available test result values. It's convenient for
importing and easy to use.
NOTE: this module should not be run as a standalone scripts, excepts for
built-in tests.
"""
# HISTORY ####################################################################
#
# 0.0.1 Mar11 MR # The initial version of the file
##############################################################################
__description__ = "TestResult class implementation"
__version__ = "0.0.1"
__author__ = "<NAME>."
import unittest
from pyrus.core.testresult import TestResult, TestStatus
class TestStatusUnit(unittest.TestCase):
"""
TestStatusUnit - class representing unit tests for TestStatus class
"""
def test_01_pass(self):
""" """
s = TestStatus.PASS
self.assertEqual(s, TestStatus.PASS)
def test_02_pass(self):
""" """
s = 0
self.assertEqual(s, TestStatus.PASS)
    def test_03_fail(self):
""" """
s = TestStatus.FAIL
self.assertEqual(s, TestStatus.FAIL)
def test_04_fail(self):
""" """
s = 1
self.assertEqual(s, TestStatus.FAIL)
def test_05_xfail(self):
""" """
s = TestStatus.XFAIL
self.assertEqual(s, TestStatus.XFAIL)
def test_06_xfail(self):
""" """
s = 2
self.assertEqual(s, TestStatus.XFAIL)
def test_07_nottested(self):
""" """
s = TestStatus.NOT_TESTED
self.assertEqual(s, TestStatus.NOT_TESTED)
def test_08_nottested(self):
""" """
s = 3
self.assertEqual(s, TestStatus.NOT_TESTED)
def test_09_skipped(self):
""" """
s = TestStatus.SKIPPED
self.assertEqual(s, TestStatus.SKIPPED)
def test_10_skipped(self):
""" """
s = 4
self.assertEqual(s, TestStatus.SKIPPED)
def test_11_unknown(self):
""" """
s = TestStatus.UNKNOWN
self.assertEqual(s, TestStatus.UNKNOWN)
def test_12_unknown(self):
""" """
s = -1
self.assertEqual(s, TestStatus.UNKNOWN)
def test_13_invalid(self):
""" """
s = -2
self.assertNotIn(s, TestStatus.values, "Not in valid range")
def test_14_invalid(self):
""" """
s = 5
self.assertNotIn(s, TestStatus.values, "Not in valid range")
def test_15_invalid(self):
""" """
s = 666
self.assertNotIn(s, TestStatus.values, "Not in valid range")
def test_16_invalid_str(self):
""" """
s = "blah"
self.assertNotIn(s, TestStatus.values, "Not in valid range")
def test_17_invalid_float(self):
""" """
s = 0.23
self.assertNotIn(s, TestStatus.values, "Not in valid range")
def test_20_convert_pass(self):
""" """
vals = ["pass", "PASS", "PasS", "pASs", " pass ", " PASS",
" PasS "]
for v in vals:
s = TestStatus.convert(v)
self.assertEqual(s, TestStatus.PASS)
def test_21_convert_fail(self):
""" """
vals = ["fail", "FAIL", "Fail", "fAIl", " fail ", " FaIL "]
for v in vals:
s = TestStatus.convert(v)
self.assertEqual(s, TestStatus.FAIL)
def test_22_convert_xfail(self):
""" """
vals = ["xfail", "XFAIL", "xFail", "XfAIl", " xfail ",
" xFaIL ", "expected fail", "EXPECTED FAIL",
" EXpEcTED fAIL "]
for v in vals:
s = TestStatus.convert(v)
self.assertEqual(s, TestStatus.XFAIL)
def test_23_convert_nottested(self):
""" """
vals = ["not tested", "NOT TESTED", "not_tested", "NOT_TESTED",
" NOt_tested ", " NOT tested "]
for v in vals:
s = TestStatus.convert(v)
self.assertEqual(s, TestStatus.NOT_TESTED)
def test_24_convert_skipped(self):
""" """
vals = ["skipped", "SKIPPED", "SkiPPed", " skipped ",
" SkiPpEd "]
for v in vals:
s = TestStatus.convert(v)
self.assertEqual(s, TestStatus.SKIPPED)
def test_25_convert_unknown(self):
""" """
        vals = ["unknown", "UNKNOWN", "UnkNOwn", " unknown ",
" UNKnown "]
for v in vals:
s = TestStatus.convert(v)
self.assertEqual(s, TestStatus.UNKNOWN)
def test_26_convert_empty(self):
""" """
s = TestStatus.convert("")
self.assertEqual(s, TestStatus.UNKNOWN)
def test_27_convert_default(self):
""" """
s = TestStatus.convert("blah")
self.assertEqual(s, TestStatus.UNKNOWN)
class TestResultUnit(unittest.TestCase):
"""
TestResultUnit - represents unit tests for TestResult class
"""
def test_01_pass(self):
""" """
r = TestResult(0)
self.assertEqual(r.result, TestStatus.PASS)
def test_02_pass(self):
""" """
r = TestResult(TestStatus.PASS)
self.assertEqual(r.result, TestStatus.PASS)
def test_03_pass_str(self):
""" """
r = TestResult(TestStatus.PASS)
self.assertEqual(str(r), "pass")
def test_04_pass_toJson(self):
json = """{"result": "pass"}"""
r = TestResult(TestStatus.PASS)
self.assertEqual(r.toJson(), json)
def test_05_fail(self):
""" """
r = TestResult(1)
self.assertEqual(r.result, TestStatus.FAIL)
def test_06_fail(self):
""" """
r = TestResult(TestStatus.FAIL)
self.assertEqual(r.result, TestStatus.FAIL)
def test_07_fail_str(self):
""" """
r = TestResult(TestStatus.FAIL)
self.assertEqual(str(r), "fail")
def test_08_fail_toJson(self):
json = """{"result": "fail"}"""
r = TestResult(TestStatus.FAIL)
self.assertEqual(r.toJson(), json)
def test_09_xfail(self):
""" """
r = TestResult(2)
self.assertEqual(r.result, TestStatus.XFAIL)
def test_10_xfail(self):
""" """
r = TestResult(TestStatus.XFAIL)
self.assertEqual(r.result, TestStatus.XFAIL)
def test_11_xfail_str(self):
""" """
r = TestResult(TestStatus.XFAIL)
self.assertEqual(str(r), "expected fail")
def test_12_xfail_toJson(self):
json = """{"result": "expected fail"}"""
r = TestResult(TestStatus.XFAIL)
self.assertEqual(r.toJson(), json)
def test_13_nottested(self):
""" """
r = TestResult(3)
self.assertEqual(r.result, TestStatus.NOT_TESTED)
def test_14_nottested(self):
""" """
r = TestResult(TestStatus.NOT_TESTED)
self.assertEqual(r.result, TestStatus.NOT_TESTED)
def test_15_nottested_str(self):
""" """
r = TestResult(TestStatus.NOT_TESTED)
self.assertEqual(str(r), "not tested")
def test_16_nottested_toJson(self):
json = """{"result": "not tested"}"""
r = TestResult(TestStatus.NOT_TESTED)
self.assertEqual(r.toJson(), json)
def test_17_skipped(self):
""" """
r = TestResult(4)
self.assertEqual(r.result, TestStatus.SKIPPED)
def test_18_skipped(self):
""" """
r = TestResult(TestStatus.SKIPPED)
self.assertEqual(r.result, TestStatus.SKIPPED)
def test_19_skipped_str(self):
""" """
r = TestResult(TestStatus.SKIPPED)
self.assertEqual(str(r), "skipped")
def test_20_skipped_toJson(self):
json = """{"result": "skipped"}"""
r = TestResult(TestStatus.SKIPPED)
self.assertEqual(r.toJson(), json)
def test_21_unknown(self):
""" """
r = TestResult(-1)
self.assertEqual(r.result, TestStatus.UNKNOWN)
def test_22_unknown(self):
""" """
r = TestResult(TestStatus.UNKNOWN)
self.assertEqual(r.result, TestStatus.UNKNOWN)
def test_23_unknown_str(self):
""" """
r = TestResult(TestStatus.UNKNOWN)
self.assertEqual(str(r), "unknown")
def test_24_unknown_toJson(self):
json = """{"result": "unknown"}"""
r = TestResult(TestStatus.UNKNOWN)
self.assertEqual(r.toJson(), json)
def test_25_default(self):
""" """
r = TestResult()
self.assertEqual(r.result, TestStatus.UNKNOWN)
def test_26_default_str(self):
""" """
r = TestResult()
self.assertEqual(str(r), "unknown")
def test_27_default_toJson(self):
json = """{"result": "unknown"}"""
r = TestResult()
self.assertEqual(r.toJson(), json)
def test_28_set(self):
""" """
r = TestResult()
self.assertEqual(r.result, TestStatus.UNKNOWN)
r.result = TestStatus.PASS
self.assertEqual(r.result, TestStatus.PASS)
r.result = TestStatus.FAIL
self.assertEqual(r.result, TestStatus.FAIL)
r.result = TestStatus.XFAIL
self.assertEqual(r.result, TestStatus.XFAIL)
r.result = TestStatus.SKIPPED
self.assertEqual(r.result, TestStatus.SKIPPED)
r.result = TestStatus.NOT_TESTED
self.assertEqual(r.result, TestStatus.NOT_TESTED)
r.result = TestStatus.PASS
self.assertEqual(r.result, TestStatus.PASS)
self.assertEqual(str(r), "pass")
def test_29_invalid_below(self):
""" """
self.assertRaises(AssertionError, TestResult, -2)
def test_30_invalid_above(self):
""" """
self.assertRaises(AssertionError, TestResult, 666)
def test_31_invalid_float(self):
""" """
self.assertRaises(AssertionError, TestResult, 2.23)
def test_32_invalid_hex(self):
""" """
self.assertRaises(AssertionError, TestResult, 0x45)
def test_33_invalid_string(self):
""" """
self.assertRaises(AssertionError, TestResult, "a string")
def test_34_invalid_list(self):
""" """
self.assertRaises(AssertionError, TestResult, [])
def test_35_invalid_dict(self):
""" """
self.assertRaises(AssertionError, TestResult, {})
if __name__ == '__main__':
suite = unittest.makeSuite(TestStatusUnit)
unittest.TextTestRunner(verbosity=2).run(suite)
#
suite = unittest.makeSuite(TestResultUnit)
unittest.TextTestRunner(verbosity=2).run(suite)
|
The first thing that comes to mind for some former Creighton players about new Purdue assistant coach Steve Lutz: "Toughness."
Cole Huff and Geoffrey Groselle each played for the Bluejays during Lutz' seven years with Greg McDermott's program. Both tabbed "toughness" above all else when asked to describe him, but with a caveat.
"He's a tough guy, but there's a real balance to him," said Huff, who completed his eligibility at Creighton this past season. "He is a tough, intense guy who doesn't accept you slacking off or losing focus, but at the same time, off the court and on the court, he really cares about his guys. He's like a parent, where he can be pretty hard on you, but just because he wants the best for you.
"He'll test your mental toughness because he'll say some things you might not like, but it'll be the truth. If you're messing up, blowing assignments or just not playing the way you should be, he's going to be the first one to let you know. He definitely gets his point across and more often than not you're going to learn from your mistake so you don't have to deal with Coach Lutz again."
Groselle, a 7-footer now playing professionally in Germany, joked that Purdue players might, "love him and hate him at the same time."
But more of the former, Groselle said.
"If you're messing up, he'll be the first to call you out on it and be brutally honest," Groselle said. "I can't tell you how many times I've heard him yell my name in practice. I've had so many meetings with him and every time, it starts with, 'I'm going to be honest with you, Geoffrey …,' and then, bam, he hits you with it, the truth you don't necessarily want to hear, but you need to hear."
That sort of forthrightness, both players said, built trust, and made Lutz an integral part of Creighton's success. In seven years under McDermott, with Lutz on staff, the Bluejays have averaged 24 wins and made four NCAA Tournament appearances.
"The culture they built those first couple years with Doug (McDermott), it was really a family," Groselle said of his alma mater's success. "That's really why we won so many games. We had an amazing talent in Doug, then in (Justin Patton) and Marcus Foster this year, but the family they built and culture they built was so important, and Lutz was a big part of it.
"It started with recruiting, then continued through my entire career, especially with me. If I ever had an issue, I went to him, and that trust you build with the coaching staff, it really pays dividends on the court, because if you trust your teammates and trust your coaches on and off the court, it's a good recipe for success."
def _compute_contiguity_geom(self, region_id=None, params=None):
    # `params` defaults to None rather than a mutable {}; it is currently unused.
    nx, ny = self.borders[0].shape[0] - 1, self.borders[1].shape[0] - 1
    if region_id is not None:
        return compute_contiguity_grid(region_id, (nx, ny))
    iss, jss, dts = [], [], []
    for i in range(nx):
        for j in range(ny):
            aux_ = compute_contiguity_grid(i*nx + j, (nx, ny))
            n_aux = len(aux_)
            dts.append(np.ones(n_aux).astype(int))
            # Parenthesize the flat index so the intent is explicit.
            iss.append(np.ones(n_aux).astype(int) * (i*nx + j))
            jss.append(np.array(aux_).astype(int))
    iss, jss = np.hstack(iss).astype(int), np.hstack(jss).astype(int)
    dts = np.hstack(dts).astype(int)
    contiguous = coo_matrix((dts, (iss, jss)), shape=(nx*ny, nx*ny))
    return contiguous
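`compute_contiguity_grid` is defined elsewhere, so the method above is not runnable on its own. A self-contained sketch of the same pattern, with an explicit 4-neighbour contiguity rule standing in for that helper (function names here are illustrative, not from the original module):

```python
import numpy as np
from scipy.sparse import coo_matrix

def grid_neighbors(idx, shape):
    # 4-connected neighbours of cell `idx` in an nx-by-ny grid
    # flattened row-major as i*ny + j.
    nx, ny = shape
    i, j = divmod(idx, ny)
    out = []
    if i > 0:
        out.append((i - 1) * ny + j)
    if i < nx - 1:
        out.append((i + 1) * ny + j)
    if j > 0:
        out.append(i * ny + j - 1)
    if j < ny - 1:
        out.append(i * ny + j + 1)
    return out

def contiguity_matrix(shape):
    # Accumulate (row, col) pairs for every cell/neighbour edge,
    # then build one sparse COO matrix, as in the method above.
    nx, ny = shape
    rows, cols = [], []
    for idx in range(nx * ny):
        for nb in grid_neighbors(idx, shape):
            rows.append(idx)
            cols.append(nb)
    data = np.ones(len(rows), dtype=int)
    return coo_matrix((data, (rows, cols)), shape=(nx * ny, nx * ny))

m = contiguity_matrix((3, 3)).toarray()
```

On a 3x3 grid this yields a symmetric adjacency matrix: corner cells have two neighbours, the centre cell has four.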
Throw (patent) trolls under bridge
When America’s system of patent litigation has gotten so dysfunctional that President Barack Obama and Republican Judiciary Chairman Bob Goodlatte agree on the need for reform, something is amiss. Make no mistake: Patents are important. Inventors’ ability to protect and exploit their discoveries has long been an essential driver of American prosperity — deemed so vital by the founding fathers that granting patents was one of the new federal government’s few enumerated legislative powers.
Of late, however, litigation over software patents has emerged as a drag, not a boon, to technological innovation. According to a study by Boston University law professors Mike Meurer and Jim Bessen, the “patent tax” adds 20 percent to software and electronic research and development costs, and in 2011, Google and Apple spent more on patent litigation and acquisition than on research and development. Little wonder: In June, for instance, the International Trade Commission (ITC) responded to a Samsung patent-infringement claim with a ruling that would have blocked the importation of certain Apple iPhones and iPads, had President Obama not taken the extraordinary step of vetoing the decision last Saturday in the first such presidential override of an ITC decision since 1987.
Unlike pharmaceutical patents, software patents are hard to define and have often been improvidently granted. And multiple software patents tend to be embedded within products, such as smartphones, cars, and printers — which can open up the manufacturers and even the users of such products to exploitative litigation.
Thus, the nation’s most aggressive plaintiffs’ lawyers — whom we at the Manhattan Institute have dubbed Trial Lawyers, Inc. — have begun to manipulate U.S. legal rules to extract wealth from the nation’s most innovative companies. Most such litigation today is not filed by companies holding patents, such as Samsung, but by “patent trolls” — people or companies that produce no goods or services themselves but exist to acquire patent rights and seek to enforce them against businesses that are producing goods or services using related technologies. Over the last six years, the number of lawsuits filed by such patent-assertion entities has increased 526 percent, according to research by Santa Clara law professor Colleen Chien, examining data from the patent risk-management company RPX Corporation.
And the businesses and individuals being hurt by patent-troll litigation abuses are not merely technology companies and manufacturers. In February 2011, a patent troll named Innovatio IP Ventures, LLC, acquired a portfolio of 31 patents related to Wireless Fidelity (Wi-Fi) technology. Instead of suing manufacturers employing Wi-Fi technologies that allegedly infringed on the patents, attorneys working with Innovatio mailed more than 8,000 letters seeking $2,500 to $3,000 each from retail businesses — including hotels, coffee shops, and restaurants — that offered customers Wi-Fi services.
Late last year, another patent troll, MPHJ Technology Investments LLC, acquired a patent covering scanner technologies that employed a one-button scan and send-to-e-mail function, which had been granted to Israeli resident Laurence Klein in 1997 but never used in a manufacturing setting. Rather than suing scanner and printer manufacturers, attorneys working with MPHJ mailed demand letters to hundreds of small and medium-size U.S. businesses that were end users of printers and scanners — seeking roughly $1,000 per worker in licensing royalties.
Apart from the fact that the U.S. Patent and Trademarks Office has issued many patents it shouldn’t, such patent-lawsuit shenanigans are enabled by idiosyncratic features of the American legal system that have made the United States far more litigious than any other nation in the developed world. The U.S., unlike every country in Western Europe, does not require the losers of a civil lawsuit to reimburse the winner’s expenses. A company facing a $3,000 demand letter might very well settle up rather than risking the need to mount a legal defense — which in a patent suit typically costs hundreds of thousands or millions of dollars if the case goes to trial, successful or not.
“Forum shopping” is another useful and much-abused tool for patent trolls. Under U.S. law, patent lawsuits can be asserted wherever a product is sold, which has led plaintiffs’ lawyers to seek out jurisdictions particularly likely to expedite their claims and award them hefty jury verdicts — places like the Eastern District of Texas, where the number of patent lawsuits filed skyrocketed from 32 in 2002 to 1,266 in 2012, according to filing data compiled by Perkins Coie patent attorney James Pistorino. And in addition to federal court, patent lawyers can file suit at the ITC, which has broad powers, as in the Samsung-Apple case, to keep products at the border on the basis of an alleged patent infringement involving a single software program.
The president and certain congressional leaders have advanced various approaches to fixing patent-litigation abuse, but there seems to be a growing consensus that something must be done. Let’s hope that our elected officials put aside the normal partisan bickering on this one and get to work: The nation’s economic growth and technological leadership depend on it.
James R. Copland is the director of the Center for Legal Policy at the Manhattan Institute, which on Aug. 5 released a new report, “Trial Lawyers, Inc.: Patent Trolls.”
def data_load_essentia(filename, duration=None, offset=0.0, sr=22050, mono=True, **kwargs):
    # NOTE: essentia's MonoLoader always returns mono audio; `duration`,
    # `offset` and `mono` are accepted for API compatibility with other
    # loaders but are currently ignored, and `sr` is forced to 22050 below.
    assert isinstance(filename, str) and filename != '', 'filename argument {0} / {1} is invalid'.format(filename, type(filename))
    l.debug('Loading ({2}) {0} s of audio file {1}'.format(duration, filename, 'essentia'))
    sr = 22050
    y = MonoLoader(filename=filename, sampleRate=sr).compute()
    l.debug('Loaded ({2}) audio file with {0} samples at rate {1}'.format(len(y), sr, 'essentia'))
    return y, sr
/*
* Copyright (c) 2019, 2021 Oracle and/or its affiliates.
* Licensed under the Universal Permissive License v 1.0 as shown at
* http://oss.oracle.com/licenses/upl.
*/
package com.oracle.coherence.examples.tls;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;
import com.tangosol.net.PasswordProvider;
import com.tangosol.util.Base;
import com.tangosol.util.Resources;
/**
* A file based Coherence {@link com.tangosol.net.PasswordProvider}.
* <p>
* If the file name passed to the constructor is {@code null} then
* an empty password value is returned from the {@link #get()}
* method.
*/
public class FileBasedPasswordProvider
implements PasswordProvider {
private static final char[] EMPTY_PASSWORD = new char[0];
/**
* The name of the file containing the password.
*/
private final String fileName;
/**
* Create a {@link com.oracle.coherence.examples.tls.FileBasedPasswordProvider}.
*
* @param file the name of the file containing the password
*/
public FileBasedPasswordProvider(String file) {
fileName = file;
}
@Override
public char[] get() {
return readPassword(fileName);
}
/**
* Read a password from a file.
*
* @param fileName the password file name
*
* @return the password
*/
public static char[] readPassword(String fileName) {
return readPassword(fileName, EMPTY_PASSWORD);
}
/**
* Read a password from a file.
*
* @param fileName the password file name
* @param defaultPassword the default password
*
* @return the password or the default password if the file was not found
*/
public static char[] readPassword(String fileName, char[] defaultPassword) {
if (fileName == null || fileName.trim().length() == 0) {
return defaultPassword;
}
URL url = Resources.findFileOrResource(fileName, FileBasedPasswordProvider.class.getClassLoader());
if (url == null) {
throw new IllegalStateException("Could not find password file " + fileName);
}
try (InputStream in = url.openStream()) {
BufferedReader reader = new BufferedReader(new InputStreamReader(in));
String line = reader.readLine();
return line == null ? new char[0] : line.toCharArray();
}
catch (IOException e) {
throw Base.ensureRuntimeException(e);
}
}
}
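The null/blank fallback and first-line read in `readPassword` can be sketched as a rough Python analogue; `read_password` here is a hypothetical stand-in for illustration, not part of Coherence or its API:

```python
import os

def read_password(file_name, default=""):
    # Return the first line of `file_name`; fall back to `default`
    # when the name is None or blank, mirroring the Java version.
    if not file_name or not file_name.strip():
        return default
    if not os.path.exists(file_name):
        # The Java class searches the classpath too; this sketch only
        # checks the filesystem.
        raise FileNotFoundError("Could not find password file " + file_name)
    with open(file_name) as f:
        line = f.readline()
    return line.rstrip("\n") if line else ""
```

As in the Java class, callers can pass an empty file name to get an empty (or default) password rather than an error.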
|
#!/usr/bin/python
#
# Converts the polygonal annotations of the Cityscapes dataset
# to images, where pixel values encode the ground truth classes and the
# individual instance of that classes.
#
# The Cityscapes downloads already include such images
# a) *color.png : the class is encoded by its color
# b) *labelIds.png : the class is encoded by its ID
# c) *instanceIds.png : the class and the instance are encoded by an instance ID
#
# With this tool, you can generate option
# d) *instanceTrainIds.png : the class and the instance are encoded by an instance training ID
# This encoding might come handy for training purposes. You can use
# the file labels.py to define the training IDs that suit your needs.
# Note however, that once you submit or evaluate results, the regular
# IDs are needed.
#
# Please refer to 'json2instanceImg.py' for an explanation of instance IDs.
#
# Uses the converter tool in 'json2instanceImg.py'
# Uses the mapping defined in 'labels.py'
#
# python imports
from __future__ import print_function
import os, glob, sys
# cityscapes imports
sys.path.append( os.path.normpath( os.path.join( os.path.dirname( __file__ ) , '..' , 'helpers' ) ) )
from csHelpers import printError
from json2instanceImg import json2instanceImg
# The main method
def main():
# Where to look for Cityscapes
if 'CITYSCAPES_DATASET' in os.environ:
cityscapesPath = os.environ['CITYSCAPES_DATASET']
else:
cityscapesPath = os.path.join(os.path.dirname(os.path.realpath(__file__)),'..','..')
# how to search for all ground truth
searchFine = os.path.join( cityscapesPath , "gtFine" , "*" , "*" , "*_gt*_polygons.json" )
searchCoarse = os.path.join( cityscapesPath , "gtCoarse" , "*" , "*" , "*_gt*_polygons.json" )
# search files
filesFine = glob.glob( searchFine )
filesFine.sort()
filesCoarse = glob.glob( searchCoarse )
filesCoarse.sort()
# concatenate fine and coarse
files = filesFine + filesCoarse
# files = filesFine # use this line if fine is enough for now.
# quit if we did not find anything
if not files:
printError( "Did not find any files. Please consult the README." )
# a bit verbose
print("Processing {} annotation files".format(len(files)))
# iterate through files
progress = 0
print("Progress: {:>3} %".format( progress * 100 / len(files) ), end=' ')
for f in files:
# create the output filename
dst = f.replace( "_polygons.json" , "_instanceTrainIds.png" )
# do the conversion
try:
json2instanceImg( f , dst , "trainIds" )
except:
print("Failed to convert: {}".format(f))
raise
# status
progress += 1
print("\rProgress: {:>3} %".format( progress * 100 / len(files) ), end=' ')
sys.stdout.flush()
# call the main
if __name__ == "__main__":
main()
|
/** A queue of all as yet unattributed classes.
*
* <p><b>This is NOT part of any supported API.
* If you write code that depends on this, you do so at your own risk.
* This code and its internal interfaces are subject to change or
* deletion without notice.</b>
*/
public class Todo extends AbstractQueue<Env<AttrContext>> {
/** The context key for the todo list. */
protected static final Context.Key<Todo> todoKey =
new Context.Key<Todo>();
/** Get the Todo instance for this context. */
public static Todo instance(Context context) {
Todo instance = context.get(todoKey);
if (instance == null)
instance = new Todo(context);
return instance;
}
/** Create a new todo list. */
protected Todo(Context context) {
context.put(todoKey, this);
}
public void append(Env<AttrContext> env) {
add(env);
}
@Override
public Iterator<Env<AttrContext>> iterator() {
return contents.iterator();
}
@Override
public int size() {
return contents.size();
}
public boolean offer(Env<AttrContext> e) {
if (contents.add(e)) {
if (contentsByFile != null)
addByFile(e);
return true;
} else {
return false;
}
}
public Env<AttrContext> poll() {
if (size() == 0)
return null;
Env<AttrContext> env = contents.remove(0);
if (contentsByFile != null)
removeByFile(env);
return env;
}
public Env<AttrContext> peek() {
return (size() == 0 ? null : contents.get(0));
}
public Queue<Queue<Env<AttrContext>>> groupByFile() {
if (contentsByFile == null) {
contentsByFile = new LinkedList<Queue<Env<AttrContext>>>();
for (Env<AttrContext> env: contents) {
addByFile(env);
}
}
return contentsByFile;
}
private void addByFile(Env<AttrContext> env) {
JavaFileObject file = env.toplevel.sourcefile;
if (fileMap == null)
fileMap = new HashMap<JavaFileObject, FileQueue>();
FileQueue fq = fileMap.get(file);
if (fq == null) {
fq = new FileQueue();
fileMap.put(file, fq);
contentsByFile.add(fq);
}
fq.fileContents.add(env);
}
private void removeByFile(Env<AttrContext> env) {
JavaFileObject file = env.toplevel.sourcefile;
FileQueue fq = fileMap.get(file);
if (fq == null)
return;
if (fq.fileContents.remove(env)) {
if (fq.isEmpty()) {
fileMap.remove(file);
contentsByFile.remove(fq);
}
}
}
LinkedList<Env<AttrContext>> contents = new LinkedList<Env<AttrContext>>();
LinkedList<Queue<Env<AttrContext>>> contentsByFile;
Map<JavaFileObject, FileQueue> fileMap;
class FileQueue extends AbstractQueue<Env<AttrContext>> {
@Override
public Iterator<Env<AttrContext>> iterator() {
return fileContents.iterator();
}
@Override
public int size() {
return fileContents.size();
}
public boolean offer(Env<AttrContext> e) {
if (fileContents.offer(e)) {
contents.add(e);
return true;
}
return false;
}
public Env<AttrContext> poll() {
if (fileContents.size() == 0)
return null;
Env<AttrContext> env = fileContents.remove(0);
contents.remove(env);
return env;
}
public Env<AttrContext> peek() {
return (fileContents.size() == 0 ? null : fileContents.get(0));
}
LinkedList<Env<AttrContext>> fileContents = new LinkedList<Env<AttrContext>>();
}
} |
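The two-level structure above (a global queue plus per-file sub-queues that are kept in sync on add and remove) can be sketched in Python; the names are illustrative stand-ins, not part of javac:

```python
from collections import defaultdict, deque

class Todo:
    """A work queue whose items can also be consumed grouped by file."""

    def __init__(self):
        self.contents = deque()          # global FIFO order
        self.by_file = defaultdict(deque)  # per-file sub-queues

    def offer(self, env):
        # `env` is any object carrying a `.file` attribute.
        self.contents.append(env)
        self.by_file[env.file].append(env)

    def poll(self):
        # Removing from the global queue also removes from the
        # per-file view, like removeByFile() in the Java class.
        if not self.contents:
            return None
        env = self.contents.popleft()
        self.by_file[env.file].remove(env)
        if not self.by_file[env.file]:
            del self.by_file[env.file]
        return env

    def group_by_file(self):
        return {f: list(q) for f, q in self.by_file.items()}
```

Polling the global queue keeps the grouped view consistent, which is the invariant the Java `FileQueue` bookkeeping maintains.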
"""Act as a proxy for the Foo instance in this thread's context."""
from ThreadLocalStorage import currentContext
class FooProxy:
"""Act as a proxy for the Foo instance in this thread's context.
This is based on the ThreadLocalStorage module.
"""
def __getattr__(self, attr):
return getattr(currentContext().foo, attr)
def __setattr__(self, attr, value):
setattr(currentContext().foo, attr, value)
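`FooProxy` depends on the `ThreadLocalStorage` module, which is not shown. A self-contained sketch of the same idea, with `threading.local` standing in for `currentContext()` (the `Foo` class and names below are illustrative):

```python
import threading

_ctx = threading.local()  # stand-in for ThreadLocalStorage.currentContext()

class Foo:
    def __init__(self, name):
        self.name = name

class FooProxy:
    # Delegate every attribute access to the Foo bound to this thread.
    def __getattr__(self, attr):
        return getattr(_ctx.foo, attr)

    def __setattr__(self, attr, value):
        setattr(_ctx.foo, attr, value)

proxy = FooProxy()
results = {}

def worker(label):
    _ctx.foo = Foo(label)        # each thread installs its own Foo
    results[label] = proxy.name  # the shared proxy resolves per-thread

threads = [threading.Thread(target=worker, args=(n,)) for n in ("a", "b")]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Both threads read through the same `proxy` object yet see their own `Foo`, which is exactly what the delegation to the thread's context buys.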
|
import { createContext, useContext } from 'react';
import { Theme } from './../../themes/theme';
const themeContext = createContext<Theme | null>(null);
export const useTheme = () => {
const theme = useContext(themeContext);
if (theme === null) {
throw new Error('No theme passed');
}
return theme;
};
export default themeContext;
|
Evidence of eosinophil extracellular trap cell death in COPD: does it represent the trigger that switches on the disease?
In spite of the numerous studies on chronic obstructive pulmonary disease (COPD), the cellular and molecular basis of the disease’s development remains unclear. Neutrophils and eosinophils are known to be key players in COPD. Recently, neutrophil extracellular trap cell death (NETosis), a mechanism due to decondensation and extrusion of chromatin to form extracellular traps, has been demonstrated in COPD. However, there is limited knowledge about eosinophil extracellular trap cell death (EETosis) and its role in the pathogenesis of COPD. The aim of this study was to evaluate EETosis in stable COPD. Induced sputum samples obtained from healthy smokers, low exacerbation risk COPD A or B group patients, or high exacerbation risk COPD C or D group patients were included. Samples were examined using electron microscopy and immunofluorescence. Healthy smokers (n=10) and the COPD A group (n=19) exhibited neutrophilic or paucigranulocytic phenotypes, with NETosis being absent in these patients. In contrast, the COPD B group (n=29), with eosinophilic or mixed phenotypes, showed EETosis and incipient NETosis. The COPD C (n=18) and COPD D (n=13) groups were differentiated from the low exacerbation risk COPD groups by the abundant cellular debris, with the COPD C group having an eosinophilic pattern and numerous cells undergoing EETosis. A hallmark of this group was the abundant released membranes that often appeared phagocytosed by neutrophils, which coincidentally exhibited early NETosis changes. The COPD D group included patients with a neutrophilic or mixed pattern, with abundant neutrophil extracellular trap-derived material. This study is the first to demonstrate EETosis at different stages of stable COPD. The results suggest a role for eosinophils in COPD pathophysiology, especially at the beginning and during the persistence of the disease, regardless of whether the patient quit smoking, with EETosis debris probably triggering uncontrolled NETosis.
The main target of these findings should be young smokers with the potential to develop COPD.
Introduction
Chronic obstructive pulmonary disease (COPD) is currently the third cause of death worldwide, 1 as well as being a significant factor in terms of direct and indirect lost productivity. Although the current COPD epidemic is in part due to smoking, an important pathophysiological aspect is that not all smokers develop COPD, and the cellular and molecular basis to predict which smokers will develop the disease is completely unknown. Furthermore, the reason for the lack of achieving an attenuation of inflammatory signs, even years after having discontinued smoking, is still far from being clarified, thereby representing a real challenge for research related to this field. COPD is characterized by the chronic inflammation of the airways as a result of exposure to inhaled irritants and the consequent epithelial damage. Neutrophils are considered to be the principal cell responsible for orchestrating COPD, with several reports having pointed out that excessive proteolytic activity of neutrophil elastase contributes to the tissue damage observed in the airways of COPD. 2 However, this does not appear to be a major trigger of other clinical or pathophysiologic abnormalities in COPD. 3 Furthermore, Singh et al 4 observed a high prevalence of eosinophils in the blood and sputum of patients with COPD, but without any clinical relevance having been shown.
The death of inflammatory cells and their removal are critical events in affected tissues. Neutrophils usually die by apoptosis, a process that occurs by cleavage of DNA into oligonucleosomal size fragments, chromatin condensation, and the formation of apoptotic bodies, leading to phagocytosis by neighboring cells, thus preventing inflammation. 5 Recently, an alternative mechanism of death was described for neutrophils, 6,7 in which the cell releases its DNA forming neutrophil extracellular traps (NETs). 8 This original neutrophil extracellular trap cell death (NETosis) process, referred to as "NETosis" by Steinberg and Grinstein, is distinct from apoptosis and necrosis 9 and allows a complete neutrophil microbicidal function. NETs then need to be removed quickly to prevent further tissue damage or autoimmune phenomena, which have been described in pathologies that occur with alterations in the NETs resolution. 10 The occurrence of NETosis has been described in many inflammatory conditions of the lung, including cystic fibrosis, which is an aggravating factor of airway obstruction. 11,12 A recent study has demonstrated the presence of sterile NETs in the sputum of patients with stable and exacerbated COPD, which correlated with the severity of airflow limitation. 13 As different cell types such as eosinophils, mast cells, and macrophages can also die by a similar mechanism, the process was renamed "ETosis," meaning cell death with release of extracellular traps (ETs). 14,15 However, there is no evidence demonstrating the occurrence of eosinophil extracellular trap cell death (EETosis) or its contribution in COPD. Therefore, the aim of this study was to evaluate the occurrence of eosinophil death by EETosis in stable COPD.
Materials and methods
Subjects
Ex-smokers who had quit smoking for 6 months or longer, both COPD patients and healthy subjects with a smoking history of ≥20 pack-years, of both genders and aged over 60 years, were enrolled for this study after being admitted to the Inpatient Service of Pneumonology of the Sanatorio Allende and the Centre of Smoking Cessation of the Nuevo Hospital San Roque of Córdoba, Argentina. To be included, COPD patients had to fulfill requirements of a forced expiratory volume in one second (FEV1) <80% and FEV1/forced vital capacity (FVC) <70% following inhalation of a bronchodilator. In addition, only healthy ex-smokers with no respiratory symptoms and normal lung function tests (FEV1/FVC ≥70%) were included.
Study design
This was a cross-sectional study. After admission, individual patients were subjected to lung function examinations, 17 and samples of induced sputum (IS) were obtained.
IS and sample processing
To obtain an IS sample, 200 µg of salbutamol by inhalation with a metered-dose dispenser was administered to the patients ten minutes before sputum induction. Nebulization was carried out with NaCl saline solutions in increasing concentrations (3%, 4%, and 5%) for 7 minutes, with a 5 minute interval between each dose. Before attempting expectoration, patients were asked to remove nasal secretions and rinse the mouth and throat with running water; they were then asked to cough energetically to expectorate bronchial secretions. Before and after each nebulization, the peak expiratory flow (PEF) was measured; if it fell by less than 10%, patients were exposed to the next salt concentration. Conversely, if the PEF fall was 10%-20%, nebulization was prolonged for 7 minutes before continuing with the next hypertonic concentration. The protocol was interrupted if the patient evidenced symptoms of respiratory distress or showed a fall in PEF of more than 20% from baseline.
Expectorated sputum was processed immediately or kept at 4°C for a period no longer than 2 hours. The total volume of sputum obtained was weighed, and the mucous fraction different from the saliva was selected. This portion was dissolved in a volume of 1% dithiothreitol (DTT) (Sputolysin® Reagent Cat No 560000 Calbiochem, EMD Millipore, Billerica, MA, USA) equivalent to four times the weight of the selected fraction. The mixture was stirred for 15 seconds and transferred to a water bath for 15 minutes with continuous stirring. Then, 0.1 M phosphate-buffered saline (PBS) was added in equal volume to the DTT used, and the mixture was filtered through 48 µm gauze. The filtrate was centrifuged at 1,500 rpm for 10 minutes and the supernatant stored in Eppendorf tubes at -70°C for future biochemical analysis. For the cytological exam, the cellular pellet was suspended in 0.5 mL of PBS. The viability of the cells was determined with the Trypan blue exclusion technique. A sample quality was considered acceptable if cell viability was equal to or greater than 60%. Cells were counted in a Neubauer chamber and resuspended in PBS to a final concentration of 0.75×10⁶-1×10⁶ cells/mL. Cell monolayers (cytospins) were prepared with 50 µL aliquots on glass slides, and these were centrifuged at 500 rpm for 4 minutes in a cytocentrifuge (Giumelli®; Giumelli, Buenos Aires, Argentina).
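The protocol above fixes the key processing quantities numerically (DTT at four times the plug weight, PBS in equal volume to the DTT, resuspension to 0.75-1×10⁶ cells/mL). A small helper capturing that arithmetic, purely for illustration; the function name and the 1 g of sputum ≈ 1 mL assumption are mine, not the paper's:

```python
def sputum_processing_volumes(plug_weight_g, cell_count, target_conc=1.0e6):
    """Working volumes for the IS protocol: DTT at 4x the plug weight,
    an equal volume of PBS, and resuspension to `target_conc` cells/mL.
    Assumes 1 g of sputum is roughly 1 mL."""
    dtt_ml = 4 * plug_weight_g
    pbs_ml = dtt_ml                     # PBS added in equal volume to DTT
    resuspend_ml = cell_count / target_conc
    return {"dtt_ml": dtt_ml, "pbs_ml": pbs_ml, "resuspend_ml": resuspend_ml}

# e.g. a 0.5 g plug yielding 2 million cells
v = sputum_processing_volumes(plug_weight_g=0.5, cell_count=2.0e6)
```

For that example the plug gets 2 mL of DTT plus 2 mL of PBS, and the pellet is resuspended in 2 mL to reach 1×10⁶ cells/mL.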
After air drying, cytospins were stained with May-Grünwald-Giemsa stain to enable differential cell counts, including a count of neutrophils, eosinophils, and macrophages, with the percentages based on a minimum of 200 nonbronchial squamous cells. In order to characterize the cellular inflammatory pattern, we further classified patients according to the predominant cells in the cytospins of the IS as eosinophilic (≥3%), neutrophilic (≥67%), mixed (eosinophilic and neutrophilic predominance), or paucigranulocytic (neither eosinophilic nor neutrophilic). 18
Electron microscopy
Fresh sputum plugs were fixed in a mixture of 4% formaldehyde and 2% glutaraldehyde, omitting the DTT treatment, and embedded in Araldite (Electron Microscopy Sciences, Hatfield, PA, USA) for ultrastructural analysis. Thin sections were then mounted on nickel grids for observation in a Zeiss LEO 906 E electron microscope (Carl Zeiss, Oberkochen, Germany). Some samples were also embedded in LR-White (Electron Microscopy Sciences), a hydrophilic resin, in order to perform DAPI staining to evaluate the extracellular DNA traps.
Immunofluorescence
Immunofluorescence to analyze NETosis/EETosis occurrence was performed on unprocessed, freshly collected plugs of IS that had been allowed to attach for 5-10 minutes to slides without poly-L-lysine and then fixed in 4% paraformaldehyde. After permeabilization with 0.25% Triton X-100, slides were treated for 1 hour with 5% PBS-BSA to block nonspecific binding and incubated overnight at 4°C in a humidified chamber with a rabbit polyclonal anti-neutrophil elastase antibody (1/100; Abcam, Cambridge, MA, USA) or a mouse monoclonal anti-human eosinophil major basic protein antibody (1/100; Santa Cruz Biotechnology, Santa Cruz, CA, USA). Afterwards, the slides were washed three times with PBS, incubated for 1 hour with an Alexa 594-conjugated goat anti-rabbit or rabbit anti-mouse polyclonal antibody (Thermo Scientific, Waltham, MA, USA), and mounted using a fluoromount containing DAPI (4′,6-diamidino-2-phenylindole). Images were then obtained using an inverted confocal laser scanning microscope, FluoView FV 1000 (Olympus, Tokyo, Japan).
Extracellular trap quantification at the epifluorescence microscopy level
In order to quantify extranuclear DNA, 0.5-1 µm semithin LR-White-embedded IS sections were obtained at three different levels, and at least 10 pictures were taken at each level after DAPI staining. ImageJ (NIH, Bethesda, MD, USA) with home-built plugins was used to detect and quantify DNA. The amount of extranuclear/extracellular DNA was obtained by calculating the difference between the total DAPI signal and the well-recognized nuclear DAPI signal.
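The idea behind this quantification, extracellular DNA as total DAPI signal minus nuclear DAPI signal, can be sketched on a toy image (assuming NumPy; the simple intensity threshold and the function name are illustrative, not the authors' actual ImageJ plugins):

```python
import numpy as np

def extranuclear_dna_fraction(dapi: np.ndarray, nuclear_mask: np.ndarray,
                              threshold: float) -> float:
    """Fraction of the DAPI-positive area lying outside recognized nuclei.

    `dapi` is a 2-D intensity image, `nuclear_mask` a boolean mask of
    nuclei, `threshold` the intensity above which a pixel counts as DNA.
    """
    dna = dapi > threshold                      # all DNA-positive pixels
    total = dna.sum()
    if total == 0:
        return 0.0
    extranuclear = np.logical_and(dna, ~nuclear_mask).sum()
    return extranuclear / total

# Toy 4x4 image: 8 DNA-positive pixels, 6 of them inside the nuclear mask,
# so 2/8 = 0.25 of the signal is extranuclear.
img = np.zeros((4, 4))
img[:2, :] = 1.0
mask = np.zeros((4, 4), dtype=bool)
mask[0, :] = True
mask[1, :2] = True
print(extranuclear_dna_fraction(img, mask, 0.5))  # 0.25
```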
Extracellular trap quantification at the electron microscopy level
Thin sections (80 nm) from three different regions of Araldite-embedded IS samples were mounted on copper grids; to avoid repeated counting, only one section per grid was mounted. A total of 20 photographs, taken at 4,600×, were evaluated from each region, and the mean area of extracellular DNA in each picture was calculated with ImageJ. Extracellular DNA originating from eosinophils (EETosis) was recognized as compact structures with sharp borders and punctiform content, frequently associated with preserved eosinophil secretory granules or adhered debris. Another distinctive feature was the leftover condensed regions at the periphery or in the interior of the cells, with the presence of abundant cell membrane debris near the extracellular DNA material being the most distinct evidence of EETosis.
By contrast, extracellular nets originating from neutrophils were distinguishable by the presence of small secretory granules, organelles, lipid droplets, and small membranous vesicles associated with the chromatin material. In COPD D patients, the abundant extracellular nets were almost devoid of organelles; in this case, they were characterized by their irregular shapes and baggy borders. Neutrophils undergoing NETosis (NETotic) could appear homogeneously decondensed or could still preserve thin linear condensations without associated secretory granules (see Figures 1, 2 and 3 for representative images).

Figure 1 (caption): an eosinophil undergoing EETosis (Eo) is devoid of the plasma and nuclear membranes, with the nuclear chromatin being highly decondensed and surrounded by crystal-containing secretory granules (arrowheads; C). A Neu undergoing NETosis is shown in D, displaying a decondensed nucleus surrounded by typical, small electron-dense secretory granules (arrows). Panel E shows an apoptotic Neu, while in F, a normal Neu is seen in contact with granule (arrows and arrowheads)-associated extracellular traps (*) derived from NETosis and EETosis. Abbreviations: COPD, chronic obstructive pulmonary disease; Ma, macrophages; NETosis, neutrophil extracellular trap cell death; Neu, neutrophils; EETosis, eosinophil extracellular trap cell death.
Statistical analysis
Results are expressed as mean ± standard error of the mean. The data are presented as relative frequencies; differences between groups were examined using the Mann-Whitney U-test and analysis of variance with Tukey's test for non-normally distributed data, and the independent unpaired t-test for normally distributed data. The Infostat statistical software package (Infostat, FCA-UNC, Córdoba, Argentina) was used for the analysis, and a P-value <0.05 was considered statistically significant.

Table 1 exhibits the demographic features of the 89 patients. A significant difference in group age was found, with the severe HER group being the oldest compared to the remaining groups. The percentages of eosinophils and macrophages, as well as the inflammatory patterns (eosinophilic, neutrophilic, mixed, or paucigranulocytic), were significantly different among patient subgroups. As shown in Figure 4, group A was the only one that did not include patients with an eosinophilic phenotype. Surprisingly,
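The test-selection scheme described here can be sketched with SciPy (an illustrative sketch with made-up data; the Shapiro-Wilk normality check is an added assumption, since the paper does not say how normality was assessed):

```python
from scipy import stats

def compare_groups(a, b, alpha: float = 0.05):
    """Two-sample comparison following the scheme in the text: an unpaired
    t-test for normally distributed data, otherwise a Mann-Whitney U test."""
    def is_normal(x):
        _, p = stats.shapiro(x)
        return p > alpha

    if is_normal(a) and is_normal(b):
        name = "unpaired t-test"
        _, p = stats.ttest_ind(a, b)
    else:
        name = "Mann-Whitney U"
        _, p = stats.mannwhitneyu(a, b, alternative="two-sided")
    return name, p, p < alpha

# Hypothetical eosinophil percentages for two patient groups.
group1 = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0]
group2 = [5.0, 5.2, 4.9, 5.1, 5.3, 4.8, 5.0, 5.1]
name, p, significant = compare_groups(group1, group2)
print(name, significant)  # the difference is significant at P < 0.05
```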
Morphological analysis by electron microscopy and DNA labeling
In order to study the incidence of NETosis or EETosis in the different groups of COPD patients, an ultrastructural study of plugs isolated from IS without DTT treatment was performed. Neutrophils and eosinophils from HS mostly revealed a normal morphology when evaluated by immunofluorescence (Figure 5). On electron microscopy, neutrophils exhibited normal ultrastructure with numerous secretory granules (Figure 1A). In LER patients, most neutrophils of the COPD A group exhibited nuclear characteristics compatible with apoptosis, including margination of dense masses of chromatin beneath the nuclear membrane and sharp areas separated from the diffuse chromatin (Figure 1B); no evidence of NETosis was found. DAPI staining on semithin sections showed that neither HS nor COPD A patients exhibited extracellular DNA (Figure 6A and B); some apoptotic-like neutrophils were clearly visualized, in agreement with the ultrastructural findings in COPD A (Figure 6B). An interesting observation in the LER-COPD B group, which exhibited the eosinophilic phenotype, was the presence of EETosis. This could be identified by the presence of eosinophil-like secretory granules associated with
the nucleus, which exhibited decondensed chromatin and was already denuded of the nuclear membrane (Figure 1C). EETosis occurrence was confirmed by immunofluorescence, with typical images of extruded DNA partly associated with the major basic protein (MBP) (Figure 5B and C). The typical morphology of NETosis could also be detected in the IS of COPD B patients, clearly recognized by the presence of neutrophil secretory granules associated with decondensed nuclear material (Figure 1D). By immunofluorescence, neutrophil elastase was found to colocalize with DNA in some pre-NETotic or NETotic neutrophils (Figure 5E), together with some extracellular DNA intermingled with inflammatory cells (Figure 6C). However, extracellular traps forming classical NETs were rarely seen in the COPD B group, which appeared quite heterogeneous with regard to neutrophil morphology and functional state. Some patients in this group mainly exhibited numerous apoptotic cells (Figure 1E), while others displayed highly preserved and active neutrophils coexisting with the typical morphology of cell death by NETosis and EETosis (Figure 1F).
There were clear differences between HER and LER patients regarding the presence of abundant cellular debris in their IS, with some of these variations depending on the cellularity. In the HER COPD C group, which included patients with an eosinophilic inflammatory pattern (Table 1), numerous burst eosinophils were observed with signs of EETosis, such as expelled nuclei still associated with eosinophil secretory granules. However, in contrast with NETosis, the nuclei did not seem decondensed to the same extent as in neutrophils and instead appeared as round masses (Figure 2A) or as irregular DAPI-stained structures in semithin sections (Figure 6D). A significant observation was that the higher the number of eosinophils undergoing EETosis, the greater the amount of cell debris, especially in the form of numerous free membranes and some detached cilia (Figure 2B), with macrophages being observed to phagocytose small portions of decondensed nuclear material and free membranes (Figure 2B). The neutrophils in these patients frequently seemed to be pre-NETotic, as indicated by chromatin decondensation and the loss of nuclear poly-lobulation, both events that precede the extrusion of the nuclear material, while still preserving their nuclear and plasma membranes (Figure 2C). Pre-NETotic neutrophils could also often be seen incorporating cell debris (chromatin and membranes) by phagocytosis.
In HER COPD D patients with a neutrophilic inflammatory pattern, normal neutrophils frequently appeared associated with NET-like structures that could be identified by the presence of abundant decondensed nuclear material, with a few remaining electron-dense granules and small membrane-limited vesicles (Figures 2D, 5G and H). We next quantified the proportion of EETosis and NETosis at the ultrastructural level. The analysis confirmed that EETosis is predominant in COPD B and C, whereas NETosis is particularly abundant in COPD D (Figure 3).
These findings were then confirmed by immunofluorescence (Figure 5F). DAPI staining revealed that extracellular traps gathered as fibers with a NET-like distribution (Figure 6E), and the analysis of IS by DAPI staining made it possible to distinguish the LER and HER groups based on their different quality-quantity patterns of extracellular nuclear material. Quantification of the DAPI signal revealed a significant increase in COPD C and D patients (Figure 6F).
Discussion
This study focuses on aspects of inflammation that could be associated with the early symptoms and disease progression of COPD. Here, we demonstrated the occurrence of eosinophil death by EETosis in stable LER symptomatic COPD B and in HER patients with eosinophilic or mixed patterns. Furthermore, in high-exacerbation COPD patients, abundant extracellular traps were observed, thereby indicating an increase in NETosis and EETosis occurrence.
To date, most of the research has addressed the neutrophil in COPD. For instance, the ECLIPSE study evaluated the relationship between sputum neutrophils and FEV1, but only found a weak association.1 This is in agreement with our study, in which the neutrophil percentage did not change significantly among the different groups of patients. Recently, Grabcanovic-Musija et al13 concluded that the severity of lung function impairment in COPD was associated with neutrophil death by NETosis. However, they only evaluated neutrophils and included patients in both exacerbated and stable stages of the disease.
Our study confirms the occurrence of NETosis but suggests that eosinophils might also play an important role in COPD. This was evidenced by the fact that the asymptomatic COPD A group, which lacked eosinophils, did not exhibit NETosis. Conversely, all LER symptomatic COPD B patients with eosinophilic or mixed cellular phenotypes showed an early presence of EETosis, thus revealing a factor that probably contributes to triggering the disease symptoms, in agreement with Singh et al.4 The HER COPD subset exhibited either combined eosinophil and neutrophil extracellular traps or only those derived from neutrophils. However, when eosinophils and EETosis coexisted in HER patients, IS showed abundant free organelles and membranes, indicative of greater cellular damage.
Several reports have focused on clarifying the beginning of the "burning" process of COPD. 19 Our observations have led us to place eosinophils not only at the triggering of the COPD disease, but probably in its maintenance as well, in spite of smoking cessation, as a consequence of the prolonged accumulation of eosinophils and their debris in the airways. 16,20 Therefore, cell debris of eosinophils undergoing EETosis, especially chromatin, accumulates in airways and could become a chemotactic stimulus for sterile inflammation by recruiting neutrophils to phagocytose cell debris. In this way, the process of NETosis could be started and self-perpetuated over the long time period of disease evolution in the COPD D subset, even if EETosis is no longer present, as occurs in many other chronic diseases. 21,22 Even though a longitudinal study is needed to demonstrate such a progression, it is supported by Agusti et al, 23 who concluded, based on the cohort study ECLIPSE, that severity of disease progressed in COPD B and C patients.
Although patients with less than 10 years history of smoking were reported to have normal lung function, 24 the presence of eosinophils in their IS samples (shown in the present study) indicates the importance of evaluating inflammatory cells in the IS of HS and LER patients in order to initiate a specific anti-inflammatory treatment in the early stages of the disease. Therefore, we hypothesize that eosinophils may be implicated in the early development of COPD and that the presence of eosinophils might indicate the smokers who are most at risk of developing the disease.
It is well known that cigarette smoke creates a complex microenvironment containing over 4,000 toxins and leads to proinflammatory cytokine induction in macrophages. In addition, large quantities of reactive oxygen species (ROS) are produced in the burning cigarette, and these are able to damage the epithelial cells lining the airways.25 ROS also activate intracellular signaling cascades leading to inflammatory gene activation (IL-8 and TNF-α) that promotes chronic immune cell recruitment and inflammation. Therefore, EETosis induction in COPD patients, as described in this study, is probably a secondary feature in patients genetically predisposed to having eosinophils in the airways. According to our observations, eosinophils would be very sensitive to these toxins and respond widely by triggering EETosis, as indicated by the lack of preserved eosinophils.
The cytotoxic effects of eosinophils may favor a more symptomatic disease and also contribute to increased exacerbations, and therefore faster progression. Free destroyed eosinophil secretory granules can be aggressive toward the epithelium and endothelium, as the proteins they contain contribute to tissue damage and dysfunction.26,27 In addition, eosinophil histone release, either from nuclear or mitochondrial DNA, has been demonstrated to have a killing activity on pathogens and mammalian cells. Extracellular DNA extrusion from eosinophils was verified in Crohn's disease, where IL-5 or IFN-γ, followed by lipopolysaccharide, induced these cells to release mitochondrial DNA independently of cell death.32 In our study, a different methodology was used, and eosinophil death was observed with the typical ultrastructural changes that precede EETosis, as described before by many authors (for example, Fuchs et al7) for neutrophils. The cigarette-smoking-induced microenvironment in COPD airways may explain the difference, as it does trigger cell death.
Although inhaled corticosteroids are the recommended treatment for HER, we showed that a significant number of HER patients still had elevated percentages of eosinophils in spite of corticosteroid therapy. In line with this, other authors reported that inhaled corticosteroids failed to suppress inflammation in COPD patients.33 Our observations by electron microscopy raise doubts about the efficacy of corticosteroids in controlling eosinophils once these are destroyed by EETosis, thereby strengthening the importance of early pharmacological intervention.
In summary, we demonstrated, for the first time, the presence of EETosis, in parallel with NETosis, in COPD at different stages, in subjects at risk and with stable disease status. EETosis was useful to explain the role of the eosinophil in the pathophysiology, especially at the beginning and during COPD progression, regardless of whether the patient quit smoking. It is not necessary, however, to evaluate its presence in clinical practice, as it is sufficient to study the occurrence of eosinophils in the IS of young healthy smokers without any spirometric changes, since this might predict the development of COPD and guide effective early treatment. Finally, our findings contribute by characterizing clear differences among groups, in agreement with their clinical manifestations. These results could contribute to preventing the development of COPD in young smokers with a predisposition to the disease, with novel therapeutic techniques still being needed to treat the advanced disease.
/**
* Returns whether the specified plug-in is installed.
*
* @param shortName
* the plug-in to check
* @return <code>true</code> if the specified plug-in is installed, <code>false</code> if not.
*/
public static boolean isPluginInstalled(final String shortName) {
Hudson instance = Hudson.getInstance();
if (instance != null) {
return instance.getPlugin(shortName) != null;
}
// Jenkins is not running (e.g. in a unit test): assume the plug-in is available.
return true;
} |
#ifndef DataFormats_Common_EndPathStatus_h
#define DataFormats_Common_EndPathStatus_h
namespace edm
{
class EndPathStatus {
public:
EndPathStatus() {}
};
}
#endif
|
import Promise = require('any-promise');
import { Request, Response, PopsicleError } from 'popsicle';
declare function popsicleRetry(retries?: (error: PopsicleError, response: Response, iter: number) => number): (request: Request, next: () => Promise<Response>) => Promise<any>;
declare namespace popsicleRetry {
function retryAllowed(error: PopsicleError, response: Response): boolean;
function retries(count?: number, isRetryAllowed?: typeof retryAllowed): (error: PopsicleError, response: Response, iter: number) => number;
}
export = popsicleRetry;
|
package tr.cobanse.batak.server.action;
import java.util.List;
import java.util.stream.Collectors;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import tr.cobanse.batak.common.RequestMessage;
import tr.cobanse.batak.common.ResponseMessage;
import tr.cobanse.batak.common.ResponseType;
import tr.cobanse.batak.server.GameContext;
import tr.cobanse.batak.server.game.GameRoom;
import tr.cobanse.batak.server.util.JsonUtil;
public class ListGame implements RequestCommand {
private Logger logger = LoggerFactory.getLogger(getClass().getName());
@Override
public ResponseMessage execute(RequestMessage requestMessage, GameContext gameContext) {
logger.debug("executing action {}" , JsonUtil.toJson(requestMessage));
List<GameRoom> availableGames = gameContext.listGames();
List<String> gameIds = availableGames.stream().map(GameRoom::getGameId).collect(Collectors.toList());
ResponseMessage responseMessage = new ResponseMessage(ResponseType.LISTGAME, gameIds, null, null, null);
logger.debug("action executed. returning message is {} " , responseMessage);
return responseMessage;
}
}
|
def plot_twinboundary_shear_bandstructures(
        self,
        npoints: int = 51,
        with_eigenvectors: bool = False,
        use_reciprocal_lattice: bool = True):
    """Plot phonon band structures of the twinboundary shear structures.

    Args:
        npoints: Number of q-points sampled along each band path.
        with_eigenvectors: Whether to compute phonon eigenvectors.
        use_reciprocal_lattice: Whether band paths are given in
            reciprocal lattice coordinates.
    """
    import matplotlib.pyplot as plt
    from twinpy.common.band_path \
            import get_labels_band_paths_from_seekpath
    from twinpy.plot.band_structure import BandsPlot

    self._check_twinboundary_shear_analyzer_is_set()
    tb_shear = self._twinboundary_shear_analyzer
    base_phn = self._twinboundary_shear_analyzer.phonon_analyzers[0]
    base_cell = base_phn.primitive_cell
    labels, base_band_paths = \
            get_labels_band_paths_from_seekpath(cell=base_cell)
    band_structures = \
            tb_shear.get_band_structures(
                base_band_paths=base_band_paths,
                labels=labels,
                npoints=npoints,
                with_eigenvectors=with_eigenvectors,
                use_reciprocal_lattice=use_reciprocal_lattice,
                )
    bsp = BandsPlot(band_structures=band_structures)
    _, _ = bsp.plot_band_structures()
    plt.show()
/* In the name of Allah */
#include<bits/stdc++.h>
using namespace std;
const int N = 1 << 18;
int n, cnt[N], par[N];
long long ans;
// Disjoint-set union; par[root] stores the negative size of the set.
int get_par(int u) {
	return par[u] < 0? u: par[u] = get_par(par[u]);
}
// Merge the sets containing u and v; returns false if already joined.
bool merge(int u, int v) {
	if ((u = get_par(u)) == (v = get_par(v)))
		return false;
	if (par[u] < par[v])
		swap(u, v);
	return par[u] += par[v], par[v] = u, true;
}
int main() {
cnt[0]++;
scanf("%d", &n);
memset(par, -1, sizeof par);
for (int i = 0; i < n; i++) {
int x;
scanf("%d", &x);
cnt[x]++, ans -= x;
}
for (int i = N - 1; i; i--)
for (int j = i; j; --j &= i)
if (cnt[j] && cnt[i ^ j] && merge(j, i ^ j))
ans += (cnt[j] + cnt[i ^ j] - 1LL) * i, cnt[j] = cnt[i ^ j] = 1;
printf("%lld", ans);
}
|
/* If mktime_ok failed, display the failed time values,
and provide possible hints. Example output:
date: error: invalid date/time value:
date: user provided time: '(Y-M-D) 2006-04-02 02:45:00'
date: normalized time: '(Y-M-D) 2006-04-02 03:45:00'
date: __
date: possible reasons:
date: non-existing due to daylight-saving time;
date: numeric values overflow;
date: missing timezone;
*/
static void
debug_mktime_not_ok (struct tm const *tm0, struct tm const *tm1,
parser_control const *pc, bool time_zone_seen)
{
char tmp[DBGBUFSIZE];
int i;
const bool eq_sec = (tm0->tm_sec == tm1->tm_sec);
const bool eq_min = (tm0->tm_min == tm1->tm_min);
const bool eq_hour = (tm0->tm_hour == tm1->tm_hour);
const bool eq_mday = (tm0->tm_mday == tm1->tm_mday);
const bool eq_month = (tm0->tm_mon == tm1->tm_mon);
const bool eq_year = (tm0->tm_year == tm1->tm_year);
const bool dst_shift = eq_sec && eq_min && !eq_hour
&& eq_mday && eq_month && eq_year;
if (!pc->parse_datetime_debug)
return;
dbg_printf (_("error: invalid date/time value:\n"));
dbg_printf (_(" user provided time: '%s'\n"),
debug_strfdatetime (tm0, pc, tmp, sizeof tmp));
dbg_printf (_(" normalized time: '%s'\n"),
debug_strfdatetime (tm1, pc, tmp, sizeof tmp));
i = snprintf (tmp, sizeof tmp,
" %4s %2s %2s %2s %2s %2s",
eq_year ? "" : "----",
eq_month ? "" : "--",
eq_mday ? "" : "--",
eq_hour ? "" : "--",
eq_min ? "" : "--",
eq_sec ? "" : "--");
if (0 <= i)
{
if (sizeof tmp - 1 < i)
i = sizeof tmp - 1;
while (0 < i && tmp[i - 1] == ' ')
--i;
tmp[i] = '\0';
}
dbg_printf ("%s\n", tmp);
dbg_printf (_(" possible reasons:\n"));
if (dst_shift)
dbg_printf (_(" non-existing due to daylight-saving time;\n"));
if (!eq_mday && !eq_month)
dbg_printf (_(" invalid day/month combination;\n"));
dbg_printf (_(" numeric values overflow;\n"));
dbg_printf (" %s\n", (time_zone_seen ? _("incorrect timezone")
: _("missing timezone")));
} |
UPDATED
‘Perpetuating an Islamophobic, hostile campus climate …’
The president of the San Diego State College Republicans is facing threats and demands for his resignation over a letter from the political club calling on the school’s Muslim Student Association to condemn the recent terror attack in Barcelona.
The letter, signed by SDSU College Republicans Chairman Brandon Jones, stated in part that “until radical Islamic terrorism is disavowed by the Muslim Student Organization at SDSU, we cannot move forward in creating an inclusive environment for all students on campus.” It added the Muslim Student Association’s leadership should resign if they do not disavow Islamic terrorism.
The letter caused an uproar.
The national Muslim Student Association expressed support for the San Diego State chapter for “their solidarity, strength and perseverance in the face of ignorance and hate.”
The Young Democratic Socialists of SDSU responded by declaring: “We condemn the San Diego State College Republicans’ disgraceful statement towards the SDSU Muslim Student Association and the SDSU Muslim community. Retract and apologize now.”
The Transfronterizo Alliance Student Organization, which describes itself as working to create an “inclusive campus environment for SDSU students who live a transborder lifestyle,” joined the chorus.
Its statement condemned the College Republicans for “perpetuating an Islamophobic, hostile campus climate for our Muslim student community on campus.”
“We demand a safe and inclusive campus environment for all San Diego State University students and will not tolerate further racist, xenophobic and Islamophobic letters by the San Diego State College Republicans toward any other student group or campus community,” the group said on Facebook.
Transfronterizo’s statement calls for Jones’ removal as head of the College Republicans, calling his leadership “extremist.”
Jones, for his part, stands behind his group’s statement. He said accusations that it was just trolling the Muslim Student Association are off base.
“We weren’t trolling anyone. We were looking for them to condemn the radical Islamic terrorist attacks much like my organization condemned the acts of white supremacy and neo-nazism in Charlottesville,” Jones said in a message Tuesday to The College Fix.
Jones added he and his group are standing strong.
“The San Diego State College Republicans are standing by our statement we sent to the SDSU Muslim Student Association. As far as the distaste that some students have expressed towards me personally, I am baffled by the hypocrisy that comes from the left. They have shown their true colors and have exposed their own double standard,” Jones told The Fix.
In addition to fielding calls for his resignation, Jones said he has been told to kill himself in an anonymous email, and received a threat on his cell phone after he published his telephone number as part of his group’s Muslim Student Association press release.
The text stated: “I hope you rot in hell. We’re coming for you this week, you vile piece of a human being. Watch your back every step you take. SDSU campus will be the war zone against you inhumane rats.”
Now there are rumors that Jones will be the subject of a campus protest sometime this week, he added. But the 21-year-old political science major said he’s not backing down, despite claims he’s been reported to diversity and inclusion campus administrators and the dean of students.
CORRECTION: The original article misspelled the first name of the chairman of the SDSU College Republicans. His name is Brandon. The article has been amended accordingly.
Like The College Fix on Facebook / Follow us on Twitter |
def last_modified(file_or_folder: str) -> arrow.Arrow:
    """Return the most recent modification time of any file under
    ``file_or_folder``, skipping symlinks."""
    try:
        return max(
            mtime(f) for f in iter_files(file_or_folder) if not os.path.islink(f)
        )
    except ValueError:
        # max() raises ValueError when the generator yields nothing.
        raise CommandError(f"no files in folder: {file_or_folder}")
/**
* Creates a new PatternConverter.
*
* @param converterId converterId.
* @param currentLiteral literal to be used if converter is unrecognized or following converter
* if converterId contains extra characters.
* @param rules map of stock pattern converters keyed by format specifier.
* @param options converter options.
* @return converter or null.
*/
private PatternConverter createConverter(final String converterId, final StringBuilder currentLiteral,
final Map<String, Class<PatternConverter>> rules,
final List<String> options) {
String converterName = converterId;
Class<PatternConverter> converterClass = null;
for (int i = converterId.length(); i > 0 && converterClass == null; i--) {
converterName = converterName.substring(0, i);
if (converterClass == null && rules != null) {
converterClass = rules.get(converterName);
}
}
if (converterClass == null) {
LOGGER.error("Unrecognized format specifier [" + converterId + "]");
return null;
}
final Method[] methods = converterClass.getDeclaredMethods();
Method newInstanceMethod = null;
for (final Method method : methods) {
if (Modifier.isStatic(method.getModifiers()) && method.getDeclaringClass().equals(converterClass) &&
method.getName().equals("newInstance")) {
if (newInstanceMethod == null) {
newInstanceMethod = method;
} else if (method.getReturnType().equals(newInstanceMethod.getReturnType())) {
LOGGER.error("Class " + converterClass + " cannot contain multiple static newInstance methods");
return null;
}
}
}
if (newInstanceMethod == null) {
LOGGER.error("Class " + converterClass + " does not contain a static newInstance method");
return null;
}
final Class<?>[] parmTypes = newInstanceMethod.getParameterTypes();
final Object [] parms = parmTypes.length > 0 ? new Object[parmTypes.length] : null;
if (parms != null) {
int i = 0;
boolean errors = false;
for (final Class<?> clazz : parmTypes) {
if (clazz.isArray() && clazz.getName().equals("[Ljava.lang.String;")) {
final String[] optionsArray = options.toArray(new String[options.size()]);
parms[i] = optionsArray;
} else if (clazz.isAssignableFrom(Configuration.class)) {
parms[i] = config;
} else {
LOGGER.error("Unknown parameter type " + clazz.getName() + " for static newInstance method of " +
converterClass.getName());
errors = true;
}
++i;
}
if (errors) {
return null;
}
}
try {
final Object newObj = newInstanceMethod.invoke(null, parms);
if (newObj instanceof PatternConverter) {
currentLiteral.delete(0, currentLiteral.length()
- (converterId.length() - converterName.length()));
return (PatternConverter) newObj;
} else {
LOGGER.warn("Class " + converterClass.getName() + " does not extend PatternConverter.");
}
} catch (final Exception ex) {
LOGGER.error("Error creating converter for " + converterId, ex);
}
return null;
} |
/**
* Helper class to register JSR-310 specific {@link Converter} implementations in case the we're running on Java 8.
*
* @author Mark Paluch
*/
public abstract class Jsr310Converters {
private static final boolean JAVA_8_IS_PRESENT = ClassUtils.isPresent("java.time.LocalDateTime",
Jsr310Converters.class.getClassLoader());
/**
* Returns the converters to be registered. Will only return converters in case we're running on Java 8.
*
* @return
*/
public static Collection<Converter<?, ?>> getConvertersToRegister() {
if (!JAVA_8_IS_PRESENT) {
return Collections.emptySet();
}
List<Converter<?, ?>> converters = new ArrayList<>();
converters.add(new LocalDateTimeToBytesConverter());
converters.add(new BytesToLocalDateTimeConverter());
converters.add(new LocalDateToBytesConverter());
converters.add(new BytesToLocalDateConverter());
converters.add(new LocalTimeToBytesConverter());
converters.add(new BytesToLocalTimeConverter());
converters.add(new ZonedDateTimeToBytesConverter());
converters.add(new BytesToZonedDateTimeConverter());
converters.add(new InstantToBytesConverter());
converters.add(new BytesToInstantConverter());
converters.add(new ZoneIdToBytesConverter());
converters.add(new BytesToZoneIdConverter());
converters.add(new PeriodToBytesConverter());
converters.add(new BytesToPeriodConverter());
converters.add(new DurationToBytesConverter());
converters.add(new BytesToDurationConverter());
return converters;
}
public static boolean supports(Class<?> type) {
if (!JAVA_8_IS_PRESENT) {
return false;
}
return Arrays.<Class<?>> asList(LocalDateTime.class, LocalDate.class, LocalTime.class, Instant.class,
ZonedDateTime.class, ZoneId.class, Period.class, Duration.class).contains(type);
}
/**
* @author Mark Paluch
* @since 1.7
*/
@WritingConverter
static class LocalDateTimeToBytesConverter extends StringBasedConverter implements Converter<LocalDateTime, byte[]> {
@Override
public byte[] convert(LocalDateTime source) {
return fromString(source.toString());
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@ReadingConverter
static class BytesToLocalDateTimeConverter extends StringBasedConverter implements Converter<byte[], LocalDateTime> {
@Override
public LocalDateTime convert(byte[] source) {
return LocalDateTime.parse(toString(source));
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@WritingConverter
static class LocalDateToBytesConverter extends StringBasedConverter implements Converter<LocalDate, byte[]> {
@Override
public byte[] convert(LocalDate source) {
return fromString(source.toString());
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@ReadingConverter
static class BytesToLocalDateConverter extends StringBasedConverter implements Converter<byte[], LocalDate> {
@Override
public LocalDate convert(byte[] source) {
return LocalDate.parse(toString(source));
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@WritingConverter
static class LocalTimeToBytesConverter extends StringBasedConverter implements Converter<LocalTime, byte[]> {
@Override
public byte[] convert(LocalTime source) {
return fromString(source.toString());
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@ReadingConverter
static class BytesToLocalTimeConverter extends StringBasedConverter implements Converter<byte[], LocalTime> {
@Override
public LocalTime convert(byte[] source) {
return LocalTime.parse(toString(source));
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@WritingConverter
static class ZonedDateTimeToBytesConverter extends StringBasedConverter implements Converter<ZonedDateTime, byte[]> {
@Override
public byte[] convert(ZonedDateTime source) {
return fromString(source.toString());
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@ReadingConverter
static class BytesToZonedDateTimeConverter extends StringBasedConverter implements Converter<byte[], ZonedDateTime> {
@Override
public ZonedDateTime convert(byte[] source) {
return ZonedDateTime.parse(toString(source));
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@WritingConverter
static class InstantToBytesConverter extends StringBasedConverter implements Converter<Instant, byte[]> {
@Override
public byte[] convert(Instant source) {
return fromString(source.toString());
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@ReadingConverter
static class BytesToInstantConverter extends StringBasedConverter implements Converter<byte[], Instant> {
@Override
public Instant convert(byte[] source) {
return Instant.parse(toString(source));
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@WritingConverter
static class ZoneIdToBytesConverter extends StringBasedConverter implements Converter<ZoneId, byte[]> {
@Override
public byte[] convert(ZoneId source) {
return fromString(source.toString());
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@ReadingConverter
static class BytesToZoneIdConverter extends StringBasedConverter implements Converter<byte[], ZoneId> {
@Override
public ZoneId convert(byte[] source) {
return ZoneId.of(toString(source));
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@WritingConverter
static class PeriodToBytesConverter extends StringBasedConverter implements Converter<Period, byte[]> {
@Override
public byte[] convert(Period source) {
return fromString(source.toString());
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@ReadingConverter
static class BytesToPeriodConverter extends StringBasedConverter implements Converter<byte[], Period> {
@Override
public Period convert(byte[] source) {
return Period.parse(toString(source));
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@WritingConverter
static class DurationToBytesConverter extends StringBasedConverter implements Converter<Duration, byte[]> {
@Override
public byte[] convert(Duration source) {
return fromString(source.toString());
}
}
/**
* @author Mark Paluch
* @since 1.7
*/
@ReadingConverter
static class BytesToDurationConverter extends StringBasedConverter implements Converter<byte[], Duration> {
@Override
public Duration convert(byte[] source) {
return Duration.parse(toString(source));
}
}
} |
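Every converter pair above uses the same String-based scheme: serialize through the value's `toString()`, then reconstruct via the type's static `parse(...)` (or `of(...)` for `ZoneId`) factory. A standalone sketch of that round trip using only the JDK follows; the helper names `toBytes`, `readLocalDateTime`, and `readDuration` are illustrative stand-ins for the `StringBasedConverter` base class, which is not shown in this excerpt:

```java
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.time.LocalDateTime;

// Round-trip demo of the String-based scheme used by the converters above:
// value -> toString() -> UTF-8 bytes on write, bytes -> String -> parse(...) on read.
public class Jsr310RoundTrip {

	// Write path: mirrors fromString(source.toString()) in the writing converters.
	static byte[] toBytes(Object value) {
		return value.toString().getBytes(StandardCharsets.UTF_8);
	}

	// Read path: mirrors LocalDateTime.parse(toString(source)) in BytesToLocalDateTimeConverter.
	static LocalDateTime readLocalDateTime(byte[] source) {
		return LocalDateTime.parse(new String(source, StandardCharsets.UTF_8));
	}

	// Read path: mirrors Duration.parse(toString(source)) in BytesToDurationConverter.
	static Duration readDuration(byte[] source) {
		return Duration.parse(new String(source, StandardCharsets.UTF_8));
	}
}
```

Because the ISO-8601 text produced by `toString()` is exactly what the corresponding `parse(...)` factory accepts, the round trip is lossless for each supported type.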