Simulation Research on Parameters of a Compressed Air Energy Storage System With the increasing demand for energy in our country, new energy sources led by wind power have been developed vigorously. However, the intermittent and unstable character of wind power generation poses huge challenges to the large-scale grid integration of new energy. Compressed air energy storage (CAES) can balance the intermittency and volatility of new energy thanks to its peak-shaving and valley-filling characteristics, and it has great development prospects. This article first reviews the development status and working principle of CAES, and then builds a system simulation based on existing data from a northwest wind power plant. The results show that both the thermal storage tank temperature and the system work first increase and then stabilize as the number of system cycles grows. The simulation results agree well with the actual situation, which also verifies the feasibility of the system.
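The cycle behavior described above (tank temperature and work rising at first, then leveling off) can be illustrated with a deliberately simplified first-order storage model. The sketch below is not the paper's simulation; the heat-gain, extraction, and loss coefficients are invented purely to show the qualitative trend.

# Toy illustration only: a first-order thermal-storage model, not the paper's
# CAES simulation. All coefficients (heat added per charge, extraction and
# loss fractions) are invented for illustration.
def run_cycles(n_cycles, t_ambient=20.0, heat_gain=120.0, extract=0.75, loss=0.05):
    """Return the storage-tank temperature after each charge/discharge cycle."""
    temps = []
    t_tank = t_ambient
    for _ in range(n_cycles):
        t_tank += heat_gain                       # charging: compression heat stored
        t_tank -= extract * (t_tank - t_ambient)  # discharging: heat withdrawn to reheat air
        t_tank -= loss * (t_tank - t_ambient)     # standing losses to the environment
        temps.append(t_tank)
    return temps

print(run_cycles(10))  # rises for the first few cycles, then settles near a fixed point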
|
import { data as countries } from 'emoji-flags';
export const dataArmamdo = [
{
name: '<NAME>',
series: [
{
name: 'Germany111',
value: 40632
},
{
name: 'United States111',
value: 49737
},
{
name: 'France111',
value: 36745
},
{
name: 'United Kingdom111',
value: 36240
},
{
name: 'Spain111',
value: 33000
},
{
name: 'Italy111',
value: 35800
},
{
name: 'Germany11',
value: 40632
},
{
name: 'United States11',
value: 49737
},
{
name: 'France11',
value: 36745
},
{
name: 'United Kingdom11',
value: 36240
},
{
name: 'Spain11',
value: 33000
},
{
name: 'Italy11',
value: 35800
},
{
name: 'Germany',
value: 40632
},
{
name: 'United States',
value: 49737
},
{
name: 'France',
value: 36745
},
{
name: 'United Kingdom',
value: 36240
},
{
name: 'Spain',
value: 33000
},
{
name: 'Italy',
value: 35800
}
]
},
];
export function generateData2() {
console.log(dataArmamdo);
return [...dataArmamdo];
}
|
/** Pojo that used as key value for object's inlining benchmark. */
static class TestKey {
/** */
private final int id;
/** */
public TestKey(int id) {
this.id = id;
}
}
|
import asyncio
import json
import unittest
from typing import Awaitable
from decimal import Decimal
from unittest.mock import patch, AsyncMock
import aiohttp
from aioresponses import aioresponses
from hummingbot.client.config.fee_overrides_config_map import fee_overrides_config_map
from hummingbot.client.config.global_config_map import global_config_map
from hummingbot.connector.connector.uniswap_v3.uniswap_v3_connector import UniswapV3Connector
from hummingbot.core.data_type.trade_fee import TokenAmount
class UniswapV3ConnectorTest(unittest.TestCase):
def setUp(self) -> None:
super().setUp()
self.base = "COINALHPA"
self.quote = "HBOT"
self.trading_pair = f"{self.base}-{self.quote}"
self.wallet_key = "someWalletKey"
self.gateway_host = "gtw_host"
self.gateway_port = 123
global_config_map["gateway_api_host"].value = self.gateway_host
global_config_map["gateway_api_port"].value = self.gateway_port
self.connector = UniswapV3Connector(
trading_pairs=[self.trading_pair],
wallet_private_key=self.wallet_key,
ethereum_rpc_url="https://<network>.infura.io/v3/YOUR-PROJECT-ID",
)
self.ev_loop = asyncio.get_event_loop()
def async_run_with_timeout(self, coroutine: Awaitable, timeout: float = 1):
ret = self.ev_loop.run_until_complete(asyncio.wait_for(coroutine, timeout))
return ret
@aioresponses()
@patch(
"hummingbot.connector.connector.uniswap_v3.uniswap_v3_connector.UniswapV3Connector._http_client",
new_callable=AsyncMock
)
def test_get_quote_price_updates_fee_overrides_config_map(self, mocked_api, mocked_http_client):
mocked_http_client.return_value = aiohttp.ClientSession()
url = f"https://{self.gateway_host}:{self.gateway_port}/eth/uniswap/price"
mock_response = {
"price": 10,
"gasLimit": 30000,
"gasPrice": 1,
"gasCost": 2,
"swaps": [],
}
mocked_api.post(url, body=json.dumps(mock_response))
self.connector._account_balances = {"ETH": Decimal("10000")}
self.connector._allowances = {self.quote: Decimal("10000")}
self.async_run_with_timeout(
self.connector.get_quote_price(self.trading_pair, is_buy=True, amount=Decimal("2"))
)
self.assertEqual(
fee_overrides_config_map["uniswap_v3_maker_fixed_fees"].value, [TokenAmount("ETH", Decimal(str("2")))]
)
self.assertEqual(
fee_overrides_config_map["uniswap_v3_taker_fixed_fees"].value, [TokenAmount("ETH", Decimal(str("2")))]
)
|
# Copyright (c) 2013, Web Notes Technologies Pvt. Ltd. and Contributors
# MIT License. See license.txt
from __future__ import unicode_literals
import webnotes
from webnotes.utils import scrub_urls
class DocType():
def __init__(self, doc, doclist=None):
self.doc = doc
self.doclist = doclist or []
def get_parent_bean(self):
return webnotes.bean(self.doc.parenttype, self.doc.parent)
def update_parent(self):
"""update status of parent Lead or Contact based on who is replying"""
observer = self.get_parent_bean().get_method("on_communication")
if observer:
observer()
def on_update(self):
self.update_parent()
@webnotes.whitelist()
def make(doctype=None, name=None, content=None, subject=None, sent_or_received = "Sent",
sender=None, recipients=None, communication_medium="Email", send_email=False,
print_html=None, attachments='[]', send_me_a_copy=False, set_lead=True, date=None):
# add to Communication
sent_via = None
# since we are using fullname and email,
# if the fullname has any incompatible characters,formataddr can deal with it
try:
import json
sender = json.loads(sender)
except ValueError:
pass
if isinstance(sender, (tuple, list)) and len(sender)==2:
from email.utils import formataddr
sender = formataddr(sender)
comm = webnotes.new_bean('Communication')
d = comm.doc
d.subject = subject
d.content = content
d.sent_or_received = sent_or_received
d.sender = sender or webnotes.conn.get_value("Profile", webnotes.session.user, "email")
d.recipients = recipients
# add as child
sent_via = webnotes.get_obj(doctype, name)
d.parent = name
d.parenttype = doctype
d.parentfield = "communications"
if date:
d.communication_date = date
d.communication_medium = communication_medium
comm.ignore_permissions = True
comm.insert()
if send_email:
d = comm.doc
send_comm_email(d, name, sent_via, print_html, attachments, send_me_a_copy)
@webnotes.whitelist()
def get_customer_supplier(args=None):
"""
Get Customer/Supplier, given a contact, if a unique match exists
"""
import webnotes
if not args: args = webnotes.local.form_dict
if not args.get('contact'):
raise Exception("Please specify a contact to fetch Customer/Supplier")
result = webnotes.conn.sql("""\
select customer, supplier
from `tabContact`
where name = %s""", args.get('contact'), as_dict=1)
if result and len(result)==1 and (result[0]['customer'] or result[0]['supplier']):
return {
'fieldname': result[0]['customer'] and 'customer' or 'supplier',
'value': result[0]['customer'] or result[0]['supplier']
}
return {}
def send_comm_email(d, name, sent_via=None, print_html=None, attachments='[]', send_me_a_copy=False):
from json import loads
footer = None
if sent_via:
if hasattr(sent_via, "get_sender"):
d.sender = sent_via.get_sender(d) or d.sender
if hasattr(sent_via, "get_subject"):
d.subject = sent_via.get_subject(d)
if hasattr(sent_via, "get_content"):
d.content = sent_via.get_content(d)
footer = set_portal_link(sent_via, d)
from webnotes.utils.email_lib.smtp import get_email
mail = get_email(d.recipients, sender=d.sender, subject=d.subject,
msg=d.content, footer=footer)
if send_me_a_copy:
mail.cc.append(webnotes.conn.get_value("Profile", webnotes.session.user, "email"))
if print_html:
print_html = scrub_urls(print_html)
mail.add_attachment(name.replace(' ','').replace('/','-') + '.html', print_html)
for a in loads(attachments):
try:
mail.attach_file(a)
except IOError as e:
webnotes.msgprint("""Unable to find attachment %s. Please resend without attaching this file.""" % a,
raise_exception=True)
mail.send()
def set_portal_link(sent_via, comm):
"""set portal link in footer"""
from webnotes.webutils import is_signup_enabled
from webnotes.utils import get_url, cstr
import urllib
footer = None
if is_signup_enabled() and hasattr(sent_via, "get_portal_page"):
portal_page = sent_via.get_portal_page()
if portal_page:
is_valid_recipient = cstr(sent_via.doc.email or sent_via.doc.email_id or
sent_via.doc.contact_email) in comm.recipients
if is_valid_recipient:
url = "%s/%s?name=%s" % (get_url(), portal_page, urllib.quote(sent_via.doc.name))
footer = """<!-- Portal Link --><hr>
<a href="%s" target="_blank">View this on our website</a>""" % url
return footer
def get_user(doctype, txt, searchfield, start, page_len, filters):
from controllers.queries import get_match_cond
return webnotes.conn.sql("""select name, concat_ws(' ', first_name, middle_name, last_name)
from `tabProfile`
where ifnull(enabled, 0)=1
and docstatus < 2
and (%(key)s like "%(txt)s"
or concat_ws(' ', first_name, middle_name, last_name) like "%(txt)s")
%(mcond)s
limit %(start)s, %(page_len)s """ % {'key': searchfield,
'txt': "%%%s%%" % txt, 'mcond':get_match_cond(doctype, searchfield),
'start': start, 'page_len': page_len})
def get_lead(doctype, txt, searchfield, start, page_len, filters):
from controllers.queries import get_match_cond
return webnotes.conn.sql(""" select name, lead_name from `tabLead`
where docstatus < 2
and (%(key)s like "%(txt)s"
or lead_name like "%(txt)s"
or company_name like "%(txt)s")
%(mcond)s
order by lead_name asc
limit %(start)s, %(page_len)s """ % {'key': searchfield,'txt': "%%%s%%" % txt,
'mcond':get_match_cond(doctype, searchfield), 'start': start,
'page_len': page_len})
|
The Role of MIP-1α in the Development of Systemic Inflammatory Response and Organ Injury following Trauma Hemorrhage Although MIP-1α is an important chemokine in the recruitment of inflammatory cells, it remains unknown whether MIP-1α plays any role in the development of the systemic inflammatory response following trauma-hemorrhage (T-H). C57BL/6J wild type (WT) and MIP-1α-deficient (KO) mice were used either as controls, subjected to sham operation (cannulation or laparotomy only, or cannulation plus laparotomy) or to T-H (midline laparotomy, mean blood pressure 35 ± 5 mmHg for 90 min, followed by resuscitation) and sacrificed 2 h thereafter. A marked increase in serum α-glutathione transferase, TNF-α, IL-6, IL-10, MCP-1, and MIP-1α, and in Kupffer cell cytokine production, was observed in WT T-H mice compared with shams or controls. In addition, lung and liver tissue edema and neutrophil infiltration (myeloperoxidase (MPO) content) were also increased following T-H in WT animals. These inflammatory markers were markedly attenuated in the MIP-1α KO mice following T-H. Furthermore, compared with 2 h, MPO activities at 24 and 48 h after T-H declined steadily in both WT and KO mice. However, normalization of MPO activities to sham levels within 24 h was seen in KO mice but not in WT mice. Thus, MIP-1α plays an important role in mediating the acute inflammatory response following T-H. In the absence of MIP-1α, acute inflammatory responses were attenuated and recovered rapidly, and less remote organ injury was noted following T-H. Thus, interventions that reduce MIP-1α levels following T-H should be useful in decreasing the deleterious inflammatory consequences of trauma.
|
// USE: call this to cause a DB object to be loaded into memory for use...
StatusResult DBManager::useDatabase(const std::string &aName) {
Database *theDatabase = loadDatabase(aName);
if (nullptr!=theDatabase) {
releaseDB();
activeDB = theDatabase;
std::cout << "Using " << aName << " database" << std::endl;
return StatusResult{ECE141::noError};
}
return StatusResult{ECE141::unknownDatabase};
}
|
Inter-urban mobility via cellular position tracking in the southeast Songliao Basin, Northeast China Position tracking using cellular phones can provide fine-grained travel data between and within cities at hourly and daily scales, giving us a feasible way to explore human mobility. However, such fine-grained data are traditionally owned by private companies and are rarely made publicly available, even for a single city. Here we present, to the best of our knowledge, the largest inter-city movement dataset built from cellular phone logs. Specifically, our dataset captures 3 million cellular devices and includes 70 million movements. These movements are measured at hourly intervals and span a week-long duration. Our measurements are from the southeast Songliao Basin, Northeast China, spanning three cities and one county with a collective population of 8 million people. The dynamic, weighted and directed mobility network of inter-urban divisions is released in simple formats, together with the divisions' GPS coordinates, to motivate studies of human interactions within and between cities.
Background & Summary
The popular use of cellular phones enables measurement of large-scale human mobility traces, which have become readily available and serve as a proxy for human mobility. The underlying interactions of metapopulations within and between cities have been extensively studied both in applied work (e.g., inter-urban mobility 1, urban activities 2, urban evolution 3, heterogeneous responses during extreme events 4) and in epidemiological studies of mobility networks 5,6. To study human movements, especially among cities, the analytic framework of mobility networks provides a useful way to characterize interactions among people in different sites. Although transportation and interaction patterns between locations change at hourly and daily scales, many studies of human mobility assume they are static, neglecting the dynamics of mobility. This is, arguably, due to the lack of fine-grained public datasets that could describe the mobility dynamics between cities. There are some open-access datasets covering small geographical areas that take the time ordering of interactions into account, such as networks of wifi hotspots within a city 10 and networks of students on a university campus 11. However, fine-grained movement datasets covering large geographical regions with multiple, highly populated cities are still missing from the open-access datasets. In this paper, we curate and amass a fine-grained mobility dataset to study inter-urban interactions. We capture cellular position tracking of millions of mobile phone users from an open-data program in Changchun city. Each location in our dataset represents a group of cellular stations in an official administrative division. We assume that an individual stays at a location if their position remains the same for at least half an hour within an hour-long interval. A directed movement of an individual from an origin location O to a destination location D denotes a change of location; we record the time of the movement as the time of arrival at D in our dataset. The overall directed mobility network of locations is finally compiled by sequentially processing the directed movements of all individuals. In the network, a node represents a location. A weighted edge represents the total number of users' movements between a pair of locations in each hour.
The dataset contains movements of nearly 3 million anonymized cellular phone users among 167 divisions (henceforth locations), covering 4 geographically adjacent areas (Changchun City, Dehui City, Yushu City, and Nong'an County) for a one-week period starting on August 7, 2017. This geographic area, located in the southeast Songliao Basin in the center of the Northeast China Plain, Northeast China, covers more than 20 square kilometers and, in 2017, had a population of nearly 8 million. To facilitate the use of the open data, we process the raw dataset to extract a dynamic and directed mobility network of locations. We make these networks available as comma-separated values (CSV) files. There are 2 files released in 2 folders. The first file contains the mobility network with four columns ordered as origin location, destination location, weight, and time. For spatial analysis applications, we also provide a geospatial file with the GPS information for each location, containing three columns: location, latitude, and longitude. Although the described dataset is a major step towards enabling research on human mobility, it has several limitations. First, despite the fact that the dataset covers a cohort of millions of movements, it spans only a one-week period in summer. Depending on the application, longer time periods might be needed. Second, we count a movement only when a user stays in a new location for at least half an hour. This may also induce bias, as it ignores quick movements. Third, an individual's destination position is the last known recorded location of that individual. This recording might cause bias: the individual might actually have been in D during the whole period between t − 1 and t. Fourth, an individual's origin position might have been recorded at a much earlier time (e.g., an hour or a day before) than the recorded arrival time t, since it depends on the last time the user used their phone. We caution researchers to keep these limitations in mind when drawing conclusions from these data.
Methods
Original data sources. Our data consist of location records of millions of anonymized cellular phone users for one week starting from August 7, 2017. These locations include 4 geographically neighboring areas (i.e., Changchun City, Dehui City, Yushu City, and Nong'an County). A cellular phone is assumed to be located at the closest cellular base station with which it interacts by sending or receiving signals. In the raw movement data, each base station is a unique unit. Note that a set of cellular base stations can jointly serve a metapopulation. There are over 12,000 cellular stations with exact GPS location information. Using the GPS positions of the cellular stations, we obtain their official administrative division codes (2017 version) via the Amap APIs (https://lbs.amap.com/api/webservice/guide/api/georegeo), as well as the GPS information of each administrative division. In total, these cellular stations are located in 167 divisions, with 100 in Changchun, 27 in Yushu, 18 in Dehui, and 22 in Nong'an. Each division includes 72 stations on average, with a standard deviation of 65 stations. We group a set of base stations into one location if they are within the same division. There are nearly 3 million phone users in this study. Most of these users are active, with enough credit left in their accounts.
Accounts with no credit stop receiving signals automatically within a few days under the company's system. For each user, we aggregate the corresponding location records into hourly movements. Specifically, we require an individual to stay in a location for at least half an hour to be considered at that location. If a user spends less than 30 minutes in a location, we assume they did not visit that location during the corresponding hour. Some trips may have large time intervals, perhaps due to phones running out of battery power. As such, we do not consider trips whose duration exceeds 12 hours (less than 0.3% of the total trips).
Defining the mobility network. Considering each place (a city or a county) as multiple metapopulations in different locations, we construct the directed mobility network for each hour of the week. Each location is represented as a node in our network. Edges are directed, connecting nodes where users move from origins to destinations, and weighted by the total number of users' movements in each hour. An individual directed movement from location i to location j at time t means that, in the user's trajectory, location j follows the previous location i at time t.
Data records
This dataset is released as 2 comma-separated values (CSV) files, each in its own folder, including more than 70 million movements 12. The first file contains the hourly mobility network with four columns ordered as origin location, destination location, edge weight, and arrival time. The weight is the number of movements per hour between the origin location and the destination location. The second file contains the GPS information for each location, with three columns: location, latitude, and longitude.
Figure 2 caption (degree distributions compared with Shenzhen taxi passengers): the x-axis denotes the logarithmic degree and the y-axis the kernel density estimate of the probability density. For our mobility network, the degree of a node is the total number of hourly movements starting or ending at that location across the 168 hours of the week (blue density plot). For contrast, the black density plot shows the degree distribution of a static mobility network with zones as nodes and passenger flows as edges, aggregating 2,338,576 trips by taxi passengers in 13,798 taxis in Shenzhen from 18 April 2011 to 26 April 2011 over 1,634 zones 7. Both datasets are fitted by Gamma distributions; details of the fit summaries are shown in the text associated with each plot.
Finally, two folders are used to group these files 12. The first folder (Week-Mobility-Network) includes Mobility.txt, the file of the hourly mobility network for the entire week. The second folder (GPS-Location) includes GPS.txt, the file of latitude and longitude information for each location in the mobility network. Mobility.txt: each row represents the total number of hourly movements from location i to location j at the corresponding time; there are four columns ordered as origin location, destination location, weight, and time. GPS.txt: the GPS information for each location, organized as three columns ordered as location identifier, latitude, and longitude.
Location: numerical administrative division code for each location; Latitude: numerical values for the latitude of the corresponding location; Longitude: numerical values for the longitude of the corresponding location.
Technical Validation
The reliability of the location and time information of users' movements in the network data largely depends on the reliability of the underlying source data. We verify the consistency via the geographically explicit distribution of locations. We visualize 400 locations on a geographic map, as shown in Fig. 1.
Fig. 3 caption: Community structures over days. We construct the daily mobility network by aggregating the 24 hourly mobility networks, summing all edge weights. The Louvain community detection algorithm 13 is used to probe community structures in the daily mobility network for each day of the week (subgraphs a to g), with colors denoting different communities on each day. An inter-urban community is a community whose nodes belong to different locations. We consider 3 community-based measures to reveal inter-urban mobility interactions, as shown in subtable h: R is the percentage of nodes in inter-urban communities over all nodes, M denotes the mean number of nodes in a community, and N represents the number of communities with more than 10 nodes. Sunday is special, bridging weekday and weekend inter-urban mobility patterns and connecting otherwise disconnected inter-urban locations, with the highest R and the lowest N.
Mobility network. In the mobility network, nodes are defined as locations, and edges are weighted by the mobility flows between nodes. We verify the consistency of the mobility network with people's daily routines using the hourly movement flows over the seven days of the week, as shown in Fig. 2a. A movement denotes an individual movement whose origin node is different from its destination node. For each hour, we count the number of movements between locations as the hourly movement flow. The hourly movement flows of all working days show two traffic peaks (morning and evening). The morning peak starts at 9:00 and the evening peak at 17:00; both are approximately 4 hours long, similar to the mobility flows reported in the literature for another Chinese city, Shanghai, with the morning period starting at 9 am and the evening period starting at 4 pm 2. As for weekends, traffic peaks are slightly lower and especially weak in the afternoon. Figure 2b shows the trip durations up to 24 hours; the y-axis denotes the proportion of trips across trip durations. We can observe that trips shorter than 12 hours account for over 99.7% of the total trips. The degree of a node denotes the total number of hourly movements passing through the corresponding node during the 168 hours of the week. Figure 2c shows the degree distribution compared with the degree distribution of a mobility network for another Chinese city (i.e., Shenzhen) 7. We observe that the part of the log-degree distribution for high degree values follows a Gamma distribution with a mean value of 10.9387. In contrast, the reported log-degree distribution of the mobility network for Shenzhen 7 shows a quite different Gamma distribution with a mean value of 5.5516. Network structure analysis. Additionally, we analyze the community structure of the mobility network using the Louvain community detection algorithm 13.
On each day, the inter-urban mobility network often consists of communities: groups of metapopulations in locations that are highly intra-connected but only loosely interconnected 14,15. Figure 3 shows the community structures for each day, with colors denoting different detected communities. To explore inter-urban mobility interactions, we consider inter-urban communities, whose nodes belong to different locations. We consider three community-based measures. Specifically, R is defined as the percentage of nodes in communities that indicate inter-urban movement. A high R denotes strong movement between locations, resulting in multiple inter-urban locations ending up in the same community. M denotes the mean number of nodes in a community. A high M denotes a high average size of locations in a local affiliation. N represents the number of communities with more than 10 nodes. A high N denotes high variability in mobility, with more local affiliations. We can observe that Sunday is special, with the highest R and the lowest N, bridging weekday and weekend inter-urban mobility patterns and connecting otherwise disconnected inter-urban locations.
Code availability
Matlab code for data analysis of location correction and mobility network construction can be obtained freely from Supplemental File 1 with no restrictions on access.
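The released files are simple enough that the network can be rebuilt in a few lines. The sketch below is an independent illustration in Python (the paper itself only provides Matlab code); it assumes pandas and networkx are installed, that Mobility.txt has no header row, and that its four columns follow the order described above.

# Hypothetical loader for the released data: build one directed, weighted
# graph per hour from Mobility.txt (origin, destination, weight, time).
import pandas as pd
import networkx as nx

cols = ["origin", "destination", "weight", "time"]
df = pd.read_csv("Week-Mobility-Network/Mobility.txt", names=cols)

hourly_graphs = {}
for hour, chunk in df.groupby("time"):
    g = nx.DiGraph()
    for row in chunk.itertuples(index=False):
        g.add_edge(row.origin, row.destination, weight=row.weight)
    hourly_graphs[hour] = g

# Example: total number of movements recorded in the first hour of the week.
first_hour = min(hourly_graphs)
print(first_hour, hourly_graphs[first_hour].size(weight="weight"))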
|
Multiple Wavelet Coherence to Evaluate Local Multivariate Relationships in a Groundwater System Groundwater level fluctuations are affected by surface properties through complex correlations of groundwater-surface water interaction and/or other surface processes, which are usually hard to quantify accurately. Previous studies have assessed the relationship between groundwater level fluctuations and specific controlling factors. However, few studies have explored the impact of combinations of multiple factors on the groundwater system. Hence, this paper explores the localized and scale-specific multivariate relationships between the groundwater level and controlling factors (such as hydrologic and meteorological factors) using bivariate wavelet coherence and multiple wavelet coherence. The groundwater level fluctuations of two wells in areas covered by different plant densities (i.e., the riparian zone of the Colorado River, USA) are analyzed. The main findings are threefold. First, barometric pressure and river stage are the best factors for interpreting the groundwater level fluctuations at small scales (<1 day) and large scales (>1 day), respectively, at the well in the low-density plant stand. Second, at the well in the high-density plant stand, the best predictors of the groundwater level fluctuations are barometric pressure (<1 day), the combination of barometric pressure and temperature (1-7 days), temperature (7-30 days), and the combination of barometric pressure, temperature, and river stage (>30 days). The best predictor of groundwater head fluctuations depends on the variation in vegetation coverage and hydrological processes. Third, these results provide a suite of factors to explain the groundwater level variations, which is an important topic in water-resource prediction and management.
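For reference, the bivariate wavelet coherence used in such studies is the standard smoothed cross-spectrum ratio, and multiple wavelet coherence extends it by analogy with the coefficient of multiple determination. The formulas below follow the common formulation in the wavelet literature and are not quoted from this paper; S denotes a smoothing operator in time and scale, W the (cross-)wavelet spectra, s the scale, and \tau the time index.

R^{2}(s,\tau) =
  \frac{\left| S\!\left(s^{-1} W^{xy}(s,\tau)\right) \right|^{2}}
       {S\!\left(s^{-1} \left| W^{x}(s,\tau) \right|^{2}\right)\,
        S\!\left(s^{-1} \left| W^{y}(s,\tau) \right|^{2}\right)}
\qquad
\rho_{m}^{2}(s,\tau) =
  \frac{\overline{W}^{\,yX}(s,\tau)\,
        \left(\overline{W}^{\,XX}(s,\tau)\right)^{-1}
        \overline{W}^{\,yX}(s,\tau)^{\ast}}
       {\overline{W}^{\,yy}(s,\tau)}

Here the overbars denote smoothed cross-wavelet power: \overline{W}^{\,yX} is the vector of smoothed cross-spectra between the response y and each predictor in X, and \overline{W}^{\,XX} is the matrix of smoothed cross-spectra among the predictors.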
|
<reponame>DasCapschen/UE4-TutorialProject
// Fill out your copyright notice in the Description page of Project Settings.
#pragma once
#include "GameFramework/Character.h"
#include "PlayerCpp.generated.h"
UCLASS()
class TUTORIAL2017_API APlayerCpp : public ACharacter
{
GENERATED_BODY()
public:
APlayerCpp();
UFUNCTION(Server, Reliable, WithValidation)
void SrvInteract();
void HandleHighlight();
AActor* FindActorInLOS();
virtual void BeginPlay() override;
virtual void Tick(float DeltaTime) override;
virtual void SetupPlayerInputComponent(class UInputComponent* PlayerInputComponent) override;
private:
AActor* FocusedActor;
float InteractionDistance = 400.f;
FCollisionQueryParams TraceParams;
};
|
Mission Pony brings the joy of horseback riding into the modern age! Fully electric ponies are available to rent for your event.
yee-haw!
San Francisco’s own motorized stable offers the unique fun of urban horseback riding. These electric horses were concocted in a garage in the technology innovation cradle of Northern California. Mission Ponies are battery powered with zero emissions - except the sound of the rider's joyful exclamations!
adventures
Mission Pony makes a great addition to your event or party. Our wranglers bring the ponies to you anywhere in the Bay Area and make the experience fun for everyone.
Our packages include fun games and activities for all ages. Ride the range as a cowboy or cowgirl roping dogies! Thrill to the chase of a fox hunt — the kind where even the fox has a good time! Our wranglers will come with the needed accessories and supplies — or work with you to customize the fun for your event. We have full-size horses — and unicorns! — for adult events, and smaller, adult-controlled horses for children’s events.
Get in touch with our Stable Master for more information or to schedule your event today!
giddy up!
|
Friends have set up a memorial fund to help the family of a motorcyclist hit and killed by a suspected drunken driver Wednesday night.
Residents may donate at Wells Fargo Bank under the Rickey Parks fund.
Parks, 39, was hit by a pickup at about 11 p.m. at Fourth Street and Milwaukee. He later died at University Medical Center.
Police arrested Manuel O. Martinez Jr., 43, on intoxication manslaughter charges. Authorities said he ran a red light and slammed into Parks' motorcycle.
Parks left behind his wife, Cecilia, and six children, the youngest of whom is 2.
The Messengers Riding Club will remember Parks at 1 p.m. Sunday at the Fellowship Clubhouse, 4229 Idalou Highway.
The family was still making funeral arrangements Friday.
|
<gh_stars>1-10
// Copyright (C) 2011-2020 Roki. Distributed under the MIT License
#ifndef INCLUDED_SROOK_CONFIG_ARCH_CONVEX_CORE_HPP
#define INCLUDED_SROOK_CONFIG_ARCH_CONVEX_CORE_HPP
#if defined(__convex__)
# define SROOK_ARCH_IS_CONVEX 1
# if defined(__convex_c1__)
# define SROOK_CONVEX_C1 __convex_c1__
# else
# define SROOK_CONVEX_C1 0
# endif
# if defined(__convex_c2__)
# define SROOK_CONVEX_C2 __convex_c2__
# else
# define SROOK_CONVEX_C2 0
# endif
# if defined(__convex_c32__)
# define SROOK_CONVEX_C32 __convex_c32__
# else
# define SROOK_CONVEX_C32 0
# endif
# if defined(__convex_c34__)
# define SROOK_CONVEX_C34 __convex_c34__
# else
# define SROOK_CONVEX_C34 0
# endif
# if defined(__convex_c38__)
# define SROOK_CONVEX_C38 __convex_c38__
# else
# define SROOK_CONVEX_C38 0
# endif
#else
# define SROOK_ARCH_IS_CONVEX 0
# define SROOK_CONVEX_C1 0
# define SROOK_CONVEX_C2 0
# define SROOK_CONVEX_C32 0
# define SROOK_CONVEX_C34 0
# define SROOK_CONVEX_C38 0
#endif
#endif
|
/**
* Save activity state when it's paused.
*/
@Override
protected void onPause() {
String names = playerNames[Constants.PLAYER_1] + ',' + playerNames[Constants.PLAYER_2] + ',' + playerNames[Constants.PLAYER_3] + ',' + playerNames[Constants.PLAYER_4];
String mana = playerManaColors[Constants.PLAYER_1] + "," + playerManaColors[Constants.PLAYER_2] + "," + playerManaColors[Constants.PLAYER_3] + "," + playerManaColors[Constants.PLAYER_4];
// Batch all preference writes through a single editor and commit once.
PreferenceManager.getDefaultSharedPreferences(getApplicationContext()).edit()
.putString("KEY_PLAYER_NAMES", names)
.putInt("KEY_NUM_PLAYERS", numPlayers)
.putBoolean("KEY_TEAM_TOGETHER", teamTogetherCheckbox.isChecked())
.putInt("KEY_STARTING_LIFE", startingLife)
.putString("KEY_MANA_COLOR", mana)
.commit();
super.onPause();
}
|
package de.uniwue.VNFP.solvers.cplex;
import de.uniwue.VNFP.model.*;
import de.uniwue.VNFP.model.factory.TopologyFileReader;
import de.uniwue.VNFP.model.factory.TrafficRequestsReader;
import de.uniwue.VNFP.model.factory.VnfLibReader;
import de.uniwue.VNFP.solvers.AMinimizeCpu;
import ilog.concert.IloIntVar;
import ilog.concert.IloLinearNumExpr;
import ilog.cplex.IloCplex;
import java.io.PrintStream;
import java.util.ArrayList;
import java.util.Locale;
import java.util.stream.Collectors;
public class MinimizeCpuCplex2 extends AMinimizeCpu {
public MinimizeCpuCplex2(ProblemInstance pi) {
super(pi);
}
public static void main(String[] args) throws Exception {
Locale.setDefault(Locale.US);
// Read topology and request files.
//String base = "/home/alex/w/17/benchmark-vnfcp-generator/java/VNFCP_benchmarking/res/eval-topo/";
//String base = "/home/alex/w/17/benchmark-vnfcp-generator/java/VNFCP_benchmarking/res/msgp/";
String base = "/home/alex/w/old/ma/java/VNFP/res/problem_instances/BCAB15/";
//String base = "/home/alex/w/17/benchmark-vnfcp-generator/eval/dynamic/1507907420074/r5/j5/";
VnfLib lib = VnfLibReader.readFromFile(base + "vnfLib");
NetworkGraph ng = TopologyFileReader.readFromFile(base + "topology", lib);
TrafficRequest[] reqs = TrafficRequestsReader.readFromFile(base + "requests-small", ng, lib);
ProblemInstance pi = new ProblemInstance(ng, lib, reqs, new Objs(lib.getResources()));
new MinimizeCpuCplex2(pi).minimizeCpu("cplex_log", "cplex_sol", "cplex_iis");
}
@Override
public void minimizeCpu(String log, String sol, String iis) throws Exception {
IloCplex cplex = new IloCplex();
// Create decision variables
IloIntVar[][][] arfe = new IloIntVar[requests.length][][];
for (int r = 0; r < arfe.length; r++) {
arfe[r] = new IloIntVar[functions[r].length + 1][links.length];
for (int f = 0; f < arfe[r].length; f++) {
for (int e = 0; e < arfe[r][f].length; e++) {
arfe[r][f][e] = cplex.boolVar("arfe[" + r + "][" + f + "][" + e + "]");
}
}
}
IloIntVar[][][][] mrfni = new IloIntVar[requests.length][][][];
for (int r = 0; r < mrfni.length; r++) {
mrfni[r] = new IloIntVar[functions[r].length][nodes.length][];
for (int f = 0; f < mrfni[r].length; f++) {
for (int n = 0; n < mrfni[r][f].length; n++) {
int maxInst_ = maxInst;
for (int i = 0; i < pi.objectives.TOTAL_USED_RESOURCES.length; i++) {
maxInst_ = Math.min(maxInst_, (int) Math.floor(nodes[n].resources[i] / functions[r][f].reqResources[i]));
}
mrfni[r][f][n] = new IloIntVar[maxInst_];
for (int i = 0; i < mrfni[r][f][n].length; i++) {
mrfni[r][f][n][i] = cplex.boolVar("mrfni[" + r + "][" + f + "][" + n + "][" + i + "]");
}
}
}
}
IloIntVar[][][] mtni = new IloIntVar[vnfs.length][nodes.length][];
for (int t = 0; t < mtni.length; t++) {
for (int n = 0; n < mtni[t].length; n++) {
int maxInst_ = maxInst;
for (int i = 0; i < pi.objectives.TOTAL_USED_RESOURCES.length; i++) {
maxInst_ = Math.min(maxInst_, (int) Math.floor(nodes[n].resources[i] / vnfs[t].reqResources[i]));
}
mtni[t][n] = new IloIntVar[maxInst_];
for (int i = 0; i < mtni[t][n].length; i++) {
mtni[t][n][i] = cplex.boolVar("mtni[" + t + "][" + n + "][" + i + "]");
}
}
}
// Combine mtni and mrfni variables semantically
IloLinearNumExpr expr;
for (int t = 0; t < mtni.length; t++) {
ArrayList<Integer[]> listRf = typeAssoc[t];
if (!listRf.isEmpty()) {
for (int n = 0; n < mtni[t].length; n++) {
for (int i = 0; i < mtni[t][n].length; i++) {
expr = cplex.linearNumExpr();
for (Integer[] rf : listRf) {
expr.addTerm(1.0, mrfni[rf[0]][rf[1]][n][i]);
}
cplex.add(cplex.ifThen(cplex.ge(expr, 1.0), cplex.eq(mtni[t][n][i], 1.0)));
}
}
}
}
// Set objective
expr = cplex.linearNumExpr();
// for (int t = 0; t < mtni.length; t++) {
// for (int n = 0; n < mtni[t].length; n++) {
// for (int i = 0; i < mtni[t][n].length; i++) {
// expr.addTerm(100.0 * vnfs[t].cpuRequired, mtni[t][n][i]);
// }
// }
// }
for (int r = 0; r < arfe.length; r++) {
for (int f = 0; f < arfe[r].length; f++) {
for (int e = 0; e < arfe[r][f].length; e++) {
expr.addTerm(1.0, arfe[r][f][e]);
}
}
}
cplex.addMinimize(expr);
// Add constraints (2) "exactly one instance and one node for each requested function"
for (int r = 0; r < mrfni.length; r++) {
for (int f = 0; f < mrfni[r].length; f++) {
expr = cplex.linearNumExpr();
for (int n = 0; n < mrfni[r][f].length; n++) {
for (int i = 0; i < mrfni[r][f][n].length; i++) {
expr.addTerm(1.0, mrfni[r][f][n][i]);
}
}
cplex.addEq(expr, 1.0, "c1");
}
}
if (pi.ng.directed) {
// Add constraints (3+4) for the directed (or undirected but full-duplex) link case
for (int r = 0; r < arfe.length; r++) {
for (int f = 0; f < arfe[r].length; f++) {
for (int n = 0; n < nodes.length; n++) {
expr = cplex.linearNumExpr();
int constant = 0;
// Special case: f = 0
if (f == 0 && nodes[n].equals(requests[r].ingress)) {
constant += 1;
}
// General case
else if (f > 0) {
for (int i = 0; i < mrfni[r][f-1][n].length; i++) {
expr.addTerm(1.0, mrfni[r][f-1][n][i]);
}
}
// Incoming links
for (Link l : nodes[n].getInLinks()) {
int e = linkIndices.get(l);
expr.addTerm(1.0, arfe[r][f][e]);
}
// Special case: f = |r_c| + 1
if (f == mrfni[r].length && nodes[n].equals(requests[r].egress)) {
constant -= 1;
}
// General case
else if (f < mrfni[r].length) {
for (int i = 0; i < mrfni[r][f][n].length; i++) {
expr.addTerm(-1.0, mrfni[r][f][n][i]);
}
}
for (Link l : nodes[n].getOutLinks()) {
int e = linkIndices.get(l);
expr.addTerm(-1.0, arfe[r][f][e]);
}
cplex.addEq(expr, -constant, "c3+4");
}
}
}
}
else {
// Add constraints (3) "all used links are connected to endpoints or another link"
for (int r = 0; r < arfe.length; r++) {
// Edge case: path from src to first vnf
for (int e = 0; e < arfe[r][0].length; e++) {
Link l = links[e];
int n1 = nodeIndices.get(l.node1);
int n2 = nodeIndices.get(l.node2);
for (int nn : new int[]{n1, n2}) {
expr = cplex.linearNumExpr();
expr.addTerm(-1.0, arfe[r][0][e]);
for (int i = 0; i < mrfni[r][0][nn].length; i++) {
expr.addTerm(1.0, mrfni[r][0][nn][i]);
}
for (Link ll : nodes[nn].getNeighbors()) {
int lll = linkIndices.get(ll);
if (lll != e) {
expr.addTerm(1.0, arfe[r][0][lll]);
}
}
cplex.addGe(expr, (requests[r].ingress.equals(nodes[nn]) ? -1.0 : 0.0), "c3_1");
}
}
// VNF f-1 to VNF f
for (int f = 1; f < arfe[r].length - 1; f++) {
for (int e = 0; e < arfe[r][f].length; e++) {
Link l = links[e];
int n1 = nodeIndices.get(l.node1);
int n2 = nodeIndices.get(l.node2);
for (int nn : new int[]{n1, n2}) {
expr = cplex.linearNumExpr();
expr.addTerm(-1.0, arfe[r][f][e]);
for (int i = 0; i < mrfni[r][f][nn].length; i++) {
expr.addTerm(1.0, mrfni[r][f][nn][i]);
}
for (int i = 0; i < mrfni[r][f - 1][nn].length; i++) {
expr.addTerm(1.0, mrfni[r][f - 1][nn][i]);
}
for (Link ll : nodes[nn].getNeighbors()) {
int lll = linkIndices.get(ll);
if (lll != e) {
expr.addTerm(1.0, arfe[r][f][lll]);
}
}
cplex.addGe(expr, 0.0, "c3_2");
}
}
}
// Edge case: path from last vnf to dst
int dst = arfe[r].length - 1;
for (int e = 0; e < arfe[r][dst].length; e++) {
Link l = links[e];
int n1 = nodeIndices.get(l.node1);
int n2 = nodeIndices.get(l.node2);
for (int nn : new int[]{n1, n2}) {
expr = cplex.linearNumExpr();
expr.addTerm(-1.0, arfe[r][dst][e]);
for (int i = 0; i < mrfni[r][dst - 1][nn].length; i++) {
expr.addTerm(1.0, mrfni[r][dst - 1][nn][i]);
}
for (Link ll : nodes[nn].getNeighbors()) {
int lll = linkIndices.get(ll);
if (lll != e) {
expr.addTerm(1.0, arfe[r][dst][lll]);
}
}
cplex.addGe(expr, (requests[r].egress.equals(nodes[nn]) ? -1.0 : 0.0), "c3_3");
}
}
}
// Add constraints (4) "source/destination points must be connected to the paths"
for (int r = 0; r < mrfni.length; r++) {
// Edge case: path from src to first vnf
for (int n = 0; n < nodes.length; n++) {
expr = cplex.linearNumExpr();
for (int i = 0; i < mrfni[r][0][n].length; i++) {
expr.addTerm(-1.0, mrfni[r][0][n][i]);
}
for (Link ll : nodes[n].getNeighbors()) {
expr.addTerm(1.0, arfe[r][0][linkIndices.get(ll)]);
}
cplex.addGe(expr, (requests[r].ingress.equals(nodes[n]) ? -1.0 : 0.0), "c4_1");
expr = cplex.linearNumExpr();
for (int i = 0; i < mrfni[r][0][n].length; i++) {
expr.addTerm(1.0, mrfni[r][0][n][i]);
}
for (Link ll : nodes[n].getNeighbors()) {
expr.addTerm(1.0, arfe[r][0][linkIndices.get(ll)]);
}
cplex.addGe(expr, (requests[r].ingress.equals(nodes[n]) ? 1.0 : 0.0), "c4_1");
}
// VNF f-1 to VNF f
for (int f = 1; f < mrfni[r].length; f++) {
for (int n = 0; n < mrfni[r][f].length; n++) {
expr = cplex.linearNumExpr();
for (int i = 0; i < mrfni[r][f][n].length; i++) {
expr.addTerm(-1.0, mrfni[r][f][n][i]);
}
for (int i = 0; i < mrfni[r][f - 1][n].length; i++) {
expr.addTerm(1.0, mrfni[r][f - 1][n][i]);
}
for (Link ll : nodes[n].getNeighbors()) {
expr.addTerm(1.0, arfe[r][f][linkIndices.get(ll)]);
}
cplex.addGe(expr, 0.0, "c4_2");
expr = cplex.linearNumExpr();
for (int i = 0; i < mrfni[r][f][n].length; i++) {
expr.addTerm(1.0, mrfni[r][f][n][i]);
}
for (int i = 0; i < mrfni[r][f - 1][n].length; i++) {
expr.addTerm(-1.0, mrfni[r][f - 1][n][i]);
}
for (Link ll : nodes[n].getNeighbors()) {
expr.addTerm(1.0, arfe[r][f][linkIndices.get(ll)]);
}
cplex.addGe(expr, 0.0, "c4_2");
}
}
// Edge case: path from last vnf to dst
int last = requests[r].vnfSequence.length - 1;
for (int n = 0; n < nodes.length; n++) {
expr = cplex.linearNumExpr();
for (int i = 0; i < mrfni[r][last][n].length; i++) {
expr.addTerm(1.0, mrfni[r][last][n][i]);
}
for (Link ll : nodes[n].getNeighbors()) {
expr.addTerm(1.0, arfe[r][last + 1][linkIndices.get(ll)]);
}
cplex.addGe(expr, (requests[r].egress.equals(nodes[n]) ? 1.0 : 0.0), "c4_3");
expr = cplex.linearNumExpr();
for (int i = 0; i < mrfni[r][last][n].length; i++) {
expr.addTerm(-1.0, mrfni[r][last][n][i]);
}
for (Link ll : nodes[n].getNeighbors()) {
expr.addTerm(1.0, arfe[r][last + 1][linkIndices.get(ll)]);
}
cplex.addGe(expr, (requests[r].egress.equals(nodes[n]) ? -1.0 : 0.0), "c4_3");
}
}
}
// Add constraints (5) "respect available computational resources on nodes"
for (int n = 0; n < nodes.length; n++) {
for (int j = 0; j < pi.objectives.TOTAL_USED_RESOURCES.length; j++) {
expr = cplex.linearNumExpr();
for (int t = 0; t < vnfs.length; t++) {
for (int i = 0; i < mtni[t][n].length; i++) {
expr.addTerm(vnfs[t].reqResources[j], mtni[t][n][i]);
}
}
cplex.addLe(expr, nodes[n].resources[j], "c5_"+j);
}
}
// Add constraints (6) "respect available bandwidth on links"
for (int e = 0; e < links.length; e++) {
expr = cplex.linearNumExpr();
for (int r = 0; r < arfe.length; r++) {
for (int f = 0; f < arfe[r].length; f++) {
expr.addTerm(requests[r].bandwidthDemand, arfe[r][f][e]);
}
}
cplex.addLe(expr, links[e].bandwidth, "c6");
}
// Add constraints (7) "respect available capacity of VNFs"
for (int t = 0; t < vnfs.length; t++) {
ArrayList<Integer[]> listRf = typeAssoc[t];
for (int n = 0; n < nodes.length; n++) {
for (int i = 0; i < maxInst; i++) {
expr = cplex.linearNumExpr();
for (Integer[] rf : listRf) {
if (i < mrfni[rf[0]][rf[1]][n].length) {
expr.addTerm(requests[rf[0]].bandwidthDemand, mrfni[rf[0]][rf[1]][n][i]);
}
}
cplex.addLe(expr, vnfs[t].processingCapacity, "c7");
}
}
}
// Add constraints (8) "respect maximum delay of r"
for (int r = 0; r < requests.length; r++) {
expr = cplex.linearNumExpr();
double vnfDelay = 0.0;
for (int f = 0; f < requests[r].vnfSequence.length; f++) {
for (int e = 0; e < links.length; e++) {
expr.addTerm(links[e].delay, arfe[r][f][e]);
}
vnfDelay += requests[r].vnfSequence[f].delay;
}
cplex.addLe(expr, requests[r].expectedDelay - vnfDelay, "c8");
}
// Optimize model
//PrintStream out = new PrintStream(new FileOutputStream(sol));
PrintStream out = System.out;
if (cplex.solve() && !cplex.getStatus().equals(IloCplex.Status.Infeasible)) {
out.println("Feasible solution found! (status: " + cplex.getStatus() + ")");
out.println("Instances:");
double cores = 0.0;
for (int t = 0; t < mtni.length; t++) {
for (int n = 0; n < mtni[t].length; n++) {
int count = 0;
for (int i = 0; i < mtni[t][n].length; i++) {
if (cplex.getValue(mtni[t][n][i]) > 0.0) {
count++;
}
}
if (count > 0) {
out.println(String.format(" %s: [%dx %s]", nodes[n].name, count, vnfs[t].name));
cores += count * vnfs[t].reqResources[0];
}
}
}
out.println("Used CPU cores: " + cores);
double numHops = 0.0;
for (int r = 0; r < arfe.length; r++) {
for (int f = 0; f < arfe[r].length; f++) {
for (int e = 0; e < arfe[r][f].length; e++) {
numHops += cplex.getValue(arfe[r][f][e]);
}
}
}
out.println("Number of hops: " + numHops);
out.println("Paths:");
for (int r = 0; r < requests.length; r++) {
out.println(" [" + r + "]: " + findPath(arfe, r, cplex));
}
}
// Unfeasible?
else {
System.out.println("No feasible solution... status: " + cplex.getStatus());
// Check the following...
if (cplex.getStatus() == IloCplex.Status.Infeasible) {
System.out.println("Attempting to write conflict to " + iis);
try {
cplex.writeConflict(iis);
}
catch (Exception e) {
System.out.println(e);
}
}
out.println("Nodes (+Neighbors):");
for (int n = 0; n < nodes.length; n++) {
out.print(" [" + n + "]: '" + nodes[n].name + "' (");
int n_ = n;
out.print(nodes[n].getNeighbors().stream().map(l -> l.getOther(nodes[n_]).name).collect(Collectors.joining(",")));
out.print(") (");
out.print(nodes[n].getNeighbors().stream().map(l -> "" + nodeIndices.get(l.getOther(nodes[n_]))).collect(Collectors.joining(",")));
out.println(")");
}
out.println("Links:");
for (int l = 0; l < links.length; l++) {
out.print(" [" + l + "]: " + links[l].node1.name + " -> " + links[l].node2.name + " ");
out.println("(" + nodeIndices.get(links[l].node1) + " -> " + nodeIndices.get(links[l].node2) + ")");
}
out.println("VNFs:");
for (int v = 0; v < vnfs.length; v++) {
out.println(" [" + v + "]: '" + vnfs[v].name + "'");
}
out.println("Requests:");
for (int r = 0; r < requests.length; r++) {
out.println(" [" + r + "]: '" + requests[r].ingress.name + "' -> '" + requests[r].egress.name + "' (or " + nodeIndices.get(requests[r].ingress) + "->" + nodeIndices.get(requests[r].egress) + ")");
}
}
cplex.end();
}
}
|
<reponame>adele-robots/fiona
#ifndef __ISCENE_H
#define __ISCENE_H
#include "Configuration.h"
//PABLO: Skeleton
class ISkeleton;
//PABLO: MTB MODIFIED
//class MorphTargetBlender;
class IMorphTargetBlender;
class IScene
{
public:
virtual void init(int width, int height);
virtual void quit(void)const=0;
public:
//PABLO: Skeleton
//PABLO: MTB MODIFIED
IScene(psisban::Config *bc, IMorphTargetBlender *bl, ISkeleton *sk) :
bodyConfiguration(bc),
morphTargetBlender(bl),
skeleton(sk)
{}
//PABLO: Skeleton
ISkeleton *skeleton;
//void setWidth(int);
//void setHeight(int);
//H3DRes _fontMatRes;
//H3DRes _panelMatRes;
//H3DRes camaraNode;
//PABLO: MTB MODIFIED
//MorphTargetBlender *morphTargetBlender;
IMorphTargetBlender *morphTargetBlender;
//H3DNode morphNode; // the morph node is recalculated redundantly in morphTargetBlender
// NOTE: animation
//bool isAnimationEnabled;
//H3DRes _animationRes;
public:
psisban::Config *bodyConfiguration;
//PABLO: Scene
//private:
protected:
//void addResources(void);
//void loadResourcesFormDisk(void);
//void addNodesToScene(void);
//void setInitialMorphs(void);
//PABLO: Scene
//private:
protected:
//H3DRes pipelineResource;
//H3DRes lightMatRes;
//H3DRes sceneResource;
//int width;
//int height;
};
#endif
|
def reconstruct(self) -> Generator[Tuple[int, bytes], None, None]:
nt_header = self.get_nt_header()
layer_name = self.vol.layer_name
symbol_table_name = self.get_symbol_table_name()
section_alignment = nt_header.OptionalHeader.SectionAlignment
sect_header_size = self._context.symbol_space.get_type(symbol_table_name + constants.BANG +
"_IMAGE_SECTION_HEADER").size
size_of_image = nt_header.OptionalHeader.SizeOfImage
if size_of_image > (1024 * 1024 * 100):
raise ValueError("The claimed SizeOfImage is too large: {}".format(size_of_image))
read_layer = self._context.layers[layer_name]
raw_data = read_layer.read(self.vol.offset, nt_header.OptionalHeader.SizeOfImage, pad = True)
fixed_data = self.fix_image_base(raw_data, nt_header)
yield 0, fixed_data
start_addr = nt_header.FileHeader.SizeOfOptionalHeader + \
(nt_header.OptionalHeader.vol.offset - self.vol.offset)
counter = 0
for sect in nt_header.get_sections():
if sect.VirtualAddress > size_of_image:
raise ValueError("Section VirtualAddress is too large: {}".format(sect.VirtualAddress))
if sect.Misc.VirtualSize > size_of_image:
raise ValueError("Section VirtualSize is too large: {}".format(sect.Misc.VirtualSize))
if sect.SizeOfRawData > size_of_image:
raise ValueError("Section SizeOfRawData is too large: {}".format(sect.SizeOfRawData))
if sect is not None:
sect_size = conversion.round(sect.Misc.VirtualSize, section_alignment, up = True)
sectheader = read_layer.read(sect.vol.offset, sect_header_size)
sectheader = self.replace_header_field(sect, sectheader, sect.PointerToRawData, sect.VirtualAddress)
sectheader = self.replace_header_field(sect, sectheader, sect.SizeOfRawData, sect_size)
sectheader = self.replace_header_field(sect, sectheader, sect.Misc.VirtualSize, sect_size)
offset = start_addr + (counter * sect_header_size)
yield offset, sectheader
counter += 1
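A minimal consumption sketch for the generator above, assuming an object (here called pe_object) that exposes this reconstruct() method, for example a mapped PE layer in a memory-forensics framework; the output filename is illustrative.

# Stitch the yielded (offset, data) chunks into a dumped PE file on disk.
with open("dumped_module.exe", "wb") as handle:
    for offset, data in pe_object.reconstruct():
        handle.seek(offset)
        handle.write(data)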
|
Quantitative Investment Trading Model Based on a Model Recognition Strategy with a Deep Learning Method With the acceleration of economic globalization, the frequent price fluctuations of gold, bitcoin, and other assets have attracted wide attention from the quantitative investment industry. For market traders, the rational use of deep learning to improve traditional investment trading strategies has become one of the main tasks of current work. In this paper, a deep learning method is used to make a horizontal comparison of the gain of the model recognition strategy, with deep learning as its main tool, against two other strategies, and a longitudinal comparison of the fitting-accuracy advantage of the deep learning method over traditional time series models. We grouped gold and bitcoin prices in an LSTM/GRU framework, trained the recurrent dynamic neural network model on the daily data of each group, used dropout to reduce overfitting, and retained 20% of the data for cross-checking. The results show that the benefit of the neural network model is most evident when decisions are made on the same day's data: the fitting accuracy of the model exceeds 73% and the mean absolute error is 14.040908, indicating a good fit. Compared with model recognition strategies represented by LSTM/GRU, the follow-the-winner and follow-the-loser strategies have obvious disadvantages in their investment trading principles, and their returns are far lower than the $22,059.583248 obtained under the model recognition strategy. We also compare the LSTM/GRU method within the model recognition framework against the ARIMA time series method on the gold and bitcoin price trends and find that the ARIMA mean square error is much greater than that of the neural network fit. Therefore, the model recognition strategy integrating the deep learning model gives the best fit and the best profit among the three settings. Finally, we vary the transaction cost of gold and bitcoin up to 7% to test whether the trading model is stable across different countries. The conclusion shows that when the transaction cost varies within 7%, the model remains highly feasible, stable, and relatively robust.
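A minimal sketch of the kind of recurrent price model described above, not the authors' exact architecture: an LSTM (interchangeable with a GRU) regressor over a sliding window of daily prices, with dropout against overfitting and 20% of the data held out for validation. The window length, layer sizes, epoch count, and the synthetic stand-in series are all assumptions.

import numpy as np
import tensorflow as tf

def make_windows(prices, window=30):
    """Turn a 1-D price series into (window -> next-day price) samples."""
    x = np.stack([prices[i:i + window] for i in range(len(prices) - window)])
    y = prices[window:]
    return x[..., np.newaxis], y

prices = np.cumsum(np.random.randn(1000)) + 100.0   # synthetic stand-in for daily gold/bitcoin prices
x, y = make_windows(prices)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=x.shape[1:]),   # swap in tf.keras.layers.GRU(64) for the GRU variant
    tf.keras.layers.Dropout(0.2),                        # dropout to curb overfitting
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mae")              # mean absolute error, as reported in the abstract
model.fit(x, y, validation_split=0.2, epochs=10, verbose=0)  # hold out 20% for cross-checking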
|
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------
'''
Mapping and utility functions for Name to Spark ML operators
'''
from pyspark.ml.feature import Binarizer
from pyspark.ml.feature import BucketedRandomProjectionLSHModel
from pyspark.ml.feature import Bucketizer
from pyspark.ml.feature import ChiSqSelectorModel
from pyspark.ml.feature import CountVectorizerModel
from pyspark.ml.feature import DCT
from pyspark.ml.feature import ElementwiseProduct
from pyspark.ml.feature import HashingTF
from pyspark.ml.feature import IDFModel
from pyspark.ml.feature import ImputerModel
from pyspark.ml.feature import IndexToString
from pyspark.ml.feature import MaxAbsScalerModel
from pyspark.ml.feature import MinHashLSHModel
from pyspark.ml.feature import MinMaxScalerModel
from pyspark.ml.feature import NGram
from pyspark.ml.feature import Normalizer
from pyspark.ml.feature import OneHotEncoderModel
from pyspark.ml.feature import PCAModel
from pyspark.ml.feature import PolynomialExpansion
from pyspark.ml.feature import QuantileDiscretizer
from pyspark.ml.feature import RegexTokenizer
from pyspark.ml.feature import StandardScalerModel
from pyspark.ml.feature import StopWordsRemover
from pyspark.ml.feature import StringIndexerModel
from pyspark.ml.feature import Tokenizer
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.feature import VectorIndexerModel
from pyspark.ml.feature import VectorSlicer
from pyspark.ml.feature import Word2VecModel
from pyspark.ml.classification import LinearSVCModel, RandomForestClassificationModel, GBTClassificationModel, \
MultilayerPerceptronClassificationModel
from pyspark.ml.classification import LogisticRegressionModel
from pyspark.ml.classification import DecisionTreeClassificationModel
from pyspark.ml.classification import NaiveBayesModel
from pyspark.ml.classification import OneVsRestModel
from pyspark.ml.regression import AFTSurvivalRegressionModel, DecisionTreeRegressionModel, RandomForestRegressionModel
from pyspark.ml.regression import GBTRegressionModel
from pyspark.ml.regression import GeneralizedLinearRegressionModel
from pyspark.ml.regression import IsotonicRegressionModel
from pyspark.ml.regression import LinearRegressionModel
from pyspark.ml.clustering import BisectingKMeans
from pyspark.ml.clustering import KMeans
from pyspark.ml.clustering import GaussianMixture
from pyspark.ml.clustering import LDA
def build_sparkml_operator_name_map():
res = {k: "pyspark.ml.feature." + k.__name__ for k in [
Binarizer, BucketedRandomProjectionLSHModel, Bucketizer,
ChiSqSelectorModel, CountVectorizerModel, DCT, ElementwiseProduct, HashingTF, IDFModel, ImputerModel,
IndexToString, MaxAbsScalerModel, MinHashLSHModel, MinMaxScalerModel, NGram, Normalizer, OneHotEncoderModel,
PCAModel, PolynomialExpansion, QuantileDiscretizer, RegexTokenizer,
StandardScalerModel, StopWordsRemover, StringIndexerModel, Tokenizer, VectorAssembler, VectorIndexerModel,
VectorSlicer, Word2VecModel
]}
res.update({k: "pyspark.ml.classification." + k.__name__ for k in [
LinearSVCModel, LogisticRegressionModel, DecisionTreeClassificationModel, GBTClassificationModel,
RandomForestClassificationModel, NaiveBayesModel, MultilayerPerceptronClassificationModel, OneVsRestModel
]})
res.update({k: "pyspark.ml.regression." + k.__name__ for k in [
AFTSurvivalRegressionModel, DecisionTreeRegressionModel, GBTRegressionModel,
GeneralizedLinearRegressionModel, IsotonicRegressionModel, LinearRegressionModel, RandomForestRegressionModel
]})
return res
sparkml_operator_name_map = build_sparkml_operator_name_map()
def get_sparkml_operator_name(model_type):
'''
Get operator name of the input argument
:param model_type: A spark-ml object (LinearRegression, StringIndexer, ...)
:return: A string which stands for the type of the input model in our conversion framework
'''
if model_type not in sparkml_operator_name_map:
raise ValueError("No proper operator name found for '%s'" % model_type)
return sparkml_operator_name_map[model_type]
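

# Illustrative usage sketch (not part of the original module): the map is keyed by the
# operator *classes*, so pass the class itself, or type(fitted_model) for a fitted
# pipeline stage. This only exercises the lookup and needs no SparkSession.
if __name__ == "__main__":
    print(get_sparkml_operator_name(LogisticRegressionModel))  # pyspark.ml.classification.LogisticRegressionModel
    print(get_sparkml_operator_name(VectorAssembler))          # pyspark.ml.feature.VectorAssembler
    try:
        get_sparkml_operator_name(str)  # unsupported types raise ValueError
    except ValueError as err:
        print(err)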
|
Vegetation and Climate Around 780 Kyrs BP in Northern Kathmandu Valley, Central Nepal Palynological study of the Dharmasthali Formation exposed in the northern part of the Kathmandu valley revealed the composition of the forest vegetation that was growing in this area in the middle Pleistocene (780 kyrs BP). In total, fifteen samples were collected from the 46 m exposed section for the palynological study. The profile can be divided into two zones on the basis of pollen assemblages. The lower part (DF-I) is dominated by Pteridophyte spores such as Lygodium, Polypodium, Cyathea and Pteris. The dominance of Pteridophytes indicates that the forest floor was moist and humid. The tree pollen consists of Abies, Pinus, Quercus, Podocarpus and Alnus. Other Gymnosperms such as Picea and Tsuga were represented by very low percentages. Poaceae and Cyperaceae show a strong presence, indicating grassland and wetland conditions around the depositional basin. In the upper zone (DF-II) there is an increase in Gymnosperms such as Picea and Abies. The subtropical Gymnosperm Podocarpus decreased, while Tsuga became completely absent in this zone. Cold-climate-preferring trees such as Cedrus, Betula, Juglans and Ulmus appeared for the first time in this zone. The climate became even colder and drier in the upper part of the section. Near-water plants such as Cyperaceae and Typha show their dominance in this zone. The plant assemblages from the bottom part of the Dharmasthali Formation indicate a warm climate that became colder after 780 kyrs towards the top of the sequence. Bulletin of Department of Geology, vol. 20-21, 2018, pp. 37-48
|
public class Main
{
public static class A {
int x;
}
public A a;
public static void main(Main[] args)
{
assert(args != null); // allowed to fail
if(args != null && args.length > 0) {
Main m = args[0];
if(m != null) {
assert(m.a == null); // allowed to fail
}
}
}
}
|
package ru.stqa.pft.rest.tests;
import com.google.gson.Gson;
import com.google.gson.JsonElement;
import com.google.gson.JsonParser;
import com.google.gson.reflect.TypeToken;
import org.apache.http.client.fluent.Executor;
import org.apache.http.client.fluent.Request;
import org.apache.http.message.BasicNameValuePair;
import org.testng.SkipException;
import org.testng.annotations.Test;
import ru.stqa.pft.rest.model.Issue;
import java.io.IOException;
import java.util.Set;
import static org.testng.Assert.assertEquals;
public class RestTests {
@Test
public void testCreateIssue() throws IOException {
try {
skipIfNotFixed(14);
} catch (SkipException e) {
e.printStackTrace();
}
Set<Issue> oldIssues = getIssues();
Issue newIssue = new Issue().withSubject("Test issue2").withDescription("New test issue2");
int issueId = createIssue(newIssue);
Set<Issue> newIssues = getIssues();
oldIssues.add(newIssue.withId(issueId));
assertEquals(newIssues, oldIssues);
}
public int getIssueId() throws IOException {
String json = getExecutor().execute(Request.Get("http://demo.bugify.com/api/issues.json"))
.returnContent().asString();
JsonElement parsed = new JsonParser().parse(json);
JsonElement issues = parsed.getAsJsonObject().get("issues");
Set<Issue> fromJson = new Gson().fromJson(issues, new TypeToken<Set<Issue>>() {
}.getType());
Issue issue = fromJson.iterator().next();
int issueId = issue.getId();
return issueId;
}
private Set<Issue> getIssues() throws IOException {
String json = getExecutor().execute(Request.Get("http://demo.bugify.com/api/issues.json"))
.returnContent().asString();
JsonElement parsed = new JsonParser().parse(json);
JsonElement issues = parsed.getAsJsonObject().get("issues");
return new Gson().fromJson(issues, new TypeToken<Set<Issue>>(){}.getType());
}
private Executor getExecutor() {
return Executor.newInstance().auth("LSGjeU4yP1X493ud1hNniA==", "");
}
private int createIssue(Issue newIssue) throws IOException {
String json = getExecutor().execute(Request.Post("http://demo.bugify.com/api/issues.json")
.bodyForm(new BasicNameValuePair("subject", newIssue.getSubject()),
new BasicNameValuePair("description", newIssue.getDescription())))
.returnContent().asString();
JsonElement parsed = new JsonParser().parse(json);
return parsed.getAsJsonObject().get("issue_id").getAsInt();
}
private void skipIfNotFixed(int issueId) throws IOException {
if (isIssueOpen(issueId)) {
throw new SkipException("Ignored because of issue " + issueId);
}
}
private boolean isIssueOpen(int issueId) throws IOException {
String query = "http://demo.bugify.com/api/issues/" + issueId + ".json?attachments=false&comments=false&followers=false&history=false";
String json = getExecutor().execute(Request.Get(query)).returnContent().asString();
JsonElement parsed = new JsonParser().parse(json);
JsonElement issues = parsed.getAsJsonObject().get("issues");
Set<Issue> fromJson = new Gson().fromJson(issues, new TypeToken<Set<Issue>>() {
}.getType());
String issueState = fromJson.iterator().next().getStateName();
System.out.println(issueState);
if ((issueState.equals("Resolved")) || (issueState.equals("Closed")) ||
(issueState.equals("Fixed"))) {
return false;
}
return true;
}
}
|
Prosaposin deficiency and saposin B deficiency (activator-deficient metachromatic leukodystrophy): report on two patients detected by analysis of urinary sphingolipids and carrying novel PSAP gene mutations. Prosaposin deficiency (pSap-d) and saposin B deficiency (SapB-d) are both lipid storage disorders caused by mutations in the PSAP gene that codes for the 65-70 kDa prosaposin protein, which is the precursor for four sphingolipid activator proteins, saposins A-D. We report on two new patients with PSAP gene defects; one, with pSap-d, who had a severe neurovisceral dystrophy and died as a neonate, and the other with SapB-d, who presented with a metachromatic leukodystrophy-like disorder but had normal arylsulfatase activity. Screening for urinary sphingolipids was crucial to the diagnosis of both patients, with electrospray ionization tandem mass spectrometry also providing quantification. The pSap-d patient is the first case with this condition where urinary sphingolipids have been investigated. Multiple sphingolipids were elevated, with globotriaosylceramide showing the greatest increase. Both patients had novel mutations in the PSAP gene. The pSap-d patient was homozygous for a splice-acceptor site mutation two bases upstream of exon 10. This mutation led to a premature stop codon and yielded low levels of transcript. The SapB-d patient was a compound heterozygote with a splice-acceptor site variant exclusively affecting the SapB domain on one allele, and a 2 bp deletion leading to a null, that is, pSap-d mutation, on the other allele. Phenotypically, pSap-d is a relatively uniform disease of the neonate, whereas SapB-d is heterogeneous with a spectrum similar to that in metachromatic leukodystrophy. The possible existence of genotypes and phenotypes intermediate between those of pSap-d and the single saposin deficiencies is speculated. INTRODUCTION Prosaposin (pSap) is a non-enzymic 65-70 kDa glycoprotein encoded by the PSAP gene . Amongst its roles, pSap is the precursor for four saposins (Saps) A-D, which are formed by proteolysis. The Saps, also known as sphingolipid activator proteins, are indispensable cofactors for the intralysosomal degradation of a number of sphingolipids and seem to interact directly with the specific lipid hydrolases and/or facilitate presentation of the lipid substrates to these enzymes . Defects in the PSAP gene can cause a deficiency of either the entire pSap protein (prosaposin deficiency, pSap-d) or an individual Sap: SapA-d, SapB-d, SapC-d, or SapD-d, with, to date, SapD-d only being reported in an animal model . 
In humans, pSap-d is a unique neonatal condition with an acute generalized neurovisceral dystrophy associated with the storage of multiple sphingolipids, whereas each isolated Sap deficiency is generally similar to a particular sphingolipid hydrolasedeficiency, namely, SapA-d to Krabbe leukodystrophy , SapB-d to metachromatic leukodystrophy (MLD), and SapC-d to Gaucher disease . The pathologies and biochemical phenotypes observed in pSap-d and the single Sap-deficient diseases have provided indirect insight into the specific roles and normal functions, including certain neurotrophic effects, of p-Sap and/or the individual Saps. We report on two additional patients, one with pSap-d and the other with SapB-d. Both patients were detected by urinary glycosphingolipid analysis and they also have novel PSAP mutation(s). Tandem mass spectrometry (MS/MS) of the urinary lipids proved to be an efficient screening method. The distinctive pattern found in urine from the present pSap-d patient, with elevations in multiple sphingolipids, including ceramide, constitutes the first urine sphingolipid analysis for this condition. PATIENTS AND METHODS Patient 1 Patient 1 was born at term (weight, 3.2 kg ; length, 50 cm ; occipital-frontal circumference , 32 cm ) after an uneventful pregnancy to parents who were first cousins. The mother had noticed frequent and rhythmic movements of the child in late pregnancy. Directly after birth he had precipitate movements and clonic fits that were resistant to anticonvulsive drugs. Sucking and swallowing were insufficient and tube feeding was started. After 3 weeks he had increased serum Creactive protein and needed additional oxygen. A chest X-ray revealed pulmonary infiltrations. At the age of 4 weeks, he presented with muscle hypotonia, myoclonus and periods of twitching of the right arm and hand that were unresponsive to drugs. The liver and spleen were enlarged, which was confirmed by sonography, with the liver and spleen vertical diameters increased to 7 and 7.5 cm, respectively. Laboratory tests showed increased liver enzymes. Testing of white blood cell lysosomal enzymes revealed a very low galactosylceramide b-galactosidase (EC 3.2.1.46) activity. On ECG, there were signs of mitral insufficiency. In the eye fundi, the optic disks were atrophic, the right more so than the left, and the maculae were not demarcated. Sonography of the slightly microcephalic skull showed small ventricles and periventricular punctate echogenicities. On cerebral magnetic resonance imaging (MRI) a thin corpus callosum and bilateral absence of the gyrus cinguli were found; the periventricular white matter regions showed striking multiple symmetrical signal changes suggestive of gray matter heterotopias ( Fig. 1), although there was no complete iso-intensity with gray matter. An electroencephalogram (EEG) revealed general changes with invariant alpha-activity, multi-focal sharp waves, but no ictual patterns which were also absent when the child had clonic jerk sequences of the limbs and head. At the age of 5 weeks, cerebrospinal fluid analysis revealed an increased total protein (349 mg/dl; normal for age, 53 22 mg/dl). In the smears of a bone marrow aspirate a few storage macrophage-like cells were seen. Electron microscopy of a skin biopsy revealed generalized lysosomal storage (Fig. 2). The child deteriorated and parenteral nourishment was necessary. Repeated pulmonary infections led to death at the age of 55 days (weight, 3.85 kg ; length, not recorded; OFC, 35.5 cm ). 
In a urinary sample collected at day 44 a number of glycosphingolipids were elevated on lipid thin layer chromatography (TLC) (Fig. 3). The complex sphingolipidosis suggested a diagnosis of pSap-d. FIG. 1. Cranial MRI scans (T2 weighted) for patient 1 (pSap-d). Note the absence of gyri cinguli and the corpus callosum, which is thinned. The arrows point to dark lesions arranged in a chain-like manner that are suggestive of gray matter heterotopias. Patient 2 Patient 2 was born abroad 2 weeks before term after an uneventful pregnancy. When he was 7 months old and had started crawling, the parents noticed that the movements of his right arm and leg were poor. When he was admitted at the age of 9 months, he had signs of a mild, right-sided, arm-accentuated spastic hemiparesis. Skull sonography revealed a left-temporal, large porencephalic cyst, and distension of the left-ventricle, suggesting a preceding infarction of the medial cerebral artery. An MRI scan confirmed the medial artery infarction, which probably occurred at or around the time of birth, and frontal sickle-shaped fluid pools were evident, suggesting retarded myelination. At the age of 12 months (weight, 10.5 kg ; length, 80 cm ; OFC, 47.5 cm ), his spastic hemiparesis, with slight flexion of joints, did not prevent him from rolling over, although there was some hypotonia of the trunk muscles. At the age of 18 months his hemiparetic problems, with hand-fisting and accentuated patellar reflex on the right side, were moderate. He was able to reach a standing position. At the age of 23 months he had lost his previous ability to walk a few steps and was unable to stand freely or to crawl as actively as before. His limb muscles had distinctly reduced power. At 25 months of age he had muscle hypotonia, very weak peripheral reflexes and no Babinski sign. Laboratory values revealed an increased cerebrospinal fluid protein of 98 mg/dl (normal for age, 45 ± 15 mg/dl), but lactate was normal. At 28 months he was able to sit, stand, and play a little, but only with assistance. He often had periods of unmotivated crying. At 43 months he had his first generalized epileptic seizure, had lost his active speech and interest in toys, and had lower limb spasticity with a back-curved right knee. His hands were fisted and almost no longer used. His feet were held in the extensor position and he had a distinct spastic tetraparesis. A skull MRI scan revealed extensive white matter lesions with preserved U-fibers. Conduction velocity of the ulnar nerve was reduced to 12.7 m/sec (normal for age, >38 m/sec). Laboratory values showed normal activities of white blood cell lysosomal enzymes, including arylsulfatase A (ASA; EC 3.1.6.8) and galactosylceramide β-galactosidase (EC 3.2.1.46). Repeated infections, EBV positivity, and feeding problems were noted. At the age of 4 years, a preliminary diagnosis of MLD was made because of the finding of highly elevated urinary sulfatide. In view of the normal white blood cell ASA activity, SapB-d was suggested. Five months later the severely retarded, tetraspastic child had almost no eye contact with the examiner or fixation with the eyes. At the age of 5 years there was an additional tonic epileptic fit and the valproate dose was increased to 24 mg/kg body weight. At the age of 6 years (weight, 16.2 kg ; length, 110 cm ; OFC, 53 cm ) generalized muscle hypertonicity, pes equinovarus, and eye pupils that were unreactive to light were noted.
Urine Samples Urine samples (24 hr collection) from patient 1 and 2 were kept frozen at 20 C. Frozen samples from Fabry patients, MLD patients, healthy children up to 12 years and adult controls were available. FIG. 3. Urinary lipids in (a) patient 1 (pSap-d; 44-day-old), and (b) a normal control infant (11-month-old, with similarly low urinary creatinine as in patient 1) analysed by two-dimensional TLC (direction of first solvent was upward, and of the second solvent from left to right). Lipids from each 3 ml urine sample were separated on silica-gel HPTLC plates and stained with anisaldehyde/sulfuric acid. Symbols (alphabetic): g1, glucosylceramide; g2, dihexosylceramides; g3, globotriaosylceramide; gm3, G M3 ganglioside (preliminary identification); sm, sphingomyelin; st, start of sulfatide test; su, start of urinary lipid extract; sul, patient sulfatide (double spot); t, sulfatide test (double spot, different fatty acid type as compared to sul) calibrated to an amount equivalent to 10-fold of the mean normal sulfatide amount in 3 ml urine from 1-to 10-year-old children (n 5). Note the intense spots for g1, g3, and gm3 in (a). Enzyme Assays, Fibroblast Loading Tests, and Lipid Thin Layer Chromatography The indicated methods were used for the determination of enzyme activities with radioactive substrates , loading fibroblasts with radioactive glucosylceramide , globotriaosylceramide , sulfatide and sphingomyelin , the TLC analysis of fibroblasts , extraction of urine (please refer to supporting information S1 which may be found in the online version of this article), and TLC of urinary lipids . The samples were vortexed for a few seconds at 5 min intervals three times. Then 150 ml Milli-Q water was added and the mixing procedure repeated. After 30 min at room temperature the samples were centrifuged (14,500g, 5 min). Each separated lower phase was filtered (PTFE syringe pump filter) and re-washed by mixing with 500 ml Milli-Q water , the mixture centrifuged and separated as above. The final lower phases were evaporated under nitrogen. The dry residues were recovered with 500 ml chloroform-methanol 2:1 (v/v), and divided into 300 ml (positive ion mode) and 200 ml (negative ion mode) aliquots. The aliquots were evaporated under nitrogen. Before analysis the positive ion mode aliquot was dissolved in 300 ml 10 mM ammonium formate (no. 55674 Sigma-Aldrich; grade puriss. p.a. for LC-MS Fluka) in methanol and the negative mode aliquot in 200 ml methanol . Lipids from cultured fibroblasts were extracted as described with chloroform/methanol 2:1 (v/v) containing IST in the above-mentioned concentrations. Tandem Mass Spectrometry (MS/MS) Analysis of Urinary Sphingolipids Electrospray ionization (ESI)-MS/MS analysis. The MS/MS equipment comprised an AB/MDS SCIEX API 3200 triple-quadrupole mass spectrometer (AB/MDS Sciex, Concord, Canada) with an ionspray source and an Agilent 1100 Autosampler (Agilent Technologies, Inc., Santa Clara, CA). Electrospray conditions and mass spectrometer ion optics were optimized for sphingolipid measurements using standard samples with lipid 10 mg/ml (for detailed conditions, see supporting information Table I which may be found in the online version of this article). Direct flow injection analysis, with methanol as the mobile phase, was done at a flow rate of 50 ml/min. 
Using the multiple reaction monitoring mode, the sphingolipids from a given sample were analyzed in series: For each sphingolipid, a 20 ml lipid extract aliquot corresponding to 6 ml urine, or to 0.2 mg fibroblast protein, was injected into the methanol mobile phase. Analysis was optimized for each sphingolipid, resulting in a sufficiently symmetrical peak shape with at least 12 measuring points per peak (e.g., see peaks in supporting information Fig. 1 which may be found in the online version of this article). This method allowed quantification of all major isoforms of each sphingolipid, distinguished by their fatty acid moiety (C16:0 to C26:0 fatty acids, non-substituted types and hydroxy-derivatives; details tabulated in supporting information Table II which may be found in the online version of this article). The negative ion mode was used for the analysis of sulfatide and the positive ion mode for neutral glycosphingolipids, ceramide, and sphingomyelin. Quantification of sulfatide, Gb3Cer, dihexosylceramides (including LacCer and digalactosylceramide), monohexosylceramides (mainly GlcCer), ceramide, and sphingomyelin was done by single point calibration with a standard lipid concentration of 600 ng/ml (external calibration standard) corrected by the signal ratio toward IST. The concentrations of standard lipids and individual IST (see above) in the external calibration point were within the previously determined linear range of 50 ng-7 mg lipid per ml. The concentrations of IST in the external calibration point and in the patient samples were the same . Reproducibility of all measurements was 93% or higher. The Student's t-test was used to determine statistical significance. The following porcine lipid standards were used: Sulfatide (no. 131305P) and sphingomyelin (no. 860062P) were from Avanti Polar Lipids, Inc., Alabaster, AL and Gb3Cer (no. 1067) and ceramide (no. 1056) from Matreya LLC, Pleasant Gap, PA. LacCer (bovine, no. G3166) and GlcCer (from human Gaucher spleen, no. G9884) were from Sigma-Aldrich. The MS/MS method cannot differentiate between the glucose (e.g., in GlcCer) and galactose (e.g., in digalactosylceramide) moieties because of the same mass. Therefore, GlcCer and galactosylceramide, as well as LacCer and digalactosylceramide, were quantified as monohexosylceramides and dihexosylceramides, respectively. Chemical identity and purity of sphingolipids used as calibration standards and as IST were proved by high performance TLC (HPTLC) and mass spectrometry (e.g., see supporting information Fig. 2 which may be found in the online version of this article). Purity of all sphingolipid standards was >97%. Molecular Analysis Genomic DNA and total mRNA were isolated from the patients' cultured fibroblasts and from the parents' peripheral leukocytes by standard techniques. The PSAP gene was analyzed for mutations as described previously . Briefly, all coding exons and intron-exon boundaries were amplified from genomic DNA and sequenced directly from gel-purified PCR products using automated fluorescent sequencers. The mRNA was reverse-transcribed using Superscript II (Gibco ) and oligo-dT. RT-PCR products were sequenced as described above. Sequences were numbered sequentially from the A of the first ATG codon, which was designated 1 (reference genomic sequence: GenBank , NC_000010.9; reference mRNA sequence: GenBank, NM_002778.1; protein sequence: UniProtKB/ Swiss-Prot , P07602 ). Electron Microscopic Findings in Skin Biopsy Confirming earlier results by A. 
Bornemann (T€ ubingen, personal communication), there was generalized dermal lysosomal storage expressed in the eccrine glands, capillary endothelium (Fig. 2), perivascular macrophages, Schwann cells, adipocytes, and some fibroblasts of patient 1. The storage lysosomes were around 1 mm in diameter and were filled with pleiomorphic, predominantly multivesicular structures; individual vesicles were around 170 nm in diameter. Biochemical Findings in Cultured Skin Fibroblasts Fibroblast homogenates from patient 1 had a partially reduced activity of glucosylceramide b-glucosidase (EC 3.2.1.45), a more markedly reduced activity of galactosylceramide b-galactosidase, but a normal sphingomyelinase (EC 3.1.4.12) activity (for quantitative data, see supporting information S6 which may be found in the online version of this article). The activity of galactosylceramide b-galactosidase was normal in fibroblast homogenates from patient 2 (glucosylceramide b-glucosidase and sphingomyelinase not tested). Thin layer chromatographic analysis of lipids extracted from fibroblasts (corresponding to about 0.5 mg protein) of patient 1 revealed intensely stained spots for ceramide, GlcCer, and LacCer similar to ; other glycosphingolipids were not studied. The corresponding spots on chromatograms from control cells were about 3-to 10-fold less intense. Ceramide was additionally quantified by MS/MS and an amount of 16.2 mg/mg fibroblast protein found for patient 1 (normal range , 3.6-6.2 mg/mg). A three to fourfold increase in dihexosylceramide and Gb3Cer concentrations in fibroblasts was also confirmed by MS/MS. Of these lipid elevations, only one was found to occur also in fibroblasts of patient 2: There was a fourfold increase in Gb3Cer. Metabolic experiments with radioactive sphingolipid substrates (tritium-labeled on their ceramide moieties) loaded onto living fibroblast cultures from patient 1 and patient 2 gave similar results to those described for an earlier pSap-d and an earlier SapB-d Urinary Lipid Findings by Thin Layer Chromatography Based on comparison with the staining intensity of lipids on the chromatogram from a control urine sample, the chromatogram for patient 1 (pSap-d) showed markedly increased levels of Gb3Cer, GlcCer, and G M3 ganglioside (preliminary identification), and sulfatides and dihexosylceramides were also elevated (Fig. 3). Sulfatides were also clearly elevated in the urine extract from patient 2 (SapB-d), and there was a slight increase in Gb3Cer. Table I summarizes the quantitative urinary sphingolipid findings in patients 1 (pSap-d) and 2 (SapB-d), as compared to findings in Fabry disease, MLD, and normal controls. The data were standardized relative to the concentration of sphingomyelin as a reference cellular sphingolipid. Lipid marked with footnotes c and e indicate that there was a statistically significant elevation when compared to the appropriate control group (infantile/late infantile controls for pSap-d, SapB-d, and MLD; adult controls for Fabry disease). In patient 2, the elevated concentration of sulfatide (P < 0.001) was of a similar magnitude to that found in MLD (Table I). The distinct increase in the concentration of Gb3Cer (P < 0.001) was also remarkable. The high elevations of the two urinary marker glycolipids, Gb3Cer, and sulfatide, had different proportions in patients 1 and 2: Gb3Cer and sulfatide concentrations accounted for 60% and 20% of excreted glycosphingolipids in patient 1, respectively, while they accounted for 21% and 59% in patient 2. 
In Figure 4, the percent distribution of concentrations of the main urinary sphingolipids is shown. This format, which allowed for a simple standardization of urinary lipid values in the absence of reference parameters, confirmed most of the findings summarized in Table I. The combined percentages for sulfatide, Gb3Cer, dihexosylceramides, glucosylceramide, and ceramide was higher in the diseases studied than for controls, with pSap-d having the highest percentage, consistent with the unique urinary multiple sphingolipid elevations in this condition. PSAP Gene Analysis Patient 1 (pSap-d). Patient 1 was found to be homozygous for a point mutation, c.1006-2A > G, in the PSAP gene. The mutation is located two bases upstream of the exon 10 acceptor splice site and alters the consensus splice site sequence. Analysis of the patient's cDNA sequence identified a splicing error with activation of a cryptic splice site in intron 9 leading to both an insertion of 70 bases from the intronic sequence into the mRNA (r.1006-70_1006-lins) and a premature stop codon. Each of the parents was found to be a carrier for the mutation. While it was possible to amplify the mutant mRNA by RT-PCR in the patient, the same analysis in the parents showed that, in comparison to the wild-type transcript, only trace amounts of the mutant transcript were present. Patient 2 (SapB-d). Patient 2 was found to be heterozygous for two mutations. The first mutation, c.577-2A > G, is a splicing mutation located in the acceptor splice site of intron 5. The second mutation is a 2bp deletion, c.828-829delGA, located in exon 8. The latter mutation leads to a frameshift and a premature stop codon. No transcript corresponding to the c.828-829delGA sequence was detected by RT-PCR in the patient. As this mutation leads to a premature stop codon, the most likely explanation for the absence of transcript is nonsense-mediated decay. Two transcripts from the other allele, which carries the c.577-2A > G, were found. Analysis of these indicated that the acceptor splice site in intron 5 was rendered non-functional by the mutation, with either of two different downstream acceptor sites being used instead. In the first transcript a cryptic splice site in exon 6 was used, since the mRNA contained a deletion of 21 bp from exon 6 (15 of them encoding SapB; r.577-_597del). In the second transcript the whole of exon 6 was deleted (r.577_720del) and the exon 7 acceptor splice site was used. These in-frame deletions affected only the SapB domain, while the sequence encoding the remaining Saps was intact and in frame. Analysis of the parental genomic DNA showed that the mother was a carrier of c.577-2A > G sequence variant while the father was heterozygous for c.828-829delGA. DISCUSSION Disorders caused by defects in the PSAP gene form a poorly known sub-group of lysosomal lipid storage diseases that are clinically and metabolically highly variable. Reports on the few known cases of pSap-d have indicated that this disorder should be considered in the differential diagnosis of neonates with unexplained neurologic signs, in particular, if these are combined with visceral involvement. The central nervous system changes in pSap-d may be caused not only by early lipid storage, but also by primary deficits in the organization of cerebral architecture, since pSap and/or some of its products are known to have essential neurotrophic functions. The present pSapd patient was clinically and biochemically very similar to the earlier reported pSap-d patients . 
The diagnosis in the present SapB-d patient was complicated by the early finding of a massive infarction of the left arteria cerebri media, presumably coincidental to the leukodystrophy. However, the finding of high urinary sulfatide eventually led to a molecular diagnosis of SapB-d. This patient was similar to earlier reported patients . Descriptions of other patients suggest a highly variable clinical phenotype in SapB-d, similar to that in ASA-deficient MLD. Urine is an accessible and non-invasive sample that can also be considered to be an ''indirect kidney biopsy'' for lipid and other analyses, due to the presence of portions of kidney cells . Solid materials from desquamated renal tubule epithelial and glomerular cells seem to provide the main source of urinary sphingolipids and other lipids, although some contribution from blood cells and plasma cannot be excluded, for example, in patients with kidney disease. For patient and control urine samples, the proportion of each of the main sphingolipids relative to the total concentration of these sphingolipids can be calculated without standardization to a reference parameter (Fig. 4). In normal controls sphingomyelin accounted for more than 60% of these sphingolipids, but in the patients, the proportion of sphingomyelin was considerably less due to the preponderance of other sphingolipids. The relative proportions of individual urinary sphingolipids were, in general, diagnostically informative. However, the quantitative lipid signals from the mass spectrometric sphingolipid analysis of urine extracts should be standardized to a reference parameter (supporting information S8 may be found in the online version of this article). We used the ratio of signals for the different sphingolipids, including ceramide, to the urinary sphingomyelin concentration (Table I), in analogy to a described approach . In this study we have demonstrated for the first time the use of urinary sphingolipid analysis when diagnosing the rare pSap-d condition. We have also confirmed the usefulness of this procedure when screening for SapB-d and other sphingolipidoses (Table I). In particular, urinary lipid analysis by ESI-MS/MS for the present pSap-d neonate verified the complex urinary lipid changes in this condition and allowed quantification of individual sphingolipid classes. There was a large increase in the concentration of Gb3Cer (within the range seen in adult Fabry disease) along with increases in sulfatide, dihexosylceramides (LacCer and digalactosylceramide), GlcCer, and ceramide. On the other hand, ESI-MS/MS analysis in the new patient with SapB-d showed a urinary concentration of sulfatide similar to that in MLD, and an elevated concentration of Gb3Cer, though lower than in adult Fabry disease males (see also supporting information S9 which may be found in the online version of this article). The urinary concentration of LacCer/ digalactosylceramide was also increased in the SapB-d patient (Table I). Digalactosylceramide is a substrate that, like Gb3Cer, is thought to depend on intact SapB for its degradation by a-galactosidase A (EC 3.2.1.22) and, therefore, would be expected to be increased in SapB-d. Other authors have also reported elevated levels of this dihexosylceramide (in addition to LacCer, Gb3Cer, and sulfatide) in urine from SapB-d patients. 
The c.1006-2A > G splice-site mutation in patient 1 (pSap-d) results in a premature stop codon with, when compared to the wildtype transcript, only traces of the mutant transcript detected by RT-PCR, indicating that it was probably largely removed through nonsense-mediated decay. Consistent with this view, another premature stop codon mutation located downstream of the SapB domain, as in the present patient, also results in an apparently mRNA-negative (pSap-d) allele . Although we could not completely exclude the possibility that the residual mutant transcript gave rise to small amounts of functional SapA and SapB in our patient in vivo, the patient's clinical and biochemical phenotype was as severe as in previously described patients, supporting the view that functional SapA and SapB were absent. The two mutations carried by patient 2 (SapB-d) affect the fate of the prosaposin transcript in very different ways. A 2 bp deletion (c.828-829delGA) in exon 8 on one allele results in complete absence of the transcript, most likely due to nonsense-mediated decay, so no pSap or Saps would be generated from this allele. On the other allele, the c.577-2A > G splicing mutation leads to the formation of two alternative transcripts, both of which carry an inframe deletion of a portion of the SapB domain. This mutation exclusively affects the SapB domain and should not result in the deficiency of other Saps. In keeping with this finding, the biochemical and also clinical phenotype in this patient were consistent with an isolated absence of SapB. To date, four different genotypes (all homozygous), including the one for the present patient, have been reported for pSap-d . All have led to a complete loss of functional pSap and Saps and a rather uniform phenotype. In contrast, for the SapB-d condition not only late infantile but also later manifesting, even adult, phenotypes have been described. SapC-d is similar to SapB-d in this respect, with patients presenting with a range of phenotypes including neuronopathic and non-neuronopathic forms . In deficiencies of single Saps the second allele can be a null (pSap-d) allele, with the three active Saps derived from only one allele (refer to for examples of SapC-d and the patients described in and the present patient for examples of SapB-d). However, other mutant PSAP genotypes may occur which affect the function of more than one Sap domain but less than the whole pSap. In a two-saposin-deficient (Saps C and D inactive) mouse model, the resulting phenotype included some but not all features of pSap-d mice . One 28-monthold patient was described to have SapB-d, but was not studied molecularly. Of note, his clinical and pathologic findings indicated a generalized neurovisceral dystrophy comparable to that in pSap-d, in addition to symptoms suggestive of MLD. The study of additional mutant PSAP genotypes and phenotypes should contribute to the better understanding of the functions of the pSap/Saps system.
|
Former President George W. Bush shared a photo from Trinity Forest Golf Club in Dallas after shooting the first hole-in-one of his life.
Former President George W. Bush shot the first hole-in-one of his life on Wednesday at Trinity Forest Golf Club in Dallas, and he shared a photo from the par-3 12th hole, listed at 164 yards from the white tees.
"With coaching from @thebushcenter CEO Ken Hersh and board members Mike Meece and Bill Hickey, I scored my first hole-in-one at the home of our Warrior Open and the @attbyronnelson. Next golf goal: live to 100 so I can shoot my age."
Bush's post had more than 40,000 likes within an hour. Bonus points for the Presidents Cup sweater!
The George W. Bush Presidential Center annually hosts the three-day Warrior Open at Trinity Forest, a tournament for members of the U.S. Armed Forces who were wounded overseas.
The PGA Tour began hosting the AT&T Byron Nelson at Trinity Forest in 2018.
|
package ar.com.hjg.pngj.awt;
import java.awt.image.BufferedImage;
import java.io.File;
import java.util.HashSet;
import java.util.List;
import java.util.Random;
import java.util.Set;
import junit.framework.AssertionFailedError;
import org.junit.After;
import org.junit.Test;
import ar.com.hjg.pngj.ImageInfo;
import ar.com.hjg.pngj.PngHelperInternal;
import ar.com.hjg.pngj.PngReaderByte;
import ar.com.hjg.pngj.awt.ImageLineBI.BufferedImage2PngAdapter;
import ar.com.hjg.pngj.chunks.PngChunkTRNS;
import ar.com.hjg.pngj.cli.CliArgs;
import ar.com.hjg.pngj.test.TestSupport;
public class TestImageLineBI {
private static final boolean removeTmpFiles = false;
private static final boolean verbose = false;
private Set<File> tmpFilesToDelete = new HashSet<File>();
public static List<File> getImagesBank(int bank) {
List<File> pngs = null;
if (bank == 1)
pngs = CliArgs.listPngFromDir(TestSupport.absFile("colormodels"), true);
if (bank == 2)
pngs = CliArgs.listPngFromDir(TestSupport.absFile("grays"), true);
return pngs;
}
private void delOnExit(File f) {
tmpFilesToDelete.add(f);
}
Random rand = new Random(1);
/** Reads via our BIreader, writes via ImageIO, and compares pixel by pixel */
private void testRead(File ori, boolean preferCustom) {
PngReaderBI png = new PngReaderBI(ori);
if (verbose)
PngHelperInternal.debug(String.format("====testing with values %s cust=%s==",
ori.getName() + " " + png.imgInfo.toStringBrief(), preferCustom));
png.setPreferCustomInsteadOfBGR(preferCustom);
File dest = TestSupport.absFile("test/__test.tmp.png");
delOnExit(dest);
BufferedImage img = png.readAll();
if (verbose)
PngHelperInternal.debug(ImageIoUtils.imageTypeName(img.getType()));
ImageIoUtils.writePng(dest, img);
TestSupport.testSameValues(ori, dest);
}
private void testWrite1(File f, int convToType, boolean forceRgb) {
testWrite1(f, convToType, forceRgb, 1);
}
/**
* Reads via via ImageIO, (optionally does a BI conversion), writes via our
* PngWriterBI and compares pixel by pixel
*
* @param f Png file
* @param convToType one of the BufferedImage types (negative if not conversion)
* @param forceRgb If true, we force resort to getRGB in our internal logic
*/
private void testWrite1(File f, int convToType, boolean forceRgb, int tolerance) {
try {
File dest = TestSupport.absFile("test/__test.tmp.png");
delOnExit(dest);
BufferedImage bi1 = ImageIoUtils.readPng(f);
if (verbose)
PngHelperInternal.debug(f + " type=" + ImageIoUtils.imageTypeName(bi1.getType()) + " conv to "
+ (convToType > -1 ? ImageIoUtils.imageTypeName(convToType) : "-") + " force RGB=" + forceRgb);
BufferedImage bi2 = null;
if (convToType > 0 && convToType != bi1.getType()) {
bi2 = new BufferedImage(bi1.getWidth(), bi1.getHeight(), convToType);
bi2.getGraphics().drawImage(bi1, 0, 0, null);
}
BufferedImage2PngAdapter adap = new BufferedImage2PngAdapter(bi2 != null ? bi2 : bi1);
adap.forceresortToGetRGB = forceRgb;
PngWriterBI pngw = PngWriterBI.createInstance(adap, dest);
pngw.writeAll();
TestSupport.testSameValues(f, dest, tolerance);
} catch (AssertionFailedError e) {
System.err.println("Error with " + f + " typeconv=" + ImageIoUtils.imageTypeNameShort(convToType)
+ " forceRgb=" + forceRgb);
throw e;
}
}
@Test
public void testRead1() {
for (File png : getImagesBank(1)) {
testRead(png, rand.nextBoolean());
}
}
@Test
public void testRead2() {
for (File png : getImagesBank(2)) {
testRead(png, rand.nextBoolean());
}
}
@Test
public void testReadPartialRgb8() {
List<File> pngs = getImagesBank(1);
File ori = pngs.get(0);
File dest = TestSupport.absFile("test/__test.tmp.png");
File dest2 = TestSupport.absFile("test/__test2.tmp.png");
delOnExit(dest);
delOnExit(dest2);
{
PngReaderBI png = new PngReaderBI(ori);
int nlines = png.imgInfo.rows / 2 - 1, offset = 1, step = 2;
// dest.deleteOnExit();
BufferedImage img = png.readAll(nlines, offset, step); // 10 lines, starting for 1, skipping 1
// System.err.println(ImageIoUtils.imageTypeName(img.getType()));
ImageIoUtils.writePng(dest, img);
TestSupport.copyPartial(ori, dest2, nlines, step, offset, false);
}
TestSupport.testSameValues(dest, dest2);
}
@Test
public void testWrite1() {
for (File png : getImagesBank(1)) {
PngReaderByte p = new PngReaderByte(png);
ImageInfo imi = p.imgInfo;
boolean hastrns = p.getChunksList().getById1(PngChunkTRNS.ID) != null;
boolean isalphaortrns = hastrns || imi.alpha;
p.close();
// don't try conversions that lose info
if (imi.bitDepth <= 8 && !isalphaortrns) {
testWrite1(png, BufferedImage.TYPE_3BYTE_BGR, rand.nextBoolean());
testWrite1(png, BufferedImage.TYPE_INT_RGB, rand.nextBoolean());
testWrite1(png, BufferedImage.TYPE_INT_BGR, rand.nextBoolean());
}
if (imi.bitDepth <= 8) {
// testWrite1(png, BufferedImage.TYPE_4BYTE_ABGR, rand.nextBoolean());
// THIS FAILS testWrite1(png, BufferedImage.TYPE_INT_ARGB, rand.nextBoolean());
// testWrite1(png, BufferedImage.TYPE_4BYTE_ABGR_PRE, rand.nextBoolean());
}
// testWrite1(png, BufferedImage.TYPE_CUSTOM, rand.nextBoolean());
}
}
@Test
public void testWrite2() { // needs 1 tolerance http://stackoverflow.com/questions/23707736/
File png = TestSupport.absFile("colormodels/04pt.png");
testWrite1(png, BufferedImage.TYPE_4BYTE_ABGR, false, 1);
}
@Test
public void testWrite3() { // needs a higher tolerance (5) than testWrite2
File png = TestSupport.absFile("colormodels/08ax.png");
testWrite1(png, BufferedImage.TYPE_4BYTE_ABGR, false, 5);
}
@Test
public void testWriteLinePerLine() {
File f = TestSupport.absFile("colormodels/08.png");
File dest = TestSupport.absFile("test/__test.tmp.png");
delOnExit(dest);
BufferedImage bi1 = ImageIoUtils.readPng(f);
PngWriterBI pngw = PngWriterBI.createInstance(bi1, dest);
// you'll rarely want to do this
for (int r = 0; r < pngw.imgInfo.rows; r++) {
pngw.writeRow(pngw.getIlineset().getImageLine(r));
}
pngw.end(); // Dont forget this.
TestSupport.testSameValues(f, dest);
}
@Test
public void testWritePartial() {
File f = TestSupport.absFile("colormodels/08.png");
File dest = TestSupport.absFile("test/__test.tmp.png");
delOnExit(dest);
File dest2 = TestSupport.absFile("test/__test2.tmp.png");
delOnExit(dest2);
int nlines = 10, offset = 1, step = 2;
BufferedImage bi1 = ImageIoUtils.readPng(f);
PngWriterBI pngw = PngWriterBI.createInstance(bi1, nlines, offset, step, dest);
pngw.writeAll();
pngw.end(); // not necessary here
TestSupport.copyPartial(f, dest2, nlines, step, offset, false);
TestSupport.testSameValues(dest2, dest);
}
@After
public void endTest() {
if (removeTmpFiles)
for (File f : tmpFilesToDelete) {
f.delete();
}
}
}
|
/* gcc-gcc-7_3_0-release/libgomp/testsuite/libgomp.c/pr66133.c */
/* PR middle-end/66133 */
/* { dg-do run } */
#include <stdlib.h>
#include <unistd.h>
volatile int x;
__attribute__((noinline)) void
foo (void)
{
if (x == 0)
{
#pragma omp task
{
usleep (2000);
exit (0);
}
}
else
abort ();
}
int
main ()
{
#pragma omp parallel num_threads (2)
{
#pragma omp barrier
#pragma omp single
foo ();
}
exit (0);
}
|
Roving Wireless Sensor Network ECE 561 Project Report, May 2009 Mobile wireless sensor networks show great potential in a number of important disciplines such as search-and-rescue, military reconnaissance, environmental monitoring, and extraterrestrial exploration. This project focuses on the embedded system design acting as a test bed for experimentation. Keywords-component; networks; sensors; wireless; autonomous; robots; distributed; rover; energy harvesting; Figure 1: Rovers Under Test I. OVERVIEW AND MOTIVATION Our very society is in essence a network of distributed capabilities and distributed intelligence. Although some may clearly have more impact than others, no one individual is responsible for the success or failure of our species as a whole. The tasks of keeping our society running are distributed throughout this ad-hoc network, and even significant disasters can eventually be overcome, or at least worked around. In varying degrees and scales this theme is recurrent throughout both our human society and the natural world, from our global food network to the running of university department, from a colony of ants to a pride of lions, from a flock of birds to coral reef to the very cells making up our bodies. The distribution of tasks among a multitude of similar yet specialized, semiautonomous entities is perhaps the system most proven to be successful. We are only just beginning to successfully emulate such systems by explicit design, though not for lack of trying. Although the preceding paragraph may seem out of place in the context of a paper concerning sensor networks, it is not. We, as ordinary individuals, are nodes of perhaps the largest distributed sensor network on the planet. (It is not necessarily surprising then, that as individual nodes, we are generally not aware of what the ultimate purpose may be.) In contrast, while the vast majority of our engineered hardware systems to date have been comprised of specialized parts or subsystems, there was little or no ability for one subsystem to adapt to the task of another that was no longer able to perform. Extremely high stakes tasks such as the Voyager space probes tended to be engineered for the utmost reliability, with perhaps some redundancy. With that route came very high costs and long development cycles. They were essentially single, very high uptime sensor nodes for a multi-master-single-slave peer-topeer wireless network. On a day to day basis, any discrepancies required a cessation of activities until new commands were received from a command center one of the master nodes. Communication between these master node located on Earth and the sensor node located on Mars can also have an enormous propagation delay which greatly reduces system efficiency. For example the closest distance from the Earth to Mars in the last 60,000 years was 55.758*10^9 meters in 2003. If the RF signals travel at the speed of light (3*10^8 meters / second) then the one way signal propagation delay is 185 seconds. If this communication can be minimized and collaboration between multiple nodes located on the Mars surface can replace some of this communication then system efficiency and necessary system throughput will be greatly increased. NASA's reduced-cost Mars rovers however, although still individually complicated, took a slightly more distributed approach. 
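As a quick numerical check of the propagation delay quoted earlier in this section (a throwaway script, not part of the original report; the figures are simply those given in the text):
# One-way radio propagation delay at the 2003 Earth-Mars closest approach.
distance_m = 55.758e9    # closest Earth-Mars distance in 2003, meters (from the text)
speed_of_light = 3e8     # assumed RF propagation speed, meters per second
delay_s = distance_m / speed_of_light
print(f"one-way delay: {delay_s:.1f} s")  # ~185.9 s, which the report rounds to 185 seconds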
At the highest level, one could consider the pair to be two identical sensor nodes of a multi-path, multi-hop tree network where each rover could communicate with one of at least three Mars orbiting satellites. The satellites serve as routing nodes to the Earth-based command centers, which can be considered coordinator nodes. A large advantage, however, was the addition of software that provided the rovers with a limited amount of autonomy. They no longer had to wait for a command from Earth to achieve their essential goals valid scientific data acquisition or to find a way around the small rock in their path any particular day. Even their mechanical systems contained sensor networks, such as their six wheels with rotation and motor current sensors that have resulted in a drive system with exceptional fault tolerance. In light of the first paragraph, it is the view of this author that NASA has actually taken a small step towards a design plan that, far from being unknown, has already been proven by nature. The use of two identical, mobile rovers as mobile sensor nodes greatly increased the scientific return while at the same time lowering total costs and the risks of total failure. We endeavor to create a low-cost platform capable of taking an additional step, by further reducing mobile sensor node (rover) complexity, increasing the number of nodes and utilizing the increased routing possibilities to incorporate modern sensor network techniques.
|
package com.airbnb.lottie.value;
/**
* Static value version of {@link LottieFloatRelativeValueCallback}.
*/
public class LottieStaticFloatRelativeValueCallback extends LottieFloatRelativeValueCallback {
private final float offset;
public LottieStaticFloatRelativeValueCallback(float offset) {
this.offset = offset;
}
@Override
public Float getOffset(
float startFrame, float endFrame,
Float startValue, Float endValue,
float linearKeyframeProgress, float interpolatedKeyframeProgress,
float overallProgress) {
return offset;
}
}
|
Killing Klytaimnestra: Matricide Myths on Etruscan Bronze Mirrors Etruscan artisans manufactured several types of luxurious bronze mirrors between the mid-sixth and second centuries BCE. These artifacts had both practical and symbolic functions, first within the private sphere of the home and later in the funerary environment. In addition to reflecting their owners social status and wealth, their polished surfaces were used during the daily toilette to project a self image and as a means for witnessing the transformation of that image via sumptuous clothing, luxurious jewelry and costly cosmetics. Some mirrors record the names of their owners, women such as Thanchvil Fulnia, Ceithurnea, Ramtha Paithna, and Scata Vechnsa, while others demonstrate that they were exchanged as gifts.1 During the fourth century BCE, Tite Cale, for example, honored his mother with the gift of a mirror (Fig. 1), although it is unclear whether the artifact was given to his mortal mother or if it was dedicated to a mother goddess, perhaps Turan.2 In another instance, a man named Arnt is mentioned as the giver, while a third example indicates that Etruscan women also presented mirrors to men.3 In addition to their association with adornment4 and gift exchange, Etruscan mirrors have been connected with marriage and prophecy. Many scholars believe that they were presented to brides and grooms at elite weddings,5 and/or used to understand an individuals fate or fortune, possibly at times of transition.6 Finally, many mirrors functioned as a sophisticated form of visual communication within the domestic sphere, evoking the values and ideals of their patrons and users, most of whom had a sophisticated and nuanced understanding of Greek myth and literature. Narratives frequently drawn from the latter but best understood as interpretatii etrusche,7 along with single/multi-figured compositions featuring gods, goddesses, heroes and heroines, were engraved or executed in relief on the reserve sides, providing changeless scenes that could be contemplated whenever the objects were not in use. Addressed to both women and men,8 their messages project
|
/*
* Copyright 2019-2020 VMware, Inc.
* All Rights Reserved.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.apache.org/licenses/LICENSE-2.0
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package objects
import (
"sync"
"github.com/vmware/load-balancer-and-ingress-services-for-kubernetes/pkg/utils"
)
var lbinstance *lbLister
var lbonce sync.Once
// SharedlbLister returns the package-level singleton lbLister, creating it on first use.
func SharedlbLister() *lbLister {
lbonce.Do(func() {
lbinstance = &lbLister{
lbStore: NewObjectMapStore(),
sharedVipKeyToServicesStore: NewObjectMapStore(),
serviceToSharedVipKeyStore: NewObjectMapStore(),
}
})
return lbinstance
}
type lbLister struct {
lbStore *ObjectMapStore
// annotationKey -> [svc1, svc2, svc3]
sharedVipKeyToServicesStore *ObjectMapStore
// svc1 -> annotationKey
serviceToSharedVipKeyStore *ObjectMapStore
}
func (a *lbLister) Save(svcName string, lb interface{}) {
utils.AviLog.Debugf("Saving lb svc :%s", svcName)
a.lbStore.AddOrUpdate(svcName, lb)
}
func (a *lbLister) Get(svcName string) (bool, interface{}) {
ok, obj := a.lbStore.Get(svcName)
return ok, obj
}
func (a *lbLister) GetAll() interface{} {
obj := a.lbStore.GetAllObjectNames()
return obj
}
func (a *lbLister) Delete(svcName string) {
a.lbStore.Delete(svcName)
}
// UpdateSharedVipKeyServiceMappings records the svc -> key mapping and appends svc to the
// key's service list if it is not already present.
func (a *lbLister) UpdateSharedVipKeyServiceMappings(key, svc string) {
a.serviceToSharedVipKeyStore.AddOrUpdate(svc, key)
found, services := a.GetSharedVipKeyToServices(key)
if found {
if utils.HasElem(services, svc) {
return
}
services = append(services, svc)
a.sharedVipKeyToServicesStore.AddOrUpdate(key, services)
return
}
a.sharedVipKeyToServicesStore.AddOrUpdate(key, []string{svc})
}
// RemoveSharedVipKeyServiceMappings removes svc from both mapping stores, deleting the
// shared-vip key entry entirely once no services reference it.
func (a *lbLister) RemoveSharedVipKeyServiceMappings(svc string) bool {
if found, key := a.GetServiceToSharedVipKey(svc); found {
if foundServices, services := a.GetSharedVipKeyToServices(key); foundServices {
services = utils.Remove(services, svc)
if len(services) == 0 {
a.sharedVipKeyToServicesStore.Delete(key)
} else {
a.sharedVipKeyToServicesStore.AddOrUpdate(key, services)
}
}
}
a.serviceToSharedVipKeyStore.Delete(svc)
return true
}
func (a *lbLister) GetSharedVipKeyToServices(key string) (bool, []string) {
found, serviceList := a.sharedVipKeyToServicesStore.Get(key)
if !found {
return false, make([]string, 0)
}
return true, serviceList.([]string)
}
func (a *lbLister) GetServiceToSharedVipKey(svc string) (bool, string) {
found, key := a.serviceToSharedVipKeyStore.Get(svc)
if !found {
return false, ""
}
return true, key.(string)
}
|
/**
* Call doActionPerformed with a console, or show an error if we are not able
* to find a console.
*
* @param e see DumbAwareAction#actionPerformed(e)
*/
@Override public void actionPerformed(@NotNull AnActionEvent e) {
OCamlConsoleRunner runner = OCamlConsoleToolWindowFactory.getOCamlConsoleRunner();
if (runner == null) {
var notification = new OCamlNotificationData(OCamlBundle.message("repl.no.started.desc"));
notification.mySubtitle = OCamlBundle.message("repl.no.started.title");
notification.myNotificationType = NotificationType.ERROR;
OCamlNotifications.notify(notification);
} else if (runner.isNotAbleToRun()) {
var notification = new OCamlNotificationData(OCamlBundle.message("repl.run.command.no.sdk"));
notification.mySubtitle = OCamlBundle.message("repl.run.command.no.sdk.info");
notification.myNotificationType = NotificationType.ERROR;
OCamlNotifications.notify(notification);
} else {
doActionPerformed(e, runner);
}
}
|
def findSd(x):
    # digit sum of x
    x = str(x)
    return sum([int(d) for d in x])

n = int(input())
sd = findSd(n)
# increase n until its digit sum is divisible by 4 ("sd & 3" is the remainder mod 4)
while sd & 3:
    n += 1
    sd = findSd(n)
print(n)
|
#pragma once
#include <time.h>
#include <string>
#include <Windows.h>
class MTime
{
public:
MTime(int y, int m, int d)
{
year = y;
month = m;
day = d;
}
static MTime to_MTime(std::string &ntime); // Convert a valid input such as "2019.6.1" into an MTime instance
std::string getTimeString(char segment = '.'); // Convert the MTime to a string for output (default separator is '.')
int remainingTime(); // Return how many days remain from today until this (later) date, excluding today
bool operator==(MTime &b);
bool operator>(MTime &b);
private:
int year;
int month;
int day;
};
|
Polish Doomsters Load Up A Second Hit
Overdriven tube amps? Check. Liberal use of the wah pedal? Check. References to weed and satan? Sigh… here we go again. It pains me to say it, but the stoner doom genre has long since been oversaturated. It would seem that every man (not to mention his emphysema-ridden dog) is playing some variation on the well-trodden musical path embarked upon by Bongzilla, Electric Wizard, et al back in the 90s. In order to stand out in this scene these days, you need to be either really fucking weird or just really fucking good. It will satisfy fans to know that BelzebonG cement themselves firmly into the latter category on Greenferno, their sophomore album.
How is the sound ?
Here you have a band who seem to grasp that the whole point of playing music is to have fun. If that’s your bag, then this is an absolute blast from start to finish!
The riffs on Greenferno are surprisingly upbeat (dare I say ‘optimistic’?), characterised by undulating swagger rather than just being slow for slow’s sake. Yes, you’ve heard this kind of stuff time and time again, but that doesn’t make it any less entertaining.
The four tracks on offer have been crafted through jamming them out in the practice room, so there’s an organic progression in each track which makes for great ear candy. The drumming is effortless and disciplined so as not to detract any attention from the massive sludgy riffs and lashings of distant, wailing guitars. Beneath all this lurks an obscenely filthy bass tone, and some of the best moments on the album are the all-too-short jammy passages where this bass takes centre stage.
Production-wise Greenferno is squeaky clean. In fact it’s a little too clean, and it fails to generate that special, primordial atmosphere which comes with any truly great Doom record. Still, production is largely a matter of personal preference, and whether you’re a newcomer to the genre or a seasoned veteran you’ll find this is a very engaging record which will go down an absolute treat.
In summary, this is big, dumb stoner doom with all the psychedelic trimmings, played with real commitment – just don’t expect it to top any “best of” lists come the end of the year.
And here is a little video Mr. Fuzz filmed during DesertFest Belgium:
Why is this album worth listening to?
Powerful, bluesy riffs which straddle sludge and doom
That filthy, muffed-up bass tone – the album is like porn in this respect.
The climax to Goat Smoking Blues is virtually the same riff from Funeralopolis by Electric Wizard. And it absolutely slays!
In what situations should you listen to this album?
Greenferno is such an accessible, catchy record that I find it difficult to think of a situation where it wouldn’t go down a treat. It also clocks in at just 35 minutes, so it’s easy to digest.
Something particular to note ?
According to their Facebook page, Belzebong’s lineup is as follows: “Cheesy dude”, “Sheepy dude”, “Alky dud”, “Hexy dude” and “Boogey dude”. Yeah… moving on…
|
The present invention concerns portable objects of small dimensions, such as wristwatches, that comprise a rotating control stem, the actuation of which controls a mechanical or electronic function of the portable object in which the rotating control stem is arranged.
To properly perform the mechanical or electronic function concerned, it must be possible to detect the actuation of the rotating control stem. Among various possible solutions, one consists in measuring the variation in magnetic induction produced by the rotation of a magnet integral with the control stem. To detect this variation in magnetic induction, it is possible to use a magnetic sensor of the Hall effect type which is capable of measuring the value of magnetic induction of the environment in which it is located.
A recurrent problem that arises in the field of detecting the rotation of a control stem by measuring magnetic induction is that of the reproducibility of the measurement from one portable object to another. Indeed, the portable objects referred to here, such as wristwatches, are produced in large quantities on an industrial scale. It is therefore necessary to take steps to ensure the best possible reproducibility of the magnetic induction measurement from one object to another, without these steps adding too much to the final cost price of the portable object. In order to ensure good reproducibility of a magnetic induction measurement, it must be possible to ensure the proper relative positioning of the magnet and the inductive sensor.
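As a purely illustrative aside (not part of the original filing), the snippet below sketches how firmware might turn measured induction values into a rotation event: the stem's magnet modulates the field seen by the Hall sensor, so a swing beyond a calibrated baseline can be treated as actuation. The sample values, the threshold parameter, and the helper names are hypothetical; a real device would also need debouncing and per-unit calibration to address the reproducibility concern described above.

# Illustrative sketch only: flags rotation of a magnet-bearing control stem from
# Hall-effect readings by watching for swings around a calibrated baseline.
# All readings and constants below are hypothetical placeholders.

def calibrate_baseline(samples):
    """Average a batch of induction readings (e.g. in millitesla) taken while the stem is at rest."""
    return sum(samples) / len(samples)

def detect_rotation(readings, baseline, threshold_mt=0.5):
    """Yield indices of samples whose deviation from the baseline exceeds the threshold,
    i.e. moments at which the stem's magnet has visibly moved relative to the sensor."""
    for i, value in enumerate(readings):
        if abs(value - baseline) > threshold_mt:
            yield i

if __name__ == "__main__":
    rest = [1.02, 1.01, 0.99, 1.00]          # readings with the stem at rest (hypothetical)
    stream = [1.01, 1.00, 1.71, 0.32, 1.02]  # readings while the crown is turned (hypothetical)
    base = calibrate_baseline(rest)
    print(list(detect_rotation(stream, base)))  # -> [2, 3]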
|
#include <eve/module/math.hpp>
#include <eve/wide.hpp>
#include <iostream>
#include <iomanip>
using wide_ft = eve::wide<float>;
using wide_dt = eve::wide<double>;
int main()
{
wide_ft wxf;
wide_dt wxd;
std::cout << "---- simd" << std::setprecision(9) << std::endl
<< "-> inv_e(as<wide_ft>()) = " << eve::inv_e(eve::as<wide_ft>()) << std::endl
<< "-> inv_e(as(wxf)) = " << eve::inv_e(eve::as(wxf)) << std::endl
<< "-> upward(inv_e)(as<wide_ft>()) = " << eve::upward(eve::inv_e)(eve::as<wide_ft>()) << std::endl
<< "-> upward(inv_e)(as(wxf)) = " << eve::upward(eve::inv_e)(eve::as(wxf)) << std::endl
<< "-> downward(inv_e)(as<wide_ft>()) = " << eve::downward(eve::inv_e)(eve::as<wide_ft>()) << std::endl
<< "-> downward(inv_e)(as(wxf)) = " << eve::downward(eve::inv_e)(eve::as(wxf)) << std::endl
<< std::setprecision(17)
<< "-> inv_e(as<wide_dt>()) = " << eve::inv_e(eve::as<wide_dt>()) << std::endl
<< "-> inv_e(as(wxd)) = " << eve::inv_e(eve::as(wxd)) << std::endl
<< "-> upward(inv_e)(as<wide_dt>()) = " << eve::upward(eve::inv_e)(eve::as<wide_dt>()) << std::endl
<< "-> upward(inv_e)(as(wxd)) = " << eve::upward(eve::inv_e)(eve::as(wxd)) << std::endl
<< "-> downward(inv_e)(as<wide_dt>()) = " << eve::downward(eve::inv_e)(eve::as<wide_dt>()) << std::endl
<< "-> downward(inv_e)(as(wxd)) = " << eve::downward(eve::inv_e)(eve::as(wxd)) << std::endl;
float xf;
double xd;
std::cout << "---- scalar" << std::endl
<< "-> inv_e(as<float>()) = " << eve::inv_e(eve::as(float())) << std::endl
<< "-> inv_e(as<xf)) = " << eve::inv_e(eve::as(xf)) << std::endl
<< "-> inv_e(as<double>()) = " << eve::inv_e(eve::as(double()))<< std::endl
<< "-> inv_e(as<xd)) = " << eve::inv_e(eve::as(xd)) << std::endl;
return 0;
}
|
The mycocinogenous strain Tilletiopsis flava VKM Y-2823 was found to possess fungicidal activity at pH 3.5-4.5, which was retained after curing the strain by eliminating the extrachromosomal genetic elements. The mycocin produced by the strain had a molecular mass of more than 10 kDa and was readily inactivated by heating and treatment with protease K. This mycocin was found to be active against species of the anamorphic genus Tilletiopsis. The overwhelming majority of other representatives of the order Tilletiales, as well as ascomycetous and basidiomycetous yeasts of the orders Sporidiales and Tremellales, whether or not they formed ballistospores, were resistant to it.
|
An Integrin-Targeting RGDK-Tagged Nanocarrier: Anticancer Efficacy of Loaded Curcumin Herein we report the design and development of α5β1 integrin-specific noncovalent RGDK-lipopeptide-functionalized single-walled carbon nanotubes (SWNTs) that selectively deliver the anticancer drug curcumin to tumor cells. RGDK tetrapeptide-tagged amphiphiles were synthesized that efficiently disperse SWNTs with a suspension stability index of >80% in cell culture media. 3-(4,5-Dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT)- and lactate dehydrogenase (LDH)-based cell viability assays in tumor (B16F10 melanoma) and noncancerous (NIH3T3 mouse fibroblast) cells revealed the noncytotoxic nature of these RGDK-lipopeptide-SWNT conjugates. Cellular uptake experiments with monoclonal antibodies against αvβ3, αvβ5, and α5β1 integrins showed that these SWNT nanovectors deliver their cargo (Cy3-labeled oligonucleotides, Cy3-oligo) to B16F10 cells selectively via α5β1 integrin. Notably, the nanovectors failed to deliver the Cy3-oligo to NIH3T3 cells. The RGDK-SWNT is capable of delivering the anticancer drug curcumin to B16F10 cells more efficiently than to NIH3T3 cells, leading to selective killing of B16F10 cells. Results of Annexin V binding-based flow cytometry experiments are consistent with selective killing of tumor cells through the late apoptotic pathway. Biodistribution studies in melanoma (B16F10)-bearing C57BL/6J mice showed tumor-selective accumulation of curcumin intravenously administered via RGDK-lipopeptide-SWNT nanovectors.
|
/*
* Copyright (c) 2016, SRCH2
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are met:
* * Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* * Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* * Neither the name of the SRCH2 nor the
* names of its contributors may be used to endorse or promote products
* derived from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
* WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
* DISCLAIMED. IN NO EVENT SHALL SRCH2 BE LIABLE FOR ANY
* DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
* ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
* SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package com.srch2.android.http.app.demo.data.contacts;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import org.json.JSONObject;
import android.content.ContentResolver;
import android.database.Cursor;
import com.srch2.android.http.app.demo.data.SourceDataRecords;
import com.srch2.android.http.app.demo.data.incremental.IncrementalValuePair;
public class IterateContactsDifference {
private static final String TAG = "ContentSourceInspector";
public static HashMap<Boolean, SourceDataRecords> inspectForRecordsToUpdate(ContentResolver cr, HashSet<IncrementalValuePair> latestIncrementalSnapshot) {
HashSet<IncrementalValuePair> currentIncrementalData = getCurrentIncrementalDataValues(cr);
HashMap<Boolean, HashSet<IncrementalValuePair>> results =
IncrementalValuePair.resolveIncrementalDifference(
currentIncrementalData, latestIncrementalSnapshot);
HashSet<IncrementalValuePair> additions = results.get(true);
HashSet<IncrementalValuePair> deletions = results.get(false);
HashMap<Boolean, SourceDataRecords> updateRecordSet = new HashMap<Boolean, SourceDataRecords>();
SourceDataRecords recordsToAdd = (additions != null && additions.size() > 0) ? retrieveAdditions(cr, additions) : new SourceDataRecords(0);
SourceDataRecords recordsToDelete = (deletions != null && deletions.size() > 0) ? retrieveDeletions(deletions) : new SourceDataRecords(0);
updateRecordSet.put(true, recordsToAdd);
updateRecordSet.put(false, recordsToDelete);
return updateRecordSet;
}
/** Returns the hashset of id-version pairs representing the current incremental data, to be used to diff against the latest incremental snapshot. */
private static HashSet<IncrementalValuePair> getCurrentIncrementalDataValues(ContentResolver cr) {
HashSet<IncrementalValuePair> currentIncrementalData = new HashSet<IncrementalValuePair>();
Cursor c = null;
try {
c = IterateRawContacts.getCursor(cr);
if (c.moveToFirst()) {
final int cursorGetCount = c.getCount();
currentIncrementalData = new HashSet<IncrementalValuePair>(cursorGetCount);
//final int nameColumnIndex = c.getColumnIndex(ContactsContract.RawContacts.DISPLAY_NAME_PRIMARY);
//CharArrayBuffer cab = new CharArrayBuffer(40);
//StringBuilder name = new StringBuilder();
long previousId = -1;
do { // note: it was observed after starring a record, there was a new contact id for the same contact
// ie: duplicates must be pruned at some point based on an equality of this class before inserting
final long currentId = c.getLong(0);
if (currentId == 0) { continue; }
/* test to see if can be omited: depends on if adding single will prevent if containing these lines...
c.copyStringToBuffer(nameColumnIndex, cab);
if (cab.sizeCopied != 0) {
name.setLength(0);
name.insert(0, cab.data, 0, cab.sizeCopied);
} else {
continue;
}
*/
final int version = c.getInt(1);
if (currentId != previousId) {
currentIncrementalData.add(new IncrementalValuePair(currentId, version));
previousId = currentId;
}
} while (c.moveToNext());
}
} finally {
if (c != null) {
c.close();
}
}
return currentIncrementalData;
}
/** Retrieves the set of source data records, used to do restful update, representing the new records that need to be added to the index. */
public static SourceDataRecords retrieveAdditions(ContentResolver cr, HashSet<IncrementalValuePair> additions) {
String[] selectIds = ContentProviderConstants.getSelectedIdArgs(additions);
SourceDataRecords records = IterateRawContacts.iterate(IterateRawContacts.getCursor(cr, selectIds));
IterateRawContactEntities.iterate(IterateRawContactEntities.getCursor(cr, selectIds), records);
return records;
}
/** Retrieves the set of ids, used to do restful insert, representing the records that need to be deleted from the index. */
public static SourceDataRecords retrieveDeletions(HashSet<IncrementalValuePair> deletions) {
return new SourceDataRecords(deletions);
}
}
|
import { Button, Col, message, Row } from 'antd';
import FormItem from 'antd/lib/form/FormItem';
import BigNumber from 'bignumber.js';
import _ from 'lodash';
import moment from 'moment';
import React from 'react';
import { InputBase, ITableColDef, ITableApi } from '@/components/type';
import {
KNOCK_DIRECTION_MAP,
LEG_FIELD,
LEG_TYPE_FIELD,
LEG_TYPE_MAP,
OB_DAY_FIELD,
UP_BARRIER_TYPE_MAP,
LEG_ID_FIELD,
} from '@/constants/common';
import { Form2, SmartTable, DatePicker } from '@/containers';
import Form from '@/containers/Form';
import ModalButton from '@/containers/ModalButton';
import PopconfirmButton from '@/containers/PopconfirmButton';
import { UnitInputNumber } from '@/containers/UnitInputNumber';
import { qlDateScheduleCreate } from '@/services/quant-service';
import { getLegEnvs, getMoment, getRequiredRule, remove } from '@/tools';
import { ILegColDef } from '@/types/leg';
import { PAGE_SIZE } from '@/constants/component';
const OB_DAY_STRING_FIELD = 'OB_DAY_STRING_FIELD';
class ObserveModalInput extends InputBase<{
direction?: string;
record: any;
api: ITableApi;
}> {
public state = {
visible: false,
popconfirmVisible: false,
dealDataSource: [],
generateLoading: false,
};
public legType: string;
public record: any;
constructor(props) {
super(props);
this.legType = props.record[LEG_TYPE_FIELD];
this.state.dealDataSource = this.computeDataSource(
(props.value || []).map((item, index) => {
const date = moment(item[OB_DAY_FIELD]);
return {
...item,
[OB_DAY_FIELD]: date,
payDay: Form2.createField(moment(item.payDay ? item.payDay : item[OB_DAY_FIELD])),
price: Form2.createField(item.price),
};
}),
);
}
public computeDataSource = (dataSource = [], reload) => {
let nextDataSource = dataSource.sort(
(a, b) => a[OB_DAY_FIELD].valueOf() - b[OB_DAY_FIELD].valueOf(),
);
const { record } = this.props;
const upBarrier = Form2.getFieldValue(record[LEG_FIELD.UP_BARRIER]);
const upBarrierType = Form2.getFieldValue(record[LEG_FIELD.UP_BARRIER_TYPE]);
const step = Form2.getFieldValue(record[LEG_FIELD.STEP]);
const initialSpot = Form2.getFieldValue(record[LEG_FIELD.INITIAL_SPOT]);
const barrierVal =
upBarrierType === UP_BARRIER_TYPE_MAP.PERCENT
? new BigNumber(initialSpot).multipliedBy(new BigNumber(upBarrier).div(100)).toNumber()
: upBarrier;
nextDataSource = nextDataSource.map((item, index) => {
const price = Form2.getFieldValue(item.price);
let priceData = price;
if (!price || reload) {
if (_.isNaN(barrierVal)) {
priceData = 0;
} else {
priceData = new BigNumber(barrierVal)
.plus(new BigNumber(index).multipliedBy(new BigNumber(step).div(100)))
.decimalPlaces(4)
.toNumber();
}
}
return {
...item,
// payDay: Form2.createField(item[OB_DAY_FIELD]),
...(this.isAutoCallSnow()
? {
price: Form2.createField(priceData),
}
: null),
[OB_DAY_STRING_FIELD]: getMoment(item[OB_DAY_FIELD]).format('YYYY-MM-DD'),
};
});
return nextDataSource;
};
public getRowInstance = () => {
const { api, record } = this.props;
const { tableApi, tableManager } = api;
const id = record[LEG_ID_FIELD];
const row = tableManager.rowNodes.find(item => item.id === id);
return row;
};
public onOpen = () => {
const { api, record } = this.props;
const { tableApi, tableManager } = api;
const id = record[LEG_ID_FIELD];
const row = this.getRowInstance();
row.node.changeDropdownMenuVisible(false);
row.node.switchDropdownMenu(false);
tableApi.looseActive();
this.setState({
visible: true,
});
};
public onOk = async () => {
const row = this.getRowInstance();
row.node.switchDropdownMenu(true);
this.setState(
state => ({
visible: !state.visible,
}),
() => {
const val = this.state.dealDataSource.map(item => {
const DataItem = Form2.getFieldsValue(item);
return {
..._.omit(DataItem, OB_DAY_STRING_FIELD),
[OB_DAY_FIELD]: item[OB_DAY_FIELD].format('YYYY-MM-DD'),
payDay: DataItem.payDay ? DataItem.payDay.format('YYYY-MM-DD') : null,
};
});
if (this.props.onChange) {
this.props.onChange(val);
}
if (this.props.onValueChange) {
this.props.onValueChange(val);
}
},
);
};
public onCancel = () => {
const row = this.getRowInstance();
row.node.switchDropdownMenu(true);
this.setState(state => ({
visible: !state.visible,
}));
};
public onSubmitButtonClick = params => {
const { dataSource } = params;
if (
this.state.dealDataSource.find(item =>
getMoment(item[OB_DAY_FIELD]).isSame(dataSource.day, 'd'),
)
) {
message.warn('不可以出现相同日期');
return;
}
this.setState(state => ({
dealDataSource: this.computeDataSource(
[
...state.dealDataSource,
{
[OB_DAY_FIELD]: dataSource.day,
[OB_DAY_STRING_FIELD]: getMoment(dataSource.day).format('YYYY-MM-DD'),
payDay: Form2.createField(moment(dataSource.day)),
},
],
true,
),
}));
};
public bindRemove = rowIndex => () => {
this.setState(state => ({
dealDataSource: this.computeDataSource(
remove(state.dealDataSource, (item, index) => item[OB_DAY_STRING_FIELD] === rowIndex),
true,
),
}));
};
public onPopcomfirmButtonConfirm = () => {
this.setState(
{
popconfirmVisible: false,
},
() => {
this.onGenerate();
},
);
};
public getAutoGenerateParams = () => {
const { record } = this.props;
const start = getMoment(Form2.getFieldValue(record[LEG_FIELD.EFFECTIVE_DATE]))
.clone()
.format('YYYY-MM-DD');
const end = getMoment(Form2.getFieldValue(record[LEG_FIELD.EXPIRATION_DATE])).format(
'YYYY-MM-DD',
);
const freq = Form2.getFieldValue(record[LEG_FIELD.UP_OBSERVATION_STEP]);
return { start, end, freq };
};
public onGenerate = async () => {
const { start, end, freq } = this.getAutoGenerateParams();
this.setState({ generateLoading: true });
const { error, data } = await qlDateScheduleCreate({
start,
end,
freq,
roll: 'backward',
adj: 'modified_following',
holidays: ['DEFAULT_CALENDAR'],
});
this.setState({ generateLoading: false });
if (error) return;
this.setState({
dealDataSource: this.computeDataSource(
data
.filter(item => moment(item).isAfter(start))
.map(item => ({
[OB_DAY_FIELD]: moment(item),
payDay: Form2.createField(moment(item)),
})),
),
});
};
public onPopconfirmClick = () => {
if (_.isEmpty(this.state.dealDataSource) === false) {
this.setState({
popconfirmVisible: true,
});
return;
}
this.onGenerate();
};
public onHidePopconfirm = () => {
this.setState({
popconfirmVisible: false,
});
};
public isAccruals = () => this.legType === LEG_TYPE_MAP.RANGE_ACCRUALS;
public isAutoCallSnow = () => this.legType === LEG_TYPE_MAP.AUTOCALL;
public isAutoCallPhoenix = () => this.legType === LEG_TYPE_MAP.AUTOCALL_PHOENIX;
public isIn = () => this.props.direction === KNOCK_DIRECTION_MAP.DOWN;
public isUp = () => this.props.direction === KNOCK_DIRECTION_MAP.UP;
public getColumnDefs = (): ITableColDef[] => {
const { editing: editable } = this.props;
return [
{
title: '观察日',
dataIndex: OB_DAY_FIELD,
render: (text, record, index) => record[OB_DAY_FIELD].format('YYYY-MM-DD'),
},
{
title: '支付日',
dataIndex: 'payDay',
defaultEditing: false,
editable,
render: (value, record, index, { form, editing, colDef }) => (
<FormItem>
{form.getFieldDecorator({})(
<DatePicker
editing={editing}
defaultOpen
{...{
format: 'YYYY-MM-DD',
}}
/>,
)}
</FormItem>
),
// render: (text, record, index) => record.payDay.format('YYYY-MM-DD'),
},
this.isAutoCallSnow()
? {
title: '障碍价格',
dataIndex: 'price',
defaultEditing: false,
editable,
render: (val, record, index, { form, editing }) => (
<FormItem>
{form.getFieldDecorator({})(
<UnitInputNumber autoSelect editing={editing} unit="¥" min={0} />,
)}
</FormItem>
),
}
: {
title: '已观察到价格',
dataIndex: 'price',
defaultEditing: false,
editable: record => {
const disabled = record.obDay.isBefore(moment().subtract(-1, 'day'), 'day');
return editable && disabled;
},
render: (val, record, index, { form, editing }) => (
<FormItem>
{form.getFieldDecorator({})(
<UnitInputNumber autoSelect editing={editing} unit="¥" min={0} />,
)}
</FormItem>
),
},
...(editable
? [
{
title: '操作',
dataIndex: 'operation',
render: (text, record, index) => (
<Row type="flex" align="middle">
<a
style={{ color: 'red' }}
onClick={this.bindRemove(record[OB_DAY_STRING_FIELD])}
>
删除
</a>
</Row>
),
},
]
: []),
];
};
public getAutoGenerateButton = () => (
<PopconfirmButton
type="primary"
loading={this.state.generateLoading}
onClick={this.onPopconfirmClick}
popconfirmProps={{
title: '生成将覆盖当前表格内容',
visible: this.state.popconfirmVisible,
onCancel: this.onHidePopconfirm,
onConfirm: this.onPopcomfirmButtonConfirm,
}}
>
批量生成观察日
</PopconfirmButton>
);
public handleCellValueChanged = params => {
this.setState(state => ({
dealDataSource: this.computeDataSource(
state.dealDataSource.map(item => {
if (item[OB_DAY_STRING_FIELD] === params.rowId) {
return params.record;
}
return item;
}),
),
}));
};
public renderResult = () => {
const { editing: editable, record } = this.props;
const expirationDate = _.get(record, 'expirationDate.value');
return (
<Row
type="flex"
justify="space-between"
align="middle"
style={{
width: '100%',
}}
>
<ModalButton
type="primary"
size="small"
modalProps={{
closable: false,
footer: editable
? [
<Button key="cancel" onClick={this.onCancel}>
取消
</Button>,
<Button key="submit" type="primary" onClick={this.onOk}>
确认
</Button>,
]
: [
<Button key="cancel" onClick={this.onCancel}>
取消
</Button>,
],
title: `观察日${editable ? '编辑' : '查看'}`,
destroyOnClose: true,
width: 700,
visible: this.state.visible,
onCancel: this.onCancel,
}}
onClick={this.onOpen}
style={{ width: '100%', display: 'block' }}
content={
<>
{editable && (
<Row style={{ marginBottom: 10 }} type="flex" justify="space-between">
<Col>
<Form
onSubmitButtonClick={this.onSubmitButtonClick}
layout="inline"
controls={[
{
field: 'day',
control: {
label: '观察日',
},
input: {
type: 'date',
range: 'day',
disabledDate: current =>
current && current > moment(expirationDate).subtract(-1, 'day'),
},
decorator: {
rules: [
{
required: true,
},
],
},
},
]}
submitText="添加"
resetable={false}
/>
</Col>
<Col>{this.getAutoGenerateButton()}</Col>
</Row>
)}
<SmartTable
pagination={{
showSizeChanger: false,
}}
dataSource={this.state.dealDataSource}
rowKey={OB_DAY_STRING_FIELD}
onCellFieldsChange={this.handleCellValueChanged}
columns={this.getColumnDefs()}
/>
</>
}
>
观察日{editable ? '管理' : '查看'}
</ModalButton>
</Row>
);
};
public renderEditing() {
return this.renderResult();
}
public renderRendering() {
return this.renderResult();
}
}
export const ExpireNoBarrierObserveDay: ILegColDef = {
title: '敲出/coupon观察日',
dataIndex: LEG_FIELD.EXPIRE_NO_BARRIEROBSERVE_DAY,
editable: record => false,
defaultEditing: record => {
const { isEditing, isBooking, isPricing } = getLegEnvs(record);
if (isBooking || isPricing) {
return true;
}
return false;
},
render: (val, record, index, { form, editing, api }) => (
<FormItem>
{form.getFieldDecorator({
rules: [getRequiredRule()],
})(
<ObserveModalInput
editing={editing}
record={record}
direction={KNOCK_DIRECTION_MAP.UP}
api={api}
/>,
)}
</FormItem>
),
};
|
/**
 * Created by al on 3/4/16.
 */
// Assumes Gson's @Expose annotation; the import was missing from the original snippet.
import com.google.gson.annotations.Expose;

public class SMSRequestToken {
    @Expose
    public String phone;

    public SMSRequestToken(String phoneNumber) {
        phone = phoneNumber;
    }
}
|
Vanishing Viscosity Limit for Incompressible Flow Around a Sufficiently Small Obstacle In this article we consider viscous flow in the exterior of an obstacle satisfying the standard no-slip boundary condition at the surface of the obstacle. We look for conditions under which solutions of the Navier-Stokes system in the exterior domain converge to solutions of the Euler system in the full space when both viscosity and the size of the obstacle vanish. We prove that this convergence is true assuming two hypotheses: first, that the initial exterior domain velocity converges strongly (locally) in L to the full-space initial velocity and second, that the diameter of the obstacle is smaller than a suitable constant times viscosity, or, in other words, that the obstacle is sufficiently small. The convergence holds as long as the solution to the limit problem is known to exist and stays sufficiently smooth. To fix the O spatial scale, we consider flows with an initial vorticity which is compactly supported, vanishes near the obstacle, and does not depend on viscosity or on the size of the obstacle. In earlier work, Iftimie proved that any such vorticity gives rise to a family of exterior flows which converges in L to the corresponding full-space flow. For exterior two-dimensional flow, topology implies that the initial velocity is not determined by vorticity alone, but also by its harmonic part. In the case of two-dimensional flow, we prove strong convergence of initial data, as required by our main result, if the harmonic part of the family of initial velocities is chosen so that the circulation of the initial flow around the small obstacle vanishes. This work complements the study of incompressible flow around small obstacles carried out in previous works.
|
import csv
import json
import os
from collections import OrderedDict
from contextlib import contextmanager
from ocdsdocumentationsupport.profile_builder import ProfileBuilder
TRANSLATABLE_CODELIST_HEADERS = ('Title', 'Description', 'Extension')
TRANSLATABLE_SCHEMA_KEYWORDS = ('title', 'description')
VALID_FIELDNAMES = ('Code', 'Title', 'Description', 'Extension')
def build_profile(basedir, standard_tag, extension_versions, registry_base_url=None, schema_base_url=None):
"""
Pulls extensions into a profile.
- Writes extensions' README.md files (docs/extensions/{id}.md)
- Merges extensions' JSON Merge Patch files for OCDS' release-schema.json (schema/profile/release-schema.json)
- Writes extensions' codelist files (schema/profile/codelists)
- Patches OCDS' release-schema.json with extensions' JSON Merge Patch files (schema/patched/release-schema.json)
- Patches OCDS' codelist files with extensions' codelist files (schema/patched/codelists)
- Updates the "codelists" field in extension.json
The profile's codelists exclude deprecated codes and add an Extension column.
`basedir` is the profile's schema/ directory.
"""
@contextmanager
def open_file(name, mode):
"""
Creates the directory if it doesn't exist.
"""
os.makedirs(os.path.dirname(name), exist_ok=True)
f = open(name, mode)
try:
yield f
finally:
f.close()
def write_json_file(data, *parts):
with open_file(os.path.join(basedir, *parts), 'w') as f:
json.dump(data, f, indent=2, separators=(',', ': '))
f.write('\n')
def write_codelist_file(codelist, fieldnames, *parts):
with open_file(os.path.join(basedir, *parts, 'codelists', codelist.name), 'w') as f:
writer = csv.DictWriter(f, fieldnames=fieldnames, lineterminator='\n', extrasaction='ignore')
writer.writeheader()
writer.writerows(codelist)
builder = ProfileBuilder(standard_tag, extension_versions, registry_base_url, schema_base_url)
extension_codelists = builder.extension_codelists()
directories_and_schemas = {
'profile': {
'release-schema.json': builder.release_schema_patch(),
},
'patched': {
'release-schema.json': builder.patched_release_schema(),
'release-package-schema.json': builder.release_package_schema(),
}
}
# Write the documentation files.
for extension in builder.extensions():
with open_file(os.path.join(basedir, '..', 'docs', 'extensions', '{}.md'.format(extension.id)), 'w') as f:
f.write(extension.remote('README.md'))
# Write the JSON Merge Patch and JSON Schema files.
for directory, schemas in directories_and_schemas.items():
for filename, schema in schemas.items():
write_json_file(schema, directory, filename)
# Write the extensions' codelists.
for codelist in extension_codelists:
write_codelist_file(codelist, codelist.fieldnames, 'profile')
# Write the patched codelists.
for codelist in builder.patched_codelists():
codelist.add_extension_column('Extension')
codelist.remove_deprecated_codes()
fieldnames = [fieldname for fieldname in codelist.fieldnames if fieldname in VALID_FIELDNAMES]
write_codelist_file(codelist, fieldnames, 'patched')
# Update the "codelists" field in extension.json.
with open(os.path.join(basedir, 'profile', 'extension.json')) as f:
metadata = json.load(f, object_pairs_hook=OrderedDict)
codelists = [codelist.name for codelist in extension_codelists]
if codelists:
metadata['codelists'] = codelists
else:
metadata.pop('codelists', None)
write_json_file(metadata, 'profile', 'extension.json')
|
The remarkable momentum of Barack Obama’s campaign to be the presidential candidate for the Democratic Party is raising the hopes of millions of Americans who have felt marginalised by mainstream politics.
No one who opposes racism and oppression can ignore this reality. But if Obama makes it to the White House, what can we expect from the office of the president?
Even if all the barriers to electoral victory – and there are still many – are overcome by Obama’s team, the 2008 US election will not point to an end to US imperialism, racism and war.
Exit polls from Democratic Party primaries and caucuses across the US indicate Obama’s support is coming largely from poor, young, black, immigrant and/or women voters, many of who have never taken part in electoral politics on this scale.
But the momentum Obama’s candidacy is generating is not matched by the politics he stands for or that of the Democratic Party he seeks to lead. Obama and his campaign organisers are brilliantly adapting to a perceived potential voting base, calling upon every supporter to become actively involved in the presidential race.
More tech-savy than any of his competitors within or beyond the Democrats, Obama has relied on email and YouTube videos to reach a wider, younger audience.
The style is as important as the message – which is almost devoid of content, but is optimistic and captures the mood.
He repeats in clear and passionate language that altering the course of US politics set by George Bush and company is urgent and possible.
Just go to some of the videos on the various websites to get a sense of this.
Obama has claimed the legacy of civil rights leader Martin Luther King. He has been compared to former president John F Kennedy.
He points to his mixed-race and multicultural background as a symbol of a new and different “America” in his speeches and autobiography.
But Barack Obama is not a candidate for peace.
Take the issue of the Iraq war, which Obama opposes. Notably, so has the other Democratic Party presidential candidate contender Hillary Clinton. John Edwards, who has now dropped out of the race, also opposes the war.
This reflects the depth of opposition to the war among the US public, and increasingly, among a section of the US ruling class.
Obama has pointed to the fact that he opposed the initial US-led attack in 2003, unlike Clinton and Edwards.
But Obama is not against the war on the grounds that it is imperialist, racist and illegal. Instead, he sees it as unwinnable.
Once elected to the US senate in 2004, Obama supported Bush’s calls for unconditional funding for the war in Iraq in 2005 and 2006.
He also voted to confirm Condoleezza Rice as US secretary of state, despite evidence that she had presented false testimonies to congress and the fact that she was a central part of the Bush’s team that pushed through the Iraq war.
In June 2006, Obama voted against an amendment demanding a timetable for US withdrawal from Iraq, despite having previously called for such a timetable in the senate.
And during the 2006 Democratic Party congressional primaries Obama backed pro-war candidates such as Joe Lieberman and Ned Lamont.
Obama is also in favour of maintaining a US occupation in Iraq. He has directly linked US troop withdrawal from Iraq to redeployment in Afghanistan.
In November 2006, Obama stated, “Drawing down our troops in Iraq will allow us to redeploy additional troops to northern Iraq and elsewhere in the region as an over-the-horizon force.
“Perhaps most importantly, some of these troops could be redeployed to Afghanistan.”
It is for all these reasons that Ralph Nader has announced a left wing challenge for the presidential elections.
The Democrats will attack him for splitting the “progressive” vote and attempt to stop him from standing. But socialists back both his right to stand and his criticisms of US corporate power and the two main parties.
The promise of a new, progressive US, domestically and internationally, is something very different from the politics offered by Obama and the Democrats. Historically, the Democrats and the Republicans have had very similar policies. They are both parties of big business.
But today, in the curious structures of the US’s institutional party politics, the social base of a new movement is finding some expression in the hopeful anticipation of a new type of leadership of the Democratic Party.
Reliance on these structures will inevitably prove painfully disappointing. If a better world is indeed possible, it will be mass movements from below that make it a reality.
|
PURPOSE The main problem when treating the superficial femoral and popliteal artery with PTA or stent implantation is the relatively high restenosis rate. Several ablative systems are available as an alternative. The purpose of this prospective study was to evaluate the safety and performance of a novel rotational and aspirating atherectomy system. MATERIALS AND METHODS From June to December 2006, we treated 23 patients, median age 70.3 years, with the rotational atherectomy system. All patients had de-novo lesions of the SFA or PA with a minimum stenosis of 70%. According to the Rutherford classification, 39% of the patients were in category 2 and 61% in category 3. The median lesion length was 26.3 mm (5-100 mm). 26% of the patients had occlusions. RESULTS The technical success rate was 100%. In 14 cases (61%) additional balloon dilatation was applied, and in two cases stent implantation was performed. The median treatment duration with the device was 187.7 +/- 106.1 s (59-391 s). The aspirated volume was 116.5 +/- 72.0 ml. The ankle brachial index improved from 0.60 +/- 0.16 before the intervention to 0.85 +/- 0.13 afterwards, and was 0.80 +/- 0.13 after 6 months. During follow-up, two restenoses (8%) occurred. There were two complications: one dissection and one distal embolization. Follow-up could not be performed for two patients (8%). CONCLUSION Atherectomy of femoropopliteal lesions with the Pathway PV Atherectomy System is very safe and effective. The low restenosis rate of 8% is promising, but there is still a lack of long-term results.
|
def restart_process(self):
url = '{}/{}'.format(self.base_url_path, 'restart-process')
return self._request('POST', url)
|
BACKGROUND Health literacy is defined as the degree to which individuals obtain, process and understand the basic health information and services needed to make appropriate decisions about their health. Evidence has shown that the level of health literacy is critical to the prognosis of chronic diseases. The Short Assessment of Health Literacy for Spanish-speaking Adults (SAHLSA-50) is a short and simple health literacy assessment for adults. AIM To determine the validity and reliability indicators of SAHLSA-50 in Chilean adults. MATERIAL AND METHODS The survey was applied to 84 older adults living in high and low income neighborhoods. RESULTS The survey had adequate construct validity and reliability: its Comparative Fit Index was 0.93, its Tucker-Lewis Index was 0.927 and its Root Mean Square Error of Approximation was 0.044. The test of close fit was not statistically significant (p = 0.828). Reliability was estimated with the Kuder-Richardson formula, which gave a good result (0.9255). Despite the good global indicators obtained, it is necessary to pay attention to some items that fail to explain the health literacy construct or fall outside the parameters of difficulty and discrimination proposed by the authors of the test. CONCLUSIONS We propose this test as a useful tool to assess health literacy in the adult population in Chile. Its use and incorporation into local research can be especially recommended in the areas of education and health promotion.
|
/*
* To change this license header, choose License Headers in Project Properties.
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
package lista03;
import java.util.Scanner;
/**
*
* @author thalyson
*/
public class Question6 {
public static void main(String[] args) {
// variable declarations
Scanner scan = new Scanner(System.in);
// read the input text
System.out.print("Texto: ");
String texto = scan.nextLine().toLowerCase();
int totalVogais = Question6.countVogais(texto);
// every non-vowel character (including spaces and punctuation) is counted as a consonant here
int totalConsoantes = texto.length() - totalVogais;
// use floating-point arithmetic so the percentages are not truncated by integer division
System.out.println("Total de vogais: " + totalVogais + " (" + (100.0 * totalVogais / texto.length()) + "%)");
System.out.println("Total de consoantes: " + totalConsoantes + " (" + (100.0 * totalConsoantes / texto.length()) + "%)");
}
public static int countVogais(String texto) {
char[] vogais = new char[]{'a', 'e', 'i', 'o', 'u'};
int totalVogais = 0;
for (int i = 0; i < texto.length(); i++) {
for (int j = 0; j < vogais.length; j++) {
if (vogais[j] == texto.charAt(i)) {
totalVogais++;
break;
}
}
}
return totalVogais;
}
}
|
// CPropertyPageEx2
// Extension of the CPropertyPageEx class.
//-----------------------------------------------------------------------------------------------------------------
//. delete this class
#pragma once
class CPropertyPageEx2 : public CPropertyPage
{
DECLARE_DYNCREATE(CPropertyPageEx2)
// Construction
public:
CPropertyPageEx2();
~CPropertyPageEx2();
// Dialog Data
private:
//{{AFX_DATA(CPropertyPageEx2)
enum { IDD = IDD_ERROR };
// NOTE - ClassWizard will add data members here.
// DO NOT EDIT what you see in these blocks of generated code !
//}}AFX_DATA
// Overrides
// ClassWizard generate virtual function overrides
//{{AFX_VIRTUAL(CPropertyPageEx2)
protected:
virtual void DoDataExchange(CDataExchange* pDX); // DDX/DDV support
//}}AFX_VIRTUAL
// Implementation
protected:
// Generated message map functions
//{{AFX_MSG(CPropertyPageEx2)
// NOTE: the ClassWizard will add member functions here
//}}AFX_MSG
DECLARE_MESSAGE_MAP()
};
|
// src/spot_format.ts
import { SpotAPI } from './spot_api'
export interface FormattedMessage {
_id: number
time: Date
point: GeoJSON.Point
messageType: string
originalMessage?: SpotAPI.Message
}
export function formatSpotMessage(msg: SpotAPI.Message): FormattedMessage {
const time = new Date(msg.dateTime)
const point: GeoJSON.Point = {type: 'Point', coordinates: [msg.longitude, msg.latitude]}
return {_id: msg.id, time, point, messageType: msg.messageType, originalMessage: msg}
}
|
Development and validation of the Vietnamese Primary Care Assessment Tool provider version Aim: To adapt the provider version of the Primary Care Assessment Tool (PCAT) for Vietnam and determine its internal consistency and validity. Background: There is a growing need to measure and explore the impact of various characteristics of health care systems on the quality of primary care. It would provide the best evidence for policy makers if these evaluations come from both the demand and supply sides of the health care sector. Comparatively more researchers have studied primary care quality from the consumer perspective than from the providers perspective. This study aims at the latter. Method: Our study translated and adapted the PCAT provider version (PCAT PE) into a Vietnamese version, after which a cross-sectional survey was conducted to examine the feasibility, internal consistency and validity of the Vietnamese PCAT provider version (VN PCAT PE). All general doctors working at 152 commune health centres in Thua Thien Hue province had been selected to participate in the survey. Findings: The VN PCAT PE is an instrument for evaluation of primary care in Vietnam with 116 items comprising six scales representing four core primary care domains, and three additional scales representing three derivative domains. From the translation and cultural adaptation stage, two items were combined, two items were removed and one item was added. Six other items were excluded due to problems in item-total correlations. All items have a low non-response or dont know/dont remember response rate, and there were no floor or ceiling effects. All scales had a Cronbachs alpha above 0.80, except for the Coordination scale, which still was above the minimum level of 0.70. Conclusion: The VN PCAT PE demonstrates adequate internal consistency and validity to be used as an effective tool for measuring the quality of primary care in Vietnam from the provider perspective. Introduction Since the Alma-Ata declaration 40 years ago, primary care has been described repeatedly as essential care that is universally accessible to individuals and families in communities, available at an affordable cost to communities and countries and the first level of contact for patients (or the first element of a continuing health care process). With these notable features, there is compelling evidence that stronger primary care systems are associated in general with better population health outcomes including lower mortality rates, rates of premature death and hospitalizations for ambulatory care sensitive conditions, and higher infant birth weight, life expectancy, and satisfaction with the health care system (Starfield, 1991;Starfield and Shi, 2002;;Niti and Ng, 2003). Primary care is a factor in improving public health and health outcomes and the prevention of illness and death, with lower use of hospital-based medical care, associated with lower costs ), and more equitable distribution of health within a population a;. A critical review on the contribution of primary care to health and health systems in low-and middle-income countries (LMIC) showed that primary-care-focused health initiatives have improved access to health care, including among the poor, at reasonably low cost (). There is also evidence that primary care programmes have reduced child mortality and, in some cases, wealth-based disparities in mortality (). 
Similar to many LMIC, Vietnam faces the challenges of the double burden of communicable and non-communicable disease and the trend to sustainable development from its own funding. Since 2013, the government has issued many important policy changes to reinforce the grassroot networks as well as the health care system in general (Vietnam Ministry of Health, 2013; Vietnam Prime Minister, 2013;Prime Minister, 2016;Vietnam Government, 2016;Ministry of Health, 2016a;2016b;. In 2015, the Primary Health Care Performance Initiative (PHCPI) was launched in 135 LMIC with the aim of catalyzing improvements in primary health care systems (PHCPI). The PHCPI conceptual framework conceived of a high-quality primary health care subdomain, which includes the classic primary health care functions such as first contact accessibility, comprehensiveness, and coordination as first laid out by Starfield and others in the world plus added a new function in person-centred care to distinguish between the continuity and person-centred components in Starfield's original domain of person-focused care over time. This high-quality primary care is one of the key subdomains for measurement of primary health care service delivery in health systems (). Worldwide, commitment for improvements in primary care is increasing. An example is the new UN Sustainable Goal for Health (Enhance health and promote well-being for all at all ages) (World Health Oganization, 2016). Recently, the new Astana Declaration: 'From Alma-Ata towards universal health coverage and the Sustainable Development Goals' released by WHO and UNICEF in October 2018 reaffirmed the commitment of States and Governments to 'build a sustainable primary health care as well as to enhance capacity and infrastructure for primary carethe first contact with health services' (WHO and UNICEF, 2018). Consequently, there is also a growing need to measure various characteristics of primary care as we mentioned above and explore their impact on the quality of primary care. It would provide the best evidence for policy makers if these evaluations come from both the demand and supply sides of the health care sector. Comparatively more researchers have studied assessments of primary care quality from the consumer perspective than from the workforce perspective. A recent South African study pointed out that there is a significant gap between the two, that is, between the clients' experience with primary care and what managers and providers think they are delivering (). There are various tools that have been used for measuring characteristics of primary care, for example, the CPCI (Components of Primary Care Instrument), the PCAS (Primary Care Assessment Survey) (), the EUROPEP questionnaire (European Task Force on Patient Evaluations of General Practice Care) (), the CAHPS (Consumer Assessment of Healthcare Providers and Systems) (), the P3 C (Parents' Perception of Primary Care) (), and the PCAT (Primary Care Assessment Tool) (). The PCAT developed by Barbara Starfield at the Johns Hopkins Primary Care Policy Centre is one of the most widely studied and applied tools for measuring the quality of primary care across the globe. The PCAT family includes four versions: the consumer-client, facility, provider and health system versions. 
Through the PCAT, primary care quality is evaluated according to its core principles (first contact care, continuous longitudinal care, coordination, and comprehensiveness) and three other derivative domains (family-centered care, community-orientated care, and culturally competent care) (). In contrast with the consumer version, which has been translated and validated in many languages and countries across the world (;;Wang and Shi, 2014;), little work has been done for the provider version questionnaires. As the PCAT consumer version was validated and successfully used in Vietnam (), we found that the PCAT provider version could render an adequate reflection on organizational resources and health care processes from a primary care provider perspective. As a first step, this study was conducted to adapt the PCAT provider tool for Vietnam and determine its internal consistency and validity. Nguyen Thi Hoa et al. Translation and adaptation of the PCAT provider version for Vietnam The PCAT provider version (PCAT PE) was translated and culturally adapted strictly according to the guidelines from the Johns Hopkins Primary Care Policy Center for use in international settings (Starfield and Shi, 2009) (illustrated by Figure 1). The first round was done in 2007 including all recommended steps as follows: Step 1: Forward translation performed by a bilingual physician and PhD student whose native tongue was Vietnamese with experience in translating documents between English and Vietnamese. This translator was familiar with use of the PCAT. To the best of the translator's ability, the translation preserved the intent rather than the literal meaning of the items. Step 2: Qualitative review of the translated survey was done by several doctors and other workers from Hanoi Medical School. This was performed in focus group discussion, where every translated item was reviewed to ensure its clarity, use of common language, and conceptual adequacy. Step 3: Backward translation was done by a Vietnamese woman whose native language is American English and who has lived long enough in the USA to know the language and routines of daily life. This translator was not familiar with the specific wording of the original PCAT terms. The instructions given to the back translator were identical to those given to the forward translator. The aim of this step was to identify items that required further study. Step 4: Health systems research experts and the forward/ backward translators jointly reviewed the forward and backward translations in order to detect items that were not effectively translated, which were confusing or generated concerns. A few modifications were made until a consensus version was reached. Step 5: Thereafter a lay panel of Vietnamese physicians reviewed the translation, identified troublesome items, and proposed alternatives. Step 6: Pilot testing of the translated version: the questionnaire was administered to 108 physicians, that is, 41 physicians working at Commune health centers (CHCs) and 67 physicians working as academic trainers and administrators at the medical universities. Basic descriptive analyses were conducted to ensure adequate distribution of responses. The respondents were debriefed to identify any wording or comprehension problems. To ensure the high quality of the questionnaires, certain steps were repeated in 2008 (steps 6, 2, 4, 5), 2011 (steps 2 and 3), 2013, and 2014 (steps 2 and 6) before it was declared fit to be used in a general population (Table 1). 
Below we describe those steps with the year wherein they were performed: In 2008, pilot testing was performed again for 28 physicians in the Specialist Level 1 in family medicine (CK1) training programme in Khanh Hoa. A dissemination workshop was then held in Vietnam with primary care physicians from several medical schools to review the pilot data and make additional revision suggestions based on responses from the previous pilot testing round (Qualitative review). Following this review, a panel of primary care physicians from six medical schools in Vietnam and a team of researchers and physicians from Boston University participated in two rounds of revisions of PCAT questions, including appropriate contextual translation of concepts (Lay panel review). Dr. Barbara Starfield reviewed the revised version pre-translation and gave comments that were incorporated into a final version (Health system researcher experts review). In 2011, a Qualitative review was repeated by the research team (Hue UMP and BU). Discussion on the cultural relevance of each item in the Vietnamese version and comparison between the current version and the original PCAT were made. This round also checked the matching between each equivalent item of the consumer and provider surveys. The research team produced a list of problematic items and proposed solutions. Backward translation was repeated after the qualitative review. The back translation was undertaken by a woman whose native language is American English and who has lived in the USA long enough to know the language and routines of daily life. A new translated version of the questionnaires was produced. In April 2013, an additional pilot study was conducted for 60 physicians working at CHCs in Thua Thien Hue Province. These physicians were divided into two groups: one group read the questionnaires and gave their opinions in terms of content and accuracy of evaluation for the practice of physicians working in primary care in Vietnam. The other group was asked to fill in the entire questionnaire and give their feedback on challenges they faced. From October 2013 to January 2014, a final revision was done by the research team from Hue UMP and BU (qualitative review). The team went through all the items and asked for advice from international experts with experience in PCAT validation. After this round, a final translated version of the questionnaire was produced with 9 scales and 123 items, as compared to 9 scales and 124 items of the original PCAT provider version. This is a self-completion questionnaire and takes approximately 30-45 min to complete. We maintained a four-point Likert scale response format (1 = definitely not; 2 = probably not; 3 = probably; and 4 = definitely), providing an additional 'don't know/don't remember' option in case participants could not choose one of those four options. Table 1 in Supplementary Material shows items changed in the final translated questionnaires from the original version.

Table 1. Different steps in the translation and adaptation process and in which rounds they were repeated
Step | Round 1, 2007 | Round 2, 2008 | Round 3, 2011 | Round 4, 2013 and 2014
Step 1: Forward translation | x | | |
Step 2: Qualitative review | x | x | x | x
Step 3: Backward translation | x | | x |
Step 4: Health system researcher experts | x | x (with Dr. Barbara Starfield's comments) | |
Step 5: Lay panel review | x | x | |
Step 6: Pilot testing | x | x | | x

Data collection
To evaluate the feasibility, internal consistency, and validity of the VN PCAT PE, a cross-sectional study was implemented. The study was conducted in Thua Thien Hue province with all general doctors working at CHCs. There are 152 CHCs in the 9 districts of this province. Normally, one CHC is equipped with a general doctor as the head of the CHC. There are some exceptions: some CHCs have two general doctors, others only a traditional medicine doctor or an assistant traditional medicine doctor or an assistant doctor. The questionnaires were delivered at the end of the monthly meeting of each district health center. In cases where one or more doctors were absent from that meeting, we tried to contact them and make an appointment at their CHC to have an interview at a later stage, where a trained interviewer assisted the doctor to complete the questionnaire. After three unsuccessful engagement efforts during the study period, we excluded these doctors from our research. Before the interview, participants received a full explanation of the study's content and purpose and signed a consent form if they agreed to participate. Participants received 5 USD as an appreciation gift for their time and contribution. Data collection was conducted from December 2017 to February 2018. This study obtained ethical approval from the Scientific Committee of Hue University of Medicine and Pharmacy on 18 March 2014 and IRB review from Boston University (H-31432).

Data analysis
All collected questionnaires were cleaned and entered into EpiData. Data analysis was performed using SPSS software version 23.0. Subsequent full validation involved several steps (Figure 2). First, individual items were evaluated on several criteria. Items with a high percentage (≥20%) of item non-response or 'don't know/don't remember' responses, or items with a large floor or ceiling effect (>80% of respondents chose the lowest or highest rating), were removed. Next, the item-total correlation for the remaining items in each scale was calculated (item-total correlation before review). Items were removed if the item-total correlation was below 0.30 or if Cronbach's coefficient alpha for that scale improved substantially when the item was removed. Finally, item-discriminant validity was tested: for each item, the item-total correlation (item-total correlation after review) with the hypothesized scale should be substantially higher than the correlation with the other scales. In the second phase, Cronbach's coefficient alpha was used to examine how well all items measured the same construct (internal consistency). A value of 0.70 is commonly seen as a minimum. The recoding process and the calculation of the sum mean scores of the domains and subdomains of primary care strictly complied with the guidelines of the PCAT manual issued by Johns Hopkins University in 1998. For calculating the sum mean scores of domains and subdomains, a mean value was assigned to 'not sure/don't remember' answers as well as to missing values.

Characteristics of study population
Among the 157 doctors working at the 152 CHCs in Thua Thien Hue province, 150 participated in our study; one refused and six were absent because of maternal or sick leave or study leave. Tables 2 and 3 show the characteristics of the participants and their workplace. There were about twice as many male doctors as female ones. More than half of these doctors have been practicing for 20 years or more.
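To make the item-screening rules described in the Data analysis section concrete, here is a small illustrative sketch (not part of the study, which used SPSS 23.0) of how the same criteria could be computed for one scale in Python with pandas and NumPy. The DataFrame of responses, the synthetic data, and the helper names (screen_items, cronbach_alpha) are hypothetical stand-ins; responses are assumed to be coded 1-4 with NaN for missing or "don't know" answers.

# Illustrative re-implementation of the screening rules described above; the data
# and thresholds mirror the text but everything here is a hypothetical sketch.
import numpy as np
import pandas as pd

def cronbach_alpha(df: pd.DataFrame) -> float:
    """Cronbach's coefficient alpha for a respondents-by-items DataFrame."""
    k = df.shape[1]
    item_variances = df.var(axis=0, ddof=1)
    total_variance = df.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

def screen_items(scale, max_missing=0.20, max_floor_ceiling=0.80, min_item_total=0.30):
    """Apply the screening criteria from the text and return the surviving item names."""
    kept = []
    for item in scale.columns:
        col = scale[item]
        if col.isna().mean() >= max_missing:               # high non-response / "don't know" rate
            continue
        if (col == col.min()).mean() > max_floor_ceiling:  # floor effect
            continue
        if (col == col.max()).mean() > max_floor_ceiling:  # ceiling effect
            continue
        rest = scale.drop(columns=item).mean(axis=1)       # corrected item-total correlation
        if col.corr(rest) < min_item_total:
            continue
        kept.append(item)
    return kept

# Hypothetical data: six items driven by one latent trait, coded 1-4.
rng = np.random.default_rng(0)
latent = rng.normal(size=150)
scale = pd.DataFrame({
    f"item{i}": np.clip(np.round(2.5 + latent + rng.normal(scale=0.7, size=150)), 1, 4)
    for i in range(1, 7)
})
kept = screen_items(scale)
print(kept)
print(round(cronbach_alpha(scale[kept]), 2))  # 0.70 is the usual minimum for internal consistency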
Although CHCs receive patients of all ages, the majority of them are adults and only a small percentage of them must pay out-of-pocket for their health visits. Table 4 shows the evaluation of the individual items. All items have a low non-response or 'don't know/ don't remember' response rate (<20%) and there were no floor or ceiling effects (≤80%). One item from First contact access (C9) and one item from Ongoing care (D1) were removed because of an item-total correlation below 0.30. The Cronbach's alphas of the different scales were not improved substantially by removing any items. Four items of the Community orientation care scale were removed because their item-total correlation with that scale was lower than their correlations with the other scales. (see Table 2 -Supplementary Material). Internal consistency of the different scales Based on these parameters, 116 items of the VN PCAT-PE were determined to be appropriate for use with Vietnamese health care providers, to represent four core domains with six scales and three derivative domains with three scales (Table 5). All scales had a Cronbach's alpha above 0.80, except for the scale of Coordination, which still was above the minimum level of 0.70. Main findings The outcome of this study is a translated and adapted PCAT provider version for Vietnam. The results showed that this questionnaire is a valid tool to evaluate primary care quality in Vietnam from the provider viewpoint with high overall reliability and validity. Interpretation of the results in relation to existing literature This study rendered a PCAT ready for evaluation studies of the primary care system from the providers perspective in Vietnam. Previous PCAT validation studies focused mostly on the patients' (consumers') version. Now that the providers' version is available, a deeper and more comprehensive assessment of primary care quality becomes possible, adding a second key view on the demand-supply relationship of the primary care system of Vietnam. The VN PCAT provider version preserves the integrity characteristics of the original PCAT provider version with 116 items belonging to nine scales. There were only slight changes in the number of items in most scales except for the Community Orientation scale, from which four items were removed because their item-total correlation with the hypothesized scale was lower than the correlations with another scale. In a South African study, a new scale (about the primary health care team) was added at the end of the questionnaire (). A Chinese study removed the scale of First contact access from their tool (). We succeeded in retaining most major characteristics of the original tool, however, preserving the possibility of future comparison with other primary care quality assessment studies using the original PCAT tool. In the validation study of the consumer tool VN PCAT AE, the domains of First contact access and Comprehensive (service available) more items were removed (six and five items, respectively) (). A probable reason why this was not the case in the provider study is that the providers had more knowledge about the items' content and knew better the services they were providing than the consumers. This may have reduced the ground effect and the number of 'don't know/ don't remembers' as well as the number of missing answers. Due to the fact that Vietnam has a specific culture (mid-level country, Southeast-Asian) and a developing primary care context, the 2007 process alone was not sufficient. 
As the reader may have observed, the translation and cultural adaptation was indeed a lengthy process (from 2007 to 2014). In order to improve its quality, several important steps were repeated: the qualitative review was carried out four times and the pilot testing three times. These added steps were necessary to develop a well-constructed and fully adapted tool for measuring the specific health care setting of Vietnam. The study also has several potential biases due to limitations in its design: the study population was restricted to general doctors working at CHCs. Although they are currently the major resource for providing primary care in Vietnam, there are other primary care doctors, such as private doctors and doctors working in the primary care outpatient clinics of some hospitals, who should also be surveyed to ensure the expected diversity and comprehensiveness of the tool.

Conclusions
We developed the VN PCAT PE as a valid and reliable tool to measure the quality of primary care from a provider perspective in Vietnam. Used together with the VN PCAT AE, it allows primary care performance to be examined comprehensively. The gap in views between primary care users (demand side) and providers (supply side) in Vietnam can now be identified.

Financial Support. This work was supported by the Atlantic Philanthropies (grant nos. 14613, 21627) and the VLIR InterUniversity Cooperation Programme VLIR-IUC with Hue University (grant no. ZIUC2017AP026). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Conflicts of Interest. None.

Ethical Standards. The authors assert that all procedures contributing to this work comply with the ethical standards of the relevant national and institutional
|
He did make the play, and it made the score 17-3 Rams.
But it didn’t look like he would get in, with Kerry Collins in his path. Collins, however, is a quarterback.
“That’s what I thought: ‘This is a quarterback,’” Bush said of Collins, who looked to have a good angle to prevent Bush’s first NFL touchdown. “I wanted to make it as hard as [heck] to tackle me. Once I get close to that line, I’m putting my head down and my momentum is going to take me in.”
Needless to say, as a member of this Rams team, he is playing some of the best football of his career.
|
SEATTLE – Amid reports of increasing hate crimes across the US, dozens of young people in Seattle attended a special workshop to learn how to resist discrimination, hate, and Islamophobia.
“I see a lot of prejudice just in our country right now, and I don’t want that prejudice in my life or my friends’ lives, and I don’t want to see it hurting people,” said Alex Davidson, 13.
Davidson is one of the dozens of kids who attended the workshop at Seattle University.
The workshop, organized by local Muslim leaders, taught students how to write a letter to the editor or craft a thoughtful social media post in response to rising reports of discrimination and prejudice.
“In light of what’s happening in our country, this is a critical time, this is an important issue,” said Aneelah Afzali, an organizer of the event.
Many of the attendees said they decided to participate after learning about this weekend’s arson at the Islamic Center of the Eastside in Bellevue.
Afzali says only 38 percent of Americans know someone who is Muslim. She believes lack of exposure, combined with heavy media coverage of terrorism, can lead to dangerous misperceptions about her faith.
Storytelling that emphasizes and encourages contact, interaction and humanization is what next weekend’s workshop is all about.
“This issue is so important right now. It’s a critical topic, and it’s an opportunity for the youth to do something about it,” Afzali said.
The workshop is part of a recent initiative by the Muslim Association of Puget Sound, the American Muslim Empowerment Network, which is working on ways to educate Americans about Islam and Muslims.
|
/*
*LEGAL NOTICE
© Copyright
*This program is protected by copyright law.
*Unlawful reproduction or distribution of this program, or of any of its parts,
*is punishable by law with severe civil and criminal penalties,
*and will be subject to all applicable legal sanctions.
*Its content may not be copied for commercial or other purposes,
*nor may it be displayed, even in a modified form, on other websites.
*Only hyperlinks to the website are permitted.
*/
package com.bydan.erp.nomina.util.report;
import org.apache.log4j.Logger;
import java.sql.Time;
import java.sql.Timestamp;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.CellStyle;
import org.apache.poi.ss.usermodel.IndexedColors;
import org.apache.poi.ss.util.CellRangeAddress;
import javax.swing.border.Border;
import java.io.InputStream;
import java.util.Scanner;
import java.util.List;
import java.util.ArrayList;
import java.util.Set;
import java.util.Date;
//import java.util.ArrayList;
import com.bydan.framework.erp.business.entity.GeneralEntity;
import com.bydan.framework.erp.business.entity.GeneralEntityReturnGeneral;
import com.bydan.framework.erp.business.entity.GeneralEntityParameterGeneral;
import com.bydan.framework.erp.business.entity.DatoGeneral;
import com.bydan.framework.erp.business.entity.OrderBy;
import com.bydan.framework.erp.business.entity.Classe;
import com.bydan.framework.erp.business.entity.Reporte;
import com.bydan.framework.erp.util.ConstantesJsp;
import com.bydan.framework.erp.business.dataaccess.ConstantesSql;
import com.bydan.erp.nomina.resources.general.AuxiliarGeneral;
import com.bydan.erp.nomina.util.report.ProcesoCierreMesConstantesFunciones;
import com.bydan.erp.nomina.util.report.ProcesoCierreMesParameterReturnGeneral;
//import com.bydan.erp.nomina.util.report.ProcesoCierreMesParameterGeneral;
import com.bydan.framework.erp.business.logic.DatosCliente;
import com.bydan.framework.erp.util.*;
import com.bydan.erp.nomina.business.entity.*;
import com.bydan.erp.nomina.business.entity.report.*;
import com.bydan.erp.seguridad.business.entity.*;
import com.bydan.erp.seguridad.util.*;
import com.bydan.erp.nomina.util.*;
//import com.bydan.framework.erp.util.*;
//import com.bydan.framework.erp.business.logic.*;
//import com.bydan.erp.nomina.business.dataaccess.*;
//import com.bydan.erp.nomina.business.logic.*;
//import java.sql.SQLException;
//CONTROL_INCLUDE
@SuppressWarnings("unused")
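//NOTE: this appears to be a code-generated constants/helper class for the ProcesoCierreMes
//("month-close process") entity of the payroll (nomina) module. It centralizes table and
//column names, UI labels, EJB/JNDI lookup names, query templates, and the per-column
//visibility/enable/highlight state used by the Swing and JSP front ends.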
final public class ProcesoCierreMesConstantesFunciones{
public static String S_TIPOREPORTE_EXTRA="";
//USED MOSTLY IN RELATED VIEWS FOR MASTER-DETAIL MAINTENANCE
public static Integer TAMANIO_ALTO_MAXIMO_TABLADATOS=Constantes.ISWING_TAMANIOMAXIMO_TABLADATOS;
public static Integer TAMANIO_ALTO_MINIMO_TABLADATOS=Constantes.ISWING_TAMANIOMINIMO_TABLADATOS;
//FOR THE RELATIONS TABLE (DESCRIPTION HEIGHTPE_REL_TAB)
public static Integer ALTO_TABPANE_RELACIONES=Constantes.ISWING_ALTO_TABPANE + Funciones2.getValorProporcion(Constantes.ISWING_ALTO_TABPANE,0);
//FOR THE RELATED TABLE (DESCRIPTION HEIGHTPE_REL)
public static Integer TAMANIO_ALTO_MAXIMO_TABLADATOSREL=Constantes.ISWING_TAMANIOMAXIMO_TABLADATOSREL + Funciones2.getValorProporcion(Constantes.ISWING_TAMANIOMAXIMO_TABLADATOSREL,0);
public static Integer TAMANIO_ALTO_MINIMO_TABLADATOSREL=Constantes.ISWING_TAMANIOMINIMO_TABLADATOSREL + Funciones2.getValorProporcion(Constantes.ISWING_TAMANIOMINIMO_TABLADATOSREL,0);
//TO CHANGE EVERYTHING --> CHANGE IN THE RELATIONS TABLE AND THE RELATED TABLES
/*
TO HANDLE THE RELATIONS TAB WITH A DATA TABLE, THE FOLLOWING CONSTANT VALUES MUST BE MODIFIED AND VERIFIED:
final public static Integer ISWING_TAMANIOMAXIMO_TABLADATOSREL=240;//230;350;
final public static Integer ISWING_TAMANIOMINIMO_TABLADATOSREL=240;//230;260
final public static Integer ISWING_ALTO_TABPANE=375;//375;400;260;
OTHERWISE, THESE VALUES WOULD BE SET PER CASE (NOT CONSTANTS)
NOTE:
* HORIZONTAL ALIGNMENT IS STILL MISSING
*/
public static final String SFINALQUERY="";
public static final String SNOMBREOPCION="ProcesoCierreMes";
public static final String SPATHOPCION="Nomina";
public static final String SPATHMODULO="nomina/";
public static final String SPERSISTENCECONTEXTNAME="";
public static final String SPERSISTENCENAME="ProcesoCierreMes"+ProcesoCierreMesConstantesFunciones.SPERSISTENCECONTEXTNAME+Constantes.SPERSISTENCECONTEXTNAME;
public static final String SEJBNAME="ProcesoCierreMesHomeRemote";
public static final String SEJBNAME_ADDITIONAL="ProcesoCierreMesHomeRemoteAdditional";
//RMI
public static final String SLOCALEJBNAME_RMI=ProcesoCierreMesConstantesFunciones.SCHEMA+"_"+ProcesoCierreMesConstantesFunciones.SEJBNAME+"_"+Constantes.SEJBLOCAL;//"erp/ProcesoCierreMesHomeRemote/local"
public static final String SREMOTEEJBNAME_RMI=ProcesoCierreMesConstantesFunciones.SCHEMA+"_"+ProcesoCierreMesConstantesFunciones.SEJBNAME+"_"+Constantes.SEJBREMOTE;//remote
public static final String SLOCALEJBNAMEADDITIONAL_RMI=ProcesoCierreMesConstantesFunciones.SCHEMA+"_"+ProcesoCierreMesConstantesFunciones.SEJBNAME+Constantes.SEJBADDITIONAL+"_"+Constantes.SEJBLOCAL;//"erp/ProcesoCierreMesHomeRemote/local"
public static final String SREMOTEEJBNAMEADDITIONAL_RMI=ProcesoCierreMesConstantesFunciones.SCHEMA+"_"+ProcesoCierreMesConstantesFunciones.SEJBNAME+Constantes.SEJBADDITIONAL+"_"+Constantes.SEJBREMOTE;//remote
//RMI
//JBOSS5.1
public static final String SLOCALEJBNAME=Constantes.SEJBPACKAGE+Constantes.SEJBSEPARATOR+ProcesoCierreMesConstantesFunciones.SEJBNAME+Constantes.SEJBSEPARATOR+Constantes.SEJBLOCAL;//"erp/ProcesoCierreMesHomeRemote/local"
public static final String SREMOTEEJBNAME=Constantes.SEJBPACKAGE+Constantes.SEJBSEPARATOR+ProcesoCierreMesConstantesFunciones.SEJBNAME+Constantes.SEJBSEPARATOR+Constantes.SEJBREMOTE;//remote
public static final String SLOCALEJBNAMEADDITIONAL=Constantes.SEJBPACKAGE+Constantes.SEJBSEPARATOR+ProcesoCierreMesConstantesFunciones.SEJBNAME+Constantes.SEJBADDITIONAL+Constantes.SEJBSEPARATOR+Constantes.SEJBLOCAL;//"erp/ProcesoCierreMesHomeRemote/local"
public static final String SREMOTEEJBNAMEADDITIONAL=Constantes.SEJBPACKAGE+Constantes.SEJBSEPARATOR+ProcesoCierreMesConstantesFunciones.SEJBNAME+Constantes.SEJBADDITIONAL+Constantes.SEJBSEPARATOR+Constantes.SEJBREMOTE;//remote
//JBOSS5.1
public static final String SSESSIONNAME=ProcesoCierreMesConstantesFunciones.OBJECTNAME + Constantes.SSESSIONBEAN;
public static final String SSESSIONNAME_FACE=Constantes.SFACE_INI+ProcesoCierreMesConstantesFunciones.SSESSIONNAME + Constantes.SFACE_FIN;
public static final String SREQUESTNAME=ProcesoCierreMesConstantesFunciones.OBJECTNAME + Constantes.SREQUESTBEAN;
public static final String SREQUESTNAME_FACE=Constantes.SFACE_INI+ProcesoCierreMesConstantesFunciones.SREQUESTNAME + Constantes.SFACE_FIN;
public static final String SCLASSNAMETITULOREPORTES="Proceso Cierre Meses";
public static final String SRELATIVEPATH="../../../../";
public static final String SCLASSPLURAL="es";
public static final String SCLASSWEBTITULO="Proceso Cierre Mes";
public static final String SCLASSWEBTITULO_LOWER="Proceso Cierre Mes";
public static Integer INUMEROPAGINACION=10;
public static Integer ITAMANIOFILATABLA=Constantes.ISWING_ALTO_FILA;
public static Boolean ES_DEBUG=false;
public static Boolean CON_DESCRIPCION_DETALLADO=false;
public static final String CLASSNAME="ProcesoCierreMes";
public static final String OBJECTNAME="procesocierremes";
//FOR BUILDING QUERIES
public static final String SCHEMA=Constantes.SCHEMA_NOMINA;
public static final String TABLENAME="proceso_cierre_mes";
public static final String SQL_SECUENCIAL=SCHEMA+"."+TABLENAME+"_id_seq";
public static String QUERYSELECT="select procesocierremes from "+ProcesoCierreMesConstantesFunciones.SPERSISTENCENAME+" procesocierremes";
public static String QUERYSELECTNATIVE="select "+ProcesoCierreMesConstantesFunciones.SCHEMA+"."+ProcesoCierreMesConstantesFunciones.TABLENAME+".id,"+ProcesoCierreMesConstantesFunciones.SCHEMA+"."+ProcesoCierreMesConstantesFunciones.TABLENAME+".version_row,"+ProcesoCierreMesConstantesFunciones.SCHEMA+"."+ProcesoCierreMesConstantesFunciones.TABLENAME+".id_mes,"+ProcesoCierreMesConstantesFunciones.SCHEMA+"."+ProcesoCierreMesConstantesFunciones.TABLENAME+".id_estructura,"+ProcesoCierreMesConstantesFunciones.SCHEMA+"."+ProcesoCierreMesConstantesFunciones.TABLENAME+".es_para_reversion from "+ProcesoCierreMesConstantesFunciones.SCHEMA+"."+ProcesoCierreMesConstantesFunciones.TABLENAME;//+" as "+ProcesoCierreMesConstantesFunciones.TABLENAME;
//AUDITING
public static Boolean ISCONAUDITORIA=false;
public static Boolean ISCONAUDITORIADETALLE=true;
//SAVE-ONLY-MASTER-DETAIL FUNCTIONALITY
public static Boolean ISGUARDARREL=false;
public static final String ID=ConstantesSql.ID;
public static final String VERSIONROW=ConstantesSql.VERSIONROW;
public static final String IDMES= "id_mes";
public static final String IDESTRUCTURA= "id_estructura";
public static final String ESPARAREVERSION= "es_para_reversion";
//FIELD TITLES
public static final String LABEL_ID= "Id";
public static final String LABEL_ID_LOWER= "id";
public static final String LABEL_VERSIONROW= "Version Row";
public static final String LABEL_VERSIONROW_LOWER= "version Row";
public static final String LABEL_IDMES= "Mes";
public static final String LABEL_IDMES_LOWER= "Mes";
public static final String LABEL_IDESTRUCTURA= "Estructura";
public static final String LABEL_IDESTRUCTURA_LOWER= "Estructura";
public static final String LABEL_ESPARAREVERSION= "Es Para Reversion";
public static final String LABEL_ESPARAREVERSION_LOWER= "Es Para Reversion";
public static String getProcesoCierreMesLabelDesdeNombre(String sNombreColumna) {
String sLabelColumna="";
if(sNombreColumna.equals(ProcesoCierreMesConstantesFunciones.IDMES)) {sLabelColumna=ProcesoCierreMesConstantesFunciones.LABEL_IDMES;}
if(sNombreColumna.equals(ProcesoCierreMesConstantesFunciones.IDESTRUCTURA)) {sLabelColumna=ProcesoCierreMesConstantesFunciones.LABEL_IDESTRUCTURA;}
if(sNombreColumna.equals(ProcesoCierreMesConstantesFunciones.ESPARAREVERSION)) {sLabelColumna=ProcesoCierreMesConstantesFunciones.LABEL_ESPARAREVERSION;}
if(sLabelColumna.equals("")) {
sLabelColumna=sNombreColumna;
}
return sLabelColumna;
}
public static String getNombreEjb_JBoss81(String sAplicacion,String sModule,String sClaseEjb,String sInterfaceEjb) throws Exception {
String sDescripcion="";
sDescripcion="ejb:"+sAplicacion+"/"+sModule+"/"+sClaseEjb+"!" + sInterfaceEjb;
return sDescripcion;
}
public static String getes_para_reversionDescripcion(ProcesoCierreMes procesocierremes) throws Exception {
String sDescripcion=Constantes.SCAMPOVERDADERO;
if(!procesocierremes.getes_para_reversion()) {
sDescripcion=Constantes.SCAMPOFALSO;
}
return sDescripcion;
}
public static String getes_para_reversionHtmlDescripcion(ProcesoCierreMes procesocierremes) throws Exception {
String sDescripcion=FuncionesJsp.getStringHtmlCheckBox(procesocierremes.getId(),procesocierremes.getes_para_reversion());
return sDescripcion;
}
public static String getProcesoCierreMesDescripcion(ProcesoCierreMes procesocierremes) {
String sDescripcion=Constantes.SCAMPONONE;
if(procesocierremes !=null/* && procesocierremes.getId()!=0*/) {
if(procesocierremes.getId()!=null) {
sDescripcion=procesocierremes.getId().toString();
}//procesocierremesprocesocierremes.getId().toString();
}
return sDescripcion;
}
public static String getProcesoCierreMesDescripcionDetallado(ProcesoCierreMes procesocierremes) {
String sDescripcion="";
sDescripcion+=ProcesoCierreMesConstantesFunciones.ID+"=";
sDescripcion+=procesocierremes.getId().toString()+",";
sDescripcion+=ProcesoCierreMesConstantesFunciones.VERSIONROW+"=";
sDescripcion+=procesocierremes.getVersionRow().toString()+",";
sDescripcion+=ProcesoCierreMesConstantesFunciones.IDMES+"=";
sDescripcion+=procesocierremes.getid_mes().toString()+",";
sDescripcion+=ProcesoCierreMesConstantesFunciones.IDESTRUCTURA+"=";
sDescripcion+=procesocierremes.getid_estructura().toString()+",";
sDescripcion+=ProcesoCierreMesConstantesFunciones.ESPARAREVERSION+"=";
sDescripcion+=procesocierremes.getes_para_reversion().toString()+",";
return sDescripcion;
}
public static void setProcesoCierreMesDescripcion(ProcesoCierreMes procesocierremes,String sValor) throws Exception {
if(procesocierremes !=null) {
//procesocierremesprocesocierremes.getId().toString();
}
}
public static String getMesDescripcion(Mes mes) {
String sDescripcion=Constantes.SCAMPONONE;
if(mes!=null/*&&mes.getId()>0*/) {
sDescripcion=MesConstantesFunciones.getMesDescripcion(mes);
}
return sDescripcion;
}
public static String getEstructuraDescripcion(Estructura estructura) {
String sDescripcion=Constantes.SCAMPONONE;
if(estructura!=null/*&&estructura.getId()>0*/) {
sDescripcion=EstructuraConstantesFunciones.getEstructuraDescripcion(estructura);
}
return sDescripcion;
}
public static String getNombreIndice(String sNombreIndice) {
if(sNombreIndice.equals("Todos")) {
sNombreIndice="Tipo=Todos";
} else if(sNombreIndice.equals("PorId")) {
sNombreIndice="Tipo=Por Id";
} else if(sNombreIndice.equals("BusquedaProcesoCierreMes")) {
sNombreIndice="Tipo= Por Mes Por Estructura Por Es Para Reversion";
} else if(sNombreIndice.equals("FK_IdEstructura")) {
sNombreIndice="Tipo= Por Estructura";
} else if(sNombreIndice.equals("FK_IdMes")) {
sNombreIndice="Tipo= Por Mes";
}
return sNombreIndice;
}
public static String getDetalleIndicePorId(Long id) {
return "Parametros->Porid="+id.toString();
}
public static String getDetalleIndiceBusquedaProcesoCierreMes(Long id_mes,Long id_estructura,Boolean es_para_reversion) {
String sDetalleIndice=" Parametros->";
if(id_mes!=null) {sDetalleIndice+=" Codigo Unico De Mes="+id_mes.toString();}
if(id_estructura!=null) {sDetalleIndice+=" Codigo Unico De Estructura="+id_estructura.toString();}
if(es_para_reversion!=null) {sDetalleIndice+=" Es Para Reversion="+es_para_reversion.toString();}
return sDetalleIndice;
}
public static String getDetalleIndiceFK_IdEstructura(Long id_estructura) {
String sDetalleIndice=" Parametros->";
if(id_estructura!=null) {sDetalleIndice+=" Codigo Unico De Estructura="+id_estructura.toString();}
return sDetalleIndice;
}
public static String getDetalleIndiceFK_IdMes(Long id_mes) {
String sDetalleIndice=" Parametros->";
if(id_mes!=null) {sDetalleIndice+=" Codigo Unico De Mes="+id_mes.toString();}
return sDetalleIndice;
}
public static void quitarEspaciosProcesoCierreMes(ProcesoCierreMes procesocierremes,ArrayList<DatoGeneral> arrDatoGeneral) throws Exception {
}
public static void quitarEspaciosProcesoCierreMess(List<ProcesoCierreMes> procesocierremess,ArrayList<DatoGeneral> arrDatoGeneral) throws Exception {
for(ProcesoCierreMes procesocierremes: procesocierremess) {
}
}
public static void InicializarGeneralEntityAuxiliaresProcesoCierreMes(ProcesoCierreMes procesocierremes,Boolean conAsignarBase,Boolean conInicializarAuxiliar) throws Exception {
if(conAsignarBase && procesocierremes.getConCambioAuxiliar()) {
procesocierremes.setIsDeleted(procesocierremes.getIsDeletedAuxiliar());
procesocierremes.setIsNew(procesocierremes.getIsNewAuxiliar());
procesocierremes.setIsChanged(procesocierremes.getIsChangedAuxiliar());
//ALREADY RESTORED; SHOULD NOT DO IT AGAIN, AT LEAST NOT UNTIL SAVING AGAIN
procesocierremes.setConCambioAuxiliar(false);
}
if(conInicializarAuxiliar) {
procesocierremes.setIsDeletedAuxiliar(false);
procesocierremes.setIsNewAuxiliar(false);
procesocierremes.setIsChangedAuxiliar(false);
procesocierremes.setConCambioAuxiliar(false);
}
}
public static void InicializarGeneralEntityAuxiliaresProcesoCierreMess(List<ProcesoCierreMes> procesocierremess,Boolean conAsignarBase,Boolean conInicializarAuxiliar) throws Exception {
for(ProcesoCierreMes procesocierremes : procesocierremess) {
if(conAsignarBase && procesocierremes.getConCambioAuxiliar()) {
procesocierremes.setIsDeleted(procesocierremes.getIsDeletedAuxiliar());
procesocierremes.setIsNew(procesocierremes.getIsNewAuxiliar());
procesocierremes.setIsChanged(procesocierremes.getIsChangedAuxiliar());
//ALREADY RESTORED; SHOULD NOT DO IT AGAIN, AT LEAST NOT UNTIL SAVING AGAIN
procesocierremes.setConCambioAuxiliar(false);
}
if(conInicializarAuxiliar) {
procesocierremes.setIsDeletedAuxiliar(false);
procesocierremes.setIsNewAuxiliar(false);
procesocierremes.setIsChangedAuxiliar(false);
procesocierremes.setConCambioAuxiliar(false);
}
}
}
public static void InicializarValoresProcesoCierreMes(ProcesoCierreMes procesocierremes,Boolean conEnteros) throws Exception {
if(conEnteros) {
Short ish_value=0;
}
}
public static void InicializarValoresProcesoCierreMess(List<ProcesoCierreMes> procesocierremess,Boolean conEnteros) throws Exception {
for(ProcesoCierreMes procesocierremes: procesocierremess) {
if(conEnteros) {
Short ish_value=0;
}
}
}
public static void TotalizarValoresFilaProcesoCierreMes(List<ProcesoCierreMes> procesocierremess,ProcesoCierreMes procesocierremesAux) throws Exception {
ProcesoCierreMesConstantesFunciones.InicializarValoresProcesoCierreMes(procesocierremesAux,true);
for(ProcesoCierreMes procesocierremes: procesocierremess) {
if(procesocierremes.getsType().equals(Constantes2.S_TOTALES)) {
continue;
}
}
}
public static ArrayList<String> getArrayColumnasGlobalesProcesoCierreMes(ArrayList<DatoGeneral> arrDatoGeneral) throws Exception {
ArrayList<String> arrColumnasGlobales=new ArrayList<String>();
arrColumnasGlobales=ProcesoCierreMesConstantesFunciones.getArrayColumnasGlobalesProcesoCierreMes(arrDatoGeneral,new ArrayList<String>());
return arrColumnasGlobales;
}
public static ArrayList<String> getArrayColumnasGlobalesProcesoCierreMes(ArrayList<DatoGeneral> arrDatoGeneral,ArrayList<String> arrColumnasGlobalesNo) throws Exception {
ArrayList<String> arrColumnasGlobales=new ArrayList<String>();
Boolean noExiste=false;
return arrColumnasGlobales;
}
public static ArrayList<String> getArrayColumnasGlobalesNoProcesoCierreMes(ArrayList<DatoGeneral> arrDatoGeneral) throws Exception {
ArrayList<String> arrColumnasGlobales=new ArrayList<String>();
return arrColumnasGlobales;
}
public static Boolean ExisteEnLista(List<ProcesoCierreMes> procesocierremess,ProcesoCierreMes procesocierremes,Boolean conIdNulo) throws Exception {
Boolean existe=false;
for(ProcesoCierreMes procesocierremesAux: procesocierremess) {
if(procesocierremesAux!=null && procesocierremes!=null) {
if((procesocierremesAux.getId()==null && procesocierremes.getId()==null) && conIdNulo) {
existe=true;
break;
} else if(procesocierremesAux.getId()!=null && procesocierremes.getId()!=null){
if(procesocierremesAux.getId().equals(procesocierremes.getId())) {
existe=true;
break;
}
}
}
}
return existe;
}
public static ArrayList<DatoGeneral> getTotalesListaProcesoCierreMes(List<ProcesoCierreMes> procesocierremess) throws Exception {
ArrayList<DatoGeneral> arrTotalesDatoGeneral=new ArrayList<DatoGeneral>();
DatoGeneral datoGeneral=new DatoGeneral();
for(ProcesoCierreMes procesocierremes: procesocierremess) {
if(procesocierremes.getsType().equals(Constantes2.S_TOTALES)) {
continue;
}
}
return arrTotalesDatoGeneral;
}
public static ArrayList<OrderBy> getOrderByListaProcesoCierreMes() throws Exception {
ArrayList<OrderBy> arrOrderBy=new ArrayList<OrderBy>();
OrderBy orderBy=new OrderBy();
return arrOrderBy;
}
public static List<String> getTodosTiposColumnasProcesoCierreMes() throws Exception {
List<String> arrTiposColumnas=new ArrayList<String>();
String sTipoColumna=new String();
return arrTiposColumnas;
}
public static ArrayList<Reporte> getTiposSeleccionarProcesoCierreMes() throws Exception {
return ProcesoCierreMesConstantesFunciones.getTiposSeleccionarProcesoCierreMes(false,true,true,true,true);
}
public static ArrayList<Reporte> getTiposSeleccionarProcesoCierreMes(Boolean conFk) throws Exception {
return ProcesoCierreMesConstantesFunciones.getTiposSeleccionarProcesoCierreMes(conFk,true,true,true,true);
}
public static ArrayList<Reporte> getTiposSeleccionarProcesoCierreMes(Boolean conFk,Boolean conStringColumn,Boolean conValorColumn,Boolean conFechaColumn,Boolean conBitColumn) throws Exception {
ArrayList<Reporte> arrTiposSeleccionarTodos=new ArrayList<Reporte>();
Reporte reporte=new Reporte();
if(conFk) {
reporte=new Reporte();
reporte.setsCodigo(ProcesoCierreMesConstantesFunciones.LABEL_IDMES);
reporte.setsDescripcion(ProcesoCierreMesConstantesFunciones.LABEL_IDMES);
arrTiposSeleccionarTodos.add(reporte);
}
if(conFk) {
reporte=new Reporte();
reporte.setsCodigo(ProcesoCierreMesConstantesFunciones.LABEL_IDESTRUCTURA);
reporte.setsDescripcion(ProcesoCierreMesConstantesFunciones.LABEL_IDESTRUCTURA);
arrTiposSeleccionarTodos.add(reporte);
}
if(conBitColumn) {
reporte=new Reporte();
reporte.setsCodigo(ProcesoCierreMesConstantesFunciones.LABEL_ESPARAREVERSION);
reporte.setsDescripcion(ProcesoCierreMesConstantesFunciones.LABEL_ESPARAREVERSION);
arrTiposSeleccionarTodos.add(reporte);
}
return arrTiposSeleccionarTodos;
}
public static ArrayList<Reporte> getTiposRelacionesProcesoCierreMes(Boolean conEspecial) throws Exception {
ArrayList<Reporte> arrTiposRelacionesTodos=new ArrayList<Reporte>();
Reporte reporte=new Reporte();
//THIS IS IN THE CONTROLLER
return arrTiposRelacionesTodos;
}
public static void refrescarForeignKeysDescripcionesProcesoCierreMes(ProcesoCierreMes procesocierremesAux) throws Exception {
procesocierremesAux.setmes_descripcion(MesConstantesFunciones.getMesDescripcion(procesocierremesAux.getMes()));
procesocierremesAux.setestructura_descripcion(EstructuraConstantesFunciones.getEstructuraDescripcion(procesocierremesAux.getEstructura()));
}
public static void refrescarForeignKeysDescripcionesProcesoCierreMes(List<ProcesoCierreMes> procesocierremessTemp) throws Exception {
for(ProcesoCierreMes procesocierremesAux:procesocierremessTemp) {
procesocierremesAux.setmes_descripcion(MesConstantesFunciones.getMesDescripcion(procesocierremesAux.getMes()));
procesocierremesAux.setestructura_descripcion(EstructuraConstantesFunciones.getEstructuraDescripcion(procesocierremesAux.getEstructura()));
}
}
public static ArrayList<Classe> getClassesForeignKeysOfProcesoCierreMes(ArrayList<Classe> classesP,DeepLoadType deepLoadType)throws Exception {
try {
ArrayList<Classe> classes=new ArrayList<Classe>();
if(deepLoadType.equals(DeepLoadType.NONE)) {
classes.add(new Classe(Mes.class));
classes.add(new Classe(Estructura.class));
} else if(deepLoadType.equals(DeepLoadType.INCLUDE)) {
for(Classe clas:classesP) {
if(clas.clas.equals(Mes.class)) {
classes.add(new Classe(Mes.class));
}
}
for(Classe clas:classesP) {
if(clas.clas.equals(Estructura.class)) {
classes.add(new Classe(Estructura.class));
}
}
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
}
return classes;
} catch(Exception e) {
//Funciones.manageException(logger,e);
throw e;
}
}
public static ArrayList<Classe> getClassesForeignKeysFromStringsOfProcesoCierreMes(ArrayList<String> arrClasses,DeepLoadType deepLoadType)throws Exception {
try {
ArrayList<Classe> classes=new ArrayList<Classe>();
if(deepLoadType.equals(DeepLoadType.NONE)) {
for(String sClasse:arrClasses) {
if(Mes.class.getSimpleName().equals(sClasse)) {
classes.add(new Classe(Mes.class)); continue;
}
if(Estructura.class.getSimpleName().equals(sClasse)) {
classes.add(new Classe(Estructura.class)); continue;
}
}
} else if(deepLoadType.equals(DeepLoadType.INCLUDE)) {
for(String sClasse:arrClasses) {
if(Mes.class.getSimpleName().equals(sClasse)) {
classes.add(new Classe(Mes.class)); continue;
}
if(Estructura.class.getSimpleName().equals(sClasse)) {
classes.add(new Classe(Estructura.class)); continue;
}
}
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
}
return classes;
} catch(Exception e) {
//Funciones.manageException(logger,e);
throw e;
}
}
public static ArrayList<Classe> getClassesRelationshipsOfProcesoCierreMes(ArrayList<Classe> classesP,DeepLoadType deepLoadType)throws Exception {
try {
return ProcesoCierreMesConstantesFunciones.getClassesRelationshipsOfProcesoCierreMes(classesP,deepLoadType,true);
} catch(Exception e) {
//Funciones.manageException(logger,e);
throw e;
}
}
public static ArrayList<Classe> getClassesRelationshipsOfProcesoCierreMes(ArrayList<Classe> classesP,DeepLoadType deepLoadType,Boolean conMuchosAMuchos)throws Exception {
try {
ArrayList<Classe> classes=new ArrayList<Classe>();
if(deepLoadType.equals(DeepLoadType.NONE)) {
} else if(deepLoadType.equals(DeepLoadType.INCLUDE)) {
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
}
return classes;
} catch(Exception e) {
//Funciones.manageException(logger,e);
throw e;
}
}
public static ArrayList<Classe> getClassesRelationshipsFromStringsOfProcesoCierreMes(ArrayList<String> arrClasses,DeepLoadType deepLoadType)throws Exception {
try {
return ProcesoCierreMesConstantesFunciones.getClassesRelationshipsFromStringsOfProcesoCierreMes(arrClasses,deepLoadType,true);
} catch(Exception e) {
//Funciones.manageException(logger,e);
throw e;
}
}
public static ArrayList<Classe> getClassesRelationshipsFromStringsOfProcesoCierreMes(ArrayList<String> arrClasses,DeepLoadType deepLoadType,Boolean conMuchosAMuchos)throws Exception {
try {
ArrayList<Classe> classes=new ArrayList<Classe>();
if(deepLoadType.equals(DeepLoadType.NONE)) {
for(String sClasse:arrClasses) {
}
} else if(deepLoadType.equals(DeepLoadType.INCLUDE)) {
for(String sClasse:arrClasses) {
}
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
}
return classes;
} catch(Exception e) {
//Funciones.manageException(logger,e);
throw e;
}
}
//CONTROLLER FUNCTIONS
public static void actualizarLista(ProcesoCierreMes procesocierremes,List<ProcesoCierreMes> procesocierremess,Boolean permiteQuitar) throws Exception {
}
public static void actualizarSelectedLista(ProcesoCierreMes procesocierremes,List<ProcesoCierreMes> procesocierremess) throws Exception {
try {
for(ProcesoCierreMes procesocierremesLocal:procesocierremess) {
if(procesocierremesLocal.getId().equals(procesocierremes.getId())) {
procesocierremesLocal.setIsSelected(procesocierremes.getIsSelected());
break;
}
}
} catch(Exception e) {
throw e;
}
}
public static void setEstadosInicialesProcesoCierreMes(List<ProcesoCierreMes> procesocierremessAux) throws Exception {
//this.procesocierremessAux=procesocierremessAux;
for(ProcesoCierreMes procesocierremesAux:procesocierremessAux) {
if(procesocierremesAux.getIsChanged()) {
procesocierremesAux.setIsChanged(false);
}
if(procesocierremesAux.getIsNew()) {
procesocierremesAux.setIsNew(false);
}
if(procesocierremesAux.getIsDeleted()) {
procesocierremesAux.setIsDeleted(false);
}
}
}
public static void setEstadosInicialesProcesoCierreMes(ProcesoCierreMes procesocierremesAux) throws Exception {
//this.procesocierremesAux=procesocierremesAux;
if(procesocierremesAux.getIsChanged()) {
procesocierremesAux.setIsChanged(false);
}
if(procesocierremesAux.getIsNew()) {
procesocierremesAux.setIsNew(false);
}
if(procesocierremesAux.getIsDeleted()) {
procesocierremesAux.setIsDeleted(false);
}
}
public static void seleccionarAsignar(ProcesoCierreMes procesocierremesAsignar,ProcesoCierreMes procesocierremes) throws Exception {
procesocierremesAsignar.setId(procesocierremes.getId());
procesocierremesAsignar.setVersionRow(procesocierremes.getVersionRow());
procesocierremesAsignar.setid_mes(procesocierremes.getid_mes());
procesocierremesAsignar.setmes_descripcion(procesocierremes.getmes_descripcion());
procesocierremesAsignar.setid_estructura(procesocierremes.getid_estructura());
procesocierremesAsignar.setestructura_descripcion(procesocierremes.getestructura_descripcion());
procesocierremesAsignar.setes_para_reversion(procesocierremes.getes_para_reversion());
}
public static void inicializarProcesoCierreMes(ProcesoCierreMes procesocierremes) throws Exception {
try {
procesocierremes.setId(0L);
procesocierremes.setid_mes(null);
procesocierremes.setid_estructura(-1L);
procesocierremes.setes_para_reversion(false);
} catch(Exception e) {
throw e;
}
}
public static void generarExcelReporteHeaderProcesoCierreMes(String sTipo,Row row,Workbook workbook) {
Cell cell=null;
int iCell=0;
CellStyle cellStyle = Funciones2.getStyleTitulo(workbook,"PRINCIPAL");
if(sTipo.equals("RELACIONADO")) {
iCell++;
}
cell = row.createCell(iCell++);
cell.setCellValue(ProcesoCierreMesConstantesFunciones.LABEL_IDMES);
cell.setCellStyle(cellStyle);
cell = row.createCell(iCell++);
cell.setCellValue(ProcesoCierreMesConstantesFunciones.LABEL_IDESTRUCTURA);
cell.setCellStyle(cellStyle);
cell = row.createCell(iCell++);
cell.setCellValue(ProcesoCierreMesConstantesFunciones.LABEL_ESPARAREVERSION);
cell.setCellStyle(cellStyle);
}
public static void generarExcelReporteDataProcesoCierreMes(String sTipo,Row row,Workbook workbook,ProcesoCierreMes procesocierremes,CellStyle cellStyle) throws Exception {
Cell cell=null;
int iCell=0;
if(sTipo.equals("RELACIONADO")) {
iCell++;
}
cell = row.createCell(iCell++);
cell.setCellValue(procesocierremes.getmes_descripcion());
if(cellStyle!=null) {
cell.setCellStyle(cellStyle);
}
cell = row.createCell(iCell++);
cell.setCellValue(procesocierremes.getestructura_descripcion());
if(cellStyle!=null) {
cell.setCellStyle(cellStyle);
}
cell = row.createCell(iCell++);
cell.setCellValue(Funciones2.getDescripcionBoolean(procesocierremes.getes_para_reversion()));
if(cellStyle!=null) {
cell.setCellStyle(cellStyle);
}
}
//CONTROLLER FUNCTIONS
public String sFinalQueryProcesoCierreMes="";
public String getsFinalQueryProcesoCierreMes() {
return this.sFinalQueryProcesoCierreMes;
}
public void setsFinalQueryProcesoCierreMes(String sFinalQueryProcesoCierreMes) {
this.sFinalQueryProcesoCierreMes= sFinalQueryProcesoCierreMes;
}
public Border resaltarSeleccionarProcesoCierreMes=null;
public Border setResaltarSeleccionarProcesoCierreMes(ParametroGeneralUsuario parametroGeneralUsuario/*ProcesoCierreMesBeanSwingJInternalFrame procesocierremesBeanSwingJInternalFrame*/) {
Border borderResaltar=Funciones2.getBorderResaltar(parametroGeneralUsuario,"COLUMNA");
//procesocierremesBeanSwingJInternalFrame.jTtoolBarProcesoCierreMes.setBorder(borderResaltar);
this.resaltarSeleccionarProcesoCierreMes= borderResaltar;
return borderResaltar;
}
public Border getResaltarSeleccionarProcesoCierreMes() {
return this.resaltarSeleccionarProcesoCierreMes;
}
public void setResaltarSeleccionarProcesoCierreMes(Border borderResaltarSeleccionarProcesoCierreMes) {
this.resaltarSeleccionarProcesoCierreMes= borderResaltarSeleccionarProcesoCierreMes;
}
//HIGHLIGHT, VISIBILITY, ENABLE COLUMN
public Border resaltaridProcesoCierreMes=null;
public Boolean mostraridProcesoCierreMes=true;
public Boolean activaridProcesoCierreMes=true;
public Border resaltarid_mesProcesoCierreMes=null;
public Boolean mostrarid_mesProcesoCierreMes=true;
public Boolean activarid_mesProcesoCierreMes=true;
public Boolean cargarid_mesProcesoCierreMes=true;//ConNoLoadForeignKeyColumnOTable=false
public Boolean event_dependid_mesProcesoCierreMes=false;//ConEventDepend=true
public Border resaltarid_estructuraProcesoCierreMes=null;
public Boolean mostrarid_estructuraProcesoCierreMes=true;
public Boolean activarid_estructuraProcesoCierreMes=true;
public Boolean cargarid_estructuraProcesoCierreMes=true;//ConNoLoadForeignKeyColumnOTable=false
public Boolean event_dependid_estructuraProcesoCierreMes=false;//ConEventDepend=true
public Border resaltares_para_reversionProcesoCierreMes=null;
public Boolean mostrares_para_reversionProcesoCierreMes=true;
public Boolean activares_para_reversionProcesoCierreMes=true;
public Border setResaltaridProcesoCierreMes(ParametroGeneralUsuario parametroGeneralUsuario/*ProcesoCierreMesBeanSwingJInternalFrame procesocierremesBeanSwingJInternalFrame*/) {
Border borderResaltar=Funciones2.getBorderResaltar(parametroGeneralUsuario,"COLUMNA");
//procesocierremesBeanSwingJInternalFrame.jTtoolBarProcesoCierreMes.setBorder(borderResaltar);
this.resaltaridProcesoCierreMes= borderResaltar;
return borderResaltar;
}
public Border getresaltaridProcesoCierreMes() {
return this.resaltaridProcesoCierreMes;
}
public void setResaltaridProcesoCierreMes(Border borderResaltar) {
this.resaltaridProcesoCierreMes= borderResaltar;
}
public Boolean getMostraridProcesoCierreMes() {
return this.mostraridProcesoCierreMes;
}
public void setMostraridProcesoCierreMes(Boolean mostraridProcesoCierreMes) {
this.mostraridProcesoCierreMes= mostraridProcesoCierreMes;
}
public Boolean getActivaridProcesoCierreMes() {
return this.activaridProcesoCierreMes;
}
public void setActivaridProcesoCierreMes(Boolean activaridProcesoCierreMes) {
this.activaridProcesoCierreMes= activaridProcesoCierreMes;
}
public Border setResaltarid_mesProcesoCierreMes(ParametroGeneralUsuario parametroGeneralUsuario/*ProcesoCierreMesBeanSwingJInternalFrame procesocierremesBeanSwingJInternalFrame*/) {
Border borderResaltar=Funciones2.getBorderResaltar(parametroGeneralUsuario,"COLUMNA");
//procesocierremesBeanSwingJInternalFrame.jTtoolBarProcesoCierreMes.setBorder(borderResaltar);
this.resaltarid_mesProcesoCierreMes= borderResaltar;
return borderResaltar;
}
public Border getresaltarid_mesProcesoCierreMes() {
return this.resaltarid_mesProcesoCierreMes;
}
public void setResaltarid_mesProcesoCierreMes(Border borderResaltar) {
this.resaltarid_mesProcesoCierreMes= borderResaltar;
}
public Boolean getMostrarid_mesProcesoCierreMes() {
return this.mostrarid_mesProcesoCierreMes;
}
public void setMostrarid_mesProcesoCierreMes(Boolean mostrarid_mesProcesoCierreMes) {
this.mostrarid_mesProcesoCierreMes= mostrarid_mesProcesoCierreMes;
}
public Boolean getActivarid_mesProcesoCierreMes() {
return this.activarid_mesProcesoCierreMes;
}
public void setActivarid_mesProcesoCierreMes(Boolean activarid_mesProcesoCierreMes) {
this.activarid_mesProcesoCierreMes= activarid_mesProcesoCierreMes;
}
public Boolean getCargarid_mesProcesoCierreMes() {
return this.cargarid_mesProcesoCierreMes;
}
public void setCargarid_mesProcesoCierreMes(Boolean cargarid_mesProcesoCierreMes) {
this.cargarid_mesProcesoCierreMes= cargarid_mesProcesoCierreMes;
}
public Border setResaltarid_estructuraProcesoCierreMes(ParametroGeneralUsuario parametroGeneralUsuario/*ProcesoCierreMesBeanSwingJInternalFrame procesocierremesBeanSwingJInternalFrame*/) {
Border borderResaltar=Funciones2.getBorderResaltar(parametroGeneralUsuario,"COLUMNA");
//procesocierremesBeanSwingJInternalFrame.jTtoolBarProcesoCierreMes.setBorder(borderResaltar);
this.resaltarid_estructuraProcesoCierreMes= borderResaltar;
return borderResaltar;
}
public Border getresaltarid_estructuraProcesoCierreMes() {
return this.resaltarid_estructuraProcesoCierreMes;
}
public void setResaltarid_estructuraProcesoCierreMes(Border borderResaltar) {
this.resaltarid_estructuraProcesoCierreMes= borderResaltar;
}
public Boolean getMostrarid_estructuraProcesoCierreMes() {
return this.mostrarid_estructuraProcesoCierreMes;
}
public void setMostrarid_estructuraProcesoCierreMes(Boolean mostrarid_estructuraProcesoCierreMes) {
this.mostrarid_estructuraProcesoCierreMes= mostrarid_estructuraProcesoCierreMes;
}
public Boolean getActivarid_estructuraProcesoCierreMes() {
return this.activarid_estructuraProcesoCierreMes;
}
public void setActivarid_estructuraProcesoCierreMes(Boolean activarid_estructuraProcesoCierreMes) {
this.activarid_estructuraProcesoCierreMes= activarid_estructuraProcesoCierreMes;
}
public Boolean getCargarid_estructuraProcesoCierreMes() {
return this.cargarid_estructuraProcesoCierreMes;
}
public void setCargarid_estructuraProcesoCierreMes(Boolean cargarid_estructuraProcesoCierreMes) {
this.cargarid_estructuraProcesoCierreMes= cargarid_estructuraProcesoCierreMes;
}
public Border setResaltares_para_reversionProcesoCierreMes(ParametroGeneralUsuario parametroGeneralUsuario/*ProcesoCierreMesBeanSwingJInternalFrame procesocierremesBeanSwingJInternalFrame*/) {
Border borderResaltar=Funciones2.getBorderResaltar(parametroGeneralUsuario,"COLUMNA");
//procesocierremesBeanSwingJInternalFrame.jTtoolBarProcesoCierreMes.setBorder(borderResaltar);
this.resaltares_para_reversionProcesoCierreMes= borderResaltar;
return borderResaltar;
}
public Border getresaltares_para_reversionProcesoCierreMes() {
return this.resaltares_para_reversionProcesoCierreMes;
}
public void setResaltares_para_reversionProcesoCierreMes(Border borderResaltar) {
this.resaltares_para_reversionProcesoCierreMes= borderResaltar;
}
public Boolean getMostrares_para_reversionProcesoCierreMes() {
return this.mostrares_para_reversionProcesoCierreMes;
}
public void setMostrares_para_reversionProcesoCierreMes(Boolean mostrares_para_reversionProcesoCierreMes) {
this.mostrares_para_reversionProcesoCierreMes= mostrares_para_reversionProcesoCierreMes;
}
public Boolean getActivares_para_reversionProcesoCierreMes() {
return this.activares_para_reversionProcesoCierreMes;
}
public void setActivares_para_reversionProcesoCierreMes(Boolean activares_para_reversionProcesoCierreMes) {
this.activares_para_reversionProcesoCierreMes= activares_para_reversionProcesoCierreMes;
}
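//For the setMostrarCampos/setActivarCampos/setResaltarCampos helpers below: with
//DeepLoadType INCLUDE (or NONE) every column is first reset and only the listed fields
//are switched on/highlighted, while EXCLUDE inverts this and switches off only the
//listed fields.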
public void setMostrarCampos(DeepLoadType deepLoadType,ArrayList<Classe> campos)throws Exception {
Boolean esInicial=false;
Boolean esAsigna=false;
if(deepLoadType.equals(DeepLoadType.INCLUDE) || deepLoadType.equals(DeepLoadType.NONE)) {
esInicial=false;
esAsigna=true;
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
esInicial=true;
esAsigna=false;
}
this.setMostraridProcesoCierreMes(esInicial);
this.setMostrarid_mesProcesoCierreMes(esInicial);
this.setMostrarid_estructuraProcesoCierreMes(esInicial);
this.setMostrares_para_reversionProcesoCierreMes(esInicial);
for(Classe campo:campos) {
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.ID)) {
this.setMostraridProcesoCierreMes(esAsigna);
continue;
}
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.IDMES)) {
this.setMostrarid_mesProcesoCierreMes(esAsigna);
continue;
}
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.IDESTRUCTURA)) {
this.setMostrarid_estructuraProcesoCierreMes(esAsigna);
continue;
}
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.ESPARAREVERSION)) {
this.setMostrares_para_reversionProcesoCierreMes(esAsigna);
continue;
}
}
}
public void setActivarCampos(DeepLoadType deepLoadType,ArrayList<Classe> campos)throws Exception {
Boolean esInicial=false;
Boolean esAsigna=false;
if(deepLoadType.equals(DeepLoadType.INCLUDE) || deepLoadType.equals(DeepLoadType.NONE)) {
esInicial=false;
esAsigna=true;
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
esInicial=true;
esAsigna=false;
}
this.setActivaridProcesoCierreMes(esInicial);
this.setActivarid_mesProcesoCierreMes(esInicial);
this.setActivarid_estructuraProcesoCierreMes(esInicial);
this.setActivares_para_reversionProcesoCierreMes(esInicial);
for(Classe campo:campos) {
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.ID)) {
this.setActivaridProcesoCierreMes(esAsigna);
continue;
}
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.IDMES)) {
this.setActivarid_mesProcesoCierreMes(esAsigna);
continue;
}
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.IDESTRUCTURA)) {
this.setActivarid_estructuraProcesoCierreMes(esAsigna);
continue;
}
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.ESPARAREVERSION)) {
this.setActivares_para_reversionProcesoCierreMes(esAsigna);
continue;
}
}
}
public void setResaltarCampos(DeepLoadType deepLoadType,ArrayList<Classe> campos,ParametroGeneralUsuario parametroGeneralUsuario/*,ProcesoCierreMesBeanSwingJInternalFrame procesocierremesBeanSwingJInternalFrame*/)throws Exception {
Border esInicial=null;
Border esAsigna=null;
if(deepLoadType.equals(DeepLoadType.INCLUDE) || deepLoadType.equals(DeepLoadType.NONE)) {
esInicial=null;
esAsigna=Funciones2.getBorderResaltar(parametroGeneralUsuario,"COLUMNA");
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
esInicial=Funciones2.getBorderResaltar(parametroGeneralUsuario,"COLUMNA");
esAsigna=null;
}
this.setResaltaridProcesoCierreMes(esInicial);
this.setResaltarid_mesProcesoCierreMes(esInicial);
this.setResaltarid_estructuraProcesoCierreMes(esInicial);
this.setResaltares_para_reversionProcesoCierreMes(esInicial);
for(Classe campo:campos) {
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.ID)) {
this.setResaltaridProcesoCierreMes(esAsigna);
continue;
}
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.IDMES)) {
this.setResaltarid_mesProcesoCierreMes(esAsigna);
continue;
}
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.IDESTRUCTURA)) {
this.setResaltarid_estructuraProcesoCierreMes(esAsigna);
continue;
}
if(campo.clase.equals(ProcesoCierreMesConstantesFunciones.ESPARAREVERSION)) {
this.setResaltares_para_reversionProcesoCierreMes(esAsigna);
continue;
}
}
}
public void setMostrarRelaciones(DeepLoadType deepLoadType,ArrayList<Classe> clases)throws Exception {
Boolean esInicial=false;
Boolean esAsigna=false;
if(deepLoadType.equals(DeepLoadType.INCLUDE) || deepLoadType.equals(DeepLoadType.NONE)) {
esInicial=false;
esAsigna=true;
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
esInicial=true;
esAsigna=false;
}
for(Classe clase:clases) {
}
}
public void setActivarRelaciones(DeepLoadType deepLoadType,ArrayList<Classe> clases)throws Exception {
Boolean esInicial=false;
Boolean esAsigna=false;
if(deepLoadType.equals(DeepLoadType.INCLUDE) || deepLoadType.equals(DeepLoadType.NONE)) {
esInicial=false;
esAsigna=true;
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
esInicial=true;
esAsigna=false;
}
for(Classe clase:clases) {
}
}
public void setResaltarRelaciones(DeepLoadType deepLoadType,ArrayList<Classe> clases,ParametroGeneralUsuario parametroGeneralUsuario/*,ProcesoCierreMesBeanSwingJInternalFrame procesocierremesBeanSwingJInternalFrame*/)throws Exception {
Border esInicial=null;
Border esAsigna=null;
if(deepLoadType.equals(DeepLoadType.INCLUDE) || deepLoadType.equals(DeepLoadType.NONE)) {
esInicial=null;
esAsigna=Funciones2.getBorderResaltar(parametroGeneralUsuario,"COLUMNA");
} else if(deepLoadType.equals(DeepLoadType.EXCLUDE)) {
esInicial=Funciones2.getBorderResaltar(parametroGeneralUsuario,"COLUMNA");
esAsigna=null;
}
for(Classe clase:clases) {
}
}
public Boolean mostrarBusquedaProcesoCierreMesProcesoCierreMes=true;
public Boolean getMostrarBusquedaProcesoCierreMesProcesoCierreMes() {
return this.mostrarBusquedaProcesoCierreMesProcesoCierreMes;
}
public void setMostrarBusquedaProcesoCierreMesProcesoCierreMes(Boolean visibilidadResaltar) {
this.mostrarBusquedaProcesoCierreMesProcesoCierreMes= visibilidadResaltar;
}
public Boolean activarBusquedaProcesoCierreMesProcesoCierreMes=true;
public Boolean getActivarBusquedaProcesoCierreMesProcesoCierreMes() {
return this.activarBusquedaProcesoCierreMesProcesoCierreMes;
}
public void setActivarBusquedaProcesoCierreMesProcesoCierreMes(Boolean habilitarResaltar) {
this.activarBusquedaProcesoCierreMesProcesoCierreMes= habilitarResaltar;
}
public Border resaltarBusquedaProcesoCierreMesProcesoCierreMes=null;
public Border getResaltarBusquedaProcesoCierreMesProcesoCierreMes() {
return this.resaltarBusquedaProcesoCierreMesProcesoCierreMes;
}
public void setResaltarBusquedaProcesoCierreMesProcesoCierreMes(Border borderResaltar) {
this.resaltarBusquedaProcesoCierreMesProcesoCierreMes= borderResaltar;
}
public void setResaltarBusquedaProcesoCierreMesProcesoCierreMes(ParametroGeneralUsuario parametroGeneralUsuario/*ProcesoCierreMesBeanSwingJInternalFrame procesocierremesBeanSwingJInternalFrame*/) {
Border borderResaltar=Funciones2.getBorderResaltar(parametroGeneralUsuario,"TAB");
this.resaltarBusquedaProcesoCierreMesProcesoCierreMes= borderResaltar;
}
//CONTROL_FUNCION2
}
|
/* ISBN-10 checksum validator (digits only).
 * Reading the ISBN as a number means a trailing 'X' check digit (which stands
 * for 10) cannot be handled, and leading zeros are simply read as missing
 * digits; since a digit 0 contributes nothing to the weighted sum, the result
 * is still correct for purely numeric ISBNs. */
#include <stdio.h>

int main(int argc, char const *argv[])
{
    long long isbn;
    do {
        printf("Enter ISBN Number: ");
        scanf("%lld", &isbn);
    } while (isbn < 0); /* re-prompt on negative input */

    int sum = 0;
    /* Weight the digits 10..1 starting from the check digit. This reversed
     * weighting is equivalent to the usual ISBN-10 rule (first digit x10,
     * ..., check digit x1), because the two sums add up to a multiple of 11.
     * Example: 0306406152 is a valid ISBN-10 and gives sum % 11 == 0. */
    for (int i = 10; i > 0; i--) {
        sum += (isbn % 10) * i;
        isbn /= 10;
    }
    (sum % 11 == 0) ? printf("Valid ISBN!\n") : printf("Invalid ISBN!!\n");
    return 0;
}
|
Trump shrugs: 'Looks like' EU breakup is on its way
The European Union is likely to break up as result of Britain's vote to leave, Donald Trump said Friday morning in Scotland, casting the stunning overnight referendum results as just the start of a larger movement across the continent and around the world.
“Well, it looks like it’s on its way and we’ll see what happens,” Trump said when asked if he saw Great Britain's vote as a precursor to a European Union breakup. “So I could see it happening. I have no opinion, really, but I could certainly see it happening. I saw this happening. I could read what was happening here and I could see things happening in Germany.
"I hope they straighten out the situation because you know it can really be very nasty. What’s going on can be really really nasty," he said.
“People want to take their country back. They want to have independence, in a sense. You see it with Europe, all over Europe,” Trump said. “You’re going to have more than just what happened last night, you’re going to have, I think, many other cases where they want to take their borders back, they want to take their monetary back, they want to take a lot of things back. They want to be able to have a country again. So I think you’re going to have this happen more and more, I really believe that.”
Trump’s visit to Scotland was largely a business trip to his newly renovated Turnberry golf course, though he cast it as an effort to support his family. He spent the bulk of his opening remarks at a press conference on the course’s 9th tee talking up renovations to the property and thanking his family and business associates. He was accompanied on the trip by his adult children.
“I really do see a parallel between what's happening in the United States and what's happening here,” Trump said, connecting his own America first message to Britain’s “leave” vote. “People want to see borders. They don't necessarily want people pouring into their country that they don't know who they are and where they come from.”
|
Google is teaming up with Roomba maker iRobot to create new smart home features based on indoor maps of users' homes.
The high-end Roomba i7+ robot vacuum can map your home's floor plan as it sweeps up. So with Google Assistant, you can command the robot to clean up a specific room with just your voice by saying "Hey Google, clean the kitchen" or "Hey Google, clean the living room."
Going forward, the companies plan to further integrate their platforms, giving customers "the choice to opt in to new innovative smart home experiences that leverage a broader understanding of the home's space," iRobot announced Wednesday. The announcement is light on details, but iRobot says its indoor mapping data "may help to simplify smart home setup and enable powerful new automations."
Last year, iRobot co-founder and CEO Colin Angle announced his intention to share maps of users' homes with third-party companies working on smart home devices. At the time, he said iRobot's mapping data can help smart home devices like lights, thermostats, and security cameras better understand their physical environment.
"Robots with mapping and spatial awareness capabilities will play an important role in allowing other smart devices in the home to more seamlessly work together," Angle said in a statement this week. "We're looking forward to working with Google to explore new ways to enable a more thoughtful home."
Google, meanwhile, said it does not collect any spatial awareness data from iRobot.
"Much like assigning smart lights or other smart devices to rooms in the home, the Assistant only learns what names people have given to areas of their homes, so that it can then deploy the iRobot i7+ to that area," a Google spokesperson told PCMag today. "We do not receive any information on the layout of the home or where the areas are, respectively."
|
Food, Mobility, and Health in a 17th and 18th Century Arctic Mining Population in Silbojokk, Swedish Sápmi

Established in 1635, the silver mine of Nasafjäll and the smeltery site in Silbojokk in Swedish Sápmi were used during several phases until the late 19th century. Excavations in Silbojokk, c. 40 km from Nasafjäll, have revealed buildings such as a smeltery, living houses, a bakery, and a church with a churchyard. From the beginning, both local and non-local individuals worked at the mine and the smeltery. Non-locals were recruited to work in the mine and at the smeltery, and the local Sámi population was recruited to transport the silver down to the Swedish coast. Females, males, and children of different ages were represented among the individuals buried at the churchyard in Silbojokk, which was used between c. 1635 and 1770. Here we study diet, mobility, and exposure to lead (Pb) in the smeltery workers, the miners, and the local population. By employing isotopic analyses (δ13C, δ15N, δ34S, 87Sr/86Sr) and elemental analysis, we demonstrate that individuals in Silbojokk had a homogenous diet, except for two individuals. In addition, both local and non-local individuals were all exposed to Pb, which in some cases could have been harmful to their health.
|
Benefits of sea ice thickness initialization for the Arctic decadal climate prediction skill in EC-Earth3

Abstract. A substantial part of Arctic climate predictability at interannual time scales stems from knowledge of the initial sea ice conditions. Among the variables characterizing sea ice, sea ice volume, being the product of sea ice area/concentration (SIC) and thickness (SIT), is the most sensitive parameter for climate change. However, the majority of climate prediction systems assimilate only the observed SIC, owing to the lack of long-term, reliable global observations of SIT. In this study, the EC-Earth3 Climate Prediction System with anomaly initialization of ocean, SIC, and SIT states is developed. In order to evaluate the benefits of specific initialized variables at regional scales, three sets of retrospective ensemble prediction experiments are performed with different initialization strategies: ocean-only; ocean plus SIC; and ocean plus SIC and SIT initialization. Compared to ocean-only initialization, the skill gained from adding SIC is small in most regions. In the marginal ice zone covered by seasonal ice, skill for winter SIC is mainly gained from the initial ocean temperature anomalies. Consistent with previous studies, Arctic sea ice volume anomalies are found to play a dominant role in the prediction skill of September Arctic sea ice extent. Winter preconditioning of SIT for the perennial ice in the central Arctic Ocean results in increased skill for SIC in the adjacent Arctic coastal waters (e.g. the Laptev/East Siberian/Chukchi Seas) for lead times of up to a decade. This highlights the importance of initializing SIT for regional Arctic sea ice predictions on decadal time scales. Our results suggest that, as climate warming continues and the central Arctic Ocean may become seasonally ice-free in the future, the controlling mechanism for decadal predictability may shift from sea ice volume playing the major role to more ocean-related processes.
|
#ifndef MYAWESOMEGAME_BRICK_QUADRATIC_HPP
#define MYAWESOMEGAME_BRICK_QUADRATIC_HPP

#include "brick_impl.hpp"
#include "shape.hpp"

#include <memory> // for std::shared_ptr

// A quadratic (square-shaped) brick; the shape-specific Box2D setup happens in the
// doCreate() override.
class BrickQuadratic : public BrickImpl {
public:
    BrickQuadratic(std::shared_ptr<jt::Box2DWorldInterface> world, const b2BodyDef* def);

private:
    void doCreate() override;
};

#endif // MYAWESOMEGAME_BRICK_QUADRATIC_HPP
|
# HistGradientBoosting* can be imported from the public sklearn.ensemble namespace
# (this assumes scikit-learn >= 1.0, where these estimators are no longer experimental).
from sklearn.ensemble import (
    HistGradientBoostingClassifier,
    HistGradientBoostingRegressor,
)
from sklearn.linear_model import (
LinearRegression,
LogisticRegression,
LogisticRegressionCV,
RidgeCV,
Ridge,
)
from sklearn.preprocessing import PolynomialFeatures, SplineTransformer
from sklearn.svm import SVC, SVR
from sklearn.ensemble import (
GradientBoostingClassifier,
RandomForestClassifier,
GradientBoostingRegressor,
RandomForestRegressor,
)
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import Pipeline, make_pipeline
# ### Dict of estimators for 1D toy examples ### #
CLASSIFIERS = {
"linear": LogisticRegression(),
"linear_tlearn": Pipeline(
[
("interaction", PolynomialFeatures(interaction_only=True)),
("reg", LogisticRegression()),
]
),
"linearCV": LogisticRegressionCV(),
"gradient_boosting": GradientBoostingClassifier(n_estimators=100, random_state=1),
"mlp": MLPClassifier(hidden_layer_sizes=(100, 100, 100, 100), random_state=0),
"svc_rbf": SVC(probability=True, random_state=0, C=0.2),
# "svc_rbf": SVC(probability=True, random_state=0, C=10),
"svm_lin": SVC(kernel="linear", probability=True, random_state=0),
"svm_poly": SVC(kernel="poly", degree=2, probability=True, random_state=0),
"random_forest": RandomForestClassifier(random_state=0, n_estimators=20),
"poly": make_pipeline(PolynomialFeatures(), LogisticRegression()),
"hist_gradient_boosting": HistGradientBoostingClassifier(),
}
REGRESSORS = {
"linear": LinearRegression(),
"ridge": Ridge(),
"linear_tlearn": Pipeline(
[
("interaction", PolynomialFeatures(interaction_only=True)),
("reg", LinearRegression()),
]
),
"linearCV": RidgeCV(),
"gradient_boosting": GradientBoostingRegressor(n_estimators=500, random_state=0),
"svm_lin": SVR(kernel="linear"),
"svm_poly": SVR(kernel="poly", degree=2),
"random_forest": RandomForestRegressor(random_state=0, n_estimators=20),
"poly3": make_pipeline(PolynomialFeatures(degree=3), Ridge()),
"spline3": make_pipeline(SplineTransformer(degree=3), Ridge()),
"hist_gradient_boosting": HistGradientBoostingRegressor(),
}
HP_KWARGS_HGB = {
"histgradientboostingclassifier__learning_rate": [1e-3, 1e-2, 1e-1, 1],
"histgradientboostingclassifier__min_samples_leaf": [2, 10, 50, 100, 200],
}
HP_KWARGS_LR = {"logisticregression__C": [1e-3, 1e-2, 1e-1, 1]}
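# --- Illustrative usage sketch (not part of the original module) ---------------
# The HP_KWARGS_* dicts are hyper-parameter grids whose double-underscore keys
# assume the step names produced by make_pipeline. A minimal, assumed example of
# tuning the hist_gradient_boosting classifier could look like this:
if __name__ == "__main__":
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV

    # Toy data; the pipeline step is automatically named "histgradientboostingclassifier",
    # which is what the keys of HP_KWARGS_HGB expect.
    X, y = make_classification(n_samples=600, n_features=10, random_state=0)
    model = make_pipeline(HistGradientBoostingClassifier(random_state=0))
    search = GridSearchCV(model, param_grid=HP_KWARGS_HGB, cv=3, n_jobs=-1)
    search.fit(X, y)
    print(search.best_params_)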
|
// Bit Manipulation/03counting_no._of_set_bits/method1.cpp
//counting no. of set bits.
#include<bits/stdc++.h>
using namespace std;
int main(){
ios::sync_with_stdio(0);
cin.tie(0);
int n; // number
cin >> n ;
int cnt=0;
while (n)
{
if((n&1)!=0) cnt++;
n = n>>1;
}
cout<<cnt;
return 0;
}
|
Frederic Gehring
Frederic P. Gehring, C.M. (20 January 1903 – 26 April 1998) was an American Catholic priest who served as a military chaplain during the Guadalcanal Campaign of World War II. As well as serving as a parish priest, he was also for a time the National Chaplain for the Catholic War Veterans and the 1st Marine Division Association.
Life
Gehring was born in Brooklyn, New York, and as a young man was admitted to the Congregation of the Mission, more commonly called the Vincentian Fathers. He entered the Congregation at St. Vincent's Seminary in Germantown, Philadelphia, their house of formation. A few years after his ordination as a priest in 1930, he was assigned to the Congregation's missions in China, where he was made responsible for running orphanages for Chinese children. He served in China for six years, returning in 1939.
Gehring volunteered as a Navy chaplain on December 9, 1941, two days after the attack on Pearl Harbor. His role as padre during the Guadalcanal Campaign was depicted by actor Richard Carlyle in the movie The Gallant Hours. Gehring was the first US Naval chaplain decorated with the Presidential Legion of Merit for conspicuous gallantry. The citation read:
for exceptionally meritorious conduct in the performance of outstanding services... during the early months of the campaign against enemy Japanese forces on Guadalcanal... from September 26, 1942... Voluntarily making three hazardous expeditions through enemy-occupied territory, Chaplain Gehring, aided by native scouts, evacuated missionaries trapped on the island. In addition to his routine duties, he frequently visited the front lines and was a constant source of encouragement to the Marine and Army units under continual attack by the enemy. Brave under fire, cheerful in the face of discouragement, and tireless in his devotion to duty, Chaplain Gehring lifted the morale of our men to an exceptional degree. By his fine leadership and great courage he inspired all with whom he came in contact.
In addition to the Legion of Merit, Gehring was awarded the Navy and Marine Corps Medal, and the US Marine Corps Presidential Unit Citation.
Gehring's familiarity with the island led to his acting as driver for Admiral Halsey during the Admiral's visit to Guadalcanal. His skill with the violin helped to entertain the troops.
After World War II, Gehring continued working as a priest, returning to Germantown in 1963, where he served as pastor of St. Vincent's Church. He retired from that position five years before his death, to live with his sister in Orlando, Florida. His funeral was at St. Vincent's Seminary in Germantown, where in 1930 he had been ordained.
Patsy Li
While on Guadalcanal, Gehring discovered a six-year-old girl who had been beaten and bayoneted and was also suffering from malaria. Gehring nursed the child back to health and named her Patsy Li. She was adopted by a Singaporean woman who believed the child to be her own daughter of the same name, lost at sea. Later, the child was proved beyond doubt to be the woman's own daughter. In 1950, Gehring brought Patsy Li to the United States, where she became a nurse. He wrote her story in the 1962 book A Child of Miracles.
|
// herculesinc/nova.core - tests/test.ts
// IMPORTS
// =================================================================================================
import { Context, OperationConfig, OperationServices } from '@nova/core';
import * as nova from '../index';
import { MockCache } from './mocks';
// MODULE VARIABLES
// =================================================================================================
const opConfig: OperationConfig = {
id : 'testId',
name : 'testName',
origin : 'testOrigin',
actions : [testAction]
};
const opServices: OperationServices = {
cache : new MockCache()
};
async function testAction(this: Context, inputs: any) {
this.cache.get('test key');
this.defer(nova.actions.clearCache, ['test key1']);
this.defer(nova.actions.clearCache, ['test key2']);
return { inputs, processed: true };
}
// TESTS
// =================================================================================================
(async function runTest() {
const inputs = { test: 'testing' };
const operation = new nova.Operation(opConfig, opServices);
const result = await operation.execute(inputs);
console.log('-'.repeat(100));
console.log(JSON.stringify(result));
})();
|
Boomer = The Guy Who Drank Too Much
The faintest of belches resonating from down the hall are the first indication that a Boomer is near and he's about to blow. Use his heavy footsteps to identify him as quickly as possible and get out of his way, because when he spews (and he will), the results can be devastating.
The Tank = The Cockblock
She's the only thing standing between your friends and a group of seemingly interested females. Unfortunately she drove and she's tired, so there's no way you're getting past her by yourself. Enlisting the help of several willing and able wingmen is the only way to render her overpowering snark ineffective.
Smoker = The Guy Who Keeps Roping You Into Conversations
Maybe you had a class together freshman year, or share a mutual acquaintance, or simply high-fived tonight over anything pre-1999 and post-1991. Whatever the reason, the Smoker thinks every time you're in his field of vision is an invitation to reminisce about your shitty professor or what Jeremy's been up to or just how sick Legends of the Hidden Temple was.
Hunter = The Guy Who Wants To Fight
While some people see a party as a chance to get together with friends and have fun, the Hunter sees it as an MMA prizefight in a basement. Who the Hunter picks is completely random, so all you can do is stay alert, be prepared, and completely forget about having any sort of fun tonight.
The Jockey = The Friend Who Doesn't Know Anyone
You were adamant in explaining that he wouldn't know a single person there, but you were obligated to at least invite him, and unfortunately he said yes. Your only opportunity for a few fleeting moments of privacy is when you have to piss, so use them wisely. And don't think for a second that peeing outside will deter him from staying within drooling distance, because he has zero problems with watching you urinate.
The Spitter = The Guy Who Hits On Every Girl
Every female within shouting distance can expect a relentless amount of game spit in their direction. Want to identify a potential Spitter before he can strike? It's simple: look at his hat. If it's positioned anywhere on his head but the normal way, chances are you've got a live one. Loitering in his presence can only do bad things for your general well-being, so be sure to get out of his range as quickly as possible. Ironically, the only girls who seem to willingly fall for it are already dead inside.
Witch = The Roommate Who Didn't Want to Throw a Party
Her nursing exam is on Monday, but she won't let the fact that it's only Friday stop her from feeling inconvenienced by everyone's good time. She's in her room right now and just waiting for someone to bother her. Don't be that person. Just walk a safe distance around her and don't make too much noise, and if you're carrying a flashlight, turn it off immediately.
Though to be fair you probably shouldn't be carrying a flashlight.
|
// ValentinCamus/AlphaEngine
#include <iostream>
#include <Alpha/Core/CoreMinimal.h>
#include <Alpha/Engine/EngineMinimal.h>
#include <Alpha/Input/Input.h>
#include <Alpha/Application/Application.h>
#include <Sandbox/Sandbox.h>
int main(int argc, char *argv[])
{
ALPHA_UNUSED(argc);
ALPHA_UNUSED(argv);
Alpha::Core::Init();
Alpha::Input::Init();
Alpha::Engine::Init();
Alpha::Application application;
auto sandboxLayer = Alpha::NewPointer<Alpha::SandboxLayer>();
auto sandboxGuiLayer = Alpha::NewPointer<Alpha::GuiSandboxLayer>();
application.PushLayer(sandboxLayer);
application.PushLayer(sandboxGuiLayer);
Alpha::Logger::Info("AlphaEngine: Running...");
application.Run();
Alpha::Exit(EXIT_SUCCESS);
}
|
Searching for molecular markers in head and neck squamous cell carcinomas (HNSCC) by statistical and bioinformatic analysis of larynx-derived SAGE libraries Background Head and neck squamous cell carcinoma (HNSCC) is one of the most common malignancies in humans. The average 5-year survival rate is one of the lowest among aggressive cancers, showing no significant improvement in recent years. When detected early, HNSCC has a good prognosis, but most patients present metastatic disease at the time of diagnosis, which significantly reduces survival rate. Despite extensive research, no molecular markers are currently available for diagnostic or prognostic purposes. Methods Aiming to identify differentially-expressed genes involved in laryngeal squamous cell carcinoma (LSCC) development and progression, we generated individual Serial Analysis of Gene Expression (SAGE) libraries from a metastatic and non-metastatic larynx carcinoma, as well as from a normal larynx mucosa sample. Approximately 54,000 unique tags were sequenced in three libraries. Results Statistical data analysis identified a subset of 1,216 differentially expressed tags between tumor and normal libraries, and 894 differentially expressed tags between metastatic and non-metastatic carcinomas. Three genes displaying differential regulation, one down-regulated (KRT31) and two up-regulated (BST2, MFAP2), as well as one with a non-significant differential expression pattern (GNA15) in our SAGE data were selected for real-time polymerase chain reaction (PCR) in a set of HNSCC samples. Consistent with our statistical analysis, quantitative PCR confirmed the upregulation of BST2 and MFAP2 and the downregulation of KRT31 when samples of HNSCC were compared to tumor-free surgical margins. As expected, GNA15 presented a non-significant differential expression pattern when tumor samples were compared to normal tissues. Conclusion To the best of our knowledge, this is the first study reporting SAGE data in head and neck squamous cell tumors. Statistical analysis was effective in identifying differentially expressed genes reportedly involved in cancer development. The differential expression of a subset of genes was confirmed in additional larynx carcinoma samples and in carcinomas from a distinct head and neck subsite. This result suggests the existence of potential common biomarkers for prognosis and targeted-therapy development in this heterogeneous type of tumor.
Background Head and neck squamous cell carcinoma (HNSCC) is one of the most common malignancies in humans, affecting distinct head and neck topologies including oral cavity, oropharynx, hypopharynx, larynx and nasopharynx. HNSCC is associated with high alcohol and tobacco consumption, and represents a major international health problem with approximately 650,000 cases and 90,000 deaths per year worldwide. In Brazil, over 13,000 new cases are expected in 2008. Currently, advances in both surgical and nonsurgical therapeutics have led to increased local tumor control. However, overall mortality rates have not improved due to tumor recurrences in regional and distant sites of the aerodigestive tract. When detected early, HNSCC has a 75% 5-year survival rate, but most patients present metastatic disease at the time of diagnosis, which reduces survival rate to 35%. This 5-year survival rate is one of the lowest among aggressive cancers and has shown no significant improvement in recent years. Currently, there are very few molecular markers that can be used with accuracy and reliability as indicators of head and neck carcinomas with potential for metastatic progression, and therefore as indicators of a more aggressive tumor behavior. A pre-operative marker, for example, could significantly help in determining the most appropriate treatment for a particular patient. Moreover, changes in the gene expression profile arising exclusively or preferentially in cancer can be used as molecular markers. In fact, these markers may provide us with new means for the early detection of cancer and cancer risk assessment, as discussed by Hunter et al. for HNSCC. In order to investigate molecular markers that may be relevant for prognosis and therapy in cancer disease, largescale transcriptomic approaches such as SAGE and microarrays have been extensively reported in the literature. In the present study, we decided to use SAGE since this technique allows an unbiased global view of all the transcripts expressed in a tissue sample at a given time point. Despite its appropriateness for such studies, SAGE is an expensive and complex technique, thus commonly involving few and often rare biological samples. We generated individual SAGE libraries from metastatic (N+) and non-metastatic (N0) larynx carcinomas, and from normal mucosa samples. A database was created to provide absolute frequency tags for each gene in metastatic and non-metastatic tumors, and for the normal tissues. For the statistical analysis of differentially expressed tags, the Poisson distribution was used as the basic probabilistic model. The Cox partial likelihood combined with Dempster p-values allowed us to consider an efficient significance test to compare the Poisson means of the three groups. Also, the choice of critical level depended on the expression power of the tag been tested. The analysis of the data by our statistical approach revealed subsets of differentially expressed genes between tumor and normal tissues, and between metastatic and non-metastatic carcinomas. These differentially expressed genes deserve further consideration as potential biomarkers for metastatic progression, and therefore as indicators of a more aggressive tumor behavior. Sample preparation for SAGE and real time PCR experiments Samples were frozen in liquid nitrogen and stored at -80°C. Total RNA was extracted using TRIzol Reagent and treated with DNase (Invitrogen Corporation, Carlsbad, CA, USA). 
cDNA synthesis was performed using the High Capacity cDNA Archive kit (Applied Biosystems, Forster City, CA, USA) as described by the manufacturer. The study protocol was approved by the National Committee of Ethics in Research (CONEP 1763/05, 18/05/ 2005) and informed consent was obtained from all patients enrolled. SAGE SAGE was carried out using the I-SAGE™ Kit (Invitrogen Corporation, Carlsbad, CA, USA). Briefly, mRNA was captured from total RNA by binding to oligo (dT) magnetic beads, and reverse transcribed with SuperScript™ II reverse transcriptase and E. coli DNA polymerase. Bound cDNA was cleaved with Nla III (anchoring enzyme), divided in two fractions and ligated to adapters A and B, both containing a BsmF I restriction site followed by a CATG 3'overhang, with different primer anchoring sequences at the 5'end. Adapter linked cDNA from both fractions were cleaved with BsmF I (tagging enzyme) to generate adapter linked tags that were filled in by Klenow polymerase and then mixed and ligated to form adapter linked ditags. This mixture was used as template, in three 96-well 50 l PCR reactions using primers complementary to the adapters, and the ~100-bp products were PAGE purified. Adapters were eliminated by digestion with Nla III and PAGE purification of the 26 bp ditags that were ligated to form concatamers. Concatamers were submitted to polyacrylamide gel electrophoresis and regions ranging from 300-500 bp, 500-800 bp and 800-1000 bp were purified and ligated to pZero ® -1 cloning vector. Ligation reactions were used to transform One Shot ® TOP10 Eletrocomp™ E. coli cells using 0.2 cm cuvettes and a Gene Pulser II electroporator (Bio-Rad Laboratories, Hercules, CA, USA) set at 2.5 kV, 25 mF and 200. Cells were plated on low salt LB agar containing Zeocin ®, in plates compatible with the automated colony picker QPix2 (Genetix, New Milton, Hampshire, UK). Picked colonies were grown separately on 96well plates containing 2XYT media. An aliquot of each well was then used directly in a PCR reaction, with forward and reverse M13 primers. Amplified inserts were checked and sequenced with forward M13 primer in a MegaBACE™1000 sequencer (Amersham Biosciences, Piscataway, NJ, USA) and the DYEnamic ET Dye Terminator Sequencing Kit (Amersham Biosciences, Piscataway, NJ, USA), or alternatively, an ABI PRISM ® 377 DNA Sequencer (Applied Biosystems, Foster City, CA) and the ABI PRISM ® BigDye™ Primer Cycle Sequencing Kit (Applied Biosystems, Foster City, CA, USA). Three SAGE libraries were generated using two larynx cancer samples (one with lymph node metastasis or N+ and one with no lymph node metastasis or N0) and a normal control library pooled from two normal samples (surgical margins from one N+ and one N0 larynx cancer). For each library, 6,000 sequencing reactions were performed and tags were extracted to yield approximately 100,000 tags per library. The raw data files are available at the Gene Expression Omnibus database (GEO) under the accession numbers: GSM303325 (pool of normal samples); GSM303340 (N0 tumor), GSM303349 (N+ tumor). SAGE database Tag frequency tables, composed of a "tag" column (10 bp sequences) and a "count" column (number of times that the tag appears in the library) were obtained by the SAGE™ Analysis 2000 Software 4.0, with minimum tag count set to 1 and maximum di-tag length set to 28 bp, whereas other parameters were set on default. A relational MySQL database was developed to store data from SAGE experiments. 
The datasets contained information on: gene name, accession number, UniGene code, gene symbol, and absolute tag frequencies in metastatic and non-metastatic tumors and normal tissues. Other tables were generated to store information on metabolic pathways and gene ontology. Scripts developed in Perl and integrated with the MySQL database allowed the identification of genes and their respective frequencies in the three libraries, which were used as input data in the program that performed the statistical analysis. A schematic representation of the databases, data analysis, and experimental validation representing our approach is shown in Figure 1. Statistical Method Before starting the statistical analysis, we decided to exclude very low expression tags from the study. The inclusion criterion considered only tags with total normalized frequencies larger than 3. In order to obtain normalized frequencies, the absolute frequencies in each library were divided by the total number of tags of this library and multiplied by the total number of tags of the smallest library. For each tag, we observed the sum of its three normalized frequencies. If this sum was larger than 3, it was kept in the study; otherwise, it was excluded. The remaining tags, after this exclusion procedure, are the object of our study (Additional file 1, Supplementary Table 1). Returning to the absolute frequencies of the remaining tags, we observed that all frequencies were low in relation to the size of the libraries. In such rare cases, the Poisson distribution is the adequate statistical model for the analysis. In fact, the three absolute frequencies for each tag are considered independent Poisson distributed variables. The statistical objective at this point, for a specific tag, is to decide whether there are differences in expression among the libraries. We should perform statistical tests for every tag in the data bank. Comparing Poisson distributions is not an easy task. We then used the Partial Likelihood method as developed by Cox. Briefly, the procedure considers the three frequencies of a specific tag as forming an observation of a trinomial distribution, where the sample size is now the total tag abundance, S. Representing the unknown trinomial probabilities of a specific tag by (p1, p2, p3) and the total library sizes by (N1, N2, N3), homogeneity among the original three Poisson averages can be tested by testing, in the trinomial model, the hypothesis H0: pi = Ni/(N1 + N2 + N3), i = 1, 2, 3. Again, we have a difficult task to compute a p-value in a tri-dimensional sample space. Since we have distinct tag abundances, which can go from 4 to more than one thousand tags, we have to be very precise in defining the p-values. For this task, we decided to use the method developed by Dempster. The method consists of ordering the sample space by the likelihood ratios. To compute the p-value, the tail area was considered as the set of all points that have likelihood ratios smaller than those of the observed frequencies. Finally, as mentioned before, the tag abundances can be very different, and considering the same significance level would be inappropriate for the tags with low frequencies. Following the recommendations of DeGroot, we used the decision theory optimum procedure that minimizes the risk function aα + bβ. Here, α and β are the first and second kind of errors.
In our case, we decided to choose a = 4 and b = 1, since we believe that the first type of error (deciding in favor of differential expression when there is none) is more dangerous than the second type of error (deciding against differential expression when it is present). Using simulated samples, we found that the level of significance is a function of S, the tag abundance: α = 0.07S^(-1/2). A detailed description of the statistical method is presented in Varuzza and Pereira. Functional classification of differentially expressed genes and online gene expression analysis Gene ontology (GO) annotation was used for the functional classification of up- and down-regulated genes. This task was performed using terms from the Gene Ontology database. Additionally, we used the Oncomine database in order to search for a previous association of the differentially expressed genes found in this study with head and neck cancer. Real Time PCR Three genes displaying downregulation (KRT31) or upregulation (BST2, MFAP2) were selected for validation in additional tissues using real-time polymerase chain reaction. One gene (GNA15) that did not present differential expression was also selected for this validation. Their expression was checked in 26 larynx SCC samples (15 N0 and 11 N+) relative to matched normal samples and in 36 tongue SCC samples (18 N0 and 18 N+). The primers were manually designed using the following parameters: 19-23 bp length, 30-70% GC content, a short amplicon size (66-110 bp), and at least one primer of each pair flanking an intron-exon boundary to prevent genomic amplification (Table 1). All primers were purchased from Invitrogen (Brazil). Each sample was tested in triplicate. The PCR conditions were 50°C for 2 min, 95°C for 10 min, followed by 40 cycles at 95°C for 15 sec, 60°C for 1 min, and 65°C for 34 sec. Following the PCR, dissociation curve analysis was performed to confirm the desired single gene product. For each primer set, the efficiency of the PCR reaction (linear equation: y = slope·x + intercept) was measured in triplicate on serial dilutions of the same cDNA sample (a pool of 10 samples). The PCR efficiency (E) was calculated by the formula E = 10^(-1/slope) and ranged from 1.96 to 2.02 in the different assays. The slope and R2 values for target and reference genes are shown in Table 1. The relative expression ratio (fold change) of the target genes was calculated according to Pfaffl. Statistical analysis was performed by a two-tailed unpaired t test using GraphPad Prism software. Statistical analysis of SAGE data We constructed three SAGE libraries from two larynx carcinoma samples and a pooled control sample aiming to identify global events involved in tumorigenesis and potential biomarkers in HNSCC. Given the huge amount of data generated by SAGE, events that play a consistent role in the cancer phenotype may be undistinguished from those that are random events, leading to false positive and false negative results. Statistical analysis and bioinformatic tools are used to overcome these limitations and improve the identification of a gene expression signature of biological and therapeutic interest. In the present study, we propose a statistical approach to analyze SAGE data through the use of the Poisson probabilistic model and the conditional test of Cox partial likelihood. A Dempster methodology for ordering the sample points of the sample spaces throughout the likelihood ratio was also considered to compute the p-values. The sequences were stored in a MySQL relational database and analyzed as shown in Figure 1.
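For illustration only (this is not the authors' implementation; they refer to Varuzza and Pereira for details), the conditional test described above can be sketched in a few lines of Python. The three library sizes and the per-tag counts are assumed known; the exact p-value is computed by ordering all trinomial outcomes with the same total S by their likelihood ratio, and the tag-specific critical level follows the α = 0.07S^(-1/2) rule:
from math import lgamma, log, exp, sqrt
def log_trinomial(counts, probs):
    # log P(counts | probs) for a multinomial with n = sum(counts)
    n = sum(counts)
    out = lgamma(n + 1)
    for c, p in zip(counts, probs):
        out -= lgamma(c + 1)
        if c > 0:
            out += c * log(p)
    return out
def conditional_pvalue(x, library_sizes):
    # x = (x1, x2, x3): tag counts; H0: probabilities proportional to library sizes
    S = sum(x)
    p0 = [n / sum(library_sizes) for n in library_sizes]
    def log_lr(c):
        mle = [ci / S for ci in c]                  # unrestricted MLE of the trinomial probabilities
        return log_trinomial(c, p0) - log_trinomial(c, mle)
    cutoff = log_lr(x)
    pval = 0.0
    for a in range(S + 1):                          # enumerate the trinomial sample space
        for b in range(S - a + 1):
            c = (a, b, S - a - b)
            if log_lr(c) <= cutoff + 1e-12:         # Dempster-style ordering by likelihood ratio
                pval += exp(log_trinomial(c, p0))
    return pval, 0.07 / sqrt(S)                     # p-value and tag-specific critical level
# e.g. pval, alpha = conditional_pvalue([12, 3, 1], [96000, 102000, 99000]); the tag is
# called differentially expressed when pval < alpha.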
Statistical analysis identified subsets of 1,216 differentially expressed tags between tumor and normal libraries, and 894 differentially expressed tags between metastatic and non-metastatic carcinomas. Sixty top-up and 60 topdownregulated tags in aggressive versus non-aggressive tumors and in both these tumors versus normal tissues, as well as their normalized frequencies, and the corresponding genes according to SAGE Genie and SAGEmap databases are presented in Supplementary Tables 6-11 (Additional file 2). Since several authors have reported that chi-square test is the most appropriate for SAGE experiments, we compared the performance of our statistical approach (named here as Kemp method) with that of chi-square test. For this comparison, the SAGE data set was divided into two groups: the low-abundance tags with counts equal and lower than 50, and the high abundance tags expressed at higher levels (> 50). Good correspondence between the data obtained by both tests was found for the latter tag group (Figure 2), indicating that they are equivalent for the analysis of highly expressed sequences. A similar result was not observed for low-abundance tags (Figure 3). Using a proposed tag-customized critical level for both tests, we found 341 discordant tags, which represent 4.8% of total differentially expressed tags: 100 (29.3%) were considered differentially expressed by chi-square test but not by Kemp method, and 241 (70.7%) by Kemp but not by chi-square test. Most discordant cases were low-abundance tags (Additional file 3). A tag presenting a differential expression pattern but low counts may be considered as statistically non-significant by methods that use fixed critical levels. Although a number of these tags probably have biological relevance, their selection from the SAGE data sets remains a challenge. To circumvent this limitation, Kemp's method calculates the critical level of each particular tag taking into account its total frequency, thus making the method applicable for detecting differences in expression of tags with counts ranging from 20 to 50. In addition, the use of a tag-customized critical level minimizes both type I and type II errors. Conversely, most of the statistical tests currently used to detect differentially expressed genes are based on asymptotic results, and perform poorly for low expression tags. Another feature of these tests is the common use of a single canonical cutoff for the significance level (p-value) of all tags, without taking into account the type II error. Differentially expressed genes: biological functions and potential involvement in HNSCC Information on biological processes was obtained from the Gene Ontology (GO) database for the top upand down-regulated genes identified by the statistical approach (Tables 2 and 3). The data may be helpful for evaluating their potential as drug targets and molecular markers of cancer. Although some GO terms are not directly related to tumorigenesis, as lipid metabolism process and viral genome replication, they provide evidence of some important changes in cell metabolism coupled to energy generation and cell growth. Potential molecular markers identified by SAGE: Validation by Real-Time PCR The selection of genes for validation by real-time RT-PCR was carried out after an extensive literature analysis of gene expression studies of head and neck carcinomas [3,4,. 
The following criteria were used for gene selection: (i) potential involvement in cancer development or aggressiveness and a yet unclear role in HNSCC tumorigenesis, and (ii) a similar expression pattern in data reported in the literature as well as in our SAGE experiments. Using these criteria as guidelines, four genes were selected: two with a pronounced overexpression in SAGE tumor libraries (BST2 and MFAP2), one with an intermediate downregulation profile (KRT31, also referred to as KRTHA1) and one with a non-significant differential expression pattern (GNA15). According to the statistical analysis performed, BST2 and MFAP2 tags were expressed at high levels in tumors compared to normal tissues (at least 13-fold or higher), the latter also exhibiting a remarkable overexpression in N+ samples in relation to N0 samples. The normalized frequencies of BST2 tags showed N+ tumor/normal and N0 tumor/normal ratios of 15.8 and 24.3, respectively. For MFAP2, N+ tumor/N0 tumor and N+ tumor/normal ratios were 25.3 and 13.5, respectively. In contrast to these genes, GNA15 showed no differences in gene expression between samples analyzed by SAGE and was selected as a negative control. Although classified as a relevant underexpressed candidate marker in tumors by the statistical analysis of SAGE data, KRT31 displayed less expressive differences between N+ or N0 tumors and normal tissues. The normalized frequencies of tags are shown in Supplementary Tables 6-11 (Additional file 2). Similar expression patterns of BST2, MFAP2, KRT31 and GNA15 tags were observed by using a chi-square test. The expression data for the selected genes were validated in 15 pairs of tumor and matched normal tissues from N0 LSCC and 11 pairs from N+ LSCC. The data were also validated in another head and neck subsite by using 36 pairs of tumor and matched normal tissues from tongue squamous cell carcinomas (18 N+ and 18 N0). MFAP2 was upregulated (≥ 2-fold) and KRT31 was downregulated (≥ 2-fold) in both N+ and N0 laryngeal tumors versus normal samples, the former also in tongue tumors. The BST2 gene was also upregulated, but only in N0 tumors versus normal tissues. No difference between N+ and N0 carcinomas was detected for these genes, except for MFAP2 in tongue samples. According to SAGE expression profiles, GNA15 exhibited a non-significant differential expression pattern in carcinomas versus normal tissues, except between N+ and N0 tumors (Figure 4). [Figure 4 legend: relative quantitation of target gene expression for each sample was calculated according to Pfaffl, with GAPDH as the internal reference and the normal sample as the calibrator; values were log2 transformed so that values below -1 indicate down-regulation and values above 1 indicate up-regulation in tumor compared to normal samples; differences between groups (N0 and N+) were calculated by unpaired t test using GraphPad Prism software and considered statistically significant at P < 0.05 (*); error bars represent the mean ± S.E.M. (standard error of the mean).] The results of the real time PCR experiments were, therefore, in agreement with SAGE data. However, as the PCR experiments were performed using a larger number of cases than SAGE, we observed high variability of gene expression among the samples. The selected genes present intriguing functions related to normal and neoplastic development. The KRT31 gene, for example, encodes a type I hair keratin, which is specifically expressed in hair and nails but has been previously observed in normal keratinocytes from buccal mucosa. In cancer, loss of differentiation-specific hair keratins was found in late-stage pilomatrixoma, a skin tumor of follicular origin. Since keratin 31 has been detected in normal oral mucosa, a similar change in its expression pattern may occur in mucosa-derived squamous carcinomas. The BST2 gene encodes the bone marrow stromal cell antigen 2, a transmembrane glycoprotein potentially involved in interactions between cancer cells and bone marrow stromal cells and related to angiogenesis, cell proliferation and chromosomal instability. The BST2 promoter region contains putative cis elements for GATA1, STAT 3 and 1 transcription factors, the latter overexpressed in HNSCCs. BST2 up-regulation has been observed in multiple myeloma, non-Down syndrome (DS) acute megakaryocytic leukemia and tamoxifen-resistant breast cancer and, in the present study, was upregulated in HNSCC samples. Stromal cells prevent chemotherapy-induced apoptosis of leukemia cells. The findings of Ge et al. suggest that BST2 could potentially participate in the leukemia-cell protection from ara-C-induced cytotoxicity mediated by bone marrow stromal cells. These data and the findings of Becker et al. on BST2 overexpression in tamoxifen-resistant breast cancer indicate that BST2 may possibly represent a new therapeutic target for leukemia as well as for other types of cancer, including HNSCC. The MFAP2 or MAGP-1 gene encodes the microfibrillar-associated protein 2, a small molecular weight component of extracellular microfibrils, which are structural elements of elastic tissues in the lungs, skin, and vasculature. Miyamoto et al. showed that MAGP-1 protein
The BST2 gene encodes the bone marrow stromal cell antigen 2, a transmembrane glycoprotein potentially involved in interactions between cancer cells and bone marrow stromal cells and related to angiogenesis, cell proliferation and chromosomal instability. The BST2 promoter region contains putative cis elements for GATA1, STAT 3 and 1 transcription factors, the latter overexpressed in HNSCCs. BST2 up-regulation has been observed in multiple myeloma, non-Down syndrome (DS) acute megakaryocytic leukemia, tamoxifen-resistant breast cancer and, in the present study, was upregulated in HNSCC samples. Stromal cells prevent chemotherapy-induced apoptosis of leukemia cells. The findings of Ge et al. suggest that BST2 could potentially participate in the leukemia-cell protection from ara-C-induced cytotoxicity mediated by bone marrow stromal cells. These data and the findings of Becker et al. on BST2 overexpression in tamoxifenresistant breast cancer indicate that BST2 may possibly represent a new therapeutic target for leukemia as well as for other types of cancer, including HNSCC. The MFAP2 or MAGP-1 gene encodes the microfibrillarassociated protein 2, a small molecular weight component of extracellular microfibrils, which are structural elements of elastic tissues in the lungs, skin, and vasculature. Miyamoto et al. showed that MAGP-1 protein Chi-square p-value versus Kemp value for low-abundance tags Relative quantitation of target gene expression for each sample was calculated according to Pfaffl ; GAPDH was used as the internal reference and normal sample as the calibrator. Values were Log2 transformed (y-axis) so that all values below -1 indicate down-regulation in gene expression while values above 1 represent up-regulation in tumor samples compared to normal samples. Differences in gene expression between groups (N0 and N+) were calculated by unpaired t test using GraphPad prism software and were considered statistically significant at P < 0.05 (*). The error bar represents the mean ± S.E.M (standard error of the mean). can bind to the Notch1 receptor, leading to a subsequent signaling cascade. In self-renewing tissues and during tumorigenesis, Notch signaling may inhibit differentiation, lineage specification at developmental branch points and induction of differentiation. For example, Notch signaling regulates binary cell fate decisions in the development of the peripheral nervous system in flies. Equipotent precursors give rise to two alternative cell fates: epidermal or neuronal, depending on whether a progenitor cell receives a strong or weak Notch signal. In the skin, Notch induces terminal differentiation of keratinocytes. Therefore, the Notch pathway may lead to different and sometimes opposing outcomes. One explanation is that Notch function is context-dependent. Abnormal Notch activation has been observed in different tumors although growth suppression has also been noticed after constitutively over-expressed active Notch1. Thus, Notch signaling can function as both an oncogene and a tumor suppressor, even within a single tumor, supporting the idea that the Notch1 pathway is cell-type specific and context-dependent. Overall, the results of the real time PCR experiments showed consistent patterns in HNSCC patients and were in agreement with SAGE analysis. 
However, little is known about changes at the protein level, and the relationship between gene expression and tumor phenotype, as well as the potential value of these genes as biomarkers for HNSCC tumorigenesis, should be evaluated in future studies. Conclusion To the best of our knowledge, this is the first study reporting SAGE data in head and neck squamous cell tumors. The analysis of SAGE data by our statistical approach was effective in identifying differentially expressed genes reportedly involved in cancer development. In agreement with our statistical analysis, three genes (BST2, MFAP2 and KRT31) selected for validation experiments were differentially expressed in an independent subset of HNSCCs compared to normal tissues or in metastatic versus non-metastatic samples. The selected genes have not been previously implicated in head and neck tumorigenesis. In addition, our data suggest a role for Notch signaling in HNSCC tumorigenesis, together with factors involved in keratinocyte differentiation, keratinization and epidermis development. The confirmation of the differential expression of this subset of genes selected from LSCC SAGE libraries in other HNSCC sites reinforces the existence of potential common biomarkers for prognosis and targeted therapy of such tumors.
|
// taydy/go-leetcode
package medium
// https://leetcode-cn.com/problems/longest-palindromic-substring/
// Given a string s, find the longest palindromic substring in s. You may assume the maximum length of s is 1000.
//
// Example 1:
// Input: "babad"
// Output: "bab"
// Note: "aba" is also a valid answer.
//
// Example 2:
// Input: "cbbd"
// Output: "bb"
//
//
// Approach:
// Expand-around-center algorithm:
// The two sides of a palindrome mirror each other around its center, so a palindrome can be expanded
// outwards from its center, and there are only 2n-1 such centers (n single-character centers and
// n-1 two-character centers).
//
func longestPalindrome(s string) string {
if len(s) == 0 {
return ""
}
chars := []rune(s)
var start, end int
for i := 0; i < len(s); i++ {
length := expandAroundCenter(chars, i, i)
lengthB := expandAroundCenter(chars, i, i+1)
if length < lengthB {
length = lengthB
}
if length > end-start {
start = i - (length-1)>>1
end = i + length>>1
}
}
return string(chars[start : end+1])
}
func expandAroundCenter(chars []rune, left, right int) int {
length := len(chars)
for left >= 0 && right < length && chars[left] == chars[right] {
left--
right++
}
return right - left - 1
}
|
1. Field of the Invention
The present invention relates to a magnetic recording medium having a flexible non-magnetic film base and a magnetic layer thereon comprising magnetizable particles dispersed in a resinous binder in combination with a fluorinated silane ester as a lubricant.
2. DESCRIPTION OF THE PRIOR ART
A magnetic recording medium whether used for audio recording, video recording, or other magnetic recording purposes comes in contact with tape guide members, magnetic heads and the like during use. In the case of a video tape recorder, where high tape velocities are encountered, the tape must have a sufficient wear resistance and a relatively small friction coefficient if it is to run smoothly and steadily for a long time. Magnetic recording tape which has an increased friction coefficient vibrates at the tape guide members and at the magnetic heads during the recording operation or the reproducing operation, so that the recorded signals or the reproduced signals are distorted from the originals. In some cases, a so-called "Q" sound due to vibration of the magnetic recording tape is encountered.
Efforts have been made to overcome the above-described defects and to impart lubricity or smoothness to the magnetic recording tape, but no completely satisfactory lubricant for magnetic recording tapes has yet been obtained. For example, it has been suggested to use lubricants such as a silicone fluid, castor oil, molybdenum disulfide, graphite, higher fatty acids or the like, the lubricant being mixed into a magnetic layer containing a magnetic powder such as gamma ferric oxide and a binder such as polyvinyl chloride. Magnetic recording tapes containing such lubricants exhibit some wear resistance, but not to a sufficient degree. When a large quantity of the lubricant is mixed into the magnetic layer in order to further increase the wear resistance, a so-called "blooming" occurs on the magnetic layer. The blooming results from the lubricating agent exuding on the surface of the magnetic layer and becoming separated therefrom. As a result, the surface of the magnetic recording tape gets rough, and more powder comes off from the magnetic recording layer.
|
// bobsomers/flexrender
#include "types/image.hpp"
#include <limits>
#include "OpenEXR/ImfOutputFile.h"
#include "OpenEXR/ImfChannelList.h"
using std::numeric_limits;
using std::string;
namespace fr {
Image::Image(int16_t width, int16_t height) :
_width(width),
_height(height),
_buffers() {
AddBuffer("R");
AddBuffer("G");
AddBuffer("B");
}
Image::Image() :
_buffers() {
_width = numeric_limits<int16_t>::min();
_height = numeric_limits<int16_t>::min();
}
void Image::AddBuffer(const string& name) {
_buffers[name] = Buffer(_width, _height, 0.0f);
}
void Image::Merge(const Image* other) {
for (const auto& kv_pair : other->_buffers) {
_buffers[kv_pair.first].Merge(kv_pair.second);
}
}
void Image::ToEXRFile(const string& filename) const {
// Create the header and channel list.
Imf::Header header(_width, _height);
for (const auto& kv_pair : _buffers) {
const auto& name = kv_pair.first;
header.channels().insert(name.c_str(), Imf::Channel(Imf::FLOAT));
}
// Create the output file and frame buffer.
Imf::OutputFile file(filename.c_str(), header);
Imf::FrameBuffer frame;
// Set up the memory layout.
for (const auto& kv_pair : _buffers) {
const auto& name = kv_pair.first;
const auto& buffer = kv_pair.second;
frame.insert(name.c_str(), Imf::Slice(Imf::FLOAT,
const_cast<char*>(reinterpret_cast<const char*>(&(buffer._data[0]))),
sizeof(float), _width * sizeof(float)));
}
// Write it out.
file.setFrameBuffer(frame);
file.writePixels(_height);
}
} // namespace fr
|
#------------------------------------------------------------------------------
# Libraries
#------------------------------------------------------------------------------
# Standard
import pandas as pd
from .sanity_check import check_type
#------------------------------------------------------------------------------
# Main
#------------------------------------------------------------------------------
def convert_dict_to_df(x):
""" Converts a dictionary to a pandas.DataFrame """
check_type(x=x,allowed=dict)
try:
x = pd.DataFrame().from_dict(data=x,orient="index")
except ValueError:
# This happens because we have already named the index
keys, dfs = list(x.keys()), list(x.values())
# Concat all dataframes
x = pd.concat(objs=dfs, axis=0, ignore_index=True, sort=False)
# Overwrite index
x.index = keys
return x
def convert_df_to_dict(x):
""" Converts a pandas.DataFrame to a dict"""
check_type(x=x,allowed=pd.DataFrame)
x = x.to_dict(orient="index")
x = {k: pd.DataFrame().from_dict(data=v, orient="index").T.rename(index={0:k}) for k,v in x.items()}
return x
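# Illustrative usage sketch (not part of the original module; shown as comments because the
# relative import above assumes a package context). A plain index-keyed dict round-trips
# through the two helpers; each value returned by convert_df_to_dict is a one-row DataFrame
# whose index is the original row label:
#
#   d = {"row1": {"a": 1, "b": 2}, "row2": {"a": 3, "b": 4}}
#   frame = convert_dict_to_df(d)     # 2x2 DataFrame indexed by "row1"/"row2"
#   per_row = convert_df_to_dict(frame)
#   list(per_row.keys())              # ["row1", "row2"]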
|
After years of hoping and scant updates about the possibility of another sequel, Princess Diaries fans may finally have something tangible to anticipate. During Thursday’s episode of Watch What Happens Live with Andy Cohen, Anne Hathaway confirmed that the Princess Diaries team wants to make a third movie, but they’re being cautious about how they approach it.
It’s been 15 years since Princess Diaries 2: Royal Engagement, and unfortunately the cast and crew have had to say goodbye to their director, Garry Marshall, who died in 2016. Hathaway told the crowd on Watch What Happens Live that it’s important to her, Andrews and Martin Chase to get things exactly right if they move forward with a third film in the series.
In 2017, Andrews told BuzzFeed that she was “all for” a third film, noting, “I think we might do it in honor of [Marshall]. Annie had an idea that she wanted to pursue about it, and I’m all for it, so if she’d like to….” At the time, the outlet noted that prior to his death, Marshall revealed there were plans for a third film, but after he passed away, plans stalled.
In the era of reboots and reunions, a third Princess Diaries movie would surely be a hit, especially since its fans are still so into the films more than a decade after their release.
|
// MSalamatov/mongoose - base/src/main/java/com/emc/mongoose/base/item/ItemType.java
package com.emc.mongoose.base.item;
/** Created by kurila on 28.03.16. */
public enum ItemType {
DATA,
PATH,
TOKEN;
@SuppressWarnings("unchecked")
public static <I extends Item, F extends ItemFactory<I>> F getItemFactory(
final ItemType itemType) {
if (ItemType.DATA.equals(itemType)) {
return (F) new DataItemFactoryImpl<DataItemImpl>();
} else if (ItemType.PATH.equals(itemType)) {
return (F) new PathItemFactoryImpl<PathItemImpl>();
} else if (ItemType.TOKEN.equals(itemType)) {
return (F) new TokenItemFactoryImpl<TokenItemImpl>();
} else {
throw new AssertionError("Item type \"" + itemType + "\" is not supported");
}
}
}
|
Primary balloon dilatation in coarctation of the aorta complicated by severe aortic insufficiency. The combination of coarctation of the aorta in the presence of severe aortic insufficiency poses a serious clinical problem. Although successful single- and two-stage repair for combined coarctation in the presence of severe aortic regurgitation has been described, the surgical management of this lesion remains particularly difficult. The analysis of larger series of patients operated upon for coarctation reveals significant early mortality rate in patients with associated severe aortic insufficiency. Although the exact cause of the acute left ventricular failure remains unclear and is a matter of debate, one can assume that changes in the haemodynamics, resulting in global myocardial ischaemia from impaired coronary blood supply or a massive volume overload of the left ventricle after the correction of the coarctation, could have led to myocardial irritability and left ventricular failure. We present a three-stage repair with subtotal relief of the coarctation by balloon angioplasty and stenting first; elective aortic valve replacement in a second stage and finally total balloon dilatation of the residual stenosis at the previously subtotal dilated coarcted segment.
|
Characterization of cystic fibrosis factor and its interaction with human immunoglobulin. Cystic fibrosis factor activity (CFFA), assayed as the ability to stop oyster ciliary movement, was present in serum-free medium from actively growing cystic fibrosis skin fibroblast cultures. CFFA was associated with a low molecular weight, negatively charged molecule that contained no uronic acid and was heat and pH labile. When CFFA-positive media were mixed with human IgG1, the CFFA was chromatographically displaced and emerged with the IgG1 fraction on column chromatography. Experiments in which various immunoglobulins were added to CFFA-positive culture media and then incubated with specific anti-immunoglobulins suggested that CFFA binding was class specific for human IgG, subclass specific for IgG1 and IgG2, and occurred with intact unaggregated heavy chains but not with κ- and λ-light chains, or Fab, Fc, and F(ab')2 fragments. The serum protein β2-microglobulin, which has structural homology to IgG, also bound CFFA. The purpose of this study is to characterize CFFA from skin fibroblast cultures grown in the absence of serum and to examine its association with IgG.
Materials and Methods Skin fibroblast lines were established from 18 patients having serum CFFA by the oyster cilia test and 12 normal subjects in whom no activity could be detected. After 1 mo in culture (two to four subcultures by trypsinization) cells were stained with toluidine blue O and Alcian blue. Cell lines were trypsinized into suspension and approximately 10 s cells were inoculated into 75-cm2 Falcon flasks (Falcon Plastics, Division of BioQuest, Oxnard, Calif.) containing Waymouth's "special" medium with 10% by volume of fetal calf serum. After the cells had formed an adherent nonconfluent monolayer (usually 12 h), the cultures were washed three times with warmed balanced salt solution, and Waymouth's "special" medium without serum was added (pH 7.5-7.8). Cell growth was monitored by total cell counts and DNA synthesis using tritiated thymidine. An aliquot of medium was removed from cultures 72 h after a medium change and assayed for CFFA. This medium will be referred to as "used medium." The medium was allowed to equilibrate in plastic Petri dishes at 25°C to a pH of 8.0 and tested for CFFA within 1 h. Ciliary action was observed under an inversion microscope by the method of Bowman et al. using gills from oysters Crassostrea virginica under the following conditions: (a) Oysters were kept in an aquarium for 1 mo to ensure "clean gills." (b) The gill segments were kept in unused medium at 4°C for 72 h before use to allow the crypts to release debris. (c) Care was taken to eliminate microscopic air bubbles as undissolved air appeared to prevent the inhibition of cilia. Anion exchange columns of Dowex 1-2X were prepared in the chloride form and equilibrated with 0.15 M NaCl. 25 ml of used culture medium was precipitated by 1% cetylpyridinium chloride (CPC) and the supernatant containing the small molecular weight, negatively charged substances applied to the column. The eluate of the supernatant will be referred to as the initial volume. After washing with 50 ml of 0.15 M NaCl, elution was performed with increasing concentrations of NaCl (0.4, 0.8, 1.3, 1.5, 1.8, and 2.0 M). Each fraction was precipitated with ethanol at 4°C, redissolved in 1 ml of unused medium, and assayed for protein, uronic acid, and CFFA. To determine the influence of human IgG1 on the elution pattern of the CFFA, 3.0 mg of human IgG1 was added to 25 ml of used CFFA-positive medium and to normal medium after CPC precipitation and immediately before chromatography. The interaction between CFFA and different immunoglobulins was studied in the following way. 50 µg of isolated monoclonal human immunoglobulins (IgG, IgA, IgM, IgD) followed by 100 µl of rabbit antihuman immunoglobulin of corresponding specificity were added to 1.0-ml samples of used Waymouth's "special" medium from normal or cystic fibrosis metachromatic cultures. Titration experiments indicated that the rabbit antiglobulin was in antibody excess with respect to the 50 µg of immunoglobulin. Controls included used and unused media, isolated human immunoglobulin, and rabbit antiglobulin in unused media. After incubation at 4°C for 24 h, the samples were centrifuged for 30 min at 1,600 g and the supernatant was tested for CFFA. To study the site of interaction of CFFA with the IgG molecule, light and heavy chains of IgG1, papain Fab and Fc fragments, and a pepsin F(ab')2 fragment of an IgG1 monoclonal protein were prepared.
RESULTS A number of variables influenced the ciliary inhibition by used media from metachromatic cystic fibrosis cultures. Ciliary inhibition could be demonstrated only if the media had been in contact with a growing cell population for at least 48 h and only if plastic rather than glass flasks were used to grow the cells in serum-free medium. CFFA was lost when the used medium was stored at room temperature over a 24-48 h period or at 4°C over an 11 day period (however, activity was retained at -20°C for at least 11 days), when the medium was heated to 100°C for 1 min or to 56°C for 30 min, and when the medium was exposed to 5% CO2 with a consequent shift of the pH to the acid range (6.8-7.2) and equilibrated to pH 8.0 for testing. Media with CFFA could be diluted 1:4 before a decrease in activity could be demonstrated. Addition of IgG (50 µg/ml) had no effect on the stability of CFFA. When used culture media were assayed for ciliary inhibition using the conditions outlined, monitoring of media for CFFA gave reproducible results in 45/46 double blind experiments. After dialysis for 24 h CFFA was present on both sides of the dialysis membrane, which retained molecules larger than 5,000 daltons. CFFA was detected in the media of metachromatic cystic fibrosis fibroblast cultures grown in the presence and in the absence of fetal calf serum (Table I). Anion exchange chromatography provided evidence that the cystic fibrosis factor was bound to IgG. CFFA found in serum-free media eluted only with the 1.5 M NaCl fraction (Fig. 1). [Fig. 1 legend: elution diagrams from Dowex 1-2X chloride columns (0.9 x 44 cm) of CPC supernatants of 25 ml of used medium from cystic fibrosis (CF) skin fibroblast cultures with cystic fibrosis factor activity (CFFA) and normal (NL) cultures, with (3 mg/ml) and without IgG1 added to the CPC supernatant; elution was performed with increasing NaCl concentrations, and ethanol precipitates of each fraction were assayed for protein content, uronic acid content, and oyster ciliary inhibition (CFFA).] This fraction had no significant content of uronic acid or protein and no detectable IgG was present by double immunodiffusion in agar. When IgG was added to the CFFA-positive medium after CPC precipitation, CFFA disappeared from the 1.5 M NaCl eluate and appeared in the initial volume with the IgG1. CFFA was lost when used media from cystic fibrosis cultures were mixed with IgG subclasses 1 and 2, or their isolated heavy chains, and specific antisera (Table II). Human immunoglobulins and precipitating antisera removed CFFA when a minimum of 8-12 µg of immunoglobulin was used. IgG1 and IgG2 subclasses with antisera were equally effective on a quantitative basis in influencing CFFA. [Table II legend: the experiments involved addition of 100 µl of the appropriate class-specific antisera and 50 µg of various immunoglobulins to 1 ml of culture medium having CFFA; in control experiments the addition of the immunoglobulin or the antisera separately did not alter CFFA; results are shown as presence (+) or absence (-) of CFFA.] Heat-aggregated IgG1 without antisera had no effect on CFFA. IgG subclasses 3 and 4, immunoglobulin classes IgM, IgA, IgD, kappa and lambda light chains, and fragments Fc, Fab, F(ab')2, and their specific antisera had no influence on CFFA (Table II). Of the three other human serum proteins tested (Table III), only β2-microglobulin in the presence of its antiserum removed CFFA.
DISCUSSION Earlier observations that CFFA in serum and used culture media containing serum could be found in the same chromatographic fraction as IgG suggested that CFFA might be either an immunoglobulin, a molecule with properties similar to immunoglobulin, or a molecule bound to immunoglobulin. Bowman et al. had previously suggested that the substance with CFFA was not an immunoglobulin, as the ciliary inhibitor associated with cultured fibroblasts grown in serum-free medium for short culture periods did not demonstrate any immunological reaction with antisera specific for IgG. Beratis et al. have reported a ciliary dyskinesis factor (mol wt 1-10,000) in the used culture medium from both homozygotes and heterozygotes, which when incubated with IgG became active in the rabbit tracheal cilia assay. In the present study skin fibroblasts were grown in serum-free medium to avoid contamination with fetal calf immunoglobulins. Immunologic analysis failed to reveal any immunoglobulins in used culture medium with CFFA, indicating that immunoglobulins were not synthesized in detectable amounts by the cultured fibroblast. It was concluded from these experiments that CFFA was not an immunoglobulin, and that binding to IgG was not required for CFFA. The following properties were noted for CFFA. CFFA was associated with a negatively charged substance that had a molecular weight of less than 5,000 daltons as determined by dialysis. CFFA was bound to IgG1 and IgG2 as shown in immunologic studies (Table II) and by modification of its elution pattern on chromatography with IgG1 (Fig. 1). The association between human immunoglobulins and cystic fibrosis factor does not represent an antigen-antibody reaction since the latter did not bind to immunoglobulin fragments containing the antibody binding sites. The interaction of immunoglobulin and cystic fibrosis factor occurs in the constant region of IgG1 and IgG2 heavy chains that is lacking in papain or pepsin fragments. The absence of disulfide bonds does not interfere with binding. The binding is IgG class and subclass specific. Since β2-microglobulin, a structural homologue of immunoglobulin G, is synthesized by human skin fibroblasts grown in medium containing fetal calf serum, the possibility that this protein is also synthesized in serum-free cultures cannot be excluded.
SUMMARY Cystic fibrosis factor activity (CFFA), assayed as the ability to stop oyster ciliary movement, was present in serum-free medium from actively growing cystic fibrosis skin fibroblast cultures. CFFA was associated with a low molecular weight, negatively charged molecule that contained no uronic acid and was heat and pH labile. When CFFA-positive media were mixed with human IgG1, the CFFA was chromatographically displaced and emerged with the IgG1 fraction on column chromatography. Experiments in which various immunoglobulins were added to CFFA-positive culture media and then incubated with specific anti-immunoglobulins suggested that CFFA binding was class specific for human IgG, subclass specific for IgG1 and IgG2, and occurred with intact unaggregated heavy chains but not with κ- and λ-light chains, or Fab, Fc, and F(ab')2 fragments. The serum protein β2-microglobulin, which has structural homology to IgG, also bound CFFA.
|
package cli
import (
"github.com/cosmos/cosmos-sdk/codec"
"github.com/spf13/cobra"
)
func GetQueryCmd(cdc *codec.Codec) *cobra.Command {
return QueryDepositsCmd(cdc)
}
|
The effect of physical training in children with asthma on pulmonary function, aerobic capacity and health-related quality of life: a systematic review of randomized control trials. OBJECTIVE Asthma is a leading cause of chronic illness in children, impacting heavily on their daily life and participation in physical activity. The purpose of this systematic review was to investigate the evidence for the use of physical therapy to improve pulmonary function and aerobic capacity in children with asthma. Furthermore, the review aims to update previous literature on the effect of exercise on health related quality of life. METHODS A search was conducted for randomized control trials (RCTs) using the electronic databases Medline, Embase, SPORTDiscus, AMED, CINAHL, and The Cochrane Central Register of Controlled Trials. Studies were included if the participants were asthmatic children aged 6-18 years participating in any mode of physical exercise. Studies were reviewed for study quality, participant details, exercise intervention details, and intervention outcomes. RESULTS A total of 16 studies and 516 subjects met inclusion criteria for review. Severity of asthma ranged from mild to severe. No improvement in pulmonary function was observed. Physical training led to an increase in aerobic capacity as measured by VO(2max) (mL/kg/min). CONCLUSIONS Findings suggest that physical training does not improve pulmonary function in children with asthma, but does increase aerobic capacity. The small number of studies investigating quality of life suggests that physical training does improve health related quality of life; however further well designed randomized control trials are needed to verify these findings.
|
#[cfg(not(debug_assertions))]
#[cfg(test)]
mod tests {
use crate::persisted_dht::load_dht;
use crate::{NetworkConfig, Service};
use beacon_chain::builder::BeaconChainBuilder;
use beacon_chain::slot_clock::TestingSlotClock;
use eth2_libp2p::Enr;
use futures::{Future, IntoFuture};
use genesis::{generate_deterministic_keypairs, interop_genesis_state};
use slog::Logger;
use sloggers::{null::NullLoggerBuilder, Build};
use std::str::FromStr;
use std::sync::Arc;
use store::{migrate::NullMigrator, SimpleDiskStore};
use tempdir::TempDir;
use tokio::runtime::Runtime;
use types::{EthSpec, MinimalEthSpec};
fn get_logger() -> Logger {
let builder = NullLoggerBuilder;
builder.build().expect("should build logger")
}
#[test]
fn test_dht_persistence() {
// Create new LevelDB store
let path = TempDir::new("persistence_test").unwrap();
let store = Arc::new(SimpleDiskStore::open(&path.into_path()).unwrap());
// Create a `BeaconChain` object to pass to `Service`
let validator_count = 1;
let genesis_time = 13371337;
let log = get_logger();
let spec = MinimalEthSpec::default_spec();
let genesis_state = interop_genesis_state(
&generate_deterministic_keypairs(validator_count),
genesis_time,
&spec,
)
.expect("should create interop genesis state");
let chain = BeaconChainBuilder::new(MinimalEthSpec)
.logger(log.clone())
.store(store.clone())
.store_migrator(NullMigrator)
.genesis_state(genesis_state)
.expect("should build state using recent genesis")
.dummy_eth1_backend()
.expect("should build the dummy eth1 backend")
.null_event_handler()
.testing_slot_clock(std::time::Duration::from_secs(1))
.expect("should configure testing slot clock")
.reduced_tree_fork_choice()
.expect("should add fork choice to builder")
.build()
.expect("should build");
let beacon_chain = Arc::new(chain);
let enr1 = Enr::from_str("<KEY>").unwrap();
let enr2 = Enr::from_str("<KEY>").unwrap();
let enrs = vec![enr1, enr2];
let runtime = Runtime::new().unwrap();
let executor = runtime.executor();
let mut config = NetworkConfig::default();
config.boot_nodes = enrs.clone();
runtime
.block_on_all(
// Create a new network service which implicitly gets dropped at the
// end of the block.
Service::new(beacon_chain.clone(), &config, &executor, log.clone())
.into_future()
.and_then(move |(_service, _)| Ok(())),
)
.unwrap();
// Load the persisted dht from the store
let persisted_enrs = load_dht::<
beacon_chain::builder::Witness<
SimpleDiskStore<types::eth_spec::MinimalEthSpec>,
store::migrate::NullMigrator,
TestingSlotClock,
beacon_chain::eth1_chain::CachingEth1Backend<
types::eth_spec::MinimalEthSpec,
SimpleDiskStore<types::eth_spec::MinimalEthSpec>,
>,
types::eth_spec::MinimalEthSpec,
beacon_chain::events::NullEventHandler<types::eth_spec::MinimalEthSpec>,
>,
>(store);
assert!(
persisted_enrs.contains(&enrs[0]),
"should have persisted the first ENR to store"
);
assert!(
persisted_enrs.contains(&enrs[1]),
"should have persisted the second ENR to store"
);
}
}
|
import numpy as np
import pykin.utils.transform_utils as t_utils
import pykin.utils.kin_utils as k_utils
import pykin.kinematics.jacobian as jac
from pykin.planners.planner import Planner
from pykin.utils.error_utils import OriValueError, CollisionError
from pykin.utils.kin_utils import ShellColors as sc, logging_time
from pykin.utils.log_utils import create_logger
from pykin.utils.transform_utils import get_linear_interpoation, get_quaternion_slerp
logger = create_logger('Cartesian Planner', "debug",)
class CartesianPlanner(Planner):
"""
path planner in Cartesian space
Args:
robot(SingleArm or Bimanual): The manipulator robot type is SingleArm or Bimanual
self_collision_manager: CollisionManager for robot's self collision check
object_collision_manager: CollisionManager for collision check between robot and object
n_step(int): Number of waypoints
dimension(int): robot arm's dof
waypoint_type(str): Type of waypoint ex) "Linear", "Cubic", "Circular"
"""
def __init__(
self,
robot,
self_collision_manager=None,
object_collision_manager=None,
n_step=500,
dimension=7,
waypoint_type="Linear"
):
super(CartesianPlanner, self).__init__(
robot,
self_collision_manager,
object_collision_manager,
dimension)
self.n_step = n_step
self.waypoint_type = waypoint_type
self.eef_name = self.robot.eef_name
self.arm = None
self._dimension = dimension
super()._setup_q_limits()
super()._setup_eef_name()
def __repr__(self):
return 'pykin.planners.cartesian_planner.{}()'.format(type(self).__name__)
@logging_time
def get_path_in_joinst_space(
self,
current_q=None,
goal_pose=None,
waypoints=None,
resolution=1,
damping=0.5,
epsilon=1e-12,
pos_sensitivity=0.03,
is_slerp=False
):
self._cur_qpos = super()._change_types(current_q)
self._goal_pose = super()._change_types(goal_pose)
init_fk = self.robot.kin.forward_kinematics(self.robot.desired_frames, self._cur_qpos)
self._cur_pose = self.robot.get_eef_pose(init_fk)
self._resolution = resolution
self._damping = damping
self._pos_sensitivity = pos_sensitivity
self._is_slerp = is_slerp
if waypoints is None:
waypoints = self.generate_waypoints(is_slerp)
paths, target_positions = self._compute_path_and_target_pose(waypoints, epsilon)
return paths, target_positions
def _compute_path_and_target_pose(self, waypoints, epsilon):
cnt = 0
total_cnt = 10
while True:
cnt += 1
collision_pose = {}
cur_fk = self.robot.kin.forward_kinematics(self.robot.desired_frames, self._cur_qpos)
current_transform = cur_fk[self.eef_name].h_mat
eef_position = cur_fk[self.eef_name].pos
paths = [self._cur_qpos]
target_positions = [eef_position]
for step, (pos, ori) in enumerate(waypoints):
target_transform = t_utils.get_h_mat(pos, ori)
err_pose = k_utils.calc_pose_error(target_transform, current_transform, epsilon)
J = jac.calc_jacobian(self.robot.desired_frames, cur_fk, self._dimension)
J_dls = np.dot(J.T, np.linalg.inv(np.dot(J, J.T) + self._damping**2 * np.identity(6)))
dq = np.dot(J_dls, err_pose)
self._cur_qpos = np.array([(self._cur_qpos[i] + dq[i]) for i in range(self._dimension)]).reshape(self._dimension,)
is_collision_free = self._collision_free(self._cur_qpos)
if not is_collision_free:
_, name = self.self_c_manager.in_collision_other(other_manager=self.object_c_manager, return_names=True)
collision_pose[step] = (name, np.round(target_transform[:3,3], 6))
continue
if not self._check_q_in_limits(self._cur_qpos):
continue
cur_fk = self.robot.kin.forward_kinematics(self.robot.desired_frames, self._cur_qpos)
current_transform = cur_fk[self.robot.eef_name].h_mat
if step % (1/self._resolution) == 0 or step == len(waypoints)-1:
paths.append(self._cur_qpos)
target_positions.append(pos)
err = t_utils.compute_pose_error(self._goal_pose[:3], cur_fk[self.eef_name].pos)
if collision_pose.keys():
logger.error(f"Failed Generate Path.. Collision may occur.")
for name, pose in collision_pose.values():
logger.warning(f"\n\tCollision Names : {name} \n\tCollision Position : {pose}")
# logger.warning(f"Collision Position : {pose}")
raise CollisionError("Conflict confirmed. Check the object position!")
if err < self._pos_sensitivity:
logger.info(f"Generate Path Successfully!! Error is {err:6f}")
break
if cnt > total_cnt:
logger.error(f"Failed Generate Path.. The number of retries of {cnt} exceeded")
paths, target_positions = None, None
break
logger.error(f"Failed Generate Path.. Position Error is {err:6f}")
print(f"{sc.BOLD}Retry Generate Path, the number of retries is {cnt}/{total_cnt} {sc.ENDC}\n")
return paths, target_positions
# TODO
# generate cubic, circular waypoints
def generate_waypoints(self, is_slerp):
if self.waypoint_type == "Linear":
waypoints = [path for path in self._get_linear_path(self._cur_pose, self._goal_pose, is_slerp)]
if self.waypoint_type == "Cubic":
pass
if self.waypoint_type == "Circular":
pass
return waypoints
def get_waypoints(self):
return self.waypoints
def _change_pose_type(self, pose):
ret = np.zeros(7)
ret[:3] = pose[:3]
if isinstance(pose, (list, tuple)):
pose = np.asarray(pose)
ori = pose[3:]
if ori.shape == (3,):
ori = t_utils.get_quaternion_from_rpy(ori)
ret[3:] = ori
elif ori.shape == (4,):
ret[3:] = ori
else:
raise OriValueError(ori.shape)
return ret
def _get_linear_path(self, init_pose, goal_pose, is_slerp):
for step in range(1, self.n_step + 1):
delta_t = step / self.n_step
pos = get_linear_interpoation(init_pose[:3], goal_pose[:3], delta_t)
ori = init_pose[3:]
if is_slerp:
ori = get_quaternion_slerp(init_pose[3:], goal_pose[3:], delta_t)
yield (pos, ori)
def _get_cubic_path(self):
pass
def _get_cicular_path(self):
pass
@property
def resolution(self):
return self._resolution
@resolution.setter
def resolution(self, resolution):
self._resolution = resolution
@property
def damping(self):
return self._damping
@damping.setter
def damping(self, damping):
self._damping = damping
@property
def pos_sensitivity(self):
return self._pos_sensitivity
@pos_sensitivity.setter
def pos_sensitivity(self, pos_sensitivity):
self._pos_sensitivity = pos_sensitivity
@property
def is_slerp(self):
return self._is_slerp
@is_slerp.setter
def is_slerp(self, is_slerp):
self._is_slerp = is_slerp
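# Usage sketch (an assumption, not part of pykin): `robot` is expected to be a pykin
# robot instance (e.g. a SingleArm loaded from a URDF) with its end-effector link
# configured elsewhere; the joint values, goal pose and parameter choices below are
# placeholders chosen only for illustration.
def _example_linear_plan(robot):
    planner = CartesianPlanner(
        robot,
        self_collision_manager=None,      # collision managers omitted in this sketch
        object_collision_manager=None,
        n_step=300,
        dimension=7,
        waypoint_type="Linear",
    )
    current_q = np.zeros(7)                                     # placeholder start configuration
    goal_pose = np.array([0.4, 0.2, 0.5, 0.0, 0.0, 0.0, 1.0])   # x, y, z + quaternion (assumed ordering)
    paths, target_positions = planner.get_path_in_joinst_space(
        current_q=current_q,
        goal_pose=goal_pose,
        resolution=0.1,   # keep every 10th waypoint in the returned joint path
        is_slerp=True,    # interpolate orientation as well as position
    )
    if paths is not None:
        print(f"planned {len(paths)} joint configurations")
    return paths, target_positions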
|
/**
* JHOVE2 - Next-generation architecture for format-aware characterization
*
* Copyright (c) 2009 by The Regents of the University of California
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are met:
*
* o Redistributions of source code must retain the above copyright notice,
* this list of conditions and the following disclaimer.
*
* o Redistributions in binary form must reproduce the above copyright notice,
* this list of conditions and the following disclaimer in the documentation
* and/or other materials provided with the distribution.
*
* o Neither the name of the University of California/California Digital
* Library, I<NAME>/Portico, or Stanford University, nor the names of
* its contributors may be used to endorse or promote products derived from
* this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
* CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
* SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
* INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*/
package org.jhove2.module.format.icc.field;
import java.util.Iterator;
import java.util.Properties;
import java.util.Set;
import java.util.TreeSet;
import org.jhove2.core.JHOVE2;
import org.jhove2.core.JHOVE2Exception;
/** ICC tags as defined in ICC.1:2004-10, § 9.
*
* @author slabrams
*/
public class Tag
implements Comparable<Tag>
{
/** Singleton tags. */
protected static Set<Tag> tags;
/** Tag name. */
protected String name;
/** Tag signature. */
protected String signature;
/** Tag vendor. */
protected String vendor;
/**
* Instantiate a new <code>Tag</code> object.
* @param vendor Tag vendor
* @param signature
* Tag signature
* @param name
*            Tag name
*/
public Tag(String vendor, String signature, String name) {
this.vendor = vendor;
this.signature = signature;
this.name = name;
}
/** Initialize the tags.
* @param jhove2 JHOVE2 framework
* @throws JHOVE2Exception
*/
protected static synchronized void init(JHOVE2 jhove2)
throws JHOVE2Exception
{
if (tags == null) {
/* Initialize the tags from a Java resource bundle. */
tags = new TreeSet<Tag>();
Properties props =
jhove2.getConfigInfo().getProperties("Tags");
if (props != null) {
Set<String> set = props.stringPropertyNames();
Iterator<String> iter = set.iterator();
while (iter.hasNext()) {
String sig = iter.next();
String ven = props.getProperty(sig);
String nam = null;
int in = ven.indexOf('|');
if (in > 0) {
nam = ven.substring(in+1);
ven = ven.substring(0, in);
}
Tag tag = new Tag(ven, sig, nam);
tags.add(tag);
}
}
}
}
/**
* Get the tag for a signature.
* @param signature Tag signature
* @param jhove2 JHOVE2 framework
* @return Tag, or null if the signature is not a tag signature
* @throws JHOVE2Exception
*/
public static synchronized Tag getTag(String signature,
JHOVE2 jhove2)
throws JHOVE2Exception
{
init(jhove2);
Tag tag = null;
Iterator<Tag> iter = tags.iterator();
while (iter.hasNext()) {
Tag tg = iter.next();
if (signature.equals(tg.getSignature())) {
tag = tg;
break;
}
}
return tag;
}
/**
* Get the tags.
* @param jhove2 JHOVE2 framework
* @return tags
* @throws JHOVE2Exception
*/
public static Set<Tag> getTags(JHOVE2 jhove2)
throws JHOVE2Exception
{
init(jhove2);
return tags;
}
/**
* Get the tag name.
* @return Tag name
*/
public String getName() {
return this.name;
}
/**
* Get the tag signature.
* @return Tag signature
*/
public String getSignature() {
return this.signature;
}
/** Get the tag vendor.
* @return Tag vendor
*/
public String getVendor() {
return this.vendor;
}
/**
* Convert the tag to a Java string in the form:
* "signature: name".
* @return Java string representation of the tag
*/
public String toString() {
return this.getSignature() + ": " + this.getName();
}
/**
* Compare tag.
* @param tag
* Tag to be compared
* @return -1, 0, or 1 if this tag is less than,
* equal to, or greater than the second
*/
@Override
public int compareTo(Tag tag) {
return this.signature.compareTo(tag.getSignature());
}
}
|
Update on novel agents in renal cell carcinoma Renal cell carcinoma (RCC) is a disease with a variable natural history, sometimes presenting with a very indolent course and other times with an aggressive clinical course and unusual sites of metastasis. Surgical resection for stage I-III tumors represents the standard of care and is the only curative option available to patients. However, 40-50% of patients develop metastatic disease. Prior to the advent of targeted therapy, cytokine therapy was the only treatment for RCC. The administration of high-dose, bolus IL-2 has historically produced consistent, durable responses in a small percentage of patients with advanced RCC. The use of IFN-α is currently limited to combination therapies. Multiple new agents targeting the VEGF pathway have been tested and approved, including sunitinib, sorafenib and bevacizumab, with others waiting in the wings. In the majority of cases these drugs induce disease stabilization with eventual disease progression. Hence additional new pathways are being targeted and studied. Mechanisms of drug resistance, novel combinations, sequences and schedules are the focus of current clinical investigations. This review provides an updated list of the novel targeted agents in advanced clinical development for metastatic RCC.
|
package org.innovateuk.ifs.user.service;
import org.innovateuk.ifs.commons.rest.RestResult;
import org.innovateuk.ifs.user.resource.ProcessRoleResource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import java.util.List;
import java.util.concurrent.Future;
import static java.util.Arrays.asList;
import static org.innovateuk.ifs.application.service.Futures.adapt;
/**
* This class contains methods to retrieve and store {@link ProcessRoleResource} related data,
* through the RestService {@link UserRestService}.
*/
@Service
public class ProcessRoleServiceImpl implements ProcessRoleService {
@Autowired
private ProcessRoleRestService processRoleRestService;
@Override
public List<ProcessRoleResource> findAssignableProcessRoles(Long applicationId) {
return processRoleRestService.findAssignableProcessRoles(applicationId).getSuccess();
}
@Override
public Future<ProcessRoleResource> getById(Long id) {
return adapt(processRoleRestService.findProcessRoleById(id), RestResult::getSuccess);
}
}
|
import { Component } from '@angular/core';
import { NgForm } from '@angular/forms';
@Component({
selector: 'app-auth',
templateUrl: './auth.component.html'
})
export class AuthComponent {
isLoginMode = true;
onSwitchMode() {
this.isLoginMode = !this.isLoginMode;
}
onSubmit(form: NgForm) {
console.log(form.value);
form.reset();
}
}
|
from os import getenv
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import Response, JSONResponse
from fastapi.requests import Request
import uvicorn
from app.api import submission, visualization, clustering
app = FastAPI(
title="Labs26-StorySquad-DS-Team B",
description="A RESTful API for the Story Squad Project",
version="0.1",
docs_url="/",
)
app.include_router(submission.router)
app.include_router(visualization.router)
app.include_router(clustering.router)
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
if __name__ == "__main__":
uvicorn.run(app)
|
package slimeknights.mantle.util;
import com.google.common.collect.ImmutableList;
import java.util.Collection;
import java.util.List;
/**
* Builder for creating a list by adding the last elements for the final list first
*/
public class ReversedListBuilder<E> {
/** Store the list as a list of lists as that makes reversing easier */
private final ImmutableList.Builder<Collection<E>> unpacked = ImmutableList.builder();
/**
* Adds all data from the given collection to the builder.
* This is done in terms of collections rather than individual elements to speed up reversal, as the use of this adds elements in batches
* @param collection Collection to add
*/
public void addAll(Collection<E> collection) {
unpacked.add(collection);
}
/** Builds the final list of quads */
public ImmutableList<E> build() {
List<Collection<E>> unpacked = this.unpacked.build();
ImmutableList.Builder<E> packed = ImmutableList.builder();
for (int i = unpacked.size() - 1; i >= 0; i--) {
packed.addAll(unpacked.get(i));
}
return packed.build();
}
}
|
1. Field of the Invention
Aspects of the present invention relate to an organic light emitting diode. More particularly, aspects of the present invention relates to a multi-layer structure of the organic light emitting diode.
2. Description of the Related Art
Among various display panels for a display device, a display panel using an organic light emitting diode (OLED) has been receiving attention according to advances in semiconductor technology.
An active matrix type of OLED display using an organic light emitting diode includes a plurality of pixels arranged on a substrate in a matrix form and thin film transistors (TFTs) disposed at each of the pixels, such that each of the pixels is independently controlled through one of the thin film transistors.
In order to exhibit optimal characteristics of the OLED display, semiconductor elements including a thin film transistor and organic light emitting elements including an anode, a hole injection layer (HIL), a hole transport layer (HTL), an emission layer (EML), an electron transport layer (ETL), an electron injection layer (EIL), and a cathode should have good characteristics and should work well with each other.
Semiconductor elements are being actively studied and advances in technology with respect to semiconductor elements are being applied not only to the OLED displays but also to other technical systems. It would also be desirable to provide significant achievements through the study and development of the organic light emitting elements so that a satisfactory OLED display can be provided to a customer.
The above information disclosed in this Background section is only for enhancement of understanding of the background of the invention and therefore may contain information that does not form the prior art that is already known in this country to a person of ordinary skill in the art.
|
import { promises as fs } from 'fs';
import { Canvas, CanvasRenderingContext2D as NodeCanvasRenderingContext2D, ImageData, createCanvas, createImageData } from 'canvas';
import { ImageFilter } from './ImageFilter';
import { XbrzImageFilter } from './XbrzImageFilter';
import { convertPixelsFrom32To8, convertPixelsFrom8To32 } from '../support/ByteUtils';
// Compatibility layer for type checking in the browser
const CanvasRenderingContext2D = NodeCanvasRenderingContext2D || window.CanvasRenderingContext2D;
export class Image {
width: number;
height: number;
data: Uint32Array;
constructor(width: number, height: number, data: Uint32Array) {
this.width = width;
this.height = height;
this.data = data;
}
static fromCanvas(canvas: Canvas | HTMLCanvasElement): Image {
const imageData = (canvas as Canvas).getContext('2d').getImageData(0, 0, canvas.width, canvas.height);
return new Image(imageData.width, imageData.height, convertPixelsFrom8To32(imageData.data));
}
// Filters
applyFilter(filter: ImageFilter): Promise<Image> {
return filter.apply(this);
}
applyXbrzUpscaling(scale: 2 | 3 | 4 | 5 | 6 = 4): Promise<Image> {
return this.applyFilter(new XbrzImageFilter(scale));
}
// Output
toCanvas(): Canvas | HTMLCanvasElement;
toCanvas(canvas: Canvas | HTMLCanvasElement): Canvas | HTMLCanvasElement;
toCanvas(canvas: Canvas | HTMLCanvasElement, x: number, y: number): Canvas | HTMLCanvasElement;
toCanvas(context: CanvasRenderingContext2D): Canvas | HTMLCanvasElement;
toCanvas(context: CanvasRenderingContext2D, x: number, y: number): Canvas | HTMLCanvasElement;
toCanvas(value?: Canvas | HTMLCanvasElement | CanvasRenderingContext2D, x = 0, y = 0): Canvas | HTMLCanvasElement {
const context = this.resolveContext(value);
context.putImageData(this.getImageData(), x, y);
return context.canvas;
}
toDataURL(mimeType: 'image/png' | 'image/jpeg' = 'image/png'): string {
return this.toCanvas().toDataURL(mimeType);
}
toBlob(mimeType: 'image/png' | 'image/jpeg'): Promise<Blob> {
return new Promise((resolve, reject) => {
(this.toCanvas() as HTMLCanvasElement).toBlob((blob) => {
if (blob) resolve(blob);
else reject();
}, mimeType);
});
}
toBuffer(mimeType: 'image/png' | 'image/jpeg' = 'image/png'): Promise<Buffer> {
return new Promise<Buffer>((resolve, reject) => {
(this.toCanvas() as Canvas).toBuffer((err, result) => {
if (err) reject(err);
else resolve(result);
}, mimeType as 'image/png');
});
}
async toFile(path: string, mimeType?: 'image/png' | 'image/jpeg'): Promise<void> {
if (!mimeType) {
mimeType = /\.jpe?g$/i.test(path) ? 'image/jpeg' : 'image/png';
}
const buffer = await this.toBuffer(mimeType);
return fs.writeFile(path, buffer);
}
private resolveCanvas(canvas?: Canvas | HTMLCanvasElement): Canvas | HTMLCanvasElement {
return (canvas) ? canvas : createCanvas(this.width, this.height);
}
private resolveContext(value?: Canvas | HTMLCanvasElement | CanvasRenderingContext2D): CanvasRenderingContext2D {
if (value instanceof CanvasRenderingContext2D) {
return value;
}
return this.resolveCanvas(value as Canvas).getContext('2d') as CanvasRenderingContext2D;
}
private getImageData(): ImageData {
const data = convertPixelsFrom32To8(this.data);
return createImageData(data, this.width, this.height);
}
}
|
You have a busy life, and money plays a big part of it. Want to shop smarter, plan for a life milestone or finally have the money talk with your significant other? Start here.
|
Towards a low-cost embedded vision-based occupancy recognition system for energy management applications This paper focuses on the development of a low-cost real-time system that detects the presence of people using convolutional neural networks. The proposed detector was implemented in an embedded system composed of a Raspberry Pi 3, an Intel Neural Compute Stick accelerator, and a control circuit containing a relay, a transistor, and the Raspberry Pi output ports. The model was calibrated by varying two parameters: the intersection-over-union score and the minimum detection probability, both necessary to achieve a high level of confidence when detecting a person. An experiment was carried out as a proof of concept of the system under different test scenarios such as walking fast with poor and optimal lighting conditions and strolling with good lighting. As a result, the system obtained a confidence level above 80% in all test scenarios.
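As a rough illustration of the calibration described above (a detection-probability cutoff combined with an intersection-over-union check to discard duplicate boxes), the following sketch is hypothetical: the detection tuples, thresholds and the relay remark are assumptions for the example, not the authors' implementation.

def iou(box_a, box_b):
    # Boxes are (x1, y1, x2, y2); returns intersection-over-union in [0, 1].
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def occupancy_detected(detections, prob_threshold=0.8, iou_threshold=0.5):
    # detections: list of (label, probability, box); True when at least one
    # confident, non-duplicate "person" detection survives the two thresholds.
    persons = [d for d in detections if d[0] == "person" and d[1] >= prob_threshold]
    kept = []
    for label, prob, box in sorted(persons, key=lambda d: d[1], reverse=True):
        if all(iou(box, k[2]) < iou_threshold for k in kept):
            kept.append((label, prob, box))
    return len(kept) > 0  # if True, the control circuit would switch the relay on

frame = [("person", 0.91, (10, 10, 60, 120)),    # two overlapping boxes of one person
         ("person", 0.88, (12, 12, 62, 122)),
         ("person", 0.40, (200, 50, 240, 150))]  # below the probability cutoff
print(occupancy_detected(frame))  # True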
|
Endovascular photodynamic therapy with aminolaevulinic acid prevents balloon induced intimal hyperplasia and constrictive remodelling. BACKGROUND AND OBJECTIVE intimal hyperplasia (IH) and constrictive remodelling are important causes of restenosis following endovascular interventions, such as percutaneous transluminal angioplasty. Photodynamic therapy (PDT) with 5-aminolaevulinic (ALA) may prevent restenosis by cellular depletion and the elimination of cholinergic innervation. STUDY DESIGN/MATERIALS AND METHODS rats (n=90) were subdivided into 4 main groups. In the experimental group (n=36: 3 replications x 4 doses x 3 examination time-points), ALA was administered (200mg/kg i.v.) 2-3h before balloon injury (BI) of the common iliac artery followed by endovascular illumination with 633nm at either 12.5, 25, 50 or 100J/cm diffuser length (dl BI+PDT group). As control groups served the BI+Light only (LO) group (n=36) that received no ALA, the BI only group (n=9) (BI), and a group (n=9) that received a Sham procedure (Sham group). RESULTS planimetric analysis showed IH of 0.28+/-0.12mm (BI), 0.27+/-0.12mm (BI+LO at 100J/cmdl) in contrast to 0.02+/-0.02mm after BI+PDT at 100J/cmdl at 16 weeks (p<0.05). In the BI+PDT groups, a light-dose increase of a factor 2 led to an IH decrease of 17% (p<0.05). In the BI and BI+LO groups constrictive remodelling was found, in contrast to BI+PDT treated groups at 16 weeks. The staining of cholinergic innervation of the tunic media of the blood vessel wall in BI+PDT showed no damage at the highest fluence. CONCLUSION endovascular ALA-PDT prevents IH and constrictive remodelling after BI without damage of cholinergic innervation of the tunica media. The effective light fluence rate in the rat is 50-100J/cmdl.
|
6 facts about psychedelic drugs that will totally blow your mind In the effort to demonize mind-altering drugs, critics have overlooked some very real benefits
Despite the fact that the U.S. government deems many hallucinogenic or psychedelic substances to be dangerous, classifying them as Schedule I drugs with “no currently accepted medical use,” various scientists have dared to study their effects. What they’ve found over the years paints a startling, promising and powerful picture of potentially game-changing medicines.
The government’s "war on drugs" policies severely limit research on psychedelics. Before scientists can complete any federally sanctioned studies, they have to jump through an expensive tangle of hoops and red tape. Restrictions aside, over the years researchers have collected a database of research showing that many psychedelics have an unprecedented potential to treat cancers, addictions and psychological traumas, among other things.
Here are some of the coolest things scientists have discovered about psychedelics over the years.
1. LSD can mitigate end-of-life anxiety.
The results of the first clinical study of the therapeutic use of LSD (lysergic acid diethylamide) in humans in more than 40 years were published in the peer-reviewed Journal of Nervous and Mental Disease in March. They show that LSD can promote statistically significant reductions in anxiety for people coming to terms with their own impending demise.
Swiss psychiatrist Peter Gasser and his colleagues conducted the double-blind, placebo-controlled study, sponsored by the non-profit Multidisciplinary Association for Psychedelic Studies (MAPS). They tracked 12 people who were near the end of life as they attended LSD-assisted psychotherapy sessions. In his report, Gasser concluded that the study subjects’ anxiety "went down and stayed down."
2. Psilocybin, aka magic mushrooms, actually calms, rather than stimulates, certain brain functions.
The common conception is that psychedelics do something extra to cause their effects—increase activity, add hallucinations, promote awareness, etc. A study that examined brain scans of people under the influence of psilocybin found that it reduces activity in certain areas of the brain. That reduction of activity leads to the drug's effect on cognition and memory. Psychedelics, and psilocybin in particular, might actually be eliminating what could be called the extra "noise" in the brain.
3. The drug MDMA (aka ecstasy, or Molly) promotes release of the hormone oxytocin, which could help treat severe anxieties like PTSD and social anxiety resulting from autism.
Before the federal government classified it as a Schedule I substance, therapists experimented with MDMA (3,4-methylenedioxymethamphetamine) beginning in the 1970s to help reduce moderate depression and anxiety among their adult patients. After widespread recreational use in the rave scene caught the attention of authorities, MDMA was criminalized in 1985. However, research primarily supported by MAPS has continued to turn up positive results for the drug's potential therapeutic use. Various clinical trials and statistical research have confirmed that MDMA can successfully treat post-traumatic stress in military veterans and others. One example is the clinical trial led by Michael Mithoefer, which used MDMA-assisted psychotherapy to treat chronic PTSD.
A 2009 study offers a plausible explanation for MDMA’s effectiveness treating PTSD. The double-blind, randomized, placebo-controlled study of 15 healthy individuals confirmed that MDMA causes the brain to release oxytocin, which is the human hormone linked to feelings of love and compassion.
MAPS recently received government approval to launch a new study examining MDMA’s potential for treating social anxiety in autistic adults. Based on the known effects of MDMA, as well as individual reports, this exploratory study will focus on enhancing functional skills and quality of life in autistic adults with social anxiety.
4. Psilocybin could kill smoking addiction.
Psychiatry professor Matthew Johnson, who works at Johns Hopkins University School of Medicine, presented the preliminary results of a pilot feasibility study looking at the ability of psilocybin to treat smoking addiction at the 2013 Psychedelic Science conference in Oakland, Calif. For the study, five cigarette-addicted participants underwent placebo-controlled psilocybin treatment with a psychiatrist. All five completely quit smoking after their first psilocybin session. At all followup visits, which occurred up to one year later for the first four participants, it was biologically confirmed that the participants had abstained from cigarettes.
5. Ayahuasca can treat drug addictionand possibly much more.
Ayahuasca is a brew prepared with the Banisteriopsis caapi vine, originally used for spiritual and healing purposes in the Peruvian Amazon rainforest. The vine is usually mixed with leaves containing the psychedelic compound DMT.
Gabor Mate, a medical doctor from Vancouver who is a prominent ayahuasca researcher, contends that therapy assisted by psychedelics, and ayahuasca in particular, can untangle complex, unconscious psychological stresses. He claims these stresses underlie and contribute to all chronic medical conditions, from cancer and addiction to depression and multiple sclerosis.
The results of the first North American observational study on the safety and long-term effectiveness of ayahuasca treatment for addiction and dependence were published in June 2013 in the journal Current Drug Abuse Reviews. All of the participants in the study reported positive and lasting changes, and the study found statistically significant improvements “for scales assessing hopefulness, empowerment, mindfulness, and quality of life meaning and outlook subscales. Self-reported alcohol, tobacco and cocaine use declined, although cannabis and opiate use did not.” The reported reductions in problematic cocaine use were also statistically significant.
6. DMT occurs naturally in the human body, and taking it could simulate death.
The drug DMT (dimethyltryptamine), which causes hallucinogenic experiences, is made up of a chemical compound that already occurs within the human body endogenously (as well as in a number of plants). This means our brains are naturally set up to process the drug, because the brain has receptors that exist specifically to do so. Cannabis is another illegal drug that occurs endogenously.
Some research based on near-death experiences points to the fact that the brain releases DMT during death. Some researchers have also conjectured that DMT is released during other intense experiences, including orgasm.
|
import os
import logging
from medicine_api import models
from medicine_api.readers.price_reader import PriceReader
from django.core.management.base import BaseCommand
logger = logging.getLogger('medicine_api.readers.link_database_populator')
class Command(BaseCommand):
"""
Runs the CRON job to (re)populate the LinkMetadata model with refreshed information from the spreadsheet source.
"""
help = '(Re)populates the Medicine for Ukraine links database table with updated information.'
def handle(self, *args, **options):
logger.info('Link Database Checks Starting')
price_reader = PriceReader()
data = price_reader.get_link_data()
for item in data:
metadata_object = models.LinkMetadata()
metadata_object.set_from_link_data(item)
logger.info('Link Database Checks Complete')
self.stdout.write(self.style.SUCCESS('Link checking successfully completed.'))
|
/**
* @author Ranger Tsao(https://github.com/boliza)
*/
@DataObject(generateConverter = true)
public class RequestParameters {
private JsonObject properties;
private String text;
private Language language = Language.en;
private String pattern;
private boolean filter;
public RequestParameters() {
}
public RequestParameters(JsonObject json) {
RequestParametersConverter.fromJson(json, this);
}
public JsonObject toJson() {
JsonObject json = new JsonObject();
RequestParametersConverter.toJson(this, json);
return json;
}
public JsonObject getProperties() {
return properties;
}
public void setProperties(JsonObject properties) {
this.properties = properties;
}
@GenIgnore
public RequestParameters setAnnotators(List<String> annotators) {
checkProperties();
properties.put("annotators", String.join(",", annotators));
return this;
}
@GenIgnore
public RequestParameters addAnnotator(String annotator) {
checkProperties();
properties.put("annotators", String.join(",", properties.getString("annotators", annotator).split(",")));
return this;
}
@GenIgnore
public RequestParameters addProperty(String key, Object value) {
checkProperties();
properties.put("key", key);
return this;
}
private void checkProperties() {
if (properties == null) {
properties = new JsonObject();
}
}
public String getText() {
return text;
}
public RequestParameters setText(String text) {
this.text = text;
return this;
}
public Language getLanguage() {
return language;
}
public RequestParameters setLanguage(Language language) {
this.language = language;
return this;
}
public String getPattern() {
return pattern;
}
public RequestParameters setPattern(String pattern) {
this.pattern = pattern;
return this;
}
public boolean isFilter() {
return filter;
}
public RequestParameters setFilter(boolean filter) {
this.filter = filter;
return this;
}
}
|
import numpy as np


def process_edit_distances(mx, reference_counter, reference_values, search_values):
    """For each reference value, collect every search value that sits at the
    minimal edit distance in the corresponding row of the distance matrix `mx`.

    Returns a list of tuples:
    (reference, count of that reference, best-matching search value, minimal distance).
    """
    # Minimal distance per row (one row per reference value).
    min_distances = mx.min(axis=1)
    matches = []
    for i in range(mx.shape[0]):
        min_dist = min_distances[i]
        # Indices of all search values that share the row minimum (ties included).
        min_indices = np.argwhere(mx[i, :] == min_dist)
        min_indices = list(min_indices[:, 0])
        for idx in min_indices:
            reference, min_match = reference_values[i], search_values[idx]
            matches.append((reference, reference_counter[reference], min_match, min_dist))
    return matches
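# Usage sketch (assumptions, not part of the original module): mx[i, j] is taken to
# be the edit distance between reference_values[i] and search_values[j], and
# reference_counter a collections.Counter over reference occurrences. A tiny
# pure-Python Levenshtein helper keeps the example self-contained.
from collections import Counter


def _levenshtein(a, b):
    # Classic dynamic-programming edit distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]


if __name__ == "__main__":
    reference_values = ["color", "centre"]
    search_values = ["colour", "center", "colors"]
    reference_counter = Counter(["color", "color", "centre"])
    mx = np.array([[_levenshtein(r, s) for s in search_values] for r in reference_values])
    for reference, count, match, dist in process_edit_distances(
            mx, reference_counter, reference_values, search_values):
        print(reference, count, match, dist)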
|
Fighter, Corpsman, Partisan an Attempt to Typify Former Soldiers Based on their Coping and Defense Mechanisms This work strives to develop a typological classification of the use of conscious and unconscious defense and coping mechanisms based on methodically and structurally collected data from a qualitative survey of 43 former soldiers in Germany. Seven coping and defense types were identified: the Fighter, the Comrade, the Corpsman, the Strategist, the Partisan, the Self-Protector and the Infantryman. The types identified differed with regard to the accumulation, combination, and use of their conscious and unconscious defense and coping mechanisms in the superordinate areas of behaviour, relationships, emotions, reflexivity and time focus. The typological classification could offer psychotherapeutic interventions tailored to individuals and their defense and coping mechanisms, which could lead to improved therapy use and compliance. The Situation of a Former German Soldier in an International Comparison In all, 85.5% of the German Armed Forces' (GAF) foreign missions are associated with the experience of stressful events and an increased prevalence of mental illness with a https://doi.org/10.1007/s12124-019-09507-1 generally low level of therapy use (). In addition to traumatizing events during their missions abroad and experiences of discrimination at home due to their careers, (former) soldiers in general have to manage high amounts of stress and adjustment. Different demands (assimilating into the military system, war experiences abroad, return and reintegration into professional and family life afterwards, leaving the military, reintegration into civilian life and perhaps dealing with symptoms of a traumarelated disorder) require an extremely high degree of adaptability (B. In contrast to the broad state of knowledge about the mental health of, for example, US soldiers, little is known about the handling of stressful events and mental illness among active GAF soldiers (), and even less is known about the coping of former GAF soldiers who have left the military system (). For various reasons, it is not possible to easily apply international findings to the situation in Germany, not least because of diverging national sociocultural factors and different benefits laws. The attitudes of the German public to military forces in general, and the GAF in particular, are still influenced by Nazi history and the war crimes of the German Wehrmacht. Therefore, active and former German soldiers, especially mentally ill soldiers, are exposed to high levels of stigmatization, which they have to cope with in their everyday life and which makes it even more difficult to seek professional help (). However, even in countries such as the USA, where traumatization of soldiers and their (subsequent) coping mechanisms have been the focus of scientific and societal attention for some time now, to our knowledge, there are hardly any studies or publications on the defense and coping mechanisms of former soldiers. Coping and Defense Mechanisms Beutel defines coping mechanisms as mostly conscious, not automatic, cognitive and experience-related or behavioural processes in persisting, aversive situations or situations expected to become aversive. Since the development of the transactional stress model by Lazarus and his colleagues (Lazarus and Launier 1981;Lazarus and Folkman 1984), coping mechanisms have become the subject of intensive psychological research (Overview: (Schwarzer 1998)). 
The subsequently developed category systems and instruments, such as the COPE (Kato 2015;), made it possible to discover coping mechanisms through self-disclosure and to develop first coping concepts. However, the role of individual personality traits and (biographical) motives was often neglected in those coping concepts and scales (Steffens and Kchele 1988). Lazarus himself admitted that unconscious intentions were hardly represented in this approach (Lazarus 2000). Although research results in recent years support a high proportion of unconscious, intuitive processes, especially in complex, fast decisions (;Gigerenzer and Kober 2009), in this respect little has changed in the approach to coping research to our knowledge. During the last decades, the research on defense mechanisms has grown immensely. Psychoanalysts from different schools interpret defense differently, and every definition involves the risk of an over-reduction of this very complex construct. In their 600-page book on the current state of theory and research of defense mechanisms, Hentschel et al. rightly stress the complexity of the subject. They also acknowledge the often divergent attitudes to psychoanalysis in general and to defense mechanisms in particular of the psychological research field. According to Beutel, defense mechanisms can be understood as unconscious, primarily cognitive and experience-related processes that include a narrowing or distortion of intersubjective reality and self-perception (Beutel 1990). After the development of Sigmund Freud's theory of defense, its further development by his daughter Anna Freud and the publication of her work 'Ego and the Mechanisms of Defense' (Freud 1936), the concept of defense was supplemented by intrapsychological and interpersonal perspectives of the various currents within the school of psychoanalysis. In recent decades, authors such as Vaillant, Laughlin and Knig have repeatedly attempted to develop and introduce a new kind of taxonomy, but the number and classification of the chosen defense mechanisms differed considerably from author to author (Vaillant 1971(Vaillant, 1992Laughlin 1979;Knig 1997;). In addition, the concept of the individual's psychic structure has become increasingly important in theory, in the context of operationalized psychodynamic diagnostics (OPD) (;Cierpka 2014), as well as in psychotherapeutic and trauma therapy practice (;Wller 2013). There seems to be relative agreement that defense and coping mechanisms have or are Ego-functions, with defense being used for inner-psychic protection and regulation and coping for real adaptation and problem solving (Steffens and Kchele 1988). In other words, only the corresponding (unconscious) defense enables successful (conscious) coping (Cierpka 2014). In psychoanalytic theory and research certain character types (Knig 2004) and clinical diagnoses are associated with certain constellations of defense and coping mechanisms, for example the obsessive-compulsive disorder with reaction formation and isolation () or the borderline personality disorder with mechanisms such as sensation seeking, autoaggressive behaviour, dissociation, splitting and many more. In recent decades, numerous well-known authors have repeatedly expressed criticism of an explicit separation of coping and defense (Cierpka 2014;Steffens and Kchele 1988;Beutel 1990). Steffens & Kchele already wrote in 1988: "We consider it sensible to give up a strict separation of coping and defense. 
Both processes complement each other, by no means exclude each other alternatively". Nevertheless, in the past, there were repeated "denial efforts of kinship relations" in both concepts (Cierpka 2014;Steffens and Kchele 1988;Beutel 1990). Even now, at least in Germany, the use of either coping self-rating scales in clinical psychiatric practice and science, on the one hand, and the focus on unconscious conflicts and defense mechanisms of psychoanalytic therapists and scientists, on the other hand, indicate a continued one-sided approach. To our knowledge, there has been no practical, integrative model for the identification and representation of coping and defense mechanisms in trauma-related disorders. This deficiency is surprising since coping mechanisms (such as sports or drug consumption) and/or unconscious defense mechanisms (e.g., rationalization or splitting) play an important role in diagnosis, therapy planning and prognostic assessment by therapists of different schools. This role is especially true for patients with trauma-related disorders, in whose treatment Egosupporting, affect-regulating interventions are often of great practical importance. In general, active coping (e.g., fight or flight) is more likely to be used if the person assumes he/she can control the threat or escape from it. If the individual considers control or escape impossible, he/she usually reacts with passive mechanisms (Olff and Langeland 2005). It was, therefore, assumed that the use of certain coping mechanisms was primarily situation-dependent. On the other hand, there are also indications that a certain coping style or the repetitive use of certain mechanisms can represent a risk factor in the development of trauma-related disorders (). This risk corresponds to the psychoanalytical assumption of a 'defense and coping profile' linked to the psychological structure and to findings from current trauma research indicating that an individual's coping style or coping mode influences whether the affected person can successfully handle the traumatic experience or develops a mental illness such as PTSD (;). Goal of this Study Apart from a phenomenological description of coping and defense mechanisms, this study aims to determine if there are specific combinations of mechanisms that lead to special adjustment types among former GAF soldiers and their approach to psychotherapeutic or psychiatric treatment. Research Context This study was conducted within the framework of the "German former soldiers' Readjustment Study", which is a research project designed to gain insight into the daily life of former German soldiers, including their life satisfaction, economic situation, family situation and their health status. It was approved by the ethics committee of the Charit Universitatsmedizin Berlin (Approval Number EA1/250/14). Recruitment and Sampling Recruiting potential participants took place via a project-owned website and the psychiatric ward of the GAF Hospital Berlin. Former soldiers of the GAF who had participated in at least one foreign assignment were included. Thus, it was possible to identify 103 potential participants, of whom five could not ultimately be contacted. Short telephone interviews were conducted with the remaining 98 participants. They were asked about their motivation for participating and to provide their sociodemographic data. The inclusion criteria were checked as well. In all, 43 of the remaining 98 participants were eventually invited to phase two of the project, the open interviews. 
The participants were initially selected at the Bundeswehr hospital in Berlin using opportunistic sampling (Teddlie and Yu 2007). Because theoretical saturation arose during the course of the interviews and the initial analysis, other interview partners not associated with the hospital were selected, visited and interviewed in their homes. For this purpose, participants with and without psychological symptoms and those who did or did not use psychosocial services were selected to create a contrasting sample (based on the theoretical sampling of Strauss and Corbin (Corbin and Strauss 2008)) and to achieve the highest possible degree of variation. Each interview was promptly followed by a debriefing session with the research team. Form, situation, content and countertransference experience were compared with previous interviews, and the next procedure was determined, especially with regard to sampling. After a total of 43 interviews, we reached theoretical saturation, which eventually led to the cessation of data collection. We then subjected 16 interviews, again selected on the basis of contrasting features (origin, age, gender, relationship status, number of children, training level, military rank and organizational area, country of mission, duration of deployment, psychological symptom burden, psychotherapy experience, and claim of service-incurred disability), to a detailed thematic analysis and compared the resulting concepts with the remaining 27 interviews. Participants The included sample comprised former soldiers of different ranks (from soldiers to senior officers), organizational units (Army, Navy, Air Force, Joint Support Service, and Central Medical Service) and federal states of Germany (12 out of 16). Four (9.3%) participants were female; the age span of the participants was between 26 and 69 (MW = 40.4; SD = 12.3). Some participants had just left the GAF, and others had been civilians for several years. According to the former soldiers interviewed, 19 (44%) suffered from a service-incurred disability, and three (6.9%) were in treatment time, which is a time span used for clinical and occupational rehabilitation in Germany during which soldiers, although not on active duty, cannot be discharged from military service. Data Collection The data were collected in personal, open interviews. The interviews were conducted jointly by two of the authors, with one of them moderating the discussion and the other responsible for the equipment and accompanying field observations. This also included his/her own counter-transference experience and observations of emotional reactions of the interviewees and in some cases, for example, their interaction with relatives or partly present pets, etc. During the interviews, the researchers used a narrative technique (Ksters 2014) that always started with a general introduction about the former soldiers' experiences and their adjustment in the GAF before, during, and after their deployment, as well as after their discharge and today. The interviewers let the participants talk about whatever seemed to be significant to the participants. If inconsistencies occurred, or if emotional responses were displayed, or if challenging situations were described, the interviewer encouraged the participants to elaborate more deeply on the emerging issues, especially with regard to conscious and unconscious mechanisms to adjust. The interviews were finished when the participants had nothing left to describe. 
Conscious coping mechanisms (e.g., alcohol consumption, sport) were reported during the interview, sometimes without being asked and sometimes directly asked at a later point in time. Unconscious coping and defense mechanisms were identified in the very detailed descriptions of the participant's situation and action processes (e.g., rationalization, denial) or observed during the interview (e.g., affect isolation, dissociation). The entire interview was recorded in MP3 format and then transcribed by an external transcription service pledged to confidentiality (). Data Analysis The methodological analysis process based on the principles of grounded theory and thematic analysis involved the iterative generation of hypotheses and the development of new models and typologies. According to, we also approached the material iteratively through inductive development, deduction and validation to develop a theoretical model anchored in the material, whereby the data collection and analysis processes were continued until saturation occurred, which is when no new insights result from data collection and analysis. Our approach corresponded to the steps mentioned by Guest et al.: 1. getting to know the material, 2. identifying thematic categories, 3. identifying structures and combinations and 4. building a theoretical model based on the findings (). The first step consisted of getting to know the material by repeatedly listening to the interviews, reading the transcripts and making notes and cross-references using the MAXQDA12 program. First codes were given to note down or summarize the observed or otherwise identified coping and defense mechanisms. For example, a passage in which an interviewee described how he cried during psychological sessions was given the code "crying" (along with others). In the second step, thematic categories were identified by combining, contrasting and merging similar or interconnected coping and defense mechanism codes. For example, the code "crying" was then combined with other codes such as "expressing sadness or anger" or "showing feelings", and the category "show emotions" was created. The third step included the review and analysis of the thematic categories found to identify underlying thematic structures, motifs, functions or specific combinations of coping and defense categories using new raw data from the ongoing data collection and analysis. To continue illustrating with the above example, the passages and codes were read again, the codes were compared, and the intensity and context in which emotions were shown or not shown were examined or other codes with emotional content were added to the category. The fourth and final step was to develop a typology based on the combinations of mechanisms found in the material, constantly cross-checking it with new material and also including current literature and existing models. Thus, for example, different manifestations of the area "emotion" arose and a model continuum with the superordinate area "emotion" was created, consisting of the three emotion categories "defensive", "partially permissive" and "affirmative" (Tables 1 and 2). For further verification and differentiation of our results, we used the "Document Portrait" function of the MAXQDA program, which is a tool for visualizing an interview by representing it as an image of its codes. 
The frequency and length of the coded defense and coping mechanisms within the conversation are displayed as a graphical representation, whereby the length of a segment is taken into account and included as a weighting factor. The result of the preceding steps was the development of a typological classification, which we have summarized in the form of results tables for better representability (Tables 3-9).

Quality Assurance
To ensure the highest possible quality during the scientific process, the qualitative research was conducted in accordance with established qualitative research quality criteria (Stamer et al. 2015). In addition to the previously mentioned theory-based, iterative approach to sampling, data collection, analysis and verification, we also chose a multi-perspective, multidisciplinary research team (doctors and psychologists with and without clinical experience and with and without military background or deployment experience abroad). The validation processes in the research group, in particular, proved to be a very important element for the conscious, reflexive, flexible and self-critical handling of the data and of team members' pre-concepts. In addition to the internal communicative validation by presentation and discussion of (partial) results and corrective consensus-building, all project members committed themselves to regular participation in advanced training and external research workshops and to external research supervision.

Results
Overall, a very large number and range of described, named and observed defense and coping mechanisms were found in the group we interviewed as well as in the individual interviews. Thus, in the first step of thematic analysis (according to Guest et al. and Chapman et al.), we extracted a total of 1960 text segments containing forms of conscious and unconscious coping and defense mechanisms, which were ordered into 89 codes. In the second analysis step, these codes were categorized according to their content, motif and function and assigned to 15 superordinate coping and defense categories (see Table 1). In the third step, the examination and analysis of the categories found, undertaken to identify underlying thematic structures and combinations, reduced the existing 15 coping and defense categories to the five overarching thematic areas of 'behaviour', 'relationship', 'emotion', 'reflexivity' and 'time', whereby each area could be divided into three value-neutral gradations (−, ± and +) (see Table 2). In the fourth step, theoretical assumptions were compared with the newly found data, and coping/defense modes and combinations were identified, verified using the 'portrait function' (see: Data analysis) and integrated into a prototypical classification (see Tables 3-9). Although most former soldiers showed a large variety of different coping and defense mechanisms, many of the interviewees used a repetitive pattern of individual core mechanisms at different stages of adaptation. Analysing the combination, frequency and dominance of use of certain coping mechanisms in the investigated sample, seven distinguishable defense and coping types with special coping modes could be identified: the Fighter, the Strategist, the Partisan, the Self-Protector, the Infantryman, the Comrade and the Corpsman, whereby the last can be seen as a subtype of the Comrade. The following tables describe the various subtypes according to their prototypical traits, expressions and modes (Tables 3-9), whereby the strengths, weaknesses, risks and opportunities of the types were clarified according to the principle of SWOT analysis. A sample quote from an interview with a participant assigned to the corresponding type, translated from German, is intended to serve as clarification. For ease of readability, the masculine form is used in the following description. However, the descriptions apply to both sexes.

Table 3 Prototypical classification: Fighter (Fighting Mode). The Fighter takes matters into his own hands. He sets himself new goals and tasks, and he diverts himself with activities. He actively avoids aversive situations and sets clear boundaries. He describes situations focusing on his actions and behaviour and is capable of self-reflection. He tries to avoid deeply emotional involvement (by rationalizing, trivializing or disciplining himself). Human relationships and closeness are seen positively. His time-related foci are present and future; his contextual foci are capacity, power and performance.

Table 4 Prototypical classification: Strategist (Active-reflexive Mode). The Strategist acts on his own initiative. He looks on his own for prospective alternatives, makes his own decisions, is aware of inner changes since his mission abroad, and intensively reflects on the past and the world of today. He can perceive and identify his emotions to a certain extent. He has a positive approach to relationships as long as he remains fairly independent. His time-related foci are present time and future; his contextual foci are positive aspects of the present and plans for the future, his self-efficacy and positively experienced solution strategies.

3.a Comrade (Social Mode). The Comrade puts emphasis on comradeship and his social role, stays in touch with his former comrades or creates a new social network (family, friends, animals). Loyalty to his peer group is highly important to him. He describes situations with emphasis on social experiences. His ability to reflect on himself is reasonable to good, but he avoids or denies emotions when they are different from those of his surroundings. His time-related focus is rather indifferent, and his contextual foci are comradeship and his social surroundings.

Reflexions and Interpersonal Aspects
When we launched our website, we anticipated a moderate response. This turned out to be a mistake. After just one week, the call to participate in the study had been shared over 28,000 times. We had underestimated the veterans' need to communicate and their feeling of not being seen in their suffering. This impression was strengthened in the further work process. The fact that the members of our research group were part of a well-known non-military university institution but still had military knowledge and in part military experience themselves certainly contributed to greater openness in the interviews. When we asked the potential interviewees in preliminary telephone interviews about their motivation to talk to us, we often heard answers like: "Finally there's someone listening", "For me, participating in this project is the first step in dealing with what I have experienced", or "Participation is the chance that something finally changes. Most people don't dare to speak." The great expectations of us and our project motivated us, but the idealization in some of those answers also made us afraid to disappoint those needs and hopes. The interviews showed very detailed descriptions of the behaviour and activities of the former soldiers at home and abroad.
In contrast, hardly any of the interviewees spoke much about their emotional experiences. When feelings were shown openly, this often happened in the form of hardly regulated emotional reactions. The question of whether this phenomenon is a group characteristic or a symptom of trauma disorder (in the sense of emotional numbness or flooding) cannot be answered with absolute certainty. Our analysis showed that some emotions such as disappointment, anger and bitterness were frequently addressed and named, while others, such as sadness, guilt, shame and fear, were hardly expressed or mentioned. One possible interpretation of this discrepancy could be that feelings such as disappointment, anger and bitterness were often directed at the external military or civil system and are also better compatible with the soldiers' self-standard (). Sadness, fear or shame, on the other hand, might be feelings that are less compatible with this ideal and therefore had to be fended off unconsciously or were deliberately concealed from us. This leads to the assumption that, regardless of posttraumatic symptoms, there may be a special degree of conscious and unconscious rejection of emotions within our group. We also have to acknowledge that the respective military rank may have an influence on the mechanisms and behaviours described. Types such as the Infantryman, for example, are likely to be less frequent and less long-term in leading positions, while types such as the Strategist are more likely to feel comfortable in a leadership position on account of his need for autonomy as well as his abilities. However, these interactions certainly also exist in the civilian field.

Table 6 Prototypical classification: 3.b Corpsman (Altruistic Mode, Subtype of the Social Mode). The Corpsman is a person who is very committed to helping and supporting others, but who represses, trivializes or denies his own desires, needs and feelings. In his stories, he emphasizes other people's pain and his role as a helper. He has moderate self-reflection abilities, but he represses or denies his own feelings and is not aware of his limitations. His time-related focus is the present time, and his contextual foci are other people's misery and suffering. Behaviour: +/-; Relationship & Bonding: ++; Emotion: -; Reflexivity: +/-; Time Focus: +/-. Strengths: Gains confidence and self-esteem by supporting, helping and caring for others; valuable member of the social framework. Weaknesses: Self-esteem and psychological stability rest on caring for others; no awareness of his own needs or of the fact that he subordinates them to other people's desires. Opportunities: Understanding and accepting his own needs and desires, increasing autonomy. Risks: Altruism to the point of self-sacrifice, extreme overload and decompensation due to self-care deficits. Quotation: "I just didn't want to give up; I can't just leave because of the mother of XX and XX. I think they would collapse if they didn't have my support now. And when I say, 'I can't do it anymore! I'll just leave it!', then the house of cards collapses. And I know I'm just standing and holding on tight." (sighs heavily)
Classification of the Results
The typology presented here is a first classification attempt, based on the data material, to illustrate an integrative approach linking schools and approaches to psychotherapy, in order to be able to make an initial assessment of the conscious and unconscious coping abilities, strengths, weaknesses and psychotherapeutic needs of former GAF soldiers within the course of diagnostic conversations, without the use of further questionnaires and independent of the psychotherapeutic background of the diagnostician.

Table 7 Prototypical classification: Partisan (Aggressive Mode). The Partisan is in a fight against the military system, his country and civil society. He describes situations focusing on his actions and behaviour (oscillating between attack and resignation). His ability for self-reflection and mentalizing is poor; he represses or denies emotions despite hate and anger, projects inner feelings or conflicts onto his surroundings, rejects closeness and relationships, and focuses on his past. His contextual focus is on the failure of others.

Self-Protector (Negating Mode). The Self-Protector is constantly trying to escape from difficult situations and the respective emotions, either through passive avoidance mechanisms, such as intellectualizing, rationalizing, trivializing, repressing or denying negative feelings and implementing high safety precautions, or through active avoidance mechanisms, such as flight, professional over-commitment and other means of distraction. He approves of relationships to a certain extent but struggles with self-reflection and deep emotional experiences and involvement. His time-related focus is on the present and future, and his contextual foci are on his activities, self-protecting acts, and psychosomatic complaints. Behaviour: + and -; Relationship & Bonding: +/-; Emotion: -; Reflexivity: -; Time Focus: +/-. Strengths: Good defense mechanisms if problems are mild, because needs and the desire for protection and flight can be answered promptly. Weaknesses: Avoidance of self-reflection; negatively experienced feelings must be fended off by all means; little willingness to introspect. Opportunities: Improvement of his own emotional approach, reduction of anxiety, tension and psychosomatic reactions. Risks: Prolonged avoidance of therapy despite severe, mostly psychosomatic symptoms.

The great individual variability in coping and defense mechanisms used by former soldiers at different points in time confirms the frequently cited thesis that the situation and the perceived threat influence an individual's choice of coping mechanisms, whereby a simultaneous use of different mechanisms usually occurs (Olff and Langeland 2005). On the other hand, we were able to identify recurring, often unconscious, core mechanisms in individual participants, which were used at several points in time. This consistency speaks to their cross-situational use and corresponds to the psychoanalytical assumption of a 'defense profile' linked to the psychological structure, as well as to findings of current trauma research indicating that the individual coping style influences whether a person successfully processes a trauma in the long term or develops a trauma-related disorder (;). Overlaps of psychoanalytic character types and structures (e.g., König 2004) with the types introduced in this article can be seen. There are also similarities to coping concepts such as the classification of COPE (). However, both are closely associated with specific schools of psychology and are therefore often interpreted within the respective theoretical framework.
Our aim was to use an unbiased and open approach to this topic to detect underlying patterns which are not automatically associated with an existing theoretical framework and which therefore enable the integration of different perspectives. Therefore, the examination of existing theoretical backgrounds took place at a rather late point in time and involved more the comparison or exploration of similarities and differences. Revealing unconscious mechanisms requires a personal interview situation, and an external view is absolutely necessary.

Table 9 Prototypical classification: Infantryman (Accepting Mode). The Infantryman is capable of accepting difficult situations and adapting to changing circumstances. He executes the given commands reliably and can set aside inner conflicts. His self-reflection and emotional experience are diminished/rejected, his time-related focus is present-oriented, and his contextual foci are his (passive) behaviour, the command situation and his execution of orders. Behaviour: --; Relationship & Bonding: +/-; Emotion: -; Reflexivity: -; Time Focus: +. Strengths: Ability to withstand difficult situations for a long time and to accept and make the best of them. Weaknesses: Little awareness of his own needs, and therefore little commitment to stand up for himself. Opportunities: Perceiving his own limits and needs; learning to take responsibility; gaining more control over his life. Good willingness to enter therapy if the "instruction" comes from the outside. Risks: Long latency until the use of assistance without clear instructions or a gatekeeper; no claim/use of medical/psychological services at all. Quotation: "I was very lucky to have a friend in my circle of friends who worked here at Bundeswehr Hospital as a doctor, when I started feeling bad two years ago, and he said: 'This is not just about your normal life, which you've been living up to now, with all its difficulties, but you are still carrying around a lot of things from before that you haven't processed yet.' And he took me by the hand and brought me here, so to speak, and then I was tested here, and then they said to me: 'Yes, there's something wrong here.'"

The research results of the last few years underscore the high proportion of unconscious, intuitively functioning processes, especially in the case of complex, fast-track decisions (). It is virtually impossible to record such processes by filling in self-rating scales, such as coping scales like the COPE (). In fact, only on the basis of our interview approach were we able to identify additional mechanisms, to differentiate more precisely between emotional and reflexive reactions and to recognize the motives, gradations and characteristics of relationships, emotional approval, or reflexivity. Only in the context of conversations could motives be identified and coping mechanisms be detected and strictly assigned to the corresponding specific categories and modes. Despite scientifically well-investigated and effective therapy methods, it is well-known that only a small percentage of (former) soldiers with mental illnesses use psychosocial services in the USA, Great Britain and Germany (;;Murphy and Busuttil 2015;;), and many wait for years to decades before entering treatment (;;;). Even if they see a doctor, many evidence-based treatment options that work in other clinical contexts often seem to fail in this specific context. One reason may be that members of the military system have different basic assumptions about mental illness and psychotherapeutic treatment from civilian patients.
The conflict of identity that arises in the soldier from the implicit weakness of the mental illness and the fear of stigmatization may be even greater than in members of civilian society (). Rsch et al. also emphasized the relevance of fear and experience of stigmatization and discrimination among soldiers in their qualitative work on the self-revelation of psychiatric disease (). This finding raises the question of whether it is time to take a different approach. Taken these findings into consideration, our typology allows for a more tailored approach to psychotherapy and help. This approach could be a way to increase willingness to seek help and establish a relationship with the patient. Referring to the typology we introduced, it might be sensible to ask whether, for example, social and altruistic types such as the Comrade or the Corpsman are more likely to engage in therapy, regardless of psychotherapeutic school, if they are offered a group therapy setting at the outset or by offering a sports therapy programme to active types such as the Fighter, who quickly feel insecure in emotional contexts and use mechanisms, such as rationalization, trivialization, fighting, and powering out. Such a programme might be more inviting and would approach those former soldiers on their own terms before they are transferred to a more traditional therapeutic approach that focuses on communication and emotional insights, an approach with which those types might otherwise not be able to engage before their suffering has increased immensely. Individuals who may have internalized a very rigid male soldier ideal could, in this way, be slowly introduced to emotional topics. A subsequent psychodynamic or cognitive behavioural (psychotrauma) therapy, in accordance with proven guidelines and recommendations, would then be the second step. The same may be true of the Partisan, who would most likely benefit most from a mentalization-based therapy since it seems to be (psychoanalytically expressed) more a structural problem that prevents him from leading a balanced and successful life than a question of conflict. We are fully aware of the fact that our attempt at type formation is only a first step that must be confirmed by further research. However, we believe it is necessary to develop new approaches to mental illness and the use of therapy for former GAF soldiers in order to counteract the suffering of individuals and their families. Due to the high number of unreported cases of mentally ill soldiers in need of treatment () and the growing number of traumatized individuals in recent times, we consider a new approach to be overdue. We know that seeking help is not always a conscious, cognitive decision. In the decision process, many subconscious notions and stereotypical assumptions play a role and influence the outcome of this decision-making. Our typological classification creates initial insights into underlying motivations of conscious and unconscious coping and defense mechanisms and might be helpful in the adaptation of help on offer and therapy for the specific needs and abilities of various former soldiers. Additional research on conscious and unconscious coping mechanisms in traumarelated disorders is necessary independent of our results. 
From the point of view of care research, for example, the identification of individual core mechanisms, ideally at different examination times, could shed light on the influence of traumatic experiences on personal coping and defense styles and be used in the long term to create specific therapy regimes or for a more nuanced examination of the question of personal vulnerability. The question also arises as to how far the types identified can also be found in other socio-cultural contexts. Are there such types in the American military, too? Would it not make sense to also consider an appropriate typology for the treatment of civilian patients with trauma-related disorders? After all, everyone has their own coping (and defense) mechanisms. Would it be an opportunity for all groups who find therapeutic access difficult to think about including coping and defense much more consciously in their therapy planning? This approach would mean that coping styles could be used to enable therapy by changing and reducing the individual's defenses. Critical Discussion of the Typological Construct Although critical opinions of typological constructs must be considered, it has to be stated that many of these concepts are well-known and are currently widely used, such as the Myers-Briggs-Test or the concept of Typus Melancholicus (). Many of the theoretical models and classification experiments by famous scientists such as Karl Jaspers and Kurt Schneider are based on typological constructs and have made their way into some parts of today's classification systems such as the DSM-IV (;Schfer 2001). The development of typologies has always been of great importance for better understanding and categorization of human behaviour, especially in the psychiatric context. According to Schfer 2001, today the term 'type' is not a quantifiable scale that is used for the determination of individual characteristics. It instead represents a multi-dimensional prototype of patterns with which the individual can be compared. This comparison can be made via the endpoints of the continuum (extreme variants) or via gradual gradations on the continuum (accumulated types). In Anglo-American countries, the concept of type is given far less importance in personality research than in Central Europe. If the term 'type' is used at all, it is used in special cases. Especially in personality research, in the Anglo-American world, the term 'personality trait' is preferred to 'personality type'. As these 'traits' also occur with different degrees of expression and have no fixed boundaries, they are nonetheless very similar to the 'types' in many respects (Schfer 2001). Compared to categorical or dimensional approaches, typologies enable scientists and practitioners to use abstraction and reduction to make a complex process understandable and descriptive. In the past, critics of typological approaches rightly criticized the frequent lack of differentiation and validity of typological constructs. In relation to the past, this criticism is not completely unjustified. However, there are also counterexamples to the charge of lack of validity, such as the theory of personality functioning by Block and Block, who divided individuals into' resilient',' over-controlled' and' undercontrolled' (Block and Block, 1980). The validity of the model was statistically confirmed in several studies (). 
Regardless, typological analysis is ideally carried out in three steps: 1) finding the types, 2) describing the types and 3) diagnosing the types using qualitative and quantitative methods (Zerssen 1973). We, therefore, see our attempt to identify defense and coping mechanism types in the group of former GAF soldiers as a first step in the generation of hypotheses (finding and describing the types), which will hopefully be followed by more qualitative and quantitative validation steps in the future. Strengths and Limitations The study is a pilot study for the identification, illustration and typological classification of conscious and unconscious coping mechanisms of former soldiers with deployment experience. With 43 interviews, the study has a broad database for a qualitative design. Due to the high media resonance and the additional recruitment in the hospital and within the military, we likely reached a very large number of former soldiers and included a large variability of contrasting cases through the differentiated sampling. Influences by the investigator and resulting psychodynamic processes were minimized by the sampling, the divergent team composition (medical doctors and psychological staff of both sexes with and without military and combat experience and with and without psychotherapeutic training in different psychotherapeutic schools), and analysed and included in the evaluation by the conscious handling of countertransference experiences and regular team intervention and supervision. The open nature of the interview design with its low limitations, the lack of requirements and the possibility of asking questions led to very detailed, individual, subject-related descriptions of the internal and external processes. For most of the participants the interview situation, especially the conversation about stressful events in the war zone, was a challenge in itself. Therefore, some defense and coping mechanisms were already apparent during the interview which we were able to experience "live". Through the shown or not shown affect, the choice of words, the tone of voice and they kind of description, mechanisms were already recognizable in the interview situation. Our approach therefore involved, on the one hand, the analysis of the mechanisms remembered and narrated by the participants themselves, and, on the other hand, the consideration of the unconscious mechanisms and behaviours recognisable from the narrative. By the additional analysis of those mechanisms directly occurring in the course of the interview, we believe we counteracted a conscious or unconscious selection or distortion process, as occurs in any form of (retrospective) survey. In other words, these phenomena helped us to better understand the (unconscious) fears, values and motives of the former soldiers interviewed. Although we have paid great attention to differentiated sampling with the aim of achieving the greatest possible heterogeneity and have always oriented our data collection and analysis to the quality criteria of, there is still no need for generalizability in the sense of statistical representativeness, as is generally not the case with any form of qualitative procedure. Our typological classification constitutes the preliminary results and theoretical model from a pilot study. It must and should be validated and critically reviewed, and evaluated for its practical usability, in followup studies. 
Conclusion This work strives for the first time, to our knowledge, to develop a typological classification of the use of conscious and unconscious defense and coping mechanisms on the basis of methodically and structurally collected and analysed data from a qualitative pilot survey of former soldiers in Germany. Seven coping and defense types were identified: the Fighter, the Comrade, the Corpsman, the Strategist, the Partisan, the Self-Protector and the Infantryman. The types identified differed in the accumulation, combination, and use of their conscious and unconscious defense and coping mechanisms in categories in the superordinate areas of behaviour, relationships, emotions, reflexivity and time focus. The typological classification could offer psychotherapeutic interventions tailored to individuals and their defense and coping mechanisms, which could lead to improved therapy use and compliance. Nevertheless, further research is needed in the field of trauma management and validation and verification of our results in follow-up studies.
|
Bioactivity assessment of exopolysaccharides produced by Pleurotus pulmonarius in submerged culture with different agro-waste residues Pleurotus spp. are white-rot fungi that utilize different agro-wastes to produce useful biologically active compounds. In this study, exopolysaccharides (EPS) were produced by Pleurotus pulmonarius in submerged culture supplemented with different agro-wastes. Functional groups in EPS were revealed using Fourier Transform-Infrared (FT-IR) spectroscopy. Antimicrobial activity of EPS was tested against microorganisms using agar well diffusion. Scavenging potentials of EPS was tested against 1, 1- diphenyl-2-picryhydrazyl (DPPH), hydroxyl (OH), iron (Fe2+) and nitric oxide (NO) radicals. In vitro prebiotic activity of EPS was carried out. The highest yield (5.60 g/L) of EPS was produced by P. pulmonarius in submerged culture supplemented with groundnut shell (20.0 g/L). The functional groups in EPS were hydroxyl (-OH), methyl (-CH3), ketone (-RCOH) and carbonyl group (-C=O). EPS displayed zones of inhibition (5.0014.00 mm) against tested microorganisms. Scavenging activity of EPS ranged from 65.70-81.80% against DPPH. EPS supported the growth of Lactobacillus delbrueckii and Streptococcus thermophiles with values ranged from 3.04 1043.40 104 cfu/ml and 2.50 1042.81 104 cfu/ml, respectively. Submerged culture of P. pulmonarius with addition of agro-wastes enhanced yield of EPS. The EPS exhibited bio-functional properties like antimicrobial, antioxidant and prebiotic activities. Hence, agrowastes can be recycled in submerged fermentation with fungi to produce promising biomaterials for biopharmaceutical applications. Introduction Agro-wastes are often disposed indiscriminately and constitute a great nuisance to the environment (a). This has resulted to climatic variability with serious risk to human and ecological health (;and Ferronato and Torretta, 2019). Agro-waste residues such as peels, seeds, stones from fruits and vegetables contain indispensable chemical constituents, which can be used for the production of useful products (). The cultivation of microbial cells on agro-wastes (substrates) to synthesis biologically active compounds using different biotechnological innovations is gaining an interest and is of progressive boons (a). Fungal mycelia have the ability to utilize complex organic compounds in agro-residues through fermentation processes and produce biomolecules, which are useful as nutritional supplements and serves as complementary medicine to prevent degenerative diseases (;). The recovery of by-products from biologically treated agro-wastes will therefore, improve utilization of agro-wastes and then minimize the problem of environmental pollution (b). Agricultural wastes (lignocellulosic biomass) composed of cellulose, hemicelluloses and lignin, which can be enzymatically converted by microbes into various values added biotechnological products (Baig 2020). Plant biomass waste is therefore, a desirable alternative raw material to produce functional products since it is readily available and bio-renewable. Fungi such as Pleurotus spp., Polyporus ostriformi, Phanerochaete chrysosporium and many more have colonizing potentials on ligno-cellulosic materials (). P. pulmonarius is an economically important edible and medicinal macrofungus that traditionally grown on different substrates namely; corn cob, corn straw, peanut straw, soybean straw, rice straw, rape straw, wheat bran and cottonseed hulls (). Information gathered by Baeva et al. 
revealed that, species of genus Pleurotus capable of utilizing different agrowastes to produce polysaccharides. Fungal polysaccharides, especially EPS possess therapeutic properties like antimicrobial, antioxidant, and anticancer (). This has made EPS to receive greater interest for extensive applications in food, pharmaceutical, and cosmeceutical industries (Ozcan and Oner, 2015). Considering a great interest and uses of EPS for different biological purposes, submerged culture of fungi with agrowastes will be a new approach and an eco-friendly strategy to produce EPS. This study therefore, produces EPS by P. pulmonarius in submerged fermentation with some agro-wastes namely; peels of plantain, pineapple, mango, groundnut shell, coconut coir and walnut husk. The antimicrobial, antioxidant and prebiotic activities of EPS produced by P. pulmonarius were also investigated. Source of agro-wastes Peels of plantain, pineapple, mango, groundnut shell, coconut coir and walnut husk were obtained from fruit market in Odeomu, Nigeria. The agrowastes were air-dried at 29 C for 21 days and pulverized using a mill machine (5657 HAAN 1 TYPE ZM1, Retsch GmbH, Haan, Germany). Collection of microorganisms P. pulmonarius was collected from Federal Institute of Industrial Research Oshodi (FIIRO) Lagos, Nigeria. The fungus was sub-cultured on Potato Dextrose Agar to maintain a pure strain of the fungus. Indicator bacteria and fungi were obtained from Nigeria Institute of Medical Research (NIMR), Lagos, Nigeria. The indicator microorganisms include: Shigella dysenteriae, Escherichia coli O 157:H7, Salmonella typhi, Vibrio cholerae, Methicillin resistant Staphylococcus aureus (MRSA), Bacillus subtilis, Candida albicans, and Candida tropicalis (ATCC 66029). All the tested microorganisms were aseptically sub-cultured into appropriate media and incubated at 37 C for 24 h and 26 C for 48 h for bacteria and fungi, respectively. Two probiotic strains; L. delbrueckii and S. thermophiles used in this study were isolated from yoghurt starter culture. Production and extraction of EPS EPS was produced by P. pulmonarius in submerged fermentation with agro-wastes using methods of Smirderle et al. and Silveira et al. with slight modification. Briefly, 5.0 g/L glucose, 1.0 g/L yeast extract, 1.0 g/L peptone, 0.1 g/L MgSO 4, 0.1 g/L K 2 HPO 4 and 0.1 g/L CaCO 3 and agro-waste of 4.0 g/L, 12.0 g/L and 20.0 g/L in different Erlenmeyer flask. The flasks containing the mixture were autoclaved at 121 C for 15 min. Thereafter, the flasks were allowed to cool before inoculation of 7 days-old mycelium of P. pulmonarius. The flasks were incubated at 26 C and maintained at 120 rpm for 18 days in thermostat incubator with shaker (IN-SK100, Tianjin, China). After submerged fermentation, EPS was assayed using the methods of Svagelj et al. and Diamantopoulou et al.. Briefly, the concentrated culture filtrates were mixed with four volume of ethanol 95% v/v, stirred thoroughly and allow to stand for 24 h at 4 C. The precipitated EPS was treated with 1:4 of n-butanol: chloroform (1:5 v/v) to remove protein. The EPS content was determined by phenol-sulphuric acid methods and glucose was used as standard (). The solvent extract was purified by ion exchange chromatography at flow rate of 24 ml/h and elution was performed with distilled water. The samples were dialyzed exhaustively against distilled water to remove unwanted impurities for 24 h. The EPS obtained was freeze-dried (FD-10-MR, Xiangtan Xiangyi instrument Ltd, China) at -65 C. 
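The phenol-sulphuric acid quantification mentioned above maps absorbance readings onto a glucose standard curve. The sketch below shows how such a conversion could be done; the curve points, wavelength and sample values are purely hypothetical and not taken from this study.

import numpy as np

# Hypothetical glucose standard curve: concentration (mg/mL) vs absorbance (e.g. read at 490 nm).
standard_conc = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10])
standard_abs = np.array([0.00, 0.11, 0.22, 0.34, 0.45, 0.55])

# Fit A = slope * C + intercept to the standard curve.
slope, intercept = np.polyfit(standard_conc, standard_abs, 1)

def eps_concentration(sample_abs: float, dilution_factor: float = 1.0) -> float:
    """Glucose-equivalent EPS concentration (mg/mL) from one absorbance reading."""
    return (sample_abs - intercept) / slope * dilution_factor

print(round(eps_concentration(0.30, dilution_factor=10), 3))  # illustrative: about 0.54 mg/mL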
EPS produced by P. pulmonarius in submerged culture without agro-waste, EPS produced by P. pulmonarius in submerged culture with pineapple peel, EPS produced by P. pulmonarius in submerged culture with groundnut shell and EPS produced by P. pulmonarius in submerged culture with coconut husk were selected for further studies.

FTIR spectroscopic analysis of EPS
Structural analysis of EPS was carried out using Fourier transform-infrared (FT-IR) spectroscopy (8400S, Shimadzu Scientific Instruments Inc.). EPS (1.0 µl) was placed on a fused KBr disc for detection of the functional groups. This was carefully placed on the cell holder, clamped loosely and fixed in the infrared (IR) beam. The run covered the wave number range 400 to 4000 cm-1.

Antimicrobial activity of EPS against microorganisms
The antimicrobial activity of EPS was tested against strains of microorganisms using the agar well diffusion method (Cheesbrough 2000). Microorganisms (bacteria and fungi) were cultivated in their respective broths and incubated at 37 °C for 24 h and 28 °C for 48 h for bacteria and fungi, respectively. The inoculum size was adjusted to a 0.5 McFarland turbidity standard at 600 nm using a visible spectrophotometer (UNICO S-1100 RS). A sterile swab stick moistened with bacterial or fungal inoculum was spread on Mueller Hinton agar. Subsequently, wells of 4 mm diameter were bored into the agar medium and filled with 50 µl of EPS (1.0 mg/ml). The antibiotics amoxicillin and ketoconazole were used as positive controls, while sterile distilled water served as the negative control. The plates were incubated at 37 °C for 24 h and 25 °C for 72 h for bacteria and fungi, respectively. After incubation, zones of inhibition were measured and recorded in millimetres (mm). The minimum inhibitory concentration (MIC) of EPS was determined by varying concentrations from 0.25 mg/ml to 1.0 mg/ml. The lowest concentration of EPS that showed no visible growth was regarded as the MIC.

DPPH scavenging activity of EPS
The free radical scavenging activity of EPS on DPPH was determined using the method of Gyamfi et al.. EPS was mixed with 1.0 ml of 0.4 mM DPPH in methanol (5.0 ml). The mixture was incubated at 27 °C for 30 min in the dark. The control contained only DPPH solution in methanol instead of sample, while methanol served as the blank. Absorbance was read at 517 nm using a UV-visible spectrophotometer. The capacity for free radical scavenging was calculated as: scavenging activity (%) = [(Acontrol - Asample)/Acontrol] x 100.

Hydroxyl radical scavenging assay
The scavenging potential of EPS against the hydroxyl radical was assessed using the method of Halliwell et al.. The reaction mixture contained an aliquot of EPS (100 µl), 120 µl of 20 mM deoxyribose, 400 µl of 0.1 M phosphate buffer, 40 µl of 20 mM hydrogen peroxide and 40 µl of 500 µM FeSO4, and the volume was made up to 800 µl with distilled water. The reaction mixture was incubated at 27 °C for 30 min. Thereafter, 0.5 ml of trichloroacetic acid (2.8%) was added, followed by 0.4 ml of thiobarbituric acid (0.6%). The mixture was heated for 20 min and cooled. The absorbance of the blank (Ab) and the absorbance of the sample (As) were measured at 532 nm in a spectrophotometer, and the percentage inhibition was calculated as [(Ab - As)/Ab] x 100.
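The DPPH and hydroxyl assays above, as well as the chelation and nitric oxide assays that follow, all reduce to the same percentage-inhibition arithmetic on two absorbance readings. A minimal sketch with illustrative numbers (not measurements from this study):

def percent_inhibition(control_abs: float, sample_abs: float) -> float:
    """Generic scavenging/chelating activity: [(A_control - A_sample) / A_control] * 100."""
    return (control_abs - sample_abs) / control_abs * 100.0

# Illustrative absorbance values only, not data from this work:
dpph_control, dpph_sample = 0.820, 0.150
print(f"DPPH scavenging: {percent_inhibition(dpph_control, dpph_sample):.1f}%")  # about 81.7%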
Iron chelation activity of EPS
The ability of EPS to chelate iron(II) sulphate (FeSO4) was determined using the method described by Puntel et al. with little modification. Briefly, 150 mM FeSO4 was added to a reaction mixture containing 168 µl of 0.1 M Tris-HCl (pH 7.4), 218 µl of saline (0.9% NaCl) and 100 µl of EPS at different concentrations (100-200 µg/ml). The reaction mixture was incubated at 27 °C for 5 min before the addition of 13 µl of 0.25% 1,10-phenanthroline (w/v), and the absorbance was subsequently measured at 510 nm in the spectrophotometer. The chelating activity of EPS on the iron radical was calculated using the equation: chelating activity (%) = [(Acontrol - Asample)/Acontrol] x 100.

Nitric oxide radical scavenging activity of EPS
The nitric oxide scavenging activity of EPS was determined using the method described by Jagetia and Baliga. A volume of 1.0 ml of sodium nitroprusside (10 mM) prepared in 0.5 mM phosphate buffered saline (pH 7.4) was mixed with EPS (100 µl) and vortexed. The mixture was incubated at 25 °C for 150 min. Thereafter, 1.0 ml of the previous solution was mixed with 1.0 ml of Griess reagent (1% sulphanilamide, 2% o-phosphoric acid and 0.1% naphthylethylenediamine hydrochloride) and incubated at 26 ± 2 °C for 30 min. The absorbance was recorded at 546 nm. Scavenging activity was calculated using the following formula: scavenging activity (%) = [(Acontrol - Asample)/Acontrol] x 100.

Influence of EPS on probiotic strains
Probiotic growth stimulation by EPS was determined using the method described by Sawangwan et al. with modifications. Briefly, de Man, Rogosa and Sharpe (MRS) broth (10.0 ml) was prepared in different test tubes, sterilized at 121 °C for 15 min and allowed to cool. The MRS broth was supplemented with sterile (0.22 µm Millipore-filtered) EPS at 1% v/v; a negative control contained MRS broth only, and a positive control contained MRS broth with 1% w/v glucose or the commercial prebiotic fructooligosaccharides (FOS). Each tube was inoculated with 100 µl of 10^6 colony forming units (CFU) of L. delbrueckii or S. thermophiles. The tubes were incubated at 37 °C for 48 h under anaerobic conditions. After incubation, the growth of the probiotics was quantified by measuring optical cell density with a spectrophotometer at 620 nm. To determine the load of L. delbrueckii and S. thermophiles, 100 µl was serially diluted in sterile distilled water up to 10^4, and 100 µl was transferred to MRS agar plates. Petri dishes were incubated at 37 °C for 48 h. Thereafter, colonies of L. delbrueckii and S. thermophiles were counted using a colony counter (TT-20, Techmel and Techmel, USA) and reported as colony forming units per millilitre (CFU/ml).

Statistical analysis
All experimental studies were performed in replicate (n = 3). Data were subjected to one-way analysis of variance (ANOVA) and results were presented as mean ± standard deviation (SD). Tests of significant differences were determined by Duncan's Multiple Range Test at P < 0.05. The statistical analysis was carried out using the Statistical Package for Social Sciences (SPSS) version 23.

Results and discussion
Agro-industrial wastes contain nutrients and bioactive compounds that can be biologically converted into value-added products; this therefore remains a circular strategy, implementing policy regulation and cost-effective methods to utilize lignocellulosic biomass as an eco-friendly alternative to increase bio-based products (). In this study, peels of plantain, pineapple and mango and shells of nut foods, namely groundnut, walnut and coconut, were successfully utilized by P. pulmonarius to enhance production of EPS. Figure 1 shows the yield of EPS produced by P. pulmonarius in submerged culture supplemented with agro-wastes. P. pulmonarius produced 1.20 g/L of EPS in submerged culture without agro-waste. When 4.0 g/L of agro-waste was supplemented into the submerged culture, the EPS produced by P. pulmonarius was within 2.0-3.8 g/L. When 12.0 g/L and 20.0 g/L of agro-wastes were added into the submerged culture, EPS produced by P.
pulmonarius ranged from 2.0 to 2.7 g/L and 2.0-5.6 g/L, respectively. P. pulmonarius produced more EPS when agrowastes was supplemented into submerged culture. The higher proportion of EPS from culture media containing agro-wastes could be attributed to colonizing potential of Pleurotus spp. on various agro-industrial residues (). Pleurotus spp. possesses ability to grow on varieties of lignocellulosic biomass, utilizing them as substrates to degrade both natural and anthropogenic aromatic compounds, which occurs by virtue of nonspecific oxidative enzymatic system that consist mainly laccases, phenol oxidases, manganese peroxidases and versatile peroxidases (;). The degrading enzymes in Pleurotus spp. mycelia enable them to store some metabolic products such as sterols, triterpenes, phenolic compounds and largely; polysaccharides (;Vamanu, 2012). Supplementation of agrowastes into cultivating media of fungi during fermentation processes is a mechanism to produce useful bioavailable metabolites for medicinal purposes. Bioproduction of multiple secondary metabolites by higher fungi have increased for industrial applications through optima fermentation conditions, application of metabolic engineering techniques, and fungal genetic manipulation (). Solid state fermentation and submerged cultivation of Basidiomycota are much faster biotechnological processing as a promising tool for industrial production of exopolysaccharides. Submerged cultivation of fungal mycelium is still attractive alternative to produce high-value metabolites; exopolysaccharides (EPS), which are often synthesize in the cell membrane through the action of enzymes (). Trametes versicolor, Lentinula edodes, and Pleurotus ostreatus grown on twelve formulations of different inexpensive lignocellulosic biomass such as oak sawdust, coconut fiber (hairs), coffee husks, and corn bran plus soybean oil produced better yield of polysaccharides (S anchez and Montoya, 2020). Agro-wastes are media that can be utilized by microorganisms to produce natural metabolites with antioxidant and antimicrobial activities (a). Hence, the utilization of agro-wastes to produce biologically active compounds by submerged fermentation with microbial technology is a major way to proffer solution to poor management and disposal of agro wastes. Shells of groundnut, coconut, walnut and dried fruit peels contain cellulose, hemicellulose and lignin, which enable fungi to grow and produce better metabolic products. Groundnut shell was utilized for the production of various bio-products like biodiesel, bioethanol, and enzyme (). Shells of coconut and walnut, fruit peels, seeds and other agrowastes are renewable sources for production of biopolymers. The varying amount of EPS could base on quantity of agrowastes, different type of agro-wastes used, their chemical composition and choice of fungi. It has been reported that optimal submerged culture conditions for maximum mycelial growth and EPS production depend strongly on type of substrates and fungal species (a(Park et al.,, 2002bMahapatra and Banerjee 2013). Other environmental parameters or physical conditions on the fungal mycelia formation during fermentation with culture medium composition could be associated with yield of EPS. The morphological nature, physicochemical parameters and various physiological activities on fungal mycelia in submerged cultures were strongly depended on the quantity of metabolites (Znidarsic and Pavko, 2001;). The findings of Kim et al. 
revealed that, characteristics of mycelia in submerged fermentation were influenced by culture conditions such as composition and initial pH of fermentation medium, age and size of inoculum, aeration rate, agitation speed and thus, affect yield of EPS. The basic composition of agro-wastes residues like nitrogen, carbon and minerals could also affect the production of EPS. Findings of Lee et al., Smiderle et al. and Thai and Keawsompong associated the production of EPS and cell mass to different carbon and nitrogen sources. Mahapatra and Banerjee suggested that, most of the EPS producing fungi are aerobic or facultative anaerobic, and oxygen limitation did not support EPS production and thus, revealed that fungi needed long incubation time for maximum EPS production. Figure S1(supplementary file 1) shows FT-IR spectra of EPS produced by P. pulmonarius in a submerged culture without agro-wastes, EPS produced by P. pulmonarius in submerged culture supplemented with groundnut shell, EPS produced by P. pulmonarius in submerged culture supplemented with coconut shell and EPS produced by P. pulmonarius in submerged culture supplemented with pineapple peel. The wave number (cm 1 ) interpreted for functional groups in EPS were shown in Table 1. A peak of 3448.84 cm 1 and 3404.47 cm 1 indicated hydroxyl group (-OH). A peak around 2929.97cm 1 and 2937.68 cm 1 suggested the presence of methyl (-CH 3 ). Bands at 1639.55 cm 1 and 1724.42 cm 1 were assigned to carbonyl (-CO). The CO group (carbonyl), -CH 3 group (methyl) and -OH group (hydroxyl) were functional groups in EPS produced by Pleurotus spp. The result is in agreement with a previous study by Shen et al. who conducted FT-IR analysis on EPS produced by P. pulmonarius. Findings of Patel et al. indicated sharp peak in FT-IR spectra of functional groups in fungal EPS, which ranged from 1400-1500 cm 1 and 1631.45 cm 1. The band at 1,162 cm 1 corresponds to 1,4-glycosidic linkage in polysaccharides that gives absorption bands in the range of 1,175-1,140 cm 1 (). The different functional groups in EPS occurred as a result of varying contents of monosaccharide composition. Fungal EPSs contain sugars of different monosaccharide units such as glucose, mannose, galactose, xylose, arabinose, fucose and rhamnose (Osi ). Zones of inhibition by EPS on pathogenic microorganisms were shown in Table 2. The zones of inhibition displayed by EPS-A, EPS-B, EPS-C and EPS-D against Shigella dysenteriae and E. coli O 157:H7 were not significantly different at p < 0.05. EPS inhibited the growth of Gram positive bacteria such as Bacillus subtilis, Staphylococcus aureus, yeast and other Gram-negative bacteria. Table 3 showed MIC of EPS against tested EPS-A: EPS produced by P. pulmonarius in submerged culture without agro-wastes. EPS-B: EPS produced by P. pulmonarius in submerged culture with groundnut shell. EPS-C: EPS produced by P. pulmonarius in submerged culture with coconut husk. EPS-D: EPS produced by P. pulmonarius in submerged culture with pineapple peel. AMX: Amoxicillin was used against bacteria and KET: *Ketoconazole was used against fungi. Ganoderma lucidum to their ability to chelate essential nutrients, which therefore, limit the growth of microorganisms. Polysaccharides (intra and extra-) produced by numerous microorganisms, most especially mushrooms are tagged with various biological and pharmacological activities such as antimicrobial, dietary supplements, immunostimulating and immunodulatory, anti-tumor, hypoglycemic and antioxidant (;). 
Therapeutic potentials of EPS are the main reasons for their medicinal and industrial uses. Hence, EPS produced by fungi in submerged culture with agro-wastes was enhanced and can be continually exploited for different bioactivities. Figure 2 shows antioxidant activity of EPS against DPPH, OH, Fe 2 and NO. The scavenging activity of EPS ranged from 67.80-81. 80%, 60.60-81.20%, 70.40-84.70%, 78.40-88.50% against DPPH, OH, Fe 2 and NO, respectively. Zhang et al. revealed EPS as antioxidant agent that scavenged OHand DPPH radicals. The researchers revealed that EPS exhibited antioxidant activity of 100%, which is the same as scavenging activity of vitamin C. In the findings of Liu et al., EPS produced by submerged culture of Inocutus hispidus showed antioxidant activities of 70.7% on hydroxyl and 50% on 2,2-diphenyl-1-picrylhydrazyl radicals. EPS produced by Gomphidius rutilus, mutant Cordyceps militaris SU5-08 and Pleurotus spp. in submerged fermentation demonstrated pronounced antioxidant effects on superoxide anion, 1,1-diphenyl-2-picrylhydrazyl, hydroxyl radical, and reducing power (;;;). EPS have different monosaccharide composition with functional groups such as -OH, -CH 3, -COOH, and -CO that substantiate its free radical scavenging and metal chelating activities (). Such functional groups in biomolecules or functional foods are stable enough to donate electrons to reduce the free radicals to a more stable form or react with the free radicals to dismiss the radical chain reaction (;). Fungal polysaccharides (extra-and intra-cellular) are of great economic importance with wide variety of biological activities, that promote health benefits with relatively nontoxic property (;). Exopolysaccharides from different Basidiomycota are reported to have a large potential for human health maintenance as well as reducing the risk of various diseases. Inonotus obliquus polysaccharides suppressed cell viability, colony-formation, and triggered cell apoptosis and thus, can be used as a promising alternative or supplementary medicine for cancer therapy (). Huang et al. revealed that Ganoderma lucidum polysaccharides and Polyporus umbellatus polysaccharides significantly enhanced the phagocytic function of macrophages and the activity of NK cells. EPS, having displayed different bioactivities, can be combined with compounds of potential value to produce new drugs for better and effective treatments of organic diseases. Table S1 (supplementary file 1) shows optical density at 620 nm for L. delbrueckii and S. thermophiles cultivated in MRS broth, MRS broth supplemented with EPS or commercial prebiotic after 48 h. Table 4 shows count (CFU/ml) of L. delbrueckii and S. thermophiles on MRS agar. Probiotic growth stimulation of EPS (A-D) was ranged within 3.04-3.40 10 4 CFU/ml for L. delbrueckii and 2.50 to 2.81 10 4 CFU/ml for S. thermophiles with varying optical density of 0.9910-1.0716 and 1.0216 to 1.0372, respectively. Sawangwan et al. revealed that Lactobacillus acidophilus grown in Lentinus edodoes extract and Lactobacillus plantarum cultured in P. pulmonarius extract have optical density (620 nm) of 1.9779 and 1.9702, respectively. This study indicated that EPS enhanced the growth of L. delbrueckii and S. thermophiles. Polysaccharides from Pleurotus spp. are supportive media that enhance the growth rate of probiotics (). 
In the findings of Nowak et al., mushroom polysaccharides support the growth of Lactobacillus strains than some commercially available prebiotics like inulin and fructooligosaccharides. The variability observed in biofunctionality of EPS against microorganisms, free radicals and towards the growth of L. delbrueckii and S. thermophiles could due to the content and quantity of monosaccharides as well as their molecular weight. Biological activities and therapeutic effects of polysaccharides depend on specific glycosidic linkages, degree of branching, monosaccharide composition and molecular weight. EPSs are natural polymers of microbial products with novel biological properties (anticancer, antimicrobial and antioxidants) and thus, serve as valuable resources with multiple biotechnological applications in industries. Conclusion Fruit peels, coconut husk, groundnut shell and walnut shell supplemented into submerged culture of P. pulmonarius mycelia supported the production of EPS. EPS inhibited the growth of pathogenic microorganisms, displayed antioxidant activities against free radicals and sustained the growth of probiotics. Therefore, bioactivities of EPS make it a better candidate of natural products that is often used as a preservative agent in foods industries. Production of natural bioactive compounds in submerged culture of fungi with agro-wastes will therefore, reduce indiscriminate disposal of agro-wastes into the environment. Chemical constituents of agro-industrial residues can be utilized by fungal enzymes to produce biologically active compounds (secondary metabolites) in larger scale production. Declarations Author contribution statement Clement Olusola Ogidi: Conceived and designed the experiments; Performed the experiments; Analyzed and interpreted the data; Wrote the paper. Adaeze Mascot Ubaru, Temilayo Ladi-Lawal: Conceived and designed the experiments; Performed the experiments; Analyzed and interpreted the data. Oluwakemi Abike Thonda: Performed the experiments; Analyzed and interpreted the data; Contributed reagents, materials, analysis tools or data. Oluwatoyin Modupe Aladejana: Conceived and designed the experiments; Performed the experiments; Contributed reagents, materials, analysis tools or data. Olu Malomo: Conceived and designed the experiments; Wrote the paper. EPS-A: EPS produced by P. pulmonarius in submerged culture without agrowastes. EPS-B : EPS produced by P. pulmonarius in submerged culture with groundnut shell. EPS-C: EPS produced by P. pulmonarius in submerged culture with coconut husk. EPS-D: EPS produced by P. pulmonarius in submerged culture with pineapple peel.
|
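# Illustrative in-memory employee records (names are elided as <NAME> in the source).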
sample_employees = [
{
"employee_id": 153,
"name": "<NAME>",
"department": "Game Development",
"designation": "Developer",
"salary": 80000,
"sales": 750000
},
{
"employee_id": 293,
"name": "<NAME>",
"department": "Lead Protagonist",
"designation": "Acting",
"salary": 105000,
"sales": 550000
},
{
"employee_id": 904,
"name": "<NAME>",
"department": "Spiritual Mythology",
"designation": "Fantasy",
"salary": 15000,
"sales": 64000
},
{
"employee_id": 869,
"name": "<NAME>",
"department": "Lead Protagonist",
"designation": "Fantasy",
"salary": 185000,
"sales": 950000
},
{
"employee_id": 1083,
"name": "<NAME>",
"department": "Fictional Sorceress",
"designation": "Witcher III",
"salary": 55000,
"sales": 343000
}
]
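
# A small usage sketch over the records above. Field names are those defined in
# sample_employees; the 5x sales-to-salary threshold is an arbitrary illustrative choice.
top_seller = max(sample_employees, key=lambda e: e["sales"])
print(f'Top seller: {top_seller["name"]} ({top_seller["sales"]})')

high_ratio = [e["name"] for e in sample_employees if e["sales"] >= 5 * e["salary"]]
print("Sales at least 5x salary:", high_ratio)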
|
The U.S. Economy to Bounce Back in Second Quarter

PARTICIPANTS: Beacon Economics = Los Angeles, California; Conf. Board = Conference Board, New York, New York; Fannie Mae = Fannie Mae, Washington, D.C.; IHS = IHS Global Insight, Eddystone, Pennsylvania; GSU-EFC = Georgia State University, Economic Forecasting Center, Atlanta, Georgia; Moody's Economy = Moody's Economy.com, Westchester, Pennsylvania; Mortgage = Mortgage Bankers Association, Washington, D.C.; NAM = National Association of Manufacturers, Washington, D.C.; Northern Tr = Northern Trust Company, Chicago, Illinois; Perryman Gp = The Perryman Group, Waco, Texas; Royal Bank of Canada, Toronto, Ontario, Canada; SP UBS = UBS Bank, Salt Lake City, Utah; US Bank = U.S. Bank, Minneapolis, Minnesota; US Chamber = U.S. Chamber of Commerce, Washington, D.C.; Wells Fargo = Wells Fargo Bank, San Francisco, California.

The consensus expects the country's GDP growth rate to remain in the neighborhood of 2.3% well into the second quarter of 2016, even though real GDP declined (now at a negative 0.2%) in the first quarter of 2015. Specifically, Rajeev Dhawan of the Economic Forecasting Center at Georgia State University's J. Mack Robinson College of Business expects the U.S. economy to bounce back in the second quarter because of WOW. "The three components of WOW shaved off close to 2.5% of U.S. growth in the first quarter," Dhawan said. (WOW stands for weather, oil, and the world economy.) The GDP report showed clear damage from these three factors. Dhawan mentioned further in his report that unusually cold weather in the Northeast during the first quarter resulted in a reduction of nondurable consumption goods (to a negative 0.3%), spending on utilities (heating) increased, and overall gasoline savings were wiped away. "We've almost reached the bottom, with oil rig counts having dropped sharply with only a little bit to go," wrote Dhawan. "But prices will not reach the heights of $120 a barrel anytime soon. I expect oil to start creeping up to $70/barrel by year's end and stay in that range for the coming year," he added. Finally, the world economy factor weighed on real GDP due to the dragging recovery of China (now at 7%, down from double digits) and the European Central Bank's bond-buying program. The Chinese economy's slow-paced recovery affects the emerging markets because of supply chain connections. The Eurozone's challenge is related to a potential Greek rescue operation and the trillion-dollar liquidity injection (bond-buying program) by the European Central Bank, which results in negative government bond yields. Consequently, these factors led to a decline in exports (now at a 2.3% decline).

CONSUMERS
The improvements in employment and increases in consumers' personal disposable income positively affect consumption and, subsequently, growth in the economy. However, as Dhawan pointed out, the weather factor, although a temporary one, had a significant impact, along with the changes in crude oil prices (now at US$59/barrel).
|
// app/core/event/BugVersionFixHandler.ts
import { Event, Inject } from '@eggjs/tegg';
import { EggLogger } from 'egg';
import { PACKAGE_VERSION_ADDED } from './index';
import { BUG_VERSIONS } from '../../common/constants';
import { PackageManagerService } from '../service/PackageManagerService';
import { BugVersionService } from '../service/BugVersionService';
@Event(PACKAGE_VERSION_ADDED)
export class BugVersionFixHandler {
@Inject()
private readonly bugVersionService: BugVersionService;
@Inject()
private readonly packageManagerService: PackageManagerService;
@Inject()
private readonly logger: EggLogger;
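// Fired for every PACKAGE_VERSION_ADDED event; only the special bug-versions package
// (BUG_VERSIONS) is of interest. When it is republished, the latest bug-version config
// is loaded and the package caches it affects are cleaned; failures are logged, not rethrown.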
async handle(fullname: string) {
if (fullname !== BUG_VERSIONS) return;
try {
const bugVersion = await this.packageManagerService.getBugVersion();
if (!bugVersion) return;
await this.bugVersionService.cleanBugVersionPackageCaches(bugVersion);
} catch (e) {
e.message = `[BugVersionFixHandler] clean cache failed: ${e.message}`;
this.logger.error(e);
}
}
}
|
Main achievements and directions of scientific research in the field of foundation construction on sites composed of permafrost in the Russian Federation

Construction in the northern construction-climatic zone is associated with the solution of complex engineering problems. Chief among them is monitoring the temperature state of the cryolithozone, which depends on a combination of climatic, engineering-geological, hydrogeological, environmental and technogenic factors. This is all the more relevant in view of the ongoing degradation of permafrost. Technical solutions for the construction of foundations of buildings and structures built according to Principle I and Principle II are generalized and classified. At the same time, technical audit of design solutions and scientific support during the entire life cycle of facilities are required. In order to improve and implement this progressive direction in foundation construction, variants of design solutions for self-cooling, free-standing, factory-made columnar foundations are proposed and the sequence of work is detailed.

Permafrost soils under natural conditions occur in the northern regions of the Russian Federation, the United States of America, Canada, parts of the Scandinavian countries and Greenland, while high-altitude permafrost of an island character occurs in other countries of the world. On the territory of Russia, permafrost soils (the cryolithozone) occupy 65% of the country's area. The temperature of permafrost soils is known to range from -0.5°C to -10°C and, depending on this, soils are divided into low-temperature and high-temperature; hard-frozen (firmly cemented with ice, characterized by brittle fracture and almost incompressible) and plastic-frozen (cemented with ice, but having viscous properties and compressibility under external load). The temperature state of the cryolithozone is not stable. Natural factors that influence it include the instability of arctic ecosystems, solar radiation, surface albedo and the polar night. Human activity, in turn, affects the heat-balance characteristics of the ground surface layer in winter and summer. The consequences of natural and climatic phenomena, characterized by global warming, and of man-made processes are manifested in the form of degradation of the earth's cryolithozone. As a result of a comprehensive analysis of data from weather stations and geocryological monitoring stations, it was found that within a few decades the boundary of permafrost can move north by 200-500 km. At the same time, the northern lands (in particular in Russia) contain the main natural resources, the development of which is necessary to ensure the viability of the country's economy. In all countries, including Russia, two basic principles of the use of permafrost soils as the foundations of buildings and structures are applied (in other countries in a different version and interpretation): Principle I - the permafrost soils of the base are used in a frozen state, preserved during construction and during the entire period of operation of the structure; Principle II - the permafrost soils of the base are used in a thawed or thawing condition.
To implement Principle I of construction on permafrost soils (hard-frozen, low-temperature soils), the following basic engineering solutions have been provided and developed: arrangement of ventilated under-floor spaces or cold ground floors of buildings with natural or induced ventilation; laying pipes or ducts with forced ventilation in the base of the structure; application of ventilated foundation structures with forced ventilation; installation of seasonally operating cooling devices, mainly of deep action (even in plastic-frozen soils); and installation of heat shields that reduce the thermal impact of buildings and structures on the frozen ground. It is mandatory to take into account the depth of seasonal freezing and thawing of the soil (merging and non-merging permafrost).

The use of Principle II of construction on permafrost soils (usually plastic-frozen, high-temperature soils) is likewise a fairly complex engineering task; it requires in-depth thermophysical calculation and strict observance of the rules of work execution. Technical solutions used in construction practice include: preliminary (before erection of the building) artificial thawing of the base soils; replacement of ice-rich soils with thawed non-subsiding soil (sandy or coarse-grained); and increasing the depth of foundations so that they bear on rock or other low-compressibility soils.

Intensive development of the northern regions of Russia has been going on for several decades, and in the last 10-15 years the state and corporate investors have allocated very large funds for this purpose. One of the main directions for improving the quality of work in the design, construction and operation of buildings, structures and highways is technical audit of design solutions together with scientific support throughout the entire life cycle of the objects. The purpose of the technical audit is to assess the effectiveness of new technologies and equipment under the specific conditions of a given production task, to carry out a competitive evaluation of options against a set of heterogeneous criteria, to make a structural and parametric selection of the best options, to assess the payback period of investments and to predict risks. Scientific support includes systematization of research, observations and results of geocryological and geotechnical monitoring, as well as methods of mathematical modelling of the thermodynamic and thermal interaction of objects and geotechnical systems with permafrost soils.

A progressive direction for improving construction on permafrost soils is the use of self-cooling support systems (foundations). The supporting element partially rises above the ground, and openings are provided in the structure; cold air descends to the base by convection and transfers its cold to the ground (a simplified sketch of this criterion is given below). The use of self-cooling support systems has a number of advantages over other systems:
- the cooling and supporting structure are combined into a single unit;
- the cost of cooling systems is reduced, since no special self-cooling devices (SSCD) are installed separately from the foundations;
- deep soil layers are cooled, as opposed to surface cooling;
- higher durability, reliability and aesthetics;
- ecological safety.

In the development of this direction of foundation construction on permafrost soils, the construction of a columnar self-cooling foundation has been proposed and is currently being tested. Figure 1 shows variants of the design solution of self-cooling supports for buildings and structures.
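The following is a rough sketch of the ventilation criterion mentioned above, not a heat-transfer model: it merely assumes that convective cooling through the foundation channels is useful only while the outdoor air is colder than the soil at the top of the permafrost layer, i.e. essentially in winter.

// Simplified sign check (assumption: no actual convection or heat-flux model).
interface CoolingDecision {
  ventilate: boolean;
  drivingDifferenceC: number; // soil temperature minus air temperature
}

function shouldVentilateChannels(outdoorAirC: number, soilAtPermafrostTableC: number): CoolingDecision {
  const drivingDifferenceC = soilAtPermafrostTableC - outdoorAirC;
  return {
    // Open the channels only when cold outdoor air can actually extract heat from the base.
    ventilate: outdoorAirC < 0 && drivingDifferenceC > 0,
    drivingDifferenceC,
  };
}

// Example: air at -25°C against soil at -1°C gives a 24°C driving difference, so ventilate.
console.log(shouldVentilateChannels(-25, -1));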
In this case it is a free-standing columnar reinforced-concrete foundation of factory manufacture. Channels and cavities in the foundation are arranged so as to provide natural (and, if necessary, forced) ventilation of the internal surfaces of the structure and, above all, of the ground surface at the level of the upper boundary of the permafrost layer of the base.

The time available for the work is limited. The start of excavation (development of the pits for the free-standing foundations) must coincide with the beginning of the period when the average daily outdoor temperature falls below 0°C. During this period the soil has thawed over the entire depth of the active layer, and the upper "crust" of already partially frozen soil (where present) is insignificant and does not prevent excavation with conventional machinery. The pit is excavated to its full depth, down to the level of the foundation sole. The base is prepared in the traditional way. The prefabricated foundation, with a factory-applied layer of waterproofing, is mounted in the design position. The channel openings are located above ground level, and the foundation is installed so that the inlet and outlet openings of the ventilation channels take account of the direction of the prevailing winds. Then, without interruption of the works, the pit is backfilled with soil compacted layer by layer. The work must be performed as quickly as possible, with the necessary machines, mechanisms, materials, structures and labour resources provided in advance. This technology avoids excessive thermal effects on the permafrost soil and, after completion of the work, ensures the penetration of cold air into the foundation to cool the base.

Under both Principle I and Principle II there is a need to improve the system of temperature monitoring of the permafrost soils at the base of buildings and structures under construction. It is necessary to control, and sometimes to manage, the temperature regime of the soils during the operation of the objects. A modern measuring system allows temperature conditions to be monitored continuously throughout the operation of the objects. The set of equipment includes thermistor strings ("thermocos") installed in thermometric wells, controllers, loggers, and personal computers for recording and processing data. The equipment operates continuously for several years. The temperature values measured by the thermistor-string sensors are recorded on a MicroSD memory card, and the data file is collected by transferring it as an archive over a radio communication link to a personal computer (an illustrative processing sketch is given below).

Thus, as a result of many years of fundamental research in the Russian Federation, practically any engineering problem of construction in the northern construction-climatic zone can be solved.
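As an illustration of how the logged readings from the monitoring system described above might be post-processed on the personal computer, the sketch below filters thermistor-string records against a design soil temperature. The record layout and the threshold are assumptions made for the example; the article does not specify the loggers' actual file format.

// Illustrative sketch only; the field layout below is an assumption.
interface ThermistorReading {
  wellId: string;       // thermometric well identifier
  depthM: number;       // sensor depth below ground surface, m
  temperatureC: number; // measured soil temperature
  timestamp: string;    // ISO 8601 time of measurement
}

// Flags readings warmer than the design soil temperature, e.g. to warn that the
// frozen state assumed under Principle I may be at risk at a given depth.
function findWarmReadings(readings: ThermistorReading[], designTemperatureC: number): ThermistorReading[] {
  return readings.filter(r => r.temperatureC > designTemperatureC);
}

const sample: ThermistorReading[] = [
  { wellId: "TW-1", depthM: 3, temperatureC: -2.4, timestamp: "2020-01-15T12:00:00Z" },
  { wellId: "TW-1", depthM: 6, temperatureC: -0.3, timestamp: "2020-01-15T12:00:00Z" },
];

// With an assumed design temperature of -1.0°C, only the 6 m sensor is flagged.
console.log(findWarmReadings(sample, -1.0));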
|