title: string (8 to 300 characters)
abstract: string (0 to 10k characters)
Full-System Simulation of Java Workloads with RISC-V and the Jikes Research Virtual Machine
Managed languages such as Java, JavaScript or Python account for a large portion of workloads, both in cloud data centers and on mobile devices. It is therefore unsurprising that there is an interest in hardware-software co-design for these languages. However, existing research infrastructure is often unsuitable for this kind of research: managed languages are sensitive to fine-grained interactions that are not captured by high-level architectural models, yet are also too long-running and irregular to be simulated using cycle-accurate software simulators. Open-source hardware based on the RISC-V ISA provides an opportunity to solve this problem, by running managed workloads on RISC-V systems in FPGA-based full-system simulation. This approach achieves both the accuracy and simulation speeds required for managed workloads, while enabling modification and design-space exploration for the underlying hardware. A crucial requirement for this hardware-software research is a managed runtime that can be easily modified. The Jikes Research Virtual Machine (JikesRVM) is a Java Virtual Machine that was developed specifically for this purpose, and has become the gold standard in managed-language research. In this paper, we describe our experience of porting JikesRVM to the RISC-V infrastructure. We discuss why this combined setup is necessary, and how it enables hardware-software research for managed languages that was infeasible with previous infrastructure.
Tribological Behaviour of Al-6061 / SiC Metal Matrix Composite by Taguchi's Techniques
The tribological behaviour of aluminium alloy Al-6061 reinforced with silicon carbide particles (10% and 15% weight percentage of SiCp), fabricated by the stir casting process, was investigated. The wear and frictional properties of the metal matrix composites were studied by performing dry sliding wear tests using a pin-on-disc wear tester. Experiments were conducted based on the plan of experiments generated through Taguchi's technique, and an L9 orthogonal array was selected for analysis of the data. The influence of applied load, sliding speed and sliding distance on wear rate, as well as on the coefficient of friction during the wearing process, was investigated using ANOVA, and regression equations for each response were developed for both the 10% and 15% SiC reinforced Al-6061 MMCs. The objective of the model was chosen as the 'smaller the better' characteristic to analyse dry sliding wear resistance. Results show that sliding distance has the highest influence, followed by load and sliding speed. Finally, confirmation tests were carried out to verify the experimental results, and scanning electron microscopy was performed on the worn surfaces.
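As a pointer to how the 'smaller the better' objective above is usually computed, here is a minimal sketch of Taguchi's signal-to-noise ratio for one trial row of an L9 array; the wear-rate replicates are hypothetical values, not data from the study.

```python
import numpy as np

# Hypothetical wear-rate replicates (mm^3/m) for one L9 trial row.
y = np.array([0.0042, 0.0045, 0.0040])

# Taguchi signal-to-noise ratio for the "smaller the better" characteristic:
# S/N = -10 * log10(mean(y^2)); a larger S/N means lower, more stable wear.
sn_ratio = -10.0 * np.log10(np.mean(y ** 2))
print(f"S/N (smaller the better): {sn_ratio:.2f} dB")
```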
Compression Artifacts Removal Using Convolutional Neural Networks
This paper shows that it is possible to train large and deep convolutional neural networks (CNNs) for JPEG compression artifact reduction, and that such networks can provide significantly better reconstruction quality than the smaller networks used previously, as well as other state-of-the-art methods. We were able to train networks with 8 layers in a single step and in relatively short time by combining residual learning, skip architecture, and symmetric weight initialization. We provide further insights into convolutional networks for JPEG artifact reduction by evaluating three different objectives, generalization with respect to training dataset size, and generalization with respect to JPEG quality level.
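The residual-learning idea mentioned above can be illustrated with a toy PyTorch model that predicts the compression artifact and subtracts it from the input. The layer widths, kernel sizes, and the omission of the paper's skip architecture and symmetric weight initialization are simplifying assumptions of this sketch, not the authors' architecture.

```python
import torch
import torch.nn as nn

class ArtifactRemovalCNN(nn.Module):
    """Toy residual CNN: the network predicts the compression artifact
    (residual), which is subtracted from the degraded input."""
    def __init__(self, channels=1, width=64, depth=8):
        super().__init__()
        layers = [nn.Conv2d(channels, width, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(width, channels, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        # Residual learning: output = input minus the predicted artifact.
        return x - self.body(x)

model = ArtifactRemovalCNN()
restored = model(torch.randn(1, 1, 64, 64))  # dummy JPEG-degraded patch
```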
A Passive Elastic Ankle Exoskeleton Using Controlled Energy Storage and Release to Reduce the Metabolic Cost of Walking
Purely passive devices (e.g. dynamic ankle-foot orthoses (DAFOs)) can store and release elastic energy in rigid, non-hinged frames to assist walking without actuation from motors. This lightweight, simple approach has been shown to cause small increases in both walking speed and economy poststroke [6-8]. However, there are downsides to current DAFO designs. First, rigid, non-hinged DAFOs restrict full ankle joint range of motion, allowing only limited rotation in the sagittal plane. Second, and perhaps more crucially, current DAFOs do not allow free ankle rotation during swing, making it difficult to dorsiflex in preparation for heel strike. Inability to dorsiflex freely during swing could impose a significant metabolic penalty, especially in healthy populations [9].
A low energy crystal-less double-FSK transceiver for wireless body-area-network
An energy-efficient crystal-less double-FSK transceiver for wireless body-area-network (WBAN) is implemented in 0.18 μm CMOS technology with a 1 V supply. The injection-locking digitally-controlled oscillator (IL-DCO) replaces the crystal oscillator (XO), which reduces energy consumption and system cost. It can detect whether injection locking occurs, and it calibrates the frequency drift of the DCO to within 100 kHz accuracy over 100 degrees C of temperature variation. The scalable divider-based double-FSK transmitter eliminates the need for a power-consuming voltage-controlled oscillator (VCO) and direct digital synthesizer (DDS). The frequency calibration with the IL-DCO and the transmitter consume 1 mW and 2 mW, respectively, at a data rate of 10 Mb/s, corresponding to an energy consumption of 0.2 nJ per transmitted bit.
Neurofeedback-based functional near-infrared spectroscopy upregulates motor cortex activity in imagined motor tasks.
Neurofeedback is a method for using neural activity displayed on a computer to regulate one's own brain function and has been shown to be a promising technique for training individuals to interact with brain-machine interface applications such as neuroprosthetic limbs. The goal of this study was to develop a user-friendly functional near-infrared spectroscopy (fNIRS)-based neurofeedback system to upregulate neural activity associated with motor imagery, which is frequently used in neuroprosthetic applications. We hypothesized that fNIRS neurofeedback would enhance activity in motor cortex during a motor imagery task. Twenty-two participants performed active and imaginary right-handed squeezing movements using an elastic ball while wearing a 98-channel fNIRS device. Neurofeedback traces representing localized cortical hemodynamic responses were graphically presented to participants in real time. Participants were instructed to observe this graphical representation and use the information to increase signal amplitude. Neural activity was compared during active and imaginary squeezing with and without neurofeedback. Active squeezing resulted in activity localized to the left premotor and supplementary motor cortex, and activity in the motor cortex was found to be modulated by neurofeedback. Activity in the motor cortex was also shown in the imaginary squeezing condition only in the presence of neurofeedback. These findings demonstrate that real-time fNIRS neurofeedback is a viable platform for brain-machine interface applications.
Fair Client Puzzles from the Bitcoin Blockchain
Client puzzles have been proposed as a mechanism for proving legitimate intentions by providing “proofs of work”, which can be applied to discourage malicious usage of resources. A typical problem of puzzle constructions is the difference in expected solving time on different computing platforms. We call puzzles which can be solved independently of client computing resources fair client puzzles. We propose a construction for client puzzles requiring widely distributed computational effort for their solution. These puzzles can be solved using the mining process of Bitcoin, or similar cryptocurrencies. Adapting existing definitions, we show that our puzzle construction satisfies formal requirements of client puzzles under reasonable assumptions. We describe a way of transforming our client puzzles for use in denial of service scenarios and demonstrate a practical construction.
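Since the construction relies on Bitcoin's mining process, a minimal sketch of the underlying proof-of-work check may help: a block header is hashed twice with SHA-256 and compared against a difficulty target. The header bytes and target below are placeholders; a real puzzle would bind the header to the client's challenge and derive the target from the desired difficulty.

```python
import hashlib

def block_hash(header_bytes: bytes) -> int:
    """Bitcoin-style double SHA-256 of a (hypothetical) 80-byte block header."""
    digest = hashlib.sha256(hashlib.sha256(header_bytes).digest()).digest()
    # Bitcoin compares the hash as a little-endian 256-bit integer.
    return int.from_bytes(digest, "little")

def meets_target(header_bytes: bytes, target: int) -> bool:
    """A block is a valid proof of work iff its hash does not exceed the target."""
    return block_hash(header_bytes) <= target

# Illustrative only: a dummy all-zero header will almost certainly fail this target.
example_target = 2 ** 224
print(meets_target(b"\x00" * 80, example_target))
```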
Time distortion when users at-risk for social media addiction engage in non-social media tasks.
BACKGROUND There is a growing concern over the addictiveness of Social Media use. Additional representative indicators of impaired control are needed in order to distinguish presumed social media addiction from normal use. AIMS (1) To examine the existence of time distortion during non-social media use tasks that involve social media cues among those who may be considered at-risk for social media addiction. (2) To examine the usefulness of this distortion for at-risk vs. low/no-risk classification. METHOD We used a task that prevented Facebook use and invoked Facebook reflections (survey on self-control strategies) and subsequently measured estimated vs. actual task completion time. We captured the level of addiction using the Bergen Facebook Addiction Scale in the survey, and we used a common cutoff criterion to classify people as at-risk vs. low/no-risk of Facebook addiction. RESULTS The at-risk group presented significant upward time estimate bias and the low/no-risk group presented significant downward time estimate bias. The bias was positively correlated with Facebook addiction scores. It was efficacious, especially when combined with self-reported estimates of extent of Facebook use, in classifying people to the two categories. CONCLUSIONS Our study points to a novel, easy to obtain, and useful marker of at-risk for social media addiction, which may be considered for inclusion in diagnosis tools and procedures.
A new ZVS LCL-resonant push-pull DC-DC converter topology
A new LCL resonant dc–dc converter topology is presented in which the resonant CL components are located after the output rectifier diodes. The push–pull converter topology is suitable for unregulated low-voltage to high-voltage power conversion, as in battery-powered systems, where input currents can exceed input voltages by an order of magnitude. The resonant circuit operates at twice the switching frequency, allowing for small resonant components. The MOSFET primary switches operate under zero-voltage switching (ZVS) conditions due to commutation of the transformer magnetizing current and the snubbing effect of the inherent drain-source capacitance. Output rectifier turn-off is effectively snubbed by the resonant capacitor. Laboratory tests show 93% efficiency at 12-V 160-A input, 235-V 1.8-kW output. Surge capability of up to 5 kW for 1 s has been tested. Circuit simulations and experimental results are presented and are shown to have excellent agreement with fundamental mode analysis.
Tasks and scenario-based evaluation of information visualization techniques
Usability evaluation of an information visualization technique can only be done by the joint evaluation of both the visual representation and the interaction techniques. This work proposes task models as a key element for carrying out such evaluations in a structured way. We base our work on a taxonomy abstracting from rendering functions supported by information visualization techniques. CTTE is used to model these abstract visual tasks as well as to generate scenarios from this model for evaluation purposes. We conclude that the use of task models allows generating test scenarios which are more effective than informal and unstructured evaluations.
Personalized Dialogue Generation with Diversified Traits
Endowing a dialogue system with particular personality traits is essential to deliver more human-like conversations. However, due to the challenge of embodying personality via language expression and the lack of large-scale persona-labeled dialogue data, this research problem is still far from well-studied. In this paper, we investigate the problem of incorporating explicit personality traits in dialogue generation to deliver personalized dialogues. To this end, firstly, we construct PersonalDialog, a large-scale multi-turn dialogue dataset containing various traits from a large number of speakers. The dataset consists of 20.83M sessions and 56.25M utterances from 8.47M speakers. Each utterance is associated with a speaker who is marked with traits like Age, Gender, Location, Interest Tags, etc. Several anonymization schemes are designed to protect the privacy of each speaker. This large-scale dataset will facilitate not only the study of personalized dialogue generation, but also other research in sociolinguistics and social science. Secondly, to study how personality traits can be captured and addressed in dialogue generation, we propose persona-aware dialogue generation models within the sequence-to-sequence learning framework. Explicit personality traits (structured as key-value pairs) are embedded using a trait fusion module. During the decoding process, two techniques, namely persona-aware attention and persona-aware bias, are devised to capture and address trait-related information. Experiments demonstrate that our model is able to address proper traits in different contexts. Case studies also show interesting results for this challenging research problem.
Technological Learning and Innovation in China in the Context of Globalization
A team of China- and U.S.-based geographers develops the theoretical concept of “learning field” to advance the study of technological innovation through networking under conditions of ongoing globalization. The concept is applied in a survey of ca. 100 firms in the Zhengzhou Economic and Technological Development Zone, located in a relatively underdeveloped region of China. The findings emphasize the different patterns and challenges confronting companies of differing size, property rights, and R&D capacities, as well as the variable extent to which technological learning is based on local versus global linkages and networking. Key elements involved in successful technological upgrading (in addition to networking) are identified, including market structure, competitive strategies, and capital. Also examined are the roles played by geographic, relational, and institutional factors in providing opportunities for learning and cooperation among firms in an industrial district. Journal of Economic Literature, Classification Numbers: D21, D83, O31, P20. 8 figures, 3 tables, 67 references.
Neural signature of fictive learning signals in a sequential investment task.
Reinforcement learning models now provide principled guides for a wide range of reward learning experiments in animals and humans. One key learning (error) signal in these models is experiential and reports ongoing temporal differences between expected and experienced reward. However, these same abstract learning models also accommodate the existence of another class of learning signal that takes the form of a fictive error encoding ongoing differences between experienced returns and returns that "could-have-been-experienced" if decisions had been different. These observations suggest the hypothesis that, for all real-world learning tasks, one should expect the presence of both experiential and fictive learning signals. Motivated by this possibility, we used a sequential investment game and fMRI to probe ongoing brain responses to both experiential and fictive learning signals generated throughout the game. Using a large cohort of subjects (n = 54), we report that fictive learning signals strongly predict changes in subjects' investment behavior and correlate with fMRI signals measured in dopaminoceptive structures known to be involved in valuation and choice.
GPC Temperature Control of A Simulation Model Infant-Incubator and Practice with Arduino Board
The thermal environment surrounding preterm neonates in closed incubators is regulated via an air temperature control mode. At present, these control modes do not take into account all the thermal parameters involved in an incubator model, such as the thermal parameters of preterm neonates (birth weight < 1000 grams). The objective of this work is to design and validate a generalized predictive controller (GPC) that takes into account the closed incubator model as well as the premature newborn model. We then implemented this control law on a DRAGER neonatal incubator, with and without a newborn, using a microcontroller board. Methods: The design of the predictive control law is based on a prediction model. The developed model allows us to take into account all the thermal exchanges (radiative, conductive, convective and evaporative) and the various interactions between the incubator environment and the premature newborn. Results: The predictive control law and the simulation model developed in the Matlab/Simulink environment make it possible to evaluate the quality of the air temperature control mode to which the newborn is exposed. The results of the simulation and implementation of the air temperature control inside the incubator (with and without a newborn) demonstrate the feasibility and effectiveness of the proposed GPC controller compared with a proportional–integral–derivative (PID) controller. Keywords—Incubator; neonatal; model; temperature; Arduino; GPC
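For orientation only, the sketch below simulates a toy first-order incubator air-temperature model under a horizon-one predictive controller. All coefficients are invented for illustration and the controller is a drastic simplification of GPC; it is not the model or control law identified in the paper.

```python
import numpy as np

# Toy first-order discrete thermal model of incubator air temperature
# (illustrative coefficients, not identified from the paper):
#   T[k+1] = a*T[k] + b*u[k] + (1 - a)*T_ambient
a, b, T_ambient = 0.98, 0.05, 22.0      # per-sample dynamics, heater gain, room temp (degC)
T_ref, u_max = 34.0, 10.0               # setpoint and heater saturation

T = 25.0
for k in range(200):
    # Horizon-1 "predictive" control: pick u so the one-step prediction hits T_ref,
    # then clip to the actuator limits.
    u = (T_ref - a * T - (1 - a) * T_ambient) / b
    u = float(np.clip(u, 0.0, u_max))
    T = a * T + b * u + (1 - a) * T_ambient
print(f"temperature after 200 steps: {T:.2f} degC")
```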
Towards Bidirectional Hierarchical Representations for Attention-Based Neural Machine Translation
This paper proposes a hierarchical attentional neural translation model which focuses on enhancing source-side hierarchical representations by covering both local and global semantic information using a bidirectional tree-based encoder. To maximize the predictive likelihood of target words, a weighted variant of an attention mechanism is used to balance the attentive information between lexical and phrase vectors. Using a tree-based rare word encoding, the proposed model is extended to sub-word level to alleviate the out-of-vocabulary (OOV) problem. Empirical results reveal that the proposed model significantly outperforms sequence-to-sequence attention-based and tree-based neural translation models in English-Chinese translation tasks.
Duration modeling of Indian languages Hindi and Telugu
This paper reports a preliminary attempt at data-driven modeling of segmental (phoneme) duration for two Indian languages, Hindi and Telugu. Classification and Regression Tree (CART) based data-driven duration modeling for segmental duration prediction is presented. A number of features are proposed and their usefulness and relative contribution in segmental duration prediction is assessed. Objective evaluation of the duration models, by root mean squared prediction error (RMSE) and correlation between actual and predicted durations, is performed. The duration models developed have been implemented in an Indian-language Text-to-Speech synthesis system [1] being developed within the Festival framework [2].
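A minimal sketch of the CART-based approach and the two objective metrics named above (RMSE and correlation between actual and predicted durations) might look as follows, with entirely synthetic features and durations standing in for real phone-level data.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Hypothetical feature matrix (e.g. phone identity, position in word/phrase,
# stress, neighbouring-phone classes, encoded numerically) and durations in ms.
X = rng.integers(0, 10, size=(2000, 6)).astype(float)
y = 60 + 8 * X[:, 0] - 3 * X[:, 3] + rng.normal(0, 10, size=2000)

train, test = slice(0, 1500), slice(1500, 2000)
cart = DecisionTreeRegressor(min_samples_leaf=20).fit(X[train], y[train])
pred = cart.predict(X[test])

rmse = float(np.sqrt(np.mean((pred - y[test]) ** 2)))   # objective evaluation metrics
corr = float(np.corrcoef(pred, y[test])[0, 1])
print(f"RMSE = {rmse:.1f} ms, correlation = {corr:.2f}")
```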
Effect of volume expansion on the anginal threshold.
The clinical and hemodynamic effects of rapid atrial pacing were studied in 13 patients with coronary artery disease and four normal subjects before and following acute expansion of blood volume with low molecular weight dextran. Five patients developed angina during the initial pacing period. In two of these five patients angina recurred with infusion alone, and all five of them experienced pain during the second pacing period. The anginal threshold averaged 5 min (range, 1 to 8 min) before infusion and 1.8 min (range, 0 to 5 min) following infusion. Four patients with coronary artery disease remained free of pain during the first pacing period, one of these developed pain with infusion alone, and all developed pain during pacing following infusion, the anginal threshold being 2 min (range, 0 to 3 min). The remaining four patients with coronary artery disease did not develop angina during either pacing period. Administration of dextran was accompanied by an increase in left ventricular end-diastolic pressure in all patients during sinus rhythm and during atrial pacing. Although left ventricular volume was not determined, changes in left ventricular end-diastolic pressure were interpreted as showing directional changes in left ventricular volume. Analysis of the factors affecting myocardial oxygen requirements during the two pacing periods indicated that heart rate, systemic arterial pressure, and maximal rate of rise of left ventricular pressure were similar. Although the tension-time index
Fusing Bird View LIDAR Point Cloud and Front View Camera Image for Deep Object Detection
We propose a new method for fusing a LIDAR point cloud and camera-captured images in the deep convolutional neural network (CNN). The proposed method constructs a new layer called non-homogeneous pooling layer to transform features between bird view map and front view map. The sparse LIDAR point cloud is used to construct the mapping between the two maps. The pooling layer allows efficient fusion of the bird view and front view features at any stage of the network. This is favorable for the 3D-object detection using camera-LIDAR fusion in autonomous driving scenarios. A corresponding deep CNN is designed and tested on the KITTI[1] bird view object detection dataset, which produces 3D bounding boxes from the bird view map. The fusion method shows particular benefit for detection of pedestrians in the bird view compared to other fusion-based object detection networks.
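A rough sketch of the point-driven correspondence such a pooling layer relies on: each LIDAR return can be indexed both in the front-view image (pinhole projection) and in a bird-view ground grid. The intrinsics and grid parameters below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def point_to_views(points, fx=720.0, fy=720.0, cx=620.0, cy=190.0,
                   forward_range=(0.0, 70.0), lateral_range=(-40.0, 40.0), res=0.1):
    """Map LIDAR points (N, 3) given in camera-aligned coordinates
    (x right, y down, z forward) to front-view pixels and bird-view grid cells.
    All intrinsics and grid parameters are illustrative assumptions."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    u = fx * x / z + cx                                   # front view: pinhole projection
    v = fy * y / z + cy
    row = ((z - forward_range[0]) / res).astype(int)      # bird view: ground-plane grid
    col = ((x - lateral_range[0]) / res).astype(int)
    return np.stack([u, v], axis=1), np.stack([row, col], axis=1)

pts = np.array([[1.5, 0.2, 12.0], [-3.0, 0.1, 25.0]])    # two sparse LIDAR returns
front_px, bird_cells = point_to_views(pts)
```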
A Generic Coordinate Descent Framework for Learning from Implicit Feedback
In recent years, interest in recommender research has shifted from explicit feedback towards implicit feedback data. A diversity of complex models has been proposed for a wide variety of applications. Despite this, learning from implicit feedback is still computationally challenging. So far, most work relies on stochastic gradient descent (SGD) solvers which are easy to derive, but in practice challenging to apply, especially for tasks with many items. For the simple matrix factorization model, an efficient coordinate descent (CD) solver has been previously proposed. However, efficient CD approaches have not been derived for more complex models. In this paper, we provide a new framework for deriving efficient CD algorithms for complex recommender models. We identify and introduce the property of k-separable models. We show that k-separability is a sufficient property to allow efficient optimization of implicit recommender problems with CD. We illustrate this framework on a variety of state-of-the-art models including factorization machines and Tucker decomposition. To summarize, our work provides the theory and building blocks to derive efficient implicit CD algorithms for complex recommender models.
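To make the flavour of element-wise coordinate descent concrete, here is a naive dense sketch for weighted matrix factorization on implicit feedback. It only demonstrates the per-coordinate update rule; the paper's contribution, deriving efficient CD updates for k-separable models at scale, is not reproduced here.

```python
import numpy as np

def cd_weighted_mf(R, W, k=8, lam=0.1, iters=20, seed=0):
    """Naive element-wise coordinate descent for weighted matrix factorization,
    minimising sum_ui W[u,i] * (R[u,i] - P[u] @ Q[i])^2 + lam * (|P|^2 + |Q|^2)."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    P = 0.1 * rng.standard_normal((n_users, k))
    Q = 0.1 * rng.standard_normal((n_items, k))
    for _ in range(iters):
        for A, B, M, Wm in ((P, Q, R, W), (Q, P, R.T, W.T)):   # users, then items
            pred = A @ B.T
            for f in range(k):
                for u in range(A.shape[0]):
                    # Closed-form update of a single coordinate A[u, f].
                    num = np.sum(Wm[u] * B[:, f] * (M[u] - pred[u] + A[u, f] * B[:, f]))
                    den = np.sum(Wm[u] * B[:, f] ** 2) + lam
                    new = num / den
                    pred[u] += (new - A[u, f]) * B[:, f]        # keep predictions in sync
                    A[u, f] = new
    return P, Q

# Tiny implicit-feedback example: 1 = observed interaction; weights favour observed cells.
R = np.array([[1, 0, 1, 0], [0, 1, 0, 1], [1, 1, 0, 0]], dtype=float)
W = 1.0 + 9.0 * R
P, Q = cd_weighted_mf(R, W, k=2)
print(np.round(P @ Q.T, 2))
```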
Music Similarity Measures: What's the use?
Electronic Music Distribution (EMD) requires robust, automatically extracted music descriptors. We introduce a timbral similarity measure for comparing music titles. This measure is based on a Gaussian model of cepstrum coefficients. We describe the timbre extractor and the corresponding timbral similarity relation. We describe experiments in assessing the quality of the similarity relation, and show that the measure is able to yield interesting similarity relations, in particular when used in conjunction with other similarity relations. We illustrate the use of the descriptor in several EMD applications developed in the context of the Cuidado European project.
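A minimal sketch of the Gaussian timbre-model idea: fit one Gaussian to each track's cepstral coefficients and compare tracks with a symmetrised KL divergence. The random matrices below stand in for real MFCC frames, and the exact distance used in the paper may differ.

```python
import numpy as np

def gaussian_stats(mfcc):
    """Fit a single full-covariance Gaussian to a (n_frames, n_coeffs) matrix
    of cepstral coefficients for one track."""
    mu = mfcc.mean(axis=0)
    cov = np.cov(mfcc, rowvar=False) + 1e-6 * np.eye(mfcc.shape[1])
    return mu, cov

def symmetric_kl(stats_a, stats_b):
    """Symmetrised KL divergence between two Gaussians: one common choice of
    timbral distance between tracks (smaller = more similar)."""
    def kl(p, q):
        mu_p, cov_p = p
        mu_q, cov_q = q
        d = len(mu_p)
        inv_q = np.linalg.inv(cov_q)
        diff = mu_q - mu_p
        return 0.5 * (np.trace(inv_q @ cov_p) + diff @ inv_q @ diff - d
                      + np.log(np.linalg.det(cov_q) / np.linalg.det(cov_p)))
    return kl(stats_a, stats_b) + kl(stats_b, stats_a)

# Hypothetical 13-coefficient cepstra for two tracks.
rng = np.random.default_rng(1)
track_a = gaussian_stats(rng.standard_normal((500, 13)))
track_b = gaussian_stats(rng.standard_normal((500, 13)) + 0.5)
print(f"timbral distance: {symmetric_kl(track_a, track_b):.2f}")
```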
Characterizing Data Structures for Volatile Forensics
Volatile memory forensic tools can extract valuable evidence from latent data structures present in memory dumps. However, current techniques are generally limited by a lack of understanding of the underlying data without the use of expert knowledge. In this paper, we characterize the nature of such evidence by using deep analysis techniques to better understand the life-cycle and recoverability of latent program data in memory. We have developed Cafegrind, a tool that can systematically build an object map and track the use of data structures as a program is running. Statistics collected by our tool can show which data structures are the most numerous, which structures are the most frequently accessed and provide summary statistics to guide forensic analysts in the evidence gathering process. As programs grow increasingly complex and numerous, the ability to pinpoint specific evidence in memory dumps will be increasingly helpful. Cafegrind has been tested on a number of real-world applications and we have shown that it can successfully map up to 96% of heap accesses.
STATISTICAL METHODS FOR ASSESSING AGREEMENT BETWEEN TWO METHODS OF CLINICAL MEASUREMENT
In clinical measurement comparison of a new measurement technique with an established one is often needed to see whether they agree sufficiently for the new to replace the old. Such investigations are often analysed inappropriately, notably by using correlation coefficients. The use of correlation is misleading. An alternative approach, based on graphical techniques and simple calculations, is described, together with the relation between this analysis and the assessment of repeatability.
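The core of this graphical approach (often called a Bland-Altman analysis) reduces to simple calculations on the paired differences: the mean difference (bias) and the 95% limits of agreement. A short sketch with hypothetical paired measurements:

```python
import numpy as np

# Paired measurements of the same quantity by two methods (hypothetical values).
method_a = np.array([102.0, 98.5, 110.2, 95.1, 101.7, 108.3, 99.0, 104.4])
method_b = np.array([100.5, 99.2, 108.8, 96.0, 100.1, 109.5, 97.8, 103.0])

diff = method_a - method_b
bias = diff.mean()                         # mean difference between the methods
loa = 1.96 * diff.std(ddof=1)              # half-width of the 95% limits of agreement
print(f"bias = {bias:.2f}, limits of agreement = ({bias - loa:.2f}, {bias + loa:.2f})")
```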
3D Human Pose Estimation With 2D Marginal Heatmaps
Automatically determining three-dimensional human pose from monocular RGB image data is a challenging problem. The two-dimensional nature of the input results in intrinsic ambiguities which make inferring depth particularly difficult. Recently, researchers have demonstrated that the flexible statistical modelling capabilities of deep neural networks are sufficient to make such inferences with reasonable accuracy. However, many of these models use coordinate output techniques which are memory-intensive, not differentiable, and/or do not spatially generalise well. We propose improvements to 3D coordinate prediction which avoid the aforementioned undesirable traits by predicting 2D marginal heatmaps under an augmented soft-argmax scheme. Our resulting model, MargiPose, produces visually coherent heatmaps whilst maintaining differentiability. We are also able to achieve state-of-the-art accuracy on publicly available 3D human pose estimation data.
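The soft-argmax operation at the heart of this scheme is easy to state: softmax-normalise a marginal heatmap and take the expected coordinate, which keeps the estimate differentiable. The numpy sketch below is a standalone illustration of that operation, not the MargiPose network itself.

```python
import numpy as np

def soft_argmax_1d(marginal, axis_coords):
    """Differentiable coordinate estimate from a 1-D marginal heatmap:
    softmax-normalise, then take the expected coordinate."""
    weights = np.exp(marginal - marginal.max())
    weights /= weights.sum()
    return float(np.sum(weights * axis_coords))

# Toy 2-D heatmap: take marginals over rows/columns, then soft-argmax each one.
heatmap = np.zeros((64, 64))
heatmap[40, 22] = 8.0                              # peak near (row=40, col=22)
xs = np.arange(64, dtype=float)
x_est = soft_argmax_1d(heatmap.sum(axis=0), xs)    # marginal over columns -> x
y_est = soft_argmax_1d(heatmap.sum(axis=1), xs)    # marginal over rows -> y
print(f"estimated (x, y) = ({x_est:.1f}, {y_est:.1f})")
```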
Reuters Tracer: A Large Scale System of Detecting & Verifying Real-Time News Events from Twitter
News professionals are facing the challenge of discovering news from more diverse and unreliable information in the age of social media. More and more news events break on social media first and are picked up by news media subsequently. The recent Brussels attack is such an example. At Reuters, a global news agency, we have observed the necessity of providing a more effective tool that can help our journalists to quickly discover news on social media, verify it and then inform the public. In this paper, we describe Reuters Tracer, a system for sifting through all the noise to detect news events on Twitter and assess their veracity. We disclose the architecture of our system and discuss the various design strategies that facilitate the implementation of machine learning models for noise filtering and event detection. These techniques have been implemented at large scale and have successfully discovered breaking news faster than traditional journalism.
Structural Graph Matching Using the EM Algorithm and Singular Value Decomposition
This paper describes an efficient algorithm for inexact graph matching. The method is purely structural, that is to say, it uses only the edge or connectivity structure of the graph and does not draw on node or edge attributes. We make two contributions. Commencing from a probability distribution for matching errors, we show how the problem of graph matching can be posed as maximum-likelihood estimation using the apparatus of the EM algorithm. Our second contribution is to cast the recovery of correspondence matches between the graph nodes in a matrix framework. This allows us to efficiently recover correspondence matches using singular value decomposition. We experiment with the method on both real-world and synthetic data. Here, we demonstrate that the method offers comparable performance to more computationally demanding methods. Index Terms—Inexact graph matching, EM algorithm, matrix factorization, mixture models, Delaunay triangulations.
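The singular-value step can be illustrated in isolation: given an affinity matrix between the nodes of two graphs, replace its singular values with ones to obtain an orthogonal "match" matrix and read off the mutually maximal entries. This sketch omits the EM iteration and uses a made-up affinity matrix.

```python
import numpy as np

def svd_correspondences(affinity):
    """Recover node correspondences from an affinity matrix: take the SVD,
    replace the singular values by ones, then keep mutually-best pairs."""
    U, _, Vt = np.linalg.svd(affinity)
    match = U @ np.eye(*affinity.shape) @ Vt      # orthogonal approximation of affinity
    matches = []
    for i, row in enumerate(match):
        j = int(np.argmax(row))
        if i == int(np.argmax(match[:, j])):      # keep only mutually maximal entries
            matches.append((i, j))
    return matches

# Toy affinity between 4 nodes of graph A and 4 nodes of graph B;
# the true correspondences lie on a scrambled diagonal of high values.
affinity = np.array([[0.1, 0.9, 0.2, 0.1],
                     [0.8, 0.1, 0.1, 0.2],
                     [0.2, 0.1, 0.1, 0.9],
                     [0.1, 0.2, 0.9, 0.1]])
print(svd_correspondences(affinity))
```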
A Corpus-Based Investigation of Definite Description Use
We present the results of a study of definite description use in written texts aimed at assessing the feasibility of annotating corpora with information about definite description interpretation. We ran two experiments, in which subjects were asked to classify the uses of definite descriptions in a corpus of 33 newspaper articles, containing a total of 1412 definite descriptions. We measured the agreement among annotators about the classes assigned to definite descriptions, as well as the agreement about the antecedent assigned to those definites that the annotators classified as being related to an antecedent in the text. The most interesting result of this study from a corpus annotation perspective was the rather low agreement (K=0.63) that we obtained using versions of Hawkins’ and Prince’s classification schemes; better results (K=0.76) were obtained using the simplified scheme proposed by Fraurud that includes only two classes, first-mention and subsequent-mention. The agreement about antecedents was also not complete. These findings raise questions concerning the strategy of evaluating systems for definite description interpretation by comparing their results with a standardized annotation. From a linguistic point of view, the most interesting observations were the great number of discourse-new definites in our corpus (in one of our experiments, about 50% of the definites in the collection were classified as discourse-new, 30% as anaphoric, and 18% as associative/bridging) and the presence of definites which did not seem to require a complete disambiguation. This paper will appear in Computational Linguistics.
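Agreement figures such as K=0.63 are chance-corrected statistics; for two annotators, Cohen's kappa is a simple instance (the study may have used a related multi-annotator variant). A small sketch with hypothetical labels:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two annotators over the same items."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical classifications of 10 definite descriptions by two annotators.
ann1 = ["new", "anaph", "new", "bridge", "anaph", "new", "new", "anaph", "bridge", "new"]
ann2 = ["new", "anaph", "anaph", "bridge", "anaph", "new", "new", "new", "bridge", "new"]
print(f"kappa = {cohens_kappa(ann1, ann2):.2f}")
```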
CRSM: Crowdsourcing Based Road Surface Monitoring
Detecting road potholes and road roughness levels is key to road condition monitoring, which impacts transport safety and driving comfort. We propose a crowdsourcing-based road surface monitoring system, simply called CRSM. CRSM can effectively detect road potholes and evaluate road roughness levels using our hardware modules mounted on distributed vehicles. These modules use low-end accelerometers and GPS devices to obtain vibration pattern, location, and vehicle velocity. Considering the high cost of onboard storage and wireless transmission, a novel light-weight data mining algorithm is proposed to detect road surface events and transmit potential pothole information to a central server. The server gathers reports from the multiple vehicles, and makes a comprehensive evaluation of road surface quality. We have implemented a product-quality system and deployed it on 100 taxis in the Shenzhen urban area. The results show that CRSM can detect road potholes with up to 90% accuracy, and nearly zero false alarms. CRSM can also evaluate road roughness levels correctly, even with some interference from small bumps or potholes.
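The light-weight event detection described above can be approximated by a simple sliding-window outlier test on the vertical acceleration, gated by vehicle speed. The thresholds and the simulated signal below are illustrative assumptions, not the algorithm deployed on the taxis.

```python
import numpy as np

def detect_events(accel_z, speed, window=50, z_thresh=4.0, min_speed=2.0):
    """Toy sliding-window pothole detector: flag windows whose vertical
    acceleration deviates strongly from the local mean, only while the
    vehicle is actually moving. Thresholds are illustrative assumptions."""
    events = []
    for start in range(0, len(accel_z) - window, window):
        seg = accel_z[start:start + window]
        if speed[start] < min_speed:              # ignore stationary periods
            continue
        z_scores = np.abs(seg - seg.mean()) / (seg.std() + 1e-9)
        if z_scores.max() > z_thresh:
            events.append(start + int(np.argmax(z_scores)))
    return events

# Simulated signal: smooth road noise plus one sharp pothole spike at sample 730.
rng = np.random.default_rng(2)
accel_z = rng.normal(0.0, 0.3, size=1000)
accel_z[730] += 4.0
speed = np.full(1000, 8.0)                        # m/s
print(detect_events(accel_z, speed))
```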
Spanish Lymphoma Group (GELTAMO) guidelines for the diagnosis, staging, treatment, and follow-up of diffuse large B-cell lymphoma
Diffuse large B-cell lymphoma (DLBCL) accounts for approximately 30% of non-Hodgkin lymphoma (NHL) cases in adult series. DLBCL is characterized by marked clinical and biological heterogeneity, encompassing up to 16 distinct clinicopathological entities. While current treatments are effective in 60% to 70% of patients, those who are resistant to treatment continue to die from this disease. An expert panel performed a systematic review of all data on the diagnosis, prognosis, and treatment of DLBCL published in PubMed, EMBASE and MEDLINE up to December 2017. Recommendations were classified in accordance with the Grading of Recommendations Assessment Development and Evaluation (GRADE) framework, and the proposed recommendations incorporated into practical algorithms. Initial discussions between experts began in March 2016, and a final consensus was reached in November 2017. The final document was reviewed by all authors in February 2018 and by the Scientific Committee of the Spanish Lymphoma Group GELTAMO.
Blockchain-inspired Event Recording System for Autonomous Vehicles
Autonomous vehicles are capable of sensing their environment and navigating without any human inputs. However, when autonomous vehicles are involved in accidents between themselves or with human subjects, liability must be indubitably decided based on accident forensics. This paper proposes a blockchain-inspired event recording system for autonomous vehicles. Due to the inefficiency and limited usage of certain blockchain features designed for the traditional cryptocurrency applications, we design a new “proof of event” mechanism to achieve indisputable accident forensics by ensuring that event information is trustable and verifiable. Specifically, we propose a dynamic federation consensus scheme to verify and confirm the new block of event data in an efficient way without any central authority. The security capability of the proposed scheme is also analyzed against different threat and attack models.
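The append-only, verifiable aspect of such an event log can be sketched with a plain hash chain; the federation consensus and "proof of event" signatures are represented here only by a placeholder list of confirmations, so this is not the paper's protocol.

```python
import hashlib
import json
import time

def make_block(prev_hash, event, confirmations):
    """Append-only, hash-chained record of a vehicle event. `confirmations`
    stands in for the endorsements a real federation would collect before
    accepting the block; consensus itself is out of scope here."""
    body = {
        "prev_hash": prev_hash,
        "timestamp": time.time(),
        "event": event,
        "confirmations": confirmations,
    }
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

genesis = make_block("0" * 64, {"type": "genesis"}, [])
crash = make_block(genesis["hash"],
                   {"type": "hard_braking", "speed_kmh": 47, "vehicle": "AV-017"},
                   ["vehicle_B", "roadside_unit_12"])
print(crash["hash"][:16], "links to", crash["prev_hash"][:16])
```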
Endogenous opiates and the control of breathing in normal subjects and patients with chronic airflow obstruction.
To investigate the role of endorphins in central respiratory control, the effect of naloxone, a specific opiate antagonist, on resting ventilation and ventilatory control was investigated in a randomised double-blind, placebo-controlled study of normal subjects and patients with chronic airways obstruction and mild hypercapnia due to longstanding chronic bronchitis. In 13 normal subjects the ventilatory response to hypercapnia increased after an intravenous injection of naloxone (0.1 mg/kg), ventilation (VE) at a PCO2 of 8.5 kPa increasing from 55.6 +/- SEM 6.2 to 75.9 +/- 8.2 l min-1 (p less than 0.001) and the delta VE/delta PCO2 slope increasing from 28.6 +/- 4.4 to 34.2 +/- 4.2 l min-1 kPa-1 (p less than 0.05). There was no significant change after placebo (saline) injection. Naloxone had no effect on resting ventilation or on the ventilatory response to hypoxia in normal subjects. In all six patients naloxone significantly (p less than 0.02) increased mouth occlusion pressure (P0.1) responses to hypercapnia. Although there was no change in resting respiratory frequency or tidal volume, patients showed a significant (p less than 0.01) decrease in inspiratory timing (Ti/Ttot) and increase in mean inspiratory flow (VT/Ti) after naloxone. These results indicate that endorphins have a modulatory role in the central respiratory response to hypercapnia in both normal subjects and patients with airways obstruction. In addition, they have an inhibitory effect on the control of tidal breathing in patients with chronic bronchitis.
Factors affecting nurses' decisions to administer pediatric pain medication postoperatively.
Factors associated with pediatric nurses’ decisions to medicate for postoperative pain as well as knowledge and attitudes concerning analgesia were investigated in 38 nurses using the Nurses’ Pediatric Pain Relief Questionnaire. The charts of 38 children who were hospitalized for major surgery were reviewed for analgesia orders. Baccalaureate preparation in nursing was associated with the selection of more medium and high doses of analgesics on the questionnaire. School-aged children received the most pain medication. Medication orders were below therapeutic levels in about half the charts reviewed.
Promoting tourism destinations : A strategic marketing approach
This paper outlines the main topics of a report prepared for the Tourism Promotion Committee (TPC) in Heraklion District, Crete. This body is responsible for coordinating marketing activities and promoting its area. Its membership is composed of the main public and private sector organizations involved in providing tourism services in Crete. The report was compiled for the TPC by the first author, who has been closely involved in the development of tourism marketing and planning on Crete for over ten years. It was commissioned with the aim of improving the destination’s marketing effectiveness and efficiency. The task assigned was to provide a report (i) reviewing the marketing activities undertaken by the TPC; (ii) providing the appropriate approach to be adopted and implemented in promoting Heraklion as a tourism destination; and (iii) giving a set of recommendations for key areas requiring improvement.
On computable numbers, with an application to the Entscheidungsproblem
The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by finite means. Although the subject of this paper is ostensibly the computable numbers. it is almost equally easy to define and investigate computable functions of an integral variable or a real or computable variable, computable predicates, and so forth. The fundamental problems involved are, however, the same in each case, and I have chosen the computable numbers for explicit treatment as involving the least cumbrous technique. I hope shortly to give an account of the relations of the computable numbers, functions, and so forth to one another. This will include a development of the theory of functions of a real variable expressed in terms of computable numbers. According to my definition, a number is computable if its decimal can be written down by a machine.
The Normalized Difference Vegetation Index ( NDVI ) : unforeseen successes in animal ecology
This review highlights the latest developments associated with the use of the Normalized Difference Vegetation Index (NDVI) in ecology. Over the last decade, the NDVI has proven extremely useful in predicting herbivore and non-herbivore distribution, abundance and life history traits in space and time. Due to the continuous nature of NDVI since mid-1981, the relative importance of different temporal and spatial lags on population performance can be assessed, widening our understanding of population dynamics. Previously thought to be most useful in temperate environments, the utility of this satellite-derived index has been demonstrated even in sparsely vegetated areas. Climate models can be used to reconstruct historical patterns in vegetation dynamics in addition to anticipating the effects of future environmental change on biodiversity. NDVI has thus been established as a crucial tool for assessing past and future population and biodiversity consequences of change in climate, vegetation phenology and primary productivity.
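For readers outside remote sensing, the index itself is a simple band ratio computed from red and near-infrared reflectance; a brief illustration with made-up reflectance values:

```python
import numpy as np

# NDVI = (NIR - RED) / (NIR + RED), ranging from -1 to 1;
# higher values indicate denser green vegetation.
red = np.array([0.08, 0.12, 0.30])
nir = np.array([0.45, 0.40, 0.32])
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))   # roughly: dense vegetation, moderate vegetation, bare soil
```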
FWD vehicle drifting control: The handbrake-cornering technique
Race drivers employ expert techniques to exploit the limits of the vehicle performance. In particular, rally driving techniques involve vehicle cornering at high sideslip angles (drifting), and hence operation of the vehicle beyond the stable limits enforced by stability control systems. In this work we study drifting techniques applicable to Front-Wheel-Drive (FWD) drive-train configurations. We present data collected during the execution of handbrake-cornering maneuvers by an expert driver in a FWD vehicle. Consequently, we calculate cornering equilibria using a vehicle model with driven front wheels, and rear wheels “locked” at zero angular rate under application of the handbrake. A controller is designed to stabilize the vehicle with respect to the calculated equilibria, using steering and drive/brake torque control inputs. The controller is implemented in simulation to demonstrate the stabilization of unstable drifting steady-states.
Finders , keepers ? Attracting , motivating and retaining knowledge workers
Attracting, motivating and retaining knowledge workers have become important in a knowledge-based and tight labour market, where changing knowledge management practices and global convergence of technology has redefined the nature of work. While individualisation of employment practices and team-based work may provide personal and organisational flexibilities, aligning HR and organisational strategies for competitive advantage has become more prominent. This exploratory study identifies the most and least effective HR strategies used by knowledge intensive firms (KIFs) in Singapore for attracting, motivating and retaining these workers. The most popular strategies were not always the most effective, and there appear to be distinctive ‘bundles’ of HR practices for managing knowledge workers. These vary according to whether ownership is foreign or local. A schema, based on statistically significant findings, for improving the effectiveness of these practices in managing knowledge workers is proposed. Cross-cultural research is necessary to establish the extent of diffusion of these practices.
Fisheye lens distortion correction on multicore and hardware accelerator platforms
Wide-angle (fisheye) lenses are often used in virtual reality and computer vision applications to widen the field of view of conventional cameras. Those lenses, however, distort images. For most real-world applications the video stream needs to be transformed, in real time (20 frames/sec or better), back to the natural-looking, central perspective space. This paper presents the implementation, optimization and characterization of a fisheye lens distortion correction application on three platforms: a conventional, homogeneous multicore processor by Intel, a heterogeneous multicore (Cell BE), and an FPGA implementing an automatically generated streaming accelerator. We evaluate the interaction of the application with those architectures using both high- and low-level performance metrics. In macroscopic terms, we find that today's mainstream conventional multicores are not effective in supporting real-time distortion correction, at least not with the currently commercially available core counts. Architectures such as the Cell BE and FPGAs offer the necessary computational power and scalability, at the expense of significantly higher development effort. Among these three platforms, only the FPGA and a fully optimized version of the code running on the Cell processor can provide real-time processing speed. In general, FPGAs meet the expectations of performance, flexibility, and low overhead. General purpose multicores are, on the other hand, much easier to program.
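As a software reference point, the remapping at the heart of distortion correction can be sketched with an equidistant lens model: every output (perspective) pixel is traced back to a fisheye pixel. The focal lengths and the nearest-neighbour sampling are illustrative simplifications, not the optimized kernels evaluated in the paper.

```python
import numpy as np

def undistort_equidistant(fisheye_img, f_fish, f_out, cx, cy):
    """Remap an equidistant-projection fisheye image to a central-perspective
    image using nearest-neighbour sampling. The lens model and focal lengths
    are illustrative; real systems calibrate these per camera."""
    h, w = fisheye_img.shape[:2]
    v, u = np.indices((h, w), dtype=float)
    x = (u - cx) / f_out                    # normalised perspective ray directions
    y = (v - cy) / f_out
    r = np.sqrt(x ** 2 + y ** 2) + 1e-12
    theta = np.arctan(r)                    # angle of the ray from the optical axis
    rd = f_fish * theta                     # equidistant model: radius proportional to angle
    src_u = np.clip(np.rint(cx + rd * x / r), 0, w - 1).astype(int)
    src_v = np.clip(np.rint(cy + rd * y / r), 0, h - 1).astype(int)
    return fisheye_img[src_v, src_u]

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)   # stand-in video frame
corrected = undistort_equidistant(frame, f_fish=300.0, f_out=300.0, cx=320.0, cy=240.0)
```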
Description of the “Scenario Machine”
We present an updated description of the “Scenario Machine” code, which is used to carry out population-synthesis analyses of the evolution of close binary stars.
Sectjunction: Wi-Fi indoor localization based on junction of signal sectors
In Wi-Fi fingerprint localization, a target sends its measured Received Signal Strength Indicator (RSSI) of access points (APs) to a server for its position estimation. Traditionally, the server estimates the target position by matching the RSSI with the fingerprints stored in database. Due to signal measurement uncertainty, this matching process often leads to a geographically dispersed set of reference points, resulting in unsatisfactory estimation accuracy. We propose a novel, efficient and highly accurate localization scheme termed Sectjunction which does not lead to a dispersed set of neighbors. For each selected AP, Sectjunction sectorizes its coverage area according to discrete signal levels, hence achieving robustness against measurement uncertainty. Based on the received AP RSSI, the target can then be mapped to the sector where it is likely to be. To further enhance its computational efficiency, Sectjunction partitions the site into multiple area clusters to narrow the search space. Through convex optimization, the target is localized based on the cluster and the junction of the sectors it is within. We have implemented Sectjunction, and our extensive experiments show that it significantly outperforms recent schemes with much lower estimation error.
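A toy illustration of the sectorisation idea, assuming a small hypothetical radio map: each AP's RSSI is quantised into discrete levels and the target is constrained to the junction (intersection) of the matching sectors. The clustering and convex-optimisation steps of Sectjunction are omitted.

```python
def rssi_level(rssi_dbm, bin_width=10):
    """Quantise an RSSI reading into a discrete signal level (e.g. -67 dBm
    and -63 dBm share a level), giving robustness to measurement noise."""
    return int(rssi_dbm // bin_width)

# Hypothetical radio map: reference point -> {AP: calibrated RSSI in dBm}.
radio_map = {
    "rp1": {"ap1": -45, "ap2": -70, "ap3": -80},
    "rp2": {"ap1": -48, "ap2": -62, "ap3": -79},
    "rp3": {"ap1": -65, "ap2": -61, "ap3": -55},
}
target = {"ap1": -47, "ap2": -68, "ap3": -77}

candidates = set(radio_map)
for ap, rssi in target.items():
    sector = {rp for rp, fp in radio_map.items() if rssi_level(fp[ap]) == rssi_level(rssi)}
    candidates &= sector                     # junction of the per-AP sectors
print(candidates)
```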
Morbidity and discomfort of ten-core biopsy of the prostate evaluated by questionnaire.
Transition zone biopsies have been found to increase the detection rates of cancer of the prostate in patients with negative digital rectal examination. There are, however, no data available on whether the higher number of biopsy cores is associated with greater morbidity. The present study was therefore designed to evaluate the complication rate of extended sextant biopsy. In this prospective study, 162 consecutive patients who presented for prostatic evaluation were included. After starting prophylactic antibiotic treatment 48 h prior to the procedure, transrectal ultrasound-guided core biopsies were obtained from each lobe: three each from the peripheral zone (apex, mid-zone and base) and two from the transition zone of each prostatic lobe. In all patients a questionnaire was obtained 10-12 days after the procedure. Major complications occurred in 3 patients. In 2 of the 3 cases major macroscopic hematuria was treated by an indwelling catheter for 1 or 2 days, and 1 patient developed fever >38.5 degrees C for 1 day. Minor macroscopic hematuria was present in 68.5% of the patients. In 17.9% of these cases, the hematuria lasted for more than 3 days. Hematospermia was observed in 19.8% and minor rectal bleeding occurred in 4.9%. Ten-core biopsies did not lead to an increase in adverse effects or complications when compared to the results of sextant biopsies reported in the literature.
Does In Utero Exposure to Heavy Maternal Smoking Induce Nicotine Withdrawal Symptoms in Neonates?
Maternal drug use during pregnancy is associated with fetal passive addiction and neonatal withdrawal syndrome. Cigarette smoking—highly prevalent during pregnancy—is associated with addiction and withdrawal syndrome in adults. We conducted a prospective, two-group parallel study on 17 consecutive newborns of heavy-smoking mothers and 16 newborns of nonsmoking, unexposed mothers (controls). Neurologic examinations were repeated at days 1, 2, and 5. Finnegan withdrawal score was assessed every 3 h during their first 4 d. Newborns of smoking mothers had significant levels of cotinine in the cord blood (85.8 ± 3.4 ng/mL), whereas none of the controls had detectable levels. Similar findings were observed with urinary cotinine concentrations in the newborns (483.1 ± 2.5 μg/g creatinine versus 43.6 ± 1.5 μg/g creatinine; p = 0.0001). Neurologic scores were significantly lower in newborns of smokers than in control infants at days 1 (22.3 ± 2.3 versus 26.5 ± 1.1; p = 0.0001), 2 (22.4 ± 3.3 versus 26.3 ± 1.6; p = 0.0002), and 5 (24.3 ± 2.1 versus 26.5 ± 1.5; p = 0.002). Neurologic scores improved significantly from day 1 to 5 in newborns of smokers (p = 0.05), reaching values closer to control infants. Withdrawal scores were higher in newborns of smokers than in control infants at days 1 (4.5 ± 1.1 versus 3.2 ± 1.4; p = 0.05), 2 (4.7 ± 1.7 versus 3.1 ± 1.1; p = 0.002), and 4 (4.7 ± 2.1 versus 2.9 ± 1.4; p = 0.007). Significant correlations were observed between markers of nicotine exposure and neurologic-and withdrawal scores. We conclude that withdrawal symptoms occur in newborns exposed to heavy maternal smoking during pregnancy.
Isolation Enhancement in Patch Antenna Array With Fractal UC-EBG Structure and Cross Slot
A compact patch antenna array with high isolation by using two decoupling structures including a row of fractal uniplanar compact electromagnetic bandgap (UC-EBG) structure and three cross slots is proposed. Simulated results show that significant improvement in interelement isolation of 13 dB is obtained by placing the proposed fractal UC-EBG structure between the two radiating patches. Moreover, three cross slots etched on the ground plane are introduced to further suppress the mutual coupling. The design is easy to be manufactured without the implementation of metal vias, and a more compact array with the edge-to-edge distance of 0.22 λ0 can be facilitated by a row of fractal UC-EBG, which can be well applied in the patch antenna array.
3-D Statistical Channel Model for Millimeter-Wave Outdoor Communications
This paper presents a three-dimensional millimeter-wave statistical channel impulse response model from 28 GHz and 73 GHz ultrawideband propagation measurements [1], [2]. An accurate 3GPP-like channel model that supports arbitrary carrier frequency, RF bandwidth, and antenna beamwidth (for both omnidirectional and arbitrary directional antennas) is provided. Time cluster and spatial lobe model parameters are extracted from empirical distributions from field measurements. A step-by-step modeling procedure for generating channel coefficients is shown to agree with statistics from the field measurements, thus confirming that the statistical channel model faithfully recreates spatial and temporal channel impulse responses for use in millimeter-wave 5G air interface designs.
INTERNATIONAL INVESTMENT AND INTERNATIONAL TRADE IN THE PRODUCT CYCLE
Anyone who has sought to understand the shifts in international trade and international investment over the past twenty years has chafed from time to time under an acute sense of the inadequacy of the available analytical tools. While the comparative cost concept and other basic concepts have rarely failed to provide some help, they have usually carried the analyst only a very little way toward adequate understanding. For the most part, it has been necessary to formulate new concepts in order to explore issues such as the strengths and limitations of import substitution in the development process, the implications of common market arrangements for trade and investment, the underlying reasons for the Leontief paradox, and other critical issues of the day. As theorists have groped for some more efficient tools, there has been a flowering in international trade and capital theory. But the very proliferation of theory has increased the urgency of the search for unifying concepts. It is doubtful that we shall find many propositions that can match the simplicity, power, and universality of application of the theory of comparative advantage and the international equilibrating mechanism; but unless the search for better tools goes on, the usefulness of economic theory for the solution of problems in international trade and capital movements will probably decline. The present paper deals with one promising line of generalization and synthesis which seems to me to have been somewhat neglected by the main stream of trade theory. It puts less emphasis upon comparative cost doctrine and more upon the timing of innovation, the effects of scale economies, and the roles of ignorance and uncertainty in influencing trade patterns. It is an approach
The Sources and Methods of Engineering Design Requirement
The increasing interest in emerging markets drives product development activities for these markets. As a first step, companies need to understand the specific design requirements of a new market when expanding into it. Requirements from external sources are particularly challenging to define in a new context. This paper focuses on understanding design requirement sources at the requirement elicitation phase. It aims to propose an improved classification of design requirement sources for emerging markets and to present current methods for eliciting requirements from each source. The applicability of these methods and their adaptation for emerging markets are discussed.
Magnebike: A magnetic wheeled robot with high mobility for inspecting complex-shaped structures
This paper describes the Magnebike robot, a compact robot with two magnetic wheels in a motorbike arrangement, which is intended for inspecting the inner casing of ferromagnetic pipes with complex-shaped structures. The locomotion concept is based on an adapted magnetic wheel unit integrating two lateral lever arms. These arms allow the wheel to be lifted off slightly in order to locally decrease the magnetic attraction force when passing concave edges, and they also laterally stabilize the wheel unit. The robot has the main advantage of being compact (180 × 130 × 220 mm) and mechanically simple: it features only five active degrees of freedom (two driven wheels each equipped with an active lifter stabilizer and one steering unit). The paper presents in detail design and implementation issues that are specific to magnetic wheeled robots. Low-level control functionalities are addressed because they are necessary to control the active system. The paper also focuses on characterizing and analyzing the implemented robot. The high mobility
Distributional Semantics Resources for Biomedical Text Processing
The openly available biomedical literature contains over 5 billion words in publication abstracts and full texts. Recent advances in unsupervised language processing methods have made it possible to make use of such large unannotated corpora for building statistical language models and inducing high quality vector space representations, which are, in turn, of utility in many tasks such as text classification, named entity recognition and query expansion. In this study, we introduce the first set of such language resources created from analysis of the entire available biomedical literature, including a dataset of all 1- to 5-grams and their probabilities in these texts and new models of word semantics. We discuss the opportunities created by these resources and demonstrate their application. All resources introduced in this study are available under open licenses at http://bio.nlplab.org.
Detection of ductal carcinoma in situ in women undergoing screening mammography.
BACKGROUND With the large number of women having mammography-an estimated 28.4 million U.S. women aged 40 years and older in 1998-the percentage of cancers detected as ductal carcinoma in situ (DCIS), which has an uncertain prognosis, has increased. We pooled data from seven regional mammography registries to determine the percentage of mammographically detected cancers that are DCIS and the rate of DCIS per 1000 mammograms. METHODS We analyzed data on 653 833 mammograms from 540 738 women between 40 and 84 years of age who underwent screening mammography at facilities participating in the National Cancer Institute's Breast Cancer Surveillance Consortium (BCSC) throughout 1996 and 1997. Mammography results were linked to population-based cancer and pathology registries. We calculated the percentage of screen-detected breast cancers that were DCIS, the rate of screen-detected DCIS per 1000 mammograms by age and by previous mammography status, and the sensitivity of screening mammography. Statistical tests were two-sided. RESULTS A total of 3266 cases of breast cancer were identified, 591 DCIS and 2675 invasive breast cancer. The percentage of screen-detected breast cancers that were DCIS decreased with age (from 28.2% [95% confidence interval (CI) = 23.9% to 32.5%] for women aged 40-49 years to 16.0% [95% CI = 13.3% to 18.7%] for women aged 70-84 years). However, the rate of screen-detected DCIS cases per 1000 mammograms increased with age (from 0.56 [95% CI = 0.41 to 0.70] for women aged 40-49 years to 1.07 [95% CI = 0.87 to 1.27] for women aged 70-84 years). Sensitivity of screening mammography in all age groups combined was higher for detecting DCIS (86.0% [95% CI = 83.2% to 88.8%]) than it was for detecting invasive breast cancer (75.1% [95% CI = 73.5% to 76.8%]). CONCLUSIONS Overall, approximately 1 in every 1300 screening mammography examinations leads to a diagnosis of DCIS. Given uncertainty about the natural history of DCIS, the clinical significance of screen-detected DCIS needs further investigation.
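The headline rate can be reproduced arithmetically as a detection rate per 1000 screens with a normal-approximation confidence interval. The counts below are hypothetical figures chosen only to match the reported "about 1 in 1300" order of magnitude, not the study's exact tallies.

```python
import math

def rate_per_1000_with_ci(cases, screens, z=1.96):
    """Screen-detection rate per 1000 examinations with a normal-approximation 95% CI."""
    p = cases / screens
    se = math.sqrt(p * (1 - p) / screens)
    return 1000 * p, 1000 * (p - z * se), 1000 * (p + z * se)

# Hypothetical counts roughly consistent with "about 1 in 1300" screens.
rate, lo, hi = rate_per_1000_with_ci(cases=500, screens=650000)
print(f"DCIS: {rate:.2f} per 1000 (95% CI {lo:.2f} to {hi:.2f})")
```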
CT Scan Screening for Lung Cancer: Risk Factors for Nodules and Malignancy in a High-Risk Urban Cohort
BACKGROUND Low-dose computed tomography (CT) for lung cancer screening can reduce lung cancer mortality. The National Lung Screening Trial reported a 20% reduction in lung cancer mortality in high-risk smokers. However, CT scanning is extremely sensitive and detects non-calcified nodules (NCNs) in 24-50% of subjects, suggesting an unacceptably high false-positive rate. We hypothesized that by reviewing demographic, clinical and nodule characteristics, we could identify risk factors associated with the presence of nodules on screening CT, and with the probability that a NCN was malignant. METHODS We performed a longitudinal lung cancer biomarker discovery trial (NYU LCBC) that included low-dose CT-screening of high-risk individuals over 50 years of age, with more than 20 pack-year smoking histories, living in an urban setting, and with a potential for asbestos exposure. We used case-control studies to identify risk factors associated with the presence of nodules (n=625) versus no nodules (n=557), and lung cancer patients (n=30) versus benign nodules (n=128). RESULTS The NYU LCBC followed 1182 study subjects prospectively over a 10-year period. We found 52% to have NCNs >4 mm on their baseline screen. Most of the nodules were stable, and 9.7% of solid and 26.2% of sub-solid nodules resolved. We diagnosed 30 lung cancers, 26 stage I. Three patients had synchronous primary lung cancers or multifocal disease. Thus, there were 33 lung cancers: 10 incident, and 23 prevalent. A sub-group of the prevalent group were stable for a prolonged period prior to diagnosis. These were all stage I at diagnosis and 12/13 were adenocarcinomas. CONCLUSIONS NCNs are common among CT-screened high-risk subjects and can often be managed conservatively. Risk factors for malignancy included increasing age, size and number of nodules, reduced FEV1 and FVC, and increased pack-years smoking. A sub-group of screen-detected cancers are slow-growing and may contribute to over-diagnosis and lead-time biases.
Common Fixed Point Theorems in Sequentially Compact Fuzzy Metric Spaces
In this paper, we give common fixed point theorems in sequentially compact fuzzy metric space (X,M, ∗) for pairs of weakly compatible mappings and for sequences of self mappings. Mathematics Subject Classification: 47H10; 54H25
Zyzzyva: Speculative Byzantine fault tolerance
A longstanding vision in distributed systems is to build reliable systems from unreliable components. An enticing formulation of this vision is Byzantine Fault-Tolerant (BFT) state machine replication, in which a group of servers collectively act as a correct server even if some of the servers misbehave or malfunction in arbitrary (“Byzantine”) ways. Despite this promise, practitioners hesitate to deploy BFT systems, at least partly because of the perception that BFT must impose high overheads. In this article, we present Zyzzyva, a protocol that uses speculation to reduce the cost of BFT replication. In Zyzzyva, replicas reply to a client's request without first running an expensive three-phase commit protocol to agree on the order to process requests. Instead, they optimistically adopt the order proposed by a primary server, process the request, and reply immediately to the client. If the primary is faulty, replicas can become temporarily inconsistent with one another, but clients detect inconsistencies, help correct replicas converge on a single total ordering of requests, and only rely on responses that are consistent with this total order. This approach allows Zyzzyva to reduce replication overheads to near their theoretical minima and to achieve throughputs of tens of thousands of requests per second, making BFT replication practical for a broad range of demanding services.
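As an illustration of the speculative fast path described above, the sketch below shows a client-side decision rule for n = 3f + 1 replicas. It is a simplified reading of the abstract, not the full protocol: view changes, checkpoints, commit-certificate messages, and fill-hole handling are all omitted, and the digests are made up.

```python
# Minimal sketch of a Zyzzyva-style client decision rule (illustrative only).
# With n = 3f + 1 replicas:
#   - 3f + 1 matching speculative replies -> request completes in one round trip
#   - 2f + 1 .. 3f matching replies       -> client gathers a commit certificate
#   - fewer                               -> client retries / falls back
from collections import Counter

def classify_replies(replies, f):
    """replies: list of (replica_id, digest) speculative responses."""
    n = 3 * f + 1
    if not replies:
        return "RETRY"
    digest, votes = Counter(d for _, d in replies).most_common(1)[0]
    if votes == n:
        return "COMPLETE"            # fast path: all replicas agree speculatively
    if votes >= 2 * f + 1:
        return "COMMIT_CERTIFICATE"  # second phase: make the order durable
    return "RETRY"

# Example with f = 1 (4 replicas) and one replica diverging:
print(classify_replies([(0, "h1"), (1, "h1"), (2, "h1"), (3, "h2")], f=1))
# -> COMMIT_CERTIFICATE
```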
How Envy Influences SNS Intentions to Use
Social networking sites (SNS) have grown to be one of the most prevalent technologies, providing users with a variety of benefits. However, SNS also give users ample opportunities to view others' positive information, which can elicit envy. In the current study, we develop a theoretical framework that elaborates the mechanism through which online envy is generated and influences SNS usage. We specify that online users experience two types of envy, each of which can have a distinct influence on continuance intention to use SNS. Our findings provide valuable implications for both academic researchers and IS practitioners.
Damaged Goods: Perception of Pornography Addiction as a Mediator Between Religiosity and Relationship Anxiety Surrounding Pornography Use.
Recent research on pornography suggests that perception of addiction predicts negative outcomes above and beyond pornography use. Research has also suggested that religious individuals are more likely to perceive themselves to be addicted to pornography, regardless of how often they are actually using pornography. Using a sample of 686 unmarried adults, this study reconciles and expands on previous research by testing perceived addiction to pornography as a mediator between religiosity and relationship anxiety surrounding pornography. Results revealed that pornography use and religiosity were weakly associated with higher relationship anxiety surrounding pornography use, whereas perception of pornography addiction was highly associated with relationship anxiety surrounding pornography use. However, when perception of pornography addiction was inserted as a mediator in a structural equation model, pornography use had a small indirect effect on relationship anxiety surrounding pornography use, and perception of pornography addiction partially mediated the association between religiosity and relationship anxiety surrounding pornography use. By understanding how pornography use, religiosity, and perceived pornography addiction connect to relationship anxiety surrounding pornography use in the early relationship formation stages, we hope to improve the chances of couples successfully addressing the subject of pornography and mitigate difficulties in romantic relationships.
Secure Border Gateway Protocol (S-BGP)
The Border Gateway Protocol (BGP), which is used to distribute routing information between autonomous systems (ASes), is a critical component of the Internet's routing infrastructure. It is highly vulnerable to a variety of malicious attacks, due to the lack of a secure means of verifying the authenticity and legitimacy of BGP control traffic. This paper describes a secure, scalable, deployable architecture (S-BGP) for an authorization and authentication system that addresses most of the security problems associated with BGP. The paper discusses the vulnerabilities and security requirements associated with BGP, describes the S-BGP countermeasures, and explains how they address these vulnerabilities and requirements. In addition, this paper provides a comparison of this architecture to other approaches that have been proposed, analyzes the performance implications of the proposed countermeasures, and addresses operational issues.
Oxidation resistance of graphene-coated Cu and Cu/Ni alloy.
The ability to protect refined metals from reactive environments is vital to many industrial and academic applications. Current solutions, however, typically introduce several negative effects, including increased thickness and changes in the metal physical properties. In this paper, we demonstrate for the first time the ability of graphene films grown by chemical vapor deposition to protect the surface of the metallic growth substrates of Cu and Cu/Ni alloy from air oxidation. In particular, graphene prevents the formation of any oxide on the protected metal surfaces, thus allowing pure metal surfaces only one atom away from reactive environments. SEM, Raman spectroscopy, and XPS studies show that the metal surface is well protected from oxidation even after heating at 200 °C in air for up to 4 h. Our work further shows that graphene provides effective resistance against hydrogen peroxide. This protection method offers significant advantages and can be used on any metal that catalyzes graphene growth.
The Linear Programming Approach to Approximate Dynamic Programming
The curse of dimensionality gives rise to prohibitive computational requirements that render infeasible the exact solution of large-scale stochastic control problems. We study an efficient method based on linear programming for approximating solutions to such problems. The approach “fits” a linear combination of pre-selected basis functions to the dynamic programming cost-to-go function. We develop error bounds that offer performance guarantees and also guide the selection of both basis functions and “state-relevance weights” that influence quality of the approximation. Experimental results in the domain of queueing network control provide empirical support for the methodology.
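The "fitting" step described above can be made concrete with the standard approximate linear program, in which the approximate cost-to-go is constrained by the Bellman operator at every state-action pair and weighted by the state-relevance weights in the objective. The sketch below uses a tiny made-up discounted-cost MDP and SciPy's linprog; the costs, transition matrices, basis functions, and weights are illustrative assumptions, not taken from the paper.

```python
# Approximate linear programming (ALP) sketch for a small discounted-cost MDP.
import numpy as np
from scipy.optimize import linprog

n_states, alpha = 3, 0.9
# Transition matrices P[a] and one-step costs g[x, a] (illustrative values).
P = [np.array([[0.9, 0.1, 0.0],
               [0.1, 0.8, 0.1],
               [0.0, 0.1, 0.9]]),
     np.array([[0.2, 0.8, 0.0],
               [0.0, 0.2, 0.8],
               [0.8, 0.0, 0.2]])]
g = np.array([[2.0, 1.0],
              [1.0, 3.0],
              [0.5, 2.0]])
# Basis functions (constant feature and state index) and state-relevance weights c.
Phi = np.array([[1.0, 0.0],
                [1.0, 1.0],
                [1.0, 2.0]])
c = np.ones(n_states) / n_states

# ALP: maximize c' Phi r  s.t.  (Phi r)(x) <= g(x, a) + alpha * P_a(x, :) Phi r  for all x, a
A_ub, b_ub = [], []
for a, Pa in enumerate(P):
    A_ub.append(Phi - alpha * Pa @ Phi)   # one constraint row per state
    b_ub.append(g[:, a])
res = linprog(c=-(Phi.T @ c), A_ub=np.vstack(A_ub), b_ub=np.concatenate(b_ub),
              bounds=[(None, None)] * Phi.shape[1])
print("fitted weights r:", res.x, "approx cost-to-go:", Phi @ res.x)
```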
Noncontact Proximity Vital Sign Sensor Based on PLL for Sensitivity Enhancement
In this paper, a noncontact proximity vital sign sensor is proposed to enhance sensitivity in severe environments; it uses a phase-locked loop (PLL) incorporating a voltage-controlled oscillator (VCO) with a built-in planar circular resonator. The planar circular resonator acts as a series feedback element of the VCO as well as a near-field receiving antenna. The frequency deviation of the VCO due to the body proximity effect ranges from 0.07 MHz/mm to 1.8 MHz/mm (6.8 mV/mm to 205 mV/mm in sensitivity) up to a distance of 50 mm, while the VCO drift is about 21 MHz over a 60 °C temperature range with a discrete component tolerance of ±5%. The total frequency variation falls within the 60 MHz capture range of the PLL. Thus, the loop control voltage converts the frequency deviation into a difference of direct current (DC) voltage, which is used to extract vital signs regardless of the ambient temperature. The experimental results reveal that the proposed sensor, placed 50 mm away from a subject and operating at 2.4 GHz, can reliably detect respiration and heartbeat signals without ambiguity from harmonics of the respiration signal.
2sRanking-CNN: A 2-stage ranking-CNN for diagnosis of glaucoma from fundus images using CAM-extracted ROI as an intermediate input
Glaucoma is a disease in which the optic nerve is chronically damaged by elevated intra-ocular pressure, resulting in visual field defects. Therefore, it is important to monitor and treat suspected patients before they are confirmed with glaucoma. In this paper, we propose a 2-stage ranking-CNN that classifies fundus images as normal, suspicious, or glaucoma. Furthermore, we propose a method of using the class activation map as a mask filter and combining it with the original fundus image to form an intermediate input. Our results improve the average accuracy by about 10% over the existing 3-class CNN and ranking-CNN, and in particular improve the sensitivity for the suspicious class by more than 20% over the 3-class CNN. In addition, the extracted ROI was found to overlap with the physician's diagnostic criteria. The proposed method is expected to apply efficiently to any medical data in which a suspicious condition lies between normal and disease.
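A minimal sketch of the CAM-as-mask idea is given below; the array shapes, nearest-neighbour upsampling, and 0.2 threshold are assumptions for illustration rather than the paper's actual pipeline.

```python
# Illustrative use of a class activation map (CAM) as a mask filter on a fundus
# image before a second-stage classifier (shapes and threshold are assumptions).
import numpy as np

def cam_masked_input(image, feature_maps, fc_weights, target_class, thr=0.2):
    """image: (H, W, 3); feature_maps: (h, w, C) from the last conv layer;
    fc_weights: (C, n_classes) weights of the global-average-pooling classifier."""
    cam = feature_maps @ fc_weights[:, target_class]            # (h, w)
    cam = np.maximum(cam, 0)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)    # normalize to [0, 1]
    H, W = image.shape[:2]
    rows = np.floor(np.linspace(0, cam.shape[0] - 1, H)).astype(int)
    cols = np.floor(np.linspace(0, cam.shape[1] - 1, W)).astype(int)
    cam_up = cam[rows][:, cols]                                  # nearest-neighbour upsample
    mask = (cam_up >= thr).astype(image.dtype)
    return image * mask[..., None]                               # ROI-only intermediate input

roi = cam_masked_input(np.random.rand(224, 224, 3),
                       np.random.rand(7, 7, 512), np.random.rand(512, 3), target_class=2)
```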
Three-dimensional bipedal walking control using Divergent Component of Motion
In this paper, we extend the Divergent Component of Motion (DCM, also called `Capture Point') to 3D. We introduce the “Enhanced Centroidal Moment Pivot point” (eCMP) and the “Virtual Repellent Point” (VRP), which allow for the encoding of both direction and magnitude of the external (e.g. leg) forces and the total force (i.e. external forces plus gravity) acting on the robot. Based on eCMP, VRP and DCM, we present a method for real-time planning and control of DCM trajectories in 3D. We address the problem of underactuation and propose methods to guarantee feasibility of the finally commanded forces. The capabilities of the proposed control framework are verified in simulations.
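For reference, the DCM quantities discussed above are commonly written as follows in this line of work; the notation here is the generic one and may differ from the paper's.

```latex
% Standard 3D DCM relations (generic notation):
\xi = x + \frac{\dot{x}}{\omega_0}, \qquad
\omega_0 = \sqrt{\frac{g}{\Delta z_{\mathrm{vrp}}}}, \qquad
\dot{x} = \omega_0\,(\xi - x), \qquad
\dot{\xi} = \omega_0\,(\xi - r_{\mathrm{vrp}}),
```

Here x is the CoM position, ξ the DCM, and r_vrp the VRP; the CoM converges towards the DCM while the DCM is pushed away from the VRP, which is why ξ captures the divergent part of the dynamics.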
Surround-screen projection-based virtual reality: the design and implementation of the CAVE
Several common systems satisfy some but not all of the VR definition above. Flight simulators provide vehicle tracking, not head tracking, and do not generally operate in binocular stereo. Omnimax theaters give a large angle of view [8], occasionally in stereo, but are not interactive. Head-tracked monitors [4][6] provide all but a large angle of view. Head-mounted displays (HMD) [7][13] and BOOMs [9] use motion of the actual display screens to achieve VR by our definition. Correct projection of the imagery on large screens can also create a VR experience, this being the subject of this paper. This paper describes the CAVE (CAVE Automatic Virtual Environment) virtual reality/scientific visualization system in detail and demonstrates that projection technology applied to virtual-reality goals achieves a system that matches the quality of workstation screens in terms of resolution, color, and flicker-free stereo. In addition, this format helps reduce the effect of common tracking and system latency errors. The off-axis perspective projection techniques we use are shown to be simple and straightforward. Our techniques for doing multi-screen stereo vision are enumerated, and design barriers, past and current, are described. Advantages and disadvantages of the projection paradigm are discussed, with an analysis of the effect of tracking noise and delay on the user. Successive refinement, a necessary tool for scientific visualization, is developed in the virtual reality context. The use of the CAVE as a one-to-many presentation device at SIGGRAPH '92 and Supercomputing '92 for computational science data is also mentioned. Previous work in the VR area dates back to Sutherland [12], who in 1965 wrote about the "Ultimate Display." Later in the decade at the University of Utah, Jim Clark developed a system that allowed wireframe graphics VR to be seen through a head-mounted, BOOM-type display for his dissertation. The common VR devices today are the HMD and the BOOM. Lipscomb [4] showed a monitor-based system in the IBM booth at SIGGRAPH '91 and Deering [6] demonstrated the Virtual Portal, a closet-sized, three-wall projection-based system, in the Sun Microsystems' booth at SIGGRAPH '92. The CAVE, our projection-based VR display [3], also premiered at SIGGRAPH '92. The Virtual Portal and CAVE have similar intent, but different implementation schemes.
Comprehensive ICF core set for obstructive pulmonary diseases: validation of the activities and participation component through the patient's perspective.
PURPOSE This study aimed to validate the Activities and Participation component of the Comprehensive International Classification of Functioning, Disability and Health (ICF) Core Set for Obstructive Pulmonary Diseases (OPD) from the patient's perspective. METHODS A cross-sectional qualitative study was conducted with a convenience sample of outpatients with Chronic Obstructive Pulmonary Disease (COPD). Individual interviews were performed and analysed according to the meaning condensation procedure. RESULTS Fifty-one participants (70.6% male) with a mean age of 69.5 ± 10.8 years were included. Twenty-one of the 24 categories contained in the Activities and Participation component of the Comprehensive ICF Core Set for OPD were identified by the participants. Additionally, seven second-level categories that are not covered by the Core Set were reported: complex interpersonal interactions, informal social relationships, family relationships, conversation, maintaining a body position, eating and preparing meals. CONCLUSIONS The Activities and Participation component of the ICF Core Set for OPD was largely supported by the patient's perspective. The categories included in the ICF Core Set that were not confirmed by the participants and the additional categories that were raised need to be further investigated in order to develop an instrument according to the patient's perspective. This will promote more patient-centred assessments and rehabilitation interventions. Implications for Rehabilitation: The Activities and Participation component of the Comprehensive ICF Core Set for OPD is largely supported by the perspective of patients with COPD and therefore could be used in the assessment of patients' individual and social life. The information collected through the Activities and Participation component of the Comprehensive ICF Core Set for OPD could be used to plan and assess rehabilitation interventions for patients with COPD.
A rational account of pedagogical reasoning: Teaching by, and learning from, examples
Much of learning and reasoning occurs in pedagogical situations--situations in which a person who knows a concept chooses examples for the purpose of helping a learner acquire the concept. We introduce a model of teaching and learning in pedagogical settings that predicts which examples teachers should choose and what learners should infer given a teacher's examples. We present three experiments testing the model predictions for rule-based, prototype, and causally structured concepts. The model shows good quantitative and qualitative fits to the data across all three experiments, predicting novel qualitative phenomena in each case. We conclude by discussing implications for understanding concept learning and implications for theoretical claims about the role of pedagogy in human learning.
Multispectral and hyperspectral remote sensing for identification and mapping of wetland vegetation: a review
Wetland vegetation plays a key role in the ecological functions of wetland environments. Remote sensing techniques offer timely, up-to-date, and relatively accurate information for sustainable and effective management of wetland vegetation. This article provides an overview on the status of remote sensing applications in discriminating and mapping wetland vegetation, and estimating some of the biochemical and biophysical parameters of wetland vegetation. Research needs for successful applications of remote sensing in wetland vegetation mapping and the major challenges are also discussed. The review focuses on providing fundamental information relating to the spectral characteristics of wetland vegetation, discriminating wetland vegetation using broad- and narrow-bands, as well as estimating water content, biomass, and leaf area index. It can be concluded that the remote sensing of wetland vegetation has some particular challenges that require careful consideration in order to obtain successful results. These include an in-depth understanding of the factors affecting the interaction between electromagnetic radiation and wetland vegetation in a particular environment, selecting appropriate spatial and spectral resolution as well as suitable processing techniques for extracting spectral information of wetland vegetation.
Using automatic keyword extraction to detect off-topic posts in online discussion boards
Online discussion boards represent a rich repository of knowledge organized in a collection of user-generated content. These conversational cyberspaces allow users to express opinions and ideas and to pose questions and answers without strict limitations on content. This freedom, in turn, creates an environment in which discussions are not bounded and often stray from the initial topic being discussed. In this paper we focus on approaches to assess the relevance of posts to a thread and to detect when discussions have been steered off-topic. A set of metrics estimating the level of novelty in online discussion posts is presented. These metrics are based on topical estimation and contextual similarity between posts within a given thread. The metrics are aggregated to rank posts based on the degree of relevance they maintain. The aggregation scheme is data-dependent and is normalized relative to the post length.
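A minimal illustration of scoring post relevance against the preceding thread context is sketched below using plain TF-IDF and cosine similarity; the paper's actual metrics (topical estimation, data-dependent aggregation, length normalization) are more involved, so treat this only as a baseline sketch.

```python
# Relevance scoring of posts against the preceding thread context (illustrative).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def relevance_scores(thread_posts):
    """One score per post: cosine similarity to the centroid of the earlier posts.
    Low scores flag candidate off-topic posts."""
    X = TfidfVectorizer(stop_words="english").fit_transform(thread_posts).toarray()
    scores = [1.0]                                   # the opening post anchors the topic
    for i in range(1, len(X)):
        context = X[:i].mean(axis=0, keepdims=True)  # centroid of preceding posts
        scores.append(float(cosine_similarity(X[i:i + 1], context)[0, 0]))
    return scores

posts = ["How do I tune garbage collection in the JVM?",
         "Try adjusting the heap size and GC flags.",
         "Anyone watch the game last night?"]
print(relevance_scores(posts))                       # the last post should score lowest
```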
An Islanding Detection Method Based on Dual-Frequency Harmonic Current Injection Under Grid Impedance Unbalanced Condition
This paper proposes a three-phase grid impedance detection method based on dual-frequency harmonic current injection for islanding detection. Grid impedance detection based on single harmonic current injection is reliable in a three-phase impedance-balanced grid. However, under impedance-unbalanced conditions, the harmonic voltage caused by the injected symmetric harmonic current is asymmetric, which affects the calculation of grid impedances and can even lead to failed detection. The dual-frequency method injects two non-characteristic symmetric harmonic currents and then, from the different harmonic voltages caused by the two injection frequencies, calculates all three phase impedances accurately. The implementing algorithm of the presented method is derived and its performance is analyzed in detail. Simulations and experiments are carried out under grid impedance balanced and unbalanced conditions. Theoretical analysis and experimental results show that the proposed method is feasible.
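To make the impedance-calculation step concrete, the sketch below estimates a per-phase resistance and inductance from voltage/current phasors measured at two injected frequencies via least squares. This is a generic formulation under an assumed series R-L grid impedance, not the paper's unbalance-compensation algorithm.

```python
# Per-phase R-L estimation from harmonic phasors at two injected frequencies
# (generic least-squares sketch, assumptions noted above).
import numpy as np

def estimate_rl(freqs_hz, v_phasors, i_phasors):
    """freqs_hz: (2,); v_phasors, i_phasors: complex arrays of shape (2,) for one phase.
    Solves V_k = I_k * (R + j*2*pi*f_k*L) for R, L in the least-squares sense."""
    w = 2 * np.pi * np.asarray(freqs_hz)
    # Real part:  Re(V) = R*Re(I) - w*L*Im(I);  Imag part:  Im(V) = R*Im(I) + w*L*Re(I)
    A = np.vstack([np.column_stack([i_phasors.real, -w * i_phasors.imag]),
                   np.column_stack([i_phasors.imag,  w * i_phasors.real])])
    b = np.concatenate([v_phasors.real, v_phasors.imag])
    (R, L), *_ = np.linalg.lstsq(A, b, rcond=None)
    return R, L

# Synthetic check: R = 0.5 ohm, L = 2 mH, injections at 75 Hz and 125 Hz.
f = np.array([75.0, 125.0]); R_true, L_true = 0.5, 2e-3
I = np.array([1.0 + 0.2j, 0.8 - 0.1j])
V = I * (R_true + 1j * 2 * np.pi * f * L_true)
print(estimate_rl(f, V, I))   # ~ (0.5, 0.002)
```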
Lessons from applying the systematic literature review process within the software engineering domain
A consequence of the growing number of empirical studies in software engineering is the need to adopt systematic approaches to assessing and aggregating research outcomes in order to provide a balanced and objective summary of research evidence for a particular topic. The paper reports experiences with applying one such approach, the practice of systematic literature review, to the published studies relevant to topics within the software engineering domain. The systematic literature review process is summarised, a number of reviews undertaken by the authors and others are described, and some lessons about the applicability of this practice to software engineering are extracted. The basic systematic literature review process seems appropriate to software engineering, and the preparation and validation of a review protocol in advance of a review activity is especially valuable. The paper highlights areas where some adaptation of the process to accommodate the domain-specific characteristics of software engineering is needed, as well as areas where improvements to current software engineering infrastructure and practices would enhance its applicability. In particular, infrastructure support provided by software engineering indexing databases is inadequate. Also, the quality of abstracts is poor; it is usually not possible to judge the relevance of a study from a review of the abstract alone.
PAST-TENSE GENERATION FROM FORM VERSUS MEANING: BEHAVIOURAL DATA AND SIMULATION EVIDENCE.
The standard task used to study inflectional processing of verbs involves presentation of the stem form from which the participant is asked to generate the past tense. This task reveals a processing disadvantage for irregular relative to regular English verbs, more pronounced for lower-frequency items. Dual- and single-mechanism theories of inflectional morphology are both able to account for this pattern; but the models diverge in their predictions concerning the magnitude of the regularity effect expected when the task involves past-tense generation from meaning. In this study, we asked normal speakers to generate the past tense from either form (verb stem) or meaning (action picture). The robust regularity effect observed in the standard form condition was no longer reliable when participants were required to generate the past tense from meaning. This outcome would appear problematic for dual-mechanism theories to the extent that they assume the process of inflection requires stem retrieval. By contrast, it supports single-mechanism models that consider stem retrieval to be task-dependent. We present a single-mechanism model of verb inflection incorporating distributed phonological and semantic representations that reproduces this task-dependent pattern.
A Novel UWB Monopole Antenna With Tunable Notched Behavior Using Varactor Diode
This letter presents a novel ultrawideband (UWB) antenna with a tunable notched band. The antenna is assembled on an FR4 substrate with a thickness of 0.8 mm and εr = 4.4. By inserting a π-shaped slot on the radiating patch, a band-notch function is achieved. By loading the slot with a lumped varactor, the created notch can be tuned. Based on this technique, an electronically controlled notched-band antenna is designed and fabricated, including a single varactor diode with a capacitance varying from 0.63 to 2.67 pF. Using this varactor, notched-band tunability over 2.7-7.2 GHz is obtained.
WILDSENSING: Design and deployment of a sustainable sensor network for wildlife monitoring
The increasing adoption of wireless sensor network technology in a variety of applications, from agricultural to volcanic monitoring, has demonstrated the ability of these networks to gather data with unprecedented sensing capabilities and deliver it to a remote user. However, a key issue remains: how to maintain these sensor networks over increasingly prolonged deployments. In this article, we present the challenges that were faced in maintaining continual operation of an automated wildlife monitoring system over a one-year period. This system analyzed the social colocation patterns of European badgers (Meles meles) residing in a dense woodland environment using a hybrid RFID-WSN approach. We describe the stages of the evolutionary development, from implementation, deployment, and testing, to various iterations of software optimization, followed by hardware enhancements, which in turn triggered the need for further software optimization. We highlight the main lessons learned: the need to factor in maintenance costs while designing the system; the need to consider software and hardware interactions carefully; the importance of rapid prototyping for initial deployment (this was key to our success); and the need for continuous interaction with domain scientists, which allows for unexpected optimizations.
Enriching Knowledge in Business Process Modelling: A Storytelling Approach
The main goal of Business Process Management (BPM) is conceptualising, operationalizing and controlling workflows in organisations based on process models. In this paper we discuss several limitations of the workflow paradigm and suggest that process models can also play an important role in analysing how organisations think about themselves through storytelling. We contrast the workflow paradigm with storytelling through a comparative analysis. We also report a case study where storytelling has been used to elicit and document the practices of an IT maintenance team. This research contributes towards the development of better process modelling languages and tools.
Urokinase-receptor-mediated phenotypic changes in vascular smooth muscle cells require the involvement of membrane rafts.
The cholesterol-enriched membrane microdomains lipid rafts play a key role in cell activation by recruiting and excluding specific signalling components of cell-surface receptors upon receptor engagement. Our previous studies have demonstrated that the GPI (glycosylphosphatidylinositol)-linked uPAR [uPA (urokinase-type plasminogen activator) receptor], which can be found in lipid rafts and in non-raft fractions, can mediate the differentiation of VSMCs (vascular smooth muscle cells) towards a pathophysiological de-differentiated phenotype. However, the mechanism by which uPAR and its ligand uPA regulate VSMC phenotypic changes is not known. In the present study, we provide evidence that the molecular machinery of uPAR-mediated VSMC differentiation employs lipid rafts. We show that the disruption of rafts in VSMCs by membrane cholesterol depletion using MCD (methyl-beta-cyclodextrin) or filipin leads to the up-regulation of uPAR and cell de-differentiation. uPAR silencing by means of interfering RNA resulted in an increased expression of contractile proteins. Consequently, disruption of lipid rafts impaired the expression of these proteins and transcriptional activity of related genes. We provide evidence that this effect was mediated by uPAR. Similar effects were observed in VSMCs isolated from Cav1(-/-) (caveolin-1-deficient) mice. Despite the level of uPAR being significantly higher after the disruption of the rafts, uPA/uPAR-dependent cell migration was impaired. However, caveolin-1 deficiency impaired only uPAR-dependent cell proliferation, whereas cell migration was strongly up-regulated in these cells. Our results provide evidence that rafts are required in the regulation of uPAR-mediated VSMC phenotypic modulations. These findings suggest further that, in the context of uPA/uPAR-dependent processes, caveolae-associated and non-associated rafts represent different signalling membrane domains.
Reliability of the Brazilian Portuguese version of the fatigue severity scale and its correlation with pulmonary function, dyspnea, and functional capacity in patients with COPD*
OBJECTIVE To describe the intra-rater and inter-rater reliability of the Brazilian Portuguese version of the fatigue severity scale (FSS) in patients with COPD and to identify the presence of its association with parameters of pulmonary function, dyspnea, and functional capacity. METHODS This was an observational cross-sectional study involving 50 patients with COPD, who completed the FSS in interviews with two researchers in two visits. The FSS scores were correlated with those of the Medical Research Council (MRC) scale, as well as with FEV1, FVC, and six-minute walk distance (6MWD). RESULTS The mean age of the patients was 69.4 ± 8.23 years, whereas the mean FEV1 was 46.5 ± 20.4% of the predicted value. The scale was reliable, with an intraclass correlation coefficient of 0.90 (95% CI, 0.81-0.94; p < 0.01). The FSS scores showed significant correlations with those of MRC scale (r = 0.70; p < 0.01), as well as with 6MWD (r = -0.77; p < 0.01), FEV1 (r = -0.38; p < 0.01), FVC (r = -0.35; p < 0.01), and stage of the disease in accordance with the Global Initiative for Chronic Obstructive Lung Disease criteria (r = 0.37; p < 0.01). CONCLUSIONS The Brazilian Portuguese version of the FSS proved reliable for use in COPD patients in Brazil and showed significant correlations with sensation of dyspnea, functional capacity, pulmonary function, and stage of the disease.
Factors influencing the ownership and utilization of long-lasting insecticidal nets for malaria prevention in Ethiopia
Utilization of long-lasting insecticidal nets (LLINs) is regarded as a key malaria prevention and control strategy. However, studies have reported a large gap in terms of both ownership and utilization, particularly in sub-Saharan Africa (SSA). With continual efforts to improve the use of LLINs and to progress towards malaria elimination, examining the factors influencing the ownership and usage of LLINs is of high importance. Therefore, the current study was conducted to examine the level of ownership and use of LLINs along with identification of associated factors at household level. A cross-sectional study was conducted in Mirab Abaya District, Southern Ethiopia in June and July 2014. A total of 540 households, with an estimated 2690 members, were selected in four kebeles of the district known to have high incidence of malaria. Trained data collectors interviewed household heads to collect information on the knowledge, ownership and utilization of LLINs, which was complemented by direct observation of the conditions and use of the nets through house-to-house visits. Bivariate and multivariable logistic regression analyses were used to determine factors associated with LLIN use. Of the 540 households intended to be included in the survey, 507 responded to the study (94.24% response rate), covering the homes of 2759 people. More than 58% of the households had a family size >5 (the regional average), and 60.2% of them had at least one child below the age of 5 years. The ownership of at least one LLIN among households surveyed was 89.9%, and the proportion of net-owning households using at least one LLIN during the night prior to the survey was 85.1% (n = 456). Only 36.7% (186) named at least the participant-mean number of the 14 possible malaria symptoms, and only 32.7% (166) knew at least the participant-mean number of possible preventive methods. Over 30% of nets owned by the households were out of use. After controlling for confounding factors, having two or more sleeping places (adjusted odds ratio [aOR] = 2.58, 95% CI 1.17, 5.73), knowledge that LLINs prevent malaria (aOR = 2.51, 95% CI 1.17, 5.37), the presence of hanging bed nets (aOR = 19.24, 95% CI 9.24, 40.07) and walls of the house plastered or painted >12 months ago (aOR = 0.09, 95% CI 0.01, 0.71) were important predictors of LLIN utilization. This study found a higher proportion of LLIN ownership and utilization by households than had previously been found in similar studies in Ethiopia, and in many studies in SSA. However, poor knowledge of the transmission mechanisms and symptoms of malaria, and of vector control measures to prevent malaria, was evident. A moderate proportion of nets were found to be out of use or in poor repair. Efforts should be made to maintain the current rate of LLIN utilization in the district and to address the identified gaps in order to support the elimination of malaria.
Evaluation of a psychoeducation program for Chinese clients with schizophrenia and their family caregivers.
OBJECTIVES To evaluate the effectiveness of a psychoeducation program for Chinese clients with schizophrenia and their family caregivers. METHODS A randomized controlled trial was conducted. Seventy-three clients with a diagnosis of schizophrenia and their caregivers (n=73) were recruited and randomized into a study (n=36) and control group (n=37). Ten psychoeducation sessions were provided to the study group. The outcomes were measured at the baseline, immediately after (post-1), six months (post-2), and 12 months after the intervention (post-3). RESULTS There were significant treatment effects across time for all client outcomes: adherence to medication (p<0.01), mental status (p<0.01), and insight into illness (p<0.01). However, no significant differences were found between groups at the post-3 measures for all client outcomes. For the caregivers, significant group differences were only detected in self-efficacy at the post-1 (p=0.007) and post-2 (p<0.001) measures, the level of satisfaction at the post-1 (p=0.033) and post-2 (p<0.021) measures, and the perception of family burden at the post-2 measures (p=0.043). CONCLUSION A psychoeducation intervention had positive effects on Chinese clients and their caregivers. However, these effects might not be sustained 12 months after the intervention. PRACTICE IMPLICATIONS To substantiate its effects, psychoeducation should be an ongoing intervention, with its outcomes constantly evaluated.
Cache based recurrent neural network language model inference for first pass speech recognition
Recurrent neural network language models (RNNLMs) have recently produced improvements on language processing tasks ranging from machine translation to word tagging and speech recognition. To date, however, the computational expense of RNNLMs has hampered their application to first pass decoding. In this paper, we show that by restricting the RNNLM calls to those words that receive a reasonable score according to an n-gram model, and by deploying a set of caches, we can reduce the cost of using an RNNLM in the first pass to that of using an additional n-gram model. We compare this scheme to lattice rescoring, and find that they produce comparable results for a Bing Voice Search task. The best performance results from rescoring a lattice that is itself created with an RNNLM in the first pass.
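The gating-plus-caching idea can be illustrated as below; the n-gram threshold, the stand-in ngram_logp and rnn_step functions, and the memoization by (state, word) are assumptions for the sketch, not the recognizer's actual interfaces or cache design.

```python
# Sketch of gating and caching RNNLM calls during first-pass decoding.
import math
from functools import lru_cache

NGRAM_FLOOR = math.log(1e-4)          # assumed pruning threshold

def ngram_logp(history, word):        # stand-in n-gram model
    return math.log(0.01)

def rnn_step(state, word):            # stand-in RNNLM: returns (new_state, logp)
    return state + (word,), math.log(0.02)

@lru_cache(maxsize=100_000)           # memoize RNNLM evaluations by (state, word)
def rnn_logp(state, word):
    return rnn_step(state, word)

def score_word(state, history, word):
    ng = ngram_logp(history, word)
    if ng < NGRAM_FLOOR:              # cheap gate: skip the expensive RNNLM call
        return state, ng              # fall back to the n-gram score, state unchanged
    new_state, lp = rnn_logp(state, word)
    return new_state, lp

state, hist = (), ()
for w in ["the", "cat", "sat"]:
    state, lp = score_word(state, hist, w)
    hist = hist + (w,)
print(lp)
```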
Microstructure of (Ti, Si, Al)N nanocomposite coatings
(Ti,Si,Al)N nanocomposite coatings were prepared by a combination of r.f. and d.c. reactive magnetron sputtering. The composition of the films was evaluated by electron probe microanalysis (EPMA) and Rutherford backscattering spectrometry (RBS), and the structure by X-ray diffraction (XRD) and high-resolution transmission electron microscopy (HRTEM). XRD experiments showed the development of crystalline phases whose structure is very similar to that of bulk TiN. Diffracted peak positions revealed changes of the lattice parameter from 0.418 to 0.429 nm when the ion adatom mobility was enhanced. The lowest lattice parameter corresponds to a Ti–Si–Al–N phase where some of the Si and Al atoms occupy Ti positions in the fcc TiN lattice, while the highest corresponds to a system where partial Si segregation has occurred, sufficient to nucleate and develop an amorphous Si3N4 phase. Cross-sectional HRTEM images of samples grown under high adatom mobility showed grains with a diameter between 3 and 10 nm surrounded by an amorphous layer, while for the samples grown under limited adatom mobility conditions grains with a diameter between 18 and 28 nm were observed. Furthermore, through the visualization of bright-field images it was possible to discern a columnar structure.
A simple method for quantitative determination of polysaccharides in fungal cell walls
A simple and reliable method for quantitative determination of cell wall polymers in fungal cells, with an s.e.m. of 5%, is described. This protocol is based on the hydrolysis by sulfuric acid of β-glucan, mannan, galactomannan and chitin present at different levels in the wall of yeasts and filamentous fungi into their corresponding monomers glucose, mannose, galactose and glucosamine. The released monosaccharides are subsequently separated and quantified by high-performance ionic chromatography coupled to pulse amperometry detection, with a detection limit of 1.0 μg ml⁻¹. This procedure is well suited to screening a large collection of yeast mutants or to evaluating effects of environmental conditions on cell wall polysaccharide content. This procedure is also applicable to other fungal species, including Schizosaccharomyces pombe, Candida albicans and Aspergillus fumigatus. Results can be obtained in 3 d.
Active current balancing for parallel-connected silicon carbide MOSFETs
In high power applications of silicon carbide (SiC) MOSFETs where parallelism is employed, current unbalance can occur and affect the performance and reliability of the power devices. In this paper, factors which cause current unbalance in these devices are analyzed. Among them, the threshold voltage mismatch is identified as a major factor for dynamic current unbalance. The threshold distribution of SiC MOSFETs is investigated, and its effect on current balance is studied in experiments. Based on these analyses, an active current balancing scheme is proposed. It is able to sense the unbalanced current and eliminate it by actively controlling the gate drive signal to each device. The features of fine time resolution and low complexity make this scheme attractive to a wide variety of wide-band-gap device applications. Experimental and simulation results verify the feasibility and effectiveness of the proposed scheme.
Effects of joint torque constraints on humanoid robot balance recovery in the presence of external disturbance
A humanoid robot should be able to keep its balance under unexpected disturbances and, according to biomechanics research, can adopt three strategies to recover balance: the ankle strategy, the hip strategy, and the step strategy. In this paper, the relationship between limited joint torque and the balance recovery strategy is analyzed using the Zero Moment Point Manipulability Ellipsoid. Furthermore, during balance control, the constraints between the feet and the ground must be maintained. How the satisfaction of these constraints, namely the gravity constraints, the friction constraints and the CoP constraints, imposes bounds on the control torque is investigated. Such control bounds have significant effects on designing balance recovery strategies and can be used to predict the type of fall.
Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design
Flow-based generative models are powerful exact likelihood models with efficient sampling and inference. Despite their computational efficiency, flow-based models generally have much worse density modeling performance compared to state-of-the-art autoregressive models. In this paper, we investigate and improve upon three limiting design choices employed by flow-based models in prior work: the use of uniform noise for dequantization, the use of inexpressive affine flows, and the use of purely convolutional conditioning networks in coupling layers. Based on our findings, we propose Flow++, a new flow-based model that is now the state-of-the-art non-autoregressive model for unconditional density estimation on standard image benchmarks. Our work has begun to close the significant performance gap that has so far existed between autoregressive models and flow-based models. Our implementation is available at https://github.com/aravind0706/flowpp.
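The variational dequantization mentioned above rests on a standard bound for discrete data x, written here in generic notation rather than the paper's:

```latex
% Dequantization bound for discrete pixels x in {0, ..., 255}^D:
%   uniform dequantization:      u ~ Uniform[0,1)^D
%   variational dequantization:  u ~ q(u | x), a learned conditional flow
\log P(x) \;=\; \log \int_{[0,1)^D} p(x+u)\,du
       \;\ge\; \mathbb{E}_{u \sim q(\cdot \mid x)}
       \!\left[\log \frac{p(x+u)}{q(u \mid x)}\right],
```

Uniform dequantization is the special case q(u|x) = 1 on the unit hypercube, so learning q can only tighten the bound.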
Double-Blind Multicentre Isradipine Dose-Confirmation Study in Pakistan
Isradipine 2.5mg twice daily was more effective than the 1.25mg twice daily dose in achieving normalisation of blood pressure and overall response. Larger reductions in blood pressure occurred sooner after starting isradipine 2.5mg twice daily than after the 1.25mg twice daily dose. Thus, this study confirms the results of European and American studies. Isradipine 2.5mg twice daily appears to be the optimal dosage for Pakistani patients with hypertension.
A reconfigurable patch antenna using switchable slots for circular polarization diversity
A novel design of a microstrip patch antenna with switchable slots (PASS) is proposed to achieve circular polarization diversity. Two orthogonal slots are incorporated into the patch and two pin diodes are utilized to switch the slots on and off. By turning the diodes on or off, this antenna can radiate with either right hand circular polarization (RHCP) or left hand circular polarization (LHCP) using the same feeding probe. Experimental results validate this concept. This design demonstrates useful features for wireless communication applications and future planetary missions.
Promoting healthy growth: what are the priorities for research and action?
Healthy growth from conception through the first 2 y of life is the foundation for adequate organ formation and function, a strong immune system, physical health, and neurological and cognitive development. Recent studies identified several low-cost interventions to address undernutrition during this age period and noted the lower returns on investment of intervening after this critical period. Although these interventions should be implemented widely, it is recognized that existing nutrition solutions, even if universally applied, would only avert a minority fraction of the estimated death and disability due to undernutrition. This paper reviews some of the knowledge and learning needed to close this "impact gap." Five areas are prioritized for future research: 1) study healthy growth from a lifecycle perspective, because maternal, fetal, and newborn outcomes are connected; 2) understand why growth faltering begins so early in breast-fed infants in the developing world; 3) apply new tools and technologies to study long-recognized problems such as the interaction between nutrition and infection; 4) explore new hypotheses for understanding nutrient assimilation and use to discover and develop intervention leads; and 5) understand the role of the environment in healthy growth and the potential synergistic benefits of multi-sectoral interventions. Policymakers are urged to invest in nutrition-specific and -sensitive interventions to promote healthy growth from conception through the first 2 y of life because of their immediate and long-term health and development benefits.
Species and Evolution in Clonal Organisms - Summary and Discussion
This paper briefly summarizes the major points made in nine talks at the symposium on "Species and evolution in clonal organisms" sponsored by the American Society of Zoologists, the Society of Systematic Zoology, and the California Academy of Sciences on 28 December 1988, in San Francisco, California (table 1). The summary is followed by an account of a roundtable discussion at the end of the symposium, during which questions from the audience were addressed to symposium participants. We believe that the discussion, in particular, offers valuable insight into many complex problems encountered in research on species and their evolutionary significance, and hope that it will be especially useful in the design and organization of future research in this area.
WADE: Writeback-aware dynamic cache management for NVM-based main memory system
Emerging Non-Volatile Memory (NVM) technologies are explored as potential alternatives to traditional SRAM/DRAM-based memory architecture in future microprocessor design. One of the major disadvantages for NVM is the latency and energy overhead associated with write operations. Mitigation techniques to minimize the write overhead for NVM-based main memory architecture have been studied extensively. However, most prior work focuses on optimization techniques for NVM-based main memory itself, with little attention paid to cache management policies for the Last-Level Cache (LLC). In this article, we propose a Writeback-Aware Dynamic CachE (WADE) management technique to help mitigate the write overhead in NVM-based memory. The proposal is based on the observation that, when dirty cache blocks are evicted from the LLC and written into NVM-based memory (with PCM as an example), the long latency and high energy associated with write operations to NVM-based memory can cause system performance/power degradation. Thus, reducing the number of writeback requests from the LLC is critical. The proposed WADE cache management technique tries to keep highly reused dirty cache blocks in the LLC. The technique predicts blocks that are frequently written back in the LLC. The LLC sets are dynamically partitioned into a frequent writeback list and a nonfrequent writeback list, and the technique maintains the best size for each list. Our evaluation shows that the technique can reduce the number of writeback requests by 16.5% for memory-intensive single-threaded benchmarks and 10.8% for multicore workloads. It yields a geometric mean speedup of 5.1% for single-thread applications and 7.6% for multicore workloads. Due to the reduced number of writeback requests to main memory, the technique reduces the energy consumption by 8.1% for single-thread applications and 7.6% for multicore workloads.
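A highly simplified sketch of writeback-aware victim selection is shown below; the per-block 2-bit counter and the eviction priority are illustrative assumptions, whereas WADE itself dynamically partitions each set into frequent and non-frequent writeback lists and tunes their sizes.

```python
# Illustrative writeback-aware victim choice: dirty blocks predicted to be
# written back frequently are protected within the set (simplified assumptions).
from collections import defaultdict

wb_counter = defaultdict(int)              # per-block saturating writeback counter

def on_writeback(tag):
    wb_counter[tag] = min(wb_counter[tag] + 1, 3)

def is_frequent_wb(tag):
    return wb_counter[tag] >= 2

def choose_victim(cache_set):
    """cache_set: list of dicts {tag, dirty, lru} (higher lru = older).
    Prefer clean blocks, then non-frequent dirty blocks, then frequent dirty ones."""
    def priority(blk):
        if not blk["dirty"]:
            return (0, -blk["lru"])
        if not is_frequent_wb(blk["tag"]):
            return (1, -blk["lru"])
        return (2, -blk["lru"])
    return min(cache_set, key=priority)["tag"]

on_writeback(0xA); on_writeback(0xA)       # block 0xA becomes "frequent writeback"
cache_set = [{"tag": 0xA, "dirty": True, "lru": 3},
             {"tag": 0xB, "dirty": True, "lru": 2},
             {"tag": 0xC, "dirty": False, "lru": 1}]
print(hex(choose_victim(cache_set)))       # -> 0xc (clean block evicted first)
```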
The Edinburgh Twitter Corpus
We describe the first release of our corpus of 97 million Twitter posts. We believe that this data will prove valuable to researchers working in social media, natural language processing, large-scale data processing, and similar areas.
Schottky barrier characterization of lead phthalocyanine/aluminium interfaces
A thin sandwich film structure consisting of several cells is fabricated by successive thermal sublimation of aluminium, lead phthalocyanine and aluminium under high-vacuum conditions (10⁻⁴ Pa). The dark current density–voltage (J–V) characteristics indicate that rectifying junctions exist at the PbPc/Al interface. Devices exposed to oxygen were found to exhibit enhanced Schottky-type behaviour. The dependence of capacitance and conductance on frequency and temperature is also investigated, and the measurements are quantitatively interpreted using an equivalent circuit model. The structural properties of the lead phthalocyanine film were studied using X-ray diffraction techniques.
Low-dose thalidomide in the treatment of refractory myeloma.
history of granulomatosis, hepatosplenomegaly, HBsAg and EBV-VCA positivity. The only surviving patient had received an allogeneic bone-marrow transplantation. Our two children had been infected by EBV with a chronic high titer of IgG against EBV-VCA, consistent with a life-long infection.9 EBV may infect T-cells creating reactive lymphoid proliferations without contributing to the neoplastic process; however EBV or other viral infections (i.e. Cytomegalovirus, hepatitis B), or immune defects suppressing NK activity – such as altered T4/T8 ratio – may lead to neoplastic transformation in particular hosts, and this could be the case with our patients. All available lymph node samples were therefore submitted to EBV-DNA detection without finding amplification of the viral genome. The presence of EBV-DNA is a feature of angioimmunoblastic PTCL found in Eastern Countries, being less common in Europe.3,10 As to optimal treatment, we agree with the Taiwan group:3 marrow transplantation can cure this disease. The role of retinoic acid cannot be assessed by anecdotal experiences, even though malignant cell differentiation and apoptosis might depend on retinoic acid administration.10 No national or international co-operative group is currently dealing with poor-prognosis PTCL: a common study is therefore needed, perhaps including both adults and children.
Toward Massive Machine Type Cellular Communications
Cellular networks have been engineered and optimized to carry ever-increasing amounts of mobile data, but over the last few years, a new class of applications based on machine-centric communications has begun to emerge. Automated devices such as sensors, tracking devices, and meters, often referred to as machine-to-machine (M2M) or machine-type communications (MTC), introduce an attractive revenue stream for mobile network operators, if a massive number of them can be efficiently supported. The novel technical challenges posed by MTC applications include increased overhead and control signaling as well as diverse application-specific constraints such as ultra-low complexity, extreme energy efficiency, critical timing, and continuous data intensive uploading. This article explains the new requirements and challenges that large-scale MTC applications introduce, and provides a survey of key techniques for overcoming them. We focus on the potential of 4.5G and 5G networks to serve both the high data rate needs of conventional human-type communication (HTC) subscribers and the forecasted billions of new MTC devices. We also opine on attractive economic models that will enable this new class of cellular subscribers to grow to its full potential.
Anaplastic large cell lymphoma Hodgkin's-like: a randomized trial of ABVD versus MACOP-B with and without radiation therapy.
During the last few years, morphological, immunohistochemical, and genetic findings have placed anaplastic large cell lymphoma (ALCL) as a distinct clinicopathologic entity, and several reports have focused on the existence of different subtypes of the tumor. Particular attention has been paid to the ALCL-Hodgkin's-like (HL) subtype, which seems to be on the border between Hodgkin's disease (HD) and high-grade non-Hodgkin's lymphoma (HG-NHL). From September 1994 to July 1997, during the course of an Italian multicentric trial, 40 ALCL-HLs were randomized to receive as front-line chemotherapy MACOP-B (methotrexate with leucovorin, doxorubicin, cyclophosphamide, vincristine, prednisone, and bleomycin-a third-generation HG-NHL regimen) or ABVD (doxorubicin, bleomycin, vinblastine, and dacarbazine-a scheme specific for HD). All patients with bulky disease in the mediastinum at diagnosis underwent local radiotherapy after the chemotherapeutic program. Complete response (CR) was achieved in 17 of the 19 (90%) patients who were treated with MACOP-B, and in 19 of the 21 (91%) patients who were administered ABVD. The probability of relapse-free survival, projected at 32 months, was 94% for the MACOP-B subset and 91% for the ABVD subset. The majority of patients with mediastinal bulky disease obtained CR (evaluated with 67Ga single photon emission computed tomography [SPECT]) after their radiotherapy. The present study suggests that ALCL-HL, in line with its borderline status, responds in an equivalent way to third-generation chemotherapy for HG-NHL and to conventional HD treatment in terms of both CR and relapse-free survival rates. However, as to the latter, a longer follow-up period may be needed before stating the absolute equivalence of the two regimens used.
Do bosons obey Bose–Einstein distribution: Two iterated limits of Gentile distribution
It is a common impression that by only setting the maximum occupation number to infinity, which is required by the indistinguishability of bosons, one can achieve the statistical distribution that bosons obey, the Bose–Einstein distribution. In this Letter, however, we show that an infinite maximum occupation number alone does not uniquely lead to the Bose–Einstein distribution, since the derivation of the Bose–Einstein distribution encounters a problem of iterated limits. To achieve the Bose–Einstein distribution, one needs to take both the maximum occupation number and the total number of particles to infinity, and then the problem of the order of taking limits arises. Different orders of the limit operations will lead to different statistical distributions. To achieve the Bose–Einstein distribution, besides taking the maximum occupation number to infinity, we also need to specify the order of the limit operations.
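For concreteness, the Gentile occupation number with maximum occupation q takes the standard textbook form below (not quoted from the Letter); the second term vanishes as q tends to infinity only under the stated condition, which is where the order of limits becomes important:

```latex
% Gentile statistics with maximum occupation number q (standard form):
\langle n_\varepsilon \rangle_q
  = \frac{1}{e^{\beta(\varepsilon-\mu)} - 1}
  - \frac{q+1}{e^{(q+1)\beta(\varepsilon-\mu)} - 1},
\qquad
\lim_{q\to\infty} \langle n_\varepsilon \rangle_q
  = \frac{1}{e^{\beta(\varepsilon-\mu)} - 1}
  \quad \text{for } \beta(\varepsilon-\mu) > 0,
```

so the Bose–Einstein form is recovered only when the occupation-number limit is taken in the appropriate order relative to the thermodynamic limit.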
Antibiotics in general practice.
Phase II study of vicriviroc versus efavirenz (both with zidovudine/lamivudine) in treatment-naive subjects with HIV-1 infection.
BACKGROUND Vicriviroc (VCV) is a CCR5 antagonist with nanomolar activity against human immunodeficiency virus (HIV) replication in vitro and in vivo. We report the results of a phase II dose-finding study of VCV plus dual nucleoside reverse-transcriptase inhibitors (NRTIs) in the treatment-naive HIV-1-infected subjects. METHODS This study was a randomized, double-blind, placebo-controlled trial that began with a 14-day comparison of 3 dosages of VCV with placebo in treatment-naive subjects infected with CCR5-using HIV-1. After 14 days of monotherapy, lamivudine/zidovudine was added to the VCV arms; subjects receiving placebo were treated with efavirenz and lamivudine/zidovudine; the planned treatment duration was 48 weeks. RESULTS Ninety-two subjects enrolled. After 14 days of once-daily monotherapy, the mean viral loads decreased from baseline values by 0.07 log(10) copies/mL in the placebo arm, 0.93 log(10) copies/mL in the VCV 25 mg arm, 1.18 log(10) copies/mL in the VCV 50 mg arm, and 1.34 log(10) copies/mL in the VCV 75 mg arm (P < .001 for each VCV arm vs. the placebo arm). The combination-therapy portion of the study was stopped because of increased rates of virologic failure in the VCV 25 mg/day arm (relative hazard [RH], 21.6; 95% confidence interval [CI], 2.8-168.9) and the VCV 50 mg/day arm (RH, 11.7; 95% CI, 1.5-92.9), compared with that in the control arm. CONCLUSIONS VCV administered with dual NRTIs in treatment-naive subjects with HIV-1 infection had increased rates of virologic failure, compared with efavirenz plus dual NRTIs. No treatment-limiting toxicity was observed. Study of higher doses of VCV as part of combination therapy is warranted.
Attaining human-level performance for anatomical landmark detection in 3D CT data
We present an efficient neural network approach for locating anatomical landmarks, using a two-pass, two-resolution cascaded approach which leverages a mechanism we term atlas location autocontext. Location predictions are made by regression to Gaussian heatmaps, one heatmap per landmark. This system allows patchwise application of a shallow network, thus enabling the prediction of multiple volumetric heatmaps in a unified system, without prohibitive GPU memory requirements. Evaluation is performed for 22 landmarks defined on a range of structures in head CT scans and the neural network model is benchmarked against a previously reported decision forest model trained with the same cascaded system. Models are trained and validated on 201 scans. Over the final test set of 20 scans which was independently annotated by 2 observers, we show that the neural network reaches an accuracy which matches the annotator variability.
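The heatmap-regression target described above can be sketched as follows; the grid size and σ are assumptions, and decoding a prediction is simply an argmax over the heatmap.

```python
# Gaussian heatmap regression target for one landmark in a (downsampled) volume.
import numpy as np

def gaussian_heatmap(shape, center_vox, sigma=3.0):
    """shape: (D, H, W); center_vox: landmark position in voxel coordinates."""
    zz, yy, xx = np.meshgrid(*[np.arange(s) for s in shape], indexing="ij")
    d2 = ((zz - center_vox[0]) ** 2 +
          (yy - center_vox[1]) ** 2 +
          (xx - center_vox[2]) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))          # peak value 1 at the landmark

heatmap = gaussian_heatmap((32, 32, 32), (10, 16, 20))
pred = np.unravel_index(np.argmax(heatmap), heatmap.shape)   # decode: argmax -> voxel
print(pred)                                                   # -> (10, 16, 20)
```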
The neurophysiology of P300: an integrated review.
Event-related potentials (ERPs) are very small voltages recorded from the scalp which originate in brain structures in response to specific events or stimuli. They appear as a series of peaks and troughs interspersed in the electroencephalogram (EEG) waves. The exact neural origins and neuropsychological meaning of the P300 are imprecisely known, even though appreciable progress has been made in the last 25 years. In this review, we focus on the possible neural generators of this potential. Given the attention and memory operations associated with P300 generation, the first human studies on the neural origins of this ERP focused on the hippocampal formation using depth electrodes implanted to assess sources of epileptic foci in patients. Other lesion studies have found that the integrity of the temporal-parietal lobe junction is involved with either generation or transmission processes subsequent to hippocampal activity and contributes to ERP measures. These findings imply that hippocampal absence does not eliminate the P300, but that the temporal-parietal junction does affect its production. As outlined above, the neuroelectric events that underlie P300 generation stem from the interaction between frontal lobe and hippocampal/temporal-parietal function. ERP and fMRI studies using oddball tasks have obtained patterns consistent with this frontal-to-temporal and parietal lobe activation pattern. Further support comes from magnetic resonance imaging (MRI) of gray matter volumes suggesting that individual variation in P3a amplitude from distracter stimuli is correlated with frontal lobe area size, whereas P3b amplitude from target stimuli is correlated with parietal area size. Given distinct neuropsychological correlates for P3a and P3b, different neurotransmitters may be engaged for each constituent subcomponent under specific stimulus/task processing requirements. Available data suggest that dopaminergic/frontal processes for P3a and locus coeruleus-norepinephrine/parietal activity for P3b are reasonable to propose. This dual-transmitter P300 hypothesis is speculative but appears to account for a variety of findings and provides a useful framework for evaluating drug effects.
The mechanics of state-dependent neural correlations
Simultaneous recordings from large neural populations are becoming increasingly common. An important feature of population activity is the trial-to-trial correlated fluctuation of spike train outputs from recorded neuron pairs. Similar to the firing rate of single neurons, correlated activity can be modulated by a number of factors, from changes in arousal and attentional state to learning and task engagement. However, the physiological mechanisms that underlie these changes are not fully understood. We review recent theoretical results that identify three separate mechanisms that modulate spike train correlations: changes in input correlations, internal fluctuations and the transfer function of single neurons. We first examine these mechanisms in feedforward pathways and then show how the same approach can explain the modulation of correlations in recurrent networks. Such mechanistic constraints on the modulation of population activity will be important in statistical analyses of high-dimensional neural data.
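As a concrete illustration of the quantity under discussion, the following is a minimal sketch of a trial-to-trial spike count correlation for one neuron pair, using a toy common-input model. The Poisson surrogate data, counting window, and rate values are illustrative assumptions, not taken from the review.

```python
# Toy model: two neurons share a fluctuating common drive, which induces
# correlated trial-to-trial spike count fluctuations on top of private Poisson noise.
import numpy as np

rng = np.random.default_rng(0)

def spike_count_correlation(counts_i, counts_j):
    """Pearson correlation of spike counts across trials for one neuron pair."""
    return np.corrcoef(counts_i, counts_j)[0, 1]

n_trials = 2000
shared_rate = rng.gamma(shape=5.0, scale=4.0, size=n_trials)   # common drive (Hz)
window = 0.2                                                    # counting window (s)
counts_a = rng.poisson(shared_rate * window)
counts_b = rng.poisson(shared_rate * window)

print(f"spike count correlation: {spike_count_correlation(counts_a, counts_b):.3f}")
```

In this toy setting, strengthening or weakening the shared drive relative to the private noise changes the measured correlation, which is one of the routes by which state changes can modulate correlated variability.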
Sleep-disordered breathing in heart failure.
Sleep-disordered breathing, comprising obstructive sleep apnoea (OSA), central sleep apnoea (CSA), or a combination of the two, is found in over half of heart failure (HF) patients and may have harmful effects on cardiac function, with swings in intrathoracic pressure (and therefore preload and afterload), blood pressure, sympathetic activity, and repetitive hypoxaemia. It is associated with reduced health-related quality of life, higher healthcare utilization, and a poor prognosis. Whilst continuous positive airway pressure (CPAP) is the treatment of choice for patients with daytime sleepiness due to OSA, the optimal management of CSA remains uncertain. There is much circumstantial evidence that the treatment of OSA in HF patients with CPAP can improve symptoms, cardiac function, biomarkers of cardiovascular disease, and quality of life, but the quality of evidence for an improvement in mortality is weak. For systolic HF patients with CSA, the CANPAP trial did not demonstrate an overall survival or hospitalization advantage for CPAP. A minute ventilation-targeted positive airway therapy, adaptive servoventilation (ASV), can control CSA and improves several surrogate markers of cardiovascular outcome, but in the recently published SERVE-HF randomized trial, ASV was associated with significantly increased mortality and no improvement in HF hospitalization or quality of life. Further research is needed to clarify the therapeutic rationale for the treatment of CSA in HF. Cardiologists should have a high index of suspicion for sleep-disordered breathing in those with HF, and work closely with sleep physicians to optimize patient management.
Behavior-Based Network Access Control: A Proof-of-Concept
Current NAC technologies implement a pre-connect phase where the status of a device is checked against a set of policies before being granted access to a network, and a post-connect phase that examines whether the device complies with the policies that correspond to its role in the network. In order to enhance current NAC technologies, we propose a new architecture based on behaviors rather than roles or identity, where the policies are automatically learned and updated over time by the members of the network in order to adapt to behavioral changes of the devices. Behavior profiles may be presented as identity cards that can change over time. By incorporating an Anomaly Detector (AD) into the NAC server or each of the hosts, their behavior profile is modeled and used to determine the type of behaviors that should be accepted within the network. These models constitute behavior-based policies. In our enhanced NAC architecture, global decisions are made using a group voting process. Each host's behavior profile is used to compute a partial decision for or against the acceptance of a new profile or traffic. The aggregation of these partial votes amounts to the model-group decision. This voting process makes the architecture more resilient to attacks. Even after accepting a certain percentage of malicious devices, the enhanced NAC is able to compute an adequate decision. We provide proof-of-concept experiments of our architecture using web traffic from our department network. Our results show that the model-group decision approach based on behavior profiles has a 99% detection rate of anomalous traffic with a false positive rate of only 0.005%. Furthermore, the architecture achieves short latencies for both the pre- and post-connect phases.
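To illustrate the group voting step, the sketch below aggregates per-host partial decisions about a candidate behavior profile into a model-group decision. The anomaly scoring function, thresholds, and quorum are assumptions made for the example; they are not the paper's implementation.

```python
# Illustrative sketch of group voting over per-host anomaly decisions.
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class HostDetector:
    name: str
    score: Callable[[dict], float]   # anomaly score for a candidate profile (assumed)
    threshold: float = 0.5           # per-host acceptance threshold (assumed)

    def vote(self, profile: dict) -> bool:
        """Partial decision: True means this host accepts the candidate profile."""
        return self.score(profile) < self.threshold

def group_decision(detectors: Sequence[HostDetector], profile: dict,
                   quorum: float = 0.5) -> bool:
    """Accept the profile if the fraction of accepting hosts exceeds the quorum."""
    votes = [d.vote(profile) for d in detectors]
    return sum(votes) / len(votes) > quorum

if __name__ == "__main__":
    # Toy anomaly score: distance of the candidate's request rate from each
    # host's learned baseline (made-up numbers).
    def make_scorer(baseline_rate: float) -> Callable[[dict], float]:
        return lambda p: abs(p["requests_per_min"] - baseline_rate) / 100.0

    hosts = [HostDetector(f"host{i}", make_scorer(60.0 + 5 * i)) for i in range(5)]
    print(group_decision(hosts, {"requests_per_min": 70}))    # accepted by the group
    print(group_decision(hosts, {"requests_per_min": 900}))   # rejected by the group
```

Because the decision depends on a quorum of independently learned profiles rather than any single host, a minority of compromised voters does not immediately flip the outcome, which is the resilience property claimed above.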
In vitro susceptibility of Candida albicans to four disinfectants and their combinations.
AIM The aim of this study was to evaluate the susceptibility of seven strains of Candida albicans to four disinfectants: iodine potassium iodide, chlorhexidine acetate, sodium hypochlorite and calcium hydroxide. In addition, all possible pairs of the disinfectants were tested in order to compare the effect of the combination and its components. METHODOLOGY Filter paper discs were immersed in standardized yeast suspensions and then transferred to disinfectant solutions of different concentrations and incubated at 37 degrees C for 30 s, 5 min, 1 h and 24 h. After incubation the filter paper discs were transferred to vials with PBS and glass beads that were then vigorously shaken for dispersal of the yeast cells. PBS with resuspended yeasts was serially diluted 10-fold. Droplets of 25 microL from each dilution were inoculated on TSB agar plates and incubated in air at 37 degrees C for 24 h. The number of colony-forming units was then calculated from appropriate dilutions. RESULTS C. albicans cells were highly resistant to calcium hydroxide. Sodium hypochlorite (5% and 0.5%) and iodine (2%) potassium iodide (4%) killed all yeast cells within 30 s, whilst chlorhexidine acetate (0.5%) showed complete killing after 5 min. Combinations of disinfectants were equally or less effective than the more effective component. All C. albicans strains tested showed similar susceptibility to the medicaments tested. CONCLUSIONS This study indicates that sodium hypochlorite, iodine potassium iodide and chlorhexidine acetate are more effective than calcium hydroxide against C. albicans in vitro. However, combining calcium hydroxide with sodium hypochlorite or chlorhexidine may provide a wide-spectrum antimicrobial preparation with a long-lasting effect.
Effect of a beta 2-agonist (broxaterol) on respiratory muscle strength and endurance in patients with COPD with irreversible airway obstruction.
The effect of broxaterol, a new beta 2-agonist, on respiratory muscle endurance and strength was studied in a double-blind, placebo-controlled, randomized crossover clinical trial in 16 patients with chronic obstructive pulmonary disease (COPD) with irreversible airway obstruction (FEV1 = 57.1 percent of predicted). One patient withdrew from the study because of acute respiratory exacerbation. Inspiratory muscle strength was assessed by maximal inspiratory pressure (MIP), and endurance time was determined as the length of time a subject could breathe against inspiratory resistance (target mouth pressure = 70 percent of MIP, Ti/Ttot = 0.4). Broxaterol (B) or placebo (P) was given orally for seven days at a dose of 0.5 mg three times a day, with a washout period of 72 h between study treatments. Measurements were performed before administration of B or P and 2 h (six patients) or 8 h (nine patients) after the end of each treatment. No significant changes in FEV1 or FRC were observed after B or P, suggesting that diaphragmatic length was maintained constant with each treatment. MIP did not change significantly, while endurance time increased after B in the patients tested at 2 h (from 234.8 +/- 48.1 s to 284.0 +/- 48.0 s, p less than 0.05) and at 8 h (from 187.2 +/- 31.1 s to 258.2 +/- 40.4 s, p less than 0.005). No changes were observed after P. Minute ventilation, airway occlusion pressure (P0.1), and integrated electromyographic activities of the diaphragm (Edi) and intercostal parasternal muscles (Eic) (normalized to the value obtained during MIP) showed no change during the endurance run with either treatment. We conclude that in a group of COPD patients with irreversible airway obstruction, B significantly improves respiratory muscle endurance, and that this does not arise from an effect on neuromuscular drive or pulmonary mechanics, but may be mediated by peripheral factors.