title | abstract |
---|---|
Functional roles of nucleosome stability and dynamics. | The nucleosome is a histone-DNA complex that forms the fundamental repeating unit of chromatin. Up to 90% of eukaryotic DNA is wrapped around consecutive octamers made of the core histones H2A, H2B, H3 and H4. Nucleosome positioning affects numerous cellular processes that require robust and timely access to genomic DNA, which is packaged into the tight confines of the cell nucleus. In living cells, nucleosome positions are determined by intrinsic histone-DNA sequence preferences, competition between histones and other DNA-binding proteins for genomic sequence, and ATP-dependent chromatin remodelers. We discuss the major energetic contributions to nucleosome formation and remodeling, focusing especially on partial DNA unwrapping off the histone octamer surface. DNA unwrapping enables efficient access to nucleosome-buried binding sites and mediates rapid nucleosome removal through the concerted action of two or more DNA-binding factors. High-resolution, genome-scale maps of distances between neighboring nucleosomes have shown that DNA unwrapping and nucleosome crowding (mutual invasion of nucleosome territories) are much more common than previously thought. Ultimately, constraints imposed by nucleosome energetics on the rates of ATP-dependent and spontaneous chromatin remodeling determine nucleosome occupancy genome-wide, and shape pathways of cellular response to environmental stresses. |
A New Softmax Operator for Reinforcement Learning | A softmax operator applied to a set of values acts somewhat like the maximization function and somewhat like an average. In sequential decision making, softmax is often used in settings where it is necessary to maximize utility but also to hedge against problems that arise from putting all of one’s weight behind a single maximum utility decision. The Boltzmann softmax operator is the most commonly used softmax operator in this setting, but we show that this operator is prone to misbehavior. In this work, we study an alternative softmax operator that, among other properties, is both a non-expansion (ensuring convergent behavior in learning and planning) and differentiable (making it possible to improve decisions via gradient descent methods). We provide proofs of these properties and present empirical comparisons between various softmax operators. |
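The contrast this abstract draws is easy to sketch in code. Below is a hedged comparison of the Boltzmann softmax with a log-sum-exp operator of the kind the paper advocates (the mellowmax family); the parameter values are illustrative only.

```python
import math

def boltzmann(values, beta):
    """Boltzmann softmax: an exp(beta*x)-weighted average of the values.
    As the abstract notes, it is not a non-expansion for all beta,
    which can cause non-convergent learning dynamics."""
    weights = [math.exp(beta * v) for v in values]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

def mellowmax(values, omega):
    """Log-sum-exp operator: mm_w(x) = log((1/n) * sum exp(w*x_i)) / w.
    A non-expansion and differentiable; tends to max(x) as omega -> inf
    and to the arithmetic mean as omega -> 0."""
    n = len(values)
    m = max(values)  # subtract the max for numerical stability
    return m + math.log(sum(math.exp(omega * (v - m)) for v in values) / n) / omega
```

For values [1, 2, 3] with a sharp parameter (omega = beta = 5), the Boltzmann operator sits very close to the maximum, while mellowmax stays strictly between the mean and the maximum.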
Design of an Active 1-DOF Lower-Limb Exoskeleton with Inertia Compensation | Limited research has been done on exoskeletons to enable faster movements of the lower extremities. An exoskeleton’s mechanism can actually hinder agility by adding weight, inertia and friction to the legs; compensating inertia through control is particularly difficult due to instability issues. The added inertia will reduce the natural frequency of the legs, probably leading to lower step frequency during walking. We present a control method that produces an approximate compensation of an exoskeleton’s inertia. The aim is to make the natural frequency of the exoskeleton-assisted leg larger than that of the unaided leg. The method uses admittance control to compensate for the weight and friction of the exoskeleton. Inertia compensation is emulated by adding a feedback loop consisting of low-pass filtered acceleration multiplied by a negative gain. This gain simulates negative inertia in the low-frequency range. We tested the controller on a statically supported, single-DOF exoskeleton that assists swing movements of the leg. Subjects performed movement sequences, first unassisted and then using the exoskeleton, in the context of a computer-based task resembling a race. With zero inertia compensation, the steady-state frequency of leg swing was consistently reduced. Adding inertia compensation enabled subjects to recover their normal frequency of swing. |
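The inertia-emulating feedback loop described above (low-pass filtered acceleration multiplied by a negative gain) can be sketched as a discrete one-pole filter. The gain and smoothing constant below are illustrative placeholders, not the paper's tuned values.

```python
class InertiaCompensator:
    """Sketch of the feedback loop described in the abstract: low-pass
    filtered acceleration times a negative gain, emulating negative
    inertia in the low-frequency range. Values here are illustrative."""

    def __init__(self, gain=-0.5, alpha=0.1):
        self.gain = gain      # negative gain, acting as emulated negative inertia
        self.alpha = alpha    # one-pole low-pass smoothing factor in (0, 1]
        self.filtered = 0.0

    def torque(self, accel):
        # y[k] = y[k-1] + alpha * (x[k] - y[k-1])  (first-order low-pass)
        self.filtered += self.alpha * (accel - self.filtered)
        return self.gain * self.filtered
```

Fed a constant acceleration, the filter output settles to that acceleration and the compensating torque settles to gain times acceleration, i.e. a torque opposing the inertial load at low frequencies.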
Efficacy of clonidine in patients with essential hypertension with neurovascular contact of the rostral ventrolateral medulla | The rostral ventrolateral medulla (RVLM) is an important center for regulation of sympathetic nerve activity. Several clinical studies have suggested an association between neurovascular contact (NVC) of the RVLM and essential hypertension. Microvascular decompression (MVD) of the RVLM decreases blood pressure (BP) in hypertensive patients with NVC of this region. Therefore, MVD could be a useful therapeutic strategy to reduce BP in these patients. However, as MVD is an invasive procedure, it is worthwhile to seek useful antihypertensive agents for hypertensive patients with NVC. It has been reported that sympathetic nerve activity is elevated in patients with hypertension accompanied by NVC of the RVLM. It is anticipated that sympatholytic agents could be effective in lowering BP in these patients. In this study, we investigated the efficacy of clonidine, an α2 adrenergic agonist, in essential hypertensive patients with NVC of the RVLM. Thirty consecutive essential hypertensive patients with NVC and 30 consecutive essential hypertensive patients without contact were treated with clonidine for 4 weeks, and decreases in BP and plasma norepinephrine levels were compared between the two groups. Decreases in BP and plasma norepinephrine levels were significantly greater in patients with NVC than in those without contact. These results suggest that clonidine produces significantly greater reductions of BP and sympathetic nerve activity in essential hypertensive patients with NVC of the rostral ventrolateral medulla than in those without contact. |
Potential-based bounded-cost search and Anytime Non-Parametric A* | Article history: Received 1 May 2012 Received in revised form 1 May 2014 Accepted 3 May 2014 Available online 10 May 2014 |
Sets with Cardinality Constraints in Satisfiability Modulo Theories | Boolean Algebra with Presburger Arithmetic (BAPA) is a decidable logic that can express constraints on sets of elements and their cardinalities. Problems from verification of complex properties of software often contain fragments that belong to quantifier-free BAPA (QFBAPA). Deciding the satisfiability of QFBAPA formulas has been shown to be NP-complete using an eager reduction to quantifier-free Presburger arithmetic that exploits a sparse-solution property. In contrast to many other NP-complete problems (such as quantifier-free first-order logic or linear arithmetic), the application of QFBAPA to a broader set of problems has so far been hindered by the lack of an efficient implementation that can be used alongside other efficient decision procedures. We overcome these limitations by extending the efficient SMT solver Z3 with the ability to reason about cardinality constraints. Our implementation uses the DPLL(T) mechanism of Z3 to reason about the top-level propositional structure of a QFBAPA formula, improving the efficiency compared to previous implementations. Moreover, we present a new algorithm for automated decomposition of QFBAPA formulas. Our algorithm alleviates the exponential explosion of considering all Venn regions, significantly improving the tractability of formulas with many set variables. Because it is implemented as a theory plugin, our implementation enables Z3 to prove formulas that use QFBAPA constructs alongside constructs from other theories that Z3 supports (e.g. linear arithmetic, uninterpreted function symbols, algebraic data types), as well as in formulas with quantifiers. We have applied our implementation to verification of functional programs; we show it can automatically prove formulas that no previously reported automated approach was able to prove. |
An analysis of 3D point cloud reconstruction from light field images | Current methodologies for the generation of 3D point clouds from real-world scenes rely upon a set of 2D images capturing the scene from several points of view. Novel plenoptic cameras sample the light field crossing the main camera lens, creating a light field image. The information available in a plenoptic image must be processed in order to render a view or create the depth map of the scene. This paper analyses a method for the reconstruction of 3D models. The reconstruction of the model is obtained from a single image shot. Exploiting the properties of plenoptic images, a point cloud is generated and compared with a point cloud of the same object generated with a different plenoptic camera. |
Text as Scene: Discourse Deixis and Bridging Relations | This paper presents a new framework, “text as scene”, which lays the foundations for the annotation of two coreferential links: discourse deixis and bridging relations. The incorporation of what we call textual and contextual scenes provides more flexible annotation guidelines, broad type categories being clearly differentiated. Such a framework that is capable of dealing with discourse deixis and bridging relations from a common perspective aims at improving the poor reliability scores obtained by previous annotation schemes, which fail to capture the vague references inherent in both these links. The guidelines presented here complete the annotation scheme designed to enrich the Spanish CESS-ECE corpus with coreference information, thus building the CESS-Ancora corpus. |
Concatenated p-mean Word Embeddings as Universal Cross-Lingual Sentence Representations | Average word embeddings are a common baseline for more sophisticated sentence embedding techniques. However, they typically fall short of the performance of more complex models such as InferSent. Here, we generalize the concept of average word embeddings to power mean word embeddings. We show that the concatenation of different types of power mean word embeddings considerably closes the gap to state-of-the-art methods monolingually and substantially outperforms these more complex techniques cross-lingually. In addition, our proposed method outperforms different recently proposed baselines such as SIF and Sent2Vec by a solid margin, thus constituting a much harder-to-beat monolingual baseline. Our data and code are publicly available. |
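The power mean generalization described above is straightforward to compute per dimension. The sketch below uses p ∈ {1, +∞, −∞} (arithmetic mean, max, min), special cases of the kind the abstract concatenates; the vectors and the particular choice of p values are illustrative, and fractional p additionally assumes non-negative components.

```python
import math

def power_mean(vectors, p):
    """Per-dimension power mean of word vectors: ((1/n) * sum x_i^p)^(1/p).
    p = 1 is the arithmetic mean, p = +inf the max, p = -inf the min."""
    n, dim = len(vectors), len(vectors[0])
    if p == math.inf:
        return [max(v[d] for v in vectors) for d in range(dim)]
    if p == -math.inf:
        return [min(v[d] for v in vectors) for d in range(dim)]
    return [(sum(v[d] ** p for v in vectors) / n) ** (1.0 / p) for d in range(dim)]

def sentence_embedding(vectors, ps=(1, math.inf, -math.inf)):
    """Concatenate several power means into one sentence representation."""
    out = []
    for p in ps:
        out.extend(power_mean(vectors, p))
    return out
```

Concatenating k power means of d-dimensional word vectors yields a k·d-dimensional sentence vector, which is what makes the representation a stronger baseline than the plain average alone.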
Attention to visual speech gestures enhances hemodynamic activity in the left planum temporale. | Observing a speaker's articulatory gestures can contribute considerably to auditory speech perception. At the level of neural events, seen articulatory gestures can modify auditory cortex responses to speech sounds and modulate auditory cortex activity also in the absence of heard speech. However, possible effects of attention on this modulation have remained unclear. To investigate the effect of attention on visual speech-induced auditory cortex activity, we scanned 10 healthy volunteers with functional magnetic resonance imaging (fMRI) at 3 T during simultaneous presentation of visual speech gestures and moving geometrical forms, with the instruction to either focus on or ignore the seen articulations. Secondary auditory cortex areas in the bilateral posterior superior temporal gyrus and planum temporale were active both when the articulatory gestures were ignored and when they were attended to. However, attention to visual speech gestures enhanced activity in the left planum temporale compared to the situation when the subjects saw identical stimuli but engaged in a nonspeech motion discrimination task. These findings suggest that attention to visually perceived speech gestures modulates auditory cortex function and that this modulation takes place at a hierarchically relatively early processing level. |
Positive margins prediction in breast cancer conservative surgery: Assessment of a preoperative web-based nomogram. | Margin status of the surgical specimen has been shown to be a prognostic and risk factor for local recurrence in breast cancer surgery. It has been studied as a topic of intervention to diminish reoperation rates and reduce the probability of local recurrence in breast conservative surgery (BCS). This study aims to validate the Dutch BreastConservation! nomogram, created by Pleijhus et al., which predicts preoperative probability of positive margins in BCS. Patients with diagnosis of breast cancer stages cT1-2, who underwent BCS at the Breast Center of São João University Hospital (BC-CHSJ) in 2013-2014, were included. Association and correlation were evaluated for clinical, radiological, pathological and surgical variables. Multivariable logistic regression and ROC curves were used to assess nomogram parameters and discrimination. In our series of 253 patients, no associations were found between margin status and other studied variables (such as age or family history of breast cancer), except for weight (p-value = 0.045) and volume (p-value = 0.012) of the surgical specimen. Regarding the nomogram, a statistically significant association was shown between cN1 status and positive margins (p-value = 0.014). No differences were registered between the scores of patients with positive versus negative margins. Discrimination analysis showed an AUC of 0.474 for the basic and 0.508 for the expanded models. We cannot assume its external validation or its applicability to our cohort. Further studies are needed to determine the validity of this nomogram and achieve a broader view of currently available tools. |
Characterizing pseudoentropy and simplifying pseudorandom generator constructions | We provide a characterization of pseudoentropy in terms of hardness of sampling: Let (X,B) be jointly distributed random variables such that B takes values in a polynomial-sized set. We show that B is computationally indistinguishable from a random variable of higher Shannon entropy given X if and only if there is no probabilistic polynomial-time S such that (X,S(X)) has small KL divergence from (X,B). This can be viewed as an analogue of the Impagliazzo Hardcore Theorem (FOCS '95) for Shannon entropy (rather than min-entropy).
Using this characterization, we show that if f is a one-way function, then (f(U_n), U_n) has "next-bit pseudoentropy" at least n + log n, establishing a conjecture of Haitner, Reingold, and Vadhan (STOC '10). Plugging this into the construction of Haitner et al., this yields a simpler construction of pseudorandom generators from one-way functions. In particular, the construction only performs hashing once, and only needs hash functions that are randomness extractors (e.g. universal hash functions) rather than needing them to support "local list-decoding" (as in the Goldreich–Levin hardcore predicate, STOC '89).
With an additional idea, we also show how to improve the seed length of the pseudorandom generator to Õ(n^3), compared to O(n^4) in the construction of Haitner et al. |
Advertising and Consumers' Communications | Until recently, brand identities were built by firms via brand image advertising. However, the flourishing of consumer communications has weakened firms’ grip on their brands. The interaction between advertising and consumer communications and their joint impact on brand identity is the focal point of this paper. We present a model in which consumer preference for functional attributes may correlate with the identity they desire to project of themselves. This correlation is known to the firm but not to the consumers. Both the firm and the consumers can communicate their desired brand identity, although the actual brand identity is determined endogenously by the composition of consumers who purchase it (i.e., what types of people consume the brand). We find that sometimes the firm can strengthen the identity of its brand by refraining from advertising. This result is based on the following intermediate finding: advertising can diminish the endogenous informativeness of consumer communications by making it one-sided. Furthermore, it turns out that refraining from brand image advertising may be optimal for the firm when the product is especially well positioned to create a strong identity, i.e., when consumer preferences for functional and self-expressive attributes are highly correlated. |
A Hybrid Heuristic for the Minimum Weight Vertex Cover Problem | Given an undirected graph with weights associated with its vertices, the minimum weight vertex cover problem seeks a subset of vertices with minimum sum of weights such that each edge of the graph has at least one endpoint belonging to the subset. In this paper, we propose a hybrid approach, combining a steady-state genetic algorithm and a greedy heuristic, for the minimum weight vertex cover problem. The genetic algorithm generates a vertex cover, which the heuristic then reduces to a minimal-weight vertex cover. We have evaluated the performance of our algorithm on several test problems of varying sizes. Computational results show the effectiveness of our approach in solving the minimum weight vertex cover problem. |
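The repair step, reducing a possibly redundant cover to a minimal one, can be sketched as a greedy pruning pass. The abstract does not spell out the heuristic, so the heaviest-first rule below is an assumption, and the graph is assumed to have no self-loops.

```python
def reduce_to_minimal(cover, edges, weights):
    """Prune redundant vertices from a vertex cover, heaviest first,
    while keeping every edge covered. A sketch of the kind of repair
    heuristic paired with the genetic algorithm; details assumed."""
    cover = set(cover)
    for v in sorted(cover, key=lambda u: -weights[u]):
        # v is redundant if the other endpoint of each incident edge is covered
        if all((a if b == v else b) in cover for (a, b) in edges if v in (a, b)):
            cover.discard(v)
    return cover

edges = [(0, 1), (1, 2), (2, 3)]
weights = {0: 5, 1: 1, 2: 1, 3: 5}
print(sorted(reduce_to_minimal({0, 1, 2, 3}, edges, weights)))
```

On this path graph the two heavy endpoints are dropped, leaving the light cover {1, 2}, which still touches every edge.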
Nucleon Spin Content from Elastic Form Factor Data | I use the formalism of skewed parton distributions (SPD) to describe elastic form factors of the nucleon. The results are consistent with the approximate dipole behavior of $G_{Mp}(Q^2)$ and recent JLab data for the ratio of the proton form factors $G_{Ep}/G_{Mp}$ at $Q^2\leq 3.5$ GeV$^2$. Using the Angular Momentum Sum Rule, I obtain a numerical estimate for the valence quark contributions to the nucleon spin at the level of 50%, with their orbital angular momentum included. When combined with polarized DIS measurements, this result indicates that the orbital angular momentum of quarks contributes about 25% to the nucleon spin. |
Seasonal variations in air-water exchange of polychlorinated biphenyls in Lake Superior. | Instantaneous and seasonal fluxes of polychlorinated biphenyls (PCBs) across the Lake Superior air-water interface were predicted using a modified two-film gas-exchange model. Instantaneous fluxes of ΣPCB (sum of 80 congeners) were calculated from air and water samples collected simultaneously during short cruises in 1988, 1990, and 1992. Volatilization fluxes dominated in late summer in warmer waters (15-22 °C) while ΣPCBs were deposited into cool springtime waters (~2 °C). Seasonal fluxes were calculated using air samples collected at 12-day intervals over 2 years (1991-1992). Gas-phase ΣPCB exhibited two concentration maxima: in May and in August. The monthly average homolog concentrations were modeled as the sum of two Lorentzian functions and applied to the two-film gas-exchange model. The largest volatilization fluxes were calculated for fall, when water temperatures were relatively warm (7-10 °C) and vapor PCB concentrations were low (~65 pg/m³). Vapor deposition to Lake Superior is indicated from late April through May, into 0-3 °C water when vapor ΣPCB concentrations peak at ~200 pg/m³. The annual ΣPCB flux from Lake Superior for 1992 is 250 kg/yr. |
Pathophysiology of Peyronie's disease | Peyronie's disease is an inflammatory condition characterized by the formation of fibrous, noncompliant nodules in the tunica albuginea which can impede tunical expansion during penile erection, leading to deformity and bending. While the cause of this disease is thought to be microvascular trauma and abnormal wound healing, other hypotheses include genetic predisposition. In this review the pathophysiology of Peyronie's disease is discussed, as well as current hypotheses regarding its origin. |
Confidence and certainty: distinct probabilistic quantities for different goals | When facing uncertainty, adaptive behavioral strategies demand that the brain performs probabilistic computations. In this probabilistic framework, the notion of certainty and confidence would appear to be closely related, so much so that it is tempting to conclude that these two concepts are one and the same. We argue that there are computational reasons to distinguish between these two concepts. Specifically, we propose that confidence should be defined as the probability that a decision or a proposition, overt or covert, is correct given the evidence, a critical quantity in complex sequential decisions. We suggest that the term certainty should be reserved to refer to the encoding of all other probability distributions over sensory and cognitive variables. We also discuss strategies for studying the neural codes for confidence and certainty and argue that clear definitions of neural codes are essential to understanding the relative contributions of various cortical areas to decision making. |
Energy optimisation of hybrid off-grid system for remote telecommunication base station deployment in Malaysia | Cellular network operators are always seeking to increase the area of coverage of their networks, open up new markets and provide services to potential customers in remote rural areas. However, increased energy consumption, operator energy cost and the potential environmental impact of increased greenhouse gas emissions and the exhaustion of non-renewable energy resources (fossil fuel) pose major challenges to cellular network operators. The specific power supply needs for rural base stations (BSs) such as cost-effectiveness, efficiency, sustainability and reliability can be satisfied by taking advantage of the technological advances in renewable energy. This study investigates the possibility of decreasing both operational expenditure (OPEX) and greenhouse gas emissions with guaranteed sustainability and reliability for rural BSs using a solar photovoltaic/diesel generator hybrid power system. Three key aspects have been investigated: (i) energy yield, (ii) economic factors and (iii) greenhouse gas emissions. The results showed major benefits for mobile operators in terms of both environmental conservation and OPEX reduction, with an average annual OPEX savings of 43% to 47% based on the characteristics of solar radiation exposure in Malaysia. Finally, the paper compares the feasibility of using the proposed approach in a four-season country and compares the results against results obtained in Malaysia, which is a country with a tropical climate. |
Decision Tree Algorithms Predict the Diagnosis and Outcome of Dengue Fever in the Early Phase of Illness | BACKGROUND
Dengue is re-emerging throughout the tropical world, causing frequent recurrent epidemics. The initial clinical manifestation of dengue often is confused with other febrile states confounding both clinical management and disease surveillance. Evidence-based triage strategies that identify individuals likely to be in the early stages of dengue illness can direct patient stratification for clinical investigations, management, and virological surveillance. Here we report the identification of algorithms that differentiate dengue from other febrile illnesses in the primary care setting and predict severe disease in adults.
METHODS AND FINDINGS
A total of 1,200 patients presenting in the first 72 hours of acute febrile illness were recruited and followed up prospectively for up to 4 weeks; 1,012 of these were recruited from Singapore and 188 from Vietnam. Of these, 364 were dengue RT-PCR positive; 173 had dengue fever, 171 had dengue hemorrhagic fever, and 20 had dengue shock syndrome as final diagnosis. Using a C4.5 decision tree classifier for analysis of all clinical, haematological, and virological data, we obtained a diagnostic algorithm that differentiates dengue from non-dengue febrile illness with an accuracy of 84.7%. The algorithm can be applied at different disease prevalences to yield clinically useful positive and negative predictive values. Furthermore, an algorithm using platelet count, crossover threshold value of a real-time RT-PCR for dengue viral RNA, and presence of pre-existing anti-dengue IgG antibodies, in sequential order, identified cases with sensitivity and specificity of 78.2% and 80.2%, respectively, that eventually developed thrombocytopenia of 50,000 platelets/mm³ or less, a level previously shown to be associated with haemorrhage and shock in adults with dengue fever.
CONCLUSION
This study shows a proof-of-concept that decision algorithms using simple clinical and haematological parameters can predict diagnosis and prognosis of dengue disease, a finding that could prove useful in disease management and surveillance. |
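C4.5-style trees of the kind used in this study choose splits by entropy reduction (information gain, or its gain-ratio variant). A minimal stdlib sketch of that impurity computation, independent of the study's clinical data and thresholds:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label list, the impurity measure a
    C4.5-style decision tree reduces when choosing a split."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, split):
    """Entropy reduction from partitioning rows by a boolean predicate."""
    left = [lab for r, lab in zip(rows, labels) if split(r)]
    right = [lab for r, lab in zip(rows, labels) if not split(r)]
    if not left or not right:
        return 0.0
    n = len(labels)
    return entropy(labels) - (len(left) / n) * entropy(left) - (len(right) / n) * entropy(right)
```

A perfectly separating predicate on a balanced binary outcome yields the maximum gain of one bit; a predicate that puts everything on one side yields zero, which is why the tree prefers the former.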
Towards Time-Aware Knowledge Graph Completion | Knowledge graph (KG) completion adds new facts to a KG by making inferences from existing facts. Most existing methods ignore the time information and only learn from time-unknown fact triples. In dynamic environments that evolve over time, it is important and challenging for knowledge graph completion models to take into account the temporal aspects of facts. In this paper, we present a novel time-aware knowledge graph completion model that is able to predict links in a KG using both the existing facts and the temporal information of the facts. To incorporate the happening time of facts, we propose a time-aware KG embedding model using temporal order information among facts. To incorporate the valid time of facts, we propose a joint time-aware inference model based on Integer Linear Programming (ILP) using temporal consistency information as constraints. We further integrate the two models to make full use of global temporal information. We empirically evaluate our models on the time-aware KG completion task. Experimental results show that our time-aware models consistently achieve state-of-the-art performance on temporal facts. |
A New Incompleteness Result for Hoare's System | A structure A is presented for which Hoare's formal system for partial correctness is incomplete, even if the entire first-order theory of A is included among the axioms. It follows that the language of first-order logic is insufficient to express all loop invariants. The implications of this result for program-proving are discussed. |
The Cell Structures of Plant, Animal and Microbial Symbionts, Their Differences and Similarities | Organisms from different kingdoms and with different cell biologies living together in symbiosis have to solve the common problems of mutual recognition and of establishing an interchange of substances with each other. However, the different symbioses are often considered as disconnected phenomena, so that the common problems are not dealt with from a similar point of view. This leads to further problems in the standardization of terminology, for example the definition of “symbiont” and “parasite”, as well as of their cell-to-cell relationships in geometrical terms. |
Personality Traits and Music Genre Preferences: How Music Taste Varies Over Age Groups | Personality traits are increasingly being incorporated in systems to provide a personalized experience to the user. Current work focusing on identifying the relationship between personality and behavior, preferences, and needs often does not take into account differences between age groups. With music playing an important role in our lives, differences between age groups may be especially prevalent. In this work we investigate whether differences exist in music listening behavior between age groups. We analyzed a dataset with the music listening histories and personality information of 1415 users. Our results show agreement with prior work that identified personality-based music listening preferences. However, our results show that the agreements we found are in some cases divided over different age groups, whereas in other cases additional correlations were found within age groups. With our results, personality-based systems can provide better music recommendations that are in line with the user’s age. |
Portraying Witnesses. The Apostles in Early Christian Art and Poetry, written by Dijkstra, R. | Review of: Roald Dijkstra, Portraying Witnesses. The Apostles in Early Christian Art and Poetry, s.l.: s.n., 2014. ISBN 978-94-6259-163-9 |
Wealth and happiness across the world: material prosperity predicts life evaluation, whereas psychosocial prosperity predicts positive feeling. | The Gallup World Poll, the first representative sample of planet Earth, was used to explore the reasons why happiness is associated with higher income, including the meeting of basic needs, fulfillment of psychological needs, increasing satisfaction with one's standard of living, and public goods. Across the globe, the association of log income with subjective well-being was linear but convex with raw income, indicating the declining marginal effects of income on subjective well-being. Income was a moderately strong predictor of life evaluation but a much weaker predictor of positive and negative feelings. Possessing luxury conveniences and satisfaction with standard of living were also strong predictors of life evaluation. Although the meeting of basic and psychological needs mediated the effects of income on life evaluation to some degree, the strongest mediation was provided by standard of living and ownership of conveniences. In contrast, feelings were most associated with the fulfillment of psychological needs: learning, autonomy, using one's skills, respect, and the ability to count on others in an emergency. Thus, two separate types of prosperity-economic and social psychological-best predict different types of well-being. |
Reading Twice for Natural Language Understanding | Despite the recent success of neural networks in tasks involving natural language understanding (NLU) there has only been limited progress in some of the fundamental challenges of NLU, such as the disambiguation of the meaning and function of words in context. This work approaches this problem by incorporating contextual information into word representations prior to processing the task at hand. To this end we propose a general-purpose reading architecture that is employed prior to a task-specific NLU model. It is responsible for refining context-agnostic word representations with contextual information and lends itself to the introduction of additional, context-relevant information from external knowledge sources. We demonstrate that previously non-competitive models benefit dramatically from employing contextual representations, closing the gap between general-purpose reading architectures and the state-of-the-art performance obtained with fine-tuned, task-specific architectures. Apart from our empirical results we present a comprehensive analysis of the computed representations which gives insights into the kind of information added during the refinement process. |
Effectiveness of computer-generated (virtual reality) graded exposure in the treatment of acrophobia. | OBJECTIVE
The authors' goal was to examine the efficacy of computer-generated (virtual reality) graded exposure in the treatment of acrophobia (fear of heights).
METHOD
Twenty college students with acrophobia were randomly assigned to virtual reality graded exposure treatment (N = 12) or to a waiting-list comparison group (N = 8). Seventeen students completed the study. Sessions were conducted individually over 8 weeks. Outcome was assessed by using measures of anxiety, avoidance, attitudes, and distress associated with exposure to heights before and after treatment.
RESULTS
Significant differences between the students who completed the virtual reality treatment (N = 10) and those on the waiting list (N = 7) were found on all measures. The treatment group was significantly improved after 8 weeks, but the comparison group was unchanged.
CONCLUSIONS
The authors conclude that treatment with virtual reality graded exposure was successful in reducing fear of heights. |
Inferring ontology graph structures using OWL reasoning | Ontologies are representations of a conceptualization of a domain. Traditionally, ontologies in biology were represented as directed acyclic graphs (DAG) which represent the backbone taxonomy and additional relations between classes. These graphs are widely exploited for data analysis in the form of ontology enrichment or computation of semantic similarity. More recently, ontologies are developed in a formal language such as the Web Ontology Language (OWL) and consist of a set of axioms through which classes are defined or constrained. While the taxonomy of an ontology can be inferred directly from the axioms of an ontology as one of the standard OWL reasoning tasks, creating general graph structures from OWL ontologies that exploit the ontologies’ semantic content remains a challenge. We developed a method to transform ontologies into graphs using an automated reasoner while taking into account all relations between classes. Searching for (existential) patterns in the deductive closure of ontologies, we can identify relations between classes that are implied but not asserted and generate graph structures that encode for a large part of the ontologies’ semantic content. We demonstrate the advantages of our method by applying it to inference of protein-protein interactions through semantic similarity over the Gene Ontology and demonstrate that performance is increased when graph structures are inferred using deductive inference according to our method. Our software and experiment results are available at http://github.com/bio-ontology-research-group/Onto2Graph . Onto2Graph is a method to generate graph structures from OWL ontologies using automated reasoning. The resulting graphs can be used for improved ontology visualization and ontology-based data analysis. |
Modeling of multiversion concurrency control system using Event-B | Concurrency control in a database system involves controlling the relative order of conflicting operations, thereby ensuring database consistency. Multiversion concurrency control is a timestamp-based protocol that can be used to schedule operations so as to maintain the consistency of the database. In this protocol, each write on a data item produces a new copy (or version) of that data item while retaining the old version. A systematic approach to specification is essential for the production of any substantial system description. Formal methods are mathematical techniques that provide a systematic approach to building and verifying models. We have used Event-B as the formal technique for constructing our model. Event-B provides a complete framework through rigorous description of the problem at an abstract level and the discharge of proof obligations arising from consistency checking. In this paper, we outline the formal construction of a model of a multiversion concurrency control scheme for database transactions using Event-B. |
Semi-supervised deep learning by metric embedding | Deep networks are successfully used as classification models, yielding state-of-the-art results when trained on a large number of labeled samples. These models, however, are usually much less suited to semi-supervised problems because of their tendency to overfit when trained on small amounts of data. In this work we explore a new training objective that targets a semi-supervised regime with only a small subset of labeled data. This criterion is based on a deep metric embedding over distance relations within the set of labeled samples, together with constraints over the embeddings of the unlabeled set. The final learned representations are discriminative in Euclidean space, and hence can be used for subsequent nearest-neighbor classification using the labeled samples. |
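The classification step the abstract ends on is plain nearest-neighbor search in the embedding space. A minimal sketch, with made-up embeddings standing in for the output of the learned deep network:

```python
import numpy as np

def nearest_neighbor_predict(train_emb, train_labels, query_emb):
    """Assign each query the label of its closest labeled embedding
    (squared Euclidean distance)."""
    d2 = ((query_emb[:, None, :] - train_emb[None, :, :]) ** 2).sum(-1)
    return train_labels[d2.argmin(axis=1)]

# Hypothetical 2-D embeddings of four labeled samples from two classes.
train_emb = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 4.9]])
train_labels = np.array([0, 0, 1, 1])
queries = np.array([[0.2, 0.1], [4.8, 5.2]])
pred = nearest_neighbor_predict(train_emb, train_labels, queries)
```

If the embedding is discriminative in Euclidean space, this simple classifier suffices; no separate softmax head over classes is needed.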
Domain Invariant and Class Discriminative Feature Learning for Visual Domain Adaptation | Domain adaptation aims to build an effective classifier or regression model for unlabeled target data by utilizing well-labeled source data drawn from a different distribution. Intuitively, to address the domain shift problem it is crucial to learn domain-invariant features across domains, and most existing approaches have concentrated on this. However, they often do not directly constrain the learned features to be class discriminative for both source and target data, which is of vital importance for the final classification. Therefore, in this paper we put forward a novel feature learning method for domain adaptation that constructs both domain-invariant and class-discriminative representations, referred to as DICD. Specifically, DICD learns a latent feature space that preserves important data properties: it reduces the domain difference by jointly matching the marginal and class-conditional distributions of both domains, while simultaneously maximizing the inter-class dispersion and minimizing the intra-class scatter. The experiments in this paper demonstrate that class-discriminative properties dramatically alleviate cross-domain distribution inconsistency, which further boosts classification performance. Moreover, we show that exploring both domain invariance and class discriminativeness of the learned representations can be integrated into one optimization framework, and the optimal solution can be derived efficiently by solving a generalized eigen-decomposition problem. Comprehensive experiments on several visual cross-domain classification tasks verify that DICD significantly outperforms its competitors. |
From dictatorship to a reluctant democracy: stroke therapists talking about self-management | PURPOSE
Self-management is being increasingly promoted within chronic conditions including stroke. Concerns have been raised regarding professional ownership of some programmes, yet little is known of the professional's experience. This paper aims to present the views of trained therapists about the utility of a specific self-management approach in stroke rehabilitation.
METHOD
Eleven stroke therapists trained in the self-management approach participated in semi-structured interviews. These were audio recorded, transcribed verbatim and analysed thematically.
RESULTS
Two overriding themes emerged. The first was the sense that in normal practice therapists act as "benign dictators", committed to help their patients, but most comfortable when they, the professional, are in control. Following the adoption of the self-management approach therapists challenged themselves to empower stroke survivors to take control of their own recovery. However, therapists had to confront many internal and external challenges in this transition of power resulting in the promotion of a somewhat "reluctant democracy".
CONCLUSIONS
This study illustrates that stroke therapists desire a more participatory approach to rehabilitation. However, obstacles challenged the successful delivery of this goal. If self-management is an appropriate model to develop in post stroke pathways, then serious consideration must be given to how and if these obstacles can be overcome.
IMPLICATIONS FOR REHABILITATION
Stroke therapists perceive that self-management is appropriate for encouraging ownership of rehabilitation post stroke. Numerous obstacles were identified as challenging the implementation of self-management post stroke. These included: professional models, practices and expectations; institutional demands and perceived wishes of stroke survivors. For self-management to be effectively implemented by stroke therapists, these obstacles must be considered and overcome. This should be as part of an integrated therapy service, rather than as an add-on. |
High Dynamic Range Imaging | While real scenes produce a wide range of brightness variations, current cameras use low dynamic range image detectors that typically provide 256 levels of brightness data at each pixel. We propose methods to create high dynamic range (HDR) images; our method for enhancing the dynamic range is based on capturing multiple exposure photographs of the scene. Although a few methods are available for creating HDR images, HDR display technologies still lag behind. We present a tone mapping algorithm to display HDR images on low dynamic range display devices. We also deal with histogram manipulation for a painterly effect on the output. Applications of HDR images are discussed in brief. An algorithm to create HDR images of still scenes in the Color Filter Array (CFA) domain is proposed and successfully implemented. |
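The two stages the abstract describes, merging exposures into a radiance estimate and tone-mapping back to a displayable range, can be sketched minimally. The exposure times and the Reinhard-style global operator below are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Average per-pixel radiance estimates (pixel value / exposure time)."""
    estimates = [img / t for img, t in zip(images, exposure_times)]
    return np.mean(estimates, axis=0)

def tone_map(radiance):
    """Simple global operator mapping [0, inf) radiance into [0, 1)."""
    return radiance / (1.0 + radiance)

# Two synthetic "exposures" of the same scene radiance.
scene = np.array([[0.5, 2.0], [8.0, 0.1]])
images = [scene * 0.25, scene * 1.0]        # short and long exposure
hdr = merge_exposures(images, [0.25, 1.0])  # recovers the scene radiance
ldr = tone_map(hdr)                         # ready for an LDR display
```

A practical pipeline would additionally weight each exposure by pixel reliability (discarding clipped values) and recover the camera response curve, which this sketch omits.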
Variational Continual Learning | This paper develops variational continual learning (VCL), a simple but general framework for continual learning that fuses online variational inference (VI) and recent advances in Monte Carlo VI for neural networks. The framework can successfully train both deep discriminative models and deep generative models in complex continual learning settings where existing tasks evolve over time and entirely new tasks emerge. Experimental results show that variational continual learning outperforms state-of-the-art continual learning methods on a variety of tasks, avoiding catastrophic forgetting in a fully automatic way. |
Plasmodium transmission blocking activities of Vernonia amygdalina extracts and isolated compounds | Medicinal plants are a validated source for discovery of new leads and standardized herbal medicines. The aim of this study was to assess the activity of Vernonia amygdalina leaf extracts and isolated compounds against gametocytes and sporogonic stages of Plasmodium berghei and to validate the findings on field isolates of Plasmodium falciparum. Aqueous (Ver-H2O) and ethanolic (Ver-EtOH) leaf extracts were tested in vivo for activity against sexual and asexual blood stage P. berghei parasites. In vivo transmission blocking effects of Ver-EtOH and Ver-H2O were estimated by assessing P. berghei oocyst prevalence and density in Anopheles stephensi mosquitoes. Activity targeting early sporogonic stages (ESS), namely gametes, zygotes and ookinetes was assessed in vitro using P. berghei CTRPp.GFP strain. Bioassay guided fractionation was performed to characterize V. amygdalina fractions and molecules for anti-ESS activity. Fractions active against ESS of the murine parasite were tested for ex vivo transmission blocking activity on P. falciparum field isolates. Cytotoxic effects of extracts and isolated compounds vernolide and vernodalol were evaluated on the human cell lines HCT116 and EA.hy926. Ver-H2O reduced the P. berghei macrogametocyte density in mice by about 50% and Ver-EtOH reduced P. berghei oocyst prevalence and density by 27 and 90%, respectively, in An. stephensi mosquitoes. Ver-EtOH inhibited almost completely (>90%) ESS development in vitro at 50 μg/mL. At this concentration, four fractions obtained from the ethylacetate phase of the methanol extract displayed inhibitory activity >90% against ESS. Three tested fractions were also found active against field isolates of the human parasite P. falciparum, reducing oocyst prevalence in Anopheles coluzzii mosquitoes to one-half and oocyst density to one-fourth of controls. 
The molecules and fractions displayed considerable cytotoxicity on the two tested cell-lines. Vernonia amygdalina leaves contain molecules affecting multiple stages of Plasmodium, evidencing its potential for drug discovery. Chemical modification of the identified hit molecules, in particular vernodalol, could generate a library of druggable sesquiterpene lactones. The development of a multistage phytomedicine designed as preventive treatment to complement existing malaria control tools appears a challenging but feasible goal. |
Coarse-to-Fine n-Best Parsing and MaxEnt Discriminative Reranking | Discriminative reranking is one method for constructing high-performance statistical parsers (Collins, 2000). A discriminative reranker requires a source of candidate parses for each sentence. This paper describes a simple yet novel method for constructing sets of 50-best parses based on a coarse-to-fine generative parser (Charniak, 2000). This method generates 50-best lists that are of substantially higher quality than previously obtainable. We used these parses as the input to a MaxEnt reranker (Johnson et al., 1999; Riezler et al., 2002) that selects the best parse from the set of parses for each sentence, obtaining an f-score of 91.0% on sentences of length 100 or less. |
DARPA communicator evaluation: progress from 2000 to 2001 | This paper describes the evaluation methodology and results of the DARPA Communicator spoken dialog system evaluation experiments in 2000 and 2001. Nine spoken dialog systems in the travel planning domain participated in the experiments resulting in a total corpus of 1904 dialogs. We describe and compare the experimental design of the 2000 and 2001 DARPA evaluations. We describe how we established a performance baseline in 2001 for complex tasks. We present our overall approach to data collection, the metrics collected, and the application of PARADISE to these data sets. We compare the results we achieved in 2000 for a number of core metrics with those for 2001. These results demonstrate large performance improvements from 2000 to 2001 and show that the Communicator program goal of conversational interaction for complex tasks has been achieved. |
Cytokines in atherosclerosis: pathogenic and regulatory pathways. | Atherosclerosis is a chronic disease of the arterial wall where both innate and adaptive immunoinflammatory mechanisms are involved. Inflammation is central at all stages of atherosclerosis. It is implicated in the formation of early fatty streaks, when the endothelium is activated and expresses chemokines and adhesion molecules leading to monocyte/lymphocyte recruitment and infiltration into the subendothelium. It also acts at the onset of adverse clinical vascular events, when activated cells within the plaque secrete matrix proteases that degrade extracellular matrix proteins and weaken the fibrous cap, leading to rupture and thrombus formation. Cells involved in the atherosclerotic process secrete and are activated by soluble factors, known as cytokines. Important recent advances in the comprehension of the mechanisms of atherosclerosis provided evidence that the immunoinflammatory response in atherosclerosis is modulated by regulatory pathways, in which the two anti-inflammatory cytokines interleukin-10 and transforming growth factor-beta play a critical role. The purpose of this review is to bring together the current information concerning the role of cytokines in the development, progression, and complications of atherosclerosis. Specific emphasis is placed on the contribution of pro- and anti-inflammatory cytokines to pathogenic (innate and adaptive) and regulatory immunity in the context of atherosclerosis. Based on our current knowledge of the role of cytokines in atherosclerosis, we propose some novel therapeutic strategies to combat this disease. In addition, we discuss the potential of circulating cytokine levels as biomarkers of coronary artery disease. |
Influence of fracture history and bone mineral density testing on the treatment of osteoporosis in two non-academic community centers | A history of fracture and a low bone mineral density (BMD) are the strongest predictors of future osteoporotic fracture. This prospective cohort study assessed the impact of these two factors on treatment patterns in women undergoing their first BMD testing in a non-academic community setting. Successive women seen for first BMD testing at two testing centers completed a baseline questionnaire and a mailed 3-month follow-up questionnaire. Patients were grouped by history of fracture after age 20 years (present, absent) and by BMD result [osteoporosis (OP), osteopenia (OPN), normal BMD]. Thirty percent of 1144 patients at least 45 years old reported a history of fracture after age 20 years. They were no more likely than those without a history of fracture to be taking calcium (52% of total), vitamin D (31%), estrogen (31%), or a bisphosphonate (2%) before BMD testing. The BMD testing revealed OP in 20%, OPN in 45%, and normal BMD in 35%. Three months later, the percentages of patients taking treatments differed as follows: calcium (66 vs 53% in OP and OPN groups vs normal BMD), vitamin D (46 vs 37% in OPN group vs normal BMD), estrogen (25 vs 36 vs 44% in groups with OP, OPN, and normal BMD), a bisphosphonate (43 vs 11 vs 1%), and at least one of estrogen or a bisphosphonate (58 vs 43 vs 46%). Treatment decisions were influenced by first BMD testing but not significantly by a history of fracture. There is a substantial care gap in the treatment of patients with OP: before BMD testing, either bisphosphonate or estrogen therapy was taken by only 31% of patients at least 45 years old with a history of fracture after age 20 years, and at follow-up by only 58% of those who also had OP on BMD testing. |
Construction of partial discharge measurement system under influence of cylindrical metal particle in transformer oil | This paper reports the construction of a partial discharge (PD) measurement system under the influence of a cylindrical metal particle in transformer oil. The partial discharge of a free cylindrical metal particle in a uniform electric field under AC applied voltage was studied. The partial discharge inception voltage (PDIV) for a single particle was measured to be 11 kV. Typical waveforms of positive and negative PD were also obtained. The results show that the magnitude of negative PD is higher than that of positive PD. Observation of the cylindrical metal particle's movement revealed that the motion process involved several distinct stages. |
How to Train Triplet Networks with 100K Identities? | Training triplet networks with large-scale data is challenging in face recognition. Because the number of possible triplets explodes with the number of samples, previous studies adopt online hard negative mining (OHNM) to handle it. However, as the number of identities becomes extremely large, training suffers from bad local minima because effective hard triplets are difficult to find. To solve this problem, we propose training triplet networks with subspace learning, which splits the space of all identities into subspaces consisting of only similar identities. Combined with batch OHNM, hard triplets can be found much more easily. Experiments on the large-scale MS-Celeb-1M challenge with 100K identities demonstrate that the proposed method can largely improve performance. In addition, to deal with heavy noise and large-scale retrieval, we also make efforts toward robust noise removal and efficient image retrieval, which are used jointly with subspace learning to obtain state-of-the-art performance in the MS-Celeb-1M competition (without external data in Challenge 1). |
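The batch OHNM step the abstract relies on can be sketched as follows: for each anchor in a batch, take the farthest same-identity sample as the hardest positive and the closest different-identity sample as the hardest negative. The synthetic embeddings are illustrative; in a real system they would come from the network's current forward pass:

```python
import numpy as np

def hardest_triplets(emb, labels):
    """Batch online hard mining: one (anchor, hardest positive,
    hardest negative) triplet per eligible anchor."""
    d = np.linalg.norm(emb[:, None] - emb[None, :], axis=-1)
    triplets = []
    for a in range(len(emb)):
        same = (labels == labels[a]) & (np.arange(len(emb)) != a)
        diff = labels != labels[a]
        if same.any() and diff.any():
            p = np.where(same)[0][d[a, same].argmax()]  # hardest positive
            n = np.where(diff)[0][d[a, diff].argmin()]  # hardest negative
            triplets.append((a, p, n))
    return triplets

emb = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 3.0], [0.0, 4.0]])
labels = np.array([0, 0, 1, 1])
trips = hardest_triplets(emb, labels)
```

Restricting each batch to one subspace of similar identities, as the paper proposes, makes the mined negatives genuinely hard instead of trivially far away.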
Hysteresis and phase transitions for 1D and 3D models in shape memory alloys | By means of the Ginzburg-Landau theory of phase transitions, we study a non-isothermal model to characterize the austenite-martensite transition in shape memory alloys. In the first part of this paper, the one-dimensional model proposed in [3] is modified by varying the expression of the free energy. In this way, the description of the phenomenon of hysteresis, typical of these materials, is improved and the related stress-strain curves are recovered. Then, a generalization of this model to the three-dimensional case is proposed and its consistency with the principles of thermodynamics is proved. Unlike other three-dimensional models, the transition is characterized by a scalar-valued order parameter φ, and the Ginzburg-Landau equation ruling the evolution of φ allows us to prove a maximum principle ensuring the boundedness of φ itself. |
The role of translators in children's literature : invisible storytellers | List of Figures Series Editor's Foreword Acknowledgments Preface Introduction Part One 1: Didactic Translation: Religious Texts, Courtesy Books, Schoolbooks and Political Persuasion 2: Popular Fiction in Translation: The Child as Consumer of Romances and Fables in the Medieval and Early Modern Periods 3: Tales of the Marvellous 1690-1760: The Arabian Nights and the French Fairy Tale 4: Imagination, Reason and Mapping the World 1750-1820 5: Religious Stories and the Artful Fairy Tale in the Nineteenth Century 6: The Translating Woman: Assertive Professional or Invisible Storyteller 7: Summary of Part One: Translation Practices and the Child Audience Part Two Introduction 8: Into the Twentieth Century: Classics, the Folk Tale and Internationalism 1870-1940 9: Rewarding Translation for Children: Landmark Translations from 1940 and the Batchelder and Marsh Awards 10: Retranslation in the Twentieth And Twenty-First Centuries: For Children or Adults? 11: Translators' Voices 12: From Anonymity to Global Marketing: The Role of Translators in Children's Literature Notes Bibliography Index |
Forecasting models of emergency department crowding. | OBJECTIVES
The authors investigated whether models using time series methods can generate accurate short-term forecasts of emergency department (ED) bed occupancy, using traditional historical averages models as comparison.
METHODS
From July 2005 through June 2006, retrospective hourly ED bed occupancy values were collected from three tertiary care hospitals. Three models of ED bed occupancy were developed for each site: 1) hourly historical average, 2) seasonal autoregressive integrated moving average (ARIMA), and 3) sinusoidal with an autoregression (AR)-structured error term. Goodness of fit was compared using log likelihood and Akaike's Information Criterion (AIC). The accuracies of 4- and 12-hour forecasts were evaluated by comparing model forecasts to actual observed bed occupancy with root mean square (RMS) error. Sensitivity of prediction errors to model training time was evaluated as well.
RESULTS
The seasonal ARIMA outperformed the historical average in complexity adjusted goodness of fit (AIC). Both AR-based models had significantly better forecast accuracy for the 4- and the 12-hour forecasts of ED bed occupancy (analysis of variance [ANOVA] p < 0.01), compared to the historical average. The AR-based models did not differ significantly from each other in their performance. Model prediction errors did not show appreciable sensitivity to model training times greater than 7 days.
CONCLUSIONS
Both a sinusoidal model with AR-structured error term and a seasonal ARIMA model were found to robustly forecast ED bed occupancy 4 and 12 hours in advance at three different EDs, without needing data input beyond bed occupancy in the preceding hours. |
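The sinusoidal-fit idea can be illustrated on synthetic data: model hourly occupancy as a mean plus one daily harmonic, fit by least squares, and compare forecast RMS error against the hourly historical average baseline. The synthetic series and the omission of the AR-structured error term are simplifying assumptions, not the study's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24 * 28)                       # four weeks of hourly data
truth = 20 + 8 * np.sin(2 * np.pi * hours / 24)  # daily occupancy cycle
occupancy = truth + rng.normal(0, 1, hours.size)

train, test = slice(0, 24 * 21), slice(24 * 21, None)  # 3 weeks / 1 week

# Sinusoidal model: intercept plus one daily sine/cosine harmonic.
X = np.column_stack([np.ones(hours.size),
                     np.sin(2 * np.pi * hours / 24),
                     np.cos(2 * np.pi * hours / 24)])
coef, *_ = np.linalg.lstsq(X[train], occupancy[train], rcond=None)
sin_forecast = X[test] @ coef

# Baseline: historical average for each hour of the day.
hourly_avg = np.array([occupancy[train][hours[train] % 24 == h].mean()
                       for h in range(24)])
avg_forecast = hourly_avg[hours[test] % 24]

def rms(err):
    return np.sqrt(np.mean(err ** 2))

sin_rms = rms(occupancy[test] - sin_forecast)
avg_rms = rms(occupancy[test] - avg_forecast)
```

Both forecasts here use nothing beyond past occupancy, matching the study's conclusion that no external inputs are needed; the AR error term would additionally exploit the most recent hours' residuals.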
Night time road curvature estimation based on Convolutional Neural Networks | Detecting the road geometry at night time is an essential precondition for providing optimal illumination for the driver and the other traffic participants. In this paper we propose a novel approach to estimate the current road curvature based on three sensors: a far infrared camera, a near infrared camera and an imaging radar sensor. Various convolutional neural networks with different configurations are trained for each input. By fusing the classifier responses of all three sensors, a further performance gain is achieved. To annotate the training and evaluation datasets without costly human interaction, a fully automatic curvature annotation algorithm based on an inertial navigation system is presented as well. |
Self-reported quality of life of adolescents with cerebral palsy: a cross-sectional and longitudinal analysis | BACKGROUND
Children with cerebral palsy who can self-report have similar quality of life (QoL) to their able-bodied peers. Is this similarity also found in adolescence? We examined how self-reported QoL of adolescents with cerebral palsy varies with impairment and compares with the general population, and how factors in childhood predict adolescent QoL.
METHODS
We report QoL outcomes in a longitudinal follow-up and cross-sectional analysis of individuals included in the SPARCLE1 (childhood) and SPARCLE2 (adolescent) studies. In 2004 (SPARCLE1), a cohort of 818 children aged 8-12 years were randomly selected from population-based cerebral palsy registers in nine European regions. We gathered data from 500 participants about QoL with KIDSCREEN (ten domains); frequency of pain; child psychological problems (Strengths and Difficulties Questionnaire); and parenting stress (Parenting Stress Index). At follow-up in 2009 (SPARCLE2), 355 (71%) adolescents aged 13-17 years remained in the study and self-reported QoL (longitudinal sample). 76 additional adolescents self-reported QoL in 2009, providing data for 431 adolescents in the cross-sectional sample. Researchers gathered data at home visits. We compared QoL against matched controls in the general population. We used multivariable regression to relate QoL of adolescents with cerebral palsy to impairments (cross-sectional analysis) and to childhood QoL, pain, psychological problems, and parenting stress (longitudinal analysis).
FINDINGS
Severity of impairment was significantly associated (p<0·01) with reduced adolescent QoL on only three domains (Moods and emotions, Autonomy, and Social support and peers); average differences in QoL between the least and most able groups were generally less than 0·5 SD. Adolescents with cerebral palsy had significantly lower QoL than did those in the general population in only one domain (Social support and peers; mean difference -2·7 [0·25 SD], 95% CI -4·3 to -1·4). Pain in childhood or adolescence was strongly associated with low adolescent QoL on eight domains. Childhood QoL was a consistent predictor of adolescent QoL. Child psychological problems and parenting stress in childhood or their worsening between childhood and adolescence predicted only small reductions in adolescent QoL.
INTERPRETATION
Individual and societal attitudes should be affected by the similarity of the QoL of adolescents with and without cerebral palsy. Adolescents with cerebral palsy need particular help to maintain and develop peer relationships. Interventions in childhood to alleviate psychological difficulties, parenting stress, and especially pain, are justified for their intrinsic value and for their longer term effect on adolescent QoL.
FUNDING
SPARCLE1 was funded by the European Union Research Framework 5 Program (grant number QLG5-CT-2002-00636), the German Ministry of Health GRR-58640-2/14, and the German Foundation for the Disabled Child. SPARCLE2 was funded by: Wellcome Trust WT086315 A1A (UK and Ireland); Medical Faculty of the University of Lübeck E40-2009 and E26-2010 (Germany); CNSA, INSERM, MiRe-DREES, and IRESP (France); Ludvig and Sara Elsass Foundation, The Spastics Society and Vanforefonden (Denmark); Cooperativa Sociale "Gli Anni in Tasca" and Fondazione Carivit, Viterbo (Italy); Göteborg University-Riksforbundet for Rorelsehindrade Barn och Ungdomar and the Folke Bernadotte Foundation (Sweden). |
Polyphenols in Fruits and Vegetables and Its Effect on Human Health | Polyphenols represent a group of chemical substances common in plants, structurally characterized by the presence of one or more phenol units. Polyphenols are the most abundant antioxidants in human diets, and the largest and best studied class of polyphenols is the flavonoids, which include several thousand compounds. Numerous studies confirm that they exert a protective action on human health and are key components of a healthy and balanced diet. Epidemiological studies correlate flavonoid intake with a reduced incidence of chronic diseases, such as cardiovascular disease, diabetes and cancer. The involvement of reactive oxygen species (ROS) in the etiology of these degenerative conditions has suggested that phytochemicals showing antioxidant activity may contribute to the prevention of these pathologies. The present review deals with phenolic compounds in plants and reports on recent studies. Moreover, the present work includes information on the relationships between the consumption of these compounds, via feeding, and the risk of disease occurrence, i.e. the effect on human health. Results obtained on herbs and essential oils from plants grown in tropical, subtropical and temperate regions are also reported. |
Violation of Kirchhoff's laws for a coherent RC circuit. | What is the complex impedance of a fully coherent quantum resistance-capacitance (RC) circuit at gigahertz frequencies in which a resistor and a capacitor are connected in series? While Kirchhoff's laws predict addition of capacitor and resistor impedances, we report on observation of a different behavior. The resistance, here associated with charge relaxation, differs from the usual transport resistance given by the Landauer formula. In particular, for a single-mode conductor, the charge-relaxation resistance is half the resistance quantum, regardless of the transmission of the mode. The new mesoscopic effect reported here is relevant for the dynamical regime of all quantum devices. |
Accounting for the Neglected Dimensions of AI Progress | We analyze and reframe AI progress. In addition to the prevailing metrics of performance, we highlight the usually neglected costs paid in the development and deployment of a system, including: data, expert knowledge, human oversight, software resources, computing cycles, hardware and network facilities, development time, etc. These costs are paid throughout the life cycle of an AI system, fall differentially on different individuals, and vary in magnitude depending on the replicability and generality of the AI solution. The multidimensional performance and cost space can be collapsed to a single utility metric for a user with transitive and complete preferences. Even absent a single utility function, AI advances can be generically assessed by whether they expand the Pareto (optimal) surface. We explore a subset of these neglected dimensions using the two case studies of Alpha* and ALE. This broadened conception of progress in AI should lead to novel ways of measuring success in AI, and can help set milestones for future progress. |
A stochastic approach to simulate artists behaviour for automatic felt-tipped stippling | Nowadays, non-photorealistic rendering is an area of computer graphics that tries to simulate what artists do and the tools they use. Stippling with felt-tipped colour pens is not a commonly used technique among artists due to its complexity. In this paper we present a new method to simulate stippling illustrations with felt-tipped colour pen from a photograph or an image. The method infers a probability function with an expert system from rules given by the artist, and then simulates the artist's behaviour when placing dots on the illustration by means of a stochastic algorithm. |
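The stochastic dot-placement step can be sketched with rejection sampling: darker regions accept candidate dots with higher probability. Using inverted intensity as the probability function is an illustrative assumption; the paper derives that function from an expert system encoding the artist's rules:

```python
import numpy as np

def stipple(gray, n_dots, rng):
    """Place n_dots points on a [0, 1] grayscale image, favoring dark pixels
    via rejection sampling."""
    h, w = gray.shape
    darkness = 1.0 - gray  # hypothetical placement probability per pixel
    dots = []
    while len(dots) < n_dots:
        y, x = rng.integers(h), rng.integers(w)
        if rng.random() < darkness[y, x]:  # accept proportionally to darkness
            dots.append((y, x))
    return dots

rng = np.random.default_rng(1)
image = np.array([[0.0, 1.0], [1.0, 1.0]])  # one black pixel, three white
dots = stipple(image, 50, rng)              # all dots land on the black pixel
```

A fuller simulation would also vary dot size and colour per accepted sample, which is where the felt-tipped pen model would enter.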
Beyond the Doubting of a Shadow: A Reply to Commentaries on Shadows of the Mind | 1. Bernard J. Baars: Can physics provide a theory of consciousness? 2. David J. Chalmers: Minds, machines, and mathematics 3. Solomon Feferman: Penrose's Gödelian argument 4. Stanley A. Klein: Is quantum mechanics relevant to understanding consciousness? 5. Tim Maudlin: Between the motion and the act.... 6. John McCarthy: Awareness and understanding in computer programs 7. Daryl McCullough: Can humans escape Gödel? 8. Drew McDermott: Penrose is wrong 9. Hans Moravec: Roger Penrose's gravitonic brains |
Answering the Call for a Standard Reliability Measure for Coding Data | In content analysis and similar methods, data are typically generated by trained human observers who record or transcribe textual, pictorial, or audible matter in terms suitable for analysis. Conclusions from such data can be trusted only after demonstrating their reliability. Unfortunately, the content analysis literature is full of proposals for so-called reliability coefficients, leaving investigators easily confused, not knowing which to choose. After describing the criteria for a good measure of reliability, we propose Krippendorff’s alpha as the standard reliability measure. It is general in that it can be used regardless of the number of observers, levels of measurement, sample sizes, and presence or absence of missing data. To facilitate the adoption of this recommendation, we describe a freely available macro written for SPSS and SAS to calculate Krippendorff’s alpha and illustrate its use with a simple example. |
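For nominal data with no missing values, the coincidence-matrix formulation of Krippendorff's alpha reduces to a few lines. This is a minimal illustrative sketch, not the SPSS/SAS macro described in the abstract: it omits ordinal/interval/ratio metrics and missing-data handling.

```python
from collections import Counter

def krippendorff_alpha_nominal(codings):
    """Alpha = 1 - D_o / D_e for nominal data.
    codings: one list of values per coder, all coding the same units."""
    units = list(zip(*codings))     # the values assigned to each unit
    n = sum(len(u) for u in units)  # total number of pairable values
    observed = Counter()            # within-unit value-pair coincidences
    totals = Counter()              # overall value frequencies
    for u in units:
        m = len(u)
        for i, vi in enumerate(u):
            totals[vi] += 1
            for j, vj in enumerate(u):
                if i != j:
                    observed[(vi, vj)] += 1 / (m - 1)
    d_o = sum(c for (vi, vj), c in observed.items() if vi != vj)
    d_e = sum(totals[vi] * totals[vj]
              for vi in totals for vj in totals if vi != vj) / (n - 1)
    return 1 - d_o / d_e if d_e else 1.0
```

Perfect agreement yields alpha = 1, chance-level coding yields roughly 0, and systematic disagreement goes negative; the generality over observers, measurement levels, and missing data praised in the abstract requires the fuller formulation.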
PD-1 regulates germinal center B cell survival and the formation and affinity of long-lived plasma cells | Memory B and plasma cells (PCs) are generated in the germinal center (GC). Because follicular helper T cells (TFH cells) have high expression of the immunoinhibitory receptor PD-1, we investigated the role of PD-1 signaling in the humoral response. We found that the PD-1 ligands PD-L1 and PD-L2 were upregulated on GC B cells. Mice deficient in PD-L2 (Pdcd1lg2−/−), PD-L1 and PD-L2 (Cd274−/−Pdcd1lg2−/−) or PD-1 (Pdcd1−/−) had fewer long-lived PCs. The mechanism involved more GC cell death and less TFH cell cytokine production in the absence of PD-1; the effect was selective, as remaining PCs had greater affinity for antigen. PD-1 expression on T cells and PD-L2 expression on B cells controlled TFH cell and PC numbers. Thus, PD-1 regulates selection and survival in the GC, affecting the quantity and quality of long-lived PCs. |
Combining Words and Speech Prosody for Automatic Topic Segmentation | We present a probabilistic model that uses both prosodic and lexical cues for the automatic segmentation of speech into topic units. The approach combines hidden Markov models, statistical language models, and prosody-based decision trees. Lexical information is obtained from a speech recognizer, and prosodic features are extracted automatically from speech waveforms. We evaluate our approach on the Broadcast News corpus, using standard evaluation metrics. Results show that the prosodic model alone outperforms the word-based segmentation method. Furthermore, we achieve an additional reduction in error by combining the prosodic and word-based knowledge sources. |
Fractal feature analysis and classification in medical imaging. | Following B.B. Mandelbrot's fractal theory (1982), it was found that the fractal dimension could be obtained in medical images through the concept of fractional Brownian motion. An estimation method for the fractal dimension based on fractional Brownian motion is discussed. Two applications are presented: (1) classification; (2) edge enhancement and detection. For classification, a normalized fractional Brownian motion feature vector is defined from this estimation concept. It represents the normalized average absolute intensity difference of pixel pairs on a surface at different scales. The feature vector uses relatively few data items to represent the statistical characteristics of the medical image surface and is invariant to linear intensity transformation. For edge enhancement and detection, a transformed image is obtained by calculating the fractal dimension of each pixel over the whole medical image. The fractal dimension value of each pixel is obtained by calculating the fractal dimension of a 7x7 pixel block centered on that pixel. |
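Under the fractional Brownian motion model, the mean absolute intensity difference of pixel pairs scales as a power of their distance, E|I(p) − I(q)| ∝ ||p − q||^H, and a surface's fractal dimension is D = 3 − H. A minimal sketch of this estimation, assuming horizontal/vertical pixel pairs only (parameter names and the `max_scale` choice are our own, not from the paper):

```python
import numpy as np

def fractal_dimension_fbm(img, max_scale=4):
    """Estimate the fractal dimension of an image patch via the
    fractional Brownian motion model: fit log E|dI| vs log distance,
    take the slope as the Hurst exponent H, and return D = 3 - H."""
    img = np.asarray(img, dtype=float)
    scales, diffs = [], []
    for d in range(1, max_scale + 1):
        # mean absolute intensity difference over pixel pairs at distance d
        dx = np.abs(img[:, d:] - img[:, :-d]).mean()
        dy = np.abs(img[d:, :] - img[:-d, :]).mean()
        scales.append(d)
        diffs.append((dx + dy) / 2.0)
    # slope of the log-log fit is the Hurst exponent H
    H, _ = np.polyfit(np.log(scales), np.log(diffs), 1)
    return 3.0 - H
```

On a perfectly smooth ramp image the differences grow linearly with distance (H = 1), so the estimate returns D = 2, the dimension of a smooth surface; rougher textures give H < 1 and D between 2 and 3.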
Employers' dependency on foreign workers in the Malaysian food service industry: A preliminary study | As a rapidly developing country, Malaysia has many citizens living a fast-paced life. This way of life has created a trend of eating out in restaurants and food stalls, owing to a lack of time for food preparation. The trend has contributed to the boom in the food service industry and has created mass vacancies for employment. However, these vacancies are largely filled by foreign workers. This paper intends to (1) examine the relationship between contributing factors and employers' dependency on foreign workers and (2) analyze the effects of those contributing factors on employers' dependency on foreign workers. Four variables are used in this research, namely government laws and regulations, attitudes of workers, availability of local workers, and wages and incentives. The study applies correlation and regression to analyze the data. The findings show that all variables are associated with employers' dependency on foreign workers. Lack of law and regulation enforcement is the most significant predictor of employers' dependency on foreign workers. |
BeeHive Lab project - sensorized hive for bee colonies life study | The rearing of bees is a rather difficult job, since it requires experience and time. Beekeepers traditionally take care of their colonies by observing them and learning to interpret their behavior. Although beekeeping is one of the most ancient human practices, bees now risk extinction, principally because of increasing pollution levels related to human activity. It is important to increase our knowledge about bees in order to develop new practices for their protection. These practices could include new technologies, to increase the profitability of beekeeping and the economic interest in bee rearing, as well as innovative rearing techniques, genetic selection, environmental policies, and so on. Moreover, since bees are very sensitive to pollution, they are considered environmental indicators, and research on bees could give important information about the condition of soil, air, and water. In this paper we propose a real hardware and software solution for applying the Internet-of-Things concept to bees, in order to help beekeepers improve their business and to collect data for research purposes. |
Learning in Alzheimer's disease is facilitated by social interaction. | Seminal work in Gary Van Hoesen's laboratory at Iowa in the early 1980s established that the hallmark neuropathology of Alzheimer's disease (AD; neurofibrillary tangles) had its first foothold in specific parts of the hippocampal formation and entorhinal cortex, effectively isolating the hippocampus from much of its input and output and causing the distinctive impairment of new learning that is the leading early characteristic of the disease (Hyman et al., 1984). The boundaries and conditions of the anterograde memory defect in patients with AD have been a topic of intense research interest ever since (e.g., Graham and Hodges, 1977; Nestor et al., 2006). For example, it has been shown that patients with AD may acquire some new semantic information through methods such as errorless learning, but learning under these conditions is typically slow and inefficient. Drawing on a learning paradigm (a collaborative referencing task) that was previously shown to induce robust and enduring learning in patients with hippocampal amnesia, we investigated whether this task would be effective in promoting new learning in patients with AD. We studied five women with early-stage AD and 10 demographically matched healthy comparison participants, each interacting with a familiar communication partner. AD pairs displayed significant and enduring learning across trials, with increased accuracy and decreased time to complete trials, in a manner indistinguishable from healthy comparison pairs, resulting in efficient and economical communication. The observed learning here most likely draws on neural resources outside the medial temporal lobes. These interactive communication sessions provide a potent learning environment with significant implications for memory intervention. |
Theoretical prediction of chemically bound compounds made of argon and hydrocarbons. | Equilibrium structures, vibrational properties, and stabilities of organo-argon compounds, HArC4H and HArC6H, are studied at the MP2=full/6-311++G(2d,2p) level of theory. Ab initio calculations show that the two molecules are metastable, but protected against the Ar + HY (Y = C4H or C6H) dissociation by high barriers, and these species are energetically stable with respect to the three separate fragments (H + Ar + Y). The implied kinetic stability of HArC4H and HArC6H molecules suggests that these species should very likely be candidates for experimental observation. The results may open possibilities of organic chemistry of light noble gas elements. |
Gaze Estimation in 3D Space Using RGB-D Sensors: Towards Head-Pose and User Invariance | We address the problem of 3D gaze estimation within a 3D environment from remote sensors, which is highly valuable for applications in human-human and human-robot interaction. In contrast to most previous work, which is limited to screen-gazing applications, we propose to leverage the depth data of RGB-D cameras to perform accurate head pose tracking, to acquire head-pose invariance through a 3D rectification process that renders head-pose-dependent eye images into a canonical viewpoint, and to compute the line of sight in 3D space. To address the low resolution of the eye images resulting from the use of remote sensors, we rely on the appearance-based gaze estimation paradigm, which has demonstrated robustness against this factor. In this context, we conduct a comparative study of recent appearance-based strategies within our framework, study the generalization of these methods to unseen individuals, and propose a cross-user eye image alignment technique relying on the direct registration of gaze-synchronized eye images. We demonstrate the validity of our approach through extensive gaze estimation experiments on a public dataset as well as a gaze coding task applied to natural job interviews. |
Recognition of multiple overlapping activities using compositional CNN-LSTM model | This paper introduces a new task in activity recognition: the recognition of multiple overlapping activities. We propose a compositional CNN+LSTM algorithm. Experimental results on an artificial dataset show that it improves accuracy from 27% to 43%. |
Learning to Map Natural Language Statements into Knowledge Base Representations for Knowledge Base Construction | Directly adding the knowledge triples obtained from open information extraction systems into a knowledge base is often impractical due to a vocabulary gap between natural language (NL) expressions and knowledge base (KB) representation. This paper aims at learning to map relational phrases in triples from natural-language-like statement to knowledge base predicate format. We train a word representation model on a vector space and link each NL relational pattern to the semantically equivalent KB predicate. Our mapping result shows not only high quality, but also promising coverage on relational phrases compared to previous research. |
Induction of Association Rules: Apriori Implementation | We describe an implementation of the well-known apriori algorithm for the induction of association rules [Agrawal et al. (1993), Agrawal et al. (1996)] that is based on the concept of a prefix tree. While the idea to use this type of data structure is not new, there are several ways to organize the nodes of such a tree, to encode the items, and to organize the transactions, which may be used in order to minimize the time needed to find the frequent itemsets as well as to reduce the amount of memory needed to store the counters. Consequently, our emphasis is less on concepts than on implementation issues, which, however, can make a considerable difference in applications. |
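The level-wise Apriori search the abstract refers to can be sketched in a few lines. This is a plain illustrative implementation of frequent-itemset induction only (the prefix-tree node organization, item encoding, and rule generation that the paper focuses on are omitted, and all names are our own):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return a dict mapping each frequent itemset (frozenset) to its
    absolute support count, found by level-wise candidate generation."""
    transactions = [frozenset(t) for t in transactions]

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    items = {frozenset([i]) for t in transactions for i in t}
    current = {s for s in items if support(s) >= min_support}
    frequent, k = {}, 1
    while current:
        for s in current:
            frequent[s] = support(s)
        k += 1
        # join step: unite frequent (k-1)-itemsets into k-item candidates
        candidates = {a | b for a in current for b in current if len(a | b) == k}
        # prune step: every (k-1)-subset of a candidate must itself be frequent
        candidates = {c for c in candidates
                      if all(frozenset(sub) in current
                             for sub in combinations(c, k - 1))}
        current = {c for c in candidates if support(c) >= min_support}
    return frequent
```

For example, with transactions `{a,b,c}, {a,b}, {a,c}, {b,c}` and a minimum support of 2, all singletons and all pairs are frequent (support 2-3) while `{a,b,c}` (support 1) is pruned.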
Neovascular glaucoma after vitrectomy in patients with proliferative diabetic retinopathy | To evaluate the prevalence and risk factors of neovascular glaucoma (NVG) after vitrectomy in patients with vitreous hemorrhage associated with proliferative diabetic retinopathy (PDR). This retrospective, noncomparative, observational study included 127 eyes of 127 patients with PDR who received vitrectomy with a follow-up period of at least 6 months. The prevalence of NVG and associated risk factors were assessed including sex, age, previous panretinal photocoagulation, baseline intraocular pressure, combined phacovitrectomy, and pretreatment with intravitreal bevacizumab (IVB) before vitrectomy for the treatment of vitreous hemorrhage. NVG developed in 15 (11.8%) of 127 patients. Of the 15 eyes with NVG, 11 cases (73.3%) postoperatively developed NVG within 6 months. Postoperative NVG was associated with preoperative IVB treatment (odds ratio, 4.43; P = 0.019). The prevalence of NVG after vitrectomy was 11.8%, and an associated risk factor for NVG was preoperative IVB for the treatment of vitreous hemorrhage. |
Predictors of response and remission in the acute treatment of first-episode schizophrenia patients — Is it all about early response? | BACKGROUND
To evaluate the predictive validity of early response compared to other well-known predictor variables in acutely ill first-episode patients.
METHODS
112 patients were treated with a mean dosage of 4.14 mg (±1.70) haloperidol and 112 patients with a mean dosage of 4.17 mg (±1.55) risperidone for a mean inpatient treatment duration of 42.92 days (±16.85) within a double-blind, randomized controlled trial. Early response was defined as a ≥ 30% improvement in the PANSS total score by week 2, response as a ≥ 50% reduction in the PANSS total score from admission to discharge and remission according to the consensus criteria. Univariate tests and logistic regression models were applied to identify significant predictors of response and remission.
RESULTS
52% of the patients were responders and 59% remitters at discharge. Non-remitters at discharge were hindered from becoming remitters mainly by the presence of negative symptoms. Univariate tests revealed several significant differences between responders/non-responders and remitters/non-remitters such as age, severity of baseline psychopathology as well as the frequency of early response. Both early response (p<0.0001) and a higher PANSS positive subscore at admission (p=0.0002) were identified as significant predictors of response at discharge, whereas a shorter duration of untreated psychosis (p=0.0167), a lower PANSS general psychopathology subscore (p<0.0001), and early treatment response (p=0.0002) were identified as significant predictors of remission.
CONCLUSION
Together with the finding that early response is a significant predictor of response and remission, the relevance and predictive validity of negative and depressive symptoms for outcome is also highlighted. |
Adversarially Learned Anomaly Detection | Anomaly detection is a significant and hence well-studied problem. However, developing effective anomaly detection methods for complex and high-dimensional data remains a challenge. As Generative Adversarial Networks (GANs) are able to model the complex high-dimensional distributions of real-world data, they offer a promising approach to address this challenge. In this work, we propose an anomaly detection method, Adversarially Learned Anomaly Detection (ALAD) based on bi-directional GANs, that derives adversarially learned features for the anomaly detection task. ALAD then uses reconstruction errors based on these adversarially learned features to determine if a data sample is anomalous. ALAD builds on recent advances to ensure data-space and latent-space cycle-consistencies and stabilize GAN training, which results in significantly improved anomaly detection performance. ALAD achieves state-of-the-art performance on a range of image and tabular datasets while being several hundred-fold faster at test time than the only published GAN-based method. |
AT-MAC: Adaptive MAC-Frame Payload Tuning for Reliable Communication in Wireless Body Area Networks | In wireless sensor networks, adaptive tuning of Medium Access Control (MAC) parameters is necessary in order to assure the QoS requirements. In this paper, we propose an adaptive MAC-frame payload tuning mechanism for wireless body area networks (WBANs) to maximize the probability of successful packet delivery or reliability of the associated sensor nodes based on real-time situation. The enabling algorithm, Adaptively Tuned MAC (AT-MAC), has been proposed to tune the MAC-frame payload of a WBAN sensor node, which is compliant with the IEEE 802.15.4 protocol. AT-MAC prioritizes sensor nodes based on the seriousness of the health parameters that are being measured by the respective sensor nodes. Further, we consider a Markov chain-based analytical approach that acknowledges the slotted CSMA/CA backoff mechanism with retry limits, as described in the IEEE 802.15.4 protocol. We derive expressions for reliability, power consumption, and throughput, which are the key metrics to evaluate the network performance of the proposed protocol, and analyze the impact of MAC parameters on them. Finally, results indicate that the low rate and low power IEEE 802.15.4 can be used effectively in case of WBANs if the payload is tuned properly through the proposed algorithm. The proposed AT-MAC algorithm yields around 70 percent increase in reliability of a critical node in a WBAN. |
The Brief Obsessive–Compulsive Scale (BOCS): A self-report scale for OCD and obsessive–compulsive related disorders | BACKGROUND
The Brief Obsessive Compulsive Scale (BOCS), derived from the Yale-Brown Obsessive-Compulsive Scale (Y-BOCS) and its children's version (CY-BOCS), is a short self-report tool used to aid in the assessment of obsessive-compulsive symptoms and the diagnosis of obsessive-compulsive disorder (OCD). It is widely used throughout child, adolescent and adult psychiatric settings in Sweden but has not been validated to date.
AIM
The aim of the current study was to examine the psychometric properties of the BOCS amongst a psychiatric outpatient population.
METHOD
The BOCS consists of a 15-item Symptom Checklist including three items (hoarding, dysmorphophobia and self-harm) related to the DSM-5 category "Obsessive-compulsive related disorders", accompanied by a single six-item Severity Scale for obsessions and compulsions combined. It encompasses the revisions made in the Y-BOCS-II severity scale by including obsessive-compulsive free intervals, extent of avoidance and excluding the resistance item. 402 adult psychiatric outpatients with OCD, attention-deficit/hyperactivity disorder, autism spectrum disorder and other psychiatric disorders completed the BOCS.
RESULTS
Principal component factor analysis produced five subscales titled "Symmetry", "Forbidden thoughts", "Contamination", "Magical thoughts" and "Dysmorphic thoughts". The OCD group scored higher than the other diagnostic groups in all subscales (P < 0.001). Sensitivities, specificities and internal consistency for both the Symptom Checklist and the Severity Scale emerged high (Symptom Checklist: sensitivity = 85%, specificities = 62-70% Cronbach's α = 0.81; Severity Scale: sensitivity = 72%, specificities = 75-84%, Cronbach's α = 0.94).
CONCLUSIONS
The BOCS has the ability to discriminate OCD from other non-OCD related psychiatric disorders. The current study provides strong support for the utility of the BOCS in the assessment of obsessive-compulsive symptoms in clinical psychiatry. |
"Play PRBLMS": Identifying and Correcting Less Accessible Content in Voice Interfaces | Voice interfaces often struggle with specific types of named content. Domain-specific terminology and naming may push the bounds of standard language, especially in domains like music where artistic creativity extends beyond the music itself. Artists may name themselves with symbols (e.g. M S C RA) that most standard automatic speech recognition (ASR) systems cannot transcribe. Voice interfaces also experience difficulty surfacing content whose titles include non-standard spellings, symbols or other ASCII characters in place of English letters, or are written using a non-standard dialect. We present a generalizable method to detect content that current voice interfaces underserve by leveraging differences in engagement across input modalities. Using this detection method, we develop a typology of content types and linguistic practices that can make content hard to surface. Finally, we present a process using crowdsourced annotations to make underserved content more accessible. |
Runtime Verification and Enforcement for Android Applications with RV-Droid | RV-Droid is an implemented framework dedicated to runtime verification (RV) and runtime enforcement (RE) of Android applications. RV-Droid consists of an Android application that interacts closely with a cloud. Running RV-Droid on their devices, users can select targeted Android applications from Google Play (or a dedicated repository) and a property. The cloud hosts third-party RV tools that are used to synthesize AspectJ aspects from the property. According to the chosen RV tool and the specification, some appropriate monitoring code, the original application and the instrumentation aspect are woven together. Weaving can occur either on the user's device or in the dedicated cloud. The woven application is then retrieved and executed on the user's device and the property is runtime verified. RV-Droid is generic and currently works with two existing runtime verification frameworks for (pure) Java programs: with JavaMOP and (partially) with RuleR. RV-Droid does not require any modification to the Android kernel and targeted applications can be retrieved off-the-shelf. We carried out several experiments that demonstrated the effectiveness of RV-Droid on monitoring (security) properties. |
Integrated Billing Solutions in the Internet of Things | The Internet of Things is one of the most promising technological developments in information technology. It promises huge financial and nonfinancial benefits across supply chains, in product life cycle and customer relationship applications as well as in smart environments. However, the adoption process of the Internet of Things has been slower than expected. One of the main reasons for this is the missing profitability for each individual stakeholder. Costs and benefits are not equally distributed. Cost benefit sharing models have been proposed to overcome this problem and to enable new areas of application. However, these cost benefit sharing approaches are complex, time consuming, and have failed to achieve broad usage. In this chapter, an alternative concept, suggesting flexible pricing and trading of information, is proposed. On the basis of a beverage supply chain scenario, a prototype installation, based on an open source billing solution and the Electronic Product Code Information Service (EPCIS), is shown as a proof of concept and an introduction to different pricing options. This approach allows a more flexible and scalable solution for cost benefit sharing and may enable new business models for the Internet of Things. University of Bremen, Planning and Control of Production Systems, Germany |
Dyspraxia in autism: association with motor, social, and communicative deficits. | Impaired performance of skilled gestures, referred to as dyspraxia, is consistently reported in children with autism; however, its neurological basis is not well understood. Basic motor skill deficits are also observed in children with autism and it is unclear whether dyspraxia observed in children with autism can be accounted for by problems with motor skills. Forty-seven high-functioning children with an autism spectrum disorder (ASD), autism, or Asperger syndrome (43 males, four females; mean age 10y 7m [SD 1y 10m], mean Full-scale IQ (FSIQ) 99.4 [SD 15.9]), and 47 typically developing (TD) controls (41 males, six females; mean age 10y 6m [SD 1y 5m], mean FSIQ 113.8 [SD 12.3], age range 8-4y) completed: (1) the Physical and Neurological Assessment of Subtle Signs, an examination of basic motor skills standardized for children, and (2) a praxis examination that included gestures to command, to imitation, and with tool-use. Hierarchical regression was used to examine the association between basic motor skill performance (i.e. times to complete repetitive limb movements) and praxis performance (total praxis errors). After controlling for age and IQ, basic motor skill was a significant predictor of performance on praxis examination. Nevertheless, the ASD group continued to show significantly poorer praxis than controls after accounting for basic motor skill. Furthermore, praxis performance was a strong predictor of the defining features of autism, measured using the Autism Diagnostic Observation Schedule, and this correlation remained significant after accounting for basic motor skill. Results indicate that dyspraxia in autism cannot be entirely accounted for by impairments in basic motor skills, suggesting the presence of additional contributory factors. 
Furthermore, praxis in children with autism is strongly correlated with the social, communicative, and behavioral impairments that define the disorder, suggesting that dyspraxia may be a core feature of autism or a marker of the neurological abnormalities underlying the disorder. |
Professor Forcing: A New Algorithm for Training Recurrent Networks | The Teacher Forcing algorithm trains recurrent networks by supplying observed sequence values as inputs during training and using the network's own one-step-ahead predictions to do multi-step sampling. We introduce the Professor Forcing algorithm, which uses adversarial domain adaptation to encourage the dynamics of the recurrent network to be the same when training the network and when sampling from the network over multiple time steps. We apply Professor Forcing to language modeling, vocal synthesis on raw waveforms, handwriting generation, and image generation. Empirically we find that Professor Forcing acts as a regularizer, improving test likelihood on character level Penn Treebank and sequential MNIST. We also find that the model qualitatively improves samples, especially when sampling for a large number of time steps. This is supported by human evaluation of sample quality. Trade-offs between Professor Forcing and Scheduled Sampling are discussed. We produce T-SNEs showing that Professor Forcing successfully makes the dynamics of the network during training and sampling more similar. |
Essentials of electronic testing for digital, memory, and mixed-signal VLSI circuits [Book Review] | M. Bushnell, V.D. Agrawal, Essentials of Electronic Testing for Digital, Memory and Mixed-Signal VLSI Circuits. |
Mechanisms of Persistent Activity in Cortical Circuits: Possible Neural Substrates for Working Memory. | A commonly observed neural correlate of working memory is firing that persists after the triggering stimulus disappears. Substantial effort has been devoted to understanding the many potential mechanisms that may underlie memory-associated persistent activity. These rely either on the intrinsic properties of individual neurons or on the connectivity within neural circuits to maintain the persistent activity. Nevertheless, it remains unclear which mechanisms are at play in the many brain areas involved in working memory. Herein, we first summarize the palette of different mechanisms that can generate persistent activity. We then discuss recent work that asks which mechanisms underlie persistent activity in different brain areas. Finally, we discuss future studies that might tackle this question further. Our goal is to bridge between the communities of researchers who study either single-neuron biophysical, or neural circuit, mechanisms that can generate the persistent activity that underlies working memory. |
Adaptive noise cancellation with a multirate normalized least mean squares filter | Multirate adaptive filtering is related to the problem of reconstructing a high-resolution signal from two or more observations that are sampled at different rates. A popular existing method for solving this problem uses a multirate adaptive filter structure based on the least mean squares (LMS) approach. However, its low convergence rate restricts the use of this method. In this study, a multirate normalized LMS (NLMS) filter is proposed as an alternative to the LMS-based filter, for the reconstruction of the high-resolution signal from several low-resolution noisy observations. In a simulation example performed on an audio signal, the proposed method leads to better results than the existing method, especially in convergence rate. |
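The core NLMS recursion normalizes the LMS weight update by the input energy, which is what speeds up convergence relative to plain LMS. A minimal single-rate sketch (the multirate structure of the paper is omitted; the function name, default step size, and regularizer are our own):

```python
import numpy as np

def nlms(x, d, num_taps=8, mu=0.5, eps=1e-8):
    """Normalized LMS: adapt FIR weights w so that w . u[n] tracks d[n].

    x: reference input signal, d: desired signal.
    Returns the error signal e (the 'cleaned' output in noise
    cancellation) and the final weight vector w.
    """
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # most recent sample first
        y = w @ u                              # filter output
        e[n] = d[n] - y                        # estimation error
        w += mu * e[n] * u / (u @ u + eps)     # energy-normalized update
    return e, w
```

As a sanity check, feeding the filter a signal and the output of a known FIR system drives the weights toward that system's impulse response, so the error signal decays toward zero in the noiseless case.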
Contour-aware network for semantic segmentation via adaptive depth | Semantic segmentation has been widely investigated for its important role in computer vision. However, some challenges still exist. The first challenge is how to perceive semantic regions with various attributes, which can result in an unbalanced distribution of training samples. Another challenge is accurate semantic boundary determination. In this paper, a contour-aware network for semantic segmentation via adaptive depth is proposed, which exploits the power of adaptive-depth and contour-aware neural networks for pixel-level semantic segmentation. Specifically, an adaptive-depth model, which can adaptively determine the feedback and forward procedure of the neural network, is constructed. Moreover, a contour-aware neural network is built to enhance the coherence and the localization accuracy of semantic regions. By formulating the contour information and coarse semantic segmentation results in a unified manner, global inference is proposed to obtain the final segmentation results. Three contributions are claimed: (1) semantic segmentation via an adaptive-depth neural network; (2) a contour-aware neural network for semantic segmentation; and (3) global inference for the final decision. Experiments on three popular datasets are conducted, and the results verify the superiority of the proposed method compared with state-of-the-art methods. |
A Capability-Based Framework for Open Innovation: Complementing Absorptive Capacity | We merge research into knowledge management, absorptive capacity, and dynamic capabilities to arrive at an integrative perspective, which considers knowledge exploration, retention, and exploitation inside and outside a firm’s boundaries. By complementing the concept of absorptive capacity, we advance towards a capability-based framework for open innovation processes. We identify the following six ‘knowledge capacities’ as a firm’s critical capabilities of managing internal and external knowledge in open innovation processes: inventive, absorptive, transformative, connective, innovative, and desorptive capacity. ‘Knowledge management capacity’ is a dynamic capability, which reconfigures and realigns the knowledge capacities. It refers to a firm’s ability to successfully manage its knowledge base over time. The concept may be regarded as a framework for open innovation, as a complement to absorptive capacity, and as a move towards understanding dynamic capabilities for managing knowledge. On this basis, it contributes to explaining interfirm heterogeneity in knowledge and alliance strategies, organizational boundaries, and innovation |
Coping with school failure: Characteristics of students employing successful and unsuccessful coping strategies | This study examined the characteristics of four groups of children employing positive, defensive, self-blame, or mixed strategies to cope with a failure experience in school. The findings indicated that children who employ positive/action-oriented strategies are more likely to have higher academic achievement and a higher sense of self-worth. In addition, they tend to view themselves as more competent in the area of scholastic achievement and express that they feel successful in their peer relations. Although preliminary, these results suggest that children would profit from a school environment that fosters the development of positive/problem-focused skills. |
Prior and contextual emotion of words in sentential context | A set of words labeled with their prior emotion is an obvious place to start on the automatic discovery of the emotion of a sentence, but it is clear that context must also be considered. It may be that no simple function of the labels on the individual words captures the overall emotion of the sentence; words are interrelated and they mutually influence their affect-related interpretation. It happens quite often that a word which invokes emotion appears in a neutral sentence, or that a sentence with no emotional word carries an emotion. This can also happen across emotion classes. The goal of this work is to distinguish automatically between prior and contextual emotion, with a focus on exploring features important in this task. We present a set of features which enable us to take the contextual emotion of a word and the syntactic structure of the sentence into account to assign sentences to emotion classes. The evaluation includes assessing the performance of different feature sets across multiple classification methods. We show the features and a promising learning method which significantly outperforms two reasonable baselines. We group our features by the similarity of their nature, so another facet of our evaluation considers each group of features separately and investigates how well it contributes to the result. The experiments show that all features contribute to the result, but it is the combination of all the features that gives the best performance. |
Atypical manifestations in Brazilian patients with neuro-Behçet’s disease | Type and frequency of systemic and neurologic manifestations of Behçet’s disease (BD) vary with ethnicity. In Brazil, BD occurs as sporadic cases. We describe clinical and radiological features of 36 Brazilian patients of mixed ethnicity with neuro-Behçet’s disease (NBD). Medical records of 178 BD patients were reviewed and 36 (20%) NBD patients were identified. Twenty-one NBD patients (58.3%) were female and 27 (75%) presented with parenchymal manifestations. Brainstem involvement was the most common neurologic syndrome (41.7%). Seizures (27.8%), isolated aseptic meningitis (16.7%), optic neuropathy (ON) (16.7%), cerebral venous thrombosis (CVT) (8.3%), peripheral neuropathy (2.8%), and spinal cord involvement (5.6%) were other neurologic manifestations observed among Brazilian NBD patients. Eighteen (50%) had at least one relapse, and isolated aseptic meningitis was the most common relapsing manifestation. No significant differences concerning the number of relapses between parenchymal and non-parenchymal groups were found. A multivariate model including disease duration, cell count in spinal fluid, cyclosporine use, immunosuppressive use at disease onset, age at NBD onset, and ON did not reveal any significant associations with NBD relapse. There was a low frequency of CVT and an unexpected higher number of isolated aseptic meningitis. Brazilian NBD patients present more parenchymal and atypical manifestations, and relapse more often than NBD patients from other populations. |
How Can Childbirth Care for the Rural Poor Be Improved? A Contribution from Spatial Modelling in Rural Tanzania | INTRODUCTION
Maternal and perinatal mortality remain a challenge in resource-limited countries, particularly among the rural poor. To save lives at birth, health facility delivery is recommended. However, increasing coverage of institutional deliveries may not translate into mortality reduction if shortage of qualified staff and lack of enabling working conditions affect quality of services. In Tanzania, childbirth care is available in all facilities; yet maternal and newborn mortality are high. The study aimed to assess, in a high facility density rural context, whether a health system organization with fewer delivery sites is feasible in terms of population access.
METHODS
Data on health facilities' location, staffing and delivery caseload were examined in Ludewa and Iringa Districts, Southern Tanzania. Geospatial raster and network analysis were performed to estimate access to obstetric services in walking time. The present geographical accessibility was compared to a theoretical scenario with a 40% reduction of delivery sites.
RESULTS
About half of first-line health facilities had insufficient staff to offer full-time obstetric services (45.7% in Iringa and 78.8% in Ludewa District). Yearly delivery caseload at first-line health facilities was low, with less than 100 deliveries in 48/70 and 43/52 facilities in Iringa and Ludewa District respectively. Wide geographical overlaps of facility catchment areas were observed. In Iringa 54% of the population was within 1-hour walking distance from the nearest facility and 87.8% within 2 hours, in Ludewa, the percentages were 39.9% and 82.3%. With a 40% reduction of delivery sites, approximately 80% of population will still be within 2 hours' walking time.
CONCLUSIONS
Our findings from spatial modelling in a high facility density context indicate that reducing delivery sites by 40% will decrease population access within 2 hours by 7%. Focused efforts on fewer delivery sites might assist strengthening delivery services in resource-limited settings. |
k-Anonymously Private Search over Encrypted Data | In this paper we compare the performance of various homomorphic encryption methods on a private search scheme that can achieve k-anonymity privacy. To make our benchmarking fair, we use open sourced cryptographic libraries which are written by experts and well scrutinized. We find that Goldwasser-Micali encryption achieves good enough performance for practical use, whereas fully homomorphic encryptions are much slower than partial ones like Goldwasser-Micali and Paillier. |
Comparison of Change Theories | The purpose of this article is to summarize several change theories and assumptions about the nature of change. The author shows how successful change can be encouraged and facilitated for long-term success. The article compares the characteristics of Lewin’s Three-Step Change Theory, Lippitt’s Phases of Change Theory, Prochaska and DiClemente’s Change Theory, Social Cognitive Theory, and the Theory of Reasoned Action and Planned Behavior to one another. Leading industry experts will need to continually review and provide new information relative to the change process and to our evolving society and culture. There are many change theories, and some of the most widely recognized are briefly summarized in this article. The theories serve as a testimony to the fact that change is a real phenomenon. It can be observed and analyzed through various steps or phases. The theories have been conceptualized to answer the question, “How does successful change happen?” Lewin’s Three-Step Change Theory Kurt Lewin (1951) introduced the three-step change model. This social scientist views behavior as a dynamic balance of forces working in opposing directions. Driving forces facilitate change because they push employees in the desired direction. Restraining forces hinder change because they push employees in the opposite direction. Therefore, these forces must be analyzed and Lewin’s three-step model can help shift the balance in the direction of the planned change (http://www.csupomona.edu/~jvgrizzell/best_practices/bctheory.html). According to Lewin, the first step in the process of changing behavior is to unfreeze the existing situation or status quo. The status quo is considered the equilibrium state. Unfreezing is necessary to overcome the strains of individual resistance and group conformity.
Unfreezing can be achieved by the use of three methods. First, increase the driving forces that direct behavior away from the existing situation or status quo. Second, decrease the restraining forces that negatively affect the movement from the existing equilibrium. Third, find a combination of the two methods listed above. Some activities that can assist in the unfreezing step include: motivate participants by preparing them for change, build trust and recognition for the need to change, and actively participate in recognizing problems and brainstorming solutions within a group (Robbins 564-65). Lewin’s second step in the process of changing behavior is movement. In this step, it is necessary to move the target system to a new level of equilibrium. Three actions that can assist in the movement step include: persuading employees to agree that the status quo is not beneficial to them and encouraging them to view the problem from a fresh perspective, work together on a quest for new, relevant information, and connect the views of the group to well-respected, powerful leaders that also support the change (http://www.csupomona.edu/~jvgrizzell/best_practices/bctheory.html). The third step of Lewin’s three-step change model is refreezing. This step needs to take place after the change has been implemented in order for it to be sustained or “stick” over time. It is highly likely that the change will be short-lived and the employees will revert to their old equilibrium (behaviors) if this step is not taken. It is the actual integration of the new values into the community values and traditions. The purpose of refreezing is to stabilize the new equilibrium resulting from the change by balancing both the driving and restraining forces. One action that can be used to implement Lewin’s third step is to reinforce new patterns and institutionalize them through formal and informal mechanisms including policies and procedures (Robbins 564-65).
Therefore, Lewin’s model illustrates the effects of forces that either promote or inhibit change. Specifically, driving forces promote change while restraining forces oppose change. Hence, change will occur when the combined strength of one force is greater than the combined strength of the opposing set of forces (Robbins 564-65). Lippitt’s Phases of Change Theory Lippitt, Watson, and Westley (1958) extend Lewin’s Three-Step Change Theory. Lippitt, Watson, and Westley created a seven-step theory that focuses more on the role and responsibility of the change agent than on the evolution of the change itself. Information is continuously exchanged throughout the process. The seven steps are: |
Concentration profiles in the underlying alloy during the oxidation of iron-chromium alloys | Abstract Compositional changes in Fe-27.4, 37.4 and 59.5% Cr alloys during the formation of Cr2O3-rich scales have been measured by electron probe microanalysis and calculated by solution of the appropriate transport equations. Agreement between corresponding results is within the limits of experimental error. Chromium concentration profiles have also been calculated for Fe-14.0 and Fe-18.0% Cr by a similar method, allowance being made for the formation of γ-phase at the surface of the alloy. The chromium concentration at the alloy/oxide interface never falls below the limiting value for thermodynamic stability of a Cr2O3 scale. Chemical transformation of the Cr2O3 scales from within is therefore not possible and the failure of such scales on Fe-14.0 and Fe-18.0% Cr is attributed to mechanical causes. |
Income Distribution and Business Cycles | Despite a century of work, significant disagreement remains on what causes business cycles. A literature survey of business cycle theory and empirical research demonstrates how extensive the work is in this field (see Zarnowitz for an overview of business cycles). Not only is there a lack of consensus across schools of thought, but substantial theoretical differences exist within the Neoclassical school. Controversy persists over whether cycles are created endogenously or exogenously, whether the real or financial sector is more important, and whether the economy is fundamentally stable or unstable. However, progressive economists agree that profits, profit rates, the price-cost cycle, and the functional distribution of income are central in generating cycles. The purpose of this study is to argue for the importance of income distribution in generating capitalist cycles because of its effect on profitability and investment. Specifically, the objectives are two-fold: (1) To explain how income distribution is determined and how changes in income distribution induce cycles, and (2) To measure income distribution and to determine whether there is an empirical connection between cycles and income distribution. This paper's specific contributions are that it synthesizes heterodox cycle theory, clarifies the specific and critical role played by income distribution in perpetuating cycles, and sets forth a multidimensional, environmental context which determines cycles and trends in income distribution. This effort should prove to be of interest and importance to heterodox economists. First, cycles in production affect employment and income levels of many people who are of modest means and depend upon the business community for their living standard. Understanding cycles is one step toward creating instrumental change.
Second, the following theory incorporates a number of important principles: endogenously created outcomes, interdependent variables, power differences between corporations and labor, importance of property rights and status, and a central role for institutions and technology. And finally, this work attempts to forge some convergence in heterodox thought on business cycles. This is one area where progressive economists can find common ground.(1) Literature Survey Heterodox economic schools emphasize the central role of profits in generating cycles. For Veblen, profit considerations dominate business decision-making. When selling prices are above costs, production increases and collateral values rise along with the use of credit; the reverse causes debt liquidation and a downturn (Veblen, 1904, chap. 7). Mitchell (1951 and 1959) is known for his elaborate statistical analysis of cycles and the importance he places on profit pursuits creating each of the cycle phases. Heilbroner and Gordon argue that cycles are functions of the quantity of investment opportunities and the given social setting. This setting depends on the current state of the relationship between business and labor, business and government, and business and the public. However, there is no endogenous explanation for when social settings change (Heilbroner, 1988, pp. 73-74; Gordon, 1978). Some economists argue that income distribution affects economic stability or growth (Peach, 1987; Jarsulic, 1988; Myrdal, 1962; Hahnel and Sherman, 1982; Sherman, 1991; Kalecki, 1971; and Pulling, 1978) while others believe that income distribution is important, but not causal (Haslag and Fomby, 1988 and Goldstein, 1986). Peach writes that income is distributed according to the forms of institutional power arrangements in society and that there is no tradeoff between growth and increased income equality. 
After reviewing the empirical work on income shares and crises, Jarsulic constructs a dynamic model where expectations and nonexpectational factors play an important role in affecting stability. He concludes that economists should explicitly consider how income shares are determined and that this could provide further insight into the analysis of financial instability (Jarsulic 1988, pp. … |
Tensor-Factorized Neural Networks | The growing interests in multiway data analysis and deep learning have drawn tensor factorization (TF) and neural network (NN) as the crucial topics. Conventionally, the NN model is estimated from a set of one-way observations. Such a vectorized NN is not generalized for learning the representation from multiway observations. The classification performance using vectorized NN is constrained, because the temporal or spatial information in neighboring ways is disregarded. More parameters are required to learn the complicated data structure. This paper presents a new tensor-factorized NN (TFNN), which tightly integrates TF and NN for multiway feature extraction and classification under a unified discriminative objective. This TFNN is seen as a generalized NN, where the affine transformation in an NN is replaced by the multilinear and multiway factorization for tensor-based NN. The multiway information is preserved through layerwise factorization. Tucker decomposition and nonlinear activation are performed in each hidden layer. The tensor-factorized error backpropagation is developed to train TFNN with the limited parameter size and computation time. This TFNN can be further extended to realize the convolutional TFNN (CTFNN) by looking at small subtensors through the factorized convolution. Experiments on real-world classification tasks demonstrate that TFNN and CTFNN attain substantial improvement when compared with an NN and a convolutional NN, respectively. |
Better streaming algorithms for clustering problems | We study clustering problems in the streaming model, where the goal is to cluster a set of points by making one pass (or a few passes) over the data using a small amount of storage space. Our main result is a randomized algorithm for the k-Median problem which produces a constant factor approximation in one pass using storage space O(k · poly log n). This is a significant improvement over the previous best algorithm, which yielded a 2^O(1/ε) approximation using O(n^ε) space. Next we give a streaming algorithm for the k-Median problem with an arbitrary distance function. We also study algorithms for clustering problems with outliers in the streaming model. Here, we give bicriterion guarantees, producing constant factor approximations by increasing the allowed fraction of outliers slightly. |
Westward extension of the Snaefellsnes Volcanic Zone of Iceland | On the basis of new geophysical data it is postulated that volcanic features similar to those occurring in the east-west trending Snaefellsnes volcanic zone of Iceland extend at least 100 km to the west of the Snaefellsnes peninsula. |
Lung transplantation for advanced cystic lung disease due to nonamyloid kappa light chain deposits. | RATIONALE
Cystic lung light chain deposition disease (LCDD) is a severe and rare form of nonamyloid kappa light chain deposits localized in the lung, potentially leading to end-stage respiratory insufficiency.
OBJECTIVES
To assess the outcome after lung transplantation (LT) in this setting with particular attention to disease recurrence.
METHODS
We conducted a retrospective multicenter study of seven patients who underwent LT for cystic lung LCDD in France between September 1992 and June 2012 in five centers.
MEASUREMENTS AND MAIN RESULTS
In total, five females and two males (mean age, 39.1 ± 5.3 yr) underwent one single LT and seven double LT (one retransplantation). Before LT, the patients showed a constant obstructive ventilatory pattern with low carbon monoxide diffusing capacity and resting hypoxemia. Lung computed tomography revealed widespread cysts with occasional micronodulations. No extrapulmonary disease or plasma cell neoplasm was detected. The serum free kappa/lambda light chain ratio was increased in three cases. The median follow-up after LT was 56 months (range, 1-110 mo). Kaplan-Meier survival was 85.7, 85.7, and 64.3% at 1, 3, and 5 years, respectively. Three patients died from multiorgan failure (n = 1), chronic rejection (n = 1), and breast cancer (n = 1) at 23 days, 56 months, and 96 months, respectively. At the end of follow-up, no patients showed recurrence on imaging or histopathology.
CONCLUSIONS
This small case series confirms that cystic lung LCDD is a severe disease limited to the lung, affecting mostly young females. LT appears to be a good therapeutic option allowing for satisfactory long-term survival. We found no evidence of recurrence of the disease after LT. |
Content Caching and Distribution in Smart Grid Enabled Wireless Networks | To facilitate wireless transmission of multimedia content to mobile users, we propose a content caching and distribution framework for smart grid enabled OFDM networks, where each popular multimedia file is coded and distributively stored in multiple energy harvesting enabled serving nodes (SNs), and the green energy distributively harvested by SNs can be shared with each other through the smart grid. The distributive caching, green energy sharing, and the on-grid energy backup have improved the reliability and performance of the wireless multimedia downloading process. To minimize the total on-grid power consumption of the whole network, while guaranteeing that each user can retrieve the whole content, the user association scheme is jointly designed with consideration of resource allocation, including subchannel assignment, power allocation, and power flow among nodes. Simulation results demonstrate that bringing content, green energy, and SN closer to the end user can notably reduce the on-grid energy consumption. |
Intelligent Bayesian Asset Allocation via Market Sentiment Views | The sentiment index of market participants has been extensively used for stock market prediction in recent years. Many financial information vendors also provide it as a service. However, utilizing market sentiment under the asset allocation framework has been rarely discussed. In this article, we investigate the role of market sentiment in an asset allocation problem. We propose to compute sentiment time series from social media with the help of natural language processing techniques. A novel neural network design, built upon an ensemble of evolving clustering and long short-term memory, is used to formalize sentiment information into market views. These views are later integrated into modern portfolio theory through a Bayesian approach. We analyze the performance of this asset allocation model from many aspects, such as stability of portfolios, computing of sentiment time series, and profitability in our simulations. Experimental results show that our model outperforms some of the most successful forecasting techniques. Thanks to the introduction of the evolving clustering method, the estimation accuracy of market views is significantly improved. |
Give Me More Feedback: Annotating Argument Persuasiveness and Related Attributes in Student Essays | • Oracle experiment: to understand how well these attributes, when used together, can explain persuasiveness, we train 3 linear SVM regressors, one for each component type, to score an argument's persuasiveness using gold attributes as features • Two human annotators who were both native speakers of English were first familiarized with the rubrics and definitions and then trained on five essays • 30 essays were doubly annotated for computing inter-annotator agreement • Each of the remaining essays was annotated by one of the annotators • Score/Class distributions by component type |
Design and methods of the MAINTAIN study: a randomized controlled clinical trial of micronutrient and antioxidant supplementation in untreated HIV infection. | Micronutrient deficiencies are common in HIV positive persons and are associated with a poorer prognosis, but the role of micronutrient supplementation in the medical management of HIV infection remains controversial, as some but not all studies show immunological and clinical benefit. Micronutrient supplementation could be a relatively low cost strategy to defer the initiation of expensive, potentially toxic and lifelong antiretroviral therapy. The MAINTAIN study is a Canadian multi-center randomized controlled double-blind clinical trial to evaluate whether micronutrient supplementation of HIV positive persons slows progression of immune deficiency and delays the need to start antiretroviral therapy, and is safe, compared to standard multivitamins. Untreated asymptomatic HIV positive adults will receive a micronutrient and antioxidant preparation (n = 109) or an identical appearing recommended daily allowance multivitamin and mineral preparation (n = 109) for two years. Participants will be followed quarterly and monitored for time from baseline to CD4 T lymphocyte count <350 cells/mm(3), or emergence of CDC-defined AIDS-defining illness, or the start of antiretroviral therapy. We will also compare safety and health related quality of life between groups. Primary analysis will compare the incidence of the composite primary outcome between study groups and will be by intention-to-treat. The study was originally expected to last three years, with accrual over one year and a minimum of two years follow up of the last enrolled participant. We discuss here the study design and methods, often used for evaluation of complementary and adjunctive treatments for health maintenance in HIV infection, which are common interventions. |
A review of biometric technology along with trends and prospects | Identity management through biometrics offers potential advantages over knowledge- and possession-based methods. A wide variety of biometric modalities have been tested so far, but several factors limit the accuracy of mono-modal biometric systems. Usually, the analysis of multiple modalities offers better accuracy. An extensive review of biometric technology is presented here. Besides the mono-modal systems, the article also discusses multi-modal biometric systems along with their architecture and information fusion levels. The paper, along with exemplary evidence, highlights the potential of biometric technology, its market value and prospects. Keywords— Biometrics, Fingerprint, Face, Iris, Retina, Behavioral biometrics, Gait, Voice, Soft biometrics, Multi-modal biometrics. |
Organic Photodiodes: The Future of Full Color Detection and Image Sensing. | Major growth in the image sensor market is largely as a result of the expansion of digital imaging into cameras, whether stand-alone or integrated within smart cellular phones or automotive vehicles. Applications in biomedicine, education, environmental monitoring, optical communications, pharmaceutics and machine vision are also driving the development of imaging technologies. Organic photodiodes (OPDs) are now being investigated for existing imaging technologies, as their properties make them interesting candidates for these applications. OPDs offer cheaper processing methods, devices that are light, flexible and compatible with large (or small) areas, and the ability to tune the photophysical and optoelectronic properties - both at a material and device level. Although the concept of OPDs has been around for some time, it is only relatively recently that significant progress has been made, with their performance now reaching the point that they are beginning to rival their inorganic counterparts in a number of performance criteria including the linear dynamic range, detectivity, and color selectivity. This review covers the progress made in the OPD field, describing their development as well as the challenges and opportunities. |
MalClassifier: Malware family classification using network flow sequence behaviour | Anti-malware vendors receive daily thousands of potentially malicious binaries to analyse and categorise before deploying the appropriate defence measure. Considering the limitations of existing malware analysis and classification methods, we present MalClassifier, a novel privacy-preserving system for the automatic analysis and classification of malware using network flow sequence mining. MalClassifier allows identifying the malware family behind detected malicious network activity without requiring access to the infected host or malicious executable reducing overall response time. MalClassifier abstracts the malware families' network flow sequence order and semantics behaviour as an n-flow. By mining and extracting the distinctive n-flows for each malware family, it automatically generates network flow sequence behaviour profiles. These profiles are used as features to build supervised machine learning classifiers (K-Nearest Neighbour and Random Forest) for malware family classification. We compute the degree of similarity between a flow sequence and the extracted profiles using a novel fuzzy similarity measure that computes the similarity between flows attributes and the similarity between the order of the flow sequences. For classifier performance evaluation, we use network traffic datasets of ransomware and botnets obtaining 96% F-measure for family classification. MalClassifier is resilient to malware evasion through flow sequence manipulation, maintaining the classifier's high accuracy. Our results demonstrate that this type of network flow-level sequence analysis is highly effective in malware family classification, providing insights on reoccurring malware network flow patterns. |
Distinguishable brain activation networks for short- and long-term motor skill learning. | The acquisition of a new motor skill is characterized first by a short-term, fast learning stage in which performance improves rapidly, and subsequently by a long-term, slower learning stage in which additional performance gains are incremental. Previous functional imaging studies have suggested that distinct brain networks mediate these two stages of learning, but direct comparisons using the same task have not been performed. Here we used a task in which subjects learn to track a continuous 8-s sequence demanding variable isometric force development between the fingers and thumb of the dominant, right hand. Learning-associated changes in brain activation were characterized using functional MRI (fMRI) during short-term learning of a novel sequence, during short-term learning after prior, brief exposure to the sequence, and over long-term (3 wk) training in the task. Short-term learning was associated with decreases in activity in the dorsolateral prefrontal, anterior cingulate, posterior parietal, primary motor, and cerebellar cortex, and with increased activation in the right cerebellar dentate nucleus, the left putamen, and left thalamus. Prefrontal, parietal, and cerebellar cortical changes were not apparent with short-term learning after prior exposure to the sequence. With long-term learning, increases in activity were found in the left primary somatosensory and motor cortex and in the right putamen. Our observations extend previous work suggesting that distinguishable networks are recruited during the different phases of motor learning. 
While short-term motor skill learning seems associated primarily with activation in a cortical network specific for the learned movements, long-term learning involves increased activation of a bihemispheric cortical-subcortical network in a pattern suggesting "plastic" development of new representations for both motor output and somatosensory afferent information. |
Single Image Layer Separation via Deep ADMM Unrolling | Single image layer separation aims to divide the observed image into two independent components according to special task requirements and has been widely used in many vision and multimedia applications. Because this task is fundamentally ill-posed, most existing approaches tend to design complex priors on the separated layers. However, the cost function with complex prior regularization is hard to optimize. The performance is also compromised by fixed iteration schemes and limited data fitting ability. More importantly, it is also challenging to design a unified framework to separate image layers for different applications. To partially mitigate the above limitations, we develop a flexible optimization unrolling technique to incorporate deep architectures into iterations for adaptive image layer separation. Specifically, we first design a general energy model with implicit priors and adopt the widely used alternating direction method of multipliers (ADMM) to establish our basic iteration scheme. By unrolling with residual convolution architectures, we successfully obtain a simple, flexible, and data-dependent image separation method. Extensive experiments on the tasks of rain streak removal and reflection removal validate the effectiveness of our approach. |