instruction (stringlengths 10-664) | context (stringlengths 1-5.66k) | response (stringlengths 1-3.34k) | category (stringclasses, 1 value)
---|---|---|---|
Discordance between resident and faculty perceptions of resident autonomy: can self-determination theory help interpret differences and guide strategies for bridging the divide? | To identify and interpret differences between resident and faculty perceptions of resident autonomy and of faculty support of resident autonomy. Parallel questionnaires were sent to pediatric residents and faculty at the University of Rochester Medical Center in 2011. Items addressed self-determination theory (SDT) constructs (autonomy, competence, relatedness) and asked residents and faculty to rate and/or comment on their own and the other group's behaviors. Distributions of responses to 17 parallel Likert scale items were compared by Wilcoxon rank-sum tests. Written comments underwent qualitative content analysis. Respondents included 62/78 residents (79%) and 71/100 faculty (71%). The groups differed significantly on 15 of 17 parallel items but agreed that faculty sometimes provided too much direction. Written comments suggested that SDT constructs were closely interrelated in residency training. Residents expressed frustration that their care plans were changed without explanation. Faculty reported reluctance to give "passive" residents autonomy in patient care unless stakes were low. Many reported granting more independence to residents who displayed motivation and competence. Some described working to overcome residents' passivity by clarifying and reinforcing expectations. | Faculty and residents had discordant perceptions of resident autonomy and of faculty support for resident autonomy. When faculty restrict the independence of "passive" residents whose competence they question, residents may receive fewer opportunities for active learning. Strategies that support autonomy, such as scaffolding, may help residents gain confidence and competence, enhance residents' relatedness to team members and supervisors, and help programs adapt to accreditation requirements to foster residents' growth in independence. | closed_qa |
Chest pain: if it hurts a lot, is heart attack more likely? | In previous studies including patients with suspected cardiac chest pain, those who had acute myocardial infarction (AMI) reported more severe chest pain than those without AMI. However, many patients with AMI present with very mild pain or discomfort. We aimed to investigate whether peak pain severity, as reported by patients in the Emergency Department, has any potential role in the risk stratification of patients with suspected cardiac chest pain. In this secondary analysis from a prospective diagnostic cohort study, we included patients presenting to the Emergency Department with suspected cardiac chest pain. Patients were asked to report their maximum pain severity using an 11-point numeric rating scale at the time of initial presentation. The primary outcome was a diagnosis of AMI, adjudicated by two independent investigators on the basis of reference standard (12 h) troponin testing. Of the 455 patients included in this analysis, 79 (17.4%) had AMI. Patients with AMI had marginally higher pain scores (eight, interquartile range 5-8) than those without AMI (seven, interquartile range 6-8, P=0.03). However, the area under the receiver operating characteristic curve for the numeric rating scale pain score was 0.58 (95% confidence interval 0.51-0.65), indicating poor overall diagnostic accuracy. AMI occurred in 12.1% of patients with pain score 0-3, 17.1% with pain score 4-6 and 18.8% with pain score 7-10. Among patients with AMI, pain score was not correlated with 12-h troponin levels (r=-0.001, P=0.99). | Pain score has limited diagnostic value for AMI. Scores should guide analgesia but shift the probability of AMI very little, and should not guide other clinical management. | closed_qa |
Cyclophosphamide: As bad as its reputation? | Despite new treatment modalities, cyclophosphamide (CYC) remains a cornerstone in the treatment of organ or life-threatening vasculitides and connective tissue disorders. We aimed to analyse the short- and long-term side-effects of CYC treatment in patients with systemic autoimmune diseases. Chart review and phone interviews regarding side effects of CYC in patients with systemic autoimmune diseases treated between 1984 and 2011 in a single university centre. Adverse events were stratified according to the "Common Terminology Criteria for Adverse Events" version 4. A total of 168 patients were included. Cumulative CYC dose was 7.45 g (range 0.5-205 g). Gastro-intestinal side effects were seen in 68 events, hair loss occurred in 38 events. A total of 58 infections were diagnosed in 44/168 patients (26.2%) with 9/44 suffering multiple infections. Severity grading of infections was low in 37/58 cases (63.8%). One CYC-related infection-induced death (0.6%) was registered. Amenorrhoea occurred in 7/92 females (7.6%) with 5/7 remaining irreversible. In females with reversible amenorrhoea, prophylaxis with nafarelin had been administered. Malignancy was registered in 19 patients after 4.7 years (median, range 0.25-22.25), presenting as 4 premalignancies and 18 malignancies; 3 patients suffered 2 premalignancies/malignancies each. Patients with malignancies were older with a higher cumulative CYC dose. Death was registered in 28 patients (16.6%) with 2/28 probably related to CYC. | Considering the organ or life-threatening conditions which indicate the use of CYC, severe drug-induced health problems were rare. Our data confirm the necessity to follow up patients long-term for timely diagnosis of malignancies. CYC side-effects do not per se justify prescription of newer drugs or biologic agents in the treatment of autoimmune diseases. | closed_qa |
Is it all Lynch syndrome? | Mismatch repair-deficient (MMRD) colorectal cancer (CRC) and endometrial cancer (EC) may be suggestive of Lynch syndrome (LS). LS can be confirmed only by positive germ-line testing. It is unclear if individuals with MMRD tumors but no identifiable cause (MMRD+/germ-line-) have LS. Because LS is hereditary, individuals with LS are expected to have family histories of LS-related tumors. Our study compared the family histories of MMRD+/germ-line- CRC and/or EC patients with LS CRC and/or EC patients. A total of 253 individuals with an MMRD CRC or EC from one institution were included for analysis in one of four groups: LS; MMRD+/germ-line-; MMRD tumor with variant of uncertain significance (MMRD+/VUS); and sporadic MSI-H (MMRD tumor with MLH1 promoter hypermethylation or BRAF mutation). Family histories were analyzed utilizing MMRpro and PREMM1,2,6. Kruskal-Wallis tests were used to compare family history scores. MMRD+/germ-line- individuals had significantly lower median family history scores (MMRpro = 8.1, PREMM1,2,6 = 7.3) than did LS individuals (MMRpro = 89.8, PREMM1,2,6 = 26.1, P<0.0001). | MMRD+/germ-line- individuals have less suggestive family histories of LS than individuals with LS. These results imply that MMRD+/germ-line- individuals may not all have LS. This finding highlights the need to determine other causes of MMRD tumors so that these patients and their families can be accurately counseled regarding screening and management. | closed_qa |
Is the iPad suitable for image display at American Board of Radiology examinations? | The study aimed to determine the acceptability of the iPad 3 as a display option for American Board of Radiology (ABR) examinations. A set of 20 cases for each of nine specialties examined by the ABR was prepared. Each comprised between one and seven images and case information and had been used in previous ABR Initial Certification examinations. Examining radiologists (n = 119) at the ABR oral Initial Certification examinations reviewed sets from one or more specialties on both a 2 MP LED monitor and on the iPad 3 and rated the visibility of the salient image features for each case. The Wilcoxon signed rank test was performed to compare ratings. In addition, a thematic analysis of participants' opinions was undertaken. When all specialties were pooled, the iPad 3 ratings were significantly higher than the monitor ratings (p = 0.0217). The breast, gastrointestinal, genitourinary, and nuclear medicine specialties also returned significantly higher ratings for the visibility of relevant image features for the iPad 3. Monitor ratings were significantly higher for the vascular and interventional specialty, although no images were rated unacceptably poor on the iPad in this specialty. | The relevant image features were rated more visible on the iPad 3 than on the monitors overall. The iPad 3 was well accepted by a large majority of examiners and can be considered adequate for image display for examination in most or all specialties. | closed_qa |
Does appendiceal diameter change with age? | The purposes of this study were to determine whether age-related changes in appendiceal diameter identified on CT and pathology are apparent on sonography and to assess the relationship between normal appendiceal diameter and patient-specific factors. Ultrasound examinations from 388 unique pediatric patients with normal appendixes, evenly distributed by age, were reviewed. Appendiceal diameter and wall thickness were correlated with patient age, sex, height, weight, and presence of enlarged lymph nodes. Mean (± SD) anteroposterior and transverse appendiceal diameters were 4.4 ± 0.9 and 5.1 ± 1.0 mm, respectively. Appendiceal diameter was normally distributed across the population but was not significantly associated with age. Centers for Disease Control and Prevention (CDC) weight percentile for age was the only statistically significant patient-specific predictor of transverse diameter (p = 0.001) and approached significance for anteroposterior diameter (p = 0.051). The presence of enlarged lymph nodes was a significant predictor of anteroposterior diameter (p = 0.029) and approached significance for transverse diameter (p = 0.07). Wall thickness was normally distributed across the population and was significantly associated with age (p = 0.011; effect size, -0.05 mm/y). | Appendiceal diameter measured on ultrasound is normally distributed in children and does not depend on age. Age-dependent diagnostic cutoffs for normal sonographic diameter are thus not needed. There is, however, a relationship between age and appendiceal wall thickness, suggesting the need for age-dependent diagnostic values if this criterion is to be used to diagnose appendicitis. Although the CDC weight percentile for age and the presence of enlarged lymph nodes affect appendiceal diameter on ultrasound, these effects are small and of doubtful clinical significance. | closed_qa |
Does integrative medicine enhance balance in aging adults? | Postural balance worsens, and fall risk potentially increases, among older adults living with neurological diseases, especially Parkinson's disease (PD). Since conventional therapies such as levodopa or deep brain stimulation may fail to improve balance or may even worsen it, interest is growing in evaluating alternative PD therapies. The purpose of the current study was to assess improvement in postural balance in PD patients following electroacupuncture (EA) as an alternative therapy. 15 aging adults (71.2 ± 6.3 years) with idiopathic PD and 44 healthy age-matched participants (74.6 ± 6.5 years) were recruited. The PD participants were randomly assigned (at a ratio of 2:1) to an intervention (n = 10) or to a control group (n = 5). The intervention group received a 30-min EA treatment on a weekly basis for 3 weeks, while the control group received a sham treatment. Outcomes were assessed at baseline and after the final therapy. Measurements included balance assessment, specifically the ratio of medial-lateral (ML) center-of-gravity (COG) sway to anterior-posterior (AP) sway (COGML/AP) and ankle/hip sway during eyes-open, eyes-closed, and eyes-open dual-task trials, the Unified Parkinson's Disease Rating Scale (UPDRS), as well as quality of life, concerns about falling, and pain questionnaires. No difference was observed for the assessed parameters between the intervention and the control group at baseline. After treatment, an improvement in balance performance was observed in the intervention group. Compared with the healthy population, PD patients prior to treatment had larger COGML/AP sway with more dependency on upper-body movements for maintaining balance. Following EA therapy, COGML/AP sway was reduced by 31% and ankle/hip sway increased by 46% in the different conditions (p = 0.02 for the dual-task condition). The clinical rating revealed an overall improvement (p<0.01) in mentation, behavior, and mood (UPDRS part I, 49%), activities of daily living (UPDRS part II, 46%), and motor examination (UPDRS part III, 40%). There was a significant reduction (p<0.02) in the specific items regarding UPDRS fall status (67%) and rigidity (48%). Changes were small and nonsignificant in the controls (p>0.29). | This pilot study demonstrates improvement in rigidity and balance following EA. These preliminary results suggest EA could be a promising alternative treatment for balance disturbance in PD. | closed_qa |
Can preoperative scoring systems be applied to Asian hip fracture populations? | Hip fractures in the elderly are a major cause of morbidity and mortality. Determining which patients will benefit from hip fracture surgery is crucial to reducing mortality and morbidity. Our objectives were: 1) to define mortality rates during the index admission and at 1 month and 1 year in all hip fracture patients, and 2) to apply the Nottingham Hip Fracture Score (NHFS) to determine its validity in an Asian population. This is a prospective cohort study of 212 hip fracture patients aged above 60 years, recruited from September 2009 to April 2010 and followed for 1 year. Sociodemographic, prefracture comorbidity and functional status data were collected on admission and at intervals after discharge. The main outcome measures were mortality during the index admission and at 1 month and 12 months after treatment. In our study, the overall mortality at 1 month and 1 year after surgery was 7.3% and 14.6%, respectively. Surgically treated hip fracture patients had a lower odds ratio (OR) for mortality as compared to conservatively treated ones. The OR was 0.17 during the index admission, 0.17 at 1 month, and 0.18 at 12 months after discharge. These were statistically significant. Adjustments for age, gender, and duration to surgery were taken into account. The NHFS was found to be a good predictor of 1-month mortality after surgery. | Surgically treated hip fracture patients have a lower OR for mortality than conservatively managed ones even up to 1 year. The NHFS has been shown to predict 1-month mortality accurately for surgically treated hip fracture patients, even for our Asian population. It can be used as a tool for clinicians at the individual patient level to communicate risk with patients and help plan care for fracture patients. | closed_qa |
Is feeling extraneous from one's own body a core vulnerability feature in eating disorders? | To identify core vulnerability features capable of discriminating subjects who are more prone to develop eating disorders. A nonclinical group composed of 253 university students was studied by means of the Identity and Eating Disorders questionnaire (IDEA), exploring abnormal attitudes toward one's own body and difficulties in the definition of one's own identity, the Body Uneasiness Test (BUT) and different self-reported questionnaires evaluating the specific and general psychopathology of eating disorders. The results were compared with those of a clinical eating disorder group. In the student sample, a group composed of 35 subjects with abnormal eating patterns and a group (218 subjects) without such features were identified. The IDEA total and subscale scores were found to be significantly higher in subjects with abnormal eating patterns than in subjects without them (all p<0.001). Positive correlations between the IDEA total and subscale scores and the BUT global score were observed in both groups (all p<0.01). The comparison of the scores on the IDEA between the clinical group (patients with full-blown eating disorders) and the subjects with abnormal over-threshold eating patterns yielded a significant difference in the 'feeling extraneous from one's own body' subscale of the IDEA. | The IDEA proved to be a valid instrument for identifying vulnerability to eating disorders in subjects with abnormal eating patterns in the general population and for recognizing significant discomfort related to the body. Feeling extraneous from one's own body is the experience that discriminates most between clinical and nonclinical subjects. | closed_qa |
Does a medial retraction blade transmit direct pressure to pharyngeal/esophageal wall during anterior cervical surgery? | A prospective study of 25 patients who underwent anterior cervical surgery. To assess retraction pressure and the exposure of the pharyngeal/esophageal (P/E) wall to the medial retractor blade, to clarify whether medial retraction causes direct pressure transmission to the P/E wall. Retraction pressure on P/E walls has been used to explain the relation between the retraction pressure and dysphagia or the efficacies of new retractor blades. However, it is doubtful whether the measured pressure represents real retraction pressure on the P/E wall because exposure of the P/E wall in the surgical field could be reduced by the shielding effect of the thyroid cartilage. Epi- and endoesophageal pressures were serially measured using online pressure transducers 15 minutes before retraction, immediately after retraction, and 30 minutes after retraction. To measure the extent of P/E wall exposure to the pressure transducer, we used the posterior border of the thyroid cartilage as a landmark. An intraoperative radiograph was used to mark the position of the posterior border of the thyroid cartilage. We verified the marked location on the retractors by measuring the distance from the distal retractor tip. The mean epiesophageal pressure significantly increased after retraction (0 mmHg before retraction; 88.7 ± 19.6 mmHg immediately after; 81.9 ± 15.3 mmHg at 30 minutes). The mean endoesophageal pressure changed minimally after retraction (9.0 ± 6.6 mmHg before retraction; 15.7 ± 13.8 mmHg immediately after; 17.0 ± 14.3 mmHg at 30 minutes). The mean location of the posterior border of the thyroid cartilage was 7.3 ± 3.5 mm on the retractor blade from the tip, which means epiesophageal pressure was measured against the posterior border of the thyroid cartilage, not against the P/E wall. | We suggest that a medial retraction blade does not transmit direct pressure to the P/E wall, owing to minimal wall exposure and the intervening thyroid cartilage. Our result should be considered when measuring retraction pressure during anterior cervical surgery or designing novel retractor systems. | closed_qa |
Is organizational progress in the EFQM model related to employee satisfaction? | To determine whether there is greater employee satisfaction in organisations that have made more progress in implementation of the European Foundation for Quality Management (EFQM) model. A series of cross-sectional studies (one for each assessment cycle) comparing staff satisfaction survey results between groups of healthcare organisations by degree of implementation of the EFQM model (assessed in terms of external recognition of management quality in each organisation). 30 healthcare organisations including hospitals, primary care and mental health providers in Osakidetza, the Basque public health service. Employees of 30 Osakidetza organisations. Progress in implementation of EFQM model. Scores in 9 dimensions of employee satisfaction from questionnaires administered in healthcare organisations in 4 assessment cycles between 2001 and 2010. Comparing satisfaction results in organisations granted Gold or Silver Q Awards and those without this type of external recognition, we found statistically significant differences in the dimensions of training and internal communication. Then, comparing recipients of Gold Q Awards with those with no Q Certification, differences in leadership style and in policy and strategy also emerged as significant. | Progress of healthcare organisations in the implementation of the EFQM Excellence Model is associated with increases in their employee satisfaction in dimensions that can be managed at the level of each organisation, while dimensions in which no statistically significant differences were found represent common organisational elements with little scope for self-management. | closed_qa |
Does medication abuse in patients with chronic migraine influence the effectiveness of preventive treatment with topiramate? | Patients with chronic migraine (CM) and medication abuse are difficult to treat, and have a greater tendency towards chronification and a poorer quality of life than those with other types of headache. AIM: To evaluate whether the presence of medication abuse lowers the effectiveness of topiramate. A series of patients with CM were grouped according to whether they met abuse criteria or not. They were advised to stop taking the drug that they were abusing. Acute treatment was adjusted to their attacks, and preventive treatment with topiramate was established from the beginning. The number of days with headache and intense migraine in the previous month and at four months of treatment was evaluated. In all, 262 patients with CM criteria were selected and 167 (63.7%) of them fulfilled abuse criteria. In both groups there was a significant reduction in the number of days with headache/month and number of migraine attacks/month at the fourth month of treatment with topiramate. The percentage of reduction in the number of days with headache/month in CM without abuse was 59.3 ± 36.1%, and with abuse, 48.7 ± 41.7% (p = 0.0574). The percentage of reduction in the number of days with intense migraine/month in CM without abuse was 61.2%, and with abuse, 50% (p = 0.0224). Response rate according to the number of days with headache/month in CM without abuse was 69%, and with abuse, 57%. Response rate according to the number of intense migraines/month in CM without abuse was 76.8%, and in CM with abuse, 61% (p = 0.0097). | Topiramate was effective in patients with CM with and without medication abuse, although effectiveness is lower in the latter case. | closed_qa |
Reliability of proliferation assessment by Ki-67 expression in neuroendocrine neoplasms: eyeballing or image analysis? | The latest WHO classification for neuroendocrine neoplasms (NEN) of the gastrointestinal tract defines grade according to Ki-67 and mitotic indices. Some have questioned the reproducibility and thus the reliability of Ki-67 assessment. We therefore investigated the accuracy of this proliferation marker in NEN. The Ki-67 index of tumor specimens of NEN (n = 73) was assessed by two pathologists as in routine practice with eyeballing and twice by image analysis using ImageJ freeware at different magnifications. Results were correlated with overall survival. The intraclass correlation coefficient (ICC) between pathologists was 0.88. The ICC for the measurements using image analysis was 0.85. The ICC between all four measurements (pathologists and ImageJ) was 0.80. If the Ki-67 index was translated to grade as prescribed by the current WHO classification (<3% = grade 1, 3-20% = grade 2, >20% = grade 3; a small grading sketch follows the table), kappa was between 0.61 and 0.75. Grades based on pathologist scoring were often (16-29%) higher than grades assigned by image analysis (p<0.001). Grade was significantly correlated with survival (p<0.0001) irrespective of the way Ki-67 was assessed. | Assessment of the Ki-67 index by eyeballing correlates remarkably well with the Ki-67 index as calculated by image analysis and is therefore an accurate parameter. Moreover, it is significantly related to survival irrespective of the method used. Yet if the Ki-67 index is translated to grade, the grade should be interpreted with caution due to values around threshold levels. | closed_qa |
The frontal branch of the facial nerve: can we define a safety zone? | The temporal branch of the facial nerve, a particularly important branch in facial expression, is commonly exposed to surgical trauma. From the clinical point of view, the frontal branch is the most important branch of the temporal branch. However, it is not clearly defined in the international nomenclature. The objective of this study was to clearly identify this branch, to map the areas crossed by this branch, and thereby to define statistically a zone of safety within the fronto-temporal region. We used 12 fresh cadavers to perform 24 facial nerve dissections. After the identification of the facial nerve, the branches of the temporofacial trunk were identified, dissected and followed to their penetration points. We measured the relationship of the frontal branch with the zygomatic arch, temporal vessels and lateral border of the orbit. We conducted a statistical study to assess the risk of injury of this branch within the temporal region. We observed considerable variability in the distribution of this branch in the temporal region. We defined three zones of decreasing safety at the level of three landmarks of interest: at the level of the inferior part of the zygomatic arch, we estimated an elevated risk of nerve injury (>85%) from 22.6 to 26.06 mm in front of the tragus; at the level of the superior part of the zygomatic arch, we estimated an elevated risk of nerve injury (>85%) from 27.46 to 30.43 mm in front of the tragus; at the level of the lateral border of the orbit, we estimated an elevated risk of nerve injury (>85%) from 16.20 to 19.17 mm behind this landmark. | There exists no real area of anatomical safety in the temporal region. It seems possible, however, to define areas of relative safety that would be of great help to the surgeon or the morphologist wishing to approach pathologies of this region. | closed_qa |
Rate and timing of spontaneous resolution in a vitreomacular traction group: should the role of watchful waiting be re-evaluated as an alternative to ocriplasmin therapy? | The incidence of spontaneous resolution of vitreomacular traction (VMT) is low in studies of Ocriplasmin that have had a limited follow-up. Previous studies did not look for morphological parameters in the natural history using spectral-domain optical coherence tomography (SD-OCT) imaging. The purpose of this study was to investigate how often and when spontaneous VMT resolution occurs in candidates for Ocriplasmin therapy. The study is a retrospective chart review of patients with a high chance of benefiting from an Ocriplasmin injection, without epiretinal membrane or vitreomacular adhesion of 1500 µm or more on SD-OCT. Main outcome measures were the frequency of complete VMT resolution and the best corrected visual acuity seen in the natural history. Out of the 46 patients that were included after screening 889 SD-OCT images, 20 were found to exhibit spontaneous resolution during the follow-up period (median: 594 days, 95% CI 567 to 719 days), the majority after 6-12 months of observation (95% CI 266 to 617 days). The group with spontaneous VMT resolution showed a mean improvement of one line in best corrected visual acuity, although it included a few patients who lost vision through macular hole formation. In the absence of resolution, patients lost on average one Early Treatment Diabetic Retinopathy Study (ETDRS) letter per year. Younger age was found to increase the chance of spontaneous resolution. | A shorter follow-up might underestimate the incidence of spontaneous VMT resolution as well as the functional outcome of watchful waiting. The likelihood of resolution does not seem to decrease after 12 months. | closed_qa |
Do environmental factors modify the genetic risk of prostate cancer? | Many SNPs influence prostate cancer risk. To what extent genetic risk can be reduced by environmental factors is unknown. We evaluated effect modification by environmental factors of the association between susceptibility SNPs and prostate cancer in 1,230 incident prostate cancer cases and 1,361 controls, all white and of similar age, nested in the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Trial. Genetic risk scores were calculated as the number of risk alleles for 20 validated SNPs. We estimated the association between higher genetic risk (≥12 SNPs) and prostate cancer within environmental factor strata and tested for interaction. Men with ≥12 risk alleles had 1.98, 2.04, and 1.91 times the odds of total, advanced, and nonadvanced prostate cancer, respectively. These associations were attenuated with the use of selenium supplements, aspirin, ibuprofen, and higher vegetable intake. For selenium, the attenuation was most striking for advanced prostate cancer: compared with <12 alleles and no selenium, the OR for ≥12 alleles was 2.06 [95% confidence interval (CI), 1.67-2.55] in nonusers and 0.99 (0.38-2.58) in users (Pinteraction = 0.031). Aspirin had the most marked attenuation for nonadvanced prostate cancer: compared with <12 alleles and nonusers, the OR for ≥12 alleles was 2.25 (1.69-3.00) in nonusers and 1.70 (1.25-2.32) in users (Pinteraction = 0.009). This pattern was similar for ibuprofen (Pinteraction = 0.023) and vegetables (Pinteraction = 0.010). | This study suggests that selenium supplements may reduce genetic risk of advanced prostate cancer, whereas aspirin, ibuprofen, and vegetables may reduce genetic risk of nonadvanced prostate cancer. | closed_qa |
Central venous oxygen saturation: a potential new marker for circulatory stress in haemodialysis patients? | Haemodialysis causes recurrent haemodynamic stress with subsequent ischaemic end-organ dysfunction. As dialysis prescriptions/schedules can be modified to lessen this circulatory stress, an easily applicable test to allow targeted interventions in vulnerable patients is urgently required. Intra-dialytic central venous oxygen saturation (ScvO2) and clinical markers (including ultrafiltration, blood pressure) were measured in 18 prevalent haemodialysis patients. Pre-dialysis ScvO2 was 63.5 ± 13% and fell significantly to 56.4 ± 8% at end dialysis (p = 0.046). Ultrafiltration volume, a key driver of dialysis-induced myocardial ischaemia, inversely correlated with ScvO2 (r = -0.680, p = 0.015). | This initial study demonstrates that ScvO2 sampling is practical and has potential clinical utility as an indicator of circulatory stress during dialysis. | closed_qa |
Is local hypoperfusion the reason for transient neurological deficits after STA-MCA bypass for moyamoya disease? | Hyperperfusion is believed to be the cause of transient neurological events (TNEs) in patients with moyamoya disease (MMD) who have undergone an extracranial-to-intracranial (EC-IC) bypass between the superficial temporal artery (STA) and the middle cerebral artery (MCA). The objective of this study was to evaluate this possibility by analyzing cerebral blood flow (CBF) data obtained with thermal diffusion probes used at the authors' center. The authors examined postoperative cerebral perfusion in 31 patients with MMD who underwent a direct EC-IC STA-MCA bypass. A Hemedex Q500 flow probe was placed in the frontal lobe adjacent to the bypass and connected to a Bowman cerebral perfusion monitor, and CBF data were statistically analyzed using JMP 8.0.2 software. Seven patients experienced a TNE after surgery in the left hemisphere (that is, after left-sided surgery), manifesting as dysphasia approximately 24 hours postoperatively that had improved by 48 hours. No TNEs were observed after right-sided surgeries. Operative and postoperative CBFs in the left side with the TNE were compared with those in the left side with no TNE and on the right side. A detailed analysis of 64,980 minute-by-minute flow observations showed that the initial postbypass CBF was higher on the left side where the TNEs occurred. This CBF increase was followed by a widely fluctuating pattern and a statistically significant and sharp drop in perfusion (p<0.001, mean difference of CBF between groups, paired t-test) associated with a TNE not observed in the other 2 groups. | On the basis of the authors' initial observations, an early-onset altered pattern of CBF was identified. These findings suggest local hypoperfusion as the cause of the TNEs. This hypoperfusion may originate from competing blood flows resulting from impaired cerebral autoregulation and a fluctuating flow in cerebral microcirculation. | closed_qa |
Translating pharmacological findings from hypothyroid rodents to euthyroid humans: is there a functional role of endogenous 3,5-T2? | During the last two decades, it has become obvious that 3,5-diiodothyronine (3,5-T2), a well-known endogenous metabolite of the thyroid hormones thyroxine (T4) and triiodothyronine (T3), not only represents a simple degradation intermediate of the former but also exhibits specific metabolic activities. Administration of 3,5-T2 to hypothyroid rodents rapidly stimulated their basal metabolic rate, prevented high-fat diet-induced obesity as well as steatosis, and increased oxidation of long-chain fatty acids. The aim of the present study was to analyze associations between circulating 3,5-T2 in human serum and different epidemiological parameters, including age, sex, and smoking, as well as measures of anthropometry, glucose, and lipid metabolism. 3,5-T2 concentrations were measured by a recently developed immunoassay in sera of 761 euthyroid participants of the population-based Study of Health in Pomerania. Subsequently, analysis of variance and multivariate linear regression analysis were performed. Serum 3,5-T2 concentrations exhibited a right-skewed distribution, resulting in a median serum concentration of 0.24 nM (1st quartile: 0.20 nM; 3rd quartile: 0.37 nM). Significant associations between 3,5-T2 and serum fasting glucose, thyrotropin (TSH), as well as leptin concentrations were detected (p<0.05). Interestingly, the association with leptin concentrations seemed to be mediated by TSH. Age, sex, smoking, and blood lipid profile parameters did not show significant associations with circulating 3,5-T2. | Our findings from a healthy euthyroid population may point toward a physiological link between circulating 3,5-T2 and glucose metabolism. | closed_qa |
Does rectum and bladder dose vary during the course of image-guided radiotherapy in the postprostatectomy setting? | To assess the variations in actual doses delivered to the rectum and bladder in the course of postprostatectomy radiotherapy using kilovoltage cone-beam computed tomography datasets acquired during image-guided radiotherapy. Twenty consecutive patients treated with intensity-modulated or intensity-modulated arc therapy to the prostate bed were retrospectively evaluated. Both the planning tomography and kilovoltage cone-beam computed tomography were acquired with an empty rectum and a half-full bladder. Target localization was performed on the basis of soft tissue matching using cone-beam computed tomography scans before each treatment fraction. A total of 16 cone-beam computed tomography scans per patient (acquired at the first 5 fractions and twice weekly thereafter) were used for the assessments. The bladder and rectum were re-contoured offline on each cone-beam computed tomography scan by a single physician, and the delivered doses were recalculated. The variations in certain dose-volume parameters for the rectum and bladder (BD2cc, RD2cc, V40%, V50%, V60%, V65%) were analyzed using the paired t test. Most of the dose-volume variations for the rectum and bladder were significantly higher than predicted (P<0.05) for the 320 kilovoltage cone-beam computed tomography sets, except for the doses received by 2 cc of the bladder and V50 and V60 of the rectum. The dose-volume parameters of the bladder did not meet our criteria of V65 ≤25% and V40 ≤50% in 10% and 20% of the patients, respectively. None of the dose-volume histograms showed rectal V65 ≥17%; however, the rectal V40 ≤35% dose constraint was not met in 11 patients. For all patients, the ANOVA test revealed no significant difference between the variations. | Actual doses delivered during treatment were found to be higher than predicted, but the majority of calculated bladder and rectal doses remained within the limits of our plan acceptance criteria. Interfraction variability of the rectum and bladder is a major concern in the postprostatectomy radiotherapy setting, even when patients are instructed about rectal and bladder preparation before the radiotherapy course. Image guidance with cone-beam computed tomography at each treatment fraction may offer a viable tool to account for interfraction variations of the rectum and bladder throughout the treatment course. | closed_qa |
Blood alcohol concentration in intoxicated patients seen in the emergency department: does it influence discharge decisions? | The purpose of this study was to investigate whether blood alcohol concentration (BAC) measurement was routinely requested in emergency departments and whether the observation period in the emergency department allowed sufficient time for alcohol elimination before the patient was discharged. A retrospective review of medical records of all emergency alcohol-related admissions over a 12-month period from January 2012, in patients older than 18 years, was conducted. We estimated BAC at discharge for each patient by using the following formula: [BAC at admission − (length of stay in hours × 15)], i.e. assuming a constant elimination rate of 15 mg/100 ml per hour (a worked example follows the table). Then, we focused on patients discharged from the emergency department with an estimated BAC greater than 50 mg/100 ml because of the risk of subsequent legal proceedings, as this is the legal limit for driving in France. A total of 907 patients admitted for acute alcohol intoxication (F10.0) were included, of whom 592 were male. Women were more likely to be admitted at night. The mean length of stay was 18.7 hours. BAC was measured in 893 patients. Patients aged 35-49 years had the highest measured BAC. No repeat BAC was taken before the discharge decision. Three hundred thirteen patients were discharged with an estimated BAC above 50 mg/100 ml. | Emergency physicians routinely requested BAC at admission but did not request alcohol kinetics while the patient was under observation. The discharge decision was based on clinical judgment. Doctors who do not advise patients appropriately before discharge may be guilty of negligence in their duty of care. | closed_qa |
Do positive alcohol expectancies have a critical developmental period in pre-adolescents? | Positive outcome expectancies have been shown to predict initiation of alcohol use in children and to mediate and moderate the relationship between dispositional variables and drinking behavior. Negative outcome expectancies for alcohol appear to weaken as children progress to middle adolescence, but positive expectancies tend to increase during this time. Positive alcohol expectancies have been found to increase in children in third and fourth grades, indicating what some investigators have termed a possible critical period for the development of positive expectancies. In the present study, we assessed alcohol expectancies at baseline, 6, 12, and 18 months in 277 second- through sixth-grade students. Children completed the Alcohol Expectancy Questionnaire-Adolescent. Univariate analyses of covariance were conducted. There were significant main effects for grade on positive alcohol-expectancy change for Global Positive Transformations at 12 and 18 months, Social Behavior Enhancement or Impediment at 6 and 12 months, and Relaxation/Tension Reduction at 6 and 18 months, whereby a consistent pattern emerged in that lower grades did not differ from each other, but they differed significantly from the higher grades. | Data support a critical developmental period for positive alcohol expectancies, with the greatest change observed between third and fourth grade and between fourth and fifth grade, and only in those expectancies clearly describing positive outcomes (e.g., Relaxation/Tension Reduction) via positive or negative reinforcement versus those with either combined or ambiguous outcomes (e.g., Social Behavior Enhancement or Impediment). | closed_qa |
Substance use disorders among first- and second-generation immigrant adults in the United States: evidence of an immigrant paradox? | A growing number of studies have examined the "immigrant paradox" with respect to the use of licit and illicit substances in the United States. However, there remains a need for a comprehensive examination of the multigenerational and global links between immigration and substance use disorders among adults in the United States. The present study, using data from the National Epidemiologic Survey on Alcohol and Related Conditions, aimed to address these gaps by comparing the prevalence of substance use disorders among first-generation (n = 3,338) and second-generation (n = 2,515) immigrants with that among native-born American adults (n = 15,733) in the United States. We also examined the prevalence of substance use disorders among first-generation emigrants from Asia, Africa, Europe, and Latin America in contrast to second-generation and native-born Americans. The prevalence of substance use disorders was highest among native-born Americans, slightly lower among second-generation immigrants, and markedly lower among first-generation immigrants. Adjusted risk ratios were largest among individuals who immigrated during adolescence (ages 12-17 years) and adulthood (age 18 years or older). Results were consistent among emigrants from major world regions. | Consistent with a broad body of literature examining the links between the immigrant paradox and health outcomes, results suggest that nativity and age at arrival are significant factors related to substance use disorders among first- and second-generation immigrants in the United States. | closed_qa |
Does sexual self-concept ambiguity moderate relations among perceived peer norms for alcohol use, alcohol-dependence symptomatology, and HIV risk-taking behavior? | The current study examines the relation between peer descriptive norms for alcohol involvement and alcohol-dependence symptomatology and whether this relation differs as a function of sexual self-concept ambiguity (SSA). This study also examines the associations among peer descriptive norms for alcohol involvement, alcohol-dependence symptomatology, and lifetime HIV risk-taking behavior and how these relations are influenced by SSA. Women between ages 18 and 30 years (N = 351; M = 20.96, SD = 2.92) completed an online survey assessing sexual self-concept, peer descriptive norms, alcohol-dependence symptomatology, and HIV risk-taking behaviors. Structural equation modeling was used to test hypotheses of interest. There was a significant latent variable interaction between SSA and descriptive norms for peer alcohol use. There was a stronger positive relationship between peer descriptive norms for alcohol and alcohol-dependence symptomatology when SSA was higher compared with when SSA was lower. Both latent variables exhibited positive simple associations with alcohol-dependence symptoms. Peer descriptive norms for alcohol involvement directly and indirectly influenced HIV risk-taking behaviors, and the indirect influence was conditional based on SSA. | The current findings illustrate complex, nuanced associations between perceived norms, identity-related self-concepts, and risky health behaviors from various domains. Future intervention efforts may be warranted to address both problem alcohol use and HIV-risk engagement among individuals with greater sexual self-concept ambiguity. | closed_qa |
"Do you think you have mental health problems? | Mental disorders are common among people presenting for the treatment of substance-related problems, and guidelines recommend systematic screening for these conditions. However, even short screens impose a considerable burden on clients and staff. In this study, we evaluated the performance of a single screening question (SSQ) relative to longer instruments. We analyzed a sample of 544 adults recruited from outpatient substance use disorder treatment centers in Ontario, Canada. Clients were asked a simple SSQ, followed by the Kessler Screening Scale for Psychological Distress (K6) and the Global Appraisal of Individual Needs Short Screener (GAIN-SS). Caseness was ascertained using the Structured Clinical Interview for DSMIV (SCID). We measured and compared performance using receiver operating characteristic (ROC) curve analysis and explored client characteristics associated with screen performance using ROC regression. Last, we used logistic regression to test whether SSQs can be usefully combined with other measures. The prevalence of past-month disorder was 71%. The performance of the SSQ (AUC = 0.77, 95% CI [0.72, 0.83]) was similar to, and not significantly different from, those of the K6 (AUC = 0.78) and the GAIN (AUC = 0.79). The K6 and the GAIN performed better than the SSQ among people with a psychotic disorder. The addition of the SSQ slightly improved the performance of the other measures. | SSQs can offer screening performance comparable to that of longer instruments. Reasons for caution include the small number of possible thresholds, lower accuracy than other measures in identifying psychotic disorder, and possible differential functioning in different populations. Performance of all three screens was also moderate; when prevalence is high and resources are available, offering full assessments may be preferable to screening. Nevertheless, SSQs offer an intriguing area for further evaluation. | closed_qa |
Evaluation of an after-hours call center: are pediatric patients appropriately referred to the emergency department? | There is concern that after-hours nurse telephone triage systems are overwhelming the emergency department (ED) with nonemergent pediatric referrals. This study critically reviewed a nonpediatric hospital-based call center with the aim of identifying the algorithms responsible for the majority of nonessential referrals. This is a retrospective observational study performed at a tertiary medical care facility over 1 year. Telephone triage forms of children and adolescents younger than 18 years, exclusively referred by triage nurses using the Barton Schmitt protocols, were reviewed, and their ED course was evaluated by consulting the electronic medical record. "Essential" referrals to the ED were classified as presentations warranting immediate evaluation or referrals requiring "essential interventions" such as serum laboratory tests, imaging, complex procedures, intravenous medications, subspecialty consultation, or admission. A total of 220 patients were included in this study. Of these, 73 (33%) were classified as nonessential, whereas 147 (67%) were classified as essential. Nonessential patients were significantly younger compared with essential referrals (P<0.05). They also had lower triage scores (P = 0.026) and shorter ED stays (P<0.0001). The algorithms for "fever-3 months or older" (12.3%), "vomiting without diarrhea" (8.2%), "trauma-head" (8.2%), "headache" (6.8%), and "sore throat" (5.5%) were determined most likely to result in a nonessential referral. | Our study found that a third of the pediatric ED referrals made by the nurse telephone triage system in question were nonessential. We recommend review of the algorithms identified above to reduce strain on local ED resources. | closed_qa |
Does cruciate retention primary total knee arthroplasty affect proprioception, strength and clinical outcome? | It remains unclear what the contribution of the PCL is in total knee arthroplasty (TKA). The goal of this study was to investigate the influence of the PCL in TKA in relationship to clinical outcome, strength and proprioception. Two arthroplasty designs were compared: a posterior cruciate-substituting (PS) and a posterior cruciate-retaining (CR) TKA. A retrospective analysis was performed of 27 CR and 18 PS implants with a minimum of 1 year in vivo. Both groups were compared in terms of clinical outcome (range of motion, visual analogue scale for pain, Hospital for Special Surgery Knee Scoring system, Lysholm score and Knee Injury and Osteoarthritis Outcome Score), strength (Biodex System 3 Dynamometer®) and proprioception (balance and postural control using the Balance Master system®). Each design was also compared to the non-operated contralateral side in terms of strength and proprioception. There were no significant differences between both designs in terms of clinical outcome and strength. In terms of proprioception, only the rhythmic weight test at slow and moderate speed shifting from left to right was significant in favour of the CR design. None of the unilateral stance tests showed any significant difference between both designs. There was no difference in terms of strength and proprioception between the operated side and the non-operated side. | Retaining the PCL in TKA does not result in an improved performance in terms of clinical outcome and proprioception and does not show any difference in muscle strength. | closed_qa |
Is isolated insert exchange a valuable choice for polyethylene wear in metal-backed unicompartmental knee arthroplasty? | The aim of this study was to evaluate the clinical outcome and survival rate after isolated liner exchange for polyethylene (PE) wear in well-fixed metal-backed fixed-bearing unicompartmental knee arthroplasty (UKA). Twenty medial UKAs in 19 patients [mean age 68.7 years ± 8.7 (range 48.5-81.5 years)] operated on for a direct PE liner exchange after isolated PE wear between 1996 and 2010 in two institutions were retrospectively reviewed. The mean delay between the index operation and revision was 8.2 years ± 2.6 (range 4.8-12.8 years). A four-level satisfaction questionnaire was used, and clinical outcomes were assessed using Knee Society scores (KSS) and range of motion (ROM) evaluation. Radiological evaluation analysed the position of the implants and progression of the disease. Survival rate of the implants was evaluated using Kaplan-Meier analysis with two different end-points. At the last follow-up [mean 6.8 years ± 5.2 (range 1.1-15.9 years)], 15 patients (79 %) were enthusiastic or satisfied. KSS improved from 73.4 to 86.4 points (p = 0.01) and function from 58.9 to 89.2 points (p < 0.001). ROM at last follow-up was 126.5° ± 10.3°. The survival rate at 12 years considering "revision for any reason" as the end-point was 71.3 ± 15.3 %, and the survival rate at 12 years considering "revision of UKA to TKA" as the end-point was 93.3 ± 6.4 %. | Isolated liner exchange for PE wear in well-fixed metal-backed fixed-bearing UKA represents a valuable treatment option in selective patients with durable improvement of clinical outcomes without compromising any future revision. | closed_qa |
Compression of human primary cementoblasts leads to apoptosis: A possible cause of dental root resorption? | One of the most common side effects of orthodontic treatment is root resorption on the pressure side of tooth movement. This is usually repaired by cementoblasts, but 1-5 % of patients eventually experience a marked reduction in root length because no repair has occurred. The reason why cementoblasts should lose their repair function in such cases is not well understood. There is evidence from genome-wide expression analysis (Illumina HumanHT-12 v4 Expression BeadChip Kit; >30,000 genes) that apoptotic processes are upregulated after the compression of cementoblasts, which is particularly true of the pro-apoptotic gene AXUD1. Human primary cementoblasts (HPCBs) from two individuals were subjected to compressive loading at 30 g/cm² for 1/6/10 h. The cells were then evaluated for apoptosis by flow cytometry, for mRNA expression of putative genes (AXUD1, AXIN1, AXIN2) by quantitative PCR, and for involvement of c-Jun-N-terminal kinases (JNKs) in the regulation of AXUD1 via western blotting. In addition, platelet-derived growth factor receptor-β (PDGFRβ) was selectively inhibited by SU16f to analyze the effect of PDGFRβ-dependent signal transduction on AXUD1 and AXIN1 expression. The percentage of apoptotic HPCBs rose after only 6 h of compressive loading, and 18-20 % of cells were apoptotic after 10 h. Microarray data revealed significant upregulation of the pro-apoptotic gene AXUD1 after 6 h, and quantitative PCR showed significant AXUD1 upregulation after 6 and 10 h of compression. AXIN1 and AXIN2 expression in HPCBs was significantly increased after compressive loading. Our tests also revealed that PDGFRβ signaling inhibition by SU16f augmented the expression of AXIN1 and AXUD1 in HPCBs under compression. | Increased apoptosis of compressed HPCBs might help explain why cementoblasts, rather than invariably repairing all cases of root resorption, sometimes allow the original root length to shorten. The pathway hypothesized to lead to cementoblast apoptosis involves PDGF signaling, with this signal transduction's inhibition augmenting the expression of pro-apoptotic genes. Thus activating PDGF signaling may modify the signaling pathway for the apoptosis of cementoblasts, which would reveal a protective role of PDGF for these cells. Further studies are needed to develop strategies of treatment capable of minimizing root resorption. | closed_qa |
All-cause mortality and estimated renal function in type 2 diabetes mellitus outpatients: Is there a relationship with the equation used? | We investigated the relationship between serum creatinine (SCr) and estimated glomerular filtration rate (eGFR), evaluated by different formulae, and all-cause mortality (ACM) in type 2 diabetes mellitus (T2DM) outpatients. This observational cohort study considered 1365 T2DM outpatients, who had been followed up for a period of up to 11 years. eGFR was estimated using several equations. Seventy subjects (5.1%) died after a follow-up of 9.8 ± 3 years. Univariate analysis showed that diagnosis of nephropathy (odds ratio (OR): 2.554, 95% confidence interval (CI): 1.616-4.038, p<0.001) and microvascular complications (OR: 2.281, 95% CI: 1.449-3.593, p<0.001) were associated with ACM. Receiver operating characteristic (ROC) curves showed that the areas under the curve for ACM were similar using the different eGFR equations. eGFR values were predictors of ACM, and the hazard ratios (HRs) of the different equations for eGFR estimation were similar. | In our cohort of T2DM outpatients, different eGFR equations performed similarly in predicting ACM, whereas SCr did not. | closed_qa |
Color vision in ADHD: part 2--does attention influence color perception? | To investigate the impact of exogenous covert attention on chromatic (blue and red) and achromatic visual perception in adults with and without Attention Deficit Hyperactivity Disorder (ADHD). Exogenous covert attention, which is a transient, automatic, stimulus-driven form of attention, is a key mechanism for selecting relevant information in visual arrays. 30 adults diagnosed with ADHD and 30 healthy adults, matched on age and gender, performed a psychophysical task designed to measure the effects of exogenous covert attention on perceived color saturation (blue, red) and contrast sensitivity. The effects of exogenous covert attention on perceived blue and red saturation levels and contrast sensitivity were similar in both groups, with no differences between males and females. Specifically, exogenous covert attention enhanced the perception of blue saturation and contrast sensitivity, but it had no effect on the perception of red saturation. | The findings suggest that exogenous covert attention is intact in adults with ADHD and does not account for the observed impairments in the perception of chromatic (blue and red) saturation. | closed_qa |
Dental materials for primary dentition: are they suitable for occlusal restorations? | Eight specimens of each material were subjected to a two-body wear test, using a chewing simulator. The wear region of each material was examined under a profilometer, measuring the vertical loss (μm) and the volume loss (mm³) of the materials. The results showed significant differences in vertical loss and volume loss among the test materials (p<0.001). Amalgam had the highest wear resistance. Twinky Star (compomer) had the lowest vertical loss and volume loss. There was no significant difference in vertical loss among the compomers Dyract Extra, Dyract Flow and Dyract Posterior. Riva Self Cure (GIC) did not differ significantly from the compomers (except Twinky Star). Nor was a statistically significant difference found between Equia (GIC) or Ketac Molar (GIC) and Dyract Extra (compomer). RMGICs were found to have the lowest wear resistance. For the statistical analysis, the PASW 20.0 (SPSS Statistics, IBM, Chicago) package was used. Means and standard deviations were calculated with descriptive statistics and analyzed using one-way ANOVA. | Compomers and some GICs, which have moderate wear resistance, may be sufficient for occlusal restorations in the primary dentition. | closed_qa |
Thyroid fine needle aspiration biopsy: do nodule volume and cystic degeneration ratio affect specimen adequacy and cytological diagnosis time? | A fine needle aspiration biopsy (FNAB) of thyroid nodules - the least invasive and most accurate method used to investigate malignant lesions - may yield non-diagnostic specimens even under ultrasonographic guidance. To evaluate the effects of thyroid nodule volume and extent of cystic degeneration on both the non-diagnostic specimen ratio and the cytopathologist's definitive cytological diagnosis time. In this single-center study, FNAB was performed on 505 patients with single thyroid nodules greater than 10 mm. Nodule volume was calculated prior to FNAB and the cystic degeneration ratio was recorded. All biopsies were performed by a single radiologist who also prepared the specimen slides. Specimen adequacy and final diagnosis were determined in the pathology laboratory by a single, blinded cytopathologist based on the Bethesda system. Definitive cytological diagnosis time was recorded upon reaching a definitive diagnosis. The specimen adequacy ratio was 85.3%. The mean nodule volume of adequate specimens was larger than that of non-diagnostic samples (6.00 mL vs. 3.05 mL; P = 0.001). There was no correlation between nodule volume and the cytopathologist's definitive cytological diagnosis time (r = 0.042). Biopsy of predominantly solid nodules yielded better specimen adequacy ratios compared to predominantly cystic nodules (87.8% vs. 75.3%; P = 0.028). Definitive cytological diagnosis times were longer in predominantly cystic nodules compared to predominantly solid nodules (376 s vs. 294 s; P = 0.019). | Predominantly cystic nodules are likely to benefit from repeated nodular sampling until the specimen is declared adequate by an on-site cytopathologist. If a cytopathologist is not available, obtaining more specimens per nodule may achieve the desired adequacy ratios. | closed_qa
Surgical fires in laser laryngeal surgery: are we safe enough? | Laser surgery of the larynx and airway carries a high risk of operating room fire. Traditional methods of fire prevention have included use of "laser safe" tubes, inflation of a protective cuff with saline, and wet pledgets to protect the endotracheal tube from laser strikes. We tested a mechanical model of laser laryngeal surgery to evaluate the fire risk. Mechanical model. Laboratory. An intubation mannequin was positioned for suspension microlaryngoscopy. A Laser-Shield II cuffed endotracheal tube was placed through the larynx and the cuff inflated using saline. Wet pledgets covered the inflated cuff. A CO2 laser created an inadvertent cuff strike at varying oxygen concentrations. Risk reduction measures were implemented to discern any notable change in the outcome after fire. At 100% FiO2, an immediate fire with a sustained flame was created, and at 40% FiO2 a near-immediate sustained flame was created. At 29% FiO2, a small nonsustained flame was noted. At room air, no fire was created. There was no discernible difference in the severity of laryngeal damage after the fire occurred whether the tube was immediately pulled from the mannequin or saline was poured down the airway as a first response. | While "laser safe" tubes provide a layer of protection against fires, they are not fireproof. Inadvertent cuff perforation may result in fire formation in low-level oxygen-enriched environments. Placement of wet pledgets does not provide absolute protection. Endotracheal tube (ETT) cuffs should be placed distally, well away from an inadvertent laser strike, while maintaining the minimum supplemental oxygen necessary. | closed_qa
Can a community health worker and a trained traditional birth attendant work as a team to deliver child health interventions in rural Zambia? | Teaming is an accepted approach in health care settings but rarely practiced at the community level in developing countries. Save the Children trained and deployed teams of volunteer community health workers (CHWs) and trained traditional birth attendants (TBAs) to provide essential newborn and curative care for children aged 0-59 months in rural Zambia. This paper assessed whether CHWs and trained TBAs can work as teams to deliver interventions and ensure a continuum of care for all children under-five, including newborns. We trained CHW-TBA teams in teaming concepts and assessed their level of teaming prospectively every six months for two years. The overall score was a function of both teamwork and taskwork. We also assessed personal, community and service factors likely to influence the level of teaming. We created forty-seven teams of predominantly younger, male CHWs and older, female trained TBAs. After two years of deployment, twenty-one teams scored "high", twelve scored "low," and fourteen were inactive. Teamwork was high for mutual trust, team cohesion, comprehension of team goals and objectives, and communication, but not for decision making/planning. Taskwork was high for joint behavior change communication and outreach services with local health workers, but not for intra-team referral. Teams with members residing within one hour's walking distance were more likely to score high. | It is feasible for a CHW and a trained TBA to work as a team. This may be an approach to provide a continuum of care for children under-five including newborns. | closed_qa |
Is mix of care influenced by the provider environment? | In Australia, access to dental care has been available through several different pathways: (1) private practice; (2) public clinics; (3) Aboriginal Medical Services (AMS)-based clinics; and (4) until recently, the Chronic Disease Dental Scheme (CDDS). The aim of the present study was to compare the types of dental services most commonly delivered in the various clinical pathways, based on the hypothesis that disease-driven care should lead to similar mixes of dental care provided. Data from a series of previously published sources were used to identify and compare the most commonly performed dental procedures in the different pathways. A comparison was also made with the available international data (US). There was a marked difference between the service mixes provided through the four pathways. Patients obtaining dental care through the AMS-based and public pathways had more extractions and less restorative and preventive care compared with the private and CDDS pathways. Compared with the international data, dental service mixes in Australia were not as evenly distributed. The value of care provided through the private and CDDS pathways was two- to threefold higher than that of the AMS-based and public pathways. | The data indicate that the original hypothesis, that disease-driven care should lead to similar mixes of dental care, is not supported. | closed_qa
Is periodontitis a risk factor for cognitive impairment and dementia? | Dementia is a multi-etiologic syndrome characterized by multiple cognitive deficits; cognitive impairment, however, is not always accompanied by dementia. Cognitive impairment is associated with multiple non-modifiable risk factors but few modifiable factors. Epidemiologic studies have shown an association between periodontitis, a potentially modifiable risk factor, and cognitive impairment. The objective of this study is to determine whether clinical periodontitis is associated with the diagnosis of cognitive impairment/dementia after controlling for known risk factors, including age, sex, and education level. A case-control study was conducted in Granada, Spain, in two groups of dentate individuals aged >50 years: 1) cases with a firm diagnosis of mild cognitive impairment or dementia of any type or severity and 2) controls with no subjective memory loss complaints and a score >30 in the Phototest cognitive test (screening test for cognitive impairment). Periodontitis was evaluated by measuring tooth loss, plaque and bleeding indexes, probing depths, and clinical attachment loss (AL). The study included 409 dentate adults, 180 with cognitive impairment and 229 without. A moderate and statistically significant association was observed between AL and cognitive impairment after controlling for age, sex, education level, oral hygiene habits, and hyperlipidemia (P = 0.049). No significant association was found between tooth loss and cognitive impairment. | Periodontitis appears to be associated with cognitive impairment after controlling for confounders such as age, sex, and education level. | closed_qa
Can antiretroviral therapy modify the clinical course of HDV infection in HIV-positive patients? | Infection with hepatitis delta virus (HDV) affects approximately 6-14.5% of patients coinfected with HIV-1 and HBV, showing a more aggressive clinical course compared with an HIV-negative population. There is no universally approved treatment for chronic hepatitis D (CHD) in HIV-infected patients. Antiretroviral therapy (ART) containing tenofovir has recently been associated with HDV suppression. Our aim was to evaluate whether the outcome of CHD in HIV-infected patients can be favourably influenced by ART including reverse transcriptase inhibitors. The clinical course of four HBV/HDV/HIV-coinfected patients receiving ART was retrospectively examined. HDV RNA became undetectable in all patients after a variable period of ART, along with the disappearance of hepatitis B surface antigen in two of them and an increase in CD4(+) T-cell count. In all patients, the virological changes were associated with improved liver function tests and clinical features. | We suggest that ART regimens including drugs active against HBV could have beneficial effects on the clinical course of CHD in patients with HIV-1 by favouring immunological reconstitution. | closed_qa
Dental health among older Israeli adults: is this a reflection of a medical care model inadequately addressing oral health? | Israel's health-care system is considered as one of the most efficient worldwide. The purpose of the present study was to assess oral health outcomes, dental care use and respective social inequalities among the older segment of the Israeli population. Secondary analyses were conducted of recently available data from the Survey of Health, Ageing, and Retirement in Europe (SHARE Israel, wave 2), which specifically includes information on chewing ability, denture wearing and dental care use obtained from more than 2,400 Israeli people, 50+ years of age. Multivariate logistic regressions and concentration indices were used to analyse determinants of oral health and dental care use. Seventy per cent of respondents reported being able to bite/chew on hard foods and 49% of respondents reported wearing dentures. Forty-three per cent of respondents had visited a dentist within the past 12 months, with about half of all dental visits being made for solely nonpreventive reasons. Significant income-related inequalities were identified, with higher income being associated with greater dental care use (particularly preventive dental visits), better chewing ability and less denture wearing. | For the older segment of the Israeli population and compared with other countries, the findings of the present study suggest a relatively low level of chewing ability, a high extent of nonpreventive dental visiting, as well as considerable inequalities in oral health and care. It seems that the Israeli health-care system may be improved even further by more comprehensive inclusion of dental care into universal health coverage. | closed_qa |
Can the Perinatal Information System in Peru be used to measure the proportion of adverse birth outcomes attributable to maternal syphilis infection? | To describe the capacity of Peru's Perinatal Information System (Sistema Informático Perinatal, SIP) to provide estimates for monitoring the proportion of stillbirths and other adverse birth outcomes attributable to maternal syphilis. A descriptive study was conducted to assess the quality and completeness of SIP data from six Peruvian public hospitals that used the SIP continuously from 2000 - 2010 and had maternal syphilis prevalence of at least 0.5% during that period. In-depth interviews were conducted with Peruvian stakeholders about their experiences using the SIP. Information was found on 123 575 births from 2000 - 2010 and syphilis test results were available for 99 840 births. Among those 99 840 births, there were 1 075 maternal syphilis infections (1.1%) and 619 stillbirths (0.62%). Among women with syphilis infection in pregnancy, 1.7% had a stillbirth, compared to 0.6% of women without syphilis infection. Much of the information needed to estimate the proportion of stillbirths attributable to maternal syphilis was available in the SIP, with the exception of syphilis treatment information, which was not collected. However, SIP data collection is complex and time-consuming for clinicians. Data were unlinked across hospitals and not routinely used or quality-checked. Despite these limitations, the SIP data examined were complete and valid; in 98% of records, information on whether or not the infant was stillborn was the same in both the SIP and clinical charts. Nearly 89% of women had the same syphilis test result in clinical charts and the SIP. | The large number of syphilis infections reported in Peru's SIP and the ability to link maternal characteristics to newborn outcomes make the system potentially useful for monitoring the proportion of stillbirths attributable to congenital syphilis in Peru. To ensure good data quality and sustainability of Peru's SIP, data collection should be simplified and information should be continually quality-checked and used for the benefit of participating facilities. | closed_qa |
Quality indicators for prostate radiotherapy: are patients disadvantaged by receiving treatment in a 'generalist' centre? | The purpose of this retrospective review was to evaluate concordance with evidence-based quality indicator guidelines for prostate cancer patients treated radically in a 'generalist' (as distinct from 'sub-specialist') centre. We were concerned that the quality of treatment may be lower in a generalist centre. If so, the findings could have relevance for many radiotherapy departments that treat prostate cancer. Two hundred fifteen consecutive patients received external beam radiotherapy (EBRT) and/or brachytherapy between 1.10.11 and 30.9.12. Treatment was deemed to be in line with evidence-based guidelines if the dose was: (i) 73.8-81 Gy at 1.8-2.0 Gy/fraction for EBRT alone (eviQ guidelines); (ii) 40-50 Gy (EBRT) for EBRT plus high-dose rate (HDR) brachytherapy boost (National Comprehensive Cancer Network (NCCN) guidelines); and (iii) 145 Gy for low dose rate (LDR) I-125 monotherapy (NCCN). Additionally, EBRT beam energy should be ≥6 MV using three-dimensional conformal RT (3D-CRT) or intensity-modulated RT (IMRT), and high-risk patients should receive neo-adjuvant androgen-deprivation therapy (ADT) (eviQ/NCCN). Treatment of pelvic nodes was also assessed. One hundred four high-risk, 84 intermediate-risk and 27 low-risk patients (NCCN criteria) were managed by eight of nine radiation oncologists. Concordance with guideline doses was confirmed in: (i) 125 of 136 patients (92%) treated with EBRT alone; (ii) 32 of 34 patients (94%) treated with EBRT + HDR BRT boost; and (iii) 45 of 45 patients (100%) treated with LDR BRT alone. All EBRT patients were treated with ≥6 MV beams using 3D-CRT (78%) or IMRT (22%). 84%, 21% and 0% of high-risk, intermediate-risk and low-risk patients received ADT, respectively. Overall treatment modality choice (including ADT use and duration where assessable) was concordant with guidelines for 176/207 (85%) of patients. | The vast majority of patients were treated concordant with evidence-based guidelines suggesting that, within the limits of the selected criteria, prostate cancer patients are unlikely to be disadvantaged by receiving radiotherapy in this 'generalist' centre. | closed_qa |
Can hyperbaric oxygen be used to prevent deep infections in neuro-muscular scoliosis surgery? | The prevalence of postoperative wound infection in patients undergoing neuromuscular scoliosis surgery is significantly higher than that in patients undergoing other spinal surgery. Hyperbaric oxygen has been used as a supplement to treat postsurgical infections. Our aim was to determine, in a retrospective study, the beneficial effects of hyperbaric oxygen treatment in preventing postoperative deep infection in this specific group of patients. Forty-two neuromuscular scoliosis cases, operated on between 2006 and 2011, were retrospectively reviewed. Patients with scoliosis and/or kyphosis in addition to cerebral palsy or myelomeningocele, postoperative follow-up >1 year, and posterior-only surgery were the subjects of this study. Eighteen patients formed the hyperbaric oxygen prophylaxis (P-HBO) group and 24 the control group. The P-HBO group received 30 sessions of HBO and standard antibiotic prophylaxis postoperatively; the control group received standard antibiotic prophylaxis alone. In the P-HBO group of 18 patients, the etiology was cerebral palsy in 13 and myelomeningocele in 5 cases, with a mean age of 16.7 years (11-27 yrs). The average follow-up was 20.4 months (12-36 mo). The etiology in the control group was cerebral palsy in 17 and myelomeningocele in 7 cases. The average age was 15.3 years (8-32 yrs). The average follow-up was 38.7 months (18-66 mo). The overall incidence of infection in the whole study group was 11.9% (5/42). The infection rates in the P-HBO and control groups were 5.5% (1/18) and 16.6% (4/24), respectively. The use of HBO was found to significantly decrease the incidence of postoperative infections in neuromuscular scoliosis patients. | In this study we found that hyperbaric oxygen may reduce the rate of post-surgical deep infections in complex spine deformity in high-risk neuromuscular patients. | closed_qa
Does cochleostomy location influence electrode trajectory and intracochlear trauma? | Trauma to intracochlear structures during cochlear implant insertion is associated with poorer hearing outcomes. One way surgeons can influence insertion trauma is by choosing the surgical approach. We seek to compare cochleostomy (CO), peri-round window (PRW), and round window (RW) approaches using a fresh frozen temporal bone model. Experiments using fresh frozen temporal bones. Cochlear implant insertions using the three aforementioned approaches were performed on 15 fresh frozen human temporal bones using a Cochlear 422 electrode. Insertions were evaluated by examining fluoroscopic recordings of histologic sections. Five cochlear implant insertions were performed using each of the three aforementioned approaches. Fluoroscopic examination revealed that none of the CO or PRW insertions contacted the modiolus during insertion, whereas three of five RW insertions did. RW insertions were less linear during insertion when compared to CO and PRW insertions (P < .05). CO insertions had significantly larger angular depth of insertion (487°) when compared to PRW (413°) and RW (375°) (P < .05). Histologic examination revealed one RW insertion resulted in osseous spiral lamina fracture, whereas the remaining insertions had no evidence of trauma. In the damaged specimen, the inserted electrode was observed to rest in the scala vestibuli, whereas the remaining electrodes rested in the scala tympani. | Due to variability in RW anatomy, a CO or PRW window surgical approach appears to minimize the risk for insertion trauma. However, with favorable anatomy, a Cochlear 422 electrode can be inserted with any of the three approaches. | closed_qa |
Does mutual compensation of the cognitive effects induced by pain and opioids exist? | Studies have demonstrated that both pain and opioids have actions on the central nervous system that may interfere with cognitive function, but their effects have mainly been analysed separately and not as an integrated process. The objective of this study is to test two hypotheses: (1) the analgesic effect of opioids improves cognitive function by decreasing pain, and (2) pain antagonizes cognitive effects of opioids. Randomized, placebo-controlled, crossover study. Three experiments were conducted with 22 healthy males. Sustained attention, memory and motor function/attention/mental flexibility were evaluated by continuous reaction time (CRT), verbal fluency test (VFT) and trail making test-B (TMT-B), respectively. In the 1st experiment, the cognitive effects of experimental tonic pain of mild and moderate intensities produced by a computer-controlled pneumatic tourniquet cuff were assessed; in the 2nd, the effects of saline solution and remifentanil were assessed in the absence of pain; and in the 3rd experiment, the cognitive effects of moderate pain intensity relieved by remifentanil infusion were assessed followed by increasing pain to moderate intensity during a constant remifentanil infusion. The first two experiments demonstrated that pain and remifentanil impaired CRT. In the 3rd experiment, remifentanil infusion relieving pain significantly impaired CRT and further deterioration was noted following increasing pain intensity. | Pain and remifentanil seemed to have additive deleterious cognitive effects. This study represents an initial step to enhance our basic understanding of some of the cognitive effects following a painful stimulus and an opioid infusion separately and combined in a sequence comparable to clinical settings. | closed_qa |
Is 3D technique superior to 2D in Down syndrome screening? | The objective of this article is to investigate whether, in the clinical setting of second trimester ultrasound (US) investigations, 3D multiplanar correction prior to the measurement of Down syndrome (DS) facial markers (nasal bone length, prenasal thickness, fetal profile line, maxilla-nasion-mandible angle, prenasal thickness to nasal bone length ratio, and prefrontal space ratio) is superior to subjective judgment of a correct midsagittal plane by the 2D technique. Measurements were performed on 2D images and 3D volumes (corrected to the midsagittal plane) acquired during the same scanning session. All six markers were measured in 105 datasets (75 of euploid fetuses and 30 of DS fetuses). The maxilla-nasion-mandible angle measured on 2D images was significantly larger than on 3D volumes (p < 0.01). For all other markers, there was no significant difference between measurements performed on 2D images and 3D volumes. Measurements performed on images acquired by 2D and 3D US also did not differ in their ability to discriminate between normal and DS fetuses for any marker. | Nasal bone length, prenasal thickness, fetal profile line, prenasal thickness to nasal bone length ratio, and prefrontal space ratio can be confidently used as DS markers in second trimester US examinations performed by 2D US. | closed_qa
Case-based learning and simulation: useful tools to enhance nurses' education? | To compare skills acquired by undergraduate nursing students enrolled in a medical-surgical course. To compare skills demonstrated by students with no previous clinical practice (undergraduates) and nurses with clinical experience enrolled in continuing professional education (CPE). In a nonrandomized clinical trial, 101 undergraduates enrolled in the "Adult Patients 1" course were assigned to the traditional lecture and discussion (n = 66) or lecture and discussion plus case-based learning (n = 35) arm of the study; 59 CPE nurses constituted a comparison group to assess the effects of previous clinical experience on learning outcomes. Scores on an objective structured clinical examination (OSCE), using a human patient simulator and cases validated by the National League for Nursing, were compared for the undergraduate control and intervention groups, and for CPE nurses (Student's t test). Controls scored lower than the intervention group on patient assessment (6.3 ± 2.3 vs 7.5 ± 1.4, p = .04, mean difference, -1.2 [95% confidence interval (CI) -2.4 to -0.03]) but the intervention group did not differ from CPE nurses (7.5 ± 1.4 vs 8.8 ± 1.5, p = .06, mean difference, -1.3 [95% CI -2.6 to 0.04]). The CPE nurses committed more "rules-based errors" than did undergraduates, specifically patient identifications (77.2% vs 55%, p = .7) and checking allergies before administering medication (68.2% vs 60%, p = .1). | The intervention group developed better patient assessment skills than the control group. Case-based learning helps to standardize the process, which can contribute to quality and consistency in practice: It is essential to correctly identify a problem in order to treat it. Clinical experience of CPE nurses was not associated with better adherence to safety protocols. | closed_qa |
Can serum NGAL levels be used as an inflammation marker in hemodialysis patients with a permanent catheter? | Neutrophil gelatinase-associated lipocalin (NGAL) is a member of the lipocalin family and is released from many tissues and cells. We aimed to investigate the relationship among serum NGAL levels, inflammation markers (IL-6, hs-CRP, TNF-α) and the different vascular access types used in dialysis patients. The study population included 90 patients and 30 healthy age-matched controls. The patients were divided into three groups (I, II, III), and group IV comprised the controls. Patients in groups I and II had a permanent central venous catheter and an arterio-venous fistula, respectively. Group III included 30 patients with chronic renal failure. Hemogram, biochemical assays, ferritin, IL-6, hs-CRP, TNF-α, and NGAL were evaluated in all groups. Serum NGAL levels were markedly higher in group I than in group II (7645.80 ± 924.61 vs. 4131.20 ± 609.87 pg/mL; p < 0.05). A positive correlation was detected between NGAL levels and duration of catheter use (r: 0.903, p: 0.000), hs-CRP (r: 0.796, p: 0.000), IL-6 (r: 0.687, p: 0.000), TNF-α (r: 0.568, p: 0.000) levels and ferritin (r: 0.318, p: 0.001), whereas NGAL levels were negatively correlated with serum albumin levels (r: -0.494, p: 0.000). In multiple regression analysis, duration of catheter use, hs-CRP and TNF-α were predictors of NGAL in hemodialysis patients. | Inflammation was observed in hemodialysis patients and increased with catheter use. Our findings show a strong relationship among serum NGAL levels, duration of catheter use, hs-CRP and TNF-α. NGAL may be used as a new inflammation marker in hemodialysis patients. | closed_qa
Multicystic dysplastic kidney: is an initial voiding cystourethrogram necessary? | Traditionally, a voiding cystourethrogram (VCUG) has been obtained in patients diagnosed with multicystic dysplastic kidney (MCDK) because of published vesicoureteral reflux (VUR) rates of 10%-20%. However, with the diagnosis and treatment of low-grade VUR undergoing significant changes, we questioned the utility of obtaining a VCUG in healthy patients with an MCDK. We reviewed our experience to see how many of the patients with documented VUR required surgical intervention. We performed a retrospective review of children diagnosed with unilateral MCDK from 2002 to 2012 who also underwent a VCUG. A total of 133 patients met our inclusion criteria. VUR was identified in 23 (17.3%) children. Four patients underwent ureteral reimplantation (3.0%). Indications for surgical therapy included breakthrough urinary tract infections (2 patients), evidence of dysplasia/scarring (1 patient) and non-resolving reflux (1 patient). All patients with a history of VUR who are toilet trained, regardless of the grade or treatment, are currently being followed off antibiotic prophylaxis. To date, none have had a febrile urinary tract infection (UTI) since cessation of prophylactic antibiotics. Hydronephrosis in the contralateral kidney was not predictive of VUR (p = 0.99). | Routine VCUG in healthy children diagnosed with unilateral MCDK may not be warranted given the low incidence of clinically significant VUR. If a more conservative strategy is preferred, routine VCUG may be withheld in children without hydronephrosis of the contralateral (normal) kidney and considered in patients in whom such hydronephrosis is present. If a VCUG is not performed, the family should be instructed in the signs and symptoms of urinary tract infection. | closed_qa
Is mycophenolate mofetil an alternative agent to corticosteroids in traumatic nerve paralysis? | The effects of an immunosuppressive agent, mycophenolate mofetil (MM), on traumatic nerve function were investigated and compared with those of methylprednisolone (MP) and dexamethasone (DXM). This is a randomized controlled animal study. This experimental study was performed on 84 male Wistar albino rats. The rats were assigned to 12 groups, each consisting of 7 animals. The groups were formed according to the application of normal-dose DXM (group 1A-B), high-dose MP (group 2A-B), normal-dose MP (group 3A-B), MM (group 4A-B), and MM with high-dose MP combination therapy (group 5A-B). Right sciatic nerve dissection was performed, and compound muscle action potential thresholds were recorded. The nerve was traumatized by compression with Jeweller forceps for 20 seconds. Posttraumatic thresholds were also recorded. The compound muscle action potential thresholds were recorded in the first and fourth weeks for the assigned groups. Then, the nerve was transected and prepared for electron microscopic and histopathologic examinations. Nitric oxide and malondialdehyde assessments were performed on both tissue and blood samples. Only the MM and MP+MM groups had satisfactory electron microscopic findings and nearly matched the tissue characteristics of the control animals. Despite the electrophysiologic recovery, the DXM group was found to have poor electron microscopic scoring. | Mycophenolate mofetil has been found to be beneficial in the treatment of traumatic nerve paralysis. Although a complementary investigation is needed, this immunosuppressive agent may be an alternative to corticosteroids for selected cases in which steroid therapy is contraindicated. | closed_qa
Does the degree of ptosis predict the degree of psychological morbidity in bariatric patients undergoing reconstruction? | There is proven therapeutic benefit in bariatric surgery for obese patients. Successful bariatric surgery will result in massive weight loss and ptotic skin, which can cause significant functional and psychological problems. As the incidence of bariatric surgery increases, so will the demand for plastic surgery. Currently, there is no evidence-based indication for massive weight loss body contouring, and therefore there is no standardized provision. A prospective, multicenter, observational study of outcomes in 75 patients undergoing bariatric and plastic surgery procedures at two clinical sites was performed to determine whether the degree of ptosis can be predicted by the type (malabsorptive or restrictive) of bariatric surgery and whether the extent of disfigurement has an impact on psychological morbidity. Massive weight loss body contouring is not purely aesthetic surgery; it leads to functional and psychosocial benefits. This study has given preliminary data on which anthropometric measurements and their thresholds lead to the greatest benefit from massive weight loss body contouring. In this study, the fourth quartiles of the anthropometric measurements xiphisternum to pubic symphysis (≥91 cm), umbilicus to pubic symphysis (≥38 cm), and hip circumference (≥143 cm) were significantly associated with psychometric scores crossing from the normal range into the pathological range. | This study demonstrates that there is a statistically significant, quantifiable correlation among type of bariatric surgery, degree of ptosis, and psychological morbidity in patients who have undergone bariatric surgery. This pilot study could provide the basis for evidence-based guidelines for plastic surgery referral. | closed_qa
Do therapeutic imagery practices affect physiological and emotional indicators of threat in high self-critics? | Imagery is known to be a powerful means of stimulating various physiological processes and is increasingly used within standard psychological therapies. Compassion-focused imagery (CFI) has been used to stimulate affiliative emotion in people with mental health problems. However, evidence suggests that self-critical individuals may have particular difficulties in this domain with single trials. The aim of the present study was to further investigate the role of self-criticism in responsiveness to CFI by specifically pre-selecting participants based on trait self-criticism. Using the Forms of Self-Criticism/Self-Reassuring Scale, 29 individuals from a total sample of 139 were pre-selected to determine how self-criticism impacts upon an initial instance of imagery. All participants took part in three activities: a control imagery intervention (useable data N = 25), a standard CFI intervention (useable data N = 25), and a non-intervention control (useable data N = 24). Physiological measurements (alpha amylase) as well as questionnaire measures of emotional responding (i.e., the Positive and Negative Affect Schedule, the Types of Positive Affect Scale, and the State Adult Attachment Scale) were taken before and after the different interventions. Following both imagery interventions, repeated measures analyses revealed that alpha amylase increased significantly for high self-critics compared with low self-critics. High self-critics (HSC) also reported greater insecurity on entering the imagery session and more negative CFI experiences compared with low self-critics. | Data demonstrate that HSC respond negatively to imagery interventions in a single trial. This highlights that imagery focused therapies (e.g., CFI) need interventions that manage fears, blocks, and resistances to the techniques, particularly in HSC. | closed_qa |
Is insomnia associated with deficits in neuropsychological functioning? | People with insomnia complain of cognitive deficits in daily life. Results from empirical studies examining associations between insomnia and cognitive impairment, however, are mixed. Research is needed that compares treatment-seeking and community-based insomnia study samples, measures subjective as well as objective cognitive functioning, and considers participants' pre-insomnia cognitive function. We used data from the Dunedin Study, a representative birth cohort of 1,037 individuals, to examine whether insomnia in early midlife was associated with subjective and objective cognitive functioning. We also tested whether individuals with insomnia who reported seeking treatment for their sleep problems (treatment-seekers) showed greater impairment than other individuals with insomnia (non-treatment-seekers). The role of key confounders, including childhood cognitive ability and comorbid health conditions, was evaluated. Insomnia was diagnosed at age 38 according to DSM-IV criteria. Objective neuropsychological assessments at age 38 included the WAIS-IV IQ test, the Wechsler Memory Scale, and the Trail-Making Test. Childhood cognitive functioning was assessed using the Wechsler Intelligence Scale for Children-Revised (WISC-R). A total of 949 cohort members were assessed for insomnia symptoms and other study measures at age 38. Although cohort members with insomnia (n = 186, 19.6%) had greater subjective cognitive impairment than their peers at age 38, they did not exhibit greater objective impairment on formal testing. Treatment-seekers, however, exhibited significant objective impairment compared to non-treatment-seekers. Controlling for comorbidity, daytime impairment, and medications slightly decreased this association. Childhood cognitive deficits antedated the adult cognitive deficits of treatment-seekers. | Links between insomnia and cognitive impairment may be strongest among individuals who seek clinical treatment. Clinicians should take into account the presence of complex health problems and lower premorbid cognitive function when planning treatment for insomnia patients. | closed_qa |
Acute kidney injury after cardiac surgery: is minocycline protective? | Acute kidney injury (AKI) after coronary artery bypass graft (CABG) surgery is common and carries a significant association with morbidity and mortality. Since minocycline therapy attenuates kidney injury in animal models of AKI, we tested its effects in patients undergoing CABG. This is a randomized, double-blinded, placebo-controlled, multi-center study. We screened high-risk patients who were scheduled to undergo CABG in two medical centers between Jan 2008 and June 2011. Forty patients were randomized, and 19 patients in each group completed the study. Minocycline prophylaxis was given twice daily, for at least four doses prior to CABG. The primary outcome was defined as AKI [a 0.3 mg/dl increase in creatinine (Cr)] within 5 days after surgery. Daily serum Cr for 5 days, various clinical and hemodynamic measures, and length of stay were recorded. The two groups had similar baseline and intra-operative characteristics. The primary outcome occurred in 52.6% of patients in the minocycline group as compared to 36.8% of patients in the placebo group (p = 0.51). Peak Cr was 1.6 ± 0.7 vs. 1.5 ± 0.7 mg/dl (p = 0.45) in the minocycline and placebo groups, respectively. Death at 30 days occurred in 0 vs. 10.5% in the minocycline and placebo groups, respectively (p = 0.48). There were no differences in post-operative length of stay or cardiovascular events between the two groups. There was a trend towards lower diastolic pulmonary artery pressure [16.8 ± 4.7 vs. 20.7 ± 6.6 mmHg (p = 0.059)] and central venous pressure [11.8 ± 4.3 vs. 14.6 ± 5.6 mmHg (p = 0.13)] in the minocycline group compared to placebo on the first day after surgery. | Minocycline did not protect against AKI post-CABG. | closed_qa
Restless legs syndrome in non-dialysis renal patients: is it really that common? | Sleep disorders are frequent in chronic kidney disease (CKD). Among them, restless legs syndrome (RLS) may affect up to 60% of patients on dialysis, and it has been related to a poor quality of life and higher cardiovascular risk. Despite its high prevalence in advanced stages of renal disease, the frequency of RLS in non-dialysis CKD has not been clearly established. The aim of this study was to assess the frequency of RLS in non-dialysis CKD patients (stages 2 to 4) followed in a reference nephrology outpatient clinic. A standardized questionnaire following the International RLS Study Group diagnostic criteria was self-administered by 110 patients regularly followed in the nephrology clinic. The series comprised 69 men and 41 women, aged 68 ± 13.2 years, with a mean serum creatinine of 1.7 ± 0.8 mg/dL. Subsequently, patients classified as probable RLS according to the questionnaire underwent a systematic neurological examination. The presence of peripheral artery disease was evaluated by the ankle-brachial index (ABI). The frequency of probable RLS according to the questionnaire results was 21% (17% for men and 27% for women). However, after thorough neurological examination, the diagnosis of RLS was confirmed in only 5 patients. Therefore, the overall definitive RLS frequency was 4.5% (within the prevalence reported for the general population) and was higher among women (9.7% vs 0.2%). In the remaining cases, symptoms were due to leg discomfort related to other disorders. Patients with probable and improbable RLS were not significantly different in age, ABI, diabetes, and other comorbid circumstances, except for tricyclic antidepressant prescription, which was more frequent in the probable RLS group (17% vs 2%). Renal function was better in definitive RLS patients than in cases classified as probable RLS by the questionnaire but not confirmed after neurological exam. | Although RLS can represent an early manifestation of CKD, its prevalence seems very close to that reported for the general population. Diagnostic confirmation of RLS falls dramatically after expert examination, raising the question of whether, in the study of RLS cohorts, CKD has a potentially causal relationship or is a confounding factor associated with other causes of leg discomfort. | closed_qa
Is the chronotype associated with obstructive sleep apnea? | Chronotype and obstructive sleep apnea (OSA) appear to have a similar lifelong evolution, which could indicate a possible effect of morningness or eveningness on the apnea-hypopnea index (AHI). The present study aimed to examine the prevalence of chronotypes in a representative sample of São Paulo city residents and to investigate the effect of chronotypes on the severity of OSA. We performed a cross-sectional analysis using the São Paulo Epidemiologic Sleep Study (EPISONO). All participants underwent a full-night polysomnography and completed the Morningness-eveningness, Epworth Sleepiness Scale, and UNIFESP Sleep questionnaires. Chronotypes were classified as morning-type, evening-type, and intermediate. Morning-type individuals represented 52.1% of the sample, followed by intermediate (39.5%) and evening-type (8.4%) individuals. After stratifying the sample by body mass index (BMI) (>26.8 kg/m(2)) and age (>42 years), we observed increased AHI values in morning- and evening-type individuals. | We demonstrated, for the first time, an age- and BMI-related effect of morning and evening chronotypes on OSA severity, suggesting that the intermediate chronotype might play a role as a protective factor in older and overweight patients. | closed_qa
Bone marrow FDG-PET/CT in Hodgkin lymphoma revisited: do imaging and pathology match? | To directly compare visual and quantitative (18)F-fluoro-2-deoxy-D-glucose positron emission tomography/computed tomography (FDG-PET/CT) to bone marrow biopsy (BMB) findings in the right posterior iliac crest in patients with newly diagnosed Hodgkin lymphoma. This retrospective study included 26 patients with newly diagnosed Hodgkin lymphoma in whom FDG-PET/CT was performed before BMB of the right posterior iliac crest. The right posterior iliac crest was assessed for bone marrow involvement, both visually and semi-quantitatively [using maximum standardized uptake value (SUVmax) measurements]. BMB of the right posterior iliac crest was used as the reference standard. BMB of the right posterior iliac crest was positive in 5 (19.2 %) of 26 patients. There was full agreement between visual FDG-PET/CT and BMB findings in the right posterior iliac crest (i.e. no false-positive or false-negative FDG-PET/CT findings). Accordingly, sensitivity, specificity, positive predictive value, and negative predictive value of visual FDG-PET/CT assessment for the detection of bone marrow involvement in the right posterior iliac crest were 100 % (5/5) (95 % CI 51.1-100 %), 100 % (21/21) (95 % CI 81.8-100 %), 100 % (5/5) (95 % CI 51.1-100 %), and 100 % (21/21) (95 % CI 81.8-100 %), respectively. The SUVmax of BMB-positive cases (mean ± SD: 3.4 ± 0.85) was higher than that of BMB-negative cases (mean ± SD: 2.7 ± 0.63), although the difference fell just short of significance (P = 0.052). | This histopathological correlation study confirms the very high diagnostic value of FDG-PET/CT in the detection of bone marrow involvement in newly diagnosed Hodgkin lymphoma, and supports the substitution of BMB with FDG-PET/CT in this setting. | closed_qa
Is implementation of the 2013 Australian treatment guidelines for posttraumatic stress disorder cost-effective compared to current practice? | To assess, from a health sector perspective, the incremental cost-effectiveness of three treatment recommendations in the most recent Australian Clinical Practice Guidelines for posttraumatic stress disorder (PTSD). The interventions assessed are trauma-focused cognitive behavioural therapy (TF-CBT) and selective serotonin reuptake inhibitors (SSRIs) for the treatment of PTSD in adults and TF-CBT in children, compared to current practice in Australia. Economic modelling, using existing databases and published information, was used to assess cost-effectiveness. A cost-utility framework using both quality-adjusted life-years (QALYs) gained and disability-adjusted life-years (DALYs) averted was used. Costs were tracked for the duration of the respective interventions and applied to the estimated 12 months prevalent cases of PTSD in the Australian population of 2012. Simulation modelling was used to provide 95% uncertainty around the incremental cost-effectiveness ratios. Consideration was also given to factors not considered in the quantitative analysis but could determine the likely uptake of the proposed intervention guidelines. TF-CBT is highly cost-effective compared to current practice at $19,000/QALY, $16,000/DALY in adults and $8900/QALY, $8000/DALY in children. In adults, 100% of uncertainty iterations fell beneath the $50,000/QALY or DALY value-for-money threshold. Using SSRIs in people already on medications is cost-effective at $200/QALY, but has considerable uncertainty around the costs and benefits. While there is a 13% chance of health loss there is a 27% chance of the intervention dominating current practice by both saving dollars and improving health in adults. | The three Guideline recommended interventions evaluated in this study are likely to have a positive impact on the economic efficiency of the treatment of PTSD if adopted in full. While there are gaps in the evidence base, policy-makers can have considerable confidence that the recommendations assessed in the current study are likely to improve the efficiency of the mental health care sector. | closed_qa |
Age-related cognitive task effects on gait characteristics: do different working memory components make a difference? | Though it is well recognized that gait characteristics are affected by concurrent cognitive tasks, how different working memory components contribute to dual-task effects on gait is still unknown. The objective of the present study was to investigate dual-task effects on gait characteristics, specifically the application of cognitive tasks involving different working memory components. We also examined age-related differences in such dual-task effects. Three cognitive tasks (i.e. 'Random Digit Generation', 'Brooks' Spatial Memory', and 'Counting Backward') involving different working memory components were examined. Twelve young (6 males and 6 females, 20-25 years old) and 12 older participants (6 males and 6 females, 60-72 years old) took part in two phases of experiments. In the first phase, each cognitive task was defined at three difficulty levels, and perceived difficulty was compared across tasks. The cognitive tasks perceived to be equally difficult were selected for the second phase. In the second phase, four testing conditions were defined, corresponding to a baseline and the three equally difficult cognitive tasks. Participants walked on a treadmill at their self-selected comfortable speed in each testing condition. Body kinematics were collected during treadmill walking, and gait characteristics were assessed using spatial-temporal gait parameters. Application of the concurrent Brooks' Spatial Memory task led to longer step times compared to the baseline condition. Larger step width variability was observed in both the Brooks' Spatial Memory and Counting Backward dual-task conditions than in the baseline condition. In addition, cognitive task effects on step width variability differed between the two age groups. In particular, the Brooks' Spatial Memory task led to significantly larger step width variability only among older adults. | These findings revealed that cognitive tasks involving the visuo-spatial sketchpad interfered with gait more severely in older than in young adults. Thus, dual-task training, in which a cognitive task involving the visuo-spatial sketchpad (e.g. the Brooks' Spatial Memory task) is performed concurrently with walking, could be beneficial for mitigating gait impairments among older adults. | closed_qa
Osteodesis for hallux valgus correction: is it effective? | Although the etiology of hallux valgus is contested, in some patients it may be failure of the stabilizing soft tissue structures around the first ray of the foot. Because there is a lack of effective soft tissue techniques, osteotomies have become the mainstream surgical approach to compensate for the underlying soft tissue deficiency; osteodesis, a soft tissue nonosteotomy technique, may be a third alternative, but its efficacy is unknown. We asked: (1) Can an osteodesis, a distal soft tissue technique, correct hallux valgus satisfactorily in terms of deformity correction and improvement in American Orthopaedic Foot and Ankle Society (AOFAS) score? (2) Is the effectiveness of an osteodesis affected by the patient's age or deformity severity? (3) What complications are associated with this procedure? Between February and October 2010, we performed 126 operations to correct hallux valgus, of which 126 (100%) were osteodeses. Sixty-one patients (110 procedures) (87% of the total number of hallux valgus procedures) were available for followup at a minimum of 12 months (mean, 23 months; range, 12-38 months). This group formed our study cohort. During the study period, the general indications for this approach included failed conservative measures for pain relief and a metatarsophalangeal angle greater than 20° or an intermetatarsal angle greater than 9°. Intermetatarsal cerclage sutures were used to realign the first metatarsal, and postoperative fibrosis was induced surgically between the first and second metatarsals to maintain its alignment. The radiologic first intermetatarsal angle, metatarsophalangeal angle, and medial sesamoid position were measured by Hardy and Clapham's methods for deformity and correction evaluation. Clinical results were assessed by the AOFAS score. The intermetatarsal angle improved from a preoperative mean of 14° to 7° (p<0.001; Cohen's d=1.8) at followup, the metatarsophalangeal angle from 31° to 18° (p<0.001; Cohen's d=3.1), the medial sesamoid position from position 6 to 3 (p<0.001; Cohen's d=2.4), and the AOFAS hallux score from 68 to 96 points (p<0.001). Neither patient age nor deformity severity affected the effectiveness of the osteodesis in correcting all three radiologic parameters; however, the deformities treated in this series generally were mild to moderate (mean intermetatarsal angle, 14°; range, 9°-22°). There were six stress fractures of the second metatarsal (5%), five temporary metatarsophalangeal joint medial subluxations, all of which resolved within one month with a taping-reduction method and without surgery, and six metatarsophalangeal joints with dorsiflexion reduced to less than 60°. | The osteodesis is a soft tissue nonosteotomy technique that provided adequate deformity correction and improvement in AOFAS scores for patients with mild to moderate hallux valgus deformities, although a small number of patients developed postoperative stress fractures of the second ray. Future prospective studies should compare this technique with osteotomy techniques in terms of effectiveness of the correction, restoration of hallux function, complications, and long-term recurrence. | closed_qa
Can 111In-RGD2 monitor response to therapy in head and neck tumor xenografts? | RGD (arginylglycylaspartic acid)-based imaging tracers allow specific imaging of the expression of integrin αvβ3, a protein overexpressed during angiogenesis; however, few studies have investigated the potential of these tracers to monitor responses to antiangiogenic or radiation therapy. In the studies presented here, (111)In-RGD2 was assessed for its potential as an imaging tool to monitor such responses to therapies. DOTA-E-[c(RGDfK)]2 was radiolabeled with (111)In ((111)In-RGD2), and biodistribution studies were performed in mice with subcutaneous FaDu or SK-RC-52 xenografts after treatment with either antiangiogenic therapy (bevacizumab or sorafenib) or tumor irradiation (10 Gy). Micro-SPECT imaging studies and subsequent quantitative analysis were also performed. The effect of bevacizumab, sorafenib, or radiation therapy on tumor growth was determined. The uptake of (111)In-RGD2 in tumors, as determined from biodistribution studies, correlated well with that quantified from micro-SPECT images, and both showed that (111)In-RGD2 uptake was enhanced 15 d after irradiation. Specific or nonspecific uptake of (111)In-RGD2 in FaDu or SK-RC-52 xenografts was not affected after antiangiogenic therapy, except in head and neck squamous cell carcinoma 19 d after the start of sorafenib therapy (P<0.05). The uptake of (111)In-RGD2 followed tumor volume in studies featuring antiangiogenic therapy. However, the uptake of (111)In-RGD2 in FaDu xenografts was decreased as early as 4 h after tumor irradiation, despite nonspecific uptake remaining unaltered. Tumor growth was inhibited after antiangiogenic or radiation therapy. | Here, it is suggested that (111)In-RGD2 could allow in vivo monitoring of angiogenic responses after radiotherapy and may therefore prove a good clinical tool to monitor angiogenic responses early after the start of radiotherapy in patients with head and neck squamous cell carcinoma. Despite clear antitumor efficacy, antiangiogenic therapy did not alter tumor uptake of (111)In-RGD2, indicating that integrin expression was not altered. | closed_qa
Mesenchymal stem cell implantation in osteoarthritic knees: is fibrin glue effective as a scaffold? | The cell-based tissue engineering approach that uses mesenchymal stem cells (MSCs) has addressed the issue of articular cartilage repair in osteoarthritic (OA) knees. However, to improve outcomes, an advanced surgical procedure with tissue-engineered scaffolds may be needed to treat patients with large cartilage lesions. To investigate the clinical and second-look arthroscopic outcomes of the implantation of MSCs loaded in fibrin glue as a scaffold in patients with OA knees and to compare these outcomes with those of MSC implantation without a scaffold. Cohort study; Level of evidence, 3. This study retrospectively evaluated 54 patients (56 knees) who were examined with second-look arthroscopy after MSC implantation for cartilage lesions in their OA knees. Patients were divided into 2 groups: 37 patients (39 knees) were treated with MSC implantation without a scaffold (group 1), and 17 patients (17 knees) underwent implantation of MSCs loaded in fibrin glue as a scaffold (group 2). Clinical outcomes were evaluated according to the International Knee Documentation Committee (IKDC) score and the Tegner activity scale, and cartilage repair was assessed with the International Cartilage Repair Society (ICRS) grade. Statistical analyses were performed to identify various prognostic factors associated with the clinical and second-look arthroscopic outcomes. At final follow-up (mean, 28.6 months; range, 24-34 months), the mean IKDC score and Tegner activity scale in each group significantly improved: group 1, from 38.1±7.7 to 62.0±11.7 (IKDC) and from 2.5±0.9 to 3.5±0.8 (Tegner); group 2, from 36.1±6.2 to 64.4±11.5 (IKDC) and from 2.2±0.8 to 3.8±0.8 (Tegner) (P<.001 for all). According to the overall ICRS cartilage repair grades, 9 of the 39 lesions (23%) in group 1 and 12 of the 17 lesions (58%) in group 2 achieved a grade of I or II. There was a significant difference in ICRS grades between the groups (P=.028). Overweight (body mass index≥27.5 kg/m2) and large lesion size (≥5.7 cm2) were significant predictors of poor clinical and arthroscopic outcomes in group 1 (P<.05 for both). There was a similar trend in group 2, but the differences were not significant, possibly owing to the smaller sample size. | Clinical and arthroscopic outcomes of MSC implantation were encouraging for OA knees in both groups, although there were no significant differences in outcome scores between groups. However, at second-look arthroscopy, there were better ICRS grades in group 2. | closed_qa |
Does shoulder impingement syndrome affect the shoulder kinematics and associated muscle activity in archers? | Archery-related injuries, such as shoulder impingement syndrome, are caused by repeated motion of the shoulder. The aim of this study was to analyze differences in the shoulder kinematics and the associated muscle activity between archers with shoulder impingement and uninjured archery players. Thirty male archers, who were divided into an impingement group and an uninjured group, were included in this study. The angle of scapular elevation, shoulder joint abduction, horizontal extension, and elbow joint flexion, as well as the electromyographic activity of the upper trapezius, lower trapezius, deltoid middle, deltoid posterior, biceps brachii, and triceps brachii muscles at the point of stabilization during shooting, were measured. Variables differing between the impingement and uninjured groups were identified, and a stepwise regression analysis was performed to identify a combination of variables that effectively predicted impingement syndrome. The results indicated that the angle of scapular elevation was significantly greater in the impingement group than in the uninjured group (P<0.05). The angle of horizontal extension in the impingement group was significantly smaller than that in the uninjured group (P<0.05). The angle of elbow flexion in the impingement group was significantly smaller than that in the uninjured group (P<0.05). The levels of upper trapezius and deltoid middle muscle activity were significantly higher in the impingement group, while the level of lower trapezius muscle activity was significantly lower (P<0.05) when compared to the uninjured group. In summary, the impingement group had a greater angle of scapular elevation, a smaller angle of horizontal extension, a smaller angle of elbow flexion, higher upper trapezius activity, lower lower trapezius activity, higher deltoid middle muscle activity and a higher UT/LT ratio (all differences were significant). A logistic model for predicting impingement syndrome showed that the UT/LT ratio was significantly related to impingement syndrome (P<0.05). | The authors concluded that archers with shoulder impingement syndrome exhibit different kinematics and muscle activity compared to uninjured archers. Therefore, to prevent shoulder joint impingement during archery, training that increases lower trapezius muscle activity, and thereby decreases the UT/LT ratio, is necessary. | closed_qa
Are speed cameras able to reduce traffic noise disturbances? | Disturbance by traffic noise can result in health problems in the long run; however, the subjective perception of noise plays an important role in their development. The aim of this study was to determine whether speed cameras can reduce the subjective traffic noise disturbance of residents of high-traffic roads in Luebeck. In August 2012 a speed camera was installed in each of 2 high-traffic roads in Luebeck (IG). Residents living within 1.5 km before and after the installed speed cameras received a postal questionnaire to evaluate their subjective noise perception before (t0), 8 weeks after (t1) and 12 months after (t2) the installation of the speed camera. As controls (CG) we surveyed residents of another high-traffic road in Luebeck without speed cameras and residents of 2 roads with several consecutive speed cameras installed a few years earlier. Furthermore, objective measurements of the traffic noise level were conducted. Response rates declined from 35.9% (t0) to 27.2% (t2). The proportion of women in the CG (61.4-63.7%) was significantly higher than in the IG (53.7-58.1%, p<0.05), and responders were significantly younger (46.5±20.5-50±22.0 vs. 59.1±17.0-60.5±16.9 years, p<0.05). A reduction in perceived noise disturbance of 0.2 points, measured on a scale from 0 (no disturbance) to 10 (heavy disturbance), was observed in both the IG and the CG. When asked directly, 15.2% of the IG and 19.3% of the CG reported a traffic noise reduction at t2. The objective measurements showed a mean reduction of 0.6 dB at t1. | The change in noise level of 0.6 dB, which could only be perceived by direct comparison, is in line with the subjective noise perception. As a sole measure to reduce traffic noise (and for health promotion), speed cameras are insufficient. | closed_qa |
Does cesarean section before the scheduled date increase the risk of neonatal morbidity? | Previous studies led to the recommendation to schedule planned elective cesarean deliveries at or after 39 weeks of gestation and not before 38 weeks. The question is whether this practice is appropriate in the face of possible risks to the newborn should the pregnancy have to be ended by cesarean section before the scheduled date. To compare the outcomes of newborn infants who were delivered on their scheduled day by elective cesarean section versus those who required delivery earlier. This single-center retrospective study was based on medical records covering a period of 18 months. We compared the neonatal outcomes of 272 infants delivered by elective cesarean section as scheduled (at 38.8 +/- 0.8 weeks gestation) and 44 infants who had to be delivered earlier than planned (at 37.9 +/- 1.1 weeks). We found no morbidity directly related to delivery by cesarean section before the scheduled date. There were no significant differences in the need for resuscitation after delivery. Although more of the infants who were delivered early were admitted to intensive care and overall stayed longer in the hospital (5.8 +/- 7.3 vs. 3.9 +/- 0.8 days, P<0.02), their more severe respiratory illness and subsequent longer hospitalization were the result of their younger gestational age. Transient tachypnea of the newborn was associated with younger gestational age at delivery in both groups. | We suggest continuing with the current recommendation to postpone elective cesarean singleton deliveries beyond 38-39 weeks of gestation whenever possible. | closed_qa |
Are physician pagers an outmoded technology? | Pagers are the most commonly used method of communication in American hospitals; however, their financial cost and efficiency are largely unknown. To evaluate the efficiency of conventional hospital pagers and to estimate the financial cost of time wasted by the use of these pagers. We conducted a survey among 100 clinicians, nurses and pharmacists in our community teaching hospital, estimating the time spent in the process of sending and responding to pages and the financial equivalent of this time, and evaluating the potential advantages of hospital-based wireless telephones compared with traditional pagers. A total of 70 clinicians completed the survey, for a response rate of 70%. The average time spent per daytime shift using the paging system was between 48 and 66 minutes for physicians, 120 minutes for nurses and 165 minutes for pharmacists. The financial cost of time lost for a single medical ward for one month was estimated to be $2,732-$17,250, depending on the case scenario. | Our study suggests that the traditional paging system is an inefficient means of communication between clinicians and hospital staff and that a switch to direct phone calls might be far more cost-effective. Similar considerations probably apply to most hospitals that still use traditional pagers. | closed_qa |
Does a Nintendo Wii exercise program provide similar exercise demands as a traditional pulmonary rehabilitation program in adults with COPD? | The chronic obstructive pulmonary disease (COPD) population can experience lower activity and fitness levels than the non-COPD population. The Nintendo Wii may be an appropriate at-home training device for the COPD population and could be used as a supplement to a pulmonary rehabilitation program. This was a randomized, within-subject, cross-over study involving 10 adults with COPD previously enrolled in St Paul's Hospital's pulmonary rehabilitation program. The study attempted to determine whether specific Wii activities resulted in energy expenditures similar to those of a more traditional pulmonary rehabilitation activity. Participants completed two 15-min exercise interventions in a single session, with a washout period of 30 min in between: an experimental Wii intervention and a traditional treadmill intervention. There was no significant difference in total energy expenditure between the two 15-min exercise interventions [mean difference 36.3 joules; 95% confidence interval (CI): -31.4, 104]. There was no significant difference in heart rate (mean difference -0.167 beats per minute; 95% CI: -4.83, 4.50), rating of perceived exertion (mean difference 0.100; 95% CI: -0.416, 0.616) or Borg dyspnea scale (mean difference 0.267; 95% CI: -0.004, 0.537) between the two interventions. There was a significant difference in SpO2, which was a mean of 2.33% higher during the Wii intervention than the treadmill intervention (95% CI: 1.52, 3.15). | Gaming technology can provide an exercise program that has similar cardiovascular demands to traditional pulmonary rehabilitation programs for patients with COPD. Further research is necessary to address feasibility and long-term adherence. | closed_qa |
Could a light meal jeopardize laboratory coagulation tests? | At present, the required fasting time before coagulation testing is not standardized, and our hypothesis is that this can harm patient safety. This study aimed to evaluate whether a light meal (i.e. breakfast) can jeopardize laboratory coagulation tests. A blood sample was first collected from 17 volunteers after a 12-h fast. Immediately after blood collection, the volunteers consumed a light meal. Samples were then collected at 1, 2 and 4 h after the meal. Coagulation tests included: activated partial thromboplastin time (APTT), prothrombin time (PT), fibrinogen (Fbg), antithrombin III (AT), protein C (PC) and protein S (PS). Differences between samples were assessed by the Wilcoxon matched-pairs signed-rank test, with the level of statistical significance set at P<0.05. Mean percentage differences were determined, and differences between baseline and the 1, 2 and 4 h samples were compared with the reference change value (RCV). A significantly higher % activity of AT was observed at 1 h and 4 h after the meal vs. the baseline specimen [113 (104-117) and 111 (107-120) vs. 109 (102-118), respectively; P = 0.029 and P = 0.016]. APTT at 2 h was significantly lower than at baseline [32.0 (29.9-34.8) vs. 34.1 (32.2-35.2), respectively; P = 0.041]. The results of both the Fbg and PS tests were not influenced by a light meal. Furthermore, no coagulation test showed significant variation when compared with the RCV. | A light meal does not influence the laboratory coagulation tests we assessed, but we suggest that laboratory quality managers standardize the fasting time for all blood tests at 12 hours, to allow the lipid intake to be completely metabolized. | closed_qa |
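The comparison against the reference change value (RCV) in the row above follows the standard formula RCV = √2 · Z · √(CVa² + CVi²). A minimal sketch, using placeholder analytical (CVa) and within-subject biological (CVi) coefficients of variation rather than the study's assay data:

```python
import math

def reference_change_value(cv_analytical, cv_within_subject, z=1.96):
    """RCV (%) = sqrt(2) * Z * sqrt(CVa^2 + CVi^2); Z = 1.96 for p < 0.05."""
    return math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_within_subject**2)

# Placeholder CVs (%) for a coagulation assay; real values are assay-specific.
rcv = reference_change_value(cv_analytical=3.0, cv_within_subject=10.0)
print(f"RCV = {rcv:.1f}%")  # post-meal changes below this are not significant
```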
Does hypericin boost the efficacy of high-power laser? | Lasers are widely used in treating symptomatic benign prostatic hyperplasia, and in current practice potassium titanyl phosphate (KTP) lasers are the most commonly used laser systems. The aim here was to evaluate the acute effect of high-power laser systems after application of hypericin. This was an experimental animal study conducted in the Department of Urology, Gülhane Military Medical Academy, Ankara, Turkey, in 2012. Sixteen rats were randomized into four groups: 120 W KTP laser + hypericin; 120 W KTP laser alone; 80 W KTP laser + hypericin; and 80 W KTP laser alone. Hypericin was given intraperitoneally two hours prior to laser application. The laser incisions were made through the quadriceps muscle of the rats, and the depth and width of each incision were evaluated histologically and recorded. To standardize the effects of the laser, we used the ratio of depth to width, which expresses the depth of the laser application per unit width; these values were then evaluated statistically. Mean depth/width values were 231.6, 173.6, 214.1 and 178.9 in groups 1, 2, 3 and 4, respectively. The most notable result was that higher degrees of tissue penetration were achieved in the groups with hypericin (P<0.05). | The encouraging results from our preliminary study demonstrated that hypericin may improve the effects of KTP laser applications. | closed_qa |
Do value thresholds for oncology drugs differ from nononcology drugs? | In the past decade, many oncologic drugs have been approved that extend life and/or improve patients' quality of life. However, new cancer drugs are often associated with high prices and increased medical spending. For example, in 2010, the average annual cost of care for breast cancer in the final stage of disease was reported to be $94,284, and the total estimated cost in the United States was $16.50 billion. To determine whether the value threshold, as defined by the incremental cost-effectiveness ratio (ICER), differed between oncology and other therapeutic areas. The PubMed database was searched for articles published between January 2003 and December 2013 with a calculated ICER for therapeutic drug entities in a specific therapeutic area. The search terms used were "ICER" and "United States." From 275 results, only those articles that reported ICERs using quality-adjusted life-years (QALYs) were included. In addition, only those articles that used a U.S. payer perspective were retained; among those, nondrug therapy articles and review articles were excluded. The mean ICER and value threshold for oncologic drugs and nononcologic drugs were evaluated for the analysis. From the 54 articles selected for analysis, 13 pertained to drugs in oncology therapeutics, and the remaining 41 articles addressed ICERs for drugs in other therapeutic areas. The mean and median ICERs calculated for cancer-specific drug interventions were $138,582/QALY and $55,500/QALY, respectively, compared with $49,913/QALY and $31,000/QALY, respectively, for noncancer drugs. Among the cancer drugs, 45.0% had ICERs below $50,000/QALY and 70.0% below $100,000/QALY. In comparison, 72.0% of noncancer drugs showed ICERs below $50,000/QALY, and 90.0% had ICERs below $100,000/QALY. When a specific threshold was mentioned, it was in the range of $100,000-$150,000 for cancer drugs, whereas drugs in other therapeutic areas used a traditional threshold value within the range of $50,000-$100,000. | The average ICER reported for cancer drugs was more than 2-fold greater than the average ICER for noncancer drugs. In general, articles that addressed the relative value of oncologic pharmaceuticals used higher value thresholds and reported higher ICERs than articles evaluating noncancer drugs. | closed_qa |
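The ICER aggregated in the row above is a simple ratio of incremental cost to incremental effectiveness. A minimal sketch with placeholder costs and QALYs; the threshold check mirrors the $50,000-$100,000 range discussed in the article:

```python
def icer(cost_new, cost_comparator, qaly_new, qaly_comparator):
    """Incremental cost-effectiveness ratio, in $ per QALY gained."""
    return (cost_new - cost_comparator) / (qaly_new - qaly_comparator)

# Placeholder values: a drug costing $60,000 more and adding 0.5 QALYs.
ratio = icer(cost_new=110_000, cost_comparator=50_000,
             qaly_new=2.0, qaly_comparator=1.5)
print(f"ICER = ${ratio:,.0f}/QALY")  # $120,000/QALY

threshold = 100_000  # upper bound of the traditional willingness-to-pay range
print("below threshold" if ratio <= threshold else "above threshold")
```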
A case study in generic drug use: should there be risk adjustment in incentive payments for the use of generic medications? | Encouraging generic drug use has reduced health care costs for payers and consumers, but the availability of therapeutically interchangeable medications or generic medications of choice is not equal across disease states. The extent to which systems of care are able to substitute with generics is not well understood. To (a) define and measure the maximum generic rate (MGR) of currently prescribed drugs within an academic medical group and (b) illustrate differences across drugs associated with selected underlying diseases. Prescription claims data were examined from an academic medical group in Chicago, Illinois. Based on pharmacologic and therapeutic criteria, drugs were classified into 2 categories (potentially substitutable and not potentially substitutable) based on whether the drugs are branded forms of the same chemical entities that are available as generics or are therapeutically interchangeable with other medications that have different chemical compositions but the same mechanisms of action and potential efficacy. A medication was considered potentially substitutable if it (a) did not have a narrow therapeutic index as defined by the FDA; (b) did not belong to 1 of the 6 protected classes of drugs in the Medicare Part D provisions; and (c) was substitutable with a generic medication containing the same chemical entity or (d) was therapeutically interchangeable with a therapeutically equivalent medication. MGR was defined as the percentage of prescriptions that could potentially be prescribed in generic form. This rate was examined overall and across drugs known to be associated with illustrative diseases, including hypertension, diabetes mellitus, and obstructive lung diseases. The MGR ranged from 100% for drugs used in hypertension to 26.7% for drugs used in obstructive lung diseases. The overall MGR was 83.6%. | Payers wishing to promote generic substitution should incorporate the potential for substitution of clinically appropriate generic medications as part of incentives for generic utilization, to avoid the unintended consequences of using a fixed target rate. A practical methodology for determining an MGR is offered. | closed_qa |
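The MGR in the row above is the share of claims that pass the substitutability criteria. A minimal sketch of that classification over a toy claim set; the flags and drug names are invented for illustration, and criteria (c) and (d) are collapsed into a single "equivalent available" flag:

```python
# Toy claims flagged with the criteria from the text (all values invented).
claims = [
    {"drug": "drug_A", "narrow_index": False, "protected_class": False, "equivalent_available": True},
    {"drug": "drug_B", "narrow_index": True,  "protected_class": False, "equivalent_available": True},
    {"drug": "drug_C", "narrow_index": False, "protected_class": True,  "equivalent_available": True},
    {"drug": "drug_D", "narrow_index": False, "protected_class": False, "equivalent_available": False},
]

def potentially_substitutable(claim):
    # Not narrow therapeutic index, not in a Medicare Part D protected class,
    # and a generic or therapeutically interchangeable equivalent exists.
    return (not claim["narrow_index"] and not claim["protected_class"]
            and claim["equivalent_available"])

mgr = 100 * sum(potentially_substitutable(c) for c in claims) / len(claims)
print(f"MGR = {mgr:.1f}%")  # 25.0% for this toy claim set
```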
Is there an association between the high-risk medication star ratings and member experience CMS star ratings measures? | Methods used to achieve high star ratings for the High-Risk Medication (HRM) measure are thought to result in unintended consequences and to compromise several member experience measures, ultimately putting at risk the plan sponsor's Medicare Part D Centers for Medicare & Medicaid Services (CMS) star rating. To determine whether HRM scores are associated with relevant member experience measure scores. This is a cross-sectional analysis utilizing CMS 2013 and 2014 plan star ratings reports (2011 and 2012 benefit year data) for Medicare Advantage prescription drug (MA-PD) plans and prescription drug plans (PDPs). Medicare contracts with complete data for all measures of interest in the 2013 and 2014 star ratings reports were included (N = 443). Bivariate linear regressions were performed for each of 2 independent variables: (1) the 2014 HRM score and (2) the 2013 to 2014 change in HRM score. Dependent variables were the 2014 scores for "Getting Needed Prescription Drugs," "Complaints about Drug Plan," "Rating of Drug Plan," and "Members Choosing to Leave the Plan." The bivariate linear regressions demonstrated weak positive associations between the 2014 HRM score and each of the 2014 member experience measures, explaining 0.5% to 4% (R2) of the variance of these measures. The bivariate regressions for the 2013 to 2014 change in the HRM score and the 2014 member experience measures of interest demonstrated associations accounting for 1% to 8% of variance (R2). The greatest associations were observed between each independent variable and the 2014 "Getting Needed Prescription Drugs" score, with correlation coefficients of 0.21 and 0.29. | HRM star ratings and change in HRM star ratings are weakly correlated with member experience measures in concurrent measurement periods. Plan sponsors may be more aggressive in HRM utilization management, since it is unlikely to negatively impact CMS summary star ratings. | closed_qa |
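The weak associations above illustrate how little variance a small correlation explains: R² is simply the square of r, so r = 0.29 explains about 8% of variance. A minimal sketch with simulated plan-level scores (the data-generating numbers are arbitrary, not the study's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated scores for 443 contracts, matching the study's sample size.
hrm_score = rng.uniform(1, 5, 443)
member_experience = 3.0 + 0.1 * hrm_score + rng.normal(0, 0.5, 443)

fit = stats.linregress(hrm_score, member_experience)
print(f"r = {fit.rvalue:.2f}, R^2 = {fit.rvalue**2:.3f}")

# The reported correlations translate to small explained variance:
print(f"r = 0.21 -> R^2 = {0.21**2:.3f}; r = 0.29 -> R^2 = {0.29**2:.3f}")
```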
Does nonsurgical treatment improve longitudinal outcomes of lateral epicondylitis over no treatment? | Lateral epicondylitis is a painful tendinopathy for which several nonsurgical treatment strategies are used. Superiority of these nonsurgical treatments over nontreatment has not been definitively established. We asked whether nonsurgical treatment of lateral epicondylitis compared with observation only or placebo provides (1) better overall improvement, (2) less need for escape interventions, (3) better outcome scores, and (4) improved grip strength at intermediate- to long-term followup. The English-language literature was searched using PubMed and the Cochrane Central Register of Controlled Trials. Randomized controlled trials (RCTs) comparing any form of nonsurgical treatment with either observation only or placebo at followup of at least 6 months were included. Nonsurgical treatments included injections (corticosteroid, platelet-rich plasma, autologous blood, sodium hyaluronate, or glycosaminoglycan polysulfate), physiotherapy, shock wave therapy, laser, ultrasound, corticosteroid iontophoresis, topical glyceryl trinitrate, or oral naproxen. Methodologic quality was assessed with the Consolidated Standards of Reporting Trials (CONSORT) checklist, and 22 RCTs containing 2280 patients were included. Pooled analyses were performed to evaluate overall improvement; requirement for escape interventions (treatment of any kind, outside consultation, and surgery); outcome scores (Patient-Rated Tennis Elbow Evaluation [PRTEE]; DASH; Pain-Free Function Index [PFFI]; EuroQoL [EQ]-5D; and overall function); and maximum and pain-free grip strength. Sensitivity analyses were performed using only trials of excellent or good quality. Heterogeneity analyses were performed, and funnel plots were constructed to assess for publication bias. Nonsurgical treatment was not favored over nontreatment based on overall improvement (risk ratio [RR] = 1.05 [0.96-1.15]; p = 0.32), need for escape treatment (RR = 1.50 [0.84-2.70]; p = 0.17), PRTEE scores (mean difference [MD] = 1.47 [0.68-2.26]; p<0.001), DASH scores (MD = -2.69 [-15.80 to 10.42]; p = 0.69), PFFI scores (standardized mean difference [SMD] = 0.25 [-0.32 to 0.81]; p = 0.39), overall function using change-from-baseline data (SMD = 0.11 [-0.14 to 0.36]; p = 0.37) and final data (SMD = -0.16 [-0.79 to 0.47]; p = 0.61), EQ-5D scores (SMD = 0.08 [-0.52 to 0.67]; p = 0.80), maximum grip strength using change-from-baseline data (SMD = 0.12 [-0.11 to 0.35]; p = 0.31) and final data (SMD = 4.37 [-0.65 to 9.38]; p = 0.09), and pain-free grip strength using change-from-baseline data (SMD = -0.20 [-0.84 to 0.43]; p = 0.53) and final data (SMD = -0.03 [-0.61 to 0.54]; p = 0.91). | Pooled data from RCTs indicate a lack of intermediate- to long-term clinical benefit after nonsurgical treatment of lateral epicondylitis compared with observation only or placebo. | closed_qa |
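The pooled risk ratios above come from combining trial-level effect estimates. A minimal fixed-effect, inverse-variance pooling sketch on the log-RR scale, using made-up 2×2 counts rather than the review's trial data (the review's actual pooling model may have differed, e.g. random-effects):

```python
import math

# Hypothetical per-trial counts: (events_tx, n_tx, events_control, n_control).
trials = [(30, 50, 28, 50), (45, 80, 40, 78), (12, 25, 14, 27)]

log_rrs, weights = [], []
for a, n1, c, n0 in trials:
    log_rr = math.log((a / n1) / (c / n0))
    var = 1 / a - 1 / n1 + 1 / c - 1 / n0  # variance of the log risk ratio
    log_rrs.append(log_rr)
    weights.append(1 / var)                # inverse-variance weight

pooled = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled RR = {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```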
KIR genotype distribution among symptomatic patients with and without Helicobacter pylori infection: is there any role for the B haplotype? | Contact of peripheral blood lymphocytes with Helicobacter pylori has been shown to induce non-major histocompatibility complex-restricted cytotoxicity, and natural killer cells are thought to play an important role in immunity against H. pylori. In this study, we investigated possible associations between killer immunoglobulin-like receptor (KIR) genotypes and H. pylori infection. KIR genotypes were analysed in 101 Lebanese symptomatic patients (51 H. pylori-positive and 50 H. pylori-negative) using the KIR Genotyping SSP kit. Among the H. pylori-positive patients, the AA, AB and BB genotype frequencies were, respectively, 43.14%, 41.18% and 15.68%, with an A:B ratio of 1.76:1. The AA, AB and BB genotype frequencies for H. pylori-negative individuals were 18%, 62% and 20%, respectively, with an A:B ratio of 0.96:1. No significant difference between patients and controls was detected. | We noticed a reduced frequency of the A haplotype among the "H. pylori-negative" patients as compared with the "H. pylori-positive" group. This is the first study in the international literature to target the correlation between KIR genotypes and H. pylori. | closed_qa |
Does noninvasive ventilation delivery in the ward provide early effective ventilation? | Although noninvasive ventilation (NIV) is increasingly used in general wards, limited information exists about its ability to provide effective ventilation in this setting. We aimed to evaluate NIV delivered in the ward by assessing (1) the overall time of application and occurrence of adverse events and (2) differences between daytime and nighttime NIV application. We studied subjects with acute hypercapnic respiratory failure not fulfilling strict criteria for ICU admission, and excluded those who interrupted NIV prior to 48 h. Time spent on NIV, presence and extent of air leaks, and occurrence of desaturations were assessed for the overall study period and compared between daytime and nighttime. We enrolled 42 subjects, 25 of whom received NIV for at least 48 h and were included in the data analysis. NIV was successful for 20 subjects, who did not reach criteria for ICU admission. Both PaCO2 and pH significantly improved on average after 2 h and at the end of the study period. NIV was applied for 64.5% of the overall study period, with absent or compensated air leaks for 62.3% of the overall 48-h period. NIV was applied for 55.8% of daytime and for 79.3% of nighttime (P<.01). Effective NIV application was significantly longer overnight (76.9%) than during daytime (53.2%) (P<.01). | In selected subjects with hypercapnic acute respiratory failure not fulfilling criteria for ICU admission, the application of NIV in the ward is feasible; in addition, NIV can be safely administered overnight. | closed_qa |
Is neck circumference measurement an indicator of abdominal obesity? | Neck circumference (NC) is a simple screening measurement that can be used as an index of upper-body fat distribution to identify obesity. The aim of this study was to determine the relationship between neck circumference and obesity. A total of 411 adult volunteers participated in this study (174 men, 237 women). Anthropometric measurements and demographic features were collected with a questionnaire. Patients with NC ≥37 cm for men and ≥34 cm for women require evaluation of overweight status. The percentages of men and women with BMI ≥ 25 kg/m(2) were 55.2% and 27.0%, respectively, and with high neck circumference 85.1% and 38.8%, respectively. The percentages of men and women with high waist circumference were 31.6% and 79.3%, respectively. In both genders there were significant positive correlations between neck circumference and body weight (men, r=0.576; women, r=0.702; p=0.000), waist circumference (men, r=0.593; women, r=0.667; p=0.000), hip circumference (men, r=0.568; women, r=0.617; p=0.000) and BMI (men, r=0.587; women, r=0.688; p=0.000). | This study indicates that NC was associated with body weight, BMI, waist and hip circumferences and waist/hip ratio for men and women. A significant association was found between NC and conventional overweight and obesity indexes. | closed_qa |
Management of LBP at primary care level in South Africa: up to standards? | Primary Health Care (PHC) is well suited to the management of low back pain (LBP). The prevalence of (chronic) LBP is suspected to be high among visitors to South African primary care centres, but currently no information exists on prevalence or guideline adherence. To establish whether the treatment received for LBP in public PHC in the Cape Town area compares with international evidence-based guidelines. Cluster randomization determined the 8 community health centres where the study took place. A measurement tool was developed and validated for this population, and descriptive analysis and logistic regression techniques were applied. 489 participants (mean age: 44.8 years) were included in this study. Lifetime prevalence was 73.2%, and 26.3% suffered from chronic low back pain (CLBP). Pain medication was the only form of treatment received by 90% of the sample. Interventions received seemed to be unrelated to the type of LBP (acute, subacute or chronic). Referral to physiotherapy, education and advice to stay active were rarely provided, and participants expressed low satisfaction with treatment. | Current management of LBP at the PHC level appears to be ineffective and does not conform to guidelines. Further South African research should focus on barriers to, as well as measures to be taken for, the implementation of LBP guidelines. | closed_qa |
Does chronic rhinosinusitis increase the risk of lung cancer? | Chronic rhinosinusitis is one of the most common chronic inflammatory diseases of the upper airway. A previous study of chronic rhinosinusitis and the risk of lung cancer was based on a self-reported questionnaire concerning rhinosinusitis, and population-based cohort studies of the correlation between chronic rhinosinusitis and the adenocarcinoma subtype of lung cancer have been limited. In the present study, we used a population-based database to investigate the risk of the adenocarcinoma subtype of lung cancer among patients with chronic rhinosinusitis. We identified 13 072 patients who were diagnosed with chronic rhinosinusitis in 1998-2010 as the exposure group, and 52 288 randomly selected patients as the comparison cohort. We used data from the Taiwan National Health Insurance Research Database; the Taiwan National Health Insurance programme offers health-care services to 99% of the 23 million people residing in Taiwan. We compared the incidence of the adenocarcinoma subtype of lung cancer between the two cohorts, and calculated the cumulative incidence and hazard ratios of developing this cancer. The risk of the adenocarcinoma subtype of lung cancer was higher in the chronic rhinosinusitis cohort than in the comparison cohort, with an adjusted hazard ratio of 3.52 after controlling for age and gender. | This large population-based cohort study demonstrated that patients in Taiwan with previous chronic rhinosinusitis are at greater risk of developing the adenocarcinoma subtype of lung cancer. | closed_qa |
Is human T-lymphotropic virus type 1 infection associated with hearing loss? | Human T-lymphotropic virus type 1 (HTLV-1) infection is endemic in northeastern Iran. Although various neurological disturbances have been reported in HTLV-1 infection, possible audiovestibular involvement during this infection has not yet been studied. Case-control study. Sixty-eight participants in three groups entered our study: 24 HTLV-1-infected patients with HTLV-1-associated myelopathy/tropical spastic paraparesis (HAM/TSP) (group 1), 23 HTLV-1-infected cases without clinical presentation (group 2), and 21 normal individuals (group 3). A complete history of hearing-related disorders was taken, and a profile of audiologic tests was performed, including pure-tone audiometry (PTA) with high frequencies, speech reception threshold (SRT), and auditory brainstem response (ABR). Subjective audiovestibular complaints differed significantly between HAM/TSP patients and the two other groups regarding hearing loss and tinnitus, but not vertigo or aural fullness. Hearing evaluation by SRT and PTA at all frequencies showed a significant difference between HAM/TSP patients (group 1) and the controls (group 3). The difference between asymptomatic cases (group 2) and the controls was significant only at PTA frequencies above 4 kHz. Auditory brainstem-evoked potentials did not show any significant differences among the groups regarding the latencies of waves I, III, and V or interwave differences. | HTLV-1 infection, particularly in those with a clinical presentation, appears to be associated with hearing loss. Based on the results of the PTA and ABR tests, this study may suggest a cochlear source of hearing impairment rather than a neural problem. | closed_qa |
Is preeclampsia associated with fetal malformation? | To examine fetal malformations in mother-infant pairs with and without pregnancy-related hypertension. This was an observational, population-based study of women delivering a singleton at our hospital. Specific fetal malformations identified in women with gestational hypertension or preeclampsia were compared to those in women without pregnancy-related hypertension. Women with chronic hypertension, preeclampsia superimposed on chronic hypertension, or pregestational diabetes were excluded. Between March 2002 and December 2012, a total of 151 997 women delivered; 10 492 (7%) had preeclampsia, 4282 (3%) had gestational hypertension and 137 223 (90%) were referent normotensive controls. Women with preeclampsia were significantly more likely to deliver infants with malformations when compared to normotensive controls (2.5% versus 1.6%, p < 0.001), whereas women with gestational hypertension were not (1.9% versus 1.6%, p = 0.16). The overall risk of fetal malformation associated with preeclampsia remained significant following logistic regression adjusting for age, race, parity and maternal body habitus (adjusted OR 1.5; 95% CI: 1.3-1.7). Only single-organ system malformations - microcephaly and hypospadias - remained associated with preeclampsia (p < 0.001), and fetal growth restriction was a co-factor for both. | Preeclampsia was associated with increased rates of fetal malformations when compared to normotensive women - specifically microcephaly and hypospadias. These associations appear predominantly as a consequence of impaired fetal growth. | closed_qa |
Transfusion risk: is "two-step" vaginal delivery a risk for postpartum hemorrhage? | In the active management of the third stage of labor, the optimal timing for clamping the umbilical cord after birth has been a subject of controversy. We aimed to evaluate whether "two-step" delivery is a risk factor for postpartum hemorrhage (PPH), defined as the need for transfusion, compared with operative delivery, elective cesarean delivery and emergency cesarean delivery. This retrospective cohort study was conducted in the Division of Perinatal Medicine, Policlinico Abano Terme. We evaluated the need for transfusion in all cases of PPH among all singleton deliveries between January 2011 and December 2012. The main outcome measures were blood loss and red blood cell transfusion. We found 17 cases of PPH (0.88%). The distribution of PPH in relation to mode of delivery was 0.71%, 2.46% and 1.98%, respectively, for two-step vaginal delivery (RR = 0.81 (0.56-1.22)), emergency cesarean section (RR = 2.88 (1.27-7.77)) and operative vaginal delivery (RR = 2.88 (0.59-5.66)). After labor induction, there was a significantly stronger association between PPH and both emergency cesarean delivery (p < 0.05) and operative vaginal delivery (p < 0.05). | The "two-step" delivery approach did not increase the risk of PPH with respect to operative delivery, elective cesarean section and emergency cesarean section. | closed_qa |
Complex febrile seizures: should we change the way we act? | Febrile seizures are one of the most frequent reasons for visits to the healthcare specialist. Up until now, patients with complex febrile seizures (CFS) have been hospitalised, bearing in mind the higher rates of epilepsy and acute complications that were classically reported. Today there are studies that support a less invasive management of these patients. Our aims were to describe the characteristics of patients hospitalised due to CFS and to propose a new management protocol for such cases. The medical records of patients hospitalised because of CFS (January 2010-December 2013) were analysed retrospectively. Epidemiological and clinical data are presented, together with information from complementary tests and the clinical course. CFS accounted for 4.2% of all neuropaediatric hospital admissions (67 patients). Mean age at the time of the event was 25 months. A relevant family history existed in 47% of cases, and 31% had a previous personal history of febrile seizures. The CFS lasted less than five minutes in 54% of patients. Recurrences were common, most consisting of a total of two seizures occurring within the first day; CFS defined by recurrence were the most frequent type. None of the complementary tests carried out were useful as a diagnostic aid during the acute phase. During follow-up, five patients presented complications. Patients with a family history of febrile seizures presented a higher risk of epilepsy or recurrence (p = 0.02), with no significant differences as regards age, number of seizures, febrile interval, epileptic status or type of CFS. | CFS are not associated with greater acute complications, and complementary examinations do not allow high-risk patients to be identified at an early stage. Hospitalisation could be avoided in the absence of other clinical signs and symptoms, and thus be limited to selected cases. | closed_qa |
Is Hepatic Resection for Large or Multifocal Intrahepatic Cholangiocarcinoma Justified? | The role of surgical resection for patients with large or multifocal intrahepatic cholangiocarcinoma (ICC) remains unclear. This study evaluated the long-term outcome of patients who underwent hepatic resection for large (≥7 cm) or multifocal (≥2) ICC. Between 1990 and 2013, 557 patients who underwent liver resection for ICC were identified from a multi-institutional database. Clinicopathologic characteristics, operative details, and long-term survival data were evaluated. Of the 557 patients, 215 (38.6 %) had a small, solitary ICC (group A) and 342 (61.4 %) had a large or multifocal ICC (group B). The patients in group B underwent an extended hepatectomy more frequently (16.9 vs. 30.4 %; P<0.001). At the final pathology exam, the patients in group B were more likely to show evidence of vascular invasion (22.5 vs. 38.5 %), direct invasion of contiguous organs (6.5 vs. 12.9 %), and nodal metastasis (13.3 vs. 21.0 %) (all P<0.05). Interestingly, the incidences of postoperative complications (39.3 vs. 46.8 %) and hospital mortality (1.1 vs. 3.7 %) were similar between the two groups (both P>0.05). The group A patients had better rates for 5-year overall survival (OS) (30.5 vs. 18.7 %; P<0.05) and disease-free survival (DFS) (22.6 vs. 8.2 %; P<0.05) than the group B patients. For the patients in group B, the factors associated with a worse OS included more than three tumor nodules [hazard ratio (HR), 1.56], nodal metastasis (HR, 1.47), and poor differentiation (HR, 1.48). | Liver resection can be performed safely for patients with large or multifocal ICC. The long-term outcome for these patients can be stratified on the basis of a prognostic score that includes tumor number, nodal metastasis, and poor differentiation. | closed_qa |
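The conclusion above refers to a prognostic score built from tumor number, nodal metastasis, and differentiation. The abstract does not specify how the published score is constructed; one conventional approach, sketched below, assigns each adverse factor points proportional to its log hazard ratio (the scoring scheme itself is an assumption for illustration):

```python
import math

# Hazard ratios reported for group B (large or multifocal ICC).
HAZARD_RATIOS = {
    "more_than_3_nodules": 1.56,
    "nodal_metastasis": 1.47,
    "poor_differentiation": 1.48,
}

def prognostic_score(patient):
    """Sum of log(HR) over the adverse factors present (illustrative only)."""
    return sum(math.log(hr) for name, hr in HAZARD_RATIOS.items()
               if patient.get(name, False))

# Illustrative patient with two of the three adverse factors.
patient = {"nodal_metastasis": True, "poor_differentiation": True}
score = prognostic_score(patient)
print(f"score = {score:.2f}; combined HR vs. no factors = {math.exp(score):.2f}")
```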
Cervical lymph node metastasis in differentiated thyroid carcinoma: does it have an impact on disease-related morbid events? | The aim of the study was to evaluate the prognostic impact of cervical lymph node (CLN) metastasis in well-differentiated thyroid cancer (WDTC). A cohort of 164 patients who underwent thyroidectomy for WDTC over the previous 12 years was studied. Patients were classified into those with CLN metastasis (group 1) and those without (group 2), with 82 patients in each group. Morbid events (recurrence and death) and prognostic risk factors were analyzed. The mean age of the whole group was 41.3±14.2 years, with 52.4% being at least 45 years old; 69.6% of the patients were female (P=0.02), with a female to male ratio of 2.3:1. Near-total thyroidectomy and radioactive iodine-131 therapy were performed in all of the 81.1% of patients in whom papillary cancer was found. Morbid events were found in 22/164 (13.4%) patients (17 recurrences and five deaths). All recurrences occurred in patients with the papillary subtype, whereas 4/5 deaths occurred in those with follicular thyroid cancer. Morbid events were found in 14/82 (17.1%) and 8/82 (9.6%) patients in group 1 and group 2, respectively (P=0.1). Comparison of morbid events between group 1 and group 2 showed statistically significant differences with respect to tumor size of 1 cm or more, multifocality, and positivity of the follow-up diagnostic tools at 6-12 months. Kaplan-Meier analysis revealed no significant difference between the two groups (P=0.4) over a mean follow-up period of 125.1±14.6 months. Multivariate Cox analysis showed that development of remote deposits, lymph node metastasis in five or more nodes, extracapsular nodal invasion in at least three nodes, and positivity of the follow-up diagnostic tools at 6 months were the significant prognostic factors for morbid events. | Our study revealed the impact of CLN metastasis on the occurrence of morbid events in WDTC patients (especially in those with the papillary subtype) with metastasis in five or more nodes, extracapsular nodal invasion in at least three nodes, and positivity of follow-up diagnostic tools at 6 months. Hence, these patients should be optimally managed and more closely monitored. | closed_qa |
Can nasal decongestants improve eustachian tube function? | To evaluate the effect of nasal decongestants on eustachian tube (ET) opening. A prospective nonrandomized study at a tertiary referral center. Twenty-four patients (44 ears) with intact eardrums, 39 patients (43 ears) with a noninfected eardrum defect, and six patients with an upper airway infection. Nasal or intratympanic (in perforated ears) application of a nasal decongestant (xylometazoline 0.1%). Change in tube opening quality (yes or no; better or worse) was assessed by measuring tube opening parameters (pressure, latency) using the Estève method and by pressure equalization tests (swallowing at negative and positive external ear canal pressures). In most cases, nasal decongestion or intratympanic use of decongestants had no effect on ET opening. Improvement in tube opening was the exception and, in a minority of patients, reduced ET function was evident. | Our acute studies revealed no improvement in eustachian ventilatory tube function with the administration of nasal decongestants. | closed_qa |
HER2 immunohistochemical assessment with A0485 polyclonal antibody: is it time to refine the scoring criteria for the "2+" category? | An accurate determination of human epidermal growth factor receptor 2 (HER2) status in women with breast cancer is mandatory to identify patients who will benefit from trastuzumab-based therapy. HER2 immunohistochemical analysis (IHC) (performed with the A0485 polyclonal antibody) of 943 invasive breast cancer cases was evaluated independently and blindly twice by 3 of us (V.A., I.P., and A.C.) according to DAKO scoring criteria. A total of 230 cases of invasive breast cancer that scored 2+ at IHC and were consequently evaluated by FISH were reviewed, first independently and then simultaneously by 3 of us (V.A., I.P., and A.C.) at a multiheaded microscope, assessing the following parameters: overall signal intensity, granularity and continuity of membrane staining, and the presence of a band-like membrane pattern in >25% of tumor cells. The frequencies of HER2 gene amplification for all the immunohistologic parameters (individually considered or in combination) were compared by Pearson χ² analysis. Combinations of staining patterns did not give any statistically significant results, except when combining strong staining intensity and continuity of membrane signal. In fact, only 9 of the 86 cases with weak-to-moderate staining intensity and a fragmented membrane signal were amplified by FISH, whereas 19 of the 51 cases presenting an overall strong IHC reaction and some extent of continuous membrane signal were FISH amplified (P=0.002). | Combined intensity and linearity of membrane signal, although of limited value, proved the best aid (P=0.0002) in making the final score decision in borderline HER2 IHC tests, similar to what is envisaged in the HER2 scoring system for gastric cancer. | closed_qa |
Concomitant MAZE procedure during cardiac surgical procedures: is there any survival advantage in conversion to sinus rhythm? | We retrospectively evaluated the rate of conversion to sinus rhythm (SR) and its correlation with long-term survival in 209 patients with chronic AF who had a MAZE procedure during cardiac surgical procedures between 2006 and 2011 at our institution. The mean age was 67.2 ± 12.0 years and 52.2% were female (N. = 109). Perioperative mortality was 5.74% (N. = 12). In univariate analysis, significant risk factors for perioperative mortality were age (P = 0.0033), duration of perfusion time (P = 0.0093), elevated creatinine (≥ 1.6 mg/dL, P = 0.02), and cross-clamp time (P = 0.016). In multivariate analysis, age (HR 2.97) and duration of perfusion time (HR 1.48) were the only independent predictors of perioperative mortality. The overall one- and five-year survival rates were 88% ± 2.2% and 76% ± 3.3%, respectively. The one- and five-year survival rates for patients who converted and were in SR upon discharge (N. = 154) were 88% ± 2.6% and 80% ± 3.5%, respectively, while those for patients who were still in AF upon discharge (N. = 55) were 94% ± 3% and 82% ± 6.6%, respectively; this survival difference was not statistically significant (P = 0.24). Significant risk factors for long-term mortality included DM (P = 0.023), preoperative MI (P = 0.043), preoperative renal insufficiency (creatinine ≥ 1.6 mg/dL, P = 0.02) and asthma/COPD (P = 0.040). In multivariate analysis, age (HR 1.048) and preoperative MI (HR 1.948) were the only independent predictors of long-term mortality. | The surgical MAZE procedure has a high conversion rate; however, our data did not show improved survival in patients who converted to SR prior to discharge. | closed_qa |
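The survival rates above are Kaplan-Meier estimates. The product-limit formula S(t) = Π over event times (1 − d_i/n_i) is easy to compute by hand; a teaching sketch on toy follow-up data (not the study's), handling censored observations:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate; events: 1 = death, 0 = censored."""
    # At tied times, process deaths before censorings (standard convention).
    at_risk = len(times)
    surv, curve = 1.0, []
    for t, d in sorted(zip(times, events), key=lambda p: (p[0], -p[1])):
        if d:
            surv *= 1 - 1 / at_risk   # one death among those still at risk
            curve.append((t, surv))
        at_risk -= 1                  # death or censoring leaves the risk set
    return curve

# Toy data: follow-up in months and death indicators.
print(kaplan_meier([3, 12, 24, 30, 60, 60], [1, 0, 1, 0, 1, 0]))
# [(3, 0.833...), (24, 0.625), (60, 0.3125)]
```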
Are school children ready to donate blood? | Voluntary non-remunerated blood donors are considered the best among the different types of blood donors for improving the supply of safe blood. Though safe blood transfusion services have improved in Pakistan, efforts are still required to optimize blood banks and improve the recruitment of voluntary donors, such as senior school/college-going students, as a source of safe blood. This study looks into the awareness of senior school children concerning blood-related issues, including blood donation. This cross-sectional descriptive study enrolled 106 senior school students of private schools. Data were collected through a self-administered questionnaire. Although 90% of the students considered blood an important entity for saving lives, 56.8% had never thought of donating blood. Respondents had good knowledge regarding the possible spread of HIV/AIDS and hepatitis B and C through unsafe transfusions. Possible hindrances to donating blood included fear of needles, fear of acquiring disease, lack of knowledge regarding where to donate blood and lack of trust in blood banks. More than half of the students believed that blood should be bought from professional blood donors. | Senior school going children are not ready to donate blood. Lack of knowledge and prevailing misconceptions regarding blood transfusions need to be addressed and mechanisms to motivate and mobilize youth for becoming voluntary blood donors need to be established. | closed_qa |
Do practice characteristics explain differences in morbidity estimates between electronic health record based general practice registration networks? | General practice based registration networks (GPRNs) provide information on population health derived from electronic health records (EHRs). Morbidity estimates from different GPRNs reveal considerable, unexplained differences, and previous research showed that population characteristics could not explain this variation. In this study we investigated the influence of practice characteristics on the variation in incidence and prevalence figures between general practices and between GPRNs. We analyzed the influence of eight practice characteristics, such as type of practice, percentage of female general practitioners, and employment of a practice nurse, on the variation in morbidity estimates of twelve diseases between six Dutch GPRNs. We used multilevel logistic regression analysis and expressed the variation between practices and between GPRNs in median odds ratios (MORs). Furthermore, we analyzed the influence of the type of EHR software package and of province within one large national GPRN. Hardly any practice characteristic showed an effect on morbidity estimates. Adjusting for the practice characteristics also did not alter the variation between practices or between GPRNs, as MORs remained stable. The EHR software package 'Medicom' and the province 'Groningen' showed significant effects on the prevalence figures of several diseases, but this hardly diminished the variation between practices. | Practice characteristics do not explain the differences in morbidity estimates between GPRNs. | closed_qa |
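The MOR used above translates the between-cluster variance of a multilevel logistic model onto the odds-ratio scale via Larsen and Merlo's formula, MOR = exp(√(2σ²) · Φ⁻¹(0.75)). A minimal sketch with a placeholder random-intercept variance (the variance value is illustrative, not from the study):

```python
import math
from scipy.stats import norm

def median_odds_ratio(random_intercept_variance):
    """MOR = exp(sqrt(2 * sigma^2) * Phi^-1(0.75)); 1.0 means no variation."""
    return math.exp(math.sqrt(2 * random_intercept_variance) * norm.ppf(0.75))

# Placeholder practice-level variance from a fitted multilevel logistic model.
sigma2 = 0.35
print(f"MOR = {median_odds_ratio(sigma2):.2f}")  # ~1.76 for sigma^2 = 0.35
```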
Do immunocompromised children benefit from having surgical lung biopsy performed? | Surgical lung biopsy is considered a gold standard for the evaluation of pulmonary disease in immunocompromised children. However, in the literature, its accuracy and the rate of complications vary. We aimed to evaluate the yield of surgical lung biopsies in the management of persistent pulmonary findings in immunocompromised children. We performed a retrospective review of the clinical records of immunocompromised children who underwent surgical lung biopsies, and evaluated the impact of preoperative factors on outcomes. Twenty-five patients underwent 27 surgical lung biopsies. The underlying immunodeficiencies included allogeneic stem cell transplantation (n = 12), chemotherapy for solid tumors (n = 6), hematologic malignancy (n = 4), primary immunodeficiency (n = 4) and chronic steroid use (n = 1). Biopsies provided a specific histopathologic or microbiologic diagnosis in 10 cases (37%). No preoperative factor predicted a diagnostic biopsy. Five of the 27 biopsies (18%) were beneficial for the patients. A major complication related to the procedure was reported for 1 biopsy (4%). | We conclude that surgical lung biopsy in pediatric immunocompromised patients appears to be safe, but has a relatively low diagnostic yield and an even lower yield with regard to the benefit it provides. | closed_qa |
Is a specialist breathlessness service more effective and cost-effective for patients with advanced cancer and their carers than standard care? | Breathlessness is common in advanced cancer. The Breathlessness Intervention Service (BIS) is a multi-disciplinary complex intervention, theoretically underpinned by a palliative care approach, utilising evidence-based non-pharmacological and pharmacological interventions to support patients with advanced disease. We sought to establish whether BIS was more effective, and more cost-effective, for patients with advanced cancer and their carers than standard care. A single-centre Phase III fast-track single-blind mixed-method randomised controlled trial (RCT) of BIS versus standard care was conducted. Participants were randomised to one of two groups (randomly permuted blocks). A total of 67 patients referred to BIS were randomised (intervention arm n = 35; control arm n = 32, who received BIS after a two-week wait); 54 completed the key outcome measurement. The primary outcome measure was a 0 to 10 numerical rating scale for patient distress due to breathlessness at two weeks. Secondary outcomes were evaluated using the Chronic Respiratory Questionnaire, Hospital Anxiety and Depression Scale, Client Services Receipt Inventory, EQ-5D and topic-guided interviews. BIS reduced patient distress due to breathlessness (primary outcome: -1.29; 95% CI -2.57 to -0.005; P = 0.049) significantly more than the control group; 94% of respondents reported a positive impact (51/53). BIS reduced fear and worry, and increased confidence in managing breathlessness. Patients and carers consistently identified specific and repeatable aspects of the BIS model and interventions that helped, and how interventions were delivered was important. BIS legitimised breathlessness and increased knowledge whilst making patients and carers feel 'not alone'. BIS had a 66% likelihood of better outcomes in terms of reduced distress due to breathlessness at lower health/social care costs than standard care (81% with informal care costs included). | BIS appears to be more effective and cost-effective in advanced cancer than standard care. | closed_qa |
Prospective Analysis of Payment per Hour in Head and Neck Reconstruction: Fiscally Feasible or Futile? | The authors assess the fiscal viability of complex head and neck reconstructive surgery by evaluating its financial reimbursement in the setting of the resources used. The authors prospectively assessed provider reimbursement for consecutive patients undergoing head and neck reconstruction. Total care time was determined by adding 15 minutes to the operative time for each postoperative hospital day and each postoperative follow-up appointment within the 90-day global period. Physician reimbursement was divided by total care time hours to determine an hourly rate of reimbursement. A control group of patients undergoing carpal tunnel release was evaluated using the same methods. A total of 50 patients met the inclusion criteria. The payer was Medicaid for nine patients (18 percent), Medicare for 19 patients (38 percent), and commercial for 22 patients (44 percent). The average provider revenue per case was $3241.01 ± $2500.65. For all patients, the mean operative time was 10.6 ± 3.87 hours and the mean number of postoperative hospital days was 15.1 ± 8.06. The mean reimbursement per total care time hour was $254 ± $199.87. Statistical analysis demonstrated a difference in reimbursement per total care time hour when grouped by insurance type (p = 0.002) or flap type (p = 0.033). Of the 50 most recent patients to undergo carpal tunnel release, the average revenue per case was $785.27. | Total care time analysis demonstrates that physician reimbursement is not commensurate with the resources used for complex head and neck reconstructive surgery. | closed_qa |
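The reimbursement-per-hour metric above is straightforward arithmetic once total care time is defined. A minimal sketch using values close to the reported means; the follow-up visit count is an assumption, since the abstract does not report it:

```python
def reimbursement_per_hour(revenue, operative_hours, hospital_days, followup_visits):
    """Total care time = operative hours + 15 min (0.25 h) per postoperative
    hospital day + 15 min per follow-up visit in the 90-day global period."""
    total_care_hours = operative_hours + 0.25 * (hospital_days + followup_visits)
    return revenue / total_care_hours

# Near the reported means: $3,241 revenue, 10.6 h operative time,
# 15 postoperative days; 4 follow-up visits is a placeholder.
rate = reimbursement_per_hour(3241, 10.6, 15, 4)
print(f"${rate:.0f} per total care time hour")
```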
Is inhaled colistin beneficial in ventilator associated pneumonia or nosocomial pneumonia caused by Acinetobacter baumannii? | In the present study, our objective was to evaluate and compare the clinical and microbiological results in patients receiving systemic versus systemic plus inhaled colistin therapy for nosocomial pneumonia (NP) or ventilator-associated pneumonia (VAP) caused by Acinetobacter baumannii. A retrospective matched case-control study was performed in the ICUs of Izmir Katip Celebi University Ataturk Training and Research Hospital from January 2013 to December 2014. Eighty patients who received only systemic colistin were matched with 43 patients who received systemic colistin combined with inhaled therapy. In 97.6% of the patients, colistin was co-administered with at least one additional antibiotic; the most frequently co-administered antibiotics were carbapenems (79.7%). The patient groups did not differ significantly in terms of the non-colistin antibiotics used for treatment (p > 0.05). Acute renal injury was observed in 53.8% and 48.8% of the patients who received parenteral colistin or parenteral plus inhaled colistin, respectively (p = 0.603). There were no significant differences between the groups in terms of clinical success (p = 0.974), clinical failure (p = 0.291), or recurrence (p = 0.094); only a significantly higher partial clinical improvement rate was observed in the systemic colistin group (p = 0.009). No significant differences between the two groups in terms of eradication (p = 0.712), persistence (p = 0.470), or recurrence (p = 0.356) rates were observed. The one-month mortality rate was similar in the systemic (47.5%) and systemic plus inhaled (53.5%) treatment groups (p = 0.526). | Our results suggest that the combination of inhaled colistin with intravenous colistin had no additional therapeutic benefit in terms of clinical or microbiological outcomes. | closed_qa |
Active Staphylococcus aureus infection: Is it a contra-indication to the repair of complex hernias with synthetic mesh? | The management of chronic mesh infection is challenging and controversial. The use of synthetic material to repair the abdominal wall in the infected setting is not recommended, especially in the presence of active infection caused by Staphylococcus aureus. This prospective observational study was designed to evaluate the outcomes of patients with active mesh infection caused by Staphylococcus aureus who underwent simultaneous removal and replacement of polypropylene mesh. The treatment protocol included the complete removal of the infected mesh, followed by anatomical reconstruction and reinforcement of the abdominal wall using a new onlay polypropylene mesh. Early and late wound complications, medical complications, and hernia recurrences were analyzed. From 2006 to 2014, 22 patients with a mean age of 57.2 years and a mean BMI of 29.3 kg/m2 were studied. Sinuses were present in 21 patients. A recurrent ventral hernia was observed in 14 patients; two patients required complex abdominal wall reconstruction due to enteric fistulas. Bowel resections or other potentially contaminated procedures were associated in 10 patients. Fourteen patients (63.6%) had an uneventful postoperative course; 5 (22.7%) had wound infections requiring debridement, and three required partial (2) or total (1) mesh removal. Two patients died due to medical complications. Adverse results on long-term follow-up included one hernia recurrence after complete mesh removal and one persistent sinus after partial mesh removal requiring a reoperation to remove mesh remnants. All of the patients were considered free of infection after a mean follow-up of 44 months. | Synthetic mesh replacement in patients with active Staphylococcus aureus infection has an acceptable incidence of postoperative wound infection and prevents hernia recurrence. Large-pore polypropylene mesh is a suitable material for use in an infected surgical field as an onlay graft. | closed_qa |
Mobile revolution: a requiem for bleeps? | Effective communication is a vital part of good clinical care. Traditionally bleep systems have been used as the mainstay of communication. Mobile technology is increasingly seen as a quicker, easier and more reliable method of communication. Our objective was to assess the use of mobile devices within a typical National Health Service (NHS) hospital, discuss potential benefits and pitfalls, and develop suggestions for future improvements. A survey of 600 hospital doctors was conducted in a large NHS district general hospital between 1 May and 30 June 2015. The questionnaire explored the patterns of use, attitudes and impact of mobile communication, and identified potential risks and benefits of its wider adoption within the NHS. 92% of doctors use their personal mobile for hospital-related work. 95% share their personal number with colleagues, and 64% have it available through hospital switchboard. 77% use their personal mobile to discuss patient matters, and 48% are prevented from communicating effectively due to poor signal within the hospital. 90% are contacted when not at work on a weekly or daily basis regarding patients. 73% feel that traditional bleeps should be replaced with new mobile technologies. | Mobile phone usage is very common among doctors, and is the preferred method of communication within the hospital. Mobile technology has the potential to revolutionise communication and clinical care and should be embraced. The introduction of new technology will inevitably change existing hospital dynamics, and consequently may create a new set of challenges that will require further work to explore in the future. | closed_qa |
Is ultrasonographic evaluation essential for diagnosis of retained products of conception after surgical abortion? | A total of 466 patients with a gestational age of <10 weeks were followed up on the 7th day after manual vacuum aspiration (MVA) of unwanted pregnancies. Patients who had intense or moderate bleeding and other symptoms related to retained products of conception (RPOC) underwent re-evacuation, and all patients were followed up until the next menstruation. Ultrasonographic evaluation was repeated weekly in asymptomatic patients with abnormal ultrasonographic findings (increased endometrial thickness; presence of hyperechogenic, mixed, or hypoechogenic material) until a normal endometrial cavity was visualized. Of the 466 patients, 15 (3.2%) had symptoms of RPOC at day 7, while the remaining 451 (96.8%) were asymptomatic and 20 (57.9%) had normal ultrasonographic findings. The 15 symptomatic patients (3.2%) underwent repeat MVA. Nine of these 15 patients (60%) had endometrial echogenicity with mixed patterns, two (13.3%) had endometrial echogenicity with hyperechoic patterns, and the remaining four (26.7%) had normal endometrial echogenicity at day 7. Histopathologic examination in 12 of these 15 patients (80%) showed chorionic villi, while no gestational tissue was noted in the remaining three. Endometrial thickness ≥10 mm on day 7 had 75% sensitivity and 100% specificity for diagnosis of RPOC in symptomatic women. None of the 451 asymptomatic patients developed any symptoms or needed any further intervention. | As abnormal ultrasonographic findings return to normal over time in asymptomatic patients, the diagnosis of RPOC should not be based on ultrasonographic findings. | closed_qa
Retinal detachments in southern New Zealand: do poorer patients have poorer outcomes? | To investigate associations between socioeconomic status, retinal detachment type and post-operative visual outcomes in southern New Zealand. A retrospective review of all cases of rhegmatogenous retinal detachment managed in Dunedin Hospital over two years was performed. Patient demographics and macula involvement at presentation were the primary outcome measures. The New Zealand Deprivation Index was used to group patients into low (least deprived 30%), medium (middle 40%) and high (most deprived 30%) deprivation categories. Patients were excluded if they were not from New Zealand or had traumatic detachments. During the study period, 95 retinal detachments in 94 patients were managed in Dunedin Hospital. Only 15% of retinal detachments occurred in the most deprived group. More deprived patients had longer delays before assessment in hospital (mean of 29.8 days, versus 10.1 days for the least deprived and 12.8 days for the medium category; overall p=0.025). There was no evidence of an association between deprivation and macula-off status (overall p=0.650) or visual acuity at one or three months (p=0.063 and p=0.328, respectively). Nor was there an association between referral pathway and macula-off status (p=0.242). | Retinal detachment in southern New Zealand may be less common amongst those with the most deprived socioeconomic status, although these patients also experience longer delays until first treatment; there was no association between socioeconomic status and patients being macula-off at presentation or having poorer visual outcomes. More targeted patient education towards our most deprived citizens may reduce delays in treatment and result in better visual outcomes. | closed_qa
Predicted survival in patients with brain metastases from colorectal cancer: Is a current nomogram helpful? | To examine the clinical applicability of a new nomogram by comparing the survival of patients with brain metastases from colorectal cancer treated with surgery and/or radiotherapy in the authors' institutions with the nomogram-predicted median survival. Retrospective analysis of 64 patients treated with comparable approaches and during the same time period as the patients in the nomogram study. Points were assigned for age, performance status, and number and site of brain metastases, as required for nomogram use. In 46 patients (72%), the observed survival was shorter than the predicted median. The median deviation was -1.4 months. The nomogram underestimated the survival of patients treated with radiosurgery/surgery by a median of 4.2 months, whereas it overestimated the survival of patients treated with whole-brain radiotherapy (WBRT) by a median of 2.1 months (p=0.0001). Nevertheless, all 5 patients with a predicted median survival of ≤3 months died within 3 months. Among the 8 patients with a predicted median survival of >12 months, 6 (75%) survived for >12 months. Not all prognostic factors in the nomogram correlated with survival. In the multivariate Cox model, only performance status and number of brain metastases were significant, both with p=0.0001. | Despite differences in prognostic factors and in the survival of many individual patients, especially those with an intermediate prognosis, the nomogram performed promisingly in poor- and good-prognosis patients. Evaluation of separate prediction tools for patients treated with WBRT and with more aggressive local approaches appears warranted in order to minimize the influence of better local control of the brain metastases. | closed_qa
Urinalysis requests on the elderly residing in the Auckland community: tick box requesting? | Urinalysis for microscopy and culture is one of the most frequently requested tests in microbiology laboratories, particularly for elderly patients. This study sought to describe the clinical appropriateness of urinalysis requests from community-dwelling elderly patients and subsequent antibiotic prescription. Demographic, laboratory, and antibiotic prescription data were collected on all samples submitted to Labtests Auckland from patients aged ≥70 years during August 2014. In addition, clinical data were collected by questionnaire for a subgroup of 200 patients. During August 2014, approximately 7% of the Auckland population aged ≥70 years had urinalysis submitted. Urine dipstick testing was not routinely performed before specimen submission, particularly for patients living at home rather than in a long-term care facility, and nearly 50% of samples were not cultured due to absence of pyuria. Escherichia coli was isolated from 23% of female and 7% of male specimens. E. coli isolates from our cohort were less susceptible to all antibiotics tested than E. coli isolates from all urine samples in 2014. Clinical indications were absent in 40% of the subgroup of patients. Antibiotic prescription within 7 days of urinalysis was common (36%). | This study highlights the frequency of urinalysis testing among the elderly residing in the community. Clinical indications are often absent, and treatment of asymptomatic bacteriuria is likely to be contributing to excessive antibiotic prescription in this group of patients. | closed_qa
Does Congenital Heart Disease Affect Neurodevelopmental Outcomes in Children with Down Syndrome? | The impact that congenital heart disease (CHD) has on the neurodevelopment of children with Down syndrome (DS) is unknown and potentially has implications for targeted early intervention. This study assessed the relationship between CHD requiring surgery in the first year of life and neurodevelopmental, behavioral and emotional functioning outcomes in children with DS. A retrospective chart review of 1092 children (0-18 years) with DS who visited a single institution from 8/08 to 8/13 was performed. Children who underwent at least one of nine neurodevelopmental (cognitive, language, developmental) or academic tests were included in the analysis (N = 178). The cohort was divided by age into infants/toddlers (0-2 years), preschoolers (3-5 years), and school age/adolescents (6-18 years). Test scores of children with DS who underwent cardiac surgery in the first year of life were compared with those of children with DS without CHD. t tests, chi-square tests and Mann-Whitney U tests were used where appropriate. Infants/toddlers with cardiac surgery had lower scores for receptive (P = .01), expressive (P = .021) and composite language (P < .001) than those with no CHD. Preschoolers with cardiac surgery had lower language scores and lower visual motor scores, although these differences were not statistically significant. In school-age children with cardiac surgery, there were no differences in IQ scores, language scores, or academic achievement scores compared with those without CHD. Also at school age, there was no difference in the incidence of ADHD, in executive function, or in internalizing and externalizing behavior scores. | Children with DS undergoing cardiac surgery during the first year demonstrated poorer neurodevelopmental outcomes as infants/toddlers but showed no difference at school age compared with children with DS without CHD. These results will guide early interventions to optimize neurodevelopmental outcomes in children with DS and will help with family counseling after CHD repair. | closed_qa
Is PET/CT Necessary in the Management of Early Breast Cancer? | Advanced imaging methods are not recommended before surgery in early breast cancer. In contrast to the accepted guidelines, some recent studies have shown some benefit from the use of PET/CT in early-stage breast cancer. In this study, we aimed to document the efficacy of PET/CT in the detection of distant metastasis as well as other primary cancers. In this retrospective study, we reviewed the records of all women diagnosed with early breast cancer between March 2012 and December 2014. Besides demographics, we recorded the clinical TNM stage, histology of the tumor, and hormone receptor status. As PET/CT imaging is a routine procedure in our center for early breast cancer, tumor size, lymph node status, distant metastasis, and possible other primary malignancies detected by PET/CT were also recorded. Of the 419 women included in the study, 24.8% were clinically staged as stage I, while the rest were stage II. Distant metastases were detected in 42 patients (10%). The yield of PET/CT in detecting metastasis was significantly greater in stage II patients than in stage I patients (12.4% vs 2.9%). In subgroup analysis of stage II patients, the performance of PET/CT in detecting metastasis was still evident in stage IIA patients (9.5%). In logistic regression analysis of the significant and near-significant factors (as detected by univariate analysis) affecting PET/CT-detected distant metastasis, only nodal status (P = 0.053) was found to be significant. | We suggest the use of PET/CT to investigate metastasis in axilla-positive and clinically stage II early breast cancer patients. | closed_qa