Dataset columns:
instruction: string (length 10 to 664 characters)
context: string (length 1 to 5.66k characters)
response: string (length 1 to 3.34k characters)
category: string (1 class)
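Each row below follows this schema: an instruction (a clinical question), a context (a study abstract), a response (the study's conclusion), and the single category label closed_qa. As a minimal sketch of how a split with this schema could be loaded and checked, assuming the examples are stored as JSON Lines with those field names (the file name closed_qa.jsonl is a placeholder, not part of this dataset's documentation):

```python
import json

def load_rows(path="closed_qa.jsonl"):
    """Yield one example per line, assuming JSON Lines storage with the
    fields listed above: instruction, context, response, category."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            if line.strip():
                yield json.loads(line)

if __name__ == "__main__":
    rows = list(load_rows())
    # Every example in this split carries the single category "closed_qa".
    assert all(r["category"] == "closed_qa" for r in rows)
    context_lengths = [len(r["context"]) for r in rows]
    print(f"{len(rows)} examples; context length "
          f"{min(context_lengths)}-{max(context_lengths)} characters")
```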
Can poison control data be used for pharmaceutical poisoning surveillance?
To determine the association between the frequencies of pharmaceutical exposures reported to a poison control center (PCC) and those seen in the emergency department (ED). A statewide population-based retrospective comparison of frequencies of ED pharmaceutical poisonings with frequencies of pharmaceutical exposures reported to a regional PCC. ED poisonings, identified by International Classification of Diseases, Version 9 (ICD-9) codes, were grouped into substance categories. Using a reproducible algorithm facilitated by probabilistic linkage, codes from the PCC classification system were mapped into the same categories. A readily identifiable subset of PCC calls was selected for comparison. Correlations between frequencies of quarterly exposures by substance categories were calculated using Pearson correlation coefficients and partial correlation coefficients with adjustment for seasonality. PCC-reported exposures correlated with ED poisonings in nine of 10 categories. Partial correlation coefficients (r(p)) indicated strong associations (r(p)>0.8) for three substance categories that underwent large changes in their incidences (opiates, benzodiazepines, and muscle relaxants). Six substance categories were moderately correlated (r(p)>0.6). One category, salicylates, showed no association. Limitations: Imperfect overlap between ICD-9 and PCC codes may have led to miscategorization. Substances without changes in exposure frequency have inadequate variability to detect association using this method.
PCC data are able to effectively identify trends in poisonings seen in EDs and may be useful as part of a pharmaceutical poisoning surveillance system. The authors developed an algorithm-driven technique for mapping American Association of Poison Control Centers codes to ICD-9 codes and identified a useful subset of poison control exposures for analysis.
closed_qa
Do postmarketing surveillance studies represent real-world populations?
To evaluate outcomes after carotid artery stenting in larger real-world populations, the Food and Drug Administration mandated that companies conduct postmarketing surveillance (PMS) studies of approved stent systems. Whether PMS studies are representative of carotid artery stenting in routine clinical practice has not been established. Within the National Cardiovascular Database Registry-Carotid Artery Revascularization and Endarterectomy (NCDR CARE) Registry, we compared patient and procedural characteristics, in-hospital outcomes, and subsequent all-cause mortality after carotid artery stenting in PMS study participants and nonparticipants. We conducted both crude and propensity score-adjusted comparisons for all outcomes between groups. Compared with nonparticipants, participants in PMS studies had lower rates of symptomatic carotid artery disease within the preceding 6 months, prior stroke, and acute evolving stroke at baseline. The PMS study participants had lower unadjusted rates of combined in-hospital death, stroke, or myocardial infarction (2.3% versus 4.1%; P<0.001), driven by lower rates of stroke (1.7% versus 2.7%; P=0.005) and death (0.3% versus 1.4%; P<0.001). Differences in survival persisted after propensity score adjustment (odds ratio, 0.44; 95% confidence interval, 0.21 to 0.95; P=0.04 for in-hospital mortality; and hazard ratio, 0.80; 95% confidence interval, 0.66 to 0.97; P=0.02 for 2-year mortality). Baseline differences in neurological history explained the largest proportion of the difference in outcomes between groups.
Participants in PMS studies for carotid artery stenting have different clinical and procedural characteristics and lower mortality compared with nonparticipants. Extrapolating results from PMS studies of carotid artery stenting to larger real-world settings should be done only with great caution.
closed_qa
Can the viability of a nonunion be evaluated using SPECT/CT?
The vitality of a nonunion is crucial for the planning of the reconstructive procedure. The purpose of the present study was to analyze the role of single photon emission computed tomography (SPECT) in diagnosing and planning the treatment of atrophic nonunions in the upper and lower extremity. This study retrospectively examined the SPECT/CT scans of 10 patients (mean age = 44.5 ± 16.5 years, 9 males/1 female, 4 tibia/4 femur/1 radius/1 fibula) who underwent surgical exploration for suspected avital pseudarthrosis. Surgical and histopathological findings were compared with the radiologists' findings to assess the sensitivity and specificity of SPECT in diagnosing avital nonunions. The average interval from the osteosynthesis until the SPECT scan was 18 months. All surgical findings were documented electronically in the hospital computer system. Results of the radiologist's reading were then compared with surgical exploration and histopathological findings, and specificity and sensitivity were calculated. There were 4 vital and 6 nonvital pseudarthroses. SPECT scans identified all the vital pseudarthroses and 3 of the 6 nonvital pseudarthroses. The sensitivity of SPECT in diagnosing non-vital atrophic nonunions was 50% and the specificity 100%.
SPECT/CT is a test with low sensitivity but good specificity for excluding infection and confirming nonviability of the nonunion site. However, larger studies are needed before this test can be incorporated into routine clinical use.
closed_qa
Are serum protein biomarkers derived from proteomic analysis useful in screening for trisomy 21 at 11-13 weeks?
The aim of this study is to identify potential biomarkers for fetal trisomy 21 from previous publications using proteomic techniques and examine the potential value of such biomarkers in early screening for this aneuploidy. This was a case-control study of 25 pregnancies with fetal trisomy 21 and 50 euploid controls undergoing first-trimester screening for aneuploidies by a combination of maternal age, fetal nuchal translucency (NT) thickness and maternal serum free β-human chorionic gonadotrophin (β-hCG) and pregnancy-associated plasma protein-A (PAPP-A). The maternal serum concentrations of afamin, apolipoprotein E, clusterin, ceruloplasmin, epidermal growth factor, fetuin-A, pigment epithelium-derived factor glycoprotein and transthyretin were determined using an ELISA and compared in the euploid and trisomy 21 groups. In pregnancies with fetal trisomy 21, the median maternal age, fetal NT thickness and serum free β-hCG were increased, whereas serum PAPP-A was decreased. However, there were no significant differences between cases and controls in any of the biomarkers.
Proteins identified as potential biomarkers for trisomy 21 using proteomic techniques have not been found to be useful in early screening for this aneuploidy.
closed_qa
Do voluntary step reactions in dual task conditions have an added value over single task for fall prediction?
Stepping reactions play a critical role in responding to balance perturbations, whether they are a consequence of external perturbation or self-induced in nature. The aim of the present study was to determine prospectively the capacity of voluntary stepping performance in single- and dual-task conditions to predict future falls among older community-dwelling persons. We also aimed to assess whether dual-task conditions have an added value over single tasks for fall prediction. A total of 100 healthy older volunteers (mean age 78.4±5.7 yrs), from two self-care protected retirement homes for older adults, performed the Voluntary Step Execution Test in single- and dual-task conditions as a reaction time task while standing on a single force platform. Step initiation, preparatory and swing phases, and foot-contact time were extracted from data on center of pressure and ground reaction force. One-year fall incidences were monitored. Ninety-eight subjects completed the one-year follow-up: 49 non-fallers, 32 one-time fallers, and 17 recurrent fallers (two or more falls). Recurrent fallers had significantly slower voluntary step execution times in both single- and dual-task conditions, especially due to a slower preparation phase. Two stepwise (backward) logistic regression models showed that longer step execution times have strong predictive value for falls in both single- and dual-task conditions (odds ratio (OR) 8.7 and 5.4, respectively, p<0.05).
The Voluntary Step Execution Test in both single- and dual-task conditions is a simple and safe examination that may effectively predict future falls, with no added value of the dual-task over the single-task condition.
closed_qa
Is chair rise performance a useful measure of leg power?
Chair rise performance, which is simple to assess in a home or clinic setting, has been used as a method of predicting leg power deficit in older adults. More recently, chair rise performance has been assessed in younger populations as a baseline for assessment of subsequent age-related declines in function and power. However, as rising from a chair repeatedly not only requires lower limb strength and power but also good balance and coordination, it may not be purely a measure of leg power, especially among these younger, well-functioning groups who are yet to experience age-related declines and deficits in function. The aim of this study was to assess whether chair rise performance can be considered a predictor of leg power, and hence of deficits in this, in men and women in mid-life. We assessed the relationship of chair rise performance with leg extensor power (LEP), measured using the Nottingham Power Rig (NPR), and with standing balance performance. LEP was measured in a clinic setting in a sub-sample of 81 men and 93 women from the MRC National Survey of Health and Development, a nationally representative cohort born in Britain in 1946. The time taken to rise from a chair 10 times and standing balance time were assessed during home visits at the same age. Increasing LEP was associated with better chair rise performance among those who completed 10 chair rises in ≥15 seconds, after adjustment for body size (p=0.008). Better standing balance performance was associated with better chair rise performance in men, but not women.
That LEP and standing balance are both related to chair rise time in men suggests that chair rise time should not be thought of purely as a proxy measure of leg power in middle-aged populations. This has implications for longitudinal studies which want to study age-related decline in chair rise performance.
closed_qa
Acute ankle sprain: is there a best support?
Acute lateral ankle sprain accounts for 85% of all sprains and is generally accepted as the most common sports-related ligamentous injury. There is a lack of consensus about the optimal management of these injuries despite their frequency. The time-honoured mantra of rest, ice, elevation and compression is still commonly used, even though the current evidence for compression is conflicting. A prospective randomized controlled clinical trial was carried out in the emergency department of a regional hospital in Ireland to compare outcomes, in terms of ankle function, pain improvement and return-to-work times, in adults presenting within 24 h of first-time acute lateral ankle sprain, among three external supports. We found no statistically significant differences among all three treatments in terms of ankle joint function, using the Karlsson ankle function scale, at 10- or 30-day follow-up. There was a tendency for Elastoplast bandaging to provide better average ankle function at both time points, when compared with double tubigrip and no support. Participants returned to work an average of 2 days earlier if treated with Elastoplast.
This study found no statistically significant difference in ankle function among double tubigrip bandage, Elastoplast bandage and no support at 10- or 30-day follow-up.
closed_qa
Tuberculosis risk before and after highly active antiretroviral therapy initiation: does HAART increase the short-term TB risk in a low incidence TB setting?
To evaluate the short-term and long-term effects of highly active antiretroviral therapy (HAART) on tuberculosis (TB) risk compared with risk without HAART in a low TB incidence setting. An observational cohort study among HIV-infected persons in care at the Comprehensive Care Center (Nashville, TN) between January 1998 and December 2008. A marginal structural model was used to estimate the effect of HAART on short-term (≤180 days) and long-term (>180 days) TB risk, with CD4⁺ lymphocyte count incorporated as a time-updated covariate. Of 4534 HIV-infected patients, 34 developed TB (165 per 100,000 person-years; 20,581 person-years of follow-up). Seventeen cases occurred among persons not on HAART or >30 days after HAART discontinuation (212 per 100,000 person-years; 8019 person-years of follow-up). Seventeen occurred among persons on HAART (135 per 100,000 person-years; 12,562 person-years of follow-up); 10 in the first 180 days (402 per 100,000 person-years; 2489 person-years of follow-up); and 7 after more than 180 days (69 per 100,000 person-years; 10,073 person-years of follow-up). After adjusting for the most recent CD4⁺ lymphocyte count, the risk of TB in the first 180 days of HAART exposure relative to no HAART was 0.68 (0.14-3.22, P = 0.63).
In this low TB incidence setting, the TB rate in the first 180 days of HAART was almost twice as high as that among persons not on HAART. However, after adjusting for the most recent CD4⁺ count, there was no significant difference in TB risk between these 2 groups. This suggests that low recent CD4⁺ lymphocyte count influences TB risk during the first 180 days of HAART.
closed_qa
Is there any association between TACSTD2, KIAA1253, Ku70 and mutant KRAS gene expression and clinical-pathological features of colorectal cancer?
Samples of tumor and normal tissue of patients surgically treated for colorectal cancer between July 2005 and July 2009 were stored in a tissue bank. These samples were studied with the technique of real-time polymerase chain reaction in respect to expression of the following genes: KRAS codon 12 mutation, TACSTD2, Ku70, and SERIN1. Tumor samples of 37 patients were studied. The mean age was 65.5 years. Twenty one patients (56.8%) were male. Nine patients (24.3%) were classified as TNM stage I, 11 patients (29.8%) as TNM stage II, eight patients (21.6%) as TNM stage III and nine patients (24.3%) as TNM stage IV. The Ku70 expression in poorly-differentiated tumors is significantly higher than in well and moderately-differentiated tumors (2.76 vs. 1.13; p<0.05). SERIN1, TACSTD2 and KRAS codon 12 mutation are not associated with clinical-pathological characteristics of colorectal cancer.
Ku70 expression in poorly-differentiated tumors is significantly higher than in well and moderately-differentiated colorectal tumors.
closed_qa
Do brain activation changes persist in athletes with a history of multiple concussions who are asymptomatic?
To evaluate brain activation patterns of asymptomatic athletes with a history of two or more concussions. A paired case-control design was used to evaluate brain activation patterns during cognitive performance in 14 athletes with a history of two or more concussions and 14 age- and sex-matched controls with no previous concussion. Percentage Blood-Oxygen-Level-Dependent (BOLD) change during an N-back working memory task was assessed in all participants. Performance on the Trail-Making Test Form A and B, Symbol-Digit Modalities Test and the Immediate Post-concussion Assessment and Cognitive Test (ImPACT) was also compared between groups. As expected, brain regions activated during the performance of the N-back were equivalent between groups. The groups performed similarly on the neurocognitive measures. The history of concussion group was less accurate than controls on the 1-, 2- and 3-back conditions of the N-back.
Following the complete resolution of symptoms, a history of two or more concussions is not associated with changes in regional brain activation during the performance of working memory task. Compensatory brain activation may only persist during the typically brief time athletes experience symptoms following concussion.
closed_qa
Effects on outcomes of heart rate reduction by ivabradine in patients with congestive heart failure: is there an influence of beta-blocker dose?
This study used the SHIFT (Systolic Heart failure treatment with the I(f) inhibitor ivabradine Trial) database to assess the impact of background beta-blocker dose on response to ivabradine. In systolic heart failure, reduction in relatively high heart rates improves clinical outcomes when achieved with beta-blockers and even more so when the sinus node inhibitor ivabradine also is added. Among patients with systolic heart failure, sinus rhythm, and heart rate ≥70 beats/min on recommended background therapy, maximally tolerated beta-blocker doses were subgrouped as no beta-blocker, <25%, 25% to <50%, 50% to <100%, and 100% of European Society of Cardiology–suggested target doses. The impact of ivabradine on cardiovascular death or heart failure hospitalization (primary endpoint) was analyzed in each subgroup as time-to-first event using Cox models adjusted for heart rate. The statistical models assessed heterogeneity and trend of the treatment effect across subgroups, and an additional analysis was made adjusting for the interaction of randomized treatment with baseline heart rate. The primary endpoint and heart failure hospitalizations were significantly reduced by ivabradine in all subgroups with <50% of target beta-blocker dose, including no beta-blocker (p = 0.012). Despite an apparent trend to reduction in treatment-effect magnitude with increasing beta-blocker dose, no variation in treatment effect was seen in general heterogeneity interaction tests (p = 0.35). Across beta-blocker subgroups, treatment effect was borderline nonsignificant only for the primary endpoint (p = 0.056), and significance was further lost after adjusting for interaction between baseline heart rate and ivabradine effect (p = 0.14).
The magnitude of heart rate reduction by beta-blocker plus ivabradine, rather than background beta-blocker dose, primarily determines subsequent effect on outcomes. (Effects of ivabradine on cardiovascular events in patients with moderate to severe chronic heart failure and left ventricular systolic dysfunction. A three-year randomised double-blind placebo-controlled international multicentre study; ISRCTN70429960)
closed_qa
Is abdominal compression useful in lung stereotactic body radiation therapy?
To determine the usefulness of abdominal compression in lung stereotactic body radiation therapy (SBRT) depending on lobe tumor location. Twenty-seven non-small cell lung cancer patients were immobilized in the Stereotactic Body Frame™ (Elekta). Eighteen tumors were located in an upper lobe, one in the middle lobe and nine in a lower lobe (one patient had two lesions). All patients underwent two four-dimensional computed tomography (4DCT) scans, with and without abdominal compression. Three-dimensional tumor motion amplitude was determined using manual landmark annotation. We also determined the internal target volume (ITV) and the influence of abdominal compression on lung dose-volume histograms. The mean reduction of tumor motion amplitude was 3.5 mm (p = 0.009) for lower lobe tumors and 0.8 mm (p = 0.026) for upper/middle lobe locations. Compression increased tumor motion in 5 cases. Mean ITV reduction was 3.6 cm(3) (p = 0.039) for lower lobe and 0.2 cm(3) (p = 0.048) for upper/middle lobe lesions. Dosimetric gain of the compression for lung sparing was not clinically relevant.
The most significant impact of abdominal compression was obtained in patients with lower lobe tumors. However, minor or negative effects of compression were reported for other patients and lung sparing was not substantially improved. At our institute, patients with upper or middle lobe lesions are now systematically treated without compression and the usefulness of compression for lower lobe tumors is evaluated on an individual basis.
closed_qa
Are quality improvement methods a fashion for hospitals in Taiwan?
This study reviews the rise and fall of the quality improvement (QI) methods implemented by hospitals in Taiwan, and examines the factors related to these methods. Cross-sectional, questionnaire-based survey. One hundred and thirty-nine district teaching hospitals, regional hospitals and medical centers. Directors or the persons in charge of implementing QI methods. No intervention was administered; the main outcome measures were the breadth and depth of the 18 QI methods. Seventy-two hospitals responded to the survey, giving a response rate of 52%. In terms of breadth based on the hospitals' self-reporting, the average number of QI methods adopted per hospital was 11.78 (range: 7-17). More than 80% of the surveyed hospitals had implemented eight QI methods, and >50% had implemented five QI methods. The QI methods adopted by over 80% of the surveyed hospitals had been implemented for a period of ∼7 years. On the basis of the authors' classification, seven of the eight QI methods (except for QI team in total quality management) had an implementation depth of almost 70% or higher in the surveyed hospitals.
This study provides a snapshot of the QI methods implemented by hospitals in Taiwan. The results show that the average breadth of the QI methods adopted was 11.78; however, only 8.83 were implemented deeply. The hospitals' accreditation level was associated with the breadth and depth of QI method implementation.
closed_qa
Does stimulant pretreatment modify atomoxetine effects on core symptoms of ADHD in children assessed by quantitative measurement technology?
To compare the reduction of ADHD symptoms under atomoxetine (ATX) in patients with and without pretreatment with a stimulant medication using a computer-based Continuous Performance Test (cb-CPT) combined with an infrared motion tracking (MT) device. Double-blind, placebo-controlled study in ADHD patients (6-12 years) treated with ATX (target dose = 1.2 mg/kg per day). The cb-CPT/MT scores were analyzed using ANCOVA (last observation carried forward). Patient data (n = 125) suggested a differential ATX treatment effect between pretreated and stimulant-naïve patients in terms of three cb-CPT/MT parameters.
This secondary analysis provided evidence that ATX reduced ADHD symptom severity measured by cb-CPT/MT parameters regardless of stimulant pretreatment. A few differential effects were seen based on the cb-CPT/MT. However, no clear pattern could be identified and, overall, the observed differences have no larger clinical relevance. The ATX effect in this study seemed to be largely independent of any previous exposure to stimulants.
closed_qa
Risk screening for ADHD in a college population: is there a relationship with academic performance?
The present study examines the relationship between self-reported levels of ADHD and academic outcomes, as well as aptitude. A total of 523 college students took the Adult Self-Report Scale-Version 1.1 (ASRS-V1.1), and their scores were compared with course performance and ACT (American College Test) composite scores. The measure identified 70 students (13.4%) as being in the "highly likely" category for an ADHD diagnosis. Course exam and ACT scores for the 70 "highly likely" students were statistically identical to the remaining 453 students in the sample and the 77 students identified as "highly unlikely" as well. Only 4 of the "highly likely" 70 students were registered with the university's Office of Student Disability Services as having been diagnosed with ADHD.
The ASRS-V1.1 failed to discriminate academic performance and aptitude differences between ADHD "highly likely" and "highly unlikely" individuals. The use of self-report screeners of ADHD is questioned in contexts relating ADHD to academic performance.
closed_qa
Do pharmacokinetics explain persistent restenosis inhibition by a single dose of paclitaxel?
The purpose of this study was to investigate the elimination of paclitaxel from the arterial wall after a single short administration with a coated balloon. Slightly oversized paclitaxel-coated balloons (dose 3 or 9 μg/mm(2)) without or with premounted stents were inflated in nonatherosclerotic coronary arteries of either young domestic pigs or adult Goettingen minipigs. The paclitaxel content of plasma, arterial segments, and residual hearts (without treated arteries) was measured for up to 180 days using high-performance liquid chromatography/ultraviolet detection or mass spectrometry. Angiograms were evaluated for lumen narrowing. The paclitaxel concentration in plasma remained <10 ng/mL. In arteries of domestic pigs and minipigs treated with paclitaxel-coated balloons with premounted stents, 10%±5% or 21%±8% of dose, respectively, was initially detected and decreased to 3.5%±3.1% of dose (domestic pig) by Day 7. Within 6 months it fell with a half-life of 1.9 months to 0.40%±0.35%. After 3 months the concentration in the arterial wall was 17±11 μmol/L. Without a stent, drug transfer to the vessel wall was somewhat reduced and elimination faster. Immediately after treatment up to 26%±4% of dose was detected in the residual whole hearts. It dropped with a half-life of 45 days to 1.5%±0.6% of dose (0.3 μmol/L) within 6 months.
After a single local administration with coated balloons, paclitaxel stays in the vessel wall of pigs long enough to explain persistent inhibition of neointimal proliferation. The pharmacokinetics of paclitaxel does not, however, exclude other reasons for sustained efficacy, such as early blocking of processes initiating excessive and prolonged neointimal proliferation.
closed_qa
Surgical treatment of left main disease and severe carotid stenosis: does the off-pump technique provide a better outcome?
Left main disease (LMD), combined with carotid artery stenosis (CAS), constitutes a high-risk patient population. Priority is often given to coronary revascularization, due to the severity of the angina. However, the choice of revascularization strategy [off-pump coronary artery bypass (OPCAB) vs coronary artery bypass grafting (CABG)] remains elusive. A total of 1340 patients with LMD were non-randomly assigned to either on-pump (CABG group, n = 680) or off-pump (OPCAB group, n = 634) revascularization between 1 January 2006 and 21 September 2010. Multivariable regression was used to determine the risk-adjusted impact of a revascularization strategy on a composite in-hospital outcome (MACCE), and proportional hazards regression was used to define the variables affecting long-term survival. Significant CAS was found in 130 patients: 84 (13.1%) patients underwent OPCAB, while 46 patients (6.8%) underwent CABG (P<0.05). Patients with a history of stroke/transient ischaemic attack were also more likely to receive OPCAB (7.1 vs 4.7%; P = 0.08). OPCAB patients were older, in a higher New York Heart Association (NYHA) class, with a lower LVEF and higher EuroSCORE. A calcified aorta was found in 79 patients [OPCAB-CABG: 49 (7.73%) vs 30 (4.41%); P = 0.016] and resulted in a less complex revascularization (OPCAB-CABG: 2.3 ± 0.71 vs 3.19 ± 0.82; P<0.05), and 30-day mortality was insignificantly higher in the CABG (2.7 vs 2.8%) as well as MACCE (11.2 vs 12.2%; P = NS). This trend reversed when late mortality was evaluated; however, it did not reach significance at 60 months. Preoperative renal impairment requiring dialysis was found to be a technique-independent predictor of MACCE. The number of arterial conduits also influenced MACCE.
Off-pump coronary revascularization may offer risk reduction of neurological complications in patients with a significant carotid artery disease and a history of previous stroke, but a larger study population is needed to support this thesis. The growing discrepancy in long-term survival should draw attention to a more complete revascularization in OPCAB patients.
closed_qa
Sleep apnea as a comorbidity in obese psoriasis patients: a cross-sectional study. Do psoriasis characteristics and metabolic parameters play a role?
Psoriasis is associated with a variety of comorbidities such as obesity and cardiovascular disease. In a cross-sectional study, we explored whether obstructive sleep apnea and hypopnea syndrome (OSAHS) is associated with psoriasis characteristics and metabolic parameters. Thirty-five patients with chronic plaque psoriasis underwent a nocturnal polysomnography study and were analysed for the Apnoea-Hypopnoea Index to assess OSAHS severity and the Framingham score to predict the absolute risk of coronary artery disease at 10 years. The association of OSAHS with psoriasis was examined according to psoriasis characteristics (PASI and DLQI scores, disease duration and previous use of systemic treatments), metabolic parameters (Body Mass Index - BMI, waist to hip ratio - WHR, lipid profile) and other comorbidities (obesity, hypertension, arthritis and cardiovascular disease). There was no correlation between psoriasis characteristics and OSAHS. Psoriasis patients with OSAHS presented more frequent snoring and lower sleep quality compared with those without OSAHS. In univariate analyses, OSAHS was associated with increased BMI and hypertension in psoriasis patients. In multivariable logistic regression models, there was statistically significant evidence that only BMI and hypertension were associated with increased risk of OSAHS, adjusting for psoriasis characteristics, age and gender. Presence of metabolic syndrome, WHR, and smoking were not significant risk factors for OSAHS. In subgroup analyses, OSAHS correlated with duration of psoriasis (>8 years) in women (P = 0.021) and with the Framingham score in men (P = 0.035).
OSAHS may be a comorbidity in obese psoriasis patients with hypertension. Treatment with continuous positive airway pressure and weight loss interventions should be initiated.
closed_qa
Plasma YKL-40: a potential biomarker for psoriatic arthritis?
Plasma YKL-40 is an inflammatory biomarker. No useful biomarker exists in patients with psoriasis or psoriatic arthritis. To measure YKL-40 and high-sensitivity C-reactive protein (hs-CRP) in patients with psoriasis or psoriatic arthritis before and during treatment. In 48 patients with psoriasis, we measured YKL-40, hs-CRP and Psoriasis Area and Severity Index (PASI) at inclusion and in a subgroup of 14 patients, we repeated the measurements after four to six weeks of methotrexate treatment. In 42 patients with psoriatic arthritis, we measured YKL-40 and hs-CRP at inclusion and during 48 weeks of adalimumab treatment. The patients with psoriatic arthritis were divided into responders and non-responders. In patients with psoriasis, the baseline median PASI score was 10.8 and baseline YKL-40 was 45 μg/L. Seventeen per cent had elevated plasma YKL-40 compared with healthy subjects. Baseline PASI and YKL-40 were not correlated (rho = 0.14, P = 0.347) and YKL-40 and hs-CRP remained unchanged after treatment. In patients with psoriatic arthritis, the median pretreatment YKL-40 was 112 μg/L and 43% had elevated YKL-40. YKL-40 decreased in 33 patients who responded to adalimumab (from 112 μg/L to 68 at 48 weeks, P = 0.007). Hs-CRP decreased (from 4.65 mg/L to 0.91, P = 0.013) in the responders. In the non-responders (n = 9), YKL-40 and hs-CRP remained unchanged.
YKL-40 is elevated in many patients with psoriatic arthritis, but not in patients with psoriasis. YKL-40 decreased in patients with psoriatic arthritis who responded to treatment. YKL-40 may be a useful biomarker to monitor the effect of treatment with tumour necrosis factor-α inhibitors in patients with psoriatic arthritis.
closed_qa
Neoadjuvant therapy and liver transplantation for hilar cholangiocarcinoma: is pretreatment pathological confirmation of diagnosis necessary?
Neoadjuvant chemoradiotherapy followed by operative staging and liver transplantation is an effective treatment for patients with unresectable hilar cholangiocarcinoma (CCA) and CCA arising in the setting of primary sclerosing cholangitis (PSC). Pathologic confirmation of CCA is notoriously difficult, and many patients have been treated based on clinical criteria without pathological confirmation. We reviewed our experience with the specific aim of determining the need for pathological confirmation of CCA before treatment. Two hundred and fifteen patients received neoadjuvant therapy between 1992 and 2011. One hundred and eighty-two patients underwent operative staging and 38 (21%) had findings that precluded transplantation. Pathological confirmation of CCA before therapy was achieved in 45 of 87 (52%) PSC patients and 22 of 49 (45%) de novo patients who underwent transplantation. Pretreatment pathological confirmation was associated with significantly worse 5-year survival after start of therapy for PSC patients (50% vs 80%; p = 0.001), but not for de novo patients (39% vs 48%; p = 0.27). Pretreatment pathological confirmation was associated with worse 5-year survival after transplantation for PSC patients (66% vs 92%; p = 0.01), but not for de novo patients (63% vs 65%; p = 0.71). The difference in the PSC patients was not due to recurrent cancer. Absence of pretreatment pathological confirmation did not result in less detection of residual CCA in the explanted livers or in less recurrence after transplantation.
Rates of residual CCA in liver explants and recurrences after transplantation are comparable for patients with and without pretreatment pathological confirmation of CCA and attest to the accuracy of clinical diagnostic criteria. Pretreatment pathological confirmation of CCA is desirable but should not be a requirement for treatment.
closed_qa
Is renal thrombotic angiopathy an emerging problem in the treatment of ovarian cancer recurrences?
Ovarian cancer is usually diagnosed at an advanced stage, with most patients undergoing surgery followed by platinum- and taxane-based chemotherapy. After initial clinical remission, the majority recur, leading to additional treatments, including not only platinums and taxanes but also pegylated liposomal doxorubicin (PLD), gemcitabine, topotecan, and, more recently, bevacizumab, which may extend survival times. PLD, in particular, has been extensively studied by our group, with encouraging therapeutic results. We, however, observed instances of chronic kidney disease (CKD) developing among patients who received long-term treatment for recurrent ovarian cancer. To document the frequency of and contributing factors to the emergence of CKD, we initiated a retrospective review at two institutions. Fifty-six consecutive patients with recurrent ovarian cancer receiving treatment at New York University Cancer Institute were reviewed for the presence of renal disease in 1997-2010. At Shaare Zedek Medical Center, 73 consecutive patients with ovarian cancer were reviewed in 2002-2010. Patients were diagnosed with CKD if they had an estimated GFR <60 mL/minute per 1.73 m² for >3 months and were staged according to the National Kidney Foundation guidelines. Thirteen patients (23%) developed stage ≥3 CKD. Three patients had renal biopsies performed that showed thrombotic microangiopathy.
CKD is emerging as a potential long-term consequence of current chemotherapy for recurrent ovarian cancer.
closed_qa
Squamous cell carcinoma of the oral cavity in nonsmoking women: a new and unusual complication of chemotherapy for recurrent ovarian cancer?
To describe occurrences of oral squamous cell carcinoma (SCC) in patients who had received long-term pegylated liposomal doxorubicin (PLD) for ovarian cancer. In our cohort of patients on maintenance PLD for ovarian and related mullerian epithelial malignancies, we encountered two patients with invasive SCC of the oral cavity (one of them multifocal) and one with high-grade squamous dysplasia. Review of patients at our institution receiving PLD for recurrent ovarian cancer identified three additional patients. The duration of treatment, cumulative PLD dose, human papillomavirus (HPV) positivity, BRCA status, stage at diagnosis, outcome, and other characteristics are reviewed. All five cases were nonsmokers with no known risk factors for HPV and four were negative for p16 expression. Four of the patients had known BRCA mutations whereas one tested negative. Cumulative doses of PLD were >1,600 mg/m2 given over 30-132 months. Three had SCCs staged as T1N0 oral tongue, alveolar ridge (gingival), and multifocal oral mucosa; one had a T2N0 oral tongue; and one had dysplasia. After excision, two were given radiation but recurred shortly thereafter; the others remain well and have had no further exposure to cytotoxic drugs, including PLD.
Awareness of this possible long-term complication during PLD treatment should enhance the likelihood of early detection of oral lesions in these patients. Decisions to continue maintenance PLD after complete response of the original cancer should perhaps consider the benefits of delaying ovarian cancer recurrence versus the possible risk for a secondary cancer.
closed_qa
Transcutaneous electrical nerve stimulation: an effective treatment for refractory non-neurogenic overactive bladder syndrome?
To assess the effect of transcutaneous electrical nerve stimulation (TENS) for treating refractory overactive bladder syndrome (OAB). A consecutive series of 42 patients treated with TENS for refractory OAB was prospectively investigated at an academic tertiary referral centre. Effects were evaluated using bladder diary for at least 48 h and satisfaction assessment at baseline, after 12 weeks of TENS treatment, and at the last known follow-up. Adverse events related to TENS were also assessed. Mean age of the 42 patients (25 women, 17 men) was 48 years (range, 18-76). TENS was successful following 12 weeks of treatment in 21 (50 %) patients, and the positive effect was sustained during a mean follow-up of 21 months (range, 6-83 months) in 18 patients. Following 12 weeks of TENS treatment, mean number of voids per 24 h decreased significantly from 15 to 11 (p<0.001) and mean voided volume increased significantly from 160 to 230 mL (p<0.001). In addition, TENS completely restored continence in 7 (39 %) of the 18 incontinent patients. Before TENS, all 42 patients were dissatisfied or very dissatisfied; following 12 weeks of TENS treatment, 21 (50 %) patients felt satisfied or very satisfied (p<0.001). No adverse events related to TENS were noted.
TENS seems to be an effective and safe treatment for refractory OAB warranting randomized, placebo-controlled trials.
closed_qa
Does good clinical practice at the primary care improve the outcome care for diabetic patients?
The Middle East region is predicted to have one of the highest prevalence of diabetes mellitus (DM) in the world. This is the first study in the region to assess treatment outcome of DM according to gender. To assess the quality and effectiveness of diabetes care provided to patients attending primary care settings according to gender in the State of Qatar. It is an observational cohort study. The survey was carried out in primary health care (PHC) centers in the State of Qatar. The study was conducted from January 2010 to August 2010 among diabetic patients attending (PHC) centers. Of the 2334 registered with diagnosed diabetes, 1705 agreed and gave their consent to take part in this study, thus giving a response rate of 73.1%. Face to face interviews were conducted using a structured questionnaire including socio-demographic, clinical and satisfaction score of the patients. Majority of subjects were diagnosed with type 2 DM (84.9%). A significantly larger proportion of females with DM were divorced or widowed (9.1%) in comparison to males with DM (3.4%; p<0.001). A significantly larger proportion of females were overweight (46.5%; p=0.009) and obese (29.5%; p=0.003) in comparison to males. Males reported significantly greater improvements in mean values of blood glucose (mmol/l) (-2.11 vs. -0.66; p=0.007), HbA1c (%) (-1.44 vs. -0.25; p=0.006), cholesterol (mmol/l) (-0.16 vs. 0.12; p=0.053) and systolic blood pressure (mmHg) (-9.04 vs. -6.62; p<0.001) in comparison to females. While there was a remarkable increase in male patients with normal range of fasting blood glucose (FBG; 51.6%) as compared to the FBG measurement 1 year before (28.5%: p<0.001) there was only a slight increase in females normal range FBG during this period from 28.0% to 30.4% (p=0.357).
The present study revealed that the current form of PHC centers afforded to diabetic patients provided significantly improved outcomes for males, but only minor improved outcomes for females. This study reinforces calls for a gender-specific approach to diabetes care.
closed_qa
Does body mass index impact the number of LNs harvested and influence long-term survival rate in patients with stage III colon cancer?
The aim of this study is to evaluate whether different body mass index (BMI) values affect lymph node (LN) retrieval and whether such variations influence long-term survival in Asian patients. From January 1995 to July 2003, 645 stage III colon cancer patients were enrolled in our study. Patients were stratified into four groups: obese (BMI ≥27 kg/m²), overweight (24 ≤ BMI <27 kg/m²), normal (18.5 ≤ BMI <24 kg/m²), and underweight (BMI <18.5 kg/m²). Mean BMI in the cohort was 23.3 kg/m². Mean number of LNs harvested was 23.1, 19.5, 19.8 and 28.1 in the normal, overweight, obese and underweight groups, respectively. There was a significant difference in the mean number of LNs harvested when comparing the overweight and underweight groups to the normal group (p = 0.013 and p = 0.04, respectively). Females were overrepresented in the underweight group (p = 0.011), and patients who had proximal colon cancers were more frequently underweight (p = 0.018). The mean number of LNs harvested varied by cases of right hemicolectomy (p = 0.009) and proximal cancer location (p = 0.009) for different BMI groups. Multivariate analysis showed that underweight, proximal colon cancer, well- or moderately differentiated adenocarcinoma and stage IIIC cancer were significant variables for adequate LN recovery. BMI was not significantly associated with relapse-free survival (p = 0.523) or overall survival (p = 0.127).
BMI is associated with LN harvest but is not an independent variable in stage III colon cancer survival.
closed_qa
A prospective single-center study of sentinel lymph node detection in cervical carcinoma: is there a place in clinical practice?
To establish the accuracy of sentinel lymph node (SLN) detection in early cervical cancer. Sentinel lymph node detection was performed prospectively over a 6-year period in 86 women undergoing surgery for cervical carcinoma by the combined method (Tc-99m and methylene blue dye). Further ultrastaging was performed on a subgroup of 26 patients who had benign SLNs on initial routine histological examination. The SLN was detected in 84 (97.7%) of 86 women by the combined method. Blue dye uptake was not seen in 8 women (90.7%). Sentinel lymph nodes were detected bilaterally in 63 women (73.3%), and the external iliac region was the most common anatomic location (48.8%). The median SLN count was 3 nodes (range, 1-7). Of the 84 women with sentinel node detection, 65 also underwent bilateral pelvic lymph node dissection, and in none of these cases was a benign SLN associated with a malignant non-SLN (100% negative predictive value). The median non-SLN count for all patients was 19 nodes (range, 8-35). Eighteen patients underwent removal of the SLN without bilateral pelvic lymph node dissection. Nine women (10.5%) had positive lymph nodes on final histology. One patient had bulky pelvic nodes on preoperative imaging and underwent removal of the negative bulky malignant lymph nodes and a benign SLN on the contralateral side. This latter case confirms the unreliability of the SLN method with bulky nodes. The remaining 8 patients had positive SLNs with negative nonsentinel lymph nodes. Fifty-nine SLNs from 26 patients, which were benign on initial routine histology, underwent ultrastaging, but no further disease was identified. Four patients (5%) relapsed after a median follow-up of 28 months (range, 8-80 months).
Sentinel lymph node detection is an accurate and safe method in the assessment of nodal status in early cervical carcinoma.
closed_qa
Impact of lap-band size on weight loss: does gender matter?
Laparoscopic adjustable gastric band (LAGB) has gone through major design modifications to improve clinical endpoints and reduce complications. Little is known, however, about the effects of LAGB size on clinical outcomes, or whether outcomes differ based on gender. We set out to examine the impact of band size on surgical weight loss, reoperations, comorbidity resolution, and compare outcomes within gender. We reviewed our prospectively collected longitudinal bariatric database between 2008 and 2010, and compared patients with BMI 35-50 kg/m(2) who had undergone LAGB with the LAP-BAND® APS to those who had the larger APL. Those patients with initial BMI > 50 kg/m(2) were excluded to reduce any possible selection bias which favors larger band use in such subjects. Three hundred ninety-four patients met our inclusion criteria; 230 (58 %) in the APS group and 164 (42 %) in the APL group. Female patients in APS group experienced significantly higher percentage excess body weight loss at 6 months, 1 year, and 2 years in comparison to female patients in APL group (p < 0.001 for all time points). In contrast, a reverse pattern was observed for male patients. No significant differences were observed between the groups regarding frequency of band adjustments, complications, or comorbidity resolution.
Male patients might benefit from APL bands, in contrast to female patients who appear to experience superior weight loss with the smaller APS bands. This study provides the first set of evidence to facilitate surgical decision making for band size selection and highlights differences between genders.
closed_qa
Is the expression of Transforming Growth Factor-Beta1 after fracture of long bones solely influenced by the healing process?
Circulating TGF-β1 levels were found to be a predictor of delayed bone healing and non-union. We therefore aimed to investigate some factors that can influence the expression of TGF-β1. The correlation between the expression of TGF-β1 and the different socio-demographic parameters was analysed. Fifty-one patients with long bone fractures were included in the study and divided into different groups according to their age, gender, cigarette smoking status, diabetes mellitus and regular alcohol intake. TGF-β1 levels were analysed in patients' serum and different groups were retrospectively compared. Significantly lower TGF-β1 serum concentrations were observed in non-smokers compared to smokers at week 8 after surgery. Significantly higher concentrations were found in male patients compared to females at week 24. Younger patients had significantly higher concentrations at week 24 after surgery compared to older patients. Concentrations were significantly higher in patients without diabetes compared to those with diabetes at six weeks after surgery. Patients with chronic alcohol abuse had significantly higher concentrations compared to those patients without chronic alcohol abuse.
TGF-β1 serum concentrations vary depending upon smoking status, age, gender, diabetes mellitus and chronic alcohol abuse at different times and therefore do not seem to be a reliable predictive marker as a single-point-in-time measurement for fracture healing.
closed_qa
Is there subclinical enthesitis in early psoriatic arthritis?
Enthesitis is a recognized feature of spondylarthritides (SpA), including psoriatic arthritis (PsA). Previously, ultrasound imaging has highlighted the presence of subclinical enthesitis in established SpA, but there are little data on ultrasound findings in early PsA. The aim of our study was to compare ultrasound and clinical examination (CE) for the detection of entheseal abnormalities in an early PsA cohort. Forty-two patients with new-onset PsA and 10 control subjects underwent CE of entheses for tenderness and swelling, as well as gray-scale (GS) and power Doppler (PD) ultrasound of a standard set of entheses. Bilateral elbow lateral epicondyles, Achilles tendons, and plantar fascia were assessed by both CE and ultrasound, the latter scored using a semiquantitative (SQ) scale. Inferior patellar tendons were assessed by ultrasound alone. A GS SQ score of >1 and/or a PD score of >0 was used to describe significant ultrasound entheseal abnormality. A total of 24 (57.1%) of 42 patients in the PsA group and 0 (0%) of 10 controls had clinical evidence of at least 1 tender enthesis. In the PsA group, for sites assessed by both CE and ultrasound, 4% (7 of 177) of nontender entheses had a GS score >1 and/or a PD score >0 compared to 24% (9 of 37) of tender entheses. CE overestimated activity in 28 (13%) of 214 entheses. All the nontender ultrasound-abnormal entheses were in the lower extremity.
The prevalence of subclinical enthesitis in this early PsA cohort was low. CE may overestimate active enthesitis. The few subclinically inflamed entheses were in the lower extremity, where mechanical stress is likely to be more significant.
closed_qa
End-stage renal disease and critical limb ischemia: a deadly combination?
This study was planned to evaluate the prognostic impact of end-stage renal disease (ESRD) in patients with critical leg ischemia (CLI) undergoing infrainguinal revascularization. A total of 1425 patients who underwent infrainguinal revascularization for CLI were the subjects of the present analysis. Ninety-five patients had ESRD (eGFR <15 ml/min/m²), and of them 66 (70%) underwent percutaneous transluminal angioplasty and 29 (30%) underwent bypass surgery. ESRD patients had significantly lower overall survival (at 3 years, 27.1% vs. 59.7%, p<0.0001), leg salvage (at 3 years, 57.7% vs. 83.0%, p<0.0001), and amputation-free survival (at 3 years, 16.2% vs. 52.9%, p<0.0001) than patients with no or less severe renal failure. The difference in survival was even greater between 86 one-to-one propensity-matched pairs (at 3 years, 23.1% vs. 67.3%, p<0.0001). ESRD was an independent predictor of all-cause mortality (RR 2.46, 95%CI 1.85-3.26). Logistic regression showed that age ≥ 75 years was the only independent predictor of 1-year all-cause mortality (OR 4.92, 95%CI 1.32-18.36). Classification and regression tree analysis showed that age ≥ 75 years and, among younger patients, bypass surgery for leg ulcer and gangrene were associated with significantly higher 1-year mortality.
Lower limb revascularization in patients with CLI and end-stage renal failure is associated with favourable leg salvage. However, these patients have very poor survival, and this may jeopardize any attempt at revascularization. Further studies are needed to identify ESRD patients with acceptable life expectancy who may benefit from lower limb revascularization.
closed_qa
Pulmonary embolism diagnosis and mortality with pulmonary CT angiography versus ventilation-perfusion scintigraphy: evidence of overdiagnosis with CT?
The purposes of this study were to determine whether pulmonary emboli diagnosed with pulmonary CT angiography (CTA) represent a milder disease spectrum than those diagnosed with ventilation-perfusion (V/Q) scintigraphy, to determine the trends in incidence and mortality among patients with the diagnosis of pulmonary embolism from 2000 to 2007, and to correlate incidence and mortality trends with imaging modality trends. Diagnoses of pulmonary embolism from 2000 to 2007 at an urban academic medical center were retrospectively identified. Patient data were collected from the hospital database and the Social Security Death Index. Incident diagnoses, type of imaging used, and date of death were documented. Bivariate and multivariate analyses were used to explore the relations between imaging use and the incidence and mortality of pulmonary embolism. Logistic regression analysis was used to estimate the odds of death of pulmonary embolism diagnosed with pulmonary CTA versus V/Q scintigraphy. The cases of 2087 patients (1361 women, 726 men; mean age, 61.8 years) with pulmonary embolism were identified. From 2000 to 2007 the incidence of pulmonary embolism increased from 0.69 to 0.91 per 100 admissions in strong correlation with increased use of pulmonary CTA. There was no change in mortality, but the case-fatality rate decreased from 5.7% to 3.3%. On average, pulmonary emboli diagnosed with pulmonary CTA were one half as lethal as those diagnosed with V/Q scintigraphy (odds ratio, 0.538; 95% CI, 0.314-0.921).
The results of this study are evidence that the shift in imaging from V/Q scintigraphy to pulmonary CTA resulted in increased diagnosis of a less fatal spectrum of pulmonary embolic disease, raising the possibility of overdiagnosis. Outcome-based clinical trials with long-term follow-up would be helpful to further guide management.
closed_qa
Can sarcoidosis and metastatic bone lesions be reliably differentiated on routine MRI?
Sarcoidosis lesions revealed on MRI in the axial skeleton and long bones resemble osseous metastases, which can lead to a potentially significant misdiagnosis. We hypothesized that osseous sarcoidosis lesions could be differentiated from osseous metastases on MRI and sought to propose and evaluate features distinguishing these entities. MR images obtained at 1.5 T of 34 subjects (22 with osseous metastatic disease, 12 with osseous sarcoidosis) with 79 single or multiple bone lesions (40 metastatic, 39 sarcoidal) were reviewed independently by two blinded, experienced musculoskeletal radiologists. Fluid-sensitive and T1-weighted images were viewed separately. Proposed discriminating features were peri- or intralesional fat, specified border characteristics, and the presence of an extraosseous soft-tissue mass. An additional feature for spinal lesions was posterior element involvement. On the basis of these criteria, the readers provided a binary diagnosis and confidence score. The overall sensitivity for both readers was 46.3% and specificity, 97.4%. T1-weighted images were associated with higher sensitivity than T2-weighted images (59.0% vs 34.1%, respectively; p = 0.025) and with comparable specificity (97.6% vs 97.2%, p = 0.91). Diagnostic accuracy was higher using the discriminators of a mass or posterior element involvement for metastasis (83.3%) than border characteristics (68.0%) or lesion fat (65.0%) for sarcoidosis; the latter two features provided near 100% specificity but poor sensitivity (14.3% and 0%, respectively). Readers reported higher confidence diagnosing osseous sarcoidosis lesions than metastatic lesions, with a trend for higher confidence with T1-weighted images (p = 0.088).
Osseous sarcoidosis lesions cannot be reliably distinguished from metastatic lesions on routine MRI studies by readers experienced in evaluating these lesions.
closed_qa
Can opposite clear corneal incisions have a role with post-laser in situ keratomileusis astigmatism?
To evaluate the astigmatic correcting effect of paired opposite clear corneal incisions (OCCIs) on the steep axis in patients with residual astigmatism after laser in situ keratomileusis (LASIK). Thirty-one eyes of 24 patients with a mean age of 28.4±2.46 years (range, 19-36 years) were recruited for the study. Inclusion criteria included residual astigmatism of ≥1.5 diopter (D) after LASIK with inadequate residual stromal bed thickness that precluded ablation. The cohort was divided into two groups: group I (with astigmatism ranging from -1.5 D to -2.5 D) and group II (with astigmatism >-2.5 D). The steep axis was marked prior to surgery. Paired three-step self-sealing opposite clear corneal incisions were performed 1 mm anterior to the limbus on the steep axis with a 3.2-mm keratome for group I and 4.1 mm for group II. Patients were examined 1 day, 1 week, 1 month, 3 months and 6 months postoperatively. Visual acuity, refraction, keratometry, and corneal topography were evaluated preoperatively and postoperatively. Analysis of the difference between groups was performed with the Student t-test. P<0.05 was considered statistically significant. The mean uncorrected visual acuity (UCVA) improved from 0.35±0.13 (range, 0.1-0.6) to 0.78±0.19 (range, 0.5-1) in group I and from 0.26±0.19 (range, 0.1-0.5) to 0.7±0.18 (range, 0.4-1) in group II. The increase in UCVA was statistically significant in both groups (P=0.001, both cases). The mean preoperative and postoperative keratometric astigmatism in group I was 2.0±0.48 D (range, 1.5-2.5 D) and 0.8±0.37 D (range, 0.1-1.4 D), respectively. The decrease in keratometric astigmatism was highly statistically significant in group II (P=0.001). Mean surgically induced astigmatic reduction by vector analysis was 1.47±0.85 D and 2.21±0.97 D in groups I and II, respectively. There were no incision-related complications.
Paired OCCIs were predictable and effective in correcting post-LASIK astigmatism and required no extra surgical skill or expensive instruments. OCCIs are especially useful in eyes with insufficient corneal thickness for LASIK retreatment.
closed_qa
Can passengers' active head tilt decrease the severity of carsickness?
We investigated the effect of the passenger head-tilt strategy on the severity of carsickness in lateral acceleration situations in automobiles. It is well known that the driver is generally less susceptible to carsickness than are the passengers. However, it is also known that the driver tilts his or her head toward the curve center when negotiating a curve, whereas the passenger's head moves in the opposite direction. Therefore, we hypothesized that the head-tilt strategy has the effect of reducing the severity of carsickness. A passenger car was driven on a quasi-oval track with a pylon slalom while the participant sat in the navigator seat. The experiment was terminated when either the participant felt the initial symptoms of motion sickness or the car finished 20 laps. In the natural head-tilt condition, the participants were instructed to sit naturally, to relax, and not to oppose the lateral acceleration intentionally. In the active head-tilt condition, the participants were asked to tilt their heads against the centrifugal acceleration, thus imitating the driver's head tilt. The number of laps achieved in the active condition was significantly greater than that in the natural condition. In addition, the subjective ratings of motion sickness and symptoms in the active condition were significantly lower than those in the natural condition.
We suggest that an active head tilt against centrifugal acceleration reduces the severity of motion sickness.
closed_qa
Duodenal bulb biopsies for diagnosing adult celiac disease: is there an optimal biopsy site?
Recent studies highlight the role of duodenal bulb biopsy in the diagnosis of celiac disease. To determine whether a targeted duodenal bulb biopsy in addition to distal duodenal biopsies is the optimal strategy to identify villous atrophy. Prospective cohort study. Tertiary-care referral center. Seventy-seven patients undergoing clinically indicated EGD with duodenal biopsies were recruited. Of these, 28 had newly diagnosed celiac disease and 49 were controls. At endoscopy, 8 duodenal biopsy specimens were taken: 4 from the second part of the duodenum and 4 quadrantically from the bulb (at the 3-, 6-, 9-, and 12-o'clock positions). Increasing the diagnostic yield and detection of the most severe villous atrophy in celiac disease with the addition of a targeted duodenal bulb biopsy. The most severe degree of villous atrophy was detected when distal duodenal biopsy specimens were taken in addition to a duodenal bulb biopsy specimen from either the 9- or 12-o'clock position (96.4% sensitivity; 95% CI, 79.7%-100%). The difference between the 12-o'clock position biopsy and the 3-o'clock position biopsy in detecting the most severe villous atrophy was 92% (24/26) versus 65% (17/26) (P = .02). Small sample and study performed in a tertiary referral center.
This study demonstrates the patchy appearance of villous atrophy that occurs within the duodenum. A targeted duodenal bulb biopsy from either the 9- or 12-o'clock position in addition to distal duodenal biopsies may improve diagnostic yields by detecting the most severe villous atrophy within the duodenum.
closed_qa
Interval colon cancer in a Lynch syndrome patient under annual colonoscopic surveillance: a case for advanced imaging techniques?
Lynch syndrome confers increased risk for various malignancies, including colorectal cancer. Colonoscopic surveillance programs have led to reduced incidence of colorectal cancer and reduced mortality from colorectal cancer. Colonoscopy every 1-2 years beginning at age 20-25, or 10 years earlier than the first diagnosis of colorectal cancer in a family, with annual colonoscopy after age 40, is the recommended management for mutation carriers. Screening programs have reduced colon cancer mortality, but interval cancers may occur. We describe a 48-year-old woman with Lynch syndrome who was found to have an adenoma with invasive colorectal cancer within one year after a normal colonoscopy.
Our patient illustrates two current concepts about Lynch syndrome: 1) adenomas are the cancer precursor and 2) such adenomas may be "aggressive," in the sense that the adenoma progresses more readily and more rapidly to carcinoma in this setting compared to usual colorectal adenomas. Our patient's resected tumor invaded only into submucosa and all lymph nodes were negative; in that sense, she represents a success for annual colonoscopic surveillance. Still, this case does raise the question of whether advanced imaging techniques are advisable for surveillance colonoscopy in these high-risk patients.
closed_qa
Is prevention of atopic eczema with hydrolyzed formulas cost-effective?
The German Infant Nutritional Intervention (GINI) trial, a prospective, randomized, double-blind intervention, enrolled children with a hereditary risk for atopy. When children were fed certain hydrolyzed formulas for the first 4 months of life, the risk of atopic eczema at age 6 was reduced by 26-45% in per-protocol (PP) analyses and by 8-29% in intention-to-treat (ITT) analyses compared with children fed regular cow's milk formula. The objective was to assess the cost-effectiveness of feeding hydrolyzed formulas. Cost-effectiveness was assessed with a decision tree model programmed in TreeAge. Costs and effects over a 6-yr period were analyzed from the perspective of the German statutory health insurance (SHI) and a societal perspective at a 3% effective discount rate, followed by sensitivity analyses. The extensively hydrolyzed casein formula would be the most cost-saving strategy, with savings of 478 € per child treated in the ITT analysis (95% CI: 12 €; 852 €) and 979 € in the PP analysis (95% CI: 355 €; 1455 €) from a societal perspective. If prevented cases are considered, the partially whey hydrolyzed formula is cost-saving (ITT -5404 €, PP -6358 €). From an SHI perspective, the partially whey hydrolyzed formula is cost-effective, but may also be cost-saving depending on the scenario. An extensively hydrolyzed whey formula also included in the analysis was dominated in all analyses.
For the prevention of AE, two formulas can be cost-effective or even cost-saving. We recommend that SHI should reimburse formula feeding or at least the difference between costs for cow's milk formula and the most cost-effective formula.
closed_qa
Does the African-American-white mortality gap persist after playing professional basketball?
The African-American-white mortality gap for males in the United States is 6 years in favor of whites. Participation in professional sport may moderate this ethnic disparity. The historical cohort of professional basketball players, with nearly equal numbers of African-American and white players, can provide a natural experiment that may control for the classic confounders of income, education, socioeconomic status (SES), and physical factors related to mortality. The objectives of this study are to assess mortality and calculate survival for the overall study population and within ethnicity. Data were combined from several publicly available sources. The cohort was analyzed to compare longevity among all players, and for players stratified by ethnicity, with the general U.S. population. The final dataset included 3366 individuals, of whom 56.0% were African American. Results suggest white players live 18 months longer than their African-American colleagues. African-American players gained 9 years on their respective referent and live longer than white men in the general public. After controlling for covariates, we found that African-American players have a 75% increased risk of death compared with white players, a statistically significant gap (p<.0001, 95% confidence interval 1.41-2.44).
The African-American-white mortality gap for males is largely ameliorated (1.5 years vs. 6.1 years) in professional basketball but still persists.
closed_qa
Can erosions on MRI of the sacroiliac joints be reliably detected in patients with ankylosing spondylitis?
Erosions of the sacroiliac joints (SIJ) on pelvic radiographs of patients with ankylosing spondylitis (AS) are an important feature of the modified New York classification criteria. However, radiographic SIJ erosions are often difficult to identify. Recent studies have shown that erosions can be detected also on magnetic resonance imaging (MRI) of the SIJ early in the disease course before they can be seen on radiography. The goals of this study were to assess the reproducibility of erosion and related features, namely, extended erosion (EE) and backfill (BF) of excavated erosion, in the SIJ using a standardized MRI methodology. Four readers independently assessed T1-weighted and short tau inversion recovery sequence (STIR) images of the SIJ from 30 AS patients and 30 controls (15 patients with non-specific back pain and 15 healthy volunteers) ≤ 45 years old. Erosions, EE, and BF were recorded according to standardized definitions. Reproducibility was assessed by percentage concordance among six possible reader pairs, kappa statistics (erosion as binary variable) and intraclass correlation coefficient (ICC) (erosion as sum score) for all readers jointly. SIJ erosions were detected in all AS patients and six controls by ≥ 2 readers. The median number of SIJ quadrants affected by erosion recorded by four readers in 30 AS patients was 8.6 in the iliac and 2.1 in the sacral joint portion (P<0.0001). For all 60 subjects and for all four readers, the kappa value for erosion was 0.72, 0.73 for EE, and 0.63 for BF. ICC for erosion was 0.79, 0.72 for EE, and 0.55 for BF, respectively. For comparison, the kappa and ICC values for bone marrow edema were 0.61 and 0.93, respectively.
Erosions can be detected on MRI with a degree of reliability comparable with that for bone marrow edema, despite the significant heterogeneity of their appearance on MRI.
closed_qa
Is obesity at individual and national level associated with lower age at menarche?
A unique standardized international data set from adolescent girls in 34 countries in Europe and North America participating in the Health Behaviour in School-aged Children Study (HBSC) is used to investigate the contribution of body mass index (BMI) at individual and country level to cross-national differences in age at menarche. Two independent nationally representative survey data sets from 15-year-olds (n = 27,878, in 34 countries, year = 2005/2006) and 11-year-olds (n = 18,101, in 29 countries, year = 2001/2002) were analyzed. The survey instrument is a self-report questionnaire. Median age at menarche and 95% confidence intervals (CIs) were estimated using Kaplan-Meier analysis. Hierarchical models were used to assess the relationship between BMI and age at menarche (months). "Country-level obesity" was measured by prevalence of overweight/obesity (%) in each country. Country-level median age at menarche ranged between 12 years and 5 months and 13 years and 5 months. Country-level prevalence of overweight among 15-year-old girls ranged from 4% to 28%. Age at menarche was inversely associated with individual BMI (unstandardized regression coefficient beta = -1.01; 95% CI, -1.09 to -.94) and country-level aggregate overweight at age 11 (unstandardized regression coefficient beta = -.25; 95% CI, -.43 to -.08). Individual- and country-level measures of BMI account for 40% of the country-level variance in age at menarche.
The findings add to the evidence that obesity in childhood is a risk factor for early puberty in girls and accounts for much of the cross-national variation in age at menarche. Future HBSC surveys can track this relationship in the wake of the obesity "epidemic."
closed_qa
Should measures of patient experience in primary care be adjusted for case mix?
Uncertainties exist about when and how best to adjust performance measures for case mix. Our aims are to quantify the impact of case-mix adjustment on practice-level scores in a national survey of patient experience, to identify why and when it may be useful to adjust for case mix, and to discuss unresolved policy issues regarding the use of case-mix adjustment in performance measurement in health care. Secondary analysis of the 2009 English General Practice Patient Survey. Responses from 2,163,456 patients registered with 8,267 primary care practices. Linear mixed effects models were used with practice included as a random effect and five case-mix variables (gender, age, race/ethnicity, deprivation, and self-reported health) as fixed effects. Primary outcome was the impact of case-mix adjustment on practice-level means (adjusted minus unadjusted) and changes in practice percentile ranks for questions measuring patient experience in three domains of primary care (access, interpersonal care, and anticipatory care planning) and overall satisfaction with primary care services. Depending on the survey measure selected, case-mix adjustment changed the rank of between 0.4% and 29.8% of practices by more than 10 percentile points. Adjusting for case mix resulted in large increases in score for a small number of practices and small decreases in score for a larger number of practices. Practices with younger patients, more ethnic minority patients and patients living in more socio-economically deprived areas were more likely to gain from case-mix adjustment. Age and race/ethnicity were the most influential adjustors.
While its effect is modest for most practices, case-mix adjustment corrects significant underestimation of scores for a small proportion of practices serving vulnerable patients and may reduce the risk that providers would 'cream-skim' by not enrolling patients from vulnerable socio-demographic groups.
closed_qa
Should orthotopic heart transplantation using marginal donors be limited to higher volume centers?
This study examined whether institutional volume impacts outcomes after orthotopic heart transplantation (OHT) utilizing marginal donors. Adult patients undergoing OHT with the use of marginal donors between 2000 and 2010 were identified in the United Network for Organ Sharing database. A previously derived and validated donor risk score (range, 1 to 15) was used to define marginal donors as those in the 90th percentile of risk (score≥7). Patients were stratified into equal-size tertiles based on overall institutional OHT volume. Posttransplant outcomes were compared between these center cohorts. A total of 3,176 OHTs utilizing marginal donors were identified. In Cox regression analysis, recipients undergoing OHT at low-volume centers were at significantly increased risk of 30-day (hazard ratio 1.82 [1.31 to 2.54], p<0.001), 1-year (hazard ratio 1.40 [1.14 to 1.73], p=0.002), and 5-year posttransplant mortality (hazard ratio 1.29 [1.10 to 1.52], p=0.02). These findings persisted after adjusting for recipient risk, differences in donor risk score, and year of transplantation (each p<0.05). In Kaplan-Meier analysis, there was a similar trend of decreasing 1-year survival with decreasing center volume: high (86.0%), intermediate (85.7%), and low (81.2%; log rank p=0.003). Drug-treated rejection within the first post-OHT year was more common in low-volume versus high-volume centers (34.3% versus 24.2%, p<0.001). At an overall mean follow-up of 3.4±2.9 years, low-volume centers also had higher incidences of death due to malignancy (2.8% versus 1.3%, p=0.01) or infection (6.2% versus 4.1%, p=0.02).
Consolidating the use of marginal donors to higher volume centers may be prudent in improving post-OHT outcomes in this higher risk patient subset.
closed_qa
Are thromboembolic and bleeding complications a drawback for composite aortic root replacement?
Valve-preserving aortic root reconstruction is being performed with increasing frequency. Independent of durability concerns, enthusiasm for retaining the native valve is often championed on the presumption that composite graft replacement of the aorta will be complicated by thromboembolism and bleeding. Our goal in this late follow-up study is to determine if thromboembolism or bleeding, or both, are indeed problematic after composite aortic root replacement. Between 1995 and 2011, 306 patients (mean age, 56±14 years) underwent composite graft replacement of the aorta. St. Jude mechanical valve conduits (St. Jude Medical, St Paul, MN) were used in 242 patients, and 64 received a biologic conduit. Long-term postoperative follow-up (mean, 56 months; range, 1 to 97 months) was performed through our Aortic Database, supplemented by patient interviews and use of the Social Security Death Index. Hospital mortality was 2.9% overall and 1.4% in the last 8 years. Kaplan-Meier curves showed freedom (±standard deviation) from bleeding, stroke, and distal embolism as 94.3%±1.7% at 5 years and 91.3%±2.4% at 10 years. Survival was 93.5%±1.8% at 5 years and 80.9%±4.6% at 10 years, which was not statistically different from that for an age- and sex-matched population in Connecticut. Freedom from reoperation of the aortic root was 99% at 10 years.
Patients had excellent survival and few thromboembolic and bleeding complications after composite aortic root replacement. These data supporting minimal morbidity in the setting of well-established durability should be used to put alternative procedures, such as valve-preserving aortic root reconstruction, into context.
closed_qa
Are treatments for vasovagal syncope effective?
Therapies used to treat vaso-vagal syncope (VVS) recurrence have not been proven effective in single studies. Comprehensive search of PubMed, EMBASE and Cochrane Central databases of published trials was done. Randomized or non-randomized studies, comparing the intervention of interest to control group(s), with the endpoint of spontaneous recurrence or syncope on head-up tilt test, were included. Data were extracted on an intention-to-treat basis. Study heterogeneity was analyzed by Cochran's Q statistics. A random-effect analysis was used. α-adrenergic agonists were found effective (n=400, OR 0.19, CI 0.06-0.62, p<0.05) in preventing VVS recurrence. β-blockers were not found to be effective when only randomized studies comparing β-blockers to non-pharmacologic agents were assessed (9 studies, n=583, OR 0.48, CI 0.22-1.04, p=0.06). Tilt-training had no effect when only randomized studies were considered (4 studies, n=298, OR 0.47, CI 0.21-1.05, p=0.07). Selective serotonin reuptake inhibitors were found effective (n=131, OR 0.28, CI 0.10-0.74, p<0.05), though the analysis contained only 2 studies. Pacemakers were found effective in preventing syncope recurrence when all studies were analyzed (n=463, OR 0.13, CI 0.05-0.36, p<0.05). However, studies comparing active pacemaker to sensing mode only did not show benefit (3 studies, n=162, OR 0.45, CI 0.09-2.14, p=0.32).
This meta-analysis highlights the totality of evidence for medications commonly used to treat VVS, and the need for larger, double-blind, placebo-controlled trials with longer follow-up.
closed_qa
Are poor health behaviours in anxious and depressed cardiac patients explained by sociodemographic factors?
While there is evidence of poor health behaviours in anxious and depressed cardiac patients, it is possible that sociodemographic factors explain these associations. Few previous studies have adequately controlled for confounders. The present study investigated health behaviours in anxious and depressed cardiac patients, while accounting for sociodemographic confounders. A consecutive sample of 275 patients admitted to hospital after acute myocardial infarction (32%) or for coronary bypass surgery (40%) or percutaneous coronary intervention (28%) was interviewed six weeks after hospital discharge. Anxiety and depression were assessed using the Hospital Anxiety and Depression Scale (HADS). Smoking, physical activity, alcohol intake and dietary fat intake were assessed by self-report. Backward stepwise logistic regression was used to identify the factors independently associated with anxiety and depression. In total, 41 patients (15.2%) were 'depressed' (HADS-D ≥8) while 68 (25.2%) were 'anxious' (HADS-A ≥8). Depressed patients reported higher rates of smoking (χ² = 4.47, p = 0.034), lower physical activity (F = 8.63, p < 0.004) and higher dietary fat intake (F = 7.22, p = 0.008) than non-depressed patients. Anxious patients reported higher smoking rates (χ² = 5.70, p = 0.024) and dietary fat intake (F = 7.71, p = 0.006) than non-anxious patients. In multivariate analyses, an association with depression was retained for both diet and physical activity, and an association with anxiety was retained for diet. Low social support and younger age were significant confounders with depression and anxiety, respectively.
While the high smoking rates evidenced in anxious and depressed patients were explained by sociodemographic factors, their poor diet and low physical activity (depressed patients only) were independent of these factors. Given the impact of lifestyle modification on survival after a cardiac event, anxious and depressed patients should be a priority for cardiac rehabilitation and other secondary prevention programmes.
closed_qa
Chemotherapy-related thrombocytosis: does it increase the risk of thromboembolism?
Chemotherapy increases the risk of thromboembolism in patients with cancer. Although thrombocytopenia is a known side effect of chemotherapy, reactive thrombocytosis related to chemotherapy is uncommonly reported. The present study aimed to determine the incidence of gemcitabine-related thrombocytosis and the associated risk of thromboembolism. Medical records of 250 consecutive patients with a malignant disease who received gemcitabine-based therapy were reviewed. A multivariate analysis was done to determine factors associated with thromboembolism. A total of 220 eligible patients with a median age of 63 years (range 26-83) were identified. Of these 220 patients, 95% had advanced malignancy and 59% had received prior chemotherapy. A total of 69% of patients received a platinum combination. In all, 46% of patients experienced thrombocytosis following chemotherapy, with a median platelet count of 632 × 10⁹/l (range 457-1,385). Twenty-three of the 220 patients experienced a vascular event within 6 weeks of treatment. Eleven patients with thrombocytosis experienced a vascular event compared with 10 patients without thrombocytosis (not significant). On multivariate analysis, leukocytosis (odds ratio 5.8, 95% confidence interval 2.1-15.8) and comorbid illnesses (odds ratio 4.1, 95% confidence interval 1.4-12.6) were correlated with thromboembolism.
Although gemcitabine-based therapy has been associated with an increased incidence of thrombocytosis, it does not increase the risk of thromboembolism in cancer patients. Leukocytosis and comorbid illnesses do increase the risk of thromboembolism.
closed_qa
Does risk-based coagulation screening predict intraventricular haemorrhage in extreme premature infants?
Intraventricular haemorrhage (IVH) continues to be a significant contributor to neonatal morbidity and mortality, especially in the extremely premature population (<26 weeks). The aims of the study were to test whether risk-based coagulopathy screening could identify infants at risk of severe IVH/mortality, and whether preterm infants born at less than 26 weeks of gestation who received early (within the first 48 h) fresh frozen plasma (FFP) had a lower incidence of IVH than those who did not. A chart review of preterm infants born at less than 26 weeks' gestation was conducted. The study compared two cohorts of infants who had either 'early' risk-based coagulopathy screening (within the first 48 h, n = 47) or 'late' screening (n = 55). Baseline and clinical characteristics of the two cohorts were similar. 'Early' coagulopathy screening predicted infants at risk of severe IVH [relative risk (RR) 2.59, 95% confidence interval (CI) 1.18-5.67, P<0.01] but not mortality (RR 1.2, 95% CI 0.79-1.94). FFP was administered significantly more often in the 'early' screened cohort (P<0.001); however, the incidence of IVH was similar between those who received early FFP administration and those who did not.
'Early' risk-based coagulopathy screening may identify preterm infants at risk of severe IVH; however, the study failed to show any benefit of early treatment of a coagulopathy with FFP in a small but high-risk population.
closed_qa
Are names of children with attention deficit hyperactivity disorder more 'hyperactive'?
The role of the meaning of given names has been noted in psychotherapy as well as in everyday life. This study aimed to investigate the possible association between the nature of given names of children and attention deficit hyperactivity disorder (ADHD) diagnosis. A total of 134 given names of children and adolescent patients diagnosed as having ADHD were compared with those of an age- and gender-matched randomly chosen control group from the general population. The first names of the two cohorts were compared with regard to the following: the literal meaning of their names, whether the name constitutes a verb, the prevalence of each name and their length (number of syllables). The meanings of the first names of children and adolescents with ADHD combined type were rated by referees as expressing significantly more activity, and these names contained fewer syllables than the names of controls. In addition, the prevalence of their names was significantly lower than that of names used in the general population. All findings remained significant following Bonferroni adjustment.
Our findings demonstrate an intriguing relationship between children's given names and ADHD diagnosis. Given names may serve as a possible predictor of later diagnosis of ADHD. Clinicians should be more attentive to given names in the context of child psychiatric evaluation and therapy.
closed_qa
Tricuspid valve repair: is ring annuloplasty superior?
Tricuspid regurgitation (TR) secondary to left heart disease is the most common aetiology of tricuspid valve (TV) insufficiency. Valve annuloplasty is the primary treatment for TV insufficiency. Several studies have shown the superiority of annuloplasty with a prosthetic ring over other repair techniques. We reviewed our experience with different surgical techniques for the treatment of acquired TV disease focusing on long-term survival and incidence of reoperation. A retrospective analysis of 717 consecutive patients who underwent TV surgery between 1975 and 2009 with either a ring annuloplasty [Group R: N = 433 (60%)] or a De Vega suture annuloplasty [Group NR: no ring; N = 255 (36%)]. Twenty-nine (4%) patients underwent other types of TV repair. A ring annuloplasty was performed predominantly in the late study period of 2000-09. TV aetiology was functional in 67% (479/717) of the patients. Ninety-one percent of the patients (n = 649) underwent concomitant coronary artery bypass grafting and/or mitral/aortic valve surgery. Patients who received a ring annuloplasty were older (67 ± 13 vs 60 ± 13 years; P<0.001). Overall 30-day mortality was 13.8% (n = 95) [Group R: n = 55 (12.7%) and Group NR: n = 40 (15.7%)]. Ten-year actuarial survival after TV repair with either the De Vega suture or ring annuloplasty was 39 ± 3% and 46 ± 7%, respectively (P = 0.01). Twenty-eight (4%) patients required a TV reoperation after 5.9 ± 5.1 years. Freedom from TV reoperation 10 years after repair with a De Vega annuloplasty was 87.9 ± 3% compared with 98.4 ± 1% after the ring annuloplasty (P = 0.034).
Patients who require TV surgery either as an isolated or a combined procedure constitute a high-risk group. The long-term survival is poor. Tricuspid valve repair with a ring annuloplasty is associated with improved survival and a lower reoperation rate than that with a suture annuloplasty.
closed_qa
Is lidocaine Bier's block safe?
To assess the safety profile of lidocaine Bier's block when compared with that of prilocaine. A retrospective audit of patients undergoing Bier's block using 0.5% lidocaine during a 27-month period (April 2008-June 2010) at the Royal United Hospital Bath emergency department. 416 patients with sufficient data were included in the study; 360 women and 56 men. The mean patient age was 65 years. Complications were reported in 39 cases; transient hypotension/vasovagal episodes and transient mild bradycardia were most frequent. No patients required any medical intervention. There was no occurrence of anaphylaxis, convulsion, hypotensive episodes requiring medical intervention, collapse or death.
No clinically significant morbidity or mortality as a consequence of lidocaine Bier's block was demonstrated in this audit.
closed_qa
Are there differences in injury mortality among refugees and immigrants compared with native-born?
The authors studied injury mortality in Denmark among refugees and immigrants compared with that among native Danes. A register-based, historical prospective cohort design. All refugees (n=29,139) and family-reunited immigrants (n=27,134) who between 1 January 1993 and 31 December 1999 received residence permission were included and matched 1:4 on age and sex with native Danes. Civil registration numbers were cross-linked to the Register of Causes of Death, and fatalities due to unintentional and intentional injuries were identified based on ICD-10 diagnosis. Sex-specific mortality ratios were estimated by migrant status and region of birth, adjusting for age and income and using a Cox regression model after a median follow-up of 11-12 years. Compared with native Danes, both female (RR=0.44; 95% CI 0.23 to 0.83) and male (RR=0.40; 95% CI 0.29 to 0.56) refugees as well as female (RR=0.40; 95% CI 0.21 to 0.76) and male (RR=0.22; 95% CI 0.12 to 0.42) immigrants had significantly lower mortality from unintentional injuries. Suicide rates were significantly lower for male refugees (RR=0.38; 95% CI 0.24 to 0.61) and male immigrants (RR=0.24; 95% CI 0.10 to 0.59), whereas their female counterparts showed no significant differences. Only immigrant women had a significantly higher homicide rate (RR=3.09; 95% CI 1.11 to 8.60) compared with native Danes.
Overall results were advantageous to migrant groups. Research efforts should concentrate on investigating protective factors among migrants, which may benefit injury prevention in the majority population.
closed_qa
Is the diagnosis of ADHD influenced by time of entry to school?
The authors examined the proposed immaturity hypothesis, which suggests that younger children may have developmental immaturity and not ADHD, using data from a large, clinically referred population of individuals with and without ADHD. The sample consisted of individuals with and without an ADHD diagnosis, ascertained from ongoing studies in our laboratory, born in August (Younger Cohort N = 562) and born in September (Older Cohort N = 529). The authors compared studywide diagnosis rates of ADHD, ADHD familiality patterns, ADHD symptoms, psychiatric comorbidity, and functional impairments between the two cohorts. Studywide rates of ADHD diagnosis, ADHD-associated symptoms, ADHD-associated impairments, ADHD-associated comorbid disorders, and familiality were similar in the two age cohorts.
Results showed that ADHD-associated familial, clinical, and functional correlates are similar irrespective of age at entry to school, indicating that when ADHD symptoms are present, a diagnosis of ADHD should be considered rather than attributing these symptoms to developmental immaturity.
closed_qa
Dysfunctional cognitions and their emotional, behavioral, and functional correlates in adults with attention deficit hyperactivity disorder (ADHD): is the cognitive-behavioral model valid?
To investigate the presence of dysfunctional cognitions in adults with ADHD and to determine whether these cognitions are associated with emotional symptoms, maladaptive coping, and functional impairment, as predicted by the cognitive-behavioral model. A total of 35 adult participants with ADHD, 20 nonclinical controls, and 20 non-ADHD clinical controls were assessed with measures of ADHD symptoms, dysfunctional cognitions, depression and anxiety symptoms, coping strategies, and quality of life. The ADHD group showed elevated scores on dysfunctional cognitions relative to the nonclinical control group and scores comparable with those of the clinical control group. Dysfunctional cognitions were strongly associated with emotional symptoms. The ADHD group also showed elevated scores on maladaptive coping strategies of the escape-avoidance type. Life impairment was satisfactorily predicted when ADHD symptoms, dysfunctional cognitions, and emotional symptoms were fitted into a regression model.
The cognitive-behavioral therapy model appears to be a valid complementary model for understanding emotional and life impairment in adults with ADHD.
closed_qa
Social adversity, stress, and alcohol problems: Are racial/ethnic minorities and the poor more vulnerable?
Experiences of racial/ethnic bias and unfair treatment are risk factors for alcohol problems, and population differences in exposure to these social adversities (i.e., differential exposure) may contribute to alcohol-related disparities. Differential vulnerability is another plausible mechanism underlying health disparities, yet few studies have examined whether populations differ in their vulnerability to the effects of social adversity on psychological stress and the effects of psychological stress on alcohol problems. Data from the 2005 U.S. National Alcohol Survey (N = 4,080 adult drinkers) were analyzed using structural equation modeling to assess an overall model of pathways linking social adversity, depressive symptoms, heavy drinking, and alcohol dependence. Multiple group analyses were conducted to assess differences in the model's relationships among Blacks versus Whites, Hispanics versus Whites, and the poor (income below the federal poverty line) versus non-poor (income above the poverty line). The overall model explained 48% of the variance in alcohol dependence and revealed significant pathways between social adversity and alcohol dependence involving depressive symptoms and heavy drinking. The effects of social adversity and depressive symptoms were no different among Blacks and Hispanics compared with Whites. However, the poor (vs. non-poor) showed stronger associations between unfair treatment and depressive symptoms and between depressive symptoms and heavy drinking.
Contrary to some prior studies, these findings suggest that racial disparities in alcohol problems may be more a function of racial/ethnic minorities' greater exposure, rather than vulnerability, to chronic stressors such as social adversity. However, observed differences between the poor and non-poor imply that differential vulnerability contributes to socioeconomic disparities in alcohol problems. Efforts to reduce both differential exposure and vulnerability might help to mitigate these disparities.
closed_qa
Do substance use norms and perceived drug availability mediate sexual orientation differences in patterns of substance use?
Illicit drug and heavy alcohol use is more common among sexual minorities compared with heterosexuals. This difference has sometimes been attributed to more tolerant substance use norms within the gay community, although evidence is sparse. The current study investigated the role of perceived drug availability and tolerant injunctive norms in mediating the linkage between minority sexual orientation status and higher rates of prior-year substance use. We used data from the second California Quality of Life Survey (Cal-QOL II), a followback telephone survey in 2008-2009 of individuals first interviewed in the population-based 2007 California Health Interview Survey. The sample comprised 2,671 individuals, oversampled for minority sexual orientation. Respondents were administered a structured interview assessing past-year alcohol and illicit drug use, perceptions of perceived illicit drug availability, and injunctive norms concerning illicit drug and heavier alcohol use. We used structural equation modeling methods to test a mediational model linking sexual orientation and substance use behaviors via perceptions of drug availability and social norms pertaining to substance use. Compared with heterosexual individuals, sexual minorities reported higher levels of substance use, perceived drug availability, and tolerant social norms. A successfully fitting model suggests that much of the association between minority sexual orientation and substance use is mediated by these sexual orientation-related differences in drug availability perceptions and tolerant norms for substance use.
Social environmental context, including subcultural norms and perceived drug availability, is an important factor influencing substance use among sexual minorities and should be addressed in community interventions.
closed_qa
Rapid repeat pregnancy in adolescents: do immediate postpartum contraceptive implants make a difference?
The purpose of this study was to determine contraceptive continuation and repeat pregnancy rates in adolescents who are offered immediate postpartum etonogestrel implant (IPI) insertion. Participants in an adolescent prenatal-postnatal program were enrolled in a prospective observational study of IPI insertion (IPI group, 171) vs other methods (control group, 225). Contraceptive continuation and repeat pregnancies were determined. Implant continuation at 6 months was 96.9% (156/161 participants); at 12 months, the continuation rate was 86.3% (132/153 participants). At 6 months, 9.9% of the control participants were pregnant (21/213); there were no IPI pregnancies. By 12 months, 18.6% of control participants (38/204) experienced pregnancy vs 2.6% of IPI recipients (4/153; relative risk, 5.0; 95% confidence interval [CI], 1.9-12.7). Repeat pregnancy at 12 months was predicted by not receiving IPI insertion (odds ratio, 8.0; 95% CI, 2.8-23.0) and having >1 child (odds ratio, 2.1; 95% CI, 1.1-4.3; P = .03).
IPI placement in adolescents has excellent continuation 1 year after delivery; rapid repeat pregnancy is significantly decreased compared with control participants.
closed_qa
Are postoperative complications more common following colon and rectal surgery in patients with chronic kidney disease?
Patients with CKD were identified within our database. Patients with an eGFR of 15-59 ml/min (CKD Stages 3 and 4) formed the CKD group and were compared with American Society of Anesthesiology (ASA) score-matched controls with an eGFR of ≥ 60 ml/min. Assessments included demographics, comorbidity, ASA score, operative details and 30-day postoperative outcome. Seventy patients in the CKD group were matched with 70 controls. ASA scores and length of stay did not differ significantly between the groups. CKD patients were older (mean age 76.5 years vs 71.1 years; P<0.001) and had a lower mean body mass index (24.3 vs 28.2; P<0.001) compared with controls. Compared with the CKD group, the mean operation time was longer in the control group (181.5 min vs 151.6 min; P = 0.02) and the estimated blood loss was greater (232 ml vs 165 ml; P = 0.004). Postoperative infection was more common in the CKD group (60%vs 40%; P = 0.01). There were no significant differences in reoperation rates, 30-day readmissions or the incidence of acute renal failure (ARF).
Patients with CKD Stages 3 and 4 had a higher incidence of postoperative infections than matched controls after colorectal surgery. ARF developed in 18.6% of patients. Preoperative optimization should include adequate hydration and assessment of potentially nephrotoxic substances for bowel preparation, preoperative antibiotics and pain control.
closed_qa
Impact of trauma center designation on outcomes: is there a difference between Level I and Level II trauma centers?
Within organized trauma systems, both Level I and Level II trauma centers are expected to have the resources to treat patients with major multisystem trauma. The evidence supporting separate designations for Level I and Level II trauma centers is inconclusive. The objective of this study was to compare mortality and complications for injured patients admitted to Level I and Level II trauma centers. Using data from the Pennsylvania Trauma Outcomes Study registry, we performed a retrospective observational study of 208,866 patients admitted to 28 Level I and Level II trauma centers between 2000 and 2009. Regression modeling was used to estimate the association between patient outcomes and trauma center designation, after controlling for injury severity, mechanism of injury, transfer status, and physiology. Patients admitted to Level I trauma centers had a 15% lower odds of mortality (adjusted odds ratio [adj OR] 0.85; 95% CI 0.72 to 0.99) and a 35% increased odds of complications (adj OR 1.37; 95% CI 1.04 to 1.79). The survival benefit associated with admission to Level I centers was strongest in patients with very severe injuries (Injury Severity Score [ISS]≥ 25; adj OR 0.78; 95% CI 0.64 to 0.95). Less severely injured patients with an ISS<9 (adj OR 0.91; 95% CI 0.64 to 1.30) and with an ISS between 9 and 15 (adj OR 0.98; 95% CI 0.81 to 1.18) had similar risks of mortality in Level I and Level II trauma centers.
Severely injured patients admitted to Level I trauma centers have a lower risk of mortality compared with patients admitted to Level II centers. These findings support the continuation of a 2-tiered designation system for trauma.
closed_qa
Can hospitals "game the system" by avoiding high-risk patients?
It has been suggested that implementation of quality-improvement benchmarking programs can lead to risk-avoidance behaviors in some physicians and hospitals in an attempt to improve their rankings, potentially denying patients needed treatment. We hypothesize that avoidance of high-risk patients will not change risk-adjusted rankings. We conducted a simulation analysis of 6 complex operations in the Nationwide Inpatient Sample, including abdominal aortic aneurysm repair, aortic valve replacement, coronary artery bypass grafting, percutaneous coronary intervention, esophagectomy, and pancreatic resection. Primary outcomes included in-hospital mortality. Hospitals were ranked into quintiles based on observed-to-expected (O/E) mortality ratios, with their expected mortalities calculated based on models generated from the previous 3 years. Half of the hospitals were then randomly selected to undergo risk avoidance by avoiding 25% of patients with higher than median risks (ie, Charlson, Elixhauser, age, minority, or uninsured status). Their new O/E ratios and hospital-rank categories were compared with their original values. A total of 2,235,298 patients were analyzed, with an overall observed mortality rate of 1.9%. Median change in O/E ratios across all simulations was zero, and O/E ratios did not change in 97.5% to 99.3% of the hospitals, depending on the risk definitions. Additionally, 70.5% to 98.0% of hospital rankings remained unchanged, 1.3% to 13.1% of hospital rankings improved, and 0.7% to 14.3% of hospital rankings worsened after risk avoidance.
Risk-adjusted rankings of hospitals likely cannot be changed by simply avoiding high-risk patients. In the minority of scenarios in which risk-adjusted rankings changed, they were as likely to improve as worsen after risk avoidance.
closed_qa
Are high-frequency (600 Hz) oscillations in human somatosensory evoked potentials due to phase-resetting phenomena?
Median nerve somatosensory evoked potentials (SEP) contain a brief oscillatory wavelet burst at about 600 Hz (σ-burst) superimposed on the initial cortical component (N20). While invasive single-cell recordings suggested that this burst is generated by increased neuronal spiking activity in area 3b, recent non-invasive scalp recordings could not reveal concomitant single-trial added-activity, suggesting that the SEP burst might instead be generated by phase-reset of ongoing high-frequency EEG. Here, a statistical model and exemplary data are presented reconciling these seemingly contradictory results. A statistical model defined the conditions required to detect added-activity in a set of single-trial SEP. Its predictions were tested by analyzing human single-trial scalp SEP recorded with custom-made low-noise amplifiers. The noise level in previous studies did not allow to detect single-trial added-activity in the period concomitant with the trial-averaged σ-burst. In contrast, optimized low-noise recordings do reveal added-activity in a set of single-trials.
The experimental noise level is the decisive factor determining the detectability of added-activity in single trials. A low-noise experiment provided direct evidence that the SEP σ-burst is at least partly generated by added-activity, matching earlier invasive single-cell recordings.
closed_qa
Patient goals after incontinence procedures: does the single-incision sling satisfy them?
This study was undertaken to describe short-term postoperative achievement of subjective preoperative goals for single-incision MiniArc slings, in comparison with tension-free vaginal tape (TVT). Patients submitted to mid-urethral sling (TVT and MiniArc) procedures for stress urinary incontinence (SUI) in two centers were included in this prospective study. Before surgery, the patients completed a preoperative open-ended questionnaire, in which they described their personal outcomes goals for SUI surgery and the degree of severity of their symptoms. At the first postoperative check, they were asked to assess the degree to which their goals had been met and the degree of postoperative incontinence symptoms; their grade of satisfaction was evaluated with IIQ-7, UDI-6 and a 0-10 visual analog scale. One hundred and eight patients (TVT n=51, MiniArc n=57) were included in this study. Incontinence symptom relief and improvement of quality of life were the most commonly described preoperative goals. Six to eight weeks after surgery, 47 patients (92.1%) after TVT and 53 (92.9%) women after single-incision slings were objectively cured (P=1). After surgery, more than 90% of the patients in both groups achieved their preoperative goals. Symptom scores improved significantly and were comparable in both groups.
Our results show that self-reported achievement of preoperative goals in patients who underwent single-incision sling placement is comparable at the first follow-up with that in patients who underwent the classic mid-urethral sling.
closed_qa
Improving stress echocardiography accuracy for detecting left circumflex artery stenosis: a new echocardiographic sign?
The accuracy and reproducibility of stress echocardiography (SE) for the detection of coronary artery lesions requires improvement, particularly in the left circumflex artery (LCx). To evaluate the feasibility and diagnostic value of a new sign: Rise of the Apical lateral wall and/or Horizontal displacement of the Apex toward the septum ("RA-HA") in apical echocardiographic views. Consecutive patients with normal left ventricular function at rest, positive SE and an indication for coronary angiography were included. SEs were analysed blindly by three independent cardiologists: two seniors (S1 and S2) and one junior (J). Of 81 patients, 58 had an exercise SE and 23 had a dobutamine SE. Significant coronary stenosis was found in 59 of 77 patients who underwent coronary angiography (76.6%). Interobserver reproducibility for the presence of RA-HA was very good between S1 and S2 (κ = 0.86), and good between S1 and J (0.67) and S2 and J (0.70). The sensitivity, specificity and positive and negative predictive values of RA-HA for the detection of significant coronary artery stenosis were, respectively, 39-41%, 83-89%, 88-92% and 29-31% for S1/S2; and 29%, 83%, 85% and 26% for J. To predict LCx stenosis (single or multivessel): 67-70%, 89%, 80-81% and 80-82% for S1/S2, respectively, and 50%, 89%, 75% and 74% for J.
With a short learning curve, RA-HA is easily diagnosed with a very good interobserver reproducibility. It has high specificity and PPV for the detection of a coronary artery stenosis, particularly in the LCx artery, during exercise or dobutamine SE.
closed_qa
Should computed tomography coronary angiography be aborted when the calcium score exceeds a certain threshold in patients with chest pain?
There is ongoing debate about whether a computed tomography coronary angiography (CTCA) should be aborted when the calcium score (CS) exceeds a certain threshold in patients with chest pain. The aim of this study was to discover whether specific "cutpoints" regarding coronary artery CS could be determined to predict severe coronary stenoses assessed by CTCA, thus identifying patients amenable to an invasive diagnostic approach. A total of 294 consecutive patients with chest pain of uncertain cause who were referred for non-invasive diagnostic CTCA were included. Subjects underwent Agatston CS assessment and CTCA using current 64-slice technology. Severe coronary stenoses were noted in 75 of 294 (25.1%) patients on CTCA. A very high prevalence of severe coronary stenoses was found in patients with CS ≥ 400 (87.0%). The CS had an area under the ROC curve of 0.86 for predicting severe coronary stenoses on CTCA. The best discriminant cut-off point was CS ≥ 400 (sensitivity of 55.3%, specificity of 93.5%, positive predictive value of 85.8%, negative predictive value of 84.0%). Multivariable logistic regression analysis controlling for traditional risk factors showed that CS ≥ 400 remained an independent predictor of severe coronary stenoses on CTCA (OR 14.553, 95% confidence interval 4.043 to 52.384, p<0.001).
CS can be used as a "gatekeeper" to CTCA in patients with chest pain. Due to the very high prevalence of severe coronary stenoses in patients with CS ≥ 400, further evaluation with CTCA is not warranted as these patients should be referred to invasive coronary angiography, avoiding the repeated exposure to ionizing radiation and iodinated contrast.
closed_qa
Are there symptom differences in patients with coronary artery disease presenting to the ED ultimately diagnosed with or without ACS?
Symptoms are compared among patients with coronary artery disease (CAD) admitted to the emergency department with or without acute coronary syndrome (ACS). Sex and age are also assessed. A secondary analysis from the PROMOTION (Patient Response tO Myocardial Infarction fOllowing a Teaching Intervention Offered by Nurses) trial, a multicenter randomized controlled trial, was conducted. Of 3522 patients with CAD, at 2 years, 565 (16%) presented to the emergency department, 234 (41%) with non-ACS and 331 (59%) with ACS. Shortness of breath (33% vs 25%, P = .028) or dizziness (11% vs 3%, P = .001) were more common in non-ACS. Chest pain (65% vs 77%, P = .002) or arm pain (9% vs 21%, P = .001) were more common in ACS. In men without ACS, dizziness was more common (11% vs 2%; P = .001). Men with ACS were more likely to have chest pain (78% vs 64%; P = .003); both men and women with ACS more often had arm pain (men, 19% vs 10% [P = .019]; women, 26% vs 13% [P = .023]). In multivariate analysis, patients with shortness of breath (odds ratio [OR], 0.617 [confidence interval [CI], 0.410-0.929]; P = .021) or dizziness (OR, 0.311 [CI, 0.136-0.708]; P = .005) were more likely to have non-ACS. Patients with prior percutaneous coronary intervention (OR, 1.592 [CI, 1.087-2.332]; P = .017), chest pain (OR, 1.579 [CI, 1.051-2.375]; P = .028), or arm pain (OR, 1.751 [CI, 1.013-3.025]; P<.042) were more likely to have ACS.
In patients with CAD, shortness of breath and dizziness are more common in non-ACS, whereas prior percutaneous coronary intervention and chest or arm pain are important factors to include during ACS triage.
closed_qa
Does vascular endothelial growth factor participate in uterine myoma growth stimulation?
Peptide growth factors play a role in the rebuilding of extracellular matrix in the course of leiomyoma growth, and exert a regulatory effect on the cell only when they bind with a specific membrane receptor and transmit a signal into the cell. A high content of certain peptide growth factors and their receptors in leiomyoma suggests that hyperstimulation of cells takes place in the course of tumour growth. A combined action of various peptide growth factors causes an amplification of signalling pathways in cells, inducing gene expression of proteins responsible for cell division and changes of metabolism. We therefore decided to evaluate the amounts and expression of VEGFs, their receptors and the corresponding mRNA levels. Studies were performed on human myometrium and uterine leiomyomas of various weights (small: i.e. less than 10 g, and large: i.e. more than 100 g). Expression and content of VEGF-A, VEGF-D and VEGF R-1, R-2 were analysed with Western blot and ELISA methods, respectively. The RT-PCR method was used to determine VEGF mRNA levels. Our immunoblotting studies and immunoenzymatic assays, as well as the RT-PCR technique, did not detect significant differences in the expression of VEGFs and their receptors between control myometrium and uterine leiomyomas.
The increase in the amounts of some peptide growth factors, especially FGFs and IGF-I, in large leiomyomas without any change in VEGF content implies that the proportion of VEGF relative to the other growth factors decreases. Stimulation of extracellular matrix formation therefore seems stronger than stimulation of angiogenesis during myoma growth.
closed_qa
Is urolithiasis in children associated with obesity or malnutrition?
Although it is known that obesity predisposes to urolithiasis, a tendency toward malnutrition in children with urolithiasis, owing to recurrent urinary infections and abdominal pain, is also plausible. In this study, we aimed to determine the nutritional status of infants and children with urolithiasis, and to observe whether obesity or malnutrition is more prevalent in that population. One hundred eighty-seven children aged 4 months to 17 years (mean, 4.9 ± 4.4 years) with urolithiasis, and 278 age- and sex-matched children without any chronic diseases were included. Anthropometric evaluations, including weight and height standard deviation score (SDS), body mass index, and triceps and subscapular skinfold thickness (SFT), were performed. The mean weight SDS of the patients was significantly lower than that of the control subjects (P<.0001). The malnutrition rate was significantly higher in the patients with urolithiasis when evaluated according to weight SDS and percentiles of body mass index and SFT. When the age factor was taken into account, the percentage of malnutrition, determined by the percentiles of triceps and subscapular SFT measurements, was found to be higher in children younger than 2 years. Short stature was more prevalent in older children.
Malnutrition among children with urolithiasis is not as rare as thought previously. A careful anthropometric evaluation should be included in the clinical assessment of those children.
closed_qa
Is myocutaneous flap alone sufficient for reconstruction of chest wall osteoradionecrosis?
This study was carried out to determine whether the myocutaneous flap alone is sufficient to reconstruct a chest wall defect after osteoradionecrosis and provide satisfactory stability to the chest wall. The study involved five patients who had received post-mastectomy radiotherapy as a treatment for breast cancer. Excision of the ulcer and all the necrotic ribs, with preservation of the parietal pleura and reconstruction with the latissimus dorsi flap, was done without the use of either an artificial prosthesis or autologous rib to reconstruct the chest wall defect. Clinical and radiological follow-up showed no respiratory impairment or pleural complications.
The use of myocutaneous flap in patients with chest wall defect following osteoradionecrosis is satisfactory to cover the chest wall defect and provide satisfactory stability to the chest wall.
closed_qa
Does abdominoplasty have a positive influence on quality of life, self-esteem, and emotional stability?
In a previous prospective study, the authors evaluated the quality of life in patients undergoing aesthetic surgery. In this survey, the authors split up the operative indication and analyzed quality of life, self-esteem, and emotional stability after abdominoplasty alone. Sixty-three patients participated in the study. The testing instrument consisted of a self-developed questionnaire to collect demographic and socioeconomic data and a postoperative complication questionnaire developed especially for abdominoplasties. In addition, a standardized self-assessment test on satisfaction and quality of life (Questions on Life Satisfaction), the Rosenberg Self -Esteem Questionnaire, and the Freiburg Personality Inventory were used. Significantly increasing values in some items of the standardized self-assessment test on satisfaction and quality of life were found: sum scores of the General Life Satisfaction showed a significant improvement (p = 0.004) and the scores of the items housing/living conditions (p = 0.000) and family life/children (p = 0.000). Within the Satisfaction with Health module, a significant improvement in the items mobility (p = 0.02) and independence from assistance (p = 0.01) was found. Values in the module Satisfaction with Appearance (Body Image) increased regarding satisfaction with the abdomen (p = 0.001). Over 84 percent were very satisfied with the aesthetic result, 93.4 percent would undergo the same treatment again, and 88.9 percent would further recommend the operation. Data revealed that participants' self-esteem was very high and their emotional stability was very well balanced.
This study demonstrates that abdominoplasty improves most aspects of quality of life, particularly family life, living conditions, mobility, and independence from assistance. Also, patient self-esteem and emotional stability ratings are very high postoperatively.
closed_qa
Does epidural clonidine improve postoperative analgesia in major vascular surgery?
The prospective, single-blinded study involved 60 patients randomised into three groups (20 patients each): Group BM: bupivacaine 0.125% and morphine 0.1 mg/ml; Group BC: bupivacaine 0.125% and clonidine 5 μg/ml; Group MC: morphine 0.1 mg/ml and clonidine 5 μg/ml, continuously infused at 5 ml/h. The quality and duration of the analgesia measured by the Visual Analogue Scale (VAS) at rest and on movement, additional analgesia requirements, sedation scores, haemodynamic parameters and side effects (respiratory depression, motor block, toxic effects, nausea and pruritus) were recorded. The average VAS scores at rest and on movement were significantly lower in Group MC at two, six and 24 hours following the start of epidural infusion (P<0.05). The duration of the analgesic effect after finishing the epidural infusion was significantly longer in Group MC (P<0.05). Patients from Group MC were intubated longer. Additional analgesia consumption, sedation scores and haemodynamic profiles were similar in all three groups. Pruritus was more frequent in the morphine groups (P<0.05), but other side effects were similar in all three groups.
Under study conditions, clonidine added to morphine, rather than to 0.125% bupivacaine, provided significantly better pain scores at two, six and 24 hours following the start of epidural infusion and the longest-lasting analgesia following the discontinuation of epidural infusion. However, patients from Group MC were mechanically ventilated longer than patients from the other two groups. Continuous monitoring of the patient is necessary after the administration of clonidine for epidural analgesia.
closed_qa
The vein collar: an anastomotic servant or a patency promoter?
Primary patency regarding the use of a vein collar was re-analyzed in 345 patients from SCAMICOS with the Kaplan-Meier life-table technique and Cox proportional hazards regression in a counting process notation to evaluate any interaction between time-period and the effect of a vein collar on the primary patency rate. No overall effect on primary patency of a vein collar at the distal anastomosis was found irrespective of the site of anastomosis. However, during the first 30 days of follow-up the primary patency among the femoro-crural bypasses was 0.87 (0.79-0.95) and 0.72 (0.63-0.83) with and without a vein collar, respectively. The interaction between vein collar and time-period was not statistically significant (P=0.070) and neither was the Score test for the whole interaction analysis (P=0.091) for the patients with anastomosis to the crural arteries. No such initial differences were found for the patients with anastomosis to the popliteal artery below the knee.
A clinically relevant but not statistically significant better primary patency during the first 30 days was found for patients with PTFE-bypass to the crural arteries with a vein collar at the distal anastomosis. There were no long-term advantages of the vein collar irrespective of the location of the anastomosis.
closed_qa
Tibial angioplasty in diabetic patients: should all vessels be treated?
We retrospectively reviewed all consecutive diabetic patients with tibial disease and no concomitant proximal lesions who were treated by angioplasty. Among 82 patients with isolated tibial disease, 48 were selected. All patients had to have more than one diseased tibial vessel that could be treated by angioplasty. Group A patients (n=25) had only one tibial vessel treated, while group B patients (n=23) had more than one tibial vessel treated. We compared both groups with respect to patients' characteristics, lesion morphology, and limb salvage rate. Lesion morphology was worse in group A than in group B: the anterior tibial artery showed more long lesions (17 vs. 8) and more multiple lesions (22 vs. 11), and the peroneal artery showed more long lesions (23 vs. 10), more multiple lesions (24 vs. 12), and more occlusions (18 vs. 10). The limb salvage rate at 12 months was similar (91%) in both groups. There were 5 complications in each group.
The lesion morphology was worse in group A. Simpler lesions in group B motivated angioplasty of more than one vessel. There was no difference in the medium-term limb salvage rate between the two groups. Angioplasty of additional, less diseased vessels was not associated with substantial additional morbidity.
closed_qa
Gender and stroke lateralization: factors of functional recovery after the first-ever unilateral stroke?
The goal of this prospective study was to evaluate gender differences in rehabilitation outcome in patients after the first-ever unilateral stroke. A total of eighty right-handed patients were prospectively enrolled: 35 (44%) women and 45 (56%) men. The degree of neurological deficit was quantified by the National Institutes of Health Stroke Scale. Functional outcome was assessed by the Motor Status Scale, Chedoke Arm and Hand Activity Inventory, Rivermead Mobility Index and Barthel Index. At the time of hospital admission there was no significant gender difference in clinical stroke severity. At discharge, we registered significantly better motor and functional recovery in men compared to women. Further, we found a significantly better rehabilitation outcome in women with stroke in the dominant left hemisphere (LH) than in women with stroke in the subdominant right hemisphere (RH). Conversely, men with stroke in the subdominant RH had a significantly better rehabilitation outcome than men with stroke in the dominant LH. Using multivariate analysis, we found that men with stroke in the RH had a significantly higher probability of achieving not only a high response in mobility but also more autonomy in ADL. The frequency of stroke in the LH was significantly higher in both genders aged less than 51 years, as well as in women, while the frequency of stroke in the RH was significantly higher in men.
This paper places particular emphasis on substantial gender-based differences in functional recovery of patients with their first-ever unilateral stroke.
closed_qa
Experiences from a randomised, controlled trial on cycling to school: does cycling increase cardiorespiratory fitness?
The objective of the present study was to investigate the effect of a 12-week randomised controlled cycling-to-school trial on cardiorespiratory fitness. A total of 53 10- to 13-year-old children from one public school were included. The children were randomised into either a cycling group or a control group. The cycling group was encouraged to cycle to and from school each day during a period of 12 weeks. Peak oxygen consumption (VO(2peak)) and anthropometrical data (weight and height) were measured at baseline and at the end of the 12-week period. No significant differences were observed in VO(2peak) change over the 12-week period between the cycling group and the control group (49.7 ml O(2)/min/kg vs. 50.6 ml O(2)/min/kg; effect size=-0.13, F=0.495, p=0.486). Within the intervention group, 69.2% (95% CI 50.1-88.2) started cycling, and within the control group 40.8% (95% CI 20.9-60.5) started cycling. Given that several children in both groups (intervention and control) started cycling to school, re-analyses were conducted between those starting cycling and those not starting cycling. At follow-up, a significant difference between those who started cycling and those who did not was observed in VO(2peak) (51.7 ml O(2)/min/kg vs. 47.9 ml O(2)/min/kg; effect size=0.49, F=8.145, p=0.007), after adjustment for baseline scores, gender and age.
This study indicates that cycling to school improves cardiorespiratory fitness.
closed_qa
Is there equity in use of healthcare services among immigrants, their descendants, and ethnic Danes?
Legislation in Denmark explicitly states the right to equal access to healthcare. Nevertheless, inequities may exist; accordingly, evidence is needed. Our objective was to investigate whether differences in healthcare utilisation in immigrants, their descendants, and ethnic Danes could be explained by health status, socioeconomic factors, and integration. We conducted a nationwide survey in 2007 with 4952 individuals aged 18-66 comprising ethnic Danes; immigrants from the former Yugoslavia, Iran, Iraq, Lebanon, Pakistan, Somalia, Turkey; and Turkish and Pakistani descendants. Data were linked to registries on healthcare utilisation. Using Poisson regression models, contacts with hospital, emergency room (ER), general practitioner (GP), specialist in private practice, and dentist were estimated. Analyses were adjusted for health symptoms, sociodemographic factors, and proxies of integration. In adjusted analyses, immigrants and their descendants had increased use of the ER (multiplicative effect 1.19-5.02 depending on immigrant and descendant group) and less frequent contact with the dentist (multiplicative effect 0.04-0.80 depending on the group). For hospitalisation, GP, and specialist doctor, physical health symptoms had positive but different explanatory effects within groups; however, most immigrant and descendant groups had increased use of services compared with that of ethnic Danes. Socioeconomic factors and integration had no systematic effect on use in the different groups.
The Danish healthcare system seems responsive to health needs across different population groups. We found no systematic pattern of inequity in use of free-of-charge healthcare services, but for dentists, who require co-payment, we found inequity among immigrants and descendants compared with ethnic Danes.
closed_qa
Socially differentiated cardiac rehabilitation: can we improve referral, attendance and adherence among patients with first myocardial infarction?
From 1 September 2002 to 31 December 2005, 388 first-incidence MI patients ≤75 years were hospitalised. Register check for newly hospitalised MI patients, screening interview, and systematic referral were conducted by a project nurse. Patients were referred to a standard rehabilitation programme (SRP). If patients were identified as socially vulnerable, they were offered an extended version of the rehabilitation programme (ERP). Excluded patients were offered home visits by a cardiac nurse. Concordance principles were used in the individualised programme elements. Adherence was registered until the 1-year follow up. 86% were referred to the CR. A large share of elderly patients and women were excluded. The attendance and adherence rates were 80% and 71%, respectively among all hospitalised patients. Among referred patients, the attendance rate was 93%. Patients were equally distributed to the SRP and the ERP. No inequality was found in attendance and adherence among referred patients.
It seems possible to overcome unequal referral, attendance, and adherence in cardiac rehabilitation by organisation of systematic screening and social differentiation.
closed_qa
Removal of industry-sponsored formula sample packs from the hospital: does it make a difference?
Most US hospitals distribute industry-sponsored formula sample packs. No research has examined outcomes associated with sample pack removal as part of a hospital intervention to eliminate sample distribution postpartum. To examine prospectively hospital-based and breastfeeding outcomes associated with removal of industry-sponsored formula sample packs from the hospital. We enrolled mothers postpartum at Cooper University Hospital, an urban New Jersey hospital, in 2009-2010. For the first 6 months, all women received industry-sponsored formula sample packs (control group); for the next 6 months, all postpartum women received hospital-sponsored bags with no formula at source (intervention group). Research assistants blinded to the design called subjects weekly for 10 weeks to determine feeding practices. We enrolled 527 breastfeeding women (284 control; 243 intervention). At 10 weeks postpartum, 82% of control and 36% of intervention women (P<.001) reported receiving formula in the "diaper discharge bag." Kaplan-Meier curves for any breastfeeding showed the intervention was associated with increased breastfeeding (P = .03); however, exclusive breastfeeding was not significantly different between intervention and controls (P = .46). In post hoc analysis, receiving no take-home formula in bottles from the hospital was associated with increased exclusive breastfeeding in control (P = .02) and intervention (P = .03) groups at 10 weeks.
Although the hospital-branded replacement contained no formula at source, many women reported receiving bottles of formula from the hospital. Change in practice to remove industry-sponsored formula sample packs was associated with increased breastfeeding over 10 weeks, but the intervention may have had a greater impact had it not been contaminated.
closed_qa
Tissue banking in a regional hospital: a promising future concept?
Vital tissue provided by fresh frozen tissue banking is often required for genetic tumor profiling and tailored therapies. However, the potential patient benefits of fresh frozen tissue banking are currently limited to university hospitals. The objective of the present pilot study, the first one in the literature, was to evaluate whether fresh frozen tissue banking is feasible in a regional hospital without an integrated institute of pathology. Patients with resectable breast and colon cancer were included in this prospective study. Both malignant and healthy tissue were sampled using isopentane-based snap-freezing 1 h after tumor resection and stored at -80 °C before transfer to the main tissue bank of a university institute of pathology. The initial costs to set up tissue banking were 35,662 US$. Furthermore, the running costs are 1,250 US$ yearly. During the first 13 months, 43 samples (nine samples of breast cancer and 34 samples of colon cancer) were collected from 41 patients. Based on the pathology reports, there was no interference with standard histopathologic analyses due to the sample collection.
This is the first report in the literature providing evidence that tissue banking in a regional hospital without an integrated institute of pathology is feasible. The interesting findings of the present pilot study must be confirmed by larger investigations.
closed_qa
Improving quality of medical treatment and care: are surgeons' working conditions and job satisfaction associated with patient satisfaction?
Over the last decades, surgeons, researchers, and health administrators have been working hard to define standards for high-quality treatment and care in Surgery departments. However, it is unclear whether patients' perceptions of medical treatment and care are related to and affected by surgeons' perceptions of their working conditions and job satisfaction. The aim of this study was to evaluate patients' satisfaction in relation to surgeons' working conditions. A cross-sectional survey of 120 patients and 109 surgeons working in hospital Surgery departments was performed. Surgeons completed a survey evaluating their working conditions and job satisfaction. Patients assessed the quality of medical care and treatment and their satisfaction with being a patient in this department. Seventy percent of the patients were satisfied with the surgeries performed and the services in their department. Surgeons' job satisfaction and working conditions received moderate scores. Bivariate analyses showed correlations between patients' satisfaction and surgeons' job satisfaction and working conditions. The strongest correlations were found between kindness of medical staff, treatment outcome, and overall patient satisfaction.
This study demonstrates strong associations between surgeons' working conditions and patient satisfaction. Based on these findings, hospital management should improve work organization, workload, and job resources to improve not only surgeons' job satisfaction but also the quality of medical treatment and patient satisfaction in Surgery departments.
closed_qa
Can amino acid carbon isotope ratios distinguish primary producers in a mangrove ecosystem?
The relative contribution of carbon from terrestrial vs. marine primary producers to mangrove-based food webs can be challenging to resolve with bulk carbon isotope ratios (δ(13)C). In this study we explore whether patterns of δ(13)C values among amino acids (AAs) can provide an additional tool for resolving terrestrial and marine origins of carbon. Amino acid carbon isotope ratios (δ(13)C(AA)) were measured for several terrestrial and marine primary producers in a mangrove ecosystem at Spanish Lookout Caye (SLC), Belize, using gas chromatography-combustion-isotope ratio mass spectrometry. The δ(13)C values of essential amino acids (δ(13)C(EAA)) were measured to determine whether they could be used to differentiate terrestrial and marine producers using linear discriminant analysis. Marine and terrestrial producers had distinct patterns of δ(13)C(EAA) values in addition to their differences in bulk δ(13)C values. Microbial mat samples and consumers (Crassostrea rhizophorae, Aratus pisonii, Littoraria sp., Lutjanus griseus) were most similar to marine producers. Patterns of δ(13)C(EAA) values for terrestrial producers were very similar to those described for other terrestrial plants.
The findings suggest that δ(13)C(EAA) values may provide another tool for estimating the contribution of terrestrial and marine sources to detrital foodwebs. Preliminary analyses of consumers indicate significant use of aquatic resources, consistent with other studies of mangrove foodwebs.
closed_qa
Is a cementless dual mobility socket in primary THA a reasonable option?
Dislocation after THA continues to be relatively common. Dual mobility sockets have been associated with low dislocation rates, but it remains unclear whether their use in primary THA introduces additional complications. We therefore asked whether a current cementless dual mobility socket (1) reduced the dislocation rate after primary THA, (2) provided a pain-free and mobile hip, and (3) provided durable radiographic fixation of the acetabular component without any unique modes of failure. We retrospectively reviewed 168 patients who underwent primary THA using a dual mobility socket between January 2000 and June 2002. The average age at surgery was 67 years. We assessed the rate of dislocation, hip function, and acetabular fixation on serial radiographs. Of the 168 patients, 119 (71%) had clinical and radiographic evaluation at a minimum of 5 years (mean, 6 years; range, 5-8 years). A long-neck option left the base of the Morse taper uncovered in 53 hips. Four patients underwent revision for dislocation between the femoral head and the mobile insert (intraprosthetic dislocation) at a mean of 6 years; all four revisions occurred among the 53 hips with an incompletely covered Morse taper.
A current cementless dual mobility socket was associated with a pain-free and mobile hip and durable acetabular fixation without dislocations if the long-neck option was not used. However, intraprosthetic dislocation related to contact at the femoral neck to mobile insert articulation required revision in four hips. Surgeons should be aware of this specific complication.
closed_qa
Are obstetrician-gynecologists satisfied with their maternal-fetal medicine consultants?
To survey generalist obstetrician-gynecologists about their satisfaction with and patterns of referral to maternal-fetal medicine (MFM) specialists. A survey was sent three times to 1030 randomly selected American Congress of Obstetricians and Gynecologists members across the country, and results were tabulated. A total of 516 surveys (50%) were returned; 68% of respondents were satisfied (S) with available MFM services and 31% were not satisfied (Not S). S and Not S respondents were similar with respect to age, gender, years in practice, type of practice, hours worked per week, proximity to MFM specialists, number of deliveries per year, and level of nursery in their hospital. Reasons for dissatisfaction included: MFM specialist not readily available (49%), during the day (26%), at night (35%), or on weekends (36%); MFM specialist unwilling to take care of hospitalized patients (26%); or MFM specialist does only ultrasound, chorionic villus sampling, and amniocentesis (32%). Although some generalists do not consult MFM specialists frequently, the majority of both S and Not S respondents would request an MFM consult or comanagement for 26 of 38 specific maternal, fetal, and obstetric diagnoses/complications.
The majority of obstetrician-gynecologists are satisfied with their MFM support. The dissatisfaction expressed by 31% of generalists might be ameliorated if individual MFM specialists increased their availability and/or broadened their scope of practice.
closed_qa
Depressive symptoms in older people with metabolic syndrome: is there a relationship with inflammation?
To investigate if there is a higher prevalence of depressive symptoms in older people with metabolic syndrome (MetS) compared with those without, and whether depressive symptoms are independently associated with MetS, its single components, and inflammatory markers. Physical parameters, standard blood analytes, high sensitivity C-reactive protein (hsCRP) and erythrocyte sedimentation rate (ESR) were assessed. The 15-item Geriatric Depression Scale and the Mini-Mental State Examination (MMSE) were administered. One hundred thirty-three subjects were enrolled. MetS patients (n = 57) exhibited a higher prevalence of depressive symptoms (p < 0.0001), worse cognitive function (p < 0.0001), and higher levels of ESR and hsCRP (p < 0.0001). The univariate analysis showed a strong linear correlation of depressive symptoms (p < 0.0001) with the MMSE score (r = -0.422), body mass index (r = 0.414), MetS (r = 0.582), number of MetS components (r = 0.663), fasting blood glucose (r = 0.565), ESR (r = 0.565), hsCRP (r = 0.745), central obesity (r = 0.269; p = 0.002), and high-density lipoprotein cholesterol (r = -0.241; p = 0.005). However, the multivariate analysis showed that only age (B = -0.093; p = 0.032), MetS (B = 1.446; p = 0.025), fasting blood glucose (B = 0.039; p = 0.005), and hsCRP (B = 7.649; p < 0.0001) were independently associated with depressive symptoms.
MetS and inflammation are independently associated with depressive symptoms in older people. Inflammation may explain cognitive decline too. Further investigations are needed to better understand the direction of these associations and to determine whether these can be reversible.
closed_qa
Cyclic Vomiting Syndrome (CVS): is there a difference based on onset of symptoms--pediatric versus adult?
Cyclic Vomiting Syndrome (CVS) is a well-recognized functional gastrointestinal disorder in children, but its presentation is poorly understood in adults. Genetic differences in pediatric-onset (presentation before age 18) and adult-onset CVS have been reported recently, but their clinical features and possible differences in response to therapy have not been well studied. This was a retrospective review of 101 CVS patients seen at the Medical College of Wisconsin between 2006 and 2008. Rome III criteria were utilized to make the diagnosis of CVS. Our study population comprised 29 (29%) pediatric-onset and 72 (71%) adult-onset CVS patients. Pediatric-onset CVS patients were more likely to be female (86% vs. 57%, p = 0.005) and had a higher prevalence of CVS plus (CVS + neurocognitive disorders) as compared to adult-onset CVS patients (14% vs. 3%, p = 0.05). There was a longer delay in diagnosis in the pediatric-onset group (10 ± 7 years) compared with the adult-onset CVS group (5 ± 7 years) (p = 0.001). Chronic opiate use was less frequent in the pediatric-onset group compared to adult-onset patients (0% vs. 23%, p = 0.004). Aside from these differences, the two groups were similar with regard to their clinical features, and the time of onset of symptoms did not predict response to standard treatment. The majority of patients (86%) responded to treatment with tricyclic antidepressants, anticonvulsants (topiramate), coenzyme Q-10, and L-carnitine. On univariate analysis, non-response to therapy was associated with coalescence of symptoms, chronic opiate use and more severe disease as characterized by longer episodes, a greater number of emergency department visits in the year prior to presentation, presence of disability and non-compliance. On multivariate analysis, only compliance with therapy was associated with a response (88% vs. 38%; odds ratio [OR] 9.6; 95% confidence interval [CI] 1.18-77.05).
Despite reported genetic differences, the clinical features and response to standard therapy in pediatric- and adult-onset CVS were mostly similar. Most patients (86%) responded to therapy and compliance was the only factor associated with a response.
closed_qa
The UK Clinical Aptitude Test: is it a fair test for selecting medical students?
The United Kingdom Clinical Aptitude Test (UKCAT) is designed to increase diversity and fairness in selection to study medicine. The aim of this study is to determine if differences in access to support and advice, modes of preparation, type of school/college attended, level of achievement in mathematics, gender, and age influence candidate performance in the UKCAT and thereby unfairly advantage some candidates over others. A confidential, self-completed, online questionnaire was administered to applicants to an undergraduate medical degree course who had taken the UKCAT in 2010. Differentials in access to support and advice, modes of preparation, type of school/college attended, level of achievement in mathematics, gender, and age were found to be associated with candidate performance in the UKCAT.
The findings imply that the UKCAT may disadvantage some candidate groups. This inequity would likely be improved if tutors and career advisors in schools and colleges were more informed about the UKCAT and able to offer appropriate advice on preparation for the test.
closed_qa
Do combined alternating sessions of 1540 nm nonablative fractional laser and percutaneous collagen induction with trichloroacetic acid 20% show better results than each individual modality in the treatment of atrophic acne scars?
There have been no well-controlled studies evaluating the efficacy of combining 1540 nm nonablative fractional laser with percutaneous collagen induction (PCI) and trichloroacetic acid (TCA) 20% in the treatment of atrophic acne scars. We hypothesized that combined alternating sessions of both modalities would show better results than each individual modality. Thirty-nine patients with post-acne atrophic scars were included in this study. Patients were randomly divided equally into three groups: group 1 was subjected to six sessions of PCI combined with TCA 20% in the same session, group 2 was subjected to six sessions of 1540 nm fractional laser, and group 3 was subjected to combined alternating sessions of the two previously mentioned modalities. Scar severity scores improved by a mean of 59.79% (95% CI 47.38-72.21) (p<0.001) in group 1, a mean of 61.83% (95% CI 54.09-69.56) (p<0.001) in group 2, and a mean of 78.27% (95% CI 74.39-82.15) (p<0.001) in group 3. The difference in the degree of improvement was statistically significant when comparing the three groups using the ANOVA test (p = 0.004).
The current work recommends combining 1540 nm nonablative fractional laser in alternation with PCI and TCA 20% in the treatment of atrophic acne scars.
closed_qa
Is it time to abandon paper?
A multidisciplinary primary care clinic in Sydney, Australia, was planning to use electronic questionnaires to measure patient-reported outcomes. Semi-structured interviews with 20 patients were undertaken to explore, among other things, practical issues regarding different questionnaire formats. The response rates and costs of email versus postal invitations were also evaluated. Compared with postal invitations, email invitations offered a cost-effective and practical alternative, with a greater proportion of patients volunteering for an interview. Provided the interface was well designed and user-friendly, many patients were happy to use the Internet to answer questionnaires. Most patients thought alternate formats should also be offered. Patients discussed advantages and disadvantages of the Internet format. Although more younger patients and females had given the clinic an email address, both sexes, and both young and old patients, expressed strong preferences either for or against using the Internet.
Researchers should consider using email invitations as a cost-effective first-line strategy to recruit patients to participate in health services research. Internet questionnaires are potentially cheaper than paper questionnaires, and the format is acceptable to many patients. However, for the time being, concurrent alternate formats need to be offered to ensure wider acceptability and to maximize response rates.
closed_qa
Three-dimensional conformal brachytherapy boost in locally recurrent or residual cervical carcinoma: does it impact clinical outcome?
Fourteen consecutive patients with recurrent or residual cervical cancer who were treated with interstitial brachytherapy as a boost were included in the study. All patients received 50.4 Gy of external beam radiation (EBRT) to the whole pelvis with a conformal technique to reduce the dose to the bowel. The clinical target volume (CTV) and organs at risk were contoured on CT scan, with gold seeds implanted before commencing treatment serving as surrogate markers of initial tumor extent. The median prescription dose was 10.5-12 Gy in 3 fractions. Dose-volume histograms were calculated to evaluate the dose covering 100% and 90% of the target volume and the dose to the bladder, rectum and bowel (2 mL and 1 mL volumes). The median follow-up was 12 months (range 6-18). The doses to the CTV (D90, D100) ranged from 1141 to 2014 cGy, and 585 to 969 cGy, respectively. The mean cumulative 2-mL rectal, bladder and bowel doses were 66.70, 73.15 and 61.01 Gy, respectively. Rectal toxicity of grade 2 or more had a strong correlation with the dose delivered (Spearman's correlation, 0.950). The local control rate at one year was 92%, with failure seen in only one patient.
Conformal EBRT supplemented with 3D-IBT seems to be a practical and appropriate approach to provide optimal therapeutic benefit with the least side effects in patients with postoperative recurrent or residual cervical cancer.
closed_qa
Beyond early intervention: can we adopt alternative narratives like 'Woodshedding' as pathways to recovery in schizophrenia?
Significant numbers of those developing a first episode of psychosis are on a path to a persisting and potentially lifelong condition. Constituting the schizophrenia spectrum disorders, such conditions demand the particular qualities and attitudes inherent within recovery-based practice. This paper explores some of these qualities and attitudes by examining the tension between a traditional 'clinical' narrative used by many health providers and a 'human' narrative of users of services and their families. We draw out key features and constructs of recovery practice as they relate to the early intervention (EI) paradigm. These include: woodshedding, turning points, discontinuous improvement models, therapeutic optimism, gradualism and narratives of storytelling. We also highlight the role of family members and other close supporters and believe their potential contribution requires greater consideration.
The early intervention (EI) paradigm can resonate and indeed offer a stronghold for recovery-based practice where traditional mental health services have sometimes struggled. Conversely, failure of caregivers to provide such an approach in the early phase of illness can cause unnecessary and sometimes disastrous consequences.
closed_qa
Do premorbid impairments predict emergent 'prodromal' symptoms in young relatives at risk for schizophrenia?
Individuals at risk for developing schizophrenia (SZ) in the future frequently exhibit subtle behavioural and neurobiological abnormalities in their childhood. A better understanding of the role of these abnormalities in predicting later onset of 'prodromal' symptoms or psychosis may help in early identification of SZ. In an ongoing prospective follow-up study of young genetically at-risk relatives of patients with SZ, we studied the prevalence of problems in premorbid social adjustment and childhood psychopathology and examined their relationship with the presence and progression of 'prodromal' symptoms of SZ. Growth curve analyses showed that 'prodromal' symptoms, as measured by the Scale of 'Prodromal' Symptoms, increased during follow-up. Premorbid maladjustment and childhood behavioural disturbances were cross-sectionally correlated broadly with 'prodromal' symptomatology scores. Longitudinal analyses revealed that behavioural disturbances, but not childhood maladjustment at baseline, significantly predicted increases in 'prodromal' symptomatology during the 2-year study period.
Premorbid behavioural disturbance and maladjustment may predict the later emergence of 'prodromal' symptoms. 'Prodromal' symptoms in young at-risk relatives may define a subgroup worthy of follow-up into the age of risk for psychosis in order to cost-effectively characterize the predictors of psychotic symptoms and SZ.
closed_qa
Is there adaptation of the exocrine pancreas in wild animals?
The physiology of the exocrine pancreas has been well studied in domestic and laboratory animals as well as in humans. However, it remains largely unknown in wild mammals. Roe deer and cattle (including the calf) belong to different families but have a common ancestor. This work aimed to evaluate, in the Roe deer, the adaptation to diet of exocrine pancreatic functions and their regulation, in relation to animal evolution and domestication. Forty bovines were distributed into 2 groups of animals either fed exclusively with a milk formula (monogastric) or fed a dry feed which allowed rumen function to develop; they were slaughtered at 150 days of age. The 35 Roe deer were wild animals living in temperate broadleaf and mixed forests, shot during the hunting season and classified into two groups, adult and young. Immediately after death, the pancreas was removed for tissue sample collection and then analyzed. When expressed in relation to body weight, pancreas weight, pancreatic protein weight and the measured enzyme activities were higher in Roe deer than in calves. The 1st original feature is that in Roe deer, the very high content of pancreatic enzymes seems to be related to specific digestive products observed (proline-rich proteins largely secreted in saliva), which bind tannins, reducing their deleterious effects on protein digestion. The high chymotrypsin and elastase II quantities could allow recycling of proline-rich proteins. In contrast, domestication and rearing of cattle resulted in a simplified diet with well digestible components. The 2nd feature is that in this wild animal, both receptor subtypes of the CCK/gastrin family of peptides were present in the pancreas, as in the calf, although the CCK-2 receptor subtype had previously been identified in higher mammals.
The bovine species could have lost some digestive capabilities (no ingestion of large amounts of tannin-rich plants, the capability to secrete high amounts of proline-rich proteins) compared with the Roe deer species. CCK and gastrin could play an important role in the regulation of pancreatic secretion in the Roe deer as in the calf. To the best of our knowledge, this is the first study to compare dietary adaptation in the Roe deer with that of an extensively studied domesticated animal.
closed_qa
Are perceived stress, depressive symptoms and religiosity associated with alcohol consumption?
The aim of this study was to investigate the association of perceived stress, depressive symptoms and religiosity with frequent alcohol consumption and problem drinking among freshmen university students from five European countries. 2529 university freshmen (mean age 20.37, 64.9% females) from Germany (n = 654), Poland (n = 561), Bulgaria (n = 688), the UK (n = 311) and Slovakia (n = 315) completed a questionnaire containing the modified Beck Depression Inventory for measuring depressive symptoms, the Cohen's perceived stress scale for measuring perceived stress, the CAGE-questionnaire for measuring problem drinking and questions concerning frequency of alcohol use and the personal importance of religious faith. Neither perceived stress nor depressive symptoms were associated with a high frequency of drinking (several times per week), but were associated with problem drinking. Religiosity (personal importance of faith) was associated with a lower risk for both alcohol-related variables among females. There were also country differences in the relationship between perceived stress and problem drinking.
The association between perceived stress and depressive symptoms on the one hand and problem drinking on the other demonstrates the importance of intervention programs to improve coping with stress.
closed_qa
Should risky treatments be reserved for secondary prevention?
Clinical intuition suggests that risk-reducing treatments are more beneficial for patients with greater risk of disease. This intuition contributes to our rationale for tolerating greater adverse event risk in the setting of secondary prevention of certain diseases such as myocardial infarction or stroke. However, under certain conditions treatment benefits may be greater in primary prevention, even when the treatment carries harmful adverse effect potential. We present simple decision-theoretic models that illustrate conditions of risk and benefit under which a treatment is predicted to be more beneficial in primary than in secondary prevention. The models cover a spectrum of possible clinical circumstances, and demonstrate that net benefit in primary prevention can occur despite no benefit (or even net harm) in secondary prevention.
This framework provides a rationale for extending the familiar concept of balancing risks and benefits to account for disease-specific considerations of primary vs. secondary prevention.
closed_qa
Is omentectomy mandatory in the operation for ovarian cancer?
To investigate whether omentectomy is required in the operation for ovarian cancer, in particular at the early stage. F344 nude rats were divided into two groups: one in which laparotomy and omentectomy were performed (primary omentectomy group, n=6) and one without omentectomy (n=12). Concurrently, DISS cells derived from ovarian cancer were transplanted intraperitoneally. After three weeks, the 12 rats without omentectomy were divided into two more groups: one in which the omentum was resected together with the tumor (sham operation/omentectomy group, n=6) and one without omentectomy (sham operation alone group, n=6). The survival of the sham operation alone group was shortest, with a median of 35 days, while the median of the primary omentectomy group was 42 days. In the sham operation/omentectomy group, four rats survived beyond Day 90, which was significant compared with the other two groups. The intraperitoneal findings in the primary omentectomy group revealed extensive disseminated foci on the mesentery and under the abdominal wall. The sham operation alone group was characterized by jaundice resulting from the compression of the biliary system at the liver hilum by the omental mass. Disseminated foci were not observed in the peritoneal cavity of the sham operation/omentectomy group.
This study suggests the possibility that the omentum has a role in capturing cancer cells and suppressing further peritoneal dissemination. Therefore, although omentectomy is rewarding if disseminated foci are present in the omentum, it is suggested that the timing of omentectomy requires reconsideration in the absence of omental metastasis.
closed_qa
Is Drosera meristocaulis a pygmy sundew?
South America and Oceania possess numerous floristic similarities, often confirmed by morphological and molecular data. The carnivorous Drosera meristocaulis (Droseraceae), endemic to the Neblina highlands of northern South America, was known to share morphological characters with the pygmy sundews of Drosera sect. Bryastrum, which are endemic to Australia and New Zealand. The inclusion of D. meristocaulis in a molecular phylogenetic analysis may clarify its systematic position and offer an opportunity to investigate character evolution in Droseraceae and phylogeographic patterns between South America and Oceania. Drosera meristocaulis was included in a molecular phylogenetic analysis of Droseraceae, using nuclear internal transcribed spacer (ITS) and plastid rbcL and rps16 sequence data. Pollen of D. meristocaulis was studied using light microscopy and scanning electron microscopy techniques, and the karyotype was inferred from root tip meristem. The phylogenetic inferences (maximum parsimony, maximum likelihood and Bayesian approaches) substantiate with high statistical support the inclusion of sect. Meristocaulis and its single species, D. meristocaulis, within the Australian Drosera clade, sister to a group comprising species of sect. Bryastrum. A chromosome number of 2n = approx. 32-36 supports the phylogenetic position within the Australian clade. The undivided styles, conspicuous large setuous stipules, a cryptocotylar (hypogaeous) germination pattern and pollen tetrads with aperture of intermediate type 7-8 are key morphological traits shared between D. meristocaulis and pygmy sundews of sect. Bryastrum from Australia and New Zealand.
The multidisciplinary approach adopted in this study (using morphological, palynological, cytotaxonomic and molecular phylogenetic data) enabled us to elucidate the relationships of the thus far unplaced taxon D. meristocaulis. Long-distance dispersal between southwestern Oceania and northern South America is the most likely scenario to explain the phylogeographic pattern revealed.
closed_qa
Monitoring performance for blood pressure management among patients with diabetes mellitus: too much of a good thing?
Performance measures that reward achieving blood pressure (BP) thresholds may contribute to overtreatment. We developed a tightly linked clinical action measure designed to encourage appropriate medical management and a marker of potential overtreatment, designed to monitor overly aggressive treatment of hypertension in the face of low diastolic BP. We conducted a retrospective cohort study in 879 Department of Veterans Affairs (VA) medical centers and smaller community-based outpatient clinics. The clinical action measure for hypertension was met if the patient had a passing index BP at the visit or had an appropriate action. We examined the rate of passing the action measure and of potential overtreatment in the Veterans Health Administration during 2009-2010. There were 977,282 established VA patients, 18 years and older, with diabetes mellitus (DM). A total of 713,790 patients were eligible for the action measure; 94% passed the measure (82% because they had a BP <140/90 mm Hg at the visit and an additional 12% with a BP ≥140/90 mm Hg and appropriate clinical actions). Facility pass rates varied from 77% to 99% (P<.001). Among all patients with DM, 197,291 (20%) had a BP lower than 130/65 mm Hg; of these, 80,903 (8% of all patients with DM) had potential overtreatment. Facility rates of potential overtreatment varied from 3% to 20% (P<.001). Facilities with higher rates of meeting the current threshold measure (<140/90 mm Hg) had higher rates of potential overtreatment (P<.001).
While 94% of diabetic veterans met the action measure, rates of potential overtreatment are currently approaching the rate of undertreatment, and high rates of achieving current threshold measures are directly associated with overtreatment. Implementing a clinical action measure for hypertension management, as the Veterans Health Administration is planning to do, may result in more appropriate care and less overtreatment.
closed_qa
Does occupational lifting and carrying among female health care workers contribute to an escalation of pain-day frequency?
The aim of the study was to investigate if different frequencies, loads and trunk postures of occupational lifting and carrying increase the risk of sub-chronic (1-30 days in the last 12 months) low back pain (LBP) becoming persistent (>30 days in the last 12 months) among female health care workers. Female health care workers answered a questionnaire about occupational lifting or carrying frequency (rarely, occasionally and frequently), load (low: 1-7 kg, moderate: 8-30 kg and heavy: >30 kg) and trunk posture (upright or forward bent back), and days with LBP in 2005 and 2006. The odds ratio (OR) for developing persistent LBP in 2006 from these characteristics of occupational lifting and carrying was investigated with multi-adjusted logistic regressions among female health care workers with sub-chronic LBP (n = 2381) in 2005. Among health care workers with sub-chronic LBP, an increased risk of persistent LBP was found for frequently lifting or carrying moderate loads (OR: 1.63; 95% CI: 1.15-2.33) and heavy loads (OR: 1.56; 95% CI: 1.04-2.34) with a forward bent back. No increased risk for LBP to develop into a persistent condition was found for frequent lifting with an upright back, frequent lifting or carrying of light loads, or occasional lifting or carrying of any loads.
Preventive initiatives aimed at keeping sub-chronic LBP from developing into a persistent condition ought to focus on reducing frequent lifting and carrying of moderate and heavy loads with a forward bent back.
closed_qa
Occupational solvent exposure and cognition: does the association vary by level of education?
Chronic occupational solvent exposure is associated with long-term cognitive deficits. Cognitive reserve may protect solvent-exposed workers from cognitive impairment. We tested whether the association between chronic solvent exposure and cognition varied by educational attainment, a proxy for cognitive reserve. Data were drawn from a prospective cohort of French national gas and electricity (GAZEL) employees (n = 4,134). Lifetime exposure to 4 solvent types (chlorinated solvents, petroleum solvents, benzene, and nonbenzene aromatic solvents) was assessed using a validated job-exposure matrix. Education was dichotomized as less than secondary school versus secondary school or higher. Cognitive impairment was defined as scoring below the 25th percentile on the Digit Symbol Substitution Test at mean age 59 years (SD 2.8; 88% of participants were retired at testing). Log-binomial regression was used to model risk ratios (RRs) for poor cognition as predicted by solvent exposure, stratified by education and adjusted for sociodemographic and behavioral factors. Solvent exposure rates were higher among less-educated patients. Within this group, there was a dose-response relationship between lifetime exposure to each solvent type and the RR for poor cognition (e.g., for high exposure to benzene, RR = 1.24, 95% confidence interval 1.09-1.41), with significant linear trends (p<0.05) in 3 out of 4 solvent types. Recency of solvent exposure also predicted worse cognition among less-educated patients. Among those with secondary education or higher, there was no significant or near-significant relationship between any quantification of solvent exposure and cognition.
Solvent exposure is associated with poor cognition only among less-educated individuals. Higher cognitive reserve in the more-educated group may explain this finding.
closed_qa
Is it effective to perform two more prostate biopsies according to prostate-specific antigen level and prostate volume in detecting prostate cancer?
To evaluate the effectiveness of adding 2 more cores to the prostate biopsy protocol in detecting prostate cancer (PCa) by comparing 10-core with 12-core prostate biopsy according to prostate-specific antigen (PSA) level and prostate volume. A total of 474 men with elevated serum levels of PSA between 2.5 and 20.0 ng/mL, regardless of abnormal findings on digital rectal examination and transrectal ultrasonography, received transrectal ultrasound-guided prostate biopsies. The patients were prospectively randomized to undergo 10-core (group 1, n = 351) or 12-core (group 2, n = 123) biopsy. The PCa detection rates were assessed and compared according to the serum level of PSA and prostate volume. Of 474 men, 128 (27.0%) were diagnosed with PCa. The PCa detection rates of 10-core and 12-core biopsies were 26.4% and 28.4%, respectively (P = .378). There was no difference in cancer detection rates according to PSA level between the two groups. Comparing the cancer detection rates according to prostate volume (<40 mL and ≥ 40 mL), the patients with a prostate volume ≥ 40 mL showed a higher cancer detection rate in the 12-core biopsy group (26.9%) compared with the 10-core biopsy group (16.4%) (P<.05).
The overall cancer detection rates showed no difference between the two groups. However, the 12-core biopsy was a more efficient method than the 10-core biopsy in men with a prostate volume of ≥ 40 mL.
closed_qa
Is routine echocardiography necessary after catheter ablation of atrioventricular nodal re-entrant tachycardia?
The aim of this study was to investigate whether pericardial effusion (PE) detected by transthoracic echocardiography (TTE) was clinically significant and whether routine echocardiography was necessary after catheter ablation of atrioventricular nodal re-entrant tachycardia (AVNRT). A total of 202 patients with AVNRT were included in the study from three centers. The patients received basic electrophysiology-guided therapy, followed by radiofrequency ablation (RFA). All patients underwent TTE before and after RFA therapy. The mean age of the study population was 46.2 ± 17.9 years, and 30.7% of the patients were male. Of these patients, six (3%) had postoperative PE, as detected by TTE. However, none of them had cardiac tamponade (CT). Four patients had minimal PE, while two had mild PE. Repeated TTE at one to three months showed that the PE had resolved. No significant difference was seen between the patients with and without PE in terms of age, gender, number of RFA applications, or RFA duration; however, a significantly longer duration of fluoroscopy exposure was observed in the patients with PE.
PE was detected by TTE in 3% of the patients and was associated with a prolonged duration of fluoroscopy exposure. However, no patients with moderate or large PE or cardiac tamponade were found in the study. In conclusion, we suggest that TTE should only be performed in the presence of clinical indications following ablation of AVNRT.
closed_qa
Does obesity modify the association of supplemental folic acid with folate status among nonpregnant women of childbearing age in the United States?
Obesity is associated with an increased risk of having a pregnancy affected by a neural tube defect (NTD). It is not clear whether the amount of folic acid required by obese women to protect against NTDs is the same as that for nonobese women. We analyzed data from the National Health and Nutrition Examination Survey, representative of the noninstitutionalized civilian U.S. population, to assess whether body mass index (BMI; normal weight, overweight, and obese categories) modified the association between supplemental folic acid intake and folate status. We estimated the geometric mean concentration among nonpregnant women of childbearing age (15-44 years) during the postfortification period of: serum folate (2003-2008); red blood cell (RBC) folate (2007-2008); and plasma total homocysteine (tHcy; 2003-2006), adjusted for age, race and ethnicity, and total dietary folate expressed as dietary folate equivalents for strata of supplement use and BMI. BMI was inversely associated with serum folate among women who did not use supplements containing folic acid; no differences between women in different BMI categories were observed among supplement users. Regardless of supplement use, obese women had the highest RBC folate concentrations. There were no differences in tHcy by BMI, regardless of supplement use.
These results do not support a straightforward modification of the relationship between supplemental folic acid intake and folate status by BMI. In this population, BMI may affect the body distribution of folate, as reflected by lower serum and higher RBC folate levels in obese women who do not use supplements.
closed_qa