Chronic mad honey intoxication syndrome: a new form of an old disease?
Although cases of acute mad honey intoxication have been reported previously, chronic mad honey intoxication (CMHI) syndrome has not been described; we address it for the first time in this study. We prospectively evaluated the history of non-commercial honey intake in all patients referred to our institution for investigation of slow heart rate or atrioventricular (AV) conduction abnormalities. Between April 2008 and December 2008, 173 patients were referred to our institution for assessment of sinus bradycardia and various degrees of AV block and/or permanent pacemaker implantation. All patients were questioned about their history of honey intake. Detailed evaluation revealed a history of daily honey intake over a long period in five patients (2.8%). This non-commercial honey had been produced by different amateur beekeepers in the eastern Black Sea region of Turkey. Discontinuation of honey intake resulted in prompt normalization of conduction and significant symptomatic improvement. None of the patients required hospital admission, and all were asymptomatic during 3 months of follow-up. 24-hour Holter monitoring revealed no abnormalities at the first and third months.
This is the first report of CMHI. This possibility should be considered during assessment of patients with unexplained conduction abnormalities, because discontinuation of honey intake results in prompt symptomatic and electrocardiographic improvement.
closed_qa
Orthotopic liver transplantation: T-tube or not T-tube?
The purpose of this study was to compare outcomes after duct-to-duct anastomoses with or without a biliary T-tube in orthotopic liver transplantation. We pooled the outcomes of 1027 patients undergoing choledocho-choledochostomy with or without a T-tube in 9 of 46 screened trials by means of fixed- or random-effects models. The "without T-tube" and "with T-tube" groups had equivalent outcomes for anastomotic bile leaks or fistulas, choledocho-jejunostomy revisions, dilatation and stenting, hepatic artery thromboses, retransplantation, and mortality due to biliary complications. The "without T-tube" group had better outcomes for "fewer episodes of cholangitis" and "fewer episodes of peritonitis," and showed a favorable trend for "overall biliary complications." Although the "with T-tube" group showed superior results for "anastomotic and nonanastomotic strictures," the incidence of interventions was not diminished.
Our systematic review and meta-analysis favor the abandonment of T-tubes in orthotopic liver transplantation.
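The pooling step described above ("fixed or random effects models") can be illustrated with a minimal inverse-variance fixed-effect sketch in Python. The function name and the per-trial numbers are illustrative assumptions, not data from the review:

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance fixed-effect pooling of per-study estimates
    (e.g. log odds ratios for a complication rate).

    Each study is weighted by the reciprocal of its variance; the pooled
    standard error is 1/sqrt(sum of weights)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval under a normal approximation
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# hypothetical log odds ratios and variances from two trials
pooled, ci = fixed_effect_pool([0.2, 0.4], [0.1, 0.1])
```

With equal variances the pooled estimate is simply the mean of the two effects; unequal variances shift it toward the more precise study.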
closed_qa
Does sexual function change after surgery for stress urinary incontinence and/or pelvic organ prolapse?
The purpose of this study was to assess sexual function in women after surgery for stress urinary incontinence and/or pelvic organ prolapse (UI/POP) at 3 and 6 months with the Pelvic Organ Prolapse Urinary Incontinence Sexual Questionnaire (PISQ). Of 269 eligible women participating in a trial of prophylactic antibiotic use with suprapubic catheters, 102 (37.9%) agreed to participate in a sexual function study. Women underwent a variety of anti-incontinence and reconstructive surgeries. Sexual function and urinary incontinence were assessed preoperatively and at 3 and 6 months postoperatively with the PISQ and the Incontinence Impact Questionnaire (IIQ-7). Paired t tests compared changes over time. Logistic regression compared worsening PISQ scores versus other variables. The generalized McNemar test compared individual questions preoperatively and postoperatively. Significance was set at P<.05. Mean age was 47.1 years (range 23-85), and 64% of women were premenopausal. Seventy-five (74%) women completed questionnaires at 3 or 6 months. Sexual function scores improved after surgery, as did IIQ-7 scores (PISQ 89 vs 95, P<.001; IIQ-7 52 vs 13, P<.001). The Behavioral-Emotive domain scores did not change at 3 to 6 months compared with preoperative scores (P = .57), whereas the Physical domain improved (P<.001). Worsening PISQ scores were independent of age, type of surgery, hysterectomy, complications, and hormonal status (logistic regression, all P>.05).
Sexual function scores in women improved after surgery for UI/POP, as did incontinence, at 3 to 6 months after surgery.
closed_qa
Do panels vary when assessing intrapartum adverse events?
A national audit project, Scotland-wide Learning from Intrapartum Critical Events (SLICE), included local assessment of quality of care in cases of perinatal death and neonatal encephalopathy due to intrapartum events. Concerns had been raised about interobserver variation in case assessment by different panels. We therefore studied the extent of agreement and disagreement between assessment panels, and examined the areas in which agreement and disagreement tended to occur. Eight cases were randomly selected from all 42 cases identified during a 6-month period (1 January-1 July 2005). Each case was independently reviewed by three panels: the local hospital clinical risk-management group and two specially convened external panels. Panels assessed quality of care in three areas: admission assessment, recognition of incident, and method and timing of delivery. Predefined standards of care were provided for these three areas. Panels were also asked to assess the overall quality of care. For each area of care, agreement between the two external panels was lowest. The lowest levels of agreement between pairs of panels were seen in the assessment of overall care (50% crude agreement between external panel 1 and the hospital panel (kappa = 0.24, AC(1) = 0.36); 29% crude agreement between external panels 1 and 2 (kappa = -0.11, AC(1) = 0.1); and 47% crude agreement between external panel 2 and the hospital panel (kappa = 0.36, AC(1) = 0.46)). The lowest level of agreement among all three panels was also in the assessment of overall care (crude agreement 48%; kappa = 0.16, AC(1) = 0.34).
Moderate to substantial agreement among the three panels was achieved for the three areas in which explicit standards were provided. Therefore, a systematic approach to analysis of adverse events in perinatal care improves reproducibility.
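The crude agreement and kappa figures quoted above can be reproduced for any pair of panels with a short Python sketch. This is the generic two-rater Cohen's kappa (not Gwet's AC(1)), and the example ratings are hypothetical:

```python
def cohens_kappa(ratings_a, ratings_b):
    """Crude (observed) agreement and Cohen's kappa for two raters."""
    n = len(ratings_a)
    assert n == len(ratings_b) and n > 0
    categories = set(ratings_a) | set(ratings_b)
    # observed ("crude") agreement: fraction of cases rated identically
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # chance agreement from each rater's marginal category frequencies
    pe = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
             for c in categories)
    return po, (po - pe) / (1 - pe)

# hypothetical quality-of-care ratings of 8 cases by two panels
panel_1 = ["adequate", "adequate", "poor", "adequate",
           "poor", "adequate", "poor", "adequate"]
panel_2 = ["adequate", "poor", "poor", "adequate",
           "adequate", "adequate", "poor", "poor"]
po, kappa = cohens_kappa(panel_1, panel_2)  # po = 0.625, kappa = 0.25
```

Kappa corrects the crude agreement for the agreement expected by chance, which is why the two figures can diverge sharply (as in the 29% crude / kappa = -0.11 pairing above).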
closed_qa
Is family therapy useful for treating children with anorexia nervosa?
Research suggests that family-based treatment (FBT) is an effective treatment for adolescents with anorexia nervosa (AN). This retrospective case series was designed to examine its usefulness with younger children. Data were abstracted from the medical records of 32 children with a mean age of 11.9 years (range 9.0-12.9) meeting diagnostic criteria for AN (n=29) or eating disorder not otherwise specified-restricting type (n=3) who were treated with FBT at two sites. Baseline characteristics, pre- and posttreatment weights, and Eating Disorder Examination (EDE) scores were compared with those of an adolescent cohort (N=78) with a mean age of 15.5 years (range 13.1-18.4) who were treated with FBT. Children with AN share most disordered eating behaviors with their adolescent counterparts; however, their EDE scores are significantly lower than those of adolescents at both pre- and posttreatment assessments. Over the course of treatment with FBT, children showed statistically and clinically significant weight gain and improvements in eating-disordered thinking as measured by the EDE.
FBT appears to be an acceptable and effective treatment for AN in children.
closed_qa
Perendoscopic variceal pressure measurement: a reliable estimation of portal pressure in patients with cirrhosis?
In patients with cirrhosis, the hepatic venous pressure gradient (HVPG) is the reference method for the assessment of portal hypertension (PHT). Variceal pressure (VP) may be measured at endoscopy, but its relationship to the HVPG remains controversial. The aim of the study was to retrospectively compare HVPG and VP values obtained in a cohort of patients with cirrhosis and PHT. Within 8 days (range: 6-10 days), 64 patients in a stable condition with biopsy-proven cirrhosis [alcoholic: 47; other: 17; mean age: 56.5 years (35-70); mean Child-Pugh score: 9.4 +/- 1.9; ascites: 37/64; previous variceal bleeding ("bleeders"): 24/64; oesophageal varices (grade 2: 49; grade 3: 15)] underwent both measurement of the HVPG during transjugular liver biopsy and of VP at endoscopy, using a "home-made" pressure-sensitive gauge without needle puncture of the varix. Alcoholic hepatitis was present in 28 patients with alcoholic cirrhosis. The pressure-sensitive gauge was well tolerated. The mean HVPG and VP values were 18.5 +/- 3.4 mmHg and 19 +/- 3.7 mmHg, respectively. A significant difference was observed between "bleeders" (n=24) and non-"bleeders" (n=40) in VP values (21.4 +/- 3.3 vs 17.2 +/- 3.2 mmHg, P<0.001), but not in HVPG values (19.4 +/- 4.1 vs 17.9 +/- 2.8 mmHg, P=0.075). A positive correlation was observed between VP and HVPG values (r=0.62, P<0.0001).
In this group of patients with cirrhosis and oesophageal varices, a "home-made" pressure sensitive gauge allowed a non invasive perendoscopic measurement of VP. The positive correlation between VP and HVPG values suggests that measurement of VP may be a reliable estimate of portal pressure in these patients.
closed_qa
Is low pre-pregnancy body mass index a risk factor for preterm birth and low neonatal birth weight?
A pregnant woman's weight is an extremely important factor in the course of pregnancy and delivery. Not only obesity but also being underweight may lead to complications of pregnancy such as preterm delivery and low neonatal birth weight. The aim of this study was to analyze the relationship between a low BMI and the outcome of pregnancy, birth weight, and general well-being of the neonates. A retrospective analysis was performed of 415 patients hospitalized in the Department of Obstetrics and Reproduction, Wrocław Medical University, between 1996 and 2005. The patients were divided into three groups: I, underweight (BMI<19.8); II, appropriate weight (BMI 19.8-26.0); and III, overweight (BMI>26.0). The frequency of preterm deliveries, as well as of low neonatal birth weight (<2500 g), was higher in underweight mothers than in the other groups.
Low pre-pregnancy BMI is an important risk factor for preterm delivery. There was no correlation between BMI and the general well-being of the neonates.
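The three pre-pregnancy BMI strata used in the study can be expressed as a small Python sketch; only the cutoffs come from the study, while the function name and example values are illustrative:

```python
def bmi_group(weight_kg: float, height_m: float) -> str:
    """Assign a pre-pregnancy BMI to the study's three groups."""
    bmi = weight_kg / height_m ** 2  # BMI = weight (kg) / height (m)^2
    if bmi < 19.8:
        return "I: underweight"
    if bmi <= 26.0:
        return "II: appropriate weight"
    return "III: overweight"

bmi_group(50.0, 1.65)  # 18.4 kg/m^2 -> "I: underweight"
```

Note that these cutoffs (19.8 and 26.0) are narrower than the WHO adult categories; they follow the grouping stated in the abstract.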
closed_qa
Is triple combination of different neurohormonal modulators recommended for treatment of mild-to-moderate congestive heart failure patients?
Sixty-three patients with CHF (NYHA class II-III) due to ischemic heart disease or dilated cardiomyopathy with LV EF<40% were randomly assigned to one of the treatment variants on a 1:1:1 basis: B+Q (n = 22; mean daily dose of B 5.5 mg, Q 15.4 mg), B+V (n = 23; mean daily dose of B 4.8 mg, V 128 mg) and the combination B+Q+V (n = 18; mean daily dose of B 4.1 mg, Q 12 mg, V 82 mg). At baseline, all patients in this study were on background B treatment and, according to the study design, Q or V was then added to B at randomization. NYHA FC, 6-min walking test (6MT), quality of life (QOL), 2D echocardiography, plasma renin activity (PRA), angiotensin II (AT-II), aldosterone (Ald), norepinephrine (NE), epinephrine (E) and brain natriuretic peptide (BNP) concentrations, and 24-hour HRV parameters were investigated at baseline and at 3 and 6 months after randomization. During the study, NYHA FC improvement was revealed in all three treatment groups, with comparable significant changes in 6MT distance of 20.4%, 19.1% and 19.4% in the B+Q, B+V and B+Q+V groups, respectively. QOL score decreased maximally with the B+V combination (from 45 to 21 points). LV volumes significantly decreased and LV ejection fraction (EF) increased in all groups by the end of the study. The triple combination had no additional effect on changes in LV volumes and LVEF compared with the B+Q and B+V groups. Plasma NE concentrations decreased maximally in the B+Q group (from 650 to 430 pg/ml, p = 0.007). A lesser effect was observed with the B+Q+V combination, with no NE changes in the B+V group. The E concentration increased significantly (from 215 to 295 pg/ml, p = 0.024) in the B+Q+V group at the end of the study. Plasma AT-II concentration did not differ from baseline during the study in the B+Q group, but increased significantly in the B+V group and maximally in the B+Q+V group (from 11.4 to 23.5 pg/ml, p = 0.009). By the end of the study, plasma Ald concentrations remained significantly reduced only in the B+V group.
The level of BNP significantly decreased in all three treatment groups. Significant changes in HRV indices, both in the time and frequency domains, were revealed in the B+Q group at 3-month follow-up, and SDNN increased on month 24 (p = 0.039). These changes became insignificant at the end of the study. A lesser effect was revealed in the B+Q+V group, with an insignificant trend toward an increase of SDNN by the end of the study. HRV indices did not improve in the B+V group.
During long-term treatment, the triple combination B+Q+V has no significant advantages over B+Q and B+V in terms of functional status, QOL and parameters of LV remodeling in patients with mild-to-moderate CHF. The B+Q combination had a more potent effect on 24-hour HRV parameters, sympatho-adrenal activity and renal function than the B+V and B+Q+V combinations in the CHF patients in our study. The B+Q+V combination may have a negative effect on the neurohormonal (NH) profile (excessive activation of AT-II and E) in CHF patients. The triple combination is not recommended for therapy of stable mild-to-moderate CHF patients.
closed_qa
Aggressive clear cell chondrosarcomas: do distinctive characteristics exist?
Clear cell chondrosarcoma (CCC) is commonly considered to be a low-grade subtype of chondrosarcoma. However, a few cases of CCC behave as high-grade lesions (with early metastases or multiple/synchronous locations). Our aim was to investigate morphologic features that can help predict the aggressiveness of these CCCs. To investigate possible hallmarks of this aggressiveness, we present the clinicopathologic features of 6 cases of CCC, 4 of which showed aggressive features and 2 low-grade behavior. The patients were 5 men and 1 woman; their ages ranged from 22 to 47 years. Histologic appearance, ultrastructure, and immunohistochemical expression of metalloproteinases 1 and 2 and their inhibitors were evaluated in all 6 cases. Pain was the most common symptom; the lesions were located in the femur (4), humerus (2), and vertebral body (1), with 1 patient presenting a double/synchronous lesion. Although no major differences were detected using conventional light microscopy, ultrastructural analysis--at variance with usual cases--showed a lack of superficial microvilli in more than 50% of neoplastic cells in the aggressive cases, suggesting a less differentiated phenotype. In addition, metalloproteinase 2 was more diffusely expressed in the aggressive tumors than in the conventional CCCs, whereas p53 labeling was always negative.
The aggressive behavior of some CCCs may be, at least in part, correlated to a lesser degree of cell differentiation and to the expression of tumor cell proteins, such as metalloproteinase 2, which are able to favor neoplastic spreading.
closed_qa
Visual exploratory behaviour in infancy and novelty seeking in adolescence: two developmentally specific phenotypes of DRD4?
The present study was designed to investigate the association between visual exploratory behaviour in early infancy, novelty seeking in adolescence, and the dopamine D4 receptor (DRD4) genotype. Visual attention was measured in 232 three-month-old infants (114 males, 118 females) from a prospective longitudinal study using a habituation-dishabituation paradigm. At age 15 years, the Junior Temperament and Character Inventory (JTCI/12-18) was administered to assess adolescent novelty seeking. DNA was genotyped for the DRD4 exon III polymorphism. Boys with a higher decrement of visual attention during repeated stimulation in infancy displayed significantly higher JTCI novelty seeking at age 15 years. Furthermore, boys carrying the 7-repeat allele of DRD4 exhibited both greater rates of attention decrement in infancy and higher novelty seeking scores in adolescence. In contrast, no association between DRD4, visual attention and novelty seeking was observed in girls.
The present investigation provides further evidence supporting a role of DRD4 in novelty seeking during the course of development.
closed_qa
Expression of suppressors of cytokine signaling in diseased periodontal tissues: a stop signal for disease progression?
Inflammatory cytokines are thought to trigger periodontal tissue destruction. In addition to being regulated by anti-inflammatory mediators, their activity is under the control of suppressors of cytokine signaling (SOCS), which down-regulate signal transduction as part of an inhibitory feedback loop. We therefore investigated the expression of SOCS-1, -2 and -3, and of the cytokines tumor necrosis factor-alpha (TNF-alpha) and interleukin-10, in different forms of human periodontal disease. Quantitative real-time polymerase chain reaction (PCR) was performed on mRNA from gingival biopsies of control subjects and of patients with chronic gingivitis and chronic periodontitis. Our results show that patients with chronic gingivitis and chronic periodontitis exhibit significantly higher SOCS-1, -2 and -3, TNF-alpha and interleukin-10 mRNA expression than healthy controls. The data also demonstrate that SOCS-1 and -3 mRNA expression was higher in tissue from patients with chronic gingivitis than with chronic periodontitis, while the levels of SOCS-2, TNF-alpha and interleukin-10 mRNA were similar in these groups.
The increased expression of SOCS-1, -2 and -3 mRNA in diseased periodontal tissues is believed to be involved in the down-regulation of inflammatory cytokine and Toll-like receptor signaling, and therefore in the attenuation of both the inflammatory reaction and disease severity. Furthermore, it is possible that variation in the levels of SOCS mRNA expressed in different forms of periodontal diseases may determine the stable or progressive nature of the lesions.
closed_qa
Personalizing and externalizing biases in deluded and depressed patients: are attributional biases a stable and specific characteristic of delusions?
The purpose of this study was to explore whether the explicit and implicit attributional styles of delusional patients were associated with their clinical state, and whether attributional biases are specific to delusional psychopathology or also appear in other disorders (i.e. depression). A cross-sectional design was used. The sample consisted of 136 participants (40 acutely deluded participants, 25 remitted deluded participants, 35 depressed patients and 36 normal controls). The Internal, Personal and Situational Attributions Questionnaire (IPSAQ) and the Pragmatic Inferential Test (PIT) were used to assess explicit and implicit attributional style, respectively. All participants, with the exception of the depressed patients, showed an externalizing bias (EB) for negative events. Although both acute and remitted deluded patients showed a similar overall pattern of explicit attributions, the personalizing bias (PB) was significantly greater in the acute group. The magnitude of this bias, which was also found in the depressed patients, was significantly related to the patient's degree of severity, as assessed by the total BPRS score (r=.45, p<.001). The results on the implicit attributions were more equivocal, perhaps due to the low reliability of the PIT.
Attributional biases seem to be a stable characteristic of delusions. Yet, the PB might be a rather unspecific characteristic that varies with the degree of the severity of psychopathology. The implications of these findings for understanding the role of attributional biases in depression and delusion formation are discussed.
closed_qa
Can dynamic susceptibility contrast magnetic resonance imaging replace single-photon emission computed tomography in the diagnosis of patients with Alzheimer's disease?
To compare single-photon emission computed tomography (SPECT) and magnetic resonance imaging (MRI) in a cohort of patients examined for suspected dementia, including patients with no objective cognitive impairment (control group), mild cognitive impairment (MCI), and Alzheimer's disease (AD). Twenty-four patients, eight with AD, 10 with MCI, and six controls, were investigated with SPECT using 99mTc-hexamethylpropyleneamine oxime (HMPAO) and with dynamic susceptibility contrast magnetic resonance imaging (DSC-MRI) using gadobutrol. Three observers performed a visual interpretation of the SPECT and MR images using a four-point visual scale. SPECT was superior to DSC-MRI in differentiating normal from pathological findings. All three observers showed statistically significant results in discriminating between the control group, AD, and MCI by SPECT, with P values of 0.0006, 0.04, and 0.01 for the three observers, respectively. The results were not significant for MR (P values 0.8, 0.1, and 0.2, respectively).
DSC-MRI could not replace SPECT in the diagnosis of patients with Alzheimer's disease. Several patient- and method-related improvements should be made before this method can be recommended for clinical practice.
closed_qa
Can you find the source of her pain?
The features of abdominal pain in this gravid patient mimicked more common diagnoses such as preterm labor, chorioamnionitis, and appendicitis. A 40-year-old multipara presented at 30 weeks and 6 days with abdominal pain. The cause was not discovered until the time of cesarean delivery, several days after admission.
This common gynecologic problem can precipitate severe problems in a pregnant woman and should be considered in the differential diagnosis of pregnant patients presenting with pain.
closed_qa
Is fusion necessary for surgically treated burst fractures of the thoracolumbar and lumbar spine?
A prospective clinical trial was conducted to compare the results of fusion versus nonfusion for surgically treated burst fractures of the thoracolumbar and lumbar spine. The operative results of surgically treated burst fractures with short segmental fixation have been well documented, but there is no report comparing the results of fusion and nonfusion. Fifty-eight patients were included in this study, with the following inclusion criteria: neurologically intact spine with a kyphotic angle ≥20 degrees, decreased vertebral body height ≥50% or canal compromise ≥50%; incomplete neurologic deficit with a canal compromise of 50%; complete neurologic deficit; and multilevel spinal injury or multiple traumas. All patients were randomly assigned to the fusion or nonfusion group, and operative treatment with posterior reduction and instrumentation was carried out. Posterior fusion with autogenous bone graft was performed for the fusion group (n = 30); no fusion procedure was done for the nonfusion group (n = 28). The average follow-up period was 41 months (range, 24-71 months). The average loss of kyphotic angle did not differ significantly between the two groups. Several parameters were statistically significantly better in the nonfusion group, including angular change in the flexion-extension lateral view (4.8 degrees vs. 1.0 degrees), loss of correction of decreased vertebral body height (3.6% vs. 8.3%), intraoperative estimated blood loss (303 mL vs. 572 mL), and operative time (162 minutes vs. 224 minutes). Scores on the low back outcome scale did not differ significantly between the two groups.
The short-term results of short segmental fixation without fusion for surgically treated burst fractures of the thoracolumbar spine were satisfactory. The advantages of instrumentation without fusion are the elimination of donor site complications, saving more motion segments, and reducing blood loss and operative time.
closed_qa
Is cardiac surgery warranted in children with Down syndrome?
To compare children with Down syndrome and children without Down syndrome and to investigate whether there is a significant difference between these two groups in the burden placed on the health care system by repair of congenital heart disease. This study is a retrospective case-control review conducted at Red Cross War Memorial Children's Hospital, Cape Town, South Africa. The sample group of 50 children with Down syndrome who received cardiac surgery between January 1998 and June 2003 was compared with a control group of 50 nonsyndromic children who received cardiac surgery during the same period. The outcome measures were sex and diagnoses (cardiac and noncardiac), number of days spent in hospital and in the ICU, complication rates, re-operation rates, early mortality rates, and planned further cardiac surgery; the costs of these outcomes were not quantified in exact monetary terms. There was no significant difference between the two groups in terms of the burden placed on the health care system. Similar complication, re-operation and early mortality rates were recorded for both groups. The Down syndrome group appeared to benefit more from cardiac surgery than the non-Down syndrome group.
Denying cardiac surgery to children with Down syndrome does not improve the efficiency of resource allocation. It is therefore not reasonable to suggest that the problem of scarce resources can be ameliorated by discriminating against children with Down syndrome.
closed_qa
Are there differences between women with urge predominant and stress predominant mixed urinary incontinence?
We sought to determine whether there are differences in clinical and urodynamic parameters between women with urge-predominant and those with stress-predominant mixed urinary incontinence (MUI). The charts of 99 female patients with complaints of MUI were reviewed. Patients were divided into two groups based on the subjective predominance of either stress incontinence (MSUI) or urge incontinence (MUUI). All patients completed a subjective evaluation including the AUA Symptom Index, Urogenital Distress Inventory (UDI-6), and Incontinence Impact Questionnaire (IIQ-7). Objective non-invasive measures included physical examination, a 48-hr voiding diary, and a 24-hr pad test. Videourodynamic studies (VUDS), performed in all patients, were reviewed, and the presence and characteristics of detrusor overactivity (DO) and stress urinary incontinence (SUI) were noted. There were no significant differences between the groups with respect to symptom scores. MUUI patients had significantly higher pad usage and lower maximum and average voided volumes than MSUI patients. They were also more likely to have lower urodynamic bladder capacities and demonstrable DO (70% vs. 26%) on VUDS, with contractions occurring at lower bladder volumes and with higher amplitude. MSUI patients were more likely to have demonstrable SUI on physical examination (63% vs. 16%) and on VUDS (100% vs. 61%).
There do appear to be differences in clinical and urodynamic parameters between patients with stress predominant and urge predominant MUI. These may help to determine which component of the mixed incontinence is more problematic.
closed_qa
Does long-term losartan- vs atenolol-based antihypertensive treatment influence collagen markers differently in hypertensive patients?
The aim of this study was to investigate the effects of losartan- vs atenolol-based antihypertensive treatment on circulating collagen markers beyond the initial blood pressure (BP) reduction. In 204 patients with hypertension and left ventricular (LV) hypertrophy we measured serum concentration of carboxy-terminal telopeptide of type I procollagen (ICTP), carboxy-terminal propeptide of type I procollagen (PICP), amino-terminal propeptide of type III procollagen (PIIINP), amino-terminal propeptide of type I procollagen (PINP) and LV mass by echocardiography at baseline and annually during 4 years of losartan- or atenolol-based antihypertensive treatment; 185 patients completed the study. Beyond the first year of treatment systolic and diastolic BP, LV mass index (LVMI) as well as collagen markers did not change significantly and were equal in the two treatment groups. Changes in PICP during first year of treatment were related to subsequent changes in LV mass index after 2 and 3 years of treatment (r=0.28 and r=0.29, both p<0.05) in patients randomized to losartan, but not atenolol.
Long-term losartan- vs atenolol-based antihypertensive treatment did not influence collagen markers differently, making a BP-independent effect of losartan on collagen markers unlikely. However, initial reduction in circulating PICP may predict later regression of LV hypertrophy during losartan-based antihypertensive treatment.
closed_qa
Is triple combination of different neurohormonal modulators recommended for treatment of mild-to-moderate congestive heart failure patients?
Sixty-three patients with CHF (NYHA class II-III) as a result of ischemic heart disease or dilated cardiomyopathy with LV EF<40% were randomly assigned to one of the treatment variants on a 1:1:1 basis: B+Q (n = 22), B+V (n = 23) and the combination B+Q+V (n = 18). At baseline, all patients in this study were on background B treatment and, according to the study design, Q or V was then added to B at randomization. NYHA FC, 6-min walking test (6MT), QOL, 2D echocardiography, plasma renin activity (PRA), angiotensin II (AT-II), aldosterone (Ald), norepinephrine (NE), epinephrine (E) and brain natriuretic peptide (BNP) concentrations, and 24-hour HRV parameters were investigated at baseline and at 3 and 6 months after randomization. During the study, NYHA FC improvement was revealed in all three treatment groups, with comparable significant changes in 6MT distance of 20.4%, 19.1% and 19.4% in the B+Q, B+V and B+Q+V groups, respectively. QOL score decreased maximally with the B+V combination (from 45 to 21 points). LV volumes significantly decreased and LV ejection fraction (EF) increased in all groups by the end of the study. The triple combination had no additional effect on changes in LV volumes and LVEF compared with the B+Q and B+V groups. Plasma NE concentrations decreased maximally in the B+Q group (from 650 to 430 pg/ml, p = 0.007). A lesser effect was observed with the B+Q+V combination, with no NE changes in the B+V group. The E concentration increased significantly (from 215 to 295 pg/ml, p = 0.024) in the B+Q+V group at the end of the study. Plasma AT-II concentration did not differ from baseline during the study in the B+Q group, but increased significantly in the B+V group and maximally in the B+Q+V group (from 11.4 to 23.5 pg/ml, p = 0.009). By the end of the study, plasma Ald concentrations remained significantly reduced only in the B+V group. The level of BNP significantly decreased in all three treatment groups.
Significant changes in HRV indices, both in the time and frequency domains, were revealed in the B+Q group at 3-month follow-up, and SDNN increased on month 24 (p = 0.039). These changes became insignificant at the end of the study. A lesser effect was revealed in the B+Q+V group, with an insignificant trend toward an increase of SDNN by the end of the study. HRV indices did not improve in the B+V group.
During long-term treatment, the triple combination B+Q+V has no significant advantages over B+Q and B+V in terms of functional status, QOL and parameters of LV remodeling in patients with mild-to-moderate CHF. The B+Q combination had a more potent effect on 24-hour HRV parameters, sympatho-adrenal activity and renal function than the B+V and B+Q+V combinations in the CHF patients in our study. The B+Q+V combination may have a negative effect on the neurohormonal (NH) profile (excessive activation of AT-II and E) in CHF patients. The triple combination is not recommended for therapy of stable mild-to-moderate CHF patients.
closed_qa
Characteristics of the action of various drugs blocking atrioventricular conduction (beta-blockers, verapamil, diltiazem) in constant atrial fibrillation with tachyarrhythmia. Is monotherapy optimal?
Thirty patients with CCF (mean age 64.5 +/- 9.5 years) received beta-blockers (n = 10; atenolol in a dose of 50.0 +/- 23.2 mg/day or metoprolol in a dose of 45.0 +/- 20.9 mg/day), verapamil (n = 10; 192.0 +/- 83.9 mg/day), or diltiazem (n = 10; 286.6 +/- 107.2 mg/day). The patients were studied with Holter ECG monitoring (Schiller MT-100, Switzerland) and high-resolution ECG (Cardis electrocardioanalyser, Geolink-electronics, RF), with construction of periodograms of ff waves and RR interval histograms (IHrr) and estimation of rhythm variability (SDRR, rMSSD, pNN50). Beta-blockers (atenolol, metoprolol), verapamil, and diltiazem had no significant effect on the period of ff waves. The degree of mean heart rate lowering decreased in the order beta-blockers - verapamil - diltiazem (30.1 +/- 12.5, 25.0 +/- 18.8, and 22.0 +/- 23.6 beats/min; differences insignificant), corresponding to the degree of RRmin increase (0.12 +/- 0.04, 0.08 +/- 0.07, and 0.07-0.08). In CCF, the inhibiting effect of beta-blockers and verapamil is substrate-dependent: the shorter the baseline RRmin (and the higher the heart rate), the more potent the effect of beta-blockers and verapamil (r = -0.58 and r = -0.57, p<0.05) and the reduction of mean heart rate (r = -0.74 and r = -0.84, p<0.05). The dependence of the diltiazem effect on initial RRmin was inverse. In contrast to the Ca antagonists (verapamil, diltiazem), beta-blockers increased latent conduction, manifested as a significant rise in RRmax, in the range of RR intervals (the difference between RRmax and RRmin), and in the latent-conduction index (RRmax/RRmin, by 0.4 versus 0 and 0.1 in the verapamil and diltiazem groups). In addition to an insufficient shift of RRmod, a non-optimal rhythm structure appeared: a combination of a large number of short and long RR intervals with a small number of intermediate ones.
Verapamil and diltiazem improved the rhythm pattern through a proportional increase of RRmin and shift of RRmod (r = 0.72 and r = 0.71, p<0.05) and the absence of a distinct effect on latent conduction. Between-group differences in the dynamics of SDRR, rMSSD, and pNN50 were insignificant. Diltiazem in doses of 360-480 mg/day moderately increased latent conduction but had low efficacy in the presence of an early RR peak (0.28-0.46 s).
Monotherapy with AV-blocking drugs was possible only in patients with moderate tachycardia and no fibrillation waves of large or middle period (0.15 s and higher), and it should be conducted under RRmin control. In other cases, these drugs are either of low efficacy or promote a non-optimal rhythm structure. Therefore, combined therapy with AV-blocking drugs and cardiac glycosides is indicated for CCF patients.
closed_qa
Are laboratory-based antibiograms reliable to guide the selection of empirical antimicrobial treatment in patients with hospital-acquired infections?
Antibiograms are often taken into account to define a rational selection of an empirical antimicrobial therapy for treating patients with hospital-acquired infections. In this study, we performed a paired comparison between the antibiogram constructed with laboratory-based data and that formed with data subjected to prior clinical validation. Between 2003 and 2005, the laboratory of microbiology printed in duplicate every individual susceptibility report corresponding to hospitalized patients and the copy was sent to the department of infection control. Every individual report was assessed in real time at the bedside of the patient by a multidisciplinary team for clinical significance and appropriateness of the specimen, as well as for the type, source and origin of the infection. Cumulative resistance rates were estimated in parallel at the laboratory with the whole data, and at the infection control department with data subjected to prior clinical validation. These rates were designated as 'laboratory-based' and 'clinically based', respectively. A total of 2305 individual susceptibility reports were assessed. Only 1429 (62.0%) were considered as clinically significant by the multidisciplinary team. Escherichia coli, Enterobacter cloacae, Citrobacter freundii group, Klebsiella species and Proteus mirabilis resistant to broad-spectrum cephalosporins, as well as methicillin-resistant Staphylococcus aureus, were significantly more frequent in the clinically based rates (P<or = 0.03).
Laboratory-based data underestimate the frequency of several major resistant organisms in patients with hospital-acquired infections. Prior clinical validation of the individual susceptibility reports appears to be a suitable strategy for obtaining more reliable data.
closed_qa
Nonsteroidal anti-inflammatory drug-induced fracture nonunion: an inhibition of angiogenesis?
Approximately 5% to 10% of fractures may result in delayed union or nonunion. The results of research done over the past three decades have shown that the use of nonsteroidal anti-inflammatory drugs (NSAIDs) has an inhibitory effect on fracture repair, but the exact mechanism of action remains to be elucidated. Cancer research has identified that NSAIDs impede cell proliferation by inhibiting angiogenesis. It is proposed that a similar mechanism occurs in the induction of NSAID-induced nonunions. This hypothesis was investigated in a randomized placebo-controlled trial of the NSAID rofecoxib with use of a murine femoral fracture model. Two hundred and forty mice were randomized to receive either the nonsteroidal anti-inflammatory drug rofecoxib (5 mg/kg orally) in a 0.5% methylcellulose solution (the NSAID group) or the 0.5% methylcellulose solution only (the control group). Two hundred and thirty-five of the 240 mice underwent surgery to induce an open transverse middiaphyseal femoral fracture, which was then treated with use of a custom-made external fixator. Five additional animals underwent sham surgery with no fracture induced. Outcome measures included radiographic assessment, histologic analysis, biomechanical testing, and use of laser Doppler flowmetry to assess blood flow across the fracture gap. Radiography revealed similar healing patterns in both groups; however, at the later stages (day 32), the NSAID group had poorer healing. Histological analysis demonstrated that the control animals healed more quickly (at days 24 and 32) and had more callus and less fibrous tissue (at days 8 and 32) than the NSAID animals did. Biomechanical testing found that the control animals were stronger at day 32. Both groups exhibited a similar pattern of blood flow; however, the NSAID group exhibited a lower median flow from day 4 onward (significant at days 4, 16, and 24).
Positive correlations were demonstrated between both histological and radiographic assessments of healing and increasing blood flow. NSAID-treated animals exhibited lower blood flow and poorer healing by all parameters. Regression analysis, however, demonstrated that the negative effect of NSAIDs on fracture repair is independent of its inhibitory action on blood flow.
Following the development of a novel method of analyzing functional vascularity across a fracture gap, we have demonstrated that the cyclooxygenase-2 (COX-2) inhibitor rofecoxib has a significant negative effect on blood flow across the fracture gap as well as an inhibiting effect on fracture repair.
closed_qa
Is there a safe area for the axillary nerve in the deltoid muscle?
Several authors have defined a variety of so-called safe zones for deltoid-splitting incisions. The first aim of the present study was to investigate the distance of the axillary nerve from the acromion and its relation to arm length. The second aim was to identify a safe area for the axillary nerve during surgical dissection of the deltoid muscle. Twenty-four shoulders of embalmed adult cadavers were included in the study. The distance from the anterior edge of the acromion to the course of the axillary nerve was measured and was recorded as the anterior distance. The same measurement from the posterior edge of the acromion to the course of the axillary nerve was made and was recorded as the posterior distance for each limb. Correlation analysis was performed between the arm length and the anterior distance and the posterior distance for each limb. The ratios between arm length and the anterior and posterior distances were calculated for each case and were recorded as an anterior index and a posterior index. The average arm length was 30.40 cm. The average anterior distance was 6.08 cm, and the average posterior distance was 4.87 cm. There was a significant correlation between arm length and both anterior distance (r = 0.79, p<0.001) and posterior distance (r = 0.61, p = 0.001). The axillary nerve was not found to lie at a constant distance from the acromion at every point along its course. The average anterior index was 0.20, and the average posterior index was 0.16.
The present study describes a safe area above the axillary nerve that is quadrangular in shape, with the length of the lateral edges being dependent on the individual's arm length. Using this safe area should provide a safe exposure for the axillary nerve during shoulder operations.
closed_qa
Preservation of the ulnar bursa within the carpal tunnel: does it improve the outcome of carpal tunnel surgery?
It was hypothesized that preserving a layer of gliding tissue, the parietal layer of the ulnar bursa, between the contents of the carpal tunnel and the soft tissues incised during carpal tunnel surgery might reduce scar pain and improve grip strength and function following open carpal tunnel decompression. Patients consented to randomization to treatment with either preservation of the parietal layer of the ulnar bursa beneath the flexor retinaculum at the time of open carpal tunnel decompression (fifty-seven patients) or division of this gliding layer as part of a standard open carpal tunnel decompression (sixty-one patients). Grip strength was measured, scar pain was rated, and the validated Patient Evaluation Measure questionnaire was used to assess symptoms and disability preoperatively and at eight to nine weeks following the surgery in seventy-seven women and thirty-four men; the remaining seven patients were lost to follow-up. There was no difference between the groups with respect to age, sex, hand dominance, or side of surgery. Grip strength, scar pain, and the Patient Evaluation Measure score were not significantly different between the two groups, although there was a trend toward a poorer subjective outcome as demonstrated by the questionnaire in the group in which the ulnar bursa within the carpal tunnel had been preserved. Preserving the ulnar bursa within the carpal tunnel did, however, result in a lower prevalence of suspected wound infection or inflammation (p = 0.04).
In this group of patients, preservation of the ulnar bursa around the median nerve during open carpal tunnel release produced no significant difference in grip strength or self-rated symptoms. We recommend incision of the ulnar bursa during open carpal tunnel decompression to allow complete visualization of the median nerve and carpal tunnel contents.
closed_qa
Should our well-child care system be redesigned?
The goal was to examine pediatricians' views about whether and how well-child care for children 0 to 5 years of age should be changed. A mail survey of a national random sample of 1000 general pediatricians was performed with a survey instrument that examined pediatricians' attitudes and behaviors toward our current way and an ideal way of providing well-child care. Results were analyzed for the following 3 major domains of change in well-child care: provider type, visit format, and visit location. Sixty percent (n = 502) of eligible subjects responded to the survey. Nearly all respondents (97%) rated the current US system as excellent or good in providing well-child care. Most pediatricians (85%-91%) reported that they are currently the main providers of anticipatory guidance, developmental screening, and psychosocial screening. However, a majority (54%-60%) reported that, in an ideal system that maximized the effectiveness and efficiency of care, nonphysicians would provide these services. Fewer pediatricians (24%) reported that ideally nonphysicians should provide the physical examination. The majority of respondents (79%-93%) reported that at least some anticipatory guidance, minor acute care, and chronic care services could be conducted through telephone or e-mail communication, and 55% stated that at least some well-child care services should be provided in alternative locations, such as day care centers. In multivariate analysis, support for these changes was distributed widely across pediatricians with varying personal and practice characteristics.
Although most pediatricians are generally satisfied with our current way of providing well-child care, a majority think that a system that is less reliant on physicians and face-to-face office visits would be a more effective and efficient way to provide care.
closed_qa
Is septal glucose metabolism altered in patients with left bundle branch block and ischemic cardiomyopathy?
Left bundle branch block (LBBB) is common in patients with heart failure (HF) and contributes to left ventricular (LV) dysfunction. The abnormal septal motion may alter septal metabolic demand, but this has not been well characterized in patients with ischemic cardiomyopathy (ICM) and LV dysfunction. The aim of this study was to determine the effect of LBBB on septal metabolism in patients with ICM, LV dysfunction, and LBBB. Fifty-three patients with LV dysfunction and ICM were identified: 34 with LBBB and 19 with a normal QRS (<or=100 ms; control patients). PET using (18)F-FDG and (82)Rb was used to measure myocardial glucose metabolism and perfusion, respectively. Perfusion-metabolism differences were determined. Scar scores (matched decreases in (18)F-FDG and (82)Rb), mismatch scores (hibernating myocardium with decreased (82)Rb relative to (18)F-FDG), and reverse-mismatch (R-MM) scores (reduced (18)F-FDG relative to (82)Rb) were assessed in the septum and lateral wall. (18)F-FDG uptake in the septum was reduced in patients with LBBB (64.0% +/- 15.4%) compared with control patients (74.9% +/- 14.3%; P<0.05). Mean septal R-MM was greater in patients with LBBB (19.1% +/- 15.3%) versus control patients (4.7% +/- 10.6%; P<0.05). However, 32% (11/34) of patients with LBBB did not demonstrate septal R-MM, 91% (10/11) of whom demonstrated lateral wall perfusion defects. Of the 68% (23/34) of patients with LBBB and septal R-MM, 52% (12/23) demonstrated lateral wall perfusion defects (P<0.05). There was a significant difference in the percentage of the lateral wall with scar between those with septal R-MM (9.3% +/- 10.5%) and those without (19.9% +/- 14.3%; P<0.05).
Previously, LBBB was believed to be characterized by reduced glucose metabolism relative to perfusion in the septum; however, this is not always the case in ICM. LBBB is not associated with septal R-MM in >30% of this patient population. Absence of this finding was often associated with lateral wall perfusion defects, suggesting an alteration in the metabolic demand on the septum. This may have implications for HF therapies such as resynchronization and requires further study.
closed_qa
18F-FDG PET definition of gross tumor volume for radiotherapy of non-small cell lung cancer: is a single standardized uptake value threshold approach appropriate?
PET with (18)F-FDG has been used in radiation treatment planning for non-small cell lung cancer (NSCLC). Thresholds of 15%-50% of the maximum standardized uptake value (SUV(max)) have been used for gross tumor volume (GTV) delineation by PET (PET(GTV)), with 40% being the most commonly used value. Recent studies indicated that 15%-20% may be more appropriate. The purposes of this study were to determine which threshold generates the best volumetric match to GTV delineation by CT (CT(GTV)) for peripheral NSCLC and to determine whether that threshold can be generalized to tumors of various sizes. Data for patients who had peripheral NSCLC with well-defined borders on CT and an SUV(max) of greater than 2.5 were reviewed. PET/CT datasets were reviewed, and a volume of interest was determined to represent the GTV. The CT(GTV) was delineated by using standard lung windows and reviewed by a radiation oncologist. The PET(GTV) was delineated automatically by use of various percentages of the SUV(max). The PET(GTV)-to-CT(GTV) ratios were compared at various thresholds, and a ratio of 1 was considered the best match, or the optimal threshold. Twenty peripheral NSCLCs with volumes easily defined on CT were evaluated. The SUV(max) (mean +/- SD) was 12 +/- 8, and the mean CT(GTV) was 198 cm(3) (97.5% confidence interval, 5-1,008). The SUV(max) were 16 +/- 5, 13 +/- 9, and 3.0 +/- 0.4 for tumors measuring greater than 5 cm, 3-5 cm, and less than 3 cm, respectively. The optimal thresholds (mean +/- SD) for the best match were 15% +/- 6% for tumors measuring greater than 5 cm, 24% +/- 9% for tumors measuring 3-5 cm, 42% +/- 2% for tumors measuring less than 3 cm, and 24% +/- 13% for all tumors. The PET(GTV) at the 40% and 20% thresholds underestimated the CT(GTV) for 16 of 20 and 14 of 20 lesions, respectively. The mean difference in the volumes (PET(GTV) minus CT(GTV) [PET(GTV) - CT(GTV)]) at the 20% threshold was 79 cm(3) (97.5% confidence interval, -922 to 178).
The PET(GTV) at the 20% threshold overestimated the CT(GTV) for all 4 tumors measuring less than 3 cm and underestimated the CT(GTV) for all 6 tumors measuring greater than 5 cm. The CT(GTV) was inversely correlated with the PET(GTV) - CT(GTV) at the 20% threshold (R(2) = 0.90, P<0.0001). The optimal threshold was inversely correlated with the CT(GTV) (R(2) = 0.79, P<0.0001).
No single threshold delineating the PET(GTV) provides accurate volume definition, compared with that provided by the CT(GTV), for the majority of NSCLCs. The strong correlation of the optimal threshold with the CT(GTV) warrants further investigation.
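The delineation rule the study applies (keep voxels at or above a fixed fraction of SUVmax) is simple to state in code. The sketch below is illustrative only, with hypothetical voxel data and a hypothetical `pet_gtv` helper; it is not the software used in the study:

```python
def pet_gtv(suv_values, frac, voxel_vol_cm3):
    """Threshold-based PET GTV: keep voxels whose SUV is at or above
    frac * SUVmax, and return (voxel count, volume in cm^3).

    suv_values: flat list of per-voxel SUVs (hypothetical data);
    frac: threshold fraction, e.g. 0.40 for the common 40% rule."""
    suv_max = max(suv_values)
    cutoff = frac * suv_max
    n = sum(1 for v in suv_values if v >= cutoff)
    return n, n * voxel_vol_cm3

# Hypothetical image: 27 tumor voxels at SUV 10, one rim voxel at SUV 4,
# 100 background voxels at SUV 1; 4 mm isotropic voxels (0.064 cm^3 each).
suvs = [10.0] * 27 + [4.0] + [1.0] * 100
print(pet_gtv(suvs, 0.40, 0.064))  # the 40% cutoff (SUV 4.0) keeps 28 voxels
print(pet_gtv(suvs, 0.50, 0.064))  # a 50% cutoff keeps only the 27 tumor voxels
```

The toy example only shows why a lower fractional cutoff always yields an equal or larger volume; as the study's results indicate, no single fraction reproduces the CT-defined volume across tumor sizes.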
closed_qa
Can pancreatic steatosis explain the finding of pancreatic hyperenzymemia in subjects with dyslipidemia?
It has been proposed by some investigators that benign pancreatic hyperenzymemia could result from pancreatic steatosis that they believe would have been caused by dyslipidemia; their diagnosis of steatosis was based on the finding of a hyperechogenic pancreas at ultrasound. The aim of this study was to assess the validity of this proposed model. The study group was composed of 18 subjects with benign pancreatic hyperenzymemia, 12 men and 6 women; mean age, 55 years; range, 38 to 68 years. All 18 had dyslipidemia and 9 had hyperechogenic pancreas at ultrasound. In addition, 6 subjects with benign pancreatic hyperenzymemia but who did not have dyslipidemia or hyperechogenic pancreas and 10 healthy subjects with none of these conditions were also studied as controls. In each of these subjects, magnetic resonance imaging of the pancreas was performed to assess the presence of pancreatic steatosis. Magnetic resonance imaging showed normal pancreas with no signs of fatty infiltration in all 18 subjects with dyslipidemia, including those with both dyslipidemia and hyperechogenic pancreas at ultrasound. A similar result was found in all control subjects.
The finding of a completely normal pancreas at magnetic resonance imaging does not support the proposed model in which pancreatic hyperenzymemia in subjects with dyslipidemia is attributed to pancreatic steatosis.
closed_qa
Is defective pancreatic beta-cell mass environmentally programmed in Goto-Kakizaki rat model of type 2 diabetes?
The Goto-Kakizaki (GK) rat is a spontaneous model of type 2 diabetes with well-established pathological pancreatic beta-cell development. Hyperglycemia experienced during early postnatal life contributes to the programming of the endocrine pancreas. We have analyzed the consequences of a hyperglycemic versus a euglycemic suckling period for the pancreatic beta-cell mass and the in vivo glucose tolerance and insulin secretion in 4-week-old unweaned control Wistar (W) rats, diabetic GK rats, and offspring derived from crosses between normoglycemic W and diabetic GK rats. Mother/father crosses yielded offspring designated as follows: W/W, GK/GK, W/GK, and GK/W. In vivo glucose tolerance and insulin secretion tests were performed on males 4 weeks after birth, that is, just before weaning. Beta-cell mass was determined by immunohistochemistry and morphometry. Four-week-old W/GK and GK/W rats are normoglycemic and normoinsulinemic and display a similarly small beta-cell mass. Both W/GK and GK/W rats exhibit in vivo glucose intolerance and defective insulin secretion in response to glucose.
Our data obtained from crossbreeding studies during suckling period suggest that the defective pancreatic beta-cell mass is not environmentally programmed in the GK model of type 2 diabetes. Rather, they support the hypothesis that the beta-cell mass defect in the GK is linked to genetic determinism.
closed_qa
Sacral nerve stimulation in fecal incontinence: are there factors associated with success?
Sacral nerve stimulation has been used successfully in treating fecal incontinence. This study was designed to evaluate the proportion of patients with unsuccessful implantation despite positive test stimulation and to examine and compare factors associated with the success of transitory and permanent sacral nerve stimulation. A total of 61 patients (55 females; median age, 56 (range, 33-77) years) with refractory fecal incontinence underwent temporary stimulation. A 50 percent or greater improvement in the number of episodes of fecal incontinence or urgency was required to proceed to permanent implantation and was the criterion of success of permanent sacral nerve stimulation at the last follow-up visit in implanted patients. The factors compared between the success and the failure groups during temporary and permanent stimulation were patients' age and gender, diagnosis and characteristics of fecal incontinence, previous surgery, quality-of-life scores, anorectal manometry, endoanal ultrasound, and electrophysiologic tests performed before stimulation. Temporary stimulation was successful in 35 patients (57.4 percent). A permanent neurostimulation device was implanted in 33 patients. Age was the only factor related to success of the temporary stimulation (P=0.03). After permanent implantation, 31 percent of patients did not attain the screening-phase results for the number of episodes of fecal incontinence or urgency. A neurologic disorder was more frequently the origin of fecal incontinence in the success group compared with the others (P=0.03). The left bulbocavernosus reflex was more frequently delayed in the success group than in the others (P=0.03), and a prolonged or absent bulbocavernosus reflex was more frequent in the success group than in the failure group (P=0.03).
Patients with fecal incontinence from neurologic origins could be good candidates for sacral nerve stimulation.
closed_qa
Preoperative screening for coagulation disorders in children undergoing adenoidectomy (AT) and tonsillectomy (TE): does it prevent bleeding complications?
Bleeding remains the most important complication of adenotonsillectomy in children. Preoperative coagulation tests are widely used to detect unknown bleeding disorders. To determine the efficacy of preoperative coagulation screening in preventing bleeding complications. Study group 1: 148 healthy children referred by the otorhinolaryngology department for preoperative pediatric examination. Study group 2: 124 healthy children sent to the hemostaseologic clinic for preoperative investigation of a prolonged PTT. The incidence of relevant coagulation disorders detected by a standardized bleeding history and coagulation screening tests was studied prospectively in the 2 study groups planned for AT and/or TE. The frequency of abnormal bleeding was investigated retrospectively in those children who underwent surgery. Bleeding disorders were detected in 7/148 and 15/124 children in study groups 1 and 2, respectively. 141/148 and 79/124 children actually underwent surgery (62 TE +/- AT plus 79 AT alone, and 26 TE +/- AT plus 53 AT alone, respectively). Major bleeding occurred in 1/141 patients (1 TE) in study group 1. Preoperatively, this child had shown normal coagulation screening tests. In 4/79 patients (3 TE, 1 AT) in study group 2, surgery was complicated by major bleeding. Despite extensive testing, no relevant bleeding disorder had been diagnosed in these children preoperatively. The sensitivity of coagulation screening tests for major bleeding was 0 in study group 1.
In our study, coagulation screening failed to effectively identify patients at risk of bleeding.
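The sensitivity of 0 reported above follows directly from the standard definition, sensitivity = TP / (TP + FN). A minimal sketch (the `sensitivity` helper is illustrative; the counts are the study-group-1 figures for major bleeding, where the one child who bled had normal screening tests):

```python
def sensitivity(tp, fn):
    """Sensitivity = TP / (TP + FN); returns None when there are no
    positive cases (TP + FN == 0) to evaluate against."""
    total = tp + fn
    return tp / total if total else None

# Study group 1: one major bleed, and that child's preoperative
# coagulation screening was normal -> TP = 0, FN = 1.
print(sensitivity(0, 1))  # 0.0
```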
closed_qa
Can the determination of IgG antibodies to the pertussis toxin help the diagnosis of the disease?
In the differential diagnosis of a protracted irritating cough, we should always consider the possibility of pertussis. Serology performed primarily in a late stage of the disorder does not always provide a clear answer. We wanted to verify whether a quantitative determination of IgG antibodies to the pertussis toxin (IgG-PT) could help establish a clear diagnosis. Between 1 January and 30 June 2005, we performed serological investigations in 139 children presenting with an irritating cough or after administration of an acellular pertussis vaccine. In 95 children we compared the serological responses of agglutination antibodies and IgG-PT. After vaccination, the children presented different levels of antibodies, and these were not always identical between the two types of serological response. Children with clinical manifestations of pertussis showed good antibody formation, but the two kinds of antibodies often persisted at unchanged levels for long periods.
The determination of IgG-PT can assist the diagnosis of the disease, but this investigation alone cannot yield a clear-cut confirmation of pertussis.
closed_qa
Is FKBP5 a genetic marker of affective psychosis?
A dysregulation of the hypothalamic-pituitary-adrenal (HPA) axis has been proposed as an important pathogenic factor in depression. Genetic variants of FKBP5, a protein of the HPA system that modulates the glucocorticoid receptor, have been reported to be associated with improved response to medical treatment and an increased number of depressive episodes. We examined three single nucleotide polymorphisms (SNPs) in FKBP5, rs4713916 in the proposed promoter region, rs1360780 in the second intron, and rs3800373 in the 3'-untranslated region (3'-UTR), in a case-control study of subjects of Caucasian origin (affective psychosis: n = 248; controls: n = 188) for genetic association and association with disease-related traits. Allele and genotype frequencies of rs4713916, rs1360780, and rs3800373 were not significantly different between cases and controls. Two three-locus haplotypes, G-C-T and A-T-G, accounted for 86.2% in controls. Odds ratios were not increased between cases and controls, except for the rare haplotype G-C-G (OR 6.81), which represented 2.1% of cases and 0.3% of controls. The frequency of the rs4713916 AG genotype in patients deviated from the expected Hardy-Weinberg equilibrium; the AA genotype at rs4713916 in monopolar depression (P = 0.011) and the two-locus haplotype rs1360780 T - rs3800373 T in the total sample (overall P = 0.045) were nominally associated with a longer duration of disease.
Our data do not support a significant genetic contribution of FKBP5 polymorphisms and haplotypes to affective psychosis, and the findings are inconclusive regarding their contribution to disease-related traits.
closed_qa
Orthotopic ileal neobladder reconstruction for bladder cancer: is adjuvant chemotherapy safe?
We examined our database of patients undergoing radical cystectomy (RC) with an orthotopic neobladder (NB) to determine whether adjuvant chemotherapy is safe in this group. We performed a retrospective analysis of patients who underwent radical cystectomy and urinary diversion between 1992 and 2004. Relevant clinical and therapeutic data were entered into a database. High-risk bladder cancer patients who underwent NB were identified. They were stratified into 2 groups: those who received adjuvant chemotherapy and those who did not. The incidence of complications between the 2 groups was analyzed and compared. Over the 12-year period, 136 patients underwent RC and NB construction for bladder cancer. Of these, 83 patients were at high risk for recurrence. Nineteen patients received adjuvant chemotherapy and 64 did not. The complication rate was 53% in the adjuvant chemotherapy group and 23% in those who did not receive chemotherapy. There were no perioperative or treatment-related deaths. There were 2 patients with grade 4 toxicity in the adjuvant chemotherapy group. There was a statistically significant difference between the two groups with regard to the incidence of complications. However, none of these complications was life-threatening; all required only conservative treatment and caused no long-term disability.
Adjuvant chemotherapy is a safe treatment for patients undergoing RC and NB substitution. Hence, the option of an orthotopic NB should not be denied to selected bladder cancer patients at high risk for recurrent disease.
closed_qa
Do leptomeningeal venous drainage and dysplastic venous dilation predict hemorrhage in dural arteriovenous fistula?
The objective of this study was to correlate the presence of leptomeningeal venous drainage and dysplastic venous dilation with the risk of intracranial hemorrhage in DAVFs. The subjects were 93 patients with DAVFs who were studied retrospectively with regard to therapeutic success and failure, who had undergone neurosurgery, embolization, or a combination of both, and whose lesions were located in the cavernous sinus, the superior sagittal sinus, the transverse-sigmoid sinus, the anterior fossa, or the tentorium. All study subjects had undergone angiography of at least 6 cranial vessels (external and internal carotid and vertebral arteries) before and after treatment and had a minimum follow-up of 3 months. Cases of leptomeningeal venous drainage and dysplastic venous dilation, and their associations with intracranial hemorrhage, were ascertained for all 93 patients included in the study. The presence of leptomeningeal drainage (P = .0002) and of dysplastic venous dilation (P = .036) increased the risk of intracranial hemorrhage in patients with DAVFs of the 5 regions.
There is a statistically significant association between the risk of intracranial hemorrhage in DAVFs and the presence of leptomeningeal drainage and dysplastic venous dilation.
closed_qa
Is sex a prognostic factor in stroke rehabilitation?
We sought to assess the specific influence of sex on rehabilitation results. In a case-control study, 440 consecutive patients with sequelae of a first ischemic stroke were enrolled into 2 subgroups (males and females) matched for severity of stroke (evaluated by means of the Canadian Neurological Scale), age (within 1 year), and onset-admission interval (within 3 days). Functional data, evaluated by means of the Barthel Index and the Rivermead Mobility Index, were compared between subgroups. Logistic regressions were used to clarify the role of sex in global autonomy and mobility. After rehabilitation treatment, a sex-related difference was observed essentially at the higher levels of response. Indeed, more men than women reached independence in both stair climbing and activities of daily living (ADL), with a higher response and effectiveness on mobility. In multivariate analyses, male patients had a 3 times higher probability than female patients of good autonomy in both stair climbing and ADL (odds ratio [OR]=3.22; 95% CI, 1.67 to 6.18 and OR=2.92; 95% CI, 1.63 to 5.42, respectively). Conversely, female patients had a higher risk of walking with a cane (OR=1.69; 95% CI, 1.04 to 2.76) or of only partial autonomy in ADL (OR=1.90; 95% CI, 1.25 to 2.91). No significant difference was found for the other functional parameters.
Female sex is a mildly unfavorable prognostic factor in rehabilitation results after stroke.
closed_qa
Carboplatin/cyclophosphamide or carboplatin/paclitaxel in elderly patients with advanced ovarian cancer?
To determine the feasibility of two chemotherapy regimens in elderly patients with advanced ovarian carcinoma (AOC). Eighty-three patients aged 70 years or older were previously enrolled in a trial evaluating carboplatin and cyclophosphamide (CC). On the basis of identical eligibility criteria, 75 further patients were enrolled in a trial evaluating carboplatin and paclitaxel (Taxol) (CP). The primary end point of these studies was the feasibility of six courses of chemotherapy. Comprehensive geriatric assessment (CGA) parameters were assessed in terms of prognostic factors. More patients in the CC group presented with a performance status of two or more, depression symptoms, use of co-medications, hypoalbuminemia, an abnormal Mini-Mental Status score, or sub-optimal surgery. Both regimens appeared feasible: 75.6% in the CC group and 68.1% in the CP group completed six courses. The CC and CP groups had similar overall survival (OS). Independent prognostic factors of poorer OS were the following: increasing age (P = 0.013), depression symptoms at baseline (P<0.001), International Federation of Gynecology and Obstetrics stage IV (P = 0.001), and use of paclitaxel (P = 0.025).
As this is a non-randomised retrospective review of two consecutive studies, no firm conclusion can be drawn. It seems, however, that in elderly patients with AOC the use of paclitaxel results in more toxicity. CGA parameters and particularly emotional disorders might help to determine a priori the risk/benefit ratio of chemotherapy in this patient population.
closed_qa
Is self-efficacy a predictor of short-term post-surgical adjustment among Chinese women with breast cancer?
High self-efficacy (SE) is regarded as beneficial for cancer patients in facilitating adaptation and therefore desirable. However, this may not always be the case. A longitudinal cohort study of women receiving breast cancer surgery. Path analysis examined the impact of high and low baseline SE scores on outcome. Post hoc analysis stratified outcome expectations by SE. 405/529 eligible Chinese women aged 28-79 years receiving breast cancer surgery in six regional Hong Kong hospitals were interviewed within 1 week of surgery. After assessing SE, incongruence between expectancy and outcome of surgery (E-OI), and psychological morbidity, 91% of women were followed for 1 month, when psychological and social morbidity were assessed (follow-up). After adjustment for demographic and histopathological factors, psychological morbidity was predicted by E-OI. Women with high E-OI had more impairment of sexuality and self-image. Women with high SE had better self-image and relationships with friends, but tended to underestimate the negative consequences of surgery on appearance. This increased E-OI and thereby psychological morbidity.
High post-surgical SE benefits early social adaptation, but also leads to under-estimating the negative impacts of surgery, impairing psychological adjustment. High SE can thereby contribute indirectly and significantly to increased psychological morbidity.
closed_qa
Can we define the ideal duration of antibiotic therapy?
Because of the increasing development of antimicrobial resistance, there is a greater responsibility within the medical community to limit the exposure of patients to antibiotics. We tested the hypothesis that shorter courses of antibiotics are associated with similar or better results than longer durations. We also sought to investigate the difference between a fixed duration of therapy and one based on physiologic measures such as fever and leukocytosis. All infectious episodes on the general surgery units of the University of Virginia Health System from December 15, 1996, to July 31, 2003, were analyzed retrospectively for the relation between the duration of antibiotic therapy and infectious complications (recurrent infection with the same organism or at the same site). All infections associated with either fever or leukocytosis were categorized into quartiles on the basis of the absolute length of antibiotic administration or the duration of treatment following resolution of fever or leukocytosis. Multivariate logistic regression models were developed to estimate the independent risk of recurrence associated with a longer duration of antibiotic use. Of the 5,561 treated infections, 4,470 were associated with fever (temperature ≥38°C) or leukocytosis (white blood cell count ≥11,000/mm³). For all infections, whether analyzed by absolute duration or time from resolution of leukocytosis or fever, the first or second quartiles (0-12 days, 0-9 days, 0-9 days, respectively) were associated with the lowest recurrence rates (14-18%, 17-23%, 18-19%, respectively). Individual analysis of intra-abdominal infections and pneumonia yielded similar results. The fixed-duration groups received fewer days of antibiotics on average, with outcomes similar to those in the physiologic parameters group.
Shorter courses of antibiotics were associated with similar or fewer complications than prolonged therapy. In general, adopting a strategy of a fixed duration of therapy, rather than basing duration on resolution of fever or leukocytosis, appeared to yield similar outcomes with less antibiotic use.
closed_qa
The open abdomen in trauma: do infectious complications affect primary abdominal closure?
One of the primary goals of damage control surgery in the trauma patient is primary closure of the abdomen. We hypothesized that extra-abdominal infections, such as those complicating injuries to the thorax, diaphragm, long bones, or musculoskeletal system, would decrease the likelihood of primary abdominal closure and increase hospital resource utilization in patients requiring open abdominal management. The trauma registry of the American College of Surgeons (TRACS) was reviewed retrospectively from 1995-2002 for open abdomen technique and damage control surgery. The outcome was primary fascial closure or delayed closure. Patients who died prior to closure were excluded. We evaluated infectious complications, including ventilator-associated pneumonia (VAP), blood stream infection (BSI), and surgical site infection (SSI). Other parameters studied were multiple rib fractures, long bone fractures, chest injuries, diaphragm injuries, empyema, and transfusion requirements. Hospital charges were obtained from the hospital administrative database. Univariate, multivariate, and regression analyses were performed to identify the effects of infectious complications on primary abdominal closure, length of stay, total hospital charges, and disposition. Three hundred forty-four patients required the open abdomen technique: 67% received damage control laparotomy and 33% decompression of abdominal compartment syndrome. Two hundred seventy-six patients (80%) went on to abdominal closure of some form and constituted the primary study group. Primary abdominal closure was achieved in 180 (65%) with a mean time to closure of 3.5 days. Ventilator-associated pneumonia, BSI, and SSI were associated with lack of primary closure (p<0.05). Increased blood transfusions also were associated with failure of primary closure (p<0.05). Ventilator-associated pneumonia and BSI were associated with significantly greater lengths of stay in the intensive care unit (ICU) (24.2 days vs. 12.6 days and 30.5 days vs. 17.9 days; both p<0.0001) and significantly greater total hospital charges ($232,080 vs. $142,893; $247,440 vs. $160,940; and $264,778 vs. $170,447; all p<0.001).
Inability to achieve primary abdominal closure was associated with infectious complications (VAP, BSI, and SSI) and large transfusion requirements. Infectious complications also significantly increased ICU utilization and hospital charges. Death was associated with BSI, femur fractures, and large transfusion requirements, whereas infectious complications did not have a significant impact on discharge disposition.
closed_qa
High serum levels of tumour necrosis factor-alpha and interleukin-8 in severe asthma: markers of systemic inflammation?
Severe asthma is characterized by elevated levels of pro-inflammatory cytokines and neutrophilic inflammation in the airways. Blood cytokines, markers of 'systemic' inflammation, may be a feature of amplified inflammation in severe asthma. To detect differences in IL-8, TNF-alpha, IL-16 and IL-13 levels in the serum (s) of stable severe and mild-moderate asthmatics related to blood leucocyte proportions, airway calibre and exhaled nitric oxide (NO) levels. We assessed cytokine serum levels by ELISA and blood leucocyte counts by an alkaline peroxidase method in 20 healthy controls, 22 mild-moderate [forced expiratory volume in 1 s (FEV1) (%pred): 89 ± 3] and 14 severe asthmatics [FEV1 (%pred): 49 ± 2]. IL-8 and TNF-alpha levels were higher in severe asthmatics than in mild-moderate asthmatics or in controls (P<0.05). No differences in IL-16 and IL-13 levels were detected. Severe asthmatics showed higher circulating neutrophil and eosinophil numbers than controls (P<0.05). In severe asthmatics, exhaled NO levels were higher than in controls (P<0.05), but lower than in mild-moderate asthmatics (P<0.05). We found a positive correlation between TNF-alpha levels and exhaled NO (r=0.67; P=0.01) or circulating neutrophil counts (r=0.57; P=0.03) in severe asthmatics.
sTNF-alpha and sIL-8 are markers of 'systemic' inflammation in severe asthmatics, in conjunction with augmented circulating neutrophils, suggesting the involvement of neutrophil-derived cytokine pattern in severe asthma.
closed_qa
Do shrimp-allergic individuals tolerate shrimp-derived glucosamine?
There is concern that shrimp-allergic individuals may react to glucosamine-containing products as shrimp shells are a major source of glucosamine used for human consumption. The purpose of this study was to determine whether shrimp-allergic individuals can tolerate therapeutic doses of glucosamine. Subjects with a history of shrimp allergy were recruited and tested for both shrimp reactivity via a prick skin test and shrimp-specific IgE by an ImmunoCAP assay. Fifteen subjects with positive skin tests to shrimp and an ImmunoCAP class level of two or greater were selected for a double-blind placebo-controlled food challenge (DBPCFC) using glucosamine-chondroitin tablets containing 1,500 mg of synthetically produced (control) or shrimp-derived glucosamine. Immediate reactions, including changes in peak flow and blood pressure, and delayed reactions (up to 24 h post-challenge) via questionnaire were noted and assessed. All subjects tolerated 1,500 mg of both shrimp-derived or synthetic glucosamine without incident of an immediate hypersensitivity response. Peak flows and blood pressures remained constant, and no subject had symptoms of a delayed reaction 24 h later.
This study demonstrates that glucosamine supplements from specific manufacturers do not contain clinically relevant levels of shrimp allergen and therefore appear to pose no threat to shrimp-allergic individuals.
closed_qa
Is the short-term outcome of transurethral resection of the prostate affected by preoperative degree of bladder outlet obstruction, status of detrusor contractility or detrusor overactivity?
Ninety-two patients with LUTS/BPH aged 50 years or older who were considered to be appropriate candidates for TURP were included in this study. Pressure-flow study and filling cystometry were performed to determine BOO, DUA and DO before TURP. The efficacy of TURP was determined at 3 months after surgery using the efficacy criteria for treatment of BPH assessed by the International Prostate Symptom Score, QOL index, maximum flow rate and postvoid residual urine volume. On preoperative urodynamics, 60%, 40% and 48% of patients showed BOO, DUA and DO, respectively. After TURP, 76% showed 'excellent' or 'good' overall efficacy, whereas only 13% fell into the 'poor/worse' category. The efficacy was higher as the preoperative degree of BOO worsened. In contrast, neither DO nor DUA influenced the outcome of TURP. However, the surgery likely provided unfavorable efficacy for patients having DO but not BOO. Only 20% of the patients who had both DO and DUA but did not have BOO achieved efficacy.
Transurethral resection of the prostate is an effective surgical procedure for treatment of LUTS/BPH, especially for patients with BOO. DUA may not be a contraindication for TURP. The surgical indication should be circumspect for patients who do not have BOO but have DO.
closed_qa
Changes in religiousness and spirituality attributed to HIV/AIDS: are there sex and race differences?
Having a serious illness such as HIV/AIDS raises existential issues, which are potentially manifested as changes in religiousness and spirituality. The objective of this study was (1) to describe changes in religiousness and spirituality of people with HIV/AIDS, and (2) to determine if these changes differed by sex and race. Three-hundred and forty-seven adults with HIV/AIDS from 4 sites were asked demographic, clinical, and religious/spiritual questions. Six religious/spiritual questions assessed personal and social domains of religiousness and spirituality. Eighty-eight participants (25%) reported being "more religious" and 142 (41%) reported being "more spiritual" since being diagnosed with HIV/AIDS. Approximately 1 in 4 participants also reported that they felt more alienated by a religious group since their HIV/AIDS diagnosis and approximately 1 in 10 reported changing their place of religious worship because of HIV/AIDS. A total of 174 participants (50%) believed that their religiousness/spirituality helped them live longer. Fewer Caucasians than African Americans reported becoming more spiritual since their HIV/AIDS diagnosis (37% vs 52%, respectively; P<.015), more Caucasians than African Americans felt alienated from religious communities (44% vs 21%, respectively; P<.001), and fewer Caucasians than African Americans believed that their religiousness/spirituality helped them live longer (41% vs 68% respectively; P<.001). There were no significantly different reported changes in religious and spiritual experiences by sex.
Many participants report having become more spiritual or religious since contracting HIV/AIDS, though many have felt alienated by a religious group-some to the point of changing their place of worship. Clinicians conducting spiritual assessments should be aware that changes in religious and spiritual experiences attributed to HIV/AIDS might differ between Caucasian and African Americans.
closed_qa
In vivo follicular unit multiplication: is it possible to harvest an unlimited donor supply?
Follicular unit extraction is a process of removing one follicular unit at a time from the donor region. The most important limitation of this surgical procedure is a high transection rate. In this clinical study, we transplanted different parts of transected hair follicles harvested with the follicular unit extraction (FUE) technique in five male patients. In each patient, three boxes of 1 cm² were marked at both the donor and recipient sites. The proximal one-third, one-half, and two-thirds of 15 hair follicles were extracted from each defined box and transplanted into the recipient boxes. The density was determined at 12 months after the procedure. A mean of 3 (range, 2-4) of the proximal one-third, 4.4 (range, 2-6) of the proximal one-half, and 6.2 (range, 5-8) of the proximal two-thirds of the transplanted follicles were observed as fully grown after 1 year. At the donor site, the regrowth rate was a mean of 12.6 (range, 10-14) of the proximal one-third, 10.2 (range, 8-13) of the proximal one-half, and 8 (range, 7-12) of the proximal two-thirds, respectively.
The survival rate of transected hair follicles is directly related to the level of transection. Although transected parts can survive at the recipient site, their growth rate is not satisfactory and they are thinner than the original follicles. We therefore recommend that surgeons not transplant the sectioned parts and be careful with patients whose transection rate is high during FUE procedures.
closed_qa
Does an immunochromatographic D-dimer exclude acute lower limb deep venous thrombosis?
A pre-test probability score and D-dimer may reduce the need for ultrasound examinations for excluding lower limb deep venous thrombosis (DVT). To establish the accuracy of an immunochromatographic D-dimer assay called 'Simplify' for diagnosis of acute DVT by complete (calf veins included) lower limb ultrasound examination. A total of 453 consecutive patients who presented to the ED of a tertiary centre with a suspected first episode of DVT were prospectively recruited. A pre-test probability score (Hamilton Score), an immunochromatographic D-dimer and a complete, single, unilateral lower limb ultrasound examination were performed in all patients. All patients with a negative ultrasound examination were followed up for 3 months. There were 159 men and 294 women with a mean age of 55.8 years (SD 20.3). Of the 227 patients with a negative D-dimer, 214 had negative ultrasound examinations and 13 had isolated calf DVT. Among the 226 patients with a positive D-dimer, 74 had DVT and 152 had negative ultrasound examinations. The sensitivity, specificity, and positive and negative predictive values were 85.1% (75.8-91.8), 58.5% (53.4-63.5), 32.7% (26.6-38.9) and 94.3% (90.4-96.9), respectively. One hundred and sixty-five patients had an unlikely Hamilton Score and a negative D-dimer. The negative predictive value of the immunochromatographic D-dimer in an unlikely Hamilton Score population was 98.8% (95.7-99.8%).
An unlikely probability Hamilton Score and a negative immunochromatographic D-dimer reliably exclude both proximal and isolated calf DVT.
closed_qa
IL-17 mRNA in sputum of asthmatic patients: linking T cell driven inflammation and granulocytic influx?
The role of Th2 cells (producing interleukin (IL-)4, IL-5 and IL-13) in allergic asthma is well-defined. A distinct proinflammatory T cell lineage has recently been identified, called Th17 cells, producing IL-17A, a cytokine that induces CXCL8 (IL-8) and recruits neutrophils. Neutrophilic infiltration in the airways is prominent in severe asthma exacerbations and may contribute to airway gland hypersecretion, bronchial hyper-reactivity and airway wall remodelling in asthma. AIM: To study the production of IL-17 in asthmatic airways at the mRNA level, and to correlate this with IL-8 mRNA, neutrophilic inflammation and asthma severity. We obtained airway cells by sputum induction from healthy individuals (n = 15) and from asthmatic patients (n = 39). Neutrophils were counted on cytospins and IL-17A and IL-8 mRNA expression was quantified by real-time RT-PCR (n = 11 controls and 33 asthmatics). Sputum IL-17A and IL-8 mRNA levels were significantly elevated in asthma patients compared to healthy controls. IL-17 mRNA levels were significantly correlated with CD3gamma mRNA levels in asthmatic patients, and mRNA levels of IL-17A and IL-8 correlated with each other and with sputum neutrophil counts. High sputum IL-8 and IL-17A mRNA levels were also found in moderate-to-severe (persistent) asthmatics on inhaled steroid treatment.
The data suggest that Th17 cell infiltration in asthmatic airways links T cell activity with neutrophilic inflammation in asthma.
closed_qa
Do the American College of Surgeons' "major resuscitation" trauma triage criteria predict emergency operative management?
We wish to assess whether individual or collective American College of Surgeons' "major resuscitation" criteria accurately identify injured patients who receive emergency operative treatment. In this observational secondary registry analysis of 8,289 consecutive trauma team activations during a 7.5-year period, we evaluated the test performance of 5 American College of Surgeons' major criteria in predicting emergency (within 1 hour) operative management by general (for adults) or pediatric (for children) surgeons. In adults, the individual major resuscitation criteria each predicted emergency operative management as follows (sorted from highest to lowest test performance): gunshot wounds to the neck or torso (likelihood ratio positive [LR+] 7.5; 95% confidence interval [CI] 6.2 to 9.1); confirmed hypotension (LR+ 5.3; 95% CI 4.0 to 7.1); interhospital transfers requiring blood transfusions (LR+ 4.6; 95% CI 2.6 to 8.2); respiratory compromise (LR+ 2.9; 95% CI 2.2 to 3.7), and Glasgow Coma Scale score less than 8 (LR+ 2.1; 95% CI 1.6 to 2.7). The collective strategy of using any of these 5 criteria exhibited a LR+ of 3.5 (95% CI 3.2 to 3.8), sensitivity 82% (95% CI 75% to 87%), and specificity 76% (95% CI 75% to 77%). Our findings in children were similar, but their precision was limited by the low baseline prevalence of emergency operative intervention.
These 5 American College of Surgeons-mandated major resuscitation criteria vary several-fold in their individual ability to predict emergency operative management and collectively exhibit modest test characteristics for this purpose. Selective use of these criteria or revisions thereof could result in more efficient secondary trauma triage. Our results do not support the existing obligatory use of these criteria to maintain American College of Surgeons trauma center certification.
closed_qa
Posttraumatic stress disorder, tenderness, and fibromyalgia syndrome: are they different entities?
Many features of fibromyalgia syndrome (FMS) resemble those of posttraumatic stress disorder (PTSD). The goal of this study was to investigate the comorbidity of FMS and PTSD in a cohort of men following an intensive, initial, defined traumatic event. One hundred twenty-four males (55 patients with PTSD, 20 patients with major depression, and 49 controls) were evaluated for the presence of FMS. The major traumatic events in all PTSD patients were combat-related. Each individual completed questionnaires characterizing his disease, disabilities, and quality of life. Forty-nine percent of PTSD patients, compared to 5% of major depression patients and none of normal controls, fulfilled the American College of Rheumatology criteria for FMS (P<.0001). Significant correlations were detected between tender points and measured parameters in the PTSD group.
In male patients, PTSD is highly associated with FMS. The degree and impact of these disorders are also highly related.
closed_qa
Is microfracture of chondral defects in the knee associated with different results in patients aged 40 years or younger?
Age-dependent studies about the clinical result after microfracture of cartilage lesions in the knee are still missing. This prospective study was performed to discover age-dependent differences in the results after microfracture over a period of 36 months. Between 1999 and 2002, 85 patients (mean age, 39 years) with full-thickness chondral lesions underwent the microfracture procedure and were evaluated preoperatively and at 6, 18, and 36 months after surgery. Depending on the patients' age (≤40 years or >40 years) and the localization of the defects (femoral condyles, tibia, and patellofemoral joint), the patients were assigned to 6 different groups. Exclusion criteria were meniscal pathologic conditions, tibiofemoral malalignment, and ligament instabilities. Baseline clinical scores were compared with follow-up data by use of paired Wilcoxon tests for the modified Cincinnati knee score and the International Cartilage Repair Society (ICRS) score. The scores improved in all groups over the whole study period (P<.05). Patients aged 40 years or younger had significantly better results (P<.01) for both scores compared with older patients. Between 18 and 36 months after microfracture, the ICRS score deteriorated significantly (P<.05) in patients aged over 40 years whereas younger patients with defects on the femoral condyles and on the tibia showed neither a significant improvement nor a significant deterioration in the ICRS score (P>.1). Magnetic resonance imaging 36 months after surgery revealed better defect filling and a better overall score in younger patients (P<.05). The Spearman coefficient of correlation between clinical and magnetic resonance imaging scores was 0.84.
The clinical results after microfracture of full-thickness cartilage lesions in the knee are age-dependent. Deterioration begins 18 months after surgery and is significantly pronounced in patients aged older than 40 years. The best prognostic factor was found to be a patient age of 40 or younger with defects on the femoral condyles.
closed_qa
Can induced anxiety from a negative earlier experience influence vascular surgeons' statistical decision-making?
Increasing detection, new screening recommendations, and popular press attention contribute to the rising prevalence of asymptomatic abdominal aortic aneurysms (AAA). Evidence-based guidelines recommend the optimal time to operate is when the aneurysm is 5.5 cm in diameter. Smaller AAAs are periodically monitored with imaging. Recent events and emotional reactions to risk and uncertainty, including anxiety, can cause decision-making to diverge from cognitively based assessments. It is not known whether this applies to vascular surgeons making statistically-optimal, risky decisions. We tested whether an unexpected, recent negative event might influence vascular surgeons' decisions about a computer-simulation AAA-analog that includes statistical information. We performed a randomized, computer-based field experiment with evidenced-based statistical information readily available on bursting probabilities. Participants included vascular surgeons with AAA operative experience attending two vascular surgery conferences held in 2005 (n=81). The intervention was a randomly assigned, anxiety-inducing, bursting balloon versus a nonbursting balloon before a statistical decision-making computer simulation. The main outcomes measure was real-time prospective choice to opt out of expanding AAA simulation. A Cox proportional hazard model was used to assess the likelihood of opting out, while controlling for important covariates. The experimental group was more likely to opt out (hazard ratio: 3.32; 95% CI: 1.25 to 8.81), even after controlling for initial anxiety levels, risk attitudes, uncertainty attitudes, use of statistical information, surgical experience, and demographics.
Experiencing a negative, potentially anxiety-provoking, preceding event can influence decision-making, even among experienced vascular surgeons who have ready access to statistical risk information.
closed_qa
Resection of neurogenic tumors in children: is thoracoscopy superior to thoracotomy?
Minimally invasive resection of solid tumors is controversial because of concerns of inadequate resection and local recurrence. Thoracoscopy has been used in the diagnosis of mediastinal tumors in children, but its role in resection is unproved. The purpose of this study was to compare thoracoscopic and open approaches to the resection of thoracic neurogenic tumors in children. The tumor registry of a regional children's hospital was queried to identify patients who underwent resection of neurogenic tumors over a 6-year period. Thoracoscopic and open groups were compared for demographic, operative, oncologic, and outcomes characteristics. Seventeen children underwent resection of mediastinal neurogenic tumors (10 thoracoscopic resections, 7 open resections). Mean age was 4.7 years (range 6 months to 12 years). The thoracoscopic and open groups showed no difference in operative time or blood loss. Tumors in the two groups were comparable in size (5.2 ± 2.2 cm versus 5.7 ± 2.6 cm), histology, surgical margin, and stage. Hospital stay was shorter after thoracoscopic resection (1.9 ± 0.7 days versus 4.1 ± 2.5 days, p<0.05). There were no regional recurrences. Distant metastases developed in one patient in each group. Eight of 10 children with malignant tumors remain disease-free at an average of 25 months of followup (range 3 to 80 months).
Thoracoscopic resection of neurogenic tumors achieved similar local control and disease-free survival when compared with open resection in this preliminary series. These results were accompanied by a shorter hospital stay. These findings suggest that thoracoscopic resection of neurogenic tumors in children may offer advantages to open resection and should be studied in the context of a large, cooperative trial.
closed_qa
Long-term results of endobronchial brachytherapy: A curative treatment?
To evaluate outcomes after high-dose-rate endobronchial brachytherapy (HDR-EBBT) for limited lung carcinoma. A total of 106 patients with endobronchial lung cancer and not eligible for surgery or external beam radiotherapy, without nodal or visceral metastases, were treated with HDR-EBBT. They had developed disease relapse after surgery (n = 43) or external beam radiotherapy (n = 27) or had early lung cancer with respiratory insufficiency (n = 36). Treatment consisted of six fractions of 5 or 7 Gy, usually delivered 1 cm from the source. The complete histologic response rate, evaluated at 3 months after HDR-EBBT, was 59.4%. At 3 and 5 years, the local control, overall survival, and cause-specific survival rates were 60.3% and 51.6%, 47.4 and 24%, and 67.9 and 48.5%, respectively. Factors significantly associated with local failure were high tumor volume (tumor length>2 cm, bronchial obstruction>25%, tumor visibility on CT scan) and previous endoscopic treatment. Cause-specific survival, but not overall survival, was significantly associated with local control, probably because of the high rate of deaths not related to lung cancer. Five deaths were attributed to the HDR-EBBT procedure (two from fatal hemoptysis and three from bronchial necrosis).
HDR-EBBT achieved a long-term cause-specific survival rate of approximately 50% in patients with localized endobronchial carcinoma and could be considered curative.
closed_qa
Does academic intervention impact ABS qualifying examination results?
To assess the impact of a focused academic support program on American Board of Surgery In-Training Examination (ABSITE) scores and Qualifying Examination (QE) outcomes. A mandatory intervention program was begun in April 2001 for residents with ABSITE Total Test (TT) percentiles <31. Program elements included: 1) individual faculty mentoring and a personal learning plan; 2) QE videotape review sessions; 3) the Surgical Education and Self-Assessment Program (SESAP); 4) monthly rotation evaluations; and 5) quarterly status feedback. A free medical evaluation was offered. Mock orals participation, educational psychologist consultation, and voluntary follow-up mentoring were added later. Study data were reviewed for 2003-2005 Chief Residents, including ABSITE scores, QE results, conference attendance, rotation Overall Performance ratings, and resident surgeon case volumes. Results were compared for the academic intervention (AI) and no intervention (NI) groups. Fifteen residents graduated during the study period. Eight residents completed nine interventions; seven returned to TT percentiles >30 (7/8, 88%). First post-intervention ABSITE gains were large compared to NI and national peer groups. Standard Score (SS) TT gains were maintained until residency completion by four AI residents. The median AI PGY-5 TT percentile was 32, and three scores were ≤25. Six AI residents (6/8, 75%) and all NI residents (7/7, 100%) passed the QE on their first attempts. The AI and NI groups were similar for conference attendance, rotation evaluations, and operative log totals.
A focused academic support intervention for residents with marginal ABSITE TT percentiles can produce immediate, substantial gains. Gains are variably maintained through the remaining residency years. PGY-5 TT percentiles ≤25, seen in three AI residents (3/8, 38%), are associated with a 40% first QE failure rate. Therefore, our 75% QE first-time pass rate for AI residents argues for intervention success.
closed_qa
Is it appropriate to use core clerkship grades in the selection of residents?
This study challenges the appropriateness of using core clerkship grades for resident selection. The authors hypothesize that substantial variability occurred in the system of grading. In this retrospective cross-sectional study, variability in the grading systems for third-year core clinical clerkships was examined. From the Medical Student Performance Evaluation of applicants from U.S. medical schools for residency training in the authors' department in 2004 and 2005, the authors gathered the following variables: medical school, third-year core clerkship grading systems, and percentage of students in each grade category. Descriptive analyses were conducted, and within-institution variability across clerkship scores was analyzed using repeated-measures analysis of variance (ANOVA) and t-tests. University teaching hospital. The survey covered 121 of 122 U.S. medical schools accredited by the AAMC/LCME. Grading systems used included variations of the Honors/Pass/Fail (H, P, F) system in 76 schools, letter-grade systems in 22 schools, and other variants (e.g., Outstanding, Advanced, and Proficient in 6 schools and Pass/Fail in 4 schools). Thirteen schools (10%) provided either no grading system or no interpretable system. Grading systems were further categorized by the number of grade levels used: 2 in 6 schools, 3 in 34 schools, 4 in 38 schools, 5 in 23 schools, and more than 6 in 6 schools. For schools using a grading system containing 3 or more levels, the percentage of students given the highest grade was significantly lower in Surgery (28%) compared with Family Medicine (34%) and Psychiatry (35%) (p = 0.001).
Core clerkship grading systems and the percentage to which institutions grade students as having achieved the highest performance level vary greatly among U.S. medical schools. Within institutions, significant variability exists among clerkships in the percentage of the highest grade given, which makes interpersonal comparison based on core clerkship grades difficult and suggests that this method may not be a reliable indicator of performance.
closed_qa
Patient participation in the medical specialist encounter: does physicians' patient-centred communication matter?
Physicians' patient-centred communication is assumed to stimulate patients' active participation, thus leading to more effective and humane exchange in the medical consultation. We investigated the relationship between physicians' patient-centred communication and patient participation in a medical specialist setting. Participants were 30 residents and specialists in internal medicine, and 323 of their patients. Participants completed a questionnaire prior to a (videotaped) follow-up consultation. Physicians' patient-centred communication was assessed by coding behaviours that facilitate or rather inhibit patients to express their perspective. Patient participation was determined by assessing (a) their relative contribution to the conversation, and (b) their active participation behaviour. Analyses accounted for relevant background characteristics. Physicians' facilitating behaviour was found to be positively associated with patients' relative contribution to the conversation as well as patients' active participation behaviour. Physicians' inhibiting behaviour was not related to patients' relative contribution, and was, unexpectedly, positively associated with patients' active participation behaviour. Physicians' behaviour was particularly associated with patients' expression of concerns and cues.
Physicians in internal specialist medicine appear to be able to facilitate patients' active participation in the visit. The findings indicate that inhibiting behaviour may not have the expected blocking effect on patient participation: patients voiced their perspectives just the same and expressed even more concerns. Showing inhibiting behaviour may, alternatively, be a physician's response to the patient's increased participation in the encounter.
closed_qa
Are patients aware of the association between smoking and bladder cancer?
Smoking is the single greatest risk factor for bladder cancer. Since few studies have demonstrated the efficacy of screening for bladder cancer, primary prevention by decreasing modifiable risk factors is the best defense. One aspect of modifying a behavioral risk factor is awareness of the association between the behavior and disease. While many anti-smoking campaigns specifically focus on lung cancer, few mention bladder cancer. We evaluated awareness of smoking as a risk factor for bladder cancer. Between February and May 2005 we prospectively surveyed patients presenting to a urology clinic regarding their knowledge of risk factors for bladder cancer and other cancers. The questionnaire also captured data regarding patient smoking habits. A total of 280 patients completed the survey, including 34% who were younger than 50 years, 63% who were male, 89% who were white and 57% who were college graduates. Only 36% of respondents identified smoking as a risk factor for bladder cancer, compared with 98% for lung cancer. Patients with a higher level of education and females were statistically more likely to be aware of the association between smoking and bladder cancer.
Patients at a urology clinic had low overall knowledge regarding bladder cancer risk factors. Most patients queried had no idea regarding the relationship between bladder cancer and tobacco use regardless of smoking status. Our study suggests the need for the American public to be better educated to help combat smoking related cancers.
closed_qa
Is there an indication for frozen section examination of the ureteral margins during cystectomy for transitional cell carcinoma of the bladder?
We evaluated the incidence of pathological findings of the ureter at cystectomy for transitional cell carcinoma of the bladder and assessed the usefulness of intraoperative frozen section examination of the ureter. Histopathological findings of ureteral frozen section examination were compared to the corresponding permanent sections and the diagnostic accuracy of frozen section examination was evaluated. These segments were then compared to the more proximal ureteral segments resected at the level where they cross over the common iliac arteries. The histopathological findings of the ureteral segments were then correlated for upper urinary tract recurrence and overall survival. Transitional cell carcinoma or carcinoma in situ was found on frozen section examination of the distal ureter in 39 of 805 patients (4.8%) and on permanent sections in 29 (3.6%). In 755 patients the false-negative rate of frozen section examination of the ureters was 0.8%. Of the patients with carcinoma in situ diagnosed on the first frozen section examination 80% also had carcinoma in situ in the bladder. Transitional cell carcinoma or carcinoma in situ in the most proximally resected ureteral segments was found in 1.2% of patients. After radical cystectomy there was tumor recurrence in the upper urinary tract in 3% of patients with negative ureteral frozen section examination and in 17% with carcinoma in situ on frozen section examination.
Routine frozen section examination of the ureters at radical cystectomy is only recommended for patients with carcinoma in situ of the bladder, provided the ureters are resected where they cross the common iliac arteries.
closed_qa
Is biopsy Gleason score independently associated with biochemical progression following radical prostatectomy after adjusting for pathological Gleason score?
Biopsy Gleason score is known to be associated with prostate specific antigen failure following radical prostatectomy. However, it is unclear whether it remains associated with outcome after surgery when the pathological Gleason score is known. We determined the association between biopsy Gleason score and biochemical progression after correcting for preoperative and postoperative characteristics, including pathological Gleason score, in 1,931 men treated with radical prostatectomy between 1988 and 2005 in the Shared Equal Access Regional Cancer Hospital Database Study Group database. Gleason score was examined as a categorical variable of 2 to 6, 3 + 4 and 4 + 3 or greater. Higher biopsy Gleason scores were positively associated with extracapsular extension (p<0.001), positive surgical margins (p<0.001), seminal vesicle invasion (p<0.001), positive lymph nodes (p<0.001) and biochemical progression (log rank p<0.001). After adjusting for only preoperative characteristics, biopsy Gleason 3 + 4 and 4 + 3 or greater were associated with increased risk of biochemical progression compared to biopsy Gleason 6 or less (p = 0.001 and p<0.001, respectively). After further adjusting for multiple pathological characteristics, including pathological Gleason score, the association between higher biopsy Gleason score and progression was little changed, in that men with biopsy Gleason 3 + 4 and 4 + 3 or greater were significantly more likely to experience progression (p = 0.001 and p<0.001, respectively). Furthermore, when stratified by pathological Gleason score, higher biopsy Gleason scores were associated with an increased risk of biochemical progression in each pathological Gleason score category (log rank p ≤ 0.007).
Biopsy Gleason score remained strongly associated with progression even when the pathological Gleason score was known and controlled for. If confirmed at other centers, incorporation of biopsy Gleason score into postoperative nomograms designed to estimate the progression risk might improve model precision.
closed_qa
Is there a relationship between sex hormones and erectile dysfunction?
The prevalence of erectile dysfunction increases as men age. Simultaneously, age related changes occur in male endocrine functioning. We examined the association between erectile dysfunction and total testosterone, bioavailable testosterone, sex hormone-binding globulin and luteinizing hormone. Data were obtained from the Massachusetts Male Aging Study, a population based cohort study of 1,709 men. Self-reported erectile dysfunction was dichotomized as moderate or severe vs none or mild. Odds ratios and 95% CI were used to assess the association between sex hormone levels and erectile dysfunction. Multiple logistic regression models were used to adjust for potential confounders including age, body mass index, partner availability, phosphodiesterase type 5 inhibitor use, depression, diabetes and heart disease. Using data from the most recent followup, analyses were conducted on 625 men with complete data. A moderate decrease in erectile dysfunction risk was observed with increasing total testosterone and bioavailable testosterone levels. However, this effect was not apparent after controlling for potential confounders. Increased luteinizing hormone levels (8 IU/l or greater) were associated with a higher risk of erectile dysfunction (adjusted OR 2.91, 95% CI 1.55-5.48) compared to luteinizing hormone levels less than 6 IU/l. A significant interaction between luteinizing hormone and total testosterone levels showed that increased testosterone levels were associated with a decrease in risk of erectile dysfunction among men with luteinizing hormone levels greater than 6 IU/l.
In this large population based cohort of older men we found no association among total testosterone, bioavailable testosterone, sex hormone-binding globulin and erectile dysfunction. Testosterone levels were associated with a decrease in risk of erectile dysfunction only in men with increased luteinizing hormone levels.
closed_qa
Can a complete primary repair approach be applied to cloacal exstrophy?
Surgical reconstruction for children with cloacal exstrophy remains challenging. The operative approach to cloacal exstrophy has expanded with the addition of the complete primary exstrophy repair. We assessed the safety and efficacy of complete primary exstrophy repair for this complex condition. We performed a retrospective review of children treated from birth for cloacal exstrophy between March 1, 1994 and January 1, 2003 at our institution. We evaluated associated anomalies, method of closure, complications and urinary continence. Seven patients with cloacal exstrophy were initially treated at our institution. One death occurred before complete primary exstrophy repair could be attempted. Six patients were converted to a classic exstrophy appearance and underwent closure using complete primary exstrophy repair principles within 7 to 182 days (mean 68, median 32) postoperatively. Postoperative development of moderate hydronephrosis was seen in 1 patient and severe hydronephrosis in 1. Three of six patients had vesicoureteral reflux. Six patients had dry intervals and spontaneous voids before toilet training. Two patients had stress urinary incontinence. Two patients have been treated with bladder neck injections. One has undergone bladder neck reconstruction and construction of a nonorthotopic channel for clean intermittent catheterization (Mitrofanoff). One patient reported complete dryness after toilet training. One child has undergone bladder augmentation.
This series represents our initial efforts to use complete primary exstrophy repair for cloacal exstrophy. The application of the principles of complete primary exstrophy repair in a sequential fashion appears to be a viable and safe addition to the surgical armamentarium in this challenging patient population.
closed_qa
Do the lungs contribute to propofol elimination in patients during orthotopic liver transplantation without veno-venous bypass?
The clearance of propofol is very rapid, and its transformation takes place mainly in the liver. Some reports indicated extrahepatic clearance of the drug and that the lungs are the likely place where the process occurs. This study was undertaken to compare the plasma concentrations of propofol both in the pulmonary and radial arteries after constant infusion during the dissection, anhepatic and reperfusion phases of orthotopic liver transplantation (OLT) without veno-venous bypass, attempting to investigate extrahepatic clearance and to determine whether the human lungs take part in the elimination of propofol. Fifteen patients undergoing OLT without veno-venous bypass were enrolled in the study, and propofol was infused via a forearm vein at a rate of 2 mg x kg-1 x h-1. Blood samples were simultaneously collected from pulmonary and radial arteries at the end of the first hepatic portal dissection (T0), at the clamping of the portal vein (T1), 30, and 60 minutes after the beginning of the anhepatic phase (T2, T3), and 30, 60, and 120 minutes after the unclamping of the new liver (T4, T5, T6). Plasma propofol concentrations were measured using a reversed-phase, high-performance liquid chromatographic method with fluorescence detection. The concentrations of plasma propofol in the pulmonary and radial arteries at T2 and T3 rose significantly compared with T0 and T1 (P<0.01) respectively. After reperfusion, the drug concentrations at T4, T5 and T6 decreased significantly compared with T2, T3 (P<0.01) respectively. There were no significant differences in plasma propofol concentrations between the pulmonary and radial arteries at any time points.
Propofol is eliminated mainly by the liver, but also by extrahepatic organs. The lungs do not appear to be a major site of extrahepatic propofol metabolism in humans.
closed_qa
Is the hypoxia-inducible factor-1 alpha mRNA expression activated by ethanol-induced injury, the mechanism underlying alcoholic liver disease?
Excessive alcohol consumption can result in multiple organ injury, of which alcoholic liver disease (ALD) is the most common. With economic development and improvement of living standards, the incidence of diseases caused by alcohol abuse has been increasing in China, although its pathogenesis remains obscure. The aim of this study was to investigate the role of hypoxia in chronic ALD. Twenty-eight male Sprague-Dawley rats were randomized into a control group (n=12) with a normal history and an experimental group (n=16) fed 10 ml/kg of 56% (vol/vol) ethanol once per day by gastric lavage for 24 weeks. At 24 weeks, blood samples were collected and the rats were then killed. Liver samples were frozen at -80 degrees C and used for RT-PCR; other liver samples were obtained for immunohistochemical staining. As the duration of alcohol consumption increased, the rate of positive hypoxia-inducible factor-1 alpha (HIF-1alpha) mRNA expression in the liver was significantly higher in the alcohol group than in the control group (P ≤ 0.05). HIF-1alpha protein, located in the cytoplasm, was seldom expressed in the control group but was significantly expressed in the alcohol group (P ≤ 0.01).
HIF-1alpha mRNA expression was activated by ethanol-induced injury in this study, suggesting that hypoxia is involved in the underlying mechanism of ALD.
closed_qa
Outcome of severe acute pancreatitis: is there a role for conservative management of infected pancreatic necrosis?
Infected pancreatic necrosis is associated with high morbidity and mortality and generally mandates surgical or radiological intervention. A selected group of patients with CT evidence of infected pancreatic necrosis and a comparatively lower APACHE score may remain clinically stable throughout the course of their illness. Case records of 52 patients with severe acute pancreatitis admitted from October 2000 to September 2005 were retrospectively analysed to assess the feasibility of conservative management of infected pancreatic necrosis. CT evidence of retroperitoneal air pockets, deteriorating clinical condition, sepsis and positive blood culture were used to diagnose infected pancreatic necrosis. Of the 52 male patients reviewed, 24 had infected pancreatic necrosis. Eighteen patients whose clinical condition progressively deteriorated required surgical intervention; five of them (27.8%) died. Six patients with transient end-organ dysfunction and stable clinical conditions were treated with prolonged administration of antibiotics and ICU support. All of these patients recovered and were discharged from the hospital, and no symptoms or readmissions occurred during 6-44 months of follow-up.
Selected patients with infected pancreatic necrosis who are clinically stable with transient end organ dysfunction can be treated conservatively with a favourable outcome. Necrosectomy associated with high morbidity and mortality in these patients can be avoided. The need for intervention should be individualized and based on clinical conditions of the patients.
closed_qa
Does biofuel smoke contribute to anaemia and stunting in early childhood?
Reliance on biomass fuels for cooking and heating exposes many women and young children in developing countries to high levels of air pollution indoors. Exposure to biomass smoke has been linked to reduced birth weight, acute respiratory infections, and childhood mortality. This study examines the association between household use of biofuels (wood, dung, and crop residues) for cooking and heating and prevalence of anaemia and stunting in children. Data are from a 1998-99 national family health survey in India, which measured height, weight, and blood haemoglobin of 29 768 children aged 0-35 months in 92 486 households. Multinomial logistic regression is used to estimate the effects of biofuel use on prevalence of anaemia and stunting, controlling for exposure to tobacco smoke, recent episodes of illness, maternal education and nutrition, and other potentially confounding factors. Analysis shows that prevalence of moderate-to-severe anaemia was significantly higher among children in households using biofuels than among children in households using cleaner fuels (RRR = 1.58; 95% CI: 1.28, 1.94), independent of other factors. Prevalence of severe stunting was also significantly higher among children in biofuel-using households (RRR = 1.84; 95% CI: 1.44, 2.36). Thirty-one per cent of moderate-to-severe anaemia and 37% of severe stunting among children aged 6-35 months in India may be attributable to exposure to biofuel smoke. Effects on mild anaemia and moderate stunting were smaller, but positive and statistically significant. Effects of exposure to tobacco smoke on anaemia and stunting were small and not significant.
The study provides a first evidence of the strong association between biofuel use and risks of anaemia and stunting in children, suggesting that exposure to biofuel smoke may contribute to chronic nutritional deficiencies in young children.
closed_qa
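The attributable fractions reported above (31% of moderate-to-severe anaemia, 37% of severe stunting) follow from Levin's population attributable fraction formula. A minimal Python sketch, treating the reported relative risk ratio as an approximation of the relative risk; the ~85% biofuel-exposure prevalence is an illustrative assumption, not a figure from this excerpt:

```python
def population_attributable_fraction(prevalence, relative_risk):
    """Levin's formula: fraction of cases in the population attributable
    to an exposure with the given prevalence and relative risk."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

# RRR = 1.58 for moderate-to-severe anaemia (from the abstract); the ~85%
# household biofuel-use prevalence is a hypothetical placeholder.
paf = population_attributable_fraction(0.85, 1.58)
print(f"Attributable fraction: {paf:.0%}")  # ~33%, near the reported 31%
```

With a plausible prevalence the formula reproduces a fraction close to the study's 31%, which is why high-prevalence exposures with modest relative risks can still carry a large population burden.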
Conditional release: a less restrictive alternative to hospitalization?
This study examined conditional release--that is, involuntary outpatient commitment orders upon release from hospitalization--as a least restrictive alternative to psychiatric hospitalization in Victoria, Australia. Records were obtained from the Victorian Psychiatric Case Register for patients who experienced psychiatric hospitalization: between 1990 and 2000 a total of 8,879 patients were given conditional release and 16,094 were not. Compared with the group that was hospitalized but did not receive a conditional release, the group that received a conditional release was more likely to have more prior hospitalizations of greater than average duration. Patients with schizophrenia were more likely to be given conditional release. Patients given conditional release experienced a care pattern involving briefer inpatient episodes (8.3 fewer days per episode), more inpatient days, and longer duration of restrictive care--that is, combined inpatient and conditional release periods (5.1 more days per month in care).
For patients at risk of long-term hospitalization, conditional release may help to shorten inpatient episodes by providing a least restrictive alternative to continued hospitalization. However, patients who were given conditional release doubled the amount of days they spent under restrictive care, compared with the amount of time they previously spent in the hospital before entering a period of combined inpatient and conditional release commitment. Additional oversight may have led to more frequent hospitalization. This consequence raises new questions regarding the possible benefits of such extended oversight and new challenges for release planning using conditional release as a least restrictive method of care.
closed_qa
Does ADHD in adults affect the relative accuracy of metamemory judgments?
Prior research suggests that individuals with ADHD overestimate their performance across domains despite performing more poorly in these domains. The authors introduce measures of accuracy from the larger realm of judgment and decision making--namely, relative accuracy and calibration--to the study of self-evaluative judgment accuracy in adults with ADHD. Twenty-eight adults with ADHD and 28 matched controls participate in a computer-administered paired-associate learning task and predict their future recall using immediate and delayed judgments of learning (JOLs). Retrospective confidence judgments are also collected. Groups perform equally in terms of judgment magnitude and absolute judgment accuracy as measured by discrepancy scores and calibration curves. Both groups benefit equally from making their JOL at a delay, and the group with ADHD shows higher relative accuracy for delayed judgments.
Results suggest that under certain circumstances, adults with ADHD can make accurate judgments about their future memory.
closed_qa
Does the EGFR and VEGF expression predict the prognosis in colon cancer?
The pathological specimens of 60 colon carcinoma patients were retrospectively evaluated and grouped according to EGFR and VEGF staining intensity and percentage of stained neoplastic cells. A final score was assigned to each case by multiplying the percentage and staining scores. The patients were stratified into the following categories: negative (score 0), low expression (score 1 or 2), and high expression (score 4). The remaining patient data were retrieved from the institutional cancer database. The mean survival time was 28.93 +/- 14.1 (range 2-52) months in the EGFR-negative group, 23.92 +/- 14.0 (range 6-46) months in the group with a low EGFR expression, and 17.00 +/- 12.8 (range 10-40) months in the group with a high EGFR expression. The median survival time was 27.50 +/- 14.7 (range 4-52) months in the VEGF-negative group, 29.33 +/- 12.8 (range 6-48) months in the group with a low VEGF expression, and 14.50 +/- 14.2 (range 2-40) months in the group with a high VEGF expression. The expression of EGFR and VEGF was not an independent factor affecting survival.
The EGFR and VEGF expression rates of colon tumors do not predict the survival. In addition, the EGFR expression in the primary tumor was not predictive of metastatic lymph nodes. The prognostic value of EGFR/VEGF staining may be further questioned.
closed_qa
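The immunohistochemical scoring scheme in the EGFR/VEGF abstract above (final score = staining-intensity score × percentage score, binned into negative/low/high) can be sketched as follows. The assumption that each component score runs 0-2 is inferred from the only final scores the study names (0, 1, 2 and 4), and is not stated explicitly in the excerpt:

```python
def expression_category(staining_score, percentage_score):
    """Multiply the two component scores and map the product onto the
    study's categories: 0 = negative, 1-2 = low, 4 = high."""
    for s in (staining_score, percentage_score):
        if s not in (0, 1, 2):
            raise ValueError("component scores are assumed to range 0-2")
    final = staining_score * percentage_score
    if final == 0:
        return "negative"
    if final in (1, 2):
        return "low"
    return "high"  # only remaining product is 4

print(expression_category(2, 2))  # high
print(expression_category(1, 2))  # low
```

Note that with 0-2 components a product of 3 is impossible, which is consistent with the abstract listing only scores 0, 1, 2 and 4.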
The age at which young deaf children receive cochlear implants and their vocabulary and speech-production growth: is there an added value for early implantation?
The age at which a child receives a cochlear implant seems to be one of the more important predictors of his or her speech and language outcomes. However, understanding the association between age at implantation and child outcomes is complex because a child's age, length of device use, and age at implantation are highly related. In this study, we investigate whether there is an added value to earlier implantation or whether advantages observed in child outcomes are primarily attributable to longer device use at any given age. Using hierarchical linear modeling, we examined latent-growth curves for 100 children who had received their implants when they were between 1 and 10 yr of age, had used oral communication, and had used their devices for between 1 and 12 yr. Children were divided into four groups based on age at implantation: between 1 and 2.5 yr, between 2.6 and 3.5 yr, between 3.6 and 7 yr, and between 7.1 and 10 yr. Investigation of growth curves and rates of growth over time revealed an additional value for earlier implantation over and above advantages attributable to longer length of use at any given age. Children who had received their implants before the age of 2.5 yr had exhibited early bursts of growth in consonant-production accuracy and vocabulary and also had significantly stronger outcomes compared with age peers who had received their implants at later ages. The magnitude of the early burst diminished systematically with increasing age at implantation and was not observed for children who were older than 7 yr at implantation for consonant-production accuracy or for children who were over 3.5 yr old at implantation for vocabulary. The impact of age at implantation on children's growth curves differed for speech production and vocabulary.
There seems to be a substantial benefit for both speech and vocabulary outcomes when children receive their implant before the age of 2.5 yr. This benefit may combine a burst of growth after implantation with the impact of increased length of use at any given age. The added advantage (i.e., burst of growth) diminishes systematically with increasing age at implantation.
closed_qa
Does universal newborn hearing screening identify all children with GJB2 (Connexin 26) deafness?
Deafness is the most common neurosensory defect at birth, and GJB2 (connexin 26) mutations are the most frequent genetic cause of hearing loss in many populations. The hearing loss caused by GJB2 mutations is usually congenital in onset and moderate to profound in degree. Considerable phenotypic variation has been noted, however, including two anecdotal cases of apparent nonpenetrance at birth. The objective of this study is to document nine additional children with two pathogenic GJB2 mutations who had nonpenetrance of hearing loss at birth. Subjects were identified through a national repository that includes deaf probands ascertained primarily from the United States through the Annual Survey of Deaf and Hard of Hearing Children and Youth conducted at the Research Institute at Gallaudet University. The hearing of each of these children had been screened at birth using standard audiologic techniques. Parents were interviewed and available medical records were reviewed. Testing for GJB2 mutations was performed by PCR and sequencing of the entire coding exon in all nine individuals. Using parent interviews and medical records, we documented that all nine children passed newborn audiologic hearing screening. The age at which the hearing loss was subsequently identified in these nine children ranged from 12-60 mo. Of these nine children, three were compound heterozygotes and six were homozygous for the 35delG mutation in the GJB2 gene.
These nine cases demonstrate that current newborn hearing screening does not identify all infants with two GJB2 mutations. These cases suggest that the frequency of nonpenetrance at birth is approximately 3.8% or higher. It is important to consider connexin deafness in any child with recessive nonsyndromic hearing loss, as well as in simplex cases with no history of other affected family members, even when the newborn hearing screening results were within the normal range.
closed_qa
Neurological manifestations of type 1 Gaucher's disease: Is a revision of disease classification needed?
Gaucher's disease (GD), the most prevalent inherited lysosomal storage disorder, is caused by deficient glucocerebrosidase activity. The resulting accumulation of glucocerebrosides in lysosomes of macrophages leads to hepatosplenomegaly, anemia, thrombocytopenia, and various bone manifestations. Gaucher's disease is classified into 3 types based on the nature of its effects on the central nervous system. Type 1, the most common variant, is classically nonneuronopathic. However, Parkinsonism seems to occur more frequently in type 1 Gaucher's disease than in the general population. Furthermore, heterozygotes for certain glucocerebrosidase gene mutations have a higher risk of developing Parkinson's disease. We report our experience with 9 patients with Gaucher's disease and their associated neurological manifestations.
These recent data may call the current classification of Gaucher's disease into question and support the existence of a continuum between neurologic and non-neurologic forms of the disease.
closed_qa
Metastatic breast cancer: do current treatments improve quality of life?
In metastatic breast cancer cases, the currently available therapeutic approaches provide minimal improvement in survival. As such, quality of life (QOL) becomes one of the main objectives of treatment. It is not known whether current treatments derived from trials improve QOL. The aim was to evaluate changes in QOL among metastatic breast cancer patients receiving treatment derived from trials. Prospective observational QOL survey in a tertiary cancer center. To evaluate the influence of current treatments on patients' QOL, the Medical Outcomes Study Short Form-36 (SF-36) and the Beck Depression Inventory (BDI) were applied on three occasions: before starting treatment and at the 6th and 12th weeks, to consecutive metastatic breast cancer patients over a one-year period. We found an improvement in QOL in the sample evaluated (n = 40), expressed by changes in the overall SF-36 score (p = 0.002) and the BDI (p = 0.004). Taken individually, the SF-36 components Pain, Social Functioning and Mental Health also improved significantly. Patients with worse initial performance status and secondary symptoms displayed greater improvement than those with better initial performance status and asymptomatic disease (p<0.001). Patients who received more than one type of therapy showed larger gains than those given only one type (p = 0.038).
In our environment, current metastatic breast cancer treatments can improve QOL, especially among symptomatic patients and those with low performance status.
closed_qa
Invasive aspergillosis: is treatment with "inexpensive" amphotericin B cost saving if "expensive" voriconazole is only used on demand?
Voriconazole for the treatment of invasive aspergillosis (IA) shows superior clinical outcome and tolerability compared to conventional amphotericin B. However, the latter is often used as initial treatment due to lower drug acquisition costs. Therefore we performed a cost-effectiveness analysis. A decision analytic model was designed to compare the cost-effectiveness of a regimen of voriconazole followed by conventional amphotericin B to a regimen of conventional amphotericin B followed by voriconazole. Patients initiated on treatment either completed initial therapy or switched to second line therapy due to toxicity or non-response. Probability of a switch was based on clinical trial data and local rates of renal toxicity. Resource use in the hospital was taken from the Global Comparative Aspergillosis (GCA) study. Costs were based on local drug acquisition costs, local cost estimates for hospitalisation and adjusted additional costs of amphotericin B-induced acute renal failure from the literature. Effectiveness was defined as survival at 12 weeks from the GCA study. An incremental cost-effectiveness ratio was estimated as the incremental cost per life saved comparing voriconazole to conventional amphotericin B. Based on this model, initial therapy of IA with voriconazole reduced total costs when compared to initial therapy with conventional amphotericin B (CHF 37 878/patient vs CHF 49 861/patient) and resulted in better survival at 12 weeks, making it the dominant treatment in terms of incremental cost-effectiveness. Results were most sensitive to alternative assumptions of the incidence of acute renal failure, but cost savings were sustained for voriconazole over a wide range of values.
Considering that initial therapy with voriconazole is both cost-saving and results in better clinical outcomes, voriconazole is the dominant cost-effective option for initial therapy of IA, despite very low drug acquisition costs of conventional amphotericin B.
closed_qa
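Several records above (voriconazole vs amphotericin B, hip protectors, Fast Track) rest on the incremental cost-effectiveness ratio and the notion of "dominance" (cheaper and at least as effective). A minimal sketch of that decision rule in Python; the costs below are the per-patient figures from the voriconazole record, but the effect values (12-week survival proportions) are hypothetical placeholders, since the abstract reports only that survival was better:

```python
def compare_strategies(cost_new, effect_new, cost_ref, effect_ref):
    """Classify a new strategy against a reference, or return its ICER.

    ICER = (cost_new - cost_ref) / (effect_new - effect_ref): the
    incremental cost per extra unit of effect (life saved, QALY, ...).
    """
    dc = cost_new - cost_ref
    de = effect_new - effect_ref
    if dc <= 0 and de >= 0:
        return "dominant"    # no costlier, and at least as effective
    if dc >= 0 and de <= 0:
        return "dominated"   # no cheaper, and no more effective
    return dc / de           # trade-off: incremental cost per unit effect

# Voriconazole record: CHF 37 878 vs CHF 49 861 per patient, with better
# 12-week survival (0.71 vs 0.58 are hypothetical effect values).
print(compare_strategies(37878, 0.71, 49861, 0.58))  # -> dominant
```

When a strategy is dominant, no ICER is reported at all, which is exactly how the voriconazole abstract phrases its result.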
Is chronic pain in adulthood related to childhood factors?
To investigate whether recalled childhood pain experiences and illnesses are associated with chronic pain in young adults. A cross-sectional population-based survey recruited participants aged 18-25 years for a case-control study and obtained information on current pain and recalled childhood experiences. In total, 858 respondents were classified as either non-pain controls (n = 276), non-chronic pain cases (pain for ≤3 months in the previous 6 months, n = 435), or chronic pain cases (pain of >3 months' duration, n = 119). 858 young adults responded to the survey (adjusted response rate 37%). Of the recalled exposures in childhood, family members with pain (OR 2.48, 95% CI 1.48, 4.15), having more than 2 relatives with pain during childhood (OR 3.03, 95% CI 1.44, 6.40), being admitted to hospital during childhood (OR 1.71, 95% CI 1.04, 2.80), and having more illness than one's peer group at secondary school (OR 3.98, 95% CI 1.99, 7.96) were significantly associated with having chronic pain as a young adult, after adjustment for age, sex, and current psychological distress scores. Recall bias was assessed by comparing actual and recalled admission to the neonatal intensive care unit, with no significant differences being found between the participating groups.
Several associations were observed between pain status as a young adult and selected self-reported childhood experiences of illness and pain. The role of recall bias cannot be excluded in this retrospective study, but the results emphasize the importance of family and childhood experiences of pain in potentially influencing future adult pain status.
closed_qa
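Records like the one above summarize case-control associations as odds ratios with 95% confidence intervals. A minimal sketch of how an OR and a Wald (log-scale) CI are computed from a 2x2 table; the counts passed in are made up for illustration, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald (log-scale) 95% CI.

    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative (made-up) counts: exposure doubles the odds of chronic pain.
or_, lo, hi = odds_ratio_ci(40, 60, 25, 75)
```

When the interval excludes 1 (as here), the association would be reported as statistically significant, matching the style of the ORs quoted in the record.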
Increased homocysteine in heart failure: a result of renal impairment?
Hyperhomocysteinemia may constitute a risk factor for patients with severe heart failure. This study examines the relationships of plasma homocysteine concentration and left ventricular ejection fraction with renal function in heart failure patients free of coronary artery disease. Left ventricular ejection fraction was documented in 62 patients with advanced heart failure who had no proven significant coronary artery stenosis. Glomerular filtration rate was estimated using the Cockcroft-Gault equation. Elevated homocysteine levels (≥15 micromol/L) were detected in 22 patients. A low glomerular filtration rate was observed in patients who had normal serum creatinine concentrations. Homocysteine was strongly correlated with age, duration of disease, left ventricular ejection fraction, serum creatinine, and glomerular filtration rate. Statistically significant trends were observed across respective homocysteine quartiles. However, by multivariate regression, the strongest predictor of homocysteine was the glomerular filtration rate.
Impaired renal function leads to a diminished clearance rate, which can be a prominent pathophysiological mechanism in the elevation of homocysteine concentration in heart failure.
closed_qa
Can hip protector use cost-effectively prevent fractures in community-dwelling geriatric populations?
To estimate the cost-effectiveness from a societal perspective of a hip protector (HP) program over the remaining lifetime of individuals initially living at home. A state-transition Markov model considering outcomes of HP use in cohorts stratified by age, sex, and functional and residential status. Costs, transition probabilities, HP adherence, and efficacy were derived from published sources. Community and nursing homes in the United States. Hypothetical cohort of individuals aged 65 and older without a hip fracture and initially living at home. HP program. Fractures, life years, and dollars saved, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICER). HP use prevented fractures and increased life expectancy in all cohorts. HP use saved costs and improved QALYs in women initiating HP use at age 80 and in men at age 85. In women initiating HP use at age 75, the HP ICER was 19,000 dollars/QALY. In men initiating HP use at age 80, HP use saved costs but slightly decreased QALYs. In younger cohorts, HP use was neither cost saving nor QALY improving. In sensitivity analyses, if there was no QALY loss from wearing a HP, the ICER was less than 50,000 dollars/QALY for all age and sex cohorts. If HP cost was reduced 50%, HP use was cost saving for women initiating HP use at age 75. In probabilistic sensitivity analyses, the HP ICER was less than 50,000 dollars/QALY in 68% of simulations for women initiating HP use at age 75 and 61% of simulations for men initiating at age 85.
HP use saved costs and QALYs for older age cohorts of both sexes. Additional research on the quality-of-life effects and obstacles to wearing HP is warranted.
closed_qa
Does adding intravenous fentanyl to caudal block in children enhance the efficacy of multimodal analgesia as reflected in the plasma level of catecholamines?
Several studies have shown that single-modality analgesic management can attenuate perioperative stress, but little is known about the effect of multimodal analgesia on catecholamine responses to surgical trauma in children. Fifty children (American Society of Anesthesiologists Grade I or II) were randomly allocated to one of two groups: one received general anaesthesia and a caudal block (control group), and the other received general anaesthesia, a caudal block and intravenous (i.v.) fentanyl 2 µg kg⁻¹ (fentanyl group). Plasma epinephrine and norepinephrine concentrations were measured three times during the perioperative period: at induction (T0), at the end of surgery (T1) and when the children were fully awake in the postanaesthesia care unit (T2). There was a significant reduction in catecholamine levels in both groups when T1 and T2 were compared with T0. When plasma epinephrine levels (at T0, T1 and T2) were compared between the two groups, a statistically significant reduction at T2 was obtained in the fentanyl group compared with the control group. However, plasma norepinephrine levels showed no statistically significant difference between the two groups (at T0, T1 and T2).
These findings suggest that the multimodal analgesic approach of adding i.v. low-dose fentanyl to a caudal block may decrease the plasma epinephrine release in children undergoing inguinal herniotomy.
closed_qa
Are multiple blood transfusions really a cause of acute respiratory distress syndrome?
Multiple blood transfusions are considered a common cause of acute respiratory distress syndrome (ARDS). We hypothesized that ARDS is more a consequence of ARDS risk factors (in particular circulatory shock) requiring transfusions than a result of the transfusions themselves. This retrospective study included 103 patients admitted during a 10-month period to an 858-bed university hospital who received multiple transfusions (more than six units of packed red blood cells in 24 h). Ten patients developed ARDS; they were more commonly admitted with circulatory shock (36 (38.7%) vs. 8 (80%), P = 0.01), polytrauma (7 (7.5%) vs. 4 (40%), P = 0.01) or thoracic trauma (3 (3.2%) vs. 4 (40%), P = 0.01). The sequential organ-failure assessment (SOFA) score at admission was higher in patients who developed ARDS than in those who did not (9.0 +/- 3.1 vs. 5.6 +/- 3.4, P<0.005). The total amount of transfusion in the first 24 h was 14.0 +/- 6.8 U in the ARDS patients and 10.6 +/- 7.3 U in the other patients (P = 0.17); the differences remained non-significant in the following days. During the first 24 h, patients who developed ARDS received more fresh frozen plasma than those who did not (21.8 +/- 10.6 U vs. 10.7 +/- 14.7 U, P = 0.02). Patients who developed ARDS had lower PaO2/FiO2 ratios (114 +/- 61 mmHg vs. 276 +/- 108 mmHg, P = 0.01), lower arterial pH (7.27 +/- 0.10 vs. 7.34 +/- 0.11, P = 0.06) and higher minute volume (10.6 +/- 2.8 L min⁻¹ vs. 7.9 +/- 1.8 L min⁻¹, P = 0.03) than patients without ARDS. Multivariable analysis retained thoracic trauma and hypoxaemia during the first 24 h (but not multiple transfusions) as independent risk factors for ARDS.
In this retrospective study, the development of ARDS in massively transfused patients was less related to poly-transfusion than to other factors related to circulatory shock, polytrauma or thoracic trauma. Thoracic trauma and a low PaO2 during the first 24 h were identified as independent risk factors for ARDS.
closed_qa
Are first-time episodes of serious LBP associated with new MRI findings?
Magnetic resonance (MR) imaging is frequently used to evaluate first-time episodes of serious low back pain (LBP). Common degenerative findings are often interpreted as recent developments and the probable anatomic cause of the new symptoms. To date no prospective study has established a baseline MR status of the lumbar spine in subjects without significant LBP problems and prospectively surveyed these subjects for acute changes shortly after new and serious LBP episodes. This method can identify new versus old MR findings possibly associated with the acute symptomatic episode. To determine if new and serious episodes of LBP are associated with new and relevant findings on MRI. Prospective observational study with baseline and post-LBP MRI monitoring of 200 subjects over 5 years. Clinical outcomes: LBP intensity (visual analogue scale), Oswestry Disability Index, and work loss. MRI outcomes: disc degeneration, herniation, annular fissures, end plate changes, facet arthrosis, canal stenosis, spondylolisthesis, and root impingement. 200 subjects with a lifetime history of no significant LBP problems and a high risk for new LBP episodes were studied at baseline with physical examination, plain radiographs, and MR imaging. Subjects were followed every 6 months for 5 years with a detailed telephone interview. Subjects with a new severe LBP episode (LBP ≥6/10, >1 week) were assessed for new diagnostic tests. New MR imaging, taken within 6 to 12 weeks of the start of a new LBP episode, was compared with baseline (asymptomatic) images. Two independent and blinded readers evaluated each baseline and follow-up study. During the 5-year observation period of 200 subjects, 51 (25%) subjects were evaluated with a lumbar MRI for clinically serious LBP episodes, and 3/51 (6%) had a primary radicular complaint. These 51 subjects had 67 MR scans. Of 51 subjects, 43 (84%) had either unchanged MR or showed regression of baseline changes. The most common progressive findings were disc signal loss (10%), progressive facet arthrosis (10%), or increased end plate changes (4%). Only two subjects, both with primary radicular complaints, had new findings of probable clinical significance (4%). Subjects having another MR were more likely to have had chronic pain at baseline (odds ratio [OR] = 3.19; 95% confidence interval [CI] 1.61-6.32), to smoke (OR = 5.81; 95% CI 1.99-16.45), have baseline psychological distress (OR 2.27; 95% CI 1.15-4.49), and have previous disputed compensation claims (OR = 2.35; 95% CI 0.97-5.69). Subjects involved in current compensation claims were also more likely to have an MR scan to evaluate the LBP episode (risk ratio = 4.75, p<0.001), but were unlikely to have significant new findings. New findings were not more frequent in subjects with LBP episodes developing after minor trauma than when LBP developed spontaneously.
Findings on MR imaging within 12 weeks of serious LBP inception are highly unlikely to represent any new structural change. Most new changes (loss of disc signal, facet arthrosis, and end plate signal changes) represent progressive age changes not associated with acute events. Primary radicular syndromes may have new root compression findings associated with root irritation.
closed_qa
Is acamprosate use in alcohol dependence treatment reflected in improved subjective health status outcomes beyond cognitive behavioural therapy alone?
To examine whether the addition of acamprosate to Cognitive Behavioural Therapy (CBT) outpatient alcohol dependence treatment impacted on subjective health status. Among 268 patients consecutively treated for alcohol dependence, 149 chose CBT alone. A matched design was used. From a possible pool of 119 Acamprosate + CBT and 149 CBT-only patients, 86 Acamprosate + CBT subjects were individually matched with 86 CBT-only patients on parameters of gender, age, prior detoxification and alcohol dependence severity. Health status (SF-36) and psychological well-being (GHQ-28) were assessed pre- and post-treatment. Pre-treatment, both self-reported health status and psychological well-being were markedly below normative (community) ranges. Program completers significantly improved across both measures over 12 weeks of treatment and some health domains approximated community levels. No treatment group differences were observed.
Participants who completed the CBT-based treatment showed significant improvement in self-reported health status. The use of acamprosate did not register additional improvement on either SF-36 or GHQ-28, beyond CBT alone.
closed_qa
Can a costly intervention be cost-effective?
To examine the cost-effectiveness of the Fast Track intervention, a multi-year, multi-component intervention designed to reduce violence among at-risk children. A previous report documented the favorable effect of the intervention on the highest-risk group of ninth-graders diagnosed with conduct disorder, as well as on self-reported delinquency. The current report addressed the cost-effectiveness of the intervention for these measures of program impact. Costs of the intervention were estimated using program budgets. Incremental cost-effectiveness ratios were computed to determine the cost per unit of improvement in the 3 outcomes measured in the 10th year of the study. Examination of the total sample showed that the intervention was not cost-effective at likely levels of policymakers' willingness to pay for the key outcomes. Subsequent analysis of those most at risk, however, showed that the intervention likely was cost-effective given specified willingness-to-pay criteria.
Results indicate that the intervention is cost-effective for the children at highest risk. From a policy standpoint, this finding is encouraging because such children are likely to generate higher costs for society over their lifetimes. However, substantial barriers to cost-effectiveness remain, such as the ability to effectively identify and recruit such higher-risk children in future implementations.
closed_qa
Can administration of metoclopramide reduce artefacts related to abdominal activity in myocardial perfusion SPECT?
Myocardial perfusion SPECT is frequently affected by artefacts related to abdominal activity. Metoclopramide has been suggested to relieve this, but two previous studies have shown conflicting results. Ninety-five patients received 10 mg metoclopramide orally after injection of 99mTc-tetrofosmin for the stress scan and 86 patients had metoclopramide after their rest injection. A control group of 82 patients did not receive metoclopramide. Scans were evaluated visually by three readers. Metoclopramide given before the stress scan led to abdominal activity being visually better in 16 scans, worse in 10, and unchanged in 67 scans, compared to the same patient's rest scan without metoclopramide administration. Metoclopramide administered before the rest scan resulted in abdominal activity in 11 scans being visually better, in 19 worse, and 53 scans were deemed unchanged. These differences were not significant. The number of repeat stress or rest scans was not significantly different between patients who had received metoclopramide and those who had not. The administration of metoclopramide, irrespective of whether it was given before the stress or rest scan, made no significant difference to inferior wall-to-abdomen count ratio.
Neither qualitative nor quantitative analysis showed an effect of metoclopramide on abdominal activity in myocardial perfusion SPECT.
closed_qa
Can intrapleural C-reactive protein predict VATS pleurodesis failure?
Intrapleural inflammatory reaction after surgery for spontaneous pneumothorax is a key indicator of whether an effective pleurodesis has been achieved. In this study, we tested the hypothesis that intrapleural C-reactive protein (CRP) might precisely quantify the postoperative pleural inflammation, offering potentially useful information for patient management. The study population consisted of 75 consecutive patients who underwent video-assisted thoracoscopic pleurectomy or pleural abrasion for spontaneous pneumothorax between April 2003 and August 2004. We assessed CRP levels in pleural and blood samples taken daily in the first 4 postoperative days. The intrapleural CRP profile was significantly lower in patients who underwent pleural abrasion, in younger patients (<25 years) and in patients who were not drained before surgery. Patients with pleurodesis failure had a lower and delayed CRP peak. Receiver operating characteristic (ROC) analysis showed that the cutoff value of intrapleural CRP for pleurodesis failure was 25 mg/l on the second postoperative day (sensitivity 87.5%, specificity 66.6%, positive predictive value 24.1%, negative predictive value 97.7%).
Pleural CRP levels of less than 25 mg/l on the second postoperative day indicate only moderate pleural inflammation.
closed_qa
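The cutoff analysis in the CRP record reports sensitivity, specificity, PPV and NPV, which all derive from one 2x2 table. A minimal sketch; the counts below are illustrative values chosen to reproduce the reported percentages, not the study's raw data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test metrics."""
    return {
        "sensitivity": tp / (tp + fn),  # detected failures / all failures
        "specificity": tn / (tn + fp),  # correct negatives / all successes
        "ppv": tp / (tp + fp),          # P(failure | test positive)
        "npv": tn / (tn + fn),          # P(success | test negative)
    }

# Illustrative counts consistent with the reported values: 7 of 8
# pleurodesis failures flagged, 22 of 66 successes falsely flagged.
m = diagnostic_metrics(tp=7, fp=22, fn=1, tn=44)
```

With these counts the function returns sensitivity 87.5%, specificity 66.7%, PPV 24.1% and NPV 97.8% — the pattern (low PPV, very high NPV) that drives the record's conclusion.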
Is it possible to predict postnatal depression?
The authors investigated the role of the gynaecologist in trying to predict postnatal depression. Women suffering from postnatal depression (PND) are the expression of a failure to adapt to the unjust demands that society makes on them. Isolation and the lack of social support during and after the pregnancy are very strong risk factors for postpartum depression. The problem is serious and it develops rapidly, within two weeks of childbirth. It requires immediate and continuous treatment. There is also some risk of infanticide or suicide. A questionnaire based on the EPDS (Edinburgh Postnatal Depression Scale) was submitted to 222 pregnant women between 28 and 40 weeks of gestation. 28.4% of the patients tested positive (score >12 points), and the hypothesis would seem to be that there is a continuum between depression suffered pre- and postpartum, and that the depression begins during pregnancy and then becomes more acute or less latent at the time of confinement.
The gynaecologist must play a role in helping to achieve an early diagnosis of the depression, because the earlier the problem is recognised, the greater are the possibilities of treating it and of preventing consequences for the entire family group.
closed_qa
Is it possible to predict the clinical course of gallstone disease?
A knowledge of the predictive factors of the development and persistence of symptoms in gallstone patients (GS) plays a key role in clinical decision making. The aim of this study was to evaluate the presence of predictive factors for biliary pain development in GS, stressing the role of gallbladder motility. A total of 153 (104 women, 49 men) consecutive GS were enrolled. Gallbladder motility (%Emptying [%E], residual volume [RV]) was evaluated by ultrasonography and biliary symptoms were evaluated using a specific questionnaire in all GS at baseline and after 4 yr of follow-up. At enrolment, 61 GS reported recent (GSr) and 31 GS remote (GSo) (>2 yr before) episodes of biliary pain, and 61 were asymptomatic (GSa). At baseline, GSr showed a greater %E and a smaller RV than both GSo and GSa (p<0.001). After follow-up, biliary pain developed more frequently in GSr (33.3%) than in GSo (16%) and GSa (15%) (p = 0.04). The search for predictive factors of biliary pain development (by univariate and multivariate analyses) revealed a high %E, a small RV, and a history of biliary pain as risk factors.
Efficient gallbladder motility is present in symptomatic GS and it represents a risk factor for biliary pain development while sluggish motility seems to play a protective role. Thus, gallbladder motility evaluation is a useful diagnostic tool in clinical decision making for GS; in symptomatic GS, a progressive reduction of gallbladder motility could suggest a "wait and see" management policy instead of an immediate surgical approach.
closed_qa
Does the recent increase in HIV diagnoses among men who have sex with men in the UK reflect a rise in HIV incidence or increased uptake of HIV testing?
To determine whether the increase in HIV diagnoses since 1997 among men who have sex with men (MSM) in the UK reflects a rise in HIV incidence or an increase in HIV testing. Estimates of HIV incidence were derived using data from UK HIV surveillance systems (HIV diagnoses; CD4 surveillance; unlinked anonymous surveys) for 1997-2004. Data on HIV testing were provided by KC60 statutory returns, voluntary testing and unlinked anonymous surveys in sentinel genitourinary medicine (GUM) clinics. HIV diagnoses among MSM in the UK rose by 54% between 1997 and 2004 (from 1382 to 2124), with variation by age and geographical location. The number of HIV diagnoses among MSM<35 years of age in London showed no increase, but in all other groups it increased. Throughout the UK, uptake of HIV testing increased significantly among MSM attending GUM clinics between 1997 and 2004, including "at-risk" MSM (p<0.001). Direct incidence estimates (serological testing algorithm for recent HIV seroconversion assay) provided no evidence of a statistically significant increase or decrease in HIV incidence. Indirect estimates suggested that there may have been a rise in HIV incidence, but these estimates were influenced by the increased uptake of HIV testing.
The number of HIV diagnoses increased among MSM in the UK between 1997 and 2004, except among younger MSM in London, in whom there was no change. The increase in HIV diagnoses among MSM in the UK since 1997 seems to reflect an increase in HIV testing rather than a rise in HIV incidence.
closed_qa
Is childhood immunisation associated with atopic disease from age 7 to 32 years?
There is ongoing conjecture over whether childhood immunisation leads to an increased risk of developing atopic diseases. To examine associations between childhood immunisation and the risk of atopic disease. Immunisation histories of 8443 Tasmanian children born in 1961 obtained from school medical records were linked to the Tasmanian Asthma Study. Associations between immunisation status and atopic diseases were examined while adjusting for possible confounders using multiple logistic regression. Diphtheria immunisation was weakly associated with an increased risk of asthma by age 7 years (odds ratio (OR) 1.3, 95% confidence interval (CI) 1.1 to 1.7), but there was no evidence of any association for four other vaccinations studied. An increased risk of eczema by age 7 years was associated with immunisation against diphtheria (OR 1.5, 95% CI 1.1 to 2.1), tetanus (OR 1.5, 95% CI, 1.1 to 2.0), pertussis (OR 1.5, 95% CI 1.1 to 1.9) and polio (OR 1.4, 95% CI 1.0 to 1.9) but not smallpox. Similar but slightly weaker patterns of association were observed between the risk of food allergies and immunisation against diphtheria (OR 1.5, 95% CI 1.0 to 2.1), pertussis (OR 1.4, 95% CI 1.1 to 1.9), polio (OR 1.4, 95% CI 1.00 to 2.1) and tetanus (OR 1.30, 95% CI 0.99 to 1.70), but not with smallpox. There was no evidence of associations between immunisation history and hay fever, or incidence of later-onset atopic outcomes.
The few effects seen in this study are small and age-dependent, and nearly all our findings support numerous previous studies of no effect of vaccines on asthma. Based on these findings, the fear of their child developing atopic disease should not deter parents from immunising their children, especially when weighed against the benefits.
closed_qa
Diagnostic criteria for congenital long QT syndrome in the era of molecular genetics: do we need a scoring system?
Previously published diagnostic systems, based on ECG analysis and clinical parameters (Schwartz criteria and Keating criteria), have been used to estimate the probability of inherited long QT syndrome (LQTS). Nowadays, a certain diagnosis can often be made by DNA testing. We aimed to establish the predictive power of the Schwartz and Keating criteria, using DNA testing as a reference, and to determine the best diagnostic strategy. We studied 513 relatives (aged >10 years) of 77 consecutive LQTS probands with a known disease-causing mutation. The Schwartz criteria identified 'high probability of LQTS' (score ≥4) in 41 of 208 mutation carriers, yielding 19% sensitivity and 99% specificity. The Keating criteria had 36% sensitivity and 99% specificity. Alternatively, by analysing QTc duration alone, we found that 430 ms is the optimal cut-off value to distinguish carriers (≥430 ms) from non-carriers (<430 ms), yielding 72% sensitivity and 86% specificity (area under the curve 0.788).
The existing clinical criteria have good specificity in identifying mutation carriers. However, their sensitivity is too low for clinical use. Analysis of QTc duration alone is more useful to screen for LQTS carriership (QTc ≥430 ms) as its sensitivity is far superior, although its specificity remains acceptable. In genotyped families, genetic testing is the preferred diagnostic test.
closed_qa
Use of herbal remedies by Hispanic patients: do they inform their physician?
This study measured the knowledge and use of herbs among Hispanics and assessed their experiences when discussing herb use with their physician. Self-administered questionnaires were collected from 620 Hispanic patients seeking treatment in urban health centers. Most (80.3%) reported using herbs. Herb users were more comfortable speaking Spanish (91.9% vs 80.2%) and had been in the United States less than 5 years (47.0% vs 29.4%). More users considered herbs as drugs (60.5% vs 39.6%). Users were more aware that herbs could harm a baby if taken during pregnancy (56.4% vs 36.0%). The majority did not know the English name for 23 of the 25 herbs. A majority indicated their physician was unaware of their herb use. Few (17.4%) responded that their physicians asked about herb use. Only 41.6% thought their physician would understand their herb use, and 1.8% believed their physician would encourage continued use. There were no significant differences between herb users and nonusers in their perception of patient-physician communication levels.
Primary care physicians need to be aware that most Hispanic patients are likely to use herbs. It is important to initiate and encourage discussion of their patient's interest in and use of these therapies.
closed_qa
Avoiding prolonged waiting time during busy periods in the emergency department: Is there a role for the senior emergency physician in triage?
Patient satisfaction at emergency departments can be improved by reductions in waiting time. Traditional methods require registration and triage before seeing the doctor, with senior emergency physicians mainly engaged in treating serious cases. We examine the effect on waiting time of a radical change in workflow pattern: placing a senior emergency physician with the triage nurse and treating simple cases upfront with discharge, and assess the impact on waiting times for stretcher cases. A senior emergency physician was placed with the triage nurse in the Department of Emergency Medicine at Alexandra Hospital during peak busy periods of patient attendance over a period of 2 months. Waiting time (registration to doctor consult) was measured for PACS 3 and PACS 2 (Patient Acuity Score) cases. Ten days were chosen for the changed workflow practice and 10 days as controls, in which normal traditional working practice was followed. On all days, there was the same number of medical staff. The average waiting time for walk-in patients (PACS 3) was 19 min on experimental days as compared with 35.5 min on control days, with 78% being seen within 30 min in the experimental group compared with 48% on control days (P<0.05). The PACS 2 waiting time was also significantly decreased on experimental days (P<0.05).
Placing a senior emergency physician with the triage nurse reduced waiting times for walk-in cases. One third of attendances were treated and discharged quickly, allowing the consulting room and PACS 1/PACS 2 doctors to act more efficiently.
closed_qa
Do patients have a preference for major connector designs?
Nineteen Kennedy Class I or II partially edentulous patients participated at two centers. The four major connector analogs (MCAs) were fabricated for each subject using light-polymerizing acrylic resin. The subjects were asked to wear each of them in the mouth for 30 seconds, in six pairs in random order, and to report their preference within each pair. Based on these data, the four analogs were ranked in descending order of preference for each patient. Within-subject comparisons of preferences were performed with the Friedman test, and multiple comparisons were performed with the Wilcoxon Signed Ranks test for the data of each sample independently. Statistically significant and consistent preference orders were revealed for both samples, and the thin and wide design was significantly preferred to the thick and narrow design. However, a higher variation was observed for the first preference of each subject.
Subjects demonstrated a tendency to prefer thinner MCAs. However, the individual predilections of patients may not be an appropriate basis for an attempt to find a 'best design' applicable to all patients.
closed_qa
Do burned-out and work-engaged employees differ in the functioning of the hypothalamic-pituitary-adrenal axis?
The central aim of the present study was to examine differences in the functioning of the hypothalamic-pituitary-adrenal (HPA) axis between 29 burned-out, 33 work-engaged, and 26 healthy reference managers, as identified with the Maslach Burnout Inventory-General Survey and the Utrecht Work Engagement Scale. All of the managers were employed in a large Dutch telecommunications company. Salivary cortisol was sampled on three consecutive workdays and one nonworkday to determine the cortisol awakening response. Salivary dehydroepiandrosterone-sulfate (DHEAS), a cortisol-counterbalancing product of the HPA axis, was measured on these days 1 hour after the managers awakened. The dexamethasone suppression test was used to investigate the feedback sensitivity of the HPA axis. The morning cortisol levels were higher on the workdays than on the nonworkday, but this effect did not differ between the three groups. The burned-out, work-engaged, and reference groups did not differ in the cortisol and DHEAS levels, the slope of the cortisol awakening response, and the cortisol : DHEAS ratio. The work-engaged group showed a stronger cortisol suppression in response to the dexamethasone suppression test than the other two groups, a finding suggesting higher feedback sensitivity among work-engaged managers.
Burned-out and work-engaged managers only differ marginally in HPA-axis functioning.
closed_qa
Quality of life and pain in Chinese lung cancer patients: Is optimism a moderator or mediator?
To clarify whether optimism exerts a primarily moderating or mediating influence on the pain-QoL association in Chinese lung cancer patients, a total of 334 Chinese lung cancer patients were interviewed at baseline during the first outpatient visit (Baseline), at 4 months after Baseline (FU1), and at 8 months after Baseline (FU2). Respondents completed the Chinese version of the FACT-G version-3 scale (FACT-G (Ch)). Optimism and pain were assessed using two 11-point self-rated items. Linear mixed effects (LME) models tested the moderating and mediating effects of optimism on QoL. Optimism, pain, and QoL were most strongly correlated at FU1. LME models failed to show any moderating effect by optimism on the pain-QoL association (standardized beta = -0.049, 95% CI -0.097 to 0.001). After adjustment for age, cancer stage, and disease recurrence, a modest mediating effect was observed for optimism on the pain-QoL association over the duration of the study (standardized beta = 0.047; Sobel test z = -4.317, p<0.001).
Optimism qualifies as a mediator between pain and QoL suggesting that pessimistic lung cancer patients are likely to experience greater QoL decrements in response to pain in the early post-diagnostic period. Effective pain control may be enhanced by inclusion of interventions that facilitate optimistic perspectives in patients. This study lends further support to the view that lung cancer patients' psychological needs are important in both pain control and QoL.
closed_qa
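The Sobel test reported in the row above is the ratio of the indirect (mediated) effect a·b to its delta-method standard error. A minimal, self-contained sketch of that arithmetic (the path coefficients and standard errors below are hypothetical illustrations, not the study's estimates):

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel test statistic for an indirect (mediated) effect a*b.

    a, se_a : path coefficient from predictor to mediator, and its SE
    b, se_b : path coefficient from mediator to outcome (controlling
              for the predictor), and its SE
    """
    return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

# Hypothetical path estimates, purely for illustration:
z = sobel_z(a=-0.30, se_a=0.05, b=0.25, se_b=0.06)
```

A |z| greater than 1.96 would indicate a statistically significant indirect effect at the 5% level, which is the logic behind the reported z = -4.317.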
Is sonographic assessment of cervical length better than digital examination in screening for preterm delivery in a low-risk population?
This randomized controlled trial compared the diagnostic accuracy of the sonographic assessment of cervical length and clinical digital examination of the cervix in the second trimester regarding the prediction of preterm delivery in a low-risk population. In total, 282 unselected, asymptomatic women with singleton pregnancy randomly underwent sonographic cervical length measurement (study group, n=138) or clinical digital examination (control group, n=144) in the second trimester. In the study group, cervical length ≤5th percentile (≤24 mm) for our population was defined as shortened. In the control group, Bishop score ≥95th percentile (≥4) for our population was defined as high. The primary outcome measure was the diagnostic accuracy of both tests regarding the prediction of preterm delivery (<37 weeks). Shortened cervical length was found in 6/138 (4.3%) women, whereas a high Bishop score was found in 17/144 (11.8%) (p=0.038, Fisher's exact test). The incidence of preterm delivery was 5.7% (16/282). Regarding the prediction of preterm delivery, shortened cervical length and high Bishop score had sensitivity 57.1% versus 33.3% and positive predictive value 66.7% versus 17.6%. Shortened cervical length, in comparison with high Bishop score, had a 12-fold higher positive likelihood ratio for preterm delivery in a low-risk population (37.4; 95% CI 8.2-170.7 versus 3.2; 95% CI 1.1-9.2).
Sonographic assessment of cervical length has better diagnostic accuracy in the prediction of preterm delivery compared to digital examination in a low-risk population.
closed_qa
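The sensitivity, positive predictive value, and positive likelihood ratio quoted in the row above all follow mechanically from a 2x2 screening table. A minimal sketch of that arithmetic (the counts below are hypothetical, not the study's raw data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from a 2x2 table.

    tp/fp/fn/tn: true-positive, false-positive, false-negative,
    and true-negative counts.
    """
    sens = tp / (tp + fn)        # sensitivity
    spec = tn / (tn + fp)        # specificity
    ppv = tp / (tp + fp)         # positive predictive value
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    return sens, spec, ppv, lr_pos

# Hypothetical counts, purely for illustration:
sens, spec, ppv, lr_pos = diagnostic_metrics(tp=8, fp=2, fn=4, tn=86)
```

Because LR+ = sensitivity / (1 - specificity), a rare positive test (few false positives) can yield a very large likelihood ratio even at modest sensitivity, which is why the cervical-length test's LR+ (37.4) dwarfs its 57.1% sensitivity.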
Adolescents' eating, exercise, and weight control behaviors: does peer crowd affiliation play a role?
To examine the association between peer crowd affiliation (e.g., Jocks, Populars, Burnouts, Brains) and adolescents' eating, exercise, and weight control behaviors. The roles of gender and ethnicity were also examined. Ethnically diverse adolescents (N = 705; 66% girls) completed the Peer Crowd Questionnaire, eating and exercise items from the Youth Risk Behavior Surveillance System, and weight control behaviors from the Eating Attitudes Test-12. Controlling for gender and ethnicity, adolescents affiliating with the Burnouts reported more unhealthful eating and more bulimic behaviors than others; adolescents affiliating with the Brains reported more healthful eating, less unhealthful eating, and more dieting; those affiliating with Jocks and Populars reported engaging in more exercise; and Populars also reported more unhealthful eating. In addition, boys exercised more than girls; girls reported more dieting and bulimic behaviors. Black adolescents reported more unhealthful eating and less dieting than other adolescents.
Along with gender and ethnicity, peer crowd affiliation is related to adolescents' eating, exercise, and weight control behaviors. Prevention programs should consider adolescent peer crowds in developing health promotion and obesity prevention programs.
closed_qa
Immunohistochemical localization of thymidine phosphorylase in gastric cancer: is there a role of the differential expression in tumor cells and associated stromal cells?
The aim of this study was to investigate the expression of thymidine phosphorylase (TP), a known angiogenic factor for endothelial cells, in gastric carcinoma cells and tumor-associated stromal cells. Sixty-six gastric carcinomas were studied. TP expression was assessed with the P-GF.44C mouse monoclonal antibody using the avidin-biotin immunoperoxidase technique. The results were correlated with several clinicopathological parameters and patient survival. TP expression in cancer cells was related to the age of the patients and the overall survival. When TP was expressed in tumor-associated stromal cells, it was statistically related to poorly-differentiated tumors. Statistical analysis revealed no relationship between TP expression and any of the clinicopathological parameters under evaluation, when considering stromal TP immunoreactivity separately for stromal fibroblasts and associated inflammatory cells, or when considering the tumors as TP-positive, irrespective of the tissue localization of the enzyme.
TP seems to be a prognostic indicator in gastric cancer patients only when the enzyme is located in tumor cells. The different impacts of TP expression in tumor cells and associated stromal cells may indicate that the enzyme has more than one function in tumor growth.
closed_qa
Can urethral mobility be assessed using the pelvic organ prolapse quantification system?
To determine whether the Pelvic Organ Prolapse-Quantification (POP-Q) system can be used as a replacement for Q-tip testing to assess urethral mobility in women, we performed a retrospective review of a clinical database of 1490 patients presenting to a urogynecology clinic. The evaluation included both the Q-tip straining angle and the POP-Q examination. Urethral hypermobility was defined by the Q-tip test as a straining angle of 30 degrees or greater relative to the horizontal. The correlation between point Aa of the POP-Q system and the maximal Q-tip straining angle was determined using the Spearman correlation coefficient. The mean age of the 1490 patients was 59.5 +/- 13.1 years; the median parity was 2. A total of 62 patients (4.2%) reported prior surgery for incontinence or prolapse. The mean Q-tip straining angle was 44.7 degrees +/- 21.8 degrees. The point Aa values ranged from +3 to -3 cm (median -2). The correlation coefficient between the Q-tip straining angle and point Aa was 0.54 (P<0.001). Urethral hypermobility was observed in 93.3% of patients with stage 3 prolapse, 92.5% with stage 2, 88.9% with stage 1, and 55.8% with stage 0. Only in patients with stage 4 prolapse was urethral hypermobility observed 100% of the time.
The correlation between point Aa of the POP-Q and the Q-tip straining angle was moderately strong when analyzed across all degrees of prolapse. However, urethral hypermobility could not be reliably predicted from POP-Q measurement alone. Therefore, the Q-tip test remains an essential part of the urogynecologic evaluation.
closed_qa
Transperitoneal versus extraperitoneal robotic-assisted radical prostatectomy: is one better than the other?
To evaluate the differences, if any, in outcomes with transperitoneal (TP) versus extraperitoneal (EP) approaches during robotic-assisted radical prostatectomy (RARP), we reviewed the data from 40 consecutive patients who underwent EP-RARP at our institution by the same surgical team. The outcomes were compared with those of 40 consecutive patients who underwent TP-RARP performed by the same team in a nonrandomized manner. The operative and postoperative parameters (total operative time, estimated blood loss, length of stay, robotic console time, and robotic anastomosis time), as well as complications and surgical margin status, were analyzed and compared. The patient demographics were similar in both groups. Nerve sparing was performed in 35 and 36 patients in the TP and EP groups, respectively. Pelvic lymphadenectomy was performed in 14 and 12 patients in the TP and EP groups, respectively. The operative time was slightly longer with the TP approach, at 236 minutes (range 111 to 360), compared with 229 minutes (range 143 to 382) in the EP group, but the difference was not statistically significant (P = 0.5722). The differences in robot console time, anastomosis time, estimated blood loss, and length of stay were also not statistically significant between the two groups at the 5% significance level. The complication and positive surgical margin rates were similar in both groups.
As expected, the EP approach is feasible with RARP. Our data suggest that the EP approach is comparable to the TP approach and produces favorable outcomes. Surgeon preference will likely play a significant role in the approach used.
closed_qa
Primer sensitivity: can it influence the results in Enterococcus faecalis prevalence studies?
Recent polymerase chain reaction (PCR)-based studies have shown significant variability in the reported prevalence of Enterococcus faecalis in cases with nonhealing endodontic infections. This variability may be, at least in part, due to differences in the sensitivities of the primers used. The purpose of this study was to compare the sensitivity of 3 sets of PCR primers that have been reported in the endodontic literature. The 3 primer sets used were: group 1) tuf gene-based primers with genus-level specificity; and groups 2 and 3) 16S rDNA-based primers that were E. faecalis-specific. Three strains of E. faecalis at concentrations of 10(2)-10(8) cells/mL were included in this study. The PCR amplification of E. faecalis strains with the 3 primer pairs showed that group 1 primers consistently had the highest sensitivity, followed by group 2 and group 3 (P<.0001).
A tuf-based PCR identification assay followed by direct sequencing would yield accurate and consistent prevalence rates of E. faecalis in endodontic infections.
closed_qa
Does caffeine modify corticomotor excitability?
To test the influence of caffeine on lower and upper motor neuron excitability, two experiments were performed. In Experiment A, 18 healthy subjects received 200 mg of caffeine or placebo in a randomized, double-blind, placebo-controlled design. Mean F-wave amplitude, amplitude of the motor response evoked by magnetic stimulation (MEP), MEP duration, cortical silent period (CSP), central conduction time, and cortical threshold were evaluated. In Experiment B, 6 healthy controls received 400 mg of caffeine, and the peripheral silent period (PSP) and CSP were evaluated. CSP was recorded bilaterally in biceps brachii (intensity 10% above threshold) and abductor digiti minimi (ADM) (intensity at 10% and 50% above threshold). Muscle contraction was above 50% of the maximum force in both experiments. Latencies were measured by a technician blinded to the aims of the investigation. Serum caffeine level was evaluated. In Experiment A, only the CSP, recorded in both ADM muscles with intensity at 10% above threshold, showed a significant change after caffeine (decrease of 17.1+/-34.0 ms, about a 12% reduction). In Experiment B, PSP did not change, but CSP tested with intensities 10% above threshold was significantly decreased, by 20.8+/-34.4 ms in ADM and 13.5+/-13.8 ms in biceps (about 13% and 16%, respectively). Serum caffeine level clearly increased after consumption, but no correlation could be found between these levels and CSP reduction.
In our investigation, caffeine elicited a consistent decrease of the CSP, suggesting that caffeine increases cortical neuronal excitability.
closed_qa