Dataset schema: instruction (string, 10-664 chars) / context (string, 1-5.66k chars) / response (string, 1-3.34k chars) / category (string, 1 class)
Is routine pouchogram prior to ileostomy closure in colonic J-pouch really necessary?
Colonic J-pouch with coloanal anastomosis has gained popularity in the surgical treatment of middle and lower rectal pathologies. If a diverting ileostomy is performed, a pouchogram is frequently obtained prior to ileostomy closure. The aim of this study was to assess the routine use of pouchogram prior to ileostomy closure in patients with colonic J pouch-anal anastomosis. All patients who underwent a colonic J pouch-anal anastomosis between 1990 and 2000 were retrospectively reviewed. Patients with a temporary loop ileostomy who had a pouchogram prior to ileostomy closure were included. Pouchogram results were compared with each patient's clinical outcome after ileostomy closure. Sensitivity, specificity and predictive values of pouchogram were assessed. Eighty-four patients had a pouchogram prior to ileostomy closure. Radiological abnormalities were evident in 6 patients, including 4 strictures, 1 pouch-vaginal fistula and 1 leak. Of these findings, 4 were false positives (3 strictures and 1 leak) and 2 were true positives (1 stricture and 1 pouch-vaginal fistula). The actual rate of pouch complications was 9.5% (8 complications): 3 anastomotic leaks, all with normal pouchograms; 3 strictures requiring dilatation under anaesthesia, only 1 detected by pouchogram; and 2 pouch-vaginal fistulas, only 1 diagnosed by pouchogram. The sensitivity and specificity of pouchogram were, respectively, 0% and 98% for anastomotic leak, 33% and 96% for stricture, and 50% and 100% for pouch-vaginal fistula. Overall, pouchogram changed management in only 1 of 84 patients.
Pouchogram has a low sensitivity in predicting complications following ileostomy closure in patients after colonic J-pouch anal anastomosis and rarely changes the management of these patients. Routine pouchogram prior to ileostomy closure may be unnecessary and should be reserved for cases in which complications are clinically suspected.
closed_qa
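A minimal sketch of the diagnostic-accuracy arithmetic behind the pouchogram record above. The true/false positive and negative counts are reconstructed from the abstract; the true-negative counts are derived as the remainder of the 84 patients per complication, which is an assumption, since the abstract does not state them directly.

```python
# Sensitivity/specificity/PPV from confusion-matrix counts reconstructed
# from the abstract; TN is inferred as (84 - TP - FP - FN) per complication.
def accuracy(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp) if (tp + fp) else float("nan"),
        "npv": tn / (tn + fn),
    }

findings = {
    "anastomotic leak":      dict(tp=0, fp=1, fn=3, tn=80),
    "stricture":             dict(tp=1, fp=3, fn=2, tn=78),
    "pouch-vaginal fistula": dict(tp=1, fp=0, fn=1, tn=82),
}
for name, counts in findings.items():
    m = accuracy(**counts)
    print(f"{name}: sensitivity={m['sensitivity']:.0%}, "
          f"specificity={m['specificity']:.0%}, PPV={m['ppv']:.0%}")
# Note: leak specificity computes to 80/81 (about 98.8%), which the
# abstract reports as 98%; stricture (78/81) and fistula (82/82) match.
```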
Is an easy and reliable diagnosis of localized neuropathic pain (LNP) possible in general practice?
Neuropathic pain (NP) is a common type of chronic pain in which 60% of patients present with localized symptoms. Early diagnosis of NP is often a challenge in primary care, and no standard diagnostic procedure for localized NP (LNP) is yet available. To help general practitioners, a screening tool was developed and evaluated. The development of the screening tool was based on the grading-system principles for NP proposed by the IASP, focusing on medical history and the distribution of painful symptoms and sensory signs. It was tested by 31 general practitioners and evaluated against the NP diagnosis of three pain specialists, used as the reference standard, in a single-center prospective study in Spain with a cohort design that included an adult population of chronic pain patients. This design avoids spectrum bias, in which the spectrum of disease is not correctly reflected in the study population. General practitioners rated the usefulness, simplicity, and time requirements of the tool. Diagnostic accuracy was expressed by sensitivity, specificity, and positive and negative predictive values. General practitioners consecutively screened 2079 chronic pain patients (mean age 60.7 ± 11.1 years, 69.9% female). Using the tool, 394 patients were diagnosed with LNP. Screening, including sensory examination, took 7 min (median). General practitioners rated the tool as useful (24/31; 77.4%) or very useful (7/31; 22.6%) for diagnosing LNP and as facilitating clinical practice (30/31; 96.8%). Under daily practice conditions, the sensitivity and specificity of the tool for detecting LNP were 46.7% and 86.6%, respectively.
The proposed screening tool was shown to be easy and useful as a fast first assessment for detecting NP and LNP in chronic pain patients in primary care, thus facilitating the choice of a topical treatment.
closed_qa
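The screening-tool record above reports only sensitivity (46.7%) and specificity (86.6%); what these imply for an individual positive or negative result depends on LNP prevalence, which the abstract does not state. A small sketch of the standard Bayes conversion, with purely illustrative prevalence values:

```python
# Predictive values from sensitivity, specificity, and an assumed prevalence.
# Sensitivity/specificity are from the abstract; the prevalences are
# illustrative assumptions, not figures from the study.
def predictive_values(sens, spec, prev):
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

for prev in (0.10, 0.20, 0.30):
    ppv, npv = predictive_values(0.467, 0.866, prev)
    print(f"prevalence {prev:.0%}: PPV={ppv:.1%}, NPV={npv:.1%}")
```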
Is non-vascularized autografting ineffective in proximal scaphoid nonunions?
In this study, we aimed to evaluate the results of proximal scaphoid non-union treated with non-vascularized bone grafting and screw fixation. Thirteen patients who were treated surgically for proximal scaphoid non-union in our clinic, with a minimum of one year of follow-up, were evaluated. Wrist movements were measured by standard goniometry and muscle strength by hand dynamometry. Non-union was classified radiologically according to the Schernberg classification, while functional assessment was performed based on the Herbert-Fisher Grading System and the Mayo Clinic Modified Wrist Scoring System. The mean follow-up period was 14 months (range, 12 to 40 months). Full union was observed in eight of 13 patients (61.5%). The mean time to union was 16 (range, 12 to 40) weeks. There was no loss of function of more than 10% compared to the healthy hand in the cases with full union. Postoperative mean grip strength was 37.3±3.0 kg. According to the Herbert-Fisher classification, the rates of excellent and good results and of moderate and poor results were 61.5% and 38.5%, respectively, and the mean Mayo score was 80±13.
We did not obtain satisfactory results in patients treated with non-vascularized bone grafting and screw fixation for proximal scaphoid non-unions. We suggest that grafting should be carried out in selected cases because of the adverse effects of open techniques and bone grafting on the vascularity of the scaphoid bone.
closed_qa
Are there any adverse effects of static magnetic field from magnetic resonance imaging devices on bone health of workers?
In this study, we aimed to evaluate the effects of static magnetic field (SMF) from magnetic resonance imaging (MRI) devices on the bone health of MRI workers. Fourteen volunteer MRI technicians who had worked with 1.5 Tesla MRI units for at least two years were included in the study. An age- and sex-matched control group was formed from 14 volunteer indoor-working paramedical staff who were not exposed to SMF and met identical criteria. Dual-energy X-ray absorptiometry (DXA) scanning was performed in all participants. Parathyroid hormone, calcium, phosphorus, alkaline phosphatase, 25-hydroxyvitamin D3, and 1,25-dihydroxyvitamin D3 levels were measured. The mean vertebral and femoral neck bone mineral content (BMC) and bone mineral density (BMD), as well as the mean 25-hydroxyvitamin D3 level, of MRI technicians were lower than those of the control group (p<0.01). Although within the normal range, the mean calcium level of MRI technicians was higher than that of the control group (p<0.05). There was no statistically significant difference in other variables between the groups.
To the best of our knowledge, this is the first report of adverse effects of SMF from MRI devices on the bone health of MRI workers. However, further multicenter studies and animal experiments are required to gain a better understanding of the mechanism by which SMF affects bone health under chronic exposure.
closed_qa
Are asthmatic patients prone to bone loss?
Recent studies suggest an association between allergic diseases, including asthma, and lower vitamin D level, a well-known risk factor for osteoporosis. However, it is not yet clearly known whether patients with asthma are prone to bone loss. To evaluate whether the occurrence of airway hyperresponsiveness (AHR) or asthma is related to significant changes in bone mineral density (BMD), we retrospectively enrolled 7,034 patients who had undergone a health checkup program, including BMD tests and methacholine bronchial challenge tests, at the Seoul National University Hospital, Healthcare System Gangnam Center, from November 1, 2004 to April 30, 2011. Asthma was ascertained by self-report of a physician's diagnosis. Patients with a history of systemic corticosteroid medication use were excluded from the study. Among a total of 7,034 patients, 216 (3.1%) had a positive AHR test result, and 217 (3.1%) had a history of asthma. Lumbar spine and femur BMD of patients with AHR were significantly lower than those of patients without AHR (-0.53 ± 1.50 vs -0.03 ± 1.49 and -0.47 ± 0.97 vs -0.22 ± 0.99, respectively; P<.001 for both). After adjustment for age, sex, body mass index, smoking status, postmenopausal state, and previous history of hormone replacement therapy, the proportion of patients with osteopenia or osteoporosis was much higher in the AHR-positive group than in the AHR-negative group (odds ratio, 1.715; 95% confidence interval, 1.252-2.349) and in the ever-asthma group than in the never-asthma group (odds ratio, 1.526; 95% confidence interval, 1.120-2.079).
In the current study, AHR and asthma were related to clinically meaningful BMD decrease, although the causal relationship is unclear.
closed_qa
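A minimal sketch of the covariate-adjusted logistic regression behind odds ratios like the 1.715 reported in the asthma/BMD record above. The patient-level data are not available, so the DataFrame and its column names are hypothetical stand-ins for the adjustment variables the abstract lists; statsmodels is our own choice of tooling, not something the paper specifies.

```python
# Adjusted odds ratios via logistic regression: exponentiated coefficients
# and confidence-interval bounds. `df` and its columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def adjusted_or(df: pd.DataFrame) -> pd.DataFrame:
    y = df["low_bmd"]  # 1 = osteopenia or osteoporosis, 0 = normal BMD
    X = sm.add_constant(df[["ahr_positive", "age", "male", "bmi",
                            "smoker", "postmenopausal", "hrt_history"]])
    res = sm.Logit(y, X).fit(disp=False)
    out = np.exp(res.conf_int())          # CI bounds on the odds-ratio scale
    out["OR"] = np.exp(res.params)        # point estimates as odds ratios
    return out.rename(columns={0: "ci_lower", 1: "ci_upper"})
```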
Synchronous splenectomy during cholecystectomy for hereditary spherocytosis: is it really necessary?
Expert guidelines recommend performing synchronous splenectomy in patients with mild hereditary spherocytosis (HS) and symptoms of gallstone disease. This recommendation has not been widely explored in the literature. The aim of this study was to determine whether our data support expert opinion and whether different practice patterns should exist. This was an IRB-approved retrospective study. All HS patients under 18 years of age who underwent cholecystectomy for symptomatic gallstones at a single institution between 1981 and 2009 were identified. Patients who underwent cholecystectomy without concurrent splenectomy were reviewed retrospectively for future need for splenectomy and evidence of recurrent gallstone disease. Of the 32 patients identified, 27 underwent synchronous splenectomy. The remaining 5 patients underwent cholecystectomy without splenectomy and had a mean age of 9.4 years. One of the 5 patients eventually required splenectomy for left upper quadrant pain. None of the remaining 4 required hospitalization for symptoms related to hemolysis or hepatobiliary disease. Median follow-up was 15.6 years.
The need for splenectomy in patients with mild HS and symptomatic cholelithiasis should be assessed on a case-by-case basis. Our recommendation is not to perform synchronous splenectomy in conjunction with cholecystectomy in these patients if no indication for splenectomy exists.
closed_qa
Should the ovary always be conserved in torsion?
All children with a diagnosis of ovarian torsion admitted to our hospital from January 2009 to January 2013 were included. Patients with underlying ovarian pathology were excluded. There were 13 torsions in 12 children (one bilateral). All underwent detorsion with or without evacuation of hematoma. Follow-up ultrasonography (USG) with color Doppler was performed for all 13 ovaries and showed good vascularity and follicular development in 12 ovaries (92%). Intraoperatively, the ovary was judged to be moderately to severely ischemic or necrotic in 10 of 13 cases (76%). Yet follow-up sonograms showed the ovary with follicular development in all cases except one (7%). There were no major complications in our series.
Simple detorsion, instead of the traditionally advocated oophorectomy, was not accompanied by an increase in morbidity. On follow-up, almost all patients studied had functioning ovarian tissue despite the severe ischemia observed intraoperatively. Detorsion should be the procedure of choice for all cases of simple ovarian torsion in children.
closed_qa
Should we await IFN-free regimens to treat HCV genotype 1 treatment-naive patients?
In treatment-naive patients mono-infected with genotype 1 chronic HCV, telaprevir/boceprevir (TVR/BOC)-based triple therapy is standard of care. However, more efficacious direct-acting antivirals (IFN-based new DAAs) are available and interferon-free (IFN-free) regimens are imminent (2015). A mathematical model estimated quality-adjusted life years, costs and incremental cost-effectiveness ratios of (i) IFN-based new DAAs vs. TVR/BOC-based triple therapy; and (ii) IFN-based new DAA initiation strategies, given that IFN-free regimens are imminent. The sustained virological response in F3-4/F0-2 was 71/89% with IFN-based new DAAs and 85/95% with IFN-free regimens, vs. 64/80% with TVR/BOC-based triple therapy. Serious adverse events leading to discontinuation were taken as 0-0.6% with IFN-based new DAAs and 0% with IFN-free regimens, vs. 1-10% with TVR/BOC-based triple therapy. Costs were €60,000 for 12 weeks of IFN-based new DAAs and twice that for IFN-free regimens. Treatment with IFN-based new DAAs at fibrosis stage ≥F2 is cost-effective compared with TVR/BOC-based triple therapy (€37,900/QALY gained), but not at F0-1 (€103,500/QALY gained). Awaiting the IFN-free regimens is more effective, except in F4 patients, but not cost-effective compared with IFN-based new DAAs. If the cost of IFN-free regimens decreases to close to that of IFN-based new DAAs, then awaiting the IFN-free regimen becomes cost-effective.
Treatment with IFN-based new DAAs at stage ≥F2 is both effective and cost-effective compared with TVR/BOC triple therapy. Awaiting IFN-free regimens and then treating regardless of fibrosis stage is more efficacious, except in F4 patients; however, the cost-effectiveness of this strategy is highly dependent on its cost.
closed_qa
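The cost-effectiveness figures in the HCV record above come down to a single ratio: incremental cost over incremental QALYs. A sketch of that arithmetic; the cost and QALY inputs below are invented for illustration and merely chosen so the ratio lands near the reported €37,900/QALY, since the paper reports only the resulting ratios.

```python
# Incremental cost-effectiveness ratio: extra cost divided by extra QALYs.
# The cost/QALY inputs are hypothetical, not figures from the paper.
def icer(cost_new: float, qaly_new: float, cost_old: float, qaly_old: float) -> float:
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# e.g. a strategy costing €30,000 more and yielding 0.79 extra QALYs:
print(f"ICER = €{icer(90_000, 13.54, 60_000, 12.75):,.0f}/QALY")  # ≈ €37,975/QALY
```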
Explaining individual differences in alcohol intake in adults: evidence for genetic and cultural transmission?
The current study aimed to describe what proportion of the variation in adult alcohol intake is attributable to genetic differences among individuals and what proportion to differences in the environmental experiences to which individuals have been exposed. Effects of age, gender, spousal resemblance, and cultural transmission of alcohol intake from parents to offspring were taken into account. In a twin-family design, the effects of genetic and cultural transmission and of shared and nonshared environment on alcohol intake were estimated with genetic structural equation models. Data originated from adult twins, their siblings, parents (n = 12,587), and spouses (n = 429) registered with the population-based Netherlands Twin Register (63.5% female; ages 18-97 years). Alcohol intake (grams per day) was higher among men than women and increased with age. Broad-sense heritability estimates were similar across sex and age (53%). Spousal resemblance was observed (r = .39) but did not significantly affect the heritability estimates. No effects of cultural transmission were detected. In total, 23% of the variation in alcohol intake was explained by additive genetic effects, 30% by dominant (nonadditive) gene action, and 47% by environmental effects that were not shared among family members.
Individual differences in adult alcohol intake are explained by genetic and individual-specific environmental effects. The same genes are expressed in males and females and in younger and older participants. A substantial part of the heritability of alcohol intake is attributable to nonadditive gene action. Effects of cultural transmission that have been reported in adolescence are not present in adulthood.
closed_qa
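The variance decomposition in the alcohol-intake record above can be checked with a few lines of arithmetic: the additive (A), dominance (D), and nonshared-environment (E) shares should sum to 100%, and broad-sense heritability is A + D.

```python
# Variance components reported above: A = 23%, D = 30%, E = 47%.
a, d, e = 0.23, 0.30, 0.47
assert abs((a + d + e) - 1.0) < 1e-9   # components partition the variance
broad_sense_h2 = a + d                 # = 0.53, matching the 53% estimate
narrow_sense_h2 = a                    # additive share only
print(f"broad-sense H2 = {broad_sense_h2:.0%}, narrow-sense h2 = {narrow_sense_h2:.0%}")
```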
Effectiveness of treatment for adolescent substance use: is biological drug testing sufficient?
The purpose of this study was to compare the relative effectiveness of three treatment modalities for adolescent substance use: biological drug screening (BDS), Motivational Enhancement Therapy-Cognitive Behavioral Therapy (MET/CBT5), and BDS combined with MET/CBT5, relative to no treatment. This study comprised 5,186 adolescents (70% male) enrolled in substance use treatment and tracked through the Substance Abuse and Mental Health Services Administration's Center for Substance Abuse Treatment's database (BDS = 1,110; MET/CBT5 = 784; BDS combined with MET/CBT5 = 2,539; no treatment = 753). Outcomes of interest were substance use frequency and severity of substance use problems at 3, 6, and 12 months, as measured by the Global Appraisal of Individual Needs survey. Propensity score weighting was used to adjust for pretreatment covariate imbalances between groups. Weighted generalized linear models were used to estimate the impact of treatment on outcomes at 3, 6, and 12 months. BDS, alone or in combination with MET/CBT5, was associated with improved substance use and substance problems outcomes. Relative to youth reporting no treatment services, the BDS group reported significantly lower substance use at all visits, with the observed difference increasing over time. BDS alone was associated with significantly fewer substance problems than BDS combined with MET/CBT5 at all visits and significantly lower use at 12 months.
Our results demonstrate significant improvement on substance use outcomes associated with BDS and offer preliminary evidence that BDS, particularly standalone BDS, may be an effective form of drug treatment for adolescents. Further work, including randomized studies, should explore the optimal format of administering BDS to adolescents to achieve maximum effectiveness.
closed_qa
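A minimal sketch of the propensity-score weighting step described in the record above, reduced to inverse-probability-of-treatment weighting for a binary treatment; the covariate matrix and treatment indicator are hypothetical, and the study's multi-group weighting would generalize this.

```python
# Inverse-probability-of-treatment weights for a binary treatment, as a
# simplified stand-in for the multi-group propensity weighting the study used.
# X (pretreatment covariates) and treated (0/1 indicator) are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

def iptw_weights(X: np.ndarray, treated: np.ndarray) -> np.ndarray:
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    ps = np.clip(ps, 0.01, 0.99)   # trim extreme propensity scores for stability
    return np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))

# The resulting weights would then feed a weighted generalized linear model
# of the outcomes, mirroring the analysis described in the abstract.
```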
Arthroscopic debridement for acutely infected prosthetic knee: any role for infection control and prosthesis salvage?
The purpose of this study was to assess the success rate of arthroscopic debridement guided by C-reactive protein (CRP) levels for acutely infected total knee prostheses. From January 2002 to December 2009, 16 consecutive eligible patients met the following inclusion criteria: duration of symptoms less than 72 hours, previously well-functioning prostheses, and no radiographic signs of loosening. Each patient underwent arthroscopy with thorough debridement and synovectomy and copious irrigation. In addition to the standard anterior portals, a posterior portal was used, and a drain was placed through this portal. The need for subsequent open debridement was determined by the postarthroscopy trends of CRP levels. Treatment success was defined as continuing freedom from infection based on clinical and laboratory results, salvage of the prosthesis, and no evidence of infection for at least 2 years. Arthroscopic debridement eradicated the infection in 10 (62.5%) of the 16 cases. The other 6 knees (37.5%) underwent subsequent open debridement with polyethylene insert exchange, which resulted in successful infection control with prosthetic salvage.
Patients who had undergone total knee arthroplasty (TKA) and had an acute joint infection of less than 72 hours' duration, with no evidence of a loosening prosthesis, were treated by arthroscopic debridement guided by the CRP level. Arthroscopic treatment alone had a 62.5% success rate, but the success rate reached 100% when initial failures were treated with open debridement and polyethylene exchange.
closed_qa
Mucinous cystic neoplasm of the pancreas: is surgical resection recommended for all surgically fit patients?
Surgical removal of mucinous cystic neoplasms (MCNs) is usually recommended because of the risk of malignancy. However, increased experience with MCNs suggests that the incidence of invasion is lower than had been thought. This study was designed to establish more reasonable surgical indications for MCN through re-assessment using strict pathologic diagnostic criteria. Ninety-four patients who underwent surgical removal of MCNs at Seoul National University Hospital from 1991 to 2012 were retrospectively analyzed. Pathologic results were re-evaluated by an experienced pathologist. Medical records and radiologic images were reviewed to determine factors predicting malignancy. Of the 94 patients, 4 were found to have intraductal papillary mucinous neoplasms (IPMNs). Of the 90 MCNs, 60 (66.7%) were low-grade, 21 (23.3%) were intermediate-grade, and 5 (5.5%) were high-grade dysplasias, and 4 (4.4%) were invasive carcinomas. Mural nodules on CT scan (p = 0.005) and abnormal serum CA19-9 concentration (p = 0.029) were significant predictors of malignancy. All MCNs less than 3 cm in size with normal serum tumor markers were benign, and all malignant MCNs had cyst fluid CA19-9 over 10,000 units/ml. The five-year disease-specific survival rates were 98.8% for all patients and 75.0% for those with invasive MCNs.
MCNs had a low prevalence of malignancy. Regardless of histological grade, long-term outcome was excellent. Therefore, in the absence of specific symptoms, surgery may not be indicated for MCNs <3 cm without mural nodules or elevated serum tumor markers. Validation by a carefully designed prospective study is needed.
closed_qa
Do the clinical features in infantile-onset saccade initiation delay (congenital ocular motor apraxia) correlate with brain magnetic resonance imaging findings?
Infantile-onset saccade initiation delay (ISID) is a defect in saccade initiation. Other features may include impaired smooth ocular pursuit, developmental delay, hypotonia, and ataxia. Brain magnetic resonance imaging (MRI) can be normal or show supratentorial or infratentorial abnormalities. Our aim was to correlate the clinical features of ISID with brain MRI findings. A detailed review of the English medical literature between 1952 and 2012 revealed 67 studies with possible ISID. Patients without a brain MRI or with inadequate information, Joubert syndrome, neurodegenerative disorders, or acquired saccade initiation delay were excluded. Ninety-one patients (age range, 3 months to 45 years) met the inclusion criteria and were divided into 3 groups based on their brain MRI findings: normal (n = 55), supratentorial abnormalities (n = 17), and infratentorial abnormalities (n = 19). The patients' clinical features, including the direction of head thrusts, smooth pursuit, optokinetic response (OKR), tone, development, and coordination, were compared and analyzed among the MRI groups using the χ² test. Horizontal head thrusts were significantly more common in patients with infratentorial abnormalities or normal brain MRI, whereas vertical head thrusts were more common among patients with supratentorial abnormalities (P<0.0001). The slow phases of the OKR were significantly more likely to be impaired in patients with supratentorial or infratentorial abnormalities than in those with a normal MRI (P = 0.011). Other neuro-ophthalmological, neurological, and developmental features were similar among patients in the 3 neuroimaging groups.
The direction of head thrust and the integrity of the slow phases of the OKR are useful clinical indicators of possible sites of abnormality on brain MRI in patients with ISID.
closed_qa
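A minimal sketch of the χ² test the ISID record above applies to categorical clinical features across the three MRI groups, using scipy. The contingency counts are hypothetical, since the abstract reports only the P values, not the underlying table.

```python
# Chi-squared test of independence on a features-by-MRI-group table.
# The counts below are hypothetical stand-ins, not data from the study.
from scipy.stats import chi2_contingency

# rows: head-thrust direction (horizontal, vertical)
# cols: MRI group (normal, supratentorial, infratentorial)
table = [[40, 3, 15],
         [5, 12, 2]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.2g}")
```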
Listeria and enterococcal infections in neonates 28 days of age and younger: is empiric parenteral ampicillin still indicated?
Empiric parenteral ampicillin has traditionally been used to treat listeria and enterococcal serious bacterial infections (SBI) in neonates 28 days of age or younger. Anecdotal experience suggests that these infections are rare, and existing data suggest increasing resistance to ampicillin. Guidelines advocating the routine use of empiric ampicillin may need to be revisited. This study aimed to describe the epidemiology and ampicillin sensitivity of listeria and enterococcal infections in neonates 28 days of age and younger who presented to 2 pediatric emergency departments (ED) in Michigan. We conducted a 2-center, retrospective chart review (2006-2010) of neonates 28 days of age or younger who were evaluated for SBI in the ED. We abstracted and compared relevant demographic, historical and physical details, laboratory test results, and antibiotic sensitivity patterns to ampicillin from the eligible patient records. We identified SBI in 6% (72/1192) of neonates evaluated, of whom 0.08% (1/1192) had enterococcal bacteremia and 0.08% (1/1192) had listeria bacteremia. A total of 1.4% (15/1192) of patients had enterococcal urinary tract infection (UTI). Urinalysis is less helpful as a screening tool for enterococcal UTI than for Escherichia coli UTI (P<0.001). Seventy-three percent (11/15) of urine isolates showed an increase in minimal inhibitory concentration, indicating gradual development of resistance to ampicillin.
Listeria is an uncommon cause of neonatal SBI in febrile neonates who presented to the ED. Empiric use of ampicillin may need to be reconsidered if national data confirm very low listeria and enterococcal prevalence and high ampicillin resistance patterns.
closed_qa
Can a simple urinalysis predict the causative agent and the antibiotic sensitivities?
The objective of this study was (1) to determine the reliability of urinalysis (UA) for predicting urinary tract infection (UTI) in febrile children, (2) to determine whether UA findings can predict Escherichia coli versus non-E. coli urinary tract infection, and (3) to determine whether empiric antibiotics should be selected based on E. coli versus non-E. coli infection predictions. This was a retrospective chart review of children 2 months to 2 years of age who presented to the emergency department with fever (rectal temperature >100.4°F) and had a positive urine culture, conducted between January 2004 and December 2007. Negative UA was defined as urine white blood cell count less than 5 per high-power field, negative leukocyte esterase, and negative nitrites. Urine cultures were classified into E. coli and non-E. coli groups, which were compared for sex, race, and UA findings. Multivariate forward logistic regression, using the Wald test, was performed to calculate the likelihood ratio (LR) of each variable (eg, sex, race, UA parameters) in predicting UTI. In addition, antibiotic sensitivities between the groups were compared. Of 749 medical records reviewed, 608 were included; UA was negative, UA(-), in 183 cases and positive, UA(+), in 425 cases. Furthermore, 424 cases were caused by E. coli, and 184 were due to non-E. coli organisms. Among 425 UA(+) cases, E. coli was identified in 349 (82.1%), whereas non-E. coli organisms were present in 76 (17.9%); in contrast, of 183 UA(-) cases, 108 (59%) were due to non-E. coli organisms versus 75 (41%) caused by E. coli. Urinalysis results were shown to be associated with organism group (P<0.001). Positive leukocyte esterase had an LR of 2.5 (95% confidence interval [CI], 1.5-4.2), positive nitrites had an LR of 2.8 (95% CI, 1.4-5.5), and urine white blood cell count had an LR of 1.8 (95% CI, 1.3-2.4) in predicting E. coli versus non-E. coli infection. Comparison of antibiotic sensitivities between the UA groups demonstrated superior sensitivity in the UA(+) group for cefazolin (94.7% vs 84.0% in the UA(-) group; P<0.0001), cefuroxime (98.2% vs 91.7%; P<0.001), and nitrofurantoin (96.1% vs 82.2%; P<0.0001). In contrast, the UA(-) group showed significantly greater sensitivity to trimethoprim-sulfamethoxazole (82.2% vs 71.3% in UA(+); P = 0.008).
Urinalysis is not an accurate predictor of UTI. A positive urine culture in the presence of negative UA most likely grew non-E. coli organisms, whereas most UA(+) results were associated with E. coli. This study also highlighted local patterns of antibiotic resistance between E. coli and non-E. coli groups. Negative UA results in the presence of strong suspicion of a UTI suggest a non-E. coli organism, which may be best treated with trimethoprim-sulfamethoxazole. Conversely, UA(+) results suggest E. coli, which calls for treatment with cefazolin or cefuroxime.
closed_qa
Can Doppler flow parameters of carotid stenosis predict the occurrence of new ischemic brain lesions detected by diffusion-weighted MR imaging after filter-protected internal carotid artery stenting?
Carotid angioplasty and stent placement are increasingly being used for the treatment of symptomatic and asymptomatic carotid artery disease. Carotid angioplasty and stent placement carry an inherent risk of distal cerebral embolization, precipitating new brain ischemic lesions and neurologic symptoms. Our purpose was to evaluate the frequency of new ischemic lesions found on diffusion-weighted imaging after protected carotid angioplasty and stent placement and to determine the association of new lesions with ICA Doppler flow parameters. Fifty-two patients (mean age, 68 ± 11 years) with 50%-69% (n = 20, group 1) and ≥70% (n = 32, group 2) internal carotid artery stenosis underwent carotid angioplasty and stent placement with distal filter protection. DWI was performed before and 48 hours after carotid angioplasty and stent placement. Thirty-three (63.4%) patients showed new lesions. The average number of new postprocedural lesions was 3.4 per patient. Most of the postprocedural lesions were <5 mm (range, 3-23 mm), cortical and corticosubcortical, and clinically silent. Group 2 had a significantly higher number of new lesions compared with group 1 (P<.001). A significant relationship was found between ICA Doppler flow parameters and the appearance of new lesions.
The appearance of new ischemic lesions was significantly related to the Doppler flow parameters, particularly peak systolic velocity.
closed_qa
NOD2 gene mutations in ulcerative colitis: useless or misunderstood?
NOD2 mutations have been linked to an increased risk of Crohn's disease and to some of its phenotypes. The association between NOD2 mutations and susceptibility to ulcerative colitis (UC) remains somewhat controversial, and potential correlations between these mutations and UC phenotype have not been studied. AIM: To assess whether NOD2 mutations are a risk factor for UC in Portugal and whether there are any genotype-phenotype correlations in these patients. The three main NOD2 mutations were searched for in 200 patients with UC and in 202 healthy controls. NOD2 mutations were present in 28 patients with UC (14.0%) and in 27 controls (13.4%) (p = 0.853). Mutation carriers were more likely to receive steroids during the first year of disease than non-carriers (54.2% vs. 29.6%, p = 0.018), and among these patients the need for intravenous administration was more frequent in those with the R702W polymorphism (90.0% vs. 45.5%, p = 0.014). In patients with severe colitis admitted for intravenous steroids, a greater proportion of mutation carriers were considered intravenous-steroid refractory and required salvage therapy (90.0% vs. 38.1%, p = 0.004). Patients with NOD2 mutations underwent colectomy more frequently than non-carriers (17.9% vs. 4.1%, p = 0.015). No correlation with the need for immunosuppressants/immunomodulators was found.
In the Portuguese population, NOD2 mutations do not increase the risk of UC but are associated with a more aggressive course including greater need of steroids in the first year, increased incidence of intravenous-steroid refractoriness and a higher colectomy rate.
closed_qa
Is tibial tuberosity-trochlear groove distance an appropriate measure for the identification of knees with patellar instability?
Tibial tuberosity-trochlear groove distance (TT-TG) has been regarded as a useful tool for establishing therapeutic choices for patellar instability. Recently, it has been shown that TT-TG correlates negatively with the quadriceps angle, suggesting that, if used individually, neither provides a valid measure of instability. This study aimed to compare TT-TG distance between the two knees of patients with unilateral instability to assess whether this measurement is a decisive element in management decisions for patellar instability. Sixty-two patients (18 male and 44 female), reporting to a specialist patella clinic for recurrent unilateral patellar instability, were included in the study. Patients underwent bilateral long-leg computed tomography scans to determine TT-TG distance in both knees. TT-TG in the symptomatic and asymptomatic knees of the same individual was compared statistically. Mean TT-TG distance in the symptomatic knee was 16.9 (±4.9) mm, compared to 15.6 (±5.6) mm in the asymptomatic knee. TT-TG was not significantly different between stable and unstable knees (n.s.).
The lack of difference in TT-TG distance between stable and unstable knees suggests that TT-TG distance alone may not be a decisive element in establishing therapeutic choices for patellar instability. It should, therefore, be interpreted with caution during clinical evaluations.
closed_qa
Renal denervation using an irrigated catheter in patients with resistant hypertension: a promising strategy?
Systemic hypertension is an important public health problem and a significant cause of cardiovascular mortality. Its high prevalence and the low rates of blood pressure control have prompted the search for alternative therapeutic strategies. Percutaneous renal sympathetic denervation has emerged as a promising option for the treatment of patients with resistant hypertension. The objective was to evaluate the feasibility and safety of renal denervation using an irrigated catheter. Ten patients with resistant hypertension underwent the procedure. The primary endpoint was safety, as assessed by periprocedural adverse events, renal function and renal vascular abnormalities at 6 months. The secondary endpoints were changes in blood pressure levels (office and ambulatory monitoring) and in the number of antihypertensive drugs at 6 months. The mean age was 47.3 (± 12) years, and 90% of patients were women. In the first case, renal artery dissection occurred as a result of trauma from the long sheath; no further cases were observed after technical adjustments, reflecting a learning-curve effect. No cases of thrombosis, renal infarction or death were reported. Elevation of serum creatinine levels was not observed during follow-up. At 6 months, one case of significant renal artery stenosis with no clinical consequences was diagnosed. Renal denervation reduced office blood pressure levels by 14.6/6.6 mmHg on average (p = 0.4 for both systolic and diastolic blood pressure). Blood pressure levels on ambulatory monitoring decreased by 28/17.6 mmHg (p = 0.02 and p = 0.07 for systolic and diastolic blood pressure, respectively). A mean reduction of 2.1 antihypertensive drugs was observed.
Renal denervation is feasible and safe in the treatment of resistant systemic arterial hypertension. Larger studies are required to confirm our findings.
closed_qa
Is the long-term prognosis of transient ischemic attack or minor ischemic stroke affected by the occurrence of nonfocal symptoms?
In patients with a transient ischemic attack or ischemic stroke, nonfocal neurological symptoms, such as confusion and nonrotatory dizziness, may be associated with a higher risk of vascular events. We assessed the relationship between nonfocal symptoms and the long-term risk of vascular events or death in patients with a transient ischemic attack or minor ischemic stroke. We related initial symptoms to outcome events in 2409 patients with a transient ischemic attack (n=723) or minor ischemic stroke (n=1686) included in the Life Long After Cerebral ischemia cohort. All patients underwent a standardized interview on the occurrence of focal and nonfocal neurological symptoms during the qualifying event. The primary outcome was the composite of any stroke, myocardial infarction, or vascular death. Secondary outcomes were all-cause death, vascular death, cardiac death, myocardial infarction, and stroke. Hazard ratios were calculated with Cox regression. Focal symptoms were accompanied by nonfocal symptoms in 739 (31%) patients. During a mean follow-up of 10.1 years, the primary outcome occurred in 1313 (55%) patients. There was no difference in the risk of the primary outcome between patients with both focal and nonfocal symptoms and patients with focal symptoms alone (adjusted hazard ratio, 0.97; 95% confidence interval, 0.86-1.09; P=0.60). The risk of each of the secondary outcomes was also similar in both groups.
About one third of patients with a transient ischemic attack or minor ischemic stroke have both focal and nonfocal neurological symptoms. Nonfocal symptoms are not associated with an increased long-term risk of vascular events or death.
closed_qa
Is lymph-node ratio a superior predictor than lymph node status for recurrence-free and overall survival in patients with head and neck squamous cell carcinoma?
TNM status is questioned as an exact predictor of survival in different tumour entities. Recently, lymph node ratio (LNR) has been described as a predictor of survival in patients with HNSCC. The purpose of this study was to evaluate to what degree LNR could be used as a more accurate predictor than TNM staging. A total of 291 patients, with a follow-up of at least 3 years, were analyzed using the log-rank statistic, univariate and multivariate analyses, and p values, to assess the predictive value of lymph node ratio for overall and recurrence-free survival. Survival differed significantly when patients were stratified by LNR. The impact of LNR on survival was significantly different even in patients with extracapsular spread. Patients with pN0 had no survival benefit compared with patients with pN1 or higher who had an LNR lower than 6%.
LNR is a prognostic tool in patients with a lymph node status of pN0-pN2b. LNR remained significant even in patients with extracapsular spread, contrary to TNM status. With LNR, stratification of high-risk patients (LNR higher than 6%) can be performed easily. We would suggest using LNR in clinical routine.
closed_qa
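The LNR record above stratifies patients at a ratio of 6%. A small sketch of that computation; the node counts in the example are hypothetical.

```python
# Lymph node ratio: positive nodes / total nodes examined, with the 6%
# cut-off the authors propose for high-risk stratification.
def lymph_node_ratio(positive: int, examined: int) -> float:
    if examined <= 0:
        raise ValueError("number of examined nodes must be positive")
    return positive / examined

HIGH_RISK_CUTOFF = 0.06

lnr = lymph_node_ratio(positive=3, examined=42)   # hypothetical counts -> ~0.071
print(f"LNR = {lnr:.1%}; high risk: {lnr > HIGH_RISK_CUTOFF}")
```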
Singapore rhabdomyosarcoma (RMS) experience: shall we change our practice?
Although rhabdomyosarcoma (RMS) constitutes nearly 4% of all children diagnosed with cancer in the ethnically diverse small island city of Singapore, it is unknown how children with RMS fare. This study investigated 50 children with RMS treated from April 1993 to December 2010 at KK Women's and Children's Hospital (KKH) and National University Hospital (NUH). They were treated as per either Intergroup Rhabdomyosarcoma Study Group (IRSG) or Société Internationale Pediatrique D'Oncologie (SIOP) regimens. Median age at diagnosis was 5.1 years (range, 0.1 to 17.3 years) with a median follow-up of 3.3 years (range, 0.4 to 15.6 years). According to the IRSG classification, 18 (36%) were staged as low-risk (LR), 19 (38%) were intermediate-risk (IR), 12 (24%) were high-risk (HR) and the stage was unknown in 1 patient. Twenty-nine (58%) were of embryonal subtype, 17 (34%) were alveolar, and subclassification was not available in 4. The primary sites of tumour were: head and neck region (n = 22); genitourinary (n = 19); extremity (n = 10); and abdomen/retroperitoneal (n = 5). At the time of analysis, 80% were alive with no evidence of disease, 9 were dead of disease, and 2 were alive with disease. By disease risk group, the 5-year event-free survival (EFS) was 81.3% (95% CI, 62.0 to 100.0) for the LR group, 61.4% (95% CI, 32.3 to 90.4) for the IR group and 25.0% (95% CI, 0.0 to 49.5) for the HR group (P<0.001). The 5-year EFS by risk group for chemotherapy received as per SIOP vs per IRSG was: LR 83.3% vs 75.0% (P = 0.787); IR 83.3% vs 43.8% (P = 0.351); HR 0.0% vs 42.9% (P = 0.336). Of 15 relapses (HR, n = 7), at a median of 2 years, 4 of 6 patients treated as per the SIOP regimen were dead of disease and 3 of 8 treated as per IRSG were alive.
Radiation therapy (RT) can be avoided in LR disease, although patients in higher-risk groups need RT to local and distant metastatic disease. The outcome of children with RMS in Singapore can be further improved by coming together as a cooperative group to provide the best total care. Improved communication, multidisciplinary team collaboration, standardisation of protocols and rigorous data collection are key.
closed_qa
Do we pay our community preceptors?
Family medicine clerkships depend heavily on community-based family physician preceptors to teach medical students. These preceptors have traditionally been unpaid, but in recent years some clerkships have started to pay preceptors. This study determines trends in the number and geographic region of programs that pay their community preceptors, identifies reasons programs pay or do not pay, and investigates perceived advantages and disadvantages of payment. We conducted a cross-sectional, electronic survey of 134 family medicine clerkship directors at allopathic US medical schools. The response rate was 62% (83/132 clerkship directors). Nineteen of these (23%) currently pay community preceptors, 11 of whom are located in either New England or the South Atlantic region. Sixty-three percent of programs that pay report that their community preceptors are also paid for teaching other learners, compared with 32% of programs that do not pay. Paying respondents displayed more positive attitudes toward paying community preceptors, though a majority of non-paying respondents indicated they would pay if they had the financial resources.
The majority of clerkships do not pay their community preceptors to teach medical students, but competition from other learners may drive more medical schools to consider payment to help with preceptor recruitment and retention. Medical schools located in regions where there is competition for community preceptors from other medical and non-medical schools may need to consider paying preceptors as part of recruitment and retention efforts.
closed_qa
Is confirmatory testing of Roche cobas 4800 CT/NG test Neisseria gonorrhoeae positive samples required?
Recently marketed nucleic acid amplification tests (NAAT) for the detection of Neisseria gonorrhoeae (NG) have improved specificity over previous generation assays. A study to assess the necessity for confirmation of Roche cobas 4800 NG positive samples was undertaken by the Public Health Wales Microbiology Molecular Diagnostic Unit in Cardiff. Classical NG culture identification was compared to cobas 4800 (DR-9), opacity (opa) gene and porA pseudogene (pap) results. Confirmatory NAATs (opa/pap) were performed prospectively for 120 cobas 4800 NG positive urogenital and extragenital samples. Retrospective supplementary NAAT and sequence analysis of additional cobas 4800 NG positive extragenital samples was also carried out. Of the 188 classically identified clinical NG isolates, 184 were identified as NG in all 3 molecular targets. Two isolates were only detected by 2 molecular targets. A further 2 isolates were culture false-positives. Combining the results from prospective and retrospective testing, the sensitivity and negative predictive value for cobas 4800 NG detection for urogenital, rectal and oropharyngeal samples was 100%. Specificity for all sample types was greater than 99.7%. Positive predictive value was 96.0% and 96.4% for urogenital and rectal specimens, respectively, and 88.6% for oropharyngeal samples.
Molecular tests could be used for culture confirmation where available. Roche cobas 4800 Chlamydia trachomatis/Neisseria gonorrhoeae (CT/NG) gonorrhoea diagnosis is superior to culture, with urogenital and rectal positives not requiring confirmation. The Roche cobas 4800 oropharyngeal NG detection findings warrant further prospective study of routine confirmatory testing, accounting for its cost and clinical usefulness.
closed_qa
Hospice inpatients' views on physical examination by medical students: is it acceptable?
Hospices are increasingly involved in medical student teaching, which the patients generally enjoy. No studies have specifically investigated how hospice patients view the prospect of physical examination by students. Previous evidence involves patients who have already seen students, while the views of other patients are unknown. This study aimed to provide an initial understanding of the views of a diverse group of hospice inpatients on the acceptability and perceived importance of students physically examining them. 42 hospice inpatients completed a short questionnaire focusing on their views of medical students examining them. Patients chose to do this alone or via a short interview. All inpatients at Exeter Hospice were considered eligible, including patients who were asked and those who may not have been asked to see students; all 42 patients completed the study. In accordance with existing evidence, patients generally held positive views about seeing students. However, many patients expressed concerns about being physically examined by students, specifically including that it might be painful, tiring or embarrassing. Most importantly, several patients who did not wish to be examined by medical students said they would feel obliged to accept it, or would find it difficult to decline.
Hospice inpatients generally wish to be involved in medical student teaching, but many are concerned about being physically examined, and some feel a sense of obligation to participate. There are implications for hospices that teach students. Further research is necessary to investigate the frequency and severity of these concerns.
closed_qa
Does amount of weight gain during pregnancy modify the association between obesity and cesarean section delivery?
Two-thirds of reproductive-aged women in the United States are overweight or obese and at risk for numerous associated adverse pregnancy outcomes. This study examined whether the amount of weight gained during pregnancy modifies the prepregnancy body mass index (BMI)-cesarean delivery association. A total of 2,157 women aged 18-45 who participated in the 2008-2009 North Carolina Pregnancy Risk Assessment Monitoring System had complete information on prepregnancy BMI, maternal weight gain, and mode of delivery on infant birth certificates. Logistic regression was used to obtain odds ratios (ORs) and 95 percent confidence intervals (CIs) to model the association between prepregnancy BMI and cesarean delivery, and a stratified analysis was conducted to determine whether maternal weight gain was an effect modifier of the prepregnancy BMI-cesarean delivery association. Obese women had 1.78 times the odds of cesarean delivery as compared with women with a normal BMI (95% CI: 1.44-2.16). When adjusted for race/ethnicity, live birth order, household income, and education, the association increased in magnitude and remained statistically significant (OR = 2.01, 95% CI: 1.63-2.43). In stratified analyses, the obesity-cesarean delivery association persisted and remained statistically significant among all maternal weight gain categories.
Health care practitioners should stress the importance of achieving a healthy prepregnancy weight and gaining an appropriate amount of weight during pregnancy to reduce the risk of cesarean delivery and other adverse pregnancy outcomes.
closed_qa
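The record above reports crude and adjusted odds ratios for the obesity-cesarean association. As a complement to the adjusted model, a sketch of the crude odds ratio with a Woolf (log) confidence interval from a 2×2 table; the cell counts are hypothetical, chosen only so the point estimate lands near the reported OR of 1.78.

```python
# Crude odds ratio and 95% Woolf CI from a 2x2 table:
#               cesarean   vaginal
#   obese           a         b
#   normal BMI      c         d
# The counts below are hypothetical, not the PRAMS data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)           # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

print(odds_ratio_ci(180, 420, 230, 955))            # -> OR ~ 1.78
```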
Risks and benefits of hormone therapy: has medical dogma now been overturned?
In an integrated overview of the benefits and risks of menopausal hormone therapy (HT), the Women's Health Initiative (WHI) investigators have claimed that their 'findings … do not support use of this therapy for chronic disease prevention'. In an accompanying editorial, it was claimed that 'the WHI overturned medical dogma regarding menopausal [HT]'. To evaluate those claims, epidemiological criteria of causation were applied to the evidence. A 'global index' purporting to summarize the overall benefit versus the risk of HT was not valid, and it was biased. For coronary heart disease, an increased risk in users of estrogen plus progestogen (E + P), previously reported by the WHI, was not confirmed. The WHI study did not establish that E + P increases the risk of breast cancer; the findings suggest that unopposed estrogen therapy (ET) does not increase the risk, and may even reduce it. The findings for stroke and pulmonary embolism were compatible with an increased risk, and among E + P users there were credible reductions in the risk of colorectal and endometrial cancer. For E + P and ET users, there were credible reductions in the risk of hip fracture. Under 'worst case' and 'best case' assumptions, the changes in the incidence of the outcomes attributable to HT were minor.
Over-interpretation and misrepresentation of the WHI findings have damaged the health and well-being of menopausal women by convincing them and their health professionals that the risks of HT outweigh the benefits.
closed_qa
Does the transition into daylight saving time really cause partial sleep deprivation?
A total of 378 students answered the Morningness-Eveningness Questionnaire (MEQ) to determine their chronotype and kept a diary of sleep-wake schedules for 1 week before and after the DST transition. Oral mucosal cell samples were collected for genetic analysis. After the DST transition, intermediate types (I-types) delayed bedtime and increased their time in bed, and all groups delayed their wake-up time. All groups presented a shorter phase angle between sunset and bedtime after the DST transition. On the other hand, only evening types (E-types) showed a tendency to reduce the phase angle between sunrise and wake-up time, while I-types and morning types (M-types) kept the same phase angles between sunrise and wake-up time after the DST transition. The polymorphisms in the human genes CLOCK and PER3 were not associated with individual differences in sleep patterns, nor were they associated with adjustment to the DST transition.
Under the new set of social times determined by DST, the adjustment was only partial. I-types delayed bedtime and all groups delayed their wake-up times after the beginning of DST. Consequently, time in bed after the DST transition was not reduced: M-types and E-types kept the same time in bed, and I-types showed an increase in it.
closed_qa
Does rurality influence treatment decisions in early stage laryngeal cancer?
The mortality rate of laryngeal cancer has been trending downward with the use of more effective surgical, radiation, and systemic therapies. Although the best treatment for this disease is not entirely clear, there is a growing consensus on the value of primary radiotherapy as an organ preservation strategy. This study examines urban-rural differences in the use of radiotherapy as the primary treatment for early stage laryngeal cancer in Pennsylvania. The sample was drawn from the Pennsylvania tumor registry, which lists 2,437 laryngeal cancer patients diagnosed from 2001 to 2005. We selected 1,705 adults with early stage squamous cell carcinoma of the larynx for our analysis. Demographic data and tumor characteristics were included as control variables in multivariate analyses. Rurality was assigned by ZIP code of patient residence. Controlling for demographic and clinical factors, rural patients were less likely than urban patients to receive radiotherapy as the primary treatment modality for early stage larynx cancer (OR 0.740, 95% CI 0.577-0.949, P = .0087). No other associations between rural status and treatment choice were statistically significant.
Relatively fewer rural patients with larynx cancer are treated primarily with radiation therapy. Further investigations to describe this interaction more thoroughly, and to see if this observation is found in larger population data sets, are warranted.
closed_qa
Do breast cups improve breast cancer dosimetry?
Treating patients with large or pendulous breasts is challenging. Although brassiere cups are currently in use, no study has yet assessed their dosimetric impact. The aim of the present study was to evaluate the possible dosimetric advantages of breast cups in patients with large or pendulous breasts. Two CT studies were carried out on 12 breast cancer patients with large or pendulous breasts, one involving the use of breast cups. Radiation plans were developed for each of the CT studies. The following were compared: planning target volume (PTV), volume irradiated by the 95% isodose, conformity index, homogeneity index, mean lung dose and, for left breast treatment, mean heart dose. The plan involving the use of cups was found to be the best option, leading to all patients being treated with cups. The resulting acute toxicity and cosmesis were also recorded. Both scenarios involved the use of film dosimetry to evaluate the skin doses. The use of breast cups resulted in a significant reduction of the PTV (from 1640 cm3 to 1283 cm3), of the irradiated volume (from 2154 cm3 to 1477 cm3) and of the conformity index (from 1.383 to 1.213). Despite a slight improvement in the homogeneity index (from 0.12 to 0.10), statistical significance was not attained. The use of breast cups also led to significant dose reductions in V20 for lung (from 13.7% to 1.7%) and V5 for heart (from 9.8% to 2.7%). No differences in acute toxicity or cosmesis were observed compared to patients treated without cups.
Our results show that the use of brassiere cups during breast radiation therapy leads to improvements in the main dosimetric factors analyzed. Furthermore, modifications to standard irradiation protocols are not required. In summary, we consider the technique of using breast cups with radiation therapy highly appropriate when treating breast cancer patients with large or pendulous breasts.
closed_qa
Cognitive recovery after severe traumatic brain injury in children/adolescents and adults: similar positive outcome but different underlying pathways?
Does younger age at the time of severe traumatic brain injury (STBI) protect from cognitive symptoms? To answer this question, the authors compared the neuropsychological profile of late school-age children/adolescents and young adult patients at mid- and long-term recovery periods (6 and 12 months post-STBI). Twenty-eight children/adolescents and 26 clinically matched adults were tested on measures of general intelligence, attention, executive functions, visuoperceptual, visuospatial and visuoconstructive abilities. Coma duration and the post-acute Glasgow Outcome Scale (GOS) score were used as predictor variables in a series of regression analyses. Children/adolescents and adults similarly improved on most measures, except for visuospatial and visuoconstructive skills, which worsened in time for children/adolescents. Coma duration significantly predicted performance IQ and visuoperceptual scores in children/adolescents. The GOS score significantly predicted performance and verbal IQ, sustained attention, visuoconstructive and long-term memory skills. Coma duration predicted executive function skills in both age groups.
(1) No evidence was found for a neuroprotective effect of younger age at STBI; and (2) Coma duration and GOS score predicted neuropsychological recovery in children/adolescents and adults, respectively. This suggests the existence of underlying age-specific recovery processes after STBI.
closed_qa
Can factors affecting complication rates for ureteric re-implantation be predicted?
To determine preoperative predictive factors of postoperative complications of ureteric re-implantation in children by using the modified Clavien classification system (MCCS), which has been widely used for complication rating of surgical procedures. In all, 383 children who underwent ureteric re-implantation for vesico-ureteric reflux (VUR) and obstructing megaureters between 2002 and 2011 were included in the study. Intravesical and extravesical ureteric re-implantations were performed in 338 and 45 children, respectively. Complications were evaluated according to the MCCS. Univariate and multivariate analyses were used to determine predictive factors affecting complication rates. In all, 247 girls and 136 boys were studied. The mean (sd) age was 46 (25) months and the mean (sd) follow-up was 49.4 (27.8) months. The mean (sd) hospitalisation time was 4.7 (1.6) days. Complications occurred in 76 (19.8%) children; 34 (8.9%) were MCCS grade I, 22 (5.7%) were grade II and 20 (5.2%) were grade III. Society of Fetal Urology (SFU) grade 3-4 hydronephrosis, obstructing megaureters, a tailoring-tapering and folding procedure, refractory voiding dysfunction and a duplex system were statistically significant predictors of complications on univariate analysis. Prior injection history, paraureteric diverticula, stenting, gender, age, operation technique (intra vs extravesical) were not significant predictors of complications. In the multivariate analysis refractory voiding dysfunction, a tailoring-tapering and folding procedure, obstructing megaureters (diameter of >9 mm) and a duplex system were statistically significant predictors of complications.
Ureteric re-implantation remains a valid option for the treatment of certain patients with VUR. Refractory voiding dysfunction, a tailoring-tapering and folding procedure, obstructing megaureters (diameter of >9 mm) and associated duplex systems were the main predictive factors for postoperative complications. Use of a standardised complication grading system, such as the MCCS, should be encouraged to allow the valid comparison of complication rates between series.
closed_qa
Current status of L. infantum infection in stray cats in the Madrid region (Spain): implications for the recent outbreak of human leishmaniosis?
Since 2009, the incidence of human leishmaniosis in the SW of the Madrid region has been unusually high. Although dogs are the main reservoir for this disease, a role played by dogs in this outbreak has been ruled out, and investigators are now considering other hosts (e.g. cats, rabbits, hares) as possible alternative reservoirs. This study was designed to examine the Leishmania infantum status of stray cats in Madrid to assess its possible implications in the human leishmaniosis outbreak. A total of 346 captured stray cats were tested for antibodies against L. infantum by the indirect fluorescent antibody technique (IFAT), and nested-PCR methods were used to detect Leishmania DNA in blood samples of cats testing seropositive for L. infantum and/or retrovirus infection. Cats were also tested for Toxoplasma gondii using the direct agglutination test (DAT) and for feline leukemia virus (FeLV) antigen and feline immunodeficiency virus (FIV) antibodies (PetChek FIV/FeLV). The presence of intestinal parasites was determined using a routine coprological method. The seroprevalence of L. infantum infection (cut-off ≥ 1/100) was 3.2% (11/346). However, it was not possible to amplify Leishmania DNA in any of the blood samples. Seropositivity was not associated with sex, age, capture site, clinical status, retrovirus infection or T. gondii seropositivity. Of the 11 cats seropositive for L. infantum, 3 also tested positive for FIV, none for FeLV and 6 for T. gondii. The prevalence of FeLV p27 antigen was 4% and that of FIV antibody was 9.2%. Although the seroprevalence of T. gondii was quite high at 53.5%, no T. gondii oocysts were found in any of the faeces samples analysed (n = 287). In contrast, intestinal parasites were detected in 76 (26.5%) samples, Toxocara cati being the most prevalent.
Our results suggest a stable L. infantum infection situation among the stray cats of the Madrid area; the disease is uncommon and no clinical cases have been reported to date. The detection of other zoonotic parasites such as T. gondii and T. cati in stray cats indicates a need to adopt strict control measures in this population.
closed_qa
Mixed state discrimination: a DSM problem that won't go away?
DSM's replacement of 'mixed episodes' with 'mixed features' has ironically created a specifier, which potentially lacks specificity because it overlooks two key symptoms: psychomotor agitation and distractibility. Therefore, the present study examined the presence of psychomotor agitation and distractibility across the mood disorder spectrum. Two hundred patients were diagnosed and assigned to one of three groups (depression, bipolar spectrum disorder (BDspectrum) and bipolar disorder) based on clinical evaluation by a psychiatrist. On the basis of MDQ scores, the depression group was then further subdivided into two groups: unipolar depression (UP) and mixed depression (UPmix). These four groups were then compared to examine the relative distribution of psychomotor agitation and distractibility. Participants underwent a clinical evaluation by a psychiatrist and completed a series of questionnaires. Increased distraction, racing thoughts, and increased irritability were the most commonly reported manic symptoms amongst the unipolar depression group. Further, UPmix and BDspectrum had significantly higher psychomotor agitation and distractibility than the other two groups. The present study depended on self-report measures and did not include standardised measures of distractibility and psychomotor agitation. Future research needs to examine pure unipolar patients without any manic symptoms to clarify further how different this group would be from those with mixed features.
The present findings suggest that distractibility and psychomotor agitation may represent the core of mixed states, as they are more common in patients with mixed depression and bipolar spectrum disorder than patients diagnosed with unipolar depression and bipolar I disorder. Future research and clinical implications are discussed.
closed_qa
Incidence of endophthalmitis after intravitreal injection: is antibiotic prophylaxis mandatory?
Endophthalmitis is the most dreaded complication after intravitreal injection. With the growing use of anti-angiogenic agents, the number of injections performed each year continues to rise. The use of antibiotic prophylaxis is controversial. We evaluated the impact of antibiotic prophylaxis on the incidence of endophthalmitis after intravitreal injection. All patients who received intravitreal injections between January 2007 and October 2012 were included in this retrospective study. Until June 2012, all patients received antibiotics in the days following the injection. From July 2012, the antibiotic was replaced by an antiseptic administered immediately after the injection. A total of 11,450 injections were performed. The overall rate of endophthalmitis was 6/11,450 (0.052%). The incidence of endophthalmitis in the group with antibiotics was 3/10,144 injections (0.03%), of which 2 were culture-proven (0.02%). The incidence in the group without antibiotics was 3/1,306 (0.23%). The difference was significant (P=0.024).
The incidence of endophthalmitis post-intravitreal injections seems to be lower when using antibiotics. However, a prospective study is mandatory to draw more robust conclusions.
closed_qa
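The incidence comparison in the record above (3/10,144 injections with antibiotics vs 3/1,306 without, P=0.024) is a standard two-proportion problem. A minimal sketch with SciPy, using only the counts reported in the abstract; the variable names are mine, not the authors', and this is an illustration rather than the study's actual analysis code.

```python
from scipy.stats import fisher_exact

# 2x2 table: rows = prophylaxis strategy, cols = [endophthalmitis, no endophthalmitis]
with_antibiotics = [3, 10_144 - 3]
antiseptic_only = [3, 1_306 - 3]

odds_ratio, p_value = fisher_exact([with_antibiotics, antiseptic_only])

print(f"incidence with antibiotics: {3 / 10_144:.3%}")  # ~0.030%
print(f"incidence with antiseptic:  {3 / 1_306:.3%}")   # ~0.230%
print(f"two-sided Fisher exact p = {p_value:.3f}")      # should land near the reported 0.024
```

Fisher's exact test is the natural choice here because the expected cell counts are tiny (six events in 11,450 injections), where a chi-square approximation would be unreliable.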
Hepatic resection for hepatocellular carcinoma: do contemporary morbidity and mortality rates demand a transition to ablation as first-line treatment?
Despite the rising incidence of hepatocellular carcinoma (HCC), challenges and controversy persist in optimizing treatment. As recent randomized trials suggest that ablation can have oncologic equivalence compared with resection for early HCC, the relative morbidity of the 2 approaches is a central issue in treatment decisions. Although excellent contemporary perioperative outcomes have been reported by a few hepatobiliary units, it is not clear that they can be replicated in broader practice. Our objective was to help inform this treatment dilemma by defining perioperative outcomes in a broader set of patients as represented in NSQIP-participating institutions. Mortality and morbidity data were extracted from the 2005-2010 NSQIP Participant Use Data Files based on Current Procedural Terminology codes (hepatectomy and ablation) and ICD-9 codes (HCC). Perioperative outcomes were reviewed, and factors associated with morbidity and mortality were identified with multivariable logistic regression. Of the 1,604 patients identified, 837 (52%) underwent minor hepatectomy, 444 (28%) underwent major hepatectomy, and 323 (20%) underwent surgical ablation. Mortality rates were 3.4% for minor hepatectomy, 3.7% for ablation, and 8.3% for major hepatectomy (p<0.01). Major complication rates were 21.3% for minor hepatectomy, 9.3% for ablation, and 35.1% for major hepatectomy (p<0.01). When controlling for confounders, ablation was associated with decreased mortality (adjusted odds ratio = 0.20; 95% CI, 0.04-0.97; p = 0.046) and major complications (adjusted odds ratio = 0.34; 95% CI, 0.22-0.52; p<0.001).
Exceedingly high complication rates after major hepatectomy for HCC exist in the broader NSQIP treatment environment. These data strongly support the use of parenchymal-sparing minor resections or ablation over major hepatectomy for early HCC when feasible.
closed_qa
Do the abnormal results of an implantation renal biopsy affect the donor renal function?
Living kidney donation has become an important source for renal transplantation. Thus, renal function after donation is an important issue. In this study, we examined histological abnormalities in implantation biopsy specimens from living kidney donors and analyzed the renal function of the remaining kidney. Using the 2007 Banff classification system, we analyzed 121 kidneys from living donors who underwent implantation biopsies (IBs) between 2010 and 2011. Donor characteristics, intraoperative factors, and perioperative renal functions, such as serum creatinine and glomerular filtration rate (GFR), were evaluated. Univariate and multivariate regression analyses were performed to identify the factors related to each histological abnormality and to 1-year postoperative donor renal function. Most histological abnormalities in healthy living donors were scored as 1 on the Banff scale. Univariate and multivariate analyses revealed that donor age was the only preoperative factor related to tubular atrophy (odds ratio [OR] = 1.104; P = .012) and glomerular sclerosis (OR = 1.050; P = .019). Intraoperative factors were not related to histological parameters, and histological abnormalities did not affect 1-year postoperative renal function. In contrast, donor age, preoperative GFR, and estimated blood loss were significantly related to 1-year postoperative GFR.
Most histological abnormalities in healthy living donors were minor. The incidence of abnormalities correlated with donor age. However, postoperative renal functions in living donors were not affected by histological abnormalities. Larger-scale investigations with long-term follow-up analysis will be needed.
closed_qa
HIV testing among pregnant women living with HIV in India: are private healthcare providers routinely violating women's human rights?
In India, approximately 49,000 women living with HIV become pregnant and deliver each year. While the government of India has made progress increasing the availability of prevention of mother-to-child transmission of HIV (PMTCT) services, only about one quarter of pregnant women received an HIV test in 2010, and about one in five of those found positive for HIV received interventions to prevent vertical transmission of HIV. Between February 2012 and March 2013, 14 HIV-positive women who had recently delivered a baby were recruited from HIV-positive women's support groups, Government of India Integrated Counseling and Testing Centers, and nongovernmental organizations in Mysore and Pune, India. In-depth interviews were conducted to examine their general experiences with antenatal healthcare; specific experiences around HIV counseling and testing; and perceptions about their care and follow-up treatment. Data were analyzed thematically using the human rights framework for HIV testing adopted by the United Nations and India's National AIDS Control Organization. While all of the HIV-positive women in the study received HIV and PMTCT services at a government hospital or antiretroviral therapy center, almost all reported attending a private clinic or hospital at some point in their pregnancy. According to the participants, HIV testing often occurred without consent; there was little privacy; breaches of confidentiality were commonplace; and denial of medical treatment occurred routinely. Among women living with HIV in this study, violations of their human rights occurred more commonly in private rather than public healthcare settings.
There is an urgent need for capacity building among private healthcare providers to improve standards of practice with regard to informed consent process, HIV testing, patient confidentiality, treatment, and referral of pregnant women living with HIV.
closed_qa
Risk stratification based on thyroid cytology: can we rely on national data?
To determine the correlation of malignancy rates between fine-needle aspiration (FNA) biopsy and surgical specimens in an urban academic environment. This was a retrospective review, at an academic medical center, of FNA biopsies and surgical specimens in a head and neck otolaryngology practice between 2000 and 2012. Of the 74 biopsies diagnosed as follicular lesion, 34 (45.9%) were malignant. Of the 45 biopsies diagnosed as follicular neoplasm, 22 (48.9%) were malignant. These rates are significantly higher than the average risk of malignancy cited by the American Thyroid Association of 5%-10% for follicular lesions and 20%-30% for follicular neoplasms, respectively.
The rate of malignancy based on a FNA diagnosis of indeterminate cytology (follicular lesion or follicular neoplasm) can vary greatly among different institutions. Thyroid surgeons should be aware of their local pathology practices to better guide therapy and counsel patients.
closed_qa
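Checking a local malignancy rate against a published benchmark, as the record above does, is naturally framed as an exact binomial test. The sketch below uses the abstract's counts; the choice of the upper bound of each ATA-cited range as the null proportion is my assumption, not the authors' stated method.

```python
from scipy.stats import binomtest

# Follicular lesion: 34 of 74 indeterminate biopsies malignant on final pathology
lesion = binomtest(k=34, n=74, p=0.10, alternative="greater")
# Follicular neoplasm: 22 of 45 malignant on final pathology
neoplasm = binomtest(k=22, n=45, p=0.30, alternative="greater")

print(f"lesion:   observed {34/74:.1%}, p vs 10% benchmark = {lesion.pvalue:.1e}")
print(f"neoplasm: observed {22/45:.1%}, p vs 30% benchmark = {neoplasm.pvalue:.4f}")
print("95% CI (Clopper-Pearson):", lesion.proportion_ci(confidence_level=0.95))
```

An exact confidence interval around the observed proportion makes the institution-vs-benchmark comparison explicit: if the whole interval sits above the cited range, local practice genuinely differs from the national figures.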
Are midwifery clients in Ontario making informed choices about prenatal screening?
Informed choice is often lacking in women's decisions about prenatal screening. The aim of this study was to evaluate how well midwives in Ontario, Canada, are facilitating informed choice in this area. An Internet-based survey was used to investigate 171 midwifery clients' knowledge of, attitudes towards, and experience of prenatal genetic screening tests, and to determine the proportion of study participants who made an informed choice about prenatal screening. All participants demonstrated adequate knowledge of prenatal screening. The vast majority (93.0%) of participants made an informed choice. Participants who chose to screen had lower knowledge scores than those who opted out of screening. Client satisfaction rates with regard to care received in this area ranged from 97% to 100%.
Results of this study suggest that Ontario midwives are effective in conveying information on prenatal genetic screening, contributing to high levels of client knowledge and satisfaction in comparison to similar studies in other jurisdictions.
closed_qa
Does analgesic overuse contribute to chronic post-traumatic headaches in adolescent concussion patients?
The causes of persistent headache following concussion are poorly understood. The objective of this study is to explore analgesic overuse as a potential cause of chronic post-traumatic headache among adolescents referred to a headache clinic following concussion. A retrospective chart review was conducted of all adolescent concussion patients referred to our pediatric headache clinic over the 16-month period between August 1, 2011, and November 30, 2012. Those patients with chronic post-traumatic headaches of 3-12 months' duration who also met International Headache Society criteria for probable medication-overuse headache were identified. Demographic data, concussion symptoms, and headache features were characterized from the initial evaluation and from follow-up visits. Of 104 adolescent concussion patients referred during the study period, 77 had chronic post-traumatic headache of 3-12 months' duration. Fifty-four of 77 (70.1%) met criteria for probable medication-overuse headache. Only simple analgesics were overused. Thirty-seven patients (68.5%) had resolution of headaches or improvements to preconcussion headache patterns after discontinuing analgesics; seven (13%) had no change in headaches or worsening of headaches after discontinuing analgesics and 10 (18.5%) did not discontinue analgesics or were lost to follow-up.
Excessive use of analgesics postconcussion may contribute to chronic post-traumatic headaches in some adolescents. Management of patients with chronic post-traumatic headache should include analgesic detoxification when medication overuse is suspected.
closed_qa
Does patient age still affect receipt of adjuvant therapy for colorectal cancer in New South Wales, Australia?
To investigate the effect of patient age on receipt of stage-appropriate adjuvant therapy for colorectal cancer in New South Wales, Australia. A linked population-based dataset was used to examine the records of 580 people with lymph node-positive colon cancer and 498 people with high-risk rectal cancer who underwent surgery following diagnosis in 2007/2008. Multilevel logistic regression models were used to determine whether age remained an independent predictor of adjuvant therapy utilisation after accounting for significant patient, surgeon and hospital characteristics. Overall, 65-73% of eligible patients received chemotherapy and 42-53% received radiotherapy. Increasing age was strongly associated with decreasing likelihood of receiving chemotherapy for lymph node-positive colon cancer (p<0.001) and radiotherapy for high-risk rectal cancer (p=0.003), even after adjusting for confounders such as Charlson comorbidity score and ASA health status. People aged over 70 years (for chemotherapy) and over 75 years (for radiotherapy) were significantly less likely to receive treatment than those aged less than 65 years. Emergency resection, intensive care admission, and not having a current partner also independently predicted chemotherapy nonreceipt. Other predictors of radiotherapy nonreceipt included being female, not being discussed at a multidisciplinary meeting, and lower T stage. Adjuvant therapy rates varied widely between the hospitals where surgery was performed.
There are continuing age disparities in adjuvant therapy utilisation in NSW that are not explained by patients' comorbidities or health status. Further exploration of these complex treatment decisions is needed. Variation by hospital and patient characteristics indicates opportunities to improve patient care and outcomes.
closed_qa
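The adjusted analysis in the record above is a multilevel logistic regression. The simplified single-level sketch below (statsmodels, with hypothetical column names, and omitting the hospital-level random effect the study used) shows how age-group effects are estimated while adjusting for comorbidity and health status; it is an illustration of the modeling style, not the study's code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical columns: received_chemo (0/1), age_group ('<65', '65-69', '70-74', '75+'),
# charlson (comorbidity score), asa (ASA class), emergency (0/1)
df = pd.read_csv("colon_cohort.csv")  # placeholder file name

model = smf.logit(
    "received_chemo ~ C(age_group, Treatment(reference='<65'))"
    " + charlson + C(asa) + emergency",
    data=df,
).fit()

# Odds ratios with 95% CIs for each age band relative to the <65 reference
odds_ratios = np.exp(model.params).rename("OR")
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios, conf_int], axis=1))
```

Exponentiating the coefficients converts log-odds into odds ratios, the scale on which findings like "older patients were significantly less likely to receive treatment" are reported.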
Can a novel computerized cognitive screening test provide additional information for early detection of Alzheimer's disease?
Virtual reality testing of everyday activities is a novel type of computerized assessment that measures cognitive, executive, and motor performance as a screening tool for early dementia. This study used a virtual reality day-out task (VR-DOT) environment to evaluate its predictive value in patients with mild cognitive impairment (MCI). One hundred thirty-four patients with MCI were selected and compared with 75 healthy control subjects. Participants received an initial assessment that included VR-DOT, a neuropsychological evaluation, magnetic resonance imaging (MRI) scan, and event-related potentials (ERPs). After 12 months, participants were assessed again with MRI, ERP, VR-DOT, and neuropsychological tests. At the end of the study, we differentiated two subgroups of patients with MCI according to their clinical evolution from baseline to follow-up: 56 MCI progressors and 78 MCI nonprogressors. VR-DOT performance profiles correlated strongly with existing predictive biomarkers, especially the ERP and MRI biomarkers of cortical thickness.
Compared with ERP, MRI, or neuropsychological tests alone, the VR-DOT could provide additional predictive information in a low-cost, computerized, and noninvasive way.
closed_qa
Does electroencephalogram phase variability account for reduced P3 brain potential in externalizing disorders?
Amplitude deficits of the P3 event-related potential (ERP) are associated with externalizing psychopathology but little is known about the nature of underlying brain electrical activity that accounts for this amplitude reduction. We sought to understand if group differences in task-induced phase-locking in electroencephalographic (EEG) delta and theta frequencies may account for P3-externalizing associations. Adult males (N=410) completed a visual oddball task and frontal and parietal P3-related delta- and theta-band phase-invariant evoked energy and inter-trial phase-locking measures were investigated with respect to the externalizing spectrum, including substance dependence, adult antisociality, and childhood disruptive disorders. We hypothesized that P3-related phase-locking is weaker in externalizing-diagnosed individuals and this might mediate prior findings of reduced evoked P3 energy. Reductions in both evoked energy and phase-locking, in both frequency bands, at both scalp sites, were associated with greater odds of externalizing diagnoses. Generally, adding phase-locking to evoked energy came with better prediction model fit. Moreover, reduced theta-band phase-locking partially mediated the effects of within-frequency evoked energy on externalizing prediction.
Inter-trial phase-locking underlying P3 appears to be an important distinction between externalizing and control subjects.
closed_qa
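Inter-trial phase-locking of the kind analyzed in the record above is usually quantified as inter-trial phase coherence: ITPC(t) = |(1/N) Σ_k exp(iφ_k(t))|, where φ_k(t) is the instantaneous phase of trial k at time t. A minimal NumPy/SciPy sketch for one channel and one band follows; the filter order and band edges are illustrative choices, not parameters taken from the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def itpc(trials: np.ndarray, fs: float, band: tuple) -> np.ndarray:
    """Inter-trial phase coherence for stimulus-locked epochs.

    trials: array of shape (n_trials, n_samples), one channel.
    Returns an array of length n_samples with values in [0, 1].
    """
    b, a = butter(4, band, btype="band", fs=fs)     # band-pass filter design
    filtered = filtfilt(b, a, trials, axis=1)       # zero-phase filtering per trial
    phase = np.angle(hilbert(filtered, axis=1))     # instantaneous phase per trial
    return np.abs(np.exp(1j * phase).mean(axis=0))  # resultant vector length across trials

rng = np.random.default_rng(0)
fake_eeg = rng.standard_normal((60, 512))           # 60 trials, 2 s at 256 Hz (toy data)
print(itpc(fake_eeg, fs=256.0, band=(4.0, 7.0)).mean())  # pure noise stays well below 1
```

Values near 1 mean the phase at a given latency is consistent across trials (strong phase-locking); values near 0 mean random phases, which is what reduced ITPC in externalizing-diagnosed subjects would reflect.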
Is restrictive atrial septal defect a risk in partial anomalous pulmonary venous drainage repair?
The creation or enlargement of an atrial septal defect (ASD) in partial anomalous pulmonary venous drainage (PAPVD) repair may pose a risk of postoperative pulmonary vein stenosis (PVS), superior vena cava stenosis (SVCS), and atrial rhythm disturbances. A total of 155 children who underwent repair of right PAPVD between 1990 and 2010 were reviewed. PVS and SVCS were defined by mean gradients on echocardiography: mild = 3 to 5 mm Hg; severe = 6 mm Hg or higher. Postoperative cardiac rhythms were categorized as sinus, transient nonsinus, and persistent nonsinus rhythms. Outcomes were compared between patients who underwent the creation or superior enlargement of an ASD (group A) and those who did not (group B). There was no early or late death. Freedom from any PVS at 15 years after operation was lower in group A than in group B (76.1% vs 96.5%, p=0.002), and no differences were found in freedom from severe PVS (p=0.103), any SVCS (p=0.419), or severe SVCS (p=0.373). Group A patients had more PVS-related reoperations (p=0.022). Nineteen patients had nonsinus rhythm, and 4 patients experienced first-degree atrioventricular block, but no significant difference was found between the groups. Cox regression revealed the creation or superior enlargement of an ASD as a predictor for postoperative PVS (p=0.032). A case-match analysis confirmed a higher risk of PVS in patients with the creation or superior enlargement of an ASD (p=0.018).
Late outcomes after repair of PAPVD are excellent. The subgroup that requires creation or superior enlargement of an ASD in repair of a right PAPVD is at a higher risk of late PVS and a subsequent increase in PVS-related reoperation. The presence of restrictive ASD did not increase SVCS, sinus node, or atrial conduction dysfunction.
closed_qa
Does recipient age impact functional outcomes of orthotopic heart transplantation?
This study evaluated changes in physical functional performance after orthotopic heart transplantation (OHT) with particular attention to the impact of recipient age on functional outcomes. Retrospective review of all first-time, single-organ adult OHTs in the United States between 2005 and 2010. Patients were primarily stratified by age. The validated Karnofsky performance scale, which ranges from 0 (death) to 100 (fully independent with no evidence of disease and no complaints), was used to measure functional status. A total of 10,049 OHT recipients were identified, with 1,431 (14%) aged 65 years or greater. Mean Karnofsky score prior to OHT was comparable between cohorts (younger: 50.7±25.2 versus older: 50.1±25.0; p=0.38). At a median follow-up of 2.1 years (interquartile range 0.7 to 3.3 years), 64% of OHT recipients had improved functional performance. The mean improvement in Karnofsky score was similar between younger and older patients (19.6±42.0 vs 17.5±41.8; p=0.10). Twenty percent of younger patients were functionally independent prior to OHT, with 67% being functionally independent at last follow-up (p<0.001). Similarly, in the older cohort, 20% were functionally independent prior to OHT, with 66% being functionally independent at last follow-up (p<0.001). Multivariable analysis adjusting for potential confounders confirmed that age, both as a continuous and categoric variable, did not impact odds of functional improvement after OHT. Subanalysis using 70 years as the age cutoff produced similar results.
In the modern era, OHT is associated with improvements in functional performance in most recipients, and this beneficial effect is preserved across the age spectrum. These data provide a benchmark for functional outcomes after OHT and may have important implications in organ allocation.
closed_qa
Matching physical work demands with functional capacity in healthy workers: can it be more efficient?
To determine if functional capacity (FC) and physical work demands can be matched and to determine the validity of normative values for FC related to physical work demands as a screening instrument for work ability. Forty healthy working subjects were included in this study. Subjects were categorized into four physical work demand categories (sedentary, light, moderate and heavy). FC was tested with a standardized Functional Capacity Evaluation (FCE) following the WorkWell Protocol, and physical work demands were determined with an onsite Work Load Assessment (WLA) according to the Task Recording and Analyses on Computer (TRAC) method. Physical work demands were compared to FC and to normative values derived from previous research. In total, 88% of the subjects scored higher on the FCE than the demands observed during the WLA. The tenth percentile of normative values appeared valid in 98% of cases for sedentary/light work among the subjects tested in this study. For moderate or heavy work, the thirtieth percentile of normative values appeared valid in 78% of all cases.
Functional capacity and physical work demands can be matched in most instances, but exceptions should be kept in mind with regard to professions classified as moderate or heavy physical work, especially concerning high lifting. Normative values may be considered as an additional screening tool for balancing workload and capacity. It is recommended to further validate normative values in a broader and more extensive working population.
closed_qa
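The screening logic in the record above compares a job's measured demand against a lower percentile of healthy workers' FCE capacity (the 10th percentile for sedentary/light work, the 30th for moderate/heavy work). The sketch below is one plausible operationalization on synthetic data; the numbers and the flagging rule are illustrative assumptions, not the study's normative tables.

```python
import numpy as np

rng = np.random.default_rng(2)
normative_lift_kg = rng.normal(loc=25.0, scale=6.0, size=200)  # synthetic FCE lifting results

p10, p30 = np.percentile(normative_lift_kg, [10, 30])

def demand_exceeds_cutoff(job_demand_kg: float, heavy_work: bool) -> bool:
    """Flag a potential capacity/demand mismatch for individual follow-up testing."""
    cutoff = p30 if heavy_work else p10  # stricter percentile for moderate/heavy work
    return job_demand_kg > cutoff

print(f"P10 = {p10:.1f} kg, P30 = {p30:.1f} kg")
print(demand_exceeds_cutoff(15.0, heavy_work=False),
      demand_exceeds_cutoff(24.0, heavy_work=True))
```

The design choice mirrors the abstract's finding: a lenient (10th-percentile) cutoff screened sedentary/light work well, while heavier work needed a stricter (30th-percentile) cutoff to stay valid.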
Is yoga training beneficial for exercise-induced bronchoconstriction?
Some studies have shown the beneficial effects of yoga for individuals with bronchial hyperreactivity with regard to (1) a reduction in the use of rescue medication, (2) an increase in exercise capacity, and (3) an improvement in lung function. Despite the fact that yoga is promising as a new treatment for pediatric patients, further studies are needed to assess the use of this training for asthma management. This study was performed to assess the beneficial effects of yoga on exercise-induced bronchoconstriction (EIB) in children. The study was prospective, with no control group. Participants were randomly chosen among the new patients at the unit. This study was conducted in the Erciyes University School of Medicine, Pediatric Allergy Unit, in Kayseri, Turkey. Two groups of asthmatic children aged 6-17 y were enrolled in the study: (1) children with positive responses to an exercise challenge (n = 10), and (2) those with negative responses (n = 10). Both groups attended 1-h sessions of yoga training 2 ×/wk for 3 mo. Researchers administered spirometric measurements to all children before and immediately after an exercise challenge, both at baseline and at the study's end. Age, gender, IgE levels, eosinophil numbers, and spirometric parameters including forced expiratory volume in 1 sec (FEV1), forced expiratory flow 25%-75% (FEF25%-75%), forced vital capacity (FVC), peak expiratory flow percentage (PEF%), and peak expiratory flow rate (PEFR) were compared using the Mann-Whitney U test and the Wilcoxon test. A P value<.05 was considered significant. At baseline, no significant differences were observed between the groups regarding demographics or pre-exercise spirometric measurements (P>.05, Mann-Whitney U test). Likewise, no significant differences in spirometric measurements existed between the groups regarding the change in responses to an exercise challenge after yoga training (P>.05, Wilcoxon test). For the exercise-response-positive group, the research team observed a significant improvement in the maximum forced expiratory volume 1% (FEV1%) fall following the exercise challenge after yoga training (P<.05, Wilcoxon test). All exercise-response-positive asthmatics became exercise-response-negative asthmatics after yoga training.
This study showed that training children in the practice of yoga had beneficial effects on EIB. It is the research team's opinion that yoga training can supplement drug therapy to achieve better control of asthma.
closed_qa
Can a resident's publication record predict fellowship publications?
Internal medicine fellowship programs have an incentive to select fellows who will ultimately publish. Whether an applicant's publication record predicts long-term publishing remains unknown. Using records of fellowship-bound internal medicine residents, we analyzed whether publications at the time of fellowship application predict publications more than 3 years (2 years into fellowship) and up to 7 years after the fellowship match. We calculated the sensitivity, specificity, positive and negative predictive values, and likelihood ratios for every cutoff number of application publications, and plotted a receiver operating characteristic curve of this test. Of 307 fellowship-bound residents, 126 (41%) published at least one article 3 to 7 years after matching, and 181 (59%) did not publish in this time period. The area under the receiver operating characteristic curve was 0.59. No cutoff value for application publications possessed adequate test characteristics.
The number of publications an applicant has at time of fellowship application is a poor predictor of who publishes in the long term. These findings do not validate the practice of using application publications as a tool for selecting fellows.
closed_qa
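The cutoff analysis in the record above (sensitivity and specificity for every candidate publication count, summarized by an AUC of 0.59) maps directly onto a standard ROC computation. A scikit-learn sketch on synthetic stand-in data follows; because the synthetic predictor is generated independently of the outcome, the AUC hovers near 0.5, mirroring the "poor predictor" finding rather than reproducing the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
pubs = rng.poisson(2.0, size=307)                  # stand-in application publication counts
published_later = rng.binomial(1, 0.41, size=307)  # 41% publish long term, as in the abstract

fpr, tpr, thresholds = roc_curve(published_later, pubs)
print(f"AUC = {roc_auc_score(published_later, pubs):.2f}")

# Sensitivity/specificity at each candidate cutoff, as the study tabulated
for thr, se, fp in zip(thresholds, tpr, fpr):
    print(f"cutoff >= {thr}: sensitivity = {se:.2f}, specificity = {1 - fp:.2f}")
```

An AUC of 0.59 sits close to the 0.5 chance line, which is why no single cutoff in the study combined adequate sensitivity with adequate specificity.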
Is circumferential minimally invasive surgery effective in the treatment of moderate adult idiopathic scoliosis?
Outcomes for minimally invasive scoliosis correction surgery have been reported for mild adult scoliosis. Larger curves historically have been treated with open surgical procedures including facet resections or posterior column osteotomies, which have been associated with high-volume blood loss. Further, minimally invasive techniques have been largely reported in the setting of degenerative scoliosis. We describe the effects of circumferential minimally invasive surgery (cMIS) for moderate to severe scoliosis in terms of (1) operative time and blood loss, (2) overall health and disease-specific patient-reported outcomes, (3) deformity correction and fusion rate, and (4) frequency and types of complications. Between January 2007 and January 2012, we performed 50 cMIS adult idiopathic scoliosis corrections in patients with a Cobb angle of greater than 30° but less than 75° who did not have prior thoracolumbar fusion surgery; this series represented all patients we treated surgically during that time meeting those indications. Our general indications for this approach during that period were increasing back pain unresponsive to nonoperative therapy with cosmetic and radiographic worsening of curves. Surgical times and estimated blood loss were recorded. Functional clinical outcomes including VAS pain score, Oswestry Disability Index (ODI), and SF-36 were recorded preoperatively and postoperatively. Patients' deformity correction was assessed on pre- and postoperative 36-inch (91-cm) standing films and fusion was assessed on CT scan. Minimum followup was 24 months (mean, 48 months; range, 24-77 months). Mean blood loss was 613 mL for one-stage surgery and 763 mL for two-stage surgery. Mean operative time was 351 minutes for one-stage surgery and 482 minutes for two-stage surgery. At last followup, mean VAS and ODI scores decreased from 5.7 and 44 preoperatively to 2.9 and 22 (p<0.001 and 0.03, respectively) and mean SF-36 score increased from 48 preoperatively to 74 (p = 0.026). Mean Cobb angle and sagittal vertical axis decreased from 42° and 51 mm preoperatively to 16° and 27 mm postoperatively (both p<0.001). An 88% fusion rate was confirmed on CT scan. Perioperative complications occurred in 11 of the 50 patients (22%), with delayed complications needing further surgery in 10 more patients at last followup.
cMIS provides for good clinical and radiographic outcomes for moderate (30°-75°) adult idiopathic scoliosis. Patients undergoing cMIS should be carefully selected to avoid fixed, rigid deformities and a preoperative sagittal vertical axis of greater than 10 cm; surgeons should consider alternative techniques in those patients.
closed_qa
Do echo-enhanced needles make a difference in sonographically guided vascular access?
The purpose of this study was to compare sonographically guided vascular access using standard and echo-enhanced needles in a variety of tissue-simulating vascular phantoms. We conducted a prospective single-blinded observational study at an academic medical center. All participants performed real-time sonographically guided vascular access using both a standard 18-gauge needle and an echo-enhanced needle in both in-plane and out-of-plane approaches on 3 different vascular access phantoms. The outcome measures included time to dye flash, first-pass success, visibility of the needle tip at the time of puncture, total number of attempts, number of redirections, and incidence of posterior wall penetration. A total of 408 sonographically guided cannulations were performed by 34 participants. The time from needle stick to dye flash, first-pass success, and the total number of attempts were not significantly different between the two needles (P>.05). The tip of the needle was seen at the time of puncture in 79% of attempts with the standard needle (95% confidence interval [CI], 68%-86%) and in 86% of attempts with the echo-enhanced needle (95% CI, 76%-92%), although this difference was not significant (P = .103). The posterior wall was penetrated in 14% of attempts with the standard needle (95% CI, 9.6%-20%) and in 6% of attempts with the echo-enhanced needle (95% CI, 3.5%-11%), and this difference was significant (P<.02).
Echo-enhanced needles decreased the incidence of posterior wall punctures when compared to standard needles during sonographically guided vascular access. However, there were no significant differences in other sonographically guided vascular access metrics.
closed_qa
Is postdilatation useful after implantation of the Edwards valve?
Few data are available about postdilatation (PD) for the treatment of significant paravalvular aortic regurgitation (AR) after transcatheter aortic valve implantation of the Edwards valve. A total of 470 patients, aged 83.4 ± 6.4 years, with a logistic European System for Cardiac Operative Risk Evaluation score of 21.9 ± 12.3, undergoing transcatheter aortic valve implantation with the Edwards valve were evaluated. PD was performed using the balloon delivery system when significant paravalvular AR was identified. The diameter of the valve was measured from cine acquisition at three different levels. PD was performed in 49 patients (10.4%) with grade 2, 3, or 4 AR (42.1%, 55.3%, and 2.6%, respectively). After PD, a reduction of at least 1 degree of AR was achieved in 81.5% of cases. Residual AR of grades 2, 3, and 4 was observed in 36.8%, 10.5%, and 0%, respectively. A significant increase in the prosthesis diameter was observed at the three valve levels (absolute Δ 3.5%-5.4%, P < 0.01). For the 23 mm valve, the mid-level diameter increased from 23.0 ± 0.4 to 24.1 ± 0.5 mm (P < 0.01), and for the 26 mm valve, from 25.2 ± 0.9 to 26.6 ± 0.9 mm (P < 0.01). Occurrence of annulus rupture (4.1% vs. 1.7%, P = 0.24), cerebrovascular accidents (2.0% vs. 2.1%, P = 0.72), need for a new pacemaker (8.2% vs. 5.5%, P = 0.31), and the 30-day composite endpoint (24.5% vs. 20.2%, P = 0.48) did not differ significantly between the PD and non-PD groups.
PD for the treatment of significant paravalvular leak proved to be a feasible treatment allowing a significant increase in valve size and decrease in PVL without increase in stroke rates. This promising approach needs further confirmation.
closed_qa
Can soy intake affect serum uric acid level?
Hyperuricemia is a recognized risk factor for cardiovascular diseases. Soy foods contain a moderate amount of purine and may predispose to raised serum uric acid (UA). However, no study has examined the long-term effect of soy intake on UA levels. We examined whether consumption of soy foods and isoflavone extracts for 6 months altered serum UA. The analysis included two randomized controlled trials (a soy protein trial and a whole soy trial) among a total of 450 postmenopausal women with either prehypertension or prediabetes. We conducted a pooled analysis by combining participants from the soy flour and soy protein groups (combined soy foods group), participants from the isoflavone and daidzein groups (combined isoflavone group), and participants from both milk placebo groups. Fasting venous samples were obtained at baseline and at the end of the trial for serum UA analysis. In the pooled data, 417 subjects completed the study according to protocol. Baseline serum UA levels were comparable among the three combined groups. Women in the combined soy foods group showed a smaller decrease in UA levels than women in the other two groups (p = 0.028 and 0.026). The net difference in the decrease and % decrease in UA between the combined soy foods group and the placebo group was 14.5 μmol/L (95% CI 1.93-25.6, p = 0.023) or 4.9% (95% CI 1.3-8.5%, p = 0.023).
Among Chinese postmenopausal women with either prehypertension or prediabetes, soy intake did not increase urate levels.
closed_qa
Serum titanium, niobium and aluminium levels two years following instrumented spinal fusion in children: does implant surface area predict serum metal ion levels?
Measurement of serum metal ion levels is used to determine systemic exposure to implant-derived metal debris that may be generated by processes of wear and corrosion. The aim of this study is to investigate predictors of serum metal ion levels in children undergoing instrumented spinal arthrodesis using a titanium alloy, focusing on implant characteristics and instrumentation construct design variables. This prospective longitudinal cohort study involved 33 children. Serum samples were obtained preoperatively and at five defined interval periods over the first two post-operative years. Samples were analysed using high-resolution inductively coupled plasma mass spectrometry to measure titanium, niobium and aluminium concentrations. Instrumentation characteristics were catalogued and construct surface area (SA) measurements calculated using an implant-specific software algorithm tool. Significantly elevated levels of serum titanium and niobium were observed (p<0.0001), with >95% of post-operative levels abnormally elevated. Significant predictors of serum titanium and niobium levels included time since surgery, surgical procedure (posterior or anterior fusion), number of levels fused, number of pedicle screws inserted, total rod length, total metal SA, total exposed metal SA and total metal-on-metal SA. All significant instrumentation variables were highly correlated.
There is a strong relationship between implant SA and both serum titanium and niobium levels. The direct clinical implications of these findings for patients are uncertain, but remain of concern. Surgeons should be aware of the strong correlation between implant surface area of the chosen construct and the subsequent serum metal ion levels.
closed_qa
Is neutrophil lymphocyte ratio an indicator for proteinuria in chronic kidney disease?
Recent studies have shown that the neutrophil lymphocyte ratio (NLR) is a strong indicator of inflammation in cardiac and non-cardiac diseases. We aimed to evaluate the relationship between proteinuria and NLR in chronic kidney disease (CKD) patients without diabetes mellitus (DM). The files of a total of 1000 CKD patients attending the outpatient clinic between 2011 and 2012 were retrospectively reviewed. Patients with DM, chronic disease, malignancy or stage 5 CKD were excluded, leaving a total of 69 patients with stage 3 and 4 CKD for evaluation. The study comprised 27 patients with CKD without proteinuria (Group 1), 42 patients with CKD and proteinuria (Group 2) and 30 healthy volunteers (Group 3). NLR was highest in Group 2, a difference that was statistically significant compared with the control group (p = 0.012). The platelet lymphocyte ratio (PLR) in Group 2 was also significantly higher than in the control group (p = 0.004). There was a moderate positive correlation between proteinuria and NLR (p = 0.013, r = 0.3), and a positive correlation between proteinuria and PLR (p = 0.002, r = 0.306).
In conclusion, NLR, a parameter easily found in routine blood counts of CKD patients, is a marker with prognostic value for the presence and degree of proteinuria.
closed_qa
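NLR and PLR as used in the record above are simple ratios from a routine complete blood count, and the reported association is a correlation of those ratios with proteinuria. A sketch with hypothetical column names follows; the abstract does not state which correlation coefficient was used, so Spearman's rho is my assumption.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical file with absolute counts (cells/uL) and 24-h proteinuria per patient
df = pd.read_csv("ckd_labs.csv")

df["NLR"] = df["neutrophils"] / df["lymphocytes"]
df["PLR"] = df["platelets"] / df["lymphocytes"]

for marker in ("NLR", "PLR"):
    rho, p = spearmanr(df[marker], df["proteinuria_mg_day"])
    print(f"{marker} vs proteinuria: rho = {rho:.2f}, p = {p:.3f}")
```

A rank correlation is a defensible default here because lab ratios and proteinuria are typically right-skewed, where Pearson's r can be dominated by a few extreme values.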
Does treatment interruption and baseline hemoglobin affect overall survival in early laryngeal cancer treated with radical radiotherapy?
In this retrospective study we assessed different factors affecting the outcome of early laryngeal cancer, focusing on the impact of the pretreatment hemoglobin (Hb) level, the time interval between diagnosis and the start of radiotherapy, and treatment interruption during the course of radiotherapy. We reviewed the hospital records, oncology database and radiotherapy treatment sheets of 88 patients with T1-T3 N0M0 squamous cell carcinoma of the larynx who had been treated with radical radiotherapy at Northamptonshire Centre for Oncology during the period from 1st January 1996 to 31st December 2002 inclusive. Patients were followed up for 10 years. There were no significant overall survival differences with regard to sex, stage, radiotherapy dose received, treatment interruption for 1 to 2 days, or the delay to the start of radiotherapy (mean delay 57 days). However, increasing age had a statistically significant adverse effect on overall survival (p<0.001). On the other hand, patients with a pretreatment Hb level>12 g/dl had a statistically significant overall survival benefit over those with ≤12 g/dl (p=0.018).
Pretreatment Hb level had a significant impact on overall survival in patients with early laryngeal carcinoma treated with radical radiotherapy. Time to the start of radiation treatment, treatment interruption for 1 or 2 days, and different dose/fractionation schedules did not affect overall survival.
closed_qa
Does the computed tomography perfusion imaging improve the diagnostic accuracy in the response evaluation of esophageal carcinoma to the neoadjuvant chemoradiotherapy?
To estimate whether computed tomography (CT) perfusion imaging could be useful to predict the pathological complete response (pCR) of esophageal cancer to neoadjuvant chemoradiotherapy (NACRT). Twenty-seven patients with advanced squamous cell esophageal carcinoma, who were treated with concomitant CRT (CIS/5-FU/LV and 45-50 Gy total radiation dose), were re-evaluated using CT examination, which included a low-dose CT perfusion study. CT perfusion series were analysed using deconvolution-based CT perfusion software (Perfusion 3.0, GE), and color parametric maps of the blood flow (BF), blood volume (BV), mean transit time (MTT), and permeability surface area product (PS) were displayed. All patients were operated on, and histopathological analysis of the resected esophagus was considered the gold standard for pCR. Post-NACRT BF, BV, and PS were significantly lower, and post-NACRT MTT significantly higher, in the pCR group. Mean (±SD) or median perfusion parameter values in the pCRs (11 patients) vs non-pCRs (16 patients) were: BF 21.4±5.0 vs 86.0±29 ml/min/100 g (p<0.001), BV 1.3 vs 3.9 ml/100 g (p<0.001), MTT 5.5 vs 3.7 s (p=0.018), and PS 5.9 vs 9.8 ml/min/100 g (p=0.006). ROC analysis revealed that post-NACRT BF (AUC=1.000), BV (AUC=0.932), MTT (AUC=0.801), and PS (AUC=0.844) could predict pCR (p<0.01), while maximal esophageal wall thickness could not (AUC=0.676, p=0.126). With a cut-off value of post-NACRT BF<30.0 ml/min/100 g, pCR was predicted with a sensitivity and specificity of 100%.
CT perfusion imaging enables accurate prediction of pCR of esophageal carcinoma to neoadjuvant chemoradiotherapy.
closed_qa
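The reported 100%-sensitive, 100%-specific cutoff in the record above (post-NACRT BF < 30.0 ml/min/100 g) is a simple threshold rule. The sketch below computes sensitivity and specificity for such a rule; the per-patient values are toy stand-ins built from the group means, since the abstract reports only summary statistics.

```python
import numpy as np

def below_cutoff_performance(values, truth, cutoff):
    """Positive test = value below cutoff; truth = 1 for pathological complete response."""
    values, truth = np.asarray(values), np.asarray(truth)
    pred = values < cutoff
    sens = np.sum(pred & (truth == 1)) / np.sum(truth == 1)
    spec = np.sum(~pred & (truth == 0)) / np.sum(truth == 0)
    return sens, spec

bf = np.array([21.4] * 11 + [86.0] * 16)  # toy post-NACRT blood-flow values (ml/min/100 g)
truth = np.array([1] * 11 + [0] * 16)     # 11 pCR, 16 non-pCR, as in the abstract
print(below_cutoff_performance(bf, truth, cutoff=30.0))  # -> (1.0, 1.0) on these toy data
```

A perfect split like this (AUC = 1.000) means the two groups' values did not overlap at all in this small sample, which is also why the result needs confirmation in larger cohorts before the cutoff can be trusted clinically.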
Is (99m)Tc-MDP whole body bone scintigraphy adjuvant to (18)F-FDG-PET for the detection of skeletal metastases?
Because fluorine-18-fluorodeoxyglucose positron emission tomography/computed tomography ((18)F-FDG-PET/CT) and technetium-99m-methylenediphosphonate ((99m)Tc-MDP) whole body scans identify bone metastases by different mechanisms, i.e. by glucose metabolism and osteoblastic response in the bone, respectively, some differences between these two methods in the number of lesions identified can be expected. The aim of this study was to compare the sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of (18)F-FDG-PET/CT and conventional (99m)Tc-MDP whole body scans in detecting bone metastases. Between 2006 and 2009, 121 patients with malignancies (62 male and 59 female, mean age 59.3±10.8 years, range 37-84) were examined with (18)F-FDG-PET/CT and conventional (99m)Tc-MDP whole-body scans for the detection of bone metastases. For (18)F-FDG-PET/CT and (99m)Tc-MDP respectively, the sensitivity, specificity, accuracy, PPV and NPV for detecting all studied bone metastases were 88.3, 83.6, 86.7, 91.7, 77.8% and 91.7, 71.0, 84.9, 86.6, 80.8%. For bone metastases of breast and lung cancers, the specificity and accuracy of PET/CT were higher than those of bone scintigraphy. On the other hand, the sensitivity of bone scintigraphy was higher than that of PET/CT in the breast and lung cancer groups and in all patients. In the detection of osteolytic and osteosclerotic metastases no difference was found between the two methods, while the mean standardized uptake value (SUV) max was higher for osteolytic than for osteosclerotic lesions.
For the detection of bone metastases, the specificity and accuracy of (18)F-FDG-PET/CT were higher than those of bone scintigraphy, while the sensitivity was lower. In the authors' opinion, the two studies are complementary in reaching a final diagnosis.
closed_qa
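Predictive values such as those in the record above depend on disease prevalence as well as on sensitivity and specificity: PPV = Se·p / (Se·p + (1−Sp)(1−p)) and NPV = Sp·(1−p) / (Sp·(1−p) + (1−Se)·p). The sketch applies the abstract's PET/CT figures; the prevalence values are assumed for illustration (≈0.67 is roughly what the reported PPV/NPV imply, not a figure stated in the abstract).

```python
def predictive_values(sens: float, spec: float, prev: float):
    """Bayes' rule applied to a binary test at a given disease prevalence."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# PET/CT for bone metastases: sensitivity 88.3%, specificity 83.6% (from the abstract)
for prev in (0.30, 0.50, 0.67):  # assumed prevalence values
    ppv, npv = predictive_values(0.883, 0.836, prev)
    print(f"prevalence {prev:.0%}: PPV = {ppv:.1%}, NPV = {npv:.1%}")
```

This is why PPV and NPV from a high-prevalence referral cohort, as here, should not be carried over directly to a screening population with lower pre-test probability.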
Is methylene diphosphonate bone scan necessary for initial staging of Ewing sarcoma if 18F-FDG PET/CT is performed?
The purpose of this study was to determine whether methylene diphosphonate (MDP) bone scans are necessary during initial staging in patients with Ewing sarcoma (ES) in whom (18)F-FDG PET/CT is performed. A retrospective review was performed of patients who underwent FDG PET/CT and MDP bone scan before treatment of newly diagnosed ES from January 2004 to November 2012. Studies were reviewed to document suspected primary and metastatic malignancy. Pathology and imaging follow-up were used to determine the presence or absence of disease at suspected sites. Sixty patients were identified in whom FDG PET/CT and MDP bone scans were performed before treatment of newly diagnosed ES. Forty-four primary malignancies had a lytic CT appearance, three were sclerotic, and 13 involved only soft tissue. In 11 of 12 patients with osseous metastases, these were detected on PET/CT, with the one false-negative occurring in a sclerotic primary tumor; in nine of 12 patients with osseous metastases, these were detected on MDP bone scan, with the three false-negatives occurring in patients with lytic primary tumors. Only one of 13 patients with a soft-tissue primary malignancy had bone metastases on both bone scan and PET/CT. PET/CT also showed that eight patients had lung metastases and three patients had lymph node metastases, which were not evident on MDP bone scan.
When ES is lytic, MDP bone scan does not add to staging performed by FDG PET/CT; thus, MDP bone scanning may be omitted. However, when ES is sclerotic, MDP bone scan may detect osseous metastases not detected by FDG PET/CT.
closed_qa
Is the new ACR-SPR practice guideline for addition of oblique views of the ribs to the skeletal survey for child abuse justified?
The purpose of our study was to determine whether adding oblique bilateral rib radiography to the skeletal survey for child abuse significantly increases detection of the number of rib fractures. We identified all patients under 2 years old who underwent a skeletal survey for suspected child abuse from January 2003 through July 2011 and who had at least one rib fracture. These patients were age-matched with control subjects without fractures. Two randomized radiographic series of the ribs were performed, one containing two views (anteroposterior and lateral) and another with four views (added right and left oblique). Three fellowship-trained radiologists (two in pediatrics and one in trauma) blinded to the original reports independently evaluated the series using a Likert scale of 1 (no fracture) to 5 (definite fracture). We analyzed the following: sensitivity and specificity of the two-view series for detection of any rib fracture and for location (using the four-view series as the reference standard), interobserver variability, and confidence level. We identified 212 patients (106 with one or more fractures and 106 without). The sensitivity and specificity of the two-view series were 81% and 91%, respectively. Sensitivity and specificity for detection of posterior rib fractures were 74% and 92%, respectively. There was good agreement between observers for detection of rib fractures in both series (average kappa values of 0.70 and 0.78 for the two-view and four-view series, respectively). Confidence significantly increased with the four-view series.
Adding bilateral oblique rib radiographs to the skeletal survey results in increased rib fracture detection and increased confidence of readers.
closed_qa
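Interobserver agreement in the record above is summarized with kappa, which corrects raw percent agreement for the agreement expected by chance. Given two readers' per-case fracture calls, it is a one-liner with scikit-learn; the arrays below are toy data, not the study's reads.

```python
from sklearn.metrics import cohen_kappa_score

# Toy per-case fracture calls (1 = fracture present) for two blinded readers
reader_a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
reader_b = [1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0]

print(f"kappa = {cohen_kappa_score(reader_a, reader_b):.2f}")
```

Average kappa values in the 0.70-0.78 range, as reported in the abstract, are conventionally interpreted as good agreement beyond chance.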
Is dry eye associated with acquired aponeurogenic blepharoptosis?
To study the relation between signs of dry eye and acquired aponeurogenic blepharoptosis. Prospective case-control study in which 100 patients with uni-or bilateral acquired aponeurogenic blepharoptosis were matched for age and gender to 100 controls. The margin-reflex distance (MRD), the Schirmer-1 score, the duration of the tear film break up time (BUT), and the presence of any corneal staining with fluorescein were evaluated in both groups and compared. Data were analysed using either Fisher's exact test or linear regression. The Schirmer-1 score was<10 mm in 36 patients versus 14 controls (p = 0.0005). It was<5 mm in 15 patients versus 5 controls (p = 0.03). Hard contact lenses were worn by 29 patients versus 4 controls (p = 0.000002), and soft contact lenses by 11 patients and 4 controls (p = 0.1). After exclusion of contact lens wearers from analysis, the Schirmer-1 score was<10 mm in 30 of the remaining 60 patients and in 17 of the remaining 92 controls (p = 0.00006). The tear BUT was<10 sec in 75 patients versus 71 controls (p = 0.6). Corneal staining was present in 25 patients versus 15 controls (p = 0.1). The score of the Schirmer-1 test and the MRD decreased with age in both groups.
(1) Compared to matched controls, patients with acquired aponeurogenic blepharoptosis more often have decreased aqueous tear production (as measured by the Schirmer-1 test). Although a low tear production may have a causative role in the etiology of acquired blepharoptosis, it may also be explained by a dampened reflex in blepharoptosis patients. (2) With age, the MRD as well as the Schirmer-1 score decrease in both groups.
closed_qa
Anderson-Hynes pyeloplasty in patients less than 12 months old: is the laparoscopic approach safe and feasible?
The aim of our study is to compare the outcomes of open and laparoscopic pyeloplasty in children less than 12 months of age. We reviewed the medical charts of all patients less than 12 months old who underwent pyeloplasty from January 2007 to February 2013 and divided them into two groups: open pyeloplasty (OP) and laparoscopic pyeloplasty (LP). The following data were analyzed: age, sex, weight, US measurements, operative time, hospital stay, complications, and success rate. Quantitative data were analyzed with the Student t test or Mann-Whitney U test, and qualitative data with the chi-square test or Fisher exact test. Fifty-eight patients (46 boys and 12 girls) with a mean age of 4.66 months (±3.05) were included. Mean age was 4.25 months in the OP group and 5.15 months in the LP group, and mean weight was 6.78 kg and 7.02 kg, respectively; there were no statistically significant differences in age, weight, or sex between the groups. There were also no statistical differences in preoperative ultrasonography measurements: mean posterior-anterior (PA) pelvic diameter was 28.57 mm in the OP group and 23.94 mm in the LP group, and mean calyceal diameters were 10.86 mm and 10.96 mm, respectively. Mean operative time was 129.53 minutes in the OP group and 151.92 minutes in the LP group, a statistically significant difference (P=0.018). Mean hospital stay was 6.34 days in the OP group and 3.46 days in the LP group, also a statistically significant difference (P<0.05). No intraoperative or postoperative complications occurred in either group. Hydronephrosis improved in all patients, and no patient needed repeat pyeloplasty.
The laparoscopic approach of Anderson-Hynes pyeloplasty in patients less than 12 months old is a safe procedure with the same outcomes as the open approach.
closed_qa
Surgical treatment of infective endocarditis in active intravenous drug users: a justified procedure?
Infective endocarditis is a life-threatening complication of intravenous drug abuse, which continues to be a major burden with inadequately characterised long-term outcomes. We reviewed our institutional experience of surgical treatment of infective endocarditis in active intravenous drug abusers with the aim of identifying the determinants of long-term outcome in this distinct subgroup of infective endocarditis patients. A total of 451 patients underwent surgery for infective endocarditis between January 1993 and July 2013 at the University Hospital of Heidelberg. Of these, 20 patients (7 female; mean age 35 ± 7.7 years) with a history of active intravenous drug abuse underwent surgery for infective endocarditis. Mean follow-up was 2504 ± 1842 days. Staphylococcus aureus was the most common pathogen detected in preoperative blood cultures. Two patients (10%) died before postoperative day 30. Survival at 1, 5 and 10 years was 90%, 85% and 85%, respectively. Freedom from reoperation was 100%. Higher NYHA functional class, higher EuroSCORE II, HIV infection, longer operating time, postoperative fever and higher requirement for red blood cell transfusion were associated with 90-day mortality.
In active intravenous drug abusers, surgical treatment for infective endocarditis should be performed as extensively as possible and be followed by aggressive postoperative antibiotic therapy to avoid high mortality. Early surgical intervention is advisable in patients with precipitous cardiac deterioration and in cases of staphylococcal endocarditis. However, larger studies are necessary to confirm our preliminary results.
closed_qa
A twofold risk of metabolic syndrome in a sample of patients with schizophrenia: do consanguinity and family history increase risk?
Patients with schizophrenia are at greater risk for metabolic syndrome (MetS) and other cardiovascular risk factors. The objective of the study was to examine the prevalence of MetS and its criteria among patients with schizophrenia (SZ) according to the revised NCEP ATP III criteria, and to assess which component contributed most to the increased risk of MetS in schizophrenia patients. This was a matched case-control study conducted in the outpatient clinics of the Psychiatry department and the Primary Health Care (PHC) Centers of the Supreme Council of Health, State of Qatar. The study was carried out among patients with SZ and healthy subjects above 20 years old, and was based on 233 cases and 466 controls matched by age and gender. The survey was conducted from June 2010 to May 2011. Face-to-face interviews were conducted using a structured questionnaire, followed by laboratory tests. Metabolic syndrome was defined using the National Cholesterol Education Program - Third Adult Treatment Panel (ATP III) criteria. The prevalence of metabolic syndrome among schizophrenia patients (36.5%) was significantly higher than among healthy subjects (18.7%) (p<0.001); the prevalence of MetS in schizophrenia patients was thus roughly twice that of the general population. The MetS components were more frequent among schizophrenia patients than healthy subjects. Among the components of MetS, central obesity was the most common criterion, present in 63.9% of patients compared with 45.7% of healthy subjects (p<0.001). Schizophrenia patients were significantly more often obese (27%) than healthy subjects (13.1%). Female schizophrenia patients were more likely than men to have three or more metabolic abnormalities.
The study indicated that metabolic syndrome was highly prevalent in patients with schizophrenia. Female gender was significantly associated with a higher prevalence of metabolic syndrome. The identification and clinical management of this high-risk group are of great importance.
closed_qa
Are general surgery residents ready to practice?
General surgery residency training has changed with adoption of the 80-hour work week, patient expectations, and the malpractice environment, resulting in decreased resident autonomy during the chief resident year. There is considerable concern that graduating residents are not prepared for independent surgical practice. Two online surveys were developed, one for "young surgeons" (American College of Surgeons [ACS] Fellows 45 years of age and younger) and one for "older surgeons" (ACS Fellows older than 45 years of age). The surveys were distributed by email to 2,939 young and 9,800 older surgeons. The last question was open-ended with a request to provide comments. A qualitative and quantitative analysis of all comments was performed. The response rate was 9.6% (282 of 2,939) for young and 10% (978 of 9,800) for older surgeons. The majority of young surgeons (94% [58.7% strongly agree, 34.9% agree]) stated they had adequate surgical training and were prepared for transition to the surgery attending role (91% [49.6% strongly agree, 41.1% agree]). In contrast, considerably fewer older surgeons believed that there was adequate surgical training (59% [18.7% strongly agree, 40.2% agree]) or adequate preparation for transition to the surgery attending role (53% [16.93% strongly agree, 36.13% agree]). The two groups' responses were significantly different (χ²(3) = 15.73, p = 0.0012). Older surgeons focused considerably more on residency issues (60% vs 42%, respectively), and young surgeons focused considerably more on business and practice issues (30% vs 14%, respectively).
Young and older surgeons' perceptions of general surgery residents' readiness to practice independently after completion of general surgery residency differ significantly. Future work should focus on determination of specific efforts to improve the transition to independent surgery practice for the general surgery resident.
closed_qa
The use of systematic reviews in clinical trials and narrative reviews in dermatology: is the best evidence being used?
Systematic reviews, the most comprehensive type of literature review, should be taken into account before a clinical trial or a narrative review on a topic is undertaken. The objective of this study was to describe the use of systematic reviews in clinical trials and narrative reviews in dermatology. This was a descriptive cross-sectional study. We selected randomized clinical trials and narrative reviews from the dermatological clinical research journals identified as most important (according to impact factor) and from Actas Dermosifiliográficas, and studied the bibliographies to ascertain whether the authors made reference to existing systematic reviews and Cochrane reviews. Of the 72 clinical trials for which a systematic review was available, 24 (33.3%) cited at least 1 review; reference was made to relevant Cochrane reviews in 15.6% of cases and to non-Cochrane reviews in 32%. In the case of the 24 narrative reviews for which a review was available, 10 (41.7%) cited at least 1 review; Cochrane reviews were cited in 20% and non-Cochrane reviews in 35.3%. In the case of Actas Dermosifiliográficas, very few clinical trials were found and the findings for narrative review articles were similar to those observed for the other journals.
Systematic reviews are not often taken into account by the authors of clinical trials and narrative reviews and this may lead to redundant studies and publications. Authors appear to use Cochrane reviews even less than non-Cochrane reviews and are therefore ignoring one of the main sources of available evidence.
closed_qa
Quality of life after complete lymphadenectomy for vulvar cancer: do women prefer sentinel lymph node biopsy?
Leg lymphoedema occurs in up to 60% of women after a complete inguinal-femoral lymphadenectomy for vulvar cancer. To avoid lymphoedema, sentinel lymph node biopsy has become the preferred method of staging. However, false-negative results may influence survival, making the sentinel node procedure unacceptable to many fully informed women. The aims of this study were to measure the quality of life (QoL) in women after a complete lymphadenectomy for vulvar cancer and to quantify the risk to survival these women would be prepared to take with sentinel node biopsy. Sixty women who had a complete lymphadenectomy for early-stage vulvar cancer participated in structured interviews. The severity of lymphoedema symptoms was recorded. The QoL-adjusted survival was measured using the Utility-Based Questionnaire-Cancer, a cancer-specific validated QoL instrument. The women stated their preference for sentinel node biopsy or complete lymphadenectomy. A "standard-gamble" preference table was used to quantify the degree of risk to survival they would take to avoid lymphoedema. Seventy-three percent of women reported lymphoedema after complete lymphadenectomy. Women with lymphoedema or leg pain had significantly worse scores for QoL in terms of social activity as well as physical and sexual function. Overall, 80% of women would choose complete lymphadenectomy rather than sentinel node biopsy if the risk of missing a positive lymph node was higher than 1 in 100, but if the risk of missing a positive lymph node was lower than 1 in 100, almost one third of the women would prefer sentinel node biopsy.
Although women treated for early-stage vulvar cancer report reduced QoL after complete lymphadenectomy, most would choose complete lymphadenectomy over sentinel node biopsy. However, there is an individual level of risk that each woman can define with regard to her preference for the sentinel node procedure. Women with early-stage vulvar cancer should be offered an informed choice between complete lymphadenectomy or sentinel node biopsy.
closed_qa
Do Intraoperative LIV-Tilt and Disk Angle Remain Stable at 2-year Follow-up Compared With Upright Radiographs in Patients With Idiopathic Scoliosis?
This study was a retrospective chart and radiographic review. The aim of this study was to determine if lowest instrumented vertebra (LIV) tilt and disk wedging measured intraoperatively correlated to their respective values on standing radiographs at intermediate follow-up. No guidelines exist regarding an acceptable intraoperative LIV-tilt. After IRB approval, a consecutive series of patients with adolescent idiopathic scoliosis (AIS) and structural lumbar curves treated with posterior spinal fusion (PSF) at a single institution between 2007 and 2010 was identified. A total of 163 patients with AIS underwent PSF during this time period. Seventeen patients had fusion of structural lumbar curves with adequate imaging and a minimum 2-year follow-up. The LIV-tilt and disk angle below the LIV were measured on the preoperative standing, intraoperative supine fluoroscopy, and postoperative standing radiographs, and coronal balance was measured on the preoperative and postoperative standing radiographs using a standardized method separately by 2 authors. The curve distribution was as follows: Lenke 3 (29%), Lenke 5 (47%), and Lenke 6 (24%). There was agreement on radiographic measurements between the 2 authors, with a correlation coefficient of 0.98 for coronal balance, 0.91 for LIV-tilt, and 0.65 for disk angle. LIV-tilt improved from 19.4 degrees preoperatively to 3.6 degrees intraoperatively. At minimum 2-year follow-up, LIV-tilt had on average progressed to 8.6 degrees. The disk angle improved from 5.4 degrees preoperatively to 2.5 degrees intraoperatively. This improvement was maintained at 2 years (2.8 degrees). Coronal balance also improved during the postoperative period, from 17.9 mm immediately following surgery to 11.1 mm at the last follow-up.
Compared with prone intraoperative fluoroscopic images, disk wedging below the LIV remains stable at 2 years postsurgery on standing radiographs in patients with AIS undergoing PSF, including structural lumbar curves, whereas LIV-tilt improvement is not maintained. Intraoperative fluoroscopy provides a reliable prediction of the disk wedging below the LIV seen on standing radiographs 2 years after surgery.
closed_qa
Focal sclerosis of semicircular canals with severe DFNA9 hearing impairment caused by a P51S COCH-mutation: is there a link?
Focal sclerosis of one or more semicircular canals on computed tomographic (CT) scans and a corresponding signal loss on magnetic resonance (MR) imaging are radiologic lesions that are linked to patients who are suffering from advanced otovestibular impairment caused by hereditary DFNA9 hearing loss. DFNA9 is a hereditary hearing loss that is characterized by late-onset progressive imbalance and hearing deterioration, caused by mutations in the COCH gene. To date, no radiologic lesions have been associated with this condition. In this retrospective chart review conducted at a tertiary referral center, the radiologic data of 9 patients who presented between 2007 and 2012 with otovestibular deterioration caused by a mutation in the COCH gene were reviewed. All 9 subjects were carriers of the same c.151C>T, p.Pro51Ser (P51S) missense mutation in the COCH gene. In 8 of them, similar sclerotic lesions and/or narrowing were demonstrated in one or more semicircular canals on CT scans, with a signal loss at corresponding areas on T2-weighted MR images. In 1 patient, the posterior part of the vestibule was also affected. The posterior semicircular canals (PSCC) were affected in most cases (58%), compared with the superior (21%) and lateral canals (16%) or the vestibule (5%). Only 68.4% of the lesions on MR images were also visible on CT scans, suggesting a fibrotic process without calcification. Ears presenting radiologic lesions showed significantly more severe hearing loss (median PTA 104 dB HL) compared with unaffected ears (58 dB HL).
Eight of 9 subjects with the same P51S mutation in the COCH gene showed similar radiologic lesions, affecting the PSCC in the majority of the cases. These radiologic abnormalities occurred in more advanced stages of the otovestibular deterioration, supporting the hypothesis that these lesions might represent the end phase of a low-grade chronic inflammation or protein deposition. A new phenotypic and characteristic radiologic feature of DFNA9 has been discovered.
closed_qa
Do staff nurse perceptions of nurse leadership behaviors influence staff nurse job satisfaction?
Nurse managers' leadership behaviors influence the job satisfaction of staff nurses. Transformational leadership is 1 of the 5 components associated with the Magnet Recognition Program®. The aim of this study was to examine the relationship between staff nurse perception of nurse manager leadership behavior and staff nurse job satisfaction in a hospital on the Magnet® journey, and the influence of nurse manager leadership style on staff nurse job satisfaction. A descriptive, correlational design using a self-report survey with convenience sampling was used for this quantitative research study. Staff nurses completed the Multifactor Leadership Questionnaire 5X Short Form, the Abridged Job Descriptive Index survey, and a demographic questionnaire. Pearson correlations and regression analyses were completed to explore the relationship and influence of nurse manager leadership style on staff nurse job satisfaction. Transformational and transactional leadership styles of nurse managers were positively related to staff nurse overall job satisfaction and satisfaction with opportunity for promotion. Passive-avoidant leadership style of nurse managers was negatively related to staff nurse satisfaction with work, promotion, supervision, and coworkers. Satisfaction with nurse manager leadership was a positive influence on overall nurse job satisfaction when separately controlling for the influence of each leadership style.
Transformational and transactional leadership styles should be taught and encouraged among nurse managers to positively influence the job satisfaction of staff nurses.
closed_qa
Is there a need for repetition of skin test in childhood allergic diseases?
Skin prick tests are widely used to determine sensitivity in allergic diseases. There is limited information about the natural history of skin sensitization tests and factors that affect them. We aimed to determine the changes in skin test results, and the factors affecting skin test reactivity, after a period of approximately four years in children with allergic disease. The SPTs of 170 of the 2485 children with asthma and/or allergic rhinitis and/or atopic dermatitis who underwent SPT between 2005 and 2007 were repeated after an interval of at least 3 years. The mean age was 10.7 ± 3.1 (5-18) years and 70% of the patients were male. In total, 66 patients (39.0% of the study population) had different skin test results at follow-up. The alterations comprised loss of sensitization in 18 patients (11%), new sensitization in 37 patients (22%), and both gained and lost sensitization in 11 patients (6%). The presence of atopy in the family, the presence of allergic rhinitis and IgE elevation significantly predicted the incidence of new sensitization. The presence of sensitization to multiple allergens significantly predicted the incidence of loss of sensitization.
Sensitization had altered in 4 of 10 children by the end of the average 4-year period. Family atopy, the presence of allergic rhinitis and elevated serum total IgE were risk factors for the development of new sensitization. On the other hand, sensitization to multiple allergens was a risk factor for the loss of sensitization.
closed_qa
Prospective evaluation of intravascular volume status in critically ill patients: does inferior vena cava collapsibility correlate with central venous pressure?
In search of a standardized noninvasive assessment of intravascular volume status, we prospectively compared the sonographic inferior vena cava collapsibility index (IVC-CI) and central venous pressures (CVPs). Our goals included the determination of CVP behavior across clinically relevant IVC-CI ranges, examination of unitary behavior of IVC-CI with changes in CVP, and estimation of the effect of positive end-expiratory pressure (PEEP) on the IVC-CI/CVP relationship. A prospective, observational study was performed in surgical/medical intensive care unit patients between October 2009 and July 2013. Patients underwent repeated sonographic evaluations of IVC-CI. Demographics, illness severity, ventilatory support, CVP, and patient positioning were recorded. Correlations were made between CVP groupings (<7, 7-12, 12-18, 19+) and IVC-CI ranges (<25, 25-49, 50-74, 75+). Comparison of CVP (2-unit quanta) and IVC-CI (5-unit quanta) was performed, followed by assessment of per-unit ΔIVC-CI/ΔCVP behavior as well as examination of the effect of PEEP on the IVC-CI/CVP relationship. We analyzed 320 IVC-CI/CVP measurement pairs from 79 patients (mean [SD] age, 55.8 [16.8] years; 64.6% male; mean [SD] Acute Physiology and Chronic Health Evaluation II, 11.7 [6.21]). Continuous data for IVC-CI/CVP correlated poorly (R = 0.177, p<0.01) and were inversely proportional, with CVP less than 7 noted in approximately 10% of the patients for IVC-CIs less than 25% and CVP less than 7 observed in approximately 85% of patients for IVC-CIs greater than or equal to 75%. Median ΔIVC-CI per unit CVP was 3.25%. Most measurements (361 of 320) were collected in mechanically ventilated patients (mean [SD] PEEP, 7.76 [4.11] cm H2O). PEEP-related CVP increase was approximately 2 mm Hg to 2.5 mm Hg for IVC-CIs greater than 60% and approximately 3 mm Hg to 3.5 mm Hg for IVC-CIs less than 30%. PEEP also resulted in lower IVC-CIs at low CVPs, which reversed with increasing CVPs. When IVC-CI was examined across increasing PEEP ranges, we noted an inverse relationship between the two variables, but this failed to reach statistical significance.
IVC-CI and CVP correlate inversely, with each 1 mm Hg of CVP corresponding to 3.3% median ΔIVC-CI. Low IVC-CI (<25%) is consistent with euvolemia/hypervolemia, while IVC-CI greater than 75% suggests intravascular volume depletion. The presence of PEEP results in 2 mm Hg to 3.5 mm Hg of CVP increase across the IVC-CI spectrum and lower collapsibility at low CVPs. Although IVC-CI decreased with increasing degrees of PEEP, this failed to reach statistical significance. While this study represents a step forward in the area of intravascular volume estimation using IVC-CI, our findings must be applied with caution owing to some methodologic limitations.
closed_qa
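The collapsibility index used throughout the preceding entry follows the standard sonographic definition, IVC-CI = (IVCmax − IVCmin)/IVCmax × 100%. A minimal sketch of that arithmetic; the diameters below are hypothetical illustrations, not measurements from the study:

```python
def ivc_collapsibility_index(d_max_cm: float, d_min_cm: float) -> float:
    """IVC-CI (%) = (maximal diameter - minimal diameter) / maximal diameter * 100."""
    if d_max_cm <= 0 or not (0 <= d_min_cm <= d_max_cm):
        raise ValueError("expected 0 <= d_min_cm <= d_max_cm with d_max_cm > 0")
    return (d_max_cm - d_min_cm) / d_max_cm * 100.0

# Hypothetical M-mode diameters over one respiratory cycle.
ci = ivc_collapsibility_index(d_max_cm=2.1, d_min_cm=0.4)
print(f"IVC-CI = {ci:.0f}%")  # ~81%: above the study's 75% threshold suggesting volume depletion
```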
Do all β-blockers attenuate the excess hematopoietic progenitor cell mobilization from the bone marrow following trauma/hemorrhagic shock?
Severe injury results in increased mobilization of hematopoietic progenitor cells (HPC) from the bone marrow (BM) to sites of injury, which may contribute to persistent BM dysfunction after trauma. Norepinephrine is a known inducer of HPC mobilization, and nonselective β-blockade with propranolol has been shown to decrease mobilization after trauma and hemorrhagic shock (HS). This study examined the role of selective β-adrenergic receptor blockade in HPC mobilization in a combined model of lung contusion (LC) and HS. Male Sprague-Dawley rats were subjected to LC, followed by 45 minutes of HS. Animals were then randomized to receive atenolol (LCHS + β1B), butoxamine (LCHS + β2B), or SR59230A (LCHS + β3B) immediately after resuscitation and daily for 6 days. Control groups were composed of naive animals. BM cellularity, %HPCs in peripheral blood, and plasma granulocyte colony-stimulating factor (G-CSF) levels were assessed at 3 hours and 7 days. Systemic plasma-mediated effects were evaluated in vitro by assessment of BM HPC growth. Injured lung tissue was graded histologically by a blinded reader. The use of β2B or β3B following LCHS restored BM cellularity and significantly decreased HPC mobilization. In contrast, β1B had no effect on HPC mobilization. Only β3B significantly reduced plasma G-CSF levels. When evaluating the plasma systemic effects, both β2B and β3B significantly improved BM HPC growth as compared with LCHS alone. The use of β2 and β3 blockade did not affect lung injury scores.
Both β2 and β3 blockade can prevent excess HPC mobilization and BM dysfunction when given after trauma and HS, and the effects seem to be mediated systemically, without adverse effects on subsequent healing. Only treatment with β3 blockade reduced plasma G-CSF levels, suggesting different mechanisms for adrenergic-induced G-CSF release and mobilization of HPCs. This study adds to the evidence that therapeutic strategies that reduce the exaggerated sympathetic stimulation after severe injury are beneficial and reduce BM dysfunction.
closed_qa
Is the postero-medial portal safe in posterior ankle arthroscopy?
Posterior ankle arthroscopy is considered to pose a risk of neurological and vascular complications. Some authors consider the postero-medial portal to be risky and recommend using only the postero-lateral portal. The aim of this study was to analyze the margin of error offered by posterior ankle arthroscopy portals. Twenty MRI studies of the ankle joint were analyzed. The paths of the postero-medial and postero-lateral portals were drawn. Next, the path of the probe was deviated to aim at the neurovascular bundle, and the angle of deviation was measured. We also analyzed the distance between the neurovascular bundle and the probe when located directly lateral to the flexor hallucis longus tendon. The mean angle of deviation leading to collision with neurovascular bundle structures was 53.3° (range 37-70°) and 30.75° (range 22-41°) for the postero-medial and postero-lateral portals, respectively (p<0.05). The mean minimal distance between the probe and the bundle was 12 mm (range 5-15 mm) and 13 mm (range 8-18 mm) for the postero-medial and postero-lateral portals, respectively (p>0.05).
1. The postero-medial arthroscopic portal is at least as safe as the postero-lateral one in posterior ankle arthroscopy. 2. Keeping instruments strictly lateral to the flexor hallucis longus tendon leaves a distance of at least 5 mm from the neurovascular bundle.
closed_qa
Do parents and children agree?
Multi-item measures of inflammatory bowel disease (IBD) activity based on clinical, laboratory, and/or endoscopic variables do not take into consideration the impact on the patients' emotional aspects and adaptation to the disease. The aim of the present study was to evaluate concordance between parent and child ratings of health-related quality of life on the IMPACT-III questionnaire in children with IBD. The IMPACT-III questionnaire was used to measure quality of life in 27 patients (mean age 14.2 ± 3 years, 40% girls) and one of their parents (82% mothers). Most of the patients had inactive disease at the time of the study. Differences between parent-proxy ratings and child ratings on the IMPACT-III were compared via paired-samples t tests, intraclass correlation coefficients, and standardized difference scores. Parent-proxy and patient ratings were similar on total IMPACT-III and its related domains (bowel symptoms, systemic symptoms, social functioning, body image, treatment/interventions), except that significant differences on emotional functioning ratings were found (P = 0.003). Intraclass correlation coefficients showed medium-to-large effect sizes (range 0.52-0.88) and standardized difference scores showed varying degrees of bias depending on the domain measured (range -0.64 to 0.32).
Parents served as a good proxy for quality-of-life ratings in this population of pediatric patients with IBD. The degree of concordance between parent and child scores, however, varied, as observed in the present study in which parents underreported their child's health-related quality of life on the IMPACT-III emotional functioning domain.
closed_qa
Is there any predictor for clinical outcome in EGFR mutant NSCLC patients treated with EGFR TKIs?
Tyrosine kinase inhibitors (TKIs) of the epidermal growth factor receptor (EGFR) have demonstrated dramatic response rates and prolonged progression-free survival (PFS) in advanced non-small-cell lung cancer (NSCLC) patients with activating EGFR mutations. However, PFS and overall survival (OS) among those patients who were treated with EGFR TKIs are inconsistent and unpredictable. In this study, we evaluated predictors of clinical outcome in EGFR mutant NSCLC patients treated with EGFR TKIs. A total of 148 patients who had metastatic or recurrent NSCLC with an activating EGFR mutation treated with either erlotinib or gefitinib as first-line (n = 10) or second- or later-line (n = 138) treatment were retrospectively reviewed. The median follow-up duration was 21.9 months (range, 1.1-62.5). The median PFS and OS for the total of 148 patients were 10.6 months (95 % CI 9.0-12.2) and 21.8 months (95 % CI 18.5-25.1), respectively. Survival outcomes were similar between first-line and later-line EGFR TKI treatment (P = 0.512 for PFS, P = 0.699 for OS). A higher number of metastatic sites (3-6 vs. 1-2) was associated with shorter PFS and OS in univariate analysis (median PFS 9.9 vs. 11.9 months, P = 0.019; median OS 16.4 vs. 22.2 months, P = 0.021), but not in multivariate analysis. In multivariate analysis of clinical and molecular markers, there were no significant differences in PFS. When PFS was dichotomized at the median of 11 months for the 105 patients treated with EGFR TKIs as second-line therapy, no significant differences in any clinical or molecular features were found between the longer and shorter PFS groups.
Despite the inconsistencies in PFS among EGFR mutant patients treated with EGFR TKIs, no significant differences in clinical features were noted, suggesting a need for a better understanding of the heterogeneity of the underlying biology.
closed_qa
Is stability of the proximal tibiofibular joint important in the multiligament-injured knee?
The incidence of proximal tibiofibular joint instability in the setting of the multiligament-injured knee has not been previously reported. The integrity of the proximal tibiofibular joint is required to perform a fibular-based, lateral-sided knee reconstruction. We report (1) the frequency of proximal tibiofibular joint instability in patients presenting with multiligament knee injuries and evaluate (2) our ability to restore stability to this joint, (3) patient-reported outcome scores, and (4) complications in patients surgically treated for proximal tibiofibular joint instability at the time of treatment of multiligament knee instability. From 2005 to 2013, 124 patients (129 knees) sustaining multiligament knee injuries with Grade 3 instability of at least two ligaments were treated at our institution. We defined proximal tibiofibular joint instability as a dislocated or dislocatable proximal tibiofibular joint at the time of surgery. These patients underwent surgery to restore proximal tibiofibular joint stability and ligament reconstruction or repair, and were followed with routine clinical examination, radiographs, and subjective outcome measures, including Lysholm and IKDC scores. Minimum follow-up was 12 months (mean, 32 months; range, 12-61 months). Twelve knees (12 patients, 9% of 129 knees) showed proximal tibiofibular joint instability. Knee stability in 10 patients was restored to Grade 1 or less in all surgically treated ligaments. No proximal tibiofibular joint instability has recurred. No patients have complained of ankle stiffness or pain. In the ten patients with subjective scores, the mean Lysholm score was 75 (range, 54-95) and the mean IKDC score was 58 (range, 22-78). There were four complications: one failed posterolateral corner reconstruction, one proximal tibiofibular joint screw removal secondary to pain over the screw head, one deep infection treated with serial irrigation and débridements with graft retention, and one closed manipulation secondary to arthrofibrosis and loss of ROM.
In the setting of multiligament-injured knees, our series demonstrated a 9% incidence of proximal tibiofibular joint instability. The technique we describe successfully restored stability to the proximal tibiofibular joint and resulted in satisfactory patient-reported outcomes with low complication rates.
closed_qa
Does intraarticular inflammation predict biomechanical cartilage properties?
Intact cartilage in the lateral compartment is an important requirement for medial unicompartmental knee arthroplasty (UKA). Progression of cartilage degeneration in the lateral compartment is a common failure mode of medial UKA. Little is known about factors that influence the mechanical properties of lateral compartment cartilage. The purposes of this study were to answer the following questions: (1) Does the synovial fluid white blood cell count predict the biomechanical properties of macroscopically intact cartilage of the distal lateral femur? (2) Is there a correlation between MRI grading of synovitis and the biomechanical properties of macroscopically intact cartilage? (3) Is there a correlation between the histopathologic assessment of the synovium and the biomechanical properties of macroscopically intact cartilage? The study included 84 patients (100 knees) undergoing primary TKA for varus osteoarthritis between May 2010 and January 2012. All patients underwent preoperative MRI to assess the degree of synovitis. During surgery, the cartilage of the distal lateral femur was assessed macroscopically using the Outerbridge grading scale. In knees with an Outerbridge grade of 0 or 1, osteochondral plugs were harvested from the distal lateral femur for biomechanical and histologic assessment. The synovial fluid was collected to determine the white blood cell count. Synovial tissue was taken for histologic evaluation of the degree of synovitis. The mean aggregate modulus and the mean dynamic modulus were significantly greater in knees with 150 or fewer white blood cells/mL of synovial fluid than in knees with more than 150 white blood cells/mL of synovial fluid. There was no correlation among MRI synovitis grades, histopathologic synovitis grades, and biomechanical cartilage properties.
The study suggests that lateral compartment cartilage in patients with elevated synovial fluid white blood cell counts has a reduced ability to withstand compressive loads.
closed_qa
Is temporary employment a risk factor for work disability due to depressive disorders and delayed return to work?
Research on temporary employment as a risk factor for work disability due to depression is mixed, and few studies have measured the work disability outcome in detail. We separately examined the associations of temporary employment with (i) the onset of work disability due to depression, (ii) the length of disability episodes, and (iii) the recurrence of work disability, taking into account the possible effect modification of sociodemographic factors. We linked the prospective cohort study data of 107 828 Finnish public sector employees to national registers on work disability (>9 days) due to depression from January 2005 to December 2011. Disability episodes were longer among temporary than permanent employees after adjustment for age, sex, level of education, chronic somatic disease, and history of mental/behavioral disorders [cumulative odds ratio (COR) 1.37, 95% confidence interval (95% CI) 1.25-1.51]. The association between temporary employment and the length of depression-related disability episodes was more pronounced among participants with a low educational level (COR 1.95, 95% CI 1.54-2.48) and older employees (>52 years; COR 3.67, 95% CI 2.83-4.76). The association was weaker in a subgroup of employees employed for ≥ 50% of the follow-up period (95% of the original sample). Temporary employment was not associated with the onset or recurrence of depression-related work disability.
Temporary employment is associated with slower return to work, indicated by longer depression-related disability episodes, especially among older workers and those with a low level of education. Continuous employment might protect temporary employees from prolonged work disability.
closed_qa
Is a complete blood cell count useful in determining the prognosis of pulmonary embolism?
Pulmonary embolism (PE) is the third cardiovascular cause of hospital admission, following acute coronary syndrome and stroke. Despite high-tech diagnostic methods and new treatment modalities, PEs continue to have a high mortality rate within the first 3 months. This study was designed to assess the additional prognostic value of a complete blood cell count, renal function markers, C-reactive protein, and the simplified pulmonary embolism severity index (sPESI) scoring system for PE 100-day mortality. The study retrospectively enrolled 208 consecutive patients who were hospitalized with the diagnosis of an acute PE. The patients' demographic characteristics and clinical and laboratory parameters were recorded from the hospital electronic database and patients' case notes. The primary end point of the study was an adverse 100-day outcome, defined as death from any cause. The all-cause mortality in the first 100 days was 14.42%. The mean age was 57.87 ± 18.17 (range: 16-93) years. We included 79 (38%) male and 129 (62%) female individuals. On multivariate regression analysis, red cell distribution width (RDW) and sPESI were statistically significant predictors of PE mortality, with RDW associated with a 4.08-fold (95% confidence interval: 1.229-13.335, P = 0.021) increase in PE mortality.
The results of this study demonstrated that RDW and sPESI may be a useful guide in predicting 100-day mortality. The elevated RDW may alert physicians to possible poor prognosis.
closed_qa
Does gastric acid suppression affect sunitinib efficacy in patients with advanced or metastatic renal cell cancer?
Renal cell cancer is a chemotherapy-insensitive cancer treated by vascular endothelial growth factor receptor antagonists. Recently, a question has arisen on whether there is an interaction between tyrosine kinase inhibitors, such as sunitinib, and acid suppressing agents. A retrospective chart review was conducted for patients at two tertiary care centers who received sunitinib between 1 January 2006 and 31 March 2013. Medication dispensing records were obtained using electronic systems and a province-wide electronic health records database. A univariate Cox's proportional hazard model determined if acid suppression had effects on progression-free survival and overall survival. Of 383 patient charts reviewed, 231 were included in the study. Patients on intermittent acid suppression, those lost to follow-up, and those who received sunitinib for less than one week were excluded from the study. The median age of the study population was 65. Patients who received no acid suppression (n = 186) had a median progression-free survival of 23.6 weeks (95% CI, 19.0-31.9 weeks) and patients who received continuous acid suppression (n = 45) had a median progression-free survival of 18.9 weeks (95% CI, 11.0-23.7 weeks; p = 0.04). A median overall survival of 62.4 weeks (95% CI, 42.0-82.7 weeks) was observed in the group with no acid suppression, while a median overall survival of 40.9 weeks (95% CI, 26.1-74.4 weeks) was observed in the continuous acid suppression group (p = 0.02).
There was a significant difference in progression-free survival and overall survival between the acid suppressed and no acid suppression groups. Further research is required to confirm this potential interaction.
closed_qa
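The univariate Cox model named in the preceding entry can be sketched with the `lifelines` package; the data frame below is invented toy data, and the column names are assumptions for illustration, not the study's dataset:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy data: pfs_weeks = time to progression or censoring,
# progressed = 1 if progression was observed, acid_suppression = 1 if continuous use.
df = pd.DataFrame({
    "pfs_weeks":        [23.6, 18.9, 31.9, 11.0, 40.2, 9.5, 27.1, 15.3],
    "progressed":       [1, 1, 0, 1, 0, 1, 1, 1],
    "acid_suppression": [0, 1, 0, 1, 0, 1, 0, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="pfs_weeks", event_col="progressed")
cph.print_summary()  # hazard ratio for acid_suppression with its 95% CI and p-value
```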
A CT-based Medina classification in coronary bifurcations: does the lumen assessment provide sufficient information?
To evaluate the distribution of atherosclerosis at bifurcations with computed tomography coronary angiography (CTCA) and propose a novel CT-Medina classification for bifurcation lesions. In 26 patients (age 55 ± 10 years, 81% male) imaged with CTCA, 39 bifurcations were studied. The bifurcation analysis included the proximal main vessel, the distal main vessel and the side branch (SB). Plaque contours were manually traced on CTCA; the lumen, vessel and plaque area were measured, as well as plaque burden (%). The carina cross-sections were divided into four equal parts according to the expected wall shear stress (WSS) to assess circumferential plaque distribution. All the bifurcation lesions were classified using the Medina classification and a novel CT-Medina classification combining lumen narrowing and plaque burden ≥70%. Presence of severe plaque (plaque burden ≥70%) by CTCA was demonstrated in 12.8% (5/39) of the proximal segments, 15.4% (6/39) of the distal segments and 7.7% (3/39) of the SB segments. The thickest plaque was located more often in low-WSS parts of the carina cross-sections, whereas the flow divider was rarely affected. Although plaque was present in the majority of bifurcations, 92% of the assessed bifurcations were identified as 0,0,0 on the Medina classification. Characterization of bifurcation lesions using the new CT-Medina classification provided additional information in seven cases (18%) compared to the Medina classification.
Atherosclerotic plaque is widely present in all bifurcation segments, even in the absence of coronary lumen stenosis. A CT-Medina classification combining lumen and plaque parameters is more informative than angiographic classification of bifurcation lesions and could potentially facilitate the decision-making on the treatment of these lesions.
closed_qa
Is there B cell involvement in a rat model of spontaneous idiopathic nephrotic syndrome treated with LF15-0195?
The Buffalo/Mna (Buff/Mna) rat spontaneously develops idiopathic nephrotic syndrome (INS), and its nephropathy recurs after the renal transplantation of a healthy graft. Only LF15-0195 is able to cause regression of the Buff/Mna nephropathy and to induce regulatory T cells, which decrease proteinuria when transferred into proteinuric Buff/Mna rats. Based on previous research on B cells in human INS, we evaluated the involvement of B cells in our model and the impact of LF15-0195. We studied the effect of LF15-0195 on peripheral B cells by flow cytometry and quantitative reverse transcription-polymerase chain reaction. B cells were purified from LF15-0195-treated Buff/Mna rats in remission, and transferred into proteinuric Buff/Mna rats. We treated the Buff/Mna rats with mitoxantrone and measured the depletion of B/T cells in parallel with proteinuria. LF15-0195 changed the phenotype of B cells: the number of naïve mature B cells increased significantly, while the number of switched, transitional 1, and transitional 2 B cells decreased. There were no changes in the amount of memory, activated or regulatory B cells. We observed a significant increase of immunoglobulin (Ig)M mRNA transcripts in the LF15-0195-treated Buff/Mna B cells compared to controls, but no difference in the level of IgG. This profile is consistent with a block in B cell maturation at the IgM to IgG switch. The transfer of B cells from LF15-0195-treated rats into proteinuric Buff/Mna rats did not have an effect on proteinuria. Mitoxantrone, despite causing a significant depletion of B cells, did not reduce proteinuria.
Despite LF15-0195 acting on B cells, the beneficial effects of this drug on nephrotic syndrome did not involve the induction of regulatory B cells. Moreover, the B cell depletion was not effective in reducing proteinuria, indicating that B cells are not a therapeutic target.
closed_qa
Does gestational diabetes history increase epicardial fat and carotid intima media thickness?
Gestational diabetes mellitus (GDM) is defined as glucose intolerance that has begun during pregnancy. Recent studies have shown that atherosclerosis may develop in this population even in the absence of type 2 diabetes. Epicardial fat thickness (EFT) has recently been used as a surrogate marker for the assessment of atherosclerosis. In this study, we aimed to test whether women with a GDM history have higher EFT levels than women without a GDM history. Sixty-two patients with previous GDM and 33 age- and sex-matched controls were enrolled. Epicardial fat thickness was measured with transthoracic echocardiography, and carotid intima media thickness (c-IMT) was measured with ultrasound. Insulin resistance (IR) of each subject was assessed with the homeostasis model of assessment-insulin resistance (HOMA-IR). Carotid IMT and EFT were significantly higher in the previous GDM group than in controls. Serum gamma-glutamyl transferase (GGT), uric acid, and high-sensitivity C-reactive protein (hs-CRP) levels were also found to be significantly higher in the patients with previous GDM as compared to the controls. We observed that carotid IMT (β = 0.310, P = 0.003), total cholesterol (β = 0.315, P = 0.002), BMI (β = 0.308, P = 0.002), HbA1c (β = 0.227, P = 0.018), and HOMA-IR (β = 0.184, P = 0.049) were independently correlated with EFT.
Although the number of patients included in this study is limited, high EFT results may indicate presence of atherosclerosis in women with previous GDM.
closed_qa
Is there a trend of decreasing prevalence of TMD-related symptoms with ageing among the elderly?
Older adults have not been studied as much as younger ones regarding the prevalence of TMD-related symptoms. The aim was to assess the prevalence of TMD-related symptoms in two population samples, 70 and 80 years old. Identical questionnaires were sent in 2012 to all subjects born in 1932 and 1942 living in two Swedish counties. The response rate was 70.1%, resulting in samples of 5697 70-year-old and 2922 80-year-old subjects. The questionnaire comprised 53 questions. Answers to questions on problems regarding TMD-related symptoms and awareness of bruxism were analysed. Twelve per cent of the women and 7% of the men in the 70-year-old group reported some, rather great or severe problems regarding TMD pain. In the 80-year-olds the prevalence was 8% and 7%, respectively. Subjects who had problems with TMJ sounds reported difficulty opening the jaw wide 6 times more frequently, and TMJ pain 10-13 times more frequently, than subjects without such problems. Changes of taste and awareness of bruxism were the only variables significantly associated with TMD symptoms in both age groups. Number of teeth was not significantly associated with any of the TMD-related symptoms.
Most of the elderly subjects had no severe problems with TMD-related symptoms, but 12% of the 70-year-old women reported some, rather great or severe problems. The marked gender difference at age 70 had disappeared in the 80-year-old group. The prevalence was lower among the 80- compared with the 70-year-old subjects of both sexes. The results support the comorbidity between TMD-related symptoms and general health problems.
closed_qa
Does caries risk assessment predict the incidence of caries for special needs patients requiring general anesthesia?
The aim of this study was to correlate the caries-related variables of special needs patients to the incidence of new caries. Data for socio-demographic information and dental and general health status were obtained from 110 patients treated under general anesthesia because of their insufficient co-operation. The Cariogram program was used for risk assessment and other caries-related variables were also analyzed. Within a defined follow-up period (16.3 ± 9.5 months), 64 patients received dental examinations to assess newly developed caries. At baseline, the mean (SD) values of the DMFT (decayed, missing and filled teeth) and DT (decayed teeth) for the total patients were 9.2 (6.5) and 5.8 (5.3), respectively. During the follow-up period, new caries occurred in 48.4% of the patients and the mean value (SD) of the increased DMFT (iDMFT) was 2.1 (4.2). The patients with a higher increment of caries (iDMFT ≥3) showed significantly different caries risk profiles compared to the other patients (iDMFT<2) (p<0.05). Close correlations existed between the caries increment and several caries-related variables; baseline DMFT, insufficient self-tooth-brushing and malocclusion were greatly associated with new caries development.
Caries risk assessment could predict the incidence of future caries in hospital-based dentistry. Past caries experience and inadequate oral hygiene maintenance were largely related to caries development in special needs patients.
closed_qa
Prevention of delirium in trauma patients: are we giving thiamine prophylaxis a fair chance?
Delirium is associated with increased morbidity and mortality in injured patients. Wernicke encephalopathy (WE) is delirium linked to malnutrition and chronic alcoholism. It is prevented with administration of thiamine. Our primary goal was to evaluate current blood alcohol level (BAL) testing and thiamine prophylaxis in severely injured patients. We retrospectively reviewed the cases of 1000 consecutive severely injured patients admitted to hospital between Mar. 1, 2009, and Dec. 31, 2009. We used the patients' medical records and the Alberta Trauma Registry. Among 1000 patients (mean age 48 yr, male sex 70%, mean injury severity score 23, mortality 10%), 627 underwent BAL testing at admission; 221 (35%) had a BAL greater than 0 mmol/L, and 189 (30%) had a BAL above the legal limit of 17.4 mmol/L. The mean positive BAL was 41.9 mmol/L. More than 4% had a known history of alcohol abuse. More patients were assaulted (20% v. 9%) or hit by motor vehicles (10% v. 6%) when intoxicated (both p<0.05). Most injuries occurred after falls (37%) and motor vehicle collisions (33%). Overall, 17% of patients received thiamine prophylaxis. Of the 221 patients with elevated BAL, 44% received thiamine prophylaxis. Of those with a history of alcohol abuse, 77% received thiamine prophylaxis.
Despite the strong link between alcohol abuse, trauma and WE, more than one-third of patients were not screened for alcohol use. Furthermore, a minority of intoxicated patients received adequate prophylaxis against WE. Given the low risk and cost of BAL testing and thiamine prophylaxis and the high cost of delirium, standard protocols for prophylaxis are essential.
closed_qa
Anesthesia for bariatric surgery: 8-year retrospective study: are our patients now easier to manage?
To review the perioperative management of patients who had undergone bariatric surgery in our institution during an 8-year period, with the aim of identifying variables that correlated with improved clinical outcomes and changes in perioperative practice. This was a retrospective observational study of 437 patients who had undergone bariatric surgery from January 2005 to June 2013. Of these patients, 163 had undergone open or laparoscopic biliopancreatic diversion (Group 1), and 274 had been managed according to a Tailored Laparoscopic Approach Program (TLAP) (Group 2). We analyzed major cardiocirculatory, pulmonary, and surgery-related complications, mortality rate, intensive care unit (ICU) admissions, post-anesthetic care unit (PACU) length of stay, and perioperative management standards, throughout the study period. Changes were observed in anesthetic patterns and perioperative care standards during the study period: 25% of patients had combined epidural anesthesia in 2005, compared with none at present; ICU admissions decreased from 28.6% in 2005 to 3.1% at present; and time in PACU declined from a median of 23 h in 2005 to 5.12 h at present. Duration of postoperative opioid therapy was also significantly reduced (from 48 h to 6 h). Group 2 had a significantly lower mortality rate than Group 1 (0.37% versus 4.3%, respectively, P=0.004).
In our institution, adoption of a TLAP for bariatric surgery has led to changes in perioperative care standards that have been followed by clear improvements in morbidity, mortality and management indicators.
closed_qa
Is the Cambridge Cognitive Examination - revised a good tool for detection of dementia in illiterate Brazilian older adults?
Few studies have been published on the use of the Cambridge Cognitive Examination Test - Revised (CAMCOG-R) for cognitive assessment of low educational level older adults. The aim of the present study was to determine the accuracy of the Brazilian version of the CAMCOG-R (Br-CAMCOG-R) within a sample of low educational level and illiterate older adults. The Br-CAMCOG-R was administered to outpatients in a public geriatric clinic. The diagnosis of dementia was based on the Diagnostic and Statistical Manual of Mental Disorders Fourth Edition criteria. Receiver operating characteristic (ROC) curves were plotted, and the best trade-offs between sensitivities and specificities were calculated. A total of 189 participants were evaluated. The mean age was 77 ± 6.9 years. The mean educational level was 3.1 ± 2.2 years. The mean test score was 66.5 ± 13.1 points; there were 56 (29.6%) participants with dementia. The best cut-off score for illiterate participants was 50/51; sensitivity, specificity and area under the curve (AUC) were 69%, 69% and 0.75, respectively; for participants with a low educational level, the best cut-off point was 60/61; the sensitivity, specificity and AUC were 83%, 85% and 0.93, respectively; for participants with a middle educational level, the best cut-off point was 69/70; the sensitivity, specificity and AUC were 90%, 76% and 0.91, respectively.
The Br-CAMCOG-R was useful for identifying cases of dementia among older adults with middle and low levels of literacy, but inadequate for the illiterate individuals.
closed_qa
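Cutoffs such as 60/61 in the entry above are conventionally chosen from the ROC curve by maximizing Youden's J (sensitivity + specificity − 1). A sketch of that procedure with scikit-learn on synthetic scores; the Br-CAMCOG-R data are not reproduced here, only the entry's 56 dementia / 133 non-dementia group sizes:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# Synthetic test scores: the dementia group scores lower, as on the Br-CAMCOG-R.
y_true = np.r_[np.ones(56), np.zeros(133)]                      # 1 = dementia
y_score = np.r_[rng.normal(55, 10, 56), rng.normal(70, 10, 133)]

# roc_curve treats higher scores as "more positive", so negate (dementia = low score).
fpr, tpr, thresholds = roc_curve(y_true, -y_score)
j = tpr - fpr                                                    # Youden's J per candidate cutoff
best = j.argmax()
print(f"AUC = {roc_auc_score(y_true, -y_score):.2f}")
print(f"best cutoff: score <= {-thresholds[best]:.0f} "
      f"(sens {tpr[best]:.2f}, spec {1 - fpr[best]:.2f})")
```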
Is dexmedetomidine superior to midazolam as a premedication in children?
In the current published literature, there are controversial results regarding the effectiveness of dexmedetomidine compared with midazolam as premedication in children. The aim of this meta-analysis was to compare the use of dexmedetomidine as a premedication in pediatric patients with that of midazolam. We searched for articles published in English that matched the key words 'dexmedetomidine', 'midazolam', and 'children' in the PubMed, Cochrane Library, Ovid, and Google Scholar databases. Additional studies were identified from the reference lists of the retrieved articles. Only prospective randomized controlled trials (RCTs) that compared the use of dexmedetomidine and midazolam as premedications in children were included. The extraction of data from the articles was performed independently by two authors using a predesigned Excel spreadsheet. The relative risks (RRs), weighted mean differences (WMDs), and their corresponding 95% confidence intervals (95% CIs) were calculated for dichotomous and continuous outcome data using the quality effects model of the MetaXL version 1.3 software. Eleven prospective RCTs (829 children) met our criteria. Compared with midazolam, dexmedetomidine premedication was associated with more satisfactory sedation upon parent separation (eight RCTs [679 children]; RR: 1.25; 95% CI: 1.06, 1.46) and upon mask acceptance (seven RCTs [559 children]; RR: 1.17; 95% CI: 1.01, 1.36). During the postoperative period, premedication with dexmedetomidine lowered the number of requests for rescue analgesia (six RCTs [477 children]; RR: 0.55; 95% CI: 0.40, 0.74) and lowered the risks of agitation or delirium (seven RCTs [466 children]; RR: 0.59; 95% CI: 0.40, 0.88) and shivering (three RCTs [192 children]; RR: 0.33; 95% CI: 0.18, 0.61). However, dexmedetomidine premedication reduced systolic blood pressure (three RCTs [242 children]; WMD: -11.47 mm Hg; 95% CI: -13.95, -8.98), mean blood pressure (three RCTs [202 children]; WMD: -5.66 mm Hg; 95% CI: -8.89, -2.43), and heart rate (six RCTs [444 children]; WMD: -12.71 beats·min(-1); 95% CI: -14.80, -10.62), and prolonged the onset of sedation (two RCTs [132 children]; WMD: 13.78 min; 95% CI: 11.33, 16.23; I² = 0%) relative to midazolam.
This meta-analysis demonstrated that dexmedetomidine premedication is superior to midazolam premedication in terms of producing satisfactory sedation upon parent separation and mask acceptance. Dexmedetomidine premedication provides clinical benefits that include reducing the requirement for rescue analgesia and reducing agitation or delirium and shivering during the postoperative period. However, the risks of decreased heart rate and blood pressure, as well as the prolonged onset of sedation associated with dexmedetomidine, should be considered.
closed_qa
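Each pooled relative risk in the entry above is built from per-trial 2×2 tables, and the 95% CI for a single trial's RR uses the standard log-normal approximation. A sketch with invented counts (the abstract does not report the individual trial tables):

```python
import math

def relative_risk(events_tx: int, n_tx: int, events_ctl: int, n_ctl: int, z: float = 1.96):
    """Relative risk with a 95% CI via the log-normal approximation."""
    rr = (events_tx / n_tx) / (events_ctl / n_ctl)
    se_log_rr = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctl - 1 / n_ctl)
    half_width = z * se_log_rr
    return rr, rr * math.exp(-half_width), rr * math.exp(half_width)

# Invented counts: satisfactory sedation at parent separation, dexmedetomidine vs midazolam.
rr, lo, hi = relative_risk(events_tx=45, n_tx=50, events_ctl=36, n_ctl=50)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```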
Is antenatal care preparing mothers to care for their newborns?
Neonatal mortality has remained resistant to change in the wake of declining child mortality. Suboptimal newborn care practices are predisposing factors to neonatal mortality. Adherence to four ANC consultations is associated with improved newborn care practices. There is limited documentation of this evidence in sub-Saharan Africa, where suboptimal newborn care practices have been widely reported. Structured interviews were held with 928 women with children under five months old at their homes in Masindi, Uganda, from October-December 2011. Four or more ANC consultations (sufficient ANC) were considered the exposure variable. Three composite variables (complete cord care, complete thermal care and complete newborn vaccination status) were derived by combining related practices from a list of recommended newborn care practices. Logistic regression models were used to assess associations. One in five women (220; 23.7%) was assessed as practicing complete cord care. Fewer than ten percent (57; 6.1%) were considered to practice complete thermal care, and 611 (65.8%) were assessed as having complete newborn vaccination status. Application of substances to the cord (744; 71.6%) and early bathing (816; 87.9%) were the main drivers of suboptimal newborn care practices. Multivariable logistic models did not demonstrate a significant association between four or more ANC consultations and complete cord care, complete thermal care or complete newborn vaccination status. Secondary or higher education was associated with complete cord care [adjusted Odds Ratio (aOR): 2.72; 95% CI: 1.63-4.54] and complete newborn vaccination [aOR: 1.37; 95% CI: 1.04-1.82]. Women who reported health facility delivery were more likely to report complete thermal care [aOR: 3.63; 95% CI: 2.21-5.95] and newborn vaccination [aOR: 1.84; 95% CI: 1.23-2.75], but not complete cord care. Having a first baby was associated with complete thermal care [aOR: 2.00; 95% CI: 1.24-3.23].
Results confirm suboptimal newborn care practices in Masindi. Despite being established policy, adherence to four or more ANC consultations was not associated with complete cord care, complete thermal care or complete newborn vaccination. This finding has important implications for the implementation of focused ANC to improve newborn care practices. Future ANC interventions should focus on addressing the application of substances to the cord and early bathing of the baby during the immediate neonatal period.
closed_qa
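The adjusted odds ratios (aORs) in the entry above come from multivariable logistic regression: exponentiating the fitted coefficients yields the aORs. A sketch with statsmodels on invented data; the variable names are assumptions for illustration, not the study's coding scheme:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy binary data standing in for the survey variables.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "complete_cord_care":  rng.integers(0, 2, 200),
    "secondary_education": rng.integers(0, 2, 200),
    "facility_delivery":   rng.integers(0, 2, 200),
})

model = smf.logit("complete_cord_care ~ secondary_education + facility_delivery", data=df).fit()
odds_ratios = np.exp(model.params)    # adjusted odds ratios
ci = np.exp(model.conf_int())         # 95% CIs on the OR scale
print(pd.concat([odds_ratios.rename("aOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```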
Can senior volunteers deliver reminiscence and creative activity interventions?
Palliative care patients and their family caregivers may have a foreshortened perspective of the time left to live, or the expectation of the patient's death in the near future. Patients and caregivers may report distress in physical, psychological, or existential/spiritual realms. To conduct a randomized controlled trial examining the effectiveness of retired senior volunteers (RSVs) in delivering a reminiscence and creative activity intervention aimed at alleviating palliative care patient and caregiver distress. Of the 45 dyads that completed baseline assessments, 28 completed postintervention and 24 completed follow-up assessments. The intervention group received three home visits by RSVs; control group families received three supportive telephone calls from the research staff. Measures included symptom assessment and associated burden, depression, religiousness/spirituality, and meaning in life. Patients in the intervention group reported a significantly greater reduction in the frequency of emotional symptoms (P=0.02) and emotional symptom bother (P=0.04) than the control group, as well as improved spiritual functioning. Family caregivers in the intervention group were more likely than control caregivers to endorse items on the Meaning of Life Scale (P=0.02). Only the improvement in intervention patients' emotional symptom bother was maintained at follow-up after discontinuing RSV contact (P=0.024).
Delivery of the intervention by RSVs had a positive impact on palliative care patients' emotional symptoms and burden and caregivers' meaning in life. Meaningful prolonged engagement with palliative care patients and caregivers, possibly through alternative modes of treatment delivery such as continued RSV contact, may be necessary for maintenance of therapeutic effects.
closed_qa
β-adrenergic blockade combined with subcutaneous B-type natriuretic peptide: a promising approach to reduce ventricular arrhythmia in heart failure?
Clinical studies failed to convincingly prove the efficacy of intravenous infusion of nesiritide in heart failure, and evidence suggested a pro-adrenergic action of B-type natriuretic peptide (BNP). However, subcutaneous BNP therapy was recently proposed in heart failure, thus raising new perspectives over what was considered a promising treatment. We tested the efficacy of a combination of the oral β1-adrenergic receptor blocker metoprolol and subcutaneous BNP infusion in decompensated heart failure. The effects of metoprolol and/or BNP were studied on cardiac remodelling, excitation-contraction coupling and arrhythmias in an experimental mouse model of ischaemic heart failure following myocardial infarction. We determined the cellular and molecular mechanisms involved in the anti-remodelling and antiarrhythmic actions. As major findings, the combination was more effective than metoprolol alone in reversing cardiac remodelling and preventing ventricular arrhythmia. The association of the two molecules improved cardiac function, reduced hypertrophy and fibrosis, and corrected the heart rate, sympatho-vagal balance (low frequencies/high frequencies) and ECG parameters (P to R wave interval (PR), QRS duration, QTc intervals). It also improved altered Ca(2+) cycling by normalising Ca(2+)-handling protein levels (S100A1, SERCA2a, RyR2), and prevented pro-arrhythmogenic Ca(2+) waves derived from abnormal Ca(2+) sparks in ventricular cardiomyocytes. Altogether these effects accounted for a decreased occurrence of ventricular arrhythmias.
The association of subcutaneous BNP and oral metoprolol appeared to be more effective than metoprolol alone. Breaking the deleterious loop linking BNP and sympathetic overdrive could unmask the efficacy of BNP against cardiac damage in heart failure and offer a new potential approach against lethal arrhythmia.
closed_qa
Are adequate pain relief and time to analgesia associated with emergency department length of stay?
To evaluate the association of adequate analgesia and time to analgesia with emergency department (ED) length of stay (LOS). Post hoc analysis of real-time archived data. We included all consecutive ED patients ≥18 years with pain intensity >6 (verbal numerical scale from 0 to 10) who were assigned to an ED bed and whose pain was re-evaluated less than 1 h after receiving analgesic treatment. The main outcome was ED-LOS in patients who had adequate pain relief (AR, a 50% decrease in pain intensity) compared with those who did not have such relief (NR). A total of 2033 patients (mean age 49.5 years; 51% men) met our inclusion criteria; 58.3% were discharged, and 41.7% were admitted. Among patients discharged or admitted, there was no significant difference in ED-LOS between those with AR (median (25th-75th centile): 9.6 h (6.3-14.8) and 18.2 h (11.6-25.7), respectively) and those with NR (median (25th-75th centile): 9.6 h (6.6-16.0) and 17.4 h (11.3-26.5), respectively). After controlling for confounding factors, rapid time to analgesia (not AR) was associated with shorter ED-LOS in discharged and admitted patients (p<0.001 and p<0.05, respectively). When adjusting for confounding variables, ED-LOS is shortened by 2 h (95% CI 1.1 to 2.8) for discharged patients and by 2.3 h (95% CI 0.17 to 4.4) for admitted patients when the delay to receive analgesia is <90 min rather than >90 min.
In our study, AR was not linked with a shorter ED-LOS. However, rapid administration of analgesia was associated with a shorter ED-LOS.
closed_qa
Does postponement of first pregnancy increase gender differences in sickness absence?
From 1970 to 2012, the average age at first delivery in Norway increased from 23.2 to 28.5 years. Postponement of first pregnancy increases the risk of medical complications both during and after pregnancy. Sickness absence during pregnancy has over the last two decades increased considerably more than in non-pregnant women. The aim of this paper is twofold: firstly, to investigate whether postponement of pregnancy is related to increased sickness absence and thus contributes to the increased gender difference in sickness absence; and secondly, to estimate how much of the increased gender difference in sickness absence can be accounted for by increased sickness absence amongst pregnant women. We employed registry data to analyse sickness absence among all Norwegian employees with income equivalent to full-time work in the period 1993-2007. After control for age, education, and income, pregnant women's sickness absence (age 20-44) increased on average 0.94 percentage points each year, compared to 0.29 in non-pregnant women and 0.14 in men. In pregnant women aged 20-24, sickness absence during pregnancy increased by 0.96 percentage points per calendar year, compared to 0.60 in the age-group 30-34. Sickness absence during pregnancy accounted for 25% of the increased gender gap in sickness absence, after adjusting for changes in education, income and age.
Postponement of first pregnancy does not explain the increase in pregnant women's sickness absence during the period 1993-2007, as both the highest level of and the greatest increase in sickness absence are seen in the younger women. The reasons are poorly understood but remain important, as sickness absence during pregnancy accounts for 25% of the increased gender gap in sickness absence.
closed_qa
Is poor performance on NBME clinical subject examinations associated with a failing score on the USMLE Step 3 examination?
To investigate the association between poor performance on National Board of Medical Examiners clinical subject examinations across six core clerkships and performance on the United States Medical Licensing Examination Step 3 examination. In 2012, the authors studied matriculants from the Uniformed Services University of the Health Sciences with available Step 3 scores and subject exam scores on all six clerkships (Classes of 2007-2011, N = 654). Poor performance on subject exams was defined as scoring one standard deviation (SD) or more below the mean using the national norms of the corresponding test year. The association between poor performance on the subject exams and the probability of passing or failing Step 3 was tested using contingency table analyses and logistic regression modeling. Students performing poorly on one subject exam were significantly more likely to fail Step 3 (OR 14.23 [95% CI 1.7-119.3]) compared with students with no subject exam scores that were 1 SD below the mean. Poor performance on more than one subject exam further increased the chances of failing (OR 33.41 [95% CI 4.4-254.2]). This latter group represented 27% of the entire cohort, yet contained 70% of the students who failed Step 3.
These findings suggest that individual schools could benefit from a review of subject exam performance to develop and validate their own criteria for identifying students at risk for failing Step 3.
closed_qa
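As a side note on the statistics in the abstract above: odds ratios like OR 14.23 come from a standard 2×2 comparison of exposure (poor subject exam performance) against outcome (Step 3 failure). The sketch below shows how such an estimate and its Wald 95% confidence interval are computed; the cell counts are purely hypothetical, since the abstract does not report the raw table.

```python
import math

# Hypothetical 2x2 table (NOT the study's raw data):
#                           failed Step 3   passed Step 3
# poor on >=1 subject exam        a               b
# no poor subject exams           c               d
a, b, c, d = 9, 300, 1, 344

odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval, computed on the log-odds scale
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.1f} to {ci_high:.1f})")
```

Note how a single sparse cell (here c = 1) inflates the standard error and widens the interval, which is consistent with the wide CIs quoted in the abstract.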
Does quality of life or physical function at diagnosis predict short-term outcomes during intensive chemotherapy in AML?
Intensive chemotherapy (IC) used to treat acute myeloid leukemia (AML) is associated with toxicity, particularly in older adults. Emerging data suggest that baseline quality of life (QOL) and physical function may predict outcomes in oncology, although data in AML are limited. We investigated the association of baseline QOL and physical function with short-term treatment outcomes in adult and elderly AML patients. We conducted a prospective, longitudinal study of adult (age 18+) AML patients undergoing IC. Before starting IC, patients completed the European Organisation for Research and Treatment of Cancer (EORTC) 30-item questionnaire (QLQ-C30) and the Functional Assessment of Cancer Therapy Fatigue subscale (FACT-Fatigue), in addition to physical function tests (grip strength, timed chair stands, 2-min walk test). Outcomes included 60-day mortality, intensive care unit (ICU) admission and achievement of complete remission (CR). Logistic regression was carried out to evaluate each outcome. Of the 239 patients (median age 57.5 years), 56.7% were male and the median Charlson comorbidity score was 0. Sixty-day mortality, ICU admission and CR occurred in 9 (3.7%), 15 (6.3%) and 167 (69.9%) patients, respectively. Using univariate regression, neither QOL nor physical function at presentation was predictive of 60-day mortality (all P>0.05), whereas ICU admission (P<0.001) and remission status at 30 days (P = 0.007) were. Fatigue (P = 0.004) and role functioning (P = 0.003) were predictors of ICU admission; QOL and physical function were not. A higher Charlson score predicted ICU admission (P = 0.01) and remission status (P = 0.002). Cytogenetic risk group was associated with achievement of CR (P = 0.02); QOL and physical function were not (all P>0.05). Findings were similar when patients aged 60+ were examined. The relationships of fatigue and role functioning with ICU admission deserve further exploration.
Baseline QOL and physical function tests in this prospective study were not associated with short-term mortality, ICU admission or achievement of CR after the first cycle of chemotherapy.
closed_qa
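For readers who want to see the analysis pattern, the univariate logistic regressions described above can be sketched as follows. The ten-row data frame, values and column names are hypothetical placeholders rather than the study's data, and the statsmodels library is assumed to be available.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical toy data: baseline FACT-Fatigue score and ICU admission (1 = admitted)
df = pd.DataFrame({
    "fact_fatigue": [30, 42, 18, 25, 39, 44, 12, 36, 28, 41],
    "icu": [1, 0, 1, 0, 1, 0, 1, 0, 0, 1],
})

X = sm.add_constant(df[["fact_fatigue"]])   # intercept + single predictor
fit = sm.Logit(df["icu"], X).fit(disp=0)    # univariate logistic model

print(fit.summary())          # coefficient, standard error, p-value
print(np.exp(fit.params))     # exponentiated coefficients = odds ratios
```

In practice this model would be refit once per candidate predictor (each QOL subscale, each physical function test) against each outcome, which is what "univariate regression ... to evaluate each outcome" refers to.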
Laryngeal mask airways and use of a Boyle-Davis gag in ENT surgery: is there a learning curve?
The objective was to identify whether the experience of the operating surgeon was relevant to the frequency of laryngeal mask airway (LMA) obstruction or conversion to an endotracheal tube during ear, nose, and throat surgery. Data were prospectively collected for 186 patients undergoing a procedure with the use of a Boyle-Davis gag and LMA over 12 months in a district general hospital in the United Kingdom. Patient demographics (age, Mallampati grade), grade of surgeon, grade of anesthetist, LMA size inserted, and any intraoperative adjustments needed were recorded. There was an overall intraoperative airway intervention rate of 21%. The experience of the surgeon affected the rate of intraoperative airway interventions encountered, reflected by the significantly lower rate of airway complications (ie, 10%) seen when associate specialists performed these types of procedures compared to other grades of surgeon (Fisher's exact test 2-tailed P value = .04). A significantly higher complication rate of 50% was seen with core surgical trainees compared to other grades of surgeon (Fisher's exact test 2-tailed P value = .002).
The results of this study suggest there may be a learning curve for otolaryngology trainees when using an LMA. However, larger studies and further subanalyses are essential before firmer conclusions can be drawn.
closed_qa
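The group comparisons in the abstract above rely on a two-tailed Fisher's exact test on 2×2 tables, which is well suited to the small cell counts involved. A minimal sketch of that calculation follows; the counts are hypothetical stand-ins chosen to match the reported rates, not the study's raw data, and scipy is assumed to be available.

```python
from scipy.stats import fisher_exact

# Hypothetical counts (NOT the study's raw data):
#                          intervention   no intervention
# core surgical trainees        10              10        # 50% rate
# all other grades              29             137        # ~17% rate
table = [[10, 10], [29, 137]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, two-tailed P = {p_value:.4f}")
```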
Coping with heat stress during match-play tennis: does an individualised hydration regimen enhance performance and recovery?
To determine whether an individualised hydration regimen reduces thermal, physiological and perceptual strain during match-play tennis in the heat, and minimises alterations in neuromuscular function and physical performance postmatch and into recovery. 10 men undertook two matches for an effective playing time (ball in play) of 20 min (∼113 min total match time) in ∼37°C and ∼33% RH conditions. Participants consumed fluids ad libitum during the first match (HOT) and followed a hydration regimen in the second match (HYD) based on starting play euhydrated, standardising sodium intake and minimising body mass losses. HYD improved prematch urine specific gravity (1.013±0.006 vs 1.021±0.009 g/mL; p<0.05). Body mass losses (∼0.3%), fluid intake (∼2 L/h) and sweat rates (∼1.6 L/h) were similar between conditions. Core temperature was higher during the first 10 min of effective play in HOT (p<0.05), but increased similarly (∼39.3°C) by match completion. Heart rate was higher (∼11 bpm) throughout HOT (p<0.001). Thermal sensation was higher during the first 7.5 min of effective play in HOT (p<0.05). Postmatch losses in knee extensor and plantar flexor strength, along with reductions in 15 m sprint time and repeated-sprint ability (p<0.05), were similar in both conditions, and were restored within 24 h.
Both the hydration regimen and ad libitum fluid consumption allowed for minimal body mass losses (<1%). However, undertaking match-play in a euhydrated state attenuated thermal, physiological and perceptual strain. Maximal voluntary strength in the lower limbs and repeated-sprint ability deteriorated similarly in both conditions, but were restored within 24 h.
closed_qa
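Sweat rates like the ∼1.6 L/h reported above are conventionally derived from body-mass change corrected for fluid intake and urine output, treating 1 kg of mass change as 1 L of water. Below is a minimal sketch of that standard calculation; all input values are hypothetical, chosen only to land near the abstract's figures.

```python
def sweat_rate_l_per_h(pre_kg, post_kg, fluid_l, urine_l, hours):
    """Sweat rate in L/h: mass loss + fluid drunk - urine passed, per hour."""
    return (pre_kg - post_kg + fluid_l - urine_l) / hours

# Hypothetical match: 75.0 kg before, 74.75 kg after, 3.2 L drunk,
# 0.4 L urine passed, over 1.88 h of play
print(f"{sweat_rate_l_per_h(75.0, 74.75, 3.2, 0.4, 1.88):.2f} L/h")  # ~1.62
```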
Should fat grafting be recommended in tendon scar treatment?
Lipostructure has been reported as a successful ancillary tool for surgery in tenolysis procedures, but to date its capability to resolve tendon adherences without further surgery has not been reported. The aim of this study is to highlight the role of lipografting in the treatment of tendon and joint adherences. In our experience, we began by treating significant tendon adherences together with nerve entrapment on the dorsal aspect of the foot in two cases and in a severely burned hand. We achieved good results in terms of both function and sensory recovery. A 24-month follow-up showed good maintenance of the range of motion (ROM). We also recorded a gain of almost 30-40 degrees in a flexion contracture of the second finger of a burned hand, minimizing the need for further surgery for scar contracture and tenoarthrolysis, with stable results at follow-up.
We suggest that, prior to referral for surgery, scars involving tendons as well as joints should be considered for lipografting.
closed_qa