[Dataset columns: instruction (string, 10-664 chars) · context (string, 1-5.66k chars) · response (string, 1-3.34k chars) · category (1 class)]
Dental caries experience in preschool children: is it related to a child's place of residence and family income?
To assess the caries prevalence in 3- to 5-year-old children and determine whether urbanisation and income are associated with the dental decay status of these preschool children residing in the district of Lahore, Pakistan. Multistage random sampling was used to collect the sample of children from urban and rural areas. A list of children 3 to 5 years of age was prepared, and every 2nd child on the list was selected until a total of 700 children were enrolled in the study. Lady Health Workers (LHWs) were trained to conduct this survey after permission from the pertinent authorities. Data on the children and their mothers regarding age, gender, socioeconomic status (SES) and area of residence were collected. The caries status of children was recorded using the dmft index as per WHO criteria. The prevalence of dental caries in preschool children of Lahore was found to be 40.5%. Within this group, caries prevalence was 33.3% in 3-year-old children, 47.6% in 4-year-old children and 75% in the 5-year-old children. The mean dmft score for the entire child population was 1.85 ± 3.26. A significant association was found between caries prevalence and low socioeconomic status, female gender and rural residence.
Preschool children in Lahore, Pakistan have an average dmft score of 1.85 (± 3.26), mostly attributable to untreated carious lesions. Higher caries experience was associated with rural residence and low family income.
closed_qa
Should care managers for older adults be located in primary care?
To determine the effect of a primary care-based care management initiative on residential care placement and death in a population of frail older adults referred for needs assessment in New Zealand. Randomized controlled trial with follow-up at 3, 6, 12, 18, and 24 months for residential care placement and mortality. Fifty-five family physician practices in New Zealand that established a care management initiative for older adults assessed as being at high risk of residential care placement in 2004 to 2006. Three hundred fifty-one individuals (243 female, 108 male) aged 65 and older (mean age 81) who were assessed as being at risk of permanent residential care placement. The care management program (Coordinator of Services for Elderly) consisted of a nominated health professional care manager, geographically aligned with family physicians and housed with the family physician or located nearby. Rates of permanent residential care placement and mortality. The risk of permanent residential care placement or death was 0.36 for usual care (control group) and 0.26 for the care management initiative, a 10.2% absolute risk reduction, with the majority of the risk reduction seen in residential care placement (control group 0.25, intervention group 0.16).
A family physician-aligned community care management approach reduces frail older adults' risk of mortality and permanent residential care placement.
closed_qa
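The risk arithmetic in the care-management record above is easy to reproduce. A minimal Python sketch using the rounded risks from the record (the source's 10.2% figure presumably comes from unrounded event counts, so the rounded inputs below give 10.0%); the number needed to treat is an added illustration, not a figure from the record:

```python
# Absolute risk reduction (ARR) and number needed to treat (NNT)
# from the rounded risks reported in the record above.
risk_control = 0.36       # usual care: placement or death at follow-up
risk_intervention = 0.26  # care management initiative

arr = risk_control - risk_intervention  # risk difference, ~10 percentage points
nnt = 1 / arr                           # patients managed per adverse outcome avoided

print(f"ARR = {arr:.1%}, NNT = {nnt:.0f}")  # ARR = 10.0%, NNT = 10
```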
Does self-reported clinical experience predict performance in medical school and internship?
Medical school admissions committees attempt to select the most qualified applicants. In addition to traditional performance measures, committees often look favourably upon applicants who report previous clinical experience. This study aimed to determine if self-reported clinical experience is a valid indicator of future performance in medical school and internship. We collected data for seven year groups (1993-1999; n = 1112) and operationalised trainee performance in terms of five outcomes: cumulative medical school grade point average (GPA); US Medical Licensing Examination (USMLE) Step 1 and 2 scores; and scores on a validated programme director's evaluation measuring intern expertise and professionalism. We then conducted a series of analyses of covariance to compare outcomes in applicants who self-reported previous clinical experience with outcomes in those who did not. In these analyses, the independent variable was self-reported clinical experience (yes/no), the covariate was undergraduate GPA, and the dependent variables were the five performance outcomes. In four of five analyses, we found no differences in the performance of the two groups (clinical experience versus no clinical experience). However, on the cumulative medical school GPA outcome, applicants who reported previous clinical experience had statistically significantly lower cumulative GPAs upon graduation than those who did not report such experience (F(1,940) = 9.35, p = 0.002, partial η(2) = 0.01 [small effect size]).
Our results suggest that applicants who self-report previous clinical experience may not be better candidates. In fact, on some measures of performance, these applicants may actually perform worse than those who report no clinical experience.
closed_qa
Can achievement goal theory provide a useful motivational perspective for explaining psychosocial attributes of medical students?
Psychosocial competence and frustration tolerance are important characteristics of skilled medical professionals. In the present study we explored the usefulness of applying a comprehensive motivational theory (goal orientations) for this purpose. According to goal orientation theory, learning motivation is defined by the general goals students pursue during learning: either mastery goals (gaining new knowledge) or performance goals (gaining a positive evaluation of competence or avoiding negative evaluation). Perceived psychosocial abilities are a desirable outcome, and low frustration tolerance (LFT) is a negative feature of student behavior. The hypothesis was that mastery goals would be positively associated with psychosocial abilities while performance goals would be positively associated with LFT. At the end of an annual doctor-patient communication course, 143 first-year medical students completed a structured questionnaire that included measures of learning goal orientations (assessed by the Patterns of Adaptive Learning Scales, PALS), psychosocial abilities (assessed by the Psychological Medicine Inventory - student version, PMI-S) and low frustration tolerance (LFT). All study variables were found reliable (Cronbach's α ranged from .66 to .90) and normally distributed. Hierarchical multiple regression analysis revealed significant associations supporting the hypotheses. Mastery goal orientation was positively associated with perceived psychosocial abilities (PMI-S) (β = .16, p<.05) and negatively associated with low frustration tolerance (β = -.22, p<.05), while performance goal orientation was significantly associated with low frustration tolerance (β = .36, p<.001).
The results suggest that the goal orientations theory may be a useful theoretical framework for understanding and facilitating learning motivation among medical students. Limitations and suggestions for practice within medical education context are discussed.
closed_qa
Are marginalized women being left behind?
While India has made significant progress in reducing maternal mortality, attaining further declines will require increased skilled birth attendance and institutional delivery among marginalized and difficult-to-reach populations. A population-based survey was carried out in 16 randomly selected villages in rural Mysore District in Karnataka, India between August and September 2008. All households in selected villages were enumerated, and women with children 6 years of age or younger underwent an interviewer-administered questionnaire on antenatal care and institutional delivery. Institutional deliveries in rural areas of Mysore District increased from 51% to 70% between 2002 and 2008. While increasing numbers of women were accessing antenatal care and delivering in hospitals, large disparities were found in uptake of these services among different castes. Mothers belonging to general castes were almost twice as likely to have an institutional birth as mothers from scheduled castes and tribes; those belonging to other backward castes or general castes had 1.8 times higher odds (95% CI: 1.21, 2.89) of having an institutional delivery than mothers from scheduled castes and tribes. In multivariable analysis, which adjusted for inter- and intra-village variance, Below Poverty Line status, caste, and receiving antenatal care were all associated with institutional delivery.
The results of the study suggest that while the Indian Government has made significant progress in increasing antenatal care and institutional deliveries among rural populations, further success in lowering maternal mortality will likely hinge on the success of NRHM programs focused on serving marginalized groups. Health interventions which target SC/ST may also have to address both perceived and actual stigma and discrimination, in addition to providing needed services. Strategies for overcoming these barriers may include sensitization of healthcare workers, targeted health education and outreach, and culturally appropriate community-level interventions. Addressing the needs of these communities will be critical to achieving Millennium Development Goal Five by 2015.
closed_qa
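Odds ratios like the 1.8 (95% CI 1.21-2.89) reported above are, in the unadjusted case, computed from a 2×2 table with a Woolf (log-scale) confidence interval. A minimal Python sketch; the cell counts below are hypothetical, not the study's:

```python
import math

# Hypothetical 2x2 table: rows = caste group, columns = delivery location.
a, b = 320, 130   # general/OBC: institutional vs. home (made-up counts)
c, d = 110, 90    # SC/ST:       institutional vs. home (made-up counts)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)        # Woolf standard error
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```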
Is blood ammonia influenced by kidney function?
We investigated whether blood ammonia increases with worsening CKD. Fifty-eight subjects with a range of CKD severity were recruited for analysis of plasma ammonia and other electrolytes. All plasma ammonia concentrations were within the normal reference range; there was no correlation between ammonia and CKD severity, and dialysis had no effect.
Blood ammonia is not elevated in or related to the severity of chronic kidney disease.
closed_qa
Social functioning in early psychosis: are all the domains predicted by the same variables?
Participants were recruited from early psychosis intervention programmes and community mental health clinics in British Columbia, Canada, and completed the following measures: the Client's Assessment of Strengths, Interests, and Goals; the Brief Psychiatric Rating Scale; the Beck Depression Inventory; and the California Verbal Learning Test. Multiple linear regressions revealed that: more negative symptoms and higher depression predicted a less active social life; more negative symptoms and poorer short-term verbal learning ability predicted lower vocational functioning; and more negative symptoms and male gender predicted lower independent living skills.
Results suggest that negative symptoms are predictive of all three areas of functioning but that specific variables add significant unique variance to individual areas of social functioning. Although a global social functioning score can be considered useful, greater precision can be gained by the use of domain-specific measures.
closed_qa
Drug-induced pneumonitis due to sirolimus: an interaction with atorvastatin?
Sirolimus is an immunosuppressant used in renal transplantation because of its lack of nephrotoxicity. We report four cases of pneumonitis due to sirolimus, possibly revealing an interaction with atorvastatin. Four patients (previously on long-term treatment with atorvastatin) presented with respiratory symptoms between 3 and 56 months after starting treatment with sirolimus following renal transplantation. Thoracic CT scans showed bilateral areas of peripheral alveolar consolidation. Bronchial lavage showed a lymphocytic alveolitis. Open-lung biopsy showed organizing pneumonia associated with diffuse alveolar damage and caseating granulomata. We attributed the pneumonitis to sirolimus on account of clinical and radiological resolution within 1 to 6 months of stopping treatment. We raise the possibility of an interaction between sirolimus and atorvastatin by competition for their hepatic degradation pathway via cytochrome P450 3A4.
Sirolimus causes drug-induced pneumonitis that is predominantly an organizing pneumonia. Atorvastatin may encourage its development by competition with sirolimus in the liver.
closed_qa
Elevated hematocrit in nonalcoholic fatty liver disease: a potential cause for the increased risk of cardiovascular disease?
Hematocrit is an important hemorheological parameter. Both hematocrit and nonalcoholic fatty liver disease (NAFLD) are strongly correlated with cardiovascular disease. However, no study has explored the direct relationship of hematocrit with NAFLD. Hematocrit levels were analyzed in 1,821 Chinese Han adults, and a questionnaire and physical examination were administered to investigate the relationship with NAFLD. NAFLD morbidity was positively correlated with hematocrit levels in both males and females. Multivariate-adjusted odds ratios showed that, compared with the lowest quartile of hematocrit, subjects in the highest quartile had a 185% and 386% increased risk of developing NAFLD in males and females, respectively. Further receiver operating characteristic analysis showed that the optimal cutoff value of hematocrit for detecting NAFLD was 42.75 in males and 37.55 in females. Unhealthy lifestyles had similar effects on NAFLD and hematocrit.
The prevalence of NAFLD is positively associated with hematocrit levels. Though the cause-effect relationship between NAFLD and hematocrit still needs clarification, the elevated hematocrit in NAFLD patients may be of significance in linking NAFLD and cardiovascular disease.
closed_qa
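The record above reports ROC-derived "optimal cutoff" values without naming the criterion; the usual choice is Youden's J (sensitivity + specificity − 1). A minimal sketch under that assumption, with toy data rather than the study's:

```python
import numpy as np

def youden_cutoff(values, labels):
    """Return the cutoff on `values` that maximizes Youden's J
    (sensitivity + specificity - 1) for binary `labels` (1 = disease)."""
    values = np.asarray(values, dtype=float)
    labels = np.asarray(labels, dtype=int)
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        pred = values >= cut                      # test positive at/above cutoff
        tp = np.sum(pred & (labels == 1))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        fp = np.sum(pred & (labels == 0))
        j = tp / (tp + fn) + tn / (tn + fp) - 1   # sensitivity + specificity - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Toy data (not the study's): hematocrit values and NAFLD status (1 = NAFLD).
rng = np.random.default_rng(0)
hct = np.concatenate([rng.normal(41, 2, 200), rng.normal(44, 2, 100)])
nafld = np.concatenate([np.zeros(200, int), np.ones(100, int)])
print(youden_cutoff(hct, nafld))
```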
Can ambulatory blood pressure measurements substitute assessment of subclinical cardiovascular damage?
We have previously demonstrated that markers of subclinical organ damage (SOD) improve cardiovascular risk prediction in healthy individuals. We wanted to investigate whether this additive effect of SOD was due to inaccurate blood pressure (BP) measurement or whether ambulatory BP (AMBP) added further to risk prediction. In a population cohort of 1385 Danish individuals free of cardiovascular disease and diabetes, we recorded traditional risk factors, AMBP, pulse wave velocity (PWV), urine albumin/creatinine ratio (UACR), left ventricular mass index (LVMI) and carotid atherosclerotic plaques at baseline. A composite cardiovascular endpoint (CEP) consisting of cardiovascular death and nonfatal myocardial infarction and stroke was recorded in national registries. During a median follow-up of 12.8 years, a total of 119 CEPs occurred. In categorical analysis, presence of SOD as well as masked hypertension increased the sensitivity of Systematic Coronary Risk Evaluation (SCORE) from 73.9 to 89.1% (P < 0.001) and reduced specificity from 60.1 to 41.8% (P < 0.001). In continuous analysis, logUACR [hazard ratio = 1.20 (95% confidence interval [CI] 1.05-1.38), P = 0.009], atherosclerotic plaques [hazard ratio = 1.82 (95% CI 1.21-2.74), P = 0.004] and 24-h SBP [hazard ratio = 1.34 (95% CI 1.12-1.60), P = 0.002], but not logPWV or LVMI, predicted CEP in a model with adjustments for age, sex, conventional BP, total cholesterol and smoking. Compared with a risk model using only traditional risk factors, adding PWV, UACR, plaques, LVMI and 24-h SBP increased the C-index significantly from 0.76 to 0.79 and produced a net reclassification improvement of 23.3% (P = 0.001).
UACR and plaques predicted cardiovascular events independently of AMBP and improved risk prediction.
closed_qa
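The net reclassification improvement (NRI) quoted above has a simple closed form: the net proportion of events reclassified upward plus the net proportion of non-events reclassified downward. A minimal sketch; the event/non-event totals follow from the record (119 CEPs among 1385 subjects), but the reclassification counts below are made up:

```python
def nri(up_evt, down_evt, n_evt, up_non, down_non, n_non):
    """Categorical net reclassification improvement: net upward movement
    among events plus net downward movement among non-events."""
    return (up_evt - down_evt) / n_evt + (down_non - up_non) / n_non

# 119 events and 1266 non-events as in the record; counts are hypothetical.
print(f"NRI = {nri(20, 8, 119, 150, 60, 1266):.3f}")
```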
Do practicing clinicians agree with expert ratings of neonatal intensive care unit quality measures?
To assess the level of agreement when selecting quality measures for inclusion in a composite index of neonatal intensive care quality (Baby-MONITOR) between two panels: one composed of academic researchers (Delphi) and another composed of academic and clinical neonatologists (clinician). In a modified Delphi process, a panel rated 28 quality measures. We assessed clinician agreement with the Delphi panel by surveying a sample of 48 neonatal intensive care practitioners. We asked the clinician group to indicate their level of agreement with the Delphi panel for each measure using a five-point scale (much too high, slightly too high, reasonable, slightly too low and much too low). In addition, we asked clinicians to select measures for inclusion in the Baby-MONITOR based on a yes or no vote and a pre-specified two-thirds majority for inclusion. In all, 23 (47.9%) of the clinicians responded to the survey. We found high levels of agreement between the Delphi and clinician panels, particularly across measures selected for the Baby-MONITOR. Clinicians selected the same nine measures for inclusion in the composite as the Delphi panel. For these nine measures, 74% of clinicians indicated that the Delphi panel rating was 'reasonable'.
Practicing clinicians agree with an expert panel on the measures that should be included in the Baby-MONITOR, enhancing face validity.
closed_qa
Do topical antibiotics reduce exit site infection rates and peritonitis episodes in peritoneal dialysis patients?
Peritonitis is the major cause of peritoneal dialysis (PD) technique failure. Prophylactic topical antibiotics have been reported to reduce peritoneal dialysis catheter exit site infections (ESI) and peritonitis rates. We audited the effect of different exit site practices on ESIs and peritonitis in the 12 Pan Thames and South East England PD centres between 2005 and 2008. PD patients used prophylactic mupirocin (n=1,270), gentamicin (n=502) or no prophylactic antibiotics (n=1,203); annualised ESI rates were reduced with mupirocin (median 0.18, interquartile range [IQR] 0.13-0.23, patient episodes per year, vs. median 0.32, IQR 0.24-0.69, for no antibiotic prophylaxis, p<0.01). ESI rates with gentamicin were not significantly lower (median 0.29, IQR 0.21-0.47). Staphylococcal ESIs accounted for 39.6% of episodes in the no-antibiotic group, falling to 25.7% with mupirocin and 28.2% with gentamicin. Despite the reduction in ESIs, there was no significant reduction in peritonitis rates (no antibiotics: median 0.56, IQR 0.5-0.65; mupirocin: median 0.55, IQR 0.53-0.75; and gentamicin: median 0.47, IQR 0.32-0.65). In particular, mupirocin did not reduce Staphylococcus aureus peritonitis rates.
Topical antibiotics have been reported to reduce both ESI and peritonitis rates in controlled trials. In this audit of routine clinical practice, topical mupirocin reduced overall ESI rates, and both mupirocin and gentamicin reduced S. aureus ESIs, but neither reduced overall peritonitis rates.
closed_qa
Changes of blood endocannabinoids during anaesthesia: a special case for fatty acid amide hydrolase inhibition by propofol?
• Available data from animal studies suggest that the narcotic drug propofol interacts with the endocannabinoid system. Inhibition of enzymatic degradation of anandamide could explain some of the characteristics of propofol. Direct measurements have not been reported yet in humans. • Propofol does not change the time course of anandamide plasma concentrations during anaesthesia. Furthermore, propofol does not inhibit fatty acid amide hydrolase (FAAH) activity ex vivo or in vitro. Thus, specific characteristics of the narcotic drug propofol cannot be explained by peripheral inhibition of anandamide degradation in humans. The aim of our study was to describe the time course of endocannabinoids during different anaesthesia protocols in more detail, and to challenge the hypothesis that propofol acts as a FAAH inhibitor. Endocannabinoids were measured during the first hour of anaesthesia in 14 women and 14 men undergoing general anaesthesia with propofol and in 14 women and 14 men receiving thiopental/sevoflurane. We also incubated whole human blood samples ex vivo with propofol and the known FAAH inhibitor oloxa and determined FAAH enzyme kinetics. Plasma anandamide decreased similarly with propofol and thiopental/sevoflurane anaesthesia, and reached a nadir after 10 min. Areas under the curve for anandamide (mean and 95% CI) were 53.3 (47.4, 59.2) nmol l(-1) 60 min with propofol and 48.5 (43.1, 53.8) nmol l(-1) 60 min with thiopental/sevoflurane (P = NS). Anandamide and propofol plasma concentrations were not correlated at any time point. Ex vivo FAAH activity was not inhibited by propofol. Enzyme kinetics (mean ± SD) of recombinant human FAAH were K(m) = 16.9 ± 8.8 µmol l(-1) and V(max) = 44.6 ± 15.8 nmol mg(-1) min(-1) without propofol, and K(m) = 16.6 ± 4.0 µmol l(-1) and V(max) = 44.0 ± 7.6 nmol mg(-1) min(-1) with 50 µmol l(-1) propofol (P = NS for both).
Our findings challenge the idea that propofol anaesthesia and also propofol addiction are directly mediated by FAAH inhibition, but we cannot exclude other indirect actions on cannabinoid receptors.
closed_qa
Survival after biventricular mechanical circulatory support: does the type of device matter?
Biventricular support can be achieved using paracorporeal biventricular assist devices (BiVADs), the total artificial heart (TAH), and implantable VADs. This study evaluated the influence of the device on patient survival. Data from 383 patients (321 men [84%]) undergoing primary, planned biventricular support using durable devices between 2000 and 2010 were extracted from the French multicentric Groupe de Réflexion sur l'Assistance Mécanique (GRAM) registry. Mean age was 41.6 ± 14.0 years. Patients were classified as group 1, 255 (67%) with paracorporeal BiVADs; group 2, 90 (24%) with TAH; and group 3, 38 (10%) with implantable BiVADs. Mean patient support duration was 82.8 ± 107.4 days and similar among groups (p = 0.53). Bridging to transplantation was successful in 211 patients (55%) and to recovery in 23 (6%). Mortality on device was similar among groups (p = 0.16). TAH patients had a significantly lower stroke rate (p<0.0001). Actuarial estimates for survival while on support were 75.2% ± 2.3%, 64.4% ± 2.7%, 61.1% ± 2.8%, and 56.8% ± 3.1% at 30, 60, 90, and 180 days, respectively, and were similar among groups. However, TAH patients undergoing prolonged support (≥90 days) showed a trend toward improved survival (p = 0.08). Actuarial post-transplant survival estimates were 81.7% ± 2.7%, 75.3% ± 3.0%, 73.0% ± 3.0%, and 64.7% ± 3.7% at 1 month and 1, 3, and 5 years, respectively, and were similar among groups (p = 0.84).
Survival while on support and after heart transplantation did not differ significantly in patients supported with paracorporeal BiVADs, implantable BiVADs, or the TAH. Patients undergoing prolonged support (>90 days) tended to have improved survival when supported with TAH compared with BiVADs, which may be related to a lower incidence of neurologic events.
closed_qa
Do patients with chronic heart failure have chest pain?
In questionnaire surveys, patients with chronic heart failure frequently report "pain" as a symptom. We investigated the prevalence of chest pain as a possible cause for pain, particularly in patients with prior myocardial infarction. Questionnaire survey. Community heart failure clinic. 1786 patients with heart failure due to left ventricular systolic dysfunction (mean ± SD age 70.1 ± 11.0 years; 73% male; left ventricular ejection fraction (LVEF) 35.3 ± 9.9%; 65.6% with underlying ischaemic heart disease (IHD)). Patients with chronic heart failure completed a questionnaire. Answers to the questions: (1) "In the last week, how many days did you get angina chest pain?"; and "In the last month, how much did the following affect you:" (2) "chest pains at rest"; (3) "chest pains during normal activity". 73% of those with IHD and 84% of those without had had no angina in the previous week; 79% and 82%, respectively, had at most "little" chest pain at rest; 67% and 76%, respectively, had at most "little" chest pain during exertion. Angina increased with NYHA class, but there was no relation between angina and sex of patient, age or LVEF. There was a weak relation between chest pain and an adverse outcome in the patients with ischaemic heart disease.
Although pain is commonly reported in patients with chronic heart failure, it seems unlikely that the pain is due to angina, even in patients with underlying coronary heart disease.
closed_qa
Is a head CT necessary after uncomplicated coiling of unruptured intracranial aneurysms?
In this study, we sought to determine whether routine head computed tomography (CT) scans after uncomplicated coil embolization of intracranial aneurysms add any significant clinical value. We retrospectively reviewed the medical records of 139 patients with unruptured aneurysms who underwent 150 elective coiling procedures between January 2008 and June 2010. A total of 6 head CTs were obtained emergently after intraprocedural complications and 122 head CTs were obtained routinely after uncomplicated coil embolization of intracranial aneurysms. The 122 head CTs that were obtained routinely after uncomplicated coil embolization of unruptured intracranial aneurysms did not show any acute or subacute changes.
A head CT after uncomplicated coil embolization of an intracranial aneurysm does not add any significant clinical value and should not be ordered routinely.
closed_qa
Is inadequate human immunodeficiency virus care associated with increased ED and hospital utilization?
There is a lack of data on the effect(s) of suboptimal human immunodeficiency virus (HIV) care on subsequent health care utilization among emergency department (ED) patients with HIV. Findings on their ED and inpatient care utilization patterns will provide information on service provision for those who have suboptimal access to HIV-related care. A pilot prospective study was conducted on HIV-positive patients in an ED. At enrollment, participants were interviewed regarding health care utilization. Participants were followed up for 1 year, during which time data on ED visits and hospitalizations were obtained from their patient records. Inadequate HIV care (IHC) was defined according to Infectious Diseases Society of America recommendations as less than 3 scheduled clinic visits for HIV care in the year before enrollment. Cox regression models were used to evaluate whether IHC was associated with increased hazard of health care utilization. Of 107 subjects, 36% were found to have IHC. Inadequate HIV care did not predict more frequent ED visits but was significantly associated with fewer hospitalizations (adjusted incidence rate ratio, 0.61 [95% CI: 0.43-0.86]). Inadequate HIV care did not significantly increase the hazard for earlier ED visit or hospitalization. However, further stratification analysis found that IHC increased the hazard of hospitalization for subjects without comorbid diseases (adjusted hazard ratio, 2.50 [95% CI: 1.10-5.68]).
In our setting, IHC does not appear to be associated with earlier or more frequent ED visits but may lead to earlier hospitalization, particularly among those without other chronic diseases.
closed_qa
Is there a dose response for valgus unloader brace usage on knee pain, function, and muscle strength?
To examine whether there was a dose response for valgus unloader brace wear on knee pain, function, and muscle strength in participants with medial compartment knee osteoarthritis. In this single-group study, participants with medial compartment knee osteoarthritis were followed for approximately 6 months. Recruitment was conducted in the general community, and testing was performed at a university laboratory. A convenience sample of patients (N=32) who were prescribed a valgus unloader brace agreed to participate, met the inclusion criteria, and completed the baseline data collection. Twenty-four participants (20 men, 4 women) completed baseline and follow-up collections. Participants wore their valgus unloader brace as needed. Knee extensor, flexor, and plantar flexor strength was tested at baseline and follow-up. Participants filled out Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) and Medical Outcomes Study 36-Item Short-Form Health Survey questionnaires to assess pain and function. Self-selected walking velocity and stride length were objective measures of function. Brace usage (dose) and activity (step count) were recorded at least 4 days/week for the study duration. Positive relationships existed between brace wear usage and percent change in step count (r=.59, P=.006) and percent change in hamstrings strength (r=.37, P=.072). At follow-up, there was significant improvement in hamstrings strength (P=.013), and trends toward improvements in WOMAC pain (P=.059) and WOMAC function (P=.089).
Our results indicate that greater brace use may positively affect physical activity level, but there was minimal effect of brace wear dosage on lower-limb muscle strength. Only knee flexion showed a positive relationship. Our finding of no decreased muscle strength indicates that increased brace use over a 6-month period does not result in muscle impairment.
closed_qa
Does innovation in obesity drugs affect stock markets?
This study empirically analyzes the effects of public information about the pharmaceutical R&D process on the market valuation of the sponsoring firm. We examined the market's response to scientific news and regulatory decisions about an antiobesity drug, rimonabant, and the effects on the sponsoring company (Sanofi-Aventis) and its incumbent competitors (Abbott and Roche). Event study methodology was used to test the null hypothesis of no market response. We covered the full life cycle of rimonabant (1994-2008), using a data set of daily closing price and volume. The results suggest that scientific news in the initial stages of the drug R&D process (i.e., drug discovery, preclinical and clinical trials) had no significant effects. However, news related to regulatory decisions, such as recall or safety warning, had significant negative effects on the company's market value. No spillover/contagion effects on competitor firms were detected.
Market reactions occur at the time when the regulator takes decisions about drugs. Scientific news, even high-impact news, may pass unnoticed.
closed_qa
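For readers unfamiliar with the event study methodology named above: one standard (market-model) formulation regresses the stock's return on the market over an estimation window, then tests whether cumulative abnormal returns (CARs) around the event differ from zero. A minimal sketch with simulated returns; nothing here is the study's data or exact specification:

```python
import numpy as np

def car_t_stat(stock, market, est_end, evt_start, evt_end):
    """Market-model event study: fit alpha/beta on the estimation window,
    then test the cumulative abnormal return (CAR) over the event window."""
    r_est, m_est = stock[:est_end], market[:est_end]
    beta, alpha = np.polyfit(m_est, r_est, 1)         # OLS market model
    resid = r_est - (alpha + beta * m_est)
    sigma = resid.std(ddof=2)                         # residual standard error
    ar = stock[evt_start:evt_end] - (alpha + beta * market[evt_start:evt_end])
    car = ar.sum()
    t = car / (sigma * np.sqrt(len(ar)))              # simple CAR t-statistic
    return car, t

# Toy daily returns: 250-day estimation window, 5-day event window.
rng = np.random.default_rng(1)
mkt = rng.normal(0, 0.01, 255)
stk = 0.0002 + 1.1 * mkt + rng.normal(0, 0.012, 255)
stk[250:255] -= 0.02                                  # injected negative "news" effect
print(car_t_stat(stk, mkt, 250, 250, 255))
```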
Congenital lung anomalies: can we postpone resection?
The management of asymptomatic congenital lung lesions is controversial. It is unclear whether elective resection provides a significant benefit. We sought to determine whether early vs delayed resection of asymptomatic congenital lung malformations resulted in complications. Institutional billing records were queried for patients with lung malformations over a 10-year period. Medical records were reviewed for demographics, type of anomaly, symptoms, management, and procedural or disease-related complications. Eighty-seven patients were identified. The diagnoses included congenital cystic adenomatoid malformation (41%), bronchogenic cyst (19.3%), sequestration (13.2%), and congenital lobar emphysema (12.0%). Fifty patients were observed for some period. Eleven became symptomatic, and 47 underwent resection at a mean age of 11 months. There was no difference in the type of resection, length of hospitalization, or complication rate between patients who underwent early vs delayed resection. There were no occurrences of malignancy or death.
In our series, there was no difference in measurable outcomes between early and delayed resection of congenital lung lesions. These data provide some support for a management strategy that might include observation with delayed resection for asymptomatic patients.
closed_qa
Agency for Healthcare Research and Quality pediatric indicators as a quality metric for surgery in children: do they predict adverse outcomes?
The pediatric quality indicators (PDIs) were developed by the Agency for Healthcare Research and Quality to compare patient safety and quality of pediatric care. These are being considered for mandatory reporting as well as pay-for-performance efforts. The present study evaluates the PDIs' predictive value for surgical outcomes in children. A cross-sectional study was performed using nationwide inpatient data from 1988 to 2007. Patients younger than 18 years with an inpatient surgical procedure were included and evaluated for 10 PDIs. Odds ratios for mortality, increase in length of stay, and total charges were calculated using multivariate regression adjusting for age, sex, race, region, hospital type, and comorbidities. A total of 1,964,456 pediatric discharges were included. Mortality rates were 5.4% for patients with at least 1 PDI and 0.6% for those with none. Multivariate analysis showed that occurrence of any PDI was associated with a 20% increased risk of mortality. The PDIs were associated with an increased length of stay and total hospital charges.
The present study shows that PDIs are associated with increased mortality risk as well as increased hospital stay and total hospital charges. This provides positive evidence for the utility of these indicators as metrics for quality and patient safety.
closed_qa
Pediatric chronic ulcerative colitis: does infliximab increase post-ileal pouch anal anastomosis complications?
Total proctocolectomy with ileal pouch anal anastomosis (IPAA) is a common surgical approach to chronic ulcerative colitis (CUC). Preoperative use of Infliximab (IFX) has raised concern of increased postoperative complications. We sought to compare outcomes of pediatric patients (≤ 18 years) who were treated with IFX before IPAA to those who did not. Patients (≤ 18 years of age) who underwent IPAA from 2003 to 2008 for CUC were included, and their records were retrospectively reviewed for preoperative medications, operative technique, and 1-year postoperative complications (leak, wound infection, small bowel obstruction, pouchitis). Subjects were divided into 2 groups--those who received IFX preoperatively and those who did not. Eleven patients received IFX preoperatively, and 27 children did not. All complications following IPAA were more frequent in the IFX group compared to controls (55% vs 26%). Small bowel obstruction was significantly higher in the IFX group (55% vs 7%). Long-term complications occurred in 64% of the IFX group and 61% of the controls.
Children that were treated with IFX prior to IPAA suffered twice as many postoperative complications. Long-term outcomes are similar. Currently, we recommend colectomy with end ileostomy for patients that receive IFX within 8 weeks of colectomy for CUC.
closed_qa
Is daily dilatation by parents necessary after surgery for Hirschsprung disease and anorectal malformations?
Most surgeons recommend daily dilatation after surgery for Hirschsprung disease and anorectal malformations. Our goal was to critically evaluate the potential risks and benefits of this practice. A retrospective chart review was carried out of all children undergoing repair of Hirschsprung disease or anorectal malformation over 5 years. Patients with long segment Hirschsprung disease or anal stenosis were excluded. There were 95 patients, of whom 34 had Hirschsprung disease and 61 had an anorectal malformation. Postoperatively, 65 underwent routine dilatation by parents, and 30 underwent weekly calibration by the surgeon, with daily dilatation by the parents only if the anastomosis was felt to be narrow. Of the 30 children undergoing weekly calibration, only 5 (17%) developed late narrowing that required conversion to the daily parental dilatation approach. There were no significant differences between the 2 approaches with respect to stricture development, anastomotic disruption, perineal excoriation, or enterocolitis.
Weekly calibration by the surgeon is associated with similar outcomes to daily dilatation by the parents. Because this approach is kinder to the parents and the child, it should be seriously considered for the postoperative management of children with Hirschsprung disease or anorectal malformations.
closed_qa
Do we increase the operative risk by adding the Cox Maze III procedure to aortic valve replacement and coronary artery bypass surgery?
Recent reports from Europe and the United States have suggested that patients presenting for open surgery with a significant history of atrial fibrillation (AF) have inferior early and late outcomes if AF is left untreated. On the other hand, there is reluctance among surgeons to treat AF surgically, especially when atriotomies may be required otherwise, which is the case with aortic valve replacement (AVR) or coronary artery bypass grafting (CABG). The objective of this study was to explore the potential impact of the addition of the Cox Maze III procedure on short- and long-term outcomes of patients when combined with AVR or CABG. Since 2005, 485 patients have undergone the Cox Maze III procedure at Inova Heart and Vascular Institute, 95 of whom had a full Cox Maze III with an AVR or CABG (Cox Maze III/AVR = 30; Cox Maze III/CABG = 47; Cox Maze III/AVR/CABG = 18). In addition, 4255 patients with no history of AF underwent AVR or CABG without surgical ablation (AVR = 422; CABG = 3518; AVR/CABG = 315). Data from our CABG, valve, and AF registries were used for analyses. Patients with and without the Cox Maze III were propensity score matched using a 0.10 caliper to improve balance on clinical and demographic variables. Differences in perioperative and postoperative outcomes by group were evaluated using the Fisher exact test, and a Kaplan-Meier survival analysis was completed. Health-related quality of life (Short Form 12) was obtained at baseline and 6 months post-surgery (n = 72). All 95 patients who underwent the Cox Maze III were propensity score matched with patients who did not undergo the Cox Maze III. Mean age (t = 0.3, P = .79) and European System for Cardiac Operative Risk Evaluation score (t = -1.8, P = .07) were similar between the groups. There were no significant differences in major postoperative morbidities between the groups despite the Cox Maze III group being on bypass longer (164.4 vs 108.8 minutes; t = -9.8, P<.001). Pacemaker implantation was significantly higher in the Cox Maze III group (P = .03). Survival during follow-up (mean = 35 months) was not different between patients who did and did not undergo the Cox Maze III procedure (log rank = 0.49, P = .48). Improvement in physical health-related quality of life was similar for both groups (F = 0.01, P = .94). At 1 year, 94% of the patients (60/64) who underwent the Cox Maze III procedure were in sinus rhythm (81% off class I and III antiarrhythmic drugs).
The addition of the Cox Maze III procedure to AVR or CABG did not convey an increase in major morbidity and perioperative risk. Patients who underwent the Cox Maze III procedure demonstrated similar survival over time with improvement in health-related quality of life. The Cox Maze III should not be denied to patients in whom the cardiac surgical procedure does not include atriotomies because of the perceived increased operative risk. The Cox Maze III may significantly improve their outcome.
closed_qa
Emergency medical services triage using the emergency severity index: is it reliable and valid?
Efficient communication between emergency medical services (EMS) and ED providers using a common triage system may enable more effective transfers when EMS arrives in the emergency department. We sought (1) to evaluate inter-rater reliability between Emergency Severity Index (ESI) assignments designated by EMS personnel and emergency triage nurses (registered nurses [RNs]) and (2) to evaluate the validity of EMS triage assignments using the ESI instrument. This prospective, observational study evaluated inter-rater reliability in ESI scores assigned by prehospital personnel and RNs. EMS providers were trained to use the ESI by the same methods used for nurse training. EMS personnel assigned triage scores to patients independent of assignments by the RN. Inter-rater reliability, differences based on provider experience, and validity of EMS triage assignments (sensitivity and specificity) were evaluated. Seventy-five paired, blinded triages were completed. Overall concordance between EMS providers and RNs was 0.409 (95% confidence interval [CI], 0.256-0.562). Agreement for EMS providers with less experience was 0.519 (95% CI, 0.258-0.780), whereas concordance for those with more experience was 0.348 (95% CI, 0.160-0.536; χ(2) = 1.413, df = 1, P = .235). Sensitivity ranged from 0% to 67.86%. Specificity ranged from 68.09% to 97.26%.
We observed moderate concordance between EMS and RN ESI triage assignments. EMS sensitivity for correct acuity assignment was generally poor, whereas specificity for correctly not assigning a particular level was better. Additional research investigating the potential causes of the poor agreement that we observed is warranted.
closed_qa
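The concordance statistics in the record above are kappa-type agreement measures. As a reference point, a minimal Python sketch of unweighted Cohen's kappa for paired triage ratings (the study may have used a weighted variant; the data below are toy values, not the study's):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Unweighted Cohen's kappa for two raters over the same cases."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n        # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / n**2          # chance agreement
    return (po - pe) / (1 - pe)

# Toy ESI levels (1-5) assigned by EMS and triage nurses to 10 patients.
ems   = [3, 2, 3, 4, 2, 5, 3, 1, 4, 3]
nurse = [3, 3, 3, 4, 2, 4, 3, 2, 4, 3]
print(f"kappa = {cohens_kappa(ems, nurse):.3f}")
```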
Does a combination of two radiographs increase accuracy in detecting acid-induced periapical lesions and does it approach the accuracy of cone-beam computed tomography scanning?
The purpose of this study was to determine whether the use of a combination of 2 images (storage phosphor plates [SPPs] and F-speed films [Eastman Kodak, Rochester, NY]) with a 10° difference in horizontal beam angulation resulted in better detectability of chemically created periapical defects than when only 1 image was used and whether a detectability as good as that achieved by limited cone-beam computed tomography (LCBCT) scanning could be achieved. Lesions were created by 1, 1.5, and 2 hours of acid application apical to extracted teeth in jaw specimens. After repositioning, teeth were radiographed with Accu-I-Tomo LCBCT, Digora Optime SPP system, and F-speed films. The SPPs and films were exposed at 0° and 10° horizontal angulations. The diagnostic accuracy (Az) was compared using 2-way analysis of variance; pair-wise comparisons were performed using the post hoc t test. Kappa was used to measure interobserver agreement. A combination of 2 exposures with a 10° difference in horizontal angulation caused an increase, although not statistically significant, in the accuracy of both films and SPPs for all acid durations (P>.05) compared with when only 1 exposure was used. The accuracy did not approach that of LCBCT.
Using a combination of 2 exposures instead of 1 did not significantly increase the accuracy in detecting acid-induced lesions at the apices of single-rooted premolars. The accuracy of LCBCT was superior.
closed_qa
PGE(2) receptor (EP(4)) agonists: potent dilators of human bronchi and future asthma therapy?
Asthma and chronic obstructive pulmonary disease are characterized by inappropriate constriction of the airway smooth muscle. In this context, the physiological response of the human airways to selective relaxant agonists like PGE(2) is highly relevant. The aim of this study was thus to characterize the PGE(2) receptor subtypes (EP(2) or EP(4)) involved in the relaxation of human bronchial preparations. Human bronchial preparations cut as rings were mounted in organ baths for isometric recording of tension and a pharmacological study was performed using selective EP(2) or EP(4) ligands. In the presence of a thromboxane TP receptor antagonist and indomethacin, PGE(2) induced the relaxation of human bronchi (E(max) = 86 ± 4% of papaverine response; pEC(50) value = 7.06 ± 0.13; n = 6). This bronchodilation was significantly blocked by a selective EP(4) receptor antagonist (GW627368X, 1 and 10 μmol/L) with a pK(B) value of 6.38 ± 0.19 (n = 5). In addition, the selective EP(4) receptor agonists (ONO-AE1-329; L-902688), but not the selective EP(2) receptor agonist (ONO-AE1-259), induced potent relaxation of bronchial preparations pre-contracted with histamine or anti-IgE.
PGE(2) and EP(4) agonists induced potent relaxations of human bronchial preparations via EP(4) receptor. These observations suggest that EP(4) receptor agonists could constitute therapeutic agents to treat the increased airway resistance in asthma.
closed_qa
Is short-interval mammography necessary after breast conservation surgery and radiation treatment in breast cancer patients?
The optimum timing and frequency of mammography in breast cancer patients after breast-conserving therapy (BCT) are controversial. The American Society of Clinical Oncology recommends the first posttreatment mammogram 1 year after diagnosis but no earlier than 6 months after completion of radiotherapy. The National Comprehensive Cancer Network recommends annual mammography. Intermountain Healthcare currently follows a more frequent mammography schedule during the first 2 years in BCT patients. This retrospective study was undertaken to determine the cancer yield of mammography during the first 2 years after BCT. 1,435 patients received BCT at Intermountain Healthcare between 2003 and 2007, inclusive. Twenty-three patients had bilateral breast cancer (1,458 total breasts). Patients were followed up for 24 months after diagnosis. The 1- and 2-year mammography yields were determined and compared with those of the general screening population. 1,079 breasts had mammography at less than 1 year, and two ipsilateral recurrences (both noninvasive) were identified; 1,219 breasts had mammography during the second year, and nine recurrences (three invasive, six noninvasive) were identified. Of the 11 ipsilateral recurrences during the study, three presented with symptoms and eight were identified by mammography alone. The mammography yield was 1.9 cancers per 1,000 breasts the first year and 4.9 per 1,000 the second year.
These data demonstrate that the mammography yield during the first 2 years after BCT is not greater than that in the general screening population, and they support the policy of initiating follow-up mammography at 1 year after BCT.
closed_qa
Is there any role for urodynamic study in children with high-grade vesicoureteral reflux?
To determine the clinical symptoms and urodynamic characteristics of children with primary high-grade vesicoureteral reflux (VUR). We prospectively studied clinical symptoms and urodynamic parameters in 147 consecutive patients ≤ 12 years old with idiopathic high-grade VUR referred to our hospital. Of these 147 patients, 139, with a mean age of 5.3 years, met our inclusion criteria (88.5% females, 11.5% males). The most common symptoms were recurrent urinary tract infection (57%) and urgency (59%), followed by enuresis (31.6%) and frequency (26.6%). Normal urodynamic findings were observed in 23% of patients. Overactive bladder (74%), high end-filling pressure (72.7%), low-compliance bladder (56%), and low bladder capacity (51%) were the most common urodynamic findings in this study. Other urodynamic findings were underactive bladder (1.5%), hypersensitive bladder (1.5%), hyposensitive bladder (3%), and high-capacity bladder (2.2%).
Proper management of VUR is very important because of its potentially harmful effects on kidney function in children. Given that most children with grade III or higher VUR had overactive bladder, high end-filling pressure, or other abnormalities on urodynamic study, these urodynamic disorders could be the underlying cause of reflux.
closed_qa
Does prolonged warm ischemia after partial nephrectomy under pneumoperitoneum cause irreversible damage to the affected kidney?
We determined the effects of warm ischemia time on the recovery of renal function after partial nephrectomy under pneumoperitoneum. In this prospective study 37 consecutive patients who underwent laparoscopic partial nephrectomy or robot-assisted partial nephrectomy between June 2008 and May 2009 to remove a single cT1 renal tumor were evaluated using (99m)Tc-diethylenetriamine pentaacetic acid renal scintigraphy preoperatively, and at 3 and 12 months postoperatively. The most significant reduction in the glomerular filtration rate of the affected kidney at 3 and 12 months after surgery (p = 0.018, p = 0.036, respectively) was seen for a warm ischemia time cutoff of 28 minutes. The glomerular filtration rate of the affected kidney was consistently and significantly reduced at 3 and 12 months postoperatively (-22.4% to -30.6%, p<0.001) in patients with a warm ischemia time greater than 28 minutes. In contrast, no significant glomerular filtration rate change was seen in patients with a warm ischemia time of 28 minutes or less. In terms of the contributional change of the affected kidney to total renal function, there is a trend toward a recovery after an initial decrease in both groups with a warm ischemia time greater than 28 minutes vs 28 minutes or less. On multivariate analysis warm ischemia time was a strong independent predictor of glomerular filtration rate reduction even 12 months after surgery (β = -1.3; 95% CI -1.8, -0.7; p<0.001).
If the warm ischemia time is greater than 28 minutes during laparoscopic partial nephrectomy or robot-assisted partial nephrectomy, the functional damage to the affected kidney progresses even up to 1 year after surgery.
closed_qa
The impact of organ dysfunction in cirrhosis: survival at a cost?
The incidence of cirrhosis and subsequent development of organ dysfunction (OD) requiring intensive care unit (ICU) support is rising. Historically, critically ill cirrhotics are perceived as having poor prognosis and substantial cost of care. The aim was to prospectively analyse resource utilisation and cost of a large cohort of patients (n=660) admitted to a Liver ICU from 2000 to 2007 with cirrhosis and OD. Child Pugh, MELD, SOFA, APACHE II, and organ support requirements were collected. The Therapeutic Intervention Scoring System (TISS) score, a validated tool for estimating cost in ICU, was calculated daily. Logistic regression was used to determine independent predictors of increased cost. Alcohol was the most common etiology (47%) and variceal bleeding (VB) the most common reason for admission (35%). Invasive ventilatory support was required in 74% of cases, vasopressors in 49%, and 50% required renal replacement therapy. Forty-nine per cent of non-transplanted patients survived to ICU discharge. Median TISS score and ICU cost per patient were 261 and €14,139, respectively. VB patients had the highest survival rates (53% vs. 24%; p<0.001) and lower associated cost. A combination of VB (OR 0.48), need for ventilation (OR 2.81), low PO(2)/FiO(2) on admission (OR 0.97), and lactate (OR 0.93) improved cost prediction on multivariate analysis (AUROC 0.7; p<0.001) but organ failure scores per se were poor predictors of cost.
Patients with cirrhosis and OD result in considerable resource expenditure but have acceptable hospital survival. Further health economic assessment and outcome prediction tools are required to appropriately target resource utilisation.
closed_qa
Is routine blood cross-matching necessary in elective laparoscopic colorectal surgery?
Routine pre-operative cross-matching of two units of packed red cells (PRC) is current practice in most hospitals for patients undergoing elective laparoscopic colorectal surgery (LCS). To determine the usage of PRC in patients undergoing elective LCS and its cost implications, we performed a retrospective analysis of 116 consecutive laparoscopic colorectal resections under the care of 2 consultant surgeons. Surgical procedures were anterior resection (31.9%; n = 37), right hemicolectomy (22.4%; n = 26), sigmoid colectomy (22.4%; n = 26), subtotal colectomy (7.8%; n = 9), APR (4.3%; n = 5), panproctocolectomy (3.4%; n = 4), completion proctectomy (1.7%; n = 2), left hemicolectomy (0.9%; n = 1), total colectomy (0.9%; n = 1) and resection rectopexy (0.9%; n = 1). The median age was 65 years, and 58% of patients were female. The median pre-operative haemoglobin was 131 g/L, median blood loss 100 ml and median post-operative haemoglobin 111.5 g/L. Eleven cases were converted. Three patients required perioperative blood transfusion, 2 of whom underwent open conversion. The cost of carrying out a group and save (G&S) in our hospital is £40.60 excluding laboratory staff labour cost; a 2-unit cross-match costs £294.60. There is potential for substantial cost savings with a change of practice to G&S only.
G&S is sufficient to allow safe and cost-effective operative practice in laparoscopic colorectal surgery.
closed_qa
Are surrogate assumptions and use of diuretics associated with diagnosis and staging of acute kidney injury after cardiac surgery?
This study measured the association between the Acute Kidney Injury Network (AKIN) diagnostic and staging criteria and surrogates for baseline serum creatinine (SCr) and body weight, compared urine output (UO) with SCr criteria, and assessed the relationships between use of diuretics and calibration between criteria and prediction of outcomes. This was a retrospective cohort study using prospective measurements of SCr, hourly UO, body weight, and drug administration records from 5701 patients admitted, after cardiac surgery, to a cardiac intensive care unit between 1995 and 2006. More patients (n=2424, 42.5%) met SCr diagnostic criteria with calculated SCr assuming a baseline estimated GFR of 75 ml/min per 1.73 m(2) than with known baseline SCr (n=1043, 18.3%). Fewer patients (n=484, 8.5%) met UO diagnostic criteria with assumed body weight (70 kg) than with known weight (n=624, 10.9%). Agreement between SCr and UO criteria was fair (κ=0.28; 95% confidence interval 0.25-0.31). UO diagnostic criteria were specific (0.95; 0.94-0.95) but insensitive (0.36; 0.33-0.39) compared with SCr. Intravenous diuretics were associated with a higher probability of falling below the UO diagnostic threshold compared with SCr, higher 30-day mortality (relative risk, 2.27; 1.08-4.76), and the need for renal support (4.35; 1.82-10.4) compared with no diuretics.
Common surrogates for baseline estimated GFR and body weight were associated with misclassification of AKIN stage. UO criteria were insensitive compared with SCr. Intravenous diuretic use further reduced agreement and confounded association between AKIN stage and 30-day mortality or need for renal support.
closed_qa
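The "calculated SCr assuming a baseline estimated GFR of 75 ml/min per 1.73 m(2)" above refers to the common practice of back-solving a baseline creatinine from an eGFR equation. A minimal sketch using the 4-variable MDRD formula, the usual choice for this back-calculation; the record does not state which equation the authors used, so treat the formula choice as an assumption:

```python
def baseline_scr_mdrd(age, female, black, egfr=75.0):
    """Back-calculate baseline serum creatinine (mg/dl) from the 4-variable
    MDRD equation at an assumed eGFR (default 75 ml/min per 1.73 m^2).
    MDRD: eGFR = 186 * SCr^-1.154 * age^-0.203 * 0.742(female) * 1.212(black)."""
    factor = 186.0 * age ** -0.203
    if female:
        factor *= 0.742
    if black:
        factor *= 1.212
    return (factor / egfr) ** (1.0 / 1.154)

# Example: 65-year-old non-black male -> assumed baseline SCr ~1.05 mg/dl.
print(f"{baseline_scr_mdrd(65, female=False, black=False):.2f} mg/dl")
```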
Are routine pelvic radiographs in major pediatric blunt trauma necessary?
Screening pelvic radiographs to rule out pelvic fractures are routinely used for the initial evaluation of pediatric blunt trauma. Recently, the utility of routine pelvic radiographs in certain subsets of patients with blunt trauma has been questioned. There is a growing body of evidence that the clinical exam is reliable enough to obviate the need for routine screening pelvic radiographs in children. To identify variables that help predict the presence or absence of pelvic fractures in pediatric blunt trauma. We conducted a retrospective study from January 2005 to January 2010 using the trauma registry at a level 1 pediatric trauma center. We analyzed all level 1 and level 2 trauma victims, evaluating history, exam and mechanism of injury for association with the presence or absence of a pelvic fracture. Of 553 level 1 and 2 trauma patients who presented during the study period, 504 were included in the study. Most of these children, 486/504 (96.4%), showed no evidence of a pelvic fracture, while 18/504 (3.6%) had a pelvic fracture. No factors were found to be predictive of a pelvic fracture. However, we developed a pelvic fracture screening tool that accurately rules out the presence of a pelvic fracture (P = 0.008, NPV 99%, sensitivity 96%, 8.98 [1.52-52.8]). This screening tool combines eight high-risk clinical findings (pelvic tenderness, laceration, ecchymosis, abrasion, GCS<14, positive urinalysis, abdominal pain/tenderness, femur fracture) and five high-risk mechanisms of injury (unrestrained motor vehicle collision [MVC], MVC with ejection, MVC rollover, auto vs. pedestrian, auto vs. bicycle).
Pelvic fractures in pediatric major blunt trauma can reliably be ruled out by using our pelvic trauma screening tool. Although no findings accurately identified the presence of a pelvic fracture, the screening tool accurately identified the absence of a fracture, suggesting that pelvic radiographs are not warranted in this subset of patients.
closed_qa
Does hormone replacement therapy have beneficial effects on renal functions in menopausal women?
The study was carried out to evaluate the possible effects of hormone replacement therapy (HRT) on renal functions in postmenopausal women. A total of 85 postmenopausal women without a history of medical illness were enrolled in the study. They were divided into HRT users and control groups. After 30 weeks of HRT use, the changes in serum urea, creatinine, uric acid, urinary protein, urinary creatinine, urinary protein/creatinine ratio and glomerular filtration rate (GFR) (mL/min/1.73 m(2)) were evaluated. HRT was associated with statistically significant increases in glomerular filtration rate (p<0.01), while serum urea, creatinine, uric acid, urinary protein, urinary creatinine and urinary protein/creatinine ratio did not change significantly in both groups.
Our findings suggest that hormone replacement therapy affects renal function in postmenopausal women, with a beneficial effect of HRT on GFR in our postmenopausal patients. HRT may have protective mechanisms for the kidney against the adverse effects of aging.
closed_qa
Does the amplatzer septal occluder device alter ventricular contraction pattern?
To assess by cardiovascular magnetic resonance (CMR) and CMR tagging whether the Amplatzer Septal Occluder affects right ventricular (RV) and left ventricular (LV) motion pattern. Sixteen consecutive patients with significant atrial septal defect (ASD) and nine consecutive patients with persistent foramen ovale (PFO) as controls were studied before and a median of 14 days after defect closure with an Amplatzer occluder. By CMR, end-diastolic (EDV) and end-systolic (ESV) RV and LV volumes were determined. Aortic and pulmonary artery flow was measured for assessment of left-to-right shunt (Qp/Qs). By CMR tagging, circumferential strain, radial shortening, maximal rotation and torsion were measured. In ASD patients RV-EDV and RV-ESV decreased (P<0.05), while LV-EDV and LV-ESV increased after ASD closure (P<0.005). Qp/Qs dropped from 1.8 to 1.0 (P<0.001). PFO patients showed no ventricular volume change after PFO closure. In ASD patients circumferential strain, radial shortening and maximal rotation of the RV decreased after ASD closure (P<0.01). In the LV only maximal rotation at the base and apex decreased significantly (P<0.05). Torsion remained constant. In PFO patients no tagging parameter changed after defect closure.
The Amplatzer occluder itself does not change the ventricular contraction pattern. All volume and myocardial deformation changes were caused by ventricular loading shifts.
closed_qa
Carotid artery stenting in clinical practice: does sex matter?
Carotid artery stenting (CAS) is increasingly used for treatment of severe carotid artery stenosis, but few procedural risk factors for complications of CAS have been clearly defined. A possible impact of the patient's gender on the outcome of patients undergoing CAS has not been investigated properly, and only little information about this topic is available so far. We analysed data from the German prospective, multicenter CAS Registry of the Arbeitsgemeinschaft Leitende Kardiologische Krankenhausärzte (ALKK). From July 1996 to May 2009, 5130 patients underwent CAS at 35 German hospitals and were enrolled into the prospective ALKK CAS Registry. Of these, 1443 (28.1%) patients were female. There was no significant time-related difference in the proportion of women undergoing CAS over the years. Women undergoing CAS were significantly older than men (73 years vs. 70 years, p<0.01) and had a longer in-hospital stay in comparison to men (p<0.01). The majority of patients treated with CAS were between 60 and 80 years of age (∼73%). No significant differences between women and men could be found regarding in-hospital events like death (0.5% vs. 0.5%, p = 0.99), major or minor stroke (1.7% vs. 1.6%, p = 0.97; 1.0% vs. 1.6%, p = 0.12), TIA (2.8% vs. 2.6%, p = 0.64), amaurosis fugax (0.3% vs. 0.5%, p = 0.25), intracranial bleeding (0.5% vs. 0.3%, p = 0.43), myocardial infarction (0.1% vs. 0.0%, p = 0.48) or all non-fatal strokes and all deaths (3.0% vs. 3.4%, p = 0.47). 30-day event rates likewise showed no gender-related differences in the combined endpoint (♀ n = 31/882 [3.5%] vs. ♂ n = 109/2273 [4.8%], p = 0.12).
Our results do not suggest any gender-related differences in success rates and complications of CAS. In clinical practice approximately 30% of patients treated with CAS are women.
closed_qa
Metabolic syndrome in adolescence: can it be predicted from natal and parental profile?
There are well-established predisposing factors for the development of metabolic syndrome (MetS) in childhood or adolescence, but no specific risk profile has been identified as yet. The Prediction of Metabolic Syndrome in Adolescence (PREMA) study was conducted (1) to construct a classification score that could detect children at high risk for MetS in adolescence and (2) to test its predictive accuracy. In the derivation cohort (1270 children), data from natal and parental profile and from initial laboratory assessment at 6 to 8 years of age were used to detect independent predictors of MetS at 13 to 15 years of age according to the International Diabetes Federation definition. In the validation cohort (1091 adolescents), the discriminatory capacity of the derived prediction score was tested on an independent adolescent population. MetS was diagnosed in 105 adolescents in the derivation phase (8%), whereas birth weight<10th percentile (odds ratio, 6.02; 95% confidence interval, 2.53-10.12, P<0.001), birth head circumference<10th percentile (odds ratio, 4.15; 95% confidence interval, 2.04-7.14, P<0.001), and parental overweight or obesity (in at least 1 parent; odds ratio, 3.22; 95% confidence interval, 1.30-5.29, P<0.01) were independently associated with diagnosis of MetS in adolescence. Among adolescents in the validation cohort (86 [8%] with MetS), the presence of all these 3 predictors predicted MetS with a sensitivity of 91% and a specificity of 98%.
The coexistence of low birth weight, small head circumference, and parental history of overweight or obesity may be useful for detection of children at risk of developing MetS in adolescence.
closed_qa
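The record above derives a classification score from three independent predictors with reported odds ratios. One standard way to turn logistic-regression ORs into a points score is to weight each predictor by its log-odds ratio, sketched below; the PREMA authors' actual score construction may differ, so this is illustrative only.

```python
import math

# Illustrative log-odds point score built from the reported odds ratios
# (birth weight <10th pct: OR 6.02; head circumference <10th pct: OR 4.15;
# parental overweight/obesity: OR 3.22). Weighting by ln(OR) is the standard
# construction; whether PREMA used exactly this is an assumption.

WEIGHTS = {
    "birth_weight_lt_p10": math.log(6.02),
    "head_circumference_lt_p10": math.log(4.15),
    "parental_overweight": math.log(3.22),
}

def risk_score(predictors: dict) -> float:
    """Sum of log-OR weights for the risk factors that are present."""
    return sum(w for name, w in WEIGHTS.items() if predictors.get(name, False))

# Example: a child with all three risk factors.
print(round(risk_score({k: True for k in WEIGHTS}), 2))  # ≈ 4.39
```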
Does VMAT for treatment of NSCLC patients increase the risk of pneumonitis compared to IMRT ?
Volumetric modulated arc therapy (VMAT) for treatment of non-small cell lung cancer (NSCLC) patients potentially changes the risk of radiation-induced pneumonitis (RP) compared to intensity modulated radiation therapy (IMRT) if the dose to the healthy lung is changed significantly. In this study, clinical IMRT plans were used as the starting point for VMAT optimization, and differences in risk estimates of RP between the two plan types were evaluated. Fifteen NSCLC patients prescribed 66 Gy in 2 Gy fractions were planned with IMRT and subsequently with single-arc VMAT. Dose metrics were evaluated for target and lung, together with population-averaged dose volume histograms. The risk of RP was calculated using normal tissue complication probability (NTCP) models. Finally, applicability of the plans was tested through delivery on an Elekta accelerator. When changing from IMRT to VMAT, only modest differences were observed in the dose to the lung and target volume. On average, the fractions of lung irradiated to doses between 18 Gy and 48 Gy were statistically significantly reduced using VMAT compared to IMRT. For the fraction of lung receiving more than 20 Gy, the reduction was 1.2 percentage points (range -0.6% to 2.6%). The estimated toxicity was lower with VMAT compared to IMRT, although only modest differences were observed in the NTCP values. The plans were delivered without any problems. The average beam-on time with VMAT was 83 s, a reduction of 141 s (range 37 s to 216 s) compared to IMRT.
Using IMRT as the reference for VMAT optimization, it was possible to implement VMAT in the clinic with no increase in the estimated risk of RP. Thus, toxicity is not expected to be a hindrance to using VMAT, and patients will benefit from the shorter delivery time of VMAT compared to IMRT.
closed_qa
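The record above estimates pneumonitis risk from NTCP models. A widely used choice is the Lyman-Kutcher-Burman model, sketched below; the abstract does not state which model or parameters the authors used, so the lung parameter values (TD50, m, n) are commonly cited illustrative values, and the DVH is hypothetical.

```python
import math

# Minimal Lyman-Kutcher-Burman NTCP sketch for radiation pneumonitis.
# TD50, m, n are illustrative lung parameters (assumed, not from the study).

TD50, M, N = 30.8, 0.37, 0.99

def generalized_eud(dose_bins, volume_fractions, n=N):
    """gEUD from a differential DVH; volume fractions must sum to 1."""
    return sum(v * d ** (1.0 / n) for d, v in zip(dose_bins, volume_fractions)) ** n

def ntcp(eud, td50=TD50, m=M):
    """NTCP = standard normal CDF of t = (gEUD - TD50) / (m * TD50)."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Hypothetical differential DVH for the healthy lung (dose in Gy).
doses = [2, 10, 20, 40, 60]
volumes = [0.50, 0.25, 0.15, 0.07, 0.03]
print(round(ntcp(generalized_eud(doses, volumes)), 3))  # ≈ 0.042 here
```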
Does individual learning styles influence the choice to use a web-based ECG learning programme in a blended learning setting?
The compressed curriculum in modern knowledge-intensive medicine demands useful tools to achieve approved learning aims in a limited space of time. Web-based learning can be used in different ways to enhance learning. Little is however known regarding its optimal utilisation. Our aim was to investigate if the individual learning styles of medical students influence the choice to use a web-based ECG learning programme in a blended learning setting. The programme, with three types of modules (learning content, self-assessment questions and interactive ECG interpretation training), was offered on a voluntary basis during a face to face ECG learning course for undergraduate medical students. The Index of Learning Styles (ILS) and a general questionnaire including questions about computer and Internet usage, preferred future speciality and prior experience of E-learning were used to explore different factors related to the choice of using the programme or not. 93 (76%) out of 123 students answered the ILS instrument and 91 the general questionnaire. 55 students (59%) were defined as users of the web-based ECG-interpretation programme. Cronbach's alpha was analysed with coefficients above 0.7 in all of the four dimensions of ILS. There were no significant differences with regard to learning styles, as assessed by ILS, between the user and non-user groups; Active/Reflective; Visual/Verbal; Sensing/Intuitive; and Sequential/Global (p = 0.56-0.96). Neither did gender, prior experience of E-learning or preference for future speciality differ between groups.
Among medical students, neither learning styles according to the ILS nor a number of other characteristics seem to influence the choice to use a web-based ECG programme. This finding was consistent also when usage of the different modules in the programme was considered. Thus, the findings suggest that web-based learning may attract a broad variety of medical students.
closed_qa
Is the Modified Early Warning Score (MEWS) superior to clinician judgement in detecting critical illness in the pre-hospital environment?
A retrospective observational cohort study of consecutive adult (≥16 yrs) emergency department attendances to a single centre over a two-month period. The outcome of interest was the occurrence or not of an adverse event within 24 h of admission. Hospital pre-alerting was used as a measure of current critical illness detection, and its accuracy was compared with MEWS scores calculated from pre-hospital observations. 3504 patients were included in the study. 76 (2.5%) suffered an adverse event within 24 h of admission. Paramedics pre-alerted the hospital in 224 cases (7.3%). Clinical judgement demonstrated a sensitivity of 61.8% (95% CI 51.0-72.8%) with a specificity of 94.1% (95% CI 93.2-94.9%). MEWS was a good predictor of adverse outcomes and hence of critical illness (AUC 0.799, 95% CI 0.738-0.856). Combination systems of MEWS and clinical judgement may be effective (MEWS ≥4 plus clinical judgement: sensitivity 72.4% [95% CI 62.5-82.7%], specificity 84.8% [95% CI 83.5-86.1%]).
Clinical judgement alone has a low sensitivity for critical illness in the pre-hospital environment. The addition of MEWS improves detection at the expense of reduced specificity. The optimal scoring system to be employed in this setting is yet to be elucidated.
closed_qa
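For context, the clinical-judgement figures in the record above imply an approximate 2×2 table: with 76 adverse events among 3504 patients, a sensitivity of 61.8% corresponds to roughly 47 true positives. The cell counts below are back-calculated from rounded percentages, so they are approximate rather than the study's exact data.

```python
# Back-calculating the 2x2 table behind the clinical-judgement figures above.
# Counts are approximate: the rounded percentages do not pin the cells down.

def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

tp, fn = 47, 29      # 47 / 76  = 61.8% sensitivity
tn, fp = 3226, 202   # 3226 / 3428 = 94.1% specificity
print(f"sensitivity={sensitivity(tp, fn):.3f}, specificity={specificity(tn, fp):.3f}")
```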
Is routine histopathology of tonsil specimen necessary?
Tonsillar diseases are common in paediatric and adult otolaryngological practice, and many require tonsillectomy. In our institution, specimens are routinely subjected to histopathology for fear of missing infection or tumour, without consideration of risk factors. This places a financial burden on patients and wastes histopathologists' time, as other specimens are left unattended. This study aims to determine whether routine histopathology of tonsil specimens is necessary. We performed a 2-year retrospective review of the histopathological results of two groups (paediatric and adult) totalling 61 patients managed for tonsillar diseases at the ENT unit of Jos University Teaching Hospital from July 2005 to June 2007. Data extracted included biodata, clinical features and histopathological diagnosis. The 61 patients comprised 35 children and 26 adults. Paediatric ages ranged from 1 year 3 months to 16 years, and adult ages from 17 to 50 years; mean ages were 5.1 and 28.5 years, respectively. The gender ratios were 1:2.7 and 1:1.9, respectively. One adult was HIV positive. The histopathological diagnoses were chronic nonspecific tonsillitis in 10 (16.6%), follicular tonsillitis in 23 (38.3%), chronic suppurative tonsillitis in 10 (16.6%), lymphoid hyperplasia in 18 (30.0%) and lymphoma in 1 (1.0%).
Requests for histopathology of tonsillectomy specimens should be based on defined risk factors, in consideration of the cost to patients and to spare histopathologists' time.
closed_qa
Hepatocellular carcinoma in hepatitis D: does it differ from hepatitis B monoinfection?
A total of 92 consecutive HCC cases seropositive for antibody against HDV antigen (HDV group) were compared with 92 HBsAg-positive and anti-HDV-negative cases (HBV group). Features including sex, body mass index, presence of ascites, serum biochemistry, gross tumor appearance, Child class, Barcelona Clinic Liver Cancer and Okuda stages were not significantly different between the 2 groups. Decreased liver size was noticed more often in the HDV group, whereas in the HBV group liver size was normal or increased (P=0.000). HDV patients had lower platelets (P=0.053) and larger varices on endoscopy (P=0.004). Multifocal tumors and an elevated alpha-fetoprotein level>1000 IU/mL were more common in the HBV group (P=0.040 and P=0.061). TNM classification showed more stage III-IV disease in the HBV group (P=0.000).
Decreased liver size and indirect evidence of more severe portal hypertension and earlier TNM stage compared with HBV monoinfection indicate that HDV infection causes HCC in a different way, possibly indirectly by inducing inflammation and cirrhosis.
closed_qa
Outpatient blind percutaneous liver biopsy in infants and children: is it safe?
Blind percutaneous liver biopsy (BPLB) was performed as an outpatient procedure using the aspiration Menghini technique in 80 infants and children, aged 2 months to 14 yrs, for diagnosis of their chronic liver disease (CLD). Patients were divided into three groups: group 1 (<1 year), group 2 (1-6 yrs), and group 3 (6-14 yrs). Vital signs were closely monitored 1 hr before biopsy, and then 1, 2, 6, and 24 hrs after biopsy. Twenty-four-hour pre- and post-biopsy complete blood counts, liver enzymes, prothrombin time (PT), and abdominal ultrasonography, searching for a biopsy-induced hematoma, were done for all patients. No mortality or major morbidity was encountered after BPLB. The rate of minor complications was 17.5%, including irritability or "pain" requiring analgesia in 10%, mild fever in 5%, and drowsiness for>6 hrs due to oversedation in 2.5%. There was a statistically significant rise in the 1-hr post-biopsy mean heart and respiratory rates, but the rise was non-significant at 6 and 24 hrs, except in group 2 where heart and respiratory rates dropped significantly at 24 hrs. No statistically significant difference was noted between the mean pre-biopsy and the 1-, 6-, and 24-hrs post-biopsy blood pressure values in any group. The 24-hrs post-biopsy mean hemoglobin and hematocrit showed a significant decrease, while the 24-hrs post-biopsy mean total leucocyte and platelet counts showed non-significant changes. The 24-hrs post-biopsy mean liver enzymes showed non-significant changes, whereas the 24-hrs post-biopsy mean PT was significantly prolonged, for as yet unknown reasons.
Outpatient BPLB performed by the Menghini technique is safe and well tolerated even in infants and young children. Frequent, close monitoring of patients is strongly recommended to achieve optimal patient safety and avoid potential complications.
closed_qa
Does bladder wall thickness decrease when obstruction is resolved?
The aim of the current study was to determine whether sonographic bladder wall thickness diminishes after symptomatic obstruction is resolved in female patients who have undergone stress incontinence surgery. Between December 2008 and December 2010, 62 female patients with symptomatic bladder outlet obstruction, as defined by Blaivas, who had undergone prior surgery for urinary stress incontinence were included in the study. The patients' history was taken and symptoms were noted. Patients underwent gynaecological examination, and multichannel urodynamic assessment was performed. Vaginal sonographic assessment of the bladder wall thickness (BWT) was performed before and after urethrolysis. Of the 62 patients, 55 had undergone suburethral sling insertion and seven Burch colposuspension. Postoperatively, BWT decreased significantly from 9.1 ± 2.1 mm to 7.6 ± 2.2 mm (p<0.0001). In seven patients, obstruction was still unresolved postoperatively; of these, two had undergone retropubic sling insertion and two Burch colposuspension. An ROC curve analysis showed a significant positive association between preoperative residual urine and persistent obstruction (AUC 0.76, 95% CI 0.58-0.94; p<0.05).
If obstruction is resolved, bladder wall thickness decreases. Preoperatively elevated residual urine may increase the risk of persistent obstruction after urethrolysis.
closed_qa
The relevance of factor VIII (FVIII) pharmacokinetics to TDM and hemophilia a treatment: is B domain-deleted FVIII equivalent to full-length FVIII?
Recombinant DNA-derived clotting factor VIII concentrates (rFVIII) potentially have safety advantages over plasma-derived products. Removal of the B domain of the FVIII molecule does not appear to reduce procoagulant activity and improves manufacturing efficiency. However, in clinical use, possible differences in hemostatic efficacy between the full-length (FL) and B-domain-deleted (BDD) molecules have emerged. This article predicts the impact that differences in pharmacokinetic behavior between BDD- and FL-rFVIII may have on bleed prophylaxis in hemophilia A. Published data on the pharmacokinetic and biological effects of FL- and BDD-rFVIII were examined, and well-established pharmacokinetic modeling was applied to therapeutic target plasma concentrations of FL- and BDD-rFVIII. Biochemical differences between the 2 molecules can be shown in standard laboratory assays, and in vivo BDD-rFVIII appears to have a shorter half-life, possibly because of greater susceptibility to proteolytic degradation. Theoretical modeling demonstrates that switching patients from FL-rFVIII to BDD-rFVIII could result in very different concentrations of active clotting factor.
As demonstrated, around 40% of patients, if switched from FL-rFVIII to BDD-rFVIII, would have lower concentrations of FVIII in the blood. It is essential that clinicians are aware of this possibility and that there is sufficient and appropriate follow-up of patients with hemophilia A who switch the type of factor concentrate used in their treatment.
closed_qa
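The pharmacokinetic argument in the record above reduces to how a shorter half-life lowers trough FVIII levels on the same prophylaxis schedule. A one-compartment sketch follows; the peak level, dosing interval and half-life values are illustrative assumptions, not figures from the article.

```python
import math

# One-compartment decay sketch: C(t) = C_peak * exp(-ln(2) * t / t_half).
# Peak, interval and half-lives below are assumed, for illustration only.

def trough(peak_iu_dl, half_life_h, dosing_interval_h):
    """Plasma FVIII activity just before the next dose."""
    return peak_iu_dl * math.exp(-math.log(2) * dosing_interval_h / half_life_h)

peak = 60.0      # IU/dL immediately after infusion (assumed)
interval = 48.0  # dosing every 48 h (assumed)
for label, t_half in [("FL-rFVIII", 12.0), ("BDD-rFVIII", 10.5)]:
    print(f"{label}: trough ≈ {trough(peak, t_half, interval):.1f} IU/dL")
```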
Escalation of cocaine intake with extended access in rats: dysregulated addiction or regulated acquisition?
Understanding the neurobehavioral mechanisms underlying dysregulated cocaine intake is important for the development of new cocaine abuse therapies. The current study determined if cocaine escalation under extended access conditions (6-h access) is regulated by discrimination learning processes. Rats were initially trained on cocaine self-administration (0.1 or 0.25 mg/kg/infusion) using a fixed ratio 1 (FR 1) schedule under 1-h access for 12 sessions. Some rats were then trained to self-administer cocaine under 1-h or 6-h access conditions exclusively for 14 additional sessions, while other rats were trained under both 1- and 6-h access conditions that were cued or noncued for 28 additional sessions (14 sessions for each 1- and 6-h access). Two additional groups of rats were initially trained to self-administer cocaine using an FR 1 schedule under 10-min access for 12 sessions; half of the animals were then switched to 60-min access conditions for 14 additional sessions. When access conditions were differentially cued, escalation of cocaine intake was evident in animals with both 1- and 6-h access conditions during the escalation phase. Escalation also was evident in animals initially trained with 10-min access and then switched to 60-min access.
The results demonstrate that dysregulated and regulated intakes can be expressed within the same animal, indicating that escalation is context-dependent. Furthermore, escalated cocaine intake can be expressed under 1-h access conditions. Overall, these results suggest that escalated cocaine intake may be representative of discrimination-dependent regulated intake rather than addiction-like, compulsive intake.
closed_qa
Concurrent endometrial intraepithelial carcinoma (EIC) and serous ovarian cancer: can EIC be seen as the precursor lesion?
The pathogenesis of serous ovarian carcinoma (SOC) is still unknown. Recently, endometrial intraepithelial carcinoma (EIC) was proposed to be the precursor lesion of SOC. This study examines the model of EIC as precursor for SOC. Cases of SOC with a noninvasive or superficially invasive serous lesion, a hyperplastic lesion with/without atypia, or EIC in the endometrium were selected for inclusion in this study. Tissue sections from both ovaries, the fallopian tubes, and the uterus were extensively reviewed by an expert gynecopathologist. For both EIC and SOC, immunostaining for p53, Ki-67, estrogen receptor, and progesterone receptor; TP53 mutation analysis; and in situ ploidy analysis were performed. Nine cases of SOC with concurrent EIC in the endometrium were identified. Immunostaining for p53, Ki-67, estrogen receptor, and progesterone receptor revealed almost identical expression patterns and similar intensities in each pair of EIC and coincident SOC. Identical TP53 mutations were found in SOC and coinciding EIC in 33% of the cases, suggesting a clonal origin. DNA ploidy analysis, as a marker for neoplastic progression, demonstrated an increased number of aneuploid nuclei in SOC compared to their corresponding EIC (P = 0.039). In addition, the mean amount of DNA per nucleus in SOC was higher (ie, more aneuploid) compared to EIC (P = 0.039).
This study provides a first indication of EIC as a possible precursor lesion for SOC. This finding could have major clinical implications for future ovarian cancer management and underscores EIC as a possible target for early SOC detection and prevention.
closed_qa
Nasal batten grafts: are patients satisfied?
To learn how nasal batten grafts affect patients' assessment of their nasal airway patency and to determine the extent to which patients believe batten grafts altered their appearance. A prospective survey study was completed of 18 patients in a tertiary veterans hospital who had nasal airway obstruction (NAO) due to nasal valve collapse. Patients had placement of bilateral polyethylene batten grafts during a 36-month study period. The Nasal Obstruction Symptom Evaluation (NOSE) validated survey was used to measure each patient's subjective postoperative change in nasal airway obstruction. In addition, the patients were asked to rate the extent to which their appearance had changed. All patients presented with complaints of NAO due to nasal valve collapse, either in isolation or in combination with another anatomical source of obstruction. The nasal valve collapse was identified by clinical examination. All patients had preoperative photographs. Most patients had a trial with an intranasal stent before opting for surgical implantation of the batten grafts. The results of the NOSE survey demonstrate significant improvement in nasal obstruction. Patients also reported only a minimal change in appearance. One patient had implant extrusion, and only a few implants were removed.
Nasal airway obstruction due to nasal valve collapse can be effectively treated with polyethylene batten grafts. The implants are well tolerated, and patients report a significant improvement in NAO. There is little risk of implant extrusion, exposure, or intolerance. In addition, patients did not note a significant change to their appearance.
closed_qa
Faculty development projects for international health professions educators: Vehicles for institutional change?
Projects are an important tool in faculty development, and project emphasis may offer insights into perceived education priorities. The impact of projects has focused on individuals, not institutions or health. Education innovation projects of Fellows in an international faculty development program were therefore examined to better understand perceived needs in health professions education and the institutional impact of projects. Four hundred and thirty-five projects were analyzed to identify focus areas, and Fellows were asked to identify changes in their schools and communities resulting from their projects. New education methods and curriculum change were common project focus areas. Regional differences were evident, with a higher percentage of education methods projects by Fellows residing in India (52%) compared with South Africa (25%) and Brazil (24%). Fifty-six percent of projects were incorporated into the curriculum and/or adopted as institutional policy. One-third to two-thirds of respondents noted improved teaching quality, collaboration, education research interest, assessment, student performance, and curriculum alignment with community health needs.
National differences in project focus may offer insight into local conditions and needs. High rates of diffusion of projects, and their impact on faculty, students and curriculum, suggest that faculty development projects may be a strategy for institutional change in resource-limited environments.
closed_qa
Does comorbid depression predict subsequent adverse life events in youth with attention-deficit/hyperactivity disorders?
Studies have primarily focused on adverse life events (ALEs) as potential causes rather than as outcomes of pediatric depression. The current study prospectively examines ALEs in a sample of youth with attention-deficit/hyperactivity disorders (ADHD) to determine whether having a major depressive disorder (MDD) at baseline (T1) predicts counts of child-dependent or child-independent ALEs at a second assessment (T2) ≈ 8 months later. Subjects with ADHD aged 11-18 years were drawn mostly from a tertiary mental health clinic and evaluated with semi-structured diagnostic interviews and parent and teacher questionnaires of ADHD severity. Eighteen subjects with and 61 without initial MDD at T1 were compared at T2 regarding counts of subsequent overall, child-dependent, and child-independent ALEs reported on life events questionnaires by the child or parent. The group initially with MDD had higher overall ALEs (p=0.01) and child-dependent ALEs (p ≤ 0.001) but not child-independent ALEs (p=0.12) at T2 relative to the nondepressed group, although only 3 of 18 continued to meet full criteria for MDD. The group initially with MDD also had higher baseline ADHD severity (p=0.04) and a higher proportion of oppositional or conduct disorders (p=0.004). In multivariate analyses, the group initially having MDD had a higher adjusted mean at T2 of child-dependent ALEs (p=0.02), but not of overall ALEs (p=0.06), after controlling for other T1 variables, including ALEs of the same type, ADHD severity, externalizing disorders, and the interaction of externalizing disorders with MDD.
These findings suggest that child-dependent ALEs are potentially an important outcome after youth with ADHD have an episode of MDD. Youth with ADHD who develop comorbid MDD should be closely monitored and offered interventions to address the potential burden of child-dependent ALEs lingering after a depressive episode.
closed_qa
Does empathy change in first-year dental students?
Professionalism is a central tenet of the dental undergraduate curriculum. Dental undergraduate curricula and standards expect the dentist to put the patient's interests first, and in this respect, an important attitude is empathy. This study examined the self-reported empathy levels of first-year dental students before and after an early analytical exposure to behavioural sciences and the clinical encounter. First-year dental undergraduates were given an attitudinal questionnaire to complete before and after the behavioural science course. The questionnaire consisted of the HP version of the Jefferson Scale of Physician Empathy and the Patient-Practitioner Orientation Scale. Paired non-parametric tests and Spearman's Rho correlations, along with simple descriptive statistics, were used to test the statistical significance of observations. A total of 66 paired questionnaires were returned, giving a response rate of 75%. There were no correlations between age and total mean score of JSPE or PPOS, and no gender differences. There was a significant increase (P<0.01) in empathy as measured by the JSPE between pre- and post-course scores. The PPOS did not record any significant change in the sharing, caring or total scale scores pre- to post-course.
The modified JSPE has potential utility in assessing the cognitive-affective aspect of dental students' empathy. Using the JSPE, short-term measurable empathy changes can be detected in first-year dental undergraduates after the structured and assessed analytical introduction to the clinical encounter and environment.
closed_qa
Acquiring psychomotor skills in operative dentistry: do innate ability and motivation matter?
The acquisition of psychomotor skills is a key competence in the practice of dentistry, and innate abilities and motivation have been shown to influence motor performance. However, the explicit integration of these factors into the design of research projects on skill acquisition in dentistry has been limited. Therefore, the purpose of this study was to provide a comprehensive analysis of how dental students' abilities and motivation affected their performance in an operative task. A longitudinal study with two cohorts of dental students was conducted in laboratory classes forming part of an operative technique course. A range of standardised psychometric tests was used to assess different abilities before completion of a cavity preparation on Frasaco teeth, followed immediately by completion of an Intrinsic Motivation Inventory. Low but statistically significant correlations were found between dental performance and psychomotor ability (r=0.22, P<0.05), and between dental performance and motivation (r=0.19, P<0.05). In one cohort, a significant difference (P<0.05) was found in the grades obtained for the cavity preparation exercise between students with higher and lower levels of psychomotor ability (Tracing scores). No significant differences in grades obtained for the cavity preparation exercise were found between students with higher and lower levels of motivation.
Both innate psychomotor ability and motivation showed only weak positive associations with dental performance on cavity preparation exercises. Our study suggests that student-related factors only provide limited information to explain differences in performance or to be useful as specific predictors of future performance by individuals.
closed_qa
Preoperative MRI sphincter morphology and anal manometry: can they be markers of functional outcome following anterior resection for rectal cancer?
Consecutive patients with rectal adenocarcinoma underwent preoperative manometric assessment and MRI staging. MRIs were assessed with regard to anorectal angle, puborectalis thickness, canal length and external and internal anal sphincter thickness. Functional outcome was categorized into three groups according to the number of adverse postoperative symptoms (frequency, urgency, leakage, diarrhoea, use of pads, use of antidiarrhoeal medication): 0, 1 and ≥ 2. This was evaluated 1 year following surgery and 6 months following stoma reversal where applicable. Univariate analysis of an ordinal regression model was performed with significance at the 5% level. Thirty patients were assessed. No single preoperative manometric parameter proved significant (P>0.05). Only puborectalis thickness showed a significant (P = 0.01) relationship with the number of adverse symptoms suffered postoperatively. On receiver operating characteristics analysis, a cut-off value of 3.5 mm gave an optimal sensitivity of 0.5 (95% CI, 0.17-0.83) and specificity of 0.86 (95% CI, 0.64-0.96).
Measurements of the puborectalis thickness on preoperative staging MRIs for rectal cancer may help predict functional outcome following AR. Prospective assessment of larger numbers with a fully validated continence score are required to evaluate these findings further.
closed_qa
Wild boar: an increasing concern for Aujeszky's disease control in pigs?
The goal of this study was to describe the temporal evolution of Aujeszky's disease virus (ADV) contact prevalence among Eurasian wild boar (Sus scrofa) populations under different management regimes and contact likelihoods with domestic pigs. Given the recent increase in wild boar abundance throughout Europe, we hypothesized that wild boar contact with ADV would remain stable in time even after significant reduction of ADV prevalence in domestic pigs. Sera from 1659 wild boar were collected from 2000 to 2010 within 6 areas of the Iberian Peninsula and tested for the presence of antibodies against ADV by ELISA. According to sampling date, wild boar were grouped into three time periods, and ADV prevalence was compared across periods both globally and by geographic area. Overall seroprevalence for the ten-year study period was 49.6 ± 2.4%. The highest seroprevalence was recorded in areas with intense wild boar management. The annual proportion of ADV-positive wild boar sampling sites remained stable through the study period, while the percentage of AD-positive domestic pig counties decreased from 70% in 2003 to 1.7% in 2010.
Results presented herein confirm our hypothesis that ADV contact would remain almost stable in wild boar populations. This highlights the increasing risk wild boar pose in the final stages of ADV eradication in pigs, and for wildlife conservation.
closed_qa
Impending macrosomia: will induction of labour modify the risk of caesarean delivery?
To compare the annual incidence rates of caesarean delivery between induction of labour and expectant management in the setting of macrosomia. This is a retrospective cohort study of deliveries in the USA in 2003, covering singleton births of macrosomic neonates to low-risk nulliparous women at 39 weeks of gestation and beyond. Women who had induction of labour at 39 weeks of gestation with a neonatal birthweight of 4000 ± 125 g (3875-4125 g) were compared with women who delivered (after either induced or spontaneous labour) at 40, 41 or 42 weeks (i.e. expectant management), assuming an intrauterine fetal weight gain of 200 g per additional week of gestation. Similar comparisons were made at 40 and 41 weeks of gestation. Chi-square tests and multivariable logistic regression analysis were used for statistical comparison. Outcomes were method of delivery, 5-minute Apgar scores and neonatal injury. There were 132,112 women meeting the study criteria. In women whose labours were induced at 39 weeks and who delivered a neonate with a birthweight of 4000 ± 125 g, the frequency of caesarean was lower compared with women who delivered at a later gestational age (35.2% versus 40.9%; adjusted OR 1.25, 95% CI 1.17-1.33). This trend was maintained at both 40 weeks (36.1% versus 42.6%; adjusted OR 1.31, 95% CI 1.23-1.40) and 41 weeks (38.9% versus 41.8%; adjusted OR 1.16, 95% CI 1.06-1.28) of gestation.
In the setting of known birthweight, it appears that induction of labour may reduce the risk of caesarean delivery. Future research should concentrate on clinical and radiological methods to better estimate birthweight to facilitate improved clinical care. These findings deserve examination in a large, prospective, randomised trial.
closed_qa
Is early laparoscopic cholecystectomy a safe procedure in patients when the duration of acute cholecystitis is more than three days?
The role of laparoscopic cholecystectomy for patients with acute cholecystitis and symptoms for>3 days is debated. Our purpose was to compare the results of laparoscopic cholecystectomy in patients with acute cholecystitis and symptoms for ≤ 3 days and>3 days. Sixty patients with acute cholecystitis had a laparoscopic cholecystectomy performed by the same surgeon. There were 39 patients in the short group (symptoms ≤ 3 days) and 21 patients in the long group (symptoms>3 days). Demographic data, surgical findings and clinical results were analyzed. There were no significant differences in age, gender, comorbidities, abnormal liver function tests, white bile, gallbladder empyema, blood loss, conversion rate, postoperative hospital stay or complication rates between the groups. The mean duration of acute cholecystitis was 1.9 days in the short group and 5.3 days in the long group (p<0.0001). The long group had a longer operating time (p=0.004) and a higher rate of subhepatic drains (p=0.014).
Laparoscopic cholecystectomy is a safe and feasible procedure for patients with acute cholecystitis when the duration of symptoms is >3 days; however, a higher conversion rate is seen for acute on chronic cholecystitis.
closed_qa
Is lymph-node micrometastasis in gallbladder cancer a significant prognostic factor?
The purpose of our study was to investigate prognostic significance of lymph-node micrometastasis in gallbladder carcinoma. In total, 1,094 lymph nodes from 41 patients who had undergone radical resection with lymph-node dissection, including para-aortic lymph nodes were stained with hematoxylin and eosin (H&E) and immunostained with anti-cytokeratin 7/8 antibody. Micrometastasis in each lymph node was defined as tumor cells that were detectable only by immunohistochemical evaluation and were not detected by H&E staining. Metastases were detected in 163 lymph nodes (14.9%) by H&E staining. Micrometastases were found in 25 of the remaining lymph nodes (2.3%). Among 24 patients with lymph node metastasis based on the H&E staining, 12 had micrometastases. Of the 17 patients in whom lymph-node metastasis was not detected by the H&E staining, one was found to have micrometastasis. Micrometastasis correlated significantly with lymph node metastasis on H&E staining and pN (Tumor-Node-Metastasis 5th ed.). On multivariate analysis of data from 17 node-positive patients who underwent curative resection, micrometastasis and microscopic venous invasion were significant prognostic factors.
Our findings suggest that micrometastases may represent systemic dissemination of cancer cells rather than an initial stage of metastasis.
closed_qa
Are there any similarities in the hepatic vascular anatomy among blood relatives?
The existence of similarities in the hepatic vascular anatomy among blood relatives (BR) has never been studied before. Since the donor in living donor liver transplantation (LDLT) may be a BR, LDLT offers an opportunity to assess whether there are similarities in the hepatic vascular anatomy among BR. We conducted an analysis of 61 LDLT performed between January 2004 and August 2008. Based on preoperative multi-detector computed tomography data, the hepatic arteries (HA) were classified into 4 groups, the portal vein (PV) into 2 groups and the right hepatic vein (RHV) into 2 groups. The data of each group were then compared between BR (n=47) and non-blood-relative (NBR, n=14) donors. With regard to HA anatomy, the donor's anatomy matched that of the recipient in 30 cases (68%) for BR donors and 9 cases (69%) for NBR donors. PV anatomy matched in 41 cases (87%) for BR donors and 11 cases (79%) for NBR donors. RHV anatomy matched in 25 cases (53%) for BR donors and 9 cases (64%) for NBR donors. There were no significant differences in any comparison.
No similarities were therefore observed in the hepatic vascular anatomy among BR.
closed_qa
Should pancreaticoduodenectomy be performed in the elderly?
Pancreaticoduodenectomy (PD) is indicated in benign and malignant diseases of the pancreatic head. It is a difficult operation with high morbidity, especially in elderly patients. The aim of our study was to determine whether pancreaticoduodenectomy is associated with higher morbidity and mortality in patients ≥ 70 years old. Over 17 years, 173 patients underwent a Whipple procedure, regardless of the underlying disease. From a prospective database, patients were divided into 2 groups (group A ≥ 70 years old, group B<70). Postoperative mortality was not significantly higher in the elderly (12% vs. 4.1%; p=0.06). However, re-intervention and morbidity rates were higher in univariate analysis (p=0.03 and p=0.002, respectively). In multivariate analysis, age ≥ 70 years was not an independent prognostic factor for mortality (p=0.27) or re-intervention (p=0.07), whereas age (p=0.04) and preoperative morbidity (p=0.02) were independent prognostic factors for morbidity.
PD requires careful patient selection. However, age should not be a limiting factor.
closed_qa
Predictive value of positron emission tomography-computed tomography image fusion in the diagnosis of head and neck cancer: does it really improve staging and management?
To determine (1) the accuracy of positron emission tomography - computed tomography in the diagnosis of head and neck cancer, (2) the learning curve involved, and (3) whether its use alters patient management. A retrospective study included 80 patients with head and neck cancer who underwent positron emission tomography - computed tomography image fusion at Blackpool Victoria Hospital. Fifty-three patients underwent positron emission tomography - computed tomography for staging (32 for detection of a primary tumour and 21 for detection of distant metastasis) and 27 for detection of loco-regional recurrence. Ten primary tumours and 20 recurrences were accurately diagnosed by this method. Eighteen patients had their tumour stage and management modified as a result of this method of imaging. The learning curve was reflected in better true-positive detection rates one year after introduction (81 versus 61 per cent). The sensitivity and specificity of this method in detecting head and neck cancer were 70 and 42 per cent, respectively, whereas those of conventional imaging were 73 and 51 per cent, respectively.
Compared with magnetic resonance imaging, the benefits of positron emission tomography - computed tomography may be limited to diagnosis of recurrence, as it is less hindered by tissue fibrosis, radiotherapy-related oedema, scarring and inflammation.
closed_qa
Cochlear implantation in superficial siderosis: a viable option?
Superficial siderosis of the central nervous system is characterized by accumulation of haemosiderin in the subpial layers of the brain and spinal cord. The evidence largely suggests a retro-cochlear cause for the hearing loss, with questionable involvement of the cochlea. We present our experience with two patients with superficial siderosis who underwent cochlear implantation, and discuss their outcomes and the underlying pathology. The first patient developed gradually progressive, profound hearing loss over 25 years, with the clinical diagnosis made on MRI. The second patient was referred to us with bilateral sensorineural hearing loss, tinnitus, ataxia, dementia, seizures and visual impairment. Both underwent cochlear implantation for auditory rehabilitation. The first patient gained significant benefit, whereas the second had not developed any meaningful auditory stimulation at 9 months' post-operative follow-up.
Hearing loss due to superficial siderosis, even though predominantly retro-cochlear, may be successfully rehabilitated with a cochlear implant. However, outcomes are variable, and more evidence from long-term follow-up of cochlear implantation in such patients is desirable.
closed_qa
Resection of colorectal liver metastases in the elderly: does age matter?
A prospective database of resections for colorectal liver metastases at a single centre was retrospectively analysed to compare the outcome in patients aged ≥75 years (group E) with those aged <75 years (group Y). Data were analysed using the Kaplan-Meier method with Cox regression modelling. Of 1443 resections, 151 (10.5%) in group E were compared with 1292 (89.5%) in group Y. The two groups were matched apart from higher American Society of Anesthesiologists scores (P=0.001) and less use of chemotherapy (P=0.01) in the elderly. Perioperative morbidity and 90-day mortality were higher in the elderly compared with the younger group (32.5% vs 21.2%, P=0.02, and 7.3% vs 1.3%, P=0.001). In the last 5 years, mortality in the elderly improved and was no longer significantly different from that of the younger patients [n=2/76 (2.6%) vs n=9/559 (1.6%); P=0.063]. The 5-year cancer-specific (41.4% vs 41.6%, P=0.917) and overall (37.0% vs 38.2%) survival rates, and median survival (44.1 months vs 43.6 months, P=0.697), were similar in groups E and Y.
In the elderly, liver resection for metastatic disease can be performed with acceptable mortality and morbidity, and with as good a prospect of survival as for younger patients.
closed_qa
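The survival comparison in the record above rests on Kaplan-Meier estimation. A minimal estimator is sketched below with hypothetical (time, event) pairs; the study's analysis also included Cox regression, which is not shown here.

```python
# Minimal Kaplan-Meier estimator; the (time, event) pairs are hypothetical.
# events: 1 = death observed, 0 = right-censored.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time."""
    data = sorted(zip(times, events))
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        i += sum(1 for tt, _ in data if tt == t)  # skip past ties
    return curve

print(kaplan_meier([12, 20, 20, 34, 40, 51], [1, 1, 0, 1, 0, 1]))
```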
Is a single bioelectrical impedance equation valid for children of wide ranges of age, pubertal status and nutritional status?
Bioelectrical impedance analysis (BIA) is widely used to predict body composition in paediatric research and clinical practice. Many equations have been published, but provide inconsistent predictions. To test whether a single equation for lean mass (LM) estimation from BIA is appropriate across wide ranges of age, pubertal status and nutritional status, by testing whether specific groups differ in the slope or intercept of the equation. In 547 healthy individuals aged 4-24 years (240 males), we collected data on body mass (BM) and height (HT), and on lean mass (LM) using the 4-component model. Impedance (Z) was measured using TANITA BC418MA instrumentation. LM was regressed on HT²/Z. Multiple regression analysis was conducted to investigate whether groups based on gender, age, pubertal status or nutritional status differed in the association of LM with HT²/Z. BM ranged from 5 to 128 kg. HT²/Z was a strong predictor of LM (r²=0.953, s.e.e.=2.9 kg). There was little evidence of a sex difference in this relationship; however, children aged 4-7 years and 16-19 years differed significantly from other age groups in regression slopes and intercepts. Similar variability was encountered for pubertal stage, but not for nutritional status.
No single BIA equation applies across the age range 4-24 years. At certain ages or pubertal stages, the slope and intercept of the equation relating LM to HT²/Z alter. Failure to address such age effects is likely to result in poor accuracy of BIA (errors of several kg) in longitudinal studies of change in body composition.
closed_qa
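The group comparison in the record above amounts to testing whether a dummy variable for age group shifts the intercept or the slope of LM = a + b·HT²/Z. The sketch below fits that interaction model by least squares on simulated data; the value ranges and coefficients are assumptions for illustration.

```python
import numpy as np

# Interaction model LM = a + b*(HT^2/Z) + c*group + d*group*(HT^2/Z).
# All data below are simulated; ranges and true coefficients are assumed.

rng = np.random.default_rng(0)
n = 200
ht2_z = rng.uniform(10, 80, n)   # plausible HT^2/Z range (assumed)
group = rng.integers(0, 2, n)    # 1 = a specific age group, say 4-7 years
lm = 1.5 + 0.65 * ht2_z - 0.1 * group * ht2_z + rng.normal(0, 2.9, n)

# Design matrix: intercept, HT^2/Z, group dummy, interaction term.
X = np.column_stack([np.ones(n), ht2_z, group, group * ht2_z])
coef, *_ = np.linalg.lstsq(X, lm, rcond=None)
print(dict(zip(["intercept", "slope", "group_shift", "slope_shift"],
               coef.round(3))))
```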
Current dysphonia trends in patients over the age of 65: is vocal atrophy becoming more prevalent?
Current trends in geriatric voice referrals were assessed, including the number of patients over the age of 65 years seen per year, the common diagnostic patterns and, specifically, the number of patients with vocal atrophy. Retrospective cohort study. A retrospective chart review of all patients seen at the Emory Voice Center for otolaryngologic complaints between 2004 and 2009 was performed. Of the 6,360 patients seen over the 6-year period, 21% were over the age of 65 years. Fifty-eight percent of patients over the age of 65 years had vocal complaints, the most common diagnoses being vocal atrophy (25%), neurologic vocal dysfunction (23%) and vocal fold immobility (19.2%). Of the patients diagnosed with vocal atrophy, the majority opted for voice therapy (57%), followed by reassurance (39%) and injection laryngoplasty (6%). There was a statistically significant improvement in mean voice-related quality of life (VRQOL) score from pretherapy to post-therapy.
As the number of people in the over 65-year-old age bracket increases, so do the number of geriatric referrals. Although diagnostic trends remain the same, vocal atrophy is becoming more prevalent, with a large number of patients seeking intervention. This will likely result in an increased need for health resources in the future.
closed_qa
Does joint position affect US findings in inflammatory arthritis?
Musculoskeletal US is increasingly used for the assessment of synovitis, although questions remain about its reliability. One potential factor affecting reliability is the lack of consensus on image acquisition methods, such as the use of different joint positions, which may have implications for the reproducibility of studies that use US as an outcome measure. The aim of this study was to determine whether a change in joint position might significantly alter the quantification of US-detected synovitis in patients with inflammatory arthritis (IA). IA patients with clinically swollen wrist, MCP and/or knee joints were recruited, and these joints were assessed quantitatively for the presence of synovitis when placed in different positions. Seventy-five patients with IA were assessed. The greatest grey scale (GS) and power Doppler (PD) scores for the MCP joints were found in the flat (0°) position (91 and 100% of cases, respectively) compared with other positions (P < 0.001). Similar results were found in the wrist joints. The greatest GS and PD scores for the knee joint were found in 30° flexion (100 and 95.6% of cases, respectively) compared with other positions (P < 0.001). The inter- and intra-reader reliability was good to excellent.
The position in which a joint is scanned for synovitis appears to significantly influence the US assessment of synovitis. Our study suggests that the standardized scanning of the hand joints in a flat position and the knees in a 30° position are associated with the highest GS and PD scores.
closed_qa
Does the spectrum of peri-intraventricular haemorrhages in preterm infants change over the years?
Analysis of the prevalence and degree of intraventricular haemorrhages in preterm infants treated at the Institute of Mother and Child in Warsaw between 2005 and 2009. The results were compared with a similar analysis conducted between 1998 and 2002, in an effort to answer the question of whether there are definite changes, or trends of change, in this pathology over time. The studied population comprised 350 infants born at 22-34 weeks of gestation, hospitalised between 2005 and 2009. These infants were compared with 354 infants treated between 1998 and 2002. Haemorrhage was diagnosed on the basis of repeated ultrasounds performed in a standard manner, through the fontanelle, according to classic rules. MR imaging was performed as required. The extent of haemorrhage was classified in stages 1 to 4. In deceased infants the stage of haemorrhage was verified on the basis of autopsy results. The investigations carried out between 2005 and 2009 showed haemorrhages in 174 infants (49.7%). Extensive stage 3 haemorrhages were diagnosed in 45 infants (12.9%) and stage 4 in 35 infants (10.0%). 40 infants (11.4%) died during hospitalisation. Autopsy was conducted in 26 of the deceased infants (65%); in 18 cases (69%) the diagnosis was confirmed, and in no case was the diagnosis regarding the extent of haemorrhage changed. Studies carried out in the period 1998-2002 revealed haemorrhages in 248 infants (70%), including stage 3 in 67 infants (19%) and stage 4 in 34 infants (10%); 93 infants died during hospitalisation. The prevalence of all types of peri-intraventricular haemorrhage (PVH/IVH) is thus currently significantly lower, but the prevalence of extensive grade 4 haemorrhages has not decreased. The number of deaths has fallen by half, although at present more infants with grade 4 haemorrhages survive. Comparison of the prevalence of PVH/IVH of all grades between the two cohorts of infants born up to 34 weeks of gestation in the periods 1998-2002 and 2005-2009 shows a statistically significant decrease, whereas comparison of the prevalence of extensive grade 4 IVH does not. The percentage of women with threatened pregnancy who received corticotherapy was 64.2%, which is still definitely too low.
1. The statistically significant decrease in the prevalence and severity of peri-intraventricular haemorrhage in the 2005-2009 analysis is a positive conclusion; a negative finding is that the incidence of grade 4 intraventricular haemorrhage does not show a falling trend. 2. The fall in the number of deaths among premature infants born in our department may be the result of significantly improved medical care between the compared periods. 3. In both cohorts, the still insufficient percentage of pregnant women receiving prenatal corticosteroids in high-risk pregnancies could be linked with unsatisfactory prophylactic perinatal care, and this could underlie the lack of improvement in the incidence of grade 4 intraventricular haemorrhage. 4. The existing database in Poland on the incidence of PVH/IVH in the risk group is insufficient for comparison with European Union countries' data in EuroNeoNet. The significance of this pathology at individual, social and economic levels creates a need for periodical analyses, at regional level, of its incidence, causes and effects.
closed_qa
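The cohort comparison in the record above (haemorrhage in 248/354 infants in 1998-2002 vs 174/350 in 2005-2009) can be checked with a standard two-proportion z-test, sketched below. The abstract does not state which test the authors used; a chi-square test would give an equivalent result.

```python
import math

# Two-proportion z-test on the pooled standard error, applied to the
# counts reported above. Test choice is ours, not stated by the authors.

def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                     # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))    # two-sided, normal approx.
    return z, p_value

z, p = two_proportion_z(248, 354, 174, 350)
print(f"z = {z:.2f}, p = {p:.2g}")  # z ≈ 5.5, p << 0.001: clearly significant
```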
Is a neutral head position safer than 45-degree neck rotation during ultrasound-guided internal jugular vein cannulation?
The optimal degree of neck rotation during internal jugular vein (IJV) cannulation remains undetermined because previous studies assessed positioning with sonography but without puncturing the vein. We assessed whether a neutral position (NP) of the head (0 degrees) during ultrasound-guided cannulation of the IJV was safer than rotating the neck 45 degrees (head turned). The effect of these 2 positions during ultrasound-guided cannulation on major complications was the primary outcome; overall complications, venous access time, and perception of difficulty during the procedure were also evaluated. A prospective, randomized, controlled, nonblinded study was conducted in a tertiary neurosurgical hospital. Patients undergoing major elective neurosurgical procedures requiring a central venous line were randomly allocated to 2 groups; ultrasound-guided cannulation of the IJV was then performed using an out-of-plane orientation. One thousand four hundred twenty-four patients were evaluated, but 92 were excluded; 670 were allocated to the head-turned group and 662 to the NP group. Cannulation was 100% successful. Demographic data were similar in the 2 groups except for IJV positions. There were only 10 major complications: 6 in the 0-degree NP group and 4 in the 45-degree head-turned group; the frequency of these complications did not differ between the 2 groups. The overall complication rate was 13%, and was higher in women, in patients with ASA physical status ≥II, and in patients with a smaller-diameter vein or a vein located deeper and lateral or in an anterolateral position. Increased venous access time was associated with an increased rate of overall complications. The perception of difficulty in performing the procedure with the head in the 2 positions did not differ between groups.
A head NP was as safe as a 45-degree neck rotation during ultrasound-guided IJV cannulation with regard to both major and minor complications, and venous access time was similar. Ultrasound guidance helps determine optimal head rotation for IJV cannulation.
closed_qa
Is there a role for free breathing non-contrast steady-state free precession renal MRA imaging for assessing live donors?
Accurate pre-operative evaluation of renal vascular anatomy is essential for successful renal harvest in live donor transplantation. Non-contrast renal MR angiographic (MRA) techniques are potentially well suited to the screening of donors; however, their restricted imaging field of view (FOV) has previously been an important limitation. We sought to assess whether the addition of a large-FOV balanced fast field echo (bFFE) steady-state free precession (SSFP) sequence to non-contrast SSFP MRA could overcome this problem; comparison with contrast-enhanced MRA (CE MRA) and findings at surgery was performed. 22 potential renal donors each underwent SSFP and CE MRA, and 11 of the 22 subsequently underwent donor nephrectomy. All images were diagnostic. SSFP MRA and CE MRA identified an equal number of arteries. Surgery confirmed two accessory renal arteries, both demonstrated by both imaging techniques. A third accessory vessel was identified with both techniques on a kidney contralateral to the donated organ. 6 out of 11 procured kidneys demonstrated early branch arteries at surgery, 5 out of 6 of which had been depicted on both SSFP and CE MRA. The median grading of image quality for main renal arteries was slightly better for CE MRA (p=0.048), but for accessory vessels it was better for SSFP MRA.
This pilot study indicates that by combining free-breathing SSFP MRA with large-FOV BFFE images, an accurate depiction of renal vascular anatomy can be obtained without the need for intravenous contrast administration, as judged against surgical findings and CE MRA.
closed_qa
Is fasting insulin concentration inversely associated with rate of weight gain?
To test whether a higher fasting insulin concentration is associated with a lower rate of weight gain over six to seven years. Two longitudinal epidemiologic cohorts including blacks and whites. The Coronary Artery Risk Development in Young Adults (CARDIA) Study examined subjects aged 18-30 y in 1985-86 and 1992-93 (n = 3636), and the Atherosclerosis Risk in Communities (ARIC) Study examined subjects aged 45-64 y in 1987-89 and 1993-95 (n = 11179). In each study, fasting insulin at baseline and weight change during follow-up were measured in participants without diabetes. In whites and black men in CARDIA, there was a positive age-adjusted association between baseline insulin and weight change, although weight change was not entirely monotonic across the insulin quartiles. In these race-gender groups, the linear regression coefficients indicated that each 50 pmol/L increment of baseline insulin was associated (P<0.05) with approximately 0.10 kg/y greater rate of weight gain (95% confidence intervals (CI) for this estimate in kg/y were 0.023-0.187 for white women, 0.015-0.150 for black men, and 0.011-0.158 for white men). The association was eliminated entirely with adjustment for baseline weight. In contrast, among whites and black women in ARIC, the association was negative, with the linear regression coefficients suggesting that each 50 pmol/L higher fasting insulin concentration was associated (P<0.05) with a 0.03-0.10 kg/y lower rate of weight gain (95% CI for this estimate in kg/y were -0.133 to -0.061 for black women, -0.106 to -0.054 for white women, and -0.055 to -0.009 for white men). This finding was generally strengthened by adjustment for baseline body mass index (BMI).
A higher fasting insulin concentration is associated modestly with a lower rate of weight gain in ARIC, but not in CARDIA.
closed_qa
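A note on interpreting the CARDIA coefficients above: the slope is expressed per 50 pmol/L of fasting insulin, so the predicted difference in weight-gain rate between two people scales linearly with their insulin difference. A minimal sketch, assuming only the abstract's 0.10 kg/y-per-50-pmol/L figure; the insulin values below are hypothetical:

```python
# Slope from the abstract (CARDIA, age-adjusted); insulin values are hypothetical.
SLOPE_PER_50_PMOL = 0.10  # kg/y greater weight gain per 50 pmol/L fasting insulin

def weight_gain_difference(insulin_a_pmol, insulin_b_pmol):
    """Predicted difference in weight-gain rate (kg/y) between two
    fasting insulin concentrations, under the linear CARDIA model."""
    return SLOPE_PER_50_PMOL * (insulin_b_pmol - insulin_a_pmol) / 50.0

# A 100 pmol/L higher fasting insulin predicts ~0.20 kg/y faster weight gain.
print(weight_gain_difference(50, 150))  # 0.2
```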
ApoE polymorphism and albuminuria in diabetes mellitus: a role for LDL in the development of nephropathy in NIDDM?
Chronic hyperglycaemia stands with diabetes duration as the main predictive factor for the development of nephropathy in insulin-dependent diabetes mellitus (IDDM). In contrast, nephropathy in non-insulin-dependent diabetes mellitus (NIDDM) presents with a different natural history and, as well as atherosclerosis, can precede diabetes diagnosis and even the onset of patent hyperglycaemia. The role of lipid abnormalities in this matter remains debated. We studied the prevalence of nephropathy (N+ = urinary albumin excretion rate (UAE)>20 mg/d) in 134 Caucasian NIDDM patients ranked according to apolipoprotein E (apoE) genotype (same distribution in 132 controls). Age, diabetes duration and sex ratio did not differ between N+ and N-. A patient with E2E4 (n = 1) was excluded from the analysis. The prevalence of nephropathy was significantly reduced in E2 allele carriers (36%, 8/22) vs 69% (77/111) in E2 non-carriers (P<0.01). Relative risk (RR) of E2 carriers developing nephropathy was 0.52 (95% CI = 0.35-0.80). Both groups were comparable in terms of age (55 +/- 11 vs 57 +/- 11 years), diabetes duration (15 +/- 9 vs 14 +/- 10 years) and prevalence of retinopathy (59 vs 48%). Similar results were observed when patients with diabetes duration longer than 8 years were studied (n = 94).
It has been largely established that the low-density lipoprotein (LDL)-cholesterol level in E2 allele carriers (whether diabetic or not) is lower than in E2 non-carriers. The 2-fold increase in nephropathy among E2 non-carriers with NIDDM argues for a role for LDL in the development of human nephropathy in NIDDM patients. This result is in agreement with previous data established both in vitro and in vivo in animal models. These findings support evidence for the pathogenic and morphologic similarities between kidney disease and atherosclerosis in NIDDM patients.
closed_qa
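The relative risk quoted in the study above follows directly from the two prevalences (8/22 = 36% in E2 carriers vs 77/111 = 69% in non-carriers). A minimal sketch of the point estimate with a simple Wald-type interval; note this textbook interval will not exactly reproduce the published 0.35-0.80 CI, which was presumably derived by another method:

```python
import math

# Counts from the abstract: nephropathy in E2 carriers vs non-carriers.
a, n1 = 8, 22     # E2 carriers with nephropathy / total carriers
b, n2 = 77, 111   # non-carriers with nephropathy / total non-carriers

rr = (a / n1) / (b / n2)                 # ~0.52, as published
se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # standard error of log(RR)
lo = math.exp(math.log(rr) - 1.96 * se)
hi = math.exp(math.log(rr) + 1.96 * se)
print(f"RR = {rr:.2f} (Wald 95% CI {lo:.2f}-{hi:.2f})")
```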
Unilateral cleft lip with or without cleft palate and handedness: is there an association?
The purpose of this study was to investigate the possibility of a relationship between the side of occurrence of unilateral clefting of the lip and/or palate and handedness, also taking into account the type of the initial cleft condition, a factor that has not been adequately assessed in previous studies. This was a retrospective study. Division of Orthodontics, The Hospital for Sick Children, Toronto, Canada, and Cleft Lip and Palate Program, Children's Hospital, Winnipeg, Canada. Subjects were 289 patients (176 males and 113 females) 9 years of age or older presenting with a history of unilateral clefts of the lip with or without the palate. Of these patients, 217 were recruited from the patient pool of the Orthodontic Clinic at the Hospital for Sick Children in Toronto. The remaining 72 were selected from the registry of the Cleft Lip and Palate Program of the Children's Hospital in Winnipeg. Any syndromic cases were excluded from the sample. Assessment of handedness was performed by asking the patients to fill out a multi-item questionnaire in which patients were asked to identify which hand they would use for different tasks. The side and type of the initial cleft condition were identified by reviewing each patient's hospital chart and by cross-referencing with clinical examination. Statistical evaluation of the results was performed by using the chi-square test. There was a significantly larger number of left-sided clefts (198) in the sample than right-sided clefts (91), (p<.001). The proportion of left-sided clefts among left-handers (84.6%) was higher than that among right-handers (66.8%). However, the relationship between side of cleft and handedness was not statistically significant (p = .185). Clefts of the primary palate only seemed to occur on the left side 3.5 times more often than on the right, whereas the corresponding ratio of left:right manifestation for clefts of the primary and secondary palate was 1.8:1. The difference was statistically significant (p<.05).
The findings of this study confirm the affinity of unilateral clefts for the left side but suggest that there are differences between clefts of the primary palate only and clefts of the primary and secondary palate. Also, non-right-handed patients show a greater predilection for having a cleft on the left side than do right-handed patients.
closed_qa
Is a standard regime for anticoagulation with heparin in unstable angina adequate?
A prospective series of 108 Emergency Department attendees with a clinical diagnosis of unstable angina, for whom anticoagulation with heparin was prescribed over a six-month period, was included in the study. The standard regime was a 5000 unit bolus followed by an intravenous infusion of 1000 units per hour (1200 units if the patient's weight was greater than 80 kg), with subsequent adjustments being made by reference to a nomogram. The activated partial thromboplastin time (APTT) was measured at six and 12 hours after treatment began. Two commonly used criteria for adequate heparinisation were compared: 1. APTT greater than 1.5 times control, and 2. APTT in the range of 60-85 seconds. There were valid data for 90 patients at six hours and 79 at 12 hours. Compared to the criterion for adequate anticoagulation of APTT greater than 1.5 times the control, 25% of patients were subtherapeutic at six hours and 12% at 12 hours. Compared to the criterion APTT greater than 60 seconds, 53% of patients were subtherapeutic at 6 hours and 47% at 12 hours. At 6 hours, 26% of patients were over-anticoagulated, defined as an APTT greater than 85 seconds. This had reduced to 13% by 12 hours.
In the context of recent research suggesting that an APTT of greater than 1.5 times the control is sufficient to reduce complications in unstable angina, our results demonstrate that a standard regime of heparinisation will achieve this goal in the majority of patients within 6 hours of starting heparin therapy. However, if an APTT of 60-85 seconds is the goal, this standard regime is inadequate.
closed_qa
Can positive affect items be used to assess depressive disorders in the Japanese population?
The purpose of the present study was to examine the measurement properties of positive affect items among the Japanese population. Responses to the Japanese version of the Center for Epidemiologic Studies Depression Scale and four additional negatively revised items of the original positive affect items were compared for 85 Japanese psychiatric out-patients with dysphoric-mood-related symptoms and 255 demographically matched controls. Responses to positive affect items were generally comparable between the two groups, whereas responses to negative symptom items were markedly different (P<0.002 for all comparisons). The group difference was most marked for symptom persistence. Responses to the four negatively revised items of positive affect revealed a similar picture to that of the negative symptom items. The internal consistency of the scale significantly improved when the original positive affect items were replaced by the negatively revised items (P<0.001 for both).
Positive affect items with positive wording cannot be used to assess depressive disorders in the Japanese population adequately, but this can be done with the corresponding negatively revised items.
closed_qa
Are rural general practitioner-obstetricians performing too many prenatal ultrasound examinations?
To determine the frequency of prenatal ultrasonography (PNU) in western Labrador in 1994, assess the appropriateness of the ultrasound examinations according to current guidelines and determine whether there was any relation between number of PNU examinations and patient management and obstetric outcomes. Review of all obstetric charts and PNU requisition forms for all deliveries in one hospital in 1994. Labrador City and Wabush, Newfoundland. During the study period, there were 103 singleton deliveries, and these mothers underwent a total of 225 PNU studies (mean 2.18 studies per delivery). More than half (53.3%) of the examinations were classified as inappropriate. There were no significant differences in the number of studies between low- and high-risk pregnancies or between uncomplicated deliveries and those in which induction or instrumental or operative delivery occurred, nor was there any relation between number of PNU examinations and maternal or neonatal outcome.
Compared with PNU use as recommended by the Canadian Task Force on the Periodic Health Examination, this type of examination was overused in Labrador City and Wabush, although the rate of use was comparable to that reported in other Canadian studies. This overuse was not associated with any identifiable effect on maternal or neonatal outcome or on the management of pregnancy and labour. More judicious use of PNU, in accordance with evidence-based guidelines, is recommended.
closed_qa
Can the urea breath test for H. pylori replace endoscopy for the assessment of dyspepsia in primary care?
The urea breath test may have value in the initial assessment of dyspepsia in primary care. This pilot study tracks patient and general practitioner behaviour which cannot be predicted with modelling studies. The urea breath test was made available over a period of 18 months. The test was requested when general practitioners would normally have used a trial of medication or referred for endoscopy. Patients with a positive urea breath test had early endoscopy before treatment. Patients with a negative urea breath test were treated according to symptom response. A follow-up questionnaire was given 6-24 months after the urea breath test. Urea breath tests were requested on 249 patients; clinical notes and follow-up interview data were available for 207 patients (83%). The urea breath test was positive for 89 patients (43%); 70 were referred for endoscopy and peptic ulcer disease was found in 33 (47%). The urea breath test was negative for 118 patients; 14 were follow-up tests after previous H. pylori treatment. For the 104 patients with dyspepsia, a negative test and no previous treatment, 42% had 1 or more previous investigations for dyspepsia and 66% had dyspepsia symptoms for more than one year. During follow-up, 21 patients had endoscopy. Dyspepsia symptom scores were significantly lower at follow-up (p<0.01). Using a global assessment, 66% had fewer symptoms, 22% same and 12% had more symptoms. The symptom improvement was greater if the duration of symptoms was less than one year (p<0.05). Medication use did not change significantly. Twelve patients were dissatisfied with management; most of these would have preferred endoscopy.
A negative urea breath test appears to have some reassurance value. Using the urea breath test as the initial assessment for dyspepsia may remove the need for some endoscopies. Further controlled studies of breath testing compared with early endoscopy are required.
closed_qa
Diabetes mellitus after renal transplantation: as deleterious as non-transplant-associated diabetes?
Despite use of lower doses of corticosteroid hormones after renal allotransplantation in the era of cyclosporine and tacrolimus, posttransplant diabetes mellitus remains a common clinical problem. We prospectively investigated the effect of posttransplant diabetes on long-term (mean follow-up, 9.3+/-1.5 years) graft and patient survival in the 11.8% of our renal transplant population (n = 40) who developed diabetes after kidney transplantation, and we compared outcome in 38 randomly chosen nondiabetic control patients who had received transplants concurrently. Twelve-year graft survival in diabetic patients was 48%, compared with 70% in control patients (P = 0.04), and Cox's regression analysis revealed diabetes to be a significant predictor of graft loss (P = 0.04, relative risk = 3.72) independent of age, sex, and race. Renal function at 5 years as assessed by serum creatinine level was inferior in diabetic patients compared to control patients (2.9+/-2.6 vs. 2.0+/-0.7 mg/dl, P = 0.05). Two diabetic patients who experienced graft loss had a clinical course and histological features consistent with diabetic nephropathy; other diabetes-related morbidity in patients with posttransplant diabetes included ketoacidosis, hyperosmolar coma or precoma, and sensorimotor peripheral neuropathy. Patient survival at 12 years was similar in diabetic and control patients (71% vs. 74%).
Posttransplant diabetes mellitus is associated with impaired long-term renal allograft survival and function, complications similar to those in non-transplant-associated diabetes may occur in posttransplant diabetes, and, hence, as in non-transplant-associated diabetes, tight glycemic control may also be warranted in patients with posttransplant diabetes.
closed_qa
Can we detect or predict the presence of occult nodal metastases in patients with squamous carcinoma of the oral tongue?
When to do a neck dissection as part of the surgical treatment for a patient with squamous carcinoma of the oral tongue is controversial, particularly when the primary can be resected without entering the neck. If the patient who is at high risk for having occult nodal disease in the neck can be identified, node dissection with the glossectomy could be justified. To better identify patients for this procedure, we correlated various tumor and patient factors along with preoperative diagnostic studies with the presence or absence of pathologically positive nodes in a group of patients who underwent node dissection. Ninety-one previously untreated patients with biopsy-proved squamous carcinoma of the oral tongue were prospectively studied. All patients had a glossectomy and neck dissection as their initial treatment. The pathology findings (ie, lymph nodes with squamous cancer) were correlated with many preoperative and intraoperative factors, and a statistical analysis was made. The use of computed tomography and ultrasound was not better than the clinical examination in determining the presence or absence of nodal metastases. The best predictors were depth of muscle invasion, double DNA aneuploidy, and histologic differentiation of the tumor.
All patients with stage T2-T4 squamous cancers of the oral tongue should have an elective dissection of the neck. Patients with T1N0 cancer who have a double DNA-aneuploid tumor, depth of muscle invasion>4 mm, or have a poorly differentiated cancer should definitely undergo elective neck dissection. Ultrasound and computed tomography are of little value in predicting which patients have positive nodes.
closed_qa
Is hormone replacement therapy protective for hand and knee osteoarthritis in women?
To explore whether hormone replacement therapy (HRT) has a protective role for osteoarthritis (OA) of the hand and knee in a cross sectional study of women in the general population. 1003 women aged 45-64 (mean age 54.2) from the Chingford Study were asked details of HRT use. Standard anteroposterior radiographs of the hands and knees were taken and scored according to the methods of Kellgren and Lawrence (grade 2+ positive for OA), and using individual features of osteophytes and joint space narrowing. Analysis compared ever use (>12 months) versus never use, and current use (>12 months) versus never use. Only 606 definitely postmenopausal women were included in the analysis. Odds ratios and 95% confidence intervals were calculated using logistic regression for risk of user versus non-user at each site, adjusted for age, height and weight, menopausal age and for bone mineral density of the femoral neck. For current users (n = 72) there was a significant protective effect of HRT for knee OA (defined by Kellgren and Lawrence grade or osteophytes), 0.31 (95% CI 0.11, 0.93), and a similar but not significant effect for moderate joint space narrowing of the knee, 0.41 (95% CI 0.05, 3.15), and for distal interphalangeal OA, 0.48 (95% CI 0.17, 1.42). No clear effect was seen for the carpometacarpal joint, CMC OA 0.94 (95% CI 0.44, 2.03). When analysing ever users (n = 129) the protective effect was reduced. For ex-users of >12 months (mean duration 40.7 months), there was no overall protective effect of HRT for OA. Additional adjustment for hysterectomy, physical activity, social class, and smoking made little difference to the results.
These data show an inverse association between current HRT use and radiological OA of the knee, suggestive of a protective effect. The effect was weaker in the hand joints. The mechanism of the protection is unclear but has important implications for aetiopathogenesis.
closed_qa
Great grand multiparity: is it a risk?
To compare the incidence of antenatal and intrapartum complications among women delivering for the 10th time or more with that among women of low parity (para 2-5). The records of 154 great grand multiparous women (para ≥10) were reviewed and compared with those of 308 women of low parity (para 2-5) delivered during the same period. Antepartum as well as intrapartum complications were compared for all deliveries occurring between 16 April 1994 and 15 January 1995. Great grand multiparous women were older. The incidences of diabetes mellitus, chronic hypertension and preterm labor were similar to those in women of low parity, while the great grand multiparas had higher incidences of pre-eclampsia (7.1% vs. 2.69%) and intrauterine fetal death (5.2% vs. 1.3%) (P<0.025). There were no differences in the incidences of placental abruption, placenta previa, malpresentation, postpartum hemorrhage and operative delivery between the two groups. Great grand multiparas also had a significantly higher incidence of macrosomia (12%; P<0.0001).
Great grand multiparous women are at increased risk of pre-eclampsia, intrauterine fetal death and macrosomia.
closed_qa
Is the efficacy of topical corticosteroid therapy for psoriasis vulgaris enhanced by concurrent moclobemide therapy?
Psychosocial factors have been implicated in the onset and exacerbation of psoriasis. We conducted a randomized, placebo-controlled, double-blind study to investigate the effect of an antidepressant agent, moclobemide, on the course of psoriasis vulgaris. Sixty subjects were enrolled in the study. Patients were randomly assigned to treatment groups. Patients received moclobemide 450 mg/day or placebo and a topical corticosteroid ointment (diflucortolone valerate) for 6 weeks. Patients were examined at the beginning of the study and at 2-week intervals. At each visit, the severity of psoriasis and psychologic status were evaluated with the Psoriasis Area Severity Index (PASI), Beck Depression Inventory (BDI), Hamilton Rating Scale for Anxiety (HAM-A), Hamilton Rating Scale for Depression (HRS-D-17) and State-Trait Anxiety Inventory including state (STAI-1) and trait anxiety (STAI-2). Treatment efficacy could be evaluated in 22 patients in the moclobemide-treated group and in 20 in the placebo-treated group. The improvement rates in PASI, BDI, STAI-1, and HAM-A scores were significantly higher in the moclobemide treatment group. The level of state anxiety was diminished in the moclobemide group. A positive correlation was found between improvement rates of the psoriatic lesions and state anxiety in all patients.
Our results suggest that adjunctive antidepressant therapy with moclobemide enhances the response to topical corticosteroid treatment in psoriasis vulgaris.
closed_qa
Treating hyperlipidemia for the primary prevention of coronary disease. Are higher dosages of lovastatin cost-effective?
To compare the average and marginal lifetime cost-effectiveness of increasing dosages of 3-hydroxy-3-methylglutaryl coenzyme A (HMG-CoA) reductase inhibitors, such as lovastatin, for the primary prevention of coronary heart disease (CHD). We estimated the lifelong costs and benefits of the modification of lipid levels achieved with lovastatin based on published studies and a validated CHD prevention computer model. Patients were middle-aged men and women without CHD, with mean total serum cholesterol levels of 6.67, 7.84, and 9.90 mmol/L (258, 303, and 383 mg/dL), and high-density lipoprotein cholesterol levels of 1.19 mmol/L (46 mg/dL), as described in clinical trials. We estimated the cost per year of life saved for dosages of lovastatin ranging from 20 to 80 mg/d that reduced the total cholesterol level between 17% and 34%, and increased high-density lipoprotein cholesterol level between 4% and 13%. After discounting benefits and costs by 5% annually, the average cost-effectiveness of lovastatin, 20 mg/d, ranged from $11,040 to $52,463 per year of life saved for men and women. The marginal cost-effectiveness of 40 mg/d vs 20 mg/d remained in this range ($25,711 to $60,778) only for persons with baseline total cholesterol levels of 7.84 mmol/L (303 mg/dL) or higher. However, the marginal cost-effectiveness of lovastatin, 80 mg/d vs 40 mg/d, was prohibitively expensive ($99,233 to $716,433 per year of life saved) for men and women, irrespective of the baseline total cholesterol level.
Assuming that $50,000 per year of life saved is an acceptable cost-effectiveness ratio, treatment with lovastatin at a dosage of 20 mg/d is cost-effective for middle-aged men and women with baseline total cholesterol levels of 6.67 mmol/L (258 mg/dL) or higher. At current drug prices, treatment with 40 mg/d is also cost-effective for total cholesterol levels of 7.84 mmol/L (303 mg/dL) or higher. However, treatment with 80 mg/d is not cost-effective for primary prevention of CHD.
closed_qa
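The average-versus-marginal distinction that drives the lovastatin conclusion is plain arithmetic: the average ratio divides total cost by total life-years gained relative to no treatment, while the marginal (incremental) ratio compares each dosage step only with the next-lower one. A minimal sketch with hypothetical cost and life-year inputs (the abstract reports only the resulting ratios, not these inputs):

```python
def icer(cost_hi, ly_hi, cost_lo, ly_lo):
    """Incremental (marginal) cost-effectiveness ratio, $ per life-year saved."""
    return (cost_hi - cost_lo) / (ly_hi - ly_lo)

# Hypothetical: doubling the dose doubles lifetime drug cost but adds
# little extra survival, so the marginal ratio explodes even though the
# average ratio (vs no treatment) still looks acceptable.
cost_40, ly_40 = 12_000, 0.50   # 40 mg/d vs no treatment (hypothetical)
cost_80, ly_80 = 24_000, 0.62   # 80 mg/d vs no treatment (hypothetical)

print(cost_80 / ly_80)                       # average: ~38,700 $/LY
print(icer(cost_80, ly_80, cost_40, ly_40))  # marginal: 100,000 $/LY
```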
Do gastrointestinal symptoms accompanying sore throat predict streptococcal pharyngitis?
The purpose of this study was to determine whether gastrointestinal (GI) symptoms are more common in streptococcal than nonstreptococcal pharyngitis, and, if so, whether these symptoms are useful diagnostic predictors. Patients aged 4 and older presenting consecutively to one of three family practice clinics and one emergency department with the chief complaint of sore throat were invited to participate in the study. A nurse administered a brief symptom checklist; after documenting clinical signs, the clinician assessed and treated the patient. All patients were screened for group A streptococcus using the Abbott Test Pack Plus. Patients were enrolled from January 1996 through March 1996. Significant associations of signs and symptoms with streptococcal pharyngitis were determined by chi square, likelihood ratios were calculated, and logistic regression was used to compare diagnostic prediction models with and without GI symptoms. Six hundred fifty-seven consecutive patients with the presenting complaint of sore throat were enrolled in the study. The mean age of the patients enrolled was 19 years; the median age was 14. Thirty-two percent of the children (ages 4 to 18), 23% of the adults (ages 19 to 74), and 29% of all patients had streptococcal pharyngitis. Symptom frequencies for streptococcal and nonstreptococcal pharyngitis, respectively, were: nausea (39% vs 31%, P = .14); vomiting (14% vs 7%, P = .004); abdominal pain (27% vs 26%, P = .621); and any GI symptom (47% vs 41%, P = .45). When included in a predictive model with other significant predictors of streptococcal pharyngitis including age, palatal petechiae, absence of cough, and anterior cervical adenopathy, the addition of nausea or vomiting added slight predictive power to the models, but abdominal pain and "any GI symptom" did not.
Nausea and vomiting are somewhat more common in streptococcal than in nonstreptococcal pharyngitis, but appear to have limited usefulness as clinical predictors of streptococcal pharyngitis.
closed_qa
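The symptom frequencies in the study above translate directly into the likelihood ratios the authors calculated: the probability of a symptom given streptococcal pharyngitis divided by its probability given nonstreptococcal pharyngitis. A minimal sketch using the abstract's own percentages:

```python
def positive_lr(p_given_strep, p_given_nonstrep):
    """Positive likelihood ratio of a symptom for streptococcal pharyngitis."""
    return p_given_strep / p_given_nonstrep

print(positive_lr(0.14, 0.07))  # vomiting: LR+ = 2.0, mildly informative
print(positive_lr(0.27, 0.26))  # abdominal pain: LR+ ~ 1.04, essentially useless
```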
Is there a clinically significant change in pressure-flow study values after urethral instrumentation in patients with lower urinary tract symptoms?
To determine the effect of urethral instrumentation on pressure-flow study values and subsequent grading of bladder outflow obstruction (BOO) in men with lower urinary tract symptoms (LUTS) using suprapubic intravesical pressure monitoring. Seventy-two men with LUTS underwent pressure-flow study using suprapubic intravesical pressure monitoring. The urethra was then instrumented successively with a 12 F catheter and a 17 F cystoscope, and a further pressure-flow study recorded after each procedure. Standard pressure-flow variables were measured for the three recordings. The presence and degree of obstruction were determined using commonly recognized grading systems, i.e. the Abrams-Griffiths nomogram, the linear passive urethral resistance ratio (LPURR) and the urethral resistance algorithm (URA). There were statistically significant differences in the detrusor pressure at maximum flow and detrusor opening pressure between the uninstrumented and instrumented studies (12 F and 17 F) but no difference in peak flow rates between the groups or in the Abrams-Griffiths number or URA value between studies. Using the LPURR, there was a tendency to a lower obstruction class after urethral instrumentation, ranking as 17 F>12 F>no instrumentation.
The changes seen after urethral instrumentation represent no more than the biological intra-individual variation normally seen in pressure-flow studies and do not lead to a clinically significant change in obstruction class.
closed_qa
Retroperitoneal lymphadenectomy for post-chemotherapy residual masses: is a modified dissection and resection of residual masses sufficient?
To determine if post-chemotherapy retroperitoneal lymphadenectomy for residual masses can be limited to resection of the residual masses and a modified template dissection, without loss of therapeutic efficacy. Between 1985 and 1995, 50 patients underwent one of three types of retroperitoneal lymphadenectomy for a residual mass after cisplatin-based chemotherapy for stages II and III testicular non-seminomatous germ cell tumour. The pre-operative imaging, operative record and pathology reports were reviewed to determine the location of the residual masses and whether tumour, defined as teratoma or viable carcinoma, was within the boundaries of the modified template and/or residual masses. The median (range) follow-up was 56 (1-140) months. Of 39 patients undergoing a bilateral dissection, one (2.6%) with a left testicular cancer had teratoma identified outside the boundaries of the modified template and the residual masses. The nine patients who underwent resection of residual masses and a modified-template dissection were relapse-free at a median follow-up of 55 months. One of two patients undergoing resection of residual mass alone had two recurrences arising from incomplete resection. Four of eight patients undergoing a modified dissection retained ejaculation, compared with seven of 25 (28%) undergoing a non-nerve sparing bilateral dissection.
This retrospective study suggests that in patients whose tumour markers become normal and have a residual mass after chemotherapy, residual masses can be resected with a modified-template dissection with no significant risk of leaving tumour in the retroperitoneum.
closed_qa
Do inflammatory processes contribute to radiation induced erythema observed in the skin of humans?
Two prospective trials were designed to determine whether there may be a role for inflammatory mediators in human skin erythema at both high and low doses per fraction and for 'out of field' effects. Trial 1. Effects of topical indomethacin (1%) and hydrocortisone (1%) applied before and during radiotherapy were compared for erythema induced by 20 Gy in four fractions (n = 26, 6 MV). Trial 2. Effects of topical hydrocortisone (1%) applied before and during radiotherapy and no medication were compared for erythema induced by 1, 3, 5 and 7 Gy in five fractions (n = 21, 120 kV). Erythema was measured using reflectance spectrophotometry (RFS) and laser Doppler (LD) on a weekly basis. Trial 1. A bi-phasic reaction time course was suggested in two-thirds of the cases. The first phase did not appear to be influenced by hydrocortisone cream but the second was significantly attenuated. Indomethacin had no effect on either reaction phase. Erythema measured several centimetres outside of the field was reduced by hydrocortisone but not by indomethacin. Trial 2. Trial 2 confirmed the presence of measurable erythema, invisible to the eye, that coincided in its time course with the first phase of erythema noted in trial 1. This reaction was more intense than predicted by the linear-quadratic (LQ) formula and was non-significantly attenuated by topical hydrocortisone. RFS readings proved to be less subject to inter- and intra-patient variations than the LD unit used.
Inflammatory responses may play a role in the mediation of the erythematous response to radiation in human skin. Further studies are warranted.
closed_qa
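The 'LQ formula' invoked above is the standard linear-quadratic model of radiation response; stating it clarifies what 'more intense than predicted' means. This is the textbook form, not an equation given in the abstract:

```latex
% Linear-quadratic model: effect E of n fractions of dose d (total dose D = nd)
E = n\,(\alpha d + \beta d^{2}) = \alpha D \left(1 + \frac{d}{\alpha/\beta}\right)
```

Under this model, erythema from small doses per fraction (1-7 Gy in trial 2) should be disproportionately weak; a reaction stronger than this prediction is consistent with a contribution, such as inflammation, not captured by pure cell-kill kinetics.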
Diverticulectomy, myotomy, and fundoplication through laparoscopy: a new option to treat epiphrenic esophageal diverticula?
To describe the technique and the results of laparoscopic diverticulectomy combined with esophageal myotomy and an antireflux wrap for epiphrenic diverticula of the esophagus. The epiphrenic diverticulum of the esophagus is a rare condition, probably caused by a longstanding impairment of esophageal motor activity. Although there is almost universal agreement to operate only on symptomatic patients, the optimal treatment is controversial. The best-accepted guideline is to treat the underlying motor disorder. This is generally done through a left thoracotomic approach that allows diverticulectomy, esophageal myotomy, and partial fundoplication. From January 1994 through February 1996, 4 patients underwent laparoscopic transhiatal diverticulectomy, esophageal myotomy, and partial fundoplication at our institution. A thorough preoperative study was done with barium swallow, esophagoscopy, and manometry in all patients; 24-hour pH monitoring was done in one case. No postoperative complications were observed. Short- and medium-term results are satisfactory.
No theoretical objection should be made to this approach, because the principles of treating the diverticular pouch and the underlying motor disorder, and of preventing reflux, are respected. Longer follow-up and a wider series are mandatory to substantiate these initially favorable results.
closed_qa
Is age relevant to functional outcome after restorative proctocolectomy for ulcerative colitis?
Restorative proctocolectomy for mucosal ulcerative colitis is well established. However, the effect of age on physiologic sphincter parameters is poorly understood. Our objective was to determine whether age at the time of restorative proctocolectomy correlates with physiologic changes. In the approximately 20 years during which restorative proctocolectomy has been performed for ulcerative colitis, the indications have changed. Initially, the procedure was recommended only in patients under approximately 50 years. However, the procedure has been considered in older patients because of the increasing age of our population, the increasing frequency of recognition of patients during the "second peak" of mucosal ulcerative colitis, and the decreasing morbidity rates, due to the learning curve and to newer techniques, such as double-stapling. Few authors have presented data analyzing the effects of this operation in older patients. One hundred twenty-two patients who had undergone a two-stage restorative proctocolectomy for mucosal ulcerative colitis were divided into three groups according to age: group I (>60 years), 11 men, 6 women; group II (40-60 years), 29 men, 18 women; and group III (<40 years), 29 men, 29 women. The patients were prospectively evaluated using anal manometry and subjective functional results. Comparisons were made before surgery, after colectomy and before closure of ileostomy, and at 1 or more years after surgery. There were no significant differences among the groups relative to manometric results, frequency of bowel movements, incontinence scores, or overall patient satisfaction. The postoperative mean and maximum resting pressures were significantly reduced (p<0.001), and conversely the sensory threshold (p<0.005) and capacity (p<0.001) were increased in all groups up to 1 year after surgery. There were no statistically significant changes in the squeeze pressure or length of the high-pressure zone in any group at any point in time. After surgery, the mean and maximum resting pressures had returned to 80% of their original values.
Although anorectal function is transiently somewhat impaired after restorative proctocolectomy, the impairment is not an age-related phenomenon.
closed_qa
Symptomatic cholelithiasis: a different disease in men?
To determine the importance of gender in the clinical presentation and subsequent clinical outcome (risk of conversion from laparoscopic to open technique and risk of postoperative mortality) for patients undergoing cholecystectomy. Age and clinical presentation have consistently been found to be important predictors of cholecystectomy outcomes; male gender has been cited in disparate studies as possibly having prognostic significance. A statewide cholecystectomy registry (30,145 cases between 1989 and 1993) was analyzed. Hierarchical log-linear modeling was used to identify associations between characteristics of clinical presentation. Multivariate logistic regression analysis was used to determine predictors of conversion and mortality. Male gender was associated with twice the expected incidence of acute cholecystitis and pancreatitis in the elderly (≥65 years). Males had a significantly increased risk for conversion to open technique, but this decreased during the time frame of the study. Mortality was twice as high among males (confidence interval, 1.4-2.9; p = 0.0001).
Males presenting for cholecystectomy are more likely to have severe disease. Independent of clinical presentation, they face increased risks of conversion to open technique and of postoperative mortality.
closed_qa
Are cognitive changes the first symptoms of Huntington's disease?
Huntington's disease is a neurodegenerative disorder due to an excessive number of CAG repeats in the IT15 gene on chromosome 4. The first symptoms are typically choreic movements or psychiatric disorders, whereas global cognitive decline generally becomes obvious later. This study was aimed at detecting early subtle cognitive deficits in asymptomatic gene carriers. As part of the testing procedure for predictive diagnosis of Huntington's disease, 91 asymptomatic at risk candidates had a neuropsychological examination, evaluating global efficiency, attention, memory (Wechsler memory scale and California verbal learning test), and executive functions. The groups of carriers (n=42) and non-carriers (n=49) differed only on a few memory variables. When we considered the group of gene carriers as a whole, significant correlations emerged between the number of CAG repeats and (a) performance on several tests of executive functions, and (b) performance on the hard pairs associates of the Wechsler memory scale. Further analysis of performance on this memory subtest led to the division of the group of carriers into two subgroups, without any overlap. The performance of subjects without cognitive deficits (n=32) was similar to that of non-carriers on all tests. The subjects with cognitive deficits (n=10) differed from both carriers without cognitive deficits and non-carriers over a wide array of variables measuring executive functions and memory. Moreover, qualitative aspects of the performance of carriers with cognitive deficits in the California verbal learning test closely resembled those of patients diagnosed as having Huntington's disease.
This suggests that these subjects already have Huntington's disease, despite a total lack of motor and psychiatric signs. An ongoing follow up study is testing the prediction that they will develop the full range of symptoms of the disease earlier than carriers without cognitive deficits.
closed_qa
Does the gubernaculum migrate during inguinoscrotal testicular descent in the rat?
The role of the gubernaculum in descent of the testis is controversial. The mechanism of testicular descent has been studied in the rat, because inguino-scrotal descent occurs postnatally in this species. Several authors have claimed that the cremasteric sac forms by eversion of the gubernacular cone, whereby regression of the extra-abdominal part of the gubernaculum creates a space into which the gubernacular cone everts to form the processus vaginalis within the scrotum. This postulated lack of any gubernacular migration phase contrasts with the situation in the human, where gubernacular migration appears to be an integral component of testicular descent. This study was designed to determine in the rat whether there is any gubernacular migration toward the scrotum during testicular descent, or whether eversion of the cremasteric sac alone could account for the extension of this sac into the bottom of the scrotum. Oblique sagittal sections of the inguino-perineal region were taken from rats at 21 days of gestation and at 1, 3, 4, 6, 8, and 10 days postnatally. Histological sections were examined and the following measurements were obtained: gubernacular cone height, gubernaculum-scrotum distance, processus vaginalis length, and pubic symphysis-anus distance. The gubernaculum was not in close proximity to the developing scrotum at any age. After 21 days of gestation, there was little evidence of a substantial gubernacular bulb distal to the processus vaginalis. At all ages the gubernacular cone height was significantly less than the distance from the gubernaculum to the scrotum.
These results show that the gubernaculum does not develop in close proximity to the developing scrotum. Even if complete eversion of the gubernaculum takes place, the gubernaculum would still fail to reach the bottom of the scrotum. It is proposed that gubernacular eversion is more apparent than real and that some degree of gubernacular migration is needed for complete extension of the cremasteric sac to the bottom of the scrotum.
closed_qa
Critical illness polyneuropathy: a new iatrogenically induced syndrome after cardiac surgery?
Critical illness polyneuropathy (CIP) is a newly described severe complication after open heart surgery leading to tetraplegia for weeks to months. The purpose of the study was to gather further information on critical illness polyneuropathy developing in patients after cardiac surgery and to evaluate the hypothetical risk factors possibly related to the onset of this neurological disorder. From July 1994 to October 1995, 7 out of 1511 patients undergoing open heart surgery developed critical illness polyneuropathy, which was diagnosed on the basis of electromyographic and nerve conduction features. The only common clinical finding was an intensive care unit (ICU) stay beyond seven days; therefore, a similar group of 37 patients staying longer than seven days in the intensive care unit during the same period was evaluated and retrospectively compared with the 7 patients developing critical illness polyneuropathy. Univariate analysis of several traits was performed to evaluate possible risk factors. Four out of 7 patients in the CIP group died, all due to multiple organ failure, in contrast to 3/37 patients in the control group, again due to multiple organ failure. Patients developing CIP stayed significantly longer in the ICU (62+/-3 versus 14+/-8 days, P<0.01) and had a significantly longer time on ventilator support (50+/-28 versus 7+/-13 days, P<0.01). The incidence of sepsis was significantly higher in the CIP group than in the control group (85.7 versus 10.8%, P<0.01). Compared with the control group, the proportions of patients receiving corticosteroids (100 versus 10.8%, P<0.01) and increased dosages of epinephrine and norepinephrine (85.7 versus 35.1%, P<0.05) were higher in the CIP group. Furthermore, the proportion of patients requiring chronic venovenous hemodiafiltration was significantly elevated in the CIP group (85.7 versus 5.4%, P<0.01).
CIP, despite its benign nature owing to spontaneous remission in patients who survive, is a disturbing complication following cardiac surgery that is associated with high mortality, a prolonged ICU stay, and an extended time on ventilator support. Interventions such as chronic hemodiafiltration, the application of corticosteroids and the administration of high doses of catecholamines are more frequent in patients with CIP. Whether this indicates a causal relationship remains to be elucidated.
closed_qa
Misrepresentation of publications by applicants for radiology fellowships: is it a problem?
We performed this study to determine whether applicants to the body and breast/body imaging fellowship programs at our institution misrepresented their publications in their applications or curricula vitae, as has been reported recently regarding applicants for gastroenterology fellowships. We also wanted to alert program directors to this issue. For each applicant in 1992-1995, every article cited on an application form or curriculum vitae as published or in press was cross-referenced with computer databases or the actual journals. Of 201 applicants, 87 (43%) listed at least one article citation (total citations, 261; mean number of citations, 3.0; maximum number of citations, 20). Of 261 citations, 39 (15%) could not be verified. Seven articles (listed by six applicants) did not appear in print 16-30 months after being listed as in press; six citations (by six applicants) put the applicant's name higher on the authorship list than was actually true; two articles (by two applicants) were not in the location cited or elsewhere; and 24 articles (by 14 applicants) were listed as appearing in journals that could not be found. The first three categories were judged as misrepresentations of publications; the fourth category was judged indeterminate for misrepresentation.
A minimum of 16% (14/87) of applicants to the body and breast/body imaging fellowship programs at our institution who cited publications, or 7% of all 201 applicants in the time studied, appear to have misrepresented their publication record. Program directors should be aware of the possible means for prevention of this problem.
closed_qa
The decline in Rh hemolytic disease: should Rh prophylaxis get all the credit?
This study sought to quantify the magnitude of Rh disease reduction occurring secondary to Rh prophylaxis and other determinants. Outcomes considered included maternal Rh sensitization, neonatal Rh disease, and perinatal deaths from Rh disease. Analysis was based on Poisson regression modeling of ecological data from Manitoba, Canada, and conditional probability modeling. The ecological analysis showed that changes in birth order and Rh prophylaxis resulted in 24% (95% confidence interval [CI] = 1%, 42%) and 69% (95% CI = 61%, 76%) decreases, respectively, in Rh sensitizations (D and non-D) in Manitoba between 1963 and 1988. Rh prophylaxis and nonprogram factors were responsible for 83% (95% CI = 44%, 95%) and 78% (95% CI = 42%, 91%), respectively, of the reduction in perinatal deaths from Rh disease. Similar results were obtained with conditional probability modeling, which also provided estimates for the effects of changes in abortion rates and racial composition.
In addition to Rh prophylaxis, changes in other determinants were responsible for an important fraction of the decline in Rh disease. These results provide a historical perspective on the conquest of Rh disease and also have important implications for public health policy, particularly in developing countries.
closed_qa
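The ecological analysis described above regresses event counts on era-level covariates; the standard construction is a Poisson regression with a log link and the number of births as an exposure offset. A minimal sketch with statsmodels, using hypothetical year-level data (the Manitoba dataset itself is not reproduced in the abstract):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical year-level data: sensitization counts, births (exposure),
# mean birth order, and an indicator for Rh prophylaxis availability.
counts      = np.array([120, 115, 95, 60, 35, 20])
births      = np.array([18000, 17000, 16000, 15000, 14000, 13000], dtype=float)
birth_order = np.array([3.1, 2.9, 2.7, 2.5, 2.3, 2.2])
prophylaxis = np.array([0, 0, 1, 1, 1, 1])

X = sm.add_constant(np.column_stack([birth_order, prophylaxis]))
fit = sm.GLM(counts, X, family=sm.families.Poisson(),
             offset=np.log(births)).fit()
print(fit.params)  # log-rate effects of birth order and prophylaxis
```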
Epidemic obesity in the United States: are fast foods and television viewing contributing?
This study examined the association between TV viewing, fast food eating, and body mass index. Associations between hours of TV viewing, frequency of eating at fast food restaurants, body mass index, and behaviors were assessed cross-sectionally and longitudinally over 1 year in 1059 men and women. Fast food meals and TV viewing hours were positively associated with energy intake and body mass index in women but not in men. TV viewing predicted weight gain in high-income women.
Secular increases in fast food availability and access to televised entertainment may contribute to increasing obesity rates in the United States.
closed_qa
Thrombolytic therapy guided by a decision analysis model: are there potential benefits for patient management?
Although thrombolytic therapy improves the outcome of myocardial infarction, it is associated with increased risks of stroke and bleeding; these risks may outweigh the benefits of therapy. The risks and benefits of thrombolysis, for any individual clinical situation, can be explicitly estimated by means of decision analysis. The aim of this study was to compare the actual use of thrombolytic agents for suspected acute myocardial infarction (AMI) with the management preferred by a decision analysis model. Admission data prospectively obtained in 262 consecutive patients admitted to a rural community hospital's coronary care unit with suspected AMI, as well as clinical decisions and outcomes, were reviewed and analyzed. Seventeen deaths from AMI and no major strokes were observed, compared with 18.30 deaths and 0.85 major strokes predicted by a decision analysis model. Forty-seven of 84 patients with confirmed AMI and 3 of 178 without AMI were given a thrombolytic agent, compared with the 65 patients with AMI and 7 without AMI who would have received thrombolysis under decision analysis-guided therapy. Decision analysis-guided therapy could have saved 3.7 additional lives and gained 29.6 life years, but produced 0.4 extra strokes. Changing the quality adjustment for stroke or heart failure would not have altered the treatment preferred by decision analysis in any of the 262 cases studied. Some patients were predicted to benefit considerably from thrombolysis with little extra risk of stroke, and vice versa: all cases must, therefore, be assessed individually.
A decision analysis model can guide thrombolytic therapy by promptly defining its risks and benefits.
closed_qa
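A decision analysis model of the kind described above recommends thrombolysis whenever its expected utility, over death, stroke, and survival without either event, exceeds that of withholding it for the individual patient's risk profile. A minimal expected-value sketch; all probabilities and utilities below are hypothetical, since the published model's parameters are not given in the abstract:

```python
# Hypothetical decision-analysis sketch for one patient.
def expected_utility(p_death, p_stroke, u_death=0.0, u_stroke=0.3, u_well=1.0):
    p_well = 1.0 - p_death - p_stroke
    return p_death * u_death + p_stroke * u_stroke + p_well * u_well

eu_treat    = expected_utility(p_death=0.08, p_stroke=0.010)  # thrombolysis
eu_no_treat = expected_utility(p_death=0.12, p_stroke=0.002)  # conservative
print("treat" if eu_treat > eu_no_treat else "withhold")      # -> treat
```

Varying u_stroke here corresponds to the 'quality adjustment for stroke' sensitivity analysis the authors report.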
Does the use of topical lidocaine, epinephrine, and tetracaine solution provide sufficient anesthesia for laceration repair?
To determine: 1) the effectiveness of lidocaine, epinephrine, and tetracaine (LET) solution in eliminating or reducing the pain experienced in suturing superficial lacerations in adult patients; and 2) the effectiveness of LET in reducing the pain of local anesthetic injection. A prospective, randomized, double-blind study enrolling 60 adult patients with superficial lacerations was conducted in the ED of a community-based teaching hospital affiliated with the University of Toronto. Following application of the LET or placebo (sterile water) solution to the laceration, a visual analog pain scale was recorded by the patient upon needle probing of the wound margin. If probing was painless, the laceration was repaired using LET alone. If injection of local anesthetic was required, an additional pain scale was elicited to quantify the attenuation of the pain of injection by the prior application of LET. Pain scale values on needle probing were significantly reduced in the LET group vs the placebo group (medians of 4.0 vs 5.0 cm, respectively; p<0.05). Only 13 of the 30 patients in the LET group required additional anesthetic, while all 30 patients in the placebo group requested local anesthetic. Pain scale values on injection of local anesthetic were not significantly different between the LET and placebo groups (medians of 3.5 vs 5.0 cm, respectively; p = 0.09), although there was a trend toward lower pain scale values in patients who received LET. No adverse effects were noted after the application of either LET or placebo solution. Follow-up was achieved for 54 of 60 patients, with only 1 complication (a wound infection) reported in the LET group.
Significantly fewer patients require an injectable anesthetic when LET is applied. Those who do require an injection may experience less discomfort. These advantages should be balanced against the 20 to 30 minutes necessary for the LET to take effect.
closed_qa
Twenty-four hour blood pressure monitoring in early pregnancy: is it predictive of pregnancy-induced hypertension and preeclampsia?
To investigate whether a chronobiological analysis applied to automated 24-hour blood pressure monitoring in early pregnancy provides objective parameters enabling detection of individual patients at risk of pregnancy-induced hypertension or preeclampsia. 24-hour automatic blood pressure monitoring was performed at 8-16 and 20-25 gestational weeks in 104 women at risk of pregnancy-induced hypertension or preeclampsia. The subjects were hospitalized to be synchronized to rest-activity and meal-timing schedules. All women were followed longitudinally until post-partum. Chronobiological analysis of blood pressure values was performed; sensitivity, specificity and predictive values of the MESOR and hyperbaric index were also calculated. Incidence of pregnancy-induced hypertension or preeclampsia, gestational week at delivery and birthweight were recorded. Nine thousand nine hundred and eighty-four blood pressure measurements were analyzed. In patients who later developed overt hypertension, systolic and diastolic blood pressure MESOR, hyperbaric index and percent time elevation were already significantly higher in early pregnancy than in those who remained normotensive. The best sensitivity and specificity were obtained between 20-25 weeks of gestation with the systolic single cosinor MESOR and the hyperbaric index, using cut-offs of 103 mmHg (sensitivity: 88%; specificity: 75%) and 10 mmHg/24 hours (sensitivity: 70%; specificity: 92%), respectively.
The chronobiological analysis applied to 24-hour blood pressure monitoring during pregnancy allows definition of objective cut-off values which can be particularly useful in the routine clinical practice when the risk of developing pregnancy-induced hypertension or preeclampsia must be calculated in the individual subject.
closed_qa
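The cut-off performance reported above (e.g., systolic MESOR >103 mmHg: sensitivity 88%, specificity 75%) comes from dichotomizing a continuous chronobiological parameter against the later clinical outcome. A minimal sketch of that computation; the arrays below are hypothetical stand-ins for the study data:

```python
import numpy as np

def sens_spec(values, outcomes, cutoff):
    """Sensitivity and specificity of the test 'value > cutoff' for a
    binary outcome (1 = later hypertension/preeclampsia)."""
    values, outcomes = np.asarray(values), np.asarray(outcomes)
    positive = values > cutoff
    sensitivity = positive[outcomes == 1].mean()
    specificity = (~positive)[outcomes == 0].mean()
    return sensitivity, specificity

mesor = [98, 101, 104, 107, 110, 96, 99, 105]   # hypothetical, mmHg
later = [0,   0,   1,   1,   1,  0,  0,  0]     # hypothetical outcomes
print(sens_spec(mesor, later, cutoff=103))      # (1.0, 0.8)
```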
Can exercise testing in patients with a history of myocardial infarction predict fatal and non-fatal reinfarction?
Exercise testing (ET) is the preferred initial strategy for risk stratification in patients who are able to exercise and have an interpretable electrocardiogram. However, although it is often suggested and widely applied, its usefulness years after myocardial infarction (MI) is questionable. Therefore, this study was designed to assess the value of exercise testing in predicting the risk of fatal or non-fatal reinfarction in patients with chronic stable coronary artery disease (CAD) due to old myocardial infarction. Our study involved 766 consecutive stable subjects [mean (SD) age 57 (8.6) years; male: 89%] with stable CAD due to old MI [mean time from MI: 2.8 (0.75) years], who underwent a Bruce treadmill test and whose data were prospectively entered into our institutional database. Patients were followed up for an average of 7 (0.6) years. Reinfarction was observed in 62 patients; 54 non-fatal and 8 (13%) fatal. Relative risk (RR) of cardiac death for subjects with reinfarction was 4.02 [95% confidence interval (CI): 2.46 to 6.55]. Univariate predictors of fatal or non-fatal reinfarction were: multivessel disease (RR 7.99, CI 1.12 to 56.82), EF<40% (RR 2.91, CI 1.64 to 7.17), ST depression on rest ECG (RR 2.4, CI 1.30 to 4.45), BP increase with exercise<10 mmHg (RR 2.36, CI 1.41 to 3.93), BP/HR interaction<10 mmHg +<85% max (RR 2.16, CI 1.24 to 3.76). Markers of reduced risk of recurrence included low-risk Duke Treadmill Score (RR 0.55, CI 0.33 to 0.91) and EF ≥40% (RR 0.34, CI 0.19 to 0.60). A Cox regression model with clinical and exercise parameters detected ST depression on rest ECG (RR 1.47, CI 1.07 to 2.02), BP increase with exercise<10 mmHg (RR 1.41, CI 1.07 to 1.87), and low-risk Duke Treadmill Score (RR 0.79, CI 0.60 to 1.02). A model with coronary anatomy and ejection fraction also identified multivessel disease (RR 2.95, CI 1.43 to 6.09), EF<40% (RR 1.62, CI 1.17 to 2.25) and BP increase with exercise<10 mmHg (RR 2.53, CI 1.35 to 4.71).
Stable patients with a history of MI represent a very low-risk population in whom reinfarction nevertheless carries a severe prognosis. ET is unable to identify subjects at risk of recurrence, especially if only ischemic parameters are evaluated (in this setting, clinical or anatomic risk stratification may be better). The application of the Duke Treadmill Score could help to identify a very low-risk group in which no additional testing is required. Therefore, routine ET in stable patients with a history of MI is better at identifying a very low-risk group than at predicting recurrence.
closed_qa
Is lidocaine infiltration during femoral sheath removal really necessary?
To compare the effectiveness of intravenous morphine sulfate with that of intravenous morphine sulfate plus 1% lidocaine infiltration in managing pain associated with femoral sheath removal, and to evaluate pain intensity and associated complications during femoral sheath removal in patients with heart disease receiving interventional treatment. Descriptive, correlational, case-control study. Mid-metropolitan, university-affiliated, tertiary care medical center. Fifty patients with heart disease, who had no known allergies to morphine or lidocaine and who had not had an acute myocardial infarction, admitted to the cardiovascular intervention unit. Vertical visual analogue scale, short-form McGill Pain Questionnaire, and a demographic tool for complications. No significant differences in pain intensity, sensation, affect, or total scores were found between the analgesic regimens. Vasovagal reactions developed in six patients (12%), but no statistical differences were found between pain intensity scores and associated complications.
Morphine sulfate does appear effective in controlling pain associated with femoral sheath removal. The benefit of lidocaine infiltration during this procedure is questionable.
closed_qa