Dataset columns: instruction (string, 10-664 characters), context (string, 1-5.66k characters), response (string, 1-3.34k characters), category (1 class).
Communication between primary care and physician specialist: is it improving?
Efforts have recently been made in Spain to improve the communication model between primary care and specialized care. The aim of our study was to analyze the impact of changing the communication model between the two areas, comparing a traditional system with a consulting system in terms of general practitioner satisfaction and the number of patient referrals. A questionnaire was used to assess the views of 20 general practitioners from one primary care center on their relationship with the endocrinology team, at baseline and 18 months after implementation of the new method of communication. In addition, we counted the number of referrals during the two periods. We analyzed 30 questionnaires: 13 before and 17 after the consulting system was established. The consulting system was preferred to other alternatives as a means of communication with endocrinologists. After the consulting system was implemented, general practitioners were more confident in treating hypothyroidism and diabetes. The number of patient referrals to specialized care decreased from 93.8 to 34.6 per month after implementation of the consultant system.
The consultant system was more efficient in resolving problems and responding to general practitioners than the traditional system. General practitioners were more confident in self-management of hypothyroidism and diabetes. A very large decrease in the number of patient referrals was observed after implementation of the consultant system.
closed_qa
Do past mortality rates predict future hospital mortality?
This study aimed to determine whether hospitals with higher historical mortality rates are independently associated with worse patient outcomes. An observational study of in-hospital mortality in open abdominal aortic aneurysm repair, aortic valve replacement, and coronary artery bypass graft surgery was conducted using a California inpatient database. Hospitals' annual historical mortality rates between 1998 and 2010 were calculated based on 3 years of data before each year. Results were adjusted for race, sex, age, hospital teaching status, admission year, insurance status, and Charlson comorbidity index. Hospitals were divided into quartiles based on historical mortality rates. For abdominal aortic aneurysm repair, the odds ratio (OR) of in-hospital mortality for hospitals within the highest quartile of prior mortality was 1.30 compared with the lowest quartile (95% confidence interval [CI] 1.03 to 1.63). For aortic valve replacement, the OR was 1.41 for the 3rd quartile (95% CI 1.15 to 1.73) and 1.54 for the highest quartile (95% CI 1.27 to 1.87). For coronary artery bypass graft surgery, the OR was 1.33 for the 3rd (95% CI 1.2 to 1.49) and 1.58 for the highest (95% CI 1.41 to 1.76) quartiles.
Patients presenting to hospitals with high historical mortality rates have a 30% to 60% increased mortality risk compared with patients presenting to hospitals with low historical mortality rates.
closed_qa
Is restless legs syndrome associated with chronic mountain sickness?
Restless legs syndrome (RLS) and chronic mountain sickness (CMS) share physiological traits. Our objective was to explore a possible association between RLS and CMS. We carried out a cross-sectional study with male subjects living between 4100 and 4300 m above sea level. Participants underwent a clinical interview, physical examination, electrocardiographic (EKG) recording, and spirometry. We classified subjects into CMS, Limbo, and healthy high-altitude dwellers (hHAD), according to their Qinghai score and hematocrit levels. We applied the "Paradigm of questions for epidemiological studies of RLS," the International Restless Legs Syndrome Study Group Scale, and the Pittsburgh Sleep Quality Index. Logistic regression analysis was used to determine the association between variables. Seventy-eight male subjects were included. Forty subjects were hHAD, 23 were CMS patients, and 15 participants were considered as Limbo. CMS and Limbo subjects had a higher frequency of RLS (p <0.05). Limbo subjects had the highest severity score for RLS. There were no differences in age, body mass index (BMI), or tobacco use between RLS patients and non-sufferers. In the multivariate analysis, CMS was not associated with RLS diagnosis. Oxygen saturation (p = 0.019), poor sleep quality (p <0.01), and Qinghai score of ≥6 (p = 0.026) were independently associated with RLS diagnosis.
Our results did not show a direct association between RLS and CMS; however, RLS was associated with reduced oxygen saturation. Hence, RLS could represent an early clinical manifestation of hypoxia or, within the natural history of CMS, an early sign of maladaptation to high altitude.
closed_qa
Does tranexamic acid alter the risk of thromboembolism following primary total knee arthroplasty with sequential earlier anticoagulation?
Tranexamic acid (TXA) has attracted attention as a means of decreasing blood loss and transfusion requirements in total knee arthroplasty (TKA). However, concerns about its safety profile have hindered its wide adoption, and the balance between anticoagulation and sequential antifibrinolysis has not yet been explored. This large, single-center, prospective cohort study of consecutive cases aimed to investigate the epidemiology of vascular occlusive events associated with TXA and to introduce our preliminary results of a novel thromboprophylaxis approach. We prospectively collected patients' data from our institution through the National Health Database. The primary outcome was the incidence of venous thromboembolism and mortality within 30 days following primary TKA. Subgroup analysis was performed on the basis of TXA administration methods. From 2012 to 2014, a total of 2532 unilateral TKA procedures were performed in our institution, 2222 with TXA and 310 without TXA. The total occurrence of vascular occlusive events was statistically significantly higher in the TXA group (17.55% vs 9.35%, p<0.001), but this finding was confined to the calf veins, with the main difference being the incidence in the calf muscular veins (13.68% vs 6.77%, p=0.001). No statistical difference was detected in the incidence of either symptomatic or asymptomatic DVT. No episode of symptomatic PE or all-cause mortality occurred within 30 days postoperatively. Subgroup analysis revealed no significant difference in the incidence of DVT (p>0.05).
This study confirmed that the incidence of postoperative VTE was unchanged when TXA was administered in primary unilateral TKA. Our study further indicated that earlier anticoagulation should be adopted to maintain the balance between antifibrinolysis and anticoagulation after administering TXA.
closed_qa
Does magnesium exposure affect neonatal resuscitation?
Research on immediate neonatal resuscitation suggests that maternal magnesium exposure may be associated with increased risk of low Apgar scores, hypotonia, and neonatal intensive care unit admission. However, not all studies support these associations. Our objective was to determine whether exposure to magnesium at the time of delivery affects initial neonatal resuscitation. This is a secondary analysis of the Randomized Controlled Trial of Magnesium Sulfate for the Prevention of Cerebral Palsy, which evaluated whether the study drug (magnesium or placebo) administered at the time of delivery was associated with increased risk for a composite adverse neonatal resuscitation outcome (5-minute Apgar score <7, oxygen administration in the delivery room, intubation, chest compressions, hypotension, and hypotonicity). A subgroup analysis was performed among patients who delivered at ≥30 weeks of gestation. Log-linear regression was used to control for possible confounders. Data for 1047 patients were analyzed, of whom 461 neonates (44%) were exposed to magnesium. There was no increased risk for the primary composite outcome associated with magnesium exposure. Individual adverse neonatal outcomes and other secondary short-term neonatal outcomes that were evaluated also did not demonstrate an association with magnesium exposure.
Exposure to magnesium sulfate did not affect neonatal resuscitation or other short-term outcomes. These findings may be useful in planning neonatal care and patient counseling.
closed_qa
Do laborists improve delivery outcomes for laboring women in California community hospitals?
We sought to determine the impact of the laborist staffing model on cesarean rates and maternal morbidity in California community hospitals. This is a cross-sectional study comparing cesarean rates, vaginal birth after cesarean rates, composite maternal morbidity, and severe maternal morbidity for laboring women in California community hospitals with and without laborists. We conducted interviews with nurse managers to obtain data regarding hospital policies, practices, and the presence of laborists, and linked this information with patient-level hospital discharge data for all deliveries in 2012. Of 248 childbirth hospitals, 239 (96.4%) participated; 182 community hospitals were studied, and these hospitals provided 221,247 deliveries for analysis. Hospitals with laborists (n = 43, 23.6%) were busier, had more clinical resources, and cared for higher-risk patients. There was no difference in the unadjusted primary cesarean rate for laborist vs nonlaborist hospitals (11.3% vs 11.7%; P = .382) but there was a higher maternal composite morbidity rate (14.4% vs 12.0%; P = .0006). After adjusting for patient and hospital characteristics, there were no differences in laborist vs nonlaborist hospitals for any of the specified outcomes. Hospitals with laborists had higher attempted trial of labor after cesarean rates, and lower repeat cesarean rates (90.9% vs 95.9%; P<.0001). However, among women attempting trial of labor after cesarean, there was no difference in the vaginal birth after cesarean success rate.
We were unable to demonstrate differences in cesarean and maternal childbirth complication rates in community hospitals with and without laborists. Further efforts are needed to understand how the laborist staffing model contributes to neonatal outcomes, cost and efficiency of care, and patient and physician satisfaction.
closed_qa
Are Staffing, Work Environment, Work Stressors, and Rationing of Care Related to Care Workers' Perception of Quality of Care?
To describe care worker-reported quality of care and to examine its relationship with staffing variables, work environment, work stressors, and implicit rationing of nursing care. Cross-sectional study. National, randomly selected sample of Swiss nursing homes, stratified according to language region and size. A total of 4311 care workers of all educational backgrounds (registered nurses, licensed practical nurses, nurse aides) from 402 units in 155 nursing homes completed a survey between May 2012 and April 2013. Care worker-reported quality of care was measured with a single item; predictors were assessed with established instruments (eg, Practice Environment Scale-Nurse Working Index) adapted for nursing home use. A multilevel logistic regression model was applied to assess predictors of quality of care. Overall, 7% of care workers rated the quality of care provided as rather low or very low. Important factors related to better quality of care were higher teamwork and safety climate (odds ratio [OR] 6.19, 95% confidence interval [CI] 4.36-8.79); better staffing and resources adequacy (OR 2.94, 95% CI 2.08-4.15); less stress due to workload (OR 0.71, 95% CI 0.55-0.93); less implicit rationing of caring, rehabilitation, and monitoring (OR 0.34, 95% CI 0.24-0.49); and less rationing of social care (OR 0.80, 95% CI 0.69-0.92). Neither leadership, staffing levels, staff mix, nor turnover was significantly related to quality of care.
Work environment factors and organizational processes are vital to provide high quality of care. The improvement of work environment, support in handling work stressors, and reduction of rationing of nursing care might be intervention points to promote high quality of care in nursing homes.
closed_qa
Open airway surgery for subglottic hemangioma in the era of propranolol: Is it still indicated?
With the emergence of propranolol as the primary treatment for hemangiomas, the indications for surgical intervention have been greatly reduced. There remains a role for surgical management in those patients who fail medical therapy, particularly for hemangiomas involving the airway. We detail our experience with subglottic hemangiomas, including three patients who failed propranolol treatment and were successfully treated with surgical excision and single-stage laryngotracheoplasty (LTP) with thyroid ala graft. Retrospective case series (level of evidence: 4). Six patients were treated with propranolol for subglottic hemangiomas over a 6-year period (2008-2014). Three patients responded to propranolol therapy and required no adjunctive surgical procedures. Three patients failed propranolol treatment and required open resection of their subglottic hemangiomas with thyroid ala graft placement. Indications for resection were complete lack of response to propranolol in one patient, and initial response to propranolol with subsequent regrowth in the other two patients. All three patients were treated with submucosal extirpation of their hemangioma and single-stage LTP; hemangioma was confirmed in all cases by positive GLUT-1 staining. All three surgical patients were successfully extubated post-operatively and none had hemangioma regrowth.
Fifty percent of patients in our series did not have long-term response to propranolol for subglottic hemangioma, highlighting the importance of close follow-up. When identified early, subglottic hemangiomas refractory to propranolol treatment can be successfully addressed with single stage LTP and tracheotomy can be avoided.
closed_qa
Increased 99mTc MDP activity in the costovertebral and costotransverse joints on SPECT-CT: is it predictive of associated back pain or response to percutaneous treatment?
Pain related to costovertebral and costotransverse joints is likely an underrecognized and potentially important cause of thoracic back pain. On combined single-photon emission computed tomography and computed tomography (SPECT-CT), increased technetium-99m methylene diphosphonate (99mTc MDP) activity at these articulations is not uncommon. We evaluated whether this activity corresponds with thoracic back pain and whether it predicts response to percutaneous injection. All 99mTc MDP SPECT-CT spine examinations completed at our institution from March 2008 to March 2014 were retrospectively reviewed to identify those with increased 99mTc MDP activity in the costovertebral or costotransverse joints. The presence of corresponding thoracic back pain, percutaneous injection performed at the relevant joint(s), and response to injection were recorded. A total of 724 99mTc MDP SPECT-CT examinations were identified. Increased 99mTc MDP activity at costovertebral or costotransverse joints was reported in the examinations of 55 patients (8%). Of these, 25 (45%) had corresponding thoracic back pain, and nine of 25 patients (36%) underwent percutaneous injection of the joint(s) with increased activity. At clinical follow-up two days to 12 weeks after injection, one patient (11%) had complete pain relief, two (22%) had partial pain relief, and six (67%) had no pain relief.
The findings suggest that increased activity in costovertebral and costotransverse joints on 99mTc MDP SPECT-CT is only variably associated with the presence and location of thoracic back pain; it does not predict pain response to percutaneous injection.
closed_qa
Promoting theory of mind in older adults: does age play a role?
Previous research on age-related changes in Theory of Mind (ToM) showed a decline in older adults, particularly pronounced over 75 years of age. Evidence that ToM may be enhanced in healthy aging people has been demonstrated, but no study has focused on the role of age in the effects of ToM training for elderly people. The present study was designed to examine the efficacy of a ToM training on practiced (ToM Strange Stories) and transfer tasks (ToM Animations) in both young-old and old-old adults. The study involved 127 older adults belonging to two age groups: young-old (mean age = 64.41; SD = 2.49; range: 60-69 years) and old-old (mean age = 75.66; SD = 4.38; range: 70-85 years), randomly assigned to either a ToM group or a control group condition. All participants took part in two 2-hour testing sessions and four 2-hour training sessions. Results showed that both young-old and old-old adults in the ToM group condition improved their ability to reason about complex mental states significantly more than participants in the control group condition. This positive effect of the training was evident on practiced and transfer ToM tasks. Crucially, age did not moderate the effect of the ToM training.
These findings demonstrate that young-old and old-old adults equally benefit from the ToM training. Implications for the positive effect of the ToM training in old-old adults are discussed.
closed_qa
Assessment of Jordanian Patient's Colorectal Cancer Awareness and Preferences towards CRC Screening: Are Jordanians Ready to Embrace CRC Screening?
Colorectal cancer (CRC) is increasingly becoming a major cause of cancer morbidity and mortality in Jordan. However, the population's level of awareness about CRC, CRC screening test preferences, and willingness to embrace screening are not known. The aim of this study was to assess the level of CRC awareness and screening preferences among Jordanian patients. A survey assessing CRC knowledge levels was distributed among patients attending outpatient gastroenterology clinics in public hospitals throughout Jordan. A total of 800 surveys were distributed, and of these 713 (89.1%) were returned. Only 22% of the participants correctly judged CRC, among the choices provided, as the commonest cause of cancer-related deaths. The majority of participants (68.3%) underestimated their risk for CRC. Only 26.8% correctly judged their lifetime risk, while 5% overestimated their risk. Two-thirds of participants (66%) were willing to pay 500 Jordanian Dinars (equivalent to 706 US$) in order to get a prompt colonoscopy if recommended by their physician, while 25.5% reported that they would rather wait 6 months in order to get a free colonoscopy.
Although the participants tended to underestimate their risk for CRC, they were mostly aware of CRC as a major cause of mortality and were willing to embrace the concept of CRC screening and bear the related financial costs. These findings about CRC awareness and propensity for screening provide a good foundation as the Jordanian health system moves forward with initiatives to promote CRC screening and prevention.
closed_qa
Could the Breast Prognostic Biomarker Status Change During Disease Progression?
Prognostic biomarkers in breast cancer are routinely investigated in the primary tumors to guide further management. However, it is proposed that their expression may change during disease progression and may result in a different immune profile in the metastatic nodes. This work aimed to investigate the expression of breast prognostic biomarkers in primary tumors and in their axillary nodal metastases, to estimate the possible discordance in expression. Sixty paired primary and axillary nodal metastasis samples were collected from patients with primary breast cancer with positive nodal deposits, diagnosed at the Maadi Military Hospital, Cairo, Egypt, during the year 2013. ER, PR, and HER2 expression was assessed by immunohistochemistry in all samples. 48.3% of the included cases showed concordant results for both ER and PR receptors between the primary tumor and its nodal metastasis, while 51.7% showed discordant results, and the discordance level was statistically significant. On the other hand, 70% of the cases showed concordant HER2 results between the primary tumors and the nodal deposits, 30% showed discordant results, and the difference was significant.
The study indicated that the discordance in ER and PR receptor expression between the primary breast tumor and its nodal metastasis may be significant. The possible switch in biomarker status during disease progression is worth noting and may change the patient's therapeutic planning. Whether treatment selection should be based on biomarkers in the lymph nodes is therefore a topic for further studies and future clinical trials.
closed_qa
Contrast Enhancement of the Right Ventricle during Coronary CT Angiography--Is It Necessary?
It is unclear whether prolonged contrast media injection, intended to improve right ventricular visualization during coronary CT angiography, leads to increased detection of right ventricle pathology. The purpose of this study was to evaluate right ventricle enhancement and subsequent detection of right ventricle disease during coronary CT angiography. 472 consecutive patients referred for screening coronary CT angiography were retrospectively evaluated. Every patient underwent multidetector-row CT of the coronary arteries (128 × 0.6 mm collimation, 100-120 kV, rotation time 0.28 s, reference mAs 350) and received an individualized (P3T) contrast bolus injection of iodinated contrast medium (300 mgI/ml). Patient data were analyzed to assess right ventricle enhancement (HU) and right ventricle pathology. Image quality was defined as good when right ventricle enhancement was >200 HU, moderate when 140-200 HU, and poor when <140 HU. Good image quality was found in 372 patients, moderate in 80 patients, and poor in 20 patients. Mean enhancement of the right ventricle cavity was 268 HU ± 102. Patients received an average bolus of 108 ± 24 ml at an average peak flow rate of 6.1 ± 2.2 ml/s. In only three out of 472 patients (0.63%) was right ventricle pathology found (dilatation). No other right ventricle pathology was detected.
Right ventricle pathology was detected in three out of 472 patients; the dilatation observed in these three cases may have been picked up even without dedicated enhancement of the right ventricle. Based on our findings, right ventricle enhancement can be omitted during screening coronary CT angiography.
closed_qa
Are measurements of peak nasal flow useful for evaluating nasal obstruction in patients with allergic rhinitis?
Nasal obstruction is one of the most bothersome allergic rhinitis (AR) symptoms, and there is a need for objective parameters to complement clinical evaluation because perception is blunted in many patients. In this study we compare measures of peak nasal inspiratory flow (PNIF) and peak nasal expiratory flow (PNEF) in patients with AR and in individuals without nasal symptoms, and correlate them with the perception of nasal obstruction. A comparative cross-sectional study was conducted in 64 AR patients and 67 individuals without nasal symptoms aged between 16 and 50 years. All subjects had PNIF and PNEF measured, and subjective evaluations of nasal obstruction were done with a visual analogue scale (VAS) and a symptoms questionnaire. The results show lower PNIF and PNEF in AR patients compared to controls. There was no correlation between VAS score and PNIF or PNEF. There was a weak inverse correlation between PNIF and symptoms score.
Objective measures of nasal obstruction, especially PNIF, can give useful information on aspects of the disease different from those obtained from the patient's perception.
closed_qa
Pericyte chemomechanics and the angiogenic switch: insights into the pathogenesis of proliferative diabetic retinopathy?
To establish the regulatory roles that pericytes have in coordinating retinal endothelial cell (EC) growth and angiogenic potential. Pericytes were derived from donor diabetic (DHuRP) or normal (NHuRP) human retinae, and characterized using vascular markers, coculture, contraction, morphogenesis, and proliferation assays. To investigate capillary "cross-talk," pericyte-endothelial coculture growth and connexin-43 (Cx43) expression assays were performed. Paracrine effects were examined by treating EC with pericyte-derived conditioned media (CM) in proliferation, angiogenesis, and angiocrine assays. The effects of sphingosine 1-phosphate (S1P) were assessed using receptor antagonists. The DHuRP exhibit unique proliferative and morphologic properties, reflecting distinctive cytoskeletal and isoactin expression patterns. Unlike NHuRP, DHuRP are unable to sustain EC growth arrest in coculture and display reduced Cx43 expression. Further, CM from DHuRP (DPCM) markedly stimulates EC proliferation and tube formation. Treatment with S1P receptor antagonists mitigates DPCM growth promotion in EC and S1P-mediated pericyte contraction. Angiocrine assays on normal and diabetic pericyte secretomes reveal factors involved in angiogenic control, inflammation, and metabolism.
Effects from the diabetic microenvironment appear sustainable in cell culture: pericytes derived from diabetic donor eyes seemingly possess a "metabolic memory" in vitro, which may be linked to original donor health status. Diabetes- and pericyte-dependent effects on EC growth and angiogenesis may reflect alterations in bioactive lipid, angiocrine, and chemomechanical signaling. Altogether, our results suggest that diabetes alters pericyte contractile phenotype and cytoskeletal signaling, which ultimately may serve as a key, initiating event required for retinal endothelial reproliferation, angiogenic activation, and the pathological neovascularization accompanying proliferative diabetic retinopathy.
closed_qa
Does aspirin administration increase perioperative morbidity in patients with cardiac stents undergoing spinal surgery?
Cohort. To compare the perioperative morbidity of patients with cardiac stents after spine surgery who continue to take aspirin before and after the operation with a similar group of patients who preoperatively discontinued aspirin. The preoperative discontinuation of anticoagulant therapy has been the standard of care for orthopedic surgical procedures. However, recent literature has demonstrated significant cardiac risk associated with aspirin withdrawal in patients with cardiac stents. Although it has recently been demonstrated that performing orthopedic surgery while continuing low-dose aspirin therapy seems to be safe, studies focused on spinal surgery have not yet been performed. Because of the risk of intraspinal bleeding and the serious consequences of subsequent epidural hematoma with associated spinal cord compression, spinal surgeons have been reluctant to operate on patients taking aspirin. This institutional review board-approved study included 200 patients. Preoperative parameters and postoperative outcome measures were analyzed for 100 patients who underwent spinal surgery after the discontinuation of anticoagulation therapy and 100 patients who continued to take daily aspirin through the perioperative period. The primary outcome measure was serious bleeding-related postoperative complications such as spinal epidural hematoma. The operative time, intraoperative estimated blood loss, hospital length of stay, transfusion of blood products, and 30-day hospital readmission rates were also recorded and compared. The patients who continued taking aspirin in the perioperative period had a shorter hospital length of stay on average (4.1 ± 2.7 vs. 6.2 ± 5.8; P<0.005), as well as a reduced operative time (210 ± 136 vs. 266 ± 143; P<0.01), whereas there was no significant difference in the estimated blood loss (642 ± 905 vs. 697 ± 1187), the amount of blood products transfused, overall intra- and postoperative complication rate (8% vs. 11%), or 30-day hospital readmission rate (5% vs. 5%). No clinically significant spinal epidural hematomas were observed in either of the study groups.
The current study has observed no appreciable increase in bleeding-related complication rates in patients with cardiac stents undergoing spine surgery while continuing to take aspirin compared with patients who discontinued aspirin prior to surgery. Although very large studies will be needed to determine whether aspirin administration results in a small complication rate increase, the current study provides evidence that perioperative aspirin therapy is relatively safe in patients undergoing spinal surgery.
closed_qa
Are there subtypes of panic disorder?
Panic disorder (PD) is associated with significant personal, social, and economic costs. However, little is known about specific interpersonal dysfunctions that characterize the PD population. The current study systematically examined these interpersonal dysfunctions. The present analyses included 194 patients with PD out of a sample of 201 who were randomized to cognitive-behavioral therapy, panic-focused psychodynamic psychotherapy, or applied relaxation training. Interpersonal dysfunction was measured with the Inventory of Interpersonal Problems-Circumplex (Horowitz, Alden, Wiggins, & Pincus, 2000). Individuals with PD reported greater levels of interpersonal distress than that of a normative cohort (especially when PD was accompanied by agoraphobia), but lower than that of a cohort of patients with major depression. There was no single interpersonal profile that characterized PD patients. Symptom-based clusters (with vs. without agoraphobia) could not be discriminated on core or central interpersonal problems. Rather, as revealed by cluster analysis based on the pathoplasticity framework, there were 2 empirically derived interpersonal clusters among PD patients that were not accounted for by symptom severity and were opposite in nature: domineering-intrusive and nonassertive. The empirically derived interpersonal clusters appear to be of clinical utility in predicting alliance development throughout treatment: Although the domineering-intrusive cluster did not show any changes in the alliance throughout treatment, the nonassertive cluster showed a process of significant strengthening of the alliance.
Empirically derived interpersonal clusters in PD provide clinically useful and nonredundant information about individuals with PD.
closed_qa
Can school health check-ups serve as screening tool for growth anomalies and obesity in children?
To evaluate the prevalence of growth disorders and obesity in schoolchildren and determine whether school health check-ups are effective in their screening. Subjects-methods: Analysis of anonymized growth and body mass index (BMI) data from 2887 children attending the 3rd grade from 2008 to 2009, after selection of 75 elementary schools in Paris. Linear growth velocity was abnormal in 198 children. Height and weight were above the French reference values (+0.9 ± 1.2 SD and +1 ± 1.7 SD, respectively). BMI was higher compared to reference values (+0.4 ± 1.4 SD). At their last check-up, 20.9% of children had a BMI > +2 SD.
School health check-ups constitute a good screening tool for growth disorders and obesity. However, further work is needed to determine the most effective modality. The reference values currently used in France are no longer suitable and new reference charts need to be established. The high prevalence of obesity in schoolchildren remains a public health challenge.
closed_qa
Analysis of actions taken by medical rescue teams in the Polish Emergency Medical Services system. Is the model of division into specialist and basic teams reasonable?
The Polish Emergency Medical Services (EMS) system is based on two types of medical rescue teams (MRT): specialist (S) teams, which include system doctors, and basic (B) teams, staffed only by paramedics. The aim of this study is to assess whether dividing medical rescue teams into specialist and basic teams is reasonable. We retrospectively analysed the medical records of rescue activities performed during 21,896 interventions by medical rescue teams, 15,877 of them by basic medical rescue teams (B MRT) and 6,019 by specialist medical rescue teams (S MRT). The procedures executed by the two types of teams were compared. In the analysed group of dispatches, 56.4% were unrelated to medical emergencies. At the same time, 52.7% of code 1 interventions and 59.2% of code 2 interventions did not result in transporting the patient to the hospital. Dispatches of S teams were more often assigned code 1 (53.2% vs. 15.9%). It is worth emphasising that the procedures that can be applied exclusively by system doctors do not exceed 1% of interventions. Moreover, the number of actions performed in medical emergencies in the covered region by an S team dispatched first is comparable to that performed by B teams. B teams rarely required the assistance of S teams (0.92% of interventions).
This study points to the necessity to discuss the implementation of straightforward principles of call qualification and the optimisation of the system doctors' role in prehospital activities.
closed_qa
Is there a risk profile for the vulnerable junior doctor?
Mental ill health is prevalent among doctors, especially those in the early stages of postgraduate training. However, a paucity of research has examined factors predictive of psychological distress in this population. To report the findings from a multi-centre survey of mental health among junior doctors in Ireland, and assess the extent to which moderator variables (e.g., age, academic performance, nationality, etc.) alter the levels of psychological distress caused by internship. An online, anonymous, questionnaire was distributed to all interns in the Republic of Ireland in January 2012. A total of 270 interns responded to the survey (45.0 % response rate), with 48.5 % of the respondents having a score indicative of psychological distress. A regression model found that nationality, academic performance, intern training network, rating of work stressors, home stressors, and work-life balance were associated with differing levels of mental health as measured by the General Health Questionnaire-12.
There is a need to consider moderator variables when examining mental health in healthcare populations to avoid drawing overly simplistic conclusions. Interns in Ireland reported particularly high levels of psychological distress compared to other studies of mental health among healthcare populations.
closed_qa
Metformin: Potential analgesic?
To determine the association between self-reported use of metformin and pain intensity. Survey-based cross-sectional study. Primary care in an academic medical center. Three hundred and twenty-nine participants with diabetes. A total of 329 men and women, aged 18-65, completed a phone-based survey. We utilized the Brief Pain Inventory to assess pain intensity ratings; the Leeds Assessment of Neuropathic Symptoms and Signs to screen for neuropathy; and the Personal Health Questionnaire (PHQ8) Depression Scale to assess for depression. Three hundred and twenty-nine diabetics (mean age 54 ± 8 years) completed the study (162 metformin users, 167 nonusers). Compared with non-users, metformin users were used more often [38% vs 20%, P = 0.001], had lower mean depression scores [6.8 vs 8.3; P = 0.026], and had fewer comorbidities [1.5 vs 1.8, P = 0.022]. Adjusting for those three variables, pain scores were not significantly different between groups. In a subset analysis of those with neuropathic pain (n = 156), there were no differences in pain scores found between groups.
In a clinic sample of patients with diabetes, the use of metformin at an average dose of 1,432 mg (SD = 596 mg) was not associated with lower pain scores. Given the anti-nociceptive effects of metformin in animal models of pain and the relative safety of metformin, future research should evaluate the effect of higher doses of metformin as a potential analgesic.
closed_qa
Ventricular tachycardias in patients with pulmonary hypertension: an underestimated prevalence?
Sudden cardiac death (SCD) accounts for approximately 30% of deaths in patients with pulmonary arterial hypertension (PAH). The exact circumstances of SCD in this patient population are still unclear. Malignant cardiac arrhythmias are reported to be rarely present. There are no systematic data concerning long-term electrocardiographic (ECG) recording in patients with PAH. We sought to investigate the rate of potentially relevant arrhythmias in patients with pulmonary hypertension (PH). Consecutive patients without a diagnosis of known cardiac arrhythmias followed in our outpatient clinic for PH were enrolled in the study. All patients underwent a 72-h Holter ECG. Clinical data, 6-min walk distance, laboratory values, and echocardiography were collected/performed. Ninety-two consecutive patients (New York Heart Association (NYHA) class III/IV: 65.2%/5.4%, PH Group 1: 35.9%, Group 3: 10.9%, Group 4: 28.3%, Group 5: 2.2%) were investigated. Relevant arrhythmias were newly detected in 17 patients: non-sustained ventricular tachycardia (n = 12), intermittent second-degree heart block (n = 1), intermittent third-degree heart block (n = 3), and atrial flutter (n = 1). Echocardiographic systolic pulmonary pressure and right heart diameter were elevated in patients with relevant arrhythmias. Right heart catheterization revealed higher pulmonary vascular resistance (672 vs. 542 dyn·s·cm⁻⁵, p = 0.247) and lower cardiac index (2.46 vs. 2.82 l/min/m², p = 0.184).
Ventricular tachycardias occur more often in PH patients than previously reported. However, the prognostic relevance of non-sustained ventricular tachycardias in this cohort remains unclear. As a large number of PH patients die from SCD, closer monitoring, e.g., using implantable event recorders, might be useful to identify patients at high risk.
closed_qa
Do school resources influence the relationship between adolescent financial background and their school perceptions?
Socioeconomic status (SES) influences students' school perceptions and affects their performance, engagement, and personal beliefs. This study examined the effects of school population SES and school resources on the association between student SES and student perceptions. School liking, classmate social relationships, family affluence, and experience of hunger were assessed in a nationally representative sample of 12,642 students (grades 5-10) in the 2009-2010 Health Behavior in School-Aged Children study. School characteristics included school meal program, Title I dollars received per student, school resources, and urban/rural status. Multilevel analysis was used. At the individual level, both school liking and social relationships were negatively associated with student grade level. Boys liked school less and had more positive perceptions of social relationships than girls. Students in rural schools and those who experienced hunger liked school less and had poorer perceptions of social relationships than their respective counterparts. School-level percentage of students eligible for free/reduced meals accounted for 33% of the between-school variance in social relationships.
Family and school economic characteristics and grade level influenced students' school perceptions. The associations between student SES, school population SES, and school perceptions suggest that school health professionals should recognize and address student economic issues at school.
closed_qa
Weight-related behaviors when children are in school versus on summer breaks: does income matter?
Income disparities in academic achievement among US youth appear to widen during the summer because of discontinued learning among children from lower-income households. Little is known about whether behavioral risk factors for childhood obesity, such as diet and physical activity, also demonstrate a widening difference by income when children are out of school. Data from US children in grades 1-12 in the National Health and Nutrition Examination Survey 2003-2008 (N = 6796) were used to estimate screen time, moderate-to-vigorous physical activity (MVPA), and consumption of calories, vegetables, and added sugar. Linear regression was used to compare children from households at ≤185% and >185% of the poverty level, as well as the school year versus school breaks. Children surveyed during summer breaks consumed fewer vegetables (-0.2 cups/day) and more added sugar (+2.1 teaspoons/day), were more active (+4.6 MVPA minutes/day), and watched more television (+18 minutes/day). However, the nonsignificant interaction between school breaks and income indicated that lower-income students were not "less healthy" than higher-income students during the summer breaks.
Obesity-related risk factors were more prevalent during the summer and among lower-income youths, but the income disparity in these behaviors was not exacerbated when schools are not in session.
closed_qa
Can Preoperative Psychological Assessment Predict Outcomes After Temporomandibular Joint Arthroscopy?
Psychological assessment has been used successfully to predict patient outcomes after cardiothoracic and bariatric surgery. The purpose of this study was to determine whether preoperative psychological assessment could be used to predict patient outcomes after temporomandibular joint arthroscopy. Consecutive patients with temporomandibular dysfunction (TMD) who could benefit from arthroscopy were enrolled in a prospective cohort study. All patients completed the Millon Behavioral Medicine Diagnostic survey before surgery. The primary predictor variable was the preoperative psychological scores. The primary outcome variable was the difference in pain between the pre- and postoperative periods. The Spearman rank correlation coefficient and the Pearson product-moment correlation were used to determine the association between psychological factors and change in pain. Univariable and multivariable analyses were performed using a mixed-effects linear model and multiple linear regression. A P value of .05 was considered significant. Eighty-six patients were enrolled in the study. Seventy-five patients completed the study and were included in the final analyses. The mean change in visual analog scale (VAS) pain score 1 month after arthroscopy was -15.4 points (95% confidence interval, -6.0 to -24.7; P<.001). Jaw function also improved after surgery (P<.001). No association between change in VAS pain score and any of the 5 preoperative psychological factors was identified with univariable correlation analyses. Multivariable analyses identified that a greater pain decrease was associated with a longer duration of preoperative symptoms (P = .054) and lower chronic anxiety (P = .064).
This study has identified a weak association between chronic anxiety and the magnitude of pain decrease after arthroscopy for TMD. Further studies are needed to clarify the role of chronic anxiety in the outcome after surgical procedures for the treatment of TMD.
closed_qa
Pertussis vaccination in adult trauma patients: Are we missing an opportunity?
Trauma centers commonly administer tetanus prophylaxis to patients sustaining open wounds. In the United States, there are different vaccinations available for adult administration: tetanus/diphtheria toxoid (Td) or tetanus/reduced diphtheria and acellular pertussis (Tdap). The importance of the Tdap preparation lies in its vaccination against pertussis while providing tetanus immunity. Vaccination against pertussis is paramount for disease prevention. In recent decades, there has been a steady rise in pertussis cases. This epidemic increase caused the Centers for Disease Control and Prevention (CDC) Advisory Committee on Immunization Practices (ACIP) to recommend the routine use of Tdap when tetanus prophylaxis is indicated. The aim of this study was to gather data on which formulation of tetanus vaccination is currently being given to adult trauma patients. We hoped to increase awareness of the expanded recommendations for vaccination against pertussis when tetanus prophylaxis is indicated, thus providing patients with protection against pertussis. An institutional review board exempt, web-based, nationwide survey was sent to adult trauma center coordinators that could be located via an Internet search. Questions included trauma center level designation, number of trauma evaluations per year, zip code, hospital description (university, university affiliated, or community), and which vaccination is given for adults<65 years and those ≥65. At the conclusion of the survey, hyperlinks to the CDC ACIP recommendations were provided as an educational tool. A total of 718 emails were successfully sent and 439 (61%) completed surveys were returned. Level 4/5 centers had the highest compliance rates for those patients between ages 18 and 64 (93%), followed by level 2/3 (87%), and then level 1 centers (57%). Among all centers, the use of Tdap was lower in the ≥65 year group. Level 2 trauma centers were the most compliant with this age group (61%) followed by level 4/5 (57%) and level 1 (43%) centers.
With the increase in pertussis cases, vaccination remains crucial to prevention. The CDC recommendations for Tdap have existed for adults<65 years since 2005 and those ≥65 years since 2012. However, many adult trauma centers do not adhere to the current CDC ACIP guidelines for tetanus/pertussis vaccination. In particular, level 1 trauma centers and those classified as university hospitals have the lowest rate of compliance with these recommendations. Through this survey, trauma centers were educated on current recommendations. Increased vaccination of trauma patients with Tdap should improve protection against this virulent pathogen.
closed_qa
Is medial elbow pain correlated with cubital tunnel syndrome?
Medial elbow pain is often considered to be a symptom associated with ulnar neuropathy at the elbow (UNE). We examined the relationship between medial elbow pain and a positive electrodiagnostic (EDx) test result for UNE. We performed a retrospective review of 884 patients referred for EDx evaluation of UNE. Regression models were used to determine the odds ratios between clinical findings and a positive EDx result for UNE. Patients reported medial elbow pain in 44.3% of cases. Clinical factors that correlated with a positive EDx study result for UNE included male gender, small and ring finger numbness, ulnar intrinsic weakness, and age. Medial elbow pain was negatively correlated with a positive EDx result.
This study demonstrates a negative correlation between medial elbow pain and a positive EDx result for UNE. Medial elbow pain should not be considered a clear diagnostic symptom of UNE.
closed_qa
Is self-confidence a factor for successful breastfeeding?
Maternal self-confidence and self-efficacy in breastfeeding are recognized as factors positively associated with the initiation and duration of breastfeeding. Our aim was to evaluate the importance of this association using the Breastfeeding Self-Efficacy Scale (BSES). This prospective study was conducted in 2012 in the Jeanne-de-Flandre maternity department of the Lille University Hospital (France). During their stay in the maternity department, breastfeeding mothers who participated in the study completed the BSES, a brief self-assessment of their feelings of self-efficacy relating to breastfeeding. They then received follow-up telephone interviews at 1 and 3 months postpartum. One hundred and forty-nine mothers were included in the study. Breastfeeding rates were 86.5% at 1 month and 60% at 3 months. The BSES score of mothers who continued to breastfeed at 1 and 3 months was significantly higher than the score of mothers who had already weaned their children, with an AUROC of 0.72 at 3 months, supporting the reliability of the BSES for predicting breastfeeding continuation. The BSES score of mothers who had previously breastfed was significantly higher than that of mothers breastfeeding for the first time. The threshold score for the BSES was determined as 116/165.
It is important that mothers who lack confidence in their ability to breastfeed be identified early, whether on the maternity ward or even before this point. The value of BSES-based breastfeeding support intervention needs to be evaluated through randomized trials.
closed_qa
Condylar asymmetry in patients with juvenile idiopathic arthritis: Could it be a sign of possible temporomandibular joint involvement?
The aim of the study was to evaluate condylar and ramal asymmetry of the mandible in patients with juvenile idiopathic arthritis (JIA) using orthopantomograms (OPTs). A total of 30 patients with a confirmed diagnosis of JIA and a routine OPT, seeking orthodontic therapy and free of specific symptoms of temporomandibular joint involvement, and 30 matched normal subjects with OPTs were included in the study. The method of Habets et al. was used to compare the condyles and rami on the OPTs. The significance of between-group differences was assessed using the Mann-Whitney test. The results showed a highly significant difference in the range of asymmetry of the condyle, with the patient group being highly asymmetrical (P<0.0001). No differences were found in the range of asymmetry of the ramus between groups (P = 0.47). The intra-group comparison between males and females showed a difference in the patient group (P = 0.04), with females being more asymmetric.
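As context for the Habets et al. method named above (this is the commonly cited form of the index, not stated explicitly in this abstract, and the exact threshold used in the study is not reported here), the asymmetry index is typically calculated from the right (R) and left (L) condylar or ramal heights measured on the OPT as

\[ \text{Asymmetry index } (\%) = \frac{|R - L|}{R + L} \times 100, \]

with values above roughly 3-6% usually taken to indicate relevant asymmetry.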
Knowing that the temporomandibular joint (TMJ) is highly susceptible to inflammatory alterations during growth, even in the absence of symptoms, and given that the OPT is a cost-effective imaging tool widespread in the dental field, the OPT could be used as a first screening examination in JIA patients to calculate the condylar asymmetry index. The use of this screening tool will help physicians identify patients who should undergo more detailed TMJ imaging, allowing early detection of TMJ abnormalities and early, targeted therapy of the related cranial growth alterations.
closed_qa
Is hepato-imino diacetic acid scan a better imaging modality than abdominal ultrasound for diagnosing acute cholecystitis?
The role of hepato-imino diacetic acid scan (HIDA) in the diagnosis of acute cholecystitis remains controversial when compared with the more commonly used abdominal ultrasound (AUS). The diagnostic imaging workup of 1,217 patients who presented to the emergency department at a single hospital with acute abdominal pain and suspicion of acute cholecystitis was reviewed to calculate the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of AUS and HIDA. In patients undergoing both imaging modalities, HIDA had significantly higher sensitivity (90.7% vs 64.0%, P<.001) and specificity (71.4% vs 58.4%, P = .005) than AUS for the diagnosis of acute cholecystitis. Additionally, PPV and NPV of HIDA (56.2% and 95.0%, respectively) were higher than PPV and NPV of AUS (38.4% and 80.0%, respectively) when both imaging modalities were used for the same patient.
In adults with acute abdominal pain and suspected acute cholecystitis, HIDA significantly improves diagnostic accuracy compared with AUS.
closed_qa
Are preformed endotracheal tubes appropriately designed for pediatric patients?
The aim of the study was to examine different brands of preformed oral and nasal endotracheal tubes (ETT) and to assess whether the bend placement gave acceptable guidance for ETT depth positioning in children. The distance from the vertex of the bend to the tip (bend-to-tip distance) of seven brands of preformed oral and nasal ETTs was measured. Front teeth-to-carina (FTC) and nares-to-carina (NC) distance data from orally (0-19 years) and nasally (0-8 years) intubated children were used to assess the risk of endobronchial intubation if a preformed ETT had been placed with its bend at the front teeth or nares. While the bend-to-tip distance of a cuffed oral preformed ETT differed by only 0-1 cm from a same-size ETT of another brand, uncuffed oral ETTs differed by 0-4 cm. The bend-to-tip distance of cuffed and uncuffed ETTs of the same brand and size differed by 0-3 cm. Had preformed cuffed oral ETTs been placed with their bends at the front teeth in children of the FTC reference group, endobronchial intubation would have occurred in 0-27% of the patients, depending on the size and brand of the ETT used. In contrast to oral ETTs, the bend-to-tip distance of cuffed nasal ETTs differed more (0-5.5 cm) between brands, and that of uncuffed nasal ETTs less (0-3 cm). Also, the bend-to-tip distance of a cuffed nasal ETT was consistently greater (2-9 cm) than that of an uncuffed nasal ETT of the same brand and size. Had a preformed cuffed nasal ETT been placed with its bend at the nares in the NC reference group, 50-100% of the patients would have been endobronchially intubated.
The bend-to-tip distance of preformed ETTs varies between brands, especially for nasal tubes. Some preformed tubes are not well suited for routine use in children. There is a high risk of accidental endobronchial intubation if a cuffed preformed ETT is positioned with its bend at the front teeth or nares in a young child. ETT tip position needs to be carefully controlled when a preformed ETT is used in a child.
closed_qa
Nares-to-carina distance in children: does a 'modified Morgan formula' give useful guidance during nasal intubation?
Knowledge of the normal nares-to-carina (NC) distance might prevent accidental bronchial intubation and be helpful when designing preformed endotracheal tubes (ETT). The aim was to measure NC distance and to examine whether a height/length-based 'modified Morgan formula' would give useful guidance for nasotracheal ETT depth positioning. Two groups were studied. A younger group consisted of nasally intubated postoperative patients. In these, NC distance was obtained as the sum of ETT length and the distance from the ETT tip to the carina, as measured from an anteroposterior chest X-ray. An older group consisted of children who had undergone computerized tomography (CT) examination including head, neck, and chest. In these, NC was measured directly from the CT image. The modified Morgan formula was derived from the NC vs height/length relationship. Nares-to-carina distance was best predicted by a linear equation based on patient height. The equation in the younger group (1 day-8 years, n = 57) was: NC (cm) = 0.14 × height + 5.8, R² = 0.90, and in the older group (2.1-20 years, n = 45): NC (cm) = 0.15 × height + 3.4, R² = 0.93. The equation for the groups combined (n = 102) was: NC (cm) = 0.14 × height + 6.2, R² = 0.97. Based on the latter equation, a modified Morgan formula was identified as: ETT position at nares in cm = 0.12 × height + 5. If the ETT had been placed as calculated by this formula, the ETT tip would have been at 85 ± 5% (mean ± SD) of NC distance, and the ETT tip-to-carina distance would have been 3.1 ± 1.1 cm (range 0-6.6). Bronchial intubation would not have occurred in any child, but a comparison to tracheal length measurements indicates that ETT tip position could be too proximal in some children.
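As an illustrative worked example (using a hypothetical height of 110 cm, not a study patient), the combined-group equation and the modified Morgan formula above give

\[ NC \approx 0.14 \times 110 + 6.2 = 21.6\ \text{cm}, \qquad \text{ETT depth at nares} \approx 0.12 \times 110 + 5 = 18.2\ \text{cm}, \]

which would place the tube tip at roughly 84% of the predicted NC distance, in line with the reported 85 ± 5%.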
The study confirms previous reports: NC distance can be well predicted from height/length. A modified Morgan formula might decrease the risk of accidental endobronchial intubation in infants and children, but ETT position needs to be confirmed by auscultation or other verification.
closed_qa
Does anemia-polycythemia complicating twin-twin transfusion syndrome affect outcome after fetoscopic laser surgery?
Twin anemia-polycythemia sequence (TAPS) can occur as a unique disease or as a complication of twin-twin transfusion syndrome (TTTS). Middle cerebral artery (MCA) Doppler studies are not currently part of the routine evaluation of monochorionic twins since they are not used in the Quintero staging system. As such, the true incidence of TAPS is unknown. We aimed to compare the characteristics and outcomes of twin pregnancies with TTTS complicated by spontaneous anemia-polycythemia vs those with TTTS alone. This was a secondary analysis of data collected prospectively from a cohort of 156 consecutive patients undergoing fetoscopic laser surgery for TTTS, between October 2011 and August 2014. TAPS was defined as discordance in the preoperative MCA peak systolic velocity (PSV), with one twin fetus having MCA-PSV ≤ 1.0 multiples of the median (MoM) and the other having MCA-PSV ≥ 1.5 MoM. Maternal demographics as well as preoperative, operative and postoperative variables were analyzed. Included in the final analysis were 133 patients with complete records: 11 cases with TTTS with anemia-polycythemia and 122 cases with TTTS alone. There was no difference in maternal body mass index, gestational age (GA) at procedure, rate of preterm prelabor rupture of membranes or GA at delivery between the two groups. Patients with TTTS and anemia-polycythemia were more likely to be older (P = 0.03) and parous (P = 0.04) and had a significantly lower number of placental anastomoses (P = 0.01). The dual live-birth rate was similar for both groups (P = 0.76).
Cases of TTTS with anemia-polycythemia were more likely to be found in parous and older women and were characterized by fewer vascular anastomoses. TTTS with anemia-polycythemia was not associated with worse perinatal outcome after laser therapy.
closed_qa
Is familial screening useful in selective immunoglobulin A deficiency?
Selective immunoglobulin A deficiency (SIgAD), the most common primary immunodeficiency, is often asymptomatic. High rates of familial clustering have been described in SIgAD, but the causative genetic defect and mechanism of inheritance are unknown. To determine whether familial SIgAD cases show more severe clinical and immunological characteristics than sporadic ones; to investigate the utility of screening first-degree relatives (FDRs) of these patients, and to determine whether symptoms in affected family members are important enough to justify screening. This was a descriptive, cross-sectional study (October 2010-September 2011) of all patients with SIgAD followed up in our center. Demographic, clinical, and analytical data were reviewed. A familial case was defined as an SIgAD patient with at least one affected FDR. Of the 130 participants, 42 were SIgAD patients and 88 were FDRs. There were 13 (31%) familial cases and 14 (16%) affected FDRs. Six family members had to be analyzed in order to detect one affected relative. There were no clinical differences between familial and sporadic SIgAD cases. The percentages of intestinal disease (p=.001, OR=9.57, 95%CI 2.59-35.3), hospitalizations (p=.045, OR=4.01; 95%CI 1.10-14.67), and need for chronic treatment (p=.006, OR=5.5; 95%CI 1.57-19.54) were higher in affected FDRs than in unaffected ones.
The symptoms were not more severe in familial than sporadic SIgAD cases. Nonetheless, the elevated prevalence of affected FDRs with significant morbidity may justify routine screening of close family members of these patients.
closed_qa
Is Etanercept 25 mg Once Weekly as Effective as 50 mg at Maintaining Response in Patients with Ankylosing Spondylitis?
To investigate, in a pilot randomized controlled trial, whether etanercept (ETN) 25 mg once weekly is effective at maintaining a clinical response in patients with ankylosing spondylitis (AS) who have responded to the standard 50 mg dose. Adults with AS not responding to conventional therapies were prescribed ETN 50 mg once weekly for 6 months. Responders as defined by the Bath Ankylosing Spondylitis Disease Activity Index (BASDAI) were randomly assigned to taper to 25 mg once weekly or continue on 50 mg and followed for a further 6 months. The primary outcome measure was maintenance of a 50% reduction in the BASDAI or fall in BASDAI by ≥ 2 units and a ≥ 2-unit reduction in BASDAI spinal pain as measured on a 10-point visual analog scale at 6 months postrandomization. Of 89 patients assessed for eligibility, 59 were enrolled; 47 (80%) had sufficient clinical response and were eligible for randomization, 24 were assigned to continue receiving ETN 50 mg, and 23 to taper to 25 mg. After 6 months, 20 (83%) of the 50 mg arm maintained clinical response compared with 12 (52%) of the 25 mg arm (a difference of -31%; 95% CI -58% to -5%).
Although this pilot study demonstrates that treatment with ETN 25 mg was less effective at maintaining treatment response in the stepdown phase, 52% of participants maintained treatment response. Future research should address which patients are suitable for tapering.
closed_qa
Can the Cancer-related Fatigue Case-definition Criteria Be Applied to Chronic Medical Illness?
Fatigue is a crucial determinant of quality of life across rheumatic diseases, but the lack of agreed-upon standards for identifying clinically significant fatigue hinders research and clinical management. Case definition criteria for cancer-related fatigue were proposed for inclusion in the International Classification of Diseases. The objective was to evaluate whether the cancer-related fatigue case definition performed equivalently in women with breast cancer and systemic sclerosis (SSc) and could be used to identify patients with chronic illness-related fatigue. The cancer-related fatigue interview (case definition criteria met if ≥ 5 of 9 fatigue-related symptoms present with functional impairment) was completed by 291 women with SSc and 278 women successfully treated for breast cancer. Differential item functioning was assessed with the multiple indicator multiple cause model. Items 3 (concentration) and 10 (short-term memory) were endorsed significantly less often by women with SSc compared with cancer, controlling for responses on other items. Omitting these 2 items from the case definition and requiring 4 out of the 7 remaining symptoms resulted in a similar overall prevalence of cancer-related fatigue in the cancer sample compared with the original criteria (37.4% vs 37.8%, respectively), with 97.5% of patients diagnosed identically with both definitions. Prevalence of chronic illness-related fatigue was 36.1% in SSc using 4 of 7 symptoms.
The cancer-related fatigue criteria can be used equivalently to identify patients with chronic illness-related fatigue when 2 cognitive fatigue symptoms are omitted. Harmonized definitions and measurement of clinically significant fatigue will advance research and clinical management of fatigue in rheumatic diseases and other conditions.
closed_qa
Are perioperative near-infrared spectroscopy values correlated with clinical and biochemical parameters in cyanotic and acyanotic infants following corrective cardiac surgery?
Near-infrared spectroscopy (NIRS) is a useful non-invasive tool for monitoring infants undergoing cardiac surgery. In this study, we aimed to determine the NIRS values in cyanotic and acyanotic patients who underwent corrective cardiac surgery for congenital heart diseases. Thirty consecutive infants who were operated on with the diagnosis of ventricular septal defect (n=15) or tetralogy of Fallot (n=15) were evaluated retrospectively. A definitive repair of the underlying cardiac pathology was achieved in all cases. A total of six measurements of cerebral and renal NIRS were performed at different stages of the perioperative period. The laboratory data, mean urine output and serum lactate levels were evaluated along with NIRS values in each group. The NIRS values differed between the two groups, even after the corrective surgical procedure was performed. The recovery of renal NIRS values was delayed in the cyanotic patients.
Even though definitive surgical repair is performed in cyanotic infants, recovery of the renal vasculature may be delayed by up to two days, which is suggestive of a vulnerable period for renal dysfunction.
closed_qa
Do atmospheric conditions influence the first episode of primary spontaneous pneumothorax?
Several studies suggest that changes in airway pressure may influence the onset of primary spontaneous pneumothorax (PSP). The aim of this study was to investigate the influence of atmospheric changes on the onset of the first episode of PSP. We retrospectively analysed cases of pneumothorax admitted to our department between 1 January 2009 and 31 October 2013. Patients with recurrent pneumothorax, traumatic pneumothorax, older than 35 years or presenting history of underlying pulmonary disease were excluded. Meteorological data were collected from the Météo-France archives. Variation (Δ) of mean atmospheric pressure, and relative humidity, were calculated for each day between the day at which symptoms began (D-day), the day before first symptoms (D-1), 2 days before the first symptoms (D-2) and 3 days before the first symptoms (D-3). Six hundred and thirty-eight cases of pneumothorax were observed during the period of this study; 106 of them (16.6%) were a first episode of PSP. We did not observe any significant differences between days with or without PSP admission for any of the weather parameters that we tested. We could not find any thresholds in the variation of atmospheric pressure that could be used to determine the probability of PSP occurrence.
Variations in atmospheric pressure, relative humidity, rainfall, wind speed and temperature were not significantly related to the onset of the first episode of PSP in healthy patients. These results suggest that the scientific community should focus on possible aetiological factors other than airway pressure modifications.
closed_qa
Does the complexity of coronary artery disease affect outcomes after complete revascularization with long segmental reconstruction of the left anterior descending artery using the left internal thoracic artery?
We evaluated the influence of the complexity of coronary artery disease stratified by the Synergy between Percutaneous Coronary Intervention with Taxus and Cardiac Surgery (SYNTAX) score on early or late outcomes after complete revascularization with long segmental reconstruction of the left anterior descending artery using the left internal thoracic artery. From March 1995 to December 2003, a total of 102 patients with triple-vessel and/or left main disease underwent complete revascularization with long segmental left anterior descending artery reconstruction (≥2 cm) with or without endarterectomy using the left internal thoracic artery. The patients were divided into two groups according to the median SYNTAX score: the low group (SYNTAX score of<32, n = 50) and the high group (SYNTAX score of ≥32, n = 52). Outcomes were compared between the two groups, and predictors of follow-up death and major adverse cardiac and cerebrovascular events were determined. The mean number of distal anastomoses was 4.2 ± 1.1, and complete revascularization was achieved in 96% of patients. The early mortality rate was 2.9%, and no significant differences in the perioperative results were observed between groups. There were no significant differences in overall survival or major adverse cardiac and cerebrovascular event-free survival between the two groups. The hazard ratio of SYNTAX score for early mortality was 0.94 (95% confidence interval: 0.88-1.01) and for major adverse cardiac and cerebrovascular events was 0.97 (95% confidence interval: 0.92-1.02).
The complexity of coronary artery disease had no impact on early or late outcomes after complete revascularization with long segmental left anterior descending artery reconstruction using the left internal thoracic artery.
closed_qa
Does ethnicity have an effect on fetal behavior?
Fetal behavior was assessed by Kurjak's antenatal neurodevelopmental test (KANET) using four-dimensional (4D) ultrasound between 28 and 38 weeks of gestation. Eighty-nine Japanese (representative of Asians) and seventy-eight Croatian (representative of Caucasians) pregnant women were studied. The total KANET score and the values of each parameter (eight parameters) were compared. The total KANET score was normal in both populations, but there was a significant difference in total KANET scores between Japanese (median, 14; range, 10-16) and Croatian fetuses (median, 12; range, 10-15) (P<0.0001). When individual KANET parameters were compared, we found significant differences in four fetal movements (isolated head anteflexion, isolated eye blinking, facial alteration or mouth opening, and isolated leg movement). No significant differences were noted in the four other parameters (cranial suture and head circumference, isolated hand movement or hand-to-face movements, finger movements, and gestalt of general movements).
Our results suggest that ethnicity should be considered when evaluating fetal behavior, especially during assessment of fetal facial expressions. Although there was a difference in the total KANET score between the Japanese and Croatian populations, all the scores in both groups were within the normal range. Our results indicate that ethnic differences in fetal behavior do not affect the total KANET score, but close follow-up should be continued in some borderline cases.
closed_qa
Should All Cases of High-Grade Serous Ovarian, Tubal, and Primary Peritoneal Carcinomas Be Reclassified as Tubo-Ovarian Serous Carcinoma?
The dualistic theory of ovarian carcinogenesis proposes that epithelial "ovarian" cancer is not one entity with several histological subtypes but a collection of different diseases arising from cells of different origin, some of which may not originate in the ovarian surface epithelium. All cases referred to the Pan-Birmingham Gynaecological Cancer Centre with an ovarian, tubal, or primary peritoneal cancer between April 2006 and April 2012 were identified from the West Midlands Cancer Registry. Tumors were classified into type I (low-grade endometrioid, clear cell, mucinous, and low-grade serous) and type II (high-grade serous, high-grade endometrioid, carcinosarcoma, and undifferentiated) cancers. Ovarian (83.5%), tubal (4.3%), and primary peritoneal (12.2%) carcinomas were diagnosed in a total of 583 women. The ovarian tumors were type I in 134 cases (27.5%), type II in 325 cases (66.7%), and contained elements of both type I and type II tumors in 28 cases (5.7%). Most tubal and primary peritoneal cases, however, were type II tumors: 24 (96.0%) and 64 (90.1%), respectively. Only 16 (5.8%) of the ovarian high-grade serous carcinomas were stage I at diagnosis, whereas 240 (86.6%) were stage III+. Overall survival varied between the subtypes when matched for stage. Stage III low-grade serous and high-grade serous carcinomas had a significantly better survival compared to clear cell and mucinous cases, P = 0.0134. There was no significant difference in overall survival between the high-grade serous ovarian, tubal, or peritoneal carcinomas when matched for stage (stage III, P = 0.3758; stage IV, P = 0.4820).
Type II tumors are more common than type I and account for most tubal and peritoneal cancers. High-grade serous carcinomas, whether classified as ovarian/tubal/peritoneal, seem to behave as one disease entity with no significant difference in survival outcomes, therefore supporting the proposition of a separate classification of "tubo-ovarian serous carcinoma".
closed_qa
Is there a threshold concentration of cat allergen exposure on respiratory symptoms in adults?
Cat allergen concentrations higher than 8 μg/g in settled house dust have been suggested to provoke exacerbation of allergic respiratory symptoms. However, whether an indoor cat allergen concentration of 8 μg/g is indeed the minimal exposure required for triggering asthma-related respiratory symptoms or the development of sensitization has not yet been confirmed. We studied the associations between domestic cat allergen concentrations and allergic symptoms in the European Community Respiratory Health Survey II, with the aim of confirming this suggested threshold. Cat allergen concentrations were measured in the mattress dust of 3003 participants from 22 study centres. Levels of specific immunoglobulin E to cat allergens were measured in serum samples using an immunoassay. Information on allergic symptoms, medication use, home environment and smoking was obtained from a face-to-face interview. Domestic cat allergen concentrations were not associated with allergic/asthmatic symptoms in the entire study population, nor in the subset sensitized to cat allergen. We also found no association among individuals exposed to concentrations higher than 8 μg/g. However, exposure to medium cat allergen concentrations (0.24-0.63 μg/g) was positively associated with reported asthmatic respiratory symptoms in subjects who had experienced allergic symptoms when near animals.
The proposed 8 μg/g threshold of cat allergen concentrations for the exacerbation of allergic/respiratory symptoms was not confirmed in a general European adult population. Potential biases attributable to avoidance behaviours and an imprecise exposure assessment cannot be excluded.
closed_qa
Residual Anatomical Disease in Diffuse Large B-Cell Lymphoma Patients With FDG-PET-Based Complete Response After First-Line R-CHOP Therapy: Does It Have Any Prognostic Value?
This study aimed to determine the prognostic value of residual anatomical disease, including its size and reduction relative to baseline, in diffuse large B-cell lymphoma patients who have (18)F-fluoro-2-deoxy-D-glucose positron emission tomography-based complete response after first-line R-CHOP therapy. This retrospective study included 47 patients. In patients with computed tomography (CT)-based residual disease, the size of the largest residual lesion (Resmax) and the sum of the sizes of all residual lesions (Restotal) were measured, and their reductions relative to baseline (ΔResmax and ΔRestotal) were calculated. Patients with high-risk National Comprehensive Cancer Network International Prognostic Index (NCCN-IPI) scores had significantly lower progression-free survival (PFS) and overall survival (OS) than patients with low-risk NCCN-IPI scores (P = 0.032 and P = 0.022). In contrast, patients with residual lesions at CT did not have significantly lower PFS and OS than those without (P = 0.531 and P = 0.801). In the subpopulation with CT-based residual disease, patients with high Resmax, high Restotal, low ΔResmax, and low ΔRestotal did not have significantly different PFS and OS from those with low Resmax, low Restotal, high ΔResmax, and high ΔRestotal (P = 0.980 and P = 0.790, P = 0.423 and P = 0.229, P = 0.923 and P = 0.893, and P = 0.923 and P = 0.893, respectively).
The NCCN-IPI retains its prognostic value in diffuse large B-cell lymphoma patients with (18)F-fluoro-2-deoxy-D-glucose positron emission tomography-based complete response after first-line R-CHOP therapy. However, the presence of residual anatomical disease, including its size and reduction relative to baseline, has no prognostic value in these patients.
closed_qa
Does treatment impact health outcomes for patients after acute coronary syndrome?
Mortality rates for acute coronary syndrome (ACS) patients are still very high all over the world. Our study aimed to investigate the impact of ACS treatment on cardiovascular (CV) mortality eight years following ACS. A retrospective cohort study with a total of 613 patients was used. The data were collected from databases and medical records. An evidence-based treatment (EBT) algorithm was used based on the ESC guidelines. Logistic regression analysis and standardized odds ratios with 95% confidence interval (CI) were used for the risk assessment, with a p level <0.05 considered significant. The median follow-up time in this study was 7.6 years. During follow-up, 48.9% of the patients (n=300) died from CV causes, 207 (69%) of them for a relevant reason. For monotherapy, ACE inhibitors and β-blockers, and for fixed-dose combined drugs, ACE inhibitors and diuretics, were most frequently used. EBT was provided to 37.8% of patients. Use of EBT (HR 0.541, CI 0.394-0.742, p<0.001) during the follow-up period was important for reducing CV mortality in ACS patients.
The combined use of EBT significantly improved outcomes. Recurrent myocardial infarction and percutaneous coronary intervention were more frequent among patients receiving EBT, and EBT was beneficial in reducing CV mortality.
closed_qa
Are lifestyle behavioral factors associated with health-related quality of life in long-term survivors of non-Hodgkin lymphoma?
The objective of the current study was to determine whether survivors of non-Hodgkin lymphoma (NHL) are meeting select American Cancer Society (ACS) health-related guidelines for cancer survivors, as well as to examine relationships between these lifestyle factors and health-related quality of life (HRQoL) and posttraumatic stress (PTS). A cross-sectional sample of 566 survivors of NHL was identified from the tumor registries of 2 large academic medical centers. Respondents were surveyed regarding physical activity, fruit and vegetable intake, body weight, tobacco use, HRQoL using the Medical Outcomes Study Short Form-36, and PTS using the Posttraumatic Stress Disorder CheckList-Civilian form. Lifestyle cluster scores were generated based on whether individuals met health guidelines, and multiple linear regression analysis was used to evaluate relationships between lifestyle behaviors and HRQoL scores and PTS scores. Approximately 11% of participants met all 4 ACS health recommendations. Meeting all 4 health recommendations was related to better physical and mental QoL (standardized regression coefficient [β], .57 [P<.0001] and β, .47 [P = .002]) and to lower PTS scores (β, -0.41; P = .01).
Survivors of NHL who met more ACS health-related guidelines appeared to have better HRQoL and less PTS. Unfortunately, many survivors are not meeting these guidelines, which could impact their overall well-being and longevity.
closed_qa
Small kidneys for large recipients: does size matter in renal transplantation?
Imbalance between transplanted renal mass and the metabolic demands of the recipient has been identified as a predictor of renal graft function. Multiple factors have been used to test this influence, but none of them is consensually accepted. The aim of this study is to evaluate the influence of the imbalance between transplanted renal mass and the metabolic needs of the recipient by analyzing the relationship between the ratio of the weight of the renal graft to the body weight of the recipient (Kw/Rw) and transplantation outcomes. A prospective observational study of 236 first and single cadaveric renal transplants in non-hyperimmunized recipients was conducted. Grafts were orthogonally measured and weighed immediately before implantation, and these measures were correlated with donor and recipient data. According to the Kw/Rw ratio, patients were divided into three groups: Kw/Rw<2.8 (P25), Kw/Rw = 2.8-4.2, and Kw/Rw>4.2 (P75). After a mean follow-up of 5.2 years, transplant outcomes (delayed graft function; acute rejections; and estimated 1-, 6-, 12-, 36-, and 60-month renal function, graft, and patient survivals) were evaluated and correlated in uni- and multivariate analyses with the Kw/Rw ratio. Mean values for graft dimensions were 109.47 × 61.77 × 40.07 mm and the mean weight was 234.63 g. Mean calculated volume was 145.64 mL. The mean Kw/Rw ratio was 3.65 g/kg. These values were significantly lower for female grafts (3.91 vs 3.24, P<.001). According to the Kw/Rw ratio groups, there were no differences in delayed graft function, acute rejection episodes, or estimated graft function at the defined times. The increase in estimated glomerular filtration rate by a mean of 3.6 mL/min between 1 and 6 months for patients with Kw/Rw<2.8 was not statistically significant when compared to the higher ratio group, with a mean variation of -0.91 mL/min (P = .222). Graft survival rate at 5 years after transplantation was 79% in the Kw/Rw<2.8 group and 82% in the Kw/Rw>4.2 group (P = .538). Patient survival rate at 5 years after transplantation was 85% in the Kw/Rw<2.8 group and 92% in the high ratio group (P = .381). Kw/Rw ratio was not an independent risk factor for transplant failure at 5.2 years in a multivariate logistic regression analysis. Irrespective of recipient weight, graft survival was significantly higher for grafts with volume or weight above the 50th percentile (vol>134 mL, P = .011 or weight>226 g, P = .016).
The imbalance between implanted renal mass and recipient metabolic demands does not seem to influence the functional outcomes and graft survival up to 60 months post-transplantation. Nevertheless, irrespective of recipient weight, graft survival is significantly higher for grafts with volume or weight above the 50 percentile.
closed_qa
Anxiety and depression symptoms in hepatic encephalopathy: are they psychiatric or organic?
Hepatic encephalopathy (HE) represents a broad continuum of neuropsychiatric abnormalities, from subtly altered mental status to deep coma, seen in patients with liver dysfunction. HE can mimic all of the major psychiatric syndromes. The distinction between HE and a psychiatric condition, namely depression, is sometimes difficult. Some liver patients end up being medicated with psychiatric drugs, which might worsen their medical state. The main objective of this study was to try to find the correlations between anxiety and depression symptoms and the presence of HE to better diagnose and treat these patients. Sixty consecutive liver transplant candidates attending the outpatient clinics of a liver transplantation center were studied from January 1, 2012, to December 1, 2012. Each patient was assessed by means of the Psychometric Hepatic Encephalopathy Score subtests and the Hospital Anxiety and Depression Scale. We found a statistically significant relationship between HE and some of the depressive symptoms: anhedonia and loss of energy.
These findings may indicate that when in the presence of an HE patient with depressive symptoms, HE-directed therapies should be attempted before antidepressant drugs.
closed_qa
Stents for bronchial stenosis after lung transplantation: should they be removed?
Airway complications after lung transplantation are the major cause of morbidity, affecting up to 33% of all cases. Bronchial stenosis is the most common complication. The use of stents has been established as the most effective therapy; however, their removal is recommended after 3-6 months of use. We have been using self-expandable stents as a definitive treatment and remove them only if necessary. For this report, we evaluated the use of self-expandable stents as a definitive treatment for bronchial stenosis after lung transplantation. We performed a retrospective cohort study to evaluate patients with bronchial stenosis from August 2003 to April 2014. Clinical and pulmonary function test data were collected. Two hundred lung transplants were performed, 156 of which were bilateral. Sixteen patients experienced airway complications: 4 had dehiscence, 2 necrosis, and 10 bronchial stenosis. Of these patients, 7 had undergone bilateral procedures, and 2 patients developed stenosis in both sides. Twelve anastomotic stenoses were observed. The follow-up after stenting ranged from 1 to 7 years. All patients had increased lung function, and 4 remained stable with sustained increase in pulmonary function without episodes of infection. Three patients required removal of their prosthesis 6 months to 1 year after implantation because of complications. Two patients died owing to unrelated causes.
Definitive treatment of bronchial stenosis with self-expandable stents is a viable option. The 1st year seems to be the most crucial for determining definitive treatment, because no patients required removal of their stent after 1 year.
closed_qa
Does lower urinary tract status affect renal transplantation outcomes in children?
Lower urinary tract dysfunction (LUTD), an important cause of end-stage renal disease (ESRD) in children, can adversely affect renal graft survival. We compared renal transplant patients with LUTD as the primary renal disease to those without LUTD. The data of 60 children who underwent renal transplantation (RTx) between 2000 and 2012 were retrospectively reviewed. All patients with LUTD were evaluated with urodynamic tests preoperatively; 15 patients required clean intermittent catheterization and 9 patients underwent augmentation cystoplasty before RTx. There were 25 children with LUTD. The mean follow-up for the LUTD (+) and LUTD (-) groups was 63 (22-155) and 101 (14-124) months, and graft survival was 76% for LUTD (+) and 80% for LUTD (-), respectively (P = .711). On the other hand, creatinine levels at last follow-up were significantly higher in the LUTD (+) group (1.3 ± 0.3 mg/dL vs 0.96 ± 0.57 mg/dL, P<.001). Infectious complications and postoperative urinary tract infection incidences were also higher in the LUTD (+) group (68% vs 25.7%, P = .002 and 60% vs 11.4%, P<.01).
The incidence of urinary tract infection (UTI) is significantly higher after kidney transplantation in patients with LUTD. Despite the higher risk of UTI, renal transplantation can be performed safely in those patients with careful patient selection, preoperative management, and close postoperative follow-up. Restoration of good bladder function is the key factor in the success of kidney transplantation in those patients.
closed_qa
Graft function and arterial stiffness: can bioimpedance analysis be useful in renal transplant recipients?
We aimed to determine the total body water (TBW) by means of bioimpedance analysis (BIA) and to analyze the association of TBW, graft function, and arterial stiffness by means of pulse-wave velocity (PWV) and echocardiographic measurements in renal transplant (RT) recipients. Eighty-two RT recipients (mean age, 38.7 ± 11.5 y; 58 male) who were using ≥1 antihypertensive treatment were enrolled in the study. Biochemical parameters, 24-hour urinary protein loss, estimated glomerular filtration rate (eGFR), transthoracic echocardiography, bioimpedance analysis according to systolic blood pressure, TBW, lean tissue index (LTI), extracellular water (ECW), intracellular water (ICW), lean tissue mass (LTM), phase angle (Phi50) levels, and renal resistive index (RRI) were evaluated. TBW and ECW were significantly correlated with systolic blood pressure. Urinary protein loss, pulmonary artery pressure, frequency of overhydration, systolic blood pressure, TBW, LTI, ECW, ICW, LTM, and Phi50 values were significantly higher in patients with estimated glomerular filtration rate (eGFR) 15-49 mL/min but similar in patients with eGFR 50-70 mL/min.
Hypertensive RT recipients have increased TBW, LTI, ICW, FTI, LTM, and Phi50 values. Graft function is positively correlated with systolic blood pressure and BIA parameters. Therefore, hypertensive RT recipients should be closely followed with the use of BIA for an early diagnosis of loss of graft function.
closed_qa
The metabolic syndrome and cancer: Is the metabolic syndrome useful for predicting cancer risk above and beyond its individual components?
The metabolic syndrome (MetS) is a risk factor for cancer. However, it is not known if the MetS confers a greater cancer risk than the sum of its individual components, which components drive the association, or if the MetS predicts future cancer risk. We linked 20,648 participants from the Australian and New Zealand Diabetes and Cancer Collaboration with complete data on the MetS to national cancer registries and used Cox proportional hazards models to estimate associations of the MetS, the number of positive MetS components, and each of the five MetS components separately with the risk for overall, colorectal, prostate and breast cancer. Hazard ratios (HR) and 95% confidence intervals (95%CI) are reported. We assessed the predictive ability of the MetS using Harrell's c-statistic. The MetS was inversely associated with prostate cancer (HR 0.85; 95% CI 0.72-0.99). We found no evidence of an association between the MetS and overall, colorectal or breast cancer. For those with five positive MetS components, the HR was 1.12 (1.02-1.48) and 2.07 (1.26-3.39) for overall and colorectal cancer, respectively, compared with those with zero positive MetS components. Greater waist circumference (WC) (1.38; 1.13-1.70) and elevated blood pressure (1.29; 1.01-1.64) were associated with colorectal cancer. Elevated WC and triglycerides were (inversely) associated with prostate cancer. MetS models were only poor to moderate discriminators for all cancer outcomes.
We show that the MetS is (inversely) associated with prostate cancer, but is not associated with overall, colorectal or breast cancer. However, persons with five positive components of the MetS are at a 1.2- and 2.1-fold increased risk for overall and colorectal cancer, respectively, and these associations appear to be driven largely by elevated WC and blood pressure. We also demonstrate that the MetS is only a moderate discriminator of cancer risk.
closed_qa
Does a Change Over All Equal a Change in All?
This study aimed at testing whether drinking volume and episodic heavy drinking (EHD) frequency in Germany are polarizing between consumption levels over time. Polarization is defined as a reduction in alcohol use among the majority of the population, while a subpopulation with a high intake level maintains or increases its drinking or its EHD frequency. The polarization hypothesis was tested across and within socio-economic subgroups. Analyses were based on seven cross-sectional waves of the Epidemiological Survey of Substance Abuse (ESA) conducted between 1995 and 2012 (n = 7833-9084). Overall polarization was estimated based on regression models with time by consumption level interactions; the three-way interaction with socio-economic status (SES) was consecutively introduced to test the stability of effects over socio-economic strata. Interactions were interpreted by graphical inspection. For both alcohol use indicators, declines over time were largest in the highest consumption level. This was found within all SES groups, but was most pronounced at low and least pronounced at medium SES.
The results indicate no polarization but convergence between consumption levels. Socio-economic status groups differed in the magnitude of convergence, which was lowest in the medium SES group. The overall decline was strongest for the highest consumption level in the low SES group.
closed_qa
Should we call the neurologist?
St Vincent's University Hospital has an established neurology consultation service. Referral volumes have been growing. The Department regularly reviews its service to monitor changes and seek improvements. We sought to determine the impact of the growing service on patient care, on the department itself in delivering the service, and on inpatient admission trends. We reviewed the electronic referral forms of all consults seen over a 9-week period in 2014 (n = 213). We recorded the source of each consult, demographic information, clinical presentation, time from referral to consult, and outcome. We compared the consult list to the inpatient admissions list to determine the proportion admitted from consults. We compared our results to previous reviews by this and other neurology departments in Ireland. Three quarters of neurology consults relate to acute admissions. Patients are all seen within one working day of referral. A significant change in management resulted from the majority of consults (83.6%). Consultants see an average of 4.8 (range 0-10) consults per day, needing up to 7.5 h per day to deliver the service. One-third of the department's inpatients come from consults.
The service significantly benefits patient care. The increasing number of consults will require increased resources and/or service reorganisation to maintain the current level of service.
closed_qa
Are patient-nurse relationships in breast cancer linked to adult attachment style?
The aim of this study was to ascertain if patients with breast cancer who have positive attachment models of 'self' and 'other' perceive higher levels of support from nurses than do patients with negative attachment models. Attachment models of 'self' and 'other' develop in childhood and affect relationships throughout life. People with negative attachment models tend to perceive themselves as unworthy of receiving support and to perceive others as incapable or unwilling to offer support. Attachment processes are activated when individuals feel threatened and seek support from those close to them. Breast cancer may represent such a threat and relationships between patients with breast cancer and nurses may therefore be influenced by patients' attachment models. A between-subjects cross-sectional design was used. Explanatory variables were indicators of patients' attachment models. Response variables were patient ratings of nurse support. Covariates were patient age and patient distress levels. One hundred and fifty-three patients with breast cancer, diagnosed 1-3 years previously, were recruited when attending follow-up oncology appointments over 51 weeks in 2010-2011. Participants completed questionnaires assessing attachment models, distress and perceived support, from the nurse who was available to support them through their cancer. The hypotheses were tested by logistic regression analysis. Patients with more positive models of 'self' perceived more support from nurses.
Patients' perceptions of nurses when being treated for breast cancer are influenced by patients' own models of attachment. Knowledge of this would help nurses further to individualize the emotional support they give patients.
closed_qa
Is bronchial wall imaging affected by temporal resolution?
To evaluate the influence of temporal resolution (TR) on cardiogenic artefacts at the level of bronchial walls. Ninety patients underwent a dual-source, single-energy chest CT examination enabling reconstruction of images with a TR of 75 ms (i.e., optimized TR) (Group 1) and 140 ms (i.e., standard TR) (Group 2). Cardiogenic artefacts were analyzed at the level of eight target bronchi, i.e., right (R) and left (L) B1, B5, B7, and B10 (total number of bronchi examined: n = 720). Cardiogenic artefacts were significantly less frequent and less severe in Group 1 than in Group 2 (p < 0.0001) with the highest scores of discordant ratings for bronchi in close contact with cardiac cavities: RB5 (61/90; 68%); LB5 (66/90; 73%); LB7 (63/90; 70%). In Group 1, 78% (560/720) of bronchi showed no cardiac motion artefacts, whereas 22% of bronchi (160/720) showed artefacts rated as mild (152/160; 95%), moderate (7/160; 4%), and severe (1/160; 1%). In Group 2, 70% of bronchi (503/720) showed artefacts rated as mild (410/503; 82%), moderate (82/503; 16%), and severe (11/503; 2%).
At 75 ms, most bronchi can be depicted without cardiogenic artefacts.
closed_qa
Is there a difference in the maternal and neonatal outcomes between patients discharged after 24 h versus 72 h following cesarean section?
To compare the incidence of postpartum maternal and neonatal complications and hospital readmission in patients discharged 24 versus 72 h after cesarean section. Using randomization, 1495 patients were discharged after 24 h and 1503 patients were discharged after 72 h. All patients fulfilled the discharge criteria. Patients were assessed 6 weeks after delivery, and any maternal or neonatal problems or hospital readmissions during this time interval were reported. There was no difference in maternal hospital readmission between the two groups, but there was a significantly higher neonatal readmission rate in the 24-h group, mainly due to neonatal jaundice. As for the complications reported after 6 weeks, the only two significant outcomes were initiation of breast feeding, which was significantly higher in the 72-h group [OR 0.77 (95% CI 0.66-0.89)], and mood swings, which were significantly lower in the 72-h group [OR 2.28 (95% CI 1.94-2.68)].
Our recommendation is still in favor of late discharge after cesarean delivery, bearing in mind that an early 24-h discharge is feasible, provided that the neonate receives special care, with an early visit to the pediatrician and early establishment of effective lactation.
closed_qa
Does intra-articular fracture change the lubricant content of synovial fluid?
Lubrication function is impaired and the lubricant content of synovial fluid (SF) changes immediately after tibial plateau fractures. Here, we aimed to analyze the lubricant content of SF in the chronic phase following tibial plateau fracture. Forty-eight surgically treated patients without joint incongruency (<2 mm displacement) were included in the study. Joint aspiration had been possible in 16 of the participants. However, sampling could be made from healthy knees in only ten of these patients. Twenty-six SF samples (16 injured knees, 10 healthy knees) were analyzed for concentrations of hyaluronic acid (HA), proteoglycan-4 (PRG4), TNF-α, IL-1β, and IL-6. The experimental samples were obtained at a mean of 31 (12-66) months after injury from patients with a mean age of 45.1 (32-57) years. There were no relationships detected between biochemical analysis results and patient ages, sexes, postoperative time, and fracture type. After excluding six patients for whom we could not sample from their healthy knee, ten patients' values were compared with the paired Wilcoxon signed-rank test, and no significant differences were detected between the healthy and injured knee in terms of the SF concentrations of HA and PRG4 (p = 0.225 and 0.893, respectively). Similarly, there were no statistically significant differences in SF sample concentrations of TNF-α, IL-1β, and IL-6 between healthy and injured knees.
Despite acute changes, the long-term concentrations of HA and PRG4 were similar after tibial plateau fracture. We could not detect any differences in HA and PRG4 concentrations between healthy knees and injured knees at long-term follow-up.
closed_qa
Can emotional stress trigger the onset of epilepsy?
The aim of this study was to investigate the potential role of an acute adverse stress as "trigger" for the onset of epilepsy. Among 4618 consecutive patients, twenty-two reported a major life event within three months before the onset of epilepsy. All patients had focal epilepsy except one with idiopathic generalized epilepsy. The temporal lobe was involved in 90% of patients with focal epilepsy. More precisely, 13 patients (62% of patients with focal epilepsy) had medial temporal lobe epilepsy (MTLE), two had lateral temporal lobe epilepsy, four had temporoparietooccipital junction epilepsy, and two patients had central lobe epilepsy. The mean age and the median age at onset of epilepsy for patients with MTLE were both 38 years (range: 9.5-65 years). Ten patients had right and three had left MTLE. Among patients with focal epilepsy, MRI was abnormal in 7 (33%) with hippocampal sclerosis in four, periventricular nodular heterotopia in two, and complex cortical dysgenesis in one. The mean age at onset of epilepsy for patients with brain lesions was 26 years (range: 9.5-49). Twelve patients (54%) reported a death as a triggering factor for the onset of their epilepsy. Seven patients (32%) reported that a relationship of trust had been broken. Three patients (14%) had been subjects of violence. No patient reported sexual abuse as a triggering factor.
This study provides evidence that some patients (5/1000 patients) began their seizures in the wake of significant life events. The average age at onset of epilepsy is quite late, around age 30, even in the presence of brain lesions. These patients are emotionally and affectively more prone to suffer the consequences of a stressful life event. The recognition and management of such situations may bring significant relief, with improvement in the control of epilepsy.
closed_qa
Is end-stage lateral osteoarthritic knee always valgus?
We hypothesized that not all persons with end-stage lateral osteoarthritis (OA) have valgus malalignment and that full extension radiographs may underreport radiographic disease severity. The purpose of this study was to examine the demographic and radiographic features of end-stage lateral compartment knee OA. We retrospectively studied 133 knees in 113 patients who had undergone total knee arthroplasty between June 2008 and August 2010. All patients had predominantly lateral idiopathic compartment OA according to the compartment-specific Kellgren-Lawrence grade (KLG). The mechanical axis angle (MAA), compartment-specific KLG and joint space narrowing (JSN) of the tibiofemoral joint at extension and 30° of knee flexion, tibia vara angle, tibial slope angle, body mass index, age, and sex were surveyed. End-stage lateral compartment knee OA has varus (37.6 %), neutral (22.6 %), and valgus (39.8 %) MAA on both-leg standing hip-knee-ankle radiographs. KLGs at 30° of knee flexion (fKLG) were grades 3 and 4 in all patients. However, for KLGs at full extension (eKLG), 54 % of all patients had grades 3 and 4. The others (46 %) showed grades 1 and 2. We observed significant differences in lateral compartment eKLG/eJSN (2.3/2.3 mm in varus, 2.5/1.9 mm in neutral, 2.9/1.6 mm in valgus, p = 0.01 and 0.03, respectively), tibia vara angle (4.9° in varus, 4.1° in neutral, 3.0° in valgus, p<0.01), and medial compartment eKLG/eJSN (2.1/3.1 mm in varus, 2.0/3.4 mm in neutral, 1.8/4.3 mm in valgus, p<0.01 and 0.01, respectively) between MAA groups, except for the tibial slope angle (9.7° in varus, 10.1° in neutral, 9.8° in valgus, p = 0.31).
Varus alignment was paradoxically shown in approximately one-third of those with end-stage lateral knee OA on both-leg standing hip-knee-ankle radiographs. Films taken in full extension underreported the degree of OA radiographic severity.
closed_qa
Correlation between pedicle size and the rate of pedicle screw misplacement in the treatment of thoracic fractures: Can we predict how difficult the task will be?
A retrospective study. To correlate the incidence of pedicle-screw (PS) misplacement with the dimensions of the pedicles in the treatment of thoracic spine fractures. The technical challenge of internal fixation with PS in the thoracic spine has been well documented in the literature. However, there are no publications that document the correlation between the pedicle dimensions of the thoracic vertebrae on preoperative computed tomography (CT) scans and the rate of PS misplacement. All patients who had PSs inserted between the T1 and T12 vertebrae during a 24-month period were included in this study. PS position was assessed on high-quality CT scans by two independent observers and classified into 2 categories: correct or misplaced. The transverse diameter, craniocaudal diameter and cross-sectional area of the pedicles from T1 to T12 were measured on the pre-operative CT. During the period of this study, 36 patients underwent internal fixation with 218 PSs. Of the 218 screws, 184 (84.5%) were correct and 34 (15.5%) were misplaced. Misplacement rate was 33% for pedicles with a transverse diameter less than 5 mm, 10.7% for those with a transverse diameter between 5 and 7 mm and 0% for those with a transverse diameter larger than 7 mm. There was a statistically significant difference in the rate of PS misplacement in pedicles with a transverse diameter smaller than 5 mm compared with the others, as well as between those with a transverse diameter between 5.1 and 7 mm and those bigger than 7 mm in diameter. The rate of PS misplacement was higher between T3 and T9 (p<0.05), which in turn correlated with pedicle transverse diameter.
The rate of PS misplacement in the mid thoracic spine (T4-T9) is high and correlates with pedicle transverse diameter.
closed_qa
Loop versus end colostomy reversal: has anything changed?
Though primary repair of colon injuries is preferred, certain injury patterns require colostomy creation. Colostomy reversal is associated with significant morbidity and healthcare cost. Complication rates may be influenced by technique of diversion (loop vs. end colostomy), though this remains ill-defined. We hypothesized that reversal of loop colostomies is associated with fewer complications than end colostomies. This is a retrospective, multi-institutional study (four level-1 trauma centers) of patients undergoing colostomy takedown for trauma during the time period 1/2006-12/2012. Data were collected from the index trauma admission and the subsequent admission for reversal and included demographics and complications of reversal. Student's t test was used to compare continuous variables between the loop and end colostomy groups. Discrete variables were compared between the two groups using Chi-squared tests. Over the 6-year study period, 218 patients underwent colostomy takedown after trauma with a mean age of 30; 190 (87%) were male, 162 (74%) had penetrating injury as their indication for colostomy, and 98 (45%) experienced at least one complication. Patients in the end colostomy group (n = 160) were more likely to require midline laparotomy (145 vs. 18, p<0.001), had greater intra-operative blood loss (260.7 vs. 99.4 mL, p<0.001), had greater hospital length of stay (8.4 vs. 5.5 days, p<0.001), and had more overall complications (81 vs. 17, p = 0.005) than patients managed with loop colostomy (n = 58).
Local takedown of a loop colostomy is safe and leads to shorter hospital stays, less intra-operative blood loss, and fewer complications when compared to end colostomy.
closed_qa
Does lesser trochanter implication affect hip flexion strength in proximal femur fracture?
In pertrochanteric and intertrochanteric femoral fractures, the avulsion of the lesser trochanter by the pull of the iliopsoas muscle is not uncommon. This fragment is not commonly fixed because the avulsion of the lesser trochanter is thought not to influence the clinical outcome, but to date there is no evidence to support this statement. The aim of this study is to evaluate if lesser trochanter implication affects psoas muscle strength in proximal femur fracture. Patients with a consolidated intertrochanteric or pertrochanteric fracture with or without an associated lesser trochanter fracture were enrolled in group A and group B, respectively. Criteria of inclusion were the achievement of an anatomic reduction with a gamma nail and complete consolidation of the fracture. Criteria of exclusion were a follow-up shorter than 6 months and age over 65 years at surgery. Patients were retrospectively reviewed for the purpose of this study. Range of motion, modified Harris Hip Score (mHHS), and flexion strength with the hip in neutral position, at 90° of flexion, and in the "figure four" position were evaluated on the injured and healthy sides. On the pre-operative X-rays, the vertical displacement of the lesser trochanter was calculated. Groups A and B showed no significant difference in age and follow-up. No statistical difference between the two groups was found in range of motion, mean mHHS, or hip flexion strength at 90° of hip flexion. The lesser trochanter fracture group showed significantly reduced strength in flexion with the hip in neutral position (mean difference between the two groups: 18.5 kgf). Lesser trochanter displacement showed a significant correlation with strength at 90° of flexion.
Our results showed that lesser trochanter implication may result in decreased hip flexion strength. Lesser trochanter displacement is directly correlated with flexion strength. Further studies will be necessary to understand if lesser trochanter fixation may be a good solution for those patients.
closed_qa
Is augmentation plating an effective treatment for non-union of femoral shaft fractures with nail in situ?
There are few reports of non-union femur shaft fractures treated with plate fixation with the nail in situ. This study reports our results in 40 cases. This was a retrospective series of non-union and delayed union of femoral shaft fractures after intramedullary nailing, treated with plate fixation. Patients were serially followed up until 12 months. Fracture union, time to union, knee range of motion, deformity, shortening and complications were recorded. Descriptive statistics were performed as applicable. There were 40 patients with a mean age of 35 years (18-65). There were 14 cases of hypertrophic non-union, 24 cases of atrophic non-union and 2 cases of delayed union. The average time of surgery was 1 ½ h and the average blood loss was 300 ml. Exchange nailing was done in 9 cases. Union was achieved in 39 patients. The mean time to fracture union was 4 months. Post-operative knee range of motion was >120° in 35 patients. One patient developed a deep infection, which was treated with removal of implants and exchange nailing with a vancomycin-coated nail, and union was achieved.
Plating is an effective treatment for non-union of diaphyseal femur fractures after intramedullary fixation with the nail in situ.
closed_qa
Does lumbar spinal stenosis increase the risk of spondylotic cervical spinal cord compression?
The aim of this prospective cross-sectional observational comparative study was to determine the prevalence of spondylotic cervical cord compression (SCCC) and symptomatic cervical spondylotic myelopathy (CSM) in patients with symptomatic lumbar spinal stenosis (LSS) in comparison with a general population sample and to seek to identify predictors for the development of CSM. A group of 78 patients with LSS (48 men, median age 66 years) was compared with a randomly selected age- and sex-matched group of 78 volunteers (38 men, median age 66 years). We evaluated magnetic resonance imaging findings from the cervical spine and neurological examination. The presence of SCCC was demonstrated in 84.6% of patients with LSS, but also in 57.7% of a sample of volunteers randomly recruited from the general population. Clinically symptomatic CSM was found in 16.7% of LSS patients in comparison with 1.3% of volunteers (p = 0.001). Multivariable logistic regression proposed the Oswestry Disability Index of 43% or more as the only independent predictor of symptomatic CSM in LSS patients (OR 9.41, p = 0.008).
The presence of symptomatic LSS increases the risk of SCCC; the prevalence of SCCC is higher in patients with symptomatic LSS in comparison with the general population, with an evident predominance of more serious types of MRI-detected compression and a clinically symptomatic form (CSM). Symptomatic CSM is more likely in LSS patients with higher disability as assessed by the Oswestry Disability Index.
closed_qa
Non-operative management of blunt hepatic trauma: Does angioembolization have a major impact?
A paradigm shift toward non-operative management (NOM) of blunt hepatic trauma has occurred. With advances in percutaneous interventions, even severe liver injuries are being managed non-operatively. However, although overall mortality is decreased with NOM, liver-related morbidity remains high. This study was undertaken to explore the morbidity and mortality of blunt hepatic trauma in the era of angioembolization (AE). A retrospective cohort of trauma patients with blunt hepatic injury who were assessed at our centre between 1999 and 2011 was identified. Logistic regression was undertaken to identify factors increasing the likelihood of operative management (OM) and mortality. We identified 396 patients with a mean ISS of 33 (± 14). Sixty-two (18%) patients had severe liver injuries (≥ AAST grade IV). OM occurred in 109 (27%) patients. Logistic regression revealed high ISS (OR 1.07; 95% CI 1.05-1.10) and lower systolic blood pressure on arrival (OR 0.98; 95% CI 0.97-0.99) to be associated with OM. The overall mortality was 17%. Older patients (OR 1.05; 95% CI 1.03-1.07), those with high ISS (OR 1.11; 95% CI 1.08-1.14) and those requiring OM (OR 2.89; 95% CI 1.47-5.69) were more likely to die. Liver-related morbidities occurred with equal frequency in the OM (23%) and AE (29%) groups (p = 0.32).
The majority of patients with blunt hepatic trauma can be successfully managed non-operatively. Morbidity associated with NOM was low. Patients requiring AE had morbidity similar to OM.
closed_qa
Do individuals with and without depression value depression differently?
Health state valuations, used to evaluate the effectiveness of healthcare interventions, can be obtained either by the patients or by the general population. The general population seems to value somatic conditions more negatively than patients, but little is known about valuations of psychological conditions. This study examined whether individuals with and without depression differ in their valuations of depression and whether perceptions regarding depression (empathy, perceived susceptibility, stigma, illness perceptions) and individual characteristics (mastery, self-compassion, dysfunctional attitudes) bias valuations of either individuals with or without depression. In an online study, a general population sample used a time-trade-off task to value 30 vignettes describing depression states (four per participant) and completed questionnaires on perceptions regarding depression and individual characteristics. Participants were assigned to depression groups (with or without depression), based on the PHQ-9. A generalized linear mixed model was used to assess discrepancies in valuations and identify their determinants. The sample (N = 1268) was representative of the Dutch population on age, gender, education and residence. We found that for mild depression states, individuals with depression (N = 200) valued depression more negatively than individuals without depression (N = 1068) (p = .007). Variables related to perceptions of depression and individual characteristics were not found to affect valuations of either individuals with or individuals without depression.
Since the general population values depression less negatively, using their perspective might result in lower estimated effectiveness of interventions for mild depression. Perceptions of depression and individual characteristics did not seem to differentially affect valuations made by individuals with or without depression.
closed_qa
Do trauma center levels matter in older isolated hip fracture patients?
Younger, multi-trauma patients have improved survival when treated at a trauma center. Many regions now propose that older patients be triaged to higher-level trauma centers (HLTCs; level I or II) rather than lower-level trauma centers (LLTCs; level III or nondesignated trauma centers), even for isolated injury, despite the absence of an established benefit in this elderly cohort. We therefore sought to determine if older isolated hip fracture patients have improved survival outcomes based on trauma center level. A retrospective cohort of 1.07 million patients in The Nationwide Emergency Department Sample from 2006-2010 was used to identify 239,288 isolated hip fracture patients aged ≥65 y. Multivariable logistic regression was performed controlling for patient- and hospital-level variables. The main outcome measures were in-hospital mortality and discharge disposition. Unadjusted logistic regression analyses revealed 8% higher odds of mortality (odds ratio [OR], 1.08; 95% confidence interval [CI], 1.00-1.16) and 10% lower odds of being discharged home (OR, 0.90; 95% CI, 0.80-1.00) among patients admitted to an HLTC versus LLTC. After controlling for patient- and hospital-level factors, neither the odds of mortality (OR, 1.06; 95% CI, 0.97-1.15) nor the odds of discharge to home (OR, 0.98; 95% CI, 0.85-1.12) differed significantly between patients treated at an HLTC versus LLTC.
Among patients with isolated hip fractures admitted to HLTCs, mortality and discharge disposition do not differ from similar patients admitted to LLTCs. These findings have important implications for trauma systems and triage protocols.
closed_qa
Impact of oligodendroglial component in glioblastoma (GBM-O): Is the outcome more favourable than in glioblastoma?
Prognosis of patients with glioblastoma with oligodendroglial component (GBM-O) is not well defined. We report our experience with patients with GBM-O treated at our center. Between January 2007 and August 2013, out of 817 consecutive patients with glioblastoma (GBM), 74 patients with GBM-O were identified in our prospectively maintained database. An experienced neuropathologist re-evaluated the histopathology of all these 74 patients and the diagnosis of GBM-O was eventually confirmed in 57 patients. Patients were uniformly treated with maximal safe resection followed by focal radiotherapy with concurrent and adjuvant temozolomide (TMZ). At a median follow-up of 16 months, the median overall survival (OS) and progression-free survival (PFS) of the entire cohort were 23 months and 13 months respectively. Near-total excision was performed in 30/57 (52.6%). On univariate analysis, age <50 years was a significant favourable prognostic factor for OS (p = 0.009) and PFS (p = 0.017), patients with near-total resection had a significantly better PFS (p = 0.017), and patients who completed a minimum of 6 cycles of adjuvant TMZ had significantly better OS (p = 0.000) and PFS (p = 0.003). On multivariate analysis, none of the above factors remained significant except completion of a minimum of 6 cycles of TMZ (OS: p = 0.000; PFS: p = 0.015). A comparative analysis of GBM-O patients with a similarly treated cohort of 105 GBM patients during the same period revealed significantly better median OS in favour of GBM-O (p = 0.01).
Our experience suggests that patients with GBM-O have a more favourable clinical outcome than patients with GBM.
closed_qa
Is laparoscopic treatment of incisional and recurrent hernias associated with an increased risk for complications?
Hernias of the ventral abdominal wall can be treated with an intraperitoneal onlay mesh (IPOM). The aim of this cohort study was to analyze the complications and recurrence rates after laparoscopic ventral hernia repair, focusing especially on incisional and recurrent hernias. The study population comprised 149 patients with a hernia of the abdominal wall treated with an IPOM between January 2006 and January 2011. Fifty-one patients had a primary hernia (group I) and 98 patients had preceding abdominal surgery (group II). In group II, 64 patients had an incisional hernia and 34 patients had a recurrent hernia. The median body mass index was 30.3 kg/m(2) (range 14.8-69.1), with no significant difference between subgroups. The mean duration of surgery and the length of stay were significantly longer in group II (p < 0.05). The overall rate of minor complications was 18.1%, with significantly more minor complications in group II (7.8% vs. 23.5%, p = 0.02). Notably, there were also significantly more major complications in group II (group I: 2.0% vs. group II: 14.3%; p = 0.02). The recurrence rate was significantly higher in group II (group I: 3.9% vs. group II: 16.3%, p < 0.05). There were no early recurrences in group I, but 5 early recurrences in group II.
Laparoscopic treatment of complex hernias, such as incisional hernias, recurrent hernias and hernias with interenteric and enteroperitoneal adhesions, is associated with high rates of minor and major complications. A high level of expertise on the part of the surgeon and the camera-guiding assistant is therefore needed.
closed_qa
Is the Intrahepatic Sound Speed an Indicator of the Fat Content of the Liver in Children?
Can the sonographically estimated intrahepatic sound speed provide an indication of the hepatic fat content in children? In 75 children and adolescents, the intrahepatic sound speed was estimated during routine sonography of the liver using an image reconstruction algorithm. It was correlated with the children's weight, age, height, and body mass index (BMI). The average hepatic sound speed was 1566 m/s (standard deviation [SD] 29 m/s) in children of normal weight, 1501 m/s (SD 22 m/s) in overweight children, and 1497 m/s (SD 24 m/s) in obese children. The strongest correlation was found between sound speed and BMI. The difference in sound speed between normal-weight subjects and overweight/obese children was statistically significant.
There is a good correlation between the estimated intrahepatic sound speed and the fat content of the liver. Further examinations are encouraged.
closed_qa
The nutritional status of hospitalized children: Has this subject been overlooked?
To determine the nutritional status of hospitalized children at the time of admission and to investigate the relationship between diagnosis and nutritional status. Body weight, height, triceps skinfold thickness, and mid-arm circumference were measured on admission and percentages of weight-for-age, weight-for-height, body mass index, mid-arm circumference, and triceps skinfold thickness were calculated. The nutritional status was evaluated using the Waterlow, Gomez, and other anthropometric assessments. A total of 511 patients were included in the study with a mean age of 5.8±4.9 years. Malnutrition was determined in 52.7% of patients according to the Waterlow classification. Mild malnutrition was determined in 39%, moderate in 12%, and severe in 1.7%, with the characteristics of acute malnutrition in 23.9%, acute-chronic in 7.3%, and chronic in 21.5%. The highest rate of malnutrition was in the 0-2 years age group (62.3%). According to the Gomez classification, malnutrition rate was determined as 46.8%. The rates of malnutrition in malignant, gastrointestinal, and infectious diseases were 60%, 59.8%, and 54.5%, respectively.
The prevalence of malnutrition in hospitalized children was noticeably high. The nutritional evaluation of all patients and an early start to nutritional support could provide a significant positive contribution.
closed_qa
Evaluating Variations of Bladder Volume Using an Ultrasound Scanner in Rectal Cancer Patients during Chemoradiation: Is Protocol-Based Full Bladder Maintenance Using a Bladder Scanner Useful to Maintain the Bladder Volume?
Maintenance of a full bladder is important to reduce radiation-induced toxicities and maintain therapeutic consistency in locally advanced rectal cancer patients undergoing radiotherapy (RT). The aim of this study was therefore to evaluate the effectiveness of protocol-based full bladder maintenance by assessing bladder volume variation with an ultrasound bladder scanner. From March 2011 to May 2011, twenty consecutive rectal cancer patients receiving external beam RT participated in this prospective study. Protocol-based full bladder maintenance consisted of education, training and continuous biofeedback by measuring bladder volume. Bladder volume was measured by bladder scan immediately before the simulation CT scan and before each treatment, three times weekly during the RT period. The relative bladder volume change was calculated. Intra-patient bladder volume variations were quantified using the interquartile range (IQR) of the relative bladder volume change in each patient. We compared the intra-patient bladder volume variations obtained (n=20) with data from our previous study patients (n=20) who performed self-controlled maintenance without a protocol. Bladder volumes measured by bladder scan correlated highly with those on simulation CT scan (R=0.87, p<0.001). Patients in this study showed a lower median IQR of relative bladder volume change than the self-controlled maintenance patients from our previous study, although the difference was not statistically significant (median 32.56% vs. 42.19%, p=0.058). On logistic regression, the IQR of relative bladder volume change was significantly related to protocol-based maintenance [relative risk 1.045, 95% confidence interval (CI) 1.004-1.087, p=0.033]. Protocol-based maintenance included significantly more patients with an IQR of relative bladder volume change of less than 37% than self-controlled maintenance did (p=0.025).
Our findings show that bladder volume could be maintained more consistently during RT by protocol-based management using a bladder scan.
closed_qa
'Are We Not Human?
The advent of anti-retroviral therapy (ART) in Southern Africa holds the promise of shifting the experience of HIV toward that of a manageable chronic condition. However, this potential can only be realized when persons living with HIV are able to access services without barriers, which can include stigma. Our qualitative study explored experiences of persons living with disabilities (PWD) in Lusaka, Zambia who became HIV-positive (PWD/HIV+). We conducted interviews with 32 participants (21 PWD/HIV+ and 11 key informants working in the fields of HIV and/or disability). Inductive thematic analysis of interview transcripts was informed by narrative theory. Participants' accounts highlighted the central role of stigma experienced by PWD/HIV+, with stigmatizing attitudes closely linked to prevailing societal assumptions that PWD are asexual. Seeking diagnostic and treatment services for HIV was perceived as evidence of PWD being sexually active. Participants recounted that for PWD/HIV+, stigma was enacted in a variety of settings, including the queue for health services, their interactions with healthcare providers, and within their communities. Stigmatizing accounts told about PWD/HIV+ were described as having important consequences. Not only did participants recount stories of internalized stigma (with its damaging effects on self-perception), but also that negative experiences resulted in some PWD preferring to "die quietly at home" rather than being subjected to the stigmatizing gaze of others when attempting to access life-preserving ART. Participants recounted how experiences of stigma also affected their willingness to continue ART, their willingness to disclose their HIV status to others, as well as their social relations. However, participants also offered counter-stories, actively resisting stigmatizing accounts and portraying themselves as resilient and resourceful social actors.
The study highlights a significant barrier to healthcare experienced by PWD/HIV+, with important implications for the future design and equitable delivery of HIV services in Zambia. Stigma importantly affects the abilities of PWD/HIV+ to manage their health conditions.
closed_qa
Correlation of quantitative dual-energy computed tomography iodine maps and abdominal computed tomography perfusion measurements: are single-acquisition dual-energy computed tomography iodine maps more than a reduced-dose surrogate of conventional computed tomography perfusion?
Study objectives were the quantitative evaluation of whether conventional abdominal computed tomography (CT) perfusion measurements mathematically correlate with quantitative single-acquisition dual-energy CT (DECT) iodine concentration maps, the determination of the optimum time of acquisition for achieving maximum correlation, and the estimation of the potential for radiation exposure reduction when replacing conventional CT perfusion by single-acquisition DECT iodine concentration maps. Dual-energy CT perfusion sequences were dynamically acquired over 51 seconds (34 acquisitions every 1.5 seconds) in 24 patients with histologically verified pancreatic carcinoma using dual-source DECT at tube potentials of 80 kVp and 140 kVp. Using software developed in-house, perfusion maps were calculated from 80-kVp image series using the maximum slope model after deformable motion correction. In addition, quantitative iodine maps were calculated for each of the 34 DECT acquisitions per patient. Within a manual segmentation of the pancreas, voxel-by-voxel correlation between the perfusion map and each of the iodine maps was calculated for each patient to determine the optimum time of acquisition t_opt, defined as the acquisition time of the iodine map with the highest correlation coefficient. Subsequently, regions of interest were placed inside the tumor and inside healthy pancreatic tissue, and correlation between mean perfusion values and mean iodine concentrations within these regions of interest at t_opt was calculated for the patient sample. The mean (SD) t_opt was 31.7 (5.4) seconds after the start of contrast agent injection. The mean (SD) perfusion values for healthy pancreatic and tumor tissues were 67.8 (26.7) mL per 100 mL/min and 43.7 (32.2) mL per 100 mL/min, respectively. At t_opt, the mean (SD) iodine concentrations were 2.07 (0.71) mg/mL in healthy pancreatic and 1.69 (0.98) mg/mL in tumor tissue, respectively. Overall, the correlation between perfusion values and iodine concentrations was high (0.77), with correlation of 0.89 in tumor and of 0.56 in healthy pancreatic tissue at t_opt. Comparing radiation exposure associated with a single DECT acquisition at t_opt (0.18 mSv) to that of an 80 kVp CT perfusion sequence (2.96 mSv) indicates that an average reduction of the effective dose D_eff by 94% could be achieved by replacing conventional CT perfusion with a single-acquisition DECT iodine concentration map.
Quantitative iodine concentration maps obtained with DECT correlate well with conventional abdominal CT perfusion measurements, suggesting that quantitative iodine maps calculated from a single DECT acquisition at an organ-specific and patient-specific optimum time of acquisition might be able to replace conventional abdominal CT perfusion measurements if the time of acquisition is carefully calibrated. This could lead to large reductions of radiation exposure to the patients while offering quantitative perfusion data for diagnosis.
closed_qa
Will the introduction of non-invasive prenatal testing for Down's syndrome undermine informed choice?
To investigate whether the introduction of non-invasive pre-natal testing for Down's syndrome (DS) has the potential to undermine informed choice. Three hundred and ninety-three health professionals; 523 pregnant women. A cross-sectional questionnaire study across nine maternity units and three conferences in the UK designed to assess opinions regarding test delivery and how information should be communicated to women when offered Down's syndrome screening (DSS) or diagnosis using invasive (IDT) or non-invasive testing (NIPT). Both pregnant women and health professionals in the NIPT and DSS groups were less likely than the IDT group to consider that testing should take place at a return visit or that obtaining written consent was necessary, and more likely to think testing should be carried out routinely. Compared to health professionals, pregnant women expressed a stronger preference for testing to occur on the same day as pre-test counselling (P = 0.000) and for invasive testing to be offered routinely (P = 0.000). They were also more likely to indicate written consent as necessary for DSS (P = 0.000) and NIPT (P<0.05).
Health professionals and pregnant women view the consenting process differently across antenatal test types. These differences suggest that informed choice may be undermined with the introduction of NIPT for DS into clinical practice. To maintain high standards of care, effective professional training programmes and practice guidelines are needed which prioritize informed consent and take into account the views and needs of service users.
closed_qa
Public stigma in intellectual disability: do direct versus indirect questions make a difference?
Stigma may negatively impact individuals with intellectual disabilities (ID). However, most studies in the field have been based on the use of direct measurement methods for assessing stigma. This study examined public stigma towards individuals with ID within a representative sample of the Israeli public by comparing direct versus indirect questioning. Vignette methodology was utilised with two questionnaire versions. In the direct questionnaire (n = 306), the participants were asked how they would think, feel and behave if a man with ID asked them a question in a public place. In the indirect questionnaire (n = 301), the participants were asked to report how a hypothetical 'other man' would think, feel and behave in the same situation. Higher levels of stigma were reported among participants that answered the indirect questionnaire version. Furthermore, among those participants that answered the indirect questionnaire version, subjective knowledge of ID was a less important correlate of stigma than for those participants that answered the direct questionnaire.
Several explanations are suggested for the finding that indirect questioning elicits more negative stigmatic attitudes. Among others, indirect questioning may be a more appropriate methodology for eliciting immediate beliefs. Furthermore, the results call for implementing a comprehensive, multi-level programme to change stigma.
closed_qa
Breaking down barriers to communicating complex retinoblastoma information: can graphics be the solution?
To investigate the impact of a graphical timeline summarizing bilateral retinoblastoma disease and treatment outcomes on parents' understanding of complex medical information. Cross-sectional survey. Parents of children with retinoblastoma who were being actively managed at The Hospital for Sick Children were recruited. After a standardized presentation on retinoblastoma and a visual tool named Disease-Specific electronic Patient Illustrated Clinical Timeline (DePICT), parents completed a 19-item questionnaire designed to assess their understanding of treatment choices for 2 eyes in bilateral retinoblastoma as communicated using DePICT. SPSS was used to perform statistical analysis. Forty-five parents from 42 families participated (65% female). Median age of participants was 34 years. Median level of participant education was completion of college/trade school. The median level of annual income was $40,000 to $70,000 CDN. Median time since diagnosis of retinoblastoma in their child was 13.5 months. Twenty-three (51%) participants were parents of children with unilateral retinoblastoma, and 22 (49%) were parents of children with bilateral retinoblastoma. Median number of correct answers was 15 of 19, and mean score was 77%. Scores were normally distributed. English as a first language was significantly associated with score (p = 0.01). No significant association was observed between the other variables and score in any of the analyses.
This study builds on the validation of DePICT by demonstrating that parents can achieve good comprehension even when considering choices for treatment for 2 eyes with bilateral retinoblastoma. Clinical application of this tool can enhance the consent process.
closed_qa
Food intake reported versus nursing records: is there agreement in surgical patients?
To evaluate the agreement between oral intake reported by patients and the corresponding chart records. In addition to food intake surveys of surgical patients, the nursing records of nutrition were evaluated. Good oral feeding was defined as intake ≥ 75% of the total calories prescribed for the day; medium acceptance as 50 to 74.9%; low acceptance as <50%; and NPO as nothing by mouth. The Kappa coefficient was adopted to assess agreement. There were similar answers between patients and nursing records in 91.3% of NPO situations, 87.1% for good oral feeding, 17.8% for medium acceptance and 16.5% for low acceptance (Kappa = 0.45).
Agreement between patients' reports and nursing records was moderate to low. A higher proportion of matching answers was observed when patients reported good oral feeding or NPO.
closed_qa
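As a worked illustration of the Kappa statistic cited above: Cohen's kappa compares the observed agreement with the agreement expected by chance from the raters' marginal distributions. The short Python sketch below computes it for a four-category agreement table; the counts are invented for illustration and are not the study's data.

```python
# Illustrative sketch of Cohen's kappa; the counts are hypothetical, not the study's data.

def cohens_kappa(table):
    """table[i][j] = number of cases rated category i by rater 1 and category j by rater 2."""
    total = sum(sum(row) for row in table)
    n = len(table)
    p_observed = sum(table[i][i] for i in range(n)) / total
    row_marginals = [sum(table[i]) / total for i in range(n)]
    col_marginals = [sum(table[i][j] for i in range(n)) / total for j in range(n)]
    p_expected = sum(row_marginals[k] * col_marginals[k] for k in range(n))
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical patient-report (rows) vs. nursing-record (columns) counts,
# categories: good / medium / low / NPO
table = [
    [40, 5, 3, 1],
    [8, 6, 4, 1],
    [5, 6, 7, 2],
    [1, 1, 1, 20],
]
print(round(cohens_kappa(table), 2))  # ~0.5 for these made-up counts
```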
Does previous chemotherapy-induced nausea and vomiting predict postoperative nausea and vomiting?
Postoperative nausea and vomiting (PONV) remains a problem in the postoperative period. Previous PONV in oncology patients has recently been associated with chemotherapy-induced nausea and vomiting (CINV). We assessed whether CINV could improve Apfel's heuristic for predicting PONV. We conducted a retrospective study of 1500 consecutive patients undergoing intermediate or major cancer surgery between April and July 2011. PONV was assessed on the first postoperative day during post-anaesthesia care. The assigned anaesthetist completed an electronic medical record with all of the studied variables. Multiple logistic regression analyses were performed to assess whether any of the variables could add predictive ability to Apfel's tallying heuristic, and receiver operating characteristic (ROC) curves were modelled. The areas under the curve (AUC) were used to compare the models' ability to discriminate patients who vomited from those who did not. The overall incidence of PONV was 26%. Multiple logistic regression identified two independent predictors of PONV (odds ratio; 95% CI): Apfel's score (1.78; 1.23-2.63) and previous chemotherapy-induced vomiting (3.15; 1.71-5.9); Hosmer-Lemeshow P<0.0001. Previous CINV was the most significant predictor to be added to Apfel's heuristic in this population.
A history of chemotherapy-induced nausea and vomiting was a strong predictor of PONV and should be investigated as an added risk factor for PONV in the preoperative period of oncological surgery in prospective studies.
closed_qa
Is MR-guided High-intensity Focused Ultrasound a Feasible Treatment Modality for Desmoid Tumors?
MR-guided high-intensity focused ultrasound is a noninvasive treatment modality that uses focused ultrasound waves to thermally ablate tumors within the human body while minimizing side effects to surrounding healthy tissues. This technology is FDA-approved for certain tumors and has potential to be a noninvasive treatment option for extremity soft tissue tumors. Development of treatment modalities that achieve tumor control, decrease morbidity, or both might be of great benefit for patients. We wanted to assess the potential use of this technology in the treatment of extremity desmoid tumors. Questions/purposes: (1) Can we use MR-guided high-intensity focused ultrasound to accurately ablate a predetermined target volume within a human cadaver extremity? (2) Does MR-guided high-intensity focused ultrasound treatment stop progression and/or cause regression of extremity desmoid tumors? Simulated tumor volumes in four human cadavers, created by using plastic markers, were ablated using a commercially available focused ultrasound system. Accuracy was determined in accordance with the International Organization for Standardization location error by measuring the farthest distance between the ablated tissue and the plane corresponding to the target. Between 2012 and 2014, we treated nine patients with desmoid tumors using focused ultrasound ablation. Indications were tumor-related symptoms or failure of conventional treatment. Of those, five were available for MRI follow-up at 12 months or longer (mean, 18.2 months; range, 12-23 months). The radiographic and clinical outcomes of the five patients who had desmoid tumors treated with focused ultrasound were prospectively recorded. Patients were assessed preoperatively with MRI and followed at routine intervals after treatment with MRI scans and clinical examination. The ablation accuracy for the four cadaver extremities was 5 mm, 3 mm, 8 mm, and 8 mm. Four patients' tumors became smaller after treatment and one patient had slight progression at the time of last follow-up. The mean decrease in tumor size determined by MRI measurements was 36% (95% confidence interval, 7%-66%). No patient received additional adjuvant systemic or local treatment. Treatment-related adverse events included first- and second-degree skin burns occurring in four patients, which were managed successfully without further surgery.
This preliminary investigation provides some evidence that MR-guided high-intensity focused ultrasound may be a feasible treatment for desmoid tumors. It may also be of use for other soft tissue neoplasms in situations in which there are limited traditional treatment options such as recurrent sarcomas. Further investigation is necessary to better define the indications, efficacy, role, and long-term oncologic outcomes of focused ultrasound treatment.
closed_qa
Do Upper Extremity Trauma Patients Have Different Preferences for Shared Decision-making Than Patients With Nontraumatic Conditions?
Shared decision-making is a combination of expertise, available scientific evidence, and the preferences of the patient and surgeon. Some surgeons contend that patients are less capable of participating in decisions about traumatic conditions than nontraumatic conditions. Questions/purposes: (1) Do patients with nontraumatic conditions have different preferences for shared decision-making when compared with those who sustained acute trauma? (2) Do disability, symptoms of depression, and self-efficacy correlate with preference for shared decision-making? In this prospective, comparative trial, we evaluated a total of 133 patients presenting to the outpatient practices of two university-based hand surgeons with traumatic or nontraumatic hand and upper extremity illnesses or conditions. Each patient completed questionnaires measuring their preferred role in healthcare decision-making (Control Preferences Scale [CPS]), symptoms of depression (Patient Health Questionnaire), and pain self-efficacy (confidence that one can achieve one's goals despite pain; measured using the Pain Self-efficacy Questionnaire). Patients also completed a short version of the Disabilities of the Arm, Shoulder, and Hand questionnaire and an ordinal rating of pain intensity. There was no difference in decision-making preferences between patients with traumatic (CPS: 3 ± 2) and nontraumatic conditions (CPS: 3 ± 1; mean difference = 0.2 [95% confidence interval, -0.4 to 0.7]; p = 0.78), with most patients (95 versus 38) preferring shared decision-making. More educated patients preferred a more active role in decision-making (beta = -0.1, r = 0.08, p = 0.001); however, differences in levels of disability, pain and function, depression, and pain-related self-efficacy were not associated with differences in patients' preferences for shared decision-making.
Patients who sustained trauma have, on average, the same preference for shared decision-making as patients who did not sustain trauma. Given these findings, clinicians might be motivated to share their expertise about the treatment options, potential outcomes, benefits, and harms with the patient, and to discuss the patient's preferences as well, even in a semiacute setting, resulting in a shared decision.
closed_qa
Is traditional Chinese medicine recommended in Western medicine clinical practice guidelines in China?
Evidence-based medicine promotes and relies on the use of evidence in developing clinical practice guidelines (CPGs). The Chinese healthcare system includes both traditional Chinese medicine (TCM) and Western medicine, which are expected to be equally reflected in Chinese CPGs. To evaluate the inclusion of TCM-related information in Western medicine CPGs developed in China and the adoption of high-level evidence. All CPGs were identified from the China Guideline Clearinghouse (CGC), which is the main Chinese organisation maintaining the guidelines issued by the Ministry of Health of China, the Chinese Medical Association and the Chinese Medical Doctors' Association. TCM-related contents were extracted from all the CPGs identified. Extracted information comprised the institution issuing the guideline, date of issue, disease, recommendations relating to TCM, evidence level of the recommended content and references supporting the recommendations. A total of 604 CPGs were identified, only a small number of which (74/604; 12%) recommended TCM therapy, and only five guidelines (7%) had applied evidence grading. The 74 CPGs involved 13 disease systems according to the International Classification of Diseases, 10th edition. TCM was mainly recommended in the treatment part of the guidelines (73/74, 99%), and more than half of the recommendations (43/74, 58%) were related to Chinese herbal medicine (single herbs or herbal treatment based on syndrome differentiation).
Few Chinese Western medicine CPGs recommend TCM therapies and very few provide evidence grading for the TCM recommendation. We suggest that future guideline development should be based on systematic searches for evidence to support CPG recommendations and involve a multidisciplinary approach including TCM expertise.
closed_qa
Do patterns of mental healthcare predict treatment failure in young people with schizophrenia?
Little is known about the practice of predicting the effectiveness of community-based care for patients affected by schizophrenic disorders. We assessed predictors of treatment failure in a large sample of young people affected by schizophrenia. A cohort of 556 patients aged 18-35 years who were originally diagnosed with schizophrenia during 2005-2009 in a Mental Health Service (MHS) of the Italian Lombardy Region was identified. Intensity of mental healthcare received during the first year after the index visit (exposure) was measured by patients' regularity in MHS attendance and the length of time covered with antipsychotic drug therapy. Patients were followed from the index visit until 2012 to identify hospital admission for a mental disorder (outcome). A proportional hazards model was fitted to estimate the HR and 95% CIs for the exposure-outcome association, after adjusting for several covariates. A set of sensitivity analyses was performed in order to account for sources of systematic uncertainty. During follow-up, 144 cohort members experienced the outcome. Compared with patients on low coverage with antipsychotic drugs (≤ 4 months), those on intermediate (5-8 months) and high (≥ 9 months) coverage had HRs (95% CI) of 0.94 (0.64 to 1.40) and 0.69 (0.48 to 0.98), respectively. There was no evidence that regular attendance at the MHS affected the outcome.
Patients in the early phase of schizophrenia and their families should be cautioned about the possible consequences of poor antipsychotic adherence. Physicians and decision makers should increase their contribution towards improving mental healthcare.
closed_qa
Can CPAP be indicated in adult patients with suspected obstructive sleep apnea only on the basis of clinical data?
There is scarce information about whether a diagnosis of OSA supported only by medical record data can be a useful and reliable basis for initiating CPAP treatment. The aim of this study was to develop and assess the accuracy of clinical parameters for the diagnosis and prescription of CPAP in patients with suspected OSA. Adult patients who underwent polysomnography and completed the Berlin questionnaire, a clinical record, and the Epworth sleepiness scale were included in the study. A situation was simulated in which two blinded and independent observers would be able to indicate CPAP treatment if the patients were snorers with frequent apnea reports (≥3-4 times a week) and overweight (BMI > 25 kg/m(2)) plus one of the following: diurnal symptoms (tiredness after sleeping or at waking time ≥3-4 times a week or Epworth sleepiness scale >11), arterial hypertension, cerebrovascular accident, coronary event, type II diabetes or cardiac arrhythmias (observer 1, clinical criteria), or on the basis of the respiratory disturbance index, significant tiredness (≥3-4 times a week) or sleepiness (Epworth >11) and associated comorbidities (observer 2, reference method). The area under the ROC curve (AUC-ROC), sensitivity, specificity, and likelihood ratios were calculated. Among 516 subjects (72 % men), the median age was 52 years, BMI 28.3 kg/m(2), and RDI 19.7 events/h. The AUC-ROC, sensitivity, specificity, and positive likelihood ratio of the clinical parameters were 0.64 to 0.65, 31% to 33%, 97% to 98%, and 11 to 15, respectively. No differences in the diagnostic performance of the clinical criteria were observed between men and women.
These clinical parameters made it possible to indicate CPAP in approximately one third of the population with OSA who would have required it on the basis of their PSG and clinical history. This approach showed high specificity; hence, few patients who did not meet the criteria for CPAP use would have received this treatment.
closed_qa
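The positive likelihood ratio quoted above (11 to 15) follows directly from the reported sensitivity (31-33%) and specificity (97-98%): LR+ = sensitivity / (1 - specificity). A minimal sketch; the pairing of sensitivity and specificity values below is only illustrative, chosen to bracket the reported range rather than taken verbatim from the study.

```python
# Illustrative sketch: likelihood ratios from sensitivity and specificity.
# The pairings below are illustrative, not taken verbatim from the study.

def likelihood_ratios(sensitivity, specificity):
    lr_positive = sensitivity / (1.0 - specificity)  # odds multiplier for a positive result
    lr_negative = (1.0 - sensitivity) / specificity  # odds multiplier for a negative result
    return lr_positive, lr_negative

for sens, spec in [(0.33, 0.97), (0.31, 0.98)]:
    lr_pos, lr_neg = likelihood_ratios(sens, spec)
    print(f"sens={sens:.2f} spec={spec:.2f} -> LR+={lr_pos:.1f} LR-={lr_neg:.2f}")
```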
Does time heal all wounds?
A lack of longitudinal studies has hampered the understanding of the development of posttraumatic stress symptoms (PTSS) in parents of children diagnosed with cancer. This study examines level of PTSS and prevalence of posttraumatic stress disorder (PTSD) from shortly after diagnosis up to 5 years after end of treatment or child's death, in mothers and fathers. A design with seven assessments (T1-T7) was used. T1-T3 were administered during treatment and T4-T7 after end of treatment or child's death. Parents (N = 259 at T1; n = 169 at T7) completed the PTSD Checklist Civilian Version. Latent growth curve modeling was used to analyze the development of PTSS. A consistent decline in PTSS occurred during the first months after diagnosis; thereafter the decline abated, and from 3 months after end of treatment only minimal decline occurred. Five years after end of treatment, 19% of mothers and 8% of fathers of survivors reported partial PTSD. Among bereaved parents, corresponding figures were 20% for mothers and 35% for fathers, 5 years after the child's death.
From 3 months after end of treatment the level of PTSS is stable. Mothers and bereaved parents are at particular risk for PTSD. The results are the first to describe the development of PTSS in parents of children diagnosed with cancer, illustrate that end of treatment is a period of vulnerability, and that a subgroup reports PTSD 5 years after end of treatment or child's death.
closed_qa
Do tall women beget larger babies?
To evaluate the possible relationship between maternal height and fetal size. We used a population-based cohort of apparently healthy mothers of singletons to evaluate quartiles of the maternal height distribution for parity, being overweight or obese, and for gestational age and birth weight parameters. We also generated birth weight by gestational age curves for each quartile. We analyzed data of 198,745 mothers. Mothers from the four quartiles had similar parity, pre-gravid BMI, and gestational age at birth. Short mothers had a significantly higher rate of VLBW, LBW, and 2501-4000 g infants, with ORs of 1.38 (95% CI: 1.17-1.62), 2.2 (95% CI: 2.05-2.37) and 1.82 (95% CI: 1.73-1.87) between the shortest and tallest mothers, respectively. By contrast, the opposite trend was noticed for birth weights >4000 g, with an OR of 2.77 (95% CI: 2.65-2.89) between the tallest and shortest mothers. A very similar "growth curve" was apparent until 33 weeks, when a slower growth velocity was observed for shorter compared with taller women.
Maternal stature does not appear to be associated with gestational age but significantly influences birth weight. Height-related differences in fetal growth appear to begin after 33 weeks' gestation.
closed_qa
Can the observed association between serum perfluoroalkyl substances and delayed menarche be explained on the basis of puberty-related changes in physiology and pharmacokinetics?
An association between serum levels of two perfluoroalkyl substances (PFAS) and delayed age at menarche was reported in a cross-sectional study of adolescents. Because perfluorooctanoic acid (PFOA) and perfluorooctane sulfonate (PFOS) have half-lives of years, growth dilution and the development of a new route of excretion (menstruation) could account for some or all of the reported association. To assess how much of the epidemiologic association between PFAS and delayed menarche can be explained by the correlation of growth and maturation with PFAS body burden. We developed a Monte Carlo (MC) physiologically based pharmacokinetic (PBPK) model of PFAS to simulate plasma PFAS levels in a hypothetical female population aged 2 to 20 years. Realistic distributions of physiological parameters as well as timing of growth spurts and menarche were incorporated in the model. The association between PFAS level and delayed menarche in the simulated data was compared with the reported association. The prevalence of menarche, distributions of age-dependent physiological parameters, and quartiles of serum PFAS concentrations in the simulated subjects were comparable to those reported in the epidemiologic study. The delays of menarche in days per natural-log increase in PFAS concentrations in the simulated data were about one third as large as the observed values.
The reported relationship between PFAS and age at menarche appears to be at least partly explained by pharmacokinetics rather than a toxic effect of these substances.
closed_qa
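The growth-dilution mechanism invoked above can be illustrated with a deliberately simplified one-compartment model: with a half-life of years and a constant intake, a larger (more mature) girl carries roughly the same body burden spread over a larger distribution volume, so her plasma concentration is lower. The parameters below (half-life, intake, volume of distribution, growth curve) are hypothetical, this is not the authors' PBPK model, and the menstruation excretion route is ignored.

```python
# Deliberately simplified sketch of growth dilution for a substance with a half-life of years.
# All parameters are hypothetical; this is not the study's Monte Carlo PBPK model.
import math

HALF_LIFE_YEARS = 3.5                 # assumed elimination half-life
K = math.log(2) / HALF_LIFE_YEARS     # first-order elimination rate constant (1/year)
DAILY_INTAKE_NG = 1.0                 # assumed constant intake (ng/day)
VD_L_PER_KG = 0.2                     # assumed volume of distribution (L/kg)

def weight_kg(age_years):
    # toy growth curve, not a real reference curve
    return 12.0 + 3.5 * age_years

def plasma_conc_ng_per_ml(age_years):
    # accumulate yearly intake with first-order loss, then divide by the current volume
    amount_ng = 0.0
    for _ in range(age_years):
        amount_ng = amount_ng * math.exp(-K) + DAILY_INTAKE_NG * 365.0
    volume_ml = weight_kg(age_years) * VD_L_PER_KG * 1000.0
    return amount_ng / volume_ml

for age in (10, 12, 14, 16):
    # concentration falls with age here purely because the body grows, not because intake changes
    print(age, round(plasma_conc_ng_per_ml(age), 3))
```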
Comparative study of 2-D and bichanneled 3-D laparoscopic images: Is there a difference?
Lack of depth perception and spatial orientation are drawbacks of laparoscopic surgery. The advent of the 3-D camera system enables surgeons to regain binocular vision. The aim of this study was to gain subjective and objective data to determine whether 3-D systems are superior to 2-D systems. Our study consisted of two parts: a laparoscopic training model and an actual operation assessment. In the first part, we compared two groups of surgeons (specialists and trainees) performing a laparoscopic task using a 2-D and a 3-D camera system. In the second part, surgeons were assessed on their performance of standard laparoscopic cholecystectomies using the two different camera systems. At the end of each assessment, participants were required to complete a questionnaire on their impressions of the comparative ease of operative tasks under 2-D and 3-D vision. In the laboratory training model, trainees' performance time was shorter with the 3-D camera system than with the 2-D camera, but no difference was observed in the specialists group. In the surgical (cholecystectomy) assessment, no significant difference was observed between the 2-D and 3-D camera systems in terms of operative time and precision. The questionnaires indicated that participants, as a whole, did not significantly favor the 3-D system.
We believe that the 3-D camera system can allow young surgeons to perform standard laparoscopic tasks safely and quickly, so as to accelerate the learning curve. However, new-generation 3-D systems will be essential to overcome surgeons' discomfort.
closed_qa
Nonvisualization of the Fetal Gallbladder: Can Levels of Gamma-Glutamyl Transpeptidase in Amniotic Fluid Predict Fetal Prognosis?
In cases of nonvisualization of the fetal gallbladder (NVFGB), we investigated whether amniotic fluid levels of gamma-glutamyl transpeptidase (GGTP) can distinguish normal development or benign gallbladder agenesis from a severe anomaly such as biliary atresia. This is a retrospective cohort study of pregnancies in which the gallbladder was not visualized on the second-trimester fetal anatomy scan. Levels of GGTP in amniotic fluid were analyzed prior to 22 weeks of gestation by amniocentesis. Data were collected regarding other fetal malformations, fetal karyotype, and screening results for cystic fibrosis transmembrane conductance regulator (CFTR) gene mutations. Of 32 cases of NVFGB, 27 (84%) had normal GGTP levels and a normal CFTR gene screening, and 1 of them had an abnormal karyotype. Three of the 5 cases with low GGTP were diagnosed with extrahepatic biliary atresia, proven by histopathological examination following termination of pregnancy. The fourth case had a hepatic vasculature abnormality and the fifth had isolated gallbladder agenesis. In 22 of 32 cases (68.7%), the gallbladder was detected either later in pregnancy or after delivery.
The findings support low levels of GGTP in amniotic fluid, combined with NVFGB, as a sign of severe disease, mainly biliary atresia. Normal GGTP levels, concomitant with isolated NVFGB, carry a good prognosis.
closed_qa
Short Children with CHARGE Syndrome: Do They Benefit from Growth Hormone Therapy?
We identified 51 children (28 boys and 23 girls) in KIGS (Pfizer International Growth Database). The median chronological age was 7.6 years at the start of GH therapy and 13.2 years at the latest visit. Evaluation for GH deficiency (n = 33) was based on the following: peak GH level 7.3 μg/l and IGF-I level -2.01 standard deviation score (SDS). Sixteen subjects (9 boys) were followed longitudinally for 2 years. Birth length (median SDS, -0.47) and weight (-0.97) were slightly reduced. At the start of GH therapy, height was -3.6 SDS, BMI -0.7 SDS, and the GH dose was 0.26 mg/kg/week. At the latest visit after 2.7 years of GH therapy, height had increased to -2.2 SDS and BMI to -0.5 SDS. In the longitudinal group, height increased from -3.72 SDS at the start of GH therapy to -2.92 SDS after 1 year to -2.37 SDS after 2 years of therapy (start - 2 years: p<0.05), height velocity increased from -1.69 to 2.98 to 0.95 SDS, and BMI and GH dose (mg/kg/week) remained almost unchanged.
Our data show a positive effect of conventional doses of GH on short-term growth velocity for the longitudinal as well as for the total group, without any safety issues.
closed_qa
Intestinal Endometriosis: Mimicker of Inflammatory Bowel Disease?
Endometriosis of the intestinal tract (IE) is thought to mimic inflammatory bowel disease (IBD) both clinically and pathologically, but robust data on a large unselected series are missing. Diagnostic problems arise both at colonoscopy and on resection specimens for IE when IBD-like features are encountered. The aim was to establish the frequency of IBD-like histology in IE and which types of histological lesions are shared by these two entities. One hundred consecutive, unselected cases of surgically resected IE were collected, and clinical features and histopathology were reviewed and re-evaluated. Seventy-five surgical specimens showed no histological alterations except for endometriosis foci. Twenty-two cases showed focal architectural alterations in the absence of significant inflammation. Three cases showed marked inflammatory and architectural mucosal changes, making a differential diagnosis with IBD particularly challenging. On follow-up, however, these patients remained symptom-free and with no need for anti-inflammatory therapy after surgical resection of the IE.
Diagnostic problems may arise in women who have IBD-like symptoms and histology at colonoscopy but who lack a known diagnosis of endometriosis. Clinicians must be aware that the diagnosis of IBD in patients with IE should be reevaluated over time.
closed_qa
Comparison of angiogenic and proliferative effects of three commonly used agents for pulmonary artery hypertension (sildenafil, iloprost, bosentan): is angiogenesis always beneficial?
Pulmonary artery hypertension (PAH) is a devastating disease with very serious outcomes. Dysregulated angiogenesis is one of the main contributing processes in the pathophysiology of the disease. Our experimental study aimed to determine and compare the angiogenic effects of sildenafil, iloprost, and bosentan, three medications used in the treatment of PAH. This study was performed in the Department of Biochemistry and the Cancer and Stem Cell Research Laboratory of our institutions between August and October 2014. The angiogenic activity of sildenafil, iloprost, and bosentan was examined in vivo in the chick chorioallantoic membrane (CAM) model and in an in vitro tube formation assay of human umbilical vein endothelial cells (HUVECs). The proliferative activity of these three agents was also determined using the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) assay on HUVECs. In the CAM assay, compared with the control and the other drug groups, treatment with sildenafil solutions resulted in a significant dose-dependent increase (budding, sprouting, extravasation) in CAM vessel growth. While there was no significant proliferative effect with iloprost or bosentan, the presence of sildenafil caused statistically significant proliferation of HUVECs following 24 and 48 h of incubation (p<0.05) compared with the control group. Comparing the tube length/area ratio values, there was a statistically significant increase in the sildenafil group with respect to the other two groups (p<0.05). Iloprost and bosentan did not show a significant effect.
The results provide evidence that sildenafil, but not iloprost or bosentan, induces angiogenesis in vitro and in vivo. Dysregulated angiogenesis, an important pathophysiological component in the progression of PAH, may be triggered by chronic ingestion of sildenafil over a long treatment period and may have negative effects.
closed_qa
(Ton)silly seasons?
Tonsillectomy is a common procedure, with potentially life-threatening complications. Previous investigations into post-tonsillectomy secondary haemorrhage rates suggest an influence of climatic and atmospheric conditions on haemorrhage rate, particularly temperature and water vapour pressure. With a single emergency department and a large variance in atmospheric conditions, Darwin, Australia, is ideal for investigating the effects of local climate on rates of post-operative haemorrhage. A five-year retrospective review was conducted of all tonsillectomy procedures performed between 2008 and 2013. Effects of atmospheric variables were examined using Pearson's correlation coefficient and analysis of variance. A total of 941 patients underwent tonsillectomy in the study period. The bleeding rate was 7.7 per cent. No variation was found between wet and dry season tonsillectomies (p = 0.4). Temperature (p = 0.74), water vapour pressure (p = 0.94) and humidity (p = 0.66) had no effect on bleeding.
The findings revealed no correlation between humidity, season, water vapour pressure and haemorrhage rates. Further research should use multi-site data to investigate the effect of air conditioning, humidification and climatic conditions across different regions in Australia.
closed_qa
Does Self-Citation Influence Quantitative Measures of Research Productivity Among Academic Oral and Maxillofacial Surgeons?
Quantitative measures of research productivity depend on the citation frequency of a publication. Citation-based metrics, such as the h-index (the largest number h such that h of an author's publications each have at least h citations), can be susceptible to self-citation, resulting in an inflated measure of research productivity. The purpose of the present study was to estimate the effect of self-citation on the h-index among academic oral and maxillofacial surgeons (OMSs). The present study was a cross-sectional study of full-time academic OMSs in the United States. The predictor variable was the frequency of self-citation. The primary outcome of interest was the h-index. Other study variables included demographic factors and citation metrics. Descriptive, bivariate, and regression statistics were computed. The study sample consisted of 325 full-time academic OMSs. Most surgeons were men (88.3%); approximately 40% had medical degrees. The study subjects had an average of 23.5 ± 37.1 publications. The mean number of self-citations was 15 ± 56. The sample's mean h-index was 6.6 ± 7.6 and was associated with self-citation (r = 0.71, P<.001). Approximately 9% of subjects had a change in their h-index after removing self-citations. After adjusting for PhD degree, total number of publications, and academic rank, an increasing self-citation rate influenced the h-index (r = 0.006, P<.001). Surgeons with more than 14 self-citations were more likely to have their h-index influenced by self-citation.
Self-citation among full-time academic OMSs does not substantially affect the h-index. Surgeons in the top quartile of self-citation rates are more likely to have their h-index affected by self-citation.
closed_qa
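Given the h-index definition used above, the effect of self-citation can be made concrete with a short sketch that recomputes the index after stripping self-citations; the per-paper counts below are invented for illustration, not study data.

```python
# Illustrative sketch: h-index with and without self-citations (invented counts).

def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

total_citations = [25, 14, 9, 7, 6, 6, 3, 1, 0]  # all citations per paper
self_citations = [4, 3, 3, 1, 2, 0, 1, 0, 0]     # of which, self-citations
external_only = [t - s for t, s in zip(total_citations, self_citations)]

print("h-index, all citations:      ", h_index(total_citations))  # 6 for these counts
print("h-index, self-cites removed: ", h_index(external_only))    # 5 for these counts
```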
The umbilicus: a reliable surface landmark for the aortic bifurcation?
Anatomical surface landmarks are frequently used by clinicians to guide both diagnosis and treatment. Few studies have examined the reliability of vascular anatomical landmarks in living subjects. The umbilicus has traditionally been described as a surface landmark for the bifurcation of the abdominal aorta. This study examined the factors affecting the position of the umbilicus relative to that of the aortic bifurcation in 95 patients. 106 consecutive abdominal CT scans were analysed by a surgeon and radiologist. Following exclusion of CT scans with relevant significant intra-abdominal pathology, 95 patients were included in the study. Measurements were taken of the craniocaudal distance between the aortic bifurcation and umbilicus, as well as maximum subcutaneous fat thickness at the level of the umbilicus. Patient age and gender were also documented. The umbilicus was found to lie -6.3 ± 26.5 mm from the aortic bifurcation. Increasing subcutaneous fat thickness was associated with a more caudal position of the umbilicus relative to the aortic bifurcation. This result was highly statistically significant in males over 65 years old.
This study suggests that the umbilicus is a reliable clinical surface landmark for the bifurcation of the abdominal aorta. Whilst some variation in craniocaudal distance exists between patients, in the majority of cases, the bifurcation of the abdominal aorta lies within a clinically narrow range of distances from the umbilicus.
closed_qa
Do questions help?
Audience response systems (ARSs) are electronic devices that allow educators to pose questions during lectures and receive immediate feedback on student knowledge. The current literature on the effectiveness of ARSs is contradictory, and their impact on student learning remains unclear. This randomised controlled trial was designed to isolate the impact of ARSs on student learning and students' perception of ARSs during a lecture. First-year medical student volunteers at Johns Hopkins were randomly assigned to either (i) watch a recorded lecture on an unfamiliar topic in which three ARS questions were embedded or (ii) watch the same lecture without the ARS questions. Immediately after the lecture on 5 June 2012, and again 2 weeks later, both groups were asked to complete a questionnaire to assess their knowledge of the lecture content and satisfaction with the learning experience. 92 students participated. The mean (95% CI) initial knowledge assessment score was 7.63 (7.17 to 8.09) for the ARS group (N=45) and 6.39 (5.81 to 6.97) for the control group (N=47), p=0.001. Similarly, the second knowledge assessment mean score was 6.95 (6.38 to 7.52) for the ARS group and 5.88 (5.29 to 6.47) for the control group, p=0.001. The ARS group also reported higher levels of engagement and enjoyment.
Embedding three ARS questions within a 30 min lecture increased students' knowledge immediately after the lecture and 2 weeks later. We hypothesise that this increase was due to forced information retrieval by students during the learning process, a form of the testing effect.
closed_qa
Could Poor Parental Recall of HPV Vaccination Contribute to Low Vaccination Rates?
Rates of initiation and completion of the human papillomavirus (HPV) vaccine series remain below national goals. Because parents are responsible for ensuring vaccination of their children, we examined the accuracy of parental recall of the number of shots their daughters received. Parents/guardians of girls aged 11 to 17 years were asked to recall the number of HPV doses received by their daughters. Dose number was confirmed using provider-verified medical records. Logistic regression assessed variables associated with correct recall. A total of 79 (63%) parents/guardians correctly identified the number of shots their daughters received. Ninety-one (73%) were aware of whether their daughter started the series at all. The only factor significantly associated with accurate recall in logistic regression models was female gender of parent/guardian.
Nearly 40% of parents/guardians inaccurately recalled the number of HPV shots their children received, which may contribute to low rates of vaccine initiation and completion.
closed_qa
Ultrasound-guided fine-needle aspiration of thyroid nodules: does the size limit its efficiency?
The key criterion in the management of thyroid nodules is the evaluation of the risk of malignancy, based on cytological examination. Ultrasound-guided fine-needle aspiration biopsy (US-FNAB) has high diagnostic value for thyroid nodules. The aim of this study was to compare the efficacy of US-FNAB for thyroid nodules of different sizes. From August 2013 to November 2013, 344 patients with thyroid nodules who had undergone US-FNAB were divided into three groups according to the largest diameter of their nodules (group A, ≤5.0 mm; group B, 5.1-10.0 mm; group C, >10.0 mm). All the nodules were subsequently verified by histology or follow-up findings. The accuracy, sensitivity, specificity, positive predictive value, and negative predictive value of aspiration cytology in each group were compared. Among the 344 thyroid nodules diagnosed by cytology, the cytology was classified as nondiagnostic or unsatisfactory for 53 (15.4%) lesions, benign for 144 (41.9%) lesions, atypia of undetermined significance or follicular lesion of undetermined significance for 20 (5.8%) lesions, follicular neoplasm or suspicious for a follicular neoplasm for 26 (7.6%) lesions, suspicious for malignancy for 36 (10.5%) lesions, and malignant for 65 (18.9%) lesions. There were 243 benign and 101 malignant nodules confirmed by pathology or follow-up ultrasound. The sensitivity, specificity, accuracy, positive predictive value, and negative predictive value were 87.5% (14/16), 92.5% (37/40), 91% (51/56), 82.3% (14/17), and 94.8% (37/39) in group A; 92.3% (36/39), 96.9% (94/97), 95.5% (130/136), 92.3% (36/39), and 96.9% (94/97) in group B; and 91.3% (42/46), 93.4% (99/106), 92.7% (141/152), 85.7% (42/49), and 96.1% (99/103) in group C. There were no statistically significant differences in the accuracy, sensitivity, specificity, false-positive rate, or false-negative rate of fine-needle aspiration between thyroid nodules of different sizes (P>0.05).
US-FNAB has similar diagnostic efficacy for thyroid nodules of different sizes.
closed_qa
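The metrics above are quoted as raw fractions, so the underlying 2x2 table for group A can be reconstructed (true positives 14, false negatives 2, false positives 3, true negatives 37). The brief sketch below shows how each metric derives from such a table; treat it as a worked illustration of the arithmetic rather than a reanalysis.

```python
# Illustrative sketch: diagnostic accuracy metrics from a 2x2 table.
# Counts correspond to the group A fractions quoted above (14/16, 37/40, 51/56, 14/17, 37/39).

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),                 # 14/16 = 87.5%
        "specificity": tn / (tn + fp),                 # 37/40 = 92.5%
        "accuracy": (tp + tn) / (tp + fp + fn + tn),   # 51/56 ~ 91%
        "positive predictive value": tp / (tp + fp),   # 14/17 ~ 82.4%
        "negative predictive value": tn / (tn + fn),   # 37/39 ~ 94.9%
    }

for name, value in diagnostic_metrics(tp=14, fp=3, fn=2, tn=37).items():
    print(f"{name}: {value:.1%}")
```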
Three-dimensional high-resolution anorectal manometry: does it allow automated analysis of sphincter defects?
All patients being tested in our department for faecal incontinence or dyschezia by 3D-HRAM and EUS were eligible for the study. 3D-HRAM was used to record resting and squeeze pressure, reflecting internal and external anal sphincter function, respectively. A software platform was designed to automatically analyse the 3D-HRAM images and calculate a diagnostic score for any anal sphincter defect compared with EUS. A total of 206 patients (91% female) with a mean age of 54 years were included in the study. A sphincter defect was diagnosed by EUS in 130 (63%). The diagnostic scores from the 3D-HRAM automated analysis for an internal anal sphincter defect showed a sensitivity of 65% and a specificity of 65%. For an external anal sphincter defect, the sensitivity was 43% and the specificity 87%.
Our study developed a method based on 3D-HRAM to automatically diagnose sphincter defects, allowing a systematic and comprehensive analysis of the test recordings. Compared with EUS, the 3D-HRAM image analysis procedure revealed poor sensitivity and specificity.
closed_qa
Inhibition of brain retinoic acid catabolism: a mechanism for minocycline's pleiotropic actions?
Minocycline is a tetracycline antibiotic increasingly recognized in psychiatry for its pleiotropic anti-inflammatory and neuroprotective potential. While underlying mechanisms are still incompletely understood, several lines of evidence suggest a relevant functional overlap with retinoic acid (RA), a highly potent small molecule exhibiting a great variety of anti-inflammatory and neuroprotective properties in the adult central nervous system (CNS). RA homeostasis in the adult CNS is tightly controlled through local RA synthesis and cytochrome P450 (CYP450)-mediated inactivation of RA. Here, we hypothesized that minocycline may directly affect RA homeostasis in the CNS via altering local RA degradation. We used in vitro RA metabolism assays with metabolically competent synaptosomal preparations from murine brain and human SH-SY5Y neuronal cells as well as viable human SH-SY5Y neuroblastoma cell cultures. We revealed that minocycline potently blocks RA degradation as measured by reversed-phase high-performance liquid chromatography and in a viable RA reporter cell line, even at low micromolar levels of minocycline.
Our findings provide evidence that enhanced RA signalling is involved in minocycline's pleiotropic mode of action in the CNS. This novel mode of action of minocycline may help in developing more specific and effective strategies for treating neuroinflammatory or neurodegenerative disorders.
closed_qa